Uploading data to Hive via Flume

Posted by 林**



Goal: receive HTTP request messages on port 1084 and store them in a Hive database.
osgiweb2.db is the name of the database created in Hive;
periodic_report5 is the table created for the data.

1. The Flume configuration is as follows:
a1.sources=r1  
a1.channels=c1  
a1.sinks=k1  
  
a1.sources.r1.type = http
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 1084
a1.sources.r1.handler=jkong.Test.HTTPSourceDPIHandler  

#a1.sources.r1.interceptors=i1 i2
#a1.sources.r1.interceptors.i1.type=regex_filter
#a1.sources.r1.interceptors.i1.regex=\\{.*\\}
#a1.sources.r1.interceptors.i2.type=timestamp
a1.channels.c1.type=memory  
a1.channels.c1.capacity=10000  
a1.channels.c1.transactionCapacity=1000  
a1.channels.c1.keep-alive=30  
  
a1.sinks.k1.type=hdfs  
a1.sinks.k1.channel=c1  
a1.sinks.k1.hdfs.path=hdfs://gw-sp.novalocal:1086/user/hive/warehouse/osgiweb2.db/periodic_report5 
a1.sinks.k1.hdfs.fileType=DataStream  
a1.sinks.k1.hdfs.writeFormat=Text  
a1.sinks.k1.hdfs.rollInterval=0  
a1.sinks.k1.hdfs.rollSize=10240  
a1.sinks.k1.hdfs.rollCount=0  
a1.sinks.k1.hdfs.idleTimeout=60

a1.sources.r1.channels=c1
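The custom handler jkong.Test.HTTPSourceDPIHandler is not shown in this post. With Flume's built-in default JSONHandler, the HTTP source expects a POST body containing a JSON array of events, each with a "headers" map and a "body" string. A minimal client sketch under that assumption (the record values and the localhost:1084 endpoint are illustrative):

```python
import json
from urllib import request

def build_flume_payload(records):
    """Wrap records in the event-list format Flume's default
    JSONHandler expects: [{"headers": {...}, "body": "..."}, ...]."""
    return json.dumps([
        {"headers": {}, "body": json.dumps(r)} for r in records
    ])

# Hypothetical sample report matching the table columns used below.
record = {
    "id": 1,
    "deviceId": "dev-001",
    "report_time": "2018-01-01 00:00:00",
    "information": "{}",
}

payload = build_flume_payload([record])
print(payload)

# To actually send it, POST to the agent (uncomment when it is running):
# req = request.Request(
#     "http://localhost:1084",
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# request.urlopen(req)
```

A custom handler such as the one configured above may accept a different body format; this sketch only illustrates the default contract.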

2. Creating the data table:

create table periodic_report5 (
  id BIGINT,
  deviceId STRING,
  report_time STRING,
  information STRING
)
row format serde "org.openx.data.jsonserde.JsonSerDe"
WITH SERDEPROPERTIES (
  "id"="$.id",
  "deviceId"="$.deviceId",
  "report_time"="$.report_time",
  "information"="$.information"
);
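The JsonSerDe reads one JSON object per line from the files the HDFS sink writes, with top-level keys matching the table columns. A hypothetical record in that shape (values are made up for illustration):

```python
import json

# One line of the HDFS file as this table definition expects it:
# a flat JSON object whose keys match the four columns.
line = json.dumps({
    "id": 42,
    "deviceId": "dev-001",
    "report_time": "2018-01-01 00:00:00",
    # "information" stays a STRING column, so nested data is
    # stored as an embedded JSON string rather than an object.
    "information": "{\"eventType\": \"periodic\"}",
})

row = json.loads(line)
print(row["deviceId"])
```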

  2.1 A variant CREATE TABLE statement that further splits the nested fields into their own struct members (not yet tested, so not used for now):

create table periodic_report4 (
  id BIGINT,
  deviceId STRING,
  report_time STRING,
  information STRUCT<
    actualTime:BIGINT,
    dpiVersionInfo:STRING,
    subDeviceInfo:STRING,
    wanTrafficData:STRING,
    ponInfo:STRING,
    eventType:STRING,
    potsInfo:STRING,
    deviceInfo:STRING,
    deviceStatus:STRING
  >
)
row format serde "org.openx.data.jsonserde.JsonSerDe"
WITH SERDEPROPERTIES (
  "input.invalid.ignore"="true",
  "id"="$.id",
  "deviceId"="$.deviceId",
  "report_time"="$.report_time",
  "requestParams.actualTime"="$.requestParams.actualTime",
  "requestParams.dpiVersionInfo"="$.requestParams.dpiVersionInfo",
  "requestParams.subDeviceInfo"="$.requestParams.subDeviceInfo",
  "requestParams.wanTrafficData"="$.requestParams.wanTrafficData",
  "requestParams.ponInfo"="$.requestParams.ponInfo",
  "requestParams.eventType"="$.requestParams.eventType",
  "requestParams.potsInfo"="$.requestParams.potsInfo",
  "requestParams.deviceInfo"="$.requestParams.deviceInfo",
  "requestParams.deviceStatus"="$.requestParams.deviceStatus"
);
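This nested variant maps the struct members from a $.requestParams object in the incoming JSON, so the records would need to carry the sub-fields as a real JSON object rather than an embedded string. A hypothetical record shape matching those SERDEPROPERTIES paths:

```python
import json

# Hypothetical nested record for the STRUCT<...> variant above;
# "requestParams" holds the sub-fields the SERDEPROPERTIES map from.
record = {
    "id": 42,
    "deviceId": "dev-001",
    "report_time": "2018-01-01 00:00:00",
    "requestParams": {
        "actualTime": 1514736000000,
        "dpiVersionInfo": "v1",
        "subDeviceInfo": "",
        "wanTrafficData": "",
        "ponInfo": "",
        "eventType": "periodic",
        "potsInfo": "",
        "deviceInfo": "",
        "deviceStatus": "online",
    },
}
print(json.dumps(record))
```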

3. Command to start Flume (run from the Flume root directory):

bin/flume-ng agent --conf ./conf/ -f ./conf/flume.conf --name a1 -Dflume.root.logger=DEBUG,console

4. Command to start Hive (run from the Hive bin directory):

hive
or, to start with debug log output:
./hive -hiveconf hive.root.logger=DEBUG,console

 
