Flume ingestion fails to start: insufficient permissions

Posted by 胖子学习天地


Preface: this article was compiled by cha138.com and covers a Flume ingestion startup error caused by insufficient permissions; hopefully it is a useful reference.

18/04/18 16:47:12 WARN source.EventReader: Could not find file: /home/hadoop/king/flume/103104/data/HD20180417213353.data
java.io.FileNotFoundException: /home/hadoop/king/flume/103104/trackerDir/.flumespool-main.meta (Permission denied)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.avro.file.DataFileWriter.appendTo(DataFileWriter.java:149)
        at org.apache.flume.serialization.DurablePositionTracker.<init>(DurablePositionTracker.java:141)
        at org.apache.flume.serialization.DurablePositionTracker.getInstance(DurablePositionTracker.java:76)
        at com.gw.flume.source.EventReader.getNextFile(EventReader.java:400)
        at com.gw.flume.source.EventReader.readEvents(EventReader.java:174)
        at com.gw.flume.source.FileSource$SpoolDirectoryRunnable.run(FileSource.java:150)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)

The root cause: insufficient permissions on the tracker metadata file.
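Before changing anything, it helps to confirm the mismatch: compare the owner of the tracker file with the user the Flume agent runs as. A minimal, self-contained sketch (a temp file stands in for the real `/home/hadoop/king/flume/103104/trackerDir/.flumespool-main.meta`, so the script runs anywhere):

```shell
#!/bin/sh
# Compare a file's owner with the current user (stand-in for the Flume agent user).
# In the real case the file to inspect would be
#   /home/hadoop/king/flume/103104/trackerDir/.flumespool-main.meta
META="$(mktemp)"                               # temp file stands in for the meta file
owner="$(ls -ld "$META" | awk '{print $3}')"   # third field of ls -l is the owner
me="$(id -un)"                                 # user running this script
echo "owner=$owner current_user=$me"
# If these differ and the file mode is restrictive, FileOutputStream.open0
# fails with: java.io.FileNotFoundException: ... (Permission denied)
```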

Note that /home/hadoop/king/flume/103104/trackerDir/.flumespool-main.meta is a hidden file (its name starts with a dot), so a plain `ls` does not show it. It only turned up with `ll -a` (i.e. `ls -la`). The fix was to loosen its permissions:

chmod 777 /home/hadoop/king/flume/103104/trackerDir/.flumespool-main.meta
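The steps above can be scripted end to end. The sketch below reproduces the situation in a scratch directory and applies the same broad `chmod 777` fix from the article; the real path would be the trackerDir shown in the error. A narrower fix, if the agent runs as user hadoop, would be `chown hadoop "$META" && chmod 644 "$META"`.

```shell
#!/bin/sh
# Sketch: reproduce and fix the permission problem in a scratch directory.
# The real directory in the article is /home/hadoop/king/flume/103104/trackerDir;
# a temp dir is used here so the script is self-contained.
TRACKER_DIR="$(mktemp -d)/trackerDir"
mkdir -p "$TRACKER_DIR"

# Simulate the hidden metadata file Flume's DurablePositionTracker appends to.
META="$TRACKER_DIR/.flumespool-main.meta"
touch "$META"
chmod 000 "$META"        # no permissions at all -> "(Permission denied)" on open

# Hidden files do not show up with plain ls; -a reveals them (ll is usually ls -l).
ls -la "$TRACKER_DIR"

# The broad fix from the article: read/write/execute for everyone.
chmod 777 "$META"

# Verify the file can now be opened for writing.
if [ -w "$META" ]; then
  echo "meta file writable"
fi
```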

The above is the gist of the Flume startup permission error. If it did not solve your problem, the following articles may help:

Hadoop

Start/stop scripts for the Flume log-collection agent

Senior Big Data Developer series: the Flume data-collection framework

Storing Flume-collected data directly in Hive