Flume: ingesting data from local files into the HDFS file system


    The configuration file:

    agent.sources = origin
    agent.channels = memorychannel
    agent.sinks = target

    agent.sources.origin.type = TAILDIR
    agent.sources.origin.filegroups = f1
    agent.sources.origin.filegroups.f1 = /hadoop/flume/logs/test/.*.log*
    agent.sources.origin.channels = memorychannel
    agent.sources.origin.positionFile = /hadoop/flume/logs/test/position.json

    agent.sinks.loggerSink.type = logger
    agent.sinks.loggerSink.channel = memorychannel

    agent.channels.memorychannel.type = memory
    agent.channels.memorychannel.capacity = 1000
    agent.channels.memorychannel.transactionCapacity = 100

    agent.sinks.target.type = hdfs
    agent.sinks.target.channel = memorychannel
    agent.sinks.target.hdfs.path = hdfs://127.0.0.1:9000/flume/events/%y-%m-%d/%H%M%S
    agent.sinks.target.hdfs.filePrefix = data-%{host}
    agent.sinks.target.hdfs.rollInterval = 30
    agent.sinks.target.hdfs.rollSize = 100
    agent.sinks.target.hdfs.rollCount = 0
    agent.sinks.target.hdfs.round = true
    agent.sinks.target.hdfs.useLocalTimeStamp = true
    agent.sinks.target.hdfs.minBlockReplicas = 1
    agent.sinks.target.hdfs.writeFormat = Text
    agent.sinks.target.hdfs.fileType = DataStream
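    Once the configuration is saved (for example as taildir-hdfs.conf, a hypothetical file name), the agent can be launched with the standard launcher, e.g. bin/flume-ng agent --conf conf --conf-file conf/taildir-hdfs.conf --name agent; the --name value must match the "agent." prefix used in the properties above. Below is a minimal sketch in Python for generating test input, assuming the monitored directory /hadoop/flume/logs/test/ from the filegroups setting exists and is writable; the file name test.log is only an illustration that matches the .*.log* pattern.

        # Append timestamped test lines to a file the TAILDIR source is watching.
        # LOG_FILE is an assumption derived from the filegroups.f1 pattern above.
        import time

        LOG_FILE = "/hadoop/flume/logs/test/test.log"

        with open(LOG_FILE, "a") as f:
            for i in range(10):
                f.write("test event %d at %s\n" % (i, time.strftime("%Y-%m-%d %H:%M:%S")))
                f.flush()            # flush so the tailing source picks each line up promptly
                time.sleep(1)

    If everything is wired up correctly, the events should appear under the configured HDFS path, which can be checked with hdfs dfs -ls /flume/events/.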

    Place the HDFS driver JARs (the Hadoop client libraries) into Flume's lib directory.
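    The exact set of JARs depends on the Hadoop version; hadoop-common and hadoop-hdfs (plus their dependencies, such as commons-configuration) are the usual ones the HDFS sink needs. The following is a minimal sketch, assuming a standard Hadoop tarball layout and the environment variables HADOOP_HOME and FLUME_HOME (both assumptions; adjust to your installation):

        # Copy the Hadoop client JARs the HDFS sink needs into Flume's lib directory.
        # Paths and JAR patterns are assumptions for a typical Hadoop 2.x/3.x tarball.
        import glob, os, shutil

        hadoop_home = os.environ.get("HADOOP_HOME", "/opt/hadoop")
        flume_lib = os.path.join(os.environ.get("FLUME_HOME", "/opt/flume"), "lib")

        patterns = [
            "share/hadoop/common/hadoop-common-*.jar",
            "share/hadoop/common/lib/*.jar",
            "share/hadoop/hdfs/hadoop-hdfs-*.jar",
        ]

        for pattern in patterns:
            for jar in glob.glob(os.path.join(hadoop_home, pattern)):
                shutil.copy(jar, flume_lib)
                print("copied", os.path.basename(jar))

    Copying everything from common/lib can clash with JARs Flume already ships (Guava in particular), so in practice you may prefer to copy only the JARs the agent reports as missing (ClassNotFoundException) at startup.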

