I'm using Flume to spool a directory and upload the files to HDFS. These are txt/csv files, and I want them stored in HDFS in that same format. But Flume loads them as binary files...

Here is my configuration:

tier1.sources  = source1
tier1.channels = channel1
tier1.sinks    = sink1

tier1.sources.source1.type     = spooldir
tier1.sources.source1.channels = channel1
tier1.sources.source1.spoolDir = /var/data
tier1.sources.source1.fileHeader = false
tier1.sources.source1.deletePolicy = immediate
tier1.channels.channel1.type   = memory
tier1.sinks.sink1.type         = hdfs
tier1.sinks.sink1.channel      = channel1
tier1.sinks.sink1.hdfs.path = /user/hdfs/%y-%m-%d/
tier1.sinks.sink1.hdfs.writeFormat=Text
tier1.sinks.sink1.hdfs.useLocalTimeStamp = true
tier1.sinks.sink1.hdfs.rollInterval = 30

tier1.channels.channel1.capacity = 100

What should I do so that Flume loads the txt files as txt files?

Best Answer

This should solve your problem: the HDFS sink's hdfs.fileType defaults to SequenceFile, which is a binary container format, so your events come out binary even though you set hdfs.writeFormat = Text (that property only controls the serialization inside the SequenceFile). Tell the sink to write a plain data stream instead:

tier1.sinks.sink1.hdfs.fileType = DataStream

With fileType = DataStream and writeFormat = Text, the event bodies are written to HDFS as plain text, so your txt/csv content is preserved as-is.
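If you want to confirm which format Flume actually wrote, a quick check is the file header: Hadoop SequenceFiles start with the magic bytes "SEQ" followed by a version byte. A minimal sketch (the helper name and sample files are hypothetical, for illustration only; in practice you would run this against a file pulled down with hdfs dfs -get):

```python
import os
import tempfile


def is_sequence_file(path):
    """Return True if the file starts with the SequenceFile magic bytes b'SEQ'."""
    with open(path, "rb") as f:
        return f.read(3) == b"SEQ"


if __name__ == "__main__":
    # Simulate a SequenceFile-style header and a plain csv file locally.
    seq = tempfile.NamedTemporaryFile(delete=False)
    seq.write(b"SEQ\x06" + b"\x00" * 16)  # magic + version byte + padding
    seq.close()

    txt = tempfile.NamedTemporaryFile(delete=False)
    txt.write(b"plain,csv,line\n")
    txt.close()

    print(is_sequence_file(seq.name))  # True  -> sink wrote SequenceFiles
    print(is_sequence_file(txt.name))  # False -> plain text, as desired

    os.unlink(seq.name)
    os.unlink(txt.name)
```

If the check returns True on your HDFS output, the sink is still writing SequenceFiles and the fileType setting has not taken effect.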

About hadoop - Flume changing txt files to binary files, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/23525996/
