
Copying a file to HDFS2 via Camel doesn't work

Does anyone have a good example of writing files to HDFS2 via Camel?

I tried the following code:

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public final class Main {

    private Main() {
    }

    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            public void configure() {
                from("file:C:\\FILES\\SRC\\2015-03-31_16-58-56.png?noop=true")
                        .to("hdfs2://xxxx:9000/testCamel/D2/qwe.png");
                        //.to("file:C:\\FILES\\OUT");
            }
        });
        //ProducerTemplate template = context.createProducerTemplate();

        context.start();
        // give the route a moment to pick up the file before shutting down
        Thread.sleep(5000);
        context.stop();
    }
}

The files are created in HDFS, but all of them are empty (0 bytes).

Make sure `noop=false` when you consume from a file endpoint and write to HDFS. The hdfs component consumes in chunks, so if `noop=true`, Camel thinks it has already consumed the file and writes nothing.
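A minimal corrected sketch of the route, assuming camel-core and camel-hdfs2 are on the classpath (the host `xxxx` and all paths are the placeholders from the question; `fileName` is used so the file endpoint points at a directory rather than a file):

```java
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public final class Main {

    private Main() {
    }

    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            public void configure() {
                // The file endpoint is the source directory; fileName selects
                // the one file. noop=false (the default) lets Camel treat the
                // file as genuinely consumed, so the hdfs2 producer receives
                // its full content instead of writing an empty file.
                from("file:C:\\FILES\\SRC?fileName=2015-03-31_16-58-56.png&noop=false")
                        .to("hdfs2://xxxx:9000/testCamel/D2/qwe.png");
            }
        });

        context.start();
        // give the route time to complete before shutting down
        Thread.sleep(5000);
        context.stop();
    }
}
```

With `noop=false`, the file component moves the consumed file into its `.camel` subdirectory after the exchange completes, which is how it marks the file as processed.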
