
Java project: how to connect to a local Hadoop instance

I am pretty new to Hadoop and am trying to create a Java project that uses it.

I have Hadoop running as a single-node cluster, and I have a Java project in which I have imported the Hadoop JARs as external JARs on the build path.

Does the following make sense: how should I connect the Java project to the local instance of Hadoop?

Thank you

That should work. You don't have to do much in order to connect to your local Hadoop setup. Just create a Configuration object and tell your code where to look for your configuration files using Configuration.addResource(). A small example:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToHdfs {

    public static void main(String[] args) throws IOException {

        // Load the cluster settings from the local Hadoop installation
        Configuration conf = new Configuration();
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/core-site.xml"));
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/hdfs-site.xml"));

        // Get a handle to the HDFS filesystem described by those files
        FileSystem fs = FileSystem.get(conf);

        // Copy a local file into the root directory of HDFS
        fs.copyFromLocalFile(new Path("file:///Users/miqbal1/input.txt"), new Path("/"));

        fs.close();
    }
}
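For reference, the core-site.xml that addResource() picks up is what tells the client where the NameNode lives. A minimal sketch for a Hadoop 1.x single-node setup might look like the following (the hdfs://localhost:9000 address is an assumption here; use whatever address your own installation actually configures):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- fs.default.name is the Hadoop 1.x key; Hadoop 2+ uses fs.defaultFS.
       The localhost:9000 address below is an example, not a requirement. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

If FileSystem.get(conf) returns a LocalFileSystem instead of HDFS, it usually means these files were not found on the classpath and the default (local) configuration was used.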

I think what you need is the Hadoop Eclipse plugin. It doesn't work with every Hadoop version, but I have it running with Hadoop 1.2.0. Take a look at this tutorial.

Disclaimer: the technical posts on this site follow the CC BY-SA 4.0 license; if you repost, please credit the original source.
