
Unable to copy files from local disk to HDFS

I have successfully installed Ubuntu 12.04 and Hadoop 2.4.0.

After entering the jps command, I see the following output:

4135 Jps
2582 SecondaryNameNode
3143 NodeManager
2394 NameNode
2391 DataNode
3021 ResourceManager

Now I want to run the wordcount example.

I created a .txt file with some content in it.

Now whenever I try to copy it into HDFS with the following command:

hdfs -copyFromLocal /app/hadoop/tmp/input.txt /wordcount/input.txt

("wordcount" in the path is a directory which i have created) (路径中的“wordcount”是我创建的目录)

but it shows:

Unrecognized option: -copyFromLocal
Error: Could not create the Java Virtual Machine.

What am I doing wrong?

The commands you are using are older ones. Try:

hadoop fs -mkdir -p /wordcount/input
hadoop fs -put /app/hadoop/tmp/input.txt /wordcount/input/input.txt

You'll need to specify the output directory as /wordcount/output in this case, and it must not exist before you run the job. If it does, the job will fail. You can remove the directory with:

hadoop fs -rm -R /wordcount/output
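
With the input in place and any stale output removed, the job itself can be launched. As a minimal sketch, the bundled examples jar can run wordcount; the jar path below assumes a stock Hadoop 2.4.0 layout and may differ on your system:

# assumed jar location for a default Hadoop 2.4.0 install; adjust the path if yours differs
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar wordcount /wordcount/input /wordcount/output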

Edit: To see the output files, check:

hadoop fs -ls /wordcount/output

To see the output on the console, use this:

hadoop fs -cat /wordcount/output/part*

Edit 2: The newer Hadoop CLI uses:

hdfs dfs <your_command_here>

For example,

hdfs dfs -ls /

Also, if you want to read gzip files, you can use this:

hdfs dfs -cat /wordcount/output/part* | gzip -d

You forgot dfs:

hdfs dfs -copyFromLocal /blar /blar
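
Applied to the paths from the question, the corrected command would look like this (assuming the /wordcount directory already exists in HDFS):

# same command with dfs added; paths taken from the question
hdfs dfs -copyFromLocal /app/hadoop/tmp/input.txt /wordcount/input.txt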

IMO Scalding is the best tool for getting started with writing MapReduce programs. It's as concise as Pig but as flexible as Java.

