
Hadoop copying local file to HDFS?

I'm trying to copy a local file called 'afile' to HDFS, so I ran the following command:

'hadoop fs -copyFromLocal /home/neo/afile in' or 'hadoop fs -put /home/neo/afile in'

However, it says: 'File /home/neo/afile does not exist'

Then I put the file 'afile' into the directory under hadoop, and this time copyFromLocal succeeded. However, the resulting file 'in' is empty: when I run 'hadoop fs -ls', it shows

'-rw-r--r--' 1 neo supergroup 0 2015-04-06 17:45 /user/neo/in

I also tried 'hadoop fs -cat in', but nothing was returned.

Could someone please help?

Thanks!

  1. Create a new file named test.txt in the local filesystem, under /home/neo/

  2. Add content to test.txt: echo "sample text for wordcount" > /home/neo/test.txt

  3. Create a new directory in HDFS: hadoop fs -mkdir /user/neo/in/

  4. Copy the file from the local directory to HDFS: hadoop fs -copyFromLocal /home/neo/test.txt /user/neo/in/test.txt
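One detail worth checking before step 4: a 0-byte file in HDFS (as shown by your 'hadoop fs -ls' output) usually means the local source was itself empty or not the file you thought. A minimal sketch to verify the local file first (paths here are examples, not your exact setup):

```shell
#!/bin/sh
# Create the sample local file (step 2 above).
echo "sample text for wordcount" > /tmp/test.txt

# Confirm the source is non-empty BEFORE copying to HDFS;
# a 0-byte source would produce a 0-byte file after -copyFromLocal.
if [ -s /tmp/test.txt ]; then
    echo "local file OK: $(wc -c < /tmp/test.txt) bytes"
else
    echo "local file missing or empty" >&2
    exit 1
fi
```

After the copy succeeds, 'hadoop fs -cat /user/neo/in/test.txt' should print the file's contents instead of returning nothing.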
