
Hadoop: Getting an error when I load a file from the source (i.e. my PC) to HDFS

I'm trying to copy a file from a local source to HDFS.

Query: hadoop fs -copyFromLocal (local path) (destination path)

e.g. hadoop fs -copyFromLocal C:\users\Desktop (destination path)

and also, e.g., hadoop fs -copyFromLocal C:\users\Desktop URI

but I am getting this error:

    -copyFromLocal: Can not create a Path from a null string
    Usage: hadoop fs [generic options] -copyFromLocal [-f] [-p] <localsrc> ... <dst>
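That message means one of the path arguments came through as null (typically the destination): -copyFromLocal needs both a local source and an HDFS destination. A minimal sketch of the full form, with placeholder paths (myfile.txt and /user/me are assumptions, not the asker's actual paths):

    hadoop fs -copyFromLocal C:/users/Desktop/myfile.txt /user/me/myfile.txt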

Try:

    hadoop dfs -copyFromLocal file_to_be_copied hdfs://namenode:/path_to_location

The fs command is being deprecated.

Note: you do not have to spell out the full hdfs:// URI. You could also do something like:

    hadoop dfs -copyFromLocal file_to_be_copied /path_to_location_within_hdfs
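For what it's worth, on Hadoop 2.x the same command is spelled hdfs dfs (the hadoop dfs form prints a deprecation warning there). A sketch under that assumption, with host, port, and paths as placeholders:

    # explicit namenode URI (host and port are placeholders)
    hdfs dfs -copyFromLocal data.txt hdfs://namenode:8020/user/me/data.txt

    # or rely on fs.defaultFS from core-site.xml to supply the scheme and authority
    hdfs dfs -copyFromLocal data.txt /user/me/data.txt

    # verify the copy
    hdfs dfs -ls /user/me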

  1. You are not specifying the path of the file; C:\users\Desktop is a directory, not a file.
  2. Make sure the HDFS path you specified exists (a combined sketch follows below).
  3. Finally, try escaping the backslashes, like:

     hadoop fs -copyFromLocal C:\\users\\Desktop\\myfile.txt URI

Or use forward slashes:

    hadoop fs -copyFromLocal C:/users/Desktop/myfile.txt URI
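Putting points 1–3 together, a minimal sketch for verifying the destination before copying (/user/me is a placeholder; the -p flag of -mkdir requires Hadoop 2.x):

    # does the destination directory exist?
    hadoop fs -ls /user/me

    # create it if not (drop -p on Hadoop 1.x)
    hadoop fs -mkdir -p /user/me

    # copy a file, not a directory, using forward slashes in the Windows path
    hadoop fs -copyFromLocal C:/users/Desktop/myfile.txt /user/me/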

Unfortunately, I am unable to comment below your reply. The easiest way to get the path is to open the NameNode web UI and click on "Browse the filesystem". Another option is to run hdfs dfs -ls / or hdfs dfs -lsr /.

ls will list the directories/files in your root directory.
lsr will recursively read through all the subdirectories in your root directory.
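As an aside, -lsr is the older form; on Hadoop 2.x it is deprecated in favor of -ls -R. Both shown for comparison:

    hdfs dfs -ls /       # list the root directory
    hdfs dfs -lsr /      # recursive listing (older, deprecated form)
    hdfs dfs -ls -R /    # recursive listing (Hadoop 2.x form)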

HDFS is just like other file systems: the path begins with / as the root, and the period (.) denotes the current directory. I am on a Linux platform; I hope the examples below help.

name@host: ~/oracle/sql$ hdfs dfs -ls /
Found 5 items
drwxr-xr-x   - myname supergroup          0 2014-09-05 03:06 /directory
drwxr-xr-x   - myname supergroup          0 2014-05-10 03:34 /in
drwxr-xr-x   - myname supergroup          0 2014-10-16 22:50 /system
drwxrwx---   - myname supergroup          0 2014-08-18 08:44 /tmp
drwxr-xr-x   - myname supergroup          0 2014-08-18 08:41 /user
name@host: ~/oracle/sql$ hdfs dfs -copyFromLocal foo.sql /
name@host: ~/oracle/sql$ hdfs dfs -ls /
Found 6 items
drwxr-xr-x   - myname supergroup          0 2014-09-05 03:06 /directory
-rw-r--r--   3 myname supergroup         90 2015-03-24 05:33 /foo.sql
drwxr-xr-x   - myname supergroup          0 2014-05-10 03:34 /in
drwxr-xr-x   - myname supergroup          0 2014-10-16 22:50 /system
drwxrwx---   - myname supergroup          0 2014-08-18 08:44 /tmp
drwxr-xr-x   - myname supergroup          0 2014-08-18 08:41 /user

name@host: ~/oracle/sql$ hdfs dfs -copyFromLocal foo.sql /user/myname/
name@host: ~/oracle/sql$ hdfs dfs -ls /user/myname/foo.sql
-rw-r--r--   3 myname supergroup         90 2015-03-24 05:41 /user/myname/foo.sql
name@host: ~/oracle/sql$
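One detail worth adding to the transcript: a relative HDFS path resolves against your HDFS home directory, conventionally /user/<username>, so under that convention these two commands land in the same place:

    # relative destination: resolves to the HDFS home directory
    hdfs dfs -copyFromLocal foo.sql .

    # absolute destination: the same location spelled out
    hdfs dfs -copyFromLocal foo.sql /user/myname/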
