
Sqoop error while importing to HDFS

I'm trying to import a small table from MySQL to HDFS using Sqoop. The table has 2 columns, id (the primary key) and name. I'm able to list databases and tables via Sqoop, but I get an exception while importing the table to HDFS. Kindly help. Below is the error log.

13/12/04 02:05:38 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
13/12/04 02:05:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/12/04 02:05:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-hadoop/mapred/staging/hadoop1439217057/.staging/job_local1439217057_0001
13/12/04 02:05:39 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
13/12/04 02:05:39 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@35a16869
13/12/04 02:05:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:824)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:254)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:290)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:361)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:226)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:555)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:111)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
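
For context, a Sqoop import of this kind is typically launched with a command like the sketch below. The host, port, database, credentials, table name, and target directory are placeholders for illustration, not taken from the question, since the original command isn't shown.

    sqoop import \
      --connect jdbc:mysql://localhost:3306/mydb \
      --username myuser -P \
      --table mytable \
      --target-dir /user/hadoop/mytable \
      -m 1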

Try to do this:

  1. su root

  2. Type the root password

  3. su hdfs

And then run the sqoop command and it will work like a champ!
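
Put together, the sequence this answer describes would look like the sketch below in a shell; the final import command is the same hypothetical example as above, since the asker's exact command isn't shown.

    su root      # become root (enter the root password when prompted)
    su hdfs      # then switch to the hdfs user, which can write to HDFS
    sqoop import --connect jdbc:mysql://localhost:3306/mydb \
      --username myuser -P --table mytable   # placeholders as before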

Since you don't have write permissions, you are getting security.UserGroupInformation: PriviledgedActionException.

Try to log in as the hdfs user and then run the sqoop command.

su root

type the password

su hdfs

And then run the sqoop export command.
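
If switching users isn't an option, another way to address the missing write permission is to give the job's user a writable home directory on HDFS. This is a hedged alternative sketch, not part of the answer above; the hadoop user and the paths are assumptions for illustration.

    hdfs dfs -ls /user                           # check ownership of home directories
    su hdfs                                      # the HDFS superuser can change ownership
    hdfs dfs -mkdir -p /user/hadoop              # create a home for the hadoop user
    hdfs dfs -chown hadoop:hadoop /user/hadoop   # hand it over to that user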

Give 777 permissions to the /usr/lib/sqoop/lib/<mysqljavaconnector>-bin.jar file, and make sure the files under /usr/lib/sqoop/* and /usr/local/hadoop/* are owned by the same user.
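
A minimal sketch of those two steps, assuming a hadoop user and group; <mysqljavaconnector> is the placeholder from the answer, to be replaced with the actual MySQL JDBC connector jar name.

    chmod 777 /usr/lib/sqoop/lib/<mysqljavaconnector>-bin.jar   # world-writable, per the answer
    chown -R hadoop:hadoop /usr/lib/sqoop /usr/local/hadoop     # same owner for both trees
    ls -l /usr/lib/sqoop/lib /usr/local/hadoop                  # verify ownership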
