/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-*.jar does not exist

While running a job I get the following exception:

java.io.FileNotFoundException: File file:/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-*.jar does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:511)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:140)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:213)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
        at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:72)
Job Submission failed with exception 'java.io.FileNotFoundException(File file:/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core-*.jar does not exist)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

What could be the reason and how can I fix it? The missing jar file actually exists:

[user@host ~]$ ls -l /usr/lib/hive-hcatalog/share/hcatalog/
total 664
-rw-r--r-- 1 root root 468839 May 27 12:59 hive-hcatalog-core-0.13.0.2.1.2.1-471.jar
lrwxrwxrwx 1 root root     41 Jun 30 15:57 hive-hcatalog-core.jar -> hive-hcatalog-core-0.13.0.2.1.2.1-471.jar
-rw-r--r-- 1 root root  81042 May 27 12:59 hive-hcatalog-pig-adapter-0.13.0.2.1.2.1-471.jar
lrwxrwxrwx 1 root root     48 Jun 30 15:57 hive-hcatalog-pig-adapter.jar -> hive-hcatalog-pig-adapter-0.13.0.2.1.2.1-471.jar
-rw-r--r-- 1 root root  67890 May 27 12:59 hive-hcatalog-server-extensions-0.13.0.2.1.2.1-471.jar
lrwxrwxrwx 1 root root     54 Jun 30 15:57 hive-hcatalog-server-extensions.jar -> hive-hcatalog-server-extensions-0.13.0.2.1.2.1-471.jar
-rw-r--r-- 1 root root  52552 May 27 12:59 hive-hcatalog-streaming-0.13.0.2.1.2.1-471.jar
lrwxrwxrwx 1 root root     46 Jun 30 15:57 hive-hcatalog-streaming.jar -> hive-hcatalog-streaming-0.13.0.2.1.2.1-471.jar
drwxr-xr-x 3 root root   4096 Jun 30 15:57 storage-handlers
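
One way to see where the unexpanded wildcard path comes from is to check the Hive client configuration and environment for an aux-jars setting. This is only a hedged diagnostic, assuming the standard /etc/hive/conf layout (the config directory may differ on other installations); the FileNotFoundException above suggests a literal * path is handed to the job submitter without glob expansion:

grep -rn "hive-hcatalog-core" /etc/hive/conf/
grep -n "hive.aux.jars.path" /etc/hive/conf/hive-site.xml
echo "$HIVE_AUX_JARS_PATH"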

I'm trying to use this benchmark: https://github.com/cartershanklin/hive-testbench. The problem occurs when running the tpcds-setup.sh script, right after the following line:

hive -i settings/load-partitioned.sql -f ddl-tpcds/bin_partitioned/store_sales.sql -d DB=tpcds_bin_partitioned_orc_6 -d SOURCE=tpcds_text_6 -d BUCKETS=1 -d RETURN_BUCKETS=1 -d FILE=orc

A bug detailed at https://issues.apache.org/jira/browse/HIVE-2379 mentions this situation. It's not possible to test Hive/HCatalog integration without explicitly exporting the auxiliary jars. Performing export HIVE_AUX_JARS_PATH=/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar in /etc/hive/conf/hive-env.sh, i.e. adding the path to the HCatalog core library, solves the problem.
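
On the system shown above the HCatalog jar actually lives under /usr/lib/hive-hcatalog/share/hcatalog/, so a minimal sketch of the hive-env.sh change, assuming that layout (the path and jar name are taken from the ls output above, not from the HIVE-2379 report), could look like this:

# /etc/hive/conf/hive-env.sh
# Point Hive at the concrete HCatalog core jar instead of a wildcard path.
export HIVE_AUX_JARS_PATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar

After saving the file, re-run the failing hive command from a fresh shell so that the new value is picked up.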
