
Hadoop trying to use JDK install directory as executable command

I am new to Hadoop, and attempting to get the first simple 'Word count' example to run. I have the same problem that was reported here (but the responses there don't resolve the problem): Could not run jar file in hadoop3.1.3

Java is installed to c:\jdk-15, and this is configured in the hadoop-env.cmd file like this:

@rem The java implementation to use.  Required.
set JAVA_HOME=c:\jdk-15

As you can see from the full command prompt dump below, I get the following error. I'd be very grateful if anyone can guess what might be wrong here, or has any other suggestions.

"Launching container" [2020-09-30 15:17:56.848]Container exited with a non-zero exit code 1. Last 4096 bytes of stderr : '"C:\jdk-15"' is not recognized as an internal or external command, operable program or batch file.

PS C:\hadoop-3.2.1\bin> hadoop jar ..\..\Hadoop\WordCount\wc.jar WordCount /test /WordCount/run1
2020-09-30 15:17:51,675 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2020-09-30 15:17:51,979 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2020-09-30 15:17:52,052 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/Admin/.staging/job_1601461031349_0008
2020-09-30 15:17:52,259 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-09-30 15:17:52,426 INFO input.FileInputFormat: Total input files to process : 3
2020-09-30 15:17:52,592 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-09-30 15:17:52,762 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-09-30 15:17:52,829 INFO mapreduce.JobSubmitter: number of splits:3
2020-09-30 15:17:53,099 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-09-30 15:17:53,190 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1601461031349_0008
2020-09-30 15:17:53,191 INFO mapreduce.JobSubmitter: Executing with tokens: []
2020-09-30 15:17:53,338 INFO conf.Configuration: resource-types.xml not found
2020-09-30 15:17:53,339 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-09-30 15:17:53,389 INFO impl.YarnClientImpl: Submitted application application_1601461031349_0008
2020-09-30 15:17:53,425 INFO mapreduce.Job: The url to track the job: http://CORSAIR:8088/proxy/application_1601461031349_0008/
2020-09-30 15:17:53,426 INFO mapreduce.Job: Running job: job_1601461031349_0008
2020-09-30 15:17:57,447 INFO mapreduce.Job: Job job_1601461031349_0008 running in uber mode : false
2020-09-30 15:17:57,448 INFO mapreduce.Job:  map 0% reduce 0%
2020-09-30 15:17:57,460 INFO mapreduce.Job: Job job_1601461031349_0008 failed with state FAILED due to: Application application_1601461031349_0008 failed 2 times due to AM Container for appattempt_1601461031349_0008_000002 exited with  exitCode: 1
Failing this attempt.Diagnostics: [2020-09-30 15:17:56.845]Exception from container-launch.
Container id: container_1601461031349_0008_02_000001
Exit code: 1
Shell output:         1 file(s) moved.
Setting up env variables
Setting up job resources
Copying debugging information

C:\tmp\hadoop-Admin\nm-local-dir\usercache\Admin\appcache\application_1601461031349_0008\container_1601461031349_0008_02_000001>rem Creating copy of launch script

C:\tmp\hadoop-Admin\nm-local-dir\usercache\Admin\appcache\application_1601461031349_0008\container_1601461031349_0008_02_000001>copy "launch_container.cmd" "C:/hadoop-3.2.1/logs/userlogs/application_1601461031349_0008/container_1601461031349_0008_02_000001/launch_container.cmd"
        1 file(s) copied.

C:\tmp\hadoop-Admin\nm-local-dir\usercache\Admin\appcache\application_1601461031349_0008\container_1601461031349_0008_02_000001>rem Determining directory contents

C:\tmp\hadoop-Admin\nm-local-dir\usercache\Admin\appcache\application_1601461031349_0008\container_1601461031349_0008_02_000001>dir  1>>"C:/hadoop-3.2.1/logs/userlogs/application_1601461031349_0008/container_1601461031349_0008_02_000001/directory.info"
Launching container


[2020-09-30 15:17:56.848]Container exited with a non-zero exit code 1. Last 4096 bytes of stderr :
'"C:\jdk-15"' is not recognized as an internal or external command,
operable program or batch file.


[2020-09-30 15:17:56.848]Container exited with a non-zero exit code 1. Last 4096 bytes of stderr :
'"C:\jdk-15"' is not recognized as an internal or external command,
operable program or batch file.


For more detailed output, check the application tracking page: http://CORSAIR:8088/cluster/app/application_1601461031349_0008 Then click on links to logs of each attempt.
. Failing the application.
2020-09-30 15:17:57,477 INFO mapreduce.Job: Counters: 0
PS C:\hadoop-3.2.1\bin>

You placed the JDK path inside quotes in hadoop-env.cmd. Remove the quotes from the path. When the container launch script expands %JAVA_HOME% into a command line, the embedded quotes turn it into '"C:\jdk-15"', which cmd.exe then tries to run as a program, producing exactly the error you see. Also, use JDK 8 or JDK 11 instead of 15; Hadoop only supports JDK 8 and JDK 11.

set JAVA_HOME=C:\Java\jdk-11.0.13
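For reference, the relevant section of %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd should look roughly like the sketch below. The jdk-11.0.13 path is only an example; substitute your own install directory, and note the convention for paths containing spaces:

```
@rem The java implementation to use.  Required.
@rem No quotes around the path: the launch script builds the java
@rem command line from %JAVA_HOME% itself, so quotes here break it.
set JAVA_HOME=C:\Java\jdk-11.0.13

@rem If Java is installed under a path with spaces (e.g. Program Files),
@rem use the 8.3 short name instead of quoting the path:
@rem set JAVA_HOME=C:\PROGRA~1\Java\jdk-11.0.13
```

After editing the file, re-run hadoop-env.cmd (or restart the daemons) and verify with `echo %JAVA_HOME%` and `%JAVA_HOME%\bin\java -version` before resubmitting the job.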

You can refer to this video for a better understanding.
