MapReduce Hadoop StringTokenizer getting NoSuchElementException
I want to run a MapReduce job on a FreeBSD cluster with two nodes, but I am getting the following exception:
```
14/08/27 14:23:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/08/27 14:23:04 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/08/27 14:23:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
14/08/27 14:23:04 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-otlam/mapred/staging/otlam968414084/.staging/job_local968414084_0001
Exception in thread "main" java.util.NoSuchElementException
    at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.checkPermissionOfOther(ClientDistributedCacheManager.java:276)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.isPublic(ClientDistributedCacheManager.java:240)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineCacheVisibilities(ClientDistributedCacheManager.java:162)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:58)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    ...
```
This happens when I call `job.waitForCompletion(true);` on a new MapReduce job. The NoSuchElementException must be thrown because the StringTokenizer has no element left when `nextToken()` is called there. I looked into the source code and found the following section in RawLocalFileSystem.java:
```java
/// loads permissions, owner, and group from `ls -ld`
private void loadPermissionInfo() {
  IOException e = null;
  try {
    String output = FileUtil.execCommand(new File(getPath().toUri()),
        Shell.getGetPermissionCommand());
    StringTokenizer t =
        new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
    //expected format
    //-rw------- 1 username groupname ...
    String permission = t.nextToken();
    // ...
```
As far as I can see, Hadoop tries to determine the permissions of a particular file using `ls -ld`, which works perfectly when I run it in a console. Unfortunately, I have not yet found out which file's permissions it is looking for.
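The failure mode in the excerpt above can be reproduced with plain JDK classes: `loadPermissionInfo()` calls `nextToken()` without first checking `hasMoreTokens()`, so if the executed `ls -ld` produces empty (or otherwise unexpected) output, the very first `nextToken()` throws. A minimal sketch, with the separator characters written out literally since Hadoop's `Shell.TOKEN_SEPARATOR_REGEX` is not on the classpath here:

```java
import java.util.NoSuchElementException;
import java.util.StringTokenizer;

public class LsParseDemo {
    public static void main(String[] args) {
        // Normal case: ls -ld style output; the permission string is the first token.
        String ok = "-rw-r--r--  1 otlam  wheel  0 Aug 27 14:23 input.txt";
        StringTokenizer t = new StringTokenizer(ok, " \t\n\r\f");
        System.out.println("permission: " + t.nextToken());  // prints "permission: -rw-r--r--"

        // Failure case: the shell command produced no output at all.
        StringTokenizer empty = new StringTokenizer("", " \t\n\r\f");
        try {
            empty.nextToken();  // throws, just like StringTokenizer.java:349 in the stack trace
        } catch (NoSuchElementException e) {
            System.out.println("caught NoSuchElementException");
        }
    }
}
```

This suggests the exception is not about *which* permissions are returned but about `ls -ld` returning nothing (or a format the tokenizer cannot split) for some file Hadoop stats during job submission.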
The Hadoop version is 2.4.1 and the HBase version is 0.98.4, and I am using the Java API. Other operations, such as creating a table, work fine. Has anyone encountered a similar problem, or does anyone know what to do?
EDIT: I just found out that this is a Hadoop-related problem. Even the simplest MapReduce operation, without using HDFS, gives me the same exception.
Could you check whether this solves your problem? If it is a permissions issue, you can use the following.
```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

public static void main(String[] args) throws Exception {
    //set user group information
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
    //set privilege exception
    ugi.doAs(new PrivilegedExceptionAction<Void>() {
        public Void run() throws Exception {
            //create configuration object
            Configuration config = new Configuration();
            config.set("fs.defaultFS", "hdfs://ip:port/");
            config.set("hadoop.job.ugi", "hdfs");
            FileSystem dfs = FileSystem.get(config);
            // ... submit the job here, inside the privileged block
            return null;
        }
    });
}
```
Disclaimer: The technical posts on this site follow the CC BY-SA 4.0 license. If you need to repost, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.