
org.apache.commons.logging.Log cannot be resolved

When I try to declare a byte array using private byte[] startTag;

Eclipse marks this line as erroneous. Hovering over it, I get this message:

The type org.apache.commons.logging.Log cannot be resolved. It is indirectly referenced from required .class files

I tried adding the jar file to the classpath after looking at other solutions, but I am unable to remove the error. What should I do now?

If any specific jar file needs to be added, please mention it. Here is my code:

import java.io.IOException;  
import java.util.List;  
import org.apache.hadoop.fs.BlockLocation;  
import org.apache.hadoop.fs.FSDataInputStream;  
import org.apache.hadoop.fs.FileStatus;  
import org.apache.hadoop.fs.FileSystem;  
import org.apache.hadoop.fs.Path;  
import org.apache.hadoop.io.DataOutputBuffer;  
import org.apache.hadoop.io.LongWritable;  
import org.apache.hadoop.io.Text;  
import org.apache.hadoop.mapreduce.InputSplit;  
import org.apache.hadoop.mapreduce.JobContext;  
import org.apache.hadoop.mapreduce.RecordReader;  
import org.apache.hadoop.mapreduce.TaskAttemptContext;  
import org.apache.hadoop.mapreduce.lib.input.FileSplit;  
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

public class XmlInputFormat extends TextInputFormat {

    public static final String START_TAG_KEY = "<student>";
    public static final String END_TAG_KEY = "</student>";

    @Override
    public RecordReader<LongWritable, Text> createRecordReader(
       InputSplit split, TaskAttemptContext context) {
       return new XmlRecordReader();
    }

    public static class XmlRecordReader extends 
    RecordReader<LongWritable, Text> {
        private byte[] startTag;
        private byte[] endTag;
        private long start;
        private long end;
        private FSDataInputStream fsin;
        private DataOutputBuffer buffer = new DataOutputBuffer();
        private LongWritable key = new LongWritable();
        private Text value = new Text();

        @Override
        public void initialize(InputSplit is, TaskAttemptContext tac)
        throws IOException, InterruptedException {
            FileSplit fileSplit = (FileSplit) is;
            String START_TAG_KEY = "<employee>";
            String END_TAG_KEY = "</employee>";
            startTag = START_TAG_KEY.getBytes("utf-8");
            endTag = END_TAG_KEY.getBytes("utf-8");

            start = fileSplit.getStart();
            end = start + fileSplit.getLength();
            Path file = fileSplit.getPath();

            FileSystem fs = file.getFileSystem(tac.getConfiguration());
            fsin = fs.open(fileSplit.getPath());
            fsin.seek(start);

        }

        @Override
        public boolean nextKeyValue() throws IOException, InterruptedException {
            if (fsin.getPos() < end) {
                if (readUntilMatch(startTag, false)) {
                    try {
                        buffer.write(startTag);
                        if (readUntilMatch(endTag, true)) {

                            value.set(buffer.getData(), 0,     
                                 buffer.getLength());
                            key.set(fsin.getPos());
                            return true;
                        }
                    } finally {
                        buffer.reset();
                    }
                }
            }
            return false;
        }

        @Override
        public LongWritable getCurrentKey() throws IOException,
        InterruptedException {
            return key;
        }

        @Override
        public Text getCurrentValue() throws IOException,       
          InterruptedException {
            return value;

        }

        @Override
        public float getProgress() throws IOException, 
          InterruptedException {
            return (fsin.getPos() - start) / (float) (end - start);
        }

        @Override
        public void close() throws IOException {
            fsin.close();
        }

        private boolean readUntilMatch(byte[] match, boolean withinBlock) throws IOException {
            int i = 0;
            while (true) {
                int b = fsin.read();

                if (b == -1)
                    return false;

                if (withinBlock)
                    buffer.write(b);

                if (b == match[i]) {
                    i++;
                    if (i >= match.length)
                    return true;
                } else
                    i = 0;

                if (!withinBlock && i == 0 && fsin.getPos() >= end)
                    return false;
            }
        }

    }

}

I have solved the issue by finding the .jar library inside $HADOOP_HOME. Here is a screenshot to explain better:

[Screenshot of Eclipse showing the .jar library added to the project]
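For context on why the error appears even though the code above never imports Log: Hadoop classes such as FileSystem are themselves compiled against org.apache.commons.logging.Log, so the Eclipse compiler must be able to resolve that type whenever it compiles code that references those classes. The sketch below is a minimal, hypothetical illustration of such an "indirect" reference (LibraryClass and UsesLibrary are made-up names, not Hadoop classes):

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

// Stand-in for a library class (e.g. a Hadoop class) that is compiled against commons-logging.
class LibraryClass {
    static final Log LOG = LogFactory.getLog(LibraryClass.class);
}

// This class never imports Log itself, yet the compiler still has to resolve Log
// (referenced by its superclass) in order to compile it. If LibraryClass lived in a
// pre-built jar and commons-logging were missing from the build path, Eclipse would
// report "The type org.apache.commons.logging.Log cannot be resolved. It is indirectly
// referenced from required .class files" on this class.
public class UsesLibrary extends LibraryClass {
    public static void main(String[] args) {
        System.out.println("compiled: commons-logging was resolvable on the build path");
    }
}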

I've also answered a similar problem on this thread: https://stackoverflow.com/a/73427233/6685449
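Once the commons-logging jar is on the build path (it usually ships with Hadoop, for example under $HADOOP_HOME/lib or $HADOOP_HOME/share/hadoop/common/lib depending on the version), a quick sanity check like the sketch below can confirm that the class resolves and show which jar it is actually loaded from. CommonsLoggingCheck is just an illustrative name:

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class CommonsLoggingCheck {
    public static void main(String[] args) {
        // Resolves only if a commons-logging jar is on the classpath/build path.
        Log log = LogFactory.getLog(CommonsLoggingCheck.class);
        log.info("org.apache.commons.logging.Log resolved successfully");

        // Print the location the Log class was loaded from, to confirm Eclipse
        // is picking up the jar that was added.
        System.out.println(Log.class.getProtectionDomain().getCodeSource().getLocation());
    }
}

Run it as a plain Java application; the printed URL should point at the commons-logging jar on the build path.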

