Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected

I'm trying to use MRUnit 1.0.0 to test a Hadoop v2 Reducer, but I get an exception when running the test:

java.lang.IncompatibleClassChangeError: 
    Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected
                at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper.createCommon(AbstractMockContextWrapper.java:59)
                at org.apache.hadoop.mrunit.internal.mapreduce.MockReduceContextWrapper.create(MockReduceContextWrapper.java:76)
                at org.apache.hadoop.mrunit.internal.mapreduce.MockReduceContextWrapper.<init>(MockReduceContextWrapper.java:67)
                at org.apache.hadoop.mrunit.mapreduce.ReduceDriver.getContextWrapper(ReduceDriver.java:159)
                at org.apache.hadoop.mrunit.mapreduce.ReduceDriver.run(ReduceDriver.java:142)
                at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:574)
                at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:561)

I assume this means I'm somehow mismatching versions of the Hadoop APIs, as in this SO question, but I'm not sure where the problem is. I'm pulling in dependencies with Maven like so, using Hadoop 2.2.0.2.0.6.0-76 from repo.hortonworks.com and MRUnit 1.0.0 from repo1.maven.org:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-yarn-common</artifactId>
    <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
    <groupId>org.apache.mrunit</groupId>
    <artifactId>mrunit</artifactId>
    <version>1.0.0</version>
    <classifier>hadoop2</classifier>
</dependency>

The test case is as follows:

@Test
public void testReducer() throws IOException, InterruptedException {
    HH.Reduce r = new HH.Reduce();

    T1 fx1 = new T1();
    T1 fx2 = new T1();

    List<T1> values = new ArrayList<T1>();
    values.add(fx1);
    values.add(fx2);

    T1 fxBoth = new T1(fx1.size() + fx2.size());
    fxBoth.addValues(fx1);
    fxBoth.addValues(fx2);

    ReduceDriver<NullWritable, T1, NullWritable, T1> reduceDriver = ReduceDriver.newReduceDriver(r);

    reduceDriver.withInput(NullWritable.get(), values);
    reduceDriver.withOutput(NullWritable.get(), fxBoth);

    // TODO I can't seem to get this test to work.  
    // Not sure what I'm doing wrong, whether it's a real 
    // problem or a testing problem.
    reduceDriver.runTest();
}

Elsewhere, in the HH package, Reduce is defined as a pretty simple static nested class:

public static class Reduce extends Reducer<NullWritable, T1, NullWritable, T1> {
    @Override
    public void reduce(NullWritable key, Iterable<T1> values, Context context)
        throws InterruptedException, IOException {

        // Need to create a new record here, because the one we're handed
        // may be recycled by our overlords.
        T1 out = new T1();
        for (T1  t : values) {
            out.addValues(t);
        }
        context.write(key, out);
    }
}

See anything wonky? Is MRUnit trying to use an older/newer version of the APIs?

I believe I had the same issue, but I was using hadoop-core 1.2.1 with mrunit-hadoop2-1.1.0. Check your versions and your classifier in your Maven dependencies (the ones actually resolved at test time, not just those declared in pom.xml).
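
A quick way to see which Hadoop and MRUnit artifacts actually end up on the classpath is Maven's dependency plugin, which prints each resolved artifact with its version and classifier:

    mvn dependency:tree -Dincludes=org.apache.hadoop,org.apache.mrunit

If any hadoop-core 1.x jar shows up in that tree alongside the Hadoop 2 artifacts, that mismatch is the likely culprit.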

The classifier in the mrunit Maven dependency is very important.

As you said, you are using hadoop-core 1.2.1, and TaskAttemptContext is a class in that jar. So you need to set the classifier to hadoop1 in the mrunit Maven dependency. Then this works without any issues.
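
For example, with the mrunit 1.1.0 mentioned above, the dependency would look something like this (a sketch; substitute whichever mrunit version you actually use):

<dependency>
    <groupId>org.apache.mrunit</groupId>
    <artifactId>mrunit</artifactId>
    <version>1.1.0</version>
    <!-- hadoop1 targets the old API, where TaskAttemptContext is a class -->
    <classifier>hadoop1</classifier>
</dependency>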

If you set the classifier to hadoop2, MRUnit expects the newer API, in which TaskAttemptContext is an interface. You can then simply run the test in JUnit and check the result.
