
Java SparkContext error: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator

This is my first foray into Java on Spark. With Spark 1.X (tried 1.5.0) or 2.X (tried 2.2.0), Java 1.8, and Scala 2.10, the following error occurs when creating the context:

JavaSparkContext sc = new JavaSparkContext(sparkConf);

Exception in thread "main" java.lang.NoSuchMethodError: 
io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:120)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:107)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:70)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:450)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:246)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at KMeansMP.main(KMeansMP.java:38)

I assume this is a library mismatch, but I have not been able to isolate the exact incompatibility. Here is the relevant section of the pom.xml:

<properties>
    <spark.version>2.2.0</spark.version>
</properties>

..

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
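
One way to isolate which artifact is pulling in a conflicting Netty build is Maven's dependency tree report, filtered to io.netty (a diagnostic sketch; the versions it prints depend on your own dependency set):

mvn dependency:tree -Dincludes=io.netty

Any artifact other than spark-core that brings in netty-all is a candidate for the kind of conflict shown in the stack trace above.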

Any tips from experienced Java Spark users are welcome.

Both spark-core and giraph-core depend on netty-all. You need to exclude it from giraph-core:

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
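
If you would rather not touch the giraph-core declaration, an alternative sketch is to pin a single netty-all version for the whole build via dependencyManagement; the version below is only a placeholder and should be replaced with whatever the dependency tree reports for spark-core in your Spark release:

<dependencyManagement>
    <dependencies>
        <!-- force one netty-all everywhere; match this to the version spark-core expects -->
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.0.43.Final</version>
        </dependency>
    </dependencies>
</dependencyManagement>

Either way, the goal is the same: ensure the netty-all that ends up on the classpath is the one Spark was built against, which provides the PooledByteBufAllocator constructor the stack trace fails to find.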
