
Java SparkContext error: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator

This is my first foray into Spark from Java. The following error occurs with either Spark 1.x (tried 1.5.0) or 2.x (tried 2.2.0), on Java 1.8 and Scala 2.10, when constructing the context:

JavaSparkContext sc = new JavaSparkContext(sparkConf);

Exception in thread "main" java.lang.NoSuchMethodError: 
io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:120)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:107)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:70)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:450)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:246)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at KMeansMP.main(KMeansMP.java:38)
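
For reference, here is a minimal driver sketch that hits the same constructor (the class name and local master are placeholders; KMeansMP.java itself is not shown):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Minimal sketch: constructing any JavaSparkContext triggers the error,
// because Spark's RPC layer creates its netty buffer allocator eagerly.
public class SparkContextSmokeTest {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setAppName("SparkContextSmokeTest")
                .setMaster("local[*]");
        // Fails here with the NoSuchMethodError, before any job runs
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        System.out.println("Spark version: " + sc.version());
        sc.stop();
    }
}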

I assume this is a library mismatch, but I have not been able to isolate the exact incompatibility. Here is the relevant part of pom.xml:

<properties>
    <spark.version>2.2.0</spark.version>
</properties>

..

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>

Tips from anyone experienced with Spark in Java are welcome.

Both spark-core and giraph-core depend on netty-all, and giraph-core pulls in an older version that wins Maven's dependency resolution but lacks the PooledByteBufAllocator constructor Spark calls. You need to exclude netty-all from giraph-core:

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
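
To confirm the exclusion took effect, inspect which netty artifacts Maven actually resolves; mvn dependency:tree comes with the standard maven-dependency-plugin:

mvn dependency:tree -Dincludes=io.netty

Before the exclusion, both the older netty-all from giraph-core and the newer one from spark-core appear in the tree; afterwards only the version Spark pulls in should remain.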

