
ERROR SparkContext: Error initializing SparkContext. java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE

I am trying to run a Spark job but am getting the error below.

21/12/24 15:40:43 ERROR SparkContext: Error initializing SparkContext.
java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
    at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131)
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118)
    at org.apache.spark.network.server.TransportServer.init(TransportServer.java:95)

Here are the Netty dependencies being used:

netty-3.7.0.Final.jar
netty-all-4.0.43.Final.jar
netty-buffer-4.1.69.Final.jar
netty-codec-4.1.69.Final.jar
netty-codec-http-4.1.69.Final.jar
netty-codec-socks-4.1.60.Final.jar
netty-common-4.1.69.Final.jar
netty-handler-4.1.69.Final.jar
netty-handler-proxy-4.1.60.Final.jar
netty-resolver-4.1.69.Final.jar
netty-transport-4.1.69.Final.jar
netty-transport-native-epoll-4.1.69.Final.jar
netty-transport-native-epoll-4.1.60.Final-linux-x86_64.jar
netty-transport-native-kqueue-4.1.69.Final.jar
netty-transport-native-kqueue-4.1.60.Final-osx-x86_64.jar
netty-transport-native-unix-common-4.1.69.Final.jar

I have also tried netty-all version 4.0.43, but somehow I get the same error. Spark version used: 2.2.3. Can anyone please help me understand why this issue occurs?
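For reference, a quick way to confirm which Netty jar actually wins on the classpath, and whether it still declares the field Spark looks up by reflection (a minimal Scala sketch; run it in spark-shell or any JVM started with the same classpath):

    // Which jar supplies PooledByteBufAllocator on this classpath?
    val cls = Class.forName("io.netty.buffer.PooledByteBufAllocator")
    println(cls.getProtectionDomain.getCodeSource.getLocation)
    // true on old Netty (e.g. netty-all 4.0.43), false on the newer 4.1.x jars,
    // where the DEFAULT_TINY_CACHE_SIZE field no longer exists
    println(cls.getDeclaredFields.exists(_.getName == "DEFAULT_TINY_CACHE_SIZE"))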

Ensure you force the same Netty version for everything; you have multiple versions on the classpath. Just use 4.1.72.Final.
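For example, with an sbt build you can pin every transitive Netty artifact to one version (a minimal sketch, assuming sbt; a Maven build would use a <dependencyManagement> entry instead):

    // build.sbt: force a single Netty version across all transitive dependencies.
    // The artifact list below is illustrative; extend it to cover every
    // netty-* module that appears in your dependency report.
    dependencyOverrides ++= Seq(
      "io.netty" % "netty-all"       % "4.1.72.Final",
      "io.netty" % "netty-buffer"    % "4.1.72.Final",
      "io.netty" % "netty-common"    % "4.1.72.Final",
      "io.netty" % "netty-handler"   % "4.1.72.Final",
      "io.netty" % "netty-transport" % "4.1.72.Final"
    )

Afterwards, verify with a dependency report (e.g. mvn dependency:tree) that only one version of each netty-* artifact remains, and remove unmanaged stray jars such as netty-3.7.0.Final.jar from the classpath by hand.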
