
Exception when running DL4J example

I have cloned the DL4J examples and am just trying to run one of them. The one I am trying is LogDataExample.java. The project builds successfully and everything seems fine, except that when starting it the following exception is thrown:

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
    at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
    at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:138)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
    at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:78)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:73)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:60)
    at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at org.datavec.transform.logdata.LogDataExample.main(LogDataExample.java:85)

I was not able to find anything online that would help me fix this. My code is exactly the same as in the example.

pom.xml contains the following:

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.46.Final</version>
</dependency>

I think you are forcing a newer version of netty than Spark supports.

By running mvn dependency:tree you can see what version of Netty Spark wants, and use that instead of the one you've defined.
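One way to apply that, assuming the clash really is with the explicit netty-all dependency, is to either delete that dependency from your pom.xml entirely (letting Spark's transitive version win) or pin it via dependencyManagement. A sketch of the second option — the version here is a placeholder, not a real recommendation; substitute whatever `mvn dependency:tree -Dincludes=io.netty` reports for your Spark version:

```xml
<!-- Sketch only: pin netty-all to the version Spark itself resolves.
     ${spark.netty.version} is a placeholder property you would define
     yourself, set to the version shown in the dependency:tree output. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>${spark.netty.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Pinning in dependencyManagement makes that version win across the whole project, including transitive pulls, which is usually what you want when two libraries disagree about Netty.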

If you don't care about Spark, but just want to use DataVec to transform your data, take a look at https://www.dubs.tech/guides/quickstart-with-dl4j/ . It is a little outdated regarding the dependencies, but the DataVec part shows how to use it without Spark.
