Getting java.net.ConnectException when running word count program in Spark Streaming
I am trying to run a word count program in Spark Streaming, but I am getting the error below. I am using nc -lk 9999 as the input source.
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object wordcount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WordCount")
    val ssc = new StreamingContext(conf, Seconds(5))
    val lines = ssc.socketTextStream("localhost", 9999)
    val words = lines.flatMap(_.split(" "))
    val pairs = words.map(word => (word, 1))
    val wordcount = pairs.reduceByKey(_ + _)
    wordcount.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
WARN ReceiverSupervisorImpl: Restarting receiver with delay 2000 ms: Error connecting to localhost:9999
java.net.ConnectException: Connection refused: connect
    at java.net.DualStackPlainSocketImpl.connect0(Native Method)
    at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at java.net.Socket.connect(Socket.java:538)
    at java.net.Socket.<init>(Socket.java:434)
    at org.apache.spark.streaming.dstream.SocketReceiver.receive(SocketInputDStream.scala:73)
    at org.apache.spark.streaming.dstream.SocketReceiver$$anon$2.run(SocketInputDStream.scala:59)
I have faced this problem before. This exception occurs when Spark tries to connect to port 9999 but the netcat server has not been started yet. So before connecting Spark to port 9999, make sure your netcat is running.
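The cause described above can be demonstrated without Spark at all: a plain socket connect to a port with no listener fails with the same java.net.ConnectException ("Connection refused"), and succeeds once something is listening on that port, which is exactly the role nc -lk 9999 plays for Spark's SocketReceiver. A minimal sketch, assuming port 9999 is otherwise free and using a hypothetical PortProbe class:

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class PortProbe {
    // Attempt a TCP connection; true only if a listener accepts it.
    static boolean canConnect(String host, int port) {
        try (Socket s = new Socket(host, port)) {
            return true;
        } catch (IOException e) {
            // With nothing listening, this is the same "Connection refused"
            // java.net.ConnectException that the Spark receiver logs.
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        int port = 9999;

        // No listener yet: the connection is refused.
        System.out.println("before listener: " + canConnect("localhost", port));

        // Once a listener is up (nc -lk 9999 plays this part for Spark),
        // the same connection attempt succeeds.
        try (ServerSocket listener = new ServerSocket(port)) {
            System.out.println("after listener: " + canConnect("localhost", port));
        }
    }
}
```

This is why the order matters: start netcat first, then submit the streaming job, and the receiver's connection attempt finds a listener instead of a closed port.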
You can take a look at the answer in this solution.