
Spark Streaming: number of executors vs. Custom Receiver

Why can Spark with one worker node and four executors, each with one core, not process data from a Custom Receiver?

What is the reason that incoming data from a Custom Receiver is not processed when the executor has only a single core in Spark Streaming?

I am running Spark in standalone mode, receiving data through custom receivers in a Spark Streaming app. My laptop has 4 cores.

master="spark://lappi:7077"

$spark_path/bin/spark-submit --executor-cores 1 --total-executor-cores 4 \
  --class "my.class.path.App" \
  --master $master
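
For context, here is a minimal sketch of what such a custom receiver and app might look like, modeled on the custom receiver example in the Spark Streaming programming guide. The socket source, host/port, class names, and the word-count logic are illustrative assumptions, not the asker's actual App:

import java.io.{BufferedReader, InputStreamReader}
import java.net.Socket
import java.nio.charset.StandardCharsets

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver reading newline-delimited text from a TCP socket.
class SocketLineReceiver(host: String, port: Int)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  // Must not block: start the reading loop on its own thread.
  def onStart(): Unit = {
    new Thread("Socket Line Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  // The reading loop checks isStopped(), so nothing to clean up here.
  def onStop(): Unit = {}

  private def receive(): Unit = {
    try {
      val socket = new Socket(host, port)
      val reader = new BufferedReader(
        new InputStreamReader(socket.getInputStream, StandardCharsets.UTF_8))
      var line = reader.readLine()
      while (!isStopped && line != null) {
        store(line)              // hand each record over to Spark
        line = reader.readLine()
      }
      reader.close()
      socket.close()
      restart("Trying to connect again")
    } catch {
      case t: Throwable => restart("Error receiving data", t)
    }
  }
}

object App {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("CustomReceiverApp")
    val ssc = new StreamingContext(conf, Seconds(5))
    // The receiver itself occupies one core; the transformations below
    // need at least one additional core to actually run.
    val lines = ssc.receiverStream(new SocketLineReceiver("localhost", 9999))
    lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).print()
    ssc.start()
    ssc.awaitTermination()
  }
}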

You indicate that your (single) executor should have 1 core reserved for Spark, which means you use 1 of your 4 cores. The total-executor-cores parameter never becomes limiting here: it caps the total number of cores across the cluster reserved for Spark, and per your previous setting that total is already 1.

The receiver consumes one thread to ingest data, and that is the only one you have available, so no core is left to process the data. All of this is explained in the doc: https://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams-and-receivers
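
The same pitfall is easy to reproduce in local mode, where the master URL fixes the core count directly; the doc linked above makes the same point. A small sketch (the app name is a placeholder):

val conf = new SparkConf()
  .setAppName("CustomReceiverApp")
  .setMaster("local[1]")  // 1 core total: the receiver occupies it, so no batch ever gets processed
// Use local[n] with n greater than the number of receivers, e.g. local[2],
// so at least one core remains free for processing.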

You want to bump that executor-cores parameter to 4.
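
Keeping the original placeholders, the corrected submission would then look like:

$spark_path/bin/spark-submit --executor-cores 4 --total-executor-cores 4 \
  --class "my.class.path.App" \
  --master $master

With 4 cores on the executor, one core goes to the receiver and three remain for processing the received batches.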
