
Spark Streaming: number of executors vs Custom Receiver

Why can't Spark, with one worker node and four executors, each with one core, process data from a Custom Receiver?

What is the reason incoming data is not processed via the Custom Receiver when each executor has only a single core in Spark Streaming?

I am running Spark in standalone mode. I am receiving data through custom receivers in a Spark Streaming app. My laptop has 4 cores.

master="spark://lappi:7077"

$spark_path/bin/spark-submit --executor-cores 1 --total-executor-cores 4 \
    --class "my.class.path.App" \
    --master $master

You indicate that your (single) executor should have 1 core reserved for Spark, which means you use 1 of your 4 cores. The total-executor-cores parameter is never the limiting factor here, since it only caps the total number of cores reserved for Spark across the cluster, which, given your previous setting, is already 1.

The receiver consumes one thread out of the single one available to ingest data, which means no core is left to process the data. All of this is explained in the docs: https://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams-and-receivers
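The same starvation is easy to reproduce in local mode. This is an illustrative sketch, not from the question: the class name is reused from above, and the application jar name is an assumed placeholder.

```shell
# With local[1], the receiver occupies the only available thread,
# so batches are never processed. local[2] leaves one thread free
# for processing alongside the receiver.
$spark_path/bin/spark-submit --master "local[1]" \
    --class "my.class.path.App" app.jar   # receiver starves processing

$spark_path/bin/spark-submit --master "local[2]" \
    --class "my.class.path.App" app.jar   # 1 thread receives, 1 processes
```

This is why the streaming guide recommends never using `local` or `local[1]` when running receiver-based input streams.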

You want to bump the executor-cores parameter to 4.
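A sketch of the adjusted submit command, reusing the variables from the question (the application jar name is an assumed placeholder):

```shell
# Give the single executor all 4 cores: 1 goes to the receiver thread,
# leaving 3 for processing the received batches.
$spark_path/bin/spark-submit --executor-cores 4 --total-executor-cores 4 \
    --class "my.class.path.App" \
    --master $master app.jar
```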
