
Spark Streaming: can we create a thread on an executor?

I have a question about Spark Streaming. In my Spark Streaming application, I have code that runs on a worker/executor as a task (inside foreachPartition() while processing an RDD). As part of this code, I want to create a thread that runs continuously on the executor/worker from the time it is launched until the executor dies, listening for some external events and taking some action based on them.

Is this possible to do in Spark Streaming?
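For reference, a rough sketch of the setup described above. The names (ensureListenerStarted, processRecord) are hypothetical placeholders, not part of any Spark API; the point is only to show where in the task code the long-lived thread would be started.

```scala
// Sketch of the question's setup, with illustrative placeholder names.
dstream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // This closure runs on an executor as part of a task. The idea in the
    // question is to start a background thread here that keeps listening
    // for external events for as long as the executor JVM is alive.
    ensureListenerStarted()          // hypothetical: start the thread once per executor
    records.foreach(processRecord)   // hypothetical per-record processing
  }
}
```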

You could try to fit this into a custom receiver. You can find some details in Implementing a Custom Receiver. Otherwise it doesn't fit very well into Spark Streaming.
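A minimal sketch of what such a custom receiver could look like, based on the pattern from the Spark custom-receiver guide. The class name, the endpoint parameter, and pollExternalSource are assumptions standing in for whatever external event source you have; onStart(), onStop(), isStopped() and store() are the actual Receiver API hooks.

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver that listens for external events on a background
// thread running on the executor that hosts the receiver, and pushes each
// event into the stream via store().
class ExternalEventReceiver(endpoint: String)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  override def onStart(): Unit = {
    // onStart() must return quickly, so the listening loop runs on its own thread.
    new Thread("External Event Listener") {
      override def run(): Unit = listen()
    }.start()
  }

  override def onStop(): Unit = {
    // Nothing to clean up here: the isStopped() check below ends the loop.
  }

  private def listen(): Unit = {
    while (!isStopped()) {
      // Placeholder: poll the external source and hand any event to Spark.
      pollExternalSource(endpoint).foreach(store)
    }
  }

  // Hypothetical helper standing in for the real event-source client.
  private def pollExternalSource(endpoint: String): Option[String] = None
}
```

It would then be wired into the streaming context with something like `ssc.receiverStream(new ExternalEventReceiver("host:port"))`, giving you a DStream of the external events instead of a detached thread inside task code.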

It is possible to start a thread on the driver, but I understand that is not what you want.

