
DynamicAllocation enabled with Spark on Kubernetes?

The latest documentation for Spark 2.4.5 lists "Dynamic Resource Allocation and External Shuffle Service" under Future work; however, I have also found some older documentation for Spark 2.2.0 suggesting it is supported after setting up an external shuffle service.

Have you successfully enabled Spark dynamic allocation on Kubernetes? If so, what challenges did you face and which documentation did you reference?

We are currently using the AWS EMR service for Spark and would like to try out Spark on Kubernetes with dynamic allocation enabled.

Thanks!

The older docs belong to the older Spark fork repo, which was used as a basis and proof of concept for the Kubernetes-related work in the main Apache Spark repository. If you want this feature enabled, you are restricted to using only this older Spark 2.2.0 fork. Note that it is not recommended for PROD.
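For reference, a minimal sketch of what enabling dynamic allocation looked like on that fork. The `spark.dynamicAllocation.*` and `spark.shuffle.service.enabled` settings are standard Spark configuration; the `spark.kubernetes.shuffle.*` keys and their values are assumptions based on the fork's documentation (it expected an external shuffle service DaemonSet that executors locate by namespace and labels), so treat them as illustrative rather than definitive:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: assumes the Spark 2.2.0 Kubernetes fork with an external
// shuffle service DaemonSet already deployed in the cluster.
val spark = SparkSession.builder()
  .appName("dynamic-allocation-on-k8s-sketch")
  // Standard dynamic allocation settings.
  .config("spark.dynamicAllocation.enabled", "true")
  .config("spark.shuffle.service.enabled", "true")
  .config("spark.dynamicAllocation.minExecutors", "1")
  .config("spark.dynamicAllocation.maxExecutors", "10")
  // Fork-specific (assumed): where to find the shuffle service pods.
  .config("spark.kubernetes.shuffle.namespace", "default")                   // hypothetical value
  .config("spark.kubernetes.shuffle.labels", "app=spark-shuffle-service")    // hypothetical value
  .getOrCreate()
```

The same settings could equally be passed as `--conf` flags to `spark-submit`; the key point is that without the external shuffle service (and hence on mainline Spark 2.4.5 on Kubernetes), dynamic allocation is not available.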
