
Reducing Dataproc Serverless CPU quota

Aim: I want to run Spark jobs on Dataproc Serverless for Spark.

Problem: The minimum CPU requirement for a Spark app is 12 cores. That doesn't fit into the default regional CPU quota we have and would require us to expand it. 12 cores is overkill for us; we don't want to expand the quota.

Details: This page documents the minimum requirements for Dataproc Serverless for Spark: https://cloud.google.com/dataproc-serverless/docs/concepts/properties

They are as follows: (a) 1 driver and 2 executor nodes, (b) 4 cores per node.

Hence, a total of 12 CPU cores is required.
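The arithmetic behind that floor can be sketched with the standard Spark resource properties (the property names below are ordinary Spark settings that Dataproc Serverless accepts; the values are the documented minimums):

```python
# Documented minimum shape for a Dataproc Serverless for Spark workload:
# 1 driver node and 2 executor nodes, each with 4 cores.
min_properties = {
    "spark.driver.cores": 4,
    "spark.executor.cores": 4,
    "spark.executor.instances": 2,
}

# Total regional CPU quota the batch consumes at minimum:
total_cores = (
    min_properties["spark.driver.cores"]
    + min_properties["spark.executor.instances"]
    * min_properties["spark.executor.cores"]
)
print(total_cores)  # 4 + 2 * 4 = 12
```

Any attempt to set these properties lower than the documented minimums is rejected, which is why the 12-core total cannot be reduced through configuration.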

Can we bypass this and run Dataproc Serverless for Spark with fewer CPU cores?

Right now a Dataproc Serverless for Spark workload requires 12 CPU cores to run; this is a hard minimum that you cannot bypass.

We are working on relaxing this requirement, but it will not be available until at least Q3 2023.
