
Dataproc submit a Hadoop job via Python client

I am trying to use the Dataproc API and convert my gcloud commands into API calls, but I can't find a good example in the documentation.

%pip install google-cloud-dataproc

The only good sample I have found is this one, which works well:

from google.cloud import dataproc_v1

client = dataproc_v1.ClusterControllerClient()

project_id = 'test-project'
region = 'global'

for element in client.list_clusters(project_id, region):   
    print('Dataproc cluster name:', element.cluster_name)

I need to convert the following gcloud command into Python code:

gcloud dataproc jobs submit hadoop --cluster "${CLUSTER_NAME}" \
    --class com.mycompany.product.MyClass \
    --jars "${JAR_FILE}" -- \
    --job_venv=venv.zip \
    --job_binary_path=venv/bin/python3.5 \
    --job_executes program.py
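For reference, everything after the bare `--` in that gcloud command ends up as the job's `args`, while `--class` and `--jars` map to `main_class` and `jar_file_uris`. Since the GAPIC-based Dataproc client generally accepts plain dicts in place of proto messages, the same job can be sketched as a dict (the bucket/jar URI below is a placeholder, not the real `${JAR_FILE}`):

```python
# Sketch of the gcloud command as a Job resource dict.
# The jar URI is a hypothetical placeholder for ${JAR_FILE}.
job = {
    "placement": {"cluster_name": "your_cluster_name"},
    "hadoop_job": {
        "main_class": "com.mycompany.product.MyClass",   # from --class
        "jar_file_uris": ["gs://your-bucket/your.jar"],  # from --jars
        # Everything after the bare "--" in the gcloud command:
        "args": [
            "--job_venv=venv.zip",
            "--job_binary_path=venv/bin/python3.5",
            "--job_executes program.py",
        ],
    },
}
```

This dict mirrors the `dataproc_v1.types.Job` structure field for field, so it can help when cross-checking the typed version against the original command.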

This works:

project_id = 'your project'
region = 'global'

# Define Job arguments:

job_args = ['--job_venv=venv.zip',
            '--job_binary_path=venv/bin/python3.5',
            '--job_executes program.py']


job_client = dataproc_v1.JobControllerClient()

# Create Hadoop Job
hadoop_job = dataproc_v1.types.HadoopJob(
    jar_file_uris=[JAR_FILE],
    main_class='com.mycompany.product.MyClass',
    args=job_args)

# Define Remote cluster to send Job
job_placement = dataproc_v1.types.JobPlacement()
job_placement.cluster_name = 'your_cluster_name'

# Define Job configuration
main_job = dataproc_v1.types.Job(hadoop_job=hadoop_job, placement=job_placement)

# Send job
job_client.submit_job(project_id, region, main_job)

# Monitor in Dataproc UI or perform another API call to track status
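The "another API call" mentioned above is polling the job state (with this client version, something like `job_client.get_job(project_id, region, job_id).status.state`). A minimal, client-agnostic polling sketch: `fetch_state` stands in for whatever callable returns the job's current state name, so the wiring to the real client is an assumption here.

```python
import time

def wait_for_job(fetch_state, timeout=300, interval=5, sleep=time.sleep):
    """Poll fetch_state() until the job reaches a terminal state.

    fetch_state is a hypothetical zero-argument callable returning the
    job's state name (e.g. "PENDING", "RUNNING", "DONE"); with the real
    client it would wrap a get_job call on the submitted job's id.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = fetch_state()
        if state in ("DONE", "ERROR", "CANCELLED"):
            return state
        sleep(interval)
    raise TimeoutError("job did not finish within %s seconds" % timeout)
```

Keeping the polling loop separate from the client call makes it easy to unit-test with a fake `fetch_state` and to tune `timeout`/`interval` per job.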

