
Apache Beam pipeline runs with DirectRunner, but fails with DataflowRunner (SDK harness sdk-0-0 disconnected) during the initial read step

TL;DR

We have a default VPC and are trying to run a Dataflow job. The initial step (reading a file) manages to complete 1 of 2 steps. We then get a JOB_MESSAGE_ERROR: SDK harness sdk-0-0 disconnected error message, with nothing else in the logs. We have already tried setting up roles and VPC firewall rules.

Question

I want to run a Dataflow job using the geobeam image (Apache Beam Python 3.9 SDK 2.41.0). I have defined the job as follows:

def run(pipeline_args, known_args):
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from geobeam.io import GeoJSONSource
    from geobeam.fn import format_record, make_valid, filter_invalid

    # All runner configuration (runner, project, region, etc.) is passed
    # through from the command line.
    pipeline_options = PipelineOptions(pipeline_args)

    with beam.Pipeline(options=pipeline_options) as p:
        (p
         | beam.io.Read(GeoJSONSource(known_args.gcs_url, encoding='utf-8'))
         # Drop geometries that have only a single coordinate pair.
         | 'FilterCords' >> beam.Filter(lambda x: len(x[-1]["coordinates"]) > 1)
         | 'MakeValid' >> beam.Map(make_valid)
         | 'FilterInvalid' >> beam.Filter(filter_invalid)
         | 'FormatRecords' >> beam.Map(format_record)
         | beam.io.WriteToText(known_args.gcs_write_url)
        )

if __name__ == '__main__':
    import logging
    import argparse

    logging.getLogger().setLevel(logging.INFO)

    parser = argparse.ArgumentParser()
    parser.add_argument('--gcs_url')
    parser.add_argument('--gcs_write_url')
    known_args, pipeline_args = parser.parse_known_args()

    run(pipeline_args, known_args)

I run the job with the following command:

python -m main --runner DataflowRunner --project [[project_id]] \
--temp_location gs://[[temp_bucket_name]]/tmp \
--gcs_url gs://[[inputbucket_name]]/[[filename]].geojson \
--region europe-north1  --sdk_container_image gcr.io/dataflow-geobeam/example \
--gcs_write_url gs://[[outputbucket_name]]/[[filename]]_processed.geojson \
--subnetwork [[full_link_to_subnet]]

We have set up a custom default VPC, and I have added the recommended ranges in ingress/egress firewall rules for Compute Engine VM resources in GCP. I have also given the default service account used by the Dataflow job the following roles:

  • Compute Network User
  • Dataflow Admin
  • Dataflow Worker
  • Storage Object Admin
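
For reference, bindings of this kind can be granted roughly as follows, repeated for each role (roles/compute.networkUser, roles/dataflow.admin, roles/dataflow.worker, roles/storage.objectAdmin). This is a sketch: the member address assumes the default Compute Engine service account, and [[project_number]] is a placeholder.

gcloud projects add-iam-policy-binding [[project_id]] \
  --member="serviceAccount:[[project_number]]-compute@developer.gserviceaccount.com" \
  --role="roles/dataflow.worker"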

I have also given my own user the following roles on the service account:

  • Owner
  • Service Account Admin

Output from the pipeline

[Image: pipeline graph showing step 1/2 of the initial read step succeeding]

The console says the job has stopped, but that is only because the job makes no further progress. I get the following log output:

INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-10-18_05_33_31-17288646308046950877 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:31.708Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:32.780Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-10-18_05_33_31-17288646308046950877. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:32.803Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-10-18_05_33_31-17288646308046950877.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:34.374Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in europe-north1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.092Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.109Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.141Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.160Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteToText/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.184Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.200Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.226Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.243Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/InitializeWrite into WriteToText/Write/WriteImpl/DoOnce/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.262Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3481>) into WriteToText/Write/WriteImpl/DoOnce/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.278Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/DoOnce/Map(decode) into WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3481>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.294Z: JOB_MESSAGE_DETAILED: Fusing consumer Read/Map(<lambda at iobase.py:908>) into Read/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.310Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.325Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.340Z: JOB_MESSAGE_DETAILED: Fusing consumer FilterCords into ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.356Z: JOB_MESSAGE_DETAILED: Fusing consumer MakeValid into FilterCords
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.372Z: JOB_MESSAGE_DETAILED: Fusing consumer FilterInvalid into MakeValid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.387Z: JOB_MESSAGE_DETAILED: Fusing consumer FormatRecords into FilterInvalid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.402Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn) into FormatRecords
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.417Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WriteBundles into WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.432Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Pair into WriteToText/Write/WriteImpl/WriteBundles
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.447Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/GroupByKey/Write into WriteToText/Write/WriteImpl/Pair
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.464Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Extract into WriteToText/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.489Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.504Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.519Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.535Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.624Z: JOB_MESSAGE_DEBUG: Executing wait step start19
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.655Z: JOB_MESSAGE_BASIC: Executing operation Read/Impulse+Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction+ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.668Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3481>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.682Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:33:35.696Z: JOB_MESSAGE_BASIC: Starting 1 workers in europe-north1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-10-18_05_33_31-17288646308046950877 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:34:21.585Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:37:30.456Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:42:40.315Z: JOB_MESSAGE_BASIC: Finished operation Read/Impulse+Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction+ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:42:40.354Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6-split-with-sizing-out3" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-10-18T12:42:42.422Z: JOB_MESSAGE_ERROR: SDK harness sdk-0-0 disconnected.

It then tries to raise the number of workers to 1 again, and immediately gets JOB_MESSAGE_ERROR: SDK harness sdk-0-0 disconnected. once more, over and over. As a side note, it also takes around 10 minutes before the pipeline actually starts.

Help

I managed to get it working with the DirectRunner option. I don't know where to look next. Could it be related to the VPC?
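
For comparison, the local run that succeeds looks roughly like this (a reconstruction; the exact local flags were not shown):

python -m main --runner DirectRunner \
--gcs_url gs://[[inputbucket_name]]/[[filename]].geojson \
--gcs_write_url gs://[[outputbucket_name]]/[[filename]]_processed.geojson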

EDIT: Could it be the geobeam image?

I tried running the wordcount example on both the native/default image and the geobeam image; it works with the native/default image, but not with the geobeam image. Why is this the case?
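
The wordcount test was roughly the following (a reconstruction; apache_beam.examples.wordcount is the standard Beam example module, and the flags mirror the job above):

python -m apache_beam.examples.wordcount \
--runner DataflowRunner --project [[project_id]] \
--temp_location gs://[[temp_bucket_name]]/tmp \
--output gs://[[outputbucket_name]]/wordcount/results \
--region europe-north1 --sdk_container_image gcr.io/dataflow-geobeam/example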

After some trial and error, I found that the Python version of the geobeam base image must match the Python version on your local machine, or it will not run. At the time of answering, that is Python 3.8.

  • There is the runner that instantiates the job: your local virtual environment
  • At the execution stage, the workers use your Docker image

For this to work correctly:

  • The runner (the virtual environment) needs the same Python version as the one used in the image; a minimal sanity check is sketched below
  • The runner needs the same package versions as those used in the Docker image (Beam Python, etc.)
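
As a minimal sketch of that version check (assuming the image ships Python 3.8, as it did at the time of the answer; adjust EXPECTED for your image), you can fail fast before submitting the job:

import sys

# Hypothetical guard: refuse to submit from an interpreter whose version
# does not match the Python baked into the worker container image.
EXPECTED = (3, 8)  # assumption: the geobeam image ships Python 3.8

if sys.version_info[:2] != EXPECTED:
    raise RuntimeError(
        "Local Python %d.%d does not match the container's Python %d.%d; "
        "the Dataflow SDK harness is likely to disconnect at runtime."
        % (sys.version_info[:2] + EXPECTED)
    )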

Custom containers are only supported by Dataflow Runner v2. If you are launching a batch Python pipeline, set the --experiments=use_runner_v2 flag.

This parameter is missing in your case.

Link to the official documentation: https://cloud.google.com/dataflow/docs/guides/using-custom-containers#python_6
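
Applied to the launch command from the question, that would be (same placeholders as above):

python -m main --runner DataflowRunner --project [[project_id]] \
--temp_location gs://[[temp_bucket_name]]/tmp \
--gcs_url gs://[[inputbucket_name]]/[[filename]].geojson \
--region europe-north1 --sdk_container_image gcr.io/dataflow-geobeam/example \
--gcs_write_url gs://[[outputbucket_name]]/[[filename]]_processed.geojson \
--subnetwork [[full_link_to_subnet]] \
--experiments=use_runner_v2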
