Exception Handling in Apache Beam pipelines using Python

I'm building a simple pipeline with Apache Beam in Python (on GCP Dataflow) that reads from PubSub and writes to BigQuery, but I can't handle exceptions in the pipeline to create alternative flows.

Take a simple WriteToBigQuery example:

output = json_output | 'Write to BigQuery' >> beam.io.WriteToBigQuery('some-project:dataset.table_name')

I tried to put this inside a try/except block, but it doesn't work: when the write fails, the exception seems to be thrown in a Java layer outside my Python code:

INFO:root:2019-01-29T15:49:46.516Z: JOB_MESSAGE_ERROR: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -87: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 135, in _execute
    response = task()
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 170, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 221, in do_instruction
    request.instruction_id)
...
...
...
    self.signature.finish_bundle_method.method_value())
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery.py", line 1368, in finish_bundle
    self._flush_batch()
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery.py", line 1380, in _flush_batch
    self.table_id, errors))
RuntimeError: Could not successfully insert rows to BigQuery table [<myproject:datasetname.tablename>]. Errors: [<InsertErrorsValueListEntry
 errors: [<ErrorProto
 debugInfo: u''
 location: u''
 message: u'Missing required field: object.teste.'
 reason: u'invalid'>]
 index: 0>] [while running 'generatedPtransform-63']

        java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:276)
        org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:84)
        org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:119)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1228)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:143)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:967)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -87: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 135, in _execute
    response = task()
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 170, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 221, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
    bundle_processor.process_bundle(instruction_id)
...
...
...
    self.signature.finish_bundle_method.method_value())
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery.py", line 1368, in finish_bundle
    self._flush_batch()
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery.py", line 1380, in _flush_batch
    self.table_id, errors))

I even tried to handle this error directly:

RuntimeError: Could not successfully insert rows to BigQuery table [<myproject:datasetname.tablename>]. Errors: [<InsertErrorsValueListEntry
 errors: [<ErrorProto
 debugInfo: u''
 location: u''
 message: u'Missing required field: object.teste.'
 reason: u'invalid'>]
 index: 0>] [while running 'generatedPtransform-63']

Using:

try:
    ...
except RuntimeError as e:
    ...

Catching the generic Exception didn't work either.

I could find a lot of examples of error handling in Apache Beam using Java, but none for handling errors in Python.

Does anyone know how to handle this?

I've only been able to catch exceptions at the DoFn level, so something like this:

import apache_beam as beam
from apache_beam import pvalue


class MyPipelineStep(beam.DoFn):

    def process(self, element, *args, **kwargs):
        try:
            # do stuff... (produce output_element from element)
            yield pvalue.TaggedOutput('main_output', output_element)
        except Exception as e:
            yield pvalue.TaggedOutput('exception', str(e))
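
You can then apply this DoFn with with_outputs() and route the 'exception' tag to a dead-letter sink. A minimal sketch, where inputs stands for your upstream PCollection and the step name and downstream sinks are only placeholders:

results = (
    inputs
    | 'MyStep' >> beam.ParDo(MyPipelineStep()).with_outputs('main_output', 'exception'))

good_records = results.main_output
error_records = results.exception
# e.g. write error_records to a separate BigQuery table or Pub/Sub topic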

However, WriteToBigQuery is a PTransform that wraps the DoFn BigQueryWriteFn.

So you may need to do something like this:

import logging

from apache_beam.io.gcp.bigquery import BigQueryWriteFn, WriteToBigQuery


class MyBigQueryWriteFn(BigQueryWriteFn):

    def process(self, *args, **kwargs):
        try:
            return super(MyBigQueryWriteFn, self).process(*args, **kwargs)
        except Exception as e:
            # Do something here, e.g. log the error and/or send the failing
            # rows to a dead-letter output instead of failing the bundle
            logging.error('BigQuery write failed: %s', e)


class MyWriteToBigQuery(WriteToBigQuery):
    # Copy the source code of `WriteToBigQuery` here,
    # but replace `BigQueryWriteFn` with `MyBigQueryWriteFn`
    pass

https://beam.apache.org/releases/pydoc/2.9.0/_modules/apache_beam/io/gcp/bigquery.html#WriteToBigQuery
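
As an aside, more recent Beam SDK releases expose the rejected rows directly: with streaming inserts and a non-retrying insert policy, the result of WriteToBigQuery can be queried for the rows BigQuery refused and fed into a dead-letter branch. A rough, untested sketch based on newer SDKs (this is not part of 2.9.0, so check what your version supports):

from apache_beam.io.gcp.bigquery import BigQueryWriteFn
from apache_beam.io.gcp.bigquery_tools import RetryStrategy

result = (json_output
          | 'Write to BigQuery' >> beam.io.WriteToBigQuery(
              'some-project:dataset.table_name',
              insert_retry_policy=RetryStrategy.RETRY_NEVER))

# rows that BigQuery rejected, e.g. for schema violations
failed_rows = result[BigQueryWriteFn.FAILED_ROWS]
# route failed_rows to a dead-letter table, file or Pub/Sub topic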

You can also use the generator flavor of FlatMap:

This is similar to the other answer, in that you can use a DoFn in place of something else, e.g. a CombineFn, to produce no outputs when there is an exception or some other kind of failed precondition.

import logging
from typing import Generator, List

import apache_beam as beam


def sum_values(values: List[int]) -> Generator[int, None, None]:
    if not values or len(values) < 10:
        logging.error(f'received invalid inputs: {...}')
        return
    yield sum(values)


# Instead of using CombinePerKey:
(inputs
 | 'WithKey' >> beam.Map(lambda x: (x.key, x))
 | 'GroupByKey' >> beam.GroupByKey()
 | 'Values' >> beam.Values()
 | 'MaybeSum' >> beam.FlatMap(sum_values))
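
Because sum_values is a generator, it can simply yield nothing for a bad group, so invalid input drops out of the pipeline instead of raising and failing the bundle. If you also need to keep the failing records, you could yield a pvalue.TaggedOutput for them from the same function and split the outputs with with_outputs(), along the lines of the first answer.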
