Dataflow job fails at BigQuery write with backend errors
I have a job that fails with several different errors, all related to the final import into BigQuery. I have run it 5 times and it fails every time, although the error messages sometimes differ. The job runs fine when I run it locally against an SQLite database, so I suspect the problem is on the Google backend.

One error message:
**Workflow failed. Causes: S04:write meter_traces_combined to BigQuery/WriteToBigQuery/NativeWrite failed., BigQuery import job "dataflow_job_5111748333716803539" failed., BigQuery creation of import job for table "meter_traces_combined" in dataset "ebce" in project "oeem-ebce-platform" failed., BigQuery execution failed., Unknown error.**
Another error message:
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 178, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/nativefileio.py", line 465, in __exit__
    self.file.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 217, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 588, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1154, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 715, in _RunMethod
    http_request, client=self.client)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 873, in RefreshResumableUploadState
    self.stream.seek(self.progress)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 301, in seek
    offset, whence, self.position, self.last_block_position))
NotImplementedError: offset: 10485760, whence: 0, position: 16777216, last: 8388608
```
Another error message:
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 178, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/nativeavroio.py", line 309, in __exit__
    self._data_file_writer.fo.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 217, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 588, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1154, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 715, in _RunMethod
    http_request, client=self.client)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 875, in RefreshResumableUploadState
    raise exceptions.HttpError.FromResponse(refresh_response)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/oee-ebce-platform/o?alt=json&name=tmp%2Fetl-ebce-combine-all-traces-20191127-152244.1574868164.604684%2Fdax-tmp-2019-11-27_07_24_36-17060579636924315582-S02-0-e425da41c3fe2598%2Ftmp-e425da41c3fe2d8b-shard--try-33835bf582552bbd-endshard.avro&uploadType=resumable&upload_id=AEnB2UqddXXpTnnRQyxBQuL1ptXExVZ5CrUQ33o2S2UHcVUhesrBq7XFSQ90YBQznRm2Wh3g8g8lG1z5uEQv8fXvqO40z5WrnQ>: response: <{'x-guploader-uploadid': 'AEnB2UqddXXpTnnRQyxBQuL1ptXExVZ5CrUQ33o2S2UHcVUhesrBq7XFSQ90YBQznRm2Wh3g8g8lG1z5uEQv8fXvqO40z5WrnQ', 'vary': 'Origin, X-Origin', 'content-type': 'application/json; charset=UTF-8', 'content-length': '177', 'date': 'Wed, 27 Nov 2019 15:30:50 GMT', 'server': 'UploadServer', 'status': '410'}>, content <{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "backendError",
        "message": "Backend Error"
      }
    ],
    "code": 503,
    "message": "Backend Error"
  }
}
```
Any ideas? Job ID is 2019-11-27_09_50_34-1251118406325466877, in case anyone from Google is reading this. Thanks.
Google Cloud Support here. I checked your job and found two internal issues that are likely related to this failure. As Alex Amato suggested in his comment, I would try adding the flag

`--experiments=use_beam_bq_sink`

Otherwise, I recommend opening a ticket directly with GCP support, since this may need further investigation.

I hope this helps.
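For reference, a minimal sketch of how that experiment flag can be passed when launching a Beam pipeline on Dataflow. The module name, bucket, and region below are placeholders, not values from the original job:

```shell
# Hypothetical invocation; substitute your own pipeline module,
# staging bucket, and region. The only line taken from the answer
# above is the --experiments flag.
python my_pipeline.py \
  --runner=DataflowRunner \
  --project=oeem-ebce-platform \
  --region=us-central1 \
  --temp_location=gs://my-bucket/tmp \
  --experiments=use_beam_bq_sink
```

The flag tells Beam to use its own BigQuery sink implementation for `WriteToBigQuery` instead of the Dataflow `NativeWrite` path that appears in the tracebacks above, so it sidesteps the failing code path rather than fixing the backend issue itself.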