
google-cloud-storage random 503 error in PATCH

I have code that reads files from a bucket and then edits the content_encoding in every file's metadata. It usually works fine, but for the past 3 days I have been having problems with the google-cloud-storage API: it seems to return 503 errors at random. I already tried a retry strategy with a 600 second deadline, but the code still returns errors.

I'm running this code with Python 3 in a Databricks notebook on a 9.1 LTS (includes Apache Spark 3.1.2, Scala 2.12) cluster.

google-cloud-storage API version -> 2.5.0

I read some old threads about this problem, and it seems to be a known issue, but I am still not able to resolve it. Here is the code:

from google.cloud import storage
from google.cloud.storage.retry import DEFAULT_RETRY

def blob_list(bucket_name):
  try:
    client = storage.Client()
    blobs = client.list_blobs(bucket_name)  # iterator over every object in the bucket
    print('Bucket read')
    return blobs
  except Exception as e:
    print('Could not read the bucket', e)
    
b = blob_list(bucket_name)
count = 0
# retry for up to 600 seconds in total, with exponential backoff between attempts
modified_retry = DEFAULT_RETRY.with_deadline(600)
modified_retry = modified_retry.with_delay(initial=1.5, multiplier=1.2, maximum=45.0)

for item in b:
  CS = storage.Client()
  blob = CS.bucket(bucket_name).blob(item.name)
  blob.patch(retry=modified_retry)  # PATCH with no changes; the response populates the blob's metadata
#  print(blob.content_encoding)
  if blob.content_encoding == 'gzip' or blob.content_encoding == 'txt':
    blob.content_encoding = 'csv'
    blob.patch(retry=modified_retry)
    count += 1
print('Changed', count, 'metadata files')

The code takes too long to run and still throws this error:

Deadline of 600.0s exceeded while calling target function, last exception: 503 PATCH https://storage.googleapis.com/storage/v1/b/bucket_name/o/yof0soyd7668_2022-08-15T060000_06db87dcb5e5ee06ec13ab5fbefe4df0_be822a.csv.gz?projection=full&prettyPrint=false : We encountered an internal error. Please try again.

The error seems to occur in the patch() method.
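For reference, a minimal way to confirm where the failure surfaces (just a sketch, reusing the same blob and modified_retry objects as above) is to wrap the patch() call and catch the transient error explicitly:

from google.api_core import exceptions

try:
  blob.patch(retry=modified_retry)
except (exceptions.ServiceUnavailable, exceptions.RetryError) as e:
  # ServiceUnavailable is the raw 503; RetryError is raised once the 600 s retry deadline is exhausted
  print('PATCH failed for', blob.name, '->', e)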

This is unfortunately an issue on Google's end. I am getting the same 503 error as you on processes that I have been running daily for over 2 years. I created a support case through GCP and am waiting to hear back.
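Until Google resolves it, a possible interim mitigation (only a sketch, assuming the same bucket_name as in the question) is to skip objects that still fail after the retry and collect their names for a later pass, so a single 503 does not abort the whole run:

from google.cloud import storage
from google.cloud.storage.retry import DEFAULT_RETRY
from google.api_core import exceptions

client = storage.Client()
retry = DEFAULT_RETRY.with_deadline(600)
failed = []
for item in client.list_blobs(bucket_name):
  blob = client.bucket(bucket_name).blob(item.name)
  try:
    blob.patch(retry=retry)  # refresh metadata from the API
    if blob.content_encoding in ('gzip', 'txt'):
      blob.content_encoding = 'csv'
      blob.patch(retry=retry)
  except (exceptions.ServiceUnavailable, exceptions.RetryError):
    failed.append(item.name)  # record the object and keep going
print('Failed objects to retry later:', failed)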
