
Accessing Google Cloud Storage from local Jupyter and Notebooks in Google AI Platform / Vertex AI

Problem statement: I have Google Cloud Storage with some buckets. I need to import data from those buckets into:

  • a local Jupyter instance running on my local computer
  • a Google Colab notebook
  • a JupyterLab notebook in Vertex AI (and/or AI Platform)

Any reference code to handle these cases would be appreciated. Kind regards.

Local Jupyter instance: First authenticate your local environment using gcloud auth login, then use gsutil to copy the content to the local environment.

# Authenticate with your account
!gcloud auth login --no-browser

# Copy from your bucket to local path (note -r is for recursive call)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
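
If you also want the Python client library shown at the end of this answer to work locally, you may additionally need Application Default Credentials; a minimal sketch, assuming the gcloud CLI is installed:

# Set up Application Default Credentials so client libraries such as
# google-cloud-storage can authenticate outside of gcloud/gsutil
!gcloud auth application-default login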

Colab: First authenticate your Colab session to get access to the cloud APIs. Then you can use gsutil to copy the content to the local environment.

# Authenticate with your account
from google.colab import auth as google_auth
google_auth.authenticate_user()

# Copy from your bucket to local path (note -r is for recursive call)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
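
If you only need a single tabular file, you can also read it straight from the bucket into pandas without copying it first. This sketch assumes the gcsfs package is available (pandas relies on it for gs:// paths) and uses data.csv as a placeholder object name:

# Read a CSV object directly from Cloud Storage into a DataFrame
import pandas as pd

df = pd.read_csv("gs://BUCKET/DIR_PATH/data.csv")
df.head()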

JupyterLab notebook in Vertex AI: Your environment is already authenticated. Use gsutil to copy the content to the local environment.

# Copy from your bucket to local path (note -r is for recursive call)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
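
For large directories, gsutil's top-level -m flag enables parallel transfers, which usually speeds up the copy; a hedged variant of the same command:

# -m runs the copy with parallel (multi-threaded/multi-process) transfers
!gsutil -m cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR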

You can also directly access the files in your Google Cloud Storage via Python using the Cloud Storage client libraries. You will need to authenticate your environment first, as mentioned above.

# Imports the Google Cloud client library
from google.cloud import storage

# Instantiates a client
storage_client = storage.Client()

# The name for the new bucket
bucket_name = "my-new-bucket"

# Creates the new bucket
bucket = storage_client.create_bucket(bucket_name)

print(f"Bucket {bucket.name} created.")
