Problem statement: I have Google Cloud Storage with some buckets and need to import data from those buckets into a local Jupyter instance, Colab, and a JupyterLab notebook in Vertex AI. Any reference code covering these cases would be appreciated. Kind regards.
Local Jupyter instance: First authenticate your local environment with gcloud auth login, then use gsutil to copy the content to the local environment.
# Authenticate with your account
!gcloud auth login --no-browser
# Copy from your bucket to a local path (-r copies recursively)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
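If you also plan to use the Python client libraries on your local machine (see the last section), note that they read Application Default Credentials rather than the gcloud CLI credentials, so you additionally need:
# Set up Application Default Credentials for the client libraries
!gcloud auth application-default login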
Colab: First authenticate your Colab session to get access to the Cloud APIs, then use gsutil to copy the content to the local environment.
# Authenticate with your account
from google.colab import auth as google_auth
google_auth.authenticate_user()
# Copy from your bucket to a local path (-r copies recursively)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
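Once authenticated this way, the Cloud Storage Python client library works in Colab as well. A minimal sketch to verify access (PROJECT_ID, BUCKET and DIR_PATH are placeholders for your own values):
# List the objects under a prefix to verify access
from google.cloud import storage
storage_client = storage.Client(project="PROJECT_ID")
for blob in storage_client.list_blobs("BUCKET", prefix="DIR_PATH"):
    print(blob.name)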
JupyterLab notebook in Vertex AI: Your environment is already authenticated. Use gsutil to copy the content to the local environment.
# Copy from your bucket to a local path (-r copies recursively)
!gsutil cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR
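For buckets with many files, gsutil's -m flag performs the copy in parallel, which is usually much faster:
# -m runs the copy with parallel threads/processes
!gsutil -m cp -r gs://BUCKET/DIR_PATH ./TARGET_DIR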
You can also access the files in your Google Cloud Storage buckets directly from Python using the Cloud Storage client library. You will need to authenticate your environment first, as described above. For example, to download a single object:
# Imports the Google Cloud client library
from google.cloud import storage
# Instantiates a client
storage_client = storage.Client()
# References the existing bucket and the object to download
bucket = storage_client.bucket("BUCKET")
blob = bucket.blob("DIR_PATH/FILE_NAME")
# Downloads the object to a local file
blob.download_to_filename("./TARGET_DIR/FILE_NAME")
print(f"Downloaded {blob.name}.")