
How to load a TensorFlow model from a GCS bucket

Question: Currently, I have a public bucket in GCS that stores my TensorFlow models, and I would like to load these models into my Python script without any authentication. How would I go about doing this?

The model is in a folder that has the following structure:

my-bucket
├── model_1
|   ├── assets
|   |   └── (this is an empty folder)
|   ├── saved_model.pb
|   ├── variables
|   |   ├── variables.data-00000-of-00001
|   |   └── variables.index

I know I can use

tf.keras.models.load_model("gs://my-bucket/model_1")

but this requires authentication even though the bucket is public.

Note: I am using Python TensorFlow, not TensorFlow.js.

Answer: Are you sure the objects within the bucket are also public? A simple check to see if this is the case:

gcloud auth revoke --all
gsutil ls -r gs://my-bucket

Another possibility is that you need an account key, even though the bucket is public. The keras documentation says the following:

GCloud authentication happens entirely through your authentication key, without project specification. An example workflow using TensorFlow Cloud from a notebook is provided in the "Putting it all together" section of this guide.

This implies that you would need an authentication key even though the bucket is public, in which case logging into your own GCP account with the following command may be enough:

gcloud auth application-default login

A last option, which is more work, is to download the model locally from the bucket with the google-cloud-storage client, which supports anonymous access, and then load the model from the local directory. This would definitely work, but it involves the most effort.
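A sketch of that last option, assuming the google-cloud-storage and tensorflow packages are installed; the helper names (download_model, local_path_for) and the /tmp destination are my own, not from any library:

```python
import os
import pathlib


def local_path_for(blob_name, prefix, dest_dir):
    """Map an object name like 'model_1/saved_model.pb' to a path under dest_dir."""
    relative = blob_name[len(prefix):].lstrip("/")
    return os.path.join(dest_dir, relative)


def download_model(bucket_name, prefix, dest_dir):
    """Download every object under `prefix` from a public bucket, anonymously."""
    # Imported here so local_path_for stays usable without the GCS client installed.
    from google.cloud import storage

    # create_anonymous_client() makes unauthenticated requests, so this only
    # works if the bucket and its objects are publicly readable.
    client = storage.Client.create_anonymous_client()
    bucket = client.bucket(bucket_name)
    for blob in client.list_blobs(bucket, prefix=prefix):
        if blob.name.endswith("/"):
            # Skip "directory" placeholder objects (e.g. the empty assets folder).
            continue
        target = local_path_for(blob.name, prefix, dest_dir)
        pathlib.Path(target).parent.mkdir(parents=True, exist_ok=True)
        blob.download_to_filename(target)
    return dest_dir


if __name__ == "__main__":
    import tensorflow as tf

    model_dir = download_model("my-bucket", "model_1", "/tmp/model_1")
    model = tf.keras.models.load_model(model_dir)
```

Since the SavedModel format is just the directory layout shown above (saved_model.pb plus the variables folder), mirroring the objects to disk and pointing load_model at the local copy sidesteps the gs:// authentication path entirely.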
