
Google Colaboratory: weight download (export saved models)

I created a model using the Keras library and saved the model as .json and its weights with a .h5 extension. How can I download these to my local machine?

To save the model I followed this link.

This worked for me! Use the PyDrive API.

!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials

# 1. Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

# 2. Save Keras Model or weights on google drive

# create on Colab directory
model.save('model.h5')    
model_file = drive.CreateFile({'title' : 'model.h5'})
model_file.SetContentFile('model.h5')
model_file.Upload()

# get a handle to the uploaded file by its Drive id
drive.CreateFile({'id': model_file.get('id')})

The same works for the weights:

model.save_weights('model_weights.h5')
weights_file = drive.CreateFile({'title' : 'model_weights.h5'})
weights_file.SetContentFile('model_weights.h5')
weights_file.Upload()
drive.CreateFile({'id': weights_file.get('id')})

Now, check your google drive.

On the next run, try reloading the weights:

# 3. reload weights from google drive into the model

# use (get shareable link) to get file id
last_weight_file = drive.CreateFile({'id': '1sj...'}) 
last_weight_file.GetContentFile('last_weights.mat')
model.load_weights('last_weights.mat')

A better, newer way to do it (after a Colab update) is to mount Drive directly; the previous method also still works.

# Load the Drive helper and mount
from google.colab import drive
drive.mount('/content/drive')

You will be prompted for authorization: go to the URL shown in a browser (something like accounts.google.com/o/oauth2/auth?client_id=...).

Obtain the auth code from that page and paste it into the space provided.

Then you can use Drive normally, as if it were your own disk.

Save the weights, or even the full model, directly:

model.save_weights('my_model_weights.h5')
model.save('my_model.h5')
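To make the saved file persist across sessions, write it to a path under the mount point. A minimal sketch, assuming the default mount point `/content/drive/My Drive` (the `drive_path` helper and the `colab_models` folder name are my own, not from the original answer):

```python
import os

# '/content/drive/My Drive' is the default root of a Drive mounted with
# drive.mount('/content/drive'); adjust if you mounted elsewhere.
DRIVE_ROOT = '/content/drive/My Drive'

def drive_path(filename, root=DRIVE_ROOT, subdir='colab_models'):
    """Build (and create) a save location inside the mounted Drive.

    Hypothetical helper: `subdir` is just an example folder name.
    """
    target = os.path.join(root, subdir)
    os.makedirs(target, exist_ok=True)  # create the folder on first use
    return os.path.join(target, filename)

# model.save(drive_path('my_model.h5'))  # would persist across Colab sessions
```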

An even better way: use callbacks, which automatically check at each epoch whether the model improved on the best saved one, and keep the one with the best validation loss so far.

from keras.callbacks import EarlyStopping, ReduceLROnPlateau, ModelCheckpoint

# filePath points at the mounted Drive, e.g. '/content/drive/My Drive/'
my_callbacks = [
    EarlyStopping(patience=4, verbose=1),
    ReduceLROnPlateau(factor=0.1, patience=3, min_lr=0.00001, verbose=1),
    ModelCheckpoint(filepath=filePath + 'my_model.h5',
                    verbose=1, save_best_only=True, save_weights_only=False)
]

And pass the callbacks to model.fit_generator:

model.fit_generator(generator = train_generator,  
                    epochs = 10,
                    verbose = 1,
                    validation_data = vald_generator,
                    callbacks = my_callbacks)

You can load it later, even when the model used a user-defined loss function:

from keras.models import load_model
model = load_model(filePath + 'my_model.h5', 
        custom_objects={'loss':balanced_cross_entropy(0.20)})
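For context, `balanced_cross_entropy` above is the answerer's own loss factory; a saved Keras model records only the name of a custom loss, which is why `load_model` needs the callable handed back via `custom_objects`. A plain-numpy sketch of what such a factory might look like (the weighting scheme here is an assumption, not the original code):

```python
import numpy as np

def balanced_cross_entropy(beta):
    """Hypothetical weighted binary cross-entropy factory (numpy sketch)."""
    def loss(y_true, y_pred):
        eps = 1e-7
        y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
        # positives weighted by beta, negatives by (1 - beta)
        return -np.mean(beta * y_true * np.log(y_pred)
                        + (1 - beta) * (1 - y_true) * np.log(1 - y_pred))
    return loss
```

The closure over `beta` is exactly what cannot be serialized, so the reconstructed callable must be supplied at load time.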

Try this

from google.colab import files
files.download("model.json")

Here is a solution that worked for me:

Set up authentication between Google Colab and your Drive.

Steps:

- Paste the code below as-is.

- This process generates two URLs for authentication; copy each token and paste it into the bar provided.

!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}

Once this authentication is done, use the following code to establish the connection:

!mkdir -p drive
!google-drive-ocamlfuse drive

Now to see the list of files in your Google Drive:

!ls drive

To save the Keras model output to Drive, the process is exactly the same as storing on a local drive:

- Run the Keras model as usual.

Once the model is trained, say you want to store the model outputs (.h5 and .json) in the app folder of your Google Drive:

model_json = model.to_json()
with open("drive/app/model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("drive/app/model_weights.h5")
print("Saved model to drive")

You will find the files in the respective folder of your Google Drive, from where you can download them.

files.download does not let you directly download large files. A workaround is to save your weights on Google Drive using the PyDrive snippet below; just change filename.csv to your weights.h5 file.

# Install the PyDrive wrapper & import libraries.
# This only needs to be done once in a notebook.
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials

# Authenticate and create the PyDrive client.
# This only needs to be done once in a notebook.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

# Create & upload a file.
uploaded = drive.CreateFile({'title': 'filename.csv'})
uploaded.SetContentFile('filename.csv')
uploaded.Upload()
print('Uploaded file with ID {}'.format(uploaded.get('id')))

To download the model to the local system, the following code works. Downloading the json file:

from google.colab import files

model_json = model.to_json()
with open("model1.json", "w") as json_file:
    json_file.write(model_json)

files.download("model1.json")

Downloading the weights (note that model.save stores the full model; use model.save_weights to store only the weights):

model.save('weights.h5')
files.download('weights.h5')

For downloading to local system:

from google.colab import files

#For model json
model_json = model.to_json()
with open("model1.json","w") as json_file:
     json_file.write(model_json)
files.download("model1.json")

#For weights
model.save('weights.h5')
files.download('weights.h5')

You can run the following after training (this uses the TensorFlow 1.x Saver API):

import tensorflow as tf

saver = tf.train.Saver()                          # TF 1.x checkpointing API
save_path = saver.save(session, "data/dm.ckpt")   # `session` is your tf.Session
print('done saving at', save_path)

Then check the location where the ckpt files were saved.

import os
print( os.getcwd() )
print( os.listdir('data') )
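Note that `saver.save` actually writes several files per checkpoint (.meta, .index, one or more .data shards, plus a `checkpoint` bookkeeping file), and restoring needs all of them. A small sketch for collecting them before downloading (the `checkpoint_files` helper is mine, not part of TensorFlow):

```python
import glob
import os

def checkpoint_files(ckpt_prefix):
    """Return every file belonging to one tf.train.Saver checkpoint."""
    # e.g. dm.ckpt.meta, dm.ckpt.index, dm.ckpt.data-00000-of-00001
    matches = sorted(glob.glob(ckpt_prefix + '.*'))
    # Saver also maintains a 'checkpoint' bookkeeping file in the same dir
    bookkeeping = os.path.join(os.path.dirname(ckpt_prefix), 'checkpoint')
    if os.path.exists(bookkeeping):
        matches.append(bookkeeping)
    return matches

# for path in checkpoint_files('data/dm.ckpt'):
#     files.download(path)   # google.colab.files, one download per file
```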

Finally, download the files with the weights!

from google.colab import files
files.download("data/dm.ckpt.meta")

Simply use model.save(). Below, I created a variable to store the name of the model, then saved it with model.save(). I used Google Colab, but it should work elsewhere too.

I just dragged and dropped the model into the contents folder in Drive, and there it was in my Google Drive.
