
Saving TensorFlow models with custom layers

I read through the documentation, but something wasn't clear to me: if I code a custom layer and then use it in a model, can I just save the model as a SavedModel and the custom layer automatically goes with it, or do I have to save the custom layer too? I tried saving just the model in H5 format, without the custom layer, and when I tried to load it I got an error about the custom layer not being recognized (or something like that). Reading through the documentation, I saw that saving custom objects to the H5 format is a bit more involved. But how does it work with SavedModels?

If I understand your question, you should simply use tf.keras.models.save_model(<model_object>, 'file_name', save_format='tf').
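For example, a minimal sketch of that call (the ScaleLayer class and the directory name are illustrative, not from the original post):

```python
import tensorflow as tf

# Hypothetical custom layer, used only to illustrate saving in the 'tf' format.
class ScaleLayer(tf.keras.layers.Layer):
    def __init__(self, factor=2.0, **kwargs):
        super().__init__(**kwargs)
        self.factor = factor

    def call(self, inputs):
        return inputs * self.factor

model = tf.keras.Sequential([
    ScaleLayer(factor=3.0, input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# Save in the TensorFlow SavedModel format; the custom layer's traced
# computation is serialized along with the rest of the model.
tf.keras.models.save_model(model, 'my_model_dir', save_format='tf')
```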

My understanding is that the 'tf' format automatically saves the custom layers, so loading doesn't require all of the supporting libraries to be present. This doesn't extend to all custom objects, though I don't know exactly where that distinction lies. If you want to load a model that uses non-layer custom objects, you have to pass the custom_objects parameter to tf.keras.models.load_model(). This is only necessary if you want to train immediately after loading; if you don't intend to train the model right away, you should be able to forgo custom_objects and just set compile=False in load_model.
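A sketch of the two loading paths described above, assuming the model in 'my_model_dir' was compiled with a hypothetical custom loss named scaled_mse:

```python
import tensorflow as tf

def scaled_mse(y_true, y_pred):
    # Hypothetical non-layer custom object (a loss) used at compile time.
    return 2.0 * tf.reduce_mean(tf.square(y_true - y_pred))

# Inference only: compile=False skips restoring the training
# configuration, so the custom loss does not need to be resolved.
inference_model = tf.keras.models.load_model('my_model_dir', compile=False)

# Training right after loading: supply the non-layer custom objects
# so the compiled loss/metrics can be reconstructed by name.
trainable_model = tf.keras.models.load_model(
    'my_model_dir',
    custom_objects={'scaled_mse': scaled_mse},
)
```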

If you want to use the 'h5' format, you supposedly have to have all of the libraries/modules/packages that the custom object uses present and loaded in order for the 'h5' load to work. I know I've done this with an initializer before. This might not matter for layers, but I assume that it does.
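For instance, with a custom initializer the defining code has to be importable at load time and mapped back by name (the function and file names here are illustrative):

```python
import tensorflow as tf

def small_uniform(shape, dtype=None):
    # Hypothetical custom initializer; only its name is stored in the H5 file.
    return tf.random.uniform(shape, -0.01, 0.01, dtype=dtype)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, kernel_initializer=small_uniform, input_shape=(4,)),
])
model.save('init_model.h5', save_format='h5')

# Reloading the H5 file only works if the initializer's definition is
# available and passed via custom_objects.
restored = tf.keras.models.load_model(
    'init_model.h5', custom_objects={'small_uniform': small_uniform}
)
```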

You also need to implement get_config() (and, if the default behaviour isn't enough, a from_config() classmethod) in the custom object definition in order for the 'h5' format to save and load properly.
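A sketch of a custom layer that implements get_config() so it round-trips through the H5 format (names are illustrative):

```python
import tensorflow as tf

class ScaleLayer(tf.keras.layers.Layer):
    """Hypothetical custom layer whose constructor argument must survive serialization."""

    def __init__(self, factor=2.0, **kwargs):
        super().__init__(**kwargs)
        self.factor = factor

    def call(self, inputs):
        return inputs * self.factor

    def get_config(self):
        # Merge the base config with this layer's own arguments so the
        # layer can be re-instantiated from the saved config.
        config = super().get_config()
        config.update({'factor': self.factor})
        return config

model = tf.keras.Sequential([
    ScaleLayer(factor=3.0, input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.save('custom_layer_model.h5', save_format='h5')

# Loading still needs the class definition mapped back by name.
restored = tf.keras.models.load_model(
    'custom_layer_model.h5', custom_objects={'ScaleLayer': ScaleLayer}
)
```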

