Keras Lambda layer and variables: "TypeError: can't pickle _thread.lock objects"
I can't save a Keras model when it uses a Lambda layer together with shared variables. Here is minimal code that produces the error:
# General imports.
import numpy as np

# Keras for deep learning.
from keras.layers.core import Dense, Lambda
from keras.layers import Input
from keras.models import Model
import keras.backend as K

n_inputs = 20
n_instances = 100

def preprocess(X, minimum, span):
    output = (X - minimum) / span
    return output

inputs = Input(shape=(n_inputs,), name='input_tensor')
maximum = K.max(inputs)
minimum = K.min(inputs)
span = maximum - minimum

x = Lambda(preprocess, arguments={'minimum': minimum, 'span': span})(inputs)
x = Dense(units=100, activation='elu')(x)
outputs = Dense(units=n_inputs, activation='elu')(x)

model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='adam', loss='mse')

x = np.array([np.random.randn(20) for i in range(n_instances)])
y = np.array([np.random.randn(20) for i in range(n_instances)])

model.fit(x, y, epochs=10)
model.save('test.h5')  # This line doesn't work.
Here is the full error I get:
Traceback (most recent call last):
File "C:\Dropbox\HELMo_Gramme\M2\Stage\Programmation\Filter\sanstitre0.py", line 35, in <module>
model.save('test.h5') # This line doesn't work.
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2580, in save
save_model(self, filepath, overwrite, include_optimizer)
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\models.py", line 111, in save_model
'config': model.get_config()
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2421, in get_config
return copy.deepcopy(config)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 215, in _deepcopy_list
append(deepcopy(a, memo))
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 169, in deepcopy
rv = reductor(4)
TypeError: can't pickle _thread.lock objects
The model trains fine and can be used for prediction; the problem only appears when saving. I have seen people get similar errors with Lambda layers (for example this link), but I think my problem is slightly different (most of those cases involve seq2seq.py, which I don't use). It still seems to be connected to the deep copy, though.
If I remove the Lambda layer or the external variables, it works. I am probably doing something with the variables that I shouldn't, but I don't know how to do it properly. I need them outside the scope of the preprocess function because I use those same variables in a postprocess function.
I know that preprocessing inside the model is not the most efficient approach, but I have my reasons for doing it this way, and performance is not an issue for this dataset.
I forgot to clarify that I want to be able to reuse maximum, minimum and span in another Lambda layer, which is why they are defined outside the scope of preprocess.
maxim's solution did help, but it still doesn't work in my actual code. The difference is that I actually create the model inside a function and return it, and somehow that brings back the same kind of error.
Example code:
# General imports.
import numpy as np

# Keras for deep learning.
from keras.layers.core import Dense, Lambda
from keras.layers import Input
from keras.models import Model
import keras.backend as K

n_inputs = 101
n_instances = 100

def create_model(n_inputs):
    def preprocess(X):
        maximum = K.max(inputs)
        minimum = K.min(inputs)
        span = maximum - minimum
        output = (X - minimum) / span
        return output

    def postprocess(X):
        maximum = K.max(inputs)
        minimum = K.min(inputs)
        span = maximum - minimum
        output = X * span + minimum
        return output

    inputs = Input(shape=(n_inputs,), name='input_tensor')
    x = Lambda(preprocess)(inputs)
    x = Dense(units=100, activation='elu')(x)
    outputs = Dense(units=n_inputs, activation='elu')(x)
    outputs = Lambda(postprocess)(outputs)

    model = Model(inputs=inputs, outputs=outputs)
    model.compile(optimizer='adam', loss='mse')
    return model

x = np.array([np.random.randn(n_inputs) for i in range(n_instances)])
y = np.array([np.random.randn(n_inputs) for i in range(n_instances)])

model = create_model(n_inputs)
model.fit(x, y, epochs=10)
model.save('test.h5')  # This line doesn't work.
Error:
Traceback (most recent call last):
File "C:\Dropbox\HELMo_Gramme\M2\Stage\Programmation\Filter\sanstitre0.py", line 46, in <module>
model.save('test.h5') # This line doesn't work.
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2580, in save
save_model(self, filepath, overwrite, include_optimizer)
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\models.py", line 111, in save_model
'config': model.get_config()
File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2421, in get_config
return copy.deepcopy(config)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 215, in _deepcopy_list
append(deepcopy(a, memo))
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 220, in _deepcopy_tuple
y = [deepcopy(a, memo) for a in x]
File "C:\ProgramData\Anaconda3\lib\copy.py", line 220, in <listcomp>
y = [deepcopy(a, memo) for a in x]
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 220, in _deepcopy_tuple
y = [deepcopy(a, memo) for a in x]
File "C:\ProgramData\Anaconda3\lib\copy.py", line 220, in <listcomp>
y = [deepcopy(a, memo) for a in x]
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 150, in deepcopy
y = copier(x, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "C:\ProgramData\Anaconda3\lib\copy.py", line 169, in deepcopy
rv = reductor(4)
TypeError: can't pickle _thread.lock objects
I need to create my model inside a function because I am optimizing hyperparameters, so I iterate over different models with different parameter sets (I could do it in a loop, but it isn't as clean).
The problem is with the Lambda arguments: minimum and span. They are inferred from the input, but when you define the Lambda layer like this:
x = Lambda(preprocess, arguments={'minimum': minimum, 'span': span})(inputs)
...they are treated as independent arguments that need to be serialized (as the lambda's context). This causes the error, because both of them are TensorFlow tensors, not static values or numpy arrays.
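The error at the bottom of the trace is generic Python behavior, not anything Keras-specific: copy.deepcopy falls back to the pickle protocol (the `rv = reductor(4)` frame in the trace), and any object whose state contains a thread lock, as TensorFlow tensors do through their graph, cannot be pickled. A minimal sketch that reproduces it without Keras:

```python
import copy
import threading

# A bare lock stands in for a TF tensor, whose internal state holds locks.
config = {'minimum': threading.Lock()}

try:
    copy.deepcopy(config)
except TypeError as err:
    print(err)  # e.g. "can't pickle _thread.lock objects"
```

Anything placed in the Lambda's `arguments` dict ends up in exactly this position inside the model config that `get_config()` deep-copies.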
Change your code to:
# `preprocess` encapsulates all intermediate values in itself.
def preprocess(X):
    maximum = K.max(X)
    minimum = K.min(X)
    span = maximum - minimum
    output = (X - minimum) / span
    return output

inputs = Input(shape=(n_inputs,), name='input_tensor')
x = Lambda(preprocess)(inputs)
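The same encapsulation extends to the postprocessing step: recompute the statistics inside each function from a tensor that is passed in, via a shared helper, so that no tensor is captured from the outer scope. A minimal numpy sketch of the pattern (in real Lambda bodies, `np.max`/`np.min` would be `K.max`/`K.min`, and the reference tensor would be supplied as a second Lambda input rather than a closure over `inputs`):

```python
import numpy as np

def stats(t):
    # Recomputed from the tensor itself: nothing captured from an outer scope.
    minimum = np.min(t)
    span = np.max(t) - minimum
    return minimum, span

def preprocess(X):
    minimum, span = stats(X)
    return (X - minimum) / span

def postprocess(X, ref):
    # The reference tensor is passed in explicitly instead of closed over.
    minimum, span = stats(ref)
    return X * span + minimum

x = np.array([1.0, 3.0, 5.0])
scaled = preprocess(x)
recovered = postprocess(scaled, x)
print(recovered)  # [1. 3. 5.] -- the original values
```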
I had the same problem (Lambda layers + multiple arguments on keras@2.1.5 with the tensorflow@1.11.0 backend), and using model.save_weights(...) instead of model.save(...) worked: if you only want to load the trained weights later, you don't need to store the architecture.
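The weights-only workaround relies on a general principle: when an object as a whole cannot be pickled, its numeric state often still can, and the rest can be rebuilt from code. A hypothetical, Keras-free sketch (this `Model` class and its lock stand in for a compiled model whose config holds TF tensors):

```python
import pickle
import threading

class Model:
    def __init__(self):
        self.lock = threading.Lock()    # unpicklable, like tensors baked into the config
        self.weights = [1.0, 2.0, 3.0]  # plain numeric state, picklable

trained = Model()
# pickle.dumps(trained) would raise TypeError, but the weights alone are fine:
blob = pickle.dumps(trained.weights)    # analogue of model.save_weights(...)

fresh = Model()                         # rebuild the architecture from code...
fresh.weights = pickle.loads(blob)      # ...then restore only the saved state
print(fresh.weights)  # [1.0, 2.0, 3.0]
```

This mirrors the Keras workflow: call the same model-building function again, then load_weights into the fresh model.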
See my answer to a similar question on Stack Overflow.
For your particular case, change the following lines:
(...)
x = Lambda(preprocess)(inputs)
(...)
outputs = Lambda(postprocess)(outputs)
(...)
to these lines:
(...)
x = Lambda(lambda t: preprocess(t))(inputs)
(...)
outputs = Lambda(lambda t: postprocess(t))(outputs)
(...)