
Including TensorBoard as a callback in Keras model fitting causes a FailedPreconditionError

Including TensorBoard in the callbacks list in this code produces an error:

# Imports as in the original setup: everything comes from standalone Keras
# (the accepted fix below switches the imports to tensorflow.keras).
import time
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, Activation
from keras.callbacks import TensorBoard

# X and y are the training data and labels, prepared earlier.
window_sizes = [3, 5]
conv_layers = [1, 2]
dense_layers = [1]
for ws in window_sizes:
    for cl in conv_layers:
        for dl in dense_layers:
            name = "{}-conv_layers-{}-window_size-{}-dense_layers-{}".format(cl, ws, dl, int(time.time()))
            #keras.backend.clear_session()
            model = Sequential()
            model.add(Conv2D(64, (ws, ws), input_shape=X.shape[1:]))
            model.add(Activation('relu'))
            model.add(MaxPooling2D(pool_size=(3, 3)))
            for i in range(cl - 1):
                model.add(Conv2D(64, (ws, ws)))
                model.add(Activation('relu'))
                model.add(MaxPooling2D(pool_size=(3, 3)))

            model.add(Flatten())
            for i in range(dl):
                model.add(Dense(64))
                model.add(Activation("relu"))
                model.add(Dropout(0.2))
            model.add(Dense(1))
            model.add(Activation("sigmoid"))

            tensorboard = TensorBoard(log_dir="logs/{}".format(name))
            model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
            model.fit(X, y, batch_size=32, epochs=3, validation_split=0.2, callbacks=[tensorboard])

However, removing TensorBoard from the callbacks list makes the error go away. I tried several things to resolve the error while keeping TensorBoard in the callbacks list, but nothing seemed to fix it. Here is the error:

1-conv_layers-3-window_size-1-dense_layers-1546041626
Train on 19302 samples, validate on 4826 samples
Epoch 1/3
---------------------------------------------------------------------------
FailedPreconditionError                   Traceback (most recent call last)
<ipython-input-10-f65d5bcffb55> in <module>()
     25             tensorboard = TensorBoard(log_dir="logs/{}".format(name))
     26             model.compile(loss="binary_crossentropy",optimizer="adam",metrics=["accuracy"])
---> 27             model.fit(X,y,batch_size=32,epochs=3,validation_split=0.2,callbacks=[tensorboard])
     28 #1-conv_layers-3-window_size-1-dense_layers-1546011295----98.59
     29 #1-conv_layers-3-window_size-2-dense_layers-1546011765----98.15

~\Anaconda3\lib\site-packages\keras\engine\training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
   1037                                         initial_epoch=initial_epoch,
   1038                                         steps_per_epoch=steps_per_epoch,
-> 1039                                         validation_steps=validation_steps)
   1040 
   1041     def evaluate(self, x=None, y=None,

~\Anaconda3\lib\site-packages\keras\engine\training_arrays.py in fit_loop(model, f, ins, out_labels, batch_size, epochs, verbose, callbacks, val_f, val_ins, shuffle, callback_metrics, initial_epoch, steps_per_epoch, validation_steps)
    197                     ins_batch[i] = ins_batch[i].toarray()
    198 
--> 199                 outs = f(ins_batch)
    200                 outs = to_list(outs)
    201                 for l, o in zip(out_labels, outs):

~\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py in __call__(self, inputs)
   2713                 return self._legacy_call(inputs)
   2714 
-> 2715             return self._call(inputs)
   2716         else:
   2717             if py_any(is_tensor(x) for x in inputs):

~\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py in _call(self, inputs)
   2673             fetched = self._callable_fn(*array_vals, run_metadata=self.run_metadata)
   2674         else:
-> 2675             fetched = self._callable_fn(*array_vals)
   2676         return fetched[:len(self.outputs)]
   2677 

~\Anaconda3\lib\site-packages\tensorflow\python\client\session.py in __call__(self, *args, **kwargs)
   1380           ret = tf_session.TF_SessionRunCallable(
   1381               self._session._session, self._handle, args, status,
-> 1382               run_metadata_ptr)
   1383         if run_metadata:
   1384           proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

~\Anaconda3\lib\site-packages\tensorflow\python\framework\errors_impl.py in __exit__(self, type_arg, value_arg, traceback_arg)
    517             None, None,
    518             compat.as_text(c_api.TF_Message(self.status.status)),
--> 519             c_api.TF_GetCode(self.status.status))
    520     # Delete the underlying status object from memory otherwise it stays alive
    521     # as there is a reference to status from this from the traceback due to

FailedPreconditionError: Attempting to use uninitialized value training_10/Adam/Variable_9
     [[Node: training_10/Adam/Variable_9/read = Identity[T=DT_FLOAT, _class=["loc:@training_10/Adam/Assign_10"], _device="/job:localhost/replica:0/task:0/device:GPU:0"](training_10/Adam/Variable_9)]]
     [[Node: metrics_10/acc/Mean_1/_1111 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_670_metrics_10/acc/Mean_1", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]

Someone suggested adding keras.backend.clear_session() before creating the model, but even that did not help.
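For completeness, this is roughly how that suggestion would be applied to the loop above (a minimal sketch using the standalone keras backend; it did not resolve the error here):

from keras import backend as K

for ws in window_sizes:
    for cl in conv_layers:
        for dl in dense_layers:
            # Reset the TensorFlow graph/session before building each new model,
            # so variables left over from previous iterations are discarded.
            K.clear_session()
            model = Sequential()
            # ... build, compile and fit as in the code above ...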

Using this

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, Activation

instead of

from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, Activation

fixed it for me.
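A minimal sketch of the working combination, assuming the TensorBoard callback is also imported from tensorflow.keras.callbacks so that the model and the callback come from the same Keras implementation (mixing the standalone keras package with tensorflow.keras is a likely source of the uninitialized-variable error, since they do not share a backend session):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, Activation
from tensorflow.keras.callbacks import TensorBoard  # same package as the model

# Build the model exactly as in the question, then:
tensorboard = TensorBoard(log_dir="logs/{}".format(name))
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(X, y, batch_size=32, epochs=3, validation_split=0.2, callbacks=[tensorboard])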
