
How can I visualize the training of a neural network in real time?

I am not using Pandas or PyTorch. I am using Keras and TensorFlow.

# Visualize training history
from keras.models import Sequential
from keras.layers import Dense
import matplotlib.pyplot as plt
import numpy
# load pima indians dataset
dataset = numpy.loadtxt("file.txt", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
history = model.fit(X, Y, validation_split=0.33, epochs=150, batch_size=10, verbose=0)
# list all data in history
print(history.history.keys())
# summarize history for accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
# summarize history for loss
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()

I found the above code somewhere on the Internet.

The problem with this code is that it doesn't show the training in real time.

How can I achieve that in Keras?

I tested your code and implemented this. Running it in Jupyter shows exactly what you want: a real-time plot of the training and validation loss as the model trains. Outside Jupyter, though, the graph is not updated in real time; that may be a bug, or the PlotLossesKeras callback from livelossplot may simply be intended for notebook use only.

# Visualize training history
from keras.models import Sequential
from keras.layers import Dense
import matplotlib.pyplot as plt
from livelossplot import PlotLossesKeras
import numpy
# load pima indians dataset
dataset = numpy.loadtxt("file.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
history = model.fit(X, Y, validation_split=0.33, epochs=150, batch_size=10, verbose=0, callbacks=[PlotLossesKeras()])
# list all data in history
print(history.history.keys())
# summarize history for accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
# summarize history for loss
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
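
If you need the plot to update in real time outside a notebook, one option is a custom Keras callback that redraws a matplotlib figure at the end of every epoch. The following is only a minimal sketch, not part of the original answer: the LivePlot class name is made up here, and whether the window actually refreshes while training runs depends on your matplotlib backend, since the sketch relies on plt.ion() and plt.pause().

import matplotlib.pyplot as plt
from keras.callbacks import Callback

class LivePlot(Callback):
    # Redraws the train/validation loss curves after every epoch.
    def on_train_begin(self, logs=None):
        self.losses = []
        self.val_losses = []
        plt.ion()                        # interactive mode: plotting calls do not block
        self.fig, self.ax = plt.subplots()

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        self.losses.append(logs.get('loss'))
        self.val_losses.append(logs.get('val_loss'))
        self.ax.clear()
        self.ax.plot(self.losses, label='train')
        self.ax.plot(self.val_losses, label='validation')
        self.ax.set_xlabel('epoch')
        self.ax.set_ylabel('loss')
        self.ax.legend(loc='upper left')
        plt.pause(0.01)                  # let the GUI event loop redraw the figure

# Hypothetical usage, mirroring the fit() call above:
# history = model.fit(X, Y, validation_split=0.33, epochs=150,
#                     batch_size=10, verbose=0, callbacks=[LivePlot()])

Another route that works outside notebooks is the built-in TensorBoard callback (keras.callbacks.TensorBoard), which writes metrics to a log directory so you can watch the loss and accuracy curves in the browser while training is still running.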
