
Problem using Keras for a neural network with two outputs

I'm using Keras in Python and I've run into a problem: when I run the code below, I usually end up with one of two accuracy results, either 10% or 90%.

import numpy as np
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense

# load the 300x5 dataset and divide the whole matrix by its global L2 norm
ler = loadtxt(r'C:\Users\Mateus\Desktop\Nova\artigo.csv')
ler_norm = ler / np.sqrt(np.sum(ler**2))

# first three columns are the inputs, last two columns are the targets
entrada = ler_norm[:, 0:3]
saida = ler[:, 3:5]

model = Sequential()
model.add(Dense(units=3, input_dim=3, activation='relu'))
model.add(Dense(units=2, activation='sigmoid'))
model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['accuracy'])

model.fit(entrada, saida, epochs=100, batch_size=10)
_, accuracy = model.evaluate(entrada, saida)
print('Accuracy: {:.2f}%'.format(accuracy * 100))

Some of the values used in "entrada" and "saida" (the original database is 300x5):

68|541|257|72.9|84.0
102|576|322|73.6|84.8
54|528|315|73.6|84.0
99|435|357|73.7|84.0
95|454|115|73.1|83.5
91|300|140|73.5|82.5
118|362|144|73.6|85.0
118|450|233|73.4|83.5
93|378|121|73.7|84.0
95|403|117|73.3|84.0
131|349|80|73.4|85.0
112|467|257|74.0|83.5
50|463|134|73.2|83.5
97|374|159|73.3|85.0
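
Note that np.loadtxt splits on whitespace by default, so if the file really uses the pipe separators shown above it would need an explicit delimiter. A minimal sketch, assuming the same path as in the code above:

from numpy import loadtxt

# assumes the file is pipe-delimited, like the sample shown above
ler = loadtxt(r'C:\Users\Mateus\Desktop\Nova\artigo.csv', delimiter='|')
print(ler.shape)  # expected: (300, 5)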

The final training epochs:

Epoch 85/100
300/300 [==============================] - 0s 177us/step - loss: 77.1145 - acc: 0.4867
Epoch 86/100
300/300 [==============================] - 0s 167us/step - loss: 77.1126 - acc: 0.5400
Epoch 87/100
300/300 [==============================] - 0s 157us/step - loss: 77.1108 - acc: 0.5600
Epoch 88/100
300/300 [==============================] - 0s 159us/step - loss: 77.1091 - acc: 0.6200
Epoch 89/100
300/300 [==============================] - 0s 167us/step - loss: 77.1073 - acc: 0.6733
Epoch 90/100
300/300 [==============================] - 0s 171us/step - loss: 77.1057 - acc: 0.5333
Epoch 91/100
300/300 [==============================] - 0s 157us/step - loss: 77.1040 - acc: 0.4600
Epoch 92/100
300/300 [==============================] - 0s 164us/step - loss: 77.1024 - acc: 0.5333
Epoch 93/100
300/300 [==============================] - 0s 176us/step - loss: 77.1008 - acc: 0.4800
Epoch 94/100
300/300 [==============================] - 0s 160us/step - loss: 77.0992 - acc: 0.5400
Epoch 95/100
300/300 [==============================] - 0s 150us/step - loss: 77.0977 - acc: 0.6067
Epoch 96/100
300/300 [==============================] - 0s 166us/step - loss: 77.0962 - acc: 0.5133
Epoch 97/100
300/300 [==============================] - 0s 168us/step - loss: 77.0947 - acc: 0.5400
Epoch 98/100
300/300 [==============================] - 0s 150us/step - loss: 77.0933 - acc: 0.4067
Epoch 99/100
300/300 [==============================] - 0s 164us/step - loss: 77.0919 - acc: 0.5267
Epoch 100/100
300/300 [==============================] - 0s 166us/step - loss: 77.0905 - acc: 0.5067

Does anyone know what is going wrong? Thanks in advance.

You are trying to predict continuous values of around 80 with a sigmoid activation, which only produces outputs between 0 and 1.
Try a linear or relu activation instead:

model.add(Dense(units=2, activation='linear'))
# or
model.add(Dense(units=2, activation='relu'))

Also, accuracy makes no sense as a metric in a regression problem; change your metric to something like mae instead:

model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['mae'])
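
Putting both suggestions together, a minimal sketch of a corrected script might look like this. The file path and column layout are taken from the question; the per-column min-max scaling of the inputs is an assumption, and only the output activation and the metric are the actual fix:

import numpy as np
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense

# layout from the question: 300 rows, 3 input columns, 2 target columns
ler = loadtxt(r'C:\Users\Mateus\Desktop\Nova\artigo.csv')
entrada = ler[:, 0:3]
saida = ler[:, 3:5]

# assumption: scale each input column to [0, 1] instead of dividing
# the whole matrix by a single global norm
entrada = (entrada - entrada.min(axis=0)) / (entrada.max(axis=0) - entrada.min(axis=0))

model = Sequential()
model.add(Dense(units=3, input_dim=3, activation='relu'))
# linear output layer so the network can produce values in the 70-85 range
model.add(Dense(units=2, activation='linear'))

# mae is a meaningful metric for regression; accuracy is not
model.compile(loss='mean_absolute_error', optimizer='adam', metrics=['mae'])

model.fit(entrada, saida, epochs=100, batch_size=10)
_, mae = model.evaluate(entrada, saida)
print('MAE: {:.2f}'.format(mae))

With a linear output the loss should drop well below the ~77 seen in the log above, since the sigmoid layer was stuck producing values in (0, 1) against targets around 73-85.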
