
Why is my TensorFlow CNN's accuracy zero while the loss is not?

I am trying to build a twin CNN: my dataset has two inputs that are eventually merged together, with a single output neuron predicting IC50.

When I try this, I get an accuracy of 0 while the loss looks fine. Am I using the wrong loss function? It is currently mean_squared_error.

OS: Windows 10, TensorFlow version: 2.3.0

My code:

import numpy as np
from tensorflow import keras
from sklearn.model_selection import train_test_split

# load the pre-encoded drug features, cell-line features and IC50 targets
encoded_drugs = np.load('encoded_drugs.npy')
encoded_cells = np.load('encoded_cells.npy')
encoded_ICs = np.load('encoded_ICs.npy')

# split all three arrays with the same 80/20 train/test partition
encoded_drugs_train, encoded_drugs_test, encoded_cells_train, encoded_cells_test, encoded_ICs_train, encoded_ICs_test = train_test_split(
    encoded_drugs, encoded_cells, encoded_ICs, test_size=0.2)


# drug branch: flatten the (139, 32) encoding and pass it through two dense layers
input1 = keras.layers.Input(shape=(139, 32))
x1 = keras.layers.Flatten()(input1)
x2 = keras.layers.Dense(64, activation='relu')(x1)
x3 = keras.layers.Dense(64, activation='relu')(x2)

# cell-line branch: flatten the (735, 2) encoding and pass it through two dense layers
input2 = keras.layers.Input(shape=(735, 2))
y1 = keras.layers.Flatten()(input2)
y2 = keras.layers.Dense(128, activation='relu')(y1)
y3 = keras.layers.Dense(64, activation='relu')(y2)

# merge the two branches and predict a single IC50 value
merged = keras.layers.concatenate([x3, y3], axis=-1)

z = keras.layers.Dense(64, activation='relu')(merged)
out = keras.layers.Dense(1, activation='sigmoid')(z)

model = keras.models.Model(inputs=[input1, input2], outputs=out)

model.compile(optimizer='sgd', loss='mean_squared_error', metrics=['accuracy'])

model.fit([encoded_drugs_train, encoded_cells_train], encoded_ICs_train, validation_split=0.2, epochs=2)

test_loss, test_accuracy = model.evaluate([encoded_drugs_test, encoded_cells_test], encoded_ICs_test)

print('Accuracy=', test_accuracy)

My output:

2020-02-18 11:06:00.759824: I tensorflow/core/platform/cpu_feature_guard.cc:145] This TensorFlow binary is optimized with Intel(R) MKL-DNN to use the following CPU instructions in performance critical operations:  AVX AVX2
To enable them in non-MKL-DNN operations, rebuild TensorFlow with the appropriate compiler flags.
2020-02-18 11:06:00.774869: I tensorflow/core/common_runtime/process_util.cc:115] Creating new thread pool with default inter op setting: 4. Tune using inter_op_parallelism_threads for best performance.
Train on 75793 samples, validate on 18949 samples
Epoch 1/2
75793/75793 [==============================] - 17s 229us/sample - loss: 10.3671 - accuracy: 0.0000e+00 - val_loss: 10.4082 - val_accuracy: 0.0000e+00
Epoch 2/2
75793/75793 [==============================] - 11s 146us/sample - loss: 10.2673 - accuracy: 0.0000e+00 - val_loss: 10.3852 - val_accuracy: 0.0000e+00


3s 125us/sample - loss: 8.3239 - accuracy: 0.0000e+00
Accuracy= 0.0

You are trying to solve a regression problem (using the mean_squared_error loss) while using accuracy as a metric. In that case, accuracy is not a valid metric.

First of all, determine whether the problem you are trying to solve is actually a regression problem or a classification one.
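A quick way to check (a minimal sketch, assuming encoded_ICs.npy holds the raw targets as in your code): if the targets take many distinct continuous values it is regression; if they only take a handful of discrete labels it is classification.

import numpy as np

encoded_ICs = np.load('encoded_ICs.npy')

# continuous, wide-ranging targets point to regression;
# a small set of discrete labels points to classification
print('min:', encoded_ICs.min(), 'max:', encoded_ICs.max())
print('number of distinct values:', len(np.unique(encoded_ICs)))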

For regression, use Dense(1, activation='linear') as the last output layer, and model.compile(optimizer='sgd', loss='mean_squared_error', metrics=['mse']).
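Applied to your model, only the output layer and the compile call change (a minimal sketch, assuming the rest of the model from the question stays the same):

# regression head: linear output, track MSE/MAE instead of accuracy
out = keras.layers.Dense(1, activation='linear')(z)
model = keras.models.Model(inputs=[input1, input2], outputs=out)
model.compile(optimizer='sgd', loss='mean_squared_error', metrics=['mse', 'mae'])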

For classification, use Dense(1, activation='sigmoid') as the last output layer, and model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy']).

Second, you need to train for more epochs (roughly 29 seconds of training is really not enough to draw conclusions from your results).
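A minimal sketch of a longer run (the epoch count here is arbitrary; watching the validation loss will tell you when further training stops helping):

# train longer and keep the history to monitor validation loss
history = model.fit([encoded_drugs_train, encoded_cells_train], encoded_ICs_train,
                    validation_split=0.2, epochs=50)
print(history.history['val_loss'])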
