
Why is my TensorFlow model reporting an incorrectly high confidence level for its predictions?

I wrote this function to take an image and produce a prediction. The confidence level it reports is often greater than 100%. Sometimes the prediction is correct and a high confidence is reported; sometimes it is wrong but a high confidence is still reported. Can you help me find the bug in my confidence code? Thank you.

def test(image):
  import cv2
  from PIL import Image
  from tensorflow.keras.preprocessing import image_dataset_from_directory
  batch_size = 32
  img = keras.preprocessing.image.load_img(
      image,
      target_size=(180, 180),
      interpolation="bilinear",
      color_mode='rgb'
  )
 
  img_array = keras.preprocessing.image.img_to_array(img)
  img_array = tf.expand_dims(img_array, 0)
 
  predictions = new_model.predict(img_array)
  score = predictions[0]
  classes = ['A', 'B','C']
  result = classes[np.argmax(score)]
 
  print(
      "This image {} most likely belongs to {} with a {:.2f} percent confidence."
      .format(image, classes[np.argmax(score)], 100 * np.max(score))
  )
 
  return result
Example output:

This image belongs to A with 219.28 percent confidence.
This image belongs to C with a 374.98 percent confidence.

Model architecture:

model_input = tf.keras.layers.Input(shape=(180, 180, 3))
x = tf.keras.layers.Rescaling(1./255)(model_input)
x = tf.keras.layers.Conv2D(16, 3, activation='relu', padding='same')(x)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Conv2D(32, 3, activation='relu', padding='same')(x)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Conv2D(64, 3, activation='relu', padding='same')(x)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Flatten()(x)
x = tf.keras.layers.Dense(128, activation='relu')(x)
outputs = tf.keras.layers.Dense(3)(x)

model2 = tf.keras.Model(model_input, outputs)

Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_1 (InputLayer)        [(None, 180, 180, 3)]     0         
                                                                 
 rescaling_1 (Rescaling)     (None, 180, 180, 3)       0         
                                                                 
 conv2d (Conv2D)             (None, 180, 180, 16)      448       
                                                                 
 max_pooling2d (MaxPooling2D  (None, 90, 90, 16)       0         
 )                                                               
                                                                 
 conv2d_1 (Conv2D)           (None, 90, 90, 32)        4640      
                                                                 
 max_pooling2d_1 (MaxPooling  (None, 45, 45, 32)       0         
 2D)                                                             
                                                                 
 conv2d_2 (Conv2D)           (None, 45, 45, 64)        18496     
                                                                 
 max_pooling2d_2 (MaxPooling  (None, 22, 22, 64)       0         
 2D)                                                             
                                                                 
 flatten (Flatten)           (None, 30976)             0         
                                                                 
 dense (Dense)               (None, 128)               3965056   
                                                                 
 dense_1 (Dense)             (None, 3)                 387       
                                                                 
=================================================================

If you want outputs between 0 and 1, you should use a 'sigmoid' or 'softmax' activation on the last layer:

outputs = tf.keras.layers.Dense(3, activation='softmax')(x)
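If the last layer is changed to a softmax output, the loss must also be told it is receiving probabilities rather than logits. A minimal sketch, assuming the model is trained with sparse integer labels; the optimizer, loss, and metric shown here are assumptions, not taken from the question:

model2.compile(
    optimizer='adam',
    # with a softmax output the loss receives probabilities, so from_logits=False
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
    metrics=['accuracy'],
)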

Be careful, though, because softmax outputs cannot really be interpreted as probabilities.
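Alternatively, the posted model can be left unchanged (its final Dense(3) layer returns raw logits) and the softmax applied only at prediction time. A minimal sketch, assuming new_model, img_array, tf, and np are as defined in the question:

predictions = new_model.predict(img_array)      # shape (1, 3), raw logits
score = tf.nn.softmax(predictions[0]).numpy()   # values in [0, 1] that sum to 1

print("Most likely class: {} ({:.2f} percent)".format(
    ['A', 'B', 'C'][np.argmax(score)], 100 * np.max(score)))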



 