I'm trying to rebuild someone else's network with this shape:
My (image) data going into the network has this shape:
print(X_train[0].shape)  # (150, 150, 3)
print(len(X_train))      # 2160
print(len(y_train))      # 2160
I can write a neural network and get it to run, no problem:
model = Sequential()
model.add(Input(shape=(150,150,3)))
model.add(Conv2D(32, kernel_size=3,strides=(1, 1),activation='relu', padding='same', dilation_rate=1))
model.add(MaxPooling2D(pool_size=(2, 2)))
But then when I view the plot, it looks like this:
Can someone explain why the output of the Conv2D layer does not decrease from 150 to 148, as I expected? (Presumably the 'wrong' numbers in the max-pooling layers are a consequence of this, so I only need to understand the discrepancy in the Conv2D layer.)
A possible solution is to use padding='valid' in the Conv2D layer:
model = Sequential()
model.add(Input(shape=(150,150,3)))
model.add(Conv2D(32, kernel_size=3, strides=(1, 1), activation='relu', padding='valid', dilation_rate=1))
model.add(MaxPooling2D(pool_size=(2, 2)))
The resulting summary:
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 148, 148, 32) 896
max_pooling2d (MaxPooling2D) (None, 74, 74, 32) 0
=================================================================
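The shapes in this summary follow the standard convolution output-size formula. A minimal sketch in plain Python (no Keras required; the function name `conv_out` is my own) that reproduces the numbers:

```python
import math

def conv_out(size, kernel, stride=1, padding="valid", dilation=1):
    """Spatial output size of a conv/pool layer along one dimension."""
    # Effective kernel size once dilation is applied
    k = dilation * (kernel - 1) + 1
    if padding == "valid":
        # No padding: each spatial dim shrinks by k - 1 (at stride 1)
        return math.floor((size - k) / stride) + 1
    # 'same': output is ceil(size / stride), independent of kernel size
    return math.ceil(size / stride)

conv = conv_out(150, 3, padding="valid")  # 150 -> 148
pool = conv_out(conv, 2, stride=2)        # 148 -> 74
print(conv, pool)                         # 148 74
```

With padding='same' instead, `conv_out(150, 3, padding="same")` returns 150, which is exactly the behavior the question observed.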
You used padding='same', so you don't "lose" any values at the edges: with 'same', the input is zero-padded so that the output keeps the input's spatial size.
This has a good GIF on different padding strategies.
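To illustrate what 'same' padding does, here is a hand-rolled NumPy sketch (not Keras's actual implementation): the input is zero-padded just enough that a stride-1 convolution produces an output with the same spatial size.

```python
import numpy as np

def same_pad_conv(x, kernel):
    """Stride-1 2D cross-correlation with 'same' zero padding."""
    k = kernel.shape[0]      # assume a square, odd-sized kernel
    p = (k - 1) // 2         # pad width that preserves the spatial size
    xp = np.pad(x, p)        # zero-pad p pixels on every side
    out = np.empty_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + k, j:j + k] * kernel)
    return out

x = np.ones((6, 6))
k = np.ones((3, 3))
print(same_pad_conv(x, k).shape)  # (6, 6): 'same' preserves the size
```

Note that edge outputs are computed partly from the zero padding (a corner sees only 4 of the 9 ones), which is the trade-off 'same' makes for keeping the shape.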