
padding in conv2D

I am using the following code in Keras:

from keras.layers import Input, Dense, Conv2D, MaxPooling2D, UpSampling2D
from keras.models import Model
from keras import backend as K

input_img = Input(shape=(28, 28, 1))  # adapt this if using `channels_first` image data format

x = Conv2D(16, (3, 3), activation='relu', padding='same')(input_img)
x = MaxPooling2D((2, 2), padding='same')(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
x = MaxPooling2D((2, 2), padding='same')(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
encoded = MaxPooling2D((2, 2), padding='same')(x)

# at this point the representation is (4, 4, 8) i.e. 128-dimensional

x = Conv2D(8, (3, 3), activation='relu', padding='same')(encoded)
x = UpSampling2D((2, 2))(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
x = UpSampling2D((2, 2))(x)
x = Conv2D(16, (3, 3), activation='relu')(x)
x = UpSampling2D((2, 2))(x)
decoded = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

However, if I add padding='same' to the second-to-last Conv2D block, "x = Conv2D(16, (3, 3), activation='relu')(x)", the code gives me an error. I don't understand why padding='same' is problematic here; if I remove it, the code works fine. Can anyone help? Thanks.

It is happening because 'same' behaves inconsistently with strides != 1: MaxPooling2D with padding='same' rounds odd sizes up, so the encoder maps 28 → 14 → 7 → 4, but three UpSampling2D layers then map 4 back to 32, not 28. The 'valid' convolution in the second-to-last block (which shrinks 16 to 14) is what compensates; with padding='same' there, the decoder output is 32x32 while the target is 28x28, so the shapes no longer match. Have you tried specifying strides as 1? The issue is discussed in detail here.
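To make the size bookkeeping concrete, here is a quick arithmetic sketch (not part of the question's code; the helper functions are hypothetical, assuming 3x3 convolutions, stride 1, and 2x2 pooling/upsampling as in the model above):

```python
import math

def conv_out(size, kernel=3, padding="same"):
    # 'same' preserves the spatial size at stride 1; 'valid' shrinks it by kernel - 1
    return size if padding == "same" else size - (kernel - 1)

def pool_out(size):
    # MaxPooling2D((2, 2), padding='same') rounds up: ceil(size / 2)
    return math.ceil(size / 2)

def upsample_out(size):
    # UpSampling2D((2, 2)) doubles the spatial size
    return size * 2

# Encoder: 28 -> 28 -> 14 -> 14 -> 7 -> 7 -> 4
s = 28
for _ in range(3):
    s = pool_out(conv_out(s))
print("encoded:", s)  # 4

# Decoder with the 'valid' conv in the second-to-last block:
d = upsample_out(conv_out(upsample_out(conv_out(s))))  # 4 -> 4 -> 8 -> 8 -> 16
d = upsample_out(conv_out(d, padding="valid"))         # 16 -> 14 -> 28
print("decoded (valid):", conv_out(d))                 # 28, matches the input

# Decoder with padding='same' everywhere instead:
d2 = upsample_out(conv_out(upsample_out(conv_out(s)))) # 4 -> 8 -> 16
d2 = upsample_out(conv_out(d2, padding="same"))        # 16 -> 32
print("decoded (same):", conv_out(d2))                 # 32 -> shape mismatch vs 28
```

The asymmetry comes from the rounding: 7 pools up to 4, but 4 upsamples to 8, not 7, so the three pool/upsample pairs are not exact inverses on a 28x28 input.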

input_img = Input(shape=(28, 28, 1))  

x = Conv2D(32, (3, 3), activation='relu', padding='same')(input_img)
x = MaxPooling2D((2, 2), padding='same')(x)
x = Conv2D(32, (3, 3), activation='relu', padding='same')(x)
encoded = MaxPooling2D((2, 2), padding='same')(x)


# at this point the representation is (7, 7, 32) 

x = Conv2D(32, (3, 3), activation='relu', padding='same')(encoded)
x = UpSampling2D((2, 2))(x)
x = Conv2D(32, (3, 3), activation='relu', padding='same')(x)
x = UpSampling2D((2, 2))(x)
decoded = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

Now, with this shallower architecture, I no longer need to omit padding='same' from the second-to-last Conv2D block, and it works.
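The same arithmetic shows why the two-pool version works with padding='same' everywhere (again a sketch, assuming 3x3 'same' convolutions and 2x2 pooling/upsampling as in the code above):

```python
import math

size = 28
for _ in range(2):            # two Conv2D + MaxPooling2D stages
    size = math.ceil(size / 2)
print("encoded:", size)       # 7 -> the (7, 7, 32) representation

for _ in range(2):            # two Conv2D + UpSampling2D stages
    size = size * 2
print("decoded:", size)       # 28 -> matches the 28x28 input
```

Since 28 is divisible by 4, the two pooling steps (28 → 14 → 7) never round, so the two upsampling steps invert them exactly and no 'valid' convolution is needed to trim the output.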
