Tensorflow CIFAR10 code analysis
Basically, 384 was an arbitrary choice that worked out with their batch size and their knowledge of the dataset.
CIFAR images are 32 × 32 × 3. After the convolution with 64 filters, the feature map is 32 × 32 × 64. The max-pooling step then halves each spatial dimension, giving 16 × 16 × 64. Next they reshape the tensor so that its first dimension is the batch size (128), flattening each example into a single vector of 16 × 16 × 64 = 16384 features. Finally, a fully connected weight matrix maps those features down to 384 units.
Feel free to use another number, but make sure you change the subsequent layers as well. It's just an example CNN.
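The shape arithmetic above can be sketched with NumPy. This is only a stand-in for checking tensor shapes, not the tutorial's actual code: the convolution and pooling are simulated (strided slicing mimics the spatial downsampling of a 2×2 max pool), and the zero-filled arrays and the name `dense_out` are purely illustrative.

```python
import numpy as np

batch_size = 128

# A batch of CIFAR images: 32x32 pixels, 3 color channels.
images = np.zeros((batch_size, 32, 32, 3))

# A 'SAME'-padded conv with 64 filters keeps the spatial size
# but changes the channel count to 64 (conv math omitted here).
conv_out = np.zeros((batch_size, 32, 32, 64))

# Stride-2 pooling halves each spatial dimension (shape-only stand-in).
pool_out = conv_out[:, ::2, ::2, :]       # -> (128, 16, 16, 64)

# Flatten each example so the first dimension is the batch size.
flat = pool_out.reshape(batch_size, -1)   # -> (128, 16384)

# A fully connected layer maps 16384 features down to 384 units.
weights = np.zeros((flat.shape[1], 384))
dense_out = flat @ weights                # -> (128, 384)

print(pool_out.shape, flat.shape, dense_out.shape)
```

If you swap 384 for another width, only the shape of `weights` (and the input size of the layer after it) has to change; the flattened dimension 16384 is fixed by the image size and the filter count.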