The difference between tf.nn.conv2d_transpose and slim.conv2d_transpose
What is the difference between these two functions in TensorFlow?
tf.nn.conv2d_transpose(
value,
filter,
output_shape,
strides,
padding='SAME',
data_format='NHWC',
name=None
)
Its full definition is in the docs for tf.nn.conv2d_transpose. slim.conv2d_transpose is defined as follows:
tf.layers.conv2d_transpose(
inputs,
filters,
kernel_size,
strides=(1, 1),
padding='valid',
data_format='channels_last',
activation=None,
use_bias=True,
kernel_initializer=None,
bias_initializer=tf.zeros_initializer(),
kernel_regularizer=None,
bias_regularizer=None,
activity_regularizer=None,
kernel_constraint=None,
bias_constraint=None,
trainable=True,
name=None,
reuse=None
)
Its full definition is in the docs for slim.conv2d_transpose.
How can I define the output shape in slim.conv2d_transpose?
There is a significant difference between them. While tf.nn.conv2d_transpose represents an operation in the computational graph, tf.layers.conv2d_transpose defines an entire layer.
More precisely, tf.nn.conv2d_transpose applies a convolutional filter to the inputs.
tf.layers.conv2d_transpose, on the other hand, first creates trainable variables that serve as the filter, according to the arguments given, and then internally calls a conv2d_transpose operation. Depending on the arguments, it also applies other operations such as adding a bias, applying a non-linearity, or normalizing the weights or inputs.
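As a rough pure-Python sketch (not TensorFlow code — the 1-D op, the layer class, and the numbers are all illustrative assumptions), the distinction looks like this: the bare op only scatters input-times-kernel products into an output buffer, while a layer additionally owns its kernel and bias and applies the bias and a non-linearity on top of the op.

```python
# Sketch: a raw transposed-convolution *operation* vs. a *layer*
# that owns weights and adds bias + non-linearity (1-D, 'VALID'-style).

def conv1d_transpose_op(x, kernel, stride):
    """The bare op: scatter-add x[i] * kernel into the output."""
    out = [0.0] * ((len(x) - 1) * stride + len(kernel))
    for i, v in enumerate(x):
        for j, k in enumerate(kernel):
            out[i * stride + j] += v * k
    return out

class Conv1DTransposeLayer:
    """A layer: holds its own kernel/bias (trainable variables in
    tf.layers), calls the op, then adds bias and applies ReLU."""
    def __init__(self, kernel, bias):
        self.kernel = kernel
        self.bias = bias
    def __call__(self, x, stride):
        y = conv1d_transpose_op(x, self.kernel, stride)
        return [max(0.0, v + self.bias) for v in y]

x = [1.0, 2.0]
print(conv1d_transpose_op(x, [1.0, 1.0, 1.0], stride=2))
# -> [1.0, 1.0, 3.0, 2.0, 2.0]
layer = Conv1DTransposeLayer(kernel=[1.0, 1.0, 1.0], bias=-1.0)
print(layer(x, stride=2))
# -> [0.0, 0.0, 2.0, 1.0, 1.0]
```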
With tf.layers.conv2d_transpose you do not specify the output shape, as it is computed from the filter size, input size, and stride. Per spatial dimension, with 'same' padding the output size is input_size * stride; with 'valid' padding it is input_size * stride + max(kernel_size - stride, 0).
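That output-size rule can be sketched as a small helper (modeled on how TensorFlow/Keras computes deconvolution output lengths; this assumes no output padding and no dilation):

```python
# Output size of a transposed convolution along one spatial dimension.

def deconv_output_length(input_length, kernel_size, stride, padding):
    """Return the output length for the given padding mode."""
    if padding == 'same':
        return input_length * stride
    if padding == 'valid':
        return input_length * stride + max(kernel_size - stride, 0)
    raise ValueError("padding must be 'same' or 'valid'")

# A 4x4 feature map upsampled with a 3x3 kernel and stride 2:
print(deconv_output_length(4, kernel_size=3, stride=2, padding='same'))   # -> 8
print(deconv_output_length(4, kernel_size=3, stride=2, padding='valid'))  # -> 9
```

This is why stride 2 is commonly used to exactly double a feature map's size with 'same' padding.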