I am trying to create a trainable TensorFlow model from a frozen model.
I have all the weights and layer definitions from the frozen model.
Using those, I created a new graph and initialized it with the weights.
It looks OK.
But when I print all the node names, three names appear for each layer, like:
Mconv4_stage6_L2/biases
Mconv4_stage6_L2/biases/Assign
Mconv4_stage6_L2/biases/read
Mconv4_stage6_L2/weights
Mconv4_stage6_L2/weights/Assign
Mconv4_stage6_L2/weights/read
What are these Assign and read nodes?
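A minimal standalone repro of the three-names pattern (my sketch, assuming TF 1.x; the variable name 'weights' is just for illustration, not from my model):

import tensorflow as tf

tf.reset_default_graph()
v = tf.get_variable('weights', initializer=tf.constant([1.0, 2.0]))
for op in tf.get_default_graph().get_operations():
    print(op.name)
# Prints something like:
#   Const           (the constant initial value)
#   weights         (the variable op itself)
#   weights/Assign  (the assign op run by the variable's initializer)
#   weights/read    (an Identity op that hands the value to downstream ops)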
The weights and biases are initialized as follows:
def make_var(self, name, initializer=None, trainable=True):
    '''Creates a new TensorFlow variable, trainable only if both the
    network and the layer are marked trainable.'''
    return tf.get_variable(name, trainable=self.trainable and trainable,
                           initializer=initializer)

@layer
def conv(self,
         input,
         k_h,
         k_w,
         c_o,
         s_h,
         s_w,
         name,
         relu=True,
         padding=DEFAULT_PADDING,
         group=1,
         trainable=False,
         biased=True):
    # Verify that the padding is acceptable
    self.validate_padding(padding)
    # Get the number of channels in the input
    c_i = int(input.get_shape()[-1])
    # Verify that the grouping parameter is valid
    assert c_i % group == 0
    assert c_o % group == 0
    # Convolution for a given input and kernel
    convolve = lambda i, k: tf.nn.conv2d(i, k, [1, s_h, s_w, 1], padding=padding)
    with tf.variable_scope(name) as scope:
        for init_zer in self.inits_:
            if init_zer['name'] == name + '/weights' and 'conv' in init_zer['name']:
                # Kernel shape: [k_h, k_w, c_i / group, c_o]
                kernel = self.make_var('weights',
                                       initializer=tf.constant(init_zer['tensor']),
                                       trainable=self.trainable and trainable)
        if group == 1:
            # This is the common case. Convolve the input without any further complications.
            output = convolve(input, kernel)
        else:
            # Split the input into groups and then convolve each of them independently
            # (TF 1.x argument order: tf.split(value, num_splits, axis))
            input_groups = tf.split(input, group, axis=3)
            kernel_groups = tf.split(kernel, group, axis=3)
            output_groups = [convolve(i, k) for i, k in zip(input_groups, kernel_groups)]
            # Concatenate the groups back along the channel axis
            output = tf.concat(output_groups, axis=3)
        # Add the biases
        if biased:
            for init_zer in self.inits_:
                if init_zer['name'] == name + '/biases' and 'conv' in init_zer['name']:
                    biases = self.make_var('biases',
                                           initializer=tf.constant(init_zer['tensor']),
                                           trainable=self.trainable and trainable)
            output = tf.nn.bias_add(output, biases)
        if relu:
            # ReLU non-linearity
            output = tf.nn.relu(output, name=scope.name)
        return output
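For the group > 1 branch, here is a standalone shape check of the split/convolve/concat pattern (a sketch with made-up sizes, assuming TF 1.x; none of these names come from my model):

import tensorflow as tf

group = 2
x = tf.placeholder(tf.float32, [1, 8, 8, 4])            # c_i = 4
k = tf.get_variable('k', shape=[3, 3, 4 // group, 6])   # [k_h, k_w, c_i / group, c_o]
x_groups = tf.split(x, group, axis=3)                   # two [1, 8, 8, 2] tensors
k_groups = tf.split(k, group, axis=3)                   # two [3, 3, 2, 3] kernels
outputs = [tf.nn.conv2d(i, kk, [1, 1, 1, 1], padding='SAME')
           for i, kk in zip(x_groups, k_groups)]
y = tf.concat(outputs, axis=3)
print(y.get_shape())                                    # (1, 8, 8, 6)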
Do I need to run global_variables_initializer again in this case? My full code is here:
https://www.dropbox.com/s/kcmrm06k9dz40em/network_base.py?dl=0
https://www.dropbox.com/s/2r164gdk2fzshhr/network_cmu.py?dl=0
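To be explicit about what I mean by running it (a sketch, assuming TF 1.x): global_variables_initializer just groups and runs every variable's Assign op, copying the constant initial values taken from the frozen graph into the variables, so it must be run once per session before training:

with tf.Session() as sess:
    # Executes each variable's .../Assign op, writing its initial value
    # (here, the constants taken from the frozen model) into the variable
    sess.run(tf.global_variables_initializer())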
My code is correct. What I did was load the model and initialize the weights and biases, then save the model. From the saved model and checkpoint, a new frozen model was produced. I compared the old and new frozen models in TensorBoard and they are identical; I don't see any Assign or read nodes in either. But I still wondered what those nodes were when I printed the node names.
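The freezing step I describe is along these lines (a sketch, assuming TF 1.x; 'output_node' is a placeholder, not my real output node name). convert_variables_to_constants folds each variable's current value into a Const node and strips ops that are not needed to compute the listed outputs (such as Assign), which is why the initialization machinery does not survive in the frozen graph:

import tensorflow as tf
from tensorflow.python.framework import graph_util

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Variables become Const nodes; ops unreachable from the outputs
    # (e.g. the .../Assign initializers) are stripped
    frozen_def = graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ['output_node'])  # placeholder name
    with tf.gfile.GFile('frozen.pb', 'wb') as f:
        f.write(frozen_def.SerializeToString())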