
Does TensorFlow 2.0 still have the parameter 'trainable'?

In TensorBoard I can't find the gradient ops that update my parameters, the way I could in TensorFlow 1.x.

I also can't find a 'trainable' parameter in the Keras API.

If TF 2.0 still has gradient ops that can be shown in TensorBoard, how can I add them to my TensorBoard logs?

P.S. My TensorFlow version is 2.0-rc0.

Here is the code I use to write to my TensorBoard log files:

import tensorflow as tf

logdir = "testlogs"
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=logdir)
.....
model.fit(x=train_x, y=train_y,
          batch_size=256,
          epochs=6,
          shuffle=True,
          callbacks=[tensorboard_callback])


In Keras, determining which variables are trainable is the responsibility of the layers that make up your model. There are a myriad of layers available out of the box, but here is a simple dense-layer implementation to illustrate the use of trainable variables:


import tensorflow as tf

class MyLayer(tf.keras.layers.Layer):
  def __init__(self, units=8, input_dim=8):
    super(MyLayer, self).__init__()
    # Weight matrix, registered as a trainable variable of this layer.
    self.w = tf.Variable(
        initial_value=tf.random_normal_initializer()(shape=(input_dim, units)),
        trainable=True)
    # Bias vector, also trainable.
    self.b = tf.Variable(
        initial_value=tf.zeros_initializer()(shape=(units,)),
        trainable=True)

  def call(self, inputs):
    # Dense transformation: inputs @ w + b.
    return tf.matmul(inputs, self.w) + self.b


which you could, for example, use in a Keras model like this:

my_layer = MyLayer(units=8, input_dim=2)
my_model = tf.keras.models.Sequential([
    my_layer
])
my_model.compile(optimizer=tf.keras.optimizers.Adam(),
                 loss=tf.keras.losses.binary_crossentropy)
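
To check that these variables are indeed what the optimizer will update, you can inspect the model's trainable_variables collection. A minimal sketch (the printed names depend on TensorFlow's automatic variable naming):

# The variables created in MyLayer.__init__ are tracked automatically,
# so they appear in the model's list of trainable variables.
for var in my_model.trainable_variables:
    print(var.name, var.shape, var.trainable)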

Of course, in practice it is best to use the out-of-the-box tf.keras.layers.Dense; this is just to illustrate the trainable variables my_layer.w and my_layer.b!
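
As for the original question: yes, the trainable argument still exists in TF 2.0. Every tf.keras.layers.Layer accepts it, and setting it to False freezes that layer's weights. A minimal sketch (the layer sizes here are made up for illustration):

import tensorflow as tf

# trainable=False removes this layer's weights from model.trainable_variables,
# so the optimizer never updates them during fit().
frozen = tf.keras.layers.Dense(8, trainable=False)
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(2,)),
    frozen,
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

print(len(model.trainable_variables))      # weights of the two unfrozen layers only
print(len(model.non_trainable_variables))  # the frozen layer's kernel and bias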
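
The question also asks how to get gradient information into TensorBoard. In TF 2.0 there is no static graph of gradient ops to visualize, but gradients computed with tf.GradientTape can be written out explicitly with tf.summary. A hedged sketch of a custom training step, assuming model, optimizer, and loss_fn are already defined:

import tensorflow as tf

writer = tf.summary.create_file_writer("testlogs/gradients")

def train_step(model, optimizer, loss_fn, x, y, step):
    # Record the forward pass so gradients can be computed afterwards.
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    # Log one histogram per trainable variable's gradient.
    with writer.as_default():
        for var, grad in zip(model.trainable_variables, grads):
            tf.summary.histogram(var.name + "/gradient", grad, step=step)
    return loss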
