
Using TensorFlow functions with tf.keras

I have a question regarding tf.keras and tf functions in TF 2.0. If I have a model like this:

    inputdata = keras.Input(shape=(2048, 1))
    x = layers.Conv1D(16, 3, activation='relu')(inputdata)
    x = layers.Conv1D(32, 3, activation='relu')(x)
    x = layers.Conv1D(64, 3, activation='relu')(x)

and I want to add a custom function like this which is a 1D SubPixel Layer:

    def SubPixel1D(I, r):
        with tf.name_scope('subpixel'):
            X = tf.transpose(I, [2, 1, 0])              # (r, w, b)
            X = tf.batch_to_space_nd(X, [r], [[0, 0]])  # (1, r*w, b)
            X = tf.transpose(X, [2, 1, 0])
            return X

Can I include this layer in Keras without problems? Since TensorFlow 2.0 is so much easier than the previous TensorFlow versions, I am not sure whether this mixes up the backends and the sessions:

    inputdata = keras.Input(shape=(2048, 1))
    x = layers.Conv1D(16, 3, activation='relu')(inputdata)
    x = layers.Conv1D(32, 3, activation='relu')(x)
    x = SubPixel1D(x, 2)
    x = layers.Conv1D(64, 3, activation='relu')(x)

After that, will compiling and fitting the model work, assuming TensorFlow and Keras are imported as

    import tensorflow as tf
    from tensorflow import keras

The same applies to a custom loss function in Keras. If I define a custom loss function like this:

    def my_loss(y_true, y_pred):
        # compute the L2 loss, equal to Keras' mean squared error
        sqrt_l2_loss = tf.reduce_mean((y_pred - y_true) ** 2, axis=[1, 2])
        avg_sqrt_l2_loss = tf.reduce_mean(sqrt_l2_loss, axis=0)
        return avg_sqrt_l2_loss

and use tf operations or functions inside it, can I just pass this function to Keras as usual and use it as a Keras loss?

Just subclass tf.keras.layers.Layer and you will be good to go. Great reference here: https://www.tensorflow.org/guide/keras/custom_layers_and_models. Your layer should look something like this:

    class SubPixel1D(tf.keras.layers.Layer):
        def __init__(self, r):
            super(SubPixel1D, self).__init__()
            self.r = r

        def call(self, inputs):
            with tf.name_scope('subpixel'):
                X = tf.transpose(inputs, [2, 1, 0])              # (r, w, b)
                X = tf.batch_to_space_nd(X, [self.r], [[0, 0]])  # (1, r*w, b)
                X = tf.transpose(X, [2, 1, 0])
            return X

and then call it when defining your model:

    inputdata = keras.Input(shape=(2048, 1))
    x = layers.Conv1D(16, 3, activation='relu')(inputdata)
    x = layers.Conv1D(32, 3, activation='relu')(x)
    x = SubPixel1D(2)(x)
    x = layers.Conv1D(64, 3, activation='relu')(x)
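As for the loss half of the question: yes, any Python function taking `(y_true, y_pred)` and returning a scalar tensor can be passed straight to `model.compile`. A minimal runnable sketch (the model here is a simplified stand-in with `padding='same'` so input and output shapes match; the layer sizes are assumptions, not the asker's exact architecture):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

def my_loss(y_true, y_pred):
    # mean squared error per sample, then averaged over the batch
    sqrt_l2_loss = tf.reduce_mean((y_pred - y_true) ** 2, axis=[1, 2])
    return tf.reduce_mean(sqrt_l2_loss, axis=0)

inputs = keras.Input(shape=(2048, 1))
x = layers.Conv1D(16, 3, padding='same', activation='relu')(inputs)
outputs = layers.Conv1D(1, 3, padding='same')(x)
model = keras.Model(inputs, outputs)

# the custom function is passed to compile like any built-in loss
model.compile(optimizer='adam', loss=my_loss)

# dummy data just to verify the compile/fit wiring
xs = np.random.randn(8, 2048, 1).astype('float32')
ys = np.random.randn(8, 2048, 1).astype('float32')
model.fit(xs, ys, epochs=1, batch_size=4, verbose=0)
```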

I don't know how tf.name_scope will behave, but I don't see any immediate issues.
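In eager TF 2.x, tf.name_scope inside call simply prefixes the op names for graph visualization and is harmless. One thing to watch: in TF 2.x tf.batch_to_space_nd has been renamed to tf.batch_to_space (same semantics), so a quick shape sanity check of the layer looks like this (tensor sizes are made up for illustration):

```python
import tensorflow as tf

class SubPixel1D(tf.keras.layers.Layer):
    def __init__(self, r):
        super(SubPixel1D, self).__init__()
        self.r = r

    def call(self, inputs):
        with tf.name_scope('subpixel'):
            # (batch, width, channels) -> (channels, width, batch)
            X = tf.transpose(inputs, [2, 1, 0])
            # fold groups of r channels into the width dimension;
            # TF 2.x name for batch_to_space_nd
            X = tf.batch_to_space(X, [self.r], [[0, 0]])
            # back to (batch, r*width, channels/r)
            return tf.transpose(X, [2, 1, 0])

x = tf.random.normal((4, 100, 32))  # (batch, width, channels)
y = SubPixel1D(2)(x)
print(y.shape)  # (4, 200, 16): width doubled, channels halved
```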
