
Custom TensorFlow Layer That Diagonalizes Input + Trainable Weights

I want to encode the following function into a TF layer. Let x be a d-dimensional vector:
x -> tf.linalg.diag(x)*A + b,

where A is a trainable dxd matrix and b is a trainable (d-dimensional) vector.

If A and b were not there, I would have used a Lambda layer, but since they are... how would I go about it?


PS: for educational purposes I don't want to feed the Lambda layer:

Lambda(lambda x: tf.linalg.diag(x))

into a fully-connected layer with an "identity" activation. (I know this works, but it doesn't really help me learn how to address the problem. :) )
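One thing worth pinning down before writing the layer: in TensorFlow, `*` is elementwise and `@` is a matrix product, so `tf.linalg.diag(x)*A` reads differently under each. A small sketch with made-up values for d = 2 (the values of `x` and `A` here are my own illustrative choices):

```python
import tensorflow as tf

# Hypothetical values, just to pin down the semantics.
x = tf.constant([1., 2.])
A = tf.constant([[1., 2.], [3., 4.]])

# `*` is elementwise: diag(x) * A keeps only the diagonal of A scaled
# by x, because the off-diagonal entries hit the zeros of diag(x).
elementwise = tf.linalg.diag(x) * A        # [[1., 0.], [0., 8.]]

# A true matrix product diag(x) @ A instead scales row i of A by x[i].
matmul = tf.linalg.diag(x) @ A             # [[1., 2.], [6., 8.]]
```

The answer below uses the elementwise reading, matching the `*` in the question as written; swap in `tf.matmul` (or `@`) if the matrix product was intended.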

You can create a custom layer and put your function in its call method.

import tensorflow as tf
from tensorflow import keras


class Custom_layer(keras.layers.Layer):

    def __init__(self, dim, **kwargs):
        super(Custom_layer, self).__init__(**kwargs)
        self.dim = dim

        # add trainable d x d weight matrix A
        self.weight = self.add_weight(
            name='A', shape=(dim, dim), trainable=True)
        # add trainable d-dimensional bias vector b
        self.bias = self.add_weight(
            name='b', shape=(dim,), trainable=True)

    def call(self, inputs):
        # your function
        return (tf.linalg.diag(inputs) * self.weight) + self.bias

    def get_config(self):
        config = super(Custom_layer, self).get_config()
        config['dim'] = self.dim
        return config

And use it just like a normal layer, passing it the dimension argument when you create it:

my_layer = Custom_layer(desired_dimension)
output = my_layer(input)

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. Any questions, please contact: yoyou2525@163.com.
