
How to make a custom preprocessing layer in tensorflow 2.4?

I want to perform some transformations on the input batch during training. For example, if I have a batch of images of shape (number of samples, width, height, channels), I want to replace the 3rd channel with the difference of the first two channels, then resize the images and finally normalize them. I tried to define a custom layer:

import numpy as np
import tensorflow as tf
from skimage.transform import resize

class CustomLayer(tf.keras.layers.Layer):
    def __init__(self):
        super(CustomLayer, self).__init__()

    def build(self, input_shape):
        pass

    def call(self, input_):
        # Loaded images
        self.img_tr = []
        for image in input_:
            img_input = resize(image, (267, 400))  # from skimage.transform import resize
            img_diff = (img_input[:,:,1]/np.max(img_input[:,:,1])) - ((img_input[:,:,0]+img_input[:,:,2])/np.max(img_input[:,:,0]+img_input[:,:,2]))
            img_temp = np.zeros((267, 400, 3))
            img_temp[:,:,0] = img_input[:,:,0]/np.max(img_input[:,:,0])
            img_temp[:,:,1] = img_input[:,:,1]/np.max(img_input[:,:,1])
            img_temp[:,:,2] = img_diff/np.max(img_diff)
            self.img_tr.append(img_temp)
        self.img_tr = np.asarray(self.img_tr)
        return self.img_tr

Then I used:

input_0 = tf.keras.Input(shape = (None,None,3))
clayer = CustomLayer()
input_1 = clayer(input_0)

x = tf.keras.layers.Conv2D(filters = 16, kernel_size = (7,7), activation = tf.keras.activations.relu)(input_1)
x = tf.keras.layers.MaxPool2D(pool_size = (2,2))(x)
x = tf.keras.layers.Flatten()(x)
x = tf.keras.layers.Dense(units = 64, activation = tf.keras.activations.relu)(x)
output = tf.keras.layers.Dense(units = 12)(x)
model = tf.keras.Model(inputs = input_0, outputs = output)

model.compile(
    optimizer = tf.keras.optimizers.Adam(),
    loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits = True),
    metrics = tf.keras.metrics.SparseCategoricalAccuracy()
)

model.summary()

I get an error that says:

AttributeError: 'Tensor' object has no attribute 'ndim'

I think the issue is related to the fact that my custom layer expects a 4d numpy array, but the input has this format:

<KerasTensor: shape=(None, None, None, 3) dtype=float32 (created by layer 'input_20')>

How can I resolve the issue? I cannot find a way to convert a KerasTensor to a numpy array inside my custom layer.
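For reference, one way to keep numpy-based preprocessing is to wrap it in tf.py_function, which runs the Python code eagerly at execution time, so it never sees a symbolic KerasTensor (at the cost of speed and graph portability). A minimal sketch, where np_preprocess is a hypothetical helper standing in for arbitrary numpy code:

import numpy as np
import tensorflow as tf

def np_preprocess(batch):
    # Runs eagerly at execution time, so .numpy() is available here.
    batch = batch.numpy()
    return (batch / np.maximum(batch.max(), 1e-8)).astype(np.float32)

class NumpyWrapperLayer(tf.keras.layers.Layer):
    def call(self, inputs):
        out = tf.py_function(np_preprocess, [inputs], tf.float32)
        # py_function drops static shape information, so restore it
        # (the helper above preserves the input shape).
        out.set_shape(inputs.shape)
        return out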

Edit: I tried to avoid for loops and numpy, so I tried:

class CustomLayer(tf.keras.layers.Layer):
    def __init__(self):
        super(CustomLayer, self).__init__()
    def build(self, input_shape):
        pass
    
    def call(self, input_):
 
        input_ = tf.Variable(input_)
        img_input = tf.image.resize(input_,(267,400))
        img_diff = (img_input[:,:,:,1])-((img_input[:,:,:,0]+img_input[:,:,:,2]))
        img_input[:,:,:,2] = img_diff
        output_img = tf.image.per_image_standardization(img_input)
            
        return input_

However, when I use the custom layer in the functional API I get the error:

ValueError: Tensor-typed variable initializers must either be wrapped in an init_scope or callable (e.g., `tf.Variable(lambda : tf.truncated_normal([10, 40]))`) when building functions. Please file a feature request if this restriction inconveniences you.

It seems to have something to do with tf.Variable. Even if I set validate_shape to False I still get the same error.
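For context, TF tensors do not support item assignment, so a line like img_input[:,:,:,2] = img_diff has to be rewritten as a tensor op rather than routed through a Variable. A minimal sketch that rebuilds the channel axis with tf.stack (illustrative only, not the exact layer used below):

r = img_input[:, :, :, 0]
g = img_input[:, :, :, 1]
b = img_input[:, :, :, 2]
img_diff = g - (r + b)
# Rebuild the channel axis instead of assigning to it in place.
img_input = tf.stack([r, g, img_diff], axis=-1)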

Simply removing tf.Variable does the job. Below is the full layer:

class CustomLayer(tf.keras.layers.Layer):

    def __init__(self):
        super(CustomLayer, self).__init__()

    def build(self, input_shape):
        pass

    def call(self, inp):
        # Resize the whole batch, compute the channel difference,
        # swap it in as the last channel, then standardize.
        img_input = tf.image.resize(inp, (267, 400))
        img_diff = img_input[:,:,:,1] - (img_input[:,:,:,0] + img_input[:,:,:,2])
        img_diff = tf.expand_dims(img_diff, -1)
        img_input = tf.keras.layers.Concatenate()([img_input[:,:,:,:-1], img_diff])
        output_img = tf.image.per_image_standardization(img_input)

        return output_img

I used tf.keras.layers.Concatenate to replace the last channel of img_input with img_diff.
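As a quick sanity check (assuming the CustomLayer above is defined), the layer can be called on a random batch and should produce the resized three-channel output:

layer = CustomLayer()
dummy = tf.random.uniform((2, 300, 450, 3))  # batch of 2 dummy RGB images
out = layer(dummy)
print(out.shape)  # expected: (2, 267, 400, 3)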

Here is the running notebook.
