
Custom Layers in TensorFlow

I am trying to make some changes to the built-in dropout function in TensorFlow. What is the best way to do so?

I'd like to make some changes to the forward and backpropagation steps. In the TensorFlow implementation I can only find the forward pass, not the backward pass. I'd like to modify both.

You can use tf.custom_gradient to define your own forward and backward steps in a single method. Here is a simple example:

import tensorflow as tf

# TF 1.x: an interactive session so .eval() works below
tf.InteractiveSession()

@tf.custom_gradient
def custom_multiply(a, x):
  # Define your own forward step
  y = a * x
  # Define your own backward step
  def grads(dy):
    # Gradients w.r.t. a and x; the +100 makes the customization visible
    return dy * x, dy * a + 100
  # Return the forward result and the backward function
  return y, grads

a, x = tf.constant(2), tf.constant(3)
y = custom_multiply(a, x)
dy_dx = tf.gradients(y, x)[0]
# Prints `dy/dx = 102`; without the custom gradient it would be 2
print('dy/dx =', dy_dx.eval())
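The snippet above targets TF 1.x (it relies on an interactive session and tf.gradients). As a sketch, assuming TF 2.x with eager execution, the same custom gradient can be exercised with tf.GradientTape instead:

```python
import tensorflow as tf

@tf.custom_gradient
def custom_multiply(a, x):
    # Forward step
    y = a * x
    # Backward step: gradients w.r.t. a and x (the +100 is the customization)
    def grads(dy):
        return dy * x, dy * a + 100.0
    return y, grads

a = tf.constant(2.0)
x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)  # x is a constant, so it must be watched explicitly
    y = custom_multiply(a, x)
dy_dx = tape.gradient(y, x)
print('dy/dx =', dy_dx.numpy())  # 102.0 instead of the ordinary 2.0
```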

If you want to customize your own layer, simply replace the core function used in tf.layers.Dropout.call with your own.
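To make that concrete, here is a minimal sketch of a dropout layer with both passes written out explicitly, so either can be modified. It assumes TF 2.x and standard inverted dropout; `make_custom_dropout` and `CustomDropout` are illustrative names, not TensorFlow APIs:

```python
import tensorflow as tf

def make_custom_dropout(rate):
    @tf.custom_gradient
    def custom_dropout(x):
        # Forward step: standard inverted dropout (edit here to change it)
        keep = tf.cast(tf.random.uniform(tf.shape(x)) >= rate, x.dtype)
        scale = 1.0 / (1.0 - rate)
        y = x * keep * scale
        # Backward step: pass gradients only through kept units (edit here to change it)
        def grad(dy):
            return dy * keep * scale
        return y, grad
    return custom_dropout

class CustomDropout(tf.keras.layers.Layer):
    """Keras layer wrapping the custom forward/backward dropout."""
    def __init__(self, rate=0.5, **kwargs):
        super().__init__(**kwargs)
        self.rate = rate
        self._fn = make_custom_dropout(rate)

    def call(self, inputs, training=False):
        # Dropout is only applied during training
        return self._fn(inputs) if training else inputs

x = tf.ones((2, 4))
layer = CustomDropout(rate=0.5)
y = layer(x, training=True)  # kept units scaled to 2.0, dropped units are 0.0
```

Closing over `rate` keeps the decorated function's only argument a tensor, so the backward function returns exactly one gradient (for `x`).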

