
L2 matrix row-wise normalization gradient

I'm trying to implement an L2 normalization layer for a convolutional neural network, and I'm stuck on the backward pass:

def forward(self, inputs):
    x, = inputs
    # Row-wise L2 norms, shape (N, 1) so they broadcast against x.
    self._norm = np.expand_dims(np.linalg.norm(x, ord=2, axis=1), axis=1)
    # Normalize every row of x to unit L2 norm.
    z = np.divide(x, self._norm)
    return z,
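
As a quick standalone sanity check (a minimal sketch, independent of the layer class), every row of the forward output should come back with unit L2 norm:

import numpy as np

x = np.random.randn(4, 3)
z = x / np.linalg.norm(x, ord=2, axis=1, keepdims=True)
print(np.linalg.norm(z, axis=1))  # approximately [1. 1. 1. 1.]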

def backward(self, inputs, grad_outputs):
    x, = inputs
    gz, = grad_outputs  # upstream gradient dL/dz, same shape as z
    gx = None  # how to compute the gradient dL/dx here?
    return gx,

How do I calculate gx? My first guess was

gx = - gz * x / self._norm**2

But this seems to be wrong.

The correct answer is

gx = gz / self._norm - x * np.sum(gz * x, axis=1, keepdims=True) / self._norm**3

For z = x / ||x|| taken row-wise, the Jacobian is dz_i/dx_j = delta_ij/n - x_i*x_j/n**3 with n = ||x||, so the backward pass is gz/n minus the component of the upstream gradient along x, scaled by 1/n**3. The shorter gx = np.divide(gz, self._norm) keeps only the first term, which amounts to treating self._norm as a constant; the missing second term reflects the fact that moving x along its own direction leaves z unchanged.
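
For completeness, here is a minimal self-contained sketch (the function names l2norm_forward and l2norm_backward are mine, not from the original layer) that implements the full backward pass and verifies it against a central finite-difference check:

import numpy as np

def l2norm_forward(x):
    # Row-wise L2 normalization: each row of the output has unit norm.
    n = np.linalg.norm(x, ord=2, axis=1, keepdims=True)
    return x / n

def l2norm_backward(x, gz):
    # Full gradient of z = x / ||x|| per row:
    # dz_i/dx_j = delta_ij/n - x_i*x_j/n**3
    n = np.linalg.norm(x, ord=2, axis=1, keepdims=True)
    return gz / n - x * np.sum(gz * x, axis=1, keepdims=True) / n**3

# Central finite-difference check of dL/dx for the scalar loss L = sum(z * gz).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
gz = rng.normal(size=(3, 4))
eps = 1e-6

numeric = np.zeros_like(x)
for i in range(x.shape[0]):
    for j in range(x.shape[1]):
        xp, xm = x.copy(), x.copy()
        xp[i, j] += eps
        xm[i, j] -= eps
        numeric[i, j] = (np.sum(l2norm_forward(xp) * gz)
                         - np.sum(l2norm_forward(xm) * gz)) / (2 * eps)

print(np.abs(numeric - l2norm_backward(x, gz)).max())  # ~1e-10 or below

If the projection term is dropped from l2norm_backward, the printed discrepancy is no longer at round-off level, which is how the incomplete gz / norm answer fails a gradient check.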
