
Derivative of softmax function in Python

Below is the softmax activation function for a neural network. What is the derivative of this function?

import numpy as np

def softmax(z):
    e = np.exp(z)
    # keepdims=True so the row-wise sums broadcast correctly when z is a 2-D batch of logits
    return e / np.sum(e, axis=1, keepdims=True)
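
For reference, all of the answers below compute the Jacobian of softmax with respect to the logits z. Written as a formula (a standard result, stated here only for context):

\frac{\partial s_i}{\partial z_j} = s_i(\delta_{ij} - s_j) =
\begin{cases} s_i(1 - s_i) & i = j \\ -s_i s_j & i \neq j \end{cases}

where s = softmax(z) and \delta_{ij} is the Kronecker delta.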

Iterative version of the softmax derivative

import numpy as np

def softmax_grad(s): 
    # Take the derivative of each softmax output w.r.t. each logit (the logit is usually Wi * X).
    # Input s is the softmax value of the original input x.
    # s.shape = (n,)
    # e.g. s = np.array([0.3, 0.7]), x = np.array([0, 1])

    # initialize the 2-D jacobian matrix.
    jacobian_m = np.diag(s)

    for i in range(len(jacobian_m)):
        for j in range(len(jacobian_m)):
            if i == j:
                jacobian_m[i][j] = s[i] * (1-s[i])
            else: 
                jacobian_m[i][j] = -s[i]*s[j]
    return jacobian_m
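
As a quick sanity check (my own example, not part of the original answer), calling this on the softmax vector from the comments gives the expected 2x2 Jacobian:

s = np.array([0.3, 0.7])
print(softmax_grad(s))
# approximately:
# [[ 0.21 -0.21]
#  [-0.21  0.21]]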

Vectorized version

def softmax_grad(softmax):
    # Reshape the 1-d softmax to 2-d so that np.dot will do the matrix multiplication
    s = softmax.reshape(-1,1)
    return np.diagflat(s) - np.dot(s, s.T)
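
A quick check I added (assuming the vectorized function above is defined): for the same example vector it reproduces the s_i * (delta_ij - s_j) entries directly:

s = np.array([0.3, 0.7])
J = softmax_grad(s)
expected = np.array([[ 0.3 * 0.7, -0.3 * 0.7],
                     [-0.7 * 0.3,  0.7 * 0.3]])
print(np.allclose(J, expected))  # True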

Reference: https://medium.com/@aerinykim/how-to-implement-the-softmax-derivative-independently-from-any-loss-function-ae6d44363a9d

Assuming you have an array of shape (N, 1):

import numpy as np

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x))

def softmax_dash(x):
    # Jacobian of softmax: entry (i, j) equals s_i * (delta_ij - s_j),
    # obtained by broadcasting the (N, 1) column s against its (1, N) transpose.
    I = np.eye(x.shape[0])
    return softmax(x) * (I - softmax(x).T)
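
A small usage sketch (my own example) with a column vector of shape (N, 1), showing the same s_i * (delta_ij - s_j) pattern as above:

x = np.array([[1.0], [2.0]])   # shape (2, 1)
print(softmax(x).ravel())      # approx [0.2689 0.7311]
print(softmax_dash(x))
# approximately:
# [[ 0.1966 -0.1966]
#  [-0.1966  0.1966]]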

You can watch this video, which explains it better with an example.


 