
Gradient Descent

I am trying to write a function that computes gradient descent in Python. I know how to compute it without vectors, for example:

def gradient_descent(x, y):
    # x and y are expected to be NumPy arrays so the arithmetic below is element-wise
    m_curr = b_curr = 0
    iterations = 10000
    n = len(x)
    learning_rate = 0.08

    for i in range(iterations):
        y_predicted = m_curr * x + b_curr
        cost = (1 / n) * sum(val ** 2 for val in (y - y_predicted))
        md = -(2 / n) * sum(x * (y - y_predicted))
        bd = -(2 / n) * sum(y - y_predicted)
        m_curr = m_curr - learning_rate * md
        b_curr = b_curr - learning_rate * bd

    # return the fitted slope and intercept
    return m_curr, b_curr
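For reference, this is how it might be called, assuming x and y are NumPy arrays (the sample data below is made up for illustration):

    import numpy as np

    x = np.array([1, 2, 3, 4, 5], dtype=float)
    y = np.array([5, 7, 9, 11, 13], dtype=float)   # y = 2x + 3
    m, b = gradient_descent(x, y)
    print(m, b)   # should approach 2 and 3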

However, I run into trouble when the parameters are vectors. Any help would be appreciated; I am new to Python.

# computeMSEBatchGradient: 
#   weights - vector of weights (univariate linear = 2 weights)
#   features - vector (or matrix) of feature values
#   targets - vector of target values, same length as features
#
#   returns average gradient over the batch of features
def computeMSEBatchGradient(weights,features,targets):

  # insert calculation of gradient here
  # return the gradient as a vector


  return gradient
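For what it's worth, a minimal NumPy sketch of the vectorized gradient might look like the following. The shape conventions are my assumption, not part of the question: the first weight is the intercept, the remaining weights multiply the feature columns, features is (n,) for a single feature or (n, d) for d features, and targets has length n.

    import numpy as np

    def computeMSEBatchGradient(weights, features, targets):
        # Assumed conventions (not from the original question):
        #   weights[0] is the intercept, weights[1:] multiply the feature columns
        #   features: shape (n,) for one feature, or (n, d) for d features
        #   targets:  shape (n,)
        weights = np.asarray(weights, dtype=float)
        features = np.asarray(features, dtype=float)
        targets = np.asarray(targets, dtype=float)

        if features.ndim == 1:
            features = features.reshape(-1, 1)   # treat a plain vector as one feature per row

        n = len(targets)
        X = np.hstack([np.ones((n, 1)), features])   # prepend a ones column for the intercept
        errors = X @ weights - targets               # predictions minus targets, shape (n,)

        # gradient of (1/n) * sum((X w - y)^2) with respect to w is (2/n) * X^T (X w - y)
        return (2.0 / n) * (X.T @ errors)

The update loop then mirrors the scalar version above: repeat weights = weights - learning_rate * computeMSEBatchGradient(weights, features, targets) for a fixed number of iterations.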

