
How to use the optimizer in tensorflow2 correctly?

I am wondering whether the code below performs only a single gradient descent step or runs the whole gradient descent algorithm:

opt = tf.keras.optimizers.SGD(learning_rate=self.learning_rate)
train = opt.minimize(self.loss, var_list=[self.W1, self.b1, self.W2, self.b2, self.W3, self.b3])

Gradient descent certainly requires many steps. But I am not sure whether opt.minimize(self.loss, var_list=[self.W1, self.b1, self.W2, self.b2, self.W3, self.b3]) performs all of those steps rather than a single gradient descent step. Why do I think it performs all the steps? Because my loss is zero afterwards.

tf.keras.optimizers.Optimizer.minimize() computes the gradients and applies them. So it performs a single step.

In the documentation of this function you can read:

This method simply computes gradient using tf.GradientTape and calls apply_gradients(). If you want to process the gradient before applying then call tf.GradientTape and apply_gradients() explicitly instead of using this function.
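In other words, one call to minimize() is equivalent to one explicit tape-and-apply cycle. A minimal sketch of that explicit form is below; the variables, shapes, and loss here are made-up placeholders standing in for the W1/b1 etc. from your code, not your actual model:

    import tensorflow as tf

    # Hypothetical stand-in for the question's variables.
    W1 = tf.Variable(tf.random.normal([4, 8]))
    b1 = tf.Variable(tf.zeros([8]))
    opt = tf.keras.optimizers.SGD(learning_rate=0.01)

    x = tf.random.normal([16, 4])
    y = tf.random.normal([16, 8])

    # One explicit step: record the forward pass, compute gradients, apply them.
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(tf.matmul(x, W1) + b1 - y))
    grads = tape.gradient(loss, [W1, b1])
    opt.apply_gradients(zip(grads, [W1, b1]))  # exactly one parameter update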

This can also be seen in the implementation of minimize():

  def minimize(self, loss, var_list, grad_loss=None, name=None, tape=None):
    """Minimize `loss` by updating `var_list`.
    This method simply computes gradient using `tf.GradientTape` and calls
    `apply_gradients()`. If you want to process the gradient before applying
    then call `tf.GradientTape` and `apply_gradients()` explicitly instead
    of using this function.
    Args:
      loss: `Tensor` or callable. If a callable, `loss` should take no arguments
        and return the value to minimize. If a `Tensor`, the `tape` argument
        must be passed.
      var_list: list or tuple of `Variable` objects to update to minimize
        `loss`, or a callable returning the list or tuple of `Variable` objects.
        Use callable when the variable list would otherwise be incomplete before
        `minimize` since the variables are created at the first time `loss` is
        called.
      grad_loss: (Optional). A `Tensor` holding the gradient computed for
        `loss`.
      name: (Optional) str. Name for the returned operation.
      tape: (Optional) `tf.GradientTape`. If `loss` is provided as a `Tensor`,
        the tape that computed the `loss` must be provided.
    Returns:
      An `Operation` that updates the variables in `var_list`. The `iterations`
      will be automatically increased by 1.
    Raises:
      ValueError: If some of the variables are not `Variable` objects.
    """
    grads_and_vars = self._compute_gradients(
        loss, var_list=var_list, grad_loss=grad_loss, tape=tape)
    return self.apply_gradients(grads_and_vars, name=name) 
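So to run full gradient descent you need to call minimize() repeatedly, once per step. A minimal sketch of such a loop, assuming self.loss is a callable as minimize() expects in eager mode (the variables and data here are made up for illustration):

    import tensorflow as tf

    # Hypothetical tiny model: one weight matrix and bias.
    W1 = tf.Variable(tf.random.normal([4, 1]))
    b1 = tf.Variable(tf.zeros([1]))
    x = tf.random.normal([32, 4])
    y = tf.random.normal([32, 1])

    def loss():
        # Callable loss, so minimize() can re-evaluate it under its own tape.
        return tf.reduce_mean(tf.square(tf.matmul(x, W1) + b1 - y))

    opt = tf.keras.optimizers.SGD(learning_rate=0.1)

    # Each minimize() call is ONE gradient descent step, so loop for many steps.
    for step in range(100):
        opt.minimize(loss, var_list=[W1, b1])
        if step % 20 == 0:
            print(step, float(loss()))

If your loss really reaches zero after a single call, that is more likely because the problem is trivially fitted (or the loss is being evaluated on the wrong thing) than because minimize() ran the whole algorithm.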

