
How to write this equation in TensorFlow 2? (4x + 2 = 0)

This code does not run in TensorFlow 2. What would the equivalent TF2 code be?

import tensorflow as tf

# the equation is  : 4x+2 = 0
unknownvalue = tf.Variable(0.0)
a = tf.constant(4.0)
b = tf.constant(2.0)
c = tf.multiply(unknownvalue,a)  # 4x
equation  = tf.add(c,b) # 4x+2
zerovalue = tf.constant(0.0)
diff = tf.square(equation - zerovalue) # difference is: 4x + 2 - 0
solving = tf.train.GradientDescentOptimizer(0.01).minimize(diff)
init = tf.global_variables_initializer()

TF2 changed the way training loops are built compared to TF1. I encourage you to read the guide Writing a training loop from scratch to learn more about how to do this.

Here is a direct implementation of the TF1 code in TF2:

import tensorflow as tf

x = tf.Variable(0.0)
optimizer = tf.optimizers.SGD(1e-2)

for idx in range(1, 26):
    with tf.GradientTape() as tape:
        # the equation is: 4x + 2 = 0
        equation = x * 4 + 2  # 4x + 2
        loss = tf.square(equation)  # squared residual, minimized when 4x + 2 = 0
    grad = tape.gradient(loss, [x])
    optimizer.apply_gradients(zip(grad, [x]))
    if not idx % 5:
        tf.print(f"Iteration:{idx},loss:{loss.numpy():.4f},x:{x.numpy():.4f}")

This results in:

Iteration:5,loss:0.1829,x:-0.4273
Iteration:10,loss:0.0039,x:-0.4894
Iteration:15,loss:0.0001,x:-0.4985
Iteration:20,loss:0.0000,x:-0.4998
Iteration:25,loss:0.0000,x:-0.5000
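If you want something closer to the TF1 one-liner tf.train.GradientDescentOptimizer(0.01).minimize(diff), TF2 optimizers also expose a minimize method that takes the loss as a zero-argument callable plus a var_list. The exact signature varies between TF/Keras versions, so treat this as a minimal sketch rather than the canonical answer:

import tensorflow as tf

x = tf.Variable(0.0)
optimizer = tf.optimizers.SGD(0.01)

# In TF2 the loss is passed as a callable so it can be re-evaluated
# (and differentiated) on every optimization step.
def loss_fn():
    return tf.square(4 * x + 2)  # (4x + 2)^2, minimized at x = -0.5

for _ in range(25):
    optimizer.minimize(loss_fn, var_list=[x])

print(x.numpy())  # approximately -0.5

This is essentially what the GradientTape loop above does by hand.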

The TensorFlow 2 code is much simpler and more intuitive (except for the GradientTape part, if you are not used to it):

import tensorflow as tf

# the equation is: 4x + 2 = 0
unknownvalue = tf.Variable(0.0)
with tf.GradientTape() as tape:
    diff = (4 * unknownvalue + 2 - 0) ** 2
grads = tape.gradient(diff, [unknownvalue])
tf.optimizers.SGD(0.01).apply_gradients(zip(grads, [unknownvalue]))
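Note that this snippet only takes a single gradient step, so unknownvalue barely moves from 0. A minimal sketch of the same code wrapped in a loop (the 200 steps here are an arbitrary choice) so it converges to the exact root x = -2/4 = -0.5:

import tensorflow as tf

unknownvalue = tf.Variable(0.0)
optimizer = tf.optimizers.SGD(0.01)

for step in range(200):
    with tf.GradientTape() as tape:
        diff = (4 * unknownvalue + 2) ** 2  # squared residual of 4x + 2 = 0
    grads = tape.gradient(diff, [unknownvalue])
    optimizer.apply_gradients(zip(grads, [unknownvalue]))

print(unknownvalue.numpy())  # approximately -0.5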
