
How to use complex variables in TensorFlow eager mode?

In non-eager mode I can run this without issues:

import tensorflow as tf  # TF 1.x graph mode (tf.Session, tf.train.AdamOptimizer)

s = tf.complex(tf.Variable(1.0), tf.Variable(1.0))
train_op = tf.train.AdamOptimizer(0.01).minimize(tf.abs(s))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(5):
        _, s_ = sess.run([train_op, s])
        print(s_)

(1+1j)
(0.99+0.99j)
(0.98+0.98j)
(0.9700001+0.9700001j)
(0.9600001+0.9600001j)

But I cannot seem to find the equivalent expression in eager mode. I've tried the following, but TF complains:

tfe = tf.contrib.eager
optimizer = tf.train.AdamOptimizer(0.01)

s = tf.complex(tfe.Variable(1.0), tfe.Variable(1.0))

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    loss = obj(s)
    grads = tape.gradient(loss, [s])            # complains: source tensor is complex64
    optimizer.apply_gradients(zip(grads, [s]))  # complains: no gradients provided

The dtype of the source tensor must be floating (e.g. tf.float32) when calling GradientTape.gradient, got tf.complex64

and

No gradients provided for any variable: ['tf.Tensor((1+1j), shape=(), dtype=complex64)']

How does one train complex variables in eager mode?

In eager mode (TensorFlow 2), you can keep the real and imaginary parts as separate real-valued variables and build the complex tensor inside the tape:

r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)  # any TF2 optimizer; Adam(0.01) matches the graph-mode example

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    s = tf.complex(r, i)   # build the complex value from the real variables inside the tape
    loss = obj(s)

grads = tape.gradient(loss, [r, i])            # gradients w.r.t. the float variables
optimizer.apply_gradients(zip(grads, [r, i]))
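
For comparison with the graph-mode loop at the top, a minimal eager training loop built on the same idea might look like the sketch below; the optimizer choice (tf.keras.optimizers.Adam(0.01)) and the five steps are carried over from the graph-mode example rather than from the answer itself.

import tensorflow as tf

# Keep the real and imaginary parts as ordinary float variables.
r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)  # assumed: same learning rate as the graph-mode example

for step in range(5):
    with tf.GradientTape() as tape:
        s = tf.complex(r, i)   # rebuild the complex value inside the tape each step
        loss = tf.abs(s)
    grads = tape.gradient(loss, [r, i])
    optimizer.apply_gradients(zip(grads, [r, i]))
    print(tf.complex(r, i).numpy())   # roughly (0.99+0.99j), (0.98+0.98j), ...

Because the tape only ever differentiates the float32 loss with respect to the float32 variables r and i, the complex-dtype restriction on GradientTape.gradient never comes into play.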
