
When to use tf.while_loop()?

This is more of a general question about when to use tf.while_loop. For instance, if you have a graph that can be constructed with a loop over a fixed number of iterations, it seems to make no sense to use tf.while_loop. With that in mind, it only seems to make sense to use this function when it is not known, before the graph computation, how many iterations the loop has to run, i.e. when the loop condition depends on tensors that have to be computed first. Please point me in the right direction if I am wrong.
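To illustrate the data-dependent case described above, here is a minimal sketch (the halving function and threshold are arbitrary assumptions for illustration): the number of iterations depends on a tensor's value, so the loop cannot be unrolled when the graph is built.

```python
import tensorflow as tf

# The iteration count depends on the value of x, which is only known
# at execution time, so tf.while_loop is the natural fit here.
x0 = tf.constant(100.0)

x_final = tf.while_loop(
    cond=lambda x: x > 1.0,     # condition on a computed tensor value
    body=lambda x: [x / 2.0],   # body returns a structure matching loop_vars
    loop_vars=[x0],
)[0]

print(float(x_final))  # halved until it drops below 1.0
```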

tf.while_loop makes the graph more compact, as it avoids duplicating the loop body. Depending on the code, this can bring several advantages: it may use fewer resources, make the intention clearer, and leave the graph more readable and amenable to optimization. In the worst case, it will be a bit slower than an unrolled loop.

It's a similar trade-off as between a normal while loop and its unrolled version:

i = 0
while i < 4:
    x = f(x)
    i += 1

and

x = f(x)
x = f(x)
x = f(x)
x = f(x)
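The same trade-off can be sketched in TensorFlow itself (the doubling function `f` here is an arbitrary stand-in for any tensor transformation): the tf.while_loop version adds a single loop construct to the graph, while the unrolled version adds one `f` node per iteration.

```python
import tensorflow as tf

def f(x):
    # Placeholder transformation; any tensor-to-tensor function works here.
    return x * 2.0

i0 = tf.constant(0)
x0 = tf.constant(1.0)

# Compact graph: one loop node; the bound could even be a computed tensor.
_, x_loop = tf.while_loop(
    cond=lambda i, x: i < 4,
    body=lambda i, x: (i + 1, f(x)),
    loop_vars=(i0, x0),
)

# Unrolled graph: four separate f nodes.
x_unrolled = f(f(f(f(x0))))

print(float(x_loop), float(x_unrolled))  # both apply f four times
```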
