
Tensorflow retrain neural network with different data

I have a list of inputs to the neural network, for example:

list_of_inputs = [inputs1, inputs2, inputs3, ... ,inputsN]

and also a corresponding list of labels:

list_of_labels = [label1, label2, label3, ..., labelN]

I want to feed/train each (input, label) pair into the neural network, record the loss, then train the next pair on the same network and record its loss, and so on for all the pairs.

Note: I don't want to reinitialize the weights every time a new (input, label) pair is added; I want to keep the trained weights from the previous pair. The network is shown below (where you can see I am also printing the loss). How would I go about this?

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

with tf.name_scope("nn"):
    model = tf.keras.Sequential([
        tfp.layers.DenseFlipout(64, activation=tf.nn.relu),
        tfp.layers.DenseFlipout(64, activation=tf.nn.softmax),
        tfp.layers.DenseFlipout(np.squeeze(labels).shape[0])  # one output unit per label dimension
    ])

logits = model(inputs)
loss = tf.reduce_mean(tf.square(labels - logits))        # mean squared error
train_op_bnn = tf.train.AdamOptimizer().minimize(loss)


init_op = tf.group(tf.global_variables_initializer(),tf.local_variables_initializer())

with tf.Session() as sess:
    sess.run(init_op)
    for i in range(100):   
        sess.run(train_op_bnn)
        print(sess.run(loss))
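
For reference, a minimal sketch of the kind of loop described above, assuming the entries of list_of_inputs / list_of_labels are 2-D NumPy arrays of shape (batch, num_features) and (batch, num_outputs), fed through placeholders (num_features and num_outputs below are illustrative assumptions, not values from the post):

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

num_features = 10   # assumed input dimensionality
num_outputs = 5     # assumed label dimensionality

inputs_ph = tf.placeholder(tf.float32, shape=[None, num_features])
labels_ph = tf.placeholder(tf.float32, shape=[None, num_outputs])

with tf.name_scope("nn"):
    model = tf.keras.Sequential([
        tfp.layers.DenseFlipout(64, activation=tf.nn.relu),
        tfp.layers.DenseFlipout(64, activation=tf.nn.softmax),
        tfp.layers.DenseFlipout(num_outputs)
    ])

logits = model(inputs_ph)
loss = tf.reduce_mean(tf.square(labels_ph - logits))
train_op_bnn = tf.train.AdamOptimizer().minimize(loss)

init_op = tf.group(tf.global_variables_initializer(),
                   tf.local_variables_initializer())

with tf.Session() as sess:
    sess.run(init_op)   # initialize once, before any training
    for inputs_np, labels_np in zip(list_of_inputs, list_of_labels):
        for i in range(100):   # keep training the same weights on this pair
            _, loss_val = sess.run([train_op_bnn, loss],
                                   feed_dict={inputs_ph: inputs_np,
                                              labels_ph: labels_np})
        print(loss_val)        # loss after training on this pair

Because init_op runs only once, before the outer loop, the weights learned on one pair carry over to the next.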

EDIT:

The issue is that when I try to build the network inside a function, as below:

init_op = tf.group(tf.global_variables_initializer(),tf.local_variables_initializer())

with tf.Session() as sess:
    sess.run(init_op)

    inputs,labels = MEMORY[0]

    logits, model_losses = build_graph(inputs)
    loss = tf.reduce_mean(tf.square(labels - logits))
    train_op_bnn = tf.train.AdamOptimizer().minimize(loss)

    sess.run(train_op_bnn)
    print(sess.run(loss))   

I get an error:

FailedPreconditionError                   Traceback (most recent call last)
<ipython-input-95-5ca77fa0606a> in <module>()
     36     train_op_bnn = tf.train.AdamOptimizer().minimize(loss)
     37 
---> 38     sess.run(train_op_bnn)
     39     print(sess.run(loss))
     40 
The problem is the ordering. These three lines:

logits, model_losses = build_graph(inputs)
loss = tf.reduce_mean(tf.square(labels - logits))
train_op_bnn = tf.train.AdamOptimizer().minimize(loss)

should be above

with tf.Session() as sess:

and above your init_op definition. Otherwise the variables created by build_graph() and by minimize() do not exist yet when init_op is built, so they never get initialized, which is what causes the FailedPreconditionError.
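
Concretely, applying that ordering to the EDIT snippet (build_graph and MEMORY are taken from the post as-is; nothing else is new):

inputs, labels = MEMORY[0]

# Build the graph and the optimizer first, so every variable exists
# before the initializer is constructed.
logits, model_losses = build_graph(inputs)
loss = tf.reduce_mean(tf.square(labels - logits))
train_op_bnn = tf.train.AdamOptimizer().minimize(loss)

# Defined after minimize(), init_op now also covers Adam's slot variables.
init_op = tf.group(tf.global_variables_initializer(),
                   tf.local_variables_initializer())

with tf.Session() as sess:
    sess.run(init_op)
    sess.run(train_op_bnn)
    print(sess.run(loss))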
