
Tensorflow, declare a vector depending on another tensor

I'm new to TensorFlow.

Using TensorFlow, I want to define a vector that depends on the output of my neural net, in order to compute the desired cost function:

import numpy as np
import tensorflow as tf
from tensorflow.contrib.layers import fully_connected

# Build the neural network
# (n_inputs, n_hidden, n_outputs and initializer are defined earlier in my code)
X = tf.placeholder(tf.float32, shape=[None, n_inputs], name='X')
hidden = fully_connected(X, n_hidden, activation_fn=tf.nn.elu, weights_initializer=initializer)
logits = fully_connected(hidden, n_outputs, activation_fn=None, weights_initializer=initializer)
outputs = tf.nn.softmax(logits)

# Select a random action based on the probability
action = tf.multinomial(tf.log(outputs), num_samples=1)

# Define the target if the chosen action was correct, and the cost function
y = np.zeros(n_outputs)
y[int(tf.to_float(action))] = 1.0   # crashes: a Tensor can't be converted to an int
cross_entropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)

To define y, I need the value of action (between 0 and 9) so that my vector y is [0, 0, 0, 1, 0, ...] with the 1 at index action.

But action is a tensor, not an integer, so I can't do that!

The code above crashes because I can't apply int() to a Tensor object.

What should I do?

Many thanks

tf.one_hot() is the function you are looking for.

You will have to do it as follows:

action_indices = tf.cast(action, tf.int32)
y = tf.one_hot(action_indices, depth=n_outputs)  # depth (the number of classes) is required
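
For completeness, here is a minimal sketch of how the one-hot target can be wired into the graph from the question. The concrete sizes and the initializer below are made up for illustration; only the tf.one_hot step is the actual fix. Note that tf.multinomial returns a tensor of shape [batch, 1], so the extra dimension is squeezed out before one-hot encoding so that y has the same shape as logits:

import tensorflow as tf
from tensorflow.contrib.layers import fully_connected

# Hypothetical sizes, only for illustration
n_inputs, n_hidden, n_outputs = 4, 16, 10
initializer = tf.contrib.layers.variance_scaling_initializer()

# Build the neural network, as in the question
X = tf.placeholder(tf.float32, shape=[None, n_inputs], name='X')
hidden = fully_connected(X, n_hidden, activation_fn=tf.nn.elu, weights_initializer=initializer)
logits = fully_connected(hidden, n_outputs, activation_fn=None, weights_initializer=initializer)
outputs = tf.nn.softmax(logits)

# Sample one action per row; tf.multinomial returns shape [batch, 1]
action = tf.multinomial(tf.log(outputs), num_samples=1)

# Build the one-hot target from the sampled action
action_indices = tf.cast(tf.squeeze(action, axis=1), tf.int32)   # shape [batch]
y = tf.one_hot(action_indices, depth=n_outputs)                  # shape [batch, n_outputs]

# Cross-entropy against the logits, as in the question
cross_entropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)

Everything stays inside the graph, so no Python int conversion is needed: the target is computed from action when the session runs.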
