
Tensorflow mask from one-hot encoding

I have labels in the form of examples = tf.placeholder(tf.int32, [batch_size]), where each example is an int in the range 0:ohe_size (a class index rather than a one-hot vector).

My output is in the form of a softmax probability distribution with a shape [batch_size, ohe_size]

I'm trying to work out how to create a mask that will give me just the probability assigned to the labelled class for each example, e.g.

probs = [[0.1, 0.6, 0.3],
         [0.2, 0.1, 0.7],
         [0.9, 0.1, 0.0]]
examples = [2, 2, 0]

some_mask_func(probs, examples)  # <- Need this function
> [0.3, 0.7, 0.9]

If I understood your example correctly, you need tf.gather_nd:

row_indices = tf.range(tf.shape(examples)[0])        # [0, 1, ..., batch_size-1]
indices = tf.stack([row_indices, examples], axis=1)   # (row, class) pair per example
result = tf.gather_nd(probs, indices)                 # picks probs[row, class] for each row
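
A minimal runnable sketch of the same idea, assuming TensorFlow 2.x in eager mode (where tf.pack has been renamed tf.stack and placeholders are not needed); the constants reuse the example data from the question:

import tensorflow as tf

# Example data from the question.
probs = tf.constant([[0.1, 0.6, 0.3],
                     [0.2, 0.1, 0.7],
                     [0.9, 0.1, 0.0]])
examples = tf.constant([2, 2, 0])

# Build (row, class) index pairs and gather one probability per row.
row_indices = tf.range(tf.shape(examples)[0])
indices = tf.stack([row_indices, examples], axis=1)   # [[0, 2], [1, 2], [2, 0]]
result = tf.gather_nd(probs, indices)

print(result.numpy())  # [0.3 0.7 0.9]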
