
Keras add normalise layer so sum of values is 1

I want to be able to add a layer to my.network that takes the input from previous layer and outputs a probability distribution where all of the values are positive and sum to 1. So any negative values are set to 0, then the remaining positive values are normalised so that the sum of the outputs = 1.我希望能够向 my.network 添加一个层,该层从上一层获取输入并输出概率分布,其中所有值均为正且总和为 1。因此,任何负值都设置为 0,然后将剩余的正值设置为值被归一化,以便输出的总和 = 1。

How can I do this?
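To make the desired behaviour concrete, here is a small sketch of the transformation described above, written with plain TensorFlow ops rather than a Keras layer (the example tensor and the small epsilon guarding against an all-zero row are my own illustration, not part of any existing code):

import tensorflow as tf

x = tf.constant([[-1.0, 2.0, 3.0, -0.5]])
x = tf.nn.relu(x)                                            # negative values become 0
x = x / (tf.reduce_sum(x, axis=-1, keepdims=True) + 1e-9)    # divide by the row sum so it adds up to 1
print(x)                   # [[0.  0.4 0.6 0. ]]
print(tf.reduce_sum(x))    # ~1.0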

IIUC, you can just use the relu and softmax activation functions for that:

import tensorflow as tf

inputs = tf.keras.layers.Input((5,))
x = tf.keras.layers.Dense(32, activation='relu')(inputs)       # relu clips negative values to 0
outputs = tf.keras.layers.Dense(32, activation='softmax')(x)   # softmax gives positive values that sum to 1
model = tf.keras.Model(inputs, outputs)

x = tf.random.normal((1, 5))
print(model(x))
print(tf.reduce_sum(model(x)))
tf.Tensor(
[[0.02258478 0.0218816  0.03778725 0.02707791 0.02791201 0.01847759
  0.03252319 0.02181962 0.02726094 0.02221758 0.02674739 0.03611234
  0.02821671 0.02606457 0.04022215 0.02933712 0.02975486 0.036876
  0.04303711 0.03443421 0.03356075 0.03135845 0.03266712 0.03934086
  0.02475732 0.04486758 0.02205345 0.0416355  0.04394628 0.03109134
  0.03432642 0.03004995]], shape=(1, 32), dtype=float32)
tf.Tensor(1.0, shape=(), dtype=float32)

So, if x is the output of your previous layer, you can just run:

x = tf.nn.relu(x)
x = tf.nn.softmax(x)
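If you want this as an actual layer inside the model rather than as post-processing, one option (a minimal sketch I'm adding here, not part of the original answer) is to wrap the two ops in a tf.keras.layers.Lambda:

import tensorflow as tf

inputs = tf.keras.layers.Input((5,))
x = tf.keras.layers.Dense(32)(inputs)                 # previous layer, no activation
outputs = tf.keras.layers.Lambda(
    lambda t: tf.nn.softmax(tf.nn.relu(t)))(x)        # relu then softmax applied as one layer
model = tf.keras.Model(inputs, outputs)

y = model(tf.random.normal((1, 5)))
print(tf.reduce_sum(y))  # 1.0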
