

How to implement different activation functions in a layer of a neural network in Tensorflow?

The following line creates a layer of size three with the sigmoid activation function on each neuron:

out = layers.dense(inputs=inp, units=3, activation=tf.nn.sigmoid)

What I would like to do is something like this:

out = layers.dense(inputs=inp, units=3, activation=[tf.nn.sigmoid, tf.nn.sigmoid, tf.nn.relu])

In essence, the first two neurons would use the sigmoid activation function and the third neuron would use the relu activation function.

My question is: How do I implement this?

I would appreciate it if someone could answer this question.

The easiest and cleanest way is to just create two output layers:

sigmoid_out = layers.dense(inputs=inp, units=2, activation=tf.nn.sigmoid)
relu_out = layers.dense(inputs=inp, units=1, activation=tf.nn.relu)

You can then concatenate the two layers if you want:

out = tf.concat([sigmoid_out, relu_out], axis=1)
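
For completeness, here is a minimal end-to-end sketch of this approach, assuming TensorFlow 1.x where `layers` refers to `tf.layers`; the input placeholder shape and the feed values are illustrative assumptions, not part of the original question:

import tensorflow as tf
from tensorflow import layers

# Hypothetical input: a batch of vectors with 4 features each (shape is an assumption)
inp = tf.placeholder(tf.float32, shape=[None, 4])

# Two sigmoid units and one relu unit, built as separate dense layers
sigmoid_out = layers.dense(inputs=inp, units=2, activation=tf.nn.sigmoid)
relu_out = layers.dense(inputs=inp, units=1, activation=tf.nn.relu)

# Concatenate along the feature axis to get a single 3-unit output
out = tf.concat([sigmoid_out, relu_out], axis=1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(out, feed_dict={inp: [[0.1, 0.2, 0.3, 0.4]]})
    print(result.shape)  # (1, 3)

Because each dense layer has its own weight matrix and bias, concatenating the two outputs is equivalent to a single 3-unit layer in which the first two neurons use sigmoid and the third uses relu.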
