
How to use leaky ReLUs as the activation function in hidden layers in pylearn2

I am using the pylearn2 library to design a CNN. I want to use leaky ReLUs as the activation function in one layer. Is there any way to do this with pylearn2? Do I have to write a custom function for it, or does pylearn2 have built-in functions for that? If so, how do I write the custom code? Can anyone help me out here?

The ConvElemwise super-class is a generic convolutional elemwise layer. Among its subclasses, ConvRectifiedLinear is a convolutional rectified linear layer that uses the RectifierConvNonlinearity class.

In its apply() method:

    # leaky ReLU: keep positive responses, scale negative ones by left_slope
    p = linear_response * (linear_response > 0.) + self.left_slope *\
        linear_response * (linear_response < 0.)
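
So to get a leaky ReLU you pass a non-zero left_slope to RectifierConvNonlinearity and hand that nonlinearity to a ConvElemwise layer. A minimal sketch, assuming the ConvElemwise constructor arguments of the pylearn2 API at the time (the layer_name, output_channels, kernel_shape, pooling and irange values below are illustrative only):

    from pylearn2.models.mlp import ConvElemwise, RectifierConvNonlinearity

    # Rectifier nonlinearity with a non-zero slope on the negative side,
    # i.e. a leaky ReLU with slope 0.01 for linear_response < 0.
    leaky_relu = RectifierConvNonlinearity(left_slope=0.01)

    # Convolutional layer that applies the leaky ReLU elementwise.
    conv1 = ConvElemwise(layer_name='conv1',
                         output_channels=32,
                         kernel_shape=[5, 5],
                         pool_type='max',
                         pool_shape=[2, 2],
                         pool_stride=[2, 2],
                         nonlinearity=leaky_relu,
                         irange=0.05)

The resulting layer can then be listed among the layers of an MLP (or referenced from a YAML file via the !obj: syntax) wherever a ConvRectifiedLinear layer would otherwise go.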

As this gentle review points out:

... Maxout neuron (introduced recently by Goodfellow et al.) that generalizes the ReLU and its leaky version.

Examples are MaxoutLocalC01B or MaxoutConvC01B.
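
To see why Maxout subsumes the leaky ReLU, note that for 0 < left_slope < 1 the maximum of the two linear pieces x and left_slope * x equals x for positive inputs and left_slope * x for negative ones. A small numpy sketch of that equivalence (numpy is used here only for illustration; the pylearn2 Maxout layers learn their linear pieces rather than fixing them):

    import numpy as np

    def leaky_relu(x, left_slope=0.01):
        # same elementwise rule as in RectifierConvNonlinearity.apply()
        return x * (x > 0.) + left_slope * x * (x < 0.)

    def maxout_two_pieces(x, left_slope=0.01):
        # Maxout over two fixed linear pieces: the identity and the scaled copy
        return np.maximum(x, left_slope * x)

    x = np.linspace(-3., 3., 7)
    assert np.allclose(leaky_relu(x), maxout_two_pieces(x))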

The reason for the lack of an answer on pylearn2-user may be that pylearn2 is mostly written by researchers at the LISA lab and, thus, the threshold for point 13 in the FAQ may be high.
