I need to set the attribute activation_out = 'logistic' in an MLPRegressor in sklearn. Supposedly, this attribute can take the names of the rel ...
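For context: MLPRegressor hard-codes its output activation to the identity and exposes no constructor parameter to change it. When the targets lie in (0, 1), one workaround is to train in logit space and map predictions back through the logistic function with TransformedTargetRegressor. A minimal sketch, with toy data and layer sizes chosen only for illustration:

```python
import numpy as np
from scipy.special import expit, logit
from sklearn.compose import TransformedTargetRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.RandomState(0)
X = rng.rand(200, 3)
# toy targets squeezed strictly into (0, 1)
y = np.clip(X.sum(axis=1) / 3.0, 1e-3, 1 - 1e-3)

# Train in logit space; predictions are mapped back through expit,
# so the effective output nonlinearity is logistic.
model = TransformedTargetRegressor(
    regressor=MLPRegressor(hidden_layer_sizes=(16,),
                           max_iter=500, random_state=0),
    func=logit,
    inverse_func=expit,
)
model.fit(X, y)
pred = model.predict(X)  # all values lie strictly in (0, 1)
```

This sidesteps the missing attribute rather than setting it: the network itself still has a linear output, but the composed estimator behaves as if its output activation were logistic.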
I think this is a misunderstanding on my part, but I would appreciate any help. I'm trying to learn how to use PyTorch for autoencoding. In the nn.Linear func ...
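For reference, a minimal PyTorch autoencoder built from nn.Linear layers; the dimensions (784 → 32 → 784) are assumed for illustration. Note that nn.Linear(in_features, out_features) stores its weight with shape (out_features, in_features), which is a common point of confusion:

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    # Toy sizes (784 -> 32 -> 784) chosen only for illustration.
    def __init__(self, in_dim=784, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, code_dim),  # weight shape: (code_dim, in_dim)
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, in_dim),
            nn.Sigmoid(),  # reconstruction squashed into [0, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
out = model(torch.randn(8, 784))  # same shape out as in: (8, 784)
```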
I am creating a function that takes a tensor value and returns the result of applying the following formula. There are three conditions, so I am using ...
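A three-branch elementwise function like this is usually written with nested torch.where calls, which stay differentiable and vectorized. The question's actual formula is truncated, so the rule below (a hard-sigmoid-like clamp) is only an assumed stand-in:

```python
import torch

def piecewise3(x: torch.Tensor) -> torch.Tensor:
    # Illustrative 3-condition rule (assumed, not the question's formula):
    #   0  for x < 0
    #   x  for 0 <= x < 1
    #   1  for x >= 1
    return torch.where(x < 0, torch.zeros_like(x),
           torch.where(x < 1, x, torch.ones_like(x)))

y = piecewise3(torch.tensor([-2.0, 0.5, 3.0]))  # -> [0.0, 0.5, 1.0]
```

Python `if`/`elif` on a tensor fails (truth value of a multi-element tensor is ambiguous); torch.where evaluates the condition elementwise instead.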
I am taking intro to ML on Coursera offered by Duke, which I recommend if you are interested in ML. The instructors of this course explained that "We ...
I am trying to fine-tune GPT-J, but I get this error. I think it's related to the activation function being in-place, but I don't know how to code ...
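In-place activations (e.g. nn.ReLU(inplace=True)) can raise autograd errors during fine-tuning when the overwritten tensor is needed for the backward pass. A common fix, sketched here on a toy model, is to switch off the `inplace` flag on every submodule that has one:

```python
import torch
import torch.nn as nn

def disable_inplace(model: nn.Module) -> nn.Module:
    # Flip the `inplace` flag on every submodule that has one
    # (ReLU, LeakyReLU, Dropout, ...), so no activation overwrites
    # a tensor that autograd still needs.
    for module in model.modules():
        if hasattr(module, "inplace"):
            module.inplace = False
    return model

net = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
disable_inplace(net)
```

This trades a little memory for correctness; whether it resolves this particular GPT-J error depends on where the in-place operation actually occurs.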
self.classifier = nn.Sequential( nn.Flatten(), nn.Linear(in_features = 32*8*8, out_features = 26), nn.ReLU(), nn.Linea ...
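The snippet above is flattened and cut off mid-identifier. A complete, runnable version of such a classifier head looks like the sketch below; the final layer's output size is truncated in the original, so 10 classes are assumed here:

```python
import torch
import torch.nn as nn

# Reconstructed head; the last layer's out_features is truncated in
# the snippet above, so 10 output classes are assumed.
classifier = nn.Sequential(
    nn.Flatten(),
    nn.Linear(in_features=32 * 8 * 8, out_features=26),
    nn.ReLU(),
    nn.Linear(in_features=26, out_features=10),
)
logits = classifier(torch.randn(4, 32, 8, 8))  # (batch, 10)
```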
I am building a reinforcement learning model. I am trying to use PRelu in my 2D Conv model using tensorflow. Here is the code for Actor Model. code: ...
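A frequent stumbling block here: Keras's PReLU is a weight-carrying layer, not a string you can pass as `activation=`. It goes into the model as its own layer after the Conv2D. A minimal actor-style sketch with assumed input and action shapes:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Minimal actor-style conv stack (all shapes assumed). PReLU is added
# as a layer, with its slopes shared across the spatial axes.
actor = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 4)),
    layers.Conv2D(16, 3, padding="same"),
    layers.PReLU(shared_axes=[1, 2]),
    layers.Flatten(),
    layers.Dense(2, activation="tanh"),  # action output in [-1, 1]
])
out = actor(tf.random.normal((5, 8, 8, 4)))
```

`shared_axes=[1, 2]` gives one learned slope per channel instead of one per pixel, which is the usual choice for conv feature maps.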
I am trying to implement a custom version of the PReLU activation function in tensorflow. The custom thing about this activation is that the knee of the rel ...
Pseudocode for Custom_activation_function is provided below: In the two pieces of code above, Custom_activation_function is a class of customised ...
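Since the pseudocode is truncated, here is one way such a customization is commonly realized (an assumption about the intent): a Keras layer in which both the slope and the break point ("knee") are trainable weights, with the function kept continuous at the knee by construction.

```python
import tensorflow as tf

class KneePReLU(tf.keras.layers.Layer):
    """PReLU-like activation whose break point ("knee") is trainable
    rather than fixed at zero. Continuous at the knee by construction."""

    def build(self, input_shape):
        self.alpha = self.add_weight(
            name="alpha", shape=(),
            initializer=tf.keras.initializers.Constant(0.25))
        self.knee = self.add_weight(
            name="knee", shape=(), initializer="zeros")

    def call(self, x):
        # slope 1 above the knee, slope alpha below it
        return tf.where(x >= self.knee, x,
                        self.knee + self.alpha * (x - self.knee))

layer = KneePReLU()
y = layer(tf.constant([-1.0, 2.0]))  # with defaults: [-0.25, 2.0]
```

Because `alpha` and `knee` are created with add_weight, the optimizer updates them during training like any other parameter.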
I have been facing the following error while iterating through a range of neurons and activation functions. The error is encountered only in the case of Prel ...
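Errors that appear only for PReLU in such loops typically come from passing "prelu" as an activation string, which Keras does not accept because PReLU is a layer with its own weights. A sketch of a loop body that special-cases it (layer sizes assumed):

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_net(n_units, act_name):
    # "prelu" is a weight-carrying layer, not an activation string,
    # so it must be added as its own layer after Dense.
    model = tf.keras.Sequential([tf.keras.Input(shape=(4,))])
    if act_name == "prelu":
        model.add(layers.Dense(n_units))
        model.add(layers.PReLU())
    else:
        model.add(layers.Dense(n_units, activation=act_name))
    model.add(layers.Dense(1))
    return model

models = [build_net(8, a) for a in ("relu", "tanh", "prelu")]
```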
I would like to predict a multi-dimensional array using Long Short-Term Memory (LSTM) networks while imposing restrictions on the shape of the surface ...
I am trying to build a Keras neural network where the activation function at the output layer (conditionally) depends on the inputs. The activation fu ...
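One way to make the output activation depend on the inputs in Keras is a small custom layer that selects between nonlinearities elementwise with tf.where. The question's actual condition is truncated, so the sign test on a separate conditioning tensor below is only an assumed placeholder:

```python
import tensorflow as tf

class ConditionalActivation(tf.keras.layers.Layer):
    # Assumed rule: sigmoid where the condition tensor is positive,
    # tanh elsewhere. Swap in the real condition as needed.
    def call(self, inputs):
        x, cond = inputs
        return tf.where(cond > 0, tf.sigmoid(x), tf.tanh(x))

act = ConditionalActivation()
y = act([tf.constant([[2.0], [2.0]]),
         tf.constant([[1.0], [-1.0]])])
# y[0] ~ sigmoid(2) ~ 0.881, y[1] ~ tanh(2) ~ 0.964
```

Because tf.where is differentiable with respect to `x`, gradients flow through whichever branch was selected.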
I am trying to figure out what happens after every activation layer in a CNN architecture. Therefore, I have written code to visualize some activation layers in ...
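The usual pattern for this is to build the CNN with the functional API and then create a second "probe" model over the same graph whose outputs are the intermediate activations. A minimal sketch with assumed shapes:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Small functional CNN (all shapes assumed) whose intermediate
# activations we want to inspect.
inp = tf.keras.Input(shape=(16, 16, 1))
x = layers.Conv2D(8, 3, activation="relu", name="conv1")(inp)
x = layers.Conv2D(8, 3, activation="relu", name="conv2")(x)
model = tf.keras.Model(inp, layers.Flatten()(x))

# A second model sharing the same graph that returns each conv's
# activation maps; feed an image and plot maps[i][0, :, :, channel].
probe = tf.keras.Model(
    inp, [model.get_layer(n).output for n in ("conv1", "conv2")])
maps = probe(tf.random.normal((1, 16, 16, 1)))
```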
I find that tensorflow and pytorch tanh results are different, and I want to know why this happens. I know that the difference is very small, so is this ...
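Differences in the last bit of a float32 result are expected when two libraries use different tanh kernels (vectorized approximations vs. libm, CPU vs. GPU). The effect can be reproduced within NumPy alone by comparing a native float32 evaluation against a float64 evaluation rounded back down:

```python
import numpy as np

x = np.float32(0.7)
# Same input, two evaluation paths: native float32 tanh vs.
# float64 tanh rounded back to float32. Any gap is rounding noise.
a = np.tanh(x)
b = np.float32(np.tanh(np.float64(x)))
diff = abs(float(a) - float(b))
```

Discrepancies on this scale (around 1e-7 for float32) are numerical noise, not a bug in either framework.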
I have a Perceptron written in JavaScript that works fine; code below. My question is about the threshold in the activation function. Other code I hav ...
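The usual answer to this kind of threshold question: a nonzero threshold in a step activation is equivalent to folding it into the bias and comparing against zero, which is why much perceptron code has no explicit threshold. A Python sketch of the equivalence (the question's code is JavaScript, but the idea is language-independent):

```python
def step(weighted_sum, threshold=0.0):
    # Fires when the weighted sum reaches the threshold.
    return 1 if weighted_sum >= threshold else 0

def step_with_bias(weighted_sum, bias):
    # Equivalent formulation: fold the threshold into a bias of
    # -threshold and compare against zero instead.
    return 1 if weighted_sum + bias >= 0 else 0

a = step(0.4, threshold=0.5)        # below threshold -> 0
b = step_with_bias(0.4, bias=-0.5)  # same decision -> 0
```

Treating the threshold as a bias also makes it learnable with the same update rule as the weights.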
Is it possible to define a function that returns an activation function? I tried to do this: But I get an error when trying to call it. Here is an example: it ou ...
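Assuming the goal is a parameterized activation, the standard pattern is a factory function that returns a closure. A common mistake that produces a call-time error is returning `activation(x)` (a value) instead of `activation` (the function). A minimal sketch:

```python
import numpy as np

def make_scaled_tanh(scale):
    # Return the inner function itself, not its result;
    # `scale` is captured in the closure.
    def activation(x):
        return scale * np.tanh(x)
    return activation

act = make_scaled_tanh(2.0)
y = act(np.array([0.0, 100.0]))  # -> [0.0, 2.0] (tanh saturates at 1)
```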
I understand that from a computational point of view, even if the output of an activation function is zero it still outputs the zero from the neuron t ...
While defining an activation function (tanh), do I need to write lambda x: numpy.tanh(x)? Or should I write only activation function = numpy.tanh? This ...
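Both forms are callables and compute the same values; the lambda only adds a pointless extra indirection, so assigning the function object directly is the idiomatic choice:

```python
import numpy as np

# A function name is already a callable; wrapping it in a lambda
# only adds an extra layer of indirection.
activation_a = np.tanh
activation_b = lambda x: np.tanh(x)

same = activation_a(0.3) == activation_b(0.3)  # True
```

The lambda form is only needed when extra arguments must be bound, e.g. `lambda x: np.tanh(beta * x)`.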
I'm trying to write a class for an invertible trainable LeakyReLU in which the model modifies the negative_slope in each iteration. However, I set requ ...
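The built-in nn.LeakyReLU stores negative_slope as a plain Python float, so setting requires_grad on it has no effect. For the slope to be trained, it must be registered as an nn.Parameter in a custom module, which also makes the inverse easy to expose. A sketch:

```python
import torch
import torch.nn as nn

class InvertibleLeakyReLU(nn.Module):
    """LeakyReLU whose negative_slope is an nn.Parameter, so the
    optimizer updates it every iteration; also exposes the inverse."""

    def __init__(self, init_slope=0.1):
        super().__init__()
        self.negative_slope = nn.Parameter(torch.tensor(float(init_slope)))

    def forward(self, x):
        return torch.where(x >= 0, x, self.negative_slope * x)

    def inverse(self, y):
        # valid while negative_slope != 0
        return torch.where(y >= 0, y, y / self.negative_slope)

m = InvertibleLeakyReLU()
x = torch.tensor([-2.0, 3.0])
recon = m.inverse(m(x))  # round-trips back to x
```

Because the parameter appears in forward, gradients with respect to the slope flow automatically during backprop.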
I'm trying to write a piecewise activation function whose slope between -6 and 0 is 0.1 and is 1 everywhere else. The input (X) size is (B, C, ...
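One continuous realization of this (the offsets are an assumption, chosen so the pieces join at the breakpoints) can be written with nested torch.where; since it works elementwise, any (B, C, ...) input shape is handled automatically:

```python
import torch

def piecewise_slope(x: torch.Tensor) -> torch.Tensor:
    # Slope 0.1 on [-6, 0), slope 1 elsewhere; the +5.4 offset keeps
    # the function continuous at x = -6 (0.1 * -6 == -6 + 5.4).
    return torch.where(x >= 0, x,
           torch.where(x >= -6, 0.1 * x, x + 5.4))

x = torch.randn(2, 3, 4, 4)  # elementwise, so (B, C, H, W) is fine
y = piecewise_slope(x)
vals = piecewise_slope(torch.tensor([1.0, -3.0, -10.0]))
# -> [1.0, -0.3, -4.6]
```

Being built from torch.where, the function is differentiable almost everywhere and usable as a drop-in activation.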