Package deepnetts.net.layers.activation
Class LeakyRelu
java.lang.Object
deepnetts.net.layers.activation.LeakyRelu
- All Implemented Interfaces:
  ActivationFunction, Serializable, Consumer&lt;TensorBase&gt;
Leaky Rectified Linear Activation and its Derivative.

 y  = x,         for x > 0
      0.01 * x,  for x <= 0

 y' = 1,         for x > 0
      0.01,      for x <= 0

Allows a small, positive gradient when the unit is not active. The negative-side slope (0.01 here) can be changed via the LeakyRelu(float a) constructor.
Reference: Maas et al., "Rectifier Nonlinearities Improve Neural Network Acoustic Models" (ICML 2013),
https://ai.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf
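A minimal usage sketch follows (it assumes the default negative-side slope is 0.01, matching the derivative above; the class name LeakyReluDemo is illustrative):

import deepnetts.net.layers.activation.LeakyRelu;

public class LeakyReluDemo {
    public static void main(String[] args) {
        LeakyRelu relu = new LeakyRelu(); // default negative-side slope

        // Positive inputs pass through unchanged.
        System.out.println(relu.getValue(2.0f));   // 2.0

        // Negative inputs are scaled by the small slope rather than zeroed,
        // so some gradient still flows when the unit is not active.
        System.out.println(relu.getValue(-2.0f));  // -2.0 * slope

        // Note: getPrime takes the function's OUTPUT y, not the input x.
        System.out.println(relu.getPrime(relu.getValue(2.0f))); // 1.0
    }
}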
- Author:
 - Zoran Sevarac
Constructor Summary

Constructors:
  LeakyRelu()
  LeakyRelu(float a)
Method Summary

Modifier and Type   Method                                       Description
void                apply(TensorBase tensor)
void                apply(TensorBase tensor, int from, int to)
float               getPrime(float y)                            Returns the first derivative of activation function for specified output y
float               getValue(float x)                            Returns the value of activation function for specified input x

Methods inherited from class java.lang.Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface deepnetts.net.layers.activation.ActivationFunction:
  accept
Constructor Details

LeakyRelu
public LeakyRelu()

LeakyRelu
public LeakyRelu(float a)
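A short sketch of both constructors; it is assumed here that the argument a sets the slope applied to negative inputs (the class name LeakyReluSlopes is illustrative):

import deepnetts.net.layers.activation.ActivationFunction;
import deepnetts.net.layers.activation.LeakyRelu;

public class LeakyReluSlopes {
    public static void main(String[] args) {
        // Assumption: the constructor argument a is the slope used for x <= 0.
        ActivationFunction byDefault = new LeakyRelu();
        ActivationFunction steeper = new LeakyRelu(0.2f);

        System.out.println(byDefault.getValue(-1.0f)); // -1.0 * default slope
        System.out.println(steeper.getValue(-1.0f));   // expected: -0.2
    }
}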
Method Details

getValue
public float getValue(float x)
Description copied from interface: ActivationFunction
Returns the value of activation function for specified input x.
- Specified by:
  getValue in interface ActivationFunction
- Parameters:
  x - input for activation
- Returns:
  value of activation function
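The piecewise rule behind getValue can be sketched in plain Java (leakyReluValue is an illustrative helper, not the library's code; the slope is assumed to default to 0.01):

public final class LeakyReluMath {
    // Illustrative re-implementation of the leaky ReLU value, not library code.
    static float leakyReluValue(float x, float slope) {
        // Identity for positive inputs, a small linear slope for x <= 0.
        return x > 0 ? x : slope * x;
    }

    public static void main(String[] args) {
        System.out.println(leakyReluValue(3.0f, 0.01f));  // 3.0
        System.out.println(leakyReluValue(-2.0f, 0.01f)); // -0.02
    }
}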
getPrime
public float getPrime(float y)
Description copied from interface: ActivationFunction
Returns the first derivative of activation function for specified output y.
- Specified by:
  getPrime in interface ActivationFunction
- Parameters:
  y - output of activation function
- Returns:
  first derivative of activation function
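Note that getPrime is evaluated on the output y rather than the input x: leaky ReLU is strictly increasing, so y > 0 exactly when x > 0, and the correct branch can be picked from y alone. A sketch under that reading (leakyReluPrime is an illustrative helper, not library code):

public final class LeakyReluPrime {
    // Illustrative sketch, not the library's code.
    static float leakyReluPrime(float y, float slope) {
        // y > 0 iff x > 0, so the derivative follows directly from the output.
        return y > 0 ? 1.0f : slope;
    }

    public static void main(String[] args) {
        System.out.println(leakyReluPrime(2.0f, 0.01f));   // 1.0
        System.out.println(leakyReluPrime(-0.02f, 0.01f)); // 0.01
    }
}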
apply
public void apply(TensorBase tensor)
- Specified by:
  apply in interface ActivationFunction

apply
public void apply(TensorBase tensor, int from, int to)
- Specified by:
  apply in interface ActivationFunction
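Since TensorBase internals are not shown on this page, the elementwise effect of apply(tensor, from, to) is sketched below over a plain float[] (applyInPlace is a hypothetical stand-in; it is also assumed that from is inclusive and to is exclusive):

public final class ApplySketch {
    // Hypothetical illustration of the elementwise update; the real method
    // mutates a TensorBase, not a float[].
    static void applyInPlace(float[] values, int from, int to, float slope) {
        for (int i = from; i < to; i++) {
            values[i] = values[i] > 0 ? values[i] : slope * values[i];
        }
    }

    public static void main(String[] args) {
        float[] v = {-2.0f, -0.5f, 0.0f, 1.5f};
        applyInPlace(v, 0, v.length, 0.01f);
        System.out.println(java.util.Arrays.toString(v)); // [-0.02, -0.005, 0.0, 1.5]
    }
}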