public final class Relu extends Object implements ActivationFunction, Serializable
Constructor Summary

| Constructor and Description |
| --- |
| Relu() |
Method Summary

| Modifier and Type | Method and Description |
| --- | --- |
| void | apply(Tensor tensor, int channel) |
| void | apply(Tensor tensor, int from, int to) |
| float | getPrime(float y) Returns the first derivative of the activation function for the specified output y |
| float | getValue(float x) Returns the value of the activation function for the specified input x |
Methods inherited from class java.lang.Object

equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface ActivationFunction

create
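The two scalar methods summarized above implement the standard rectified linear unit: getValue computes f(x) = max(0, x) and getPrime returns its first derivative expressed in terms of the output y (1 where y > 0, 0 otherwise). The following self-contained sketch only illustrates that math; it is not the library's source, and the method names mirror the ones documented on this page purely for readability.

```java
// Minimal, self-contained illustration of the ReLU math documented above.
// This is a sketch, not the library's implementation.
public class ReluSketch {

    // Mirrors getValue(float x): f(x) = max(0, x)
    static float getValue(float x) {
        return Math.max(0f, x);
    }

    // Mirrors getPrime(float y): derivative expressed via the output y.
    // Since y = max(0, x), y > 0 exactly when x > 0, so the derivative is 1 for y > 0 and 0 otherwise.
    static float getPrime(float y) {
        return y > 0f ? 1f : 0f;
    }

    public static void main(String[] args) {
        float[] inputs = {-2.5f, 0.0f, 0.7f, 3.2f};
        for (float x : inputs) {
            float y = getValue(x);
            System.out.printf("x=%5.2f  relu(x)=%4.2f  relu'(via y)=%3.1f%n", x, y, getPrime(y));
        }
    }
}
```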
Method Detail

getValue

public float getValue(float x)

Description copied from interface: ActivationFunction
Returns the value of the activation function for the specified input x.

Specified by: getValue in interface ActivationFunction
Parameters: x - input for activation

getPrime

public float getPrime(float y)

Description copied from interface: ActivationFunction
Returns the first derivative of the activation function for the specified output y.

Specified by: getPrime in interface ActivationFunction
Parameters: y - output of activation function

apply

public void apply(Tensor tensor, int channel)

Specified by: apply in interface ActivationFunction

apply

public void apply(Tensor tensor, int from, int to)

Specified by: apply in interface ActivationFunction
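Note on the getPrime signature: the derivative is requested for a given output y rather than the input. For ReLU this is sufficient because y = max(0, x) is positive exactly when x is positive, so the derivative (1 for positive, 0 otherwise) can be recovered from the output alone, which is the value a backpropagation pass already has in hand. Below is a hedged usage sketch using only the members documented on this page; the package and import statements for Relu, ActivationFunction, and Tensor depend on the library distribution and are omitted here.

```java
// Hypothetical usage sketch; assumes Relu and ActivationFunction from this
// library are on the classpath (imports omitted, package not shown on this page).
ActivationFunction relu = new Relu();

float x = 0.7f;
float y = relu.getValue(x);      // forward pass: max(0, 0.7) = 0.7
float dydx = relu.getPrime(y);   // backward pass: 1.0 because y > 0

// In-place application over part of a Tensor, per the apply overloads above
// (Tensor construction is not documented on this page, so these calls are illustrative only):
// relu.apply(tensor, 0);        // activate a single channel
// relu.apply(tensor, 0, 128);   // activate elements in the index range [0, 128)
```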