Package deepnetts.net.weights
Class RandomWeights
java.lang.Object
deepnetts.net.weights.RandomWeights
-
Constructor Summary
Constructors
RandomWeights()
Method Summary
Modifier and Type    Method and Description
static void          gaussian(float[] weights, float mean, float std)
static void          he(float[] weights, int numInputs)
                     He initialization, used for ReLU activations; zero-mean.
static void          initSeed(long seed)
                     Initializes the random number generator with the specified seed.
static void          normal(float[] weights)
static void          randomize(float[] array)
                     Initializes the elements of the specified array with random numbers uniformly distributed in the range [-0.5, 0.5].
static void          uniform(float[] weights, float min, float max)
static void          uniform(float[] weights, int numInputs)
                     Uniform U[-a, a] where a = 1/sqrt(in).
static void          widrowHoff(float[] array, float input, float hidden)
static void          xavier(float[] weights, int numIn, int numOut)
                     Normalized uniform initialization U[-a, a] with a = sqrt(6/(in + out)).
-
Constructor Details
-
RandomWeights
public RandomWeights()
-
-
Method Details
-
initSeed
public static void initSeed(long seed)
Initializes the random number generator with the specified seed.
Parameters:
seed - init value for the random generator
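A minimal usage sketch showing how seeding makes initialization reproducible, assuming initSeed re-seeds the shared generator used by the other methods (the wrapper class and array size are illustrative):

    import deepnetts.net.weights.RandomWeights;
    import java.util.Arrays;

    public class SeedDemo {
        public static void main(String[] args) {
            RandomWeights.initSeed(42L);               // fix the seed
            float[] w1 = new float[4];
            RandomWeights.randomize(w1);

            RandomWeights.initSeed(42L);               // re-seed with the same value
            float[] w2 = new float[4];
            RandomWeights.randomize(w2);

            System.out.println(Arrays.equals(w1, w2)); // true: same seed, same weights
        }
    }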
-
randomize
public static void randomize(float[] array)
Initializes the elements of the specified array with random numbers uniformly distributed in the range [-0.5, 0.5].
Parameters:
array - the array to initialize
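For intuition, a hedged sketch of an equivalent fill using java.util.Random; the library may generate values differently internally:

    import java.util.Random;

    final class RandomizeSketch {
        // Fill with uniform values in [-0.5, 0.5); not the library source.
        static void randomize(float[] array, Random rnd) {
            for (int i = 0; i < array.length; i++) {
                array[i] = rnd.nextFloat() - 0.5f; // nextFloat() is in [0, 1)
            }
        }
    }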
-
widrowHoff
public static void widrowHoff(float[] array, float input, float hidden)
-
uniform
public static void uniform(float[] weights, int numInputs)
Uniform U[-a, a] where a = 1/sqrt(in). This is the initialization referred to as the "commonly used heuristic" in Xavier Glorot and Yoshua Bengio, 2010, Understanding the difficulty of training deep feedforward neural networks, eq. 1, p. 251: http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf (also at http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf)
Parameters:
weights - the array of weights to randomize
numInputs - the number of inputs from the previous layer
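For intuition, a hedged sketch of the U[-a, a] fill described above (not the library source; java.util.Random is used for illustration):

    import java.util.Random;

    final class UniformSketch {
        static void uniform(float[] weights, int numInputs, Random rnd) {
            float a = (float) (1.0 / Math.sqrt(numInputs)); // a = 1/sqrt(in)
            for (int i = 0; i < weights.length; i++) {
                weights[i] = rnd.nextFloat() * 2 * a - a;   // uniform in [-a, a)
            }
        }
    }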
-
uniform
public static void uniform(float[] weights, float min, float max)
-
he
public static void he(float[] weights, int numInputs)
He initialization, used for ReLU and leaky ReLU activations. Fills the weights from a zero-mean Gaussian distribution whose standard deviation is sqrt(2/n_l) (eq. 10 in Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification, https://arxiv.org/pdf/1502.01852.pdf).
Parameters:
weights - the weights to initialize
numInputs - the number of inputs
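For intuition, a hedged sketch of the He fill described above (not the library source):

    import java.util.Random;

    final class HeSketch {
        static void he(float[] weights, int numInputs, Random rnd) {
            float std = (float) Math.sqrt(2.0 / numInputs);      // std = sqrt(2/n_l), eq. 10
            for (int i = 0; i < weights.length; i++) {
                weights[i] = (float) (rnd.nextGaussian() * std); // zero-mean Gaussian
            }
        }
    }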
-
gaussian
public static void gaussian(float[] weights, float mean, float std)
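A minimal usage sketch, assuming the parameters set the mean and standard deviation of the distribution as their names suggest (the values shown are illustrative, not recommendations):

    import deepnetts.net.weights.RandomWeights;

    public class GaussianDemo {
        public static void main(String[] args) {
            float[] weights = new float[128];
            // Assumed: draws each weight from a Gaussian with the given mean and std
            RandomWeights.gaussian(weights, 0.0f, 0.01f);
        }
    }
-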
normal
public static void normal(float[] weights)
-
xavier
public static void xavier(float[] weights, int numIn, int numOut)
Normalized uniform initialization U[-a, a] with a = sqrt(6/(in + out)); a properly scaled uniform distribution ("normalized initialization"). See Xavier Glorot and Yoshua Bengio, 2010, Understanding the difficulty of training deep feedforward neural networks, p. 253, eq. 16: http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf (also at http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf). Use for tanh activations.
Parameters:
weights - the array of weights to initialize
numIn - size of the previous layer (number of inputs)
numOut - size of the initialized layer (number of outputs)
See Also:
https://towardsdatascience.com/weight-initialization-techniques-in-neural-networks-26c649eb3b78
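For intuition, a hedged sketch of the normalized uniform fill described above (not the library source):

    import java.util.Random;

    final class XavierSketch {
        static void xavier(float[] weights, int numIn, int numOut, Random rnd) {
            float a = (float) Math.sqrt(6.0 / (numIn + numOut)); // a = sqrt(6/(in + out)), eq. 16
            for (int i = 0; i < weights.length; i++) {
                weights[i] = rnd.nextFloat() * 2 * a - a;        // uniform in [-a, a)
            }
        }
    }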
-