Package | Description |
---|---|
deepnetts.net.layers | Neural network layers, which are the main building blocks of a neural network. |
deepnetts.net.train.opt | Optimization methods used by the training algorithm. |
Modifier and Type | Method and Description |
---|---|
Optimizer | AbstractLayer.getOptimizer() |
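The accessor above returns the optimizer instance attached to a layer. A minimal sketch of its use follows; how the AbstractLayer instance is obtained from a built network is outside the scope of this page and is assumed here.

```java
import deepnetts.net.layers.AbstractLayer;
import deepnetts.net.train.opt.Optimizer;

// Minimal sketch: inspect which optimizer a given layer is currently using.
// Only the getOptimizer() accessor listed in the table above is exercised;
// obtaining the layer itself is an assumption of this example.
public final class OptimizerInspection {

    public static void printOptimizer(AbstractLayer layer) {
        Optimizer opt = layer.getOptimizer();
        System.out.println("Layer uses: " + opt.getClass().getSimpleName());
    }
}
```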
Modifier and Type | Class and Description |
---|---|
class | AdaDeltaOptimizer: Implementation of the ADADELTA Optimizer, a modification of AdaGrad that uses only a limited window of previous gradients. |
class | AdaGradOptimizer: Implementation of the ADAGRAD Optimizer, which uses the sum of squared previous gradients to adjust a global learning rate for each weight. |
class | AdamOptimizer: Implementation of the Adam optimizer, a variation of RmsProp that includes a momentum-like factor. |
class | MomentumOptimizer: Adds a momentum parameter to basic Stochastic Gradient Descent, which can accelerate training. |
class | RmsPropOptimizer: A variation of the AdaDelta optimizer. |
class | SgdOptimizer: Basic Stochastic Gradient Descent optimization algorithm, which iteratively changes weights toward the values that give the minimum error. |
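The class descriptions above refer to the standard per-weight update rules of these algorithms. The plain-Java sketch below illustrates those formulas for a single weight; it is not Deep Netts' internal implementation, and all variable names and hyper-parameter values are illustrative assumptions (the Adam step is shown without bias correction).

```java
// Illustrative single-weight update rules for the optimizers listed above.
// This is a standalone sketch of the standard formulas, not library code.
public final class UpdateRuleSketch {

    public static void main(String[] args) {
        double grad = 0.3;    // gradient of the error w.r.t. one weight (assumed value)
        double lr = 0.01;     // global learning rate (assumed value)
        double eps = 1e-8;    // small constant to avoid division by zero

        // SGD: move the weight against the gradient.
        double sgdDelta = -lr * grad;

        // Momentum: add a velocity term (previous delta scaled by a momentum factor).
        double momentum = 0.9, prevDelta = -0.002;
        double momentumDelta = momentum * prevDelta - lr * grad;

        // AdaGrad: scale the learning rate by the sum of squared past gradients.
        double sqGradSum = 0.05;
        sqGradSum += grad * grad;
        double adaGradDelta = -lr * grad / (Math.sqrt(sqGradSum) + eps);

        // RmsProp: like AdaGrad, but with an exponentially decaying average
        // of squared gradients instead of an ever-growing sum.
        double decay = 0.9, sqGradAvg = 0.05;
        sqGradAvg = decay * sqGradAvg + (1 - decay) * grad * grad;
        double rmsPropDelta = -lr * grad / (Math.sqrt(sqGradAvg) + eps);

        // Adam: RmsProp plus a momentum-like first-moment estimate
        // (bias correction omitted for brevity).
        double beta1 = 0.9, beta2 = 0.999, m = 0.0, v = 0.0;
        m = beta1 * m + (1 - beta1) * grad;
        v = beta2 * v + (1 - beta2) * grad * grad;
        double adamDelta = -lr * m / (Math.sqrt(v) + eps);

        System.out.printf("sgd=%.6f momentum=%.6f adaGrad=%.6f rmsProp=%.6f adam=%.6f%n",
                sgdDelta, momentumDelta, adaGradDelta, rmsPropDelta, adamDelta);
    }
}
```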
Modifier and Type | Method and Description |
---|---|
static Optimizer | Optimizer.create(OptimizerType type, AbstractLayer layer): Factory method to create different types of optimizers. |
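A minimal sketch of calling the factory method listed above. The enum constant OptimizerType.ADAM and the import paths are assumptions based on the packages listed on this page; check the OptimizerType enum in your Deep Netts version for the exact constant names.

```java
import deepnetts.net.layers.AbstractLayer;
import deepnetts.net.train.opt.Optimizer;
import deepnetts.net.train.opt.OptimizerType;

// Minimal sketch: create an optimizer instance for a given layer via the
// factory method listed above. OptimizerType.ADAM is an assumed constant name.
public final class OptimizerFactoryExample {

    public static Optimizer adamFor(AbstractLayer layer) {
        return Optimizer.create(OptimizerType.ADAM, layer);
    }
}
```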