Uses of Interface
deepnetts.net.train.opt.Optimizer
Packages that use Optimizer
Package                   Description
deepnetts.net.layers      Neural network layers, which are the main building blocks of a neural network.
deepnetts.net.train.opt   Optimization methods used by the training algorithm.
Uses of Optimizer in deepnetts.net.layers
Methods in deepnetts.net.layers that return Optimizer
Uses of Optimizer in deepnetts.net.train.opt
Classes in deepnetts.net.train.opt that implement Optimizer

Modifier and Type   Class      Description
final class         AdaDelta   Implementation of ADADELTA Optimizer, which is a modification of AdaGrad that uses only a limited window of previous gradients.
final class         AdaGrad    Implementation of ADAGRAD Optimizer, which uses the sum of squared previous gradients to adjust a global learning rate for each weight.
final class         -          Implementation of the Adam optimizer, which is a variation of RmsProp that includes a momentum-like factor.
final class         -          Momentum optimization adds a momentum parameter to basic Stochastic Gradient Descent, which can accelerate the process.
final class         RmsProp    A variation of the AdaDelta optimizer.
final class         -          Basic Stochastic Gradient Descent optimization algorithm, which iteratively changes weights towards the values that give minimum error.

Methods in deepnetts.net.train.opt that return Optimizer

Modifier and Type   Method                                                       Description
static Optimizer    Optimizer.create(OptimizerType type, AbstractLayer layer)   Factory method to create different types of optimizers; see the usage sketch below.
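A minimal usage sketch for the Optimizer.create(OptimizerType, AbstractLayer) factory method listed above. It assumes that OptimizerType is an enum in deepnetts.net.train.opt with an SGD constant and that AbstractLayer lives in deepnetts.net.layers; neither assumption is confirmed by this page.

    // Sketch only: obtain an optimizer for a layer via the factory method above.
    import deepnetts.net.layers.AbstractLayer;      // assumed package of AbstractLayer
    import deepnetts.net.train.opt.Optimizer;
    import deepnetts.net.train.opt.OptimizerType;   // assumed package of OptimizerType

    public class OptimizerFactoryExample {

        // Creates a Stochastic Gradient Descent optimizer bound to the given layer.
        // OptimizerType.SGD is an assumed enum constant, not confirmed by this page.
        static Optimizer sgdFor(AbstractLayer layer) {
            return Optimizer.create(OptimizerType.SGD, layer);
        }
    }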
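For orientation, the Stochastic Gradient Descent and Momentum entries above boil down to simple weight-update rules. The following self-contained sketch illustrates those generic rules; it is not Deep Netts source code, and every name in it is illustrative.

    // Generic illustration of the plain SGD and momentum update rules
    // summarized in the class descriptions above; not Deep Netts internals.
    public final class UpdateRuleSketch {

        // Plain SGD: step the weight against the error gradient.
        static float sgdStep(float weight, float gradient, float learningRate) {
            return weight - learningRate * gradient;
        }

        // Momentum: reuse a fraction of the previous update (delta), which can
        // accelerate progress when successive gradients point the same way.
        static float momentumStep(float weight, float gradient, float learningRate,
                                  float momentum, float previousDelta) {
            float delta = momentum * previousDelta - learningRate * gradient;
            return weight + delta;
        }

        public static void main(String[] args) {
            System.out.println(sgdStep(0.5f, 0.2f, 0.01f));                     // ~0.498
            System.out.println(momentumStep(0.5f, 0.2f, 0.01f, 0.9f, -0.002f)); // ~0.4962
        }
    }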