Package deepnetts.net.train.opt
Interface Optimizer
- All Known Implementing Classes:
AdaDeltaOptimizer, AdaGradOptimizer, AdamOptimizer, MomentumOptimizer, RmsPropOptimizer, SgdOptimizer
public interface Optimizer
Optimization technique used by the training algorithm to tune the network's weight parameters.
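The contract can be illustrated with a minimal, self-contained sketch. The interface below mirrors the signatures documented on this page; the plain SGD update rule (delta = -learningRate * gradient) is an assumption for illustration, not Deep Netts source:

```java
// Minimal re-creation of the Optimizer contract, for illustration only.
interface Optimizer {
    float calculateDeltaWeight(float gradient, int... index);
    float calculateDeltaBias(float gradient, int idx);
    void setLearningRate(float learningRate);
}

// A plain SGD implementation: delta = -learningRate * gradient.
class SgdOptimizer implements Optimizer {
    private float learningRate = 0.01f;

    @Override
    public float calculateDeltaWeight(float gradient, int... index) {
        // Index is unused for plain SGD; stateful optimizers (Adam, AdaGrad)
        // would use it to look up per-weight accumulators.
        return -learningRate * gradient;
    }

    @Override
    public float calculateDeltaBias(float gradient, int idx) {
        return -learningRate * gradient;
    }

    @Override
    public void setLearningRate(float learningRate) {
        this.learningRate = learningRate;
    }
}
```

The training algorithm would call `calculateDeltaWeight` once per weight during the backward pass and add the returned delta to the weight.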
-
Field Summary
Fields
- static final int COL_IDX
- static final int DEPTH_IDX
- static final float EPS — Smoothing term to prevent division by zero if the squared gradient sum becomes zero.
- static final int FOURTH_DIM_IDX
- static final int ROW_IDX
Method Summary
- float calculateDeltaBias(float gradient, int idx)
- float calculateDeltaWeight(float gradient, int... index)
- static Optimizer create(OptimizerType type, AbstractLayer layer) — Factory method to create different types of optimizers.
- void setLearningRate(float learningRate)
-
Field Details
-
EPS
static final float EPS
Smoothing term to prevent division by zero if the squared gradient sum becomes zero. The value used is 1e-6; 1e-8 should also be tried (see https://d2l.ai/chapter_optimization/adagrad.html). Typical values are 1e-6 or 1e-8; Keras uses 1e-7 for Adam.
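The role of the smoothing term is easiest to see in an AdaGrad-style update, where the learning rate is divided by the square root of the accumulated squared gradients. The sketch below follows the standard AdaGrad formula, not Deep Netts internals:

```java
// AdaGrad-style update for a single weight, showing why EPS is needed.
public class AdaGradSketch {
    static final float EPS = 1e-6f;          // smoothing term, as documented
    static final float LEARNING_RATE = 0.01f;

    // Accumulated sum of squared gradients for this weight.
    static float sqGradSum = 0f;

    // delta = -lr * g / sqrt(sum(g^2) + EPS); without EPS, a zero
    // gradient history would cause division by zero (NaN).
    static float deltaWeight(float gradient) {
        sqGradSum += gradient * gradient;
        return (float) (-LEARNING_RATE * gradient / Math.sqrt(sqGradSum + EPS));
    }
}
```

Even when `sqGradSum` is still zero, the denominator is `sqrt(EPS)`, so the returned delta stays finite rather than becoming NaN.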
-
ROW_IDX
static final int ROW_IDX
-
COL_IDX
static final int COL_IDX
-
DEPTH_IDX
static final int DEPTH_IDX
-
FOURTH_DIM_IDX
static final int FOURTH_DIM_IDX
-
-
Method Details
-
calculateDeltaWeight
float calculateDeltaWeight(float gradient, int... index)
calculateDeltaBias
float calculateDeltaBias(float gradient, int idx)
setLearningRate
void setLearningRate(float learningRate)
create
static Optimizer create(OptimizerType type, AbstractLayer layer)
Factory method to create different types of optimizers.
- Parameters:
type - the type of optimizer to create
layer - the layer for which the optimizer is created
- Returns:
the created Optimizer
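The factory can be sketched in a self-contained way. The enum values below follow the implementing classes listed at the top of this page; the switch body, the lambda-based SGD step, and the omission of the library-internal `AbstractLayer` parameter are all assumptions for illustration, not Deep Netts source:

```java
// Hypothetical mirror of the factory: dispatch on an optimizer type enum.
enum OptimizerType { SGD, MOMENTUM, ADAGRAD, ADADELTA, RMSPROP, ADAM }

interface Optimizer {
    float calculateDeltaWeight(float gradient, int... index);

    // Factory method: create an optimizer of the given type.
    // (The real API also takes the AbstractLayer being optimized.)
    static Optimizer create(OptimizerType type) {
        switch (type) {
            case SGD:
                return (g, idx) -> -0.01f * g;   // plain SGD step, lr = 0.01
            default:
                throw new UnsupportedOperationException("Not sketched: " + type);
        }
    }
}
```

Callers then obtain an optimizer without depending on a concrete class, e.g. `Optimizer opt = Optimizer.create(OptimizerType.SGD);` — which is the usual point of such a factory: the training algorithm can switch optimizers via configuration alone.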