Package deepnetts.net.train.opt
Class AdamOptimizer
java.lang.Object
deepnetts.net.train.opt.AdamOptimizer
- All Implemented Interfaces:
 Optimizer, TrainingListener, Serializable, EventListener
public final class AdamOptimizer
extends Object
implements Serializable, Optimizer, TrainingListener
Implementation of the Adam optimizer, a variant of RMSProp that adds a momentum-like factor.
- Author:
 - Zoran Sevarac
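The update rule behind this class can be sketched as follows. This is a minimal standalone illustration of the Adam algorithm for a single scalar parameter, assuming the standard hyperparameter defaults from the original Adam formulation (beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8); it is not the Deep Netts source, and the class and method names here are hypothetical.

```java
// Standalone sketch of the Adam update for one scalar parameter.
// Hyperparameter names and defaults follow the standard Adam formulation;
// they are assumptions for illustration, not Deep Netts internals.
public class AdamSketch {
    private static final float BETA1 = 0.9f;    // decay rate for first moment (momentum-like term)
    private static final float BETA2 = 0.999f;  // decay rate for second moment (RMSProp-like term)
    private static final float EPS = 1e-8f;     // small constant to avoid division by zero

    private final float learningRate;
    private float m = 0f; // first moment estimate (running mean of gradients)
    private float v = 0f; // second moment estimate (running mean of squared gradients)
    private int t = 0;    // time step, used for bias correction

    public AdamSketch(float learningRate) {
        this.learningRate = learningRate;
    }

    /** Returns the delta to add to a parameter, given its current gradient. */
    public float calculateDelta(float grad) {
        t++;
        m = BETA1 * m + (1 - BETA1) * grad;
        v = BETA2 * v + (1 - BETA2) * grad * grad;
        float mHat = m / (1 - (float) Math.pow(BETA1, t)); // bias-corrected first moment
        float vHat = v / (1 - (float) Math.pow(BETA2, t)); // bias-corrected second moment
        return -learningRate * mHat / ((float) Math.sqrt(vHat) + EPS);
    }

    public static void main(String[] args) {
        AdamSketch opt = new AdamSketch(0.001f);
        // On the very first step the bias corrections cancel the decay factors,
        // so the delta is approximately -learningRate * sign(grad).
        System.out.println(opt.calculateDelta(2.0f));
    }
}
```

Because Adam normalizes each step by the square root of the second moment, the first update has magnitude close to the learning rate regardless of the gradient's scale, which is one reason it tolerates poorly scaled gradients better than plain SGD.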
Field Summary

Constructor Summary
Method Summary
Modifier and Type   Method                                          Description
float               calculateDeltaBias(float grad, int idx)
float               calculateDeltaWeight(float grad, int... idxs)
void                handleEvent(TrainingEvent event)                Invoked when a training event occurs.
void                setLearningRate(float learningRate)
Constructor Details

AdamOptimizer
Method Details

calculateDeltaWeight
public float calculateDeltaWeight(float grad, int... idxs)
Specified by:
 calculateDeltaWeight in interface Optimizer
calculateDeltaBias
public float calculateDeltaBias(float grad, int idx)
Specified by:
 calculateDeltaBias in interface Optimizer
handleEvent
public void handleEvent(TrainingEvent event)
Description copied from interface: TrainingListener
Invoked when a training event occurs.
Specified by:
 handleEvent in interface TrainingListener
Parameters:
 event - the training event
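The listener mechanism above follows the standard observer pattern: the trainer fires events, and registered listeners such as this optimizer react to them. A minimal sketch of that pattern is below; the event name "EPOCH_FINISHED" and the decay-on-epoch reaction are hypothetical examples for illustration, not what Deep Netts' AdamOptimizer actually does in its handleEvent.

```java
// Hypothetical sketch of the observer pattern behind handleEvent: a listener
// receives event notifications from a trainer and reacts to the ones it
// cares about. Event names and the decay behavior are assumptions.
interface TrainingEventListener {
    void handleEvent(String eventType);
}

class EpochDecayListener implements TrainingEventListener {
    private float learningRate;

    EpochDecayListener(float initialLearningRate) {
        this.learningRate = initialLearningRate;
    }

    @Override
    public void handleEvent(String eventType) {
        // React only to the event type this listener cares about;
        // all other notifications are ignored.
        if ("EPOCH_FINISHED".equals(eventType)) {
            learningRate *= 0.95f; // e.g. decay the step size once per epoch
        }
    }

    float getLearningRate() {
        return learningRate;
    }
}

public class ListenerDemo {
    public static void main(String[] args) {
        EpochDecayListener listener = new EpochDecayListener(0.1f);
        listener.handleEvent("EPOCH_FINISHED");
        listener.handleEvent("EPOCH_FINISHED");
        System.out.println(listener.getLearningRate()); // 0.1 * 0.95^2
    }
}
```

This decoupling is why the optimizer can implement TrainingListener without the trainer knowing anything about Adam: the trainer only sees the listener interface.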
setLearningRate
public void setLearningRate(float learningRate)
Specified by:
 setLearningRate in interface Optimizer