Package | Description
---|---
deepnetts.automl | Support for automatically building deep learning models using hyper-parameter search.
deepnetts.net.train | Training algorithms and related utilities.
deepnetts.net.train.opt | Optimization methods used by the training algorithm.
Modifier and Type | Method and Description
---|---
HyperParameterSearch | HyperParameterSearch.trainingListener(TrainingListener trainingListener)
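Since trainingListener(TrainingListener) returns the HyperParameterSearch itself, it can sit in a fluent configuration chain. Below is a minimal sketch of that single call; the import paths are assumptions based on the package table above (TrainingListener's own package is not listed on this page), and construction of the search and listener objects is left to the caller.

```java
import deepnetts.automl.HyperParameterSearch;
import deepnetts.net.train.TrainingListener;   // assumed package for TrainingListener

public class SearchListenerSketch {

    // Attaches a listener to an existing hyper-parameter search; the fluent
    // return value lets this call be chained with other configuration methods.
    static HyperParameterSearch withListener(HyperParameterSearch search,
                                             TrainingListener listener) {
        return search.trainingListener(listener);
    }
}
```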
Modifier and Type | Method and Description
---|---
TrainingListener | KFoldCrossValidation.getTrainingListener()
Modifier and Type | Method and Description
---|---
void | BackpropagationTrainer.addListener(TrainingListener listener): Adds a training listener to this trainer.
void | BackpropagationTrainer.removeListener(TrainingListener listener): Removes a training listener from this trainer.
KFoldCrossValidation.Builder | KFoldCrossValidation.Builder.trainingListener(TrainingListener trainingListener)
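The registration pattern these methods imply is sketched below. It assumes the import paths shown (the exact packages of TrainingListener and KFoldCrossValidation are not stated on this page) and exercises only the calls listed in the table above; constructing the trainer, builder, and listener is left to the caller.

```java
import deepnetts.net.train.BackpropagationTrainer;
import deepnetts.net.train.KFoldCrossValidation;   // assumed package
import deepnetts.net.train.TrainingListener;       // assumed package

public class ListenerRegistrationSketch {

    // Registers a listener on an existing trainer and detaches it afterwards,
    // using only addListener/removeListener as documented above.
    static void observe(BackpropagationTrainer trainer, TrainingListener listener) {
        trainer.addListener(listener);
        // ... training would run here; the training API itself is not part of this page ...
        trainer.removeListener(listener);
    }

    // For k-fold cross-validation the listener is supplied through the builder;
    // trainingListener(...) returns the builder, so it chains with other setters.
    static KFoldCrossValidation.Builder withListener(KFoldCrossValidation.Builder builder,
                                                     TrainingListener listener) {
        return builder.trainingListener(listener);
    }
}
```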
Modifier and Type | Class and Description
---|---
class | AdaDeltaOptimizer: Implementation of the ADADELTA optimizer, a modification of AdaGrad that uses only a limited window of previous gradients.
class | AdamOptimizer: Implementation of the Adam optimizer, a variation of RMSProp that includes a momentum-like factor.
class | LearningRateDecay: Learning rate decay schedule (see https://www.coursera.org/learn/deep-neural-network/lecture/hjgIA/learning-rate-decay).
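The LearningRateDecay entry links to a lecture on epoch-based learning rate decay. Whether this class implements exactly that schedule is not stated here, but the schedule presented in the linked lecture lowers the learning rate as training progresses:

```latex
\alpha_{\text{epoch}} = \frac{1}{1 + \text{decayRate} \cdot \text{epochNum}} \, \alpha_0
```

For example, with an initial learning rate of 0.2 and a decay rate of 1.0, the learning rate becomes 0.1 after the first epoch, about 0.067 after the second, and so on.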