Package | Description |
---|---|
deepnetts.net.train | Training algorithms and related utilities.
Modifier and Type | Method and Description |
---|---|
BackpropagationTrainer | TrainingEvent.getSource() - Gets the source of the event.
BackpropagationTrainer | BackpropagationTrainer.setBatchMode(boolean batchMode) - Sets the flag that determines whether to use batch mode during training.
BackpropagationTrainer | BackpropagationTrainer.setBatchSize(int batchSize) - Batch size is the number of training examples after which the network's weights are adjusted.
BackpropagationTrainer | BackpropagationTrainer.setCheckpointEpochs(int checkpointEpochs) - Sets how many epochs should pass between snapshots of the trained network.
BackpropagationTrainer | BackpropagationTrainer.setDropout(float dropout) - Dropout is a technique to prevent overfitting; it skips adjusting the weights of some neurons with the given probability.
BackpropagationTrainer | BackpropagationTrainer.setEarlyStopping(boolean earlyStopping) - Early stopping stops the training if it starts converging slowly, which helps prevent overfitting.
BackpropagationTrainer | BackpropagationTrainer.setEarlyStoppingMinLossChange(float earlyStoppingMinLossChange) - Early stopping stops the training if the error/loss starts converging too slowly (changing by less than this minimum amount).
BackpropagationTrainer | BackpropagationTrainer.setEarlyStoppingPatience(int earlyStoppingPatience) - How many epochs to wait before deciding that the loss is decreasing too slowly.
BackpropagationTrainer | BackpropagationTrainer.setL1Regularization(float regL1) - L1 regularization (sum of absolute weight values) is used to prevent overfitting and overly large weights.
BackpropagationTrainer | BackpropagationTrainer.setL2Regularization(float regL2) - L2 regularization (sum of squared weights) is used to prevent overfitting and overly large weights.
BackpropagationTrainer | BackpropagationTrainer.setLearningRate(float learningRate) - Learning rate controls the step size, as a percentage of the error, used for adjusting the internal parameters (weights) of the neural network.
BackpropagationTrainer | BackpropagationTrainer.setLearningRateDecay(float decayRate) - Learning rate decay lowers the learning rate with each epoch by the decayRate factor, which may help lower the error further.
BackpropagationTrainer | BackpropagationTrainer.setMaxEpochs(long maxEpochs) - Deprecated. Use setStopEpochs instead.
BackpropagationTrainer | BackpropagationTrainer.setMaxError(float maxError) - Sets the stopping error threshold for this training.
BackpropagationTrainer | BackpropagationTrainer.setMomentum(float momentum) - The momentum setting helps avoid oscillations in weight changes and gives more stable and faster training.
BackpropagationTrainer | BackpropagationTrainer.setOptimizer(OptimizerType optimizer)
BackpropagationTrainer | BackpropagationTrainer.setShuffle(boolean shuffle) - Sets the shuffle flag which determines whether the training set should be shuffled before each epoch.
BackpropagationTrainer | BackpropagationTrainer.setSnapshotPath(String snapshotPath) - Path to use for snapshots: saving the current state of the network during training so that it can be restored from that point.
BackpropagationTrainer | BackpropagationTrainer.setStopEpochs(long stopEpochs) - Sets the number of epochs/iterations to run the training.
BackpropagationTrainer | BackpropagationTrainer.setStopError(float stopError) - The training stops when/if the training error reaches this value.
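Taken together, these setters configure a single training run. The sketch below chains a subset of them, relying on the table above which shows that each setter returns the trainer itself. The numeric values, the snapshot path, and the OptimizerType import path and enum constant are illustrative assumptions, not values taken from this documentation.

```java
import deepnetts.net.train.BackpropagationTrainer;
import deepnetts.net.train.opt.OptimizerType; // import path assumed; adjust to your Deep Netts version

public class TrainerConfigSketch {

    /** Configures a trainer using only setters listed above; values are illustrative, not recommendations. */
    static void configure(BackpropagationTrainer trainer) {
        trainer.setLearningRate(0.01f)           // step size as a percentage of the error
               .setOptimizer(OptimizerType.SGD)  // enum constant assumed
               .setMomentum(0.9f)                // damp oscillations in weight changes
               .setBatchMode(true)
               .setBatchSize(32)                 // adjust weights after every 32 examples
               .setShuffle(true)                 // reshuffle the training set before each epoch
               .setStopEpochs(1000)              // hard limit on training epochs
               .setStopError(0.03f)              // stop earlier if training error reaches this value
               .setEarlyStopping(true)
               .setEarlyStoppingMinLossChange(1e-4f)
               .setEarlyStoppingPatience(5)      // epochs to wait before stopping on slow convergence
               .setL2Regularization(1e-4f)       // penalize large weights
               .setCheckpointEpochs(100)         // snapshot the network every 100 epochs
               .setSnapshotPath("snapshots/");   // hypothetical path
    }
}
```

Because every setter returns the trainer, the same configuration can also be written as separate statements if chaining is not wanted.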
Constructor and Description |
---|
TrainingEvent(BackpropagationTrainer source, TrainingEvent.Type type) - Constructs a new TrainingEvent with the specified source and type.
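The constructor and the getSource() method above are two sides of the same event flow: the trainer creates the event with itself as the source, and a consumer reads the trainer back out of it. A minimal sketch, assuming TrainingEvent lives in the deepnetts.net.train package listed above; the listener interface that would normally receive such events is not part of this page and is omitted.

```java
import deepnetts.net.train.BackpropagationTrainer;
import deepnetts.net.train.TrainingEvent;

public class TrainingEventSketch {

    /** Creates an event for the given trainer and type, as in the constructor listed above. */
    static TrainingEvent createEvent(BackpropagationTrainer trainer, TrainingEvent.Type type) {
        return new TrainingEvent(trainer, type);
    }

    /** Reads the event source, which per the table above is the trainer that fired the event. */
    static void handle(TrainingEvent event) {
        BackpropagationTrainer trainer = event.getSource();
        System.out.println("Received training event from: " + trainer);
    }
}
```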