public class BacktrackingAdaGradOptimizer extends AbstractBatchOptimizer
Handles optimizing an AbstractDifferentiableFunction through AdaGrad guarded by backtracking.
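For orientation, the sketch below illustrates the technique this class implements: per-coordinate AdaGrad step sizes with a backtracking guard that rejects any step that lowers the objective. It is a self-contained toy on plain `double[]` arrays, not the library's ConcatVector-based implementation, and every name in it is hypothetical.

```java
import java.util.Arrays;
import java.util.function.Function;

public class AdaGradBacktrackingSketch {
    public static void main(String[] args) {
        // Toy concave "log likelihood": maximized at w = (3, -1).
        Function<double[], Double> logLikelihood =
                w -> -(w[0] - 3) * (w[0] - 3) - (w[1] + 1) * (w[1] + 1);
        Function<double[], double[]> gradient =
                w -> new double[]{ -2 * (w[0] - 3), -2 * (w[1] + 1) };

        double[] weights = { 0.0, 0.0 };
        double[] sumSquaredGradients = new double[weights.length]; // AdaGrad accumulator
        double learningRate = 1.0;
        double lastLogLikelihood = logLikelihood.apply(weights);

        for (int iter = 0; iter < 500; iter++) {
            double[] grad = gradient.apply(weights);
            double[] proposal = new double[weights.length];
            for (int i = 0; i < weights.length; i++) {
                sumSquaredGradients[i] += grad[i] * grad[i];
                // Per-coordinate AdaGrad step: larger moves along rarely-updated
                // coordinates, smaller moves along frequently-updated ones.
                proposal[i] = weights[i]
                        + learningRate * grad[i] / (Math.sqrt(sumSquaredGradients[i]) + 1e-8);
            }
            double proposalLogLikelihood = logLikelihood.apply(proposal);
            if (proposalLogLikelihood >= lastLogLikelihood) {
                weights = proposal;              // guarded step: accept
                lastLogLikelihood = proposalLogLikelihood;
            } else {
                learningRate /= 2;               // backtrack: reject and shrink
            }
        }
        System.out.println("weights = " + Arrays.toString(weights)
                + ", logLikelihood = " + lastLogLikelihood);
    }
}
```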
Nested Class Summary

Modifier and Type | Class and Description
---|---
protected class | BacktrackingAdaGradOptimizer.AdaGradOptimizationState

Nested classes inherited from class AbstractBatchOptimizer: AbstractBatchOptimizer.OptimizationState
Constructor Summary

Constructor and Description
---
BacktrackingAdaGradOptimizer()
Method Summary

Modifier and Type | Method and Description
---|---
protected AbstractBatchOptimizer.OptimizationState | getFreshOptimizationState(ConcatVector initialWeights) This is called at the beginning of each batch optimization.
boolean | updateWeights(ConcatVector weights, ConcatVector gradient, double logLikelihood, AbstractBatchOptimizer.OptimizationState optimizationState, boolean quiet) This is the hook for subclassing batch optimizers to override in order to have their optimizer work.
Methods inherited from class AbstractBatchOptimizer: addDenseConstraint, addSparseConstraint, optimize, optimize
Method Detail

public boolean updateWeights(ConcatVector weights, ConcatVector gradient, double logLikelihood, AbstractBatchOptimizer.OptimizationState optimizationState, boolean quiet)

Description copied from class: AbstractBatchOptimizer
This is the hook for subclassing batch optimizers to override in order to have their optimizer work.

Specified by: updateWeights in class AbstractBatchOptimizer

Parameters:
weights - the current weights (update these in place)
gradient - the gradient at these weights
logLikelihood - the log likelihood at these weights
optimizationState - any saved state the optimizer wants to keep and pass around during each optimization run
quiet - whether or not to dump output about progress to the console
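To make this contract concrete, here is a hedged sketch of an updateWeights-style hook written against plain arrays rather than ConcatVector. It mirrors the documented parameters (weights updated in place, the gradient and log likelihood at those weights, mutable per-run state, a quiet flag); the boolean return is assumed to signal convergence, and the state fields are assumptions, since neither is spelled out in this excerpt.

```java
import java.util.Arrays;

public class UpdateWeightsSketch {

    /** Hypothetical per-run state, in the spirit of AdaGradOptimizationState. */
    static final class SketchState {
        final double[] sumSquaredGradients;   // AdaGrad accumulator
        double[] lastWeights;                 // pre-step point, kept for rollback
        double lastLogLikelihood = Double.NEGATIVE_INFINITY;
        double learningRate = 1.0;
        SketchState(int size) { sumSquaredGradients = new double[size]; }
    }

    /**
     * Mirrors the documented parameters: weights are updated in place; the
     * boolean return is assumed to mean "converged" (not documented above).
     */
    static boolean updateWeights(double[] weights, double[] gradient,
                                 double logLikelihood, SketchState state,
                                 boolean quiet) {
        if (state.lastWeights != null && logLikelihood < state.lastLogLikelihood) {
            // Backtracking guard: the last step lowered the log likelihood.
            // Roll the weights back and halve the learning rate; the driver
            // will call again with the gradient at the restored point.
            System.arraycopy(state.lastWeights, 0, weights, 0, weights.length);
            state.learningRate /= 2;
            if (!quiet) System.out.println("backtracked; learning rate = " + state.learningRate);
            return false;
        }
        state.lastLogLikelihood = logLikelihood;
        state.lastWeights = Arrays.copyOf(weights, weights.length); // save pre-step point
        double gradientNormSquared = 0;
        for (int i = 0; i < weights.length; i++) {
            state.sumSquaredGradients[i] += gradient[i] * gradient[i];
            // Per-coordinate AdaGrad ascent step on the log likelihood.
            weights[i] += state.learningRate * gradient[i]
                    / (Math.sqrt(state.sumSquaredGradients[i]) + 1e-8);
            gradientNormSquared += gradient[i] * gradient[i];
        }
        // Assumed convergence test: a near-zero gradient.
        return Math.sqrt(gradientNormSquared) < 1e-6;
    }
}
```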
protected AbstractBatchOptimizer.OptimizationState getFreshOptimizationState(ConcatVector initialWeights)

Description copied from class: AbstractBatchOptimizer
This is called at the beginning of each batch optimization.

Specified by: getFreshOptimizationState in class AbstractBatchOptimizer

Parameters:
initialWeights - the initial weights for the optimizer to use
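For completeness, a matching fresh-state factory in the spirit of this method might look like the sketch below, reusing the hypothetical SketchState from the previous example. The real AdaGradOptimizationState's fields are not documented in this excerpt, so sizing an accumulator to the initial weights is an assumption.

```java
// Hedged fresh-state factory, reusing the hypothetical SketchState above.
// Returning a brand-new state per run means nothing (step sizes,
// accumulators, rollback points) leaks between optimization runs.
static UpdateWeightsSketch.SketchState getFreshOptimizationState(double[] initialWeights) {
    return new UpdateWeightsSketch.SketchState(initialWeights.length);
}
```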