|
Interface Summary | |
---|---|
DiffFloatFunction | An interface for once-differentiable float-valued functions over float arrays.
DiffFunction | An interface for once-differentiable double-valued functions over double arrays. |
Evaluator | |
FloatFunction | An interface for float-valued functions over float arrays.
Function | An interface for double-valued functions over double arrays. |
HasEvaluators | Indicates that a minimizer supports periodic evaluation.
HasFloatInitial | Indicates that a function has a method for supplying an initial value.
HasInitial | Indicates that a function has a method for supplying an initial value.
LineSearcher | The interface for one-variable function minimizers.
Minimizer<T extends Function> | The interface for unconstrained function minimizers. |
StochasticMinimizer.PropertySetter<T1> | |
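To illustrate the function contracts listed above, here is a minimal, self-contained sketch. The interfaces below are local stand-ins declared here for the example (the real package's `Function` and `DiffFunction` are assumed to expose methods along the lines of `valueAt`, `derivativeAt`, and `domainDimension`, but this sketch does not depend on the actual library):

```java
// Local stand-ins for the Function / DiffFunction contracts:
// a double-valued function over double arrays, plus its gradient.
interface SimpleFunction {
    double valueAt(double[] x);
    int domainDimension();
}

interface SimpleDiffFunction extends SimpleFunction {
    double[] derivativeAt(double[] x);
}

// Example implementation: f(x) = sum_i (x_i - 1)^2, minimized at (1, ..., 1).
public class QuadraticExample implements SimpleDiffFunction {
    private final int dim;

    public QuadraticExample(int dim) { this.dim = dim; }

    @Override public int domainDimension() { return dim; }

    @Override public double valueAt(double[] x) {
        double v = 0.0;
        for (double xi : x) v += (xi - 1.0) * (xi - 1.0);
        return v;
    }

    @Override public double[] derivativeAt(double[] x) {
        double[] g = new double[dim];
        for (int i = 0; i < dim; i++) g[i] = 2.0 * (x[i] - 1.0);
        return g;
    }

    public static void main(String[] args) {
        SimpleDiffFunction f = new QuadraticExample(3);
        double[] x = {0.0, 2.0, 1.0};
        System.out.println(f.valueAt(x));          // prints 2.0
        System.out.println(f.derivativeAt(x)[0]);  // prints -2.0
    }
}
```

A once-differentiable function in this style is all a gradient-based minimizer needs: it queries `valueAt` and `derivativeAt` at candidate points and never inspects the function's internals.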
Class Summary | |
---|---|
AbstractCachingDiffFloatFunction | |
AbstractCachingDiffFunction | |
AbstractStochasticCachingDiffFunction | |
AbstractStochasticCachingDiffUpdateFunction | Function for stochastic calculations that does the update in place (instead of maintaining and returning the derivative). Weights are represented by an array of doubles and a scalar that indicates how much to scale all weights by; this allows all weights to be scaled by just modifying the scalar.
CGMinimizer | Conjugate-gradient implementation based on the code in Numerical Recipes in C. |
CmdEvaluator | Runs a command line to evaluate a dataset (assumes the command takes input from stdin).
GoldenSectionLineSearch | A class to do golden section line search. |
HybridMinimizer | Hybrid Minimizer is set up as a combination of two minimizers. |
MemoryEvaluator | Evaluates current memory usage.
QNMinimizer | An implementation of L-BFGS for quasi-Newton unconstrained minimization.
QNMinimizer.SurpriseConvergence | |
ResultStoringFloatMonitor | |
ResultStoringMonitor | |
ScaledSGDMinimizer | Stochastic Gradient Descent to Quasi-Newton Minimizer: an experimental minimizer which takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and executes SGD for the first few passes; during the final iterations a series of approximate Hessian-vector products is built up...
SGDMinimizer<T extends Function> | Stochastic Gradient Descent Minimizer. The basic way to use the minimizer is with a null constructor, then the simple minimize method.
SGDToQNMinimizer | Stochastic Gradient Descent to Quasi-Newton Minimizer: an experimental minimizer which takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and executes SGD for the first few passes; during the final iterations a series of approximate Hessian-vector products is built up...
SMDMinimizer<T extends Function> | Stochastic Meta Descent Minimizer based on |
SQNMinimizer<T extends Function> | Online Limited-Memory Quasi-Newton BFGS implementation based on the algorithms in |
StochasticDiffFunctionTester | |
StochasticInPlaceMinimizer<T extends Function> | In-place Stochastic Gradient Descent Minimizer. Follows the weight decay and learning-rate tuning of Leon Bottou's crfsgd (http://leon.bottou.org/projects/sgd). Supports only L2 regularization (QUADRATIC). Requires the objective function to be an AbstractStochasticCachingDiffUpdateFunction. NOTE: unlike other minimizers, regularization is done in the minimizer, not in the objective function.
StochasticInPlaceMinimizer.InvalidElementException | |
StochasticMinimizer<T extends Function> | Stochastic Gradient Descent Minimizer. The basic way to use the minimizer is with a null constructor, then the simple minimize method.
StochasticMinimizer.InvalidElementException | |
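GoldenSectionLineSearch above does golden section line search, the classic way to minimize a one-variable function over an interval by shrinking a bracket around the minimum. A standalone sketch of the technique (this is an illustrative implementation, not the class's actual code):

```java
import java.util.function.DoubleUnaryOperator;

public class GoldenSectionDemo {
    // Golden-section search: shrink [lo, hi] around the minimum of a
    // unimodal function, reusing one interior evaluation per iteration.
    static double minimize(DoubleUnaryOperator f, double lo, double hi, double tol) {
        final double phi = (Math.sqrt(5.0) - 1.0) / 2.0;  // ~0.618
        double a = hi - phi * (hi - lo);   // left interior point
        double b = lo + phi * (hi - lo);   // right interior point
        double fa = f.applyAsDouble(a), fb = f.applyAsDouble(b);
        while (hi - lo > tol) {
            if (fa < fb) {                 // minimum lies in [lo, b]
                hi = b; b = a; fb = fa;
                a = hi - phi * (hi - lo);
                fa = f.applyAsDouble(a);
            } else {                       // minimum lies in [a, hi]
                lo = a; a = b; fa = fb;
                b = lo + phi * (hi - lo);
                fb = f.applyAsDouble(b);
            }
        }
        return (lo + hi) / 2.0;
    }

    public static void main(String[] args) {
        // f(t) = (t - 2)^2 has its minimum at t = 2.
        double xmin = minimize(t -> (t - 2.0) * (t - 2.0), 0.0, 5.0, 1e-8);
        System.out.println(Math.abs(xmin - 2.0) < 1e-6);  // prints true
    }
}
```

The golden ratio makes the two interior points of one iteration coincide with an interior point of the next, so each shrink of the bracket costs only a single new function evaluation. Multivariate minimizers such as CGMinimizer use a line search like this as an inner step along each search direction.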
Enum Summary | |
---|---|
AbstractStochasticCachingDiffFunction.SamplingMethod | |
QNMinimizer.eLineSearch | |
QNMinimizer.eScaling | |
QNMinimizer.eState | |
StochasticCalculateMethods | This enumeration was created to organize the selection of different methods for stochastic calculations.
|