Package edu.stanford.nlp.optimization

Interface Summary
DiffFloatFunction An interface for once-differentiable float-valued functions over float arrays.
DiffFunction An interface for once-differentiable double-valued functions over double arrays.
Evaluator  
FloatFunction An interface for float-valued functions over float arrays.
Function An interface for double-valued functions over double arrays.
HasEvaluators Indicates that a minimizer supports periodic evaluation.
HasFloatInitial Indicates that a function has a method for supplying an initial value.
HasInitial Indicates that a function has a method for supplying an initial value.
HasL1ParamRange Indicates that a minimizer supports periodic evaluation.
LineSearcher The interface for one-variable function minimizers.
Minimizer<T extends Function> The interface for unconstrained function minimizers.
StochasticMinimizer.PropertySetter<T1>  
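Taken together, Function, DiffFunction, and Minimizer define the core contract of the package: a function reports its domain dimension, value, and (if differentiable) gradient, and a minimizer searches for a minimizing argument. The self-contained sketch below mirrors that contract with a toy quadratic and a deliberately naive fixed-step gradient-descent minimizer. The interface shapes follow the summaries above, but the exact method signatures and all implementations here are illustrative assumptions, not the library's code.

```java
// Minimal sketch of the Function / DiffFunction / Minimizer contract.
// Method names mirror the package's interfaces; signatures are assumed.
interface Function {
    int domainDimension();
    double valueAt(double[] x);
}

interface DiffFunction extends Function {
    double[] derivativeAt(double[] x);
}

interface Minimizer<T extends Function> {
    double[] minimize(T function, double functionTolerance, double[] initial);
}

public class MinimizerSketch {
    // Toy once-differentiable function f(x) = sum_i (x_i - 3)^2,
    // minimized at x_i = 3 in every coordinate.
    static DiffFunction quadratic(final int dim) {
        return new DiffFunction() {
            public int domainDimension() { return dim; }
            public double valueAt(double[] x) {
                double v = 0.0;
                for (double xi : x) v += (xi - 3.0) * (xi - 3.0);
                return v;
            }
            public double[] derivativeAt(double[] x) {
                double[] g = new double[x.length];
                for (int i = 0; i < x.length; i++) g[i] = 2.0 * (x[i] - 3.0);
                return g;
            }
        };
    }

    // A deliberately simple fixed-step gradient-descent Minimizer; the
    // package's real minimizers (CGMinimizer, QNMinimizer, ...) are far
    // more sophisticated but satisfy the same interface.
    static Minimizer<DiffFunction> gradientDescent(final double step) {
        return (f, tol, initial) -> {
            double[] x = initial.clone();
            for (int iter = 0; iter < 10000; iter++) {
                double[] g = f.derivativeAt(x);
                double norm = 0.0;
                for (double gi : g) norm += gi * gi;
                if (Math.sqrt(norm) < tol) break;   // converged
                for (int i = 0; i < x.length; i++) x[i] -= step * g[i];
            }
            return x;
        };
    }

    public static void main(String[] args) {
        Minimizer<DiffFunction> min = gradientDescent(0.1);
        double[] argmin = min.minimize(quadratic(2), 1e-8, new double[]{0.0, 0.0});
        System.out.printf("argmin = (%.4f, %.4f)%n", argmin[0], argmin[1]);
    }
}
```

Because Minimizer has a single abstract method, any strategy (conjugate gradient, L-BFGS, SGD) can be swapped in behind the same call site.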
 

Class Summary
AbstractCachingDiffFloatFunction  
AbstractCachingDiffFunction A differentiable function that caches the last evaluation of its value and derivative.
AbstractStochasticCachingDiffFunction  
AbstractStochasticCachingDiffUpdateFunction Function for stochastic calculations that updates in place (instead of maintaining and returning the derivative).
CGMinimizer Conjugate-gradient implementation based on the code in Numerical Recipes in C.
CmdEvaluator Runs a command line to evaluate a dataset (assumes the command takes input from stdin).
GoldenSectionLineSearch A class to do golden section line search.
HybridMinimizer Hybrid Minimizer is set up as a combination of two minimizers.
MemoryEvaluator Evaluates current memory usage.
QNMinimizer An implementation of L-BFGS for quasi-Newton unconstrained minimization.
QNMinimizer.SurpriseConvergence  
ResultStoringFloatMonitor  
ResultStoringMonitor  
ScaledSGDMinimizer<Q extends AbstractStochasticCachingDiffFunction> Scaled Stochastic Gradient Descent Minimizer.
ScaledSGDMinimizer.weight  
SGDMinimizer<T extends Function> Stochastic Gradient Descent Minimizer.
SGDToQNMinimizer Stochastic Gradient Descent to Quasi-Newton Minimizer: an experimental minimizer that takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and runs SGD for the first few passes.
SMDMinimizer<T extends Function> Stochastic Meta-Descent Minimizer.
SQNMinimizer<T extends Function> Online limited-memory quasi-Newton (BFGS) implementation.
StochasticDiffFunctionTester  
StochasticInPlaceMinimizer<T extends Function> In-place Stochastic Gradient Descent Minimizer.
StochasticMinimizer<T extends Function> Stochastic Gradient Descent Minimizer.
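AbstractCachingDiffFunction's summary above says it caches the last evaluation of its value and derivative; this pays off because minimizers and line searches routinely ask for the value and the gradient at the same point back to back. The sketch below illustrates that caching idea in isolation. The field and method names (`calculate`, `value`, `derivative`, `ensure`) are assumptions for the sketch, not the real class's members.

```java
public class CachingDemo {
    // Sketch of the caching idea behind AbstractCachingDiffFunction:
    // recompute only when queried at a new point, so alternating
    // valueAt/derivativeAt calls at the same x cost one evaluation.
    static abstract class CachingDiffFunction {
        private double[] lastX;          // point of the cached evaluation
        protected double value;          // cached f(x)
        protected double[] derivative;   // cached gradient of f at x

        // Subclasses compute f(x) and its gradient in one pass,
        // storing them into `value` and `derivative`.
        protected abstract void calculate(double[] x);

        private void ensure(double[] x) {
            if (lastX == null || !java.util.Arrays.equals(lastX, x)) {
                lastX = x.clone();
                calculate(x);
            }
        }

        double valueAt(double[] x)        { ensure(x); return value; }
        double[] derivativeAt(double[] x) { ensure(x); return derivative; }
    }

    // f(x) = x^2, counting how many real computations happen.
    static class CountingQuadratic extends CachingDiffFunction {
        int evaluations = 0;
        protected void calculate(double[] x) {
            evaluations++;                        // a genuine recomputation
            value = x[0] * x[0];
            derivative = new double[]{2 * x[0]};
        }
    }

    public static void main(String[] args) {
        CountingQuadratic f = new CountingQuadratic();
        f.valueAt(new double[]{4.0});       // computes
        f.derivativeAt(new double[]{4.0});  // cache hit: same point
        f.valueAt(new double[]{5.0});       // new point: recomputes
        System.out.println("evaluations = " + f.evaluations); // prints 2
    }
}
```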
 

Enum Summary
AbstractStochasticCachingDiffFunction.SamplingMethod  
QNMinimizer.eLineSearch  
QNMinimizer.eScaling  
QNMinimizer.eState  
StochasticCalculateMethods This enumeration organizes the selection of different methods for stochastic calculations.
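Several of the stochastic classes above (AbstractStochasticCachingDiffUpdateFunction, StochasticInPlaceMinimizer) share one idea: rather than materializing a full gradient vector for the minimizer to apply, the function applies the scaled gradient of a sampled batch directly to the weight vector. The following self-contained sketch shows that pattern on a small least-squares problem; the method name and signature are illustrative assumptions, not the library's API.

```java
public class InPlaceSgdSketch {
    // Objective: f(w) = 0.5 * sum_i (w . inputs[i] - targets[i])^2.
    // For each sampled example, apply -scale * (per-example gradient)
    // directly to w, so no full derivative vector is ever allocated.
    static void updateInPlace(double[] w, double[][] inputs, double[] targets,
                              int[] batch, double scale) {
        for (int i : batch) {
            double pred = 0.0;
            for (int j = 0; j < w.length; j++) pred += w[j] * inputs[i][j];
            double err = pred - targets[i];
            for (int j = 0; j < w.length; j++) {
                w[j] -= scale * err * inputs[i][j];  // in-place SGD step
            }
        }
    }

    public static void main(String[] args) {
        double[][] X = {{1, 0}, {0, 1}, {1, 1}};
        double[] y = {2, -1, 1};                 // exactly fit by w* = (2, -1)
        double[] w = new double[2];
        for (int epoch = 0; epoch < 2000; epoch++) {
            updateInPlace(w, X, y, new int[]{0, 1, 2}, 0.1);
        }
        System.out.printf("w = (%.3f, %.3f)%n", w[0], w[1]);
    }
}
```

The in-place update matters most when the weight vector is large (e.g. CRF feature weights), since each step touches only the features active in the sampled batch.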
 



Stanford NLP Group