edu.stanford.nlp.optimization
Class SQNMinimizer<T extends Function>

java.lang.Object
  extended by edu.stanford.nlp.optimization.StochasticMinimizer<T>
      extended by edu.stanford.nlp.optimization.SQNMinimizer<T>
All Implemented Interfaces:
HasEvaluators, Minimizer<T>

public class SQNMinimizer<T extends Function>
extends StochasticMinimizer<T>

Online Limited-Memory Quasi-Newton BFGS implementation based on the algorithms in

Nocedal, Jorge, and Stephen J. Wright. 2000. Numerical Optimization. Springer. pp. 224--

and modified to the online version presented in

Schraudolph, Nicol N., Jin Yu, and Simon Günter. 2007. A Stochastic Quasi-Newton Method for Online Convex Optimization.

As of now, it requires a Stochastic differentiable function (AbstractStochasticCachingDiffFunction) as input.
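The limited-memory core shared by the batch and online variants is the standard L-BFGS two-loop recursion, which multiplies a gradient by an implicit inverse-Hessian approximation built from the last M curvature pairs (s, y). The sketch below is self-contained and illustrative only; it does not use the Stanford classes, and all names in it are invented for this example.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class TwoLoopSketch {

    // One curvature pair: s = x_{k+1} - x_k, y = grad_{k+1} - grad_k.
    static final class CurvaturePair {
        final double[] s, y;
        final double rho; // 1 / (y . s)
        CurvaturePair(double[] s, double[] y) {
            this.s = s;
            this.y = y;
            this.rho = 1.0 / dot(y, s);
        }
    }

    static double dot(double[] a, double[] b) {
        double r = 0.0;
        for (int i = 0; i < a.length; i++) r += a[i] * b[i];
        return r;
    }

    // Standard L-BFGS two-loop recursion: returns H_k * grad, where H_k is
    // the inverse-Hessian approximation implied by the stored pairs
    // (oldest first in the deque) and the usual diagonal initial scaling.
    static double[] twoLoop(double[] grad, Deque<CurvaturePair> pairs) {
        double[] q = grad.clone();
        CurvaturePair[] p = pairs.toArray(new CurvaturePair[0]);
        double[] alpha = new double[p.length];

        // First loop: newest pair back to oldest.
        for (int j = p.length - 1; j >= 0; j--) {
            alpha[j] = p[j].rho * dot(p[j].s, q);
            for (int d = 0; d < q.length; d++) q[d] -= alpha[j] * p[j].y[d];
        }

        // Initial Hessian guess H0 = gamma * I, with gamma = (s.y)/(y.y)
        // taken from the newest pair.
        if (p.length > 0) {
            CurvaturePair newest = p[p.length - 1];
            double gamma = dot(newest.s, newest.y) / dot(newest.y, newest.y);
            for (int d = 0; d < q.length; d++) q[d] *= gamma;
        }

        // Second loop: oldest pair forward to newest.
        for (int j = 0; j < p.length; j++) {
            double beta = p[j].rho * dot(p[j].y, q);
            for (int d = 0; d < q.length; d++) q[d] += (alpha[j] - beta) * p[j].s[d];
        }
        return q;
    }
}
```

On a 1-D quadratic f(x) = 2x^2 (gradient 4x, Hessian 4), a single pair s = 1, y = 4 makes the recursion return exactly the Newton direction H*g = g/4. Capping the deque at M pairs, dropping the oldest, is what makes the method "limited-memory".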

The basic way to use the minimizer is with a null constructor, then the simple minimize method. NOTE: the example below has NOT yet been updated for the stochastic version; it shows the original batch QNMinimizer usage.

Minimizer<DiffFunction> qnm = new QNMinimizer();
DiffFunction df = new SomeDiffFunction();
double tol = 1e-4;
double[] initial = getInitialGuess();
double[] minimum = qnm.minimize(df, tol, initial);

If you do not specify a value of M, the minimizer will use the maximum amount of memory available, up to M = 20. This slows things down a bit at first due to forced garbage collection, but is probably faster overall because you are guaranteed the largest possible M. The stochastic version was written by Alex Kleeman, but roughly 95% of the code was taken directly from the earlier QNMinimizer, written mostly by Jenny Finkel.
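In the stochastic setting, each step uses a small-batch gradient and a decaying gain instead of a line search. The toy sketch below illustrates only that pattern; it does not use the Stanford classes, and the 1/(1+k) gain decay is an assumed common schedule, not necessarily what StochasticMinimizer.gainSchedule implements.

```java
import java.util.Random;

public class StochasticStepSketch {

    // Minimize f(x) = E[(x - t)^2] over targets t drawn from the data; the
    // optimum is the mean of the targets. Each iteration samples one target
    // ("batch size 1") and takes a gradient step with a decaying gain.
    static double sgd(double[] targets, int numPasses, double initialGain, long seed) {
        Random gen = new Random(seed);
        double x = 0.0;
        int k = 0; // global step counter
        for (int pass = 0; pass < numPasses; pass++) {
            for (int i = 0; i < targets.length; i++) {
                double t = targets[gen.nextInt(targets.length)];
                double grad = 2.0 * (x - t);            // stochastic gradient estimate
                double gain = initialGain / (1.0 + k);  // assumed decaying gain schedule
                x -= gain * grad;
                k++;
            }
        }
        return x;
    }
}
```

With initialGain = 0.5 this particular schedule reduces to a running mean of the sampled targets, so the iterate settles near the true optimum. An SQN-style method would additionally multiply each stochastic gradient by the two-loop inverse-Hessian approximation before stepping.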

Since:
1.0
Author:
Jenny Finkel, Galen Andrew, Alex Kleeman

Nested Class Summary
 
Nested classes/interfaces inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
StochasticMinimizer.InvalidElementException, StochasticMinimizer.PropertySetter<T1>
 
Field Summary
 
Fields inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
bSize, file, gain, gen, grad, gradList, infoFile, k, maxTime, memory, newGrad, newX, numBatches, numPasses, outputFrequency, outputIterationsToFile, quiet, v, x
 
Constructor Summary
SQNMinimizer()
SQNMinimizer(int m)
SQNMinimizer(int mem, double initialGain, int batchSize, boolean output)
 
Method Summary
 String getName()
protected  void init(AbstractStochasticCachingDiffFunction func)
 void setM(int m)
protected  void takeStep(AbstractStochasticCachingDiffFunction dfunction)
 Pair<Integer,Double> tune(Function function, double[] initial, long msPerTest)
 
Methods inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
gainSchedule, minimize, minimize, say, sayln, setEvaluators, shutUp, smooth, tune, tuneBatch, tuneDouble, tuneDouble, tuneGain
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

SQNMinimizer

public SQNMinimizer(int m)

SQNMinimizer

public SQNMinimizer()

SQNMinimizer

public SQNMinimizer(int mem,
                    double initialGain,
                    int batchSize,
                    boolean output)

Method Detail

setM

public void setM(int m)

getName

public String getName()
Specified by:
getName in class StochasticMinimizer<T extends Function>

tune

public Pair<Integer,Double> tune(Function function,
                                 double[] initial,
                                 long msPerTest)
Specified by:
tune in class StochasticMinimizer<T extends Function>

init

protected void init(AbstractStochasticCachingDiffFunction func)
Overrides:
init in class StochasticMinimizer<T extends Function>

takeStep

protected void takeStep(AbstractStochasticCachingDiffFunction dfunction)
Specified by:
takeStep in class StochasticMinimizer<T extends Function>


Stanford NLP Group