edu.stanford.nlp.optimization
Class SGDToQNMinimizer
java.lang.Object
edu.stanford.nlp.optimization.SGDToQNMinimizer
- All Implemented Interfaces:
- Minimizer<DiffFunction>, Serializable
public class SGDToQNMinimizer
- extends Object
- implements Minimizer<DiffFunction>, Serializable
Stochastic Gradient Descent to Quasi-Newton Minimizer.
An experimental minimizer which takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and runs SGD for the first few passes. During the final SGD iterations a series of approximate Hessian-vector products is built up; these are then passed to QNMinimizer so that the quasi-Newton phase can start immediately, without its usual warm-up delay.
Note [2012]: The basic idea here is good, but the original ScaledSGDMinimizer wasn't efficient, so this class would be much more useful if rewritten to use the better StochasticInPlaceMinimizer instead.
- Since:
- 1.0
- Author:
- Alex Kleeman
- See Also:
- Serialized Form
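The two-phase handoff described above can be sketched in plain Java. The following is a hedged, self-contained illustration and not the CoreNLP code: it minimizes a separable quadratic and estimates diagonal secant curvature from the SGD trajectory, whereas the real class builds approximate Hessian-vector products for QNMinimizer. The class name, objective, gain, and pass counts below are all invented for the demo.

```java
import java.util.Arrays;

// Sketch of the SGD-to-QN idea: run SGD for a warm-up phase, harvest
// curvature information from the trajectory, and seed a second-phase
// (quasi-)Newton optimizer with it so it does not start from scratch.
public class SgdToQnSketch {

    // f(x) = 0.5 * sum_i a_i x_i^2 - b_i x_i, gradient g_i = a_i x_i - b_i,
    // so the minimum is at x_i = b_i / a_i, i.e. (2, 3, 1) here.
    static final double[] A = {4.0, 1.0, 9.0};
    static final double[] B = {8.0, 3.0, 9.0};

    static double[] grad(double[] x) {
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++) g[i] = A[i] * x[i] - B[i];
        return g;
    }

    public static double[] minimize(double sgdGain, int sgdPasses, int qnPasses) {
        int n = A.length;
        double[] x = new double[n];
        double[] g = grad(x);
        double[] h = new double[n];          // running curvature estimates
        Arrays.fill(h, 1.0);
        // Phase 1: SGD with a decaying gain, recording secant curvature
        // y_i / s_i from each (parameter change, gradient change) pair.
        for (int k = 0; k < sgdPasses; k++) {
            double gain = sgdGain / (1 + k);
            double[] xn = new double[n];
            for (int i = 0; i < n; i++) xn[i] = x[i] - gain * g[i];
            double[] gn = grad(xn);
            for (int i = 0; i < n; i++) {
                double s = xn[i] - x[i], y = gn[i] - g[i];
                if (Math.abs(s) > 1e-12) h[i] = y / s;
            }
            x = xn; g = gn;
        }
        // Phase 2: Newton-like steps seeded with the curvature from phase 1.
        for (int k = 0; k < qnPasses; k++) {
            for (int i = 0; i < n; i++) x[i] -= g[i] / h[i];
            g = grad(x);
        }
        return x;
    }

    public static void main(String[] args) {
        double[] x = minimize(0.1, 5, 3);
        // prints values near the optimum (2, 3, 1)
        System.out.printf("%.4f %.4f %.4f%n", x[0], x[1], x[2]);
    }
}
```

On a quadratic the secant ratio recovers the diagonal Hessian exactly, so the second phase converges almost immediately — the same effect the class aims for by handing its accumulated curvature information to QNMinimizer.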
Constructor Summary
SGDToQNMinimizer(double SGDGain, int batchSize, int SGDPasses, int QNPasses)
SGDToQNMinimizer(double SGDGain, int batchSize, int sgdPasses, int qnPasses, int hessSamples, int QNMem)
SGDToQNMinimizer(double SGDGain, int batchSize, int sgdPasses, int qnPasses, int hessSamples, int QNMem, boolean outputToFile)
Method Summary
protected String getName()
double[] minimize(DiffFunction function, double functionTolerance, double[] initial)
          Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.
double[] minimize(DiffFunction function, double functionTolerance, double[] initial, int maxIterations)
void shutUp()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail
outputIterationsToFile
public boolean outputIterationsToFile
gain
public double gain
SGDPasses
public int SGDPasses
QNPasses
public int QNPasses
Constructor Detail
SGDToQNMinimizer
public SGDToQNMinimizer(double SGDGain,
int batchSize,
int SGDPasses,
int QNPasses)
SGDToQNMinimizer
public SGDToQNMinimizer(double SGDGain,
int batchSize,
int sgdPasses,
int qnPasses,
int hessSamples,
int QNMem)
SGDToQNMinimizer
public SGDToQNMinimizer(double SGDGain,
int batchSize,
int sgdPasses,
int qnPasses,
int hessSamples,
int QNMem,
boolean outputToFile)
Method Detail
shutUp
public void shutUp()
getName
protected String getName()
minimize
public double[] minimize(DiffFunction function,
double functionTolerance,
double[] initial)
- Description copied from interface: Minimizer
- Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.
- Specified by:
- minimize in interface Minimizer<DiffFunction>
- Parameters:
- function - the objective function
- functionTolerance - a double value
- initial - an initial feasible point
- Returns:
- Unconstrained minimum of function
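To make the minimize contract concrete, here is a hedged, self-contained sketch of how a minimize(function, functionTolerance, initial) call behaves. The Diff interface and the plain gradient-descent loop below are local stand-ins invented for the demo; the real method takes a CoreNLP DiffFunction and uses the SGD-to-QN scheme described above.

```java
import java.util.Arrays;

// Stand-in demonstration of the Minimizer.minimize contract: start from an
// initial point and iterate until successive function values differ by less
// than functionTolerance, then return the (approximate) minimizer.
public class MinimizeContractDemo {

    public interface Diff {               // local stand-in for DiffFunction
        double valueAt(double[] x);
        double[] derivativeAt(double[] x);
    }

    public static double[] minimize(Diff f, double functionTolerance, double[] initial) {
        double[] x = initial.clone();     // do not mutate the caller's point
        double prev = f.valueAt(x);
        for (int iter = 0; iter < 10_000; iter++) {
            double[] g = f.derivativeAt(x);
            for (int i = 0; i < x.length; i++) x[i] -= 0.1 * g[i];
            double cur = f.valueAt(x);
            if (Math.abs(prev - cur) < functionTolerance) break;  // converged
            prev = cur;
        }
        return x;
    }

    public static void main(String[] args) {
        // f(x, y) = (x - 1)^2 + (y + 2)^2, minimized at (1, -2)
        Diff f = new Diff() {
            public double valueAt(double[] x) {
                return (x[0] - 1) * (x[0] - 1) + (x[1] + 2) * (x[1] + 2);
            }
            public double[] derivativeAt(double[] x) {
                return new double[]{2 * (x[0] - 1), 2 * (x[1] + 2)};
            }
        };
        // prints a point close to [1.0, -2.0]
        System.out.println(Arrays.toString(minimize(f, 1e-10, new double[]{0, 0})));
    }
}
```

Note that the tolerance governs when iteration stops, not how close the returned point is to the true minimum; a tighter functionTolerance yields a more accurate result at the cost of more iterations.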
minimize
public double[] minimize(DiffFunction function,
double functionTolerance,
double[] initial,
int maxIterations)
- Specified by:
- minimize in interface Minimizer<DiffFunction>
Stanford NLP Group