edu.stanford.nlp.optimization
Class SGDWithAdaGradAndFOBOS<T extends Function>
java.lang.Object
edu.stanford.nlp.optimization.SGDWithAdaGradAndFOBOS<T>
- All Implemented Interfaces:
- HasEvaluators, Minimizer<T>
public class SGDWithAdaGradAndFOBOS<T extends Function>
extends java.lang.Object
implements Minimizer<T>, HasEvaluators
Stochastic Gradient Descent with AdaGrad and FOBOS.
NOTE: As in the Stochastic Inplace Minimizer, regularization is done in the minimizer, not in the objective function.
- Author:
- Mengqiu Wang
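For orientation, below is a minimal sketch of the generic per-coordinate AdaGrad step size and the FOBOS soft-thresholding update for an L1 prior, with eta standing for initRate and lambda for the lambda parameter. This describes the general scheme only; the exact update used here, and the updates for other priorType settings, may differ.

\[
\eta_{t,i} = \frac{\eta}{\sqrt{\sum_{s \le t} g_{s,i}^{2}}},
\qquad
x_{t+1,i} = \operatorname{sign}\bigl(x_{t,i} - \eta_{t,i}\, g_{t,i}\bigr)\,
\max\bigl(0,\; \lvert x_{t,i} - \eta_{t,i}\, g_{t,i}\rvert - \eta_{t,i}\,\lambda\bigr)
\]

where g_{t,i} is the i-th coordinate of the (mini-batch) gradient at step t.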
Constructor Summary

SGDWithAdaGradAndFOBOS(double initRate, double lambda, int numPasses)
SGDWithAdaGradAndFOBOS(double initRate, double lambda, int numPasses, int tuningSamples)
SGDWithAdaGradAndFOBOS(double initRate, double lambda, int numPasses, int tuningSamples, int batchSize)
SGDWithAdaGradAndFOBOS(double initRate, double lambda, int numPasses, int tuningSamples, int batchSize, java.lang.String priorType, double alpha)
Methods inherited from class java.lang.Object

clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail

x
protected double[] x
initRate
protected double initRate
lambda
protected double lambda
alpha
protected double alpha
quiet
protected boolean quiet
numPasses
protected final int numPasses
bSize
protected int bSize
tuningSamples
protected final int tuningSamples
gen
protected java.util.Random gen
maxTime
protected long maxTime
Constructor Detail

SGDWithAdaGradAndFOBOS
public SGDWithAdaGradAndFOBOS(double initRate,
double lambda,
int numPasses)
SGDWithAdaGradAndFOBOS
public SGDWithAdaGradAndFOBOS(double initRate,
double lambda,
int numPasses,
int tuningSamples)
SGDWithAdaGradAndFOBOS
public SGDWithAdaGradAndFOBOS(double initRate,
double lambda,
int numPasses,
int tuningSamples,
int batchSize)
SGDWithAdaGradAndFOBOS
public SGDWithAdaGradAndFOBOS(double initRate,
double lambda,
int numPasses,
int tuningSamples,
int batchSize,
java.lang.String priorType,
double alpha)
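The constructor parameters are not described on this page. As a hedged reading based on the field names above, initRate is the base AdaGrad learning rate, lambda the regularization weight applied by FOBOS, numPasses the number of passes over the data, tuningSamples the number of samples used for tuning, batchSize the mini-batch size, and priorType/alpha the choice and weighting of the prior. The sketch below is illustrative only; the concrete values and the "lasso" string are assumptions, not taken from this documentation.

import edu.stanford.nlp.optimization.DiffFunction;
import edu.stanford.nlp.optimization.SGDWithAdaGradAndFOBOS;

public class ConstructorChoices {
  public static void main(String[] args) {
    // Simplest form: base rate, regularization weight, number of passes (values illustrative).
    SGDWithAdaGradAndFOBOS<DiffFunction> basic =
        new SGDWithAdaGradAndFOBOS<>(0.1, 1e-4, 30);

    // Mini-batched form with an explicit prior. The "lasso" string and alpha = 1.0 are
    // assumptions; consult the source for the priorType values actually supported.
    SGDWithAdaGradAndFOBOS<DiffFunction> batched =
        new SGDWithAdaGradAndFOBOS<>(0.1, 1e-4, 30, 1000, 15, "lasso", 1.0);
  }
}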
Method Detail

terminateOnEvalImprovement
public void terminateOnEvalImprovement(boolean toTerminate)
suppressTestPrompt
public void suppressTestPrompt(boolean suppressTestPrompt)
setTerminateOnEvalImprovementNumOfEpoch
public void setTerminateOnEvalImprovementNumOfEpoch(int terminateOnEvalImprovementNumOfEpoch)
toContinue
public boolean toContinue(double[] x,
double currEval)
shutUp
public void shutUp()
getName
protected java.lang.String getName()
setEvaluators
public void setEvaluators(int iters,
Evaluator[] evaluators)
- Specified by: setEvaluators in interface HasEvaluators
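Taken together, these evaluator hooks appear to support early stopping based on a held-out evaluation. A minimal, hedged sketch of wiring them up is shown below; the interpretation of the iters argument as an evaluation interval and the specific values are assumptions, and constructing the Evaluator instances themselves is outside the scope of this page.

import edu.stanford.nlp.optimization.Evaluator;
import edu.stanford.nlp.optimization.SGDWithAdaGradAndFOBOS;

public class EarlyStoppingSetup {
  // Hypothetical helper: configures evaluator-driven early stopping on an existing minimizer.
  static void configureEarlyStopping(SGDWithAdaGradAndFOBOS<?> optimizer, Evaluator[] evaluators) {
    optimizer.setEvaluators(1, evaluators);                 // run the evaluators every pass (assumed meaning of iters)
    optimizer.terminateOnEvalImprovement(true);             // stop once the evaluation score stops improving
    optimizer.setTerminateOnEvalImprovementNumOfEpoch(3);   // patience of 3 epochs (illustrative)
  }
}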
minimize
public double[] minimize(Function function,
double functionTolerance,
double[] initial)
- Description copied from interface: Minimizer
  Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.
- Specified by: minimize in interface Minimizer<T extends Function>
- Parameters:
  function - the objective function
  functionTolerance - the tolerance within which to find the minimum (a double value)
  initial - an initial feasible point
- Returns:
  Unconstrained minimum of function
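As an end-to-end sketch, the example below minimizes a toy differentiable objective. It assumes a plain DiffFunction is an acceptable argument (in practice a stochastic, batch-capable objective may be expected), and all hyperparameter values are illustrative.

import edu.stanford.nlp.optimization.DiffFunction;
import edu.stanford.nlp.optimization.SGDWithAdaGradAndFOBOS;

public class SGDWithAdaGradAndFOBOSExample {
  public static void main(String[] args) {
    // Toy objective: f(x) = sum_i (x_i - 1)^2; its unregularized minimum is the all-ones vector.
    DiffFunction objective = new DiffFunction() {
      @Override public int domainDimension() { return 3; }
      @Override public double valueAt(double[] x) {
        double value = 0.0;
        for (double xi : x) { value += (xi - 1.0) * (xi - 1.0); }
        return value;
      }
      @Override public double[] derivativeAt(double[] x) {
        double[] grad = new double[x.length];
        for (int i = 0; i < x.length; i++) { grad[i] = 2.0 * (x[i] - 1.0); }
        return grad;
      }
    };

    // Illustrative hyperparameters: base AdaGrad rate 0.1, regularization weight 1e-3, 50 passes.
    SGDWithAdaGradAndFOBOS<DiffFunction> optimizer =
        new SGDWithAdaGradAndFOBOS<>(0.1, 1e-3, 50);

    double[] initial = new double[objective.domainDimension()];   // start from the origin
    double[] argmin = optimizer.minimize(objective, 1e-4, initial);
    System.out.println(java.util.Arrays.toString(argmin));
  }
}

Since regularization is applied inside the minimizer (per the class NOTE), the returned point would be expected to be shrunk slightly toward the origin rather than sitting exactly at the all-ones vector.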
minimize
public double[] minimize(Function f,
double functionTolerance,
double[] initial,
int maxIterations)
- Specified by: minimize in interface Minimizer<T extends Function>
sayln
protected void sayln(java.lang.String s)
say
protected void say(java.lang.String s)