Requires the objective function to be an AbstractStochasticCachingDiffUpdateFunction.
NOTE: unlike other minimizers, regularization is done in the minimizer, not in the objective function.
Author:
Angel Chang
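Doing the regularization inside the minimizer usually means the L2 shrinkage is folded into a single global weight-scale factor, so each stochastic step only touches the coordinates that appear in the sampled example. A minimal self-contained sketch of that trick, assuming plain squared loss; none of these names come from the CoreNLP API:

```java
// Sketch of lazily regularized in-place SGD: instead of shrinking every
// weight on each step (the L2 penalty), fold the shrinkage into a single
// scale factor and update only the touched coordinates densely.
public class ScaledSgdSketch {
    public static double[] run(double[][] examples, double[] targets,
                               double lambda, double eta, int passes) {
        int dim = examples[0].length;
        double[] w = new double[dim];  // stored weights; true weights are wscale * w
        double wscale = 1.0;
        for (int pass = 0; pass < passes; pass++) {
            for (int i = 0; i < examples.length; i++) {
                double[] x = examples[i];
                double pred = 0.0;                    // prediction with the scaled weights
                for (int j = 0; j < dim; j++) pred += wscale * w[j] * x[j];
                double err = pred - targets[i];       // squared-loss gradient factor
                wscale *= (1.0 - eta * lambda);       // L2 shrinkage hits only the scale
                for (int j = 0; j < dim; j++) {
                    w[j] -= eta * err * x[j] / wscale; // divide so wscale*w takes the raw step
                }
            }
        }
        for (int j = 0; j < dim; j++) w[j] *= wscale;  // fold the scale back in at the end
        return w;
    }

    public static void main(String[] args) {
        // Fit y = 2*x on a tiny dataset with a small L2 penalty.
        double[][] xs = {{1.0}, {2.0}, {3.0}};
        double[] ys = {2.0, 4.0, 6.0};
        double[] w = run(xs, ys, 1e-4, 0.05, 200);
        System.out.println(w[0]); // close to 2.0
    }
}
```

The point of dividing the gradient step by wscale is that the true weight vector wscale*w then receives exactly the unregularized step, while the multiplicative shrinkage costs O(1) per update instead of O(dim).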
Method Summary

protected java.lang.String getName()

double getObjective(AbstractStochasticCachingDiffUpdateFunction function, double[] w, double wscale, int[] sample)

protected void init(AbstractStochasticCachingDiffUpdateFunction func)

double[] minimize(Function function, double functionTolerance, double[] initial)
    Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.

double[] minimize(Function f, double functionTolerance, double[] initial, int maxIterations)

protected void say(java.lang.String s)

protected void sayln(java.lang.String s)

void setEvaluators(int iters, Evaluator[] evaluators)

void shutUp()

double tryEta(AbstractStochasticCachingDiffUpdateFunction function, double[] initial, int[] sample, double eta)

double tune(AbstractStochasticCachingDiffUpdateFunction function, double[] initial, int sampleSize, double seta)
    Finds a good learning rate to start with: eta = 1/(lambda*(t0+t)); tuning searches for a good t0.
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail

protected double xscale
protected double xnorm
protected double[] x
protected int t0
protected double sigma
protected double lambda
protected boolean quiet
protected int numPasses
protected int bSize
protected int tuningSamples
protected java.util.Random gen
protected long maxTime
Constructor Detail

public StochasticInPlaceMinimizer()
public StochasticInPlaceMinimizer(double sigma, int numPasses)
public StochasticInPlaceMinimizer(double sigma, int numPasses, int tuningSamples)
public StochasticInPlaceMinimizer(LogPrior prior, int numPasses, int batchSize, int tuningSamples)
Method Detail

shutUp
public void shutUp()

getName
protected java.lang.String getName()

setEvaluators
public void setEvaluators(int iters, Evaluator[] evaluators)
Specified by: setEvaluators in interface HasEvaluators

init
protected void init(AbstractStochasticCachingDiffUpdateFunction func)

getObjective
public double getObjective(AbstractStochasticCachingDiffUpdateFunction function, double[] w, double wscale, int[] sample)

tryEta
public double tryEta(AbstractStochasticCachingDiffUpdateFunction function, double[] initial, int[] sample, double eta)
tune
public double tune(AbstractStochasticCachingDiffUpdateFunction function, double[] initial, int sampleSize, double seta)
Finds a good learning rate to start with: eta = 1/(lambda*(t0+t)); tuning searches for a good t0.
Parameters:
function -
initial -
sampleSize -
seta -
Returns:
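The schedule named above decays like 1/t, and the role of t0 is to set the size of the earliest steps. A self-contained sketch of that schedule (illustrative names only, not the CoreNLP API):

```java
// The learning-rate schedule after tuning: eta_t = 1 / (lambda * (t0 + t)).
// tune() effectively searches for a t0 so that the initial eta is neither
// too large (risking divergence) nor too small (slow progress).
public class EtaSchedule {
    static double eta(double lambda, int t0, int t) {
        return 1.0 / (lambda * (t0 + t));
    }

    public static void main(String[] args) {
        double lambda = 0.1;
        int t0 = 100;
        // eta decays like 1/t: larger early steps, smaller late ones.
        System.out.println(eta(lambda, t0, 0));    // 1/(0.1*100)  = 0.1
        System.out.println(eta(lambda, t0, 900));  // 1/(0.1*1000) ≈ 0.01
    }
}
```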
minimize
public double[] minimize(Function function, double functionTolerance, double[] initial)
Description copied from interface: Minimizer
Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.
Specified by: minimize in interface Minimizer<Function>
Parameters:
function - the objective function
functionTolerance - a double value
initial - an initial feasible point
Returns:
Unconstrained minimum of function
minimize
public double[] minimize(Function f, double functionTolerance, double[] initial, int maxIterations)
Specified by: minimize in interface Minimizer<Function>
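As a self-contained illustration of the minimize contract (a toy gradient-descent stand-in, not the CoreNLP implementation): start from initial and iterate until the objective improves by less than the tolerance.

```java
import java.util.function.DoubleUnaryOperator;

// Toy 1-D "minimize" with the same contract: start at `initial`,
// stop once the objective improves by less than `tolerance`.
public class MinimizeSketch {
    static double minimize(DoubleUnaryOperator f, DoubleUnaryOperator grad,
                           double tolerance, double initial) {
        double x = initial;
        double prev = f.applyAsDouble(x);
        while (true) {
            x -= 0.1 * grad.applyAsDouble(x);   // fixed step size for the sketch
            double cur = f.applyAsDouble(x);
            if (Math.abs(prev - cur) < tolerance) return x;
            prev = cur;
        }
    }

    public static void main(String[] args) {
        // Minimize (x - 3)^2 starting from 0; the minimum is at x = 3.
        double xmin = minimize(x -> (x - 3) * (x - 3), x -> 2 * (x - 3), 1e-10, 0.0);
        System.out.println(xmin); // close to 3.0
    }
}
```

The real class differs in the essentials covered above: steps are stochastic over samples, updates are in place with a weight scale, and the maxIterations overload bounds the number of passes.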
sayln
protected void sayln(java.lang.String s)
say
protected void say(java.lang.String s)
Stanford NLP Group