edu.stanford.nlp.optimization

## Class SMDMinimizer<T extends Function>

• All Implemented Interfaces:
HasEvaluators, Minimizer<T>

public class SMDMinimizer<T extends Function>
extends StochasticMinimizer<T>

Stochastic Meta-Descent (SMD) minimizer, based on:

S. V. N. Vishwanathan, Nicol N. Schraudolph, Mark W. Schmidt, and Kevin P. Murphy. *Accelerated training of conditional random fields with stochastic gradient methods.* In Proceedings of the 23rd International Conference on Machine Learning (ICML '06). ACM Press, June 2006.

The basic way to use the minimizer is to construct one with the no-argument constructor, then call the simple minimize method:

    Minimizer<DiffFunction> smd = new SMDMinimizer<DiffFunction>();
    DiffFunction df = new SomeDiffFunction();
    double tol = 1e-4;
    double[] initial = getInitialGuess();
    int maxIterations = someSafeNumber;
    double[] minimum = smd.minimize(df, tol, initial, maxIterations);

Constructing with the no-argument constructor uses the default values:

    batchSize = 15;
    initialGain = 0.1;
    useAlgorithmicDifferentiation = true;

Since:
1.0
Version:
1.0
Author:
Alex Kleeman
• ### Field Detail

• #### mu

public double mu
• #### lam

public double lam
• #### cPosDef

public double cPosDef
• #### meta

public double meta
• #### printMinMax

public boolean printMinMax
• ### Constructor Detail

• #### SMDMinimizer

public SMDMinimizer()
• #### SMDMinimizer

public SMDMinimizer(double initialSMDGain,
int batchSize,
StochasticCalculateMethods method,
int passes)
• #### SMDMinimizer

public SMDMinimizer(double initGain,
int batchSize,
StochasticCalculateMethods method,
int passes,
boolean outputToFile)
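
The public fields mu and lam correspond to the meta-learning rate and the trace-decay factor of the SMD gain-adaptation rule described in the paper above. As an illustrative, self-contained sketch of that update rule (not the actual SMDMinimizer implementation; all names here are hypothetical), here it is applied to a simple quadratic, where the Hessian-vector product needed for the trace update is exact because the Hessian is diagonal:

```java
import java.util.Arrays;

// Illustrative sketch of the Stochastic Meta-Descent gain-adaptation rule,
// applied to f(x) = 0.5 * sum(a_i * x_i^2). This is NOT the SMDMinimizer
// implementation, only the per-parameter update it is based on.
public class SmdSketch {
    public static double[] run() {
        double[] a = {1.0, 10.0};     // diagonal Hessian of the quadratic
        double[] x = {5.0, 5.0};      // initial guess
        double mu = 0.1;              // meta learning rate (cf. field mu)
        double lam = 0.9;             // decay of the gradient trace (cf. field lam)
        double[] eta = {0.1, 0.1};    // per-parameter gains (cf. initialGain = 0.1)
        double[] v = new double[x.length]; // trace approximating dx/d(log eta)

        for (int t = 0; t < 200; t++) {
            double[] g = new double[x.length];
            for (int i = 0; i < x.length; i++) {
                g[i] = a[i] * x[i];   // gradient of f at the current x
            }
            for (int i = 0; i < x.length; i++) {
                // multiplicative gain adaptation, clipped below at 1/2
                eta[i] *= Math.max(0.5, 1.0 - mu * g[i] * v[i]);
                x[i] -= eta[i] * g[i];                 // gradient step
                // trace update; a[i] * v[i] is the Hessian-vector product,
                // exact here because the Hessian is diagonal
                v[i] = lam * v[i] - eta[i] * (g[i] + lam * a[i] * v[i]);
            }
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(run()));
    }
}
```

In the library itself the Hessian-vector product is obtained stochastically (e.g. via algorithmic differentiation or finite differences, per the StochasticCalculateMethods argument); the sketch only shows the shape of the gain and trace updates.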

Stanford NLP Group