edu.stanford.nlp.optimization
Class ScaledSGDMinimizer<Q extends AbstractStochasticCachingDiffFunction>

java.lang.Object
  extended by edu.stanford.nlp.optimization.StochasticMinimizer<Q>
      extended by edu.stanford.nlp.optimization.ScaledSGDMinimizer<Q>
All Implemented Interfaces:
HasEvaluators, Minimizer<Q>

public class ScaledSGDMinimizer<Q extends AbstractStochasticCachingDiffFunction>
extends StochasticMinimizer<Q>

Stochastic Gradient Descent to Quasi-Newton Minimizer. An experimental minimizer that takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and runs SGD for the first several passes. During the final iterations it builds up a series of approximate Hessian-vector products, which are then passed to the QNMinimizer so that it can start up without the typical delay.
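
A minimal usage sketch (not part of the original documentation): it assumes a caller-supplied subclass of AbstractStochasticCachingDiffFunction and the minimize(function, functionTolerance, initial) overload inherited from StochasticMinimizer; the gain, batch size, pass count, and tolerance values below are illustrative only.

import edu.stanford.nlp.optimization.AbstractStochasticCachingDiffFunction;
import edu.stanford.nlp.optimization.ScaledSGDMinimizer;

public class ScaledSGDExample {

  /** Fits a caller-supplied stochastic objective; the wrapper class and method are illustrative. */
  public static double[] fit(AbstractStochasticCachingDiffFunction objective) {
    double sgdGain = 0.1;   // initial SGD gain (step size); illustrative value
    int batchSize = 15;     // examples evaluated per stochastic batch; illustrative value
    int sgdPasses = 10;     // SGD passes before the hand-off to QNMinimizer described above

    ScaledSGDMinimizer<AbstractStochasticCachingDiffFunction> minimizer =
        new ScaledSGDMinimizer<>(sgdGain, batchSize, sgdPasses);

    double[] initial = new double[objective.domainDimension()];  // start at the origin
    // minimize(...) is inherited from StochasticMinimizer; the 1e-4 tolerance is illustrative.
    return minimizer.minimize(objective, 1e-4, initial);
  }
}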

Since:
1.0
Author:
Alex Kleeman

Nested Class Summary
static class ScaledSGDMinimizer.Weights
           
 
Nested classes/interfaces inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
StochasticMinimizer.PropertySetter<T1>
 
Field Summary
 double[] diag
           
 java.util.List<double[]> sList
           
 java.util.List<double[]> yList
           
 
Fields inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
bSize, file, gain, gen, grad, gradList, infoFile, k, maxTime, memory, newGrad, newX, nf, numBatches, numPasses, outputFrequency, outputIterationsToFile, quiet, v, x
 
Constructor Summary
ScaledSGDMinimizer(double SGDGain, int batchSize)
           
ScaledSGDMinimizer(double SGDGain, int batchSize, int sgdPasses)
           
ScaledSGDMinimizer(double SGDGain, int batchSize, int sgdPasses, int method)
           
ScaledSGDMinimizer(double SGDGain, int batchSize, int sgdPasses, int method, boolean outputToFile)
           
 
Method Summary
static double[] getDiag(java.lang.String loadPath)
           
 java.lang.String getName()
           
static double[] getWeights(java.lang.String loadPath)
           
protected  void init(AbstractStochasticCachingDiffFunction func)
           
static void serializeWeights(java.lang.String serializePath, double[] weights)
           
static void serializeWeights(java.lang.String serializePath, double[] weights, double[] diag)
           
 void setBatchSize(int batchSize)
           
 void setMaxTime(java.lang.Long max)
           
 void shutUp()
           
protected  void takeStep(AbstractStochasticCachingDiffFunction dfunction)
           
 Pair<java.lang.Integer,java.lang.Double> tune(Function function, double[] initial, long msPerTest)
           
 double tuneFixedGain(Function function, double[] initial, long msPerTest, double fixedStart)
           
 
Methods inherited from class edu.stanford.nlp.optimization.StochasticMinimizer
gainSchedule, minimize, minimize, say, sayln, setEvaluators, smooth, tune, tuneBatch, tuneDouble, tuneDouble, tuneGain
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

yList

public java.util.List<double[]> yList

sList

public java.util.List<double[]> sList

diag

public double[] diag
Constructor Detail

ScaledSGDMinimizer

public ScaledSGDMinimizer(double SGDGain,
                          int batchSize,
                          int sgdPasses)

ScaledSGDMinimizer

public ScaledSGDMinimizer(double SGDGain,
                          int batchSize,
                          int sgdPasses,
                          int method)

ScaledSGDMinimizer

public ScaledSGDMinimizer(double SGDGain,
                          int batchSize,
                          int sgdPasses,
                          int method,
                          boolean outputToFile)

ScaledSGDMinimizer

public ScaledSGDMinimizer(double SGDGain,
                          int batchSize)
Method Detail

tuneFixedGain

public double tuneFixedGain(Function function,
                            double[] initial,
                            long msPerTest,
                            double fixedStart)

tune

public Pair<java.lang.Integer,java.lang.Double> tune(Function function,
                                                     double[] initial,
                                                     long msPerTest)
Specified by:
tune in class StochasticMinimizer<Q extends AbstractStochasticCachingDiffFunction>
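
A hedged sketch of calling tune before minimizing (not from the original documentation): the interpretation of the returned Pair's elements as a tuned batch size and gain is an assumption, and the gain, batch size, and per-test time budget are illustrative values.

import edu.stanford.nlp.optimization.AbstractStochasticCachingDiffFunction;
import edu.stanford.nlp.optimization.ScaledSGDMinimizer;
import edu.stanford.nlp.util.Pair;

public class TuneExample {

  /** Runs the tuner on a caller-supplied stochastic objective; names and values are illustrative. */
  public static Pair<Integer, Double> tuneOn(AbstractStochasticCachingDiffFunction objective) {
    ScaledSGDMinimizer<AbstractStochasticCachingDiffFunction> minimizer =
        new ScaledSGDMinimizer<>(0.1, 15);       // illustrative gain and batch size

    double[] initial = new double[objective.domainDimension()];
    long msPerTest = 5000L;                      // wall-clock budget per tuning trial (illustrative)

    Pair<Integer, Double> tuned = minimizer.tune(objective, initial, msPerTest);
    // The pair is assumed here to hold the selected batch size and gain.
    System.err.println("tuned: " + tuned.first() + ", " + tuned.second());
    return tuned;
  }
}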

shutUp

public void shutUp()
Overrides:
shutUp in class StochasticMinimizer<Q extends AbstractStochasticCachingDiffFunction>

setBatchSize

public void setBatchSize(int batchSize)

setMaxTime

public void setMaxTime(java.lang.Long max)

getName

public java.lang.String getName()
Specified by:
getName in class StochasticMinimizer<Q extends AbstractStochasticCachingDiffFunction>

takeStep

protected void takeStep(AbstractStochasticCachingDiffFunction dfunction)
Specified by:
takeStep in class StochasticMinimizer<Q extends AbstractStochasticCachingDiffFunction>

init

protected void init(AbstractStochasticCachingDiffFunction func)
Overrides:
init in class StochasticMinimizer<Q extends AbstractStochasticCachingDiffFunction>

serializeWeights

public static void serializeWeights(java.lang.String serializePath,
                                    double[] weights)

serializeWeights

public static void serializeWeights(java.lang.String serializePath,
                                    double[] weights,
                                    double[] diag)

getWeights

public static double[] getWeights(java.lang.String loadPath)
                           throws java.io.IOException,
                                  java.lang.ClassCastException,
                                  java.lang.ClassNotFoundException
Throws:
java.io.IOException
java.lang.ClassCastException
java.lang.ClassNotFoundException

getDiag

public static double[] getDiag(java.lang.String loadPath)
                        throws java.io.IOException,
                               java.lang.ClassCastException,
                               java.lang.ClassNotFoundException
Throws:
java.io.IOException
java.lang.ClassCastException
java.lang.ClassNotFoundException
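
A hedged sketch of saving and reloading a weight vector with the static helpers above (not from the original documentation): the file name is illustrative, and it is assumed that getWeights and getDiag read back the same file written by the three-argument serializeWeights.

import java.io.IOException;
import edu.stanford.nlp.optimization.ScaledSGDMinimizer;

public class WeightSerializationExample {

  /** Writes weights and the diagonal scaling to disk, then reads them back. */
  public static void roundTrip(double[] weights, double[] diag)
      throws IOException, ClassNotFoundException {
    String path = "scaled-sgd-weights.ser";  // illustrative path
    ScaledSGDMinimizer.serializeWeights(path, weights, diag);

    // Both loaders declare the checked exceptions shown in the signatures above.
    double[] loadedWeights = ScaledSGDMinimizer.getWeights(path);
    double[] loadedDiag = ScaledSGDMinimizer.getDiag(path);
    assert loadedWeights.length == weights.length && loadedDiag.length == diag.length;
  }
}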

