edu.stanford.nlp.math
Class ADMath

java.lang.Object
  extended by edu.stanford.nlp.math.ADMath

public class ADMath
extends java.lang.Object

The class ADMath was created to extend the existing gradient calculations to automatically include a calculation of the Hessian-vector product with another vector v. It contains all of the arithmetic functions for the DoubleAD (double algorithmic differentiation) class. This is used with Stochastic Meta-Descent optimization, but could be extended for use in any application that requires an additional order of differentiation without explicitly writing the code.
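The technique the description refers to can be illustrated with a minimal forward-mode (dual-number) sketch: each quantity carries a directional-derivative component alongside its value, and each arithmetic operation propagates both. The class and field names below are illustrative only, not the actual DoubleAD API.

```java
// Minimal forward-mode AD sketch. Each Dual carries a value and a
// directional derivative ("dot") along some chosen vector v; the
// arithmetic rules propagate both. Names here are hypothetical and
// do not reflect the real DoubleAD class.
final class Dual {
    final double val;  // function value
    final double dot;  // directional derivative along v
    Dual(double val, double dot) { this.val = val; this.dot = dot; }

    static Dual mult(Dual a, Dual b) {
        // product rule: (ab)' = a'b + ab'
        return new Dual(a.val * b.val, a.dot * b.val + a.val * b.dot);
    }

    static Dual exp(Dual a) {
        // chain rule: (e^a)' = e^a * a'
        double e = Math.exp(a.val);
        return new Dual(e, e * a.dot);
    }
}

public class DualDemo {
    public static void main(String[] args) {
        // f(x) = x * e^x at x = 1, differentiated along v = 1:
        // f'(x) = e^x + x e^x, so f.val = e and f.dot = 2e.
        Dual x = new Dual(1.0, 1.0);
        Dual f = Dual.mult(x, Dual.exp(x));
        System.out.println(f.val + " " + f.dot);
    }
}
```

Applying the same propagation to a gradient computation (rather than a scalar one) is what yields a Hessian-vector product as a by-product.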

Author:
Alex Kleeman

Constructor Summary
ADMath()

Method Summary
static DoubleAD divide(DoubleAD a, DoubleAD b)
static DoubleAD divideConst(DoubleAD a, double b)
static DoubleAD exp(DoubleAD a)
static DoubleAD log(DoubleAD a)
static DoubleAD logSum(DoubleAD[] logInputs)
static DoubleAD logSum(DoubleAD[] logInputs, int fromIndex, int toIndex)
static DoubleAD minus(DoubleAD a, DoubleAD b)
static DoubleAD minusConst(DoubleAD a, double b)
static DoubleAD mult(DoubleAD a, DoubleAD b)
static DoubleAD multConst(DoubleAD a, double b)
static DoubleAD plus(DoubleAD a, DoubleAD b)
static DoubleAD plusConst(DoubleAD a, double b)
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

ADMath

public ADMath()

Method Detail

mult

public static DoubleAD mult(DoubleAD a,
                            DoubleAD b)

multConst

public static DoubleAD multConst(DoubleAD a,
                                 double b)

divide

public static DoubleAD divide(DoubleAD a,
                              DoubleAD b)

divideConst

public static DoubleAD divideConst(DoubleAD a,
                                   double b)

exp

public static DoubleAD exp(DoubleAD a)

log

public static DoubleAD log(DoubleAD a)

plus

public static DoubleAD plus(DoubleAD a,
                            DoubleAD b)

plusConst

public static DoubleAD plusConst(DoubleAD a,
                                 double b)

minus

public static DoubleAD minus(DoubleAD a,
                             DoubleAD b)

minusConst

public static DoubleAD minusConst(DoubleAD a,
                                  double b)

logSum

public static DoubleAD logSum(DoubleAD[] logInputs)

logSum

public static DoubleAD logSum(DoubleAD[] logInputs,
                              int fromIndex,
                              int toIndex)
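The value component of logSum is presumably the usual log-sum-exp, log(Σᵢ exp(xᵢ)) over the given index range, which must be computed with a max-shift to avoid overflow. A sketch of that standard stable formulation over plain doubles (derivative propagation omitted; this is not the actual implementation) is:

```java
public class LogSumSketch {
    // Numerically stable log(sum_{i in [from, to)} exp(x[i])):
    // shift by the maximum so no individual exp can overflow.
    static double logSum(double[] x, int from, int to) {
        double max = Double.NEGATIVE_INFINITY;
        for (int i = from; i < to; i++) max = Math.max(max, x[i]);
        double sum = 0.0;
        for (int i = from; i < to; i++) sum += Math.exp(x[i] - max);
        return max + Math.log(sum);
    }

    public static void main(String[] args) {
        // Naive exp(1000) overflows to infinity; the shifted form
        // returns 1000 + log(2) exactly as expected.
        double[] x = {1000.0, 1000.0};
        System.out.println(logSum(x, 0, 2));
    }
}
```

The DoubleAD variant would additionally carry the derivative of the log-sum-exp, which is a softmax-weighted combination of the inputs' derivative components.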


Stanford NLP Group