public class ADMath
extends java.lang.Object
ADMath was created to extend the existing gradient calculations so that they automatically include a calculation of the Hessian-vector product with another vector v. It contains all the functions for the DoubleAlgorithmicDifferentiation (DoubleAD) class. It is used with Stochastic Meta Descent Optimization, but could be extended for use in any application that requires an additional order of differentiation without explicitly writing the code for it.
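To make the idea concrete, here is a minimal, hypothetical sketch of the underlying technique: forward-mode algorithmic differentiation with a dual number that carries a directional-derivative component. The `Dual` class, its fields, and its methods below are illustrative assumptions, not the actual DoubleAD API.

```java
// Hypothetical sketch of the technique, not the MALLET DoubleAD API.
// A Dual carries a value together with its directional derivative along a
// fixed vector v (the "dot" component). Running existing gradient code on
// Duals propagates the dot component through every operation, which is how
// a Hessian-vector product H*v can be obtained without explicitly writing
// second-derivative code.
final class Dual {
    final double val;  // function value
    final double dot;  // directional derivative of val along v

    Dual(double val, double dot) {
        this.val = val;
        this.dot = dot;
    }

    // Product rule: (a * b)' = a' * b + a * b'
    static Dual mult(Dual a, Dual b) {
        return new Dual(a.val * b.val, a.dot * b.val + a.val * b.dot);
    }

    // Quotient rule: (a / b)' = (a' * b - a * b') / b^2
    static Dual divide(Dual a, Dual b) {
        return new Dual(a.val / b.val,
                        (a.dot * b.val - a.val * b.dot) / (b.val * b.val));
    }

    // Chain rule: (e^a)' = e^a * a'
    static Dual exp(Dual a) {
        double e = Math.exp(a.val);
        return new Dual(e, e * a.dot);
    }
}
```

For example, seeding `x = new Dual(v, 1.0)` and evaluating `Dual.exp(Dual.mult(x, x))` yields e^(v^2) in the value component and 2v * e^(v^2), its derivative, in the dot component.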
| Modifier and Type | Method and Description |
| --- | --- |
| `static DoubleAD` | `divide(DoubleAD a, DoubleAD b)`: a / b |
| `static DoubleAD` | `divideConst(DoubleAD a, double b)`: a / b for a constant b |
| `static DoubleAD` | `exp(DoubleAD a)`: e^a |
| `static DoubleAD` | `log(DoubleAD a)`: the natural log of a |
| `static DoubleAD` | `logSum(DoubleAD[] logInputs)`: sum of the log-space inputs, returned in log space |
| `static DoubleAD` | `logSum(DoubleAD[] logInputs, int fromIndex, int toIndex)`: log-space sum over the given index range |
| `static DoubleAD` | `minus(DoubleAD a, DoubleAD b)`: a - b |
| `static DoubleAD` | `minusConst(DoubleAD a, double b)`: a - b for a constant b |
| `static DoubleAD` | `mult(DoubleAD a, DoubleAD b)`: a * b |
| `static DoubleAD` | `multConst(DoubleAD a, double b)`: a * b for a constant b |
| `static DoubleAD` | `plus(DoubleAD a, DoubleAD b)`: a + b |
| `static DoubleAD` | `plusConst(DoubleAD a, double b)`: a + b for a constant b |
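The two `logSum` methods operate on inputs that are already in log space; the parameter name `logInputs` suggests they compute log(sum_i exp(logInputs[i])). Below is a hedged sketch of how that rule would propagate the derivative component, reusing the hypothetical `Dual` type from the sketch above; the real DoubleAD semantics may differ.

```java
// Hypothetical sketch (reuses the illustrative Dual type defined earlier):
// a numerically stable log-sum-exp that also propagates the directional
// derivative. The derivative rule is
//   d/dt log(sum_i exp(a_i)) = sum_i softmax_i * a_i',
// where softmax_i = exp(a_i) / sum_j exp(a_j).
static Dual logSum(Dual[] logInputs) {
    double max = Double.NEGATIVE_INFINITY;
    for (Dual a : logInputs) {
        max = Math.max(max, a.val);        // subtract the max for stability
    }
    double sum = 0.0;
    double dotSum = 0.0;
    for (Dual a : logInputs) {
        double w = Math.exp(a.val - max);  // unnormalized softmax weight
        sum += w;
        dotSum += w * a.dot;
    }
    return new Dual(max + Math.log(sum), dotSum / sum);
}
```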