The ADMath class extends the existing gradient calculations so that they also
compute the Hessian-vector product with a given vector v automatically. It
provides all of the functions required by the DoubleAlgorithmicDifferentiation
class. It is currently used for Stochastic Meta Descent optimization, but it
could be applied to any task that needs one additional order of differentiation
without writing the derivative code by hand.
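The underlying idea can be sketched with forward-mode dual numbers: evaluating the gradient with inputs seeded by a direction v makes each gradient entry carry its directional derivative, and those tangents together form the Hessian-vector product Hv. The code below is a minimal illustration of that technique, not ADMath's actual API; all names and the example function are hypothetical.

```python
class Dual:
    """Number carrying a value and a directional derivative (tangent)."""
    def __init__(self, value, tangent=0.0):
        self.value, self.tangent = value, tangent
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.tangent + other.tangent)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule propagates the tangent alongside the value.
        return Dual(self.value * other.value,
                    self.tangent * other.value + self.value * other.tangent)
    __rmul__ = __mul__

def grad_quadratic(x):
    # Analytic gradient of the example f(x) = x0^2 * x1 + x1^2.
    x0, x1 = x
    return [2 * x0 * x1, x0 * x0 + 2 * x1]

def hessian_vector_product(grad_fn, x, v):
    # Seed each input's tangent with v_i; the gradient's tangents are H v.
    duals = [Dual(xi, vi) for xi, vi in zip(x, v)]
    return [g.tangent for g in grad_fn(duals)]
```

For the example above at x = (1, 2), the Hessian is [[4, 2], [2, 2]], so seeding with v = (1, 0) returns its first column without ever forming the full matrix; that is what makes the approach attractive inside an optimizer.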