Optyx computes derivatives symbolically: gradients, Jacobians, and Hessians are derived analytically from expression trees.

```python
from optyx.core.autodiff import gradient, compute_jacobian, compute_hessian
```
2 gradient
Compute the symbolic derivative of an expression with respect to a variable.
```python
from optyx.core.autodiff import gradient

df_dx = gradient(expr, var)
```
2.1 Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| `expr` | `Expression` | The expression to differentiate |
| `wrt` | `Variable` | The variable to differentiate with respect to |
2.2 Returns
Expression — A new expression representing the derivative
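To illustrate what "a new expression representing the derivative" means, here is a minimal, hypothetical sketch of symbolic differentiation over an expression tree. The `Var`/`Const`/`Add`/`Mul` classes and the `diff` function are illustrative stand-ins, not Optyx's actual internals:

```python
# Minimal sketch of symbolic differentiation over an expression tree.
# These classes are illustrative stand-ins, not Optyx's actual types.

class Var:
    def __init__(self, name): self.name = name

class Const:
    def __init__(self, value): self.value = value

class Add:
    def __init__(self, a, b): self.a, self.b = a, b

class Mul:
    def __init__(self, a, b): self.a, self.b = a, b

def diff(expr, wrt):
    """Return a new tree representing d(expr)/d(wrt)."""
    if isinstance(expr, Var):
        return Const(1.0 if expr is wrt else 0.0)
    if isinstance(expr, Const):
        return Const(0.0)
    if isinstance(expr, Add):                # sum rule: (a + b)' = a' + b'
        return Add(diff(expr.a, wrt), diff(expr.b, wrt))
    if isinstance(expr, Mul):                # product rule: (ab)' = a'b + ab'
        return Add(Mul(diff(expr.a, wrt), expr.b),
                   Mul(expr.a, diff(expr.b, wrt)))
    raise TypeError(type(expr))

def evaluate(expr, env):
    if isinstance(expr, Var):   return env[expr.name]
    if isinstance(expr, Const): return expr.value
    if isinstance(expr, Add):   return evaluate(expr.a, env) + evaluate(expr.b, env)
    if isinstance(expr, Mul):   return evaluate(expr.a, env) * evaluate(expr.b, env)

x = Var("x")
f = Mul(x, x)                    # f(x) = x^2
df = diff(f, x)                  # symbolic derivative: x*1 + 1*x, i.e. 2x
print(evaluate(df, {"x": 3.0}))  # 6.0
```

The key point is that `diff` returns a tree, not a number: the derivative can then be evaluated at any point, composed further, or (in Optyx) compiled.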
Some derivative formulas produce undefined values at certain points:
| Function | Derivative | Singularity |
| --- | --- | --- |
| \(\lvert x \rvert\) | \(x/\lvert x \rvert\) | \(0/0\) → NaN at \(x=0\) |
| \(\sqrt{x}\) | \(1/(2\sqrt{x})\) | \(1/0\) → +Inf at \(x=0\) |
| \(\ln(x)\) | \(1/x\) | \(1/0\) → +Inf at \(x=0\) |
Optyx handles these automatically when using compiled derivative functions (`compile_gradient`, `compile_jacobian`, `compile_hessian`):

- NaN values are replaced with 0.0 (subgradient convention)
- +Inf values are replaced with +1e16 (large but finite)
- -Inf values are replaced with -1e16 (large but finite)
This prevents solver crashes while preserving gradient direction. For best results, avoid exact singularities by using appropriate variable bounds (e.g., lb=1e-6 instead of lb=0).
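The clamping convention above can be reproduced with NumPy's `nan_to_num`. This is an illustrative re-implementation of the described behavior, not Optyx's actual code; `sanitize_gradient` is a hypothetical helper name:

```python
import numpy as np

def sanitize_gradient(g):
    # Same convention as described above (assumed behavior, applied by
    # Optyx inside its compiled derivative functions):
    #   NaN  -> 0.0    (subgradient convention, e.g. d|x|/dx at x = 0)
    #   +Inf -> +1e16  (large but finite)
    #   -Inf -> -1e16
    return np.nan_to_num(g, nan=0.0, posinf=1e16, neginf=-1e16)

x = np.array([0.0, -2.0, 2.0])
with np.errstate(invalid="ignore", divide="ignore"):
    raw = x / np.abs(x)          # derivative of |x|: 0/0 -> NaN at x = 0
print(sanitize_gradient(raw))    # NaN at x = 0 replaced by 0.0
```

Replacing NaN with 0.0 picks one valid subgradient of \(\lvert x \rvert\) at the kink, which is why solvers remain stable there.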
8 Performance
Symbolic differentiation is combined with expression simplification:

- Constants are folded: `0 + x → x`, `1 * x → x`
- Zero propagation: `0 * expr → 0`
- Identical subexpressions are cached
For production use with many evaluations, always use compiled functions.
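The folding and zero-propagation rules listed above can be sketched as a small bottom-up rewrite pass. This is a hypothetical illustration over tuple-encoded trees `("+", a, b)` / `("*", a, b)`, not Optyx's API:

```python
# Hypothetical sketch of the simplification rules listed above, applied
# bottom-up to tuple-encoded expression trees; not Optyx's actual code.

def simplify(expr):
    if not isinstance(expr, tuple):
        return expr                       # variable name or numeric constant
    op, a, b = expr
    a, b = simplify(a), simplify(b)       # simplify children first
    if op == "+":
        if a == 0: return b               # 0 + x -> x
        if b == 0: return a               # x + 0 -> x
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return a + b                  # constant folding
        return ("+", a, b)
    if op == "*":
        if a == 0 or b == 0: return 0     # zero propagation: 0 * expr -> 0
        if a == 1: return b               # 1 * x -> x
        if b == 1: return a               # x * 1 -> x
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return a * b                  # constant folding
        return ("*", a, b)
    raise ValueError(op)

print(simplify(("+", 0, ("*", 1, "x"))))    # 'x'
print(simplify(("*", 0, ("+", "x", "y"))))  # 0
```

Simplification matters for derivatives in particular because naive application of the sum and product rules produces many `0 * expr` and `0 + expr` terms; pruning them keeps compiled gradients small and fast.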