Parameters

Updatable constants for fast re-solves
Published

February 8, 2026

1 Overview

Parameter objects act like constants in your optimization problem but can be updated between solves without rebuilding the problem structure. This is crucial for:

  • Sensitivity Analysis: Solving the same problem with slightly different inputs.
  • Real-time Optimization: Updating current state or prices.
  • Efficient Re-solving: Avoiding the overhead of reconstructing the expression tree.
A minimal example:

from optyx import Parameter, Variable, Problem

x = Variable("x")
p = Parameter("price", value=10.0)

prob = Problem().minimize(p * x).subject_to(x >= 5)
prob.solve()

2 Scalar Parameters

A scalar Parameter holds a single numeric value; calling set() updates it in place so the next solve reuses the same problem structure.

# Create a parameter
limit = Parameter("limit", value=100.0)

# Use in constraints
prob.subject_to(x <= limit)

# Solve
prob.solve()

# Update and re-solve
limit.set(120.0)
prob.solve()
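The update-in-place behavior can be sketched with a toy stand-in (ScalarParam and the lambda below are illustrative only, not part of optyx): an expression captures a reference to the parameter object, not a copy of its value, so re-evaluating after set() sees the new number without rebuilding anything.

```python
class ScalarParam:
    """Toy stand-in for a Parameter: a mutable box holding one number."""
    def __init__(self, name, value):
        self.name = name
        self.value = value

    def set(self, value):
        self.value = value

limit = ScalarParam("limit", 100.0)

# The "constraint" closes over the parameter object itself,
# so it always reads the current value when evaluated.
feasible = lambda x: x <= limit.value

print(feasible(110))   # False: 110 <= 100
limit.set(120.0)
print(feasible(110))   # True: 110 <= 120
```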

3 Vector Parameters

VectorParameter holds an array of values, useful for price vectors or demand forecasts.

from optyx import VectorParameter, VectorVariable

# Create vector parameter
prices = VectorParameter("prices", values=[10, 20, 30])
x = VectorVariable("x", 3, lb=0)

# Use in objective (dot product)
revenue = prices @ x
prob.maximize(revenue)

# Update all values at once
prices.set([12, 22, 32])
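The effect of a vector update on a dot-product objective can be checked with plain NumPy (the arrays below mirror the example's numbers but are not optyx objects):

```python
import numpy as np

prices = np.array([10.0, 20.0, 30.0])
x = np.array([1.0, 2.0, 3.0])          # a candidate allocation

revenue = float(prices @ x)            # 10*1 + 20*2 + 30*3 = 140
print(revenue)

prices = np.array([12.0, 22.0, 32.0])  # updated price vector
print(float(prices @ x))               # 12*1 + 22*2 + 32*3 = 152
```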

4 Matrix Parameters

MatrixParameter holds a 2D array, useful for covariance matrices or cost grids.

from optyx import MatrixParameter, VectorVariable
import numpy as np

# Covariance matrix
cov = MatrixParameter("Sigma", values=np.eye(3), symmetric=True)
x = VectorVariable("x", 3, lb=0, ub=1)

# Math-like quadratic form: x · (Σx)
risk = x.dot(cov @ x)

# Update matrix (shape must match the original 3x3)
new_cov = np.array([[1.0, 0.2, 0.0],
                    [0.2, 1.0, 0.3],
                    [0.0, 0.3, 1.0]])
cov.set(new_cov)
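The quadratic form x · (Σx) can be evaluated numerically with NumPy to see what the expression computes (the identity Σ and the candidate x below are illustrative values, not optyx objects):

```python
import numpy as np

sigma = np.eye(3)                  # identity covariance, as in the example
x = np.array([0.5, 0.5, 0.0])      # a candidate portfolio

# x · (Σ x) is the quadratic form x^T Σ x; with Σ = I it
# reduces to the squared norm of x: 0.25 + 0.25 + 0.0 = 0.5.
risk = float(x @ (sigma @ x))
print(risk)   # 0.5
```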

5 Performance Benefits

Using Parameter is significantly faster than building a new Problem instance when only the numerical data changes: the solver reuses the problem's compiled structure (Jacobian/Hessian sparsity patterns) and updates only the values that changed.
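The structural-reuse idea can be illustrated with SciPy's sparse matrices (a sketch of the general technique, not of optyx internals): the sparsity pattern is built once, and between "solves" only the numeric `.data` array is overwritten while the index structure stays fixed.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Build the Jacobian structure once.
rows = np.array([0, 0, 1])
cols = np.array([0, 1, 1])
vals = np.array([2.0, -1.0, 3.0])
J = csr_matrix((vals, (rows, cols)), shape=(2, 2))

# "Re-solve" with new data: overwrite the values in place,
# leaving the indices/indptr (the sparsity pattern) untouched.
J.data[:] = [4.0, -2.0, 6.0]
print(J.toarray())   # [[ 4. -2.]
                     #  [ 0.  6.]]
```

Skipping the symbolic factorization and pattern analysis on every re-solve is where the bulk of the speedup comes from.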