Working with 2D arrays of decision variables using MatrixVariable
Published: February 8, 2026
1 Introduction
Many optimization problems have a naturally 2D structure: assignment problems, transportation networks, scheduling matrices, and covariance models. MatrixVariable provides a clean way to work with 2D arrays of decision variables.
By the end of this tutorial, you’ll understand how to:
Create 2D matrices of decision variables
Use 2D indexing and row/column slicing
Work with matrix operations (transpose, diagonal, trace)
Build matrix-vector products for quadratic forms
2 Creating Matrix Variables
```python
from optyx import MatrixVariable

# 3x4 matrix of decision variables
A = MatrixVariable("A", rows=3, cols=4, lb=0)

print(f"Matrix: {A.name}")
print(f"Shape: {A.shape}")
print(f"Total elements: {A.rows * A.cols}")
```
Matrix: A
Shape: (3, 4)
Total elements: 12
Elements are named with 2D indices: A[0,0], A[0,1], …, A[2,3].
Tip: Symmetric Matrices
For covariance or distance matrices, use symmetric=True to enforce A[i,j] == A[j,i]; see the Symmetric Matrices section below.
4 Row and Column Slicing
Row slices like A[0, :] and column slices like A[:, 1] return VectorVariable objects, so you can use all vector operations:
```python
import numpy as np

# Sum of first row
row_sum = A[0, :].sum()

# Dot product with a column
coeffs = np.array([1.0, 2.0, 3.0])
col_dot = coeffs @ A[:, 1]

print(f"Row sum type: {type(row_sum).__name__}")
print(f"Column dot type: {type(col_dot).__name__}")
```
Row sum type: VectorSum
Column dot type: LinearCombination
5 Submatrix Slicing
Extract rectangular submatrices:
```python
# Top-left 2x2 submatrix
submatrix = A[:2, :2]
print(f"Submatrix shape: {submatrix.shape}")

# Specific rows and columns
middle = A[1:3, 1:3]
print(f"Middle shape: {middle.shape}")
```
Submatrix shape: (2, 2)
Middle shape: (2, 2)
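If the slicing semantics mirror NumPy's, as the shapes above suggest, the same shapes fall out of slicing a plain numeric array. A quick sanity check under that assumption:

```python
import numpy as np

# A concrete 3x4 array standing in for the 3x4 MatrixVariable above
A = np.arange(12).reshape(3, 4)

print(A[:2, :2].shape)    # (2, 2) - top-left submatrix
print(A[1:3, 1:3].shape)  # (2, 2) - rows 1-2, columns 1-2
```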
6 Matrix Transpose
The .T property returns a transposed view:
```python
# Original: 3x4
print(f"Original shape: {A.shape}")

# Transposed: 4x3
A_T = A.T
print(f"Transposed shape: {A_T.shape}")

# Elements are the same, just reordered
print(f"A[1,2] is same as A.T[2,1]: {A[1,2].name} vs {A_T[2,1].name}")
```
Original shape: (3, 4)
Transposed shape: (4, 3)
A[1,2] is same as A.T[2,1]: A[1,2] vs A[1,2]
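The "view" behavior (same elements, no copy) is the same convention NumPy uses for ndarray.T. Assuming the analogy holds, here is what a transposed view means for a plain array:

```python
import numpy as np

M = np.arange(12).reshape(3, 4)
M_T = M.T

print(M_T.shape)                 # (4, 3)
print(M[1, 2] == M_T[2, 1])      # True - same element, reindexed
print(np.shares_memory(M, M_T))  # True - a view, not a copy
```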
7 Iteration
Iterate over rows, or use rows_iter() and cols_iter() for explicit control:
```python
B = MatrixVariable("B", 2, 3)

# Iterate over rows (default behavior)
print("Iterating over rows:")
for row in B:
    print(f"  {row.name}: {len(row)} elements")
```
Iterating over rows:
B[0,:]: 3 elements
B[1,:]: 3 elements
```python
# Iterate over rows
print("\nRows:")
for i, row in enumerate(B.rows_iter()):
    print(f"  Row {i}: {[v.name for v in row]}")
```
```python
# Iterate over columns
print("\nColumns:")
for j, col in enumerate(B.cols_iter()):
    print(f"  Col {j}: {[v.name for v in col]}")
```
Columns:
Col 0: ['B[0,0]', 'B[1,0]']
Col 1: ['B[0,1]', 'B[1,1]']
Col 2: ['B[0,2]', 'B[1,2]']
8 Diagonal and Trace
For square matrices, extract the diagonal or compute the trace:
```python
# Square matrix
S = MatrixVariable("S", 3, 3)

# Main diagonal as VectorVariable
diag = S.diagonal()
print(f"Diagonal: {[v.name for v in diag]}")

# Trace: sum of diagonal elements
tr = S.trace()
print(f"Trace type: {type(tr).__name__}")
```
9 Matrix-Vector Products
Multiply a constant NumPy matrix by a VectorVariable to create linear combinations:
```python
from optyx import VectorVariable
import numpy as np

# Constant coefficient matrix (3x4)
A = np.array([
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
])

# Decision variables
x = VectorVariable("x", 4)

# Matrix-vector product: A @ x (math-like!)
result = A @ x
print(f"Result type: {type(result).__name__}")
print(f"Result length: {len(result)}")
```
Result type: MatrixVectorProduct
Result length: 3
This creates a VectorExpression where each element is a linear combination of the input vector.
Note: Constant vs Variable Matrices
Both matmul and the A @ x operator work when A is a constant NumPy array. If you need a MatrixVariable (where the matrix elements are themselves decision variables), the product would create quadratic expressions; use the pattern in the transportation example instead.
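To see concretely what each element of A @ x expands to, here is the same 3x4 coefficient matrix applied to a concrete numeric vector (my choice, standing in for the decision variables) in plain NumPy. Element i is exactly the linear combination sum over j of A[i,j] * x[j]:

```python
import numpy as np

A = np.array([
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
])
x = np.array([1.0, 0.0, 2.0, 1.0])  # stand-in values for the decision vector

result = A @ x
# Each element is the corresponding row's linear combination of x
expanded = np.array([sum(A[i, j] * x[j] for j in range(4)) for i in range(3)])

print(result)                          # [11. 27. 43.]
print(np.allclose(result, expanded))   # True
```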
10 Quadratic Forms
For portfolio variance (wᵀΣw), use the math-like syntax w.dot(Σ @ w):
```python
from optyx import VectorVariable
import numpy as np

# 5 assets
n = 5
weights = VectorVariable("w", n, lb=0, ub=1)

# Covariance matrix (constant, not a variable)
np.random.seed(42)
cov_data = np.random.randn(n, n)
cov_matrix = cov_data @ cov_data.T / n  # Positive semi-definite

# Quadratic form: w · (Σw) = wᵀΣw
portfolio_variance = weights.dot(cov_matrix @ weights)
print(f"Quadratic form type: {type(portfolio_variance).__name__}")
```
Quadratic form type: QuadraticForm
Note: Math-like Syntax with Optimized Gradients
The expression w.dot(Σ @ w) reads as “w dot (Σ times w)” = wᵀΣw.
Optyx automatically detects this pattern and creates a QuadraticForm expression with O(1) analytic gradients: ∇(xᵀQx) = (Q + Qᵀ)x. For symmetric matrices, this simplifies to 2Qx.
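The gradient identity is easy to verify numerically with central finite differences, independent of Optyx, using plain NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
Q = rng.standard_normal((n, n))  # deliberately not symmetric
x = rng.standard_normal(n)

f = lambda v: v @ Q @ v          # f(x) = xᵀQx
analytic = (Q + Q.T) @ x         # ∇(xᵀQx) = (Q + Qᵀ)x

# Central differences along each coordinate direction
eps = 1e-6
numeric = np.array([
    (f(x + eps * np.eye(n)[i]) - f(x - eps * np.eye(n)[i])) / (2 * eps)
    for i in range(n)
])

print(np.allclose(analytic, numeric, atol=1e-5))  # True
```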
11 Complete Example: Transportation Problem
Ship goods from 3 warehouses to 4 stores, minimizing total cost.
```python
from optyx import MatrixVariable, Problem
import numpy as np

# Problem data
n_warehouses = 3
n_stores = 4

# Supply at each warehouse
supply = np.array([100, 150, 200])

# Demand at each store
demand = np.array([80, 120, 100, 150])

# Shipping cost per unit from warehouse i to store j
costs = np.array([
    [4, 6, 9, 5],  # From warehouse 0
    [5, 3, 7, 8],  # From warehouse 1
    [6, 8, 4, 3],  # From warehouse 2
])

# Decision variables: shipments[i,j] = units from warehouse i to store j
shipments = MatrixVariable("ship", n_warehouses, n_stores, lb=0)

# Objective: minimize total shipping cost
total_cost = 0
for i in range(n_warehouses):
    for j in range(n_stores):
        total_cost = total_cost + costs[i, j] * shipments[i, j]

# Build problem
problem = Problem("transportation").minimize(total_cost)

# Supply constraints: can't ship more than available
for i in range(n_warehouses):
    problem = problem.subject_to(shipments[i, :].sum() <= supply[i])

# Demand constraints: must meet each store's demand
for j in range(n_stores):
    problem = problem.subject_to(shipments[:, j].sum().eq(demand[j]))

# Solve
solution = problem.solve()

print("=" * 60)
print("TRANSPORTATION SOLUTION")
print("=" * 60)
print(f"Status: {solution.status}")
print(f"Total Cost: ${solution.objective_value:.2f}")
print()

# Display shipment matrix
print("Shipments (warehouse → store):")
print(" " * 13, end="")  # align header with row labels
for j in range(n_stores):
    print(f"Store {j:2d} ", end="")
print()
for i in range(n_warehouses):
    print(f"Warehouse {i}: ", end="")
    for j in range(n_stores):
        val = solution[f'ship[{i},{j}]']
        print(f"{val:7.1f} ", end="")
    print()
```
============================================================
TRANSPORTATION SOLUTION
============================================================
Status: SolverStatus.OPTIMAL
Total Cost: $1660.00
Shipments (warehouse → store):
Store 0 Store 1 Store 2 Store 3
Warehouse 0: 50.0 0.0 0.0 50.0
Warehouse 1: 30.0 120.0 0.0 0.0
Warehouse 2: 0.0 0.0 100.0 100.0
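You can sanity-check the printed solution with plain NumPy: the shipment matrix must respect every supply row, meet every demand column exactly, and reproduce the reported cost:

```python
import numpy as np

costs = np.array([
    [4, 6, 9, 5],
    [5, 3, 7, 8],
    [6, 8, 4, 3],
])
supply = np.array([100, 150, 200])
demand = np.array([80, 120, 100, 150])

# Shipment matrix as printed above
ship = np.array([
    [50.0,   0.0,   0.0,  50.0],
    [30.0, 120.0,   0.0,   0.0],
    [ 0.0,   0.0, 100.0, 100.0],
])

assert np.all(ship.sum(axis=1) <= supply)     # supply respected
assert np.allclose(ship.sum(axis=0), demand)  # demand met exactly
print((costs * ship).sum())                   # 1660.0
```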
12 Complete Example: Assignment Problem
Assign 4 workers to 4 tasks to minimize total cost.
```python
from optyx import MatrixVariable, Problem
import numpy as np

n = 4

# Cost matrix: cost[i,j] = cost of worker i doing task j
cost_matrix = np.array([
    [9, 2, 7, 8],
    [6, 4, 3, 7],
    [5, 8, 1, 8],
    [7, 6, 9, 4],
])

# Binary assignment matrix
assign = MatrixVariable("assign", n, n, domain="binary")

# Objective: minimize total cost
total_cost = 0
for i in range(n):
    for j in range(n):
        total_cost = total_cost + cost_matrix[i, j] * assign[i, j]

problem = Problem("assignment").minimize(total_cost)

# Each worker assigned to exactly one task
for i in range(n):
    problem = problem.subject_to(assign[i, :].sum().eq(1))

# Each task assigned to exactly one worker
for j in range(n):
    problem = problem.subject_to(assign[:, j].sum().eq(1))

solution = problem.solve()

print("=" * 50)
print("ASSIGNMENT SOLUTION")
print("=" * 50)
print(f"Status: {solution.status}")
print(f"Total Cost: {solution.objective_value:.0f}")
print()
print("Assignments:")
for i in range(n):
    for j in range(n):
        if solution[f'assign[{i},{j}]'] > 0.5:
            print(f"  Worker {i} → Task {j} (cost: {cost_matrix[i,j]})")
/home/runner/work/optyx/optyx/src/optyx/problem.py:586: UserWarning: Variables [assign[0,0], assign[0,1], assign[0,2], assign[0,3], assign[1,0], assign[1,1], assign[1,2], assign[1,3], assign[2,0], assign[2,1], assign[2,2], assign[2,3], assign[3,0], assign[3,1], assign[3,2], assign[3,3]] have integer/binary domains but will be relaxed to continuous. linprog does not support integer programming. For true MIP, consider scipy.optimize.milp or PuLP.
return solve_lp(self, strict=strict, **kwargs)
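The relaxation warning is benign for this particular problem: the LP relaxation of the assignment polytope has integral vertices (the Birkhoff-von Neumann theorem), so the continuous solution is still a valid 0/1 assignment. You can also confirm the true optimum independently by brute force over all n! permutations, using only the standard library:

```python
from itertools import permutations

cost_matrix = [
    [9, 2, 7, 8],
    [6, 4, 3, 7],
    [5, 8, 1, 8],
    [7, 6, 9, 4],
]
n = 4

# Try every assignment of workers to tasks (24 permutations for n=4)
best_cost, best_perm = min(
    (sum(cost_matrix[i][p[i]] for i in range(n)), p)
    for p in permutations(range(n))
)

print(best_cost)  # 13
print(best_perm)  # (1, 0, 2, 3): worker i -> task best_perm[i]
```

For larger instances, replace the brute force with scipy.optimize.linear_sum_assignment, which solves the same problem in polynomial time.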
13 Symmetric Matrices
For problems like covariance estimation or distance matrices:
```python
# Symmetric 3x3 matrix
Sigma = MatrixVariable("Sigma", 3, 3, symmetric=True)

# Check symmetry: same variable for [i,j] and [j,i]
print(f"Sigma[0,1] name: {Sigma[0,1].name}")
print(f"Sigma[1,0] name: {Sigma[1,0].name}")
print(f"Same object: {Sigma[0,1] is Sigma[1,0]}")
```
Sigma[0,1] name: Sigma[0,1]
Sigma[1,0] name: Sigma[0,1]
Same object: True
Since a symmetric n×n matrix needs only n(n+1)/2 distinct variables instead of n², this roughly halves the variable count for large matrices.
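The count is quick to check: n(n+1)/2 entries on or above the diagonal determine the whole matrix, and the ratio to n² approaches one half as n grows:

```python
for n in (3, 10, 100):
    full = n * n            # independent entries without symmetry
    sym = n * (n + 1) // 2  # upper triangle including the diagonal
    print(n, full, sym)
# 3 9 6
# 10 100 55
# 100 10000 5050
```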
14 Performance Tips
Use row/column slicing for aggregations:
```python
# Good: single slice operation
row_total = A[0, :].sum()

# Slower: element-wise loop
row_total = sum(A[0, j] for j in range(A.cols))
```
Use QuadraticForm for quadratic objectives:
```python
# Good: analytic gradient
variance = QuadraticForm(x, cov_matrix)

# Slower: builds a full expression tree
variance = sum(x[i] * cov_matrix[i, j] * x[j] for i in range(n) for j in range(n))
```
Leverage symmetry when applicable:
```python
# Good: half the variables
A = MatrixVariable("A", n, n, symmetric=True)

# Wasteful: full n² variables with equality constraints
A = MatrixVariable("A", n, n)
# + manual constraints for A[i,j] == A[j,i]
```