4.2. Quadratic Program
Quadratic programming adds curvature to the objective while keeping the constraints affine.
The standard form is

\[
\begin{aligned}
\underset{x}{\text{minimize}} \quad & \tfrac{1}{2} x^\top P x + q^\top x \\
\text{subject to} \quad & Gx \le h, \\
& Ax = b.
\end{aligned}
\]
Here \(x \in \mathbb{R}^n\) is the decision vector, \(P \in \mathbb{S}_+^n\) is a positive semidefinite matrix that defines the quadratic curvature, \(q \in \mathbb{R}^n\) is the linear term, \(Gx \le h\) collects the inequalities, and \(Ax = b\) collects the affine equalities.
The matrix \(P\) is what makes this problem different from a linear program. If \(P\) is positive semidefinite, the objective stays convex, so we still have a convex optimization problem.
Step 1: Generate a convex quadratic instance
We use `np.random.seed(1)` so the same synthetic problem is produced every time. The key construction is `P = P.T @ P`. That guarantees \(P \succeq 0\), which makes the quadratic term convex.
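As a quick sanity check, independent of the admm API, we can confirm numerically that this construction yields a positive semidefinite matrix: the eigenvalues of \(M^\top M\) are the squared singular values of \(M\), so none can be negative.

```python
import numpy as np

np.random.seed(1)
n = 10
P = np.random.randn(n, n)
P = P.T @ P  # same construction as in the example

# Eigenvalues of the symmetric matrix P, in ascending order.
eigvals = np.linalg.eigvalsh(P)

# All eigenvalues are nonnegative (up to floating-point noise),
# so P is positive semidefinite and x^T P x is convex in x.
print(eigvals.min() >= -1e-8)  # True
```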
```python
import numpy as np
import admm

np.random.seed(1)

m = 15
n = 10
p = 5

P = np.random.randn(n, n)
P = P.T @ P
q = np.random.randn(n)
G = np.random.randn(m, n)
h = G @ np.random.randn(n)
A = np.random.randn(p, n)
b = np.random.randn(p)
```
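Note that `h = G @ np.random.randn(n)` bakes a point into the inequalities: if we keep that intermediate vector, call it `z` (a name introduced here for illustration), then \(Gz \le h\) holds with equality, so the inequality system is never vacuous. Joint feasibility with \(Ax = b\) is not guaranteed by this construction alone, but the solver's success status below confirms this particular instance is feasible.

```python
import numpy as np

np.random.seed(1)
m, n, p = 15, 10, 5
P = np.random.randn(n, n)
P = P.T @ P
q = np.random.randn(n)
G = np.random.randn(m, n)
z = np.random.randn(n)  # the intermediate point; h is defined from it
h = G @ z
A = np.random.randn(p, n)
b = np.random.randn(p)

# z satisfies every inequality (with equality), so Gx <= h has a solution.
print(np.all(G @ z <= h + 1e-12))  # True
```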
Step 2: Create the model and variable
This example needs one vector decision variable \(x \in \mathbb{R}^{10}\).
```python
model = admm.Model()
x = admm.Var("x", n)
```
Step 3: Write the objective
The symbolic API follows the math almost verbatim:
- `0.5 * x.T @ P @ x` is the quadratic term \(\tfrac{1}{2} x^\top P x\).
- `q.T @ x` is the linear term \(q^\top x\).
Putting those two pieces together gives the full objective.
```python
model.setObjective(0.5 * x.T @ P @ x + q.T @ x)
```
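To see what this expression computes, we can mirror the objective with plain NumPy (the helper `f` below is introduced here for illustration; it is not part of the admm API). Constraining the problem can only raise the optimal value above the unconstrained minimum, which solves \(Px = -q\):

```python
import numpy as np

np.random.seed(1)
n = 10
P = np.random.randn(n, n)
P = P.T @ P
q = np.random.randn(n)

def f(x):
    """Plain-NumPy mirror of the objective 0.5 x^T P x + q^T x."""
    return 0.5 * x @ P @ x + q @ x

# For this seed P is (almost surely) nonsingular, so the unconstrained
# minimizer is the solution of P x = -q.
x_unc = np.linalg.solve(P, -q)

# The unconstrained minimum is no larger than the objective at any
# other point, e.g. the origin, where f(0) = 0.
print(f(x_unc) <= f(np.zeros(n)))  # True
```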
Step 4: Add the constraints
The two constraint families also map directly into symbolic expressions:
- \(Gx \le h\) becomes `model.addConstr(G @ x <= h)`.
- \(Ax = b\) becomes `model.addConstr(A @ x == b)`.
```python
model.addConstr(G @ x <= h)
model.addConstr(A @ x == b)
```
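Because \(A\) is a random \(5 \times 10\) matrix, it almost surely has full row rank, so the equality system \(Ax = b\) is consistent for any \(b\). A least-squares solve makes this concrete (a standalone sketch; the seed here is arbitrary):

```python
import numpy as np

np.random.seed(0)  # any seed; full row rank holds almost surely
p, n = 5, 10
A = np.random.randn(p, n)
b = np.random.randn(p)

# With p < n and full row rank, lstsq returns the minimum-norm
# exact solution of the underdetermined system A x = b.
x_part, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x_part, b))  # True
```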
Step 5: Solve and inspect the result
Once the model is complete, we solve it and print the standard solver outputs.
```python
model.optimize()

print(" * model.ObjVal: ", model.ObjVal)  # Expected: 86.89077551539528
print(" * model.StatusString: ", model.StatusString)  # Expected: SOLVE_OPT_SUCCESS
```
Complete runnable example:
```python
import numpy as np
import admm

np.random.seed(1)

m = 15
n = 10
p = 5

P = np.random.randn(n, n)
P = P.T @ P
q = np.random.randn(n)
G = np.random.randn(m, n)
h = G @ np.random.randn(n)
A = np.random.randn(p, n)
b = np.random.randn(p)

model = admm.Model()
x = admm.Var("x", n)

model.setObjective(0.5 * x.T @ P @ x + q.T @ x)
model.addConstr(G @ x <= h)
model.addConstr(A @ x == b)

model.optimize()

print(" * model.ObjVal: ", model.ObjVal)  # Expected: 86.89077551539528
print(" * model.StatusString: ", model.StatusString)  # Expected: SOLVE_OPT_SUCCESS
```
This example is available as a standalone script in the examples/ folder of the ADMM repository:
```shell
python examples/quadratic_program.py
```
The solution minimizes \(\tfrac{1}{2}x^\top P x + q^\top x\) over the polyhedron \(\{x : Gx \le h,\; Ax = b\}\). Notice that the ADMM code is a line-by-line transcription of that formulation — no manual dualization or reformulation required.
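For intuition about what the solver handles for us, consider dropping the inequalities: the remaining equality-constrained QP has a closed-form solution via its KKT system
\(\begin{bmatrix} P & A^\top \\ A & 0 \end{bmatrix}\begin{bmatrix} x \\ \nu \end{bmatrix} = \begin{bmatrix} -q \\ b \end{bmatrix}\).
The sketch below solves this relaxation directly with NumPy; it illustrates the optimality conditions, not the ADMM algorithm itself.

```python
import numpy as np

np.random.seed(1)
m, n, p = 15, 10, 5
P = np.random.randn(n, n)
P = P.T @ P
q = np.random.randn(n)
G = np.random.randn(m, n)
h = G @ np.random.randn(n)
A = np.random.randn(p, n)
b = np.random.randn(p)

# KKT system of the equality-constrained relaxation (inequalities dropped):
#   [P  A^T] [x ]   [-q]
#   [A   0 ] [nu] = [ b]
K = np.block([[P, A.T], [A, np.zeros((p, p))]])
rhs = np.concatenate([-q, b])
sol = np.linalg.solve(K, rhs)
x_eq, nu = sol[:n], sol[n:]

# The relaxation satisfies the equalities exactly; its optimal value
# lower-bounds the optimal value of the full QP.
print(np.allclose(A @ x_eq, b))  # True
```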