Tutorial and examples of sum of squares (SOS) programming, with documentation on using SOS solvers
Introduction
In this guide we explain how to perform basic sum of squares (SOS) computations,
and provide numerical examples of sum of squares programming/optimization using
solvers. Depending on which mathematical computer language/environment you are
most familiar with, you may use either of the following tools:
At a high level, these tools translate an SOS problem, expressed in
terms of polynomials, into a semidefinite program (SDP),
which is then solved numerically by a backend SDP solver.
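To make this translation concrete, here is a small solver-free sketch using only numpy and sympy (not either of the tools above). The polynomial and its Gram matrix are a standard textbook example, not taken from this page: a polynomial \(p\) is SOS exactly when \(p = m^\top Q m\) for some positive semidefinite matrix \(Q\) and monomial vector \(m\), and the SDP searches for such a \(Q\).

```python
import numpy as np
import sympy as sp

# Classic example: p = 2x^4 + 2x^3 y - x^2 y^2 + 5y^4 is SOS iff there is a
# positive semidefinite "Gram" matrix Q with p = m^T Q m for m = (x^2, y^2, xy).
x, y = sp.symbols('x y')
p = 2*x**4 + 2*x**3*y - x**2*y**2 + 5*y**4

m = sp.Matrix([x**2, y**2, x*y])      # monomial basis vector
Q = sp.Matrix([[2, -3, 1],            # one feasible Gram matrix for p;
               [-3, 5, 0],            # it corresponds to the decomposition
               [1, 0, 5]])            # p = (2x^2-3y^2+xy)^2/2 + (y^2+3xy)^2/2

# 1) m^T Q m reproduces p exactly (the linear constraints of the SDP)
assert sp.expand((m.T * Q * m)[0] - p) == 0

# 2) Q is positive semidefinite (the conic constraint of the SDP)
eigvals = np.linalg.eigvalsh(np.array(Q.tolist(), dtype=float))
print(min(eigvals) >= -1e-9)  # True: Q is PSD, so p is a sum of squares
```

An SOS tool automates exactly this: it builds \(m\), imposes the linear coefficient-matching constraints, and lets the SDP solver find a PSD \(Q\).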
In the simple examples below, we provide implementations using each of
these tools. Just click on the corresponding tab to see the code,
which you can then copy and paste into the program of your choice.
We point out that other SOS tools are also available, mainly in MATLAB, such as
YALMIP and GloptiPoly.
Installation and configuration
Here are simple instructions to install and configure the desired
tools. Recall that for either choice, a backend SDP solver is needed,
so you may need to install that too. Please see the full documentation
of each package for additional details.
In the next example we consider the variables \(s\), \(t\) as parameters, and find
values for them such that the following polynomial is a sum-of-squares.
\[p = s x^6+t y^6-x^4 y^2-x^2 y^4-x^4+3 x^2 y^2-y^4-x^2-y^2+1\]
We can also do parameter optimization with \(s\) and \(t\). Among the values of \(s\)
and \(t\) that make the polynomial a sum-of-squares, we look for those with the
minimum value of \(s+t\).
Finally, we consider a problem involving multiple SOS constraints. Suppose we
want to find the smallest \(t\) so that both \(p_1 = t(1+xy)^2 - xy + (1-y)^2\) and
\(p_2 = (1-xy)^2+xy+t(1+y)^2\) are sums of squares.
Consider the polynomial \(p = x^4+x^2-3 x^2 z^2+z^6\). We may find a lower
bound on \(p\) by looking for the largest value of \(t\) so that \(p - t\) is a
sum-of-squares. This can be formulated using techniques similar to the previous
section.
Consider minimizing a polynomial \(f(x)\) subject to equations \(h_i(x)=0\) and
inequalities \(g_j(x)\geq 0\). Given a degree bound \(2d\), we can compute a lower
bound based on SOS. The next example has only equality constraints:
\[\min_{x,y} \quad x-y \quad\text{ s.t. }\quad x^2 - x = y^2 - y = 0.\]
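Before applying SOS, note that \(x^2 - x = 0\) and \(y^2 - y = 0\) force \(x, y \in \{0, 1\}\), so the true minimum can be found by enumeration; the SOS lower bound can then be compared against this exact value. A quick check in plain Python, independent of any SOS tool:

```python
from itertools import product

# x^2 - x = 0 and y^2 - y = 0 mean x and y each lie in {0, 1},
# so the feasible set consists of exactly four points.
best = min(x - y for x, y in product([0, 1], repeat=2))
print(best)  # -1, attained at (x, y) = (0, 1)
```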
Now consider a problem with equalities and inequalities:
\[\min_{x,y} \quad x+y
\quad\text{ s.t. }\quad
x^2 + y^2 = 1, \;
y - x^2 = 0.5, \;
x \geq 0, \;
y \geq 0.5.\]
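This particular instance can also be solved in closed form, which is useful for sanity-checking an SOS bound: substituting \(x^2 = y - 0.5\) into \(x^2 + y^2 = 1\) gives \(y^2 + y - 1.5 = 0\), whose positive root is \(y = (\sqrt{7}-1)/2\), and \(x \geq 0\) then pins down \(x = \sqrt{y - 0.5}\). A quick numerical check in plain Python:

```python
import math

# Eliminate x: x^2 = y - 0.5, so x^2 + y^2 = 1 becomes y^2 + y - 1.5 = 0.
y = (-1 + math.sqrt(7)) / 2   # positive root; it also satisfies y >= 0.5
x = math.sqrt(y - 0.5)        # x >= 0 selects the nonnegative root

# Verify feasibility and report the objective at the unique feasible point
assert abs(x**2 + y**2 - 1) < 1e-12
assert abs(y - x**2 - 0.5) < 1e-12
print(x + y)  # approximately 1.3911
```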
When solving constrained optimization problems, one usually also has to specify
a degree bound indicating the level of the sum-of-squares hierarchy to use. The
higher the degree, the tighter the relaxation, but at the cost of
increased computation time.
Example 5: Dual formulation and Pseudoexpectations
The dual of a SOS problem can be interpreted as optimizing linear functionals of
low-degree polynomials, also known as
pseudoexpectations. Using
SumOfSquares.py, we can
specify constraints in terms of pseudoexpectations, as well as extract the
pseudoexpectation corresponding to the dual of a SOS constraint.
The following example explicitly computes an infeasibility certificate for
writing the Motzkin polynomial \(p(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1\) as
a sum of squares: a pseudoexpectation with \(\tilde{\mathbb{E}}[p(x,y)] = -1\).
Every SOS constraint has an associated pseudoexpectation, which we can access
for use in rounding algorithms. The next example presents both the primal and dual
SOS relaxations of the following optimization problem over the sphere: