# Getting started with Sum of Squares

## Introduction

In this guide we explain how to perform basic sum of squares (SOS) computations, and provide numerical examples of sum of squares programming/optimization using solvers. Depending on which mathematical computing language/environment you are most familiar with, you may use any of the following tools:

At a high level, these tools parse an SOS problem, expressed in terms of polynomials, into a semidefinite program (SDP), which is then solved numerically by a backend SDP solver.
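To make this concrete: a polynomial $$p$$ of degree $$2d$$ is a sum of squares exactly when

$p(x) = m(x)^\top Q\, m(x), \qquad Q \succeq 0,$

where $$m(x)$$ is the vector of monomials of degree at most $$d$$. Matching coefficients on both sides gives linear constraints on the entries of $$Q$$, which together with $$Q \succeq 0$$ form the SDP. Any feasible $$Q$$ with a factorization $$Q = L^\top L$$ then yields an explicit decomposition

$p(x) = \lVert L\, m(x)\rVert^2 = \sum_i \big(L\, m(x)\big)_i^2.$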

In the simple examples below, we provide implementations using either of these tools. Just click on the corresponding tab to see the code, which you can then copy and paste into the program of your choice.

For more advanced problems, please look at the exercises for the AMS Short Course on Sums of Squares.

We point out that other SOS tools are also available, mainly in MATLAB, such as YALMIP and GloptiPoly.

## Installation and configuration

Here are simple instructions to install and configure the desired tools. Recall that for either choice, a backend SDP solver is needed, so you may need to install that too. Please see the full documentation of each package for additional details.

### Main package

For installation, type the following commands.

Once installed, the following lines must be entered at the beginning of each session.

### Backend SDP Solvers

SOS computations rely on semidefinite programming (SDP) solvers. The following solvers are available as of December 2018:

## Examples

### Example 1: Checking if a polynomial is SOS

Here we find an SOS decomposition for the polynomial $$p = 2 x^4 + 2 x^3 y - x^2 y^2 + 5 y^4$$.
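One valid decomposition for this polynomial (decompositions are generally not unique) is $$p = \tfrac12(2x^2 - 3y^2 + xy)^2 + \tfrac12(y^2 + 3xy)^2$$. Independent of any solver, such an identity can be sanity-checked by comparing both sides at random points:

```python
import random

def p(x, y):
    return 2*x**4 + 2*x**3*y - x**2*y**2 + 5*y**4

def sos(x, y):
    # One SOS decomposition: p = 1/2*(2x^2 - 3y^2 + xy)^2 + 1/2*(y^2 + 3xy)^2
    return 0.5*(2*x**2 - 3*y**2 + x*y)**2 + 0.5*(y**2 + 3*x*y)**2

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(p(x, y) - sos(x, y)) < 1e-8 * max(1.0, abs(p(x, y)))
print("decomposition verified at 1000 random points")
```

A solver returns such a decomposition by factoring the Gram matrix $$Q$$ of the underlying SDP.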

We next consider the Motzkin polynomial, $$p = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1$$, which is nonnegative but admits no SOS decomposition.
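Nonnegativity of the Motzkin polynomial follows from the AM-GM inequality applied to the three terms $$x^4y^2$$, $$x^2y^4$$ and $$1$$: their mean is at least $$(x^6y^6)^{1/3} = x^2y^2$$. A quick numerical illustration of this fact, using only the standard library:

```python
import itertools

def motzkin(x, y):
    return x**4*y**2 + x**2*y**4 - 3*x**2*y**2 + 1

# Nonnegativity follows from AM-GM: (x^4 y^2 + x^2 y^4 + 1)/3 >= (x^6 y^6)^(1/3)
# = x^2 y^2. We check it on a grid, and that the minimum value 0 is attained
# at the points with |x| = |y| = 1.
grid = [i / 10 for i in range(-30, 31)]
assert all(motzkin(x, y) >= -1e-12 for x, y in itertools.product(grid, grid))
assert motzkin(1.0, 1.0) == 0 and motzkin(-1.0, 1.0) == 0
print("Motzkin polynomial is nonnegative on the grid; vanishes at |x| = |y| = 1")
```

A solver-based SOS check on this polynomial is expected to report infeasibility, since nonnegativity alone does not imply an SOS decomposition.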

### Example 2: Parametric SOS problems

In the next example we consider the variables $$s$$, $$t$$ as parameters, and find values for them such that the following polynomial is a sum-of-squares.

$p = s x^6+t y^6-x^4 y^2-x^2 y^4-x^4+3 x^2 y^2-y^4-x^2-y^2+1$

We can also do parameter optimization with $$s$$ and $$t$$. Among the values of $$s$$ and $$t$$ that make the polynomial a sum-of-squares, we look for those with the minimum value of $$s+t$$.
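It is worth knowing what makes this example delicate: at $$s = t = 1$$ the polynomial is a dehomogenization of Robinson's polynomial, which is nonnegative everywhere yet provably not a sum of squares. So grid nonnegativity is only a necessary condition for SOS-feasibility of a given $$(s, t)$$, as the following solver-free check illustrates:

```python
import itertools

def p(x, y, s=1, t=1):
    return (s*x**6 + t*y**6 - x**4*y**2 - x**2*y**4 - x**4
            + 3*x**2*y**2 - y**4 - x**2 - y**2 + 1)

# At s = t = 1 this is a dehomogenization of Robinson's polynomial:
# nonnegative everywhere, yet NOT a sum of squares. Nonnegativity on a
# grid is therefore necessary but not sufficient for SOS-feasibility.
grid = [i / 8 for i in range(-16, 17)]
assert all(p(x, y) >= -1e-9 for x, y in itertools.product(grid, grid))
print("p(x, y; 1, 1) is nonnegative on the sample grid")
```

Certifying which $$(s, t)$$ actually yield an SOS polynomial is exactly what the parametric SOS program above decides.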

Finally, we consider a problem involving multiple SOS constraints. Suppose we want to find the smallest $$t$$ so that both $$p_1 = t(1+xy)^2 - xy + (1-y)^2$$ and $$p_2 = (1-xy)^2+xy+t(1+y)^2$$ are sums of squares.
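Before handing this to a solver, note that $$t = 1$$ is feasible by hand: completing the square gives $$p_1 = (xy + \tfrac12)^2 + (1-y)^2 + \tfrac34$$ and $$p_2 = (xy - \tfrac12)^2 + (1+y)^2 + \tfrac34$$, so the optimal $$t$$ is at most $$1$$. A quick identity check:

```python
import random

def p1(x, y, t):
    return t*(1 + x*y)**2 - x*y + (1 - y)**2

def p2(x, y, t):
    return (1 - x*y)**2 + x*y + t*(1 + y)**2

# Completing the square shows t = 1 is feasible:
#   p1 = (xy + 1/2)^2 + (1 - y)^2 + 3/4
#   p2 = (xy - 1/2)^2 + (1 + y)^2 + 3/4
random.seed(1)
for _ in range(1000):
    x, y = random.uniform(-4, 4), random.uniform(-4, 4)
    assert abs(p1(x, y, 1) - ((x*y + 0.5)**2 + (1 - y)**2 + 0.75)) < 1e-8
    assert abs(p2(x, y, 1) - ((x*y - 0.5)**2 + (1 + y)**2 + 0.75)) < 1e-8
print("t = 1 makes both p1 and p2 explicit sums of squares (plus 3/4)")
```

The SOS program then searches below this hand-derived upper bound for the smallest feasible $$t$$.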

### Example 3: Global polynomial optimization

Consider the polynomial $$p = x^4+x^2-3 x^2 z^2+z^6$$. We may find a lower bound on $$p$$ by looking for the largest value of $$t$$ so that $$p - t$$ is a sum-of-squares. This can be formulated using techniques similar to the previous section.
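As a hand check on whatever bound the solver returns: substituting $$u = x^2$$, $$w = z^2$$ turns $$p$$ into $$u^2 + u - 3uw + w^3$$ over $$u, w \geq 0$$, and a short calculus exercise (or the numerical scan below) indicates that the global minimum of $$p$$ is $$0$$, attained at the origin and at $$x^2 = z^2 = 1$$. Any valid SOS bound therefore satisfies $$t \leq 0$$; whether the bound is tight depends on whether $$p$$ itself is SOS.

```python
import itertools

def p(x, z):
    return x**4 + x**2 - 3*x**2*z**2 + z**6

# Scan a grid: p appears globally nonnegative, vanishing at (0, 0)
# and at the points with x^2 = z^2 = 1, so the best possible lower
# bound in the SOS program is t = 0.
grid = [i / 10 for i in range(-30, 31)]
assert all(p(x, z) >= -1e-9 for x, z in itertools.product(grid, grid))
assert p(0.0, 0.0) == 0 and p(1.0, 1.0) == 0 and p(1.0, -1.0) == 0
print("grid minimum of p is 0, attained at (0,0) and at |x| = |z| = 1")
```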

Alternatively, some tools have a simplified way for finding this lower bound.

### Example 4: Constrained polynomial optimization

Consider minimizing a polynomial $$f(x)$$ subject to equations $$h_i(x)=0$$ and inequalities $$g_j(x)\geq 0$$. Given a degree bound $$2d$$, we can compute a lower bound based on SOS. The next example has only equality constraints:

$\min_{x,y} \quad x-y \quad\text{ s.t. }\quad x^2 - x = y^2 - y = 0.$
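The constraints $$x^2 = x$$ and $$y^2 = y$$ force $$x, y \in \{0, 1\}$$, so this toy problem can be solved by enumeration, giving a ground truth to compare the SOS bound against:

```python
import itertools

# x^2 = x and y^2 = y force x, y in {0, 1}, so we can enumerate the
# feasible set and read off the true optimum of x - y directly.
feasible = list(itertools.product([0, 1], repeat=2))
best = min(x - y for x, y in feasible)
print(best)        # -1, attained at (x, y) = (0, 1)
assert best == -1
```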

Now consider a problem with equalities and inequalities:

$\min_{x,y} \quad x+y \quad\text{ s.t. }\quad x^2 + y^2 = 1, \; y - x^2 = 0.5, \; x \geq 0, \; y \geq 0.5.$
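This problem can also be solved by hand, which again gives a target for the SOS bound: substituting $$x^2 = y - 0.5$$ into $$x^2 + y^2 = 1$$ gives $$y^2 + y - 1.5 = 0$$, so $$y = (\sqrt{7} - 1)/2 \approx 0.8229$$ (the negative root violates $$y \geq 0.5$$), $$x = \sqrt{y - 0.5} \approx 0.5682$$, and the optimal value is $$x + y \approx 1.3911$$:

```python
import math

# Substituting x^2 = y - 0.5 into x^2 + y^2 = 1 gives y^2 + y - 1.5 = 0.
y = (-1 + math.sqrt(7)) / 2   # positive root; the other violates y >= 0.5
x = math.sqrt(y - 0.5)        # nonnegative root, as required by x >= 0

# Verify all four constraints, then report the optimal value of x + y.
assert abs(x**2 + y**2 - 1) < 1e-12
assert abs(y - x**2 - 0.5) < 1e-12
assert x >= 0 and y >= 0.5
print(round(x + y, 4))   # 1.3911
```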

When solving constrained optimization problems, one usually also has to specify a degree bound indicating the level of the sum-of-squares hierarchy to use. The higher the degree, the better the relaxation, but this comes at the cost of increased computation time.

### Example 5: Dual formulation and Pseudoexpectations

The dual of an SOS problem can be interpreted as optimizing linear functionals of low-degree polynomials, also known as pseudoexpectations. Using SumOfSquares.py, we can specify constraints in terms of pseudoexpectations, as well as extract the pseudoexpectation corresponding to the dual of an SOS constraint.

The following example explicitly computes an infeasibility certificate for writing the Motzkin polynomial $$p(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1$$ as a sum of squares, namely a pseudoexpectation with $$\tilde{\mathbb{E}}[p(x,y)] = -1$$.

Every SOS constraint has an associated pseudoexpectation, which we can access for use in rounding algorithms. The next example presents both the primal and dual SOS relaxations of the following optimization problem over the sphere:

$\max_{x,y,z} \quad (x+\phi y)(x-\phi y)(y+\phi z)(y-\phi z)(z+\phi x)(z-\phi x) \quad\text{ s.t. }\quad x^2 + y^2 + z^2 = 1.$

After solving the SOS relaxation, we can produce a feasible solution using the following rounding algorithm:

1. Sample $$v$$ uniformly at random from the sphere
2. Compute $$M = \tilde{\mathbb{E}}_x[\langle v, x \rangle^4 xx^T]$$
3. Return the normalized top eigenvector of $$M$$

We can implement this rounding algorithm using either the primal or dual formulation: