On the complexity of matrix Putinar’s Positivstellensatz
Abstract
This paper studies the complexity of matrix Putinar’s Positivstellensatz on the semialgebraic set given by a polynomial matrix inequality. When the quadratic module generated by the constraining polynomial matrix is Archimedean, we prove a polynomial bound on the degrees of terms appearing in the representation of matrix Putinar’s Positivstellensatz. Estimates on the exponent and the constant are given. As a byproduct, a polynomial bound on the convergence rate of matrix sum-of-squares relaxations is obtained, which resolves an open question raised by Dinh and Pham. When the constraining set is unbounded, we also prove a similar bound for the matrix version of Putinar–Vasilescu’s Positivstellensatz by exploiting homogenization techniques.
keywords:
Polynomial matrix, polynomial optimization, matrix Putinar’s Positivstellensatz, matrix sum-of-squares relaxations
MSC codes: 90C23, 65K05, 90C22, 90C26
1 Introduction
Optimization with polynomial matrix inequalities has broad applications [6, 12, 16, 17, 18, 25, 47, 48, 57, 58]. A fundamental question for solving such problems globally is how to efficiently characterize polynomial matrices that are positive semidefinite on the semialgebraic set described by the polynomial matrix inequality. To be more specific, given an -by- symmetric polynomial matrix in and a semialgebraic set , how can we certify that on ? Here, the set is given as
(1.1) |
where is an -by- symmetric polynomial matrix.
When is a scalar polynomial (i.e., ) and is a diagonal matrix whose diagonal entries are polynomials , , , can be equivalently described as
(1.2) |
Then, it reduces to the classical Positivstellensätze in real algebraic geometry and polynomial optimization. A polynomial is said to be a sum of squares (SOS) if for polynomials . Schmüdgen [49] proved that if is compact and on , then there exist SOS polynomials such that
(1.3) |
When the quadratic module of scalar polynomials generated by is Archimedean (a condition slightly stronger than compactness, refer to Section 2.2), Putinar [42] showed that if on , then there exist SOS polynomials such that
(1.4) |
which is referred to as Putinar’s Positivstellensatz in the literature.
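For concreteness, here is a minimal sympy check of one explicit Putinar-type certificate on a hypothetical toy set (not an example from this paper): on K = [-1, 1] = {x : g(x) ≥ 0} with g = 1 - x^2, the polynomial f = 2 - x is positive on K and satisfies an identity of the form (1.4) with the SOS polynomials sigma0 = (1/2)(x - 1)^2 + 1 and sigma1 = 1/2.

```python
import sympy as sp

x = sp.symbols('x')
g = 1 - x**2                                  # K = {x : g(x) >= 0} = [-1, 1]
f = 2 - x                                     # positive on K

sigma0 = sp.Rational(1, 2) * (x - 1)**2 + 1   # an SOS polynomial
sigma1 = sp.Rational(1, 2)                    # an SOS (constant) polynomial

# Verify the Putinar-type identity f = sigma0 + sigma1 * g exactly.
print(sp.expand(f - (sigma0 + sigma1 * g)))   # prints 0
```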
The complexities of the above scalar Positivstellensätze have been studied extensively. Suppose that and the degree of is . For the degree , denote
For a polynomial , denote the norm . It was shown in [51] that if on , then for
there exist SOS polynomials with such that (1.3) holds. Here, is a positive constant that depends on the constraining polynomials . Based on the above bound, Nie and Schweighofer [39] proved that if is Archimedean, there exists a constant , depending on , such that if
(1.5) |
then (1.4) holds for some SOS polynomials , , with , . Recently, Baldi and Mourrain improved the exponent bound (1.5) to a polynomial bound. It was shown in [2] that the same result holds if
(1.6) |
In the above, is a positive constant depending on , , is the Łojasiewicz exponent associated with and is the max norm on the cube . The bound (1.6) has been further improved in [3] by removing the dependency of the exponent on the dimension and providing estimates on the constant . There exist better degree bounds when the set is specified; see [1, 4, 12, 53, 54, 55]. We also refer to [29] for the case where the problem admits correlative sparsity.
Based on Putinar’s Positivstellensatz, Lasserre [31] introduced the Moment-SOS hierarchy for solving polynomial optimization and proved its asymptotic convergence. The above degree bounds directly give estimates on the convergence rate of the Moment-SOS hierarchy. We also refer to [8, 20, 21, 23, 36, 37, 46] for the finite convergence of the Moment-SOS hierarchy under some regularity assumptions.
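As an illustration of how such a relaxation becomes a semidefinite program, the following sketch (a hypothetical scalar toy instance, not a construction from this paper; it assumes cvxpy with an SDP-capable solver is available) encodes the lowest-order SOS relaxation for minimizing f(x) = x over K = [-1, 1] by matching coefficients against a Gram-matrix parametrization.

```python
import cvxpy as cp

# Lowest-order SOS relaxation for minimizing f(x) = x over K = {x : 1 - x**2 >= 0}:
#   maximize lam  subject to  x - lam = [1, x] Q [1, x]^T + s * (1 - x**2),
# with Q a 2x2 PSD Gram matrix and s >= 0 a constant SOS multiplier.
Q = cp.Variable((2, 2), PSD=True)
s = cp.Variable(nonneg=True)
lam = cp.Variable()

constraints = [
    Q[0, 0] + s == -lam,   # match the constant coefficient
    2 * Q[0, 1] == 1,      # match the coefficient of x
    Q[1, 1] - s == 0,      # match the coefficient of x**2
]
cp.Problem(cp.Maximize(lam), constraints).solve()
print(lam.value)           # approximately -1, the minimum of x on [-1, 1]
```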
When , are general polynomial matrices, a matrix version of Putinar’s Positivstellensätz was provided in [28, 48]. Let be the ring of polynomials in with real coefficients. A polynomial matrix is SOS if for some polynomial matrix . The set of all -by- SOS polynomial matrices in is denoted as . The quadratic module of -by- polynomial matrices generated by is the set
The set is Archimedean if there exists a scalar such that . When the quadratic module is Archimedean, it was proved in [28, 48] that if on , then we have that . For the case where is a polynomial matrix and is diagonal (i.e., is as in (1.2)), the exponential bound (1.5) was generalized to the matrix case in [15]. There are specific estimates on degree bounds when is a simple set defined by scalar polynomial inequalities. We refer to [12] for the sphere case and to [55] for the Boolean hypercube case. When is a general polynomial matrix, the complexity of matrix Putinar’s Positivstellensatz is open, to the best of the author’s knowledge. We also refer to [7, 27, 50] for other types of SOS representations for polynomial matrices.
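As a small illustration of the definition (a hypothetical example, not one from the paper), any polynomial matrix of the form P(x) = L(x)^T L(x) is SOS and hence positive semidefinite for every real x; the sympy sketch below records one such 2-by-2 instance.

```python
import sympy as sp

x = sp.symbols('x')
L = sp.Matrix([[1, x], [0, 1]])   # a 2-by-2 polynomial matrix in x
P = L.T * L                       # P(x) = L(x)^T L(x) is an SOS polynomial matrix

print(P)         # Matrix([[1, x], [x, x**2 + 1]])
print(P.det())   # 1; together with trace(P) = x**2 + 2 > 0, P(x) is PSD for every real x
```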
Contributions
This paper studies the complexity of matrix Putinar’s Positivstellensatz.
Throughout the paper, we assume that unless otherwise specified. Denote the scaled simplex
(1.7) |
Note that the set in our settings. The degree of the polynomial matrix is denoted by , i.e.,
(1.8) |
For a positive integer , denote
(1.9) |
Let be the Bernstein norm of on the scaled simplex (see Section 3 for details).
The following is our main result. It generalizes the recent results on scalar Putinar’s Positivstellensatz in [2, 3] to the matrix case and provides the first degree bound for matrix Putinar’s Positivstellensatz.
Theorem 1.1.
Let be as in (1.1) with . Suppose is an -by- symmetric polynomial matrix of degree , and . If on , then for
(1.10) |
there exists an -by- SOS matrix and polynomial matrices with
such that
(1.11) |
In the above, are respectively the Łojasiewicz constant and exponent in (5.3), and is a constant independent of , . Furthermore, the Łojasiewicz exponent can be estimated as
The positivity certificate (1.11) can be applied to construct matrix SOS relaxations for solving polynomial matrix optimization [16, 48]. As a byproduct of Theorem 1.1, we can obtain a polynomial bound on the convergence rate of matrix SOS relaxations (see Theorem 5.7), which resolves an open question raised in [10].
When is a scalar polynomial and is diagonal with diagonal entries , it was shown in [3] that if on , then admits a representation as in (1.4) for
where are respectively the Łojasiewicz constant and exponent, and is a constant independent of , . Comparing this with the bound in Theorem 1.1, one important additional factor is . This comes from the estimates on the quantity and degree bounds of scalar polynomials that can be used to describe the polynomial matrix inequality . See Section 4 for details. The other additional factors such as , are artifacts of the proof technique. They primarily arise from the degree estimates of in Theorem 3.5. Since we want the constant to be independent of and , we retain all these factors in Theorem 1.1.
Remark.
The assumption that is not restrictive. When the quadratic module is Archimedean, i.e., for some , we have that . Theorem 1.1 remains true with replaced by , .
When is unbounded, the Putinar-type SOS representation may not hold. For the scalar case (i.e., and is as in (1.2)), it was shown in [43, 44] that if the highest degree homogeneous term of is positive definite in and on , then there exists a degree such that has a representation (1.4), which is referred to as Putinar–Vasilescu’s Positivstellensatz. A polynomial complexity bound for this type of representation was proved by Mai and Magron [35]. Recently, a matrix version of Putinar–Vasilescu’s Positivstellensatz was established in [11].
In the following, we state a bound similar to that in Theorem 1.1 for matrix Putinar–Vasilescu’s Positivstellensatz. Let denote the highest degree homogeneous term of . The homogenization of is denoted by , i.e., for . The notation denotes the smallest eigenvalue of the matrix .
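The homogenization notation can be illustrated on a hypothetical scalar example (assuming the standard convention that the homogenization is obtained by scaling with a power of the new variable x0, and that setting x0 = 0 recovers the highest degree homogeneous term):

```python
import sympy as sp

x, x0 = sp.symbols('x x0')
f = x**3 - x + 2                                 # a sample scalar polynomial of degree d = 3
d = sp.Poly(f, x).total_degree()

f_tilde = sp.expand(x0**d * f.subs(x, x / x0))   # homogenization of f in (x, x0)
f_top = f_tilde.subs(x0, 0)                      # highest degree homogeneous term of f

print(f_tilde)   # x**3 - x*x0**2 + 2*x0**3
print(f_top)     # x**3
```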
Theorem 1.2.
Let be as in (1.1) with . Suppose that is an -by- symmetric polynomial matrix of degree . If on and on , then for
there exists an -by- SOS matrix and polynomial matrices with
such that
(1.12) |
In the above, , are respectively the Łojasiewicz constant and exponent in (5.8), is a constant independent of , , and is the optimal value of the optimization
(1.13) |
Furthermore, the Łojasiewicz exponent can be estimated as
For unbounded optimization, the positivity of is also crucial for obtaining SOS-type representations because it reflects the behavior of at infinity (e.g., see [22]). The quantity is the optimal value of the homogenized optimization (1.15). It is an important quantity for evaluating the positivity of on the unbounded set , as it reflects not only the behavior of on but also its behavior at infinity. Suppose that on , on , and is a feasible point of (1.15). One can see that and for all feasible with , and for all feasible with . However, it is impossible to bound from below with and , since can be arbitrarily close to . For instance, consider the scalar polynomial and . Note that for , we have . Let , . Then, it follows that
for some . This implies that . One can verify that , as , while for arbitrary .
Outline of the proofs
The proof of Theorem 1.1 involves three major steps:
Step 1. When is an -by- symmetric polynomial matrix and is described by finitely many scalar polynomial inequalities (i.e., is as in (1.2)), we establish a polynomial bound on the degrees of terms appearing in matrix Putinar’s Positivstellensatz (see Theorem 3.5). It improves the exponential bound given by Helton and Nie [15]. The proof closely follows the ideas and strategies of [2, 3], adapting them to the matrix setting. This proof is provided in Section 3 and is completed in three steps:
Step 1a. We perturb into a polynomial matrix such that is positive definite on the scaled simplex . To be more specific, we prove that there exist and with controlled degrees such that
This step uses a result by Baldi-Mourrain-Parusinski [3, Proposition 3.2] about the approximation of a plateau by SOS polynomials, and some estimates on the matrix Bernstein norm (see Lemma 3.3).
Step 1b. Using the dehomogenized version of matrix Polya’s Positivstellensatz in terms of the Bernstein basis (see Lemma 3.1), we show that there exist matrices such that
and provide an estimate on .
Step 1c. Note that the factors of satisfy . We know that . Consequently, a polynomial bound on the degrees of terms appearing in matrix Putinar’s Positivstellensatz can be derived from the assumption that .
Step 2. When is described by a polynomial matrix inequality (i.e., is as in (1.1)), we show that can be equivalently described by finitely many scalar polynomial inequalities with exact estimates on the quantity and degrees, by exploiting a proposition due to Schmüdgen [50]. Additionally, these polynomials belong to the quadratic module generated by in . This is discussed in Section 4.
2 Preliminaries
2.1 Notations
The symbol (resp., ) denotes the set of nonnegative integers (resp., real numbers). Let be the ring of polynomials in with real coefficients and let be the ring of polynomials with degrees . For and a degree , denote . For a polynomial , the symbol denotes its total degree. For a polynomial matrix , its degree is denoted by , i.e.,
For a real number denotes the smallest integer greater than or equal to . For a matrix , denotes its transpose and denotes the spectral norm of . A symmetric matrix (resp., ) if is positive semidefinite (resp., positive definite). For a smooth function in , its gradient is denoted by .
2.2 Some basics in real algebraic geometry
In this subsection, we review some basics in real algebraic geometry. We refer to [7, 16, 27, 28, 33, 38, 48, 50] for more details.
A polynomial matrix is said to be a sum of squares (SOS) if for some polynomial matrix . The set of all -by- SOS polynomial matrices in is denoted as . For a degree , denote the truncation
For an -by- symmetric polynomial matrix , the quadratic module of -by- polynomial matrices generated by in is
The th degree truncation of is
(2.1) |
Note that . For the case , we simply denote
The quadratic module is said to be Archimedean if there exists such that The polynomial matrix determines the semialgebraic set
If is Archimedean, then the set must be compact. However, when is compact, is not necessarily Archimedean (see [26]). The following is known as the matrix version of Putinar’s Positivstellensatz.
2.3 Some technical propositions
In this subsection, we present several useful propositions that are used in our proofs. The following is the classical Łojasiewicz inequality.
Corollary 2.2 ([5, 34]).
Let be a closed and bounded semialgebraic set, and let and be two continuous semialgebraic functions from to such that . Then there exist two positive constants , such that:
The smallest constant is called the Łojasiewicz exponent and is the corresponding Łojasiewicz constant.
The Markov inequality below can be used to estimate gradients of polynomials.
The following theorem is a slight variation of [14, Theorem 3.3], which gives explicit Łojasiewicz exponents for basic closed semialgebraic sets.
Theorem 2.4 ([14]).
Suppose that and let
Then for any compact set , there is a constant such that for all ,
where is the Euclidean distance to the set and .
3 Degree bounds for scalar polynomial inequalities
In this section, we consider the case where is a diagonal matrix with diagonal entries . Then, the set as in (1.1) can be equivalently described as
(3.1) |
If the symmetric polynomial matrix on , we prove a polynomial bound on the degrees of terms appearing in matrix Putinar’s Positivstellensatz. This improves the exponential bound shown by Helton and Nie [15]. The proof essentially generalizes the ideas and strategies used in [2, 3] to the matrix case. We refer to Step 1 in the section ‘Outline of the proofs’ for an overview of the proof.
Let be the -dimensional standard simplex, i.e.,
Denote the scaled simplex by
For a degree , let be the Bernstein basis of degree on , i.e.,
(3.2) |
For a degree , the symmetric polynomial matrix of degree can be expressed in the Bernstein basis as
where are -by- symmetric matrices. The Bernstein norm of with respect to is defined as
For convenience, we denote simply as . Similarly, can also be expressed in the monomial basis as
The spectral norm of is defined as . For , and , the following is a basic property of the Bernstein norm (see [13], [3, Lemma 1.5]):
(3.3) |
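For intuition, here is a minimal sympy sketch for a hypothetical univariate instance (so the simplex is just [0, 1], and the polynomial, degree, and names below are illustrative assumptions): it computes the Bernstein coefficients of a scalar polynomial by matching coefficients and takes the Bernstein norm as the largest coefficient in absolute value, the scalar analogue of the matrix Bernstein norm above.

```python
import sympy as sp

x = sp.symbols('x')
d = 3
f = 2*x**3 - x + 1   # a sample polynomial of degree 3 on [0, 1]

# Degree-d Bernstein basis on [0, 1]
B = [sp.binomial(d, k) * x**k * (1 - x)**(d - k) for k in range(d + 1)]

# Solve the linear system f = sum_k c_k * B_k for the Bernstein coefficients c_k
c = sp.symbols(f'c0:{d + 1}')
eqs = sp.Poly(sum(ck * Bk for ck, Bk in zip(c, B)) - f, x).all_coeffs()
sol = sp.solve(eqs, c)

coeffs = [sol[ck] for ck in c]
print(coeffs)                        # [1, 2/3, 1/3, 2]
print(max(abs(v) for v in coeffs))   # 2: the Bernstein norm of f for this degree
```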
Suppose that is a homogeneous polynomial with . Polya’s Positivstellensatz [40, 41] states that if on the standard simplex , then the polynomial has only positive coefficients if . This result has been generalized to the matrix case by Scherer and Hol [48]. It was shown in [48, Theorem 3] that if is an -by- homogeneous symmetric polynomial matrix with and on , then for , all coefficient matrices of in the monomial basis are positive definite.
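A concrete scalar instance of Polya’s theorem (hypothetical, chosen only for illustration): p = x^2 - xy + y^2 is positive on the standard simplex, and multiplying by (x + y)^N produces all-positive coefficients exactly from N = 3 onwards, as the sympy sketch below confirms.

```python
import sympy as sp

x, y = sp.symbols('x y')
p = x**2 - x*y + y**2        # homogeneous and positive on the standard simplex

def all_coeffs_positive(q, deg):
    poly = sp.Poly(sp.expand(q), x, y)
    # every monomial x**i * y**(deg - i) must carry a strictly positive coefficient
    return all(poly.coeff_monomial(x**i * y**(deg - i)) > 0 for i in range(deg + 1))

for N in range(5):
    print(N, all_coeffs_positive((x + y)**N * p, 2 + N))
# prints False for N = 0, 1, 2 and True for N = 3, 4
```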
In the following, we present a dehomogenized version of matrix Polya’s Positivstellensätz in terms of the Bernstein basis.
Lemma 3.1.
Suppose is an -by- symmetric polynomial matrix with . If on , then for
all coefficient matrices of in the Bernstein basis are positive definite.
Proof 3.2.
Suppose that can be expressed in the Bernstein basis as
where are -by- symmetric matrices. Denote
For each , let
Then we know that
which implies that . By direct computation, we have that for , the following holds
Hence, on . It follows from [48, Theorem 3] that for , all coefficient matrices of in the monomial basis are positive definite, i.e., there exist such that
By substituting
into the above identity, we obtain
Since , the conclusion follows.
Lemma 3.3.
Suppose is an -by- symmetric polynomial matrix with . Let with , and denote
Then, we have that . Furthermore, there exists with such that .
Proof 3.4.
Suppose that can be expressed in the Bernstein basis as
where are -by- symmetric matrices. Note that
Thus, we have Since there exists with such that
then we have that
Denote
(3.4) |
For , let denote the Euclidean distance to the set , i.e.,
If , then we have and . The distance function is a continuous semialgebraic function (see [5, Proposition 2.2.8], [14, Proposition 1.8]). By Corollary 2.2, there exist positive constants , such that the following Łojasiewicz inequality holds:
(3.5) |
Note that
In the following, we prove a polynomial bound on the degrees of terms appearing in matrix Putinar’s Positivstellensatz when is given by finitely many scalar polynomial inequalities, which improves the exponential bound established in [15]. Recall that the quadratic module of scalar polynomials generated by in is denoted by (see Section 2.2).
Theorem 3.5.
Let be a diagonal matrix with diagonal entries and let be as in (3.1). Suppose that is an -by- symmetric polynomial matrix with , and . If on , then for
(3.6) |
In the above, , are respectively the Łojasiewicz constant and exponent in (3.5) and is a constant independent of , . Furthermore, the Łojasiewicz exponent can be estimated as .
Remark. At a feasible point , the Mangasarian–Fromovitz constraint qualification (MFCQ) holds if there exists a direction such that for all active inequality constraints . The linear independence constraint qualification (LICQ) holds at if the set of gradient vectors of all active constraints is linearly independent. It was shown in [45] that if the MFCQ holds at every , then the Łojasiewicz exponent, as in (3.5), is equal to 1. Since the MFCQ is weaker than the LICQ, we know that if the LICQ holds at every , the Łojasiewicz exponent (see also [2, 3]; the author thanks the referee for pointing out this fact). Then, under the assumptions of Theorem 3.5, we have that if
(3.7) |
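To illustrate the remark on a hypothetical one-dimensional instance (an assumption for illustration, not a case treated in the paper): for g(x) = 1 - x^2 on the box [-2, 2], the set S = [-1, 1], the constraint gradient is nonzero at the active points x = ±1 (so the LICQ holds), and a distance-type Łojasiewicz inequality of the form dist(x, S) ≤ c · max(-g(x), 0) holds with exponent 1, as the quick numerical check below suggests.

```python
import numpy as np

# Hypothetical instance: g(x) = 1 - x**2 on the box [-2, 2], so S = [-1, 1] and the
# LICQ holds at the active points x = +/-1 (the gradient g'(x) = -2x is nonzero there).
xs = np.linspace(-2.0, 2.0, 4001)
g = 1.0 - xs**2
dist = np.maximum(np.abs(xs) - 1.0, 0.0)     # Euclidean distance to S = [-1, 1]
violation = np.maximum(-g, 0.0)              # how much the constraint g >= 0 is violated

mask = violation > 0
print(np.max(dist[mask] / violation[mask]))  # about 0.5, so dist <= 0.5 * violation holds
```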
Proof 3.6.
Without loss of generality, we assume that for . Otherwise, we can scale by . We remark that scaling does not change , since we normalize in and . Thus, the bound remains unchanged. Let
For the above , , it follows from [3, Proposition 3.2] that there exists such that for , we have that
- (i) if .
- (ii) if .
- (iii) .
- (iv) with .
Let
In the following, we show that on . Equivalently, for a fixed , we prove that for each with , the following holds:
(3.8) |
In the above,
Case I. Suppose that . Then, we know that . Let be a vector such that . Since , we have that
where the last inequality follows from Theorem 2.3. Since (see (3.3)), it holds that
(3.9) |
where the last inequality is due to Lemma 3.3. Then, we know that
By the Łojasiewicz inequality (3.5), we have that
From the definition of the function as in (3.4), there exists such that . Since is an SOS (see item (iv)) and , we have by item (ii). Note that
In the above, we use the inequality (3.3). Then, we have that
One can see from the relation (3.9) that and , which implies that
Then, it follows that
where the second inequality follows from the inequality (3.3).
Since with is arbitrary, we have on . As can be any fixed point in , we know that for all . Since for , we have that
By Lemma 3.3, there exists with such that . Then, we obtain that
where the second inequality is due to the inequality (3.3). Note that
It follows from Lemma 3.1 that all coefficient matrices of in the Bernstein basis are positive definite, for
i.e., there exist matrices such that
In the following, we show that . Consider the optimization
(3.10) |
One can verify that the optimal value of (3.10) is 0. Since and are SOS-convex and the feasible set has an interior point, it follows from [32, Theorem 3.3] that the lowest-order SOS relaxation of (3.10) is exact and the optimal values of all SOS relaxations are achievable. Consequently, we have that . Similarly, we know that . Hence, we have
Then, from the definition of , we have . Since for some , it follows that , and thus . Then, the conclusion holds for sufficiently large . Note that
It follows from Theorem 2.4 that .
4 A scalar representation for polynomial matrix inequalities
In this section, we provide exact estimates on the quantity and degree bounds for polynomials with , such that the polynomial matrix inequality can be equivalently described by the scalar polynomial inequalities , , . This serves as a quantitative version of the following proposition due to Schmüdgen [50].
Proposition 1 (Proposition 9, [50]).
Let be an -by- symmetric polynomial matrix and . Then there exist -by- diagonal polynomial matrices , -by- symmetric polynomial matrices and polynomials , such that:
- (i) ,
- (ii) For , if and only if for all .
Let . A direct corollary of Proposition 1 is that there exist finitely many polynomials in such that can be equivalently described by these polynomials. We also refer to [7, Proposition 5] for a generalization of Proposition 1, where the author considers the case where can be described by possibly infinitely many polynomial matrix inequalities. Denote
Recall that . In the following, we provide a quantitative version of Proposition 1.
Theorem 4.1.
Suppose is an -by- symmetric polynomial matrix with . Then there exist such that
Proof 4.2.
We prove the result by applying induction on the matrix size . Suppose that and . In the following, we show that the inequality can be equivalently described as
If , we know that , , , which imply the above inequalities. Suppose satisfies these inequalities. If (, respectively), it follows from the inequality (, respectively) that . This implies that . Otherwise, if , the last inequality reduces to and the inequality reduces to . Then, we have that and . Note that
where
We know that all the above scalar inequalities involve polynomials that belong to . Note that . Hence, the result holds for .
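The mechanism behind such scalarizations for the 2-by-2 case can be illustrated by one standard identity (a sketch of the general idea, not necessarily the exact polynomials used in the proof above): for a symmetric 2-by-2 matrix G with entries a, b, c, the polynomial vector v = (c, -b) gives v^T G v = c(ac - b^2), so this product of a diagonal entry and the determinant lies in the quadratic module of scalar polynomials generated by G. The sympy check below verifies the identity.

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
G = sp.Matrix([[a, b], [b, c]])      # a generic 2-by-2 symmetric matrix (of polynomials)

v = sp.Matrix([c, -b])               # a polynomial vector built from the entries of G
expr = sp.expand((v.T * G * v)[0, 0])
print(sp.factor(expr))               # c*(a*c - b**2), an element of the quadratic module of G

e1 = sp.Matrix([1, 0])
print((e1.T * G * e1)[0, 0])         # a: the (1, 1) entry also lies in the quadratic module
```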
Suppose the result holds for and consider the case that . In the following, we construct block diagonal polynomial matrices with a block and a block, such that the inequality can be equivalently described by the main diagonal blocks of these matrices. This reduction in matrix size allows us to apply induction. For , let denote the permutation matrix that swaps the first row and the -th row. For , denote the row operation matrix as
Note that is the matrix obtained by first adding the -th row of to the -th row and then adding the -th column to the -th column. Let
Clearly, for , is the matrix obtained by swapping the first row with the -th row of and then swapping the first column with the -th column. Hence, we know that for , the matrices are all invertible and it holds that
where is a -by- symmetric polynomial matrix, and
Denote the matrices
By direct computation, we have that
(4.1) |
(4.2) |
(4.3) |
where
Let
Since are scalar matrices, we have
In the following, we show that
(4.4) |
If , i.e., , it follows from (4.2) that , which implies that
(4.5) |
For the converse part, suppose satisfies (4.5). If for all , then we have for all , leading to . If there exist such that , the relation (4.3) gives that
(4.6) |
Since , we conclude that .
Note that each is a -by- symmetric polynomial matrix with for . By induction, there exist such that
Combining this with the relation (4.4), we have that
The total number of the above constraints is , which equals . Hence, we have completed the proof.
Remark. There exist alternative ways to obtain finitely many scalar polynomial inequalities that describe . For instance, consider the characteristic polynomial of . It can be expressed as
Since is symmetric, all eigenvalues are real, and has only real roots. We know that if and only if for . Note that , which is always in . Hence, if the polynomials , the quantity and the degree as stated in Theorem 4.1 can be improved to and , respectively. However, typically do not belong to .
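A minimal sympy sketch of this characteristic-polynomial description for a hypothetical 2-by-2 example (the matrix below is an illustrative assumption, not one from the paper):

```python
import sympy as sp

x, t = sp.symbols('x t')
G = sp.Matrix([[1 + x**2, x], [x, 2]])            # a sample 2-by-2 symmetric polynomial matrix

charpoly = sp.expand((t * sp.eye(2) - G).det())   # t**2 - p1(x)*t + p2(x)
p1 = -charpoly.coeff(t, 1)                        # trace of G(x)
p2 = charpoly.coeff(t, 0)                         # determinant of G(x)

print(sp.expand(p1), sp.expand(p2))               # x**2 + 3 and x**2 + 2
# G(x) >= 0 for all real x iff p1(x) >= 0 and p2(x) >= 0; here both are globally positive.
```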
5 Degree bounds for matrix Positivstellensätze
This section is mainly devoted to proving Theorems 1.1 and 1.2. As a byproduct, we obtain a polynomial bound on the convergence rate of matrix SOS relaxations for solving polynomial matrix optimization, which resolves an open question raised by Dinh and Pham [10]. Let
(5.1) |
where is an -by- symmetric polynomial matrix.
5.1 Proof of Theorem 1.1
For the polynomial matrix , let be the polynomials as in Theorem 4.1. Note that we assume
Denote
(5.2) |
If , we have that , i.e., . By Corollary 2.2, there exist positive constants , such that the following Łojasiewicz inequality holds:
(5.3) |
In the following, we give the proof of Theorem 1.1.
Theorem 5.1.
Proof 5.2.
By Theorem 4.1, we know that and
It follows from Theorem 3.5 that , where is the diagonal matrix with diagonal entries , , for
Equivalently, there exist SOS polynomial matrices () with , , , such that
Suppose that for and for some , . Then, we know that
which implies that . Since , we know that for some . Then, we know that . Let be sufficiently large. Then, we can drop the small terms , , and thus for satisfying (5.4). Since and
it follows from Theorem 2.4 that .
Remark. When is a diagonal polynomial matrix with diagonal entries , i.e.,
Theorem 5.1 implies that for greater than the quantity as in (5.4), we have . However, the bound given in (3.6) from Theorem 3.5 is generally smaller than that in (5.4). Additionally, the proof of Theorem 5.1 builds upon Theorem 3.5.
5.2 Proof of Theorem 1.2
For a polynomial matrix with , denotes its highest-degree homogeneous terms, i.e., The homogenization of is denoted by , i.e., for . Let
(5.5) |
where . Consider the optimization
(5.6) |
where is the smallest eigenvalue of . Let denote the optimal value of (5.6).
Lemma 5.3.
Let be as in (5.1). Suppose that is an -by- symmetric polynomial matrix with . If on and on , then and on .
Proof 5.4.
Suppose . If , then we have . If , then
which implies that . Since on , the degree of must be even, and we have
Since is compact, the optimal value of (5.6) is positive, i.e., .
For the polynomial matrix , let be the polynomials as in Theorem 4.1. Denote
(5.7) |
Note that implies that . By Corollary 2.2, there exist positive constants , such that for all satisfying , it holds that
(5.8) |
In the following, we give the proof of Theorem 1.2, which provides a polynomial bound on the complexity of matrix Putinar–Vasilescu’s Positivstellensatz.
Theorem 5.5.
Let be as in (5.1) with . Suppose that is an -by- symmetric polynomial matrix with . If on and on , then for
(5.9) |
In the above, , are respectively the Łojasiewicz constant and exponent in (5.8), is a constant independent of , and is the optimal value of (5.6). Furthermore, the Łojasiewicz exponent can be estimated as
Proof 5.6.
By Lemma 5.3, we have that on . Note that is Archimedean, and can be equivalently described by
By following the proof of Theorem 1.1 for , we know that for satisfying (5.9), there exist , , () with
such that
Replacing by in the above identity, we get that
(5.10) |
Suppose that for some polynomial matrix . Then, there exist polynomial matrices with , such that
It holds that
Similarly, there exist polynomial matrices with , satisfying
Then, we have that
Combining these results with equation (5.10), we know that for satisfying (5.9), the following holds:
for some with and a symmetric polynomial matrix . Since is not a rational function, we conclude that , which implies .
Proof of Corollary 1.3. Note that on , and we have
Hence, for , the homogenization of satisfies
The conclusion follows from Theorem 5.5.
5.3 Convergence rate of matrix SOS relaxations
Consider the polynomial matrix optimization
(5.11) |
where , is an symmetric polynomial matrix. We denote the optimal value of (5.11) as .
A standard approach for solving (5.11) globally is the sum-of-squares hierarchy introduced in [16, 28, 48]. This hierarchy consists of a sequence of semidefinite programming relaxations. Recall that for a degree , the th degree truncation of is
Checking whether a polynomial belongs to the quadratic module can be done by solving a semidefinite program (see [48]). For an order , the th order SOS relaxation of (5.11) is
(5.12) |
Let denote the optimal value of (5.12). We have the monotonicity relation: The hierarchy (5.12) is said to have asymptotic convergence if as , and it is said to have finite convergence if for all big enough. When the quadratic module is Archimedean, it was shown in [16, 28, 48] that the hierarchy (5.12) converges asymptotically. Recently, Huang and Nie [24] proved that this hierarchy also has finite convergence if, in addition, the nondegeneracy condition, the strict complementarity condition and the second-order sufficient condition hold at every global minimizer of (5.11). In the following, we provide a polynomial bound on the convergence rate of the hierarchy (5.12).
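For illustration, the following is a minimal, toy-scale sketch of a matrix SOS relaxation of the form (5.12) for a hypothetical instance with a 2-by-2 objective F(x) = diag(1 - x, 1 + x) in one variable and the single scalar constraint g(x) = 1 - x^2; it parametrizes the SOS polynomial matrices by Gram matrices and assumes cvxpy with an SDP-capable solver. This is a sketch under these assumptions, not the formulation used in the proofs.

```python
import cvxpy as cp
import numpy as np

# Toy matrix SOS relaxation: F(x) = diag(1 - x, 1 + x), g(x) = 1 - x**2,
#   maximize gamma  subject to  F(x) - gamma*I = S0(x) + S1 * (1 - x**2),
# where S0(x) = W(x) Q W(x)^T with W(x) = [I, x*I] and Q a 4x4 PSD Gram matrix,
# and S1 is a constant 2x2 PSD multiplier.
Q = cp.Variable((4, 4), PSD=True)
S1 = cp.Variable((2, 2), PSD=True)
gamma = cp.Variable()

I2 = np.eye(2)
F0, F1 = np.diag([1.0, 1.0]), np.diag([-1.0, 1.0])   # F(x) = F0 + F1 * x

constraints = [
    Q[0:2, 0:2] + S1 == F0 - gamma * I2,   # constant coefficient matrix
    Q[0:2, 2:4] + Q[2:4, 0:2] == F1,       # coefficient matrix of x
    Q[2:4, 2:4] - S1 == 0,                 # coefficient matrix of x**2
]
prob = cp.Problem(cp.Maximize(gamma), constraints)
prob.solve()
print(gamma.value)   # approximately 0 = min over [-1, 1] of the smallest eigenvalue of F(x)
```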
Theorem 5.7.
Suppose with , , and . Then we have that
(5.13) |
where , are respectively the Łojasiewicz constant and exponent in (5.3), and is a constant independent of , . Furthermore, the Łojasiewicz exponent can be estimated as
6 Conclusions and discussions
In this paper, we establish a polynomial bound on the degrees of terms appearing in matrix Putinar’s Positivstellensatz. For the special case where the constraining polynomial matrix is diagonal, this result improves the exponential bound shown in [15]. As a byproduct, we also obtain a polynomial bound on the convergence rate of matrix SOS relaxations, which resolves an open question posed by Dinh and Pham [10]. When the constraining set is unbounded, a similar bound is provided for matrix Putinar–Vasilescu’s Positivstellensatz.
Theorem 1.1 uses the Łojasiewicz inequality (5.3) for equivalent scalar polynomial inequalities of . There also exists a Łojasiewicz inequality for polynomial matrices (see [9, 10]), where the corresponding Łojasiewicz exponent is typically smaller than the scalar one in (5.3). An interesting direction for future work is to provide a proof based directly on the Łojasiewicz inequality for , which may yield a better bound. On the other hand, the set can be equivalently described by the quantifier set . It would also be interesting to study the complexity of Putinar-type Positivstellensätze with universal quantifiers for the above reformulation. We refer to [19] for recent work on Positivstellensätze with universal quantifiers.
When is given by scalar polynomial inequalities, it was shown in [45] that the Łojasiewicz exponent as in (3.5) is equal to if the linear independence constraint qualification holds at every . Consequently, we can get a representation with order . An analogue of the linear independence constraint qualification in nonlinear semidefinite programming is the nondegeneracy condition [52, 56]. It would also be interesting to explore whether similar results hold for polynomial matrix inequalities under the nondegeneracy condition.
Acknowledgements: The author would like to thank Prof. Konrad Schmüdgen for helpful discussions on Proposition 1 and Prof. Tien-Son Pham for fruitful comments on this paper. The author also thanks the associate editor and three anonymous referees for valuable comments and suggestions.
References
- [1] F. Bach and A. Rudi, Exponential convergence of sum-of-squares hierarchies for trigonometric polynomials, SIAM J. Optim., 11 (2023), pp. 796–817.
- [2] L. Baldi and B. Mourrain, On the effective Putinar’s Positivstellensatz and moment approximation, Math. Program., 200 (2023), pp. 71–103.
- [3] L. Baldi, B. Mourrain and A. Parusinski, On Łojasiewicz inequalities and the effective Putinar’s Positivstellensatz, J. Algebra, 662 (2025), pp. 741–767.
- [4] L. Baldi and L. Slot, Degree bounds for Putinar’s Positivstellensatz on the hypercube, SIAM J. Appl. Algebra Geom., 8 (2023), pp. 1–25.
- [5] J. Bochnak, M. Coste and M. Roy, Real Algebraic Geometry, Springer, Berlin, 1998.
- [6] G. Chesi, LMI techniques for optimization over polynomials in control: a survey, IEEE Trans. Autom. Control, 55 (2010), pp. 2500–2510.
- [7] J. Cimpric, Real algebraic geometry for matrices over commutative rings, J. Algebra 359 (2012), pp. 89–103.
- [8] E. De Klerk and M. Laurent, On the Lasserre hierarchy of semidefinite programming relaxations of convex polynomial optimization problems, SIAM J. Optim., 21 (2011), pp. 824–832.
- [9] S. Dinh and T. Pham, Łojasiewicz-type inequalities with explicit exponents for the largest eigenvalue function of real symmetric polynomial matrices, Internat. J. Math., 27 (2016), 1650012.
- [10] S. Dinh and T. Pham, Łojasiewicz inequalities with explicit exponents for smallest singular value functions, J. Complexity, 41 (2017), pp. 58–71.
- [11] T. Dinh, M. Ho and C. Le, Positivstellensätze for polynomial matrices, Positivity, 25 (2021), pp. 1295–1312.
- [12] K. Fang and H. Fawzi, The sum-of-squares hierarchy on the sphere and applications in quantum information theory, Math. Program., 190 (2021), pp. 331–360.
- [13] G. Farin, Curves and Surfaces for CAGD: A Practical Guide, 5th ed. Morgan Kaufmann Series in Computer Graphics and Geometric Modeling, San Francisco, 2021.
- [14] H. Hà and T. Pham, Genericity in Polynomial Optimization, World Scientific Publishing, Singapore, 2017.
- [15] J. Helton and J. Nie, Semidefinite representation of convex sets, Math. Program., 122 (2010), pp. 21–64.
- [16] D. Henrion and J. Lasserre, Convergent relaxations of polynomial matrix inequalities and static output feedback, IEEE Trans. Autom. Control, 51 (2006), 192–202.
- [17] D. Henrion and J. Lasserre, Inner approximations for polynomial matrix inequalities and robust stability, IEEE Trans. Autom. Control, 57 (2011), pp. 1456–1467.
- [18] D. Henrion, M. Korda and J. Lasserre, The Moment-SOS Hierarchy: Lectures in Probability, Statistics, Computational Geometry, Control and Nonlinear PDEs, World Scientific, Cambridge, 2020.
- [19] X. Hu, I. Klep and J. Nie, Positivstellensätze and moment problems with universal quantifiers, arXiv preprint, https://arxiv.org/abs/2401.12359, 2024.
- [20] L. Huang, Optimality conditions for homogeneous polynomial optimization on the unit sphere, Optim Lett, 17 (2023), pp. 1263–1270.
- [21] L. Huang, J. Nie and Y. Yuan, Homogenization for polynomial optimization with unbounded sets, Math. Program., 200 (2023), pp. 105–145.
- [22] L. Huang, J. Nie, and Y. Yuan, Generalized truncated moment problems with unbounded sets, J. Sci. Comput., 95 (2023).
- [23] L. Huang, J. Nie and Y. Yuan, Finite convergence of Moment-SOS relaxations with non-real radical ideals, SIAM J. Optim., 34 (2024), pp. 3399–3428.
- [24] L. Huang and J. Nie, On the tightness of matrix Moment-SOS hierarchy, arXiv preprint, https://arxiv.org/abs/2403.17241, 2024.
- [25] H. Ichihara and J. Lasserre, Optimal control for polynomial systems using matrix sum of squares relaxations, IEEE Trans. Autom. Control, 54 (2009), pp. 1048–1053.
- [26] T. Jacobi and A. Prestel, Distinguished representations of strictly positive polynomials, J. Reine Angew. Math., 532 (2001), pp. 223–235.
- [27] I. Klep and M. Schweighofer, Pure states, positive matrix polynomials and sums of hermitian squares, Indiana Univ. Math. J., 59 (2010), pp. 857–874.
- [28] M. Kojima, Sums of squares relaxations of polynomial semidefinite programs, In: Research Reports on Mathematical and Computing Sciences Series B: Operations Research B-397, Tokyo Institute of Technology, 2003.
- [29] M. Korda, V. Magron and Z. Rios-Zertuche, Convergence rates for sums-of-squares hierarchies with correlative sparsity, Math. Program., (2024), to appear.
- [30] A. Kroo and S. Revesz, On Bernstein and Markov-type inequalities for multivariate polynomials on convex bodies, J Approx Theory, 99 (1999), pp. 134–152.
- [31] J. Lasserre, Global optimization with polynomials and the problem of moments, SIAM J. Optim., 11 (2001), pp. 796–817.
- [32] J. Lasserre, Convexity in semialgebraic geometry and polynomial optimization, SIAM J. Optim., 19 (2009), pp. 1995–2014.
- [33] M. Laurent, Sums of squares, moment matrices and optimization over polynomials, In: Emerging Applications of Algebraic Geometry of IMA Volumes in Mathematics and its Applications, 149 (2009), pp. 157–270.
- [34] S. Łojasiewicz, Sur le problème de la division, Studia Math., 18 (1959), pp. 87–136.
- [35] N. Mai and V. Magron, On the complexity of Putinar–Vasilescu’s Positivstellensatz, J. Complexity, 72 (2022), 101663.
- [36] M. Marshall, Representation of non-negative polynomials with finitely many zeros, Annales de la Faculte des Sciences Toulouse, 15 (2006), pp. 599–609.
- [37] J. Nie, Optimality conditions and finite convergence of Lasserre’s hierarchy, Math. Program., 146 (2014), pp. 97–121.
- [38] J. Nie, Moment and Polynomial Optimization, SIAM, Philadelphia, 2023.
- [39] J. Nie and M. Schweighofer, On the complexity of Putinar’s Positivstellensatz, J. Complexity, 23 (2007), pp. 135–150.
- [40] G. Polya, Über positive Darstellung von Polynomen, Vierteljahresschrift der Naturforschenden Gesellschaft in Zürich, 73 (1928), pp. 141–145.
- [41] V. Powers and B. Reznick, A new bound for Polya’s theorem with applications to polynomials positive on polyhedra, J. Pure Appl. Algebra, 164 (2001), pp. 221–229.
- [42] M. Putinar, Positive polynomials on compact semi-algebraic sets, Indiana Univ. Math. J., 42 (1993), pp. 969–984.
- [43] M. Putinar and F. Vasilescu, Positive polynomials on semi-algebraic sets, C. R. Acad. Sci. Ser. I Math., 328 (1999), pp. 585–589.
- [44] M. Putinar and F. Vasilescu, Solving moment problems by dimensional extension, C. R. Acad. Sci. Ser. I Math., 328 (1999), pp. 495–499.
- [45] S. Robinson, Stability theorems for systems of inequalities. Part II: Differentiable nonlinear systems, SIAM J. Numer. Anal., 13 (1976), pp. 497–513.
- [46] C. Scheiderer, Distinguished representations of non-negative polynomials, J. Algebra, 289 (2005), pp. 558–573.
- [47] C. Scherer, LMI relaxations in robust control, Eur. J. Control, 2 (2006), pp. 3–29.
- [48] C. Scherer and C. Hol, Matrix sum-of-squares relaxations for robust semi-definite programs, Math. Program., 107 (2006), pp. 189–211.
- [49] K. Schmüdgen, The K-moment problem for compact semi-algebraic sets, Math. Ann., 289 (1991), pp. 203–206.
- [50] K. Schmüdgen, Noncommutative real algebraic geometry: some basic concepts and first ideas, In: Emerging Applications of Algebraic Geometry of IMA Volumes in Mathematics and its Applications, 149 (2009), pp. 325–350.
- [51] M. Schweighofer, On the complexity of Schmüdgen’s Positivstellensatz, J. Complexity, 20 (2004), pp. 529–543.
- [52] A. Shapiro, First and second order analysis of nonlinear semidefinite programs, Math. Program., 77 (1997), pp. 301–320.
- [53] L. Slot, Sum-of-squares hierarchies for polynomial optimization and the Christoffel–Darboux kernel, SIAM J. Optim., 32 (2022), pp. 2612–2635.
- [54] L. Slot and M. Laurent, An effective version of Schmüdgen’s Positivstellensatz for the hypercube, Optim Lett, 17 (2023), pp. 515–530.
- [55] L. Slot and M. Laurent, Sum-of-squares hierarchies for binary polynomial optimization, Math. Program., 197 (2023), pp. 621–660.
- [56] D. Sun, The strong second order sufficient condition and constraint nondegeneracy in nonlinear semidefinite programming and their implications, Math. Oper. Res., 31 (2006), pp. 761–776.
- [57] M. Tyburec, J. Zeman, M. Kruzık and D. Henrion, Global optimality in minimum compliance topology optimization of frames and shells by moment-sum-of-squares hierarchy, Struct Multidisc Optim , 64 (2021), pp. 1963–1981.
- [58] Y. Zheng and G. Fantuzzi, Sum-of-squares chordal decomposition of polynomial matrix inequalities, Math. Program., 197 (2023), pp. 71–108.