# Linear matrix inequality

### From Wikimization


## Revision as of 15:18, 10 April 2009

In convex optimization, a **linear matrix inequality (LMI)** is an expression of the form

<math>\sum\limits_{j=1}^m y_jA_j\succeq 0</math>

where

- <math>y=[y_j\,,~j\!=\!1\ldots m]</math> is a real vector,
- <math>A_j</math> are symmetric matrices in the subspace of symmetric matrices <math>\mathbb{S}^n</math>,
- <math>\succeq 0</math> is a generalized inequality meaning <math>\sum\limits_{j=1}^m y_jA_j</math> is a positive semidefinite matrix belonging to the positive semidefinite cone <math>\mathbb{S}_+^n</math> in the subspace of symmetric matrices <math>\mathbb{S}^n</math>.

This linear matrix inequality specifies a convex constraint on <math>y</math>.
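The convexity claim is easy to check numerically. Below is a minimal sketch (hypothetical 2×2 data; the helper names `is_psd_2x2` and `lmi_matrix` are illustrative, not from the article) verifying that convex combinations of two LMI-feasible points remain feasible:

```python
# Sketch: the set of y satisfying sum_j y_j A_j >= 0 is convex.
# For 2x2 symmetric matrices, PSD reduces to nonnegative diagonal and determinant.

def is_psd_2x2(M, tol=1e-12):
    """[[a, b], [b, c]] is PSD iff a >= 0, c >= 0 and a*c - b*b >= 0."""
    a, b, c = M[0][0], M[0][1], M[1][1]
    return a >= -tol and c >= -tol and a * c - b * b >= -tol

def lmi_matrix(y, As):
    """Form sum_j y_j A_j for a list As of 2x2 symmetric matrices."""
    return [[sum(yj * A[i][k] for yj, A in zip(y, As)) for k in range(2)]
            for i in range(2)]

# Hypothetical data: A1 = identity, A2 = [[1, 1], [1, 0]]
As = [[[1.0, 0.0], [0.0, 1.0]], [[1.0, 1.0], [1.0, 0.0]]]

y1 = [2.0, 1.0]   # feasible: 2*A1 + 1*A2 = [[3, 1], [1, 2]] is PSD
y2 = [1.0, 0.0]   # feasible: the identity
assert is_psd_2x2(lmi_matrix(y1, As)) and is_psd_2x2(lmi_matrix(y2, As))

# every convex combination of y1 and y2 remains feasible
for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    y = [t * u + (1 - t) * v for u, v in zip(y1, y2)]
    assert is_psd_2x2(lmi_matrix(y, As))
print("convexity check passed")
```

Restricting to 2×2 matrices keeps the check dependency-free; for larger matrices one would test eigenvalues instead.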


## Convexity of the LMI constraint

<math>\sum\limits_{j=1}^m y_jA_j\succeq 0</math> is a convex constraint on <math>y</math>, which means membership to a dual (convex) cone, as we now explain (Dattorro, Example 2.13.5.1.1).

Consider a peculiar vertex-description for a closed convex cone <math>\mathcal{K}</math> defined over the positive semidefinite cone <math>\mathbb{S}_+^n</math> (instead of the more common nonnegative orthant <math>\mathbb{R}_+^{n(n+1)/2}</math>): for <math>X\in\mathbb{S}^n</math> given <math>A_j\in\mathbb{S}^n</math>, <math>j\!=\!1\ldots m</math>,

<math>\mathcal{K}=\{A\,\mbox{svec}\,X~|~X\succeq 0\}\subseteq\mathbb{R}^m</math>

where

- <math>A\in\mathbb{R}^{m\times n(n+1)/2}</math> with <math>A^{\rm T}=[\,\mbox{svec}\,A_1~~\mbox{svec}\,A_2~\cdots~\mbox{svec}\,A_m\,]</math>,
- symmetric vectorization svec is a stacking of columns defined in (Dattorro, Ch.2.2.2.1),
- <math>m\leq n(n+1)/2</math> is assumed without loss of generality.

<math>\mathcal{K}</math> is a convex cone because

<math>\zeta\,A\,\mbox{svec}\,X_1+\xi\,A\,\mbox{svec}\,X_2=A\,\mbox{svec}(\zeta\,X_1+\xi\,X_2)\in\mathcal{K}~~\forall\,\zeta\,,\xi\geq 0</math>

since a nonnegatively weighted sum of positive semidefinite matrices must be positive semidefinite.
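The vertex-description map can be sketched in code (assumed helper names, hypothetical data; svec here scales off-diagonal entries by <math>\sqrt{2}</math> so that the usual matrix inner product is preserved):

```python
import math

def svec(X):
    """svec of a symmetric 2x2 matrix: [x11, sqrt(2)*x21, x22]."""
    return [X[0][0], math.sqrt(2) * X[1][0], X[1][1]]

def cone_map(A_rows, X):
    """y = A svec X, where each row of A is svec(A_j)."""
    v = svec(X)
    return [sum(r * s for r, s in zip(row, v)) for row in A_rows]

# Hypothetical A_j: A1 = identity, A2 = [[0, 1], [1, 0]]
A_rows = [svec([[1.0, 0.0], [0.0, 1.0]]), svec([[0.0, 1.0], [1.0, 0.0]])]

X1 = [[1.0, 0.0], [0.0, 0.0]]   # PSD
X2 = [[1.0, 1.0], [1.0, 1.0]]   # PSD (rank one)
z, x = 2.0, 3.0                  # nonnegative weights

# linearity: zeta*(A svec X1) + xi*(A svec X2) = A svec(zeta*X1 + xi*X2),
# and zeta*X1 + xi*X2 stays PSD, so the combination stays in K
lhs = [z * a + x * b for a, b in zip(cone_map(A_rows, X1), cone_map(A_rows, X2))]
Xc = [[z * X1[i][j] + x * X2[i][j] for j in range(2)] for i in range(2)]
rhs = cone_map(A_rows, Xc)
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
print("cone closure under nonnegative combinations verified")
```

The <math>\sqrt{2}</math> scaling makes <math>\langle X,Y\rangle=\mbox{svec}(X)^{\rm T}\mbox{svec}(Y)</math>, which is what lets <math>\langle A_j\,,X\rangle</math> be computed as an ordinary dot product of svec vectors.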

Now consider the (closed convex) dual cone:

<math>\mathcal{K}^*=\{y~|~\langle z\,,y\rangle\geq 0~\forall\,z\in\mathcal{K}\}=\{y~|~\langle A\,\mbox{svec}\,X\,,y\rangle\geq 0~\forall\,X\succeq 0\}=\{y~|~\langle\mbox{svec}\,X\,,A^{\rm T}y\rangle\geq 0~\forall\,X\succeq 0\}</math>

that follows from Fejer's dual generalized inequalities for the positive semidefinite cone:

<math>Y\succeq 0~\Leftrightarrow~\langle X\,,Y\rangle\geq 0~~\forall\,X\succeq 0</math>

This leads directly to an equally peculiar halfspace-description

<math>\mathcal{K}^*=\{y\in\mathbb{R}^m~|~\sum\limits_{j=1}^m y_jA_j\succeq 0\}</math>

The summation inequality <math>\sum\limits_{j=1}^m y_jA_j\succeq 0</math> with respect to the positive semidefinite cone
is known as a *linear matrix inequality*.
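Fejer's condition can be exercised numerically (a sketch with hypothetical data; helper names are illustrative): if <math>\sum_j y_jA_j\succeq 0</math>, then <math>\langle A\,\mbox{svec}\,X\,,y\rangle=\langle X\,,\sum_j y_jA_j\rangle\geq 0</math> for every <math>X\succeq 0</math>, placing <math>y</math> in the dual cone.

```python
import random

def trace_inner(X, Y):
    """<X, Y> = trace(X Y) for symmetric 2x2 matrices."""
    return sum(X[i][j] * Y[j][i] for i in range(2) for j in range(2))

def random_psd_2x2(rng):
    """Random PSD matrix B B^T built from a random 2x2 factor B."""
    B = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    return [[sum(B[i][k] * B[j][k] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical A_j with sum_j y_j A_j = [[3, 1], [1, 2]] (PSD) for y = (2, 1)
A = [[[1.0, 0.0], [0.0, 1.0]], [[1.0, 1.0], [1.0, 0.0]]]
y = [2.0, 1.0]
G = [[sum(yj * Aj[i][k] for yj, Aj in zip(y, A)) for k in range(2)]
     for i in range(2)]

rng = random.Random(0)
for _ in range(1000):
    X = random_psd_2x2(rng)
    # <A svec X, y> = sum_j y_j <A_j, X> = <X, G>, nonnegative since X, G PSD
    val = sum(yj * trace_inner(Aj, X) for yj, Aj in zip(y, A))
    assert abs(val - trace_inner(X, G)) < 1e-9
    assert val >= -1e-9
print("dual-cone membership verified on 1000 random PSD matrices")
```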

## LMI Geometry

Although matrix <math>A\in\mathbb{R}^{m\times n(n+1)/2}</math> is finite-dimensional, <math>\mathcal{K}</math> is generally not a polyhedral cone (unless <math>m</math> equals 1 or 2) simply because <math>X\in\mathbb{S}_+^n</math>.

Provided the matrices <math>A_j</math> are linearly independent, then relative interior equals interior

<math>\mbox{rel\,int}\,\mathcal{K}=\mbox{int}\,\mathcal{K}</math>

meaning, the cone interior is nonempty; implying, the dual cone <math>\mathcal{K}^*</math> is pointed (Dattorro, ch.2).

If matrix <math>A</math> has no nullspace, on the other hand, then <math>A\,\mbox{svec}\,X</math> is an isomorphism in <math>X</math> between the positive semidefinite cone <math>\mathbb{S}_+^n</math> and range <math>\mathcal{R}(A)</math> of matrix <math>A</math>.

In that case, convex cone <math>\mathcal{K}</math> has relative interior

<math>\mbox{rel\,int}\,\mathcal{K}=\{A\,\mbox{svec}\,X~|~X\succ 0\}</math>

and boundary

<math>\partial\mathcal{K}=\{A\,\mbox{svec}\,X~|~X\succeq 0\,,~X\nsucc 0\}</math>

When the matrices <math>A_j</math> are linearly independent, function <math>g(y)\triangleq\sum\limits_{j=1}^m y_jA_j</math> on <math>\mathbb{R}^m</math> is a linear bijection.

Inverse image of the positive semidefinite cone under <math>\,g(y)\,</math> must therefore have dimension equal to <math>\dim\!\left(\mathcal{R}(A^{\rm T})\cap\mbox{svec}\,\mathbb{S}_+^n\right)</math>. When this dimension is <math>\,m\,</math>, the dual cone interior is nonempty

<math>\mbox{int}\,\mathcal{K}^*=\{y\in\mathbb{R}^m~|~\sum\limits_{j=1}^m y_jA_j\succ 0\}</math>

having boundary

<math>\partial\mathcal{K}^*=\{y\in\mathbb{R}^m~|~\sum\limits_{j=1}^m y_jA_j\succeq 0\,,~\sum\limits_{j=1}^m y_jA_j\nsucc 0\}</math>

and then cone <math>\mathcal{K}</math> is pointed.

## Applications

There are efficient numerical methods to determine whether an LMI is feasible (*i.e.*, whether there exists a vector <math>y</math> satisfying the LMI), or to solve a convex optimization problem with LMI constraints.
Many optimization problems in control theory, system identification, and signal processing can be formulated using LMIs. The prototypical primal and dual semidefinite programs are optimizations of a real linear function, respectively subject to the primal and dual convex cones governing this LMI.
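In the notation above, that prototypical primal–dual pair can be sketched as follows (a standard semidefinite-program formulation added for illustration; the problem data <math>C\in\mathbb{S}^n</math> and <math>b\in\mathbb{R}^m</math> are hypothetical, not from the article):

<math>\begin{array}{rl}\mbox{minimize}_{X\in\,\mathbb{S}^n}&\langle C\,,X\rangle\\\mbox{subject to}&A\,\mbox{svec}\,X=b\\&X\succeq 0\end{array}\qquad\qquad\begin{array}{rl}\mbox{maximize}_{y\in\,\mathbb{R}^m}&\langle b\,,y\rangle\\\mbox{subject to}&\sum\limits_{j=1}^m y_jA_j\preceq C\end{array}</math>

The primal constraint set lives in the positive semidefinite cone, while the dual constraint is precisely a linear matrix inequality in <math>y</math>.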

## External links

- S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory

- C. Scherer and S. Weiland, Course on Linear Matrix Inequalities in Control, Dutch Institute of Systems and Control (DISC).