12.1. Relaxed Complementary Slackness
Given a primal LP
\[\begin{gathered} \text{Minimize } \sum_{j=1}^n c_jx_j\\ \begin{aligned} \text{s.t. } \sum_{j=1}^n a_{ij}x_j &\ge b_i && \forall 1 \le i \le m\\ x_j &\ge 0 && \forall 1 \le j \le n, \end{aligned} \end{gathered}\tag{12.1}\]
its dual is
\[\begin{gathered} \text{Maximize } \sum_{i=1}^m b_iy_i\\ \begin{aligned} \text{s.t. } \sum_{i=1}^m a_{ij}y_i &\le c_j && \forall 1 \le j \le n\\ y_i &\ge 0 && \forall 1 \le i \le m. \end{aligned} \end{gathered}\tag{12.2}\]
Theorem 4.5 states that \(\hat x\) is an optimal solution of the primal LP and \(\hat y\) is an optimal solution of the dual LP if and only if
- Primal complementary slackness: For all \(1 \le j \le n\),
  \[\hat x_j = 0 \text{ or } \sum_{i=1}^m a_{ij}\hat y_i = c_j, \text{ and}\]
- Dual complementary slackness: For all \(1 \le i \le m\),
  \[\hat y_i = 0 \text{ or } \sum_{j=1}^n a_{ij}\hat x_j = b_i.\]
(To remember which ones are the primal complementary slackness conditions, and which ones are the dual ones, note that primal complementary slackness states when the primal variables must be zero, and analogously for dual complementary slackness.)
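To make the conditions concrete, here is a small numeric check on a hand-solved primal/dual pair of the form (12.1)/(12.2). The instance and its optimal solutions are chosen for illustration only; this is a sketch, not a general verifier:

```python
# Hypothetical instance of (12.1)/(12.2), solved by hand:
# Primal: min x1 + x2   s.t.  x1 + 2x2 >= 2,  2x1 + x2 >= 2,  x >= 0
# Dual:   max 2y1 + 2y2 s.t.  y1 + 2y2 <= 1,  2y1 + y2 <= 1,  y >= 0
A = [[1, 2], [2, 1]]
b = [2, 2]
c = [1, 1]
x = [2 / 3, 2 / 3]  # optimal primal solution (both constraints tight)
y = [1 / 3, 1 / 3]  # optimal dual solution (both constraints tight)
m, n = len(b), len(c)

# Primal complementary slackness: x_j = 0 or sum_i a_ij y_i = c_j.
primal_cs = all(
    x[j] == 0 or abs(sum(A[i][j] * y[i] for i in range(m)) - c[j]) < 1e-9
    for j in range(n)
)
# Dual complementary slackness: y_i = 0 or sum_j a_ij x_j = b_i.
dual_cs = all(
    y[i] == 0 or abs(sum(A[i][j] * x[j] for j in range(n)) - b[i]) < 1e-9
    for i in range(m)
)
```

Both conditions hold here, and the two objective values coincide at \(4/3\), certifying optimality of \(\hat x\) and \(\hat y\).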
Now let us relax these conditions so that \(\hat x_j > 0\) and \(\sum_{i=1}^m a_{ij}\hat y_i < c_j\) may hold simultaneously, but we require that \(\hat x_j > 0\) implies that \(\sum_{i=1}^m a_{ij}\hat y_i\) is not too small. Similarly, we require that \(\hat y_i > 0\) implies that \(\sum_{j=1}^n a_{ij}\hat x_j\) is not too big. This gives the following relaxed complementary slackness conditions, for some parameters \(\alpha \ge 1\) and \(\beta \ge 1\):
- \(\boldsymbol{\alpha}\)-Relaxed primal complementary slackness: For all \(1 \le j \le n\),
  \[\hat x_j = 0 \text{ or } \sum_{i=1}^m a_{ij}\hat y_i \ge \frac{c_j}{\alpha}.\]
- \(\boldsymbol{\beta}\)-Relaxed dual complementary slackness: For all \(1 \le i \le m\),
  \[\hat y_i = 0 \text{ or } \sum_{j=1}^n a_{ij}\hat x_j \le \beta b_i.\]
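The relaxed conditions translate directly into a check. In the sketch below, the function name and the triangle instance are my own, chosen for illustration: \(\hat x\) is the indicator vector of a vertex cover of a triangle, and \(\hat y\) assigns \(1/2\) to every edge, which turns out to satisfy \((1, 2)\)-relaxed complementary slackness:

```python
def relaxed_cs(A, b, c, x, y, alpha, beta, eps=1e-9):
    """Check (alpha, beta)-relaxed complementary slackness for (12.1)/(12.2)."""
    m, n = len(b), len(c)
    # x_j = 0 or sum_i a_ij y_i >= c_j / alpha
    primal_ok = all(
        x[j] <= eps or sum(A[i][j] * y[i] for i in range(m)) >= c[j] / alpha - eps
        for j in range(n)
    )
    # y_i = 0 or sum_j a_ij x_j <= beta * b_i
    dual_ok = all(
        y[i] <= eps or sum(A[i][j] * x[j] for j in range(n)) <= beta * b[i] + eps
        for i in range(m)
    )
    return primal_ok and dual_ok

# Hypothetical example: LP relaxation of vertex cover on a triangle.
A = [[1, 1, 0], [1, 0, 1], [0, 1, 1]]  # rows = edges, columns = vertices
b, c = [1, 1, 1], [1, 1, 1]
x = [1, 1, 0]        # feasible primal solution (a vertex cover)
y = [0.5, 0.5, 0.5]  # feasible dual solution
ok = relaxed_cs(A, b, c, x, y, alpha=1, beta=2)
```

With \(\beta = 1\) the check fails, because the edge between the two chosen vertices has \(\hat x_u + \hat x_v = 2 > 1\); relaxing to \(\beta = 2\) is exactly what allows both endpoints into the cover.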
For succinctness, we call the combination of \(\alpha\)-relaxed primal complementary slackness and \(\beta\)-relaxed dual complementary slackness \(\boldsymbol{(\alpha, \beta)}\)-relaxed complementary slackness. By the following lemma, the objective function values of feasible primal and dual solutions that satisfy \((\alpha, \beta)\)-relaxed complementary slackness differ by at most a factor of \(\alpha\beta\). Thus, the primal solution is an \(\alpha\beta\)-approximation of an optimal primal solution.
Lemma 12.1: Let \(\hat x\) and \(\hat y\) be feasible solutions of (12.1) and (12.2), respectively, that satisfy \((\alpha, \beta)\)-relaxed complementary slackness for some \(\alpha, \beta \ge 1\). Then
\[\sum_{j=1}^n c_j\hat x_j \le \alpha\beta \cdot \sum_{i=1}^m b_i\hat y_i.\]
Proof: Let us define values
\[\hat z_{ij} = \alpha a_{ij} \hat y_i \hat x_j \quad \forall 1 \le i \le m, 1 \le j \le n.\]
We prove that \(\alpha\)-relaxed primal complementary slackness implies that
\[\sum_{j=1}^n c_j \hat x_j \le \sum_{i=1}^m \sum_{j=1}^n \hat z_{ij} \tag{12.3}\]
and that \(\beta\)-relaxed dual complementary slackness implies that
\[\sum_{i=1}^m \sum_{j=1}^n \hat z_{ij} \le \alpha\beta \cdot \sum_{i=1}^m b_i\hat y_i.\tag{12.4}\]
Together, these two inequalities imply the lemma.
To prove (12.3), it suffices to prove that
\[c_j\hat x_j \le \sum_{i=1}^m \hat z_{ij} \quad \forall 1 \le j \le n.\]
If \(\hat x_j = 0\), then \(c_j \hat x_j = 0\) and \(\hat z_{ij} = \alpha a_{ij} \hat y_i \hat x_j = 0\) for all \(1 \le i \le m\). Thus,
\[c_j\hat x_j = \sum_{i=1}^m \hat z_{ij}\]
in this case.
If \(\hat x_j > 0\), then the \(j\)th \(\alpha\)-relaxed primal complementary slackness condition states that
\[\begin{aligned} \sum_{i=1}^m a_{ij} \hat y_i &\ge \frac{c_j}{\alpha} &&\Longleftrightarrow\\ \alpha \hat x_j \cdot \sum_{i=1}^m a_{ij} \hat y_i &\ge \alpha \hat x_j \cdot \frac{c_j}{\alpha} &&\Longleftrightarrow\\ \sum_{i=1}^m \alpha a_{ij} \hat y_i \hat x_j &\ge c_j \hat x_j &&\Longleftrightarrow\\ \sum_{i=1}^m \hat z_{ij} &\ge c_j \hat x_j. \end{aligned}\]
To prove (12.4), it suffices to prove that
\[\sum_{j=1}^n \hat z_{ij} \le \alpha\beta b_i \hat y_i \quad \forall 1 \le i \le m.\]
If \(\hat y_i = 0\), then \(\alpha\beta b_i \hat y_i = 0\) and \(\hat z_{ij} = \alpha a_{ij} \hat y_i \hat x_j = 0\) for all \(1 \le j \le n\). Thus,
\[\sum_{j=1}^n \hat z_{ij} = \alpha\beta b_i \hat y_i\]
in this case.
If \(\hat y_i > 0\), then the \(i\)th \(\beta\)-relaxed dual complementary slackness condition states that
\[\begin{aligned} \sum_{j=1}^n a_{ij} \hat x_j &\le \beta b_i &&\Longleftrightarrow\\ \alpha \hat y_i \cdot \sum_{j=1}^n a_{ij} \hat x_j &\le \alpha \hat y_i \cdot \beta b_i &&\Longleftrightarrow\\ \sum_{j=1}^n \alpha a_{ij} \hat y_i \hat x_j &\le \alpha \beta b_i \hat y_i &&\Longleftrightarrow\\ \sum_{j=1}^n \hat z_{ij} &\le \alpha \beta b_i \hat y_i.\ \text{▨} \end{aligned}\]
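The inequality chain of the proof can be traced numerically. The sketch below uses an instance chosen for illustration (the LP relaxation of vertex cover on a triangle), where \(\hat x\) is a vertex cover, \(\hat y\) is a feasible dual solution, and \((1, 2)\)-relaxed complementary slackness holds; the computed sums satisfy (12.3) and (12.4):

```python
# Illustrative instance: rows of A = edges of a triangle, columns = vertices.
A = [[1, 1, 0], [1, 0, 1], [0, 1, 1]]
b = [1, 1, 1]
c = [1, 1, 1]
x = [1, 1, 0]        # feasible primal solution (a vertex cover)
y = [0.5, 0.5, 0.5]  # feasible dual solution
alpha, beta = 1, 2   # (1, 2)-relaxed complementary slackness holds here

# z_ij = alpha * a_ij * y_i * x_j, as in the proof of Lemma 12.1.
z = [[alpha * A[i][j] * y[i] * x[j] for j in range(3)] for i in range(3)]
lhs = sum(c[j] * x[j] for j in range(3))                    # primal objective
mid = sum(z[i][j] for i in range(3) for j in range(3))      # sum of all z_ij
rhs = alpha * beta * sum(b[i] * y[i] for i in range(3))     # alpha*beta * dual objective
# Here lhs = 2, mid = 2 and rhs = 3, so (12.3) and (12.4) both hold.
```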
We can use Lemma 12.1 to obtain efficient approximation algorithms for NP-hard optimization problems. These algorithms maintain a (possibly infeasible) primal solution \(\hat x\) and a feasible dual solution \(\hat y\) that satisfy \((\alpha, \beta)\)-relaxed complementary slackness for appropriate choices of \(\alpha\) and \(\beta\) (usually, \(\alpha = 1\) or \(\beta = 1\)). Each iteration moves the primal solution closer to feasibility while maintaining the feasibility of the dual solution and \((\alpha, \beta)\)-relaxed complementary slackness. Therefore, once the primal solution becomes feasible, it is an \(\alpha\beta\)-approximation of an optimal solution.
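As a concrete illustration of this template (the example is mine, not from the text above), the classic primal-dual algorithm for weighted vertex cover fits it with \(\alpha = 1\) and \(\beta = 2\): dual edge variables are raised until some vertex constraint becomes tight, and only tight vertices enter the cover, so primal complementary slackness holds exactly, while every edge constraint \(\hat x_u + \hat x_v \ge 1\) is violated in the relaxed dual sense by at most a factor of \(2\). Function and variable names are my own:

```python
def primal_dual_vertex_cover(n, edges, cost):
    """Primal-dual 2-approximation for weighted vertex cover.

    Maintains a feasible dual y (one variable per edge) and adds a vertex
    to the cover only when its dual constraint becomes tight (alpha = 1);
    every edge has only two endpoints, which gives beta = 2.
    """
    slack = list(cost)       # remaining slack of each vertex's dual constraint
    in_cover = [False] * n
    y = {}                   # nonzero dual values, keyed by edge
    for (u, v) in edges:
        if in_cover[u] or in_cover[v]:
            continue         # edge already covered; its dual value stays 0
        delta = min(slack[u], slack[v])  # raise y_uv as far as feasibility allows
        y[(u, v)] = delta
        slack[u] -= delta
        slack[v] -= delta
        if slack[u] == 0:
            in_cover[u] = True
        if slack[v] == 0:
            in_cover[v] = True
    return in_cover, y

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
cost = [1, 1, 2, 1]
cover, y = primal_dual_vertex_cover(4, edges, cost)
```

On termination every edge is covered, and by Lemma 12.1 with \(\alpha = 1\) and \(\beta = 2\), the cost of the cover is at most \(2\sum_e y_e\), hence at most twice the optimal cost.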

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.