Master1022
- Homework Statement
- Given the primal problem below: (i) is it convex? (ii) find the dual problem.
- Relevant Equations
- Lagrangian, Hessian
Hi,
I am working on the following optimization problem and am confused about how to apply the Lagrangian in this scenario:
Question:
Let us look at the following problem
[tex] \min_{\vec{x} \in \mathbb{R}_{+}^{n}} \sum_{j=1}^{n} x_j \log(x_j) [/tex]
[tex] \text{subject to } A \vec{x} \leq \vec{b} \;\Rightarrow\; A\vec{x} - \vec{b} \leq \vec{0} [/tex]
[tex] \sum_{j=1}^{n} x_j = 1 [/tex]
(i) Is this problem convex?
(ii) Find the dual problem
Attempt:
(i) For convexity, I argued that the problem is convex using the Hessian of the objective (although I don't know if I applied it correctly):
[tex] \frac{\partial^2 (\text{obj})}{\partial x_k^2} = \frac{1}{x_k} > 0 \quad \forall x_k > 0 [/tex]
[tex] \frac{\partial^2 (\text{obj})}{\partial x_k \partial x_j} = 0 \quad (j \neq k) [/tex]
Thus the Hessian is diagonal with positive entries, hence positive definite on the interior of the domain, so the second-order condition is satisfied. Since the constraints are affine, the feasible set is convex as well, and the problem is convex.
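As a numerical sanity check of the Hessian claim above (not part of the homework argument itself), a central-difference Hessian of the objective at an interior point should come out diagonal with entries [itex]1/x_j[/itex]. A minimal sketch, assuming NumPy; the test point is made up for illustration:

```python
import numpy as np

# Objective f(x) = sum_j x_j * log(x_j); analytically its Hessian is
# diagonal with entries 1 / x_j on the interior of the positive orthant.
def f(x):
    return np.sum(x * np.log(x))

def numeric_hessian(f, x, h=1e-4):
    """Central-difference Hessian estimate at an interior point x > 0."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h ** 2)
    return H

x = np.array([0.2, 0.3, 0.5])
H = numeric_hessian(f, x)
# Diagonal entries should match 1/x; off-diagonal entries should be ~0.
```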
(ii) This is where I get confused, as I am not sure how to deal with the equality constraint. We can define the Lagrangian to be:
[tex] L(\vec{x}, \vec{\lambda}, \nu) = \sum_{j=1}^{n} x_j \log(x_j) + \vec{\lambda}^{T} \left( A\vec{x} - \vec{b} \right) + \text{something with the equality constraint} [/tex]
The first thing that comes to mind is a term built from [itex] \sum_{j=1}^{n} \nu_j x_j - 1 = 0 [/itex]. However, this ignores the fact that the constant 1 is also present in the expression, and I am not sure whether a single multiplier [itex]\nu[/itex] suffices or each [itex]x_j[/itex] needs its own [itex]\nu_j[/itex]. Other options I have thought about include:
(a) Introducing extra slack variables
(b) Re-formatting the equation
But I am not sure how to make any progress.
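To sanity-check whatever dual I eventually derive, one option is to solve a small instance of the primal numerically and later compare the optimal values. A sketch with SciPy's `minimize` (the matrix A and vector b here are made up purely for illustration, and the small lower bound on x keeps the log well-defined):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical small instance: A and b are invented for illustration only.
A = np.array([[1.0, 2.0, 0.5]])
b = np.array([1.5])
n = 3

def obj(x):
    # Primal objective: sum_j x_j * log(x_j)
    return np.sum(x * np.log(x))

cons = [
    {"type": "ineq", "fun": lambda x: b - A @ x},      # A x <= b
    {"type": "eq",   "fun": lambda x: np.sum(x) - 1},  # sum_j x_j = 1
]

x0 = np.full(n, 1.0 / n)  # feasible starting point on the simplex
res = minimize(obj, x0, bounds=[(1e-9, None)] * n, constraints=cons)
# res.x should be feasible; for this A, b the inequality is inactive,
# so the minimizer is the uniform distribution over the simplex.
```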