Feasible solution for Linear Optimization

In summary: the first question asks how to justify that a tie for the minimum b-ratio at some iteration of the phase II simplex algorithm leads to a degenerate next basic feasible solution. The second question concerns a linear program P, maximize z = c^T x subject to Ax <= b and x >= 0: part (a) asks for a proof that adding a nonnegative multiple of a direction d with Ad = 0, d >= 0 to a feasible solution keeps it feasible, and part (b) uses this to prove unboundedness. A reply then outlines how to approach both parts.
  • #1
hkus10
1) How do I justify that if there is a tie for the minimum b-ratio at some iteration of the phase II simplex algorithm, then the next basic feasible solution is degenerate?

I have no idea how to justify it. Please give me some direction.

2) Max. z = transpose of C * the vector x
s.t. Ax less or = b
and x bigger or = to 0

Suppose d in R^n satisfies Ad = 0 and d bigger or equal to 0.
a) Prove that if u in R^n is a feasible solution of P, then so is u + td for all real t >= 0.
b) Use part (a) to prove that if the transpose of C * d >= 0, then P is unbounded.

I think the big problem here is that I do not know how to approach part (a).
Please give me some insight into how to at least begin solving this problem.

Thank you!
I appreciate your time!
 
  • #2
hkus10 said:
1) How do I justify that if there is a tie for the minimum b-ratio at some iteration of the phase II simplex algorithm, then the next basic feasible solution is degenerate?

I have no idea how to justify it. Please give me some direction.

2) Max. z = transpose of C * the vector x
s.t. Ax less or = b
and x bigger or = to 0

Suppose d in R^n satisfies Ad = 0 and d bigger or equal to 0.
a) Prove that if u in R^n is a feasible solution of P, then so is u + td for all real t >= 0.
b) Use part (a) to prove that if the transpose of C * d >= 0, then P is unbounded.

I think the big problem here is that I do not know how to approach part (a).
Please give me some insight into how to at least begin solving this problem.

Thank you!
I appreciate your time!

About your first question: if a nonbasic variable xj increases from 0 to t and drives TWO (or more) basic variables xr and xs to zero at the same time, then we can decide to let, say, xs leave the basis (to be replaced by xj). The other variable xr remains basic but takes the value zero, because xj = t is exactly the value that drives xr to zero as well. A basic feasible solution with a basic variable equal to zero is, by definition, degenerate.
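
Here is a minimal numerical sketch of that situation (the data are made up, not from the thread): in the LP max x1 subject to x1 <= 2, x1 + x2 <= 2, x >= 0, both slack rows give the same b-ratio of 2 when x1 enters, and after the pivot the slack that stays basic sits at zero.

```python
import numpy as np

# Toy example (hypothetical): max x1  s.t.  x1 <= 2,  x1 + x2 <= 2,  all variables >= 0.
# Tableau columns: x1, x2, s1, s2 | rhs.  Initial basis = {s1, s2}.
T = np.array([
    [1.0, 0.0, 1.0, 0.0, 2.0],   # row for s1
    [1.0, 1.0, 0.0, 1.0, 2.0],   # row for s2
])
basis = ["s1", "s2"]

enter = 0                              # x1 enters (it improves the objective)
col, rhs = T[:, enter], T[:, -1]
ratios = np.where(col > 0, rhs / col, np.inf)
print("b-ratios:", ratios)             # [2. 2.]  <- a tie

leave = 0                              # break the tie in favour of s1
T[leave] /= T[leave, enter]            # pivot: scale the pivot row ...
for i in range(T.shape[0]):
    if i != leave:
        T[i] -= T[i, enter] * T[leave] # ... and eliminate the entering column elsewhere
basis[leave] = "x1"

print(dict(zip(basis, T[:, -1])))      # {'x1': 2.0, 's2': 0.0} -> s2 is basic but zero: degenerate
```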

By the way, you can just write x >= 0, rather than writing out "x bigger or = 0"; similarly for Ax <= b, etc.

If your undefined problem P means the LP you wrote, then supposing Ad=0, d>=0, what can you say about x = u + td, t >= 0? Does it satisfy all the constraints of P? What is the resulting value of z(x) = c.x? (Here c,x are vectors, "." = inner product.)
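
For reference, here is a sketch of the algebra behind this hint (it takes the strict case c^T d > 0 for part (b); if c^T d = 0 the objective is constant along the ray u + td, so nothing unbounded happens):

```latex
\begin{align*}
A(u + t d) &= Au + t\,Ad = Au \le b && \text{since } Ad = 0 \text{ and } Au \le b,\\
u + t d    &\ge 0                   && \text{since } u \ge 0,\ d \ge 0,\ t \ge 0,\\
z(u + t d) &= c^{\mathsf T} u + t\, c^{\mathsf T} d \to \infty \text{ as } t \to \infty
                                    && \text{if } c^{\mathsf T} d > 0.
\end{align*}
```

Filling in why each step is legitimate is exactly what parts (a) and (b) ask for.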

RGV
 

Related to Feasible solution for Linear Optimization

What is a feasible solution for Linear Optimization?

A feasible solution for Linear Optimization is an assignment of values to the decision variables that satisfies every constraint of the problem, including any sign restrictions such as x >= 0. An optimal solution is a feasible solution that, in addition, achieves the best possible objective function value.

How is a feasible solution determined in Linear Optimization?

A feasible solution is determined by finding values of the decision variables that satisfy all of the constraints of the Linear Optimization problem. For small problems this can be done graphically or by hand; more generally, phase I of the simplex method constructs a feasible solution (or shows that none exists), and linear programming software automates the process.
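
As a concrete illustration, an LP solver can be asked for a point satisfying all the constraints. The sketch below uses scipy.optimize.linprog on made-up data (the problem is not from this thread); linprog minimizes, so the objective is negated to maximize:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy problem: maximize z = 3*x1 + 2*x2
# subject to x1 + x2 <= 4,  x1 <= 3,  x1, x2 >= 0.
c = np.array([3.0, 2.0])
A_ub = np.array([[1.0, 1.0],
                 [1.0, 0.0]])
b_ub = np.array([4.0, 3.0])

res = linprog(-c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)   # an optimal (hence feasible) point and its objective value
```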

Why is it important to have a feasible solution in Linear Optimization?

A feasible solution is important because it certifies that the constraints are consistent, so the problem has candidate solutions that could actually be implemented. Algorithms such as the simplex method also need a feasible starting point: phase I finds one (or detects infeasibility), and phase II then improves the objective from there.

Can a Linear Optimization problem have multiple feasible solutions?

Yes. In fact, a typical Linear Optimization problem has infinitely many feasible solutions: every point of the feasible region is one. Separately, a problem can also have multiple optimal solutions; this happens when different feasible points attain the same best objective value, for example when the objective is parallel to a face of the feasible region.

What happens if a Linear Optimization problem has no feasible solution?

If a Linear Optimization problem has no feasible solution, it is called infeasible: no assignment of values to the decision variables satisfies all the constraints simultaneously. This is usually caused by conflicting constraints or by an error in the problem formulation. In that case there is nothing to optimize, and the constraints must be relaxed or the model corrected before the problem can be solved.
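
A solver reports this situation explicitly. In the sketch below (again with made-up data), scipy.optimize.linprog is given the conflicting constraints x1 <= 1 and x1 >= 2 and returns a status code indicating infeasibility:

```python
from scipy.optimize import linprog

# Conflicting constraints (hypothetical): x1 <= 1 and x1 >= 2 (written as -x1 <= -2).
res = linprog(c=[1.0],
              A_ub=[[1.0], [-1.0]],
              b_ub=[1.0, -2.0],
              bounds=[(0, None)],
              method="highs")
print(res.status, res.message)   # status 2 signals an infeasible problem
```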
