Proof of convergence theory in optimization

In summary: the two sequences do not always converge at the same rate. It is possible for one to converge faster than the other, depending on the specific function and on the iterates [itex]x_k[/itex] and the minimizer [itex]x_*[/itex].
  • #1
ianchenmu

Homework Statement

The question is:


Suppose that [itex]\lim_{k\to\infty} x_k=x_*[/itex], where [itex]x_*[/itex] is a local minimizer of the nonlinear function [itex]f[/itex]. Assume that [itex]\nabla^2 f(x_*)[/itex] is symmetric positive definite. Prove that the sequence [itex]\left \{ f(x_k)-f(x_*) \right \}[/itex] converges linearly if and only if [itex]\left \{ ||x_k-x_*|| \right \}[/itex] converges linearly. Prove that the two sequences converge at the same rate, regardless of what the rate is. What is the relationship between the rate constant for the two sequences?

Homework Equations

n/a

The Attempt at a Solution

I guess we may use the orthogonal diagonalization of a symmetric matrix together with the Taylor expansion [itex]f(x_k)-f(x_*)=\nabla f(x_*)^T(x_k-x_*)+\frac{1}{2}(x_k-x_*)^T\nabla^2 f(\xi)(x_k-x_*)[/itex] and [itex]\nabla f(x_*)=0[/itex], which leaves [itex]f(x_k)-f(x_*)=\frac{1}{2}(x_k-x_*)^T\nabla^2 f(\xi)(x_k-x_*)[/itex]. But I got stuck here. So what's your answer?
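As a quick numerical sanity check of that expansion (a sketch with made-up values, not part of the assignment), one can compare the two ratio sequences on a concrete quadratic whose iterates converge linearly:

```python
import numpy as np

# Minimal sketch: f(x) = 0.5 * x^T H x with minimizer x_* = 0, iterates
# x_k -> 0 that converge Q-linearly with rate r, and the ratio sequences
# for ||x_k - x_*|| and f(x_k) - f(x_*) side by side.
H = np.diag([1.0, 100.0])           # symmetric positive definite Hessian
r = 0.5                             # linear rate of the error norms

def f_gap(x):
    return 0.5 * x @ H @ x          # f(x_k) - f(x_*) for this quadratic

x = np.array([1.0, 1.0])
for k in range(8):
    x_next = r * x                  # ||x_{k+1}|| / ||x_k|| = r exactly
    ratio_norm = np.linalg.norm(x_next) / np.linalg.norm(x)
    ratio_f = f_gap(x_next) / f_gap(x)
    print(f"k={k}: norm ratio {ratio_norm:.4f}, f-gap ratio {ratio_f:.4f}")
    x = x_next
# With the error direction held fixed, the f-gap ratio is exactly r^2.
```

With a fixed error direction the function-value ratio is exactly the square of the norm ratio; the interesting question is what happens when the direction varies.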
 
  • #2
ianchenmu said:
Prove that the two sequences converge at the same rate, regardless of what the rate is. What is the relationship between the rate constant for the two sequences?

My answer is that the result you are being asked to prove is wrong as stated. Since [itex]\nabla f(x_*)=0[/itex], Taylor's theorem gives [itex]f(x_k)-f(x_*)=\frac{1}{2}(x_k-x_*)^T\nabla^2 f(\xi_k)(x_k-x_*)[/itex], so the function-value gap is squeezed between [itex]\frac{\lambda_{\min}}{2}||x_k-x_*||^2[/itex] and [itex]\frac{\lambda_{\max}}{2}||x_k-x_*||^2[/itex], where [itex]\lambda_{\min},\lambda_{\max}[/itex] are the extreme eigenvalues of the Hessian near [itex]x_*[/itex]. That squeeze links the two sequences and suggests the rate constant for [itex]\left \{ f(x_k)-f(x_*) \right \}[/itex] should be the square of the one for [itex]\left \{ ||x_k-x_*|| \right \}[/itex]. But if [itex]\lambda_{\min}<\lambda_{\max}[/itex], the ratio of successive function-value gaps can oscillate anywhere in [itex][r^2\lambda_{\min}/\lambda_{\max},\ r^2\lambda_{\max}/\lambda_{\min}][/itex] while the error norms converge with rate exactly [itex]r[/itex], so the two sequences need not converge at the same rate.
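To make that concrete, here is a minimal numerical counterexample sketch (my own toy construction, not from the thread): a quadratic with extreme Hessian eigenvalues 1 and 100, and iterates whose error direction alternates between the two eigenvectors.

```python
import numpy as np

# Toy counterexample: the error norms converge Q-linearly with rate r = 0.5,
# but the error direction alternates between eigenvectors of the Hessian,
# so the ratios of successive function-value gaps oscillate and
# {f(x_k) - f(x_*)} has no single linear rate constant.
H = np.diag([1.0, 100.0])      # Hessian with lambda_min = 1, lambda_max = 100
e1, e2 = np.eye(2)             # its eigenvectors (the coordinate axes)
r = 0.5

def f_gap(e):
    # f(x) = 0.5 * x^T H x with x_* = 0, so f(x_k) - f(x_*) = 0.5 * e^T H e
    return 0.5 * e @ H @ e

errors = [(r ** k) * (e1 if k % 2 == 0 else e2) for k in range(10)]
for k in range(len(errors) - 1):
    ratio_norm = np.linalg.norm(errors[k + 1]) / np.linalg.norm(errors[k])
    ratio_f = f_gap(errors[k + 1]) / f_gap(errors[k])
    print(f"k={k}: norm ratio {ratio_norm:.3f}, f-gap ratio {ratio_f:.4f}")
# The norm ratio is always 0.5, but the f-gap ratio alternates between
# r^2 * (lambda_max/lambda_min) = 25 and r^2 * (lambda_min/lambda_max) = 0.0025.
```

So the squeeze pins the function-value gap to the squared error norm up to the condition number, which is the real "relationship between the rate constants"; exact rate equality fails whenever the Hessian has distinct eigenvalues and the error directions keep rotating.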
 

FAQ: Proof of convergence theory in optimization

What is the proof of convergence theory in optimization?

A convergence proof in optimization is a mathematical argument showing that a given algorithm approaches an optimal solution under stated conditions. It provides a guarantee that the algorithm will eventually find the best possible solution for the class of problems those conditions describe.

What are the conditions for convergence in optimization?

The conditions for convergence depend on the specific algorithm being used. Common examples include a continuous and convex objective function, a bounded feasible region, and a step size that decreases as the algorithm progresses, as in the sketch below.
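As an illustration of those conditions (a minimal sketch with made-up numbers, not a definitive implementation), gradient descent with a diminishing step size [itex]a_k = a_0/(k+1)[/itex] on a smooth convex quadratic:

```python
import numpy as np

# Illustrative sketch (all names and values are my own, not from the thread):
# gradient descent on the convex quadratic f(x) = 0.5*(x1^2 + 10*x2^2)
# with a diminishing step size a_k = a0 / (k + 1), a standard choice that
# guarantees convergence for convex problems under mild assumptions.
def grad(x):
    return np.array([x[0], 10.0 * x[1]])

x = np.array([5.0, 5.0])
a0 = 0.1
for k in range(200):
    x = x - (a0 / (k + 1)) * grad(x)
print("final iterate:", x)   # converges, though slowly, toward the minimizer (0, 0)
```

The diminishing step trades speed for robustness: convergence is guaranteed under mild assumptions, but it can be slow compared with a well-tuned fixed step.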

How is the proof of convergence theory applied in real-world problems?

The proof of convergence theory is applied in real-world problems by providing a theoretical guarantee that an optimization algorithm will find the best possible solution. This is important for various applications such as machine learning, data analysis, and engineering design.

Can convergence be guaranteed for all optimization problems?

No, convergence cannot be guaranteed for all optimization problems. The proof of convergence theory only applies to certain algorithms and specific conditions. In some cases, the problem may be too complex or the algorithm may not be suitable, resulting in non-convergence.
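A minimal sketch of such non-convergence (my own toy example): the same kind of gradient descent, but with a fixed step size too large for the curvature, diverges instead of converging.

```python
# Toy example: gradient descent on f(x) = 5*x^2, whose curvature is L = 10.
# A fixed step larger than 2/L = 0.2 makes the iterates grow without bound,
# illustrating that convergence guarantees depend on the stated conditions.
def grad(x):
    return 10.0 * x

x, step = 1.0, 0.25                  # step > 2/L, so iterates diverge
for k in range(10):
    x = x - step * grad(x)
print("iterate after 10 steps:", x)  # magnitude grows: |1 - step*L| = 1.5 > 1
```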

Are there different types of convergence in optimization?

Yes, there are different types of convergence in optimization, such as local convergence and global convergence. Local convergence guarantees that the algorithm reaches a solution only when started sufficiently close to it, while global convergence ensures the algorithm converges (typically to a stationary point) from any starting point.
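A classic illustration (a sketch, assuming the smooth convex function [itex]f[/itex] with derivative [itex]f'(x)=\arctan(x)[/itex]): pure Newton's method converges rapidly when started close to the minimizer but diverges from a distant start, so it is only locally convergent without safeguards.

```python
import numpy as np

# Sketch (toy example, not from the thread): Newton's method for minimizing
# a convex f with f'(x) = arctan(x), f''(x) = 1/(1 + x^2). The Newton step is
# x - f'(x)/f''(x) = x - arctan(x)*(1 + x^2). It converges from a nearby
# start but overshoots and diverges from a distant one.
def newton(x0, steps=6):
    x = x0
    for _ in range(steps):
        x = x - np.arctan(x) * (1.0 + x * x)
    return x

print(newton(1.0))   # near the minimizer at 0: converges rapidly
print(newton(2.0))   # far away: Newton steps overshoot and diverge
```

A globalized variant (Newton with a line search or a trust region) would converge from both starting points, which is what the local/global distinction captures.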
