What Conditions Ensure Convergence of Sequences?

  • Thread starter DespotDespond
  • Start date
  • Tags
    Convergence
In summary: continuity of f alone is not enough to guarantee that convergence of f(x1), f(x2), ... forces convergence of x1, x2, ...; a continuous (left-)inverse of f would be sufficient, but the poster's objective function is not injective, so no such inverse exists.
  • #1
DespotDespond
Hi,

I have a basic question about convergence.

I have two sequences, x1, x2, ... and y1, y2, ..., where yn = f(xn) for some function f : ℝ^N → ℝ.

I have shown that the sequence, y1, y2, ... converges. What conditions do I need on the function, f, to ensure that the sequence x1, x2, ... also converges?

Thanks in advance
 
  • #2
You need the function f to be continuous.
 
  • #3
In fact this is an equivalent definition of being continuous: f is continuous if and only if whenever [itex] x_n \to x [/itex], [itex] f(x_n) \to f(x) [/itex].
 
  • #4
But the poster is asking about the converse. He's asking whether ##f(x_n)\rightarrow f(x)## implies ##x_n\rightarrow x##. So continuity of ##f## is irrelevant here.
 
  • #5
Indeed, R136a1 is correct. I want to conclude that xn → x given that f(xn) → f(x).

In this case, continuity of f is not enough. For example, f(x) = x² is continuous, but if we define the two sequences as follows

x1 = √2, x2 = -√2, x3 = √2, x4 = -√2, ...

and

y1 = 2, y2 = 2, y3 = 2, y4 = 2, ...

then yn is convergent and yn = f(xn), but xn is not convergent.

If the function f were invertible and its inverse were continuous, then that would be enough. Right?

The problem is that in my case I don't think that f is invertible.

I don't necessarily need the result to hold globally, i.e. for all x. A local result might suffice.

Any suggestions would be appreciated.
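
Just to make that counterexample concrete, here is a quick numerical sketch in Python (purely illustrative, nothing to do with my actual problem):

Code:
import math

# Counterexample: f(x) = x^2 is continuous and f(x_n) converges (it is constant),
# yet x_n = (-1)^(n+1) * sqrt(2) oscillates and does not converge.
def f(x):
    return x ** 2

x_seq = [(-1) ** (n + 1) * math.sqrt(2) for n in range(1, 11)]
y_seq = [f(x) for x in x_seq]

print([round(x, 4) for x in x_seq])  # alternates between 1.4142 and -1.4142
print([round(y, 4) for y in y_seq])  # constant 2.0, so y_n converges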
 
  • #6
DespotDespond said:
If the function f were invertible and its inverse were continuous, then that would be enough. Right?

The problem is that in my case I don't think that f is invertible.

I don't necessarily need the result to hold globally, i.e. for all x. A local result might suffice.

You don't even need f to be fully invertible. A continuous left-inverse suffices.
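
Rough sketch of why that works (here ##g## just denotes whatever continuous map with ##g(f(x)) = x## for all ##x## you happen to have, and ##y## is the limit of the ##f(x_n)##):

[tex]
x_n = g\bigl(f(x_n)\bigr) \longrightarrow g(y) \quad \text{by continuity of } g,
[/tex]

so the ##x_n## converge, with limit ##g(y)##.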

I kind of doubt there is a simple condition. Could you perhaps give some more information about your specific situation?
 
  • #7
Hi,

Thanks for the response.

The problem is related to an algorithm that I've constructed for optimising Markov decision processes. I'm trying to prove the global convergence of the algorithm. I don't want to go into any unnecessary detail, so I will try to explain as briefly as I can.

I have an objective function, f, whose arguments, the x's, are the parameters I wish to optimise.

My algorithm generates a sequence of parameter vectors, x1, x2, ...

I've managed to show that the objective is strictly increasing along this sequence, i.e. that

f(x1) < f(x2) < ...

I know that f is bounded from above, so I used the monotone convergence theorem to conclude that the sequence

f(x1), f(x2), ...

converges.

Generally, f is not injective and so doesn't have an inverse.
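
To illustrate why I'm worried, here is a toy sketch in Python (emphatically not my real algorithm or objective, just an artificial example with the same structure): the objective values are strictly increasing and bounded above, so they converge, yet the parameter vectors keep winding around the unit circle and never settle down.

Code:
import math

# Toy objective on R^2, bounded above by 0 (purely illustrative).
def f(x):
    return -(x[0] ** 2 + x[1] ** 2 - 1.0) ** 2

# Artificial "iterates": the radius r_n = 1 - 1/n creeps up towards 1, so f(x_n)
# strictly increases towards 0, but the angle theta_n = 1 + 1/2 + ... + 1/n grows
# without bound, so x_n keeps circling and has no limit.
theta = 0.0
prev = None
for n in range(1, 2001):
    theta += 1.0 / n
    r = 1.0 - 1.0 / n
    x = (r * math.cos(theta), r * math.sin(theta))
    val = f(x)
    if prev is not None:
        assert val > prev  # f(x_n) is strictly increasing along the iterates
    prev = val

print(prev)  # close to the upper bound 0, so the f(x_n) converge
print(x)     # but the x_n themselves are still wandering around the circle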
 
  • #8
Thanks for your response. But what I really wanted to know is whether ##f## has some special properties that we could use. We know it's not injective, but is there perhaps something else? Obviously, if ##f## could be any continuous function, then the implication you're trying to prove would be false. So there must be something special going on.
 

FAQ: What Conditions Ensure Convergence of Sequences?

What is a simple convergence question?

A simple convergence question is a mathematical problem that requires finding the limit of a sequence or series. It involves determining whether a sequence or series of numbers approaches a specific value, known as the limit, as the number of terms approaches infinity.

How do you determine if a sequence or series converges?

To determine whether a series converges, you can use tests such as the comparison test, the ratio test, or the root test, which analyze the behavior of the terms to decide whether the partial sums approach a finite value. For a sequence, you can work directly from the definition of a limit or use results such as the monotone convergence theorem, which says that a bounded monotone sequence converges.
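
As a quick standalone illustration (a standard textbook example, not taken from the thread above), the ratio test shows that the series [itex]\sum_{n=1}^{\infty} \frac{2^n}{n!}[/itex] converges, because

[tex]
\lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| = \lim_{n \to \infty} \frac{2^{n+1}/(n+1)!}{2^n/n!} = \lim_{n \to \infty} \frac{2}{n+1} = 0 < 1 .
[/tex]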

What is the difference between convergence and divergence?

Convergence refers to the behavior of a sequence (or of the partial sums of a series) that approaches a specific value as the number of terms increases. Divergence occurs when there is no such value: the terms may grow without bound, or they may simply oscillate forever, as the bounded sequence (-1)^n does.

Why is understanding convergence important in mathematics?

Understanding convergence is essential in mathematics as it helps us determine the behavior of a sequence or series and if it has a finite limit. This concept is crucial in various fields such as calculus, statistics, and analysis, where it is used to solve problems and make predictions.

Can a sequence or series have multiple limits?

No, a convergent sequence or series has exactly one limit. The limit is the value that the terms (or partial sums) approach as the number of terms increases. If the terms appear to approach more than one value, as with an oscillating sequence, then the sequence simply does not converge.
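
A short sketch of why a limit, when it exists, is unique: if a sequence [itex]x_n[/itex] converged to both [itex]a[/itex] and [itex]b[/itex], then by the triangle inequality

[tex]
|a - b| \le |a - x_n| + |x_n - b| \longrightarrow 0 ,
[/tex]

which forces [itex]a = b[/itex].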
