On asymptotically stable systems and bounded solutions

  • #1
psie
Homework Statement
Assume that the homogeneous system ##x'=Ax## is asymptotically stable. Show that if ##b(t)## is bounded for ##t\geq t_0##, then every solution of the system ##x'=Ax+b(t)## is bounded for ##t\geq t_0##.
Relevant Equations
A system ##x'=Ax## is said to be asymptotically stable iff all eigenvalues of ##A## have a negative real part. Moreover, the general solution to ##x'=Ax+b(t)## with ##x(t_0)=x_0## is given by ##x(t)=e^{tA}x_0+\int_{t_0}^te^{(t-\tau)A}b(\tau)\,d\tau##.
We need to show that ##\lVert x(t)\rVert## is bounded. It is given that ##\lVert b(t)\rVert\leq c_1## for ##t\geq t_0##. A TA has claimed that ##\lVert e^{tA}\rVert\leq ce^{-\epsilon t}## holds for some ##\epsilon>0## and some constant ##c##, when ##t\geq0##. I have a hard time confirming this claim and I'd be grateful if anyone could comment on it. If this bound is true, then the statement in the exercise follows from the following estimates.

First, for ##t\geq t_0##, \begin{align}\left\lVert \int_{t_0}^te^{(t-\tau)A}b(\tau)d\tau\right\rVert&\leq \int_{t_0}^t \left\lVert e^{(t-\tau)A}\right\rVert \left\lVert b(\tau) \right\rVert d\tau \nonumber \\
&\leq c\cdot c_1 \int_{t_0}^te^{-\epsilon(t-\tau)}d\tau \nonumber \\
& =c\cdot c_1\left(\frac1{\epsilon}-\frac{e^{-\epsilon(t-t_0)}}{\epsilon}\right) \nonumber \\
&\leq \frac{c\cdot c_1}{\epsilon} \nonumber\\
& =\frac{C}{\epsilon}
\nonumber\end{align}

Second, for ##t\geq t_0##, $$\lVert e^{tA}x_0\rVert\leq ce^{-\epsilon t}\lVert x_0\rVert \leq ce^{-\epsilon t_0} \lVert x_0\rVert=D$$

Finally, by the triangle inequality, ##\lVert x(t)\rVert\leq \frac{C}{\epsilon}+D## when ##t\geq t_0##.
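
As a quick numerical sanity check (not a proof), I compared ##\lVert e^{tA}\rVert## with ##ce^{-\epsilon t}## for a sample stable matrix using numpy and scipy; the matrix ##A## and the value ##\epsilon=0.5## below are just choices I made for the experiment.

[CODE=python]
import numpy as np
from scipy.linalg import expm

# Example matrix with eigenvalues -1 and -2, so x' = Ax is asymptotically stable.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eps = 0.5  # candidate epsilon, smaller than min |Re(eigenvalue)| = 1
ts = np.linspace(0.0, 30.0, 601)

# If ||exp(tA)|| <= c * exp(-eps*t) held, then ||exp(tA)|| * exp(eps*t) would stay bounded.
vals = [np.linalg.norm(expm(t * A), 2) * np.exp(eps * t) for t in ts]
print("sup of ||exp(tA)||*exp(eps*t) on the grid:", max(vals))
[/CODE]

The printed value stays finite (it would serve as a candidate ##c##), which is consistent with the claim.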

But why does ##\lVert e^{tA}\rVert\leq ce^{-\epsilon t}## hold? I know that every entry of ##e^{tA}## is a linear combination of terms of the form ##t^je^{\lambda t}##, where ##\lambda## is an eigenvalue of ##A## and ##j## is less than the multiplicity of that eigenvalue. Moreover, I know that ##\lVert e^{A}\rVert\leq e^{\lVert A\rVert}##, but I don't know if this is helpful.

Also, I have assumed in my computations that ##t_0\geq 0##. I guess it makes no sense for ##t_0<0##, right?
 
  • #2
nuuskur said:
If ##A## is a Jordan block whose eigenvalue has negative real part, then ##\|\exp(At)\|\to 0## as ##t\to\infty##. More specifically, if we write ##A=\lambda I_n + N##, where ##N## is nilpotent, then ##\exp (At) = \exp (\lambda I_nt)\exp (Nt)##. In particular ##\exp(Nt) = \sum _{k=0}^M \frac{(Nt)^k}{k!}##, where ##N^{M+1}=0## due to nilpotency. Putting ##\lambda = a+bi##, where by assumption ##a<0##, it follows that
[tex]\left\|\exp(At)\right\| = \left\|e^{at}\exp(Nt)\right\| =: Ce^{at}.[/tex]
In general, assume ##A## is in its Jordan normal form; since there are finitely many blocks, you can take maximums.
Interesting. I am not too familiar with this decomposition, and Wikipedia is not so helpful, but we have as many Jordan blocks as there are distinct eigenvalues, right?

My understanding is also that the matrix exponential of a matrix in Jordan normal form is simply the matrix exponential applied to each block. However, how do I compute the norm of a matrix in Jordan normal form?

Lastly, why does ##\lVert\exp(Nt)\rVert## evaluate to a constant? Shouldn't it depend on ##t##?
 
  • #3
Correct, it does depend on ##t##. I was getting ahead of myself. It holds that
##\|e^{-\delta t}\exp(Nt)\| \to 0## as ##t\to\infty## for any ##\delta >0##. Indeed we see that
\begin{align*}
\|e^{-\delta t}\exp(Nt)\| = \left\|e^{-\delta t}\sum _{k=0}^M\frac{N^k}{k!}t^k\right\| \leqslant e^{-\delta t}\sum _{k=0}^M \frac{\|N\|^k}{k!}t^k \xrightarrow[t\to\infty]{}0
\end{align*}
because a polynomial grows more slowly than the exponential decays. So we have boundedness: ##\|e^{-\delta t}\exp(Nt)\| \leqslant C##.
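
For a concrete instance (just an illustration): take the ##2\times 2## nilpotent block
$$N=\begin{pmatrix}0&1\\0&0\end{pmatrix},\qquad \exp(Nt)=I+Nt=\begin{pmatrix}1&t\\0&1\end{pmatrix},$$
so with ##\|N\|=1## and ##M=1## the estimate above gives ##\|e^{-\delta t}\exp(Nt)\|\leqslant e^{-\delta t}(1+t)##. For ##0<\delta\leqslant 1## the right-hand side is maximal at ##t=1/\delta-1##, so one can take ##C=e^{\delta-1}/\delta## (for instance ##C=2e^{-1/2}\approx 1.21## when ##\delta=\tfrac12##).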

Let ##A = \lambda I_n + N## be a Jordan block, where ##\lambda = a+bi## (and ##a<0##).
\begin{align*}
\left\|\exp(At)\right\| = \left\|e^{a t}\exp(Nt)\right\| = \left\|e^{(a+\delta - \delta)t}\exp(Nt)\right\| = e^{(a+\delta)t}\left\|e^{-\delta t}\exp(Nt)\right\| \leqslant C e^{(a+\delta)t }.
\end{align*}
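
Continuing the ##2\times 2## illustration: with the same ##N## and ##\lambda=-1## (so ##a=-1##),
$$A=\begin{pmatrix}-1&1\\0&-1\end{pmatrix},\qquad \exp(At)=e^{-t}\begin{pmatrix}1&t\\0&1\end{pmatrix},$$
and choosing ##\delta=\tfrac12## gives ##\|\exp(At)\|\leqslant 2e^{-1/2}\,e^{(-1+\frac12)t}=2e^{-1/2}\,e^{-t/2}##, i.e. ##\varepsilon=\tfrac12## works for this block.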

In general, assume ##A## is in its Jordan normal form. There are finitely many blocks, each corresponding to its respective eigenvalue ##\lambda _k := a_k+ b_ki##. Pick ##\delta>0## sufficiently small that ##\max_k a_k + \delta \leqslant -\varepsilon## for some ##\varepsilon >0##; the estimate above then gives a bound of the form ##Ce^{-\varepsilon t}## for every block.

---

Regarding the exponential of the Jordan normal form: yes, if ##J## is a Jordan matrix, then ##J= \bigoplus J_k##, where the ##J_k## are Jordan blocks, so ##\exp(Jt) = \bigoplus \exp(J_kt)##. I confess I haven't tried computing the (2-)norm of a Jordan matrix. I know that the spectral radius is a lower bound, but that's about all I can say.
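
For what it's worth, the 2-norm of a block-diagonal matrix is the maximum of the blocks' 2-norms (its singular values are just the union over the blocks), which is another way to see why taking maxima over blocks is enough. A small numerical check (the blocks below are arbitrary examples):

[CODE=python]
import numpy as np
from scipy.linalg import expm, block_diag

# Two Jordan blocks: a 2x2 block for lambda = -1 and a 1x1 block for lambda = -3.
J1 = np.array([[-1.0, 1.0],
               [0.0, -1.0]])
J2 = np.array([[-3.0]])
J = block_diag(J1, J2)

t = 2.0
norm_full = np.linalg.norm(expm(t * J), 2)
norm_blocks = max(np.linalg.norm(expm(t * J1), 2),
                  np.linalg.norm(expm(t * J2), 2))
print(norm_full, norm_blocks)  # the two values agree
[/CODE]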
 
  • #4
nuuskur said:
So we have boundedness ##\|e^{-\delta t}\exp(Nt)\| \leqslant C##.
Thanks. Do we have boundedness for all ##t## or only for some ##t##?
 
  • #5
This bound applies for all ##t\geqslant 0##: if ##t_0## is sufficiently large, then ##\|e^{-\delta t}\exp(Nt)\|\leqslant \varepsilon## for ##t>t_0##, and ##\|e^{-\delta t}\exp(Nt)\|## is continuous (because it is a composition of continuous maps) on the closed interval ##[0,t_0]##, hence bounded there by the extreme value theorem.
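
In one line:
$$\sup_{t\geqslant 0}\|e^{-\delta t}\exp(Nt)\|\;\leqslant\;\max\Big\{\max_{0\leqslant t\leqslant t_0}\|e^{-\delta t}\exp(Nt)\|,\ \varepsilon\Big\}<\infty.$$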

edit: I should mention the ##C## is a bound for this particular Jordan block. But again, you can take the maximum of these bounds to obtain a bound for the entire matrix.
 

FAQ: On asymptotically stable systems and bounded solutions

What is an asymptotically stable system?

An asymptotically stable system is one in which any small perturbation or deviation from an equilibrium state will decay over time, causing the system to return to its equilibrium state. Mathematically, this means that solutions to the system's differential equations will converge to a steady state as time approaches infinity.

How can you determine if a system is asymptotically stable?

To determine if a system is asymptotically stable, you typically analyze the eigenvalues of the system's Jacobian matrix at the equilibrium point. If all eigenvalues have negative real parts, the system is asymptotically stable. Alternatively, Lyapunov's direct method can be used, where a Lyapunov function is constructed to show that the system's energy decreases over time.
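
For a linear (or linearized) system ##x'=Ax##, this eigenvalue check is a one-liner; here is a minimal sketch with numpy, where the matrix is just an example:

[CODE=python]
import numpy as np

# Example system matrix; its eigenvalues are -1 and -2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigs = np.linalg.eigvals(A)
print(eigs)
print("asymptotically stable:", bool(np.all(eigs.real < 0)))
[/CODE]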

What is the difference between stability and asymptotic stability?

Stability, in general, means that the system will not exhibit unbounded behavior in response to small perturbations. Asymptotic stability is a stronger condition where, in addition to being stable, the system's state will return to equilibrium over time. In other words, while a stable system resists deviations, an asymptotically stable system actively returns to equilibrium.

What are bounded solutions in the context of dynamical systems?

Bounded solutions in dynamical systems refer to solutions that remain within a finite range for all time. This means that the state variables of the system do not grow without bound, regardless of the initial conditions, ensuring that the system's behavior remains predictable and contained.

Why is the concept of bounded solutions important in control theory?

The concept of bounded solutions is crucial in control theory because it guarantees that the system's response to inputs and disturbances will not result in runaway behavior. Ensuring bounded solutions helps in designing controllers that maintain system performance within acceptable limits, thereby ensuring safety, reliability, and robustness in practical applications.
