How to do a Magnus expansion for a time-dependent companion matrix?

In summary, the Magnus expansion for a time-dependent companion matrix expresses the solution of a linear differential equation as the matrix exponential of a series. The process begins by identifying the time-dependent matrix and its associated companion matrix. The Magnus series is then constructed from nested commutators of the matrix evaluated at different times, leading to an infinite series representation. Typically, the first few terms of the series are computed to approximate the solution. The convergence of the series depends on the properties of the matrix and the specific application.
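In standard notation (spelled out here for reference; the thread itself does not write these down), the expansion writes ##x(t)=e^{\Omega(t)}x(0)## with
$$
\Omega(t)=\sum_{k=1}^{\infty}\Omega_k(t),\qquad
\Omega_1(t)=\int_0^t A(t_1)\,dt_1,\qquad
\Omega_2(t)=\frac{1}{2}\int_0^t\!\int_0^{t_1}\big[A(t_1),A(t_2)\big]\,dt_2\,dt_1.
$$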
  • #1
adf89812
TL;DR Summary
How do I compute the Magnus expansion of a time-dependent companion matrix, or alternatively solve the system by substitution?
For example, consider the following system of two first-order ODEs:
$$
\left\{\begin{array}{l}
x_1^{\prime}=2 t x_1+t^2 x_2 \\
x_2^{\prime}=t^3 x_1+4 t x_2
\end{array}\right.
$$

This is a linear homogeneous system of two first-order ODEs with $$A(t)=\left[\begin{array}{ll}2 t & t^2 \\ t^3 & 4 t\end{array}\right]$$.


"Secondly, the substitution method works in the same manner as usual. Indeed, the first line of the system leads to $$y = t^{-2}\dot{x} + 2t^{-1}x$$, which can be differentiated in order to find $\dot{y}$ in terms of $$x$$ and $$t$$. Next, plugging these expressions into the second line, you will end up with a second-order linear ODE with non-constant coefficients for $$x$$, which itself might not be easy to solve in the present case."

"
Firstly, you mentioned diagonalization; however, in that case, the eigenvalues and the eigenvectors will themselves be time-dependent. If $$S$$ denotes the change of basis allowing the diagonalization of $$A$$ as $$D$$, i.e. $$ D= SAS^{-1}$$, then the system of equations $$\dot{u} = Au$$, where $$u = (x,y)$$, becomes
$$
\dot{v} = \partial_t(Su) = S\dot{u} + \dot{S}u = \left(SA + \dot{S}\right)u = \left(SAS^{-1} + \dot{S}S^{-1}\right)Su = \left(D + \dot{S}S^{-1}\right)v
$$
for $$v = Su$$. This new system might be even harder to solve, because of the extra (non-diagonal) term $$\dot{S}S^{-1}$$."
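For concreteness, here is a minimal Python/sympy sketch (added for illustration, not part of the original post) that computes the first two Magnus terms for this ##A(t)##:

[code]
# Sketch: first two Magnus terms for A(t) = [[2t, t^2], [t^3, 4t]].
import sympy as sp

t, s1, s2 = sp.symbols('t s1 s2', positive=True)

def A(u):
    return sp.Matrix([[2*u, u**2], [u**3, 4*u]])

# Omega_1(t) = int_0^t A(s1) ds1, integrated entry by entry
Omega1 = A(s1).applyfunc(lambda e: sp.integrate(e, (s1, 0, t)))

# Omega_2(t) = (1/2) int_0^t int_0^{s1} [A(s1), A(s2)] ds2 ds1
comm = A(s1)*A(s2) - A(s2)*A(s1)
inner = comm.applyfunc(lambda e: sp.integrate(e, (s2, 0, s1)))
Omega2 = sp.Rational(1, 2) * inner.applyfunc(lambda e: sp.integrate(e, (s1, 0, t)))

print(Omega1)               # Matrix([[t**2, t**3/3], [t**4/4, 2*t**2]])
print(sp.simplify(Omega2))  # nonzero, so the naive exponential is not exact
[/code]

Since ##\Omega_2\neq 0##, the matrices ##A(t_1)## and ##A(t_2)## do not commute, which is exactly the obstruction discussed in the replies below.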
 
  • #2
Please let me know: is [tex]x' = \dot{x} = \frac{dx}{dt}[/tex] here?
 
  • #3
So is it basically ##\dot{x}(t)=A(t)x(t)##?
One sort of solution is ##x(t)=e^{\int_0^t A(s)\,ds}x(0)##.

I think so, am I wrong?
 
  • #4
billtodd said:
I think so, am I wrong?
You are correct only if ##x,A## are scalars (1-dimensional). Otherwise, if ##x(t)## is an n-dimensional vector and ##A(t)## is an ##n\times n## matrix, then the correct solution involves the ordered exponential, built from nested integrals over products of ##A(t)##. This reduces to the usual exponential for special matrices that commute at different times, such as a constant matrix.
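To illustrate, here is a quick numerical check (added for illustration, not part of the original reply) comparing the true solution of ##\dot{x}=A(t)x## with the naive guess ##\exp\left(\int_0^T A\right)x(0)## for the matrix in the OP:

[code]
# Quick check: naive expm(int A) vs. an accurate ODE solve, for the OP's A(t).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

def A(t):
    return np.array([[2*t, t**2], [t**3, 4*t]])

x0 = np.array([1.0, 0.0])
T = 1.0

# Accurate reference solution of xdot = A(t) x.
sol = solve_ivp(lambda t, x: A(t) @ x, (0.0, T), x0, rtol=1e-10, atol=1e-12)
x_true = sol.y[:, -1]

# Naive guess: exponential of the integrated matrix.
Omega1 = np.array([[T**2, T**3/3], [T**4/4, 2*T**2]])  # = int_0^T A(s) ds
x_naive = expm(Omega1) @ x0

print(x_true, x_naive)  # the two differ, because A(t1) and A(t2) don't commute
[/code]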
 
  • #5
renormalize said:
You are correct only if ##x,A## are scalars (1-dimensional). Otherwise, if ##x(t)## is an n-dimensional vector and ##A(t)## is an ##n\times n## matrix, then the correct solution involves the ordered exponential, built from nested integrals over products of ##A(t)##. This reduces to the usual exponential for special matrices that commute at different times, such as a constant matrix.
I was referring to the vector case.
Obviously using ##e^A=\sum_{n=0}^\infty A^n/n!##.
And here (in the OP) you first integrate ##A## and then plug it into the sum, though I am not sure whether there is a closed-form solution here.
 
  • #6
billtodd said:
I was referring to the vector case.
Obviously using ##e^A=\sum_{n=0}^\infty A^n/n!##.
And here (in the OP) you first integrate ##A## and then plug it into the sum, though I am not sure whether there is a closed-form solution here.
Sorry, that doesn't work to solve your vector ODE for a general matrix ##A(t)##. Just try it out: start from your proposed "solution" ##x\left(t\right)=\exp\left[\intop_{0}^{t}A\left(t^{\prime}\right)dt^{\prime}\right]x\left(0\right)## with the exponential expanded as a series. Now differentiate the series term-by-term to see if ##x(t)## satisfies ##\dot{x}\left(t\right)=A\left(t\right)x\left(t\right)##. (Hint: it won't, unless ##A(t_1)## commutes with ##A(t_2)## for all pairs ##t_1,t_2##, as happens, e.g., for a constant matrix, so that ##A## can be moved completely to the left of the exponential.)
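Spelling that hint out for the quadratic term (a worked step added for clarity): with ##\Omega(t)=\int_0^t A(t')\,dt'##, so that ##\dot\Omega = A##, term-by-term differentiation gives
$$
\frac{d}{dt}\,\frac{\Omega(t)^2}{2}=\frac{1}{2}\Big(A(t)\,\Omega(t)+\Omega(t)\,A(t)\Big),
$$
which equals the required ##A(t)\,\Omega(t)## only when ##A(t)## and ##\Omega(t)## commute; the same obstruction recurs in every higher-order term.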
 
  • #7
Hi @billtodd. If you write ##x(t) = M(t) x(0)##, then ##\dot{x}(t) = \dot{M}(t) x(0) = A(t) x(t) = A(t) M(t) x(0)##, implying you need to solve:

\begin{align*}
\frac{d}{dt} M(t) = A(t) M(t)
\end{align*}

subject to ##M(0)= \mathbb{1}##. I derived the formal solution to this for the general case of a time-dependent matrix ##A(t)## here:

https://www.physicsforums.com/threa...unction-valued-matrices.1046714/#post-6814958

and it involves the time-ordered product of matrices ##A(t_1),A(t_2), \dots##. This solution reduces to ##M(t) = \exp (\int_0^t A (t') dt')## when ##A(t)## is a constant matrix (or, more generally, when the matrices commute at different times).
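Numerically, that time-ordered solution can be sketched as a product of short-time propagators (an illustration added here; the midpoint evaluation and step count are arbitrary choices), with later times acting on the left:

[code]
# Sketch: approximate the time-ordered exponential M(T) by a product of
# short-time propagators expm(A(t_k) dt), later times acting on the left.
import numpy as np
from scipy.linalg import expm

def A(t):
    return np.array([[2*t, t**2], [t**3, 4*t]])

def ordered_exp(T, n_steps=2000):
    dt = T / n_steps
    M = np.eye(2)
    for k in range(n_steps):
        tk = (k + 0.5) * dt        # evaluate A at the midpoint of each slice
        M = expm(A(tk) * dt) @ M   # time ordering: later factors on the LEFT
    return M

M = ordered_exp(1.0)
print(M @ np.array([1.0, 0.0]))  # should agree closely with an ODE solver
[/code]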
 
  • #8
renormalize said:
Sorry, that doesn't work to solve your vector ODE for a general matrix ##A(t)##. Just try it out: start from your proposed "solution" ##x\left(t\right)=\exp\left[\intop_{0}^{t}A\left(t^{\prime}\right)dt^{\prime}\right]x\left(0\right)## with the exponential expanded as a series. Now differentiate the series term-by-term to see if ##x(t)## satisfies ##\dot{x}\left(t\right)=A\left(t\right)x\left(t\right)##. (Hint: it won't, unless ##A(t_1)## commutes with ##A(t_2)## for all pairs ##t_1,t_2##, as happens, e.g., for a constant matrix, so that ##A## can be moved completely to the left of the exponential.)
So how would you solve it?
One can try power series for both ##x_1(t),x_2(t)##, but that turns the differential equations into recurrence equations, which doesn't necessarily make the problem any easier (see the sketch below).
OK, now I understand what that ordered exponential is.
The first time I saw it was in QM2.
https://en.wikipedia.org/wiki/Dyson_series
But with the ordered exponential one can, in general, only solve this numerically.
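As for the power-series route, the recurrence is at least mechanical for this polynomial ##A(t)##. Matching the coefficient of ##t^n## in ##x_1'=2tx_1+t^2x_2## and ##x_2'=t^3x_1+4tx_2## gives ##(n+1)a_{n+1}=2a_{n-1}+b_{n-2}## and ##(n+1)b_{n+1}=a_{n-3}+4b_{n-1}##. A small Python sketch (added for illustration; the initial condition ##x(0)=(1,0)## is an arbitrary choice):

[code]
# Sketch: Taylor coefficients of x1, x2 from the recurrences
# (n+1) a_{n+1} = 2 a_{n-1} + b_{n-2},  (n+1) b_{n+1} = a_{n-3} + 4 b_{n-1}.
from fractions import Fraction

N = 12                                  # number of coefficients to generate
a = [Fraction(0)] * N                   # a[n]: coefficient of t^n in x1
b = [Fraction(0)] * N                   # b[n]: coefficient of t^n in x2
a[0], b[0] = Fraction(1), Fraction(0)   # arbitrary initial condition x(0)=(1,0)

for n in range(N - 1):
    rhs1 = 2 * (a[n-1] if n >= 1 else 0) + (b[n-2] if n >= 2 else 0)
    rhs2 = (a[n-3] if n >= 3 else 0) + 4 * (b[n-1] if n >= 1 else 0)
    a[n+1] = Fraction(rhs1, n + 1)
    b[n+1] = Fraction(rhs2, n + 1)

print(a[:6])  # coefficients 1, 0, 1, 0, 1/2, 0: x1 starts like exp(t**2)
[/code]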
 