Prove that exp[A].exp[B] = exp[A+B] only if A and B commute.

  • Thread starter: humanist rho
  • Tags: Commute
In summary, the thread discusses the proof that exp(a+b) = exp(a)*exp(b) holds only if a and b commute. It covers expanding both sides of the equation using the series definition of exp(x). The final step substitutes l = n - k and rearranges the double sum into two independent sums over k and l from 0 to infinity, which yields the desired result.
  • #1
humanist rho
Prove that exp[A].exp[B] = exp[A+B] only if A and B commute.

Homework Statement



Prove that e^A · e^B = e^(A+B) only if A and B commute.


The attempt at a solution

I expanded both sides of the equation.

[tex]
\left(1 + A + \frac{A^2}{2!} + \dots\right)\left(1 + B + \frac{B^2}{2!} + \dots\right) = 1 + (A+B) + \frac{(A+B)^2}{2!} + \dots
[/tex]

Now how do I proceed?
 
  • #2


When you work out the brackets on the left hand side, you will get products like A^n B^m, where all the A's are to the left of the B's.

Now try to work out (A + B)^n (e.g., start with (A + B)^2). What do you need to get this into the same form?
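For concreteness, a sketch of the n = 2 case (my own elaboration, not part of the original hint), collecting the degree-2 terms on each side:

[tex]
\frac{(A+B)^2}{2!} = \frac{A^2 + AB + BA + B^2}{2},
\qquad\text{while } e^{A}e^{B} \text{ contributes } \frac{A^2}{2} + AB + \frac{B^2}{2} \text{ at this order.}
[/tex]

Equating the two forces AB = (AB + BA)/2, i.e. AB = BA.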
 
  • #3


Yeah, now I got it.
Thank you.
 
  • #4


Hi guys.
I was reading some QM and it mentioned that exp(a+b) = exp(a)*exp(b) only if [a,b] = 0, so I thought I'd go to the series definition of exp(x) and work it out myself.

I do understand where the problem arises if a and b don't commute, but that's not my issue. I want to write down the proof of exp(a+b) = exp(a)*exp(b) for "typical" (commuting) a and b, if you know what I mean.

So, I came across your post and I would like some further details on how to proceed with the proof.
This is what I have so far:

If:
[tex]
exp(x)=\sum_{n=0}^{\infty}{\frac{x^n}{n!}}
[/tex]

Then, for exp(a)*exp(b), we have:
[tex]
exp(a)*exp(b)=\sum_{n=0}^{\infty} \sum_{m=0}^{\infty}{\frac{a^n}{n!}}{\frac{b^m}{m!}}
[/tex]

However, if I start with exp(a+b), I proceed like this:
[tex]
exp(a+b)=\sum_{n=0}^{\infty}{\frac{(a+b)^n}{n!}} =
\sum_{n=0}^{\infty} \sum_{k=0}^{n}\frac{1}{n!}\binom{n}{k}a^k\cdot b^{n-k} =
\sum_{n=0}^{\infty} \sum_{k=0}^{n}\frac{a^k}{k!}\frac{b^{n-k}}{(n-k)!}
[/tex]

-------------------------------------------------------------
Summarizing, I get on one hand:
[tex]
\sum_{n=0}^{\infty} \sum_{m=0}^{\infty}{\frac{a^n}{n!}}{\frac{b^m}{m!}}
[/tex]
While on the other hand I get:
[tex]
\sum_{n=0}^{\infty} \sum_{k=0}^{n}\frac{a^k}{k!}\frac{b^{n-k}}{(n-k)!}
[/tex]

The two expressions look alike, but I can't put them into exactly the same form. Could you help me here?

How should I proceed? I have a feeling that some index substitution is the answer, but I haven't figured out which one...
 
  • #5
You are almost there. If you define l = n - k, then you obtain a sum over n of terms in k and l subject to k + l = n with k, l >= 0. That sum can then be rewritten as two independent sums, with k and l each running from 0 to infinity.
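Written out explicitly (my own elaboration of the hint above), the substitution l = n - k turns the iterated sum into a sum over the whole quadrant k, l >= 0:

[tex]
\sum_{n=0}^{\infty}\sum_{k=0}^{n}\frac{a^k}{k!}\,\frac{b^{n-k}}{(n-k)!}
= \sum_{k=0}^{\infty}\sum_{l=0}^{\infty}\frac{a^k}{k!}\,\frac{b^l}{l!}
= \left(\sum_{k=0}^{\infty}\frac{a^k}{k!}\right)\!\left(\sum_{l=0}^{\infty}\frac{b^l}{l!}\right)
= \exp(a)\exp(b)
[/tex]

(Recall that the binomial step used earlier already assumed ab = ba, which is exactly where commutativity enters.)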
 
  • #6
jjalonsoc said:
You are almost there. If you define l = n - k, then you obtain a sum over n of terms in k and l subject to k + l = n with k, l >= 0. That sum can then be rewritten as two independent sums, with k and l each running from 0 to infinity.

This homework thread is 2 years old, so the OP is probably not working on the problem any more... :smile:
 

Related to Prove that exp[A].exp[B] = exp[A+B] only if A and B commute.

What does the equation "exp[A].exp[B] = exp[A+B]" mean?

The equation states that the exponential of the sum of two matrices equals the product of the exponentials of the individual matrices.

Why is it important to prove that "exp[A].exp = exp[A+B]" only if A and B commute?

This proof is important because it shows that the exponential of a sum of matrices can be factored into a product of exponentials only if the matrices commute. This has significant implications in fields such as linear algebra and quantum mechanics.
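As an illustration (a standard example, not taken from the thread itself), the Pauli matrices do not commute, so the factorization fails for them:

[tex]
\sigma_x = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix},\quad
\sigma_y = \begin{pmatrix}0 & -i\\ i & 0\end{pmatrix},\quad
[\sigma_x,\sigma_y] = 2i\sigma_z \neq 0
\;\Rightarrow\;
e^{\sigma_x}e^{\sigma_y} \neq e^{\sigma_x+\sigma_y}.
[/tex]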

What is the definition of "commutativity" in mathematics?

In mathematics, commutativity refers to the property of two operations or elements that can be interchanged without affecting the result. In the context of matrices, it means that the order of multiplication does not change the final result.
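In symbols (standard notation, added here for completeness), two matrices A and B commute when their commutator vanishes:

[tex]
[A, B] \equiv AB - BA = 0.
[/tex]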

How do you prove that "exp[A].exp[B] = exp[A+B]" only if A and B commute?

One common approach uses the Baker-Campbell-Hausdorff formula, which expresses the product of two matrix exponentials as a single exponential whose argument is built from nested commutators. If A and B commute, all the commutator terms vanish and the formula reduces to exp(A+B). Alternatively, one can compare the power series of both sides directly, as done in the thread above.
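For reference, the leading terms of the Baker-Campbell-Hausdorff expansion (a standard result, not derived in the thread) are:

[tex]
e^{A}e^{B} = \exp\!\left(A + B + \tfrac{1}{2}[A,B] + \tfrac{1}{12}\big([A,[A,B]] + [B,[B,A]]\big) + \dots\right),
[/tex]

so when [A,B] = 0 every correction term vanishes and the right-hand side reduces to exp(A+B).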

What are the practical applications of this proof?

This result is used in physics, engineering, and computer science wherever matrix exponentials and matrix operations appear, most notably in quantum mechanics. It also helps in understanding the behavior of systems with non-commuting elements.
