Any square matrix can be expressed as the sum of anti/symmetric matrices

In summary, we can define the matrix ##X## of size ##(n,n)## with entries ##x_{ij}=\frac{1}{2}\left(a_{ij}+a_{ji}\right)## and the matrix ##Y## of size ##(n,n)## with entries ##y_{ij}=\frac{1}{2}\left(a_{ij}-a_{ji}\right)##. Using the properties of transposition and linearity, we can show that ##X^T=X## and ##Y^T=-Y##, and a direct computation of the entries shows that ##X+Y=A##. Note that this result fails in characteristic 2, where skew-symmetric matrices are a subset of symmetric matrices.
  • #1
Eclair_de_XII
TL;DR Summary
Show that any square matrix can be expressed as the sum of a symmetric matrix and an anti-symmetric matrix of the same size.

Important: I would like critique on the style of my proof more than its content. That being said, I am more than open to critique on the latter as well. In my academic career, I've often been told that my proofs are needlessly long. This is the very first proof-based exercise in my linear algebra book, and it took me about a page to write.
Let ##A## be a matrix of size ##(n,n)##. Denote the entry in the i-th row and j-th column of ##A## by ##a_{ij}##, for ##i,j\in\{1,\dots,n\}##. For brevity, we call ##a_{ij}## entry ##(i,j)## of ##A##.

Define the matrix ##X## to be of size ##(n,n)##, and denote entry ##(i,j)## of ##X## as ##x_{ij}##, where ##x_{ij}=\frac{1}{2}\left(a_{ij}+a_{ji}\right)##. We note that ##x_{ji}=\frac{1}{2}\left(a_{ji}+a_{ij}\right)=\frac{1}{2}\left(a_{ij}+a_{ji}\right)=x_{ij}##. Since matrices are uniquely identified by their entries, we conclude from this string of equalities that ##X^T=X##.

Define the matrix ##Y## to be of size ##(n,n)##. Denote entry ##(i,j)## of ##Y## as ##y_{ij}##, where ##y_{ij}=\frac{1}{2}\left(a_{ij}-a_{ji}\right)##. We note that ##y_{ji}=\frac{1}{2}\left(a_{ji}-a_{ij}\right)=-\frac{1}{2}\left(a_{ij}-a_{ji}\right)=-y_{ij}##. Since matrices are uniquely identified by their entries, we conclude from this string of equalities that ##Y^T=-Y##.

Finally, we compute entry ##(i,j)## of ##X+Y##:
\begin{eqnarray}
x_{ij}+y_{ij}&=&\frac{1}{2}\left(a_{ij}+a_{ji}\right)+\frac{1}{2}\left(a_{ij}-a_{ji}\right)\\
&=&\frac{1}{2}a_{ij}+\frac{1}{2}a_{ji}+\frac{1}{2}a_{ij}-\frac{1}{2}a_{ji}\\
&=&\left(\frac{1}{2}a_{ij}+\frac{1}{2}a_{ij}\right)+\left(\frac{1}{2}a_{ji}-\frac{1}{2}a_{ji}\right)\\
&=&a_{ij}+0\\
&=&a_{ij}
\end{eqnarray}

Since matrices are uniquely identified by their entries, we conclude that ##A=X+Y##.
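As a quick numerical sanity check (separate from the proof), here is a minimal sketch using NumPy; the random test matrix and tolerance-based comparisons are illustrative choices, not part of the exercise:

```python
# Verify the symmetric/anti-symmetric decomposition on a random matrix.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # an arbitrary square matrix

X = (A + A.T) / 2  # symmetric part: x_ij = (a_ij + a_ji) / 2
Y = (A - A.T) / 2  # anti-symmetric part: y_ij = (a_ij - a_ji) / 2

assert np.allclose(X, X.T)    # X^T = X
assert np.allclose(Y, -Y.T)   # Y^T = -Y
assert np.allclose(X + Y, A)  # X + Y = A
print("decomposition verified")
```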
 
  • #2
I guess the absolute minimalist version of this is
Note that for a square matrix A $$2\mathbf A= [\mathbf A+{\mathbf A}^t]+ [ \mathbf A-{\mathbf A}^t] $$
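Dividing by 2 (possible whenever 2 is invertible in the underlying field) makes the decomposition explicit, with each bracket's symmetry checked by transposing it:
$$\mathbf A=\underbrace{\tfrac{1}{2}\left[\mathbf A+{\mathbf A}^t\right]}_{\text{symmetric}}+\underbrace{\tfrac{1}{2}\left[\mathbf A-{\mathbf A}^t\right]}_{\text{anti-symmetric}}$$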
 
  • #3
Wouldn't you need to prove that the sum of any matrix and its transpose is symmetric, and the difference of any matrix and its transpose is anti-symmetric? Or is that just a proposition that is too obvious to prove --- which is to say, it could be proven in just a handful of lines of algebra?
 
  • #4
I would not feel the need, but I am not a mathematician. Do you need to define what "square" means?
I understand your angst; it used to drive me crazy... "Where do I start?" "What do I assume?"
I suggest that you figure out the really hard part first and write it down. Then anything else is just decoration... maybe you need it.
I would appreciate other opinions.
 
  • #5
Eclair_de_XII said:
Wouldn't you need to prove that the sum of any matrix and its transpose is symmetric, and the difference of any matrix and its transpose is anti-symmetric? Or is that just a proposition that is too obvious to prove --- which is to say, it could be proven in just a handful of lines of algebra?
It is too obvious to prove. What are ##\left(A^\tau\right)^\tau## and ##(A+B)^\tau ?##
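For reference, the two identities are ##\left(A^\tau\right)^\tau=A## and ##(A+B)^\tau=A^\tau+B^\tau##; applied with ##B=\pm A^\tau##, they give the handful of lines in question:
$$\left(A+A^\tau\right)^\tau=A^\tau+\left(A^\tau\right)^\tau=A+A^\tau,\qquad\left(A-A^\tau\right)^\tau=A^\tau-A=-\left(A-A^\tau\right).$$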
 
  • #6
Oh. So set ##B=\pm A^T##.
 
  • #7
Eclair_de_XII said:
Oh. So set ##B=\pm A^T##.
Yes. You use the linearity of transposition and the fact that it is its own inverse. If you transpose a product, you must reverse the order of the factors, but that isn't needed here.
 
  • #8
hutchphd said:
I guess the absolute minimalist version of this is
Note that for a square matrix A $$2\mathbf A= [\mathbf A+{\mathbf A}^t]+ [ \mathbf A-{\mathbf A}^t] $$
This is how I'd write it, with a footnote that in characteristic 2 the division by 2 isn't meaningful and the OP's claim isn't true. (Skew-symmetric matrices are a subset of symmetric matrices in characteristic 2, so a dimension argument shows the claim fails; alternatively, exhibiting a single non-symmetric matrix gives a counterexample.)
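To make that footnote concrete (a standard example over the two-element field ##\mathbb{F}_2##, where ##-1=1##): every anti-symmetric matrix is then symmetric, and a sum of symmetric matrices is symmetric, yet
$$A=\begin{pmatrix}0&1\\0&0\end{pmatrix}$$
is not symmetric, so no such decomposition of ##A## exists over ##\mathbb{F}_2##.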
 

FAQ: Any square matrix can be expressed as the sum of anti/symmetric matrices

What is a square matrix?

A square matrix is a matrix with an equal number of rows and columns. It is represented as a grid of numbers or variables enclosed in brackets.

What is an anti/symmetric matrix?

The term covers two related notions: a matrix is symmetric if it is equal to its own transpose (##A^T=A##), and anti-symmetric (or skew-symmetric) if it is equal to the negative of its own transpose (##A^T=-A##). In an anti-symmetric matrix, each entry below the main diagonal is the negative of the corresponding entry above it, and the diagonal entries are zero.
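For example (two illustrative ##2\times 2## matrices):
$$S=\begin{pmatrix}1&2\\2&3\end{pmatrix}=S^T,\qquad K=\begin{pmatrix}0&2\\-2&0\end{pmatrix}=-K^T.$$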

How can any square matrix be expressed as the sum of anti/symmetric matrices?

Any square matrix can be expressed as the sum of a symmetric matrix and an anti-symmetric matrix; this is sometimes called the symmetric decomposition. The symmetric part is the average of the matrix and its transpose, ##\frac{1}{2}(A+A^T)##, and the anti-symmetric part is half the difference of the matrix and its transpose, ##\frac{1}{2}(A-A^T)##.
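For instance, with the illustrative matrix ##A=\begin{pmatrix}1&2\\3&4\end{pmatrix}##:
$$\begin{pmatrix}1&2\\3&4\end{pmatrix}=\underbrace{\begin{pmatrix}1&\tfrac{5}{2}\\\tfrac{5}{2}&4\end{pmatrix}}_{\frac{1}{2}(A+A^T)}+\underbrace{\begin{pmatrix}0&-\tfrac{1}{2}\\\tfrac{1}{2}&0\end{pmatrix}}_{\frac{1}{2}(A-A^T)}.$$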

What is the significance of expressing a matrix as the sum of anti/symmetric matrices?

Splitting a matrix into its symmetric and anti-symmetric parts can simplify calculations and clarify the matrix's properties. For example, only the symmetric part contributes to the quadratic form ##x^TAx##, since ##x^TYx=0## for any anti-symmetric ##Y##.

Can any square matrix be expressed as the sum of only anti/symmetric matrices?

No. A sum of anti-symmetric matrices is itself anti-symmetric (and likewise a sum of symmetric matrices is symmetric), so only an anti-symmetric matrix can be written as a sum of anti-symmetric matrices. However, any square matrix can be expressed as the sum of one anti-symmetric matrix and one symmetric matrix, as described above.
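To see why a sum of anti-symmetric matrices stays anti-symmetric:
$$\left(K_1+K_2\right)^T=K_1^T+K_2^T=-K_1-K_2=-\left(K_1+K_2\right),$$
so, for example, the identity matrix can never be written as such a sum.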
