Proving Matrix Equality Using Singular Value Decomposition

In summary, the conversation is about proving the equivalence of $AA^T = BB^T$ and $A = BO$, where $O$ is an orthogonal matrix. The asker initially suggests using singular value decomposition (SVD) but later questions its necessity. The expert clarifies that SVD is needed only for the forward direction of the proof. The forward and reverse directions are then spelled out, with the expert pointing out that if $AA^T = BB^T$, then $A$ and $B$ have the same singular values. Finally, the expert uses SVD to construct an orthogonal matrix $O$ satisfying $A = BO$.
  • #1
linearishard
Hi, I have another question: if $A$ and $B$ are $m \times n$ matrices, how do I prove that $AA^T = BB^T$ iff $A = BO$ for some orthogonal matrix $O$? I think I need to use a singular value decomposition, but I'm not sure. Thanks!
 
  • #2
Can you at least prove the reverse direction, that is, if $A = BO$ for some orthogonal matrix $O$, then $AA^T = BB^T$? You don't need to use SVD for this.
 
  • #3
Yeah, I did that, but it seemed too simple; my study guide says I should be using SVD. Is it actually unnecessary?
 
  • #4
You use SVD for the forward direction, not the reverse direction.
 
  • #5
What do you mean by that? What are the forward and reverse directions?
 
  • #6
The forward direction: If $AA^T = BB^T$, then $A = BO$ where $O$ is some orthogonal matrix. The reverse direction: If $A = BO$ where $O$ is an orthogonal matrix, then $AA^T = BB^T$.
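For the reverse direction, the verification is one line, using $OO^T = I$:

$$AA^T = (BO)(BO)^T = B\,OO^T B^T = BB^T.$$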
 
  • #7
If $AA^T = BB^T$, then the singular values of $A$ equal the singular values of $B$, since in each case they are the square roots of the eigenvalues of the common matrix $AA^T = BB^T$. Write an SVD $A = U\Sigma V^T$; then $AA^T = U\Sigma\Sigma^T U^T$, and since $BB^T$ equals this same matrix, $B$ admits an SVD of the form $B = U\Sigma Q^T$ with the same $U$ and $\Sigma$. Setting $O = QV^T$ gives $BO = U\Sigma Q^T Q V^T = U\Sigma V^T = A$. Since transposes and products of orthogonal matrices are orthogonal, $O$ is orthogonal.
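As a numerical sanity check, here is a minimal NumPy sketch of this construction. It assumes $B$ has full column rank (so $\Sigma$ is invertible), and the variable names are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3

# Build a pair with AA^T = BB^T by construction: pick any B and
# any orthogonal O_true, then set A = B @ O_true.
B = rng.standard_normal((m, n))
O_true, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix
A = B @ O_true

assert np.allclose(A @ A.T, B @ B.T)  # reverse direction holds

# Forward direction, following the post: take a thin SVD A = U S V^T.
# Since AA^T = BB^T, B admits an SVD with the same U and S, namely
# B = U S Q^T, so Q^T = S^{-1} U^T B (valid because S is invertible).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Qt = np.diag(1.0 / s) @ U.T @ B
O = Qt.T @ Vt  # O = Q V^T

assert np.allclose(O @ O.T, np.eye(n))  # O is orthogonal
assert np.allclose(A, B @ O)            # A = BO, as claimed
```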
 

FAQ: Proving Matrix Equality Using Singular Value Decomposition

What is Singular Value Decomposition (SVD)?

Singular Value Decomposition (SVD) is a technique in linear algebra for factoring a matrix into three matrices: a matrix of left singular vectors, a diagonal matrix of singular values, and a matrix of right singular vectors. It is widely used in data analysis, signal processing, and image compression.
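Concretely, for an $m \times n$ matrix $A$, the decomposition has the form

$$A = U \Sigma V^T,$$

where $U$ is an $m \times m$ orthogonal matrix, $\Sigma$ is an $m \times n$ diagonal matrix whose diagonal entries are the nonnegative singular values, and $V$ is an $n \times n$ orthogonal matrix.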

What are the applications of SVD?

SVD has a wide range of applications in fields such as image and signal processing, data compression, natural language processing, and recommendation systems. It is also used in solving linear least-squares problems, data clustering, and low-rank matrix approximation.

How does SVD work?

SVD works by factoring a matrix $A$ into three matrices: $U$, $\Sigma$, and $V$. The columns of $U$ are eigenvectors of $AA^T$, the diagonal entries of $\Sigma$ are the square roots of the nonzero eigenvalues of $A^TA$ (equivalently, of $AA^T$), and the columns of $V$ are eigenvectors of $A^TA$.
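These relationships follow directly from the factorization:

$$AA^T = U(\Sigma\Sigma^T)U^T, \qquad A^TA = V(\Sigma^T\Sigma)V^T,$$

which exhibit $U$ and $V$ as eigenvector matrices of $AA^T$ and $A^TA$, respectively, with eigenvalues given by the squared singular values.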

What are the advantages of using SVD?

SVD has several advantages, including its numerical stability, its robustness to noise, and its ability to reduce the dimensionality of data while preserving the most important structure. Related matrix-factorization methods built on SVD are also used to handle data with missing values, as in matrix completion.

Can SVD be used for data compression?

Yes, SVD can be used for data compression: truncating the SVD to its $k$ largest singular values gives the best rank-$k$ approximation of a matrix (the Eckart-Young theorem), which can yield significant storage and computational savings. Note that standard image and video codecs such as JPEG and MPEG are based on the discrete cosine transform rather than SVD, but low-rank SVD approximation is a classic technique for compressing images and other matrix-shaped data.
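As an illustration, here is a minimal NumPy sketch of rank-$k$ compression via truncated SVD; the random matrix is just a stand-in for a real image or dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 80))  # stand-in for an image or data matrix
k = 10                              # target rank

U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-k approximation

# Eckart-Young: the spectral-norm error equals the (k+1)-th singular value.
err = np.linalg.norm(X - X_k, ord=2)
assert np.isclose(err, s[k])

# Storage: the rank-k factors need k*(m+n+1) numbers instead of m*n.
m, n = X.shape
print(f"original: {m*n} numbers, rank-{k} factors: {k*(m+n+1)} numbers")
```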
