A and B are 2*2 matrices. A' is the transpose of A. Will the solution of A*A'=B for A yield a unique A(B)?
Infrared said: No. For example, if ##B## is the identity matrix, then ##A## could be any rotation or reflection matrix.

then should we conclude the derivative dA/dB doesn't in general exist?
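To make that counterexample concrete (a worked instance added here for illustration, not part of the original reply): for any angle ##\theta##,
$$A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}
\quad\Longrightarrow\quad
AA' = \begin{bmatrix} \cos^2\theta+\sin^2\theta & 0 \\ 0 & \sin^2\theta+\cos^2\theta \end{bmatrix} = I,$$
so infinitely many different matrices ##A## produce the same ##B = I##.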
feynman1 said: then should we conclude the derivative A'(B) doesn't in general exist?

What derivative! ##A'## was your notation for the transpose.
martinbn said: What derivative! ##A'## was your notation for the transpose.

changed the notation now
feynman1 said: then should we conclude the derivative dA/dB doesn't in general exist?

This cannot be concluded so easily. You have to define what you mean by your notation.
feynman1 said:A and B are 2*2 matrices. A' is the transpose of A. Will the solution of A*A'=B for A yield a unique A(B)?
feynman1 said:then should we conclude the derivative dA/dB doesn't in general exist?
feynman1 said: changed the notation now

It seems obvious to me that you don't understand this problem. At first you wrote that A' meant the transpose of A, then you decided that it meant the derivative of A. Is that your final answer?
Mark44 said: It seems obvious to me that you don't understand this problem. At first you wrote that A' meant the transpose of A, then you decided that it meant the derivative of A. Is that your final answer?

dA/dB means a tensor valued derivative
Also, if A' is the derivative, which derivative is it? It doesn't make much sense to me to talk about the derivative of one matrix with respect to another, e.g. dA/dB, but it does make sense to talk about the derivative of a matrix with respect to some variable, say t, e.g. dA/dt. Since you are so uncertain about this problem, it seems reasonable to assume that you aren't certain which derivative is meant.
If we have ##A(t) = \begin{bmatrix} a(t) & b(t) \\ c(t) & d(t) \end{bmatrix}##, then ##A'(t) = \frac{dA(t)}{dt}## would be ##\begin{bmatrix} a'(t) & b'(t) \\ c'(t) & d'(t) \end{bmatrix}##. In this case, the equation AA' = B could mean ##A \frac{dA}{dt} = B##, and this differential equation could be solved, although not for a unique solution.
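For what it's worth, the non-uniqueness is already visible in the 1×1 version of that differential equation (a scalar sketch added here for illustration): if ##a(t)\,a'(t) = b## with ##b## constant, then ##\frac{d}{dt}\left(\tfrac12 a^2\right) = b##, so
$$a(t) = \pm\sqrt{2bt + C}$$
for an arbitrary constant ##C##; neither the sign nor ##C## is determined by the equation alone.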
feynman1 said:then should we conclude the derivative dA/dB doesn't in general exist?
martinbn said: After all this, what is the original question? What is the equation? Is it with ##A'## being the transpose or is it some kind of derivative? I think it would be helpful if you gave us more information. Also you shouldn't assume that "tensor valued derivative" is something everyone knows. Give a reference or a definition. The notation you use dA/dB is unclear as well. Is it ##dAdB^{-1}## or ##dB^{-1}dA##? After all, matrices do not commute in general, so a fraction is ambiguous.

dA/dB is the derivative of some component of A w.r.t. some component of B, with multiple elements. It is tensor valued because both A and B are second order tensors, i.e. matrices.
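In components, one way to make this precise (an interpretation added here, not spelled out in the thread) is the fourth-order tensor
$$\left(\frac{dA}{dB}\right)_{ijkl} = \frac{\partial A_{ij}}{\partial B_{kl}},$$
which only makes sense on a region where ##A## can actually be written as a differentiable function of ##B##; the counterexample above shows that ##AA' = B## alone does not define such a function.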
feynman1 said: dA/dB is the derivative of some component of A w.r.t. some component of B, with multiple elements. It is tensor valued because both A and B are second order tensors, i.e. matrices.

This doesn't actually answer my question. Anyway, can you at least tell us where the question came from? It is ok to give information about the question. There is no need for us to pry it out of you.
martinbn said: This doesn't actually answer my question. Anyway, can you at least tell us where the question came from? It is ok to give information about the question. There is no need for us to pry it out of you.

Just wanted to extend a scalar property to a tensor property without other thoughts; sorry for having no background info.
martinbn said: It is ok to give information about the question. There is no need for us to pry it out of you.

Amen to this!
feynman1 said: A*A'=B where * is a normal dot product: ##A_{ik}A_{jk} = B_{ij}## (summation over ##k##). A, B are matrices.
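One immediate consequence of this index form, easy to check although not stated in the thread, is that ##B## must be symmetric for any solution to exist:
$$B_{ij} = A_{ik}A_{jk} = A_{jk}A_{ik} = B_{ji}.$$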
Matrix multiplication is an operation that combines two matrices to produce a new matrix. It differs from ordinary multiplication of numbers in that it follows specific rules, such as the requirement that the number of columns of the first matrix equal the number of rows of the second, and it is not commutative in general.
In the equation A*A' = B, A is the original matrix, A' is its transpose, and B is the resulting product. The transpose of a matrix is obtained by interchanging its rows and columns.
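A worked 2×2 instance, added here for illustration:
$$A=\begin{bmatrix}a & b\\ c & d\end{bmatrix},\qquad
A'=\begin{bmatrix}a & c\\ b & d\end{bmatrix},\qquad
AA'=\begin{bmatrix}a^2+b^2 & ac+bd\\ ac+bd & c^2+d^2\end{bmatrix},$$
which also shows that the product ##AA'## is always symmetric.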
The equation A*A' = B cannot be solved for a unique A. If B is symmetric positive definite, one solution is the Cholesky factor L with L*L' = B, but A = L*Q is also a solution for every orthogonal matrix Q, since (LQ)(LQ)' = LQQ'L' = LL' = B. The solution is therefore only determined up to an orthogonal factor, which is exactly the non-uniqueness pointed out above (for B = I, A can be any rotation or reflection).
Finding a matrix A with A*A' = B is still useful even though it is not unique. Such factorizations, the Cholesky factorization being the standard example, appear in data analysis, optimization problems, and the solution of systems of linear equations.
There are special cases to keep in mind. A real solution exists only when B is symmetric positive semidefinite, since B = A*A' forces ##B' = B## and ##x'Bx = \|A'x\|^2 \ge 0##. If B is singular, every solution A is singular as well, and A need not even be square: an n×k matrix A yields an n×n matrix B of rank at most k. When only an approximate factorization of a given B is wanted, least-squares techniques can be used.
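A minimal NumPy sketch, added for illustration (the particular B and the rotation angle are arbitrary choices), exhibiting two different matrices with the same product A*A':

```python
import numpy as np

# An arbitrary symmetric positive definite B (illustrative choice).
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# One solution: the lower-triangular Cholesky factor L, with L @ L.T == B.
L = np.linalg.cholesky(B)

# Any orthogonal Q gives another solution A = L @ Q,
# since (L Q)(L Q)' = L Q Q' L' = L L' = B.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

A1, A2 = L, L @ Q

print(np.allclose(A1 @ A1.T, B))  # True
print(np.allclose(A2 @ A2.T, B))  # True
print(np.allclose(A1, A2))        # False: the factor A is not unique
```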