Can A=B be Proved Algebraically without Linear Transformations?

In summary: if A and B are matrices of the same size and Ax = Bx for every column vector x for which the products are defined, then A = B. This can be proved either by viewing A and B as linear transformations or by testing the equality on the standard basis vectors, which compares the two matrices column by column.
  • #1
Bipolarity
Suppose A and B are matrices of the same size, and x is a column vector such that the matrix products Ax and Bx are defined.

Suppose that Ax=Bx for all x. Then is it true that A=B?

I know that this is true, and I can prove it by viewing Ax and Bx as linear transformations and showing that those two transformations are equal, but I was curious whether it can be proved without appealing to the notion of a linear transformation.

Tips?

BiP
 
  • #2
If you view them as linear transformations, then there's nothing to prove, since "Ax=Bx for all x" is by definition what A=B means. (This holds for all functions A and B that have the same domain, not just for the linear ones).

If you don't, then you can do it by trying specific choices of x. For example, if you take x = (1,0,...,0), the equality tells you that the first column of A is equal to the first column of B. Doing the same with each of the other standard basis vectors shows that every column of A equals the corresponding column of B, so A = B.
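A minimal numerical sketch of this column-by-column check (using NumPy; the example matrices and the size are arbitrary choices for illustration):

```python
import numpy as np

# Arbitrary example matrices of the same size (illustration only).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[1.0, 2.0], [3.0, 4.0]])

n = A.shape[1]
for j in range(n):
    e = np.zeros(n)
    e[j] = 1.0                      # j-th standard basis vector
    # A @ e picks out the j-th column of A, so requiring Ax = Bx on
    # every basis vector forces the matrices to agree column by column.
    assert np.array_equal(A @ e, A[:, j])
    assert np.array_equal(A @ e, B @ e)

print("Ax = Bx on every standard basis vector, so A = B column by column.")
```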
 
  • #3
A = B <=> A - B = 0, so set T = A - B and write the hypothesis as Tx = 0 for all x.

Transformation way:
Tx = 0 for all x means T is the zero transformation, so T = 0.

Matrix way:
A matrix is determined by its action on a basis. Tx = 0 for all x, so in particular Te_j = 0 for each standard basis vector e_j. But Te_j is the j-th column of T, so every column of T is zero, T = 0, and A = B.
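For reference, a short LaTeX write-up of the matrix-way argument above (e_j denotes the j-th standard basis vector):

```latex
Let $T = A - B$ and suppose $Tx = 0$ for every $x$.
Taking $x = e_j$ (the $j$-th standard basis vector) gives
\[
  T e_j = \bigl(\text{$j$-th column of } T\bigr) = 0, \qquad j = 1, \dots, n,
\]
so every column of $T$ is zero. Hence $T = A - B = 0$, i.e.\ $A = B$.
```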
 

Related to Can A=B be Proved Algebraically without Linear Transformations?

1. What does the statement "Ax = Bx for all x implies A = B" mean?

It means that if two matrices A and B of the same size give the same product with every possible vector x, then A and B must be equal.

2. How is this equation related to linear algebra?

This statement is a basic fact of linear algebra, and it has nothing to do with commutativity (matrix multiplication is generally not commutative). It reflects the fact that a matrix is completely determined by how it acts on vectors: multiplying A by the j-th standard basis vector returns the j-th column of A, so two matrices that act identically on every vector must have identical columns.

3. Can you provide an example of this equation in action?

Yes. Take the 2x2 matrices A = [1 2; 3 4] and B = [5 6; 7 8]. With x = [1; 0] we get A*x = [1; 3] while B*x = [5; 7]; the products differ, so Ax = Bx does not hold for all x, and indeed A ≠ B. Conversely, if Ax = Bx did hold for every x, then applying both matrices to [1; 0] and [0; 1] would force every column of A to equal the corresponding column of B, so A and B would be the same matrix.
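For concreteness, a quick NumPy check of that example (one vector on which the products differ already rules out A = B):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

x = np.array([1, 0])   # first standard basis vector
print(A @ x)           # [1 3], the first column of A
print(B @ x)           # [5 7], the first column of B
# The products differ, so "Ax = Bx for all x" fails here, and indeed A != B.
```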

4. What are the implications of this equation in solving systems of linear equations?

If two coefficient matrices A and B satisfy Ax = Bx for every x, then A = B, so the systems Ax = b and Bx = b are literally the same system and have the same solution set. In other words, a system of linear equations is completely determined by how its coefficient matrix acts on vectors, so there is never any need to solve the "two" systems separately.

5. Is this equation always true for any matrices A and B?

The implication only makes sense, and only holds, when A and B have the same size: they must have the same number of columns for Ax and Bx to be defined for the same vectors x, and Ax = Bx then forces them to have the same number of rows. With that proviso, Ax = Bx for all x always implies A = B.
