Linear transformation and its matrix

In summary, we discussed the map $\mathcal{A}$ from $\mathbb{R}^3$ to $\mathbb{R}^3$ defined by $\mathcal{A}(x,y,z) = (x+y, x-y, z)$ and determined that it is a linear transformation. We also found its matrix in the standard basis to be $M_{\mathcal{A}} = \begin{bmatrix}1&1&0\\1&-1& 0\\ 0& 0& 1\end{bmatrix}$. We then moved on to finding the dimensions of $\text{Im}(\mathcal{A})$ and $\text{Ker}(\mathcal{A})$ for a second map, using the rank-nullity theorem.
  • #1
1. Show that the map $\mathcal{A}$ from $\mathbb{R}^3$ to $\mathbb{R}^3$ defined by $\mathcal{A}(x,y,z) = (x+y, x-y, z)$ is a linear transformation. Find its matrix in standard basis.

2. Find the dimensions of $\text{Im}(\mathcal{A})$ and $\text{Ker}(\mathcal{A})$, and find their basis for the linear transformation $\mathcal{A}$ on $\mathbb{R}^3$ defined by $\mathcal{A} (x,y,z) = (x-2z, y+z, 0)$.

1. Let $u = (x,y,z)$ and $v = (x', y', z')$. Then

$\mathcal{A}(u+v) = (x+y+x'+y', x-y+x'-y', z+z')= (x+y, x-y, z)+(x'+y', x'-y', z') = \mathcal{A}(u)+\mathcal{A}(v).$

Moreover, $ \mathcal{A}(\lambda u) = (\lambda x +\lambda y, \lambda x- \lambda y, \lambda z) = \lambda (x+y, x-y, z) = \lambda \mathcal{A}(u)$

Hence $\mathcal{A}$ is a linear transformation. However, I can't find its matrix.
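(Not part of the original post, but a useful way to see the matrix: its $j$-th column is $\mathcal{A}(e_j)$, the image of the $j$-th standard basis vector. A sketch of that computation in Python, assuming numpy is available:)

```python
import numpy as np

# The map A(x, y, z) = (x + y, x - y, z) from the thread.
def A(v):
    x, y, z = v
    return np.array([x + y, x - y, z], dtype=float)

# The matrix in the standard basis: its j-th column is A(e_j).
e = np.eye(3)
M = np.column_stack([A(e[:, j]) for j in range(3)])
print(M)

# Spot-check additivity and homogeneity on random vectors.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
assert np.allclose(A(u + v), A(u) + A(v))
assert np.allclose(A(2.5 * u), 2.5 * A(u))
assert np.allclose(M @ u, A(u))
```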
 
  • #2
A partial answer:

$M_{\mathcal{A}} = \begin{bmatrix}1&1&0\\1&-1& \ast\\ \ast&\ast&\ast\end{bmatrix}$.

Perhaps you can finish.
 
  • #3
Deveno said:
A partial answer:

$M_{\mathcal{A}} = \begin{bmatrix}1&1&0\\1&-1& \ast\\ \ast&\ast&\ast\end{bmatrix}$.

Perhaps you can finish.
$M_{\mathcal{A}} = \begin{bmatrix}1&1&0\\1&-1& 0\\ 0& 0& 1\end{bmatrix}$ I think.

So for question $2$ the relevant matrix is

$M_{\mathcal{A}} = \begin{bmatrix}1&0&-2\\0&1& 1\\ 0& 0& 0\end{bmatrix} $

Solving we get $y = -z, ~ x = 2z,$ and $z$ is free. So $\text{Ker}\mathcal{A} = \left\{(2z,-z,z): z \in \mathbb{R}\right\}$

How do I find $\text{Im}\mathcal{A}$? Is it true that we just take the columns of $M_{\mathcal{A}}$?
 
  • #4
Guest said:
$M_{\mathcal{A}} = \begin{bmatrix}1&1&0\\1&-1& 0\\ 0& 0& 1\end{bmatrix}$ I think.

So for question $2$ the relevant matrix is

$M_{\mathcal{A}} = \begin{bmatrix}1&1&-2\\0&1& 1\\ 0& 0& 0\end{bmatrix} $

Solving we get $y = -z, ~ x = 3z,$ and $z$ is free. So $\text{Ker}\mathcal{A} = \left\{(3z,-z,z): z \in \mathbb{R}\right\}$

How do I find $\text{Im}\mathcal{A}$? Is it true that we just take the columns of $M_{\mathcal{A}}$?

If the matrix for #2 were as you say, we would have:

$\begin{bmatrix}1&1&-2\\0&1&1\\0&0&0\end{bmatrix} \begin{bmatrix}x\\y\\z\end{bmatrix} = \begin{bmatrix}x+y-2z\\y+z\\0\end{bmatrix}$

suggesting $\mathcal{A}(x,y,z) = (x+y-2z,y+z,0)$, which is not the case.
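(Deveno's check can be reproduced symbolically; a sketch in Python, assuming sympy is available:)

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# The (incorrect) matrix proposed for question 2:
M_wrong = sp.Matrix([[1, 1, -2], [0, 1, 1], [0, 0, 0]])
v = sp.Matrix([x, y, z])

# Multiplying out recovers the map this matrix actually represents:
result = M_wrong * v
print(result)  # Matrix([[x + y - 2*z], [y + z], [0]])
```

This gives $(x+y-2z,\, y+z,\, 0)$, not the map $(x-2z,\, y+z,\, 0)$ in the problem.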
 
  • #5
Deveno said:
If the matrix for #2 were as you say, we would have:

$\begin{bmatrix}1&1&-2\\0&1&1\\0&0&0\end{bmatrix} \begin{bmatrix}x\\y\\z\end{bmatrix} = \begin{bmatrix}x+y-2z\\y+z\\0\end{bmatrix}$

suggesting $\mathcal{A}(x,y,z) = (x+y-2z,y+z,0)$, which is not the case.
Yeah, I made a mistake. It should have said:

$M_{\mathcal{A}} = \begin{bmatrix}1&0&-2\\0&1& 1\\ 0& 0& 0\end{bmatrix} $

Solving we get $y = -z, ~ x = 2z,$ and $z$ is free. So $\text{Ker}\mathcal{A} = \left\{(2z,-z,z): z \in \mathbb{R}\right\}$, which has dimension $1$.

Apparently $\dim(\text{Im}(\mathcal{A})) = \text{rank}(\mathcal{A})$, so in this case $\dim(\text{Im}(\mathcal{A})) = 2$.
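(As a quick sanity check, not from the original thread: sympy can compute the kernel and rank directly, and confirm that they add up to $3$, assuming sympy is installed:)

```python
import sympy as sp

# Matrix of the second map, A(x, y, z) = (x - 2z, y + z, 0).
M = sp.Matrix([[1, 0, -2], [0, 1, 1], [0, 0, 0]])

ker_basis = M.nullspace()    # basis for Ker(A)
print(ker_basis)             # one vector, (2, -1, 1), so dim Ker(A) = 1
print(M.rank())              # 2, so dim Im(A) = 2

# Rank-nullity: dim Ker(A) + rank(A) = 3.
assert len(ker_basis) + M.rank() == 3
```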
 
  • #6
Guest said:
Yeah, I made a mistake. It should have said:

$M_{\mathcal{A}} = \begin{bmatrix}1&0&-2\\0&1& 1\\ 0& 0& 0\end{bmatrix} $

Solving we get $y = -z, ~ x = 2z,$ and $z$ is free. So $\text{Ker}\mathcal{A} = \left\{(2z,-z,z): z \in \mathbb{R}\right\}$, which has dimension $1$.

Apparently $\dim(\text{Im}(\mathcal{A})) = \text{rank}(\mathcal{A})$, so in this case $\dim(\text{Im}(\mathcal{A})) = 2$.

It is also the case for this example that $\dim(\text{Im}(\mathcal{A})) = 3 - \dim(\text{ker}(\mathcal{A}))$.

Do you understand why $\dim(\text{Im}(\mathcal{A})) = \text{rank}(\mathcal{A})$? Does this suggest a way to define the rank of a linear transformation, without using a matrix?

Your amended answer looks much better. ^^
 
  • #7
Deveno said:
It is also the case for this example that $\dim(\text{Im}(\mathcal{A})) = 3 - \dim(\text{ker}(\mathcal{A}))$.

Do you understand why $\dim(\text{Im}(\mathcal{A})) = \text{rank}(\mathcal{A})$? Does this suggest a way to define the rank of a linear transformation, without using a matrix?

Your amended answer looks much better. ^^
My understanding is that the image is the space spanned by the columns. So the rank then would be the dimension of the space spanned by the columns? I would say my understanding of why $\dim(\text{Im}(\mathcal{A})) = \text{rank}(\mathcal{A})$ is actually shaky at best.

Also, I wonder whether one can determine that a transformation is linear by considering the corresponding matrix? If so, what property are we looking for in the matrix?
 
  • #8
Any mapping from $\Bbb R^n \to \Bbb R^m$ that can be represented as (left) multiplication by a matrix IS a linear transformation.

This is because, for two $n$-vectors $x,y$ expressed as $n \times 1$ matrices, with $A$ an $m \times n$ matrix:

$A(x + y) = Ax + Ay$ (matrix multiplication DISTRIBUTES over matrix addition).

$A(rx) = (rA)(x) = r(Ax)$ (scalar multiplication can be regarded as multiplication by the $n \times n$ matrix $rI_n$ on the right of $A$, and the left of $x$, and as the $m \times m$ matrix $rI_m$ on the left of $A$).

You are correct about the image of $A$ being spanned by the columns of $A$. This is immediate from the fact that $A$ is linear, and that if we write the $j$-th standard basis vector as a column vector ($n \times 1$ matrix), then $Ae_j$ gives us the $j$-th column of $A$.

(Recall that the image of $A$ is determined by the images of the basis vectors, since:

$A(v) = A(c_1e_1 +\cdots + c_ne_n) = c_1A(e_1)+\cdots +c_nA(e_n)$).

Thus the columns of $A$ SPAN the image of $A$. They may not be linearly independent, but if we find a maximally linearly independent subset of the columns, they form a BASIS.

The size of such a basis IS the rank of $A$. This follows from the fact that performing row-reduction amounts to multiplying on the left by an INVERTIBLE matrix, so it doesn't change the dimension of the image of $A$ (invertible matrices preserve dimensions), and performing column-reduction amounts to multiplying on the right of $A$ by an invertible matrix (it is easily checked that elementary row operations, and their column counterparts, are reversible).

Since the goal of both these types of operations is to maximize the number of zero rows (or columns), after we have done so, the number of non-zero rows (or columns) is the dimension of the image we originally sought.

In a nutshell: column rank = row rank, which we simply call "rank". Row-reduction, and column-reduction are two different paths to the same place: the dimension of the "range".

But it gets even better: if we start with a space with $n$-dimensions, we have the RANK-NULLITY theorem:

$\dim(\text{ker}(A)) + \text{rank}(A) = n$.

So we only ever need to find one of:

a) The dimension of the kernel (= null space)
b) The rank

This is a real time-saver.
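(The points above can be illustrated concretely; a sketch in Python using sympy, with the matrix from question 2:)

```python
import sympy as sp

M = sp.Matrix([[1, 0, -2], [0, 1, 1], [0, 0, 0]])

# A maximal linearly independent subset of the columns is a basis of Im(A):
col_basis = M.columnspace()
print(col_basis)     # the first two columns of M
print(M.rank())      # 2

# Multiplying by an invertible matrix does not change the rank,
# which is why row- and column-reduction are safe:
P = sp.Matrix([[1, 2, 0], [0, 1, 0], [3, 0, 1]])   # det = 1, invertible
assert (P * M).rank() == M.rank()
assert (M * P).rank() == M.rank()
```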
 
  • #9
That was most helpful, thank you very much!
Deveno said:
The size of such a basis IS the rank of $A$. This follows from the fact that performing row-reduction amounts to multiplying on the left by an INVERTIBLE matrix, so it doesn't change the dimension of the image of $A$ (invertible matrices preserve dimensions), and performing column-reduction amounts to multiplying on the right of $A$ by an invertible matrix (it is easily checked that row-reduction elementary operations and the equivalent for columns are reversible).
This demystifies the whole idea of row reduction for me. I've often wondered in passing why it works.
 

FAQ: Linear transformation and its matrix

What is a linear transformation?

A linear transformation is a function between vector spaces that preserves vector addition and scalar multiplication: $\mathcal{A}(u+v) = \mathcal{A}(u)+\mathcal{A}(v)$ and $\mathcal{A}(\lambda u) = \lambda\mathcal{A}(u)$. Geometrically, it maps lines through the origin to lines through the origin and leaves the origin itself fixed.

What is the matrix representation of a linear transformation?

The matrix representation of a linear transformation is a matrix (square only when the domain and codomain have the same dimension) whose $j$-th column records the image of the $j$-th basis vector of the original vector space.

How do you perform a linear transformation using matrices?

To perform a linear transformation using matrices, you simply multiply the matrix representation of the transformation by the vector representing the original point. This will give you the coordinates of the transformed point.
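(A minimal sketch of this matrix-vector multiplication, using the thread's map and an arbitrary sample point; numpy assumed:)

```python
import numpy as np

# Matrix of A(x, y, z) = (x + y, x - y, z) from the thread.
M = np.array([[1, 1, 0],
              [1, -1, 0],
              [0, 0, 1]])
v = np.array([2, 3, 5])

print(M @ v)   # [ 5 -1  5], i.e. A(2, 3, 5) = (5, -1, 5)
```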

What is the relationship between a linear transformation and its matrix?

For a fixed choice of bases, the matrix representation of a linear transformation is unique: every linear transformation corresponds to exactly one matrix with respect to those bases, and conversely every matrix defines a linear transformation.

Can a linear transformation be represented by more than one matrix?

Yes, a linear transformation has a different matrix representation for each choice of basis: changing the coordinate system changes the matrix, even though the underlying transformation is the same.
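(A sketch of how the matrix changes under a change of basis, using the thread's map and a hypothetical alternative basis given by the columns of $P$; sympy assumed:)

```python
import sympy as sp

M = sp.Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])  # standard-basis matrix of A

# Hypothetical alternative basis: the columns of P (any invertible matrix works).
P = sp.Matrix([[1, 1, 0], [0, 1, 0], [0, 0, 1]])
M_new = P.inv() * M * P       # matrix of the SAME map in the new basis

print(M_new)                  # Matrix([[0, 2, 0], [1, 0, 0], [0, 0, 1]])

# Similar matrices represent the same transformation, so they share
# rank, determinant, trace and eigenvalues:
assert M_new.rank() == M.rank()
assert M_new.det() == M.det()
assert M_new.trace() == M.trace()
```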
