How do I find a basis of an $\mathbb{R}$-vector space?

In summary: we can find a basis of $V$ as an $\mathbb{R}$-vector space consisting of seven matrices.
  • #1
mathmari
Hey! :eek:

I want to prove that $$V=\left \{\begin{pmatrix}a & b\\ c & d\end{pmatrix} \mid a,b,c,d\in \mathbb{C} \text{ and } a+d\in \mathbb{R}\right \}$$ is an $\mathbb{R}$-vector space.
I also want to find a basis of $V$ as an $\mathbb{R}$-vector space. We have the following:
Let $K$ be a field. A vector space over $K$ (or $K$-vector space) is a set $V$ with an addition $V \times V \rightarrow V : (x, y) \mapsto x + y$ and a scalar multiplication $K \times V \rightarrow V : (\lambda , x) \mapsto \lambda \cdot x$, so that the following holds:
  • (V1) : $(V,+)$ is an abelian group, with the neutral element $0$.
  • (V2) : $\forall a, b \in K, \forall x \in V : (a + b) \cdot x = a \cdot x + b \cdot x$
  • (V3) : $\forall a \in K, \forall x, y \in V : a \cdot (x + y) = a \cdot x + a \cdot y$
  • (V4) : $\forall a, b \in K, \forall x \in V : (ab) \cdot x = a \cdot (b \cdot x)$
  • (V5) : $\forall x \in V : 1 \cdot x = x$ ( $1 = 1_K$ is the identity in $K$).
We have that it is closed under addition and scalar multiplication, right? (Wondering)

Properties (V2)-(V5) are also satisfied, or not? How can we check property (V1)? (Wondering) Could you give me a hint how to find a basis? (Wondering)
 
  • #2
[tex]\begin{pmatrix}a & b \\ c & d \end{pmatrix}= \begin{pmatrix}a & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d \end{pmatrix}[/tex]
[tex]= a\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d\begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}[/tex]
 
  • #3
HallsofIvy said:
[tex]\begin{pmatrix}a & b \\ c & d \end{pmatrix}= \begin{pmatrix}a & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d \end{pmatrix}[/tex]
[tex]= a\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d\begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}[/tex]

Ah! And so, the basis is $$\{\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}\}$$ right? (Wondering)
 
  • #4
I think we need a couple more elements in the basis, since the entries in the matrix are complex, but the scalars we multiply by are real.
Furthermore, we have the restriction that the imaginary part of the trace must be zero.
I think we need 7 elements in the basis. (Thinking)

mathmari said:
We have that it is closed under addition and scalar multiplication, right? (Wondering)

Properties (V2)-(V5) are also satisfied, or not? How can we check property (V1)? (Wondering)

Yes.
For property (V1), we already know that the complex matrices under regular matrix addition form an abelian group.
So it suffices to verify that $(V,+)$ is a subgroup, meaning we have to verify that the neutral element is in $V$ and that all additive inverses are in $V$. (Thinking)
 
  • #5
I like Serena said:
I think we need a couple more elements in the basis, since the entries in the matrix are complex, but the scalars we multiply by are real.
Furthermore, we have the restriction that the imaginary part of the trace must be zero.
I think we need 7 elements in the basis. (Thinking)

Ah ok... (Thinking)

Is the basis the following?
$$\left \{\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \begin{pmatrix} 0 & i \\ 0 & 0 \end{pmatrix},\begin{pmatrix} 0 & 0 \\ i & 0 \end{pmatrix} \right \}$$

(Wondering)
 
  • #6
Yes (Mmm)
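As a quick numerical sanity check of this seven-element basis (a sketch, not part of the proof; the list `B` and the helper `realify` are illustrative names), one can flatten each complex $2\times 2$ matrix into a real $8$-vector and confirm that the seven vectors are linearly independent over $\mathbb{R}$:

```python
import numpy as np

# The seven matrices proposed as an R-basis of V in this thread.
B = [
    np.array([[1, 0], [0, 0]], dtype=complex),
    np.array([[0, 1], [0, 0]], dtype=complex),
    np.array([[0, 0], [1, 0]], dtype=complex),
    np.array([[0, 0], [0, 1]], dtype=complex),
    np.array([[1j, 0], [0, -1j]]),
    np.array([[0, 1j], [0, 0]]),
    np.array([[0, 0], [1j, 0]]),
]

def realify(M):
    """Flatten a 2x2 complex matrix into a real 8-vector (real parts, then imaginary parts)."""
    v = M.flatten()
    return np.concatenate([v.real, v.imag])

# Each matrix has a real trace, so each lies in V...
assert all(abs(np.trace(M).imag) < 1e-12 for M in B)

# ...and stacking the flattened matrices gives rank 7, i.e. the seven
# matrices are linearly independent over the reals.
A = np.stack([realify(M) for M in B])
print(np.linalg.matrix_rank(A))  # 7
```

Rank 7 matches the dimension count: a complex $2\times 2$ matrix has eight real parameters, and the single real condition $\text{Im}(a+d)=0$ removes one of them.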
 
  • #7
I like Serena said:
For property 1, we already know that regular matrix addition of complex matrices is an abelian group.
So it's sufficient to verify it's a subgroup, meaning we have to verify if the neutral element is an element of the group, and if all additive inverses are elements of the group. (Thinking)

We have that $\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}\in V$ since $0\in\mathbb{R}\subset \mathbb{C}$.

When $\begin{pmatrix} a & b \\ c & d \end{pmatrix}\in V$ then $\begin{pmatrix} -a & -b \\ -c & -d \end{pmatrix}\in V$, since $-a,-b,-c,-d\in \mathbb{C}$ and $-a-d=-(a+d)\in \mathbb{R}$, since $a+d\in \mathbb{R}$.

Is this correct? (Wondering)
 
  • #8
Yup. (Mmm)
 
  • #9
We have the following:

\begin{align*}\begin{pmatrix}a & b \\ c & d \end{pmatrix}& = \begin{pmatrix}a_1+a_2i & b_1+b_2i \\ c_1+c_2i & d_1+d_2i \end{pmatrix} = \begin{pmatrix}a_1 & b_1 \\ c_1 & d_1 \end{pmatrix}+\begin{pmatrix}a_2i & b_2i \\ c_2i & d_2i \end{pmatrix} \\ &=
\left (\begin{pmatrix}a_1 & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b_1 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c_1 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d_1 \end{pmatrix}\right )+ \left (\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}+ \begin{pmatrix}0 & b_2i \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c_2i & 0 \end{pmatrix} \right ) \\ &=
\left (a_1\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b_1\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c_1\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d_1\begin{pmatrix}0 & 0 \\ 0 &1 \end{pmatrix}\right )+ \left (\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}+ b_2\begin{pmatrix}0 & i \\ 0 & 0 \end{pmatrix}+ c_2\begin{pmatrix}0 & 0 \\ i & 0 \end{pmatrix} \right )\end{align*}

How do we get from $\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}$ the $\begin{pmatrix}i & 0 \\ 0 & -i \end{pmatrix}$ ? (Wondering)
 
  • #10
The trace of the matrix must be real, so $\text{Im}(a+d) = 0 \quad\Rightarrow\quad d_2 = -a_2$. (Thinking)
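To spell out the factoring step that follows: with $d_2=-a_2$, the diagonal imaginary term becomes a real multiple of a single matrix,
$$\begin{pmatrix}a_2 i & 0 \\ 0 & d_2 i \end{pmatrix}=\begin{pmatrix}a_2 i & 0 \\ 0 & -a_2 i \end{pmatrix}=a_2\begin{pmatrix}i & 0 \\ 0 & -i \end{pmatrix},$$
which is exactly the fifth matrix in the proposed basis.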
 
  • #11
I like Serena said:
The trace of the matrix must be real, so $\text{Im}(a+d) = 0 \quad\Rightarrow\quad d_2 = -a_2$. (Thinking)

Ah ok... I see! To show that $V$ is a real vector space, could we maybe show that it is a subspace of the real vector space of the complex matrices? (Wondering)

We have that $I_2=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}$ is an element of $V$ since $1,0\in \mathbb{C}$ and $1+1=2\in \mathbb{R}$.

So, the set is non-empty. We have the following:
$$\begin{pmatrix}a_1 & b_1\\ c_1 & d_1\end{pmatrix}+\begin{pmatrix}a_2 & b_2\\ c_2 & d_2\end{pmatrix}=\begin{pmatrix}a_1+a_2 & b_1+b_2\\ c_1+c_2 & d_1+d_2\end{pmatrix}\in V$$
since $(a_1+a_2)+(d_1+d_2)=(a_1+d_1)+(a_2+d_2)\in \mathbb{R}$, because $(a_1+d_1)\in \mathbb{R}$ and $(a_2+d_2)\in \mathbb{R}$.

We have also that $$r\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\begin{pmatrix}ra & rb\\ rc & rd\end{pmatrix}\in V$$
since $ra+rd=r(a+d)\in \mathbb{R}$, because $a+d\in \mathbb{R}$ and $r\in \mathbb{R}$. Or is it not known that the set of complex matrices is a real vector space? (Wondering)
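These two closure computations can also be spot-checked numerically (a sketch under the definition of $V$ given above; `random_V` and `in_V` are made-up helper names):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_V():
    """A random complex 2x2 matrix whose trace is real, i.e. an element of V."""
    M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    M[1, 1] = M[1, 1].real - 1j * M[0, 0].imag  # force Im(a + d) = 0
    return M

def in_V(M, tol=1e-12):
    """Membership test for V: the imaginary part of the trace must vanish."""
    return abs(np.trace(M).imag) < tol

X, Y = random_V(), random_V()
r = rng.normal()  # a real scalar

# Closure under addition, real scalar multiplication, and negation.
print(in_V(X + Y), in_V(r * X), in_V(-X))  # True True True
```

Of course this checks only random instances; the algebraic argument above is what proves closure in general.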
 
  • #12
mathmari said:
Ah ok... I see! To show that $V$ is a real vector space, could we maybe show that it is a subspace of the real vector space of the complex matrices? (Wondering)

What's a "real" vector space? (Wondering)

A real vector space is a vector space with real scalar multiplication.
The space of complex matrices with complex scalar multiplication is indeed a known vector space, but $V$ is not closed under multiplication by complex scalars: for instance $I_2\in V$, while $i\cdot I_2$ has trace $2i\notin\mathbb{R}$.
The space of complex matrices with real scalar multiplication is a known real vector space as well, so yes, it suffices to show that $V$ is a subspace of that real vector space.
mathmari said:
We have that $I_2=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}$ is an element of $V$ since $1,0\in \mathbb{C}$ and $1+1=2\in \mathbb{R}$.

So, the set is non-empty. We have the following:
$$\begin{pmatrix}a_1 & b_1\\ c_1 & d_1\end{pmatrix}+\begin{pmatrix}a_2 & b_2\\ c_2 & d_2\end{pmatrix}=\begin{pmatrix}a_1+a_2 & b_1+b_2\\ c_1+c_2 & d_1+d_2\end{pmatrix}\in V$$
since $(a_1+a_2)+(d_1+d_2)=(a_1+d_1)+(a_2+d_2)\in \mathbb{R}$, because $(a_1+d_1)\in \mathbb{R}$ and $(a_2+d_2)\in \mathbb{R}$.

We have also that $$r\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\begin{pmatrix}ra & rb\\ rc & rd\end{pmatrix}\in V$$
since $ra+rd=r(a+d)\in \mathbb{R}$, because $a+d\in \mathbb{R}$ and $r\in \mathbb{R}$.

In general, the identity matrix need not belong to a subspace of matrices, since we're not multiplying matrices; it has no special role here.

The zero matrix does have to be an element (which also implies the set is non-empty), just like every additive inverse. (Thinking)

And you have already shown closure for addition and closure for scalar multiplication.
 
  • #13
I like Serena said:
In general, the identity matrix need not belong to a subspace of matrices, since we're not multiplying matrices; it has no special role here.

The zero matrix does have to be an element (which also implies the set is non-empty), just like every additive inverse. (Thinking)

And you have already shown closure for addition and closure for scalar multiplication.

Do we know that the zero matrix is in $V$ because for all $u,v\in V$ it must hold that $u+v\in V$, in particular for $u=-v$? (Wondering)
 
  • #14
mathmari said:
Do we know that the zero matrix is in $V$ because for all $u,v\in V$ it must hold that $u+v\in V$, in particular for $u=-v$? (Wondering)

That's one way yes.
With the proof that all additive inverses are in $V$, it follows that the zero matrix is in there as well.
Alternatively we can also tell because its trace, which is $0+0=0$, is a real number. (Thinking)
 
  • #15
I like Serena said:
That's one way yes.
With the proof that all additive inverses are in $V$, it follows that the zero matrix is in there as well.
Alternatively we can also tell because its trace, which is $0+0=0$, is a real number. (Thinking)

I see... Thank you very much! (Mmm)
 

FAQ: How do I find a basis of an $\mathbb{R}$-vector space?

What is a basis of an $\mathbb{R}$-vector space?

A basis of an $\mathbb{R}$-vector space is a set of vectors that can represent any vector in the space through linear combinations with real coefficients. The basis vectors are linearly independent, meaning that no vector in the set can be expressed as a linear combination of the others, and they span the entire space, meaning that every vector in the space can be written as a linear combination of the basis vectors.

How is a basis of an $\mathbb{R}$-vector space determined?

One standard method is to choose a set of linearly independent vectors and extend it, one vector at a time, until the set spans the whole space; the resulting set is a basis. Alternatively, one can start from a spanning set and discard vectors that are linear combinations of the others.

Can an $\mathbb{R}$-vector space have more than one basis?

Yes. Every vector space other than the zero space has many different bases. However, all bases of a given space contain the same number of vectors; this common number is the dimension of the space.

Can a vector space have an infinite basis?

Yes, a vector space can have an infinite basis. This happens exactly when the space is infinite-dimensional, as is the case for many function spaces.

How is a basis of an $\mathbb{R}$-vector space used?

A basis is used to represent vectors in the space as linear combinations, which simplifies calculations and transformations of vectors as well as the understanding of the space. Additionally, a basis determines the dimension of the vector space, since the number of basis vectors equals the dimension.
