Finding values for a linearly independent subset

In summary, the conversation discusses finding the values of α ∈ ℝ for which a set of three 2x2 matrices with real entries is linearly independent, viewed as elements of the vector space of all 2x2 real matrices. The approach is to set up a coefficient matrix and use row operations to bring it to reduced row echelon form. The conclusion is that the matrices are linearly independent exactly when α ≠ -4.
  • #1
Cottontails

Homework Statement


The vector space is that of all 2x2 matrices with real entries. You have to find the values of [itex]\alpha \in \mathbb{R}[/itex] that make the set Z = [itex]\{
\begin{pmatrix}
1 & 2\\
1 & 0
\end{pmatrix},
\begin{pmatrix}
3 & 7\\
0 & 0
\end{pmatrix},
\begin{pmatrix}
2 & 6\\
\alpha & 0
\end{pmatrix}
\}
[/itex] linearly independent.


Homework Equations


To test for linear independence, I wrote the equation:
[tex]a\begin{pmatrix}
1 & 2\\
1 & 0
\end{pmatrix}+b
\begin{pmatrix}
3 & 7\\
0 & 0
\end{pmatrix}+c
\begin{pmatrix}
2 & 6\\
\alpha & 0
\end{pmatrix}=
\begin{pmatrix}
0 & 0\\
0 & 0
\end{pmatrix}
[/tex]


The Attempt at a Solution


This is where I'm a bit confused. Is that equation correct, and would you then simply write it as a matrix and use row operations to bring it to reduced row echelon form?
Or would you not test the whole set of matrices, but instead just set the last matrix equal to the zero matrix to find linear independence, and find the value of alpha that way?
 
  • #2
Is that equation correct, and would you then simply write it as a matrix and use row operations to bring it to reduced row echelon form?
That is a good approach.
Or would you not test the whole set of matrices, but instead just set the last matrix equal to the zero matrix to find linear independence, and find the value of alpha that way?
I don't understand what you mean. The third matrix will never be identical to 0, regardless of the value of α. If there were such an α, the set would of course be linearly dependent (as it would contain the zero matrix).
 
  • #3
Okay, thanks. So I set up the matrix:
[tex]
\begin{pmatrix}
1 & 2 & 3 & 7 & 2 & 6 & | & 0 & 0\\
1 & 0 & 0 & 0 & \alpha & 0 & | & 0 & 0
\end{pmatrix}
[/tex]

I then did row operations and made the matrix into reduced row echelon form, obtaining:
[tex]
\begin{pmatrix}
1 & 0 & 0 & 0 & \alpha & 0 & | & 0 & 0\\
0 & 1 & 3/2 & 7/2 & -\alpha/2+1 & 3 & | & 0 & 0
\end{pmatrix}
[/tex]

So α is now in both rows. How can I solve for the value of α? I did try back substitution, with row 1 giving v1 = -αv3, but there are still too many unknowns in row 2 for me to solve it further.
 
  • #4
You need a separate row for each of the 4 components of your matrices.
 
  • #5
Sorry, I don't really understand what you mean by having a separate row for each component. And how would you create it?
 
  • #6
You have a 4-dimensional vector space, as your matrices have 4 independent components.

Your equation for linear independence can be written as
[tex]a\begin{pmatrix}
1 \\
2 \\
1 \\
0\end{pmatrix}+b
\begin{pmatrix}
3 \\ 7\\
0 \\ 0
\end{pmatrix}+c
\begin{pmatrix}
2 \\ 6\\
\alpha \\ 0
\end{pmatrix}=
\begin{pmatrix}
0 \\ 0\\
0 \\ 0
\end{pmatrix}
[/tex]
You can convert this into a matrix in the usual way.
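If you want to sanity-check the row reduction numerically, here is a minimal sketch in Python (assuming the sympy library is available, and using a few arbitrary sample values of α); it stacks the three flattened matrices as columns, exactly as above, and row reduces:

[code]
from sympy import Matrix

# Flatten each 2x2 matrix into a column vector (row-major order),
# as in the post above, and stack the three columns into one matrix.
def coefficient_matrix(alpha):
    m1 = Matrix([1, 2, 1, 0])      # [[1, 2], [1, 0]]
    m2 = Matrix([3, 7, 0, 0])      # [[3, 7], [0, 0]]
    m3 = Matrix([2, 6, alpha, 0])  # [[2, 6], [alpha, 0]]
    return m1.row_join(m2).row_join(m3)

# Row reduce for a few sample values of alpha.
for alpha in (0, 1, 2):
    rref, pivots = coefficient_matrix(alpha).rref()
    # Three pivot columns means a = b = c = 0 is the only solution,
    # i.e. the set is linearly independent for that alpha.
    print(alpha, len(pivots) == 3)
[/code]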
 
  • #7
So you would convert that into a matrix and reduce it to reduced row echelon form, and since the set is supposed to be linearly independent, you should end up with only one nonzero value in each row, with the rest zero.
I have tried solving the matrix and obtained:
[tex]
\begin{pmatrix}
1 & 2 & 0 & | & 0\\
0 & 1 & 2 & | & 0\\
0 & 0 & \alpha+4 & | & 0\\
0 & 0 & 0 & | & 0
\end{pmatrix}
[/tex]
However, I don't know how to take it further from there.
Since the set is supposed to be linearly independent, the entry α+4 should end up as 1, and the "2's" should end up as 0 as well.
Did I make an error somewhere, seeing as I still have some "2's" in my matrix and it is not yet fully reduced?
 
  • #8
As you have written, let

[tex]a\begin{pmatrix}
1 & 2\\
1 & 0
\end{pmatrix}+b
\begin{pmatrix}
3 & 7\\
0 & 0
\end{pmatrix}+c
\begin{pmatrix}
2 & 6\\
\alpha & 0
\end{pmatrix}=
\begin{pmatrix}
0 & 0\\
0 & 0
\end{pmatrix}
[/tex]

Then,

[tex]\begin{pmatrix}
a+3b+2c & 2a+7b+6c\\
a+c\alpha & 0
\end{pmatrix}
=
\begin{pmatrix}
0 & 0\\
0 & 0
\end{pmatrix}[/tex]

This can be written as a system of linear equations (which is another representation of what mfb has written above):

[tex]
a+3b+2c=0\\
2a+7b+6c=0\\
a+c\alpha=0\\
[/tex]

If we reduce the system to its echelon form we get

[tex]
a+3b+2c=0\\
b+2c=0\\
c(\alpha+4)=0\\
[/tex]

For the set of 3 matrices to be linearly independent, [itex]\alpha\neq-4[/itex].
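For [itex]\alpha = -4[/itex] the third equation holds for any c; taking c = 1 gives b = -2 and a = 4, so 4 times the first matrix minus 2 times the second plus the third is the zero matrix. A quick check of that relation, as a minimal sketch in Python assuming sympy is available:

[code]
from sympy import Matrix, zeros

# With alpha = -4 the echelon system above has the nontrivial solution
# a = 4, b = -2, c = 1, so this combination should be the zero matrix.
M1 = Matrix([[1, 2], [1, 0]])
M2 = Matrix([[3, 7], [0, 0]])
M3 = Matrix([[2, 6], [-4, 0]])  # alpha = -4

print(4*M1 - 2*M2 + M3 == zeros(2, 2))  # True: dependent when alpha = -4
[/code]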
 

FAQ: Finding values for a linearly independent subset

What does it mean to find values for a linearly independent subset?

Finding values for a linearly independent subset means determining the parameter values (such as α in this problem) for which no element of the subset can be written as a linear combination of the other elements. The subset is then independent because the only linear combination of its elements that equals zero is the one with all coefficients equal to zero.

Why is it important to find values for a linearly independent subset?

It is important to find values for a linearly independent subset because it allows for simpler and more efficient calculations and analysis. Linear independence is a fundamental concept in linear algebra and is essential for solving systems of equations and understanding the behavior of vectors and matrices.

How do you determine if a subset is linearly independent?

To determine if a subset is linearly independent, you can use the definition of linear independence: a set of vectors is linearly independent if the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is c1 = c2 = ... = cn = 0. In other words, the only way to obtain the zero vector as a linear combination of the subset is to take every coefficient equal to 0.
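In practice this check is often done by placing the vectors as the columns of a matrix and computing its rank. A small illustrative sketch in Python with sympy (the example vectors here are arbitrary, not taken from the thread):

[code]
from sympy import Matrix

# Columns are the vectors being tested; full column rank means the only
# solution of c1*v1 + c2*v2 + c3*v3 = 0 is c1 = c2 = c3 = 0.
vectors = Matrix([[1, 0, 1],
                  [0, 1, 1],
                  [0, 0, 0]])
print(vectors.rank() == vectors.cols)  # False: the third column is v1 + v2
[/code]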

What is the process for finding values for a linearly independent subset?

The process involves writing the equation c1v1 + c2v2 + ... + cnvn = 0, converting it into a homogeneous system of linear equations in the coefficients (and any parameters such as α), and solving that system, for example by Gaussian elimination. The subset is linearly independent for exactly those parameter values that force every coefficient to be zero.

Can a subset be both linearly independent and dependent?

No, a subset cannot be both linearly independent and dependent. By definition, in a linearly independent subset no element can be expressed as a linear combination of the others, whereas in a linearly dependent subset at least one element can be.
