Independence of Rows and Columns in Matrices

In summary, a 3x4 matrix can have rank at most 3, so the maximum number of linearly independent rows or columns is 3.
  • #1
stunner5000pt
a few questions
a) Can a 3x4 matrix have independent columns? Rows? Explain.
If I were to reduce to row echelon form, could I potentially have 4 leading 1s? I'm not quite sure about this.
If I reduce this 3x4 matrix to row echelon form, then the number of rows is less than the number of variables, so the answer is no.
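(For concreteness, here is one possible 3x4 matrix, my own illustration rather than one from the problem, already in row echelon form with the maximum of 3 leading 1s, one per row:
[tex] \begin{pmatrix} 1 & 0 & 0 & 2 \\ 0 & 1 & 0 & 3 \\ 0 & 0 & 1 & 4 \end{pmatrix} [/tex]
Its 3 rows are independent, but its 4 columns are vectors in a 3-dimensional space, so they cannot all be independent.)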

b) If A is a 4x3 matrix and rank A = 2, can A have independent columns? Rows? Explain.
OK, rank A = 2 means that of the 4 rows only 2 are nonzero when A is in row echelon form. There are only 2 leading 1s, so at most 2 of the 3 columns can be independent, meaning at least one column depends on the others. So independent columns are not possible.
Independent rows are not possible either.
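(For concreteness, here is one possible rank-2 4x3 matrix, again my own illustration, already in row echelon form:
[tex] \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} [/tex]
Only 2 rows are nonzero, and the third column is the sum of the first two, so neither the 4 rows nor the 3 columns form an independent set.)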

c) Can a non-square matrix have its rows independent and its columns independent?
I'm not sure about this. If A is m x n with m > n, then each of the m rows has only n entries, so it is not possible for all the rows to be independent. As for the columns, I have no idea.
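(One way to settle this in a line: independent rows would force rank A = m, independent columns would force rank A = n, and in general
[tex] \operatorname{rank} A \le \min(m, n), [/tex]
so both can hold only when m = n, i.e. when A is square.)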

If A is m x n and B is n x m, show that AB = 0 iff [itex] col B \subseteq null A [/itex].
Suppose AB = 0.
Let the columns of B be [itex]C_{j}[/itex] and the rows of A be [itex]R_{i}[/itex].
Then the (i, j) entry of AB is [itex]R_{i} C_{j} = 0[/itex] for all i and j, which says [itex]A C_{j} = 0[/itex] for every j. Thus every [itex]C_{j}[/itex] belongs to null A.
Suppose [tex] col B \subseteq null A [/tex].
Then A times any column of B is zero. Thus AB = 0.
Is this proof adequate?
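(The step from entries to columns can also be written as a single identity: if [itex]C_{1}, \dots, C_{m}[/itex] are the columns of B, then
[tex] AB = A \begin{pmatrix} C_{1} & \cdots & C_{m} \end{pmatrix} = \begin{pmatrix} AC_{1} & \cdots & AC_{m} \end{pmatrix}, [/tex]
so AB = 0 exactly when every [itex]AC_{j} = 0[/itex], i.e. when every column of B lies in null A.)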

Your input is greatly appreciated!
 
  • #2
Remember that the rank of a matrix is equal to the rank of its transpose; this allows you to interchange rows and columns in your explanation.

A 3x4 matrix can have at most rank 3, so what does that tell you about the maximum number of linearly independent rows/columns?
For the second, the rank is now given - what does this tell you?
You can use the same argument again for a non-square m x n matrix. Suppose m > n; then the maximal possible rank is n.
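(Stated as a formula, the fact being used here is
[tex] \operatorname{rank} A = \operatorname{rank} A^{T}, [/tex]
so any bound on the number of independent rows is automatically a bound on the number of independent columns.)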
 
  • #3
For a 3x4 matrix
the rank can be at most 3,
so it can have at most 3 linearly independent rows
and at most 3 linearly independent columns.

For the second,
if rank A = 2,
then there are 2 independent rows,
so at most only 2 independent columns?
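If it helps to sanity-check these counts numerically, here is a minimal sketch using numpy's matrix_rank; the matrices are my own illustrative examples, not from the problem.
[code]
import numpy as np

# A 3x4 matrix in row echelon form with 3 leading 1s (illustrative example).
A = np.array([[1, 0, 0, 2],
              [0, 1, 0, 3],
              [0, 0, 1, 4]])

# Rank is 3 = min(3, 4), so at most 3 of the rows and
# at most 3 of the 4 columns can be linearly independent.
print(np.linalg.matrix_rank(A))    # 3
print(np.linalg.matrix_rank(A.T))  # 3 (rank of the transpose is the same)

# A 4x3 matrix of rank 2 (third column = first column + second column).
B = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 0],
              [0, 0, 0]])

# Rank 2 means only 2 independent rows and only 2 independent columns,
# so neither all 4 rows nor all 3 columns are independent.
print(np.linalg.matrix_rank(B))    # 2
[/code]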
 

FAQ: Independence of Rows and Columns in Matrices

What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations, linear transformations, and vector spaces. It involves the use of matrices and vectors to solve problems related to systems of linear equations, geometric transformations, and data analysis.

What are matrices?

Matrices are rectangular arrays of numbers or symbols arranged in rows and columns. They are commonly used in linear algebra to represent linear transformations and solve systems of linear equations. Matrices can also be used for data organization and manipulation in fields such as computer science and engineering.

What is the importance of matrix operations in linear algebra?

Matrix operations, such as addition, subtraction, multiplication, and inversion, are crucial in solving problems in linear algebra. These operations allow us to manipulate and transform matrices, which are used to represent linear equations and geometric transformations. They also help in solving systems of linear equations and performing computations in data analysis.

How is linear algebra used in real life?

Linear algebra has numerous applications in various fields, such as physics, computer science, economics, and engineering. It is used to solve problems related to systems of equations, data analysis, and geometric transformations. Linear algebra is also used in machine learning and artificial intelligence to train models and make predictions.

What are eigenvectors and eigenvalues in linear algebra?

Eigenvectors and eigenvalues are important concepts in linear algebra that are used to study the properties of linear transformations. An eigenvector of a matrix is a nonzero vector that is only scaled, not moved off its own line, when multiplied by the matrix, and the corresponding eigenvalue is the scalar by which it is scaled. These concepts have various applications, such as in data analysis and image processing.
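(In symbols: v is an eigenvector of A with eigenvalue [itex]\lambda[/itex] exactly when
[tex] A v = \lambda v, \qquad v \neq 0. [/tex])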
