Seemingly simple linear algebra

In summary: when there is no x with Ax = b (that is, b is not in the column space of A), there exists a z such that A^{T}z = 0 and b^{T}z ≠ 0. This is a special case of Farkas' lemma, which is more general: exactly one of the two systems Ax = b, x ≥ 0 and A^{T}y ≥ 0, b^{T}y < 0 has a solution.
  • #1
phasic
Why is it that there exists a z such that A[itex]^{T}[/itex]z = 0 and b[itex]^{T}[/itex]z ≠ 0 when there does not exist an x such that Ax = b, i.e. when b is not in the range space of A, where A is an n x m matrix?
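
A concrete instance of what I mean (a small numerical sketch, assuming NumPy; this particular A, b, and z are just an illustrative choice):

[code]
import numpy as np

# A is 3 x 2; its columns span only a plane in R^3
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 0.0, 0.0])   # chosen outside that plane

# Least squares finds the best x; a nonzero residual shows Ax = b has no solution
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(A @ x - b) > 1e-9)   # True: b is not in C(A)

# z spans the left null space of A, and it is not orthogonal to b
z = np.array([1.0, 1.0, -1.0])
print(A.T @ z)   # [0. 0.]   (A^T z = 0)
print(b @ z)     # 1.0       (b^T z != 0)
[/code]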
 
  • #2
"Range space" ?? Do you mean "row space" ??
 
  • #3
Actually, it means the column space. So there is no linear combination of A's columns that gives b, which must mean that there is no x for which Ax = b, right?
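
One way to check this numerically: b is in C(A) exactly when appending b to A's columns does not raise the rank. A small sketch (assuming NumPy; the matrix and vectors are made up for illustration):

[code]
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])      # rank-1 matrix
b_in  = np.array([3.0, 6.0])    # a combination of A's columns
b_out = np.array([1.0, 0.0])    # not in the column space

def in_column_space(A, b):
    # b is in C(A) exactly when appending b does not increase the rank
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_column_space(A, b_in))   # True:  some x solves Ax = b
print(in_column_space(A, b_out))  # False: no x solves Ax = b
[/code]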
 
  • #4
phasic said:
Why is it that for arbitrary z, A[itex]^{T}[/itex]z = 0 and b[itex]^{T}[/itex]z ≠ 0 when b is not in the range space of A when Ax = b, where A is an n x m matrix?

Huh? How can 'b' not be in the range space of A when Ax = b? That is contradictory!
 
  • #5
Edited the post so that it makes more sense. Does it now?
 
  • #6
phasic said:
Edited the post so that it makes more sense. Does it now?

Yes.
 
  • #7
phasic said:
Actually, it means column space. This means that there is no linear combination of A's columns that gives b. This must mean that there is no x for which Ax = b, right?

Okay, sorry. I am just not used to using that term. I usually use the following for the four fundamental subspaces:

C(A) - column space (of A)
[itex] C(A^T) [/itex] - row space (of A)
N(A) - null space (of A)
[itex] C(A^T) [/itex] - left null space of A (i.e. the null space of A transpose)
 
  • #8
So the column space of A transpose is the null space of A?
 
  • #9
phasic said:
Why is it that there exists a z such that A[itex]^{T}[/itex]z = 0 and b[itex]^{T}[/itex]z ≠ 0 when there does not exist an x such that Ax = b, i.e. when b is not in the range space of A, where A is an n x m matrix?

What have you tried so far? Show your work.
 
  • #10
phasic said:
So the column space of A transpose is the null space of A?

Whoops... my error. No, it is not. The null space of A transpose is NOT the column space of A transpose. I meant to write

[tex] N(A^T) [/tex]

The null space of A transpose is of course the set of all vectors y such that

[tex] A^T y \, = \, 0 [/tex]

We sometimes call it the "left null space" because, if we take the transpose of both sides of the above equation,

[tex] (A^Ty)^T \, = \, 0^T [/tex]

we get

[tex] y^T A \, = \, 0^T [/tex]

Sorry for my mistake. I'm going to try to edit it and correct my typo before it causes confusion. Unfortunately I don't see an edit function available :(
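
To check numerically that the two forms say the same thing (a small sketch, assuming NumPy; this particular A is made up, and its left null space happens to be one-dimensional):

[code]
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# The left null space N(A^T) is spanned by the left singular vectors
# belonging to zero singular values of A.
U, s, Vt = np.linalg.svd(A)
y = U[:, 2]                      # A has rank 2, so the 3rd left singular vector spans N(A^T)

print(np.allclose(A.T @ y, 0))   # True: A^T y = 0
print(np.allclose(y @ A, 0))     # True: y^T A = 0^T, the same statement transposed
[/code]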
 
  • #11
This statement came from the book Convex Optimization by Boyd and Vandenberghe. I forget which page now, but the idea comes up again, in a different form, in Farkas' lemma.

http://en.wikipedia.org/wiki/Farkas'_lemma

Let A be an n × m matrix and b an n-dimensional vector. Then, exactly one of the following two statements is true:

(1) There exists an x ∈ R[itex]^{m}[/itex] such that Ax = b and x ≥ 0.
(2) There exists a y ∈ R[itex]^{n}[/itex] such that A[itex]^{T}[/itex]y ≥ 0 and b[itex]^{T}[/itex]y < 0.

This is close, but the original statement replaces alternative (2) with equalities: there exists a z such that A[itex]^{T}[/itex]z = 0 and b[itex]^{T}[/itex]z ≠ 0 (and (1) becomes Ax = b without the x ≥ 0 constraint).
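
The two alternatives can be checked computationally as feasibility problems (a rough sketch, assuming SciPy's linprog; the specific A, b, and the box bounds on y are illustrative choices, the bounds just keep the second LP bounded):

[code]
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 0.0, 0.0])
n, m = A.shape

# (1) Is there an x >= 0 with Ax = b?  Feasibility LP with zero objective.
res1 = linprog(c=np.zeros(m), A_eq=A, b_eq=b, bounds=[(0, None)] * m)
print("system (1) feasible:", res1.status == 0)

# (2) Look for y with A^T y >= 0 and b^T y < 0:
#     minimize b^T y subject to -A^T y <= 0, with box bounds on y.
res2 = linprog(c=b, A_ub=-A.T, b_ub=np.zeros(m), bounds=[(-1, 1)] * n)
print("system (2) feasible:", res2.status == 0 and res2.fun < -1e-9)
[/code]

For this A and b, exactly one print should be True: (1) comes out infeasible, and (2) finds a certificate such as y = (-1, -1, 1).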
 

FAQ: Seemingly simple linear algebra

What is linear algebra?

Linear algebra is the branch of mathematics that studies linear equations, vectors, matrices, and their properties. It is used to solve problems involving systems of linear equations and to represent geometric transformations.

What are the basic operations in linear algebra?

The basic operations in linear algebra include addition, subtraction, and scalar multiplication of vectors and matrices, together with matrix-vector and matrix-matrix multiplication. There is no general division of matrices; when a matrix is invertible, "dividing" by it means multiplying by its inverse. These operations follow specific rules: addition is commutative and associative, while matrix multiplication is associative but not, in general, commutative.
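
For instance, in NumPy (a minimal sketch; the vectors and matrix are arbitrary examples):

[code]
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(u + v)                 # vector addition
print(u - v)                 # vector subtraction
print(2.0 * M)               # scalar multiplication
print(M @ u)                 # matrix-vector product
print(np.linalg.inv(M) @ v)  # "division" by M: solve Mx = v via the inverse
[/code]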

How is linear algebra used in scientific research?

Linear algebra is used in a wide range of scientific research areas, including physics, engineering, computer science, and economics. It is used to model and solve complex systems, analyze data, and develop efficient algorithms.

What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are important concepts in linear algebra. An eigenvector of a matrix is a nonzero vector that, when multiplied by the matrix, remains parallel to its original direction. The corresponding eigenvalue is the scalar that represents the amount of stretching or shrinking that occurs along the eigenvector.
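
For example (a minimal sketch, assuming NumPy; the matrix is an arbitrary symmetric example):

[code]
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(M)
lam = vals[0]            # an eigenvalue
v = vecs[:, 0]           # its eigenvector

print(np.allclose(M @ v, lam * v))   # True: Mv = lambda * v
[/code]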

How can I improve my understanding of linear algebra?

To improve your understanding of linear algebra, it is important to practice solving problems and working with matrices and vectors. You can also read textbooks or take online courses to learn the fundamentals and applications of linear algebra. Additionally, collaborating with others and seeking help from experts can aid in improving your understanding of the subject.
