Square Matrix Proof: Diagonal Entries & Properties

In summary, the conversation discussed the solutions of a linear system Mx = 0 with coefficient matrix M, and how to describe these solutions geometrically. It also touched on the properties of matrices satisfying A = -A^T, and on the general solution of Mx = d, with the goal of understanding these solutions geometrically.
  • #1
nacho-man
Hi, I have two questions.

firstly;
1) If M is the coefficient matrix of a linear system, Mx = 0, what are the solutions? Describe the set of solutions geometrically.

Now this just has the trivial solution x = 0,
but to describe the solution geometrically, how would you do that? Do you just describe it as the origin, or 0-space perhaps?

And, in regards to the topic title:

2) Let A be a square matrix satisfying A = $-A^T$
- prove that the diagonal entries of A are all zero,
- prove that if B is any square matrix, then $A = (B - B^T)$ satisfies the property that $A = -A^T$
 
  • #2
nacho said:
1) If M is the coefficient matrix of a linear system, Mx = 0, what are the solutions? Describe the set of solutions geometrically.

Now this just has the trivial solution x = 0,
but to describe the solution geometrically, how would you do that? Do you just describe it as the origin, or 0-space perhaps?...

Let's suppose we have a 2 x 2 linear system, which you can write as...

$\displaystyle m_{11}\ x_{1} + m_{12}\ x_{2} = 0$

$\displaystyle m_{21}\ x_{1} + m_{22}\ x_{2} = 0\ (1)$

In the $x_{1}, x_{2}$ plane, (1) is represented by two straight lines having the origin as a common point. The same holds for systems like (1) in higher dimensions [$\mathbb R^{3}$, $\mathbb R^{4}$ and so on...]
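As a concrete illustration (the numbers here are chosen only as an example, not taken from the original problem), the system

$\displaystyle x_{1} + 2\ x_{2} = 0$

$\displaystyle 3\ x_{1} + x_{2} = 0$

consists of two distinct lines through the origin, so the only solution is $x_{1} = x_{2} = 0$; if instead the second equation were a multiple of the first, say $\displaystyle 2\ x_{1} + 4\ x_{2} = 0$, the two lines would coincide and every point of the line $x_{2} = -\frac{x_{1}}{2}$ would be a solution.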

Kind regards

$\chi$ $\sigma$
 
  • #3
There is no reason to suppose the ONLY solution of:

$M\mathbf{x} = \mathbf{0}$

is $\mathbf{x} = \mathbf{0}$

unless $M$ is an INVERTIBLE matrix, if $M$ is square, or more generally, if $M$ is $m \times n$:

a) of full (column) rank, if $m > n$

b) never, if $m < n$.

The set of all vectors $\mathbf{x}$ such that $M\mathbf{x} = \mathbf{0}$ is a subspace of the domain of $M$ (regarding an $m \times n$ matrix as a linear function:

$\Bbb R^n \to \Bbb R^m$, with $\mathbf{x} \mapsto M\mathbf{x}$), and this subspace is called, variously:

the kernel of $M$
the null-space of $M$
the solution space of the system $M\mathbf{x} = \mathbf{0}$.

For example, if

$M = \begin{bmatrix}1&0&0\\0&1&0\\0&0&0 \end{bmatrix}$

it is not hard to see the null-space consists of all vectors of the form $(0,0,a)$ for any real number $a$. This is equivalent to the system:

$x_1 = 0$
$x_2 = 0$

Where we are looking for solutions $\mathbf{x} = (x_1,x_2,x_3)$.

This type of system (indeed ANY system of type (b) above) is called "under-determined"; systems of type (a) above are called "over-determined" and may not have any solutions beyond the trivial solution (but they might, because the equations may have "hidden dependencies" that we discover upon row-reducing $M$).
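To illustrate the "hidden dependencies" remark (an example of my own, not taken from the original problem), the system

$x_1 + x_2 = 0$

$2x_1 + 2x_2 = 0$

$3x_1 + 3x_2 = 0$

has three equations in two unknowns, so it is over-determined, yet every equation is a multiple of the first; row-reducing leaves a single independent equation, and the solution set is the whole line $x_2 = -x_1$ rather than just the origin.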

If our original space that $\mathbf{x}$ lies in is say, $\Bbb R^3$, the possibilities are:

1) Any $\mathbf{x}$ is a solution (if $M$ is the 0-matrix)
2) The solution set is a PLANE in $\Bbb R^3$
3) The solution set is a LINE in $\Bbb R^3$ (that is scalar multiples of a single vector)
4) The solution set is trivial (only the origin works).

Seeing as how in this "limited" example, your thought that only #4 can happen is only batting .250, you might want to re-think your approach.
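To make these four cases concrete (sample matrices of my own choosing, not from your problem):

1) $M = \begin{bmatrix}0&0&0\\0&0&0\\0&0&0\end{bmatrix}$: every $\mathbf{x}$ in $\Bbb R^3$ is a solution.

2) $M = \begin{bmatrix}1&0&0\\0&0&0\\0&0&0\end{bmatrix}$: the solution set is the plane $x_1 = 0$.

3) $M = \begin{bmatrix}1&0&0\\0&1&0\\0&0&0\end{bmatrix}$: the solution set is the line of vectors $(0,0,a)$, as in the example above.

4) $M = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$: only the origin.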

**********

Your first question is vague; we need to know a bit more about the matrix $M$ to really say anything useful.

**********

For your second question, suppose $A = (a_{ij})$. Then $A^T = (a_{ji})$.

If $A = -A^T$ this means that for every entry $a_{ij}$ we have:

$a_{ij} = -a_{ji}$

The diagonal entries are $a_{ii}$ (that is: $i = j$). From the above, we have:

$a_{ii} = -a_{ii}$ for every $i$. What can you say about a real number $r$ for which:

$r = -r$?

For the second part, let $A = B - B^T$.

Then $-A^T = -[(B - B^T)^T] = -[B^T - (B^T)^T] = -B^T -(-B) = ?$
 
  • #4
Deveno said:
For your second question, suppose $A = (a_{ij})$. Then $A^T = (a_{ji})$.

If $A = -A^T$ this means that for every entry $a_{ij}$ we have:

$a_{ij} = -a_{ji}$

The diagonal entries are $a_{ii}$ (that is: $i = j$). From the above, we have:

$a_{ii} = -a_{ii}$ for every $i$. What can you say about a real number $r$ for which:

$r = -r$?
Extending what has been said, it can also be shown that $a_{ii} = 0$ as follows:

If you have $a_{ii} = -a_{ii}$ for every $i$, you can simply add $a_{ii}$ to both sides of the equation, meaning you have:
$2a_{ii} = 0$. Solve for $a_{ii}$ and voila, you have shown that the diagonal entries are equal to 0.
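For the second part, one way to finish the chain from the previous post (just filling in the last step): $-A^T = -B^T - (-B) = B - B^T = A$, so $A = B - B^T$ does indeed satisfy $A = -A^T$.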
 
  • #5
Thank you for the response guys!

I just realized how silly I was for question 1;

the previous parts of it determined that the matrix we are dealing with is

$M = \begin{bmatrix} 1 & 2 & -1 & 3 & 8 \\ -3 & -1 & 8 & 6 & 1 \\ -1 & 0 & 3 & 1 & -2 \end{bmatrix}$

which I reduced to

$\begin{bmatrix} 1 & 0 & -3 & 0 & 4 \\ 0 & 1 & 1 & 0 & -1 \\ 0 & 0 & 0 & 1 & 2 \end{bmatrix}$

and that is called the coefficient matrix M.

We are then told M is the augmented matrix of a linear system Ax = b, so M = (A|b), and asked to find the solutions, then describe the solutions geometrically.

So I set the right-hand side of that reduced row echelon form equal to

$\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}$

and my solution was

$\mathbf{x} = (x_1, x_2, x_3, x_4, x_5) = (b_1, b_2, 0, b_3, 0) + s(3, -1, 1, 0, 0) + t(-4, 1, 0, -2, 1),$

where I had let the free variables be $x_3 = s$ and $x_5 = t$, with $s$ and $t$ parameters.

Then we arrive at the question I posted: M is the coefficient matrix of a linear system, Mx = 0; what are the solutions, and how do we describe these solutions geometrically?
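(As a quick sanity check on those direction vectors, this is my own arithmetic, so it is worth double-checking: multiplying the original $M$ by each of them does give the zero vector, e.g.

$\begin{bmatrix} 1 & 2 & -1 & 3 & 8 \\ -3 & -1 & 8 & 6 & 1 \\ -1 & 0 & 3 & 1 & -2 \end{bmatrix} \begin{bmatrix} 3 \\ -1 \\ 1 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 - 2 - 1 \\ -9 + 1 + 8 \\ -3 + 0 + 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix},$

and similarly for $(-4, 1, 0, -2, 1)$, so both lie in the null space of $M$.)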
Finally, in addition, I have another question:

Let $w = (1,0,1,1,1)^T$. Find $d$ such that $Mw = d$. What is the general solution of $Mx = d$? Describe this set geometrically.

So am I correct in saying

$d = \begin{bmatrix} 1 \\ 0 \\ 1 \\ 1 \\ 1 \end{bmatrix}$

and that $Mx = d$ is just $x = d$?
 
  • #6
nacho said:
Thank you for the response guys!

I just realized how silly I was for question 1;

the previous parts of it determined that the matrix we are dealing with is

$M = \begin{bmatrix} 1 & 2 & -1 & 3 & 8 \\ -3 & -1 & 8 & 6 & 1 \\ -1 & 0 & 3 & 1 & -2 \end{bmatrix}$

which I reduced to

$\begin{bmatrix} 1 & 0 & -3 & 0 & 4 \\ 0 & 1 & 1 & 0 & -1 \\ 0 & 0 & 0 & 1 & 2 \end{bmatrix}$
Yeah, that's right. That is the Reduced Row Echelon Form (RREF) for that matrix.

Now, in regards to Mx = 0: you have to realize that we're dealing with a homogeneous equation. Geometrically, this means the lines, planes, or hyperplanes represented by the equations in a homogeneous system all pass through the origin. Also note that x is orthogonal to every row of M, since each row's dot product with x equals 0.
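To spell that last point out a little (a small restatement, not extra information from the problem): if the rows of $M$ are $\mathbf{r}_1, \dots, \mathbf{r}_m$, then

$M\mathbf{x} = \mathbf{0} \iff \mathbf{r}_i \cdot \mathbf{x} = 0 \text{ for each } i = 1, \dots, m,$

so the solution set is exactly the intersection of the hyperplanes through the origin whose normal vectors are the rows $\mathbf{r}_i$.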
 
  • #7
Hi again,

I have updated my last post on this page with what I am having difficulty with:

Firstly;
We are told M is the augmented matrix of a linear system Ax = b, so M = (A|b), and asked to find the solutions, then describe the solutions geometrically.

So I set the right-hand side of that reduced row echelon form equal to

$\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}$

and my solution was

$\mathbf{x} = (x_1, x_2, x_3, x_4, x_5) = (b_1, b_2, 0, b_3, 0) + s(3, -1, 1, 0, 0) + t(-4, 1, 0, -2, 1),$

where I had let the free variables be $x_3 = s$ and $x_5 = t$, with $s$ and $t$ parameters. Then we arrive at the question I posted: M is the coefficient matrix of a linear system, Mx = 0; what are the solutions, and how do we describe these solutions geometrically?

First of all, how do I express a set of solutions geometrically?
Is there a clear-cut procedure to follow? Someone mentioned that whether it is a line, plane, or hyperplane is determined by the number of free variables in the system,
where 1 free variable = line,
2 = plane, 3+ = hyperplane.

Is this so?

Secondly, how do I do:

Let $w = (1,0,1,1,1)^T$. Find $d$ such that $Mw = d$. What is the general solution of $Mx = d$? Describe the set of solutions geometrically.
 

FAQ: Square Matrix Proof: Diagonal Entries & Properties

1. What is a square matrix?

A square matrix is a matrix with an equal number of rows and columns. An n × n square matrix therefore has its entries arranged in a square array.

2. What are diagonal entries in a square matrix?

Diagonal entries refer to the elements of a square matrix that are located on the main diagonal, which extends from the top left corner to the bottom right corner. In other words, they are the elements whose row and column indices are equal, such as a11, a22, a33, etc.

3. Why are diagonal entries important in a square matrix proof?

Diagonal entries are important in a square matrix proof because they hold certain properties that can be used to simplify or solve the proof. For example, they can be used to prove if a matrix is symmetric or skew-symmetric.

4. What are the properties of diagonal entries in a square matrix?

Some properties of diagonal entries in a square matrix include: they are unchanged by transposition (the diagonal entries of A^T are the same as those of A), their sum gives the trace of the matrix, and for a skew-symmetric matrix (one satisfying A = -A^T) they are all zero.

5. How do you prove properties of diagonal entries in a square matrix?

To prove properties of diagonal entries in a square matrix, work entry-wise from the definitions: write A = (aij), use properties of matrices such as the commutative and associative properties of addition and multiplication, and apply the properties of transposes and traces. It is also important to understand the definitions and properties of diagonal entries in order to use them effectively in a proof.
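For example (a short illustration along these lines): writing $A = (a_{ij})$, the diagonal entries of $A^T$ are $(A^T)_{ii} = a_{ii}$, so $\text{tr}(A^T) = \sum_i a_{ii} = \text{tr}(A)$; that is, the trace is unchanged by transposition.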
