Basic Questions in linear algebra and group theory

In summary, we discussed two questions related to matrices and tensors. The first asked whether one can infer from the determinant of a matrix whether the matrix is real or complex. The second asked whether tensors in an N-dimensional space can have indices bigger than N. We also discussed an example from A. Zee's book "Quantum Field Theory in a Nutshell" and the roles of the transpose and the Hermitian (complex) adjoint in determining whether a matrix is real. Further reading on the topic may help in finding a clearer answer.
  • #1
Heisenberg1993
1- How can one infer from the determinant of a matrix whether the latter is real or complex?
2- Can we have tensors in an N-dimensional space with indices bigger than N?
 
  • #2
1. I don't think you can. Is there a reason why you're asking?

2. No. The components of a tensor are the output you get when you take basis vectors as input. So the indices label basis vectors as well as tensor components. For example, the components of the metric tensor at a point p are given by ##g_{ij}(p)=g_p(e_i,e_j)##. (The metric tensor field ##g## takes each point ##p## to a "metric at p", which I'm denoting by ##g_p##).
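To make the index bookkeeping concrete, here is a minimal sketch (assuming numpy) of that statement at a single fixed point, so the point-dependence of ##g## is dropped; the particular bilinear form G below is an arbitrary illustrative choice:

```python
# A minimal sketch of "components = tensor evaluated on basis vectors".
# The bilinear form G is an arbitrary illustrative choice.
import numpy as np

G = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # matrix of a symmetric bilinear form on R^2

def g(u, v):
    """The metric as a bilinear map: g(u, v) = u^T G v."""
    return u @ G @ v

# Standard basis vectors e_1, e_2
e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# The components g_ij = g(e_i, e_j) recover the entries of G one by one,
# which is the sense in which the indices label basis vectors.
for i in range(2):
    for j in range(2):
        print(f"g_{i+1}{j+1} = {g(e[i], e[j])}")
```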
 
  • #3
Fredrik said:
1. I don't think you can. Is there a reason why you're asking?

2. No. The components of a tensor are the output you get when you take basis vectors as input. So the indices label basis vectors as well as tensor components. For example, the components of the metric tensor at a point p are given by ##g_{ij}(p)=g_p(e_i,e_j)##. (The metric tensor field ##g## takes each point ##p## to a "metric at p", which I'm denoting by ##g_p##).
Thanks for the reply.
Concerning the first question: in A. Zee's book "Quantum Field Theory in a Nutshell", it is stated that "any orthogonal matrix can be written as ##O=e^A##. From the conditions ##O^TO=1## and ##\det(O)=1##, we can infer that A is real and antisymmetric." From the first condition, I deduced antisymmetry. However, I didn't know how to deduce that A is real from the second. That's why I asked the question.
Concerning the second question: can't I define a rank-3 tensor, which takes 3 basis vectors as input, in a two-dimensional space? I will get two indices with the same value, but what's wrong with that (assuming the tensor is not antisymmetric)?
 
  • #4
Heisenberg1993 said:
Thanks for the reply.
Concerning the first question: in A. Zee's book "Quantum Field Theory in a Nutshell", it is stated that "any orthogonal matrix can be written as ##O=e^A##. From the conditions ##O^TO=1## and ##\det(O)=1##, we can infer that A is real and antisymmetric." From the first condition, I deduced antisymmetry. However, I didn't know how to deduce that A is real from the second. That's why I asked the question.
I'm puzzled by this too. The term "orthogonal" is only used for matrices whose components are real numbers. So wouldn't a theorem that says that there's an A such that ##O=e^A## mean that the components of A are real? When we're dealing with a real vector space, every matrix has real components unless we explicitly say otherwise.

If the theorem says that A may be complex, then it makes sense to try to prove that the conditions imply that A is real. But I don't see how to prove it either.
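As a numerical aside, the forward direction of Zee's claim is at least easy to check, assuming numpy and scipy: a real antisymmetric A does give an orthogonal O with unit determinant. This doesn't settle the converse being discussed here, and the matrix size and random seed below are arbitrary choices:

```python
# Sanity check of the forward direction: real antisymmetric A
# gives an orthogonal O = e^A with det(O) = 1.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M - M.T                    # real and antisymmetric: A^T = -A
O = expm(A)                    # O = e^A

print(np.allclose(O.T @ O, np.eye(3)))    # True: O^T O = 1
print(np.isclose(np.linalg.det(O), 1.0))  # True: det(O) = 1, since tr(A) = 0
```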

Heisenberg1993 said:
Concerning the second question: can't I define a rank-3 tensor, which takes 3 basis vectors as input, in a two-dimensional space?
You can. You will have three indices that each take values in the set {1,2}.
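For instance, here is a minimal sketch (assuming numpy; the trilinear map T is made up purely for illustration) of a rank-3 tensor on a two-dimensional space, with ##2^3=8## components:

```python
# A rank-3 tensor on a 2-dimensional space: three indices, each in {1, 2}.
import numpy as np

e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # basis of R^2

def T(u, v, w):
    """An arbitrary trilinear (not antisymmetric) map on R^2."""
    return u[0] * v[0] * w[1] + 2.0 * u[1] * v[0] * w[0]

# Components T_ijk = T(e_i, e_j, e_k); repeated index values are fine.
components = np.array([[[T(e[i], e[j], e[k]) for k in range(2)]
                        for j in range(2)]
                       for i in range(2)])
print(components.shape)  # (2, 2, 2): three indices, each taking two values
```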
 
  • #5
Heisenberg1993 said:
1- How can one infer from the determinant of a matrix whether the latter is real or complex?
[...]

1. You can't. Consider the matrix ##\mathrm{diag}(e^{i\phi}, e^{-i\phi})## with ##\phi\in\mathbb{R}^*##: it has complex nonzero entries, yet its determinant is 1.
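For what it's worth, this counterexample is easy to verify numerically, e.g. with numpy (the value of ##\phi## is arbitrary):

```python
# Verifying the counterexample: complex entries, determinant 1.
import numpy as np

phi = 0.7
O = np.diag([np.exp(1j * phi), np.exp(-1j * phi)])

print(np.isclose(np.linalg.det(O), 1.0))  # True: determinant is 1
print(np.iscomplexobj(O))                 # True: the entries are complex
```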
 
  • #6
Fredrik said:
I'm puzzled by this too. The term "orthogonal" is only used for matrices whose components are real numbers. So wouldn't a theorem that says that there's an A such that ##O=e^A## mean that the components of A are real? When we're dealing with a real vector space, every matrix has real components unless we explicitly say otherwise.
If the theorem says that A may be complex, then it makes sense to try to prove that the conditions imply that A is real. But I don't see how to prove it either.

You can. You will have three indices that each take values in the set {1,2}.

So in a sense we don't need the second condition to show that A is real; the fact that we used the transpose of the matrix is already enough. If the matrix were complex, then we should have used its Hermitian adjoint instead, right?
 
  • #7
The adjoint is more useful when we're working with complex matrices, but we can certainly transpose a complex matrix if we want to. However, the Wikipedia definition of "orthogonal" is

In linear algebra, an orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors), i.e.
$$Q^TQ=QQ^T=I,$$ where ##I## is the identity matrix.
Unfortunately this doesn't imply that A is real in any obvious way. But it makes me wonder if the theorem that Zee is using to rewrite ##O## as ##e^A## really says that we may need an A that isn't real. It seems that this would have been stated explicitly in the theorem, and in that case, Zee should have stated it explicitly too.
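As a small numerical aside on the transpose-versus-adjoint distinction (my own example, not from the thread, assuming numpy): a complex matrix can satisfy ##Q^TQ=I## without being unitary, so the two conditions really are different:

```python
# A "rotation" by a non-real angle z is complex orthogonal (Q^T Q = I)
# but not unitary. The value z = i is an arbitrary choice.
import numpy as np

z = 1.0j
Q = np.array([[np.cos(z), -np.sin(z)],
              [np.sin(z),  np.cos(z)]])

print(np.allclose(Q.T @ Q, np.eye(2)))         # True:  Q^T Q = I
print(np.allclose(Q.conj().T @ Q, np.eye(2)))  # False: Q is not unitary
```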
 
  • #8
I don't think Zee is all that careful when talking about the math, so reading another book that covers the same subject might help.
 
  • #9
Theorem 2.9 in "Lie groups, Lie algebras, and representations" by Brian C. Hall says that for every invertible matrix M, there's a complex matrix A such that ##M=e^A##. I'm going to take a look at the proof and see if I can understand what's going on there.
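For what it's worth, the statement is easy to illustrate numerically with scipy's matrix logarithm (this is an illustration, not the proof); the example matrix below is an arbitrary invertible choice that happens to have no real logarithm:

```python
# An invertible M with no real logarithm (two distinct negative
# eigenvalues); scipy's matrix logarithm finds a complex A with e^A = M.
import numpy as np
from scipy.linalg import expm, logm

M = np.diag([-1.0, -2.0])  # arbitrary invertible example

A = logm(M)                     # complex: diag(i*pi, ln 2 + i*pi)
print(np.allclose(expm(A), M))  # True: e^A recovers M
print(np.iscomplexobj(A))       # True: here A is genuinely complex
```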
 

FAQ: Basic Questions in linear algebra and group theory

What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations and their representations in vector spaces. It involves the manipulation and analysis of vectors, matrices, and linear transformations.

Why is linear algebra important?

Linear algebra is a fundamental tool in many fields, including physics, engineering, economics, and computer science. It provides a framework for solving complex systems of equations, understanding geometric concepts, and analyzing data.

What are the basic concepts in linear algebra?

The basic concepts in linear algebra include vectors, matrices, linear transformations, and eigenvalues and eigenvectors. Other important topics include systems of linear equations, vector spaces, and determinants.

What is group theory?

Group theory is a branch of mathematics that deals with the study of symmetry and structure in mathematical objects. It involves the study of groups, which are sets of elements that satisfy certain algebraic properties, such as closure, associativity, identity, and invertibility.

What are some applications of group theory?

Group theory has many applications in various fields, including physics, chemistry, cryptography, and computer science. It is used to study the symmetries of physical systems, understand molecular structures, create secure encryption algorithms, and develop efficient algorithms for solving computational problems.
