Let ##T_{1}, T_{2}## be upper and lower triangular matrices, respectively. Show that there is no ##N\times N## matrix ##M## such that ##M = T_{1}\oplus T_{2}##.
I am not sure I get what the statement is asking; can't we take ##T_{1} = T_{2} = 0##? Here ##0## is the ##N\times N## matrix with zeros in all its entries...
There is no specific example, but my attempt at one would be to make the non-symmetric matrix symmetric. Then we would be able to use the usual formulas designed for symmetric matrices. Is this how it works?
Alternatively, do I just calculate the eigenvalues without making it symmetric? I don't...
I need to find the matrix transformation of ##y = \frac{1}{x}## onto ##y = \frac{-1}{3x-1}-2##
I think it's
$$
\begin{bmatrix} x' \\ y' \end{bmatrix} =
\begin{bmatrix} 3 & 0 \\ 0 & -1 \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix} +
\begin{bmatrix} -1 \\ -2 \end{bmatrix}
$$
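If it helps, here is a quick numerical check of a candidate map (a Python sketch; the M and c below are just my guess above, so swap in whatever map you want to test): apply ##(x', y') = M(x, y) + c## to points on ##y = 1/x## and see whether the images satisfy ##y' = \frac{-1}{3x'-1}-2##.

import numpy as np

# Candidate affine map (my guess above); edit M and c to test other maps.
M = np.array([[3.0, 0.0],
              [0.0, -1.0]])
c = np.array([-1.0, -2.0])

# Sample points on y = 1/x (avoiding x = 0).
x = np.linspace(0.5, 5.0, 10)
pts = np.stack([x, 1.0 / x])               # shape (2, 10)

# Apply the map: (x', y') = M (x, y) + c
xp, yp = M @ pts + c[:, None]

# Residual of the target curve y' = -1/(3x' - 1) - 2 at the mapped points
residual = yp - (-1.0 / (3.0 * xp - 1.0) - 2.0)
print(np.max(np.abs(residual)))            # ~0 only if the map is right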
Is it possible to multiply the matrix M with either A or ##\vec c##?
And if I have to write the matrices in this form, do I divide ##\vec c## by A or do I follow some other formula?
The basis he is talking about: {1,x,x²,x³,...}
I don't know how to answer this question. The only difference I can see between this Hermitian operator and the others we normally see is that X is acting on an infinite-dimensional space, and, since one of the rules involving Hermitian operators breaks down in the...
Given that the usual vector cross product is rotationally invariant, that is $$\mathbf R(a\times b) = (\mathbf R a)\times(\mathbf R b),$$ where ##a, b \in \mathbb{R}^3## are two arbitrary (column) vectors and ##\mathbf R## is a 3x3 rotation matrix, and given the cross product matrix operator...
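Not a proof, but a quick numerical sanity check of that identity is easy (Python sketch; the axis, angle and vectors are random placeholders):

import numpy as np
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)

# A proper rotation matrix about a random axis by a random angle
R = Rotation.from_rotvec(rng.normal(size=3)).as_matrix()

lhs = R @ np.cross(a, b)
rhs = np.cross(R @ a, R @ b)
print(np.allclose(lhs, rhs))    # True for proper rotations (det R = +1)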
Indeed, if we take the vector field dual to the covector field formed by the gradient of the quadratic interval of an 8-dimensional space with a Euclidean metric, then the Lie algebra of linear vector fields orthogonal (in the neutral metric) to this vector field is isomorphic to the...
Hi, I have been having some trouble finding the determinant of matrix A in this question.
Which relevant determinant property should I make use of to help me find the determinant of matrix A, and maybe matrix B also?
This is what I have tried for matrix A so far, but it's not much help really.
Any...
I'd like to show that, by minimizing this functional
$$\Omega[\hat \rho] = \text{Tr} \hat \rho \left[ \hat H - \mu \hat N + \frac 1 {\beta} \log \hat \rho \right]$$
I get the well known expression
$$\Omega[\hat \rho_0] = - \frac 1 {\beta} \log \text{Tr} e^{-\beta (\hat H - \mu \hat N )}$$
I'm...
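For what it's worth, the stationarity step I would expect goes like this (just a sketch, assuming the normalization ##\text{Tr}\,\hat\rho = 1## is enforced with a Lagrange multiplier ##\lambda##):
$$\delta\Omega = \text{Tr}\,\delta\hat\rho\left[\hat H - \mu\hat N + \frac{1}{\beta}\log\hat\rho + \frac{1}{\beta} - \lambda\right] = 0 \;\Rightarrow\; \hat\rho_0 = \frac{e^{-\beta(\hat H - \mu\hat N)}}{\text{Tr}\,e^{-\beta(\hat H - \mu\hat N)}},$$
and substituting ##\hat\rho_0## back into ##\Omega## collapses it to ##-\frac{1}{\beta}\log\text{Tr}\,e^{-\beta(\hat H-\mu\hat N)}##.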
If I have a matrix representing a 2nd-order tensor (2x2) and I want to convert this matrix from ##M^{ab}## to ##M^{a}{}_{b}##, what do I do? I'm given the matrix elements for the 2x2 tensor. When applying the metric tensor to this matrix I understand...
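In case a concrete example helps, here is a minimal numeric sketch of lowering the second index (Python; the diag(1, -1) metric is only a placeholder for whatever ##g_{ab}## you are actually given):

import numpy as np

g = np.diag([1.0, -1.0])           # placeholder metric g_ab
M_upup = np.array([[1.0, 2.0],
                   [3.0, 4.0]])    # components M^{ab}

# M^a_b = M^{ac} g_{cb}: contract the second upper index with the metric
M_updown = np.einsum('ac,cb->ab', M_upup, g)
print(M_updown)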
I have 2 Fisher matrices which represent information for the same variables (I mean the columns/rows are the same in the 2 matrices).
Now I would like to make the cross synthesis of these 2 matrices by applying for each parameter the well-known formula (coming from the Maximum Likelihood Estimator...
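In case it matches the formula you mean: for independent data sets the Fisher matrices simply add, and the marginalized 1-sigma errors come from the inverse of the combined matrix (Python sketch with placeholder 2x2 matrices; this assumes the two experiments are independent and use the same parameter ordering):

import numpy as np

F1 = np.array([[4.0, 1.0],
               [1.0, 3.0]])     # placeholder Fisher matrix, experiment 1
F2 = np.array([[2.0, 0.5],
               [0.5, 5.0]])     # placeholder Fisher matrix, experiment 2

F_tot = F1 + F2                 # information adds for independent data
cov = np.linalg.inv(F_tot)      # combined parameter covariance
print(np.sqrt(np.diag(cov)))    # marginalized 1-sigma error per parameter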
So let's say that we have some unitary matrix, ##S##. Let that unitary matrix be the scattering matrix in quantum mechanics, or the "S-matrix".
Now we all know that it can be defined in the following way:
$$\psi(x) = Ae^{ipx} + Be^{-ipx}, \quad x \ll 0$$ and $$\psi(x) = Ce^{ipx} + De^{-ipx}, \quad x \gg 0.$$
Now, A and...
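For reference, in the convention I usually see (and I'm assuming this is where the post was heading), ##S## maps the incoming amplitudes onto the outgoing ones:
$$\begin{pmatrix} B \\ C \end{pmatrix} = S \begin{pmatrix} A \\ D \end{pmatrix}, \qquad S = \begin{pmatrix} r & t' \\ t & r' \end{pmatrix},$$
with ##r, t## the reflection/transmission amplitudes for a wave incident from the left and ##r', t'## for one incident from the right; unitarity of ##S## then expresses probability conservation.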
In a certain anisotropic conductive material, the relationship between the current density ##\vec j## and
the electric field ##\vec E## is given by: ##\vec j = \sigma_0\vec E + \sigma_1\vec n(\vec n\cdot\vec E)## where ##\vec n## is a constant unit vector.
i) Calculate the angle between the...
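A small numeric check to compare an analytic answer against (Python sketch; the values of ##\sigma_0##, ##\sigma_1##, ##\vec n## and ##\vec E## are arbitrary samples):

import numpy as np

sigma0, sigma1 = 2.0, 1.5                  # sample conductivities
n = np.array([0.0, 0.0, 1.0])              # constant unit vector
E = np.array([1.0, 0.0, 1.0])              # sample field

j = sigma0 * E + sigma1 * n * np.dot(n, E)

cos_angle = np.dot(j, E) / (np.linalg.norm(j) * np.linalg.norm(E))
print(np.degrees(np.arccos(cos_angle)))    # angle between j and E in degrees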
The note I got from the teacher states that for a transition matrix, the columns are the current state and the rows are the future state (let this be matrix A), so the sum of each column must be equal to 1. But I read from another source that the rows are the current state and the columns...
Starting from the definition of a matrix exponential as a power series, how would we show that ##(e^A)^n=e^{nA}##?
I know how to show that if A and B commute then ##e^Ae^B = e^{A+B}## and from this we can show that the first identity is true for integer values of n, but how can we show it’s...
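Not a proof for non-integer ##n##, but a numerical check is at least reassuring (Python sketch using scipy; the small random matrix is a placeholder, and the "half power" is taken as the principal square root):

import numpy as np
from scipy.linalg import expm, sqrtm

rng = np.random.default_rng(1)
A = 0.1 * rng.normal(size=(3, 3))   # small random matrix (placeholder)

# Integer case: (e^A)^3 vs e^{3A}
print(np.allclose(np.linalg.matrix_power(expm(A), 3), expm(3 * A)))

# n = 1/2 via the principal matrix square root; for A this small the
# eigenvalues of e^A stay off the negative real axis, so it is well defined
print(np.allclose(sqrtm(expm(A)), expm(0.5 * A)))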
In rigid body rotation, we need only 3 parameters to put a body in any orientation. So to define a rotation matrix in 3D space we need only 3 parameters, and we must have 6 constraint equations (6 + 3 = 9, the number of elements in the rotation matrix).
My doubt is whether the orthogonality conditions...
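The counting I have in mind (just the standard argument): ##R^TR = I## is a symmetric matrix equation, so it only carries
$$\sum_k R_{ki}R_{kj} = \delta_{ij}, \qquad i \le j,$$
i.e. 3 diagonal plus 3 off-diagonal conditions, 6 in total, leaving ##9 - 6 = 3## free parameters.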
We are trying to find the complete solution to the matrix equation ##A\vec x = \vec b## where A is an m x n matrix and ##\vec b## can be anything except the zero vector. The entire solution is said to be:
##\vec x = \vec x_p + \vec x_n##
where ##\vec x_p## is the solution for a particular ##\vec...
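One quick way to see that structure numerically (Python sketch; the 2x3 matrix and right-hand side are placeholders): take any particular solution and add any combination of null-space vectors, and the result still solves ##A\vec x=\vec b##.

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 7.0]])             # placeholder 2x3 system
b = np.array([6.0, 14.0])

x_p = np.linalg.lstsq(A, b, rcond=None)[0]  # one particular solution
N = null_space(A)                           # basis for the null space of A

c = np.array([2.5])                         # arbitrary coefficients
x = x_p + N @ c
print(np.allclose(A @ x, b))                # still solves the system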
Any textbook on matrix theory (for example, the classical text «Theory of Matrices» by P. Lancaster) says that matrices form an algebra with the following obvious operations:
1) matrix addition;
2) multiplication by the underlying field elements;
3) matrix multiplication.
Is the last one...
Upon finding the eigenvalues, I set up the following equations for the eigenvectors.
So I took b as a free variable and solved the equations in the following way.
But I also realized that it would be possible to take a as a free variable, so I tried taking a as a free...
Hi everyone.
I am studying Chebyshev Polynomials to solve some differential equations. I found in the literature that if you have a function being expanded in Chebyshev polynomials such as
$$
u(x)=\sum_n a_n T_n(x),
$$
then you can also expand its derivatives as
$$
\frac{d^q u}{dx^q}=\sum_n...
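A cheap way to experiment with that relation is numpy's Chebyshev utilities (Python sketch; the test function and degree are placeholders):

import numpy as np
from numpy.polynomial import chebyshev as C

f = lambda x: np.exp(x) * np.sin(2 * x)
series = C.Chebyshev.interpolate(f, deg=30)   # u(x) = sum_n a_n T_n(x) on [-1, 1]

q = 2
deriv_coeffs = C.chebder(series.coef, m=q)    # coefficients of d^q u / dx^q

x = np.linspace(-1, 1, 5)
exact = np.exp(x) * (4 * np.cos(2 * x) - 3 * np.sin(2 * x))
print(np.allclose(C.chebval(x, deriv_coeffs), exact))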
I have trouble writing proofs for matrix problems. I would like to know how
A is invertible -> det(A) ≠ 0 -> the columns of A are linearly independent -> the columns of A span ##\mathbb{R}^n##
holds for a square matrix A. It would be great if you could show how each statement leads to the next, with examples! :)
Thanks for helping...
We need to find the normal modes of this system:
Well, this system is fairly easy to deal with when we write out the equations of motion and solve them directly... That's not what I want to do; I want to try the direct matrix methods.
We have springs with stiffnesses k1, k2, k3, k4 respectively, and blocks of mass m1, m2...
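In case it helps with the direct matrix route: once the mass matrix ##M## and stiffness matrix ##K## are written down, the normal modes come straight out of the generalized eigenproblem ##K\vec v = \omega^2 M\vec v## (Python sketch; the ##K## below is only a placeholder for a two-mass wall-k1-m1-k2-m2-k3-wall chain, not necessarily your k1-k4 layout):

import numpy as np
from scipy.linalg import eigh

m1, m2 = 1.0, 2.0                    # placeholder masses
k1, k2, k3 = 3.0, 1.5, 2.0           # placeholder stiffnesses

M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2, k2 + k3]])       # example wall-k1-m1-k2-m2-k3-wall chain

w2, modes = eigh(K, M)               # solves K v = w^2 M v
print(np.sqrt(w2))                   # normal-mode frequencies
print(modes)                         # columns are the mode shapes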
I've attempted to solve this by separating A into a diagonal matrix D and nilpotent matrix N:
D = {{1, 0}, {0, 0}}
N = {{0, 0}, {1, 0}}
e^(At) = e^((D + N)t) = e^(Dt) * e^(Nt)
When N is raised to the second power, it becomes the zero matrix. Therefore,
e^(Nt) = I + Nt = {{1, 0}, {t, 1}}
Note...
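One thing worth checking numerically, since the splitting ##e^{(D+N)t}=e^{Dt}e^{Nt}## is only guaranteed when ##D## and ##N## commute (Python sketch with the matrices above):

import numpy as np
from scipy.linalg import expm

D = np.array([[1.0, 0.0],
              [0.0, 0.0]])
N = np.array([[0.0, 0.0],
              [1.0, 0.0]])
t = 1.0

print(np.allclose(D @ N, N @ D))                                 # do D and N commute?
print(np.allclose(expm((D + N) * t), expm(D * t) @ expm(N * t)))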
Hi PF!
I'd like to make one matrix from a cell array. I've checked several suggestions, the most promising one here, but it did not work, giving me the unhelpful error
Brace indexing is not supported for variables of this type.
Error in cell2mat (line 42)
cellclass = class(c{1});
Error in Feven (line...
This is the question. The following is the solution I found:
I understand that the first line was derived by setting one vertex at the origin and taking the transpose of the matrix. However, I cannot understand where the extra row and column came from in the second line. Can anyone explain how...
question:
My first attempt:
my second attempt:
So I am getting 0 (the right answer) for the first method and 40 for the second method. According to the theorem, shouldn't the determinant of the matrix remain the same when a multiple of one row is added to another row? Can anyone explain...
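If it helps to see the theorem in action (Python sketch with a placeholder matrix): adding a multiple of one row to another row leaves the determinant unchanged, but replacing a row by a multiple of itself plus another row scales the determinant by that multiple, which is a common source of a wrong factor.

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])     # placeholder matrix

B = A.copy()
B[1] = B[1] + 5 * B[0]               # R2 -> R2 + 5 R1: determinant unchanged

C = A.copy()
C[1] = 5 * C[1] + C[0]               # R2 -> 5 R2 + R1: determinant scaled by 5

print(np.linalg.det(A), np.linalg.det(B), np.linalg.det(C))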
Hello everyone,
I found an interesting matrix which seems to always be invertible, but I have no idea how to prove it! So I am writing it down here in the hope of getting some ideas. Here is the problem:
Let us take $n\in \mathbb{N}^*$ bins and $d\in \mathbb{N}^*$ balls. Denote the set $B = \{\alpha^1, \ldots, \alpha^m\}$ to be all possible choices for putting $d$ balls into $n$ bins, such as
$$\alpha^1 = (d,0,\ldots, 0), ~ \alpha^2 = (0,d,\ldots, 0), \ldots$$
Let us define the matrix $V$ as...
The summary pretty much explains my question. I know that ##\left[ A, e^B \right]=0## if ##[A,B]=0## (and can prove it), but I can't figure out how to prove if it is or is not an "if and only if" statement.
Thanks in advance!
Hi PF!
I am trying to multiply each component of B by the matrix A and then solve A\C. See the code below.
A = rand(4);
B = rand(5,1);
C = rand(4,1);
for i = 1:5
sol(:,i) = (B(i)*A)\C
end
But there has to be a way to do this without a for-loop, right? I'd really appreciate any help you have!
Hi PF!
Each element of an ##n\times m## matrix is complex valued. In the following code, I call this "domain". There is also an ##n\times m## matrix that is real valued; below I call this "f". I'd like to plot a 3D image where the ##x-y## plane is the complex plane given by the coordinates...
Hey! 😊
Let $1\leq m,n\in \mathbb{N}$ and let $\mathbb{K}$ be a field.
For $a\in M_m(\mathbb{K})$ we consider the map $\mu_a$ that is defined by \begin{equation*}\mu_a: \mathbb{K}^{m\times n}\rightarrow \mathbb{K}^{m\times n}, \ c\mapsto ac\end{equation*}
I have shown that $\mu_a$ is a linear...
Let me start with the rotated vector components : ##x'_i = R_{ij} x_j##. The length of the rotated vector squared : ##x'_i x'_i = R_{ij} x_j R_{ik} x_k##. For this (squared) length to be invariant, we must have ##R_{ij} x_j R_{ik} x_k = R_{ij} R_{ik} x_j x_k = x_l x_l##.
If the rotation matrix...
Hi, I think this is a nitpicking question, but oh well let me hear your inputs.
Actually I tried to solve this question straightforwardly, by Taylor expanding the exponential and showing that:
$$\textbf{A}^n = \begin{pmatrix} a^n & nba^{n-1} \\ 0 & a^n \end{pmatrix}$$
i.e.
##e^{\textbf{A}t} =##...
In graphene systems, the velocity operator is sometimes ##v = \partial H/(\hbar\,\partial p)##, and its matrix element is calculated as ##\langle\psi|v|\psi\rangle##, i.e., ##v_x = v_F \cos(\theta)## and ##v_y = v_F \sin(\theta)## [the results are the same as Eq. 25] for the intraband velocity. Recently, I saw a new way to calculate the velocity matrix (Mikhailov...
Hello
Suppose I have a matrix M that is purely diagonal with NO zeros on the diagonal (n by n, square).
Suppose I have another matrix that contains coordinate information; call it A.
This one is NOT a square matrix, but n by m (where, in general, m < n).
I form this: ##Q = A^T M A##...
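A minimal numeric sketch of that construction (Python; the sizes and entries are placeholders), mostly to make the dimensions explicit: with ##A## of size ##n\times m## and ##M## diagonal ##n\times n##, the product ##Q = A^T M A## comes out ##m\times m## and symmetric.

import numpy as np

n, m = 5, 3
rng = np.random.default_rng(2)

M = np.diag(rng.uniform(1.0, 4.0, size=n))   # diagonal, no zeros on the diagonal
A = rng.normal(size=(n, m))                  # "coordinate" matrix, n x m

Q = A.T @ M @ A
print(Q.shape)                 # (m, m)
print(np.allclose(Q, Q.T))     # symmetric, since M is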
I'm trying to 'see' what the generators of the Poincaré group are. From what I understand, it has 10 generators: 6 are the Lorentz generators for rotations/boosts, and 4 correspond to translations in ##\mathbb{R}^{1,3}##, since the Poincaré group is ##\mathbb{R}^{1,3} \rtimes SO(1,3)##.
The 6 Lorentz generators are easy enough to find in...
I have an equation that comes from a specific topic in cam mechanisms, and it goes like this:
$$
2M[tan(B)-B] - \beta Ntan(B) - 2\pi\sqrt{1 - N^2} = 0 \ \ \ \ \ \ \ \ \ (1)
$$
For this it doesn't matter what each variable means.
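Since (1) is transcendental in ##B##, the usual route is a numerical root find. A hedged Python sketch (the values of ##M##, ##N## and ##\beta## are made up, and the bracket for ##B## has to come from the geometry of the actual problem):

import numpy as np
from scipy.optimize import brentq

M, N, beta = 1.0, 0.5, 2.0        # placeholder parameters

def f(B):
    return 2 * M * (np.tan(B) - B) - beta * N * np.tan(B) - 2 * np.pi * np.sqrt(1 - N**2)

B_root = brentq(f, 0.1, 1.5)      # bracket chosen inside (0, pi/2)
print(B_root, f(B_root))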
I'm trying to create a 3x3 matrix with a determinant equal to...
After numerically solving the time-dependent 1D Dirac equation for a free particle, I get two matrices. From both of them I extract:
1) the probability matrix P = psi1*psi1 + psi2*psi2
2) the current matrix J = np.conj(psi1)*psi2 + np.conj(psi2)*psi1
I think that current is related to electricity, and...
Hello everyone,
I have a math / physics question that has been with me for a while. I would be grateful if someone could help me.
Given a density matrix, what is the minimum value a sum of some of its off-diagonal elements can assume (or the most negative value)?
Remark: if one collects an...
I derived the quadratic form of the Dirac equation as follows
$$\lbrace[i\not \partial-e\not A]^2-m^2\rbrace\psi=\lbrace\left( i\partial-e A\right)^2 + \frac{1}{2i} \sigma^{\mu\nu}F_{\mu \nu}-m^2\rbrace\psi=0$$
And I need to find the form of the spin-dependent term to get the final expression
$$g...
I was solving a problem for my quantum mechanics homework and was browsing the internet for further information. Then I stumbled upon this:
R is the rotation operator, δφ an infinitesimal angle and Ψ is the wave function.
I know that it is able to rotate a curve, vector...
Hi PF!
I am trying to compute a matrix of integrals. Think of it as something like this:
Table[Integrate[x^(i*j), {x, 0, 1}], {i, 0, 5}, {j, 0, 5}]
I have 16 cores, and would like to have each core handle a specified amount of integrals. Anyone know how to do this?
Thanks so much!
Hi
I need to solve an equation of the form $$\dot{X}(t) = FX(t) + X(t)F^T + B$$
All of these are matrices. I have an initial condition ##X(0)=X_0##.
However, I have no idea how to proceed. How can I make any progress?
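In case a numerical or closed-form route is acceptable: the equation is linear in ##X##, so one option is the closed form ##X(t)=e^{Ft}X_0e^{F^Tt}+\int_0^t e^{Fs}Be^{F^Ts}\,ds## (the differential Lyapunov equation), and another is to flatten ##X## and hand it to an ODE integrator. A Python sketch of the latter (all matrices below are placeholders):

import numpy as np
from scipy.integrate import solve_ivp

F = np.array([[0.0, 1.0],
              [-2.0, -0.3]])       # placeholder system matrix
B = np.eye(2)                      # placeholder constant term
X0 = np.diag([1.0, 0.5])           # initial condition X(0)

def rhs(t, x_flat):
    X = x_flat.reshape(2, 2)
    return (F @ X + X @ F.T + B).ravel()

sol = solve_ivp(rhs, (0.0, 5.0), X0.ravel())
print(sol.y[:, -1].reshape(2, 2))  # X at the final time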