Spring static equilibrium Problem

In summary: Now, substitute that into the second equation to get an equation involving only ##y_2##. What do you get?
$$m_1\ddot y_1+m_2\ddot y_2=-k_1 y_1$$
or
$$m_1\ddot y_1+m_2\ddot y_2=-k_1 \left(\frac{m_2}{k_2}\ddot y_2+y_2\right)$$
  • #36
Today in lecture our teacher solved this problem using the matrix approach, as Orodruin pointed out. He wrote the matrix form, then found the eigenvalues and eigenvectors. And then, well, he kind of stopped there.
 
  • #37
Then maybe you should try to take it from there. Do you know how to diagonalize a matrix?
 
  • #38
timetraveller123 said:
Then maybe you should try to take it from there. Do you know how to diagonalize a matrix?
Once he has the eigenvalues and eigenvectors, there is no need to diagonalize anything. All he needs to do is use these to determine the coefficients required to satisfy the initial conditions.
 
  • #39
Chestermiller said:
Once he has the eigenvalues and eigenvectors, there is no need to diagonalize anything. All he needs to do is use these to determine the coefficients required to satisfy the initial conditions.
Well, the process of finding the eigenvalues and eigenvectors essentially gives you the diagonalisation as well ...
 
  • #40
Wait, even after obtaining eigenvectors and eigenvalues, you still have to change basis via
##
y' = P^{-1} y
##
and construct a diagonal matrix filled with the eigenvalues, right? At least that is what I know. Then, after solving that, revert back to the original basis.
 
  • #41
timetraveller123 said:
Wait, even after obtaining eigenvectors and eigenvalues, you still have to change basis via
##
y' = P^{-1} y
##
and construct a diagonal matrix filled with the eigenvalues, right? At least that is what I know.
You don't really have to do it. You just note that with a complete set of eigenvectors ##v_i##, you can expand the solution in terms of them, i.e.,
$$
y(t) = \sum_i \alpha_i(t) v_i.
$$
Now, inserting into the differential equation would give
$$
\ddot y = \sum_i \ddot{\alpha}_i(t) v_i = K \sum_i \alpha_i(t) v_i = \sum_i \lambda_i \alpha_i(t) v_i,
$$
where ##\lambda_i## are the eigenvalues. Since the ##v_i## are linearly independent, the coefficients in front of ##v_i## on either side of the equation must be the same and therefore
$$
\ddot \alpha_i = \lambda_i \alpha_i.
$$
Of course, this is the same thing as you will get if you do the diagonalisation explicitly.
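The expansion argument above can be checked numerically. A minimal NumPy sketch, using the stiffness matrix that appears later in the thread: it verifies ##Kv_i = \lambda_i v_i## and that ##K## acting on any combination ##\sum_i \alpha_i v_i## gives ##\sum_i \lambda_i \alpha_i v_i##, which is exactly why the equations for the ##\alpha_i## decouple.

```python
import numpy as np

# Stiffness matrix from the problem discussed later in this thread.
K = np.array([[-10.0, 4.0],
              [4.0, -4.0]])

# Eigenvalues and (column) eigenvectors of K.
lam, V = np.linalg.eig(K)

# Check K v_i = lambda_i v_i for each eigenvector.
for i in range(2):
    assert np.allclose(K @ V[:, i], lam[i] * V[:, i])

# For arbitrary coefficients alpha_i:
#   K (sum_i alpha_i v_i) = sum_i lambda_i alpha_i v_i,
# so inserting the expansion into  y'' = K y  decouples it into  alpha_i'' = lambda_i alpha_i.
alpha = np.array([0.3, -1.7])   # arbitrary expansion coefficients
lhs = K @ (V @ alpha)           # K acting on sum_i alpha_i v_i
rhs = V @ (lam * alpha)         # sum_i lambda_i alpha_i v_i
assert np.allclose(lhs, rhs)
print(sorted(lam))              # eigenvalues, approximately -12 and -2
```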
 
  • #42
oh wow that's actually rather neat i never learned it that way
 
  • #43
I find something like

##\begin{pmatrix}
\ddot y_1 \\
\ddot y_2 \\
\end{pmatrix} =
\begin{pmatrix}
-10 & 4 \\
4 & -4 \\
\end{pmatrix}
\begin{pmatrix}
y_1 \\
y_2 \\
\end{pmatrix}##

I find the values ##λ_1=-12## and ##λ_2=-2##

Correct?
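These values can be checked with a short NumPy snippet (NumPy assumed available). Note that `numpy.linalg.eig` returns eigenvectors only up to scale and sign, so the check compares directions rather than raw components.

```python
import numpy as np

# Coefficient matrix from the post above.
A = np.array([[-10.0, 4.0],
              [4.0, -4.0]])

lam, V = np.linalg.eig(A)
order = np.argsort(lam)          # sort so lambda_1 = -12 comes first
lam, V = lam[order], V[:, order]
print(lam)                       # approximately [-12.  -2.]

# Eigenvectors are only defined up to scale; compare directions
# against (2, -1) and (1, 2).
assert np.allclose(V[:, 0] / V[0, 0], np.array([2.0, -1.0]) / 2.0)
assert np.allclose(V[:, 1] / V[0, 1], np.array([1.0, 2.0]) / 1.0)
```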
 
  • #44
Were you given the values?
The eigenvalues seem correct.
 
  • #45
timetraveller123 said:
Were you given the values?
The eigenvalues seem correct.
Thanks :)
 
  • #46
Orodruin said:
You don't really have to do it. You just note that with a complete set of eigenvectors ##v_i##, you can expand the solution in terms of them, i.e.,
$$
y(t) = \sum_i \alpha_i(t) v_i.
$$
Now, inserting into the differential equation would give
$$
\ddot y = \sum_i \ddot{\alpha}_i(t) v_i = K \sum_i \alpha_i(t) v_i = \sum_i \lambda_i \alpha_i(t) v_i,
$$
where ##\lambda_i## are the eigenvalues. Since the ##v_i## are linearly independent, the coefficients in front of ##v_i## on either side of the equation must be the same and therefore
$$
\ddot \alpha_i = \lambda_i \alpha_i.
$$
Of course, this is the same thing as you will get if you do the diagonalisation explicitly.

So ##\ddot y_1 = -12y_1## and ##\ddot y_2 = -2y_2##, or which eigenvalue corresponds to which?
 
  • #47
Arman777 said:
So ##\ddot y_1 = -12y_1## and ##\ddot y_2 = -2y_2## or which eigenvalue corresponds to which ?
No, you need to use the eigenvectors. The equations where the differential equations are not coupled are the ones for the ##\alpha##s, not for the ##y##s.
 
  • #48
For ##λ_1=-12## I find eigenvector
##\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}##

and for ##λ_2=-2## I find
## \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##

so ##\ddot y_1=
-12
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}##

##\ddot y_2=-2
\begin{pmatrix}
1 \\
2 \\
\end{pmatrix}## ?
 
  • #49
No. You need to write your differential equations in the form
$$
\ddot Y = \begin{pmatrix}
\ddot y_1 \\ \ddot y_2
\end{pmatrix}
=
\ddot \alpha_1 v_1 + \ddot \alpha_2 v_2
= \lambda_1 \alpha_1 v_1 + \lambda_2 \alpha_2 v_2.
$$
This will give you differential equations for the ##\alpha##s, not for the ##y##s.
 
  • #50
What is ##α##?

I am so confused right now. The ##v## are the eigenvectors, okay, and ##λ## is the eigenvalue.

It's so sad that our teacher never solved a problem like this, not even once. And I guess my algebra sucks.
 
  • #51
The ##\alpha## are the expansion coefficients that tell you how much of each eigenvector there is in the solution. Generally those coefficients will be time dependent. The idea is that any vector
$$
Y =
\begin{pmatrix}
y_1 \\ y_2
\end{pmatrix}
$$
can be written as a linear combination of the eigenvectors
$$
Y = \alpha_1(t) v_1 + \alpha_2(t) v_2
$$
where the expansion coefficients generally depend on time. Inserting this into the differential equation gives you separated differential equations for the ##\alpha##, i.e., the differential equation for ##\alpha_1## does not depend on ##\alpha_2## and vice versa.
 
  • #52
So we have
For

##\begin{pmatrix}
\ddot y_1 \\
\ddot y_2 \\
\end{pmatrix}=-12α_1
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}
-2α_2 \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##
 
  • #53
You need to insert the expression for Y in terms of the alphas on the left side as well.
 
  • #54
Orodruin said:
You need to insert the expression for Y in terms of the alphas on the left side as well.
Could you write it, please, so I can learn it? I don't get it this way. I need to proceed. This is painful.
 
  • #55
##\begin{pmatrix}
y_1 \\
y_2 \\
\end{pmatrix}=C_1e^{-12t}
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}+
C_2e^{-2t} \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##
 
  • #56
That would be the result if you had a first-order derivative, and not a second-order one, in your differential equation.
 
  • #57
Orodruin said:
You need to insert the expression for Y in terms of the alphas on the left side as well.
You mean it will be ##\ddot α_1 v_1## etc.?
 
  • #58
I think what @Orodruin means is that the alphas you got in post #55 would be the solution of
##
\dot \alpha_i = \lambda_i \alpha_i
##
instead of
##
\ddot \alpha_i = \lambda_i \alpha_i
##
You wouldn't get a real exponential solution from the second one, since the eigenvalues here are negative.
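For completeness, since both eigenvalues are negative, each decoupled equation ##\ddot\alpha_i = \lambda_i \alpha_i## describes oscillation with angular frequency ##\omega_i = \sqrt{-\lambda_i}##. A sketch of the resulting general solution, with constants ##A_i, B_i## to be fixed by the initial conditions:
$$
\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}
= \left(A_1 \cos\sqrt{12}\,t + B_1 \sin\sqrt{12}\,t\right)\begin{pmatrix}2\\-1\end{pmatrix}
+ \left(A_2 \cos\sqrt{2}\,t + B_2 \sin\sqrt{2}\,t\right)\begin{pmatrix}1\\2\end{pmatrix}
$$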
 