- Homework Statement
- A car is driving a distance ##d = 15\,\mathrm{m}## behind a truck. Both vehicles drive with constant velocity. The velocity of the car is ##v_1 = 30.6\,\frac{m}{s}##, the velocity of the truck is ##v_2 = 25\,\frac{m}{s}##. Suddenly the truck starts to slow down with a deceleration of magnitude ##a = 5\,\frac{m}{s^2}##.
1. Calculate the time necessary so that the car and the truck do not crash, given that the car driver starts slowing down ##0.5\,\mathrm{s}## later than the truck and that both vehicles decelerate with ##a = 5\,\frac{m}{s^2}##.
2. Calculate the time of the crash.
- Relevant Equations
- Basic kinematic equations
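(Presumably the constant-acceleration relations, with ##a## taken as the magnitude of the braking deceleration:)
$$v = v_0 - at, \qquad s = v_0 t - \tfrac{1}{2}at^2, \qquad v^2 = v_0^2 - 2as$$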
1. I'm trying to calculate the time for which the crash does not happen (if that is even possible; I don't know the official solution, so I assume the crash is preventable).
By the time ##t## the decelerating truck covers the distance ##s_2 = \frac{v_2^2}{2a}##. In the same time the car first drives with the constant velocity ##v_1## for a time ##t_0## ##\implies s_{11} = v_1\cdot t_0##, and then decelerates with ##a## ##\implies s_{12} = \frac{v_1^2}{2a}##. Therefore I assume the condition for no accident is ##|s_{11}| + |s_{12}| < d + |s_2| \implies |v_1\cdot t_0| + |-\frac{1}{2}a(t-t_0)^2| < d + |-\frac{1}{2}at^2|##. After putting in the known values I get ##t > 0.362\,\mathrm{s}##. After a little reflection this doesn't make sense to me: shouldn't the time be bigger?
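(As a numerical cross-check of this condition, here is a minimal Python sketch of the setup above; the helper `pos` and the step size `dt` are my own arbitrary choices. It steps both positions forward in time, with the truck braking from ##t = 0## and the car braking from ##t_0 = 0.5\,\mathrm{s}##, and reports the first instant the gap closes:)

```python
# Numerical sanity check: car starts d = 15 m behind the truck,
# truck brakes from t = 0, car brakes from t0 = 0.5 s,
# both decelerate at a = 5 m/s^2 until they stop.
d, v1, v2, a, t0 = 15.0, 30.6, 25.0, 5.0, 0.5
dt = 1e-4  # time step [s]

def pos(v, a, t_brake, t):
    """Position of a vehicle cruising at speed v until t_brake,
    then decelerating at a until it stops."""
    if t < t_brake:
        return v * t
    tau = min(t - t_brake, v / a)  # braking lasts at most v/a seconds
    return v * t_brake + v * tau - 0.5 * a * tau * tau

t = 0.0
while t < 10.0:
    gap = (d + pos(v2, a, 0.0, t)) - pos(v1, a, t0, t)
    if gap <= 0.0:
        print(f"gap closes (crash) at t = {t:.3f} s")
        break
    t += dt
else:
    print("no crash within 10 s")
```

(Stepping in time sidesteps having to track which braking phase each vehicle is in; a closed-form solution should give the same crossing time.)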
2. When does the crash happen?
I assumed the distance driven by the car equals the distance ##d## plus the distance driven by the truck:
##|s_{11}|+ |s_{12}| = d + |s_2|## (so both vehicles are at the same position)
After plugging the values in I get ##t = 0.362\,\mathrm{s}##.
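(For reference, the two positions written as explicit functions of a single clock time ##t##, measured from the moment the truck starts braking, would be
$$x_\text{truck}(t) = d + v_2 t - \tfrac{1}{2}at^2 \quad \text{for } 0 \le t \le \tfrac{v_2}{a},$$
$$x_\text{car}(t) = \begin{cases} v_1 t, & 0 \le t \le t_0, \\ v_1 t - \tfrac{1}{2}a(t - t_0)^2, & t_0 \le t \le t_0 + \tfrac{v_1}{a}, \end{cases}$$
with the crash at the first ##t## where ##x_\text{car}(t) = x_\text{truck}(t)##; note that ##\frac{v^2}{2a}## is the distance covered only once a vehicle has fully stopped, not the distance at an arbitrary ##t##.)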
I feel like I missed something here.
I will be grateful for any help.