spaceshipone
A car has a maximum constant acceleration of 10 ft/s^2 and a maximum constant deceleration of 15 ft/s^2. Determine the minimum amount of time it would take to drive one mile (5280 ft), assuming the car starts and ends at rest and never exceeds the speed limit of 55 mi/hr.
Why do you need to know the deceleration constant if we are talking about the car accelerating and going to 55 mi/hr?
I would assume you just plug it into the constant acceleration formula and that is it.
I was also thinking maybe it has to accelerate at a constant rate first and then decelerate to slow down from 55 mph, in which case I have no idea how to solve the problem.
Anyone want to take a stab at this?
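One way to see why the deceleration matters: the car has to end the mile at rest, so the fastest trip has three phases: accelerate at 10 ft/s^2 up to the 55 mi/hr limit, cruise at the limit, then brake at 15 ft/s^2 back to rest. Here is a minimal Python sketch of that calculation (my own illustration, using only the constant-acceleration relations v = a*t and d = v^2/(2a), and assuming the mile is long enough for the car to actually reach the limit):

```python
# Three-phase minimum-time trip: accelerate, cruise at the limit, brake to rest.
A_ACC = 10.0               # max acceleration, ft/s^2
A_DEC = 15.0               # max deceleration, ft/s^2
V_MAX = 55 * 5280 / 3600   # 55 mi/hr converted to ft/s (~80.67 ft/s)
D_TOTAL = 5280.0           # one mile, in ft

# Phase 1: rest -> V_MAX, using v = a*t and d = v^2 / (2a)
t1 = V_MAX / A_ACC
d1 = V_MAX**2 / (2 * A_ACC)

# Phase 3: V_MAX -> rest, same relations with the braking rate
t3 = V_MAX / A_DEC
d3 = V_MAX**2 / (2 * A_DEC)

# Phase 2: cruise at V_MAX over whatever distance remains
d2 = D_TOTAL - d1 - d3
t2 = d2 / V_MAX

print(f"accelerate: {t1:.2f} s over {d1:.1f} ft")
print(f"cruise:     {t2:.2f} s over {d2:.1f} ft")
print(f"brake:      {t3:.2f} s over {d3:.1f} ft")
print(f"total:      {t1 + t2 + t3:.2f} s")
```

If I've set it up right, the cruise phase dominates: roughly 8.1 s accelerating over about 325 ft, 58.7 s cruising, and 5.4 s braking over about 217 ft, for a total of about 72.2 s. So the deceleration constant enters because the braking distance v^2/(2 * 15) eats into the mile and the car must be back at rest when it gets there.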