Researcher X
If you jump, throw a ball, or fire a catapult, the projectile's acceleration surely has a maximum possible time period in which to take place? If I want to know the acceleration at all, I need to determine the length of this time period between velocity 0 and the take-off velocity.
For example, if I throw a ball at 10 m/s and the ball accelerates over the 0.8 m length of my arm before I let go, I know intuitively that this time period couldn't ever be as long as, say, 100 seconds. So even though acceleration can vary independently of speed, there must be a maximum possible time period for the acceleration (and therefore a minimum acceleration) based on the distance through which the projectile accelerates and its take-off velocity.
How do you solve for this, or am I mistaken in this assumption?
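To make this concrete, here is the constant-acceleration case worked out (uniform acceleration is just an assumption on my part; a real arm or catapult won't push uniformly):

```latex
% Constant-acceleration assumption: start from rest, reach speed v
% over distance d with uniform acceleration a.
\[
  d = \tfrac{1}{2} a t^{2}, \qquad v = a t
  \quad\Longrightarrow\quad
  t = \frac{2d}{v}, \qquad a = \frac{v^{2}}{2d}.
\]
% Plugging in v = 10 m/s and d = 0.8 m:
\[
  t = \frac{2 \times 0.8}{10} = 0.16\ \mathrm{s}, \qquad
  a = \frac{10^{2}}{2 \times 0.8} = 62.5\ \mathrm{m/s^{2}}.
\]
```

So if the push were uniform, the acceleration phase would last 0.16 s. What I don't know is whether the uniform case really gives the longest possible time, which is what my question comes down to.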