metalmagik
A major league pitcher can throw a baseball in excess of 42.0 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher, who is 15.1 m away from the point of release?
I worked this out as an x and y problem and got 2.57 for the distance the ball dropped (d on the y side of the problem).
I first found the time from the x side, then carried that over to the y side, which had Vi = 0 and a = -9.81 m/s^2.
Did I do anything wrong? Any help would be great, thanks so much.
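For anyone who wants to check the arithmetic, here is a minimal Python sketch of the x-then-y method described above. It only uses the numbers given in the post; the variable names and the choice of g = 9.81 m/s^2 are my assumptions, not part of the original.

```python
# Numeric check of the x-then-y approach described in the post
# (assumed values: 42.0 m/s horizontal speed, 15.1 m to the catcher, g = 9.81 m/s^2).

v_x = 42.0   # horizontal speed of the pitch, m/s
x = 15.1     # horizontal distance to the catcher, m
g = 9.81     # magnitude of gravitational acceleration, m/s^2

# X side: horizontal velocity is constant, so the time of flight is distance / speed.
t = x / v_x

# Y side: initial vertical velocity is zero, so the drop is (1/2) * g * t^2.
drop = 0.5 * g * t**2

print(f"time of flight: {t:.3f} s")
print(f"vertical drop:  {drop:.3f} m")
```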