#1
luke34
Ok here's the problem:
A batter hits a ball, which leaves the bat 1 meter above the ground at an angle of 65 degrees with an initial velocity of 30 m/s. How far from home plate will the ball land if not caught, ignoring air resistance?
So I started by splitting the velocity into its x and y components.
Vx = V cos 65
Vy = V sin 65
which gave me Vx = 12.68 m/s and Vy = 27.19 m/s.
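Just to double-check those component numbers, here's a quick Python sketch (nothing fancy, just the same trig):

    import math

    v = 30.0                    # initial speed, m/s
    theta = math.radians(65)    # launch angle in radians

    vx = v * math.cos(theta)    # horizontal component
    vy = v * math.sin(theta)    # vertical component
    print(vx, vy)               # prints ~12.68 and ~27.19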
Then, using the equation x = x_0 + v_0x*t + 0.5*a_x*t^2,
I plugged in my numbers to get 0 = 1 + 12.68t + 0.5(9.8)t^2.
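In case it helps to see where I am, here's a quick Python sketch that just applies the quadratic formula to that equation exactly as I typed it; both roots come out negative, so I'm guessing I've set something up wrong:

    import math

    # coefficients from my equation as typed: 0 = 1 + 12.68t + 0.5(9.8)t^2
    a = 0.5 * 9.8
    b = 12.68
    c = 1.0

    disc = b**2 - 4*a*c
    t1 = (-b + math.sqrt(disc)) / (2*a)
    t2 = (-b - math.sqrt(disc)) / (2*a)
    print(t1, t2)               # roughly -0.08 and -2.51, both negative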
This is where I'm stuck. Am I on the right track or what? Help please. Thanks.