Problem: You are playing soccer with some friends while some people nearby are playing hide and seek. When the seeker gets to '4', you kick the soccer ball high in the air. When the seeker says '5', the ball is 10 ft in the air. When the seeker gets to '7', the ball is 20 ft in the air. How many seconds will the ball be in the air in total before it hits the ground?
Given that after 1 second the ball had traveled 10 ft in the air, I took the initial velocity to be 10 ft/s. At the apex, the velocity will be 0 ft/s. Since the change in position is not given, I used the equation Vf = Vi + at. My thought was to find the time to the apex this way and then multiply by two. Using a = -32.2 ft/s², I got t = 5/8 s in total, which doesn't make any sense. I acknowledge that I left out the part about it being 20 ft in the air after 2 seconds, but I thought that would be erroneous considering we know the initial velocity and the value of a. What am I doing wrong?
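For concreteness, here are the steps I took written out, using the Vi = 10 ft/s and a = -32.2 ft/s² values above:

```latex
% Solve Vf = Vi + a*t for t at the apex (Vf = 0), then double it
% for the full up-and-down flight.
\[
t_{\text{up}} = \frac{V_f - V_i}{a}
             = \frac{0 - 10\ \text{ft/s}}{-32.2\ \text{ft/s}^2}
             \approx 0.31\ \text{s},
\qquad
t_{\text{total}} = 2\,t_{\text{up}} \approx 0.62\ \text{s} \approx \tfrac{5}{8}\ \text{s}.
\]
```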