Mantikore
I know that to find the time it takes to travel a distance, you divide the distance by the speed, but what if the speed is increasing?
Say I want to travel 5 miles, starting at 0 mph, and by the time I reach the 5-mile mark I'm traveling at 40 mph, with a steady increase in between. That means at 1 mile I'd be traveling at 8 mph, at 2.5 miles I'd be at 20 mph, and at 4 miles I'd be at 32 mph.
Is there an equation that could tell me how long it would take to travel that distance with a steadily increasing speed?
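As a sketch of the two common readings of "steady increase": if the speed increases steadily with *time* (constant acceleration), the time is just distance divided by the average speed. If it increases steadily with *distance*, exactly as the mile markers above describe, the time comes from integrating dx / v(x), and that integral blows up at x = 0, so the sketch below starts a tiny step past zero (the starting offset `x0` is an assumption, not part of the question).

```python
import math

d = 5.0      # total distance, miles
v_f = 40.0   # final speed, mph

# Interpretation 1: speed increases steadily with TIME (constant acceleration).
# Average speed is (0 + v_f) / 2, so time = distance / average speed.
t_const_accel = d / ((0 + v_f) / 2)   # hours -> 0.25 h = 15 minutes

# Interpretation 2: speed increases steadily with DISTANCE, v(x) = (v_f / d) * x,
# matching the mile markers in the question (8 mph at mile 1, 20 mph at 2.5, ...).
# Time = integral of dx / v(x) = (1/k) * ln(d / x0), which grows without bound
# as x0 -> 0: starting from an exact standstill, you never get moving at all.
k = v_f / d           # 8 mph gained per mile
x0 = 1e-6             # small starting offset in miles (assumed, not in the question)
t_linear_in_x = math.log(d / x0) / k   # hours; roughly 1.93 h for this x0
```

Under the constant-acceleration reading the trip takes 15 minutes; under the literal speed-proportional-to-distance reading there is no finite answer from a standing start.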