res3210
Hello everyone,
I am not completely familiar with the way small particles behave, but I assume that if one applies an acceleration to a particle (such as an electron), then that particle will accelerate. So here is my hypothetical: suppose we apply a constant acceleration of 1 m/s^2 to an electron. After about 300,000,000 seconds it should be traveling at the speed of light, since c is roughly 3 x 10^8 m/s. However, we know no physical particle can reach c, which would mean the "constant" acceleration would in fact have to decrease over time. Is this the case? And if so, what would happen if we applied an increasing acceleration to the particle? Also, would this mean that the acceleration never actually reaches zero, and that we are looking at a velocity that only reaches c in the limit as t goes to infinity?
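To make the numbers concrete, here is a minimal sketch I put together, assuming the standard special-relativity result for constant proper acceleration a (the acceleration the electron itself feels), v(t) = at / sqrt(1 + (at/c)^2). If I have that formula right, it compares my naive Newtonian calculation with what relativity actually predicts:

```python
# Minimal sketch (my own, assuming the standard SR formula for
# constant proper acceleration): compare the naive Newtonian
# velocity v = a*t with v = a*t / sqrt(1 + (a*t/c)^2), which
# approaches c but never reaches it.
import math

c = 299_792_458.0   # speed of light, m/s
a = 1.0             # proper acceleration, m/s^2

for t in [3e7, 3e8, 3e9, 3e12]:   # elapsed time in seconds
    v_newton = a * t
    v_rel = a * t / math.sqrt(1.0 + (a * t / c) ** 2)
    print(f"t = {t:.1e} s: Newton v/c = {v_newton / c:.4f}, "
          f"relativistic v/c = {v_rel / c:.6f}")
```

If that formula is correct, then at t = 3 x 10^8 s the naive answer already exceeds c, while the relativistic velocity is only about 0.71c, and v/c creeps toward 1 without ever getting there, which seems to match my "t equals infinity" suspicion.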