So, assume a system of two solid, spherical masses in a vacuum. Sphere A has a mass of 10 kg, and Sphere B has negligible mass in comparison. The centers of the two spheres start 10 meters apart. For simplicity, say that in this universe the gravitational constant is $G = 1\ \mathrm{N \cdot m^2/kg^2}$. How would you determine the distance, $x$, between the centers of the two spheres at time $t$?
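To be concrete: since Sphere B's mass is negligible, Sphere A can be treated as fixed, so (if I've set this up right) the motion I'm after is governed by

```latex
% Equation of motion for the separation x(t), with Sphere A held fixed
\ddot{x} = -\frac{Gm}{x^{2}},
\qquad x(0) = 10\ \mathrm{m},
\qquad \dot{x}(0) = 0 .
```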
I have been trying to solve this problem using many methods, but have been unsuccessful.
I have been using the gravitational acceleration equation: $g = \dfrac{Gm}{r^2}$,
as well as the kinematic equations obtained by integrating a constant acceleration: $a = g$, $v = gt + v_0$, $x = \frac{1}{2}gt^2 + v_0 t + x_0$.
I tried approximating the motion in 1-second steps, recomputing the acceleration at each step, which gives $x(0) = 10$, $x(1) = 9.95$, $x(2) = 9.8$, $x(3) = 9.547$, etc.
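For reference, here is a short Python sketch of that stepping scheme (my reconstruction of the by-hand procedure, so the last digits differ slightly from the rounded values above; the variable names are just for illustration):

```python
# Step the separation forward in 1-second intervals, recomputing the
# acceleration from the current separation and holding it constant
# within each step.
G = 1.0    # gravitational constant in this toy universe, N*m^2/kg^2
m = 10.0   # mass of Sphere A, kg
x = 10.0   # current separation of the centers, m
v = 0.0    # current closing speed of Sphere B, m/s
dt = 1.0   # step size, s

for t in range(4):
    print(f"x({t}) = {x:.3f}")
    a = G * m / x**2               # g at the current separation
    x -= v * dt + 0.5 * a * dt**2  # constant-a kinematics over one step
    v += a * dt                    # update the closing speed
```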
However, I was unable to reconcile this pattern with an integral taken in the limit of infinitesimally short time steps.
How do I solve this problem?
(It's not for homework, just curiosity.)