- #1
jcott
So here's a question I'm struggling with:
The rotation speed of the Sun around the Milky Way center is 220 km/s, and it takes the Sun around 200 million years to orbit once around the center of the galaxy.
Given that the rotation curve is relatively flat (i.e. the rotation speed stays the same as a function of distance), how long does it take a star that is twice as far from the Milky Way center as the Sun to make one full orbit?
I've tried many things and can't seem to find an answer that fits. I used the formula for orbital period (T): T = 2π√(r^3/GM), where r is the distance of the star from the center of the Milky Way (supposedly a supermassive black hole), G is the gravitational constant (6.67*10^-11 m^3/(kg s^2)), and M is the mass of the supermassive black hole, Sagittarius A* (8.2*10^36 kg), around which all stars in our galaxy rotate.
If r is 56,000 ly (twice the distance from our Sun to the center of the galaxy, which is 28,000 ly), i.e. 5.298*10^20 m, this all works out to 3.27*10^18 seconds. That is about 1.04*10^11 years... which seems WAY too large compared to the given orbital period of our Sun around the center of the galaxy, which is 200 million years. Where did I go wrong?
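Here is a minimal Python sketch of both calculations, using only the figures quoted above (G = 6.67*10^-11 m^3/(kg s^2), M = 8.2*10^36 kg, v = 220 km/s, r_sun = 28,000 ly) plus the standard conversions 1 ly ≈ 9.461*10^15 m and 1 yr ≈ 3.156*10^7 s; the variable names are just for illustration.

```python
# Sanity-check of the numbers quoted in the question.
import math

G    = 6.67e-11    # gravitational constant, m^3 / (kg s^2)
M_BH = 8.2e36      # mass of Sagittarius A* as used in the question, kg
V    = 220e3       # flat rotation speed, m/s
LY   = 9.461e15    # one light-year in metres
YEAR = 3.156e7     # one year in seconds

r_sun  = 28_000 * LY   # Sun's distance from the Galactic centre
r_star = 2 * r_sun     # the star twice as far out (56,000 ly)

# Keplerian period around the central black hole alone: T = 2*pi*sqrt(r^3 / (G*M))
T_kepler = 2 * math.pi * math.sqrt(r_star**3 / (G * M_BH))
print(f"Kepler, BH mass only: {T_kepler:.2e} s = {T_kepler / YEAR:.2e} yr")

# Flat rotation curve: the speed is the same at every radius, so T = 2*pi*r / v,
# and doubling r simply doubles the period.
for name, r in [("Sun ", r_sun), ("star", r_star)]:
    T = 2 * math.pi * r / V
    print(f"Flat curve, {name}: {T / YEAR / 1e6:.0f} Myr")
```

The Keplerian line reproduces the 3.27*10^18 s (about 10^11 years) from the calculation above, while the flat-curve lines print roughly 240 Myr for the Sun and twice that for the star; the gap between 240 Myr and the quoted ~200 Myr just reflects the rounded 28,000 ly and 220 km/s inputs.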