- #1
Stephen_D
Given the energy released if the Sun were instantly converted entirely into energy, E = (mass of Sun) * c^2 = 2.7 x 10^47 J:
how far would one have to be from a gamma-ray burst of that energy for its average flux to equal the average flux of the Sun's radiation at the Earth (the solar constant, about 1300 W/m^2)?
I understand the problem, but I can't seem to find a formula that relates distance to something in the units of the solar constant (W/m^2). The only thought I have is the potential energy formula, but that gives joules. Any hints on which formula to use? Solving it outright would also be helpful :-)
thanks
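For what it's worth, the relation you want is the inverse-square law for flux from an isotropic source: S = P / (4 * pi * r^2), solved for r. A minimal numeric sketch, assuming the burst radiates isotropically and picking a burst duration T = 10 s purely for illustration (the problem presumably specifies one; the average power depends on it):

```python
import math

# Inputs from the post; T is an assumed burst duration, not given above
E = 2.7e47   # total energy released by the burst, J
T = 10.0     # assumed burst duration, s (GRBs last roughly seconds to minutes)
S = 1300.0   # solar constant, W/m^2

P = E / T    # average power of the burst, W

# Inverse-square law for an isotropic source: S = P / (4 * pi * r^2)
# Solving for the distance r at which the flux equals the solar constant:
r = math.sqrt(P / (4 * math.pi * S))

print(f"average power: {P:.3e} W")
print(f"distance:      {r:.3e} m")
```

Note that r scales as 1/sqrt(T): a shorter assumed burst means higher average power, so the matching distance grows.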
Oops, this is my first post, didn't notice the homework section, sorry about that!