How far can you detect a Morse code signal in space?

  • Thread starter Dc2LightTech
  • Start date
  • #1
Dc2LightTech
If you have a 1 km dish in space and radiate 100% of the energy into it, what would be the maximum distance at which an identical receiver with a 1 km dish could receive it before the signal dropped below the background noise? So, a CW signal modulated at 1 Hz.
 
  • #2
The cosmic "background noise" is a function of frequency. There is less cosmic noise at higher frequencies.

The directive gain of your 1 km aperture antenna is also important. That is proportional to the square of the frequency. But the surface must be accurate to better than λ/20.

What is the temperature of your receiver?

So different frequencies have different ranges. You should use the highest frequency compatible with the surface accuracy of your dishes. Once that is specified, an energy audit can be compiled.
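For illustration, here is a minimal Python sketch of that trade-off, assuming a made-up 1 mm surface accuracy and a 50% illumination efficiency (neither figure comes from the thread):

```python
# The lambda/20 rule of thumb sets the highest usable frequency for a given
# surface accuracy, and the dish gain then follows from the aperture.
import math

C = 3.0e8                              # speed of light, m/s
surface_rms_m = 1.0e-3                 # assumed surface accuracy: 1 mm
diameter_m = 1000.0                    # 1 km dish

wavelength_min = 20 * surface_rms_m    # lambda/20 criterion -> 2 cm
f_max_ghz = C / wavelength_min / 1e9   # ~15 GHz
gain_dbi = 10 * math.log10(0.5 * (math.pi * diameter_m / wavelength_min) ** 2)

print(f"f_max ~ {f_max_ghz:.0f} GHz, gain ~ {gain_dbi:.0f} dBi")
```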
 
  • Like
Likes russ_watters
  • #3
Well, let's see.

The NASA Deep Space Network (DSN) uses 70 meter antennas. They are still receiving signals from the 3.66 meter antenna on the Voyager 2 spacecraft.

Voyager 2 spacecraft is 133AU away. (1AU is the mean distance between the Earth and the Sun)
That's about 93,000,000 miles for an AU, times 133 (you get to do the math :wink:)

The data rate from Voyager is 160 bits per second.

Now, since dish antenna gain is proportional to area, let's see how antenna size affects things.
Using area = π × R²:
π × 35² ≈ 3,848 square meters (DSN antennas)
π × 500² ≈ 785,000 square meters (your antenna)
So your antenna will be about 204 times as sensitive as the DSN antenna.

Now, if you multiply the antenna gain ratio by the bit rate (not necessarily very accurate),
you get 204 × 160 = 32,640. Multiply that by the answer you got for Voyager's current distance, above, and you get a lower limit for how far you could communicate with a 1 km antenna talking to a 3.66 meter antenna.

Trip time from the Sun to Earth at 186,282 miles per second is 8.32 minutes.
8.32 minutes × 133 ≈ 18.4 hours for a signal sent from Voyager 2 to reach Earth.

Now another question arises:
Multiply that 18.4 hours by the 32,640 found above and you get 600,576 hours for a one-way message.
Divide that by the 8,766 hours in a year and you get about 68.5 years for a spacecraft transmission to reach Earth.
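For anyone who wants to check the arithmetic, here is a quick Python sketch of the numbers above (the gain-times-bit-rate scaling is the same rough heuristic used in this post, not a proper link budget):

```python
# Back-of-the-envelope reproduction of the numbers above.
dsn_radius_m = 35.0                          # 70 m DSN dish
big_radius_m = 500.0                         # 1 km dish
area_ratio = (big_radius_m / dsn_radius_m) ** 2
print(f"area (gain) ratio: {area_ratio:.0f}")            # ~204

one_way_hours = 8.32 * 133 / 60              # light time to Voyager 2, ~18.4 h
scale_factor = area_ratio * 160              # gain ratio x bit rate, ~32,650
scaled_hours = one_way_hours * scale_factor
print(f"scaled one-way time: {scaled_hours / 8766:.0f} years")   # ~69 years
```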

I think it will be a while before we have power sources that can last that long (and that is assuming the spacecraft is traveling at the speed of light, which is not likely!)

Fun Idea though, (gave me something to do this evening)

Cheers,
Tom
 
  • Like
Likes TonyStewart and russ_watters
  • #4
Tom.G said:
you get about 68.5 years for a spacecraft transmission to reach Earth.

At least we can be sure we could easily send and receive messages to Proxima Centauri with even smaller antennas.
 
  • #5
If you want to increase the communication distance, you can increase the transmit power, increase the antenna gain, or reduce the background noise, for example by moving the receiving antenna to a location free of electromagnetic interference.
 
  • #6
Tom.G said:
The data rate from Voyager is 160 bits per second.
That is the DATA rate and it is irrelevant except that it gives a lower bound on your possible transmission frequency. What counts is the transmission frequency. I COULD, if I wanted, send 160 bps over any frequency from several hundred Hz on up, but the higher my transmission frequency can go, the higher my data rate can go.

EDIT: where did you get that data rate? I can't find it in my, admittedly brief, search.

I only found that it was as high as 115.2 kilobits per second at the distance of Jupiter and that it decreases with distance.

EDIT #2: Never mind. I found it.
 
Last edited:
  • #7
The OP not only doesn't specify frequency, he doesn't specify power. Doubly unanswerable.
 
  • Like
Likes davenn and hutchphd
  • #8
phinds said:
That is the DATA rate and it is irrelevant....
It's plenty relevant in that higher data rates require wider bandwidth channels. Signal to noise ratio gets poorer as channel width increases. There are ways to get around this by using multiple narrow bandwidth channels to simultaneously transmit separate portions of the data package at lower data rates. But this certainly is not what we are discussing here with Morse code.
 
  • Like
Likes hutchphd
  • #9
Averagesupernova said:
It's plenty relevant in that higher data rates require wider bandwidth channels
Which is exactly what I said.
...irrelevant except that it gives a lower bound on your possible transmission frequency.
 
  • #10
Your last post seems contradictory to me.
 
  • #11
Averagesupernova said:
Your last post seems contradictory to me.
What I have said is that low transmission frequencies only allow for low data rates (or, conversely, low data rates can be accommodated by lower transmission frequencies), and higher data rates require higher transmission frequencies.
 
Last edited:
  • #12
Because the information provided by the OP is very limited, it is difficult to answer this question accurately.
But let me try to explore it theoretically. According to the Shannon-Hartley theorem, even if there is a lot of noise in the channel, a signal that occupies only a very narrow bandwidth and reaches the receiver at very weak power can theoretically still carry a valid message; the transmission just becomes very slow.
So the answer to the OP's question is probably that, in theory, the distance can be very large. :smile:
https://en.wikipedia.org/wiki/Shannon–Hartley_theorem
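As a rough illustration of that point (the bandwidth and SNR below are made-up numbers, not taken from the OP), even a channel deep in the noise has a small but nonzero Shannon capacity:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity, C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 1 Hz channel at -20 dB SNR (0.01 in linear terms):
capacity = shannon_capacity(1.0, 0.01)
print(f"{capacity:.4f} bit/s  (~1 bit every {1 / capacity:.0f} s)")
```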
 
  • Like
Likes berkeman
  • #13
CW or Morse code is limited because carrier phase is lost in the spaces. It is better to transmit a continuous carrier, phase modulated, probably with a PRBS ranging code. That way, you can follow the carrier deep into the noise.

Morse code was first used to modulate radio signals in about 1900.
It is now 2023.
That gives an alternative answer to the question: 123 light years.
 
  • Like
  • Haha
Likes Vanadium 50, hutchphd and sophiecentaur
  • #14
Tom.G said:
Voyager 2 spacecraft is 133AU away. (1AU is the mean distance between the Earth and the Sun)
That's about 93,000,000 miles for an AU, times 133 (you get to do the math :wink:)
As it happens, I was at an Astro event at the weekend where a guy was showing his hand made models of the craft used on various space missions. Very impressive bits of craft.

The model of Voyager 2 made me think . . . The inverse square law is a wonderful thing and (apart from the power supply going flat) is pretty much the only thing limiting link performance in space. Voyager can go twice as far away and yet its received signal will only drop a mere 6 dB in level. By that time, technology could well still make it possible to demodulate its signal. I won't be around for that, but several of you young whippersnappers may well be.
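A one-line sanity check of that 6 dB figure (free-space power falls as 1/d², so doubling the distance costs 20 log₁₀ 2 of received level):

```python
import math
print(f"{20 * math.log10(2):.2f} dB per doubling of distance")   # 6.02 dB
```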
 
  • #15
Vanadium 50 said:
The OP not only doesn't specify frequency, he doesn't specify power. Doubly unanswerable.
Oops, I did miss that detail... a 1Km antenna, transmitting a 1Kw signal into a 1Km antenna X light years away.
 
  • #16
Dc2LightTech said:
Oops, I did miss that detail... a 1Km antenna, transmitting a 1Kw signal into a 1Km antenna X light years away.
What is the surface accuracy of the dishes?
 
  • #17
Baluncore said:
What is the surface accuracy of the dishes?
λ/20? :smile:
 
  • #18
Baluncore said:
What is the surface accuracy of the dishes?
Anything that big would use a large number of sections which could be controlled and phased. Mechanical stability would be a different problem under low gravity conditions. But how long is a piece of string?
 
  • #19
The parameters have ventured into the silly. The system has two dishes costing tens, or more likely hundreds, of billions of dollars, one of them launched into space at who knows what cost (literally "astronomical") and connected to a transmitter you can buy for a few hundred bucks at the next hamfest.
 
  • Like
Likes berkeman
  • #20
Vanadium 50 said:
The parameters have ventured into the silly. The system has two dishes costing tens, or more likely hundreds, of billions of dollars, one of them launched into space at who knows what cost (literally "astronomical") and connected to a transmitter you can buy for a few hundred bucks at the next hamfest.
Agreed; this thread is getting close to being tied off...

That said, discussions about Shannon's Theorem, antenna gain, optimum frequency band selection and coding for interstellar communication, etc. should still be fine.
 
  • Like
Likes Vanadium 50 and sophiecentaur
  • #21
I recall that the Voyager modulation was biphase (BPSK), which happened to be the same modulation I was using at the time. The bit rate significantly affects distance, as the SNR is improved by matching the bandwidth of the receiver to the spectrum of the signal. This is called the matched-receiver condition for Gaussian noise.

Thus reducing the bit rate also extends the lifetime of the mission.

But I learned much more from reading this 2013 paper about the sources of noise from solar flares and adjacent instruments. Friis' law is not the only factor governing path loss, SNR, and error rate, and (I expect) the next Carrington-class event is due shortly after the planned 2025 termination of the Voyager Interstellar Mission (VIM), unless someone posts an update.

[Attached table: Voyager mission and power estimates, including an estimated power failure year of 2023]

https://voyager.gsfc.nasa.gov/Library/DeepCommo_Chapter3--141029.pdf

They used a full set of encoding methods to optimize error rates, with convolutional coding and ECC, which improves the post-correction error rate by at least two decades (a hundred times more bits between errors).

There are many modes of redundancy, so the article is intended for readers with telecom/telemetry experience. But this system achieved the greatest distance yet for a telemetry signal far more complex than the original Morse code, while having to track the Doppler shift from Earth's rotation, maintain power, and keep pointing at Earth, all while running an array of electromagnetic experiments.

https://voyager.jpl.nasa.gov/
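As a small sketch of that bandwidth trade, assuming the matched-receiver noise power scales as kTB, each halving of the bit rate (and hence of the matched bandwidth) buys roughly 3 dB of SNR:

```python
import math

# SNR gain from narrowing the matched bandwidth along with the bit rate.
for ratio in (1, 2, 4, 8, 16):
    print(f"bit rate / {ratio:>2}: SNR gain ~ {10 * math.log10(ratio):4.1f} dB")
```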
 
Last edited:
  • #22
I am going to assume a system noise temperature T equal to the CMB, 3 K, and a bandwidth B of 100 Hz. The noise power in the receiver, Pn, in dBW is then 10 log(kTB), where k is Boltzmann's constant, 1.38 × 10^-23 W per hertz per kelvin. Pn ≈ -204 dBW.

Next I will estimate the gain of a 1 km diameter dish at an assumed wavelength of 10 cm. G = 10 log[n (π D/λ)²], where n is the illumination efficiency, assumed to be 0.5, and D is the diameter in metres. This gives an antenna gain of about 87 dBi. For the spacecraft I will assume a 1 m diameter dish, giving a gain of 27 dBi.

I am going to assume a transmitter power of 20 dBW for the spacecraft, based on the available power.
Assume the Morse code is readable with a S/N of 10 dB. Our overall system gain is then 204 + 20 + 87 + 27 - 10 = 328 dB, this being the maximum allowable loss between the transmitter and receiver antenna ports.

To find the maximum distance we use the formula for free-space propagation, which can be expressed as L = 20 log(4 π d/λ), where d is the distance in metres and L is the attenuation between isotropic antennas. So 328 = 20 log(4 π d/0.1), giving a distance of about 2 × 10^11 km. This is roughly ten times the distance to Voyager 2.

As a matter of interest, forward error correction can only extend the detection threshold for a Bit Error Ratio of 10^-3 by a small amount, maybe 2 decibels approx. Apologies if I have tripped up in the numbers anywhere.
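Here is the same budget as a short Python sketch under the assumptions stated above (3 K, 100 Hz, 10 cm wavelength, 50% illumination efficiency, 100 W transmitter, 10 dB required S/N), so anyone can tweak the numbers:

```python
import math

k = 1.38e-23                                   # Boltzmann constant, J/K

def db(x):
    return 10 * math.log10(x)

def dish_gain_dbi(diameter_m, wavelength_m, efficiency=0.5):
    # G = 10 log10[ eta * (pi * D / lambda)^2 ]
    return db(efficiency * (math.pi * diameter_m / wavelength_m) ** 2)

wavelength = 0.1                               # 10 cm
noise_dbw = db(k * 3 * 100)                    # T = 3 K, B = 100 Hz -> ~ -204 dBW
g_ground = dish_gain_dbi(1000, wavelength)     # 1 km dish -> ~87 dBi
g_craft = dish_gain_dbi(1, wavelength)         # 1 m dish -> ~27 dBi
tx_dbw = 20                                    # 100 W transmitter
snr_db = 10                                    # required S/N for readable Morse

max_loss_db = tx_dbw + g_ground + g_craft - (noise_dbw + snr_db)
# Invert the free-space loss L = 20 log10(4 pi d / lambda) for the distance d:
d_m = 10 ** (max_loss_db / 20) * wavelength / (4 * math.pi)
print(f"max path loss ~ {max_loss_db:.0f} dB, range ~ {d_m / 1e3:.1e} km")
```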
 
  • Like
  • Informative
Likes Tom.G, berkeman and Baluncore
  • #23
tech99 said:
I am going to assume ..
These are theoretical assumptions, not yet proven, with many problems to solve.

- Is thermal noise the only dominant source of noise in the presence of solar and cosmic impulse radiation?
- How do the bits/baud per message affect the BER when the dominant noise is not Gaussian?
- If one assumes a 1 km dish with ~87 dB of gain, which has never been built (the 0.5 km Chinese FAST dish is the largest), what assumptions are made about the precision of every parabolic sub-panel reflector angle, in parts per billion, while correcting for the Earth's rotation?
- VIM's best Earth antenna only has a gain of __ ?
- How much distortion is created by loading and environmental effects on a 1 km dish?
- Does the Earth's rotation become significant when tracking an ultra-low-frequency sub-carrier?
- What carrier stability is needed, in ppb?
- etc.

"On the Lighter Side": Theory is when you know everything but nothing works. Practice is when everything works but no one knows why. When theory and practice are combined, nothing works and no one knows why.
 
  • #24
TonyStewart said:
These are theoretical assumptions, not yet proven, with many problems to solve.
Let's hope the estimated power failure year of 2023 in your table is a gross underestimate. Just as they were doing so well. . . . .
 
  • #25
sophiecentaur said:
Let's hope the estimated power failure year of 2023 in your table is a gross underestimate. Just as they were doing so well. . . . .
I think it is pretty accurate, as they have already shut off some heaters and will shut down experiments to extend the life a bit more. They had a loss of signal in July, when a command sequence resulted in the antenna shifting by 2 degrees, but luckily they were able to "shout" a re-aim command sequence in August and re-acquire the signal. https://voyager.jpl.nasa.gov/news/details.php?article_id=130
 
  • #26
TonyStewart said:
These are theoretical assumptions, not yet proven, with many problems to solve.
It is sometimes useful to consider the most optimistic set of assumptions as a starting point for system design. Regarding the noise floor, I was taking the noise to be predominantly the CMB, which as far as I know is Gaussian. We are not talking about a sub-carrier, by the way; this is the main carrier being keyed. So for frequency-tracking purposes I imagine the Doppler shift can be calculated, and then either use AFC or manually tune the signal over a range of a few kilohertz.
 
  • #27
tech99 said:
then either use AFC
Not a big deal, I suspect. The thing is only receding at about 15 km/s and its speed will be changing very slowly. That's not too different from Earth's orbital speed (the variation over a year would not be much), and I'd expect any suitable receiver to be phase-locked.
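For a rough feel of the tuning ranges involved, here is a quick Doppler sketch. The 8.4 GHz carrier is an assumed X-band-like value for illustration, not a figure quoted in the thread:

```python
# Doppler shift ~ f * v / c for a few representative velocities.
f_carrier_hz = 8.4e9                     # assumed X-band-ish carrier (illustrative)
c = 3.0e8                                # speed of light, m/s

velocities = (("Earth rotation", 0.46), ("Voyager recession", 15.0), ("Earth orbit", 30.0))
for label, v_kms in velocities:
    shift_khz = f_carrier_hz * (v_kms * 1e3) / c / 1e3
    print(f"{label:18s} {v_kms:5.1f} km/s -> shift ~ {shift_khz:.0f} kHz")
```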
 
  • #28
Dc2LightTech said:
oops i did miss that detail... a 1Km antenna, transmitting a 1Kw signal into a 1Km antenna X light years away.

and you still missed what frequency
and it's kW and km :wink: :smile:
 
  • Like
  • Haha
Likes sophiecentaur and berkeman

FAQ: How far can you detect a Morse code signal in space?

How far can a Morse code signal travel in space?

The distance a Morse code signal can travel in space depends on several factors, including the power of the transmission, the sensitivity of the receiving equipment, and the presence of any interference. In theory, with powerful enough equipment, a Morse code signal could be detected over interstellar distances, potentially spanning light-years.

What factors affect the detection range of a Morse code signal in space?

Several factors affect the detection range of a Morse code signal in space, including the transmission power, the frequency of the signal, the sensitivity and quality of the receiving equipment, and any potential interference from cosmic sources or other signals. Additionally, the presence of interstellar dust and gas can attenuate the signal over long distances.

Can existing technology detect Morse code signals from other star systems?

Current technology, such as large radio telescopes and highly sensitive receivers, can detect very weak signals from other star systems. However, detecting a Morse code signal from such a distance would require the signal to be extremely powerful or the receiving equipment to be exceptionally sensitive, often beyond current capabilities for routine detection.

What is the role of signal frequency in detecting Morse code in space?

The frequency of the Morse code signal plays a crucial role in its detection. In free space, higher frequencies let a dish of a given size achieve more gain and generally encounter less galactic background noise, but they demand greater surface accuracy and pointing precision from the antennas. Lower frequencies are more forgiving of antenna imperfections but provide less gain for the same aperture and face a higher noise background. The choice of frequency is a balance between these factors.

Have Morse code signals ever been detected from space?

To date, there have been no confirmed detections of Morse code signals originating from space. While Morse code has historically been used for communication on Earth, any extraterrestrial civilization using this method would need to transmit a very powerful signal for us to detect it with our current technology. Most searches for extraterrestrial signals focus on other types of modulated signals that may be more likely to be used for long-distance communication.
