dnyberg2
Okay, so my short experience in radio has taught me that the input and output of every RF system is designed around a particular characteristic impedance, typically 50, 75, or 300 ohms. So you take a transmitter and hook it to a tuned antenna with some sort of feedline like coax.
The coax has the same impedance as the transmitter output and the antenna; let's say this is all in a 50 ohm world for this example, okay?
Now you decide you need to move the transmitter a little further away from the antenna feed point, so you go to your shack and grab a roll of coax that gets the job done. This new coax is 10 feet longer than the one you were using. Let's say, for argument's sake, it's RG8.
The new longer length has a little more loss to it than the shorter length, I get that much. But what else does this new longer coax have going on besides a bit more loss, and why?
Does the longer coax now phase shift the RF to some other phase angle than the shorter one?
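To put a rough number on that, here's a quick Python sketch of what I mean; the 0.66 velocity factor, the 14 MHz test frequency, and the ~0.9 dB/100 ft loss figure are just typical-looking assumptions for illustration, not datasheet values:

```python
# Back-of-the-envelope: what an extra 10 ft of RG8 does at 14 MHz.
# Velocity factor (0.66) and loss (~0.9 dB/100 ft) are assumed typical
# values for solid-PE RG8; real cables vary, so check the datasheet.

C = 299_792_458           # speed of light in vacuum, m/s
VF = 0.66                 # assumed velocity factor
FREQ_HZ = 14e6            # assumed test frequency (20 m band)
LOSS_DB_PER_100FT = 0.9   # assumed matched-line loss at 14 MHz

extra_ft = 10
extra_m = extra_ft * 0.3048

# Wavelength inside the cable is shortened by the velocity factor.
wavelength_m = C * VF / FREQ_HZ

# The extra length delays the signal, i.e. shifts its phase.
phase_shift_deg = 360 * extra_m / wavelength_m
extra_loss_db = LOSS_DB_PER_100FT * extra_ft / 100

print(f"Wavelength in the coax: {wavelength_m:.2f} m")
print(f"Phase shift from the extra 10 ft: {phase_shift_deg:.1f} deg")
print(f"Extra loss: {extra_loss_db:.2f} dB")
```

If I've done that right, the extra 10 feet works out to something like 78 degrees of phase at 20 meters, plus a loss you'd never notice.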
I assumed all these years that a slight difference in length of coax does little but change the loss in this perfectly matched 50 ohm system.
I know different lengths of coax are used as tuned lines when phasing two antennas together and such, or even making a filter, but the lengths of coax in those cases are at some multiple of the wavelength, or present a calculated impedance, delay, or phase change, right?
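For what it's worth, here's a sketch of the textbook lossless-line impedance transformation I'm picturing; the 50 ohm Z0, the 14 MHz frequency, and the 100 ohm mismatched load are just illustrative numbers:

```python
# Standard lossless-line impedance transformation:
#   Zin = Z0 * (ZL + j*Z0*tan(beta*l)) / (Z0 + j*ZL*tan(beta*l))
# With a matched load (ZL == Z0) the tan() terms cancel and Zin is
# Z0 at ANY length; with a mismatch, Zin swings as the line grows.

import math

C = 299_792_458           # speed of light in vacuum, m/s
VF = 0.66                 # assumed velocity factor
FREQ_HZ = 14e6            # assumed test frequency
Z0 = 50.0                 # characteristic impedance of the line

def z_in(z_load: complex, length_m: float) -> complex:
    beta = 2 * math.pi * FREQ_HZ / (C * VF)   # phase constant, rad/m
    t = math.tan(beta * length_m)
    return Z0 * (z_load + 1j * Z0 * t) / (Z0 + 1j * z_load * t)

for feet in (0, 5, 10, 15):
    metres = feet * 0.3048
    matched = z_in(50 + 0j, metres)
    mismatched = z_in(100 + 0j, metres)
    print(f"{feet:2d} ft: matched load -> {matched:.1f} ohms, "
          f"100-ohm load -> {mismatched:.1f} ohms")
```

If that math is right, the matched load reads 50 ohms at every length, while the mismatched load's apparent impedance wanders around as feet are added.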
What would you say if I told you that I have an RF system that seems so impedance-dependent from source to load that even a slight change in the length of transmission line (inches) seriously affects the overall performance of the entire system, from RF source to antenna load?
Doesn't that smack of some kind of design flaw? If an RF system were designed to act that way on purpose, what does that say about the system?
As always, your comments are greatly appreciated.