Hello.
I'm using a coaxial cable to transfer power to a load, with the original aim that radiation from the coax would be low or almost zero: the currents on the two conductors (the center conductor and the outer shield, which is grounded) are equal in amplitude and opposite in phase, so they sum to zero, and zero net current means zero radiation.
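To spell out the reasoning I had in mind (a rough sketch, assuming an ideal coax with a perfectly symmetric shield that carries the full return current; $I$ and $r_{\text{shield}}$ are just labels for this illustration): by Ampère's law, any circular loop drawn outside the shield encloses zero net current, so the external magnetic field should vanish:

$$\oint_{C}\mathbf{H}\cdot d\boldsymbol{\ell} = I_{\text{enc}} = I + (-I) = 0 \quad\Longrightarrow\quad H_{\phi}(r) = 0 \quad \text{for } r > r_{\text{shield}}.$$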
But my colleague said he was probably measuring significant radiation emitted from the cable. The measurement was done very roughly, so it isn't conclusive, but it makes me seriously question my simple assumption.
Is it guaranteed, in principle, that the currents in a coaxial cable always cancel? If so, please give me a proof or the logic behind this conclusion. If not, how can I set up such a special, ideal case, and what factors need to be considered?