Started by DU2XXR, Mar 24, 2024, 06:34 AM



by Warren Allgyer WA8TOD


One of the insidious things about the very popular end-fed antennas matched by ferrite impedance transformers is the effect of losses in those transformers on the overall performance of the antenna system.

For reasons beyond the scope of this post, losses in the transformer mask the actual SWR of the antenna. In reality, the higher the transformer loss, the lower the SWR reading at the transceiver; and the higher the actual SWR at the antenna, the higher the transformer loss.
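The masking effect follows from standard transmission-line theory: the reflected power makes two passes through the lossy transformer, so the reflection coefficient seen at the meter is attenuated by the round-trip loss. A minimal Python sketch of that relationship (the function name is mine, not from the post):

```python
def masked_swr(actual_swr, one_way_loss_db):
    """Apparent SWR at the transceiver when the true mismatch sits behind
    a lossy transformer. Reflected power takes the loss twice (out and
    back), so the measured voltage reflection coefficient is scaled by
    the one-way power-loss ratio."""
    gamma = (actual_swr - 1) / (actual_swr + 1)      # true reflection coefficient
    a = 10 ** (-one_way_loss_db / 10)                # one-way power-loss ratio
    gamma_meas = gamma * a                           # round-trip attenuation of |Gamma|
    return (1 + gamma_meas) / (1 - gamma_meas)
```

For example, a true 3:1 SWR behind a transformer with 3 dB of loss reads about 1.67:1 at the rig, which is exactly the flattering behavior described above.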

One very practical issue has prevented a lot of us from understanding this and being able to quantify it. While every ham has an SWR meter, almost no one has the ability to measure power at an impedance other than 50 ohms. So while we wind 9:1 UNUNs for antennas we assume to be in the 400 - 500 ohm range, and 49:1 UNUNs for resonant half waves, we have no way of determining how well they actually work because we do not have the instrumentation to do it.
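For reference, those impedance targets follow from the square-law relation between a transformer's turns ratio and its impedance ratio. A quick sketch (the function name is illustrative, not from the post):

```python
def unun_design_impedance(impedance_ratio, system_z=50.0):
    """Design impedance an unun presents as a 50-ohm match, plus the
    winding turns ratio that produces it (impedance ratio = turns
    ratio squared)."""
    return system_z * impedance_ratio, impedance_ratio ** 0.5
```

A 9:1 unun (a 3:1 turns ratio) targets 450 ohms, and a 49:1 unun (7:1 turns) targets 2450 ohms, consistent with the wire impedances assumed above.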

In the diagram I show a method of adjusting a normal tuner to replicate the impedance of an unknown wire at a particular frequency. Once that is done, I feed the test transformer output to the tuner in reverse so the high-impedance terminals of each match, and the low-impedance terminal of the tuner then presents the power level that has survived the transformer and the tuner together. Since the tuner loss is normally less than 1 dB, we can discount it in this test.
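With the test set up this way, the combined loss falls out of a simple power ratio between the power driven into the transformer and the power measured at the reversed tuner's 50-ohm port. A sketch of the arithmetic (helper name is mine; per the setup above, the sub-1-dB tuner loss is lumped in and neglected):

```python
import math

def transformer_loss_db(p_in_watts, p_out_watts):
    """Loss inferred from the back-to-back substitution test: power into
    the transformer vs. power surviving at the reversed tuner's 50-ohm
    port. Tuner loss is assumed negligible, as stated in the post."""
    return 10 * math.log10(p_in_watts / p_out_watts)
```

So if 100 W goes in and 50 W comes out the far 50-ohm port, the transformer is eating about 3 dB, i.e. half the transmitter's power.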

One caveat: While the input impedance of the reversed tuner will closely match that of the antenna, its reactance sign will be opposite that of the wire. I don't THINK that adversely affects the accuracy of the loss measurement but I will gladly entertain viewpoints to the contrary.