Hi all. My current setup is pretty standard: an RPi Model 3, ProStick Plus, 3 ft SMA cable, and a short indoor 1090 MHz antenna in a window. It can see aircraft maybe 40-50 miles away to the N, E, and S, and maybe 30 miles to the W.
I have an outdoor 26" antenna with an N connector. I want to mount that antenna on my roofline and run a cable to the ProStick. I know I need adapters and such to get the antenna N connector to a cable and then to the SMA connector on the ProStick.
Any advice as to cable types and such? I have a long run of RG-9 I can use, or is there a better option? From my roofline to the ProStick and Pi would be about 15 ft, but I have an antenna mast on my roof that I could mount the new antenna on, which would add about another 15 feet.
There are two basic properties of a coax cable you want to know - the impedance and the attenuation (or loss per length).
Generally cables come in either 50 Ohm or 75 Ohm impedances - 50 Ohm is used for radio-related applications (like here), and 75 Ohm is used more for cable TV, etc. It’s not critical that you match the impedances in a system, but any mismatch will incur some (modest) additional loss, so when given a choice try to match - which in this case means “use 50 Ohm”.
The attenuation is really the issue to focus on. Cables generally have increasing losses at higher frequencies - “loss” (usually measured in “dB per length”) tends to increase as the square root of the frequency, so the same cable will have ~41% more loss when you double the frequency (√2 ≈ 1.41).
TV frequencies are low enough that the cables typically used there (e.g. RG-6) will have a lot of loss in the 1000 MHz range (where ADS-B operates). For TV channel 13 (top of the VHF band) RG-6 has a loss of ~4 dB per 100 ft - at 1090 MHz (ADS-B) it’s nearly 11 dB. RG-8 is even worse (~14 dB/100 ft @ 1090 MHz).
[per CoaxCableLossChart.pdf (kd3y.com) ]
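To put that square-root rule in code form, here’s a quick Python sketch (the ~213 MHz figure for channel 13 is my approximation; trust the chart’s numbers over this rule of thumb):

```python
import math

def scale_loss(loss_db, f_from_mhz, f_to_mhz):
    """Rough square-root-of-frequency scaling of coax attenuation."""
    return loss_db * math.sqrt(f_to_mhz / f_from_mhz)

# RG-6's ~4 dB/100 ft near TV channel 13 (~213 MHz), scaled to 1090 MHz:
print(round(scale_loss(4.0, 213, 1090), 1))  # ~9.0 dB/100 ft
# The chart shows closer to 11 dB/100 ft - the sqrt rule underestimates a bit,
# since dielectric loss grows faster than sqrt(f) at these frequencies.
```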
The “good stuff” seems to be cable designed for (Ham?) radio applications, with the gold standard being “LMR” from Times Microwave.
LMR 240, LMR 400, and LMR 600 are increasingly “low loss” (and increasingly expensive!) cables intended for this sort of use. From the above link, losses per 100ft @1200 MHz are:
LMR 240: 8.8dB
LMR 400: 4.8dB
LMR 600: 3.1dB
There are a number of “other/off-brand” versions of these cables, usually with the same numbering but a different set of letters (e.g. RFC400, KMR400, KSR400, …). Based on reviews I’ve read, some of these are basically the same (just not “Times” branded), and some are potentially crappy Chinese knockoffs.
Since the total cable loss is a function of length, you generally don’t want a lot more cable than required. That said, if you are using something “good” like LMR 400, an extra 10 feet of cable adds < 0.5 dB - which is fairly low. You’ll probably get that much loss from a couple of adapters (N-to-SMA) in the system.
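As a rough sketch of that budgeting, using the 1200 MHz figure quoted above (actual 1090 MHz loss would be a touch lower):

```python
def cable_loss(db_per_100ft, length_ft):
    """Scale a per-100-ft attenuation spec to an actual run length."""
    return db_per_100ft * length_ft / 100.0

# LMR 400 at 4.8 dB/100 ft:
print(cable_loss(4.8, 30))  # 1.44 dB for a ~30 ft run (roofline + mast)
print(cable_loss(4.8, 10))  # 0.48 dB for an extra 10 ft
```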
FWIW, after much reading, researching, and head scratching, I ended up getting this cable:
which seemed like it would be a good compromise between cost and quality. So far it seems to be working for me, although I don’t have a good way to measure its actual attenuation.
Actually, the Times Microwave datasheets for their LMR cables say every mated pair of connectors adds about 0.1 dB of attenuation. So count the total number of coax connectors in your entire antenna system (N and SMA, male and female, including the ones on any adapters or filters), divide by two to get the number of mated pairs, and multiply by 0.1 dB to estimate the total loss of those connections.
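To illustrate that counting rule with a made-up example system (antenna-to-cable, cable-to-adapter, adapter-to-dongle):

```python
def connector_loss(total_connectors, db_per_pair=0.1):
    """Total connectors / 2 = mated pairs; ~0.1 dB per pair per the LMR datasheets."""
    return (total_connectors / 2) * db_per_pair

# Example: antenna N-F + cable N-M, cable N-M + adapter N-F, adapter SMA-M + dongle SMA-F
# = 6 connectors = 3 mated pairs:
print(round(connector_loss(6), 2))  # ~0.3 dB
```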
To the original poster, besides Mark’s good information I’d like to add: beware of listings on Amazon or eBay that call their coax “low loss” when it turns out to be RG-58, which isn’t really recommended even in lower-frequency amateur radio except for very short cable runs (such as for a mobile radio in a vehicle) because of how lossy and noisy it is.
Yes, @jaymot, a very good point. In my searching I found a lot of listings that said “Low Loss” in big text in the heading and “RG-58” in the specs/fine print.
LMR 400, etc. is significantly beefier than RG-58 or RG-6. Just looking at it, you can tell it’s “something else” (looks like 0.41" vs 0.26" so more than 50% larger diameter).
And also, “good to know” on the fittings rule of 0.1 dB per mated pair. I think I learned to use 0.25 dB per “coupling” for TV cable (RG-6) with type-F connectors, so I figured this would be similar. I believe I also learned that a 50-75 Ohm impedance mismatch was around 0.25 dB as well - but now I wonder if I’m confusing the two in my mind. Do you have a good figure for that loss factor?
If I’m correct, a 50 to 75 Ohm impedance mismatch gives an SWR of 1.5:1, which represents about a 4% loss of signal power, so roughly 96% would still be making it to the SDR. This isn’t factoring in the attenuation of the coax. https://n6pet.com/power-loss-various-swr/
If I’m doing my math right, a 4% reduction corresponds to about -0.18 dB, which seems in the right ballpark - and my “-0.25dB” recollection was likely for something else (like a type-F joint).
Thanks!
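For anyone who wants to double-check that arithmetic, here’s the standard reflection-coefficient math (just a sketch, nothing specific to any particular gear):

```python
import math

def mismatch(z_load, z_source=50.0):
    """SWR, reflected power fraction, and mismatch loss for an impedance step."""
    gamma = abs(z_load - z_source) / (z_load + z_source)  # reflection coefficient
    swr = (1 + gamma) / (1 - gamma)
    reflected = gamma ** 2                                 # fraction of power reflected
    loss_db = -10 * math.log10(1 - reflected)
    return swr, reflected, loss_db

print(mismatch(75.0))  # roughly (1.5, 0.04, 0.18): SWR 1.5:1, ~4% reflected, ~0.18 dB
```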
FWIW, I took my recently acquired NanoVNA and a collection of adapters and I think I was able to measure the attenuation of some of the cables I have lying around:
25 ft RG-58 = -6.6dB @ 1090MHz
50 ft RG-6 = -5.2dB @ 1090MHz
35 ft KMR400 = -2.0dB @ 1090MHz
The traces were a bit choppy, so I had to estimate a bit - I’m not sure why they weren’t smoother, but it was probably something I was doing wrong with the VNA settings. Anyhow, the KMR400 is doing a bit worse than the theoretical figure for LMR400, but the couple of extra adapters weren’t helping, so I would say it’s “close” to what LMR would be expected to be.
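For comparison, here’s how those readings stack up against the per-100-ft numbers quoted earlier in the thread (my adapters and VNA calibration add some slop either way):

```python
# Specs quoted earlier: LMR 400 ≈ 4.8 dB/100 ft (@1200 MHz), RG-6 ≈ 11 dB/100 ft (@1090 MHz)
runs = {
    "35 ft KMR400": (4.8, 35, 2.0),   # (dB/100 ft, length ft, measured dB)
    "50 ft RG-6":   (11.0, 50, 5.2),
}
for name, (db_per_100ft, length_ft, measured_db) in runs.items():
    expected = db_per_100ft * length_ft / 100.0
    print(f"{name}: expected ~{expected:.1f} dB, measured {measured_db} dB")
# 35 ft KMR400: expected ~1.7 dB, measured 2.0 dB
# 50 ft RG-6:   expected ~5.5 dB, measured 5.2 dB
```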
I ordered LMR400 from a supplier and was sent Powax400. Tested on HF, no problems, but on UHF it did not perform. Quality control at the factory is the issue: there are sections of the cable where the foil shield is twisted and covers only two thirds of the way around the dielectric. The quality difference between proper LMR400 and the clones shows in the braid density as well.