Is there a way to know what actual gain value is being used when dump1090 (or a variant) is set to automatic gain (--gain -10)? Does the system pick a static value or vary it dynamically? Either way, I’d like to know what value it is using.
It’s pretty horrible actually…
The R820T tuner has three gain stages: LNA, mixer, VGA.
If you ask librtlsdr for a manual gain value (or “maximum”, which picks the largest manual setting), it uses a table that maps the requested total gain to fixed gains for each stage.
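That table-driven “nearest supported gain” behaviour can be sketched roughly like this. The gain values below are the commonly reported R820T gain steps (in tenths of a dB, the units librtlsdr uses); treat both the exact values and the function name as illustrative assumptions, not a quote of librtlsdr’s source:

```python
# Commonly reported R820T gain table, in tenths of a dB (illustrative).
R820T_GAINS = [0, 9, 14, 27, 37, 77, 87, 125, 144, 157, 166, 197,
               207, 229, 254, 280, 297, 328, 338, 364, 372, 386,
               402, 421, 434, 439, 445, 480, 496]

def nearest_gain(requested_tenths_db):
    """Pick the table entry closest to the requested total gain --
    the same nearest-match idea librtlsdr applies to manual gains."""
    return min(R820T_GAINS, key=lambda g: abs(g - requested_tenths_db))

print(nearest_gain(430))  # -> 434, i.e. 43.4 dB
print(nearest_gain(999))  # -> 496, the usual 49.6 dB "maximum"
```

Once a total gain is chosen this way, the per-stage LNA/mixer/VGA split comes from a fixed table for that total.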
Setting --gain -10 asks librtlsdr to use automatic gain settings in the tuner. For an R820T tuner, librtlsdr uses these settings:
- LNA gain is set to be controlled by the tuner’s AGC
- Mixer gain is set to be controlled by the tuner’s AGC
- VGA gain is set to a fixed value
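The three bullet points above can be summarised in a tiny sketch. The stage names and the “agc”/“fixed” labels are my own shorthand for illustration, not librtlsdr identifiers:

```python
# Illustrative-only summary of how each R820T gain stage is driven.
def r820t_stage_modes(tuner_agc):
    """Return which control drives each gain stage in each mode."""
    if tuner_agc:  # what --gain -10 selects
        return {"lna": "agc", "mixer": "agc", "vga": "fixed"}
    # any manual gain (including "maximum"): all three stages fixed
    return {"lna": "fixed", "mixer": "fixed", "vga": "fixed"}

print(r820t_stage_modes(True))   # -> {'lna': 'agc', 'mixer': 'agc', 'vga': 'fixed'}
```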
The tuner’s AGC (in hardware) samples the input power (probably at the tuner output? I can’t remember…) and adjusts the LNA/mixer gain based on that. Unfortunately, it is not well suited to ADS-B traffic, which consists of short bursts at highly variable power levels; it’s designed more for DVB-T signals, which are continuous with a stable power level. So what usually happens is that the tuner’s AGC sees what appears to be a very weak signal and sets the LNA/mixer gain to maximum.
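A back-of-the-envelope duty-cycle estimate shows why a power-averaging AGC treats ADS-B as “very weak”. The numbers here are assumptions for illustration (a Mode S extended squitter lasts roughly 120 µs including the preamble, and a typical receiver might see on the order of 100 messages per second), not measurements:

```python
# Rough estimate of how often an ADS-B burst is actually present.
msg_duration_s = 120e-6   # ~8 us preamble + 112 bits at 1 Mb/s (assumed)
msgs_per_s = 100          # illustrative message rate

duty_cycle = msg_duration_s * msgs_per_s
print(f"signal present ~{duty_cycle:.1%} of the time")  # -> ~1.2%
```

So the AGC integrates essentially pure noise for the other ~99% of the time, and winds the LNA/mixer gain up to maximum.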
So the net effect is that “--gain -10” actually sets a fixed gain (!)
The reason it can produce better results than the maximum manual gain offered by librtlsdr is that the VGA gain used in AGC mode is slightly higher than the maximum you can set with a manual gain. (This is arguably a bug; you should be able to access this gain setting manually.)
So you should interpret “--gain -10” as something more like “--gain 55” (I can’t remember the exact value; it’s a little higher than the usual maximum of 49 dB).
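For what it’s worth, the way the --gain argument is commonly described as being handled in dump1090-style code can be sketched like this: the dB value is scaled to tenths, and a negative result selects the tuner AGC. The constant name echoes dump1090’s MODES_AUTO_GAIN, but treat the details as an assumption rather than a quote of its source:

```python
MODES_AUTO_GAIN = -100   # i.e. --gain -10, scaled by 10 (assumed constant)

def parse_gain(arg_db):
    """Interpret a --gain argument: negative means tuner AGC,
    otherwise a manual gain in tenths of a dB."""
    gain_tenths = int(round(float(arg_db) * 10))
    if gain_tenths < 0:
        # Hardware AGC -- which, per the discussion above, in practice
        # behaves like a fixed gain slightly above the manual maximum.
        return ("agc", MODES_AUTO_GAIN)
    return ("manual", gain_tenths)

print(parse_gain("-10"))   # -> ('agc', -100)
print(parse_gain("49.6"))  # -> ('manual', 496)
```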
Fantastic explanation. Thank you!