It’s a rough rule of thumb; I never intended that value to be anything more than a “hey, maybe you’re overloading things” indication.
-3dBFS is something of a magic constant because it corresponds to 50% of maximum power. Mode S messages effectively have a 50% duty cycle (the carrier is on for only half of each bit period), so in a hypothetical perfect noise-free case, the strongest signal you could receive without overloading the ADC would average 50% of max power over the message. Once you add in some real-world noise during the nominally “off” half-bit periods, it’ll be somewhat above 50%. So a message with a measured signal strength >-3dBFS is probably saturating the ADC.
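To make the arithmetic concrete, here’s a quick sketch (Python, with power normalized so that full scale = 1.0) showing why 50% of max power works out to roughly -3dBFS:

```python
import math

def power_to_dbfs(power, full_scale_power=1.0):
    """Convert a linear power measurement to dB relative to full scale."""
    return 10.0 * math.log10(power / full_scale_power)

# Carrier on for half the message -> average power is 50% of max:
print(power_to_dbfs(0.5))  # ~ -3.01, i.e. roughly -3dBFS
```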
The demodulator is fairly tolerant of this, but it does eventually reach a point where very strong messages are undecodable.
Here are a couple of graphs from some testing I did recently that show the effect. (Note that transmit levels are on an arbitrary scale; they don’t correspond to any particular absolute power level.)
Things to note:
- the successful message decode rate starts to fall off for transmit levels > -15 or so
- the received signal level starts to go non-linear at a transmit level of around -25, which corresponds to a measured receive level of around -4dBFS. This is where something in the receive chain is starting to overload (or at least go non-linear): compression effects or clipping somewhere. By the time the message decode rate starts to drop, the measured receive level is around -2dBFS (a rough way of locating this knee from test data is sketched below)
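For anyone wanting to reproduce this kind of analysis, here’s a rough sketch of how you might locate the knee from stepped-level test data. The numbers here are made up to resemble the graphs above, and the 1dB deviation cutoff is arbitrary; it’s an illustration, not how I generated the graphs:

```python
import numpy as np

# Hypothetical (tx level, rx level) pairs from a stepped-level test,
# shaped to resemble the graphs above; tx is on an arbitrary dB scale,
# rx is in dBFS.
tx = np.array([-45.0, -40.0, -35.0, -30.0, -27.0, -25.0, -23.0, -21.0])
rx = np.array([-23.0, -18.0, -13.0, -8.0, -5.4, -4.1, -3.0, -2.3])

# Fit the clearly-linear low-level region; ideally rx tracks tx at 1 dB/dB.
slope, intercept = np.polyfit(tx[:4], rx[:4], 1)
predicted = slope * tx + intercept

# Call the "knee" the first point where the measured level falls more than
# 1 dB below the linear fit, i.e. where compression/clipping sets in.
knee = tx[np.argmax(predicted - rx > 1.0)]
print(f"fit slope {slope:.2f} dB/dB; non-linearity starts near tx level {knee:.0f}")
```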
Since you can’t measure by how much a signal is overloading the receiver (and you can’t measure messages that you fail to decode at all), a proxy works better: assume there’s some distribution of signal levels, and count how many fall into the band that is “strong enough to be in the danger area, but not so strong that they couldn’t be received”. That count serves as a rough proxy for how many messages you’re losing to overload.
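A minimal sketch of that proxy, assuming you have per-message signal levels in dBFS (as the decoder reports for each successfully decoded message); the -3dBFS threshold is just the rule of thumb from this discussion, so treat the exact cutoff as tunable:

```python
def loud_fraction(levels_dbfs, threshold_dbfs=-3.0):
    """Fraction of decoded messages in the near-saturation 'danger zone'."""
    if not levels_dbfs:
        return 0.0
    loud = sum(1 for lvl in levels_dbfs if lvl > threshold_dbfs)
    return loud / len(levels_dbfs)

# If more than a few percent of decoded messages land above -3dBFS, some
# even stronger messages are probably being lost to overload entirely.
print(f"{loud_fraction([-12.1, -2.5, -18.0, -1.9, -30.4]):.0%}")
```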
The actual shape of the distribution of signal levels, and your tolerance for losing strong/nearby signals vs. losing distant signals, is going to vary from site to site.