I wonder if there is a significant difference in TV technology. Mine is an older 720p plasma. I recently installed a 15 MHz bandpass filter on the antenna closer to the TV, but before that I don’t remember reception being affected by the TV.
Now that I’m starting to understand wiedehopf’s graphs1090, these effects can be quantified. In this plot, the noise floor rose about 10 dB after I plugged the TV in, depressing the usable dynamic range from roughly 30 dB to 15 dB.
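As a sanity check on those numbers, the dB arithmetic can be sketched in a few lines (the dBFS levels below are hypothetical stand-ins, not read off my graphs). One thing it makes obvious: a 10 dB noise rise by itself would only shrink a 30 dB range to 20 dB, so losing 15 dB suggests the peak signal level also dropped by about 5 dB.

```python
def dynamic_range(peak_db: float, noise_db: float) -> float:
    """Usable dynamic range in dB: peak signal level minus noise floor."""
    return peak_db - noise_db

# Hypothetical dBFS figures chosen to match the 30 dB -> 15 dB observation:
before = dynamic_range(peak_db=-5.0, noise_db=-35.0)   # TV off
after = dynamic_range(peak_db=-10.0, noise_db=-25.0)   # TV on: noise +10 dB, peak -5 dB

print(before, after)  # 30.0 15.0
```

With graphs1090 this is just reading the gap between the signal and noise traces, but writing it out makes clear which trace moved by how much.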
The “shield up” point was when I used paper clips to connect the metal case to the exposed metal on the HDMI plug. The effect wasn’t as pronounced as when I hand-pressed a metal rod against it.
In comparison, the second receiver, 15 m and two or three drywalls away, shows no obvious change. (The two plots cover the same time period but are labeled in different local times.)
This is the one close to the TV AND with the narrowband filter. So I can do a similar test by swapping the filter to the other antenna and switching the TV on and off.