Questions and brainstorming on MLAT error/noise issues



I’ve noticed that MLAT data can be really jumpy (up to a mile between data points), even with >20 synchronized sites around KSEA. I’m curious whether the problem is incorrect/low-precision location data entered by individual piaware sites, or an inherent lack of precision in the hardware timers on RP2s. PCs have the RDTSC opcode to read the processor’s time stamp counter (TSC) directly, but there are issues there (readings differ depending on which core you happen to be executing on). There’s also the Intel HPET timer on PCs, which is supposed to be better. There’s a pretty good Windows-based article here: …

The ARM doesn’t have an equivalent of RDTSC, and I don’t know whether the RP2 has any other hardware that would help. Light travels about 300 m (roughly a fifth of a mile) in a microsecond, so there’s little room for error.
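To put a number on that, here’s a quick back-of-the-envelope sketch (just the vacuum speed of light; propagation through air is essentially the same):

```python
# Sketch: how a timing error maps directly to a multilateration range error.
C = 299_792_458.0  # speed of light, m/s

def range_error_m(timing_error_s: float) -> float:
    """Distance light travels during a given timing error."""
    return C * timing_error_s

print(range_error_m(1e-6))    # 1 microsecond  -> ~299.8 m (~0.19 mi)
print(range_error_m(100e-9))  # 100 nanoseconds -> ~30 m (~100 ft)
```

So even a 100 ns timing uncertainty costs you on the order of a hundred feet of position error.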

If the problem is bad location data entered by piaware feeders, could the ADS-B signals that are used to synchronize multiple MLAT stations also be used to precisely locate the piaware site? By looking at several ADS-B readings from several directions over time, you could compute the only location consistent with all those readings. Again, though, this only works if there’s a precise clock/timer on the RP2.
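For illustration only, here’s a toy 2-D sketch of that idea: given aircraft positions (known from their ADS-B broadcasts) and a range measurement to each, a least-squares fit recovers the one receiver position consistent with all of them. It’s deliberately simplified — real one-way timing also carries an unknown receiver clock offset, which this sketch assumes away:

```python
import math

def locate_receiver(anchors, ranges, guess=(0.0, 0.0), iters=20):
    """Gauss-Newton least squares: find the (x, y) whose distances to the
    known anchor points best match the measured ranges."""
    x, y = guess
    for _ in range(iters):
        # Accumulate the 2x2 normal equations J^T J d = J^T r.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (ax, ay), r in zip(anchors, ranges):
            d = math.hypot(x - ax, y - ay)
            if d == 0.0:
                continue
            jx, jy = (x - ax) / d, (y - ay) / d  # partials of distance
            res = r - d                          # measured minus predicted
            a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
            b1 += jx * res; b2 += jy * res
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-12:
            break
        dx = (a22 * b1 - a12 * b2) / det
        dy = (a11 * b2 - a12 * b1) / det
        x += dx; y += dy
        if math.hypot(dx, dy) < 1e-9:
            break
    return x, y

# Aircraft seen in three different directions (positions in km, made up).
anchors = [(10.0, 0.0), (0.0, 12.0), (-8.0, -9.0)]
true = (1.0, 2.0)
ranges = [math.hypot(true[0] - ax, true[1] - ay) for ax, ay in anchors]
print(locate_receiver(anchors, ranges))  # converges near (1.0, 2.0)
```

With aircraft spread around the receiver the geometry is good and the fit converges quickly; with all aircraft in one direction it degrades badly, which is the same geometry problem MLAT itself has.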

From a completely different angle, I’ve seen people experimenting with using an RTL2832U dongle to process GPS signals. If that were possible, then you would have precise time and location. Could enough GPS signal pass through a 1090MHz filter to allow this to work, even if the chip could be programmed to do it?

I did look at the MLAT code to try to answer some of this myself, but it quickly became clear that it is a very complex (and justifiably so) piece of software. Also, my system-level experience is all on Windows, which makes it more difficult.

In summary then:

  1. Are MLAT errors due to bad user-entered locations or a lack of precise clock hardware on RP2s?
  2. If bad location, could a series of ADS-B receptions from a variety of directions be used to determine the correct position?
  3. Could the RTL2832U be used as a GPS receiver to calibrate actual location/time from time to time?

As I said, more brainstorming than feature request. I think I’m particularly hoping OBJ is reading this…



  1. Probably a bit of both. Locations that are very wrong tend to get thrown out early on (they fail to synchronize); locations that are slightly wrong don’t get detected and add error.
  2. I did look at this a while back but it’s not really feasible, not to the accuracy you’d need to see those “slightly wrong” locations.
  3. No; the GPS signal is too far away (frequency-wise) from the ADS-B signal to hear both at the same time with the same dongle (plus any 1090MHz filters or tuned antennas will have a hard time hearing GPS in the first place). You could do it with a separate dongle and a lot of work, but at that point you are probably better off just plugging in a dedicated GPS (you can get stuff like this pretty cheaply: …). The real win would be if you could use the GPS time to timestamp samples as they come in, but that basically requires dedicated hardware that’s designed to do that.

nb: the approach taken for multilateration using the dongles does not depend on the host clock at all; it counts samples from the dongle to calculate intervals. The host clock isn’t useful here because there is an unknown, variable delay between the signal arriving at the dongle and the samples arriving over USB.
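In other words, timing comes from the dongle’s own sample clock. A minimal sketch of the idea, assuming a 2.4 MS/s sample rate (an assumption; only intervals between messages are meaningful until the server synchronizes receivers against ADS-B aircraft):

```python
# Sketch: timestamps derived by counting dongle samples rather than
# reading the host clock. The 2.4 MS/s rate is an assumption for
# illustration; the absolute epoch is unknown, only intervals matter.
SAMPLE_RATE = 2_400_000  # samples per second (assumed)

def interval_us(sample_a: int, sample_b: int) -> float:
    """Elapsed time between two messages, from their sample indices."""
    return (sample_b - sample_a) / SAMPLE_RATE * 1e6

# Two messages 240 samples apart arrived 100 microseconds apart.
print(interval_us(1_000_000, 1_000_240))  # -> 100.0
```

At 2.4 MS/s one sample is ~417 ns, which is why sub-sample interpolation and careful synchronization matter so much for MLAT accuracy.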

The jumpy locations are probably (it’s always hard to tell without spending a day looking at the raw data) mostly a combination of poor receiver geometry and multipath issues.
That said, do you have an example of a bad track? (given a timestamp and the ICAO, I can go back to the raw results)


Ahhh, the time difference is provided by the dongle itself. OK, that makes sense. How about this one:


Pretty sure it was last Sunday (Nov 29), but could have been last Saturday.


PS: Could not figure out how to embed the image in the reply.


Unfortunately that’s slightly too old for me to easily look at (the data is archived after a week).


… is there really no conceivable way to rewrite the timestamps (e.g. based on a GPS mouse connected to the Raspberry Pi)? That would allow running two or three dongles in parallel and doing MLAT from their aggregated messages.


Rewriting the timestamps is essentially what the mlat server does all the time anyway. You need a synchronization point in the sample stream to be able to do that; the mlat server uses ADS-B-equipped aircraft. If you knew that the dongles were feeding from the same physical location, other approaches would be possible - but this is very much an edge case.
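A toy illustration of that synchronization step: if you have events whose true time on a common timebase is known (in practice, transmissions from ADS-B aircraft at known positions), you can fit a linear clock model that maps one receiver’s free-running sample counter onto that timebase. All the numbers below are made up:

```python
# Sketch: map a receiver's sample counter onto a common timebase by
# fitting a linear clock model (drift + offset) to known sync points.
def fit_clock(sync_points):
    """Least-squares fit of t = a * samples + b from (samples, t) pairs."""
    n = len(sync_points)
    sx = sum(s for s, _ in sync_points)
    st = sum(t for _, t in sync_points)
    sxx = sum(s * s for s, _ in sync_points)
    sxt = sum(s * t for s, t in sync_points)
    a = (n * sxt - sx * st) / (n * sxx - sx * sx)  # seconds per sample
    b = (st - a * sx) / n                          # offset
    return a, b

# Hypothetical receiver running 100 ppm fast relative to the reference.
points = [(0, 10.0), (2_400_000, 10.99990), (4_800_000, 11.99980)]
a, b = fit_clock(points)

def to_common_time(samples):
    return a * samples + b

print(to_common_time(1_200_000))  # ~10.49995 on the common timebase
```

The real server solves a harder version of this continuously, for every receiver pair, with noisy sync points - but the basic shape of the problem is the same.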


My idea was that running 3 cheap dongles on one raspi and aggregating the messages before MLAT would produce a lot more messages, and could compete much better against really expensive solutions with dedicated hardware …


I doubt it will help; but implement it and prove me wrong.


The more I learn about how MLAT is implemented the more impressive it is that it was possible at all with such limited radio hardware available.

Even if I got a dedicated RP2 GPS HAT, there would be no way to synchronize between plane signals arriving at the RTL2832U and the very precise timestamps generated by the GPS. If the USB latency were very consistent it could perhaps be accounted for, but it only takes hundreds of nanoseconds to give errors of hundreds of feet, and the WiFi adapter is on the USB bus as well. Now if someone created an RP2 HAT that had both the RTL2832U and the MT3339 GPS chipset, so that Mode S messages could get that precise timestamp on receipt, then very precise results should be possible.

BTW, when I look at this flight’s summary, … /KOAK/KSEA, it doesn’t show all the jumping around. So I guess FAA data is used after the fact for this historical flight-track data?



Isn’t that where the RadarCape comes in? Admittedly it’s built for the BeagleBone, not the Pi2 - but look at the price of the thing.