Tool for decoding beast format ADS-B data?


Hi there, and please forgive me if this is something already answered; I couldn't find it in the forums with some searching…

What I'd like to do is capture the output data from dump1090 in a human-readable format, then take that output and (on another Pi attached to a big hard drive) save it into a MySQL database or something similar. The idea being that for each complete collection event of the following:

Date/Time, Callsign, Lat/Long, Alt, Hdg, Speed

I am able to input those values into a new row in the database (along with a unique ID, of course). My current setup collects about 60,000 positions a day, which should give me about 2 million records a month. My plan is to use that 'bulk data', if you will, to draw out stats about airlines with common routes, and to answer things like:

How close in three dimensions do their aircraft fly along their set routes? How much does the speed of the aircraft vary (e.g. due to weather)? How punctual are they (e.g. over the same spot at the same time each time the route is flown)?

I'd also like to be able to run some real-time analytics on the data as it's being piped to the database - e.g. look for any airliners whose rate of descent is more than 4,000ft/min (possible emergency situation, therefore send an SMS from the Pi alerting me), and look for aircraft of type 737/A320 with an airspeed of less than 130kts above 10,000ft (similar to before).

So my question really is - is there a tool that can convert beast format coming out of port 30005 into human readable format (I can do the text manipulation etc in python on the human readable bit, I just need to get it there first)?

Thanks again for your help.



The output on port 30003 will probably give you what you want - this is “basestation format” output, which is decoded messages in a CSV format.
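To make that concrete, here is a minimal Python sketch of reading the port 30003 stream. The host and port are assumptions (dump1090 on the same machine, default basestation port) - adjust for your setup:

```python
import socket

# Minimal sketch: connect to dump1090's port 30003 and yield each
# basestation ("SBS") CSV line as a list of fields. Host and port
# are assumptions -- adjust for your setup.
HOST, PORT = "localhost", 30003

def parse_sbs(line):
    """Split one basestation CSV line into its fields."""
    return line.strip().split(",")

def read_sbs(host=HOST, port=PORT):
    """Yield parsed lines from a live port-30003 connection."""
    with socket.create_connection((host, port)) as sock:
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                return
            buf += chunk
            while b"\n" in buf:
                raw, buf = buf.split(b"\n", 1)
                yield parse_sbs(raw.decode("ascii", errors="replace"))
```

Usage would be something like `for fields in read_sbs(): print(fields)`; field 5 of the CSV (index 4 after splitting) is the aircraft's hex ident.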


Hi, thanks! I just took a look using netcat and that does look a lot closer to what I’m looking for! Now to code some python to grab it and format it for MySQL!


Incidentally, is there an RFC-like document that describes what a full basestation format output looks like? Also, the hex value of the fifth field (values like 4CA97C, AD8EDE, 45CE4F) - I'm sure that relates to flight-specific information. Does it translate into a flight number, or a tail number?


Oh, and sorry for bombarding you with questions, but if I sit and watch my web interface (piaware:8080) and follow an EasyJet flight, I can have a netcat session open on 30003 piping to grep, which filters out all but the flight I want… the web interface shows regular updates on position and altitude, but the raw data on 30003 doesn't reflect it.


Getting there: … _dump1090/


The 6 digit hex value is the 24-bit ICAO address - this is the unique identifier in ADS-B that identifies the physical aircraft.
This is what you want to be filtering for when grepping (and, later, keying your database by) - every message has the ICAO address but not all messages have a callsign etc.
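Filtering on that field in Python instead of grep might look like the following sketch (the ICAO value is just an example from this thread):

```python
# Hypothetical sketch: keep only basestation lines for one ICAO
# address. Field 5 (index 4 after splitting) is the hex ident.
WANTED = "4CA97C"  # example ICAO address from this thread

def for_aircraft(lines, icao=WANTED):
    """Yield parsed fields only for the aircraft we care about."""
    for line in lines:
        fields = line.split(",")
        if len(fields) > 4 and fields[4].upper() == icao.upper():
            yield fields
```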


I uploaded some old code - I haven't run it in a long time, as I've been away from the radios. It's in Java, and you probably don't want that, but maybe the database SQL sample will give you some Python ideas.

Basically I connect to port 30003, throw away all the MSG,8 poop, and parse the CSV into table data.

Then there are some triggers that fire after inserting and updating.
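A rough Python translation of that approach might look like this - sqlite3 is used only so the sketch is self-contained (you'd swap in a MySQL driver such as PyMySQL for the real thing), and the column set is just an illustration:

```python
import sqlite3

# Sketch: drop MSG,8 lines and parse the remaining basestation CSV
# into a table. sqlite3 stands in for MySQL to keep it runnable.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE messages (
    icao TEXT, msg_type TEXT, callsign TEXT,
    altitude TEXT, speed TEXT, track TEXT, lat TEXT, lon TEXT)""")

def insert_message(line):
    f = line.strip().split(",")
    if f[0] != "MSG" or f[1] == "8":   # throw away the MSG,8 traffic
        return
    f += [""] * (22 - len(f))          # pad short lines so indexing is safe
    conn.execute("INSERT INTO messages VALUES (?,?,?,?,?,?,?,?)",
                 (f[4], f[1], f[10], f[11], f[12], f[13], f[14], f[15]))

insert_message("MSG,3,1,1,4CA97C,1,2016/01/01,12:00:00.000,"
               "2016/01/01,12:00:00.000,,37000,,,51.5,-0.1,,,,,,0")
```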


Thanks for that. I'll look to see if I can find a way of parsing it into something like a tail number, or failing that, scan the ADS-B data to see when a hex was correlated with a callsign and then insert that callsign for entries that didn't have it. Perhaps…

That’s excellent, thanks, it’s given me a lot to consider on the design of the SQL database.


There is no universal way of turning an ICAO address into a tail number (or vice versa). Some countries have a fixed mapping between the two; others just allocate addresses as needed, so you need a registry to convert.


Yeah, I have since discovered that, and to that end I'll essentially have to do some coding that matches hex values to flight/tail numbers by grabbing them from the MSG,1 messages.

Looking at what comes out of the basestation format on 30003, I’ve deduced the following:

In the CSV format, the fields are as follows:

5 - Hex
7 - Date
8 - Time
11 - C/S
12 - Alt
13 - Speed
14 - Hdg
15 - Lat
16 - Long

The trouble is, you don’t get all of them in every message. Only the hex is broadcast with every message.

You get the following depending on which message the plane is sending:

MSG,1 - Gives the callsign (flight ID). Seen as frequently as every five seconds in some cases.
MSG,3 - Gives alt and position
MSG,4 - Gives speed and course.

As the hex is common to all of them, I'll need to write something that watches MSG,1 until it sees a hex and a callsign together, then associates the two, with the association set to expire in 30 minutes. Every time the correlation is seen again, the keepalive is extended by another 30 minutes, and all records inserted with that hex are given the associated callsign too.
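A hedged sketch of that expiring association in Python (all names made up; the fake clock is just to keep the example testable):

```python
import time

# Sketch: each time a hex and callsign are seen together in a MSG,1,
# the mapping is (re)stamped; lookups older than 30 minutes expire.
TTL = 30 * 60  # seconds

class CallsignCache:
    def __init__(self, ttl=TTL, clock=time.time):
        self.ttl, self.clock = ttl, clock
        self._map = {}  # icao -> (callsign, last-seen timestamp)

    def seen(self, icao, callsign):
        """Record or refresh an icao -> callsign association."""
        self._map[icao] = (callsign, self.clock())

    def lookup(self, icao):
        """Return the callsign, or None if unknown or expired."""
        entry = self._map.get(icao)
        if entry is None:
            return None
        callsign, stamp = entry
        if self.clock() - stamp > self.ttl:
            del self._map[icao]
            return None
        return callsign
```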

It may then be that I wait until a new hex and callsign are seen together for the first time, then use that to generate a unique flight ID - e.g. RYR317_DDMMYY_1 - flight number, date, sequence number (in case the same service runs many times a day). Once an entry has been made in the database for the unique flight, correlated hex, speed, course, heading etc. can all be entered into that one flight's entry to create a log of its progress.
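The ID scheme above could be generated like this (illustrative only; the function name and argument shapes are made up for the sketch):

```python
from datetime import date

# Build a unique flight id like RYR317_DDMMYY_1 from callsign,
# date, and a per-day sequence number.
def flight_id(callsign, when, seq):
    return f"{callsign}_{when.strftime('%d%m%y')}_{seq}"
```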

Any thoughts are welcome.


Yep. This reflects what's in the underlying raw messages - there is no single on-the-air message that carries all the data. Each line on port 30003 reflects just one on-the-air message, so if you want to gather all the data being sent, you need to accumulate data from multiple messages over time.

If you don’t care about seeing the individual messages then there are other, more processed, forms available (e.g. faup1090’s output, or polling the contents of aircraft.json via the web interface or written to the filesystem). But you lose a little detail as it’s a more abstract view of what the aircraft is doing.


Thanks again for the advice. I'll take a look at those and have a think about what I might write in Python to grab the raw data, hold some of it in memory for a short period until the messages have cycled through, and write a complete dataset to the database once one is held. Admittedly there will be a bit of fudge in there, as the speed and course might come a few seconds after the callsign, and the altitude a few seconds after that, etc., but it should (hopefully) give an accurate full set of readings about once every ten to fifteen seconds, which is fine for my purposes.


Actually, I think I might be going about this the wrong way. It doesn’t take long to get an initial full record of icao, date, time, c/s, alt, lat, long, course and speed.

I could hold it all in memory until the first complete record is built and write that to the database. From then on, any time any data is updated, I duplicate the entire entry on a new row, replacing only the fields that have new data - i.e. copy the row, but swap in new values as soon as they're received. I think I'll do it that way.
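A minimal sketch of that accumulate-until-complete idea (field names and the callback are assumptions for the example; the callback would be a database INSERT in practice):

```python
# Keep a partial record per ICAO in memory; once all fields are
# present, emit a snapshot, and emit a fresh copy on every update
# after that.
FIELDS = ("callsign", "alt", "lat", "lon", "track", "speed")

class Accumulator:
    def __init__(self, emit):
        self.emit = emit      # callback, e.g. a database INSERT
        self.records = {}     # icao -> dict of latest values

    def update(self, icao, **new_values):
        rec = self.records.setdefault(icao, {})
        rec.update(new_values)
        if all(f in rec for f in FIELDS):
            self.emit(icao, dict(rec))  # snapshot the full row
```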


I picture the database as memory. Don’t waste RAM when you can use disk :slight_smile:

When I detect a new ICAO, I give it a new flight_id and store whatever I get in the target table. Then when I get more data for the same ICAO, I just update the row.

If I get a new lat/lon position, I create a new row in the targetecho table. That way I have a target history from detect to fade, that can be used in a display program (for example).

You can also create a callsign table and a modes table, the modes table being used to store registration and ownership as you figure them out by other means.
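One way that table layout might look, sketched with sqlite3 so it runs anywhere (table and column names are guesses based on the description above, not the original schema):

```python
import sqlite3

# Guessed schema for the target / targetecho / modes layout.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE target     (flight_id INTEGER PRIMARY KEY, icao TEXT,
                         callsign TEXT, altitude INTEGER, speed INTEGER);
CREATE TABLE targetecho (flight_id INTEGER, lat REAL, lon REAL,
                         seen_at TEXT);  -- position history, detect to fade
CREATE TABLE modes      (icao TEXT PRIMARY KEY, registration TEXT,
                         owner TEXT);    -- filled in by other means
""")
```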

Just some ideas…


If you want simple, have a Python script grab the aircraft.json file that is used to provide data to the local Google map. From there you could connect to your MySQL database and insert/update the data. Or create a request object in the Python script and send the JSON to a PHP script; the PHP script receives the JSON, then handles the insert/update into the database.
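A sketch of the aircraft.json approach - the URL is an assumption (PiAware-style web layout), and JSON field names vary a little between dump1090 versions:

```python
import json
import urllib.request

# Poll dump1090's aircraft.json; the same file is often also written
# to disk, so reading it from the filesystem works too.
URL = "http://localhost:8080/data/aircraft.json"

def fetch_aircraft(url=URL):
    """Return the list of aircraft entries from aircraft.json."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp).get("aircraft", [])

def summarize(ac):
    """Pick out the fields we care about from one aircraft entry."""
    return (ac.get("hex"), (ac.get("flight") or "").strip(),
            ac.get("altitude"))
```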



This tool is now available on GitHub: