Updating local DB

Hi,

is it possible to update the local DB, which seems to be a bit old? Aircraft types etc. are missing that are shown on other sites.
I’m using the latest PiAware Release.

1 Like

The official version is up to date with regard to the sources it has access to
(i.e. sources that allow redistribution).

Here is a guide on how to change the database, good luck.

1 Like

Thanks, I will check it or leave it as it is.

1 Like

Also see: Updating the dump1090-fa database

Maybe you can get together on the project.
I’m still curious if the database VRS is using is somehow accessible.
Might also ask at adsbexchange if they would consider publishing their database.

But they might both be limited in what they can redistribute by which data sources they use.

2 Likes

I tried extracting the data from the SQL database in FlightAirMap, but it doesn't contain what is required, so I wrote a script to extract the data from the VRS database instead.

My VRS database is 25 gigabytes and the export file was about 500 megabytes.

That has to be passed through a mixture of Python 2, Python 3 and JavaScript scripts before it can be used in dump1090-fa. It seemed to work fine, but when I put it on my Pi it works for a while and then crashes.

I abandoned the project due to other commitments and can’t find some of the information but I can send what I have to anyone who would like it.

2 Likes

As far as I can tell, the data in the VRS database is publicly available, so it can be used for non-commercial purposes without any problems. I have found the script I wrote and will run it to see what I get out of my database now that I have two receivers feeding it. Let me know if you would like a copy.

1 Like

Whatever works

I just came to that point because I see planes with missing/wrong information where the other sites have the information in place.

1 Like

Just ran my export script and it looks like my memory is bad, because the resultant CSV file is actually 18 megabytes, but it contains in the region of 470,000 records.
edit: make that 415,730 - told you my memory was bad.

1 Like

Oh well, I'll take a look.

Why are there so many entries? Shouldn't it be one entry per airframe?
(don’t think there are quite that many aircraft, are there?)

1 Like

Zip containing sqlite3, the script and the exported CSV here: export vrs zip

The script obviously needs altering to match your setup. Run it with 'sqlite3 < vrs_to_dump1090-fa.sql'. You need to shut down VRS first if the database is locked.
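For anyone who just wants the general idea before downloading: the core of such an export is a query against the Aircraft table that VRS keeps in its BaseStation-style SQLite database. The following is only a rough sketch assuming the usual BaseStation.sqb schema and column names (ModeS, Registration, ICAOTypeCode, Type); the actual script in the zip may look different.

# Rough sketch only - path, schema and column names are assumptions
sqlite3 /path/to/BaseStation.sqb <<'EOF'
.mode csv
.output vrs_export.csv
SELECT ModeS, Registration, ICAOTypeCode, Type
  FROM Aircraft
 WHERE ModeS IS NOT NULL AND ModeS <> '';
EOF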

1 Like

Just noticed something in the export that might break dump1090-fa if it gets through the next processing steps - there are a couple of records that contain what would surely constitute illegal data.

I wonder if cleaning it up would solve the problem?

1 Like

Something like 250k in the US alone.

Some of the 400k VRS entries are likely to be deterministic hexid/tail number mappings rather than real aircraft (the dump1090 toolchain filters those out when there’s no additional data and the mapping is predictable)

1 Like

I’m pretty sure I’ve seen 'planes here in the UK that have been acquired and re-registered to UK operators that are shown as having their original Nxxxxx tail number.

I assume this is due to dump1090 automatically determining the ‘tail numbers’ from the hex id rather than just the database being out of date?

1 Like

If they’re still transmitting the old US-allocated hexid, then that’s an avionics problem, yeah. It’s a fairly common problem with tail number changes.

1 Like

I’d always assumed the hex id was permanently assigned to the airframe and never changed but you seem to be saying that is not the case.

1 Like

The address is allocated by the country responsible for registering the aircraft, from a block of addresses assigned to each country. That address has to change if the aircraft registration moves to a different country, and can change even within a single country (e.g. in the US, if the aircraft changes to a different N-number, the address will also change).

Only one airframe should be using a particular address at any one time, but an airframe might change addresses over its lifetime, and a single address might get reused for different airframes.
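To illustrate the allocation idea with a couple of well-known blocks (illustrative ranges only, not the full ICAO allocation table), a quick bash check might look like this; the hex ID used is just an example:

# Classify a hex ID against a few example country blocks (illustrative only)
HEX=A78FE3
val=$((16#$HEX))
if [ "$val" -ge $((16#A00000)) ] && [ "$val" -le $((16#AFFFFF)) ]; then
    echo "$HEX: US-allocated block"
elif [ "$val" -ge $((16#400000)) ] && [ "$val" -le $((16#43FFFF)) ]; then
    echo "$HEX: UK-allocated block"
elif [ "$val" -ge $((16#3C0000)) ] && [ "$val" -le $((16#3FFFFF)) ]; then
    echo "$HEX: Germany-allocated block"
else
    echo "$HEX: some other allocation block"
fi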

3 Likes

I knew this was the place to hang out. You learn something in every session.

Thanks.

4 Likes

What do you mean?

Anyway I've processed all three CSVs, giving your CSV the highest priority if there are duplicates.
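(For anyone wanting to reproduce that kind of merge, something along these lines would do it, assuming the hex ID is in the first column and the highest-priority CSV is listed first; the file names are placeholders:)

# Keep only the first record seen per hex ID; list files in priority order
awk -F, '!seen[toupper($1)]++' vrs_export.csv other_db_1.csv other_db_2.csv > merged.csv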

Working fine for me so far.
You can try it like this:

Edit: File removed, this won’t work anymore

wget -O /tmp/db.zip https://github.com/wiedehopf/fluffy-blobs/raw/master/db.zip
sudo rm /usr/share/dump1090-fa/html/db/*.json
sudo unzip -d /usr/share/dump1090-fa/html/db /tmp/db.zip

It’s probably not necessary to get rid of the old json files but it doesn’t hurt either.

1 Like

There are a couple of records that have spurious quote marks in them, and the records at lines 285506/7 should be A78FE3,N58695,ZZZZ,"Timm N2T-1 Tutor".

I don’t know how the conversion tools would deal with that.

As I said, it ran fine on mine for a few hours and then hung so it will be interesting to see what happens for you.

edit: I have fixed the errors in the export file and now I want to try and fix them in VRS.

1 Like

I’m not sure how it would hang after a few hours.

As in you couldn’t load the page anymore?

I see. The filter script removes all quote marks but a few.
I can fix the rest up manually.
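(If fixing the rest by hand gets tedious, stripping every double quote would work too, assuming none of the fields actually need quoting because they contain embedded commas; the file name is a placeholder:)

# Remove all remaining double quotes in place (GNU sed, as on the Pi)
sed -i 's/"//g' vrs_export.csv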