Data clarification

Hi! Could you please clarify?
Why is the timestamp in “timeAtServer” less than the timestamps in the sensors’ “measurements”?
I thought the signal from the aircraft goes to the sensors (giving the timestamps in “measurements”), and then the information about the new entry goes to the server (giving the timestamp in “timeAtServer”).


timeAtServer is a time in seconds from 0 to 3600, since we are working with one hour of data; 0 is the first entry in the file. I do not know exactly how this time is computed.

The timestamps are the times (in nanoseconds) at which the different receivers received the same message. We do not have the time at which the aircraft emitted the message.


@richardalligier, Thank you.

I understand that the timestamps in “measurements” are the times of arrival of the signal at the receivers (in nanoseconds).

According to the documentation, “timeAtServer” is the time when the information was received by OpenSky’s ingestion server.

In the data, the sensor timestamps increase from ~0.9 s to ~3600.9 s; timeAtServer runs from 0 to ~3600 s. So I assumed that both cover the same one-hour window. Is that correct?

So, the question is: if both timestamps for the same aircraft fall within the same hour, and the server receives its data from the sensors, why is the timestamp of the server receiving the information less than the timestamp of the signal arriving at the sensors? That would suggest the information reached the server before the signal reached the sensors.

@masorx, could you please comment?



Yep, all in the same hour. It’s basically an indication of the sorting.

So, you are confused about the sensor timestamps? Are you aware that these are not synchronized with each other? Did you read the examples here, in particular the part about timestamps:

Yes, I am confused about the difference between the sensor timestamps and the timeAtServer timestamps for the same observation. I read the documentation and understand that “the timestamps of some sensors are broken”.

At the same time, the Overview page tells us that in the first round “the competitors do not have to put any effort into sensor time synchronization”. I assumed this means we do not have to put any effort into time synchronization.

But, for example, in the data we have (round1_competition.csv, id=10):
timeAtServer = 0.0119998455047607 (s)
timeAtSensor = 1.000432156 (s)
1.0004 > 0.012, so timeAtSensor > timeAtServer.
The question is: why?

The issue is that the sensors don’t have an absolute time reference (e.g. Unix time) but an internal clock, which rolls over. timeAtServer is set to start at zero; the sensor values could start at any point, it’s just what they deliver. I think the misunderstanding comes from there: the sensors are (supposed to be) synchronized via GPS, i.e. their clocks run at more or less the same speed, since they derive it from GPS. That doesn’t mean they have all been set to the same value at the start of the hour.
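The point above can be sketched in a few lines: a GPS-disciplined sensor clock shares the server clock’s *rate* but not its *zero point*, so a sensor timestamp can exceed timeAtServer even though the signal physically reached the sensor first. The sensor names and offset values below are invented for illustration; only the column names (timeAtServer, the measurement timestamps) come from the dataset description.

```python
# Illustrative sketch (synthetic values, not from round1_competition.csv):
# each sensor's clock started at an arbitrary value, but runs at the same
# rate as the server clock because it is GPS-disciplined.

server_times = [0.012, 100.5, 3599.9]   # timeAtServer, seconds since file start

# Hypothetical per-sensor clock offsets (unknown in the real data).
sensor_offsets = {"sensorA": 0.988, "sensorB": 523.4}

def sensor_timestamp(sensor, server_time):
    """Timestamp this sensor would report for a message logged at server_time,
    ignoring propagation delays (which are microseconds, not seconds)."""
    return server_time + sensor_offsets[sensor]

for t in server_times:
    ts_a = sensor_timestamp("sensorA", t)
    # The sensor value exceeds timeAtServer purely because of the offset,
    # e.g. 1.000 > 0.012 for the first entry in the thread's example.
    assert ts_a > t
    # The offset is constant over the hour, so differencing recovers it:
    assert abs((ts_a - t) - sensor_offsets["sensorA"]) < 1e-9
```

So comparing a raw sensor timestamp with timeAtServer is comparing two clocks with different zero points; only differences *within* one sensor’s clock (or between clocks after estimating the offsets) are meaningful.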