How Inaccurate Timekeeping can Unravel the Metaverse
We were not celebrating this New Year’s Eve. As the clock ticked down to midnight on 31 December 1999, we became increasingly anxious. How would the world’s networks react when the millennium (computer) bug awoke?
We had been preparing for years to confront this moment of truth. The bug had inadvertently been introduced by computer programmers who, for reasons of simplicity and compactness, had been using two digits instead of four to denote the year (like “94” instead of “1994”) in their programs. What would happen when computer clocks clicked over to the new century? Would bank systems start billing clients for negative interest payments? Would air travel systems understand that this “00” year meant 2000 and not 1900, a time before airplanes had been invented? Would the internet collapse?
This event came to be known as Y2K. Everyone – managers, programmers and network administrators – labored feverishly to try to avert disaster. And whatever they did to make systems “Y2K compliant” pretty much worked because the world’s networks did not crash in a heap at exactly 00:00:01 on 1 January 2000.
So what’s the big deal about time? Last week we talked about how GPS relies on atomic clock accuracy. That’s because even tiny time inaccuracies can result in huge locational errors.
Now let’s talk about your computer. Its CPU (Central Processing Unit) is paced by a clock: an oscillator whose continuous waves are converted into digital pulses. A typical clock speed is 2 GHz, or 2 billion pulses per second. This clock speed determines how quickly a computer can retrieve and execute instructions, and therefore is one – but not the only – measure of how fast a computer operates. A separate, battery-backed Real Time Clock (RTC) keeps track of the date and time, even when the computer is powered off.
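To put that speed in perspective, here is a quick back-of-the-envelope sketch. The 2 GHz figure is just the example from above, not a measurement of any particular machine:

```python
# Illustrative figures only: relate the 2 GHz clock speed mentioned above
# to the duration of a single clock pulse.
clock_speed_hz = 2_000_000_000          # 2 GHz = 2 billion pulses per second

cycle_time_s = 1 / clock_speed_hz       # duration of one pulse, in seconds
cycle_time_ns = cycle_time_s * 1e9      # the same duration, in nanoseconds

print(f"{cycle_time_ns:.1f} ns per pulse")
```

In other words, at 2 GHz each pulse lasts only half a nanosecond – each tick gives the processor a window of roughly 0.5 billionths of a second in which to do work.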
The clock’s function in life is, obviously, to keep time. It synchronizes activities across the computer so that things happen in the order they are supposed to happen. It date- and time-stamps your emails and all the other files you create and edit. It helps the computer keep track of tasks that are scheduled, like software updates and programs that check computer health and status.
Left to its own devices, the clock in your computer could drift a couple of seconds each day away from the correct time. Over a year, that could add up to an inaccuracy of over 12 minutes! For this reason, your computer gets clock corrections from the networks to which it is linked. When your computer isn’t connected to any network, its internal clock keeps time for you until it can reconnect and get a more precise time update.
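That 12-minute figure is easy to check with a little arithmetic, using the rough two-seconds-per-day drift mentioned above:

```python
# Back-of-the-envelope drift estimate, using the illustrative figure of
# roughly 2 seconds of drift per day from the text.
drift_per_day_s = 2
days_per_year = 365

yearly_drift_s = drift_per_day_s * days_per_year   # 730 seconds
yearly_drift_min = yearly_drift_s / 60             # convert to minutes

print(f"Drift after one year: about {yearly_drift_min:.1f} minutes")  # about 12.2 minutes
```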
Why is this important? On the internet, software tools, financial websites, or demo apps may require valid credentials – including an accurate time – before granting you access. If your computer clock is wildly off compared to the time the internet says it is, you will be denied access. Furthermore, accurately time-stamped information can be essential to restoring the functionality of your computer after a crash or a cyber-attack.
The network shares timing information with your computer using a protocol. A protocol is an agreed-upon set of procedures that devices follow in order to interoperate. For timing, a network uses something like the Network Time Protocol (NTP) to synchronize your computer’s clock with more accurate clocks.
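To make the idea of a timing protocol concrete, here is a minimal sketch in Python of what an NTP client message looks like, and of how NTP’s timestamps relate to the Unix time your computer uses internally. It builds the 48-byte request header defined by the SNTP specification (RFC 4330), but it does not contact a real server:

```python
# A minimal sketch of the NTP/SNTP message format (RFC 4330). It builds a
# client request packet and converts an NTP timestamp to Unix time; it does
# not talk to a real time server.
NTP_TO_UNIX_EPOCH = 2_208_988_800  # seconds between 1900-01-01 (NTP epoch) and 1970-01-01 (Unix epoch)

def build_request() -> bytes:
    # The first byte packs Leap Indicator = 0, Version = 4, Mode = 3 (client).
    # The rest of the 48-byte header can be left as zeros in a request.
    return bytes([(0 << 6) | (4 << 3) | 3]) + bytes(47)

def ntp_seconds_to_unix(ntp_seconds: int) -> int:
    # NTP counts seconds from 1900; Unix time counts from 1970.
    return ntp_seconds - NTP_TO_UNIX_EPOCH

packet = build_request()
print(len(packet))                         # 48
print(ntp_seconds_to_unix(3_913_056_000))  # 1704067200, i.e. 2024-01-01 00:00:00 UTC
```

A real client would send this packet over UDP to port 123 on a time server, then read the server’s timestamps out of the 48-byte reply.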
As the protocol standard tells us:
The goal of the NTP algorithms is to minimize both the time difference and frequency difference between UTC and the system clock.
When these differences have been reduced below nominal tolerances, the system clock is said to be synchronized to UTC.
What’s UTC? UTC stands for Coordinated Universal Time. It is determined by the International Bureau of Weights and Measures (BIPM) in Sèvres, France. UTC is based on astronomical data about the Earth’s rotation and readings from hundreds of atomic clocks maintained by time-standard organizations worldwide, like the US Naval Observatory (USNO).
The NTP model relies on a network of primary time servers. In the US, these servers use the USNO’s real-time estimate of Coordinated Universal Time, known as UTC(USNO). The USNO employs dozens of independently operating atomic clocks to produce this estimate and shares its readings with the BIPM. And even though UTC(USNO) drifts from UTC by only a handful of nanoseconds (10⁻⁹ seconds) each year, it is still periodically adjusted to stay aligned with UTC.
NTP can be used by client computers to maintain system time synchronization with the USNO master clock. NTP runs as a client program on a computer. It sends periodic requests to one or more servers, obtaining time stamps and using them to adjust the client’s clock. The typical accuracy achieved is in the range 1-30 milliseconds.
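The way those time stamps are used can be sketched with NTP’s standard four-timestamp calculation (RFC 5905). The numbers below are invented purely for illustration:

```python
# Sketch of the clock-offset calculation NTP performs on each exchange,
# using the standard four-timestamp formula from RFC 5905. All times are
# in seconds; the example values are made up for illustration.

def ntp_offset_and_delay(t1, t2, t3, t4):
    """t1: client sends request, t2: server receives it,
    t3: server sends reply,   t4: client receives it."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # how far the client clock is off
    delay = (t4 - t1) - (t3 - t2)          # round-trip network delay
    return offset, delay

# Example: the client's clock runs 0.050 s behind the server, and each
# network leg takes 0.010 s.
offset, delay = ntp_offset_and_delay(t1=100.000, t2=100.060, t3=100.061, t4=100.021)
print(round(offset, 3), round(delay, 3))  # 0.05 0.02
```

In practice, NTP repeats this exchange with one or more servers and filters the results, which is how it reaches the millisecond-level accuracy mentioned above.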
What have we learned? Computers and the internet need accurate time measures. The general public first became aware of the importance of time during Y2K. Programmers’ failure to record an accurate year was actually a failure to record an accurate time. After all, the year 2000 began 2000 x 365 days/year x 24 hours/day x 60 minutes/hour x 60 seconds/minute = 6.3 x 10¹⁰ seconds after the year 0 CE began. Today, extremely accurate time synchronization is essential to the correct functioning of all our everyday technologies, like computers, cell phones, and GPS. Thank goodness we have things like UTC, time standard organizations, and timing protocols to keep our networks synchronized and humming.