This is release 2. It includes an updated ClockTracker application, and 96 hours of data gathered from Dec. 30, 2016 to Jan. 2, 2017.
The time-of-day clock of a Windows PC running in a domain does not pass time at a constant speed. Instead, it speeds up and slows down in order to stay synchronized with the domain controller.
This is shown by comparing long-term data from both the time-of-day clock and the Win32 Multimedia timer.
The download includes:
- the source code for the ClockTracker application used to acquire the data sets
- an executable version of the application
- the collected data sets #1 and #2 (described below)
- the leap second data (see the addendum below)
This article shows the results of comparing two sources of time information under Windows: the periodic timer and the time-of-day value.
The periodic timer, also known as the system timer, is available via the Win32 Multimedia Timers library. It was consistently configured to have a granularity of one millisecond. The time-of-day value comes from the .NET DateTime.Now property.
One would expect the relationship between the periodic timer and the time-of-day clock to be linear. That is, for each tick of the periodic timer, the time-of-day clock increases by a fixed delta, equal to the true time difference between periodic timer ticks.
This is NOT the case. For a Windows PC in a domain, the time-of-day clock, relative to the periodic timer, speeds up and slows down numerous times in any 24-hour period.
Given the following values:
P0 = initial periodic timer value
T0 = initial time-of-day value
Pi = i-th periodic timer value (i > 0)
Ti = i-th time-of-day value (i > 0)
The difference between the time-of-day clock and the periodic timer as of the i-th sample is Di, calculated as:
DPi = Pi - P0
DTi = Ti - T0
Di = DTi - DPi
The DPi, DTi and Di values are all converted as necessary to be expressed in milliseconds.
The data files included in the download contain only the values Pi and Di.
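In code, the calculation above can be sketched as follows (plain Python for illustration; the sample values are invented, not taken from the data sets):

```python
def clock_diffs(p_samples, t_samples):
    """Compute Di = (Ti - T0) - (Pi - P0) for each sample.

    p_samples: periodic timer readings, in milliseconds
    t_samples: time-of-day readings, in milliseconds
    """
    p0, t0 = p_samples[0], t_samples[0]
    return [(t - t0) - (p - p0) for p, t in zip(p_samples, t_samples)]

# Hypothetical samples taken one second apart: the time-of-day clock
# gains 1 ms on the periodic timer at the third sample, then loses it.
p = [0, 1000, 2000, 3000]
t = [500, 1500, 2501, 3500]
print(clock_diffs(p, t))  # [0, 0, 1, 0]
```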
Two sets of data were gathered. In each, data was acquired for approximately 24 hours at a rate of one sample per second using a PC running Windows 7 SP1 (called the "Test PC"). In each plot, the Y axis is the value of D, and the X axis is the number of samples. Each tick on the X axis is 3600 samples, or approximately one hour.
Data Set #1
In the first data set (figure 1), the Test PC was connected to a domain for the entire duration. We see the D values flipping between positive and negative numerous times over the 24-hour period.
Figure 1 - Plot of D values from data set #1 - Test PC connected to a domain
Data Set #2
In the second data set (figure 2), the Test PC was disconnected from the network (and hence the domain) immediately before the start of the test. It was reconnected to the network (and the domain) approximately 18 hours after the start of the data collection.
Figure 2 - Plot of D values from data set #2 - Test PC detached from domain for first 18 hours
Data Sets Combined
Note that Figures 1 and 2 have different Y-axis scales! If the two data sets are plotted together, it appears as in Figure #3 below.
Figure 3 - Plot of D values from data sets #1 (blue) and #2 (red)
What is Happening?
Warning: This is all supposition on my part. I don't have any formal references to back it up.
The periodic timer is precise but inaccurate. By precise, I mean that the time between each increment in the counter value is fixed and does not change. By inaccurate, I mean that the time between each increment is not exactly equal to the value it was set to.
If the periodic timer was set to a resolution of 1 millisecond, the real time between each tick could be something a bit different, such as 1.0001 or 0.9999 milliseconds.
The time-of-day clock on a Windows PC is driven by the periodic timer. After each tick, the time-of-day clock is incremented by the resolution of the timer. For example, if the periodic timer was set to 4 milliseconds, then after each tick, the time-of-day clock would be incremented by the same amount: 4 milliseconds.
Any inaccuracies in the periodic timer will cause the time-of-day clock to drift. For example, if the periodic timer is set to a resolution of one millisecond, but the real time between ticks is 1.0001 milliseconds (an error of 0.01%), then after 24 hours, the time-of-day clock will be off by more than eight seconds.
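The arithmetic behind that figure is simple (Python, just to make the numbers concrete):

```python
# Nominal tick period vs. a hypothetical real period, in milliseconds
nominal_ms = 1.0
real_ms = 1.0001                       # a 0.01% error

ticks_per_day = 24 * 60 * 60 * 1000    # one tick per nominal millisecond
error_ms = ticks_per_day * (real_ms - nominal_ms)

print(error_ms / 1000.0)               # about 8.64 seconds of drift per day
```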
Enter the domain controller...
The domain controller knows the true time to a high level of accuracy, usually receiving such information from a known-to-be-accurate network time server. Each PC in the domain has a time-of-day offset value, which, in addition to the periodic timer resolution, is added to the time-of-day clock after each periodic timer tick. The controller routinely checks the PC's time-of-day clock. If the clock is ahead of the time-of-day clock on the domain controller, the offset value is decreased. If the clock is behind, the offset value is increased.
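Under that supposition, the sign-flipping of Data Set #1 can be reproduced with a toy model (Python; the tick error, correction step, and check interval below are invented numbers chosen for illustration, not measured values):

```python
def simulate(seconds, real_tick_ms=1.00002, check_every_s=60, step_ms=0.00003):
    """Toy model of a domain-disciplined time-of-day clock.

    Every nominal 1 ms tick really lasts real_tick_ms.  The PC adds
    (1 ms + offset) to its clock per tick, so per second the clock
    error D changes by 1000 * (1 + offset - real_tick_ms) ms.  Every
    check_every_s seconds the "domain controller" nudges the offset
    up or down by step_ms depending on the sign of D.
    """
    offset = 0.0
    d = 0.0                      # time-of-day minus true time, in ms
    history = []
    for s in range(seconds):
        d += 1000.0 * (1.0 + offset - real_tick_ms)
        if (s + 1) % check_every_s == 0:
            offset += step_ms if d < 0 else -step_ms
        history.append(d)
    return history

h = simulate(4 * 3600)
# D oscillates around zero instead of drifting away - the same
# repeated direction changes seen in Data Set #1.
print(min(h), max(h))
```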
Data Set #1 would imply that the offset value is not perfect. That is, the true duration between each periodic tick plus the offset value never adds up to exactly one millisecond. It's either just a bit less or a bit more. Hence the constant flipping of direction as the domain controller makes constant corrections.
Data Set #2 shows that without the domain controller as a time source, the time-of-day clock gets further and further from the truth. Once the Test PC is reconnected, the time-of-day clock is rapidly adjusted to bring it back in sync with the domain controller clock. Examination of the data following the reconnection to the domain shows the time-of-day clock shifting by 3 milliseconds per second, or about 0.3%.
Does it Really Matter?
Well ... no, not really.
No matter how much correction the domain controller applies to the time-of-day clock, the speed of the clock will only ever change by a small amount. The clock will certainly not stop or go backwards!
Users will not notice, and almost all software will be unaffected. Note that I said almost all, as it did affect one of my projects, which was how this was discovered. See "Points of Interest" below.
But I tend to find these under-the-hood discoveries interesting, even irrelevant ones like this...
The complete source code is available in the download.
The code shown below is of the thread responsible for collecting the time data. The code has been simplified to make the intent clearer.
The variables defined outside of ThreadProc() are as follows:
dataCount - number of data samples to collect
sampleTime - number of milliseconds between samples
ClockEvent() - an event handler to receive the collected data; the values periodOffset and clockDiff (corresponding respectively to Pi and Di described above) are the only values recorded
TimePeriod - a class wrapping the Win32 timePeriod API
private void ThreadProc()
{
    DateTime startClock = DateTime.Now;
    int startPeriod = TimePeriod.Ticks;

    while (dataCount > 0)
    {
        DateTime curClock = DateTime.Now;
        int curPeriod = TimePeriod.Ticks;

        // Offsets from the start of the run, in milliseconds
        int clockOffset = (int)(curClock - startClock).TotalMilliseconds;
        int periodOffset = curPeriod - startPeriod;

        // The difference D between the time-of-day clock and the periodic timer
        int clockDiff = clockOffset - periodOffset;

        ClockEvent(this, periodOffset, clockDiff);

        dataCount--;
        Thread.Sleep(sampleTime);
    }
}
Points of Interest
I stumbled upon this issue while developing .NET software to communicate with an external device.
The device and the PC would be communicating for many hours, during which it was necessary to keep the device clock and the PC clock tightly synchronized. Unfortunately, the device clock was not as accurate as we required. Our experimentation, confirmed by the clock manufacturer's specifications, showed a drift of up to 1 part in one hundred thousand (or 0.001%). This seems like a small number, but after a period of 10 hours, it could translate into a clock difference of more than 300 milliseconds.
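As a quick sanity check on those numbers (Python):

```python
drift = 1e-5                       # 1 part in 100,000, i.e. 0.001%
hours = 10
error_ms = hours * 3600 * 1000 * drift
# about 360 ms of accumulated error after 10 hours - more than the
# 300 ms mentioned above
```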
For us, the 300 milliseconds was a problem, but the solution was simple. The software would maintain a clock offset value which would be added to the timestamps from the device to get the "true" time. The offset value was updated about once per minute by comparing the device timestamps and the corresponding PC clock values.
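A minimal sketch of that scheme (Python; the class, method names, and smoothing factor are invented for illustration - this is not our production code):

```python
class DeviceClock:
    """Map device timestamps onto the PC clock using a running offset."""

    def __init__(self):
        self.offset_ms = 0.0

    def update_offset(self, device_ts_ms, pc_ts_ms, gain=0.5):
        # Called about once per minute: blend the observed error into
        # the offset rather than jumping, to smooth out jitter.
        error = pc_ts_ms - (device_ts_ms + self.offset_ms)
        self.offset_ms += gain * error

    def true_time(self, device_ts_ms):
        return device_ts_ms + self.offset_ms

clock = DeviceClock()
clock.update_offset(device_ts_ms=1000.0, pc_ts_ms=1003.0)  # device 3 ms slow
clock.update_offset(device_ts_ms=2000.0, pc_ts_ms=2003.0)  # offset converges
```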
After communicating with the device for many hours, I expected the plot of the offset value to be a straight line - uniformly increasing or decreasing depending on whether the particular clock crystal on the device was running fast or slow. But the plot was similar to Figure #1 above - it changed direction numerous times.
Initially, I thought there was a problem with the clock crystals. It was only after comparing the periodic timer with the time-of-day clock that the problem was solved.
I think it will be fun to pursue this, just to see what else can be discovered. I'm curious to see how the clock time pattern changes over longer periods of time, such as weeks and months. One way of handling this would be to change the app into a service so as to be able to track time drift automatically without having to be logged in.
Addendum #1 - The 2016 Leap Second
A leap-second was added to the world clocks at the end of 2016. Google's way of handling the leap second was to "smear" the leap second over 20 hours, starting ten hours before midnight December 31, and ending ten hours after midnight.
I ran the ClockTracker application for four days (96 hours) - December 30, 31 and January 1, 2 - at a rate of one sample every ten seconds. I was hoping that given a sufficiently long data acquisition, the smear would be apparent.
The results (shown below) were ... disappointing. There is no apparent smear.
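In hindsight, a back-of-the-envelope calculation suggests why (Python; this assumes the test PC's time source followed the smear at all):

```python
smear_ms = 1000.0            # one leap second, in milliseconds
smear_s = 20 * 3600          # smeared over 20 hours
rate = smear_ms / smear_s    # roughly 0.014 ms of adjustment per second
per_sample = rate * 10       # roughly 0.14 ms per 10-second sample
# That is far smaller than the millisecond-scale swings the domain
# controller corrections already produce, so the smear would be
# buried in the noise.
```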
The clock time pattern did definitely change starting close to midnight on Dec 31, and there was a big jump starting at noon January 1st (and again around 3pm on January 2nd). These might be significant, but there is nothing in the data which jumps out and says "Hi! It's me! The leap second! Here I am!!". Oh well...
- December 6, 2016 - First release
- January 14, 2017 - Second release; new version of the ClockTracker application and data from the 2016 leap second