# In Theory...

#### Geek-o-Terica 17: Accuracy vs. Precision, The Time Drift Chronicles

##### Bob Balaban  September 8 2013 07:09:08 AM
Greetings, geeks!

Ever need a software timer to see how long it took to do something? This is a common task, right? And the basic technique is simple: capture the current time, do your thing, preferably for a while so that you average out a large number of operations to eliminate random perturbations, capture the current time again, and report on the difference between the two times.

Sounds easy, right? But there are complications. Such as: What is the relative granularity of the time span you are measuring, vs. the granularity of the timer? Here’s a brief anecdote to illustrate what I mean.

In a previous job I was asked to do benchmarking of the data flowing through a piece of software (it happened to be a server backup system, but that’s not the important point). The test I devised involved copying about 3 terabytes (TB) – that’s 3000 gigabytes – from one place on the network to another, and seeing how long it took. The software I was using to do the data movement had built-in timers and logging, so I didn’t have to do anything special: just set up 3TB of random stuff somewhere, point the software at it and provide a destination location. The log would then contain the elapsed time.

I knew it would take a while (more than a day, in fact, for reasons I won’t bother to explain here), so I kicked off the test, wrote down the date and time, and went home early (I love those kinds of tests…)

When I got in the next morning, the test was still running. I hung around for another couple of hours until it was done, wrote down the end date/time, and went looking for the execution log. I noticed two odd things right away: 1) My computed elapsed time, based on writing down the start and end times, was something like 16.5 hours, but the elapsed time in the log was closer to 18 hours; 2) The start and end time recorded in the execution log were in microseconds.

The logged results were very precise, but not very accurate. Which brings me to the first point I want to make about accuracy and precision:

Point 1 About Accuracy and Precision: For 99% of the real-world “timing” applications you will ever encounter, it makes no sense for the granularity of the timer to be wildly out of proportion to the duration of the event being measured. If the thing you’re timing (copying 3TB of data over a network) takes hours (or days), then what’s the point of timing it in microseconds? Is it going to matter if the resulting elapsed time is 1 day and 2 microseconds, vs. 1 day and 3 microseconds? Very doubtful, as you’re looking at a difference in roughly the 11th significant digit, which is extremely unlikely to matter, especially given Point 2.

Point 2 About Accuracy and Precision: Software timers drift. That is, the “current time” measured in milliseconds, microseconds or nanoseconds is an estimate, not an absolute. The finer-grained (more “precise”) the measurement, the more likely it is to drift from “real” (clock) time as you measure longer and longer intervals. This explains why the log for my experiment was off by a few hours: had the software used millisecond-, or even better, second-level granularity, the results would have been much closer to wall clock time.

I got to thinking about how I might demonstrate this issue in a simple Java program. Java has two different built-in “timer” capabilities. The java.util.Date class will give you its date/time value in milliseconds, measured since the “reference date” (midnight UTC, January 1, 1970 is time 0). So you can create a Date instance as your start time, create another as your end time, and subtract the values returned by Date.getTime(). That’s your elapsed time in milliseconds.
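As a quick sketch (not the author's original code), millisecond-granularity timing with java.util.Date looks something like this:

```java
import java.util.Date;

public class MillisTimer {
    // Elapsed time in milliseconds between two Date samples.
    static long elapsedMillis(Date start, Date end) {
        return end.getTime() - start.getTime();
    }

    public static void main(String[] args) throws InterruptedException {
        Date start = new Date();    // sample the clock
        Thread.sleep(250);          // stand-in for the work being timed
        Date end = new Date();      // sample the clock again
        System.out.println("Elapsed: " + elapsedMillis(start, end) + " ms");
    }
}
```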

There’s also a nanosecond-level timer in Java (1000 nanoseconds == 1 microsecond), System.nanoTime(). The documentation for this call deliberately leaves time 0 unspecified (it’s some fixed but arbitrary origin, so only differences between calls are meaningful), but I suspect it’s related either to computer boot time or to JVM start time (it would take too many bits to measure the current time as a number of nanoseconds since January 1, 1970).
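The nanosecond technique is equally simple; a minimal sketch (again, my own illustration, not the original program):

```java
public class NanoTimer {
    // Convert an elapsed nanosecond interval to whole milliseconds.
    static long nanosToMillis(long nanos) {
        return nanos / 1_000_000L;
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();  // arbitrary origin; only differences matter
        Thread.sleep(250);               // stand-in for the work being timed
        long elapsedNanos = System.nanoTime() - start;
        System.out.println("Elapsed: " + nanosToMillis(elapsedNanos) + " ms");
    }
}
```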

I wanted to see if the two timer techniques tracked each other over some number of minutes, so I wrote this Java program: NanoTime.java

The basic idea is to sample both timers (millisecond and nanosecond) every 10 seconds or so (making the thread sleep in between), and see if they stay about the same relative to each other, or if the difference between them gets bigger as time passes. So, I get the “start” time in both milliseconds and nanoseconds, then loop for N minutes, sleeping for 10 seconds after every iteration of the loop. Inside the loop I get the elapsed time from “start” in both milliseconds and nanoseconds. The “drift” value is computed by simply dividing the nanosecond value by 1,000,000 (converting it to milliseconds) and subtracting the result from the millisecond elapsed time.

I know the two values will never be identical. For one thing, getting the current timer values requires two Java calls, and by the time we’ve made the second call, some time has already passed since the first. But that’s not the important thing. The important thing is to monitor the change in the value of “drift” as the program continues to run. If both timers are equally accurate, then the value of “drift” should not vary very much over, say, 10 or 15 minutes.
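Since NanoTime.java itself isn’t reproduced here, here’s a minimal reconstruction based on the description above. The drift computation matches the logic described; details like the timestamp formatting and the loop length are my own guesses:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class NanoTimeSketch {
    // Drift = millisecond elapsed time minus nanosecond elapsed time scaled to ms.
    static long driftMillis(long elapsedMillis, long elapsedNanos) {
        return elapsedMillis - elapsedNanos / 1_000_000L;
    }

    public static void main(String[] args) throws InterruptedException {
        SimpleDateFormat fmt = new SimpleDateFormat("HH:mm:ss");
        long startMillis = new Date().getTime();  // ms since the epoch
        long startNanos = System.nanoTime();      // ns since an arbitrary origin
        int minutes = 15;                         // how long to run

        for (int i = 0; i < minutes * 6; i++) {   // one sample every 10 seconds
            long elapsedMillis = new Date().getTime() - startMillis;
            long elapsedNanos = System.nanoTime() - startNanos;
            System.out.println("At time  " + fmt.format(new Date())
                    + "  elapsed millisec = " + elapsedMillis
                    + "  elapsed nanosec = " + elapsedNanos
                    + "  Drift in millisec = " + driftMillis(elapsedMillis, elapsedNanos));
            Thread.sleep(10_000);                 // sleep 10 seconds between samples
        }
        System.out.println("Done!");
    }
}
```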

In fact, however, it does (on my laptop). Here’s a selection from the program output:

```
At time  17:53:22  elapsed millisec = 0  elapsed nanosec = 3124  Drift in millisec = 0
At time  17:53:32  elapsed millisec = 10120  elapsed nanosec = 10119357918  Drift in millisec = 1
At time  17:53:42  elapsed millisec = 20121  elapsed nanosec = 20119601550  Drift in millisec = 2
At time  17:53:52  elapsed millisec = 30121  elapsed nanosec = 30119829118  Drift in millisec = 2
At time  17:54:02  elapsed millisec = 40122  elapsed nanosec = 40120027683  Drift in millisec = 2
At time  17:54:12  elapsed millisec = 50123  elapsed nanosec = 50120249897  Drift in millisec = 3
At time  17:54:22  elapsed millisec = 60123  elapsed nanosec = 60120394024  Drift in millisec = 3

SNIP…

At time  18:07:02  elapsed millisec = 820167  elapsed nanosec = 820136832058  Drift in millisec = 31
At time  18:07:12  elapsed millisec = 830167  elapsed nanosec = 830137065428  Drift in millisec = 30
At time  18:07:22  elapsed millisec = 840168  elapsed nanosec = 840137282287  Drift in millisec = 31

SNIP…

At time  18:23:42  elapsed millisec = 1820224  elapsed nanosec = 1820158420728  Drift in millisec = 66
At time  18:23:52  elapsed millisec = 1830224  elapsed nanosec = 1830158597874  Drift in millisec = 66
At time  18:24:02  elapsed millisec = 1840225  elapsed nanosec = 1840158851769  Drift in millisec = 67
At time  18:24:12  elapsed millisec = 1850226  elapsed nanosec = 1850159074430  Drift in millisec = 67
Done!
```

The “Drift” number is the milliseconds number minus the nanoseconds number (scaled to milliseconds, of course). The drift value’s behavior is very interesting: it shows that the elapsed time derived in milliseconds advances faster than the elapsed time derived in nanoseconds. Had the drift remained more or less constant over time, we’d chalk up the difference between the two values to program execution time: it takes a finite amount of time to execute a line of code, and the two time samples can’t occur at exactly the same moment. Plus, of course, other things are going on in the Java VM (garbage collection, perhaps other background thread execution), and on the computer itself (many processes time-sharing the CPUs…). But the fact that the drift value keeps increasing over time indicates a real difference between the sampling techniques.

I ran the program again on another laptop, and got some very interesting results:

```
At  12:37:47  elapsed millisec = 0  elapsed nanosec = 2165  Drift in millisec = 0
At  12:37:57  elapsed millisec = 10080  elapsed nanosec = 10080167661  Drift in millisec = 0
At  12:38:07  elapsed millisec = 20080  elapsed nanosec = 20080718639  Drift in millisec = 0
At  12:38:17  elapsed millisec = 30080  elapsed nanosec = 30081314176  Drift in millisec = -1
At  12:38:27  elapsed millisec = 40080  elapsed nanosec = 40081868017  Drift in millisec = -1
At  12:38:37  elapsed millisec = 50080  elapsed nanosec = 50082447490  Drift in millisec = -2

SNIP…

At  12:43:47  elapsed millisec = 360080  elapsed nanosec = 360100194438  Drift in millisec = -20
At  12:43:57  elapsed millisec = 370080  elapsed nanosec = 370100772026  Drift in millisec = -20
At  12:44:07  elapsed millisec = 380080  elapsed nanosec = 380101345283  Drift in millisec = -21
At  12:44:17  elapsed millisec = 390080  elapsed nanosec = 390101918540  Drift in millisec = -21
At  12:44:27  elapsed millisec = 400080  elapsed nanosec = 400102496756  Drift in millisec = -22
At  12:44:37  elapsed millisec = 410080  elapsed nanosec = 410103064147  Drift in millisec = -23

SNIP…

At  12:51:27  elapsed millisec = 820079  elapsed nanosec = 820126346016  Drift in millisec = -47
At  12:51:37  elapsed millisec = 830079  elapsed nanosec = 830126909636  Drift in millisec = -47
At  12:51:47  elapsed millisec = 840079  elapsed nanosec = 840127478912  Drift in millisec = -48
At  12:51:57  elapsed millisec = 850079  elapsed nanosec = 850128065229  Drift in millisec = -49
At  12:52:07  elapsed millisec = 860079  elapsed nanosec = 860128631991  Drift in millisec = -49
At  12:52:17  elapsed millisec = 870079  elapsed nanosec = 870129212163  Drift in millisec = -50
At  12:52:27  elapsed millisec = 880079  elapsed nanosec = 880129787934  Drift in millisec = -50
At  12:52:37  elapsed millisec = 890079  elapsed nanosec = 890130352950  Drift in millisec = -51
At  12:52:47  elapsed millisec = 900079  elapsed nanosec = 900130939617  Drift in millisec = -51
At  12:52:57  elapsed millisec = 910079  elapsed nanosec = 910131482563  Drift in millisec = -52
At  12:53:07  elapsed millisec = 920079  elapsed nanosec = 920132072303  Drift in millisec = -53
At  12:53:17  elapsed millisec = 930079  elapsed nanosec = 930132644303  Drift in millisec = -53
At  12:53:27  elapsed millisec = 940079  elapsed nanosec = 940133205897  Drift in millisec = -54
At  12:53:37  elapsed millisec = 950079  elapsed nanosec = 950133785300  Drift in millisec = -54
Done!
```

Cool, huh? Similar behavior in that the two values diverge at a more or less constant rate over time. BUT, in the opposite direction! In this case, the nanosecond value advances faster than the millisecond value. Go figure.

One additional interesting question might be: which value is “right”, or “more accurate”? I don’t have an answer to that question, and within the world of this test program, there’s absolutely no way to tell. What you’d need is an objective outside measurement to compare the two sampling techniques against: a third clock that doesn’t depend on the computer running the Java program. After all, a “gap” of 50-60 milliseconds over a time interval of 30 minutes is a pretty small variation, approximately .003%.

Geek ya later!

Follow me on Twitter @LooseleafLLC
This article ©Copyright 2013 by Looseleaf Software LLC, all rights reserved. You may link to this page, but may not copy without prior approval.

Comments

1. Scott Leis  10/30/2013 1:37:55 AM  Geek-o-Terica 17: Accuracy vs. Precision, The Time Drift Chronicles

Interesting results. Were those two laptops running the same OS and version?

If Java is used in the Windows OS, I would expect System.nanoTime() to depend on the Windows API's high-resolution timer.

I'm inclined to try writing a similar program using Windows-native code to see if direct calls to Windows API functions produce similar behaviour.

2. Bob Balaban  11/3/2013 7:05:11 AM  Geek-o-Terica 17: Accuracy vs. Precision, The Time Drift Chronicles

Both laptops run Windows 7. One is a Lenovo and one is an HP, so the hardware configurations are different.

Would love to hear the results of your experiment.