December 3, 2009

Python - Accurate Cross Platform Timers - time.time() vs. time.clock()

In Python, you can add a simple timer around any action using the clock() or time() functions from the time module. time.clock() gives the best timer accuracy on Windows, while time.time() gives the best accuracy on Unix/Linux, so portable code should pick whichever is appropriate for the platform it is running on.

Inside the timeit module's source code (in Python's standard library), you can see an example of this. The timer type is specified by the platform:

if sys.platform == "win32":
    # On Windows, the best timer is time.clock()
    default_timer = time.clock
else:
    # On most other platforms, the best timer is time.time()
    default_timer = time.time

This seems like a nice way to create a timer that is accurate on either platform.

Here is an example:

import sys
import time

# choose timer to use
if sys.platform.startswith('win'):
    default_timer = time.clock
else:
    default_timer = time.time

start = default_timer()
# do something
finish = default_timer()
elapsed = (finish - start)
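Note that the standard library already exposes this platform choice directly as timeit.default_timer, so you can use it without writing the if/else yourself. A small sketch (the workload is just an arbitrary example):

```python
import timeit

# timeit.default_timer is the stdlib's ready-made version of the
# platform-dependent timer choice shown above.
start = timeit.default_timer()
total = sum(range(1_000_000))  # some work to time
elapsed = timeit.default_timer() - start
print(f"summed {total} in {elapsed:.6f} seconds")
```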

1 comment:

Marius Gedminas said...

Also note that on platforms other than 'win32' they measure different things. If your application is spending most of the time waiting for I/O (from disk, or from network), time.clock() will measure only the bits when your app is doing something, while time.time() will also measure time spent waiting.

win32 is special (as usual): there, time.clock() measures wall-clock time, like time.time(), instead of CPU time as it does on all other platforms.