As a psychologist, it's useful to be able to time people's reactions in milliseconds. What kind of accuracy could I expect to get in this regard from Python? From what I've read so far, it doesn't look like I can hope to get precise enough timing to get reliable results, but I'm having difficulty finding good sources. Anyone out there know for sure?


I'm not sure you'll get meaningful results from anything mouse-and-keyboard driven *because* the mouse and keyboard events can sometimes take unpredictable amounts of time to be passed along to your program.

I'm not a hardware expert, so I could be wrong about that ... but at least keep the possibility in mind.

That said, time.time() does indeed give fractional times. So it might work for your purposes, if you can find an independent way to check its reliability.

Jeff

Thanks for the thoughts. I did manage to find this web page, however, which offers Python add-ons that claim sub-millisecond accuracy(!), depending on one's setup.

http://www.psychology.nottingham.ac.uk/staff/jwp/psychopy/home.php/Main/HomePage

I tend to believe it's possible because there are many experiments out there that do require millisecond accuracy and make use of computers.

It's not something I need badly right now, but it's nice to know that there's the option.

P.S. I don't know of any way to reliably test time.time() -- I mean, I tested it for second-level accuracy, but it wasn't surprising that it could handle that, heh.
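Though I suppose one could at least probe its update granularity (as opposed to its real-world accuracy) by sampling it in a tight loop and recording the smallest nonzero jump between readings. A rough sketch of that idea:

# Sketch: estimate the effective tick of time.time() by watching
# for the smallest nonzero change between successive readings
import time

def estimate_tick(jumps=1000):
    smallest = None
    last = time.time()
    seen = 0
    while seen < jumps:
        now = time.time()
        delta = now - last
        if delta > 0:
            if smallest is None or delta < smallest:
                smallest = delta
            last = now
            seen += 1
    return smallest

print "Approximate time.time() tick: %.6f seconds" % estimate_tick()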

If you have a Windows machine, you can achieve accuracy better than 1 millisecond with time.clock() ...

# time.clock() gives wall-clock seconds accurate to at least 1 millisecond
# it updates every millisecond, but only behaves this way on Windows
# time.time() is more portable, but has quantization errors
# since it only updates 18.2 times per second

import time

print "\nTiming a 1 million loop 'for loop' ..."
start = time.clock()
for x in range(1000000):
    y = 100*x - 99  # do something
end = time.clock()
print "Time elapsed = ", end - start, "seconds"
 
"""
result -->
Timing a 1 million loop 'for loop' ...
Time elapsed =  0.471708415107 seconds
"""

The timeit module has accuracy better than 1 microsecond and turns off garbage collection too. However, it is mostly used to time statements and functions.
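For what it's worth, a minimal timeit sketch, timing the same statement as the loop above (taking the minimum of a few repeats is a common idiom, since the fastest run is the one least disturbed by background processes):

import timeit

# time 1 million passes of the statement; timeit disables garbage
# collection during the measurement by default
t = timeit.Timer("y = 100*x - 99", setup="x = 5")
print min(t.repeat(repeat=3, number=1000000)), "seconds per million passes"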

I don't know about Python, but I assume that, like most other languages, it has certain datatypes that offer greater precision?

Hmm, thanks for the info vegaseat! But (and you knew this was coming) what about for Macs?

The datetime module will do microseconds

# datetime.datetime.now() returns the current local date and time,
# including a microsecond field
import datetime

print datetime.datetime.now()
time_now = datetime.datetime.now()
print time_now
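Subtracting two datetime objects gives a timedelta, so a quick elapsed-time sketch might look like this:

# Sketch: elapsed time via datetime subtraction (yields a timedelta)
import datetime

start = datetime.datetime.now()
for x in range(1000000):
    y = 100*x - 99  # do something
elapsed = datetime.datetime.now() - start
# a timedelta stores days, seconds, and microseconds separately
print "Elapsed:", elapsed.seconds + elapsed.microseconds/1000000.0, "seconds"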

Glad to be wrong. :)

> The datetime module will do microseconds
>
> import datetime
> print datetime.datetime.now()
> time_now = datetime.datetime.now()
> print time_now

Not really! If you look at the output, the last three digits (the ones below a millisecond) will always be zeroes. But you could hope for millisecond resolution.

Where is your info coming from? The link below to the Python docs specifically states that one can do microseconds. On my computer, at this time,

now = datetime.datetime.now()
print now.microsecond
988561

Also, a Google search for "python datetime, microsecond" yields 16,800 hits (some of which are dupes), but nonetheless you will have to do it yourself, as I have no desire or time to spoon-feed everything to everyone. Saying that the datetime module will do microseconds is more than enough of a hint. Some basic research would serve you much better. Imaginary "truths" will not.
http://docs.python.org/lib/datetime-time.html

I think OS might play a role. My experiments confirm Ene Uran's results, while obviously woooee's results show that microseconds are possible in some cases.

>>> datetime.datetime.now().microsecond
796000
>>> datetime.datetime.now().microsecond
328000
>>> datetime.datetime.now().microsecond
562000
>>> datetime.datetime.now().microsecond
515000
>>> datetime.datetime.now().microsecond
500000
>>> datetime.datetime.now().microsecond
375000
>>> datetime.datetime.now().microsecond
546000
>>> datetime.datetime.now().microsecond
390000
>>>

Either (a) I've got an amazing talent for hitting 'Enter' twice, with the second one happening on the millisecond, or (b) I can't get microseconds on my system.

Assuming (b) (though (a) would probably make me a wealthy man), the only explanation I can imagine is that WinXP limits the clock precision in some way.

woooee -- are you running Linux by chance? Or OS X?

Jeff

While I know that it's certainly possible to get millisecond (and sometimes microsecond) *readings* from certain functions, the question I was really after was whether those readings can truly be trusted. It's possible to imagine that, if a computer is slow enough, such a timestamp might not accurately reflect, say, when a user pushed a button, because it takes the computer a while to register the button press.

The truth is, my knowledge of hardware is woefully lacking, so I don't know if this is a true concern -- and I haven't really been able to find anything on the web that goes into this kind of detail. Thus I was hoping someone here might have this kind of knowledge.

The hardware should achieve microsecond resolution, but the layers of software you have to go through will negate that!

Even if the Python interpreter has a high priority level, the timing will change, since the operating system runs plenty of other programs in the background (keyboard, mouse, screen updates, virus scanners, active malware, spyware, etc.).

If you want to achieve just millisecond resolution amongst samples, you will have to average a fair number of samples. That holds true for many things you do in the sciences.
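A rough sketch of that averaging idea, using time.clock() (so assume a Windows machine here; the function names are just placeholders):

# Sketch: reduce timing jitter by averaging many repeated measurements
import time

def average_elapsed(func, trials=100):
    total = 0.0
    for _ in range(trials):
        start = time.clock()
        func()
        total += time.clock() - start
    return total / trials

def work():
    # stand-in for whatever you are actually timing
    for x in range(10000):
        y = 100*x - 99

print "Mean elapsed over 100 trials: %.6f seconds" % average_elapsed(work)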

Hmm, thanks for the information, Ene Uran.

There is a module, PsychoPy, that claims to get millisecond accuracy and recommends using the screen's refresh rate to do so (by, for example, displaying a stimulus for X screen draws). What do you think?
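As I understand it, the frame-counting idea is just arithmetic: on a 60 Hz monitor each refresh lasts 1/60 s (about 16.7 ms), so a stimulus shown for a whole number of refreshes has a known duration. A toy illustration of the arithmetic (this is not PsychoPy's actual API):

# Toy illustration of frame-based timing: a stimulus shown for a whole
# number of screen refreshes lasts n_frames / refresh_rate seconds
refresh_rate = 60.0  # Hz, assumed monitor refresh rate
frame_duration = 1.0 / refresh_rate

def frames_for(milliseconds):
    # round the requested duration to the nearest whole frame count
    return int(round(milliseconds / 1000.0 / frame_duration))

print frames_for(200), "frames is about 200 ms at 60 Hz"  # -> 12
print frames_for(50), "frames is about 50 ms at 60 Hz"    # -> 3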

Also, plenty of psychology experiments use computers to gauge response times in milliseconds, so it seems possible with some programs -- but is it possible with Python?

I guess it depends on what you want to do, but there's a subtle difference between accuracy and precision. Even using the WinAPI you'll be millisecond precise but not necessarily millisecond accurate. That is, the timer reports in units of milliseconds, but that doesn't guarantee you're reading it at the right moment.

Plus, if you're trying to time, say, a visual display to the millisecond, then you're likely to be out of luck. PCs, Macs, and Linux machines will all have similar problems, as they use the same hardware. Even RTOS systems won't help you all that much as soon as you interact with a standard TFT or keyboard.
