
I set the .Interval of a timer to 10, which I believe means 10 milliseconds. Since there are 1000 ms in 1 second, if I count 100 ticks, I should have 1 second, right? The problem is, when I compare my program against a physical stopwatch, my program runs about 2x slower than real time. Has anyone else experienced this?
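Roughly what I am doing, boiled down to a sketch (I am assuming the standard System.Windows.Forms.Timer here; the class and method names are just for illustration):

using System;
using System.Diagnostics;
using System.Windows.Forms;

// Sketch only: count 100 ticks of a 10 ms timer and compare against a Stopwatch.
class TimerDriftForm : Form
{
    private readonly Timer timer = new Timer();
    private readonly Stopwatch watch = new Stopwatch();
    private int ticks;

    public TimerDriftForm()
    {
        timer.Interval = 10;      // ask for a tick every 10 ms
        timer.Tick += OnTick;
        watch.Start();
        timer.Start();
    }

    private void OnTick(object sender, EventArgs e)
    {
        ticks++;
        if (ticks == 100)         // 100 x 10 ms "should" be one second
        {
            timer.Stop();
            watch.Stop();
            // This is where I see roughly 2x the expected 1000 ms.
            MessageBox.Show("100 ticks took " + watch.ElapsedMilliseconds + " ms");
        }
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TimerDriftForm());
    }
}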

Thanks,

Dave

__avd:

Text from the MSDN blog (http://blogs.msdn.com/oldnewthing/archive/2005/09/02/459952.aspx): accuracy is how close you are to the correct answer; precision is how much resolution you have for that answer. Here are a blog post and a thread that should help:
1. http://ejohn.org/blog/accuracy-of-javascript-time/
2. http://www.dotnet247.com/247reference/message.aspx?vote=163d5129-b83a-49de-994a-39f12405f519&id=256675&threadid=714288
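If you want to see it for yourself, a rough sketch like this (assuming C# and the standard System.Windows.Forms.Timer; the names are just for illustration) logs the real gap between ticks. On a typical Windows machine the gaps tend to come out around 15-16 ms, the default system timer resolution, rather than the 10 ms you requested:

using System;
using System.Diagnostics;
using System.Windows.Forms;

// Sketch only: print the measured gap between consecutive Tick events.
class TickGapForm : Form
{
    private readonly Timer timer = new Timer { Interval = 10 };
    private readonly Stopwatch watch = Stopwatch.StartNew();
    private long lastMs;

    public TickGapForm()
    {
        timer.Tick += delegate
        {
            long now = watch.ElapsedMilliseconds;
            // 10 ms was requested, but the printed gaps show the real resolution.
            Debug.WriteLine("gap: " + (now - lastMs) + " ms");
            lastMs = now;
        };
        timer.Start();
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TickGapForm());
    }
}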

finito:

Well, I am assuming you have a lot of apps open, or have an old computer.

Try making a button and clicking it once everything is loaded to start the timer test.


finito - I actually have a reasonable computer (IBM T43P) and nothing else is running.

adatapost - so what do people do in this situation? Just have an incorrect timer?? That doesn't seem too reasonable.
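I suppose one option is to stop counting ticks altogether and just read the elapsed time on each tick, something like this sketch (again assuming C# and Stopwatch; the names are just for illustration), but that still feels like working around a broken Interval:

using System;
using System.Diagnostics;
using System.Windows.Forms;

// Sketch only: drive the display from measured elapsed time instead of a tick count,
// so the timer's coarse resolution no longer accumulates into the total.
class ElapsedForm : Form
{
    private readonly Timer timer = new Timer { Interval = 10 };
    private readonly Stopwatch watch = Stopwatch.StartNew();

    public ElapsedForm()
    {
        timer.Tick += delegate
        {
            // The tick rate only affects how often the text refreshes,
            // not how fast the clock runs.
            Text = watch.Elapsed.TotalSeconds.ToString("F1") + " s";
        };
        timer.Start();
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new ElapsedForm());
    }
}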

Dave
