I set the .Interval of a timer to 10, which I believe means 10 milliseconds. Since there are 1000 ms in 1 second, if I count 100 ticks, I should have 1 second, right? The problem is, when I compare my program against a physical stopwatch, it runs about 2x slower than real time. Has anyone else experienced this?
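Here is roughly what I mean (a simplified sketch, not my exact code; the control names are made up):

// Simplified sketch of the tick-counting approach described above.
// Assumes a Windows Forms project; timer1 and lblSeconds are placeholder names.
using System;
using System.Windows.Forms;

public class StopwatchForm : Form
{
    private readonly Timer timer1 = new Timer();              // System.Windows.Forms.Timer
    private readonly Label lblSeconds = new Label { AutoSize = true };
    private int tickCount;                                    // number of Tick events seen so far

    public StopwatchForm()
    {
        Controls.Add(lblSeconds);
        timer1.Interval = 10;        // ask for 10 ms, so 100 ticks should be 1 second
        timer1.Tick += OnTick;
        timer1.Start();
    }

    private void OnTick(object sender, EventArgs e)
    {
        tickCount++;
        lblSeconds.Text = (tickCount / 100.0).ToString("0.00") + " s";
    }

    [STAThread]
    private static void Main()
    {
        Application.Run(new StopwatchForm());
    }
}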

Thanks,

Dave

All 4 Replies

Well, I am assuming you have a lot of apps open, or an old computer.

Try adding a button and clicking it only after everything has loaded, to start the timer for your test.

Hi,

Text from the CodeProject article: "It is important to understand that the accuracy of timers is limited."
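A quick way to see it for yourself (just a sketch of my own, not code from the article; names are placeholders): log how much real time passes between Tick events with a Stopwatch. With Interval = 10, the gaps on a default Windows setup are typically closer to 15-16 ms, plus whatever time the message loop is busy, so 100 ticks take well over a second.

// Sketch: log the actual spacing of Tick events for a timer with Interval = 10.
// Watch the debugger's Output window; timer1 is a placeholder name.
using System;
using System.Diagnostics;
using System.Windows.Forms;

public class TimerAccuracyForm : Form
{
    private readonly Timer timer1 = new Timer();
    private readonly Stopwatch watch = Stopwatch.StartNew();
    private long lastMs;

    public TimerAccuracyForm()
    {
        timer1.Interval = 10;                      // request 10 ms between ticks
        timer1.Tick += delegate
        {
            long now = watch.ElapsedMilliseconds;
            Debug.WriteLine(string.Format("{0} ms since last tick", now - lastMs));
            lastMs = now;
        };
        timer1.Start();
    }

    [STAThread]
    private static void Main()
    {
        Application.Run(new TimerAccuracyForm());
    }
}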

finito - I actually have a reasonable computer (IBM T43P) and nothing else is running.

adatapost - So what do people do in this situation? Just live with an inaccurate timer? That doesn't seem very reasonable.

Dave

Text from an MSDN blog: "Accuracy is how close you are to the correct answer; precision is how much resolution you have for that answer."

Here are a blog post and a thread that should help, plus a small sketch after the links.

1. http://ejohn.org/blog/accuracy-of-javascript-time/
2. http://www.dotnet247.com/247reference/message.aspx?vote=163d5129-b83a-49de-994a-39f12405f519&id=256675&threadid=714288
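The usual workaround (my own sketch below, not taken from either link) is to stop counting ticks altogether: let the timer only refresh the display, and take the elapsed time from a Stopwatch (or DateTime.Now minus a stored start time), so the clock stays correct even when ticks arrive late.

// Sketch: the timer only drives the display refresh; the time itself comes
// from a Stopwatch, so late or missed ticks don't slow the clock down.
// Control names are placeholders.
using System;
using System.Diagnostics;
using System.Windows.Forms;

public class ElapsedTimeForm : Form
{
    private readonly Timer timer1 = new Timer();
    private readonly Label lblSeconds = new Label { AutoSize = true };
    private readonly Stopwatch watch = Stopwatch.StartNew();

    public ElapsedTimeForm()
    {
        Controls.Add(lblSeconds);
        timer1.Interval = 50;                      // refresh rate only, not the time source
        timer1.Tick += delegate
        {
            lblSeconds.Text = watch.Elapsed.TotalSeconds.ToString("0.00") + " s";
        };
        timer1.Start();
    }

    [STAThread]
    private static void Main()
    {
        Application.Run(new ElapsedTimeForm());
    }
}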
