I set the .Interval of a timer to "10", which I believe means 10 milliseconds. Since there are 1000 ms in 1 second, if I count 100 ticks, I should have 1 second, right? The problem is, when I compare my program to a physical stopwatch, it runs about 2x slower than real time. Has anyone else experienced this?
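
For reference, the setup is roughly the following (a sketch with illustrative names, assuming a System.Windows.Forms.Timer in VB.NET): the timer's Tick handler increments a counter, and every 100 ticks is treated as one second.

    ' Sketch of the setup described above, assuming a System.Windows.Forms.Timer
    ' on a blank form (names are illustrative, not the actual code).
    Public Class Form1
        Private WithEvents tmr As New System.Windows.Forms.Timer()
        Private tickCount As Integer = 0

        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            tmr.Interval = 10    ' intended: one Tick every 10 ms
            tmr.Start()
        End Sub

        Private Sub tmr_Tick(sender As Object, e As EventArgs) Handles tmr.Tick
            tickCount += 1
            ' Assumed: 100 ticks x 10 ms = 1 second, shown in the title bar.
            Me.Text = (tickCount \ 100).ToString() & " s"
        End Sub
    End Class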

Thanks,

Dave

Well, I am assuming you have a lot of apps open or an old computer.

Try adding a button and clicking it once everything has loaded, to test the timer.

Text from the CodeProject article: "It is important to understand that the accuracy of timers is limited."
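
One way to see that limit directly is to log how far apart the Tick events really are (a sketch, assuming the same tmr timer declared WithEvents on the form as in the snippet above). With a Windows Forms Timer set to Interval = 10, the ticks typically arrive noticeably more than 10 ms apart, which lines up with the slowdown Dave is seeing.

    ' Sketch: measure how far apart the Tick events really are.
    ' Assumes tmr is a System.Windows.Forms.Timer declared WithEvents on the form.
    Private lastTick As DateTime = DateTime.Now

    Private Sub tmr_Tick(sender As Object, e As EventArgs) Handles tmr.Tick
        Dim nowTime As DateTime = DateTime.Now
        Dim elapsedMs As Double = (nowTime - lastTick).TotalMilliseconds
        System.Diagnostics.Debug.WriteLine("Tick after " & elapsedMs.ToString("F1") & " ms")
        lastTick = nowTime
    End Sub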

finito - I actually have a reasonable computer (IBM T43P) and nothing else is running.

adatapost - So what do people do in this situation? Just live with an inaccurate timer? That doesn't seem very reasonable.

Dave
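
The usual answer (a sketch, not from the thread itself, assuming VB.NET): don't count ticks at all. Keep the Windows Forms Timer only as a screen-refresh trigger, and read the elapsed time from System.Diagnostics.Stopwatch (or DateTime.Now), which stays correct no matter how late the ticks arrive.

    ' Sketch of the usual workaround, assuming VB.NET and a blank form:
    ' the timer only refreshes the display; a Stopwatch supplies the real
    ' elapsed time, so late Tick events no longer make the clock run slow.
    Public Class Form1
        Private WithEvents tmr As New System.Windows.Forms.Timer()
        Private ReadOnly watch As New System.Diagnostics.Stopwatch()

        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            tmr.Interval = 50    ' refresh rate only; does not affect accuracy
            watch.Start()
            tmr.Start()
        End Sub

        Private Sub tmr_Tick(sender As Object, e As EventArgs) Handles tmr.Tick
            ' Whole seconds measured by the Stopwatch, shown in the title bar.
            Me.Text = (watch.ElapsedMilliseconds \ 1000).ToString() & " s"
        End Sub
    End Class

With this arrangement the Interval only controls how often the display refreshes, so even 50-100 ms is fine; the reported time no longer depends on the timer's accuracy.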
