I set the .Interval of a timer to 10, which I believe means 10 milliseconds. Since there are 1000 ms in 1 second, if I count 100 ticks I should have 1 second, right? The problem is that when I compare my program against a physical stopwatch, it runs about 2x slower than real time. Has anyone else experienced this?

Thanks,

Dave

Well, I am assuming you have a lot of apps open, or have an old computer.

Try making a button and clicking it once everything is loaded, to test the timer.

finito - I actually have a reasonable computer (IBM T43P) and nothing else is running.

adatapost - so what do people do in this situation? Just live with an inaccurate timer? That doesn't seem reasonable.

Dave
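
For what it's worth, the usual fix is to stop counting ticks and instead measure elapsed time with a clock on each tick. A timer's real period is often longer than its `.Interval` (on Windows the default timer resolution is roughly 15-16 ms, so a 10 ms interval tends to fire closer to every 15 ms, which would explain a slowdown near 2x). A minimal sketch of the idea in Python, since the thread's language isn't shown; `run_timer` is a hypothetical helper, not anyone's actual code:

```python
import time

def run_timer(duration_s, tick_interval_s):
    """Track time by reading a clock each tick, not by counting ticks.

    Counting ticks assumes each tick fires exactly on schedule; in
    practice the real interval is usually longer than requested (timer
    resolution, scheduling), so a tick count drifts slow. Reading a
    monotonic clock each tick stays accurate regardless of how late
    the ticks are.
    """
    start = time.monotonic()
    while True:
        time.sleep(tick_interval_s)          # may oversleep; that's fine
        elapsed = time.monotonic() - start   # true elapsed wall-clock time
        if elapsed >= duration_s:
            return elapsed

# Simulate a "1 second" wait with a 10 ms tick; even if each tick is
# late, the returned elapsed time is still measured correctly.
elapsed = run_timer(0.2, 0.01)
```

The same pattern applies to a .NET timer: in the Tick handler, compare `DateTime.Now` (or a stopwatch) against the start time instead of incrementing a counter.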
