Hey guys, I'm writing a program that profiles the speed of different algorithms, and I've been using GetTickCount() to do so. However, the program reports that the algorithm took 0 ms. It doesn't have to be too accurate, but currently 'RunTime' is coming up as 0.

Code:

{
    StartTime = GetTickCount();
    quadratic_dll(img);    // calls DLL containing the algorithm
    RunTime = GetTickCount() - StartTime;
    lastAlgorithm = "Cubic";
}


Can you post some compilable code (i.e., which headers are you using)? Have you output the value of GetTickCount() before and after the algorithm?

Maybe the precision is not good enough. Is it running in any sort of time that you know for sure is more than 1 ms? If it definitely takes more than 1 ms then, as david said, inspect the tick-count values directly to see whether GetTickCount() and your conversion to milliseconds are working as expected. If your algorithm might be too fast for the precision of the clock (or, more probably, of your conversion method), then simply run the algorithm many times, say 1000 times, and divide the profiling result by 1000.

Yeah I'll post up some code. I'm pretty new to profiling and I just want a rough time that the algorithm takes for comparison purposes. I am pretty sure that it takes more than 1ms but I could be mistaken.
Do you know of any other methods that are as simple but would give me the same kind of estimate?

Hiya, I tried something similar (locking an OpenGL app to roughly 50 fps) using performance counters. This article gives a good explanation of how to achieve what you want (getting the time from one point in the code to another): MSDN performance timers. The resolution can be extremely precise.
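For reference, the basic pattern from that approach looks something like this. It's a Windows-only sketch (QueryPerformanceCounter / QueryPerformanceFrequency from windows.h); the placeholder comment stands in for whatever you want to profile, e.g. the quadratic_dll(img) call.

```cpp
#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // counter ticks per second

    QueryPerformanceCounter(&start);
    // ... code to profile, e.g. quadratic_dll(img); ...
    QueryPerformanceCounter(&stop);

    // Convert elapsed ticks to milliseconds.
    double elapsedMs =
        (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
    std::cout << "Elapsed: " << elapsedMs << " ms\n";
}
```

The frequency is typically in the MHz range, so this resolves times far below the ~10-16 ms granularity of GetTickCount().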

Hope this helps
