
High Precision Timer in C

category: general [glöplog]
 
I've never tried this before so I'm not sure what I'm doing. I usually rely on something like allegro or sdl to help me. I'm trying to write a good timer/timed loop so I can control execution speed (60fps). Here's what I have so far.

Code:
#include <stdio.h>
#include <time.h>
#include <sys/timeb.h>

int GetMilliCount()
{
  struct _timeb tb;
  _ftime( &tb );
  int nCount = tb.millitm + (tb.time & 0xfffff) * 1000;
  return nCount;
}

int GetMilliSpan( int nTimeStart )
{
  int nSpan = GetMilliCount() - nTimeStart;
  if ( nSpan < 0 )
    nSpan += 0x100000 * 1000;
  return nSpan;
}

int main()
{
  int x = 0;
  int y = 0;
  time_t startTime;
  time(&startTime);
  x = GetMilliSpan( startTime );
  while(1)
  {
    y = GetMilliSpan( startTime );
    if( y - x >= 100 && y - x <= 400 )
    {
      printf("%i\n",x);
    }
  }
  return 0;
}


The example just spits text out for (hopefully) 300 ms. What do you think? Is there a better way to do this? Thnx
added on the 2009-06-30 13:01:52 by xteraco
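For illustration, here is a minimal sketch of a ~60 fps loop built on the GetMilliCount()/GetMilliSpan() functions from the post above; update_and_render() is a hypothetical placeholder for the per-frame work, and the busy-wait burns CPU while it waits out the frame budget:

Code:
/* sketch: cap a loop to roughly 60 fps using the functions defined above */
void update_and_render(void);                 /* hypothetical placeholder */

void run_at_60fps(void)
{
  const int frame_ms = 1000 / 60;             /* ~16 ms per frame */
  int frame_start = GetMilliCount();
  while (1)
  {
    update_and_render();
    /* busy-wait until this frame's time budget is used up */
    while (GetMilliSpan(frame_start) < frame_ms)
      ;
    frame_start = GetMilliCount();
  }
}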
You should use gettimeofday(), but other than that... to my knowledge there are no better portable timers available.

reading the cycle counter from the CPU used to be popular on Intel platforms but now with multiple cores it's not so clean cut...
added on the 2009-06-30 13:10:18 by sasq
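For reference, a minimal sketch of a millisecond timer built on gettimeofday(), assuming a Unix-like system (get_millis() is just an illustrative helper name):

Code:
#include <stdio.h>
#include <sys/time.h>

/* milliseconds since some arbitrary epoch */
long get_millis(void)
{
  struct timeval tv;
  gettimeofday(&tv, NULL);
  return (long)tv.tv_sec * 1000 + tv.tv_usec / 1000;
}

int main(void)
{
  long start = get_millis();
  /* ... do some work ... */
  printf("elapsed: %ld ms\n", get_millis() - start);
  return 0;
}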
if you are on mac, you can use mach_absolute_time.
added on the 2009-06-30 13:17:11 by neoneye
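A minimal sketch of converting mach_absolute_time() ticks to nanoseconds with mach_timebase_info(), assuming Mac OS X (get_nanos() is an illustrative helper name; the multiply can overflow after very long uptimes):

Code:
#include <stdint.h>
#include <mach/mach_time.h>

/* monotonic nanoseconds since boot */
uint64_t get_nanos(void)
{
  static mach_timebase_info_data_t info;
  if (info.denom == 0)
    mach_timebase_info(&info);              /* fetch the tick -> ns ratio once */
  return mach_absolute_time() * info.numer / info.denom;
}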
I'm thinking about looking into the winapi if this doesn't work well.
added on the 2009-06-30 14:53:04 by xteraco
in windows the high resolution timer ( look for QueryPerformanceCounter ) does the job nicely.
added on the 2009-06-30 15:01:13 by smash
On Win32, you should use QueryPerformanceCounter/QueryPerformanceFrequency (or GetTickCount if you can live with the limited precision). On Unixish systems, gettimeofday is the weapon of choice.
added on the 2009-06-30 15:08:18 by KeyJ
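A minimal sketch of the QueryPerformanceCounter/QueryPerformanceFrequency pair on Win32, returning elapsed seconds as a double (qpc_seconds_since() is an illustrative helper name):

Code:
#include <windows.h>

/* seconds elapsed since 'start', using the high-resolution counter */
double qpc_seconds_since(LARGE_INTEGER start)
{
  LARGE_INTEGER freq, now;
  QueryPerformanceFrequency(&freq);         /* counts per second */
  QueryPerformanceCounter(&now);
  return (double)(now.QuadPart - start.QuadPart) / (double)freq.QuadPart;
}

/* usage:
   LARGE_INTEGER start;
   QueryPerformanceCounter(&start);
   ... do work ...
   double elapsed = qpc_seconds_since(start);  */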
Aren't you better off using the graphics API's VSync for accurate frame timing and something like Windows' GetTickCount() to control the speed of what's fed to the rendering engine?
added on the 2009-06-30 15:10:50 by doomdoom
I hereby donate to you my timer class:

Code:
#ifndef VSX_TIMER_H
#define VSX_TIMER_H

#include <time.h>
#ifdef _WIN32
#include <windows.h>
#else
#include <sys/time.h>
#endif

class vsx_timer {
  double startt;
  double lastt;
  double dtimet;
#ifdef _WIN32
  LONGLONG init_time;
#endif
public:
  void start() {
    startt = atime();
    lastt = startt;
  }

  double dtime() {
    double at = atime();
    dtimet = at - lastt;
    lastt = at;
    return dtimet;
  }

  // normal time
  double ntime() {
    return ((double)clock()) / ((double)CLOCKS_PER_SEC);
  }

  // accurate time
  double atime() {
#ifdef _WIN32
    LARGE_INTEGER freq, time;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&time);
    return (double)(time.QuadPart - init_time) / (double)freq.QuadPart;
#else
    struct timeval now;
    gettimeofday(&now, 0);
    return (double)now.tv_sec + 0.000001 * (double)now.tv_usec;
#endif
  }

  vsx_timer() {
#ifdef _WIN32
    LARGE_INTEGER time;
    QueryPerformanceCounter(&time);
    init_time = time.QuadPart;
#endif
  }
};

#endif
added on the 2009-06-30 15:11:42 by jaw
run dtime() to get the delta time since last call (once per frame)
added on the 2009-06-30 15:12:36 by jaw
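For example, a per-frame loop using the class above might look like the following; render() and the running flag are hypothetical placeholders for your own code:

Code:
// hypothetical frame loop using vsx_timer
vsx_timer timer;
timer.start();
bool running = true;
while (running)
{
  double delta = timer.dtime();   // seconds since the previous dtime() call
  render(delta);                  // advance the scene by 'delta' seconds
}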
Have the issues with performance counters and speedstepping CPUs been solved, though?
added on the 2009-06-30 16:35:30 by doomdoom
Mind you, SDL_GetTicks() does end up calling gettimeofday() on most platforms. If you're on Unix, setitimer with ITIMER_REAL and hooking SIGALRM gives you a resolution of about 10 ms typically; on some systems it's more precise.
added on the 2009-06-30 16:55:57 by sigflup
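A minimal sketch of that setitimer/SIGALRM approach, assuming a Unix system; the handler just counts ticks here, and the 10 ms interval is only what the post above suggests as a typical resolution:

Code:
#include <stdio.h>
#include <signal.h>
#include <unistd.h>
#include <sys/time.h>

static volatile sig_atomic_t ticks = 0;

static void on_alarm(int sig)
{
  (void)sig;
  ticks++;                                  /* one tick per timer expiry */
}

int main(void)
{
  struct itimerval tv;
  signal(SIGALRM, on_alarm);
  tv.it_interval.tv_sec = 0;
  tv.it_interval.tv_usec = 10000;           /* fire every 10 ms */
  tv.it_value = tv.it_interval;
  setitimer(ITIMER_REAL, &tv, NULL);
  for (;;)
  {
    pause();                                /* sleep until the next signal */
    if (ticks % 100 == 0)
      printf("ticks: %d\n", (int)ticks);    /* roughly once per second */
  }
  return 0;
}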
doom, not if you are a purist: the resolution of GetTickCount() is roughly 60 Hz, which is low enough to cause time aliasing artefacts if you happen to render at any other rate. And 60 Hz flickers an awful lot on CRTs..

Of course, if you just can't get an accurate timer: float hack(float t) { static float t2=0, t3=0, wtf=0.1; t2 += (t-t2)*wtf; t3 += (t2-t3)*wtf; return 2*t2-t3; }

As a bonus it will cause a hell of a lot cooler artifacts when the fps changes :)
added on the 2009-06-30 17:11:07 by 216
Mine, for windows:

In C: .h + .cpp
(replace int32 and int64 types as you need).

And a more "modern" (from 2003) port to .net, in c#: here

Don't know which dependencies it might need but should be easy to sort that out
added on the 2009-06-30 17:20:21 by Jcl
deja vu
added on the 2009-06-30 17:55:03 by texel
How about timeBeginPeriod(1) and timeGetTime() for a timer with 1 ms precision? Seems to work on XP and win7 at least, haven't tested on anything else.
added on the 2009-06-30 17:56:36 by snq
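A minimal sketch of capping a loop to ~60 fps with timeBeginPeriod(1)/timeGetTime(), assuming Win32 and linking against winmm.lib; do_frame() is a hypothetical placeholder for the per-frame work:

Code:
#include <windows.h>
#include <mmsystem.h>                       /* timeBeginPeriod, timeGetTime; link winmm.lib */

void do_frame(void);                        /* hypothetical placeholder */

void run_capped(void)
{
  timeBeginPeriod(1);                       /* raise timer resolution to 1 ms */
  DWORD next = timeGetTime();
  for (;;)
  {
    do_frame();
    next += 1000 / 60;                      /* ~16 ms frame budget */
    DWORD now = timeGetTime();
    if (now < next)
      Sleep(next - now);                    /* ~1 ms accurate after timeBeginPeriod(1) */
  }
  /* call timeEndPeriod(1) when shutting down */
}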
216: Yeah, I did a Google and it's a confused issue apparently. As far as I can tell the resolution of GetTickCount() is 1 ms, but the accuracy varies wildly, and it can be off by 20-100 ms, depending on who you believe.

Seems like a lot of people recommend the performance counter, but then a lot of people say it's completely unreliable due to speedstepping and multi-core systems.

timeGetTime() should be good though, right?
added on the 2009-06-30 18:14:44 by doomdoom
on Mac there is also CFAbsoluteTimeGetCurrent().
added on the 2009-06-30 18:18:11 by neoneye
GetTickCount() and timeGetTime() return the exact same value on all systems I've tried so far.

timeBeginPeriod(1) affects the scheduler granularity which makes both timers more accurate.
added on the 2009-06-30 18:43:07 by ryg
216: I guess you would need 30fps to use it then?
added on the 2009-06-30 20:51:49 by Hatikvah
Ryg, what systems have you tested that on?
timeBeginPeriod(1) has never affected GetTickCount() in my tests, again only tested XP Pro and Win7 (RC1).

I just tested using this loop, and GetTickCount still has a precision of 16-17 msec, not 1.

Code:
timeBeginPeriod(1);
DWORD last = 0;
for(int i=0; i<1000; i++)
{
  while(timeGetTime()==last) {};
  printf("timeGetTime:%u GetTickCount:%u\n", last=timeGetTime(), GetTickCount());
}
timeEndPeriod(1);
added on the 2009-06-30 20:57:52 by snq
