achieving compatibility with OpenGL

category: code [glöplog]
This is a coder thread.

Everyone who has made an OpenGL demo knows how painful and poorly documented it is to be compatible with all the gfx cards out there. Porting from NVIDIA to ATI especially brings lots of problems.

What about sharing knowledge about these incompatibilities? I suggest a format for the problems one can encounter:

vendor GFXcard x84524+:
- does not support (advertised) GL_WTF_esoteric_extension
- can't render to elliptic textures
- GLSL: no support for statements

We started hosting a bunch of glinfo2.exe reports here: OpenGL capabilities
With the disappearance of delphi3d.net, there seems to be no place to find such information. Er... there is this page but the full database doesn't seem to be available.

I hope it will provide useful tips about what you can expect from a given card, even if glinfo2 becomes less and less informative with the newer OpenGL versions. I don't know if GPU Caps Viewer has an export function but it could be interesting as well.
added on the 2010-04-09 16:52:53 by ponce ponce
Starting with pretty old cards:

ATI Radeon 9800:
- cannot render to NPOT texture
- GLSL: no texture2DLod in fragment shader
- GLSL: no constants array of matrices
- GLSL: no dynamic array index

ATI x700
- GLSL: no texture2DLod in fragment shader
- GLSL: no switch-case statement
- GLSL: no constants array of matrices
- GLSL: no dynamic array index
- should call glEnable(GL_TEXTURE_xD) and glDisable(GL_TEXTURE_xD) for each used texture unit, or samplers yield black
added on the 2010-04-09 16:57:17 by ponce ponce
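A minimal sketch of that last x700 workaround, assuming 2D textures bound to units 0..N-1 for a GLSL shader (glActiveTexture needs GL 1.3+ or the ARB extension; the function name is just for illustration):

```c
#include <GL/gl.h>

/* Even when sampling only from GLSL, enable the fixed-function texture
 * target on every texture unit you bind, or on these cards the samplers
 * read back black. Caller is assumed to have a current GL context. */
void bind_textures_ati_safe(const GLuint *textures, int num_units)
{
    for (int i = 0; i < num_units; ++i) {
        glActiveTexture(GL_TEXTURE0 + i);
        glBindTexture(GL_TEXTURE_2D, textures[i]);
        glEnable(GL_TEXTURE_2D); /* redundant for shaders in theory, required here */
    }
    glActiveTexture(GL_TEXTURE0);
}
```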
Use the highest OpenGL version possible (3.2 at the moment).
added on the 2010-04-09 17:02:22 by xernobyl xernobyl

vsync differences:

(swapInterval(1) and vsync enabled in driver settings)

ATI (Radeon Mobility 4570, latest Catalyst drivers):
- SwapBuffers() does not block
- VSync does not work in Win7/x64 (it works in Linux!)

NVIDIA (9800 GT):
- SwapBuffers() blocks

I had to put a glFinish() before the SwapBuffers() call to get
real smooth vsync'd animation on ATI. The alternative
would have been to usleep()/Sleep() before calling SwapBuffers.

Does anyone know a better solution?
added on the 2010-04-09 19:21:28 by xyz xyz
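A sketch of that workaround for the Win32 path, assuming vsync was requested via swap interval; glFinish drains the pipeline so the wait happens before the swap on drivers where SwapBuffers() returns immediately:

```c
#include <windows.h>
#include <GL/gl.h>

/* Present one frame with the ATI workaround described above. */
void present_frame(HDC hdc)
{
    glFinish();       /* block until the GPU is done; makes the wait deterministic */
    SwapBuffers(hdc); /* now paces to the vertical retrace on ATI as well */
}
```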
boobs. everything's better with boobs.
added on the 2010-04-09 19:27:07 by ferris ferris
cool link indeed
added on the 2010-04-09 20:06:06 by ponce ponce
ATI HD 2400, Catalyst 04/05/2010
- you cannot pass a sampler wrapped in a struct to a function. Pass the sampler as a separate argument, or it will yield black.
- a preprocessor routine must end with \n, EOF won't work (compiler fails with blank error).
- #version must be the first line of a shader, if used (NVIDIA accepts it anywhere)
- GL_AMD(X)_debug_output works but is nearly useless, don't bother implementing
added on the 2010-06-17 11:24:24 by ponce ponce
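The two source-formatting quirks above can be guarded against before calling glShaderSource: keep #version on the very first line, and make sure the string ends with '\n'. A hypothetical helper (ensure_trailing_newline is not part of any GL API, just a sketch):

```c
#include <stdlib.h>
#include <string.h>

/* Return a heap-allocated copy of `src` guaranteed to end with '\n',
 * since the ATI compiler above rejects sources ending at EOF with a
 * blank error. Caller frees the result; returns NULL on OOM. */
char *ensure_trailing_newline(const char *src)
{
    size_t len = strlen(src);
    int needs_nl = (len == 0 || src[len - 1] != '\n');
    char *out = malloc(len + (needs_nl ? 2 : 1));
    if (!out)
        return NULL;
    memcpy(out, src, len);
    if (needs_nl)
        out[len++] = '\n';
    out[len] = '\0';
    return out;
}
```

Feed the result to glShaderSource instead of the raw string; note the example source below keeps #version as the first line, which also satisfies the stricter compilers.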
OpenGL still suxx ? ( altho i feel bad about saying this, and even more about not already using this ! )
achieving compatibility with OpenGL

... through simply not using opengl

thats the easiest way ;)
added on the 2010-06-17 11:39:21 by gopher gopher
To be fair, ATI drivers for the HD series are miles better than for the x1000 series. Yet something usually breaks with each release.
added on the 2010-06-17 11:46:58 by ponce ponce
i'd love to break up with m$/dX and go oGL completely, but hearing/reading about stuff like this all the time just kills the movement!
I use http://www.kludx.com/. And it lists both Direct3D and OpenGL features.
added on the 2010-06-17 13:13:17 by pmdata pmdata
I suggest to rename this thread to "achieving ATI-compatibility with OpenGL".
added on the 2010-06-17 14:38:01 by hfr hfr
I use http://www.kludx.com/. And it lists both Direct3D and OpenGL features.

Nice database. I find it funny how Intel claims OpenGL 2.0 support without GL_ARB_multisample or GL_ARB_imaging.

Does anybody have experience with Intel + OpenGL "2.0"?
added on the 2010-06-17 14:45:58 by ponce ponce
The #version directive must occur in a shader before anything else, except for comments and white space

hfr: that's just because ponce hasn't started on glsl1.5 or geoshaders yet ;) (NVIDIA has turned pretty strict, even stricter than the standard says, with the newer versions).
added on the 2010-06-17 17:09:31 by Psycho Psycho
Actually this has been in the spec since GLSL 1.10; debugging bad GLSL is also part of my work.
added on the 2010-06-17 18:04:10 by ponce ponce
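Per the spec line quoted above, only comments and whitespace may precede #version, so a portable fragment shader starts like this (minimal GLSL 1.10 sketch):

```glsl
// comments and whitespace may come before #version, nothing else
#version 110

uniform sampler2D tex;

void main()
{
    gl_FragColor = texture2D(tex, gl_TexCoord[0].xy);
}
```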