determining why glsl shaders fail on ati

category: code [glöplog]
the situation: Godwin's law calls for a refresh :D
added on the 2012-01-05 18:30:16 by kbi kbi
Back on topic: here are the problems I've run into this week making an ATI/AMD-compatible version of our latest demo ( http://www.pouet.net/prod.php?which=58261 ) :


  • ATI has no automatic casting from float to int, so wrap the result of mod(), step(), etc. in int(...) when assigning to an int (see the sketch after this list).
  • Array indexing must also be done with ints; nVidia happily accepts floats.
  • ATI is far pickier about the number of components:
    This is OK on nVidia: texture2D( Texture0, gl_TexCoord[0] + vec2( theDistance, 0.0) )
    ATI requires you to follow the spec: texture2D( Texture0, gl_TexCoord[0].st + vec2( theDistance, 0.0) )
  • If a sampler is defined but not accessed in a shader (for example for shadowmapping with a variable number of lightsources), nVidia doesn't care about the texture unit value, it can be completely invalid. ATI crashes in the driver if invalid texture units are used. It seems they all must have a different texture unit as well; assigning 0 to several unused samplers doesn't crash, but doesn't draw anything either.
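
Putting the first three points together, here's a minimal fragment shader sketch of the strict form both vendors accept. This is only an illustration: the uniform names, the array size and the math are made up, not taken from the demo above.

Code:
// strict GLSL that both vendors accept (names and values are illustrative)
uniform sampler2D Texture0;
uniform vec4 weights[4];
uniform float theDistance;

void main()
{
    // explicit float -> int conversion before using the value as an array index
    int idx = int( mod( gl_TexCoord[0].x * 4.0, 4.0 ) );
    vec4 w = weights[idx];    // index with an int, never with a float

    // pass exactly two components to texture2D: .st, not the whole vec4
    vec4 c = texture2D( Texture0, gl_TexCoord[0].st + vec2( theDistance, 0.0 ) );

    gl_FragColor = c * w;
}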


Conclusion: supporting both ATI and nVidia is a good way to discover bugs in your demo :) (and the occasional driver bug :( Shakes fist at AMD...)
added on the 2012-01-05 23:00:24 by Seven Seven
<procrastinating Bypass coder>: Was trying to be sarcastic. I just find the idea of the OpenGL mess clashing with Apple's fascist nitpicking very amusing. :)

But there might also be a solution out there. Technical and/or 'political'. Apple is big enough to lead the way and snap at the OpenGL consortium + h/w vendors. There is also the concern that the OGL situation is in part Microsoft's doing. NVidia is working closely with Microsoft, no? No real vested interest to remedy OpenGL there. Direct3D works beautifully for the most part, on Windows and Xbox.

psonice: So I guess it's still up to the drivers to compile/verify shader code?
added on the 2012-01-06 02:44:57 by Yomat Yomat
yomat: i've never investigated it, but i can't see it working any other way (unless there's some glsl -> byte code layer written by apple, then the drivers interpret that.. but I doubt it).

Related note: OSX includes Core Image as well as GLSL. It's basically a 2d image processing language that runs on the GPU, with the language itself being mostly a subset of GLSL (I guess this gets translated to GLSL by the OS and then compiled by the GPU driver). CI is really good for a lot of image processing stuff, because you get all the benefits of opengl without all the setup and other shit. It does have some annoying limitations though (e.g. conditionals are somewhat crippled).
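
For anyone curious, a Core Image kernel is written in a GLSL-like dialect (CIKL); a minimal sketch, with the kernel name and effect picked purely for illustration:

Code:
// Core Image kernel (CIKL, a GLSL-like dialect) - illustrative only
kernel vec4 invertColor( sampler image )
{
    vec4 c = sample( image, samplerCoord( image ) );  // read the source pixel
    return vec4( 1.0 - c.rgb, c.a );                  // invert RGB, keep alpha
}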
added on the 2012-01-06 02:56:42 by psonice psonice
One of the problems that shows up when moving away from the compatibility profile is that gl_Color and the other built-in variables are not supported anymore. I found a demo which had this problem, and used this awesome tool to dump the shaders. I then found the previous version of the demo via google and compared the shaders to see how they fixed the problems people were having on ati.

Unfortunately, the glsl in question,
Code: gl_FrontColor = gl_Color;

was simply removed from the shader, so I'm not sure how it was actually fixed :D
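
For reference, the usual core-style replacement for that line is a user-defined varying in place of gl_FrontColor/gl_Color; a minimal sketch, assuming the application feeds its own colour attribute and matrix (the names mvp, position, vertexColor and color are made up):

Code:
// vertex shader - core-style GLSL, illustrative names
#version 130
uniform mat4 mvp;        // assumed: the app supplies its own matrix
in vec4 position;
in vec4 vertexColor;     // replaces gl_Color on the vertex side
out vec4 color;          // replaces gl_FrontColor

void main()
{
    color = vertexColor;
    gl_Position = mvp * position;
}

// fragment shader
#version 130
in vec4 color;           // replaces gl_Color on the fragment side

void main()
{
    gl_FragColor = color;   // gl_FragColor is deprecated but still legal in 1.30
}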
added on the 2012-01-06 04:26:50 by shuffle2 shuffle2
also, about osx. just browse this and cry:
http://developer.apple.com/graphicsimaging/opengl/capabilities/GLInfo_1072_Core.html
added on the 2012-01-06 04:35:53 by shuffle2 shuffle2
<procrastinating Bypass coder>

plek | supah | warp ?
Shuffle2: I guess that's one approach. Locking down to an older, stable version. Also, whatever that software renderer is (Mesa?), it could act as a common tool for developers, like the D3D reference rasterizer.

Kaneel: 'the situation'.
added on the 2012-01-06 12:43:54 by Yomat Yomat
yomat: failed to pick up on that, shame on me :)
added on the 2012-01-06 12:49:37 by superplek superplek
Just wondering... the Frostbite engine works just fine with ATI and nVidia. So do we have to learn something, or just blame the shit?
added on the 2012-01-06 13:01:27 by moredhel moredhel
In the end it's perfectly possible to write even OGL software that runs on both, duh. Just takes knowhow and *time*.
added on the 2012-01-06 13:20:58 by superplek superplek
@the situation

Coders just want to have fu...err. :D Kind of a funny thread in itself. Nvidia is shit, so is ATi. The only problem seems to be coding a path for each of them. The Skyrim engine and the Frostbite engine proved that it's possible to do it on both.


added on the 2012-01-06 13:57:59 by moredhel moredhel
moredhel: hey genius - frostbite 2 is d3d11, not opengl, and d3d doesn't suffer from the retarded situation gl has got into of putting all of the shader compilation - front and back end - in the driver code and letting the vendors do what they want with it. it makes it a bit more practical to make code that works cross vendor if you "only" have to deal with their hw differences, warp sizes and driver bugs, not the whole shader compiler being different as well.

imagine if all your apps shipped as c source and had to be compiled on the target machine. but every target machine had its own different compiler to do it. and most of the compilers didn't implement the standard properly. and some of them let you use c++ even tho it's meant to just be c, some of them let you use c99 and some claimed to be strict ansi but had a bunch of their own extensions anyway just for a laugh. that's how idiotic it is. combine that with glsl's utterly pointless pedantry, and you wonder why people have problems making compatible code.

added on the 2012-01-06 14:11:16 by smash smash
The situation is actually not as bad as you might imagine when you don't use OpenGL, though; it's just that if you use Cg functions in your GLSL code, well, it doesn't work very well on ATI. :-) doh.
added on the 2012-01-06 14:18:23 by nystep nystep
@smash: amen.

@moredhel: if it's just fun you're after, it's acceptable to limit your target audience, right? though in my experience it's also fun to *properly* fix things :) but that differs across the spectrum of coders.
added on the 2012-01-06 14:52:09 by superplek superplek
d3d *cough* portable *cough*
added on the 2012-01-06 14:57:37 by raer raer
nystep: um, I don't know how it is not clear yet, but I was never using Cg functions in GLSL...
added on the 2012-01-06 22:16:38 by shuffle2 shuffle2
d3d is a ms-specific api and everyone knows that, or should know it and act accordingly.

it really doesn't excuse opengl for being untrue to whatever the hell "cross-platform" is supposed to mean.

opengl is what the amsterdam gay parade is to most gay men: a travesty.
added on the 2012-01-06 22:29:33 by superplek superplek
Quote:
nystep: um, I don't know how it is not clear yet, but I was never using Cg functions in GLSL...


yes you were. saturate() and frac() are Cg, not GLSL, as pointed out earlier.

again, if you write legal GLSL, it just works everywhere. if you write illegal GLSL, it doesn't. how surprising!
added on the 2012-01-06 23:16:08 by iq iq
Quote:
Code:
#define frac(x) fract(x)
#define saturate(x) clamp(x, 0.0f, 1.0f)
???
added on the 2012-01-06 23:25:47 by shuffle2 shuffle2
@shuffle2: regarding your earlier question: instead of using gl_FragColor (etc) you are supposed to declare an "out vec4 fragColor;" variable in your fragment shader, then call glBindFragDataLocation to bind that variable to the output 'slot'. being a GLSL noobie myself, I found this info on the GLSL wikipedia page ;) works fine on GL 3.3 although the manpage says it's a GL 4.x call. hope that helps!
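
A minimal sketch of that, assuming the output should land in colour attachment 0 (the name fragColor is arbitrary):

Code:
// fragment shader with a user-defined output (GLSL 1.30+)
#version 130
out vec4 fragColor;    // takes the place of gl_FragColor

void main()
{
    fragColor = vec4( 1.0, 0.5, 0.25, 1.0 );
}

// host side, before linking the program object:
// glBindFragDataLocation( program, 0, "fragColor" );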

as someone else already mentioned, check the shader compiler logs (glGetProgramInfoLog etc) and fix any warnings to improve compatibility (same as you (should) do when coding C/C++!)
added on the 2012-01-06 23:27:11 by xyz xyz
xyz: I found that there too ;) Haven't done it yet... but my first question was "why have they changed it to this?" (i.e. what is the advantage?) I'm just curious.
added on the 2012-01-06 23:34:10 by shuffle2 shuffle2
ah sry, you were talking about gl_FrontColor. the Nvidia driver here does not complain when using that in a #version 130 vertex shader (?!)
added on the 2012-01-06 23:34:54 by xyz xyz
@shuffle2: I am not exactly sure but I guess it has something to do with the output formats, e.g. when one of the outputs is just luminance (instead of a vec4) *shrug*
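
One hedged guess at the advantage: user-defined outputs generalize to multiple render targets with differently typed outputs, which the fixed vec4 gl_FragColor/gl_FragData don't express as directly. A purely illustrative sketch:

Code:
// GLSL 3.30 sketch: differently typed outputs for MRT (names illustrative)
#version 330
layout(location = 0) out vec4 albedo;   // -> GL_COLOR_ATTACHMENT0
layout(location = 1) out float luma;    // -> GL_COLOR_ATTACHMENT1, single channel

void main()
{
    albedo = vec4( 1.0 );
    luma   = dot( albedo.rgb, vec3( 0.299, 0.587, 0.114 ) );
}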
added on the 2012-01-06 23:37:43 by xyz xyz
