
UINT in HLSL

category: general [glöplog]
 
Is there some sort of restriction for how you can use uints in HLSL? The MSDN page for HLSL scalar types doesn't seem to mention anything special regarding them. The problem is that if I declare a variable as uint in my HLSL pixel shader, the program crashes. Signed ints work (and unsigned int in GLSL works).
This is on DX10 / GF7700 btw, using the ps_3_0 profile.
added on the 2009-03-08 20:38:52 by mic mic
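For concreteness, a minimal sketch of the kind of shader being described; the entry point, semantics and variable names here are assumed, not taken from the thread:
Code:
// Hypothetical ps_3_0 pixel shader. Per the report above, this compiles
// with "int v", but changing the declaration to "uint v" breaks it.
float4 main(float2 uv : TEXCOORD0) : COLOR0
{
    uint v = 3;
    return float4(uv, 0.0, (float)v / 255.0);
}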
are you sure unsigned int in GLSL works on a card that is less than shader model 4? i.e. does it also work on your setup?
added on the 2009-03-08 21:32:42 by nystep nystep
Quote:
are you sure unsigned int in GLSL works on a card that is less than shader model 4? i.e. does it also work on your setup?

Yep, it works on this rig. It also works on a GF6200.
added on the 2009-03-08 21:41:09 by mic mic
Quote:
DX10 / GF7700 btw, using the ps_3_0 profile

Does not compute.
It doesn't make sense. DX10 does not work on this card. And integer arithmetic is new in Shader Model 4.0.
added on the 2009-03-09 00:04:55 by imbusy imbusy
imbusy: i think if a card doesn't support DX10 features it doesn't matter; it still runs the application as long as you don't use any of those features (in other words, d3d10.dll is ABI compatible), as far as i know. but i really am not the one to speak about DX10; smash, ryg or chaos might be more of a help.

mic_: on the other hand, for integer operations on the GPU you need shader model 4, and unfortunately for you, none of the cards you cited supports SM4.0
added on the 2009-03-09 00:22:37 by decipher decipher
uhm, but since they removed the caps-bits, how can one know if a program will run or not?
added on the 2009-03-09 00:50:53 by kusma kusma
to be a dx10 compatible card, it must support the whole dx10 specification. that's one of the reasons why microsoft removed the caps bits (to get less card-specific code).
added on the 2009-03-09 00:57:10 by pantaloon pantaloon
Yeah, that's what I thought. So how come mic_ is able to run DX10 on gf6 and gf7?
added on the 2009-03-09 01:08:34 by kusma kusma
he can't, period. he's using dx9, obviously :)
added on the 2009-03-09 01:11:53 by ryg ryg
while trying to use DX10-features, obviously :)
while he should be coding GBA, obviously :)
@graga: GBA? You've had one too many Faxe to drink. I'm coding on Sega consoles these days. Sega does what Nintendon't, y'know? :P

@rest: What I'm trying to do is to divide a value by some other value, then take the lowest 8 bits of the result (to get a value in the range 0..255). Since I can't use bitwise operators, the way I did it in GLSL was like this:
Code:
uintVar = uint(floor(floatVar));              // truncate to an integer
uintVar /= someValue;                         // integer division
uintVar = uint(mod(float(uintVar), 256.0));   // keep the low 8 bits
floatVar = float(uintVar);

Works fine on the two cards I mentioned. Trying to do a similar operation in HLSL doesn't work; it crashes just from including the variable declaration.
added on the 2009-03-09 07:12:22 by mic mic
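For comparison, here is one way the same operation could be written in HLSL using only float arithmetic, sidestepping uint entirely; this is a sketch, with the variable names carried over from the GLSL snippet above:
Code:
// All-float version of the divide-then-take-low-8-bits trick, which is
// close to what ps_3_0 does anyway since SM3 has no true integer types.
float tmp = floor(floatVar);     // integer part
tmp = floor(tmp / someValue);    // integer-style division
floatVar = fmod(tmp, 256.0);     // value mod 256 == lowest 8 bits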
mic_: my guess is that your shader fails to compile, and you end up trying to make COM-calls to an uninitialized pointer. Which is a very bad idea. Make sure you check return-codes, especially on object creation.
added on the 2009-03-09 08:40:46 by kusma kusma
Well, yeah, it crashes when I change a single variable declaration from int to uint, so I guess it's pretty obvious that it fails to compile the shader. What I'd like to know is why it fails. Or I'll just have to see if I can use another workaround in HLSL...
added on the 2009-03-09 09:29:51 by mic mic
no error message returned? :)
added on the 2009-03-09 11:33:42 by Gargaj Gargaj
mic_: because you need shader model 4 and DX10 to use "uint" in shaders. It's not a supported datatype in shader model 3. This has been pointed out many times in this thread already.
added on the 2009-03-09 11:37:31 by kusma kusma
That being said, uint works just fine on my gf8 in dx9. But I can't find it mentioned in the documentation for other purposes than DX10. Anyway, I gave your code a go, and it didn't compile, due to the use of mod(). When I replaced that with "%", the code worked. But again, I don't think you can reliably use uint in your DX9-code - I suspect the support is only there in the compiler for DX10-purposes, and that it's a bug that it doesn't prevent you from using it. But I could be wrong, of course.
added on the 2009-03-09 11:44:23 by kusma kusma
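A sketch of the %-based variant kusma describes, with the same assumed variable names (and someValue assumed to be a positive integer); per his report it compiles, but only on SM4-class hardware like the GF8, so it should not be relied on under DX9:
Code:
// uint/% variant; reportedly compiles on a GF8, but uint under DX9
// looks like an accident of the compiler rather than supported usage.
uint u = (uint)floor(floatVar);
u /= (uint)someValue;
u = u % 256;                     // % worked where mod() did not
float result = (float)u;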
@moose: Decipher and imbusy mentioned "integer operations/arithmetic". They didn't make any distinction between signed and unsigned, and signed ints do work in HLSL on my GF7.

@gargaj: None checked. I'm using Hitchhikr's DX framework to load the shader, and it's written to be as small as possible, not to check for errors. I guess I could make a debugging sandbox for HLSL shaders, but I was mostly just curious about why it would work in GLSL but not in HLSL, and ass-u-med that someone here would've run into this before, since there are people here who do a lot of shader coding.

I'll just use something else in HLSL, it's not a major issue.
added on the 2009-03-09 11:53:17 by mic mic
@moose: The code I pasted was GLSL. I'm using fmod in the HLSL code.
added on the 2009-03-09 11:54:35 by mic mic
whatever. just use GLSL, it's better. it's not me who said it, it's mic_! :)
added on the 2009-03-09 11:57:33 by nystep nystep
the problem is that at least in dx9, uint doesn't appear to be a valid type (when i checked it in fxcomposer, anyway), even though it claims to be in the hlsl docs. so that's why your shader doesn't compile. compilers spit out warnings and errors for a reason, you know.. :) could you not just use int instead, though?
on ps3.0 profiles/dx9 pixel shaders it'll all be compiled using floats anyway, so hey. :)
added on the 2009-03-09 12:49:52 by smash smash
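smash's int suggestion, again as a sketch with assumed names; his point above is that under ps_3_0 this still compiles down to float arithmetic:
Code:
// int variant for ps_3_0; the compiler emulates integers with floats
// on SM3 targets, so this costs about the same as the all-float version.
int i = (int)floor(floatVar);
i /= (int)someValue;
i = i % 256;
float result = (float)i;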
mic_: What I tried to say was "Just because it runs, doesn't mean it's correct".
Smash: sure you didn't mean "could you not just use float instead, though?"
added on the 2009-03-09 12:55:55 by kusma kusma
Yeah, I've fixed it already, using an extra if-clause.
added on the 2009-03-09 13:35:23 by mic mic
