pouët.net

*-only demos

category: general [glöplog]
scali: i think you trust vendor driver quality a bit too much.
added on the 2004-01-29 12:56:26 by smash smash
Radeon drivers worked fine for me so far (plenty of experience with them, I test on a Radeon laptop and R8500). GeForce drivers have some problems (gf4 doesn't seem to be able to handle 1d texcoord generation in some cases).
It's better than not having it run at all anyway.
In general both the GeForce and Radeon drivers are quite mature at this point, and you shouldn't have to worry. I am sure that an exception can be made if you happen to run into a driver bug.
But yes, I am willing to say, if it runs on the GeForce drivers, it will most probably run on Radeon drivers. My experience is that Radeon drivers are closer to Direct3D spec than GeForce drivers are.
added on the 2004-01-29 13:16:56 by Scali Scali
Compatibility with APIs between ATI and NVIDIA is obviously a matter of competition between them. They make alliances with Microsoft as to how the next version of DX should work etc. etc. It's a complete mess for end-users but that's how it is. I don't believe this marketshare competition should affect the demoscene. And since inevitably it does, I think we should try to avoid flaming about it. It's their problem, let's not make it ours too.

Anyway, I think it would be nice if compatibility was mentioned in parties (this prod runs on ATI only, for example) so that voters can take that into account. I think it's wrong to bash people for making *-only demos. It's their work, they should do as they please. Finally, if party organizers want to enforce some compatibility rules in the demo compo (like ATI 8500+ and GF3+) it's their choice, which (imho) should be based on the general quality / importance of the party, the significance of the sponsors, the amount of prizes involved etc. I mean, you cannot demand specific compatibility guidelines from contestants when the prizes are nil or insignificant.

Just for the record, I am an nvidia card owner myself and I, too, wasn't able to watch the SotA ATI-only winning demo, which I suppose started this thread.

added on the 2004-01-29 13:24:44 by moT moT
Looks like it was the nv-only runner-up by blackmaiden that started it: http://www.pouet.net/prod.php?which=11456

The winner is not ATi-only; it is reported to run on GeForce FX, although slowly, but that is a known problem of the FX anyway (it might also render some parts incorrectly, I haven't seen it yet).
In fact, I have never seen an ATi-only demo yet. Only nv-only. Which makes the choice for ATi-hardware much more acceptable. It might start an ATi-only trend, but this is unlikely for various reasons already mentioned above. It will however eliminate the nv-only trend, which is a good thing. At worst the trend goes from nv-only to ATi-only, and nothing is gained, nothing lost. At best, demos will become compatible, I think it's worth a try.
added on the 2004-01-29 13:42:24 by Scali Scali
moT, erm. seriously, if (and I could not for the world understand why) coocoon really made it vendor specific, then so be it. they obviously write really lousy code if they must rely on opengl extensions that are for ati cards only, so it's their problem imo if that's the case. if it's a dx9 demo, then tough luck for you if you bought a sucky dx9 class card instead of a good one.
that's it :)
added on the 2004-01-29 13:45:35 by steffo steffo
Smash: What century are you living in? Using GeForce3-class functionality and you claim ATI can break on it? On WHAT?

For instance you blamed ATI for not being able to have a zbuffer rendertarget, an ugly hack that nVidia does. Sure, they don't support it! But they don't claim they do! Check the caps! (I remember you running around nagging about bad ATI drivers in this case, which is obviously not the case!)

So again, WHAT can you break? My old DX7 shitstuff still runs on all cards I've tried, my DX8 shitstuff still runs as well (I had a bug in the vertexbuffer class in one intro (chipgoa) which made it bug with ONE ATI driver release (the same one which fucked with fr's candytron), but then again, that was *my* fault).

Speed may differ between drivers, and rendertargets were buggy on nVidia some years ago, but even then nVidia reported the problem back through their drivers, so if you handled your API properly it was still not a problem: you could easily check that the feature was missing, inform the user, and the user would (obviously) have to switch drivers, since he should know that his GeForce card supports alt. rendertargets...

And again, if you make anything run faster on GF4 than on Radeon 9800 pro, BE MY GUEST! >> stefan@mirrorgate.se , ( sourcecode included ) otherwise i will just void all your comments on this matter.
added on the 2004-01-29 13:58:05 by Hatikvah Hatikvah
And while I own a radeon9500 myself, I have to agree slightly with Smash. Driver support is still a bit flaky sometimes (unless the most impressive thing you're doing is a render-to-texture), but at least then you can assume your prod will eventually work on ATI. Just blatantly using NV-only stuff, without a good reason, will just lock out a large percentage of viewers.

Again, if that's what you want to do, that's fine, but it just seems stupid. (So that also means I agree with DXMStefan, argh).
added on the 2004-01-29 14:01:40 by sagacity sagacity
stefan: im not talking about extra stuff nvidia support which ATI dont (like the zbuf hack), but standard stuff. clearly in my post i didnt advocate writing nvidia/ati only code using gl extensions or whatever (what did i say? "asking for trouble").

but we still live quite far from a world where standards really are standard, and you can code on one card which claims to support some features (any feature of your choice), run it on another card which claims to support some features, and see the same visual result, only the speed differing. that just doesnt happen at the moment. on that day its reasonable to demand compatibility or kick it out the compo, but until then some understanding is still required.
sagacity, yeh i do expect it eventually will work on ATI, but thats not particularly useful just before a compodeadline =)
added on the 2004-01-29 14:20:08 by smash smash
smash: still, my question was, what can you do on a gf3-4 with dx that doesn't run straight off on the radeon 9500+ ?
added on the 2004-01-29 14:38:10 by Hatikvah Hatikvah
added on the 2004-01-29 14:46:25 by superplek superplek
stefan: IF the drivers are working, nothing.
added on the 2004-01-29 14:52:27 by smash smash
stefancrs: Hey, I've mostly managed to avoid landing in a hospital after organizing a party so far. :)
added on the 2004-01-29 15:10:41 by ryg ryg
plek: damn straight, and rightly so.

smash: yeah okay, but if there are slight graphic glitches you could argue that "it's still ATI-compatible" and then have them run it on the NV-compomachine (in the case of 2 compomachines, that is).
added on the 2004-01-29 16:45:08 by sagacity sagacity
unfortunately, even if you write clean code and don't use any vendor-specific feature, if you are doing some advanced stuff and only test on one card, it will surely have a few bugs on the other vendor's card.

but most likely that's easy to fix, especially at a big party-place with plenty of opportunity to test and some friends offering help.

the only demos that are really hard to fix are those that use vendor specific extensions.

i can remember running through a partyplace, checking my stuff on every available setup, finding and fixing bugs even for my old A500 demos.
added on the 2004-01-29 17:15:54 by chaos chaos
The solution is to provide a divx version with the demo.

Ask to provide a divx together with the demo :)
Be nice to coders.
I'm not a coder myself but I feel so sorry for them
added on the 2004-01-30 09:57:29 by nytrik nytrik
"Demos do not have to exceed 3mb - just look at what you can do in 64kb"

Now that's the quote of the year. And last year.
added on the 2004-01-30 13:46:27 by thorsten thorsten
I'm 100% with ryg. nVidia or ATi only demos are really annoying.

In OpenGL the ARB path is always there, fragment program, vertex program, and vertex buffer object... no reason for vendor specific stuff.
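As an aside, the standard way to detect an ARB extension at runtime is to look for its name in the string returned by glGetString(GL_EXTENSIONS). A minimal C sketch of a robust check (has_extension is a hypothetical helper, not from the post; it matches complete tokens rather than using a bare strstr(), since e.g. "GL_ARB_fragment_program" is a prefix of "GL_ARB_fragment_program_shadow"):

```c
#include <string.h>

/* Return 1 if `ext` appears as a complete space-separated token in
   `extensions` (the string from glGetString(GL_EXTENSIONS)), else 0. */
static int has_extension(const char *extensions, const char *ext)
{
    size_t len = strlen(ext);
    const char *p = extensions;

    while ((p = strstr(p, ext)) != NULL) {
        /* Token must start at the beginning of the string or after a space, */
        int start_ok = (p == extensions) || (p[-1] == ' ');
        /* and end at a space or at the end of the string. */
        int end_ok = (p[len] == ' ') || (p[len] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += len; /* partial match, keep scanning */
    }
    return 0;
}
```

In a real loader you would then call has_extension() once per feature ("GL_ARB_fragment_program", "GL_ARB_vertex_buffer_object", ...) and pick a fallback path when it returns 0.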

On the D3D side: CHECK THE CAPS, DUMBHEAD!
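The caps-checking pattern being shouted about looks roughly like this (a self-contained C sketch: struct caps and pick_path are stand-ins invented for illustration; in real D3D code the structure is D3DCAPS9, filled in via IDirect3D9::GetDeviceCaps(), and the version encoding matches the D3DPS_VERSION macro):

```c
/* Stand-in for the PixelShaderVersion field of D3DCAPS9. */
struct caps {
    unsigned long PixelShaderVersion;
};

/* Same encoding as D3D's D3DPS_VERSION macro: 0xFFFF in the high
   word, major version in bits 8-15, minor version in bits 0-7. */
#define PS_VERSION(major, minor) \
    (0xFFFF0000ul | ((unsigned long)(major) << 8) | (unsigned long)(minor))

/* Pick a rendering path from what the driver reports,
   instead of assuming the feature exists. */
static const char *pick_path(const struct caps *c)
{
    if (c->PixelShaderVersion >= PS_VERSION(1, 4))
        return "ps_1_4 path";       /* e.g. Radeon 8500 class */
    if (c->PixelShaderVersion >= PS_VERSION(1, 1))
        return "ps_1_1 fallback";   /* e.g. GeForce3 class */
    return "fixed-function fallback";
}
```

The point of the rant: if a card doesn't advertise a feature in its caps, falling back is your job, not the driver's.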
added on the 2004-01-30 14:23:17 by EvilOne EvilOne
I really don't know about D3D, but with OpenGL the whole thing has gotten out of hand, mainly thanks to nVidia. I use ARB extensions *only*, and try to do a fallback for the features used as well, whenever possible.

So, B would be ok for me, but i'd prefer C since i think that the "pushing the limits" should not be forbidden. "Let the people decide".
added on the 2004-01-30 15:40:39 by kurli kurli
On the D3D side: CHECK THE CAPS, DUMBHEAD!

But trusting those cap values and relying on them can be pretty risky sometimes. I guess it's not such an issue today though, with rather mature drivers and all...
added on the 2004-01-30 15:51:17 by tomaes tomaes
vote : c)
have a nice day
added on the 2004-01-30 16:49:57 by v3nom v3nom
So, I've got to chime in with a last point.
Yes, coders (as well as the rest of the democrew) do it for free, but it's not all about (trying to) winning a compo. I mean demos are meant to be seen by other sceners, not only by yourself and the orgas...
It would be like writing a book, giving it to the publisher but saying "no, don't publish this book please because I'm too arrogant"
Sorry guys, *-only demos suck, especially when *={your computer, orgas' computer(s)}
And I can remember a bunch of demos that wouldn't run even with the right hardware.
LEARN TO CODE!
