
Intel and DreamWorks Working On Rendering Animation In Real-Time

category: offtopic [glöplog]
 
http://tech.slashdot.org/story/11/11/16/2035250/intel-and-dreamworks-working-on-rendering-animation-in-real-time
Who would have thought, realtime is faster? I'm curious whether they're going to release any specs, or if it's just vaporware.
added on the 2011-11-17 02:54:41 by numtek numtek
hmm, what does that remind me of.. hmmm...
added on the 2011-11-17 03:19:11 by CobaltHex CobaltHex
I suspect that by 'realtime' they mean at least interactive speeds, on a hugely expensive workstation or server setup.

I saw something recently about Intel making a new 'coprocessor' with a huge number of Pentium-like cores (basically what Larrabee became). I guess a rack or two full of these, plus maybe a few dozen high-end GPUs, could give a decent realtime render. Stream the results back to the animator's workstation OnLive-style and you have what they're talking about.

Something like that would be seriously revolutionary for animators, especially those bits of the pipeline where you have to wait for a fairly high-quality render to get some feedback on what you're doing. It won't mean much to us for quite a while yet, though, I reckon.
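
Very roughly, the split might look something like this toy sketch, with threads standing in for farm nodes and a dummy shading function (the tile size, resolution and the whole setup are made up for illustration; nothing here is the actual Intel/DreamWorks pipeline):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

constexpr int WIDTH = 1920, HEIGHT = 1080, TILE = 135;  // 8 tile rows of 135 px

// Stand-in for whatever a farm node would really run (path tracer, REYES, ...).
static uint32_t shade(int x, int y) {
    return static_cast<uint32_t>((x ^ y) & 0xff) * 0x010101u;  // cheap XOR pattern
}

static void render_tile(std::vector<uint32_t>& frame, int tx, int ty) {
    for (int y = ty; y < ty + TILE && y < HEIGHT; ++y)
        for (int x = tx; x < tx + TILE && x < WIDTH; ++x)
            frame[static_cast<std::size_t>(y) * WIDTH + x] = shade(x, y);
}

int main() {
    std::vector<uint32_t> frame(static_cast<std::size_t>(WIDTH) * HEIGHT, 0);
    std::vector<std::thread> nodes;

    // One thread per row of tiles here; on a real farm each tile would go to a
    // rack node and the finished pixels would be streamed back over the network.
    for (int ty = 0; ty < HEIGHT; ty += TILE)
        nodes.emplace_back([&frame, ty] {
            for (int tx = 0; tx < WIDTH; tx += TILE)
                render_tile(frame, tx, ty);
        });

    for (auto& n : nodes) n.join();
    std::printf("frame assembled: %dx%d\n", WIDTH, HEIGHT);
    return 0;
}
```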
added on the 2011-11-17 10:59:56 by psonice psonice
So, 3D rendering goes cloud?
added on the 2011-11-17 11:27:10 by nitro2k01 nitro2k01
imagine the fancy new Avatar in full HD 3D, in realtime
added on the 2011-11-17 11:29:18 by pista pista
Prediction: They're so used to their current workflow that they'll just use the really fast rendering procedure to render offline at even finer detail with even crazier realism.
What Graga said.
Maybe they're using this F/CPU.
added on the 2011-11-17 12:12:04 by raer raer
raer: that's the one I was thinking of. A rack full of them would churn out some pretty pixels at speed :)
added on the 2011-11-17 12:39:13 by psonice psonice


From the article: "However, developers need to code their software in order to take best advantage of GPUs."

Personally, I'm a bit concerned by this. Are we in the demoscene community ready to CODE our software and thereby take best advantage of the GPUs? Frankly, I'm not convinced that we are. People! You NEED to CODE your software. Otherwise there's a bunch of GPU capacity just sitting out there not having someone taking best advantage of it. Just because people were too lazy to CODE THEIR SOFTWARE. Sheeesh...
added on the 2011-11-17 18:31:50 by elfan elfan
Yeah, back in the day people still wrote code. The games these youngsters play nowadays... just no code at all. Where did the glamour go? Bring back the code! Even OSes these days contain next to nothing code-wise.
added on the 2011-11-17 21:21:34 by numtek numtek
Quote:
You NEED to CODE your software. Otherwise there's a bunch of GPU capacity just sitting out there not having someone taking best advantage of it.


On modern PC demos, GPU processing power is already heavily used by increasingly complex shaders (not to mention geometry shaders) and geometry. To use that processing power for GPGPU work as well, you'd need an additional GPU, which would raise the price of a demo-ready PC even higher.
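
For illustration, that "second GPU for compute" setup could be as simple as pinning the GPGPU work to another device, something like this sketch (using the CUDA runtime purely as an example; the device numbering and the keep-device-0-for-rendering policy are my assumptions, not anything from the article):

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("no CUDA device found\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop{};
        cudaGetDeviceProperties(&prop, i);
        std::printf("device %d: %s (%d SMs)\n", i, prop.name, prop.multiProcessorCount);
    }
    // Hypothetical policy: rendering stays on device 0, GPGPU goes to device 1 if present.
    int computeDevice = (count > 1) ? 1 : 0;
    cudaSetDevice(computeDevice);
    std::printf("GPGPU work pinned to device %d\n", computeDevice);
    return 0;
}
```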
added on the 2011-11-17 21:57:27 by zerkman zerkman
