VC++ is slow.

category: general [glöplog]
Of course there is a way to manually evaluate that, but what he's trying to say is that the human capability to evaluate such things decreases as the complexity of the hardware you're working on increases. On consoles, for example, it's often still very doable to evaluate such things (although you'll mostly end up evaluating on a more algorithmic or at least slightly higher level, defeating the need to write assembly code). It also very much depends on the scope you're targeting: hand-optimizing using special instruction sets like MMX and SSE (audio decoding, you name it..) can be sufficiently predictable and useful.

But besides that, it is possible to describe the ruleset that applies to the platform at hand and design certain heuristics, even though one would probably not be able to take all these factors into account when manually performing the same task. We have computers to do that for us you know.. :)

And yeah, this is pretty "duh". I know.
added on the 2004-05-14 23:51:21 by superplek superplek
In the case of x86, we have processors from different vendors with different microarchitectures. Something tuned for the Pentium 4's 20-stage pipeline probably wouldn't do as well on an Athlon, for example. With enough detailed information, you could certainly count cycles on a specific chip, but then your work only applies to that chip. Plus, on a multitasking OS, the cache gets to be unpredictable depending on the kernel's task switching.
added on the 2004-05-15 06:28:43 by s_tec s_tec
And that's why PCs suck. I will never be able to sit my ass down, learn to optimize from Pentium to Athlon and show you something. And then people would say that on Athlon64 or Athlon128 or Athlon256 or Pentium V/VI/VII it's of no use anymore :P

I guess I will only try to start from my 386, no matter if you think it's stupid, cause it's a personal dream of mine. Perhaps I will move to 486 and Pentium later from that. I'll see..

Something else? You think that optimizing ends at compiling a program and trying to optimize that??? The best code on the 8-bits is so unpredictable, with unrolled loops, reusing data in registers; the feeling is like hardwiring your algorithm in assembly. It's pretty much different from what a compiler does. Then you may see differences. But then again, I don't know what happens today with the shitty PCs :P
added on the 2004-05-15 10:14:23 by Optimonk Optimonk
Why don't you just learn to write code for PCs in C/C++ and optimize there, and worry about asm later?
You rarely need to optimize in asm anyway, for modern (accelerated) PC demos, since the CPU is rarely the bottleneck anymore.
Instead, you should concentrate on how to send the data to the GPU in the most efficient way, and how to make the GPU process it as efficiently as possible.
added on the 2004-05-15 12:51:30 by Scali Scali
why are you so concerned with "optimization"? why not concentrate on making something original and beautiful instead of boring seen a million times "plasmas" and "fires"? with modern pc's we have enough speed without concerning ourselves with tweaking the hell out of every function we write.

if optimization is so important, do it last, _after_ you have written something worth seeing. use your imagination and artistic ability first. after that, optimizing will get you some extra brownie points, but it's not the most important thing imo.
added on the 2004-05-15 13:52:12 by Bagpuss Bagpuss
Scali: I'd guess for _lots_ of modern demos, the CPU is still the primary bottleneck.
added on the 2004-05-15 15:03:00 by Sesse Sesse
Hm, I dare to oppose that. Really.

It's just that Optimus isn't, and probably never will be, a good programmer, so he has to focus on bullshit like this :)

added on the 2004-05-15 15:22:13 by superplek superplek
To put it this way.. Impressing does not solely imply hand-optimizing functions.
added on the 2004-05-15 15:22:41 by superplek superplek
>>The best codes on the 8bits are so unpredictable,
>>with unrolled codes, reusing data on registers,
>>the feeling is like hardwiring your algorithm in assembly

Can you imagine how much more complicated modern PC effects are than the fireplasmas your beloved 8-bit machines are running? One theoretical clock tick spent in vain here and there because of "bad compiler optimization" has no effect on the FPS in these cases. Even for medium-sized code, hand-optimizing would take an incredible amount of time and experience.
added on the 2004-05-15 16:20:13 by uutee uutee
Though micro-optimizing and smart handling of the architecture/compiler do a lot of good for you. There still *is* something called 'the efficient way', you know.. It's just taken to a higher level.

And not all real-time CG programming consists of over-complicated stuff. Not at all. Certainly not some important trivial tasks.
added on the 2004-05-15 17:55:56 by superplek superplek
learning your way to decent highlevel datastructures can lead to an improvement in both productivity and execution time :-)
Bagpuss: This is just what I'd like to code.

Anyway, this thread had no meaning; I originally opened it in some weird state, just for fun, to see your responses..
added on the 2004-05-15 22:57:32 by Optimonk Optimonk
Is it me, or did Optimus just come out and say he is a troll? o__O
added on the 2004-05-16 00:19:41 by p01 p01
I think he intentionally creates posts like his old "serious" ones to get attention. It just seems that he is not in control of his "talent".

His post "This town is Z" must have been a tremendous failure for him. So many letters typed, so few replies.

Class clown behaviour, I'd say..
added on the 2004-05-16 01:11:28 by Stelthzje Stelthzje
I love Optimus.
added on the 2004-05-16 09:54:17 by texel texel