pouët.net

Demos with remarkable transparency effects

category: general [glöplog]
 
Hello,

I am looking for demos whose transparency effects were groundbreaking for their time. There are plenty of Glenz vector effects in Amiga demos, and then there's of course the beginning of Second Reality, but which specific demos were considered the first of their kind, with unprecedented transparency effects? The more overlapping transparent surfaces in one scene, the better.

If you have other good examples, the window can be widened to games and computer graphics in general.
added on the 2020-02-08 17:31:40 by wabe wabe
We had some discussion on the glenz forum, and the one remarkable example I remember after glenz is https://youtu.be/amfdVqhzpHI?t=202
added on the 2020-02-08 21:14:53 by Optimus Optimus
Really, the ones I can remember are from late-90s software-rendered PC demos.
Many scenes in this one https://www.youtube.com/watch?v=GGffKMfCFto

But I don't find anything remarkable about transparency as an effect besides it being very CPU-heavy in the past (and heavy for early GPUs too), and I can't think of any remarkable differences between demos, as they would all look very similar.
added on the 2020-02-08 21:19:32 by Optimus Optimus
Thanks, these are pretty cool. I especially like the dancing body in Boost. Do you happen to know the implementation details in the case of Boost? Was every triangle (or pixel?) simply sorted by depth and then blended together with the background to form the final image, or were better algorithms used back then? I am currently exploring NVIDIA's research work and trying to get a general understanding of how transparency algorithms and techniques have developed over the years.

If you have any other 90s software-rendered demos in mind with effects similar to Boost, feel free to post more examples. Also, if you happen to know for sure that some modern demos use specific transparency algorithms or techniques (weighted blended order-independent transparency, depth peeling, dual depth peeling, etc.), I'd love to inspect those as well.
added on the 2020-02-09 15:04:55 by wabe wabe
anything from the GL_ONE, GL_ONE era :-/
added on the 2020-02-09 20:28:50 by EvilOne EvilOne
wild light - drift (during the julia water voxels)
Wonder by Sunflower maybe fits the criteria?
added on the 2020-02-10 10:43:38 by keito keito
Quote:
Do you happen to know the implementation details in the case of Boost? Was every triangle (or pixel?) simply sorted by depth and then blended together with the background to form the final image, or were better algorithms used back then? I am currently exploring NVIDIA's research work and trying to get a general understanding of how transparency algorithms and techniques have developed over the years.


Conducting a prior art search?
I also found the Glenz thread Optimus was talking about and it had some good related discussion. Here's the link to the thread: https://www.pouet.net/topic.php?which=11829

Quote:
wild light - drift (during the julia water voxels)


Wow! I have not seen this one before and I gotta say I love it. The color scheme is very appealing to me, and the music is nicely connected to what is happening on the screen. A couple of nice transparency effects as well.

Quote:
Wonder by Sunflower maybe fits the criteria?


Yep, that orange blob looks pretty sick. A somewhat similar thing can be seen in Boost. After the start it gets pretty hard to see what's going on with all the blending. Cool demo regardless, with an interesting soundtrack. It turns out that asking for certain features in a demo is a good way to find gems you haven't seen before. Maybe I should actively search for and watch more demos myself.

Quote:
Conducting a prior art search?

I got interested in this topic as I started working on my own transparent 3D scenes earlier this year and quickly realized that it can be a more complicated issue than just lowering the fragment's alpha value... I also had to pick a topic for my thesis, so I picked "Transparency in Real-Time Computer Graphics" or something along those lines; we'll see how that goes. My thesis is supposed to take a look at the development of transparency and the algorithms related to it, including some of the history. Maybe there's room for some demoscene-related stuff if I run out of smart academic things to say.
added on the 2020-02-10 21:08:49 by wabe wabe
Actually, in math terms, alpha blending is about linearly interpolating two values (in this case color values), but it could be used for other tricks too.
There was a time when alpha blending was covered in coding tutorials for DOS; it was a tad popular to be able to crossfade between two images. Many early demos did crossfades between <groupname>, <presents>, <demotitle>, etc. There was, AFAIR, a trick to blend between palettes before VESA and 32-bit modes.
I don't remember when the first one was made for PC, and I don't know which platform did it first.
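In code, that kind of crossfade is just a per-channel linear interpolation; here is a minimal sketch in Python (8-bit channel values assumed, function name is mine, not from any tutorial of the era):

```python
def crossfade(a, b, t):
    """Blend two pixels (lists of 8-bit channels) by factor t in 0..255."""
    # result = a + (b - a) * t / 255 per channel, integer math only,
    # which is exactly the lerp formulation of alpha blending
    return [pa + ((pb - pa) * t) // 255 for pa, pb in zip(a, b)]
```

Sweeping t from 0 to 255 over successive frames fades the first image into the second, the effect used for those <groupname> / <presents> transitions.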
added on the 2020-02-11 00:48:24 by rudi rudi
This was written in April 1994: http://archive.gamedev.net/archive/reference/articles/article357.html but my guess is that transparency in other, simpler forms was covered many years before that, on home computers.
added on the 2020-02-11 00:54:00 by rudi rudi
I remember that simple addition and conditional branching were enough to do "simple" crossfading between several screens.
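That add-and-branch trick could look like this (my own minimal reconstruction, not code from any specific demo):

```python
def sat_add(a, b):
    """Additive 'blend' of two 8-bit intensities: add, then clamp on overflow."""
    s = a + b
    if s > 255:  # the conditional branch doing the clamping
        s = 255
    return s
```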
added on the 2020-02-11 00:57:56 by rudi rudi
Quote:
I also had to pick a topic for my thesis, so I picked "Transparency in Real-Time Computer Graphics" or something along those lines; we'll see how that goes. My thesis is supposed to take a look at the development of transparency and the algorithms related to it, including some of the history.


The algorithms were clear from day one and have been used since the invention of the alpha channel. The formalism came with Porter/Duff - cite that and call it a day. But for anything interesting you should research order-independent transparency. Read anything by Morgan McGuire and the likes; Intel has done some interesting stuff too.
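For reference, the Porter/Duff "over" operator with premultiplied alpha, here as a small Python sketch (floats in 0..1; the function name is mine):

```python
def over(src, dst):
    """Porter-Duff 'over': src composited onto dst.

    Both are premultiplied (r, g, b, a) tuples, components in 0..1.
    """
    k = 1.0 - src[3]  # how much the source lets through
    return tuple(s + d * k for s, d in zip(src, dst))
```

With premultiplied colors, "over" is associative, which is exactly why back-to-front compositing of sorted transparent surfaces works at all.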

Fun fact of the day: I'm researching transparency currently too and this one seems interesting: https://cg.cs.uni-bonn.de/aigaion2root/attachments/Muenstermann2018-MBOIT.pdf
added on the 2020-02-12 17:39:22 by EvilOne EvilOne
rudi: Thanks for the historical perspective.

EvilOne: With the formalism and Porter/Duff, are you referring to the paper called "Compositing Digital Images"? I am aware of it, and it seems to be a good source to cite for the basics of transparency. I am also aware of the work of Morgan McGuire, as NVIDIA has a nice web page containing information about transparency in general (https://developer.nvidia.com/content/transparency-or-translucency-rendering).

The paper you posted I have not seen before, but it looks promising and is very recent. I will also keep an eye on the Intel stuff. Thanks for the tips, and good luck with your research!
added on the 2020-02-16 12:17:26 by wabe wabe
I've implemented WBOIT on OpenGL 3.3 and it worked, but it was really hard to tune the weighting function. You basically have to have one for every scene / z-range / z-distribution. That kind of sucked. MBOIT looks much better, but I haven't gotten around to implementing it yet...
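For context, the WBOIT resolve is just a weighted average of the transparent fragments plus a "revealage" term for the background; the scene-dependent part is the per-fragment weight, which is left as an input here. A rough CPU-side sketch of the idea (my own, not the paper's listings):

```python
def wboit_resolve(fragments, background):
    """fragments: iterable of ((r, g, b), alpha, weight); order must not matter."""
    acc = [0.0, 0.0, 0.0]  # sum of weight * alpha * color
    acc_a = 0.0            # sum of weight * alpha
    revealage = 1.0        # product of (1 - alpha): how much background shows
    for (r, g, b), a, w in fragments:
        acc[0] += r * a * w
        acc[1] += g * a * w
        acc[2] += b * a * w
        acc_a += a * w
        revealage *= 1.0 - a
    if acc_a == 0.0:
        return list(background)
    avg = [c / acc_a for c in acc]  # weighted average transparent color
    return [avg[i] * (1.0 - revealage) + background[i] * revealage
            for i in range(3)]
```

Since both the sums and the product are commutative, the result does not depend on fragment order, which is the whole point; the quality then hinges entirely on how well the weight approximates correct depth ordering, hence the tuning pain.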
added on the 2020-02-16 21:51:44 by raer raer
There are two very similar papers about MBOIT. The other one is by Brian Sharpe.
added on the 2020-02-16 21:54:01 by raer raer
A 1-bit "transparent" alpha-channel could well be confused with a 1-bit masking bitmap - if you wanted to have holes in a texture or image or whatever where there is no intensity in the pixels. For me I usually think of transparency as a 8-bit array or an own bit-plane that represents the intensities of the pixels of that texture/image. So, the most common is that it is a 8-bit plane, equally aligned in memory as the r,g,b planes - so the alpha values (intensities) are from 0-255. Its possible to confuse the terms with an 8-bit mask table as well. So... The point is really to define the terms you use before you write about them.
added on the 2020-02-17 03:22:47 by rudi rudi
I'm not sure if GPUs still have fixed-function pipelines or if shaders have taken over all of that by now. Back then GPUs had to follow their own standards for transparency. In shaders there is room for more flexibility, but the mathematics is still the coder's or the designer's decision to make. For example, the math could be kept discrete, so that you have direct control over the state space of the alpha channel. If you use floats, you are limited by the precision of the float datatype. In most cases that is enough, but it's still a limitation if you decide to use the alpha channel for something else...
added on the 2020-02-17 03:43:01 by rudi rudi
Quote:
wild light - drift (during the julia water voxels)

Wow! That was awesome!
"what we're gonna do is go back. Way back into time"
added on the 2020-02-17 09:37:06 by numtek numtek
This one by "Sanity" had some pretty optimized transparency routines. The engine was used in a commercial game and was very capapable for it's time.

Alto Knallo by "Sanity" (second part)
https://www.pouet.net/prod.php?which=1239

It used a 4-bit red, 4-bit blue, 6-bit green approach and emulated the whole PlayStation 1 graphics pipeline on a normal PC of that decade. The engine even featured some realtime dithering to map the 4-4-6 mode down to 256 colors :D..

Even if you don't have an alpha channel, there is the problem of clipping when you do transparencies. I didn't look into its transparency code myself, but the 4-4-6 layout made much of the clipping-related stuff pretty easy, since it's hi-color but still byte-aligned.

I could imagine that Mr.Pet used some 64k tables for the red+blue channels and a simple 8-bit addition for the green channel, but I'm not sure about that; memory was pretty slow and Mr.Pet always tried to get the maximum possible performance, for instance using the famous Pentium opcode pairing of that age. I don't think he accepted losing bits by faking stuff, if I remember correctly..
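Pure speculation like the paragraph above, but that guess could be sketched like this: one byte packing 4-bit red and 4-bit blue, blended with a single 64 KB table lookup, plus a separate saturating add for the 6-bit green. The packing order and the additive blend mode are assumptions of mine, not Mr.Pet's actual code:

```python
def build_rb_table():
    """64 KB table: saturating-add both nibbles of (src, dst) in one lookup."""
    table = bytearray(65536)
    for s in range(256):
        for d in range(256):
            r = min((s >> 4) + (d >> 4), 15)   # red in the high nibble
            b = min((s & 15) + (d & 15), 15)   # blue in the low nibble
            table[(s << 8) | d] = (r << 4) | b
    return table

RB_TABLE = build_rb_table()

def blend_pixel(src_rb, src_g, dst_rb, dst_g):
    rb = RB_TABLE[(src_rb << 8) | dst_rb]  # red and blue in one table lookup
    g = min(src_g + dst_g, 63)             # plain saturating add on 6-bit green
    return rb, g
```

The appeal of the table is that it blends two packed channels in a single memory access, with no unpacking in the inner loop, which fits the speed obsession described above.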

The "Playstation emulation" in action:
https://www.youtube.com/watch?v=Q6wOnNVpaVk

Perhaps that helps a little.. :)
added on the 2020-02-17 23:38:03 by mad mad
raer, looking through the Brian Sharpe paper... that looks really promising. Currently I'm using weighted McGuire/Bavoil, so this seems like a drop-in replacement. Now it's time to find some time for coding it.

Great, great, great.
added on the 2020-02-18 09:53:43 by EvilOne EvilOne