Intro making questions...

category: code [glöplog]
A couple of quick questions about making intros (4..64k):

1. What do people do to store scenes in shaders nowadays? Previously I've done something like:

Code:
if (time < 20) {
    drawCube();
    drawSphere();
} else if (time < 40) {
    ...etc.
}


but it'd be nice to have a less sucky approach. Anyone worked out something better?


2. I need a softsynth. Yes, yes, there are many, but... I'm writing for macOS, and I'm also doing stuff that means my exe has to be 64bit. All the common ones seem to be written in 32bit assembly, which is a problem (especially as I have minimal asm knowledge and can't fix that..)

Something in C and reasonably cross platform would be good. Ideally I'd like to target 4/8K, but 64K is an option. Any suggestions?
added on the 2017-10-21 19:25:48 by psonice psonice
Regarding 1: what you described is essentially what I do (in shader code). It's probably not the best but has worked well enough for me, and allows more flexible editing (can edit without recompiling the exe). Also, if you write it in a smart and consistent way Crinkler will compress the heck out of it anyway. Additionally, see this thread: https://www.pouet.net/topic.php?which=10820.
added on the 2017-10-21 19:40:52 by noby noby
Also the two Rootkids 4k and 8k macOS intros are using 4klang, so it's at least possible to get it somehow working :)
added on the 2017-10-21 19:47:24 by noby noby
It’s possible yes... in 32bit mode :) Which is what I need to avoid (mostly due to using Metal for rendering which is 64bit only)
added on the 2017-10-21 20:33:11 by psonice psonice
1. I've done different camera angles with just modulo: angle1=floor(mod(beats, 2.)), angle2=floor(mod(beats, 3.)) etc. (see the sketch below). With enough time and live shader editing you can get decent results. But proper direction seems to require if statements in one form or another, just like noby said.

2. You don't need a real softsynth if you've got a musician who is able to program compute shaders. In Pheromone the music is produced by a single shader.
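Roughly what point 1 boils down to, as a sketch (illustrative only, not the actual intro code; written in plain C here, but the same floor()/mod() expressions go straight into the shader; the table contents and names are made up):

Code:
#include <math.h>

/* A small hand-picked table of camera positions, xyz per entry. */
static const float g_cameras[4][3] = {
    {  0.f, 1.f, 5.f },
    {  3.f, 2.f, 4.f },
    { -4.f, 1.f, 2.f },
    {  0.f, 6.f, 0.f },
};

/* Pick a camera purely from the beat counter, no if-chain needed. */
static const float *camera_for_beat(float beats)
{
    int a = (int)floorf(fmodf(beats, 2.f));       /* flips 0/1 every beat    */
    int b = (int)floorf(fmodf(beats / 4.f, 2.f)); /* flips 0/1 every 4 beats */
    return g_cameras[a + 2 * b];
}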
added on the 2017-10-21 21:35:17 by cce cce
1. I am timing the intro scenes based on the value returned by the Clinkster_GetPosition() function: http://www.pouet.net/topic.php?which=11181 I know, I know, this approach (many if statements) sucks, but it works for me ;) I went through the sources of a few 4kb intros and found a "direction array". It is very interesting:
Code:
/** Direction and scene information.
 *
 * This consists of 'scene' information where it is saved how the scenes are played.
 *
 * Data is listed as follows:
 * 0: Time at the end of this scene, in samples, multiplied by 44100.
 * 1: Bits 1-2 compose the scene number 0-3.
 *    Bit 3 tells if relative time should affect distort.
 *    Bit 4 tells if relative time base is 1 or 0.
 *    Bits 5-6 tell the color choice of background. 0=red, 16=magenta, 32=blue.
 *    Full number is input to srand() for the scene.
 * 2: Subjective time at the start of this scene, multiplied by 44100.
 * 3: Camera position at start of the scene (as input to srand()).
 * 4: Camera forward at start of the scene (as input to srand()).
 * 5: Camera up at start of the scene (as input to srand()).
 * 6: Camera position at end of the scene (as input to srand()).
 * 7: Camera forward at end of the scene (as input to srand()).
 * 8: Camera up at end of the scene (as input to srand()).
 */
IVARIABLE uint8_t g_direction[] =
{
  8, 32 + 0, 255, 34, 225, 60, 182, 198, 35,
  21, 32 + 0, 0, 104, 226, 89, 82, 87, 89,
  29, 32 + 4 + 1, 1, 78, 165, 178, 222, 122, 89,
  33, 32 + 4 + 0, 8, 108, 102, 49, 191, 223, 6,
  38, 32 + 8, 5, 139, 200, 102, 191, 56, 254,
  52, 16 + 8 + 4 + 1, 19, 247, 230, 102, 12, 117, 13,
  57, 16 + 8 + 4 + 0, 31, 140, 233, 175, 140, 233, 171,
  75, 16 + 8 + 4 + 1, 37, 247, 230, 102, 12, 117, 13,
  82, 8 + 2, 0, 65, 126, 89, 135, 200, 19,
  90, 8 + 4 + 2, 2, 47, 27, 65, 4, 70, 108,
  104, 8 + 4 + 2, 20, 221, 235, 89, 221, 235, 89,
};

Hope it helps:)
added on the 2017-10-21 22:08:59 by flow flow
cce:
Seems we're working on similar stuff :) I did something similar in the last scene of partycoded.exe (the camera is just lerping through an array of fixed camera positions).
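Something in that spirit, as a rough sketch (not the actual partycoded.exe code; the path data and names are made up):

Code:
#include <math.h>

/* Made-up camera path, one node per beat; the camera lerps between nodes. */
static const float g_campath[][3] = {
    { 0.f, 2.f, 8.f }, { 1.f, 2.f, 6.f }, { 2.f, 3.f, 4.f }, { 0.f, 4.f, 2.f },
};

static void camera_at(float beats, float out[3])
{
    int   nodes = (int)(sizeof(g_campath) / sizeof(g_campath[0]));
    int   i = (int)beats % (nodes - 1);   /* current segment              */
    float t = beats - floorf(beats);      /* position within this segment */
    for (int k = 0; k < 3; ++k)
        out[k] = g_campath[i][k] + t * (g_campath[i + 1][k] - g_campath[i][k]);
}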

There's also a nice path tracing trick in there. If you have motion blur working, just increase the blur duration up to a couple of seconds just before and after a scene change and you'll get a crossfade ;)

You can also see what happens when you realise you've got no synth just before the party in that one :D

Doing the synth as a shader is interesting. I'd like to leave the GPU free (because my recent stuff is getting pretty heavy), but I guess this could just run as precalc.
added on the 2017-10-21 22:14:42 by psonice psonice
flow: Interesting, I've not seen that particular way to do it before. It's not a good fit for what I'm working on, but I might use elements of that.
added on the 2017-10-21 23:30:24 by psonice psonice
"Where the dead things dwell" uses a beat-based approach, in which I define a base beat for each scene then a vec4 multitrack Rocket-kinda sequence (by hand, no Rocket involved).

Example of base beats for scenes:
Code:
#define SCENE0_BASE 0
#define SCENE2_BASE SCENE0_BASE + 32
#define SCENE3_BASE SCENE2_BASE + 96


Example of a sequenced track:
Code:
TRACK(UserData1) // DOF (XYZ), Alpha (W).
    TRACK_EVENT(SCENE0_BASE + 0,   eKT_Step,   8, 1, 2,   0)
    TRACK_EVENT(SCENE0_BASE + 8,   eKT_Smooth, 8, 1, 4,   1)
    TRACK_EVENT(SCENE0_BASE + 16,  eKT_Smooth, 8, 1, 15,  1)
    TRACK_EVENT(SCENE0_BASE + 28,  eKT_Smooth, 8, 1, 15,  1)
    TRACK_EVENT(SCENE0_BASE + 32,  eKT_Smooth, 8, 1, 15,  0)
    TRACK_EVENT(SCENE2_BASE + 0,   eKT_Step,   8, 1, 3,   1)
    TRACK_EVENT(SCENE2_BASE + 32,  eKT_Smooth, 7, 4, 40,  1)
    TRACK_EVENT(SCENE2_BASE + 32,  eKT_Step,   8, 1, 10,  1)
    TRACK_EVENT(SCENE2_BASE + 64,  eKT_Smooth, 8, 1, 40,  1)
    TRACK_EVENT(SCENE2_BASE + 64,  eKT_Step,   8, 1, 40,  1)
    TRACK_EVENT(SCENE2_BASE + 90,  eKT_Step,   8, 1, 40,  1)
    TRACK_EVENT(SCENE2_BASE + 96,  eKT_Smooth, 8, 1, 40,  0)
    TRACK_EVENT(SCENE3_BASE + 0,   eKT_Step,   8, 1, 3,   1)
    TRACK_EVENT(SCENE3_BASE + 30,  eKT_Smooth, 7, 4, 40,  1)
    TRACK_EVENT(SCENE3_BASE + 32,  eKT_Smooth, 7, 4, 40,  0)
    TRACK_EVENT(SCENE3_BASE + 32,  eKT_Step,   8, 1, 40,  1)
    TRACK_EVENT(SCENE3_BASE + 62,  eKT_Step,   8, 1, 2,   1)
    TRACK_EVENT(SCENE3_BASE + 80,  eKT_Smooth, 8, 1, 40,  1)
    TRACK_EVENT(SCENE3_BASE + 104, eKT_Step,   8, 1, 40,  1)
    TRACK_EVENT(SCENE3_BASE + 112, eKT_Smooth, 8, 1, 2.5, 1)
    TRACK_EVENT(SCENE3_BASE + 112, eKT_Step,   8, 1, 2.5, 1)
    TRACK_EVENT(SCENE3_BASE + 120, eKT_Smooth, 8, 1, 30,  1)
    TRACK_EVENT(SCENE3_BASE + 128, eKT_Smooth, 8, 1, 40,  0)
END_TRACK
added on the 2017-10-21 23:58:59 by merry merry
Re. #1, I used if statements, but in a binary search tree (BST) automatically generated by a Python script :). It was much faster than linear. This was for webgl 1 in ShaderToy (before the update), so I didn't have access to static data. E.g., https://bitbucket.org/inclinescene/tdf2017/src/68cdae5cc7bf1af195c28ef9eaacbf46be32c209/gen-gfx.py?at=master&fileviewer=file-view-default#gen-gfx.py-216
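The emitted tree ends up shaped roughly like this (an illustration only, not the script's actual output; the same nested-if shape works in GLSL):

Code:
/* A balanced tree of comparisons: O(log n) branches per evaluation
 * instead of walking a linear if/else chain. */
int scene_for_time(float t)
{
    if (t < 40.0) {
        if (t < 20.0) return (t < 10.0) ? 0 : 1;
        else          return (t < 30.0) ? 2 : 3;
    } else {
        if (t < 60.0) return (t < 50.0) ? 4 : 5;
        else          return (t < 70.0) ? 6 : 7;
    }
}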
added on the 2017-10-22 00:00:57 by cxw cxw
I do all the timing in the shader. Only the time is provided from outside, and the shader translates it to bars/beats by dividing by a suitable float.
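I.e. something along these lines (a sketch; the BPM and time signature are made up):

Code:
/* 140 BPM: one beat lasts 60/140 seconds, so dividing the time by that gives beats. */
float beats_at(float time_seconds)
{
    float beats = time_seconds / (60.0f / 140.0f);
    return beats;   /* divide by 4 again for bars, assuming 4/4 */
}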
added on the 2017-10-22 00:01:12 by yzi yzi
We don't hardcode timings at all in our 64ks: shaders have uniform inputs (positions of things, boolean toggles, colors, basically whatever numbers we decide to not hardcode). At editing time, our tool displays UI elements for all of those and we can animate them with keyframes and such. The keyframe data is dumped by the tool as a file containing C code that is basically a huge array initialization. That file gets included when we build the intro executable, so the keyframe interpolation code ends up using the exact same input data to produce those uniforms as it was using when we were editing. In addition to the interpolation data, there's also information about the current shot, i.e. basically a number indicating which rendering function should be called at that time.
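The dumped file has roughly this shape (a simplified sketch, not the actual tool output; names, the curve handling and the shot table are made up):

Code:
/* One keyframe table per animated uniform; only linear segments shown here. */
typedef struct { float time, value; } keyframe;

static const keyframe k_cam_fov[] = { { 0.f, 60.f }, { 8.f, 90.f }, { 12.f, 45.f } };
static const keyframe k_fog[]     = { { 0.f,  0.f }, { 16.f, 1.f } };

static float eval_track(const keyframe *k, int count, float t)
{
    if (t <= k[0].time) return k[0].value;
    for (int i = 1; i < count; ++i)
        if (t < k[i].time) {
            float f = (t - k[i - 1].time) / (k[i].time - k[i - 1].time);
            return k[i - 1].value + f * (k[i].value - k[i - 1].value);
        }
    return k[count - 1].value;
}

/* Per frame: evaluate every track at the current time, upload the results as
 * uniforms, and use a separate shot table to decide which render function runs. */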
added on the 2017-10-22 12:37:36 by cupe cupe
I think what cupe's describing is probably the more popular way to do 64k timing. The keyframing method probably makes it considerably easier (in my limited experience) to quickly create, iterate, and refine animations, with the only major disadvantage afaik being that you have to create tooling for it. Trackers like Rocket can help here as they provide an animation editor GUI and a library you can include in your project, and it just works. I know Logicoma uses Rocket in their recent 64ks, and I'm pretty sure it's reasonably common in 4ks as well (correct me if I'm wrong).
added on the 2017-10-22 16:25:41 by cpdt cpdt
cupe: hmm... so very tempting to write a better editor after reading that :) (The current one is merely a buggy but very productive live shader editor: great for code, but no sliders even, never mind keyframing.)

I'm wondering now just how much can be moved out of the shader. With "real" motion blur (since I'm path tracing stuff), everything gets interpolated, not just per frame but within it. And if, say, there's a crossfade (which is cheap with path tracing), that implies there are two camera paths (both interpolated), plus two sets of scene data (also interpolated) during a single frame...

Maybe, then, the best approach is something like this:

- The actual "drawScene()" function just uses a bunch of "if (scene==1) {} etc." to draw each scene
- The scene number is handled CPU side, but it's a floating point number. Integers = scene, fractions = crossfade between scenes.
- The rendering loop can just do something like:

Code:
float startScene = floor(uniformSceneID);        // Current scene
float percentageOfFrame = rayNumber / rayCount;  // % of the current frame rendered
float ab = step(uniformSceneID - startScene, percentageOfFrame); // Normally 0; if it's a crossfade, goes to 1 after n frames
int scene = int(startScene + ab);
drawScene(scene);


That should draw a percentage of frames from scene n, plus the remainder from scene n+1 (code not checked and probably wrong :)), which handles the scene number plus crossfades.
added on the 2017-10-22 19:47:02 by psonice psonice
Do take the time to integrate GNU Rocket into your framework. You will never look back.

Rocket parameters hook up quite naturally to shader inputs. If you query the shader for its inputs at each reload, you can even add parameters without restarting your intro.
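A minimal hookup looks something like this (a sketch against the stock librocket C API; exact signatures vary a little between Rocket versions, and error handling plus the per-frame sync_update() editor callback are left out):

Code:
#include "sync.h"   /* GNU Rocket client library */

static struct sync_device *rocket;
static const struct sync_track *trk_fov, *trk_fade;

void rocket_init(void)
{
    rocket   = sync_create_device("sync");   /* prefix for the saved track files */
    sync_tcp_connect(rocket, "localhost", SYNC_DEFAULT_PORT); /* editor builds only */
    trk_fov  = sync_get_track(rocket, "cam:fov");
    trk_fade = sync_get_track(rocket, "post:fade");
}

void rocket_frame(double row)   /* row derived from the music position */
{
    float fov  = (float)sync_get_val(trk_fov, row);
    float fade = (float)sync_get_val(trk_fade, row);
    /* ...upload fov, fade (and friends) as shader uniforms... */
}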
added on the 2017-10-22 19:56:23 by Blueberry Blueberry
Quick test of my 'crossfade' thing:

[image: screenshot of the crossfade test between two scenes]

Yep :) That's fading between 2 scenes (the floor changes material and the sphere changes from mirror to glass). If you're rendering many times per frame, this is a nice and pretty cheap way to fade. (Just make sure to get the step() parameters in the right order ;))

Blueberry: will take a look, see how it might fit in, thanks.
added on the 2017-10-22 20:49:39 by psonice psonice
flow
Quote:
code from Yog-Sothoth

Didn't expect my direction code to find its way into a thread like this.

NOTE: I had a discussion about this with noby some time ago, and as he says up there, he's writing his direction code completely in the shaders. Tests seem to indicate this compresses very well. The direction array code seen above is, from the viewpoint of the compressor, essentially noise.

A reliable comparison would require doing the same direction using both methods. I haven't had the patience.
added on the 2017-10-23 15:58:09 by Trilkk Trilkk
Ok, so block of if(){} it is.

Any suggestions on the audio side, other than generating some waves in a shader? :)
added on the 2017-10-24 11:51:27 by psonice psonice
psonice: Oidos seems to have a 64-bit macOS VST version at least, so I guess the source player code should also compile as 64-bit. O_o Haven't tested though.
added on the 2017-10-24 14:58:35 by psenough psenough
Quite a few of them have 64bit compatible VSTs, but the actual replayer code is usually 32bit x86 assembly. Makes sense if you're optimising for size for a 32bit exe (the most common case for intros).
added on the 2017-10-24 15:59:24 by psonice psonice
psonice: We also had the same issue with macOS APIs being 64bit only; however, we got around that by having an x86 exe running 4klang, then we fired up Python and used PyObjC (part of macOS, a bit buggy but works for its purposes) to access the ObjC APIs. I think you can bootstrap MTKView pretty easily with Python and PyObjC, and basically do the same hack as we did.

If you want, you can get access to our source for inspiration.
added on the 2017-10-25 17:38:17 by Tapped Tapped
Tapped: if you could write a couple paragraphs describing the process more clearly (with example code and links to stuff) that would be quite useful to include at http://in4k.github.io
added on the 2017-10-25 23:38:50 by psenough psenough
Tapped: I'd very much like a look at that source; that sounds like a hack of pure beauty! Thanks! Please drop me a line at: psonice at me dot com.

Btw, I thought I had a different fix earlier. I noticed that if I build my exe as 32bit, it fails on MetalKit but *not* Metal. And, in fact, it's possible to build a 32bit exe using only Metal! Of course there's a catch: it won't detect any GPU /o\ :D

I've also looked at including the MSL shaders as both raw source strings and compiled libraries; the compiled library ends up bigger than the whole exe with source, so that's a no-go.
added on the 2017-10-26 00:09:40 by psonice psonice
psenough: I could probably do something like that.

psonice: I've sent you a copy of the source to your email :)
added on the 2017-10-26 11:04:03 by Tapped Tapped
Tapped: did you have to make any modifications to 4klang to get it running on macOS?

All my attempts so far have just resulted in segfaults, but you've proven it's at least somehow possible.
added on the 2018-02-27 13:09:44 by yx yx
