
effects synchronization with music

category: general [glöplog]
Hi all,

What is the best way to synchronize demo (C++/OpenGL) effects with music (mp3 format)? I checked the FMOD and BASS music libraries, but I cannot find any source/tutorial about synchronization.

What do you think about other music libraries (OpenAL)?

Any help will be appreciated.

Thank you
use bass for good timing.
you want to use bass's BASS_ChannelGetPosition and BASS_ChannelBytes2Seconds functions to find out exactly where you are in the tune. then, render the frame that belongs to that particular point in time. in other words, make sure your renderFrame() function is a function of time. that's all.

now you can even skip through your demo and pause it and whatnot simply by skipping/pausing the tune.
added on the 2009-02-15 19:45:44 by skrebbel
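A minimal sketch of that approach with the BASS C API; running, renderFrame() and swapBuffers() are stand-ins for whatever your own demo loop looks like:

Code:
#include "bass.h"

// returns the current playback position of a BASS stream in seconds
double getMusicTime(HSTREAM stream)
{
    QWORD bytes = BASS_ChannelGetPosition(stream, BASS_POS_BYTE);
    return BASS_ChannelBytes2Seconds(stream, bytes);
}

// ... after BASS_Init(-1, 44100, 0, 0, NULL):
HSTREAM stream = BASS_StreamCreateFile(FALSE, "tune.mp3", 0, 0, 0);
BASS_ChannelPlay(stream, FALSE);

// main loop: everything rendered is a function of the music time,
// so seeking or pausing the tune automatically seeks/pauses the demo
while (running)
{
    double t = getMusicTime(stream);
    renderFrame(t);   // renderFrame takes a time, not a frame counter
    swapBuffers();
}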
for actual effect timing, find out what the BPM of the tune is, so that you can compute where in the song you are in terms of beats rather than seconds (seconds are a completely stupid unit when it comes to synchronization - except if you want to sync to how fast the world turns). then you can either use awesome tools such as GNU Rocket to time your demo effects, use a scripting thingy of your own, or just hardcore stuff, i.e. if ((32.0f <= beat) && (beat < 64.0f)) { doPartTwo(beat); }
added on the 2009-02-15 19:50:19 by skrebbel
s/hardcore/hardcoded
added on the 2009-02-15 19:50:44 by skrebbel
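A rough sketch of that beats conversion, reusing the getMusicTime() helper from the earlier sketch; the 140 BPM figure and doPartTwo() are purely illustrative:

Code:
#include <cmath>   // for fmodf

// assumption: the tune runs at a constant 140 BPM
const float BPM = 140.0f;

float secondsToBeats(float seconds)
{
    return seconds * (BPM / 60.0f);   // beats elapsed since the start of the tune
}

float beat = secondsToBeats((float)getMusicTime(stream));

// a simple per-beat pulse: 1.0 exactly on the beat, decaying to 0.0 just before the next one
float beatPulse = 1.0f - fmodf(beat, 1.0f);

// hardcoded timing, as in the post above
if ((32.0f <= beat) && (beat < 64.0f)) { doPartTwo(beat); }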
The old "tap the space bar" method should do it for most people. (Read: Let your demo run through, tap keys for different timing events, record those time stamps and use them in your code, f.e. as effect parameters whose value diminish in time).
added on the 2009-02-15 20:56:34 by tomaes
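A minimal sketch of such a timestamp recorder, again assuming the getMusicTime() helper from above; keyPressed() is a hypothetical stand-in for your input handling:

Code:
#include <cstdio>
#include <vector>

std::vector<double> tapTimes;   // collected while the demo plays through once

// call every frame: whenever space is tapped, remember the current music time
void recordTaps()
{
    if (keyPressed(' '))
        tapTimes.push_back(getMusicTime(stream));
}

// after the run, dump the timestamps so they can be pasted back into the code
void dumpTaps()
{
    for (size_t i = 0; i < tapTimes.size(); i++)
        printf("%f,\n", tapTimes[i]);
}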
tomaes: that's if you have any sense of rhythm... I've seen plenty of people who do not :)
added on the 2009-02-15 21:05:15 by Jcl
I am a music person, so no problems for me. :)

If you happen to have a somewhat decent MIDI controller available, a more sophisticated version of that method would be to record a VJ-ed version of your demo and use the best timing session you recorded. Or even make different timing sets available on startup.
added on the 2009-02-15 21:13:49 by tomaes
you could also run an FFT analysis over the mp3 first (BASS can do that for you) and use this as a "rhythm detector" (e.g. many bass frequencies = there must be a beat = sync!)...
That method doesn't work very well with most genres/types of music and/or gives you rather fuzzy results that are not as good as actual directed (as in: making an artistic choice) timing. But in general terms, there's no one-shoe-fits-all answer. It really depends on what kind of music you have and what kind of demo you imagine.

If you're really lazy and you have some four-to-the-floor techno beat, you can just do something like "scene( (runningTime/200)%4 );" and be done with that. ;)
added on the 2009-02-15 21:36:34 by tomaes
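A minimal sketch of that FFT approach with BASS, using the same stream handle as above; the eight bins summed and the 0.35f threshold are arbitrary values that would need tuning per tune:

Code:
// grab 512 FFT magnitude bins from the playing stream
float fft[512];
BASS_ChannelGetData(stream, fft, BASS_DATA_FFT1024);   // FFT1024 yields 512 bins

// sum the lowest bins as a crude "bass energy" measure
float bassEnergy = 0.0f;
for (int i = 0; i < 8; i++)
    bassEnergy += fft[i];

// crude rhythm detector: lots of low-frequency energy = probably a beat
bool beatNow = (bassEnergy > 0.35f);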
Quote:
but I cannot find any source/tutorial about synchronization


There is no magic tool for synchronization. Of course it helps to have a nice UI on which you can tag events and mark specific moments, or even better, some kind of demotool with which you can drop fancy boxes on a timeline to trigger specific effects, but in the end the tool won't do the important part of the work for you, the part that makes your stuff unique: deciding when to do what, according to your feelings and what you have in mind.

If you want to make something interesting, you can forget all those automatic beat detection methods like FFT analysis; they will only give poor, repetitive results devoid of soul or mood. At best they are good for modulating the basic intensity of a value on a visual effect, but you will want something more advanced than that.

I usually load the future demo soundtrack into some basic .wav editor to have a visual representation of it along with the sound. Then I start marking the main parts (what could be used for an introduction effect, what could be used for a transition), then I take care of the general tempo of the tune, determining the beats (once you have one beat and the BPM, it's easy to get them all), then I take care of more specific things like half-time beats, exceptions in the structure of the tune, etc.

I write down all those things manually just because I find it fun that way, but you can do it with a nice demotool that will take care of saving the events and the time indexes.

Demotool or not, the point is: don't let detection tools do those things automatically for you, because they won't do it well. The result will be poor, and those tools won't be able to find a structure of events that makes audio and visuals work together, following an atmosphere or an idea. The manual part of it is really important. You determine the time indexes that you deem interesting. It might feel weird at first, but chances are good that you will eventually find it quite interesting and fun.
added on the 2009-02-15 23:52:29 by keops
(sorry for the long post, hope it helps getting started though)
added on the 2009-02-15 23:55:48 by keops
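A minimal sketch of storing such hand-picked time indexes in code; the effect names and times here are entirely made up for illustration:

Code:
enum { EFFECT_INTRO, EFFECT_TRANSITION, EFFECT_TUNNEL };

// hand-picked time indexes (seconds), written down from the .wav editor
struct SyncEvent { double time; int effect; };

static const SyncEvent events[] = {
    {  0.00, EFFECT_INTRO },
    { 14.72, EFFECT_TRANSITION },
    { 29.45, EFFECT_TUNNEL },
};

// find the event that is active at a given music time
int currentEffect(double t)
{
    int effect = events[0].effect;
    for (size_t i = 0; i < sizeof(events) / sizeof(events[0]); i++)
        if (t >= events[i].time)
            effect = events[i].effect;
    return effect;
}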
Even better: make your musician export the MIDI tracks if available ;)
added on the 2009-02-16 00:39:40 by xernobyl
Just go with Keops. He pretty much said it all.
added on the 2009-02-16 03:08:07 by Paralax
beat detection in a 4k. i believe iq is smarter than that (hint: as you generate the music from data, that data is also your sync data. done.)

anyhow, i do recommend attaching some fft output to your glow intensity parameter. otherwise, do what keops said.
added on the 2009-02-16 08:04:53 by skrebbel
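A minimal sketch of that 4k idea: the same note data the softsynth renders from also answers sync queries. The Note struct and the song data here are hypothetical:

Code:
// the same note list the softsynth renders from...
struct Note { float time; int pitch; int instrument; };
static const Note song[] = {
    { 0.0f, 36, 0 },   // kick (hypothetical data)
    { 0.5f, 42, 1 },   // hi-hat
    { 1.0f, 36, 0 },   // kick
};
static const int numNotes = sizeof(song) / sizeof(song[0]);

// ...is also queried for sync: how long ago did a given instrument last hit?
float timeSinceHit(float t, int instrument)
{
    float last = -1000.0f;
    for (int i = 0; i < numNotes; i++)
        if (song[i].instrument == instrument && song[i].time <= t)
            last = song[i].time;
    return t - last;   // feed e.g. into a flash that decays with this value
}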
I always did the music sync the lame way. I played the music in my demo and made a sync happen at a specific value of the timer (not the music pattern but the ticks timer). If it wasn't exactly at the point I wanted, I moved the timer value a bit till it fit. Of course sometimes the sync is inaccurate this way (go check the different ports of led blur).
added on the 2009-02-16 08:20:41 by Optimus
Ask your musician to pick the timings for you.
added on the 2009-02-16 08:46:20 by willbe
I use mp3/ogg for audio but sync on the tracker-style pattern data (parsed from the Renoise project file) to get exact sync with patterns and instruments. So basically I play two files that are in sync: the actual audio and the pattern data.
added on the 2009-02-16 08:50:39 by fragment
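A minimal sketch of driving sync from such parsed pattern data, assuming the rows have already been extracted into a flat list; the constant-tempo row-to-seconds conversion and triggerEffect() are assumptions for illustration:

Code:
// one trigger row extracted from the project file (hypothetical structure)
struct PatternRow { int row; int instrument; };

// convert a row index to seconds, assuming a constant BPM and lines per beat
double rowToSeconds(int row, double bpm, int linesPerBeat)
{
    return row * 60.0 / (bpm * linesPerBeat);
}

// while the mp3/ogg plays, step through the rows in parallel
void fireDueRows(const PatternRow* rows, int numRows, int& next, double musicTime)
{
    while (next < numRows && rowToSeconds(rows[next].row, 140.0, 4) <= musicTime)
        triggerEffect(rows[next++].instrument);
}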
just blink at every beat and use the bpm-info! INSTANT COOL... or wait....
added on the 2009-02-16 08:55:33 by Proteque
Quote:
Even better: make your musician export the MIDI tracks if available ;)

Actually, I don't favor such things at all. That just leaves the coder or graphician with loads of sync-data which is tied to specific things in the soundtrack. The problem here is that it is too easy just to say "ok, this track is the bassdrum - just flash on every beat then".

I prefer it when the sync job starts with the music: the musician actually thinks of things that can be synced to, because that tends to give you results that feel less "forced" - so I completely agree with Willbe.

Also, what skrebbel said about FFT output on glow intensity -- sooo dull to do manually. :)

Oh, and the best "sync tools" are those dedicated to actually doing syncing, not everything else.
added on the 2009-02-16 09:30:47 by gloom
0) Think before you do. Listen to the song without your visuals. Make a script. Visualize what parts of the music should be synced to, and what kind of visuals fit with the music. This is the most important step.

1) You can use pure time values. The "tap the space bar to the beat" trick is pretty good, even though it's simple to implement. Just prepare to tweak your timestamps a bit if you miss some :) I use triggers that have time and length. Something like

Code:
int time = system.getDemoTime();
float value = 0.0f;
for (int i = 0; i < num_of_triggers; i++)
{
    if (time >= triggers[i].time && (time < triggers[i].time + triggers[i].length))
    {
        // we are inside the trigger
        int pos = time - triggers[i].time;
        value += 1.0f - (pos / (float)triggers[i].length);
    }
}
value = min(1.0f, value); // clip it to 1.0f in case of multiple triggers


2) You can use the wavedata or FFT values to sync visuals and make subtle effects. The results are often good, but usually need tweaking according to the song. I do a lot of stuff like
Code:
float FFT[512];
system.getFFT(FFT); // get an FFT of size 512 from the music player
float sum = 0.0f;
for (int i = 0; i < 512; i++)
{
    sum += FFT[i];
}
sum /= SOME_NICE_CONSTANT;
// now use sum to do something nice, like syncing glow or movement to it

You could also assign weights to different parts of the spectrum, like having some visuals react to the bass, some to the mid frequencies and the rest of the visuals to the overall volume. Read up a bit on basic DSP and learn about filters. Lowpass-filtered FFT data is good for syncing ("blurred" data from many frames, to us mere mortals; a minimal smoothing sketch follows after this post).

3) BPM sync is sometimes useful, but it tends to get boring. I have a bunch of BPM calculators that return 1 when they're on the beat, and then interpolate down to be 0 just before the next beat. But I don't really like it or use it that much.

4) If your music is originally in .xm or .rns or whatever format like that, you might be able to parse the patterns or midi data or whatever. I've never tried it though.

5) Be prepared to spend time with syncing. It's definitely one of the most time-consuming parts in making a demo, though also one of the more rewarding.
added on the 2009-02-16 09:52:22 by Preacher
(all code examples from the top of my head, I'm not responsible for bugs :))
added on the 2009-02-16 09:52:48 by Preacher
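A minimal sketch of the lowpass filtering mentioned in point 2 above, done as a simple exponential moving average over the summed FFT value from Preacher's snippet; the 0.1f smoothing factor is just a starting point:

Code:
// 'sum' is the per-frame FFT sum from the snippet above;
// 'smoothed' keeps a running, lowpass-filtered version of it across frames
static float smoothed = 0.0f;
smoothed += 0.1f * (sum - smoothed);   // exponential moving average
// use 'smoothed' for sync values that should not jitter from frame to frame,
// and (sum - smoothed) as a crude transient detector if you like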
0) It really helps if you have the music first... but many times the musician wants the graphics first to make a fitting soundtrack... it's a catch-22 there...

What keops said.
added on the 2009-02-16 17:11:16 by thec
fuck fft
added on the 2009-02-16 17:14:25 by Gargaj
ass boob fart
added on the 2009-02-16 17:22:43 by sagacity
