Recently I started working on a quick demo: an interactive music visualization driven by Kinect, where interacting with the particles triggers a message to appear.
Starting out, I experimented with building the demo in OpenFrameworks with CPU particles, trying out particles morphing into multiple shapes/images and different visual combinations.
After a couple of days, I realized that this path would require more time and effort than I had to produce any decent results, and I was planning to finish the demo within two weeks. After looking at multiple different solutions, I went for the one that would produce the biggest bang for the buck, i.e. Unity3D.
I started experimenting with 2D GPU particles using different examples and iterated rapidly, but after a few experiments I realized that this too required more time than I was willing to invest right now.
This hunt for a nice GPU particle solution led me to TC Particles. This package is incredible: I have pushed it to 2 million particles while maintaining 60 fps. Now that I had the particles resolved, I adjusted my audio device settings and read the input from my mic, so that the song currently playing could be used as input.
For the visualization I just mapped the FFT to particle properties with a simple formula: Property = Factor * fn(FFT[sample]). The factor scales the FFT value to produce a better visual result, and the function (e.g. cos, log, exponent) produces a different mapping for particle behaviour.
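As a rough sketch of that mapping idea in Python (using NumPy and a synthetic tone in place of real mic input; the bin indices, factors, and choice of functions here are purely illustrative, not the values I used):

```python
import numpy as np

def map_fft_to_property(fft_magnitudes, sample, factor, fn):
    """Property = Factor * fn(FFT[sample])."""
    return factor * fn(fft_magnitudes[sample])

# Synthetic "audio" frame: a 440 Hz tone at 44.1 kHz, standing in for mic input.
sample_rate = 44100
t = np.arange(1024) / sample_rate
frame = np.sin(2 * np.pi * 440 * t)

# Magnitude spectrum of the frame (in Unity this would come from the audio API,
# e.g. a spectrum-data call on the listener/source).
fft = np.abs(np.fft.rfft(frame)) / len(frame)

# Hypothetical property mappings: one log-shaped, one oscillating.
size = map_fft_to_property(fft, 10, factor=50.0, fn=np.log1p)
speed = map_fft_to_property(fft, 40, factor=5.0, fn=np.cos)
```

Each particle property just picks a bin, a scale factor, and a shaping function, which makes it cheap to tweak mappings until the visuals feel right.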
After that I just integrated a Unity Kinect sample to use skeleton tracking and added box colliders to the skeleton joints to allow interaction. I don't really like this method of interaction; it would be far better if the particle system were in 2D and we could collide directly with the depth map data, but I'll leave that for some other day.
All in all it turned out to be a nice proof of concept.