James on Kinect

I have just bought an XBOX Kinect and intend to get some serious use out of it!! Not using an XBOX, though.

Using Processing's openKinect library and (as ever) some help from Mr Daniel Shiffman, I am going to start off with some simple MIDI control and (fingers crossed) make some engaging interactive works! All to be posted here.

OK, so a very bad start!! It looks like for the time being I have just bought an overpriced webcam. Shiffman has a great library that can do some Z-axis stuff and point tracking, so I can control some simple MIDI with that … but I could do that before with a great colour-tracking class in Processing. Sooooooo it looks like I am going to have to update my OS to the new Lion (when it is released this month, fingers crossed) to get OpenNI to Max/MSP via OSC working smoothly … so for now I am back using OpenCV to create what I want.
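For anyone wanting to try the same, the closest-point tracking works out something like this (a minimal sketch assuming the openkinect-processing API of initDepth()/getRawDepth(), not my exact code):

```processing
// Closest-point ("Z-axis") tracking with Shiffman's openkinect-processing
// library -- a rough sketch of the idea, not the exact code I am running.
import org.openkinect.processing.*;

Kinect kinect;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  background(0);
  image(kinect.getDepthImage(), 0, 0);

  // Scan the raw depth map for the nearest valid point
  int[] depth = kinect.getRawDepth();
  int closest = Integer.MAX_VALUE;
  int closestX = 0, closestY = 0;
  for (int x = 0; x < kinect.width; x++) {
    for (int y = 0; y < kinect.height; y++) {
      int d = depth[x + y * kinect.width];
      if (d > 0 && d < closest) {
        closest = d;
        closestX = x;
        closestY = y;
      }
    }
  }

  // The tracked point, mapped to 0-127, is ready to go out as a MIDI CC value
  int cc = (int) map(closestX, 0, kinect.width, 0, 127);
  println("CC: " + cc);
  fill(255, 0, 0);
  ellipse(closestX, closestY, 20, 20);
}
```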

But on the plus side, wanting to use sound in this project, the BETA version of the Minim sound library for Processing now has some great effects in its UGens, including DELAY! and PANNING! (if you can call panning an effect), so that's something to keep me busy.
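Here is roughly how those UGens patch together (a minimal sketch against the Minim beta's Oscil/Delay/Pan classes, not code from the installation):

```processing
// A sine tone run through Minim's Delay and Pan UGens
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  out = minim.getLineOut();

  Oscil tone = new Oscil(440, 0.4, Waves.SINE);  // a plain sine tone
  Delay delay = new Delay(0.4, 0.5, true, true); // 0.4s delay, with feedback
  Pan pan = new Pan(-0.5);                       // push it off to the left

  // Patch the chain: tone -> delay -> pan -> speakers
  tone.patch(delay).patch(pan).patch(out);
}

void draw() {
  background(0);
}
```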

So the plan is to make a very cool interactive music installation, based on the Pulsate Flash game programmed by Andre Michelle, but rather than it being on a screen, a clap or click will create a circle on a large screen in front of the user. I have the basis of this working: the click triggering an action, and the face detection and tracking. BUT circles are not my friends at the minute; it's been a long time since I have done any serious maths! Hopefully by the next update I will have the circles all working properly and some sound on the go :)
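The clap/click trigger itself can be as simple as a level threshold on the mic input; something along these lines (a rough sketch using Minim's line-in, with a threshold I have plucked out of the air):

```processing
// Spawn a circle whenever a loud transient (clap/click) comes in the mic
import ddf.minim.*;

Minim minim;
AudioInput in;
ArrayList<PVector> circles = new ArrayList<PVector>();

void setup() {
  size(640, 480);
  minim = new Minim(this);
  in = minim.getLineIn();
}

void draw() {
  background(0);
  // A clap pushes the RMS level of the input over the threshold
  if (in.mix.level() > 0.3) {
    circles.add(new PVector(random(width), random(height)));
  }
  noFill();
  stroke(255);
  for (PVector c : circles) {
    ellipse(c.x, c.y, 40, 40);
  }
}
```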

Also, how cool is this!!

So I have finally figured out some simple circle collision! I was making it very difficult for myself for no reason; now I just calculate the distance between the centres of the two circles and compare it against the sum of their radii. If the distance is smaller than the combined radii, there must be a collision! :) Simple. There are a few glitchy aspects to it, but I am slowly working on it in my openProcessing account. Once I have something better to show I will upload a video of it all working :)
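In code the whole test boils down to one line, with Processing's dist() doing the maths:

```processing
// Two circles overlap when the distance between their centres
// is less than the sum of their radii
boolean collides(float x1, float y1, float r1,
                 float x2, float y2, float r2) {
  return dist(x1, y1, x2, y2) < r1 + r2;
}
```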

So I have got very far since the last post! I have the face tracking integrated with the creation of the circles, collision between the circles with each other and with the boundaries of the sketch, and the ability to have a circle inside a circle without it glitching out. AND! Sound is played on each collision, starting on C and going up two octaves using the pentatonic scale.
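Roughly, the note mapping looks like this (my offsets for a C major pentatonic over two octaves; the exact table in the sketch may differ):

```processing
// C major pentatonic (C D E G A) over two octaves,
// as semitone offsets from middle C
int[] pentatonic = { 0, 2, 4, 7, 9, 12, 14, 16, 19, 21, 24 };

// Turn a step of the scale into a frequency in Hz
float noteToFreq(int step) {
  float middleC = 261.63;
  return middleC * pow(2, pentatonic[step] / 12.0);
}
```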

Unfortunately the video doesn't have sound, but trust me, it works!! Soon I will start using the scale of the circles to determine which note each plays, but for now I am focusing on removing the face detection in OpenCV and using the Microsoft Kinect! *which was the whole point of this project.*

For now, the circles are created by pressing the space bar, but on the Kinect I will have them triggered by putting your hands together, as you would to clap.
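i.e. for now it is just this (with a plain PVector list standing in for my Circle objects):

```processing
// Space bar spawns a circle (stand-in for the real Circle class)
ArrayList<PVector> circles = new ArrayList<PVector>();

void keyPressed() {
  if (key == ' ') {
    circles.add(new PVector(random(width), random(height)));
  }
}
```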

SO I have Mac OSX Snow Leopard on its way to me as I type :) The next update will be a how-to on getting the Kinect up and running on a Mac.

**UPDATE**

Well, it has been a while since I have been on to update; I have been very busy, researching and messing about, and I finally have a solution!! So bear with me on this little essay »

The issue I have had is getting the Kinect 'joint position' data sent into Processing for me to play with. Using Daniel Shiffman's library I can get the point cloud working and get the depth planes working; that's easy, but not enough detail for what I wanted.

So I started playing around with this tutorial by Thom Judson, which goes into lots of detail and is for the complete amateur. After the 3 hours it took to get just over halfway through, I started getting issues in Terminal, and that's just a world that is strange and intimidating to me. SO I pulled all my hair out trying to sort all of that out.

So I decided to start fresh and come from a different angle. I have seen lots of VERY impressive clips of people controlling Ableton Live (as seen above), which uses specific coordinate data, so I looked into what they were using: SYNAPSE, which is ace! Plug and play :) just excellent, with some helpful (if at times vague) documentation to help you get started, showing how you can use the Kinect to control Ableton, QC, Max/MSP and Jitter, and, if you want to get into it, just about anything that can take incoming data over OSC (Open Sound Control), which is just what I needed. So, just so it is clear: NO NEED TO MESS AROUND WITH NITE, PRIME, TERMINAL, Max/MSP or OPENNI, it's all done for you.

I knew that Processing had an OSC library to communicate with incoming and outgoing OSC messages, so all I needed to do was learn the syntax for the communication. Which I did enlist some help for, but I think I was just being a bit slow, because it is actually all up on the SYNAPSE website and the Processing OSC library website. But that's how these things go.
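To give an idea of the receiving end, here it is in Processing with the oscP5 library (the ports and message addresses are SYNAPSE's documented defaults as I understand them: it sends joint data on UDP port 12345, listens on 12346, and stops streaming a joint unless you keep re-requesting it every couple of seconds):

```processing
// Reading SYNAPSE joint data over OSC with oscP5
import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress synapse;
PVector rightHand = new PVector();

void setup() {
  size(640, 480);
  oscP5 = new OscP5(this, 12345);               // SYNAPSE sends joints here
  synapse = new NetAddress("127.0.0.1", 12346); // SYNAPSE listens here
}

void draw() {
  background(0);

  // Re-request the right hand in world coordinates once a second,
  // well inside SYNAPSE's timeout, or it stops sending
  if (frameCount % 60 == 0) {
    OscMessage keepAlive = new OscMessage("/righthand_trackjointpos");
    keepAlive.add(2); // 1 = body, 2 = world, 3 = screen coordinates
    oscP5.send(keepAlive, synapse);
  }

  // Very rough mapping of world coordinates (millimetres) onto the sketch
  ellipse(width / 2 + rightHand.x, height / 2 - rightHand.y, 20, 20);
}

// oscP5 calls this for every incoming OSC message
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/righthand_pos_world")) {
    rightHand.set(m.get(0).floatValue(),
                  m.get(1).floatValue(),
                  m.get(2).floatValue());
  }
}
```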

And as if by magic, there we have it, all the data streaming through :) I will eventually put it up on the openProcessing site (although it won't work unless you do have an XBOX Kinect).

So I now have the Kinect integrated with a rough version of these musical circles; when your hands come together, a circle is created :D Happy Days
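The hands-together check is basically the circle-collision maths again, on the two hand joints (carrying on from the oscP5 sketch above, with a leftHand PVector filled from /lefthand_pos_world in the same way; the threshold is my guess):

```processing
// Spawn a circle when the two hand joints come within ~15cm of each other
PVector leftHand = new PVector(); // filled in oscEvent(), like rightHand
boolean handsWereTogether = false;

void checkHands(ArrayList<PVector> circles) {
  boolean together = PVector.dist(leftHand, rightHand) < 150; // mm, a guess
  if (together && !handsWereTogether) {
    circles.add(new PVector(rightHand.x, rightHand.y)); // one new circle
  }
  handsWereTogether = together; // latch so one clap = one circle
}
```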

There are still a few holes in it, but I am so happy with it I thought I should just put this video up and update it later on with the final version.

As for the Processing sketch/source code, you are best to download it and run it in desktop Processing rather than the online one.

I decided to use the Kinect rather than the OpenCV face tracking, just because the face tracking has so many variables that can go wrong, like the wrong lighting or the position of your face, so there is a lot of interference. ALTHOUGH, saying that, it is very cool that you can just walk into camera shot and be tracked instantly, rather than having to do the 'psi' calibration pose for the Kinect. But with the Kinect, once you are tracked it is pretty solid, and because it works with infrared it can work in the dark, so lighting can be as bright or dark as you like. I think the Kinect has far more creative possibilities than the OpenCV face tracking, but as a mix of them both, I think there could be some serious scope! Especially for live animations. Imagine a live Family Guy or Simpsons table read, but with each character animated in front of you, complete with lip sync from the face tracking! Instant screen test :)