No jetpack announcements this week (especially not from Qantas), but there have been a few exciting one-foot-in-the-future live video developments lately.
[Embedded video: Syphon Teaser from vade on Vimeo]
Syphon
Syphon is “an open source Mac OS X technology that allows applications to share frames – full frame rate video or stills – with one another in realtime. Now you can leverage the expressive power of a plethora of tools to mix, mash, edit, sample, texture-map, synthesize, and present your imagery using the best tool for each part of the job. Syphon gives you flexibility to break out of single-app solutions and mix creative applications to suit your needs.”
Out of the box, this means you can send live video signals between these applications: Quartz Composer, Max/MSP/Jitter, FreeFrame GL and Unity 3D Pro (a game engine). Within a short while of release, though, the list has already grown to include Modul8 and the MadMapper mapping software, Resolume Avenue, built-in support in the new VDMX beta, Isadora, CoGe, openFrameworks, Cinder and Mix Emergency (software for scratching video with a Serato turntable setup).
It all happens on the graphics card, not the CPU, which means HD video can be shuffled between applications at 60 frames per second. This is a great boost for live video flexibility (lossless live video mixing in and out of 3D game engines? No problem.), and if it evolves to send frames over networks as well, awesome collaborative possibilities await.
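To give a sense of how little glue code is involved, here is a minimal openFrameworks sketch assuming the ofxSyphon addon: it pulls in frames published by another application and republishes its own composited output. The “Main Output” / “Modul8” names are placeholders for whatever the sending app actually advertises, and the addon’s exact method names may differ between versions.

```cpp
// Minimal sketch, assuming the ofxSyphon addon for openFrameworks.
// Server/app names below are placeholders, not a specific setup.
#include "ofMain.h"
#include "ofxSyphon.h"

class SyphonMixApp : public ofBaseApp {
public:
    ofxSyphonClient liveFeed;  // receives frames published by another app
    ofxSyphonServer output;    // publishes our own frames back out

    void setup() {
        liveFeed.setup();
        liveFeed.set("Main Output", "Modul8"); // server name + app name of the source (placeholder)
        output.setName("OF Remix");            // how other Syphon apps will see this sketch
    }

    void draw() {
        ofBackground(0);
        // draw the incoming feed; it stays on the GPU as a texture throughout
        liveFeed.draw(0, 0, ofGetWidth(), ofGetHeight());
        // ...layer any extra drawing on top here...
        output.publishScreen(); // republish the composited frame for other apps
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new SyphonMixApp());
}
```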
Kinect
Another splash in the visual tech world has been the recent release of the Kinect camera add-on for the Xbox. The device features “an RGB camera, depth sensor and multi-array microphone running proprietary software”, and enables 3D motion capture, facial recognition and voice recognition. Being such a hacker’s delight of a device, within a week of release there was already a growing range of software that lets it be used outside the Xbox. Examples include drawing in 3D (and rotating the image) with gestures, visual FX applications and, thanks to Syphon, a way to get Kinect 3D depth images into Quartz Composer: read the camera in openFrameworks, then send the frames through Syphon to your visual app of choice (a rough sketch of that bridge follows below).
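For the curious, that OF-to-Syphon bridge needs surprisingly little code. Here is a rough sketch assuming the ofxKinect and ofxSyphon addons; the “Kinect Depth” name is just an example, and method names can vary between addon versions.

```cpp
// Rough sketch, assuming the ofxKinect and ofxSyphon addons for openFrameworks.
// The "Kinect Depth" server name is an example; method names vary by addon version.
#include "ofMain.h"
#include "ofxKinect.h"
#include "ofxSyphon.h"

class KinectSyphonApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofxSyphonServer depthServer;

    void setup() {
        kinect.init();                        // open the RGB and depth streams
        kinect.open();
        depthServer.setName("Kinect Depth");  // the source name Quartz Composer etc. will see
    }

    void update() {
        kinect.update();                      // pull the latest frames from the device
    }

    void draw() {
        // draw the depth image into the window...
        kinect.drawDepth(0, 0, 640, 480);
        // ...then publish that frame as a Syphon source for other apps to pick up
        depthServer.publishScreen();
    }

    void exit() {
        kinect.close();
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new KinectSyphonApp());
}
```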
Notable Kinect shout-outs: real-time light sabers (love the use of the little mirror to frame and juxtapose the original footage here), Kinect puppet shows and instant fat-suits.
And there’s a nice extended Kinect round-up of projects over at Creative Applications.
So there you go, the future is already here – *and* it is starting to become more evenly distributed…