For the TZU ‘Beautiful’ music video, I recently found myself out near Hanging Rock, with plastic-wrapped laptop, projector, camera, lights, and a mini-crew – filming ghost projections in the winter night rain. Despite the weather drastically mismatching the supposed forecast and slowing everything to a snail’s pace, we salvaged the situation as best we could, reworking the storyboard around some of the less exposed areas, and soldiered on until about 5am. Not the end result we’d aimed for, but I’m happy with what we managed in the circumstances. So it goes. Full credits/links, and a series of behind-the-scenes photos, over at the project page.
“The *spark d-fuser lets you crossfade between laptops. Whether switching between presenters or pushing avant-garde pixels, hands-on control for mixing DVI and VGA signals is now available in a compact and affordable package.
If you want to know more or see it in action, jump straight to the demo video below. If you’ve been following the project, the message is simple: pay and yours will be produced. Orders open on September 5th; the manufacturing run will then take six weeks from there. Price: £710 ex. VAT, £852 inc. VAT.”
We have no jetpacks, but soon, it seems, we will have affordable mixing of digital video signals, thanks to the herculean efforts of 1 x Toby Harris aka *spark aka ‘card carrying Timelord amongst VJs’.
Rattling along in the tube, in between bankers reading 50 Shades of Kindles… Toby envisioned a better world, a world where VGA and DVI signals could be mixed without repercussions, and a world where smooth crossfading could happen with a device carried in your backpack. It was also a world that he would have to build himself, and a couple of years down the track, here we are. In between priming conveyor belts and supervising factory elves, Toby was kind enough to answer these questions:
What have you enjoyed about using your prototypes during performances?
The mixer for me is in support of the laptop, and damn have I enjoyed pushing crazy pixels with my laptop. Using it two-up in a D-Fuse show with Mike, I’m freed from the need for it always to be my mix on screen, so I can rip down, prepare and experiment with the mix. Makes me push things much further! That, and I’m freed from the fear of my bleeding-edge software taking down the whole show.
The surprise for me was the tap buttons – I love them. The original prototype didn’t have them; I envisaged a crossfade from one to the other and not much else. But in the expression-of-interest form lots of people asked, so on they went… and wow, tapping in a slight variation of the main laptop’s mix is a really powerful thing.
What sorts of firmware additions would you like to see / develop? (you mentioned multiply mode as an option once?)
Mix modes are in the realm of possibility. The processor has the power to compute a soft-edged key for every pixel, so there’s some per-pixel computing power to play with. Additive is the bangs-for-your-buck upgrade here, and I think would really creatively transform what is possible with the mixer as you get the ability to truly composite the two sources together. I talk about this at the end of the demo video, and I’m really trying to make it happen.
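For the curious, the creative difference between a crossfade and an additive mix boils down to per-pixel arithmetic – a toy grayscale sketch in Python, nothing to do with the mixer’s actual firmware:

```python
# Toy per-pixel illustration on 8-bit grayscale values (0-255).
def crossfade(a, b, t):
    """A/B dissolve: a weighted average, so mixing always dims both sources."""
    return [round(pa * (1 - t) + pb * t) for pa, pb in zip(a, b)]

def additive(a, b):
    """Additive mix: sum and clamp, so both sources stay at full strength."""
    return [min(pa + pb, 255) for pa, pb in zip(a, b)]
```

With a dissolve at 50%, two bright sources each land at half brightness; additively, they genuinely composite on top of each other – which is why it would transform what the mixer can do.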
I’d love to see the processor lose its line limit of 2048 pixels. There’s the naive observation that TripleHead 800×600 should be possible, given that it’s actually fewer pixels to process than the 1920×1080 it definitely can handle. TV One have in a way already answered this with the 1T-C2-750’s sibling, the 760: it can do 2880×900, but at the cost of the ability to fade both sources.
You have to realise, however, that it’s TV One’s processor, and the firmware that runs it is very much their core product, their IP. There’s no possibility of them giving it to us to work on, and them doing anything for us is a decision intertwined with their wider business plans. I wish it weren’t so, but the sheer fact that they designed the 750 and produced it at an affordable price is something to wonder at.
Why release the firmware as Open Source?
The frustrations above should go some way to answering that! If you need to tweak, extend or optimise, it’s in your own hands, and in the best case that gets shared back to all. Simply put, it’s what I would want if I were in the community buying one. There is more to it than that, and there certainly are risks, so let’s call it an experiment and see how it plays out.
Why has the video hardware world been so slow in releasing affordable digital mixers?
Well, one thing I can say is that this project has been one of the most ridiculous things I’ve ever done in terms of effort and reward – if I had an eye on the bottom line I’d have stuck to bespoke development and on-site fees! There’s obviously a quantity sold at which point that changes, but I’m not sure that quantity is comfortably within the VJ market, and I’m doubly not sure of that if you have the overheads of a worldwide corporation.
I’m surprised, however, that VJs haven’t been able to co-opt generic presentation kit; the 750 is as close as I’ve seen.
In what kinds of ways have you played (live) with the OSC / DMX and ethernet capacities?
The simple answer is I haven’t – the ability to have that is everybody’s gift back to me for doing this project, along — hopefully — with additive mixing. Come the first D-Fuse gig with the new controller, we’ll be rocking the OSC out. Finally we can cut between visual laptops and have the audio follow!
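For anyone wanting to experiment before a controller arrives, an OSC message is simple enough to build by hand – a minimal Python sketch using only the standard library. The “/dfuser/xfade” address is made up for illustration; check the controller’s documentation for its real OSC namespace:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_crossfade(position: float, host: str = "10.0.0.20", port: int = 9000):
    # Address and host/port are illustrative placeholders, not real defaults.
    msg = osc_message("/dfuser/xfade", position)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port))
```

OSC rides over plain UDP, which is what makes the “audio follows video” trick possible: the same crossfade position can be sent to the mixer and to the audio software in one go.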
1. Re-created Middle-Earth on a kitchen tabletop… (hello – every backyard plant we have, hello – fallen moss covered branches from the park across the road, hello – turntable, hello – flashing bicycle lights, hello – wonky lampshades and plastic toys.)
2. Made some custom animations (Quartz Composer, After Effects), and projected these onto Middle-Earth, using software to manipulate the projections (VDMX, Madmapper, Quartz Composer).
3. Recorded the results (Canon 7D, various lenses), and edited together (Premiere).
As part of doing live video for an event a few months ago, I was asked about displaying a live twitter feed for it.
“I can probably take care of that.” Which meant…
QUARTZ COMPOSER WRESTLING
Ingredient 1:
The basic RSS patch that comes with Quartz Composer…
(Entering the skynoise RSS feed URL into the patch on the left, generates the output in the viewer on the right.)
The Quartz patch didn’t seem to continually update from the twitter RSS feed – after a while it started cycling back through older tweets in a loop (10–15 tweets?). I also couldn’t figure out how to display the author alongside the text. I tried looking through the RSS info to find author parameters, then to see where these might be adjusted within the Quartz patch – no dice.
I posted a description of the problem to pastebin and asked on twitter… and @lumabeamerz kindly wrote back *and* adjusted the Quartz patch, noting…
“If you put your mouse pointer for a moment on a structure’s output, you will see what is “flowing”, like this:
So, 0–4 are indexes, “…” are keys. Basically, we need the member at key “authors”, which will give us another structure. The index 0 member of that structure is the one we want, and gives us another structure. From that last structure, we can extract the name with the key “name”. It is simple if you are a programmer, since the method is the same in Obj-C land for accessing structures. For the updating, I connected a Signal patch to the RSS patch’s update signal input, so it actually refreshes on a 60-second period.”
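For non-programmers, that key-path (“authors” → index 0 → “name”) is the same idea as walking a nested dictionary – a toy Python equivalent, with the field names assumed from the description above:

```python
# A toy feed entry shaped like the RSS patch's output structure.
# The field names here are assumptions based on the quote above.
entry = {
    "title": "testing the projector rig tonight",
    "authors": [
        {"name": "skynoise", "uri": "http://twitter.com/skynoise"},
    ],
}

# key "authors" -> index 0 -> key "name", exactly as described
author_name = entry["authors"][0]["name"]
print(author_name)  # → skynoise
```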
Here’s the final quartz patch, edited by @lumabeamerz – which continuously updates any tweets from a particular hashtag, and displays author name alongside. Maybe it’s a useful template for you to modify however you wish?
Resolume Avenue 4.1
Pitched as a HD video mixing desk, this does everything you’d expect of modern VJ software. Detailed specs listed here.
Strengths at a glance?
– Instrument-like interface, built for performance. The screen layout’s simplicity avoids complex navigation: it’s built for quick and easy triggering, and gives an easy visual overview of the overall composition, selected layers, and selected clips. There are plenty of nice interface touches – e.g. click a column to play all the clips in that column, one on each layer (makes for easy switching between combinations of clips), adjust transition timings so new clips in each layer trigger as fades or cuts, and set up custom dashboards for quick viewing of FX parameters.
( Above : the tightly arranged mixing section )
– Solid playback of video, audio and image files. ( Includes very fast custom codec, DXV, see below for codec comparison.)
– Audiovisual effects and integration – many features including VST FX for audio, easy combining of audio and visual effects, and the ability to easily crossfade just the video, audio, or both at once.
– Plays back interactive compositions – e.g. Quartz Composer, and Flash animations including AS2 and AS3 scripting. It also handles FFGL plugins, e.g. the IR Mapio plug-in, which can be used for mapping videos to projection surfaces.
– MIDI / OSC – use any hardware or virtual controller – and, nicely, it includes preferences for starting clips an adjustable few milliseconds further into the timeline, to deal with the delays MIDI triggering can incur.
– BPM Tempo + Snapping – Everything can be linked to the global (Beats Per Minute) tempo to create a fully synchronized audiovisual performance. Automatically pitch the audio and video and synchronize parameter automation. Use the beat snap function to trigger clips in sync with the beat. Audio analysis – Utilise environmental sound to bounce any clip parameter to the music.
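The beat-snap and tempo-pitching ideas reduce to simple arithmetic – a rough Python sketch (not Resolume’s actual code):

```python
import math

def next_beat_time(now: float, bpm: float) -> float:
    """Beat snap: quantise a trigger to the next beat boundary (seconds)."""
    beat = 60.0 / bpm          # length of one beat in seconds
    return math.ceil(now / beat) * beat

def playback_rate(clip_bpm: float, global_bpm: float) -> float:
    """Speed factor needed to pitch a clip to the global tempo."""
    return global_bpm / clip_bpm
```

So a clip triggered at 1.1s with a 120 BPM tempo actually starts at 1.5s, the next beat; a 100 BPM clip plays at 1.2x to sit on a 120 BPM grid.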
New to 4.1?
– Native Syphon support on Mac OS X. This means you can re-route Resolume into other video apps, and vice versa. ( hello madmapper, VPT, quartz, jitter, VDMX, Modul8, Unity Game Engine etc )
– Layer Routing – ‘create a clip that takes the input from any layer below’. ( All kinds of remixing and compositing capacity )
And The Flipside?
Some complexity and versatility is lost with the design decision to streamline the interface for performance – e.g. you can’t easily preview, compare or adjust effects on two or more layers at once. I also asked a group of Australian VJs who’ve used Resolume much more than me what they’d like improved:
– some still prefer earlier versions for stability reasons (V4 crashes – with V3 decks loaded / when too many notes are fired in Ableton)
– lack of a search function in the file browser + effects list
– needs a text tool
– needs an audioreactive timeline like v2.0
– midi can sometimes be inconsistent
– € 299.00 Euro for 1 computer. (50% discount available for staff/students) ( Includes all 4.x.x updates, eg 4.1.1 that came out just as I got ready to press publish)
– Windows 7 or XP: ATI Radeon 9600 or better. NVIDIA GeForce FX 5200 or better. 2GB RAM.
– Mac OSX 10.4.9 or later. Intel processor. Quartz Extreme graphics card (Resolume is not compatible with integrated Intel graphics processors). 2GB RAM. DXV Codec.
Resolume Arena 4 Media Server
Arena has all the features of Avenue aaaaand..
– Screen Warping & Video Mapping – In the advanced output window you can now create as many slices of your composition as you like, and position and transform them to your liking – good for multi-surface projections. New: route layers directly to slices, masking and cropping now added to the Advanced Output of Arena, and use bezier curves to map video onto curved screens.
– Soft Edge – Use soft edging to seamlessly project one widescreen image with two or more projectors, or wrap the composition around for 360-degree seamless projection.
– SMPTE Timecode Input – “With SMPTE Timecode input you can run your clips in sync with anything you want. Lights, lasers, even fireworks!”
– DMX Input – Control Arena from a lighting desk using DMX. It works similarly to MIDI, so it’s very easy to configure. Input can be done via ArtNet or an Enttec DMX USB Pro.
These features will appeal to some (especially perhaps PC users, who don’t have access to the easy mapping capacities of the Mac-only Madmapper) – an interesting option compared to much more expensive hardware media servers.
(Above: the mapping related bits inside Arena’s advanced output. )
Arena 4.1 Requirements
(Same tech requirements as Avenue 4.1)
€ 699.00 Euro for 1 computer. (50% discount available for staff/students)
“We’re always saying generative content is the future, so it’s about time we proved it! The original Quartz Composer patch is included to create endless variations yourself”
H.264 helped popularise web video with its intensive file shrinking while maintaining a lot of visual quality. It’s terrible for real-time VJ software though, because of the relatively painful CPU intensity required to decode it. I do use it sometimes (the Canon DSLRs record natively as H.264, and sometimes I’ll just throw a clip straight in and it’ll play fine), but it’s generally best avoided. Photo JPEG at 75% seems to work nicely on most platforms, though Apple ProRes and AIC give slightly better image quality when a gig / clip needs more detail. So where does DXV fit into this?
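For anyone batch-converting DSLR footage, ffmpeg’s mjpeg encoder is a workable stand-in for Photo JPEG – a hypothetical little Python helper that just builds the command. The -q:v value is an assumed rough match for “75% quality” (2 is best, 31 is worst); tune by eye:

```python
# Hypothetical helper sketching the transcode described above:
# DSLR H.264 in, Photo-JPEG-style Motion JPEG out.
def photojpeg_args(src: str, dst: str, q: int = 5) -> list:
    return [
        "ffmpeg", "-i", src,
        "-c:v", "mjpeg",         # Motion JPEG: each frame decodes cheaply
        "-q:v", str(q),          # assumed stand-in for "75% quality"
        "-pix_fmt", "yuvj422p",  # full-range 4:2:2, as Photo JPEG typically uses
        dst,
    ]
```

Pass the resulting list to subprocess.run() (or join it into a shell command) to do the actual transcode.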
“Regular video codecs require the CPU to decompress the frames before they can be pushed to the videocard for display. With our DXV Codec video frames are decompressed by the videocard’s GPU which can do this much more efficiently than the CPU. The DXV codec can also store the alpha channel. This is essential for preserving translucency in complex video compositions. Hardware accelerated playback is only done when played in Resolume – the video will play in any other software but it will not be hardware accelerated.”
A quick lo-fi test for comparison’s sake then?
1 minute 1920 x 1080 / Pro Res video in Resolume Avenue 4.1 with no FX.
(On a 2010 MacBook Pro with the usual hundred apps / million browser tabs open)