Audiovisual turntablist Sampology recently commissioned a music-video for the interlude track on his upcoming album. The catch was, it had to be 100% animated cats.
IF Felinology = the study of cats and their anatomy, THEN: GIF-felinology = kind of like the torture scene from A Clockwork Orange, where eyes are propped open by pins… and as a piercing cat MIDI loop blares, an avalanche of cat tumblr feeds scratch away at the poor researcher's eyes, until nothing remains but twitching and furballs.
I survived though – and am now a semi-professional animated cat-gif expert (contact me for rates). Catch the clip below ( or over at the projects page, where I now archive + document my video activities), or see it on the big screen during Sampology’s upcoming Apocalyptic AV tour.
Next up: more apocalypse, preparing ‘post-apocalyptic visual backdrops’ for upcoming TZU tour…
Warning: playing around long enough with 2D billboards and sprites (to display images) inside Quartz Composer will eventually lead you to a third dimension. Fear not, inept explorers before you have survived, and they’ve even left some string at the edge of the cave:
Playing with 3D in QC
1. Create a Clear patch, for a blank background.
Type ‘clear’ into the library window, then drag and drop a ‘Clear’ patch onto your editor window. Choose its colour in the inspector.
2. Drag a ‘cube’ onto your window too.
Drag a movie from your hard drive into your window. Attach its image output to one of the image inputs of the cube.
Play with the rotation and x / y / z values of the cube in the inspector, and you should see video on one side of the cube.
(Tip: when adjusting Quartz parameters, hold shift to adjust at 10x speed, helpful for quicker compositing.)
You now have video playing on a 3D object in QC. Woo!
To easily put video on all surfaces, create an input splitter by right-clicking on the cube. Select ‘Insert Input Splitter’ and choose the cube image input you’ve been using. Drag from the output of the input splitter to as many of the cube image surfaces as you wish.
3. For easier 3D space manipulation try using a trackball. Drag one in from the library. Click and drag to select your cube(s) and movie files, then press command + x to cut them. Leave the clear patch. Double-click the Trackball patch to go within it. Paste (command + v) your cube here. Now try clicking and dragging on your QC viewer window – the trackball enables this more intuitive navigation / perspective alignment. (Click edit parent in the editor window to go back to your upper layer patches at any time.)
(Similarly, the 3D Transform patch can be used to allow easier control of 3D space. For example, placing several 3d objects inside a 3D transform, allows easy perspective adjustment of the whole scene by changing the 3D transform parameters.)
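The grouping idea behind the 3D Transform patch is easy to picture outside QC too: parenting several objects under one transform just means every point in the scene gets the same rotation applied. A rough Python sketch of that (plain tuples and a hand-rolled rotation, nothing to do with QC's internals):

```python
import math

def rotate_y(point, degrees):
    """Rotate a 3D point around the Y axis."""
    t = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(t) + z * math.sin(t), y, -x * math.sin(t) + z * math.cos(t))

# Two "objects" in the scene, reduced to single corner points.
# One shared rotation moves the whole group, like nesting both cubes
# inside a single 3D Transform and turning one knob.
scene = [(1.0, 0.0, 0.0), (0.0, 0.0, 2.0)]
rotated = [rotate_y(p, 90) for p in scene]
```

Adjusting the one rotation value re-poses everything inside the group at once, which is exactly why dropping several objects into a 3D Transform makes whole-scene perspective changes so cheap.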
4. To add complexity – drag a ‘Replicate in Space’ patch from the library into your editor window, and place beside your cube. Again select your cube(s) + movie(s) + cut these. Double click to go inside the Replicate patch, then paste these inside. Play around with the Replicate parameters and watch the results.
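Replicate in Space essentially stamps out N copies of whatever sits inside it, nudging each copy's transform by a fixed step. A toy Python sketch of that idea (the function and parameter names are invented for illustration, not QC's actual parameters):

```python
def replicate(base_origin, copies, step_rotation, step_offset):
    """Return a (rotation, origin) pair per copy: each successive copy is
    rotated and offset a little more, like QC's Replicate in Space."""
    out = []
    for i in range(copies):
        rotation = i * step_rotation
        origin = tuple(c + i * d for c, d in zip(base_origin, step_offset))
        out.append((rotation, origin))
    return out

# Four copies of a cube, each turned 45 degrees further and shifted along x.
ring = replicate((0.0, 0.0, 0.0), copies=4, step_rotation=45.0, step_offset=(0.5, 0.0, 0.0))
# ring[3] → (135.0, (1.5, 0.0, 0.0))
```

Animating the step values over time is what makes the patch's output swirl: one parameter change cascades across every copy.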
3D experiment quartz patch ( Update the movie location to match a video file of yours, or delete the included movie link, and drag a movie into the editor)
Playing with your 3D QC file inside VDMX
Option 1: Drop it into the media bin and it should playback the 3D model with pre-connected movie.
Option 2: To use the QC file as an effect – allowing any clip being played in VDMX to appear on the cube – we need to do a few things :
- To enable any of the QC parameters to be adjusted live inside VDMX, we need to adjust the relevant parameters in the QC patch. This is called ‘publishing an input’ in Quartz (and enables that QC parameter to be adjusted within VDMX - see instructions). However, slight complication: these need to be published in the topmost layer of Quartz (ie the root macro patch) to be accessible in VDMX. So if you’ve published an input within a subpatch of your main patch, it won’t show up in VDMX. To solve this, publish the input at the layer of QC you are in (eg inside a Replicate in Space patch), then go up one level; the published input will now appear listed on that patch (eg Replicate in Space). Right-click and publish it again, and it will appear in the next patch up, and so on until it appears in the root macro patch.
- Save the QC within your VDMX QC FX folder. Select it from assets ( if needed, refresh via VDMX > prefs > User paths ). Whatever VDMX clip is playing will now be composited onto the cubes. Dude! OH: and click-dragging in the VDMX preview window works for the trackball navigation, the same way it does within QC.
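The publish-up-to-the-root step described above can be sketched as data: each macro patch gains a published-inputs entry until the input surfaces at the root. A Python sketch using nested dicts (the structure and names are illustrative only, not QC's actual file format):

```python
def publish_to_root(path, input_name, root):
    """Bubble a published input up through nested macro patches to the root.

    `path` lists patch names from the root down to where the input lives,
    e.g. ["Replicate in Space"]. Publishing at every level mirrors the
    repeated right-click > Publish dance in Quartz Composer.
    """
    # Walk down to the patch holding the input, remembering each layer.
    chain = [root]
    node = root
    for name in path:
        node = node["subpatches"][name]
        chain.append(node)
    # Walk back up: publish at each layer so the layer above can see it.
    published_at = []
    for patch in reversed(chain):
        patch.setdefault("published", []).append(input_name)
        published_at.append(patch["name"])
    return published_at

root = {"name": "root", "subpatches": {
    "Replicate in Space": {"name": "Replicate in Space", "subpatches": {}}}}
order = publish_to_root(["Replicate in Space"], "Cube Rotation", root)
# order → ["Replicate in Space", "root"]: published deepest-first,
# and only once it reaches the root does VDMX get to see it.
```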
3D experiment quartz patch for VDMX (Drop into VDMX QC folder. Change parameters in this QC patch, save as new name in the same QC FX folder, and you’ve got as many new 3D compositing tools as you want.)
Playing With 3D Models
Download and install Vade’s v002 Model Loader - which allows you to “load 35+ 3D model formats into Quartz Composer, including animation data.”
Drag the v002 model importer into your editor. For ‘model path’, enter the address of a 3D model. (Drag a 3D model to the editor, click it, select path in inspector, copy and paste into v002.)
Connect an image or video to the v002 Model Importer ‘image’ input, to texture your model.
Read the included notes for more fun – including models with embedded 3D animation parameters.
Recording Your QC Experiments
Install Syphon Recorder.
Install the Syphon for Quartz Composer plug-in.
Put a ‘Syphon Server’ patch into your top-level Quartz editor window. To use the OpenGL scene as the source, set this patch to be the last (top) layer rendered (ie click on the right of your Syphon Server box and ensure its number is higher than the other layers’). This makes the Quartz output available to Syphon Recorder.
Open ‘Syphon Recorder’. Your Quartz video should already be visible. Adjust recording preferences to suit (it manages 1080p fine on this three-year-old MacBook Pro) and hit record. It even records alpha channels (ie you can record audio-responsive 3D models on a transparent background, for easy compositing later into the likes of After Effects).
(See also: QTZ Rendang - Free software for rendering out quartz patches. Potentially useful for non-real-time rendering out intensive patches that give slow playback. Haven’t tested that yet though.)
- To Vade, for providing the v002 model importer, Syphon and Syphon Recorder, which make a lot of the above possible.
- To VJ Shakinda ( co-author of a forthcoming book on QC with @momothemonster) for his trilogy of 3D tutorials on youtube, which really helped me get to grips with 3D in QC. If you found this stringy guide helpful, wait until you’re bungee jumping with this fella:
Aphex Twin played in Melbourne recently, which also meant a chance to catch the pixel mutations of his regular tour VJ, Weirdcore. After a summer of stage screens saturated with glossy, void-of-personality motion graphics templates, it was refreshing to catch live tour visuals that were ambitious, sophisticated and raw – very obviously being generated and manipulated live, mistakes and all. Probably not many other ways to approach a set for Aphex, and apparently he does improvise a different set every single time.
Below, a diagram from Weirdcore’s artist profile at vidvox.net, explaining his video set-up**.
“With additional FX programming from Andrew Benson, Flight404, and Vade this is one of the wildest tour setups we’ve seen in a while… but you wouldn’t expect anything less for the world’s most known electronic musician. Pitchfork may have said it best: ‘First, we can’t really talk about anything until we talk about the visuals.’”
[[ **"UPDATE: Weirdcore mentions that diagram is now dated (was for his 2011 set-up), and he intends to shift everything to jitter, with one computer direct to LEDs, much simpler and less likely to fuck up." ]]
Above : 1 x Richard D. James + 1 x Kinect, flanked by L.E.D. Screens of the face-replaced crowd at the Palace, Melbourne.
Below : The Kinect in action at The Future Music Festival ( Bonus Melbourne software plug : kinectar.org ).
Aside from general pixel mangling and the fast and fluid Weirdcore style – I was curious to see how effective the live face-replacing would be (snippets of it in live Aphex shows have been glimpsed online for a couple of years now). The software and camera set-up seemed to take a while to tune, but when it locked in the effect was mesmerising, updating fast enough to cope with panning cameras of a boisterous crowd, all while being relentlessly modified and further manipulated.
The face-tracking was also put to work in a section of the show that is customised for each location. Weirdcore had asked for feedback on the list of Aussie celebrities he’d compiled, so I threw him a few more names, including 1 x Wally. Crowd reactions varied, but were probably loudest for the photo below of Julia Gillard, and Weirdcore mentioned later that for whatever reason – unlike most other countries, the response had been loudest for politicians at Australian gigs.
Aphex + Die Antwoord Live
[[ Die Antwoord? If you need a crash course in South African cartoon-rave-gangsterism, spend 15 minutes of your life inside the short film Umshini Wam, made with Harmony Korine (Gummo, Kids, etc). ]]
If the Aphex Palace gig wasn’t overloaded enough, 3/4 through the set - on top of a cacophony of Aphex - Die Antwoord burst out from backstage in orange fluoro suits and rapid-fired a couple of tracks, one of which has ninja rapping from above the crowd he has stage-dived into. There are almost enough camera phones in the air for ninja to get back onstage by walking across a bridge made of gadgets. Another high point of weirdness is reached when ninja + Yo-landi rasp in their South African accents, ‘Aussie, Aussie, Aussie’ and the crowd eats it up, barking back enthusiastically ‘Oi, Oi, Oi!’. Somehow it all makes sense, including the idea that future Die Antwoord videos are to be made by Weirdcore and later by Chris Cunningham. One extended global mutant franchise. And yes, after Die Antwoord depart, Aphex still has plenty in reserve (as do the lighting and laser operators), so by the time it’s done, we can only depart exhausted.
Thanks to Weirdcore and his tech companion Fede for taking some tour time out to meet up and chat about pixels.
Sadly, we’re now without a living Moebius. All the more reason to revisit (or explore anew) his wild and fantastic creations. Most people might recognise his graceful intergalactic palette as some of the best parts of Heavy Metal magazine (originally Metal Hurlant in France) – but even the comic-less would be familiar with his work and influence on films such as Alien, Tron + Blade Runner. Indeed, we’ve lost a gentle giant.
After being invited to perform at the Australian Natural History Museum in Sydney ( back in Aug 2011), I couldn’t resist the chance to project onto a dinosaur skeleton. I was performing video in the foyer with Pattern Machine as part of the Museum’s Jurassic Lounge evening series, and the museum staff were super helpful in squeezing in this extra request during an already hectic night ( which amongst other things, included a silent disco and live taxidermy!).
The Muttaburrasaurus was apparently a herbivorous Aussie dinosaur that lived 100 million years ago, around 8m / 26 ft tall and weighing around 3 tonnes. To project onto it, I connected my camera to my laptop and used Madmapper‘s spatial scanner function to generate an image mask, tidied this up in photoshop, then played with the masked image using VDMX.
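MadMapper's spatial scanner works by watching the projected surface through the camera and working out which pixels the projector can actually hit. A crude stand-in for that mask-generation step is a plain brightness threshold over a grayscale camera frame; this toy Python sketch (invented numbers, nothing like MadMapper's actual algorithm) shows the shape of the idea:

```python
def brightness_mask(frame, threshold=128):
    """Threshold a grayscale frame (a list of pixel rows, 0-255) into a
    binary mask: 1 where the projector's light was seen, 0 elsewhere."""
    return [[1 if px >= threshold else 0 for px in row] for row in frame]

# A toy 3x4 "camera frame": the bright patch stands in for the lit skeleton.
frame = [
    [10,  20, 200, 210],
    [15, 180, 220,  30],
    [ 5,  12,  18,  25],
]
mask = brightness_mask(frame)
# mask → [[0, 0, 1, 1], [0, 1, 1, 0], [0, 0, 0, 0]]
```

The real scanner is far more sophisticated (structured-light patterns, per-pixel correspondence), but the output is the same kind of thing: a mask you can then tidy up in Photoshop and feed to VDMX.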
Congo Tardis #1 are an absurdly talented trio of Melbourne turntablists, who recently asked me to put together a micro-budget music video for their new EP. Part of their pitch was that they already had some green screen footage of the guest vocalist (hula-hooping extraordinaire: Marawa the Amazing), so ‘hopefully a quickly composited clip wouldn’t take too long’. The green screen footage, of course, turned out to be a drunken ninja fest: most shots cut off part of Marawa’s head, or featured the camera wobbling on the tripod (when the cameraman wasn’t busy zooming in and out). And yeah – it featured juggling of *green* limes. There were enough slivers of gold to make it work though, and making it reminded me how useful VJ tools can be for video production today.
Integrating VJ Software Into Post-Production
Congo just wanted a series of composited scenes cut to music, rather than any narrative, so given their tropical flavour I set about designing some lo-fi cosmic jungle compositions. Typically that’d involve playing with images and video files in Photoshop and After Effects, but being easily able to record HD video with Syphon suddenly makes software like VDMX all the more useful for generating customised textures / timings / visual effects.
- Play music track. Adjust visual parameters with the knobs and sliders of a midi controller and record as you go.
- Play back instantly, see what worked, then rinse, repeat and refine. Or remix that video for a more layered effect.
- Throw these recorded clips into After Effects for some ‘mastering’ / fine-tuning / colour grading etc
The current Adobe Premiere + After Effects integration is another great contributor to an improved video workflow. Using what they call ‘dynamic linking‘ many rendering steps can be cut out and this can make a great difference for putting together a video. It allows precision editing a clip within Premiere, easy offloading into After Effects for parameter adjustments, then re-importing the newly adjusted clip back into Premiere – all without needing to render. Or work on a complex composition in After Effects, then import just this composition into Premiere for timeline cut-up. And of course, the end result of these processes can be exported and used within the likes of VDMX for further real-time splicing, remixing and layering, and easily recorded – for reintegration into… etc etc. THIS MAKES ME HAPPY!
Even as the VJ software market matures, it’s refreshing to see with releases like CoGe – that there’s still room for new players and perspectives. From deep in his East European code-bunker, creator Tamas Nagy was kind enough to provide a review copy and an interview.
“CoGe is a semi-modular, Quartz Composer® powered VJ application for Mac OS X®, designed for real-time media mixing and compositing.”
So – aside from the usual playback and manipulation of clips, what distinguishes CoGe? At first glance, the VDMX-like modular framework is immediately obvious, enabling CoGe to be easily customised for different performance styles or needs. Of the modules available, the sequencer stands out as particularly easy to use.
A comprehensive wiki outlines the structure and approach (the rendering chain, how the modular structure works, various automations + mappings, clip synths etc), and a useful forum is fleshed out with fans eager to push it forward. (At time of writing, CoGe 1.2.1 had just been released, with significant performance increases: lots more FPS on HD clips.)
Quartz Composer is also quite deeply integrated into the software, which makes sense – given Tamas has developed a whole range of QC plug-ins which can be used within CoGe – eg PSD Brushes / PSD Layers / Textfile Readers / Webkit (rendering webpages within CoGe) / Beat Detektor / GPL Reader (reads GIMP palette files) / Mouse Co-ordinates etc. As well as possibilities for integrating customised Quartz files and effects, CoGe allows use of 3D animation meshes (Collada .dae format) and Flash files alongside any movies and stills used for mixing and compositing. Want to build your own CoGe module using QC? See the tutorial: how to create a simple effect for CoGe with Quartz Composer. And yes, rendering is through the graphics card for maximum performance, and double yes – Syphon is well integrated too, for easily sending or receiving video to and from other applications.
It only took a little while to adapt to the CoGe world, and what initially seemed quirky, now makes some sense. The interface elements are easily moved, re-arranged and intelligently grouped together using what CoGe calls ‘aligners’ to “arrange other windows together into manageable organizational ‘buckets’”. It might look a little ravetastic, but it makes for easy navigation and visual feedback while performing, and the sequencer action is great.
A welcome addition to today’s VJ software library, it’ll be interesting to see if CoGe manages to continue developing in some interesting directions, given what is being covered elsewhere.
$US99 (custom + educational pricing also available)
A Mac computer running 10.6.7/10.7 or later, with a dual-core CPU and at least 1 GB of RAM.
Given there’s quite a range of existing VJ software – what inspired you to build CoGe?
It’s kind of a funny story. I saw my first VJ gig in 2001 at a music festival, and I fell in love with that thing. I never thought about being a VJ – I made music before, so I’m from audio land. Then, when I got my first Mac in 2006 and saw Quartz Composer and how Quartonian works, I just thought I could do something fun (never thought about making a commercial application, haha) with a kind of sync with the music, so I just tried it. That was the early version of CoGe, called LovQC, haha. You can find some test videos made with it on Youtube.
Then I spent more time on the software, added lots of new features, and with 2 friends we created the VJ team Luma Beamerz, and CoGe was born. Anyway, the first version of the app was a 50mb QC composition with just an interface; then I started to learn Cocoa, Obj-C, OpenGL and other stuff – Vade helped me a lot with the GL side – so CoGe is now a “real” application.
Anyway, I never used any other VJ software; I just created my own for my own wishes: triggering different points of a movie on the beat, sequencing still images, etc.
I think the big difference between applications is the workflow, so it depends on how you, the user, think about creating things. All VJ software has a bunch of the same features – triggering files, changing speed, colours, etc – so an artist will choose the software that fits his or her workflow. As another example, modularity is a great thing, but a lot of users are happy with built-in features in apps and never think about how it could be different.
It’s integrated into your app quite a lot – but what attracts you to Quartz Composer?
The great thing with QC is that it’s very easy to learn the basics, and non-programmers can use it too; you shouldn’t need to be a coder to do a simple image rotation, for example. On the developer side, the system integration – using QC stuff inside an application is easy – is very important in my opinion.
It also has lots of plugins and great media handlers, so lots of things are possible with QC. Basically, CoGe just connects QC stuff under the hood, nothing magic.
What impact does Syphon have on how developers might approach VJ software today?
I think Syphon has a lot of potential and is very great stuff – connecting different sources into other applications really opens some doors. Just think about “sending” images from Max and Processing, and mixing them in CoGe in a very simple way. That wasn’t possible before Syphon.
Do you have opinions about whether VJ software should provide more advanced audio controls? And sequencing controls? Or is it better to sync VJ software to something like ableton?
I really like sequencing; using clips and stills in a sequence can produce really good things. With audio controls you can have some fun, but the really good choice is syncing with an audio host if you would like to make real AV things.
What are the challenges of making a good performance interface?
It depends on your workflow and what you want to do in the performance. Recently I’m just using 3 layers with lots of media presets and some simple effects. If you’re using lots of things, the new Aligner stuff helps you make smaller groups on the screen; I think it’s a very important feature.
What are you happy about in CoGe today?
The CoGe 1.1 release makes me happy; I got lots of positive feedback on it, and saw some really nice things created with CoGe. I’m also happy because I have lots of ideas for the future.
Partially, the skynoise part of my brain has been ant-eaten away by the likes of twitter.
There’s also a whole bunch of almost-ready posts waiting to be covered in finishing sauce :
- CoGe review (including an interview with Tamas Nagy)
- VDMX 5 (Beta 8 ) review ( it is now 10 years since I reviewed VDMX 2, including a tiny interview with Johnny DeKam )
- Resolume Avenue / Arena review
- Web Aesthetics by Vito Campanelli, book review
- Art Rage Pro Review
- Sydney Film Festival + Melbourne Film Festival reviews
- another Quartz Composer tutorial / set of links + observations
- science fiction books set in the non-anglo world
- reflections on touring with Gotye..
Expect those to start trickling through in January. And after that, probably occasional longer form pieces on current obsessions, and more with images and video, less of the pop cultural snapshots. That said – everything about 2011 was probably covered in David Weinberger’s amusing top ten list of top ten lists of top ten lists. And as Umberto Eco reckons, liking lists is part of the human condition: we face infinity and our mortality by making lists / catalogs / encyclopedias / museum collections etc.
This week’s mission – finish off a video clip for Congo Tardis, using wobbly green screen footage sent by their charismatic guest vocalist, Marawa the amazing.
(( PS. The duo above, aping the Gotye bodypainting filmclip with 35 million views(!!), were wandering around at the 2011 Peat’s Ridge festival, and they became a pretty apt 2012 countdown backdrop on the big screen.. ))
Reasons you might find yourself wanting to read this very long but very awesome Raquel Meyers interview:
- Because you love 8bit graphics and people who push them to their limits
- Because Raquel makes rad stuff ( eg her recent DVD of ‘fighting washing machines and killer lego ducks’, full of videoclips, remixes and collaborations with chiptune musicians and pixel pushers – Useless Yet Crucial).
- Because you want to find out about her ascii storytelling experiments with the C64 shredding musician Goto80.
- Because you love reading about how artists wrestle with their processes.
- Because you need a crazy and wonderful collection of visual links in your day.
Who knows, but I hope you enjoy these responses as much as I did. Thanks Raquel~!
- What’s inspiring you these days?
At the moment I am experimenting with storytelling and text-based graphics like Ascii, Ansi, Petscii and Teletext with Goto80. I’ve changed both the tools and the purpose of what I’m doing during the past months. I guess what I’m doing now is formally similar to text adventures, cartoons, silent movies, text art, demos…
I’ve been mostly inspired by animations and short movies from the 20th century, like “Little island” (1958) by Richard Williams or “Cowboys” (1991) by Phil Mulloy; and also children’s books, because of the brutal style of their simple storytelling: the combination of a drawing plus a short phrase that builds up a full dream. That makes me think about 2-frame animation, and how something simple becomes even more brutal, especially working with the C64.
In the case of the short movies, the animation comes before the music, so the video is not the slave of the music (music video style). Sound effects increase the tension and the verve of the animation, and could be used in a shorter way, like an interlude, or something longer. But the main thing is the story behind it; without it you cannot go further.
A cinematic new age terror is coming! It operates in text mode, using only characters from the Commodore 64 and Amiga. This applies to both the graphics and the music.
[[ EDIT: Terror is now live - witness “2SLEEP1”, a “66-minute playlist of audiovisual performances in text mode, designed to make you fall asleep. Press play, go fullscreen and lie down. Made by Raquel Meyers and Goto80.” Screenshots below: ]]
- What hardware and software do you use to create your animations?
I use several computers. A C64 with Letter Noperator and DigiPaint. An Amiga 1200 with DPIV, Brilliance, Prism and also an Amiga 600 provided by Archeopterix. A PC and Mac, with Flash, Photoshop, video editors and the (unreleased) petsciibrush software made by Linde. Soon I will add a Teletext device.
I’m not a gear freak. I don’t really care about the tools. I used to work primarily with Flash and Photoshop, which was a pain in the ass for the things I was doing. But I still liked it. Now I use old things (Amiga and C64), and that’s also quite painful sometimes. So to answer the question – I blend old and new technologies. It doubles the pain!
I am not a purist, I am a blender.
- How much of your creative process is defined by the limitations of such technologies?
I prefer to talk about possibilities instead of limitations. I think it’s not the technology that is limited, it’s the human behind it. It doesn’t matter how old or new the technology is; there is always something new to discover and learn. Using old technology is not such a big thing – it doesn’t make everything more special, different or better. In my case, I use it because I like it.
But the things I do in Flash are different from what I do on C64. So the process is different. But I don’t really like to think too much about those things.
- Is there some cut-off line for retro computer graphics, where they are too new for you to use? What is it about 8-bit that manages to sustain appeal for you?
Not for me at least; I’m not interested in the retro version of 8-bit, so I don’t think about whether something is too new to use or not.
I remember playing Pong with my brother on the TV console, meeting my friends at ‘la sala de máquinas’ (the arcade), and having the Tetris song stuck in my head every night before going to sleep. I grew up with arcade games and graphic adventures, but it wasn’t until the 21st century, when I discovered a C64 music archive on the Internet, that all these memories became something else because of the music.
It wasn’t a revival, it was something else: the imaginary frame in my head that was once a picture now became pixels looking to be animated.
I don’t really know, but I think what keeps me interested in 8-bit is the brutalism. Big blocky objects, raw animation techniques, few frames, cuts, etc. I think it’s better if the animation method is brutal, because then it contains so much more than some detailed video where there’s less room to think on your own.
- What do you find interesting about making live visuals versus production work?
A live performance is always open to improvisation and mistakes, while production work is always under control in the timeline. You can rehearse or plan live visuals, but in the end you don’t know what is going to happen. It’s really fun to put yourself in a non-control mode; it keeps the spark. And since I don’t really use VJ software to perform, it’s always a challenge.
- What work have you done on combining and compositing 8-Bit and recorded video together?
As part of Entter (2000-2007), the video clip ‘Fantasy’ by Goto80, and ‘Dietetic Music’ by Eat Rabbit with graphics from Otro. Both of them were my earliest works in 8-bit, from 2004 and 2005, based on video recordings and post-production. In later video clips I mixed photo animations and graphics, like ‘Droidduck’ by Psilodump (2010), ‘Pink Snow’ by La belle Indifference (2010) and ‘Polybius’ by tr1c3 (2010), based on the main live cinema project ‘Polybius’ with Goto80. Parts of the VJ set also contain video and graphics mixed together. The reason for that is that my first background was analogue photography. I started when I was 14 years old, with black & white film, experimenting in the lab. The first thing that jumps into my mind is always a static picture, a frame. My work is based on the movement or animation of such frames.
- Can you describe your AV set with musician Goto80, Polybius? ( and your aims behind it?)
Polybius… the idea came from a post I read on my brother’s blog in 2007, about an urban myth about an arcade game from the 1980s (Polybius) that created sensory and cognitive deprivation in its users. So I started to talk with Goto80 about it, and how much I would like to do something with it, and with him. The basic idea was to explode the links between fiction and reality by encouraging a loss of senses. But it was not until 2009, when the French collective ‘Homemade’ invited me for a 2-week residency at Le maki (Angoulême, France), that the Polybius experience became something more than talk. There I developed a first 20-minute version, using a ‘cute’ character like a rabbit to hide my really epileptic and apophenic purpose, while Goto80 worked on the audio online from Sweden. The project was officially presented at the Cimatics festival the same year.
In the beginning of the 2010 we develop together in Berlin the second version who combines line vector aesthetics with video manipulation and 8-bit technology to induce feelings of apophenia, amnesia and panic. The Polybius experience – invented and created by us in the form of a white rabbit with a sectarian-politonic-track to be stuck in your head.
- What’ve been the challenges of developing that, and what has worked or not, when performed live?
One of the biggest challenges was working at a distance, Spain-Berlin-Sweden, over the Internet. Because we built the project together from the beginning, it was sometimes really difficult to define and create the content without being in the same place. When we presented the project at Cimatics, we realised we needed to meet physically to develop a second version, and a special place to perform it, outside of the club experience. So at the beginning of 2010 we met in Berlin for a week to prepare the second version, because we had been invited by the PlazaPlus Festival in Eindhoven, NL, to perform it in January. We did a special pass beforehand for the visualberlin collective at fh.meppen (Berlin), to test the extended 32-minute version and get feedback from the public. The third and last version is pending; it includes the physical game and an installation. But for this we need budget, and maybe a residency to develop it. It’s one of the most complicated projects I have ever done.
- To what extent are you able to adapt the visual side of that with each performance?
My set is manual, to be able to adapt to whatever happens in the live performance. Before, I was only using one laptop running an application that hosted all the visual content (graphics, animations, videos…), controlled by hand with the keyboard, so the rhythm was built by the way I clicked on the keyboard and loaded the different content. Now I’m working on a new set, which consists of a C64 and an Amiga, still in progress, so I use the laptop as extra support with the same technique. A video mixer is used to change the sources, but there are not many effects involved. The thing that takes the most time is making all the animations, graphics and videos. I only use my own material, and always try to make a special set for each performance.
- Have your computer / animation processes ever entered / filtered / affected your dreams in any way?
Yes, because I listen to the songs so many times when I’m working with them, and I also dream about the animations. But ‘Polybius’ was something really insane; I had one of the tracks stuck in my head, like a trance mode for my own sense-deleting experience.
- At the ‘Artists-Who-Inspired-Raquel Meyers’ Award Ceremony, who gets the following awards?
- Visual artist who most steps outside the echo chamber of contemporary styles?
Nam June Paik. The retrospective exhibition ‘The Worlds of Nam June Paik’, which I saw in 2001 at the Guggenheim Museum Bilbao, put him in this category for me, with works like the “Magnet TV”.
- Visual artist with the most exquisite and hard to understand technique?
Poison. I know the technique, but that is not enough, because even if you use the same software you cannot get the same results. As a PETSCII graphician I was really impressed by how he made ’2frames’ animations and graphics for the C64.
- Visual artist who best gets under your skin? ( transcends technique to grab your emotions ? )
Otromatic, he is my favourite 8-bit graphician. He became one of the reasons why I started making lo-fi graphics and animations.
- Best coherent, integrated audiovisual act?
Gangpol & mit. A really impressive performance, one of my favourites. I really enjoy the animations.
But wait, there’s more:
This is something really difficult to do, because inspiration doesn’t come only from visuals. There are so many things involved in this process. Here are some of them, older and newer inspirations:
- Visions of Frank. The dreamlike world of ‘Frank’, a comic by Jim Woodring, converted into animations.
- Jan Švankmajer and his surreal animations like ‘Meat Love’.
- Professor Balthazar, a cartoon series for children, created for television by the Croatian animator Zlatko Grgić. Watching this as a child builds a surreal imagery that resurfaces when you get older.
- Poison, C64 graphician. The ‘Notemaker Demo II’: all you can do just by typing characters.
If you’ve watched The Century of the Self, The Power of Nightmares – or really, any series by Adam Curtis (this could keep you busy for a while) – then you’re aware of his formidable skills in crafting a compelling documentary. Fans have probably already seen his eagerly awaited latest series, All Watched Over by Machines of Loving Grace, which claims that computers have failed to liberate us and instead have “distorted and simplified our view of the world around us”. Once again we find Curtis swinging his sword at the notion of power in the twentieth century, slashing his way through the deepest undergrowth of the BBC archives along the way.
As always, his arguments focus on the emergence of significant ideas in the past, from where he traces a path to how they’ve shaped the world today. And so, he explores the effects of Ayn Rand‘s ideas on American financial markets, looks into the selfish gene theory which holds that humans are machines controlled by genes, and examines how “the ‘ecosystem’ myth has been used for sinister means”. It’s fantastic as a televisual essay, even if that essay repeats bits of his other essays, and occasionally feels like he may be stretching a point or ignoring others so that his narrative threads can stay intact.
Narrates Ben, over the top of some Creative Commons licensed footage:
“This is a short film about a documentary film maker who made critically lauded films for the BBC, and about how, along the way, he proved that style always triumphs over substance. In 1992, a strange and brilliant That’s Life researcher with a Skinny Puppy CD embarked upon a career producing documentaries about how ideas can spark social movements. Adam Curtis believes that 200,000 Guardian readers watching BBC2 can change the world. But this was a fantasy. In fact, he had created the televisual equivalent of a drunken late night Wikipedia page with pretensions to narrative coherence.
Combining archive documentary material with interviews, Curtis filled the gaps by vomiting grainy library footage onto the screen, to a soundtrack of Brian Eno and Nine Inch Nails. He had discovered that it did not matter what footage he used, so long as he changed the shots so bewilderingly fast that the audience didn’t notice the chasm between argument and conclusion. This was especially effective when he simply cut the music mid-bar.”
Riding a bicycle downhill to the studio today – with blue skies all around – really felt like spring arriving. Winter seems to take longer to leave Melbourne than anywhere else in Australia, which is maybe why there are so many visual art events crammed into the wintry months here. Samplers:
This grows nicely each year, transforming lots of shopfronts and buildings in Gertrude st for a week. Above, a nicely mapped facade by Olaf Meyer. There was apparently a pretty good opening night party of projections, which I missed due to projecting elsewhere for the Scattermusic label launch party. Below, a mapped sculptural piece by studio neighbour Kit Webster, alongside a fancy dress store where peering into a camera projected your face onto a shopfront sculpture. (More projection photos).
I went to this because local video artist Lucy Benson, now in Berlin, had a hypnotic piece in it – ’Gotta Sleep now’ – but my camera phone couldn’t really capture her shimmery work. Below, a sculpture that nicely incorporates video and little people. Can’t figure out from the event page who actually made it though; maybe you can. Nice idea for an exhibition, and great to see the different interpretations of the tracks.
Hadn’t even heard of the warehouse venue Nosaj was playing at – Revolt – and arrived at a building crazily decked out with technical and bar infrastructure, including pyramid mapped video sculptures by Kit in the distance. Came complete with a 90s black light chill out room. The Nosaj set was great; the rest of it got a bit wonk-saturated after a while.
Zeal and Time Shield have been steadily honing their AV performances around town, and recently Zeal invited me to do an AV set at Bar Open in support of his three-piece Virtual Proximity (see above). I was quite happy with this set, playing with some ambient music, ocean footage and quartz patches in VDMX. Elsewhere, Sampology came down from the subtropics to do an AV show, and Naysayer and Gilsun more recently launched their new AV set. There be audiovisual things happening. (Often at Racket – first Thursday of each month at Miss Libertines in the city, and Plug N Play – last Thursday of each month at Kent st bar, Fitzroy. )
MÖBIUS from ENESS on Vimeo. This ‘collaborative stop motion sculpture’ was the brainchild of Benjamin Ducroz, an extension of his work with time lapse and physical sculpture – this time using lots of help from public volunteers in rearranging the pieces over and over throughout the day.
Above: a sample of recent projection experiments with triangular screens made from nursery store bamboo, white lycra and gaff tape. After explorations in Sydney, I’ve been keen to continue playing with fragmented screens and composing video throughout a space. This is all made more interesting with the extra flexibility that a triplehead2go graphics card brings ( portions of panoramic output from one laptop to 2 or 3 projectors ), as well as Madmapper for easily and precisely aligning pixels to fit screens / objects / spare wall spaces etc. The Madmapper folk have been releasing an inspiring set of very detailed tutorials too, as well as pretty useful add-ons.
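To make the triplehead2go idea concrete: the laptop renders one very wide panoramic canvas, and the splitter hands each projector an equal horizontal slice of it. A minimal Python sketch of that slicing arithmetic (the helper name and the assumption of evenly divided slices are mine for illustration; this isn’t how Madmapper or the hardware actually expose things):

```python
# Hypothetical helper: given one panoramic canvas, compute the pixel
# region each projector receives when the output is split evenly
# across N displays (the triplehead2go-style arrangement).

def projector_regions(canvas_width, canvas_height, num_projectors):
    """Return a list of (x, y, width, height) tuples, one per projector."""
    slice_width = canvas_width // num_projectors
    return [
        (i * slice_width, 0, slice_width, canvas_height)
        for i in range(num_projectors)
    ]

# e.g. a 3840x1024 panorama split across 3 projectors of 1280x1024 each
for x, y, w, h in projector_regions(3840, 1024, 3):
    print(f"projector slice starting at x={x}: {w}x{h}")
```

Composing for a setup like this mostly means keeping each slice’s content aware of where its physical screen sits in the room; tools like Madmapper then warp each slice onto its surface.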
End result: Lots of fun – and a new set of challenges to deal with. Spatial composition with video is getting easier and easier, and as we outgrow the novelty of seeing buildings lit up / architectural deconstruction by light, there’s such ripe terrain to explore with today’s software. And as the barriers to entry continue to lower, it’ll be the imaginative approaches that prove most successful.
[[ Oh yeah - and that video - not a manifesto for spatial video by any means, just some example snippets from a fun night with the Scattermusic Sound System.. still getting my head around how this can all work well. And there be photos too. ]]
After doing live video for 2 shows there last weekend with the Gotye band, I can add to that list:
- It is a rabbit warren under the sails.
- The salad sandwiches in the green room are very ordinary.
- The elevator under the concert stage is faulty (I was trapped there with a weary tech guy for 5 tense minutes.. )
I got roped in to do live video for Gotye’s tour for his just released Making Mirrors album, which has accompanying animations for most songs. There’s some pretty nice work amongst it – I’ll have to do a follow-up post soon with links to all the animation houses. My work is mostly editing and formatting the clips to suit the main screen and 2 vertical side screens, then, while the band plays, triggering short sections of these clips to ensure the right visual moments are synchronised with the live performance.
Despite an almost comical list of headaches – long fog delays at Melbourne airport, animations arriving at the last minute, software quirks, a compressed set-up time, hardware quirks, that elevator(!) and so on – the first shows of the tour ended up running really well. Having a crack team of musicians (and tech folk) definitely helps in that regard (including Tim Shiel aka ‘Faux Pas’ beside me onstage). Below, the band and my hard-drive covered laptop during sound / vision check at the Opera House.
And the VDMX interface spreading its wings up on the screen briefly during rehearsal.
- Tekkon Kinkreet - fantastic animated film – with accompanying live soundtrack by Plaid (Warp) + Fourplay (strings) + Synergy (robotic rubber limbed percussionists). Really luscious sound, really luscious film.
- Silent Comics – a series of comic panels projected while musicians provide a soundtrack. This included sound foley artists, Captain Beefheart-esque carnival bands, Seekae, Wally from Gotye in splinter-sample mode, and probably nailing it best, Plaid. Great idea for a session.
- Scott McCloud – of ‘Understanding Comics’ (also used as a multimedia bible for explaining media and visual storytelling concepts ) did a great one hour presentation, which harnessed visual support material as effectively as you’d hope a guy like him would. Lots of interesting points, though I found myself laughing at his interface observation: “Why does Tom Cruise need a glove to do all that in Minority Report?”. He also ended with this pretty funny reading of a scrolling comic that involved monkeys mutating into progressively crazier proportions.
- Peter Kuper – aka the guy who did Spy Vs Spy from Mad magazine.
- An assortment of Aussie comic artists doing talks and workshops – including Mandy Ord, Pat Grant and more.
Sadly Robert Crumb wasn’t part of the mix – but I was amused to learn from the Festival organiser about the communication process they had – “Yes, Robert uses email, but that involves….” – his assistant scanning his recent emails, printing the interesting ones, highlighting the relevant bits, cutting those out and putting them in an envelope and mailing them to Robert, who replies on the back with his pen. When he’s around.
So a while ago I interviewed Fernando Llanos, a Mexican artist with a huge catalogue of artworks under his belt. Notably, this included the Videohuahua project – which involved a micro projector strapped to the back of his pet chihuahua. Turned out he was bringing a video blimp to Australia for the Splendour in the Grass festival, and was spending a few days in Melbourne afterwards – so we made plans to meet up.
A couple of days later, I was introduced to Gonzalo who runs the enchanting Magic Lantern Studio ( 155 Brunswick St, Fitzroy, Melbourne ), which is filled with puppets, optical illusions and vintage pre-cinema moving image devices. At some point I noticed he had a few paintings of chihuahuas on the walls, and we got talking about them – and then I mentioned Videohuahua. Gonzalo stared at me, then led me laughing to his computer, where he showed a series of paintings that feature chihuahuas with cameras strapped to their heads.
Inevitably Fernando’s Melbourne visit had to include a trip to Magic Lantern, where it turned out the art and chihuahua anecdotes flew thick and fast ( mostly in fast-forward Spanish). Below, Fernando on the left, Gonzalo on the right, in front of the shop and a painting of a chihuahua with an electric shaver as head. Photographed and blogged, so I can say, no, I am not making this up.
By the end of the week, after much tech configuration, island sampling*, and software wrestling, we’d concocted a work in progress that was deemed seaworthy enough for 3 x 45 minute audiovisual sets during the public exhibition night. During that day the space was filled with people wandering around the inflatable sculpture, cocooned by a generative surround installation busy mutating captured island sounds into new species. Turns out the accumulated ferry rides, nautical rust and winter winds were worth enduring in the end, as the performance seemed to go really well, many of the pieces falling into shape on the very last evening before the event.
For myself, it was very satisfying to have an opportunity to explore video composition in a great setting, and in a more spatial way – using an external graphics card to send a different signal to 3 different projectors simultaneously, using madmapper to position and map the video from each of these, and having the luxury of returning each day to experiment with equipment that was already set-up. And it was super-satisfying to be doing that with…
Jean Poole: spatial video composition and live video manipulation with 3 projectors, vdmx, quartz composer and madmapper. Dan MacKinlay + James Nichols: quadrophonic soundscapes using field recordings, vintage synthesisers and heavily customised SuperCollider patches. (They don’t have much vinyl, but their PhD maths books weigh a tonne.. ) Sarah Harvie: inflatable sculpture, tailor designed for our space with lots of late night industrial sewing machine sweat.
Aside from the audacious setting, part of what made the residency great was the motley collection of artists also spending time on the island, each struggling with their own peculiar set of problems to solve. And it was inspiring to see everyone’s work evolving over the week. This extensive festival review gives a good taste of how the exhibition day unfolded, and these were some of my favourites:
Case Study - This was my pick of the bunch: 6 artists who had the aim of building a new colonial society in their allocated portion of the island, which they built out of everything they brought in their suitcases, as well as using the suitcases themselves to build individual artist houses. There were telescopes and projected moons, ornate water features, mossy forests growing from open suitcases and test tubes, every step a new photogenic overload.
Younes Bachir and Strings Attached got the jaw-drop-spectacle medal – with their meat-suits, paint-splashy aerial choreography ( imagine a dozen people 4 storeys up dynamically moving about in space ) and flair in abundance. ( This gives a good taste of why it was so exciting.. )
Brad Miller’s Data_shadow video installation was super-slick, an exploration of memory, technology and how lusciously you can make a database of photographs and video wander across 4 screens with motion detection cues from visitors. Biljana Jancic’s wooden boxed shafts of light played beautifully with the smoke machines, silhouettes and the industrial space, and SWANBRERO used inflatable car sales dancers to great effect in their piece, INFLATE MY HEART WITH 1000 GUSHES OF WIND.