Higgs Boson Explained in a Cake Recipe

Thank you, @fustar! (+ fustar.info).

“It’s like zoologists trying to determine if they’ve discovered a new species of butterfly by looking at a meadow through binoculars. From 16 miles away.”

From the above Guardian article, I learnt that all the matter we can see in the universe accounts for only 4% of the total, and the Higgs boson might help us figure out the other 96%. That seems odd in a Douglas Adams kind of way, that we’re so blindly adrift. And he would’ve been thrilled today if he were still around. Though he’s probably flicking someone with a towel somewhere inside that 96% we can’t see.

The Higgs boson, the Guardian also explains, is the force-carrying particle that helps transmit the effects of the Higgs field. And the Higgs field? A kind of cosmic “treacle” spread through the universe. As particles travel through the treacle they slow down, lose energy and get heavier. Kind of like cake mixing, really.

And so, a significant day for science, a landmark that’ll be looked back on for decades and centuries to come, apparently.

And then the next thing in my feed floors me, full-screen… I don’t mean to get all Californian on your ass, but these are some of the thickest slow-motion barrels I’ve ever seen anyone surf through – we’re talking some serious overhanging slabs of ocean here. We’re talking some serious mass. Try keeping your jaw up at 2:29, 3:33, 4:29, 5:23… Physics, eh?

[[That was by Chris Bryan. Also check out his 12-minute show-reel, shot with the same super-slo-mo Phantom cameras (in an underwater housing).]]

by j p, July 5, 2012 0 comments

Amon Tobin Taxed: ISAM AV in Melbourne

[Image: Amon Tobin’s ISAM performance in Melbourne]

Apologies if you’ve arrived in search of Tobin Tax discussions (a tax on market speculation proposed by Nobel laureate James Tobin as a way of managing exchange-rate volatility). Here we have only a murky swamp of audiovisual performance questions, all of them generated by Amon Tobin’s recent performance of ISAM in Melbourne.

ISAM?
– An album, and a projection-mapped audiovisual extravaganza, premiered at Mutek, 2011.
– Lengthy behind-the-scenes article, including video and storyboard examples, over at the blog of TouchDesigner, the software used to run the show.
– Pixel creation by V Squared Labs and Leviathan, production design by Alex Lazarus, stage and set design by Vitamotus.
– And yes, the stage set was designed to fit within a few inches of the biggest possible travel container. Precision-mapped; video glimpse.

Ramblings / Rants / Observations

1. First up – some congratulations are in order – ISAM’s a stunning and well-fleshed-out achievement, raising the technical benchmark for live audiovisual shows. And perhaps because of that, the next 22 points are an assortment of thoughts the show triggered, during and after. What is this live audiovisual thing, anyway?

2. Some of my favourite bits were the more imaginative transitions – the projected video morphing between different content, while simultaneously transforming the type of perspective being overlaid on the cubes (e.g. shifting from a scene where the three-dimensionality of the cubes was being emphasised – each surface mapped as the walls / textures of a spaceship – to a zooming morph that ‘flattened’ everything into more of a cinema screen, showing a spaceship shrinking away into the distance). I remember thinking these bits appealed because they were moments where imagination seemed more dominant than visual plug-ins or render-farm hours.

3. How much of this was real-time, and how much was choreographed? Does this matter? Paging Toby. From my vantage point, it was clear that the visual director / technician / booth operator was doing very little behind his console for many of the tracks, arms literally folded as he leaned against the desk.

4. “It’s like we’re watching something that happened 12 months ago,” said a friend, referencing the online saturation of ISAM over the last year – and perhaps that the visual creativity was mostly spent long ago in pre-production, rather than in the live arena.

5. If you were looking at the audience for most of the gig, it would’ve been hard to tell whether they were at a screening or a concert. Does this matter?

6. After the visual avalanche, the outside world seemed more vivid… and on several occasions while riding my bicycle home afterwards, I found myself thinking: this is way more beautiful than the show. The crisp, vivid silhouettes of spotlit cathedral architecture (flanked by fluffy night clouds); later, a golden second-floor window streaming light onto a bunch of autumn leaves, composited within a mostly pitch-black sky; and, while crossing train tracks, looking right and catching bright red train lights reflected in the curve of shiny metal rails, glistening lines in the dark. ISAM is visually exquisite – but something bugged me, and it’s possibly the complete emphasis on synthesised environments, which always seek to simulate but can never quite get there. A lot of the charm of Amon Tobin’s music comes from the way he uses field recordings and everyday sounds as part of his heavy digital processing – I wonder why the visual aspects weren’t considered the same way?

7. The window reveals of Amon inside the structure worked great, the lighting perfectly compositing him within the overall picture. Where and how else could rear lighting have been used playfully with the structure?

8. The music! Oh yes. Almost forgot. Periodically I closed my eyes, and alternately felt the music was less – or more – interesting without the visuals. Lots of the set featured luscious sound, but after a while the individual compositions struggled to stand out from the rest of the set. The bass-heavier finale hinted at directions it could’ve gone in, and friends mentioned that Amon’s after-show DJ set was musically much better.

9. I wonder how modular / flexible their structure and overall software system is – how easily could they re-fit / re-model the shapes for a different space?

10. What would the combined creative crew do differently if they were starting this project from scratch again? How much could their combined system be considered a platform / foundation for a more flexible / organic approach next time?

11. If it did feel a little bit like early cinema viewings, where the audience swoons at a train coming at the camera, where will the process of video-mapped surfaces go from here?

12. Are we starting to see an increasing divide between what’s possible with a few clever friends with some imaginatively deployed DIY tools – and larger scaled spectacles with matching budgets and limitations?

13. What’s with the overdose of mechanical sci-fi imagery? Galactic Giger transmissions might be contributing to the oversaturation, but surely there’s room for a projection-mapping palette that expands beyond a Quake skin, to aesthetics that might include organic shapes, lifeforms, or even characters (whether narrative-driven or not)? Not everyone has to be Gangpol Und Mit (1000-people band, yo!), but other spatial composition palettes are possible!

14. Yes, there was more than visual-machine-lust; we also got geometric deconstruction – sophisticated, even – but still, there it was, almost like a pop-up logo for a software plug-in: the virtual form outlined, crumbled and rebuilt. What other ways can we play with spatial composition?

[Image: Amon Tobin performing ISAM in Melbourne, 2012]

15. I was definitely transported at times, enchanted.

16. DJ as cloud? The Kinect bit was effective, transforming the stage cubes with a swarm of Amon dots as he manipulated his DJ-booth gear. It was kinda arbitrary though – the equivalent of stepping on a giant FX pedal during a song and letting it utterly dominate the composition. Which is fine… but yes, trains coming towards the camera. Great effect – how could it be integrated meaningfully?

17. The audience was often quiet, near motionless. Weird for a music gig by an esteemed producer. The constant nearby soundtrack? People laughing in excitable disbelief (kind of a cough-laugh), as though they were watching a sideshow circus of things that possibly shouldn’t be happening.

18. Overheard outside: “I think Melbourne is now out of weed.”

19. No matter the cost of the algorithm or the render-farm rental, an object artificially shattering into pieces still looks like ’90s desktop publishing. Or Lawnmower Man. If you’re contractually bound to deconstruct, stylise it to some interesting degree, or show us some actual ruins, or some hybrid of your GoPro kite in Baghdad and your abstract shape plug-in of the moment.

20. So you’ve watched / listened to ISAM in Montreal, Melbourne and Manchester. How different were those shows? How different could they be?

21. Was that a pair of stacked Christie projectors?

22. Who knew the Palace in Melbourne went 4 levels high? New to me, and quite the vantage point: from up there the cubes had a much more pronounced 3D effect, while from the floor it all felt much more like a flattened cinema screen than something three-dimensional.

23. I was impressed by ISAM rather than seduced by it. It’s undeniably an accomplished show and it deserves the praise, but I wonder if it left other live-video-heads with similarly mixed feelings about what was sacrificed as part of the production upscaling.

Totally curious what other folks thought.

PS. I did a phone interview with Amon a long time ago, and he seems like a lovely guy.

 

by j p, June 26, 2012 5 comments

Animation for Attack of the Cats by Sampology

Audiovisual turntablist Sampology recently commissioned a music video for the interlude track on his upcoming album. The catch: it had to be 100% animated cats.

IF Felinology = the study of cats and their anatomy,
THEN: GIF-felinology = kind of like the torture scene from A Clockwork Orange, where eyes are propped open by pins… and as a piercing cat MIDI loop blares, an avalanche of cat Tumblr feeds scratches away at the poor researcher’s eyes, until nothing remains but twitching and furballs.

I survived though – and am now a semi-professional animated-cat-GIF expert (contact me for rates). Catch the clip below (or over at the projects page, where I now archive + document my video activities), or see it on the big screen during Sampology’s upcoming Apocalyptic AV tour.

Next up: more apocalypse – preparing ‘post-apocalyptic visual backdrops’ for the upcoming TZU tour…

Elsewhere: holy felines, batman!

by j p, April 19, 2012 0 comments

Learning With Quartz Part 4: 3D Objects With Video Textures in VDMX

[Image: Quartz Composer dancing with Syphon Recorder and alpha channels]

Warning: playing around long enough with 2D billboards and sprites (to display images) inside Quartz Composer will eventually lead you to a third dimension. Fear not – inept explorers before you have survived, and they’ve even left some string at the edge of the cave:

Playing with 3D in QC

1. Type ‘clear’ into the library window, then drag and drop a ‘Clear’ patch onto your editor window. This creates a blank background; choose its colour in the inspector.

2. Drag a ‘Cube’ onto your window too.
Drag a movie from your hard drive into your window, and attach its image output to one of the image inputs of the cube.
Play with the rotation and X / Y / Z values of the cube in the inspector, and you should see video on one side of the cube.
(Tip – when adjusting Quartz parameters, hold shift to adjust at 10x speed, helpful for quicker compositing.)
You now have video playing on a 3D object in QC. Woo!

To easily put video on all surfaces, create an input splitter by right-clicking on the cube: select ‘Insert Input Splitter’ and choose the cube image input you’ve been using. Then drag from the output of the input splitter to as many of the cube’s image inputs as you wish.

3. For easier manipulation of 3D space, try using a trackball. Drag one in from the library. Click and drag to select your cube(s) and movie files, then press command + X to cut them, leaving the clear patch. Double-click the Trackball patch to go inside it, and paste (command + V) your cubes here. Now try clicking and dragging on your QC viewer window – the trackball enables this more intuitive navigation / perspective alignment. (Click ‘Edit Parent’ in the editor window to go back to your upper-layer patches at any time.)

(Similarly, the 3D Transform patch can be used for easier control of 3D space. For example, placing several 3D objects inside a 3D Transform allows easy perspective adjustment of the whole scene by changing the 3D Transform’s parameters.)

4. To add complexity, drag a ‘Replicate in Space’ patch from the library into your editor window and place it beside your cube. Again, select your cube(s) + movie(s) and cut them. Double-click to go inside the Replicate patch, then paste them there. Play around with the Replicate parameters and watch the results.
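If you’d rather automate those parameters than scrub them by hand, QC’s built-in JavaScript patch can drive them. A minimal sketch, assuming you drag a ‘JavaScript’ patch in from the library, feed a ‘Patch Time’ patch into timeInput, and wire rotationOut to the cube’s Y Rotation input (the port names here are my own, not anything pre-defined):

    // Quartz Composer JavaScript patch: spin the cube from patch time.
    // 'speed' is revolutions per second; publish it to tweak it from a slider.
    function (__number rotationOut) main (__number timeInput, __number speed)
    {
        var result = new Object();
        // 360 degrees per revolution, wrapped so the value stays bounded.
        result.rotationOut = (timeInput * speed * 360) % 360;
        return result;
    }

Save, and the cube spins by itself – no slider-scrubbing required.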

[Image: Quartz Composer and 3D]

3D experiment quartz patch (update the movie location to match a video file of yours, or delete the included movie link and drag a movie into the editor)

Playing with your 3D QC file inside VDMX

Option 1: Drop it into the media bin, and it should play back the 3D model with its pre-connected movie.

Option 2: To use the QC file as an effect – allowing any clip being played in VDMX to appear on the cube – we need to do a few things:

– So that the video input is user-defined rather than pre-defined, the root macro patch needs an input splitter of type ‘image’, set up to receive the incoming video stream.

– To enable any of the QC parameters to be adjusted live inside VDMX, we need to publish the relevant inputs in the QC patch. This is called ‘publishing an input’ in Quartz (and enables that QC parameter to be adjusted within VDMX – see instructions). However – slight complication – inputs need to be published in the topmost layer of Quartz (i.e. the root macro patch) to be accessible in VDMX, so if you’ve published an input within a subpatch of your main patch, it won’t show up. To solve this, publish the input at the layer of QC you’re in (e.g. inside a Replicate in Space patch), then go up one level – the published input will now appear listed on that patch (e.g. the Replicate in Space). Right-click to publish it again, and it will appear on the next patch up, and so on, until it reaches the root macro patch.

– Save the QC file within your VDMX QC FX folder. Select it from assets (if needed, refresh via VDMX > prefs > User paths). Whatever VDMX clip is playing will now be composited onto the cubes. Dude! Oh, and click-dragging in the VDMX preview window works for the trackball navigation, the same way it does within QC.

3D experiment quartz patch for VDMX (drop into the VDMX QC FX folder; change parameters in this QC patch, save under a new name in the same folder, and you’ve got as many new 3D compositing tools as you want).

Playing With 3D Models

Download and install Vade’s v002 Model Loader – which allows you to “load 35+ 3D model formats into Quartz Composer, including animation data.”

Drag the v002 Model Loader into your editor. For ‘model path’, enter the address of a 3D model. (Drag a 3D model into the editor, click it, select its path in the inspector, then copy and paste it into the v002 patch.)

Connect an image or video to the v002 Model Loader’s ‘image’ input to texture your model.

Read the included notes for more fun – including models with embedded 3D animation parameters.

Recording Your QC Experiments

Install Syphon Recorder.
Install the Syphon for Quartz Composer plug-in.
Put a ‘Syphon Server’ patch into your top-level Quartz editor window. To use the OpenGL scene as the source, set this patch to be the last (top) layer rendered (i.e. click on the right of your Syphon Server box and ensure its number is higher than any other layer’s). This makes the Quartz output available to Syphon Recorder.

Open Syphon Recorder. Your Quartz video should already be visible. Adjust recording preferences to suit, and hit record. It handles 1080p recordings fine on this 3-year-old MacBook Pro, and it even records alpha channels (i.e. you can record audio-responsive 3D models on a transparent background, for easy compositing later into the likes of After Effects).
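Speaking of audio-responsive models – QC’s ‘Audio Input’ patch gives you a live volume level, and the same JavaScript patch trick can smooth that level before it drives your geometry. A rough sketch under my own assumptions (the port names, the smoothing approach, and that QC keeps globals alive between frames):

    // Smooth a jumpy audio level so scale changes don't jitter.
    var smoothed = 0; // assumed to persist between executions of main()

    function (__number scaleOut) main (__number audioLevel, __number smoothing)
    {
        // Exponential smoothing: 'smoothing' near 1 = lazy, near 0 = twitchy.
        smoothed = smoothed * smoothing + audioLevel * (1 - smoothing);
        var result = new Object();
        result.scaleOut = 1 + smoothed * 2; // base scale of 1, pumped by the music
        return result;
    }

Wire the volume output into audioLevel and scaleOut into something like a 3D Transform’s scale, and your model breathes with the track.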

(See also: QTZ Rendang – free software for rendering out Quartz patches. Potentially useful for non-real-time rendering of intensive patches that play back slowly. Haven’t tested it yet though.)

Special shout-outs:

– To Vade, for providing the v002 Model Loader, Syphon and Syphon Recorder, which make a lot of the above possible.

– To VJ Shakinda (co-author of a forthcoming book on QC with @momothemonster) for his trilogy of 3D tutorials on YouTube, which really helped me get to grips with 3D in QC. If you found this stringy guide helpful, wait until you’re bungee jumping with this fella:


Above: 3D objects moving to music in 4 minutes. Also tasty: 3D beat-reactive scenes – part 1 / part 2 / part 3.

Previously on Breaking Bad:

Learning With Quartz Part 3: DIY Anchor Rotation FX for VDMX

Learning Quartz Composer Part 2

Learning Quartz Composer Part 1

by j p, April 16, 2012 0 comments

Weirdcore, Aphex Twin + Die Antwoord in Melbourne

[Image: Aphex Twin at the Future Music Festival, with Weirdcore visuals]

Aphex Twin played in Melbourne recently, which also meant a chance to catch the pixel mutations of his regular tour VJ, Weirdcore. After a summer of stage screens saturated with glossy, void-of-personality motion-graphics templates, it was refreshing to catch live tour visuals that were ambitious, sophisticated and raw – very obviously being generated and manipulated live, mistakes and all. There probably aren’t many other ways to approach a set for Aphex, and apparently he does improvise a different set every single time.

Below, a diagram from Weirdcore’s artist profile at vidvox.net, explaining his video set-up**.

“With additional FX programming from Andrew Benson, Flight404, and Vade, this is one of the wildest tour setups we’ve seen in a while… but you wouldn’t expect anything less for the world’s most known electronic musician. Pitchfork may have said it best: ‘First, we can’t really talk about anything until we talk about the visuals.’”

[Image: Weirdcore’s Aphex Twin video set-up diagram]

[[ **”UPDATE: Weirdcore mentions that diagram is now dated (was for his 2011 set-up), and he intends to shift everything to jitter, with one computer direct to LEDs, much simpler and less likely to fuck up.” ]]


Above: 1 x Richard D. James + 1 x Kinect, flanked by LED screens of the face-replaced crowd at the Palace, Melbourne.

Below: the Kinect in action at the Future Music Festival (bonus Melbourne software plug: kinectar.org).


Aside from the general pixel mangling and the fast, fluid Weirdcore style, I was curious to see how effective the live face-replacing would be (snippets of it have been glimpsed online in live Aphex shows for a couple of years now). The software and camera set-up seemed to take a while to tune, but when it locked in the effect was mesmerising, updating fast enough to cope with cameras panning across a boisterous crowd, all while being relentlessly modified and further manipulated.

The face-tracking was also put to work in a section of the show that’s customised for each location. Weirdcore had asked for feedback on the list of Aussie celebrities he’d compiled, so I threw him a few more names, including 1 x Wally. Crowd reactions varied, but were probably loudest for the photo below of Julia Gillard – and Weirdcore mentioned later that, for whatever reason, and unlike in most other countries, the response at Australian gigs had been loudest for the politicians.

[Image: Gotye in the face-replacement visuals, Melbourne]

[Image: Julia Gillard in the face-replacement visuals, Melbourne]

Aphex + Die Antwoord Live

[Image: Die Antwoord joining Aphex Twin onstage in Melbourne]

[[ Die Antwoord? If you need a crash course in South African cartoon-rave-gangsterism, spend 15 minutes of your life inside the short film Umshini Wam, made with Harmony Korine (Gummo, Kids, etc). ]]

If the Aphex Palace gig wasn’t overloaded enough, three-quarters through the set – on top of a cacophony of Aphex – Die Antwoord burst out from backstage in orange fluoro suits and rapid-fire a couple of tracks, one of which has Ninja rapping from above the crowd he has stage-dived into. There are almost enough camera phones in the air for Ninja to get back onstage by walking across a bridge made of gadgets. Another high point of weirdness is reached when Ninja + Yo-Landi rasp ‘Aussie, Aussie, Aussie’ in their South African accents, and the crowd eats it up, barking an enthusiastic ‘Oi, Oi, Oi!’ back. Somehow it all makes sense, including the idea that future Die Antwoord videos are to be made by Weirdcore and later by Chris Cunningham. One extended global mutant franchise. And yes, after Die Antwoord depart, Aphex still has plenty in reserve (as do the lighting and laser operators), so by the time it’s done, we can only depart exhausted.

**

Thanks to Weirdcore and his tech companion Fede for taking some tour time out to meet up and chat about pixels.

Be sure to check out his video projects, including works for MIA, Chuck D, Cassette Playa + Simian Mobile Disco, and this Weirdcore video interview at the Creators Project.

(Thanks also to juanjocruz for letting me use his zoomed-in photos, the best on Flickr for the Melbourne show.)

 

by j p, April 10, 2012 0 comments