Interview with Hugh Macdonald

Written by Jack Binks.

hughmacdonald headshot

nvizible-logo

(CO-WRITTEN BY BEN MINALL)

We're here with Hugh Macdonald of Nvizible to talk about pipelines, their use of Nuke, custom development, the thrills and spills of open sourcing tools, relighting, particles and more. So kick off a render, grab a cuppa and let's get stuck in.

Tell us a little bit about yourself, Hugh.

Well, I studied computer science at Bristol, graduating in '03 and having taken as many modules as I could in graphics topics. I was lucky because one of my lecturers was Alan Chalmers, former VP of Siggraph, and as a result Bristol had the only UK-based Siggraph chapter. His focus was definitely more on simulating the real world when it came to rendering, but my interest leant more towards learning how to fake something to get it looking good; something more appropriate to the post world.

At the time there was a dearth of good forums dedicated to compositing - we were mainly dealt with as an afterthought compositing sub-forum on CG-centric sites such as CGTalk (as CGSociety was known at the time). As a result Paul Moran and I set up a comp-centric forum - VFXTalk. The site took off beyond both of our expectations, mainly, I think, thanks to Aruna adopting the forums, lending a professional's viewpoint and really leading a generation of students into the industry. Paul ended up heading off to pastures new and the site was sold off in '07, after which I slowly drifted away from the community.

When I left university I had the idea that I either wanted to be an editor or a compositor, and managed to land myself a role as a runner at Skaramoosh. I was paired with a couple of other new starters, both of whom wanted to be editors, so I took the cynical choice and decided to get myself up to speed on the compositing side of things. Up until that time I'd mainly just played with Fusion and AE, but I was fortunate enough to end up sat between two very talented supes in the evenings, using the newly-acquired-by-Apple Shake. As I got more involved I started to also code up tools, using my uni degree knowledge, to help accomplish things that were otherwise boring or difficult to achieve in vanilla Shake, and before long I was taken on as a junior comper.

As that show wound down I applied for junior comper roles at the big 4 in London, plus a few other places, not really aware that junior generally stood for roto prep. MPC called back, instead offering pipeline dev, which I took, though by fortunate happenstance I'd made the point up front that I wanted to be in comp rather than dev in the long run. I ended up working with Damien Fagnou, who handled the 3D pipeline tools side of things whilst I dealt with 2D. Before long I landed a role in comp, kicking off with Amazing Grace.

After a few shows there I moved on to Rainmaker, later acquired by CIS and shut down not long after. During that period I started supeing shows, including Slumdog, and by the time CIS closed its London doors I'd built up some really good contacts.

At the time CIS closed, the UK, and indeed most of the world, was in the grip of the financial crisis and jobs were scarce. Instead, in October '08, I joined up with a number of colleagues, including experienced VFX supervisor Paddy Eason, to form Stranger FX, a cooperative group of freelancers. We managed to pick up a number of shows, including a couple initially slated for CIS, working overall on features such as Fantastic Mr Fox, Malice in Wonderland and Tormented. We then met Nic Hatch and Martin Chamney, who had been running Nvizage for 8 years, and created Nvizible, which seemed like a natural pairing as they were strong in 3D and we were strong in 2D.

Since then we've really gone from strength to strength, starting off taking on some outsourced work from the big 4 and moving towards running as primary vendor on a range of shows. I've carried on supervising and have recently worked on John Carter, Frankenweenie and Skyfall, and am currently working on Kick-Ass 2. On top of that I've been largely responsible for our 2D pipeline and a lot of our techy tools at Nvizible.

Skyfall: before and after (© Nvizible).

Frankenweenie: before and after.

How would you describe your pipeline?

It's really strong for our size of company (although I guess I'm bound to say that, being the guy who's written most of it!). I've worked in a bunch of small to mid-sized places and none of them have had quite the level of integration and robustness that we've got here.

We employ Shotgun as our backbone production tracking system, with Nuke (of course!) hung off it, alongside Maya, RV and RVIO. In Nuke we've completely replaced the Write node with a custom op, to ensure that naming and facility conventions are enforced when rendering. I read a good article recently that said even with top-notch people and the best will in the world, if you leave things to chance then 10% of the time they'll likely go wrong. I want artists to be able to focus on getting shots looking good - the creative side of things - and not have to worry about data management and so on. This gets rid of that risk, and we do the same thing for the remainder of our pipelined tools. We've yet to do the same for Read ops, which are lower risk, and they're next on the list.
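
(ed: as a rough illustration of the idea - not Nvizible's actual tool - a wrapped Write might build its output path from pipeline context so nobody types paths by hand. The template and context fields below are assumptions.)

```python
# Hypothetical sketch of a pipeline-controlled Write node; the path template
# and context fields are made up for illustration.
import nuke

# Facility-style path template (assumed) - %04d is the frame padding Nuke expects
PATH_TEMPLATE = ("/jobs/{show}/{shot}/renders/{task}/"
                 "v{version:03d}/{shot}_{task}_v{version:03d}.%04d.exr")


def create_pipeline_write(context):
    """Create a Write node whose output path follows the facility convention."""
    path = PATH_TEMPLATE.format(**context)
    write = nuke.nodes.Write(name="PipelineWrite")
    write["file"].setValue(path)
    write["file_type"].setValue("exr")
    write["file"].setEnabled(False)  # artists can't hand-edit the path
    return write


# In production the context would come from Shotgun rather than being hard-coded
create_pipeline_write({"show": "demo", "shot": "sc010_0040", "task": "comp", "version": 3})
```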

How do you find developing for Nuke vs Shake?

The best bit is that development for Nuke doesn't necessarily involve C++, whereas for Shake it generally did. Now I use a combination of prototyping with gizmos and Python, and only really go for C++ when I have hardcore image or 3D manipulation to do.

Internally we have a huge swathe of Python code which comprises our pipeline (plus a little Mu for RV), and I've even open sourced some tools on the Nvizible GitHub account.

How do you find open sourcing facility tools?

Well, the main Nuke one up there at the minute is the NukeExternalControl wrapper, which we have running on our facility machines but don't rely on for pipeline-type tasks. It allows you to run Python scripts in either pre-existing Nuke sessions or newly launched ones, kind of like being able to 'import nuke' in an external Python interpreter. You can even run scripts across the local network on other machines, which is great fun for randomly hijacking someone's Nuke session and messing around with their current script!
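
(ed: the real module handles connections and serialisation properly, but the underlying idea can be sketched as a toy command server inside the Nuke session - this is an illustration of the concept, not the NukeExternalControl API.)

```python
# Toy illustration of remote-controlling a live Nuke session; the port and
# wire format are arbitrary, and the real NukeExternalControl is far more robust.
import socket
import threading

import nuke


def _serve(port=55555):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("0.0.0.0", port))
    sock.listen(1)
    while True:
        conn, _addr = sock.accept()
        expr = conn.recv(4096).decode("utf-8")
        try:
            # Nuke's Python API has to be touched from the main thread
            result = nuke.executeInMainThreadWithResult(eval, args=(expr, {"nuke": nuke}))
        except Exception as exc:
            result = "ERROR: %s" % exc
        conn.sendall(str(result).encode("utf-8"))
        conn.close()


server = threading.Thread(target=_serve)
server.daemon = True   # don't keep Nuke open on quit
server.start()

# From an external Python shell you could then do something like:
#   import socket
#   s = socket.create_connection(("workstation01", 55555))
#   s.sendall(b"nuke.root().name()")
#   print(s.recv(4096))
```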

Open sourcing that was actually very rewarding - when I've open sourced tools in the past it's been rare to get much in the way of contributions back, but with NukeExternalControl, Nathan Rusch of Luma Pictures contributed back and improved the functionality no end - thanks Nathan!

As for general open sourcing, I'm definitely pro it if the tools in question aren't of significant competitive advantage to us. You tend to find that larger houses have already implemented the sort of things that we'd put up, which is great as it means that those who get the most benefit are the freelancers, students and small facilities. For example, one thing I've been planning on releasing for a long time is a set of relight tools (they just need documentation!).

Sounds great, what do they involve?

So it's a toolset I've been working on for a while to offer a whole range of relighting functionality. Obviously the Nuke Relight node was promoted from the 'other' menu a while back, but these offer some additional bits and bobs, including reflections and refractions, and are quite atomic by comparison to Nuke's built-in monolithic node.

The core plug-in takes three main inputs: a per-pixel position pass, a per-pixel normal pass and a camera position (a lot of people think the camera direction is important in this case, but it's not). It then generates:

- Surface-to-camera vector pass
- Reflection vector pass (i.e. the camera-to-surface vector bounced based on the normal)
- Refraction angle
- Fresnel pass

For the last two I also played with front and back surface bounce options, but since you don't generally have back-surface normals the result looks a bit funky, so I stuck with single-surface bouncing.
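
(ed: the maths behind those passes is simple enough to sketch; the following is a rough numpy illustration of the idea rather than the plug-in itself.)

```python
# (ed) Rough numpy illustration - not the plug-in itself. Given a per-pixel
# position pass P, normal pass N (both h x w x 3) and a camera position C,
# build the view, reflection and Fresnel passes; refraction would follow from
# Snell's law using the same vectors.
import numpy as np


def relight_passes(P, N, C, ior=1.45):
    N = N / np.linalg.norm(N, axis=-1, keepdims=True)

    # Surface-to-camera (view) vector pass
    V = C - P
    V = V / np.linalg.norm(V, axis=-1, keepdims=True)

    # Reflection of the camera-to-surface ray about the normal
    I = -V
    R = I - 2.0 * np.sum(I * N, axis=-1, keepdims=True) * N

    # Fresnel pass via the Schlick approximation
    f0 = ((1.0 - ior) / (1.0 + ior)) ** 2
    cos_theta = np.clip(np.sum(V * N, axis=-1, keepdims=True), 0.0, 1.0)
    fresnel = f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

    return V, R, fresnel
```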

These passes then feed into the light node, which takes them and relights based on the lights specified. I've also been playing with a light shader node where you can specify a custom shader to get a whole range of surface properties.
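
(ed: and a minimal version of the relighting step itself - a single point light with Lambert shading and inverse-square falloff; again just a sketch, the real light node does considerably more.)

```python
# (ed) Minimal point-light relight from the same position/normal passes;
# light parameters are purely illustrative.
import numpy as np


def relight_diffuse(P, N, light_pos, light_color, intensity=1.0):
    N = N / np.linalg.norm(N, axis=-1, keepdims=True)
    L = np.asarray(light_pos) - P
    dist = np.linalg.norm(L, axis=-1, keepdims=True)
    L = L / dist
    lambert = np.clip(np.sum(N * L, axis=-1, keepdims=True), 0.0, None)
    falloff = 1.0 / np.maximum(dist * dist, 1e-6)   # inverse-square falloff
    return lambert * falloff * intensity * np.asarray(light_color)
```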

This is one fun test that I made in Nuke using my relight tools. The only external data used in each of these is the differing light probe images (from here); other than that, they are all passed through exactly the same script and everything is generated inside Nuke. It's a test I initially set up a few years ago and have been tweaking on and off since then. The basic geometry is generated inside Nuke, and the reflections, refractions and relighting are then generated using my relight tools.

On top of that we also have some other useful related tools, such as:

- Mattes from a position pass, based on a designated position, falloffs, etc. (see the sketch after this list)
- Our own position-to-points node, so proxies for the geometry can be seen in the 3D preview
- A tool to bake three-channel vector passes down to two-channel lat-long mattes
- A generic vector operations node, for pulling cross and dot products on arbitrary passes, for example
- A Read add-on which allows a Nuke Camera to be created from PRMan metadata written into the EXR file
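
(ed: the position-matte idea from the first item is easy to picture - distance from a chosen world-space point pushed through a soft falloff. An illustration only, not the actual tool.)

```python
# (ed) Illustration only: a soft spherical matte pulled from a position pass.
import numpy as np


def position_matte(P, centre, radius, falloff=1.0):
    """Return a 0-1 matte from a position pass P of shape (h, w, 3)."""
    dist = np.linalg.norm(P - np.asarray(centre), axis=-1)
    matte = 1.0 - (dist - radius) / max(falloff, 1e-6)
    return np.clip(matte, 0.0, 1.0)
```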

Interesting; where else do you employ metadata in your pipeline?

We use a lot from the Arri Alexa, particularly since, depending on the body and lens, you even get data from set like tilt, roll, focal distance and the like. For example, we have a tool to automatically match the zoom on fg plates to bg plates, so that it's technically accurate, not just eyeballed. That said, the sampling rate on things like tilt and roll is pretty low - once every 4 frames in my tests - which is a bit of a shame since you can't use that data for stabilisation.
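
(ed: the zoom-matching trick boils down to reading lens metadata off both Reads and driving a scale from the ratio of focal lengths, roughly as below. The node names and metadata key are placeholders - the exact keys vary by camera and firmware.)

```python
# (ed) Sketch of focal-length-based zoom matching; node names and the metadata
# key are placeholders.
import nuke

fg = nuke.toNode("Read_FG")
bg = nuke.toNode("Read_BG")

fg_focal = float(fg.metadata().get("exr/arri/lens/focalLength", 0) or 0)
bg_focal = float(bg.metadata().get("exr/arri/lens/focalLength", 0) or 0)

if fg_focal and bg_focal:
    # Same body/sensor assumed, so the scale is just the focal-length ratio
    xform = nuke.nodes.Transform(inputs=[fg])
    xform["scale"].setValue(bg_focal / fg_focal)
```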

To preserve this data down the pipeline we pull a number of tricks - for example we have to bake to DPX to get the colourspace metadata we subsequently write to EXR headers, using a combo of the Alexa tools and RVIO. We also track shot info and slates back against our Shotgun deployment to make shoot information available at all stages of the pipeline.
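
(ed: inside Nuke, the simplest part of this is just making sure metadata survives the render - for instance reading a key off the source and asking the Write node to keep everything in the EXR header. The key and node names below are assumptions; the DPX/RVIO leg of the trick happens outside Nuke.)

```python
# (ed) Sketch: keep incoming metadata in the rendered EXR. Key name is illustrative.
import nuke

read = nuke.toNode("Read1")
colourspace = read.metadata().get("dpx/colorimetric", "unknown")

write = nuke.toNode("Write1")
write["metadata"].setValue("all metadata")   # carry custom keys into the EXR header
nuke.tprint("Carrying colourspace tag through: %s" % colourspace)
```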

In the future we also want to connect to our sister company Ncam's real-time on-set tracking technology, which allows live previs of set extensions, CG elements and environments. The guys are currently looking at inserting metadata into the Arri streams, which, if they succeed, we'll be able to pull out at the post stage, hopefully allowing you to create matching cameras with a single click and so minimising the need for matchmoving. At the minute they're using the system on Roland Emmerich's new feature White House Down, as well as a bunch of other sets - there seems to be a whole lot of interest!

Do you see this kind of virtual production-esque on-set live previs work as the way of the future?

I think so, yes. We've used it before, in more limited ways, for example on the set of Hugo with Nvizage. There we were able to give Rob (Legato), as well as Scorsese, a preview of how the shot would look with a variety of post work done, which helped make decisions on set. Back then, though, we had to do a whole bunch of manual measurement work, whereas with the Ncam system you simply sit your camera on a dedicated bed, between it and the main mount, and it's able to provide real-time tracking information regardless of whether the shot is crane, Steadicam or tracking based.

So what other cool stuff have you done with Nuke?

As a smaller shop we often end up doing more in Nuke than our larger neighbours. For example, where they are likely to kick a shot back for 3D FX work, we often try to complete it later in the pipeline. A good example was one of the shots we were commissioned for on John Carter. There's a shot of Carter looking down at a semi-transparent surface covered in dust, and a reverse from underneath the glass where the dust has magically removed itself. Carter then wipes across the glass with his arm to clear the non-existent dust away.

John Carter Nuke Particles Breakdown from Nukepedia on Vimeo.

particle line force ui
particle line force curves ui

Now 3D FX work could obviously have done the job, however there were no additional plates shot for it, and Nuke 6.3 had recently been released with the particle toolset included, so we decided to do it in there. I dropped a plate into the 3D system, eyeballed it to match the glass height, and area-emitted a whole batch of particles across it - all good.

To get the arm movement I initially tried chaining around 20 particle point forces together, however that proved a royal pain in terms of managing all the different nodes, as well as not quite giving the look I was after, with particles smearing or slipping through between the areas covered. Instead, I quickly knocked together a line particle force node, which looked pretty much identical to the point force in the UI (screen grabs to the right), but with two points to define the line in question. It also had the ability to adjust the strength, radius and probability using a curve along the line from start to end, so one end of the line could have a stronger force than the other, or there could be a part of the curve that deliberately didn't affect 50% of the points. I then tracked points on the arm, projected them onto the 3D plane to get positions, tied those to the two line forces, and it was pretty much good to go.
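
(ed: the guts of such a line force are easy enough to sketch - project the particle onto the segment, look up the curves at that parametric position and push the particle away. This is a reconstruction of the idea, not the production node, and leaves out the probability curve for brevity.)

```python
# (ed) Reconstruction of the line-force idea, not the production node.
import numpy as np


def line_force(p, a, b, strength_at, radius_at):
    """Force on particle position p from the segment a-b.

    strength_at(t) and radius_at(t) stand in for the curves the node exposed
    along the line, with t in [0, 1] from one end to the other.
    """
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    offset = p - closest
    dist = np.linalg.norm(offset)
    radius = radius_at(t)
    if dist == 0.0 or dist >= radius:
        return np.zeros(3)
    falloff = 1.0 - dist / radius
    return strength_at(t) * falloff * (offset / dist)   # push away from the line
```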

Another great couple of examples of this were for Grabbers (ed: worth checking out if you enjoy comedy horror and/or drunken Irish people). There were two main shots we did which stood out as something a little special, both completed by Adam Rowland, who's now a supe with us.

Grabbers JCB Environment Shot Breakdown from Nukepedia on Vimeo.

The first was a JCB travelling shot, which due to time constraints was only shot static and against greenscreen. Adam recreated the entire environment inside Nuke's 3D system (the only exception being the ground roughening, which was taken out to Maya). The environment included a whole range of elements, from full models to planes in the background, as well as fully synthetic glare, flare and particulate matter - work which at any large company would've definitely gone to 3D.

The second for this show was part of a creature shot. Obviously the creature was 3D work, however the plate was shot in the rain, and the director wanted the creature's tentacles to flick water when reaching the limits of their travel. We tied Axis nodes to positions taken from imported Maya locators to give us something to work from, then used rough estimates of distance travelled via Nuke's expressions to provide emitter velocities. With more time we might've looked to see if there were options to get derivatives to provide closer estimates, although I suspect there isn't such functionality built into the expression language, as I don't believe Nuke can integrate or differentiate a curve.
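
(ed: the rough-estimate approach amounts to a finite difference - sampling the Axis translate at the current and previous frame and taking the delta as a per-frame velocity. The node name below is a placeholder.)

```python
# (ed) Finite-difference velocity from an animated Axis; node name is a placeholder.
import nuke

axis = nuke.toNode("TentacleTip_Axis")


def axis_velocity(frame):
    """Approximate world-space velocity (units per frame) at the given frame."""
    now = [axis["translate"].getValueAt(frame, i) for i in range(3)]
    before = [axis["translate"].getValueAt(frame - 1, i) for i in range(3)]
    return [n - b for n, b in zip(now, before)]
```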

Grabbers Nuke Particles Breakdown from Nukepedia on Vimeo.

 
