Interview with Jon Wadelton
(CO-WRITTEN BY JACK BINKS)
Tell us a little bit about your background
So I’ve been in software development around 20 years, which makes me pretty old now! I started my working life as an audio engineer back in Australia, but found myself increasingly moving towards a more TD-like role, building tools for mastering and so on. Since then I’ve worked in a whole range of fields, from networking and web development through to the world of VFX these days.
What's an Aussie doing living amongst all the Poms? Have you gone native?
Well, I’m much fatter since I moved here!
How long have you been NUKE product manager?
I joined The Foundry around 5 years ago as lead engineer for NUKE, though it feels more like 25. Did that for a couple of years, then moved on to become the product manager.
You and I almost started on the same day. I remember asking you what you were working on and you said "paint and roto". After 5 years how is that going?
(laughs) We’re getting there - for 7 the team have managed to get some really impressive speed-ups in! Anyway, the first job was actually going 64-bit - can you believe that was 5 years ago?!
How old is NUKE, 10 years? How long has The Foundry had it?
NUKE is actually more like 20 years old, and The Foundry has been working on it since ‘07 time.
So your fingerprints are all over it, what are you most proud of?
That’s a really difficult question - there’s loads of bits so it’s tricky to pick one. In the recent stuff I’m really happy with the new tracker for 7, as well as the paint/roto improvements.
Have to ask, what BUG irritates you the most?
The one that really annoys me at the minute is that, after hitting play while processing heavy material on OS X, it becomes tricky to hit the stop button. Seems like a simple thing to fix, but it's a real pain in the ass. Still, gonna get it fixed if it kills me.
What can you tell us about hashgate?
Hah, you’re really digging into history here, aren’t you?! So hashgate was a beta issue that slipped out in a v1 release. For those that don’t remember, it involved switching to using # marks (hash in the UK, pound in the US) to denote frame numbers in a filename, rather than the printf-style %0xd, as an option. We patched it with a v2 fix within 2 weeks, but in the intervening time the bug got nicknamed hashgate internally.
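For anyone who hasn't bumped into the two notations, here's a minimal Python sketch of how the hash and printf styles resolve to the same concrete filename. The expand_frame helper is invented purely for illustration - it isn't NUKE's API.

    import re

    def expand_frame(pattern, frame):
        """Resolve either frame-number notation to a concrete filename."""
        if "%" in pattern:
            # printf-style: "%04d" zero-pads the frame number to 4 digits.
            return pattern % frame
        # hash-style: the length of the run of #s sets the padding width.
        return re.sub(r"#+", lambda m: str(frame).zfill(len(m.group(0))), pattern)

    print(expand_frame("shot01.%04d.exr", 37))  # shot01.0037.exr
    print(expand_frame("shot01.####.exr", 37))  # shot01.0037.exr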
I know when you changed the icons people threw down their Wacoms. Are you going to update the GUI?
We definitely want to improve the DAG node and connection representations, along with potentially the text rendering. What we have to be careful of is affecting the UI responsiveness - you’ve got to bear in mind there’s people out there routinely using scripts with thousands of nodes, and just adding a small overhead to each one of those quickly adds up.
Lots of people ask me about raster paint because they're tired of having to round-trip to tools like Photoshop. Do you think there is a need?
I’m hoping that the vector paint speed improvements for 7 alleviate the need, as that’s the most common reason I hear cited for including raster paint. I’m uneasy about including raster generally, since it breaks the procedural nature of the tool somewhat - things like distributing material to render farms suddenly become more of an issue. Obviously they’re surmountable problems, but that's just the sort of thing I have to consider.
On the flip side, we also have MARI which is an excellent raster paint package with brushes that feel just like Photoshop, and over time the tools will only get more closely integrated.
Is that true in general for The Foundry's products - tighter integration?
Absolutely - so NUKE and MARI currently have the bridge, and NUKE and HIERO are obviously tightly coupled, but over time we want to make life better for people working with multiple TF products. We also work hard to make sure our products work with third-party applications. We want customers to be able to make their own choices.
Where do you see the industry going in general? What has changed with the needs of comping and what do you reckon is going to change in the future?
The obvious trend is costs being driven down while standards keep increasing. The DaVinci and Smoke price drops, the AE improvements, the hardware gains - a real commoditisation. These days, the cost of a single workstation our industry used 10 or 20 years ago could outfit a small production team with high-end commodity workstations and your ‘industry standard’ production software.
It's been a while since the last major version of NUKE - over a year I think - has development slowed down?
No, not at all - for one thing, in that past year we’ve had around 10 v-releases, I believe. We just needed a clear run at some of the bigger aspects, such as the GPU work and the roto rework, to get the core work done.
I'm trying to make my NUKE workstation as fast as possible. What would you recommend in terms of good hardware to invest in? Faster drives, more memory, more cores, better gfx card, what?
All of the above! Generally, I’d opt for faster cores rather than more of them. Lots of RAM for the new RAM caching and general op-tree processing, plus a decent SSD for your disk cache, are probably all good bets.
Can you tell us the plans for GPU, or even AVX/other optimisations and speed ups?
So NUKE 7 includes 5 of the slowest ops rebuilt to run on your GPU (or fall back to the CPU if no suitable graphics card is found), which is a pretty good start. It’s all based on the Blink framework our HPC team has been building.
For the future we’re looking at other suitable nodes for such optimisation, as well as automatic vectorisation for better CPU performance (which encompasses the AVX instruction set you mentioned).
The other interesting area the team are looking at is rather geekily termed ‘heterogeneous compute’, which basically means using all the compute devices in your machine simultaneously - shipping some parts off to compute on the GPU and others on the CPU in a sensible, sustainable fashion.
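To make the ‘GPU with CPU fallback’ behaviour concrete, here's a minimal Python sketch of the dispatch pattern described above. The gpu_available probe and the brightness kernels are hypothetical stand-ins for illustration - they're not The Foundry's Blink API.

    def gpu_available():
        # Hypothetical probe; a real version would query CUDA/OpenCL.
        return False  # pretend no suitable graphics card was found

    def brightness_cpu(pixels, gain):
        # Reference implementation that runs anywhere.
        return [p * gain for p in pixels]

    def brightness_gpu(pixels, gain):
        # Stand-in for a GPU kernel launch.
        raise RuntimeError("no GPU on this machine")

    def brightness(pixels, gain):
        # Prefer the GPU path, but keep a CPU implementation so the op
        # still runs (and matches) on any workstation or farm blade.
        if gpu_available():
            return brightness_gpu(pixels, gain)
        return brightness_cpu(pixels, gain)

    print(brightness([0.1, 0.5, 0.9], gain=2.0))  # [0.2, 1.0, 1.8]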
I hear that there is a node that compiles on the fly, when can we get that?
Yes, there is! It’s rather cool and is built on the next stage of the Blink framework stuff I mentioned before. You write your algorithm in this C++-esque language, which you could think of as a kind of shading language, and it handles compiling that up at run time for the hardware in your machine. Kind of like an expression node on steroids.
It’s at a preliminary stage internally, focussed solely on image processing, but we’ll be adding support for other types in the future and getting it included in the builds.
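As a rough analogy for what ‘compiles on the fly’ means, here's a toy Python sketch: a user-supplied per-pixel expression compiled once at run time, then applied to every pixel. Plain Python compile/eval stands in for Blink's compilation to native GPU/CPU code, and the expression and helper names are invented for illustration.

    user_expression = "min(1.0, value * gain + lift)"  # hypothetical user input
    code = compile(user_expression, "<kernel>", "eval")  # compiled once, up front

    def apply_kernel(pixels, gain, lift):
        env = {"min": min, "gain": gain, "lift": lift}
        out = []
        for value in pixels:
            env["value"] = value
            # Evaluate the pre-compiled expression per pixel.
            out.append(eval(code, {"__builtins__": {}}, env))
        return out

    print(apply_kernel([0.2, 0.4, 0.6], gain=2.0, lift=0.0))  # [0.4, 0.8, 1.0]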
Will I ever be able to use the same tech with other TF applications, or even non TF applications?
The tech is developed by the core HPC group I mentioned before, so I’m sure it’ll find its way into other products over time. As for other non-TF applications - I’m not invited to those meetings! Maybe!
Will the ScanlineRender node ever produce Deep Data?
Yes, soon.
Have you thought about adding a GPU renderer?
That’s a bigger question than it sounds. I’m guessing you mean for final renders, rather than playblast-type use. In that circumstance I wouldn’t want to sacrifice scalability or quality, which an OpenGL/GLSL-type renderer likely would. Who knows what might be possible with the Blink tech in the future though.
Have you played with the alpha from V-Ray?
Yup, it seems handy from the quick look I had. It'd be excellent for those places already set up with a V-Ray pipeline who want to match motion blur and the like.
What do you think of other rendering options, such as Atomkraft stuff?
Yeah, it’s interesting to see what people are coming out with.
Do you think the inclusion of lighting into NUKE is the way of the future?
Hah, that’s a big statement. So going back a bit, the whole division between 3D and comp was about what you could do fast as a post process. Over time, what you could do fast enough to qualify as that post process increased, so for example we started to see the inclusion of 3D type projection tools into the comp stage. Lighting can be a very intensive process, so I wouldn’t necessarily say lighting as a whole is the short term future, more the parts of the process you can render fast enough.
This is why things like deep data really interest me - they let you maintain control of aspects of your scene until as late as possible in the chain.
What is happening with OpenEXR 2?
OpenEXR 2 is currently in beta and they're getting very close to release. NUKE 7 has support for deep OpenEXR 2 built in.
NUKE is an awesome environment tool, anything cooking in research to make it more complete?
Yup, got some really cool work being done there - let’s leave it at ‘closing the loop.’
What's the most exciting product at the Foundry right now?
NUKEX, of course! But then, I am biased.