Interview with Artixels




Who is behind Escher/Artixels?

Hello Nukers, this is Mike Wong. I am the owner of Artixels and the developer of Escher.  We are a small boutique based in my hometown, Hong Kong, founded in 2011.  My colleague Kachi Chan joined a few months ago; he takes care of technical support, promo videos and documentation.

What tools does Escher consist of?

Escher tools are divided into three categories (10 nodes in total):

Image generators
  • caustix, for rippling caustics patterns
    {yoogallery src=[images/users/frank/articles/Escher/caustix] width=[200]}
  • skyWalker, for daylight rendering based on the Preetham sky model
Technical Ops
  • depthOp, a depth data processor for manipulating depth, e.g. clamping, scaling, inverting, etc.
  • normalOp, a normal vector processor that applies a transform using a metadata-embedded matrix or a custom one.
  • aaOp, a morphological antialiasing node, a perfect cleaner for 2.5D shaded images.
  • SHOp, which SH-projects lightprobe images and outputs SH coefficients into metadata.  Nukers may embed these coefficients into OpenEXR files as metadata, so all lightprobe images can carry SH data right inside them for use in other parts of the pipeline.

2.5D Creative tools
  • SHader, a 2.5D image-based Lambertian shader using SH data.
    {yoogallery src=[images/users/frank/articles/Escher/SHader] width=[200]}
  • glossy, a 2.5D image-based glossy shader with importance sampling and pre-filtered samples.
    {yoogallery src=[images/users/frank/articles/Escher/Glossy] width=[200]}
  • depthTrooper, a 2.5D image-space ambient occlusion shader.
    {yoogallery src=[images/users/frank/articles/Escher/depthTrooper] width=[200]}
  • depthVader, a multi-purpose depth data op (renders depth gradients, normal maps, gradient blurs, etc.)

Check out the videos for an overview:

Escher Plugin Suite - Teaser 1
Escher Plugin Suite - Teaser 2 part 1
Escher Plugin Suite - Teaser 2 part 2

Can you give us some more background info on the spherical harmonics tools, and how you expect to see them used?

Okay, I hope I can explain it clearly; I will avoid CGI jargon….  Spherical harmonics (SH) is a mathematical device that brings us two advantages for image-based lighting with diffuse shading:

  1. Incoming light approximation: SH can transform incoming light data on a spherical surface into a few coefficients (SH coefficients).  The number of coefficients is determined by the number of bands of the SH transform; ‘band’ can be understood as a parameter controlling the accuracy of the approximation.
  2. Efficient image-based diffuse shading: once we have approximated the incoming light with SH, we can simply use the SH coefficients to evaluate diffuse shading quickly in frequency space instead of performing a hemispherical integration at every point.
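The two points above can be made concrete with a short sketch.  The following Python snippet (an illustrative toy, not Escher's actual code) projects an environment function onto a 3-band, 9-coefficient real SH basis by Monte-Carlo sampling, then evaluates Lambertian irradiance in frequency space using the standard per-band convolution weights:

```python
import math, random

def sh_basis(x, y, z):
    # Real spherical-harmonic basis up to band 2 (9 coefficients),
    # evaluated for a unit direction (x, y, z).
    return [
        0.282095,
        0.488603 * y,
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,
        1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ]

def uniform_sphere(rng):
    # Uniformly distributed direction on the unit sphere.
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def sh_project(light_fn, n_samples=20000, seed=1):
    # Monte-Carlo SH projection: c_i = (4*pi / N) * sum L(d) * Y_i(d).
    rng = random.Random(seed)
    coeffs = [0.0] * 9
    for _ in range(n_samples):
        d = uniform_sphere(rng)
        L = light_fn(*d)
        for i, y in enumerate(sh_basis(*d)):
            coeffs[i] += L * y
    scale = 4.0 * math.pi / n_samples
    return [c * scale for c in coeffs]

# Per-band Lambertian convolution weights for bands 0, 1, 2.
A = [math.pi, 2.0 * math.pi / 3.0, math.pi / 4.0]
BAND = [0, 1, 1, 1, 2, 2, 2, 2, 2]

def irradiance(coeffs, nx, ny, nz):
    # Diffuse irradiance for a surface normal (nx, ny, nz),
    # evaluated directly in frequency space.
    basis = sh_basis(nx, ny, nz)
    return sum(A[BAND[i]] * coeffs[i] * basis[i] for i in range(9))
```

A handy sanity check: for a constant unit environment, the irradiance at any normal evaluates to approximately π, the analytic result.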

SHOp is designed to achieve the first point (incoming light approximation): it transforms a given lightprobe into a set of SH coefficients and injects them into the metadata for downstream use.  The current SHOp supports SH transforms up to 6 bands, but 3 bands is usually good enough given the low-frequency nature of the application.
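Since Nuke metadata is a flat set of string keys and values, injecting SH coefficients for downstream use amounts to flattening them into such keys.  A minimal sketch of the idea (the key naming scheme here is hypothetical, not SHOp's actual layout):

```python
def sh_to_metadata(coeffs, prefix="exr/sh"):
    # Flatten per-channel SH coefficients into string-keyed metadata,
    # the flat key/value form that Nuke metadata (and EXR attributes) use.
    # The "exr/sh" prefix and key layout are hypothetical, not SHOp's.
    md = {}
    for chan, values in coeffs.items():
        for i, c in enumerate(values):
            md["%s/%s/%d" % (prefix, chan, i)] = "%.6f" % c
    return md
```

A downstream node would then read the same keys back and reassemble the coefficient list per channel.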

About the pipeline: as far as I know, there are some Nuke gizmos (I saw them in a training video from The Foundry; was it Dneg showing some SH tools?) that take SH coefficients to do relighting, but the SH transform is done outside of Nuke.  I believe SHOp gives this kind of in-house tool better flexibility by collecting SH coefficients right inside Nuke, so that any intermediate comp can be used as an environment light source too.
SH coefficients embedded inside OpenEXR: since metadata in Nuke can be exported into OpenEXR, all lightprobes used in a show could be SHOp’d once, have the SH coefficients embedded right back inside the lightprobe, and be used in other parts of the pipeline.

SHader is obviously developed to do the SH shading part.  One may use it to pull SH coefficients from upstream to shade a 3D normal pass, or input one’s own SH coefficients.  Yes, why not!?  SHader also has a Phong shading mode, but given the low-frequency lighting nature of the approach, Phong mode might not match expectations, and that is why we have a glossy node :)

I believe these two SH tools will be very useful to compers who want to give an overall environment color/lighting tweak to a CGI layer (perfect if a normal pass is there; if not, at least a depth pass, because the depthVader node can render an approximate normal pass from depth data).
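The depth-to-normal approximation mentioned here is straightforward to illustrate: central differences of the depth image give a screen-space gradient, from which an approximate camera-space normal is built.  A minimal Python sketch (depthVader's actual filtering and scaling are not shown):

```python
import math

def normals_from_depth(depth, scale=1.0):
    # Approximate a camera-space normal map from a depth image using
    # central differences; a simplified sketch of what a depth-to-normal
    # node might do internally.  `depth` is a list of rows of floats.
    h, w = len(depth), len(depth[0])
    normals = [[(0.0, 0.0, 1.0)] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Clamp neighbour indices at the image border.
            x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
            y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
            dzdx = (depth[y][x1] - depth[y][x0]) / max(x1 - x0, 1)
            dzdy = (depth[y1][x] - depth[y0][x]) / max(y1 - y0, 1)
            # Normal opposes the depth gradient; z points at the camera.
            nx, ny, nz = -scale * dzdx, -scale * dzdy, 1.0
            inv = 1.0 / math.sqrt(nx * nx + ny * ny + nz * nz)
            normals[y][x] = (nx * inv, ny * inv, nz * inv)
    return normals
```

On a flat depth plate this yields normals pointing straight at the camera; on a linear depth ramp the normals tilt against the slope.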

Is there a story behind the name?

Artixels or Escher?  I want to talk about both actually, if you don’t mind, haha.  Artixels very much comes from the idea of ‘art-in-pixels’, but I actually want people to see our brand as a basic unit of digital art making, just like a pixel in a digital image.  As for Escher, I am sure most people will quickly relate it to the great Dutch artist M.C. Escher, to whom we want to pay tribute.  He is my favourite artist of all time, and most of his works explore the possibilities stashed between 2D and 3D, so I guess we could not find a better name for the product.

What was your motivation to write Escher?

Personally I enjoy compositing a lot, and I always feel that what a comper needs and wants is the ability to tweak and experiment with ideas and materials quickly while staying focused in a 2D image context.  There are imageries that are difficult to produce if one approaches them solely with 2D or 3D tools.  A simple example is how one might render M.C. Escher’s drawings photo-realistically: I believe the easiest way is to apply 3D normal vectors onto different areas of the drawing and then color the areas with shaders, and that is 2.5D.

I have been following video game rendering R&D for a while, and the deferred shading pipeline used in most game renderers today is doing exactly this kind of 2.5D.  What we are trying to do technically is bring the possibility of deferred shading practice to compers.  Nuke is an especially good candidate because it has such an efficient scanline processing engine; it is like a software GPU for 2.5D shading.  Needless to say, with the upcoming genuine GPU context in Nuke 7, future releases of Escher are going to offer more.

It’s interesting to hear you are connecting games technology to vfx compositing, something I (Frank) personally think has been grossly neglected in past years. Is there anything else you are interested in bringing from games to Nuke? Such as relief mapping?

There are really lots of good brains in game tech when it comes to performance and just-right tricks, and I think more and more game-targeted technologies are good for VFX; e.g. Bullet physics, and Jack’s Mullet is cool.  Once we have seen how the new GPU context works in Nuke, we will have a better idea of what to bring to Nuke next through Escher.  I guess Nuke’s GPU context should be GLSL-based?  If so, the latest compute shaders in OpenGL should enable lots of good stuff to happen!  Relief mapping?  Why not; then we need some UV node too….

What did you do before Escher?

I have been in the CGI industry for almost 20 years now and have played lots of different roles.

My story….  It was 1993; I was pursuing my PhD but could not wait to create visual effects after I saw the T-Rex in Jurassic Park, so I chose to become a PhD dropout (although I was surrounded by dozens of SGI Indigo 2s in the lab, and I remember coding an image morpher on an SGI following a PDI paper about how they did Michael Jackson’s ‘Black or White’ music video), and I started a small animation studio with my good friend Kelvin Lee.  1993 was the dawn of the desktop PC computer animation revolution; I recall we used Pixar’s MacRenderMan, 3D Studio and a DPS animation recorder on an Amiga A4000 in our jobs (anyone recall?  ILM’s T-Rex test was also done on an Amiga).  I later joined Centro Digital Pictures Ltd (VFX for The Storm Riders, Shaolin Soccer, Kung Fu Hustle and Kill Bill Vol. 1 and 2) and got the chance to work on larger-scale projects; those years were truly exciting.  I recall we were invited by Ellen Poon of ILM to present our project The Storm Riders at SIGGRAPH 1998 and shared the floor with ILM (Tom Bertino talked about Flubber) and PDI (Beth Hofer talked about ANTZ)….

Later on, I joined a local university and started a computer animation program for them.  I never planned to stay in teaching but unexpectedly spent long years there; c’est la vie.  In 2008 I decided to leave the comfort zone and spent time updating my skill set in GPU computation, fluid simulation and tools development.  So here I am at Artixels now.  (The name Artixels was conceived in 2002.)

How long did it take to write?

It’s been almost a year now.  I remember quickly putting together a few proofs of concept (fluid sim and 2.5D tools) last November and sharing them with visitors during SIGGRAPH Asia 2011.  Feedback on the 2.5D tools was quite encouraging, so I decided to give it a go right after the show.

Can you mention any big customers?

We don’t have any big boys committed to acquiring an Escher license yet.  As you can tell, we are relatively badly connected to the industry, both geographically and network-wise, so we are working very hard to build our client base progressively through quality demo materials and good words from our testers.  At the moment our beta test group is growing steadily; we have testers from almost all time zones now.  We also have a brave user in Berlin who has used an alpha version of an Escher tool on a movie project, and once we get clearance we would love to show what they have done.

Are you going to support MAC and if so, when can we expect it?

Absolutely.  I have just fired up CMake on my Mac to start porting.  My mission is to bring Escher beta 2 to OS X too, so hopefully early November.

What was the most challenging node?

As most of the engines behind the Escher tools are not too complicated, the main challenge remains robustness and usability.  Concerning robustness, a considerable amount of time was spent understanding the plug-in host, as the documentation is somewhat limited (my not being a very experienced Nuke comper is another reason), but I found the Nuke-Dev forum immensely useful; I recommend developers read every single post there.  As for usability, I had good fun designing the caustix user interface, as I really enjoy using it as a comper, although the coding part was a bit routine.

What would you like to see added to the NDK?

I would love to see a GPU context API (I guess that is happening now in Nuke 7), and of course improved documentation and more examples (concerning robustness).

What’s next?

As we have already done some groundwork on fluid simulation, our next product will hopefully be simulation-based.

That sounds very exciting. Will it tie in with Nuke’s particle system, or be a completely new set of tools?

We had a 2D fluid OFX plug-in prototype running in Nuke last year, before the appearance of Nuke’s particle system, and I have not quite figured out how to solve some usability and user-interaction issues in a 2D image compositing context.  But there is a problem here….  When you put simulation stuff into a comp environment, you have to be very careful about how these tools are integrated into the comper’s workflow and practice.  (I guess similar struggles existed over whether you should do asset-centric lighting in Nuke or in a whole new environment like Katana.)  Running iterated pure simulation can be boring and should be done outside of Nuke if it does not allow the comper’s creative input, so an integrated sim tool must allow lots of interactive control and directability.  Some new fluid sim nodes for Nuke’s particle system are a possibility, but it is also likely we might make something standalone.

What’s the best way for potential clients to contact you?

Right, for any questions about Escher for Nuke, please email us.  We also have a Facebook page where you can see more updates about Escher:



# Jordan Olson 2012-10-28 15:54
Good job Mike! These plugins are a great asset for any studio. Keep up the great work. And thanks Frank for the writeup.

