Interview with Mike Romey


(Co-written by Jack Binks)

Hi Mike. Tell us a bit about yourself.

Prior to joining Zoic Studios I worked in the industry as a CG supervisor, effects TD, art director and broadcast real-time graphics artist. I came to Zoic with a broad set of skills and quickly gravitated to our pipeline. Currently, I am the head of Zoic's pipeline and the CTO of our new venture, ZEUS, which specializes in virtual production. My responsibilities include overseeing the evolution of our software pipeline and workflows. This includes integrating off-the-shelf software into our workflow as well as supporting and maintaining proprietary software. Our pipeline team is small and agile, and we tend to focus our energy where our efforts can yield the most return for the facility. We are constantly challenged to both maintain our workflow and push the envelope.

Tell us a bit about Zoic: what sort of work do you do, and what challenges does that work present?

Zoic is a diverse visual effects company offering services ranging from on-set production services, virtual production, visual effects supervision, concept and creative direction, editorial, motion graphics, in-engine game cinematics and full CG animation to post-production, project management, pipeline consulting and production design for the entertainment industry. With such a breadth of services, our pipeline department is constantly challenged to develop ways to create continuity, efficiencies and new workflow methodologies.

[Screenshot: the Zoic File Browser (ZFB)]

What main tools does your pipeline revolve around?

We use tools like Fabric, Maya, V-Ray, Shotgun, Nuke, Hiero and RV. The glue that ties them together is their strong Python scripting support. Most of our tools are developed using PySide as the front end for Python-scripted back ends.

How long have you been using Nuke?

We have been using Nuke for roughly 4 years. In that time we have built up our pipeline to support some very extreme production scenarios. For example, some of our broadcast productions require us to turn over 200-400 completed shots every 2-3 weeks, 22 times a season. We have done this by generating a workflow that centralizes shot information into a shared, custom dockable tool in Maya, Nuke and Hiero called the "Zoic File Browser". The ZFB allows users to see shot assignments for themselves as well as for neighboring pipeline steps. They can use the tool to open, save, time-log, annotate, transfer and reference scenes between our multiple facilities. Additionally, from the ZFB they can save and share snippets of nodes and resources across shots, sequences, episodes and projects. This makes it easy for a lead artist to build composite templates that can be broadly shared across similar shots, which is a necessity for delivering volume with continuity.
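As a rough illustration of what a shared, dockable browser looks like on the Nuke side, here is a minimal PySide sketch. The ZFB itself is proprietary; the ShotBrowser class, the shot_browser module path, the placeholder shot names and the panel id below are hypothetical stand-ins, not Zoic's actual code.

```python
# Minimal sketch: a PySide widget exposed as a dockable pane in Nuke.
from PySide import QtGui


class ShotBrowser(QtGui.QWidget):
    """Hypothetical shot browser: lists the current artist's assignments."""

    def __init__(self, parent=None):
        super(ShotBrowser, self).__init__(parent)
        layout = QtGui.QVBoxLayout(self)
        self.task_list = QtGui.QListWidget()
        layout.addWidget(self.task_list)
        self.refresh()

    def refresh(self):
        # In a real tool this would query the production database (Shotgun);
        # here we just show placeholder entries.
        self.task_list.clear()
        for shot in ("ep101_010_comp", "ep101_020_comp"):
            self.task_list.addItem(shot)


# Inside Nuke the same widget class can be registered as a dockable panel.
try:
    import nukescripts.panels
    nukescripts.panels.registerWidgetAsPanel(
        "shot_browser.ShotBrowser",      # importable path to the widget class
        "Shot Browser (sketch)",         # label shown in the Pane menu
        "com.example.shotBrowser")       # unique panel id
except ImportError:
    pass  # not running inside Nuke
```

Because the widget itself is plain PySide, the same class can be parented into Maya or Hiero with each host's own docking mechanism.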

How much work has gone into building the ZFB do you reckon?

The ZFB was built over the course of 2 months before being deployed into production roughly 18 months ago. Since that time our shows have grown in size and the feature requests have grown with them, causing the user interface to slow down. We recently upgraded our Shotgun server and software, resulting in some very impressive speed improvements that make the ZFB a more critical and responsive tool in our pipeline. The ZFB is now able to query a large show with 200-400 shots and close to 2,500 tasks in 3-5 seconds. The speed improvement allows artists to receive minute-by-minute updates to a project as they are made by producers on the back end inside Shotgun.
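For a sense of how that kind of query stays fast, the usual approach is to batch the Shotgun work into as few round trips as possible. Below is a hedged sketch using the standard shotgun_api3 client; the server URL, script credentials, project id and field names are placeholders rather than Zoic's actual schema.

```python
# Sketch: fetch every task on a show in a single Shotgun round trip.
from shotgun_api3 import Shotgun

sg = Shotgun("https://yourstudio.shotgunstudio.com",
             script_name="shot_browser_reader",   # hypothetical script user
             api_key="REPLACE_ME")

project = {"type": "Project", "id": 123}  # placeholder show id

tasks = sg.find(
    "Task",
    filters=[["project", "is", project]],
    fields=["content", "sg_status_list", "entity", "task_assignees"],
)

print("%d tasks fetched in one round trip" % len(tasks))
```

One batched find() call like this, cached and refreshed in the UI, scales far better than issuing a separate request per shot.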

What sort of challenges did you face creating a tool to be shared between such a diverse range of software?

Building a tool like the ZFB that can be hosted in different applications was tricky. Each host application needs to support third-party Python site packages such as PySide or PyQt. When we started developing the ZFB, this narrowed the scope of applications we could initially support. Since then we have lobbied a number of developers to support these additions, ultimately expanding the number of host applications the ZFB can support.
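A common way to cope with hosts that ship either Qt binding is a small import shim that the rest of the tool imports from. This is a generic sketch of that pattern, not Zoic's actual compatibility layer.

```python
# Sketch: expose one Qt namespace regardless of which binding the host ships.
try:
    from PySide import QtGui, QtCore
except ImportError:
    from PyQt4 import QtGui, QtCore
    # PyQt4 names its signal/slot decorators differently; alias them so
    # downstream code can always use the PySide-style attributes.
    QtCore.Signal = QtCore.pyqtSignal
    QtCore.Slot = QtCore.pyqtSlot


def make_button(label):
    """Build a widget the same way regardless of which binding is loaded."""
    return QtGui.QPushButton(label)
```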

Additionally, we needed to build some lightweight support tools to make building custom dialog interfaces easy. Historically we would use Qt Designer to build user interfaces, but we have learned over the years that using Designer tends to lead to some design drift and continuity issues. To solve this, we opted to build a custom Python dialog builder that can be dynamically populated on the fly from a number of pre-made widget descriptions. This ensures that each dialog in the ZFB looks and feels like the next, and that dialogs look identical across all of the host applications. By doing this we were able to make more of our code base reusable.
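A minimal sketch of that "describe the dialog, let code build it" idea might look like the following; the widget types, field names and the example publish dialog are illustrative only, not the ZFB's real descriptions.

```python
# Sketch: build consistent dialogs from small field descriptions.
from PySide import QtGui


def _make_combo(items):
    combo = QtGui.QComboBox()
    combo.addItems(list(items))
    return combo


def _make_check(checked):
    box = QtGui.QCheckBox()
    box.setChecked(bool(checked))
    return box


FIELD_BUILDERS = {
    "text":  lambda value: QtGui.QLineEdit(value),
    "combo": _make_combo,
    "check": _make_check,
}


def build_dialog(title, fields, parent=None):
    """fields is a list of (label, widget_type, default) tuples."""
    dialog = QtGui.QDialog(parent)
    dialog.setWindowTitle(title)
    layout = QtGui.QFormLayout(dialog)
    widgets = {}
    for label, widget_type, default in fields:
        widget = FIELD_BUILDERS[widget_type](default)
        layout.addRow(label, widget)
        widgets[label] = widget
    buttons = QtGui.QDialogButtonBox(
        QtGui.QDialogButtonBox.Ok | QtGui.QDialogButtonBox.Cancel)
    buttons.accepted.connect(dialog.accept)
    buttons.rejected.connect(dialog.reject)
    layout.addRow(buttons)
    return dialog, widgets


if __name__ == "__main__":
    # Standalone test; inside Maya/Nuke/Hiero the host already owns the QApplication.
    app = QtGui.QApplication([])
    dialog, widgets = build_dialog("Publish Shot (sketch)", [
        ("Shot",    "text",  "ep101_010"),
        ("Step",    "combo", ["comp", "lighting", "fx"]),
        ("Send QT", "check", True),
    ])
    dialog.exec_()
```

The same description drives an identical layout in every host, which is what keeps the look and feel from drifting.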

So how long have you been using Hiero?

We recently started to tie Hiero into our pipeline. I guess the rule of thumb "go large or go home" applies to our adoption. We have 25 licenses of Hiero being used in production today and have deployed 38 custom Python tools, each with varying complexity and user interfaces. Most of the custom development for Hiero is handled by only a few developers. To date, 1.5+ million plate frames have been transcoded and 5,000+ conforms performed using Hiero. We have delivered nearly 60 episodic shows. It has been used by our feature department to deliver over-cuts and conforms for 4 features so far, and it is additionally being used by our commercial department to deliver multiple high-profile spots for clients like Audi and Toys'R'Us. All of this has been done in a matter of months, not years, with deployment and training of Hiero starting in August of 2012.

Why did you adopt Hiero?

The dilemma we were facing prior to its adoption was threefold. The first issue was a bottleneck bringing plates online for production in a way that didn't jeopardize our short production cycle. Our machine room couldn't keep up with the volume of footage we were receiving from production and the number of shots they were being asked to bring online. We had been building custom Python tools to aggregate footage against client-provided EDLs. This had proven to be very successful but had its handicaps: the tools were being used by producers and coordinators to transcode footage but had no visual front end for quality assurance. Hiero became the front end to our back-end tools, providing users with a comfortable interface where they could conform, verify and transcode selected takes to our database. We found that under the hood Hiero's PySide and Python bindings closely matched our existing tools and only required nominal tooling to integrate.
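As a hedged illustration of the kind of QA pass this enables, the sketch below walks the active timeline with Hiero's Python bindings and reports each cut so it can be checked against the client EDL. Method names are taken from Hiero's Python API and may vary between versions.

```python
# Sketch: list every cut on the conformed timeline for a quick EDL check.
import hiero.ui

sequence = hiero.ui.activeSequence()
if sequence is None:
    raise RuntimeError("Open the conformed sequence in the timeline first")

for track in sequence.videoTracks():
    for item in track.items():
        print("%-30s  timeline %d-%d  source in %d" % (
            item.name(), item.timelineIn(), item.timelineOut(), item.sourceIn()))
```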

The second issue was sending our clients edited sequences with the visual effects shots cut on top of the offline. With so many shots changing so quickly, we needed a faster way to conform our visual effects shots as quickly as they came off the render farm. With so many shots in production at one time, we needed a tool that would allow us to publish a conform to our Shotgun production tracking system. We needed a nonlinear timeline where we could easily build tools to query the database for changes and additions. Producers and coordinators needed to be able to review notes in the context of the footage they were conforming. Hiero's PySide UI allowed us to build a note-browsing tool that automatically updates as producers scrub and change selections in the timeline. Additionally, we built custom render farm exports for Hiero that simplify the process of delivering over-cuts for client review. This not only ensures that our clients receive work-in-progress reviews in a timely manner but also guarantees that we receive working notes from them just as quickly.
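A note-browsing panel like that can hang off Hiero's selection-changed event. The sketch below registers a callback for it; the production-notes lookup is replaced with a simple print, and the event names follow Hiero's documented events API, which may differ between versions.

```python
# Sketch: react to timeline selection changes and look up notes per shot.
import hiero.core.events


def on_selection_changed(event):
    # event.sender is the view whose selection changed; selection() returns its items.
    for item in event.sender.selection():
        shot_name = item.name()
        # A real tool would query Shotgun for notes on shot_name and refresh
        # the notes widget; here we just log the selection.
        print("Selected shot: %s" % shot_name)


hiero.core.events.registerInterest("kSelectionChanged", on_selection_changed)
```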

The last issue was deliveries. Prior to Hiero we had a mixed bag of different software in the facility being used to produce a single shot, and each application and vendor had different color configurations. By using Hiero we were able to normalize our client delivery tools with our compositing tools, ensuring that the same tools our artists use to check and align color are the tools we use to deliver. We supported this process by additionally building tools for Hiero to batch-distribute transcoded footage as DNxHD and MXF files for final editorial conforming. This gives us the flexibility to deliver Avid bins to our clients with hundreds of versions conformed to their delivery specifications in minutes, not hours.

Adopting Hiero also required us to evaluate the distribution of labor on our shows. Historically we had a very mixed bag of responsibilities for each job title, which often created a certain amount of confusion as to which task needed to be completed before the next one began. By centralizing the tasks in Hiero we were able to create some common ground as to which procedures need to be followed to start and deliver a show.

Sounds like Hiero is deeply integrated. What does this give you?

The most unique thing about our conforming pipeline and workflows is that we have been able to empower our producers and coordinators to manage parts of the pipeline they otherwise would not be able to. With some training they are able to generate shots and plates fairly quickly, distributing the load to our render farm. Once those plates are translated into working versions, they can overlay them back on top of a working edit and generate over-cuts for client approval. Upon approval they can then batch-generate resources that can be bundled into an Avid bin for client ingestion. These tools and processes allow us to take on more shots, deliver more versions and create a better product because of it.

How did you find the learning curve for Hiero’s Python API?

At first, it was very difficult. We were one of a small handful of facilities using Hiero, and an even smaller handful of users trying to trick it out. Many of the Python methods had not been tested in production. Sometimes those gaps required us to work with The Foundry to build temporary custom methods while we waited for them to be added to the trunk of Hiero. During the alpha and beta cycles of Hiero, each iteration of the application would bring new features and break older Band-Aids. Since then it has stabilized and we have been able to build tools quickly without too much support.

Where do you see Hiero going in the future?

Zoic does a large amount of on-set virtual production work. In the future I would love to see flavors of Hiero on set and on location logging takes, not just for live-action production but for virtual camera, facial performance, witness camera and motion capture productions, in a similar way to what Storm once did. My hope is that, in very near future releases, Hiero will grow to support more sophisticated color grading tools to simplify the process of grading shot sequences and handing LUTs back down to artists for shot production. Additionally, Hiero will need to grow to more seamlessly support interchange with traditional editorial facilities using Avid. Ideally, supporting the creation of Avid bins from Hiero would simplify the process by which we deliver to clients.

How about Nuke’s future?

I am often asked, "What is Hiero?" My typical response is that Hiero is a nonlinear timeline for Nuke. Building an alternate application like Hiero on top of the back end of Nuke requires a close relationship between both products' managers. Clearly, Nuke plays a critical role in Hiero's future. The more new functionality that is exposed in Nuke, the more of it can be passed down to Hiero. We hope to see more advanced GPU acceleration in Nuke that can be passed down to Hiero in the form of color grading tools and GPU rendering.
