As far back as I can remember, I always liked to tinker.

To me, figuring out how things worked and solving problems was better than getting the highest grades or working on the “coolest” shot. Even back at university, I had a knack for it; professors and assistants would send classmates my way whenever they had a technical issue. It was clear early on that I didn’t just want to follow the usual path; I wanted to carve out my own role, one where I could make a real difference.

When I found the role of Lighting TD and department TD at Trixter, it felt like stepping into a place I was meant to be. It wasn’t just about doing my job as a lighter and making pretty shots by shaping light; it was about anticipating the needs of my team and even helping out colleagues from other departments. I got to be the one who filled the gaps, who bridged technical challenges with creative solutions. In an environment where everybody was so specialised, there were always new ways to stand out.

But as much as I loved to tinker and invent, I came to realise that it wasn’t enough to just enjoy solving problems; you had to find the right problems to solve. Over the past few years I saw that need and took it upon myself to fill it, using every bit of my curiosity and technical know-how to come up with solutions that mattered. And in doing so, I found my own kind of place, somewhere I belonged.

This little blog journey is about showcasing some of the tools I've developed along the way. As a department TD, sometimes the hardest part isn’t coming up with a solution; it’s keeping a record of those small yet impactful tools that help your team. By writing about them, I can share the thought process and techniques behind solving those challenges, documenting the work that often goes unnoticed but makes all the difference.

Lighting, Lookdev, and Rendering: It's All About the Attributes:

In the world of computer graphics, especially within lighting, look development, and rendering, everything ultimately boils down to attributes. Whether it’s procedural alembics, USD instancing, or viewport proxies, the key lies in making the right choices and optimising the process so the workflow becomes more efficient and less resource-intensive. Throughout my career I've been passionate about creating tools that enhance productivity and streamline complex tasks, and here’s a glimpse into some of the strategies and solutions I developed along the way.

Render Reports:

First, we need to dive a little into the past. I started my VFX journey as a render wrangler, simply to find a way into the industry, and found myself with time to explore and tinker with the pipeline that had been built up at Trixter over the years. Instead of just restarting failed jobs on the render farm or flagging them, I started looking into solutions for them, always with a slight bias towards those coming from the lighting and lookdev department. I quickly stood out from the crowd and moved into a rendering TD role, taking ownership of the render farm and working out some of its workflows. One of the first things I wrote was the basis of our render reports, built by tinkering with the API of Tractor, the render farm manager we used. It later grew into an automatic email sent to production, supervisors, and pipeline, with render times, reports, and any issues that had happened overnight. The wrangling team used to compile this manually, so automating the process gave me exactly what I wanted: a little more time to tinker and help out with other issues.
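The original script has long since been absorbed into the studio's internals, but the gist of it fits in a few lines. The sketch below assumes Tractor's Python query API (tractor.api.query); the engine host and port, the search clause, the shape of the returned rows, and the recipients are all placeholders rather than the real setup.

```python
# Overnight farm report, a simplified sketch of the original wrangling script.
# Engine host/port, the search clause, and the recipients are placeholders;
# the exact row layout can differ between Tractor versions.
import smtplib
from email.message import EmailMessage

import tractor.api.query as tq

tq.setEngineClientParam(hostname="tractor-engine", port=80, user="wrangler")

# Grab every job that still has errored tasks, with a few useful columns.
errored = tq.jobs("error", columns=["jid", "title", "owner", "numerror", "numtask"])

lines = ["Overnight render report", "-" * 40]
for job in errored:
    lines.append(
        f"{job['jid']}  {job['owner']:<12} errors {job['numerror']}/{job['numtask']}  {job['title']}"
    )

msg = EmailMessage()
msg["Subject"] = "Overnight render report"
msg["From"] = "renderfarm@studio.local"
msg["To"] = "production@studio.local, supervisors@studio.local"
msg.set_content("\n".join(lines))

with smtplib.SMTP("mail.studio.local") as smtp:
    smtp.send_message(msg)
```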

Katana Farmsubmitter:

With that said, shortly into my time at Trixter I had shown enough potential to start working my way into the lighting and lookdev department. One of the first tools I wrote for it was a replacement for the outdated render farm submitter, which also let us take advantage of the new farm-submitter API tools pipeline had just written. The idea was to expose only the parts that really mattered to the artist: whether to render a full sequence, chunks, or a set of frames, and the ability to change render priority or farm pools. It also hooked into the newly added validators, so we could catch known mistakes, as warnings or as hard errors, before a job ever hit the farm. Since I wasn’t just writing these tools but also using them day to day, that made a real difference. The validations included checking that the camera was in the scene and named correctly, and that frame ranges and resolution were consistent with the information from production, i.e. ftrack, amongst others.
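To give a flavour of what those checks looked like, here is a stripped-down sketch. The real validators lived inside the studio's submitter API and pulled their values from the Katana scene; here the scene values and the ftrack lookup are reduced to plain dictionaries.

```python
# Stripped-down sketch of the pre-submission validators. The real checks ran
# through the studio's farm-submitter API; here the Katana scene values and the
# ftrack lookup are plain dictionaries passed in by the caller.
from dataclasses import dataclass, field

@dataclass
class Report:
    errors: list = field(default_factory=list)
    warnings: list = field(default_factory=list)

    @property
    def ok(self):
        return not self.errors

def validate_submission(scene, shot_info):
    """Compare scene settings against production data before hitting the farm."""
    report = Report()

    # The render camera has to exist and follow the naming convention.
    expected_cam = f"/root/world/cam/{shot_info['shot']}_renderCam"
    if scene.get("camera") != expected_cam:
        report.errors.append(f"Camera is '{scene.get('camera')}', expected '{expected_cam}'")

    # Frame range and resolution must match what production tracked in ftrack.
    if scene.get("frame_range") != shot_info["frame_range"]:
        report.errors.append(
            f"Frame range {scene.get('frame_range')} does not match ftrack {shot_info['frame_range']}"
        )
    if scene.get("resolution") != shot_info["resolution"]:
        report.warnings.append(
            f"Resolution {scene.get('resolution')} differs from ftrack {shot_info['resolution']}"
        )
    return report
```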

Sequence Setup:

One of my biggest opportunities and pushes towards the lighting and lookdev department came when Julius Ihle joined the team as the department lead at the time; he quickly became one of my biggest mentors to date. With his vast experience, and being a tinkerer himself, he had a lot of ideas for the department. One of the first projects we worked on together was reworking the somewhat outdated Katana templates, Katana being the tool we used for lighting and lookdev. The first step was to adopt a new workflow concept: a sequence workflow, where the template no longer focused on a single shot but on multiple shots. The tools that already existed, such as the render settings and the convenient startup tools that pre-set up a scene for the artist based on the seq/shot context, had to be reworked to fit this concept.

One of the parts Julius and I worked on was what would become known as the “shot work area“, the section of the scene where the artist works on a specific shot. In Katana, we used the variable-enabled switch to create these sections, but we did not want every single shot of a sequence loaded into the scene. One of the tools I came up with and wrote for this was the shot loader, a shelf tool that opened a simple dialog box. It read some information from environment variables, such as the artist name and the sequence context, queried ftrack for the shots assigned to that artist, and then loaded all the nodes the artist needed to start the shot work. There was also the option to ignore the filter by artist name, in case you needed to load a specific shot, even from another sequence.
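The interesting part of the shot loader was the query step: read the context from the environment, ask ftrack which shots are assigned to the artist, and only then build the nodes. A simplified sketch of that step is below; the environment variable names, task name, and ftrack schema are illustrative rather than the real Trixter setup.

```python
# Sketch of the shot loader's query step. Environment variable names and the
# ftrack schema are illustrative; the real tool then built the Katana nodes
# for each shot the artist picked from the dialog.
import os
import ftrack_api

def shots_for_current_artist(ignore_artist_filter=False):
    # Server, API key, and user are read from the environment by ftrack_api.
    session = ftrack_api.Session()
    sequence = os.environ.get("SEQ", "")
    artist = os.environ.get("USER", "")

    query = (
        'Task where name is "lighting" '
        f'and parent.parent.name is "{sequence}"'
    )
    if not ignore_artist_filter:
        query += f' and assignments any (resource.username is "{artist}")'

    # Return the shot names so the dialog can list them for the artist.
    return sorted({task["parent"]["name"] for task in session.query(query)})
```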

User Tools:

As my responsibilities within the department grew, so did my visibility into the tools the team needed. An important thing to recognise, though, was that I was not the only one writing these convenience tools, or coming up with ideas and prototypes for them. Pipeline had already built a convenient way to store user tools as shelf tools for the other DCCs, so I had the idea to bring this into Katana and take it a step further, making it easier for artists to save their own tools. The solution was a simple UI that let the artist select the node(s) that made up their tool and write a description; it would then save the tool to the correct place and create the JSON and Python files needed for it to be discovered and loaded correctly, whether it was a LiveGroup, an OpScript, or anything else, simplifying the process of sharing these handy tools.
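The UI itself isn't very interesting, but the descriptor and discovery side gives an idea of how simple the mechanism was: one folder per tool, with a JSON descriptor sitting next to the exported nodes. The layout and field names below are illustrative; the actual node export (LiveGroup, OpScript, and so on) went through Katana's own export and is left out here.

```python
# Sketch of the user-tool mechanism: each saved tool gets its own folder with a
# JSON descriptor next to the exported nodes. Paths and descriptor fields are
# illustrative; the node export itself is omitted.
import json
import os

from Katana import NodegraphAPI

USER_TOOLS_ROOT = os.path.expanduser("~/katana_user_tools")  # placeholder path

def write_descriptor(name, description):
    """Record what the artist selected so the shelf can label and load it later."""
    nodes = NodegraphAPI.GetAllSelectedNodes()
    if not nodes:
        raise RuntimeError("Select the node(s) that make up the tool first.")

    tool_dir = os.path.join(USER_TOOLS_ROOT, name)
    os.makedirs(tool_dir, exist_ok=True)
    meta = {
        "name": name,
        "description": description,
        "node_types": sorted({n.getType() for n in nodes}),
        "nodes_file": name + ".katana",  # written by the (omitted) export step
    }
    with open(os.path.join(tool_dir, name + ".json"), "w") as fh:
        json.dump(meta, fh, indent=4)

def discover_user_tools():
    """Return one descriptor per saved tool so the shelf UI can list them."""
    tools = []
    if os.path.isdir(USER_TOOLS_ROOT):
        for tool_name in sorted(os.listdir(USER_TOOLS_ROOT)):
            descriptor = os.path.join(USER_TOOLS_ROOT, tool_name, tool_name + ".json")
            if os.path.isfile(descriptor):
                with open(descriptor) as fh:
                    tools.append(json.load(fh))
    return tools
```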

User Tools to Studio Tools:

Most user tools started out as a simple, handy set of scripts or nodes put together quickly by myself or another artist in the department. They were usually written fast and debugged during use, not really the pipeline way, and pipeline would not be responsible for maintaining them. One such tool was a RenderSettings node that would eventually replace the official “pipeline“ one. It exposed more information to the artist, such as a sample calculation that gave an idea of how many samples were actually being used, surfaced newer features like adaptive rendering, and exposed optimisation parameters such as the low light threshold and deep samples, to name a few, while hiding parameters that would only overwhelm the artist. It also added render presets, set up by the leads, for low, medium, and high quality.
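The sample read-out is worth a quick aside for anyone who has not fought with Arnold settings before: both the camera (AA) samples and the per-ray samples get squared, so the node showed artists the effective ray counts rather than the raw slider values. Roughly:

```python
# Rough version of the sample read-out exposed on the RenderSettings node.
# Arnold squares the camera (AA) samples and the per-ray samples, so the
# effective ray count per pixel grows very quickly.
def arnold_effective_samples(aa=3, diffuse=2, specular=2, transmission=2, sss=2):
    camera_rays = aa * aa
    per_camera_ray = {
        "diffuse": diffuse * diffuse,
        "specular": specular * specular,
        "transmission": transmission * transmission,
        "sss": sss * sss,
    }
    return {name: camera_rays * rays for name, rays in per_camera_ray.items()}

# AA 3 / diffuse 2 -> 9 camera rays * 4 diffuse rays = 36 diffuse rays per pixel.
print(arnold_effective_samples())
```

With AA at 3 and diffuse at 2 you are already at 36 diffuse rays per pixel, which is exactly the kind of number you want an artist to see before a shot goes to the farm.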

As my trust with pipeline grew, the doors to their world were opened to me as well. That allowed me to take some of these “artist tools“, which had started as quick, rough scripts, and build them into more robust, pipeline-compliant tools, so they could be pushed into the repository and maintained by pipeline if needed. One such tool was the render settings revamp mentioned above.

Enhancing Internal Tools and Workflow Efficiency:

One of the primary tasks I undertook was evolving an internal instancing tool within Katana to leverage USD instancing, specifically "point instancing." This was particularly useful for large-scale scenes, allowing us to manage massive environments more effectively.
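For reference, this is roughly what the tool authored under the hood, shown here with the USD Python API rather than Katana nodes; the stage path, prim paths, and the referenced prototype asset are placeholders.

```python
# Minimal example of authoring a UsdGeomPointInstancer with the USD Python API.
# Paths and the referenced prototype are placeholders; the Katana tool built
# the equivalent data from the instancing setup in the scene.
from pxr import Usd, UsdGeom, Gf, Vt

stage = Usd.Stage.CreateNew("scatter.usda")
instancer = UsdGeom.PointInstancer.Define(stage, "/scatter")

# One prototype, referenced in from its own asset file.
proto = stage.DefinePrim("/scatter/prototypes/tree", "Xform")
proto.GetReferences().AddReference("tree_asset.usd")
instancer.CreatePrototypesRel().SetTargets([proto.GetPath()])

# Three instances of prototype 0, each with its own position.
instancer.CreateProtoIndicesAttr(Vt.IntArray([0, 0, 0]))
instancer.CreatePositionsAttr(
    Vt.Vec3fArray([Gf.Vec3f(0, 0, 0), Gf.Vec3f(5, 0, 2), Gf.Vec3f(-4, 0, 7)])
)
stage.GetRootLayer().Save()
```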

Non-persistent helpers were employed extensively: for instance, a tool to adjust the resolution percentage dynamically, and another to control fur attributes by dividing the density and multiplying the thickness. This approach significantly reduced render times and resource consumption while setting up scenes.
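Both helpers boiled down to very little maths; the fur one, for instance, divides the curve density by a factor and scales the width back up so the groom still reads roughly the same. A trivial sketch, with the reduction factor and resolution percentage as illustrative defaults:

```python
# Trivial sketches of the preview helpers; in Katana these drove temporary
# attribute overrides rather than standalone functions.
def fur_preview(density, width, reduction=4.0):
    """Fewer, thicker curves: coverage stays roughly the same, renders much faster."""
    return density / reduction, width * reduction

def preview_resolution(width, height, percentage=50):
    """Scale the render resolution down for preview renders."""
    return int(width * percentage / 100), int(height * percentage / 100)
```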

Optimising shading and geometry settings was another focus. I developed non-persistent tools that allowed artists to switch off unnecessary subdivisions, disable displacement, and simplify shading setups for faster preview renders.

Breaking Ground with USD Integration:

During the early stages of adopting Universal Scene Description (USD), we faced the challenge of incorporating it into our existing pipeline. At the time, Solaris was still in its infancy, so we often relied on manually written USDs using the Python API. These USDs primarily served as "containers" to organise referenced alembics and other assets in a more modular fashion. Despite being a time-consuming process, it laid the groundwork for developing more automated approaches in the future.
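In practice those hand-written "containers" were not much more than a thin layer referencing the published alembics under tidy prim paths, so downstream scenes only ever dealt with a single file. A minimal sketch, with placeholder paths, assuming a USD build that includes the Alembic plugin:

```python
# Roughly what the hand-written "container" USDs amounted to: a thin layer that
# references the published alembics under tidy prim paths. Paths are
# placeholders and this assumes USD was built with the Alembic plugin.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("env_city_block.usda")
root = UsdGeom.Xform.Define(stage, "/env_city_block")
stage.SetDefaultPrim(root.GetPrim())

published_caches = {
    "buildings": "/publish/env/city_block/buildings_v012.abc",
    "props": "/publish/env/city_block/props_v007.abc",
}

for name, abc_path in published_caches.items():
    prim = stage.DefinePrim(f"/env_city_block/{name}", "Xform")
    prim.GetReferences().AddReference(abc_path)

stage.GetRootLayer().Save()
```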

The introduction of USD also necessitated smarter environment assembly techniques. I worked closely with the pipeline and layout teams to establish a framework for organising large scenes using camera-based LOD loading and procedural proxies. This allowed for seamless scene handling, even with vast data sets, and made it easier for lighting artists to iterate quickly.
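Solaris and newer tooling make this kind of assembly much easier today, but at the time the LOD side of it was mostly variant sets (and payloads) flipped per prim by the assembly step. A bare-bones sketch of the idea, with the camera-distance logic left out and the paths kept illustrative:

```python
# Bare-bones version of the LOD idea: each asset carries an "lod" variant set,
# and the assembly step picks "proxy" or "render" per prim (in our case based
# on distance to the camera, which is left out here). Paths are placeholders.
from pxr import Usd

stage = Usd.Stage.CreateNew("asset_with_lods.usda")
prim = stage.DefinePrim("/asset", "Xform")
stage.SetDefaultPrim(prim)

lod = prim.GetVariantSets().AddVariantSet("lod")
for level, source in (("proxy", "asset_proxy.usd"), ("render", "asset_render.usd")):
    lod.AddVariant(level)
    lod.SetVariantSelection(level)
    with lod.GetVariantEditContext():
        # Each variant references a different representation of the geometry.
        stage.DefinePrim("/asset/geo").GetReferences().AddReference(source)

# The assembly tool would then flip the selection per instance:
lod.SetVariantSelection("proxy")
stage.GetRootLayer().Save()
```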

Tackling Arnold AOVs and Custom Passes:

Rendering passes and AOVs (Arbitrary Output Variables) are crucial for compositing, but they often come with technical challenges. One such problem I addressed was rewriting the Z-depth AOV for Arnold, which had issues with transparent shaders that caused inaccuracies in compositing.

Developing a Denoising Tool for Multi-Part EXRs:

Rendering noiseless images is a challenge. I created a Nuke-based tool for handling Arnold denoising operations, allowing for the deconstruction and reconstruction of multi-part EXRs. To achieve optimal results, the tool ensured that all the required denoise AOVs were activated in the revamped Katana render settings. A custom albedo AOV was also introduced to handle refractions correctly, maintaining the integrity of the image during denoising.
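The Nuke side of the tool essentially walked the layers of the multi-part EXR, split each one onto its own branch for the denoiser, and rebuilt the multichannel stream afterwards. A cut-down sketch of just the split step, meant to run from Nuke's Script Editor with the Read node selected; the node placement offsets are arbitrary.

```python
# Cut-down sketch of the split step of the denoise tool: one Shuffle per layer
# of the selected multichannel Read, ready for the denoiser to be dropped onto
# each branch before everything is merged back together.
import nuke

read = nuke.selectedNode()
for index, layer in enumerate(nuke.layers(read)):
    shuffle = nuke.nodes.Shuffle(label=layer, inputs=[read])
    shuffle["in"].setValue(layer)
    shuffle.setXYpos(read.xpos() + index * 120, read.ypos() + 150)
```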

Leading the Workflow as a CG Supervisor at Orca Studios:

In my role as CG Supervisor at Orca Studios, I was responsible for overseeing the entire production workflow, from modelling and texturing to look development, rigging, animation, layout, lighting, and rendering. My technical background allowed me to continue developing user tools to help the team, such as a tool in Maya for rendering multi-part EXRs efficiently.

I also spearheaded the integration of Houdini into the pipeline, focusing on Solaris and exploring the potential of adopting Karma as a future renderer. This involved setting up initial standards and exploring various procedural workflows for lighting, environment assembly and the development of templates.

Technical Adventures Beyond the Studio:

My technical and artistic pursuits aren't limited to the workplace. I have a deep passion for painting and photography, which have often led me to explore the technical side of art. A few years ago, when transitioning from Trixter, I found myself with plenty of artistic work to show but very little in the way of technical work that could be shared publicly. This inspired me to develop solutions for my personal projects, such as creating time-lapses from raw camera footage.

One of the hurdles was converting raw photos from my camera to EXRs while preserving the dynamic range and conforming to the ACES colour workflow. I customised an existing tool, `rawtoaces`, to create a more user-friendly solution, which I later shared as an open-source project on GitHub: https://github.com/curadotd/rawtoaces_gui. This project not only made my personal workflow easier but also served as a demonstration of my coding skills and understanding of colour science.
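At its core the GUI is just a batch front end for the command-line tool: collect the raw files and run rawtoaces on each, letting it handle the ACES conversion and EXR output. A bare-bones version of that loop is below; the folder path and extension list are placeholders, and the GUI layers rawtoaces' white-balance and matrix options on top of this plain invocation.

```python
# Bare-bones version of the batch loop behind the GUI. Folder and extensions
# are placeholders; rawtoaces itself handles the ACES conversion and output.
import pathlib
import subprocess

RAW_EXTENSIONS = {".cr2", ".cr3", ".nef", ".arw", ".dng"}

def convert_folder(folder):
    for raw_file in sorted(pathlib.Path(folder).iterdir()):
        if raw_file.suffix.lower() not in RAW_EXTENSIONS:
            continue
        subprocess.run(["rawtoaces", str(raw_file)], check=True)

convert_folder("/path/to/timelapse/shoot")
```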

Future Projects and Research:

Looking ahead, I am excited about further exploring USD and MaterialX, aiming to establish an agnostic rendering setup that isn’t tied to a specific render engine. The goal is to enhance viewport representation for lighting and look development, making setups more efficient and flexible. One of the ideas I’m pursuing is creating a bridge from Mari to a Hydra render delegate, allowing artists to convert texture properties into a basic MaterialX shader. This would enable quick and reliable turntable previews for leads and supervisors to review assets.
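As a rough illustration of where that Mari-to-Hydra bridge would start, the snippet below wraps a single exported texture channel in a standard_surface material using the MaterialX Python bindings; the channel, file path, node names, and the exact calls are from memory and purely illustrative, since this is very much still an idea rather than a finished tool.

```python
# Rough illustration only: wrap one exported texture channel in a
# standard_surface material that a Hydra delegate could pick up for turntables.
# File path, node names, and the exact MaterialX calls are illustrative.
import MaterialX as mx

doc = mx.createDocument()

image = doc.addNode("image", "baseColor_tex", "color3")
image.setInputValue("file", "asset_baseColor.tif", "filename")

shader = doc.addNode("standard_surface", "preview_srf", "surfaceshader")
shader.addInput("base_color", "color3").setConnectedNode(image)

material = doc.addNode("surfacematerial", "preview_mtl", "material")
material.addInput("surfaceshader", "surfaceshader").setConnectedNode(shader)

mx.writeToXmlFile(doc, "preview_material.mtlx")
```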

These endeavours reflect my philosophy of continuous learning and pushing the boundaries of what’s possible in computer graphics. The future holds many exciting possibilities, and I’m eager to see where this journey will take me next.
