At Substance we live, breathe and dream in materials. They are at the core of what we do.

For us to be effective, we need our materials to be as portable and applicable as possible across our users' vastly different shading and texturing pipelines.

In a typical workflow, we structure our materials as procedural textures, neatly packaged in archive files, which rasterize to actual textures before rendering happens, either in memory or to files on disk.

The textures typically take the form of well-defined channels, such as baseColor, roughness, height, and so on. That is because we rely on some form of standardized shader, or uber shader, to know what to do with these textures and to render good-looking images with a physically based model. In general, the knowledge around these standardized shaders has been critical to most texturing workflows in the CG industry.

Introduction

My name is Davide Pesare. I have been passionate about shaders for almost two decades, and since I joined Substance in France a few years back I have been at the helm of our Labs team. Our mission includes innovating and developing new technology, as well as exploring the existing tech landscape around our industry. Pursuing that mission led us to expand on our shader work.

Sometimes uber shaders are not enough

For years, we have largely relied on a shading model we call MetalRoughness, which is heavily inspired by Disney's principled shader from 2012. It is a very effective, concise and friendly way to parameterize and paint a physically based shader. Along with its sibling, SpecularGlossiness, this model is still largely in use in many applications around the digital content creation industry, as well as in standard platforms such as Khronos Group's glTF.
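
As a minimal illustration, here is a sketch of how a set of baked MetalRoughness-style channels might bind to such a standardized shader's inputs. The channel names are the conventional ones; the shader object and its set_texture method are hypothetical, for illustration only.

```python
# Baked texture channels, keyed by the conventional names a standardized
# (uber) shader knows how to interpret.
baked_channels = {
    "baseColor": "wood_baseColor.png",  # surface albedo
    "metallic":  "wood_metallic.png",   # dielectric (0) vs. metal (1)
    "roughness": "wood_roughness.png",  # microfacet roughness, 0..1
    "height":    "wood_height.png",     # displacement / parallax
}

def bind_to_uber_shader(shader, channels):
    """Assign each baked texture to the shader input of the same name.
    `shader.set_texture` is a hypothetical API, for illustration only."""
    for channel, texture in channels.items():
        shader.set_texture(channel, texture)
```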

Despite its broad success, this one-size-fits-all model does not, in fact, fit all. Here are a few reasons why:

  • Some materials that are common in the real world are hard to match with this model: car paint, with its flakes; surfaces with droplets of liquid on top; iridescent materials; fuzzy surfaces; and many more.
  • While a physically principled model is a great improvement over previous models, it is not a physically accurate simulation of what light does. The Disney model strives for simplicity and directability. Real metal surface reflectance and physical lobe layering are good examples where doing it correctly would require more measurements and parameters than is deemed acceptable for the vast majority of users. That penalizes accuracy to a point of being unacceptable for some applications.
  • Baking all the textures ahead of time may be fine for making a single object, but if you have 1000 variants of that same object, you need to export 1000 sets of textures, which is not always practical in production. Having some procedural pattern and texture modulation directly in the shader can help create much more variety with a smaller footprint (see the sketch after the images below).
A car paint model involves multiple layers of specularity, a physically correct absorption and scattering model, complex behaviors for flakes and scratches and flexible levels of detail for different viewing distances.

Beauclair, from The Witcher 3 (CD Projekt Red). Fitting a wild variety of textures in the city without running out of memory required carefully crafting shaders for all the buildings in the city, so that texture reuse is maximized.
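
To make the footprint argument from the list above concrete, here is a small sketch (hypothetical, not Substance code): instead of baking 1000 texture sets, a single shader exposes a few procedural parameters, and each variant is just a tiny parameter record.

```python
import random

# One texture set, baked once and shared by every variant.
SHARED_CHANNELS = {
    "baseColor": "crate_baseColor.png",
    "roughness": "crate_roughness.png",
}

def make_variant(seed):
    """A variant is only a handful of shader parameters; the shader
    combines SHARED_CHANNELS with these values at render time."""
    rng = random.Random(seed)
    return {
        "channels": SHARED_CHANNELS,  # reused, not duplicated
        "tint": tuple(rng.uniform(0.8, 1.0) for _ in range(3)),
        "noise_seed": seed,           # drives procedural grime
        "wear": rng.uniform(0.0, 0.5),  # procedural edge wear amount
    }

# 1000 variants cost 1000 small records instead of 1000 texture sets.
variants = [make_variant(seed) for seed in range(1000)]
```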

Substance Designer’s MDL graph

We recognized this need years ago, and in 2015 we deployed a shader authoring graph. We partnered with NVIDIA and built on their Material Definition Language (MDL). This opened the door to having a path tracer (Iray) in our applications and to providing much better previews of our materials.

Substance Designer’s MDL graph can be used to combine illumination lobes into customized shading models, matching the shaders used in almost any production.

Portability challenges

While we were happy with the quality of the results, we still faced portability challenges. The MDL ecosystem does not reach all industries, and many popular renderers rely on other solutions and architectures, such as OSL or custom languages. We needed something that could reach these industries, something less dependent on which renderers specifically support a language.

MaterialX

Fortunately, someone else had needed the same thing for some time: Lucasfilm and a few partners had started developing a standardized schema for describing procedural textures (as described in a shader), color spaces, compositing math and material assignments. This initiative was called MaterialX, and it allowed Lucasfilm to deploy large shading networks that worked and looked correct across multiple offline and real-time renderers. So our Labs team began a collaboration with ILM and the Lucasfilm ADG team in 2018.

That summer, during SIGGRAPH, we presented a prototype that read MaterialX files to populate and maintain a library of material presets in Substance Painter, which artists could use to synchronize their projects to the latest looks approved in production. We also had a Substance Painter plugin which exported, along with the regular textures, a MaterialX file including information about bindings and shaders.
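
In essence, such an exported file is a MaterialX document recording which textures exist and how they are exposed. As a rough illustration (not our actual plugin code), this is how a document of that shape could be assembled with the MaterialX Python bindings; the API shown follows the current bindings (our 2018 prototype predates some of it), and the node and file names are made up.

```python
import MaterialX as mx

doc = mx.createDocument()

# A nodegraph holding one image lookup per exported texture channel,
# each exposed as a named output other tools can bind to.
graph = doc.addNodeGraph("NG_paint")
for channel, mxtype, filename in [
        ("baseColor", "color3", "paint_baseColor.png"),
        ("roughness", "float", "paint_roughness.png")]:
    image = graph.addNode("image", channel + "_tex", mxtype)
    image.setInputValue("file", filename, "filename")
    graph.addOutput(channel, mxtype).setConnectedNode(image)

# Serialize the document next to the regular texture exports.
mx.writeToXmlFile(doc, "paint.mtlx")
```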

https://video.tv.adobe.com/v/3421520?autoplay=true

Our 2018 proof of concept with MaterialX.

A Portable Shader Graph

In 2019, we pushed the material workflows much further. Building on the customizability of Substance Designer, we built a “portable” shader graph editor entirely as a Python plugin.

This is an extension of the existing MDL graph that is able to export MaterialX as well as MDL. From MaterialX, we are able to extract customized GLSL and OSL, which were two of the major export targets we were missing, and which opened the door to many more renderer options downstream.

https://video.tv.adobe.com/v/3421521?autoplay=true

A capture of our Portable Shader Graph prototype in Substance Designer, Aug 2019.

This new graph and the exported shaders allowed us to get matching renders in the Iray and GL viewports in Substance Designer, which is great, as it was previously rather hard to preview a GL version of an MDL graph.

Out of the same graph we can also export shaders to OSL, which we tested in both Arnold (a wildly successful top-tier offline renderer, primarily used for filmmaking) and appleseed (a popular open-source path tracing solution).

By providing appropriate hints and tags, we were able to export GLSL and MDL to Substance Painter, bringing a welcome rendering cohesiveness to our toolset.

With a single button we were able to open the model in Substance Painter and then in MaterialXView, the native MaterialX preview application, which lets us validate a shader and its parameterization in a neutral application before sending them to other facilities.

https://video.tv.adobe.com/v/3421519?autoplay=true

A MaterialX shader being used for painting in Substance Painter and opened as a preview in MaterialXView, with all its parameters still live-editable.

Finally, we could export the MaterialX network in a standardized, authorable form to other early adopters such as Maya and 3ds Max. This could create new opportunities for the exchange of shading data across tools and studios around the world.

How does it work?

Initially, MaterialX was mainly a collection of schemas for compositing patterns and encapsulating looks and bindings to geometry. Illumination properties were left as black boxes, in the hope that facilities would talk to each other and be able to match the final looks.

Thanks to an initiative called ShaderX (born from a collaboration between Lucasfilm and Autodesk), MaterialX was extended in 2019 to support well-specified custom BRDF authoring as well as shader code generation. We use this system for generating both OSL and GLSL.
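
As a rough sketch of what driving that code generation looks like from the outside (module and call names here are per the MaterialX Python bindings as distributed today, as we understand them; the document and element names are hypothetical):

```python
import MaterialX as mx
import MaterialX.PyMaterialXGenShader as mx_gen_shader
import MaterialX.PyMaterialXGenGlsl as mx_gen_glsl
import MaterialX.PyMaterialXGenOsl as mx_gen_osl

# Load an authored document. (The standard node definition libraries must
# also be imported into `doc` for generation to succeed; omitted here.)
doc = mx.createDocument()
mx.readFromXmlFile(doc, "paint.mtlx")

# The same document can feed several target languages.
for generator in (mx_gen_glsl.GlslShaderGenerator.create(),
                  mx_gen_osl.OslShaderGenerator.create()):
    context = mx_gen_shader.GenContext(generator)
    for element in mx_gen_shader.findRenderableElements(doc):
        shader = generator.generate(element.getName(), element, context)
        source = shader.getSourceCode()  # pixel-stage source by default
        print(generator.getTarget(), element.getName(), len(source))
```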

Sharp readers may have noticed that I didn’t specifically include MDL among our exports. That is because our Portable Shader Graph is already a native MDL graph under the hood. We could have decided to export MDL using the same mechanism as OSL and GLSL, but our native implementation yields much higher performance. We may want to revise this in the future, should NVIDIA embrace MaterialX more officially.

To enforce MaterialX compatibility in the Portable Shader Graph, we built MDL modules for every MaterialX construct and deprecated any (rare) non-conforming nodes. We also added support for sub-graphs, a concept already well supported in Substance Designer, which simplify authoring by encapsulating a common network of nodes into a single node for later reuse.

The important caveat is that, to simplify the artist workflow and maximize portability, we decided to focus the graph explicitly on shader pattern modulation and authoring, and to use a single illumination model as the terminal for our networks. We already had custom illumination authoring in MDL anyway, via the various MDL illumination lobes (such as GGX specular, Oren-Nayar diffuse, and so on) and the layering modes it provides.
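
To make that split concrete, here is a toy example built with the current MaterialX Python bindings (node and material names are made up): the pattern nodes do all the procedural work, and the network terminates in one fixed surface shader.

```python
import MaterialX as mx

doc = mx.createDocument()

# Pattern side: a procedural noise blends between two colors.
noise = doc.addNode("noise2d", "grime_noise", "float")
mixer = doc.addNode("mix", "color_mix", "color3")
mixer.setInputValue("fg", mx.Color3(0.8, 0.6, 0.4))
mixer.setInputValue("bg", mx.Color3(0.2, 0.2, 0.2))
mixer.addInput("mix", "float").setConnectedNode(noise)

# Terminal side: every network ends in the same illumination model.
surface = doc.addNode("standard_surface", "SR_toy", "surfaceshader")
surface.addInput("base_color", "color3").setConnectedNode(mixer)
surface.setInputValue("specular_roughness", 0.35)
doc.addMaterialNode("M_toy", surface)

print(mx.writeToXmlString(doc))
```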

What shading model did we use? That brings us to the next topic.

A surface standard to grow towards

As discussed, the classic MetalRoughness model does not give us all the features we need, so over the last few years we started extending it into several semi-documented extensions, each adding a single feature to it, such as coat, anisotropy, subsurface, and so on.

We soon realized this was not tenable, because from the outside the new surfaces and their parameters looked somewhat arbitrary.

On the other hand, expecting users to write their own BRDF networks in MDL was a bad choice because, plainly put, uber shaders are much easier to use and to exchange textures for.

We started looking into drafting a whitepaper to document our ideal surface, building on our existing tests. That was effectively creating a new standard.

This was just around SIGGRAPH 2018, and that is when Autodesk approached us to ask for feedback on a draft whitepaper about exactly that: establishing a new standard shading model, called StandardSurface, which is inspired by the Arnold shaders as well as the Disney model and its descendants.

That was great timing, not just because we wanted to do the same, but also because Autodesk was already involving some of the best minds in the field.

While the word “perfect” rarely fits in the same sentence as the word “standard”, we like what we have found in StandardSurface. It strikes a good balance by extending the existing capabilities of common uber shaders to cover a vast majority of cases in the real world. It does this without overcomplicating the model with too many parameters and arbitrary rules. It provides guidelines for simplification, so real-time or less featured renderers can still have good approximations. Perhaps most importantly, it is well reasoned and documented, which makes it easier to accept or discuss the choices made, and provides a guide to grow towards this standard.
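
As a toy illustration of such a simplification guideline (the mapping here is our illustrative choice, not a normative part of the spec), a lightweight target can keep the StandardSurface inputs it can represent and fall back to the classic MetalRoughness channels:

```python
# Illustrative mapping from a few StandardSurface inputs to the classic
# MetalRoughness channels; anything unmapped (coat, sheen, thin film, ...)
# is simply dropped by this simple target.
STANDARD_TO_METAL_ROUGH = {
    "base_color": "baseColor",
    "metalness": "metallic",
    "specular_roughness": "roughness",
}

def simplify(standard_params):
    """Collapse a StandardSurface parameter set for a less featured renderer."""
    return {STANDARD_TO_METAL_ROUGH[name]: value
            for name, value in standard_params.items()
            if name in STANDARD_TO_METAL_ROUGH}

print(simplify({"base_color": (0.9, 0.1, 0.1),
                "specular_roughness": 0.4,
                "coat": 1.0}))  # coat is dropped by the simple model
```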

In general, we think it is preferable to have a slightly over-specified standard rather than an underspecified one. Especially on the authoring side, it is easy and safe to support an uncontroversial subset of a given standard. What is produced will be perfectly readable by any renderer that supports that standard. On the other hand, adding parameters to an underspecified standard puts us back in the situation we were in before, where we have no “rails” and we need to make up rules as we go, usually with the result that the new exports won’t be very widely used or supported.

Conclusions

The results of this exploration created a lot of interest and confirmed our belief that there is value in having a portable shader graph of this kind. It has been inspiring to see growing parts of the film, game and DCC industries come together for this project.

Node graphs are a fantastic way to express and visualize the complex networks of behaviors that happen in a shader (or, more generally, in a program), and being able to integrate that into the existing ecosystem that Substance products provide shows a lot of potential.

In addition, in Labs we would like to explore more deeply the relationship between the MaterialX pattern and shading schemas and Pixar's USD as a supporting format. USD has reached a critical mass in adoption, but its shading schemas, despite their clarity and stability, are heavily underspecified (by design, to increase flexibility in production). The lack of a strong shading exchange protocol has in practice limited its expressivity. The MaterialX plugin for USD is showing some promise in that direction, but we think more work is needed to solidify these advances.