A classic game revisited! Today we interview the NVIDIA team and learn more about the making of “Marbles,” a game simulation running in real time thanks to RTX technology. Let’s find out how the team textured virtually all the assets with Substance Painter so as to preserve a highly photo-realistic feel in the real-time demo.
Gavriil: Hi, my name is Gavriil Klimov, and I am senior art director at NVIDIA. I lead the amazing team that works on real-time technical demos, products, industrial design, cinematics, and so on. I set the creative direction for the projects we work on, art direct the team, as well as ensure that we work according to our deadline while maintaining the quality we strive for.
Jacob: Hi there, my name is Jacob Norris and I’m the lead environment artist at NVIDIA. My role here is to create 3D environments for the tech demos, games, and cinematics that we work on in the studio. A lot of what I do on the team not only consists of working within our collaboration platform, Omniverse, but also involves modeling, texturing assets, world-building, and working with other artists to ensure quality, among various other responsibilities.
NVIDIA Marbles RTX Demo
Gavriil: Omniverse is a collaboration platform in which creators, designers, and even studios will be able to create their work together, from across the room to across the globe. One of the key goals of Omniverse is to combine the technologies of real-time ray tracing, physics, and AI into one platform for creatives to use interactively. Only then can you build real-life simulators for robotics, drive, visualization, and entertainment. In order to build a great platform, we “stress test” internally with real-world scenarios. Previously, we have done factory floors that simulate robotics, as well as cars across various driving scenarios, and for this, we decided to have some fun based on the old game called Marble Madness: the player controls a marble around a level full of obstacles trying to reach the end. The catch, in our version, is that the player controls the orientation of the world, tilting it along its axis and making the marble roll accordingly, while obeying the laws of physics along the way. The entire simulation is rendered by the NVIDIA RTX Renderer inside of Omniverse, which is a pure ray-tracer that uses DLSS to enhance sharpness. We textured over 165 unique assets for the project.
Gavriil: As mentioned, this was sort of an homage to an old game called Marble Madness. Our goal was to have the marble move physically correctly in a highly detailed, realistic environment. When creating the art direction, it’s important to answer some specific questions. The first one was: what is the theme and the setting of the game? Well, we knew that this was going to be based around the idea of a little marble moving around the world. We knew that the setting had to be a real-life scale setting, based on true physical dimensions. Building arbitrary elements (like the Marbles for NES) to look sci-fi could be fun, but it would also throw the scale entirely off, as the audience does not necessarily have a mental frame of reference for what those items are going to be. Putting together something known like a marble and something not known would not have worked as well for our case. In terms of theme, one of the first things that came to my mind was the Toy Story game for SNES. I played that game as a kid non-stop. I found it beautiful for the time, and to be fair it still holds up fine. I could imagine the marble rolling down a kid’s room; lots of ideas came out right away about what could be used to construct a marble circuit. However, more ideas kept coming, spawned from this first one. A musician’s studio. A “studio” room. When we reached the idea for the studio room, I immediately had a feeling and knew what was right: an artist’s studio. I could not think of a better place for our demo.
What we do at NVIDIA is combine and fuse art and technology. Art gets elevated by technology, and vice-versa. However, all the art that we create is digital. I really liked the aspect of having an actual, analog studio full of art supplies. It took me back to when I attended college, and I felt familiar enough with all the items to visualize a space that would properly utilize all of these elements. That’s when we knew what our theme was.
Once you have a theme and a setting, it’s important to set the game’s design goals right away. We knew we needed the marble to be pushed, kicked, blown away, etc. We basically started to make a list of all the possible “enemies” for the marble. We originally had a long list, but due to the timeline we decided to trim it down. Some earlier versions included glue on the floor that would slow you down if you went over it, and falling brushes. Having to overcome obstacles makes the gameplay much more fun and appealing. We knew we wanted to create these custom-looking elements for gameplay, assembled from many other standalone assets; these would become our enemies. We worked together to brainstorm many different ideas and then we made a final selection. This is how we came up with our list of things like “Pencil Pushers,” “Spray Can Blowers,” “Plastic Hammers,” and so on. We started immediately to work on iterations of the game level to test them out and see where they would work best. Once we felt we reached a pretty good spot in the level iteration, we then started to create the concept art. I worked with the concept artist to execute our list of gameplay concept-enemy ideas, as well as pushing him to come up with his own list of visually interesting transitions between one part of the map and another. When creating these custom assets, the important questions were: How are they combined together? How do they work in the level? It’s one of the earliest and often most difficult problems to solve, but it’s the most important one, and it helps to have it figured out early on rather than later.
At this point, I headed out to Michael’s and bought about $1,000 worth of art supplies.
I basically bought, nearly 1-to-1, most of the items that we would end up having in the level. From the brushes to the erasers, the pastels to the spray cans, I got it all. I am a huge believer in having super solid references that the team can follow, and it helps your job as an art director to analyze the real-life items, dissect them, and learn what makes them authentic and realistic.
One of the major things I knew we had to get right was the imperfections and all the subtle elements that make an asset feel used. One of the most common mistakes I see, not only at the junior level but even among professional 3D artists, is modeling assets as if everything were “brand new” or nearly brand new, and then adding all the damage and wear mostly in the texturing pass. Even assuming the artist is really good, the image generated would still feel a bit as if it were in the uncanny valley: great, but not quite realistic. I am a big proponent of creating the right amount of changes at the geometry stage. The silhouette read is very important, and there is only so much help a normal map can give you. For this demo, we did not use any displacement, so all of our imperfections came from hand-modeled geometry.
I set up my balcony with all the tools I bought and started to use them. I mixed all the paints. I sprayed the spray cans. I built our cork cube that is later seen in the map as one of the cubes that you cross over with the marble. I took pictures of splatter and paint, and I used all of this reference to give feedback to the team.
Here is an example of the real-life cork cube I built, and below you can find the 3D reconstruction.
Gavriil: This project was completed in a relatively short amount of time. We needed to be as efficient as possible, so it came down to how we distributed the work between artists. Whether it was models, textures, UVs, or importing and placing the assets into Omniverse, Substance Painter played a critical role in making sure we achieved that goal while still maintaining the quality level we wanted within the timeline we had.
The entire Marbles RTX Project — every single asset, model, gameplay element — was textured 100 percent inside Substance Painter. Of course, we would get some normal information from high-poly models, ZBrush sculpts, or things of that nature, but all of the Albedo information and final texture pass was completed in Substance Painter. There was no photogrammetry or source photography used for any of the textures or models in the scene. It is all handmade with love and care by each of the artists on the project.
Jacob: This is why Substance Painter was critical in ensuring we could provide the quality level we wanted within the timeline we had. The sheer number of assets we had to texture was incredibly vast. Luckily, we were able to use a lot of the smart materials, as well as create many of our own, to get a lot of the unique texture information we needed for each asset. As you can see from the demo, a lot of the assets also had custom paint splatter and droplets across them to sell this idea of being “inside of an art studio.” Substance Painter made it incredibly easy for us because we were able to use the alphas and brushes provided with the software, as well as our own custom ones, to easily paint all of that information within a 3D viewer. It allowed us to get some really beautiful and custom texture work onto pretty much every single asset and preview it in a similar lighting scenario, so that we knew it would look good before we even brought it into our level.
In many cases, you would see the same asset in multiple spots right next to each other in the game world, so we needed to quickly create variations of everything too. Through the utilization of the Substance layers system and anchors, we could simply set up different layers within one Substance file to create many variations of the same asset by feeding it new seed numbers or painting slightly different custom masks for each of our texture sets and model variations.
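The seed-driven variation idea described here can be illustrated outside Painter with a small sketch (NumPy; the function and its parameters are hypothetical illustrations, not part of any Substance API): the same "layer setup" fed the same seed always reproduces the same mask, while a new seed yields a fresh variation.

```python
import numpy as np

def variation_mask(seed: int, size: int = 64, coverage: float = 0.3) -> np.ndarray:
    """Deterministic grayscale mask: the same seed always reproduces the
    same mask, while a new seed gives a fresh variation of the 'material'."""
    rng = np.random.default_rng(seed)
    # Threshold random noise so roughly `coverage` of texels are masked in.
    return (rng.random((size, size)) < coverage).astype(np.float32)

a = variation_mask(seed=1)
b = variation_mask(seed=1)   # identical to a: reproducible
c = variation_mask(seed=2)   # different: a new variation of the same setup
```

This mirrors the workflow described above: one file, many variations, each fully recoverable from its seed.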
In short, Substance makes it very easy to work with a large team, a large number of assets, and to be able to do so in a way that allows artists to bypass a lot of the busy work, such as creating simple imperfections, edge grunge, dirt buildup, or layering details on surface. Using Substance gives us more time and energy to create the more artistic aspects of telling interesting stories through our texture placement, custom paint splatter, and “showing the way objects are used,” like specific wear of rotating pieces or gravity causing directional dripping of paint and water, and other similar variations of texture storytelling.
Gavriil: The team was all sharing the Substance Painter files with each other in the cloud. At the end of each day, images would be shared of the progress on Slack. This allowed Jacob, Andrej, and me to comment and provide feedback as the assets were being created. If we wanted to look at something in more detail, it was quick and easy to simply open the Substance Painter file and talk about areas we felt could use improvement. This also gave us an opportunity: As we were seeing textures being made and assets being worked on, we could determine whose version of the wood was the best, or which metal we liked the most, and could then ask artists or recommend to artists to grab this wood from artist “A” and use it on asset “B.” So we were quickly able to increase the quality of the entire project by using the best work from each person and share it across assets while things were being populated.
Jacob: This is similar to what a lot of people do throughout game projects already. Generally, while you’re working on a level or environment, you’ll walk by someone’s desk and see a concrete ground they’re working on and think to yourself, “This ground would be perfect for my section, too!” Whereas, in our case, we weren’t working on large environments with many tiling textures. Everything we were doing was all small individual props and assets. So, while the idea was the same, the utilization was just slightly different, in that we would grab custom textures from within each prop and just use them throughout the game space on different custom textured assets. This also, inadvertently but purposefully, kept a similar look throughout the project. It helped everything to feel as though the person who owned this fictional art studio (that we constructed for marbles and for which we created all of these assets by hand) was using the same materials that they had purchased from that same art store to build everything in this world.
Jacob: The first step to making texturing easier and making your assets better is having good UVs. Triplanar mapping and Substance Painter auto-UVs can help in many cases, but an asset prepared with proper UVs provides the best canvas to texture, is easier to work with, and retains a much higher texel density. Here is an example of good versus bad UVs.
The next step, of course, is gathering tons of reference images. Then, do your best to simply notice all the minute subtleties and details, as well as the imperfections in your reference that make your asset feel real. It’s all these imperfections and subtle bends in your geometry or scuffs on your wood that really sell the photorealism of an asset. You don’t see this as often in games because of the optimization necessary. People also often miss the geometry details and subtle imperfections when doing personal pieces, but it’s these imperfect details that create the perfect models. In this example, I’m texturing some duct tape on this “Cup Chute” asset I created. I prepared the tape model with the torn edges in mind beforehand by adding some polygons and sculpting some torn edges in ZBrush. The folds that you see in the tape were all created inside of Marvelous Designer after many tweaks to the settings to mimic a sort of tapelike stickiness.
From here, as I had mentioned before, UVs are very important, especially when it comes to things like folded cloth or, in this case, folded tape, which is important for your texture to sit properly on the surface with directionality. I laid out the UVs perfectly straight, so that when I applied a pattern inside of Substance, the pattern conformed perfectly to the surface of the tape. This would not have been possible with triplanar projection or improperly laid out UVs. The first step in recreating this duct tape texture was getting the strings under the surface of the tape just as they are in real life. I did my best to literally recreate the entire structure of the duct tape in order to truly sell the realism that we were going for.
If you look at the GIF image above, you’ll see that first the layer of strings is laid down using a simple brick pattern and an inversed heightmap inside of Substance Painter. I then apply a fill on top of the “strings” heightmap, which is a rope alpha tiled many times to truly get the effect of threaded string. After that, I layer the top of the metallic tape onto the strings. This was done by simply placing an “Iron Rough” premade material over this layer and then adjusting the roughness and height values to match my reference images. At this point I want to get the subtle warping of the strings, so that we don’t have perfect lines or straight wires under the surface of the tape. I blur and warp my height layer only (so as not to blur or warp my Color or Roughness maps) to get the feeling that the tape is laying on top of these strings.
Now, since I have layered everything as it would be in real life, I can simply apply a mask to this Iron Rough surface and mask away some of this Iron Rough layer to show this string underneath, near the edges. This helps to sell the torn effect that we are going for. After that, I simply layer on some dirt and a subtle edgewear mask on top to get the feeling of scuffs and age to this tape. Lastly, I apply some paint layers, which I also mask and hand paint with alphas and custom brushes to give it the feeling of being inside of the art studio and splattered with paint like the rest of the assets.
The GIF above has another example of how I can simply layer many different fills, generators, and masks to create the surface types that I’m going for directly inside of Substance Painter. I begin with a base layer using the “Concrete Dusty” Substance material. At this point, it’s easy enough to simply add an adjustment layer for levels to decrease the contrast in the base color and use this to get some subtle variations on the surface of my Cardboard. Looking at all the reference images, we can see that cardboard is made up of many, many tiny chunks of paper that look like paper fibers — hopefully recycled paper fibers 😉, haha. To get this effect I’m adding directional scratches onto the surface of my now colored “concrete” base layer.
Unfortunately, these directional scratches look too perfect to be randomly torn shreds of papers mushed together. So I’m applying a warp on top of this to break up the directional scratches and get many randomized shapes that are now starting to represent the look we are going for. In real life, you’ll get a lot of different values and subtle different hues in your surface color when you have this conglomerate of paper creating the cardboard structure. To recreate that, I’m now simply layering some lighter splotches and darker splotches of color using the “Dirt” height texture fills to get that varying effect. Perhaps some subtle directional warps on top of these splotches would have also benefited the texture more, but I did not do it at the time.
As you can see, our surface is looking a lot more like cardboard now, or the large paper towel roll that it is. The final steps from here are to again add some edgewear and dirt onto the asset. This time I’m using the edgewear to make it look like the edges of uneven paper fibers, where the seams of the cardboard are meeting, by having it affect my heightmap and color brightness. The last step is to again place my splotches of paint on top, and here you can see the Substance Painter and final asset in the game.
One of the more unique assets/textures created for the Marbles game, although perhaps less impressive at first glance, was the Marbles RTX painted on a canvas. This asset was created with a new method I had an idea for while working with so much “paint” on the project. I found a way to, essentially, use Substance Painter to blend, blur, mix, and drip paint to actually create new colors and proper smearing, almost literally mimicking “painting” inside of the software. Perhaps others may be able to utilize this technique in other ways and repurpose it for themselves. Beware, though: as there is no layering with this method, it really is like painting on a flat canvas, so it cannot be easily changed or updated later on (I learned that the hard way when I had to change things on it, haha).
Perhaps a short video is the best way to show off the “Painting” inside of Painter technique. So, I’ll just stop talking and start showing.
Here is a more detailed example of how this technique worked on the final version of the Marbles RTX “Painted Canvas.” You can see the actual height information, thickening of paint, brush strokes, and blending of colors throughout the canvas. Doing everything possible to go for that realism! Even if it means no more easy “undo” later on …
Jacob: It is very interesting, actually, to use Sub-D for games, because at least for me, personally, it was the first time texturing something with a Sub-D workflow. For those unfamiliar with Sub-D assets, it’s a practice generally used more often in VFX and film content creation. Assets are created at a lower polycount, but with edge loops placed in specific parts throughout the asset, so that when the asset is smoothed you get beautiful creasing on the edges and a more refined, high-poly look to your 3D models. This is done so that animators, world builders, lighters, etc., can view the content in their respective 3D software like Max, Maya, or Blender at a lower-poly stage, and not put their PC under so much stress from the fully smoothed and content-heavy high-poly models while they’re working on the content. Then, at the very end, the Sub-D models are smoothed only for the actual rendering stage in the VFX pipeline, when the final images for the movie/film are processed.
In our case, however, we wanted to use Sub-D assets to “futureproof” our creation. Think of it more like having a reverse LOD system, where we are creating our content at LOD 3 (instead of starting at LOD 0 and down-resing), but then at runtime we would smooth the Sub-D models to be LOD 2, 1, or 0. These smoothed models could essentially be smoothed to any level we would want them to be. We could continue smoothing them to ensure that, as hardware rendering capacity increases, so too could our game increase the polycount to match whatever new hardware capabilities it was being rendered on.
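The "reverse LOD" growth is easy to quantify: on an all-quad control cage, each level of Catmull-Clark subdivision splits every quad into four. A minimal sketch (the 5,000-quad cage is just an assumed example, not a figure from the project):

```python
def subd_face_count(base_quads: int, level: int) -> int:
    """Catmull-Clark on an all-quad mesh: every quad splits into four
    per subdivision level, so faces grow by 4x each level."""
    return base_quads * 4 ** level

# A hypothetical "LOD 3" control cage of 5,000 quads streamed at higher levels:
for lvl in range(4):
    print(lvl, subd_face_count(5000, lvl))
# → 0 5000 / 1 20000 / 2 80000 / 3 320000
```

The exponential growth is exactly why shipping the cage and subdividing at runtime scales with future hardware, instead of baking one fixed polycount.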
So essentially, in 10 or 15 years, when the hardware is capable of doing so — who knows, hopefully sooner — we could then smooth our models to such a degree that you could zoom in on a microscopic level and never see any faceting from edge loops on our assets. Hopefully, by that time we will have some AI technology to up-res our textures to match that level of magnification as well, but there is already some great tech available online that starts hinting toward this.
As far as the texturing aspect: when it came to the Sub-D models, there were definitely a lot of things we had to look out for to make sure that textures appeared properly on assets, from subdivision level 0 all the way up to subdivision level “infinity.” We did this by applying a checkerboard texture to all of our models and then smoothing them to about subdivision level 2 to check for any stretching or distortion across the surfaces of our assets. Working this way, with proper UVs required at all Sub-D levels, posed some challenges for us, because we had to make sure that every model was set up for this type of texturing work, and the method had to be taught quickly to all of the outsource artists we were working with.
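A checkerboard like the one used for those distortion checks is trivial to generate procedurally. This sketch (resolution and square count are arbitrary assumptions) produces the kind of pattern you would assign before smoothing, so any UV stretching shows up as warped squares:

```python
import numpy as np

def checkerboard(res: int = 512, squares: int = 16) -> np.ndarray:
    """0/1 checker pattern: even cells are 0, odd cells are 1.
    Assign as an albedo map to spot UV stretching after smoothing."""
    cell = res // squares
    yy, xx = np.mgrid[0:res, 0:res]
    return ((xx // cell + yy // cell) % 2).astype(np.uint8)

cb = checkerboard()
```

On a well-UVed model the squares stay square at every subdivision level; stretched or skewed squares point directly at the problem shells.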
UV seam placement was crucial. When you are UVing a Sub-D model, it’s always important to place your UV seams with a few edge loops on either side, or else there will most definitely be some stretching or distortion on the surface of your asset once it is smoothed. It can be even more complicated when you have irregularly shaped surfaces such as folded tape, curved stapler edges, or more complicated shapes in general. The more you work with Sub-D models, the more you’ll get a feel for how things will look after they’ve been smoothed out and how your UV seams and surfaces will be affected by the different Sub-D levels. It’s difficult at first, but there are definitely many benefits to working this way once you get the workflow in place.
Gavriil: The main challenges we faced were probably the timeline for the project, the use of complex Sub-D models in a real-time pipeline, and working with a developing product (Omniverse) while working with many artists and trying to synchronize all of our workflows simultaneously.
The first problem that we wanted to solve, of course, was making this all work within our timeline. To do this, we had to structure the teams and the assets to work in a way that would allow us to model, UV, and texture in tandem. We accomplished this by first having a strong concept of the environment we were trying to create. I answered more of this in my art-direction-related section.
Once that was taken care of, we were able to determine a full list of assets we wanted to complete for the space. This allowed us to prioritize which assets would be most important to us in the scene and start modeling those first. By the time those initial models were completed, we had other artists starting to help UV the finished assets, while the modelers moved on to creating their next asset. We continued this hand-off sort of workflow into the texturing phase, at one point having a set of artists modeling, a set of artists creating UVs, and a set of artists beginning to texture — all in tandem, as planned.
(Jacob) As the first artists began to texture, we now had to quickly sort out our Sub-D texturing workflow. Kyle, one of the artists who worked on the project with us, did a great job running some tests and experimenting with different setups for how to texture these Sub-D models. We researched many different workflows used in the VFX industry to put together a library of information. After finding a workflow that we knew would be best for our use case, we then quickly put some documentation together that could be shared with the rest of the texture artists to ensure everyone was following the same structure. After the last of the artists had finished their modeling and UV work, we then slowly moved the entire team onto texturing, and pushed on through until the end working this way.
Now that we had a solid timeline, a good Sub-D texture workflow setup, and everything was in motion, we were now doing our best to request the necessary features and get the game running properly inside of Omniverse. This is the first “game-like” project that has ever been created inside of Omniverse, so there was a bit of a learning curve for everyone. Some of our main challenges were setting up the assets inside of the engine in a way that would allow our engineers to make them all physically interact with each other and simulate proper physics within the game world. Furthermore, we were also working to optimize the way we imported our assets and set them up, to make sure that we were doing our best to get the game running at frame rate on the art side as well. Much of this is still new because we are also utilizing .usd file formats inside of Omniverse. It has been incredible learning all of the fancy ways .usd files can be used and the vast amount of information that can be stored inside of them. We’re only beginning to unlock their true potential and the many new feature sets that were developed specifically for us during the project that we will continue to use moving forward.
Gavriil & Jacob: We were very lucky that Omniverse allowed us the canvas and raw power to render these beautiful images, but many of the tools were still very early on in terms of user interface and functionality for creating games. So, luckily, a lot of us on the team, having game design and game art backgrounds, were able to utilize simpler and more rudimentary techniques to achieve the looks that we wanted across the level, which, to be honest, in many cases still provided some of the best-looking results for what we were trying to achieve. Another huge benefit we had, of course, was the ability to have so many unique textures all loaded at once onto such powerful GPUs. We could then texture literally every asset to the highest quality that we wanted, almost without even having to worry about whether we could get it running on our machines.
In the end, the fact that we work with so many intelligent people here at NVIDIA and had the chance to work with so many talented artists and freelancers is honestly what made this whole project possible. Considering our timeline and the challenges we faced, we’re very happy with the end result and so pleased to see how much everyone else enjoyed it as well!
Marbles at Night RTX
With the launch of the new RTX 30 Series, we knew that we needed to present something even more spectacular to the world that would truly show the future of games and RTX technology. The sheer power of the new cards is simply astounding. Once we saw that we could have literally hundreds of real-time ray-traced area lights, on top of the already high density of geometry and texture sets in the scene, our immediate instinct, of course, was to do a nighttime version to really show off all the new tech and performance that these cards are capable of.
Our lighting artist Artur began by working on different lighting setups to see what would be most interesting to showcase for this nighttime scene. After going back and forth on a few different ideas, we decided to implement some LED light sticks into the play space, as well as rotating light spheres, some hanging chandeliers, and light strings and other elements that helped to showcase the GI, real-time shadows, and RTX tech throughout the environment.
Everything is shown in real time, running on the latest Ampere GPU, and presented in NVIDIA’s Omniverse. Honestly, the video speaks for itself, so if you haven’t watched it yet, perhaps the best thing for us to do is simply share with you Marbles RTX at Night. Thanks so much for your interest, and we hope you enjoyed following along with us for this breakdown!
Additional video content about Marbles:
All images courtesy of NVIDIA.