We are part of the Creative Technology Team at Seymour Powell, a small, specialist unit of digital designers, developers, and artists that creates world-leading content and innovations. The Creative Technology Team was founded in 2015 and focuses its day-to-day work on the development of AR, VR, and immersive experiences. Our goal is to work with brands to help them embrace upcoming tech trends in new and exciting ways.

Artists

The two artists on the team are Craig Bunyan and Rafe Johnson. Craig is the lead Creative Technologist, with 16 years of experience in the design industry. He is a multidisciplinary designer focused on managing the team and on business development, while also maintaining a deep skill set in 3D art. Rafe is a 3D designer who joined recently, during the Virgin Galactic project, from a games design and product design background. He’s a sci-fi nerd with a particular interest in the world of transhumanism, the merging of humans and technology.

Seymour Powell

Seymour Powell is the UK’s leading and most awarded design and innovation agency; it specializes in helping businesses grow by designing transformative product and brand experiences that people love.

For over 35 years, we’ve been proud to build a renowned heritage of imagining and delivering world-first product and brand innovations for some of the world’s best-loved brands and most disruptive start-ups, across a diverse range of categories spanning everything from spaceships to sex toys.

From the days of designing the world’s first cordless kettle for Tefal in the ’80s to creating the world’s first space tourism experience for Virgin Galactic more recently, we’ve always believed that truly great product and brand experiences are the result of carefully designed interactions across every touchpoint: physical, digital, and experiential.

Virgin Galactic’s SpaceShipTwo Unity

Our work with Virgin Galactic began over a decade ago, when they approached us to bring their vision for the future of space tourism to life.

The output of this project was a vision piece designed to drive excitement and early investment, and to generate ticket sales. The success of this work, combined with the amazing engineering effort from the Virgin Galactic team, meant that by late 2018, Virgin Galactic were ready to bring this vision into reality.

https://video.tv.adobe.com/v/3418436

Over the next two years, the Transport and Creative Technology teams worked closely with the Virgin Galactic teams in London and Mojave to deliver a cabin and seat design. During this phase of work, we built and evolved a real-time VR and AR collaboration tool, which served as the backbone of the communication and decision-making pipeline.

By Christmas 2019, as we were nearing completion of the project, news of a novel coronavirus was starting to make the headlines. Twelve weeks later, we were in a global lockdown.

The initial strategy for revealing the much-anticipated interior of the ship was built around a live event. This was to include a fully manufactured prototype of the interior displayed onstage at a VIP event at the recently completed Spaceport America in New Mexico.

The groundwork our team had done in creating the real-time collaboration tool meant that we were in a great position to pivot and develop a ‘virtual reveal’ centered around two things: a collaborative VR experience for VIPs and the world’s press, and an interactive AR app deployed on the iOS and Android app stores.

https://video.tv.adobe.com/v/3418437

Virtual product reveal

As with any moonshot innovation program, early product reveals are an invaluable part of the process of getting to market. For never-before-seen product experiences, where market and shareholder confidence is both more fragile and more essential than normal, the need for VG’s cabin reveal event to run smoothly and land with impact couldn’t have been greater.

The added complexity for VG’s cabin reveal, of course, was that it had to occur in the depths of a lockdown brought about by a global pandemic.

Almost immediately, we established that consistency and ease of access were of the highest importance in any experience we built. The reveal had to deliver both a knockout impression of the new design and convey the impression of a company comfortable at the technical cutting edge.

For those reasons, we quickly discounted PC-based VR because of the logistical challenge of reaching a large audience (over 150 people) with absolute consistency. We felt that trying to talk a celebrity through installing a VIVE Pro or Oculus Rift in their home would detract from the wonder of the reveal!

Fully immersive, collaborative VR, delivered via Oculus Quest, was our chosen approach for the VIP reveal. Virgin Galactic were also committed to creating a global audience experience in the form of a publicly available iOS and Android AR app.

We were astonished by the results, and we learned that with the right creative strategy, it is possible to drive more commercial impact, and at greater scale, than ever before by reimagining the customer journey with real-time immersive experiences.

As a result of building the VR app for Oculus Quest 1, our asset library was essentially optimized for mobile. This meant that we were able to significantly expedite the development of the AR app to hit the same end-of-quarter investor announcement deadline as the VR reveal.

Creating an optimal real-time experience

As the design partners for the project, we were responsible for designing and developing the interior, and we were therefore the authors of the final CAD data. Whilst our creative pipeline centered on a digital twin and real-time technology, our workflow for integrating the evolving CAD data into our VR collaboration software was, for expediency, largely based on automatic conversion. During the design development phase, the resultant high-poly data didn’t pose too much of a problem, as our system was PC-based.

Screenshot from PTC Creo
As we transitioned to the performance envelope of the Oculus Quest 1, the multi-million-poly data set became a major hurdle. After exhaustive testing, and weighing post-processing and texture budgets, we established a poly budget of around 100k per scene. Converting the CAD data to low-poly assets followed the typical high- to low-poly conversion used in the games industry: we used Maya or an equivalent to rebuild each asset, then used Substance 3D Painter to bake the high-poly details down and apply them to the low-poly asset through texture maps.
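Most of the rebuilding itself was manual modeling work, but the budgeting side is easy to script. Here is a minimal sketch (Maya Python; the function names and the 85 percent reduction value are illustrative assumptions, not our production settings) of checking a scene against a ~100k-triangle budget:

```python
# Minimal sketch (Maya Python, maya.cmds): check the scene against a
# ~100k-triangle budget and run polyReduce if it busts the budget.
# The budget matches the article; the 85% reduction is a placeholder.
import maya.cmds as cmds

SCENE_TRI_BUDGET = 100000  # established through testing on Quest 1


def scene_triangle_count():
    """Sum triangle counts across all non-intermediate mesh shapes."""
    meshes = cmds.ls(type="mesh", noIntermediate=True) or []
    return sum(cmds.polyEvaluate(m, triangle=True) for m in meshes)


def reduce_if_over_budget(reduction_pct=85):
    """Apply Maya's polyReduce to every mesh if the scene is too heavy."""
    if scene_triangle_count() <= SCENE_TRI_BUDGET:
        return
    for mesh in cmds.ls(type="mesh", noIntermediate=True) or []:
        cmds.polyReduce(mesh, ver=1, percentage=reduction_pct)
    print("Scene now at {} triangles".format(scene_triangle_count()))
```

In practice, automated reduction was only ever a starting point; topology clean enough for baking still came from rebuilding assets by hand.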

From CAD to polygons

As a business, we are attempting a significant adjustment to our traditional 3D development pipeline, transitioning from a parametric-CAD-first to an ‘iterative poly-first’ workflow. We are doing this because we believe that a ‘real-time-centric’ development and communications workflow is, and will continue to become, more relevant to our clients. The 3D content we produce with this new pipeline can be utilized in far more ways than traditional CAD, with polygon-based content suitable for a range of output devices including AR, VR, and mobile; going forward, it also puts us in a position to share our content with the next generation of hardware and software.

We also believe, and have proven, that real-time-enabled in-project collaboration results in better design through expedited exploration, clearer communication, and more exhaustive iteration. The ability to make instant changes and updates to designs, without being held up by painstaking render times, has shaved days off our development process. Whilst parametric CAD won’t disappear from our workflow, the benefits of polygon-based content outlined above may mean it is no longer relied upon as the default starting point for our 3D development.

As an associated benefit, a poly-based, real-time-centric pipeline means we can employ rapid, procedural material creation tools like Substance 3D Designer to empower our CMF teams to retain control of their concepts. Our teams are able to do more than was previously possible, and at greater speed. Whilst in the past each stylistic choice or individual decision might have required lengthy periods of work to visualize, our teams are now evaluating their work faster than ever before, giving them more time to shape their ideas.

Before and after comparison

Texture and material creation

Texturing and material creation were a central part of this project, and ensuring our materials looked accurate and our visuals were pushed to the limit was essential. We therefore required the best tools to achieve this. Painter streamlined what would otherwise have been a long and arduous process, allowing us to texture and run through iterations quickly. Using familiar methods like masking to quickly create separate materials on one texture map optimized both our time and the assets’ performance.
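To make the masking idea concrete outside of Painter: below is a minimal sketch in Python with Pillow (the file names are placeholders, not our project assets) of how a single black-and-white mask can combine two materials on one albedo map, which is essentially what a mask does in Painter’s layer stack, only non-destructively.

```python
# Sketch: combine two material albedos into one map using a mask.
# Where the mask is white we keep the first material; where it is
# black we keep the second. All images must share one resolution.
from PIL import Image

leather = Image.open("leather_albedo.png").convert("RGB")
aluminum = Image.open("aluminum_albedo.png").convert("RGB")
mask = Image.open("material_mask.png").convert("L")  # white = leather

combined = Image.composite(leather, aluminum, mask)
combined.save("combined_albedo.png")
```

Painter’s polygon fill tool generates exactly this kind of mask from faces or UV islands, which is why it was so useful for keeping several materials on a single texture set.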

Now, with even more tools at our disposal, such as Substance 3D Sampler, creating accurate materials quickly is easier than ever, and we look forward to utilizing these tools on our upcoming projects.

3D art production pipeline

Because this was such a large project, requiring input from our entire company, just about every design tool was used at some point. However, our 3D art production pipeline focused on Maya for modeling; Substance 3D Painter, Substance 3D Designer, and Photoshop for texture and material creation; V-Ray for texture/light baking; and Unity to weave it all together.

One critical stage in our pipeline was using V-Ray within Maya to light-bake our textures, then plugging the output texture into Unity’s Universal Render Pipeline material. This allowed us to create the illusion of high-quality lighting in our Unity scene while still running on an Oculus Quest. We achieved this by building a high-quality render scene inside Maya/V-Ray with lighting as realistic as possible, applying the textures we created in Painter to V-Ray materials, and then baking these textures out. This let us use one map rather than multiple textures for albedo, normals, and so on. We would then take the baked texture into Photoshop and make tweaks, including layering the ambient occlusion map on top to reinforce the shaded areas.
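That final Photoshop pass is essentially a multiply blend of the AO map over the baked lighting. As a rough illustration of the same operation, here is a minimal Python/Pillow sketch (file names are placeholders):

```python
# Sketch of the AO tweak: a multiply blend darkens the baked texture
# wherever the ambient occlusion map is dark, deepening contact
# shadows. Both images must share one mode and resolution.
from PIL import Image, ImageChops

baked = Image.open("baked_lighting.png").convert("RGB")
ao = Image.open("ambient_occlusion.png").convert("RGB")

# Multiply blend: result = baked * ao / 255, per channel.
tweaked = ImageChops.multiply(baked, ao)
tweaked.save("baked_final.png")
```

Collapsing lighting and shading into one pre-baked map like this is what let a single texture stand in for several at runtime on the Quest.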

Texturing workflow

We used Substance Painter as our central texture and material creation tool, drawing on both the Substance 3D Assets and Substance 3D Community Assets platforms to obtain base materials to work from, rather than building from the bottom up every time. Substance 3D Assets, as well as the base material library, gave us plenty of options to quickly test and prototype our way to the right look and feel. We used most of Painter’s features during our workflow, a major one being the baker, which baked the high-poly CAD details into our normal and ambient occlusion maps. As we were constrained by the number of texture maps and materials, we tried to keep as much of the texturing as possible on one set of texture maps, so making use of layering, masking, and the polygon fill tool was essential to getting impressive results with limited materials.
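All of this was driven through Painter’s UI, but for pipelines that need repeatable exports, recent Painter releases also expose a Python API. Below is a hedged sketch, assuming that API is available, of exporting a project’s maps with substance_painter.export; the preset name, output path, and texture set name are all hypothetical.

```python
# Hedged sketch using Substance 3D Painter's Python API (run from
# Painter's scripting console or a plugin). Preset name, output path,
# and texture set name are placeholders.
import substance_painter.export

config = {
    "exportPath": "C:/project/textures",        # placeholder path
    "exportShaderParams": False,
    "defaultExportPreset": "quest_basecolor",
    "exportPresets": [{
        "name": "quest_basecolor",
        "maps": [{
            "fileName": "$textureSet_BaseColor",
            "channels": [
                {"destChannel": c, "srcChannel": c,
                 "srcMapType": "documentMap", "srcMapName": "baseColor"}
                for c in "RGB"
            ],
            "parameters": {"fileFormat": "png", "sizeLog2": 11},  # 2048 px
        }],
    }],
    "exportList": [{"rootPath": "CabinSeat"}],  # placeholder texture set
}
result = substance_painter.export.export_project_textures(config)
```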

Real-time cinematic in Unity

We wanted our final cinematics to look and feel authentic, rather than faking content with various tricks and editing techniques, so we decided to showcase the high-quality real-time assets we had developed. To do this, we utilized the same models, with the textures increased in resolution to push the visual fidelity that little bit further. Once this was complete, the addition of depth of field and other post-processing effects allowed us to arrive at a visual standard that we were happy with. The added benefit of using real-time assets was that it drastically shortened our rendering process, to days rather than weeks.

Best practices

As with anything, we recommend understanding the underlying principles of both the tool and the work you are doing: really understanding the function of texturing and material creation, as well as how the tool works, will help you create your designs far more intuitively and efficiently. With a larger project like this one, it’s hugely effective to test early and quickly, to get to grips with the best workflows and pipelines as fast as possible. Through our early testing, we found that we had far more poly-count budget than we initially anticipated, but less texture-size budget, so being able to adapt and pivot our workflow quickly was hugely beneficial.

With VR, you have even less room for mistakes, as it’s easier to spot texture issues or missing topology, so it is important to get your models in and reviewed early on. We found it highly effective to hold review sessions with other designers, not only for the visual design and general art direction but also to help spot these issues quickly.

End word

With the latest updates and releases from Substance 3D, and with the nature of our work continuing to pursue game engines, we will look to incorporate more of the Substance 3D toolset into our work. We have a particular interest in Designer due to its flexibility and overall control, especially with the recent parametric modeling updates. Currently, it is primarily just the Creative Technology Team making use of the Substance 3D suite; however, we see exciting potential in getting the wider company involved in learning and using these tools. For our Color, Material, and Finish (CMF) team, not only will this help them conceptualize and test materials more effectively, but it will also allow them to be more involved in the actual development side of our work. They could build up material libraries for us and even attach these to our design DNA documents, making real-time digital experiences more of a core component of the projects Seymour Powell carries out. Relying less on materials and textures from third-party sources, and having more control on our end, will lead to more accurate and desirable results.

As more of our design workflow becomes digital, we plan to use the Substance 3D suite through our entire design phase, from our early sketches and prototypes to our final products and experiences.