Ruined Temple

Environment Breakdown

Thomas Bauer

Environment Artist

Main_Shot

Introduction

I just graduated with my BA in Animation & Game from the University of Applied Sciences in Darmstadt, Germany, and am now open to opportunities, preferably in the Berlin area or remote.
During my studies, I worked on multiple game projects in teams of 4-5 people, during which I developed a solid understanding of how to create beautiful and performant game environments.

Blockout

I started by creating a blockout while also testing my modules. The plan here was to build a blockout version of each asset the way it would be used later on, with correct pivots etc., so I could simply swap in the actual assets afterwards.

I also established the directional light and the main camera shot at this stage so I wouldn't run into trouble later in the pipeline if anything didn't work out. I ended up making a slight correction later anyway, but getting the camera 95% there early on really paid off.


Modules and Modelling

I chose to cut the environment into larger modules than I originally intended, simply because this is a portfolio piece and the module size was perfect for this temple on its own. If I had to fill an entire level, I would have built smaller modules to get more variation in and to be able to create walls at different angles.

Most of the meshes use tiling materials, which is why I created two UV channels for each: one to project the tiling materials and trim sheets, and the other confined to a single 1×1 UV space to allow the use of texture blending masks, which I will talk about later in this article. The meshes have a pretty low polycount on flat surfaces, with an additional subdivision on edges so they catch a nice edge highlight and look more natural. Saving subdivisions on flat surfaces allowed me to spend more polys on more dynamic shapes like the arch and the sculptures.


Materials

While I downloaded the ground materials from Megascans, I built the limestone material and its variations from scratch in Substance Designer. I chose to build the limestone materials myself because I didn't find a material online that I liked and that was also easily modifiable to fit all the different variations of stone. I later split the fine porous noise onto its own normal map and used it as a detail normal overlay to save texture memory and reduce tiling repetition; you can read about the detail normals later in this article. The base of the limestone material was built using the workflow shown in Phil Liu's amazing Fantasy Trim Texture tutorial. It has the perfect amount of grunge and fractures on the surface to build the limestone material variations on top of.

Material_Showcase

Trimsheet

I built my trim sheet by scanning the references for possible trims, arranging them on a 1×1 square in Photoshop, and blocking out the basic shapes in 3ds Max. I then exported the base mesh to ZBrush to begin sculpting. After that, the sculpted high poly went to Substance Designer, where I baked it onto a single 1×1 UV'd plane and merged the resulting maps with my existing basic limestone material to introduce cracks, color variation, and dirt streaks.

Trimsheet_Breakdown

Master Materials

For the limestone materials I created a custom master material. My requirements for it were as follows:

  1. Support channel-packed maps
  2. Overlay the normal map with an extra detail normal
  3. Toggle between vertex painting and texture painting for blending the materials as needed
  4. Set up runtime virtual texturing to blend meshes into the landscape
  5. Expose all variables so I can adjust them per instance (see the small scripting sketch below)
  6. Store the parameters for color tint and detail normal intensity in a global material parameter collection so they stay the same for each instance
Limestone_MasterMaterial
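
As a side note, per-instance tweaks like the ones in requirement 5 can also be batch-applied through Unreal's editor scripting if you end up with many instances to manage. This is just a minimal sketch, assuming the Python Editor Script Plugin is enabled; the asset paths and parameter names ("DetailNormalIntensity", "ColorTint") are placeholders, not the actual names from my material.

```python
# Editor-only sketch: bulk-set exposed parameters on limestone material instances.
# Assumes the Python Editor Script Plugin is enabled; asset paths and parameter
# names are hypothetical placeholders.
import unreal

instance_paths = [
    "/Game/Materials/MI_Limestone_Wall",
    "/Game/Materials/MI_Limestone_Pillar",
]

for path in instance_paths:
    mi = unreal.EditorAssetLibrary.load_asset(path)  # MaterialInstanceConstant
    # Exposed scalar/vector parameters can be adjusted per instance (requirement 5).
    unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
        mi, "DetailNormalIntensity", 1.0)
    unreal.MaterialEditingLibrary.set_material_instance_vector_parameter_value(
        mi, "ColorTint", unreal.LinearColor(1.0, 0.95, 0.8, 1.0))
    unreal.EditorAssetLibrary.save_asset(path)
```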

Channel Packing

I chose to pack the AO, roughness, and height into a single texture to reduce memory overhead. Usually, you would find a metalness map in the blue channel, but I don't use metalness anywhere in this environment, and height maps come in handy for tessellation and height blending. I added an extra output node in Designer which packs the AO, roughness, and height into the RGB channels respectively. When importing to Unreal, make sure to turn off sRGB in the texture settings, as Unreal assumes your colorful texture is meant to be a beautiful albedo and turns it on by default. Also, make sure you tweak your master materials to work with a channel-packed texture.
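
If you ever need to do the packing step outside of Designer, it is trivial to reproduce with a small script. Here is a minimal sketch, assuming three grayscale PNGs of the same resolution as input; the file names are placeholders.

```python
# Minimal channel-packing sketch: AO -> R, roughness -> G, height -> B.
# File names are placeholders; all inputs must share the same resolution.
from PIL import Image

ao        = Image.open("T_Limestone_AO.png").convert("L")
roughness = Image.open("T_Limestone_Roughness.png").convert("L")
height    = Image.open("T_Limestone_Height.png").convert("L")

packed = Image.merge("RGB", (ao, roughness, height))
packed.save("T_Limestone_ARH.png")
# Remember: import this into Unreal with sRGB disabled and sample it as linear data.
```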

Channel_Packing_noArrow

Detail Normals

Due to the enormous size of the temple, and the texture resolutions that would be needed to avoid visible repetition in the brick patterns, I used a detail normal workflow.

By using a detail normal, it's possible to increase the fidelity of a surface while saving texture memory. This is achieved by splitting the fine cracks and pores off the main normal map into a new, lower-resolution detail normal map, which later gets blended on top of the main normal map. This allowed the tiling textures to remain at a 2048px resolution spanning 8 m, which results in a texel density of only 2.56px/cm. By overlaying a 1024px detail normal map, the perceived texel density can be increased artificially. I found the sweet spot to be at around double that of the native normal map, which resulted in 5.12px/cm. This is still rather low, but given the size of the temple it is perfectly adequate.
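
The blend itself is a cheap per-pixel operation in the material: the detail map is sampled with the base UVs multiplied by an extra tiling factor and its X/Y perturbation is added onto the base normal. Below is a numpy stand-in for that idea, written as a UDN-style blend; the function and parameter names are mine, not nodes from my actual material graph.

```python
# UDN-style detail normal overlay sketch (numpy stand-in for the material math).
# base and detail are float arrays in [-1, 1] with shape (H, W, 3); in the material
# the detail map is sampled with UVs scaled by its tiling factor.
import numpy as np

def blend_detail_normal(base: np.ndarray, detail: np.ndarray, intensity: float = 1.0) -> np.ndarray:
    blended = base.astype(np.float32).copy()
    # Add the detail map's X/Y perturbation onto the base normal, keep the base Z.
    blended[..., 0] += detail[..., 0] * intensity
    blended[..., 1] += detail[..., 1] * intensity
    # Renormalize so the result is a valid unit normal again.
    length = np.linalg.norm(blended, axis=-1, keepdims=True)
    return blended / np.maximum(length, 1e-6)

# Texel density bookkeeping from the text:
base_td   = 2048 / 800   # 2.56 px/cm for a 2048px texture spanning 8 m
detail_td = 2 * base_td  # ~5.12 px/cm perceived with the detail normal tiled twice as densely
```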

If you want to learn more about detail normals, take a look at this video: https://www.youtube.com/watch?v=Fi8KkHasFu0

DetailNormal_Breakdown
DetailNormal_Graph

Edge Decals

Edge decals are an amazing way to introduce edge damage on modules which use tiling textures and therefore can't be uniquely unwrapped and baked against a damaged high poly mesh. Since this environment consists mostly of non-unique meshes with tiling textures, I used edge decals pretty much everywhere. They are easy to produce: just take a cube, duplicate it, and keep only the edge strips, as seen below. These have to be UV'd to tile vertically and spaced consistently so you can swap which strip a decal uses, even later in the material, simply by offsetting the UVs horizontally by the spacing amount, in my case 0.25 because I have four strips.
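
Because the strips are evenly spaced, picking one in the material boils down to a single horizontal UV offset. A quick sketch of that bookkeeping (the function and parameter names are mine):

```python
# Edge decal strip selection sketch: each strip occupies an equal slice of U,
# so strip i is reached by offsetting U by i * (1 / strip_count).
def edge_decal_uv(u: float, v: float, strip_index: int, strip_count: int = 4) -> tuple[float, float]:
    spacing = 1.0 / strip_count          # 0.25 for four strips
    offset_u = (u + strip_index * spacing) % 1.0
    return offset_u, v                   # V keeps tiling vertically along the edge

print(edge_decal_uv(0.1, 3.7, strip_index=2))  # -> roughly (0.6, 3.7)
```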

If you want to learn more about them you can watch this tutorial: https://www.youtube.com/watch?v=Aihha0sAOJI

EdgeDecals

Vertex/Texture Painting/Blending

The standard for blending between multiple materials has always been vertex painting, as it's cheap and doesn't require much work to set up. However, in areas with low vertex density this approach proves clunky and produces unwanted artificial gradients. There's the option to break up the gradient with a noise map to reduce the artificial look of a perfect gradient, but this technique also has its limits, as it too requires enough vertices to work with. Subdividing the geometry would solve this issue but seems a rather awkward approach. This is why I chose to go for texture mask blending instead. Usually, this would be done by using painting and generators inside of Substance Painter and then exporting the masks, but I wanted a more dynamic approach, so I painted the masks onto a pure black 128px texture directly inside of Unreal.

Using a 128px map provides a total of roughly 16k texels per channel, evenly distributed across the mesh, to be painted on. This was perfectly sufficient for my needs. This technique requires a UV channel where all islands are confined within the 0-1 UV space, with no overlaps unless desired. If you are short on time or dislike UV mapping, using the auto-generated lightmap channel provided by Unreal works as well. Haters would dislike this approach due to its unoptimized nature, but we don't listen to them when working on portfolio pieces.
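
Conceptually, the material just samples this low-resolution mask through the second, non-overlapping UV channel and lerps between the two tiling material layers. Here is a numpy stand-in for that lookup and blend; the array and function names are mine, not actual material nodes.

```python
# Mask-blend sketch: a 128x128 painted mask drives a lerp between two tiling layers.
# The mask is sampled through the second UV channel (unique, confined to 0-1, no overlaps).
import numpy as np

def sample_mask(mask: np.ndarray, u: float, v: float) -> float:
    h, w = mask.shape
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return float(mask[y, x])

def blend_layers(albedo_a: np.ndarray, albedo_b: np.ndarray, weight: float) -> np.ndarray:
    # Straight lerp; in the master material this is where a height-based blend could go instead.
    return albedo_a * (1.0 - weight) + albedo_b * weight

mask = np.zeros((128, 128), dtype=np.float32)   # "pure black" start, 128 * 128 = 16,384 texels
mask[40:60, 40:60] = 1.0                        # pretend this region was painted in-editor

w = sample_mask(mask, 0.4, 0.45)
pixel = blend_layers(np.array([0.55, 0.52, 0.45]), np.array([0.30, 0.28, 0.22]), w)
```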

Keep in mind that due to the low resolution of the mask, color bleeding can occur very easily. One workaround is to adjust the island padding in the UVs according to the mask resolution, or to choose a higher-resolution mask.

MeshPaint

Statue/Frieze Workflow (Marvelous Designer)

I became an environment artist so I wouldn't have to sculpt humans and drapery by hand, which is why I cheated, or rather got creative, to build the sculptures and friezes depicting humans in cool Roman clothes. Marvelous Designer is a great program for simulating clothing, which sounds like it's only meant for character artists, but I personally think every environment can profit from some type of drapery hanging around. So if you have time to learn new tools, definitely check it out.

For the friezes, I used the following workflow after a lot of trial and error with flat-looking normal decals:

  1. Use the standard male and female avatars of Marvelous and wrap a few rectangular sheets of linen around them to make it look like a fancy tunic.
  2. Pose them according to the reference.
  3. Export the whole mesh, including the avatar, to 3ds Max.
  4. Arrange multiple characters according to the reference.
  5. Export to Substance, bake height and normal maps.
  6. Use height map in ZBrush to extrude a flat plane and give it some depth.
  7. Decimate to acceptable polycount.
  8. Export to Unreal, apply the material, and overlay a decal with the normal map from step 5.
Frieze_Breakdown

The statues were a bit more straightforward:

  1. Use the standard female avatar of Marvelous and wrap a few rectangular sheets of linen around her to make it look like a fancy tunic.
  2. Pose her according to the reference.
  3. Export the whole mesh to ZBrush.
  4. Fill holes and make the mesh convex, export the high poly, then decimate to an acceptable polycount for the low poly and export that as well.
  5. Import into Substance and apply the limestone material. Add an extra dirt/sunbleach pass.
  6. Export to Unreal.
Statue_Breakdown

Vegetation

Not much to talk about here. The bushes were quickly created in SpeedTree with a leaf atlas from Megascans. The grass is entirely from Megascans. I tweaked the size variation to produce taller grass towards the walls and shorter grass towards the walkway, to simulate people trampling it down and a generally more nutrient-rich soil where water drips down the walls.

Lighting and Color Correction

Due to the very saturated yellowish tint of the painting, I already introduced some of it in the limestone albedo so I didn't have to tweak too much in post-processing. For the color grading, I mostly tweaked the global contrast and saturation to be slightly more aggressive and to shift towards a yellow-greenish tint. Additional lights were used in areas that stayed dark but were supposed to be bright in the painting, to make for a better composition. Most of these lights were used to brighten up the background with a rather cold tint and to bring more definition and contrast to the inside of the arches, as well as to some inward corners, which were lit on only one wall to work out that nice clean edge you can see in the reference.

PostProcessing

Issues with Raytracing

I used ray-traced global illumination, which looks gorgeous on the one hand but proved to be a big headache many times, as a lot of features are either buggy, not performant enough, or straight up not implemented yet.

One of the biggest issues I had was muddy normal detail in shadowed areas. After a lot of googling and trying things out without results, Jeremy suggested I ask in the Dinusty Discord. Five minutes later I got the suggestion to tweak the ray tracing denoisers. It turned out the skylight denoiser was the root of the problem, so I just deactivated it with the command r.RayTracing.SkyLight.Denoiser 0, since tweaking the sample rate had no effect.

Now there is a bit of noise, but it really isn't too noticeable.


Another issue was black spots in the tessellated or inward-sculpted parts of the landscape. This is due to the fact that rays don't reach into those areas, rendering them black. Setting the normal bias to a higher value alleviates this problem but can lead to unwanted shadow behavior in other areas. For me, a value of 4 was ideal. The command for this is r.RayTracing.NormalBias 4.
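
For reference, both tweaks can be entered in the console or scripted so they don't get lost between sessions. A small sketch using the editor's Python API, assuming the Python Editor Script Plugin is enabled; the values are simply the ones that worked for me.

```python
# Editor sketch: apply the ray tracing tweaks mentioned above as console variables.
# Assumes the Python Editor Script Plugin is enabled.
import unreal

commands = [
    "r.RayTracing.SkyLight.Denoiser 0",  # disable the skylight denoiser (muddy normals fix)
    "r.RayTracing.NormalBias 4",         # push rays off the surface to avoid black spots
]

for cmd in commands:
    unreal.SystemLibrary.execute_console_command(None, cmd)
```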

Also, volumetric fog just doesn't work, which forced me to use linear fog. Not too much of a problem in this particular piece, but good to know.

Conclusion

This was a really fun project and I learned a lot while working on it, not only because I used ray-traced GI for the first time, but also about general workflows and about assessing the overall look of a piece and judging how to improve it. My two mentors played a major role in this, which brings me to my last piece of advice: get good feedback on your work no matter your skill level. We all start at zero, and it's only natural that your artworks will not look as good as the stuff on the ArtStation front page for the first couple of years. So if you can afford it, get a mentorship with a reputable mentor where you get the maximum amount of one-on-one coaching time via voice chat.

Good places to find those are the Dinusty Empire mentorships https://www.dinustyempire.com/mentorships

or the Mentor Coalition https://thementorcoalition.com/

The latter is quite a bit more expensive but worth the money if you're serious about your art.

If you are short on cash or just want to get some advice on a particular topic, don’t hesitate to drop by at one (or all) of the following Discord communities:

Dinusty Empire https://discord.gg/5Qe3Skg

Experience Points https://discord.gg/KtTKj64PuN

The Club https://discord.gg/hz84wUPkQU

They are filled with some of the most skillful and encouraging people I have met so far.

Also, if you have any more questions about my artwork, don't hesitate to drop me a message via my ArtStation profile: https://www.artstation.com/tbauer