
Developer Spotlight: Creating Photorealistic CGI Environments

Get to know Rense de Boer, a technical art director from Sweden who is not only pushing the envelope of photo-real CGI environments, but doing it all in a real-time engine!
Over the last few years de Boer has focused on advancing real-time graphics, expanding beyond current workflows and experimenting with cutting-edge hardware and software, and he continuously evolves his pipeline as new techniques become available.
The image below is not a photograph or a multi-hour 3D render; it is a screenshot from Unreal Engine.

Workflow

To capture real-world environments and objects, he uses a process called photogrammetry. His tool of choice is Reality Capture, not only because it produces amazing results, but because it does so very quickly. The application automatically finds similarities between photographs, making it possible to extract perspective and depth information. From that information the program constructs a 3D object and projects the calibrated photographs onto it.
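Reality Capture's pipeline is proprietary, but the basic idea of matching features across photographs and triangulating them into 3D can be illustrated with a minimal two-view sketch. The snippet below is only an illustration using OpenCV; the image file names and camera matrix are placeholder assumptions, not part of de Boer's actual workflow.

```python
# Minimal two-view reconstruction sketch with OpenCV.
# Illustrative only: the file names and camera matrix K are placeholders,
# not values from the Reality Capture workflow described above.
import cv2
import numpy as np

img1 = cv2.imread("photo_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_b.jpg", cv2.IMREAD_GRAYSCALE)

# 1. Find similarities between photographs: detect and match local features.
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Extract perspective information: estimate the relative camera pose.
K = np.array([[2400.0, 0.0, 960.0],   # assumed focal length / principal point
              [0.0, 2400.0, 540.0],
              [0.0, 0.0, 1.0]])
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Extract depth information: triangulate the matched points into 3D.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
inliers = mask.ravel() > 0
pts4d = cv2.triangulatePoints(P1, P2, pts1[inliers].T, pts2[inliers].T)
pts3d = (pts4d[:3] / pts4d[3]).T  # sparse point cloud a dense mesh is built from

print(f"{inliers.sum()} matched features triangulated into 3D points")
```

A production photogrammetry tool does this across hundreds of calibrated photos and then densifies the sparse cloud into a textured mesh, but the matching, pose estimation, and triangulation steps are the same in spirit.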
Though the program itself is fast compared to similar software, a big advantage is that it is CUDA accelerated, meaning it makes full use of the CUDA cores on NVIDIA GPUs. A fast GPU makes a huge difference when processing the large amount of data involved, which is why he uses two NVIDIA GeForce GTX 1080 Ti GPUs in SLI.
Since the 3D objects created by the photogrammetry software range from 10 to 40 million polygons, the GPUs also provide the performance the 3D applications need to handle them smoothly.
It is not always easy to capture a subject as intended: plants may be in the way, the subject may be out of reach, or a photograph may simply not turn out well. In those cases the photogrammetry process can run into trouble, with textures projected incorrectly or gaps left in the model. One way to deal with this is Adobe Substance Painter. Its projection tool is great for touching up textures in these problem areas, making it possible to fix multiple channels at once using content from other parts of the object or from the photographs.
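De Boer does this touch-up interactively in Substance Painter; purely to illustrate the same idea in code, the sketch below patches masked gaps in a baked texture by borrowing content from the surrounding texels with OpenCV inpainting. The file names and mask convention are hypothetical.

```python
# Illustrative only: fill gaps in a baked texture using OpenCV inpainting.
# This is not the Substance Painter projection workflow described above;
# "albedo.png" and "gaps_mask.png" are hypothetical file names.
import cv2

albedo = cv2.imread("albedo.png")                          # texture with artifacts
mask = cv2.imread("gaps_mask.png", cv2.IMREAD_GRAYSCALE)   # white = pixels to fill

# Fill the masked pixels from neighboring texels (Telea's method).
fixed = cv2.inpaint(albedo, mask, 5, cv2.INPAINT_TELEA)
cv2.imwrite("albedo_fixed.png", fixed)
```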
The plants go through a different workflow. After selecting the live plants he intends to digitize, he separates the leaves from the stem and captures them top-down in a custom-built light box. The light box lets him control the light direction, making it possible to capture diffuse, subsurface, and opacity information as well as the same leaf under several lighting directions. The normal and height maps are then created in Adobe Substance Designer by feeding the differently lit captures into the multi-angle node.
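Substance Designer's multi-angle node is a black box, but the underlying idea of recovering surface orientation from captures under known light directions is classic photometric stereo. The sketch below is a minimal Lambertian version in numpy; the light directions and file names are assumptions for illustration, not the node's internals.

```python
# Minimal Lambertian photometric-stereo sketch: recover per-pixel normals
# from captures of the same leaf under several known light directions.
# Light directions and file names are assumed values for illustration.
import numpy as np
import cv2

# Assumed unit light directions for four captures (x, y, z).
L = np.array([[ 0.5,  0.0, 0.866],
              [-0.5,  0.0, 0.866],
              [ 0.0,  0.5, 0.866],
              [ 0.0, -0.5, 0.866]])

# Stack the grayscale captures into shape (num_lights, H*W).
files = ["leaf_light0.png", "leaf_light1.png", "leaf_light2.png", "leaf_light3.png"]
images = [cv2.imread(f, cv2.IMREAD_GRAYSCALE).astype(np.float64) / 255.0 for f in files]
h, w = images[0].shape
I = np.stack([img.ravel() for img in images])            # (4, H*W)

# Lambertian model: I = L @ (albedo * normal). Solve least squares per pixel.
G, *_ = np.linalg.lstsq(L, I, rcond=None)                # (3, H*W)
albedo = np.linalg.norm(G, axis=0)                       # per-pixel albedo
normals = G / np.maximum(albedo, 1e-8)                   # unit normals

# Pack the normals into a standard 0-255 normal-map encoding.
normal_map = ((normals.T.reshape(h, w, 3) * 0.5 + 0.5) * 255).astype(np.uint8)
cv2.imwrite("leaf_normal.png", cv2.cvtColor(normal_map, cv2.COLOR_RGB2BGR))
```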

Eventually, when everything has been reconstructed, it all comes together in Unreal Engine 4. For this he uses a free custom build available on GitHub that includes NVIDIA VXGI, an implementation of a global illumination algorithm known as voxel cone tracing. Global illumination computes all the lighting in the scene, including secondary light bouncing off diffuse and specular surfaces, and adding it greatly improves the realism of the rendered images.
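VXGI runs on the GPU inside the engine, but the core of voxel cone tracing can be sketched on the CPU: march a cone through a prefiltered (mipmapped) voxel grid, sample coarser mips as the cone widens, and accumulate radiance and occlusion front to back. The toy grid and cone parameters below are assumptions for illustration, not VXGI's implementation.

```python
# Simplified CPU sketch of voxel cone tracing through a prefiltered voxel grid.
# Illustration of the algorithm only; the grid contents, cone aperture, and
# step sizes are assumed values.
import numpy as np

def build_mips(radiance, alpha):
    """Prefilter the voxel grid: each mip halves resolution by averaging 2x2x2 blocks."""
    mips = [(radiance, alpha)]
    while mips[-1][0].shape[0] > 1:
        r, a = mips[-1]
        n = r.shape[0] // 2
        r2 = r[:2*n, :2*n, :2*n].reshape(n, 2, n, 2, n, 2, 3).mean(axis=(1, 3, 5))
        a2 = a[:2*n, :2*n, :2*n].reshape(n, 2, n, 2, n, 2).mean(axis=(1, 3, 5))
        mips.append((r2, a2))
    return mips

def sample(mips, pos, mip):
    """Nearest-neighbor lookup at a given mip level; pos is in [0, 1)^3."""
    mip = int(np.clip(mip, 0, len(mips) - 1))
    r, a = mips[mip]
    idx = np.clip((pos * r.shape[0]).astype(int), 0, r.shape[0] - 1)
    return r[tuple(idx)], a[tuple(idx)]

def trace_cone(mips, origin, direction, aperture, voxel_size, max_dist=1.0):
    """Front-to-back accumulation of radiance and occlusion along one cone."""
    color = np.zeros(3)
    occlusion = 0.0
    t = voxel_size                                 # start one voxel out to avoid self-sampling
    while t < max_dist and occlusion < 0.99:
        diameter = max(2.0 * t * np.tan(aperture / 2.0), voxel_size)
        mip = np.log2(diameter / voxel_size)       # wider cone -> coarser mip
        radiance, alpha = sample(mips, origin + t * direction, mip)
        color += (1.0 - occlusion) * alpha * radiance
        occlusion += (1.0 - occlusion) * alpha
        t += 0.5 * diameter                        # step proportional to cone width
    return color, occlusion

# Toy scene: a 32^3 grid containing one bright emissive blob.
n = 32
radiance = np.zeros((n, n, n, 3))
alpha = np.zeros((n, n, n))
radiance[20:24, 20:24, 20:24] = [1.0, 0.8, 0.6]
alpha[20:24, 20:24, 20:24] = 1.0
mips = build_mips(radiance, alpha)

# One diffuse GI cone leaving a surface point, aimed toward the blob.
d = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
color, occ = trace_cone(mips, np.array([0.2, 0.2, 0.2]), d, np.radians(60.0), 1.0 / n)
print("indirect light gathered by this cone:", color, "occlusion:", occ)
```

In a real renderer a handful of such cones is traced per pixel for diffuse GI, plus a tight cone along the reflection vector for specular, which is what makes the technique fast enough for real time.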
Since he is developing for 4K monitors, de Boer uses massive 8K textures to ensure image quality in the real-time experience. Working at this resolution is not only more work for the artist but also taxing for the hardware and software. Creating large environments with high-quality textures requires ample GPU memory, pushing past the limits of gaming hardware and into NVIDIA Quadro territory with 24 GB of GPU memory.
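To get a feel for why memory fills up so quickly, a quick back-of-the-envelope calculation helps: a single uncompressed 8K RGBA texture with mips is already well over 300 MB. The numbers below use standard texture-size arithmetic; the channel count and compression format are generic assumptions, not figures from this project.

```python
# Back-of-the-envelope VRAM cost of one 8K texture (generic assumptions,
# not figures from this particular project).
width = height = 8192
bytes_per_texel_uncompressed = 4          # RGBA8
mip_chain_factor = 4.0 / 3.0              # a full mip chain adds ~1/3 on top

uncompressed = width * height * bytes_per_texel_uncompressed * mip_chain_factor
bc7_compressed = width * height * 1 * mip_chain_factor   # BC7: 1 byte per texel

print(f"uncompressed 8K RGBA + mips: {uncompressed / 2**20:.0f} MiB")    # ~341 MiB
print(f"BC7-compressed 8K + mips:    {bc7_compressed / 2**20:.0f} MiB")  # ~85 MiB
```

Multiply that by the several material channels of every asset in a large environment and it becomes clear why the project pushes well past typical gaming-card memory budgets.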
While images are great, seeing these environments in motion running in real-time is truly spectacular.
To stay up to date on Rense de Boer’s work you can check out his website and Facebook page.

The video below was captured directly from Unreal Engine.

Additional Images
