Photogrammetry, game engines and the geospatial industry

Introduction

Originally used by the mapping and surveying industry, photogrammetry is increasingly being used in the gaming industry to create ultra-realistic 3D assets and produce immersive, interactive real-time 3D environments. While computing power and GPU performance keep increasing, there is still a limit to the amount of 3D data that can be displayed in real time. Before being game-ready, 3D assets generated by photogrammetry need to be optimised to reduce rendering cost and to work within hardware and software limitations.

This article gives an overview of where the technology stands today and where we are heading.

What is Photogrammetry?

Photogrammetry is the process of generating a three-dimensional representation of an asset from overlapping high-resolution photographs taken from multiple angles. The photographs are then aligned and combined, using the principle of triangulation, in specialised photogrammetry software such as Bentley ContextCapture, Capturing Reality RealityCapture, Agisoft Metashape, Skyline Photomesh or nFrames Sure to produce a detailed, geometrically accurate and textured 3D mesh of the asset.
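
As a minimal illustration of the triangulation principle (not how any of the packages above implement it internally), the sketch below recovers a single 3D point from a matching pixel observed in two photographs whose camera projection matrices are already known, using the standard linear (DLT) method in Python:

    import numpy as np

    def triangulate_point(P1, P2, x1, x2):
        """P1, P2: 3x4 camera projection matrices; x1, x2: matching (u, v) pixel coordinates."""
        # Each image observation contributes two linear constraints on the
        # homogeneous 3D point X, giving the system A X = 0.
        A = np.array([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        # The least-squares solution is the right singular vector associated
        # with the smallest singular value.
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]  # convert from homogeneous to Euclidean coordinates

In a real photogrammetry pipeline this is repeated for millions of matched features, while the camera positions themselves are refined simultaneously through bundle adjustment.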

Photogrammetry can be applied regardless of the size of the object to be digitally re-created, from a small rock to an entire city. The photographs can be taken from the ground with a hand-held camera or a rig, from the air with a drone, helicopter or fixed-wing platform, or from satellite.

Applications of Photogrammetry

Photogrammetry appeared in the middle of the 19th century and was first used to produce topographic maps. It has now been used for more than 150 years by the mapping and surveying industry and has spread to many other fields such as architecture and cultural heritage, geology and archaeology, engineering and, more recently, the film and gaming industries.

With easier access to processing power, photogrammetry tools, advanced computer graphics and image-processing capabilities, many of these distinct fields are now merging and interacting more widely. A great example of increased cooperation between the geospatial and gaming industries is the announcement that Cesium will be making geospatial data available through Unreal Engine. Large 3D city models generated using photogrammetry techniques are also being integrated directly into Unreal Engine and Twinmotion to create realistic digital representations of the real-world environment.

Pau 3D in Twinmotion

3D city model of Pau generated using advanced photogrammetry techniques and visualised in Twinmotion

Game Engines

Beyond their original purpose, game engines are rapidly evolving into 3D creation environments that deliver immersive and interactive content across many industries, such as film and entertainment, training and simulation, architecture, and design. The two major players are Epic Games’ Unreal Engine and Unity3D. These game engines provide a large set of powerful tools that allow developers to focus on the creative process.

Geographic data in games

Maps, and by extension their spatial dimension, are a core component of the virtual worlds used in video games. These maps are used as a base for the creation of fantasy worlds or are designed to simulate real-world environments. Geographic data is used as input during a game’s development phases to give developers real-world information to build their maps on and to emulate real environments, improving player immersion and familiarity.

This is the case, for example, for “Call of Duty”, set in Iraq, and “Grand Theft Auto”, set in California. Besides simulation for training and military applications, the most popular game using real-world geographic data has been “Microsoft Flight Simulator”, with a huge selection of scenery packs and add-ons based on aerial orthophotography.

Geospatial data and real-time 3D visualisation

Google, Apple and Microsoft have largely been responsible for the democratisation of GIS, initiating the transition from a geospatial-specialist community of a few hundred thousand to a general public of millions of users and growing. In recent years, we have seen these companies use 3D photogrammetry techniques to automatically generate entire cities in 3D.

Real-time 3D environment platforms specific to the geospatial industry and able to ingest complex 3D photogrammetry data have been available for more than a decade. Major players include Cesium, Skyline Software Systems’ TerraExplorer, ArcGIS Pro, ArcGIS CityEngine and Euclideon udStream, which provide a range of analysis and simulation tools on top of powerful 3D visualisation capabilities. Other players providing real-time 3D analysis and visualisation platforms for a wider range of industries include Twinsity, Bentley LumenRT, OpenCities Planner, Urban Circus Urban-Engine and GeoCirrus.

From a geospatial point of view, game engines provide alternative real-time 3D platforms for creating high-quality geographic visualisations and simulations. They offer tools to animate objects as well as complex physics engines, and they are highly customisable for creating gameplay elements. Both Unity and Unreal Engine can import spatial data as FBX, and elevation data as a height map with corresponding imagery draped on top. ESRI CityEngine can export data directly into Unreal Engine’s Datasmith format; Datasmith is a collection of tools for bringing CAD, BIM and GIS content into Unreal Engine. Complex 3D environments can also be streamed to any web browser via Pixel Streaming, and Unreal Engine provides tools such as Landscape and World Composition to manage large terrain-based worlds.
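
As a small, hypothetical illustration of the height-map route mentioned above, the sketch below converts a GeoTIFF elevation model into the 16-bit greyscale PNG that both engines accept as terrain input. It assumes the rasterio and imageio Python packages are available and uses placeholder file names; the original elevation range still needs to be re-applied as the terrain’s Z scale inside the engine.

    import numpy as np
    import rasterio           # assumed available for reading the GeoTIFF DEM
    import imageio.v3 as iio  # assumed available for writing the 16-bit PNG

    # Read the first band of a placeholder digital elevation model.
    with rasterio.open("dem.tif") as src:
        dem = src.read(1).astype(np.float64)

    # Normalise elevation into the full 16-bit range expected for height maps.
    dem -= dem.min()
    dem /= max(dem.max(), 1e-9)
    heightmap = np.round(dem * 65535).astype(np.uint16)

    # Write the height map, ready for import as terrain.
    iio.imwrite("heightmap.png", heightmap)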

Photogrammetry in games

In many new games, one of the main priorities is to provide players with a realistic, immersive experience. In recent years, a growing number of studios have made headlines with their innovative use of photogrammetry to produce ultra-realistic assets such as rocks, buildings, street furniture and human bodies to populate their worlds and create their characters.

It is now possible to create just about any environment, from virtual clones of real locations to totally imaginary worlds. Titles such as The Vanishing of Ethan Carter, Call of Duty: Modern Warfare and Star Wars Battlefront are just a small sample of the growing list of games making heavy use of photogrammetry.

Photogrammetry is disrupting the game development pipeline. 3D assets are now widely available and are no longer the preserve of big-budget games. Through its acquisition of Quixel, Epic Games has made more than 10,000 2D and 3D photogrammetry assets (the Megascans library) available to Unreal Engine developers.

Challenges

Games are designed to provide users with an immersive and realistic real-time experience, and many decisions must be made to find the right compromise between aesthetics, realism, design requirements, gameplay and the complexity of the 3D assets used.

3D assets generated from photogrammetry contain geometric and texture details that would be very hard and time-consuming for a 3D artist to recreate. These small details and imperfections of the real world are what create the illusion of realism. However, the highly complex geometry and high-resolution textures can be extremely heavy and need to be optimised to reduce rendering cost and to stay within processing, GPU and video-memory limits.

3D Photogrammetry: Pau, France in Unreal Engine

This video showcases the use of 3D photogrammetry to develop a next-level immersive experience within Unreal Engine 4. Large-scale photogrammetry and city-wide 3D models give creators the freedom to develop interactive virtual worlds and photo-realistic rendering by adding weather or light effects and dynamic physics. With large-scale 3D mesh models, it is possible to build design visualisations and cinematic experiences, opening up a world of possibilities for video games, virtual and augmented reality and more.

Optimisation Process

Different types of optimisation must be performed to make 3D assets game-ready.

A typical human-sized 3D asset generated by photogrammetry software comes out as a single 3D object with millions of triangles and multiple large texture images. For large objects, the 3D data is split into multiple individual tiles, each of which is also made of millions of triangles and multiple large textures. Many software packages provide the option of generating levels of detail with reduced triangle counts and texture sizes.

Some tools, such as Graphine’s Granite, provide tile-based texture streaming to optimise memory usage and speed up load times. In the same way, objects can be simplified into multiple levels of detail: the further an object is from the virtual camera, the lower the level of detail that is loaded. This technique is used extensively in the geospatial industry.
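
As a toy sketch of this distance-based scheme (the thresholds below are purely illustrative, not values used by any particular engine or format), a loader might pick a level of detail like this:

    def select_lod(distance_to_camera_m, thresholds=(50.0, 200.0, 800.0)):
        """Return 0 (full detail) up to len(thresholds) (coarsest) for a distance in metres."""
        for lod, limit in enumerate(thresholds):
            if distance_to_camera_m < limit:
                return lod
        return len(thresholds)

    # A tile 120 m from the camera loads LOD 1; one 1 km away loads the coarsest LOD 3.
    print(select_lod(120.0), select_lod(1000.0))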

Depending on the quality of the 3D reconstruction, 3D assets may need to be retouched to remove artefacts in both the geometry and the texture. Capture conditions are very important for minimising the impact of real-world lighting on the object’s texture, and it can become time-consuming, if not impossible, to properly remove baked-in light (and shadow) from the texture. Some tools, such as Unity’s de-lighting tool, are available to assist with this task.

Here is a typical workflow to create game-ready assets:

·        Pre-processing and colour correction of the input frames

·        3D reconstruction

·        Geometry and texture artefact clean-up

·        3D mesh optimisation and decimation (see the sketch after this list)

·        Additional map generation, such as normal, specular and gloss maps, to enable materials to react to the game’s environment lighting
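
As a minimal sketch of the decimation step, the example below uses the open-source Open3D library (one option among many; the file names and triangle budget are placeholders, and production pipelines typically also bake normal maps from the original high-poly mesh):

    import open3d as o3d

    # Load the raw, high-poly photogrammetry mesh (placeholder file name).
    mesh = o3d.io.read_triangle_mesh("photogrammetry_asset.obj")
    print(f"Input: {len(mesh.triangles):,} triangles")

    # Quadric-error decimation down to a game-friendly triangle budget.
    decimated = mesh.simplify_quadric_decimation(target_number_of_triangles=50_000)
    decimated.compute_vertex_normals()  # needed for correct shading in-engine

    o3d.io.write_triangle_mesh("photogrammetry_asset_lod0.obj", decimated)
    print(f"Output: {len(decimated.triangles):,} triangles")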

Philadelphia in 3D

Future Development

Real-time application technology is evolving very rapidly, and game-engine developers are at the forefront of these developments.

Epic has announced the release of Unreal Engine 5 (late 2021) and its virtualised micro-polygon geometry technology, Nanite. Data will be streamed and scaled on the fly, allowing extremely complex geometry to be displayed in real time without the need for authored levels of detail and without impacting polygon count budgets, polygon memory budgets or draw count budgets.

Aerometrex has produced many large 3D mesh models of entire cities, comprising billions of triangles. Nanite technology will not only revolutionise the way games are designed but also open new opportunities for the geospatial industry, enabling entire cities to be integrated smoothly into ultra-realistic, real-time and immersive solutions.


Author:

Fabrice Marre, Geospatial Innovation Manager, Aerometrex
