2019 I/ITSEC

RECONFIGURING SYNTHETIC ENVIRONMENTS AS INPUTS TO UNITY 3D (Room 320B)

03 Dec 19
4:30 PM - 5:00 PM


Game developers have been early adopters of the latest hardware technologies, including highly parallel computing on GPUs, physics processing units, and deep learning cores. Game engines expose these hardware capabilities as features and make them straightforward to access when developing content. Synthetic environments represent data synthesized from satellite imagery, modeled content, and procedural data. This paper explores techniques for preparing legacy commercial and military simulation synthetic environments for use in a game engine. In doing so, it discusses multiple game engine features that can augment existing synthetic environments, using the Unity 3D game engine to illustrate six key features. The first is the visual effect graph, which uses GPU instancing to render large numbers of particle effects, such as explosions. The next is entity instancing, which allows the GPU to render large numbers of features, such as trees. Unity’s physics capabilities let tagged features interact appropriately in physics-based scenarios. Physics-based shading with the custom shader pipeline adds realism to a scene. Unity offers an intuitive UI engine that lets users easily construct scenarios. Finally, Unity can deploy to multiple platforms, including Windows, Android, and iOS. The paper then explores changes to synthetic-environment data pipelines that allow a system to exploit these Unity 3D features: reconfiguring data streams to GPUs, converting coordinate systems, restructuring asset pipelines, tagging data to fit Unity 3D’s formats, and altering shaders to produce post-processed effects.
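As a minimal sketch of the coordinate-conversion step mentioned above, consider moving vertex data from a right-handed Z-up frame (common in simulation and GIS tooling) into Unity’s left-handed Y-up convention. This is an illustrative assumption about the source data, not the paper’s specific pipeline; the function name and frame choices here are hypothetical.

```python
def sim_to_unity(point):
    """Convert a point from a right-handed Z-up frame
    (x east, y north, z up) to Unity's left-handed Y-up frame
    (x east, y up, z north).

    Swapping the Y and Z components performs both the up-axis
    change and the handedness flip in a single step.
    """
    x, y, z = point
    return (x, z, y)


# Example: a tree placed 10 m east, 20 m north, on 5 m terrain
# becomes (10, 5, 20) in Unity's frame.
unity_pos = sim_to_unity((10.0, 20.0, 5.0))
```

A batch version of the same swap (e.g., reordering columns of a vertex array) is how an asset pipeline would typically apply this during import, before meshes are handed to the GPU.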