AUVSI's Unmanned Systems 2016

Time of Flight Flash LIDAR: Are Real-Time 3D Images the New Normal? (Room 261-262)

03 May 16
11:30 AM - 12:00 PM

Tracks: Air, Commercial, Research and Development, Technical Track: Sensors and Payloads

The commercial UAV market offers a broad and growing range of business applications, and FAA Section 333 exemptions granted so far indicate that most depend on onboard digital imagery. Today that imagery is captured almost entirely in 2D, but a leap to a new class of real-time 3D sensor is about to arrive. 2D digital imaging sensors continue to improve in precision and efficiency. Spinning 3D LiDAR scanners are also evolving, but within the constraints of high-precision devices with fast-moving parts that must operate on a flying platform. Some new Time of Flight (ToF) camera sensors have more in common with solid-state 2D cameras than with spinning single-point laser scanners, multi-camera stereography, or post-flight photogrammetry processing.

Flash LiDAR ToF cameras, also known as range cameras, record a depth (distance) for each sensor pixel in real time, with no need for complex calculations or heavy processing. They emit a diffuse laser flash and then, separately for each pixel in the sensor, time how long that flash of light takes to return. These cameras are solid-state devices with no moving parts, currently made in low volumes at high prices; such technologies tend to improve dramatically as volumes increase and the technology matures. The data may be captured as frames: arrays of RGBD (Red-Green-Blue-Depth) points from a single point of view per frame. This can be achieved either with a single-lens RGBD sensor or by registering and calibrating two separate cameras, one for RGB and one for depth. High-end implementations have existed for many years, usually developed in secret and later made publicly available. The extreme capabilities of the high-end LiDAR systems already publicly available (such as Harris' Geiger-mode LiDAR scanning service) suggest that the higher-volume ToF technologies can be expected to improve substantially. A series of lower-cost, higher-volume hardware implementations follows, each opening up a new commercial sweet spot.

As real-time 3D capture becomes dramatically lighter and cheaper, many more commercial UAV platforms are likely to use it to enhance their image capture. This is more than moving the data capture from 2D (a focal plane of colour RGB pixels) to a colour 3D point cloud. Current images are mostly geo-registered at the location of the camera (the UAV and camera locations and orientations), and images from many different perspectives at different times mostly require expert inspection to identify the changes that matter. Depth cameras effectively geo-register each point at the location of the subject(s) shown in the images, and the subjects' locations are likely to be more immediately useful for most applications. Automated monitoring of changes in those subjects over extended periods requires imagery captured over time and aligned to the subject. Automated inspection will be far more economical and can be applied more frequently, and to circumstances that cannot justify the cost of repeated expert manual inspection.

As the number of vehicles increases and the technology improves, real-time see-and-avoid will become a basic requirement. UAVs equipped with depth cameras can use this real-time 3D data for much more precise and reliable see-and-avoid, enough to navigate safely at close range around other objects (under bridges, around windmills, cell towers and other infrastructure).
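To make the flash-LiDAR principle described above concrete, the sketch below shows how per-pixel return times could be turned into a depth image and then into geo-registered 3D points. It is an illustration only, not taken from the presentation: the pinhole intrinsics (fx, fy, cx, cy) and the camera pose (R_world_cam, t_world_cam) are hypothetical placeholder names, and a real sensor would supply calibrated values.

```python
# Illustrative sketch (not from the talk): per-pixel round-trip times from a
# flash-LiDAR / ToF frame become a depth image, which is then projected into
# geo-registered 3D points. Intrinsics and pose names are hypothetical.
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def tof_to_depth(return_time_s):
    """Convert per-pixel round-trip return times (seconds) to range (metres).

    Every pixel times its own echo of the diffuse laser flash, so one flash
    yields a whole depth frame with no scanning and no heavy processing.
    """
    return 0.5 * C * return_time_s  # halve the round trip


def depth_to_world(depth_m, fx, fy, cx, cy, R_world_cam, t_world_cam):
    """Back-project a depth image into world-frame 3D points.

    fx, fy, cx, cy           -- assumed pinhole intrinsics of the depth sensor
    R_world_cam, t_world_cam -- camera pose from the UAV's GNSS/IMU solution,
                                so each point is geo-registered at the subject
                                rather than at the camera.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    pts_cam = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
    return pts_cam @ R_world_cam.T + t_world_cam


# Toy example: a 4x4 frame of ~66.7 ns round trips is roughly 10 m of range.
times = np.full((4, 4), 66.7e-9)
depth = tof_to_depth(times)
cloud = depth_to_world(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0,
                       R_world_cam=np.eye(3),
                       t_world_cam=np.array([0.0, 0.0, 30.0]))
```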
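The same per-frame point data could also feed a see-and-avoid check. The sketch below is a deliberately naive illustration, not the method described in the talk: it assumes camera-frame points (x right, y down, z forward) and a hypothetical rectangular "corridor" ahead of the vehicle.

```python
# Illustrative sketch: a naive clearance check over one ToF frame, treating
# any return inside a forward corridor as a potential obstruction. The
# corridor dimensions and function name are hypothetical.
import numpy as np


def clearance_ahead(pts_cam, half_width=2.0, half_height=2.0):
    """Return the nearest forward range (metres) inside the flight corridor.

    pts_cam -- (N, 3) points in the camera frame (x right, y down, z forward).
    """
    in_corridor = (
        (np.abs(pts_cam[:, 0]) < half_width)
        & (np.abs(pts_cam[:, 1]) < half_height)
        & (pts_cam[:, 2] > 0.0)
    )
    if not in_corridor.any():
        return float("inf")
    return float(pts_cam[in_corridor, 2].min())


# e.g. slow or stop when clearance_ahead(points) drops below the vehicle's
# safe approach distance.
```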
An industrial UAV ToF sensor has to be optimised to work outdoors, in direct sunlight, with a range beyond a UAV's safe approach distance. This differs in important ways from the many new and pending consumer ToF sensors optimised to work only within a single room indoors, or at just an arm's length. The consumer market is spending hundreds of millions of dollars driving major advances in lightweight, low-cost, real-time 3D imaging hardware: Microsoft's Kinect and HoloLens, Intel RealSense (supported from Windows 10 onwards), Google's Project Tango (and its investment in Magic Leap), and Apple's acquisition of PrimeSense. These sensors may not be directly suitable for outdoor UAV use, but they drive new manufacturing techniques, efficiencies and cost reductions, and some of these advances will surely benefit the outdoor technologies that have much in common with them. Hear about BrashTech's experiences implementing advanced 3D image sensors on a UAV platform at Xponential.