When Milliseconds Count: The Impact of Adaptive Lidar on Perception Improvements
For autonomous systems to coexist with human drivers on highways, they need adequate reaction times for safe path planning. Hardware-centric lidars offer passive, fixed performance for redundancy, but they don't optimize their data to reduce the perception system's reaction time based on the operational domain.
Earlier this month, AEye’s Director of Technical Marketing, Mrinal Sood, and Continental’s Point Cloud Algorithm Component and Product Owner, Michael Kosubek, participated in a webinar hosted by SAE International to discuss how adaptive, high-performance, software-configurable lidar optimizes its output to allow higher-speed vehicle operation.
During the presentation, Mrinal and Michael took a closer look at how humans drive, comparing human reaction times with machine processing cycles to establish a baseline for the process that autonomous systems also follow.
This comparison raised several questions: For high-speed highway driving, how can we reduce perception and reaction distances in autonomous systems? How do we get to the point where the machines that drive us around respond better than we do?
Mrinal and Michael covered several topics around these vital questions, such as the inherent limitations of hardware-centric, passive lidar systems, the impact of 3D data on passengers, and the point-density benefits of adaptive lidar at long range. The talk also featured a step-by-step overview of lidar-based perception systems.
If you missed this insightful webinar live, SAE offers an archived version, available for free.