AEye’s iDAR Leverages Biomimicry to Enable First Solid State Lidar with 100Hz Scan Rate

Breakthrough speed to be discussed today at Western Automotive Journalists event “Silicon Valley Reinvents the Wheel”

“AEye has taken some of the most elegant lessons from human brain science and combined them with cutting-edge technology. This integration created something that I believe will allow autonomous vehicles to process data like a computer but perceive like a human.”

Dr. James Doty, Clinical Professor of Neurosurgery, Stanford University

Pleasanton, CA – October 1, 2018 – Following on the heels of being honored as “Most Exciting Start-Up” at the prestigious AutoSens awards in Brussels, AEye, a leader in artificial perception systems, today announced that it has established a new speed benchmark for 300m+ class solid-state LiDAR sensors, achieving scan rates of 100Hz or greater – a 10x improvement over currently deployed competitive systems, which typically scan at 10Hz.

AEye’s artificial perception platform, iDAR, is designed to mimic the performance of the human visual cortex. This biomimicry enables software-definable search patterns that can be optimized for specific driving situations, delivering far more precise and actionable information at speeds never before seen in commercially available LiDAR sensors.

iDAR’s biomimicry enables autonomous perception engineers to create situationally specific scan patterns capable of searching a scene 4x to 5x faster than the human eye. This scanning speed is matched by superior spatial coverage that breaks a scene down into Dynamic Vixels, a data type unique to iDAR that combines X, Y, Z and R, G, B data.
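To make the Dynamic Vixel idea concrete, the fused data type described above can be pictured as a single record carrying both geometry (X, Y, Z) and color (R, G, B). The sketch below is purely illustrative: the class name, field names, and units are assumptions for explanation, not AEye's actual data format or API.

```python
from dataclasses import dataclass

@dataclass
class DynamicVixel:
    """Illustrative fused sample: lidar geometry (x, y, z in meters)
    plus color (r, g, b, 0-255). Hypothetical layout, not AEye's spec."""
    x: float
    y: float
    z: float
    r: int
    g: int
    b: int

# Example: a return 30 m ahead, 0.5 m left of center, 1.2 m high, mid-gray
p = DynamicVixel(x=-0.5, y=30.0, z=1.2, r=128, g=128, b=128)
print(p.y, p.r)
```

Combining geometry and color in one sample means a downstream perception engine can classify on appearance and range together without a separate fusion step.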

By finding and locating objects as fast as or faster than a human, iDAR enables perception that can intelligently classify and track objects at unprecedented rates – including the unique ability to calculate the vector and velocity of each object within a frame. Much of this can be done at the sensor level within the same frame, bypassing the hundreds of milliseconds of latency seen in currently deployed systems. This ability to modulate spatial and temporal dimensions simultaneously, as humans do, is what is needed to achieve Level 4 and Level 5 autonomy.
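A 100Hz scan rate implies a new frame every 10 milliseconds, so an object's velocity vector can be estimated from how far its position shifts between consecutive frames. The sketch below shows only that arithmetic; it is a simplified illustration under those assumptions, not AEye's implementation.

```python
# Illustrative only: per-axis velocity estimate from an object's centroid
# position in two successive frames of a 100 Hz sensor (dt = 0.01 s).
def velocity(prev_xyz, curr_xyz, dt=0.01):
    """Return the (vx, vy, vz) velocity in m/s between two centroid samples."""
    return tuple((c - p) / dt for p, c in zip(prev_xyz, curr_xyz))

# An object dead ahead whose range drops from 30.0 m to 29.8 m in one frame
# is closing at roughly 20 m/s (about 72 km/h).
v = velocity((0.0, 30.0, 1.2), (0.0, 29.8, 1.2))  # v ≈ (0.0, -20.0, 0.0)
```

The short 10 ms frame interval is what makes this per-frame estimate responsive compared with a 10Hz sensor, whose 100 ms interval adds an order of magnitude more measurement latency.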

“iDAR is based on a revolutionary new agile LiDAR design that allows autonomous vehicles to perceive far beyond the limits of human perception,” said Blair LaCorte, AEye’s Chief of Staff. “This powerful software-driven sensor system allows vehicle perception engines to actively interrogate their environment to identify the precise information they need at speeds that will radically improve safety.”

On Monday afternoon in Mountain View, CA, Mr. LaCorte and Dr. James Doty, Clinical Professor of Neurosurgery at Stanford University, will present a discussion on “Making Sense of the Sensor: Applying Biomimicry to Vehicle Autonomy.” In this session, they will explore why the human brain and visual cortex are the ideal models for autonomous perception, and how their performance could best be replicated with existing sensor technologies.

“AEye has taken some of the most elegant lessons from human brain science and combined them with cutting-edge technology,” said Dr. Doty. “This integration created something that I believe will allow autonomous vehicles to process data like a computer but perceive like a human.”

The annual Western Automotive Journalists event is being held at the Computer History Museum in Mountain View, CA on October 1 from 10am to 5pm. For more information, see waj.org.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366