February 24-25, 2020 | JMP Securities Technology Conference | The Ritz-Carlton | San Francisco, California
February 25th, 2:00 - 2:55pm | Driverless: The Future of Autonomous Vehicles
Speaker: Blair LaCorte – President, AEye
Chris Wiltz of DesignNews chronicles AEye President Blair LaCorte's AutoMobility LA 2019 press conference where he discusses how iDAR enables autonomous vehicle sensing technology to “out-perceive the human eye” using biomimicry, iDAR's motion forecasting ability, and how AEye's technology is similar to the Indoraptor from “Jurassic World: Fallen Kingdom.”
Marcus Amick of Ride by Kelley Blue Book calls AEye's new 2D/3D perception sensor system, iDAR, one of the "most interesting tech innovations" featured at AutoMobility LA. At its outdoor exhibition, AEye demonstrated how iDAR can "search, acquire, classify, and track" objects by combining agile LiDAR, camera, and AI in the sensor.
In the August 2019 issue of Autonomous Vehicle Technology magazine, AEye’s VP of Product Management, Aravind Ratnam, rethinks the three “Rs” of LiDAR — rate, resolution, and range — and proposes extending automotive LiDAR evaluation metrics to meet the capabilities of today's technology.
AEye asks: “What is the best way to deliver artificial perception for robotic and autonomous vehicles?” In “Teaching An Autonomous Car To Perceive Like A Human,” Forbes learns that it involves mimicking the advanced processes of the human visual cortex. “Intelligence begins with how you collect data,” says AEye President, Blair LaCorte, “[and] iDAR captures and processes environmental data – just as the human visual cortex does.”