On Autonomous Cars with Marc Hoag, AEye President Blair LaCorte explains how the fusion of LiDAR and camera vision empowers autonomous vehicles to perceive like a human.
Smarter Cars host Michele Kyrouz sits down with AEye President Blair LaCorte to discuss AEye's cutting-edge technology for autonomous vehicles, which fuses cameras and LiDAR to mimic human perception.
At CES 2019, AEye's Head of Customer Success, Joel Benscoter, spoke with BeTerrific's Michael Artsis about what sets iDAR technology apart from conventional perception systems for autonomous vehicles.
IEEE Spectrum details why the 1550 nanometer wavelength for LiDAR is safe for the human eye, and calls on the industry to work together to ensure multimodal sensor compatibility and interoperability.
In its "Innovations in Enviro-Sensing for Robocars" feature, Motor Trend details how AEye's iDAR “embeds microelectromechanical systems into solid-state LiDAR, allowing it to fire photons randomly rather than in a preset pattern, making it easier for the computer to process.” This reduces latency and power consumption by five to ten times compared to competitors.
Engineering.com explains that while conventional LiDAR systems “rely on an array of independent sensors that produce large quantities of data,” which then “require long processing times and extensive computing power to analyze and translate into actionable information that a car can use,” AEye’s iDAR system can intelligently prioritize actionable data.
Forbes details why conventional solid-state LiDAR systems won't be enough to deliver the future of autonomous vehicles. Instead, what will catapult autonomous vehicles into the mainstream market is faster, smarter detection systems like AEye's iDAR, which fuses agile LiDAR with a high-resolution, low-light camera to replicate the advanced processes of the human visual cortex.