Conventional metrics used for evaluating the performance of LiDAR for autonomous vehicles (such as frame rate, full frame resolution, and detection range) do not sufficiently address the challenges facing autonomous driving. In response, AEye proposes three extensions to LiDAR evaluation: extending frame rate to include object revisit rate; expanding full frame resolution to capture instantaneous resolution; and extending detection range to reflect the more critically important object classification range.
The artificial intelligence that drives autonomous vehicles will require artificial perception that is modeled after the greatest perception engine on the planet — the human visual cortex.
Based on its findings, VSI Labs determined that AEye's iDAR system "can detect and potentially classify objects with enough precision, accuracy, and distance not possible with conventional LiDAR or camera sensors."
To make truly safe, autonomous vehicles, will we need to “teach” them empathy and compassion?
One of the reasons I like working on autonomous vehicles is that I get to work with innovative technology. It means that everything can be further tuned and customized. I enjoy the process of going from proof-of-concept all the way to a completed product. Having the opportunity to take an idea and push it through to production is really exciting.
LiDAR is enabling self-driving cars to become a reality by ensuring that they drive with the least amount of risk to other drivers and those around them. LiDAR is necessary for autonomous vehicles because it reduces the burden of real-time perception and prediction, which is not possible using AI and stereo cameras alone.
LiDAR is the final sensor modality needed to make ADAS systems (and eventually full autonomy) work effectively in all conditions. LiDAR is more deterministic by nature, as it can detect and measure the distance to all objects. And with an agile LiDAR such as AEye's iDAR, this can be done incredibly fast, with the added ability to classify objects and determine their velocity.
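The deterministic distance measurement described above rests on the time-of-flight principle: a laser pulse's round-trip time directly yields range, and successive range measurements yield radial velocity. A minimal illustrative sketch of that principle (not AEye's actual API or implementation; the function names are hypothetical):

```python
# Illustrative time-of-flight sketch. The constants and functions here are
# generic physics, not AEye-specific code.

C = 299_792_458.0  # speed of light in a vacuum, m/s


def range_from_tof(round_trip_s: float) -> float:
    """Distance to a target from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return C * round_trip_s / 2.0


def radial_velocity(r1_m: float, r2_m: float, dt_s: float) -> float:
    """Approximate radial velocity from two range samples dt_s apart.

    Negative values indicate a target closing on the sensor.
    """
    return (r2_m - r1_m) / dt_s


# A pulse returning after roughly 667 nanoseconds corresponds to ~100 m.
distance = range_from_tof(667e-9)

# A target measured at 100 m, then 98 m a tenth of a second later,
# is approaching at roughly 20 m/s.
closing_speed = radial_velocity(100.0, 98.0, 0.1)
```

Revisiting the same object quickly, as an agile LiDAR can, shrinks `dt_s` and makes the velocity estimate both fresher and more accurate.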
AEye and ANSYS are accelerating autonomous driving safety by enabling virtual prototyping of iDAR™ to speed design, testing and validation of autonomous systems. By working with ANSYS, AEye is empowering partners and customers to simulate driving situations across millions of miles in just days, minimizing physical prototyping. Watch the demonstration of AEye's iDAR using the VRXPERIENCE and SPEOS elements of ANSYS Autonomy, showcasing hazard detection in a virtual world.
AEye's partner ecosystem is embracing and extending iDAR to accelerate innovation and the availability of autonomous features. AEye and Tata Elxsi have unveiled RoboTaxi, Tata Elxsi’s in-house concept demonstrator vehicle developed using AEye’s iDAR platform and Tata Elxsi’s autonomous stack. Watch a demonstration of the fully autonomous RoboTaxi vehicle, fitted with AEye’s iDAR, successfully handling various scenarios, such as detecting cross-traffic at a junction and a roundabout, following the road ahead, and cueing the sensor with HD maps and V2X information.
AEye’s growing ecosystem of global partners is leveraging AEye’s artificial perception platform to advance safe, reliable autonomy. AEye has integrated Infineon’s AURIX microcontroller into AEye’s iDAR platform to ensure a robust, software-definable platform that is functionally safe for automated and autonomous vehicle initiatives. Watch how Infineon leverages AEye’s iDAR platform to create sensor fusion in its AURIX microcontroller.