Recent papers have presented a number of marketing claims about the benefits of Frequency Modulated Continuous Wave (FMCW) LiDAR systems. As might be expected, there is more to the story than the headlines suggest. This white paper examines these claims and offers a technical comparison of Time of Flight (ToF) and FMCW LiDAR for each of them. We hope this serves to outline some of the difficult system trade-offs a successful practitioner must navigate, thereby stimulating robust, informed discussion and competition, and ultimately the improvement of both ToF and FMCW offerings to advance perception for autonomy.
Conventional metrics used to evaluate LiDAR performance for autonomous vehicles (such as frame rate, full-frame resolution, and detection range) do not sufficiently address the challenges facing autonomous driving. In response, AEye proposes three extensions to LiDAR evaluation: extending the metric of frame rate to include object revisit rate; expanding resolution to capture instantaneous resolution; and extending detection range to reflect the more critically important object classification range.
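To make the distinction between frame rate and object revisit rate concrete, here is a minimal sketch. The function names and numbers are hypothetical illustrations, not AEye's implementation: it simply contrasts a fixed raster scan, which revisits an object once per frame, with an agile scanner that can interleave extra looks at a region of interest within a frame.

```python
def fixed_frame_revisit_s(frame_rate_hz: float) -> float:
    """With a fixed raster scan, an object is revisited once per frame."""
    return 1.0 / frame_rate_hz

def agile_revisit_s(frame_rate_hz: float, intraframe_revisits: int) -> float:
    """A software-definable scanner can schedule extra looks at an object
    within a single frame, shortening the effective revisit interval."""
    return 1.0 / (frame_rate_hz * (1 + intraframe_revisits))

# Illustrative example: a 10 Hz sensor.
conventional = fixed_frame_revisit_s(10.0)  # 0.1 s between looks
agile = agile_revisit_s(10.0, 3)            # 3 extra intraframe looks -> 0.025 s

print(f"fixed-frame revisit: {conventional * 1000:.0f} ms")
print(f"agile revisit:       {agile * 1000:.0f} ms")
```

Under these assumed numbers, the agile scanner's 25 ms revisit interval is what the proposed metric captures and a raw 10 Hz frame-rate figure does not.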
Based on its findings, VSI Labs determined that AEye's iDAR system "can detect and potentially classify objects with enough precision, accuracy, and distance not possible with conventional LiDAR or camera sensors."
To make truly safe, autonomous vehicles, will we need to “teach” them empathy and compassion?
The bottom line is this: anything you design has to meet all reliability and quality standards and be manufacturable at the lowest possible cost, especially in automotive. There's no way to win the market without reliability, quality, and cost efficiency.
The startup opportunity intrigued me, as I saw it as a chance to have a different kind of impact: to change the world by changing the kinds of products that are available. I was involved with some of the early investments in autonomy and LiDAR through DARPA, and like many of my peers, I was drawn to the autonomous vehicle commercial market as a result of the DARPA Grand Challenge.
We have built an active, intelligent, software-definable sensing platform – which we call iDAR – that can be used across a variety of markets, and we're enabling our systems integrator and Tier 1 partners to take iDAR and customize it to the unique needs of each of these markets. These partnerships are a win-win: the customer, the partner, and we all get something better out of the equation.
In this installment of the AEye Insights series, AEye Founder and VP of Corporate Development, Jordan Greene, sits down with Ryan Popple, AEye Advisor, General Partner at R7, and Executive Director at Proterra, to discuss current trends in electrification and urban transportation, the importance of smart sensors, and the implementation of fully autonomous charging stations.
AEye Founder and VP of Corporate Development, Jordan Greene, and Reilly Brennan, General Partner at Trucks Venture Capital, pull out their crystal ball to talk about COVID's lasting effects on logistics and delivery, the challenges in store for "structured" autonomy, what's behind the flurry of AV and sensor SPACs, and why AV engineers are like chefs.
AEye Advisor Jim Robnett sits down with Elliot Garbus, AEye Advisory Board Member and former VP & GM at Intel Corporation Transportation Solutions Division, to discuss the technical challenges facing the industry today, what kind of partnerships we'll see in the future, the social and legal issues related to autonomy, and where he thinks transportation is headed this decade.