AEye in the News
Smarter Cars host Michele Kyrouz sits down with AEye President Blair LaCorte to discuss AEye's cutting-edge technology for autonomous vehicles, which fuses cameras and LiDAR to mimic human perception.
At CES 2019, AEye's Head of Customer Success, Joel Benscoter, spoke with BeTerrific's Michael Artsis about what sets iDAR technology apart from conventional perception systems for autonomous vehicles.
IEEE Spectrum details why the 1550 nanometer wavelength for LiDAR is safe for the human eye, and calls on the industry to work together to ensure multimodal sensor compatibility and interoperability.
AEye takes Reuters on a drive down the Las Vegas Strip to show off how its artificial perception technology can detect objects up to 1,000 meters away and mimic human perception by focusing on the important objects in a scene.
In its "Innovations in Enviro-Sensing for Robocars" feature, Motor Trend details how AEye's iDAR “embeds microelectromechanical systems into solid-state LiDAR, allowing it to fire photons randomly rather than in a preset pattern, making it easier for the computer to process.” This reduces latency and power consumption five- to tenfold compared to competitors.
AnyAuto has named AEye, a robotics perception pioneer bringing new, intelligent data perception to autonomous vehicles, among "the automotive innovations to watch closely in the year 2019."
Engineering.com explains that while conventional LiDAR systems “rely on an array of independent sensors that produce large quantities of data — which [then] require long processing times and extensive computing power to analyze and translate it into actionable information that a car can use,” AEye’s iDAR system can intelligently prioritize actionable data.
optics.org | The AE200 Series delivers ADAS L3 long-range performance of up to 200 meters at 10% reflectivity at 0.1° resolution, with a short-range configuration of 50 meters at 10% reflectivity. AEye's AE200 Series will be modular in design and capable of up to a 120° x 45° field of view. AEye also recently announced the close of its Series B funding, which will be used to scale its operations to meet the demand of its global partners and customers.…
What better way to test the abilities of a LiDAR system than determining if it can perceive the dart from a "Nerf" gun? Thanks to Wired Transportation Editor Alex Davies for the inspiration!
Forbes details why conventional, solid-state LiDAR systems won't be enough to cultivate the future of autonomous vehicles. Instead, what will catapult autonomous vehicles into the mainstream market is faster, smarter detection systems, like AEye's iDAR, which fuses agile LiDAR with a high-resolution, low-light camera to replicate the advanced processes of the human visual cortex.
EETimes explores AEye's use of artificial intelligence to selectively collect only the data that matters to an AV’s path planning, instead of assigning every pixel the same priority. According to VSI's Phil Magney, "this is really edge fusion as the device is fusing the raw data with the camera data before any classification occurs.”
VentureBeat reports that AEye's iDAR system "is built for speed first and foremost." While typical LiDAR systems fail to identify potential hazards quickly, AEye's Intelligent Detection and Ranging (iDAR) system creates point cloud data called “Dynamic Vixels” which enable flexible, intelligent, and faster detection for autonomous vehicles.
TechCrunch announces AEye's close of Series B funding. Led by Taiwania Capital and including returning investors Kleiner Perkins, Intel Capital, Airbus Ventures, and Tyche Partners, the artificial perception pioneer and creator of iDAR raised $40 million, bringing its total funding to roughly $61 million.
Dynamic Vixels strengthen AEye's biomimetic approach to visual perception, enabling vehicles to see and perceive like humans do “to better evaluate potential driving hazards and adapt to changing conditions.”
FutureCar sits down with AEye’s Lead Strategist, Jordan Greene, to learn how AEye uses computer vision and artificial intelligence to mimic the advanced processes of human perception.
AEye founder and CEO Luis Dussan tells WardsAuto how iDAR “is a step beyond today’s LiDAR systems.” Unveiled at CES 2018, AEye's software-definable iDAR technology is “a new form of intelligent data collection that enables rapid, dynamic perception and enhanced path planning.”
New sensor technology by AEye, which specializes in artificial perception, aims to combat key safety concerns regarding the development and mainstream adoption of autonomous vehicles.
"AEye, has built a new kind of hybrid sensor...that allows the system to prioritize where it’s looking in order to give vehicles a more refined view of the world." – MIT Technology Review
In AEye's revolutionary iDAR system, LiDAR and camera work together to achieve better results through intelligent, selective scanning — a key aspect of human perception.
CleanTechnica reports on the successful completion of AEye’s live demo of its software-extensible iDAR system in an urban environment.
AEye, which differentiates itself from competing artificial perception systems for autonomous vehicles by fusing agile LiDAR with a high-resolution, low-light camera, has raised $16 million in Series A financing led by Kleiner Perkins Caufield & Byers. Other participants include Airbus Ventures, Intel Capital, and Tyche Partners.