AEye takes Reuters on a drive down the Las Vegas Strip to show how its artificial perception technology can detect objects at ranges of up to 1,000 meters and mimic human perception by focusing on the most important objects in a scene.
AnyAuto has named AEye, a robotics perception pioneer bringing new, intelligent data perception to vehicles, among "the automotive innovations to watch closely in the year 2019."
Engineering.com: Conventional LiDAR systems rely on an array of independent sensors that produce large quantities of data, which then require long processing times and extensive computing power to translate into actionable information a car can use. AEye's iDAR, by contrast, combines solid-state, agile LiDAR with a low-light HD camera and integrated AI to prioritize the information it receives, ultimately increasing the speed of a car's self-driving perception system by up to 10 times.
Forbes details why conventional solid-state LiDAR systems won't be enough to deliver the future of autonomous vehicles. What will catapult autonomous vehicles into the mainstream market is faster, smarter detection systems like AEye's iDAR, which fuses agile LiDAR with a high-resolution, low-light camera to replicate the advanced processes of the human visual cortex.
EETimes explores AEye's use of artificial intelligence to selectively collect only the data that matters to an AV's path planning, instead of assigning every pixel the same priority. According to VSI's Phil Magney, "this is really edge fusion as the device is fusing the raw data with the camera data before any classification occurs."
VentureBeat reports that AEye's iDAR system "is built for speed first and foremost." While typical LiDAR systems struggle to identify potential hazards quickly, AEye's Intelligent Detection and Ranging (iDAR) system creates point cloud data called "Dynamic Vixels," which enables flexible, intelligent, and faster detection for autonomous vehicles.
Dynamic Vixels strengthen AEye's biomimetic approach to visual perception, enabling vehicles to see and perceive as humans do, so they can better evaluate potential driving hazards and adapt to changing conditions.
FutureCar sits down with AEye's Lead Strategist, Jordan Greene, to learn how AEye’s unique perception system uses computer vision and artificial intelligence to mimic how a human eye focuses on objects and processes its environment.
New sensor technology from AEye, a specialist in artificial perception, aims to address key safety concerns surrounding the development and mainstream adoption of autonomous vehicles.
"AEye, has built a new kind of hybrid sensor...that allows the system to prioritize where it’s looking in order to give vehicles a more refined view of the world." – MIT Technology Review