
AEye Introduces Next Generation of Artificial Perception: New Dynamic Vixels™

Enhancement to iDAR Better Mimics the Human Visual Cortex

The new real-time sensor data type, Dynamic Vixels, enables autonomous vehicles to better perceive their surroundings and respond accordingly.

Pleasanton, CA – May 22, 2018 – AEye, a leader in artificial perception systems, today announced the introduction of a new sensor data type called Dynamic Vixels, which are designed to more intelligently acquire and adapt data for the company’s iDAR (Intelligent Detection and Ranging) perception system. This advancement in AEye technology further strengthens its biomimicry approach to visual perception, essentially enabling vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions.

In simple terms, Dynamic Vixels combine pixels from digital 2D cameras with voxels from AEye’s Agile 3D LiDAR (Light Detection and Ranging) sensor into a single super-resolution sensor data type. For the first time, all the data captured in pixels and voxels is integrated in real time into a data type that can be dynamically controlled and optimized by artificial perception systems at the point of data acquisition. Dynamic Vixels create content that inherits both the ability to evaluate a scene using the entire existing library of 2D computer vision algorithms and the ability to capture 3D and 4D data covering not only location and intensity but also deeper insights such as the velocity of objects.
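To make the idea concrete, here is a minimal sketch of what such a fused data type might look like. The class and field names are illustrative assumptions, not AEye's actual format; the point is that each fused sample carries 2D appearance alongside 3D position and velocity, so it can be fed to either 2D vision algorithms or 3D/4D reasoning.

```python
from dataclasses import dataclass

# Hypothetical sketch only: names and fields are assumptions for
# illustration, not AEye's proprietary Dynamic Vixel format.

@dataclass
class Pixel:
    """2D camera sample: image coordinates plus color."""
    u: int
    v: int
    rgb: tuple  # (r, g, b), each 0-255

@dataclass
class Voxel:
    """3D lidar return: position, return intensity, and radial velocity."""
    x: float
    y: float
    z: float
    intensity: float  # normalized return intensity, 0-1
    velocity: float   # m/s, radial, estimated from successive returns

@dataclass
class DynamicVixel:
    """Fused sample: 2D appearance plus 3D/4D geometry and motion."""
    pixel: Pixel
    voxel: Voxel

    def is_moving(self, threshold: float = 0.5) -> bool:
        # Because velocity lives in the same sample as appearance,
        # a perception system can flag motion without any post-hoc
        # registration between camera and lidar frames.
        return abs(self.voxel.velocity) > threshold

# Example: a lidar return projected onto its corresponding camera pixel.
sample = DynamicVixel(
    pixel=Pixel(u=640, v=360, rgb=(30, 30, 30)),
    voxel=Voxel(x=12.0, y=-1.5, z=0.4, intensity=0.8, velocity=2.1),
)
print(sample.is_moving())  # True: radial velocity exceeds 0.5 m/s
```

The design choice the press release emphasizes is that the fusion happens at acquisition time, so downstream code receives one record per sample rather than two streams that must be aligned in post-processing.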

“There is an ongoing argument about whether camera-based vision systems or LiDAR-based sensor systems are better,” said Luis Dussan, Founder and CEO of AEye. “Our answer is that both are required – they complement each other and provide a more complete sensor array for artificial perception systems. We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing. The difference is significantly better performance.”

AEye’s iDAR perception system mimics how a human’s visual cortex evaluates a scene and calculates potential driving hazards. Using embedded artificial intelligence within a distributed architecture, iDAR employs Dynamic Vixels to critically and actively assess general surroundings to maintain situational awareness, while simultaneously tracking targets and objects of interest. As a core data element for a scalable, integrated system, Dynamic Vixels enable iDAR to act reflexively to deliver more accurate, longer range and more intelligent information faster.

“One nice consequence of the architecture is that we give our customers the ability to add the equivalent of ‘human reflexes’ to their sensor stack,” said Dussan.

Dynamic Vixels can also be encrypted. This patented technology enables each sensor pulse to deal appropriately with challenges such as interference, spoofing, and jamming, issues that will become increasingly important as millions of units are deployed worldwide.

Simply put, this new way of collecting and inspecting data using the iDAR system’s at-the-edge processing enables the autonomous vehicle to more intelligently assess and respond to situational changes within a frame, thereby increasing the safety and efficiency of the overall system. For example, iDAR can identify objects with minimal structure, such as a bike, and differentiate objects of the same color, such as a black tire on asphalt. In addition, Dynamic Vixels can leverage the unique capabilities of agile LiDAR to detect changing weather and automatically increase power during fog, rain, or snow.
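The weather-adaptive behavior described above can be sketched as a simple control policy. This is a toy illustration under stated assumptions: the extinction-coefficient input, the linear gain, and the safety cap are all invented for the example and do not reflect AEye's actual control logic.

```python
# Toy sketch of an adaptive-power policy: map an estimated atmospheric
# extinction coefficient (1/km; near 0.0 in clear air, higher in fog,
# rain, or snow) to a transmit-power multiplier. All numbers here are
# illustrative assumptions, not AEye's real parameters.

def lidar_power_scale(atmospheric_extinction: float) -> float:
    """Return a power multiplier for the current weather estimate.

    Heavier scattering attenuates the return signal, so the sensor
    compensates with more emitted power, capped for eye safety.
    """
    base, cap = 1.0, 4.0
    scale = base + 2.0 * atmospheric_extinction
    return min(scale, cap)

print(lidar_power_scale(0.0))  # 1.0  (clear conditions: baseline power)
print(lidar_power_scale(0.8))  # 2.6  (fog: boost power)
print(lidar_power_scale(5.0))  # 4.0  (heavy snow: capped at the limit)
```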

Likewise, iDAR’s heightened sensory perception allows autonomous vehicles to better detect contextual changes. For example, a child’s facing and walking direction can be identified faster and more reliably, allowing the system to calculate the probability of the child doing something unpredictable at an intersection and prepare the car for a sudden stop.

“There are three best practices we have adopted at AEye,” said Blair LaCorte, Chief of Staff. “First: never miss anything; second: not all objects are equal; and third: speed matters. Dynamic Vixels enable iDAR to detect an object faster, assess or acquire an object more accurately and completely, and track a target more efficiently—at low reflectivities and long ranges.”

The iDAR perception system includes inventions covered by recently awarded foundational patents, including 71 intellectual property claims on the definition, data structure, and evaluation methods of Dynamic Vixels. These patented inventions contribute to significant performance benefits, including 16x greater coverage, a 10x faster frame rate, and 7-10x more relevant information that boosts object classification accuracy while using 8-10x less power.

AEye’s first iDAR-based product, the AE100 artificial perception system, will be available this summer to OEMs and Tier 1s launching autonomous vehicle initiatives.