
Coffee Talk: Ashwin Samarao

This week we sat down with AEye senior director of systems engineering Ashwin Samarao to talk about the ability of AEye's software to adapt to changing environments, the differences between AEye's Time of Flight (ToF) approach and other ToF or Frequency Modulated Continuous Wave (FMCW) systems, and why AEye's lidar architecture aligns seamlessly with the objectives of a software-defined car.

“AEye lidar is a direct Time of Flight (ToF) lidar: we measure the round-trip time from the moment an optical pulse is emitted to the moment it returns to the sensor after reflecting off a target we are trying to detect.…But AEye lidars are bistatic, in that the optical path that transmits the pulse is different from the optical path that receives the pulse. This is a crucial difference, and it is what lets AEye use robust, small, power-efficient MEMS micro-mirrors to scan the optical pulse across the Field of View (FoV) instead of resorting to the bulky, power-hungry macro-mirrors that some competitors are stuck with.”


Tell me about your role and your background

I head systems engineering at AEye, so I am responsible for system-level algorithms, software, firmware, calibration, and validation. I’ve spent about seven years in the lidar space, including my time at an automotive Tier 1 and another lidar start-up. Before that, I was working on MEMS sensors for smartphones and IoT devices. I have a PhD in MEMS technology from Georgia Tech.

As the leader of software and systems engineering, and someone who’s spent time in the lidar space, what’s the key difference between AEye’s ToF and other ToF or FMCW systems?

AEye lidar is a direct Time of Flight (ToF) lidar: we measure the round-trip time from the moment an optical pulse is emitted to the moment it returns to the sensor after reflecting off a target we are trying to detect. At this level, our ToF-based system is similar to other ToF lidar systems.
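The direct-ToF range calculation itself is simple: the target sits at half the distance light travels during the measured round-trip time. A minimal sketch of the arithmetic (illustrative only, not AEye's actual processing pipeline):

```python
# Direct Time-of-Flight ranging: target range is half the distance light
# travels during the measured round-trip time.
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Convert a measured round-trip time (seconds) into target range (meters)."""
    return C_M_PER_S * round_trip_s / 2.0

# A pulse that returns after 1 microsecond puts the target at ~150 m.
print(tof_range_m(1e-6))  # ~149.896 m
```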

But AEye lidars are bistatic, in that the optical path that transmits the pulse is different from the optical path that receives the pulse. This is a crucial difference, and it is what lets AEye use robust, small, power-efficient MEMS micro-mirrors to scan the optical pulse across the Field of View (FoV) instead of resorting to the bulky, power-hungry macro-mirrors that some competitors are stuck with.

FMCW belongs to the category of indirect ToF architectures. FMCW providers modulate a transmitted continuous wave (rather than an optical pulse) with a proprietary signature and demodulate it on the receiver side to infer range and, potentially, per-point velocity. A ToF lidar can compute the velocity of targets in the FoV using information from two consecutive frames of data, while an FMCW system can infer instantaneous velocity from a single measurement. But this ability comes at the expense of added system complexity, cost, and power consumption.
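The two-frame velocity estimate mentioned above amounts to a finite difference of ranges over the frame interval. A sketch of the idea, with illustrative numbers (the function name and values are not AEye's implementation):

```python
def radial_velocity_mps(range_prev_m: float, range_curr_m: float,
                        frame_dt_s: float) -> float:
    """Estimate a target's radial velocity from its range in two
    consecutive lidar frames (negative = approaching)."""
    return (range_curr_m - range_prev_m) / frame_dt_s

# A target that closes from 100.0 m to 98.5 m between 10 Hz frames
# (0.1 s apart) is approaching at roughly 15 m/s.
print(radial_velocity_mps(100.0, 98.5, 0.1))
```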

What is software-defined lidar?

A software-defined lidar achieves custom system-level features and functionality by taking advantage of a hardware architecture designed to be flexible from the ground up. The software can exploit the existing hardware to create new features without requiring hardware changes. This flexibility enables AEye's MEMS laser scanners to scan laser pulses into the Field of View in any order, and, similarly, our Rx architecture is designed to receive laser pulses from the FoV in any order. AEye lidar is a fantastic example of a software-defined lidar.

What are the benefits of this flexibility?

We can randomize the order in which we scan a FoV to mitigate possible interference from other lidars. This is very important when the majority of the cars on the roads are equipped with lidar, and you need to ensure minimal signal interference.

We also have the ability to switch scan patterns on the move – for example, from a wide-FoV, mid-range urban scan pattern to a narrow-FoV, ultra-long-range highway scan pattern as a truck moves from an on-ramp onto the highway. For AEye lidar, this change can be as simple as loading a new scan pattern – which takes milliseconds.
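Conceptually, "loading a new scan pattern" is just selecting a different named configuration at runtime. The preset names and parameter values below are hypothetical, chosen only to illustrate the mode switch:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPattern:
    h_fov_deg: float      # horizontal field of view
    v_fov_deg: float      # vertical field of view
    max_range_m: float
    frame_rate_hz: float

# Hypothetical presets -- names and numbers are illustrative only.
PATTERNS = {
    "urban_wide":   ScanPattern(h_fov_deg=120.0, v_fov_deg=30.0,
                                max_range_m=150.0, frame_rate_hz=20.0),
    "highway_long": ScanPattern(h_fov_deg=60.0, v_fov_deg=15.0,
                                max_range_m=300.0, frame_rate_hz=10.0),
}

def load_scan_pattern(name: str) -> ScanPattern:
    """Select a preset; in a software-defined sensor this is the whole
    'mode change' -- no hardware reconfiguration is needed."""
    return PATTERNS[name]
```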

Finally, we can mount the exact same lidar hardware front-facing on the roof of a passenger vehicle and rear-facing on the side mirrors of a truck, and achieve very different combinations of field of view, resolution, range, and frame rate by making changes at the software level. These powerful capabilities are made possible by the way AEye's software-defined lidar is engineered.

Is a software-defined lidar necessary for a software-defined car?

Absolutely! A software-defined car is trying to achieve the same thing – custom vehicle-level features implemented as updated software relying on the flexibility of its sensing hardware. A software-defined car will be able to create its best impact when it has access to software-defined sensors like AEye lidars.

What’s your favorite mode of transportation?

My favorite mode of transportation is biking, which I find far more relaxing than any other mode of transportation. I live close to the mountains in the peninsula, with trails all around me, and I have explored a lot of them. I love going for a ride on biking trails, especially on weekends, with my family.