Manifest: The Time is Now for Autonomous Trucking
While autonomy for passenger cars has taken a backseat to advanced driver assist and active safety features, L4 trucking autonomy is accelerating, with hub-to-hub freight hauling on interstate routes taking center stage. That’s according to a recent expert panel at Manifest, Adaptive Lidar Advances Autonomous Fleet Safety, featuring speakers from Continental and AEye and moderated by industry consultant Richard Bishop.
The panelists concurred that trucking will lead autonomy, with rollouts taking place in meaningful quantities in the next few years. That’s due to increased demand for goods, labor shortages and the fast evolution of technology – making a solid business case for L4 trucks driving themselves in certain operational domains, namely hub-to-hub long-haul transport. According to Continental’s Head of Product Line LiDAR Gunnar Juergens, “There has been significant technological progress in the trucking sector, and it’s now common perception that trucking will be the first industry to adopt autonomous driving at a larger scale.”
Part of that technological progress is the evolution of lidar, and the realization that lidar, alongside camera and radar, is essential to safety as part of the overall sensor suite. Cameras are limited in certain lighting conditions and can only estimate the distance to an object; radar performs well in inclement weather but suffers from limited resolution and high false-positive rates caused by multipath effects. Lidar, on the other hand, can perform in all weather and lighting conditions and can tell with certainty whether something is in the vehicle’s drivable path.
According to AEye’s VP of Trucking Platforms Andrew Nelson, “Lidar brings the ability to see exactly the same day or night, regardless of whether you are driving into a tunnel or how tired you are. That’s why you need multiple sensors for redundant safety.”
Adaptive lidar advancing autonomous fleet safety
But all lidars are not the same. As Juergens explained, adaptive lidar brings intelligence to the data collection process: “What do you do as a human driver? The moment you are on the highway you look at things differently: Is there something in my ego-lane? Can I classify it? Drive over it? Do I need to brake? That’s what adaptive lidar does on the highway. It focuses the laser power where needed.” Juergens explained that in the above scenario, where a truck needs to see long distance and with the best possible resolution, the sensor’s field of view is narrowly focused, but when the truck slows down or needs to maneuver, an adaptive lidar opens its field of view to, for example, 120 degrees to replicate a human’s head movement. In the latter case, a wide scope of view is more important than long range, and a software-driven lidar is intelligent enough to know that – and adapt its focus accordingly.
Nelson explained that the field of view, resolution and the distance the lidar sees can be triggered by many circumstances, including an HD map, a vehicle’s speed, or an unknown object that is of immediate interest. AEye works with OEMs to customize when to trigger the lidar to meet the various use cases the OEM is focused on – with no hardware changes required.
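The trigger-driven behavior the panelists describe can be sketched in a few lines of Python. This is a minimal illustration of the idea only, not AEye’s or Continental’s actual implementation; all names, field-of-view values and thresholds here are assumptions chosen to mirror the examples in the discussion (narrow focus at highway speed, a roughly 120-degree field of view when maneuvering, and focusing laser power on an object of immediate interest).

```python
from dataclasses import dataclass

@dataclass
class ScanConfig:
    fov_deg: float      # horizontal field of view, degrees
    max_range_m: float  # effective detection range, meters
    reason: str         # which trigger selected this pattern

def select_scan_config(speed_kph: float, object_of_interest: bool) -> ScanConfig:
    """Pick a scan pattern from simple triggers (speed, detected object).

    Hypothetical logic: a real system would also consume HD-map context
    and OEM-specific trigger rules, per the panel discussion.
    """
    if object_of_interest:
        # Concentrate laser power on the region of immediate interest.
        return ScanConfig(fov_deg=10.0, max_range_m=300.0, reason="object focus")
    if speed_kph >= 80.0:
        # Highway cruise: narrow field of view, long range ahead in the ego-lane.
        return ScanConfig(fov_deg=30.0, max_range_m=300.0, reason="highway")
    # Low-speed maneuvering: a wide field of view matters more than range.
    return ScanConfig(fov_deg=120.0, max_range_m=100.0, reason="maneuvering")

print(select_scan_config(100.0, False))
print(select_scan_config(20.0, False))
```

The point of the sketch is that the adaptation is purely software-driven: the same sensor hardware serves every case, and only the scheduling logic changes.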
The requirements of trucking are unique. In addition to clear day and night perception, the sensors must be durable and rugged against weather, shock and vibration. Nelson discussed how AEye’s MEMS-based lidar meets stringent shock and vibration requirements and is robust against weather. He added that, while radar is the best sensor for bad weather, trucks need both modalities, as lidar can see farther.
The panelists also discussed last mile – or street-level – delivery, and Nelson noted that the same adaptive lidar can serve these use cases thanks to its modular architecture. “If someone says, I want a different form factor, we can assemble the part in a different manner. There is no limitation to taking the same components and moving them higher, taller and adjusting per demand – whether that be a UPS, FedEx medium or light duty pickup truck, sedan, or seeing through glass – the same product can be used. That’s part of the adaptability.”
Cost-downs & regulations play a role in driving volumes
Panelists agreed that driving lidar costs down, getting volumes up and creating the right partner ecosystem are key to advancing toward widespread fleet adoption. Juergens reminded the audience that radar faced a similar hurdle, and that, in 1999, Continental industrialized radar technology and was able to realize an affordable price point that helped bring important safety functions like emergency brake assist to market.
“We think we can repeat the radar story. We want to make lidar solid-state and suitable for automotive manufacturing so people can use it just like any other automotive part and trust it for the life of the vehicle,” said Juergens. AEye and Continental believe that their approach, in which AEye licenses its architecture and Continental commercializes and manufactures its own adaptive lidar product, will ultimately bend the cost curve and improve scale. According to Juergens, “The next step is OEMs adopting the technology in large volumes within their cost and performance requirements.”
Trucking regulations are still in flux, and the OEMs and autonomous trucking platform providers are navigating DOT compliance. The panelists said a big part of this process is building trust that the technology works and that trucks using ADAS and autonomous features are safe. According to Nelson, “If you want to be safe in an autonomous world, you need lidar, and I wouldn’t be surprised if the DOT said ‘you need lidar’”, to which Juergens added that many states are bullish too.
Looking at the decade ahead
To close out the panel, Bishop asked the panelists how to measure the success of this technology, and what the timeline is for deployment. The consensus was that the public will see the first vehicles on the road in late 2024 into 2025, but that a dearth of public announcements in the near term should not be read as inactivity. “This year might sound quiet, but is the year for decision making for vehicles that come to market in 2025,” said Juergens. “This is something with high complexity. ‘Delay’ isn’t because people don’t think it’s possible. It’s complex, and it’s only 2-3 years until we see these trucks on the road. I’m very convinced that lidar has a great future to make autonomy safe.” Added Nelson, “Everyone’s all in – now it’s testing and evaluation of the technology in fleets and determining your path. Trucking companies need a robust way to drive down cost and increase reliability – we have that and are excited for the year ahead.”
To watch the full panel discussion, Adaptive Lidar Advances Autonomous Fleet Safety, click here.