Connected Content: Relevant Updates
======================================================================================
As the automotive industry continues to evolve, the integration of automated driving features is becoming a key focus for car manufacturers. For development teams, the benefits of a well-architected approach are clear: minimized rework, faster development cycles, quick feature additions and fixes, and a shorter time-to-market.
The foundational technology for these advancements is environmental perception, which forms the backbone of Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD). This technology, often utilizing artificial intelligence and computer vision, allows vehicles to identify and track objects such as pedestrians, vehicles, and road signs.
The technology required to enable ADAS and AD systems can be broken down into six main categories: Sensing, Perception, Fusion, Localization, Path Planning, and Control. Sensing involves the use of cameras, lidar, radar, and ultrasonic sensors to gather real-time data about the vehicle's environment.
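As a rough illustration of how these stages chain together, the sketch below models the processing flow in Python. All class names, fields, and data values are invented for this example; they do not correspond to any vendor's API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the ADAS/AD processing chain described above.
# Every name and value here is an illustrative assumption.

@dataclass
class SensorFrame:
    camera: list = field(default_factory=list)      # image-space detections
    lidar: list = field(default_factory=list)       # 3D point returns
    radar: list = field(default_factory=list)       # range/velocity returns
    ultrasonic: list = field(default_factory=list)  # near-field distances

def sense() -> SensorFrame:
    # Stand-in for gathering real-time data from the vehicle's sensors.
    return SensorFrame(camera=["pedestrian"], radar=[(12.0, -1.5)])

def perceive(frame: SensorFrame) -> list:
    # Identify objects (pedestrians, vehicles, road signs) from sensor data.
    return list(frame.camera)

# The six stages the text names, in processing order.
stages = ["Sensing", "Perception", "Fusion", "Localization",
          "Path Planning", "Control"]

frame = sense()
objects = perceive(frame)
print(stages)
print(objects)
```

Downstream stages (fusion, localization, planning, control) would consume `objects` in the same hand-off style.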
Fusion refers to combining outputs from multiple sources, either at the object level or at a lower, raw-data level. While object-level fusion is the approach used in most current sensor fusion solutions, it discards raw sensor information before the data is combined, which limits performance. Low-level fusion, by contrast, offers advantages such as inherent system redundancy, compensation for the limitations of any one sensor, and the ability to scale the same system from ADAS to AD applications.
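The difference can be shown with a deliberately simplified numeric toy: two sensors each see weak evidence of the same object, below their individual detection thresholds, so object-level fusion misses it, while combining the raw evidence first recovers the detection. The threshold and evidence values are invented for illustration.

```python
# Toy contrast between object-level and low-level fusion (illustrative only).
# Two sensors each return raw confidence "evidence" for an object at the
# same location; values and threshold are made up.

camera_raw = 0.4   # weak camera evidence (e.g. partially occluded pedestrian)
radar_raw = 0.45   # weak radar evidence at the same location
THRESHOLD = 0.5    # detection threshold

# Object-level fusion: each sensor decides independently first,
# and only the per-sensor decisions are merged afterwards.
camera_obj = camera_raw >= THRESHOLD            # False
radar_obj = radar_raw >= THRESHOLD              # False
object_level_detection = camera_obj or radar_obj  # object missed

# Low-level fusion: raw evidence is combined before the detection decision,
# so weak cues from several sensors can reinforce one another.
low_level_detection = (camera_raw + radar_raw) >= THRESHOLD  # object found

print(object_level_detection, low_level_detection)
```

This also hints at the redundancy argument: if one sensor degrades (rain, glare), its weak evidence still contributes rather than being thrown away at a per-sensor threshold.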
Localization determines the vehicle's precise position in its environment, using GPS, high-definition maps, and inertial measurement units (IMUs). Path planning, meanwhile, refers to developing an autonomous driving route from point A to point B, with the path-planning algorithm needing to be adaptable and robust to handle dynamic changes.
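A minimal sketch of the localization idea, assuming a one-dimensional complementary filter that blends dead-reckoned motion (from IMU/odometry) with noisy absolute GPS fixes; all constants and measurements below are made-up illustrative values, not a production algorithm.

```python
# 1D complementary-filter sketch of GPS/IMU fusion for localization.
# All numbers are invented for illustration.

dt = 0.1          # seconds between updates
alpha = 0.9       # weight placed on the IMU-propagated estimate

position = 0.0    # fused position estimate (metres along the lane)
velocity = 10.0   # from IMU/odometry integration (m/s)

gps_fixes = [1.2, 2.1, 3.3, 4.0, 5.1]  # noisy absolute positions (metres)

for gps in gps_fixes:
    predicted = position + velocity * dt              # dead-reckon with IMU
    position = alpha * predicted + (1 - alpha) * gps  # correct with GPS

print(round(position, 2))
```

Production systems use richer estimators (e.g. Kalman filters over 3D pose, corrected against high-definition maps), but the predict-then-correct structure is the same.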
Control systems translate the decisions made by the vehicle's AI into physical actions such as steering, acceleration, and braking. The role of these systems is pivotal for ADAS and AD developers, who strive to boost system reliability and performance. When assessing the performance of perception systems, automotive OEMs and Tier 1s must consider factors such as false alarms, object separation capability at large distances, occluded object detection ability, perception range for a given sensor set, and performance in adverse conditions.
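Two of the evaluation factors above can be expressed as a short calculation: the detection (recall) rate captures missed objects such as occluded targets at long range, and the false-alarm rate captures spurious detections. The counts below are invented illustrative numbers.

```python
# Illustrative perception-evaluation metrics; all counts are made up.

true_positives = 92   # real objects correctly detected
false_negatives = 8   # real objects missed (e.g. occluded at long range)
false_positives = 5   # false alarms: detections with no real object
frames = 100          # frames evaluated

detection_rate = true_positives / (true_positives + false_negatives)
false_alarms_per_frame = false_positives / frames

print(detection_rate)          # 0.92
print(false_alarms_per_frame)  # 0.05
```

OEMs and Tier 1s typically track such metrics separately per condition (night, rain, fog) and per distance band, since aggregate numbers can hide the adverse-condition weaknesses the text mentions.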
The advanced driver assistance system (ADAS) and autonomous driving (AD) market is expected to reach $42Bn by 2030, growing at a compound annual growth rate of 11% from 2020 to 2030. This growth is driven by the increasing demand for safer and more efficient vehicles.
The Society of Automotive Engineers (SAE) has categorized driving automation into six levels, spanning from Level 0 (no automation) to Level 5 (full driving automation). Almost all new passenger vehicles sold in North America are equipped with Level 1 capabilities, while most are equipped with Level 2 capabilities. Level 3 (conditional automation) is the stage at which the vehicle can execute all aspects of the dynamic driving task within specific conditions, such as highway driving, but will prompt human intervention when these conditions are not met. Level 4 (high automation) is the stage at which the vehicle can independently achieve all driving tasks within specific conditions, such as geofenced areas or road types, with no human driver intervention expected in these predefined environments.
Companies developing and commercially operating Level-4 autonomous vehicles include Uber (in partnership with Momenta, testing in Munich, Germany), Chinese firms Pony.ai and AutoX with fully driverless robotaxi services in major Chinese cities, and WeRide with robotaxi business expansion and Renault cooperation. Aurora Innovation is advancing Level-4 autonomous trucks for commercial use with partners like Nvidia and Continental.
One company positioned to help car manufacturers develop these capabilities is LeddarTech, whose AI-based technology provides environmental perception that scales from front-view to surround-view by adding sensors and recalibrating the system.
The ADAS & AD market can be subdivided into three categories: environmental perception, functional integration, and prediction & planning. As the industry continues to evolve, it is clear that the future of driving is automated, and companies like LeddarTech are leading the way.