Auto Express

Advanced driver assistance systems: Camera-only or sensor fusion?


One of the fiercest areas of competition in the automotive industry today is advanced driver assistance systems (ADAS) and autonomous driving, both of which have the potential to dramatically improve safety on the road as a whole.

Building on these technologies, a fully autonomous, Level 5 car could deliver game-changing productivity and economic benefits, such as fleets of robotaxis that eliminate the need to pay drivers, or the ability for occupants to work or rest while the vehicle drives itself.

Automakers are currently pursuing two main approaches to ADAS and autonomous driving systems, with interim steps manifesting in the driver-assistance features we see and use today: autonomous emergency braking (AEB), lane-keep assist, blind-spot warning and the like.

MORE: How self-driving is my car? Self-driving levels explained

The first approach relies solely on cameras as the data source on which the system makes its decisions. The second, known as sensor fusion, aims to combine data from cameras with other sensors such as lidar, radar and ultrasonic sensors.

Camera only

Tesla and Subaru are two popular automakers that rely on cameras alone for their ADAS and autonomous driving features.

Philosophically, the rationale for using only cameras can perhaps be summed up by paraphrasing Tesla CEO Elon Musk, who has argued that there is no need for anything other than cameras when people can drive using nothing other than their eyes.

Musk has gone further, suggesting that having multiple cameras is like having eyes in the back of your head, allowing a vehicle to be driven with a significantly higher level of safety than a normal person could achieve.

The Tesla Model 3 and Model Y sold today each feature a setup of eight outward-facing cameras.

These include three forward-facing cameras mounted behind the windshield, each with a different focal length, a pair of forward-facing cameras mounted in the B-pillars, a pair of rearward-facing cameras mounted in the side repeater light housings, and the legally required reversing camera.

Meanwhile, Subaru uses a pair of windshield-mounted cameras on most versions of its EyeSight suite of driver assistance systems, while the latest-generation EyeSight X, as fitted to the MY23 Subaru Outback (so far revealed for the US but coming soon), adds a third wide-angle grayscale camera for a broader field of view.

Proponents of these camera-only setups claim that using multiple cameras, each with a different field of view and focal length, allows for full depth perception to support technologies like adaptive cruise control, lane-keep assist and other ADAS features.
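How two cameras can yield distance is worth a brief illustration. Neither Tesla nor Subaru publishes its depth pipeline, but the underlying geometry is simple: an object appears at slightly different positions in each camera's image, and that offset (the disparity) shrinks as distance grows. The sketch below uses a made-up focal length and camera spacing purely for illustration; they are not real EyeSight or Tesla Vision figures.

    # Simplified stereo depth estimate: depth = focal_length * baseline / disparity.
    # Focal length (in pixels) and baseline (camera spacing, in meters) are
    # illustrative assumptions, not values from any production system.
    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        return focal_px * baseline_m / disparity_px

    # A car whose image shifts by 20 pixels between two cameras 0.35 m apart,
    # with a 1,400-pixel focal length, works out to roughly 24.5 m away.
    print(depth_from_disparity(1400.0, 0.35, 20.0))  # ~24.5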

A camera-only approach also removes the need to allocate valuable computing resources to interpreting other data inputs, and eliminates the risk of receiving conflicting information that would force the on-board computer to prioritize data from one type of sensor over another.

With radar and other sensors typically mounted behind or inside the front bumper, adopting a camera-only setup also has the practical benefit of reducing repair bills in the event of a collision, as there are no such sensors to replace.

The obvious limitation of relying solely on cameras is that their effectiveness is severely reduced in bad weather such as heavy rain, fog or snow, or at times of day when sunlight shines directly into the camera lens. There is also the risk that a dirty windshield will obscure visibility and so impede performance.

However, in a recent presentation, Andrej Karpathy, the former head of Tesla's Autopilot program, claimed that developments in Tesla Vision can effectively mitigate problems caused by temporary extreme weather.

Using advanced neural networks and techniques such as automatic labeling of objects, Tesla Vision can continue to recognize objects in front of the vehicle and predict their path over at least short distances, even when debris or hazardous weather momentarily obscures the camera's view.
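Tesla has not published the internals of these networks, so the snippet below is only a rough illustration of the simplest possible version of the idea: carrying an object's track through a brief gap in observations by extrapolating from its last known positions at constant velocity. Real systems use far more sophisticated, learned motion models.

    # Illustrative only: bridge a brief gap in camera observations by
    # extrapolating a tracked object's position at constant velocity.
    def predict_position(track, dt):
        # track: list of (time_s, x_m, y_m) observations, most recent last
        (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
        vx = (x1 - x0) / (t1 - t0)  # velocity estimated from the last two fixes
        vy = (y1 - y0) / (t1 - t0)
        return x1 + vx * dt, y1 + vy * dt  # expected position dt seconds later

    # An object seen at x = 10 m, then 11 m a tenth of a second later, is
    # predicted to be near 11.5 m after a further 0.05 s of occlusion.
    print(predict_position([(0.0, 10.0, 0.0), (0.1, 11.0, 0.0)], 0.05))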

However, if the weather stays bad, the quality and reliability of the data coming from the cameras is unlikely to match that of a fusion setup combining data from sensors such as radar, which are less affected by bad weather.

Furthermore, relying on a single sensor type also sacrifices the redundancy that comes from having several different types of sensors.

Sensor fusion

In contrast, the vast majority of automakers have chosen to use multiple sensors to develop their ADAS and related autonomous driving systems.

Called sensor fusion, this involves taking a simultaneous feed from each of these sensors and combining them to create a complete and reliable picture of the driving environment around the car.
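Production systems typically do this with techniques such as Kalman filtering, but the basic idea can be sketched much more simply: each sensor reports its own estimate of, say, the distance to the car ahead, and the estimates are blended with more weight given to the sensors trusted most in the current conditions. The numbers and variances below are illustrative assumptions only.

    # Toy sensor fusion: blend distance estimates to the same object from
    # several sensors, weighting each reading by the inverse of its variance
    # (i.e. how much it is trusted). Real systems use Kalman filters and more.
    def fuse_distances(readings):
        # readings: list of (distance_m, variance) tuples, one per sensor
        weights = [1.0 / var for _, var in readings]
        total = sum(w * d for (d, _), w in zip(readings, weights))
        return total / sum(weights)

    # Camera says 42 m (noisy in fog), radar says 40 m (tight), lidar 40.5 m:
    # the fused estimate leans towards the more trusted sensors (~40.2 m).
    print(fuse_distances([(42.0, 4.0), (40.0, 0.25), (40.5, 0.5)]))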

As discussed above, in addition to the multitude of cameras, the sensors deployed typically include radar, ultrasonic sensors, and in some cases, lidar sensors.

Radar (radio detection and ranging) detects objects by emitting pulses of radio waves and measuring the time it takes for them to be reflected back.

On the downside, it does not provide the same level of detail as lidar or cameras, and its low resolution means it cannot pinpoint the exact shape of an object or distinguish between multiple smaller objects placed close together.

However, it is not affected by weather conditions such as rain, fog or dust, and is generally a reliable indicator of whether there is an object in front of the vehicle.
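Radar, lidar and ultrasonic sensors all measure range the same way, by timing an echo; only the speed of the signal differs. A back-of-the-envelope version of the calculation, with illustrative echo times, looks like this:

    # Range from echo time: the pulse travels out and back, so divide by two.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s, applies to radar and lidar pulses
    SPEED_OF_SOUND = 343.0          # m/s in air at ~20 C, for ultrasonic pulses

    def range_from_echo(echo_time_s, speed_m_s):
        return speed_m_s * echo_time_s / 2.0

    # A radar echo returning after 267 nanoseconds puts an object about 40 m away;
    # an ultrasonic echo after 12 milliseconds puts one roughly 2 m away.
    print(range_from_echo(267e-9, SPEED_OF_LIGHT))  # ~40.0
    print(range_from_echo(12e-3, SPEED_OF_SOUND))   # ~2.06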

Lidar (light detection and ranging) sensors work on the same basic principle as radar, but instead of radio waves they use lasers, emitting pulses of light that are reflected back by surrounding objects.

Even more so than a camera, lidar can build highly accurate 3D maps of the car's surroundings, can distinguish between, say, pedestrians and animals, and can easily track the movement and direction of these objects.

However, like cameras, lidar continues to be affected by weather conditions and remains expensive to install.

Ultrasonic sensors are commonly used in the automotive space as parking sensors, giving drivers an audible indication of how close they are to other cars through a technique known as echolocation, also used by bats in the natural world.

Effective at measuring short distances at low speeds, in the ADAS and autonomous vehicle space these sensors can, for example, allow a car to automatically find and park in an empty spot in a multi-storey car park.

The main benefit of adopting a sensor fusion approach is the chance to gather more reliable, accurate data across a wider range of conditions, as different types of sensors perform more effectively in different situations.

This approach also offers a greater chance of redundancy in the event a particular sensor fails.

Of course, more sensors also mean more hardware, which ultimately pushes the cost of a sensor-fusion setup beyond that of an equivalent camera-only system.

Lidar sensors, for example, are typically found only on luxury vehicles, such as the Mercedes-Benz EQS with its Drive Pilot system.





