
What’s the best perception sensor to use alongside ADAS technology?

Blogs October 29, 2024

Did you know that Tesla cars only use cameras in their ADAS setup? Well, they did when we wrote this, but things have a habit of changing! Ultrasonic was in, and then out, and if reports are to be believed, LiDAR is being used at Tesla for something, though we don’t yet know what! So what is the best sensor to use alongside ADAS technology?

Other vehicle manufacturers tend to use a range of sensors, including cameras, radar, LiDAR, and sonar (also known as ultrasonic). In this blog, we look at these sensors, discuss their various strengths and weaknesses, and then explore the potential of sensor fusion in ADAS systems.

 

Are Tesla still betting everything on cameras, or is LiDAR making a play?

 

Is there a single best sensor for testing ADAS technology? In short, no. Different sensors, unsurprisingly, favour different applications and environments. Each has its own drawbacks, whether that’s in performance or cost. So, what are those?

 

Are cameras the best sensor for testing ADAS?

The cameras used in ADAS technology are highly advanced, processing enormous amounts of data about the world around them. As we’ve already mentioned, Tesla has bet everything on cameras – they’re the only sensors they use. Their logic is that, if humans can make do with a vision-only system like eyes to drive, another vision-only system like cameras should also be sufficient. And, of course, the success of Tesla indicates that they are right – to an extent.

Cameras are prone to a few issues that affect their performance – mainly changing light conditions. Cameras struggle in low light, and can suffer from glare from the road or standing water in bright conditions. Switching from light to dark and back again (when entering and exiting a tunnel, for instance) can also affect their performance. Vitally, they also can’t estimate distance without an accompanying algorithm that processes the image. Every other sensor in this article is specifically designed to measure distance by sending out a wave, or beams, and analysing what returns. That means that systems like autonomous emergency braking (AEB) calculate distance completely differently when relying on cameras than when using sensors that measure distance directly.
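To illustrate why cameras need an algorithm to estimate distance, here's a minimal sketch of one common approach: the pinhole camera model with similar triangles, which assumes the real-world size of a detected object class is known. All numbers (focal length, object height) are illustrative and not taken from any real ADAS system.

```python
# Sketch: estimating range from a single camera via the pinhole model.
# A camera alone gives no depth; one workaround assumes the real-world
# height of a known object class (e.g. a car) and uses similar triangles:
#     distance = focal_length_px * real_height_m / pixel_height
# All values below are illustrative.

def camera_range_estimate(focal_length_px: float,
                          real_height_m: float,
                          pixel_height: float) -> float:
    """Approximate distance to an object of known physical height."""
    return focal_length_px * real_height_m / pixel_height

# A car ~1.5 m tall, imaged 100 px high through a 1000 px focal length lens:
print(camera_range_estimate(1000.0, 1.5, 100.0))  # 15.0 (metres)
```

Note how the estimate depends entirely on the assumed object height and the detector's pixel measurement, which is why camera-derived ranges degrade when lighting or image quality does.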

 

Radar: weatherproof ADAS sensors

Both radar and camera sensors are common in ADAS solutions – but radar sensors have an edge over camera sensors when it comes to low-visibility conditions. Because radar works using radio waves, it’s not affected by fog, darkness, or blinding lights. They can also detect range – it’s one of radar technology’s primary functions, after all – which makes them simpler to integrate into an ADAS setup. That makes radar vital for applications where distance is a factor, such as adaptive cruise control or blind spot detection warnings.

However, radar sensors are often not able to detect much detail about the objects they find. That can cause your ADAS system to flag a false positive – for instance, activating AEB because the radar detected a paper bag.

 

Sonar ADAS sensors: cheaper sensing for low speed applications

Just like radar, sonar sends out waves and analyses the returning waves to identify objects and their distance from the sensor. But sonar uses sound waves, which are at a far lower frequency than radar’s radio waves. It’s cheaper than radar, but its use is limited to short-range applications where the air is relatively still. At longer ranges, the sound waves that sonar uses travel too slowly to be useful, and if there’s too much air movement the readings get distorted. For this reason, sonar is the sensor of choice in parking systems.
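The short-range limitation follows directly from the time-of-flight arithmetic. A minimal sketch, assuming the standard ~343 m/s speed of sound in dry air at 20 °C:

```python
# Sketch: sonar ranging by time of flight. Sound travels out, reflects,
# and returns, so one-way distance is half the round trip. The low speed
# of sound is why sonar suits short ranges: at 5 m the echo already takes
# roughly 29 ms to come back.

SPEED_OF_SOUND_M_S = 343.0  # approximate; varies with temperature

def sonar_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from the echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo returning after 10 ms puts the obstacle about 1.7 m away:
print(round(sonar_distance_m(0.010), 3))  # 1.715
```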

 

LiDAR: high resolution ADAS sensor, with a higher price

LiDAR (as we’ve talked about in a few places on our website) works like radar and sonar, but uses light beams instead. Because of this, LiDAR ADAS sensors give very accurate distance data and can be used to build high-resolution images of the world around your vehicle. Although LiDAR works in bright and low light because it generates its own light, it can be affected by fog, rain, or smoke.
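The "high-resolution image" is really a point cloud: each laser return is a range plus the beam's pointing angles, converted to a 3D point. A minimal sketch of that conversion, with illustrative angle conventions (x forward, y left, z up):

```python
import math

# Sketch: turning one LiDAR return (range + beam angles) into a 3D point.
# Stacking thousands of such returns per scan builds the high-resolution
# picture described above. Angle conventions are illustrative.

def lidar_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LiDAR return to Cartesian (x, y, z) in the sensor frame."""
    horizontal = range_m * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)   # forward
    y = horizontal * math.sin(azimuth_rad)   # left
    z = range_m * math.sin(elevation_rad)    # up
    return (x, y, z)

# A return 10 m away, dead ahead and level, lands at (10, 0, 0):
print(lidar_point(10.0, 0.0, 0.0))
```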

The biggest barriers to LiDAR adoption have traditionally been the cost of the sensors (which is far greater than radar or sonar), and the fact that LiDAR sensors can’t detect colours – making them less than ideal for spotting red lights at junctions, among other things. It should be noted, though, that the cost of LiDAR is coming down as the technology advances.

 

OxTS xNAV650 GNSS/INS and Ouster LiDAR

 

ADAS sensor fusion gives the best result

Although some manufacturers will likely always favour one type of sensor over another, the best “sensor” will always be a mix of all the technologies available. Each of the different sensor types has strengths that compensate for the weaknesses of the others. For instance:

  • Cameras can be used in conjunction with LiDAR to accurately identify traffic lights, and what colour they are.
  • Using sonar wherever possible helps to keep the overall cost of the car down.
  • Radar can be combined with camera data to mitigate the radar’s lack of definition, and to offset any limitations in camera performance due to light levels.
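As a toy illustration of the radar-plus-camera point above, here is a hedged sketch of a fusion gate for the paper-bag false positive: radar supplies range, the camera supplies a class label and confidence, and the system only considers braking when both agree. The class names, thresholds, and function are made up for illustration, not any real AEB logic.

```python
# Sketch: a toy fusion rule for the false-positive problem above.
# Radar says something is close; the camera says what it is and how
# confident it is. Braking is only considered when all three checks pass.
# Thresholds and class names are illustrative.

HARD_OBJECTS = {"car", "pedestrian", "cyclist"}

def should_consider_braking(radar_range_m: float,
                            camera_label: str,
                            camera_confidence: float) -> bool:
    close = radar_range_m < 20.0           # radar: something is near
    solid = camera_label in HARD_OBJECTS   # camera: it's worth braking for
    trusted = camera_confidence > 0.5      # camera: the label is credible
    return close and solid and trusted

print(should_consider_braking(8.0, "paper_bag", 0.9))   # False
print(should_consider_braking(8.0, "pedestrian", 0.9))  # True
```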

Even Tesla’s camera-only setup is based on “radar-informed cameras”. Early Tesla models fused radar and cameras, and the data they’ve built since then has been used to develop a camera-only solution.

The new challenge this presents, however, is sensor fusion. There are a few different levels to this challenge:

 

 – Time synchronisation

You need to make sure that all your sensors feed their data into the ADAS system in sync. Otherwise, readings from one sensor may contradict the others, even though they are recording the same object or event.
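One common way to reconcile streams that sample at different rates is to resample one onto the other's timestamps by interpolation. A minimal sketch with made-up data, assuming each stream is a sorted list of (timestamp, value) pairs:

```python
# Sketch: bringing two sensor streams onto one timebase by linear
# interpolation. We resample the radar stream at a camera frame's
# timestamp so the two readings can be compared like-for-like.
# Data is purely illustrative.

def interpolate_at(samples, t):
    """Linearly interpolate a sorted (time, value) stream at time t."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    raise ValueError("t outside the sampled interval")

radar_ranges = [(0.00, 10.0), (0.10, 9.0), (0.20, 8.0)]  # 10 Hz radar
camera_time = 0.05                                       # camera frame time

print(interpolate_at(radar_ranges, camera_time))  # 9.5
```

This only works, of course, if all sensors share an accurate common clock in the first place, which is the heart of the synchronisation challenge.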

 

 – Orientation

Your sensors will be stuck in different places on the car, with their own interpretation of what up, down, left, right, north, east, x, y, and z might be. You need to bring them all into a common frame of reference so that the system gets an accurate picture of where an object is in relation to the vehicle (and not have a situation where one sensor says there’s a bike next to the car, while the other says it’s above it, for instance!).
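Bringing sensors into a common frame is usually done by applying each sensor's mounting rotation and position (lever arm) to its detections. A minimal 2D sketch (x forward, y left), with illustrative mounting values:

```python
import math

# Sketch: moving a detection from a sensor's frame into the vehicle frame.
# Each sensor has a mounting position and a yaw angle relative to the
# vehicle; rotating then translating puts every sensor's detections into
# one common frame. 2D for brevity; mounting values are illustrative.

def sensor_to_vehicle(x_s, y_s, mount_x, mount_y, yaw_rad):
    """Rotate a sensor-frame point by the mounting yaw, then translate."""
    x_v = mount_x + x_s * math.cos(yaw_rad) - y_s * math.sin(yaw_rad)
    y_v = mount_y + x_s * math.sin(yaw_rad) + y_s * math.cos(yaw_rad)
    return (x_v, y_v)

# A side radar on the front-right corner, turned 90° to the right, sees a
# bike 2 m "ahead" of itself; in the vehicle frame that bike is actually
# 2 m to the right of the mounting point:
print(sensor_to_vehicle(2.0, 0.0, 3.5, -0.8, -math.pi / 2))
```

A real system does this in 3D with full roll/pitch/yaw rotations, but the principle is the same: without this step, "2 m ahead" means something different to every sensor.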

 

 – Accuracy

Every sensor will have differing levels of accuracy, and there will be times when some sensors give false readings such as when cameras transition quickly from low to high light. Your system needs to be able to identify measurements which are likely inaccurate and exclude them from the calculations being made. This is vital for ensuring your systems activate at the right time, every time.
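One simple way to sketch this idea: reject readings far from the consensus, then combine the rest weighted by each sensor's accuracy (inverse variance), so better sensors count for more. The variances and the rejection gate are illustrative, not any real system's values.

```python
# Sketch: fusing range estimates from several sensors while rejecting
# outliers. Readings far from the median are discarded (the "camera
# blinded at a tunnel exit" case); the rest are averaged weighted by
# inverse variance. Variances and the 2 m gate are illustrative.

def fuse_ranges(readings, gate_m=2.0):
    """readings: list of (range_m, variance). Returns the fused range."""
    values = sorted(r for r, _ in readings)
    median = values[len(values) // 2]
    kept = [(r, var) for r, var in readings if abs(r - median) <= gate_m]
    weights = [1.0 / var for _, var in kept]
    return sum(r * w for (r, _), w in zip(kept, weights)) / sum(weights)

# Radar (accurate), LiDAR (very accurate), camera (blinded, way off):
print(fuse_ranges([(10.2, 0.5), (10.0, 0.1), (25.0, 4.0)]))
```

The camera's wild 25 m reading is gated out, and the result sits close to the trusted LiDAR value rather than being dragged off by the bad measurement.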

 

 

“We use the data from OxTS devices to validate the performance of the other sensors on our vehicles as well as the vehicle itself. If OxTS say the sensor isn’t performing as we expect, this is enough to make us question its accuracy”

Leading tier-1 vehicle manufacturer

 

ADAS testing is vital to sensor fusion success

With a setup as complicated as ADAS can be, testing is vital to ensure that your systems deploy when they should – and to work out why not, if they don’t. Of particular importance is the ability to monitor and analyse data from each sensor alongside information about your vehicle’s position and dynamics. This is where OxTS can help. Using the OxTS GAD Interface, our GNSS/INS devices can be optimised for sensor fusion across a range of applications, and our hardware is built to integrate easily into any ADAS testing solution, giving precise information that can be used to understand vehicle and sensor performance.

If you’re testing ADAS systems for an autonomous vehicle, you’ll also be interested to know that we have a plugin for NVIDIA DRIVE. The plugin gives DRIVE Linux users access (via DriveWorks) to GNSS and IMU measurements from an OxTS GNSS/INS that can be used as ground-truth data for validating sensor and perception stack performance – in real time.

 

For more information on how OxTS GNSS/INS devices can help validate your ADAS technology, download our handy ADAS testing solution brief or our range of ADAS application guides.

  • Automatic Emergency Braking (Car-to-Car Rear Moving and Braking) – Read Guide
  • Automatic Emergency Braking (Car-to-Car Rear Stationary) – Read Guide
  • Automated Valet Parking (AVP) and Park Assist – Read Guide
  • Lane Departure Warning (LDW) and Lane Keep Assist – Read Guide

 

If you’re an engineer working in the ADAS testing field and need help validating your sensor performance, get in touch to discuss how OxTS can help.


