
AMR Navigation Spotlight – Sensors

Blogs July 17, 2024

Welcome to the second blog in our AMR navigation spotlight series, where we’ll be focusing on the sensors you use for localisation and perception. The first blog in the series discusses methods for localising your autonomous mobile robot in different environments, and transitioning between them.

The sensors on your autonomous mobile robot (AMR) perform two functions: they feed into the localisation and navigation of the robot, and they enable the robot to perceive the world around it. Both are vital, and both need consideration when you’re designing and building your AMR.

In this blog we’re going to discuss some of the technical considerations you’ll need to take into account when building your platform.

 

Top four AMR sensor considerations

1. Field of view

It sounds obvious, but your sensors need to be able to see what they need to see. For navigation sensors such as GNSS, getting that right is simple – the antenna needs an unobstructed view of the sky. With other sensors, though, it can be more subtle.

For example, if your AMR has an on-board camera that it uses to recognise ArUco markers for localisation, you may want to consider mounting the camera slightly off-centre. That gives the camera a better field of view, allowing it to see more markers – and the knock-on effect is that your AMR will be able to localise more accurately.
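As a back-of-envelope illustration of why placement matters, camera coverage and marker visibility can be sketched with a little trigonometry. This is a minimal sketch with made-up angles, assuming a simple pinhole camera with a symmetric field of view:

```python
import math

def horizontal_coverage(fov_deg: float, distance_m: float) -> float:
    """Width of the scene visible at a given distance for a camera
    with the given horizontal field of view (pinhole model)."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

def marker_visible(fov_deg: float, camera_yaw_deg: float,
                   marker_bearing_deg: float) -> bool:
    """True if a marker at the given bearing (relative to the robot's
    forward axis) falls inside the view of a camera yawed off-centre."""
    return abs(marker_bearing_deg - camera_yaw_deg) <= fov_deg / 2.0

# A 60-degree camera mounted straight ahead misses a marker at a
# 40-degree bearing, but yawing the mount 20 degrees brings it into view.
print(marker_visible(60.0, 0.0, 40.0))   # False
print(marker_visible(60.0, 20.0, 40.0))  # True
```

Plugging your own camera's field of view and your marker layout into sums like these is a quick sanity check before you commit to a mounting position.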

Consider as well whether you have a 2D (planar) or 3D sensor. We’ll come on to this in more detail further down the blog, but for sensor placement, 2D sensors need to be positioned carefully to ensure they detect what they’re supposed to – for example, mounting a 2D radar at the right height to detect every object that presents an obstacle to your AMR.
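To make the planar-sensor point concrete, here’s a minimal sketch (with hypothetical heights) of the check you’d do when mounting a 2D sensor: a horizontal scan line only intersects obstacles that reach up to the beam height at that range.

```python
import math

def detected_by_planar_scan(mount_height_m: float, obstacle_height_m: float,
                            range_m: float, tilt_deg: float = 0.0) -> bool:
    """True if a horizontal (optionally tilted) 2D scan line, fired from
    mount_height_m, passes through an obstacle of the given height at range_m."""
    beam_height = mount_height_m + range_m * math.tan(math.radians(tilt_deg))
    return 0.0 <= beam_height <= obstacle_height_m

# A sensor mounted 0.2 m up sees a 0.5 m tall box at 3 m...
print(detected_by_planar_scan(0.2, 0.5, range_m=3.0))  # True
# ...but misses a 0.1 m kerb entirely.
print(detected_by_planar_scan(0.2, 0.1, range_m=3.0))  # False
```

The function names and figures are illustrative, not from any particular sensor datasheet; the point is that with a planar sensor the mounting height decides which obstacles exist at all, as far as the robot is concerned.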

2. Sensor interference

Some sensors, such as LiDAR and GNSS, can cause interference when placed too close together.

In the LiDAR/GNSS case, the interference is caused by the LiDAR’s motor generating an electromagnetic field that can affect the GNSS receiver. Here, shielding is likely the best option for mitigating it – a metal plate between the two sensors should reduce the impact.

For other sources of interference, you’ll need to experiment to find out how best to mitigate them.

 

3. 2D or not 2D?

Shakespeare puns aside, a major consideration when designing your autonomous mobile robot is whether to use 2D or 3D sensors. We’ve already discussed how the choice affects your sensor placement, but the decision is bigger than that. 3D sensors can cost considerably more than 2D sensors, which has an obvious impact on the commercial viability of your platform – from that perspective, you want the cheapest sensors that still meet your requirements, so you can offer a competitively priced final product.

On the other hand, as we’ve already discussed, 2D sensors need careful placement to make sure they can detect the things they need to detect. 3D sensors give you more freedom over placement.

The AV200 is the localisation device of choice in many autonomous applications.

4. Teaming up your sensors

In automotive circles, lots of different sensor technologies are used in advanced driver assistance systems (ADAS). A common approach is to have multiple sensors doing the same job – for instance, using radar and cameras together for autonomous emergency braking (AEB) systems. Radar is very good at estimating the distance between the vehicle and an object – but not at working out what the object is. Cameras can’t calculate distance easily, but they can recognise objects – so working together they create a reliable AEB system that doesn’t accidentally trigger when the radar detects a paper bag.

The same principle applies in AMR development. Don’t be afraid to experiment with fusing data from multiple sensors to create a more reliable result. For localisation in particular, you can fuse multiple sensors together to maintain position and navigation accuracy in a range of environments.
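As a simple illustration of the principle (not OxTS’s implementation), two independent position estimates can be fused by weighting each with the inverse of its variance – the fused estimate always has lower variance than either input:

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Minimum-variance fusion of two independent 1D estimates:
    weight each estimate by the inverse of its variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# e.g. one sensor says x = 10.0 m, another says x = 12.0 m, both with
# variance 1.0: the fused estimate splits the difference at half the variance.
pos, var = fuse(10.0, 1.0, 12.0, 1.0)
print(pos, var)  # 11.0 0.5
```

This 1D weighted average is the intuition behind the Kalman-filter update step that full localisation systems use across many dimensions at once.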

The OxTS Generic Aiding Data (GAD) interface can help with that sensor fusion. The OxTS Generic Aiding Data Software Development Kit (GAD SDK) provides a framework for feeding external sensor data to an OxTS INS to act as aiding for navigation and localisation. The GAD format provides a standard interface that is agnostic to the sensor type, streamlining the process of sensor fusion.

 

Don’t assume sensors are simple!

Don’t get us wrong – some elements of choosing and placing the sensors on your AMR are relatively straightforward decisions. But there are hidden complexities. The camera example we mentioned above is one, but others include ensuring the data from all your sensors is converted into a common frame of reference, as well as synchronisation (the topic of our next blog in this series). These elements all require careful thought to ensure your AMR behaves as you expect it to, and that you have a product customers will want to use at the end of the process.
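The “common frame of reference” step mentioned above amounts to applying each sensor’s mounting offset and orientation to its measurements. Here’s a minimal 2D sketch with hypothetical mounting values:

```python
import math

def sensor_to_body(point_xy, mount_xy, mount_yaw_deg):
    """Transform a 2D point measured in a sensor's own frame into the
    robot body frame, given the sensor's mounting position and yaw."""
    x, y = point_xy
    c = math.cos(math.radians(mount_yaw_deg))
    s = math.sin(math.radians(mount_yaw_deg))
    # Rotate by the mounting yaw, then translate by the mounting offset.
    return (mount_xy[0] + c * x - s * y,
            mount_xy[1] + s * x + c * y)

# A sensor mounted 0.5 m ahead of the body origin, rotated 90 degrees left:
# a point 1 m straight ahead of the sensor lies at (0.5, 1.0) in the body frame.
bx, by = sensor_to_body((1.0, 0.0), (0.5, 0.0), 90.0)
```

Real platforms do this in 3D with full rotation matrices or quaternions, but the principle is the same: every measurement must pass through its sensor’s extrinsic calibration before it can be compared or fused with data from another sensor.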

 

Autonomous Robot Navigation Solution Brief

AMRs need a robust robot localisation solution; a tool that not only records the position and orientation of the robot, but also operates both indoors and outdoors.

This solution brief steps through the aspects we recommend our customers consider when deciding on their source of localisation for their autonomous mobile robots.

Read the solution brief to learn how the right robot localisation solution can help your AMR project, including the key questions you need to ask yourself before embarking on a project.


If you’d like to learn more about what we can currently do for AMR engineers, view our application page.

Alternatively, if you’ve got a specific project that you’d like to talk to us about, contact us using the form below to get in touch. We’re always keen to help.

Keep an eye out for the next blog in our series: timing and synchronisation.

 


