Welcome to the 7th installment of our Introduction to Autonomous Mowing series. Last week, we reviewed GNSS, one of the most widely used positioning tools. This week, we will review the other sensors on Nomad that feed the navigation engine and the operational “Brain.” The aim is to help you understand why we have chosen the specific sensors we have integrated and the levels of redundancy these sensors offer.
These sensors include:
Inertial Measurement Units
Stereo cameras
LiDAR
Ultrasonic
Resolvers
Nomad relies on a suite of sensors that help the onboard AI system understand the operating environment: staying on the path, avoiding collisions and obstacles, and maintaining operational safety. These sensors combine ultrasonic, camera, and LiDAR data, with inertial and resolver measurements filling in motion and position. Each sensor serves a distinct purpose in digitally mapping the path ahead and helping Nomad "see" where it's going. While the full proliferation of fully autonomous vehicles is still some way off, these sensors are already key players in how autonomous machines function, so let's figure out what they do.
Before we look at each sensor individually, let's review the communications pack that allows the system to operate successfully. With safety as the primary focus, a redundant communications pack ensures that Nomad always has access to data and operators. Nomad continually downloads RTK corrections to maintain localization accuracy, and the operating system uploads decimated imagery to operators when it encounters unknown or new obstacles. To support this, Nomad carries an integrated quad-SIM cellular router with access to two service providers at any given time, giving it constant connectivity for the most reliable and productive operating environment. Now, let's dive into each sensor.
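RTK corrections are commonly streamed to a rover over a cellular link using the NTRIP protocol. The sketch below shows what a minimal NTRIP client looks like in Python; the caster address, mountpoint, and credentials are placeholders, and this is an illustration of the general technique, not Nomad's actual implementation.

```python
import base64
import socket

# Minimal NTRIP v1 client sketch: pull an RTCM correction stream over TCP.
# Caster host, mountpoint, and credentials below are placeholders, not Nomad's.
CASTER, PORT, MOUNT = "rtk.example.com", 2101, "MOUNTPOINT"
AUTH = base64.b64encode(b"user:password").decode()

request = (
    f"GET /{MOUNT} HTTP/1.0\r\n"
    "User-Agent: NTRIP ExampleClient/1.0\r\n"
    f"Authorization: Basic {AUTH}\r\n"
    "\r\n"
)

with socket.create_connection((CASTER, PORT), timeout=10) as sock:
    sock.sendall(request.encode())
    header = sock.recv(1024)       # caster replies "ICY 200 OK" on success
    if b"200 OK" in header:
        rtcm = sock.recv(4096)     # raw RTCM bytes, forwarded to the GNSS receiver
        print(f"received {len(rtcm)} bytes of correction data")
```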
Inertial Measurement Unit
An Inertial Measurement Unit, commonly known as an IMU, is an electronic device that measures and reports a body's acceleration (specific force), angular rate, and often its orientation, using a combination of accelerometers, gyroscopes, and often magnetometers. IMUs are the main component of the inertial navigation systems used in satellites, unmanned autonomous vehicles, and missiles. Computers process the data an IMU collects to track position through dead reckoning. Common applications for IMUs include control and stabilization, navigation and correction, measurement and testing, unmanned systems control, and mobile mapping.
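To make "dead reckoning" concrete, here is a minimal sketch, not Nomad's navigation code, of integrating forward acceleration and yaw rate into a 2D pose. Real inertial navigation also removes gravity, models sensor bias, and fuses GNSS or odometry to bound drift.

```python
import math

def dead_reckon(samples, dt):
    """Integrate 2D IMU samples (forward accel m/s^2, yaw rate rad/s) into a pose.

    Illustrative only: pure integration accumulates drift, which is why the
    IMU is paired with GNSS and other sensors in practice.
    """
    x = y = heading = speed = 0.0
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt             # gyro -> orientation
        speed += accel * dt                  # accelerometer -> velocity
        x += speed * math.cos(heading) * dt  # velocity -> position
        y += speed * math.sin(heading) * dt
    return x, y, heading

# e.g. 1 s of 0.5 m/s^2 forward acceleration while turning gently
print(dead_reckon([(0.5, 0.05)] * 100, dt=0.01))
```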
Nomad features two separate IMUs. The primary IMU is integrated within the navigation engine alongside the dual-antenna GNSS system mentioned previously; this system is co-located with the system processor and all other core sensors in the "Brain" enclosure. The secondary IMU is rigidly mounted to the LiDAR to ensure the LiDAR data remains accurate while the mower is in motion.
Visual Sensors
Nomad’s “eyes” are provided by eight high-resolution, high frame rate cameras and a 360° field of view LiDAR. These sensors offer a dense panoramic security blanket for object detection. The dynamic integration of LiDAR and imagery allows Nomad multiple ways to detect static and dynamic objects safely. This perception suite uses an industry-standard Machine Learning perception engine to reliably identify objects of interest (people, pets, trees, sidewalks, etc.) and remove any ambiguity about unknown objects.
Stereo Cameras
Stereo Vision turns 2D Obstacle Detection into 3D Obstacle Detection using simple geometry and one additional camera.
Nomad has a pair of cameras on each face of the mower. Each pair provides Nomad's stereo vision, delivering both object identification and depth-to-object information. The stereo cameras have a 110° horizontal field of view and a 70° vertical field of view, and can provide object depth detection to ~20 m. Obstacle detection models such as YOLO or RetinaNet, running on a framework like TensorFlow, provide 2D bounding boxes for each obstacle's position in an image.
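The "simple geometry" is the classic stereo relation: for a matched point, depth Z = f × B / d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity. The sketch below, using placeholder focal length and baseline rather than Nomad's calibration, shows how a detector's 2D bounding box can be lifted to a range estimate.

```python
import numpy as np

# Placeholder intrinsics/extrinsics; not Nomad's calibration values.
FOCAL_PX = 700.0      # focal length in pixels
BASELINE_M = 0.12     # distance between the left and right cameras

def depth_from_disparity(disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d."""
    return FOCAL_PX * BASELINE_M / disparity_px

def box_range(disparity_map, box):
    """Estimate the range to a detected object.

    `box` is a 2D bounding box (x0, y0, x1, y1) from a detector such as YOLO;
    the median disparity inside the box gives a robust depth estimate.
    """
    x0, y0, x1, y1 = box
    patch = disparity_map[y0:y1, x0:x1]
    valid = patch[patch > 0]              # ignore pixels with no stereo match
    return depth_from_disparity(np.median(valid)) if valid.size else None

disparity = np.full((480, 640), 8.0)      # synthetic map: 8 px disparity everywhere
print(box_range(disparity, (200, 150, 300, 300)))   # ~10.5 m
```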
Pictured: Stereo Camera Sensors
LiDAR - 360° Mechanical LiDAR
Since its inception in the 1970s, LiDAR has been used by scientists to map the earth's surface and acquire meteorological data. LiDAR, which stands for Light Detection and Ranging, is a remote-sensing technology that uses laser beams to gather information about surrounding objects; it is sometimes called "laser scanning" or "3D scanning." The technology uses eye-safe laser beams to create a 3D representation of the surveyed environment. Nomad's unit is a 32-channel, mechanically scanning LiDAR.
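Each return from a spinning, multi-channel LiDAR is a range measured at a known azimuth and channel elevation angle; converting those readings to Cartesian points is what builds the 3D representation. The sketch below uses hypothetical, evenly spaced elevation angles rather than any real 32-channel sensor's specification.

```python
import math

# Placeholder elevation angles for a multi-channel spinning LiDAR;
# a real 32-channel unit defines these in its datasheet.
ELEVATION_DEG = [i - 16.0 for i in range(32)]   # -16 deg .. +15 deg, hypothetical

def return_to_point(channel, azimuth_deg, range_m):
    """Convert one LiDAR return (channel, azimuth, range) to an x, y, z point."""
    elev = math.radians(ELEVATION_DEG[channel])
    azim = math.radians(azimuth_deg)
    xy = range_m * math.cos(elev)         # projection onto the horizontal plane
    return (xy * math.cos(azim),          # x: forward
            xy * math.sin(azim),          # y: left/right
            range_m * math.sin(elev))     # z: up/down

print(return_to_point(channel=16, azimuth_deg=45.0, range_m=10.0))
```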
Pictured: LiDAR Sensor
Ultrasonic Sensor System
Ultrasonic sensors mimic the echolocation used by bats, transmitting high-frequency sound waves to gauge the distance to objects at close range. Ultrasonic sensors can complement other vehicle sensors, including radar, cameras, and LiDAR, to give a full picture of a vehicle's immediate surroundings.
The ultrasonic sensors send out short ultrasonic pulses that are reflected by obstacles; the echo signals are then received and processed. Inside the plastic case of an ultrasonic sensor is its main component, the ultrasonic transducer: an aluminum pot whose diaphragm carries a ceramic element. The sensor receives a digital transmit signal from the ECU, which causes the diaphragm to oscillate and emit ultrasonic pulses. The diaphragm then relaxes and vibrates again when it receives the sound reflected from an obstacle. The ceramic element outputs these vibrations as analog signals, which are amplified and converted to a digital signal.
Multiple ultrasonic sensors are integrated into the Nomad platform to provide close-range distance data to walls, fences, and other nearby obstructions. These sensors work in the ~40 kHz to 45 kHz frequency band and deliver range data out to approximately 7 meters.
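The range calculation itself is simple time-of-flight arithmetic: distance is half the round-trip echo time multiplied by the speed of sound, roughly 343 m/s in air at 20 °C. A small sketch:

```python
SPEED_OF_SOUND_M_S = 343.0    # in air at ~20 degrees C; varies with temperature

def range_from_echo(round_trip_s):
    """Distance to an obstacle from the pulse-to-echo time (round trip)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo arriving 20 ms after the pulse corresponds to ~3.4 m
print(range_from_echo(0.020))
# A ~7 m maximum range implies a round trip of roughly 41 ms
print(2 * 7.0 / SPEED_OF_SOUND_M_S)
```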
Pictured: Ultrasonic Sensors
Resolvers
A resolver is an angular position sensor commonly used in harsh, rugged environments. It is an electromechanical device whose primary function is to convert mechanical motion (shaft rotation) into an electrical signal; unlike a digital encoder, it outputs analog signals. Nomad has resolvers integrated at the wheel motors for steering and traction to maintain precise rotation feedback in environments with heavy dust, dirt, mud, and organic particulate matter.
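A resolver's two output windings carry signals proportional to the sine and cosine of the shaft angle, so the angle can be recovered with an arctangent. A minimal sketch of that last step; a real resolver-to-digital converter also handles excitation, synchronous demodulation, and tracking-loop filtering.

```python
import math

def resolver_angle(sin_signal, cos_signal):
    """Recover the shaft angle (radians) from demodulated resolver sine/cosine outputs."""
    return math.atan2(sin_signal, cos_signal) % (2 * math.pi)

# Example: demodulated amplitudes for a shaft sitting at 30 degrees
print(math.degrees(resolver_angle(0.5, 0.866)))   # ~30.0
```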
Thank you for continuing to read our Introduction to Autonomous Mowing and walking through these sensors with us. Next week, we will share where and what tests we are working on with Nomad.