Autonomous robots are intelligent machines that can perceive their environment and navigate through it without manual control or intervention. Although autonomous robot technology is relatively new, it is already widely deployed in factories, warehouses, cities, and homes. For example, autonomous robots can transport goods around warehouses (as shown in Figure 1) or perform last-mile deliveries, while other types handle home vacuum cleaning or lawn mowing.
Figure 1: Robots transporting goods around the warehouse.
To achieve autonomy, robots need to perceive their surroundings and localize themselves on a map, dynamically detect and track nearby obstacles, plan a route to a designated destination, and control the platform to follow that route. In addition, robots must perform these tasks only under safe conditions to avoid risk to people, property, or the system itself. As interaction between humans and robots increases, robots not only need autonomy, mobility, and energy efficiency, but must also meet functional safety requirements. With the right sensors, processors, and control devices, designers can meet the strict requirements of functional safety standards such as IEC 61508.
Sensing considerations for autonomous robots
Various types of sensors can help address the challenges that come with autonomous robots. Two types are described in detail below:
Vision sensors. Vision sensors effectively approximate human vision and perception. Vision systems can handle challenges such as localization, obstacle detection, and collision avoidance, because they provide high-resolution spatial coverage and can detect and classify objects. Compared to sensors such as LiDAR, vision sensors are also more cost-effective, but they are computationally intensive.
Power-hungry central processing units (CPUs) and graphics processing units (GPUs) can pose challenges for power-constrained robot systems. When designing energy-efficient robot systems, CPU- or GPU-based processing should be minimized as much as possible. The system on chip (SoC) in an efficient vision system should process the vision signal chain at high speed, low power consumption, and low system cost. An SoC used for vision processing must be intelligent, safe, and energy-efficient. The TDA4 processor family is highly integrated and uses a heterogeneous architecture designed to deliver computer vision performance, deep learning processing, stereo vision, and video analytics at the lowest possible power consumption.
TI millimeter-wave radar. Using TI millimeter-wave radar in robot applications is relatively new, but the concept of using millimeter-wave sensing to achieve autonomy has been around for some time. In automotive applications, TI millimeter-wave radar is a key component of advanced driver assistance systems (ADAS), where it monitors the vehicle's surroundings. Similar ADAS concepts, such as surround-view monitoring or collision avoidance, can be applied to robotics.
From a sensing-technology perspective, TI millimeter-wave radar is unique in that it provides the distance, velocity, and angle of arrival of objects, which can better guide robot navigation and collision avoidance. Based on radar sensor data, a robot can decide whether to continue moving safely, slow down, or even stop, depending on the position, speed, and trajectory of an approaching person or object, as shown in Figure 2.
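The continue/slow/stop decision described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the function name, the time-to-collision rule, and the thresholds are illustrative assumptions, not part of any TI radar API.

```python
def decide_action(range_m: float, radial_velocity_mps: float) -> str:
    """Pick a motion command from a single radar detection.

    range_m:             distance to the closest detected object, in meters
    radial_velocity_mps: Doppler (radial) velocity; negative means approaching
    """
    # Object is stationary or moving away: keep going.
    if radial_velocity_mps >= 0:
        return "continue"
    # Time to collision, assuming constant radial velocity.
    ttc_s = range_m / -radial_velocity_mps
    if ttc_s < 2.0:   # imminent collision (illustrative threshold)
        return "stop"
    if ttc_s < 5.0:   # closing in: reduce speed
        return "slow"
    return "continue"

print(decide_action(10.0, -1.0))  # 10 s to collision -> continue
print(decide_action(3.0, -2.0))   # 1.5 s to collision -> stop
```

A real system would filter detections over time and fold in the object's full trajectory rather than a single range/velocity pair, but the gating structure stays the same.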
Figure 2: Warehouse robots use radar sensing.
Using sensor fusion and edge AI to solve complex autonomous robot challenges
For more complex applications, a single sensor of any type may not be sufficient to achieve autonomy. Ultimately, multiple sensors such as cameras and radars should be used in the same system to complement each other. Fusing data from different types of sensors in the processor helps tackle more complex challenges for autonomous robots.
Sensor fusion helps make robots more precise, while edge artificial intelligence (AI) makes them smarter. Integrating edge AI into a robot system helps the robot intelligently perceive, make decisions, and act. A robot with edge AI can detect objects and their positions, classify them, and take corresponding actions. For example, when a robot navigates a cluttered warehouse, edge AI can help it infer what kinds of objects (including people, boxes, machines, and even other robots) are in its path and decide how to navigate around them.
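To make the fusion idea concrete, here is a hypothetical sketch that associates a camera classification (from an edge AI model) with a radar range measurement and then plans an action. The data structures, association rule, and distance thresholds are all illustrative assumptions, not a TDA4 or TI software interface.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # e.g. "person", "box", "robot" from an edge AI model
    bearing_deg: float  # direction to the object in the robot frame

@dataclass
class RadarDetection:
    range_m: float
    bearing_deg: float

def fuse(cam: CameraDetection, radar: RadarDetection,
         max_bearing_gap_deg: float = 5.0):
    """Attach the camera label to the radar range when bearings agree."""
    if abs(cam.bearing_deg - radar.bearing_deg) > max_bearing_gap_deg:
        return None  # detections likely belong to different objects
    return {"label": cam.label, "range_m": radar.range_m}

def plan(obj) -> str:
    """People get a wide safety margin; static objects are simply avoided."""
    if obj is None:
        return "continue"
    if obj["label"] == "person" and obj["range_m"] < 5.0:
        return "stop"
    if obj["range_m"] < 2.0:
        return "steer_around"
    return "continue"

obj = fuse(CameraDetection("person", 10.0), RadarDetection(3.5, 11.0))
print(plan(obj))  # person within 5 m -> stop
```

The point of the sketch is the division of labor: the camera contributes the class label, the radar contributes accurate range and velocity, and the planner applies class-dependent safety rules.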
When designing robot systems with AI, there are design considerations on both the hardware and software sides. The TDA4 processor family includes hardware accelerators suited to edge AI functions, which assist in real-time processing of computationally intensive tasks. Access to an easy-to-use edge AI software development environment helps simplify and accelerate application development and hardware deployment. You can read the article “Simplified Guide for Embedded Edge AI Application Development” to learn more about TI’s free tools, software, and services aimed at helping with development.
Conclusion
Designing smarter, more autonomous robots is essential to continue raising levels of automation. Robots can be used in warehousing and distribution to keep pace with and drive the growth of e-commerce, and they can also handle daily household chores such as vacuuming and lawn mowing. Autonomous robots can improve productivity and efficiency, helping improve our lives and adding more value to them.