- By Lourenco Castro
- November 30, 2020
AI will become an integral part of setting up and using AMRs, simplifying their deployment process and improving their workflow; AMR users will be empowered to make more informed decisions, even without technical expertise, and robots will become the first line of support, predicting when intervention is needed or automating troubleshooting.
In today’s ever-competitive and costly business environment, autonomous mobile robots (AMRs) are increasingly being used to transport materials within warehouses and manufacturing facilities. Even hospitals, airports, and schools are using AMRs to disinfect public spaces or to deliver medicine and food.
A key feature of AMRs over conventional—and often inflexible—logistics solutions like forklifts, conveyor belts, and automated guided vehicles (AGVs) is their built-in intelligence. Rather than moving along a set of tracks or sensors built into the infrastructure, like an AGV requires, AMRs autonomously move along the factory floor track-free, bypassing obstacles in their path and even finding new paths on their own.
But how smart are they really?
With an AMR fleet management system installed, manufacturers have centralized control of the robots from a single station, and the most advanced systems can minimize bottlenecks and downtime with 24/7 mobile robot operation. Once an AMR is programmed, the fleet management system manages priorities and selects the most suitable robot for the operation at hand, based on position and availability. It also monitors robot battery levels, automatically manages recharging, and controls the robots’ traffic patterns by coordinating critical zones where multiple robots’ routes intersect.
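The dispatch step described above—choosing a robot by position and availability—can be sketched with a toy rule. This is a hypothetical illustration, not MiR's actual scheduling logic; the robot names, fields, and battery threshold are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    position: tuple      # (x, y) on the facility map, in meters (illustrative)
    battery_pct: float   # remaining charge, 0-100
    busy: bool

def pick_robot(robots, pickup, min_battery=20.0):
    """Select the closest idle robot with enough charge for the job.

    Hypothetical dispatch rule: filter out busy or low-battery robots,
    then rank the remainder by straight-line distance to the pickup point.
    """
    candidates = [r for r in robots if not r.busy and r.battery_pct >= min_battery]
    if not candidates:
        return None  # no robot available; the job stays queued
    return min(candidates,
               key=lambda r: ((r.position[0] - pickup[0]) ** 2 +
                              (r.position[1] - pickup[1]) ** 2) ** 0.5)

fleet = [
    Robot("AMR-1", (0, 0), 85.0, busy=True),    # occupied with another job
    Robot("AMR-2", (5, 5), 15.0, busy=False),   # too low on charge
    Robot("AMR-3", (20, 0), 60.0, busy=False),
]
print(pick_robot(fleet, pickup=(10, 0)).name)   # AMR-3
```

A real fleet manager would also weigh traffic zones and charging schedules, but the core idea—filter by availability, rank by cost—is the same.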
|Strategically placed static MiR AI Camera units enable the AMRs to foresee obstacles on their routes, so they can re-route in advance for optimized navigation.|
This definitely sounds smart, but some AMRs are taking those smarts to the next level with artificial intelligence (AI) coupled with strategically placed cameras that function as extended robot sensors. Without AI, AMRs react the same way to all obstacles, slowing and attempting to navigate around a person or object, if possible, or stopping or backing up if there is no safe way to maneuver around it. With AI, AMRs can learn to adapt their behavior appropriately, even before they enter an area. This means they can avoid high-traffic areas during specific times, including when materials are regularly delivered and transferred by fork truck, or when large crowds of workers are present during breaks or shift changes.
How do AI-powered AMRs work?
Today, mobile robots use sensors and software for control (to define where and how the robot should move) and perception (to allow the robot to understand and react to its surroundings). Data comes from integrated laser scanners, 3D cameras, accelerometers, gyroscopes, wheel encoders, and more to produce the most efficient decisions for each situation. The AMRs can dynamically navigate using the most efficient routes, have environmental awareness so they can avoid obstacles or people in their path, and can automatically charge when needed.
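The “most efficient routes” claim above boils down to cost-based path search over a map. As a minimal sketch (the waypoint graph and costs are invented, and real AMRs plan over far richer maps), Dijkstra’s algorithm shows how raising the cost of a congested corridor naturally produces a re-route:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a waypoint graph.

    graph: {node: [(neighbor, travel_cost), ...]}. Dynamic re-routing can be
    modeled by raising the cost of a blocked or congested edge.
    """
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, step in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + step, nbr, path + [nbr]))
    return float("inf"), []  # goal unreachable

# Toy facility: two corridors from dock A to workstation D.
facility = {
    "A": [("B", 1), ("C", 4)],
    "B": [("D", 5)],  # shorter first leg, but a slow, busy corridor
    "C": [("D", 1)],
}
print(shortest_route(facility, "A", "D"))  # (5, ['A', 'C', 'D'])
```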
|At this stage, the artificial intelligence used in autonomous mobile robots is focused on machine learning and vision systems.|
AI technology for AMRs is now focused primarily on machine learning (ML) and vision systems, which are dramatically extending earlier sensor-based capabilities. Advances in technologies—including smaller and more powerful sensors; cloud computing and broadband wireless communications; and new AI-focused processor architectures (see sidebar)—are more widely available at lower costs, making it easier than ever to pull data from a robot’s immediate, extended, and anticipated environment as well as its internal conditions.
Technology advances enabling AI/smarter robots
Small, low-cost, and power-efficient sensors allow mobile and remote devices to capture and transmit huge amounts of data about the robot’s immediate, extended, and anticipated environment as well as internal conditions.
Cloud computing and broadband wireless communications allow the data to be stored, processed, and accessed almost instantly, from any access point. Secure virtual networks can adapt to dynamic requirements and nearly eliminate downtime and bottlenecks.
Powerful new AI-focused processor architectures are widely available from both traditional semiconductor companies such as AMD, Intel, NVIDIA, and Qualcomm as well as new players in the field, including Google and Microsoft. While traditional broad-use semiconductors are facing the limits of Moore’s Law, these new chips are purpose-designed for AI calculations, which is driving up capabilities while driving down costs. Low-power, cost-effective AI processors can be incorporated into even small mobile or remote devices, allowing onsite computation for fast, efficient decisions.
Sophisticated software algorithms analyze and process data in the most productive locations—in the robot, in the cloud, or even in remote, extended sensors that provide additional intelligence data for the robot to anticipate needs and proactively adapt its behavior.
Using these capabilities, and AI-focused cameras, fleets of AMRs can learn while they are online, but then perform without constant access to online content. Low-power, AI-capable devices and efficient AI techniques support new robotic systems with low latency and fast reaction times, high autonomy, and low power consumption, key capabilities for success.
The new AI capabilities in AMRs help maintain the robots’ safety protocols and drive improved efficiency in path planning and environmental interaction. For example, advanced learning algorithms run in remote, connected cameras that can be mounted in high-traffic areas or in the paths of fork trucks and other automated vehicles. The cameras come with small, efficient embedded computers that process anonymized data and run analysis software to identify whether objects in the area are humans, fixed obstacles, or other types of mobile devices, such as AGVs. The cameras then feed this information to the robot, extending the robot’s understanding of its surroundings so it can adapt its behavior appropriately, even before it enters an area.

If the AMRs detect a person, for example, they can continue driving as usual; if they detect an AGV, they will park so the AGV can drive by. The robot can also predict blocked or highly trafficked areas in advance and reroute, instead of entering the blocked area and then re-routing.

While the robots’ built-in safety mechanisms will always stop the robot from colliding with an object, person, or vehicle in its path, other vehicles like forklifts may not have those capabilities, leaving the risk that one of them runs into the robot. Including a dedicated AI processor in an external, low-cost camera device means it’s now possible to extend existing MiR robots with AI capabilities without modifying their hardware.
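The decision logic described above—keep driving near people, park for AGVs, re-route away from blocked zones—can be summarized as a simple rule table. The label names and thresholds below are illustrative assumptions, not the actual MiR AI Camera interface:

```python
# Hypothetical mapping from an external camera's object classifications
# to the behaviors described in the text.
def adapt_behavior(detections):
    """detections: list of labels reported by a remote AI camera for the
    area ahead, e.g. ["human"], ["agv"], ["blocked"]. Labels are invented."""
    if "blocked" in detections or detections.count("human") >= 3:
        return "reroute"   # avoid the congested or blocked area entirely
    if "agv" in detections:
        return "park"      # yield: the AGV cannot maneuver around the AMR
    return "continue"      # people or clear path: built-in safety suffices

print(adapt_behavior(["human"]))         # continue
print(adapt_behavior(["agv", "human"]))  # park
print(adapt_behavior(["blocked"]))       # reroute
```

The point of the sketch is the ordering: hard blockers trigger re-routing before the robot ever enters the zone, while yielding to AGVs is a local, in-zone behavior.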
Because AI-powered AMRs can detect high-traffic areas before they arrive and can identify other vehicles, they both improve their own behavior and adapt to other vehicles’ limitations, decreasing the risk of collision.
The Future of AI-powered AMRs
So what comes next? As AI advances, so will AMRs, making them increasingly smarter and more empowering.
Deploying AMRs in a new environment can be a lengthy and delicate process in which specific map zones need to be carefully designed to extract the most value from a fleet of robots. In the future, robots should be able to do most of that heavy lifting themselves by recognizing floor markings, busy intersections, narrow passages, and other distinct conditions. While mobile robots will remain controllable tools with emergency stop buttons, knowing which side of the aisle to keep to, or where to adjust their maximum speed, will become a normal part of their operation and won’t require constant human intervention.
These exciting new features wouldn’t be so impressive, however, if the complexity of their setup surpassed their added benefits. As a result, the process of extending workflows with AI needs to be simple and intuitive, based on concepts such as example-based training and rule-based setup. Setting up intelligent AMRs shouldn’t be harder than showing AI algorithms what objects they should detect, or rotating a dial or two to adjust a robot’s behavior.
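The “rotating a dial or two” idea above amounts to a small, declarative rule table per map zone. As a hypothetical sketch (zone names, field names, and values are all invented for illustration):

```python
# Illustrative per-zone "dials": a maximum speed and a preferred side
# of the aisle. In a real product these would come from the setup UI.
zone_rules = {
    "narrow_passage":    {"max_speed_mps": 0.5, "keep_side": "right"},
    "busy_intersection": {"max_speed_mps": 0.3, "keep_side": "right"},
    "open_floor":        {"max_speed_mps": 1.5, "keep_side": None},
}

def settings_for(zone):
    # Unmapped zones fall back to conservative defaults.
    return zone_rules.get(zone, {"max_speed_mps": 0.5, "keep_side": "right"})

print(settings_for("open_floor")["max_speed_mps"])  # 1.5
print(settings_for("unmapped_area")["keep_side"])   # right
```

The appeal of this style of setup is that adding a zone or tightening a speed limit is a one-line change rather than a programming task.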
Lastly, robots are constantly generating a gold mine of data that can be used to monitor their uptime and network connectivity, among other things, but also serve as the starting point for an efficient technical support intervention. Users managing AMRs will be greatly empowered with personalized information and predictions about necessary actions. This means getting recommendations to improve specific robots’ deployments or anticipating when a component needs to be replaced ahead of time.
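Anticipating a component replacement, as described above, can be as simple as projecting wear against a rated service life. The linear model and numbers below are purely illustrative; a real system would learn failure patterns from fleet data:

```python
def predict_replacement(usage_hours, rated_life_hours, hours_per_day):
    """Naive wear model: days until a component reaches its rated life.

    Assumes linear wear at a constant daily duty cycle -- an illustrative
    stand-in for data-driven predictive maintenance.
    """
    remaining = rated_life_hours - usage_hours
    if remaining <= 0:
        return 0  # overdue: replace now
    return remaining / hours_per_day

# A drive wheel rated for 5,000 hours, with 4,400 hours logged at 16 h/day.
days_left = predict_replacement(4400, 5000, 16)
print(f"Order a spare in ~{days_left:.0f} days")  # Order a spare in ~38 days
```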
In summary, AI will become an integral part of setting up and using AMRs: deployment will become simpler, users will make more informed decisions even without technical expertise, and the robots themselves will become the first line of support, predicting when intervention is needed or automating troubleshooting. That’s how AI will bring AMRs into the future.