How cloud technology can help improve collaboration between robots and employees
In recent years, the domestic robotics industry has grown rapidly. According to data released by the Ministry of Industry and Information Technology at a press conference on the theme of "vigorously developing the high-end equipment manufacturing industry," the robotics industry's operating revenue exceeded 130 billion yuan in 2021, and industrial robot applications now cover 60 major categories and 168 subcategories of the national economy. Today, robots work side by side with people in factories, warehouses, retail stores, hospitals, and many other environments. The continued development of the robotics industry will change how people produce and live, and further enable vigorous economic and social development.
As robots make their way into our workplaces, we need to make it easier for employees to trust them and for robots to take the right cues from people. This is especially true of autonomous mobile robots (AMRs), which are exactly what their name implies: autonomous. No one stands nearby or in the background controlling them with a remote control or a traditional human-machine interface (HMI). AMRs are programmed to do a job and then sent out to do it, much as people go through a standard onboarding process and are then trusted to work on their own. Since both AMRs and employees need to be free to do their work without much supervision, we must ensure they know how to interact with each other and teach them appropriate social behaviors.
A new perspective on robots
Teaching robots how to behave like people may make some people nervous. But it's necessary, because the two need to interact to keep workflows running smoothly in fast-paced, high-stakes supply chain operations. If you're going to give your customers what they need, when they need it, all parties have to work together. We had to show our employees that AMRs know when to get involved and when to keep their distance, which meant teaching the robots what they can and cannot do.
But before we can do that, we have to make sure AMRs can see, understand, and react to what's going on around them. They can't be "enclosed" in their own world, like many current robots, perceiving only "myself," "obstacles," or "free space." Workers, in turn, must be able to see their surroundings from different perspectives. Their world can't consist only of "myself," "automated equipment," "fixed infrastructure," "other workers," and "many unknowns." They must be able to understand the depth and breadth of their environment, especially the extent of what AMRs are capable of in terms of intelligence and functionality.
If we don't change how people and robots perceive and react to each other, employees may feel intimidated, hesitant, and skeptical when they first encounter AMRs. Their preconceived notions may lead them to believe that a robot is too aggressive, too close, or ignoring them. They may over-humanize the robot and make assumptions that work against everyone. This can lead to underutilization of AMRs and a longer timeline for companies to see a return on their investment (ROI), if there is a return at all. Employees and customers will also miss out on the benefits that AMRs could bring.
According to a double-blind Global Warehousing Vision Study recently commissioned by Zebra Technologies, 83% of warehouse employees working with AMRs today said that autonomous robots help improve their productivity and reduce travel time, which is a win-win for the company and its front-line teams. More importantly, three-quarters of the employees surveyed said that AMRs help reduce errors, which is beneficial to both the company and its customers, and nearly two-thirds (65%) of the employees surveyed believe that AMRs provide career development opportunities and help retain employees.
Therefore, we must eliminate biases that come from the "me" mentality or from preconceived notions. One effective way to do this is to "think" in the cloud.
New technologies are needed to build trust
As I have always understood it, the advancement of technology in robotic automation has been driven by three factors: repeatability, scalability, and higher throughput. This is why many robotic arms, automated guided vehicles (AGVs), and static robots are designed to complete tasks along conveyor lines or on driveways within closed work cells. This is also why most robots are programmed to complete tasks using predefined movements, and their behavior is controlled by humans. Most robots don't need to "figure out why something is happening," they just do what people tell them to do.
AMRs are different. While they collaborate and interact with people, they do not rely on people to guide their every move, telling them when to stop, start, or change direction. They are able to make decisions and act correctly on their own, without human intervention. Customer scenarios, simulations, and the cloud are used to understand current AMR behavior and the changes needed to reach the ideal; navigation behaviors based on heuristics and biases are then developed and encoded into the robots' navigation and planning code. These heuristics help make the social behavior of AMRs closer to that of humans. Once these behaviors are encoded into navigation and planning, employees gain a better understanding of how a robot will behave as it drives around the facility, which builds mutual trust and more collaborative work while also improving the robot's performance.
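The idea of a heuristic or bias encoded into planning can be sketched concretely. The snippet below is a minimal, hypothetical illustration (not any vendor's actual planner): the cost of traversing a map cell is inflated near detected people, so a cost-minimizing planner naturally keeps a comfortable distance. The radius and weight values are invented for the example.

```python
import math

# Hypothetical "social" planning heuristic: inflate the cost of map cells
# near detected people so a cost-minimizing planner gives them a wide berth.
PERSONAL_SPACE_M = 1.2   # assumed comfort radius around a person (meters)
SOCIAL_WEIGHT = 5.0      # assumed penalty scale; tuned from simulation/field data

def social_cost(cell, people, base_cost=1.0):
    """Traversal cost of a grid cell (x, y), inflated near people."""
    penalty = 0.0
    for px, py in people:
        d = math.hypot(cell[0] - px, cell[1] - py)
        if d < PERSONAL_SPACE_M:
            # Linear falloff: largest right next to a person, zero at the edge
            penalty += SOCIAL_WEIGHT * (1.0 - d / PERSONAL_SPACE_M)
    return base_cost + penalty

people = [(2.0, 2.0)]
print(social_cost((2.0, 2.5), people))  # close to a person: inflated cost
print(social_cost((8.0, 8.0), people))  # far away: base cost only
```

Because employees see the robot consistently slow down and detour around them, the encoded bias reads as predictable, human-like courtesy rather than erratic avoidance.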
AMRs managed through the cloud make it easy to record data and help staff understand how each robot is performing in the facility. Using speed and path consistency during low- and high-frequency interactions as a baseline, teams can see how changes in the navigation code improve performance and build dynamic charts of each robot's performance across different facilities.
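As a rough sketch of what such a cloud-side baseline might look like, the snippet below aggregates hypothetical speed telemetry per robot into a mean and a variability figure. The record format and robot IDs are invented for illustration; a real fleet platform would ingest far richer data.

```python
from statistics import mean, pstdev

# Hypothetical telemetry records uploaded by each AMR: (robot_id, speed in m/s)
telemetry = [
    ("amr-01", 1.1), ("amr-01", 1.2), ("amr-01", 0.4),  # slowdown near people?
    ("amr-02", 1.3), ("amr-02", 1.3), ("amr-02", 1.2),
]

def speed_baseline(samples):
    """Per-robot mean speed and variability, usable as a consistency baseline."""
    by_robot = {}
    for robot, speed in samples:
        by_robot.setdefault(robot, []).append(speed)
    return {r: (round(mean(v), 2), round(pstdev(v), 2))
            for r, v in by_robot.items()}

print(speed_baseline(telemetry))
```

Comparing these baselines before and after a navigation-code change is one simple way to quantify whether the change actually made the robot's behavior smoother and more consistent.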
Cloud technology makes robot training more efficient
Traditionally, when teaching robots to work, we tell them what needs to be done, give them operating parameters, have them execute the actions, and then adjust their functions as needed. Now, thanks to machine learning, convolutional neural networks, and other cloud-based technologies, we are giving AMRs the ability to adapt to their surroundings. They can detect and classify different semantic objects, such as people, forklifts, and pallets, and make the right decision based on their encoded behavior and current sensory input. Rather than relying solely on pre-scripted instructions, they act on what is actually happening around them.
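The mapping from a perception model's semantic classes to encoded behaviors can be sketched as below. This is a hypothetical example: the class names, behavior labels, and confidence threshold are all invented, and a real AMR stack would combine many more signals.

```python
# Hypothetical mapping from a perception model's semantic class to an
# encoded navigation behavior. Labels and behaviors are illustrative only.
BEHAVIORS = {
    "person":   "slow_down_and_yield",
    "forklift": "stop_and_wait",     # larger, faster vehicle gets priority
    "pallet":   "replan_around",     # static obstacle: route around it
}

def decide(detections, min_confidence=0.6):
    """Pick the most cautious behavior among sufficiently confident detections."""
    priority = ["stop_and_wait", "slow_down_and_yield", "replan_around"]
    actions = {BEHAVIORS[label] for label, conf in detections
               if conf >= min_confidence and label in BEHAVIORS}
    for action in priority:          # most cautious applicable behavior wins
        if action in actions:
            return action
    return "proceed"

print(decide([("person", 0.9), ("pallet", 0.8)]))  # person takes precedence
print(decide([("pallet", 0.3)]))                   # low confidence: proceed
```

The key point is that the decision is driven by classified, live sensory input rather than a fixed script, which is what lets the robot's reaction match what an employee would intuitively expect.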
In other words, cloud technology enables us to encode AMRs with the required social behaviors, which helps employees feel comfortable working with them. At the same time, it becomes easier to teach people the social behaviors required when working with robots. People will be able to see how the AMR successfully navigates around or away from employees who are busy or shouldn’t be there. They will also see how the AMR safely enters their workspace and provides support when they need assistance.
If robots can gain employees’ trust and have the right social behaviors, people’s behavior toward these robots will change. As people’s confidence in the robots’ behavior increases, the hesitation to adopt AMRs will gradually disappear. People will begin to realize and appreciate how AMRs can help them, and their adoption rate will increase. Businesses will also be able to increase their adoption of robotic automation.
So the next time someone tells you that the cloud doesn’t do much for robotic automation, remind them that without the cloud, AMRs wouldn’t be able to work autonomously or collaboratively as they do today. The cloud is what’s driving advances in robotics, and it’s doing a lot more to teach smart robots how to be social and to teach people that AMRs are friendly.