Technology based on artificial intelligence in industrial robots
Kensuke Harada from the Graduate School of Engineering Science, Osaka University, presents and analyzes the move towards artificial intelligence (AI)-based technology in industrial robots.
One of the areas where robots are most expected to replace human labor is the manufacturing of products in factories. However, as often depicted in films such as “Modern Times”, the fact that making products in factories is hard on humans remains a serious problem even today, despite advances in science and technology.
This is especially true in the assembly and inspection processes of high-mix, low-volume production, where frequent changeovers make robotization difficult.
Many robotics researchers have long tackled the problem of replacing human tasks with robots, yet robots have taken over only a small portion of them. This fact seems strange, since humans perform such tasks with their hands without difficulty in daily life. The only tasks humans find hard are those requiring strength or fine dexterity, while the rest would seem easy for robots as well. In Penfield’s homunculus diagram, which maps the relationship between the motor and somatosensory cortices and the parts of the body, the five fingers and the palm occupy one-third of the motor cortex and one-fourth of the sensory cortex; for this reason, the hand and fingers are sometimes called the “second brain”.
This suggests that the functions of the hand and fingers that we perform without thinking are actually performed based on the huge amount of information stored in the brain. In other words, if robots are to replace human tasks, the key to research is how to effectively realize the information stored in the brain by robots. Studying the functions of robot hands and fingers is a profound problem directly related to the study of human intelligence.
This article presents our research on machine learning and motion planning, intended for application to industrial robots. This research is expected to enable next-generation manufacturing paradigms, such as variable-volume manufacturing and robot as a service (RaaS).
Acquisition of human behavior at work
One strategy for robotizing manufacturing processes is to acquire human behavior at work. (1) Analyzing human work behavior is essential for automating tasks with industrial robots. In our lab, we segment human work behavior along the time axis and identify the type of work performed in each segment.
For this problem, we propose a method that uses information about the object the human grasps. Figure 1 shows a human assembling a children’s toy. For example, if the objects grasped by the human are a bolt, a wheel, and a nut, we construct a hidden Markov model for each of them and identify the action based on the grasped object. By narrowing the candidate actions in this way, we reduce the variety of actions to recognize and increase the recognition success rate.
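As an illustration, restricting the candidate hidden Markov models to those associated with the grasped object can be sketched as follows. This is a minimal sketch, not the paper’s actual implementation: the model parameters, action labels, and the `forward_log_likelihood` helper are all hypothetical.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm; pi: initial probs, A: transitions, B: emissions)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return math.log(sum(alpha))

def recognize_action(obs, grasped_object, models):
    """Score only the action HMMs associated with the grasped object
    (e.g. bolt -> screwing actions) and return the most likely one."""
    candidates = models[grasped_object]   # {action_name: (pi, A, B)}
    return max(candidates,
               key=lambda a: forward_log_likelihood(obs, *candidates[a]))
```

Because only the small per-object candidate set is scored rather than every action model, confusable actions on different objects never compete with each other, which is what raises the recognition success rate.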
Planning tasks and movements
To automate the assembly processes of high-mix, low-volume production using industrial robots, the key to success is how easily robot movements can be generated. To achieve this goal, research has been conducted on motion planning to automatically generate the movements of industrial robots for assembly and other tasks.
This section presents a motion planning system for robotic tool manipulation based on human demonstration. (2) First, a human operator picks up a tool and performs a task, and the sequence of the tool’s positions and postures is recorded as motion planning targets. We then divide the work into subtasks based on the recorded target values. In addition, the posture of the gripper grasping the tool at its gripping plane is calculated. Based on these results, the robot plans the movements needed to pass the tool from one hand to the other, or to place the tool on the table and grasp it again.
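One simple way to divide a recorded tool trajectory into subtasks is to split it at pauses, i.e. runs of near-zero displacement between consecutive poses. The heuristic below is an illustrative assumption, not the segmentation method of the cited work; the thresholds are hypothetical.

```python
import math

def segment_subtasks(positions, pause_thresh=1e-3, min_pause=3):
    """Split a recorded sequence of tool positions into subtasks,
    cutting wherever the tool stays still for min_pause steps."""
    segments, start, still = [], 0, 0
    for k in range(1, len(positions)):
        if math.dist(positions[k], positions[k - 1]) < pause_thresh:
            still += 1
            if still == min_pause:              # pause detected: close segment
                segments.append((start, k - min_pause + 1))
                start = k
        else:
            still = 0
    segments.append((start, len(positions)))     # final segment
    return segments                              # list of (begin, end) indices
```

Each returned index pair delimits one subtask, which can then be handed to the motion planner separately.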
Robot motion planning takes the output sequence of postures and plans detailed movements between them. In Figure 2, a human operator manipulates a screw-tightening tool and thereby teaches the robot its motion. Based on the results of this instruction, the robot plans and executes the tightening operation using the tool.
Next, we research how to automatically analyze an assembly manual and convert it into robot movements, rather than teaching a task through human demonstration. (3) When a person buys furniture from IKEA or Nitori, they assemble it by referring to the assembly manual. Figure 3 shows the assembly work plan based on Nitori’s chair assembly manual (4) and the actual assembly work of the robot. Here, the robot recognizes which parts appear in which frames of the assembly manual’s diagrams, estimates the work operations needed to assemble the parts, and builds a graph structure called an assembly task sequence graph based on those data. Once this graph is built, the robot can automatically plan and execute assembly tasks based on it.
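Once precedence constraints between assembly steps are captured in a graph, planning an executable order of steps reduces to a topological sort. The following is a minimal sketch of that idea; the graph representation and the chair-assembly step names are hypothetical, not taken from the cited system.

```python
from collections import deque

def plan_order(graph):
    """Topologically order assembly steps from a task-sequence graph,
    where an edge u -> v means step u must precede step v."""
    indeg = {n: 0 for n in graph}
    for n in graph:
        for m in graph[n]:
            indeg[m] += 1
    queue = deque(n for n in graph if indeg[n] == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in graph[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    if len(order) != len(graph):
        raise ValueError("cyclic task graph")
    return order
```

Any ordering the sort emits respects every precedence edge, so the robot is free to choose among valid orders, e.g. to minimize regrasping.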
We also studied the planning of cooking actions based on cooking recipes instead of assembly manuals, and the execution of cooking actions based on these plans. (5) Unlike an assembly manual, a cooking recipe is essentially a set of verbal instructions. Another feature is that cooking involves more preparatory actions, such as taking kitchen utensils from the shelf, than assembly does. Here, a graph structure similar to the assembly task sequence graph described earlier is constructed by analyzing the cooking recipe, and work planning is performed based on this structure.
Figure 4 shows the result of robot task planning based on the instruction “heat a frying pan with oil over high heat and fry the pork”. In this case, the robot must perform preparatory tasks such as picking up the frying pan and placing the pork in the pan. The robot plans these preparatory tasks automatically by first building the graph elements from the recipe and then determining which elements are missing. Finally, the robot can perform the tasks described in the cooking recipe.
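The idea of inserting preparatory actions for unsatisfied preconditions can be sketched as follows. This is a simplified, hypothetical illustration: the precondition facts, action names, and data structures are assumptions, not the cited system’s representation.

```python
def complete_plan(recipe_steps, prep_actions, world_state):
    """For each recipe step, insert a preparatory action for every
    precondition not yet satisfied by the current world state."""
    plan = []
    state = set(world_state)
    for step, needs in recipe_steps:
        for fact in needs:
            if fact not in state:
                plan.append(prep_actions[fact])  # e.g. "pick up frying pan"
                state.add(fact)                  # preparation establishes it
        plan.append(step)
    return plan
```

Running it on the frying-pan instruction inserts the pick-and-place preparations before the heating and frying steps, mirroring the behavior described above.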
In this article, as examples of AI technology in industrial robots, we have focused on learning-based grasping, human work action recognition, and work action planning. Future challenges include further handling of environment and object uncertainty, and building a system that can respond flexibly to changes in procedures.
(1) K. Fukuda, N. Yamanobe, I. G. Ramirez-Alpizar, K. Harada, “Assembly Motion Recognition Framework Using Only Images”, Proceedings of the IEEE/SICE International Symposium on System Integration, pp. 1242-1247, 2020.
(2) ISTA press release, 2019.
(3) I. Sera, N. Yamanobe, I. G. Ramirez-Alpizar, Z. Wang, W. Wan, K. Harada, “Assembly Planning by Recognizing a Graphical Instruction Manual”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3115-3122, 2021.
(4) Official Nitori online store, task chair. https://www.nitorinet.jp/ec/cat/Chair/WorkChair/1/
(5) K. Takada, N. Yamanobe, I. G. Ramirez-Alpizar, T. Kiyokawa, K. Koyama, W. Wan, K. Harada, “Task Planning for Robots Based on Verbal Instructions and Food Images”, Proceedings of the 22nd System Integration Division Annual Conference (SI2021), 2021.
*Please note: This is a commercial profile
© 2019. This work is licensed under CC-BY-NC-ND.