Whether a traditional industrial robot system or the most advanced collaborative robot (cobot), all robots rely on sensors that generate large volumes of highly variable data. These data help build better machine learning (ML) and artificial intelligence (AI) models, and robots rely on those models to become "autonomous," making real-time decisions and navigating in dynamic real-world environments.
Industrial robots are usually confined to "caged" environments; for safety, the robot stops moving if a human enters that environment. But restricting human/robot collaboration forgoes many potential benefits. Robots capable of autonomous operation can support the safe and efficient coexistence of humans and robots.
Sensing and intelligent perception are critical in robot applications because the performance of robotic systems, especially ML/AI systems, depends heavily on the sensors that supply their key data. Today, a wide range of increasingly sophisticated and accurate sensors, combined with systems that can fuse all of this sensor data, enables robots to have ever better perception and awareness.
Robotic automation has long been a revolutionary technology in manufacturing, and the integration of AI into robots will clearly bring major changes to robotics in the next few years. This article discusses some key development trends in robotics and automation today, the most important technologies that link AI to the data it needs to achieve intelligence, and how different sensors are used and integrated in AI systems.
ML consists of two main phases, training and inference, which can be executed on entirely different processing platforms. Training is usually conducted offline on a desktop or in the cloud and involves feeding large datasets into a neural network; real-time performance is not a concern at this stage. The result of the training phase is a trained AI system that, once deployed, can perform specific tasks such as inspecting for bottlenecks on an assembly line, counting and tracking people in a room, or determining whether a banknote is counterfeit.
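The split between the two phases can be sketched with a deliberately tiny toy model (all names and the dataset here are hypothetical, not from the article): training runs offline and produces a set of frozen parameters, and the deployed system performs inference using only those parameters, with no training code present.

```python
import math

def train(samples, labels, lr=0.1, epochs=200):
    """Offline phase: fit a tiny logistic-regression model by gradient descent."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))          # sigmoid activation
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return {"weights": w, "bias": b}                # the deployable artifact

def infer(model, x):
    """Deployment phase: uses only the trained parameters."""
    z = sum(wi * xi for wi, xi in zip(model["weights"], x)) + model["bias"]
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy task: flag a region as "occupied" (label 1) when both sensor
# features are high. Real systems would use far larger datasets.
data = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.8)]
labels = [0, 0, 1, 1]
model = train(data, labels)
print(infer(model, (0.85, 0.9)))   # high readings -> 1
print(infer(model, (0.10, 0.15)))  # low readings  -> 0
```

The dictionary returned by `train` stands in for the trained network that is shipped to the target; only `infer` needs to run on the deployed platform.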
However, for AI to fulfill its promise across many industries, sensor-data fusion must be completed in real time or near real time during inference (running the trained ML algorithm). Designers therefore need to implement ML and deep-learning models at the edge, deploying the inference function on embedded systems.
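One common step when moving a trained model's inference onto a constrained embedded target is post-training quantization. The sketch below (with hypothetical weight values) shows the basic idea: 32-bit float weights are mapped to 8-bit integers plus a single scale factor, shrinking the model roughly 4x and enabling integer arithmetic on the edge device.

```python
def quantize_int8(weights):
    """Map float weights to int8 using a symmetric per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0   # largest weight -> +/-127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [qi * scale for qi in q]

w = [0.42, -1.27, 0.08, 0.95]          # hypothetical trained float weights
q, s = quantize_int8(w)
approx = dequantize(q, s)
# Every int8 value fits the 8-bit range, and each reconstructed weight
# lies within half a quantization step of the original.
assert all(-128 <= qi <= 127 for qi in q)
assert all(abs(a - b) <= s / 2 + 1e-9 for a, b in zip(w, approx))
```

Real inference engines apply the same principle per layer or per channel, often with calibration data to choose the scales; this sketch only illustrates the float-to-int8 mapping itself.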
Consider, for example, a collaborative robot set up in a workplace (as shown in Figure 1) to work closely with people. It needs data from near-field sensors and vision sensors to ensure that it avoids injuring humans and can support them in activities they find difficult. All of this data requires real-time processing, and the cloud cannot deliver the real-time, low-latency response that collaborative robots require. To overcome this bottleneck, today's advanced AI systems have been pushed into the edge domain; that is, the robot's intelligence resides in the edge device itself.
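The kind of on-device fusion logic described above can be sketched as follows. All thresholds and sensor names here are illustrative assumptions, not values from the article; the point is that the proximity and vision readings are combined locally, so the stop/slow decision never waits on a cloud round trip.

```python
# Hypothetical safety thresholds for a cobot work cell.
STOP_DISTANCE_M = 0.5   # anything inside this radius: halt immediately
SLOW_DISTANCE_M = 1.5   # human detected by vision within this radius: slow down

def safety_decision(proximity_m, vision_sees_human):
    """Fuse near-field and vision sensor data into a motion command."""
    if proximity_m < STOP_DISTANCE_M:
        return "STOP"                  # near-field reading overrides everything
    if vision_sees_human and proximity_m < SLOW_DISTANCE_M:
        return "SLOW"                  # human nearby but outside the stop zone
    return "RUN"                       # workspace clear: full speed

print(safety_decision(0.3, True))   # inside stop radius    -> STOP
print(safety_decision(1.0, True))   # human in slow zone    -> SLOW
print(safety_decision(2.0, True))   # human far away        -> RUN
```

Because the function runs on the robot's own processor, its latency is bounded by the sensor sampling rate and local compute, not by network conditions.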
This distributed AI model relies on highly integrated processors that offer a rich set of peripherals for interfacing with different sensors, high-performance processing to run machine vision algorithms, and acceleration for deep-learning inference. Moreover, all of these functions must operate efficiently, at relatively low power and in a relatively small footprint, so they can be carried on the edge device.
As ML grows more widespread, power- and size-optimized "inference engines" are becoming increasingly available. These engines are hardware products designed specifically to perform ML inference. An integrated system on chip (SoC) is usually a good choice in the embedded space because, in addition to packing in the various processing elements capable of running deep-learning inference, an SoC integrates many of the components needed to make an embedded application complete.