Both traditional industrial robot systems and the most advanced collaborative robots (cobots) rely on sensors that generate large volumes of highly variable data. These data help build better machine learning (ML) and artificial intelligence (AI) models, which the robots use to make real-time decisions and navigate dynamic environments.

Industrial robots are usually confined to a “caged” environment; for safety reasons, the robot stops moving if a human enters that environment. But this restriction forfeits many of the potential benefits of human/robot collaboration. A robot with autonomous capabilities can support the safe and efficient coexistence of humans and robots.

Sensing and intelligent perception are critical in robot applications because the performance of robotic systems, especially ML/AI systems, depends heavily on the performance of the sensors that feed them critical data. Today’s large and ever more capable array of increasingly accurate sensors, combined with systems that can fuse all of this sensor data together, is giving robots better and better perception and awareness.

The development of AI

Robot automation has long been a revolutionary technology in manufacturing. It is clear that integrating AI into robots will drive major changes in robotics over the next few years. This article discusses some of the key trends in robotics and automation, and the most important technologies that tie AI and sensor data together to achieve intelligence. It also discusses how different sensors are used and integrated in AI systems.

Pushing robots’ AI processing to the edge

ML consists of two main parts, training and inference, which can be executed on completely different processing platforms. Training is usually done offline on a desktop or in the cloud and involves feeding large datasets into a neural network; real-time performance and power are not concerns at this stage. The result of the training phase is a trained AI system that, when deployed, can perform a specific task, such as investigating bottlenecks on an assembly line, counting and tracking people in a room, or determining whether a bill is counterfeit.

However, for AI to fulfill its promise in many industries, sensor data fusion must happen in real time or near real time during inference (running the ML algorithm after training). To achieve this, designers need to implement ML and deep learning models at the edge, deploying the inference function on embedded systems.
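
To illustrate this split, the sketch below trains a tiny classifier offline and then runs only the frozen model inside an inference function, the part that would live on the embedded processor. This is a minimal illustrative sketch in plain Python: the training loop, file name, and feature layout are assumptions for illustration, not any production toolchain.

    import numpy as np

    # --- Training phase (offline, desktop or cloud): fit a tiny linear classifier ---
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))              # 200 labeled sensor feature vectors
    y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

    w = np.zeros(3)
    for _ in range(100):                       # simple perceptron-style updates
        pred = (X @ w > 0).astype(float)
        w += 0.01 * (X.T @ (y - pred))

    np.save("model_weights.npy", w)            # "export" the trained model

    # --- Inference phase (edge/embedded): load frozen weights, classify in real time ---
    w_deployed = np.load("model_weights.npy")

    def infer(features):
        """Run the trained model on one sensor sample; no training code needed here."""
        return float(features @ w_deployed > 0.0)

    print(infer(np.array([0.4, -1.2, 0.3])))   # e.g. 1.0 -> flag the item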

Consider, for example, a collaborative robot set up to work in close proximity to people in a workplace (as shown in Figure 1). It needs data from near-field sensors and vision sensors to make sure it avoids injuring humans while helping them with activities they find difficult. All of this data must be processed in real time, yet the cloud cannot provide the real-time, low-latency response that cobots require. To overcome this bottleneck, today’s advanced AI systems are being pushed to the edge; that is, the robots run their AI in edge devices, as the control loop sketched after Figure 1 illustrates.

Figure 1: humans interact with collaborative robots in a factory environment.
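
To make the edge-processing loop concrete, here is a minimal sketch in Python of how a cobot controller might fuse a near-field (ToF) reading with a vision result into a single speed command, entirely on the local device. The sensor functions are simulated stand-ins, not a real driver or vision API, and the thresholds are illustrative assumptions:

    import random
    import time

    SAFE_DISTANCE_M = 0.5    # a human closer than this must stop the arm
    SLOW_DISTANCE_M = 1.5    # a human closer than this slows the arm

    def read_tof_distance_m():
        """Stand-in for a time-of-flight sensor driver (returns meters)."""
        return random.uniform(0.2, 3.0)

    def person_detected():
        """Stand-in for an on-device vision model's person detection."""
        return random.random() < 0.3

    def fuse_and_decide(distance_m, person):
        """Combine near-field and vision data into one speed command (0..1)."""
        if person and distance_m < SAFE_DISTANCE_M:
            return 0.0                       # stop: human inside the safety zone
        if person and distance_m < SLOW_DISTANCE_M:
            return 0.25                      # slow down: human nearby
        return 1.0                           # full speed: workspace clear

    for _ in range(5):                       # a few ticks of a fast local loop
        speed = fuse_and_decide(read_tof_distance_m(), person_detected())
        print("arm speed command:", speed)
        time.sleep(0.01)                     # ~100 Hz; no cloud round trip

Because every decision is made locally, the loop’s latency is bounded by the sensor reads and the processor, not by a network hop.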

This distributed AI model relies on a highly integrated processor, which has:

  • A rich set of peripherals for interfacing with different sensor types
  • High-performance processing capability to run machine vision algorithms
  • A means of accelerating deep learning inference

In addition, all of these functions must operate efficiently, at relatively low power and in a relatively small footprint, so that they can be housed at the edge.

As ML has grown in popularity, power- and size-optimized “inference engines,” designed specifically for ML inference, have become increasingly available.

An integrated system-on-chip (SoC) is usually a good choice in the embedded space because, in addition to the various processing elements capable of running deep learning inference, an SoC integrates many of the components necessary to complete an embedded application.

Let’s look at some of the most popular trends in robot development today.

Collaborative robots

Traditional industrial robots operate behind guarding, and people generally cannot get near them while they run. In contrast, collaborative robots are designed to interact safely with people at runtime, moving slowly and gracefully.

As defined in the ISO technical specification ISO/TS 15066, a collaborative robot is a robot that can be used in a collaborative operation. Collaborative operation means that robot and human work concurrently within a defined workspace to carry out a production operation (this excludes robot-plus-robot systems, and co-located humans and robots that operate at different times). Defining and deploying a collaborative robot requires anticipating potential contact between the robot’s physical parts (including functional extensions, such as a laser tool) and the operator. Crucially, this relies on sensors to determine the exact position and speed of the operator.

Collaborative robot manufacturers must implement a high level of environmental sensing and redundancy in the robot system in order to quickly detect and prevent possible collisions. Integrated sensors connected to a control unit sense an imminent collision between the robot arm and a human or other object, and the control unit shuts the robot down immediately. If any sensor or its electronic circuitry fails, the robot also shuts down.
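
A minimal sketch of that redundant shutdown logic might look as follows; the two readings model redundant proximity channels, None models a failed sensor or broken electronics, and the thresholds are illustrative assumptions:

    STOP_DISTANCE_M = 0.3        # closer than this counts as an imminent collision
    CHANNEL_TOLERANCE_M = 0.05   # redundant channels must agree within this margin

    def safety_check(dist_a, dist_b):
        """Return True to keep running, False to shut the robot down."""
        if dist_a is None or dist_b is None:
            return False                     # sensor/electronics failure -> shut down
        if abs(dist_a - dist_b) > CHANNEL_TOLERANCE_M:
            return False                     # channels disagree -> distrust both
        if min(dist_a, dist_b) < STOP_DISTANCE_M:
            return False                     # human or object too close -> stop
        return True

    # Simulated readings: nominal, too close, and one failed channel.
    for reading in [(1.20, 1.22), (0.25, 0.27), (0.80, None)]:
        print(reading, "-> run" if safety_check(*reading) else "-> SHUT DOWN")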

Logistics robots

A logistics robot is a mobile device that operates in environments, with or without people, such as warehouses, distribution centers, ports, or campuses. Logistics robots pick up goods and carry them to packing stations, or transport goods from one building on a company site to another; some can also pick and pack goods. These robots usually move within a specific environment and need sensors to localize, map, and avoid collisions (especially with people).

Until recently, most logistics robots followed predefined routes; now they can adjust their navigation based on the locations of other robots, people, and goods. Ultrasonic, infrared, and lidar sensing are all applicable technologies. Because the robots are mobile, the control unit in the robot usually communicates wirelessly with a central remote controller. Logistics robots already employ advanced technologies, including ML logic, human-machine collaboration, and environmental analysis.
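
As an illustrative sketch of such reactive navigation (a generic repulsive-vector scheme, not any vendor’s actual navigation stack), the robot can bias its heading away from nearby obstacles reported by its sensors; the positions below are hard-coded stand-ins for lidar or ultrasonic detections:

    import math

    def adjust_heading(goal_xy, robot_xy, obstacles_xy, repulse_radius=2.0):
        """Steer toward the goal, pushed away from obstacles within repulse_radius."""
        # Attractive unit vector toward the goal.
        gx, gy = goal_xy[0] - robot_xy[0], goal_xy[1] - robot_xy[1]
        norm = math.hypot(gx, gy) or 1.0
        vx, vy = gx / norm, gy / norm

        # Repulsive contribution from each nearby obstacle (person, robot, pallet).
        for ox, oy in obstacles_xy:
            dx, dy = robot_xy[0] - ox, robot_xy[1] - oy
            d = math.hypot(dx, dy)
            if 0.0 < d < repulse_radius:
                push = (repulse_radius - d) / repulse_radius   # stronger when closer
                vx += push * dx / d
                vy += push * dy / d

        return math.degrees(math.atan2(vy, vx))   # commanded heading in degrees

    # Robot at the origin heading to (10, 0); a person stands slightly in the way,
    # so the commanded heading bends a few degrees away from them.
    print(adjust_heading((10.0, 0.0), (0.0, 0.0), [(1.5, 0.5)]))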

Rising labor costs and strict government regulations have driven the wide adoption of logistics robots. Their popularity has also grown as the cost of components such as computing hardware and sensors has fallen, and as the cost (and time) of integration continues to decline.

Last-mile delivery robots

In moving products from warehouse shelves to the customer’s doorstep, “last mile” delivery is the final step of the logistics process: the moment the goods finally arrive at the buyer’s door. It is not only critical to customer satisfaction; it is also costly and time-consuming.

Last-mile delivery accounts for a large portion of total shipping cost, so making it more efficient has become a focal point for developing and deploying new robotics technologies that drive process improvement and efficiency.

Sensor technologies for AI in robots

As robotics has advanced, complementary sensor technologies have developed alongside it. Much like the five human senses, different sensing technologies deliver the best results in combination when a robot system is deployed in a changing, uncontrolled environment. Even the simplest tasks a robot performs can depend on 3D machine vision feeding data into AI: without machine vision reconstructing a 3D image and AI converting that visual information into a successful robot action, it is impossible to grasp an object whose position and motion are not predetermined.

The most popular and relevant sensor technologies used to support AI in robots today include:

  • Time-of-flight (ToF) optical sensors: these sensors use a photodiode (a single sensor element or an array) together with active illumination to measure distance. The delay between the transmitted light wave and its reflection off an obstacle is measured, and that delay represents the distance; this data helps create a 3D map of objects (see the worked example after this list).
  • Temperature and humidity sensors: many robots need to measure the temperature, and sometimes the humidity, of their environment and of their own components, including motors and the main AI board, to ensure that they operate within a safe range.
  • Ultrasonic sensors: vision sensors struggle when a robot cannot see in a very bright environment or cannot orient itself in a dark one. By transmitting an ultrasonic pulse and listening for the echo reflected off an object (much as bats do), ultrasonic sensors work well in dark or bright environments, overcoming this limitation of optical sensors.
  • Vibration sensors: industrial vibration sensors are at the core of the condition monitoring required for preventive maintenance. Integrated electronics piezoelectric (IEPE) sensors are the most commonly used vibration sensors in industrial environments.
  • Millimeter-wave (mmWave) sensors: mmWave sensors transmit radio waves and use their echoes to determine the direction and distance of moving objects by measuring three factors: velocity, angle, and range. This helps the robot take precautions proportional to how quickly an object approaches it. Radar sensors perform well in the dark and can sense through materials such as drywall, plastic, and glass.
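
As a worked example of the ranging principles above, here is a minimal sketch using only the underlying physics: range follows from the round-trip delay of the echo (ToF), and radial velocity follows from the radar Doppler shift (mmWave). The nanosecond delay, Doppler shift, and 77 GHz carrier are illustrative values, not a specific part’s specification:

    C = 299_792_458.0            # speed of light in m/s

    def tof_range_m(round_trip_s):
        """ToF ranging: the echo travels out and back, so halve the path."""
        return C * round_trip_s / 2.0

    def doppler_velocity_ms(doppler_shift_hz, carrier_hz=77e9):
        """Radial velocity from a radar Doppler shift (77 GHz is a common mmWave band)."""
        wavelength = C / carrier_hz
        return wavelength * doppler_shift_hz / 2.0

    print(tof_range_m(6.67e-9))          # ~1 m: a 6.67 ns round trip
    print(doppler_velocity_ms(1000.0))   # ~1.95 m/s radial speed at a 1 kHz shift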

Although humans still perform most tasks on the factory floor, robots will adapt to working alongside humans and raise the degree of automation. To achieve this, they will need more AI capability to recognize and adapt to situations in real time, and that is only possible when the AI runs at the edge.

By Matthieu Chevrier, Systems and Applications Manager, Global Industrial Systems, Texas Instruments
