The hardware revolution has pushed artificial intelligence into the mainstream. It has greatly reduced the training time and cost of AI systems, even as, at the frontier, it turns AI research into an arms race that few can join. In recent years, as computers have outperformed humans on ever more complex tasks, intelligent algorithms have been seen as the main source of breakthroughs in artificial intelligence. Today, however, another force may be doing more to drive AI forward.
Advances in specialized chips and other hardware have improved the capabilities of the most advanced artificial intelligence systems and pushed the technology into the mainstream. Whether this will produce tangible commercial benefits is another matter. The AI Index, a project run by a research group at Stanford University, makes the importance of the AI hardware revolution clear. The latest AI Index report attempts to summarize progress in artificial intelligence, and it captures a shift in where the greatest progress has come from over the past 18 months.
In many respects, algorithms have not made great leaps in recent years. Part of the reason is that on some tasks the technology's gains have leveled off: in image recognition, for example, computers have improved little since surpassing humans. It also reflects the fact that the remaining problems are becoming harder, so progress is becoming slower. Language, widely regarded as the next frontier of machine intelligence, is proving particularly difficult to conquer. Although speech recognition and translation have largely been solved, understanding and reasoning remain areas dominated by human beings.
By contrast, the most remarkable progress has come from hardware: for example, chips designed specifically to process the large volumes of data that machine learning requires, and purpose-built systems the industry has developed for this work. OpenAI, an American research organization, identified a hardware inflection point in 2012. Before then, computing in artificial intelligence tracked Moore's law, the chip industry's rule of thumb that processing power doubles roughly every two years.
Since then, artificial intelligence systems have far outpaced Moore's law: with new hardware and more resources devoted to the problem, the computing power used by the most advanced AI systems has doubled every 3.4 months. There is a paradox in this hardware acceleration. On the one hand, at the scientific frontier, it has turned artificial intelligence into an arms race that few can join.
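To see how far a 3.4-month doubling period departs from Moore's law, a minimal sketch of the arithmetic helps. The 3.4-month figure comes from the OpenAI analysis cited above; the two-year comparison window is an illustrative assumption, chosen because Moore's law doubles exactly once over it.

```python
def growth_factor(months: float, doubling_period_months: float) -> float:
    """Total multiplicative growth over `months` given a fixed doubling period."""
    return 2 ** (months / doubling_period_months)

# Moore's law: one doubling over 24 months.
moore = growth_factor(24, 24)

# Post-2012 AI training compute: a doubling every 3.4 months.
ai_compute = growth_factor(24, 3.4)

print(f"Moore's law over 2 years: x{moore:.0f}")   # 2x
print(f"AI compute over 2 years:  x{ai_compute:.0f}")  # roughly 133x
```

Over the same two years, compute for frontier AI systems grows by a factor of more than a hundred rather than merely doubling, which is why only the deepest pockets can keep pace.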
Only big companies and governments that command huge computing resources can take part in this competition. OpenAI's operating philosophy has long been that the AI researchers with the biggest computers will inherit the earth; the organization recently received a $1 billion investment from Microsoft to stay in the race. The other effect of the hardware revolution, however, is to push the technology into the mainstream. Google's TPU, one of the most advanced machine-learning chips in the world, can be rented by the hour through the company's cloud-computing platform (if your workload is not time-sensitive and you don't mind waiting in line, it costs just $1.35 per hour).
Silicon Valley is prone to overselling the "popularization" of new technologies, but in artificial intelligence the claim holds up. With cloud services such as Amazon Web Services (AWS) making low-cost hardware and machine-learning tools widely available, training neural networks, the most compute-intensive part of artificial intelligence, has suddenly become broadly accessible.
Stanford University's DAWNBench project provides a way to benchmark artificial intelligence systems. According to its data, in less than two years the time required to train a system on the widely used ImageNet dataset fell from three hours to 88 seconds, cutting the cost from roughly $2,323 to $12.
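The scale of those improvements is easiest to appreciate as simple ratios. A minimal sketch, using only the figures quoted above (three hours taken as exactly 10,800 seconds):

```python
# DAWNBench ImageNet-training figures quoted in the text, as ratios.
time_before_s = 3 * 3600   # ~3 hours, in seconds
time_after_s = 88
cost_before, cost_after = 2323.0, 12.0

speedup = time_before_s / time_after_s      # how many times faster
cost_reduction = cost_before / cost_after   # how many times cheaper

print(f"training speedup: {speedup:.0f}x")       # ~123x
print(f"cost reduction:   {cost_reduction:.0f}x")  # ~194x
```

Both ratios land near two orders of magnitude, which is why a workload that once required a research budget now costs about as much as a lunch.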
Whether these huge reductions in training time and cost will make advanced artificial intelligence a practical technology is another matter. The broader impact of machine learning is difficult to measure, but the AI Index points to one promising indicator: in October this year, about 1.32% of job postings in the United States were related to artificial intelligence, up from 0.26% in 2010. The figure is still very small, and the definition of an "AI job" is debatable, but the general direction is clear.
Erik Brynjolfsson, an MIT professor who studies the impact of new technology on the economy, warns that companies hiring data scientists and machine-learning experts will not see immediate returns: they first need to overcome internal bottlenecks by developing new workflows that make the most of the technology.