Intel's Song Jiqiang: Unremitting scientific research will drive AI to the 3.0 era

Publisher: EEWorld News | Last updated: 2020-06-19 | Source: EEWORLD | Keywords: AI

In recent days, the COVID-19 outbreak has flared up again in Beijing, turning a broadly positive domestic anti-epidemic situation severe once more. Production and daily life, which had only just restarted, have been disrupted again, and we may have to live with normalized epidemic prevention for a long time to come. This week I switched back to "cloud office, cloud life": remote work and video conferencing have become habits, children's schoolwork has moved to online education platforms, and consumption and entertainment now mean watching live streams and listening to cloud concerts. In the special circumstances of the epidemic, a series of new AI application needs has been fully activated, and AI has truly penetrated every aspect of our daily lives.

 

As a technology practitioner, I know that the explosion of these new AI applications did not happen overnight. The shift from quantitative change to qualitative change is the result of long-term investment in scientific research and digital infrastructure. Aiming at long-term goals and values, and persisting in long-term research, are the only certainties we have when facing uncertain events such as this epidemic. That requires not only forward-looking vision and planning, but also the determination and will not to be swayed by short-term interests.

 

From the perspective of AI's development, humanity's exploration of AI has lasted more than 70 years, and looking back we can clearly identify several key inflection points. The first wave of AI drew inferences from rules formulated by humans. Although it performed well at reasoning, it was limited to a few strictly defined problems, had no learning ability, and could not handle uncertainty. What really moved AI forward was the second wave, triggered by deep learning. The massive data generated by the Internet and mobile Internet gave machines material to learn from, mine, and experiment on, allowing systems to discover "rules" on their own and to make predictions, judgments, and decisions. The three "trump cards" of growing data, increasing compute, and evolving deep-learning algorithms have enabled some typical deep learning applications to match or even exceed human capability. This has convinced more and more optimists that deep learning is an extremely valuable direction, worthy of large-scale industry investment.

 

However, is deep learning the ultimate answer for AI? As research into deep learning deepens, we find problems that still need to be solved. First, energy consumption is the biggest challenge. One research report estimates that training a single large AI model on a cluster of server-class CPUs and GPUs consumes enough electricity to produce carbon emissions equivalent to those of five American cars over their entire life cycles. Imagine the damage to the environment if every industry continued to use such a computing model. Second, data volume is another major challenge. Current deep learning relies too heavily on big data; in scenarios where data is scarce, its usefulness is very limited. AI should learn from small data the way the human brain does. How can we sharply reduce the energy consumption, training time, and data required while preserving the AI model's capability? This is an important direction for AI's continued progress, and for now it seems the approach of accelerating deep learning training with large-scale GPU parallelism cannot meet this condition.

 

A truly intelligent system should be an environment-adaptive "natural intelligence". First, it can handle not only deterministic problems but also uncertain ones. Second, it must not only act but also be explainable. Third, it should not depend entirely on big data; even a small amount of data should enable efficient continuous learning. Fourth, it should be highly reliable and conform to human-defined ethics. This is our outlook for the next stage of AI technology: the AI 3.0 era.

 

At present, we stand at the turning point from AI 2.0 to AI 3.0. So what is most likely to become the "sharp blade" that cuts a path to AI's future? From today's vantage point, neuromorphic computing, a cutting-edge computing model, is the most likely to open a new track from AI 2.0 to AI 3.0. Neuromorphic computing breaks with traditional semiconductor processes and chip architectures: by mimicking the structure of neurons in the human brain and the mechanisms by which they interconnect, it can learn continuously and autonomously under conditions of low power consumption and small amounts of training data, greatly improving energy efficiency. These characteristics clearly match the development needs of AI 3.0, so neuromorphic computing is expected to play an important role as humanity enters the next generation of AI.
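As a rough illustration of the event-driven computation behind neuromorphic chips (this is a textbook toy model, not Loihi's actual programming interface), the sketch below simulates a single leaky integrate-and-fire neuron in plain Python: the membrane potential integrates input and leaks over time, and the neuron does "work" (emits a spike) only when a threshold is crossed, which is where the energy savings of spiking hardware come from.

```python
def lif_simulate(input_current, v_thresh=1.0, leak=0.9, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.

    input_current: sequence of input values, one per time step.
    Returns a list of 0/1 spike events, one per time step.
    """
    v = 0.0            # membrane potential
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of input
        if v >= v_thresh:
            spikes.append(1)      # fire a spike...
            v = v_reset           # ...and reset the potential
        else:
            spikes.append(0)      # otherwise stay silent
    return spikes

# A constant drive of 0.3 charges the membrane until it fires periodically.
print(lif_simulate([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because output is sparse (mostly zeros), downstream neurons only need to react to the occasional spike rather than to a dense stream of activations; hardware like Loihi exploits exactly this sparsity.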

 

Intel is a company that takes the long view and promotes innovation in underlying technologies to help customers succeed in commercial applications. To that end, we continue to increase research into cutting-edge fields even when they cannot yield practical results in the short term. In neuromorphic computing, we began actively exploring this new computing model very early and have achieved remarkable results: Intel's neuromorphic chip Loihi has already demonstrated the ability to smell, and the neuromorphic system Pohoiki Springs has reached a capacity of 100 million neurons, comparable to the brain of a small mammal.

 


 

Of course, neuromorphic computing is still at a very early stage, and there is a long way to go before this technology is truly applied to AI. But I believe that innovation in underlying technologies requires long-termism: focus on one direction and one track for the long haul, and use that certainty to counter all the uncertainties along the way. Only then can we ultimately succeed.
