As AI chip giants compete for supremacy, Huawei's AI chip will debut next week
Source: content from Wall Street News, with thanks.
Today, Huawei Senior Vice President Richard Yu posted a video on Weibo promoting the company's own artificial intelligence (AI) chip. He said, "The pursuit of speed is never limited to imagination," and announced that the AI chip will be unveiled at IFA 2017 on September 2.
At Huawei's mid-year performance media conference last month, Richard Yu revealed that the AI chip will be released this fall and that Huawei will be the first manufacturer to put an artificial intelligence processor in a smartphone. In addition, at the 2017 China Internet Conference, Richard Yu said that chips manufactured by Huawei HiSilicon will integrate CPU, GPU, and AI functions, and may be based on the new AI chip design that ARM launched at Computex this year.
According to Richard Yu's video today, Huawei's AI processor is expected to significantly improve the Kirin 970's data processing speed. If the AI chip makes it into the Huawei Mate 10, due to be released in October, the phone's data processing capability should be very exciting.
Like Huawei, global technology giants such as Intel, Lenovo, Nvidia, Google, and Microsoft are all actively embracing AI, and positioning in AI chips has become a top priority.
Intel
Regarding the importance of AI chips, Song Jiqiang, director of the Intel China Research Institute, pointed out this month in an interview with the media outlet New Intelligence that technology must be used to process large volumes of data to make it valuable to customers, and that in this process chips are undoubtedly critical:
By 2020, a conservative estimate puts the number of connected devices worldwide at 50 billion. Data in the future will come from all kinds of device endpoints; it will no longer depend on us making phone calls, using our phones, or sending emails. Driverless vehicles, smart homes, cameras, and so on are all generating data.
In the future, every driverless car will be a server, and each car will generate more than 4,000 GB of data every day. It is impossible to transmit all of this data over 5G, so much of it must be processed and analyzed locally and only selectively sent upstream. Locally, this will require many technologies that surpass today's server technologies.
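As a rough back-of-the-envelope check, using only the 4,000 GB per day figure quoted above:

4,000 GB / 86,400 seconds ≈ 46 MB/s, or roughly 370 Mbps of sustained uplink per vehicle

which illustrates why processing data locally and sending only a selection upstream is the more practical approach.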
As a traditional leading chip manufacturer, Intel launched a new generation of Xeon server chips in July this year, with greatly improved performance and deep learning capability 2.2 times that of the previous generation of servers, able to handle both training and inference tasks. Intel also demonstrated field-programmable gate array (FPGA) technology, which it expects to play a major role in AI, and plans to launch the Lake Crest processor, designed specifically for deep learning.
Lenovo
Lenovo Group President Yang Yuanqing said, "AI general-purpose processor chips are the strategic commanding heights of the artificial intelligence era." Lenovo Group Senior Vice President and Lenovo Capital Group President He Zhiqiang also pointed out:
In the era of smart Internet, AI chips are the engine of artificial intelligence and will play a decisive role in the development of smart Internet.
Just last week, Lenovo Capital, Alibaba Ventures, and other top investors jointly invested in Cambricon Technologies, known as "the world's first unicorn in the AI chip industry."
Nvidia
Nvidia has shifted its business focus to AI and deep learning in the past few years. In May this year, Nvidia released a heavyweight processor for artificial intelligence applications: Tesla V100.
The chip has 21 billion transistors, far more than the 15-billion-transistor Pascal processor Nvidia released a year earlier. Although the die is only about the size of an Apple Watch face, it packs 5,120 CUDA (Compute Unified Device Architecture) cores and delivers 7.5 teraflops of double-precision floating-point performance. Nvidia CEO Jensen Huang said that Nvidia spent $3 billion to build this chip and that the price will be $149,000.
Google
Google, which has announced a strategic shift to "AI first", released the TPU (Tensor Processing Unit), custom-built for machine learning, last year. Compared with CPUs and GPUs, the TPU is 15-30 times faster and 30-80 times more energy-efficient.
At Google's developer conference in May this year, Google released a new product, the Cloud TPU, which has four processing chips and delivers 180 teraflops of compute. Sixty-four Cloud TPUs can be connected to form what Google calls a Pod, a supercomputer with a computing power of 11.5 petaflops (1 petaflop is 10^15 floating-point operations per second) - this will be a very important basic tool for research in the field of AI.
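As a quick arithmetic check, using only the figures quoted above:

64 × 180 teraflops = 11,520 teraflops ≈ 11.5 petaflops

which is consistent with Google's stated Pod figure.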
Currently, TPUs have been deployed across almost all Google products, including Google Search and Google Assistant, and even played a key role in the Go matches between AlphaGo and Lee Sedol.
Microsoft
Last month, media reported that Microsoft will add a self-designed AI coprocessor to the next generation of HoloLens, allowing the device to analyze what users see and hear locally, with no need to spend time transmitting data to the cloud for processing. The AI chip is currently under development and will be incorporated into the holographic processing unit (HPU) of the next-generation HoloLens. Microsoft said this AI coprocessor will be the first chip it has designed for mobile devices.
In recent years, Microsoft has been committed to developing its own AI chips: it developed a motion-tracking processor for the Xbox Kinect gaming system, and, to compete with Google and Amazon in cloud services, it has deployed customized field-programmable gate arrays (FPGAs). Microsoft has purchased programmable chips from Altera, an Intel subsidiary, and written custom software to meet its needs.
Last year, Microsoft demonstrated at a conference that, using thousands of these AI chips, it could translate all of English Wikipedia, about 5 million articles, into Spanish in less than 0.1 seconds. Next, Microsoft hopes to let customers of its cloud services use AI chips to complete tasks such as identifying images in massive datasets or predicting consumer purchasing patterns with machine learning algorithms.