Intel Launches New AI Hardware from Cloud to Edge to Accelerate AI Development

Publisher: EEWorld News | Last updated: 2019-11-13 | Source: EEWORLD | Keywords: Intel

At the 2019 Intel AI Summit, Intel demonstrated a series of new products designed to accelerate the development and deployment of artificial intelligence systems from the cloud to the edge, ushering in the next wave of AI. Intel showed its Intel® Nervana™ Neural Network Processors (NNP): the NNP-T1000 for training and the NNP-I1000 for inference. As Intel's first purpose-built ASICs for complex deep learning aimed at cloud and data center customers, the Intel Nervana NNPs offer very high scalability and efficiency. Intel also unveiled the next-generation Intel® Movidius™ Myriad™ Vision Processing Unit (VPU) for edge media, computer vision, and inference applications.

 

“As AI advances, both computing hardware and memory are reaching a tipping point,” said Naveen Rao, vice president and general manager of Intel’s Artificial Intelligence Products Group. “Specialized hardware, such as the Intel Nervana NNP and Movidius Myriad VPU, is essential if we are to continue to make great progress in this area. With more advanced system-level AI, we will move from the ‘data to information’ phase to the ‘information to knowledge’ phase.”

 

With the launch of these products, Intel's AI solutions portfolio is further strengthened and is expected to generate more than $3.5 billion in revenue in 2019. The breadth and depth of Intel's AI portfolio are unmatched in the industry, enabling customers to develop and deploy AI models at any scale across every device, from the cloud to the edge.

 

The new Intel Nervana Neural Network Processors, now in production and being delivered to customers, are part of a system-level AI solution. The solution provides a complete software stack built from open components and deep learning framework integrations to fully exploit the hardware. The Intel Nervana Neural Network Processor for Training (Intel Nervana NNP-T) strikes a balance between compute, communication, and memory, allowing near-linear, energy-efficient scaling from small clusters up to the largest pod supercomputers. The Intel Nervana Neural Network Processor for Inference (Intel Nervana NNP-I) is power- and cost-efficient, and its flexible form factor makes it well suited to running intensive multi-modal inference at real-world scale. Both products were developed for the AI processing needs of leading AI customers such as Baidu and Facebook.

 

Misha Smelyanskiy, director of artificial intelligence system co-design at Facebook, said: "We are very excited to work with Intel to deploy faster and more efficient inference computing using the Intel Neural Network Processor for Inference (NNP-I). At the same time, our latest deep learning compiler Glow will also support NNP-I."

 

In addition, the next-generation Intel Movidius VPU is scheduled to be available in the first half of 2020. Its efficient architecture is expected to deliver industry-leading performance: more than 10x the inference performance of the previous-generation VPU and up to 6x the energy efficiency of competing processors. Intel also announced the new Intel® DevCloud for the Edge, which, together with the Intel® Distribution of OpenVINO™ toolkit, addresses a major developer pain point: it lets developers try, prototype, and test AI solutions on a variety of Intel processors before purchasing hardware.
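To illustrate the workflow this DevCloud and OpenVINO combination targets, the sketch below loads a model and runs a single inference using the pre-2022 OpenVINO Inference Engine Python API; the model files, input shape, and device choice are placeholders for whatever a developer would actually test.

    # Minimal OpenVINO Inference Engine sketch (classic pre-2022 Python API).
    # "model.xml"/"model.bin", the input shape, and the device are placeholders.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    input_name = next(iter(net.input_info))

    # Any supported Intel target works here: "CPU", "GPU", or "MYRIAD" for a Movidius VPU.
    exec_net = ie.load_network(network=net, device_name="CPU")

    # Dummy input shaped like a typical 1x3x224x224 image tensor.
    dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
    result = exec_net.infer(inputs={input_name: dummy})
    print({name: out.shape for name, out in result.items()})

Because the same script runs unchanged against different device_name targets, a developer can compare CPU, integrated GPU, and VPU options on DevCloud for the Edge before committing to hardware.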

 

Advancing deep learning inference and its applications involves extremely complex data, models, and techniques, so architecture choices require different considerations. In fact, most organizations in the industry have deployed artificial intelligence on Intel® Xeon® Scalable processors. Intel will continue to improve this platform through features such as Intel® Deep Learning Boost (DL Boost) and its Vector Neural Network Instructions (VNNI), raising AI inference performance in both data center and edge deployments. For many years to come, Intel Xeon Scalable processors will remain a powerful cornerstone of AI computing.
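As a quick, illustrative check (the helper below is a hypothetical example, not an Intel tool), a Linux system reports whether its processor exposes the AVX-512 VNNI instructions that DL Boost uses for accelerated INT8 inference via the flags in /proc/cpuinfo:

    # Hypothetical helper: detect the AVX-512 VNNI flag that Intel DL Boost
    # relies on for accelerated INT8 inference (Linux only).
    def has_avx512_vnni(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
        with open(cpuinfo_path) as f:
            return "avx512_vnni" in f.read()

    if __name__ == "__main__":
        print("AVX-512 VNNI available:", has_avx512_vnni())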

 

Intel customers with the most demanding deep learning training workloads need performance to double roughly every 3.5 months, and breakthroughs of that kind require a full portfolio of AI solutions such as Intel's. Intel can optimize across compute, memory, storage, interconnect, packaging, and software to maximize efficiency and programmability, and to ensure deep learning can scale to thousands of nodes, expanding the scale of the knowledge revolution.
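For context on that growth rate, a doubling every 3.5 months compounds to roughly a tenfold increase per year; the short calculation below simply makes the arithmetic explicit.

    # Illustrative arithmetic: a doubling every 3.5 months compounds to ~10x per year.
    doubling_period_months = 3.5
    doublings_per_year = 12 / doubling_period_months   # about 3.4 doublings per year
    growth_per_year = 2 ** doublings_per_year           # about 10.8x per year
    print(f"{doublings_per_year:.2f} doublings/year -> {growth_per_year:.1f}x per year")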
