Intel's Sachin Katti reveals how edge platforms can enhance AI capabilities
Sachin Katti, Senior Vice President and General Manager, Network and Edge Group, Intel Corporation
Over the past year, we have begun to realize the tremendous power of AI and the innovation it inspires. Interest in AI remains high, and many of these innovations will profoundly change how the technology industry, and the world as a whole, develops.
In the future, the lifeblood of AI will be an open ecosystem, one that gives developers a range of choices and lets applications move across domains and vendors. This means developing platforms and solutions that transform the world's physical infrastructure into seamlessly connected, ubiquitous software.
In the past, AI was concentrated in data centers, but market research firm Gartner predicts: "By 2025, more than 50% of data managed by enterprises will be created and processed outside of data centers or clouds." Today, enterprises are pursuing AI-based automation of their operations, which accelerates this trend. With data exploding at the edge, and the data generated by phones, PCs, and retail stores continuously improving intelligence, AI can be everywhere. As one of the largest computing workloads, AI is focused on decision-making, and on that basis it drives efficiency and tangible benefits in retail, manufacturing, hospitality, and other industries.
At Intel, we are seeing a surge in demand for edge computing, driven by the evolution of the economy and the physical world and by a coming wave of autonomous, context-aware, and collaborative software, all delivered while dramatically reducing power consumption and total cost of ownership. Businesses want automation not only to stay price competitive or mitigate talent shortages, but also to increase innovation, improve efficiency, and shorten time to market. While sending data to the cloud for processing can produce good results, it is costly and time-consuming in physical terms.
From an economic perspective, generating and processing data on your own local equipment is more cost-effective and efficient than renting and maintaining cloud servers and transferring data to them. In addition, from a data security and compliance standpoint, processing data locally at the edge not only satisfies applicable laws but also helps protect the privacy of the data being generated.
Even with all these challenges, many of the partners I speak with are eager to deploy AI as quickly as possible to reap its benefits. Some early adopters have already digitized their operations and seen returns by layering edge AI onto everyday environments such as restaurants, factories, and point-of-sale terminals. By eliminating the latency and bandwidth costs of cloud processing, edge AI automates decision-making and helps address issues such as talent shortages and privacy regulations. However, the inherent complexity of the edge should not be underestimated: flexibility can be constrained by insufficient scale, limited compute, and tight power budgets, and security and heterogeneity must also be considered. Integrating new technologies into these environments is therefore complex, and bringing AI to the edge is no exception.
The transformation of telecommunications networks has helped the mobile industry cope with this complexity, setting the stage for explosive growth of AI at the edge. As managers of the last mile to every enterprise, communications service providers have a huge opportunity to use AI to help enterprises optimize and operate networks more efficiently through capabilities such as network slicing. In addition, AI-based RAN intelligent controllers and predictive maintenance can enable new edge AI offerings for enterprises in many vertical industries and allow providers to profit from them.
Understandably, most customers prefer to add AI to existing infrastructure rather than build it from scratch. Integrating new technology into existing infrastructure is difficult, however, a challenge the industry widely recognizes as unavoidable. As with any emerging technology, expectations for AI's rapid development run high, but making decisions before the technology fully matures inevitably carries risk. In the absence of unified industry-wide standards and protocols, companies that invested early in AI deployment may need to re-evaluate and adjust those decisions later.
Analysts expect edge AI to develop in three stages. First come the highly customized use cases that are widely adopted today. Second, over time these will gradually give way to solutions built for specific industries, which bring their own challenges in interoperability and energy efficiency and are complicated to operate across different systems and software. Eventually, a foundational cross-industry platform will emerge to solve the challenges shared by all industries. Rather than accept years of interoperability problems and constrained development, we should actively pursue and develop innovative solutions to the core challenges posed by the intelligent edge.
We firmly believe that by taking an open, modular, unified, platform-centric approach, communications service providers, developers, infrastructure operators, and enterprises will be able to develop, deploy, operate, and manage scalable edge solutions more easily and efficiently. This will pave the way for securely deploying and automatically managing heterogeneous fleets of edge devices across geographies, fleets that will keep expanding and adapting as business needs change. Just as open standards and software-defined networking were vital to the evolution of cloud computing, we can draw on the same principles to accelerate the deployment of edge AI solutions. By integrating software, hardware, and platform solutions developed specifically for edge AI, we can build a digital ecosystem that delivers AI capabilities wherever they are needed.
Notes:
Gartner report, “Hyperscale Cloud Service Providers Extend to the Digital Edge,” by Thomas Bittman, July 24, 2023.
GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used under license. All rights reserved.
Intel does not control or audit third-party data. Please consult other sources of information when evaluating the accuracy of the data.