The successful application of artificial intelligence (AI) in business continues to exceed expectations. From advances in edge AI and computer vision, to the modernization of data centers and purpose-built AI chips, to the use of AI to design chips themselves, the wave of AI innovation has surged over the past year alone. These milestone applications have also created many new opportunities for the AI industry.
The revolutionary development of AI has also driven demand for a new generation of AI system-level chips. The global market value of AI chips is expected to grow from US$8 billion in 2019 to more than US$70 billion in 2026. Investment in AI start-ups has also hit a record high: in the third quarter of 2021, global AI funding totaled US$17.9 billion, reflecting the worldwide popularity of the AI industry.
In the new year, we face new goals, new opportunities, and new challenges. While working to address supply chain constraints, the global chip shortage, and the economic impact of the COVID-19 pandemic, the semiconductor industry is also actively exploring the metaverse. Integrating intelligence into chips is an inevitable trend, and everyone wants a share of the growing AI market.
AI continues to empower chip design
More and more tasks now require advanced AI. Market demand for faster, more energy-efficient specialized chips is growing, and applying powerful AI to chip design has become crucial.
AI has driven the emergence of a new generation of design tools. By learning continuously across iterations and drawing on data from the chip design environment, AI can greatly improve productivity and cost-effectiveness. In a sense, this disruptive wave will level the playing field: companies that use AI for chip design will be more evenly distributed across the global economy. This opens new opportunities not only for established semiconductor companies, but also for companies with smaller teams or limited financial resources.
Future AI hardware is bound to revolutionize chip design technology. In 2021, companies that invested in data centers achieved considerable returns and demonstrated strong technical capabilities. The growth of data centers has not only increased demand for dedicated AI chips, but also allowed AI investment to grow at an unprecedented rate. GPUs will remain the dominant architecture in the data center market this year, and we expect this growth to continue. Leading companies will adopt next-generation AI-assisted design systems to explore design workflows at scale and automate non-critical decisions. To meet their chip design needs, companies are also turning to the cloud to expand design capacity, shorten turnaround time, and optimize high-quality designs.
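To illustrate the kind of iterative loop such AI-assisted tools automate, here is a minimal Python sketch of design-space exploration using simple random search. The design knobs, their ranges, and the cost model are purely hypothetical and stand in for what a real flow would obtain from synthesis and place-and-route; this is not any vendor's actual tool.

```python
import random

# Hypothetical design space: each knob and its candidate values are illustrative only.
DESIGN_SPACE = {
    "core_count": [2, 4, 8, 16],
    "cache_kb": [256, 512, 1024, 2048],
    "clock_mhz": [800, 1200, 1600, 2000],
}

def evaluate(design):
    """Toy cost model trading performance against power and area.
    A real AI-driven flow would run synthesis / place-and-route here."""
    perf = design["core_count"] * design["clock_mhz"]
    power = 0.002 * design["core_count"] * design["clock_mhz"]
    area = 0.01 * design["cache_kb"] + 2.0 * design["core_count"]
    return perf / (1.0 + power + area)  # higher is better

def explore(iterations=200, seed=0):
    """Random-search exploration: sample candidate designs, keep the best seen."""
    rng = random.Random(seed)
    best_design, best_score = None, float("-inf")
    for _ in range(iterations):
        design = {knob: rng.choice(values) for knob, values in DESIGN_SPACE.items()}
        score = evaluate(design)
        if score > best_score:
            best_design, best_score = design, score
    return best_design, best_score

if __name__ == "__main__":
    design, score = explore()
    print(f"best design: {design}, score: {score:.2f}")
```

In practice, AI-assisted design tools replace the random sampler with learned models that prioritize promising regions of the design space, which is where the productivity gains described above come from.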
AIoT takes over as the driving force of the digital age
AIoT is a relatively new term. It combines the capabilities of AI and the Internet of Things (IoT) to create a more intelligent network of interconnected devices, one that can process and compute on volumes of data that traditional methods cannot handle.
As IoT edge technologies such as augmented intelligence and the metaverse develop, large companies will refocus their strategies on AI innovations that process real-time data and build on the growth of AIoT devices.
Three major applications take the lead in deploying AI chip design
High-performance computing (HPC), autonomous devices, and digital healthcare are three major application areas that will take the lead in using AI technology in chip design.
The demand for specialized chips in data centers means the HPC market will continue to drive large-scale investment in AI chips. With the support of specialized AI chips, data centers will be able to run AI workloads consisting of more than 1 trillion nodes. In edge computing, more and more companies are applying AI chips to fields such as the automotive industry and smart machines, expanding from industrial equipment to smart robots, drones, and beyond. Given the huge scale of data and people's expectations of a digital society, chip design teams need to provide proven IP solutions for the complex SoCs behind these applications. This field is expected to keep growing in 2022 despite the tight supply chain.
The active response of countries around the world to the COVID-19 pandemic has created many opportunities for AI in the medical field, especially in diagnosis and medical research. Computing requirements in healthcare may not be as stringent as those in data centers, but unique requirements around data protection, data security, and real-time analysis call for a secure, localized environment for on-site evaluation and analysis.
From AI accelerators to cognitive systems, the high-performance computing, autonomous device, and digital healthcare markets will attract more companies and capital, while driving both the growth of AI in chip design and the seamless integration of AI into devices.
More system companies are taking the path of self-developed chips
Looking back at 2021, chip design became a hot topic in technology. AI is rapidly reshaping the overall blueprint of chip design, and technology companies of all kinds have begun developing their own chips.
Apple's recently launched in-house M1 Max chip integrates multiple powerful computing components to provide the most powerful chip support in the industry. The move by non-traditional semiconductor companies into custom ASIC (application-specific integrated circuit) development has prompted many companies to carefully evaluate whether self-developed chips really offer a competitive advantage as the markets they serve grow rapidly. Self-developed chips have many advantages, such as maximizing control over data and reducing the latency between decision-making and results. Building a first-class chip design team is an important way to create and protect intellectual property, but as business scenarios expand rapidly, the talent shortage has become another difficult problem companies must face.
AI promotes the establishment of system trust chain
When AI is applied in areas such as autonomous driving, digital finance, and chip design, we see better results and higher productivity without major defects or schedule delays. Enterprises are therefore motivated to establish different levels of trust in the underlying hardware infrastructure and to create secure channels for remote device management, service deployment, and lifecycle management, ensuring that the entire system is trustworthy not only to the software but also to end users.
As AI becomes more prevalent in computing software, the need for advanced trust and security is increasing at every stage of the system, especially during design and integration. Until now, AI hardware has received less attention than software, but as the trust chain grows more important in the current supply chain environment, it will become essential throughout the entire workflow.
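To make the idea of a trust chain concrete, the following minimal Python sketch models a simplified, measured-boot-style chain in which each stage extends a running hash over the next component before handing off control. The stage names and initial register value are purely illustrative assumptions, not a description of any specific product's mechanism.

```python
import hashlib

def extend(measurement: bytes, component: bytes) -> bytes:
    """Extend the running measurement with a new component,
    in the spirit of how a TPM-style register is extended at each boot stage."""
    return hashlib.sha256(measurement + hashlib.sha256(component).digest()).digest()

# Purely illustrative boot stages; real systems would measure firmware images.
stages = [b"boot_rom", b"bootloader", b"os_kernel", b"ai_runtime"]

measurement = b"\x00" * 32  # initial register value
for stage in stages:
    measurement = extend(measurement, stage)

# A verifier that knows the expected stage contents recomputes the chain
# and compares it with the device's reported measurement before trusting it.
expected = b"\x00" * 32
for stage in stages:
    expected = extend(expected, stage)

print("chain of trust verified:", measurement == expected)
```

Because each link depends on every link before it, tampering with any stage changes the final measurement, which is what allows the hardware root of trust to anchor the software layers above it.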
Faster computing, more edge intelligence, more efficient processing of larger volumes of data, and the automation of more product functions are the core factors driving the predictions above.
Bold, innovative AI hardware architectures and a clear AI strategy will be the core drivers of the seamless integration of innovative AI applications and software systems. From chips to software, Synopsys is committed to making technology smarter and safer. Going forward, we will continue to increase investment to achieve disruptive innovation and the rapid growth of AI-driven design solutions.