Source: Translated from Foreign Affairs.
The development of artificial intelligence was once a primarily technical matter, confined to academia and private sector labs. Today, it is an arena for geopolitical competition. The United States and China are investing billions of dollars each year in developing their AI industries, increasing the autonomy and power of future weapons systems, and pushing the frontier of possibility. Fears of an AI arms race between the two countries abound — and while the rhetoric often outstrips the technical reality, rising political tensions mean both countries increasingly view AI as a zero-sum game.
Despite its geopolitical complexities, AI competition can be boiled down to a simple technological trinity: data, algorithms, and computing power. The first two elements of the trinity have received a great deal of policy attention. As the fuel of modern AI, data is often compared to oil, a claim heard everywhere from tech marketing materials to presidential primaries. Equally prominent in policy discussions are algorithms, which enable AI systems to learn from and interpret data. While it is important not to overstate its capabilities in these areas, China has done well on both counts: its government agencies churn through enormous troves of data, and its technology companies have made significant progress in advanced AI algorithms.
But the third element of the trinity is often overlooked in policy discussions. Computing power, or "compute" in industry parlance, is seen as a boring commodity that doesn't deserve serious attention. That's partly because computers are often taken for granted in everyday life. Few people know how fast the processor in their laptop is, only that it's fast enough. But in artificial intelligence, compute is essential. As algorithms learn from data and encode insights into neural networks, they perform trillions upon trillions of individual calculations. Without processors capable of crunching this math at high speed, AI development would grind to a halt. A cutting-edge computer, then, is more than a technological marvel; it's a powerful point of leverage between nations.
Recognizing the true power of computers means reassessing the state of the global AI competition. Unlike the other two elements of the trinity, the computer industry has undergone a silent revolution led by the United States and its allies, one that has given these countries a structural advantage over China and other countries that are data-rich but lag behind in advanced electronics manufacturing. U.S. policymakers can build on this advantage as they seek to maintain their technological edge. To do so, they should consider increasing investment in research and development and restricting exports of certain processors or manufacturing equipment. These options are often underestimated, but they offer substantial leverage for preserving America's technological lead and are too important to ignore.
The computing power for AI has changed radically over the past decade. According to research lab OpenAI, the amount of computing used to train top AI projects increased 300,000-fold between 2012 and 2018. To put that number into context, if a cell phone battery lasted a day in 2012, and its lifespan increased at the same rate as AI computing, the 2018 version of that battery would last more than 800 years.
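As a rough check on the arithmetic above, the article's own figures can be worked through directly. The sketch below assumes only the cited numbers: a 300,000-fold increase in training compute over the six years from 2012 to 2018.

```python
# A sketch using the figures cited above: training compute for top AI
# projects grew 300,000-fold between 2012 and 2018 (per OpenAI).
import math

growth = 300_000      # total increase over the period
months = 6 * 12       # 2012 to 2018

# The battery analogy: one day of battery life, scaled by the same factor.
years_of_battery = growth / 365
print(f"Battery analogy: {years_of_battery:.0f} years")  # more than 800 years

# The implied doubling time of AI training compute over that period.
doublings = math.log2(growth)
print(f"Compute doubled roughly every {months / doublings:.1f} months")
```

The second figure underscores how much faster AI compute grew than Moore's Law alone would predict: a doubling roughly every four months, versus every two years for chip performance.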
This massive computing power has enabled major breakthroughs in AI, including OpenAI's GPT-3 language generator, which can answer science and trivia questions, fix bad grammar, decipher anagrams, and translate between languages. Even more impressive, GPT-3 can generate original stories. Give it a title and a one-sentence summary, and like a student with a writing prompt, it can conjure up coherent paragraphs of text that a human reader would struggle to identify as machine-generated. GPT-3's data (almost a trillion words of human writing) and complex algorithms (running on a giant neural network with 175 billion parameters) have attracted the most attention, but neither would be useful without the program's massive computing power: 3,640 petaflop/s-days, the equivalent of running a quadrillion calculations per second, every second, for roughly ten years.
Recognizing the importance of computing power means reassessing the state of the global AI competition.
The rapid advances in computing that OpenAI and others have exploited are partly a product of Moore’s Law, which states that the basic computing power of cutting-edge chips doubles every 24 months, thanks to improvements in processor engineering. Equally important are rapid improvements in “parallelization,” the ability of multiple computer chips to train AI systems simultaneously. Those same chips have also become increasingly efficient and can be customized for specific machine learning tasks. Together, these three factors have boosted AI computing power, improving its ability to solve real-world problems.
None of these developments has come cheap. As the engineering problems get harder, the cost and complexity of producing new computer chips increases. Moore's little-known second law says that the cost of building factories to produce computer chips doubles every four years. A new chip fabrication plant can cost as much as $20 billion to build and staff, and a single piece of manufacturing equipment inside it can cost more than $100 million. Running ever more chips in parallel also adds expense, as does designing chips specifically for machine learning.
The rising cost and complexity of computing gives the United States and its allies an advantage over China, which still lags behind its competitors on this element of the AI trinity. American companies dominate the market for software needed to design computer chips, while the United States, South Korea, and Taiwan host the leading chip manufacturing facilities. Three countries (Japan, the Netherlands, and the United States) lead in chipmaking equipment, controlling more than 90 percent of the global market.
China has been trying to close these gaps for decades. When Chinese planners decided in 1977 to build a domestic computer chip industry, they believed the country could be internationally competitive within a few years. Beijing invested heavily in the new field. But technological barriers, a lack of experienced engineers, and poor central planning meant Chinese chips still lagged behind competitors decades later. By the 1990s, the Chinese government’s enthusiasm had largely faded.
But in 2014, a dozen leading engineers urged the Chinese government to try again. Chinese officials created the National Integrated Circuit Fund, often called the "Big Fund," to invest in promising chip companies. Beijing's long-term plan is to meet 80 percent of China's chip needs domestically by 2030. Despite some progress, China continues to lag: it still imports 84 percent of its computer chips, and even among chips produced domestically, half are made by non-Chinese companies. Western chip designs, software, and equipment still dominate, even in Chinese manufacturing plants.
The advantages currently enjoyed by the United States and its allies, due in part to the growing importance of computing, present an opportunity for policymakers interested in limiting China's AI capabilities. By restricting the supply of chips with export controls or limiting the transfer of chipmaking equipment, the United States and its allies could slow China's AI development and prolong its reliance on existing producers. The administration of U.S. President Donald Trump has taken limited action in this regard: in a sign of things to come, in 2018 it successfully pressured the Netherlands to block the sale of a $150 million cutting-edge chipmaking machine to China.
The United States and its allies must consider how to develop their own computer chip industries.
Export controls on the chips themselves would likely have diminishing marginal returns: in the long run, a lack of competition from Western technology would only spur China to build its own industry. Restricting access to chipmaking equipment, therefore, may be the more promising approach, since China is unlikely to develop such equipment on its own anytime soon. But the issue is time-sensitive and complex. Policymakers have a window to act, and it may be closing. Their priority must be to determine how best to preserve America's long-term advantage in AI.
The United States and its allies are also considering how to develop their own chip industries. As computing becomes increasingly expensive to build and deploy, policymakers must find ways to ensure that Western companies continue to push the technological frontier. Across several presidential administrations, the United States failed to maintain its edge in telecommunications, ceding much of that field to others, including China's Huawei. When it comes to chips, chipmaking equipment, and AI, the United States cannot afford to repeat that failure.
Part of ensuring that doesn't happen will mean making computing accessible to academic researchers so they can continue to train new experts and contribute to progress in AI development. Some AI researchers already complain that the high cost of computing limits the speed and depth of their research. Few academic labs can marshal the computing power needed to develop a system like GPT-3. If that power remains too expensive for academia, more research will move to large private companies, crowding out startups and stifling innovation.
An often-overlooked lesson of the U.S.-China competition is that computing power matters. Data and algorithms are critical, but they mean little without the compute to back them up. By leveraging their lead in this area, the United States and its allies can maintain their edge over China in AI.
Disclaimer: This article reflects the author's personal views. Semiconductor Industry Observer reprints it only to convey a different point of view; reprinting does not imply agreement or endorsement.

This is the 2395th issue of content shared by Semiconductor Industry Observer.