Challenge NVIDIA and keep going!
Source: Content compiled by Semiconductor Industry Observation (ID: icbank), thank you.
Since last year, in Silicon Valley, a company's position in the artificial intelligence industry has been determined by the number of Nvidia-made graphics processing units, or GPUs, it owns. For example, Microsoft, Amazon, Meta and Tesla have abundant GPU reserves.
Nvidia, a company that began pivoting from gaming and graphics hardware to artificial intelligence chips in 2010, is reaping the rewards of its prescient strategy. The company currently holds about 70% of the GPU market. (Its third-quarter results, released in late November, showed revenue up 206% year-over-year.)
But big tech companies don't just rely on Nvidia. In fact, Amazon, Microsoft, Alphabet, and Meta have all launched their own custom AI chipsets or plan to launch one soon. Gemini, Google's latest and most powerful AI model launched in December, was trained using the company's own tensor processing unit (TPU) chips. (Google claims TPU is the next generation of GPUs.)
Against this backdrop, what role can upstart chipmakers play in a market dominated by tech giants?
"Upstart companies have their work cut out a bit," said Sangeet Paul Choudhary, a management consultant and author of "Platform Scale." "As companies look to target enterprise audiences and build highly fine-tuned models, they will look to build AI chips that are fit-for-purpose to provide optimal performance in that environment. This will drive greater vertical integration overall."
Demand for Nvidia isn't driven by chips alone; the company is also a one-stop shop for developers. "Winning a vertical requires engaging the entire ecosystem. In Nvidia's case, it has much higher adoption in research (judging by citations), higher developer engagement, and now a partnership with Hugging Face as well. CUDA [Nvidia's programming model for GPU computing] is developers' favorite toolkit, [so] just making better chips won't solve the problem," he added.
Already cautious investors are reportedly steering clear of this difficult industry, which Nvidia's dominance makes harder still. PitchBook data shows that as of the end of August 2023, U.S. chip manufacturing startups had raised $881.4 million, a sharp decrease from $1.79 billion in the first three quarters of 2022. The number of deals dropped from 23 to just 4 over the same period.
But the downturn is not limited to capital investment in chip manufacturing. Global venture capital funding fell to about $345 billion as of the fourth quarter, down from $531 billion in 2022, according to PitchBook's First Look data.
"Because of heavy R&D investment, AI chip companies generally have high capital requirements; in the start-up stage they can need 8-10 times the capital of other startups. Moreover, it takes at least two years to develop a chip of medium complexity. These factors mean longer waits for investors to see returns, and higher risk," said Madhukar Bhardwaj, principal investor at Physis Capital.
While Nvidia's dominance is clear and makes it difficult for other companies to attract funding, it's worth noting that "the market is always receptive to innovators with revolutionary products," he said.
In September, Santa Clara-based AI chip startup d-Matrix raised $110 million in a Series B round of funding led by Singapore's Temasek, Microsoft and Playground Global. Rather than focusing on training large-scale AI models, the startup has chosen to do something specialized — inference, which is making predictions based on data.
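To make the training/inference distinction concrete, here is a minimal sketch in plain Python. The one-parameter model and all names are illustrative, not d-Matrix's actual workload: training repeatedly updates weights against data, while inference is a single forward pass with the weights frozen.

```python
# Minimal sketch of training vs. inference with a one-parameter linear model.
# Illustrative only; real AI workloads run these phases on accelerators.

def train(data, lr=0.01, epochs=200):
    """Training: iteratively adjust the weight to fit (x, y) pairs."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = w * x
            grad = 2 * (pred - y) * x   # gradient of squared error
            w -= lr * grad              # weight update: the compute-heavy phase
    return w

def infer(w, x):
    """Inference: a single forward pass, no weight updates."""
    return w * x

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x
w = train(data)
print(round(infer(w, 10.0), 2))  # close to 20.0
```

Inference chips like d-Matrix's target only the second function: cheap, repeated forward passes at serving time.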
In August, another AI hardware startup, Tenstorrent, founded by pioneering chip architect Jim Keller, raised $100 million in a convertible note financing round co-led by Hyundai Motor Group and Samsung Catalyst Fund. The startup recently launched a service that allows customers to use artificial intelligence models without purchasing them. A spokesperson for the company said: "While Nvidia is currently the dominant force in the AI chip industry, we do believe there is room for viable competitors to emerge. We believe the way to compete with Nvidia is to provide complete solutions (hardware and software) without requiring users to change their workflows. We think we can challenge Nvidia with an open source platform."
OpenAI CEO Sam Altman has signed a $51 million deal with chip startup Rain Neuromorphics. The company is developing a neuromorphic processing unit (NPU), a chip modeled on the structure of the human brain; Rain claims NPUs offer 100 times the computing power and 10,000 times the energy efficiency of GPUs.
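The article does not detail Rain's architecture, but the textbook building block of neuromorphic hardware is the spiking neuron. As a generic illustration (not Rain's design), this Python sketch implements a leaky integrate-and-fire neuron: it accumulates input current, leaks charge over time, and emits a spike only when a threshold is crossed, which is why such hardware can be so energy-efficient.

```python
# A leaky integrate-and-fire (LIF) neuron, the standard textbook model behind
# neuromorphic chips. Generic illustration only, not Rain's actual design.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Integrate input current with leakage; spike and reset at threshold."""
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = leak * v + current   # integrate, with exponential leak
        if v >= threshold:       # fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.6, 0.6]))
# -> [0, 0, 0, 1, 0, 0, 1]
```

The neuron does no work between spikes; in silicon, that sparse, event-driven activity is the source of the claimed energy savings.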
There are other young companies taking up the challenge. Tiny Corp. designs ARM-based training and inference chips for edge computing; Modular develops parallel accelerator chips for training and inference (both offer speed at low cost and are considered alternatives to Nvidia's CUDA). Another startup, MatX, designs neural-network inference chips focused on edge applications.
"This is just the beginning of what we're seeing in the industry; I'm sure there is more to come. Despite the perception that Nvidia has a monopoly, I believe the next five years will be a period of acceleration. The AI chip manufacturing pie is huge, and Nvidia's share, even if large, is part of an industry forecast to be worth $200-300 billion over the next 10 years," said Ashok Chandak, chairman of IESA (Indian Electronics and Semiconductor Association).
Chandak believes several factors are behind this: "First, capabilities will increase. We will see AI applied in healthcare, automotive, and robotics; it will not be limited to large language models. Second, GPUs are not the only source of computing. Depending on the use case, there are Intel or AMD CPUs, so there are a lot of opportunities. Third, edge computing will grow in scope and penetrate smaller applications such as smart cameras, medical instruments, and security tools," he explained.
He noted that Nvidia has woken up the entire industry and will be a driver in the long run.
More challengers for GPUs
The cost of further advances in AI is becoming as staggering as ChatGPT's hallucinations. Demand for the graphics chips (GPUs) required for large-scale AI training has sent prices of key components soaring. OpenAI says training the algorithms that currently power ChatGPT cost the company more than $100 million. Competition in AI also means data centers now consume worrying amounts of energy.
The artificial intelligence gold rush has some startups making bold plans to create new computing shovels to sell. Nvidia's GPUs are by far the most popular hardware for developing artificial intelligence, but these upstarts believe it's time to completely rethink the way computer chips are designed.
Normal Computing, a startup founded by veterans of Google Brain and Alphabet's moonshot lab, is one of them. Traditional silicon chips run calculations by processing binary bits, 0s and 1s, that represent information. Normal Computing's stochastic processing unit (SPU) instead exploits the thermodynamic properties of electrical oscillators, performing calculations with the random fluctuations that occur within its circuits. The device generates random samples that can be used to solve linear algebra problems, which are ubiquitous in science, engineering, and machine learning.
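The sampling-based approach to linear algebra can be sketched in software. In this illustration (a conventional Monte Carlo emulation, not Normal Computing's hardware or algorithm), we exploit a standard statistical fact: samples drawn from a zero-mean Gaussian whose precision matrix is A have covariance A⁻¹, so averaging outer products of samples estimates the matrix inverse. A thermodynamic device would produce such samples physically; here NumPy's random generator stands in for the circuit.

```python
import numpy as np

# Conceptual sketch: estimate a matrix inverse from random samples rather than
# direct arithmetic. In an SPU, circuit fluctuations would play the role of
# the software sampler below (which, for the demo, cheats by using np.linalg.inv
# to configure the distribution a physical device would realize natively).

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5], [0.5, 1.0]])   # symmetric positive definite

# Emulated "physical" sampler: x ~ N(0, A^-1)
samples = rng.multivariate_normal(np.zeros(2), np.linalg.inv(A), size=200_000)

# Sample covariance of zero-mean draws converges to A^-1
A_inv_estimate = samples.T @ samples / len(samples)
print(np.round(A_inv_estimate, 2))
print(np.round(np.linalg.inv(A), 2))
```

The estimate converges at the usual Monte Carlo rate; the bet behind thermodynamic hardware is that physics can generate such samples far more cheaply than digital arithmetic can.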
Faris Sbahi, CEO of Normal Computing, explained that the hardware is not only efficient but also well suited to statistical calculations. This could one day help build artificial intelligence algorithms that handle uncertainty, perhaps addressing the tendency of large language models to "hallucinate" outputs when they are uncertain.
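One conventional software-side way to quantify the kind of uncertainty Sbahi describes is the entropy of a model's next-token probability distribution; a flat distribution signals the model is guessing and more likely to hallucinate. The numbers below are illustrative, not from any real model.

```python
import math

# Sketch: Shannon entropy of a next-token distribution as an uncertainty
# signal. Probabilities here are made up for illustration.

def entropy(probs):
    """Shannon entropy in bits; higher means more uncertain."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

confident = [0.97, 0.01, 0.01, 0.01]   # one clear best token
uncertain = [0.25, 0.25, 0.25, 0.25]   # model is guessing uniformly

print(round(entropy(confident), 2))    # 0.24 bits
print(round(entropy(uncertain), 2))    # 2.0 bits, the maximum for 4 options
```

Hardware that computes natively with probability distributions could, in principle, expose such uncertainty as a first-class output rather than a post-hoc calculation.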
Sbahi said the recent successes of generative AI are impressive, but the technology is still far from its final form. “It was clear that there was something better, both in terms of software architecture and hardware,” Sbahi said. He and his co-founders previously worked on quantum computing and artificial intelligence at Alphabet. The lack of progress in using quantum computers for machine learning has prompted them to consider other ways of using physics to power the calculations required for artificial intelligence.
Another team of former quantum researchers from Alphabet left to form Extropic, a company that remains secretive and appears to have an even more ambitious plan for thermodynamic AI computation. "We are trying to tightly integrate all neural computation into an analog thermodynamic chip," said Extropic founder and CEO Guillaume Verdon. "We are taking lessons from quantum computing software and hardware and bringing them into the full-stack thermodynamic paradigm." (Verdon was recently revealed to be the man behind Beff Jezos, a popular meme account on X associated with the so-called effective accelerationism movement, which advocates moving toward a "technological capital singularity.")
As the industry struggles to maintain Moore's Law, the long-standing prediction that components on a chip keep shrinking and their density keeps growing, the idea that computing needs a broader rethink may be gaining momentum. "Even if Moore's Law weren't slowing down, there would still be a big problem, because the model sizes released by OpenAI and other companies are growing much faster than chip capacity," said Peter McMahon, a professor at Cornell University who works on novel ways of computing. In other words, we will likely need new ways of computing to keep the AI hype train on track.
Normal, Extropic and other companies trying to rethink the fundamentals of computer chips are finding investors, a sign that GPUs may soon have some competition. Vaire Computing is a UK-based startup developing silicon chips that work completely differently from traditional chips and can perform calculations without destroying information. This method, known as "reversible computing," was devised decades ago and promised to greatly improve computational efficiency, but it never took off. Vaire co-founder and CEO Rodolfo Rosini believes the physical limits of etching ever-smaller components into silicon mean time is running out for GPUs and other traditional chips. "We still have an order of magnitude left" in chip manufacturing, Rosini said. "We can make components smaller, but the number one enemy is removing heat from the system quickly enough."
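The reversible-computing idea can be illustrated with the Toffoli (CCNOT) gate, a classic reversible logic gate. An ordinary AND gate destroys information (from an output of 0 you cannot recover the inputs), and by Landauer's principle erasing information dissipates heat; the Toffoli gate computes AND while remaining fully invertible. This Python sketch is a software illustration of the principle, not Vaire's design.

```python
# Toffoli (CCNOT) gate: a reversible gate that flips the third bit only when
# the first two are both 1. Applying it twice returns the original inputs,
# so no information is ever destroyed.

def toffoli(a, b, c):
    """Reversible 3-bit gate; with c=0 the third output equals a AND b."""
    return a, b, c ^ (a & b)

for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        back = toffoli(*out)      # the gate is its own inverse
        assert back == (a, b, 0)  # inputs fully recoverable
        print(a, b, "->", out[2]) # third output reproduces AND
```

Because every output maps back to exactly one input, a chip built from such gates need not pay Landauer's erasure cost, which is the efficiency Vaire and others are chasing.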
It's not easy to convince a huge industry to abandon a technology it has refined for more than 50 years. But for companies that deliver the next generation of hardware platforms, the rewards could be huge. Andrew Scott of 7percent Ventures, which backs Vaire, said: "Every now and then something comes along that revolutionizes humanity as a whole, like the jet engine, the transistor, or the quantum computer." Investors betting on Extropic and Normal to reinvent computing harbor similar hopes.
*Disclaimer: This article is original to its author, and its content reflects the author's personal opinion. Semiconductor Industry Observation reprints it only to convey a different point of view; this does not mean Semiconductor Industry Observation agrees with or supports that view. If you have any objections, please contact Semiconductor Industry Observation.
This is the 3650th issue of content shared by "Semiconductor Industry Observation." Welcome to follow us.