Can China become self-sufficient in artificial intelligence memory chips amid US sanctions?
(Baonghean.vn) - The Chinese government has determined that the country must find a way to become self-sufficient in artificial intelligence (AI) memory chips, even though the effort may take many years.
The rapid rise of generative artificial intelligence (AI) applications such as ChatGPT has made High Bandwidth Memory (HBM), memory chips designed specifically for AI processors, some of the industry's most sought-after components. At the same time, sanctions imposed by Washington have left the Chinese government with little choice but to pursue self-sufficiency in HBM, even if that takes many years.

Currently, South Korean semiconductor giants SK Hynix and Samsung Electronics control as much as 90% of the global market for HBM memory chips, critical components for training AI systems like OpenAI's ChatGPT. According to researchers at market research firm TrendForce, SK Hynix held 50% of the global HBM market last year, followed by Samsung at 40% and US rival Micron Technology at 10%.
Both South Korean companies have revealed plans to double their HBM output next year, even as they scale back other parts of the memory business amid a supply glut that has squeezed operating profits. SK Hynix became the first company to mass-produce the advanced fourth-generation HBM3 chips used in Nvidia's H100 graphics processing units (GPUs).
HBM chips are the next generation of memory dedicated to artificial intelligence processors.
HBM stacks memory chips vertically, like floors in a skyscraper, shortening the distance data has to travel. These memory towers connect to the central processing unit (CPU) or GPU through an ultra-fast link called an “interposer” that sits between the chips and the circuit board. According to TrendForce, Nvidia has set a new industry standard by adopting HBM to speed up data transfer between its GPUs and memory.
Nvidia's highly sought-after H100 GPU features HBM3 memory, which provides up to 3 terabytes per second of memory bandwidth, according to the company. SK Hynix, for its part, says HBM technology is “a prerequisite for deploying Level 4 and Level 5 automation in self-driving cars.”
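To put that bandwidth figure in context, the sketch below works through the standard back-of-the-envelope arithmetic: aggregate bandwidth is the number of stacks multiplied by the interface width and the per-pin data rate. The 1024-bit interface and 6.4 Gb/s pin rate come from the published HBM3 specification; the stack counts are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope HBM bandwidth estimate (illustrative figures only):
# aggregate bandwidth = stacks x interface width x per-pin data rate.
# The 1024-bit width and 6.4 Gb/s pin rate follow the public HBM3 spec;
# the stack counts below are assumptions, not figures from the article.

def hbm_bandwidth_gb_s(stacks: int,
                       bus_width_bits: int = 1024,
                       pin_rate_gbit_s: float = 6.4) -> float:
    """Aggregate memory bandwidth in GB/s across all HBM stacks."""
    per_stack_gb_s = bus_width_bits * pin_rate_gbit_s / 8  # bits -> bytes
    return stacks * per_stack_gb_s

if __name__ == "__main__":
    print(f"One HBM3 stack: ~{hbm_bandwidth_gb_s(1):.0f} GB/s")         # ~819 GB/s
    # A GPU carrying several such stacks lands in the multi-terabyte-per-
    # second range quoted for accelerators like the H100.
    print(f"Four stacks:    ~{hbm_bandwidth_gb_s(4) / 1000:.1f} TB/s")  # ~3.3 TB/s
```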
SK Hynix also recently announced that it has successfully developed HBM3E, a next-generation high-end dynamic random access memory (DRAM) for AI applications, and has provided samples to customers for performance evaluation. Mass production is expected to begin in the first half of 2024, with customers such as US chipmakers AMD and Nvidia said to be lining up for the new product.
Market forecasts put the HBM chip market at an estimated USD 2.04 billion in 2023, rising to USD 6.32 billion by 2028, a compound annual growth rate (CAGR) of 25.36% over the 2023-2028 forecast period.
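As a quick sanity check, the standard CAGR formula, (end value / start value)^(1/years) - 1, applied to those two figures reproduces the quoted rate. A minimal Python sketch of the arithmetic:

```python
# Sanity check of the forecast's compound annual growth rate (CAGR).
# Start and end values are the figures cited in the article; the formula
# is the standard CAGR definition.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

growth = cagr(start_value=2.04, end_value=6.32, years=5)  # USD billions, 2023 -> 2028
print(f"Implied CAGR: {growth:.2%}")  # ~25.4%, consistent with the cited 25.36%
```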
Key factors driving the growth of the HBM market include rising demand for high-bandwidth, low-power and highly scalable memory, the increasing adoption of artificial intelligence, and the continued miniaturization of electronic devices.
TrendForce estimates that rising demand for AI memory chips will push HBM shipments up by nearly 60% this year. China, meanwhile, will have to rely solely on domestic production, putting it in a race against time.
Is China officially joining the AI memory chip race?
Beijing's ambition to join the ranks of the world's top memory chip makers and replace imports with domestically made chips is increasingly at risk as the industry's leading companies adopt cutting-edge technologies, while US sanctions hamper China's ability to catch up.
Not long ago, China was seen as quickly catching up with international suppliers in both advanced 3D NAND flash memory and DRAM. In the ChatGPT era, however, the gap is widening again, as Yangtze Memory Technologies Corp (YMTC) and Changxin Memory Technologies (CXMT) have been unable to sustain their catch-up efforts under US government export restrictions.
China is reportedly looking to produce its own HBM as part of its drive for self-sufficiency. “While it will be an uphill battle to catch up with global leaders such as SK Hynix, Samsung Electronics and Micron Technology due to the impact of Washington’s sanctions, the Chinese government has determined that the country must become self-sufficient in HBM, although it may take years,” the South China Morning Post (SCMP) reported.
Citing sources in the semiconductor industry, SCMP reported that China's leading DRAM maker, CXMT, is Beijing's best hope for developing HBM memory chips, though it could take up to four years to bring a product to market. If CXMT or other Chinese chipmakers pursue HBM production, the sources added, they will inevitably struggle because they will have to rely on outdated equipment.
Despite their high performance, HBM memory chips do not necessarily require cutting-edge lithography such as extreme ultraviolet (EUV) tools, according to semiconductor industry experts. That means China could produce its own versions even without the latest equipment.
In short, China is known for keeping its strategic cards close to its chest when it comes to its latest technology, leaving the outside world to speculate about how far it has progressed toward self-sufficiency, including in AI memory chips.