Can China become self-sufficient in artificial intelligence memory chips in the face of US sanctions?
(Baonghean.vn) - The Chinese government has determined that it must find a way to become self-sufficient in artificial intelligence (AI) memory chips, even if it may take many years.
The rapid rise of generative artificial intelligence (AI) applications such as ChatGPT in recent years has made high-bandwidth memory (HBM), a class of memory chip designed specifically for AI processors, one of the industry's most sought-after components. At the same time, sanctions imposed by Washington have left the Chinese government with little choice but to pursue self-sufficiency in HBM, even though that effort may take years.

Currently, semiconductor giants like South Korea's SK Hynix and Samsung Electronics control up to 90% of the global market for HBM memory chips, which are crucial components needed to train AI systems like OpenAI's ChatGPT. According to researchers at market research firm TrendForce, SK Hynix held 50% of the global HBM market share last year, followed by Samsung with 40% and its American rival Micron Technology with 10%.
Both South Korean companies have announced plans to double their HBM chip production next year, even as they cut back output in other parts of the memory sector amid an oversupply that has eroded operating profits. SK Hynix became the first company to mass-produce the advanced fourth-generation HBM-3 chips used in the H100 graphics processing units (GPUs) of Nvidia, the US multinational technology corporation that develops GPUs and manufactures chipsets for electronic devices such as workstations, personal computers, and mobile devices.
HBM memory chips are the next generation of memory chips specifically designed for artificial intelligence processors.
HBM stacks memory chips vertically, like floors in a skyscraper, shortening the distance information travels. These memory towers are designed to connect to the central processing unit (CPU) or GPU via an ultra-fast connection called an "interposer" located between the chip and the circuit board. According to TrendForce, Nvidia has set a new industry standard by using HBM chips to increase data transfer speeds between GPUs and memory stacks.
According to Nvidia, the highly sought-after Nvidia H100 GPU features an HBM-3 memory system, providing memory bandwidth of up to 3 terabytes per second. Meanwhile, SK Hynix states that HBM technology is "a prerequisite for deploying Level 4 and Level 5 automation in self-driving vehicles."
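The "up to 3 terabytes per second" figure can be roughly reconstructed from published HBM-3 interface characteristics. As a hedged back-of-the-envelope sketch (the 6.4 Gb/s per-pin rate and 1024-bit-per-stack interface come from the JEDEC HBM3 specification; the stack count below is an illustrative assumption, since the actual number varies by GPU model):

```python
# Back-of-the-envelope estimate of aggregate HBM-3 bandwidth.
# Per-pin rate and bus width follow the JEDEC HBM3 spec; the stack
# count is an assumption for illustration, not an Nvidia figure.
pin_rate_gbps = 6.4      # Gb/s per data pin (HBM3 spec maximum)
bus_width_bits = 1024    # interface width per HBM stack, in bits
stacks = 4               # assumed number of active stacks

per_stack_gbs = pin_rate_gbps * bus_width_bits / 8   # GB/s per stack
total_tbs = per_stack_gbs * stacks / 1000            # TB/s aggregate

print(f"{per_stack_gbs:.1f} GB/s per stack, ~{total_tbs:.2f} TB/s total")
```

With four stacks this lands in the ballpark of 3 TB/s, consistent with the bandwidth Nvidia cites for the H100's memory system.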
Recently, SK Hynix also announced that it has successfully developed the HBM-3E, a next-generation high-performance dynamic random access memory (DRAM) for AI applications, and provided samples to customers for performance evaluation. Mass production is expected to begin in the first half of 2024, with customers such as US chip companies AMD and Nvidia reportedly lining up for the new product.
Forecasts indicate that the HBM memory chip market size is estimated to reach US$2.04 billion in 2023 and is expected to reach US$6.32 billion by 2028, growing at a compound annual growth rate (CAGR) of 25.36% during the forecast period of 2023-2028.
Key factors driving the growth of the HBM market include the increasing demand for high bandwidth, low power consumption, and highly scalable memory, the growing adoption of artificial intelligence, and the increasing trend towards miniaturizing electronic devices.
TrendForce forecasts that surging demand for AI memory chips has driven HBM shipments up nearly 60% this year. China, meanwhile, will have to rely entirely on domestic production, putting it in a race against time.
Has China officially joined the AI memory chip race?
Beijing's ambition to join the ranks of the world's leading memory chip makers, and to replace imports with domestically produced chips at home, is increasingly at risk as the global leaders move on to more advanced technologies. At the same time, US sanctions hamper China's ability to catch up.
Not long ago, China was considered to have rapidly caught up with international suppliers in both advanced 3D NAND flash memory and DRAM. In the ChatGPT era, however, the gap is widening again, as China's Yangtze Memory Technologies Co (YMTC) and Changxin Memory Technologies (CXMT) have been unable to sustain their catch-up efforts under US government export restrictions.
China is reportedly seeking to produce its own HBM as part of an effort to achieve self-sufficiency. “While it will be a tough battle to catch up with global leaders like SK Hynix, Samsung Electronics, and Micron Technology due to the impact of Washington’s sanctions, the Chinese government has determined that it must achieve self-sufficiency in HBM, even if it takes years,” the South China Morning Post (SCMP) reported.
Citing sources in the semiconductor industry, the SCMP reported that CXMT, China's leading DRAM manufacturer, is the Chinese government's biggest hope for developing HBM memory chips, though it could take up to four years to bring a product to market. The sources added that if CXMT or other Chinese chipmakers decide to pursue HBM production, they will certainly face difficulties because they would have to rely on older-generation technology.
According to semiconductor technology experts, although HBM memory chips offer high performance, their production doesn't necessarily require cutting-edge lithography technology like extreme ultraviolet (EUV) lithography. This means China can produce its own versions even without the latest equipment.
In short, China is known for keeping its strategic cards related to its latest technologies under wraps, leaving the outside world to speculate about its progress toward achieving technological self-sufficiency, including in artificial intelligence memory chips.


