Advantage Micron? SK Hynix Sees HBM Chip Sales Growing By 30% Every Year Through 2030 On ‘Very Firm And Strong’ AI Demand

HBM chips are dynamic random-access memory (DRAM) chips that provide a wider data channel, lower latency, and reduced power consumption.
The SK Hynix logo displayed on the company's pavilion at the Mobile World Congress 2024 in Barcelona, Spain, on February 28, 2024. (Photo by Joan Cros/NurPhoto via Getty Images)
Shanthi M·Stocktwits
Published Aug 10, 2025 | 11:36 PM GMT-04

As the artificial intelligence (AI) revolution takes a firm grip worldwide, the demand for memory chips that power the technology is expected to surge in the coming years. 

A senior executive at South Korean memory chipmaker SK Hynix stated in an interview with Reuters that the company anticipates a 30% annual increase in high-bandwidth memory (HBM) chip demand until 2030.
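For a rough sense of what that pace implies, the short Python sketch below compounds a 30% annual increase from an assumed 2025 baseline; the base year and the indexed starting value are illustrative, not figures from the interview.

```python
# Minimal sketch: cumulative effect of 30% annual HBM demand growth.
# Assumptions (illustrative, not from the article): 2025 base year, demand indexed to 1.0.
growth_rate = 0.30
base_year, end_year = 2025, 2030

demand = 1.0
for year in range(base_year + 1, end_year + 1):
    demand *= 1 + growth_rate
    print(f"{year}: {demand:.2f}x the {base_year} level")

# By 2030, demand would be roughly 3.7x the 2025 level (1.30 ** 5 ≈ 3.71).
```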

In dollar terms, the executive expects the custom HBM market to reach tens of billions of dollars by 2030. Custom HBM chips include a customer-specific base die that helps manage the memory, making it impossible to swap in a rival's memory products.

HBM chips are dynamic random access memory (DRAM) chips that provide a wider data channel, lower latency, and reduced power consumption, which are critical for high-performance computing, particularly in AI and machine learning applications.

In HBM chips, DRAM dies are stacked vertically and interconnected using through-silicon vias (TSVs).
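To illustrate why the wide interface matters, here is a minimal back-of-the-envelope calculation; the interface width and per-pin data rate below are representative HBM3-class and DDR5 figures, not numbers from this article.

```python
# Peak bandwidth ≈ interface width (bits) * per-pin data rate (Gb/s) / 8.
# Representative figures (assumptions, not from the article): an HBM3-class
# stack exposes a 1024-bit interface at ~6.4 Gb/s per pin, while a single
# DDR5-6400 channel is 64 bits wide at the same pin speed.

def peak_bandwidth_gb_per_s(width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s from interface width and per-pin data rate."""
    return width_bits * pin_rate_gbps / 8

hbm_stack = peak_bandwidth_gb_per_s(1024, 6.4)   # ~819 GB/s per stack
ddr5_channel = peak_bandwidth_gb_per_s(64, 6.4)  # ~51 GB/s per channel
print(f"HBM3-class stack: ~{hbm_stack:.0f} GB/s vs. DDR5 channel: ~{ddr5_channel:.0f} GB/s")
```

The wide bus, made possible by stacking dies over a base die and routing signals through TSVs, is what lets a single HBM stack deliver an order of magnitude more bandwidth than a conventional DRAM channel.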

Major players in the HBM chip market are Micron Technology (MU), Samsung, SK Hynix, Intel, and Fujitsu, according to Mordor Intelligence. Micron stock has gained about 42% year-to-date, compared with the more modest 13% gain for the iShares Semiconductor ETF (SOXX).

On Stocktwits, retail sentiment toward Micron stock remained 'bullish' by late Sunday, while the message volume was 'normal.'

MU sentiment and message volume as of 11:30 p.m., Aug. 10 | source: Stocktwits

Choi Joon-yong, the head of HBM business planning at SK Hynix, told Reuters that “AI demand from the end user is pretty much, very firm and strong.”

According to the executive, the relationship between AI build-outs and HBM purchases is "very straightforward," and the two are correlated. He called SK Hynix's projections conservative, given that they incorporate constraints such as available energy.

SK Hynix, a supplier to Nvidia (NVDA) and Apple (AAPL), reported strong quarterly results in late July, attributing the strength to the aggressive investments by global big tech companies in AI, which led to a steady increase in demand for AI memory.

The next-generation HBM chips, dubbed HBM4, are expected to be launched in the second half of 2025 or 2026, with substantial revenue anticipated in 2027.

A Bloomberg Intelligence report stated in January that the HBM chip market is expected to grow to $130 billion by the end of 2030, up from $4 billion in 2023.
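As a quick check on what that forecast implies, the sketch below back-solves the compound annual growth rate from the cited start and end values; the calculation is illustrative, and only the $4 billion and $130 billion figures come from the report.

```python
# Implied compound annual growth rate (CAGR) of the Bloomberg Intelligence
# forecast cited above: $4 billion in 2023 to $130 billion by 2030.
start_value, end_value = 4e9, 130e9
years = 2030 - 2023

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 64% per year over seven years
```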

For updates and corrections, email newsroom[at]stocktwits[dot]com.

Read Next: Economist Slams Trump Administration’s Rumored Move To Tax Nvidia-AMD’s China Chip Revenue As ‘Unconstitutional’
