SK Hynix Projects 30 Percent Annual Growth in AI Memory Demand
South Korean chipmaker expects sustained high-bandwidth memory demand through 2030, citing AI and data center expansion.
Forecast signals robust AI hardware trajectory
On Aug. 10, 2025, SK Hynix told Reuters it expects high-bandwidth memory (HBM) demand to grow by about 30 percent annually through 2030. The company attributed this projection to AI model training, inference workloads, and large-scale data center deployments.
Why HBM matters for AI performance
HBM provides the high throughput and low latency needed for advanced AI compute. It enables faster model training and more responsive inference, particularly for large language models and graphics-intensive applications. As AI systems scale in size and complexity, the need for rapid data movement between processing …