The demand for high-bandwidth memory (HBM) is projected to skyrocket in the coming years as AI matures and the era of artificial general intelligence approaches. As one of the world's leading memory suppliers, Samsung understandably sees this as a lucrative opportunity, and the company has put together a grand plan to establish its dominance in the industry.
There's a renewed sense of urgency at Samsung to come out on top in HBM, particularly after SK Hynix gained the first-mover advantage on current-generation chips. To prevent a repeat of that, Samsung plans to pull together all of its technological and financial resources to meet this demand.
Samsung will pool all of its resources to establish its dominance
Kim Kyung-ryun, vice president of memory product planning at Samsung Electronics, said that the hardware infrastructure for such products will be optimized as the technology advances. Samsung plans to cater to these changes by unifying its core dies for data storage while diversifying its package and base dies.
He noted that joint optimization will be necessary as demand for customization increases. To support that, Samsung will maximize common design elements through platformization, creating a system that can efficiently meet customization needs.
“Samsung will respond with its comprehensive capabilities in memory, foundry, system LSI, and advanced packaging (AVP),” Kim said. “We have also formed a dedicated team for next-generation HBMs, which will be unrivaled in the industry and will have significant effects,” he added.
The company has already showcased its 36GB HBM3E 12H DRAM, whose capacity is 2.25x that of the current market-leading parts. Kim expects it to quickly become a mainstream product once the company begins selling it.