High-bandwidth memory, or HBM, is crucial for AI semiconductors. Samsung views the AI gold rush as the perfect opportunity to sell a lot of its HBM modules. The company has already won considerable praise from existing leaders in the segment, such as NVIDIA, and is also expected to be the sole supplier of HBM for NVIDIA's new AI chips.
Not one to rest on its laurels, the company continues to push the envelope on its high-bandwidth memory technology. It has announced the test production of a new 16-stack module that delivers better efficiency and performance.
Samsung's racing ahead with HBM development
Samsung vice president Kim Dae-woo revealed during the Korean Microelectronics and Packaging Society conference that the company has manufactured a sample of 16-stack high-bandwidth memory. This was achieved through a hybrid bonding process.
This is merely a proof of concept at this stage, as mass production of the 16-stack HBM is going to take some time. Even so, Samsung says the module operated normally. The sample was developed using HBM3, but Samsung plans to apply the technique to HBM4 to improve the chip's productivity.
The company continues to evaluate its options for HBM4, weighing hybrid bonding against thermo-compression bonding with non-conductive film. It will launch HBM4 samples next year, with mass production potentially starting in 2025. Hybrid bonding is said to be more beneficial because it would let the company build stacks more compactly, as it eliminates the filler bumps that conventional through-silicon via (TSV) stacking requires to connect the dies.