Samsung believes memory semiconductors will lead the charge in AI supercomputing before the end of the decade, with memory chips eventually outshining Nvidia GPUs in AI server applications. A couple of months ago, semiconductor business chief Kye Hyun Kyung said Samsung will make sure “memory semiconductor-centered supercomputers can come out by 2028.” Now, reports say Samsung is preparing to mass-produce high-bandwidth memory (HBM) chips for AI applications this year.
According to Korean media reports, Samsung plans to mass-produce HBM chips for AI in the second half of 2023, aiming to catch up with SK Hynix, which took an early lead in the AI memory semiconductor market.
SK Hynix held roughly 50% of the HBM market in 2022, while Samsung held around 40%, according to TrendForce (via The Korea Times). Micron accounted for the remaining 10%. The HBM market is still small, however, making up only about 1% of the entire DRAM segment.
Nevertheless, demand for HBM solutions is expected to increase as the AI market grows, and Samsung now intends to close the gap with SK Hynix by mass-producing its HBM3 chips in anticipation of these market changes. Buzzword or not, “AI” workloads are driving real deployments: AI servers are becoming more widespread, and high-bandwidth memory solutions are gaining traction with them.
Samsung's HBM3 solution vertically stacks multiple DRAM chips and comes in 16GB and 24GB capacities. HBM3 reaches speeds of up to 6.4Gbps per pin.
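To put that per-pin figure in perspective, here is a back-of-the-envelope sketch of the bandwidth a single stack delivers. It assumes the standard 1024-bit HBM3 interface width, which the article does not state:

```python
# Rough per-stack bandwidth estimate for HBM3.
# Assumption: standard 1024-bit HBM interface (not stated in the article).
INTERFACE_BITS = 1024      # bits transferred per cycle across the stack's interface
PIN_SPEED_GBPS = 6.4       # per-pin data rate cited in the article, in Gbit/s

# Total bandwidth: pins * per-pin rate, converted from Gbit/s to GB/s.
bandwidth_gbs = INTERFACE_BITS * PIN_SPEED_GBPS / 8
print(f"{bandwidth_gbs:.1f} GB/s per stack")  # 819.2 GB/s
```

That works out to roughly 819GB/s per stack, which is why a handful of HBM3 stacks beside an AI accelerator can feed it far more data than conventional DIMM-based DRAM.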