by Jan Michael Carpo, Reporter
Nvidia has officially approved a version of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, specifically the HBM3E model, for integration into its advanced artificial intelligence (AI) processors.
This approval represents a crucial breakthrough for Samsung, the global leader in memory chip manufacturing, as it strives to close the competitive gap with its regional rival, SK Hynix, in producing state-of-the-art memory solutions designed to meet the complex demands of generative AI technologies.
Insiders reveal that although a formal supply agreement between Samsung and Nvidia is not yet in place, both companies are on track to finalize the deal soon. This agreement is expected to facilitate the delivery of Samsung’s certified eight-layer HBM3E chips, with shipments slated to begin in the fourth quarter of 2024.
However, Nvidia has yet to approve Samsung’s 12-layer HBM3E chips, which have not passed the necessary testing protocols, according to sources who requested anonymity because the information is confidential.
Although neither Nvidia nor Samsung has commented on these developments, the implications for the fintech sector are substantial. HBM3E chips are a type of dynamic random access memory (DRAM) that marks a major advance in the memory bandwidth available to AI processors.
Originally developed in 2013, HBM chips are stacked vertically to optimize space and reduce power consumption, making them an essential component in AI-focused graphics processing units (GPUs). These GPUs play a critical role in processing the large volumes of data required by sophisticated fintech applications, such as real-time fraud detection and complex trading algorithms.
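The appeal of stacked memory comes down to simple arithmetic: a wide interface multiplied by a fast per-pin signaling rate yields enormous aggregate bandwidth per stack. The sketch below illustrates that calculation; the pin-speed and bus-width figures are illustrative assumptions in the range publicly associated with HBM3E-class parts, not confirmed Samsung specifications.

```python
# Back-of-envelope peak bandwidth for one HBM stack:
# per-pin speed (Gbit/s) x interface width (bits), converted to bytes.
# Figures are illustrative assumptions, not vendor specifications.

def stack_bandwidth_gbps(pin_speed_gbit: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth of a single HBM stack, in GB/s."""
    return pin_speed_gbit * bus_width_bits / 8

# An HBM3E-class stack at roughly 9.8 Gbit/s per pin over a 1024-bit bus:
print(round(stack_bandwidth_gbps(9.8)))  # about 1254 GB/s, i.e. ~1.2 TB/s
```

Multiply that per-stack figure by the several stacks surrounding a modern AI GPU and the total approaches several terabytes per second, which is what makes HBM attractive for data-hungry workloads like real-time fraud detection.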
Working hard to meet Nvidia’s stringent testing criteria
Samsung’s journey to secure Nvidia’s certification for the HBM3E chips has been challenging. Since last year, Samsung has been working to meet Nvidia’s stringent testing criteria for both the HBM3E and the earlier HBM3 chips. The company faced setbacks over heat management and power efficiency, as reported by Reuters in May.
In response, Samsung implemented significant design changes to address these challenges, ultimately leading to the recent approval of the HBM3E chips. The market responded positively to this certification news.
By 12:51 p.m. in Manila on the same day, Samsung Electronics’ stock had risen by 4.3%, significantly outperforming the broader market’s 2.4% gain. This surge reflects investor confidence in Samsung’s strengthened position in the competitive memory chip market, especially given the escalating demand for AI processors.
Nvidia’s approval of Samsung’s HBM3E chips follows an earlier decision to certify Samsung’s HBM3 chips for use in less advanced GPUs developed for the Chinese market. This move underscores the increasing demand for high-performance GPUs, fueled by the rapid growth of generative AI technologies.
As fintech firms continue to integrate AI-driven solutions to enhance their operations, the need for GPUs capable of handling complex data processing tasks has become more urgent than ever.
Looking ahead, industry experts are optimistic about the future of Samsung’s HBM3E chips. Research firm TrendForce predicts that these chips will likely dominate the HBM market this year, with shipments expected to peak in the latter half of 2024.
In July, Samsung forecast that HBM3E chips would account for 60% of its total HBM chip sales by the fourth quarter, a target that hinges on Nvidia certifying its latest HBM3E chips by the third quarter. Meeting it would further solidify Samsung’s leadership in the memory chip industry.
The fintech industry’s rapid adoption of AI technologies highlights the critical importance of these developments. As financial institutions increasingly leverage AI for a wide range of applications, from customer service automation to predictive analytics, the demand for powerful and efficient memory solutions is set to rise.
In a related development, fellow chipmaker AMD recently showcased the growing momentum of its AMD Instinct accelerator family at the just-concluded Computex 2024, which gathered around 1,500 companies from 36 countries. In a press release, AMD said the AMD Instinct MI400 series, based on the AMD CDNA “Next” architecture, is expected to arrive in 2026.
Samsung’s achievement in securing Nvidia’s approval for its HBM3E chips positions the company as a key player in the future of fintech, where AI-driven innovation is poised to transform the industry.