Samsung Electronics and Advanced Micro Devices (AMD) are deepening their strategic collaboration on next-generation artificial intelligence (AI) infrastructure, signing a Memorandum of Understanding (MoU) to advance high-performance memory and computing technologies amid surging global demand for advanced chips.
The agreement, formalised in Pyeongtaek, South Korea, brings together senior leadership from both companies, including Lisa Su, Chair and CEO of AMD, and Young Hyun Jun, Vice Chairman and CEO of Samsung Electronics.
Under the partnership, Samsung will supply its latest high-bandwidth memory (HBM4) for AMD’s upcoming Instinct MI455X GPU, a next-generation AI accelerator designed for large-scale data centre workloads.

The partnership comes as global demand for AI-optimised hardware continues to accelerate, driven by the rapid expansion of hyperscale data centres and increasingly complex AI models.
The collaboration also covers advanced DRAM solutions, including next-generation DDR5 memory for AMD’s EPYC processors and the Helios platform. Samsung is expected to support sixth-generation AMD EPYC processors, codenamed “Venice,” as AMD expands its footprint in AI and cloud infrastructure markets.
The agreement designates Samsung as a primary HBM4 supplier for the Instinct MI455X, strengthening AMD’s access to the critical memory components required for high-performance AI workloads.
Samsung and AMD on industry push for AI-ready memory
Commenting on the development, Young Hyun Jun says the agreement reflects the growing scope of collaboration between the two companies, spanning memory, foundry and advanced packaging capabilities.
Lisa Su notes that accelerating AI innovation requires tighter integration across the computing stack, from silicon to system and rack-scale infrastructure, highlighting the importance of partnerships in delivering scalable AI solutions.
Samsung says its HBM4 memory is built on a sixth-generation 10-nanometre-class DRAM process with a 4nm logic base die, delivering per-pin speeds of up to 13 gigabits per second and bandwidth of up to 3.3 terabytes per second, positioning it ahead of current industry benchmarks.
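The two quoted figures are consistent with each other if one assumes the 2048-bit-per-stack interface defined in the JEDEC HBM4 standard (the article itself does not state the bus width). A minimal arithmetic sketch under that assumption:

```python
# Sketch: relating the quoted per-pin speed to the quoted stack bandwidth.
# Assumption: a 2048-bit interface per HBM4 stack (per JEDEC HBM4);
# the article gives only the 13 Gb/s and 3.3 TB/s figures.
pin_speed_gbps = 13      # per-pin data rate in Gb/s (from the article)
bus_width_bits = 2048    # interface width per stack (assumed)

bandwidth_gb_per_s = pin_speed_gbps * bus_width_bits / 8  # GB/s per stack
bandwidth_tb_per_s = bandwidth_gb_per_s / 1000            # TB/s per stack

print(round(bandwidth_tb_per_s, 2))  # ≈ 3.33, matching the quoted ~3.3 TB/s
```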
High-bandwidth memory is emerging as a critical component in AI systems, enabling faster data throughput and improved energy efficiency. Industry projections indicate that the HBM market is set for strong growth this decade, as enterprises scale infrastructure to support data-intensive workloads such as large language model training and real-time AI inference.
Strategic implications
The agreement underscores the growing importance of memory technologies, alongside processors, in determining AI system performance. Innovations such as HBM4 are becoming essential for handling the massive data volumes required by modern AI applications.
By securing Samsung as a key memory partner, AMD strengthens its position in the AI hardware market, where integrated, high-performance systems are becoming a defining competitive factor against rival chipmakers developing next-generation AI platforms.