- SOCAMM is a new modular memory form factor exclusive to Nvidia systems
- Micron says SOCAMM offers high bandwidth, low power and smaller footprint
- SK Hynix plans production of SOCAMM as AI infrastructure demand grows
At the recent Nvidia GTC 2025, memory makers Micron and SK Hynix took the wraps off their respective SOCAMM solutions.
This new modular memory form factor is designed to unlock the full potential of AI platforms and has been developed exclusively for Nvidia’s Grace Blackwell platform.
SOCAMM, or Small Outline Compression Attached Memory Module, is based on LPDDR5X and intended to address growing performance and efficiency demands in AI servers. The form factor reportedly offers higher bandwidth, lower power consumption, and a smaller footprint compared to traditional memory modules such as RDIMMs and MRDIMMs. SOCAMM is specific to Nvidia’s AI architecture and so can’t be used in AMD or Intel systems.
More cost-efficient
Micron announced it will be the first to ship SOCAMM products in volume, with its 128GB SOCAMM modules designed for the Nvidia GB300 Grace Blackwell Ultra Superchip.
According to the company, the modules deliver more than 2.5 times the bandwidth of RDIMMs while using one-third the power.
The compact 14x90mm design is intended to support efficient server layouts and thermal management.
“AI is driving a paradigm shift in computing, and memory is at the heart of this evolution,” said Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit.
“Micron’s contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications.”
SK Hynix also presented its own low-power SOCAMM solution at GTC 2025 as part of a broader AI memory portfolio.
Unlike Micron, the company offered few specifics, but said it is positioning SOCAMM as a key offering for future AI infrastructure and plans to begin mass production “in line with the market’s emergence”.
“We are proud to present our line-up of industry-leading products at GTC 2025,” SK Hynix’s President & Head of AI Infra Juseon (Justin) Kim said.
“With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward.”