Samsung has showcased the advancements it has made with its processing-in-memory (PIM) technology as it strives to bring memory and logic closer together.

Announced earlier this year, the PIM technology integrates an artificial intelligence (AI) processor into a high-bandwidth memory (HBM) chip to accelerate compute-intensive tasks such as machine learning (ML).

At the Hot Chips conference, Samsung announced that testing the new HBM-PIM memory in Xilinx Alveo UltraScale+ AI data-center accelerator cards delivered more than double the performance while consuming less than half the energy.

“HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential,” commented Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics.

Mainstream use

Importantly, Samsung also shared plans to extend the PIM technology beyond HBM chips, integrating it into mainstream DIMMs and mobile memory components to accelerate conventional memory-bound workloads.

Oliver Rebholz, head of HANA core research and innovation at SAP, revealed that the company has been working with Samsung on the new memory technology, which it hopes to use to enhance the performance of the SAP HANA in-memory database.

“Based on performance projections and potential integration scenarios, we expect significant performance improvements for in-memory database management system (IMDBMS) and higher energy efficiency via disaggregated computing on AXDIMM,” shared Rebholz.

Moving forward, Samsung also hopes to integrate PIM with its mobile memory technology. At the conference, the company revealed that early simulation tests with LPDDR5-PIM delivered impressive performance gains for applications such as voice recognition and machine translation.