
Introducing Samsung’s SOCAMM2: New LPDDR Memory Module Empowering Next-generation AI Infrastructure


As AI adoption accelerates worldwide, data centers are experiencing explosive growth in computational workloads. With the shift from large-scale model training to continuous inference, the challenge is no longer just about performance — energy efficiency has become equally critical to sustaining next-generation AI infrastructure. This transition is fueling demand for low-power memory solutions capable of supporting continuous AI workloads while optimizing power consumption.

In line with this trend, Samsung has developed SOCAMM2 (Small Outline Compression Attached Memory Module) — an LPDDR-based server memory module designed for AI data centers, with customer samples already being supplied. By combining the strengths of LPDDR technology with a modular, detachable design, SOCAMM2 delivers higher bandwidth, improved power efficiency, and flexible system integration, enabling AI servers to achieve greater efficiency and scalability.



Why SOCAMM2 matters beyond conventional memory

Based on Samsung’s latest LPDDR5X DRAM, SOCAMM2 expands the scope of data-center memory by combining the strengths of LPDDR and modular architectures.

While DDR-based server modules such as RDIMM (Registered Dual Inline Memory Module) continue to serve as the backbone of high-capacity, general-purpose servers, SOCAMM2 offers a complementary alternative optimized for next-generation AI-accelerated servers that demand both high responsiveness and power efficiency. It delivers more than twice the bandwidth of a traditional RDIMM while consuming over 55% less power, and it maintains stable, high-throughput performance under intensive AI workloads, making it an ideal solution for energy-efficient, performance-driven AI servers.
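For a rough sense of what those two figures mean in combination, the short sketch below works through the bandwidth-per-watt arithmetic. It relies only on the relative numbers quoted above (more than twice the bandwidth, over 55% less power); the normalized baseline values are illustrative assumptions, not published RDIMM or SOCAMM2 specifications.

# Illustrative arithmetic only: the 2x and 55% factors come from the figures
# quoted above; the normalized baseline values are assumptions, not measured
# RDIMM or SOCAMM2 specifications.

rdimm_bandwidth = 1.0                     # normalized RDIMM bandwidth
rdimm_power = 1.0                         # normalized RDIMM power draw

socamm2_bandwidth = 2.0 * rdimm_bandwidth      # "more than twice the bandwidth"
socamm2_power = (1.0 - 0.55) * rdimm_power     # "over 55% less power"

rdimm_efficiency = rdimm_bandwidth / rdimm_power
socamm2_efficiency = socamm2_bandwidth / socamm2_power

print(f"Bandwidth per watt vs. RDIMM: ~{socamm2_efficiency / rdimm_efficiency:.1f}x")
# Prints roughly 4.4x under these assumptions; the advantage grows to the
# extent that the actual bandwidth and power figures exceed the quoted minimums.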

By inheriting the low-power characteristics of LPDDR technology and combining them with the scalability of a modular form factor, SOCAMM2 enables greater design flexibility for diverse AI system configurations, providing improved system versatility for next-generation AI infrastructure.


User benefits of SOCAMM2

SOCAMM2’s architectural innovations enable customers to operate AI servers with greater efficiency, flexibility, and reliability.

Its detachable design streamlines system maintenance and lifecycle management. Unlike traditional soldered LPDDR solutions, SOCAMM2 enables easy memory upgrades or replacements without any mainboard modification, helping system administrators minimize downtime and dramatically reduce the total cost of ownership (TCO).

In addition, SOCAMM2’s enhanced power efficiency makes heat management easier and more effective in AI server deployments. This helps data centers maintain thermal stability and reduce cooling requirements — a critical factor for high-density AI environments.

Lastly, the transition from RDIMM’s vertical layout to SOCAMM2’s horizontal orientation further improves system-level space utilization. It enables more flexible heat-sink placement and airflow design, allowing smoother integration with CPUs and accelerators, while remaining compatible with both air and liquid cooling systems.


Close collaboration with NVIDIA and JEDEC standardization

Samsung is expanding its collaboration across the AI ecosystem to accelerate adoption of LPDDR-based server solutions. In particular, the company is working closely with NVIDIA to optimize SOCAMM2 for NVIDIA accelerated infrastructure through ongoing technical cooperation — ensuring it delivers the responsiveness and efficiency required for next-generation inference platforms. This partnership is underscored by NVIDIA’s remarks:

“As AI workloads shift from training to rapid inference for complex reasoning and physical AI applications, next-generation data centers demand memory solutions that deliver both high performance and exceptional power efficiency,” said Dion Harris, senior director, HPC and AI Infrastructure Solutions, NVIDIA. “Our ongoing technical cooperation with Samsung is focused on optimizing memory solutions like SOCAMM2 to deliver the high responsiveness and efficiency essential for AI infrastructure.”

As SOCAMM2 gains traction as a low-power, high-bandwidth solution for next-generation AI systems, the industry has initiated formal standardization efforts for LPDDR-based server modules. Samsung has been contributing to this work alongside key partners, helping to shape consistent design guidelines and enable smoother integration across future AI platforms.



Through continued alignment with the broader AI ecosystem, Samsung is helping to guide the shift toward low-power, high-bandwidth memory for next-generation AI infrastructure. SOCAMM2 represents a major milestone for the industry — bringing LPDDR technology into mainstream server environments and powering the transition to the emerging superchip era. By combining LPDDR with a modular architecture, it provides a practical path toward more compact and power-efficient AI systems.

As AI workloads continue to grow in scale and complexity, Samsung will further advance its LPDDR-based server memory portfolio, reinforcing its commitment to enabling the next generation of AI data centers.


* The contents of this page are provided for informational purposes only. No representation or warranty (whether express or implied) is made by Samsung or any of its officers, advisers, agents, or employees as to the accuracy, reasonableness or completeness of the information, statements, opinions, or matters contained in this page, and they are provided on an "AS-IS" basis. Samsung will not be responsible for any damages arising out of the use of, or otherwise relating to, the contents of this page. Nothing in this page grants you any license or rights in or to information, materials, or contents provided in this document, or any other intellectual property.

* The contents of this page may also include forward-looking statements. Forward-looking statements are not guarantees of future performance, and the actual developments of Samsung, the market, or the industry in which Samsung operates may differ materially from those made or suggested by the forward-looking statements contained in this page.