[Video] Here’s Why CXL Is the Memory Solution for the AI Era

February 25, 2022

A world powered by artificial intelligence (AI) is no longer considered part of a distant future; it’s becoming a reality.


COVID-19 has sped up digital transformation by several years, and advances in AI technology have sped right along with it, leading to a significant increase in demand for AI. Large-scale adoption of AI is already taking place in key industries, from the automotive sector and finance to healthcare and education, as seen through innovations like self-driving cars and chatbots. At the same time, the range of applications for AI is expanding fast, driving impressive advancements in areas like image processing, speech recognition, and natural language processing.



A New Memory Solution for the AI Era

In recent years, as data throughput has increased rapidly, it’s stressed the limits of existing computing systems. AI data throughput has been rising tenfold each year,1 and current computing systems don’t offer memory capacities large enough to handle the sharp increase in data volumes.


Currently, a central processing unit (CPU) can hold up to 16 DRAM modules (a maximum of 8 terabytes (TB)), a number far smaller than what’s needed to handle the massive stores of data used in AI and machine learning. The need for a memory platform that supports fast interfaces and easy scalability is becoming all the more clear as the age of AI draws ever nearer. Recently, a new DRAM module based on Compute Express Link (CXL) has emerged as a promising memory solution for the AI era. So too have processing-in-memory (PIM) and computational storage equipped with a memory-based AI processor.




CXL: An Interface Pushing the Boundaries of Memory Capacity and Server Flexibility

What makes CXL a next-generation memory platform and such a promising solution to current computing limitations? In short: scalability.


CXL is a new interface that’s designed to enhance the efficiency of a computing system’s memory, CPU and graphics processing unit (GPU). In conventional platforms, devices like memory and storage have their own interfaces that link them to the CPU. But going through all these different interfaces to communicate with one another creates latency, slowing down operations. And with the massive growth in data being used for AI and machine learning, latency issues have only gotten worse.


CXL is a next-generation interface built on top of PCIe 5.0. By consolidating multiple existing interfaces into one, directly connecting devices and enabling them to share memory, CXL addresses those limitations and creates new data pathways that are faster and more efficient. This is why CXL has been receiving so much attention as a next-generation memory solution.


In line with this trend, in May of 2021, Samsung Electronics introduced the CXL Memory Expander, a first-of-its-kind CXL-based DRAM memory module, and began promoting CXL memory solutions. The main advantages of CXL are as follows:


  • Unrivaled Memory Expansion

Much like a solid state drive (SSD), an external storage device, the CXL Memory Expander installs into the slot where an SSD would normally go, expanding the system’s DRAM capacity. In other words, it enables an IT system’s DRAM capacity to be expanded simply by improving the interface, without having to modify or replace the existing server structure.
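On Linux systems, CXL-attached memory typically enumerates as a DAX device that can be onlined as a CPU-less NUMA node. The commands below are an illustrative sketch of that workflow, assuming a device named `dax0.0` and that the expander comes up as NUMA node 1; actual device and node numbers depend on the platform.

```shell
# List DAX devices; a CXL memory expander often appears here
daxctl list

# Reconfigure the device to system-ram mode so the kernel
# onlines its capacity as a (CPU-less) NUMA node
daxctl reconfigure-device --mode=system-ram dax0.0

# Confirm the new node's capacity alongside regular DRAM
numactl --hardware

# Bind a workload's allocations to the expander's node
# ("my_app" is a placeholder for any application)
numactl --membind=1 ./my_app
```

Once onlined, the expanded capacity is ordinary system memory from the application’s point of view; no code changes are required to use it.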


  • Streamlined Data Handling

A key benefit of the Memory Expander is its efficient data processing. By providing additional bandwidth, it enables different devices to share memory and leverage their resources more effectively. The CPU can then use an accelerator’s memory as if it were main memory by sharing common memory areas, and devices without their own internal memory can likewise draw on main memory as if it were their own.


  • Accelerated Computing Speed

Minimizing the latency caused by increases in data throughput is a key function of the CXL Memory Expander. The Memory Expander leverages both the accelerator and the CPU to improve system computing speeds, supporting much smoother and more rapid data processing.


As it stands, many in the industry are still unfamiliar with the concept of the CXL interface. Although the technology is still in its early stages, its potential to drive more efficient data processing is positioning it as a driver of the Fourth Industrial Revolution.


In preparation for the rapidly approaching AI era, Samsung Electronics will help expand the CXL ecosystem by introducing everything from CXL memory hardware to a software solution called the Scalable Memory Development Kit (SMDK), and will lead the market with next-generation memory solutions that are capable of accommodating the fast-evolving landscape of data processing.



1 Source: OpenAI (2019)
