<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet title="XSL_formatting" type="text/xsl" href="https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss.xsl"?><rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wfw="http://wellformedweb.org/CommentAPI/"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
     xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/"
	>
	<channel>
		<title>High Bandwidth Memory &#8211; Samsung Global Newsroom</title>
		<atom:link href="https://news.samsung.com/global/tag/high-bandwidth-memory/feed" rel="self" type="application/rss+xml" />
		<link>https://news.samsung.com/global</link>
        <image>
            <url>https://img.global.news.samsung.com/image/newlogo/logo_samsung-newsroom.png</url>
            <title>High Bandwidth Memory &#8211; Samsung Global Newsroom</title>
            <link>https://news.samsung.com/global</link>
        </image>
        <currentYear>2021</currentYear>
        <cssFile>https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss_xsl.css</cssFile>
		<description>What's New on Samsung Newsroom</description>
		<lastBuildDate>Wed, 15 Apr 2026 08:00:00 +0000</lastBuildDate>
		<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
		<item>
				<title><![CDATA[Samsung Brings In-Memory Processing Power to Wider Range of Applications]]></title>
				<link>https://news.samsung.com/global/samsung-brings-in-memory-processing-power-to-wider-range-of-applications</link>
				<pubDate>Tue, 24 Aug 2021 11:00:14 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/08/HBM-PIM-press-release_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AXDIMM]]></category>
		<category><![CDATA[HBM-PIM]]></category>
		<category><![CDATA[High Bandwidth Memory]]></category>
		<category><![CDATA[Hot Chips]]></category>
		<category><![CDATA[LPDDR5-PIM]]></category>
		<category><![CDATA[Memory Solutions]]></category>
		<category><![CDATA[PIM ecosystem]]></category>
		<category><![CDATA[Processing-In-Memory]]></category>
		<category><![CDATA[Samsung Memory]]></category>
                <guid isPermaLink="false">https://bit.ly/2XDbRLR</guid>
		<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today showcased its latest advancements with processing-in-memory (PIM) technology at Hot Chips 33—a leading semiconductor conference where the most notable microprocessor and IC innovations are unveiled each year. Samsung’s announcements include the first successful integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialized accelerator […]]]></description>
		<content:encoded><![CDATA[<p>Samsung Electronics, the world leader in advanced memory technology, today showcased its latest advancements with processing-in-memory (PIM) technology at <a href="https://hotchips.org/" target="_blank" rel="noopener">Hot Chips 33</a>—a leading semiconductor conference where the most notable microprocessor and IC innovations are unveiled each year. Samsung’s announcements include the first successful integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialized accelerator system, along with broadened PIM applications embracing DRAM modules and mobile memory, accelerating the move toward the convergence of memory and logic.</p>
<p><img class="alignnone size-full wp-image-126501" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/08/HBM-PIM-press-release_main1.jpg" alt="" width="1000" height="630" /></p>
<h3><span style="color: #000080"><strong>First Integration of HBM-PIM Into an AI Accelerator</strong></span></h3>
<p>In February, Samsung introduced the industry’s first HBM-PIM (Aquabolt-XL), which incorporates the AI processing function into Samsung’s HBM2 Aquabolt, to enhance high-speed data processing in supercomputers and AI applications. The HBM-PIM has since been tested in the Xilinx Virtex UltraScale+ (Alveo) AI accelerator, where it delivered an almost 2.5X system performance gain as well as more than a 60% cut in energy consumption.</p>
<p><img class="alignnone size-full wp-image-126502" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/08/HBM-PIM-press-release_main2.jpg" alt="" width="1000" height="570" /></p>
<p>“HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential,” said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics. “Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers.”</p>
<p>“Xilinx has been collaborating with Samsung Electronics to enable high-performance solutions for data center, networking and real-time signal processing applications starting with the Virtex UltraScale+ HBM family, and recently introduced our new and exciting Versal HBM series products,” said Arun Varadarajan Rajagopal, senior director, Product Planning at Xilinx, Inc. “We are delighted to continue this collaboration with Samsung as we help to evaluate HBM-PIM systems for their potential to achieve major performance and energy-efficiency gains in AI applications.”</p>
<h3><span style="color: #000080"><strong>DRAM Modules Powered by PIM</strong></span></h3>
<p><img class="alignnone size-full wp-image-126504" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/08/HBM-PIM-press-release_main3.jpg" alt="" width="1000" height="640" /></p>
<p>The Acceleration DIMM (AXDIMM) brings processing to the DRAM module itself, minimizing large data movement between the CPU and DRAM to boost the energy efficiency of AI accelerator systems. With an AI engine built inside the buffer chip, the AXDIMM can perform parallel processing of multiple memory ranks (sets of DRAM chips) instead of accessing just one rank at a time, greatly enhancing system performance and efficiency. Since the module can retain its traditional DIMM form factor, the AXDIMM facilitates drop-in replacement without requiring system modifications. Currently being tested on customer servers, the AXDIMM can offer approximately twice the performance in AI-based recommendation applications and a 40% decrease in system-wide energy usage.</p>
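<p><em>Illustrative sketch:</em> the rank-level parallelism described above can be modeled as a toy timing calculation. The rank count and per-rank workload below are made-up assumptions for illustration, not details of the actual AXDIMM design.</p>

```python
# Toy timing model of rank-level parallelism in an AXDIMM-style module.
# RANKS and WORK_PER_RANK are illustrative assumptions, not real figures.

RANKS = 4            # sets of DRAM chips on the module
WORK_PER_RANK = 100  # abstract time units to process one rank's data

def time_serial():
    # Conventional access: the host works through one rank at a time.
    return RANKS * WORK_PER_RANK

def time_rank_parallel():
    # AXDIMM-style: the AI engine in the buffer chip drives all ranks
    # concurrently, so total time is bounded by a single rank's workload.
    return WORK_PER_RANK

print(time_serial())         # 400
print(time_rank_parallel())  # 100
```

<p>With all ranks driven at once, the total time is set by one rank’s workload rather than the sum over ranks, which is the intuition behind the performance gain claimed above.</p>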
<p><img loading="lazy" class="alignnone size-full wp-image-126503" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/08/HBM-PIM-press-release_main4.jpg" alt="" width="1000" height="708" /></p>
<p>“SAP has been continuously collaborating with Samsung on their new and emerging memory technologies to deliver optimal performance on SAP HANA and help database acceleration,” said Oliver Rebholz, head of HANA core research & innovation at SAP. “Based on performance projections and potential integration scenarios, we expect significant performance improvements for in-memory database management system (IMDBMS) and higher energy efficiency via disaggregated computing on AXDIMM. SAP is looking to continue its collaboration with Samsung in this area.”</p>
<h3><span style="color: #000080"><strong>Mobile Memory That Brings AI From Data Center to Device</strong></span></h3>
<p>Samsung’s LPDDR5-PIM mobile memory technology can provide independent AI capabilities without data center connectivity. Simulation tests have shown that the LPDDR5-PIM can more than double performance while reducing energy usage by over 60% when used in applications such as voice recognition, translation and chatbots.</p>
<h3><span style="color: #000080"><strong>Energizing the Ecosystem</strong></span></h3>
<p>Samsung plans to expand its AI memory portfolio by working with other industry leaders to complete standardization of the PIM platform in the first half of 2022. The company will also continue to foster a highly robust PIM ecosystem, ensuring wide applicability across the memory market.</p>
]]></content:encoded>
		</item>
		<item>
				<title><![CDATA[Samsung Develops Industry’s First High Bandwidth Memory with AI Processing Power]]></title>
				<link>https://news.samsung.com/global/samsung-develops-industrys-first-high-bandwidth-memory-with-ai-processing-power</link>
				<pubDate>Wed, 17 Feb 2021 11:00:20 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/02/HBM-PIM_PR_Thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[HBM-PIM]]></category>
		<category><![CDATA[High Bandwidth Memory]]></category>
		<category><![CDATA[Processing-In-Memory]]></category>
                <guid isPermaLink="false">https://bit.ly/3rX04lE</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today announced that it has developed the industry’s first High Bandwidth Memory (HBM) integrated with artificial intelligence (AI) processing power — the HBM-PIM. The new processing-in-memory (PIM) architecture brings powerful AI computing capabilities inside high-performance memory, to accelerate large-scale processing in data centers, high performance computing […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, the world leader in advanced memory technology, today announced that it has developed the industry’s first High Bandwidth Memory (HBM) integrated with artificial intelligence (AI) processing power — the HBM-PIM. The new processing-in-memory (PIM) architecture brings powerful AI computing capabilities inside high-performance memory, to accelerate large-scale processing in data centers, high performance computing (HPC) systems and AI-enabled mobile applications.</p>
<p>Kwangil Park, senior vice president of Memory Product Planning at Samsung Electronics stated, “Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference. We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications.”</p>
<p>Rick Stevens, Argonne’s Associate Laboratory Director for Computing, Environment and Life Sciences commented, “I’m delighted to see that Samsung is addressing the memory bandwidth/power challenges for HPC and AI computing. HBM-PIM design has demonstrated impressive performance and power gains on important classes of AI applications, so we look forward to working together to evaluate its performance on additional problems of interest to Argonne National Laboratory.”</p>
<p>Most of today’s computing systems are based on the von Neumann architecture, which uses separate processor and memory units to carry out millions of intricate data processing tasks. This sequential processing approach requires data to constantly move back and forth, resulting in a system-slowing bottleneck especially when handling ever-increasing volumes of data.</p>
<p>Instead, the HBM-PIM brings processing power directly to where the data is stored by placing a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement. When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to deliver over twice the system performance while reducing energy consumption by more than 70%. The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems.</p>
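<p><em>Illustrative sketch:</em> the data-movement saving from placing an AI engine inside each memory bank can be seen with a toy traffic count. The bank count, element count, and the per-bank reduction below are illustrative assumptions, not Samsung’s HBM-PIM parameters.</p>

```python
# Toy model of the data-movement saving from processing-in-memory (PIM).
# All numbers and the per-bank reduction are illustrative assumptions,
# not Samsung's actual HBM-PIM design.

BANKS = 16             # memory banks, each with its own local AI engine
ELEMS_PER_BANK = 1_000_000
BYTES_PER_ELEM = 2     # e.g. FP16 operands

def traffic_von_neumann():
    # Conventional path: every operand crosses the bus to the processor.
    return BANKS * ELEMS_PER_BANK * BYTES_PER_ELEM

def traffic_pim():
    # PIM path: each bank reduces its own data in place and ships only
    # one partial result per bank back to the host.
    return BANKS * BYTES_PER_ELEM

print(traffic_von_neumann())  # 32000000 bytes cross the bus
print(traffic_pim())          # 32 bytes cross the bus
```

<p>In this toy reduction, bus traffic shrinks from one transfer per operand to one transfer per bank, which is the bottleneck-avoidance idea the paragraph above describes.</p>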
<p>Samsung’s paper on the HBM-PIM has been selected for presentation at the renowned International Solid-State Circuits Conference (ISSCC), held virtually through Feb. 22. Samsung’s HBM-PIM is now being tested inside AI accelerators by leading AI solution partners, with all validations expected to be completed within the first half of this year.</p>
]]></content:encoded>
		</item>
		<item>
				<title><![CDATA[Samsung to Advance High Performance Computing Systems with Launch of Industry’s First 3rd-generation (16GB) HBM2E]]></title>
				<link>https://news.samsung.com/global/samsung-to-advance-high-performance-computing-systems-with-launch-of-industrys-first-3rd-generation-16gb-hbm2e</link>
				<pubDate>Tue, 04 Feb 2020 08:00:32 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2020/02/Samsung-16GB-HBM2E-Flashbolt_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[3rd GEN High Bandwidth Memory 2E]]></category>
		<category><![CDATA[DRAM]]></category>
		<category><![CDATA[Flashbolt]]></category>
		<category><![CDATA[HBM2E]]></category>
		<category><![CDATA[High Bandwidth Memory]]></category>
		<category><![CDATA[Microbumps]]></category>
                <guid isPermaLink="false">http://bit.ly/3b6kaSM</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today announced the market launch of ‘Flashbolt’, its third-generation High Bandwidth Memory 2E (HBM2E). The new 16-gigabyte (GB) HBM2E is uniquely suited to maximize high performance computing (HPC) systems and help system manufacturers to advance their supercomputers, AI-driven data analytics and state-of-the-art graphics systems in a […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-114919" src="https://img.global.news.samsung.com/global/wp-content/uploads/2020/02/Samsung-16GB-HBM2E-Flashbolt_main1.jpg" alt="" width="1000" height="563" /></p>
<p>Samsung Electronics, the world leader in advanced memory technology, today announced the market launch of ‘Flashbolt’, its third-generation High Bandwidth Memory 2E (HBM2E). The new 16-gigabyte (GB) HBM2E is uniquely suited to maximize high performance computing (HPC) systems and help system manufacturers to advance their supercomputers, AI-driven data analytics and state-of-the-art graphics systems in a timely manner.</p>
<p>“With the introduction of the highest performing DRAM available today, we are taking a critical step to enhance our role as the leading innovator in the fast-growing premium memory market,” said Cheol Choi, executive vice president of Memory Sales & Marketing at Samsung Electronics. “Samsung will continue to deliver on its commitment to bring truly differentiated solutions as we reinforce our edge in the global memory marketplace.”</p>
<p>Ready to deliver twice the capacity of the previous-generation 8GB HBM2 ‘Aquabolt’, the new Flashbolt also sharply increases performance and power efficiency to significantly improve next-generation computing systems. The 16GB capacity is achieved by vertically stacking eight layers of 10nm-class (1y) 16-gigabit (Gb) DRAM dies on top of a buffer chip. This HBM2E package is then interconnected in a precise arrangement of more than 40,000 ‘through silicon via’ (TSV) microbumps, with each 16Gb die containing over 5,600 of these microscopic holes.</p>
<p>Samsung’s Flashbolt provides a highly reliable data transfer speed of 3.2 gigabits per second (Gbps) by leveraging a proprietary optimized circuit design for signal transmission, while offering a memory bandwidth of 410GB/s per stack. Samsung’s HBM2E can also attain a transfer speed of 4.2Gbps, the maximum tested data rate to date, enabling up to a 538GB/s bandwidth per stack in certain future applications. This would represent a 1.75x enhancement over Aquabolt’s 307GB/s.</p>
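<p><em>Arithmetic check:</em> the capacity and bandwidth figures above follow from simple arithmetic, assuming the standard 1,024-bit HBM2E I/O interface per stack (an interface width not stated in the text).</p>

```python
# Sanity-check the quoted Flashbolt (HBM2E) figures.
# Assumes the standard 1024-bit HBM2E I/O interface per stack.

LAYERS = 8        # vertically stacked DRAM dies
DIE_GBIT = 16     # capacity per die, in gigabits
IO_BITS = 1024    # interface width per stack, in bits

capacity_gb = LAYERS * DIE_GBIT / 8     # gigabits -> gigabytes
bw_3p2 = 3.2 * IO_BITS / 8              # GB/s per stack at 3.2 Gbps per pin
bw_4p2 = 4.2 * IO_BITS / 8              # GB/s per stack at 4.2 Gbps per pin

print(capacity_gb)             # 16.0 (the 16GB stack)
print(round(bw_3p2, 1))        # 409.6, quoted as 410 GB/s
print(round(bw_4p2, 1))        # 537.6, quoted as 538 GB/s
print(round(bw_4p2 / 307, 2))  # 1.75, the stated 1.75x over Aquabolt
```

<p>Eight 16Gb dies give 128Gb (16GB) per stack, and each per-pin data rate times the 1,024-bit interface reproduces the per-stack bandwidths quoted above.</p>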
<p>Samsung expects to begin volume production during the first half of this year. The company will continue providing its second-generation Aquabolt lineup while expanding its third-generation Flashbolt offering, and will further strengthen collaborations with ecosystem partners in next-generation systems as it accelerates the transition to HBM solutions throughout the premium memory market.</p>
]]></content:encoded>
		</item>
	</channel>
</rss>