<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet title="XSL_formatting" type="text/xsl" href="https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss.xsl"?><rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wfw="http://wellformedweb.org/CommentAPI/"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
     xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/"
	>
	<channel>
		<title>AI Components &#8211; Samsung Global Newsroom</title>
		<atom:link href="https://news.samsung.com/global/tag/ai-components/feed" rel="self" type="application/rss+xml" />
		<link>https://news.samsung.com/global</link>
        <image>
            <url>https://img.global.news.samsung.com/image/newlogo/logo_samsung-newsroom.png</url>
            <title>AI Components &#8211; Samsung Global Newsroom</title>
            <link>https://news.samsung.com/global</link>
        </image>
        <currentYear>2022</currentYear>
        <cssFile>https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss_xsl.css</cssFile>
		<description>What's New on Samsung Newsroom</description>
		<lastBuildDate>Wed, 08 Apr 2026 10:57:18 +0000</lastBuildDate>
		<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
					<item>
				<title>Samsung Electronics and NAVER Team Up To Develop Semiconductor Solutions Optimized for Hyperscale AI</title>
				<link>https://news.samsung.com/global/samsung-electronics-and-naver-team-up-to-develop-semiconductor-solutions-optimized-for-hyperscale-ai</link>
				<pubDate>Tue, 06 Dec 2022 11:00:29 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2022/12/Samsung-NAVER_AI-solution_Thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[AI Into the Future]]></category>
		<category><![CDATA[AI Technology]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[HBM-PIM]]></category>
		<category><![CDATA[HyperCLOVA]]></category>
		<category><![CDATA[NAVER CLOVA]]></category>
		<category><![CDATA[SmartSSD]]></category>
                <guid isPermaLink="false">https://bit.ly/3UwN1VJ</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, and NAVER Corporation, a global internet company with top-notch AI technology, today announced a wide-reaching collaboration to develop semiconductor solutions tailored for hyperscale artificial intelligence (AI) models. Leveraging Samsung’s next-generation memory technologies like computational storage, processing-in-memory (PIM) and processing-near-memory (PNM), as well as Compute Express Link […]]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-137901" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/12/Samsung-NAVER_AI-solution_main1F.jpg" alt="" width="1000" height="570" /></p>
<p>Samsung Electronics, the world leader in advanced memory technology, and NAVER Corporation, a global internet company with top-notch AI technology, today announced a wide-reaching collaboration to develop semiconductor solutions tailored for hyperscale artificial intelligence (AI) models. Leveraging Samsung’s next-generation memory technologies like computational storage, processing-in-memory (PIM) and processing-near-memory (PNM), as well as Compute Express Link (CXL), the companies intend to pool their hardware and software resources to dramatically accelerate the handling of massive AI workloads.</p>
<p>Recent advances in hyperscale AI have led to an exponential growth in data volumes that need to be processed. However, the performance and efficiency limitations of current computing systems pose significant challenges in meeting these heavy computational requirements, fueling the need for new AI-optimized semiconductor solutions.</p>
<p>Developing such solutions requires an extensive convergence of semiconductor and AI disciplines. Samsung is combining its semiconductor design and manufacturing expertise with NAVER’s experience in the development and verification of AI algorithms and AI-driven services, to create solutions that take the performance and power efficiency of large-scale AI to a new level.</p>
<p>For years, Samsung has been introducing memory and storage that support high-speed data processing in AI applications, from computational storage (SmartSSD) and PIM-enabled high bandwidth memory (HBM-PIM) to next-generation memory supporting the Compute Express Link (CXL) interface. Samsung will now join with NAVER to optimize these memory technologies in advancing large-scale AI systems.</p>
<p>NAVER will continue to refine HyperCLOVA, a hyperscale language model with over 200 billion parameters, while improving its compression algorithms to create a simplified model that significantly increases computation efficiency.</p>
<p>“Through our collaboration with NAVER, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” said Jinman Han, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics. “With tailored solutions that reflect the most pressing needs of AI service providers and users, we are committed to broadening our market-leading memory lineup including computational storage, PIM and more, to fully accommodate the ever-increasing scale of data.”</p>
<p>“Combining our acquired knowledge and know-how from HyperCLOVA with Samsung’s semiconductor manufacturing prowess, we believe we can create an entirely new class of solutions that can better tackle the challenges of today’s AI technologies,” said Suk Geun Chung, Head of NAVER CLOVA CIC. “We look forward to broadening our AI capabilities and bolstering our edge in AI competitiveness through this strategic partnership.”</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Introduces Industry’s First 512GB CXL Memory Module</title>
				<link>https://news.samsung.com/global/samsung-electronics-introduces-industrys-first-512gb-cxl-memory-module</link>
				<pubDate>Tue, 10 May 2022 11:00:09 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2022/05/CXL-Memory_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[CXL DRAM]]></category>
		<category><![CDATA[SMDK]]></category>
                <guid isPermaLink="false">https://bit.ly/3P66knr</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today announced its development of the industry’s first 512-gigabyte (GB) Compute Express Link (CXL) DRAM, taking an important step toward the commercialization of CXL which will enable extremely high memory capacity with low latency in IT systems. Since introducing the industry’s first CXL DRAM prototype with […]]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-132446" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/05/CXL-Memory_main1.jpg" alt="" width="1000" height="708" /></p>
<p>Samsung Electronics, the world leader in advanced memory technology, today announced its development of the industry’s first 512-gigabyte (GB) Compute Express Link (CXL) DRAM, taking an important step toward the commercialization of CXL which will enable extremely high memory capacity with low latency in IT systems.</p>
<p>Since introducing the industry’s first CXL DRAM prototype with a field-programmable gate array (FPGA) controller in May 2021, Samsung has been working closely with data center, enterprise server and chipset companies to develop an improved, customizable CXL device.</p>
<p><img class="alignnone size-full wp-image-132447" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/05/CXL-Memory_main2.jpg" alt="" width="1000" height="708" /></p>
<p>The new CXL DRAM is built with an application-specific integrated circuit (ASIC) CXL controller and is the first to pack 512GB of DDR5 DRAM, featuring four times the memory capacity and one-fifth the system latency of Samsung’s previous CXL offering.</p>
<p>“CXL DRAM will become a critical turning point for future computing structures by substantially advancing artificial intelligence (AI) and big data services, as we aggressively expand its usage in next-generation memory architectures including software-defined memory (SDM),” said Cheolmin Park, Vice President of Memory Global Sales & Marketing at Samsung Electronics, and Director of the CXL Consortium. “Samsung will continue to collaborate across the industry to develop and standardize CXL memory solutions, while fostering an increasingly solid ecosystem.”</p>
<p>“As an active member of the CXL Consortium, Lenovo is committed to developing this important standard and helping build the ecosystem around the new CXL interconnect,” said Greg Huff, Chief Technology Officer, Lenovo Infrastructure Solutions Group. “We are excited to be part of Samsung’s CXL development program, working to foster the growth and adoption of innovative CXL products in future Lenovo systems.”</p>
<p><img loading="lazy" class="alignnone size-full wp-image-132448" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/05/CXL-Memory_main3.jpg" alt="" width="1000" height="708" /></p>
<p>“CXL is a key technology that enables more innovative ways to manage memory expansion and pooling which will play an important role in next-generation server platforms,” said Christopher Cox, Vice President of Strategic Technology at Montage Technology. “Montage is excited to continue partnering with Samsung to help the CXL ecosystem expand rapidly.”</p>
<p>In recent years, the growth of the metaverse, AI and big data has been generating explosive amounts of data. However, conventional DDR design limits the scaling of memory capacity beyond the tens of terabyte range, requiring an entirely new memory interface technology like CXL.</p>
<p>Pooling CXL memory alongside main memory allows a server to expand its memory capacity to tens of terabytes and, at the same time, increase its bandwidth to several terabytes per second.</p>
<p>Samsung’s 512GB CXL DRAM will be the first memory device that supports the PCIe 5.0 interface and will come in an EDSFF (E3.S) form factor — especially suitable for next-generation high-capacity enterprise servers and data centers.</p>
<p><img loading="lazy" class="alignnone size-medium wp-image-134216" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/05/CXL-Memory_main4_F-1000x562.jpg" alt="" width="1000" height="562" /></p>
<p>Later this month, Samsung plans to unveil an updated version of its open-source Scalable Memory Development Kit (SMDK). The toolkit is a comprehensive software package that allows the CXL memory expander to work seamlessly in heterogeneous memory systems — enabling system developers to incorporate CXL memory into various IT systems running AI, big data and cloud applications, without having to modify existing application environments.</p>
<p>Samsung will begin sampling its 512GB CXL DRAM with customers and partners for joint evaluation and testing in the third quarter of this year, and plans to have the memory ready for commercialization as next-generation server platforms become available. As a member of the CXL Consortium Board of Directors, Samsung is openly collaborating with many global data center, server and chipset vendors to deliver next-generation interface technologies that can bring highly tangible benefits to the IT industry.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Video] Here To Upgrade the World: Introducing Samsung’s Game-Changing DDR5 Solution</title>
				<link>https://news.samsung.com/global/video-here-to-upgrade-the-world-introducing-samsungs-game-changing-ddr5-solution</link>
				<pubDate>Wed, 06 Apr 2022 12:00:00 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2022/04/DDR5_DRAM_video_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Autonomous driving technology]]></category>
		<category><![CDATA[DDR5]]></category>
		<category><![CDATA[DDR5 DRAM]]></category>
		<category><![CDATA[DIMM]]></category>
		<category><![CDATA[PMIC]]></category>
		<category><![CDATA[Samsung DDR5]]></category>
		<category><![CDATA[Samsung Semiconductor Leadership]]></category>
                <guid isPermaLink="false">https://bit.ly/3LBIwVQ</guid>
									<description><![CDATA[The age of DDR5 has arrived. From 5G and artificial intelligence (AI) to metaverse and augmented reality (AR), high-performance computing is pushing the limits of server environments to process massive amounts of data at extremely high speeds. Understanding that tech giants the world over are set to add droves of servers to their data centers, […]]]></description>
																<content:encoded><![CDATA[<p>The age of DDR5 has arrived. From 5G and artificial intelligence (AI) to metaverse and augmented reality (AR), high-performance computing is pushing the limits of server environments to process massive amounts of data at extremely high speeds. Understanding that tech giants the world over are set to add droves of servers to their data centers, Samsung Electronics has developed its DDR5 memory solutions to play a key role in empowering future-oriented industries to perform at their peak.</p>
<h3><span style="color: #000080"><strong>Future-First Memory Solutions for Data-Driven Innovation</strong></span></h3>
<p>With the development of its DDR5 solution, Samsung, a company known for changing the landscape of the global dynamic random access memory (DRAM) market, has introduced yet another generational shift in the IT industry. Following the release earlier this year of CPUs that support DDR5, tremendous change is expected in the computing landscape as well, with adoption set to extend to gaming and mainstream PCs.</p>
<p>Compared to DDR4, its predecessor which hit the market in 2013, DDR5 DRAM boasts twice the speed and four times the capacity, at 4,800Mbps and 512GB respectively.<sup>1</sup> This next-generation high-performance memory allows networks to handle the soaring amounts of data generated by cloud computing, AI and autonomous driving systems. Unlike DDR4, DDR5 directly incorporates a power management integrated circuit (PMIC) into the dual in-line memory module (DIMM), resulting in a 30 percent increase in power efficiency at the module level and a more stable power supply.</p>
<p>Data centers are the main users of DDR5: they consume large amounts of energy to perform operations and keep servers cool, so they require low-power, high-performance memory to work at their full potential. Starting from the third quarter of this year, existing server DRAM is set to be steadily replaced by DDR5, a shift that could help data centers stay cost-efficient and encourage sustainable, eco-friendly development.</p>
<p>Going beyond the performance limitations of existing DRAMs, DDR5 will be pivotal in leading data-driven innovation in terms of speed, capacity and eco-friendliness. In order to learn more about the new world DDR5 is helping to build, take a look at the video below.</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/hZXk45nVJkU?rel=0" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"><span style="width: 0px;overflow: hidden;line-height: 0" data-mce-type="bookmark" class="mce_SELRES_start"></span></iframe></div>
<p><em><span style="font-size: small"><sup>1</sup> These figures concern modules for servers.</span></em></p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Video] Here’s Why CXL Is the Memory Solution for the AI Era</title>
				<link>https://news.samsung.com/global/video-heres-why-cxl-is-the-memory-solution-for-the-ai-era</link>
				<pubDate>Fri, 25 Feb 2022 10:00:49 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2022/02/CXL_Memory_AI_Era_thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Compute Express Link]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[SMDK]]></category>
                <guid isPermaLink="false">https://bit.ly/3BTw7Jh</guid>
									<description><![CDATA[A world powered by artificial intelligence (AI) is no longer considered part of a distant future; it’s becoming a reality. COVID-19 has sped up digital transformation by several years, and advances in AI technology have sped right along with it, leading to a significant increase in demand for AI. Large-scale adoption of AI is already […]]]></description>
																<content:encoded><![CDATA[<p>A world powered by artificial intelligence (AI) is no longer considered part of a distant future; it’s becoming a reality.</p>
<p>COVID-19 has sped up digital transformation by several years, and advances in AI technology have sped right along with it, leading to a significant increase in demand for AI. Large-scale adoption of AI is already taking place in key industries, from the automotive sector and finance to healthcare and education, as seen through innovations like self-driving cars and chatbots. At the same time, the range of applications for AI is expanding fast, driving impressive advancements in areas like image processing, speech recognition, and natural language processing.</p>
<h3><span style="color: #000080"><strong>A New Memory Solution for the AI Era</strong></span></h3>
<p>In recent years, rapidly increasing data throughput has stressed the limits of existing computing systems. AI data throughput has been rising tenfold each year,<sup>1</sup> and current computing systems do not offer memory capacities large enough to handle the sharp increase in data volumes.</p>
<p>Currently, a central processing unit (CPU) can hold up to 16 DRAMs (a maximum of 8 terabytes (TB)) <span>—</span> a number far smaller than what’s needed to handle the massive stores of data used in AI and machine learning. The need for a memory platform that supports fast interfaces and easy scalability is becoming all the more clear as the age of AI draws ever nearer. Recently, a new DRAM module based on Compute Express Link (CXL) has emerged as a promising memory solution for the AI era. So too have processing-in-memory (PIM) and computational storage equipped with a memory-based AI processor.</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/CTkQDcZznyc?rel=0" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"><span style="width: 0px;overflow: hidden;line-height: 0" data-mce-type="bookmark" class="mce_SELRES_start"></span></iframe></div>
<h3><span style="color: #000080"><strong>CXL: An Interface Pushing the Boundaries of Memory Capacity and Server Flexibility</strong></span></h3>
<p>What makes CXL a next-generation memory platform and such a promising solution to current computing limitations? In short: scalability.</p>
<p>CXL is a new interface that’s designed to enhance the efficiency of a computing system’s memory, CPU and graphics processing unit (GPU). In conventional platforms, devices like memory and storage have their own interfaces that link them to the CPU. But going through all these different interfaces to communicate with one another creates latency, slowing down operations. And with the massive growth in data being used for AI and machine learning, latency issues have only gotten worse.</p>
<p>CXL is part of a next-generation interface that will be applied to PCIe 5.0. By integrating multiple existing interfaces into one, directly connecting devices and enabling them to share memory, CXL addresses those limitations and creates new data pathways that are faster and more efficient. This is why CXL has been receiving so much attention as a next-generation memory solution.</p>
<p>In line with this trend, in May of 2021, Samsung Electronics introduced the CXL Memory Expander, a first-of-its-kind CXL-based software development solution, and began promoting CXL memory solutions. The main advantages of CXL are as follows:</p>
<ul>
<li><span style="font-size: 16pt"><strong>Unrivaled Memory Expansion</strong></span></li>
</ul>
<p>Similar to a solid state drive (SSD), an external storage device, the CXL Memory Expander expands DRAM capacity when installed in the slot where an SSD would be inserted. In other words, it allows an IT system’s DRAM capacity to be expanded simply through an improved interface, without having to modify or replace the existing server structure.</p>
<ul>
<li><span style="font-size: 16pt"><strong>Streamlined Data Handling</strong></span></li>
</ul>
<p>A key benefit of the Memory Expander is its efficient data processing. By providing higher bandwidth, it enables different devices to share memory and leverage their resources more effectively. By sharing common memory areas, devices can use an accelerator’s memory as if it were main memory, and devices without their own internal memory can likewise draw on main memory as their own.</p>
<ul>
<li><span style="font-size: 16pt"><strong>Accelerated Computing Speed</strong></span></li>
</ul>
<p>Minimizing latency issues (or delays) caused by increases in data throughput is a key function of the CXL Memory Expander. The Memory Expander leverages both the accelerator and the CPU to improve system computing speeds, supporting much smoother and more rapid data processing.</p>
<p>As it stands, many in the industry are still unfamiliar with the CXL interface. Although the technology is still in its early stages, its potential to drive more efficient data processing has led it to be viewed as a driver of the Fourth Industrial Revolution.</p>
<p>In preparation for the rapidly approaching AI era, Samsung Electronics will help expand the CXL ecosystem by introducing everything from CXL memory hardware to a software solution called the Scalable Memory Development Kit (SMDK), and will lead the market with next-generation memory solutions that are capable of accommodating the fast-evolving landscape of data processing.</p>
<p><em><span style="font-size: small"><sup>1</sup> Source: OpenAI (2019)</span></em></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Puts Forward a Vision To ‘Copy and Paste’ the Brain on Neuromorphic Chips</title>
				<link>https://news.samsung.com/global/samsung-electronics-puts-forward-a-vision-to-copy-and-paste-the-brain-on-neuromorphic-chips</link>
				<pubDate>Sun, 26 Sep 2021 08:00:34 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/09/Neuromorphic_Chips_0926_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[AI Semiconductors]]></category>
		<category><![CDATA[Neuromorphic Chips]]></category>
		<category><![CDATA[Neuromorphic Engineering]]></category>
		<category><![CDATA[Perspective Paper]]></category>
		<category><![CDATA[Samsung Memory]]></category>
                <guid isPermaLink="false">https://bit.ly/3nZCoOJ</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today shared a new insight that takes the world a step closer to realizing neuromorphic chips that can better mimic the brain. Envisioned by the leading engineers and scholars from Samsung and Harvard University, the insight was published as a Perspective paper, titled ‘Neuromorphic electronics based […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, a world leader in advanced semiconductor technology, today shared a new insight that takes the world a step closer to realizing neuromorphic chips that can better mimic the brain.</p>
<p>Envisioned by the leading engineers and scholars from Samsung and Harvard University, the insight was published as a Perspective paper, titled ‘<a href="https://www.nature.com/articles/s41928-021-00646-1" target="_blank" rel="noopener">Neuromorphic electronics based on copying and pasting the brain</a>’, by Nature Electronics. Donhee Ham, Fellow of Samsung Advanced Institute of Technology (SAIT) and Professor of Harvard University, Professor Hongkun Park of Harvard University, Sungwoo Hwang, President and CEO of Samsung SDS and former Head of SAIT, and Kinam Kim, Vice Chairman and CEO of Samsung Electronics are the co-corresponding authors.</p>
<div id="attachment_127320" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-127320" class="wp-image-127320 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/09/Neuromorphic_Chips_0926_main1.jpg" alt="" width="1000" height="650" /><p id="caption-attachment-127320" class="wp-caption-text">Image of rat neurons on CNEA (CMOS nanoelectrode array).</p></div>
<p>The essence of the vision put forward by the authors is best summed up by the two words, ‘copy’ and ‘paste’. The paper suggests a way to copy the brain’s neuronal connection map using a breakthrough nanoelectrode array developed by Dr. Ham and Dr. Park, and to paste this map onto a high-density three-dimensional network of solid-state memories, the technology for which Samsung has been a world leader.</p>
<p>Through this copy-and-paste approach, the authors envision creating a memory chip that approximates the unique computing traits of the brain – low power, facile learning, adaptation to environment, and even autonomy and cognition – that have been beyond the reach of current technology.</p>
<p>The brain is made up of a large number of neurons, and their wiring map is responsible for the brain’s functions. Thus, knowledge of this map is the key to reverse engineering the brain.</p>
<p>While the original goal of neuromorphic engineering, launched in the 1980s, was to mimic the structure and function of neuronal networks on a silicon chip, this proved difficult because, even now, little is known about how the large number of neurons are wired together to create the brain’s higher functions. The goal of neuromorphic engineering has thus been eased to designing a chip ‘inspired’ by the brain rather than rigorously mimicking it.</p>
<p>This paper suggests a way to return to the original neuromorphic goal of reverse engineering the brain. The nanoelectrode array can effectively enter a large number of neurons and record their electrical signals with high sensitivity. These massively parallel intracellular recordings inform the neuronal wiring map, indicating where neurons connect with one another and how strong these connections are. From these telltale recordings, the neuronal wiring map can be extracted, or ‘copied’.</p>
<p>The copied neuronal map can then be ‘pasted’ to a network of non-volatile memories – such as commercial flash memories that are used in our everyday life in solid-state drives (SSD), or ‘new’ memories such as resistive random access memories (RRAM) – by programming each memory so that its conductance represents the strength of each neuronal connection in the copied map.</p>
<div id="attachment_127317" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-127317" class="wp-image-127317 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/09/Neuromorphic_Chips_0926_main2.jpg" alt="" width="1000" height="312" /><p id="caption-attachment-127317" class="wp-caption-text">(From the left) Donhee Ham, Fellow of Samsung Advanced Institute of Technology (SAIT) and Professor of Harvard University, Hongkun Park, Professor of Harvard University, Sungwoo Hwang, President and CEO of Samsung SDS (former Head of SAIT) and Kinam Kim, Vice Chairman and CEO of Samsung Electronics, the co-corresponding authors.</p></div>
<p>The paper takes a step further and suggests a strategy to rapidly paste the neuronal wiring map onto a memory network. A network of specially engineered non-volatile memories can learn and express the neuronal connection map when directly driven by the intracellularly recorded signals. This scheme directly downloads the brain’s neuronal connection map onto the memory chip.</p>
<p>Since the human brain has an estimated 100 billion or so neurons, and roughly a thousand times more synaptic connections, the ultimate neuromorphic chip will require some 100 trillion memories. Integrating such a vast number of memories on a single chip would be made possible by 3D integration of memories, a technology led by Samsung that opened up a new era for the memory industry.</p>
<p>Leveraging its leading experience in chip manufacturing, Samsung plans to continue its research into neuromorphic engineering in order to extend its leadership in the field of next-generation AI semiconductors.</p>
<p>“The vision we present is highly ambitious,” said Dr. Ham. “But working toward such a heroic goal will push the boundaries of machine intelligence, neuroscience, and semiconductor technology.”</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Develops Industry’s First High Bandwidth Memory with AI Processing Power</title>
				<link>https://news.samsung.com/global/samsung-develops-industrys-first-high-bandwidth-memory-with-ai-processing-power</link>
				<pubDate>Wed, 17 Feb 2021 11:00:20 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/02/HBM-PIM_PR_Thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[HBM-PIM]]></category>
		<category><![CDATA[High Bandwidth Memory]]></category>
		<category><![CDATA[Processing-In-Memory]]></category>
                <guid isPermaLink="false">https://bit.ly/3rX04lE</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today announced that it has developed the industry’s first High Bandwidth Memory (HBM) integrated with artificial intelligence (AI) processing power — the HBM-PIM. The new processing-in-memory (PIM) architecture brings powerful AI computing capabilities inside high-performance memory, to accelerate large-scale processing in data centers, high performance computing […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, the world leader in advanced memory technology, today announced that it has developed the industry’s first High Bandwidth Memory (HBM) integrated with artificial intelligence (AI) processing power — the HBM-PIM. The new processing-in-memory (PIM) architecture brings powerful AI computing capabilities inside high-performance memory, to accelerate large-scale processing in data centers, high performance computing (HPC) systems and AI-enabled mobile applications.</p>
<p>Kwangil Park, senior vice president of Memory Product Planning at Samsung Electronics stated, “Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference. We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications.”</p>
<p>Rick Stevens, Argonne’s Associate Laboratory Director for Computing, Environment and Life Sciences commented, “I’m delighted to see that Samsung is addressing the memory bandwidth/power challenges for HPC and AI computing. HBM-PIM design has demonstrated impressive performance and power gains on important classes of AI applications, so we look forward to working together to evaluate its performance on additional problems of interest to Argonne National Laboratory.”</p>
<p>Most of today’s computing systems are based on the von Neumann architecture, which uses separate processor and memory units to carry out millions of intricate data processing tasks. This sequential processing approach requires data to constantly move back and forth, resulting in a system-slowing bottleneck especially when handling ever-increasing volumes of data.</p>
<p>Instead, the HBM-PIM brings processing power directly to where the data is stored by placing a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement. When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to deliver over twice the system performance while reducing energy consumption by more than 70%. The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems.</p>
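<p>As an illustrative sketch only (a toy model of the in-memory computing idea described above, not Samsung&#8217;s HBM-PIM design), the benefit can be pictured as each memory bank computing a partial result locally, so that only small results rather than raw data cross the memory bus:</p>

```python
# Toy comparison of the two architectures described above. In the
# von Neumann style every element travels to the processor; in the
# PIM style each "bank" reduces its slice in place and ships only
# one partial sum. Names and sizes here are illustrative.

def host_dot(data, weights):
    # von Neumann style: all data is shipped to the processor first.
    moved = len(data)  # elements transferred over the memory bus
    result = sum(d * w for d, w in zip(data, weights))
    return result, moved

def pim_dot(banks, weight_banks):
    # PIM style: each bank computes its partial sum locally;
    # only one partial result per bank crosses the bus.
    partials = [sum(d * w for d, w in zip(bank, wb))
                for bank, wb in zip(banks, weight_banks)]
    moved = len(partials)
    return sum(partials), moved

data = list(range(16))
weights = [1.0] * 16
banks = [data[i:i + 4] for i in range(0, 16, 4)]      # 4 banks of 4 elements
wbanks = [weights[i:i + 4] for i in range(0, 16, 4)]

full, moved_host = host_dot(data, weights)
pim, moved_pim = pim_dot(banks, wbanks)
assert full == pim              # identical numerical result
assert moved_pim < moved_host   # far less data crosses the memory bus
```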
<p>Samsung’s paper on the HBM-PIM has been selected for presentation at the renowned International Solid-State Circuits Virtual Conference (ISSCC) held through Feb. 22. Samsung’s HBM-PIM is now being tested inside AI accelerators by leading AI solution partners, with all validations expected to be completed within the first half of this year.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Baidu and Samsung Electronics Ready for Production of Leading-Edge AI Chip for Early Next Year</title>
				<link>https://news.samsung.com/global/baidu-and-samsung-electronics-ready-for-production-of-leading-edge-ai-chip-for-early-next-year</link>
				<pubDate>Wed, 18 Dec 2019 08:00:18 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/12/Samsung-Baidu-Chip_thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Baidu KUNLUN]]></category>
		<category><![CDATA[Edge Computing]]></category>
		<category><![CDATA[High Performance Computing]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[I-Cube™]]></category>
		<category><![CDATA[Interposer-Cube]]></category>
                <guid isPermaLink="false">http://bit.ly/34snfrF</guid>
									<description><![CDATA[Baidu, a leading Chinese-language Internet search provider, and Samsung Electronics, a world leader in advanced semiconductor technology, today announced that Baidu’s first cloud-to-edge AI accelerator, Baidu KUNLUN, has completed its development and will be mass-produced early next year. Baidu KUNLUN chip is built on the company’s advanced XPU, a home-grown neural processor architecture for cloud, […]]]></description>
																<content:encoded><![CDATA[<p>Baidu, a leading Chinese-language Internet search provider, and Samsung Electronics, a world leader in advanced semiconductor technology, today announced that Baidu’s first cloud-to-edge AI accelerator, Baidu KUNLUN, has completed its development and will be mass-produced early next year.</p>
<p>The Baidu KUNLUN chip is built on the company&#8217;s advanced XPU, a home-grown neural processor architecture for cloud, edge, and AI, as well as Samsung&#8217;s 14-nanometer (nm) process technology with its I-Cube&#8482; (Interposer-Cube) package solution.</p>
<p>The chip offers 512 gigabytes per second (GBps) memory bandwidth and supplies up to 260 Tera operations per second (TOPS) at 150 watts. In addition, the new chip allows Ernie, a pre-training model for natural language processing, to infer three times faster than the conventional GPU/FPGA-accelerating model.</p>
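<p>The quoted figures imply a power efficiency that is easy to check (simple arithmetic on the numbers above, not an additional specification):</p>

```python
# Sanity-check the efficiency implied by the figures quoted above:
# up to 260 TOPS at 150 watts.
tops = 260
watts = 150
tops_per_watt = tops / watts
assert round(tops_per_watt, 2) == 1.73  # ~1.73 TOPS per watt
```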
<p>Leveraging the chip’s limit-pushing computing power and power efficiency, Baidu can effectively support a wide variety of functions including large-scale AI workloads, such as search ranking, speech recognition, image processing, natural language processing, autonomous driving, and deep learning platforms like PaddlePaddle.</p>
<p>Through the first foundry cooperation between the two companies, Baidu will provide advanced AI platforms for maximizing AI performance, and Samsung will expand its foundry business into high performance computing (HPC) chips that are designed for cloud and edge computing.</p>
<p>&#8220;We are excited to lead the HPC industry together with Samsung Foundry,&#8221; said OuYang Jian, Distinguished Architect of Baidu. &#8220;Baidu KUNLUN is a very challenging project since it requires not only a high level of reliability and performance at the same time, but is also a compilation of the most advanced technologies in the semiconductor industry. Thanks to Samsung&#8217;s state-of-the-art process technologies and competent foundry services, we were able to meet and surpass our goal of offering a superior AI user experience.&#8221;</p>
<p>“We are excited to start a new foundry service for Baidu using our 14nm process technology,” said Ryan Lee, vice president of Foundry Marketing at Samsung Electronics. “Baidu KUNLUN is an important milestone for Samsung Foundry as we’re expanding our business area beyond mobile to datacenter applications by developing and mass-producing AI chips. Samsung will provide comprehensive foundry solutions from design support to cutting-edge manufacturing technologies, such as 5LPE, 4LPE, as well as 2.5D packaging.”</p>
<p>As higher performance is required in diverse applications such as AI and HPC, chip integration technology is becoming increasingly important. Samsung&#8217;s I-Cube&#8482; technology, which connects a logic chip and high bandwidth memory (HBM2) with an interposer, provides higher density and bandwidth in a minimal footprint by utilizing Samsung&#8217;s differentiated solutions.</p>
<p>Compared to previous technology, these solutions maximize product performance with more than 50% improved power/signal integrity. It is anticipated that I-Cube&#8482; technology will mark a new epoch in the heterogeneous computing market. Samsung is also developing more advanced packaging technologies, such as redistribution layer (RDL) interposers and 4x and 8x HBM integrated packages.</p>
<div id="attachment_114122" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-114122" class="size-full wp-image-114122" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/12/Samsung-Baidu-Chip-Main1.jpg" alt="" width="1000" height="400" /><p id="caption-attachment-114122" class="wp-caption-text">2.5D Package Structure</p></div>
<p><span style="font-size: small"><strong><u>About Baidu</u></strong></span></p>
<p><span style="font-size: small">Baidu, Inc. is the leading Chinese language Internet search provider. Baidu aims to make the complicated world simpler through technology. Baidu’s ADSs trade on the NASDAQ Global Select Market under the symbol “BIDU”. Currently, ten ADSs represent one Class A ordinary share.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Introduces A High-Speed, Low-Power NPU Solution for AI Deep Learning</title>
				<link>https://news.samsung.com/global/samsung-electronics-introduces-a-high-speed-low-power-npu-solution-for-ai-deep-learning</link>
				<pubDate>Tue, 02 Jul 2019 16:00:41 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/07/OnDevice-AI_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[AI Lightweight Algorithm]]></category>
		<category><![CDATA[Computer Vision and Pattern Recognition]]></category>
		<category><![CDATA[CVPR]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Exynos 9820]]></category>
		<category><![CDATA[Neural Processing Unit]]></category>
		<category><![CDATA[NPU]]></category>
		<category><![CDATA[On-Device AI]]></category>
		<category><![CDATA[QIL]]></category>
		<category><![CDATA[Quantization Interval Learning]]></category>
		<category><![CDATA[SAIT]]></category>
		<category><![CDATA[Samsung Advanced Institute of Technology]]></category>
		<category><![CDATA[Samsung Exynos 9820]]></category>
                <guid isPermaLink="false">http://bit.ly/2FJkaKb</guid>
									<description><![CDATA[Deep learning algorithms are a core element of artificial intelligence (AI) as they are the processes by which a computer is able to think and learn like a human being does. A Neural Processing Unit (NPU) is a processor that is optimized for deep learning algorithm computation, designed to efficiently process thousands of these computations […]]]></description>
																<content:encoded><![CDATA[<p>Deep learning algorithms are a core element of artificial intelligence (AI) as they are the processes by which a computer is able to think and learn like a human being does. A Neural Processing Unit (NPU) is a processor that is optimized for deep learning algorithm computation, designed to efficiently process thousands of these computations simultaneously.</p>
<p>Samsung Electronics last month announced its goal to strengthen its leadership in the global system semiconductor industry by 2030 by expanding its proprietary NPU technology development. The company recently delivered an update on this goal at the Conference on Computer Vision and Pattern Recognition (CVPR), one of the top academic conferences in the computer vision field.</p>
<p>This update is the company&#8217;s development of its On-Device AI lightweight algorithm, introduced at CVPR with a paper titled &#8220;Learning to Quantize Deep Networks by Optimizing Quantization Intervals With Task Loss&#8221;. On-Device AI technologies compute and process data directly within the device itself. Over 4 times lighter and 8 times faster than existing algorithms, Samsung&#8217;s latest algorithm is a dramatic improvement over previous solutions and has been evaluated as key to enabling low-power, high-speed computation.</p>
<h3><span style="color: #000080"><strong>Streamlining the Deep Learning Process</strong></span></h3>
<p>Samsung Advanced Institute of Technology (SAIT) has announced that it has successfully developed On-Device AI lightweight technology that performs computations 8 times faster than existing algorithms that process 32-bit deep learning data on servers. By adjusting the data into groups of under 4 bits while maintaining accurate data recognition, this method of deep learning algorithm processing is simultaneously much faster and much more energy efficient than existing solutions.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-111111" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/07/OnDevice-AI_main1.jpg" alt="" width="1000" height="771" /></p>
<p>Samsung&#8217;s new On-Device AI processing technology determines, through &#8216;learning&#8217;, the intervals of the significant data that influence overall deep learning performance. This &#8216;Quantization<sup><span>1</span></sup> Interval Learning (QIL)&#8217; retains data accuracy by re-organizing the data to be presented in fewer bits than its existing size. SAIT ran experiments demonstrating that quantizing a 32-bit in-server deep learning algorithm down to levels of less than 4 bits with QIL preserved higher accuracy than other existing solutions.</p>
<p>When the data of a deep learning computation is presented in groups of fewer than 4 bits, logical &#8216;AND&#8217; and &#8216;OR&#8217; operations can be used on top of the simpler arithmetic calculations of addition and multiplication. This means that the QIL process can achieve the same results as existing processes while using only 1/40 to 1/120 as many transistors<sup><span>2</span></sup>.</p>
<p>As this system therefore requires less hardware and less electricity, it can be mounted directly in-device at the place where the data for an image or fingerprint sensor is being obtained, ahead of transmitting the processed data on to the necessary end points.</p>
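<p>A minimal sketch may help picture interval-based quantization (a toy uniform quantizer over a fixed interval; Samsung&#8217;s published QIL additionally learns the interval itself from the task loss, which this sketch does not do):</p>

```python
# Toy interval quantizer, in the spirit of the QIL idea described above.
# Values are clipped to a fixed interval [lo, hi] and mapped onto 2**bits - 1
# uniform steps, so each value fits in `bits` bits. This is illustrative
# only -- not Samsung's published algorithm.

def quantize(x, lo, hi, bits):
    """Map x into one of 2**bits levels spanning [lo, hi]."""
    levels = 2 ** bits - 1
    clipped = min(max(x, lo), hi)        # values outside the interval are clipped
    step = (hi - lo) / levels
    code = round((clipped - lo) / step)  # integer code that fits in `bits` bits
    return code, lo + code * step        # code and its dequantized value

# 3-bit quantization of a weight assumed to matter only inside [-1, 1]
code, approx = quantize(0.27, -1.0, 1.0, bits=3)
assert 0 <= code <= 7                                # a valid 3-bit code
assert abs(approx - 0.27) <= (2.0 / 7) / 2 + 1e-9    # error within half a step
```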
<h3><span style="color: #000080"><strong>The Future of AI Processing and Deep Learning</strong></span></h3>
<p>This technology will help develop Samsung&#8217;s system semiconductor capacity as well as strengthen one of the core technologies of the AI era &#8211; On-Device AI processing. Unlike AI services that use cloud servers, On-Device AI technologies compute data entirely within the device itself.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-111107" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/07/OnDevice-AI_main2.jpg" alt="" width="1000" height="1315" /></p>
<p>On-Device AI technology can reduce the cost of cloud infrastructure for AI operations since it operates on its own, and it provides quick and stable performance for use cases such as virtual reality and autonomous driving. Furthermore, On-Device AI technology can securely store the personal biometric information used for device authentication, such as fingerprint, iris and face scans, on the mobile device itself.</p>
<p>“Ultimately, in the future we will live in a world where all devices and sensor-based technologies are powered by AI,” noted Chang-Kyu Choi, Vice President and head of Computer Vision Lab of SAIT. “Samsung’s On-Device AI technologies are lower-power, higher-speed solutions for deep learning that will pave the way to this future. They are set to expand the memory, processor and sensor market, as well as other next-generation system semiconductor markets.”</p>
<p>A core feature of On-Device AI technology is its ability to compute large amounts of data at a high speed without consuming excessive amounts of electricity. Samsung’s first solution to this end was the Exynos 9 (9820), introduced last year, which featured a proprietary Samsung NPU inside the mobile System on Chip (SoC). This product allows mobile devices to perform AI computations independent of any external cloud server.</p>
<p>Many companies are turning their attention to On-Device AI technology. Samsung Electronics plans to enhance and extend its AI technology leadership by applying this algorithm not only to mobile SoC, but also to memory and sensor solutions in the near future.</p>
<div id="attachment_111108" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-111108" class="wp-image-111108 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/07/OnDevice-AI_main3.jpg" alt="" width="1000" height="473" /><p id="caption-attachment-111108" class="wp-caption-text">Four individuals who played key roles in developing Samsung&#8217;s On-Device AI Lightweight Algorithm. From left to right: Jae-Joon Han, Chang-Young Son, Sang-Il Jung and Chang-Kyu Choi of Samsung Advanced Institute of Technology</p></div>
<p><span style="font-size: small"><sup><span>1</span></sup> <em>Quantization is the process of decreasing the number of bits in data by binning the given data into a limited number of sections, each of which can be represented by a certain bit value and is regarded as having a single value.</em></span></p>
<p><span style="font-size: small"><sup><span>2</span></sup> <em>Transistors are devices that control the flow of current or voltage in a semiconductor by acting as amplifiers or switches</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics to Strengthen its Neural Processing Capabilities for Future AI Applications</title>
				<link>https://news.samsung.com/global/samsung-electronics-to-strengthen-its-neural-processing-capabilities-for-future-ai-applications</link>
				<pubDate>Tue, 18 Jun 2019 11:00:40 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/06/SamsungRadeon_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[ADAS]]></category>
		<category><![CDATA[advanced driver assistance systems]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[automotive processors]]></category>
		<category><![CDATA[Big Data Processing]]></category>
		<category><![CDATA[Exynos 9820]]></category>
		<category><![CDATA[In-Vehicle Infotainment]]></category>
		<category><![CDATA[IVI]]></category>
		<category><![CDATA[NPU]]></category>
		<category><![CDATA[Samsung AI Solutions]]></category>
		<category><![CDATA[Samsung Exynos 9820]]></category>
		<category><![CDATA[Semiconductors Leadership]]></category>
                <guid isPermaLink="false">http://bit.ly/2XTU12B</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today announced that it will strengthen its neural processing unit (NPU) capabilities to further extend the reach of its artificial intelligence (AI) solutions. In line with its focus on next-generation NPU technologies, Samsung plans to create over 2,000 related jobs worldwide by 2030, which is about […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, a world leader in advanced semiconductor technology, today announced that it will strengthen its neural processing unit (NPU) capabilities to further extend the reach of its artificial intelligence (AI) solutions.</p>
<p>In line with its focus on next-generation NPU technologies, Samsung plans to create over 2,000 related jobs worldwide by 2030, which is about 10 times the current headcount. The company will also expand upon its existing collaboration with globally distinguished research institutes and universities, and support the nurturing of future talent in the field of AI, including deep learning and neural processing.</p>
<p>“For the coming age of AI, Samsung is committed to delivering industry-leading advancements brought to life by our NPU technologies,” said Inyup Kang, president of System LSI Business at Samsung Electronics. “As we leverage our differentiated technology, close partnerships with global institutes and active investment in top talent, we are excited to take future AI processing capabilities to the next level.”</p>
<p>Samsung introduced its first NPU in the company’s premium mobile processor, the Exynos 9820, last year and plans to continue offering advanced on-device AI features for high-performance mobile chips. Applications for Samsung’s NPUs will expand into areas such as automotive processors that power in-vehicle infotainment (IVI) and advanced driver assistance systems (ADAS), as well as next-generation datacenters optimized for big data processing.</p>
<p>The System LSI Business and Samsung Advanced Institute of Technology — Samsung’s R&D arm — together plan to extend and evolve the company’s current NPU research into novel AI hardware technologies such as neuromorphic processors that aim to operate at the level of a human brain.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Expands SAIT AI Lab Montreal to Spur AI Research for Next-Generation System Semiconductor</title>
				<link>https://news.samsung.com/global/samsung-electronics-expands-sait-ai-lab-montreal-to-spur-ai-research-for-next-generation-system-semiconductor</link>
				<pubDate>Thu, 02 May 2019 11:00:22 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/05/AI-Lab-in-Montreal_thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[More Stories]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[GANs]]></category>
		<category><![CDATA[Generative Adversarial Networks]]></category>
		<category><![CDATA[Mila]]></category>
		<category><![CDATA[Montreal]]></category>
		<category><![CDATA[Montreal Institute for Learning Algorithms]]></category>
		<category><![CDATA[SAIT]]></category>
		<category><![CDATA[Samsung Advanced Institute of Technology]]></category>
		<category><![CDATA[System Semiconductors]]></category>
                <guid isPermaLink="false">http://bit.ly/2UU7LYH</guid>
									<description><![CDATA[Samsung Electronics today announced the expansion of the ‘Samsung Advanced Institute of Technology (SAIT) artificial intelligence (AI) Lab Montreal’ in Canada. The Lab will help the company strengthen its fundamentals in AI research and drive competitiveness in system semiconductors. The AI Lab is located in Mila – Montreal Institute for Learning Algorithms – in Montreal, […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics today announced the expansion of the ‘Samsung Advanced Institute of Technology (SAIT) artificial intelligence (AI) Lab Montreal’ in Canada. The Lab will help the company strengthen its fundamentals in AI research and drive competitiveness in system semiconductors.</p>
<p>The AI Lab is located in Mila &#8211; the Montreal Institute for Learning Algorithms &#8211; in Montreal, Canada. Founded by Professor Yoshua Bengio at the University of Montreal, Mila is one of the world&#8217;s leading research centers in the field of deep learning and has partnerships with the University of Montreal and McGill University. SAIT AI Lab Montreal has an open workspace with the aim of working closely with the AI research communities in Mila.</p>
<p>SAIT AI Lab Montreal will focus on unsupervised learning and Generative Adversarial Networks (GANs) research to develop disruptive innovation and breakthrough technologies, including new deep learning algorithms and next generation of on-device AI.</p>
<p>To drive the effort, this AI Lab has actively recruited leaders in deep learning research, including Simon Lacoste-Julien, Professor at the University of Montreal, who recently joined as the leader of the lab. In addition, Samsung is planning to dispatch R&D personnel in its Device Solutions Business to Montreal over time and utilize AI Labs as a base for training AI researchers and collaborating with other advanced AI research institutes.</p>
<p>Meanwhile, SAIT AI Lab Montreal continues to build a strong relationship with Yoshua Bengio, one of the world&#8217;s foremost experts on deep learning, machine learning and AI. SAIT and Professor Bengio have collaborated on deep learning algorithm research since 2014, successfully publishing three papers in academic journals.</p>
<p>Professor Yoshua Bengio said, “Samsung’s collaboration with Mila is well established already and has been productive and built strong trust on both sides. With a new SAIT lab in the midst of the recently inaugurated Mila building and many exciting research challenges ahead of us in AI, I expect even more mutually positive outcomes in the future.”</p>
<p>SAIT has actively pursued research collaboration with other top authorities in the field. In addition to Professor Bengio, SAIT has worked with Yann LeCun, Professor at New York University, and Richard Zemel, Professor at the University of Toronto. Yoshua Bengio and Yann LeCun, along with computer scientist Geoffrey Everest Hinton, won the 2018 Turing Award, which is deemed the &#8216;Nobel Prize of computer science.&#8217;</p>
<p>&#8220;SAIT focuses on research and development &#8211; not only in next-generation semiconductors but also in innovative AI as a seed technology for system semiconductors. SAIT AI Lab Montreal will play a key role within Samsung in redefining AI theory and deep learning algorithms for the next 10 years,&#8221; said Sungwoo Hwang, Executive Vice President and Deputy Head of SAIT.</p>
]]></content:encoded>
																				</item>
					<item>
<title>Samsung Electronics to Invest KRW 133 Trillion in Logic Chip Businesses by 2030</title>
				<link>https://news.samsung.com/global/samsung-electronics-to-invest-krw-133-trillion-in-logic-chip-businesses-by-2030</link>
				<pubDate>Wed, 24 Apr 2019 13:00:21 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/04/HwaseongEUVCampus_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Logic Chips]]></category>
		<category><![CDATA[Logic Semiconductors]]></category>
		<category><![CDATA[R&D]]></category>
		<category><![CDATA[Samsung Foundry Business]]></category>
		<category><![CDATA[Samsung System LSI]]></category>
                <guid isPermaLink="false">http://bit.ly/2KZvCqo</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today announced that it will invest KRW 133 trillion by 2030 to strengthen its competitiveness in System LSI and Foundry businesses. The investment plan is expected to help the company to reach its goal of becoming the world leader in not only memory semiconductors but also […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, a world leader in advanced semiconductor technology, today announced that it will invest KRW 133 trillion by 2030 to strengthen its competitiveness in System LSI and Foundry businesses.</p>
<p>The investment plan is expected to help the company to reach its goal of becoming the world leader in not only memory semiconductors but also logic chips by 2030. The company also plans to create 15,000 jobs in R&D and production to bolster its technological prowess.</p>
<p>The investments through 2030 will be composed of KRW 73 trillion for domestic R&D and KRW 60 trillion for production infrastructure. Per this plan, investments in R&D and facilities for logic semiconductors are expected to amount to an average KRW 11 trillion per year until 2030.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>The Exynos 9 Series 9820: Intelligence from Within</title>
				<link>https://news.samsung.com/global/the-exynos-9-series-9820-intelligence-from-within</link>
				<pubDate>Wed, 19 Dec 2018 22:00:25 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/12/exynos-9820-video_thumb728_F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[Exynos]]></category>
		<category><![CDATA[Exynos 9]]></category>
		<category><![CDATA[Exynos 9 Series 9810]]></category>
		<category><![CDATA[Exynos 9 Series 9820]]></category>
		<category><![CDATA[ISP]]></category>
		<category><![CDATA[NPU]]></category>
                <guid isPermaLink="false">http://bit.ly/2ErplyI</guid>
									<description><![CDATA[﻿ * Devices shown and features simulated on devices are fictitious creations. No simulation of any real products or features is intended. Built to maximize intelligence on the go, the Exynos 9 Series 9820 is Samsung’s most innovative mobile processor to date. Featuring advanced artificial intelligence (AI) capabilities, a powerful custom CPU and more, the […]]]></description>
																<content:encoded><![CDATA[<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/2UlQPWZDxBU?rel=0" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"><span style="width: 0px;overflow: hidden;line-height: 0" data-mce-type="bookmark" class="mce_SELRES_start">﻿</span></iframe></div>
<p style="text-align: center"><span style="font-size: small">* Devices shown and features simulated on devices are fictitious creations. No simulation of any real products or features is intended.</span></p>
<p><span>Built to maximize intelligence on the go, the Exynos 9 Series 9820 is Samsung’s most innovative mobile processor to date. Featuring advanced artificial intelligence (AI) capabilities, a powerful custom CPU and more, the processor expands the possibilities of mobile devices and provides the blueprint for future smartphones.</span></p>
<p><span>From photography and multimedia experiences to connectivity, the Exynos 9820 takes every mobile feature to the next level. The processor’s superior performance and efficiency also make multitasking and mobile gaming more seamless than ever before.</span><span></span></p>
<ul>
<li><span>An Intelligent Powerhouse: With a specialized Neural Processing Unit (NPU), the Exynos 9820 can process AI-related functions seven times faster than its predecessor.<sup>1</sup></span></li>
<li><span>Superior Performance: The Exynos 9820&#8217;s 4<sup>th</sup> generation custom CPU and tri-cluster architecture boost the processor&#8217;s single-core and multi-core performance by up to 15 percent and 25 percent, respectively, compared to the Exynos 9810.<sup>1</sup></span></li>
<li><span>Lightning-fast Connectivity: LTE-Advanced Pro modem offers downlink speed of up to 2.0Gbps with 8x carrier aggregation (CA) and uplink speed of up to 316Mbps for seamless browsing, video streaming and online gameplay.<sup>2</sup></span></li>
<li><span>A Game Changer: With wider execution engines, the Exynos 9820’s Mali-G76 MP12 GPU offers up to 40 percent improvement in performance or 35 percent enhancement in power efficiency compared to those of the Exynos 9810.<sup>1</sup></span></li>
<li><span>Photography Reinvented: The Exynos 9820’s advanced image signal processor (ISP) supports flexible multi-camera solutions for high-quality photos.</span><span></span></li>
<li><span>Capture Reality in 8K: The Exynos 9820’s multi-format codec enables recording more details than ever by supporting 8K video encoding and decoding at 30fps.</span></li>
<li><span>Rock-solid Security: Physically unclonable function (PUF) provides the ultimate security solution for storing and managing personal data.</span></li>
<li><span>All-day Productivity: The Exynos 9820 delivers round-the-clock mobile productivity by enhancing battery life with an innovative low-power design.</span></li>
</ul>
<p><span>For more information about the Exynos 9820, head over to the webpage <a href="https://www.samsung.com/semiconductor/minisite/exynos/products/mobileprocessor/exynos-9-series-9820/" target="_blank" rel="noopener">here</a>.</span></p>
<p><em><span style="font-size: small"><sup>1</sup>Tested internally on the Exynos 9820 and the Exynos 9810.</span></em></p>
<p><em><span style="font-size: small"><sup>2</sup>Actual speed may differ by country and carrier.</span></em></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Brings On-device AI Processing for Premium Mobile Devices with Exynos 9 Series 9820 Processor</title>
				<link>https://news.samsung.com/global/samsung-brings-on-device-ai-processing-for-premium-mobile-devices-with-exynos-9-series-9820-processor</link>
				<pubDate>Wed, 14 Nov 2018 11:00:29 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/11/Exynos-9-9820_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Exynos]]></category>
		<category><![CDATA[Exynos 9]]></category>
		<category><![CDATA[Exynos 9 Series 9820]]></category>
		<category><![CDATA[NPU]]></category>
                <guid isPermaLink="false">http://bit.ly/2Psd6Zy</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today announced its latest premium application processor (AP), the Exynos 9 Series 9820, equipped for on-device Artificial Intelligence (AI) applications. The Exynos 9820 features a fourth-generation custom CPU, 2.0-gigabits-per-second (Gbps) LTE Advanced Pro modem, and an enhanced neural processing unit (NPU) to bring new smart experiences […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-106433" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/11/Exynos-9-9820_main.jpg" alt="" width="1000" height="500" /></p>
<p>Samsung Electronics, a world leader in advanced semiconductor technology, today announced its latest premium application processor (AP), the Exynos 9 Series 9820, equipped for on-device Artificial Intelligence (AI) applications. The Exynos 9820 features a fourth-generation custom CPU, 2.0-gigabits-per-second (Gbps) LTE Advanced Pro modem, and an enhanced neural processing unit (NPU) to bring new smart experiences to mobile devices.</p>
<p>“As AI-related services expand and their utilization diversify in mobile devices, their processors require higher computational capabilities and efficiency,” said Ben Hur, vice president of System LSI marketing at Samsung Electronics. “The AI capabilities in the Exynos 9 Series 9820 will provide a new dimension of performance in smart devices through an integrated NPU, high-performance fourth-generation custom CPU core, 2.0Gbps LTE modem and improved multimedia performance.”</p>
<p>The Exynos 9820 is an intelligent powerhouse with a separate hardware AI-accelerator, or NPU, which performs AI tasks around seven times faster than the predecessor. With the NPU, AI-related processing can be carried out directly on the device rather than sending the task to a server, providing faster performance as well as better security of personal information. The NPU will enable a variety of new experiences such as instantly adjusting camera settings for a shot based on the surroundings or recognizing objects to provide information in augmented or virtual reality (AR or VR) settings.</p>
<p>With an enhanced architecture design, the Exynos 9820’s new fourth-generation custom core delivers around a 20-percent improvement in single-core performance or a 40-percent improvement in power efficiency compared to its predecessor, so devices can load data and switch between apps much faster. Multi-core performance is also increased by around 15 percent. The new mobile processor embeds the latest Mali-G76 GPU cores, which deliver a 40-percent performance boost or 35-percent power savings, allowing longer play time for graphics-intensive mobile games or interactive AR applications.</p>
<p>The LTE-Advanced Pro modem in the Exynos 9820 brings fast mobile broadband speeds of 2.0Gbps downlink with 8x carrier aggregation (CA) and 316 megabits-per-second (Mbps) uplink. At the 2.0Gbps downlink speed, a full-HD movie (3.7GB) can be downloaded in about 15 seconds, and massively multiplayer online games (MMOG) can be played with less lag. With more aggregation of carriers than the previous solution, along with 4×4 MIMO (Multiple-Input, Multiple-Output), a 256-QAM (Quadrature Amplitude Modulation) scheme and enhanced Licensed-Assisted Access (eLAA) technology, the modem delivers stable yet blazing speed on the go.</p>
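<p>The download-time figure follows directly from the quoted numbers; a quick sketch of the arithmetic (illustrative only, ignoring protocol overhead and assuming the full peak rate is sustained):</p>

```python
# Sanity-check the "about 15 seconds" claim from the release (illustrative).
movie_gb = 3.7                            # full-HD movie size in gigabytes
downlink_gbps = 2.0                       # peak LTE downlink in gigabits per second
seconds = movie_gb * 8 / downlink_gbps    # 8 bits per byte -> transfer time in seconds
print(f"{seconds:.1f} s")                 # ~14.8 s, i.e. "about 15 seconds"
```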
<p>For more immersive multimedia experiences, Exynos 9820’s multi-format codec (MFC) supports encoding and decoding of 4K UHD video at 150 frames per second (fps). The MFC also renders colors in 10-bit, delivering more accurate representation of color by offering a wider range of tones and hues.</p>
<p>The Exynos 9 Series 9820 is expected to be in mass production by the end of this year.</p>
<p>For more information about Samsung’s Exynos products, please visit <span><a href="http://www.samsung.com/exynos" target="_blank" rel="noopener">http://www.samsung.com/exynos</a>.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Announces Industry’s First 8Gb LPDDR5 DRAM for 5G and AI-powered Mobile Applications</title>
				<link>https://news.samsung.com/global/samsung-electronics-announces-industrys-first-8gb-lpddr5-dram-for-5g-and-ai-powered-mobile-applications</link>
				<pubDate>Tue, 17 Jul 2018 08:00:42 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/07/Samsung-LPDDR5-DRAM_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[5G]]></category>
		<category><![CDATA[8Gb LPDDR5]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[LPDDR5 DRAM]]></category>
		<category><![CDATA[UHD]]></category>
                <guid isPermaLink="false">http://bit.ly/2L3tvAK</guid>
									<description><![CDATA[Samsung Electronics, the world leader in advanced memory technology, today announced that it has successfully developed the industry’s first 10-nanometer (nm) class* 8-gigabit (Gb) LPDDR5 DRAM. Since bringing the first 8Gb LPDDR4 to mass production in 2014, Samsung has been setting the stage to transition to the LPDDR5 standard for use in upcoming 5G and Artificial […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-102586" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/07/Samsung-LPDDR5-DRAM_main_1.jpg" alt="" width="705" height="417" /></p>
<p>Samsung Electronics, the world leader in advanced memory technology, today announced that it has successfully developed the industry’s first 10-nanometer (nm) class* 8-gigabit (Gb) LPDDR5 DRAM. Since bringing the first 8Gb LPDDR4 to mass production in 2014, Samsung has been setting the stage to transition to the LPDDR5 standard for use in upcoming 5G and Artificial Intelligence (AI)-powered mobile applications.</p>
<p>The newly developed 8Gb LPDDR5 is the latest addition to Samsung’s premium DRAM lineup, which includes 10nm-class 16Gb GDDR6 DRAM (in volume production since December 2017) and 16Gb DDR5 DRAM (developed in February).</p>
<p>“This development of 8Gb LPDDR5 represents a major step forward for low-power mobile memory solutions,” said Jinman Han, senior vice president of Memory Product Planning & Application Engineering at Samsung Electronics. “We will continue to expand our next-generation 10nm-class DRAM lineup as we accelerate the move toward greater use of premium memory across the global landscape.”</p>
<p>The 8Gb LPDDR5 boasts a data rate of up to 6,400 megabits per second (Mb/s), which is 1.5 times as fast as the mobile DRAM chips used in current flagship mobile devices (LPDDR4X, 4266Mb/s). With the increased transfer rate, the new LPDDR5 can send 51.2 gigabytes (GB) of data, or approximately 14 full-HD video files (3.7GB each), in a second.</p>
<p>The 10nm-class LPDDR5 DRAM will be available in two bandwidths – 6,400Mb/s at a 1.1 operating voltage (V) and 5,500Mb/s at 1.05V – making it the most versatile mobile memory solution for next-generation smartphones and automotive systems. This performance advancement has been made possible through several architectural enhancements. By doubling the number of memory “banks” – subdivisions within a DRAM cell – from eight to 16, the new memory can attain a much higher speed while reducing power consumption. The 8Gb LPDDR5 also makes use of a highly advanced, speed-optimized circuit architecture that verifies and ensures the chip’s ultra-high-speed performance.</p>
<p>To maximize power savings, the 10nm-class LPDDR5 has been engineered to lower its voltage in accordance with the operating speed of the corresponding application processor when in active mode. It also has been configured to avoid overwriting cells with ‘0’ values. In addition, the new LPDDR5 chip will offer a ‘deep sleep mode’, which cuts power usage to approximately half of that in the ‘idle mode’ of the current LPDDR4X DRAM. Thanks to these low-power features, the 8Gb LPDDR5 DRAM will deliver power consumption reductions of up to 30 percent, maximizing mobile device performance and extending the battery life of smartphones.</p>
<p>Based on its industry-leading bandwidth and power efficiency, the LPDDR5 will be able to power AI and machine learning applications, and will be UHD-compatible for mobile devices worldwide.</p>
<p>Samsung, together with leading global chip vendors, has completed functional testing and validation of a prototype 8GB LPDDR5 DRAM package, which is composed of eight 8Gb LPDDR5 chips. Leveraging the cutting-edge manufacturing infrastructure at its latest line in Pyeongtaek, Korea, Samsung plans to begin mass production of its next-generation DRAM lineups (LPDDR5, DDR5 and GDDR6) in line with the demands of global customers.</p>
<p><span style="font-size: small"><em>*Editor’s Note: 10nm-class is a process node between 10 and 20 nanometers</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Optimizes Premium Exynos 9 Series 9810 for AI Applications and Richer Multimedia Content</title>
				<link>https://news.samsung.com/global/samsung-optimizes-premium-exynos-9-series-9810-for-ai-applications-and-richer-multimedia-content</link>
				<pubDate>Thu, 04 Jan 2018 11:00:40 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/01/Exynos-9810_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[CES 2018]]></category>
		<category><![CDATA[Exynos]]></category>
		<category><![CDATA[Exynos 9]]></category>
		<category><![CDATA[Exynos 9 Series 9810]]></category>
		<category><![CDATA[Exynos 9810]]></category>
                <guid isPermaLink="false">http://bit.ly/2DTXXpK</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today announced the launch of its latest premium application processor (AP), the Exynos 9 Series 9810. The Exynos 9810, built on Samsung’s second-generation 10-nanometer (nm) FinFET process, brings the next level of performance to smartphones and smart devices with its powerful third-generation custom CPU, faster gigabit […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics, a world leader in advanced semiconductor technology, today announced the launch of its latest premium application processor (AP), the Exynos 9 Series 9810. The Exynos 9810, built on Samsung’s second-generation 10-nanometer (nm) FinFET process, brings the next level of performance to smartphones and smart devices with its powerful third-generation custom CPU, faster gigabit LTE modem and sophisticated image processing with deep learning-based software.</p>
<p>In recognition of its innovation and technological advancements, Samsung’s Exynos 9 Series 9810 has been selected as a CES 2018 Innovation Awards HONOREE in the Embedded Technologies product category and will be displayed at the event, which runs January 9-12, 2018, in Las Vegas, USA.</p>
<p>“The Exynos 9 Series 9810 is our most innovative mobile processor yet, with our third-generation custom CPU, ultra-fast gigabit LTE modem and deep learning-enhanced image processing,” said Ben Hur, vice president of System LSI marketing at Samsung Electronics. “The Exynos 9810 will be a key catalyst for innovation in smart platforms such as smartphones and personal computing for the coming AI era.”</p>
<p>With the benefits of the industry’s most advanced 10nm process technology, the Exynos 9810 will enable seamless multi-tasking with faster loading and transition times between the latest mobile apps. The processor has a brand new eight-core CPU under its hood, four of which are powerful third-generation custom cores that can reach 2.9 gigahertz (GHz), with the other four optimized for efficiency. With an architecture that widens the pipeline and improves cache memory, single-core performance is enhanced two-fold and multi-core performance is increased by around 40 percent compared to its predecessor.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-96779" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/01/Exynos-9810_main_1.jpg" alt="" width="705" height="370" /></p>
<p>Exynos 9810 introduces sophisticated features to enhance user experiences with neural network-based deep learning and stronger security on the most advanced mobile devices. This cutting-edge technology allows the processor to accurately recognize people or items in photos for fast image searching or categorization, or through depth sensing, scan a user’s face in 3D for hybrid face detection. By utilizing both hardware and software, hybrid face detection enables realistic face-tracking filters as well as stronger security when unlocking a device with one’s face. For added security, the processor has a separate security processing unit to safeguard vital personal data such as facial, iris and fingerprint information.</p>
<p>The LTE modem in the Exynos 9810 makes it much easier to broadcast or stream videos at up to UHD resolution, or in even newer visual formats such as 360-degree video. Following the successful launch of the industry’s first 1.0 gigabits per second (Gbps) LTE modem last year, Samsung again leads the industry with the first 1.2Gbps LTE modem embedded in Exynos 9810. It’s also the industry’s first Cat.18 LTE modem to support up to 6x carrier aggregation (CA) for 1.2Gbps downlink and 200 megabits per second (Mbps) uplink. Compared to its predecessor’s 5CA, this new modem delivers more stable data transfers at blazing speed. To maximize the transfer rate, the modem supports a 4×4 MIMO (Multiple-Input, Multiple-Output) and 256-QAM (Quadrature Amplitude Modulation) scheme, and utilizes enhanced Licensed-Assisted Access (eLAA) technology.</p>
<p>Not only will multimedia experiences on mobile devices with the Exynos 9810 be faster, but they will also be more immersive, thanks to dedicated image processing and an upgraded multi-format codec (MFC). With faster and more energy-efficient image and visual processing, users will see advanced stabilization for images and video of up to UHD resolution, real-time out-of-focus photography in high resolution, and brighter pictures in low light with reduced noise and motion blur. The upgraded MFC supports video recording and playback at up to UHD resolution at 120 frames per second (fps). With 10-bit HEVC (high efficiency video coding) and VP9 support, the MFC can render 1,024 different tones for each primary color (red, green and blue). This translates to a vast 1.07 billion possible colors, or 64 times the previous 8-bit color format’s 16.7 million. With a much wider color range and more accurate color fidelity, users will be able to create and enjoy highly immersive content.</p>
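<p>The color-depth figures follow from the bit depths alone; a short illustrative check:</p>

```python
# Verify the color-depth arithmetic: tones per channel, total colors, and the ratio.
tones_10bit = 2 ** 10             # 1,024 tones per primary color at 10-bit depth
colors_10bit = tones_10bit ** 3   # 1,073,741,824 colors (~1.07 billion) across R, G, B
colors_8bit = (2 ** 8) ** 3       # 16,777,216 colors (~16.7 million) at 8-bit depth
ratio = colors_10bit // colors_8bit  # 64x more colors
```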
<p>The Exynos 9 Series 9810 is currently in mass production.</p>
<p>For more information about Samsung’s Exynos products, please visit <a href="http://www.samsung.com/exynos" target="_blank" rel="noopener">http://www.samsung.com/exynos</a>.</p>
]]></content:encoded>
																				</item>
			</channel>
</rss>