<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet title="XSL_formatting" type="text/xsl" href="https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss.xsl"?><rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wfw="http://wellformedweb.org/CommentAPI/"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
     xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/"
	>
	<channel>
		<title>Samsung R&amp;D &#8211; Samsung Global Newsroom</title>
		<atom:link href="https://news.samsung.com/global/tag/samsung-rd/feed" rel="self" type="application/rss+xml" />
		<link>https://news.samsung.com/global</link>
        <image>
            <url>https://img.global.news.samsung.com/image/newlogo/logo_samsung-newsroom.png</url>
            <title>Samsung R&amp;D &#8211; Samsung Global Newsroom</title>
            <link>https://news.samsung.com/global</link>
        </image>
        <currentYear>2022</currentYear>
        <cssFile>https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss_xsl.css</cssFile>
		<description>What's New on Samsung Newsroom</description>
		<lastBuildDate>Wed, 08 Apr 2026 21:00:00 +0000</lastBuildDate>
		<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
					<item>
				<title>How Samsung Research and MIT’s Hackathon Helped Students Innovate for the Future</title>
				<link>https://news.samsung.com/global/how-samsung-research-and-mits-hackathon-helped-students-innovate-for-the-future</link>
				<pubDate>Wed, 12 Oct 2022 11:00:43 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_Thumb728_FF.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[Experience and Insight Lab]]></category>
		<category><![CDATA[Hackathon]]></category>
		<category><![CDATA[MIT]]></category>
		<category><![CDATA[Samsung R&D]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/3RMxZK5</guid>
									<description><![CDATA[To reimagine the future of health and wellness, Samsung Research’s Experience and Insight Lab (E&I Lab) co-hosted a hackathon with the MIT Media Lab, bringing together a wide variety of student talents and abilities to innovate for the future of wellbeing. The three-day event brought together 12 teams of engineering, design and art students from […]]]></description>
																<content:encoded><![CDATA[<p>To reimagine the future of health and wellness, Samsung Research’s Experience and Insight Lab (E&I Lab) co-hosted a hackathon with the MIT Media Lab, bringing together a wide variety of student talents and abilities to innovate for the future of wellbeing.</p>
<p><img class="alignnone wp-image-136910 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_main1-1_FF.jpg" alt="" width="1000" height="563" /></p>
<p>The three-day event brought together 12 teams of engineering, design and art students from MIT, Harvard and Wellesley College. Each team was challenged to create a novel projection concept and a working prototype that addressed real problems in the lifestyle and wellness space.</p>
<p><img class="alignnone wp-image-136850 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_main2F.jpg" alt="" width="1000" height="279" /></p>
<p>Following in the spirit of William J. Mitchell, the former dean of MIT’s School of Architecture and Planning, the hackathon utilized an interdisciplinary approach, enabling students to utilize a variety of skills from different fields.</p>
<h3><span style="color: #000080"><strong>A Collaborative Effort Between Students and Samsung</strong></span></h3>
<p><img class="alignnone wp-image-136851 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_main3F.jpg" alt="" width="1000" height="355" /></p>
<p>Samsung Research, Samsung Electronics’ advanced R&D hub, leads the development of future technologies for the company’s Device eXperience (DX) Division and is constantly challenging the status quo and pushing boundaries. Based on this philosophy, the hackathon challenged students to push themselves by breaking down the walls between various disciplines. The hackathon also allowed Samsung to connect with students by providing new opportunities for them to ideate, innovate and bring their ideas to life.</p>
<p>Younger generations of students have fresh perspectives and creative ideas on how to solve problems facing society. Taking inspiration from this, Samsung Research hopes to foster an environment where diversity and creativity can thrive, leading to great and sometimes unexpected solutions.</p>
<p>The hackathon benefited students and Samsung alike. Students learned more about Samsung Research by using the company’s resources and facilities, building relationships with others along the way. Samsung, in turn, was able to attract talented students, further solidifying its position as a global company.</p>
<h3><span style="color: #000080"><strong>Expanding the Future of Wellness Through Student Innovation</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-136843" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_main4.jpg" alt="" width="1000" height="563" /></p>
<p><img loading="lazy" class="alignnone size-full wp-image-136844" src="https://img.global.news.samsung.com/global/wp-content/uploads/2022/10/MIT-Hackathon_main5.jpg" alt="" width="1000" height="563" /></p>
<p>After a long weekend of hacking, students shared their concepts and prototypes with E&I designers, MIT student mentors and Samsung executives, including Dr. Sebastian Seung, President and Head of Samsung Research, and Dr. Federico Casalegno, EVP and Head of the E&I Lab. Teams were judged on their data and research, prototypes, story, impact and overall uniqueness.</p>
<p>Through a variety of angles and thought-provoking ideas, each team created a unique solution to address problems in the lifestyle space. From utilizing projection to connect people through storytelling to creating virtual cooking experiences, many inspirational and innovative ideas were displayed at the hackathon.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Editorial] How the E&I Lab Within Samsung Research Is Redefining the Future</title>
				<link>https://news.samsung.com/global/editorial-how-the-ei-lab-within-samsung-research-is-redefining-the-future</link>
				<pubDate>Fri, 11 Jun 2021 11:00:53 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI_Lab_Editorial_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Editorials]]></category>
		<category><![CDATA[More Stories]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[E&I Lab]]></category>
		<category><![CDATA[Experience and Insight Lab]]></category>
		<category><![CDATA[Samsung R&D]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/3w9lgHq</guid>
									<description><![CDATA[The world has changed so much since the beginning of 2020. However, even before the COVID-19 pandemic, Samsung was heading down a path in which its technological leadership was creating great opportunities not only to invent the next generation of devices, but to use technology to make people’s lives better. We were beginning to make […]]]></description>
																<content:encoded><![CDATA[<p>The world has changed so much since the beginning of 2020. However, even before the COVID-19 pandemic, Samsung was heading down a path in which its technological leadership was creating great opportunities not only to invent the next generation of devices, but to use technology to make people’s lives better. We were beginning to make progress on redefining the future — a future that looks quite different now than it did when we imagined it in 2019.</p>
<p>With global studios in Seoul, Korea, San Francisco, U.S. and Milan, Italy, the Experience and Insight Lab (E&I Lab) – a new division of <a href="https://research.samsung.com/" target="_blank" rel="noopener">Samsung Research</a>, an advanced R&D hub that leads the development of future technologies for Samsung’s consumer product business – was created to help bring that future to fruition.</p>
<div id="attachment_125006" style="width: 855px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-125006" class="wp-image-125006 size-medium" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI-Lab_main1-e1623311407171.jpg" alt="" width="845" height="563" /><p id="caption-attachment-125006" class="wp-caption-text"><strong>△ Federico Casalegno, Senior Vice President and Head of the Experience and Insight (E&I) Lab </strong></p></div>
<h3><strong>Examining Experiences and Why They Matter</strong></h3>
<p>New technologies are offering consumers increasingly sophisticated experiences, but in some cases they can also overwhelm users with their complexity. This is important to note given that in the near future, it is predicted that we will average more than 20 connected devices per person and almost 40 per household.<sup>1</sup> This is why we must focus on experiences first, rather than on products themselves.</p>
<p>I define an experience as the journey customers take with products and solutions, which includes the emotional journey that they take over time. The experiences are not about the technology, which is a component of the experience, but about how customers holistically interact with solutions (including products, services and spaces) over time.</p>
<p>These journeys of holistic interactions are precisely what the E&I Lab has been tasked with exploring. Focusing our efforts on examining those journeys will help ensure that the experiences we develop better serve people’s needs. It will also enable us to make users’ experiences with their products and services both smoother and more satisfying.</p>
<p>Having a meal at home provides a simple way to understand what we mean by experiences in this context. The memories we make while enjoying great food with family and friends can be some of the finest experiences of our lives. We remember our dinner experience and the laughs and warm atmosphere with friends and family, but our memories don’t focus on the app we may have used to pay for the meal, or the devices we used to whip it up. While we can focus on designing the best products, supported by cutting-edge technologies, we need to remember that those products are not the experience in and of themselves; they simply help enable that experience. People don’t just buy products; they buy the experiences that those products enable.</p>
<p>Transaction-centric companies tend to design technologies first and then look for ways to sell them to people. An experience-driven company must be willing to forego creating a new technology, or even abandon one it has created, no matter how impressive it may be, if people don’t need it to enable the desired experiences.</p>
<p>To make this difference between using a technology and enjoying an experience even clearer, I’d like you to consider the step counter on your smartphone or wearable device. From an engineering perspective, we tend to think about step counters in terms of how they work and how accurate they are. To be sure, achieving precision and accuracy are difficult and certainly important. But to users, those factors are in the background; ideally, they are invisible. When we adopt a human-centric design perspective, we consider what counting steps means to users – that is, how the experience adds to their lives beyond simply counting steps. We ask, “What does step-counting technology empower users to do? What kinds of things – tech-related or not – matter to people who count their steps?”</p>
<p>Our approach to design is unique in that it aims to bridge the gap between great technological advances, design, and human needs. Drawing from a deep understanding of people’s lifestyles, our approach places users front and center at the earliest stages of the innovation process. We examine Samsung Research technologies through a user-centric lens, and we use the insights we acquire to develop solutions that streamline users’ lives with meaningful experiences.</p>
<div id="attachment_125007" style="width: 855px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-125007" class="wp-image-125007 size-medium" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI-Lab_main2-e1623311420342.jpg" alt="" width="845" height="563" /><p id="caption-attachment-125007" class="wp-caption-text"><strong>△ Members of Samsung Research’s Experience and Insight (E&I) Lab</strong></p></div>
<h3><strong>Introducing a New Samsung Research Lab</strong></h3>
<p>The E&I Lab’s mission is to help the company better understand people’s needs, aspirations and lifestyles in order to craft meaningful experiences through innovative products and cutting-edge technologies. We do this by monitoring emerging trends and cultural shifts and by discovering social and technological changes, using these insights to design, prototype, deploy, learn and deliver.</p>
<p>Our goal is to work with and learn from consumers to explore what technologies should do rather than what they could do. We focus on innovation from the perspective of humans and humanity, and we bring that perspective to bear in Samsung Research’s technologies.</p>
<p>The E&I Lab and its multidisciplinary team’s new approach to research and development is centered on creating experiences for people. The “insight” part of our lab’s name refers to how we identify customer needs based on a deep understanding of people’s values, behaviors, and lifestyles, and develop insights so we can design solutions that satisfy those needs.</p>
<p>Factors once considered unrelated to design suddenly become critical when taking a human-centric approach to technological design. These include not just socioeconomic trends and issues, but also multiculturalism, gender inclusion, and a host of other considerations. Central to our concept of design are the beliefs that we must always account for the fact that we live in a complex ecosystem with people of various cultures and views, and that our planet and environment must always be taken into consideration.</p>
<p>The best technology is created based on a deep understanding of why humans need technology in the first place. Our lab seeks to promote a new way of thinking – one based on the idea that people do not adopt products or services based merely on specs or features. For example, people don’t want a camera per se; they may want it because they love storytelling, and they want the experience of capturing and sharing moments that a camera provides.</p>
<p><img loading="lazy" class="alignnone size-medium wp-image-125008" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI-Lab_main3-e1623311452371.jpg" alt="" width="1000" height="347" /></p>
<p>We recognize that this is a new way of thinking, one that celebrates our passion for great technology while at the same time confronting the reality that technology often turns out to be little more than perfection without purpose. That happens when we fail to put human progress at the very core of our objectives. This kind of progress comes from the synergy between technological advances and human-centric design, which can unlock infinite possibilities in terms of offering people delightful experiences.</p>
<p>We aim to achieve that synergy by leveraging Samsung’s incredible engineering capabilities and what we bring to the table as designers, having adopted this focus on experiences and people’s wants and needs. That synergy will establish Samsung as the leader in an experience-driven world – one that genuinely learns from and works for consumers.</p>
<p><img loading="lazy" class="alignnone size-medium wp-image-125009" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI-Lab_main4-e1623311482243.jpg" alt="" width="845" height="563" /></p>
<h3><strong>Our Approach</strong></h3>
<p>What does the E&I Lab’s human-centric approach to design look like? Once we have a clear idea of the people for whom we are designing and the meaningful experiences that our technologies can enable, we set out to create whatever is called for.</p>
<p>Our work requires rapid prototyping and deployment, which enable us to make an idea tangible. We go beyond the traditional prototyping of things and instead prototype “problems” – that is, the questions we are trying to answer, or the needs and desires for experiences we are trying to fulfill.</p>
<p>Designers are problem solvers, and good designs begin by first identifying the problems we need to solve. This makes it possible to identify what people genuinely want. Only when we believe we’ve found the problem should we start prototyping and engineering any sort of solution, including a space, an interaction, or even a story. After all, a prototype is all about what a design could be, not necessarily what it should or will be.</p>
<p>In essence, this sort of prototyping requires us to codesign with the people for whom we innovate. It encourages feedback early and often, and it offers us opportunities to learn and improve. Our human-centric approach deliberately breaks the boundaries of established disciplines to escape the confines of technology-driven design. It is multidisciplinary because humanity transcends types, and because the experiences we want to enable are complex and holistic.</p>
<p><img loading="lazy" class="alignnone size-medium wp-image-125010" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/EI-Lab_main5-e1623311497981.jpg" alt="" width="915" height="563" /></p>
<h3><strong>It’s Time to Rethink How We Design Experiences</strong></h3>
<p>The world has changed. Today, you can’t lead technologically if you aren’t also leading culturally, with purpose and humanity in mind. The E&I Lab’s perspective is based on a belief that a tech company can be considered a leader only if it is human- and humanity-centric.</p>
<p>This has tremendous implications for design, as I’ve attempted to explain here. If the COVID-19 pandemic has shown us anything, it’s that people don’t need better technologies per se, and they certainly don’t need more bells and whistles. They need technologies that help them live better and enable them to achieve their dreams. That is why the E&I Lab wants to redefine experiences in design from a technological standpoint.</p>
<p>Design plays a distinct role in advancing knowledge and creating benefits for humanity. It sparks and complements innovation in the arts, sciences, and technology. Now more than ever, successful designs must be defined by the degree to which they reflect a deeper understanding of humanity’s social and environmental needs.</p>
<p>The E&I Lab breaks boundaries by taking a big-picture view and identifying designs, as opposed to technologies, that the market may be missing. Our process combines problem-solving with critical reflection, and engages with people from the very beginning in order to produce technologies that fulfill their needs and goals rather than our own.</p>
<p>We look forward to harmonizing the E&I Lab’s goals with the work of Samsung engineers around the globe.</p>
<p><em><span style="font-size: small"><sup>1</sup> Brent Heslop, “By 2030, Each Person Will Own 15 Connected Devices – Here’s What That Means for Your Business and Content,” MarTech Advisor, March 4, 2019</span></em></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Researchers Prove the Viability of Commercialized ‘Stretchable’ Devices</title>
				<link>https://news.samsung.com/global/samsung-researchers-prove-the-viability-of-commercialized-stretchable-devices</link>
				<pubDate>Sun, 06 Jun 2021 09:00:58 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[Photoplethysmography Sensor]]></category>
		<category><![CDATA[PPG]]></category>
		<category><![CDATA[SAIT]]></category>
		<category><![CDATA[Samsung Advanced Institute of Technology]]></category>
		<category><![CDATA[Samsung R&D]]></category>
		<category><![CDATA[Stretchable Devices]]></category>
		<category><![CDATA[Stretchable Display]]></category>
		<category><![CDATA[Stretchable OLED]]></category>
		<category><![CDATA[Stretchable OLED display]]></category>
                <guid isPermaLink="false">https://bit.ly/3vR0xrQ</guid>
									<description><![CDATA[With the establishment of flexible displays behind us, many have asked what the next big development in display technology will be. Recently, free-form displays have been gaining traction as next-generation technology that will allow for both high-resolution visuals and portability at the same time. While the technology is still in its nascent stages, significant research […]]]></description>
																<content:encoded><![CDATA[<p>With the establishment of flexible displays behind us, many have asked what the next big development in display technology will be.</p>
<p>Recently, free-form displays<sup>1</sup> have been gaining traction as next-generation technology that will allow for both high-resolution visuals and portability at the same time. While the technology is still in its nascent stages, significant research has been carried out on stretchable displays (a core technology for free-form displays) that can be stretched in all directions like rubber bands to change their shapes.</p>
<h3><span style="color: #000080"><strong>Stretchable Display and Sensor Breakthroughs</strong></span></h3>
<p>On June 4, researchers at the Samsung Advanced Institute of Technology (SAIT), Samsung’s R&D hub dedicated to cutting-edge future technologies, published research<sup>2</sup> in the world-renowned journal ‘Science Advances’ about a technology that overcomes the limitations of stretchable devices.</p>
<p>Through this study, stable performance in a stretchable device with high elongation was achieved. This research was also the first in the industry to prove the commercialization potential of stretchable devices, given that the technology is capable of being integrated with existing semiconductor processes.</p>
<p>The SAIT research team was able to integrate a stretchable organic LED (OLED) display and a photoplethysmography (PPG) sensor in a single device to measure and display the user’s heart rate in real-time, thus creating the ‘stretchable electronic skin’ form factor. The success of this test case proves the feasibility of expanding the technology to further applications. This research is expected to increase the uptake of stretchable devices in the future.</p>
<div id="attachment_124831" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124831" class="wp-image-124831 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main1.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-124831" class="wp-caption-text">▲ The SAIT research team that demonstrated the viability of stretchable devices: SAIT Organic Material Lab principal researcher Jong Won Chung (co-first author), principal researcher Youngjun Yun (corresponding author) and staff researcher Yeongjun Lee (co-first author)</p></div>
<h3><span style="color: #000080"><strong>OLED ‘Skin’ Display That Can Be Stretched by Up to 30%</strong></span></h3>
<p>One of the biggest achievements of this research was that the team was able to modify the composition and structure of ‘elastomer’, a polymer compound with excellent elasticity and resilience, and use existing semiconductor manufacturing processes to apply it to the substrates of stretchable OLED displays and optical blood flow sensors for the first time in the industry. The team was then able to confirm that the sensor and display continued to operate normally and did not exhibit any performance degradation with elongation of up to 30%.</p>
<div id="attachment_124832" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124832" class="wp-image-124832 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main2.jpg" alt="" width="1000" height="275" /><p id="caption-attachment-124832" class="wp-caption-text">▲SAIT Proto System</p></div>
<p>To put their research to the test, the SAIT researchers attached stretchable PPG heart rate sensors and OLED display systems to the inner wrist near the radial artery.<sup>3</sup> Doing this allowed them to confirm that wrist movement did not cause any property deterioration, with the solution remaining reliable with skin elongation of up to 30%. This test also confirmed that the sensor and OLED display continued to work stably even after being stretched 1,000 times. What’s more, when measuring signals from a moving wrist, the sensor was found to pick up a heartbeat signal that was 2.4 times stronger than would be picked up by a fixed silicon sensor.</p>
<p>“The strength of this technology is that it allows you to measure your biometric data for a longer period without having to remove the solution when you sleep or exercise, since the patch feels like part of your skin. You can also check your biometric data right away on the screen without having to transfer it to an external device,” explained principal researcher Youngjun Yun, corresponding author of the paper. “The technology can also be expanded to use in wearable healthcare products for adults, children and infants, as well as patients with certain diseases.”</p>
<div id="attachment_124833" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124833" class="wp-image-124833 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main3.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-124833" class="wp-caption-text">▲ Youngjun Yun, Principal researcher and corresponding author</p></div>
<h3><span style="color: #000080"><strong>Overcoming Technical Challenges With Stretchable Materials and Structure</strong></span></h3>
<p>Implementing stretchable display technology proves difficult because usually when a display is stretched or its shape is manipulated, the device either breaks or its performance deteriorates. In order to overcome this problem, all of the materials and elements, including the substrate, electrode, thin film transistor, emission material layer and sensor, must have physical stretchability as well as the ability to maintain their electrical properties.</p>
<p>Thus, the SAIT researchers replaced the plastic material used in existing stretchable displays with elastomer. The system developed by the SAIT team is the first in the sector to implement a display and sensor using photolithography processes that enable micro-patterning and large-area processing.</p>
<p>Elastomer is an advanced material with high elasticity and resilience, but is limited in its capacity to be applied to existing semiconductor processes because it is vulnerable to heat. To mitigate this, the team strengthened the material’s thermal resistance by tailoring its molecular composition. They also chemically integrated certain molecule chains in order to establish a resistance to the materials used in semiconductor processes.</p>
<p>“We applied an ‘island’ structure to alleviate the stress<sup>4</sup> caused by elongation,” said staff researcher Yeongjun Lee, co-first author of the paper. “More stress was induced in the area of the elastomer, which has a relatively low elasticity coefficient<sup>5</sup> and is thus more likely to become deformed. This allowed us to minimize the stress sustained by the OLED pixel area, which is more vulnerable to such pressure. We applied a stretchable electrode material (cracked metal) that resists deformation to the elastomer area, and this allowed the spaces and wiring electrodes between the pixels to stretch and shrink without the OLED pixels themselves becoming deformed.”</p>
<div id="attachment_124834" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124834" class="wp-image-124834 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main4.jpg" alt="" width="1000" height="232" /><p id="caption-attachment-124834" class="wp-caption-text">▲ OLED and cracked metal electrodes in an island structure</p></div>
<div id="attachment_124835" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124835" class="wp-image-124835 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main5.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-124835" class="wp-caption-text">▲ Yeongjun Lee, staff researcher and co-first author</p></div>
<h3><span style="color: #000080"><strong>Commercialization and Expanded Applications</strong></span></h3>
<p>The stretchable sensor was made in a way that makes continuous heartbeat measurements possible with a high degree of sensitivity compared to existing fixed wearable sensors. The solution does this by facilitating close adhesion to the skin, which minimizes performance inconsistencies that can be caused by movement.<sup>6</sup></p>
<p>The stretchable sensor and OLED display developed by the SAIT team were brought about by overcoming limitations in existing device performance and operational processes, including those of current stretchable materials. The work done by the SAIT team is particularly significant in that it has secured chemical and heat resistance for the elastomer material, thereby making commercialization of stretchable devices with high resolution and large screens more likely in the future.</p>
<p>“Our research is still in the early stages, but our goal is to realize and commercialize stretchable devices by increasing system resolution, stretchability, and measurement accuracy to a level that makes mass production possible,” said principal researcher Jong Won Chung, co-first author of the paper. “In addition to the heartbeat sensor that was applied in this test case, we plan to incorporate stretchable sensors and high-resolution freeform displays to enable users to monitor things like peripheral oxygen saturation, electromyogram readings and blood pressure.”</p>
<div id="attachment_124836" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-124836" class="wp-image-124836 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/06/Stretchable_OLED_main6.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-124836" class="wp-caption-text">▲ Jong Won Chung, principal researcher and co-first author</p></div>
<p><em><span style="font-size: small"><sup>1</sup> Displays that feature significantly smaller pixels, allowing for more freedom when determining their shapes</span></em></p>
<p><em><span style="font-size: small"><sup>2</sup> Paper title: “Standalone real-time health monitoring patch based on a stretchable organic optoelectronic system”</span></em></p>
<p><em><span style="font-size: small"><sup>3 </sup>The superficial artery in the forearm that is usually used to take one’s pulse</span></em></p>
<p><em><span style="font-size: small"><sup>4</sup> The resistance force that is created in a material when it is compressed, bent, twisted, or has other external forces applied to it</span></em></p>
<p><em><span style="font-size: small"><sup>5</sup> Rate of elasticity that shows the degree to which an object stretches and deforms</span></em></p>
<p><em><span style="font-size: small"><sup>6</sup> The motion artifact effect</span></em></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung’s Smart TV-Oriented Research Efforts Recognized With a Best Paper Award at ICCE 2021</title>
				<link>https://news.samsung.com/global/samsungs-smart-tv-oriented-research-efforts-recognized-with-a-best-paper-award-at-icce-2021</link>
				<pubDate>Thu, 08 Apr 2021 17:00:59 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/04/IEEE-ICCE-Award_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[TVs & Displays]]></category>
		<category><![CDATA[Best Paper Award]]></category>
		<category><![CDATA[CES]]></category>
		<category><![CDATA[ICCE]]></category>
		<category><![CDATA[IEEE]]></category>
		<category><![CDATA[Samsung R&D]]></category>
                <guid isPermaLink="false">https://bit.ly/3sZJEK1</guid>
									<description><![CDATA[Researchers from Samsung Electronics’ Visual Display (VD) Business have been declared winners of a Best Paper Award at the IEEE’s (Institute of Electrical and Electronics Engineers) International Conference on Consumer Electronics (ICCE 2021) for their technical paper entitled “Analyzing Images for Music Recommendation.” ICCE is the flagship conference of the IEEE Consumer Technology Society (formerly […]]]></description>
																<content:encoded><![CDATA[<p><span data-preserver-spaces="true">Researchers from Samsung Electronics’ Visual Display (VD) Business have been declared winners of a Best Paper Award at the IEEE’s (Institute of Electrical and Electronics Engineers) International Conference on Consumer Electronics (ICCE 2021) for their technical paper entitled “Analyzing Images for Music Recommendation.”</span></p>
<p><span data-preserver-spaces="true">ICCE is the flagship conference of the IEEE Consumer Technology Society (formerly the IEEE Consumer Electronics Society), held annually in conjunction with the Consumer Electronics Show (CES) in Las Vegas, USA. A total of 190 research papers were accepted this year, with five finalists nominated for the virtual conference’s Best Paper Award. The competition consisted of video-recorded paper presentations and a live Q&A session, and just three papers were selected as winners.</span></p>
<p><img loading="lazy" class="alignnone size-full wp-image-123112" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/04/IEEE-ICCE-Award_main1F.jpg" alt="" width="1000" height="667" /></p>
<p><span data-preserver-spaces="true">“The work portrayed in this paper connects domains such as AI, UX and affective computing with the goal of providing an emotionally enriching user experience by bridging artworks and music in an unconventional manner,” said Anant Baijal, a Samsung Staff Engineer (Advanced R&D Group) and the lead author of the paper. “We propose a method that considers both an artwork’s style (classical, baroque, impressionist etc.) and the emotion it conveys (calm, joyful, mellow, etc.) to automatically curate music that is well-suited to accompany the image.”</span></p>
<p><span data-preserver-spaces="true">The research highlighted in the paper is under review for possible commercialization in Samsung’s flagship products, including the recently announced Neo QLED TVs, along with lifestyle products such as The Frame. Currently, through Ambient Mode on QLED and Art Mode on The Frame, consumers can display pictures of their choice when they are not watching TV. The research team wants to take the user experience to the next level by giving consumers the option to play music that is curated based on automatic analysis of the image displayed on the screen. The results obtained from initial tests conducted during the research support the efficacy of the proposed approach. </span></p>
<p><span data-preserver-spaces="true">“When a user uploads a picture, such as artwork or a photograph, our proprietary AI algorithms evaluate the image to recommend music through a user-subscribed music streaming provider,” added Baijal. “Future work on this technology would include integrating other parameters into the algorithm, such as the weather, time, geolocation and user activity, in addition to the image being displayed. We are designing the algorithm to be capable of adapting to individual users’ preferences.” </span></p>
<p><img loading="lazy" class="alignnone size-full wp-image-123113" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/04/IEEE-ICCE-Award_main2F.jpg" alt="" width="1000" height="666" /></p>
<p><span data-preserver-spaces="true">“At Samsung’s Advanced R&D Group, we’re continually striving to provide a more dynamic screen experience by developing innovative technologies that complement consumers’ ever-changing lifestyles,” said Dr. Danny Hyun, Corporate Vice President and Head of Advanced R&D Group at Samsung Electronics and a co-author of the paper. </span></p>
<p><span data-preserver-spaces="true">The Advanced R&D Group’s research spans multidisciplinary areas such as AI, Big Data and connectivity, and the Group is also responsible for creating futuristic concepts and product prototypes. Multiple international patent applications covering the technology’s concept, components and user scenarios have been filed by Samsung Electronics. </span></p>
<p><span data-preserver-spaces="true">Check out Samsung’s First Look 2021 </span><a class="editor-rtfLink" href="https://youtu.be/7moa0_fH84Y" target="_blank" rel="noopener"><span data-preserver-spaces="true">event</span></a><span data-preserver-spaces="true"> (highlights available </span><a class="editor-rtfLink" href="https://www.youtube.com/watch?v=WGPV8n6EXjU&t=5s" target="_blank" rel="noopener"><span data-preserver-spaces="true">here</span></a><span data-preserver-spaces="true">) for more details on the company’s vision for the future of screens, including its latest display innovations and 2021 product lineup.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Research Centers From Around the World Present Their Studies at CVPR 2020</title>
				<link>https://news.samsung.com/global/samsung-research-centers-from-around-the-world-present-their-studies-at-cvpr-2020</link>
				<pubDate>Tue, 23 Jun 2020 11:00:15 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2020/06/SR-CVPR-2020_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI Into the Future]]></category>
		<category><![CDATA[Computer Vision and Pattern Recognition]]></category>
		<category><![CDATA[CVPR]]></category>
		<category><![CDATA[Samsung AI Center]]></category>
		<category><![CDATA[Samsung Electronics Global Research & Development Center]]></category>
		<category><![CDATA[Samsung R&D]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/2Cx7JCI</guid>
									<description><![CDATA[Samsung Electronics’ Global Research & Development (R&D) Centers have presented their studies to the CVPR (Computer Vision and Pattern Recognition) introducing new computer vision, deep learning and AI related technical researches. CVPR is the world’s biggest conference on computer engineering and AI. At this year’s conference, held online from June 14 to 19, Samsung Research, an […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-117251" src="https://img.global.news.samsung.com/global/wp-content/uploads/2020/06/SR-CVPR-2020_banner.jpg" alt="" width="1000" height="260" /></p>
<p>Samsung Electronics’ Global Research & Development (R&D) Centers presented their studies at CVPR (the Conference on Computer Vision and Pattern Recognition), introducing new research related to computer vision, deep learning and AI.</p>
<p>CVPR is the world’s biggest conference on computer vision and AI. At this year’s conference, held online from June 14 to 19, <a href="https://research.samsung.com/" target="_blank" rel="noopener">Samsung Research, an advanced R&D hub within Samsung Electronics’ SET Business</a>, and its advanced R&D centers presented a total of 11 papers. Researchers from the Samsung Moscow AI Center and the Samsung Toronto AI Center were invited to give oral presentations, an opportunity extended to only about 5% of accepted papers.</p>
<p>At the oral presentations, Pavel Solovev of the Samsung Moscow AI Center introduced ‘High Resolution Daytime Translation without Domain Labels’, a technology that transforms a high-resolution landscape photograph into scenes from various times of the day using data without domain labels. Konstantin Sofiiuk also introduced ‘f-BRS: Rethinking Backpropagating Refinement for Interactive Segmentation’, a technology that allows a user to precisely select and separate an object in a photograph simply by clicking on it.</p>
<div id="attachment_117252" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-117252" class="wp-image-117252 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2020/06/SR-CVPR-2020_main1.jpg" alt="" width="1000" height="504" /><p id="caption-attachment-117252" class="wp-caption-text">‘High Resolution Daytime Translation without Domain Labels’</p></div>
<p>Joining from the Toronto AI Center, researcher Michael Brown and his team introduced the paper titled ‘Deep White-Balance Editing’, which was also selected for an oral presentation. This AI technology corrects white-balance mistakes made in a captured photograph much more accurately than existing photo editing programs. This technology also allows users to accurately adjust the photo’s white-balance color temperature.</p>
<div id="attachment_117253" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-117253" class="wp-image-117253 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2020/06/SR-CVPR-2020_main2.jpg" alt="" width="1000" height="725" /><p id="caption-attachment-117253" class="wp-caption-text">Deep White-Balance Editing</p></div>
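<p>For context, a classical non-learned approach to this problem is the gray-world method, which assumes the average color of a scene should be neutral gray and scales each channel to match. The plain-Python sketch below (illustrative only, not code from the paper) shows that baseline; the Deep White-Balance Editing work instead learns corrections from data, which is what lets it outperform such fixed assumptions.</p>

```python
def gray_world_balance(pixels):
    """Classical gray-world white balance.

    pixels: list of (r, g, b) floats in [0, 255]. Returns balanced pixels.
    Assumes the scene average should be neutral gray and scales each
    channel toward that target.
    """
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    gray = (avg_r + avg_g + avg_b) / 3          # target neutral level
    gains = (gray / avg_r, gray / avg_g, gray / avg_b)
    return [tuple(min(255.0, c * g) for c, g in zip(p, gains)) for p in pixels]

# A tiny "image" with a warm (reddish) cast: the red channel is inflated.
warm = [(120.0, 100.0, 100.0), (180.0, 150.0, 150.0)]
balanced = gray_world_balance(warm)
```

<p>After balancing, the per-channel averages coincide, removing the cast; a learned method can go further because real scenes are often not gray on average.</p>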
<p>Researchers from Samsung Research America also presented interesting findings at the conference. Eric Luo’s study, titled ‘Wavelet Synthesis Net: An Efficient Architecture for Disparity Estimation to Synthesize DSLR Calibre Bokeh Effect on Smartphones’, focused on key enablers for narrowing the gap between DSLR and smartphone cameras in terms of bokeh, the background blur produced by a narrow depth of field (DoF).</p>
<p>Yilin Shen from Samsung Research America’s AI Center introduced a study on out-of-distribution (OoD) detection for deep neural networks. Shen’s study, titled ‘Generalized ODIN: Detecting Out-Of-Distribution Image Without Learning From Out-Of-Distribution Data’, proposed a machine learning algorithm that drastically improves the OoD detection rate, addressing one of the major challenges in AI technology.</p>
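<p>Generalized ODIN itself is beyond the scope of a short example, but the common baseline such work builds on, flagging inputs whose maximum softmax confidence is low, can be sketched in a few lines of plain Python (the logit values and threshold below are illustrative, not from the paper):</p>

```python
import math

def softmax(logits):
    """Convert raw network scores into probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]   # subtract max for stability
    s = sum(exps)
    return [e / s for e in exps]

def is_out_of_distribution(logits, threshold=0.9):
    """Flag an input as OoD when the network's top softmax score is low."""
    return max(softmax(logits)) < threshold

in_dist = [8.0, 0.5, 0.2]   # one dominant score: a confident, familiar input
ood = [1.1, 1.0, 0.9]       # near-flat scores: likely an unfamiliar input
```

<p>Generalized ODIN refines this idea by decomposing the confidence score so that no OoD examples are needed during training.</p>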
<p>Additionally, studies by researchers from Samsung Research’s Visual Technology team and the Samsung R&D Institute India-Bangalore were also accepted by CVPR.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung AI Forum Offers a Roadmap for the Future of AI</title>
				<link>https://news.samsung.com/global/samsung-ai-forum-offers-a-roadmap-for-the-future-of-ai</link>
				<pubDate>Tue, 18 Sep 2018 18:30:57 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Expert Voices]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Samsung AI Center]]></category>
		<category><![CDATA[Samsung AI Forum 2018]]></category>
		<category><![CDATA[Samsung R&D]]></category>
                <guid isPermaLink="false">http://bit.ly/2PLlxL1</guid>
									<description><![CDATA[It wasn’t that long ago that the idea of building technologies with ‘brains’ that learn and are even structured just like ours seemed like science fiction. Just ask the distinguished speakers at the “Samsung AI Forum 2018”. Held in Seoul from September 12th to 13th, the second edition of Samsung Electronics’ artificial intelligence (AI) forum […]]]></description>
																<content:encoded><![CDATA[<p>It wasn’t that long ago that the idea of building technologies with ‘brains’ that learn and are even structured just like ours seemed like science fiction.</p>
<p>Just ask the distinguished speakers at the “Samsung AI Forum 2018”. Held in Seoul from September 12<sup>th</sup> to 13<sup>th</sup>, the second edition of Samsung Electronics’ artificial intelligence (AI) forum featured accomplished AI experts, who discussed how groundbreaking advancements are not only helping to create technology that will make our lives more comfortable, convenient and efficient. They’re also teaching us more about how our own minds work.</p>
<h3><span style="color: #000080"><strong>Unsupervised Learning Takes Center Stage</strong></span></h3>
<div id="attachment_105056" style="width: 715px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-105056" class="size-full wp-image-105056" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_main_1.jpg" alt="" width="705" height="375" /><p id="caption-attachment-105056" class="wp-caption-text">Attendees of the Samsung AI Forum 2018 listen intently to the opening address of Kinam Kim, President and CEO of Samsung Electronics</p></div>
<p>The forum began with a presentation from the founding director of the New York University Center for Data Science, and one of the world’s leading minds in the field of deep learning, Yann LeCun.</p>
<p>LeCun’s speech set the stage for the exciting discussions on unsupervised learning that would follow over the course of the two-day event. LeCun explained why he and many of his peers believe that unsupervised learning, also known as self-supervised learning, represents the future of AI. He also delved into unsupervised learning algorithms’ potential applications (and limitations), and explained how they differ from supervised and reinforcement learning algorithms.</p>
<p>As LeCun explained, <em>supervised learning</em> algorithms learn utilizing labeled datasets and answer keys that allow them to evaluate their accuracy. This essentially means that each example in the training dataset includes the answer that the algorithm should produce. With <em>reinforcement learning</em>, an algorithm is trained using a reward system that offers feedback when it performs an optimal action for a given situation. It relies on this feedback, rather than labeled datasets, to make the choice that offers the greatest reward.</p>
<p>With <em>unsupervised learning,</em> the algorithm is tasked with making sense of an unlabeled dataset—a set of examples that doesn’t have a correct answer or desired outcome—on its own. While these algorithms can be more unpredictable than their counterparts, they can also perform more complex processing tasks.</p>
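<p>The distinction can be made concrete with a toy example in plain Python (illustrative only, not code from the forum): given two clusters of one-dimensional points, a supervised learner uses the provided labels to place a decision threshold, while an unsupervised learner, here a tiny k-means, recovers the same grouping from the unlabeled points alone.</p>

```python
# Toy 1-D dataset: two groups of points, around 0 and around 10.
points = [0.1, 0.4, 0.3, 9.8, 10.2, 10.1]

# Supervised learning: labels are given, so the "answer key" comes with
# the data. Learn a threshold midway between the labeled class means.
labels = [0, 0, 0, 1, 1, 1]
mean0 = sum(p for p, l in zip(points, labels) if l == 0) / labels.count(0)
mean1 = sum(p for p, l in zip(points, labels) if l == 1) / labels.count(1)
threshold = (mean0 + mean1) / 2

# Unsupervised learning: no labels; the algorithm must find the structure
# itself. A few iterations of 1-D k-means with two centers.
c0, c1 = min(points), max(points)
for _ in range(10):
    group0 = [p for p in points if abs(p - c0) <= abs(p - c1)]
    group1 = [p for p in points if abs(p - c0) > abs(p - c1)]
    c0 = sum(group0) / len(group0)
    c1 = sum(group1) / len(group1)

# Both approaches recover the same two groups, but only one needed labels.
```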
<p>LeCun used training self-driving cars as a key example of unsupervised learning’s potential. “A lot of people who are working on autonomous driving are hoping to use reinforcement learning to get cars to learn to drive by themselves by trial and error,” said LeCun. “The problem with this is that, because of [reinforcement learning’s inherent inefficiencies], you’d have to get a car to drive off a cliff several thousand times before it figures out how not to do that.”</p>
<p>LeCun explained how, unlike reinforcement learning models, which rely on trial and error, unsupervised learning models could potentially be capable of guessing what to do in a situation like this—demonstrating mental capabilities similar to what we’d call common sense.</p>
<p>He also discussed his experience developing artificial neural networks—specifically convolutional neural networks (ConvNets)—and demonstrated how they can be used to build not only self-driving cars but a wide variety of innovative devices, including technologies for medical signal and image analysis, bioinformatics, speech recognition, language translation, image restoration, robotics and physics.</p>
<p>LeCun’s presentation was followed by a lecture from another leading light in the field of deep learning: University of Montreal professor Yoshua Bengio. Professor Bengio’s lecture focused specifically on stochastic gradient descent (SGD)—an AI optimization method that’s used to minimize errors made by artificial neural networks.</p>
<p>As Bengio explained, “[SGD] is really the workhorse of deep learning. This is the optimization technique that is used everywhere for supervised learning, reinforcement learning and self-supervised learning. It’s been with us for many decades and it works incredibly well, but we don’t completely understand it yet.”</p>
<p>Bengio’s presentation allowed attendees to gain a better understanding of SGD, with specific focus on how SGD variants can affect neural network optimization and generalization. Bengio discussed how the traditional view of machine learning sees optimization and generalization as neatly separated, but that’s not actually the case. He also presented detailed research findings on the effects of SGD-based learning techniques on both aspects of network design.</p>
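<p>As a rough illustration of the workhorse Bengio described, the sketch below (generic textbook SGD in plain Python, not code from the lecture) fits a line y = 2x + 1 by repeatedly stepping the parameters against the gradient of the squared error on a single randomly chosen example at a time:</p>

```python
import random

# Toy data generated from y = 2x + 1 (the "unknown" target function).
data = [(x, 2 * x + 1) for x in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]]

w, b = 0.0, 0.0   # parameters to learn
lr = 0.05         # learning rate
random.seed(0)

for step in range(2000):
    x, y = random.choice(data)   # one random example: the "stochastic" part
    pred = w * x + b
    err = pred - y               # derivative of 0.5 * (pred - y)**2 w.r.t. pred
    w -= lr * err * x            # descend along d(loss)/dw
    b -= lr * err                # descend along d(loss)/db

print(round(w, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```

<p>The questions Bengio raised concern exactly such details: how the learning rate, the mini-batch size and other variants of this loop shape both optimization and generalization in deep networks.</p>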
<div id="attachment_105053" style="width: 715px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-105053" class="size-full wp-image-105053" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_main_2.jpg" alt="" width="705" height="543" /><p id="caption-attachment-105053" class="wp-caption-text">(from the top, clockwise) NYU professor Yann LeCun, University of Montreal professor Yoshua Bengio, MIT Media Lab’s professor Cynthia Breazeal and Samsung Research’s Executive Vice President Sebastian Seung.</p></div>
<h3><span style="color: #000080"><strong>Could Unsupervised Learning Unlock the Secrets of the Brain?</strong></span></h3>
<p>Sebastian Seung, Executive Vice President of Samsung Research and Chief Research Scientist of Samsung Electronics, delivered a particularly illuminating presentation that outlined why unsupervised learning will be essential for developing AI with human-level mental capabilities.</p>
<p>Seung described how the convolutional neural networks that LeCun had examined in detail are in fact based on insights gained through the study of neuroscience. He also discussed how his research in both artificial and biological neural networks led him to study ways to apply AI to gain a better understanding of how our brains are wired.</p>
<p>Seung stressed that the model for designing unsupervised learning networks lies in the cortex of the brain, and highlighted a recent study that his team was involved in that used AI to map out all of the neurons contained in one cubic millimeter of a mouse’s visual cortex—more than 100,000 in total.</p>
<p>The unsupervised learning algorithm that the researchers utilized allowed them to not only create a 3D reconstruction of the neural network’s wiring, but also made it possible to label and color in individual cells and their components. “That’s the magic of deep learning,” said Seung. “If a human had to color all that in, it would take about 100 years of work. And that’s with no coffee breaks or sleeping.”</p>
<h3><span style="color: #000080"><strong>Living with Social Robots in ‘10 to 20 Years’</strong></span></h3>
<p>The speech delivered by Cynthia Breazeal, the founder and Chief Scientist of Jibo, Inc., and the founding director of the Personal Robotics Group at MIT’s (the Massachusetts Institute of Technology’s) Media Lab, shifted focus to applying AI to develop advanced robotics.</p>
<p>Breazeal’s speech, entitled “Living and Flourishing with Social Robots,” discussed approaches needed to develop autonomous systems that utilize AI to enhance our quality of life. As Breazeal explained, autonomous, socially and emotionally intelligent technologies—robots with what’s known as ‘relational AI’—present a wide range of exciting benefits.</p>
<p>“I’m really excited to think about the next 10 to 20 years—of having these robots actually become a part of our daily lives,” said Breazeal.</p>
<p>The fascinating presentation highlighted helpful companion technologies in particular, and included specific examples of ways that robots could be used to assist children and older adults. Breazeal noted studies in which AI robotic companions were given to patients at a children’s hospital, as well as kindergarten-age students and senior citizens.</p>
<p>Videos of the studies showed how the children in the hospital drew comfort from having a peer-like companion by their side, and demonstrated how robots can be used to boost learning. As Breazeal explained, “This is about a different vision for AI. There’s so much emphasis right now on tools for professionals, and there’s not a lot of deep thinking around how AI is going to benefit everyone.” The studies, Breazeal added, “show that there’s a lot of promise with these technologies in the real world… making a real difference.”</p>
<p>This year’s forum also included a diverse array of speeches that offered an all-encompassing look at the state of artificial intelligence development today. These included presentations on topics covering advancements in reinforcement learning, mutual information neural estimation, socially and emotionally intelligent AI, personal assistant robots, and precision medicine via machine learning. The developments discussed at the Samsung AI Forum 2018 represent great strides toward creating an AI-connected future.</p>
]]></content:encoded>
																				</item>
			</channel>
</rss>