<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet title="XSL_formatting" type="text/xsl" href="https://news.samsung.com/my/wp-content/plugins/btr_rss/btr_rss.xsl"?><rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wfw="http://wellformedweb.org/CommentAPI/"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
     xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>
	<channel>
		<title>Samsung Research &#8211; Samsung Newsroom Malaysia</title>
		<atom:link href="https://news.samsung.com/my/tag/samsung-research/feed" rel="self" type="application/rss+xml" />
		<link>https://news.samsung.com/my</link>
        <image>
            <url>https://img.global.news.samsung.com/image/newlogo/logo_samsung-newsroom_my.png</url>
            <title>Samsung Research &#8211; Samsung Newsroom Malaysia</title>
            <link>https://news.samsung.com/my</link>
        </image>
        <currentYear>2025</currentYear>
        <cssFile>https://news.samsung.com/my/wp-content/plugins/btr_rss/btr_rss_xsl.css</cssFile>
		<description>What's New on Samsung Newsroom</description>
		<lastBuildDate>Tue, 21 Apr 2026 13:37:28 +0000</lastBuildDate>
		<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
					<item>
				<title>Samsung Electronics Unveils 6G White Paper and Outlines Direction for AI-Native and Sustainable Communication</title>
				<link>https://news.samsung.com/my/samsung-electronics-unveils-6g-white-paper-and-outlines-direction-for-ai-native-and-sustainable-communication?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Tue, 05 Aug 2025 12:24:13 +0000</pubDate>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[6G]]></category>
		<category><![CDATA[6G White Paper]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI-native communication]]></category>
		<category><![CDATA[AI-RAN]]></category>
		<category><![CDATA[Next Generation Communications]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/4mpUtAn</guid>
									<description><![CDATA[Samsung Electronics has published a 6G white paper titled “AI-Native & Sustainable Communication,” detailing the latest trends in next-generation mobile]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics has published a 6G white paper titled “<a href="https://research.samsung.com/next-generation-communications" target="_blank" rel="noopener">AI-Native &amp; Sustainable Communication</a>,” detailing the latest trends in next-generation mobile communication technologies.</p>
<p>&nbsp;</p>
<p>Following the first 6G white paper “<a href="https://news.samsung.com/my/samsungs-6g-white-paper-lays-out-the-companys-vision-for-the-next-generation-of-communications-technology">The Next Hyper-Connected Experience for All</a>” in July 2020, this white paper covers the latest trends driving 6G standardization and next-generation mobile communications — including evolving market and technology needs, emerging services, key attributes of 6G and enabling technologies.</p>
<p>&nbsp;</p>
<p>Samsung aims to integrate the latest AI technology throughout the telecommunication system and improve network quality for a future-oriented and sustainable user experience.</p>
<p>&nbsp;</p>
<p>“We are intensifying our 6G research efforts, focusing on AI-enabled communication technologies and sustainable networks,” said Charlie Zhang, Senior Vice President of the Advanced Communications Research Center (ACRC) at Samsung Research. “As the telecommunication industry accelerates 6G standardization this year, Samsung will develop technologies to align with market demands.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-159394" src="https://img.global.news.samsung.com/global/wp-content/uploads/2025/02/Samsung-Corporate-Technology-6G-White-Paper-AI-Native-and-Sustainable-Communication_main1.jpg" alt="" width="1000" height="1412" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Market and Technology Trends Toward 6G</strong></span></h3>
<p>Mobile data traffic has surged, driven by the proliferation of AI technologies and the rise of streaming services. Now more than ever, there is a pressing need for technological advancements to manage increased data traffic and enhance user experiences in next-generation mobile communications.</p>
<p>&nbsp;</p>
<p>Since the introduction of 5G, the telecommunications industry has been particularly focused on optimizing system operations, sustainability and user experiences. Beyond communication performance improvements such as data rates and latency, there is an urgency to reduce operating costs, enhance energy efficiency, expand service coverage and introduce innovative technologies such as AI.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Emerging Services</strong></span></h3>
<p>5G-Advanced will provide further enhanced 5G performance and incorporate AI to support new services and use cases — ultimately becoming the foundation for 6G technology.</p>
<p>&nbsp;</p>
<p>In this white paper, some key emerging services such as immersive extended reality (XR), digital twin, massive communication, ubiquitous connectivity and fixed wireless access (FWA) are highlighted.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-159395" src="https://img.global.news.samsung.com/global/wp-content/uploads/2025/02/Samsung-Corporate-Technology-6G-White-Paper-AI-Native-and-Sustainable-Communication_main2.jpg" alt="" width="1000" height="538" /></p>
<div>
<p>&nbsp;</p>
<div style="border: 1px solid #000000; padding: 10px 32px; background-color: #ffffff;">
<p>&nbsp;</p>
<p><strong>Immersive Extended Reality (XR):</strong><span> </span>Offers truly immersive user experiences by integrating and interacting with the virtual and real worlds, attracting attention across industries such as entertainment, healthcare and science.</p>
<p>&nbsp;</p>
<p><strong>Digital Twin:</strong><span> </span>Creates virtual replicas of physical entities — including objects, people, devices and places — using 6G technology to allow remote monitoring, problem detection and control.</p>
<p>&nbsp;</p>
<p><strong>Massive Communication:</strong><span> </span>Simultaneously connects numerous sensors, machines, terminals and other devices to networks and supports automation and management of smart cities, homes and factories.</p>
<p>&nbsp;</p>
<p><strong>Ubiquitous Connectivity:</strong><span> </span>Expands service areas by extending terrestrial network coverage and interworking between terrestrial and non-terrestrial network components — including satellites and high-altitude platform stations (HAPS).</p>
<p>&nbsp;</p>
<p><strong>Fixed Wireless Access (FWA):</strong><span> </span>Allows wireless delivery of broadband services that traditionally required wired connections, and has become recognized as a key driver for expanding telecommunications businesses.</p>
<p>&nbsp;</p>
</div>
<p>&nbsp;</p>
</div>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>6G Key Attributes</strong></span></h3>
<p>In the white paper, Samsung highlighted four key attributes crucial to adapting to evolving market demands — AI-native, sustainable network, ubiquitous coverage and secure and resilient network.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-159398" src="https://img.global.news.samsung.com/global/wp-content/uploads/2025/02/Samsung-Corporate-Technology-6G-White-Paper-AI-Native-and-Sustainable-Communication_main3.jpg" alt="" width="1000" height="665" /></p>
<p>&nbsp;</p>
<div style="border: 1px solid #000000; padding: 10px 32px; background-color: #ffffff;">
<p>&nbsp;</p>
<p><strong>AI-Native:<span> </span></strong>Incorporates the latest AI technologies into communication functionalities — from system design to the development, management and operation of systems — to improve performance.</p>
<p>&nbsp;</p>
<p><strong>Sustainable Network:</strong><span> </span>Reduces operational costs and increases user satisfaction by improving the energy efficiency of both networks and terminals.</p>
<p>&nbsp;</p>
<p><strong>Ubiquitous Coverage:</strong><span> </span>Decreases capital expenditures (CAPEX) of networks and enhances service quality by expanding communication service areas and strengthening connectivity via interconnecting terrestrial and non-terrestrial networks.</p>
<p>&nbsp;</p>
<p><strong>Secure and Resilient Network:</strong><span> </span>Ensures network security, user privacy and resilience in preparation for the significant advancements in computing capabilities and AI technology expected in the 2030s.</p>
<p>&nbsp;</p>
</div>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>6G Timeline</strong></span></h3>
<p>With the release of this white paper, Samsung solidifies its leadership in shaping the direction of 6G research and key technologies.</p>
<p>&nbsp;</p>
<p>The telecommunications industry and standardization organizations have been researching 6G since 2020. In 2030, the 6G technology standards are expected to be finalized — following candidate technology development, evaluation and consensus-building processes. With the recent timelines from the International Telecommunication Union Radiocommunication Sector (ITU-R)<a href="#_ftn1">[1]</a><span> </span>and 3rd Generation Partnership Project (3GPP),<a href="#_ftn2">[2]</a><span> </span>momentum for 6G research and development is expected to intensify.</p>
<p>&nbsp;</p>
<p>Samsung will continue to lead global standardization efforts and prepare for the 6G era while incorporating lessons learned from 5G commercialization and adapting to new market requirements.</p>
<p>&nbsp;</p>
<p>Last November, Samsung held the <a href="https://news.samsung.com/my/samsung-electronics-hosts-silicon-valley-future-wireless-summit">Silicon Valley Future Wireless Summit</a><span> </span>and hosted an in-depth discussion with industry experts on the convergence of telecommunications and AI technologies. The company demonstrated AI-RAN technologies and Proof of Concept (PoC) results, showcasing the possibilities of AI-native technologies and garnering significant interest from major telecommunications operators.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h6><em><a href="#_ftnref1" name="_ftn1">[1]</a></em><span><em> ITU is the United Nations specialized agency for information and communication technologies, with a membership of 193 Member States and more than 1,000 companies, universities, research institutes and international and regional organizations. The ITU’s Radiocommunication Sector (ITU-R) is responsible for regulating and standardizing global radio communication.<br />
</em></span></h6>
<h6><em><a href="#_ftnref2" name="_ftn2">[2]</a></em><span><em> 3GPP is dedicated to developing globally unified technical specifications for mobile communications.</em></span></h6>
]]></content:encoded>
																				</item>
					<item>
				<title>Eclipsa Audio: Ushering in a New Generation of 3D Sound With Samsung</title>
				<link>https://news.samsung.com/my/eclipsa-audio-ushering-in-a-new-generation-of-3d-sound-with-samsung?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Thu, 06 Mar 2025 10:26:41 +0000</pubDate>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[TVs & Displays]]></category>
		<category><![CDATA[3D Audio]]></category>
		<category><![CDATA[Eclipsa Audio]]></category>
		<category><![CDATA[IAMF]]></category>
		<category><![CDATA[Neo QLED]]></category>
		<category><![CDATA[Open Source]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[Samsung TVs]]></category>
                <guid isPermaLink="false">https://bit.ly/41HGiz9</guid>
									<description><![CDATA[In video content, audio is just as essential as the visuals, playing a key role in immersion and making viewers feel as if they are part of the scene. To]]></description>
																<content:encoded><![CDATA[<p>In video content, audio is just as essential as the visuals, playing a key role in immersion and making viewers feel as if they are part of the scene. To create a truly optimized sound experience, Samsung Electronics collaborated with Google to develop Eclipsa Audio — a cutting-edge 3D audio technology officially introduced through Samsung TVs last month at the Consumer Electronics Show (CES) 2025 in Las Vegas.</p>
<p>&nbsp;</p>
<p>Samsung Newsroom took a closer look at the technology behind Eclipsa Audio and how it delivers lifelike 3D spatial audio.</p>
<p>&nbsp;</p>
<p><img class="aligncenter size-full wp-image-30579" src="https://img.global.news.samsung.com/my/wp-content/uploads/2025/03/Samsung-Corporate-Technology-Eclipsa-Audio-Google-3D-audio_main1.jpg" alt="" width="1000" height="330" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2025/03/Samsung-Corporate-Technology-Eclipsa-Audio-Google-3D-audio_main1.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2025/03/Samsung-Corporate-Technology-Eclipsa-Audio-Google-3D-audio_main1-768x253.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Developing 3D Spatial Audio Technology</span></h3>
<p>In 2023, the Alliance for Open Media (AOM) — a global consortium that includes Samsung, Google, Netflix, Meta and other leading companies — officially adopted Immersive Audio Model and Formats (IAMF) as the industry standard for 3D audio. Developed by Samsung and Google, this innovative 3D audio format is currently available to content creators under the brand name Eclipsa Audio.</p>
<p>&nbsp;</p>
<p>Eclipsa Audio establishes a shared protocol between different types of media content and the devices that play them. The format delivers a deeply immersive listening experience by optimizing and adapting audio positioning, intensity, spatial reflections and other sound elements to various output environments, such as cinemas, home theater systems, gaming consoles and mobile devices.</p>
<p>&nbsp;</p>
<p>Depending on the output device, the technology can render sound from multiple directions — including from the front, back, left, right, above and below — to create a sense of spatial depth and presence within the scene being watched. In a concert video, for instance, Eclipsa Audio presents the artist’s performance in crystal-clear detail while also capturing the energy of the audience, making the viewer feel as if they are physically there.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Designed for Optimal 3D Audio in Everyday Life</span></h3>
<p>Among the growing range of technologies enhancing 3D sound — including surround sound, immersive audio and spatial audio — Eclipsa Audio was specifically designed to provide a 3D audio experience optimized for everyday listening.</p>
<p>&nbsp;</p>
<p>Traditionally, 3D audio content is created with the assumption that it will be played in environments equipped with multiple surround speakers. However, most home entertainment setups primarily consist of a TV and a soundbar — making it challenging to accurately replicate the content creator’s intended spatial audio effects. Eclipsa Audio overcomes this limitation by automatically analyzing sound elements in each segment of a film — from whispered dialogue to the roar of fighter jets in the background — and delivering a dynamic 3D audio effect fine-tuned for the viewer’s home environment.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Building a 3D Audio Ecosystem With Open-Source Technology</span></h3>
<p>Eclipsa Audio’s open-source framework sets it apart from other 3D audio technologies by allowing anyone to create 3D audio content without paying royalties. Following its debut at CES 2025, the technology has been met with enthusiasm from content creators and has gained momentum across media platforms and online communities.</p>
<p>&nbsp;</p>
<p>In this way, Eclipsa Audio serves as an open vessel in which content creators can integrate 3D audio elements from all directions without restrictions. By democratizing spatial audio, Eclipsa Audio empowers content creators and ensures that consumers experience sound as intended — regardless of their audio setup.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Eclipsa Audio on Samsung TVs</span></h3>
<p>Eclipsa Audio’s immersive 3D sound performs at its best when paired with exceptional hardware. To make that peak performance a reality, Samsung and Google have worked tirelessly to provide consumers with Eclipsa Audio-supported 3D audio content — soon to be available via the YouTube app on Samsung’s latest TVs. Eclipsa Audio is set to roll out across the company’s entire 2025 TV lineup from the Crystal UHD series to the premium flagship Neo QLED 8K models.<a href="#_ftn1" name="_ftnref1"><span>[1]</span></a></p>
<p>&nbsp;</p>
<p>Most Samsung TVs are equipped with stereo speakers at the bottom of the screen, and for QLED 4K models and above, additional speakers are positioned at the top. Flagship models, though, come with extra benefits. Besides surround speakers added to the sides and rear, their top-positioned speakers are specially designed for height perception. Eclipsa Audio reflects sound off the ceiling with these special speakers, creating an effect that allows viewers to experience upward-directional audio — such as the sensation of an object flying overhead. Pairing the TV with a soundbar further enhances the experience, producing richer and more expansive 3D spatial audio.</p>
<p>&nbsp;</p>
<p>Eclipsa Audio has established the foundation for a 3D audio content ecosystem by bringing industry leaders — from device manufacturers to content platforms — together under a unified standard. In an era where streaming services blur the line between content creation and consumption, Eclipsa Audio unlocks new possibilities for immersive sound that Samsung is determined to further expand.</p>
<p>&nbsp;</p>
<p>For more information, please visit: <span><a href="https://www.samsung.com/my/" target="_blank" rel="noopener">https://www.samsung.com/my/</a></span></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h6><a href="#_ftnref1" name="_ftn1"><span><em><strong>[1]</strong></em></span></a><em> Rollout schedule and service details may vary depending on the TV model.</em></h6>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Develops Groundbreaking Achromatic Metalens With POSTECH</title>
				<link>https://news.samsung.com/my/samsung-develops-groundbreaking-achromatic-metalens-with-postech?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Tue, 04 Mar 2025 09:59:08 +0000</pubDate>
						<category><![CDATA[Corporate]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Metalens]]></category>
		<category><![CDATA[Nature Materials]]></category>
		<category><![CDATA[Pohang University of Science and Technology]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[XR]]></category>
                <guid isPermaLink="false">https://bit.ly/4h7y5sk</guid>
									<description><![CDATA[Samsung Electronics Co., Ltd. today announced that it has published a joint research paper with Pohang University of Science and Technology (POSTECH) detailing]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics Co., Ltd. today announced that it has published a <span><a href="https://www.nature.com/articles/s41563-025-02121-0" target="_blank" rel="noopener">joint research paper</a></span> with Pohang University of Science and Technology (POSTECH) detailing the development of an innovative achromatic metalens in the renowned academic journal Nature Materials.</p>
<p>&nbsp;</p>
<p>The paper, titled “Roll-to-plate printable RGB-achromatic metalens for wide-field-of-view holographic near-eye displays,” reflects the findings of research conducted by Samsung and POSTECH’s joint research team, wherein they developed an achromatic metalens free from color distortions and combined it with holographic displays to overcome various optical aberrations. This innovation paves the way for compact yet high-quality holographic XR wearable devices and applications in cameras and sensors.</p>
<p>&nbsp;</p>
<p>Dr. Seokil Moon from Samsung Research and Professor Junsuk Rho from POSTECH led the study, with researchers Minseok Choi, Joohoon Kim and Kilsoo Shin from POSTECH also listed as co-authors of the paper.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Overcoming Conventional Chromatic Aberration Limitations To Achieve a Compact Achromatic Metalens</span></h3>
<p>A metalens is a flat lens composed of nanoscale structures capable of controlling light diffraction, which can drastically reduce the size and thickness compared to traditional convex optical lenses<a href="#_ftn1" name="_ftnref1"><span>[1]</span></a>. For this reason, it has been recognised as a next-generation optical component for applications in displays and cameras, sparking over a decade of research.</p>
<p>&nbsp;</p>
<p>Despite these advantages, metalenses have encountered technical challenges in product development due to severe chromatic aberration,<a href="#_ftn2" name="_ftnref2"><span>[2]</span></a> which leads to significant image distortion.</p>
<p>&nbsp;</p>
<p>Previous efforts to eliminate chromatic aberration in metalenses relied on designing individual metastructures independently and subsequently assembling them onto a substrate. As a result, the interrelationships between structures were overlooked during the design phase, preventing the complete reduction of chromatic aberration in the final lens.</p>
<p>&nbsp;</p>
<p>The research team overcame the challenge of chromatic aberration reduction by redefining the conventional design approach for metalenses. By accounting for the interrelationships between all metastructures during the design phase and designing them simultaneously, the team has successfully eliminated chromatic aberration after fabrication.</p>
<p>&nbsp;</p>
<p>In addition to eliminating chromatic aberration, the achromatic metalens developed by the team also achieves a shorter focal length, significantly reducing the lens&#8217; size and weight<a href="#_ftn3" name="_ftnref3"><span>[3]</span></a>.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Higher Resolution and Less Eye Strain With a Single Lens</span></h3>
<p>Typically, metalenses exhibit various optical aberrations beyond chromatic aberration, with image distortion worsening as screen size increases. These issues have traditionally been addressed by combining multiple lenses. However, the research team has resolved various optical aberrations within the device by integrating a single achromatic metalens with a holographic display, achieving a wide field of view and distortion-free, high-quality images.</p>
<p>&nbsp;</p>
<p>Additionally, through technical validation, the research team has demonstrated that substituting conventional optical lenses and displays with achromatic metalenses and holographic displays enables compact, lightweight devices that deliver virtual images causing less eye strain<a href="#_ftn4" name="_ftnref4"><span>[4]</span></a>.</p>
<p>&nbsp;</p>
<p>The findings of this study are anticipated to be applied to immersive media devices such as those equipped with extended reality (XR) capabilities. They will also be used in various optical systems — including displays, cameras and sensors — to enhance performance and reduce size.</p>
<p>&nbsp;</p>
<p>Through this collaboration between industry and academia, Samsung has validated the entire process — from conceptualising innovative ideas to implementation — confirming the potential for advancing various future optical systems and securing next-generation display technologies.</p>
<p>&nbsp;</p>
<p>Samsung remains committed to ongoing research efforts, aiming to secure groundbreaking technologies that will shape the future through continued collaborations with academia and other industry-leading initiatives.</p>
<p>&nbsp;</p>
<p>To read about other news and updates, please visit <span><a href="https://news.samsung.com/my/" target="_blank" rel="noopener">https://news.samsung.com/my/</a></span>.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h6><em><a href="#_ftnref1" name="_ftn1">[1]</a> A convex lens typically has a thickness of several millimeters, sometimes exceeding a centimeter, whereas metalenses are much thinner, usually less than 0.5 mm.</em></h6>
<h6><em><a href="#_ftnref2" name="_ftn2">[2]</a> Chromatic aberration, also called color fringing, occurs when a lens fails to focus all colors to the same point, creating colored fringes along the edges of objects in photographs.</em></h6>
<h6><em><a href="#_ftnref3" name="_ftn3">[3]</a> Compared to previously proposed achromatic metalenses, this research has fabricated metalenses that are 3–5 times larger in size while maintaining the same focusing power (numerical aperture).</em></h6>
<h6><em><a href="#_ftnref4" name="_ftn4">[4]</a> The image quality is enhanced by 13% after aberrations are corrected using the holographic display. The image quality is measured using the peak signal-to-noise ratio (PSNR), which is widely used in image and signal processing.</em></h6>
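As background on the metric in footnote [4], PSNR has a standard definition (the formulas below are the general textbook form, not specific to this study): it compares a reference image against a reconstruction via their mean squared error.

```latex
% PSNR of a reconstruction K against a reference image I of size m x n;
% MAX_I is the largest possible pixel value (255 for 8-bit images).
\mathrm{MSE}  = \frac{1}{mn}\sum_{i=0}^{m-1}\sum_{j=0}^{n-1}\bigl[I(i,j) - K(i,j)\bigr]^{2}
\qquad
\mathrm{PSNR} = 10 \log_{10}\!\left(\frac{\mathrm{MAX}_I^{2}}{\mathrm{MSE}}\right)
```

A higher PSNR (in dB) means the reconstruction is closer to the reference, so the 13% figure above reflects a relative PSNR gain after aberration correction.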
]]></content:encoded>
																				</item>
					<item>
				<title>The Learning Curve, Part 5: Overcoming Multicultural and Multilingual Differences</title>
				<link>https://news.samsung.com/my/the-learning-curve-part-5-overcoming-multicultural-and-multilingual-differences?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Wed, 19 Jun 2024 12:56:06 +0000</pubDate>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Galaxy AI]]></category>
		<category><![CDATA[Interpreter]]></category>
		<category><![CDATA[language]]></category>
		<category><![CDATA[Live Translate]]></category>
		<category><![CDATA[Samsung R&D Institute Brazil]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[Text-to-Speech]]></category>
                <guid isPermaLink="false">https://bit.ly/4enmYeZ</guid>
									<description><![CDATA[As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling more users]]></description>
																<content:encoded><![CDATA[<p>As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling more users to maximize their potential. Galaxy AI now supports 16 languages, so more people can expand their language capabilities, even when offline, thanks to on-device translation in features such as Live Translate, Interpreter, Note Assist and Browsing Assist. But what does AI language development actually involve? Last time, we visited China to learn about the importance of partnering with other leaders in AI. This time, we’re in Brazil to explore how teams work across cultures and borders to bring Galaxy AI to more people.</p>
<p>&nbsp;</p>
<p>A diverse country with more than 203 million people embodying a wide range of cultures and traditions, Brazil uses Brazilian Portuguese as its official language. Meanwhile, Latin American Spanish is spoken across much of the surrounding region.</p>
<p>&nbsp;</p>
<p>Although Brazilian Portuguese and Latin American Spanish are widely spoken, intricate variations in both languages presented various challenges when teaching Galaxy AI to discern and distinguish regional differences. That’s why Samsung R&amp;D Institute Brazil (SRBR) collaborated with Samsung experts from Mexico — as well as third-party partners such as the science and technology institutes SiDi and Sidia — to assemble a multidisciplinary and highly skilled team that could tackle the task.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Lower Barriers, Higher Understanding</span></h3>
<p>The team used thousands of sources and a combination of machine learning and language processing tools to improve the AI model’s recognition of speech, written texts and regional variations. But local jargon and names of famous figures — including sports teams, celebrities and bands — vary widely between regions. Also, the same meaning can be expressed in many different words. While language models need localized data to gain a comprehensive understanding of the different languages to be translated, such variations inevitably present obstacles.</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-27289 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-1-1-e1718768319786.jpg" alt="" width="1000" height="563" /></p>
<p>&nbsp;</p>
<p>For example, swimming pool is “alberca” in Mexico — but it is “pileta” in Argentina, Paraguay and Uruguay. Meanwhile, in Colombia, Bolivia and Venezuela, swimming pool is “piscina”, which is also used in Brazil but with a slight tonal difference. And while Colombians might say &#8220;chévere&#8221; to refer to something cool, Mexicans instead say “padre.”</p>
<p>&nbsp;</p>
<p>These differences represent huge challenges for AI language understanding and learning, but the team overcame them by building larger language models, refining processing tools — and collaborating across borders and time zones.</p>
<p>&nbsp;</p>
<p>“We had to consider local slang and different ways of speaking before adapting and testing the model accordingly, which required close collaboration between the SRBR quality assurance (QA) team and development teams,” says Mateus Pedroso, Senior Manager and Head of Software Quality Lab at SRBR. “Since SRBR is located three hours ahead of the QA team in Mexico and 12 hours behind the management team in Korea, we had to create new communication channels and processes to align results and share progress. This multicultural collaboration generated a fiesta of ideas and solutions for Galaxy AI.”</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-27290 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-2-1-e1718768344547.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Communicating Success</strong></span></h3>
<p>Samsung’s philosophy of open collaboration came to life during this regional project as it was an iterative process that leveraged evolving technology on a global scale. To overcome linguistic and cultural barriers, the SRBR team needed to collect and manage massive amounts of data — continually refining and improving upon audio and text sources.</p>
<p>&nbsp;</p>
<p>The teams carved out key areas of responsibility to ensure everyone could benefit from the collective skill sets across the company’s Latin American offices. The SRBR development team served as the intermediate stakeholder of the project, receiving directions from Samsung’s headquarters and developing new updates to improve the AI model while carrying out tests for numerous use cases.</p>
<p>&nbsp;</p>
<p>“The testing phase required extensive communication and collaboration with QA teams to optimize the user experience, and each adjustment needed further testing and review,” says Leandro Flores de Moura, Software Development Manager at SiDi. “The success of Galaxy AI’s language capabilities is built on communication and collaboration as much as it is on technical expertise,” adds Nathan Castro, QA Test Developer at SiDi.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>A Roadmap for Culture</strong></span></h3>
<p>What makes Galaxy AI particularly interesting for everyone involved is that this was never merely a language project. To them, language is a cultural guide that provides valuable insight into people’s heritage and identity.</p>
<p>&nbsp;</p>
<p>“For SiDi&#8217;s QA team, this was an endeavor that will change the world by enabling cultures to come together and overcome the difficulty of communicating in different languages,” adds Estefanía Castro Suárez, Test Developer at SiDi. “Knowing we were part of this fills us with pride and motivation.”</p>
<p>&nbsp;</p>
<p>“The way the SRBR team collaborated exemplifies what Galaxy AI sets out to achieve—making the world a smaller place through communicating, sharing and interacting with people, even those who speak different languages,” concludes Pedroso. “This capability will only grow as more languages come on board with Galaxy AI.”</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-27291 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-3-1-e1718768377321.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Join Galaxy AI-Volution Squad Today!</strong></span></h3>
<p>Purchase our latest Galaxy innovation</p>
<p>&nbsp;</p>
<ul>
<li>Samsung Galaxy S24 Ultra: <a href="https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/</a></li>
<li>Samsung Galaxy S24+ | Samsung Galaxy S24: <a href="https://www.samsung.com/my/smartphones/galaxy-s24/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24/buy/</a></li>
</ul>
<p>&nbsp;</p>
<p>To learn more about Galaxy AI, visit: <a href="https://www.samsung.com/my/galaxy-ai" target="_blank" rel="noopener">https://www.samsung.com/my/galaxy-ai</a></p>
]]></content:encoded>
																				</item>
					<item>
				<title>The Learning Curve, Part 4: A New AI Model and an Evolving Language</title>
				<link>https://news.samsung.com/my/the-learning-curve-part-4-a-new-ai-model-and-an-evolving-language?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Tue, 18 Jun 2024 09:55:31 +0000</pubDate>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Galaxy AI]]></category>
		<category><![CDATA[language]]></category>
		<category><![CDATA[Live Translate]]></category>
		<category><![CDATA[Samsung R&D Institute China]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/3XpgYvZ</guid>
									<description><![CDATA[  As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-medium wp-image-27261" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-1-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling more users to maximize their potential. Galaxy AI now supports 16 languages, so more people can expand their language capabilities, even when offline, thanks to on-device translation in features such as Live Translate, Interpreter, Note Assist and Browsing Assist. But what does AI language development involve? Last time we visited Vietnam to learn about preparing the data that is used to train AI models. This time, we’re seeing how teams made Galaxy AI a unique offering for both the Chinese mainland and Hong Kong.</p>
<p>&nbsp;</p>
<p>The rapid growth of AI tools built on large language models (LLMs) has been seen worldwide, and China is no exception. With Baidu’s ERNIE Bot and Meitu’s MiracleVision emerging as popular choices in China, Samsung R&amp;D Institute China partnered with both companies to help build Galaxy AI features for the country.</p>
<p>&nbsp;</p>
<p>Samsung R&amp;D Institute China in Guangzhou (SRC-G) and Beijing (SRC-B) worked to ensure Mandarin speakers in China had the same Galaxy AI experience as other users around the world, despite the back-end technology looking very different. The team took advantage of dedicated Chinese-dialect resources from third-party partners and built a unique Galaxy AI solution for China.</p>
<p>&nbsp;</p>
<p>“We have the advantage of blending global best practices with China’s local practices, as well as creating new features and constantly improving them through daily communication with Chinese consumers,” says Hairong Zhang, Software Innovation Group Leader at SRC-G. “With rich development experience from the Galaxy S24, I’m proud of how our team cooperated with local Chinese AI companies such as Baidu and Meitu to provide a solution that resonates in China.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27262" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-2-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>At the beginning, the teams had to acclimate to each other’s working styles and iron out the initial kinks of information asymmetry. Daijun Zhang, Head of SRC-B, established a task force to ensure the project followed the development schedule and moved quickly toward its goals.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27264" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-3-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>Thanks to the Beijing team’s experience in generating large-scale models and successful collaboration with third-party partners, all the generative AI features were successfully launched in China. The result is a solution that has local relevance and market-specific features such as Touch to Search.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Expanding on Chinese To Develop for the Cantonese Dialect</strong></span></h3>
<p>Chinese for mainland China (Mandarin) arrived on Galaxy AI with the launch of the Galaxy S24 in January 2024. But the job for Samsung R&amp;D Institute China was far from finished. The team was also tasked with developing the AI model for Chinese in Hong Kong (Cantonese), a dialect that builds on the work already carried out for Mandarin but brings an entirely new set of language features to address.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27265" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-4-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>In developing for Cantonese, the China R&amp;D team faced major cultural challenges it needed to address in order to fully localize for the market. The first cultural phenomenon is Cantonese’s two distinct systems for writing and speech: Hong Kong locals use grammar and expressions similar to Mandarin when writing but adopt a completely different colloquial grammar in daily conversation. Cantonese also has nine tones of pronunciation, whereas Mandarin has four.</p>
<p>&nbsp;</p>
<p>Another cultural phenomenon is that the Cantonese dialect itself develops with the times. Add to that the fact that people often blend Cantonese and English into conversations, and it’s clear to see why it was complicated to create test cases and validate language packs.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27266" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-5-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>“Cantonese is a very unique dialect that varies in different Cantonese-speaking regions,” says Jing Li, who leads the operation for testing the Cantonese AI solution. “Some of the slang, phrases, vocabulary and even the tones are varied from place to place. Therefore, we conducted a large amount of work in verifying the Hong Kong-specific data, as well as proofreading tens of thousands of relevant test cases.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27267" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-6-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>With these complexities in mind, SRC-G and SRC-B worked together to support deep code-mixing of Cantonese and English in speech recognition, while simultaneously supporting both written and spoken expressions in machine translation and reflecting current pronunciations in speech synthesis.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Cultural Impact of Communication</strong></span></h3>
<p>When Galaxy AI launched the Chinese (Hong Kong) language option, customer feedback confirmed that the Samsung R&amp;D team’s hard work had paid off.</p>
<p>&nbsp;</p>
<p>For both the Chinese mainland and Hong Kong, Samsung’s Galaxy AI activities show the importance of a global brand having a local presence and expertise, as well as the power of open collaboration with other organizations. In Hong Kong, Cantonese is a key part of the cultural identity of those who live there. That’s why it was so important for the team to get the AI language model right.</p>
<p>&nbsp;</p>
<p>“Language and communication are crucial in every region and in all walks of life,” says Henry Wat, Head of Engineering Group at Samsung Electronics Hong Kong. “No matter the language, any tool that helps people communicate is invaluable. I believe our work is meaningful.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-medium wp-image-27268" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/Image-7-845x563.jpg" alt="" width="845" height="563" /></p>
<p>&nbsp;</p>
<p>In the next episode of The Learning Curve, we will head to Brazil to see how a team works across cultures and borders to bring Galaxy AI to more people.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Join Galaxy AI-Volution Squad Today!</strong></span></h3>
<p>Purchase our latest Galaxy innovation</p>
<p>&nbsp;</p>
<ul>
<li>Samsung Galaxy S24 Ultra: <a href="https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/</a></li>
<li>Samsung Galaxy S24+ | Samsung Galaxy S24: <a href="https://www.samsung.com/my/smartphones/galaxy-s24/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24/buy/</a></li>
</ul>
<p>&nbsp;</p>
<p>To learn more about Galaxy AI, visit: <a href="https://www.samsung.com/my/galaxy-ai" target="_blank" rel="noopener">https://www.samsung.com/my/galaxy-ai</a></p>
]]></content:encoded>
																				</item>
					<item>
				<title>The Learning Curve, Part 3: Taking AI Data From Good to Great</title>
				<link>https://news.samsung.com/my/the-learning-curve-part-3-taking-ai-data-from-good-to-great?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Mon, 10 Jun 2024 12:55:39 +0000</pubDate>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Automatic Speech Recognition]]></category>
		<category><![CDATA[Galaxy AI]]></category>
		<category><![CDATA[Live Translate]]></category>
		<category><![CDATA[Samsung R&D Institute Vietnam]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[Text-to-Speech]]></category>
                <guid isPermaLink="false">https://bit.ly/3RiPFj8</guid>
									<description><![CDATA[  Samsung is pioneering premium mobile AI experiences. To learn how Galaxy AI is maximizing the potential of its users, we are visiting Samsung Research]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-27165" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/001-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991717491.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>Samsung is pioneering premium mobile AI experiences. To learn how Galaxy AI is maximizing the potential of its users, we are visiting Samsung Research centers around the world. Now supporting 16 languages, Galaxy AI is enabling more people to expand their language capabilities, even when offline, thanks to on-device translation in features such as Live Translate, Interpreter, Note Assist and Browsing Assist. We recently visited Jordan to learn the complexities of developing an AI model for Arabic, a language with many dialects. This time, we’re going to Vietnam to explore how data is prepared to train AI models.</p>
<p>&nbsp;</p>
<p>What is the difference between a ghost, grave and mother in Vietnamese? For a language spoken by 97 million people worldwide, very little. Each word translates to “ma,” “mả” and “má,” respectively — and can only be distinguished by tone. This illustrates how difficult it can be for AI models to learn a language, considering they cannot recognize firsthand the context and emotions of conversations nor the intentions of those speaking.</p>
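The “ma” example above can be made concrete: the three words share the same base letters and differ only in their Unicode tone marks. A minimal illustration (the word-to-meaning mapping comes from the article; the code itself is purely illustrative):

```python
import unicodedata

# The three Vietnamese words from the example above. Their English
# glosses are taken from the article; only the tone mark differs.
words = {"ma": "ghost", "mả": "grave", "má": "mother"}

def base_letters(word: str) -> str:
    """Strip combining tone marks, leaving only the base letters."""
    decomposed = unicodedata.normalize("NFD", word)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

# All three words reduce to the same base letters — tone alone
# distinguishes ghost from grave from mother.
bases = {base_letters(w) for w in words}
```

This is exactly why tone recognition is make-or-break for Vietnamese ASR: the acoustic difference carried by the diacritic is the only signal separating three unrelated meanings.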
<p>&nbsp;</p>
<p>Samsung R&amp;D Institute Vietnam (SRV) used carefully refined data to help its AI model properly recognize even the most subtle differences in language.</p>
<p>&nbsp;</p>
<p>The quality of data used directly affects the accuracy of automatic speech recognition (ASR), neural machine translation (NMT) and text-to-speech (TTS) — processes that help Galaxy AI features such as Live Translate, Interpreter, Chat Assist and Browsing Assist break down language barriers.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>A Typhoon of Challenges</strong></span></h3>
<p>“Vietnamese is a complex and diverse language with rich expressions, many of which are challenging to capture,” says Ngô Hồng Thái, NMT lead at SRV. Of the 16 languages that Galaxy AI supports, Vietnamese was particularly difficult to develop.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27167" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/002-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991772803.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>“Personally, creating an AI model for Vietnamese was more daunting than our typhoons!” he adds before explaining the hurdles faced during the development process.</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27168" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/003-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991810883.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>Vietnamese is a tonal language with six distinct tones. As evident in the “ma” example above, small nuances in vocalization can drastically alter the meanings of words. Therefore, a meticulous and detailed approach was necessary.</p>
<p>&nbsp;</p>
<p>“When similar sounding words are broken down, one word consists of several short segments, or ‘frame sets’,” says Bui Ngoc Tung, ASR lead at SRV. “The AI model differentiates between the short audio frames of around 20 milliseconds to recognize what words correspond to a certain set of consecutive frames. As such, it is critical to put great effort into the early stages of the AI learning process.”</p>
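The frame-based recognition Bui describes can be sketched in a few lines. The 20-millisecond frame length comes from his quote; the sample rate and the framing code are illustrative assumptions, not Samsung’s implementation:

```python
# Split a PCM waveform into consecutive 20 ms frames — the granularity
# at which, per the description above, an ASR model classifies sound.
SAMPLE_RATE = 16_000                          # samples/second (assumed)
FRAME_MS = 20                                 # frame length from the quote
FRAME_LEN = SAMPLE_RATE * FRAME_MS // 1000    # 320 samples per frame

def split_into_frames(samples: list[float]) -> list[list[float]]:
    """Chop a waveform into consecutive 20 ms frames (tail dropped)."""
    n_frames = len(samples) // FRAME_LEN
    return [samples[i * FRAME_LEN:(i + 1) * FRAME_LEN]
            for i in range(n_frames)]

# One second of audio yields 50 frames of 320 samples each; the model
# then maps runs of consecutive frames back to words.
frames = split_into_frames([0.0] * SAMPLE_RATE)
```

A real pipeline would overlap and window these frames before feature extraction, but the core idea — words recognized from short, consecutive audio segments — is the same.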
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27169" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/004-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991847346.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>Furthermore, homophones and homonyms are common in Vietnamese. People can normally rely on context and nonverbal elements in conversations to differentiate between words that sound the same or are written the same but have different meanings. However, AI models need to be taught to accurately identify and differentiate between tones and similar words.</p>
<p>&nbsp;</p>
<p>“This isn’t a straightforward task,” Thái explains. “Apart from the amount, the data needs to be accurate to ensure the model is capable of recognizing the linguistic nuances that exist in Vietnamese.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27170" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/005-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991893728.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Rigorous Preparation</strong></span></h3>
<p>The data refinement process consists of three steps. First, the audio and text used to train the AI model must be reviewed and corrected. Then, this dataset goes through random checks for overall quality. Finally, the dataset is normalized and cleaned before use in training.</p>
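The three steps above can be sketched as a minimal pipeline. All function names and cleaning rules here are hypothetical stand-ins; in practice, step one involves human review rather than a simple filter:

```python
import random
import unicodedata

def review_and_correct(pairs):
    """Step 1: drop pairs with empty transcripts (stand-in for review)."""
    return [(audio, text) for audio, text in pairs if text.strip()]

def spot_check(pairs, sample_size=2, seed=0):
    """Step 2: draw a random sample of pairs for quality inspection."""
    rng = random.Random(seed)
    return rng.sample(pairs, min(sample_size, len(pairs)))

def normalize(pairs):
    """Step 3: Unicode-normalize (NFC), trim and lowercase transcripts."""
    return [(audio, unicodedata.normalize("NFC", text).strip().lower())
            for audio, text in pairs]

raw = [("a.wav", "  Xin chào  "), ("b.wav", ""), ("c.wav", "Cảm ơn")]
clean = normalize(review_and_correct(raw))
# The empty transcript is dropped; the rest are normalized for training.
```

The ordering matters: checking quality before normalization lets reviewers see the data as it was recorded, while the final cleaning pass guarantees the training set is consistent.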
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27171" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/006-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991930187.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>“We thoroughly performed a series of tests to check the accuracy of our dataset,” says Nguyen Manh Duy, TTS lead at SRV, who oversees database creation. “We faced a number of unexpected problems, including misspelled words in scripts and background noise or incorrect pronunciation during audio recordings. We spent significant time refining and improving our training data.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27172" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/007-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991961317.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>In addition to the unique linguistic challenges in Vietnamese, there is a lack of universally accessible data compared to more widely spoken languages. “This is another reason why the data refinement stage is so important,” he adds. “Since we had limited sources, every piece of data had to be fully reliable. There was no margin for error.”</p>
<p>&nbsp;</p>
<p><img class="alignnone size-full wp-image-27173" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/06/009-The-Learning-Curve-Part-3-Taking-AI-Data-from-Good-to-Great-e1717991997397.jpg" alt="" width="1000" height="588" /></p>
<p>&nbsp;</p>
<p>Moreover, the AI model for Vietnamese must consider both tonal and regional differences. To improve the AI model’s accuracy, the team collected vast amounts of data with Vietnam’s northern, central and southern accents — resulting in an enormous amount of information to refine and verify.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Continued Improvement</strong></span></h3>
<p>Developers at SRV completed the project after months of hard work, and Vietnamese became one of the first languages to be supported by Galaxy AI. Despite this success, the team is ceaselessly working to improve the Vietnamese Galaxy AI experience.</p>
<p>&nbsp;</p>
<p>“We’re continuing to enhance the AI model by incorporating user feedback about the relevance of words and phrases in Galaxy AI,” says Tran Tuan Minh, leader of the AI language development project at SRV. “We have just taken our first steps into a more open world — and we have so much more to explore together.”</p>
<p>&nbsp;</p>
<p>In the next episode of The Learning Curve, we will head to China to dig into how AI models are trained and fine-tuned.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Join Galaxy AI-Volution Squad Today!</strong></span></h3>
<p>Purchase our latest Galaxy innovation</p>
<p>&nbsp;</p>
<ul>
<li>Samsung Galaxy S24 Ultra: <a href="https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24-ultra/buy/</a></li>
<li>Samsung Galaxy S24+ | Samsung Galaxy S24: <a href="https://www.samsung.com/my/smartphones/galaxy-s24/buy/" target="_blank" rel="noopener">https://www.samsung.com/my/smartphones/galaxy-s24/buy/</a></li>
</ul>
<p>&nbsp;</p>
<p>To learn more about Galaxy AI, visit: <a href="https://www.samsung.com/my/galaxy-ai" target="_blank" rel="noopener">https://www.samsung.com/my/galaxy-ai</a></p>
]]></content:encoded>
																				</item>
					<item>
				<title>The Learning Curve, Part 1: Why Teaching AI New Languages Begins With Data</title>
				<link>https://news.samsung.com/my/the-learning-curve-part-1-why-teaching-ai-new-languages-begins-with-data?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Fri, 17 May 2024 10:31:49 +0000</pubDate>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Products]]></category>
		<category><![CDATA[Automatic Speech Recognition]]></category>
		<category><![CDATA[Galaxy AI]]></category>
		<category><![CDATA[Live Translate]]></category>
		<category><![CDATA[Neural Machine Translation]]></category>
		<category><![CDATA[Samsung R&D Institute Indonesia]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[Text-to-Speech]]></category>
                <guid isPermaLink="false">https://bit.ly/4amige5</guid>
									<description><![CDATA[  As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone wp-image-26766 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-Ep01-Indonesia-Key-Visual-01-1440x960-e1715844414476.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>As Samsung continues to pioneer premium mobile AI experiences, we visit Samsung Research centers around the world to learn how Galaxy AI is enabling more users to maximize their potential. Galaxy AI now supports 16 languages, so more people can expand their language capabilities, even when offline, thanks to on-device translation in features such as Live Translate, Interpreter and Note Assist. But what does AI language development involve? This series examines the challenges of working with mobile AI and how we overcame them. First up, we head to Indonesia to learn where one begins teaching AI to speak a new language.</p>
<p>&nbsp;</p>
<p><strong><img class="alignnone wp-image-26767 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-01-1440x960-e1715844462243.jpg" alt="" width="1000" height="667" /> </strong></p>
<p>The first step is establishing targets, according to the team at Samsung R&amp;D Institute Indonesia (SRIN). “Great AI begins with good-quality, relevant data. Each language demands a different way to process this, so we dive deep to understand the linguistic needs and the unique conditions of our country,” says Junaidillah Fadlil, head of AI at SRIN, whose team recently added Bahasa Indonesia (Indonesian language) support to Galaxy AI. “Local language development has to be led by insight and science, so every process for adding languages to Galaxy AI starts with us planning what information we need and can legally and ethically obtain.”</p>
<p>&nbsp;</p>
<p>Galaxy AI features such as Live Translate perform three core processes: automatic speech recognition (ASR), neural machine translation (NMT) and text-to-speech (TTS). Each process needs a distinct set of information.</p>
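The three processes above chain together into a single translation pipeline. A minimal sketch of that flow, with placeholder stage functions standing in for the real models (none of these names or outputs are Samsung’s actual APIs):

```python
def asr(audio: bytes) -> str:
    """Transcribe speech audio into source-language text (placeholder)."""
    return "selamat pagi"  # illustrative Indonesian transcript

def nmt(text: str, src: str, dst: str) -> str:
    """Translate text between languages (placeholder lookup)."""
    return {"selamat pagi": "good morning"}.get(text, text)

def tts(text: str) -> bytes:
    """Synthesize speech audio from text (placeholder waveform)."""
    return text.encode("utf-8")  # stand-in for audio bytes

def live_translate(audio: bytes, src: str, dst: str) -> bytes:
    """ASR -> NMT -> TTS, the three-stage flow described above."""
    transcript = asr(audio)
    translation = nmt(transcript, src, dst)
    return tts(translation)
```

Because each stage needs its own training data — paired audio/transcripts for ASR, parallel texts for NMT, studio voice recordings for TTS — adding one language to the pipeline means assembling three distinct datasets.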
<p>&nbsp;</p>
<p><img class="alignnone wp-image-26765 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-Ep01-Indonesia-Infographic-01-1440x960-e1715844491126.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>ASR, for instance, needs extensive recordings of speech in numerous environments, each paired with an accurate text transcription. Varying background noise levels help account for different environments. “It’s not enough just to add noises to recordings,” explains Muchlisin Adi Saputra, the team’s ASR lead. “In addition to the language data we obtained from authorized third-party partners, we must go out into coffee shops or working environments to record our own voices. This allows us to authentically capture unique sounds from real life, like people calling out or the clattering of keyboards.”</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-26768 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-02-1440x960-e1715844528717.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>The ever-changing nature of languages must also be considered. Saputra adds: “We need to keep up to date with the latest slang and how it is used, and mostly we find it on social media!”</p>
<p>&nbsp;</p>
<p>Next, NMT requires translation training data. “Translating Bahasa Indonesia is challenging,” says Muhamad Faisal, the team’s NMT lead. &#8220;Its extensive use of contextual and implicit meanings relies on social and situational cues, so we need numerous translated texts that the AI could reference for new words, foreign words, proper nouns, and idioms – any information that helps AI understand the context and rules of communication.”</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-26771 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-05-1440x960.jpg" alt="" width="1000" height="666" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-05-1440x960.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-05-1440x960-845x563.jpg 845w, https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-05-1440x960-768x511.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p>&nbsp;</p>
<p>TTS then requires recordings that cover a range of voices and tones, with additional context on how parts of words sound in different circumstances. “Good voice recordings could do half the job and cover all the required phonemes (units of sound in speech) for the AI model,” adds Harits Abdurrohman, TTS lead. “If a voice actor did a great job in the earlier phase, the focus shifts to refining the AI model to clearly pronounce specific words.”</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-26769 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-03-1440x960-e1715844577485.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Stronger Together</strong></span></h3>
<p>It takes vast resources to plan for this much data, and SRIN worked closely with linguistics experts. “This challenge requires creativity, resourcefulness and expertise in both Bahasa Indonesia and machine learning,” Fadlil reflects. “Samsung’s philosophy of open collaboration played a big part in getting the job done, as did our scale of operations and history of AI development.”</p>
<p>&nbsp;</p>
<p>Working with other Samsung Research centers around the world, the SRIN team was able to quickly adopt best practices and overcome the complexities of establishing data targets. Furthermore, collaboration was good for advancing not only technology but also culture. When the SRIN team joined their counterparts in Bangalore, India, they observed the local fasting customs, creating deeper connections and expanding their understanding of different cultures.</p>
<p>&nbsp;</p>
<p><img class="alignnone wp-image-26764 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Collage-01-1440x960-e1715844600562.jpg" alt="" width="1000" height="667" /></p>
<p>&nbsp;</p>
<p>For the team, Galaxy AI’s language expansion project took on a new significance. “We are particularly proud of our achievements here as this was our first AI project, and it won’t be our last as we continue to refine our models and improve the quality of output,” Fadlil concludes. “This expansion not only reflects our values of openness but also respects and incorporates our cultural identities through language.”</p>
<p>&nbsp;</p>
<p><em><img class="alignnone wp-image-26770 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2024/05/The-Learning-Curve-Series-EP01-Indonesia-Photo-04-1440x960-e1715844623788.jpg" alt="" width="1000" height="667" /> </em></p>
<p>In the next episode of The Learning Curve, we will head to Samsung R&amp;D Institute Jordan to speak to the team who led Galaxy AI’s Arabic language project. Tune in to learn about the complexities of building and training an AI model for a language with diverse dialects.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Interview] Movie-Quality Audio From the Comfort of Your Home: Meet the Leaders of Next-Generation 3D Audio Technology</title>
				<link>https://news.samsung.com/my/interview-movie-quality-audio-from-the-comfort-of-your-home-meet-the-leaders-of-next-generation-3d-audio-technology?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Wed, 22 Nov 2023 13:01:19 +0000</pubDate>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[3D Audio]]></category>
		<category><![CDATA[IAMF]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">https://bit.ly/49Qzt0n</guid>
									<description><![CDATA[For viewers to enjoy content such as movies, sports games and live events to the fullest, home TVs not only need excellent picture quality but impressive sound]]></description>
																<content:encoded><![CDATA[<p>For viewers to enjoy content such as movies, sports games and live events to the fullest, home TVs not only need excellent picture quality but impressive sound quality as well. Understanding this, industry leaders are pursuing 3D audio development to add a deeper level of immersion to entertainment experiences — bringing to life thrilling action scenes and capturing the quietest details of ambient noise.</p>
<p>&nbsp;</p>
<p>Put simply, 3D audio makes listeners feel as if they’re really in the middle of the action. Typically only supported at specific venues such as movie theaters or recording studios, 3D audio is making its way into homes to give viewers a new way to experience their favorite content.</p>
<p>&nbsp;</p>
<p>Samsung Electronics’ advanced research institute Samsung Research has been striving to popularize 3D audio since 2020. To this end, Samsung worked with Google to jointly develop Immersive Audio Model and Formats (IAMF) — an advanced 3D spatial audio technology that was adopted by the Alliance for Open Media (AOM)<a href="#_ftn1" name="_ftnref1"><span>[1]</span></a> in October 2023. Samsung Newsroom sat down with the team that developed IAMF to hear the story of how the technology came to life.</p>
<p>&nbsp;</p>
<div id="attachment_24512" style="width: 1010px" class="wp-caption alignnone"><img class="size-full wp-image-24512" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main1.jpg" alt="" width="1000" height="563" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main1.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main1-728x410.jpg 728w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main1-768x432.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ The Visual Technology Team at Samsung Research led the development of a standard for audio technology. (From left) SungHee Hwang, JeongHoon Park and WooHyun Nam</p></div>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>A New Way To Enjoy 3D Audio at Home</strong></span></h3>
<p>Ambient noise exists everywhere — from the sound of shoes scraping on the sidewalk to the gentle hum of cars driving down the road. While this noise may seem unnecessary in media, it’s vital in adding depth and a sense of realness. 3D audio works to seamlessly blend ambient noise with conversations and other sound effects to create a more vibrant and authentic entertainment experience.</p>
<p>&nbsp;</p>
<p>“3D audio makes you feel as if you’re really in the heart of the action by adjusting the strength, movement and vibrance of the sound,” said WooHyun Nam from the Visual Technology Team at Samsung Research. “In this way, viewers are able to enjoy a more full-bodied sound that captures the 3D aspects of the world around us.”</p>
<p>&nbsp;</p>
<div id="attachment_24513" style="width: 1010px" class="wp-caption alignnone"><img class="size-full wp-image-24513" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main2.jpg" alt="" width="1000" height="667" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main2.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main2-844x563.jpg 844w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main2-768x512.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ SungHee Hwang from Samsung Research’s Visual Technology Team</p></div>
<p>&nbsp;</p>
<p>Despite the advantages of 3D audio, it has been difficult to apply the technology to home audio devices due to technological limitations. “3D sound information from content cannot be interpreted properly by home audio systems such as TV speakers or sound bars, resulting in a slightly limited audio experience that lacks detail from the original content,” said SungHee Hwang from the Visual Technology Team at Samsung Research.</p>
<p>&nbsp;</p>
<p>To resolve this, Samsung worked diligently with Google to develop an audio solution that would allow viewers to experience content audio as intended. “If the 3D audio data can be read by device manufacturers, they can adjust the sound in audio devices — allowing for immersive audio experiences with standard TV speakers or sound bars at home,” said Nam. “By adjusting the audio to match the home device environment, listeners can experience audio just as the creator intended without any distortions or loss in quality.”</p>
<p>&nbsp;</p>
<p>A uniform standard is required to send and receive audio data smoothly between creators and device manufacturers. “Samsung and Google’s respective expertise in devices and content made the companies ideal partners to create IAMF technology,” said JeongHoon Park, Executive Vice President and Head of the Visual Technology Team at Samsung Research. “By coming together to develop this unprecedented technology, we are paving the way for consumers to enjoy 3D audio in their homes.”</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Three Characteristics of IAMF Technology: Vertical Audio, AI-Based Audio and Customized Audio</strong></span></h3>
<p>IAMF technology offers three distinct features that enhance the audio experience.</p>
<p>&nbsp;</p>
<div id="attachment_24514" style="width: 1010px" class="wp-caption alignnone"><img class="size-full wp-image-24514" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main3.jpg" alt="" width="1000" height="667" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main3.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main3-844x563.jpg 844w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main3-768x512.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ WooHyun Nam from Samsung Research’s Visual Technology Team</p></div>
<p>&nbsp;</p>
<p><strong>1. Ability To Express Sound Vertically</strong></p>
<p>Previous open-source audio codecs only supported horizontal sound expression. With IAMF technology, audio can now be expressed vertically, making sound truly multi-directional. “IAMF makes sound more realistic by allowing listeners to hear audio in front, behind or to either side of them, as well as above or below,” said Nam. “As such, when IAMF technology is applied to home TV speakers and sound bars, listeners can hear sounds such as birds flying over their heads.”</p>
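<p>The vertical dimension Nam describes is typically realized with height channels and per-channel panning gains. As a loose illustration (not IAMF’s actual renderer, and with a function name invented for this example), the constant-power panning sketch below shows how a renderer might split a sound between an ear-level and a height channel based on its elevation angle.</p>

```python
import math

def elevation_gains(elevation_deg: float) -> tuple[float, float]:
    """Constant-power pan between an ear-level and a height channel.
    elevation_deg: 0 = ear level, 90 = directly overhead.
    Returns (ear_gain, height_gain); the squared gains sum to 1,
    so perceived loudness stays constant as the source rises."""
    theta = math.radians(max(0.0, min(90.0, elevation_deg)))
    return math.cos(theta), math.sin(theta)

# A bird flying overhead: energy shifts from ear level to the height channel.
for angle in (0, 45, 90):
    ear, height = elevation_gains(angle)
    print(f"{angle:>2} deg: ear={ear:.3f} height={height:.3f}")
```

<p>In a real speaker layout these gains would feed the corresponding loudspeaker channels; IAMF itself carries far richer metadata than a single elevation angle.</p>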
<p>&nbsp;</p>
<p><strong>2. AI-Based Scene Analysis and 3D Audio Effects</strong></p>
<p>IAMF utilizes AI and deep-learning technology to analyze scenes and emphasize certain aspects of the content — adjusting audio levels for more enhanced sound throughout the viewing experience. “In TV and film, there are certain scenes where the soundtrack or background music is the main focus,” said Nam. “IAMF will balance the sound in these instances. Similarly, the technology will fine-tune audio when there is character dialogue to allow the listener to focus on the conversation.”</p>
<p>&nbsp;</p>
<p>In addition, IAMF technology provides optimal sound despite changes in the device environment. “By adjusting the scene analysis audio data according to the device environment, IAMF technology enables listeners to enjoy the content’s original sound quality on standard home TVs,” Hwang added.</p>
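<p>The dialogue balancing described above can be pictured as scene-dependent gain adjustment. The sketch below is a hypothetical simplification, not the actual IAMF scene-analysis algorithm: when dialogue is detected in a scene, the music and effects stems are ducked by a fixed amount so speech stays clear. The stem names and the -9 dB figure are assumptions for illustration.</p>

```python
def mix_gains(dialogue_active: bool, duck_db: float = -9.0) -> dict[str, float]:
    """Hypothetical scene-adaptive mix: when dialogue is detected,
    duck the music and effects stems by duck_db decibels so speech
    stays intelligible; otherwise leave every stem at unity gain."""
    duck = 10.0 ** (duck_db / 20.0)  # dB -> linear amplitude
    return {
        "dialogue": 1.0,
        "music": duck if dialogue_active else 1.0,
        "effects": duck if dialogue_active else 1.0,
    }

# A conversation scene ducks music; a music-led scene leaves it at full level.
print(mix_gains(dialogue_active=True))
print(mix_gains(dialogue_active=False))
```

<p>A production implementation would derive the dialogue flag from AI content analysis and smooth gain changes over time rather than switching instantly.</p>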
<p>&nbsp;</p>
<p><strong>3. Highly Customized Audio</strong></p>
<p>Users will be able to freely adjust sound according to their preferences with IAMF technology. Whether viewers are looking to amplify sound effects from an action scene or enhance dialogue, IAMF gives them the flexibility to customize content audio for a more personalized experience.</p>
<p>&nbsp;</p>
<div id="attachment_24515" style="width: 1010px" class="wp-caption alignnone"><img class="size-full wp-image-24515" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main4.jpg" alt="" width="1000" height="254" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main4.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main4-768x195.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ IAMF analyzes data from content and allows viewers to adjust and customize audio settings. When watching a sports game, users can directly choose whether to emphasize the commentator’s voice or the sounds of the game itself.</p></div>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>3D Audio Across the Industry With Open-Source IAMF</strong></span></h3>
<p>Open source is crucial to creating a unified standard across the industry. IAMF is the first open source-based audio technology standard adopted by AOM — meaning both corporate and independent content creators across the industry can access the technology and expand its use.</p>
<p>&nbsp;</p>
<p>“In order to allow people to freely create content with 3D audio technology, related technology needs to be open to all,” said Nam. “Providing a complete open-source framework for 3D audio, from creation to delivery and playback, will allow for even more diverse audio content experiences in the future.”</p>
<p>&nbsp;</p>
<p>Similarly, Park highlighted how IAMF technology will have a large impact on the audio landscape moving forward. “Because we live in an era dominated by content creation, IAMF will help lead, expand and transform the 3D audio ecosystem,” he said.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>Success Accomplished Through Cooperation</strong></span></h3>
<p>Research on IAMF began in 2020 and took nearly four years to complete. Through a combination of perseverance and hard work, the team was able to achieve success.</p>
<p>&nbsp;</p>
<p>“The project involved many days of nonstop work, and there were times when we needed to work at night due to the time difference between our office and Google’s,” said Hwang. “Despite these challenges, I was proud in the end. It was a great opportunity to work with developers from other countries.”</p>
<p>&nbsp;</p>
<p>The teams developed a sense of camaraderie along the way and built a solid foundation of trust and understanding. “A Google employee mentioned how this partnership with Samsung has been the most enjoyable collaboration he had completed so far,” Park recalled. “We were able to accomplish this project because our teams trusted each other. I’m very proud of and thankful for our team members’ efforts and their cooperation.”</p>
<p>&nbsp;</p>
<div id="attachment_24516" style="width: 1010px" class="wp-caption alignnone"><img class="wp-image-24516 size-full" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main5.jpg" alt="" width="1000" height="667" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main5.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main5-844x563.jpg 844w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main5-768x512.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ JeongHoon Park, Corporate EVP and Head of the Visual Technology Team at Samsung Research</p></div>
<p>&nbsp;</p>
<p>Additionally, Park praised each team member’s unique skills and expertise which helped contribute to the creation of IAMF. “WooHyun Nam was instrumental in proposing the technology development ideas while SungHee Hwang organized and documented our team’s ideas in a clear-cut manner,” he said.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;"><strong>A Vision for ‘Samsung Sound’</strong></span></h3>
<p>Upon its release, the Samsung Research team realized that the standardization of 3D spatial audio marked a new era of sound technology.</p>
<p>&nbsp;</p>
<p>“Thanks to IAMF, we can begin researching technologies that will open up a world of audio possibilities,” said Nam. In line with this, the Samsung Research team is currently developing an advanced version of IAMF technology that can be applied to different sectors such as mobile devices, the metaverse, video games and more.</p>
<p>&nbsp;</p>
<div id="attachment_24517" style="width: 1010px" class="wp-caption alignnone"><img class="size-full wp-image-24517" src="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main6.jpg" alt="" width="1000" height="667" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main6.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main6-844x563.jpg 844w, https://img.global.news.samsung.com/my/wp-content/uploads/2023/11/3D-Audio-Tech_main6-768x512.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /><p class="wp-caption-text">▲ With their combined efforts and unrelenting persistence, the Visual Technology Team at Samsung Research was able to successfully create a new standard for 3D audio.</p></div>
<p>&nbsp;</p>
<p>Given the success of IAMF, the Samsung Research team is motivated to create better audio technology for consumers. Each of the team members shared their aspirations for the future of 3D audio.</p>
<p>&nbsp;</p>
<p>Nam touched on his desire to create audio technology that is more immersive. “I want to create a more advanced 3D audio technology that makes users feel like they are truly in the scene of a movie, TV show or live event,” he said. “I also hope to continue this research until 3D audio is applied to Samsung’s smartphones.”</p>
<p>&nbsp;</p>
<p>More broadly, Hwang discussed how he hopes Samsung will be able to create unrivaled audio technology that will place it among the ranks of other leading audio companies. “My goal is to develop a technology that consumers will be able to easily distinguish as ‘Samsung Sound’ when they hear it,” he said. “I am optimistic that the IAMF standard is a stepping stone toward this dream.”</p>
<p>&nbsp;</p>
<p>“My hope is that Samsung’s sound technology will enable consumers to enjoy an upgraded audio experience that is on par with today’s visual experiences,” said Park. “I hope that content creators will utilize Samsung’s audio technology to make 3D audio content accessible. Additionally, we aim to create a supportive environment that enables researchers to take on bold and exciting challenges, such as expanding Samsung’s audio technology.”</p>
<p>&nbsp;</p>
<p>Technological standards such as IAMF will make entertainment more immersive for viewers across the board. The Samsung Research team is paving the way for innovations that will change the future of audio.</p>
<p>&nbsp;</p>
<p>To learn more, please visit: <span><a href="https://www.samsung.com/my/tvs/all-tvs/?tv-soundbar-bundles">https://www.samsung.com/my/tvs/all-tvs/?tv-soundbar-bundles</a></span></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h6><em><a href="#_ftnref1" name="_ftn1">[1]</a> A non-profit industry consortium formed to develop and share technology for multimedia delivery, allowing creators to utilize the technology without worrying about cost. It is operated by an alliance of 38 companies including Samsung Electronics, Google, Amazon, Apple and Meta.</em></h6>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Further Expands Health and Wellness Ecosystem With Even More Connected and Diverse Health Services</title>
				<link>https://news.samsung.com/my/samsung-further-expands-health-and-wellness-ecosystem-with-even-more-connected-and-diverse-health-services?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Tue, 20 Dec 2022 13:53:40 +0000</pubDate>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Fall Detection API]]></category>
		<category><![CDATA[Galaxy Watch]]></category>
		<category><![CDATA[Galaxy Wearable]]></category>
		<category><![CDATA[Health Connect]]></category>
		<category><![CDATA[Health SDK]]></category>
		<category><![CDATA[Samsung Developer Conference]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[SDC22]]></category>
                <guid isPermaLink="false">https://bit.ly/3G3qV9w</guid>
									<description><![CDATA[Samsung Electronics unveiled a series of new tools ahead of the Samsung Developer Conference 2022 (SDC22) that will aid developers and communities in shaping]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics unveiled a series of new tools ahead of the Samsung Developer Conference 2022 (SDC22) that will aid developers and communities in shaping health, wellness and safety habits for consumers everywhere. These tools include the Samsung Privileged Health SDK program for select partners, the Fall Detection API to support user safety, Samsung’s new end-to-end research solution for educational, clinical and healthcare programmers, as well as ongoing opportunities for partners with Health Connect.</p>
<p>&nbsp;</p>
<p>“Samsung’s health foundation is rooted in our advanced hardware and sensor technology and is bolstered by our open ecosystem and collaborative approach,” said TaeJong Jay Yang, Executive Vice President and Head of the Health R&amp;D Team at Mobile eXperience Business, Samsung Electronics. “That is why I am excited to announce expanded developer tools, APIs and partner offerings that allow third-party experts, research centers and universities to advance wearable tracking and insight capabilities across a wider range of health, wellness and safety experiences.”</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Samsung Privileged Health SDK: Advancing Preventative Safety in Driving</span></h3>
<p>Samsung is collaborating with select industry leaders to develop personalized health services. With safety as one of Samsung’s focus areas, these collaborations will deliver all-new preventative tools to make roads safer by identifying early signs of driver fatigue and stress.</p>
<p>&nbsp;</p>
<p>Samsung’s work with Tobii, a global leader in eye tracking and attention computing that develops software to monitor drowsiness, is one example. If users opt in, Samsung’s Privileged Health SDK enables Tobii to access real-time heart rate data captured by Galaxy Watch sensors and process it to determine the user’s level of drowsiness. Harman, a leader in connected car technology, audio innovation and IoT solutions, has likewise introduced Ready Care, a complete automotive in-cabin sensing solution tailored for driver safety and comfort. In addition to sensing distraction, fatigue and vital signs, Ready Care can, with the user’s permission, measure cognitive load and stress levels and suggest alternative routes to lower a driver’s stress.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Health Services: New Fall Detection API for Enhanced Safety Capabilities</span></h3>
<p>Health Services provides APIs that allow developers and third parties to harness the powerful sensors and algorithms on the Galaxy Watch for more accurate and advanced health offerings. Samsung is introducing a new API to the ecosystem that offers adjustable sensitivity for Fall Detection, strengthening preventative safety and setting the stage for even more potential offerings.</p>
<p>&nbsp;</p>
<p>Developers can now create services using Galaxy Watch’s Fall Detection algorithms, which sense a user’s stumble or fall by combining different sensors, including an accelerometer and gyroscope. This API enables the development of apps for users who may encounter unexpected accidents. Furthermore, Galaxy users can adjust sensitivity levels — whether standing still, moving or exercising — via the Galaxy Wearable app. Like sleep tracking that senses restless movements or falls from bed to paint a more complete picture of a user’s sleep, this API opens up all-new opportunities for developers and users alike.</p>
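<p>Samsung’s Fall Detection algorithms themselves are not public, but the general idea of combining motion signals with an adjustable threshold can be sketched generically. The toy detector below is purely illustrative (not the Galaxy Watch implementation): it flags a near-free-fall dip in accelerometer magnitude followed by an impact spike, with a sensitivity setting that scales the impact threshold. All constants are assumptions.</p>

```python
import math

def detect_fall(samples, sensitivity: float = 1.0) -> bool:
    """Toy fall detector over accelerometer samples (ax, ay, az) in g.
    Flags a fall when total acceleration dips near 0 g (free fall)
    and is later followed by an impact spike. A higher sensitivity
    lowers the impact threshold, mimicking an adjustable setting."""
    impact_threshold = 2.5 / sensitivity  # g; an assumed constant
    free_fall_seen = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < 0.3:                      # near free fall
            free_fall_seen = True
        elif free_fall_seen and magnitude > impact_threshold:
            return True                          # impact after free fall
    return False

# Quiet standing (steady 1 g) vs. a drop followed by an impact:
still = [(0.0, 0.0, 1.0)] * 10
fall = still + [(0.0, 0.0, 0.1)] * 3 + [(0.0, 0.0, 3.0)] + still
```

<p>Production algorithms would additionally fuse gyroscope data and check for post-impact stillness to reduce false alarms, in line with the multi-sensor approach the article describes.</p>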
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">End-to-End Research Solution: Accelerating Medical Service and Research</span></h3>
<p>Samsung is also introducing an open-source project that provides an end-to-end solution combining an SDK, a backend system and a portal for a variety of health applications. This all-in-one healthcare toolset empowers research and clinical experiences — starting at the developer level — by delivering advanced insights through Galaxy Watches and wearables to support medical research at institutions, hospitals, wellness centers and beyond.</p>
<p>&nbsp;</p>
<p>Researchers can use this open-source project to build modules that enable participants to join a study in a frictionless and informed way, streamlining the onboarding process. Flexible survey templates are also in place to satisfy the ever-changing needs of organizations. From there, relevant data and insights are gathered more easily from wearable devices, while participants are guided with actionable insights throughout the process. Encrypted participant data is then transmitted to the backend system, where health and medical communities monitor and analyze the collected insights to inform patient treatment plans, upcoming research and much more. More details of the new project will be announced at the SDC22 keynote by Sebastian Seung, President and Head of Samsung Research.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Health Connect: Greater Insights and Data Control</span></h3>
<p>Health Connect, announced in collaboration with Google during the spring of 2022, offers developers a single set of APIs enabling the seamless creation of new fitness and health experiences. Currently available in beta, Health Connect offers users centralized privacy controls that make it easy to give permission to the health and fitness apps they want to share their on-device data with. Now, Samsung Health and Fitbit, along with leading health and fitness apps including Leap Fitness, MyFitnessPal and Withings have adopted Health Connect to enable a more impactful and holistic wellness experience. And with a user’s permission, app developers can leverage certain data shared through Health Connect for use in their own applications in order to provide users with a more complete picture of their health.</p>
<p>&nbsp;</p>
<p>Samsung is committed to giving its community<span> </span><span>— </span>including users, third parties, developers and beyond<span> </span><span>— </span>enhanced tools and resources to build and benefit from more connected and diverse health services. As part of Samsung’s overarching wellness initiatives, the company will continue to deliver an expansive ecosystem<span> </span><span>— </span>which the Samsung community will hear more about during the SDC22<span> </span><span>— </span>that empowers users with more insights and information on health and wellbeing.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Showcases Evolution of SmartThings and Introduces New Device Experiences at SDC22</title>
				<link>https://news.samsung.com/my/samsung-showcases-evolution-of-smartthings-and-introduces-new-device-experiences-at-sdc22?utm_source=rss&amp;utm_medium=direct</link>
				<pubDate>Tue, 18 Oct 2022 14:38:53 +0000</pubDate>
						<category><![CDATA[Home Appliances]]></category>
		<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[TVs & Displays]]></category>
		<category><![CDATA[Bixby]]></category>
		<category><![CDATA[Galaxy Watch]]></category>
		<category><![CDATA[Health SDK]]></category>
		<category><![CDATA[Home Connectivity Alliance]]></category>
		<category><![CDATA[One UI 5]]></category>
		<category><![CDATA[Samsung Developer Conference]]></category>
		<category><![CDATA[Samsung Health Stack]]></category>
		<category><![CDATA[Samsung Knox Matrix]]></category>
		<category><![CDATA[Samsung Research]]></category>
		<category><![CDATA[Samsung Smart TV]]></category>
		<category><![CDATA[Samsung TV PLUS]]></category>
		<category><![CDATA[SDC22]]></category>
		<category><![CDATA[SmartThings]]></category>
		<category><![CDATA[SmartThings Energy]]></category>
		<category><![CDATA[Tizen]]></category>
                <guid isPermaLink="false">https://bit.ly/3MCoX1r</guid>
									<description><![CDATA[  Samsung Electronics held its annual Samsung Developer Conference (SDC) today in San Francisco, bringing together developers, creators and designers to]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-18644" src="https://img.global.news.samsung.com/my/wp-content/uploads/2022/10/SDC22-Press-Release_main1.jpg" alt="" width="1000" height="666" srcset="https://img.global.news.samsung.com/my/wp-content/uploads/2022/10/SDC22-Press-Release_main1.jpg 1000w, https://img.global.news.samsung.com/my/wp-content/uploads/2022/10/SDC22-Press-Release_main1-845x563.jpg 845w, https://img.global.news.samsung.com/my/wp-content/uploads/2022/10/SDC22-Press-Release_main1-768x511.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p>&nbsp;</p>
<p>Samsung Electronics held its annual Samsung Developer Conference (SDC) today in San Francisco, bringing together developers, creators and designers to explore seamless, connected experiences powered by Samsung.</p>
<p>&nbsp;</p>
<p>During the event, Samsung shared more about its commitment to creating simplified, game-changing customer experiences, including the company’s updated vision for SmartThings as it evolves from a connectivity platform to an enabler of smarter lifestyles<em>. </em>From deeper integration with Bixby to seamless connectivity with Matter-compatible devices, SmartThings is creating a richer, more open world that empowers all users to streamline their connections and their daily lives.</p>
<p>&nbsp;</p>
<p>“At Samsung, we continuously innovate our devices, platforms and services to be simpler and more convenient. I am proud to share the next generation of our work, like SmartThings, to further our collaboration with partners and developers,” said Jong-Hee (JH) Han, Vice Chairman, CEO and Head of Device eXperience (DX) Division at Samsung Electronics. “As technologies become more complex, we will always search for ways to make life easier, more connected, and more flexible, so our consumers can focus on what matters most.”</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Connecting Platforms and Devices With Calm Technology</span></h3>
<p>Samsung’s vision of seamless connectivity is inspired by the philosophy of Calm Technology — where devices instantly work together so consumers can save time on setup and get right into the experience. To realize this vision, Samsung reimagined <strong>SmartThings</strong> and its connected services and partnerships. These include <strong>Hub Everywhere</strong>, which extends hub capabilities to the entire smart home using audio and visual data, as well as <strong>SmartThings Energy</strong>, <strong>SmartThings Pet</strong> and <strong>SmartThings Cooking</strong>. Samsung also partnered with Philips Hue to integrate <strong>Philips Hue Sync</strong> right into Galaxy devices so smart home lighting can sync with music.</p>
<p>&nbsp;</p>
<p>The seamless experience will extend beyond Samsung’s ecosystem through SmartThings’ integration with <strong>Matter</strong> and Samsung’s membership in the <strong>Home Connectivity Alliance</strong>. Google and Samsung have worked together to enable users to find and link their devices across platforms by building a multi-admin feature into Matter devices. The collaboration will bring more devices and users into the connected home in the future.</p>
<p>&nbsp;</p>
<p><strong>Bixby</strong> has been integrated more deeply with SmartThings, giving developers a tool to build more intelligent voice user interface experiences. As Samsung’s flagship voice assistant platform, Bixby has evolved into an on-device AI solution that can control individual Galaxy devices as well as cross-device experiences throughout the Samsung ecosystem. With the new <strong>Bixby Home Studio</strong>, developers can now build differentiated, customized experiences for the SmartThings platform. In addition, Bixby will become even more widely available with support for Latin American Spanish, starting in November.</p>
<p>&nbsp;</p>
<p>As the smart home becomes more advanced, Samsung is introducing a new security paradigm that enables Samsung devices to protect each other. <strong>Samsung Knox Matrix<a href="#_ftn1" name="_ftnref1"><span><sup>[1]</sup></span></a> </strong>is a private blockchain-based platform that turns eligible Samsung devices into a shield protecting a user’s entire device ecosystem — from Galaxy devices to TVs and home appliances. Plus, users can customize personal privacy settings with the new Security and privacy dashboard, which scans for vulnerabilities, recommends security updates and gives users data management options to keep privacy and security top of mind.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Personalized Services for Better Experiences</span></h3>
<p>In the past year, <strong>Samsung TV Plus — Samsung’s free, ad-supported streaming TV and video (FAST) platform available on Samsung Smart TVs</strong><a href="#_ftn2" name="_ftnref2"><span><sup>[2]</sup></span></a> and mobile devices — reached 100% growth in viewership<a href="#_ftn3" name="_ftnref3"><span><sup>[3]</sup></span></a> and is projected to reach three billion hours streamed by the end of the year. To build on this momentum, Samsung TV Plus has expanded its offering, adding partnerships with Lionsgate and Vice Media and providing 8K video on demand. Samsung TV Plus has also been redesigned to reflect its extraordinary variety of content, with more than 1,600 channels across 24 countries. Samsung TV Plus is the premium choice for a seamless ad experience, where stream stitching makes ad playback simpler than ever.</p>
<p>&nbsp;</p>
<p>After 10 years of service, Samsung’s <strong>Tizen</strong> operating system (OS) continues to offer best-in-class user experiences. The <strong>Samsung Gaming Hub</strong> brings better, faster and more convenient access to gaming on Samsung Smart TVs, provided by industry-leading partners, such as Xbox, NVIDIA GeForce NOW, Utomik and Amazon Luna. Samsung Gaming Hub bridges expertise in hardware and software to integrate features like AI Upscaling and multitask functions to make gaming on Samsung Smart TVs an immersive, optimized experience.</p>
<p>&nbsp;</p>
<p>Tizen OS is also expanding to offer <strong>NFT</strong> support with partners like Art Token, laCollection and Nifty Gateway. For B2B customers and developers, Samsung is providing <strong>B2B APIs</strong> specific to industries and use cases, such as Syncplay, which allows content to be synchronized and played across multiple signage displays. Tizen is also introducing <strong>SALT</strong>, a new content conversion solution to display the highest-quality HDR10+ content on supported TVs.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Your Galaxy, Your Way With One UI 5</span></h3>
<p>By introducing the new <strong>One UI 5</strong>, Samsung has empowered users to easily customize their device’s look and feel, while enhancing productivity and providing amazing experiences across devices and platforms. One UI 5 brings more personalized experiences with custom-built <strong>Modes and Routines</strong>, and a <strong>Dynamic Lock screen</strong> that displays multiple visuals on your phone, your Galaxy Watch and other One UI 5 devices. With the new <strong>Bixby Text Call</strong> feature, Bixby Voice will answer calls on your behalf and share typed messages with the caller, speaking aloud as if you had answered. One UI 5 also introduces new daily health solutions to help track health and wellness in one place. This includes the <strong>Samsung Privileged Health SDK</strong>, which enables developers to build apps that leverage the BioActive Sensor on the Galaxy Watch.<a href="#_ftn4" name="_ftnref4"><span><sup>[4]</sup></span></a></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h3><span style="color: #000080;">Future-Forward Investment and Research</span></h3>
<p>Samsung Research is looking beyond today’s horizon to build a better tomorrow through relentless innovation and open collaboration. Samsung Research is making its robot arm manipulation code available on GitHub, enabling academics, researchers and developers to examine new ideas in robotic manipulation.</p>
<p>&nbsp;</p>
<p>The team at Samsung Research was also inspired by the new wellness features Samsung has introduced, particularly on the Galaxy Watch5. To find new use cases leveraging these features, Samsung Research is working with universities and healthcare institutions to explore areas like heart health, stress, blood pressure, lung health and neuroscience. To support this work, Samsung is offering a full-stack SDK — the <strong>Samsung Health Stack</strong> — which will jumpstart research into important health fields and spark new development.</p>
<p>&nbsp;</p>
<p>Samsung invites developers, creators, partners and more to join in on its commitment to open collaboration and seamless experiences. For more information about the Samsung Developer Conference (SDC) 2022, please visit <a href="https://developer.samsung.com/" target="_blank" rel="noopener">developer.samsung.com</a>.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h6><em><a href="#_ftnref1" name="_ftn1"><sup>[1]</sup></a> Starting as early as late 2023, Samsung Knox Matrix will be initially available on selected Samsung devices. Support for other manufacturers’ products will follow later.</em></h6>
<h6><em><a href="#_ftnref2" name="_ftn2"><sup>[2]</sup></a> Samsung TV Plus is available for selected Samsung Smart TVs only.</em></h6>
<h6><em><a href="#_ftnref3" name="_ftn3"><sup>[3]</sup></a> Viewership from September 2021 to August 2022 compared to viewership from September 2020 to August 2021</em></h6>
<h6><em><a href="#_ftnref4" name="_ftn4"><sup>[4]</sup></a> Available on Galaxy Watch4 series and Galaxy Watch5 series.</em></h6>
]]></content:encoded>
																				</item>
			</channel>
</rss>
