<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet title="XSL_formatting" type="text/xsl" href="https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss.xsl"?><rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wfw="http://wellformedweb.org/CommentAPI/"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
     xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/"
	>
	<channel>
		<title>Artificial Intelligence &#8211; Samsung Global Newsroom</title>
		<atom:link href="https://news.samsung.com/global/tag/artificial-intelligence/feed" rel="self" type="application/rss+xml" />
		<link>https://news.samsung.com/global</link>
        <image>
            <url>https://img.global.news.samsung.com/image/newlogo/logo_samsung-newsroom.png</url>
            <title>Artificial Intelligence &#8211; Samsung Global Newsroom</title>
            <link>https://news.samsung.com/global</link>
        </image>
        <currentYear>2024</currentYear>
        <cssFile>https://news.samsung.com/global/wp-content/plugins/btr_rss/btr_rss_xsl.css</cssFile>
		<description>What's New on Samsung Newsroom</description>
		<lastBuildDate>Thu, 02 Apr 2026 18:21:43 +0000</lastBuildDate>
		<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
					<item>
				<title>Travel Just Got a Whole Lot Smarter, Thanks to Samsung Galaxy S24</title>
				<link>https://news.samsung.com/global/travel-just-got-a-whole-lot-smarter-thanks-to-samsung-galaxy-s24</link>
				<pubDate>Mon, 29 Jan 2024 18:00:21 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2024/01/Travel-Just-Got-a-Whole-Lot-Smarter-S24_Thumbnail-728_F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Galaxy AI]]></category>
		<category><![CDATA[Galaxy S24]]></category>
		<category><![CDATA[Galaxy S24 Series]]></category>
		<category><![CDATA[Galaxy S24 Ultra]]></category>
		<category><![CDATA[Galaxy Unpacked January 2024]]></category>
                <guid isPermaLink="false">https://bit.ly/3OauJJt</guid>
									<description><![CDATA[Traveling is often filled with unexpected challenges and complexities, but it just got easier. Samsung Galaxy S24 series, powered by Galaxy AI, is a personal travel guide in your pocket making the experience smoother and more accessible than ever. Considering a weekend in Paris? Thanks to the 2024 Summer Olympics, that will be even more […]]]></description>
																<content:encoded><![CDATA[<p>Traveling is often filled with unexpected challenges and complexities, but it just got easier. Samsung Galaxy S24 series, powered by Galaxy AI, is a personal travel guide in your pocket making the experience smoother and more accessible than ever.</p>
<p>Considering a weekend in Paris? Thanks to the 2024 Summer Olympics, the city will be even more popular this year, making it a more challenging destination. Here are all the ways in which Galaxy AI can ease the journey at every step, from planning and communicating better to discovering cultures and creating great memories.</p>
<h3><span style="color: #000080"><strong>THREE MONTHS OUT: You build an (intelligent) three-day itinerary.</strong></span></h3>
<p>AI can help eliminate travel indecision and answer the question: Where should I go?</p>
<p>As you scroll through social media, for example, a photo highlighting an exhibition at Paris’ Musée d’Orsay grabs your attention. Rather than switching apps to learn more, you use Galaxy S24’s Circle to Search with Google feature (available January 31), which lets you act on inspiration the moment it strikes.</p>
<p>You press and hold the Home button to activate the feature. The new experience, which brings Google’s latest AI capabilities to Galaxy users, allows you to search what’s displayed on your screen, including images, videos or text.<sup>1</sup> With your finger or Samsung S Pen, you can simply draw a circle around what you’re curious about—in this case, a social media post—and Circle to Search quickly pulls up relevant information, such as details about the exhibition.</p>
<p>For certain searches, this feature also allows you to get AI-powered overviews so you can more easily understand concepts, ideas or topics from helpful information pulled together from across the web. In this way, well ahead of your travels, AI adds a new dimension of preparation, helping you follow your curiosity with confidence.</p>
<div id="attachment_148685" style="width: 1010px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-148685" class="wp-image-148685 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2025/01/Travel-Just-Got-a-Whole-Lot-Smarter-S24_main1.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-148685" class="wp-caption-text">▲ On Galaxy S24, Circle to Search with Google lets you follow your inspirations directly and intuitively.</p></div>
<h3><span style="color: #000080"><strong>DAY ONE: You make connections across language barriers, thanks to your personal pocket translator.</strong></span></h3>
<p>Settling into your hostel, you’re keen to meet people and get all the hot tips on where to eat and what to see. But while travel is an exciting experience, it can be overwhelming if you don’t speak the language. In the hostel’s lounge area, you use Galaxy S24’s <strong>Interpreter</strong> feature to sidestep the language differences and converse with your fellow travelers in real time and in person, asking them for sightseeing recommendations with a local spin. On one side of the screen is the translated audio text of the person you’re speaking to, and on the other side is your part of the conversation translated for them.</p>
<p>Having made plans to meet up with your new companions later for dinner by the Seine, you use <strong>Chat Assist’s Message Translate</strong> feature to stay in touch throughout the day and time your rendezvous perfectly. By combining Message Translate with Chat Assist’s <strong>Writing Style</strong> tool, you can automatically adjust your messages to make sure you’re using a friendly tone instead of an unintentionally stilted, formal one.</p>
<p>Thanks to mobile AI, you can make travel a more social experience, forging meaningful connections across language barriers in the moment – voilà!</p>
<div id="attachment_148686" style="width: 1010px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-148686" class="wp-image-148686 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2025/01/Travel-Just-Got-a-Whole-Lot-Smarter-S24_main2.jpg" alt="" width="1000" height="666" /><p id="caption-attachment-148686" class="wp-caption-text">▲ Say au revoir to language barriers. Interpreter’s live translation makes it easy to talk to someone in another language.</p></div>
<h3><span style="color: #000080"><strong>DAY TWO: You explore the local food and culture, powered by your own digital travel agent.</strong></span></h3>
<p>Even the best-planned vacations encounter hiccups, but with mobile AI, you’re prepared.</p>
<p>If you realize you booked your exhibition tickets at the Musée d’Orsay for the wrong day, for example, you can call the ticket office with the peace of mind of knowing you don’t need to speak French.</p>
<p>Supported in 13 languages at launch,<sup>2</sup> <strong>Live Translate</strong> takes the stress out of problem-solving in a different language. Galaxy S24 is the first phone to offer real-time, two-way call translation within its native call app. When you phone the museum, the call app offers you an option to launch Live Translate, which provides both text and audio translations instantly.</p>
<p><span>The feature also works for the person on the other end of the line, regardless of whether they’re using a Galaxy device, allowing you to rebook your ticket as easily as you would at home and head out for a day full of art and culture.</span></p>
<h3><span style="color: #000080"><strong>DAY THREE: As your trip comes to a close, your AI-powered phone helps you fine-tune your holiday pics.</strong></span></h3>
<p>Before long, you’re having one last pain au chocolat in a boulangerie at Charles de Gaulle Airport, reflecting on your all-too-brief weekend in the City of Light.</p>
<p>You want to send your hostel friends a group photo of you all in front of the Eiffel Tower, but the well-meaning passerby who took it got the composition slightly wrong. The Eiffel Tower is tilting at an angle, and your group is off to the right instead of in the center. With <strong>Generative Edit,</strong> you can go beyond the boundaries of photography and correct these issues in a couple of taps.</p>
<p>When you rotate the image to straighten up France’s most iconic structure, Generative Edit fills in the newly created blanks in the corners using intelligent outpainting technology to extend what’s already there, with no cropping required. As for you and your friends, simply select them with your finger, move them to the center and let Generative Edit’s inpainting capabilities fill in the blank space.</p>
<p>With Galaxy S24 at your side, mobile AI will change more than the logistics of travel. It will also change the traveler in profound ways, encouraging a sense of discovery, emboldening more meaningful interaction with locals and making the world feel grander, but at the same time, more accessible than ever.</p>
<p><span style="font-size: small"><em><sup>1</sup> For security reasons, Circle to Search may have limited functionality on DRM-protected applications, such as finance and banking apps.<br />
<sup>2</sup> Languages include: Chinese (simplified), English, French, German, Hindi, Italian, Japanese, Korean, Polish, Portuguese, Spanish, Thai and Vietnamese.</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung and Its Foundry Partners Reveal Solutions for a Strong Design Infrastructure at 3rd SAFE Forum 2021</title>
				<link>https://news.samsung.com/global/samsung-and-its-foundry-partners-reveal-solutions-for-a-strong-design-infrastructure-at-3rd-safe-forum-2021</link>
				<pubDate>Thu, 18 Nov 2021 06:00:18 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/SAFE-Forum_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[2.5D Technology]]></category>
		<category><![CDATA[3D Technology]]></category>
		<category><![CDATA[3nm Gate-All-Around]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[DSP]]></category>
		<category><![CDATA[EDA]]></category>
		<category><![CDATA[GAA]]></category>
		<category><![CDATA[Gate-All-Around]]></category>
		<category><![CDATA[IP]]></category>
		<category><![CDATA[Package]]></category>
		<category><![CDATA[Performance Platform 2.0]]></category>
		<category><![CDATA[SAFE™ Forum]]></category>
		<category><![CDATA[Samsung Foundry]]></category>
		<category><![CDATA[Samsung Foundry Forum]]></category>
		<category><![CDATA[Samsung Foundry Forum 2021]]></category>
		<category><![CDATA[Samsung Semiconductors]]></category>
		<category><![CDATA[Semiconductor]]></category>
		<category><![CDATA[SFF]]></category>
                <guid isPermaLink="false">https://bit.ly/3nn96bW</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, held its 3rd Annual Samsung Advanced Foundry Ecosystem (SAFETM) Forum 2021 virtually today. With the theme of ‘Performance Platform 2.0: Innovation, Intelligence, Integration’, Samsung and its foundry ecosystem partners prepared 7 plenary talks and 76 technology sessions focused on three main topics: Gate-All-Around (GAA, Innovation), Artificial […]]]></description>
																<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-128907" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/SAFE-Forum_main1.jpg" alt="" width="1000" height="566" /></p>
<p>Samsung Electronics, a world leader in advanced semiconductor technology, held its 3<sup>rd</sup> Annual Samsung Advanced Foundry Ecosystem (SAFE<sup>TM</sup>) Forum 2021 virtually today.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-128908" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/SAFE-Forum_main2.jpg" alt="" width="1000" height="544" /></p>
<p>With the theme of ‘Performance Platform 2.0: Innovation, Intelligence, Integration’, Samsung and its foundry ecosystem partners prepared 7 plenary talks and 76 technology sessions focused on three main topics: Gate-All-Around (GAA, Innovation), Artificial Intelligence (AI, Intelligence) and 2.5D/3D (Integration) technologies and the diverse design infrastructures required for high-performance applications.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-128909" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/SAFE-Forum_main3.jpg" alt="" width="1000" height="542" /></p>
<p>“In the rapidly changing data-centric era, Samsung and its foundry partners have made great strides in responding to increasing customer demand and supporting their success by providing powerful solutions,” said Ryan Lee, Senior Vice President and Head of Foundry Design Platform Development at Samsung Electronics. “With the support of our SAFE program, Samsung will lead the realization of the vision ‘Performance Platform 2.0’.”</p>
<p>Starting with a keynote live stream on November 17, attendees can explore a variety of tech sessions and engage with ecosystem partners through the virtual SAFE Forum platform for a month. To register for the SAFE Forum, please visit <a href="https://www.samsungfoundry.com" target="_blank" rel="noopener">https://www.samsungfoundry.com</a>.</p>
<h3><span style="color: #000080"><strong>SAFE 2021: Performance Platform 2.0</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-128910" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/SAFE-Forum_main4.jpg" alt="" width="1000" height="546" /></p>
<p>Samsung has concentrated on expanding its foundry ecosystem by focusing on IP, Electronic Design Automation (EDA), Cloud, Design Solution Partner (DSP) and Package solutions necessary for today’s data-driven era. Samsung today introduced its latest SAFE<sup>TM</sup> program, including:</p>
<ul>
<li><span style="font-size: 14pt"><strong>SAFE<sup>TM</sup>-IP & EDA:</strong> Samsung and its foundry ecosystem have secured over 3,600 IPs and 80 certified EDA tools. These are developed and verified through the high-standard certification program run by Samsung and participated in by our partners. To meet the demands of high-performance applications, Samsung’s foundry ecosystem has developed not only HPC-specific foundation IPs, including standard cell libraries and memory compilers, but also key IPs such as over-100Gbps Serializer-Deserializer (SerDes) interfaces and 2.5D/3D multi-die integration solutions.<br />
<br />
With our EDA partners, Samsung has secured design tools optimized for its unique 3-nanometer (nm) GAA process technology and a design methodology for integrating multiple dies in 2.5D/3D. Customers can also utilize AI- and machine learning-based EDA technology to systematically manage and analyze design data. To overcome the increasing difficulties of chip design and analysis, Samsung has strengthened cooperation with partners to develop EDA tools and related technologies, such as incorporating GPUs that can efficiently use the computing resources required for chip verification.</span></li>
</ul>
<ul>
<li><span style="font-size: 14pt"><strong>SAFE<sup>TM</sup>-OSAT:</strong> Samsung plans to lead ‘beyond-Moore’ technologies by strengthening various package line-ups such as 2.5D/3D through the expansion of its SAFE-Outsourced Semiconductor Assembly and Test (OSAT) ecosystem. The recent announcement of the co-development of the Hybrid-Substrate Cube (H-Cube) solution, which offers efficient integration of 6 HBMs and cost benefits, is one of the successful examples of Samsung Foundry’s collaboration with the OSAT community.</span></li>
</ul>
<ul>
<li><span style="font-size: 14pt"><strong>SAFE<sup>TM</sup>-Cloud Design Platform</strong>: SAFE<sup>TM</sup>-CDP, the cloud-based one-stop design platform introduced last year, now supports a hybrid cloud function that can be linked to customers’ conventional design environments.</span></li>
</ul>
<ul>
<li><span style="font-size: 14pt"><strong>SAFE<sup>TM</sup>-DSP</strong>: Through the SAFE<sup>TM</sup>-DSP ecosystem, Samsung and its global partners can actively support global fabless companies in implementing their design ideas into custom products by utilizing cutting-edge process technologies as well as high-performance, low-power chip design knowledge.</span></li>
</ul>
<p><img loading="lazy" class="alignnone size-full wp-image-128901" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/11/Image-5.SAFE-Forum.jpg" alt="" width="2400" height="1300" /></p>
<p><strong>[Quote from SAFE<sup>TM</sup> Partner companies]</strong></p>
<ul>
<li><strong> <em>Ansys, </em></strong><em>Ajei Gopal</em><em>, CEO </em></li>
</ul>
<p>“Today’s chips demand a full multiphysics approach, which requires engineering simulation. Ansys is proud to partner with Samsung to deliver a comprehensive multi-physics analysis flow for Samsung’s multi-die integration initiative. The benefits to joint customers, to the industry – and to the entire world – are tremendous. Semiconductors will drive innovations as varied as autonomous and electric vehicles, artificial intelligence, and mobile technologies, including 5G and beyond.”</p>
<ul>
<li><strong> <em>Arm, </em></strong><em>Simon Segars, CEO</em></li>
</ul>
<p>“Our longstanding partnership with Samsung Foundry has been essential for growing business opportunities in many markets for our combined partner ecosystem. This close collaboration continues as we work together to optimize our Armv9 next-generation processors on Samsung Foundry’s leading-edge processes, including GAA, to deliver a best-in-class solution that is optimized for the world of today, and the technologies of tomorrow. Together, we are unlocking new opportunities across HPC, Automotive, AI, and IoT, while also managing rising complexities, enabling faster time to market.”</p>
<ul>
<li><strong><em>Cadence, </em></strong><em>Lip-Bu Tan, CEO</em></li>
</ul>
<p>“The Cadence Intelligent System Design strategy is very well-aligned with Samsung Foundry’s Performance Platform 2.0 with common themes of innovation, pervasive intelligence and integrated solutions. Together, we’re enabling customers to develop and deliver innovative, breakthrough products using Samsung’s most advanced process and packaging technologies, and we look forward to continuing our work with Samsung Foundry to accelerate design successes.”</p>
<ul>
<li><strong><em>Siemens EDA, </em></strong><em>A. </em><em>J. </em><em>Incorvaia, Senior Vice President</em></li>
</ul>
<p>“The Samsung SAFE event provides an exceptionally valuable venue for the Samsung Foundry ecosystem to meet, share information and identify opportunities to fully leverage Samsung’s cutting-edge process technologies. Siemens EDA looks forward to this year’s Samsung SAFE event and the many opportunities it presents for collaborating with customers and partners to eliminate design obstacles and enhance silicon success.”</p>
<ul>
<li><em><strong>Synopsys, </strong>Sassine Ghazi, president and COO </em></li>
</ul>
<p>“We see exciting times ahead as software and chip technology come together to create world-changing new products,” said Sassine Ghazi, president and COO of Synopsys. “We have strong programs with Samsung Foundry on 3nm gate-all-around enablement, broad IP certification, AI-assisted chip design and 2.5D/3D multi-die design to name just a few. We welcome the strong collaboration opportunities offered by the Samsung SAFE initiative.”</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/IXZWwPTFeZ0?rel=0" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"></iframe></div>
]]></content:encoded>
																				</item>
					<item>
				<title>[Into the Future With Samsung Research ③] Samsung R&D Institute China – Beijing: Underlining Game-Changing Technologies for Users With Fundamental Research Into Machine Learning</title>
				<link>https://news.samsung.com/global/into-the-future-with-samsung-research-3-samsung-rd-institute-china-beijing-underlining-game-changing-technologies-for-users-with-fundamental-research-into-machine-learning</link>
				<pubDate>Thu, 07 Oct 2021 11:00:07 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2021/10/Samsung-Research-China-Beijing_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Expert Voices]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bin Dai]]></category>
		<category><![CDATA[Into the future]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Research and Development]]></category>
		<category><![CDATA[Samsung R&D Institute]]></category>
		<category><![CDATA[Samsung R&D Institute China-Beijing]]></category>
		<category><![CDATA[SRC-B]]></category>
                <guid isPermaLink="false">https://bit.ly/3iB8ZqA</guid>
									<description><![CDATA[Following Episode 2 In this relay series, Samsung Newsroom is introducing tech experts from Samsung’s R&D centers around the globe to hear more about the work they do and the ways in which it is directly improving the lives of consumers. The third expert in the series to be introduced is Bin Dai, Staff Engineer […]]]></description>
																<content:encoded><![CDATA[<p><strong>Following <a href="https://news.samsung.com/global/into-the-future-with-samsung-research-2-samsung-rd-institute-poland-creating-artificial-intelligence-powered-technologies-to-bring-about-a-whole-new-world-of-convenience" target="_blank" rel="noopener">Episode 2</a></strong></p>
<p>In this relay series, Samsung Newsroom is introducing tech experts from Samsung’s R&D centers around the globe to hear more about the work they do and the ways in which it is directly improving the lives of consumers.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-127241" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/09/SR.jpg" alt="" width="1000" height="563" /></p>
<p>The third expert in the series to be introduced is Bin Dai, Staff Engineer at the Artificial Intelligence (AI) Lab in Samsung R&D Institute China – Beijing (SRC-B). Dai joined SRC-B in 2020, where he works with colleagues on network compression and on-device model design and research. Read on to learn more about the groundbreaking technologies Dai and his team are developing at SRC-B.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-127559" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/10/Samsung-Research-China-Beijing_main2.jpg" alt="" width="1000" height="467" /></p>
<p><strong>Q: AI-based technologies, including NLP (Natural Language Processing) and acoustic intelligence, are cutting-edge research areas that are constantly breaking new ground. But what role does the core research offering provided by machine learning play as a background for these innovations?</strong></p>
<p>Machine learning plays a crucial role in bringing all kinds of technologies directly to users. Computer vision and speech recognition are two of the most successful areas currently utilizing AI. However, existing AI algorithms require huge computation resources, making it difficult to deploy state-of-the-art algorithms on mobile devices. In order to address this issue, our AI Lab is working on producing tiny models with powerful performance from both a theoretical and a practical perspective. In this way, our core research is set to innovate all kinds of AI-based technologies.</p>
<p><strong>Q: Can you please briefly introduce the Beijing Research Institute, and the kind of work that goes on there?</strong></p>
<p>SRC-B is one of Samsung Electronics’ advanced R&D centers and was established in 2000 as the first Samsung R&D center in China. SRC-B focuses on groundbreaking technologies and specializes in artificial intelligence (AI) and next-generation telecommunications, from machine learning, computer vision, language processing and voice intelligence through to 3GPP standardization and more. We also promote tight industrial-academic partnerships. In April 2019, the AI Lab was established to focus on fundamental research into machine learning, and we are continuously looking for ways to apply our research results to Samsung products.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-127558" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/10/Samsung-Research-China-Beijing_main3.jpg" alt="" width="1000" height="707" /></p>
<p><strong>Q: Following the success of your major research thesis and other accomplishments, what are you working on at the moment?</strong></p>
<p>SRC-B is currently aiming to find the best possible way to enhance the accuracy of an AI algorithm while reducing the computational complexity and resources used to do so. In order to achieve these goals, we are currently working on two research topics that enable accurate predictions with less data: equivariant networks, part of the broader topic of geometric deep learning, and dynamic inference. There are many kinds of symmetries in computer vision datasets, such as images and LiDAR point clouds, which can provide accurate depth measurements much as human eyes do. With an equivariant network, these symmetries are taken into consideration when designing the network. It is thus able to achieve better performance with fewer resources, since the intrinsic structure of the dataset has been specifically considered.</p>
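To make the symmetry idea concrete, here is a toy check in Python. It is purely illustrative (not the SRC-B networks): a circular 1-D convolution is equivariant to circular shifts, meaning shifting the input and then convolving gives the same result as convolving and then shifting.

```python
# Toy illustration of equivariance: a circular 1-D convolution commutes
# with circular shifts, the simplest case of building a known symmetry
# into the network itself. The input and kernel values are invented.

def circ_conv(x, kernel):
    """Circular (periodic) 1-D convolution of x with a short kernel."""
    n = len(x)
    return [sum(x[(i + j) % n] * kernel[j] for j in range(len(kernel)))
            for i in range(n)]

def circ_shift(x):
    """Circularly shift a list one position to the right."""
    return [x[-1]] + x[:-1]

x = [1.0, 2.0, 3.0, 4.0]
k = [0.5, 0.25]

# Shifting then convolving equals convolving then shifting:
assert circ_conv(circ_shift(x), k) == circ_shift(circ_conv(x, k))
```

Because the operation respects the symmetry by construction, no extra capacity or data is spent learning it, which is the resource saving described above.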
<p>Dynamic inference is also a very interesting research direction. Unlike conventional methods which harness a fixed architecture for all data samples, dynamic inference can adaptively decide how many resources to use for each data sample. Accordingly, it will use fewer computational resources for simple samples and more resources for difficult ones. By doing so, the average computation resource used can be significantly reduced.</p>
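The adaptive-compute idea can be sketched as an early-exit cascade. The Python toy below is purely illustrative (the stages, their costs and the confidence threshold are invented, not SRC-B's actual method): cheap stages run first, and inference stops as soon as a confidence threshold is cleared.

```python
# Sketch of dynamic inference via early exit. Each stage stands in for a
# network block and is paired with a fixed compute cost; easy samples
# exit after the cheap stage, hard samples pay for the full cascade.

def dynamic_infer(stages, sample, threshold=0.9):
    """Run (cost, stage) pairs in order; stop once confidence clears threshold.

    Each stage maps sample -> (prediction, confidence). Returns the final
    prediction and the total compute cost actually spent.
    """
    spent = 0
    for cost, stage in stages:
        spent += cost
        prediction, confidence = stage(sample)
        if confidence >= threshold:
            break  # confident enough: skip the remaining, costlier stages
    return prediction, spent

# Two toy stages: a cheap one confident only on "easy" inputs, and an
# expensive one that is always confident.
stages = [
    (1, lambda s: ("cat", 0.95 if s == "easy" else 0.5)),
    (10, lambda s: ("cat", 0.99)),
]

print(dynamic_infer(stages, "easy"))  # ('cat', 1)  -- early exit
print(dynamic_infer(stages, "hard"))  # ('cat', 11) -- full cascade
```

The average cost over a workload dominated by easy samples is close to the cheap stage alone, which is the saving the interview describes.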
<p><strong>Q: Fundamental research into AI has been empowering all kinds of user-forward application fields, from computer vision to speech recognition. Could you explain a bit more about why this is, and the direction of research you and the AI Lab have been taking in order to optimize mobile experiences?</strong></p>
<p>In this era of the internet, data is flooding everywhere around us. Where there is data, there is knowledge. AI algorithms are the very best tool for uncovering the knowledge hidden behind the data and making use of it to improve all of our lives.</p>
<p>We have developed a network compression algorithm based on the information bottleneck theory – which posits that extraneous details can be removed from noisy input data as if squeezed through a bottleneck – which has been applied to multiple tasks including video recognition, image segmentation and machine translation. We also actively collaborate with other labs in SRC-B in order to develop more powerful AI algorithms, including the Neural Architecture Search (NAS) and Once-For-All (OFA) solutions.</p>
<p><strong>Q: What do you see as the main user benefits from incorporating all base mobile technologies with machine learning-based AI technologies?</strong></p>
<p>Machine learning-based AI technologies can dramatically improve users’ lives in three key ways. Firstly, there are many convenient functions that simply cannot work without AI technologies. For example, the automatic question and answering system on mobile devices has to be powered by AI algorithms. Other more traditional methods are only able to handle very limited, pre-defined questions.</p>
<p>Secondly, AI techniques can significantly improve the performance of many applications compared to their performance when harnessing conventional technologies only. For example, after applying deep neural networks to a camera’s neural image signal processing (ISP) function, the quality of photos taken on that camera becomes significantly better.</p>
<p>Thirdly, AI technologies are capable of providing services that users previously didn’t even know they needed. For example, AI is capable of developing user-specific software based on that user’s specific preferences, meaning that the user’s device experience can be continuously improved.</p>
<div id="attachment_127560" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-127560" class="wp-image-127560 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/10/Samsung-Research-China-Beijing_main4.jpg" alt="" width="1000" height="665" /><p id="caption-attachment-127560" class="wp-caption-text">▲ Researchers at Samsung R&D Institute China – Beijing</p></div>
<p><strong>Q: How does the work you do synergize with the work undertaken by the rest of Samsung R&D Institute China – Beijing, or perhaps even other R&D Institutes around the world? How does it come together to make users’ lives more convenient?</strong></p>
<p>We are constantly collaborating with the other teams within SRC-B. We have recently been collaborating with our Visual Computing team to apply our information bottleneck-based compression algorithm to video recognition and human segmentation tasks, significantly reducing model sizes without any performance drop. In 2021, we entered the Conference on Computer Vision and Pattern Recognition (CVPR)’s Neural Architecture Search (NAS) competition as one team with this solution and won 1<sup>st</sup> place.</p>
<p>We have also been working with our Language Intelligence team to compress their machine translation model, which facilitates the commercialization of their application.</p>
<p>We also believe that we can produce better research and application results by further communication, discussion and collaboration with AI centers globally.</p>
<p><strong>Q: What do you see as being the main trends within your industry right now? How have you been incorporating them into the research you do at Samsung R&D Institute China – Beijing?</strong></p>
<p>There are a lot of trending topics within our field at this time. Efficient network architecture design, self-supervised learning and graph neural networks are just a few examples.</p>
<p>Our focus is on network compression and tiny model design, which is ultimately useful for applications on mobile devices. Many mobile devices, such as smartphones, possess very limited computational resources, meaning that it is impossible to deploy the huge models designed for such services on these devices. Therefore, my team is focused on designing models suitable for these devices.</p>
<p>There are different ways to achieve these kinds of light yet powerful models. Network pruning, quantization, knowledge distillation, neural architecture search and dynamic inference are just a few of the techniques we are focusing on right now to achieve this.</p>
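To make one of these techniques concrete, the sketch below shows symmetric 8-bit post-training quantization in Python with NumPy. This is a generic textbook illustration, not SRC-B's actual pipeline: float32 weights are mapped to int8 with a single scale factor, cutting storage fourfold at a small reconstruction cost.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization: map float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0  # one scale factor for the whole tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)

# int8 storage is 4x smaller than float32; the price is a bounded rounding error
error = np.abs(w - dequantize(q, scale)).max()
print(q.nbytes, w.nbytes, error)
```

In practice the techniques named above are usually combined (for example, pruning before quantization), and per-channel scale factors recover more accuracy than the single tensor-wide scale used here for brevity.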
<p><strong>Q: What has been the achievement at Samsung R&D Institute China – Beijing that you are most proud of so far?</strong></p>
<p>In collaboration with our Communication Research team, we engineered AI algorithms for wireless communication. This solution achieved first place at this year’s Wireless Communication AI Competition (WAIC), the official competition for 5G+AI in China, which is held by the China Academy of Information and Communication Technology (CAICT) and drew over 600 teams from around the world. I am proud of this achievement and feel that it validates my belief that 5G combined with AI is a research direction with great potential.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-127585" src="https://img.global.news.samsung.com/global/wp-content/uploads/2021/10/Samsung-Research-China-Beijing_main5F.jpg" alt="" width="1000" height="390" /></p>
<p>The next episode will feature an interview with Evgeny Pavlov, a system software expert from Samsung R&D Institute Russia (SRR).</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Expands its Foundry Capacity with A New Production Line in Pyeongtaek, Korea</title>
				<link>https://news.samsung.com/global/samsung-electronics-expands-its-foundry-capacity-with-a-new-production-line-in-pyeongtaek-korea</link>
				<pubDate>Thu, 21 May 2020 11:01:23 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2020/05/Foundry_Pyeongtaek-Production-Line_Thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Semiconductors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[EUV]]></category>
		<category><![CDATA[EUV-based 5 Nanometer]]></category>
		<category><![CDATA[Extreme Ultra Violet]]></category>
		<category><![CDATA[Foundry]]></category>
		<category><![CDATA[High Performance Computing]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[Hwaseong]]></category>
		<category><![CDATA[Samsung Foundry]]></category>
                <guid isPermaLink="false">https://bit.ly/36jB2mX</guid>
									<description><![CDATA[Samsung Electronics, a world leader in advanced semiconductor technology, today announced plans to boost its foundry capacity at the company’s new production line in Pyeongtaek, Korea, to meet growing global demand for cutting-edge extreme ultraviolet (EUV) solutions. The new foundry line, which will focus on EUV-based 5 nanometer (nm) and below process technology, has just […]]]></description>
																<content:encoded><![CDATA[<p><span>Samsung Electronics, a world leader in advanced semiconductor technology, today announced plans to boost its foundry capacity at the company’s new production line in Pyeongtaek, Korea, to meet growing global demand for cutting-edge extreme ultraviolet (EUV) solutions.</span></p>
<p><span>The new foundry line, which will focus on EUV-based 5 nanometer (nm) and below process technology, has just commenced construction this month and is expected to be in full operation in the second half of 2021. It will play a pivotal role as Samsung aims to expand the use of state-of-the-art process technologies across a myriad of current and next generation applications, including 5G, high-performance computing (HPC) and artificial intelligence (AI).</span></p>
<p><span>“This new production facility will expand Samsung’s manufacturing capabilities for sub-5nm process and enable us to rapidly respond to the increasing demand for EUV-based solutions,” said Dr. ES Jung, President and Head of Foundry Business at Samsung Electronics. “We remain committed to addressing the needs of our customers through active investments and recruitment of talents. This will enable us to continue to break new ground while driving robust growth for Samsung’s foundry business.”</span></p>
<p><span>Following the initial mass production of the EUV-based 7nm process in early 2019, Samsung recently added a new EUV-dedicated V1 line in Hwaseong, Korea, to its global foundry network. With the new Pyeongtaek facility starting full operation in 2021, Samsung’s foundry capacity based on EUV is expected to increase significantly.</span></p>
<p><span>Samsung is scheduled to start mass production of 5nm EUV process in the Hwaseong fab in the second half of this year.</span></p>
<p><span>With the addition of the Pyeongtaek fab, Samsung will have a total of seven foundry production lines located in South Korea and the United States, comprising six 12-inch lines and one 8-inch line.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Hearing from an AI Expert – 2] How AI Will Change the World</title>
				<link>https://news.samsung.com/global/hearing-from-an-ai-expert-2-how-ai-will-change-the-world</link>
				<pubDate>Fri, 27 Sep 2019 11:00:04 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/09/AI-Center-Interview_Sebastian-Seung_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[People & Culture]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Expert Voices]]></category>
		<category><![CDATA[AI Experts]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Artificial Neural Network]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[Samsung AI Center]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">http://bit.ly/2leiIsk</guid>
									<description><![CDATA[There’s no denying that the age of AI is upon us and that the ways we engage and interact are set to change in big ways. In anticipation of this, Samsung Electronics has opened AI centers across the world to ensure that the company leads the charge on AI. 2019 marks the 50th anniversary of […]]]></description>
																<content:encoded><![CDATA[<div id="attachment_112924" style="width: 1010px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-112924" class="wp-image-112924 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/09/AI-Center-Interview_Sebastian-Seung_main_1.jpg" alt="" width="1000" height="620" /><p id="caption-attachment-112924" class="wp-caption-text">Sebastian Seung, Executive Vice President & Chief Research Scientist, Samsung Electronics</p></div>
<p>There’s no denying that the age of AI is upon us and that the ways we engage and interact are set to change in big ways. In anticipation of this, Samsung Electronics has opened AI centers across the world to ensure that the company leads the charge on AI. 2019 marks the 50<sup>th</sup> anniversary of Samsung Electronics, and the company has forecast another 50 years of ingenuity ahead, with AI set to be at the heart of future innovation.</p>
<p>To gain greater insight into what AI means for the future of society, as well as the work being done at the Samsung AI Centers, Samsung Newsroom sat down with Executive Vice President & Chief Research Scientist, Dr. Sebastian Seung.</p>
<p>Seung joined Samsung Electronics in 2018. He is also a professor at the Princeton Neuroscience Institute and Department of Computer Science. Seung is one of the most influential scientists in the world when it comes to AI research based on neuroscience.</p>
<h3><span style="color: #000080"><strong>Artificial Neural Networks and AI</strong></span></h3>
<p>Based on his extensive experience and insights into the field of artificial neural networks<sup>1</sup>, Seung is working on developing future growth engines for Samsung Electronics by establishing an AI strategy and providing advice on advanced research.</p>
<p>Artificial neural networks are mathematical models or computer simulations of the biological neural networks in the brain. “Convolutional networks, now the dominant approach to computer vision, were inspired by Nobel Prize-winning neuroscience of the 1960s,” according to Seung. His research at Princeton focuses on mapping the neuronal “wiring diagram” of the cerebral cortex. “I hope that our 21<sup>st</sup> century studies of the cortex will finally reveal how it learns, and that this new understanding will lead to more powerful artificial neural networks,” says Seung.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-112925" src="https://img.global.news.samsung.com/global/wp-content/uploads/2019/09/AI-Center-Interview_Sebastian-Seung_main_2.jpg" alt="" width="1000" height="666" /></p>
<p>In his work for Samsung, Dr. Seung travels back and forth between the U.S. and Korea. His recent work is especially focused on advanced research regarding robots, which is the New York AI Center’s main field of research.</p>
<h3><span style="color: #000080"><strong>Deep Learning and Robotics</strong></span></h3>
<p>These days, robots are already present in society in the forms of robot vacuum cleaners in our homes and robotic arms being used in factories and by shipping companies. Seung acknowledges that these robots already represent an early stage of this technology, but says that what he is aiming for is something much more sophisticated. “In order to develop robots that can, for instance, reach out to pick something up and put it away,” Seung says, “we have to equip them with computer vision so they can see what’s in front of them, and with brains so that they know what all these objects in your house are and what they should do with them.”</p>
<p>Seung acknowledges that labs have tried in the past to achieve these capabilities through the classical approach of programming, but that that hasn’t really worked out. “We have realized that we have to somehow allow the robot to learn to perform the required actions itself,” says Seung, “and a lot of that involves the deep-learning approach.”</p>
<p>Seung points to the area of home automation as a primary application for their work. “In the future, you can imagine robots that won’t just give you weather information or change the temperature – they’ll perform far more helpful tasks in your home. They’ll pick up the toys, wash the dishes and even take the laundry up and down the stairs.”</p>
<h3><span style="color: #000080"><strong>AI in Society</strong></span></h3>
<p>No discussion of AI would be complete without addressing the apprehensions some people feel when it comes to the technology and the ways in which it stands to change our way of life. Seung addresses this question first with regards to the prospect of people losing their jobs to automation. “I think this issue of robots taking our jobs is exaggerated,” he relates. “Firstly, in the last 20 years, the U.S. and many other developed countries have lost a lot of jobs to offshoring, not just to automation. As in the first industrial revolution, many jobs were eliminated, but that didn’t mean that there were fewer jobs in total, because new jobs arose from the new circumstances.”</p>
<p>Seung went on to comment on the wider attitudes towards automation of industry, and the fact that the issue needs to be looked at through a different lens. “If robots really could do all of our work, why shouldn’t we be happy about that?” he said.</p>
<p>Asked the inevitable question about doomsday scenarios in which machine intelligence outstrips that of humans and robots take over the world, Seung claimed, “People don’t actually know what the real capabilities of AI are. And part of that is a public misconception based on science fiction movies that convince people that robots can do anything. In reality, robots are still really clumsy.”</p>
<p>Seung went on to point out that AI developments may well end up greatly helping us, instead of dooming us. “Are robots going to do something bad to us?” he said. “Well, the reason that I don’t worry about that is that of all the environmental and political threats to humanity, robots are not very high on the list. And not only that, I think that if humanity is to best equip itself to deal with any and all future threats, we need to be as smart as possible. And that involves having the most sophisticated technology. You could be a science-fiction pessimist and say maybe these robots could turn on us, but you could also argue that maybe we’ll use these robots to save us.”</p>
<p>Speaking to other misconceptions about AI, Seung pointed to the actual capabilities of the technology. “The public thinks that AI can do more than it really can,” he said. “To give you an example, I met someone who wanted AI to replace her doctor. But there are many things that no human doctor can fix. So, because our current approach to AI involves training machines based on the expertise of human practitioners, if the best human experts can’t solve it, then the AI can’t do it either. It’s not like AI will all of a sudden be able to perform tasks better than the human experts.”</p>
<h3><span style="color: #000080"><strong>The Next 50 Years of AI</strong></span></h3>
<p>Having reached its 50-year anniversary this year, Samsung is now looking to AI to spearhead the next 50 years of innovation. Asked what he expects for this period, Seung said, “In 20~30 years robots will be able to work in the home just as humans can. It will have happened the same way that the mobile phone revolution has happened. Everybody has a mobile phone now – billions of them are sold every year – and the same is going to be true of robots.”</p>
<p>Home automation and self-driving cars based on AI are other hot-button topics right now. Seung says he fully expects AI-equipped cars to become a reality, but that the timeline for their inception is hard to sketch out. “AI is going to lead to a lot of labor-saving things happening in people’s everyday lives, like autonomous cars for instance,” he said. “Are they going to be here next year, or will it take 20 years? Experts are realizing that full autonomy will take longer than the media originally portrayed, but most still believe that it will be achieved. I’d like to see Samsung have some part in that revolution, if not lead that revolution.”</p>
<p>The prospective benefits of AI are enormous in scale and diverse in focus. Outlining some of the applications of AI that the general population may not be aware of, Seung remarked that “The effect AI could have on scientific research is a major one. AI can be applied to accelerate scientific discovery, and in the long term, it will have a huge impact on areas like materials engineering and chemistry. Let’s say I want to design a new molecule with certain properties – AI might allow me to do that more easily. Then, that new molecule could have applications for a drug company, or really any company that creates materials. So AI is not only applied to technology – it’s also used for scientific discovery, which then accelerates the advancement of technology.”</p>
<p><span style="font-size: small"><sup>1</sup><em>An artificial neural network is an attempt to simulate the network of neurons that make up a human brain so that the computer will be able to learn things and make decisions in a humanlike manner. (<a href="https://www.forbes.com/sites/bernardmarr/2018/09/24/what-are-artificial-neural-networks-a-simple-explanation-for-absolutely-anyone/#1b4809251245" target="_blank" rel="noopener">https://www.forbes.com/sites/bernardmarr/2018/09/24/what-are-artificial-neural-networks-a-simple-explanation-for-absolutely-anyone/#1b4809251245</a>)</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Expands SAIT AI Lab Montreal to Spur AI Research for Next-Generation System Semiconductor</title>
				<link>https://news.samsung.com/global/samsung-electronics-expands-sait-ai-lab-montreal-to-spur-ai-research-for-next-generation-system-semiconductor</link>
				<pubDate>Thu, 02 May 2019 11:00:22 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2019/05/AI-Lab-in-Montreal_thumb728F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[More Stories]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Components]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[GANs]]></category>
		<category><![CDATA[Generative Adversarial Networks]]></category>
		<category><![CDATA[Mila]]></category>
		<category><![CDATA[Montreal]]></category>
		<category><![CDATA[Montreal Institute for Learning Algorithms]]></category>
		<category><![CDATA[SAIT]]></category>
		<category><![CDATA[Samsung Advanced Institute of Technology]]></category>
		<category><![CDATA[System Semiconductors]]></category>
                <guid isPermaLink="false">http://bit.ly/2UU7LYH</guid>
									<description><![CDATA[Samsung Electronics today announced the expansion of the ‘Samsung Advanced Institute of Technology (SAIT) artificial intelligence (AI) Lab Montreal’ in Canada. The Lab will help the company strengthen its fundamentals in AI research and drive competitiveness in system semiconductors. The AI Lab is located in Mila – Montreal Institute for Learning Algorithms – in Montreal, […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics today announced the expansion of the ‘Samsung Advanced Institute of Technology (SAIT) artificial intelligence (AI) Lab Montreal’ in Canada. The Lab will help the company strengthen its fundamentals in AI research and drive competitiveness in system semiconductors.</p>
<p>The AI Lab is located in Mila – Montreal Institute for Learning Algorithms – in Montreal, Canada. Founded by Professor Yoshua Bengio of the University of Montreal, Mila is one of the world’s leading research centers in the field of deep learning and partners with the University of Montreal and McGill University. SAIT AI Lab Montreal has an open workspace, with the aim of working closely with the AI research communities in Mila.</p>
<p>SAIT AI Lab Montreal will focus on unsupervised learning and Generative Adversarial Networks (GANs) research to develop disruptive innovations and breakthrough technologies, including new deep learning algorithms and the next generation of on-device AI.</p>
<p>To drive the effort, the AI Lab has actively recruited leaders in deep learning research, including Simon Lacoste-Julien, Professor at the University of Montreal, who recently joined as the leader of the lab. In addition, Samsung plans to dispatch R&D personnel from its Device Solutions Business to Montreal over time and to use the lab as a base for training AI researchers and collaborating with other advanced AI research institutes.</p>
<p>At the same time, SAIT AI Lab Montreal continues to build a strong relationship with Yoshua Bengio, one of the world’s foremost experts on deep learning, machine learning, and AI. SAIT and Professor Bengio have collaborated on deep learning algorithm research since 2014, successfully publishing three papers in academic journals.</p>
<p>Professor Yoshua Bengio said, “Samsung’s collaboration with Mila is well established already and has been productive and built strong trust on both sides. With a new SAIT lab in the midst of the recently inaugurated Mila building and many exciting research challenges ahead of us in AI, I expect even more mutually positive outcomes in the future.”</p>
<p>SAIT has actively pursued research collaboration with other top authorities in the field. In addition to Professor Bengio, SAIT has worked with Yann LeCun, Professor at New York University, and Richard Zemel, Professor at the University of Toronto. Yoshua Bengio and Yann LeCun, along with computer scientist Geoffrey Everest Hinton, won the 2018 Turing Award, which is deemed the ‘Nobel Prize of computer science.’</p>
<p>“SAIT focuses on research and development not only in next-generation semiconductors but also in innovative AI as a seed technology for system semiconductors. SAIT AI Lab Montreal will play a key role within Samsung in redefining AI theory and deep learning algorithms for the next 10 years,” said Sungwoo Hwang, Executive Vice President and Deputy Head of SAIT.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Joins ‘Partnership on AI’ for the Future of AI Safety</title>
				<link>https://news.samsung.com/global/samsung-electronics-joins-partnership-on-ai-for-the-future-of-ai-safety</link>
				<pubDate>Fri, 09 Nov 2018 08:00:32 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/11/PAI-Logo_thumb728.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI for Good]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Global AI Center]]></category>
		<category><![CDATA[PAI]]></category>
		<category><![CDATA[Samsung Research]]></category>
                <guid isPermaLink="false">http://bit.ly/2PeQfR2</guid>
<description><![CDATA[Samsung Electronics Co., Ltd. today announced that it joined the Partnership on Artificial Intelligence to Benefit People and Society (PAI). The Partnership was established to serve as an open platform to discuss, study, and formulate best practices on AI technologies. Founded in 2016, PAI is a technology industry consortium […]]]></description>
<content:encoded><![CDATA[<p>Samsung Electronics Co., Ltd. today announced that it joined the <span><a href="https://www.partnershiponai.org" target="_blank" rel="noopener">Partnership on Artificial Intelligence to Benefit People and Society (PAI)</a></span>. The Partnership was established to serve as an open platform to discuss, study, and formulate best practices on AI technologies.</p>
<p>Founded in 2016, PAI is a technology industry consortium that conducts research and discussion, shares insights, provides thought leadership, identifies new areas for AI application, and creates informational materials to advance the understanding of AI technologies. The organization currently has more than 70 partners, including major global companies and human rights groups specializing in AI.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-106179" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/11/PAI-Logo_main.jpg" alt="" width="1000" height="239" /></p>
<p>Acknowledging the rapid development of AI technologies and aiming to positively shape their future, Samsung joined PAI to help set the direction of AI development along with global member companies. As a member of PAI, Samsung will join one of its working groups, <em>Collaboration Between People and AI Systems<sup>1</sup></em>, and research possible collaboration between humans and AI. The company also plans to participate in research on topics including safety, transparency, and the social and economic impacts of AI.</p>
<p>“Samsung is dedicated to producing AI products and services that are reliable and safe for people and beneficial to society,” said Seunghwan Cho, Executive Vice President of Samsung Research, the advanced R&D arm of Samsung Electronics’ device business. “As a member of the PAI, Samsung will strive to facilitate ongoing progress of artificial intelligence and develop best practices on AI technologies.”</p>
<p>Samsung now has a network of seven Global AI Centers in Seoul, Silicon Valley, New York, Cambridge, Moscow, Toronto and Montreal.</p>
<p><span style="font-size: small"><sup>1</sup> PAI has working groups in six major research fields: Safety-Critical AI; Fair, Transparent and Accountable AI; AI, Labor and the Economy; Collaborations Between People and AI; Social and Societal Influences of AI; and AI and Social Welfare.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung to Acquire Zhilabs to Expand AI-Based Automation Portfolio in 5G Era</title>
				<link>https://news.samsung.com/global/samsung-to-acquire-zhilabs-to-expand-ai-based-automation-portfolio-in-5g-era</link>
				<pubDate>Wed, 17 Oct 2018 11:00:26 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/10/zhilabs-acquisition_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Network Solutions]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[5G]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Automotive]]></category>
		<category><![CDATA[biopharmaceuticals]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Zhilabs]]></category>
                <guid isPermaLink="false">http://bit.ly/2RSey50</guid>
									<description><![CDATA[Samsung Electronics today announced its acquisition of Zhilabs, known for its Artificial Intelligence (AI)-based network and service analytics, to further enhance its 5G capabilities. The acquisition lays the foundation for Samsung to foster its 5G offerings of automation and network analytics to finely tune the customer experiences in the 5G era.   AI-based automation will […]]]></description>
<content:encoded><![CDATA[<p>Samsung Electronics today <span>announced its acquisition of Zhilabs, known for its Artificial Intelligence (AI)-based network and service analytics, to further enhance its 5G capabilities. The acquisition lays the foundation for Samsung to strengthen its 5G offerings of automation and network analytics to fine-tune customer experiences in the 5G era.</span></p>
<p><span>AI-based automation will play a central role in the introduction of new services in the 5G era, such as industrial Internet of Things (IoT) and connected cars, as carriers will require automated solutions and network analytics beyond what was possible in previous generations. AI-based automation can be used to analyze user traffic, classify the applications being used, and improve overall service quality, needs that can no longer be addressed by existing solutions.</span></p>
<p><span>“5G will enable unprecedented services attributed to the generation of exponential data traffic, for which automated and intelligent network analytics tools are vital,” said Youngky Kim, President and Head of Networks Business at Samsung Electronics. “The acquisition of Zhilabs will help Samsung meet these demands to assure each subscriber receives the best possible service.”</span></p>
<p><span>“5G technology will disrupt the communications landscape for the better, but it will only be successful if the quality of the networks transferring the information can be measured and improved to provide a best-in-class experience,” said Joan Raventós, CEO at Zhilabs. “We are delighted to be joining the Samsung Electronics family and to be contributing our software products and technology to the end-to-end solutions that the company offers its customers.”</span></p>
<p><span>Zhilabs, fully owned by Samsung, will operate independently under its own management. Samsung looks forward to combining capabilities to create new cutting-edge technology in the transition from 4G to 5G.</span></p>
<p>In addition to the acquisition, Samsung will continue to strengthen its automation solutions, which measure the quality of each user service and can automatically optimize service quality without human intervention. The company will also explore and invest in other business opportunities powered by emerging technologies.</p>
<p><span>In August, Samsung announced plans to boost investments in businesses that will drive its future growth, committing to a KRW 25 trillion investment over the next three years in the areas of artificial intelligence (AI), 5G, automotive electronics components and biopharmaceuticals. </span></p>
<p><span>For further information, please click <a href="https://news.samsung.com/global/samsung-steps-up-investment-for-future-growth-takes-initiative-to-build-innovation-ecosystem" target="_blank" rel="noopener">here</a>.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung AI Forum Offers a Roadmap for the Future of AI</title>
				<link>https://news.samsung.com/global/samsung-ai-forum-offers-a-roadmap-for-the-future-of-ai</link>
				<pubDate>Tue, 18 Sep 2018 18:30:57 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[AI Expert Voices]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Samsung AI Center]]></category>
		<category><![CDATA[Samsung AI Forum 2018]]></category>
		<category><![CDATA[Samsung R&D]]></category>
                <guid isPermaLink="false">http://bit.ly/2PLlxL1</guid>
									<description><![CDATA[It wasn’t that long ago that the idea of building technologies with ‘brains’ that learn and are even structured just like ours seemed like science fiction. Just ask the distinguished speakers at the “Samsung AI Forum 2018”. Held in Seoul from September 12th to 13th, the second edition of Samsung Electronics’ artificial intelligence (AI) forum […]]]></description>
																<content:encoded><![CDATA[<p>It wasn’t that long ago that the idea of building technologies with ‘brains’ that learn and are even structured just like ours seemed like science fiction.</p>
<p>Just ask the distinguished speakers at the “Samsung AI Forum 2018”. Held in Seoul from September 12<sup>th</sup> to 13<sup>th</sup>, the second edition of Samsung Electronics’ artificial intelligence (AI) forum featured accomplished AI experts, who discussed how groundbreaking advancements are not only helping to create technology that will make our lives more comfortable, convenient and efficient. They’re also teaching us more about how our own minds work.</p>
<h3><span style="color: #000080"><strong>Unsupervised Learning Takes Center Stage</strong></span></h3>
<div id="attachment_105056" style="width: 715px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-105056" class="size-full wp-image-105056" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_main_1.jpg" alt="" width="705" height="375" /><p id="caption-attachment-105056" class="wp-caption-text">Attendees of the Samsung AI Forum 2018 listen intently to the opening address of Kinam Kim, Samsung Electronics’ President and CEO</p></div>
<p>The forum began with a presentation from the founding director of the New York University Center for Data Science, and one of the world’s leading minds in the field of deep learning, Yann LeCun.</p>
<p>LeCun’s speech set the stage for the exciting discussions on unsupervised learning that would follow over the course of the two-day event. LeCun explained why he and many of his peers believe that unsupervised learning, also known as self-supervised learning, represents the future of AI. He also delved into unsupervised learning algorithms’ potential applications (and limitations), and explained how they differ from supervised and reinforcement learning algorithms.</p>
<p>As LeCun explained, <em>supervised learning</em> algorithms learn utilizing labeled datasets and answer keys that allow them to evaluate their accuracy. This essentially means that each example in the training dataset includes the answer that the algorithm should produce. With <em>reinforcement learning</em>, an algorithm is trained using a reward system that offers feedback when it performs an optimal action for a given situation. It relies on this feedback, rather than labeled datasets, to make the choice that offers the greatest reward.</p>
<p>With <em>unsupervised learning,</em> the algorithm is tasked with making sense of an unlabeled dataset—a set of examples that doesn’t have a correct answer or desired outcome—on its own. While these algorithms can be more unpredictable than their counterparts, they can also perform more complex processing tasks.</p>
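<p>To make the contrast concrete, here is a minimal sketch of one classic unsupervised technique, k-means clustering. This example is illustrative only — it is not drawn from LeCun’s talk, and the function name and data are our own invention:</p>

```python
def kmeans_1d(points, iters=20):
    """Tiny two-cluster k-means sketch for unlabeled 1-D data.

    No answer key is provided: the algorithm discovers the cluster
    centers from the raw numbers alone.
    """
    # Initialize the two centers at the extremes of the data.
    centers = [min(points), max(points)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Update step: move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two groups hidden in unlabeled data -- roughly 1.0 and 10.0.
data = [1.0, 1.2, 0.8, 9.7, 10.3, 10.0]
centers = kmeans_1d(data)
```

<p>Nothing in the input says which points belong together; the grouping emerges from the data itself, which is the defining property of unsupervised methods.</p>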
<p>LeCun used training self-driving cars as a key example of unsupervised learning’s potential. “A lot of people who are working on autonomous driving are hoping to use reinforcement learning to get cars to learn to drive by themselves by trial and error,” said LeCun. “The problem with this is that, because of [reinforcement learning’s inherent inefficiencies], you’d have to get a car to drive off a cliff several thousand times before it figures out how not to do that.”</p>
<p>LeCun explained how, unlike reinforcement learning models, which rely on trial and error, unsupervised learning models could potentially be capable of guessing what to do in a situation like this—demonstrating mental capabilities similar to what we’d call common sense.</p>
<p>He also discussed his experience developing artificial neural networks—specifically convolutional neural networks (ConvNets)—and demonstrated how they can be used to build not only self-driving cars but a wide variety of innovative devices, including technologies for medical signal and image analysis, bioinformatics, speech recognition, language translation, image restoration, robotics and physics.</p>
<p>LeCun’s presentation was followed by a lecture from another leading light in the field of deep learning: University of Montreal professor Yoshua Bengio. Professor Bengio’s lecture focused specifically on stochastic gradient descent (SGD)—an AI optimization method that’s used to minimize errors made by artificial neural networks.</p>
<p>As Bengio explained, “[SGD] is really the workhorse of deep learning. This is the optimization technique that is used everywhere for supervised learning, reinforcement learning and self-supervised learning. It’s been with us for many decades and it works incredibly well, but we don’t completely understand it yet.”</p>
<p>Bengio’s presentation allowed attendees to gain a better understanding of SGD, with specific focus on how SGD variants can affect neural network optimization and generalization. Bengio discussed how the traditional view of machine learning sees optimization and generalization as neatly separated, but that’s not actually the case. He also presented detailed research findings on the effects of SGD-based learning techniques on both aspects of network design.</p>
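<p>For readers unfamiliar with the method, here is a minimal sketch of SGD fitting a one-parameter model. It is our illustration rather than anything presented at the forum; the function name, learning rate and data are hypothetical:</p>

```python
import random

def sgd_fit(data, lr=0.01, epochs=200, seed=0):
    """Fit y = w * x by stochastic gradient descent on squared error.

    Each update looks at a single randomly drawn example -- the
    'stochastic' part -- and nudges w against that example's gradient.
    """
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        x, y = rng.choice(data)
        grad = 2.0 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

# Noiseless data drawn from y = 2x: SGD drives w toward 2.
data = [(x, 2.0 * x) for x in (1.0, 2.0, 3.0, 4.0)]
w = sgd_fit(data)
```

<p>The same per-example update, scaled to millions of parameters, is the “workhorse” Bengio describes: each noisy step is cheap, and in aggregate the steps minimize the network’s error.</p>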
<div id="attachment_105053" style="width: 715px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-105053" class="size-full wp-image-105053" src="https://img.global.news.samsung.com/global/wp-content/uploads/2018/09/samsung-ai-forum-2018_main_2.jpg" alt="" width="705" height="543" /><p id="caption-attachment-105053" class="wp-caption-text">(from the top, clockwise) NYU professor Yann LeCun, University of Montreal professor Yoshua Bengio, MIT Media Lab’s professor Cynthia Breazeal and Samsung Research’s Executive Vice President Sebastian Seung.</p></div>
<h3><span style="color: #000080"><strong>Could Unsupervised Learning Unlock the Secrets of the Brain?</strong></span></h3>
<p>Sebastian Seung, Executive Vice President of Samsung Research and Chief Research Scientist of Samsung Electronics, delivered a particularly illuminating presentation that outlined why unsupervised learning will be essential for developing AI with human-level mental capabilities.</p>
<p>Seung described how the convolutional neural networks that LeCun had examined in detail are in fact based on insights gained through the study of neuroscience. He also discussed how his research in both artificial and biological neural networks led him to study ways to apply AI to gain a better understanding of how our brains are wired.</p>
<p>Seung stressed that the model for designing unsupervised learning networks lies in the cortex of the brain, and highlighted a recent study in which his team used AI to map out all of the neurons contained in one cubic millimeter of a mouse’s visual cortex—more than 100,000 in total.</p>
<p>The unsupervised learning algorithm that the researchers utilized not only allowed them to create a 3D reconstruction of the neural network’s wiring, but also made it possible to label and color in individual cells and their components. “That’s the magic of deep learning,” said Seung. “If a human had to color all that in, it would take about 100 years of work. And that’s with no coffee breaks or sleeping.”</p>
<h3><span style="color: #000080"><strong>Living with Social Robots in ‘10 to 20 Years’</strong></span></h3>
<p>The speech delivered by Cynthia Breazeal, the founder and Chief Scientist of Jibo, Inc., and the founding director of the Personal Robotics Group at the Massachusetts Institute of Technology’s (MIT) Media Lab, shifted focus to applying AI to develop advanced robotics.</p>
<p>Breazeal’s speech, entitled “Living and Flourishing with Social Robots,” discussed approaches needed to develop autonomous systems that utilize AI to enhance our quality of life. As Breazeal explained, autonomous, socially and emotionally intelligent technologies—robots with what’s known as ‘relational AI’—present a wide range of exciting benefits.</p>
<p>“I’m really excited to think about the next 10 to 20 years—of having these robots actually become a part of our daily lives,” said Breazeal.</p>
<p>The fascinating presentation highlighted helpful companion technologies in particular, and included specific examples of ways that robots could be used to assist children and older adults. Breazeal noted studies in which AI robotic companions were given to patients at a children’s hospital, as well as kindergarten-age students and senior citizens.</p>
<p>Videos of the studies showed how the children in the hospital drew comfort from having a peer-like companion by their side, and demonstrated how robots can be used to boost learning. As Breazeal explained, “This is about a different vision for AI. There’s so much emphasis right now on tools for professionals, and there’s not a lot of deep thinking around how AI is going to benefit everyone.” The studies, Breazeal added, “show that there’s a lot of promise with these technologies in the real world… making a real difference.”</p>
<p>This year’s forum also included a diverse array of speeches that offered an all-encompassing look at the state of artificial intelligence development today. These included presentations on topics covering advancements in reinforcement learning, mutual information neural estimation, socially and emotionally intelligent AI, personal assistant robots, and precision medicine via machine learning. The developments discussed at the Samsung AI Forum 2018 represent great strides toward creating an AI-connected future.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Editorial] Samsung Envisions Life Transformed by Artificial Intelligence</title>
				<link>https://news.samsung.com/global/editorial-samsung-envisions-life-transformed-by-artificial-intelligence</link>
				<pubDate>Wed, 01 Nov 2017 10:00:06 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/Sunggy-Koo-Editorial-on-AI_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Editorials]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bixby]]></category>
		<category><![CDATA[Family Hub]]></category>
		<category><![CDATA[IoT]]></category>
                <guid isPermaLink="false">http://bit.ly/2gWlHUa</guid>
									<description><![CDATA[Just about every industry today is being transformed by Artificial Intelligence (AI). From retail and entertainment to transportation and healthcare, AI is seeping into our world in ever more profound ways, revolutionizing the way we go about our daily lives. At Samsung Electronics, we share in the vision of a fully open, intelligent and connected […]]]></description>
																<content:encoded><![CDATA[<p>Just about every industry today is being transformed by Artificial Intelligence (AI). From retail and entertainment to transportation and healthcare, AI is seeping into our world in ever more profound ways, revolutionizing the way we go about our daily lives.</p>
<p>At Samsung Electronics, we share in the vision of a fully <a href="https://news.samsung.com/global/samsung-shares-vision-for-an-open-and-connected-iot-experience-at-samsung-developer-conference-2017" target="_blank" rel="noopener">open, intelligent and connected world</a>. A world where AI will play an integral role, where one day everything from our phones to our refrigerators will possess some sort of intelligence to help us seamlessly interact with our surroundings. To bring the company closer to realizing its vision, we are working tirelessly to develop technologies that can help us take this first leap into intelligence.</p>
<h3><span style="color: #000080"><strong>AI – Where Are We Now and Where Are We Going?</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-95027" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/Sunggy-Koo-Editorial-on-AI_main_1.jpg" alt="" width="705" height="341" /></p>
<p>Since it was first envisioned in the 1950s, AI has made a palpable impact on our lives, giving us practical speech recognition, more effective web search and self-driving cars, among other innovations.</p>
<p>Earlier this month, Google’s AlphaGo AI program made news by mastering the ancient Chinese board game Go in just three days without any human assistance. This major advance comes just two decades after Deep Blue crushed chess grandmaster Garry Kasparov, illustrating that AI has not only come a long way in a short time, but is on track to create once-unthinkable opportunities across all industries that will add new value to our lives.</p>
<p>The recent explosion in AI is enabled by a number of factors including a wider availability of GPUs, virtually infinite amounts of data, and more advanced machine and deep learning algorithms. Additionally, investment in AI has roughly tripled since 2013, reaching an estimated $26 billion to $39 billion in 2016*, further propelling the development of new intelligent technologies.</p>
<p>Despite these advancements, there are very real challenges that are hindering the development of AI technologies, including the lack of the required talent pool in the AI industry. Furthermore, many device manufacturers haven’t quite figured out how to best optimize the user data they receive from their sensor-equipped products. As a result, enterprises struggle to determine what AI is capable of and what kind of value it can bring to consumers.</p>
<h3><span style="color: #000080"><strong>Bixby – Bringing New Value into the Smart Home</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-95028" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/Sunggy-Koo-Editorial-on-AI_main_2.jpg" alt="" width="705" height="458" /></p>
<p>Samsung, too, has contemplated how AI can deliver real value to its users, and in doing so has developed <a href="https://news.samsung.com/global/bixby-2-0-the-start-of-the-next-paradigm-shift-in-devices" target="_blank" rel="noopener">Bixby</a>, a bold reinvention of its intelligent interface that’s even more ubiquitous, open and personal. Powered by Samsung Connect, Bixby will act as the controlling platform of your connected device ecosystem, including mobile phones, TVs and even home appliances, to make the smart home experience even smarter.</p>
<p>In fact, we are adding Bixby Voice to our Family Hub refrigerator. Now, you will be able to check the weather, build shopping lists and order groceries with the power of your voice. So, for example, if you were running low on milk, you would just say, “Hi Bixby, order milk” to order food directly from the screen.</p>
<p>The integration of Bixby Voice and Samsung Connect into the Family Hub refrigerator marks a big step – one that will offer developers tremendous opportunities to develop new content, applications and experiences in areas such as food, health, home management, entertainment and more.</p>
<p>We think this could be the fourth wave, where you have programmable objects dispersed throughout your entire home, seamlessly connected and communicating in a personalized and intuitive manner. In this way, we are moving beyond simply connecting devices to the Internet and are taking the next step by connecting devices to intelligence. This new era is what we are calling the “Intelligence of Things.”</p>
<h3><span style="color: #000080"><strong>Smarter World, Better Life</strong></span></h3>
<p>A world powered by the Intelligence of Things will open up entirely new possibilities. In this world, every machine around you is intelligence-enabled, capable of understanding and anticipating your needs. In this world, mundane tasks are a thing of the past, allowing you to spend more time doing the things you enjoy with the people you love.</p>
<p>We know that we have a long way to go to fully realize our vision. But we are wholly committed to building upon our heritage of creating meaningful innovation and driving digital transformation to advance technologies in artificial intelligence. We could not be more excited to help lead the changes that will define this new, transformative era.</p>
<p><span style="font-size: small"><em>* <a href="https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/how-artificial-intelligence-can-deliver-real-value-to-companies" target="_blank" rel="noopener">McKinsey Report 2017</a>: Artificial Intelligence, The Next Digital Frontier?</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>5 Things that SDC 2017 Offered the Global Developer Community</title>
				<link>https://news.samsung.com/global/5-things-that-sdc-2017-offered-the-global-developer-community</link>
				<pubDate>Fri, 20 Oct 2017 10:00:49 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_thumb704_F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[#SDC2017]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bixby]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Samsung Developer's Conference 2017]]></category>
                <guid isPermaLink="false">http://bit.ly/2znGP9F</guid>
									<description><![CDATA[Samsung kicked off this year’s Samsung Developer Conference (SDC 2017), held within the halls of San Francisco’s Moscone West convention center, by outlining its vision of an open, intelligent and connected world backed by an ecosystem of innovative devices and services. SDC’s most diverse lineup of sessions and activities yet offered an SDC-record 6,000 developers, […]]]></description>
																<content:encoded><![CDATA[<p>Samsung kicked off this year’s <a href="https://news.samsung.com/global/samsung-shares-vision-for-an-open-and-connected-iot-experience-at-samsung-developer-conference-2017" target="_blank" rel="noopener">Samsung Developer Conference (SDC 2017)</a>, held within the halls of San Francisco’s Moscone West convention center, by outlining its vision of an open, intelligent and connected world backed by an ecosystem of innovative devices and services.</p>
<p>SDC’s most diverse lineup of sessions and activities yet offered an SDC-record 6,000 developers, partners, business leaders and content creators a roadmap for innovating in the “Intelligence of Things” era. By spotlighting a “connected thinking” approach to innovation, and introducing exciting updates to its IoT, artificial intelligence (AI) and augmented reality (AR) technologies, Samsung hopes to arm developers with the tools and insights they need to help usher in a new phase of connectivity.</p>
<p>In case you missed it, here’s a rundown of five key factors that made this year’s SDC the best yet.</p>
<h3><span style="color: #000080"><strong>1. Inspiring Words</strong></span></h3>
<p>Day one of SDC 2017 kicked off with <a href="https://news.samsung.com/global/sdc-2017-keynote-speeches-outline-samsungs-new-iot-vision" target="_blank" rel="noopener">keynote speeches</a> from Samsung and tech industry leaders, who began by revealing the company’s new, unified IoT platform: SmartThings.</p>
<p>The speakers outlined how the cloud-based platform, which combines three existing IoT services – <a href="https://news.samsung.com/global/samsungs-smartthings-cloud-bringing-the-iot-dream-to-life" target="_blank" rel="noopener">SmartThings</a>, <a href="https://news.samsung.com/global/introducing-samsung-connect-tag-a-new-way-to-keep-track-of-all-that-matters-in-life" target="_blank" rel="noopener">Samsung Connect</a>, and <a href="https://news.samsung.com/global/samsung-introduces-new-artik-secure-iot-modules-and-security-services-to-deliver-comprehensive-device-to-cloud-protection-for-iot" target="_blank" rel="noopener">ARTIK</a> – will offer seamless controls over IoT-enabled products and services, and serve as the foundation for one of the world’s largest IoT ecosystems. By providing access to one cloud API across SmartThings-compatible products, Samsung explained, SmartThings will ultimately allow developers to build connected solutions that reach more people.</p>
<p><img loading="lazy" class="alignnone wp-image-94772 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_1.jpg" alt="" width="705" height="385" /></p>
<p>Other highlights of day one’s keynote event included Samsung’s announcements of its plans to incorporate Bixby support into more Samsung and IoT devices; release <a href="https://news.samsung.com/global/bixby-2-0-the-start-of-the-next-paradigm-shift-in-devices" target="_blank" rel="noopener">Bixby 2.0</a> – an update for the intelligent assistant that’s more ubiquitous, open and personal; and advance its leadership in the field of AR through a partnership with Google.</p>
<p>Day two’s keynote speakers included Stan Lee, the chief creative force behind Marvel Comics, and Arianna Huffington, the founder and CEO of Thrive Global. Sharing their insights on topics including the creative process, innovation and “connected thinking”, the distinguished speakers inspired attendees to go further, break boundaries, connect people, and solve bigger problems.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94773" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_2.jpg" alt="" width="705" height="469" /></p>
<p><img loading="lazy" class="wp-image-94774 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_3.jpg" alt="" width="705" height="436" /></p>
<h3><span style="color: #000080"><strong>2. A Full Slate of Diverse Sessions</strong></span></h3>
<p>SDC 2017’s sessions and activities focused largely on Samsung services and devices, intelligent technology trends, and exciting business opportunities. They also offered attendees a chance to listen to valuable insights from tech leaders on topics including AR, VR and game development, and on innovating with Samsung’s unified IoT platform.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94775" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_4.jpg" alt="" width="705" height="334" /></p>
<p>The formats for the sessions, which numbered 49 in total, spanned everything from informative lectures, panels and speeches, to fun and engaging “Ask Me Anything” discussions. The sessions slate offered a wide array of enriching activities not only for professional developers, but for kids as well.</p>
<p>For SDC 2017, Samsung invited students from around the world to participate in an array of fun activities. Programs included the Samsung Kids Experience, which featured fun and immersive tablet-powered lessons, and Youth Track, which invited aspiring developers and creators from local high schools to take part in a special Gear 360 workshop.</p>
<p><img loading="lazy" class="wp-image-94776 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_5.jpg" alt="" width="705" height="470" /></p>
<h3><span style="color: #000080"><strong>3. An Up-Close Look at Dynamic Developer Tools</strong></span></h3>
<p>SDC 2017 also offered developers the chance to examine software development kits (SDKs) for various Samsung apps and tools, and test out a wide array of the company’s latest offerings. These not only encompassed Samsung’s Galaxy Apps ecosystem, mobile tools, and services such as Samsung Pay, Samsung DeX and Samsung Knox, but also devices like its IoT-connected Family Hub refrigerators, Smart TVs, the Galaxy Note8, and the Gear VR headset.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94777" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_6.jpg" alt="" width="705" height="436" /></p>
<p>Visitors to the Bixby section of the booth were able to experience the intelligent interface’s convenient functions through firsthand demos. On the other end of the exhibition floor was the SmartThings Developer Zone, which presented developer tools for the new, integrated IoT platform, and featured showcases for Samsung’s <a href="https://news.samsung.com/global/samsungs-smartthings-cloud-bringing-the-iot-dream-to-life" target="_blank" rel="noopener">SmartThings</a> partners and the <a href="https://news.samsung.com/global/samsung-introduces-new-artik-secure-iot-modules-and-security-services-to-deliver-comprehensive-device-to-cloud-protection-for-iot" target="_blank" rel="noopener">ARTIK IoT chipset</a>.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94778" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_7.jpg" alt="" width="705" height="428" /></p>
<p>The SDK Bar offered info on the complete range of Samsung SDKs and APIs that were being showcased at the event, and directed attendees toward their corresponding sessions. The stream.Code101 section, meanwhile, allowed attendees to learn, via hands-on tutorials, how to develop apps and immersive experiences for products including the Gear VR, the Gear 360, Samsung Pay and Samsung Pass.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94779" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_8.jpg" alt="" width="705" height="436" /></p>
<h3><span style="color: #000080"><strong>4. Inspirational Citizenship Initiatives</strong></span></h3>
<p>Along with showcasing Samsung’s newest intelligent technologies, SDC 2017 spotlighted several Samsung initiatives designed to inspire developers to dream up fresh innovations, and help communities across the globe. The hands-on exhibits for Samsung’s citizenship efforts showcased innovative ways that Samsung technologies are being used to create solutions that target healthcare, education, and important social issues.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94780" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_9.jpg" alt="" width="705" height="454" /></p>
<p>This year’s SDC featured several attention-grabbing examples from partners around the world that utilized the Gear VR. These included the Spanish Ministry of Education, Culture and Sport’s “#ByeByeBullying” campaign; a variety of health solutions developed by Nordic innovators; Limbic Life’s Limbic Chair VR, which has been used for stroke rehabilitation in Switzerland; and C-Lab’s Relumino, a Gear VR app that helps the visually impaired see the world more clearly. In addition, the exhibit’s Galaxy Upcycling section highlighted ways that Samsung products at the end of their life cycles may be used to create dynamic innovations.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94781" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_10.jpg" alt="" width="705" height="429" /></p>
<h3><span style="color: #000080"><strong>5. The Most Fun and Immersive SDC Yet</strong></span></h3>
<p>SDC 2017 was jam-packed with fun and interactive activities that offered attendees an opportunity to relax, let loose, and enjoy immersive VR and live entertainment.</p>
<p>The conference featured a wide range of can’t-miss competitions that allowed coders, gamers, and entrepreneurs to test their skills to win bragging rights and gear. Contests included “SDC’s Got Code Talent”, an experts-only coding challenge with a Galaxy Note8 up for grabs, as well as the Soundcamp SDC Challenge, which saw contestants compose a signature song with the help of a real DJ in a quest to create SDC 2017’s top track.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94782" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_11.jpg" alt="" width="705" height="449" /></p>
<p><img loading="lazy" class="alignnone size-full wp-image-94793" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_11-2.jpg" alt="" width="705" height="470" /></p>
<p>In addition, the booth’s SDC Lounge included an LED “Google Play Music Wall” that visualized Google Play Music’s ability to recommend the right song for any moment. Speaking of music, the SDC Lounge also served as the stage for musician Daler Mehndi’s exclusive unveiling of India’s first VR music video. And of course, the lounge’s dedicated VRcade offered attendees more opportunities to enjoy immersive VR.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94784" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_13.jpg" alt="" width="705" height="496" /></p>
<p>Day one of the conference closed with Samsung Celebration, an exclusive concert headlined by critically acclaimed indie artists Banks and Bonobo, while day two ended with a night of immersive content, networking, demonstrations, and can’t-miss experiences centered around VR.</p>
<p><img loading="lazy" class="alignnone wp-image-94786 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/SDC2017-Highlights_main_15.jpg" alt="" width="705" height="470" /></p>
<p>And thus concludes another successful Samsung Developer Conference. For more info on the topics and innovations that were discussed at the event, check out Samsung Newsroom’s recap of SDC 2017’s <a href="https://news.samsung.com/global/sdc-2017-keynote-speeches-outline-samsungs-new-iot-vision" target="_blank" rel="noopener">keynote speeches</a>.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Editorial] Bixby 2.0: The Start of the Next Paradigm Shift in Devices</title>
				<link>https://news.samsung.com/global/bixby-2-0-the-start-of-the-next-paradigm-shift-in-devices</link>
				<pubDate>Wed, 18 Oct 2017 18:03:03 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/Bixby-2.0-Eui-Suk-Chung_thumb704_FF.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Editorials]]></category>
		<category><![CDATA[Mobile]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bixby 2.0]]></category>
		<category><![CDATA[Bixby SDK]]></category>
		<category><![CDATA[Samsung Developer Conference]]></category>
		<category><![CDATA[SDC 2017]]></category>
                <guid isPermaLink="false">http://bit.ly/2gslkAD</guid>
									<description><![CDATA[Today at the Samsung Developer Conference (SDC) 2017 in San Francisco, I was honored to share Samsung’s vision for intelligence, with the introduction of Bixby 2.0 – a powerful intelligent assistant platform that will bring a connected experience that is ubiquitous, personal, and open.  Bixby 2.0 will be a fundamental leap forward for digital assistants […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-94789" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/10/Bixby-2.0-Eui-Suk-Chung_main-2_F.jpg" alt="" width="705" height="417" /></p>
<p>Today at the Samsung Developer Conference (SDC) 2017 in San Francisco, I was honored to share Samsung’s vision for intelligence, with the introduction of Bixby 2.0 – a powerful intelligent assistant platform that will bring a connected experience that is ubiquitous, personal, and open.  Bixby 2.0 will be a fundamental leap forward for digital assistants and represents another important milestone to transform our digital lives.</p>
<p>Today’s assistants are useful, but ultimately still play a limited role in people’s lives. People use them to set timers and reminders, answer trivial questions, and so on. We see a world where digital assistants play a bigger, more intelligent role, where one day everything from our phones, to our fridge, to our sprinkler system will have some sort of intelligence to help us seamlessly interact with all the technology we use each day.</p>
<p>To understand where we are going with devices, it is important to understand how far we’ve come. It is hard to believe, when you think back to just a decade ago, how we used our phones. Yours was likely a feature phone, used primarily to make calls and run a limited number of apps. But I knew, we at Samsung knew, that there were greater possibilities for the mobile phone. During the early and mid-2000s, I led the team at Samsung that created the Mobile Intelligent Terminal by Samsung (MITs), an early generation of smartphones that preceded the Galaxy S and Note series. Powered by an open API, an application ecosystem and an innovative touch UI, the smartphone market has since exploded, and the smartphone has become a life-essential tool for everyone. It has also brought new opportunities for businesses and developers.</p>
<p>I believe that we are now on the cusp of the next major tectonic shift. With our long legacy as the global smartphone leader, we’ve been bringing meaningful innovation and driving digital transformation for the industry, and we are excited to help lead the change during another revolutionary moment. Personally, I am excited to be part of this next great era and this shift is one of the reasons why we are motivated to continue to develop our intelligent assistant, Bixby.</p>
<p>We introduced Bixby to the world earlier this year with the launch of our flagship mobile devices, the Galaxy S8 and S8+ and the Note 8. Bixby is now available in over 200 countries, with more than 10 million registered users. But this is just a start. When we launched Bixby, we focused on how it could help people get the most out of their smartphones and apps to make their lives easier. We integrated a few close partner applications on the device to make those devices more intelligent. We created multi-step, cross-app capabilities that have allowed millions of Bixby users to get things done faster and easier, but this is just a stepping stone for us.</p>
<p>Now, we are ready to take Bixby to the next level. Bixby 2.0 is a bold reinvention of the platform – a reinvention aimed at transforming basic digital assistants from a novelty into an intelligent tool that is a key part of everyone’s daily life.</p>
<p>Bixby 2.0 will be ubiquitous, available on any and all devices. This means having the intelligence of Bixby, powered by the cloud, act as the control hub of your device ecosystem, including mobile phones, TVs, refrigerators, home speakers, or any other connected technology you can imagine. Soon, developers will be able to put their services on any and all devices and will not have to reinvent their services each time they support a new device.</p>
<p>It will be more personal, with enhanced natural language capabilities for more natural commands and complex processing, so it can really get to know and understand not only who you are, but who members of your family are, and tailor its response and actions appropriately.</p>
<p>And finally, and most importantly, Bixby 2.0 will be open. We know Samsung cannot deliver on this paradigm shift by ourselves – it can only happen if we all, across all industries, work together in partnership. With Bixby 2.0, the doors will be wide open for developers to choose and model how users interact with Bixby in their services across all application domains, e.g. sports, food, entertainment, or travel – the opportunities are truly endless.</p>
<p>Starting today, we’re announcing our first private beta program with Bixby SDK, which will be available for select developers. We will work as one team, innovating, collaborating, and bringing Bixby 2.0 to life. Over time, we will increase the number of participants in the beta, and ultimately make the Bixby SDK available to all developers.</p>
<p>Bixby 2.0 will ultimately be a marketplace for intelligence – a new channel for developers to reach users with their services, not just on mobile devices, but through all devices. Over time, we will roll out a variety of revenue models to maximize our partners’ business opportunities in this new paradigm, hopefully making it as fruitful for our partners as the move from feature phones to smartphones was.</p>
<p>The future is yours. Whatever your technology platform, device category or industry sector: we call on all developers to unleash their creativity and help us to democratize intelligence, so that we can move beyond devices and into an open, connected ecosystem to simplify the lives of everyone.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Electronics Launches AI Lab in Montreal, Canada</title>
				<link>https://news.samsung.com/global/samsung-electronics-launches-ai-lab-in-montreal-canada</link>
				<pubDate>Sat, 30 Sep 2017 17:45:13 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/AI-Lab-in-Montreal_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[SAIT]]></category>
		<category><![CDATA[Samsung Advanced Institute of Technology]]></category>
                <guid isPermaLink="false">http://bit.ly/2yNTFyf</guid>
									<description><![CDATA[Samsung Advanced Institute of Technology (hereinafter referred to as “SAIT”) established an AI Lab on August 17 (local time) at the University of Montreal, Canada. The AI Lab will be used to strengthen collaborative research with world-leading scholars in the AI field. SAIT has been collaborating with Professor Yoshua Bengio of the University of Montreal, the world’s top authority […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-94301" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/AI-Lab-in-Montreal_main_1.jpg" alt="" width="705" height="470" /></p>
<p>Samsung Advanced Institute of Technology (hereinafter referred to as “SAIT”) established an AI Lab on August 17 (local time) at the University of Montreal, Canada. The AI Lab will be used to strengthen collaborative research with world-leading scholars in the AI field.</p>
<p>SAIT has been collaborating since 2014 with Professor Yoshua Bengio of the University of Montreal, the world’s top authority on deep learning, machine learning and artificial intelligence, as well as with other partners at the University of Toronto, McGill University and NYU.</p>
<p>“There is a long-standing and fruitful research collaboration between us and Samsung and we are glad to see Samsung open a research lab here and join the amazing momentum which is turning Montreal into an international hub for AI, both academically and industrially,” commented Prof. Bengio regarding this event.</p>
<p>In the Samsung AI Lab, researchers dispatched from Korea work with local professors and students, including Prof. Bengio, to develop key algorithms and components for artificial intelligence such as voice and image recognition, translation, autonomous driving, and robotics. The lab will also help Samsung acquire global talent and strengthen its AI technology.</p>
<p>Eunsoo Shim, VP and Head of S/W Solution Lab at SAIT, said, “The joint research with Professor Bengio has been a foundation for the development of artificial intelligence in Samsung Electronics, and Samsung AI Lab will be a momentous step for us to leap forward.”</p>
<p>He added that SAIT plans to establish and operate a “Neural Processing Research Center” with local universities such as Seoul National University within the year to accelerate its research on AI processors.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94307" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/AI-Lab-in-Montreal_main_2.jpg" alt="" width="705" height="470" /></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung Discusses the Future of AI with Leading Academics, Industry Leaders</title>
				<link>https://news.samsung.com/global/samsung-discusses-the-future-of-ai-with-leading-academics-industry-leaders</link>
				<pubDate>Thu, 28 Sep 2017 15:00:06 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Computer Vision]]></category>
		<category><![CDATA[Data Analytics]]></category>
		<category><![CDATA[Natural Language Processing]]></category>
		<category><![CDATA[Samsung Global AI Forum]]></category>
                <guid isPermaLink="false">http://bit.ly/2fs566l</guid>
									<description><![CDATA[An audience of academics, thought leaders and industry experts gathered to discuss the future of artificial intelligence (AI) at an exclusive Samsung event recently. The first ever Samsung Global AI Forum enabled leading thinkers from across the industry to share their insights, talk about best practices and investigate opportunities for collaboration. The event took place […]]]></description>
																<content:encoded><![CDATA[<p>An audience of academics, thought leaders and industry experts gathered to discuss the future of artificial intelligence (AI) at an exclusive Samsung event recently. The first ever Samsung Global AI Forum enabled leading thinkers from across the industry to share their insights, talk about best practices and investigate opportunities for collaboration.</p>
<p>The event took place in New York at Samsung 837, Samsung’s marketing experience center. It happened at a time when discussion around AI is increasing in importance, with the technology expected to bring about dynamic changes.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94200" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_main-1.jpg" alt="" width="705" height="470" /></p>
<p>Twenty renowned scholars including Professor Zoubin Ghahramani of the University of Cambridge, Professor Barry Smyth of University College Dublin, Professor Alexander Rush of Harvard University and Professor Rob Fergus of New York University were invited to the Samsung Global AI Forum. Along with Samsung Electronics executives and staff, they talked about the major areas of AI application and how the hindrances in advanced AI algorithms can be overcome.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94201" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_main-2.jpg" alt="" width="705" height="396" /></p>
<p>The forum consisted of four separate sessions: Natural Language Processing; Computer Vision; Data Analytics/Recommendation System; and Overcoming Limitations in AI.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94197" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_main-3.jpg" alt="" width="705" height="396" /></p>
<ul>
<li>In the first session on Natural Language Processing, participants reviewed the current state of development in voice recognition and voice command technology and discussed ways to achieve a higher level of language processing in the future.</li>
<li>The next session on Computer Vision covered the advancement of deep learning technology in image recognition, including its application in areas such as shopping, search, and autonomous driving.</li>
<li>In the session on Data Analytics/Recommendation Systems, participants discussed how to drive consumer innovation and production efficiencies by leveraging data, which has been growing rapidly with the increasing number of smart devices and the Internet of Things (IoT).</li>
<li>The last session addressed the future of deep learning technology and the limitations to its application in many areas, which relate to interpreting the vast amounts of quality data required and applying algorithms flexibly.</li>
</ul>
<p><img loading="lazy" class="alignnone size-full wp-image-94198" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_main-4.jpg" alt="" width="705" height="396" /></p>
<p>“Today’s forum will provide an important opportunity for us to gain insights from global experts on how to prepare for the AI era. Samsung is determined to navigate through any challenges facing us along the way,” BK Yoon, President and CEO of Samsung Electronics, told attendees at the event. “Our passion and commitment together with your ideas and insights will help us take the lead in the age of AI.”</p>
<p><img loading="lazy" class="alignnone size-full wp-image-94199" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/09/Global-AI-Forum_main-5.jpg" alt="" width="705" height="396" /></p>
<p>“Samsung has a significant number of consumer devices, and it puts the company in a unique position to apply its technology in smartphones as well as home appliances,” said Rob Fergus, Assistant Professor at New York University. “I think Samsung has a lot of potential for deploying AI in practice for a number of interesting applications.”</p>
<p>Samsung announced its plans to make the forum an annual event so that the direction of AI technology development and innovation can continue to be discussed. The company will also continue to strengthen its global network of AI scholars to speed the rollout of the technology in the future.</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/49OEqpVmjlo" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"></iframe></div>
]]></content:encoded>
																				</item>
					<item>
				<title>How to Get More Done in Your Day with Bixby</title>
				<link>https://news.samsung.com/global/how-to-get-more-done-in-your-day-with-bixby</link>
				<pubDate>Fri, 11 Aug 2017 16:00:29 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_thumb704_F.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bixby]]></category>
		<category><![CDATA[Bixby Voice]]></category>
                <guid isPermaLink="false">http://bit.ly/2fxzxvc</guid>
									<description><![CDATA[Bixby is an intelligent interface that makes interacting with your phone – and managing its functions and your favorite apps – easy. Bixby’s advanced voice capabilities1 allow you to effortlessly manage your phone and your connected life using simple, natural commands. The voice support is fully integrated into your phone’s native apps, settings, and a […]]]></description>
																<content:encoded><![CDATA[<p>Bixby is an intelligent interface that makes interacting with your phone – and managing its functions and your favorite apps – easy.</p>
<p>Bixby’s advanced voice capabilities<sup>1</sup> allow you to effortlessly manage your phone and your connected life using simple, natural commands. The voice support is fully integrated into your phone’s native apps, settings, and a growing list of third-party apps, placing your phone’s most convenient and powerful features at your beck and call. So whether you’d like help taking a selfie, organizing photos from your last vacation, tracking down a particular setting, or keeping your friends updated on social media, simply tell Bixby what you need and it will work with you to get things done faster and easier.</p>
<p>Here are just a few examples of how Bixby’s voice support can help you get more done in your day.</p>
<h3><span style="color: #000080"><strong>Snap a Selfie with Bixby and Share It with Your Friends and Followers</strong></span></h3>
<p>Bixby is an expert multitasker, capable of seamlessly navigating your phone’s menus and apps.</p>
<p>Bixby is built right into your phone’s camera, allowing you to take full advantage of your camera’s advanced features with simple spoken requests, and creating an easy way to share snapshots of your day.</p>
<p>For instance, you can tell Bixby to “Take a selfie” and use Bixby to edit and enhance your image and share it with friends on social media.</p>
<p><img loading="lazy" class="alignnone wp-image-92285" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-1.gif" alt="" width="705" height="397" /></p>
<h3><span style="color: #000080"><strong>Simplify Your Bixby Conversations with Quick Commands</strong></span></h3>
<p>You don’t need to be formal or oversimplify requests when talking to Bixby. The interface’s understanding of natural language allows you to speak to your phone as you would a friend.</p>
<p>Bixby’s Quick Commands make interacting with the interface even more effortless by allowing you to replace long or complex requests with convenient keywords. For example, you can substitute a common command, such as “Turn my alarm on for 6:00 a.m.,” with a short and simple alternative, such as “Set alarm.”</p>
<p>To add a Quick Command, head to Bixby Home’s “My Bixby” menu.</p>
<p><strong> <img loading="lazy" class="alignnone wp-image-92279" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-2.gif" alt="" width="705" height="397" /></strong></p>
<h3><span style="color: #000080"><strong>Manage Apps and Settings with Ease</strong></span></h3>
<p>Bixby’s voice capability allows you to navigate your phone and go-to apps hands-free, without sacrificing access to settings or functions. Rather than manually searching for a particular feature, you can use Bixby to bypass menus and instantly locate the function you need.</p>
<p>Voice support is particularly useful when attempting to access functions that are buried beneath multiple menu screens. For example, with voice support, if you were to look up a recipe on your phone, you could ask Bixby to “Set screen timeout to 10 minutes” so you could roll up your sleeves and focus on cooking without needing to constantly wake up your device.</p>
<p>Bixby’s always there to help. Simply saying the word will allow you to streamline everyday tasks, including everything from clearing notifications and changing a wallpaper or ringtone, to activating commonly used features such as “Do not disturb” mode.</p>
<p><strong><img loading="lazy" class="alignnone wp-image-92280" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-3.gif" alt="" width="705" height="397" /></strong></p>
<h3><strong><span style="color: #000080">Enjoy a Smarter Driving Experience</span> </strong></h3>
<p>Bixby’s ability to respond to voice commands also makes it a terrific co-pilot for road trips and daily commutes, allowing you to easily access key apps and functions while keeping your eyes on the road.</p>
<p>When you’re behind the wheel and you make a request, Bixby will ask you to unlock your phone with a custom voice password<sup>2</sup> in order to carry out your command.</p>
<p>To set your unique voice password, simply locate the function under Bixby’s settings and follow the on-screen instructions. After you select a secure word or phrase, Bixby will ask you to read out a few sample sentences in order to get to know your distinct voice. Once finished, the next time you activate Bixby, you’ll be able to unlock your device using your chosen word or phrase.</p>
<p><strong><img loading="lazy" class="alignnone wp-image-92281" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-4.gif" alt="" width="705" height="397" /></strong></p>
<p>When you need a little help getting where you’re going, ask Bixby to pull up your destination in your favorite navigation app. The interface’s voice support and app integration make it easy to find the fastest route to wherever you’re headed.</p>
<p><strong><img loading="lazy" class="alignnone wp-image-92282" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-5.gif" alt="" width="705" height="397" /></strong></p>
<p>Bixby’s voice support allows drivers to handle complicated tasks hands-free, and even makes it possible to read and dictate responses to texts with messaging apps.</p>
<p><strong><img loading="lazy" class="alignnone wp-image-92283" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-6.gif" alt="" width="705" height="397" /></strong></p>
<p>This is just a sample of some of the everyday tasks that Bixby was designed to streamline. To learn more about how to use Bixby’s voice capabilities to simplify daily life, check out Samsung’s comprehensive list of tips and tricks, located under “Tutorials” in Bixby Home.</p>
<p><strong><img loading="lazy" class="alignnone wp-image-92284" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/08/Bixby-Voice-Hands-On_main-7.gif" alt="" width="705" height="397" /></strong></p>
<p><span style="font-size: small"><sup>1</sup><em>Language support is currently limited to Korean and U.S. English, and the user’s device must be Bixby-compatible. App and service compatibility may vary by region.</em></span></p>
<p><span style="font-size: small"><sup>2</sup><em>This specific function must be activated before use.</em></span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>THIS IS QLED TV, Part 6: The Convergence of TV and AI</title>
				<link>https://news.samsung.com/global/this-is-qled-tv-part-6-the-convergence-of-tv-and-ai</link>
				<pubDate>Mon, 24 Jul 2017 17:00:43 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/QLED-TV-Part-6_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Editorials]]></category>
		<category><![CDATA[TVs & Displays]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[QLED TV]]></category>
                <guid isPermaLink="false">http://bit.ly/2eGNkz7</guid>
									<description><![CDATA[In April 2017, the market research firm International Data Corporation (IDC) made a prediction that the cognitive and artificial intelligence (AI) system market will continue its high growth of a compound annual growth rate (CAGR) of 54.5 percent by 2020. The characteristics of AI systems such as prediction, recommendation and intelligent assistance are expected to […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-91801" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/QLED-TV-Part-6_main.jpg" alt="" width="705" height="528" /></p>
<p>In April 2017, the market research firm International Data Corporation (IDC) made a prediction that <a href="http://www.idc.com/getdoc.jsp?containerId=prUS42439617" target="_blank" rel="noopener noreferrer">the cognitive and artificial intelligence (AI) system market will continue its high growth of a compound annual growth rate (CAGR) of 54.5 percent by 2020</a>. The characteristics of AI systems such as prediction, recommendation and intelligent assistance are expected to take important roles in creating intelligent services.</p>
<p>These functions of AI are already influencing consumer electronics, especially smartphones and TVs. The convergence of AI and TV is subtle, but it is a trend that is taking place and will continue to develop. Let’s look at the convergence of TV and AI and how it will change future TV viewing, particularly for Samsung QLED TVs.</p>
<h3><span style="color: #000080"><strong>For Overwhelming Content, AI Is Necessary</strong></span></h3>
<p>People no longer just view channels provided by TV broadcasters. There are terrestrial networks, cable TV, satellite channels, internet protocol TV (IPTV) and even over the top (OTT) programming, as well as smart TV applications that viewers can choose to watch. Yet no one wants to check all the available options to find a show they might enjoy.</p>
<p>An AI recommendation feature could sort the content for you. AI embedded in the TV could analyze the viewer’s preferences and surface content they might be interested in on the TV screen. The 2017 Samsung QLED TV presents a broad selection of content, grouped by service provider, in a launcher section located at the bottom of the screen. Your favorite channels and services are shown first on the launcher by default, and you can also use a voice command to choose a show to watch.</p>
<p>In the near future, features like the prediction and intelligent assistance that IDC mentioned will be implemented in TVs and change the viewing experience. For example, AI could analyze weather conditions, the time and the issues of the day to offer the content best suited to those variables. Intelligent voice recognition will also one day allow the TV to converse with viewers and recommend content based on age, mood and content preference. These features will enhance TV viewing for everyone.</p>
<h3><span style="color: #000080"><strong>TV+AI, the Preview of IoT TV</strong></span></h3>
<p>As AI starts to be embedded into TVs, viewers can use them in a more convenient way. The 2017 Samsung QLED TV automatically recognizes set-top boxes, game consoles and other peripheral electronics. Instead of using a technical term such as HDMI to show the viewer which device is connected, it uses the product name, such as Xbox One, and its logo. With this intelligent assistance, viewers no longer need complicated settings to use their TVs.</p>
<p>Let’s delve a little further into examples of more advanced AI TVs. The intelligent image engine of the Samsung QLED TV automatically analyzes video sources to adjust brightness in real time according to light intensity. The engine also provides a more immersive experience by enhancing perspective and the perception of depth.</p>
<p>Samsung’s The Frame, which looks like a picture frame for artwork, has intelligent sensors not found on other televisions. Its motion sensor turns the screen off when no one is around and back on when someone approaches the TV. Furthermore, a brightness sensor automatically sets the screen brightness to smoothly match surrounding ambient light.</p>
<p>TVs with these intelligent sensors are previews of the future Internet of Things (IoT)-enabled TV. An IoT TV will optimize brightness, sound and other features according to the characteristics of the content, such as movies or sports. In the morning, your TV could recognize your motion and control the lights, blinds, air conditioner and other electronics.</p>
<p>Samsung plans to add IoT compatibility to all of its TV products by the end of this year, and by 2020, all Samsung products will be IoT-ready. Samsung’s early investment in AI and IoT will drastically change home entertainment and provide a whole new TV experience to viewers.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>[Innovation Feature Part 3] Samsung Nurturing Innovation Spirits</title>
				<link>https://news.samsung.com/global/innovation-feature-part-3-samsung-nurturing-innovation-spirits</link>
				<pubDate>Wed, 19 Jul 2017 19:00:26 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Feature-Part-3_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[More Stories]]></category>
		<category><![CDATA[People & Culture]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Samsung NEXT]]></category>
		<category><![CDATA[Samsung Silicon Valley]]></category>
		<category><![CDATA[SSIC]]></category>
                <guid isPermaLink="false">http://bit.ly/2tERbP8</guid>
									<description><![CDATA[Samsung today is an unquestioned global technology leader – yet, in this complex and competitive global landscape, the company has been embarking on an active course to transform itself, navigating emergent challengers and new technology categories, as it sets course for the future.   A major driver in this transformation has been Samsung’s expansion in […]]]></description>
																<content:encoded><![CDATA[<p><img loading="lazy" class="alignnone size-full wp-image-91724 swImageWide" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Series-Part-3_wide_main_1.jpg" alt="" width="1280" height="538" /></p>
<p><em>Samsung today is an unquestioned global technology leader – yet, in this complex and competitive global landscape, the company has been embarking on an active course to transform itself, navigating emergent challengers and new technology categories, as it sets course for the future. </em></p>
<p><em>A major driver in this transformation has been Samsung’s expansion in Silicon Valley, where the company has had roots for over 30 years. Since 2012, Samsung has been investing heavily to broaden its operations and strategic partnerships in the valley, assembling a cadre of area veterans and innovators to cultivate emerging areas such as AI, health, IoT, and the cloud, and at the macro-level, to transform Samsung’s own organization and operations. </em></p>
<p><em>From the decision to expand operations in Silicon Valley, to the current efforts there, to Samsung’s relentless mission to remain at the forefront of technology change, this three-part special feature hears directly from Samsung leaders in the valley on what the company is doing there.</em></p>
<h3><span style="color: #000080"><strong>What’s Next for Samsung Silicon Valley</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-91727" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Series-Part-3_main_2.jpg" alt="" width="705" height="106" /></p>
<p>Samsung’s presence in Silicon Valley has come to represent a new way of operating for the company. The traditional Samsung approach had always involved a deliberate examination of a market before the company would introduce a new technology. In contrast, and spearheading the next chapter in Samsung’s growth, Samsung Silicon Valley will remain focused on pursuing a radically open style of innovation to drive what Jacopo Lenzi, senior vice president at Samsung NEXT, described earlier in this series as “the evolution of Samsung from a hardware manufacturer to a comprehensive tech company that ultimately is focused on delivering experiences and solutions for consumers, beyond devices.”</p>
<p>The approach will surely be disruptive, and one rooted in examination, exploration and, at times, error. But it is these learnings and experiences that Samsung leaders firmly believe will help the company not only stay ahead, but continue enormous growth into new areas. “We recognize that the best time to disrupt is during a point of prosperity, not panic,” said Chris Byrne, Head of IP Strategy at SSIC.</p>
<p>So what’s next for Samsung in Silicon Valley? As various Samsung leaders in the valley shared, the movement will very much be around a pursuit of partnerships to drive opportunity coupled with the strategic consolidation of technologies, all intended to infuse a new spirit of openness throughout Samsung.</p>
<p><img loading="lazy" class="alignnone size-full wp-image-91737 swImageWide" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Series-Part-3_main_2_2.png" alt="" width="1920" height="720" /></p>
<h3><span style="color: #000080"><strong>Partnerships to Drive Opportunity</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-91729" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Series-Part-3_main_3.jpg" alt="" width="705" height="106" /></p>
<p>For the Silicon Valley team, the approach to innovation will continue to be the openness to ideas from anywhere, from inside and outside Samsung. “Partnerships are the difference between winning and losing. You cannot do it by yourself anymore,” said Francis Ho, Head of Digital Health at SSIC.</p>
<p>In the pursuit of partnerships to drive opportunities, Samsung leaders in the valley share a commitment to examining where the latest trends are going, with a simple end aim: to enrich people’s lives. Toward this goal, areas of particular interest such as the cloud, AI, smart health, and privacy and security may not be related to current Samsung businesses, but they represent future areas of growth for the company. “This is the great challenging experiment—how do we ‘Silicon Valley-ize’ Samsung? How do we make it think and operate like a start-up?” said Chris Byrne.</p>
<p>In practical application, Samsung’s investment arms in the valley, the Samsung Catalyst Fund and Samsung NEXT Ventures, will continue their work to discover and engage with new partners. For example, the Catalyst Fund team of just 10 already spends half of its time meeting with startups and entrepreneurs and the other half promoting collaboration with Samsung business units. This approach, underpinned by a spirit of openness to trial and error, won’t change. Brendon Kim, Managing Director of Samsung NEXT Ventures, says that “being bold often means having a strong point of view and vision to invest ahead of the clear opportunity.”</p>
<p>Also, in their continuing work, the teams will remain committed to persuading internal and outside stakeholders to think beyond the traditional value proposition of a venture. “We don’t have a 10-year time horizon like typical VCs. We just don’t look at it like they do, with five years to invest and five years to harvest. Our approach is to invest in things that are disruptive and defensible, regardless of the time horizon,” explained Shankar Chandran, Head of the Samsung Catalyst Fund.</p>
<p>Dr. Francis Ho added, “When you are a $200 billion company, everything we touch has a ‘B’. But sometimes, the work we are doing, to truly embrace open innovation, has no ‘B’ yet. So, we have to be very persuasive internally to stay focused and invest.”</p>
<h3><span style="color: #000080"><strong>Strategic Consolidation of Technologies</strong></span></h3>
<p>Looking ahead, Samsung leaders in Silicon Valley are also keen to consolidate across technologies. From vertical platforms for IoT to cloud and data storage to smart machines, the team is looking at ways to cross boundaries, and to explore and capitalize on areas of synergy.</p>
<p>One example is artificial intelligence. Of course, AI isn’t a new concept. It’s one of the most talked about topics in tech circles, both in the valley and around the world.  Yet, the technology is still in its infancy. “AI is still pretty early. We’ve positioned ourselves at the starting line. Now we need to deliver down the line,” said Jacopo Lenzi.</p>
<p>Presently, AI is very transactional, but Samsung leaders view the future of AI in the development of reasoning and planning capabilities – a field they hope to lead precisely by leveraging and consolidating across technologies. With Samsung’s vast portfolio of consumer devices, the company understands human/device interaction better than anybody, and the company has more touch points than any other through which it can bring AI to life. That might mean looking at ways a personal assistant comes to life in VR, for example, as opposed to how it works on a mobile device. “People may not think of us in AI, but we have this user base and portfolio of products with which to work. We want to use AI to make these products work better,” said Nick Cassimatis, who heads the Intelligence Lab at SRA.</p>
<p>Henry Holtzman, who leads work into convergence at SRA, is taking a similar approach to the work he and his team are doing. With more and more networked devices in our pockets, homes and cars, Samsung sees the need to make how consumers interact with devices even more fluid and convenient. “There is real user interface fatigue. Content is disorganized. And different generations consume information and content in different ways — older people interface with Netflix and younger people use Snapchat. So how do we make these gaps less confusing and more convenient?”</p>
<h3><span style="color: #000080"><strong>Infusing a New Spirit of Openness throughout Samsung</strong></span></h3>
<p><img loading="lazy" class="alignnone size-full wp-image-91728" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/07/Innovation-Series-Part-3_main_4.jpg" alt="" width="705" height="106" /></p>
<p>The ultimate aim of both the pursuit of partnerships and the consolidation across technologies is to serve as a new model of openness that Samsung can emulate throughout its operations. The hope is to help business units no longer look only at the year ahead, but instead focus on long-term ideas that can change the way people live, with work bridging multiple business units. “The goal is getting our various business units to see beyond the next year and understand the impact,” said Shankar Chandran.</p>
<p>The easiest way to accomplish this, of course, will be through real-life results that help Samsung achieve breakthroughs in innovation.</p>
<p>Henry Holtzman shares an example of how, several years ago, SRA was challenged to rethink how families shared information and memories. Team members visited homes to see the ways in which people posted information, pictures or notes. What they noticed was that many families kept calendars, photos, and messages on their fridges, turning them into a focal point of everyday life. This observation led the team to begin early designs and concepts that would take analog information sharing, like paper Post-it notes and taped-up pictures, and make it digital. The result? Calendar, sticky notes, voice, and camera features that came to life as part of Samsung’s award-winning Family Hub refrigerator. It is precisely this type of innovation and collaboration that the Silicon Valley team envisions being applied across Samsung in the future.</p>
<p>“The common element in this process was advanced concepting, which is a three-year concepting period for developing ideas,” said Holtzman. “After ideas are developed, they head through the innovation system — which includes trusted agencies and accelerators. This is the open innovation community that the Silicon Valley team is fostering.”</p>
<h3><strong><span style="color: #000080">The Future is Wide Open</span> </strong></h3>
<p>Technological innovation is moving faster than at any time in history, and thanks to Samsung’s efforts in Silicon Valley, the company is better positioned than ever before to be a leader in innovation and transform lives.</p>
<p>“The world will change more in the next five years than it did in the last 50,” says Chris Byrne. “And innovation in the next five years can and will come from anywhere. There are no geographic boundaries on brain power. So, our challenge is to be open—open in terms of how we innovate and open in terms of where innovation will come from.”</p>
<p>Even after working in the technology world for more than two decades, Lenzi said, “No other company has the same reach as Samsung, and the opportunity to partner with innovators all over the world to realize Samsung’s vision is why I’m here.” It is that passion for changing the way people live that will continue to drive Samsung’s efforts in Silicon Valley to new heights in the future.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>Bixby : A New Way to Interact with Your Phone</title>
				<link>https://news.samsung.com/global/bixby-a-new-way-to-interact-with-your-phone</link>
				<pubDate>Mon, 20 Mar 2017 22:00:42 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2017/03/InJong-Rhee_main-704x334.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Editorials]]></category>
		<category><![CDATA[Mobile]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Bixby]]></category>
		<category><![CDATA[Galaxy]]></category>
		<category><![CDATA[Galaxy S8]]></category>
		<category><![CDATA[Galaxy S8+]]></category>
		<category><![CDATA[Injong Rhee]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Unbox Your Phone]]></category>
                <guid isPermaLink="false">http://bit.ly/2n5pz5t</guid>
									<description><![CDATA[Technology is supposed to make life easier, but as the capabilities of machines such as smartphones, PCs, home appliances and IoT devices become more diverse, the interfaces on these devices are becoming too complicated for users to take advantage of many of these functions conveniently. User interface designers have to make tradeoff decisions to cram […]]]></description>
																<content:encoded><![CDATA[<div id="attachment_88028" style="width: 715px" class="wp-caption alignnone"><img loading="lazy" aria-describedby="caption-attachment-88028" class="wp-image-88028 size-full" src="https://img.global.news.samsung.com/global/wp-content/uploads/2017/03/InJong-Rhee_main.jpg" alt="" width="705" height="360" /><p id="caption-attachment-88028" class="wp-caption-text">InJong Rhee, Executive Vice President, Head of R&D, Software and Services</p></div>
<p>Technology is supposed to make life easier, but as the capabilities of machines such as smartphones, PCs, home appliances and IoT devices become more diverse, the interfaces on these devices are becoming too complicated for users to take advantage of many of these functions conveniently. User interface designers have to make tradeoff decisions to cram many functions into a small screen or bury them deeper in layers of menu trees. Ultimately, users are at the mercy of the designers, facing an increasingly steep learning curve that makes mastering a new device difficult. This is the fundamental limitation of the current human-to-machine interface. Since Samsung makes millions of devices, this problem impacts the core of our business.</p>
<p>Samsung has a conceptually new approach to the problem: instead of humans learning how the machine interacts with the world (a reflection of the abilities of designers), it is the machine that needs to learn and adapt to us. The interface must be natural and intuitive enough to flatten the learning curve regardless of the number of functions being added. With this new approach, Samsung has brought artificial intelligence and deep learning concepts to the core of our user interface designs. Bixby is the ongoing result of this effort.</p>
<p>Bixby will be a new intelligent interface on our devices. Fundamentally different from other voice agents or assistants in the market, Bixby offers a deeper experience thanks to proficiency in these three properties:</p>
<p><strong>1. Completeness</strong></p>
<p>When an application becomes <em>Bixby-enabled</em>, Bixby will be able to support almost every task that the application is capable of performing through the conventional interface (i.e., touch commands). Most existing agents currently support only a few selected tasks for an application and therefore confuse users about what does or doesn’t work by voice command. The completeness property of Bixby will simplify user education on the capability of the agent, making the behaviors of the agent much more predictable.</p>
<p><strong>2. Context Awareness</strong></p>
<p>When using a Bixby-enabled application, users will be able to call upon Bixby at any time and it will understand the current context and state of the application and will allow users to carry out the current work-in-progress continuously.  Bixby will allow users to weave various modes of interactions including touch or voice at any context of the application, whichever they feel is most comfortable and intuitive.  Most existing agents completely dictate the interaction modality and, when switching among the modes, may either start the entire task over again, losing all the work in progress, or simply not understand the user’s intention.</p>
<p><strong>3. Cognitive Tolerance</strong></p>
<p>As the number of supported voice commands grows, most users are cognitively challenged to remember the exact form of the commands. Most agents require users to state commands in a set of fixed forms. Bixby will be smart enough to understand commands with incomplete information, execute the commanded task to the best of its knowledge, and then prompt users to provide more information, completing the task piecemeal. This makes the interface much more natural and easier to use.</p>
<p>We know that adopting new ways to interact with your devices will require a change in user behavior. The inconvenience of learning a new interface can cause friction and force users to revert back to old habits (e.g. the touch interface). At the same time we believe the key to success for a new voice interface is to design a scheme that reduces friction and makes the experience significantly more rewarding than the existing interface. So at its core, Bixby will help remove friction. It will simplify user education with new voice interfaces and will make using your phone even more seamless and intuitive.</p>
<p>Another example of removing friction will be the dedicated Bixby button that will be located on the side of our next device. Confusion around activating a voice interface is a barrier we have removed to make it feel easier and more comfortable to give commands. For example, instead of taking multiple steps to make a call – turning on and unlocking the phone, looking for the phone application, clicking on the contact bar to search for the person that you’re trying to call and pressing the phone icon to start dialing – you will be able to do all these steps with one push of the Bixby button and a simple command.</p>
<p>There has been a lot of excitement and speculation about what we will deliver with the launch of the Galaxy S8 later this month, especially due to the advancements in artificial intelligence. We do have a bold vision of revolutionizing the human-to-machine interface, but that vision won’t be realized overnight. Ambition takes time.</p>
<p>Bixby will be our first step on a journey to completely open up new ways of interacting with your phone. At the launch of the Galaxy S8, a subset of preinstalled applications will be Bixby-enabled. This set will continue to expand over time. Our plan is to eventually release a tool (in SDK) to enable third-party developers to make their applications and services Bixby-enabled easily.</p>
<p>Starting with our smartphones, Bixby will gradually be applied to all our appliances. In the future, you will be able to control your air conditioner or TV through Bixby. Since Bixby will be implemented in the cloud, as long as a device has an internet connection and simple circuitry to receive voice inputs, it will be able to connect with Bixby. As the Bixby ecosystem grows, we believe Bixby will evolve from a smartphone interface to an interface for your life.</p>
<p>Bixby is at the heart of our software and services evolution as a company.  We are fundamentally and conceptually changing our attitude toward software and services and working hard on innovation throughout all aspects of our mobile ecosystem. Our investment in engineering resources speaks for itself – we have thousands of software developers supporting this effort. This is something that I’m very excited about. Innovating in software and services enables opportunities for creativity and the ability to build new experiences from the ground up.  With the continued investment from Samsung on artificial intelligence, the possibility of what Bixby can become is endless.</p>
]]></content:encoded>
																				</item>
					<item>
				<title>SoundHound Inc. Raises $75 Million to Drive Growth and International Expansion of Houndify AI Voice Technology Platform and “Collective AI”</title>
				<link>https://news.samsung.com/global/soundhound-inc-raises-75-million-to-drive-growth-and-international-expansion-of-houndify-ai-voice-technology-platform-and-collective-ai</link>
				<pubDate>Tue, 31 Jan 2017 10:00:15 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2016/08/Newsroom-Press-Release-Thumb_704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Press Release]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[ARTIK]]></category>
		<category><![CDATA[Houndify™]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[SoundHound]]></category>
                <guid isPermaLink="false">http://bit.ly/2ldnwJ4</guid>
									<description><![CDATA[SoundHound Inc.®, the leading innovator in voice-enabled AI and conversational intelligence technologies, today announced $75 million in new funding. The round includes a strategic group of investors including NVIDIA GPU Ventures, Samsung Catalyst Fund, Nomura Holdings, Inc., Sompo Japan Nipponkoa Insurance Inc., and RSI Fund I (a VC arm of Recruit Holdings). Other new investors […]]]></description>
																<content:encoded><![CDATA[<p>SoundHound Inc.<sup>®</sup>, the leading innovator in voice-enabled AI and conversational intelligence technologies, today announced $75 million in new funding. The round includes a strategic group of investors including NVIDIA GPU Ventures, Samsung Catalyst Fund, Nomura Holdings, Inc., Sompo Japan Nipponkoa Insurance Inc., and RSI Fund I (a VC arm of Recruit Holdings). Other new investors such as Kleiner Perkins, SharesPost 100 Fund, and MKaNN are joining previous investors Global Catalyst Partners, Walden Venture Capital, and TransLink Capital. SoundHound Inc. has raised $115 million in funding to date.</p>
<p>The new funding will be used to accelerate growth, invest in international expansion and realize the company’s vision of Collective AI through its Houndify<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> platform. Houndify is the first independent AI platform that enables developers and business owners to deploy it anywhere and retain control of their brand and users, while differentiating and innovating. Houndify provides all the technology ingredients necessary for voice and AI integration, including the world’s fastest speech recognition, the most sophisticated natural language understanding, easy to use developer tools, knowledge graphs, and a large and rapidly growing number of domains. Domains are programs that provide a natural and fully conversational interface on specific topics, without requiring the users to memorize and use the specific phrasing of hard coded commands or skills. Houndify technologies represent 10 years of R&D and innovation by SoundHound Inc., resulting in significant and unique advantages.</p>
<p>One of the key advantages of Houndify is its architecture for collaborative intelligence called “Collective AI,” a powerful mechanism that facilitates collaboration among developers in a conversational intelligence environment. Collective AI enables developers to extend the functionality of existing knowledge domains without needing access to or a full understanding of the underlying libraries. This results in a global AI with comprehensive knowledge that is always learning, is crowdsourced to domain experts, and is larger than the sum of its parts. Houndify’s Collective AI architecture already provides access to knowledge and data from Yelp, Uber and Expedia, as well as over 100 other domains such as weather, stocks, sports, local businesses, flights, hotels, mortgage, and even interactive games. Houndify also provides a large number of domains specifically targeted for the automotive industry.</p>
<p>“We are at the inflection point of our long-term vision that every product or service needs to have a smart voice-enabled interface, and consumers have increasingly high expectations for this requirement, beyond simple commands or skills,” said Keyvan Mohajer, co-founder and CEO of SoundHound Inc. “With this strategic investment, we will bring the power of the proprietary technology behind our independent Houndify platform to even more users globally and amplify the rollout of our Collective AI architecture.”</p>
<p>In addition to advancing this vision of Collective AI, the funding will be used to further drive international expansion, particularly to Asia and Europe. Several of SoundHound Inc.’s strategic investors will be shipping products or services that take advantage of the company’s Houndify AI platform and its patented Speech-to-Meaning<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> and Deep Meaning Understanding<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> technologies. Within the first year of its launch, over 20,000 developers have registered to use the Houndify platform for their products, including strategic partners Samsung ARTIK Smart IoT platform, and NVIDIA, which brings Houndify’s large vocabulary speech recognition and natural language understanding to cars even without cloud connection by utilizing NVIDIA GPUs. SoundHound also uses NVIDIA GPUs for fast training of the models powering its Houndify platform.</p>
<p>“AI and deep learning are fueling all areas of our business, allowing customers to tackle previously unsolvable computing problems,” said Jeff Herbst, Vice President of Business Development at NVIDIA. “Combining Houndify’s Collective AI technology with NVIDIA’s deep learning platform enables an exceptional solution to understanding complex natural conversation that scales rapidly and seamlessly into new products and applications.”</p>
<p>“Samsung Catalyst Fund accelerates technological innovation by inspiring the brightest minds and ideas and providing access to Samsung’s global resources and expertise,” said Shankar Chandran, managing director and head of Samsung Catalyst Fund. “We believe the best innovations occur when ideas are shared through open ecosystems and collaboration. That is at the heart of what the team at SoundHound Inc. is building with their vision of Collective AI, which provides the tools and platform for developers to build more intelligent solutions easily and rapidly.”</p>
<p>This funding round comes at a time when the projected $14.4 trillion Internet of Things market has an increasing need for practical and innovative AI technologies, including natural language processing and voice interfaces. With over 20,000 partners on Houndify, hundreds of millions of mobile app downloads globally, and more than 100 patents, SoundHound Inc. is the only privately held company to own an entire suite of proprietary speech and language understanding technologies. At a time when every company needs to have a strategy around AI, Houndify is the only independent platform that is allowing its partners to develop, own and control their own AI strategy, data, and brand.</p>
<p>Houndify is featured in SoundHound Inc.’s mobile apps: Hound, the voice search & assistant app, and SoundHound, the music search, discovery and play app, making them hands-free and voice interface enabled. Houndify supports other technology and service providers to make their offerings available to its large and growing developer community, such as the recent partnership with SELVAS AI, which allows Houndify users to incorporate more than 20 distinct text-to-speech solutions in multiple languages into their products or services. SoundHound Inc. recently announced several new partnerships that use the Houndify platform such as Rand McNally, which uses the company’s voice technology in its OverDryve<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> connected car device; a collaboration with Onkyo, to develop and market a next-generation series of smart speakers powered by Houndify; an integration of Houndify in Sharp’s RoBoHoN® platform; and a collaboration with Shenzhen Tanscorp Technology Co. for the Robot LQ-101, an intelligent family service robot.</p>
<p>Developers interested in exploring the Houndify platform can visit <a href="http://Houndify.com" target="_blank">Houndify.com</a> to learn more and register. The Hound voice search and assistant app is available as a free iOS or Android app, and can be downloaded via <a href="http://cts.businesswire.com/ct/CT?id=smartlink&url=https%3A%2F%2Fbnc.lt%2FUIAk%2FCLvtY2YQmA&esheet=51502076&newsitemid=20170131005583&lan=en-US&anchor=SoundHound.com%2FHound&index=18&md5=3acc963e8928411d25ee71ddcd878d3d" target="_blank">SoundHound.com/Hound</a>.</p>
<p><span style="text-decoration: underline"><strong><span style="font-size: small">About SoundHound Inc.</span></strong></span><br />
<span style="font-size: small">SoundHound Inc. turns sound into understanding and actionable meaning. We believe in enabling humans to interact with the things around them in the same way we interact with each other: by speaking naturally to mobile phones, cars, TVs, music speakers, and every other part of the emerging ‘connected’ world. Our consumer product, Hound, leverages our Speech-to-Meaning<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> and Deep Meaning Understanding<img src="https://s.w.org/images/core/emoji/16.0.1/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> technologies to create a groundbreaking smartphone experience, and is the first product to build on the Houndify platform. Our SoundHound product applies our technology to music, enabling people to discover, explore, and share the music around them, and even find the name of that song stuck in their heads by singing or humming. Through the Houndify platform, we aim to bring voice-enabled AI to everyone and enable others to build on top of it. We call this Collective AI. Our mission: Houndify everything.</span></p>
]]></content:encoded>
																				</item>
					<item>
				<title>Samsung to Acquire Viv, the Next Generation Artificial Intelligence Platform</title>
				<link>https://news.samsung.com/global/samsung-to-acquire-viv-the-next-generation-artificial-intelligence-platform</link>
				<pubDate>Thu, 06 Oct 2016 08:00:27 +0000</pubDate>
								<media:content url="https://img.global.news.samsung.com/global/wp-content/uploads/2016/10/Viv-Acquisition_thumb704.jpg" medium="image" />
				<dc:creator><![CDATA[Samsung Newsroom]]></dc:creator>
						<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Mobile communications]]></category>
		<category><![CDATA[Viv]]></category>
		<category><![CDATA[Viv Labs Acquisition]]></category>
                <guid isPermaLink="false">http://bit.ly/2droosy</guid>
									<description><![CDATA[Samsung Electronics today announced that it has agreed to acquire Viv Labs, the intelligent interface to everything. Viv has developed a unique, open artificial intelligence (AI) platform that gives third-party developers the power to use and build conversational assistants and integrate a natural language-based interface into renowned applications and services. The transaction is subject to […]]]></description>
																<content:encoded><![CDATA[<p>Samsung Electronics today announced that it has agreed to acquire Viv Labs, the intelligent interface to everything. Viv has developed a unique, open artificial intelligence (AI) platform that gives third-party developers the power to use and build conversational assistants and integrate a natural language-based interface into renowned applications and services. The transaction is subject to customary closing conditions.</p>
<p>The deal showcases Samsung’s commitment to virtual personal assistants and is part of the company’s broader vision to deliver an AI-based open ecosystem across all of its devices and services. With Viv, Samsung will be able to unlock and offer new service experiences for its customers, including one that simplifies user interfaces, understands the context of the user and offers the user the most appropriate and convenient suggestions and recommendations.</p>
<p>Viv was founded by AI visionaries Dag Kittlaus, Adam Cheyer and Chris Brigham. As part of the acquisition, the founding team will work closely with Samsung’s Mobile Communications business, but continue to operate independently under its existing leadership.</p>
<p style="margin: 0cm;margin-bottom: .0001pt;line-height: 180%">“Unlike other existing AI-based services, Viv has a sophisticated natural language understanding, machine learning capabilities and strategic partnerships that will enrich a broader service ecosystem,” said Injong Rhee, CTO of the Mobile Communications business at Samsung Electronics. “Viv was built with both consumers and developers in mind. This dual focus is also what attracted us to Viv as an ideal candidate to integrate with Samsung home appliances, wearables and more, as the paradigm of how we interact with technology shifts to intelligent interfaces and voice control.”</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/TrjWhQuPvig" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"></iframe></div>
<p>With the rise of AI, consumers now desire an interaction with technology that is conversational, personalized and contextual—an experience that fits seamlessly within their everyday lives. To achieve this, Viv has developed a new and proprietary platform that allows this interaction to scale.</p>
<p>Viv’s platform also allows developers to teach the system how to create new applications or to use existing applications, building an open ecosystem of intelligence that is greater than the sum of its parts and gets smarter every day. Viv’s superior platform combined with Samsung’s leading devices, services and global resources will help drive the next generation of AI solutions.</p>
<p>“At Viv, we’re building the simplest way for anyone to talk to devices and services everywhere. We see a future that is decidedly beyond apps—where you can get what you need quickly and easily no matter where you are, or what device you are near,” said Viv co-founder and CEO, Dag Kittlaus. “Samsung offers us a unique opportunity to deliver a single conversational interface to the world’s apps and services across a diverse range of products, at global scale.”</p>
<div class="youtube_wrap"><iframe loading="lazy" src="https://www.youtube.com/embed/Op7LHp_91lQ" width="300" height="150" frameborder="0" allowfullscreen="allowfullscreen"></iframe></div>
<p>Samsung’s Global Innovation Center spearheaded the acquisition for Samsung’s Mobile Communications business. “At GIC we’re always looking for startups with a compelling vision and breakthrough experiences we can help build, grow and scale,” said Jacopo Lenzi, SVP of Business Development and Strategic Acquisitions. “We see great potential in the Viv AI software and platform, and we’re excited for Viv to reach our millions of users through Samsung’s global presence and distribution.”</p>
<p><span style="text-decoration: underline"><span style="font-size: small"><strong>About Viv Labs, Inc.</strong></span></span></p>
<p><span style="font-size: small">Viv radically simplifies the world by providing an intelligent interface to everything. Viv is an artificial intelligence platform that enables developers to distribute their products through an intelligent, conversational interface. It’s the simplest way for the world to interact with devices, services and things everywhere. Viv is taught by the world, knows more than it is taught, and learns every day. Learn more about Viv at <a href="http://viv.ai/" target="_blank">viv.ai</a>.</span></p>
]]></content:encoded>
																				</item>
			</channel>
</rss>