[Snapshot: Gear 360] Camera Technology that Captures Everything Around You
Samsung’s Gear 360, with its ability to capture one’s entire surroundings, provides an entirely new way to film and share life’s every moment. Yet, as a novel device boasting entirely new features, the Gear 360 differs from traditional cameras in its structure, lens type and even its development process.
Six camera experts who worked on the Gear 360 project in the Mobile Communications Business explained the challenges they had to overcome, as well as the lessons they learned, during the development of the product.
Q. Tell us a bit about the Gear 360 and how it came to be developed.
Yongwook Kim: In the same way that consumers once prompted camera manufacturers to support video recording on digital cameras, they now want to see photos and watch video content just as they would with their own eyes. Gear VR provides such an experience, allowing viewers to feel as if they are actually in the moment.
Yet people also want to capture these moments, along with every little object and action that appears in them, as accurately as possible. Understanding this, we developed the Gear 360.
Dongok Choi: Devices like smartphones and TVs, for which Samsung is best known, are essentially tools that enable viewers to consume content. The Gear 360, on the other hand, is a new type of product that lets them produce content.
Q. How did the development process of the Gear 360 differ from that of traditional cameras?
Dongwoo Kim: Traditional cameras allow you to capture an image within a limited angle of view through the lens, so you have to actually point the camera directly at the subject you’re shooting to capture it. In typical lenses, there is a central area and a peripheral area. Because we naturally focus on the center of a photograph, camera lenses, in general, produce the best picture quality at the center and relatively lower quality on the sides.
But 360-degree cameras are different. They shoot in all directions and don’t distinguish between central and peripheral areas. When using Gear VR, for example, the “center” of an image or video changes as you turn your head in different directions. If there were a drastic difference in picture quality between the central and peripheral areas, it wouldn’t feel very lifelike. Therefore, we had to minimize this difference on the Gear 360, and we placed great emphasis on designing and manufacturing the lens to ensure consistent picture quality.
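To picture why no region of a 360-degree frame can be treated as “peripheral,” consider how a VR viewer’s gaze maps onto the captured panorama. The minimal sketch below is illustrative only, not Samsung’s code: it assumes the common equirectangular layout and shows that a simple head turn moves the viewport “center” to an arbitrary column of the image.

```python
import numpy as np

def gaze_to_equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewer's gaze direction (yaw, pitch in degrees) to the
    corresponding pixel in an equirectangular 360-degree panorama.
    Any direction can become the 'center' of the viewport."""
    yaw = np.radians(yaw_deg)      # left/right head turn
    pitch = np.radians(pitch_deg)  # up/down head tilt
    # Longitude/latitude map linearly onto the equirectangular grid.
    u = (yaw + np.pi) / (2 * np.pi) * width    # 0..width
    v = (np.pi / 2 - pitch) / np.pi * height   # 0..height
    return int(u) % width, min(int(v), height - 1)

# Looking straight ahead vs. turning the head 90 degrees to the right:
print(gaze_to_equirect_pixel(0, 0, 3840, 1920))   # (1920, 960): image center
print(gaze_to_equirect_pixel(90, 0, 3840, 1920))  # (2880, 960): a "peripheral" column
```

Because every column of the panorama can end up in front of the viewer’s eyes, quality has to hold up everywhere, not just in the middle of the frame.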
Q. How did you accomplish this?
Dongwoo Kim: Since it was our first time developing a 360-degree camera, we had to go through numerous procedures including analyzing consumer demands, determining design specifications and developing a methodology for evaluation and standardization.
Also, we had to consider things we had never needed to think about with smartphone cameras, such as the stitching algorithm, which connects pictures and videos from all directions into a sphere-like form. We had a lot to do on the software front, too.
Soojung Kim: To align the sensor assembly and lens assembly for the Gear 360, we needed equipment to carry out the “active align” process. However, in the initial stages of the Gear 360’s development, the facilities that manufactured this equipment were also in their infancy, producing only 10 to 20 units per hour.
Because of this, it took an enormous amount of time to manufacture a pilot device for testing. We really couldn’t afford to wait, so we had to roll up our sleeves and personally help out with the manually operated equipment. I remember how excited we were when we finally saw the module completion process working properly.
Q. What were the challenges in using two camera modules on the Gear 360?
Kiyun Jo: Considering the sphere-like design of the product, as well as the performance and efficiency of aligning multiple cameras, we decided to use two camera modules and two image sensors. This created new challenges, including sensitivity deviation between the sensors and differences in the amount of image shift from each camera.
A typical camera design might incorporate multiple lenses into a single lens module to allow for more sophisticated functions. To do this, it is critical to align the optical axis of each lens accurately. But developing the Gear 360 was even more challenging, as we had to square each lens module perfectly with the other.
Another key element in manufacturing a camera is establishing a precise positional relationship between the lens and the sensor. While the tilt of the sensor is typically the top priority in smartphone cameras, with the Gear 360 we focused on both the three translational components (x, y and z) and the rotational components (yaw, pitch and roll). In other words, we had to apply far more sophisticated standards to the tilt and rotation of the lenses.
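As an illustration of what controlling all six of those components involves, here is a minimal sketch of how a sensor pose relative to the lens axis can be expressed as a single transform combining translation (x, y, z) and rotation (yaw, pitch, roll). The tolerance figures are hypothetical; this is an assumption-laden toy, not the Gear 360’s actual alignment software.

```python
import numpy as np

def pose_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 homogeneous transform for a sensor pose relative to
    the lens optical axis: translation (x, y, z) in mm plus rotation
    (yaw, pitch, roll) in degrees, applied in Z-Y-X order."""
    cy, sy = np.cos(np.radians(yaw)), np.sin(np.radians(yaw))
    cp, sp = np.cos(np.radians(pitch)), np.sin(np.radians(pitch))
    cr, sr = np.cos(np.radians(roll)), np.sin(np.radians(roll))
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical tolerance check for an "active align" step (numbers invented):
measured = pose_matrix(0.01, -0.02, 0.05, 0.03, 0.02, 0.05)
tilt_deg = np.degrees(np.arccos(measured[2, 2]))  # angle between sensor normal and lens axis
print("within tilt tolerance:", tilt_deg < 0.1)
```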
Q. Why did you choose the 195-degree view angle?
Junghyun Lee: If you want to shoot everything in a full field of view of 360 degrees with two lenses, each lens has to capture more than 180 degrees. Also, you need some overlap of the two images or videos for stitching.
We had no specific formula to work with, but we knew that it was better to go for a larger view angle to give the software enough information to successfully stitch the images.
However, a larger view angle forces the device to use up more resources, making the device more difficult to design in terms of optics and components. At such a wide angle, the lenses might also capture the camera’s own components, so we had to pay extra attention to ensure that the components would not intrude on the view angle. After much trial and error, we found that a 195-degree view angle guaranteed optimal results.
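The arithmetic behind that trade-off is straightforward: with two back-to-back lenses, the width of the overlap band at each stitching seam is simply the field of view minus 180 degrees. A tiny sketch (illustrative, not from the Gear 360’s firmware):

```python
def seam_overlap_deg(fov_deg):
    """Width of the overlap band at each stitching seam for two
    back-to-back lenses, each with the given field of view."""
    # Each lens reaches fov/2 past its own axis; the seam lies 90 degrees
    # off-axis, so each lens sees fov/2 - 90 degrees beyond the seam.
    # The two contributions together span fov - 180 degrees.
    return fov_deg - 180

print(seam_overlap_deg(195))  # 15-degree band for the software to stitch
print(seam_overlap_deg(180))  # 0: no margin, stitching would have nothing to match
```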
Q. Why do you use fish-eye lenses?
Dongok Choi: Videos and images taken with a fish-eye lens are displayed in a sphere-like form. So if you look at them in 2D, such as on a smartphone, they look flattened and distorted. Considering a 360-degree camera needs to shoot and record everything around you, a fish-eye lens is favorable because it captures things more accurately.
To put this into perspective, imagine a globe in a classroom. Thanks to its spherical shape, it represents scale and size more accurately than a flat world map.
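For a concrete sense of how a fish-eye lens squeezes such a wide field onto a flat sensor: many fish-eye designs approximate the “equidistant” model, where image radius grows linearly with the angle off the optical axis (r = f·θ). The sketch below assumes that idealized model with hypothetical focal and center parameters; the Gear 360’s actual lens profile is not published in this form.

```python
import numpy as np

def fisheye_project(direction, f_px, cx, cy):
    """Project a 3D direction onto an equidistant fish-eye image
    (r = f * theta): the image radius grows linearly with the
    angle from the optical axis (+z)."""
    x, y, z = direction / np.linalg.norm(direction)
    theta = np.arccos(z)                # angle off the optical axis
    r = f_px * theta                    # equidistant mapping
    phi = np.arctan2(y, x)              # direction around the axis
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

# A point 97.5 degrees off-axis (the edge of a 195-degree lens) still
# lands on the sensor at a finite radius:
edge = np.array([np.sin(np.radians(97.5)), 0.0, np.cos(np.radians(97.5))])
print(fisheye_project(edge, f_px=500, cx=960, cy=960))
```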
Q. You previously mentioned video and image stitching. How does that work?
Junghyun Lee: Since we shoot with two 195-degree lenses that face opposite directions, the excess parts of the images (from 180 to 195 degrees) overlap each other. The software processes this overlap to seamlessly connect the two images on the sides.
For optimal results, we tried to make the colors and other elements of the two stitched images appear as natural as possible even before they were processed by the software. To this end, we took various steps, including minimizing the differences in color and exposure between the two lenses by adjusting ISO/AWB, and fine-tuning the optics during the engineering process.
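One simple way to visualize what stitching software can do in that overlap band is a linear cross-fade (feathering) between the two aligned strips. The sketch below is a deliberately simplistic stand-in; the Gear 360’s actual stitching pipeline is far more sophisticated.

```python
import numpy as np

def feather_blend(strip_a, strip_b):
    """Linearly cross-fade two aligned image strips from the overlap
    band (e.g., the 180-195 degree region seen by both lenses).
    strip_a fades out left-to-right while strip_b fades in."""
    h, w, _ = strip_a.shape
    alpha = np.linspace(1.0, 0.0, w).reshape(1, w, 1)  # weight for strip_a
    return (alpha * strip_a + (1 - alpha) * strip_b).astype(strip_a.dtype)

# Two synthetic strips with slightly different exposure, as might remain
# even after ISO/AWB matching; the seam fades instead of showing a hard edge.
a = np.full((4, 8, 3), 200, dtype=np.uint8)
b = np.full((4, 8, 3), 180, dtype=np.uint8)
print(feather_blend(a, b)[0, :, 0])  # values step smoothly from 200 down to 180
```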
Q. How did you evaluate the picture quality for the Gear 360?
Yongwook Kim: There was no established standard for determining picture quality. As I mentioned earlier, a 360-degree image makes no distinction between its central and peripheral parts, so we didn’t know where to look to judge whether the picture quality was satisfactory. We therefore continued to discuss the issue and to revise the standards and their scope throughout the development and verification phases.
Soojung Kim: While we had established standards for evaluating the picture quality of conventional cameras, we didn’t have any for sphere-like images and videos. In addition, we concentrated our efforts on meeting consumer expectations, which was difficult as steady advances in smartphone cameras have driven those expectations ever higher. As a result, we had to devise an entirely new set of standards to evaluate images captured by the Gear 360.
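As a flavor of what such a standard might measure, one plausible building block is sharpness uniformity: checking that no patch of the spherical frame is markedly softer than another. The metric below (variance of a Laplacian per patch, a common sharpness proxy) is offered as an assumed example, not Samsung’s actual evaluation criteria.

```python
import numpy as np

def laplacian_variance(patch):
    """Variance of a simple 4-neighbor Laplacian: a common proxy
    for local sharpness (higher = more detail)."""
    lap = (-4 * patch[1:-1, 1:-1] + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return lap.var()

def sharpness_uniformity(gray, grid=(3, 6)):
    """Split an equirectangular frame into a grid of patches and
    report the min/max sharpness ratio: 1.0 means perfectly uniform."""
    h, w = gray.shape
    rows, cols = grid
    scores = [laplacian_variance(gray[r*h//rows:(r+1)*h//rows,
                                      c*w//cols:(c+1)*w//cols])
              for r in range(rows) for c in range(cols)]
    return min(scores) / max(scores)

rng = np.random.default_rng(0)
frame = rng.normal(128, 20, (192, 384))      # stand-in for a test-chart shot
print(round(sharpness_uniformity(frame), 2))  # near 1.0 for uniform detail
```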