Samsung is a leader in developing high-performance image sensors that improve the quality of photos and video. At its recent Tech Day 2022 event, Samsung shared the latest improvements and future plans for a range of System LSI technologies, including image sensors. Joonseok Kim, Vice President and Head of the Project Management Team, discussed the pain points of smartphone photography and videography and how Samsung continues to develop image sensors to improve the quality of content captured with them.
Image sensor technology has come a long way, so much so that smartphone photo quality is now almost comparable to that of professional digital cameras. Sensor technology is key, but software for multi-frame noise reduction, HDR and AI, among other techniques, has also been critical to this evolution.
How we use cameras has also changed considerably. Smartphone photography used to be primarily about still images, but now it's all about video, which is considered more compelling and, with the rise of popular social media apps, in far greater consumer demand. Yet there's still a gap between the quality of still images and video, and closing that gap has proven difficult.
Why Smartphone Video Hasn’t Yet Caught Up
The first reason smartphone video quality lags is the presence of visual "noise," especially in low light. At 30 frames per second (fps), each video frame can be exposed for no more than about 1/30 of a second, which is too short to capture enough light in dim conditions.
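The exposure ceiling is simple arithmetic. A minimal sketch (illustrative only; the 1/4-second night-mode still is an assumed example, not a figure from the article) comparing the light budget of a video frame against a longer still exposure:

```python
# At 30 fps, a video frame's exposure time is capped by the frame interval,
# while a still photo in low light can expose far longer.
fps = 30
frame_interval_ms = 1000 / fps      # ~33.3 ms ceiling per video frame
still_exposure_ms = 250             # assumed example: a 1/4 s night-mode still

# With fixed aperture and sensitivity, light gathered scales roughly linearly
# with exposure time, so each video frame collects only a fraction of the
# light available to the still:
light_ratio = frame_interval_ms / still_exposure_ms
print(f"Video frame exposure ceiling: {frame_interval_ms:.1f} ms")
print(f"Light per frame vs. 1/4 s still: {light_ratio:.0%}")
```

Under these assumptions each video frame receives roughly an eighth of the light of the still, which is why low-light video is noisier than a low-light photo from the same sensor.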
The second problem concerns high dynamic range (HDR): in complex lighting conditions such as backlighting, it is difficult to render the true colors of both the subject and the background. To achieve this, a smartphone must run multi-exposure capture and multi-frame fusion on the system-on-chip (SoC), which consumes too much memory and power to be practical for video.
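To see why fusion is memory-hungry, consider a toy version of multi-exposure fusion. This is an illustrative sketch, not Samsung's actual SoC algorithm: `fuse_hdr` and its parameters are hypothetical, and real pipelines fuse full frames (each of which must be buffered, which is the memory cost the article refers to), not short lists.

```python
# Toy multi-exposure fusion: for each pixel, use the long exposure unless it
# clipped (blown highlight), in which case fall back to the short exposure
# scaled up to match brightness.
def fuse_hdr(short_exp, long_exp, short_gain=4.0, clip=255):
    fused = []
    for s, l in zip(short_exp, long_exp):
        if l >= clip:                         # long exposure saturated here
            fused.append(min(s * short_gain, clip))
        else:                                 # long exposure is well exposed
            fused.append(l)
    return fused

# Backlit scene: subject is dark in the short exposure, while the sky is
# clipped in the long exposure.
short = [10, 12, 60, 63]     # underexposed, but keeps highlight detail
long = [40, 48, 255, 255]    # well-exposed subject, blown-out sky
print(fuse_hdr(short, long))  # → [40, 48, 240, 252]
```

Every exposure involved must be held in memory simultaneously before fusing, and at video frame rates this buffering and per-pixel work repeats 30 times per second, which is why doing it on the SoC is costly.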
The third is the lack of depth-sensing capability. One of the best features of a DSLR camera is "bokeh," the pleasing blur that appears where part of an image is out of focus. With SoC-based processing, a user can take photos in which the subject stands out from a blurred background, but for video this method is impractical for the same memory and power reasons as HDR.
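The idea behind depth-driven bokeh can be sketched in one dimension. This is a hypothetical toy, not a production algorithm: given a per-pixel depth map (the kind a ToF sensor provides), each pixel is blurred in proportion to its distance from the focal plane, so the subject stays sharp while the background softens.

```python
# Toy 1-D bokeh: blur radius grows with each pixel's distance from the
# in-focus depth, approximating how defocus blur behaves in a real lens.
def apply_bokeh(row, depth, focus_depth, max_radius=2):
    out = []
    n = len(row)
    for i in range(n):
        # pixels far from the focal plane get a wider box blur
        radius = min(max_radius, abs(depth[i] - focus_depth))
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

pixels = [100, 100, 200, 200, 100, 100]
depths = [5, 5, 1, 1, 5, 5]   # subject at depth 1, background at depth 5
blurred = apply_bokeh(pixels, depths, focus_depth=1)
# Subject pixels (depth 1) pass through unchanged; background pixels are
# averaged with their neighbors.
```

The quality of the result depends entirely on the accuracy of the depth map, which is why the article ties precise video bokeh to ToF sensors that measure true depth rather than estimating it in software.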
“We decided that we needed to solve these problems with sensors as opposed to software, and we’re taking a three-part approach. First, we’ll be making improvements in light and exposure sensitivity, which has been a big challenge especially for small and thin smartphone cameras,” said Kim. “Second, to increase the luminance range we are working on 12-bit and 14-bit sensors, and toward even higher dynamic range sensors for superb HDR. And third, we are developing ToF (Time-of-Flight) sensors that detect true image depth. Our goal is to provide a precise bokeh for smartphone video and other 3D applications.”
The Advanced ISOCELL Pixel Technologies
The science of creating pixels has made substantial progress in recent years. As a rule, high-resolution image sensors need small, light-sensitive pixels. To capture as much light as possible, the pixel structure has evolved from front-side illumination (FSI) to back-side illumination (BSI), which places the photodiode layer above the metal wiring rather than below it. By locating the photodiode closer to the light source, each pixel can capture more light. The downside of this structure is higher crosstalk between pixels, leading to color contamination.
“To remedy such a drawback, Samsung introduced ISOCELL, its first technology that isolates pixels from each other by adding barriers. The name ISOCELL is a compound of the words ‘isolate’ and ‘cell,’” Kim explained. “By isolating each pixel, ISOCELL can increase a pixel’s full well capacity to hold more light and reduce crosstalk from one pixel to another.”