Camera parameter tuning and the development of color processing algorithms require verification over hundreds of images. Locating the coordinates of calibration targets such as the Macbeth ColorChecker can be labor intensive. How many hours of graduate research assistants’ and interns’ time do you think are spent locating ColorCheckers?
Joking aside, we recently developed a software tool called “CCFind” to automatically detect the Macbeth ColorChecker inside an image. This is our attempt to help the camera manufacturers, researchers, and photographers we come in contact with daily. It is available for free from this link.
At the time that I developed the adaptive homogeneity-directed (AHD) demosaicking algorithm, demosaicking was the next exciting problem in the imaging world. With a timely publication and the help of Dave Coffin (author of DCRaw) and Paul J. Lee (contributor to DCRaw), AHD became one of the most widely adopted demosaicking methods. Nearly ten years later, sensor resolution has exceeded the resolution that the optics can deliver. Once a hot research topic, demosaicking receives far less attention today.
“So, is demosaicking dead?”
Somehow people take me for a spokesperson for demosaicking, and I am asked this question often. Very often. My answer has been “No, demosaicking is not dead.” In fact, there are many treasures yet to be uncovered.
First of all, it is true that demosaicking *research* has limited impact on camera design today. Poor handling of demosaicking will certainly degrade image quality, so demosaicking qualifies as an “important,” or at least “relevant,” problem. But the newest demosaicking algorithm will not yield significantly better results than AHD in most scenarios. In other words, the existing methods are “good enough” for practical purposes.
But what many overlook is the fact that other problems in the camera pipeline are intimately connected with demosaicking.
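For readers who have not seen demosaicking up close, here is a minimal sketch of the simplest approach—plain bilinear interpolation, not AHD—assuming an RGGB Bayer pattern. The function names and the deliberately naive zero-padded edge handling are my own choices for illustration, not anyone's production code.

```python
import numpy as np

def conv3(plane, kern):
    """3x3 filtering with zero padding (the kernels used here are symmetric)."""
    p = np.pad(plane, 1)
    h, w = plane.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += kern[i, j] * p[i:i + h, j:j + w]
    return out

def bilinear_demosaic(cfa):
    """Estimate a full RGB image from an RGGB Bayer mosaic.

    Each missing color sample is the average of its nearest same-color
    neighbors -- the baseline that methods like AHD improve upon.
    """
    h, w = cfa.shape
    # Masks marking which pixel sites carry each color in the RGGB pattern.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    # Classic bilinear interpolation kernels for the Bayer lattice.
    kern_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    kern_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    rgb = np.zeros((h, w, 3))
    for ch, (mask, kern) in enumerate([(r_mask, kern_rb),
                                       (g_mask, kern_g),
                                       (b_mask, kern_rb)]):
        rgb[..., ch] = conv3(np.where(mask, cfa, 0.0), kern)
    return rgb
```

On a flat gray mosaic this reconstructs the gray exactly (away from the borders); on real images, bilinear interpolation smears edges and produces the “zipper” artifacts that edge-directed methods were invented to suppress.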
Truth be told, camera manufacturers spend just as much effort tuning parameters as they put into designing the camera itself. Even then, professional photographers still manage to take more pleasing images than amateurs because they understand what controls they have over the camera. The role of parameters in a camera cannot be overstated. So why can’t parameter optimization be automated?
It turns out there is a practical problem in trying to answer this question. Leaving aside the aesthetics of photography, we turn our attention to image quality assessment.
Subjective quality assessment of digital images has many tangible benefits for the design of camera systems. “Noise” and “artifacts” are best described as the aspects of an image that appear most unnatural to the human eye. An objective visual quality assessment (QA) metric aimed at unsupervised prediction of perceived quality would expedite the advancement of camera systems by replacing subjective analysis with an automatic one. A “self-tuning” camera must pick the set of parameters that maximizes the QA score.
But think of the challenges. Today, there are two categories of QA. A full-reference assessment (FR-QA) compares the perceived similarity of a given image to an ideal reference image—and for that, an ideal reference image must be made available.
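To make the “self-tuning” idea concrete, here is a toy sketch (my own illustration, not an actual camera pipeline): a single denoising-strength parameter is swept, and the setting that maximizes a full-reference QA score—PSNR, standing in for a perceptual metric—is kept. The synthetic scene, the noise level, and the `box_denoise` stage are all invented for this example.

```python
import numpy as np

def psnr(ref, img):
    """Full-reference QA stand-in: peak signal-to-noise ratio (peak = 1.0)."""
    mse = np.mean((ref - img) ** 2)
    return 10 * np.log10(1.0 / mse) if mse > 0 else np.inf

def box_denoise(img, strength):
    """Toy camera stage: blend the image with its 3x3 local mean."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    local_mean = sum(p[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    return (1 - strength) * img + strength * local_mean

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))   # ideal reference image
noisy = clean + 0.1 * rng.standard_normal(clean.shape)

# "Self-tuning": sweep the parameter and keep the QA-maximizing setting.
grid = np.linspace(0.0, 1.0, 21)
best = max(grid, key=lambda s: psnr(clean, box_denoise(noisy, s)))
```

Note the catch this sweep exposes: it needs the ideal reference image `clean`, which a camera in the field does not have—precisely the FR-QA limitation described above.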
In a previous blog entry, I discussed the relationship between sensor resolution and color. In this blog entry, we focus on the analysis of cross-talk.
Unfortunately, imaging experts have made inaccurate claims about the relationship between the CFA and cross-talk, based on speculative reasoning. Since cross-talk affects adjacent pixels, the conventional wisdom says the CFA should be designed so that each color in the CFA is evenly spread out (e.g. on a lattice). The idea is that every pixel should have a similar neighborhood structure. This turns out to be wrong! There is little to be gained from such analysis.
Here is the correct way to understand color cross-talk. Photon/electron leakage is essentially a sharing of a pixel’s value with its neighbors. This is nothing more than the familiar “low-pass filter” (a spatial blurring). When a low-pass filter is applied to CFA sensor data, it surely introduces ambiguity among the red/green/blue color components. But there is more to this story. The low-pass filter is best understood in the Fourier domain—it attenuates the spatial high-pass components of the CFA sensor data.
A rigorous Fourier analysis of CFA sensor data will be covered in another blog entry. For now, let’s just say that CFA sampling “modulates” chrominance (color) components to spatial high-pass regions.
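A small numerical illustration of this modulation (my own sketch, assuming an RGGB Bayer pattern and a constant-color scene): the DFT of the mosaicked image shows luminance at DC and chrominance parked at the high-pass carriers (π,0), (0,π), and (π,π), which a cross-talk-style low-pass filter then wipes out.

```python
import numpy as np

h, w = 64, 64
R, G, B = 0.8, 0.6, 0.2          # a constant-color scene

# Sample the scene through an RGGB Bayer CFA.
cfa = np.empty((h, w))
cfa[0::2, 0::2] = R
cfa[0::2, 1::2] = G
cfa[1::2, 0::2] = G
cfa[1::2, 1::2] = B

# Bayer sampling decomposes as
#   y = (R+2G+B)/4 + (R-B)/4 * [(-1)^n1 + (-1)^n2] + (R-2G+B)/4 * (-1)^(n1+n2),
# i.e. luminance at DC, chrominance on the high-pass carriers.
F = np.abs(np.fft.fft2(cfa)) / (h * w)
# F[0, 0] ~ 0.55; F[h//2, 0] = F[0, w//2] ~ 0.15; F[h//2, w//2] ~ 0.05

def circ_blur(x):
    """Separable [1, 2, 1]/4 blur with circular boundaries: a toy cross-talk."""
    for axis in (0, 1):
        x = 0.25 * np.roll(x, 1, axis) + 0.5 * x + 0.25 * np.roll(x, -1, axis)
    return x

# The blur's frequency response is zero at pi along each axis, so the
# chrominance carriers vanish while the DC (luminance) term is untouched.
Fb = np.abs(np.fft.fft2(circ_blur(cfa))) / (h * w)
```

This is exactly the red/green/blue ambiguity mentioned above: the low-pass filter destroys the modulated chrominance carriers, not the luminance.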
Consumers love pixels… lots and lots of pixels. To squeeze more pixels into the same sensor surface, the pixel geometry is made smaller. Imagine subdividing a 1 inch by 1 inch area into 10 million parts instead of 5 million. There are many unintended consequences of this miniaturization of pixels. In this blog entry, I’d like to discuss color and pixel geometry.
When two adjacent pixels are close together, there is a problem of pixel leakage. We call this problem “cross-talk.” There are two predominant types of cross-talk: optical diffraction and minority carrier diffusion. Optical diffraction refers to light leakage—a photon that lands on one pixel strays before it gets captured by the photodetector of another pixel. Minority carrier diffusion refers to electron leakage—a photon strikes the photodetector and generates electrons, and these electrons stray and get captured by a neighboring pixel. In either case, each pixel value is “shared” with its neighbors. In terms of image processing, the main difference between the two is that optical diffraction yields independent Poisson random variables at each pixel (because each photon is a single particle). Contrast this with minority carrier diffusion, which causes neighboring pixels to have joint Poisson statistics because they share charge generated by the same photon.
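The statistical distinction can be checked with a small Monte Carlo sketch (my own toy model: one-dimensional pixel rows, circular boundaries, and an invented stray fraction p). When individual photons stray before detection, Poisson thinning keeps neighboring pixel values independent; when detected charge is shared, neighbors reuse the same Poisson draw and become positively correlated.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, n, trials = 100.0, 0.2, 64, 10000   # mean photons, stray fraction

# Optical diffraction: each PHOTON independently strays to a neighbor
# before detection.  Thinning a Poisson stream yields independent streams,
# so adjacent pixel values remain independent Poisson variables.
photons = rng.poisson(lam, (trials, n))
stay = rng.binomial(photons, 1 - 2 * p)
strays = photons - stay
left = rng.binomial(strays, 0.5)            # half the strays go left...
right = strays - left                       # ...the rest go right
optical = stay + np.roll(left, -1, axis=1) + np.roll(right, 1, axis=1)

# Minority carrier diffusion: the photon is detected first, and the
# resulting CHARGE is shared, so neighbors reuse the same Poisson draw.
charge = rng.poisson(lam, (trials, n))
carrier = ((1 - 2 * p) * charge
           + p * np.roll(charge, 1, axis=1)
           + p * np.roll(charge, -1, axis=1))

def neighbor_corr(x):
    """Empirical correlation between horizontally adjacent pixels."""
    a, b = x[:, :-1].ravel(), x[:, 1:].ravel()
    return np.corrcoef(a, b)[0, 1]
```

With these numbers, `neighbor_corr(optical)` hovers near zero, while `neighbor_corr(carrier)` comes out strongly positive (about 0.55 in theory for p = 0.2).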
Welcome to the Intelligent Signal Systems Laboratory blog. ISSL has unique capabilities and expertise in image processing and is recognized as a leader in camera processing pipeline design. This blog is a collection of technical commentaries on signal and image processing, statistics, and color science. Our publications cover the tedious details, but this blog is intended for tech-savvy readers who want a high-level understanding of image and camera processing. There is a lot of misinformation, myth, and confusion among photographers, camera manufacturers’ advertisements, and even imaging experts about how the camera pipeline works. This blog will hopefully shed light on some of these questions.
I will also add that I am working toward a book on the camera processing pipeline. More details will be made available in the near future.
In any case, I hope you enjoy the ISSL blog! Please feel free to comment on the topics and provide additional insights, feedback, or counterarguments. I welcome suggestions on other blog topics to cover. Please browse around the ISSL website and explore!
Keigo Hirakawa, PI