Public Lab Research note

Camera sensor issues

by viechdokter | April 09, 2016 13:24 | 33 views | 5 comments | #12949


Yeah, I know! Many guys here do REAL scientific work, whereas I just toy around with my new spectrograph. But well...

Today we have some sun here in Germany, so I tried to "work" on the overexposure problem. While doing this, I noticed a few things in the spectra that I wanted to share with you. Here is the first spectrum:


The sun was just behind a cloud, so I aimed the spectrograph towards the sky and got a diffuse sunlight spectrum. The greens were still slightly overexposed. As in the sunset pictures, the yellow is almost gone. Unfortunately, you seem to get yellows only when the greens are strongly overexposed. Everything that lowers the overall light intensity takes away parts of the spectrum. Shame, really!

But now let's have a look at the curve of this spectrum only:


Although the green is clipped (overexposed), we can see two distinct intensity peaks there. And even more interesting, we can detect three different peaks of red! Now the dilemma:

the grating is supposed to bend light of different wavelengths into different directions. Although we usually see yellow as a sum of red and green light, in the rainbow it is a colour in its own right, at about 580 nm. The camera sensor can only detect red, green and blue, so it has to compose all the other colours out of those three. Violet is blue plus red, isn't it? Yes, the curve shows a small red peak and a lot of blue in the place where we get the violet colour. Blue plus red equals violet. Every small kid knows that.

But hey! When the grating bends light waves of different wavelengths into different directions, how can we detect any red at all in the blueish part of the spectrum? Shouldn't the red be only on the right-hand side? What exactly does the camera sensor detect?

I used to think that photons of different energy (different wavelengths) induce electron flow in different sensors (red, green, and blue sensors) and that the sensor ranges "overlap a bit", so that certain photons affect two sensors if their wavelength lies between, say, red and green. But in this curve it looks like all three sensors use photons from the whole range of wavelengths.
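That broad overlap can be sketched with made-up response curves. The Gaussian shapes, centre wavelengths and widths below are illustrative assumptions, not measured data for any real sensor, but they show how a single wavelength can register on all three channels at once:

```python
import math

# Illustrative (not measured) channel response curves, modelled as
# Gaussians: (centre wavelength in nm, width in nm). Real Bayer-filter
# responses are broader and bumpier, but the overlap idea is the same.
CHANNELS = {
    "red":   (600.0, 50.0),
    "green": (540.0, 45.0),
    "blue":  (460.0, 45.0),
}

def response(wavelength_nm):
    """Relative signal each channel produces for one monochromatic wavelength."""
    out = {}
    for name, (centre, width) in CHANNELS.items():
        out[name] = math.exp(-((wavelength_nm - centre) / width) ** 2)
    return out

# Even pure green light (~540 nm) produces a small but non-zero signal
# in the red and blue channels, and spectral yellow (~580 nm) drives
# red and green together.
for wl in (460, 540, 580, 610):
    print(wl, {k: round(v, 3) for k, v in response(wl).items()})
```

Under this toy model there is no hard cutoff between the channels, which would explain why the red curve never quite drops to zero across the blue and green parts of the spectrum.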

Here is the next spectrum. Direct sunlight through the slit. Green is strongly overexposed:


And the curve again:


We now have a nice yellow, but we still have the two green and three red peaks, almost exactly in the same places (same distances), just shifted about 10 nm to the right. But why exactly is there a drop in the intensity of the greens between the two green peaks? Even in the strongly overexposed spectrum the green drops in the middle. Why?

And another overexposure issue. Look at this curve:


All three channels are strongly overexposed and clipped. I wonder why the image in the middle becomes grey. Is this the camera's way of saying: "That's too much for me! Let's fade to grey!"? And look at the curve again:


Each colour channel is clipped at a certain point because of the strong light, but in the middle (the "strange behaviour" part) all three of them are clipped and reduced to about 80%, which adds up to grey (100% would be white, 0% would be black). So the camera sensor doesn't mind a single colour becoming too intense, but when all three of them get too bright in one place, it darkens that whole part down a bit.
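The grey-versus-white arithmetic can be checked in a few lines. Assuming 8-bit channels (full scale 255), "about 80%" is roughly 204, and any triple with equal channels below full scale reads as grey:

```python
# Why equal, partially-attenuated channels look grey:
# in additive RGB, r == g == b is neutral; 255 is white, 0 is black,
# anything in between is a shade of grey. The 204 value is just
# 80% of an assumed 8-bit full scale of 255.

def mix(r, g, b):
    """Describe an 8-bit RGB triple in rough perceptual terms."""
    if r == g == b:
        return "white" if r == 255 else "black" if r == 0 else "grey"
    return "coloured"

print(mix(255, 255, 255))  # fully clipped everywhere -> white
print(mix(204, 204, 204))  # all three pulled down to ~80% -> grey
print(mix(255, 204, 0))    # unequal channels -> coloured
```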

Brain food, huh?


Interesting observations. My guess is that IF the camera's sensor responded in a purely linear fashion, the overexposure would be white, as you'd expect. Since the diffraction grating doesn't care about light intensity, my guess is that the camera's internal AGC (automatic gain control) is working to protect the sensor electronics and attenuating the signal that gets converted into the RGB channels. Just a guess, but I can't think of any other component of the optical-to-data signal path that would be this non-linear.
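That guess can be sketched as a toy pipeline. The saturate-then-attenuate ordering and the 0.8 gain value are assumptions invented for illustration, not the behaviour of any specific camera: if each photosite saturates first and a common AGC gain is then applied, the clipped region lands at an equal sub-maximum value in all three channels, i.e. grey instead of white:

```python
# Toy sketch of the AGC guess: photosites saturate first, then a
# common (guessed) automatic gain attenuates all channels equally.
FULL_SCALE = 255

def saturate(v):
    """Clip one channel to the sensor's full-scale value."""
    return min(FULL_SCALE, v)

def camera(r, g, b, agc_gain=1.0):
    """Saturate each channel, then apply a shared AGC gain."""
    return tuple(int(saturate(c) * agc_gain) for c in (r, g, b))

# Without AGC, a very bright patch clips to pure white...
print(camera(400, 500, 450))                # -> (255, 255, 255)
# ...but with the gain cut, all three clipped channels land at the
# same sub-maximum value, which reads as grey.
print(camera(400, 500, 450, agc_gain=0.8))  # -> (204, 204, 204)
```

Note that unclipped parts of the image keep their colour ratios under a common gain; only the region where all three channels saturate collapses to a single grey level, which matches the "strange behaviour part" in the curve.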


I also think that when a blue pixel is extremely overexposed, neighboring pixels may be triggered, either by current on the sensor chip itself or potentially somehow in the blending of color channels. So this is another reason to avoid overexposure, as it can cause a variety of color issues in addition to losing peak data.


Also, it may be helpful to add some common tags like "spectrometer,overexposure" to your posts, which I've been trying to help do too. This will hopefully help interrelate your work with similar topics. Thanks!


Yeah, thanks for the tags. I will try to add them to my next notes. But feel free to add any tags you find helpful.


Jeff, I have doubts about "current bleeding" between the quantum wells, as they are isolated from each other with silicon, and the conversion to data is generally per-row and on-chip, so clipped but not blended. I'd be more inclined to accept an optical effect involving refraction between the filter layers at the sensor's front surface creating a 'blooming' effect, or even refraction effects of the cheap polycarbonate lens. But yes, all just additional reasons to prevent channel clipping.

