Is a 600MP smartphone sensor as stupid as it sounds?

Feb 22, 2020
No, our eyes don't have anywhere near 500-600 megapixels' worth of photoreceptors. In fact, we have about 120 million rod cells and about 6 million cone cells per eye. The rod cells are not color sensitive, and what's more, they fully saturate and shut down in normally bright light. So in daylight you have about 6 million "picture elements" per eye.

But that's not the whole story. Those 6 million pixels are also connected to a really powerful deep-learning supercomputer. Your eyes experience constant microtremors, shifting just a bit as you view a scene, and your brain integrates information from those multiple positions to boost the effective resolution by at least 4x. So you're really seeing something like 24 megapixels per eye -- though once again, cone cells are picture elements, not exactly pixels.
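To make that multi-position idea concrete, here's a tiny, heavily idealized sketch -- not a model of the retina, and the grid sizes and the 4x factor are just illustrative assumptions. The point is only that several coarse, sub-pixel-shifted samplings of the same scene, dropped back onto a finer grid, recover detail no single coarse frame contains:

```python
# Idealized shift-and-add sketch: 16 coarse, sub-pixel-shifted samplings of a
# scene are merged onto a 4x finer grid. Sizes and the 4x factor are assumed
# for illustration; real photoreceptors and cortical processing are far messier.
import numpy as np

rng = np.random.default_rng(0)
scale = 4                                 # the "at least 4x" boost
scene = rng.random((64, 64))              # stand-in for the fine-detail scene

recon = np.zeros_like(scene)
for dy in range(scale):
    for dx in range(scale):
        # one "fixation": the coarse mosaic point-samples the scene at an offset
        coarse = scene[dy::scale, dx::scale]          # 16x16 samples
        # drop each sample back onto the fine grid where it was taken
        recon[dy::scale, dx::scale] = coarse

print("samples per coarse frame:", coarse.size)        # 256
print("fine-grid values recovered:", recon.size)       # 4096
print("exact recovery under these idealized assumptions:", np.allclose(recon, scene))
```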

The rod cells are active in low light, and yes, that's 120 million per eye, or 480 million or so once that same multi-position integration is factored in. But that's in very low, photon-starved light. The reason the rods are so numerous and so small is that they're extremely sensitive to photons. But while they're active, there are never enough photons for very many rods to be firing all at once. So even though what you see in low light is the product of many rods firing over time, you still see grain -- just as your camera does when it's photon-starved in low light.
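A quick way to see why photon starvation means grain is to simulate it. This is just a toy Poisson shot-noise example with made-up photon counts, not data about real rods or real sensors:

```python
# With only a handful of photons per receptor per integration period, Poisson
# shot noise dominates and SNR scales roughly as sqrt(mean photon count).
# The photon counts below are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(1)
for mean_photons in (2, 20, 2000):        # very dark, dim, bright (illustrative)
    counts = rng.poisson(mean_photons, size=100_000)
    snr = counts.mean() / counts.std()
    print(f"mean {mean_photons:>4} photons -> SNR ~ {snr:5.1f} "
          f"(sqrt(mean) predicts {np.sqrt(mean_photons):5.1f})")
```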

Samsung will have the same problem. Right now they're using 800nm (0.8µm) pixels in the 108 megapixel chips in the Xiaomi and the S20 Ultra, and in the 64 megapixel and 48 megapixel chips. Samsung also has a few 700nm chips, mostly used for tetracell selfie cameras. But they're probably not going much smaller. For one, the signal-to-noise ratio drops as a pixel's ability to capture a large number of photons diminishes, which it inevitably does as the pixel shrinks.
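Back-of-the-envelope, the pixel-size penalty looks like this. The 10,000-photon figure for a 1.4µm reference pixel is an assumed number, not a measurement; the point is just that photons collected scale with pixel area, and shot-noise-limited SNR with its square root:

```python
# For a fixed scene brightness and exposure, photons per pixel scale with
# pixel area; shot-noise-limited SNR scales with sqrt(photons).
# The 10,000-photon reference for a 1.4 um pixel is an assumed value.
import math

ref_pitch_um, ref_photons = 1.4, 10_000   # assumed reference pixel
for pitch_um in (1.4, 1.0, 0.8, 0.7):
    photons = ref_photons * (pitch_um / ref_pitch_um) ** 2
    snr_db = 20 * math.log10(math.sqrt(photons))
    print(f"{pitch_um:.1f} um pixel: ~{photons:>7.0f} photons, "
          f"shot-noise SNR ~ {snr_db:.1f} dB")
```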

Rod cells in the eye are much larger, about 2µm in diameter, and cones larger still. The eye's "sensor area" is actually quite a bit larger than that of a full frame camera, though maybe not so obviously, because of course it's curved rather than flat. In fairness to Samsung, today's silicon photodiodes have a quantum efficiency as high as 95%, while the eye's photoreceptors convert a much smaller fraction of incoming photons into a usable signal.
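For the area claim, a rough sanity check with assumed round numbers -- the eyeball radius and the fraction of the inner sphere lined by retina here are guesses for illustration, not anatomical measurements:

```python
# Rough geometry check on the "bigger than full frame" claim.
# Eye radius and retina coverage fraction are assumed round numbers.
import math

full_frame_mm2 = 36 * 24                 # 864 mm^2, standard 35mm frame
eye_radius_mm = 12                       # assumed ~24 mm eyeball diameter
retina_fraction = 0.7                    # assumed fraction of the sphere lined by retina
retina_mm2 = retina_fraction * 4 * math.pi * eye_radius_mm ** 2

print(f"full frame: {full_frame_mm2} mm^2")
print(f"retina (curved): ~{retina_mm2:.0f} mm^2")
```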

Secondly, the wavelength of deep red light is around 700nm. Once pixels shrink to roughly the size of the wavelengths they're supposed to capture, correct color capture could be an issue, which argues against going any smaller.
 
