Imagine a camera sensor as a collection of lots of smaller sensors. Each collects red, green, or blue light. These sensors are packed together to produce an image measuring 3024 by 4032 pixels. (Technically, each pixel on that sensor is called a ‘photosite,’ as they collect, yes, photons.)
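The arithmetic behind that headline number is worth spelling out — a 3024 × 4032 grid of photosites works out to roughly 12 megapixels. A quick sketch (the 50/25/25 color split assumes a standard Bayer mosaic, which is typical but not confirmed as Apple's exact layout):

```python
# A 3024 x 4032 grid of photosites, as described above.
width, height = 3024, 4032
total = width * height
megapixels = total / 1_000_000

# In a standard Bayer mosaic (an assumption here, not Apple's spec),
# half the photosites filter green, a quarter red, a quarter blue.
green, red, blue = total // 2, total // 4, total // 4

print(f"{total} photosites ≈ {megapixels:.1f} MP")
print(f"green: {green}, red: {red}, blue: {blue}")
```

Run it and you get 12,192,768 photosites — the familiar “12 MP” marketing figure.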
You’d think a bigger sensor means more pixels — and indeed, a bigger sensor could allow you to pack in more pixels. But we’re at a point of diminishing returns in the megapixel wars.
Instead, Apple decided to make the photosites bigger, because one of the most important aspects of image quality (and really, life in general) is signal to noise.
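Why do bigger photosites help? Photon arrival is governed by Poisson statistics, so shot noise grows as the square root of the signal — meaning SNR scales as √N photons collected. A minimal sketch with hypothetical photon counts (the specific numbers are illustrative, not measured values):

```python
import math

def shot_noise_snr(photons):
    # Shot noise follows Poisson statistics: noise = sqrt(signal),
    # so SNR = N / sqrt(N) = sqrt(N).
    return photons / math.sqrt(photons)

# Hypothetical numbers: a photosite collecting 10,000 photons vs. one
# with twice the light-gathering area collecting 20,000 in the same exposure.
small, big = 10_000, 20_000
print(shot_noise_snr(small))  # 100.0
print(shot_noise_snr(big))    # ~141.4
```

Doubling the photosite’s area only buys about a 1.4× improvement in SNR — but in dim light, where every photon counts, that difference is exactly where bigger photosites earn their keep.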
An important note on the difference between the Max and non-Max:
Here’s why we’re seeing stories that the camera is a minor difference at best: most people who aren’t seeing the dramatic difference are shooting in daylight, with a fast ƒ/1.6 lens. On top of that, Apple’s intelligent image processing combines multiple shots together, which makes it harder to isolate what the hardware alone is doing.
The visuals and diagrams here, especially of the sensors, are really cool. This is great work.
I still don’t want the huge phone, but it sure does look amazing.