From the GSM Arena article:
Let’s start from the beginning – a Bayer filter is a colorful mosaic of Red, Green and Blue filters that allows a digital sensor to capture color photos. Semiconductor pixels don’t “see” color, they only capture the amount of light that hits them, so without a filter you will get a Black & White photo. The Bayer filter makes sure that the light reaching each pixel is of one of the three primary colors. The way it works out is that a 12MP sensor, for example, has 6 million pixels that see green, and 3 million pixels each for red and blue. Green gets more pixels because the human eye is the most sensitive to that color. An algorithm called demosaicing is used to interpolate a full 12MP resolution image.
A Quad Bayer filter is a bit of a misnomer as it’s actually the same as a regular Bayer filter. What really changes is not the filter but the sensor behind it – these new sensors put four pixels behind each color square instead of just one. So, really these 48MP Quad Bayer sensors can’t offer much more detail than a 12MP sensor. Sensor and phone makers alike will tell you that smarter demosaicing algorithms can capture more detail, but our experience is that the gain is small – if there’s a gain to be had at all.
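The pixel arithmetic in the quoted paragraphs is easy to sanity-check. A quick sketch (mine, not the article's, with a hypothetical 4000 x 3000 sensor standing in for "12MP"): a standard Bayer mosaic repeats a 2x2 RGGB tile, so each tile contributes one red, two green and one blue pixel.

```python
# Illustrative sketch: count the colour filters in a repeating 2x2 RGGB
# Bayer tile on a hypothetical 4000 x 3000 (12MP) sensor.
def bayer_counts(width, height):
    """Count R, G and B filtered pixels, assuming even dimensions.

    Each 2x2 RGGB tile holds 1 red, 2 green and 1 blue pixel.
    """
    tiles = (width // 2) * (height // 2)
    return {"R": tiles, "G": 2 * tiles, "B": tiles}

counts = bayer_counts(4000, 3000)
# -> 3 million red, 6 million green, 3 million blue, matching the
#    6M green / 3M red / 3M blue split the article describes
```

The same tile geometry explains Quad Bayer: each colour square simply covers a 2x2 block of four photosites instead of one, which is why a "48MP" Quad Bayer sensor resolves roughly like a 12MP one.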
It also chats about noise reduction, single shot HDR, and zoom:
Noise is a random process and if the large pixel of a traditional sensor captures noise instead of signal, there’s little to be done (other than covering it up by interpolating data from neighboring pixels). If one of the four pixels on a Quad Bayer sensor captures noise, however, that’s only 25% of the information lost – a 4x noise reduction that doesn’t diminish the sharpness of the image.
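The "4x" figure in the quote is best read as a variance reduction: averaging four independent same-colour readings cuts noise variance by 4x, i.e. standard deviation by 2x. A rough simulation (my own toy model, with made-up signal and noise levels, not anything from the article):

```python
import random
import statistics

def binned_noise(n=50_000, signal=100.0, sigma=10.0, seed=42):
    """Compare the noise (stdev) of single-pixel readings against the
    noise after averaging Quad Bayer groups of four same-colour pixels.

    Models each reading as signal + Gaussian noise; purely illustrative.
    """
    rng = random.Random(seed)
    singles = [signal + rng.gauss(0, sigma) for _ in range(n)]
    groups = [
        sum(signal + rng.gauss(0, sigma) for _ in range(4)) / 4
        for _ in range(n)
    ]
    return statistics.stdev(singles), statistics.stdev(groups)

single_sd, binned_sd = binned_noise()
# single_sd comes out roughly 2x binned_sd: variance down ~4x,
# standard deviation down ~2x
```

So the averaging genuinely suppresses noise without the spatial smearing that interpolating from neighbouring pixels causes, which is the article's point.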
Alternatively, the sensor can be split up into two logical sensors – one that captures a short exposure and one a long exposure. This is used in daylight for real-time HDR capture.
You could do noise reduction and HDR with a single non-quad Bayer sensor by taking two (or more) photos one after another and combining them. That’s what the Pixel phones do and they are quite good at it. But there’s a problem – moving objects change position between sequential exposures. A Quad Bayer filter takes two photos at the same time, so there’s no need to use AI to correct for artifacts caused by moving objects.
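The split-exposure idea above can be sketched in a few lines. This is a deliberately naive fusion (my own toy model, not the sensor's real pipeline; the 4:1 exposure ratio and 255 full-well value are made up): trust the long exposure where it's valid, and where it clipped, recover the highlight from the short exposure scaled by the exposure ratio.

```python
def fuse_exposures(short, long_, ratio=4, full_well=255):
    """Naively fuse a short and a long exposure of the same scene.

    Toy model of single-shot HDR: keep the long-exposure value unless
    it saturated at full_well, in which case substitute the short
    exposure scaled up by the exposure ratio.
    """
    return [
        s * ratio if l >= full_well else l
        for s, l in zip(short, long_)
    ]

# Toy scene with one highlight whose true brightness (400) exceeds the
# long exposure's 255 full-well:
long_exp = [10, 50, 100, 255]     # highlight clipped
short_exp = [2.5, 12.5, 25, 100]  # 1/4 exposure, highlight intact
print(fuse_exposures(short_exp, long_exp))  # [10, 50, 100, 400]
```

Because both "views" come from the same instant of capture, there is no motion between them to reconcile, which is exactly the advantage over the Pixel-style sequential-shot approach.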
I'd recommend you read the full article (with a cup of something - it's quite long, with lots of photo examples to click through and view), but I'd like to lay out some of the benefits and drawbacks of 'Quad Bayer' versus the 40MP+ PureView arrangements we had in the Nokia 808 and Lumia 1020:
| |PureView (808/1020)|Quad Bayer|
|---|---|---|
|Tech|Proprietary oversampling on standard Bayer.|Grouping pixels in blocks of four under each R, G or B filter.|
|Advantages|Higher genuine detail at both oversampled (5MP) and full resolutions. Noise reduction excellent, thanks to up to 7:1 pixel combination.|Noise reduction pretty good, by processing results from each pixel in a block of four. Single shot HDR possible, thanks to taking two different 'views' of the sensor at capture time. Much cheaper to implement (now, in 2019), since the components are mainstream(!)|
|Disadvantages|Much harder to do HDR, since multiple exposures are needed.|Limited lossless zoom, since colour accuracy suffers immediately. Details also aren't as good as on PureView devices.|
Comments welcome. Is Quad Bayer a step forwards or a step backwards overall? And do you think the reason we haven't seen any manufacturer even attempt a full Bayer 40MP+ sensor since the days of PureView is patents (I'm guessing Nokia or Microsoft still hold these)?
Of course, the original Nokia PureView 41MP devices are ancient history in terms of hardware and now even OS, so the comparison is academic. But still...
I guess a modern 48MP sensor being pixel-binned like this is still preferable to an 'as-is' 12MP sensor in terms of HDR and low light, though enough manufacturers (Google, Samsung) are sticking with the latter, plus ancillary cameras and lots of processing, that the playing field is now not as straightforward as 'back in the day'!