Quad Bayer vs Quad Pixel AF: what they are, how they work and how they differ

Photo: dpreview.com

The Olympus OM-1's sensor has 80 million photodiodes, which are used to deliver 20MP images and across-the-sensor X-type phase detection. This sometimes gets mistakenly described as a Quad Bayer design.

That's not the case, though, so we're going to take a look at how these two systems work, how they differ, and consider why neither of the ILCs known to use Quad Bayer sensors makes that detail public.

What is Quad Bayer?

Sony Semiconductor has developed a technology called Quad Bayer, initially promoted for use in smartphones.

The Quad Bayer design (right) uses an oversized version of the conventional Bayer pattern (left). Each color patch extends over four photodiodes, each of which has its own microlens in front of it. Image: adapted from Sony Semiconductor illustration

It uses a Bayer color filter pattern, but each colored patch extends across four photodiodes, instead of one. Each photodiode has its own microlens, so it can be used as an individual pixel, if needed.

This allows you to use the sensor in three different ways, depending on whether your priority is low-light performance, the ability to cope with high-contrast scenes, or extracting the most detail from the scene:

In low light, the four photodiodes behind each color patch can have their output combined (binned), so that they collectively act as a single, large pixel with better noise performance. The image is 1/4 the sensor's nominal resolution (there's a short binning sketch after this list).

In high-contrast scenes, every other row of the sensor can be read out early, so it retains highlights that would otherwise be clipped. This highlight information can be combined with the midtone and shadow information to give a 1/4 resolution image with wider dynamic range than a regular sensor.

When detail is the priority, an attempt is made to re-interpret the Quad Bayer data into something closer to Bayer data, to give a full-resolution image. This won't be as detailed as a true Bayer image with the same pixel count, but it still gives much more detail than the other two modes, and hence more detail than the 1/4-resolution sensor that would be needed to match the low-light mode's performance.
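To make the binning mode from the first item concrete, here's a minimal sketch in Python/NumPy, not any manufacturer's actual pipeline, of how each 2x2 quartet of photodiodes under one color patch can be combined into a single quarter-resolution value:

```python
import numpy as np

def bin_quad_bayer(raw):
    """Combine each 2x2 quartet of same-color photodiodes into one value.

    `raw` is a full-resolution Quad Bayer mosaic (H x W, both even).
    Because all four photodiodes in a quartet sit under the same color
    patch, summing them yields a quarter-resolution Bayer mosaic with
    better noise performance, at the cost of resolution.
    """
    h, w = raw.shape
    quartets = raw.reshape(h // 2, 2, w // 2, 2)
    return quartets.sum(axis=(1, 3))  # one value per 2x2 quartet

# Example: an 8x8 Quad Bayer mosaic bins down to a 4x4 Bayer mosaic.
raw = np.random.poisson(50, size=(8, 8)).astype(np.float64)
print(bin_quad_bayer(raw).shape)  # (4, 4)
```

Summing (or averaging) the four readings improves the signal-to-noise ratio of each output pixel at the cost of resolution, which is exactly the trade-off the low-light mode makes.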

As you might imagine, the first applications of this technology were in smartphones, where it's used to work around the low-light and dynamic range shortcomings of tiny pixels on small sensors. Samsung has its own version, branded 'Tetracell', and a nine photodiode-per-color variant called 'Nonacell'.

This three-mode approach has direct parallels with Fujifilm's Super CCD EXR system, which first appeared back in 2009. That system essentially used a doubled (and 1/2-pixel offset) Bayer pattern, rather than a fully quadrupled design, so it offered less detail in its high-resolution mode, but the principle is very similar.

For high-res mode an attempt is made to 'remosaic' the Quad Bayer output into Bayer-like data. This process is imperfect. It's worth noting that Sony describes this as a 'conceptual' diagram. The 'remosaiced' data will not have the color resolution of a conventional Bayer sensor with the same pixel count.

Image: adapted from Sony Semiconductor diagram
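As a rough illustration of what remosaicing has to achieve (again only a sketch, not Sony's algorithm), the snippet below turns a Quad Bayer mosaic into a standard RGGB Bayer layout by shuffling samples within each 4x4 tile, swapping the two inner rows and the two inner columns. Because each moved sample ends up one photodiode away from where it was captured, real remosaic algorithms interpolate rather than shuffle, which is part of why the result falls short of a true Bayer capture.

```python
import numpy as np

def naive_remosaic(quad_bayer):
    """Rearrange a Quad Bayer mosaic into a standard RGGB Bayer layout.

    Within each 4x4 tile, the Quad Bayer pattern
        R R G G          R G R G
        R R G G   ==>    G B G B
        G G B B          R G R G
        G G B B          G B G B
    if rows 1 and 2 and columns 1 and 2 of the tile are swapped.
    This displaces each moved sample by one photodiode, so real
    remosaicing interpolates instead; this is purely illustrative.
    """
    out = quad_bayer.copy()
    h, w = out.shape
    tiles = out.reshape(h // 4, 4, w // 4, 4)       # view onto `out`
    tiles[:, [1, 2], :, :] = tiles[:, [2, 1], :, :]  # swap tile rows 1 and 2
    tiles[:, :, :, [1, 2]] = tiles[:, :, :, [2, 1]]  # swap tile columns 1 and 2
    return out
```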

Interestingly, there's evidence that Quad Bayer sensors are used in the Panasonic DC-GH5S and Sony's a7S III. Neither company acknowledges this, and neither camera shows any outward sign of using anything other than the binned (low-light) mode. However, the fact that both cameras are very video-focused raises the possibility that they use the high-contrast, HDR mode to capture additional highlights simultaneously with the midtones and shadows (an approach that avoids any timing difference between the two captures, making it ideal for video).
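Purely as an illustration of how such a merge could work (neither company has documented this, and the function below is entirely hypothetical), the rows read out early act as a shorter exposure that preserves highlights, and they can be scaled by the exposure ratio and substituted wherever the longer exposure is close to clipping:

```python
import numpy as np

def merge_staggered_exposures(long_exp, short_exp, ratio, clip_level=0.95):
    """Illustrative merge of a long (midtone/shadow) and short (highlight) capture.

    `long_exp` and `short_exp` are matching quarter-resolution frames
    normalized to [0, 1]; `ratio` is long/short exposure time.
    Where the long exposure is near clipping, the scaled short exposure
    is used instead, extending highlight range. Real implementations
    blend more gradually and handle motion; this is only a sketch.
    """
    clipped = long_exp >= clip_level
    merged = long_exp.copy()
    merged[clipped] = short_exp[clipped] * ratio
    return merged
```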

We approached both companies for comment on the choice of sensors in these models, but neither addressed whether the sensors are Quad Bayer*.

What is Quad Pixel AF?

The OM-1's Quad Pixel AF system also uses color filters that extend across four photodiodes, but it places a single, large microlens in front of each quartet of photodiodes. This powers its X-type PDAF system, but means the individual photodiodes can't be used as separate pixels.

Image: OM Digital Solutions, with translated captions.

This isn't what's happening in the OM System OM-1, though. Like a Quad Bayer sensor, its Quad Pixel AF system uses a Bayer color filter pattern in which each colored patch extends across four photodiodes, instead of one. However, the OM-1 has large microlenses that extend across these groups of four photodiodes. This means they can't be used as separate pixels, because each one receives light from only one quadrant of the lens aperture, giving it a partial view of the scene in front of the camera.

This design has direct parallels with Canon's Dual Pixel AF system. But instead of having left- and right-looking 1/2 pixels, the OM-1 has Up/Left-, Up/Right-, Down/Right- and Down/Left-looking 1/4 pixels. This means the camera can derive X-shaped AF sensitivity, rather than the solely horizontal AF sensitivity of Canon's system.
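As an assumption-laden toy model (not OMDS's actual algorithm), the sketch below pairs the four quadrant views two ways: the left quadrants against the right for a horizontal phase signal, and the top against the bottom for a vertical one. The OM-1's 'X-shaped' sensitivity suggests the real pairings may be diagonal, but the principle is the same: defocus shows up as a shift between two views that sample opposite sides of the lens, and that shift can be estimated by cross-correlation.

```python
import numpy as np

def phase_shift_1d(a, b, max_shift=4):
    """Estimate the integer shift between two 1D signals by cross-correlation.

    Signals should be longer than 2 * max_shift samples.
    """
    shifts = range(-max_shift, max_shift + 1)
    scores = [np.sum(a[max_shift:-max_shift] * np.roll(b, s)[max_shift:-max_shift])
              for s in shifts]
    return shifts[int(np.argmax(scores))]

def quad_pixel_af(ul, ur, dl, dr):
    """Toy model of phase detection from four quadrant sub-images.

    `ul`, `ur`, `dl`, `dr` are 2D arrays of the up-left, up-right,
    down-left and down-right photodiode values for a block of pixels.
    Left vs right views yield a horizontal disparity (sensitive to
    vertical detail); top vs bottom views yield a vertical disparity
    (sensitive to horizontal detail). Defocus appears as a shift
    between each pair of views.
    """
    left, right = ul + dl, ur + dr
    top, bottom = ul + ur, dl + dr
    # Collapse each pair along the axis perpendicular to the shift measured.
    h_shift = phase_shift_1d(left.mean(axis=0), right.mean(axis=0))
    v_shift = phase_shift_1d(top.mean(axis=1), bottom.mean(axis=1))
    return h_shift, v_shift
```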

We've been very impressed by the OM-1's autofocus results so far. But the partial view of the scene received by each photodiode, which is what gives the AF system its sense of distance, means there's no way for OM Digital Solutions to 'unlock' an 80MP sensor mode: although its filter pattern is the same as Quad Bayer's, its microlens design means Quad Pixel AF isn't the same thing.

*Panasonic tells us the GH5S's chip was chosen for its readout speed, performance at high gain and the incorporation of dual conversion gain; it did not address whether the sensor is Quad Bayer.
