The latest iOS 13 developer beta gives us a sneak peek at Apple's new Deep Fusion mode

Photo: dpreview.com

Earlier this week, Apple released the first developer beta version of iOS 13 with support for its Deep Fusion technology built-in. Although there’s still plenty to learn about the feature, multiple developers have already taken the camera tech for a spin and shared their thoughts (and results) around the web.

To refresh, below is a brief explainer on what Deep Fusion is from our initial rundown on the feature:

‘Deep Fusion captures up to 9 frames and fuses them into a higher resolution 24MP image. Four short and four secondary frames are constantly buffered in memory, throwing away older frames to make room for newer ones [...] After you press the shutter, one long exposure is taken (ostensibly to reduce noise), and subsequently all 9 frames are combined - 'fused' - presumably using a super resolution technique with tile-based alignment (described in the previous slide) to produce a blur and ghosting-free high resolution image.’
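To make that description a little more concrete, here is a minimal, hypothetical Swift sketch of the capture flow it outlines: a rolling buffer of short and secondary frames, one long exposure taken at shutter press, and a fusion step over all nine frames. The types and names (Frame, RollingFrameBuffer, DeepFusionStylePipeline) are our own illustration rather than Apple's implementation or API, and the tile-based alignment and merging is only indicated by a comment.

```swift
import Foundation

// A stand-in for a captured frame; a real pipeline would hold pixel buffers.
struct Frame {
    let exposure: TimeInterval   // shutter time used for this frame (illustrative)
    let capturedAt: Date
}

// Keeps only the most recent `capacity` frames, discarding the oldest.
struct RollingFrameBuffer {
    let capacity: Int
    private(set) var frames: [Frame] = []

    init(capacity: Int) {
        self.capacity = capacity
    }

    mutating func add(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity {
            frames.removeFirst(frames.count - capacity)  // throw away older frames
        }
    }
}

final class DeepFusionStylePipeline {
    // Four short and four secondary frames are buffered continuously.
    private var shortFrames = RollingFrameBuffer(capacity: 4)
    private var secondaryFrames = RollingFrameBuffer(capacity: 4)

    // Called continuously while the camera preview is running.
    func buffer(short: Frame, secondary: Frame) {
        shortFrames.add(short)
        secondaryFrames.add(secondary)
    }

    // Called when the shutter is pressed: take one long exposure,
    // then combine it with the eight buffered frames.
    func captureAndFuse() -> [Frame] {
        let longExposure = Frame(exposure: 1.0 / 4.0, capturedAt: Date())  // illustrative value
        let allNine = shortFrames.frames + secondaryFrames.frames + [longExposure]
        // A real implementation would align tiles across the frames and merge
        // them (super-resolution style) into a single output image; here we
        // simply return the nine-frame stack.
        return allNine
    }
}
```

Because the short and secondary frames are buffered continuously and older frames are discarded, the fused result can include frames captured before the shutter was actually pressed, which matches the ‘constantly buffered in memory’ wording in the explainer above.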

Resolution does not appear to be increased to 24MP after all, contrary to that part of our initial explainer, but there is some detail to be gained in certain situations. Although the tests are far from conclusive, we’ve rounded up a few sample images and comparisons shared by Twitter users from around the world. Judging by the commentary from those who have tested the feature, and by a brief analysis with our own eyes, Deep Fusion appears to work as advertised, bringing out more detail and clarity in images.

Here is a look at the difference #DeepFusion makes in lowish light. pic.twitter.com/V6Cxys2c1k

— October 3, 2019

Interesting, you can see Apple’s Deep fusion technology being applied in real time to an image. The difference is pretty massive at 100% crop pic.twitter.com/ZSMb3Mywdl

— Benedict Bandersnatch (@_oscg) October 4, 2019

Very first tests of #DeepFusion on the #iPhone11 pic.twitter.com/TbdhvgJFB2

— Tyler Stalman (@stalman) October 2, 2019

In addition to the comparison above, photographer Tyler Stalman also showed how Deep Fusion compares to the Smart HDR feature.

By popular demand, here is Smart HDR vs #DeepFusion. Both shot on the #iPhone11Pro pic.twitter.com/AQqrw97VhF

— Tyler Stalman (@stalman) October 2, 2019

As noted by Halide co-founder Sebastiaan de With, it seems as though the image files captured with Deep Fusion are roughly twice the size of a standard photo.

In my first tests, Deep Fusion offers fairly modest gains in sharpness (and much larger files — my HEICs came out ~2x bigger). pic.twitter.com/ISclMKT1hK

— Sebastiaan de With (@sdw) October 2, 2019

Much remains to be seen about what Deep Fusion is actually capable of and how third-party developers can make the most of the technology, but it looks promising. There has also been some confusion over whether Deep Fusion works alongside Night Mode, but according to Apple guru John Gruber, the two are mutually exclusive: Deep Fusion is applied to scenes between 10 and 600 lux, while Night Mode kicks in at 10 lux or fewer.

What I’ve been told is that Deep Fusion applies between 600-10 lux. Night Mode is for 10 or fewer lux. They are mutually exclusive.

— John Gruber (@gruber) October 3, 2019
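The thresholds Gruber describes amount to a simple, mutually exclusive selection based on scene brightness. Below is a minimal illustrative sketch of that logic in Swift; the enum and function names are hypothetical (not an Apple API), and the assumption that brighter scenes fall back to standard Smart HDR processing is ours, not something stated in the tweet.

```swift
// Mutually exclusive low-light pipelines, using the lux figures from Gruber's tweet.
// Hypothetical names for illustration only; not an Apple API.
enum LowLightPipeline {
    case nightMode    // 10 lux or fewer
    case deepFusion   // roughly 10-600 lux
    case standard     // brighter scenes (assumed Smart HDR territory)
}

func pipeline(forSceneLux lux: Double) -> LowLightPipeline {
    if lux <= 10 { return .nightMode }    // Night Mode: 10 or fewer lux
    if lux < 600 { return .deepFusion }   // Deep Fusion: roughly 10-600 lux
    return .standard                      // brighter than that: neither applies
}
```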

We’ll know more for sure when we have a chance to test the new feature ourselves.
