Apple released the first beta of iOS 13.2 on October 3 (build 17B5059g), bringing Deep Fusion to the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max.
Unlike Night Mode, which also synthesizes a shot from multiple frames, Deep Fusion is invisible in the camera interface and has no settings of its own. It is already running in the background the moment the camera launches, and the final photo is generated on-device. Phil Schiller, Apple's senior vice president, described it on stage as mad science in computational photography. Deep Fusion was not ready when the iPhone 11 series launched, however, so it did not ship with the new iPhones.
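The core idea of multi-frame synthesis can be illustrated with a toy sketch. This is emphatically not Apple's Deep Fusion algorithm (which is proprietary and runs on the Neural Engine); it is only a minimal, hypothetical example of the general principle that fusing several noisy exposures of the same scene suppresses random sensor noise:

```python
def fuse_frames(frames):
    """Fuse a burst of same-size grayscale frames (lists of pixel rows)
    by per-pixel averaging. A real pipeline would also align the frames
    and weight pixels by sharpness; both are omitted here for clarity."""
    n = len(frames)
    height = len(frames[0])
    width = len(frames[0][0])
    return [
        [sum(f[y][x] for f in frames) / n for x in range(width)]
        for y in range(height)
    ]

# Three noisy "captures" of the same 1x3 scene whose true values
# are 100, 200, 50; averaging cancels out the per-frame noise.
burst = [
    [[100, 200, 50]],
    [[104, 196, 52]],
    [[96, 204, 48]],
]
print(fuse_frames(burst))  # -> [[100.0, 200.0, 50.0]]
```

Because each pixel's noise is roughly independent across frames, the average converges toward the true value, which is why burst-based features trade capture speed for image quality.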
Like Night Mode, Deep Fusion favors sustained capture over fast shooting: it must process multiple frames, which makes it unsuitable for some snapshot scenarios. In return, it lets the camera produce excellent shots indoors and under moderate lighting. Here are some comparisons netizens have shared over the past two days.
Photo from Tyler Stalman.
Photo from Sebastiaan de With.
Compare photos taken with Deep Fusion on the iPhone 11 against the same scenes shot on a Galaxy Note 10 Plus, and you can see more detail and texture when they are magnified:
Photo from Max Tech.
Apple likes to use sweaters to demonstrate Deep Fusion, and for good reason: in renderings of wool and fabric, iOS 13.2 is noticeably sharper.
Photos from Ben Geskin.
• To use Deep Fusion, remember to turn off the "Capture Outside the Frame" option in the Camera settings, as Deep Fusion does not run while it is enabled.
Deep Fusion needs time to process. It takes only about a second, but if you open the photo in the album immediately after shooting, you can notice the effect: the image appears blurry at first, and the sharp final version is displayed only after a short wait.
Deep Fusion photos also take up more storage space. The feature is still being tested in the iOS 13.2 beta, and it will likely be further optimized in the official release.