Those fancy new iPhone 11 camera features — who did it first?
The new iPhone 11 series brings a slew of new camera features, including more cameras, Night mode and Deep Fusion. While we can’t comment on the latter, throwing in more cameras and Night mode is nothing new in the Android space. So let’s go through who did it first.
World’s First Dual Camera Smartphone?
Yeah, I know. Apple started using dual cameras back with the iPhone 7 Plus. But the very first dual camera smartphones cropped up way back in 2011 when the HTC EVO 3D and LG Optimus 3D were launched with dual cameras. But the focus (pun not intended) for dual cameras back then was quite different. Instead of offering a different focal length, the duo offered the capability to capture 3D photos and videos. It was somewhat of a thing back then.
If we are talking about the first smartphones that had two usable cameras, it would probably be the LG G5 or HUAWEI P9 in 2016. The LG G5 is actually the closest to what the iPhone 11 offers, with the secondary 8MP camera sporting an ultra-wide angle lens. HUAWEI’s combination of an RGB and a monochrome sensor for better image quality was quite novel at that time, and HUAWEI continued to use the technology until the HUAWEI P20 Pro.
World’s First Triple Camera Smartphone?
Coincidentally, the first smartphone to appear with a triple-camera setup is the HUAWEI P20 Pro, launched in 2018. It consisted of a 40MP f/1.8 sensor, a 20MP monochrome sensor and an 8MP f/2.4 telephoto camera. This is also the last flagship from HUAWEI to tout a monochrome sensor, as HUAWEI adopted ultra-wide angle cameras starting from the HUAWEI Mate 20 series onwards. It hogged the top rung on the DxOMark chart for quite a while, even tying the HUAWEI Mate 20 Pro and Galaxy S10+, the latter a device which was launched a year after the HUAWEI P20 Pro.
We can see that Apple actually hopped onto the triple camera bandwagon quite a bit faster than they did with dual camera configurations. The iPhone 11 Pro and Pro Max are only two years late to the party, but then again it is in 2019 that we see Android smartphone makers really picking up on the triple camera trend, so I guess they are just fashionably late.
Low-light photography is a challenge for any kind of camera, be it a full-frame DSLR or a smartphone. It is especially so for the latter, given the tiny sensor sizes that we are forced to deal with. But smartphone makers have always been pushing the limits of what is possible despite the confines of the technology at the time.
ASUS, for one, boasted of impressive low-light photography in the first generation ZenFone 5 and ZenFone 6, launched way back in 2014. Yes, they launched the first trio with the very poorly planned numbering. They used pixel binning to improve low light performance, so the 13MP sensors gave you 3MP low light images, while the 8MP ones gave you just 2MP. Results weren’t exactly impressive, but it was a step in the right direction. They weren’t the first to do it though; Nokia did it way back with the Nokia 808 PureView, binning the whopping 38MP readout of the sensor into 8MP images.
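To illustrate the idea, here is a minimal sketch of 2×2 pixel binning in NumPy. Real sensors combine charge on-chip rather than averaging in software, and the helper name is mine, but the arithmetic is the same: a ~13MP frame binned 2×2 comes out around the ~3MP figure mentioned above.

```python
import numpy as np

def bin_pixels(image, factor=2):
    """Average each factor x factor block of pixels into one.
    Hypothetical helper: software stand-in for on-sensor binning."""
    h, w = image.shape[0] - image.shape[0] % factor, image.shape[1] - image.shape[1] % factor
    cropped = image[:h, :w].astype(np.float32)
    # Split height and width into (blocks, factor) and average within each block
    return cropped.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))

# A 4160x3120 (~13MP) frame binned 2x2 yields 2080x1560 (~3.2MP)
frame = np.random.randint(0, 256, (3120, 4160, 3), dtype=np.uint8)
small = bin_pixels(frame, factor=2)
print(small.shape)  # (1560, 2080, 3)
```

Each output pixel averages four sensor pixels, trading resolution for lower noise, which is exactly the bargain those early ZenFones made.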
The current trend of combining multiple short exposures to simulate a longer one, along with some AI stabilization, has been around for quite some time, arguably a trend that started with Google’s Night Sight on the Pixel 3 back in 2018. HUAWEI has since offered a similar implementation of AI Stabilization (AIS) in the Night Mode that debuted on the HUAWEI P20 and Super Night Mode on the HUAWEI P30 and P30 Pro, the latter enabling even longer exposures.
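The core trick behind these night modes can be sketched very simply: average N short, noisy frames and the random noise shrinks roughly with the square root of N. This toy version assumes perfectly aligned frames; the hard part in real implementations like Night Sight, alignment and motion rejection, is omitted here.

```python
import numpy as np

def stack_exposures(frames):
    """Naive night-mode sketch: average aligned short exposures to cut noise.
    Real pipelines also align frames and reject motion; skipped for brevity."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255)

# Simulate 9 noisy captures of a dim, flat scene
rng = np.random.default_rng(0)
scene = np.full((100, 100), 60.0)
frames = [np.clip(scene + rng.normal(0, 20, scene.shape), 0, 255) for _ in range(9)]
merged = stack_exposures(frames)
```

With nine frames, the residual noise is roughly a third of a single exposure's, which is why these modes ask you to hold still for a few seconds.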
I believe the first to tout AI photography was HUAWEI, back in late 2017. HUAWEI has been going around telling everybody and their moms about the Kirin 970’s AI chops, and the main use case for it was the camera. Notably, HUAWEI’s partnership with Leica was a bragging right of its own. There isn’t much information about Deep Fusion, but it seems to do a lot more than what HUAWEI is currently offering.
HUAWEI’s Master AI uses AI to identify the scene you are capturing and optimizes the settings for it, including switching to the right camera, but Apple will apparently be using the neural engine in the A13 Bionic to go through nine exposures of varying lengths and pick the best result pixel by pixel. It sounds pretty similar to what Google does on the Pixel smartphones, and those are already renowned for their photography prowess. Apple’s implementation isn’t available yet, so there aren’t any sample images lying around for you and me to scrutinize.
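The "pixel by pixel" selection idea can be illustrated with a toy fusion step. Everything here is an assumption for illustration only: Deep Fusion's actual scoring criteria are not public, so this just shows the mechanics of choosing, for every pixel, the value from whichever of N frames scores best on some quality map.

```python
import numpy as np

def fuse_best_pixels(frames, quality):
    """Toy per-pixel fusion: for each pixel, keep the value from the frame
    with the highest (hypothetical) quality score at that location."""
    idx = np.argmax(np.stack(quality), axis=0)          # winning frame index per pixel
    stacked = np.stack(frames)
    return np.take_along_axis(stacked, idx[None, ...], axis=0)[0]

# Three flat frames; frame 2 "wins" only at pixel (0, 0)
frames = [np.full((2, 2), v, dtype=np.float32) for v in (10, 20, 30)]
quality = [np.zeros((2, 2)) for _ in range(3)]
quality[2][0, 0] = 1.0
fused = fuse_best_pixels(frames, quality)  # 30 at (0, 0), 10 elsewhere
```

A real pipeline would blend rather than hard-select and would weigh detail, noise and motion, but the per-pixel decision is the part that sets this approach apart from simply picking one best exposure.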
With smartphones packing increasingly potent AI capabilities, this is definitely an area that should see some development. Smartphone sensors are limited by their size, and throwing more cameras at the problem can only get you so far. Computational photography looks set to be the next big thing to pursue, and it’s nice to see Apple put the neural engine in the new A13 Bionic to work on it.
Did we miss any world-first smartphone camera tech that debuted in Android smartphones before they did in Apple’s new iPhone 11 series? Let us know!