Why are Android phones better at almost everything, yet unable to compete with Apple's iPhone cameras?
There is an ever-escalating battle between Android and Apple for the top spot. When buying a phone, the camera is one of the features buyers notice most, and on image quality the lead swings between Android and Apple on a regular basis.
Camera quality varies with the price and the hardware used. You can buy a brand-new Android phone for under £50, or one for well over £1,000, such as the Samsung Galaxy S21 Ultra. The more money you invest, the better the image quality you get, and the same goes for the hardware inside the phone.
Since good hardware accounts for a big part of overall image quality, let's look at its direct impact on both Android and iPhone.
Apple started using a 1/2.55″ sensor with the iPhone XS in 2018; on Android, sensors that size were common by 2014. Before the XS, Apple's sensors were smaller, and thus less light-sensitive, than those in flagship Android phones.
The "portrait" camera, basically a normal lens in 35mm terms, first appeared on the iPhone 7 back in 2016. There were plenty of two-camera phones before 2016, but they generally used the second camera for image-quality enhancements (shooting B&W and colour together), depth mapping for bokeh effects, or an ultra-wide view.
However, since 2016, other companies have delivered actual telephoto cameras, not just a normal/portrait lens. These range from 3x to 5x relative to the main wide-angle lens.
The iPhone 11 brought Apple's first ultra-wide-angle camera. LG had introduced an ultra-wide front camera on the V10 in 2015, and an ultra-wide rear camera on the G5 in 2016.
Still missing from Apple's phones in 2019 was a time-of-flight sensor. Most companies were using two-lens parallax to build a depth map for bokeh. This worked well when Huawei and Leica first did it in 2016 with a matched pair of cameras (one B&W, one colour), but it is much less reliable between two different lenses. A time-of-flight camera instead measures, per pixel, the time a laser pulse takes to reflect off the subject, which makes it very good for portrait depth mapping.
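The idea behind time-of-flight depth mapping boils down to simple physics: light travels to the subject and back, so each pixel's depth is half the round-trip distance. A minimal sketch (the function name, frame layout, and timings are purely illustrative, not any vendor's API):

```python
# Illustrative sketch: converting per-pixel laser round-trip times
# into a depth map, as a time-of-flight sensor does in principle.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_map(round_trip_times_ns):
    """Convert per-pixel round-trip times (nanoseconds) to metres.

    The pulse travels out to the subject and back, so the one-way
    distance is half the round-trip distance.
    """
    return [
        [SPEED_OF_LIGHT * (t * 1e-9) / 2.0 for t in row]
        for row in round_trip_times_ns
    ]

# A toy 2x2 "frame": a ~6.67 ns round trip is roughly 1 metre away.
frame = [[6.67, 6.67], [13.34, 13.34]]
depths = depth_map(frame)
```

At phone-portrait distances the times involved are only a few nanoseconds, which is why this needs dedicated sensor hardware rather than software alone.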
Since 2018, many companies have offered phones with still larger sensors. Most of these use a quad-Bayer colour filter with 48, 64, or even 108 megapixels, enabling a variety of pixel-binning modes that let the user trade resolution for better sharpness and much improved low-light performance. The General Mobile GM20, with its AI scene detection, also gives superb camera results.
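Pixel binning on these quad-Bayer sensors amounts to averaging each 2x2 block of same-colour photosites into one output pixel, quartering the pixel count while collecting four times the light per output pixel. A toy single-channel sketch (real pipelines bin in the raw colour domain and handle the Bayer pattern, which this deliberately skips):

```python
# Minimal sketch of 2x2 pixel binning ("pixel chunking"): each 2x2
# block of a single-channel image is averaged into one output pixel,
# trading resolution for low-light signal. Illustrative only.
def bin_2x2(pixels):
    """Average each 2x2 block of an image given as lists of lists.

    Assumes even width and height for simplicity.
    """
    h, w = len(pixels), len(pixels[0])
    return [
        [
            (pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# For example, a 108 MP frame bins down to a 27 MP one this way.
raw = [[100, 104], [96, 100]]
binned = bin_2x2(raw)  # one pixel: (100 + 104 + 96 + 100) / 4 = 100.0
```

This is why a "108-megapixel" phone usually saves 27-megapixel photos by default: the binned mode is sharper and cleaner in anything but bright light.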
Android phones sometimes offer specialty cameras. CAT sells a phone with a built-in FLIR thermal camera. Moto offers an add-on for the Moto Z series that gives you a 360-degree camera, and another that gives you a 10x optical zoom. Several flagship phones this year have added a macro camera as well.
Apple has been very competitive with its software, and that is what earned it a top rank: it was regularly making the top ten in image quality with a far inferior image sensor. The fact is that pretty much every phone sensor chip is a compromise, and it's the software that delivers top performance, though the hardware helps too.
Apple recommended its two-shot HDR mode for a while, then made it the default in the 2019 camera app, which goes further still. While the app is open, it continuously takes photos and buffers the last eight, alternating between full exposure and underexposure. When you press the shutter, the phone takes one more shot in low light, and none in bright light. It then picks the best of the full-exposure frames for bright scenes, and may merge them all for low-light ones.
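The buffering scheme described above can be sketched as a small ring buffer. This is a hypothetical illustration of the idea, not Apple's actual pipeline; the class name, buffer size, and selection rules are assumptions based only on the description in this article:

```python
# Hypothetical sketch of zero-shutter-lag HDR buffering: the camera
# keeps the last 8 frames, alternating full and under-exposure, and
# on shutter press selects which frames to merge.
from collections import deque

class ShutterBuffer:
    def __init__(self, size=8):
        self.frames = deque(maxlen=size)  # ring buffer of recent frames
        self._under = False               # alternation flag

    def stream(self, frame):
        """Called continuously while the camera app is open."""
        exposure = "under" if self._under else "full"
        self._under = not self._under
        self.frames.append((exposure, frame))

    def capture(self, low_light):
        """On shutter press: choose frames to merge into the final shot."""
        if low_light:
            # Low light: every buffered frame may contribute.
            return list(self.frames)
        # Bright light: keep only the full-exposure frames and pick
        # among them (the underexposed ones feed highlight recovery).
        return [f for f in self.frames if f[0] == "full"]
```

Because the buffer is always full, the shot you get corresponds to the moment you pressed the shutter, not some fraction of a second later, which is the real point of the design.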
The advanced multi-shot night mode is another exciting feature. Apple implemented one in the iPhone 11, snapping nine shots and using AI for colour rendition, among other things. Google and Huawei shipped similar night modes first, in 2018, and Google published a post on its AI blog explaining the technique in detail.
Google also implemented a computational telephoto in 2018, using a technique called drizzle, borrowed from astrophotography, to boost resolution from multiple shots. Samsung and others adopted this in 2020, but there is nothing comparable from Apple yet.
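The core drizzle idea is that handshake shifts each frame by a sub-pixel amount, so several shifted frames can be dropped onto a finer grid and averaged to recover detail no single frame contains. A toy 1-D sketch, with assumed, perfectly-known shifts (real pipelines estimate alignment, weight overlapping drops, and work in 2-D):

```python
# Toy 1-D sketch of drizzle-style super-resolution: frames of the same
# scene, offset by sub-pixel shifts, are placed onto a grid twice as
# fine and averaged. Illustrative only; shifts are given, not estimated.
def drizzle_1d(frames, shifts, scale=2):
    """frames: equal-length 1-D signals; shifts: per-frame sub-pixel
    offsets measured in input pixels."""
    out_len = len(frames[0]) * scale
    total = [0.0] * out_len
    count = [0] * out_len
    for frame, shift in zip(frames, shifts):
        for i, value in enumerate(frame):
            j = round((i + shift) * scale)  # landing cell on the fine grid
            if 0 <= j < out_len:
                total[j] += value
                count[j] += 1
    return [t / c if c else 0.0 for t, c in zip(total, count)]

# Two frames offset by half a pixel interleave into double resolution.
hi_res = drizzle_1d([[10, 20], [15, 25]], shifts=[0.0, 0.5])  # [10, 15, 20, 25]
```

The striking consequence is that the "telephoto" detail comes from the photographer's own hand tremor: the jitter most cameras fight is exactly what supplies the sub-pixel samples.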
Google also offers what it calls a computational raw image: a single raw file assembled from the multiple frames used in any computational-photography mode, so nothing is lost. A JPEG or HEIC file, by contrast, throws away most of the captured information — information you may not notice missing, but that you will certainly need if you want to edit an image. iOS supports raw files, but not computational raw, and Apple doesn't support raw in its own camera app or on the ultra-wide camera.
Review sites apply a standard set of tests to camera phones to determine which cameras lead the market, and their results can help you choose a camera that suits your personal preferences.
In DxOMark's results, Apple's base iPhone 11 is a match for the Huawei P20 Pro, though Huawei has since released the P40 Pro. The iPhone 12 Pro Max and Samsung's Galaxy S10+ score similarly as well, while Samsung's 2021 flagships are already out but haven't yet been reviewed by DxOMark.
The Xiaomi Mi Pro uses the same basic 108-megapixel main sensor as the Samsung Galaxy S20 Ultra, but Samsung uses a different colour filter and, of course, different AI and image processing.
If you don't like DxOMark, feel free to find other review material you trust better. But don't just claim that Apple is "better" without facts. Apple does a good job, but every major smartphone company is working hard to build a better camera, and most of them are pushing the hardware technology much harder than Apple.