For some reason, people like to take sides on things that really don’t need to have sides at all. You’re a fan of iPhones? Cool. You’re more comfortable with Android phones? Use them in good health.
But we live in a competitive society, and so the one company that produces iOS-based phones and the myriad brands that make Android-based phones always feel that they need to explain why their product is more spectacular, more flexible, more secure, more fun, and more whatever than the ones powered by the other OS. As a result, whenever a company introduces a feature that’s new to its OS, it proclaims it as innovative, wonderful, and never seen before. Anyone who has attended or watched a product introduction — from Apple, Google, or Samsung — knows what I’m talking about.
Apple is arguably the biggest offender here, with a history of taking its time to develop features that other companies jumped on early, effectively serving as its beta testers. You can almost set your watch to the Twitter takes and memes about how iPhone users are always late to the party, celebrating ideas they might think are wholly new if they take Apple at its word.
And here we are again. Apple is introducing its latest phone line, the iPhone 14, and the latest version of its operating system, iOS 16. There are a lot of improvements and new features that will be either useful, fun, or both. And in fact, many of these features are coming to older iPhone models, while some are limited to the new iPhone 14 hardware.
But while Apple touts all of them as all new and all great, some of them are — either completely or in some fashion — already familiar to Android users. Here’s a list of at least some of the features that Apple is now offering but that Android has had for a while.
Multiple stops in maps
You are on your way home after visiting your Aunt Bea, and you suddenly realize that if you take a short side trip, you can stop in at one of your favorite bookstores. Don’t want to get lost? Now, in iOS 16, you can quickly add the address of the bookstore to your trip agenda and get directions that will let you stop there and then find your way back home.
It’s a very handy feature that Android has had since around 2017. Apple Maps has certainly come a long way since its disastrous launch 10 years ago, but its development has been a road full of potholes like this one, each finally getting its long-overdue filling in a later update.
iOS 16 now lets you add stops along your route.
Android has let you add stops for a while.
Email: schedule, undo, remind later, and follow-up
In iOS 16, if you hit “Send” on an email and suddenly realize you put the wrong person’s name on it, you now have 10 seconds to change your mind and undo the send. You can also schedule an email to be sent anytime you like or use Remind Later to remind yourself of an email you don’t want to deal with immediately.
The Undo Send option at the bottom of the screen lets you change your mind.
The Mail app can remind you to read an email.
You can now schedule an email to go out when you want it to.
Gmail has had an unsend feature since about 2018, but you get a choice of 5, 10, 20, or 30 seconds to change your mind (you can set the time in Gmail’s web app). You can also Snooze an email so it will pop up later and schedule an email to be sent when you want it.
You can stop an email before it is sent by hitting Undo.
If you don’t want to deal with an email immediately, you can Snooze it.
You can schedule an email so it is sent later.
Apple Live Captions
iOS 16 has added Live Captions, which offer real-time transcription for videos, audio, and conversation. This is an extremely useful feature, not only for people with hearing disabilities but for anyone who needs to track a conversation.
In fact, Android has had a Live Caption function since 2019 and currently provides immediate translations for those captions in several languages (although the accuracy of those translations will probably not live up to those of a human translator). But hey, even if Apple is once again late on this, it’s certainly a win for accessibility — and for watching videos with your phone muted when you’re too lazy to pick up your earbuds.
Haptic feedback on keyboard
Typing on a phone still mostly sucks in 2022, and part of that is from the lack of feedback you get while tapping. Not every input device needs to have the feedback of a mechanical keyboard, but it’s nice to know when you’ve actually typed a letter on an onscreen board. As a result, iOS 16 has now introduced haptic feedback on its onscreen keyboard. Android has had it for pretty much as long as we can remember.
The main difference here is that you have to enable haptic feedback in iOS 16; in Android, it is automatically enabled on most phones (but you can disable it if you want to). Let’s chalk this one up to the “How has it taken this long?” tally of features, and we’ll be sure to sing from the rafters that iPhone users now thankfully have this basic functionality.
iOS 16 now offers haptic keyboard feedback.
Android’s Gboard keyboard uses haptics by default.
Shared photo libraries
iOS 16 will soon let you create shared photo libraries — called the iCloud Shared Photo Library — based either on a date or on who is in the photos. You can share your photo library with up to five people. (Apparently, this feature won’t immediately ship when iOS 16 does, so you may need to wait a bit.) Google Photos lets you share your entire library — based on a starting date or on who is in the photos — with a single partner.
Always-on display
Okay, this one’s a biggie. One of the many features introduced at the recent Apple event is an always-on display, which will only be available on the iPhone 14 Pro and Pro Max. Always-on displays let you glance at the time and widgets and get other helpful info even when the phone is asleep. It’s something Apple Watch users have had access to since the Series 5, but Apple is only putting it in the iPhone now — and gating it to the fancier Pro models.
This is something that Android phones have had for the better part of 10 years. It’s a feature that Google really put in the forefront with Android 12, where it made the always-on display show a huge digital clock by default when the phone’s screen sits at rest. Meanwhile, all this time, every iPhone has just been a lifeless black rectangle until it’s touched or when a notification comes in.
That said, Apple’s new always-on display on the iPhone is more customizable and info-dense than what you see on Android phones, with widgets, images, and lots of color. And sure, an always-on display is going to drain a little more battery than keeping the screen entirely off, but on most phones, it’s a trivial amount. We’re happy that we’ll soon see a few fewer black mirrors sitting on desks and tables, devoid of life and character.
The always-on display of the iPhone 14 Pro.
Photo by Allison Johnson / The Verge
Phone fitness app
The Apple Watch is a popular way to measure your fitness, but if you were an iPhone user who didn’t have one, you couldn’t use Apple’s official Fitness app (although you could, of course, use any of the third-party apps out there). Now you can use the Fitness app whether or not you own an Apple Watch.
Google’s official Fit fitness app has pretty much always been available for Android phones, whether you use a watch or not. (Of course, you could argue that there are few Android-compatible watches that are worth worrying about unless you’re a Samsung enthusiast…) It comes bundled with Pixels, while Samsung includes its own Health app. And while a phone will not track your heart rate or temperature without the assistance of a wearable, it’s still good for anyone to be able to get a basic estimation of steps walked, calories burned, etc.
Apple’s Fitness app can now be used without a Watch.
Android’s Fit app has never required an associated device.
Lock screen widgets
This is actually a rather weird one. iOS 16 now allows you to add up to four widgets to your lock screen (provided the app developer offers one).
Android 4.2 offered lock screen widgets about 10 years ago but, for whatever reason, decided to take them away again in Android 5.0. So whether we should count this or not is up to you.
Perhaps Apple’s mostly clean track record of sitting on ideas until they are fully baked (Siri and the original HomePod notwithstanding) will pan out here, and lock screen widgets will become a mainstay of most iPhone users’ habits. Or maybe it will just be another odd quirk used and beloved only by some of us weirdos — you know, like regular widgets in the first place.
Car crash detection
Apple’s presentation on September 7th sometimes felt like a lesson in “Why you should be scared,” with its seeming emphasis on features that would summon help if you were lost in the wilderness, tell you if you were having a heart attack, or call emergency services if you were in a major car crash. This last feature — car crash detection — is now available in the new iPhone 14 phones as well as in the upcoming Apple Watch Series 8.
Pixel phones have car crash detection as well; Google added it to the built-in Personal Safety app back in 2019.
High-megapixel camera sensor with pixel binning
A better camera is the reason many people choose to upgrade to a newer phone model, and for years, many manufacturers were dead-set on using 12-megapixel sensors while performing as many software tricks as computational photography could allow. Major players like Google and Samsung have recently started upping the resolution of their main camera sensors to around 40 or 50 megapixels — not because we all need massive image files, but because capturing all that information and sizing it down to a “normal” image around something like 12 megapixels helps with artifacts like low-light noise. It’s part of the latest software tactics being utilized to make our tiny smartphone sensors over-perform and rival what some dedicated cameras can do (in the right scenario).
This technique is called pixel binning: taking nearby pixels on a high-resolution sensor and combining them to improve image quality at a lower resolution. It’s a genuinely beneficial feature if you don’t need the native resolution of the high-megapixel sensor, though it’s far from new tech. Even the Nokia 808 PureView did this to some degree back in 2012. It’s also not a magic trick that fixes everything — you’re still reliant on the quality of the image sensor and the processing pipeline.
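To make the idea concrete, here’s a minimal NumPy sketch of 2x2 binning. The array and function are purely illustrative (a real camera pipeline bins raw Bayer data on the sensor or ISP, not a plain grid like this), but the core operation is the same: each 2x2 block of pixels collapses into one, quartering the resolution while averaging out per-pixel noise.

```python
import numpy as np

# Illustrative stand-in for raw sensor data: an 8x8 grid of pixel readings.
# (A 48-megapixel sensor works the same way, just at vastly larger scale.)
sensor = np.arange(64, dtype=float).reshape(8, 8)

def bin_pixels(raw, factor=2):
    """Combine each factor x factor block of pixels into one by averaging,
    trading resolution for a cleaner, less noisy signal."""
    h, w = raw.shape
    return raw.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

binned = bin_pixels(sensor)  # 8x8 grid becomes 4x4: quarter the pixel count
```

Averaging four noisy readings of the same scene region roughly halves the random noise in the result, which is why binned 12-megapixel output from a 48-megapixel sensor can look cleaner in low light than a native 12-megapixel capture.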
The iPhone 14 Pro’s main camera definitely shows promise, even if Apple is a little late to the binning party.
Photo by Allison Johnson / The Verge
Now Apple has this feature in the 48-megapixel main camera of its new iPhone 14 Pro and Pro Max — confidently throwing around terms like quad-pixel and Photonic Engine. It sounds exciting because Apple is very good at making things sound exciting, and it may very well be a significant generational improvement. But we should keep in mind that, once again, others were here first: Huawei, Samsung, and many others have dabbled with high-resolution sensors for years. We’ll see whether Apple’s knack for slow-burn development really sets it apart from the pack.