The iPhone X arriving in stores Friday isn’t just a new design. It’s a new relationship.
Compared to your current phone, the 10th-anniversary iPhone is missing a key element: the home button. The whole front is just screen. You have to learn new gestures to operate it. Instead of scanning a finger to unlock it, you stop and look at it for a second like you’re taking a selfie. This phone recognizes you.
Is the $1,000 iPhone X for you? It’s no slam-dunk. The zaniest new features, like the face unlock, mostly function as billed. Its slimmed-down big screen feels easier to hold than previous iPhones, and its battery lasts two blissful extra hours. But this year, Apple’s also selling the cheaper iPhone 8 and 7 with nearly as much horsepower and a design — complete with a trusty home button — you already know.
If you buy an X (pronounced “ten”) now, think of it as signing up for a blind date with your most important gadget. Navigating it can be just as confusing as figuring out when to hold hands. Your thumb is going to keep ending up in the wrong place.
I’m a columnist whose job is to live on the cutting edge, and even I’d describe my relationship (so far) with the iPhone X as “awkward.”
There’s a bigger idea behind Apple’s war on buttons. Aside from to-be-expected improvements in the camera and processor, the X moves the iPhone forward by removing parts that get in between you and the message you want to send Mom. In Silicon Valley, they call these barriers the “chrome”: menus and buttons that are the interface between you and information.
Apple has a long history of giving us new tech — and taking it away. I’m not just talking about the headphone jack Apple removed from the iPhone 7 (and is still missing from the iPhone 8 and X). People were skeptical of the iPhone in its early years because it axed their beloved BlackBerry keyboards. That turned out to be a worthwhile compromise.
The X tries to make the iPhone the world’s smartest screen. Not only can it recognize you by face, but with a phalanx of sensors buried in the notch at the top of its screen, it can even know if you’re smiling. It hears you when you call out (to Siri), and it tries to take you right to the information you seek.
That’s a future vision that makes sense, given that millions of people have already adopted talking speakers around the house. But after living with the iPhone X for a bit, I quickly ran into some of its present-day limits — and near-future challenges.
Full disclosure: I’m still getting used to the X. Tech companies usually provide reviewers like me with a week to live with a flagship phone before publishing assessments simultaneously. With the X, Apple provided us just 15 hours, if you include the time I should have been asleep. (I’ll continue to test and share my findings perhaps after a nap.) It’s been an intense first date.
The Face ID unlocking mechanism embodies the iPhone X’s ambitions and hurdles. Using it takes about a second: Tap the screen to wake it up, and then sensors embedded in the notch at the top of the screen shoot out beams (invisible to the eye) to map the contours of your face. If it recognizes you, a lock icon opens.
In my initial tests, it worked nine times out of 10. You really have to hold it in front of your face like you’re taking a selfie. Hold it too close, in particular, and it won’t work.
The existential question is: What makes it a worthwhile trade for a fingerprint reader? Face ID will work when your hands are wet or dirty, but it isn’t exactly faster. Rival Samsung gave its Galaxy S8 face and iris-reading capabilities, but it still threw in a fingerprint reader on the back.
And Apple made the annoying decision not to have the iPhone X open straight to the home screen when it spots your face. Instead, it unlocks to a screen full of notifications. To get home, you have to swipe up from the bottom edge of the phone with your thumb, and I keep swiping from the wrong spot. (It’s just one of the new finger gyrations you’ll have to master on the post-button iPhone.)
A more exciting — and potentially terrifying — rationale for Face ID is that the sensors behind it can also power all sorts of other uses. The phone comes with one pretty silly example: By tracking the location of your facial features, it turns your face into animated creatures dubbed Animoji. You can send messages as a talking poop cartoon, if that’s appealing.
It doesn’t take much of a leap to imagine how the technology used to anthropomorphize emoji might appeal to advertisers that want to know where (and whether) you’re looking at the screen during their messages. Facebook, Google or other companies that monetize our behaviors might want to know if we’re smiling or frowning. Apple’s terms for app developers require permission before tapping into the camera, and forbid using face data for advertising — but we’re just at the beginning of this technology.