AirPods, Siri, and the voice-first interface

Ben Bajarin, Tech.pinions:

Apple’s AirPods are just wireless headphones about as much as the Apple Watch is “just” a watch and iPhone is “just” a phone. Nothing makes this more apparent than the Siri experience.

And:

It is remarkable how much better Apple’s Siri experience is with AirPods, in part because the microphones are much closer to your mouth and, therefore, Siri can more clearly hear and understand you. I’m not sure how many people realize how many Siri failures have to do with the distance you are from your iPhone or iPad, as well as ambient background noise and the device’s ability to clearly hear you.

And:

Thanks to the beamforming mics and some bone conduction technology, Siri with the AirPods is about as accurate a Siri experience as I’ve had. In fact, in the five days I’ve been using the AirPods extensively, I have yet to have Siri not understand my request.

And:

You very quickly realize, the more you use Siri with the AirPods, how much the experience today assumes you have a screen in front of you. For example, if I use the AirPods to activate Siri and say, “What’s the latest news?” Siri will fetch the news then say, “Here is some news — take a look.” The experience assumes I want to use my screen (or it at least assumes I have a screen near me to look at) to read the news. Whereas, the Amazon Echo and Google Home just start reading the latest news headlines and tidbits.

These are just a few nuggets from a much longer piece. One core question emerges: should we design for the screen at all? Perhaps we should treat the screen as an option instead, and let the user choose, say with a gesture that means, “I’ve got no screen, pipe all the info into my ears.”
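To make the idea concrete, here’s a minimal sketch of what that branching might look like. This is not Apple’s actual SiriKit API; the names `OutputContext`, `NewsItem`, and `respond` are hypothetical, and the point is only that the same request yields a screen handoff in one context and spoken headlines in the other:

```swift
// Hypothetical sketch of a screen-aware response pipeline.
// None of these types are real Apple APIs.

enum OutputContext {
    case screenAvailable   // e.g. iPhone unlocked and in hand
    case audioOnly         // e.g. AirPods with the phone in a pocket
}

struct NewsItem {
    let headline: String
    let summary: String
}

func respond(to items: [NewsItem], context: OutputContext) -> String {
    switch context {
    case .screenAvailable:
        // Current Siri behavior, per Bajarin: defer to the display.
        return "Here is some news — take a look."
    case .audioOnly:
        // Echo/Home-style behavior: read the headlines aloud.
        return items
            .map { "\($0.headline). \($0.summary)" }
            .joined(separator: " Next: ")
    }
}

// An audio-only request reads headlines instead of pointing
// at a screen the listener may not have in front of them.
let items = [NewsItem(headline: "Example headline", summary: "An example summary.")]
print(respond(to: items, context: .audioOnly))
```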

Good stuff from Ben Bajarin.