The difference between Siri and Alexa

Stephen Nellis, Reuters:

Currently, Apple’s Siri works with only six types of app: ride-hailing and sharing; messaging and calling; photo search; payments; fitness; and auto infotainment systems. At the company’s annual developer conference next week, it is expected to add to those categories.

And:

But even if Siri doubles its areas of expertise, it will be a far cry from the 12,000 or so tasks that Amazon.com’s Alexa can handle.

The difference illustrates a strategic divide between the two tech rivals. Apple is betting that customers will not use voice commands without an experience similar to speaking with a human, and so it is limiting what Siri can do in order to make sure it works well.

And:

Now, an iPhone user can say, “Hey Siri, I’d like a ride to the airport” or “Hey Siri, order me a car,” and Siri will open the Uber or Lyft ride service app and start booking a trip.

Apart from some basic home and music functions, Alexa needs more specific directions, using a limited set of commands such as “ask” or “tell.” For example, “Alexa, ask Uber for a ride,” will start the process of summoning a car, but “Alexa, order me an Uber” will not, because Alexa does not make the connection that it should open the Uber “skill.”
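To make that distinction concrete, here's a toy sketch of the two parsing strategies. This is entirely hypothetical, not either company's real pipeline (which would use trained statistical models, not keyword lists): an Alexa-style matcher, where a skill is reachable only through a fixed invocation frame like "ask <skill> ...", versus a Siri-style matcher, where many natural phrasings resolve to the same intent.

    import re

    # Alexa-style: a skill is reachable only via a fixed invocation frame.
    # (Hypothetical simplification of the "ask"/"tell" pattern described above.)
    INVOCATION = re.compile(r"^(ask|tell) (?P<skill>\w+)\b(?P<rest>.*)$", re.IGNORECASE)

    def alexa_style(utterance):
        m = INVOCATION.match(utterance)
        if m:
            return (m.group("skill").lower(), m.group("rest").strip())
        return None  # "order me an Uber" falls through: no frame, no skill

    # Siri-style: many free-form phrasings resolve to one intent.
    # (Also hypothetical; the phrase list stands in for a trained NLP model.)
    RIDE_PHRASES = ("i'd like a ride", "order me a car", "get me an uber")

    def siri_style(utterance):
        text = utterance.lower()
        if any(phrase in text for phrase in RIDE_PHRASES):
            return ("book_ride", text)
        return None

    print(alexa_style("ask Uber for a ride"))  # ('uber', 'for a ride')
    print(alexa_style("order me an Uber"))     # None
    print(siri_style("Hey Siri, I'd like a ride to the airport"))  # ('book_ride', ...)

The second approach is harder to build but, per the article, it's the one that doesn't require users to learn a command syntax.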

Apple is investing in foundational natural language processing (NLP) expertise, building an experience that can scale to new domains without asking users to memorize per-skill commands.

My 2 cents: this is the better long play. In the long run, interacting with Siri should feel much more like talking to a person, slang and idiom included, which means a far smaller learning curve, even if the set of domains it handles stays limited.

It’d be interesting to see a set of benchmarks develop for testing Siri and Alexa: a standardized set of utterances, each with an expected outcome, to gauge each assistant’s progress over time.
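Something like the harness below, say. This is a made-up sketch, not an existing benchmark: a fixed list of utterances, each paired with the intent a human listener would expect, scored against whatever an assistant (or a recorded transcript of one) actually did.

    # Hypothetical benchmark: standardized utterances paired with expected intents.
    BENCHMARK = [
        ("I'd like a ride to the airport", "book_ride"),
        ("Order me a car", "book_ride"),
        ("Tell Lisa I'm running late", "send_message"),
        ("Show me photos from last weekend", "search_photos"),
    ]

    def score(assistant, cases=BENCHMARK):
        """Fraction of utterances mapped to the expected intent.

        `assistant` is any callable from utterance -> intent name (or None);
        feed it logged responses from a real device to score a real assistant.
        """
        hits = sum(1 for utterance, want in cases if assistant(utterance) == want)
        return hits / len(cases)

    # A deliberately rigid stand-in "assistant" for illustration:
    rigid = lambda u: "book_ride" if u.lower().startswith("order me") else None
    print(f"rigid assistant: {score(rigid):.0%}")  # 25%

Run the same list against both assistants every few months and you'd have a rough but repeatable measure of who's actually closing the gap.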