Three keyboard changes Apple should make—to iOS

Dan Moren, Macworld:

Apple popularized the onscreen keyboard with the launch of the first iPhone, deciding to eschew the hardware keyboards that were de rigueur on smartphones at the time.

And:

That was great in 2007. But 12 years later, we’ve all largely adapted to touchscreen keyboards, and some of those smart technologies are starting to look and feel, well, not so smart. It’s time for an A-to-Z overhaul of text entry on iOS.

Dan digs into several virtual keyboard areas where there’s room for improvement. Take the time to read his article; it’s not a long read.

I do think there’s plenty of room for improvement here. To my mind, one core problem is the blind dependence on machine learning to drive predictive text suggestions.

Here’s an example:

I brought up Twitter and started composing a new tweet. I typed the letters “proc”, and the keyboard offered “process” as the center prediction. Perfect, exactly what I was going for.

But then I typed “e”, the next letter in “process”, and the center suggestion changed to a different word. This happens a lot.

But the point is, when I’m typing and I see the word I want in a specific box, if I type one more letter of that word and then reach to tap it, the suggestion should not move. I frequently find myself typing, reaching to tap the word I’m typing, and by the time my finger gets to the box, it has changed to another word.

If my eye is on the target, and I continue typing the target word as already identified, the target word should stay the same, at least until I type a letter that no longer matches it.
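To spell out the logic I have in mind, here’s a rough sketch in Swift. It’s purely illustrative, and the function and parameter names are made up, not Apple’s actual keyboard API: the displayed suggestion only gets replaced once the typed prefix no longer matches it.

```swift
// Illustrative sketch of the "keep the suggestion stable" rule described above.
// Names like nextCenterSuggestion and rankedCandidates are hypothetical.
func nextCenterSuggestion(typed: String,
                          currentSuggestion: String?,
                          rankedCandidates: [String]) -> String? {
    // If what I've typed so far is still a prefix of the suggestion already
    // on screen, keep that suggestion right where it is.
    if let current = currentSuggestion,
       current.lowercased().hasPrefix(typed.lowercased()) {
        return current
    }
    // Only once the prefix stops matching, fall back to whatever the model
    // now ranks highest among candidates that do match what I've typed.
    return rankedCandidates.first { $0.lowercased().hasPrefix(typed.lowercased()) }
}
```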

My sense is that machine learning takes priority over straight logic here.

I’d love to speak with someone on the keyboard team about this and other related issues. I wonder if the team simply isn’t seeing these sorts of issues themselves.