Hacking Siri

FastCoDesign:

Chinese researchers have discovered a terrifying vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung, and Huawei. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon’s Alexa assistant.

Using a technique called the DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear.
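The technique described above is classic amplitude modulation: the voice command becomes the envelope of an ultrasonic carrier, and a nonlinearity in the microphone hardware demodulates it back into the audible band. A rough sketch of the modulation step (the sample rate, carrier frequency, and stand-in "voice" tone are my assumptions, not values from the paper):

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent ultrasound (assumption)
CARRIER_HZ = 25_000  # inaudible carrier, above the ~20 kHz limit of human hearing
DURATION = 0.5       # seconds

t = np.arange(int(FS * DURATION)) / FS

# Stand-in for a recorded voice command: a single 300 Hz tone.
voice = np.sin(2 * np.pi * 300 * t)

# Amplitude-modulate the "voice" onto the ultrasonic carrier. All of the
# resulting energy sits around 25 kHz, so a human hears nothing, but a
# nonlinear microphone front end can demodulate the envelope back into
# the audible band -- the effect DolphinAttack exploits.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (1 + 0.8 * voice) * carrier
```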

First things first, this is not terrifying. But it is interesting.

You can watch a demo in the video embedded below. Not sure there’s a software fix to prevent this. Seems to me the audio-in processor would need access to the frequency content of the incoming audio, so it could filter out anything outside a specified audible range.
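That filtering idea would amount to a low-pass filter ahead of the voice recognizer. A minimal sketch of what that could look like, assuming a 192 kHz sample rate and a naive brick-wall FFT filter (real implementations would use a proper analog or FIR filter):

```python
import numpy as np

FS = 192_000        # sample rate of the incoming audio (assumption)
CUTOFF_HZ = 20_000  # rough upper bound of human hearing

def suppress_ultrasound(samples: np.ndarray) -> np.ndarray:
    """Zero out spectral content above the audible range (brick-wall low-pass)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), 1 / FS)
    spectrum[freqs > CUTOFF_HZ] = 0
    return np.fft.irfft(spectrum, n=len(samples))

# An audible 1 kHz tone plus an ultrasonic 30 kHz tone: after filtering,
# only the audible component should remain.
t = np.arange(FS) / FS
mixed = np.sin(2 * np.pi * 1_000 * t) + np.sin(2 * np.pi * 30_000 * t)
filtered = suppress_ultrasound(mixed)
```

The catch, as noted above, is that this only works if the audio pipeline ever sees the ultrasonic content in the first place; if the demodulation happens in the microphone hardware itself, the software downstream receives what looks like ordinary audible speech.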

Not sure defending against this threat, which seems relatively minor, is worth the effort.

Also, DolphinAttack, cool name.