Most of the tech giants today have their own voice assistant, be it Apple’s Siri, Google Assistant, Microsoft’s Cortana, or Samsung’s Bixby, to name a few. But a study by Chinese researchers shows that these voice assistants share a vulnerability affecting the companies’ mobile and PC platforms.
The technique, known as DolphinAttack and developed by a team from Zhejiang University, translates typical voice commands into ultrasonic frequencies too high for the human ear to hear.
Yet these commands remain perfectly decipherable by the microphones and software powering voice assistants. The translation makes it possible to take control of a gadget with just a few words uttered at frequencies no human ear can pick up.
The researchers could tell an iPhone to call 1234567890, activate basic commands like “Hey Siri”, and force a MacBook to open a malicious website. The team, whose paper was accepted by the ACM Conference on Computer and Communications Security, writes:
“Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user.”
A smartphone with about $3 of additional hardware, including a tiny speaker and an amplifier, was used to attack each voice assistant, and the method can be duplicated by anyone with a bit of technical knowledge.
Hacking an iPhone becomes trivial: an attacker only needs to walk past you in a crowd, play commands at frequencies the human ear cannot hear, and watch Safari or Chrome load a site that installs malware, leaving the contents of your phone easy to explore.
The researchers say the exploit stems from a combination of hardware and software problems: the microphones and software powering voice assistants like Siri, Cortana, and Alexa pick up inaudible frequencies above the 20 kHz limit of the human ear.
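The core trick of shifting a voice command above the audible band can be illustrated with a minimal sketch in Python using NumPy. This is not the researchers’ actual tooling; the sample rate, 30 kHz carrier, and the 400 Hz tone standing in for a recorded command are all illustrative assumptions. It amplitude-modulates a baseband signal onto an ultrasonic carrier, so the transmitted energy sits above 20 kHz:

```python
import numpy as np

FS = 192_000          # sample rate high enough to represent a ~30 kHz carrier
CARRIER_HZ = 30_000   # ultrasonic carrier, above the ~20 kHz limit of human hearing

def modulate_ultrasonic(voice: np.ndarray, fs: int = FS, carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: the voice spectrum is shifted up to sit around the carrier.
    return (1.0 + voice) * carrier

# Toy "voice": a 400 Hz tone standing in for a recorded command.
t = np.arange(int(0.01 * FS)) / FS
voice = 0.5 * np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(voice)

# The modulated signal's dominant frequency is near 30 kHz: inaudible to humans,
# yet a nonlinear microphone can recover the baseband command from it.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak = freqs[np.argmax(spectrum)]
```

A playback speaker and amplifier capable of emitting these frequencies, like the $3 rig described above, is all that is needed to deliver such a signal.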
For now, the easiest fix for most DolphinAttack vulnerabilities is to turn off the always-on setting of Siri, Google Assistant, or Cortana on your phone or tablet, so a hacker won’t be able to talk to your device at all.