Alexa, Google Assistant and Siri can be fooled by 'silent' commands
11 May 2018, 07:08 | Casey Mitchell
In the future, artificial intelligence may reach into every aspect of human life
By "exploiting the gap between human and machine speech recognition", researchers have been able to make these systems dial phone numbers, open websites, and more.
According to a report in The New York Times, researchers in China and the U.S. have been testing hidden commands, undetectable to the human ear, that can be sent to Alexa, Google Assistant, and Siri.
A series of studies has shown that it's possible to issue silent commands to voice assistants like Amazon Alexa and Google Assistant without their owners ever knowing.
This month, some of those researchers, based at the University of California, Berkeley, published a paper that went further, saying they could embed commands directly into recordings of music or spoken text.
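The paper's actual method targets deep speech-recognition models, which isn't reproduced here; but the core idea of an adversarial audio attack, nudging an input just enough to change what a model hears while keeping the change imperceptible, can be sketched with a toy gradient-sign perturbation against a made-up linear scorer. Every name and number below is illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a linear "command scorer" and an audio feature vector.
# A real attack would target a deep speech-recognition model instead.
w = rng.normal(size=16)   # hypothetical model weights
x = rng.normal(size=16)   # hypothetical audio features

# FGSM-style step: move each feature slightly in the direction
# that raises the model's score for the attacker's target command.
eps = 0.1
x_adv = x + eps * np.sign(w)

print(w @ x_adv > w @ x)                          # score increased
print(np.max(np.abs(x_adv - x)) <= eps + 1e-9)    # per-feature change stays tiny
```

The point of the sketch is the trade-off the researchers exploit: the score the model assigns to the attacker's command goes up, while no individual sample of the input moves by more than a tiny amount, which is why a human listener notices nothing.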
Worryingly, the researchers say bad actors could use messages hidden within music to unlock doors, access accounts or add items to shopping lists. "My assumption is that the malicious people already employ people to do what I do," one of them said.
The proliferation of voice-activated gadgets amplifies the implications of such tricks.
The findings bring to light a variety of security concerns, revealing just how vulnerable voice assistants could be. Researchers estimate that more than half of American homes will have at least one smart speaker by 2022, so it's not hard to imagine this sort of attack becoming a widespread problem if the vulnerability isn't addressed.
According to the coverage in the Times, both Amazon and Google have taken measures to protect their smart speakers and voice assistants against such manipulation. Google said security is a continuing focus and that its Assistant has features to mitigate undetectable audio commands. Apple pointed out that the HomePod can't do things like open doors, and that iPhones have to be unlocked before Siri will execute certain commands.
We need hardware makers and AI developers to tackle such subliminal messages, particularly for devices that don't have screens to give users visual feedback and warnings about having received secret commands. For its part, the Federal Communications Commission (FCC) has discouraged the practice, calling it "counter to the public interest".
Similar techniques have been demonstrated using ultrasonic frequencies.
That warning was borne out in April, when researchers at the University of Illinois at Urbana-Champaign demonstrated ultrasound attacks from 25 feet away. An attacker still needs a direct line to the device, as the commands can't penetrate walls. Researchers were also able to hide the command "OK Google, browse to evil.com" in a recording of the spoken sentence "Without the dataset, the article is useless." Humans cannot detect the hidden command.
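The ultrasonic trick works by shifting an audible command above the range of human hearing. A minimal sketch of that idea, with a pure 400 Hz tone standing in for a recorded voice command and the sample rate, carrier frequency and modulation depth all chosen for illustration rather than taken from the research, might look like this:

```python
import numpy as np

fs = 96_000                     # sample rate high enough to represent ultrasound
t = np.arange(0, 1.0, 1 / fs)   # one second of samples

# A 400 Hz tone stands in for a recorded voice command (illustrative only).
command = np.sin(2 * np.pi * 400 * t)

# Amplitude-modulate the command onto a 25 kHz carrier, above human hearing.
carrier = np.sin(2 * np.pi * 25_000 * t)
modulated = (1 + 0.8 * command) * carrier

# The signal's energy now sits around 25 kHz, so people hear nothing; the
# attack relies on a microphone's nonlinearity demodulating it back to audio.
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(len(modulated), 1 / fs)
print(freqs[np.argmax(spectrum)])  # dominant frequency: 25000.0 Hz
```

The spectrum check at the end confirms why the attack is silent: all the dominant energy sits at the 25 kHz carrier, well above the roughly 20 kHz ceiling of human hearing.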