Alexa, are you spying on me? Siri, Alexa, Google Home & co are listening in!
Author: Lisa Wossal

I am sure you know the feeling: you are lying in bed and realise you have forgotten to turn off the light. It was so cosy just now; do you really want to get up again? Smart home devices can cure this and countless other problems. Siri, Alexa and others have already moved in with many people. Smart speakers sit in the living room, the bedroom, the children's room, and the home office hastily put together during the pandemic. But what information do our little technological helpers actually process? Could important, confidential data fall into the wrong hands? Okay, Google, are you listening?
In this blog post, we present some practical tips on protecting your data whilst still living smart.
The smart home market continues to grow. Light switches, security cameras and spontaneous music requests can all be controlled via voice commands. To enable this, smart speakers use built-in microphones that remain switched on during regular operation so that spoken instructions can be carried out on cue.
A study by Northeastern University in Boston and Imperial College London, published in October 2020, showed that the market-leading devices from Apple, Amazon, and Google do not become active on their wake word alone.1 Between 1.5 and 19 times a day, the smart speakers responded without first hearing the activation word. The subsequent conversations or sounds were recorded for an average of about 4 seconds, in some cases even up to 43 seconds, and then sent to the cloud.
How can this happen? The devices use different activation words: "Hey Siri", "Okay Google", "Alexa" and so on, which are supposed to make the speakers start listening actively. The instructions spoken after the code word are picked up by the device, interpreted via voice recognition, and answered over the internet connection. This becomes problematic because similar-sounding word combinations also trigger activation. In English, for example, Siri can mistake "I'm sorry", pronounced with a certain accent, for her code word "Hey Siri". In German, Google reacts to simple statements such as "Okay, good". Even colloquially pronounced sentences such as "Ham wa schon" (roughly, "we've already got that") can activate Alexa, Amazon's assistant, as the NRW consumer centre found.2 In this way, conversations can be recorded and stored without those present ever noticing.
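Why near-misses can trigger a wake word can be sketched with plain string similarity. To be clear, this is only an illustration, not how the assistants actually work: real devices compare acoustic features with an on-device model, whereas this hypothetical `misactivation_risk` helper scores text similarity with Python's standard-library `difflib`.

```python
import difflib

# Hypothetical wake-word list, for illustration only.
WAKE_WORDS = ["hey siri", "okay google", "alexa"]

def misactivation_risk(phrase, threshold=0.6):
    """Return (wake_word, score) pairs whose textual similarity to
    `phrase` crosses `threshold`.

    Real assistants match acoustic features, not spelling; this sketch
    merely shows how fuzzy matching can fire on a similar-sounding phrase.
    """
    phrase = phrase.lower()
    hits = []
    for word in WAKE_WORDS:
        # ratio() returns a similarity score between 0.0 and 1.0.
        score = difflib.SequenceMatcher(None, phrase, word).ratio()
        if score >= threshold:
            hits.append((word, round(score, 2)))
    return hits

# "okay good" is close enough to "okay google" to cross the threshold,
# mirroring the misactivations described above.
print(misactivation_risk("okay good"))
```

Even this crude textual measure flags "okay good" as a near-match for "okay google", which is exactly the class of false trigger the study observed.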
But that's not all: the recorded data is sometimes linked to existing profiles. With Google, for example, recordings are tied to the signed-in Google account; with Alexa, this personalisation happens via the Amazon account. The data collected by Apple's Siri is linked to a randomly generated device-specific identifier, but not to the Apple ID. Those who were not paying close attention may also have agreed to release their data to "improve services". That doesn't sound so bad? The consumer advice centre warns that recorded conversations could also be used for targeted, i.e. personalised, advertising beyond the information service itself.3
Moreover, recordings are not only processed by the speakers themselves: selected recordings are listened to by company employees in order to improve the AI. According to Amazon, these employees are bound by a strict confidentiality clause and process the voice recordings without personal data.
Now, imagine a conversation that involves an "Okay, good" and then continues. It could be about your daughter's next birthday, or perhaps about the major client with whom your company is about to sign a contract. Not the kind of conversation you would want overheard unnoticed. But these are still your devices, and you can take appropriate precautions and adjust settings to keep private things private and to avoid sharing professional matters with large corporations in undesirable ways.
For Google devices, your activity log, including interactions with the Google Assistant, can be viewed at myactivity.google.com, where recordings and activities can also be deleted. The privacy settings of your smart home devices are managed via your Google account.
For Amazon's Alexa, individual voice commands can be deleted in the settings of the Alexa app. More detailed privacy and data protection settings can be managed in your Amazon account under "My content and devices".
Apple states that recorded data is deleted after six months. Details about your data and your privacy settings can be found in iOS under "Settings" and then "Siri & Search".
Especially when working from home, you should question what smart home devices can listen in on, record and process. You must ensure that personal and internal company data is processed securely at workplaces outside the company offices. And remember: although the devices let you adjust their settings, and the market-leading manufacturers provide pages of data protection declarations, there is always the possibility that technical devices will be hacked.
Researchers at Berlin's Security Research Labs demonstrated exactly this in their Smart Spies project. Third-party developers can extend the functions of Google's and Amazon's smart speakers with apps, which each company reviews before release. However, the SRL team managed to subsequently make small, unnoticed changes with a significant impact. They programmed one application so that conversations were recorded and forwarded even after the stop command, which is supposed to put the device back into an inactive state. In a second experiment, users received an error message after calling up the programme. After some time, the speaker then reported that an update was available and that it could be initiated with the voice command "Start", followed by verbally entering the password. The tech-savvy among you may think this is too obvious: you would never be asked to reveal your password. But are the youngest and oldest generations who use such devices aware of this? The project by the Berlin researchers demonstrated one thing above all: it is technically possible to use the tools of Amazon, Google, etc. for questionable purposes.
In the wake of the coronavirus pandemic, working from home was discussed ambivalently in many companies and in the media. Many companies had no universal guidelines for working from home. Security risks were, and are, seen above all in private hardware, inadequately secured network connections and a lack of standardised software. A company's overarching IT security strategy should also cover employee awareness and the possible presence of smart home devices. After all, more than 30% of households in Germany already have at least one such device, with the trend continuing to rise.
If you are handling sensitive data, you should consider the safest option for working within a smart home environment: switching the devices off. All smart speakers can be deactivated so that the integrated microphones are switched off. Some companies have already advised their employees to adopt this solution. One such example comes from the British law firm Mishcon de Reya, whose head of cybersecurity, Joe Hancock, explained the move:
"Perhaps we're being slightly paranoid, but we need to have a lot of trust in these organisations and these devices. We'd rather not take those risks." - Joe Hancock 4
So, in summary, here are some practical tips to keep in mind. Firstly, if you have smart home devices, take a few minutes to adjust the settings to better suit your own privacy needs. Make sure your colleagues know the risks of working from home and create a comprehensive awareness concept for your company. To ensure that no conversations are recorded unknowingly, whether at work or in private, set an acoustic signal on the device, in addition to the visual one, to indicate when a recording starts. And if you deactivate your devices during working hours, remember to switch the smart speakers back on at the end of the working day so you can fully enjoy the benefits of this technology in private.
And now, Alexa, turn off the lights.
1 When Speakers Are All Ears: Characterizing Misactivations of IoT Smart Speakers in: Mon(IoT) Research Lab [online] moniotrlab.ccis.neu.edu/smart-speakers-study-pets20/ [12.07.2021]
2 Ungewollt gesprächsbereit: Auch Googles Sprachassistent hört mehr als er soll, in: Verbraucherzentrale Bundesverband [online] www.vzbv.de/pressemitteilungen/ungewollt-gespraechsbereit-auch-googles-sprachassistent-hoert-mehr-als-er-soll [12.07.2021]
3 Ungewollt gesprächsbereit: Auch Googles Sprachassistent hört mehr als er soll, in: Verbraucherzentrale Bundesverband [online] www.vzbv.de/pressemitteilungen/ungewollt-gespraechsbereit-auch-googles-sprachassistent-hoert-mehr-als-er-soll [12.07.2021]
4 Locked-down lawyers warned Alexa is hearing confidential calls, in: The Seattle Times [online] www.seattletimes.com/business/locked-down-lawyers-warned-alexa-is-hearing-confidential-calls/ [13.07.2021]