But have you ever really thought about how these home assistants know when you’re talking to them? In order to do so, they have to be listening all the time, meaning that the devices, the servers that process the data, and the companies that build and operate these devices are listening 24/7. This can be a major issue, though it may be acceptable if the companies take the proper steps to protect your sensitive information. At a minimum, they should anonymize the collected records, protect them properly, and destroy them as soon as the request is complete. It would also be better if requests were handled only by machines, with no humans in the loop. But what if they don’t? The number of data breaches in recent years demonstrates that companies have a very hard time protecting sensitive data, and recent news reports suggest that these smart home companies have no intention of meeting these standards.
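To make that minimum bar concrete, here is a rough sketch, in Python, of what pseudonymizing identifiers, redacting obvious personal details, and enforcing prompt deletion might look like. The record fields, the `RETENTION_WINDOW`, and the regex-based redaction are illustrative assumptions, not how any vendor actually operates; real pipelines need far more than pattern matching, as the leak discussed below shows.

```python
import hashlib
import re
from datetime import datetime, timedelta, timezone

# Illustrative retention window: destroy records shortly after the request completes.
RETENTION_WINDOW = timedelta(minutes=10)

def pseudonymize_device_id(device_id: str, salt: str) -> str:
    """Replace the raw device identifier with a salted hash so records
    can't be trivially tied back to a household."""
    return hashlib.sha256((salt + device_id).encode()).hexdigest()[:16]

def redact_transcript(text: str) -> str:
    """Crude redaction of obvious identifiers (emails, phone-like numbers).
    Names, addresses, and voiceprints slip right through this kind of filter."""
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", text)
    text = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[PHONE]", text)
    return text

def purge_expired(records, now=None):
    """Keep only records younger than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["captured_at"] <= RETENTION_WINDOW]
```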
And It’s Not Just Them
In fact, Google has confirmed that recordings made by its Home Assistant are available to contractors to help improve the device’s artificial intelligence algorithms. This admission came after over a thousand Dutch voice recordings were publicly leaked. Over a hundred of these clips were captured accidentally, meaning that the owners did not know they were being recorded and certainly did not know the recordings were being subjected to human review to improve the Home Assistant software. Google reports that it makes an effort to protect owners’ privacy by anonymizing the data. However, its anonymization efforts have been insufficient: the leaked audio recordings included the full address of one user and personal information (children’s names, significant others, etc.) of several others. This sensitive data, in addition to anything that can be extracted from the voiceprint itself, may be enough to de-anonymize the collected and leaked data.
Privacy Implications of the Smart Home
Smart home technology is extremely convenient; however, that convenience often comes at the cost of privacy. These devices are designed to listen to their owners constantly so that they can respond to the key phrases (Alexa, OK Google, etc.) indicating that the owner has a task for them. Smart assistants therefore represent a clear tradeoff between convenience and user privacy and security. The massive datasets collected by these devices are treasure troves of personal data for hackers: access to a user’s recordings would likely give an attacker everything needed to commit credit card fraud, launch a spear-phishing campaign, or mount any of a number of other attacks. Home assistants also threaten businesses’ cybersecurity, since they capture recordings of employees discussing business matters at home. It’s vital to consider the security implications whenever talking within earshot of one of these devices.
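As a rough illustration of that always-listening design, the Python sketch below keeps a short rolling buffer of audio frames in device memory and only ships audio off the device once a key phrase appears. The `transcribe_locally` and `send_to_cloud` callables are hypothetical stand-ins (real assistants use on-device acoustic wake-word models, not string matching), but the structure shows why accidental activations end up recorded: whatever is in the buffer at the moment of a false trigger goes to the cloud.

```python
import collections
from typing import Callable, Iterable, List

WAKE_PHRASE = "ok google"   # illustrative; real detection is an acoustic model, not text matching
PRE_ROLL_FRAMES = 16        # short rolling buffer, kept only in device memory

def wake_word_gate(frames: Iterable[bytes],
                   transcribe_locally: Callable[[List[bytes]], str],
                   send_to_cloud: Callable[[List[bytes]], None]) -> None:
    """Continuously evaluate incoming audio, but only forward the buffered
    snippet once the wake phrase is (or appears to be) detected."""
    rolling = collections.deque(maxlen=PRE_ROLL_FRAMES)
    for frame in frames:
        rolling.append(frame)
        # The microphone is always on and always being analyzed; that is the tradeoff.
        if WAKE_PHRASE in transcribe_locally(list(rolling)).lower():
            # A false trigger sends this buffer too, which is how accidental
            # recordings like the leaked Dutch clips come to exist server-side.
            send_to_cloud(list(rolling))
            rolling.clear()
```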