Aretha - An Assistant for Your Assistant
Your voice assistant is there to watch over you, but who watches over it?

What is Your Virtual Assistant Actually For?
The Amazon Echo Dot is a fully fledged, high-powered, artificial-intelligence-based virtual assistant that can do everything from buying your groceries to waking you up in the morning. By all accounts, this is science fiction technology: the kind of thing we used to dream about as kids, along with teleporters and ray guns.
And you can buy this sci-fi gadget online, right now, for $40.
Have you ever wondered why one of the most advanced technologies available to mankind costs less than a pair of cheap sneakers?
It’s because the Amazon Echo, and devices like it, aren’t actually built for what you use them for. Buying groceries? Waking you up in the morning? These are window dressing for the device’s real functions: trafficking your data and influencing your buying patterns. Think about it: if you have an Echo, you’re more likely to place orders on Amazon.com. Your Google Home can use the data you give it to optimize the advertising you see every time you’re on the internet. These revenue streams, over time, generate far more than $40 in value.
What this means, ultimately, is that your virtual assistant was never really designed for your benefit. It was designed with your supplier’s business interests in mind, often at the expense of your privacy.
This arrangement simply won’t do. If we want smart assistants to work for their actual users, we must fundamentally reverse the balance of power in the smart home.
A Black Box Problem
Recently, a group of computer scientists at the University of Oxford came together to do just that.
They began with a premise: virtual assistants gather and traffic lots of user data, but users themselves rarely know about it. The prevailing narrative holds “that users express concern over privacy and security, yet consistently do not take privacy and security preserving actions.” But should users be blamed for this information disconnect? Plenty of people who enjoy smart home devices aren’t savvy enough to customize their digital experience down to the fine details. And even those who know their way around a laptop or a smartphone find that basic due diligence on IoT security is harder than they’d expect.
[. . .] legal and regulatory structures (such as privacy policies) often place users in an impossible situation when navigating privacy choices, suggesting that users make choices armed with incomplete information; that even with perfect information, they are unlikely to be able to process all of it [. . .]
If devices give incomplete information to users, and users aren’t equipped to handle even that which is given to them, how can anyone ever expect true privacy and security in a smart home?
A Solution
The solution, for this group of Oxford researchers, is a program called “Aretha.”
Aretha’s function is to bridge the information gap between smart home assistants and users. It does this by weaponizing the very feature that makes these assistants so attractive in the first place: their ability to translate digital information into ordinary conversation. Through the voice assistant, Aretha explains to users, in plain language, what is happening in their network and how to customize it to their liking. The process occurs in three steps.
First, Aretha develops a baseline understanding of the user’s knowledge and concern around privacy. For example, a conversation might begin with...
Aretha: How would you feel if your smart devices sold your data to advertisers?
User: I think I’d be fine with it, it’s pretty normal, right?
Next, Aretha puts these hypothetical questions into real terms, describing the actual data flows in the user’s smart home in order to determine their true preferences.
Aretha: Do you think your current devices send data to advertisers?
User: Yeah, probably.
Aretha: What about your Nokia Body scales, do you think they send health data, like your weight, to advertisers?
User: I guess?
Aretha: Is this fine too?
User: Hmm, I think I’d rather it didn’t do that.
Finally, with a baseline understanding established and preferences decided, Aretha offers the option to act.
Aretha: OK, would you like me to block advertising traffic to and from the Nokia Body scales?
User: Yes please.
Aretha: Will do, let me know if this causes any problems.
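Under the hood, a consent like the one above has to become a concrete network rule. The paper doesn’t publish Aretha’s enforcement code, but a gateway-level blocker might look something like the following sketch; the device IP, advertiser host, and use of iptables are all illustrative assumptions, not Aretha’s actual implementation.

```python
# Hypothetical sketch: turning a user's "yes, block it" into firewall rules.
# The advertiser endpoint below is made up for illustration.
ADVERTISER_HOSTS = {"ads.example-tracker.com": "198.51.100.7"}

def block_rules(device_ip: str) -> list[str]:
    """Return iptables commands that drop traffic between one smart
    device and every known advertiser endpoint, in both directions."""
    rules = []
    for host, ad_ip in ADVERTISER_HOSTS.items():
        rules.append(f"iptables -A FORWARD -s {device_ip} -d {ad_ip} -j DROP")
        rules.append(f"iptables -A FORWARD -s {ad_ip} -d {device_ip} -j DROP")
    return rules

# A gateway would execute these; here we just print them.
for rule in block_rules("192.168.0.42"):
    print(rule)
```

Generating rules as data rather than executing them directly also makes it easy to undo the block if, as Aretha warns, it “causes any problems.”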
The Tech Stack
It’s no simple task to translate complex data flows into plain conversation, but Aretha isn’t starting from scratch. In fact, it’s the latest in a long line of privacy-oriented software for the web. That line began with Solid, a project founded by Tim Berners-Lee, the inventor of the World Wide Web, in response to the growing threat to privacy online.
Building on Solid came IoT Refine, an open-source program that maps the data flows in a smart home. Basically, IoT Refine acts as a network hub through which all smart home devices connect. As data flows from one device to another, and into and out of the wider web, the program reveals which devices send data to which companies.
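The core of that mapping step can be sketched in a few lines: attribute each captured flow to the company behind its destination. This is only an illustration of the idea, not IoT Refine’s code, and the domain-to-company table here is invented for the example.

```python
# Illustrative flow attribution: which companies does each device talk to?
# The lookup table is a stand-in for the kind of database a real
# flow-mapper would maintain.
COMPANY_BY_DOMAIN = {
    "device-metrics-us.amazon.com": "Amazon",
    "clients4.google.com": "Google",
    "ads.example-tracker.com": "ExampleAds",
}

def attribute(flows):
    """flows: iterable of (device_name, dest_domain) pairs.
    Returns {device: set of companies it contacted}."""
    seen = {}
    for device, domain in flows:
        company = COMPANY_BY_DOMAIN.get(domain, "Unknown")
        seen.setdefault(device, set()).add(company)
    return seen

flows = [("Echo Dot", "device-metrics-us.amazon.com"),
         ("Smart scale", "ads.example-tracker.com")]
print(attribute(flows))
```

In practice the hard part is the capture itself, which is why IoT Refine sits at the network hub where every flow must pass through it.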
That means Aretha, which builds on IoT Refine, is essentially a user-experience layer. It takes the data flows the base program reads and converts them into simple speech; then it converts a user’s spoken responses into actions that modify those flows.
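The flows-to-speech half of that layer is, at its simplest, a templating problem: render an attributed flow as a sentence a non-expert can act on. A minimal sketch, with a made-up device and advertiser rather than Aretha’s real output:

```python
# Hypothetical rendering of one attributed data flow as plain speech.
def describe(device: str, company: str, category: str, megabytes: int) -> str:
    """Render one attributed flow as a spoken-style sentence."""
    return (f"Your {device} sent {megabytes} MB to {company}, "
            f"which looks like {category} traffic.")

print(describe("Nokia Body scales", "ExampleAds", "advertising", 2))
```

The real system has to go further, handling the conversational follow-ups and idiosyncrasies the researchers mention, but the direction of translation is the same: network facts in, everyday language out.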
This is an incremental but crucial step toward making smart home security accessible to everyone. Hundreds of millions of people today own smart home speakers, and few among them are prepared to fully utilize a program like IoT Refine. By 2021, there may be more virtual assistants in the world than people to use them. If connected technologies are ever to be safe for wider consumption, privacy must be not only possible but accessible.
The Road Ahead
Aretha is currently in its prototype state. As of this writing, it is being evaluated at a science center in Watford, England. Additionally, there are plans to test the program with a select group of smart home owners. The study will gauge how users interact with Aretha, and how positive (or negative) their experience with it is.
There are more kinks to work out along the way, like smoothing over the natural idiosyncrasies of conversation and deepening the program’s “contextual awareness.” If Aretha succeeds, though, it will completely reverse who holds the power in your smart home. And the virtual assistant will finally become that cool thing we all used to dream about as kids.