THRIDI talk @iotlondon
Updated: Apr 28, 2021
I recently shared the stage at the @iotlondon meetup with Mo Haghighi, Developer and Ecosystems Advocacy Leader - EMEA, IBM, and Paul Wealls, Founder, @IotnorthU & Senior Product Manager, ADLINK.
The London Internet of Things Meetup brings together IoT enthusiasts and has close to 14,000 members. Since November 2011, members have gathered once a month to listen to three talks on topics ranging from open hardware to urban infrastructure. You can learn more and sign up on their meetup page.
It was a great pleasure to talk to a diverse crowd about the THRIDI project and hear from others on their experiences of "making the internet of things happen". My slides are here. Below is a summary of what I talked about that day.
THRIDI is my first interdisciplinary project, run together with Arthi Manohar from Brunel Design and Jiahong Chen from Nottingham University.
In my years in academia and industry, I've always been interested in computer networks and systems; I was mainly intrigued by machine-to-machine communication and network protocols. Before joining Brunel, I worked at Nominet. One of my projects was to look at data protection in the Internet of Things, at a time when GDPR was not yet in force but was much debated. My decision to look into standardisation in these areas led me into the world of the IETF (Internet Engineering Task Force), specifically ACE (Authentication and Authorization for Constrained Environments).
I've also had a great time working on UMA 2.0 in the Kantara UMA Working Group.
The work in those working groups looks into authorisation in IoT systems, generally building on the Web Authorization Protocol, OAuth 2.0. Very briefly, OAuth 2.0 lets users grant one site limited access to their resources on another site without exposing their credentials.
When working on adaptations of OAuth 2.0 to IoT (or constrained environments), we tend to see the world through the lens of four entities:
Resource owner: The end user, who owns the data
Authorization server: The interface where authorization is obtained
Client: The application trying to get access to user information
Resource server: Hosts the user's information
Access permissions to a resource are granted as time-bound, scoped, compact tokens.
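To make the four roles concrete, here is a minimal, self-contained sketch of the flow: an authorization server issues a compact token that is both scoped and time-bound, and a resource server checks it before releasing data. All class and method names (and the in-memory "introspection" lookup) are illustrative assumptions for this post, not the actual ACE or UMA wire protocol.

```python
import time
import secrets

class AuthorizationServer:
    """Illustrative authorization server: issues scoped, time-bound tokens."""

    def __init__(self):
        self._tokens = {}  # token string -> metadata

    def issue_token(self, client_id, scope, ttl_seconds=60):
        # A compact, opaque bearer token (not a real ACE/CWT token).
        token = secrets.token_urlsafe(16)
        self._tokens[token] = {
            "client": client_id,
            "scope": set(scope),
            "expires_at": time.time() + ttl_seconds,
        }
        return token

    def introspect(self, token):
        # Return metadata only if the token exists and has not expired.
        meta = self._tokens.get(token)
        if meta is None or time.time() > meta["expires_at"]:
            return None
        return meta

class ResourceServer:
    """Illustrative resource server hosting the resource owner's data."""

    def __init__(self, auth_server):
        self.auth_server = auth_server
        self._data = {"temperature": 21.5}  # the protected resource

    def read(self, token, resource):
        meta = self.auth_server.introspect(token)
        if meta is None or f"read:{resource}" not in meta["scope"]:
            raise PermissionError("token invalid, expired, or out of scope")
        return self._data[resource]

# The client (e.g. a thermostat app) obtains a token scoped to one
# resource and presents it to the resource server.
auth = AuthorizationServer()
rs = ResourceServer(auth)
token = auth.issue_token("thermostat-app", scope=["read:temperature"])
print(rs.read(token, "temperature"))  # → 21.5
```

Note that the resource owner does not appear in the code at all: somebody still has to decide which clients get which scopes for how long, which is exactly the configuration gap discussed below.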
These abstractions are very useful until the moment you start thinking of the actual users. Questions like the following are typically, and rightfully so, out of scope for the standards specifications:
Are end users the resource owners? How would they know how to configure authorization servers with privacy policies? (This is needed because we cannot expect users to be online and ready to decide every time somebody wants to access data on their IoT device.)
Following on from the above, how can we support them in creating data-sharing policies on demand at the time of a new interaction?
What happens when multiple people share a single device? How does each manage their privacy?
Since we do not know how to resolve these questions well, we end up assuming a system-administrator role that manages devices, users and their data. So, while ACE and UMA have great potential for giving users agency over their data, configuring these systems assumes the existence of a tech-savvy person in every household.
This is why we need input from other disciplines to make these systems more usable. And this is why we created the THRIDI project (funded by the EPSRC HDI network) and ran an interdisciplinary design workshop in November 2020. The twenty-four participants were a mix of established academic experts and early-career researchers, specialising in IoT, network security, privacy-enhancing technologies, user-interface design, law, and policy.
The THRIDI workshop used the HDI framework of legibility, agency and negotiability to understand the challenges for data sharing in IoT and how to build user trust. The participants considered the technical, legal and business barriers and opportunities that will shape the implementation.
Let's go over how we interpreted legibility, agency and negotiability in the workshop:
Legibility: Helping people understand what is happening to data about them. A well-known legibility challenge is the lack of appropriate interfaces for users to see the extent and the nature of the data collected.
Agency: To change relevant systems to be in better accord with people's wishes. User agency is hard to achieve when different users share devices with different relationships (e.g., housemates, family members or hired help).
Negotiability: To work with the people using the data to improve its processing. Similarly, the negotiability of data sharing may not be apparent to users, as their privacy preferences and data-sharing context change over time (e.g., changing care needs in a smart home designed for healthcare).
Using legibility, agency and negotiability as anchors for discussion, the workshop focused on four use cases: 1) home security, 2) smart appliances, 3) smart health, and 4) smart toys. We ran several design activities, from card sorting to scenario cards to design fiction. We learned a lot about the perspectives of different disciplines and identified possible avenues for future research. You can access the full workshop report here.
My three takeaways?
Everybody is different: Even though the workshop participants are likely more privacy-aware than the average user, we are still very different in our perceptions of and expectations about privacy. For instance, in a card-sorting exercise, people generally marked the bedroom and bathroom as the most private spaces, except for a couple of participants who considered them only moderately private because they have smart devices in those spaces or enter them carrying smart devices.
Different perspectives are very important: Following on from the above, we need to cater to different perspectives when providing agency to users. Consider the following two statements about a "footprints on snow" visual: “You should always minimise data collection. All problems start when you start to collect data. So, the least footprints you have, the better in technology” vs. “The footprints are temporary because, at the end of the day, the environment will eliminate them.”
We need more interdisciplinary research: We cannot work in silos and think we understand the legal, ethical, technical, design, and social aspects of home IoT systems. We should each do our part to train people to be more vigilant about their data.
To read more on the activities of the design workshop, the questions raised, and open research topics we discussed, read our report and reach out to continue the discussion.