Designing for trust

A few weeks ago I appeared on Restart Radio. One of the first questions I was asked was about IF’s perspective on ‘trust’. It’s something we’ve talked about a few times in the studio, but never written down: time to change that.

"Trust isn't about compliance, it's about design" postcard (CC-BY IF)

Trust is a part of every relationship

Trust is a fundamental part of every relationship in my life. The word ‘trust’ bundles up a few concepts like ‘respect’, ‘openness’, ‘empathy’ and ‘honesty’. It’s about treating people as they’d like to be treated.

Those relationships extend to things too. I trust the people who made my fridge not to have made something that'll break after a few weeks. I also trust the shop that sold it not to have stocked fridges that break a lot.

When my trust is broken I feel much the same, whether it's broken by a business or a person. I feel bruised. Sometimes angry, sometimes sad. Often I feel a bit vulnerable. That can be a grim feeling.

Trust used to last a long time

When it comes to products, there’s a whole ecosystem that supports a language around trust. We have star ratings, gauges, ‘safe to buy’ stickers and certificates that tell us at or before point of purchase whether we should trust something or not. But it’s clear that something’s coming unstuck when it comes to connected products.

Because the things we trust are capable of changing, measuring the trustworthiness of a product at a single point in time will no longer cut it. Trust is a quality that can fluctuate really quickly now.

An illustration from our report on trust in machine learning

Trust has technical characteristics

IF is here to “help teams build services people trust”. The span of that work has been hugely varied: we’ve shown teams the role digital could play in their organisation; we’ve shaped policy development for consumer advocacy groups around the world; we’ve unpacked the relationship people have with digital infrastructure. But some themes keep emerging...

How a service uses data should be clear to users. As far as we're concerned, this is foundational. It's also probably the way legislation is going: the General Data Protection Regulation is far from watertight, but it's an indication that companies are going to be 'encouraged' to be upfront about how they use data from now on. That's going to lead to big changes in business models and operating processes.

Blanket transparency is less useful than transparency in context. Confronting users with every data point is disorienting and confusing. At the same time, hiding what's happening from them is alienating and devious. At Health Data: Fit or failing? a few weeks ago, Natalie Banner from Wellcome said, "A trustworthy system is one where you know how data is going to be used." Lacing services with microinteractions that reveal more about the relationship between what people do and the data held about them is a critical part of being trustworthy. Companies can (and should) still make the raw information available to those interested in it, but that shouldn't be the only approach.
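
To make that concrete, here's a rough sketch of what an in-context microinteraction could look like in code. Everything in it — the names, the data shapes, the wording — is invented for illustration, not a description of how any particular service does this.

```typescript
// Hypothetical sketch: surface how data is used at the moment an action
// happens, instead of burying it in a privacy policy. All names invented.

type DataUse = {
  action: string;        // what the user just did
  dataCollected: string; // what data that action generates
  purpose: string;       // why the service holds it
};

const uses: Record<string, DataUse> = {
  saveRoute: {
    action: "You saved this route",
    dataCollected: "your start and end locations",
    purpose: "to suggest similar routes later",
  },
};

// Returns a short, human-readable notice to show next to the action,
// pointing to the full record for anyone who wants the raw detail.
function noticeFor(actionId: string): string | undefined {
  const use = uses[actionId];
  if (!use) return undefined;
  return `${use.action}. We stored ${use.dataCollected} ${use.purpose}. ` +
    `See your full data record for details.`;
}

console.log(noticeFor("saveRoute"));
```

The point is the shape, not the specifics: the explanation lives next to the action that generated the data, and the raw record stays available for people who want to dig deeper.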

Being able to verify what a service is doing is very powerful. Consumer advocacy organisations and state bodies are starting to ask questions about how they can hold services to account for the way they use and store data. Services should be built in a way that makes it possible to do that. In a short time, that'll be a consumer expectation. For now, it will help differentiate manufacturers and suppliers in the market.
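
As a sketch of what 'verifiable' might mean in practice, here's one common technique: a hash-chained, append-only log of data use that an outside auditor can check for tampering. This is an assumption about one possible approach, not how any service we've worked on is built.

```typescript
import { createHash } from "crypto";

// Hypothetical sketch of a tamper-evident audit log: each entry commits
// to the previous one, so an auditor can confirm the record of data use
// hasn't been quietly rewritten. Entry shape and names are invented.

type Entry = { timestamp: string; event: string; prevHash: string; hash: string };

function hashEntry(timestamp: string, event: string, prevHash: string): string {
  return createHash("sha256").update(`${timestamp}|${event}|${prevHash}`).digest("hex");
}

function append(log: Entry[], event: string): void {
  const timestamp = new Date().toISOString();
  const prevHash = log.length ? log[log.length - 1].hash : "genesis";
  log.push({ timestamp, event, prevHash, hash: hashEntry(timestamp, event, prevHash) });
}

// An external auditor can re-derive every hash and check the chain links up.
function verify(log: Entry[]): boolean {
  return log.every((entry, i) => {
    const expectedPrev = i === 0 ? "genesis" : log[i - 1].hash;
    return entry.prevHash === expectedPrev &&
      entry.hash === hashEntry(entry.timestamp, entry.event, entry.prevHash);
  });
}

const log: Entry[] = [];
append(log, "shared location with delivery partner");
append(log, "deleted location history at user request");
console.log(verify(log)); // true — and false if any entry is later altered
```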

All three of these themes overlap a little. They each bring a different quality to the ‘respect’, ‘openness’, ‘empathy’ and ‘honesty’ I mentioned up top. Together, they’re a really powerful way of making a service trustworthy.

Audree Fletcher from Barnardos talks about our pattern catalogue at July's Trust & Design meetup (CC-BY IF)

Making trust a design problem

This isn’t about the abstract trustworthiness of a brand: trust has technical qualities. It’s about the way something is built and run. Services have to be safe to use.

In the security community, trust is an exploitable avenue of attack. Talking about services being ‘trustworthy’ carries a lot of baggage. Our hope is that you can mitigate that risk by making it clear how a service uses data and by being accountable for its use and storage.

We help companies get better at this stuff by making new services, improving consent models, and taking these issues to new audiences. If you want to hear me chat more about that, I recommend listening to Restart Radio. They’ve got a great collection of guests in their archives, so check that out too.

Alternatively, get in touch. I'm always happy to hear about other approaches to trust that can improve the work we do.