What can we do to humanise our “user experiences”?

How one book gave me better perspective on human experience design.

Having just finished reading “The Age of Surveillance Capitalism” by Shoshana Zuboff, I feel like my eyes have been opened that little bit wider, and the tech world won’t quite look the same again.

Image: ‘The Age of Surveillance Capitalism’ book, with a post-it note on top saying ‘this machine kills apathy’

The text is a tour de force: a thought-provoking analysis of the technology-based modernity we have all so readily embraced without stopping to assess the impact and consequences it brings. Amongst the many points discussed across its 691 pages, it brings to light the asymmetrically poor deal we get when we relinquish personal data in exchange for becoming ever more dependent on monopolistic companies determined to change our habits and influence the decisions we make as individuals, all in the pursuit of revenue.

It made me take a step back and start thinking about the products I’ve created or contributed to, and the human impact they’ve had, for better or worse. This is something we don’t do often enough; I’ve yet to come across UX processes that bake in these kinds of reflective questions, and it’s important we begin to take the time to do just that.

Human experience design

Today our norms are changing, fast. Technology is becoming ever more pervasive in our lives, and what previously may have seemed unacceptable and obtrusive is now deemed the cost of doing business; users will have to like it or lump it. A 2018 study on data collection from Vanderbilt University shows just how difficult it is to escape Google’s all-seeing eye.

As UX designers we build experiences and services at a fundamental level, and we carry an enormous responsibility to champion the user, to represent the user, to be their voice. Yet in many instances (especially in the private sector) we actually look for ways to exploit the user: nudge them towards our preferred choice, influence their behaviour over the long term, keep them in a cycle, make them dependent on our products. We’re not really championing the user so much as leveraging them for the system’s gain.

When you begin to think of the individuals using your products as humans rather than ‘users’, you begin to realise the impact some of these products and services can have on people’s lives. Herein lies a philosophical debate, and questions we need to start asking: Am I really championing the user? Am I really looking to do the best by the user and address their needs? Am I thinking about the long-term consequences for user behaviour, and building in safeguards?

Some food for thought

It can be difficult raising these issues in workshop sessions, but they’re important, and we need to educate one another about the human cost of our decisions. As experience designers we’re uniquely placed to build these arguments and bring these issues to light.

There are some simple things we can start doing, that increase transparency and respect the rights of the individual:

  • Respect people’s privacy

    We don’t search someone or run a background check on them when they walk into our place of business to buy a pack of gum, so minimise the personal information you capture unless it’s genuinely necessary. The EU’s GDPR legislation already requires this, and their checklist is a handy tool for assessing compliance.

  • Don’t hide behind the fine print

    Avoid the ridiculously impenetrable terms of service that allow organisations to get away with literally anything. Speak to legal and lobby them to increase the transparency and clarity of these documents, so your users genuinely understand the terms to which they’re agreeing.

    Consider structuring the documents to highlight the key implications up front, and make them user-centric: an FAQ-style approach is easier for most people to understand, and larger fonts and colour can be deployed to further aid readability. One thing to avoid is small dialog boxes with endless scrolling, as they effectively inhibit the reading experience.

    Facebook isn’t known as a bastion of privacy, but its terms of service are well structured with less legalese, making them an easier pill to swallow, albeit still a bitter one. Apple’s newly updated privacy portal isn’t exactly a terms of service, but it shows they’re making their policies more transparent and educating their users.

  • Assess addictiveness

    We want delightful interactions that users can enjoy, but we need to introduce limits and safeguards when usage tips into addiction and becomes detrimental to the user.

    In many instances as product designers we aim to crack the code of making our products sticky and get our users into frequent usage, but this can also trigger unwanted addictive behaviours. Just as we analyse user data to increase engagement, we can analyse it to identify negative behaviours.

    Some of these negative behaviours are easier to identify than others: excessive consumption of video content, an inordinate number of app opens in a day, or hours upon hours of gameplay in a short window are all straightforward to spot. Simple warnings when thresholds are crossed can let users know they’re overdoing it; a rough sketch of this idea follows after this list.

    Even Apple’s iOS introduced the ‘Screen Time’ feature in 2018 to help users monitor and manage the amount of time they spend on their devices as a whole.

  • Give a fair deal

    Some apps request access to a lot of data and more or less deny service if it’s not granted; one culprit was iRobot’s Roomba, which collected data in this manner. It isn’t a fair exchange: you may have shelled out for the hardware, but that doesn’t mean you want to share detailed personal data about your home.
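
To make the thresholds idea above a little more concrete, here is a minimal sketch in TypeScript of how the analytics we already gather for engagement could be repurposed to flag potentially unhealthy usage. The metric names, threshold values, and the checkUsageThresholds function are illustrative assumptions of mine, not a real product API; real limits would need research input rather than numbers plucked from the air.

```typescript
// Hypothetical sketch: reusing engagement analytics to flag potentially
// unhealthy usage patterns. All names and numbers here are illustrative.

interface UsageMetrics {
  minutesOfVideoWatched: number;
  appOpensToday: number;
  continuousSessionMinutes: number;
}

interface UsageWarning {
  metric: keyof UsageMetrics;
  message: string;
}

// Example thresholds; in practice these would be set with research input.
const THRESHOLDS: Record<keyof UsageMetrics, { limit: number; message: string }> = {
  minutesOfVideoWatched: { limit: 180, message: "You've watched over 3 hours of video today." },
  appOpensToday: { limit: 50, message: "You've opened the app more than 50 times today." },
  continuousSessionMinutes: { limit: 120, message: "You've been in one session for over 2 hours." },
};

// Compare today's metrics against the thresholds and collect gentle warnings.
function checkUsageThresholds(metrics: UsageMetrics): UsageWarning[] {
  return (Object.keys(THRESHOLDS) as (keyof UsageMetrics)[])
    .filter((metric) => metrics[metric] >= THRESHOLDS[metric].limit)
    .map((metric) => ({ metric, message: THRESHOLDS[metric].message }));
}

// Usage example: surface any warnings as a non-intrusive nudge.
const warnings = checkUsageThresholds({
  minutesOfVideoWatched: 200,
  appOpensToday: 12,
  continuousSessionMinutes: 45,
});
warnings.forEach((w) => console.log(w.message));
```

How those warnings are surfaced matters as much as detecting them: a gentle, dismissible nudge respects the user far more than a guilt-inducing interruption.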

It’s a tough challenge. Data is the new oil, and businesses are building whole industries from mining this data and selling it on to those who can mine it further. But a sea change is happening.

The tide is turning

Individuals are becoming savvier when it comes to their data, and are seeking a better deal than simply handing it over to organisations willing to exploit it to alter their behaviour. The exodus of Fitbit users following Google’s acquisition highlights that people prefer to do business with companies they can really trust. The opportunity for business growth is rife in this space, and it’s one that can benefit both users and the organisations that take this path.

I’d like to believe the natural progression of this debate will be a charter that we as UX professionals can all abide by and that companies can subscribe to, one that shows we’re serious about championing the user, not just exploiting them.

This is only the tip of the iceberg, and there is a far longer debate to be had than this post allows. As a starting point, I know that the next time I run a client workshop I’ll raise these deeper questions, urging my clients to understand the consequences of the decisions we make and to humanise the individuals who use our products. It can only help in building genuinely positive human experiences.
