Whatsapp generally infuse well-thought-out UX into their apps and champion user privacy, so I was a little taken aback by the latest change to their terms of service and privacy policy, and even more so by how they went about hiding some critical options in their iOS app.
I would classify myself as a heavy Whatsapp user, so I know my way around their iOS app and pick up on the minutiae of the changes they make from release to release. Mid-morning on Friday I was presented with an interstitial screen announcing a change to the Whatsapp T&Cs, along with a honking great big “Agree” button. Directly below was another button offering to read more about these changes. Being a sucker for legalese and reams of tiny text, I pressed the “Read more…” button.
That’s when I saw the following screen and a critical hidden option …
So along with the full terms of service and privacy policy behind a “Read more…” call to action, there is, surprisingly, also a hidden option related to sharing your account information with the Facebook group of companies. You can opt out at this stage, but the default, of course, is set to share. As a seasoned UX researcher who has done plenty of user testing, I know first-hand that most users will agree to legal T&Cs without reading them, and that most never change the defaults they’re presented with.
To add to this, there doesn’t seem to be an option to toggle this setting later on iOS, so once you agree, you’ve pretty much agreed for good. (Android even seems to have an FAQ post outlining how to opt out, but you have to do it within 30 days of opting in.) If you’re a new user, it’s simply baked into the new terms of service.
A Dark UX Pattern… and our privacy
I would classify this as a dark UX pattern: options are effectively hidden, opting out of relinquishing your information requires a very deliberate choice, and the app is ultimately not transparent about some pretty critical privacy settings.
I’ve seen my fair share of similar dark patterns, from retailers automatically adding related accessories to your basket after you add your main item, to extra mandatory costs only surfacing at the very end of the checkout journey. I just find it a bit more disconcerting when it’s my personal account information being used. It’s also not entirely clear what “Whatsapp account information” actually covers. They state that message contents and phone numbers are not being shared, but as we know, account usage information or activity metadata alone can tell a very compelling story.
I understand that Whatsapp is a commercial application, and that user research data is instrumental in understanding behaviours and building better application experiences. However, by hiding these important options they have undermined users’ trust. If I’m going to pay the price of a free app with my data, I want to know exactly what data and how they’re going to use it.
As you may have guessed, I didn’t opt in.
Since Facebook’s acquisition of Whatsapp, I had been expecting some tighter integration with their product suite – I’m surprised it didn’t happen sooner. Facebook have been very cavalier with questionable defaults on new features when it comes to data sharing and privacy, so it seems the same stance has now surfaced at Whatsapp.
I’ll continue to use Whatsapp for the time being, but it’s making me rethink messaging and whether I should switch to a more secure app built around privacy, like Wickr.
In case you’re wondering – I was running Whatsapp version 2.16.9 at the time of writing.