In a tech-driven world, our economy runs on data. It has surpassed oil as the world’s most valuable resource, and we’re handing it over more readily than ever. Designing for privacy and data protection has never been so important, and there are some interesting trends emerging as a result.
Consumers don’t mind sharing their data, given the right protections
When it comes to designing for privacy, one of the most influential factors is the end user’s attitude. How do we, as consumers, really feel about protecting our data?
A 2019 survey conducted by the Center for Data Innovation found that 58% of Americans are willing to allow third parties to collect at least some of their personal data, including biometrics and medical data.
That number jumps to 70% who are willing to share even more personal data with the organizations they interact with online, so long as they believe that the exchange will benefit them.
Despite numerous major data breaches in recent years, it seems that we’re not deterred from sharing our data.
We’re aware of the dangers: 85% of consumers say that cybersecurity and privacy are among the biggest risks facing society. However, we’ve also grown accustomed to the fact that, in many cases, our data is the price we pay for a more personalized, user-friendly experience. And that price is often non-negotiable.
However, we do care what happens to our data after we hand it over. Consumers believe that companies, not the government, are primarily responsible for keeping their data safe. If they don’t trust that a company is handling their data responsibly, 87% of those surveyed say they’ll take their business elsewhere.
The focus seems to be on fair value exchange. Transparency and trust are crucial, and design and data protection must go hand in hand. Designers can give users the power to control their data and help create an experience that users can trust, if they're willing to do the work.
So how can designers change the future of data sharing? Let’s explore some of the most significant trends in privacy-aware design:
- Perceptions of control
- Blockchain (yes, it’s still a thing)
- The role of copy in data protection
- Privacy’s default setting
1. Help customers answer “What’s happening to my data?”
Just 10% of consumers feel that they have control over their personal information. That's a worrying statistic, but it's also a great opportunity for brands to differentiate.
As global research firm Gartner, Inc. predicts, brands that implement user-level control of marketing data will reduce customer churn by 40% and increase lifetime value by 25% by 2023. Think of choosing whether you want Facebook to show you clothing ads, or whether to turn off Facebook's access to your WhatsApp messages.
Brands will increasingly seek to set themselves apart by handing data authority back to users. But what constitutes control when it comes to personal data?
One way of empowering the user is to arm them with knowledge by letting them know exactly what data is being collected, and what the company is doing with it. After all, no one likes finding out after the fact that they’ve been broadcasting their personal data this entire time.
At the end of 2018, Google launched a new feature that aims to put user privacy and security “front and center”. Integrated directly into Google Search, this new feature makes it easier for the user to manage their data settings.
As Google puts it, “Before today, if you were searching on Google and wanted to review or manage this data, the best way for you to do that would have been to visit your Google Account. Now, we’re bringing these controls to you—from directly within Search, you can review or delete your Search activity and quickly get back to finding what you were searching for.
Without ever leaving Search you can now review and delete your recent Search activity, get quick access to the most relevant privacy controls in your Google Account, and learn more about how Search works with your data.”
Keeping the user informed is one way of giving them more control. However, it’s still extremely difficult for the everyday consumer to really grasp the volume of data we’re sharing, and what this data is worth.
One brand seeking to change that is LOOMIA, the producer of a soft circuit system that can be integrated into textile products. They realized that, when sewn into clothing, the LOOMIA electronic layer could be used to collect data—of the sort that could be extremely valuable to fashion companies.
However, they didn’t want to become yet another company storing the personal data of their users. As CEO Janett Liriano explains: “People know that businesses are making a lot of money from them. Consumers want to leverage their own data. They don’t want big businesses to be in control of it.”
Instead, they came up with the LOOMIA Tile, a device that can be stitched into the seams of clothing and connected to sensors that gather information from the wearer. What happens to this data is entirely up to the user, and if they choose to share, LOOMIA will pay them back in crypto tokens.
To share their data, the wearer scans the Tile with their phone and submits the information to a decentralized cloud storage service. Brands then pay users for their data in LOOMIA's cryptocurrency tokens, which they can spend on goods via the Tile app.
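The exchange described above can be sketched in a few lines. This is a hypothetical illustration of the general pattern, not LOOMIA's actual implementation; all names and the token rate are invented for the example.

```typescript
// Hypothetical sketch of a user-controlled data-sharing flow like the one
// LOOMIA describes: nothing leaves the device unless the wearer opts in,
// and sharing is rewarded with tokens. All names here are illustrative.

interface SensorReading {
  timestamp: number;
  metric: string; // e.g. "temperature", "movement"
  value: number;
}

interface ShareResult {
  shared: boolean;
  tokensEarned: number;
}

// The user, not the company, decides whether readings are submitted.
function shareReadings(
  readings: SensorReading[],
  userOptedIn: boolean,
  tokensPerReading: number
): ShareResult {
  if (!userOptedIn) {
    // Default: data stays on the device and no record is created.
    return { shared: false, tokensEarned: 0 };
  }
  // In the real product this step would submit to decentralized storage;
  // here we only model the reward side of the exchange.
  return { shared: true, tokensEarned: readings.length * tokensPerReading };
}
```

The important design choice is the shape of the default: when the user does nothing, nothing is shared and nothing is stored.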
2. Use blockchain technology as an alternative to “the Cloud”
Despite its advantages, the ominous-sounding, all-encompassing Cloud often makes headlines for all the wrong reasons. You’re probably already familiar with a number of the high-profile Cloud security breaches affecting the likes of Microsoft, Dropbox, and Yahoo.
Then there are the more personal cases. Last year, the story of an Amazon Echo recording a couple’s conversation and subsequently sending the audio to a third party—not by request—raised major concerns about how smart devices capture and handle our most private data.
As AI and voice technology become more entwined in our daily lives, it's critical for brands to design for trust. For users who aren't comfortable with their data being sent to the Cloud, blockchain is emerging as an appealing alternative.
The global blockchain technology market is projected to be worth $20 billion by 2024, and we're seeing a new wave of brands designing it into their products.
For voice technology company Snips, blockchain holds the key to a more trustworthy user experience. Rather than sending data to the Cloud, Snips relies on blockchain to create decentralized voice assistants that run completely on-device.
As Rand Hindi, Snips co-founder and CEO, explains: “Consumers are increasingly aware of privacy concerns with voice assistants that rely on cloud storage—and these concerns will actually impact their usage. However, emerging technologies like blockchain are helping us to create safer and fairer alternatives for voice assistants.”
With more and more companies considering blockchain as a data privacy solution, this burgeoning technology will increasingly shape the way the products of the future are designed and built.
3. Write plain English privacy policies
When it comes to data protection, brands must pay attention to how they communicate with the user. As it currently stands, 71% of consumers find companies' privacy rules difficult to understand, signaling a huge need for improvement.
All good designers know that microcopy is a crucial part of the product design experience. Just as lousy microcopy frustrates users, poorly written privacy policies are simply unacceptable in today's climate.
More and more, brands are moving away from jargon and legalese in favor of simple, digestible language and logical information architecture.
And it’s not just about wording. Just as companies seek to make their written content more accessible, they will also experiment with different ways of presenting the most important privacy information.
4. Enforce privacy as a default setting
The concept of privacy by design is gaining traction. As we grow ever wiser to dark patterns, privacy and data control are increasingly being built in from the very first wireframe.
Traditionally, data has been something of an afterthought: a topic that came up only once a product was about to be built. Under the GDPR, however, designers must be data-aware from the get-go.
Primarily, designers need to know what data will be collected and shared by the product they’re designing—and where this data collection process fits into the user journey. At what point will the user be notified and asked for their permission? What screen will this appear on, and what wording will be used?
Designers must ensure that each and every aspect of the data collection process is clear and user-friendly, and this means incorporating it right from the ideation stage.
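One lightweight way to make those questions concrete is to keep a simple data inventory alongside the design files: what is collected, why, and where in the journey consent is requested. This is a hypothetical sketch of such an artifact, with invented example entries:

```typescript
// Hypothetical data inventory a design team might maintain from the first
// wireframe: what is collected, why, and where permission is asked.
interface DataPoint {
  name: string;          // what is collected, e.g. "email address"
  purpose: string;       // why the product needs it
  consentScreen: string; // where in the user journey permission is requested
  shared: boolean;       // is it passed to third parties?
}

const inventory: DataPoint[] = [
  {
    name: "email address",
    purpose: "account creation and login",
    consentScreen: "sign-up form",
    shared: false,
  },
  {
    name: "location",
    purpose: "personalized recommendations",
    consentScreen: "onboarding, step 2",
    shared: true,
  },
];

// Anything passed to third parties should be auditable at a glance.
const sharedWithThirdParties = inventory.filter((d) => d.shared);
```

A list like this answers, for every screen, exactly which data is in play and when the user was asked, which is precisely the clarity the GDPR expects designers to build in.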
One of the biggest trends in privacy by design is the switch to making the user opt in, rather than out.
By explaining to users from the get-go that sharing their data is optional, and they can very easily choose not to do so, you’re giving them real freedom of choice—without putting the onus upon them to search for ways out of this big data mess we’re in.
Just recently, Google was fined $57 million by CNIL, the French data privacy watchdog, for lack of GDPR compliance. One of the two core privacy violations cited was Google's failure to validly and unambiguously gain users' consent to process their data in order to personalize ads. In short, Google was using your data behind your back to show you freakishly relevant ads.
As defined by the GDPR, “consent is unambiguous only with a clear affirmative action from the user (by ticking a non-pre-ticked box for instance).”
Facebook is another company that has been called out for using pre-highlighted buttons in order to guide the user’s choices.
The GDPR explicitly forbids pre-selected boxes that can easily trick the user into giving their consent. And, as the GDPR cracks down on such shady design patterns, designers will need to reconsider their approach to user interface design.
This includes ensuring that all privacy options are visible above the fold, doing away with pre-ticked boxes, and using unambiguous wording when asking users to grant or refuse consent.
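In code, GDPR-style opt-in consent reduces to two rules: every option starts unchecked, and consent is only recorded on an explicit user action. A minimal sketch, with invented option names:

```typescript
// Minimal sketch of opt-in consent state. Two rules: every option
// defaults to unchecked (no pre-ticked boxes), and consent is recorded
// only when the user takes a clear affirmative action.

interface ConsentOption {
  id: string;
  granted: boolean; // must start as false
}

function defaultConsent(ids: string[]): ConsentOption[] {
  // Privacy as the default setting: nothing is pre-selected.
  return ids.map((id) => ({ id, granted: false }));
}

function grantConsent(options: ConsentOption[], id: string): ConsentOption[] {
  // An explicit action grants exactly one option; the rest stay unchecked.
  return options.map((o) => (o.id === id ? { ...o, granted: true } : o));
}
```

Keeping the default state and the affirmative action as separate code paths makes it hard for a later redesign to quietly reintroduce a pre-ticked box.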
What all of these trends boil down to is the importance of privacy as a default setting; privacy that is inherent in every aspect of the product or service.
Today’s designers have a responsibility to make privacy and transparency a core component of the user experience—be it for an app, a voice-activated speaker, or a service. Privacy and data protection can no longer be an afterthought; they must inform the design process from day one.