I recently read The Birth and Death of Privacy on Medium. It’s a couple of years old, but that doesn’t decrease its value. As someone who’s “pro-privacy,” I love reading content like this—particularly when it’s well researched, thoughtful, and captivating.
That said, the conclusion drawn by the author is fundamentally incorrect.
In this article, I’ll explain why privacy is neither dead nor alive, and why privacy’s “new look” matters to people like us—the people who design products and services for folks all over the world.
Let’s get started.
When I was 15
Do you remember when MySpace was really cool? You could customize your profile to suit the day. You could write (or copy) little snippets of code. It was a fun and slightly odd time.
This was when I got my first taste of consequence as it relates to privacy. Consequence is a really interesting topic that isn’t discussed nearly enough.
A friend of mine set up my profile. I was mostly into sports at the time—I didn’t play video games, and I didn’t spend more than a few minutes every few days online. So it took some convincing for me to join the party.
I trusted my friend to set things up, hand over the details, and let me do my thing.
Well, turns out he also handed the details to someone else who managed to “customize” my profile rather inappropriately.
“The future of privacy has to start with designers.”
The details of my bio can’t be repeated. The images certainly can’t be shared. All I can say is that this fairly public profile led to a negative perception of yours truly.
I managed to get everything sorted out within a few hours, but there was some damage done.
This was the first time I’d ever really cared about privacy and my ability to control what I did and didn’t choose to share.
But I only cared because of consequence. If nothing bad had ever happened, I never would have learned. This reflects much of the market, not just my 15-year-old self.
For the past decade, my relationship with privacy has continually evolved. Here’s a brief timeline to give you some context. Go easy—I’m not a visual designer.
I now have a strong relationship with privacy. I’ve progressively developed an understanding of it, too.
My experience tells me privacy isn’t simply hiding in the shadows. It isn’t erecting walls to create rooms. It isn’t dead or alive. Privacy is a form of power.
In 2017, and I believe for the foreseeable future, privacy should be considered the controlled release of information for a purpose you agree to.
In a world largely governed by data, controlling what data you do and don’t share is power. You should have that power.
“Privacy is a form of power.”
This is not a story of black or white. It’s dangerous, particularly for us designers, to tell that story. Privacy is every color of the rainbow. Yet, if we are to design for a future in which people have power, we need new tools and approaches. We need to fundamentally change the way we think about personal data.
The future of privacy has got to start with us designers. The best part is, we actually have the power to make this happen. We are in the position to empower the people we serve as customers.
It all starts with design.
The 3 rules of designing with personal data
I originally externalized these rules in a TechCrunch article. I then followed up with an InVision DesignTalk, which you can watch below.
These rules helped inform how I—and the teams I work with—have been designing with personal data for the last few years.
Although our approach has evolved and matured, here’s how I originally explained the rules.
Rule 1: Acquire data progressively—and only when genuinely needed
The data you require to fulfill your value proposition must match the context and stage of the relationship.
If someone wants to take a guided tour, find ways to enable them to browse anonymously. Then support them with specific, action-oriented and value-generating onboarding when the time is right. Lead them on a pathway to success, and empower them to utilize their data to help realize this outcome.
To put a commercial spin on this rule, focus on the metrics that matter. Lifetime value (LTV) means more than the number of signups this December.
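Progressive acquisition can be made concrete in code. Here’s a minimal sketch (all names and stages are hypothetical, chosen for illustration): each stage of the relationship declares the minimum data it genuinely needs, and any request that over-reaches for the current stage is rejected.

```python
# Hypothetical stages mapped to the minimum data each one justifies.
STAGE_REQUIREMENTS = {
    "anonymous_browse": set(),                  # guided tour: no data needed
    "onboarding": {"email"},                    # just enough to create an account
    "checkout": {"email", "shipping_address"},  # needed to fulfill an order
}

def allowed_to_request(stage: str, fields: set) -> bool:
    """Return True only if every requested field is justified at this stage."""
    return fields <= STAGE_REQUIREMENTS.get(stage, set())

# Asking for an email during an anonymous tour over-reaches; at checkout it's fine.
print(allowed_to_request("anonymous_browse", {"email"}))            # False
print(allowed_to_request("checkout", {"email", "shipping_address"}))  # True
```

The design choice here is that the stage, not the product team’s appetite, sets the ceiling on what can be requested.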
Rule 2: Clearly state your purpose
You are the temporary custodian of the personal data you intend to utilize. To achieve your business objectives, it’s critical that you maximize the likelihood a person grants explicit consent, through an affirmative action, for you to use their information.
To do this, use plain, human language (or visual references) that clearly states the exact purpose for which the data will be used.
No one likes nasty surprises. Start earning trust through radical transparency.
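One way to operationalize this rule is to treat consent as a record that pairs a data field with its plain-language purpose, and that only becomes valid when the person takes an affirmative action. This is a hypothetical sketch, not any particular consent-management API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_field: str                      # e.g. "email"
    purpose: str                         # plain-language purpose shown to the person
    granted: bool = False                # False until an affirmative action occurs
    granted_at: Optional[datetime] = None

    def grant(self) -> None:
        """Record the person's explicit, affirmative act of consent."""
        self.granted = True
        self.granted_at = datetime.now(timezone.utc)

consent = ConsentRecord("email", "We'll email your receipt. Nothing else.")
assert not consent.granted   # silence is not consent; no pre-ticked boxes
consent.grant()              # the person actively opts in
```

Note that the default state is “not granted”: the record structurally rules out nasty surprises like consent assumed by omission.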
Rule 3: Give back
Personal data is most likely a significant liability for your organization. Wouldn’t it be better to acquire the right data at the right time, without the need to hold the liability?
Of course it would.
So consider giving back data as a design rule and practice. Empower your customers to engage with you in a multi-directional data exchange that creates shared value.
“Minimum data, maximum utilization.”
Think of it like this: If giving back data is embedded into your onboarding journey, and the customer has the ability to control and utilize that information, you can ask to make use of it at appropriate times.
Better yet, if the customer moves (or changes any other key life status), they can simply update their address and choose to share that updated address with you.
I always think, “Minimum data, maximum utilization.” If you keep this in mind and start that journey by giving, you’re likely to get a whole lot more in return.
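The “give back” exchange can be sketched as a customer-controlled store that the organization queries when it needs a value, rather than warehousing a stale copy as a liability. Everything here is hypothetical and illustrative, not a real PIMS API:

```python
class CustomerVault:
    """A customer-controlled data store an organization can request from."""

    def __init__(self):
        self._data = {}
        self._shared_with = {}  # field -> set of orgs allowed to read it

    def update(self, field, value):
        """The customer updates their own record, e.g. after moving house."""
        self._data[field] = value

    def share(self, field, org):
        """The customer chooses to share a field with an organization."""
        self._shared_with.setdefault(field, set()).add(org)

    def request(self, field, org):
        """An organization reads the *current* value, only if sharing is granted."""
        if org in self._shared_with.get(field, set()):
            return self._data.get(field)
        return None  # minimum data: no grant, no access

vault = CustomerVault()
vault.update("address", "1 Old Street")
vault.share("address", "acme")
vault.update("address", "2 New Road")    # customer moves; one update, everywhere
print(vault.request("address", "acme"))  # the org always reads the fresh value
```

Minimum data, maximum utilization: the organization never holds the address at all, yet always has access to the current one for the purpose it was granted.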
People want to feel in control. People want power. When people feel this way, they’re actually more open and trusting (opens PDF). They share more data as a result.
This is where new approaches must come into play. The simple rules above can give us the foundation to give people the power of controlling what they do and don’t share. These rules can help us earn trust, gain access to the right data at the right time, and use the access we’ve earned to deliver superior outcomes.
The Classification of Everyday Living sits at the heart of the COEL standard, which separates directly identifying personal information (“who you are”) from behavioral information (“what you do”). This model helps organizations make responsible use of people’s personal data. Image: Coelition.org.
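The separation the caption describes can be illustrated in a few lines. This is only a sketch of the who-you-are / what-you-do split, not the actual COEL specification or API: the two kinds of data live in different stores, linked only by a pseudonymous ID, so behavioral data alone identifies no one.

```python
import uuid

identity_store = {}   # pseudonym -> directly identifying information ("who you are")
behavior_store = []   # pseudonymous behavioral events only ("what you do")

def register(name, email):
    """Create a person, keeping identifying details out of the behavioral store."""
    pseudonym = str(uuid.uuid4())
    identity_store[pseudonym] = {"name": name, "email": email}
    return pseudonym

def record_event(pseudonym, activity):
    """Log a behavioral event against the pseudonym only."""
    behavior_store.append({"who": pseudonym, "did": activity})

p = register("Ada", "ada@example.com")
record_event(p, "morning run")

# Analysts can work on behavior_store without ever touching a name or email.
assert all("name" not in e and "email" not in e for e in behavior_store)
```

The linkage key is held separately, so access to behavioral insight and access to identity become two distinct decisions rather than one bundled risk.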
This isn’t to say people don’t want to remove themselves from certain decisions. People do not want to manually approve or decline every single data access request that comes their way—that isn’t sustainable. Even the most bullish amongst us would give up if that were the case.
Cozy Cloud is a French Personal Information Management Services (PIMS) startup. They are one of many in what is a rapidly growing market offering personal data services to both people and organizations. Image: Cozy.io.
This is where new tools must come into play. This is where personal data startups, those that give people tools to control and make use of their data at scale, will really begin to shine.
These services are springing up all over the place. Big brands are taking note too. Once these new tools are in the hands of more people, and brands evolve their data models, the mutual value of empowered data-sharing (opens PDF) will be realized.
“The future of privacy is in the hands of design.”
There’s plenty of content available on all of these topics, so let’s get right to the heart of what matters.
Saying privacy is dead is misguided. It isn’t a biological entity. It isn’t alive, nor is it dead. In 2017, privacy is controlling the release of information based on context and appropriate value exchange within an environment of trust.
If people are empowered within an environment of trust, if they’re armed with new tools, and if we adopt new approaches, we can do more with data. We can solve global problems. We can create better real-life human outcomes. We can share, or choose not to, in ways we are yet to imagine.
This is something we all want. This is a future that is fundamentally better than one where we have no control, where we are powerless.
The future of privacy is in the hands of design.
If you’re interested in the tools and approaches we now use to empower people with their data, contact us or check out Designing for Trust: The Data Transparency Playbook, launching September 16. This is a labor of love, sweat, and just a few tears. Brought to you by >X and the Data Transparency Lab.
Do you think privacy is dead? Chat with other designers about it in Community.