Privacy Inequality, Part Two

We all suffer inequalities in one way or another.
Maybe you’ve been lucky and had all the opportunities in the world. But most people haven’t. There are tremendous disadvantages in life; in the end, that’s life. And at some point, we’ve simply accepted them.
Privacy inequality is going to be by far the worst one, and we’re not noticing it. Instead, we’re accepting it through our actions. But we can’t fall into the same trap and accept this one. There’s no way to picture a great future without fighting back for privacy.
When we think about the future, we can’t imagine 95% of what will happen. We just can’t. Our ability to predict stops working where our imagination ends. We can’t predict the consequences of a society without privacy, simply because there are so many external factors we don’t control.
But I’m going to go out on a limb here and try to predict what privacy inequality will look like in the future. (I hope to be wrong, but it doesn’t seem that way.)
First, what is privacy inequality?

Privacy inequality is the gap between individuals in a society where privacy becomes the source of economic, opportunity, and democratic inequalities. But unlike any other inequality, this one goes to the extremes.

Privacy inequality has several characteristics, but the most notable ones are:

  1. It steals our most basic value as human beings.
  2. It dehumanizes. It’s cruel, because it hits hardest the people who will suffer most from it, creating two castes of people (those who can pay for privacy and those who can’t).
  3. Its damage is exponential, because the value of data increases over time. It gets worse and worse.

In the sections that follow, we’ll try to grasp the magnitude of privacy inequality by analyzing each of these characteristics and its symptoms.

1. Our value as human beings

This is a fundamental point we need to understand, accept and fight for. As I wrote in my manifesto:

Privacy is a human right. It is the right to own your own value as a person. As Edward Snowden said, “Privacy is the right to a free mind. Without privacy, you can’t have anything for yourself.”

Last year The Economist declared that “the world’s most valuable resource is no longer oil, but data”. Now it seems that everybody knows it. In just a couple of years we’ve accepted it and moved on. We’ve taken it for granted. But companies keep pushing to the edge, mining more data from us than ever before.

We’ve got AI assistants (smart speakers constantly listening and monitoring everything about us). We’ve got surveillance machines in our pockets working 24/7. Machines that report to all sorts of companies (every app, free or paid, asks for something in return: GPS location, microphone access, text messages. Back up your WhatsApp messages and you don’t just share your data with Facebook, but with Google or whoever else shows up).
We’ve got all sorts of weak points and we don’t really care about them.
So we need to ask ourselves two questions: (1) Why is privacy a human right? And (2) why is data such an important ingredient of privacy inequality?
Bryan Johnson explained this in his article Your Data is Your Property. The economic value you and I have resides in our data and our cognitive abilities.
For the purpose of this article, let’s focus on the data part (we’ll talk about cognitive abilities and their value another time), which we can separate into two parts: (1) your raw data, and (2) the predictive value of that data.
Simply put, your raw data is everything about you: your age, location, preferences, how tall you are, go down the list. The tricky one is the predictive value of your data. When people say that Facebook knows you better than your significant other (and even yourself), they mean it can predict what you’re going to do before you, or your significant other, are aware of it.
That’s so powerful. When you get tons of data from a person and put the right algorithms to work, you can start getting interesting (or scary) results.
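To make that concrete, here’s a minimal sketch of how raw data turns into predictive value. Everything in it (the features, the labels, the numbers) is invented for illustration; real systems use thousands of signals and far more sophisticated models.

```python
# A toy sketch: turning raw personal data into a prediction about behavior.
# All data here is invented for illustration.
from sklearn.linear_model import LogisticRegression

# Raw data: one row per person (age, hours online per day, purchases last month).
raw_data = [
    [34, 2.5, 1],
    [21, 6.0, 4],
    [45, 1.0, 0],
    [29, 4.5, 3],
]
# Observed behavior we want to predict: did they click the targeted ad?
clicked_ad = [0, 1, 0, 1]

model = LogisticRegression().fit(raw_data, clicked_ad)

# The "predictive value" of data: a probability of future behavior
# for someone the model has never seen before.
new_person = [[25, 5.0, 2]]
print(model.predict_proba(new_person)[0][1])  # probability this person clicks
```

The toy model isn’t the unsettling part. The unsettling part is that the same loop, fed millions of people and thousands of signals, predicts behavior its subjects aren’t aware of yet.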
Privacy becomes a human right the moment someone or something invades your personal space (either physically or digitally) in order to extract monetary value from you.
That’s our value as human beings, and companies and governments steal that value from us. What do we do? We just give it away, because we have no idea of our own value as human beings.
This is the main source of privacy inequality. The less privacy we have, the bigger the gap.
And the gap takes its worst shape when people can’t protect themselves from it.

2. The value of data increases over time.

The inequality gap gets bigger and bigger over time.

I have a one-year-old niece. And since she was born I’ve noticed what people do when they have babies: share lots of pictures and videos. I don’t mean on social media, but through messaging platforms like WhatsApp. And that data is stored on someone else’s server.
My niece doesn’t live nearby, so when I can’t go visit her and her parents, I like to see pictures and videos of her. But what happens with the data we send through WhatsApp? Could it ever turn against my niece in the future?
Of course, I think a lot about that now. It seems impossible that an app that lets you keep in touch with your loved ones and share your most precious moments could ever become dangerous to you. But in the end, it’s a business that will exploit every little vulnerability it can in order to suck monetary value out of you.
This is the danger of dealing with data over time. Especially with AI getting better every single day.
If you pick a recording from a WhatsApp conversation and analyze it yourself, you can make a fairly accurate identification of that person’s profile: age, gender, language, probably education, and a bunch of other things. A few years ago, feeding that same recording to an AI algorithm wouldn’t have been of much use. But today you can predict a person’s mental health by running a recording of their voice through an algorithm.
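To give a sense of how little code that takes, here’s a minimal sketch of the feature-extraction step behind this kind of voice analysis, using the open-source librosa library. The file name and the trained_model at the end are hypothetical; the point is that a voice note reduces to a handful of numbers a model can learn from.

```python
# A minimal sketch of voice profiling, for illustration only.
# "voice_note.wav" and the classifier at the end are hypothetical.
import librosa
import numpy as np

# Load the voice note and extract MFCCs, a standard summary of vocal timbre.
audio, sample_rate = librosa.load("voice_note.wav")
mfccs = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=13)

# Collapse the whole recording into one fixed-size feature vector.
features = np.mean(mfccs, axis=1)
print(features.shape)  # (13,): thirteen numbers describing a voice

# A real system would feed `features` into a model trained on labeled voices:
# prediction = trained_model.predict([features])
```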
There are so many things you can do with data. Today. But, what could be done tomorrow with that same data? If today we can predict the mental health of an individual through a voice recording, what will we be able to do with that same data five years from now? Or 10 years?
We certainly didn’t think about this 10 years ago when we started giving away our data for cheap. Who could’ve thought that the funny pictures or messages we posted on Facebook could at some point turn against us?

3. Dehumanizing people: The Protected and The Predictables

What’s happening now isn’t just that data gains value over time. It’s also that there are more extraction points than ever before, and their number grows every day. So when you mix predictive data, whose value increases daily, with an ever-growing volume of it, the result is inevitable.

Here’s my prediction: at some point this situation will create two castes of people: The Protected and The Predictables.

Well, there are already two castes of people, with many subgroups in between, but today the difference can be spotted only at the edges. What privacy inequality will do is push these groups to the extremes. The ones with lots of opportunities and resources will be given more. And the ones with few opportunities and resources will have even more taken away.

Again, we’re unable to predict the future, but with this information in place, it’s easy to see how this inequality is going to hit us:

  • Overpriced insurance
  • Bank loan denials (or high interest rates)
  • Unaffordable healthcare
  • Highly targeted advertising
  • Political manipulation
  • Simple opportunity cost.

In the future, these two castes will be The Protected and The Predictables. The Protected are the ones who can afford to pay for privacy. This group understands that privacy is their very human value and wants to protect it. The Predictables, however, are the ones who can’t pay for privacy. They are the ones who live inside a Matrix-style algorithm. Maybe they know privacy should be a human right, but they will probably fail to understand its value and the impact it has on them.
Privacy inequality is the most brutal one because it acts like a global filter that intensifies economic, opportunity, and democratic inequalities. It dehumanizes people and takes the instability of society to a whole new level, abusing the very people who suffer from it.
At some point (unless we solve this thing once and for all) they’ll tell you: “You want privacy? Here it is, pay for it.” Just like a subscription service such as Spotify or Netflix, you’ll pay for privacy: something that legitimately belongs to you and shouldn’t be for sale. But that’s what they’ll do.
And if you think data, the new oil, is reaching its peak, you’re very wrong. This has just gotten started.
Companies and governments know that the biggest inequality is yet to come. And they also know that if they can get ahead and mine more data than anybody else, they might have a shot at ruling the market.

It’s already happening. We’ve got to act now.

Today this scenario seems far away, and most people will fail to understand the problem at hand. But this isn’t just Hollywood-movie material. The problem is already here; we just don’t see it coming.
Whether you agree with me or not, if one thing is true about revolutionary times, it’s that change happens faster than we imagine, disrupting everything to a degree you can’t foresee.
Let me show you just a few things happening today that are shaping the infrastructure for privacy inequality to settle in.

1. More and more data mining points

This is a no-brainer. If data is our most precious asset and its value increases over time, it makes sense, from a business point of view, to have more mining sources.
For example, it’s fascinating how Xiaomi has entered the tech market. I mean, tech applied to everything. Take a look at the variety of products they sell: smartphones, scooters, security cameras, even electric toothbrushes. It’s no coincidence that all these products are connected.
I’m not just picking on Xiaomi; most brands (want to) do the same. Because whoever controls the most data mining points might rule the market tomorrow.
Control of those data points can come from all sorts of gadgets, or from diversification into other data mining territories. Privacy inequality will produce scenarios such as being denied a loan because they can predict things you don’t even know about yourself. It’s not difficult to imagine, then, why Facebook is looking to make deals with banks.
(A little aside here: banks can go unnoticed, but they use all sorts of techniques to extract your data. Now they’re using “behavioral biometrics” to track how you type, swipe, and tap.)
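For a sense of how that works, here’s a minimal sketch of keystroke dynamics, one common form of behavioral biometrics: identifying you by the rhythm of your typing. All the timings and the stored profile below are invented.

```python
# A toy sketch of keystroke-dynamics biometrics; all values are invented.
# The idea: the rhythm of your typing is distinctive enough to identify you.

# Timestamps (in seconds) of each keypress while typing a password.
keypress_times = [0.00, 0.14, 0.31, 0.38, 0.62, 0.71]

# Features: the gaps between consecutive keys ("flight times").
intervals = [t2 - t1 for t1, t2 in zip(keypress_times, keypress_times[1:])]

# Compare against the typing profile stored for this account (hypothetical).
stored_profile = [0.15, 0.16, 0.08, 0.22, 0.10]
distance = sum(abs(a - b) for a, b in zip(intervals, stored_profile))

# Small distance: probably the same person. Large distance: flag the session.
print(f"typing-rhythm distance: {distance:.2f}")
```

A real system tracks dozens of such signals (dwell times, swipe pressure, scroll speed) continuously, not just at login.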
Things will get really interesting when biometric data can actually be extracted from everybody.

In a talk at the World Economic Forum in Beijing, Yuval Noah Harari said:

“If too much data gets concentrated in too few hands, humanity will split, not in two classes, it will split in two species, into different species. Now, why is data so important? It’s important because we’ve reached the point when we can hack, not just computers, we can hack human beings and other organisms. […]

“What do you need in order to hack a human being? You need two things: [1] You need a lot of computing power. And [2] you need a lot of data, especially biometric data. Not data of what I buy or where I go, but data of what is happening inside my body and inside my brain. Until today, nobody had the necessary computing power and the necessary data to hack humanity. […]

“Once you have enough biometrical information, and enough computing power, you can create algorithms that know me better than I know myself. And humans don’t really know themselves very well. This is why algorithms have a real chance of getting to know ourselves better.”

It’s not if, it’s when.
The key here is to achieve mass adoption. How do you do that? You can’t do it all at once. It’s got to be smooth, so people don’t see it coming and accept it as a new reality. Where’s the starting point? The adoption of facial recognition technologies.

2. Opening Pandora’s box: Mass adoption of facial recognition

The first step to pushing a change onto a market is to make it so smooth that people don’t notice it. And right now, facial recognition no longer looks creepy.
We’ve opened Pandora’s box with facial recognition. The technology isn’t new; it’s been around for a while. What’s new and surprising is how readily we’ve adopted it, and how Apple pushed (or rather, forced) iPhone users to unlock their phones with their faces. Because they know what’s at stake here.
Facebook, of course, is also pushing this technology, and went as far as telling users in Europe:

“Face recognition technology allows us to help protect you from a stranger using your photo to impersonate you.”

Facebook tried to push facial recognition on European users six years ago, but it was deactivated after regulators raised questions about user consent. Today even 7-Eleven has this technology in its stores. China goes further, as always: police there have started using facial recognition-enabled sunglasses. (Read my personal experience with this technology in China, and how the government found me using facial recognition systems.)

So, once facial recognition spreads to the masses, it’ll be time to push eye-movement tracking, blood pressure monitoring, and anything else that tells them what happens inside your body. Again, today they can read your personality through your eye movements (in fact, my bachelor’s thesis was a study using eye-tracking technology on websites; you’d be surprised how much you can tell from where people look), and they can detect that your blood pressure rises when you’re about to buy something. But what about tomorrow? What will they be able to know?
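As an illustration of how simple the first step of eye-tracking analysis is, here’s a minimal sketch of grouping a gaze stream into fixations, the places where your eyes rest. The coordinates, sampling rate, and distance threshold below are invented.

```python
# A toy sketch of fixation detection from eye-tracking data; values invented.
# Where you look, and for how long, is the raw material of profiling.

# (x, y) screen coordinates sampled every 20 ms (hypothetical gaze stream).
gaze = [(100, 200), (102, 201), (101, 199), (400, 310), (402, 309), (401, 311)]

# Group consecutive samples that stay within a small radius into "fixations".
fixations, current = [], [gaze[0]]
for point in gaze[1:]:
    last = current[-1]
    if abs(point[0] - last[0]) < 30 and abs(point[1] - last[1]) < 30:
        current.append(point)
    else:
        fixations.append(current)
        current = [point]
fixations.append(current)

for i, fixation in enumerate(fixations):
    print(f"fixation {i}: {len(fixation)} samples (~{len(fixation) * 20} ms)")
```

From fixations, a study can map what grabbed your attention, in what order, and for how long; that is where personality and intent inferences begin.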

3. Getting away with it

And the beauty of the deal is that no one is paying attention. No one is paying attention. Because companies are too busy making money and we’re too busy using their “free” stuff. So it’s not surprising that they end up getting away with it.
Consider Facebook. Since 2005, Facebook has been involved in all sorts of scandals, and every single time they get away with it. They masquerade their way out, telling one story to investors and another to the public. Even today, fresh out of the Cambridge Analytica scandal, they’ve patented a concept application that can listen to everything you say, so they can know whether you’re watching an ad on TV or not, among many other things of course.
Or consider when Google silently removed its “don’t be evil” motto. But hey, it’s Google; people use it daily, so they automatically trust it. So when they allow third parties to read your emails… nothing happens.
But it’s not just Silicon Valley.
The telecom lobby isn’t shy either. There’s a whole network of spy hubs managed by AT&T for the NSA in eight US cities. And it’s worth considering that AT&T doesn’t just have its clients’ data, but also that of any company operating on its network.
Again, no one is stopping them from doing whatever they want. In the end, the people with the power to stop them too often have interests involved. We can’t trust them.
I always say this, and I’ll repeat it as many times as I have to: the main goal of a public company is to create profit for investors. That’s why the company went public in the first place. But, again and again, these public tech companies make two promises, one to investors (“I’ve grabbed people by the balls, now it’s time to make some money”) and another to the public (“We want to make the world a better place”). Well, someone always ends up unhappy. It’s no coincidence that Elon Musk wants to take Tesla private again. He couldn’t keep his promise to the people, so he’s doing what he’s got to do; my hat’s off.

Children will grow up without privacy

No one can predict the future. I can’t. You can’t. Experts can’t either. But if you connect the dots, you can make a general prediction of where the future is heading, and see how blatantly companies and governments benefit from all this while the general public, us, gets all the downside.
The worst thing of all? Children won’t know what privacy means. I think about my niece and the way she’ll perceive the world. I’m afraid we could fail as a species and leave them this legacy of surveillance capitalism.
Privacy inequality will push everything to the edges. Some kids will be lucky and might be able to afford privacy protection. Maybe that will mean paying for products that watch over their privacy, or some kind of antivirus, who knows. But other kids won’t be able to pay for it and will choose the “free” option; others just won’t understand their value as human beings and will pick the free version, because it’s a bargain, right?
Alas, they will grow up with inequalities worse than the ones I faced at school, or any you have faced in your entire life.
We seem to forget that for decades we’ve been fighting for freedom and equality. Now we’re throwing it all away, and for what, free dumbass apps?

Edward Snowden, the NSA whistleblower, said in an interview:

“A child born today will grow up with no conception of privacy at all. They’ll never know what it means to have a private moment to themselves, an unrecorded, unanalyzed thought. And that’s a problem, because privacy matters. Privacy is what allows us to determine who we are, and who we want to be.”

The good news is that Hollywood movies, most of the time, have a happy ending where a group of radicals decides to shut down the system and undo that inequality altogether. The bad news is that this is real life, and happy endings don’t come by themselves. We’ve got to make them happen.
Alice Walker once said:

“The most common way people give up their power is by thinking they don’t have any.”

We’ve got more power than we think. We always have. Now it’s time to realize it, make the change we need, and create the ending we want. Maybe we can turn things around after all.
Join The PrivateID Mission.