Introducing The PrivateID Mission, my latest project

[This is the introduction post from PrivateID’s site. There’s more to read there; I especially recommend PrivateID’s Manifesto.]
PrivateID is a non-profit that advocates for privacy equality and data ownership. Our goal is to advance the recognition of data as personal property, in the way most likely to preserve our privacy and integrity as human beings.
We believe privacy should be a human right, and for that to hold, data must be an individual’s property. The outcome of this venture is uncertain and there’s a lot of work ahead of us, but we believe the change we need is urgent and our window is closing faster than we imagine. We hope to get this right for humanity’s sake.


Privacy invasion is old hat. Back in the 1950s, major advertising firms hired anthropologists to “observe” consumers’ behavior. They spied on consumers and measured their reactions to sell more stuff, which led them to include babies in their ads. And if babies weren’t effective, doctors always worked. The truth is we’ve been tracked for decades. Phone providers have been invading our privacy for a very long time. Or consider the way banks have collected our data every time we use our credit cards. All that data might have seemed isolated and without context in many cases, but it gets scarier when conglomerates like Acxiom collect all those little dots of data and put them in context. And that’s not even close to the real problem here.
The reality is that with the exponential growth of technology, these techniques and tools have not only gotten cheaper but far more effective at invading privacy and recording everyone’s daily life. Particularly in the last decade, gathering and mining data have reached a whole new level. The conversation is not just about advertising; that’s just the tip of the iceberg. Today we’re facing two challenging problems. One has been around for a few years; the other is still coming, and it will have a significantly bigger impact.
Recent scandals like Cambridge Analytica, and Trump’s 2016 election, showed that we’re vulnerable: with the right algorithms and access to massive amounts of data, our thoughts can be influenced. After all, we’re emotional machines, so they can press our emotional buttons and treat us like puppets. Deep down, people knew companies had been gathering their data, but no one really cared.
Anyway, what’s the downside of this? There’s enough data out there on us already.

Looking forward

That’s still just scratching the surface, though. We need to dive deeper to see what’s really going on. AI is booming and we’re entering a Virtual & Augmented Reality era, so if we follow that thread it’s not too hard to imagine the future ahead of us.
In addition, neural interfaces are coming. Fast. There are a few companies working on this. Facebook is doing it, and you can bet the other tech giants won’t be left behind. The most notable ones right now are Neuralink (founded by Elon Musk) and Kernel (founded by Bryan Johnson).
The question that immediately arises is: what happens if we continue with our current privacy standards and our thoughts end up out in the open?
Bryan Johnson, the founder of Kernel, started talking about this issue in an article titled Your Data is Your Property, where he claimed that “privacy is the right to own your own value.”
Big tech companies and governments own our data (we’re giving most of it away for free!). But they don’t just own it; they use AI algorithms to extract predictive value from it. With that predictive power, they can know what your thoughts will be, even before you have them. And since they can build algorithms that replicate how you think, it’s only a matter of time before they can replicate a version of you in a virtual world and take over your job. Sounds crazy?
What would happen if a tech giant like Facebook, instead of earning around $250 per user annually on advertising, got your annual salary? That’s a lot of dough to be playing with. They’d be able to create a digital version of you that, within a few minutes, is better at being you than you are (that’s the magic of AI). They could then offer digital avatars as a workforce in a virtual world.
That’s a trillion-dollar market. And as always when there’s that much money at stake, it’s very appealing from a business point of view. They want you in a virtual world. But not the real you: they want an improved version of you, and your annual salary instead of the few bucks from advertising.
With this lack of privacy, we are summoning the demon. It will create the most extreme economic inequality we’ve ever imagined.
We urgently need to tackle this problem now. We can’t rely on governments to force this change; it’s up to us, as citizens, to do it. Laws like GDPR are great, but they won’t work, for two basic reasons: (1) they are not global, and (2) we can’t rely on the good faith of these companies. Regardless of what they say, if they have the power to do something, it might well get done along the way. In the end, it’s not technology we should worry about. Technology is neutral. You can use it to oppress or to liberate.


It’s hard to predict when this change is going to take place, but as long as there is so much money at stake, there will be companies trying to get a bigger slice of the pie. Surprisingly, despite some regulation and media coverage, we’re not even close to changing this, and the hard truth is that nothing is stopping these companies from taking away our most precious human right: privacy. That’s exactly why we are starting The PrivateID Mission.
We believe a non-profit is the right vehicle to fight for this change, without letting money and shareholders dictate our best interests.
We’re hoping to grow PrivateID and create projects that bring us closer to the change we seek. We don’t yet know exactly what form these projects will take, but you can subscribe and we’ll let you know. We hope to see you fighting alongside us.