The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think - by Eli Pariser

How strongly I recommend it: 8/10
See my lists of books for more.

Go to the Amazon page for details.
This book has been one of my favorite reads lately. It explains how customized filters are making us dumber and how big tech companies have a critical impact on the way we consume media. This book is key to understanding the digital world we live in.

Highlights

Behind the pages you visit, a massive new market for information about what you do online is growing, driven by low-profile but highly profitable personal data companies like BlueKai and Acxiom.
The Internet giants’ formula is simple: The more personally relevant their information offerings are, the more ads they can sell, and the more likely you are to buy the products they’re offering.
Our media is a perfect reflection of our interests and desires.
By making everything more personal, we may lose some of the traits that made the Internet so appealing to begin with.
One of the best ways to understand how filters shape our individual experience is to think in terms of our information diet. As sociologist danah boyd said in a speech at the 2009 Web 2.0 Expo:

Our bodies are programmed to consume fat and sugars because they’re rare in nature…. In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole.

And while the premise of personalization is that it provides you with a service, you’re not the only person with a vested interest in your data. Researchers at the University of Minnesota recently discovered that women who are ovulating respond better to pitches for clingy clothes and suggested that marketers “strategically time” their online solicitations. With enough data, guessing this timing may be easier than you think.
At best, if a company knows which articles you read or what mood you’re in, it can serve up ads related to your interests. But at worst, it can make decisions on that basis that negatively affect your life.
You may think you’re the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you’ve clicked on in the past determines what you see next.
What is good for consumers is not necessarily good for citizens.

1. The Race for Relevance

If you’re not paying for something, you’re not the customer; you’re the product being sold. —Andrew Lewis, under the alias Blue_beetle, on the Web site MetaFilter

Google and Facebook have different starting points and different strategies—one starts with the relationships among pieces of information, while the other starts with the relationships among people—but ultimately, they’re competing for the same advertising dollars.
Lock-in is the dark side of Metcalfe’s law: Facebook is useful in large part because everyone’s on it.
The more locked in users are, the easier it is to convince them to log in—and when you’re constantly logged in, these companies can keep tracking data on you even when you’re not visiting their Web sites.

The Data Market

“Think of [Acxiom] as an automated factory,” one engineer told a reporter, “where the product we make is data.”
Kayak makes money in two ways. One is pretty simple, a holdover from the era of travel agents: When you buy a flight using a link from Kayak, airlines pay the site a small fee for the referral. The other is much less obvious. When you search for the flight, Kayak places a cookie on your computer—a small file that’s basically like putting a sticky note on your forehead saying “Tell me about cheap bicoastal fares.” Kayak can then sell that piece of data to a company like Acxiom or its rival BlueKai, which auctions it off to the company with the highest bid—in this case, probably a major airline like United. Once it knows what kind of trip you’re interested in, United can show you ads for relevant flights—not just on Kayak’s site, but on literally almost any Web site you visit across the Internet. This whole process—from the collection of your data to the sale to United—takes under a second.
The champions of this practice call it “behavioral retargeting.”
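
As a rough illustration of the flow Pariser describes above (my sketch, not from the book; all names, bids, and data structures here are hypothetical), the retargeting pipeline can be written out in a few lines of Python:

```python
# Minimal sketch of the behavioral-retargeting flow described above.
# All names, bids, and data structures are hypothetical illustrations.

import uuid

def set_cookie(search_query):
    """A Kayak-style site drops a cookie recording what you searched for."""
    return {"cookie_id": str(uuid.uuid4()), "intent": search_query}

def auction(cookie, bidders):
    """A data broker (a BlueKai-like exchange) sells the signal to the
    highest bidder, who earns the right to target that cookie."""
    return max(bidders, key=lambda b: b["bid"])

def serve_ad(cookie, winner):
    """The winning advertiser shows an ad for that intent on other sites."""
    return f"[{winner['name']}] ad for '{cookie['intent']}' shown across the web"

cookie = set_cookie("cheap bicoastal flights")
bidders = [{"name": "AirlineA", "bid": 0.42}, {"name": "HotelB", "bid": 0.31}]
print(serve_ad(cookie, auction(cookie, bidders)))
# The whole round trip, Pariser notes, takes under a second.
```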
And if you sit down on a Southwest Airlines flight, the ads on your seat-back TV screen may be different from your neighbors’. Southwest, after all, knows your name and who you are. And by cross-indexing that personal information with a database like Acxiom’s, it can know a whole lot more about you. Why not show you your own ads—or, for that matter, a targeted show that makes you more likely to watch them?
TargusInfo, another of the new firms that processes this sort of information, brags that it “delivers more than 62 billion real-time attributes a year.”
Your behavior is now a commodity, a tiny piece of a market that provides a platform for the personalization of the whole Internet.
Behind the scenes, the Web is becoming increasingly integrated. Businesses are realizing that it’s profitable to share data.

2. The User Is the Content

Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. —John Dewey

The technology will be so good, it will be very hard for people to watch or consume something that has not in some sense been tailored for them. —Eric Schmidt, [Former] Google CEO

Unless newspapers could think of themselves as behavioral data companies with a mission of churning out information about their readers’ preferences—unless, in other words, they could adapt themselves to the personalized, filter-bubble world—they were sunk.

A New Middleman

Law professor and Master Switch author Tim Wu says, “The rise of networking did not eliminate intermediaries, but rather changed who they are.”
These platforms hold an immense amount of power—as much, in many ways, as the newspaper editors and record labels and other intermediaries that preceded them.
We’ve given very little scrutiny to the interests behind the new curators.
Bharat tells an interviewer that Google should promote what the reader enjoys reading. But at the same time, overpersonalization that, for example, excludes important news from the picture would be a disaster.
The difference in authority between a blog post and an article in the New Yorker is much smaller than one would think.
“Internet connected TV is going to be a reality. It will dramatically change the ad industry forever. Ads will become interactive and delivered to individual TV sets according to the user,” Google VP for global media Henrique de Castro has said.
“We think basically you watch television to turn your brain off, and you work on your computer when you want to turn your brain on,” Apple founder Steve Jobs told Macworld in 2004.
Because people watch TV passively, they're more likely to keep watching when ads come on. When it comes to persuasion, passive is powerful.
Steve Jobs’s proclamation that computers are for turning your brain on may have been a bit too optimistic. In reality, as personalized filtering gets better and better, the amount of energy we’ll have to devote to choosing what we’d like to see will continue to decrease.
And while personalization is changing our experience of news, it’s also changing the economics that determine what stories get produced.
The only thing that’s better than providing articles that are relevant to you is providing articles that are relevant to everyone. Traffic watching is a new addiction for bloggers and managers—and as more sites publish their most-popular lists, readers can join in the fun too.
While Google and others are beginning to grapple with the consequences, most personalized filters have no way of prioritizing what really matters but gets fewer clicks. And in the end, “Give the people what they want” is a brittle and shallow civic philosophy.
The rise of the filter bubble doesn’t just affect how we process news. It can also affect how we think.

3. The Adderall Society

We have to realize that our idea of what’s real often comes to us secondhand and in a distorted form—edited, manipulated, and filtered through media, other human beings, and the many distorting elements of the human mind.
Like a biased sample in an experiment, a lopsided selection of data can create the wrong impression.
Creativity is also a result of this interplay between mind and environment, so filters can get in the way of innovation. If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it.
First, the filter bubble surrounds us with ideas with which we’re already familiar (and already agree), making us overconfident in our mental frameworks. Second, it removes from our environment some of the key prompts that make us want to learn.
Psychologists call these concepts schemata (one of them is a schema), and they’re beginning to be able to identify particular neurons or sets of neurons that correlate with each one—firing, for example, when you recognize a particular object, like a chair. Schemata ensure that we aren’t constantly seeing the world anew: Once we’ve identified something as a chair, we know how to use it.
We don’t do this only with objects; we do it with ideas as well.
Once we’ve acquired schemata, we’re predisposed to strengthen them. Psychological researchers call this confirmation bias—a tendency to believe things that reinforce our existing views, to see what we want to see.
Experts have a lot invested in the theories they’ve developed to explain the world. And after a few years of working on them, they tend to see them everywhere. It’s not just that experts are vulnerable to confirmation bias—it’s that they’re especially vulnerable to it.
The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that conforms to our ideas of the world is easy and pleasurable; consuming information that challenges us to think in new ways or question our assumptions is frustrating and difficult.
In the bubble, the proportion of content that validates what you know goes way up.
A filtered environment could have consequences for curiosity.
But to feel curiosity, we have to be conscious that something’s being hidden.
The personalized environment is very good at answering the questions we have but not at suggesting questions or problems that are out of our sight altogether.
Stripped of the surprise of unexpected events and associations, a perfectly filtered world would provoke less learning. And there’s another mental balance that personalization can upset: the balance between open-mindedness and focus that makes us creative.

The Adderall Society

Adderall also has some serious side effects. It’s addictive. It dramatically boosts blood pressure. And perhaps most important, it seems to decrease associative creativity. After trying Adderall for a week, Foer was impressed with its powers, cranking out pages and pages of text and reading through dense scholarly articles. But, he wrote, “it was like I was thinking with blinders on.”
On the Internet, personalized filters could promote the same kind of intense, narrow focus you get from a drug like Adderall. If you like yoga, you get more information and news about yoga—and less about, say, bird-watching or baseball.
The search for perfect relevance and the kind of serendipity that promotes creativity push in opposite directions. “If you like this, you’ll like that” can be a useful tool, but it’s not a source for creative ingenuity. By definition, ingenuity comes from the juxtaposition of ideas that are far apart, and relevance comes from finding ideas that are similar. Personalization, in other words, may be driving us toward an Adderall society, in which hyperfocus displaces general knowledge and synthesis.
Personalization can get in the way of creativity and innovation in three ways. First, the filter bubble artificially limits the size of our “solution horizon”—the mental space in which we search for solutions to problems. Second, the information environment inside the filter bubble will tend to lack some of the key traits that spur creativity. (…) Finally, the filter bubble encourages a more passive approach to acquiring information
In a way, the filter bubble is a prosthetic solution horizon: It provides you with an information environment that’s highly relevant to whatever problem you’re working on. Often, this’ll be highly useful: When you search for “restaurant,” it’s likely that you’re also interested in near synonyms like “bistro” or “café.” But when the problem you’re solving requires the bisociation of ideas that are indirectly related—as when Page applied the logic of academic citation to the problem of Web search—the filter bubble may narrow your vision too much.
Some of the most important creative breakthroughs are spurred by the introduction of the entirely random ideas that filters are designed to rule out.
[Another] way in which the filter bubble can dampen creativity is by removing some of the diversity that prompts us to think in new and innovative ways.
Psychologists Charlan Nemeth and Julianne Kwan discovered that bilinguists are more creative than monolinguists—perhaps because they have to get used to the proposition that things can be viewed in several different ways.
Foreign ideas help us break open our categories.
The filter bubble isn’t tuned for a diversity of ideas or of people. It’s not designed to introduce us to new cultures. As a result, living inside it, we may miss some of the mental flexibility and openness that contact with difference creates.
Perhaps the biggest problem is that the personalized Web encourages us to spend less time in discovery mode in the first place.

The Age of Discovery

This shift from a discovery-oriented Web to a search and retrieval–focused Web mirrors one other piece of the research surrounding creativity. Creativity experts mostly agree that it’s a process with at least two key parts: Producing novelty requires a lot of divergent, generative thinking—the reshuffling and recombining that Koestler describes. Then there’s a winnowing process—convergent thinking—as we survey the options for one that’ll fit the situation. The serendipitous Web attributes that Johnson praises—the way one can hop from article to article on Wikipedia—are friendly to the divergent part of that process.
But the rise of the filter bubble means that increasingly the convergent, synthetic part of the process is built in. Battelle calls Google a “database of intentions,” each query representing something that someone wants to do or know or buy. Google’s core mission, in many ways, is to transform those intentions into actions. But the better it gets at that, the worse it’ll be at providing serendipity, which, after all, is the process of stumbling across the unintended. Google is great at helping us find what we know we want, but not at finding what we don’t know we want.
one of the prices of personalization is that we become a bit more passive in the process. The better it works, the less exploring we have to do.
David Gelernter, a Yale professor and early supercomputing visionary, believes that computers will only serve us well when they can incorporate dream logic. “One of the hardest, most fascinating problems of this cyber-century is how to add ‘drift’ to the net,” he writes, “so that your view sometimes wanders (as your mind wanders when you’re tired) into places you hadn’t planned to go. Touching the machine brings the original topic back. We need help overcoming rationality sometimes, and allowing our thoughts to wander and metamorphose as they do in sleep.” To be truly helpful, algorithms may need to work more like the fuzzy-minded, nonlinear humans they’re supposed to serve.
personalized filters can interfere with our ability to properly understand the world: They alter our sense of the map. More unsettling, they often remove its blank spots, transforming known unknowns into unknown ones.
In the filter bubble, things look different. You don’t see the things that don’t interest you at all. You’re not even latently aware that there are major events and ideas you’re missing. Nor can you take the links you do see and assess how representative they are without an understanding of what the broader environment from which they were selected looks like. As any statistician will tell you, you can’t tell how biased the sample is from looking at the sample alone: You need something to compare it to.
As a last resort, you might look at your selection and ask yourself if it looks like a representative sample. Are there conflicting views? Are there different takes, and different kinds of people reflecting? Even this is a blind alley, however, because with an information set the size of the Internet, you get a kind of fractal diversity: at any level, even within a very narrow information spectrum (atheist goth bowlers, say) there are lots of voices and lots of different takes.
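
A quick statistical sketch (mine, not the book's) makes the point concrete: a filtered sample can look internally varied while still being badly skewed, and nothing inside the sample tells you so. All numbers below are made up for illustration.

```python
# Toy illustration of the "biased sample" point above: a personalized
# feed drawn from only one corner of the population looks internally
# diverse, but the skew is only visible by comparing it to the full
# population - which the filter bubble never shows you.

import random

random.seed(0)

# Hypothetical population of stories, scored from -1 (critical) to +1 (flattering)
population = [random.uniform(-1, 1) for _ in range(10_000)]

# A "personalized" sample: only stories the filter thinks you'll like (score > 0.2)
filtered_feed = [s for s in population if s > 0.2][:200]

mean = lambda xs: sum(xs) / len(xs)
print(f"population mean:    {mean(population):+.2f}")    # roughly 0.00
print(f"filtered feed mean: {mean(filtered_feed):+.2f}")  # strongly positive
# Inside the feed there is still plenty of variation (scores from 0.2 to 1.0),
# so nothing about the sample alone reveals how unrepresentative it is.
```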
We’re never able to experience the whole world at once. But the best information tools give us a sense of where we stand in it—literally, in the case of a library, and figuratively in the case of a newspaper front page. This was one of the CIA’s primary errors with Yuri Nosenko. The agency had collected a specialized subset of information about Nosenko without realizing how specialized it was, and thus despite the many brilliant analysts working for years on the case, it missed what would have been obvious from a whole picture of the man.
Because personalized filters usually have no Zoom Out function, it’s easy to lose your bearings, to believe the world is a narrow island when in fact it’s an immense, varied continent.

4. The You Loop

I believe this is the quest for what a personal computer really is. It is to capture one’s entire life.
—Gordon Bell
“You have one identity,” Zuckerberg told David Kirkpatrick for his book The Facebook Effect.
Even if you’re using the highest privacy settings in your Web browser, in other words, your hardware may soon give you away.
There’s another tension in the interplay of identity and personalization. Most personalized filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media. There’s just one flaw in this logic: Media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing … you. If a self-fulfilling prophecy is a false definition of the world that through one’s actions becomes true, we’re now on the verge of self-fulfilling identities, in which the Internet’s distorted picture of us becomes who we really are.
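
To make the three-step model concrete, here is a minimal feedback-loop sketch (my illustration, not Pariser's): the filter infers your interests from clicks, serves more of the same, and the updated profile narrows what you see next. Topic names and weights are hypothetical.

```python
# Toy sketch of the personalization feedback loop ("you loop"):
# profile -> recommendations -> clicks -> updated profile.

from collections import Counter

def recommend(profile, k=3):
    """Serve the k topics the profile is currently most weighted toward."""
    return [topic for topic, _ in profile.most_common(k)]

def update(profile, clicked_topics):
    """Reinforce whatever the user clicked on."""
    profile.update(clicked_topics)
    return profile

profile = Counter({"politics": 2, "yoga": 2, "bird-watching": 1, "baseball": 1})
for _ in range(5):
    shown = recommend(profile)
    # Assume the user clicks what is shown (the easy, pleasurable path);
    # topics that are never shown never get the chance to be clicked at all.
    profile = update(profile, shown)

print(profile)  # a few topics dominate; the rest effectively vanish from view
```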
Personalized filtering can even affect your ability to choose your own destiny. In “Of Sirens and Amish Children,” a much cited tract, information law theorist Yochai Benkler describes how more-diverse information sources make us freer. Autonomy, Benkler points out, is a tricky concept: To be free, you have to be able not only to do what you want, but to know what’s possible to do. The Amish children in the title are plaintiffs in a famous court case, Wisconsin v. Yoder, whose parents sought to prevent them from attending public school so that they wouldn’t be exposed to modern life. Benkler argues that this is a real threat to the children’s freedom: Not knowing that it’s possible to be an astronaut is just as much a prohibition against becoming one as knowing and being barred from doing so.
Of course, too many options are just as problematic as too few—you can find yourself overwhelmed by the number of options or paralyzed by the paradox of choice. But the basic point remains: The filter bubble doesn’t just reflect your identity. It also illustrates what choices you have. Students who go to Ivy League colleges see targeted advertisements for jobs that students at state schools are never even aware of. The personal feeds of professional scientists might feature articles about contests that amateurs never become aware of. By illustrating some possibilities and blocking out others, the filter bubble has a hand in your decisions. And in turn, it shapes who you become.

A Bad Theory of You

But it’s that behavior that determines what content you see in Google News, what ads Google displays—what determines, in other words, Google’s theory of you.
The basis for Facebook’s personalization is entirely different. While Facebook undoubtedly tracks clicks, its primary way of thinking about your identity is to look at what you share and with whom you interact.
There’s a big difference between “you are what you click” and “you are what you share.”
Facebook’s share-based self is more aspirational: Facebook takes you more at your word, presenting you as you’d like to be seen by others.
Both are pretty poor representations of who we are, in part because there is no one set of data that describes who we are.
there are some important things that are lost in the gap between the data and reality.
Zuckerberg’s statement that we have “one identity” simply isn’t true. Psychologists have a name for this fallacy: fundamental attribution error. We tend to attribute people’s behavior to their inner traits and personality rather than to the situations they’re placed in. Even in situations where the context clearly plays a major role, we find it hard to separate how someone behaves from who she is.
Someone who’s gregarious when happy may be introverted when stressed.
Personalization doesn’t capture the balance between your work self and your play self, and it can also mess with the tension between your aspirational and your current self.
Behavioral economists call this present bias—the gap between your preferences for your future self and your preferences in the current moment.
And when we’re aware that everything we do enters a permanent, pervasive online record, another problem emerges: The knowledge that what we do affects what we see and how companies see us can create a chilling effect.
when they are able to accurately gauge the workings of your psyche, things get even weirder.

Targeting Your Weak Spots

If personalized persuasion works for products, it can also work for ideas.
In the wrong hands, persuasion profiling gives companies the ability to circumvent your rational decision making, tap into your psychology, and draw out your compulsions. Understand someone’s identity, and you’re better equipped to influence what he or she does.

A Deep and Narrow Path

Drew Westen, a neuropsychologist whose focus is on political persuasion, demonstrates the strength of this priming effect by asking a group of people to memorize a list of words that include moon and ocean. A few minutes later, he changes topics and asks the group which detergent they prefer. Though he hasn’t mentioned the word, the group’s show of hands indicates a strong preference for Tide.
Priming isn’t the only way media shape our identities. We’re also more inclined to believe what we’ve heard before.
With information as with food, we are what we consume.
Your identity shapes your media, and your media then shapes what you believe and what you care about.
You become trapped in a you loop, and if your identity is misrepresented, strange patterns begin to emerge, like reverb from an amplifier.
We know what happens when teachers think students are dumb: They get dumber.
So what happens when the Internet thinks you’re dumb? Personalization based on perceived IQ isn’t such a far-fetched scenario—Google Docs even offers a helpful tool for automatically checking the grade-level of written text.

Incidents and Adventures

In some cases, algorithmic sorting based on personal data can be even more discriminatory than people would be. For example, software that helps companies sift through résumés for talent might “learn” by looking at which of its recommended employees are actually hired. If nine white candidates in a row are chosen, it might determine that the company isn’t interested in hiring black people and exclude them from future searches. “In many ways,” writes NYU sociologist Dalton Conley, “such network-based categorizations are more insidious than the hackneyed groupings based on race, class, gender, religion, or any other demographic characteristic.” Among programmers, this kind of error has a name. It’s called overfitting.
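
A deliberately crude sketch (mine, not from the book) shows how this kind of “learning” overfits to a biased history. Every name, field, and threshold below is hypothetical.

```python
# Toy illustration of the resume-screening failure mode described above:
# a naive model "learns" from past hiring decisions and ends up encoding
# the bias in them - the overfitting problem Pariser points to.

past_hires = [
    {"group": "A", "hired": True},
    {"group": "A", "hired": True},
    {"group": "A", "hired": True},
    {"group": "B", "hired": False},
]

def learn_hire_rate(history):
    """Per-group hire rate from historical decisions - bias included."""
    rates = {}
    for group in {r["group"] for r in history}:
        rows = [r for r in history if r["group"] == group]
        rates[group] = sum(r["hired"] for r in rows) / len(rows)
    return rates

def screen(candidates, rates, threshold=0.5):
    """Only forward candidates whose group 'historically' gets hired."""
    return [c for c in candidates if rates.get(c["group"], 0) >= threshold]

rates = learn_hire_rate(past_hires)
applicants = [{"name": "Alice", "group": "A"}, {"name": "Bela", "group": "B"}]
print(screen(applicants, rates))
# Bela is filtered out before any human ever sees the resume -
# the model has overfit to a biased past.
```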
Banks are beginning to use social data to decide to whom to offer loans: If your friends don’t pay on time, it’s likely that you’ll be a deadbeat too. “A decision is going to be made on creditworthiness based on the creditworthiness of your friends,” Stryker said. “There are applications of this technology that can be very powerful,” another social targeting entrepreneur told the Wall Street Journal. “Who knows how far we’d take it?”
imagine if LinkedIn provided that data to corporate clients to help them weed out people who are forecast to be losers. Because that could happen entirely without your knowledge, you’d never get the chance to argue, to prove the prediction wrong, to have the benefit of the doubt.
As the investment cliché has it, past performance is not indicative of future results.
This raises some big questions for science, which is at its core a method for using data to predict the future.
The statistical models that make up the filter bubble write off the outliers. But in human life it’s the outliers who make things interesting and give us inspiration. And it’s the outliers who are the first signs of change.

5. The Public Is Irrelevant

In practice, the firewall is not so hard to circumvent. Corporate virtual private networks—Internet connections encrypted to prevent espionage—operate with impunity. Proxies and firewall workarounds like Tor connect in-country Chinese dissidents with even the most hard-core antigovernment Web sites. But to focus exclusively on the firewall’s inability to perfectly block information is to miss the point. China’s objective isn’t so much to blot out unsavory information as to alter the physics around it—to create friction for problematic information and to route public attention to progovernment forums. While it can’t block all of the people from all of the news all of the time, it doesn’t need to.
“What the government cares about,” Atlantic journalist James Fallows writes, “is making the quest for information just enough of a nuisance that people generally won’t bother.” The strategy, says Xiao Qiang of the University of California at Berkeley, is “about social control, human surveillance, peer pressure, and self-censorship.”
It may not be a coincidence that the Great Firewall stopped blocking pornography recently. “Maybe they are thinking that if Internet users have some porn to look at, then they won’t pay so much attention to political matters,” Michael Anti, a Beijing-based analyst, told the AP.
But in the age of the Internet, it’s still possible for governments to manipulate the truth. The process has just taken a different shape: Rather than simply banning certain words or opinions outright, it’ll increasingly revolve around second-order censorship—the manipulation of curation, context, and the flow of information and attention. And because the filter bubble is primarily controlled by a few centralized companies, it’s not as difficult to adjust this flow on an individual-by-individual basis as you might think. Rather than decentralizing power, as its early proponents predicted, in some ways the Internet is concentrating it.

Lords of the Cloud

Given all that, I was a bit surprised when the first weapon he referred me to was a very quotidian one: a thesaurus. The key to changing public opinion, Rendon said, is finding different ways to say the same thing. He described a matrix, with extreme language or opinion on one side and mild opinion on the other. By using sentiment analysis to figure out how people in a country felt about an event—say, a new arms deal with the United States—and identify the right synonyms to move them toward approval, you could “gradually nudge a debate.” “It’s a lot easier to be close to what reality is” and push it in the right direction, he said, than to make up a new reality entirely.
To make good decisions, context is crucial—that’s why the military is so focused on what they call “360-degree situational awareness.” In the filter bubble, you don’t get 360 degrees—and you might not get more than one.
Rather than housing their Web sites and databases internally, many businesses and start-ups now run on virtual computers in vast server farms managed by other companies. The enormous pool of computing power and storage these networked machines create is known as the cloud, and it allows clients much greater flexibility. If your business runs in the cloud, you don’t need to buy more hardware when your processing demands expand: You just rent a greater portion of the cloud. Amazon Web Services, one of the major players in the space, hosts thousands of Web sites and Web servers and undoubtedly stores the personal data of millions. On one hand, the cloud gives every kid in his or her basement access to nearly unlimited computing power to quickly scale up a new online service. On the other, as Clive Thompson pointed out to me, the cloud “is actually just a handful of companies.” When Amazon booted the activist Web site WikiLeaks off its servers under political pressure in 2010, the site immediately collapsed—there was nowhere to go.
Personal data stored in the cloud is also actually much easier for the government to search than information on a home computer. The FBI needs a warrant from a judge to search your laptop.
The FBI can just ask the company for the information—no judicial paperwork needed, no permission required—as long as it can argue later that it’s part of an “emergency.”
“They can go to a single place and get everybody’s documents.”
Because of the economies of scale in data, the cloud giants are increasingly powerful. And because they’re so susceptible to regulation, these companies have a vested interest in keeping government entities happy.
Jonathan Zittrain – The Future of the Internet—and How to Stop It

Friendly World Syndrome

Gerbner called this the mean world syndrome: If you grow up in a home where there’s more than, say, three hours of television per day, for all practical purposes, you live in a meaner world—and act accordingly—than your next-door neighbor who lives in the same place but watches less television. “You know, who tells the stories of a culture really governs human behavior,” Gerbner later said.
While the mean world on television arises from a cynical “if it bleeds, it leads” approach to programming, the friendly world generated by algorithmic filtering may not be as intentional.
If television gives us a “mean world,” filter bubbles give us an “emotional world.”
One of the troubling side effects of the friendly world syndrome is that some important public problems will disappear.
But while it’s easier than ever to bring a group of people together, as personalization advances it’ll become harder for any given group to reach a broad audience. In some ways, personalization poses a threat to public life itself.

Fragmentation

The aim of modern political marketing, consumer trends expert J. Walker Smith tells Bill Bishop in The Big Sort, is to “drive customer loyalty—and in marketing terms, drive the average transaction size or improve the likelihood that a registered Republican will get out and vote Republican. That’s a business philosophy applied to politics that I think is really dangerous, because it’s not about trying to form a consensus, to get people to think about the greater good.”
Somewhat confusingly, postmaterialism doesn’t mean anticonsumption. Actually, the phenomenon is at the bedrock of our current consumer culture: Whereas we once bought things because we needed them to survive, now we mostly buy things as a means of self-expression. And the same dynamics hold for political leadership: Increasingly, voters evaluate candidates on whether they represent an aspirational version of themselves.
The result is what marketers call brand fragmentation. When brands were primarily about validating the quality of a product—“Dove soap is pure and made of the best ingredients”—advertisements focused more on the basic value proposition. But when brands became vehicles for expressing identity, they needed to speak more intimately to different groups of people with divergent identities they wanted to express. And as a result, they started to splinter. Which is why what’s happened to Pabst Blue Ribbon beer is a good way of understanding the challenges faced by Barack Obama.
In the science of social mapping, the definition of a community is a set of nodes that are densely interconnected—my friends form a community if they don’t just know me but also have independent relationships with one another. Communication builds stronger community.
Ultimately, democracy works only if we citizens are capable of thinking beyond our narrow self-interest. But to do so, we need a shared view of the world we cohabit. We need to come into contact with other peoples’ lives and needs and desires. The filter bubble pushes us in the opposite direction—it creates the impression that our narrow self-interest is all that exists. And while this is great for getting people to shop online, it’s not great for getting people to make better decisions together.
“The prime difficulty” of democracy, John Dewey wrote, “is that of discovering the means by which a scattered, mobile, and manifold public may so recognize itself as to define and express its interests.” In the early days of the Internet, this was one of the medium’s great hopes—that it would finally offer a medium whereby whole towns—and indeed countries—could co-create their culture through discourse. Personalization has given us something very different: a public sphere sorted and manipulated by algorithms, fragmented by design, and hostile to dialogue.
Which raises an important question: Why would the engineers who designed these systems want to build them this way?

6. Hello, World!

Geek cultures are data driven and reality based, valuing substance over style. Humor plays a prominent role—as Coleman points out, jokes demonstrate an ability to manipulate language in the same way that an elegant solution to a tricky programming problem demonstrates mastery over code. (The fact that humor also often serves to unmask the ridiculous pieties of the powerful is undoubtedly also part of its appeal.)
But there are dangers in taking the method too far. As I discussed in chapter 5, the most human acts are often the most unpredictable ones. Because systematizing works much of the time, it’s easy to believe that by reducing and brute-forcing an understanding of any system, you can control it. And as a master of a self-created universe, it’s easy to start to view people as a means to an end, as variables to be manipulated on a mental spreadsheet, rather than as breathing, thinking beings. It’s difficult both to systematize and to appeal to the fullness of human life—its unpredictability, emotionality, and surprising quirks—at the same time.
Like goldfish that grow only large enough for the tank they’re in, we’re contextual beings: how we behave is dictated in part by the shape of our environments.
What Technology Wants – Kevin Kelly
The back of his business card had a mission statement about connecting people with brands they’d love.

7. What You Want, Whether You Want It or Not

There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things.
—computing pioneer Vannevar Bush, 1945
All collected data had come to a final end. Nothing was left to be collected. But all collected data had yet to be completely correlated and put together in all possible relationships.
—from Isaac Asimov’s short story “The Last Question”
People are much less likely to volunteer private information when being interrogated by a virtual agent than when simply filling out a form; they share more if they feel as though they’re privately entering it into an impersonal machine rather than sharing it with people.
On the other hand, when Harvard researchers Terence Burnham and Brian Hare asked volunteers to play a game in which they could choose to donate money or keep it, a picture of the friendly looking robot Kismet increased donations by 30 percent.
advertisers may well decide to invest in technology that allows them to insert human advertisements into social spaces. The next attractive man or woman who friends you on Facebook could turn out to be an ad for a bag of chips.

The Future Is Already Here

As of the end of 2010, however, this feature isn’t available in Google Image Search. Face.com, an Israeli start-up, may offer the service before the search giant does. It’s not every day that a company develops a highly useful and world-changing technology and then waits for a competitor to launch it first. But Google has good reason to be concerned: The ability to search by face will shatter many of our cultural illusions about privacy and anonymity.
Many of us will be caught in flagrante delicto. It’s not just that your friends (and enemies) will be able to easily find pictures other people have taken of you—as if the whole Internet has been tagged on Facebook. They will also be able to find pictures other people took of other people, in which you happen to be walking by or smoking a cigarette in the background.
It’s not just people that will be easier than ever to track. It’s also individual objects—what some researchers are calling the “Internet of things.”
As sci-fi author William Gibson once said, “The future is already here—it’s just not very evenly distributed.”
This phenomenon is called ambient intelligence. It’s based on a simple observation: The items you own, where you put them, and what you do with them is, after all, a great signal about what kind of person you are and what kind of preferences you have. “In the near future,” writes a team of ambient intelligence experts led by David Wright, “every manufactured product—our clothes, money, appliances, the paint on our walls, the carpets on our floors, our cars, everything—will be embedded with intelligence, networks of tiny sensors and actuators, which some have termed ‘smart dust.’”
“Advertiser-funded media,” or AFM.
Product placement has been in vogue for decades, and AFM is its natural next step. Advertisers love product placement because in a media environment in which it’s harder and harder to get people to pay attention to anything—especially ads—it provides a kind of loophole. You can’t fast-forward past product placement. You can’t miss it without missing some of the actual content. AFM is just a natural extension of the same logic: Media have always been vehicles for selling products, the argument goes, so why not just cut out the middleman and have product makers produce the content themselves?
In 2010, Walmart and Procter & Gamble announced a partnership to produce Secrets of the Mountain and The Jensen Project, family movies that will feature characters using the companies’ products throughout. Michael Bay, the director of Transformers, has started a new company called the Institute, whose tagline is “Where Brand Science Meets Great Storytelling.” Hansel and Gretel in 3-D, its first feature production, will be specially crafted to provide product-placement hooks throughout.
If the product placement and advertiser-funded media industries continue to grow, personalization will offer whole new vistas of possibility.

A Shifting World

A team led by John Hauser at MIT’s business school has developed the basic techniques for what they call Web site morphing, in which a shopping site analyzes users’ clicks to figure out what kinds of information and styles of presentation are most effective and then adjusts the layout to suit a particular user’s cognitive style. Hauser estimates that Web sites that morph can increase “purchase intentions” by 21 percent.
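
The book doesn't spell out how morphing is implemented, but a morphing site presumably works something like a bandit: try layout variants, watch which one a given kind of user responds to, and converge on it. Here is a minimal epsilon-greedy sketch of that idea; the variant names, segments, and click rates are all hypothetical, and this is not Hauser's actual method.

```python
# Rough sketch of how a "morphing" site might pick a layout per user:
# an epsilon-greedy bandit that mostly serves the best-performing variant
# for a user segment while still exploring the others.

import random

random.seed(1)

VARIANTS = ["text-heavy", "visual", "comparison-table"]
# Hidden (simulated) click-through rate of each variant for one user segment
TRUE_CTR = {"text-heavy": 0.02, "visual": 0.05, "comparison-table": 0.11}

stats = {v: {"shows": 0, "clicks": 0} for v in VARIANTS}

def choose_variant(epsilon=0.1):
    if random.random() < epsilon:
        return random.choice(VARIANTS)  # explore a random layout
    # otherwise exploit the layout with the best observed click rate
    return max(VARIANTS, key=lambda v: stats[v]["clicks"] / max(stats[v]["shows"], 1))

for _ in range(5_000):
    v = choose_variant()
    stats[v]["shows"] += 1
    if random.random() < TRUE_CTR[v]:  # simulate the user's response
        stats[v]["clicks"] += 1

best = max(VARIANTS, key=lambda v: stats[v]["clicks"] / max(stats[v]["shows"], 1))
print(best, stats[best])  # the site "morphs" toward the layout this segment rewards
```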
“The key to the modern technology is to take all that data and turn it into useful information that the pilot can recognize very quickly and act upon.” What Google does for online information, Lovell’s Scorpion project aims to do for the real world.
Since 2002, DARPA has been pushing forward research in what it calls augmented cognition, or AugCog, which uses cognitive neuroscience and brain imaging to figure out how best to route important information into the brain.
Augmented-reality tech provides value, but it also provides an opportunity to reach people with new attention-getting forms of advertising.
Augmented reality represents the end of naive empiricism, of the world as we see it, and the beginning of something far more mutable and weird: a real-world filter bubble that will be increasingly difficult to escape.
Spyware and spam companies sell questionably derived data to middlemen, who then add it to the databases powering the marketing campaigns of major corporations.
As technology gets better and better at directing our attention, we need to watch closely what it is directing our attention toward.

8. Escape from the City of Ghettos

you have to see lots of ways of living in order to choose the best life for yourself. This is what the best cities do: They cultivate a vibrant array of cultures and allow their citizens to find their way to the neighborhoods and traditions in which they’re most at home.

What Individuals Can Do

Social-media researcher danah boyd was right when she warned that we are at risk of the “psychological equivalent of obesity.” And while creating a healthy information diet requires action on the part of the companies that supply the food, that doesn’t work unless we also change our own habits.
Habits are hard to break. But just as you notice more about the place you live when you take a new route to work, varying your path online dramatically increases your likelihood of encountering new ideas and people.
Going off the beaten track is scary at first, but the experiences we have when we come across new ideas, people, and cultures are powerful. They make us feel human. Serendipity is a shortcut to joy.
A better approach is to choose to use sites that give users more control and visibility over how their filters work and how they use your personal information.
There’s great power in setting the default option when people are given a choice. Dan Ariely, the behavioral economist, illustrates the principle with a chart showing organ donation rates in different European countries. In England, the Netherlands, and Austria, the rates hover around 10 percent to 15 percent, but in France, Germany, and Belgium, donation rates are in the high 90s. Why? In the first set of countries, you have to check a box giving permission for your organs to be donated. In the second, you have to check a box to say you won’t give permission.

What Companies Can Do

As Larry Lessig says, “A political response is possible only when regulation is transparent.” And there’s more than a little irony in the fact that companies whose public ideologies revolve around openness and transparency are so opaque themselves.
Knowing what information the personalizers have on us isn’t enough. They also need to do a much better job explaining how they use the data—what bits of information are personalized, to what degree, and on what basis.
There’s one more thing the engineers of the filter bubble can do. They can solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience.

What Governments and Citizens Can Do

  • You should know who has your personal data, what data they have, and how it’s used.
  • You should be able to prevent information collected about you for one purpose from being used for others.
  • You should be able to correct inaccurate information about you.
  • Your data should be secure.

Especially if you’re a blogger or a writer, if you make funny videos or music, or if you coach or consult for a living, your online data trail is one of your most valuable assets. But while it’s illegal to use Brad Pitt’s image to sell a watch without his permission, Facebook is free to use your name to sell one to your friends.
But in 2010, it decided that all of that data should be made fully public; a clause in Facebook’s privacy policy (as with many corporate privacy policies) allows it to change the rules retroactively. In effect, this gives them nearly unlimited power to dispatch personal data as they see fit.
And this isn’t just about privacy. It’s also about how our data shapes the content and opportunities we see and don’t see.