Facebook and the fallacy of the mirror

The science fiction story Strangers’ Eyes is about a spaceship crew that discovers a cloud-covered planet orbiting a dim red star. When they land, they find a population of intelligent, though primitive, alien beings that live in huts.

There is only one problem: they all appear to be blind. In fact, they appear to have been recently blinded, because they are debilitated by their lack of sight. They don’t seem to detect the presence of the humans, or of each other, in their path. One of the natives, unable to detect the edge of a cliff, falls to his death.

The space explorers ponder this puzzle: how could a lifeform evolve to be sightless in a sighted world? Then an uncomfortable thought strikes them. They look back at their spaceship: did the powerful locator scans they had used to investigate the planet before landing blind all of its inhabitants?

This is not an easy conclusion for the explorers to accept: so, they don’t. They were circumspect and responsible. They didn’t set off on their journey with any notion of violence. Indeed, their attitude was closer to that of the jocular bear in the song, who went over the mountain to see what he could see. They cannot square that attitude with the impact their scans actually had.


Facebook, today, sits atop a bear’s mountain’s worth of horror reports. A massacre of Rohingya Muslims in Myanmar. The Trump victory in 2016 that came in on the legs of a systematic spread of disinformation through fake Facebook accounts. Sporadic violence during the recent Black Lives Matter uprising, organized by Boogaloo groups on Facebook. Facebook’s role in spreading antivax and Covid-19 misinformation. Their role in inflaming the pernicious QAnon conspiracy theory, which has left many believers estranged from their families and has driven some to violence.

Facebook has been dragged—kicking and screaming, usually—into culling this spread on their platform, first by the media, then by the more vocal of their employees, and most recently by an advertiser boycott.

But it seems that we have now, in the blinding present, arrived at a hard stop, beyond which the two sides will not budge.


That is the question of the mirror.

Facebook’s general attitude toward critics is unfailingly accommodating. They always accept that they need to “do more” to fix problems on their platform. They proclaim their wholesome vision for the platform. Then they bring up the mirror.

Ex-UK politician Nick Clegg was hired by Facebook in 2018 as a reputation-fixing spokesperson. He is the perfect messenger for this message.

He was recently on Brian Stelter’s Reliable Sources Sunday show to address their latest PR crisis, the #StopHateForProfit campaign to goad advertisers into pausing their ad spending on Facebook. Clegg readily accepted that Facebook needs to “do more”. But, he reminded Stelter, Facebook did not create a polarized society; they just landed in one. The filth on their platform didn’t originate with Facebook; indeed, Facebook is just a mirror, he said, for problems that already existed in society.

As such (he did not say, but undoubtedly meant), while they will of course do all they can to cull misinformation and hate speech on their platform, they aren’t, in a metaphysical sense, responsible for it.

After all, when you look in the mirror and spot a hairy wart on your cheek, the mirror didn’t put it there. It just reflected your sad reality.


As far as I am aware, no one has approached this argument head-on.

When Facebook executives argue that Facebook did not invent polarization or propaganda, they are merely knocking down a straw man. Not a single civil rights activist believes that we lived in Utopia before Facebook showed up in 2004, or that harmful content is created by Facebook employees. The argument they make is rather different—and quite simple.

Civil rights activists claim that Facebook, through its tepid management of fake news and hate speech on its platform, makes things worse. That in several cases that have been studied, an event or a situation would have been much less likely to occur without Facebook’s involvement. That Facebook is not just reflecting but shaping society, with an engagement-driven model that prioritizes shocking content and gives truthful content far too little weight.

Facebook, on the other hand, seems to reject the notion that they have any impact on society at all (hey, the mirror didn’t put that hairy wart on your cheek). This has been a common thread through all their public messages, from 2016, when Mark Zuckerberg called it a “crazy idea” that anything people read on Facebook might have swayed their vote, to Nick Clegg emphasizing once again, in an open letter, that Facebook holds up a mirror to society, to Facebook executive Andy Stone heartily seconding Clegg.

I understand. The space travelers in the story didn’t want to accept that their scans blinded the planet either. But here, let me tell you what’s a truly crazy idea: that anyone would believe that Facebook posts reflect its users’ opinions, but its users’ opinions are never influenced by Facebook posts.


Facebook prides itself on giving politicians a platform from which to address the public directly. One politician uses that platform particularly well: Trump. Consider his post below, which makes the (false) claim that mail-in ballots will lead to electoral fraud:

Now, Facebook isn’t merely showcasing that post as a campaign website might do. Notice the 9.5K comments, the 4.5K shares, and the 62K reactions. It is pushing this post into the attention of people who Facebook’s algorithm has judged as most likely to trust, engage with, and share it.

These 76K engagements are merely the start. They are the legs that Facebook gives to Trump’s original post. Because people tend to cluster in like-minded groups, each engagement brings the post to the attention of others, in an exponentially expanding network, who are themselves very likely to like, trust, and further share it.

Back of the envelope: a user has, on average, about 100 friends; say each of these engagements leads to 50 other people seeing the post, and of those 50, say 5 engage with it in turn. That leads to a 380K second-order spread, a 1.9M third-order spread, 9.5M fourth-order, and so on, as sketched below.
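Here is a minimal sketch of that arithmetic in Python. The parameters are the assumptions from the paragraph above (50 views and 5 further engagements per engagement), not measured values:

```python
# Back-of-the-envelope cascade using the article's assumed fan-out numbers.
# These are illustrative assumptions, not Facebook data.

SEED_ENGAGEMENTS = 76_000        # 9.5K comments + 4.5K shares + 62K reactions
VIEWS_PER_ENGAGEMENT = 50        # assumed: ~half of a user's ~100 friends see it
ENGAGEMENTS_PER_ENGAGEMENT = 5   # assumed: 5 of those 50 viewers engage in turn

engagements = SEED_ENGAGEMENTS
for order in range(2, 5):
    views = engagements * VIEWS_PER_ENGAGEMENT
    engagements *= ENGAGEMENTS_PER_ENGAGEMENT
    print(f"order {order}: ~{views:,} views, ~{engagements:,} further engagements")

# order 2: ~3,800,000 views,  ~380,000 further engagements
# order 3: ~19,000,000 views, ~1,900,000 further engagements
# order 4: ~95,000,000 views, ~9,500,000 further engagements
```

Even with far more conservative fan-out assumptions, the point survives: the spread is driven by the platform’s distribution machinery, not by the original post sitting passively somewhere.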

Now you might say that many of the people in this expanding network would already have come across the post from other friends or family, so it was not new information to them, and that is very likely true. But this ignores the power of repetition and resonance, and the indoctrinating power of seeing like-minded comments pile up below the post.

All of this, I want you to notice, is very different from merely showing the post, as a dumb mirror might. This is no mirror. This is an incubator that favors the posts that will drive the most engagement; or a meme-bomb that detonates the most engaging memes, spreading them in all directions—pick your metaphor.

Stripped of Facebook’s viral algorithms, Trump’s post might have languished on a campaign website that hardly anyone would have thought to check; only a very small number of people would have been exposed to the lie. That would have been a dumb mirror; Facebook, on the other hand, is not.

There is another option outside of Facebook, of course: that Americans would have heard Trump’s false claim through the news. But notice that this is very different. Most credible outlets would provide context about the truth or falsity of the claim as they reported on it; perhaps even have an expert on a panel to discuss how unlikely fraud is to occur. On the other hand, the “news outlet” here is merely Facebook, and its CEO has made a religion out of not providing context—even when the context is merely truth.


Why do we permit Facebook executives to gaslight us on this point? It is because the “mirror” argument relies on a sleight of hand that one might not notice if one isn’t immersed in algorithms as part of one’s job.

The point of the mirror analogy is to stake a claim similar to net-neutrality: content-neutrality. Their amplification is like blind justice, they claim: with neither fear nor favor for any political party, programmers code rules without knowing who they will be applied to.

Is it the fault of Facebook, then, one hears them say—that if you frequently click on garbage, our algorithm will give you more garbage? That’s kind of like a mirror, isn’t it?

No. Not quite.

Email, in contrast, is content-neutral. Even though, in the early days of the Obama administration, we all got emails from friends and family pushing conspiracy theories about Obama’s foreign birth, trailing long chains of “Re:”s and “Fwd:”s, no one sought to blame Gmail or Yahoo Mail.

There is a reason for that. People instinctively get that email truly is content-neutral, while Facebook emphatically is not. Even if we take them at their word that they do not put a thumb on the scale for the Left or the Right (and there is good reason to doubt that), their programmers might be content-neutral, but the algorithms those programmers build are not.

Indeed, their algorithms are all about content-specificity.

Facebook’s rapacity for private user data is well-known. There is a point to this data collection: it enables them to build up complex models about each user’s proclivities, so that the algorithm can show them content they are most likely to engage with. They don’t do this by being blind to content. They do this by analyzing it to within an inch of its life and matching it up with you.
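To make this concrete, here is a deliberately simplified, hypothetical sketch of engagement-driven ranking. None of the names, fields, or numbers below come from Facebook; the only point is that ranking by predicted engagement requires analyzing content and matching it to a per-user model, which is the opposite of being blind to it:

```python
# Hypothetical sketch of engagement-driven feed ranking (not Facebook's code).
# It illustrates content-specificity: the score depends on what the post is
# about and on what this particular user has engaged with before.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str          # e.g. "election", "vaccines", "recipes"
    provocation: float  # 0..1, how inflammatory the content is

def predicted_engagement(affinity: dict, post: Post) -> float:
    # The user's learned affinity for a topic, boosted by how provocative
    # the post is: engaging beats true.
    return affinity.get(post.topic, 0.0) * (1.0 + post.provocation)

def rank_feed(affinity: dict, candidates: list) -> list:
    return sorted(candidates,
                  key=lambda p: predicted_engagement(affinity, p),
                  reverse=True)

# A user whose clicks have built up a strong "election" affinity:
affinity = {"election": 0.9, "recipes": 0.2}
feed = rank_feed(affinity, [
    Post("a", "recipes", 0.1),
    Post("b", "election", 0.8),   # the provocative claim floats to the top
])
print([p.post_id for p in feed])  # ['b', 'a']
```

Hand the same two candidate posts to a user with different affinities and the feed looks completely different. That is content-specificity, not content-neutrality: a mirror that decides what to reflect back at each viewer.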

Imagine a cafeteria that is built to keep track of your tastes. If you have a weakness for salads, it will serve up more exciting salads—this is the benign part. But if you have a tendency to gorge on candy or low-nutrition processed foods full of trans fats, it will exclusively serve you such food. Don’t blame the cafeteria for the obese, diabetic heart-patient you become: it was your choices that drove the cafeteria!

While my feed tends to be filled with political articles and activism, other feeds look very different. A person who is bewildered and fearful of modernity will tend to absorb conspiracy theories about evil puppetmasters. A person driven by resentment about their own lot in life, who feels psychologically driven to find scapegoats, will revel in racist literature. Credulous people with a low understanding of science will tend to engage with misinformation about vaccines and cell towers. One user found that merely sharing her cancer diagnosis filled her feed with ads for alternative cures.

So what if it isn’t true? Facebook isn’t in the business of being the arbiter of truth.

The Facebook content its users marinate in shapes their opinions, radicalizes them, and changes their lives. Here is the mirror reaching in to put the hairy wart on your cheek.


Indeed, not only does Facebook radicalize vulnerable people—it knows this better than anyone else. When their internal research teams tried to answer exactly that question, executives didn’t much like what they heard, and the project quickly lost support:

Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”

WSJ report

This isn’t the first time they have investigated their own ability to manipulate users’ emotions. Back in 2014 an internal research team deliberately exposed users to a heightened number of negative posts to see if they could spread anxiety. It worked.


So when Filipino journalist and activist Maria Ressa warned them in September 2018 that, unless they culled the disinformation campaign being spread against her on Facebook, she could end up arrested by her country’s despotic government, and they ignored her anyway, they knew what they were doing. Last month, she was arrested.

In Myanmar, where Facebook is basically the entirety of the Internet, civil rights activists repeatedly warned them that the government was deliberately spreading falsehoods about the Rohingya community through their platform. They ignored the warnings until it was too late.

They recently took down hundreds of pages belonging to the Boogaloo Bois—a group that is striving for a new US civil war. But they did so only in response to official reports that its members had already carried out acts of violence. Meanwhile, rumors about supposed “Antifa” bus attacks spread unchecked on Facebook, in one case leading to a multiracial family being hunted while camping, and to completely innocent charter bus tours being threatened. They also took down the Plandemic video, which spreads dangerous Covid-19 misinformation; but not before it had already gone viral, thus affecting millions of minds.


Please note that there is a purpose to the mirror analogy. It subtly informs listeners that Facebook doesn’t actually have agency in spreading hate speech or fake news: that all the agency belongs to users. This was also the message from Facebook COO Sheryl Sandberg in her July 7th post.

But people don’t “spread” hate: they merely post it. The “spreading” is all done by Facebook.

So here is the chain of logic:

  • If Facebook is a mirror,
  • then they don’t have agency in spreading hate speech or fake news,
  • thus any cleanup they do of their platform is just them acting as Good Samaritans, as a favor to society—not fixing something they broke in the first place;
  • so, they can be merely reactive and not proactive, and wait for journalists or officials to report content before they act on it.

As Facebook executive Neil Potts said at a Congressional hearing last year: “if brought to their attention,” they would take problematic content down. If not brought to their attention? Well; (shrug emoji).

My guess is that being proactive about harmful content is such an incredibly expensive endeavor that it would be an “existential threat” to their business—as Mark Zuckerberg once said in a different context. This is why they cling to the “Facebook-as-mirror” argument as a shield against all critics.

But Facebook is not a mirror. It shapes society every day, through billions of eyeballs. And they know it. This is why we should treat the “Facebook-as-mirror” argument as what it actually is—deliberate deception as a PR strategy to stave off unpalatable demands.

(Follow me on Twitter at @TheOddPost and on Facebook at The Odd Post.)

Read my other work on Facebook and social media: Moral confusion at Facebook and NYT; It’s not about Free Speech; Don’t let them off the hook.
