Facebook’s Algorithm Deserves More Credit for Sowing Society’s Disharmony

“The unexamined life is not worth living.”

–Socrates, at the trial where he was sentenced to death.

– – – – – – – – – – –

Why are we all so mad at each other all of a sudden? Why does every decision seem to split society 50/50, from the United States Presidential election to the Brexit vote in Britain? Why are we more distrustful of each other than at any other time in nearly 50 years? How did 2,500 years of Western Civilization that seemed to be on a slow but steady march toward equality, democracy, and tolerance for one another suddenly hit a patch where we’ve regressed, with blood being shed in the streets because of it? Do the injustices of today really add up to the injustices our forefathers faced, or is it just our perception of it all?

Everywhere you look, society is more polarized along racial, cultural, economic, and religious lines than at any other time in recent memory, and people are comparing 2016 to the riotous anger and disharmony of 1968. So how did we get here? There are a few easy factors to point to, such as the growing economic divide between rich and poor, and the way every sector of society, not just politics, seems to suffer from a lack of true leadership, partly stemming from our need to blame others for the dilemmas we see and face, leaving a society that is unable to place universal trust in anyone.

But maybe the problem is much more fundamental. Maybe it’s just the way we perceive the outside world because of the way we interface with it. The Internet was supposed to open up the possibilities of human intellect and enhance our appetite for learning and discovery. And for many years, that’s exactly what it did. However, something changed in the last few years, marked by the recent moment when Google ceased to be the #1 source of referrals to content on the web and Facebook took the lead. Where Google and other search engines encouraged people to ask questions and search for answers, Facebook looks to serve people all of the media they interact with on any given day through its curated algorithm, stultifying curiosity and reinforcing a reality-tunneled cultural window defined by ultra-focused, polarized media outlets purposely slanted to incite the senses and stimulate engagement and sharing … and sometimes violence.

The data on how Facebook’s News Feed is set up to serve you media that reinforces your already-established thoughts and ideologies is well documented. There isn’t necessarily anything dubious behind this; it’s just obvious: if Facebook wants to keep you on its platform, it should serve you content you’re more likely to interact with. And as users “like” certain pages and posts, the focus of their Facebook feed turns ever more inward and self-serving.
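The feedback loop described above can be illustrated with a toy sketch. This is not Facebook’s actual ranking code; the function names, the numeric “lean” scale, and the simple update rule are all invented for illustration, but they capture the dynamic: each “like” nudges a learned preference, and the feed then sorts stories by closeness to that preference.

```python
# Toy sketch of an engagement-weighted feed (NOT Facebook's real algorithm).
# Stories carry an ideological "lean" from -1 (left) to +1 (right); a user's
# likes pull a preference score toward whatever they engage with, and the
# feed then surfaces the stories nearest that preference.

def rank_feed(stories, user_pref):
    """Order stories by how closely their lean matches the user's preference."""
    return sorted(stories, key=lambda s: abs(s["lean"] - user_pref))

def update_pref(user_pref, liked_lean, rate=0.3):
    """Nudge the preference toward the lean of whatever the user just liked."""
    return user_pref + rate * (liked_lean - user_pref)

stories = [{"title": "A", "lean": -0.9},
           {"title": "B", "lean": 0.0},
           {"title": "C", "lean": 0.8}]

pref = 0.0                      # a perfectly neutral user to start
for _ in range(5):              # who repeatedly likes the right-leaning story
    pref = update_pref(pref, 0.8)

feed = rank_feed(stories, pref)
print([s["title"] for s in feed])   # the right-leaning story now ranks first
```

A handful of likes is enough to flip the ordering; the neutral story slides to the middle and the opposing story to the bottom, which is the echo-chamber drift in miniature.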

As Frank Bruni of The New York Times observed recently:

More prevalent for many users are the posts we see from friends and from other people and groups we follow on the network, and this information is utterly contingent on choices we ourselves make. If we seek out, “like” and comment on angry missives from Bernie Sanders supporters, we’ll be confronted with more angry missives from more Sanders supporters. If we banish such outbursts, those dispatches disappear…

The Internet isn’t rigged to give us right or left, conservative or liberal — at least not until we rig it that way. It’s designed to give us more of the same, whatever that same is: one sustained note from the vast and varied music that it holds, one redundant fragrance from a garden of infinite possibility…

We construct precisely contoured echo chambers of affirmation that turn conviction into zeal, passion into fury, disagreements with the other side into the demonization of it.

Carnival barkers, conspiracy theories, willful bias and nasty partisanship aren’t anything new, and they haven’t reached unprecedented heights today. But what’s remarkable and sort of heartbreaking is the way they’re fed by what should be strides in our ability to educate ourselves … Growth of the Internet promised to expand our worlds, not shrink them. Instead they’ve enhanced the speed and thoroughness with which we retreat into enclaves of the like-minded.

After years of being accused of complicity in the closing of minds, Facebook decided to go on the offensive, and like a tobacco company telling the public its product does not cause cancer and presenting its own set of facts, Facebook published a blog post on May 7th, 2015 called Exposure to Diverse Information on Facebook. In the missive, Facebook took two primary stances. The first was to deflect blame from itself for the nature of people’s feeds, and instead blame people’s own behavior and choices.

“While News Feed surfaces content that is slightly more aligned with an individual’s own ideology (based on that person’s actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter,” Facebook says, taking the onus off of themselves and their algorithm for creating an echo chamber for their users, while also partially admitting that the algorithm caters to people’s already-established tastes and habits.

But the crux of Facebook’s argument was that even news feeds streamlined to certain ideologies still on average serve news from opposing points of view.

Specifically, we find that among those who self-report a liberal or conservative affiliation,

  • On average, 23 percent of people’s friends claim an opposing political ideology.
  • Of the hard news content that people’s friends share, 29.5 percent of it cuts across ideological lines.
  • When it comes to what people see in the News Feed, 28.5 percent of the hard news encountered cuts across ideological lines, on average.
  • 24.9 percent of the hard news content that people actually clicked on was cross-cutting.

In other words, you’re still likely to see or click on news stories from a point of view different from your own about 25% of the time, on average at least; for some it could be much less. But the most important thing to note about the Facebook study is that it admits that the vast majority of the hard news stories showing up in people’s feeds come from a strongly slanted point of view, whether liberal or conservative, while news that attempts an unbiased perspective makes up very little of people’s news feeds.
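The study’s figures can also be read as a narrowing funnel, which is worth a quick back-of-the-envelope calculation using only the percentages Facebook reported above:

```python
# Cross-cutting share of hard news at each stage, per Facebook's study.
shared_by_friends = 29.5   # % of friends' shares that cut across ideological lines
seen_in_feed      = 28.5   # % of hard news the News Feed actually surfaces
clicked           = 24.9   # % of hard news the user actually clicks

# Relative reduction in cross-cutting exposure introduced at each stage.
feed_drop  = (shared_by_friends - seen_in_feed) / shared_by_friends * 100
click_drop = (seen_in_feed - clicked) / seen_in_feed * 100

print(f"News Feed ranking trims ~{feed_drop:.1f}% of cross-cutting content")
print(f"Users' own clicks trim a further ~{click_drop:.1f}%")
```

Run the numbers and the ranking stage trims only a few percent, while users’ own clicking behavior trims several times more, which is exactly the point Facebook leaned on when it argued that individual choices matter more than the algorithm.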

“There was substantial polarization among hard news shared on Facebook,” Facebook’s own study admits, “with the most frequently shared links clearly aligned with largely liberal or conservative populations, as shown below.”

[Chart from Facebook’s study: the most frequently shared hard-news links, colored by political alignment from liberal (blue) through neutral (gray) to conservative (red)]

As you can see above, the gray, or neutral, stories make up very little of the content on Facebook; they are in fact the least-shared content on the platform, while the polarizing stories in blue and red make up much of what you see in your feed. So even though Facebook may serve up hard news that cuts across your political ideology, most of what you see still comes from a biased, extreme viewpoint.

This type of news interaction not only perpetuates the extreme viewpoints of your already-predisposed ideology, it also shows you the biased and polarized opinions of the other side, painting that side in extremes and likely emboldening your own opinions and the vehemence behind them. In other words, Facebook is a war of extremes that the political diversity of its News Feed makes worse, rather than a forum facilitating the open-minded sharing of viewpoints.

Not only can we not agree on issues, we can’t even agree on the basic facts surrounding the issues because they’re all being served from slanted viewpoints.

Is all of this manifesting in the actions of people on the streets, or inspiring psychopathic killers to take extreme measures to assert their ideologies on others? That may be a leap, and additional research is needed. But it is striking that in many cases, including both of the latest mass shootings in San Bernardino and Orlando, the killers took to Facebook to profess their ideologies during their killing rampages. Killing people wasn’t enough; they had to take to Facebook to assert their self-righteous ideologies. Other recent killers and extremists are regularly exposed as narcissistic ideologues through the images, affiliations, and postings on their Facebook pages. Police are currently investigating whether a specific Facebook post inspired the individual who recently killed five police officers in Dallas.

But this goes well beyond high-profile cases of violence, which may be scary but are isolated, even if they sow further political divisiveness through the Facebook News Feed as society reacts. Our everyday world is affected by the widespread polarization of society. Nothing can get done in a government gridlocked by two sides expending the majority of their energy undermining each other instead of moving society forward as a whole. It could be the reason the most unpopular candidates for the American Presidency ended up on top. And as more people use Facebook as their primary interface with the Internet and the world, the problem only becomes more exacerbated.

What’s the solution, if any? The Google search algorithm actively looks to feed Internet users news from more respected and unbiased sources first, and to filter out news and information from many of the spurious outlets that thrive on Facebook. That is also one of the reasons Facebook has overtaken Google in referrals: Facebook gives in to people’s hatred, biases, and fears, which is a powerful engine for engagement. Not only does the majority of traffic to Internet articles now come from Facebook, the majority of news outlets are specifically tailoring their news to exploit Facebook’s News Feed algorithm, compounding and exacerbating the echo chamber effect.

Until the users of Facebook wake up to the implicit bias in their news feeds, and Facebook implements some basic controls on the quality and perspective of the news it serves, the problem looks likely only to get worse, and the polarization of society along with it.