Facebook needs more ‘human bias’

Facebook’s PR department has been busy defending itself against allegations from an anonymous, disgruntled employee that editors of the company’s “trending stories” list were biased against conservative viewpoints. The story, itself a trending topic today, may say less about the failings of biased human intervention than about the need for more of it.

According to the employee’s allegations, stories quoting conservative stalwarts such as Glenn Beck were rejected by Facebook’s human reviewers because “it was like they had a bias against Ted Cruz.” We might as well use that argument to explain the entire news media’s preference for covering Donald Trump over every other candidate, and every other important story, this election cycle.

Supporters of Cruz — just like supporters of John Kasich and Jeb Bush, or Hillary Clinton and Bernie Sanders for that matter — are still stunned by how Trump has monopolized media coverage over the past year. But the reality TV star’s ability to trend on social media, including Facebook, has less to do with leftist human intervention than with the embedded biases of these technology companies’ algorithms.

No, these platforms are not biased against conservatives, but against low traffic.

Social media is not configured to generate a balanced perspective on politics or anything else. It has one and only one purpose: to generate attention, eyeballs, likes, reposts, and tweets. Facebook’s algorithms are programmed to spot posts that are already inflaming or titillating people, and then help those stories gain even more traction by highlighting them in people’s newsfeeds and the site’s list of trending topics.
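To make that mechanism concrete, here is a minimal, purely hypothetical sketch in Python of what an engagement-driven trending score could look like. The post fields, the weights, and the function names are my own illustration, not Facebook’s actual system; the point is only what such a score measures, and what it doesn’t.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    shares: int      # reposts in the past hour
    comments: int
    reactions: int

def engagement_score(post: Post, baseline: float) -> float:
    """Score a post by how far its recent engagement exceeds a typical baseline.

    Note what is absent: nothing here weighs accuracy, balance, or civic value,
    only how fast a post is pulling attention right now.
    """
    recent = 3 * post.shares + 2 * post.comments + post.reactions
    return recent / max(baseline, 1.0)

def trending(posts: list[Post], baseline: float, k: int = 10) -> list[Post]:
    # Surface the k posts drawing the most attention, which puts them in front
    # of more people and draws still more attention to them.
    return sorted(posts, key=lambda p: engagement_score(p, baseline), reverse=True)[:k]
```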

The result — what’s known as “power law dynamics” — means that one or two stories dominate, and the rest are seen by almost no one. It’s the same property of digital media that leads to the winner-take-all pop music scene, where there are a handful of superstars like Taylor Swift or Beyoncé and the rest make a much more modest living. It’s not that record producers are biased against one sort of music or another. It’s that the digital platforms on which music is played and sold tend to magnify existing trends.
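A toy simulation makes that dynamic visible. This is an assumption-laden sketch, not any platform’s real ranking code: it simply assumes most readers click whatever is already on a short trending list while a few discover stories at random, and then counts where the views end up.

```python
import random

def simulate_trending(stories: int = 100, readers: int = 50_000,
                      list_size: int = 2, boost: float = 0.8, seed: int = 7) -> list[int]:
    """Toy 'rich get richer' model: 80% of readers click a story already on the
    trending list; 20% stumble onto a story at random."""
    random.seed(seed)
    views = [1] * stories
    for _ in range(readers):
        trending_now = sorted(range(stories), key=lambda s: views[s], reverse=True)[:list_size]
        if random.random() < boost:
            story = random.choice(trending_now)      # amplified path
        else:
            story = random.randrange(stories)        # organic discovery
        views[story] += 1
    return sorted(views, reverse=True)

views = simulate_trending()
print(f"Top 2 of 100 stories capture {sum(views[:2]) / sum(views):.0%} of all views")
```

Under those assumptions, two of a hundred stories end up with the overwhelming majority of views, not because they were better stories but because they got onto the list first and stayed there.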

In social media news, the trend that gets magnified is almost always novelty. Things gain attention — they “go viral,” to quote my 1994 book — because they provoke an immune response. We humans are hardwired to pay attention to things that are weird, different, and potentially threatening.

A car crash leads to rubbernecking, even though road signs are probably more important to the trip. Likewise, the radical novelty of a candidate proposing to ban Muslims, insulting someone’s wife, or promising a border wall will generate more social media attention than a candidate talking about policy. When Mexico’s president likens Trump to Hitler, that comparison gets more attention than Glenn Beck calling Trump out for not being a true conservative.

Facebook’s human reviewers work in what amounts to a boiler room, where the trending topics list is curated and peppered with additional, potentially viral hits. This is less about mitigating the harsh, extremist, or sensationalist results of headlines derived by algorithms than about enhancing them. Did the machines miss something? Do we need a human-interest story in there to balance out the ISIS beheading?

Make no mistake: Facebook is not a news bureau. It is a business plan. The object of the game is to win traffic from Google, Amazon, Spotify, and CNN. Its algorithms don’t just exploit the natural human weakness for sensationalist novelty; they amplify and aggravate it.

A staff of sensible, thinking, human curators could compensate for runaway algorithms and the charged rhetoric they demand and inspire from us all. Instead of defending itself against charges of human bias, Facebook should start using and celebrating it.
