Facebook is taking steps to avoid a repeat of last month’s Napalm Girl controversy, while potentially opening itself up to a new headache.
Facebook announced Friday that it plans to scale back removal of posts that may violate its community standards against nudity and violence, so long as those posts are deemed to be in the public interest.
“In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards,” according to a blog post from Joel Kaplan, Facebook’s VP of global public policy, and Justin Osofsky, its VP in charge of media partnerships.
The shift comes one month after Facebook found itself at the center of an international outcry for preventing users from posting the iconic “Napalm Girl” picture. The image, one of the most famous war photographs in history, depicts a naked girl fleeing a napalm attack.
Facebook eventually reinstated the image under pressure, concluding that “the value of permitting sharing outweighs the value of protecting the community by removal.”
The company also came under fire this week after it removed a breast cancer awareness video in Sweden, deeming it too graphic. Facebook later apologized and said the video had been removed in error.
The announcement comes as Facebook employees reportedly debated whether to remove certain posts by Donald Trump for violating the company’s rules against hate speech.
Facebook’s policy shift may win praise as a common sense move, but it potentially puts the social network in the position of determining what is and is not newsworthy.
For years, Facebook has stressed that it is a technology company, not a media brand. It prefers to lean on clear community standards and algorithms to maintain order among its 1.7 billion users, rather than editorializing. Facebook even ditched some of its human editors recently.
Now, according to Friday’s post, Facebook plans to work “closely” with “publishers, journalists, photographers” and others “to do better when it comes to the kinds of items we allow.”
Facebook is staying vague for now on exactly how this will work. Who will be involved in determining whether a post is newsworthy? Will officials in some countries be able to deem critical posts not in the public interest? Will graphic but newsworthy posts carry a warning?
Reps for Facebook did not immediately respond to a request for comment. One thing seems clear, though: It will take more than just an algorithm to solve this problem.