News that YouTube removed videos from war-ravaged Syria troubled advocates working to preserve digital records of one of the world’s worst humanitarian crises.
It also raised fresh questions about the fine line YouTube and other tech companies walk in moderating violent and shocking content — which is sometimes vital to chronicling human suffering or even documenting violations of international law.
Google-owned YouTube relies on people and technology to monitor its community for content that violates its policies on violent and graphic content.
In June, it announced it would turn more to technology to identify extremist and terrorism-related videos. Shortly after, YouTube started flagging Syria-related posts, some advocates told CNN Tech.
Airwars, which monitors international airstrikes in places like Iraq, Syria and Libya, was one of the organizations affected.
According to its director, Chris Woods, at least 10 Airwars videos have been removed in recent weeks. Some of them had been on YouTube for three years without issue. After Airwars complained publicly on Twitter, the videos were reinstated, Woods said. But some now carry an “age restriction,” meaning users must be 18 or older to watch.
“It’s still unclear the extent of the damage done,” Woods told CNN Tech. Other archivists have reported similar experiences, he added.
“We don’t know how much material has been lost, how many playlists are permanently lost,” Woods said. “[The videos] may be key to restitution and reconciliation, war crimes investigations, and compensation payments. They have a profound value both for the broader community but also to Syrians themselves.”
Eliot Higgins, founder of an “open source and social media investigation” site, told CNN Tech that beginning in July, videos he had posted years ago started being flagged by YouTube.
Higgins’ YouTube account was suspended. When it was restored, all of his videos and playlists (collections that group videos for viewers) had been deleted. After a back-and-forth with YouTube, the content was restored, he said.
“I don’t think this was a malicious act by them — I think it was just an imperfect system rearing its ugly head,” Higgins said. “This flagging system is picking up videos that in some cases shouldn’t be flagged.”
YouTube concedes that its systems sometimes don’t work as they should.
“With the massive volume of videos on our site, sometimes we make the wrong call. When it’s brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it,” a YouTube spokeswoman said in a statement.
The New York Times reported on the removal of “thousands” of Syria-related videos. It cited YouTube’s new machine learning technology, which the company said helps detect potentially problematic posts and speed up the review process.
Despite tech’s increased role in moderating content at YouTube, people still review almost every post flagged before a decision is made to remove it. The company has said that context is “king” in helping moderators understand the purpose of posting a video. It also says it allows graphic videos that have “news or documentary value.”
Regardless, Keith Hiatt, vice president of the nonprofit Benetech, said “any mistake humans make, machines are going to make, too.”
After all, humans write the algorithms. That’s why Hiatt argues for “algorithmic transparency,” a push to hold technology accountable through research and analysis. Hiatt focuses on human rights at Benetech, which applies technology to social good.
The digital archiving of historic events is especially significant in places like Syria, which has endured six-and-a-half years of civil war.
Last week, the International Criminal Court issued an arrest warrant for a senior Libyan commander accused of executing prisoners in Benghazi. Evidence against him includes videos that circulated on social media.
Hiatt, who also sits on the ICC’s technology advisory board, likened the Syria videos to preserved newsreel footage of the liberation of concentration camps, which stands as evidence of the Holocaust.
“It’s not like if [companies] mess up they affect 10 people. If they mess up they might affect continents,” Hiatt added.