Currently, people whose posts are taken down receive a generic message saying they violated Facebook's community standards. After Tuesday's announcement, they will be told whether their post violated guidelines on nudity, hate speech or graphic violence.
A Facebook executive said the teams were working on building more tools.
"We do want to provide more details and information for why content has been removed," said Ellen Silver, Facebook's vice president of operations. "We have more work to do there, and we are committed to making those improvements."
Though Facebook's content moderation is still very much driven by humans, the company does use technology to assist the work. It currently uses software to identify duplicate reports, a time-saving technique that spares reviewers from assessing the same piece of content over and over when it is flagged by many people at once.
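Facebook has not said how its deduplication actually works. As an illustration only, a minimal sketch of the idea, keyed on a hypothetical content identifier, could look like this:

```python
# Illustrative sketch only: Facebook has not published its implementation.
# Collapses many user reports about the same post into a single review job,
# so a post flagged by thousands of people enters the queue once.
from dataclasses import dataclass, field

@dataclass
class ReviewJob:
    content_id: str            # hypothetical identifier for the reported post
    reasons: set = field(default_factory=set)
    report_count: int = 0

def deduplicate_reports(reports):
    """reports: iterable of (content_id, reason) tuples from users."""
    jobs = {}
    for content_id, reason in reports:
        job = jobs.setdefault(content_id, ReviewJob(content_id))
        job.reasons.add(reason)
        job.report_count += 1
    # One queue entry per piece of content, however many times it was flagged.
    return list(jobs.values())

queue = deduplicate_reports([
    ("post-123", "hate_speech"),
    ("post-123", "hate_speech"),
    ("post-456", "nudity"),
])
print(len(queue))  # 2 review jobs instead of 3 raw reports
```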
Software can also identify the language of a post and some of its themes, helping route the post to the reviewer with the most relevant expertise.
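Silver did not describe the underlying software. A routing step of this kind can be sketched, purely as an illustration, with the open-source langdetect library standing in for whatever language identifier Facebook actually uses; the reviewer-pool names are hypothetical:

```python
# Illustrative sketch, not Facebook's code. Uses the open-source langdetect
# package (pip install langdetect) as a stand-in language identifier.
from langdetect import detect

# Hypothetical mapping from detected language to a reviewer pool with the
# relevant linguistic and cultural expertise.
REVIEWER_POOLS = {
    "en": "english-team",
    "de": "german-team",
    "ar": "arabic-team",
}

def route_report(post_text: str) -> str:
    language = detect(post_text)  # e.g. "de" for German text
    return REVIEWER_POOLS.get(language, "generalist-team")

print(route_report("Dies ist ein gemeldeter Beitrag."))  # -> german-team
```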
Facebook's systems can recognise images and videos that have been posted before, but they cannot judge new material on their own.
For example, if a terrorist organisation reposts a beheading video that Facebook already took down, the systems will notice it almost immediately, Silver said, but they cannot identify a beheading video they have never seen. The majority of items flagged by the community are reviewed within 24 hours, she said.
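A toy version of this matching idea, using an exact cryptographic hash for simplicity (production systems rely on perceptual hashes so that re-encoded or lightly edited copies still match), might look like this:

```python
# Illustrative sketch: matching re-uploads against a database of content that
# was already removed. A cryptographic hash only catches byte-identical
# copies; real systems use perceptual hashing to survive re-encoding.
import hashlib

removed_hashes = set()  # fingerprints of media already taken down

def fingerprint(media_bytes: bytes) -> str:
    return hashlib.sha256(media_bytes).hexdigest()

def on_takedown(media_bytes: bytes) -> None:
    removed_hashes.add(fingerprint(media_bytes))

def is_known_violation(media_bytes: bytes) -> bool:
    # Near-instant match for re-uploads; a brand-new video produces an
    # unknown hash and still needs a human (or classifier) to judge it.
    return fingerprint(media_bytes) in removed_hashes

on_takedown(b"...bytes of a removed video...")
print(is_known_violation(b"...bytes of a removed video..."))  # True
print(is_known_violation(b"a video never seen before"))       # False
```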
Every two weeks, the employees and senior executives who make decisions about the most challenging content issues around the world meet to debate the pros and cons of potential policies. Teams that present are required to bring research laying out each side of an issue, a list of possible solutions and a recommendation, and to list the organisations outside Facebook they consulted.
In an interview, Bickert and Silver acknowledged that Facebook would continue to make errors in its judgment.
"The scale that we operate at," Silver said. "Even if were at 99 per cent accuracy, that's still a lot of mistakes."
Facebook Shed Some More Light on the People Behind Its Content-Review Process
Vice president of operations Ellen Silver said in her Hard Questions post that ensuring the safety of Facebook's content reviewers is one of the reasons why details have been scarce, writing, "As we saw with the horrific shooting at YouTube's headquarters earlier this year, content reviewers are subject to real danger that makes us wary of sharing too many details about where these teams are located or the identities of the people who review. The day-to-day challenges our reviewers face and the obscurity of their work often leads to confusion about what our reviewers actually do and whether they're safe doing it."