Facebook also says it protects minors who use the social network. It will comply with requests for:
Removal of underage accounts (users must be at least 13 years old to use Facebook).
"Government requests for removal of child abuse imagery depicting, for example, beating by an adult or strangling or suffocating by an adult."
"Legal guardian requests for removal of attacks on unintentionally famous minors."
Facebook finally explains why it bans some content, in 27 pages
Elizabeth Dwoskin and Tracy Jan
Moderators review content on the site around the clock, following strict guidelines. Among the most challenging issues for Facebook is its role as the policeman for the free expression of its 2 billion users.
Now the social network is opening up about how it decides which posts to take down - and why. For the first time, the company has published the 27-page guidelines, called Community Standards, which it gives to its workforce of thousands of human censors.
Facebook released the number of people whose information may have been improperly shared with Cambridge Analytica last week. It did not include the number of New Zealanders.
The set of guidelines encompasses dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would offer users the opportunity to appeal Facebook's decisions.
Facebook is shining a light on how and why it bans some content.
The newly released guidelines offer suggestions on topics including how to determine the difference between humour, sarcasm and hate speech. They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breast-feeding or address breast cancer.
"We want people to know our standards, and we want to give people clarity," Monika Bickert, Facebook's head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. "We are trying to strike the line between safety and giving people the ability to really express themselves."
[CEO Mark Zuckerberg has had to testify before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill in Washington, Tuesday, April 10, 2018, about the use of Facebook data to target American voters in the 2016 election.]
The company's censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs. Moderators have struggled to tell the difference between someone posting a slur as an attack and someone using the slur to tell the story of their own victimisation.
In another instance, moderators removed an iconic Vietnam War photo of a child fleeing a napalm attack, claiming the girl's nudity violated its policies. The photo was restored after protests from news organisations.