Facebook Reveals Its Internal Rules For Removing Controversial Content

Silhouettes of mobile users are seen next to a screen projection of Facebook logo in this


The company admits that its enforcement "isn't ideal". The ideal content moderator — a person hired with care, by someone else, probably — follows the rules to a T, yet is keenly attuned to every nuance.

Currently, people who have their posts taken down receive only a generic message saying they have violated Facebook's community standards. Under the new process, the company promises, appeals will be reviewed by a moderator within a day. In 2016, NPR's Aarti Shahani detailed a workforce comprised primarily of subcontractors stationed in distant countries and asked to review large quantities of posts every shift; five months later, the reported figure was bumped up to 4,000.

In response, the company has said it will double its 10,000-person safety and security team by the end of this year. Reports are reviewed by an operations team made up of more than 7,500 content reviewers spanning over 40 languages.

How Will Facebook's Appeal Process Work?

This seems to be another push by Facebook to show just how much content it has to moderate on its platform, following recent criticism from Congress over the alleged limiting of certain Facebook pages, such as that of conservative YouTube stars Diamond and Silk. Given that engagement from Facebook India is tightly controlled and barely existent, the appeals process will be useful for many complainants. ProPublica obtained documents last year regarding Facebook's community standards showing that its policies are far deeper and more nuanced than what the social media platform shared on Tuesday.


Facebook for years has had "community standards" for what people can post. Until now, however, it had never revealed the direct guidelines its moderators use to police harassment, spam, violence and other abuses.

In the "adult nudity and sexual activity" section under "Objectionable Content", too, there are plenty of provisos. Software can also identify the language of a post and some of its themes, helping route the post to the reviewer with the most expertise.

Facebook's disconcerting treatment of the class of "employees" so crucial to its daily functioning goes far beyond stressful workplace conditions. "I don't have a job, I have anxiety and I'm on antidepressants".

Quartz noted that some of the new rules were "clearly developed in response to a backlash Facebook received in the past".

A 1996 federal law shelters most technology companies from legal liability for the content users post on their services.


Amid a series of unfolding humanitarian crises, Facebook has been under pressure to improve content moderation around the globe, and its response has been sporadically reactive at best. Some groups will surely find points to take issue with, but Facebook has made some significant improvements.

Though she (and dozens of others) reported the harassment to Facebook, absolutely nothing was done; that is, until she chose to write about it for HuffPo and asked Facebook for comment on the whole hellish ordeal in her official capacity as a reporter.

And of course, humans make plenty of mistakes themselves. It's a couple thousand undervalued and overworked contractors against two billion users' worth of disturbing content.

The company believes that with these rules published online, users who don't want to be blocked from the social network will be more careful and polite. The people being hired to tame this behemoth are set up to fail, and the only thing that's going to fix it is systemic change.

Siobhan Cummiskey, Facebook's head of policy for Europe, the Middle East and Africa, admitted the company's enforcement of policy violations isn't ideal but insisted Facebook had the interests of its users at heart and plans to hire additional content reviewers to beef up its 7,500-strong team worldwide.

