Facebook wants to be more transparent about what is and isn't allowed on the world's largest social network.
On Tuesday, the company released an updated version of its Community Standards guidelines -- the rules that dictate what's acceptable content for its 2.2 billion users to post.
Facebook's rules themselves haven't changed. What's new is the release of the comprehensive guidelines its content moderators use to handle objectionable material. Previously, users could only see surface-level descriptions of what they couldn't post. Now, the rules provide details on how Facebook deals with specific situations and defines certain terms.
For example, the social network says it defines a "mass murder" as a homicide that "results in 4 or more deaths in one incident." And in the section on harassment, Facebook says people can't send a message that "calls for death, serious disease or disability, or physical harm" or "claims that a victim of a violent tragedy is lying about being a victim."
Facebook is also expanding its rules around appeals. Previously, you could request an appeal only if your Facebook profile, Page or Group was taken down. Now you can also challenge the social network over the removal of an individual piece of content. Users can likewise appeal Facebook's decision to leave up content they'd reported as a violation of the company's rules.
"These standards will continue to evolve as our community continues to grow," Monika Bickert, vice president of product policy and counterterrorism, said last week at a press briefing in Facebook's Menlo Park, California, headquarters. "Now everybody out there can see how we're instructing these reviewers."
Facebook has been in the hot seat since last month's scandal involving Cambridge Analytica, a digital consultancy that improperly accessed data on up to 87 million Facebook users without their consent. The controversy has put several of Facebook's policies and practices under the microscope.
Bickert says the new transparency around Facebook's Community Standards doesn't have anything to do with that controversy, however.
"I've been in this job for five years," Bickert said. "We've wanted to do this for that entire time."
Hot seat
Facebook has been under pressure to clarify its moderation guidelines since the 2016 election, when Russian trolls abused Facebook with a combination of paid ads and organic posts to sow discord among American voters. Many conservatives have also criticized the platform for what they see as political bias.
When Mark Zuckerberg was grilled by Congress two weeks ago, lawmakers repeatedly asked him about what is -- and isn't -- allowed on Facebook.
Rep. David McKinley, a Republican from West Virginia, mentioned illegal listings for opioids posted on Facebook, and asked why they had not been taken down. Other Republican lawmakers asked why the social network had removed posts by Diamond and Silk, two African-American Trump supporters with 1.6 million Facebook followers.
In November, Facebook said it would add 20,000 content moderators, up from 10,000 last year. In his testimony to Congress, Zuckerberg said the real breakthrough will come when artificial intelligence tools are able to proactively police the platform's content, although it will take "years" before that kind of technology is good enough.
In the meantime, Bickert said Facebook's moderators do a good job overall of taking down inappropriate material. Still, some things fall through the cracks.
"We have millions of reports every week," Bickert said. "So even if we maintain 99 percent accuracy, there's still going to be mistakes made."
Facebook has also talked about further expanding its appeals process to include opinions of people outside the company. In an interview with Vox earlier this month, Zuckerberg mentioned the idea of a Facebook "Supreme Court," made up of independent members who don't work for the company. Their role would be to make the "final judgement call" on what's acceptable speech on Facebook.
Bickert didn't address that idea last week, but said the company is "always exploring new options" for appeals.
Facebook also said it wants community input on how its guidelines should evolve. In May, it's launching a new forum called Facebook Open Dialogue to get feedback on its policies. The first events will take place in Paris, Berlin and the UK. Events in the US, India and Singapore are planned for later this year.