Facebook explains exactly what it takes to get posts taken down


The social network is also expanding its appeals process around offensive posts, as Facebook tries to tackle toxic content.

Facebook is bringing more clarity to what is and isn't allowed on the world's largest social network.

On Tuesday, the company published a new version of its Community Standards, the guidelines that spell out what content is acceptable for its 2.2 billion users to post.

Facebook's rules themselves haven't changed. What's new is the publication of the detailed guidance its content moderators use to handle offensive material. Previously, users could see only surface-level descriptions of what they couldn't post. Now, the rules spell out how Facebook handles specific situations and clarify key definitions.

For example, the social network defines a "mass murder" as a killing that "results in 4 or more deaths in one incident." And the harassment section says people can't send a message that "calls for death, serious disease or disability, or physical harm," or that "claims that a victim of a violent tragedy is lying about being a victim."

Facebook is also expanding its rules around appeals. Where before you could appeal only if your Facebook profile, Page or Group had been taken down, you can now also challenge the social network over the removal of an individual piece of content. Users can likewise ask Facebook to review content they've reported as violating the company's rules.


"These standards will continue to evolve as our community continues to grow," Monika Bickert, Facebook's vice president of product policy and counterterrorism, said last week at a press briefing at the company's Menlo Park, California, headquarters. "Now everyone can see how we're instructing these reviewers."

Facebook has been on the hot seat since last month, when a scandal erupted involving Cambridge Analytica, a digital consultancy that improperly accessed data on up to 87 million Facebook users without their consent. The controversy has put several of Facebook's policies and practices under the microscope.

The new transparency around Facebook's Community Standards doesn't have anything to do with that controversy, however, according to Bickert.

"I've been in this job for five years," Bickert said. "We've wanted to do this for that entire time."

Hot topic

Facebook has faced pressure to clarify its moderation guidelines since the 2016 election, when Russian trolls flooded the platform with a combination of paid ads and organic posts to sow discord among American voters. Many critics have also called out the platform for what they see as bias.

When Mark Zuckerberg testified before Congress two weeks ago, lawmakers asked him again and again about what is, and isn't, permitted on Facebook.

Rep. David McKinley, a Republican from West Virginia, brought up illegal listings for opioids posted on Facebook. Other lawmakers asked why content from two African-American Trump supporters with 1.6 million Facebook followers had been restricted.

In November, Facebook said it would grow its ranks of content moderators to 20,000, up from 10,000 last year. In his testimony to Congress, Zuckerberg said the real breakthrough will come when artificial intelligence tools are able to proactively police the platform's content, though he acknowledged it will take "years" before that kind of technology is in working order.

Meanwhile, Bickert said Facebook's moderators do a good job overall of taking down prohibited material. Still, some things fall through the cracks.

"We have millions of reports every week," Bickert said. "So even if we maintain 99 percent accuracy, there's still going to be mistakes made."

Facebook may eventually expand its appeals process to include the opinions of people outside the company. In an interview with Vox earlier this month, Zuckerberg floated the concept of a Facebook "Supreme Court," made up of independent members who don't work for the company. Their role would be to make the final judgment on what's acceptable speech on Facebook.

Bickert said the company is "always exploring new options" for appeals.

Facebook also said it wants community input on how its guidelines evolve. In May, the company is launching a new forum called Facebook Open Dialogue to get feedback on its policies. The first events will take place in Paris, Berlin and the UK, with events later this year planned for the US, India and Singapore.
