Facebook Says It Has Shut Down 5.4 Billion Fake Accounts This Year


Facebook removed more than 11.6 million pieces of content depicting child nudity and sexual exploitation of children on Facebook and 754,000 pieces on Instagram during the third quarter. Following the report's release, Chief Executive Officer Mark Zuckerberg said that the company, the world's biggest social network, gets unfairly criticized for reporting large takedown numbers, but that the figures actually show Facebook is taking these problems more seriously than its competitors do.

The disclosure highlights the scale of the challenge before Facebook as it prepares for a high-stakes election season in the United States, as well as the 2020 U.S. census.

The company attributed that gap to differences between Instagram and Facebook, which has more easily scannable text content, and to the difficulty of distinguishing genuinely harmful content from candid posts by people describing their own experiences with mental health. "What it says, if anything, is that we're working harder to identify this and take action on it and be transparent about that than what any others are," Zuckerberg said.

In a blog post, Facebook also said it took down millions of pieces of content that violated copyrights. That was up from 841,000 and 609,000 pieces, respectively, six months earlier. Instagram removed 92.2% of terrorism-related content using software algorithms.

Washington, DC (CNN Business) So far this year, Facebook has shut down 5.4 billion fake accounts on its main platform, but millions likely remain, the social networking giant said Wednesday.

The Menlo Park, California-based firm said that over the past six months or so, it has improved its "ability to detect and block attempts to create fake, abusive accounts".

Facebook removed almost 10,000 images related to suicide and self-harm from Instagram every day in the months following the Molly Russell scandal, but it still relies on users to report one in five.
