
Facebook Bug Exposed Names of Content Moderators to Terrorists -- Update

17/06/2017 2:15am

Dow Jones News


By Georgia Wells

Facebook Inc. inadvertently exposed the names of some moderators to suspected terrorists and other groups whose content the workers were tasked with reviewing, a flaw the company said Friday has been fixed.

About 1,000 of Facebook's moderators were affected by the flaw, which disclosed their names in an activity log, a spokesman said. Clicking on a name, though, would take the viewer to the public version of the moderator's Facebook profile page. In the vast majority of cases, he said, moderator names weren't viewed by administrators of these groups.

Facebook investigators believe suspected terrorists may have viewed the profiles of fewer than six workers, the spokesman said. According to the investigators, none of the cases involved suspected members of the terror group ISIS, the spokesman said.

The problem began in the fall and was fixed in November, the spokesman said. News of the flaw and subsequent fix was reported earlier by the Guardian.

In response, Facebook made a number of changes to prevent workers' information from becoming available externally again. The company is also testing new accounts that won't require workers to log in with personal Facebook accounts.

"As soon as we learned about this issue, we fixed it and began a thorough investigation to learn as much as possible about what happened," the Facebook spokesman said.

The fumble comes as Facebook is under scrutiny to do more to police inappropriate content. The company has been leaning more on artificial intelligence in recent months to block potential terrorist posts and accounts on its platform without requiring reviews by human moderators.

One tool combs Facebook for known terrorist imagery, such as beheading videos, to stop them from being reposted. Another set of algorithms attempts to identify and block propagandists from creating new accounts after they have already been kicked off the social network.

Facebook's previous attempts to replace humans with algorithms haven't always succeeded. In August, the company put an algorithm in charge of its "trending" feature, but within days the lists featured false stories and celebrity gossip in place of serious news.

Currently, moderators do much of the work at Facebook deleting content deemed to be in violation of Facebook's terms of service, such as hate speech and child exploitation. In May, Facebook Chief Executive Mark Zuckerberg said the company would hire 3,000 more staffers to review content in an attempt to curb violent or sensitive videos.

Typically, these actions don't appear on Facebook's timeline or logs. But because of a bug introduced last fall, when a moderator revoked the privileges of a group administrator, a note of this action was created in the activity log for the group.

Write to Georgia Wells at Georgia.Wells@wsj.com


(END) Dow Jones Newswires

June 16, 2017 21:00 ET (01:00 GMT)

Copyright (c) 2017 Dow Jones & Company, Inc.
