After weeks of demonstrations and unrest in the U.S. and abroad, Facebook chief executive Mark Zuckerberg said that, although the company has policies on handling content related to violence and civil unrest, "there may be additional policies or integrity measures to consider around discussion or threats of state use of force when a country is in this state." The social media giant will also review its policies for countries experiencing violent conflict and civil unrest. Facebook also removed almost 200 accounts linked to white supremacist groups.
The Wall Street Journal reports that Zuckerberg made this announcement “after the company’s employees staged a virtual walkout Monday over [his] decision to leave up a post from President Trump about the recent social unrest.” In that post, Trump called looters thugs and stated, “when the looting starts, the shooting starts.” In response, “critics say the president’s post violated Facebook’s rules about inciting violence.”
Zuckerberg said the policy — which refrains from fact-checking or removing politicians’ posts, although it will remove posts that “glorify violence and spread voter misinformation” — is “principled and reasonable.”
On Twitter, more than a dozen Facebook employees have spoken out against Zuckerberg's statements. Facebook said it would look for solutions beyond the binary choice of leaving a post up or taking it down, and also "work on products that advance racial justice." Facebook app head Fidji Simo has been named to "take charge of the [latter] project." The company has also vowed to create a voter hub to provide "access to accurate and authoritative information about voting."
The Verge reports that, according to the Associated Press, the company has "removed almost 200 accounts connected to white supremacist groups that were trying to rally supporters to attend protests over police violence against black people." The accounts were linked to two groups Facebook had already banned: Proud Boys and American Guard. Facebook spokespeople said that they "started to see posts encouraging people to attend protests in person … some [of whom] were preparing to go with weapons."
Elsewhere, The Verge reports that "Facebook published new recommendations for group admins" in light of difficult discussions about the Black Lives Matter movement. Many admins and moderators have deleted posts they consider "political," which in turn has splintered long-standing groups and led some to shut down temporarily.
Among its suggestions, Facebook recommends that “leaders create specific lists of topics that aren’t allowed” (including legislation and political candidates) and that “admins educate themselves on the issues … [and] create opportunities for new and diverse members” to join the moderation team.
Related:
The Complex Debate Over Silicon Valley’s Embrace of Content Moderation, The New York Times, 6/5/20
NYU Study: Facebook’s Content Moderation Efforts Are ‘Grossly Inadequate’, VentureBeat, 6/7/20
The Racial Ugliness on Facebook and Twitter Show America as It Really Is, Bloomberg, 6/8/20