ACLU accuses Facebook of unfairly targeting minority groups

[Photo caption: Civil rights groups claim that content from activists is routinely removed from Facebook, while the company seemingly fails to “prevent the spread of violent threats and harassment by white supremacist hate groups” on the site. Credit: Christophe Morin/Bloomberg]
By Luke Stangel – Contributing writer, Silicon Valley Business Journal

Seventy-seven civil rights groups, including the ACLU, are urging Facebook to voluntarily publish data on its content moderation efforts, including how many posts are flagged each day and what percentage of flagged posts moderators remove.

In a new letter this week, the group said content from civil rights activists is routinely removed from Facebook (Nasdaq: FB), while the company seemingly fails to “prevent the spread of violent threats and harassment by white supremacist hate groups” on the site.

Facebook uses a combination of algorithms, community flagging and human moderation to identify and remove content that violates its community standards. The problem is, some people use Facebook’s community flagging tools to flag content they disagree with politically, or simply don’t like.

Facebook doesn’t currently publish data on its content moderation, so it can feel like there’s a double standard, the coalition said.

“Activists in the Movement for Black Lives have routinely reported the takedown of images discussing racism and during protests, with the justification that it violates Facebook’s Community Standards,” the coalition wrote. “At the same time, harassment and threats directed at activists based on their race, religion, and sexual orientation is thriving on Facebook.”

Facebook’s content moderation policies took center stage in September, after the site removed the Vietnam-era “Napalm Girl” photo, which shows a naked child screaming after being burned by napalm. The site later reinstated the photo.

The coalition called on Facebook to do four things:

  • Allow people to appeal content moderation decisions on individual posts, photos and videos that get flagged and removed.
  • Tell people why their content was removed, citing specific sections of the company’s community standards.
  • Retrain human content moderators to address implicit and explicit racial bias.
  • Begin publishing a new report on content moderation with data around how many posts are being flagged, by whom, and how many are being removed.

The group’s concerns stretch back to its first letter to the company in October. Facebook responded to that initial letter by saying it had reviewed its moderation practices and found them adequate.

Read the full letter here.