Civics: “Content Moderation” and Facebook

Casey Newton:

For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat’s face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook “as long as the skin is translucent.” He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.

Early on, Speagle came across a video of two women in North Carolina encouraging toddlers to smoke marijuana, and helped to notify the authorities. (Moderator tools have a mechanism for escalating issues to law enforcement, and the women were eventually convicted of misdemeanor child abuse.) To Speagle’s knowledge, though, the crimes he saw every day never resulted in legal action being taken against the perpetrators. The work came to feel pointless, never more so than when he had to watch footage of a murder or child pornography case that he had already removed from Facebook.

In June 2018, a month into his job, Facebook began seeing a rash of videos that depicted organs being harvested from children. So many graphic videos were reported that they could not be contained in Speagle’s queue.

Related: The First Amendment to the United States Constitution prevents Congress from making any law respecting an establishment of religion, prohibiting the free exercise of religion, or abridging the freedom of speech, the freedom of the press, the right of the people to peaceably assemble, or the right to petition the government for a redress of grievances.