How social-media platforms dispense justice

The Economist:

Every other Tuesday at Facebook, and every Friday at YouTube, executives convene to debate the latest problems with hate speech, misinformation and other disturbing content on their platforms, and decide what should be removed or left alone. In San Bruno, Susan Wojcicki, YouTube’s boss, personally oversees the exercise. In Menlo Park, lower-level execs run Facebook’s “Content Standards Forum”.

The forum has become a frequent stop on the company’s publicity circuit for journalists. Its working groups recommend new guidelines on what to do about, say, a photo showing Hindu women being beaten in Bangladesh that may be inciting violence offline (take it down), a video of police brutality while race riots are taking place (leave it up), or a photo alleging that Donald Trump wore a Ku Klux Klan uniform in the 1990s (leave it up but reduce its distribution, and inform users it’s a fake). Decisions made at these meetings eventually filter down into instructions for thousands of content reviewers around the world.