Facebook announced that it will update its community standards to include information about its so-called satire exception in content moderation. The change was made in response to a recent decision by its Oversight Board, which required the company to reinstate a comment featuring an adaptation of the “two buttons” meme that commented on Turkey’s response to the Armenian genocide.
This week, in a news announcement about the case, Facebook said the update would allow its teams to consider satire when assessing potential hate speech violations, which was the issue at the heart of the two buttons meme case. The company said the update would be completed by the end of the year. Of the board’s recommendations in the case, Facebook is fully implementing only the one on the satire exception; it is still assessing the feasibility of the others.
In response to one of the board’s other recommendations, which advised Facebook to ensure it has adequate procedures in place to “assess satirical content and relevant context properly including by providing content moderators with additional resources,” the company revealed that it had already been working on a new satire framework for its regional and escalation teams. However, it is currently determining how to “apply this review at scale.”
Facebook said the stakeholders it engaged with for its framework, ranging from academic experts and journalists to comedians and representatives of satire publications, pointed out that humor and satire are highly subjective across people and cultures. The company was also told that it is important to institute human review of humor and satire by individuals with cultural context.
“Given the context-specific nature of satire, we are not immediately able to scale this kind of assessment or additional consultation to our content moderators,” Facebook said. “We need time to assess the potential tradeoffs between identifying and escalating more content that may qualify for our satire exception, against prioritizing escalations for the highest severity policies, increasing the amount of content that would be escalated, and potentially slower review times among our content moderators.”
The comment with the meme was posted by a Facebook user in the U.S. in December 2020. Although some, like yours truly, might scratch their heads at the description of the “two buttons” meme, there’s a good chance you’ve seen it. Created by Jake Clark in 2014, the meme is a horizontal split-screen cartoon: the top panel shows a hand poised to press one of two red buttons, while the bottom panel features a cartoon character, hand on head and sweating, agonizing over which button to choose.
People routinely adapt the meme with crazy, dumb, and benign text over each of the buttons. Memerino highlights some great ones, such as “get to eat a turkey” and “get to eat a ham”; “infinite power” and “the Marvel superheroes”; or “add more Easter eggs” and “make more games,” to name a few.
In this case, as described by the Oversight Board, the U.S. Facebook user replaced the cartoon character’s face in the lower panel with a Turkish flag. In the panel above, the user included two choices: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.” The meme was preceded and followed by the thinking face emoji, the board said.
According to the Oversight Board, Facebook said it removed the comment because the phrase, “The Armenians were terrorists that deserved it,” contained assertions that Armenians were criminals based on their nationality and ethnicity, which violates the company’s community standard for hate speech. Facebook also claimed that the meme was not covered under an exception that allows users to share hateful content to condemn it or raise awareness, as the cartoon character could be seen as “condemning or embracing the two statements featured in the meme.”
The majority of Facebook’s Oversight Board disagreed, though, finding that the meme was covered by this exception and overturning the company’s decision on the matter.