Facebook Updates its Community Values to Better Frame its Policy Decisions

The social media landscape is always changing, which means the social platforms themselves need to continually update their approach, and ensure that their policies match shifting user expectations.
This week, Facebook has outlined its latest update to its Community Standards – “the guidepost for what is and isn’t allowed on Facebook”. And with The Social Network being blamed for everything from spreading misinformation to facilitating hate speech, it’s now more important than ever that Facebook gets such policies right, enabling it to take action on concerning content and protect its users.
As explained by Facebook:
“The goal of our Community Standards is to create a place for expression, and give people voice. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content which would otherwise go against our Community Standards – if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments.”
That sounds a lot like Twitter’s justification for letting certain users get away with violating platform rules. While the platforms do need to adhere to certain standards, ‘newsworthy and in the public interest’ qualifiers like this act as an asterisk – an exception the platforms can fall back on to leave certain discussion points active, even when they violate the actual rules. That also means the platforms can benefit from the engagement such content fuels, which raises questions about the viability of the process.
Yes, excepted content needs to be in the ‘public interest’, and there is value in that transparency. But maybe, if the platforms enforced their rules more stringently, they wouldn’t be providing an outlet for comments that can deepen societal divides.
In that case, you have to question whether such exceptions really serve the public interest, or the company’s interest in boosting its usage stats.
In terms of more specific changes to its values, Facebook says that it’s keeping its focus on giving everyone a voice, while also adding new elements which it will use as guideposts for its decisions around content removals and restrictions:
- Authenticity – We want to make sure the content people are seeing on Facebook is authentic. We believe that authenticity creates a better environment for sharing, and that’s why we don’t want people using Facebook to misrepresent who they are or what they’re doing.
- Safety – We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.
- Privacy – We are committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, and to choose how and when to share on Facebook and to connect more easily.
- Dignity – We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.
Some of those are a little vague – how you define ‘dignity’ from a policy standpoint is pretty complex.
In fact, Facebook acknowledges this in its further notes, saying:
“We recognize that words mean different things or affect people differently depending on their local community, language or background. We work hard to account for these nuances while also applying our policies consistently and fairly to people and their expression.”
In essence, Facebook’s new Community Values provide more of a structure around the types of activity and content it will take action against. It’s difficult to provide concrete rules for every case, but these are the guideposts that Facebook will be using to action concerns as they arise.
Of course, another consideration here is Facebook’s capacity to enforce these standards. With 2.4 billion users, you can only imagine how many reports Facebook’s moderation team is handling every day, which leads to delays in response, particularly for ‘low priority’ issues. But the importance of each report is relative to the individual who filed it, adding another layer of complexity to Facebook’s enforcement process.
No one has a perfect system for this, but Facebook’s looking to provide more oversight of exactly why it takes action, and what questions its moderators ask when actioning concerns.
As a policy overview, it makes sense, and covers the key elements. But as an actual, prescriptive guide to enforcement, a lot of gray areas remain, and always will. That’s likely compounded by the aforementioned exceptions, but it does provide some guidance as to what you can expect when reporting content of concern.