
Meta’s Oversight Board—an independent group responsible for overseeing Facebook and Instagram’s content moderation policies—wants the company to change its long-standing nudity policy to make it more inclusive and respectful of human rights. The recommendation comes after the Oversight Board overturned Meta’s original decision earlier this month to remove two Instagram posts that depicted transgender and non-binary people with bare chests.

The case was brought to the Oversight Board by a US couple who identify as transgender and non-binary. In 2021 and 2022 they posted two images on Instagram where, according to the Board’s decision, they were “bare-chested with the nipples covered.” The captions discussed transgender healthcare and said the couple were fundraising and selling t-shirts so one of them could undergo top surgery—gender-affirming surgery that generally involves the removal of breast tissue.

After a series of alerts from Meta’s automated content moderation systems and reports from users, the posts were “reviewed multiple times for potential violations of various Community Standards” by the human moderation team. In the end, both posts were removed for violating the Sexual Solicitation Community Standard—which is intended to stop sex workers from soliciting payments—“seemingly because they contain breasts and a link to a fundraising page.”

The couple appealed the content moderation decision to Instagram and then to the Oversight Board on the basis that the reason given for the removals did not match the actual intention of the posts. After the Board accepted the two cases, Meta’s moderation team decided it had been wrong to remove the posts and restored them. This was too little, too late for the Board, which heard the cases anyway in order to give broader recommendations on Meta’s nudity policies.

The decision released this week found in the couple’s favor. The Oversight Board decided that removing the posts was “not in line with Meta’s Community Standards, values or human rights responsibilities,” and highlighted “fundamental issues with Meta’s policies.” It found that Meta’s internal guidance to moderators on the Sexual Solicitation policy was broader than the policy’s stated rationale and publicly available guidance.

The Oversight Board also found that the Adult Nudity and Sexual Activity Community Standard—which “prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery”—is inappropriately based on a binary view of gender. The distinction between male and female bodies makes it unclear to both users and moderators “how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender.” Regardless of the ethics of the situation, the Board highlighted that such an approach is “not practical when moderating at scale.”

Similarly, the Board called the restrictions and exceptions to the rules on showing female nipples “confusing, particularly as they apply to transgender and non-binary people.” Female nipples are allowed to be shown as part of a protest, during childbirth, and in medical and health contexts (including top surgery), but not while someone is at the beach or in other contexts where anyone may “traditionally go bare-chested.” It argues that, as these cases show, “Meta’s policies on adult nudity result in greater barriers to expression for women, trans and gender non-binary people on its platforms” and that LGBTQI+ people can be “disproportionally affected.”

As well as overturning Meta’s original decision to remove the posts, the Board had three recommendations for improving the company’s policies around nudity, LGBTQI+ expression, and nipples in general. 

First, Meta should “define clear, objective, rights-respecting criteria to govern its Adult Nudity and Sexual Activity Community Standard, so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.” Second, it should “provide more detail in its public-facing Sexual Solicitation Community Standard on the criteria that leads to content being removed.” Finally, it should “revise its guidance for moderators on the Sexual Solicitation Community Standard so that it more accurately reflects the public rules on the policy,” which could help reduce the number of enforcement errors.

All in all, it’s a pretty clear win for free expression—though as TechCrunch notes, if some of the Board’s recommendations are taken to the fullest extent, it could result in some pretty major changes to how nudity is moderated on Facebook and Instagram. Automatically presuming that nude female, transgender, and non-binary bodies are sexually suggestive while male bodies are not is at odds with the kind of gender-neutral policies that international human rights standards call for. 

Meta says that it welcomes the Oversight Board’s decision and that it has already reinstated the affected content. It says it will review the Board’s recommendations and issue an update once it decides how to move forward.