
Instagram Might Finally Free the Nipple After Meta Advises Overhaul on Nudity Rules


Instagram and Facebook have been advised to end their censorship of bare breasts and chests for all genders following a decision from the oversight board of parent company Meta. The panel, made up of academics, politicians, and journalists, per The Guardian, overturned Instagram's removal of a post by a transgender and nonbinary couple (who must remain anonymous due to the oversight board's involvement) showing their bare chests. The images depicted the couple posing topless with their nipples covered, alongside captions describing trans health care and raising money for top surgery.

In overturning the removal of the posts, the board also recommended that Meta change its current rules on censorship, “so that it is governed by clear criteria that respect international human rights standards.”

Referring to Meta’s Adult Nudity and Sexual Activity Community Standard, the report states that the policy “prohibits images containing female nipples, other than in specified circumstances, such as breastfeeding and gender confirmation surgery,” and is therefore based on a binary view of gender. It reads: “Such an approach makes it unclear how the rules apply to intersex, nonbinary, and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”

It adds that the restrictions and exceptions to these censorship rules, which include protests, scenes of childbirth, and other medical and health contexts such as top surgery, are not always clear to moderators and create uncertainty for Facebook and Instagram users.

“Here, the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender nonbinary people on its platforms,” the report states. “For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies.”

The panel concluded by advising that Meta define clearer and more "rights-respecting" criteria for its Adult Nudity and Sexual Activity Community Standard, without discrimination on the basis of sex or gender. It also noted that the policy should protect against nonconsensual image sharing and questioned whether other rules need to be strengthened in this regard. Questions have also been raised as to how a new, more inclusive policy will deal with concerns like child protection and pornography.
