Outcast World
Queer politics  ·  sex  ·  culture

THE NEW DIGITAL CLOSET: HOW META BUILT THE WORLD'S BIGGEST PLATFORM AND THEN TOLD QUEER PEOPLE THEY WEREN'T WELCOME IN IT


There is a particular cruelty in building something that three billion people depend on for connection, community, and commerce, and then quietly, systematically making it hostile to an entire section of those people, not with a dramatic announcement or a press conference but through the slow, opaque, deniable machinery of algorithmic enforcement, content moderation failures, and corporate cowardice that is almost impossible to challenge because it is almost impossible to prove. What Meta has constructed over the past eighteen months is, in effect, a new digital closet, assembled not with bricks and mortar but with shadow bans, ad rejections, and account removals, and LGBTQ+ people are being shoved back inside it while the company insists, with the blandly sociopathic conviction of a Silicon Valley earnings call, that nothing has changed.

Since October 2025, around fifty reproductive rights organisations, queer groups, and sexual health providers have reported having their accounts restricted, shadow-banned, or outright removed across Facebook, Instagram, WhatsApp, and Threads. Repro Uncensored, an NGO that tracks digital censorship against movements focused on gender, health, and justice, documented over 210 incidents of account removals and severe restrictions in a single year, up from 81 the year before. The affected organisations span the globe, from the UK and Europe to Asia, Latin America, and the Middle East, and the removals have included abortion helplines operating in countries where abortion is entirely legal, queer and sex-positive content that contains nothing remotely explicit, and the restriction of non-explicit nudity and basic sexual health information that has been available on these platforms without incident for years. In one particularly telling incident, a Meta staff member reportedly contacted an affected organisation privately and suggested they start a mailing list instead of relying on the platform, which tells you everything you need to know about what the company privately understands, even as it publicly denies it.

Meta's official position is that every organisation and individual on its platforms is subject to the same set of rules, and that any claims of enforcement based on group affiliation or advocacy are "baseless," a statement that is, on any honest reading of the evidence now available, straightforwardly untrue. Consider misterb&b, the world's largest LGBTQ+ travel platform, founded by Matthieu Jost in 2014 after he experienced homophobia while travelling with his partner and built from the ground up to help queer people find safe accommodation around the world. The company's adverts do not contain explicit content; they show same-sex couples, they reference sexual orientation, and they promote a legitimate, well-established travel service that is frequently cited in mainstream press. Meta pulled them anyway, citing content "related to sexuality or sexual orientation or directed at individuals with a specific gender identity," and when Jost appealed, the process was, in his words, "frustratingly slow and opaque, with little transparency about why an ad was rejected." Two weeks later the problem remained unresolved, and Jost was left to conclude what many queer businesses and creators have already concluded, which is that ads showing same-sex couples are flagged more often than equivalent heterosexual content, even when they comply fully with Meta's own advertising guidelines, and that the system, whether through AI moderation or user reports or some murky combination of both, disproportionately punishes content that acknowledges the existence of LGBTQ+ people.

The roots of what is happening now go back to January 2025, when Mark Zuckerberg announced sweeping changes to Meta's content moderation architecture in what was transparently an act of political alignment with the incoming Trump administration dressed up as a principled defence of free expression. The company ended its third-party fact-checking programme in the United States and gutted its hateful conduct policy to remove protections that had explicitly shielded LGBTQ+ people, people of colour, women, and immigrants from targeted abuse. It introduced the term "transgenderism" into its own policy language, a right-wing neologism designed to frame trans identity as an ideology rather than a reality, and permitted the use of slurs that had previously been banned under rules the company itself had written. It dismantled its diversity, equity, and inclusion programmes in the same breath, and the message to the LGBTQ+ community, and to every marginalised group that had been told these platforms were safe spaces for connection and self-expression, was as clear as it was chilling. The political weather has changed, and you are no longer commercially useful enough to protect.

GLAAD, the world's largest LGBTQ+ media advocacy organisation, was unequivocal in its assessment, describing Meta and Zuckerberg as "not only permitting and encouraging, but engaging in anti-LGBTQ hate speech" by intentionally employing anti-LGBTQ+ language in the company's own policies and instituting new rules permitting extreme slurs, a move that the organisation said "squarely moves the company into the territory of Truth Social and other extreme right-wing platforms that are unsafe for youth and advertisers." The survey data that followed confirmed what queer users already knew from their own feeds. Seventy-five per cent of LGBTQ+ respondents reported an increase in harmful content on Meta's platforms since the policy changes, 92 per cent said they felt less protected from being targeted by harassment, and 27 per cent of LGBTQ+ respondents reported being direct targets of gender-based or sexual violence online, including doxxing, stalking, and threats of physical harm. One trans and nonbinary user described the experience with a bluntness that no corporate press release will ever match: "Violence against me has skyrocketed since January. I live in daily fear."

The shadow-banning machinery predates the 2025 policy changes but has intensified dramatically under them, and its mechanics are worth understanding because they reveal how a platform can marginalise an entire community while maintaining plausible deniability about doing so. Instagram classifies LGBTQ+-related hashtags, including #lesbian, #bisexual, #gay, #trans, #queer, and #nonbinary, as "sensitive content," and because teenage accounts have the sensitive content filter enabled by default, young LGBTQ+ people searching for community, for resources, for the basic human reassurance that they are not alone in the world, are met with a blank page and a prompt to review their content settings, as though their identity were a parental advisory warning rather than a fact of their existence. GLAAD's social media safety programme manager, Leanna Garfield, put it plainly: "These platforms are lifelines for young LGBTQ+ people, and restricting this content isolates them further." For LGBTQ+ creators and small business owners who depend on Instagram's algorithmic infrastructure (the Explore page, Feed Recommendations, and Reels) to reach audiences beyond their existing followers, the financial consequences are immediate and severe. For trans creators in particular, who have reported being blocked from uploading content entirely, the effect is not merely economic but existential, because you are being made to disappear from the only public square that matters, and the company responsible will not even acknowledge that it is happening.

The broader context in which all of this is unfolding makes it not merely troubling but actively dangerous, because Meta's retreat from LGBTQ+ protections is happening at precisely the moment when queer people, and trans people in particular, need digital platforms most desperately. In the United States, over 700 anti-LGBTQ+ bills are under consideration in state legislatures. Kansas has invalidated the identity documents of every trans person in the state. Kentucky is attempting to reclassify being transgender as a mental illness. The Supreme Court struck down conversion therapy bans on the International Transgender Day of Visibility. The Trump administration has stripped federal protections, defunded HIV prevention programmes, and removed the Pride flag from the Stonewall National Monument. In the more than sixty countries where homosexuality remains criminalised, Meta's platforms are often the only space where LGBTQ+ people can safely connect, organise, and access the information that keeps them alive, and when Meta removes a queer organisation's account in those jurisdictions, the consequences are not measured in lost engagement metrics or reduced ad revenue but in the real and immediate endangerment of human lives.

We have been here before, of course, and the queer community has always understood, in a way that perhaps only marginalised people can, that visibility granted by institutions is always conditional and always revocable. The pub that hosted drag nights until the neighbours complained. The television channel that commissioned queer programming until the ratings dipped. The corporation that flew the rainbow flag with great enthusiasm right up until a right-wing commentator with a podcast made it politically expensive. The mechanism changes from decade to decade, but the underlying dynamic has remained remarkably stable across the entire modern history of LGBTQ+ public life, which is that you are welcome here as long as it costs us nothing, and the moment it costs us something, you will be asked to leave, and if you refuse to leave, you will be made invisible instead.

What is different now, and what makes Meta's version of this ancient pattern so much more consequential than the pub or the television channel, is the scale at which it operates. Three billion users, a third of the planet's population, and an advertising infrastructure that determines not merely who gets seen but who gets to exist in the digital public sphere at all. When Meta decides that queer content is "sensitive," that classification reshapes the information landscape for billions of people simultaneously and across every jurisdiction on earth. When it pulls an LGBTQ+ travel company's adverts for the offence of depicting a same-sex couple, it sends a signal to every advertiser, every creator, every charity, and every community organisation that depends on the platform. You are tolerated, not welcomed, and that tolerance can be withdrawn at any time, without explanation, without meaningful recourse, and without the faintest pretence of an apology.

The queer community has always built its own spaces when the mainstream has refused to make room, and we will build again if we have to, because we always do. But we should not have to pretend that what Meta is doing is anything other than what it plainly, demonstrably, documentably is, which is the construction of a new closet, algorithmic rather than physical, global rather than local, but serving exactly the same purpose it has always served, which is to make queer people invisible, to make queer people quiet, and to ensure that the discomfort of the majority is never troubled by the inconvenient fact of our existence. We are not going to disappear, but we should be honest about what we are up against, and we should name it without euphemism or equivocation. Meta is not failing its LGBTQ+ users through incompetence or algorithmic accident. It is failing them by design. And until that changes, every rainbow logo on Mark Zuckerberg's platforms is a lie told to people who have heard quite enough lies already.
