Bluesky has announced that it will do more to “enforce our moderation policies to better cultivate a space for healthy conversations,” with the new guidelines scheduled to take effect on October 15. In a blog post, the company said it received feedback from more than 14,000 community members on an earlier draft of the policy. In response, Bluesky said it will “bring a greater focus to encouraging constructive dialogue and enforcing our rules against harassment and toxic content,” and will “more quickly escalate enforcement actions towards account restrictions.” A company spokesperson clarified that this means users who violate the guidelines will receive fewer warnings before their accounts are deactivated.
The company also plans to make product changes that “clarify when content is likely to violate our community guidelines.” This could mean users will see more warnings before they post content that might violate the rules.
Controversies Fueling the Policy Change
The question of who gets banned and why has been a fraught issue for Bluesky. The platform has faced recurring complaints that accounts fundraising for Palestinians in Gaza have been unfairly suspended. More recently, many users criticized Bluesky’s decision to temporarily suspend horror writer Gretchen Felker-Martin for her now-deleted comments about the shooting of Charlie Kirk. Writer Roxane Gay described the decision as “unacceptable” and an “absolute shame.” In response to the new guidelines, Felker-Martin wrote, “thanks this sucks.”
In addition to criticism from its more left-leaning users, Bluesky also faces ongoing complaints that it has become a liberal echo chamber. The company’s post mentions other upcoming product changes, including a “zen mode” for a calmer social media experience and “prompts for how to engage in more constructive conversations.”
The New Guidelines and User Backlash
Since the guidelines were announced, much of the response has focused on a section that forbids “sexual content involving non-consensual activity,” even if the content is animated, illustrated, or synthetic. One user complained that Bluesky should “worry less about whether or not a cartoon has rights and more about whether real life trans and Palestinian people do.” Another user said this language “has literally always been there” and that the confusion was due to “bluesky’s dogs–t comms.” A spokesperson for Bluesky confirmed there had been some misunderstanding, stating, “we haven’t changed anything about our enforcement in this area and have no intention of doing so.”
What The Author Thinks
Bluesky’s struggle with moderation, like that of every social network before it, may be an unwinnable one. No matter how a platform crafts its rules, the diverse and often conflicting expectations of its user base will produce controversy. By taking a firmer stance, Bluesky is attempting to cultivate a “healthier” space, but it risks alienating the very users who were drawn to its more experimental roots. The backlash over both the content policy and specific account suspensions shows that for every moderation decision there is a strong, often justified counter-argument, leaving the company in a no-win situation. This highlights a central dilemma of decentralized social media: how to enforce community standards without sacrificing the openness that defines the platform.
Featured image credit: Windows Central