Two years after its launch, social network Bluesky is updating its Community Guidelines and related policies, inviting users to give feedback before the revisions take effect. The company, which positions itself as an alternative to X, Threads, and Mastodon, said the changes aim to add clarity to its moderation rules, appeals process, and overall safety procedures.
Much of the overhaul comes in response to global regulatory requirements. Laws such as the U.K.’s Online Safety Act, the EU’s Digital Services Act, and the U.S.’s TAKE IT DOWN Act require platforms to strengthen safety and reporting standards. For example, users in the U.K. must now complete age verification steps like ID upload, face scans, or payment card entry to access certain content under the new rules.
New Dispute Resolution Path
One of the more unusual changes is an “informal dispute resolution process,” under which Bluesky agrees to speak with a user by phone before any formal proceedings begin. The company argues that many disputes can be resolved through conversation, a departure from the opaque moderation systems at larger platforms, where users often feel ignored. Additionally, Bluesky says some harm-related claims can now be taken to court rather than going through mandatory arbitration.
The updated Community Guidelines are centered on four main principles: Safety First, Respect Others, Be Authentic, and Follow the Rules. These principles guide decisions on whether posts should be labeled or removed, whether accounts should be suspended, and whether certain cases should be escalated to law enforcement.
The rules prohibit violence, the promotion of self-harm, animal abuse, child sexualization (even in role-play), doxxing, spam, and malicious activity. Exceptions are carved out for journalism, parody, and satire, with the recognition that journalists may need to report on sensitive topics. At the same time, the guidelines emphasize bans on harassment, bullying, discriminatory speech, and harmful deepfakes targeting protected groups.
A History of Moderation Struggles
Bluesky has previously faced criticism over moderation decisions, particularly from Black and trans communities who felt underserved or unfairly treated. More recently, some users have accused the platform of leaning too far left, while others lament a culture of excessive criticism and a lack of humor.
Although Bluesky originally promoted tools like block lists and opt-in moderation services to let users shape their own experience, the community has often called for the platform itself to take stronger action. This tension continues to shape the debate around its policies.
Alongside the Community Guidelines, Bluesky revised its Privacy Policy and Copyright Policy to address data rights, takedown requests, retention, and transparency reporting. These changes will take effect September 15, 2025, without a public feedback period. The new Community Guidelines, however, will go live on October 15, 2025, after user feedback is reviewed.
What The Author Thinks
Bluesky is trying to walk a fine line: strict enough rules to meet global regulations and protect users, but flexible enough to let the community define itself. The problem is that the more detailed these policies get, the more users will expect consistent and fair enforcement. If Bluesky fails to deliver on that, no amount of “respect others” messaging will stop the criticism from piling up.