More users means more problems, and social media wunderkind Bluesky is no exception. On Monday, Bluesky announced new moderation efforts to address a rise in harmful user content amid the platform's rapid growth.
In an exclusive with Platformer, Bluesky explained that it would quadruple its content moderation team, currently a 25-person contracted workforce, to curb a worrisome influx of child sexual abuse material (CSAM) and other content that violates the site's community guidelines, cases that have so far slipped through the existing moderation systems and warrant human oversight.
“The surge in new users has brought with it concomitant growth in the number of tricky, disturbing, and outright bizarre edge cases that the trust and safety team must contend with,” the company wrote. “In all of 2023, Bluesky had two confirmed cases of CSAM posted on the network. It had eight confirmed cases on Monday alone.”
More broadly, the platform is navigating an explosion in user reports handled by an extremely small company. On Nov. 15, Bluesky posted that it was receiving 3,000 reports per hour, compared with 360,000 reports in all of 2023. “We’re triaging this large queue so the most harmful content such as CSAM is removed quickly. With this significant influx of users, we’ve also seen increased spam, scam, and trolling activity — you may have seen some of this yourself,” the platform wrote at the time. “We appreciate your patience as we dial our moderation team up to max capacity and bring on new team members to support this load.”
Bluesky’s bolstering of its human workforce supplements what is often a complex and confusing world of automated, AI-powered content moderation. In a follow-up thread, Bluesky acknowledged issues with “short term” moderation policies put in place over the past week to tackle harmful content in high-severity policy areas, including CSAM. In response to automated flags and actions by the trust and safety team, Bluesky explained, several accounts were temporarily suspended. The platform is reinstating accounts whose owners believe they were unjustly removed.
Bluesky’s in-house content moderation systems are also paired with third-party tools, like reporting tool Safer, created by child safety nonprofit Thorn. Branded as a user-powered, decentralized social network, Bluesky prioritizes an “ecosystem of third-party providers” and eschews a “centralized moderation authority” in favor of user customizability.