X just released its first transparency report in years. Here’s what they aren’t saying.

For the first time since CEO Elon Musk’s takeover of X (formerly Twitter), the social media platform is taking the public behind the scenes of its increasingly opaque reporting and moderation practices. Sort of.

Released today, the 15-page Global Transparency Report is the first public release of internal enforcement data covering any period after Dec. 2021 (Musk took over Twitter in Oct. 2022). It covers the first six months of 2024 and attempts to paint a picture of the platform's new enforcement ethos. According to the data, X received more than 224 million user reports, suspended more than 5 million accounts, and took down more than 10 million posts between January and June.

Previously, Twitter issued twice-yearly reports on its enforcement mechanisms via its Transparency Center. The practice began in 2012 and continued until new ownership took hold of the platform's reins. At the time, Musk spoke openly about fighting the government's "bullying" of social media platforms and tech leaders, a fight that included shutting researchers out of internal data like transparency reports.

Now, the platform has changed its tune. “Our policies and enforcement principles are grounded in human rights, and we have been taking an extensive and holistic approach towards freedom of expression by investing in developing a broader range of remediations, with a particular focus on education, rehabilitation, and deterrence,” the report reads. “These beliefs are the foundation of ‘Freedom of Speech, not Freedom of Reach’— our enforcement philosophy, which means we restrict the reach of posts, only where appropriate, to make the content less discoverable as an alternative to removal.”

The report is notably sparser than previous iterations. It features a brief rundown of user reports and the corresponding company actions across a variety of policy areas, including child safety, abuse and harassment, platform manipulation, and suicide and self-harm. It describes a hybrid machine-learning and human moderation process, with an "international, cross-functional team with 24-hour coverage" making enforcement decisions.

What “rehabilitation” looks like is not explained — although previous reinstatements of some of the platform’s worst offenders, and the focus on account suspensions in the report, suggest X is moving away from outright banning.

X sent 370,588 legally required reports of child exploitation to the National Center for Missing and Exploited Children (NCMEC)'s CyberTipline in the first half of the year. The platform says it also suspended more than 2 million accounts actively engaging with child sexual abuse material (CSAM). In 2021, X/Twitter reported 86,000 cases to NCMEC. The number increased to 98,000 in 2022, then jumped to 870,000 in 2023.

An X spokesperson explained the jump in numbers in a statement to Mashable. “In 2023, X updated its enforcement guidelines to also suspend users who engaged with actioned CSAM content (Like, Reply, Share, Bookmark, etc.) and added additional proactive defenses. We saw a spike in enforcements after these changes (catching and cleaning up an existing problem), and we believe that those changes have been effective at discouraging users from either sharing CSAM or looking for it (the actions trending down over time, even though we continue to improve defenses).”

The report also offers limited information on government data requests and removals, formerly a major focus of Twitter's reporting when the company championed a more "open" internet. At the time of the 2021 report, X/Twitter said it had fielded 11,460 requests for information from 67 countries, complying with 40.2 percent of them. In 2024, the platform reported more than 18,000 requests for information and 72,000 requests for content removal from an undisclosed number of countries. X reportedly disclosed information in 52 percent of cases and complied with 70 percent of removal requests.

The report drops as the platform subtly revamps itself and its generative AI offerings ahead of the election. In recent months, X has quietly reinvested in its safety and security teams, while Musk has simultaneously redefined the notion of sitewide "transparency" and voiced support for content moderation tools. The CEO also announced this week that the company will soon shutter the site's block feature.

“}]] Mashable Read More 

​ X published its first Global Transparency Report in years, despite CEO Elon Musk’s previous decision to end the platform’s transparency reporting.