One week before its big Senate hearing, Meta announces even more teen safety controls

Meta has rolled out stricter Instagram and Facebook Messenger settings intended to curb unsolicited DMs from strangers, part of a company-wide push to address teen safety across its platforms.

The new default setting turns off a teen user’s ability to receive DMs from anyone they don’t follow or aren’t connected to online, including other teens, the company explained in a blog post. The same restriction applies when minor accounts (defined as 16 or younger in the U.S. and 18 or younger in certain countries) are added to group chats.

Meta had already limited users to sending just one message per day to accounts that don’t follow them. Previously, the settings only restricted adults over the age of 19 from messaging minors who don’t follow them; now, the messaging limits apply in both directions.

The company also adjusted parental control settings for caregivers overseeing supervised minor accounts. Account supervisors will now be prompted to approve any changes to privacy and safety settings.

Snapchat’s parent company, Snap, announced similar preventative messaging settings last year, including a September update that removed minor accounts from search results and alerted users when adding an unknown account to their friends list.

Earlier this month, Meta announced it would automatically set all minor accounts to the most restrictive content control settings available — known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook — prompting teens to reevaluate their personal settings while still offering certain opt-out features. The settings reduce teens’ likelihood of stumbling on potentially sensitive content or accounts in search or on explore pages, and hide search results for specific queries about suicide, self-harm, and eating disorders. Meta also previously added settings that prevent other accounts from reposting minors’ content, tagging them, or mentioning them.

As Mashable’s Christianna Silva noted, the influx of new teen safety measures is presciently timed, with Meta officials set to testify alongside other major social media companies in a Senate hearing on online child exploitation taking place Jan. 31. Other attendees include X/Twitter CEO Linda Yaccarino, TikTok CEO Shou Zi Chew, Snapchat CEO Evan Spiegel, and Discord CEO Jason Citron.

Hovering in the background are several lawsuits filed against Meta over the last year, including a recent Massachusetts case against Zuckerberg once again alleging he repeatedly blocked attempts to address teen mental health on the company’s platforms.

The new messaging controls may just be the latest in what some advocates say are “too little too late” measures.
