Instagram adds safety features to adult-run accounts featuring children

Meta is introducing additional safety features for teen users on its Instagram platform, including new safeguards in direct messaging aimed at preventing what the company calls “direct and indirect harm.”

The new features build on its existing “teen accounts” and include a more prominent display of information about other users in direct messages, new options to view safety tips and swiftly block an account, and an extension of existing teen-account protections to accounts run by adults that primarily feature children.

Meta’s July 23 announcement is the latest in what has been a monthslong effort to enact greater protections for its teen users on Instagram, Facebook and Messenger amid a widening pressure campaign by activists, lawmakers and health officials demanding social media companies do more to protect younger users against mental health harm and sexual exploitation.

Here’s what to know about the new features.

More: Meta and TikTok face lawsuit over NYC teen’s ‘subway surfing’ death

Teen accounts to see new restrictions in direct messaging

Meta said in its July 23 statement on the update that teen users will now see the month and year another account joined Instagram when chatting with it directly, and that teens will be able to more easily block and report another user while direct messaging.

A teenager holds a phone that displays the Instagram social media logo in Kerlouan, France, on Feb. 25, 2025.

“These new features complement the Safety Notices we show to remind people to be cautious in private messages and to block and report anything that makes them uncomfortable – and we’re encouraged to see teens responding to them,” Meta said. “In June alone, they blocked accounts 1 million times and reported another 1 million after seeing a Safety Notice.”

Meta introduced teen accounts in September 2024 in response to increasing concerns over social media’s impact on children and teens’ mental health and safety and criticism that platforms do not do enough to protect minors.

What are Instagram Teen Accounts? Here’s what to know about the new accounts with tighter restrictions

New safety features on adult-run accounts featuring children

The most popular social media platforms in the U.S., such as Meta’s Facebook, Instagram and Messenger, only allow users 13 years of age and older. On Instagram, adults are able to create accounts featuring or representing children if they disclose in the account bio that they manage the account, according to platform guidelines. This includes so-called family-influencer accounts that regularly share images of their children and accounts run by parents or talent managers that represent minors.

“While these accounts are overwhelmingly used in benign ways, unfortunately there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules,” Meta said of the adult-run accounts, which will be subject to several teen-account safety features as well in the coming months.

Here’s how these accounts will change:

  • Adult-run accounts will default to the platforms’ “strictest message settings” to prevent unwanted messages, including a filter for offensive words and comments.

  • Meta will implement protections to make it harder for “potentially suspicious adults,” such as those who have been blocked by teen users, to find these adult-run accounts via search.

The Instagram app is seen on a smartphone in this illustration taken July 13, 2021.

The company said it removed 135,000 Instagram accounts earlier this year for leaving sexualized comments or requesting sexual images from adult-managed accounts featuring children under 13. It removed an additional 500,000 Facebook and Instagram accounts linked to those accounts.

What are teen accounts on Instagram? See safety features

Instagram teen accounts are automatically assigned to any new user under the age of 18. They limit what users can see and who can message and interact with them, and they enable parents to exercise more control over their teens’ social media use.

Safety features on teen accounts include:

  • Teen accounts can only be messaged by people they follow or are already connected to via mutual friends. They employ the platform’s tightest anti-bullying settings, meaning offensive words and phrases are filtered out of comments and direct-message requests.

  • Other restrictions include a filter for content classified as “sensitive” even when shared by someone they follow, including anything marked as potentially sensitive − such as sexually suggestive content, content discussing suicide, self-harm or disordered eating and images of violence.

  • Teens get notifications telling them to leave the app after 60 minutes each day, and a sleep mode turns on to mute notifications overnight and send auto-replies to direct messages.

In April, the company announced it would start testing a protection feature in Instagram direct messages for teens under 18 that automatically blurs nude images, gives senders the option to unsend the images, and gives warning notices to both senders and recipients. Meta said 99% of users have kept the nudity blur turned on since it was introduced, and in June over 40% of blurred images received in direct messages stayed blurred.

Social media platforms criticized over impact on minors

Meta, alongside other social media platform owners, has faced years of criticism and hundreds of lawsuits over the addictive nature of social media and its safety risks for minors.

In 2023, more than 40 U.S. states sued Meta for misleading the public about the dangers of its platforms, and in July 2024, the U.S. Senate advanced two online safety bills that would force social media companies to take responsibility for how their platforms affect children and teens.

Contributing: Reuters

Kathryn Palmer is a national trending news reporter for USA TODAY. You can reach her at kapalmer@usatoday.com and on X @KathrynPlmr.

This article originally appeared on USA TODAY: Instagram adds safety features for teens, accounts featuring kids

