Snapchat is releasing new app safeguards to protect teens (aged 13-17) from unknown users and age-inappropriate content — a responsive step from a social media platform that’s been under fire for allegedly exposing its younger users to explicit content, despite its 13+ age rating.
Developed in collaboration with the National Center on Sexual Exploitation (NCOSE) and the National Center for Missing and Exploited Children (NCMEC), the new features include protections against unwanted contact. They build on an existing safeguard that prevents teen users from messaging accounts not on their friends list, adding an alert when a teen tries to add an unknown account that shares no mutual friends. Snapchat also keeps teen accounts out of app-wide search results unless the searcher has several friends or existing phone contacts in common with the teen, and the update raises that threshold based on how many friends the teen has.
“In many countries, including the United States, United Kingdom, Australia, and Canada, we also make it difficult for a teen to show up as a suggested friend to another user outside their friend network,” the company explained.
The platform also updated its resources for caregivers and parents, including new informational videos and a caregiver guide to the app, along with a safety checklist and a step-by-step guide to its parental controls.
As explained in the parents’ guide, Snapchat applies default safety settings to teen accounts, such as limiting how they appear in search results and blocking location sharing. The updated features add stricter content settings and in-app education initiatives to teach young users about online safety.
The app will get a small overhaul for other users, too: Snapchat will strengthen its detection and moderation of content posted to public Stories and Spotlight, in line with its new Strike System for removing accounts that post or promote age-inappropriate content.
In 2022, the platform launched Family Center, an in-app safety tool that lets caregivers monitor their child’s friends list and recent chats, and report any suspicious accounts interacting with their child. Other social media platforms have launched and continue to update similar parental control hubs, like Instagram’s own Family Center.
“Our commitment to making our platform safer is always on,” Snapchat wrote in the update’s press release, “and we will continue to build on these improvements in the coming months with additional protections for teen Snapchatters and support for parents.”