On Wednesday, Meta announced a new set of tools designed to safeguard younger users, a belated response to widespread criticism that the company is not doing enough to protect the most vulnerable people on its platforms.
Parents, tech watchers and lawmakers have long called on the company to do more to keep teens safe on Instagram, which allows anyone over the age of 13 to sign up for an account.
To that end, Meta is introducing what it calls “Family Center,” a centralized hub of safety tools through which parents can supervise what their kids see and do on the company’s apps, starting with Instagram.
A new set of supervision features gives parents and guardians significant insight into the habits of young Instagram users. These tools allow parents to keep track of how much time their child spends in the app, see which accounts their child follows and which accounts follow them, and receive notifications when their child reports an account.
These tools are hitting Instagram immediately and Meta’s VR platform in May, with the rest of the company’s (remember Facebook?) apps to follow in the coming months. The company described the tools as “the first step in a long journey,” though why it took so long to take even these initial steps to protect teens from the darker side of its software is less clear.
For now, teenage Instagram users will have to enable the supervision tools on their own accounts, although the company says an option for parents to initiate supervision will be available starting in June. Instagram also plans to add more controls, including a way for parents to restrict app usage to certain hours and a setting that lets multiple parents supervise the same account at the same time.
Young and vulnerable
Over the past year, Instagram has come under scrutiny for its lack of robust safety features designed to protect younger users. The app technically bars anyone under the age of 13 from registering, although that rule poses little real barrier to children’s use of social media.
Instagram previously announced that it would apply artificial intelligence and machine learning to verify that users are over the age of 13, but the company knows that younger children and teens can still easily access the app.
“While many people are honest about their age, we know young people can lie about their birth dates,” the company wrote on its blog last year. “We want to do more to stop this, but verifying the age of people online is difficult and a lot of people in our industry struggle with it.”
The company’s answer to the problem of weeding out younger users was a plan to create a kid-friendly version of Instagram, which BuzzFeed News revealed early last year. YouTube and TikTok already offer versions of their apps optimized for kids under 13, leaving Instagram’s own plans late to the party.
Last September, The Wall Street Journal published a series of investigative reports on the app’s negative impact on the mental health of teenage girls, a scandal that hastened the (temporary?) end of Instagram for kids. In light of the WSJ reporting, public backlash, and aggressive, unusually bipartisan opposition from lawmakers and regulators, Meta put its plans for a kids’ version of Instagram on hold.
Meta is also under competitive pressure. Two years ago, TikTok introduced its own tools that allow parents to monitor their children’s use of the app, and has since made those controls more granular. In 2019, the company launched a version of its app for kids, TikTok for Younger Users, which restricts risky features like posting and commenting.