The changes are one of the most far-reaching sets of measures undertaken by an app to address teenagers’ use of social media, as scrutiny over young people’s experiences online has ramped up. In recent years, parents and children’s groups have warned that Instagram, TikTok, Snapchat and other apps have regularly exposed children and teenagers to bullying, paedophiles, sexual extortion and content promoting self-harm and eating disorders.
In June, Vivek Murthy, the US surgeon general, called for cigarette-style warning labels on social media to flag the potential mental health risks. In July, the Senate passed bipartisan legislation called the Kids Online Safety Act to impose safety and privacy requirements for children and teenagers on social media. And some states have passed social media restrictions.
Mark Zuckerberg, Meta’s CEO, has faced particular criticism over social media’s risks to young people. Dozens of state attorneys general have filed lawsuits against his company, accusing Meta – which also owns Facebook and WhatsApp – of knowingly hooking children on its apps while playing down the risks. At a congressional hearing on child online safety in January, lawmakers urged Zuckerberg to apologise to families whose children had killed themselves after suffering abuse on social media.
“I’m sorry for everything you have all been through,” Zuckerberg told the families at the hearing.
How effective Instagram’s new changes will be is unclear. Meta has promised to protect minors from inappropriate contact and content since at least 2007, when state attorneys general warned that Facebook was rife with sexually explicit content and had enabled adults to solicit teenagers. Since then, Meta has introduced tools, features and settings to foster youth well-being on its social networks – with varying degrees of success.
In 2021, for instance, Instagram announced that it would make new accounts opened by those who indicated they were younger than 16 private by default. At the time, however, the app allowed those younger teenagers to simply switch their accounts back to public.
This time, 16-year-olds and 17-year-olds will be able to opt out of the default privacy settings by themselves. But Instagram said users younger than 16 will now need a parent’s permission to make their accounts publicly viewable.
Megan Moreno, a paediatrics professor at the University of Wisconsin School of Medicine who studies adolescents and problematic social media use, said Instagram’s new youth default settings were “significant”.
“They set a higher bar for privacy and confidentiality – and they take some of the burden off the shoulders of teens and their parents,” she said.
Yet the changes do not directly address a glaring problem: young people who lie about their age when they register for Instagram. The new settings and features are set to automatically kick in for account holders who self-identify as minors. And while Instagram’s terms of service prohibit children under 13 from using the app, “Teen Accounts” is not designed to search for and remove underage users.
Instagram said it removes underage accounts when it learns of them. It said it would require teenagers to verify their ages if they tried to circumvent the new privacy defaults by creating new accounts with an adult birth date. The company is also working on technology to allow it to proactively find teenagers who have set up accounts posing as adults.
Several children’s groups said Instagram’s announcement, which came as Congress was poised to take up children’s online safety legislation Wednesday, seemed to be an attempt to ward off new federal protections for young people online.
“These are long overdue features that Instagram should have put in place years ago to keep young people safe online,” said Jim Steyer, the chief executive of Common Sense Media, a children’s advocacy and media ratings group. “They’re only acting now because they’re under pressure from lawmakers, advocates and a groundswell of public opinion.”
While the overhaul may be well received by parents, some teenagers – who are an important part of Instagram’s user base – may be less pleased. Teenage influencers who keep their accounts public to gain new followers could balk at the changes. Nearly half of US teenagers ages 13-17 use Instagram at least once a day, according to a survey last fall from the Pew Research Center, making it the fourth most popular social network among young people in the US, after YouTube, TikTok and Snapchat.
The safety moves could hurt Meta’s business in the short term, since the company needs new users to grow and young users to remain relevant. But by making these changes now, Instagram is also attempting to court the next generation of young people to use social media while trying to reduce the risks they can face online.
Mosseri acknowledged that the new safety measures could affect Meta’s bottom line and popularity among teenagers.
“It’s definitely going to hurt teen growth and teen engagement, and there’s lots of risk,” he said. “But fundamentally, I want us to be willing to take risks, to move us forward and to make progress.”
Other social media apps have also made changes for younger users. In 2021, TikTok made accounts private by default for those registered to users ages 13-15. It also disabled direct messages for those younger teenagers.
Instagram’s latest settings and features will begin rolling out Tuesday, with new accounts registered by people who identify themselves as minors automatically being put into private mode. The app said it would also soon begin switching the existing accounts of minors in the United States, Canada, Australia and Britain to private.
Meta said it would continue restricting teenagers on Instagram from being able to send direct messages to people they do not already follow. The company said it would also show them less content in the main Instagram feed from people they do not follow and prevent them from being tagged by accounts they are not connected to.
The new options give parents who oversee their teenagers’ accounts more insight into how their children use the app, Instagram said. That includes a feature enabling a parent to see the topics of posts their child has chosen to see more of, as well as the accounts of the people their child recently messaged. To protect user privacy, though, parents will not be able to view the content of their children’s messages.
While parents might use the information to start important conversations with their children, experts said the feature could also create tensions for vulnerable teenagers whose politics or gender identities may be at odds with their parents’ views.
Moreno, who is also the medical co-director of the American Academy of Paediatrics’ Centre of Excellence on Social Media and Youth Mental Health, said she was looking forward to seeing teenagers’ reactions to Instagram’s changes. Many young people might be relieved that their accounts are made private, she noted, while others may find getting a parent’s permission to change default settings too burdensome.
“Their voices will be really important in determining how meaningful these changes are,” she said.
Mosseri said developing the new features was tricky for the company, which had to balance safety concerns with personal privacy.
“The thing for me about this whole world of safety online and well-being and social media is that there are trade-offs,” he said. “We think we’ve found a decent balance. But I’m sure we’re going to get a bunch of feedback.”
This article originally appeared in The New York Times.
Written by: Mike Isaac and Natasha Singer
Photographs by: Ricky Rhodes and Anthony Gerace
©2024 THE NEW YORK TIMES