Instagram is an online playground full of hazards -- teens know it, parents know it and the platform's parent company Meta knows it. On Tuesday, Meta announced sweeping safety updates designed to keep teens safe while they're on Instagram.
Starting today and continuing over the next 60 days, Instagram will migrate every user under 16 in the US, the UK and Australia to its new teen accounts. Anyone under 16 signing up for the first time will also be given a teen account, and teens in the European Union can expect teen accounts to roll out later this year.
This new account type places restrictions on what teenagers are able to do on Instagram, with the aim of making them safer and providing peace of mind to parents. If teens want to change any of their built-in protections, they will need a parent's permission to do so.
The changes mark a hard-won victory for children's safety campaigners, who have been asking Instagram to do more to protect kids for years. Back in 2021, Instagram announced it would make all teen accounts private by default, and last summer it introduced more parental oversight controls. As recently as January, Meta rolled out new safety features across its platforms, but none have been quite as radical as the introduction of dedicated teen accounts.
"I talk to parents all the time about their concerns about their teens being online, and I consistently hear three things," said Instagram CEO Adam Mosseri in a video posted to Threads. "One, concerns about who can contact them; two, what content they see; and three, how much time that they spend online."
Teen accounts have been designed to address all three of these concerns, he added. All teen accounts will be private, meaning teens will have to approve new followers, and people who don't follow them won't be able to interact with them in any way. This will apply to messages as well as tags and mentions. Instagram will use its Hidden Words feature to automatically filter comments and DM requests in an attempt to minimize bullying.
When it comes to content, Instagram will place its most restrictive settings on what teens are able to see. The company wants the entire experience of using the platform to be centered on teens' friends and interests, which they'll be able to preselect so they're served content they genuinely care about.
To prod teens into being mindful about how long they're spending on the app, Instagram will prompt them to leave after one hour of use each day. It will also enable sleep mode between 10 p.m. and 7 a.m., muting notifications and sending auto-replies to DMs.
If teens want to change any of these settings, they'll have to set up parental supervision on Instagram. Kids and parents will have to link their accounts, which gives adults a degree of oversight into what their children are up to on the platform. Parents will be able to see who their teens are messaging, for example, but not read any of the messages. They'll also be able to set hard limits on how long teens can spend on Instagram each day. Even though teen accounts are only for under-16s, Instagram is also making the parental supervision feature available to parents of older teens.
You might be wondering: What about teens who lie about their age? It's a fair question. Since the dawn of the household computer, kids have been finding ways around parentally enforced tech restrictions. Instagram says it will increasingly ask for age verification, and it's working on an additional solution: technology designed to spot children posing as adults, which it plans to start testing in the US next year.