
Facebook is putting a hold on the development of a kids’ version of Instagram, geared toward children under 13, to address concerns that have been raised about the vulnerability of younger users.

“I still firmly believe that it’s a good thing to build a version of Instagram that’s designed to be safe for tweens, but we want to take the time to talk to parents and researchers and safety experts and get to more consensus about how to move forward,” Adam Mosseri, the head of Instagram, said in an interview Monday on NBC’s Today show.

The announcement follows an investigative series by The Wall Street Journal, which reported that Facebook was aware that Instagram use led to mental-health issues and anxiety for some teenage girls.

Yet the development of Instagram for a younger audience was met with broad opposition almost immediately.

Facebook announced the development of an Instagram Kids app in March, saying at the time that it was “exploring a parent-controlled experience.” Two months later, a bipartisan group of 44 attorneys general wrote to Facebook chief executive officer Mark Zuckerberg, urging him to abandon the project, citing the well-being of children.

They cited increased cyberbullying, possible vulnerability to online predators and what they called Facebook’s “checkered record” in protecting children on its platforms. Facebook faced similar criticism in 2017 when it launched the Messenger Kids app, touted as a way for children to chat with family members and friends approved by parents.

Josh Golin, executive director of children’s digital advocacy group Fairplay, urged the company Monday to permanently pull the plug on the app. So did a group of Democratic members of Congress.

“Facebook is heeding our calls to stop plowing ahead with plans to launch a version of Instagram for kids,” Massachusetts Senator Ed Markey tweeted. “But a ‘pause’ is insufficient. Facebook must completely abandon this project.”

The Senate had already planned a hearing Thursday with Facebook’s global safety head, Antigone Davis, to address what the company knows about how Instagram affects the mental health of younger users.

Mr. Mosseri maintained Monday that the company believes it’s better for children under 13 to have a dedicated platform with age-appropriate content, and noted that other companies such as TikTok and YouTube offer app versions for that age group.

He said in a blog post that it’s better to have a version of Instagram where parents can supervise and control their children’s experience than to rely on the company’s ability to verify whether kids are old enough to use the app.

Mr. Mosseri said that Instagram for kids is meant for those between the ages of 10 and 12, not younger. It will require parental permission to join, be ad-free, and include age-appropriate content and features. Parents will be able to supervise the time their children spend on the app and oversee who can message them, who can follow them and whom they can follow.

While work on Instagram Kids is paused, the company will expand opt-in parental supervision tools to accounts of teens 13 and older. More details on these tools will be disclosed in the coming months, Mr. Mosseri said.

