Instagram head Adam Mosseri offered a full-throated defense of the social media app during Senate testimony on Wednesday afternoon as lawmakers questioned him over the company’s safety measures for young people.
Mosseri highlighted existing and soon-to-be-released Instagram tools that let users manage who they interact with and let parents limit what their children see. On Wednesday, Mosseri said Instagram would give independent researchers access to data and algorithms but wouldn’t rule out resuming work on Instagram for Kids, a controversial version of the app for children that the company has put on pause.
Mosseri also tried to shift the pressure back onto Congress, urging lawmakers to pass new federal legislation that would help Instagram regulate itself.
“We believe there should be an industry body that will determine best practices when it comes to at least three questions: how to verify age, how to design age-appropriate experiences and how to build parent controls,” Mosseri said. “And I believe that companies like ours should have to adhere to these standards to earn some of our Section 230 protections,” a reference to the federal legislation offering broad liability protection to tech companies.
Instagram and Meta, its parent company, have been under attack for several months, following Frances Haugen’s release of internal company documents to federal regulators and journalists. The documents provided a rare view of the issues Meta faces and raised concerns about Instagram’s impact on teens’ mental health.
Questions about children’s experience on Instagram have drawn bipartisan fury and raised the chances that legislation will be introduced on a matter that has long been neglected. “There is bipartisan momentum—both here and in the House [of Representatives]—to tackle these problems we are seeing with Big Tech,” said Sen. Marsha Blackburn, the most senior Republican on the Senate subcommittee that heard Mosseri’s testimony. “The time is ripe to pass a national consumer privacy bill, as well as kids-specific legislation to keep minors safe online.”
Richard Blumenthal, another subcommittee chair, also spoke out against Instagram. “Something is terribly wrong,” he said. “And what stuns me is the lack of action.”
Blumenthal, Blackburn and their subcommittee have steered renewed interest in Washington, D.C., in possible regulation of tech. Over several months, they previously heard testimony from Haugen, Meta head of safety Antigone Davis, and executives from YouTube, Snap and TikTok. Ahead of Davis’s appearance, Blumenthal’s office created a fake Instagram account for a teen girl and said the account was quickly shown content that promoted eating disorders. Blumenthal’s office set up another account this week and found similar content, a sign that Meta isn’t taking the increased attention from Congress seriously, the senator said.
In his remarks on Wednesday, Mosseri said Instagram has removed more than 850,000 accounts this year that appeared to belong to underage users. Mosseri also said Instagram has tried to push teenage users toward private accounts, showing them a notification badge that highlights the greater safety of a private account over a public one, and has limited the extent of targeted advertising.
Mosseri also directly criticized the conclusions drawn from the whistle-blower documents obtained by Haugen, continuing a line of attack on Meta’s own researchers that the company has previously employed. The documents contained incomplete research relying on small sample sizes, Mosseri told Congress, and shouldn’t be taken as seriously as lawmakers and many journalists have.
This week, the company unveiled a new Instagram feature that prompts users to take a break from the app, and it said new parental controls will arrive in March. Blackburn was not pleased with the timing of the announcement, which was published Tuesday morning.
“At 3 a.m., which is midnight in Silicon Valley, you released a list of product updates you said would raise the standard for protecting teens and supporting parents online. I’m not sure what hours you keep out there in California, but, where I’m from, that’s when you drop news that you don’t want people to see,” Blackburn said.
Ironically, on Wednesday Mosseri invoked Meta’s core business problem, that it is losing ground to Snap and TikTok, to suggest that teen safety is an industry-wide challenge too big for the company to tackle alone without guidance from the government. During his testimony, he cited a newly released Forrester survey showing that 63% of U.S. teens spend time on TikTok weekly, compared with 57% who said they were on Instagram, a four-percentage-point drop for Instagram in a year.
“With teens using multiple platforms, it is critical that we address youth online safety as an industry challenge and develop industry-wide solutions,” Mosseri said.