After dragging in the same companies and their reticent, overtrained executives time and time again, Congress is turning its attention to two of the tech industry’s fresher but no less important faces: TikTok and Snap.

On Tuesday, lawmakers on the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security will question policy leads from those two companies and YouTube on how their platforms affect vulnerable young users. Facebook whistleblower Frances Haugen testified before the same subcommittee on parallel issues in early October, shortly after revealing her identity.

The hearing will air on Tuesday at 7AM PT, featuring testimony from Snap VP of Global Public Policy Jennifer Stout; TikTok VP and Head of Public Policy Michael Beckerman; and Leslie Miller, who leads government affairs and public policy at YouTube.

Subcommittee chair Senator Richard Blumenthal (D-CT) will lead the hearing, which will focus on social media’s detrimental effects on children and teens. “The bombshell reports about Facebook and Instagram—their toxic impacts on young users and lack of truth or transparency—raise serious concerns about Big Tech’s approach toward kids across the board,” Blumenthal said, connecting reports about Instagram’s dangers for teens to social media more broadly. The subcommittee’s ranking Republican, Marsha Blackburn (R-TN), has signaled that she’s particularly interested in privacy concerns around TikTok.

We’d expect topics like eating disorders, harassment, bullying, online safety, and data privacy to come up as members of the subcommittee take turns pressing the three policy leads for answers. The group of lawmakers also plans to discuss legislation that could help protect kids and teens online, though how solutions-oriented the hearing will be remains to be seen. Among the potential solutions is the KIDS Act (the Kids Internet Design and Safety Act), which would create new online protections for people under the age of 16. Blumenthal and fellow Democratic Senator Ed Markey reintroduced the bill last month.

The mental health of kids and teens isn’t the only pressing societal crisis social platforms are implicated in at the moment, but it’s one Republicans and Democrats are both rallying around: a rare arena of criticism with plenty of overlap between the two sides. Both parties seem to agree that tech’s biggest companies need to be reined in somehow, though they generally play up different parts of the why. For conservatives, it’s that these companies have too much decision-making power over what content gets wiped from their platforms; on the opposite side of the aisle, Democrats are generally much more worried about the kind of content that gets left up, like extremism and misinformation.

Tuesday’s hearing will also likely dive into how algorithms amplify harmful content. Because social media companies play their cards close to the chest when it comes to how their algorithms work, hearings are a rare opportunity for the public to learn more about how these companies serve users personalized content. Ideally, we’d be learning a lot about that kind of thing in the often lengthy, repetitive tech hearings Congress has held over the last couple of years. But between lawmakers pushing uninformed or irrelevant lines of questioning and evasive tech executives with hours of media training under their belts, the best we can usually hope for is a few new tidbits of information.

While Facebook won’t appear at this particular hearing, expect recent revelations around that company and Instagram to inform what happens on Tuesday. All three social media companies set to testify have had an eye on the public response to the leaked Facebook documents, and more reporting on that data just landed on Monday.

Just after the initial reports that Instagram was aware of the risks it poses to teen users, TikTok introduced a new set of safety measures, including a well-being guide, better search interventions, and opt-in pop-ups for sensitive search terms. Last week, Snap announced a new set of family-focused safety tools to give parents more visibility into what their kids are up to on the platform. Both social networks skew heavily toward younger users compared to platforms like Facebook, Instagram, and Twitter, making robust safety tools even more of a necessity. Leading into the hearing, YouTube announced some changes of its own around what kind of kids content will be eligible for monetization, while also highlighting its other kid-centric safety measures.