Lawmakers Grill Snap, TikTok and YouTube on Children’s Safety.

Tech News · 4 min read · October 27, 2021

Introduction

At a Senate hearing this week, lawmakers grilled tech executives about kids and safety. There's no doubt that the internet has changed the game when it comes to our children's safety.

The attention is currently on social media companies and how they serve content to children. The topic has also drawn criticism over reports that President Trump's sons play games that portray violence.

Recent studies suggest that children who grow up with the most screen time are less able to tell the difference between fact and fantasy.

Between gaming, chat, and social media, children are exposed to more language and content than ever before.

Header image: Lawmakers to Grill Tech Execs on Kids' Social Media Safety. Image Credits: AaronP/Bauer-Griffin/GC Images / Getty

The Targeted Social Media Platforms

After repeatedly bringing in the same corporations and their cautious, heavily media-trained executives, Congress is turning its focus to two of the internet industry's newer but crucial faces: TikTok and Snap.

Senators on the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security grilled policy leaders from those two companies and YouTube on Tuesday about how their services affect vulnerable teenage users.

In early October, shortly after revealing her identity, Facebook whistleblower Frances Haugen testified before the same committee on related matters.

The session, which aired at 7 a.m. PT today, featured testimony from Jennifer Stout, Snap's VP of Global Public Policy; Michael Beckerman, TikTok's VP and Head of Public Policy; and Leslie Miller, YouTube's government affairs and public policy lead.

Subcommittee chair Senator Richard Blumenthal (D-CT) led the hearing, which focused on social media's detrimental effects on children and teens.

"The bombshell reports about Facebook and Instagram about their toxic effects on young users and lack of truth or transparency. This raises serious concerns about Big Tech's approach to kids across the board," Blumenthal said, linking reports about Instagram's dangers for teens to the dangers of social media in general.

Marsha Blackburn (R-TN), the ranking Republican on the subcommittee, has expressed an interest in privacy concerns surrounding TikTok.

As members of the subcommittee take turns questioning the three policy heads, we anticipate hearing about eating disorders, harassment, bullying, internet safety, and data privacy.

Lawmakers Also Discussed the Kids Internet Design and Safety (KIDS) Act

The lawmakers also discussed legislation that could help protect kids and teens online, though how solutions-oriented the hearing would stay remained to be seen.

The Kids Internet Design and Safety (KIDS) Act is one potential answer; it would introduce new online protections for users under the age of 16.

Last month, Blumenthal and Democratic Senator Ed Markey reintroduced the bill.

The mental health of children and teenagers isn't the only societal concern raised by social media platforms, but it is one that both Republicans and Democrats are rallying around.

For starters, it's a rare area of criticism with plenty of political overlap between the two sides.

Both parties appear to agree that tech's most prominent companies need to be reined in somehow, but they emphasize different reasons why.

For conservatives, these companies have too much decision-making power over what content to remove from their platforms.

On the other hand, Democrats tend to be more concerned about the content that is left up, such as extremism and disinformation.

The Hearing Also Dug Into How Algorithms Amplify Harmful Content

Because social media companies keep their cards close to their chests about how their algorithms work, hearings are a rare opportunity for the general public to learn how these businesses serve personalized content to their users.

In an ideal world, we'd be learning a lot about that kind of thing in Congress's frequently long and tedious tech hearings.

That has occasionally happened over the past several years. Still, with lawmakers asking uninformed or irrelevant questions and evasive tech executives drawing on hours of media training, the best we can typically hope for is a few new facts.

While Facebook will not be present at this hearing, recent disclosures about the company and Instagram will likely shape what happens today.

The fallout from leaked Facebook documents has touched all three social media companies set to testify, and fresh reporting on those documents came out just Monday.

Shortly after early reports that Instagram is aware of the risks it poses to underage users, TikTok announced a new set of safety precautions, including a well-being guide, better search interventions, and opt-in pop-ups for sensitive search terms.

Snap launched a new set of family-focused safety features this week, giving parents additional insight into what their children are doing on the platform.

Compared to platforms like Facebook, Instagram, and Twitter, these social networks have a higher percentage of younger users, making robust safety tools even more important.

YouTube Kids

Before the hearing, YouTube announced changes to the types of children's content that will be eligible for monetization, along with other kid-focused safety features.

