In a landmark Senate Judiciary Committee hearing, CEOs from some of the biggest social media companies faced a grilling over their records on child exploitation and efforts to protect teenagers using their services. The hearing, titled “Big Tech and the Online Child Sexual Exploitation Crisis,” featured Meta CEO Mark Zuckerberg, Snap CEO Evan Spiegel, TikTok CEO Shou Chew, Discord CEO Jason Citron, and X CEO Linda Yaccarino. This event marked the first time Congress directly heard from Spiegel, Yaccarino, and Citron, and only the second appearance for TikTok’s Chew, who faced questions about the app’s safety record and ties to China last year.
The hearing examined how social platforms have failed to safeguard young users from harmful content, including child sexual exploitation, drug-related content, extremist ideologies, and material promoting self-harm. While the CEOs defended their platforms and highlighted measures to enhance safety, the hearing also focused on proposed legislation aimed at ensuring child online safety, including the controversial Kids Online Safety Act (KOSA).
The Kids Online Safety Act (KOSA) and Its Controversy
The Kids Online Safety Act (KOSA) has gained bipartisan support and aims to hold tech platforms accountable for the content they recommend to minors, with a focus on protecting their mental health. The legislation would create a legal liability, or "duty of care," for apps and online platforms that recommend content negatively affecting the mental health of minors. However, KOSA has its share of critics, who argue that it may lead to over-censorship and unclear liability standards.
One concern raised by experts is the broad language in the bill, which could potentially result in platforms overzealously censoring content, inadvertently stifling free expression. Additionally, KOSA’s enforcement at the state level introduces the possibility of differing interpretations and standards across states, adding complexity to compliance.
Support and Opposition to KOSA
Despite the controversy surrounding KOSA, it has garnered support from various organizations and lawmakers. Child online safety advocates, including the American Academy of Pediatrics, the National Center on Sexual Exploitation, and the nonprofit Fairplay, are firmly behind the legislation. They argue that KOSA is a necessary corrective measure against the toxic business models of social media platforms, which prioritize engagement and profit over the well-being of young users.
Josh Golin, the executive director of Fairplay, stated that KOSA aims to address issues such as the promotion of eating disorders, suicide, drug use, and sexual abuse to young users. Parents who have tragically lost children due to online harms have also joined the coalition supporting KOSA, advocating for change to prevent further tragedies.
However, not all are in favor of KOSA. The legislation has faced opposition from various quarters, including the American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF). Some concerns relate to the bill’s implications for encryption and potential effects on free expression. Additionally, there are worries about KOSA’s potential impact on LGBTQ content, with concerns that it may inadvertently suppress age-appropriate LGBTQ resources and information.
The debate over KOSA raises fundamental questions about the balance between protecting children online and preserving freedom of speech and expression. It underscores the complexity of regulating social media platforms, which serve as both forums for discourse and sources of potential harm.
Snap’s Surprising Support for KOSA
In a surprising move, Snap, the parent company of Snapchat, expressed its support for KOSA. This decision sets Snap apart from its tech peers and even its own industry associations. Snap's alignment with KOSA may be viewed as an attempt to foster goodwill with regulators, particularly as they scrutinize TikTok, its dominant rival among young users. Snap's decision echoes a similar move by Meta (then Facebook) in 2018, when it supported the controversial FOSTA-SESTA legislation, which aimed to address online sex trafficking but had unintended consequences for sex workers.
Meta’s Scrutiny and Wall Street Journal Investigation
Meta has been at the center of scrutiny following a Wall Street Journal investigation last year. According to internal documents, the investigation revealed that the company was aware of Instagram's significant negative effects on the mental health of teenage users. The findings reignited concerns about the responsibility of social media platforms in safeguarding the well-being of young users.
Meta, in response, stated that it has developed more than 30 tools to help teenagers and their parents cultivate safe experiences on its platforms. The company emphasized its commitment to addressing content and behavior that violates its rules. The revelations underscore the ongoing challenges faced by tech companies in balancing user engagement and safety, especially for vulnerable demographics like teenagers.
Discord’s Unusual Inclusion and Safety Measures
Discord’s inclusion in the Senate hearing was notable, given its popularity among young people and concerns about its role in facilitating grooming, kidnapping, and other forms of sexual exploitation. Discord, originally designed as a chat app for gamers, has become a widely used platform among teenagers.
Discord has maintained a zero-tolerance policy for child sexual abuse and employs a mix of proactive and reactive tools to moderate the platform. The company allocates over 15% of its workforce to trust and safety efforts, prioritizing issues that present the highest real-world harm to its users and the platform.
TikTok’s Efforts to Protect Teenagers
TikTok, often in the spotlight for its appeal among teenagers, has emphasized its efforts to protect young users. TikTok restricted livestreams to accounts registered to users aged 18 or older and introduced measures specifically designed to protect teenagers. Shou Zi Chew, CEO of TikTok, stressed the platform’s commitment to safety during his previous appearance before the House Energy and Commerce Committee.
Chew highlighted that many of TikTok’s safety measures were pioneering efforts in the social media industry. TikTok’s proactive approach to safety is a reflection of the platform’s recognition of its responsibility to protect its young user base from potential harms.
The Call for Empowering Social Media Platforms
Throughout the hearing, lawmakers and experts emphasized the need for meaningful reforms that empower social media platforms to invest in tools aimed at ensuring the safety of young people online. The consensus is that instead of relying solely on legal action, tech companies should take proactive measures to address the root causes of online harm.
Aliya Bhatia, a policy analyst for the Free Expression Project at the Center for Democracy and Technology, stressed the importance of focusing on what companies can do more of to enhance safety. She expressed concerns about KOSA’s broad language and the potential for unclear liability standards. Bhatia hopes that lawmakers will concentrate on empowering social media platforms to invest in more tools to help young people navigate the internet safely.
Tech Executives’ Commitment to Child Safety
X, formerly known as Twitter, emphasized its zero-tolerance policy for child sexual exploitation (CSE) and its determination to make the platform inhospitable to those seeking to exploit minors. Discord echoed a similar sentiment, highlighting its dedicated trust and safety team and its commitment to removing harmful content, banning users, shutting down servers, and collaborating with authorities to combat online child exploitation.
TikTok, which has faced scrutiny in the past, reiterated its dedication to protecting teenagers and adopting measures to ensure their safety. Meta, while acknowledging its role in addressing content and behavior that violates platform rules, also underscored its commitment to age-appropriate experiences on its apps.