A Meta whistleblower’s testimony could help deliver a US online safety bill

A former engineering director at Facebook told a Senate Judiciary subcommittee that the company ignored the concerns he raised about how its platforms worked and how damaging they were to the well-being of children.

This is not the first time Facebook, now known as Meta, has been accused of putting profits before people. It is, however, another case of an insider deciding to blow the whistle. The hope is that the testimony could spur the adoption of legislation that significantly limits the time young people spend on social media and what they can view or do on the platforms.

A bipartisan bill addressing this, known as the Kids Online Safety Act (KOSA), has already been introduced. In October, 42 US state attorneys general collectively announced a lawsuit against Meta over the harm they say its platforms bring to the young, citing evidence of bullying, depression, anxiety, and self-harm.

Change could be not just possible but imminent

Privacy advocates have raised concerns about the proposed changes and how they could affect civil rights and enable censorship, potentially becoming an onerous burden. Arturo Bejar, the whistleblower, served as director of engineering for Facebook’s Protect and Care team.

Speaking to senators, he described the culture at the company as ‘see no evil, hear no evil.’ He first worked there between 2009 and 2015, and again from 2019 to 2021. Bejar’s primary responsibility was to make Facebook and Instagram less harmful to users.

His concern became pronounced after seeing what happened to his 14-year-old daughter when she started using Instagram. It led to “unwanted sexual advances, harassment, and misogyny,” an experience many teenagers share.

The testimony could push the bill to pass

Bejar told the committee that we live in an “extraordinary time” in which there is consensus across the political divide on the need and urgency to pass legislation that protects all kids. The engineer explained that Meta could easily create a button for young people to report such issues, but that has not happened, in part because there is no “transparency about the harms” teenagers experience on Instagram.

Meta responded by saying that “countless people inside and outside” the company are working to help young people and keep them safe. Bejar’s testimony suggests otherwise, though the company has introduced more than 30 tools to support teens and their families.

With the new testimony and a host of other concerns, ranging from mental health to sleep deprivation and a culture of envy, it is becoming more likely that the safety bill could pass into law.
