A former Meta security engineer testified to Congress on Tuesday that the company chose not to act on internal data showing that more teenagers were being harmed on Instagram and Facebook than publicly available figures suggested.
Facebook whistleblower Arturo Bejar appeared before the Senate Judiciary Committee on Tuesday to provide his perspective on Meta’s approach to youth safety. Bejar was featured in a Wall Street Journal piece last week in which he said that the company failed to respond to internal data regarding bullying, sexual harassment, and other negative experiences on the platform.
In his opening remarks, Sen. Richard Blumenthal (D-CT) framed the hearing as a chance to illustrate the threat that Big Tech presents to teenagers and called for a floor vote on technology reform bills, including the Kids Online Safety Act, which would require social media platforms to disable “addictive” product features and take additional steps to keep harmful content from being shown to underage users.
Bejar used his testimony to implore Meta to be more transparent and to change its products.
“The company was grading its own homework,” Bejar said, arguing that Meta defines harm too narrowly. Bejar previously reviewed the company’s “Bad Emotional Experience Feedback” data, which surveyed users about their experiences on the platform over the previous seven days. He said the review revealed a much deeper problem for teenagers than Meta had previously disclosed.
Among users under the age of 16, 26% said they had a terrible experience in the previous week because they witnessed hostility against someone based on race, religion, or identity, the survey found. More than a fifth said they felt terrible about themselves after viewing others’ posts, and 13% had received unwanted sexual advances in the previous week.
The former Meta employee called for tools that would let teenagers report lewd or explicit private messages so that content moderators can review them, while still preserving message encryption.
Bejar also endorsed the Platform Accountability and Transparency Act, a bill that would force social media companies to give researchers access to their internal data.
Sens. Josh Hawley (R-MO) and Lindsey Graham (R-SC) both backed repealing Section 230, the provision of communications law that shields platforms from liability for content posted by third parties. “Until you open up the courthouse, [Big Tech is] not going to do anything,” Graham argued. “Once you do, they’ll come forward with all sorts of good ideas they never told you about.”
Graham and Hawley also called on members of Congress to refuse all political donations from Big Tech companies.
Blumenthal and Judiciary Committee Chairman Dick Durbin (D-IL) said they would push for a vote on the Kids Online Safety Act and several other technology-oriented bills. Durbin said there were enough votes to pass the measure this year, noting that more than 45 senators, nearly half the chamber, had co-sponsored it.
Bejar first worked at Meta as an engineer dedicated to protecting platform users, staying until 2015, when he left the company for personal reasons. During that time, he said, he tried to give the company better tools for addressing the problems facing teenagers.
He rejoined the company in 2019 on Instagram’s well-being team, where he had access to high-level executives such as Facebook CEO Mark Zuckerberg and Instagram head Adam Mosseri. He stayed until 2021, when his term ended. In his final weeks at the company, Bejar emailed Zuckerberg, Mosseri, Chief Product Officer Chris Cox, and Chief Operating Officer Sheryl Sandberg about the “Bad Emotional Experience Feedback” data.
Sandberg and Mosseri replied to Bejar’s email, lamenting the results and expressing a desire to change things; Zuckerberg never responded, and company leadership did not follow up further on the data.
Bejar’s email landed on the same day that fellow Meta whistleblower Frances Haugen first appeared before Congress. Haugen’s release of internal Meta documents prompted the company to clamp down on sharing research that could further damage its reputation. A watered-down version of Bejar’s research was eventually circulated to staff, but the company failed to act on the data and adopted only what Bejar described as performative changes to youth safety online.