In an unusual gesture during a US Senate hearing on Wednesday, Facebook founder Mark Zuckerberg apologized to parents whose children have been harmed by using the company's online platforms. This came as one senator accused the entrepreneur of inadvertently creating a "product that's killing people."
"I'm sorry for everything you have all been through," Zuckerberg told family members at the hearing, some of whom held up photos of their children. "No one should go through the things that your families have suffered."
Technology executives convened by the US Senate Judiciary Committee were grilled in a session titled "Big Tech and the Online Child Sexual Exploitation Crisis."
In addition to Zuckerberg, TikTok CEO Shou Zi Chew was invited to Washington, as were Snapchat co-founder Evan Spiegel, Discord CEO Jason Citron and Linda Yaccarino, the head of online platform X, formerly Twitter.
Senators blame Big Tech
US Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks that technology companies "are responsible for many of the dangers our children face online."
"Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk," he said.
"Mr. Zuckerberg, you and the companies before us, I know you don't mean it to be so, but you have blood on your hands. You have a product that's killing people," said Senator Lindsey Graham.
Zuckerberg told the lawmakers that "keeping young people safe online has been a challenge since the internet began and as criminals evolve their tactics, we have to evolve our defenses too."
He added that research shows that "on balance," social media is not harmful to young people's mental health.
Companies vow to invest in online safety
"As a father of three young children myself, I know that the issues we're discussing today are horrific and every parent's nightmare," said TikTok's Chew.
He said the company intends to invest more than $2 billion (€1.85 billion) in trust and safety this year alone. "We have 40,000 safety professionals working on this topic," Chew said.
Meta also said that 40,000 of its employees work on online safety and that $20 billion has been invested since 2016 to make the platform safer.
Meanwhile, in anticipation of the fiery session, Meta, which owns the world's leading platforms Facebook and Instagram, said it would block direct messages from unknown people to young teens.
Meta also tightened content restrictions for teens on Instagram and Facebook, making it harder for them to see posts that discuss suicide, self-harm or eating disorders.