Meta CEO Mark Zuckerberg, whose company runs Instagram and Facebook, issued a public apology before the US Congress on Wednesday (31), where he and the chiefs of TikTok, Snap, X, and Discord faced intense scrutiny from lawmakers concerned about the risks children encounter on social media platforms.
The executives, convened by the US Senate Judiciary Committee, were taken to task in a session titled “Big Tech and the Online Child Sexual Exploitation Crisis.”
Tech giants are confronting a torrent of political anger for not doing enough to thwart online dangers for children, including sexual predation and teen suicide.
During one round of heated questioning, Zuckerberg was made to stand up and apologise to the families of victims who had packed the committee room.
“I’m sorry for everything you have all been through,” he said. “No one should go through the things that your families have suffered.”
Also testifying to senators were X’s Linda Yaccarino, Shou Zi Chew of TikTok, Evan Spiegel of Snap and Discord’s Jason Citron.
“Mister Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people,” said Senator Lindsey Graham.
Zuckerberg told the lawmakers that “keeping young people safe online has been a challenge since the internet began and as criminals evolve their tactics, we have to evolve our defenses too.”
He added that, according to research, “on balance” social media was not harmful to the mental health of young people.
“I don’t think it makes any sense,” Senator Dick Durbin, who chaired the hearing, said in response.
“There isn’t a parent in this room who’s had a child…(who) hasn’t changed right in front of (their) eyes” because of an “emotional experience” on social media, he said.
TikTok’s Chew said “as a father of three young children myself, I know that the issues that we’re discussing today are horrific and the nightmare of every parent.”
“I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this topic,” Chew said.
Meta also said 40,000 of its employees work on online safety and that $20 billion has been invested since 2016 to make its platforms safer.
Ahead of the fiery session, Meta and X, formerly Twitter, announced new safety measures.
Meta, which owns the world’s leading platforms, Facebook and Instagram, said it would block direct messages sent to young teens by strangers.
By default, teens under age 16 can now only be messaged or added to group chats by people they already follow or are connected to.
Meta also tightened content restrictions for teens on Instagram and Facebook, making it harder for them to view posts that discuss suicide, self-harm or eating disorders.
Singling out Meta, senators pointed to internal company documents showing that Zuckerberg declined to strengthen the teams devoted to tracking online dangers to teens.
“The hypocrisy is mind-boggling,” Senator Richard Blumenthal told the New York Times.
Those documents are part of a major lawsuit brought by about 40 states jointly suing Meta over alleged failures to protect children.
Under US law, web platforms are largely shielded from legal liability for content shared on their sites.
While lawmakers would like to set up more rules to increase online safety, new laws have been stymied by a politically divided Washington and intense lobbying by big tech.
One existing proposal is the Kids Online Safety Act, or KOSA, which aims to protect children from algorithms that might trigger anxiety or depression.
Another idea would require social media platforms to verify the age of account holders and completely bar children under the age of 13.
“I don’t think you’re gonna solve the problem. Congress is gonna have to help you,” Senator John Neely Kennedy told the executives. (AFP)