CEOs from social media giants Meta (Facebook and Instagram), X, Snapchat, Discord, and TikTok faced a US Senate Committee hearing this week.
The bosses were questioned over their platforms’ efforts to protect young people online from abuse, including sexual exploitation.
The hearing was attended by families of victims of online abuse.
Here’s what you need to know.
US Senate Committee
The US Senate Judiciary Committee has broad powers. Its 21 members, both Republicans and Democrats, are tasked with examining influential legal and social issues.
It’s also responsible for interviewing judges nominated to the US Supreme Court, the highest court in America.
Recently the committee has been reviewing laws on online safety for children. In particular, it has stressed a growing need for social media companies to take responsibility for dangerous online content.
Social media CEO testimony
In November, the committee called on a group of social media CEOs to attend a hearing called “Big Tech and the Online Child Sexual Exploitation Crisis”.
Some of those CEOs testified this week.
It comes more than a year after officials called on Elon Musk to detail how his company X, then Twitter, was working to rid the platform of child sexual exploitation. Musk did not respond to the request.
Key messages
Senators were highly critical of CEOs over a lack of online safety features to protect children.
Committee Chair Dick Durbin (Democrat) called the rise of online child sexual exploitation a “crisis in America”. He alleged social media platforms are being used to spread abusive content and facilitate grooming.
Senator Lindsey Graham (Republican) compared the dangers of social media to cigarettes, telling social media bosses: “You have blood on your hands… you have a product that’s killing people”.
Meta apology
Following pressure from Senator Josh Hawley (Republican), Meta CEO Mark Zuckerberg apologised to the families of victims of abuse on its platforms.
“Everything that you have all gone through, it’s terrible. No one should have to go through the things that your families have suffered,” Zuckerberg said. He insisted that Meta is investing in “industry-leading” efforts to improve safety.
Zuckerberg faced further criticism over a child sexual abuse warning on Instagram that users could ignore by selecting a “see results anyway” option.
Safety features
Snap CEO Evan Spiegel also apologised during the hearing, addressing parents of children who fatally overdosed after sourcing drugs through the app.
Tech bosses were asked about the number of staff they employ to monitor harmful content on their platforms.
TikTok and Meta said they each employ about 40,000 people involved in “safety”. X said it had 2,300 and Snapchat said 2,000.
Discord (a messaging platform originally popular with gamers) said it had a safety team of “hundreds”.
What now?
Senators asked social media executives to support a proposed law called the “Kids Online Safety Act”.
The bill would require popular websites and social media companies to take “reasonable” steps to reduce harm on their platforms, including bullying, illicit substance use, eating disorders, and sexual abuse.
X and Snap’s CEOs said they fully support the act, while Discord, Meta, and TikTok bosses didn’t explicitly back it. Debate on the bill is continuing in Congress.