Social media executives face tough questions in the US Senate over child protection
In a tense session, senators insisted social platforms must do more to protect young users.
Executives from Snap, Meta, X and TikTok appeared before the US Senate Judiciary Committee yesterday to discuss their efforts to combat child sexual exploitation content on their apps. They also fielded questions about ongoing initiatives to better protect young users, and some senators did not hold back in their criticism of the platforms.
The hearing, on "Big Tech and the Online Child Sexual Exploitation Crisis," was a follow-up to an earlier session in which the Senate heard from child safety experts about the harms linked to social media apps. It was originally scheduled for late last year, but was pushed back to ensure that all of the CEOs could attend.
This time, the CEOs themselves had the opportunity to present their perspective and detail what each company is doing to combat child sexual abuse material (CSAM).
Each CEO opened with a prepared statement providing an overview of their efforts and plans.
Meta CEO Mark Zuckerberg described Meta's defense systems, which include 40,000 employees working on safety and security, and revealed that Meta has invested more than $20 billion in these efforts since 2016.
Zuckerberg also pushed back on criticism from the previous session about the harm caused by social media apps:
"A recent report from the National Academy of Sciences evaluated more than 300 studies and found that the research does not support the conclusion that social media causes changes in adolescent mental health. It also suggests that social media can provide significant positive benefits when young people use it to express themselves, explore, and connect with others."
Zuckerberg also reiterated Meta's recent proposal that app stores should be held responsible for underage downloads.
"For example, 3 out of 4 parents favor introducing age verification in app stores, and 4 out of 5 parents want app stores to obtain parental approval whenever teens download apps."
So while Zuckerberg is willing to accept his share of responsibility, he also set the tone early on by making clear that he believes there are counterarguments to the claims made by child safety experts.
X CEO Linda Yaccarino emphasized her perspective as a mother and spoke about X's efforts to provide greater protection for young users.
"Over the past 14 months, X has made significant changes to protect minors. Our policy is clear: X does not tolerate any material that shows or promotes the sexual exploitation of children."
Yaccarino also explained that in 2023, X blocked more than 12 million accounts for violating its CSE policies, and sent 850,000 reports to the National Center for Missing & Exploited Children (NCMEC) through a new automated reporting system designed to streamline the process.
Yaccarino outlined the same figures in a recent post on X. The automated reporting element in particular could lead to further problems in the form of erroneous reports, but it also reduces the workload at X; with a staff roughly 80% smaller than the previous Twitter team, the company needs to lean on automated solutions wherever possible.
Meanwhile, Snapchat CEO Evan Spiegel emphasized the platform's fundamental approach to privacy in a statement:
"Snapchat is private by default, which means people need to give consent to add friends and choose who can contact them. When we created Snapchat, we decided that images and videos sent through our service would be deleted by default. Like previous generations who enjoyed the privacy afforded by phone calls that are not recorded, our generation has benefited from the ability to share moments through Snapchat that may not be picture-perfect, but instead convey emotion without permanence."
Spiegel also cited the platform's NCMEC reporting data, stating that Snap submitted 690,000 NCMEC reports last year.
Meanwhile, TikTok CEO Shou Zi Chew spoke about TikTok's evolving CSAM detection efforts, which include significant investment in new initiatives.
"We currently have over 40,000 trust and safety professionals working to protect our community, and we expect to invest over two billion dollars in trust and safety efforts this year alone - much of that investment will be in our US operations."
TikTok may be in a tougher position, given that many senators are already trying to ban the app over concerns about its ties to the Chinese government. But Chew argues that the platform is leading the way on many CSAM detection measures and is committed to using them wherever possible.
The hearing drew a number of pointed questions from senators, including the following remark from Senator Lindsey Graham:
"Mr. Zuckerberg, you and companies before you, I know you didn't mean for this to be the case, but you have blood on your hands. You have a product that is killing people."
Zuckerberg was the main focus of the hearing, which makes sense given that he is responsible for the most widely used social media network in the world.
Senator Josh Hawley also pressed Zuckerberg to apologize to families who have been harmed by his company's apps. Zuckerberg turned to the gallery to address a group of parents in attendance:
"I'm sorry for everything you've been through. No one should have to go through what your families have gone through, and that's why we're investing so much and are going to continue our industry-wide efforts to ensure that no one has to go through what your families have gone through."
However, a recent report indicates that Zuckerberg rejected calls to expand Meta's child-safety resources in 2021, despite requests from his own employees.
As reported by The New York Times:
"In 90 pages of internal emails from the fall of 2021, top officials at Meta, which owns Instagram and Facebook, discussed hiring dozens of engineers and other employees to focus on children's well-being and safety. One proposal sent to Zuckerberg for 45 new employees was rejected."
Zuckerberg maintained his composure under pressure, but it's clear that Meta's record on this front raises a lot of concerns.
Several senators also used today's session to call for changes to the law, specifically Section 230, in order to weaken social platforms' legal protection against liability for harmful content. So far, efforts to repeal Section 230, which shields social apps from lawsuits over content shared by their users, have failed, and it will be interesting to see whether today's hearing gives that push new momentum.
It was a tense session, with senators making the case that social platforms need to do more to protect young users. It's not clear whether any of the proposed legal changes will emerge from today's questioning, but it was notable to see which measures are already in place and how the major platforms are seeking to implement solutions to address these issues.