Congress: Stop Expecting Big Tech to Protect Kids Unless You Make Them
Annie Chestnut Tutor
The harm social media does to kids has been well documented. Perhaps the best-known exposé is The Wall Street Journal’s The Facebook Files, a series published beginning in September 2021. Congress has held several hearings since that publication. On Wednesday, the Senate Committee on the Judiciary is holding another, this one centered on Big Tech’s failures to protect kids from sexual exploitation on social media platforms. The question is whether our elected officials are willing to hold these companies accountable when they don’t protect kids.
Senators will hear from the CEOs of Meta (parent company of Facebook and Instagram), Snap (parent company of Snapchat), X, TikTok, and Discord.
Senators are right to expose the failings of these social media companies to the general public and to rally support from their colleagues and constituents, but they should not count on these CEOs to deliver the change that is desperately needed.
If we take as absolute truth economist Milton Friedman’s famous maxim that a corporation’s only moral duty is to maximize shareholder value while staying within the rules of the game, then we can understand that social media companies are incentivized, if not obligated, to appeal to as many people as possible and to keep them engaged for as long as possible.
Fifty-eight percent of U.S. teens ages 13-17 use TikTok, 51% use Snapchat, 47% use Instagram, and 19% use Facebook. These companies all prohibit users under age 13 because of restrictions in the Children’s Online Privacy Protection Act, but the age restrictions are enforced only on the honor system.
A recent investigation initiated by New Mexico’s attorney general found that Meta knew of Facebook accounts belonging to users under the age of 13 but failed to disable or remove them. In this case, Meta was not adhering to the rules of the game, and Attorney General Raúl Torrez was right to hold the company accountable.
Meta CEO Mark Zuckerberg and the other CEOs testifying Wednesday will point to the many steps they have taken to combat child sexual abuse material and other harmful content on their platforms, such as material promoting suicidal ideation, eating disorders, bullying, and violence. For its part, Meta has submitted significantly more reports of child sexual abuse material to the National Center for Missing and Exploited Children than Snapchat, TikTok, Discord, and X combined.
Earlier this month, Meta announced new content restrictions for teen users on Facebook and Instagram. TikTok disables messaging for users who attest they are younger than 16. Discord finally introduced parental controls on its gaming communications platform in 2023. And Snap got ahead of the inevitable questions at the hearing from Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., by endorsing their Kids Online Safety Act.
These are all positive steps, but they are not enough to mitigate the harms, despite each platform’s prohibition of, or zero-tolerance policy toward, child exploitation and sexual abuse material.
Several news reports have exposed the tools that predators use to find and groom kids online: the “People You May Know” feature on Facebook that suggests people to add as friends; the Instagram algorithm that shows users content similar to what they’ve been viewing, based on likes, comments, and other forms of engagement; and the public livestream feature on TikTok that lets viewers comment and send coins, in the form of emoji, that the person streaming can convert into cash.
Adding optional protective features, like screen-time notices, disabled private messaging, and sensitive-content filters, works for these companies: It shows critics they are doing something, and because they know not everyone will opt in, it creates an illusion of safety while costing them few users. That is why we are not seeing meaningful action to make it harder for kids under 13 to create accounts.
These companies will not go so far as to raise the minimum age for users on social media platforms, because they stand to lose too much revenue. During the hearing, they will say that they care about the safety of kids online, but do not think for a minute that they put that safety ahead of their bottom line.
The social media companies will also stress that their platforms offer kids great benefits, such as staying connected with friends. However, research shows that fewer people are posting content of their own, and less frequently, while the time people spend on these apps simply consuming content is increasing. Social media has become predominantly a stage for influencers, and aspiring influencers, to get famous and make money.
Users consume social media as a form of entertainment, and many feel they have little control over what they see. Bots and fraud are becoming more rampant. Read the comments on any public post, for example, and you will find several thanking Mr. So-and-So for changing the commenter’s life and setting them up with $5,000 a week in passive income, or comments from an account with a pornographic photo saying something like, “Am I hot?”
Click on any of these accounts, and you will find either no posts and no followers; or a string of similar photos posted in quick succession and promises that you can get rich just as they did; or links to external adult-oriented websites. How are platforms that have evolved into consumerist entertainment and voyeurism, riddled with scammers and predators, even remotely appropriate for kids?
Instagram said in July 2023 that users spend most of their time on the app in direct messages (DMs) from other users, further evidence of how use of the platform has evolved. Kids who want to text or share photos and videos with friends can do so by texting or using apps that are strictly for messaging; they don’t need a social media account for that. They are also less likely to be targeted, since messaging apps like WhatsApp don’t, by design, connect users with strangers through content. Leave social media to adults, who are not immune to its perils but are less vulnerable than kids.
The Senate Judiciary Committee has passed a number of bills that address gaps in the law to stop the exploitation of children online, improve reporting requirements for the CyberTipline run by the National Center for Missing and Exploited Children, seek justice for victims, and hold platforms more accountable. Two of those bills have passed the Senate and await action in the House. None has become law yet.
These, too, are important steps toward minimizing the proliferation and dissemination of child sexual abuse material on social media, but they do not immunize kids against the risk of online predation, for the simple reason that kids and predators will still be on the same platforms.
Last month, I wrote about establishing age verification for pornography websites. I’d argue that the best way to minimize harms to kids on social media is to prohibit kids from using it, enforce that prohibition with age verification (not mere age attestation), and hold platforms liable for any violations.
Many states are considering such measures. Is Congress willing to truly take Big Tech to task to protect kids as well?