Opinion: Mark Zuckerberg’s extraordinary apology should only be the beginning
Opinion by Kara Alaimo
(CNN) — On Wednesday, the chief executives of Meta, TikTok, X, Snap and Discord testified before the Senate about what they’re doing to protect kids from harm online. While Meta’s Mark Zuckerberg and TikTok’s Shou Chew appeared voluntarily, Democratic Sen. Dick Durbin of Illinois chastised Snap’s Evan Spiegel, X’s Linda Yaccarino and Discord’s Jason Citron for appearing only after being subpoenaed, and noted that federal marshals had to be sent to Citron’s offices to serve his subpoena.
Durbin accused social networks of harming kids’ mental health, saying they not only “have contributed to this crisis” but “are responsible for many of the dangers our children face online.”
Senators also expressed concern about how children are being exploited online. “Sextortionists,” for example, convince kids to share racy images — sometimes by pretending to form a romantic relationship — and then blackmail them by threatening to make the images public unless they provide more explicit content, money or both. This can destroy people’s lives and put them at risk of depression and suicide. “You have blood on your hands,” South Carolina Republican Sen. Lindsey Graham told the executives.
In the lead-up to the testimony, tech companies announced new initiatives to protect kids. Apps like Instagram and TikTok are prompting kids to take breaks or limit their screen time and have changed their algorithms to show kids less toxic content, such as posts about eating disorders. Meta says it’s hiding this kind of inappropriate content from kids and has changed teens’ default privacy settings to help prevent people they don’t know from messaging them. Snapchat is giving parents more oversight options, like information about their kids’ safety settings.
Chew testified that, on TikTok, accounts for children under age 16 are set to private and not recommended to people they don’t know, and Yaccarino said children between the ages of 13 and 17 can’t accept messages from people they don’t approve on X. Spiegel noted there aren’t public likes or comments on Snapchat. These are all critical features.
Zuckerberg also stood up and apologized to the families in attendance whose children had been harmed by social media, saying, “I’m sorry for everything you have all been through. No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry wide efforts to make sure no one has to go through the things your families have had to suffer.”
But it’s not enough. Lawmakers and tech companies need to do much more to protect our kids.
The Stop CSAM (Child Sexual Abuse Material) Act of 2023 would make it possible to hold tech companies civilly liable for hosting child sexual abuse material. This would be an important way of incentivizing tech companies to do more to protect kids from sextortion and other forms of online exploitation.
The SHIELD Act would criminalize sharing or threatening to share intimate images without the consent of the person depicted (popularly known as “revenge porn”). This is also an essential way of protecting both kids and adults.
Tech companies also have a lot more work to do. They may claim they’re showing kids fewer harmful posts, but it’s still unclear how they decide what’s harmful. Tech companies should work with experts in adolescent health to develop and share standards for the content that can be shown to kids, so that potentially harmful posts about things like body image and mental health don’t appear on their feeds at all, and so content creators know what the rules are and can try to follow them. Then, they need to hire many more human moderators to determine whether content meets those standards before showing it to children.
We can’t trust artificial intelligence tools to do this vetting — one internal Meta document showed that automated systems removed less than 1% of content that violated the company’s rules about violence and incitement, for example. (Zuckerberg said during the hearing that “99% or so” of the content Meta removes is automatically identified using AI.) But figuring out ways for humans to vet content is not as difficult as it may sound — Snapchat, for example, doesn’t promote videos from its creator program to more than 25 people until they have been reviewed by human moderators.
We also need to know what big trends our kids are being exposed to online. Currently, none of these companies gives us good metrics about what’s trending on their platforms. TikTok, for example, recently disabled a tool that allowed people to search popular hashtags on the app, on the heels of a report that found topics censored by the Chinese government were less prevalent on TikTok than on Instagram. All social apps should provide tools that show what’s trending, and they should break down what’s trending among users under age 18 so parents know what to talk to their kids about.
As I write in my forthcoming book, “Over the Influence: Why Social Media is Toxic for Women and Girls – And How We Can Take It Back,” when kids search for content related to mental health, tech companies should show them boxes with links to organizations that can help them. It’s astounding to think that, instead, kids searching for this content could be served videos that make their problems worse.
I also argue in my book that these apps should identify when photos have been filtered or otherwise manipulated, so kids are constantly reminded that the images they see of other people’s bodies often aren’t real. This could help them develop a more realistic body image.
Finally, these tech companies should be doing a better job of using their platforms to give children information that helps and empowers them — like lessons on how to use social media in healthy ways. For example, on the heels of the nude deepfakes of Taylor Swift that have been circulating online in recent days, social apps should be showing kids content about how to spot fake images and why they should never create or engage with them. As I write in my book, when nude images of a woman circulate online, it puts her at greater risk of sexual assault, unemployment, difficulty dating, depression and even suicide.
In their testimony to senators, tech executives promised to protect kids. But they didn’t promise to do what’s actually needed to safeguard kids’ physical and mental well-being. To protect kids on their apps, they need to create and enforce better standards for the content shown to children, hire more human moderators, offer mental health resources and lessons for kids, and disclose when content has been manipulated. And lawmakers need to pass legislation to crack down on online sexual exploitation. These kinds of solutions would give parents something to actually like.