That libertarian’s argument, in this case, boils down to the idea that if you have a novel, obviously addictive technology that might well be associated with depression, narcissism and self-harm, you need to wait for absolute certainty in that association before you start thinking about limits on how kids use it, because once upon a time there was a moral panic about comic books and wasn’t that embarrassing. Perhaps I’ve buried my 13-year-old self too deeply, but I am not convinced.
But if we are willing to think about imposing limits on the teenage Instagram experience, then we probably need something more than a general rage at Silicon Valley’s reckless nerds. Yes, it would be ideal if social media companies would self-regulate in their relationship to teenagers, and it’s swell that in the wake of the bad Wall Street Journal publicity, Facebook is temporarily putting a hold on its plans to start a version of Instagram explicitly for kids. But real, sustained self-regulation generally happens only under threat of external action or with the establishment of a new consensus around what’s acceptable to sell to kids. So for people who read the Journal article and come away irate at Facebook, the question should be: What exact consensus do you want? What norms do you expect Instagram or any other company to follow? In light of the data, what rules should they obey?
And if your answer is that they should be forced to invent an algorithm that doesn’t feed depression or anxiety, then I’m not sure I take your anger seriously. You’re setting us up for a future of endless public promises to tweak the algorithm joined to constant behind-the-scenes pressure to get the biggest numbers possible, mental health effects be damned. (A future much like our present.)
No, if you actually want to take precautionary steps that might really limit whatever damage social media is doing, you need those steps to be much simpler and blunter: You need to create a world where social media is understood to be for adults and the biggest networks are expected to police their membership and try to keep kids under 16 or 18 out.
What would be lost in such a world? Arguably social media supplies essential forms of connection and belonging for kids who are isolated and unhappy in their flesh-and-blood environments. (Though if that’s really the case, you would expect the previous decade to be an inflection point toward improved mental health for teenagers, which it definitely wasn’t.) Arguably it provides outlets for kids to experiment creatively and develop themselves as artists and innovators. (Though the belief that TikTok is nurturing aesthetic genius sometimes feels like a Philistine’s delusion, nurtured by an adult establishment that lacks the self-confidence to actually educate its kids into the distinction between quality and rubbish.)
In both cases, though, in a world where Instagram couldn’t rely on 15-year-olds to juice its stats, some of those alleged benefits of social media would still be available via the wider internet, which offered all manner of community, all kinds of outlets for creativity, before Twitter and Facebook came along.
A key problem with social media, from this perspective, isn’t just its online-ness but its scale. As Chris Hayes puts it in a recent essay for The New Yorker, the contemporary internet universalizes “the psychological experience of fame” and takes “all of the mechanisms for human relations and puts them to work” seeking more of it. But that happens in a much more profound way on a network like Instagram, with all its teeming millions of users, than it would on a message board or in a chat room for some specific niche identity or interest.