This is an audio transcript of the Tech Tonic podcast episode: ‘Peak social media — The debate over young users’ mental health’

Emma Lembke
So kind of the first sign of it was the fact that I felt myself consistently being anxious when I would post. I would delete posts if they didn’t get enough likes, or if I became anxious thinking, what is everyone going to say if I post twice in one day?

Elaine Moore
That’s Emma Lembke. She’s 20 years old and a university student in the US. But she was talking to me about what happened to her when she was growing up in Alabama. At 12, she got her first social media account on Instagram (typing sound). By the time she was 13, she was struggling.

Do you know what your screen time would have been like at that point? Say, when you were 13 or 14?

Emma Lembke
My screen time would have been upwards of definitely five to six hours, like scrolling mindlessly through social media accounts. But I would probably ballpark it towards eight to nine, nine to 10, depending on the day. At certain points, it was very obsessive. It felt like I was living to document on Instagram or Snapchat (camera clicking). It really was all encompassing.

Elaine Moore
Gradually, Emma started to worry that this near-constant checking of social media was becoming a problem.

Emma Lembke
I was following a very rigid structure on posting and then kind of curating my online persona that began to really allow me to quantify my worth and really attach my worth to my likes, my comments, my followers (ping). I began to get really anxious, depressed.

Elaine Moore
Emma wasn’t just worried about the amount of time she was spending on social media. She was also worried about what she was seeing.

Emma Lembke
As I began to spend more time on these apps, even though I was feeling more anxious and depressed, I was being fed workout videos which would then lead to the next workout video, which would then lead to dieting videos, more pro anorexic content that was being streamed my way that I felt down the line severely impacted my development of my sense of self, my self-esteem, but also led to disordered eating patterns.

Elaine Moore
Now, anxiety and depression in teenagers existed long before social media. But what struck Emma was the way in which the algorithms on these platforms encouraged endless addictive scrolling and actively pushed her towards harmful content.

Emma Lembke
I kind of came to the moment of realisation where I had, you know, I was sitting with my phone and it buzzed and I had a pull over in response to grab for it. And kind of between the buzz and between the grab, I just finally asked, like, why is it that my phone has so much control over me? How am I allowing it to do that? And why is no one speaking up about this?

Elaine Moore
That moment of realisation spurred Emma to start a group where she could talk about these issues with other teenagers. She set up an organisation called Log Off. It pushes social media companies to do more to protect young users. And she says a reckoning is coming. New laws will mean that platforms have to take responsibility for the harm they’re causing to their young users.

Emma Lembke
We’re not just screen-agers. We’re not just passive victims of Big Tech. So what I would say to tech companies is you have to jump on the ship. You have to begin to really come to the table because this legislation is coming.

[MUSIC PLAYING]

Elaine Moore
This is Tech Tonic from the Financial Times. I’m Elaine Moore, deputy editor of the FT’s Lex column. And in this season of the podcast, I’m asking: Have we reached peak social media? And if so, where do we go from here? The past decade has been a very good one for social media giants financially. But over the years, there’s been a growing suspicion that social media might be bad for us, bad for society and bad for our wellbeing. In this episode, I want to find out if that’s really true. Does social media really cause us harm? And what could new child safety laws mean for the future of social media companies?

I joined Facebook in 2007. It felt as if half of my friends were already on there. We posted photos and tagged one another, wrote messages and nonsensical status updates. Although it was public, it felt private and it was fun to be on there. But there’s a sense now that for many people social media is unpleasant, feeds are saturated with ads, and if it’s not ads, they’re full of hatred, arguments, divisive and polarising content, conspiracy theories and disinformation. And there’s now more concern than ever that social media is toxic for a whole generation of kids who grew up with an online presence.

Rep Cathy McMorris Rodgers
You are recognised for five minutes.

Shou Zi Chew
Thank you. Chair Rodgers, ranking member Pallone . . .

Elaine Moore
Earlier this year, the boss of TikTok, Shou Zi Chew, was hauled in front of US Congress to testify.

Shou Zi Chew
Our app is a place where people can be creative and curious . . . 

Elaine Moore
It wasn’t the first time a social media CEO was in the hot seat. But TikTok was the latest app to take America’s youth by storm. And the US lawmakers had two big concerns. The first was anxieties over TikTok’s Chinese ownership.

Rep Cathy McMorris Rodgers
But Chinese Communist party is able to use this as a tool to manipulate America as a whole.

Rep Earl L Carter
As I understand it, there’s a sister app in China. (Inaudible) . . . I’m sorry if I’m butchering the pronunciation.

Elaine Moore
But the other big issue they kept bringing up over and over again was the negative effects that TikTok was having on young people.

Rep Doris Matsui
I’m concerned about the app’s tendency to exacerbate existing mental health challenges.

Rep Earl L Carter
You know about the milk crate. You know about the blackout challenge. You know about the NyQuil chicken challenge.

Rep John Sarbanes
The more time that middle and high schoolers spend on social media, the evidence is, the more likely they are to experience depression and anxiety.

Rep Kathy Castor
Corrosive effect on our kids’ mental and physical wellbeing.

Rep Gus Bilirakis
Would you share this content with your children, with your two children? Would you want them to see this?

Elaine Moore
There are calls to ban TikTok altogether in the US. The state of Montana has passed a bill that will remove TikTok from app stores next year. Authorities are worried about the data that TikTok’s Chinese parent company could collect on American citizens. But concerns about the impact that TikTok is having on children are common across all social media platforms. The rationale is simple: these platforms are designed to be sticky, to keep you on them for as long as possible. The algorithm feeds you content you’ll engage with, no matter what consequences it has on your wellbeing. If the most engaging content happens to be about eating disorders or school shootings, so be it.

[MUSIC PLAYING]

This might seem obvious now, but we have to remember, just a few years ago the conversation was very different. We were talking then about how social media was a threat to democracy. Researchers were studying how malicious actors were using the platform. Take Katie Paul. She works on child safety issues with the Tech Transparency Project. But that’s not where she started.

Katie Paul
I had been focusing on the trafficking of antiquities in the Middle East and north Africa after the Arab spring.

News clip 1
A flourishing trade in Syria, the selling off of a priceless cultural heritage.

Katie Paul
Bloody antiquities is a term that references antiquities that have been plundered and trafficked during conflict, often to fund bad actors, whether it is an aggressive nation state or a terrorist organisation.

News clip 2
Commandos secure Cairo museum in the heart of the Egyptian capital, but they’re too late.

News clip 3
The trade of antiquities is a key revenue stream for Isis.

Elaine Moore
Katie Paul has a background in archaeology and anthropology, and she was working on the illegal trade of relics and artefacts.

Katie Paul
During that research, I also stumbled across a massive online trafficking market for bloody antiquities on Facebook and began tracking that and trying to understand how what is essentially a war crime could be so openly permitted.

Elaine Moore
After studying the way terrorists sold relics on the internet, she began to see similarities with the issues around child safety online. In both cases, platforms are not held responsible for the content that appears on their sites, whether it’s pro-anorexia videos or an Isis affiliate trying to sell an ancient mosaic. That’s because the laws governing these platforms predate the platforms themselves. A law known as Section 230 says that social media companies are, broadly speaking, not liable for content posted by their users, which can seem baffling.

Katie Paul
Even my parents, for instance, they don’t understand why things that seem so blatantly illegal can just be on their . . . on these platforms with no repercussions. And when you explain that the law passed to protect these companies happened in 1996, anyone that was alive then remembers what the internet was like in 1996. I mean, even to sign on, it sounded like a dying robot (dial-up internet connection sound). Technology has changed so much, but the law has not. And this is just one of many examples, unfortunately, where the law is far behind what the technology is doing.

Elaine Moore
Section 230 means that social media companies are not treated as publishers, and Paul says it means the platforms aren’t motivated to moderate and police some of the worst content on their sites. And that’s particularly important when it comes to younger users.

Katie Paul
You’re dealing with a population whose minds are still developing. We know that the algorithms at work have been designed to manipulate. Part of that manipulation is making efforts to keep kids online as long as possible and keep eyeballs on ads.

Elaine Moore
An increasing number of lawmakers agree with Paul about this. There’s been a sudden wave of new legislation. Over the past year, we’ve seen major bills passed and more are being debated this summer, including in Maryland. In California, the focus is on design changes to infinite scroll and other addictive features.

Katie Paul
We’re seeing a big push across states right now for more child protection bills on tech, for more privacy bills in places like Texas and Utah. There’s the age verification issue. But one thing is clear is that whether it’s California or Texas, there’s a bipartisan interest in reining in Big Tech when it comes to protecting children.

Elaine Moore
Social media platforms are under scrutiny from regulators on multiple fronts. There have been calls to break up Facebook on the grounds that it’s a monopoly and efforts to curb the amount of data that companies can gather. Some lawmakers worry that algorithms amplify content that spreads misinformation and disinformation. But protecting children online is the one issue that has bipartisan support.

Katie Paul
Many members of Congress, whether they have children themselves, whether they have grandchildren, can all agree on this is a population that’s being exploited. Part of that is also because nobody wants to get on the floor of Congress or on the floor of their state senate and vote against something that’s meant to protect children. And this has become a foot in the door for this historic bipartisan interest in protecting kids online.

Elaine Moore
While there may be a foot in the door, tech platforms are trying to slam it shut as fast as they can, which is perhaps a sign of how seriously they take the threat.

Katie Paul
Big Tech is working hard to try to push against any of these laws that are meant to protect children’s data, children’s privacy, the kind of content that is algorithmically amplified to them. They pour millions of dollars into lobbying to get those bills killed because kids are a huge market for these companies, and it will really cut into their bottom line if they can no longer collect these data points on children.

Elaine Moore
We’re at a turning point where campaigners believe that regulation is a real possibility, and what’s at stake is the future of the social media business model. But this is happening despite the fact that it’s not particularly clear what proof we have that social media is bad for us, particularly when it comes to our mental health. Beyond the anecdotal, what concrete evidence do we have for social media’s negative effects on its youngest users?

[MUSIC PLAYING]

In the US, lawmakers want social media platforms to do more to protect children online. There’s a generally accepted narrative that social media makes us feel worse, that it’s damaging our mental wellbeing. But here’s the thing — it’s not entirely clear what damage is being done.

Amy Orben
I started researching this area about 10 years ago.

Elaine Moore
Amy Orben is a psychology researcher at Cambridge University in the UK.

Amy Orben
And at that point a lot of the research work was done by people who’d never actually used social media. So as somebody who had used social media as an adolescent, my feeling was that this type of research being done wasn’t adequately showing the complexities of social media.

Elaine Moore
Orben is one of the leading researchers investigating how digital technologies affect adolescent mental health. She runs a lab at Cambridge, funded by public research bodies. Orben acknowledges there has been a general decline in mental health amongst young people that has coincided roughly with the rise in social media. But is social media really the cause? Orben’s long-term studies suggest that if you start using social media more, you’re more likely to feel worse. But also that feeling bad can make you use more social media.

Amy Orben
So for example, in a study of about 17,000 young people, we found that if a young person used more social media than their own average in one year, that predicted a small decrease in wellbeing one year later.

Elaine Moore
Simple enough: when a young person uses more social media, they feel worse.

Amy Orben
But also if they felt worse in one year that predicted a small increase in social media use one year later.

Elaine Moore
In other words, depression or anxiety may also be causing the increase in social media use. Feeling bad might make you scroll mindlessly on your phone, for example. The other problem is there are a lot of other things going on in the lives of teenagers that might impact their mental health besides social media. So even if there is a small correlation between social media use and mental wellbeing, it might not really mean that much. It turns out there are lots of things you can correlate with mental health.

Amy Orben
In one of my studies, eating potatoes had a more negative relationship to wellbeing than screen time had to wellbeing. Wearing glasses had a more negative relationship than screen time as well. I think it shows us the complexity of taking correlational studies as kind of causal, because naturally we wouldn’t start banning glasses in schools, or we would also say, well, it’s probably not the potatoes that are causing the mental health decline. It might be something around poverty or the ability to have a really balanced diet that really makes a difference here. And I think it’s the same with screens. We don’t know yet what the intricate causal network underneath these correlations is.

Elaine Moore
Now, there has been an overall general decrease in young people’s mental wellbeing. And Orben says that’s not small potatoes. But what’s actually causing it?

Amy Orben
It can be connected to many things and probably is connected to many things: the changes in the welfare state; austerity measures in the UK; global financial crisis; this generation probably not having it as good as their parents; global warming; increases in pressures from schools; as well as changes in family structures and changes in how technologies are used.

Elaine Moore
Social media might be a small contributor in the overall decline in mental health in young people. But she says it’s also become a convenient scapegoat, a kind of technological panic.

Amy Orben
We’ve seen this again and again. In the 1940s there were psychologists writing about how the majority of young people and children were addicted to the radio and radio crime dramas. In the sixties, thousands of parents wrote in to the US government saying they were worried about the television grabbing their children’s attention and causing large-scale health changes. We then had the same with video games, and now with social media and smartphones. We have this intermixing between very concrete, high-quality scientific evidence and then a lot of misinformation, really, about the brain impacts of social media.

Elaine Moore
And the danger is that if we’re regulating without properly understanding the problem we’re trying to address, we can end up missing the root cause, the thing we’re actually trying to fix. Orben does think we need regulation. She just thinks we need to go about it differently.

Amy Orben
The real concern is by going into hyperbole and not sticking to what the data actually says, we will be throwing the baby out with the bathwater and our regulation won’t actually be as effective as it could be, and it might be harming people who use that technology very positively.

Elaine Moore
There’s a lot of disagreement about whether social media is really harmful, even when it comes to other major issues like violence or political polarisation. It’s not clear exactly what role social media plays. The assumption is that social media algorithms push increasingly extreme political content, widening the political divide. But in the US, political polarisation is greatest amongst older people, and they use social media much less than younger ones. It’s something researchers would really like to dig into more. But the social media companies are notoriously reluctant to give away any information about how people use their platforms, even in the name of scientific research. So why then has there been this big push to regulate platforms when there isn’t a consensus that social media really is harmful? It’s a question I put to my colleague Hannah Murphy, who covers social media for the FT in San Francisco.

If there is no clear evidence about the impact of social media companies on mental health, particularly for young people, then why are so many lawmakers interested in this subject and trying to push something forward?

Hannah Murphy
I would say there is evidence. It’s not necessarily enough evidence yet, but I think that this will likely be driven in part by the anecdotal experience of the politicians themselves seeing their own children, and a kind of fear of technology. And it’s undeniable that you’ve seen children move from unrestricted free play to now having an iPad in front of them at six years old. I think parents are probably hyper-aware of that. There has been a feeling that regulators have always moved too slowly when it comes to tech, and that this is a huge crisis that’s building up ready to blow. Parents are getting concerned enough to want to try to intervene sooner, that there are enough signs and signals that it’s worth jumping on this right now and assuming the worst.

Elaine Moore
But even if parents think it’s worth jumping on this right now, there’s a bigger debate over how exactly legislation should address these problems.

Hannah Murphy
There are sort of two camps emerging globally among politicians to tackle social media for young people in two different ways. One is that they want young people to just be banned from the platforms altogether, and for parents to have more parental control over what the young person sees. The parent can have access to the child’s account and then restrict that if they so choose. So that’s sort of one camp: try to get more parental controls. And then the other camp is more around trying to place the burden on the companies themselves, on the platforms, to design themselves in a way that is safer for children. A push for the platforms to add a prompt that says, “You’ve been on for six hours now. Time to switch off.” One is about the design of the platforms and places the burden on the companies; the other places the burden more on the parents.

Elaine Moore
If lawmakers and regulators do manage to make some kind of adjustment in young people’s usage, is that going to change user numbers at social media companies?

Hannah Murphy
Take the camp of politicians that calls for parents to have control of children’s accounts, for children to be age-verified in order to use the accounts in the first place and, in some cases, not to be able to have an account until they’re 18. If these sorts of rules come into play, then yes, that will obviously hit user numbers. Not just that, but it also means that the platforms miss out on this window during which sort of impressionable minds might become addicted and loyal to their platforms. And they know that even though at that young age that person might not have any purchasing power or be able to buy anything, once they’ve sort of got them on the platform, they have them later down the line to be able to serve ads to them. So they’re missing out on that window there.

Elaine Moore
Regulations that cut down on young people’s use of social media are a real threat to social media companies and their efforts to recruit more users and more targets for their advertising clients. But Emma Lembke, the young digital activist you heard at the top of the show, says that changes in the law must hold platforms to account. She doesn’t want to kick TikTok out of the United States or to ban any social media app, for that matter. She wants to alter the way they work, giving users more autonomy, not preventing access.

Emma Lembke
I think the way forward in placing that responsibility on tech companies to protect and to prioritise the wellbeing of young people is through design as a vehicle for change, legislative change specifically. Allowing for opt-out recommendation systems gives more user control to limit addictive features like autoplay. What that really does, too, is prevent the harm to young people when they decide to enter online platforms, rather than saying you have to be a certain age before you enter, or leaning into age verification.

Elaine Moore
It’s hard to tell how this push for regulation will end. But despite her early addiction to social media, despite the turmoil it caused her, Emma actually calls herself a digital optimist, and she doesn’t think that social media is going to disappear.

Emma Lembke
The genie is out of the bottle. We’re never going to go back to an age where social media is not prominent or is not widely used by young people. Nor should we. On a more personal note, I’m a girl from Birmingham, Alabama, and there are so many different communities and ideologies and people, places and things that I’ve been able to explore using the online world that I would never have been able to have access to. I think that there are so many beautiful things that are possible with the online world being connected to social media. If it is built in a way that is conducive to the individual and the young person’s wellbeing, privacy, safety, security.

Elaine Moore
Two things are true. In the past two decades, social media use amongst young people has exploded. So have reported rates of depression and other mental health issues. But even after years of research, there’s considerable confusion about how one relates to the other. What makes sense to me is that the endless performance that many social media networks require can be exhausting, even self-destructive, and that the incentives to remain online for as long as possible aren’t conducive to a particularly balanced life. Regulation may be coming, but it may also be slow-moving. What could be more effective is the dawning realisation that more of us are having, that social media platforms are supposed to be fun and entertaining. And if they no longer serve that purpose, then perhaps it’s time to look for something new.

[MUSIC PLAYING]

Next week on Tech Tonic: Will the next phase of social media be powered by a new generation of internet celebrities?

Keith Bielory
These creators generate so much revenue for these platforms. You don’t want to piss them off.

Elaine Moore
And what power do the platforms hold over them?

Kris Collins
I’m trying to diversify my revenue as much as I can. I don’t want to be stuck like, “Hey, YouTube’s gone. Oh, no.”

[MUSIC PLAYING]

Elaine Moore
You’ve been listening to Tech Tonic from the Financial Times with me, Elaine Moore. The producer is Josh Gabert-Doyon, and the senior producer is Edwin Lane. Manuela Saragosa is executive producer. Sound design is by Breen Turner and Samantha Giovinco. Original scoring by Metaphor Music. And before you go, we’re keen to hear more from our listeners about this show, so we’re running a survey which you can find at ft.com/techtonicsurvey. It takes around 10 minutes to complete and we’d appreciate your feedback.

[MUSIC PLAYING]

Copyright The Financial Times Limited 2024. All rights reserved.