
Europe’s duty to act: Tackling social media’s assault on teen wellbeing


On Tuesday (12 December), the European Parliament will vote on addictive design features in online services. The time has come to regulate this free-wheeling experiment on our children, writes Bryn Austin.

Bryn Austin is a Professor at the Harvard T.H. Chan School of Public Health.

As the time teens spend on social media continues to rise and teen mental health plummets, platforms celebrate the financial windfall of the biggest social experiment ever run on the world’s children.

We may be aghast, but can we really be surprised when maximising engagement time, regardless of the mental health consequences for vulnerable users, is precisely how platforms are built to drive revenue?

It is vital that lawmakers in Europe take the lead in regulating the abusive practices that evidence strongly suggests are worsening the teen mental health crisis. Platitudes about protecting children fall short when profits remain the priority, and monetising misery, in the poignant words of grieving father Ian Russell, is the business model.

The European Parliament is about to vote on whether it will endorse a powerful set of recommendations on addressing addictive design features in online services. This is a vital first step towards healthier and safer online spaces for young people, and one that no lawmakers can in good conscience ignore.

As in the US and indeed around the world, the evidence gathered by my peers in behavioural sciences and epidemiology – and my own research – points to how social media’s predatory business practices threaten the wellbeing and development of young people.

The recent lawsuit launched by 41 US states, suing Meta, Instagram's parent company, for knowingly designing features that exploit and manipulate children, is a clear sign of the pressure required on both sides of the Atlantic to drive stronger regulation and safer platform design.

Image-based apps like Instagram are likely responsible for the most acute harm to young people. Although Meta has guarded its algorithms against scrutiny, let’s consider what we’ve discovered about the company’s decisions surrounding Instagram’s design in recent years – largely through the disclosures of whistle-blower Frances Haugen.

The trove of internal documents she exposed showed that the company privately acknowledged – and documented – what public health professionals and experimental research had already highlighted for years: that Instagram’s features can easily draw vulnerable youth into a dangerous spiral of negative social comparisons, hooking them on unrealistic ideals of appearance, body size, and shape and increasing their risk of eating disorders.

What’s worse, Meta’s corporate leadership knew this but chose not to act. The harm is by design. In recent weeks, unsealed documents from the states’ lawsuit against Meta appear to show that Mark Zuckerberg vetoed or ignored internal requests to mitigate harmful features and increase investment in teen wellbeing.

Social media companies argue that these harms are no more prevalent on their apps than they are in the offline world. This disingenuous claim is quite the opposite of their pitch to advertisers, which rests on a now well-known business model predicated on how much they can manipulate users’ behaviour to algorithmically boost engagement and extend time spent on the platform. In an earnings call just this year, Meta’s leadership boasted that AI-enhanced ‘Reels’ mimicking the TikTok format had increased time spent by 23%.

Meanwhile, in recent years, we’ve seen dramatic increases in clinical-level depression, anxiety, and suicidality among youth, and eating disorder cases among teen girls have doubled in emergency departments across the US, to the alarm of the US Surgeon General. While we need more research, the trend repeats itself in other countries with comparable data, including in Europe. 

The scale and impact of the crisis is severe, and the consequences can be heartbreaking. A recent Amnesty study showed that within an hour, TikTok recommended multiple videos glorifying suicide to an account posing as a 13-year-old, and more than half the videos portrayed mental health struggles. When 14-year-old Molly Russell took her own life in 2017, the contents of her phone revealed that she had been bombarded with 2,100 posts discussing and glorifying self-harm and suicide on Instagram and Pinterest over the preceding six months. A coroner’s report found that this material likely “contributed to her death in a more than minimal way”.

The health of an entire generation hangs in the balance. The practical solutions in the European Parliament’s report on addictive design are both welcome and urgent.

The report calls for the EU to assess the addictive and harmful features of hyper-personalised ‘recommender systems’ that use machine learning to curate news feeds. It also calls on the EU to identify the specific features that pose ‘systemic risks’ to users, including children, and to assess which manipulative practices can be prohibited. A right not to be disturbed is another essential proposal, empowering users by turning all attention-seeking features off by default.

The time has come to put serious regulatory measures in place to prevent this free-wheeling experiment on our children. The European Parliament should approve this report’s recommendations in full.
