How social media’s toxic content sends teens into ‘a dangerous spiral’

October 8, 2021 – Eating disorders expert Bryn Austin, professor in the Department of Social and Behavioral Sciences, discusses the recent revelation that Facebook has long known that its Instagram app is harming teens' mental health.

Q: Leaked documents from Facebook show that the company has known for at least two years that its Instagram app is making body image issues worse for teens, particularly girls. What’s your reaction to this news?

A: I was aghast at the news—but not surprised. We’ve known for years that social media platforms—especially image-based platforms like Instagram—have very harmful effects on teen mental health, especially for teens struggling with body image, anxiety, depression, and eating disorders. From experimental research, we know that Instagram, with its algorithmically driven feeds of content tailored to each user’s engagement patterns, can draw vulnerable teens into a dangerous spiral of negative social comparison and hook them on unrealistic ideals of appearance and body size and shape. Clinicians and parents have been sounding the alarm about this for years. So to hear that Instagram’s own research shows this too is not surprising. What astounds me, though, is what whistleblower Frances Haugen exposed: that, in internal conversations at Instagram, staff and senior leadership acknowledged these very damning findings, and yet the actions they’ve taken in response have been little more than window dressing, sidestepping the fundamental problem of the platform’s predatory algorithms. This revelation is what leaves me aghast.

Q: In a recent blog post, Instagram’s head of public policy wrote that the company knows that social media “can be a place where people have negative experiences” and that they’re working to mitigate the problem, but added, “Issues like negative social comparison and anxiety exist in the world, so they’re going to exist on social media too.” What do you make of this argument?

A: Instagram is peddling a false narrative that the platform is simply a reflection of its users’ interests and experiences, without distortion or manipulation by the platform. But Instagram knows full well that this is not true. In fact, their very business model is predicated on how much they can manipulate users’ behavior to boost engagement and extend time spent on the platform, attention that the platform then sells to advertisers. Instagram is literally selling users’ attention. The company knows that strong negative emotions, which can be provoked by negative social comparison, keep users’ attention longer than other emotions—and Instagram’s algorithms are expressly designed to push teens toward toxic content so that they stay on the platform. For teens struggling with body image, anxiety, or other mental health issues, negative social comparison is a dangerous trap, intensifying their engagement with the platform while worsening their symptoms. But with Instagram’s nefarious business model, every additional minute of users’ attention—regardless of the mental health impact—translates into more profits.

Keep in mind that this is not just about putting teens in a bad mood. Over time, with exposure to harmful content on social media, the negative impacts add up. And we now have more cause for worry than ever, with the pandemic worsening mental health stressors and social isolation for teens, pushing millions of youth to increase their social media use. We are witnessing dramatic increases in clinical-level depression, anxiety, and suicidality, and eating disorder cases have doubled or even tripled at children’s hospitals across the country.

Q: What steps are necessary to lessen potential harm to teens from Instagram?

A: If we have learned anything from the recent Congressional hearings with the whistleblower, the Wall Street Journal investigative reporting, and other important research, it’s that Instagram and Facebook will not—and likely cannot—solve this very serious social problem on their own. The business model, which has proven itself to be exquisitely profitable, is self-reinforcing for investors and top management. The platform’s predatory algorithms have been aggressively guarded, keeping them from being scrutinized by the public, researchers, or government. In fact, U.S. federal regulation on social media hasn’t been meaningfully updated in decades, leaving protections for users and society woefully inadequate.

But with the new revelations, society’s opinion of the industry may have soured, and there may be a new willingness to demand meaningful oversight and regulation. What’s encouraging is that, on the heels of the recent Congressional hearings, several pieces of legislation are already in the works to establish a new government system of algorithm auditors, who would have the expertise and authority to require social media algorithms to meet basic standards of safety and transparency for children and users of all ages on Instagram and other social media platforms.

Q: What advice do you have for parents, and for teens who use the platform?

A: Until we have meaningful government oversight in place, there is still a lot that teens and parents can do. Although it’s a real struggle for parents to keep their kids off social media, they can set limits on its use, for instance by requiring that everyone’s phones go into a basket at mealtimes and at bedtime. Parents can also block upsetting content and keep dialogue open about how different types of content can make a young person feel about themselves. Equally important, teens and parents can get involved in advocacy, with groups such as the Eating Disorders Coalition and others, to advance federal legislation to strengthen oversight of social media platforms. With all that we know today about the harmful effects of social media and its algorithms, combined with the powerful stories of teens, parents, and community advocates, we may finally have the opportunity to get meaningful federal regulation in place.

Karen Feldscher

photo: iStock