
TikTok and the Death of Nex Benedict

— We should think twice about how mental health discussions are regulated on social media

MedpageToday
Melinek is a forensic pathologist.

I'm one of the rare middle-aged women of TikTok. I tell myself -- have convinced myself -- that this social media video platform is a valuable way to reach "the youths" and maybe spark an interest in forensic pathology or science communication. I am deluding myself, of course. The platform is an amusement, an addictive and profitable stream of amateur video entertainment tailor-made to my interests, which tend toward STEM, parenting, medicine, and mental health. And I'm not gonna lie: I lap it up. That's how successful algorithms work.

But recently, the death of a teenager, which garnered extensive attention on TikTok, emphasized to me the dangers of "clicks" and how discussions of suicide are regulated on the platform.

The Nex Benedict Tragedy

It was from the viral-news generators on TikTok that I first learned about Nex Benedict, a 16-year-old nonbinary kid from Oklahoma who had been assaulted in a school bathroom one day and died the next. Normally the death of a teenager does not make national news, but the circumstances of this death pushed the buttons for internet virality. It involved violence and bullying, and looked suspicious.

Nex had been cornered and harassed by girls in the school bathroom and retaliated by splashing the girls with a water bottle. The reaction was disproportionate: the bullies jumped Nex and beat them up. Nex had to go to the emergency department to be evaluated for a head injury.

The next day, Nex became unresponsive and their grandmother (Nex's adoptive parent) called 911. Nex never regained consciousness and died in the hospital soon after. Videos from school cameras and interviews with Nex in the hospital during the initial visit were shared widely on TikTok and other social media platforms, with many posters and commenters noting what they perceived as an absence of visible head injuries.

And after Nex's death, many internet influencers -- people who make their living by attracting clicks on social media sites -- theorized that Nex had been killed. Allegations by these media entrepreneurs that Nex's death was a murder bagged them a ton of engagement. Millions of clicks. Lots of money. All of those theories were tossed around and all of those clicks clicked before any public reports of Nex's actual cause or manner of death had been released.

The Dangers of Speculation and Sensationalizing a Story

Sudden, unexpected, and violent deaths are my professional specialty, so reporters and bloggers often reach out to me when there is a high-profile suspicious death -- like this one. I reviewed the autopsy report when it was released by the Oklahoma Medical Examiner's Office, which determined that Nex died by suicide from a mixed drug intoxication. The report, which I saw online, was redacted and incomplete, probably in an effort to minimize the release of private medical information.

When releasing information about suicide, ethical media entities and medical examiner offices follow a set of guidelines designed to prevent "contagion," or copycat suicides. These guidelines include best practices to avoid sensationalizing the story, instructing reporters to avoid describing specific details about the death scene or the method used, and even to avoid attributing the death to a single trigger or cause, since deaths by suicide are often caused by multiple factors, some of which may be known only to the decedent. Because these protocols offer guidance on things like headlines and article placement, they may seem geared primarily toward professional journalists, but a set of guidelines has also been developed for ethical social media reporting on suicide.

Few, if any, of the TikTokers I saw reporting on Nex Benedict's death followed these guidelines. Instead, they shared details about the specific medications used and, in some cases, still placed the blame on a single speculative cause: the bullying Nex experienced as a gender non-conforming individual, and the fact that Nex was held equally responsible for fighting at school (since they were blamed for escalating the bathroom conflict by splashing their bullies with water).

The Trouble With Social Media Algorithms

The TikTok algorithm suppresses use of the word "suicide," but users get around it with phrases such as "un-alive" or altered spellings in captions like "Su-e-cid3." Viewers can report any video or comment they deem to be actively promoting harm, but at its core the algorithm appears designed to promote things people "like," and people tend to like videos that stir strong emotions. These videos go viral and get pushed under the eyeballs of young social media users (a substantial share of TikTok's consumers are between 10 and 19 years old), and seem to be disproportionately shared on the "for you" page of viewers who may be trans or nonbinary, have already shown interest in mental health issues, and could be struggling.

Putting the burden of policing these videos on the public -- who may not know anything about suicide contagion or how to prevent it -- is irresponsible. But by suppressing all content that even mentions the word "suicide," social media algorithms also collaterally suppress psychologists and psychiatrists on the platform who want to talk openly about prevention and how to support people who need help.

The Need for Change

So what can we do? In schools we need to create a culture of kindness and tolerance, which means re-evaluating "zero tolerance" disciplinary policies. Many zero-tolerance policies meant to shut down conflict are instead lazily punitive and result in struggling kids being sent home to parents who are ill-equipped to resolve their conflicts.

Schools need to focus not on the abolition of conflict but instead on conflict resolution and reconciliation. Rigid zero-tolerance policies end in punishment and isolation for kids who find themselves in conflict for all sorts of reasons. Educators need to instead consider policies crafted by mental health professionals and mediators, with support not only for kids in conflict but also for their parents -- because parents of both the bullies and the bullied can be defensive, blame-casting, and not open to talking.

I'd like to see more doctors and mental health professionals speaking out about anxiety, depression, and conflict resolution on social media, in the virtual ecosystem where our kids are spending more and more time. At the same time, I'd like to see legal regulation with teeth to ensure that algorithms are designed not only by engineers who aim to increase clicks and engagement without regard for the harm they might cause their eventual audience, but also by mental health professionals who can study ways to decrease teen anxiety, isolation, and suicidality.

Nex Benedict's death is no longer trending on TikTok. Now that the manner of death has been determined as suicide, it is unsurprising that click-hungry activists and influencers have moved on to trending hashtags, which get them more eyeballs.

But those of us with young loved ones need to talk to our representatives on school boards and at all levels of government about these issues: just because social media algorithms suppress mentions of suicide doesn't mean suicide isn't there. Suicide among teens is increasing and has been linked to the effects of social media. Online platforms are not in the business of promoting the health and well-being of their users. It's up to us to make sure legislation gets put in place that will motivate them to do so.

If you or someone you know is considering suicide, call or text 988 or contact the 988 Suicide & Crisis Lifeline online.

Judy Melinek, MD, is an American forensic pathologist currently working as a contract forensic pathologist in Wellington, New Zealand. She is the co-author, with her husband, writer T.J. Mitchell, of the memoir Working Stiff: Two Years, 262 Bodies, and the Making of a Medical Examiner. You can follow her on Twitter and Facebook.