In the moment her world shattered three years ago, Stephanie Mistre found her 15-year-old daughter, Marie, dead in the bedroom where she had died by suicide.
“I went from light to darkness in a fraction of a second,” Mistre said, describing the day in September 2021 that marked the start of her fight against TikTok, the Chinese-owned video app she blames for pushing her daughter toward despair.
EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org. Helplines outside the U.S. can be found at www.iasp.info/suicidalthoughts.
Delving into her daughter’s phone after her death, Mistre found videos promoting suicide methods, tutorials, and comments encouraging users to go beyond “mere suicide attempts.” She said TikTok’s algorithm had repeatedly pushed such content to her daughter.
“It was brainwashing,” said Mistre, who lives in Cassis, near Marseille, in the south of France. “They normalized despair and self-harm, turning it into a twisted sense of belonging.”
Now Mistre and six other families are suing TikTok France, accusing the platform of failing to moderate harmful content and exposing children to life-threatening material. Of the seven families, two have lost a child.
Asked about the lawsuit, TikTok said its guidelines forbid any promotion of suicide and that it employs 40,000 trust and safety professionals worldwide, hundreds of them French-speaking moderators, to remove dangerous posts. The company also said it refers users who search for suicide-related videos to mental health services.
Before killing herself, Marie Le Tiec made several videos to explain her decision, citing various difficulties in her life, and quoted a song by the Louisiana-based emo rap group Suicideboys, who are popular on TikTok.
Her mother also claims her daughter was repeatedly bullied and harassed at school and online. In addition to the lawsuit, the 51-year-old mother and her husband have filed a complaint against five of Marie’s classmates and her former high school.
Above all, Mistre blames TikTok, saying that putting the app “in the hands of an empathetic and sensitive teenager who doesn’t know what’s real from what isn’t is like a ticking bomb.”
Scientists have not established a clear link between social media and mental health problems or psychological harm, said Grégoire Borst, a professor of psychology and cognitive neuroscience at Paris-Cité University.
“It’s very difficult to show clear cause and effect in this area,” Borst said, citing a leading peer-reviewed study that found only 0.4% of the differences in adolescents’ well-being could be attributed to social media use.
Moreover, Borst pointed out that no current studies suggest TikTok is any more harmful than rival apps such as Snapchat, X, Facebook, or Instagram.
While most teens use social media without significant harm, the real risks, Borst said, lie with those already facing challenges such as bullying or family instability.
“When teenagers already feel bad about themselves and spend time exposed to distorted images or harmful social comparisons,” it can worsen their mental state, Borst said.
Attorney Laure Boutron-Marmion, who represents the seven families suing TikTok, said their case is based on “extensive evidence.” The company “cannot hide behind the claim that it’s not their responsibility because they don’t create the content,” Boutron-Marmion said.
The lawsuit alleges that TikTok’s algorithm is designed to trap vulnerable users in cycles of despair for profit, and it seeks damages for the families.
“Their strategy is insidious,” Mistre said. “They hook children into depressive content to keep them on the platform, turning them into lucrative re-engagement products.”
Boutron-Marmion noted that TikTok’s Chinese version, Douyin, features much stricter content controls for young users. It includes a “youth mode” that is mandatory for users under 14, restricts screen time to 40 minutes a day and offers only approved content.
“It proves they can moderate content when they choose to,” Boutron-Marmion said. “The absence of those safeguards here is telling.”
A report titled “Children and Screens,” commissioned by French President Emmanuel Macron in April and to which Borst contributed, concluded that certain algorithmic features should be considered addictive and banned from any app in France. The report also called for restricting social media access for minors under 15 in France. Neither measure has been adopted.
TikTok, which faced being shut down in the U.S. until President Donald Trump suspended a ban on it, has also come under scrutiny globally.
The U.S. has seen similar legal efforts by parents. One lawsuit in Los Angeles County accuses Meta and its platforms Instagram and Facebook, as well as Snapchat and TikTok, of designing defective products that cause serious injuries. The lawsuit lists three teens who died by suicide. In another complaint, two tribal nations accuse major social media companies, including YouTube owner Alphabet, of contributing to high rates of suicide among Native youths.
Meta CEO Mark Zuckerberg apologized to parents who had lost children while testifying last year in the U.S. Senate.
In December, Australia enacted a groundbreaking law banning social media accounts for children under 16.
In France, Boutron-Marmion expects TikTok Limited Technologies, the European Union subsidiary of ByteDance, the Chinese company that owns TikTok, to respond to the allegations in the first quarter of 2025. Authorities will later decide whether and when a trial would take place.
When contacted by The Associated Press, TikTok said it had not been notified about the French lawsuit, which was filed in November. It could take months for the French justice system to process the complaint and for authorities in Ireland, home to TikTok’s European headquarters, to formally notify the company, Boutron-Marmion said.
Instead, a TikTok spokesperson highlighted company guidelines that prohibit content promoting suicide or self-harm.
Critics argue that TikTok’s claims of robust moderation fall short.
Imran Ahmed, the CEO of the Center for Countering Digital Hate, dismissed TikTok’s assertion that over 98.8% of harmful videos were flagged and removed between April and June.
When asked about blind spots in their moderation efforts, social media platforms claim that users are able to evade detection by using ambiguous language or allusions that algorithms struggle to flag, Ahmed said.
The term “algospeak” has been coined to describe tactics such as using zebra or armadillo emojis to talk about cutting yourself, or the Swiss flag emoji as an allusion to suicide.
Such code words “aren’t particularly sophisticated,” Ahmed said. “The only reason TikTok can’t find them when independent researchers, journalists, and others can is because they’re not looking hard enough.”
Ahmed’s organization conducted a study in 2022 simulating the experience of a 13-year-old girl on TikTok.
“Within 2.5 minutes, the accounts were served self-harm content,” Ahmed said. “By eight minutes, they saw eating disorder content. On average, every 39 seconds, the algorithm pushed harmful material.”
The algorithm “knows that eating disorder and self-harm content is especially addictive” for young girls, he said.
For Mistre, the fight is deeply personal. Sitting in her daughter’s room, where she has kept the decor untouched for the past three years, she said parents must know about the dangers of social media.
Had she known about the content being sent to her daughter, she never would have allowed her on TikTok, she said. Her voice breaks as she describes Marie as a “sunny, funny” teenager who dreamed of becoming a lawyer.
“In memory of Marie, I will fight as long as I have the strength,” she said. “Parents need to know the truth. We must confront these platforms and demand accountability.”
Associated Press writers Haleluya Hadero and Zen Soo contributed to this story.
—Tom Nouvian, Associated Press