Summarised by Centrist
Christchurch documentary maker Nadia Maxwell conducted an experiment: she set up a TikTok account posing as a 13-year-old girl.
“It took 22 minutes and 15 seconds for TikTok to show me the first suicide-related video,” she said.
Maxwell, who also posed as a teenager on Instagram and Snapchat, tailored the account’s preferences to interests like kittens and netball. Yet the algorithm quickly steered her towards videos about suicide, child abuse, and violent crime. “It was quite shocking… I felt so gross,” Maxwell said, adding that the experiment left her “emotionally confused.”
“What must this be doing to the brains of 11-year-old kids?” she asked.
TikTok defended its safety measures, saying younger users are subject to stricter content controls and can use filters or restricted modes.
However, critics such as cybersafety specialist John Parsons argued that such measures are insufficient. “Social media often represents a distorted version of reality,” Parsons said, warning that exposure at a young age fosters anxiety and harmful dopamine-chasing behaviours. “These algorithms not only cater to a user’s existing interests but expand them in potentially harmful ways,” he said.
Maxwell advocates for stricter platform accountability and significant algorithm changes to protect young users.
“These kids have got a portal in their pocket to the digital world, and it’s an unrealistic expectation on parents to monitor every minute,” she said.