Official figures reveal referrals about criminal content have more than doubled in 12 months. Video / Mark Mitchell / Corey Fleming
A 36-year-old has been arrested in New Zealand for posing as an 11-year-old on Snapchat.
Alerts from Snapchat about grooming and child sex abuse material have more than doubled in a year.
NZ Police received 1549 referrals about child sex abuse material on Snapchat last year, up from 617 in 2023.
An online safety expert told the Herald the problem is probably much larger than reported.
Alerts from Snapchat about predatory grooming and child sex abuse material shared by users in New Zealand have more than doubled in a 12-month period, according to figures obtained exclusively by the Herald.
The Herald has learned of a recent case where a 36-year-old man was arrested after pretending to be an 11-year-old girl on Snapchat.
It’s alleged he convinced two girls – aged between 9 and 13 – to provide him with video of themselves engaging in sex acts.
An increasing number of complaints are also being referred to police about TikTok, which the video sharing platform put down to its growing popularity.
New Zealand police are also regularly sent information about predators scouring other platforms, including Facebook and Instagram, and the gaming site Roblox, which is popular with children.
Social media and gaming platforms are required by law to refer child exploitation material to the National Centre for Missing and Exploited Children (NCMEC) in the United States.
That agency then funnels information to relevant law enforcement agencies around the world.
The following table, obtained by the Herald under the Official Information Act, gives a breakdown of complaints that have landed with detectives in New Zealand over the past few years.
The data shows the number of complaints police receive varies significantly from company to company.
Roblox – an app that hosts hundreds of user-generated games – alerted authorities to just 23 complaints related to New Zealand predators or victims last year.
The company told the Herald it does not allow “peer-to-peer” image sharing like other platforms, which may account for the comparatively small number of alerts.
Tipoffs about child sex abuse material sent by Snapchat to police in New Zealand have skyrocketed in the past 12 months.
At the other end of the spectrum, police received 1549 referrals about child sex abuse material on Snapchat last year – a massive increase on 2023 when 617 complaints were made.
A Snapchat spokesman said the company introduced a new function last year that allowed users to report child sexual exploitation and abuse material (CSEA).
“We observed an increase in CSEA-related reports and enforcements in H1 2024 [the first half of last year]. Specifically, we saw a 64% increase in total in-app reports by users,” he said.
Meta, which owns Facebook and Instagram, referred more cases to police here than any other platform over the past few years.
A spokesman for the company said it works “aggressively” to combat the issue.
Detective Senior Sergeant Kepal Richards, manager of the Online Child Exploitation Across New Zealand (OCEANZ) team, said a growing population, smarter phones and an increase in online communities contributed to the problem.
He told the Herald anywhere children congregate online could be a hunting ground for predators.
“Any platform where young people are, child sexual offenders are potentially there as well,” he said.
He said even if platforms like Roblox don’t allow image sharing, that won’t stop offenders from connecting with young people and then taking them to another site or server.
“If a child sex offender befriends a young person on a gaming platform, they’ll quickly try and take them to a messaging service that perhaps is end-to-end encrypted, which gives them that protection from identification, which makes it harder for us to investigate.”
He said massive increases in referrals – like from Snapchat – could be because of a range of factors including improved moderation or detection, an indecent image that goes viral and is subsequently reported multiple times, or because of a platform’s growing user base.
Referrals to police “only a fraction” of abuse
Rob Cope is the co-founder of Our Kids Online and regularly speaks at schools to parents and pupils about online safety.

Cope believes social media companies are detecting only a tiny fraction of offending because most children won’t report sexual abuse.

He said in many cases sharing nudes online had become “normalised”: children often don’t know that what they’re being exposed to is illegal, and there’s a reluctance to blow the whistle if it could lead to restrictions on their social media use.
“Only a fraction of what’s actually going on is ever reported, because it’s mostly kids who are seeing it, and they’re not going to report it, because why would you nark on it?”
He said the way predators operate on social media was also relevant: they trap young people by demanding secrecy, threatening that otherwise they’ll “share nudes with the whole world”.
“All the horrible stories I’ve heard lately - kids with porn, beheadings, disembowelments, torture, rape and stuff - all of it was on Snapchat.”
Cope argued social media should be restricted to people aged 18 and over, given the content and people children are exposed to.
“Are you happy with your child being exposed to everything you could possibly ever imagine on their social media? And are you happy with them talking to complete strangers all the time?”
He accused gaming sites like Roblox of being “predator playgrounds” because paedophiles could hide behind an avatar and pretend to be whoever they liked.
Sexual bribes for ‘robux’, grown men acting as kids
Tautoko Mai Sexual Harm Support CEO Julie Sach. Photo / supplied
Tautoko Mai Sexual Harm Support CEO Julie Sach says children as young as 5 are being targeted by online predators.
She told the Herald gaming platforms – including PlayStation and Xbox – open children up to interactions with total strangers.
“Through these interactions children can be convinced to join platforms without safety measures,” she said.
She said entering an email address to join unsafe sites was “easy” for children.
Staff at Tautoko Mai, who support victims of sexual harm, gave the Herald multiple recent examples of cases they’d encountered.
This included children being “bribed” into providing sexual content in exchange for Robux [the virtual currency used in Roblox gaming platforms].
Sach’s team also provided support to a 7-year-old boy who, together with a friend, had been putting sticks, grass and their fingers inside each other at school.
“The 7-year-old boy was coached by his friend who had learnt from a Minecraft video on YouTube that you can ‘stick things up your a***’.”
In another case, two girls aged between 9 and 13 years old were groomed by a 36-year-old man who claimed online to be an 11-year-old girl.
Sach said Snapchat, video platform Zoomerang and the chat app Discord were used to engage with the girls.
“[The offender] got our client and other girls known to the client to share pictures and engage in sexual acts with each other,” she said.
She said the offender was arrested and two victims were receiving counselling support.
“Both [victims] are feeling huge shame but also grief at the loss of their ‘friend’. It’s had a huge impact on both families too, with one mother also needing counselling as a result.”
Sach said Snapchat was a “constant” problem, with children making fake accounts to bully others, adults making profiles to pretend they’re kids, and girls who claim to be older receiving attention from men who encourage them to send nude images.
Another common threat used against intermediate and high school-aged children was predators demanding victims share their locations on “snapmaps” or nudes would be leaked to public Instagram sites known as “mugshot” sites.
Despite Meta saying it does not allow nudity or sexually explicit content on its platforms, the Herald was easily able to find a New Zealand-based “mug” site on Instagram, which contained sexually explicit material and videos of brutal schoolyard fights.
The rise of AI-generated child abuse material
Police and the DIA are investigating several cases of artificial intelligence being used to sexualise images of children.
The Department of Internal Affairs (DIA) told the Herald artificial intelligence was also increasingly being used to sexualise children.
Senior investigator with the DIA’s digital child exploitation team Jon Peacock said several investigations were under way.
“It [AI] gives people the ability to take photos of children they know, their own children for example or their neighbours’ and to use AI to modify content to produce sexually explicit material out of it.”
He said as well as modifying images of real children, AI allowed predators to create material bespoke to them.
“They might have a particular interest or fascination or sexual interest that they want to express through this media. Rather than having to try and find a child that matches that perfect picture in their head, they have the software to generate it for them on demand.”
The manager of the DIA’s child exploitation team, Tim Houston, said the challenge with AI is determining whether investigators are dealing with a real child or not.
Regardless, creating, possessing or distributing such material is still illegal.
“The ease with which this type of technology enables offenders to do this is of concern,” he said.
The DIA and police had a warning for parents when it came to social media use: be mindful of what images, videos or information you post about your children online.
Detective Senior Sergeant Kepal Richards said predators harvest details about a child such as which school they attend, or which sports clubs they’re associated with, to gain their trust when approaching them online.
He said back-to-school imagery was an example of how seemingly innocent posts could create opportunities for predators to learn more about their victims.
“Parents will give away identifying particulars such as a school name, a street, the address of a sports team, which could place their child at risk. My advice is to be mindful of what you post. If you do want to post that type of imagery, make sure you remove any identifying particulars.”
What are social media companies doing to protect children?
Popular social media and gaming platforms say they have robust measures in place to detect, remove and report child sex abuse material.
All social media and gaming platforms contacted by the Herald emphasised their efforts to constantly innovate and use new technology to prevent children being harmed.
Snapchat said it used cutting-edge technology to identify illegal content and regularly engaged with law enforcement agencies.
“The sexual exploitation of any young person is horrific, illegal and against our policies. We use proactive technology to find and remove content exploiting minors,” a spokeswoman told the Herald.
Roblox said it uses filters to prevent personal information like home addresses or phone numbers being shared between players, and uses AI to detect and prevent aggressive, violent or sexualised voice chat.
New protections rolled out in November mean those under the age of 13 will not be able to send direct messages to each other.
TikTok said it uses machine learning to proactively identify new child abuse material, and technology to identify re-uploaded abuse material.
The company also emphasised that direct messaging isn’t permitted among users younger than 16 years old, and that it engaged with both police and the DIA in New Zealand.
“There is no finish line when it comes to safety. Last year, we spent US$2 billion to keep our platform and our global community safe and will continue to invest heavily in this area,” a spokesman said.
Meta said it also uses technology to identify suspicious accounts and restricts adults from starting private chats with teens who are not connected to them.
The tech giant said it also sent “safety notices” to teenagers to warn them of risky accounts, and said it referred more notifications of child exploitation to authorities than any other platform.
Last year, the company said it removed or reported 50 million pieces of child exploitation content, most of which it said was found proactively before it was reported.
However, Julie Sach was critical of efforts by big tech companies to protect young people.
She said offenders often bypassed detection tools by using coded language or emojis and many platforms took too long to report harmful content or predatory behaviour.
“AI tools need to be better trained to detect grooming behaviour, and human moderators should intervene faster.”
She also called for the Government and tech platforms alike to implement stronger verification methods to stop underage users accessing high-risk platforms.
Michael Morrah is a senior investigative reporter/team leader at the Herald. He won the best coverage of a major news event at the 2024 Voyager NZ Media Awards and has twice been named reporter of the year. He has been a broadcast journalist for 20 years and joined the Herald’s video team in July 2024.