A nine-year-old British girl was targeted by an alleged predator through the gaming platform Roblox, her mother says, sparking an international investigation.
The incident has renewed ongoing concerns about child safety on the platform, which announced changes to its measures last month.
The girl's mother told the ABC she was unaware of the platform's risks.
"I genuinely thought only children went on there. I was very naive," she said.
The alleged predator first contacted the child through Roblox before convincing her to download Discord, a messaging platform popular with gamers.
He allegedly went on to send sexually explicit and other graphic messages.
The mother said: "There were things said in there that she didn't understand, and he told her to Google it, things that she shouldn't know at nine years old."
The ABC reported the case had been referred to Interpol, with authorities in the US state of Minnesota conducting an "open and ongoing" investigation into a suspect.
Roblox attracted about 90 million daily users, including many in New Zealand. An estimated quarter of those users were in the Asia-Pacific region.
Netsafe chief online safety officer Sean Lyons told 1News it goes "a long way" when parents understand the safety features on the platforms their children use.
"For example, how open is the communication between players in the game? Can they communicate with anyone or only someone that they already have a linkage to — for example, someone who is part of a team within the game?"
He said it was also important to proactively talk to young people about online safety, while cautioning that the "vast majority of online interactions are not harmful".
"Potentially, anywhere that we can communicate with another individual could be an opportunity for harm," Lyons added.
"We need to make sure that our tamariki are aware of the challenges and risks that exist online, and that they are able to access help and support when it is required."
There have been growing calls to regulate platforms such as Roblox more heavily in Australia, which also recently moved to ban social media for children under 16.
Last month, Roblox announced it would restrict children from certain messaging features and introduce new parental controls.
Under the suite of changes, children under 13 would be blocked from sending private messages to other users unless they had verified parental permission.
"We have an entire investigative process running behind the scenes to make sure that what's happening on the platform is safe," Roblox chief safety officer Matt Kaufman told the ABC.
He said the platform had a tiered approach to safety, with automated moderation to help review all content that was shared on Roblox.
"We have thousands of people around the world, they speak lots of different languages and they're working 24/7. They're training the AI."