Snapchat most-used app for online grooming, NSPCC says


Messaging app Snapchat is the most used platform for online grooming, according to police figures provided to children’s charity the NSPCC.

More than 7,000 Sexual Communication with a Child offences were recorded across the UK in the year to March 2024 – the highest number since the offence was created.

Snapchat accounted for almost half of the cases where the platform used for grooming was recorded by the police.

The NSPCC said it showed society was “still waiting for technology companies to make their platforms safe for children”.

Snapchat told the BBC it had “zero tolerance” for the sexual exploitation of young people and had extra safety measures in place for teenagers and their parents.

Becky Riggs, the National Police Chiefs’ Council lead for child protection, described the data as “shocking”.

“It is imperative that the responsibility for protecting children online is placed on the companies that create spaces for them and the regulator strengthens the rules that social media platforms must follow,” she added.

Groomed at eight years old

The gender of grooming victims was not always recorded by the police, but in the cases where it was known, four out of five victims were girls.

Nicki – whose real name the BBC is not using – was eight when she was messaged on a gaming app by a groomer who encouraged her to move onto Snapchat to chat.

“I don’t need to explain the details, but everything you can imagine happening in those conversations did happen – videos, photos, requests for certain material from Nicki, and so on,” explained her mother, whom the BBC is calling Sarah.

Sarah then created a fake Snapchat profile pretending to be her daughter, and the man messaged her – at which point she contacted the police.

She now checks her daughter’s devices and messages every week, despite her daughter’s objections.

“It’s my responsibility as a mother to make sure she’s safe,” she told the BBC.

She said parents “can’t rely” on apps and games to do that work for them.

Snapchat ‘design problems’

Snapchat is far from the biggest social media platform in the UK – but it is very popular with children and teenagers.

That makes it “a place where adults looking to groom children are likely to go”, says Rani Govender, policy manager for child safety online at the NSPCC.

But Ms Govender says there are also “design problems with Snapchat which are putting children at risk”.

Snapchat messages and images disappear after 24 hours – making incriminating behaviour harder to track – and senders also know if the recipient has screenshotted a message.

Ms Govender says the NSPCC hears directly from children who single out Snapchat as a concern.

“When they report [to Snapchat], they are not listened to – and they are also able to see extreme and violent content on the app,” she told the BBC.

A spokesperson for Snapchat told the BBC that the sexual exploitation of young people was “appalling”.

“If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities,” they added.

Record grooming offences

Recorded grooming cases have risen every year since the offence of Sexual Communication with a Child came into force in 2017, reaching a record 7,062 this year.

Of the 1,824 cases last year where the platform used was recorded, 48% took place on Snapchat.

The number of grooming offences recorded on Snapchat has increased every year since 2018/19.

Grooming offences recorded on WhatsApp also rose slightly last year. On Instagram and Facebook, recorded cases have fallen over the past few years, according to the figures. All three of those platforms are owned by Meta.

WhatsApp told the BBC it has “robust safety measures” in place to protect people on its app.

Jess Phillips, minister for safeguarding and violence against women and girls, said social media companies “have a responsibility to stop this vile abuse from happening on their platforms”.

In a statement, she added: “Under the Online Safety Act, they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.”

The Online Safety Act includes a legal requirement for technology platforms to keep children safe.

From December, big tech firms will have to publish their risk assessments of illegal harms on their platforms.

Media regulator Ofcom, which will enforce those rules, said: “Our draft codes of practice include strong measures that will help prevent grooming by making it more difficult for perpetrators to contact children.

“We are prepared to use the full extent of our enforcement powers against any company that comes up short when the time comes.”
