The coronavirus lockdown has seen Police Scotland deal with an increase in recorded crimes of groomers sending sexual messages to children. Michael Alexander reports.
Cooped up at home during the coronavirus lockdown, it’s been a challenging time for everyone.
But with many young people spending even more time online to connect with the outside world, the risk of child abuse or exposure to inappropriate material has grown sharply.
Thirteen-year-old Emily (name changed to protect her identity) is amongst those who have come to harm during lockdown. She exchanged messages and photos on Facebook and Snapchat with a ‘boy’ she believed was 15.
However, the ‘boy’ turned out to be 24 and sexually abused her.
Emily’s mum, Wendy (name also changed to protect her identity) said: “It’s important for social media to be regulated and for Facebook and Instagram to take more responsibility to keep the people who use their platform safe. All other businesses have a duty of care to keep children safe, so why not them?”
The dangers have been highlighted by NSPCC chief executive Peter Wanless who described child abuse as an “inconvenient truth” for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom children.
He recently told Prime Minister Boris Johnson that coronavirus has created a “perfect storm for abusers” because platforms had not done enough to tackle safety risks going into the crisis.
New figures revealed via a Freedom of Information request show that 651 offences of communicating indecently with a child were recorded by Police Scotland in the last year, compared to 354 crimes in 2014/15 – an increase of 84%.
That included a rise of 12% in the year to April 2020.
However, the NSPCC is warning there could be a sharper increase this year due to the unique threats caused by coronavirus that are being exacerbated by “years of industry failure” to design basic child protection into platforms.
The leading children’s charity, fighting to end child abuse in the UK, is now calling on the UK Prime Minister to urgently press ahead with legislation that would help prevent offenders from using social media to target children for sexual abuse.
Mr Wanless said: “The Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety.
“He can do this by committing to an Online Harms Bill that puts a legal duty of care on big tech to proactively identify and manage safety risks.
“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”
An NSPCC analysis of data on an equivalent crime from police forces in England and Wales revealed that the Facebook-owned apps Facebook, Messenger, Instagram and WhatsApp were used in 55% of the 5,784 cases recorded between April 2017 and October 2019 in which police noted how a child was groomed. Snapchat was used more than 1,060 times.
A similar breakdown of data was not available from Police Scotland.
However, police figures show related offences continue to rise, with 1,694 recorded in Scotland between April 2019 and February 2020, compared to 1,573 over the same period in 2018/19.
Police Scotland recently launched its latest campaign targeting sexual predators who groom and abuse children online.
The new #GetHelpOrGetCaught campaign, which has finished its initial run, proactively targeted men who are either already offending or at risk of offending.
It featured a film which challenged behaviour and asked the question: if you wouldn’t do it in the real world, why groom and abuse children in the online world?
Police Scotland consulted partners on the campaign, including Stop It Now! Scotland, which has since reported a 200% increase during lockdown in the number of people accessing self-help resources to stop online child sexual abuse.
This included a peak in traffic after Police Scotland launched its online child sexual abuse prevention campaign: the number of people accessing the ‘Get Help’ section of the Stop It Now! Scotland website rose from 60 in the four weeks before launch to 185 in the four weeks after.
More than 79,000 people across Scotland viewed the Stop It Now! Scotland website during that same period.
The NSPCC has also warned of serious safeguarding risks arising from a growing trend of calls on videoconferencing giant Zoom being ‘bombed’ with child sexual abuse images.
Several weeks ago, a Fife Scout group had to quickly abandon its attempt at a ‘virtual camp’ catch-up using Zoom after it was apparently hacked with pornographic images.
This came weeks after Zoom chiefs condemned the “horrible” hijacking of an online Angus Council meeting which saw participants subjected to porn images and indecent abuse. Zoom says its security measures have been tightened in recent weeks.
Superintendent Tim Ross, Safer Communities, said: “We are broadly supportive of the legislative proposals to introduce an internet regulator to tackle online harms but this is a matter for Parliament to decide.”
Facebook told The Courier it uses PhotoDNA, which scans all images and videos on Instagram and Facebook and flags known child exploitation material so it can be quickly removed.
In addition to PhotoDNA, the company has technology that proactively detects child nudity and previously unknown child exploitation content when it is uploaded.
The company says it manually reviews this content, and if it violates its policies it reports it to police and removes the account in question.
Facebook says 99% of child nudity content is detected by its technology and removed from the Facebook platform automatically.
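In broad strokes, flagging “known” material works by computing a robust signature for each uploaded image and comparing it against a database of signatures of previously identified material. PhotoDNA itself is proprietary, so the toy “average hash” below is only a hypothetical stand-in for the general idea; the function names, sample data and distance threshold are all invented for illustration.

```python
# Illustrative sketch only: a simplified hash-matching approach, NOT the
# actual PhotoDNA algorithm, which is proprietary to Microsoft.

def average_hash(pixels):
    """Hash a tiny grayscale image (list of 0-255 ints): one bit per
    pixel, set if that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_known(pixels, known_hashes, max_distance=2):
    """Flag an image if its hash is close to any known signature,
    so minor edits to a flagged image still match."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# A previously flagged image and a slightly altered copy of it.
flagged = [10, 200, 30, 220, 15, 210, 25, 205]
altered = [12, 198, 33, 219, 14, 212, 24, 207]  # minor pixel edits
database = {average_hash(flagged)}

print(is_known(altered, database))    # the altered copy still matches
print(is_known([128] * 8, database))  # an unrelated image does not
```

The point of a perceptual-style hash, as opposed to an exact checksum, is that small alterations (cropping, recompression, brightness changes) do not defeat the match.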
Facebook and Instagram also work with child protection experts, including specialist law enforcement teams like CEOP in the UK and NCMEC in the US, to keep young people safe.
A Facebook company spokesperson said: “There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.
“We have a content and security team of over 35,000 people investigating reports from our community and working to keep our platforms safe.
“Our teams also work closely with child protection experts and law enforcement, reporting content directly to specialists such as CEOP and NCMEC.”