Read the script below

Natalie: Hello and welcome to Safeguarding Soundbites.

Colin: This is the weekly podcast that keeps you in the know with all the week’s important safeguarding updates and news.

Natalie: It sure is! I’m Natalie and he’s Colin and this week, we’re talking about the removal of political content on Threads and Instagram…

Colin: Cyber-attacks on schools.

Natalie: And much more. Colin, do you want to start us off?

Colin: This week, a deeply concerning story has emerged regarding child sexual abuse materials. Reports allege that school pupils have been accessing and sharing child abuse images on Snapchat. The claims came to light during events organised by the school itself, aiming to promote safe internet practices among students. Police are investigating and working with the school, and Snapchat has said that this type of material has no place on its platform. Now, we don’t know any more details on this story, such as whether the images involved were found online or were self-generated.

But I want to signpost listeners to our resources on self-generated sexual imagery, which you can find on our website ineqe.com or our Safer Schools apps. There, you’ll find really useful information on the subject, plus practical advice on how you can effectively respond if a child or young person in your care has created, shared or lost control of an image.

Natalie: Very alarming story. And unfortunately, our next story is also related to child sexual abuse imagery. A charity called the Lucy Faithfull Foundation has reported that it is receiving calls from people who are confused about the ethics of viewing AI-created child abuse imagery. The Lucy Faithfull Foundation is a UK-wide child protection charity dedicated solely to preventing child sexual abuse, and it says that callers to its helpline think AI images are blurring the boundaries between what is illegal and what is morally wrong. The charity is warning callers that creating or viewing these materials is still illegal, even if they are created by AI.

Now our next story is maybe one of the first examples of the new Online Safety Act in action… A registered sex offender has become the first person convicted of cyber-flashing in England and Wales under the newly implemented Online Safety Act. This landmark case sends a powerful message about protecting individuals from unsolicited and harmful digital content.

The man in question used WhatsApp to send an unsolicited image of himself exposing himself to both a 15-year-old girl and a woman. Thankfully, the woman in this instance responded swiftly, taking screenshots of the image and promptly reporting him to Essex Police. This decisive action led to his arrest and subsequent conviction.

It’s crucial to acknowledge the importance of the Online Safety Act in this case. This legislation, which came into effect on January 31st this year, criminalises the act of sending unsolicited sexual images with the intent to cause distress or alarm. This conviction demonstrates the potential of the act to safeguard individuals from online harassment and promote a safer digital environment.

Colin: Absolutely. I wanted to also share with our listeners an update from Meta. That’s the parent company of Instagram and Facebook, but also Threads, which is sort of Meta’s version of Twitter/X. They’ve announced plans to remove political content from recommendations on both Threads and Instagram.

This move aims to steer away from replicating the often heated and potentially harmful political discourse found on other platforms like X. However, it’s crucial to clarify that political content won’t vanish entirely. Users will still be able to follow and engage with accounts focused on politics, accessing their updates as usual. It’s purely the proactive recommendation of such content that’s being removed.

Meta emphasises that this will be a gradual rollout, allowing them to carefully monitor and tweak the changes to ensure they achieve the desired effect.

Natalie: Interesting – I’ve not used Threads myself, but I’ll definitely be keeping an eye out over on Instagram to see if I notice any changes. With this being an election year over in America, along with many other countries – possibly even here? – this seems like a timely move on Meta’s part.

In other news, a school in Buckinghamshire recently suffered a cyber-attack, resulting in the leak of sensitive personal data linked to its community on the dark web. Hackers exploited vulnerabilities to steal the information, despite the school’s attempts to contain the attack.

The nature of the leaked information remains unclear, but the stolen data was published on the dark web, a hidden part of the internet often used for illegal activity. This incident highlights the importance of robust cyber security measures in schools to protect sensitive student and staff information… and I’d like to remind our listeners about our upcoming training in cyber security. Here’s Ryan to tell us more:

[Ad Break] In today’s digital age, cyber security plays a crucial role. But it’s not just about protecting your own information – it’s about protecting our schools, our organisations and the children and young people in our care. Our cyber security webinars will help improve your understanding of what cyber security is and why it matters. Gain confidence by learning how to better protect against vulnerabilities and improve your response to potential dangers. Plus, our cyber security training is CPD certified and is available at both beginner and advanced levels. With webinars coming up soon, make sure you visit ineqe.com and head to the webinars page to book your place today.

Colin: And we’re back now with our safeguarding success story of the week, and it’s about a new partnership between the Internet Watch Foundation and the Public Interest Registry that will give registries the tools to disrupt sexual abuse material online. It’s a bit tech-y, but basically this is going to stop criminals from hopping from one domain (like a website address) to another. Currently, if a site hosting this type of content gets taken offline, it often comes back again under a slightly different domain. For example, think of ineqe.com changing to ineqe1.com. This new program will alert participating registries in real time, allowing for faster action to be taken.

Natalie: Definitely tech-y, but definitely a fantastic initiative! Well, that’s all from us for this week. Join us next time for more news and alerts – and don’t forget to visit ineqe.com to find out more about our upcoming cyber security training.

Colin: And if you and the child or young person in your care haven’t checked out The Online Safety Show yet… well, why not?! Head over to theonlinesafetyshow.com to find all our episodes, including our latest Safer Internet Day episode, in which you can learn more about the Online Safety Act with our special guest Bimpe Archer from Ofcom. And while you’re there, why not take part in our survey? Until next time…

Both: Stay safe!


