Read the script below

Tyla: Hello

Natalie: – and welcome

Colin: – to Safeguarding Soundbites! I’m Colin

Tyla: I’m Tyla

Natalie: and I’m Natalie! We’re here to bring you your Safeguarding Soundbites for this week’s online safeguarding news!

Tyla: If you’re wondering why all three of us are here this week, it’s because this is the last Safeguarding Soundbites before we take a break for the summer.

Colin: Whether you’re looking forward to your summer break starting in a few weeks, or you’re already enjoying it, we’re going to pause our weekly podcast for the summer while we all (hopefully!) enjoy a bit of sunshine. So, let’s start off today with our social media round-up. Natalie?

Natalie: Thank you, Colin! This week has been abuzz with activity for social media platforms – and not just because two social media CEO heavyweights are making plans to fight each other in a cage match. Telegram has announced they will be taking a leaf out of the Meta playbook to release their own version of the Stories feature, done “the Telegram way”. The platform is endeavouring to make their version different by allowing users to repost channel messages to stories and to set a story to expire or remain permanently available. The feature is set to be released on the Telegram platform in early July, after it graduates from its testing phase.

Tyla: Sounds like it will make the Telegram experience more familiar to newer users!

Colin: It certainly does! Especially as Stories are among the most popular features on almost every major social media platform.

Natalie: Which is why other platforms are very eager to replicate them. However, just because platforms like Telegram are trying to implement features that more popular platforms like Instagram have made ‘standard’ doesn’t mean they are necessarily safer. There are quite a few risks that can come with Stories, as they can ‘disappear’ or seem ‘less permanent’ than posting something on a profile page or a grid. That is why it is important to discuss with young people what is appropriate to share, what platform safety features they should be using, and who they should accept as friends or allow to view what they post. It’s also important to make sure you know about the risks that come with each individual platform. For example – did you know that there are no effective age restrictions on Telegram?

Tyla: I didn’t, no!

Natalie: This makes it harder to determine what is appropriate for users to see on the platform, and our online safety experts have warned that there is limited moderation on Telegram, as well as harmful content that could pose a risk to young people who decide to use it. To find out more about Telegram, why not check out our suite of Safeguarding Apps, or find our online shareable on our website?

Tyla: Our shareables are an excellent way to get all the information you need about popular platforms in one place! And speaking of popular platforms, social media giant TikTok’s Family Pairing tool is now giving parents and carers personalised control over the content the young people in their care see while scrolling the app. Family Pairing has previously allowed parents and carers to link their TikTok account to their young person’s, but only to adjust content and privacy settings. Now TikTok is planning to implement a content filter which will aim to help parents reduce the likelihood of young people “viewing content they may uniquely find jarring.”

Natalie: Goodness, I find that sentence a bit ‘uniquely jarring’!

Colin: Tell me about it! That isn’t language we would use, but we think they are trying to say that different young people will have their own limits around the type of content they see, or the harmful content that will affect them. For example, a young person may not want to see videos about dieting, or violent content, and these filters will help keep that material out of their view.

Tyla: What TikTok is trying to do is allow families to work together to tailor a young person’s online experience, while also respecting their online rights and privacy. It means that parents can feel more secure about their young person being on TikTok, without having to remove the app altogether.

Colin: And are young people allowed to know what their parents and carers are restricting them from seeing?

Tyla: Yes, actually! This is the default setting for the feature. When parents and carers use the Family Pairing tool the way it is meant to be used, their young person can see the safety features that have been set up and understand what restrictions are in place.

Colin: That’s good. It is so important to have rules and guidelines, and to use features like this one to help young people be safer, but a big part of teaching them how to be digitally resilient is being transparent with them about how these security features work and why they are important.

Natalie: I couldn’t agree more! When does this feature roll out, Tyla?

Tyla: While there is no official release date just yet, their announcement suggests it will roll out very soon! There are also a few new features coming in the next few months that focus on “increasing transparency for all users”, which will be worth watching out for.

Natalie: And TikTok just released an update to their targeted advertising policy, didn’t they?

Tyla: Yes, they did. Starting Thursday 28th June, anyone registered on TikTok between the ages of 13 and 17 will no longer see personalised ads based on their online or offline activities.

Colin: So there won’t be any advertisements based on their interests, which might influence them or feed into the algorithm and surface similar content?

Tyla: Exactly. This update keeps the platform in line with the EU Digital Services Act, which governs what businesses can do online in the EU. It’s another step in the right direction for protecting young people who use the platform!

Colin: One very important thing to remember is that these safety features will only work if the young person has registered on TikTok with their correct age. It might be worth having a conversation with your young person if they use TikTok! While we’re waiting for the feature to be released, why not head to our Online Safety Centre, where you can find details on how to use the safety features of popular platforms? Our team of experts keeps our information up to date and easy to understand for parents and carers who want to make their child’s online experience more secure and teach them how to report and block harmful content.

Natalie: We’re back again with more Online Safety Bill news, and this week mega tech company Apple have joined the opposition to the Bill’s proposal to monitor encrypted messages.

Tyla: What are they looking for?

Natalie: Well, as we’ve discussed before, the Online Safety Bill includes powers that could be used to force encrypted messaging services (like WhatsApp) to scan all user messages for child sexual abuse material.

Tyla: Right – and I know we’ve talked about it on Soundbites already, but for those who need a refresher, can you remind us why this is an issue, Natalie?

Natalie: Of course! This new ‘rule’ is a bit of an ethical conundrum, which is a big part of the reason Apple have joined the fight against this clause. Apple (along with 80 other organisations and tech experts) have written to Technology Minister Chloe Smith in an appeal to have this part of the Bill reconsidered. Their argument is that it should be amended to protect end-to-end encryption, which is a form of security that prevents anyone but the sender and recipient from seeing the content of any messages. This includes text, photo, audio, and video messages sent between people.

Colin: Which makes complete sense. However, it can also be dangerous when it comes to illegal content like child sexual abuse material.

Natalie: Exactly, Colin. This technology, while beneficial against security risks like hackers, also benefits a lot of illegal activity. It gives criminals a ‘safe place’, as it were, to discuss and send information back and forth without worrying that they are being watched by law enforcement. A big part of the Online Safety Bill has been fighting the way illegal content like child sexual abuse material is shared and spread. Protecting this kind of encryption, or excluding it from the Bill’s reach, makes it a lot harder for police and other officials to stop illegal behaviour and catch the relevant criminals.

Tyla: But then encryption also protects against hackers – and hacking is illegal too…

Natalie: And this is why it is so hotly contested. The Government have stated that they believe end-to-end encryption should only be used if social media and tech firms can “simultaneously prevent abhorrent child sexual abuse” from happening on their platforms. The BBC reported that a package of amendments to the Bill will be revealed in the coming weeks, but there is no indication yet whether these amendments will satisfy the companies opposing the Bill.

Colin: At the very least, it’s encouraging to see that these discussions are happening, and that different points of view are being covered.

Natalie: Absolutely!

Tyla: It’s also good to see that the Online Safety Bill is so well covered in the news, because it means more people will be aware of it when it becomes law.

Colin: Unfortunately, the illegal trade in child sexual abuse images is gaining new traction with the inclusion of materials generated by AI technology. According to a report from the BBC, paedophiles, or people who have a sexually deviant interest in children, have begun to use this new technology to create images that are then distributed through paid subscriptions on content-sharing sites like Patreon. Patreon has come forward to say it has a ‘zero tolerance’ policy on this type of graphic imagery being available on its platform.

Natalie: So these images are being generated through the same AI software that we’ve seen popping up recently? Where a person describes what they want the image to look like or contain, and the AI creates an image based on that?

Colin: Yes – unfortunately, investigators have found that software such as Stable Diffusion has been used in exactly this way to create the images. Some of them are very lifelike, which can make it harder for investigators to discern whether a real child is an actual victim of abuse somewhere in the world.

Tyla: Stable Diffusion was actually created for use in art and graphic design. As a graphic designer myself, I can see the benefit of the technology. It’s a shame that people are misusing it in such a horrific way.

Colin: It really is a shame.

Natalie: So because these images are created by AI, that means that the images aren’t ‘real’, right?

Colin: Yes, but that doesn’t make them any less harmful or illegal. In fact, UK police online child abuse investigation teams have already clarified that they are treating this artificial imagery in the same way they treat real images. This means it is illegal to create, possess, publish, or transfer these images in the UK. The National Police Chiefs’ Council (NPCC) has warned that it is wrong to claim these “synthetic” images are not ‘real’ abuse just because no real children were harmed or depicted, especially as images of real children can be altered through AI software to make them look as though they are not ‘real’ or to place them in different inappropriate scenarios. The NPCC believe that people with a harmful sexual interest in children could gradually move up the scale of offending – from thought, to synthetic imagery, to the abuse of a real child – if given access to these materials or the platform to create them. This makes synthetic images a real threat, especially as some of these image creators are aiming to produce hundreds or thousands of images a month to keep up with demand.

Tyla: My goodness.

Colin: I know. What’s more, many of these closed subscription groups can have up to 100 anonymous members, and those members can begin to share links to materials in which real children are actually being harmed.

Natalie: So this creation of synthetic sexual images of children could actually increase the demand for images of real children being abused?

Colin: That is a major concern, yes. There is also the additional strain on policing and law enforcement who deal with these investigations, which can take attention away from actual children who may need their help. The government has also shared that the Online Safety Bill will require companies to “take proactive action” against child sexual abuse material and other online harms, “or face huge fines.”

Tyla: I suppose we’ll see how this all comes into effect over the next few months.

Colin: We will indeed.

Tyla: The Online Safety Bill is something we have been monitoring for months here at Ineqe and Safer Schools, and we welcome the proposed changes. It’s never been more important to try and protect children and young people online, especially as a new survey commissioned by Kent’s Police and Crime Commissioner has shown that almost one in three young people have been abused online.

Natalie: One in three? That’s shocking.

Tyla: As online life becomes more commonplace for younger generations, so do online harms and risks. One of these is online bullying and the devastating impact it can have on the development of children and young people as they grow. The Youth Survey, completed by 4,400 children, revealed that the number of children who have been bullied online has risen by 18% in the last five years.

Colin: And this is a situation where any increase is not what we want to be seeing.

Tyla: Precisely. They reported that 26% had been bullied while on their way to school, with 33% saying they had been left feeling frightened. What’s more worrisome is that only half of those who had been scared told a parent or carer what was happening, and a further 30% didn’t tell anyone. Despite this, 90% of children and young people returned to the app where the abuse took place and continued to use it.

Natalie: And did the report say anything about who was doing the bullying?

Tyla: It did, actually. It revealed that over half of the children who were bullied said they knew their bully, and 11% of respondents admitted to having cyberbullied others themselves.

Colin: Cyberbullying can make it so much harder for children and young people to feel they can escape their bullies. This is especially important to remember going into the summer holidays. Just because school is out for the summer doesn’t mean that bullying has stopped!

Tyla: Exactly, Colin. Together, we want to make sure everyone feels safe, protected, and empowered to stand up to online risks like cyberbullying. If you are worried about someone in your care, or would like to help a young person who you know is struggling with online bullying, head to your Safer Schools App today for helpful resources, advice, and signposts for help.

Tyla: Now, we also know that summertime can sometimes mean an increase in screentime too!

Natalie: We live in a digital world, after all!

Tyla: What might come as a surprise is that a Barnardo’s survey found that 71% of 11-to-17-year-olds are expected to spend more time online during this year’s summer holidays. It makes sense that, with more downtime, many children and young people will turn to playing video games, chatting with friends online, or scrolling through online platforms to fill the time.

Colin: And even then, I’m sure many parents and carers will be very familiar with the phrase “I’m bored!” over the next few months!

Tyla: Exactly, and we want to try to help with that, especially as increased time on screens also means increased risk of encountering online harms. Fear not – we’ve pulled together our Top Three Tips for a Safer Digital Summer to help you out!

Colin: You can find this helpful article on our suite of Safeguarding Apps, including the Safer Schools App, or on our website. Think of it as our gift to you for a safer summer!

Natalie: And now for our Safeguarding Success Story of the Week!

Tyla: I’ll miss these!

Natalie: Well, you won’t have to wait too long for the next one.

Colin: That’s right, we’ll be back with our Soundbites Summer Special at the end of July! So Natalie, tell us – what’s been the big win for safeguarding this week?

Natalie: Well, the Online Safety Bill strikes again as the government plans to overhaul the laws around ‘revenge porn’. In other words, sharing or threatening to share private explicit images or videos without the consent of the person in them. The government have taken a significant step forward to make it easier to prosecute offenders and protect victims.

Tyla: That sounds like positive news – but hold on. Do we still use the term ‘revenge porn’?

Natalie: You’re right Tyla, our weekly listeners might also know that we don’t use the term ‘revenge porn’. It’s widely agreed that it does not accurately reflect what it is for victims – and that is abuse. Intimate image abuse is the term we’ll be using for this story.

So, under the new laws, victims will no longer have to prove that their perpetrators intended to cause distress, which is a huge relief for those who have been through such a traumatic experience.

Colin: It has previously been difficult for victims to prove that the perpetrators meant to cause them harm and distress. Someone could just claim they “accidentally shared it” or send a message to say they didn’t mean to cause any harm or even delete any incriminating messages, and prosecutors would have a harder time proving they meant to hurt the victim.

Natalie: Exactly, and unfortunately it was a legal loophole that many had exploited. So thankfully now, perpetrators could face up to six months in jail for sharing an intimate image. But in cases where it can be proven that the offender sought to cause distress, alarm, or embarrassment, or shared the material for sexual gratification, they could be given a two-year jail term or even be placed on the sex offenders’ register.

Tyla: It must be a relief for victims not to have the added burden of proving intent.

Colin: Definitely, it’s good to see the government taking this issue seriously. But I’m wondering how this affects the sharing of deepfakes – images that have been manipulated to look explicit using someone’s face, for example. Although it may not be their real body, this can be equally distressing for the victim.

Natalie: Well, under the new legislation, the sharing of deepfakes without consent will also be criminalised! Natasha Saunders, a survivor of domestic abuse, said she was thrilled that the government has finally listened to the concerns of victims of intimate image abuse. And Lisa King, of Refuge, the UK’s largest provider of shelters for domestic abuse victims, called these changes a “significant moment” for women who have been threatened with the sharing of their private intimate images.

Colin: It’s clear that these measures are welcomed by those who work tirelessly to support and advocate for victims and that these changes were long overdue.

Tyla: Absolutely!

Colin: Well! We did it!

Natalie: We did!

Tyla: Our last Safeguarding Soundbites episode before the summer!

Colin: It really has been a privilege to bring you important safeguarding news in this way – and it’s been so fun to do it together!

Natalie: It has! On behalf of the team at Safer Schools, we would love to wish you and your family a safe, fun, connected summer.

Tyla: And remember – we’ll be back in a few weeks for our Summer Special!

Colin: Until then,

All: stay safe!
