Daily Safeguarding Update
Your regular in-app roundup of current digital safeguarding news.
Tech companies urged to reveal number of children on platforms
The Children’s Commissioner and the Culture and Education Secretaries have urged companies to reveal how many children use their platforms.
The companies include Meta (owner of Facebook and Instagram), TikTok, Snapchat and Twitter, alongside Google and Apple.
All these platforms require a minimum age of 13 for their users.
This follows a warning to tech companies to expect tougher regulation under the Online Safety Bill, and a claim by Facebook whistle-blower Frances Haugen that the platform could make a ‘huge dent’ in the number of underage users but chooses not to.
Rise in number of children arrested for far-right terrorism offences
Counter-terror police have warned parents to be vigilant following a rise in the number of children being arrested for offences linked to far-right terrorism.
In the year ending June 2021, children under the age of 18 made up 13% of all terrorism arrests, up from 5% the previous year.
Vicky Washington, Counter Terrorism Policing’s national co-ordinator for Prevent, has stated that the COVID-19 lockdowns created the ‘perfect storm’ for the potential radicalisation of children in online spaces.
Washington reports that members of the far right are using social media, and even video games, to radicalise children and young people.
Crime ‘hotspot’ schools to get county line gang support
The Department for Education in England has announced that schools in crime ‘hotspots’ across the country with pupils at risk of violence and involvement in county lines gangs will be offered targeted support.
This will see the rollout of 10 ‘SAFE’ (Support, Attend, Fulfil, Succeed) taskforces, led by local school leaders, to help prevent pupils becoming involved in criminal activity.
The aim is to improve attendance, reduce exclusions and maintain young people’s focus on education.