• We’re responding to reports from our safeguarding network, and coverage in the news, about hateful content on popular platforms

  • Today it was reported that TikTok has deleted a collection of anti-Semitic videos with over 6.5 million views

  • We have been aware for some time of the challenges posed by hateful content on platforms used by young people, and we echo concerns about the impact this has on their health and wellbeing

  • Online hate is a significant contributor to the wider problem of online harms

This problem is not unique to one platform but appears across the wide range of social media apps used by children and young people.

Examples of content on TikTok

Why you should be concerned:

  • Certain sites have been used to radicalise young people

  • Footage of violent attacks, instruction manuals and extremist statements have been circulated on platforms used by young people

  • Hateful messages can impact how young people see certain groups or communities

  • Young people can feel ostracised if they belong to groups targeted by online hate

  • Young people may not fully understand the context of online hate

  • Young people may feel pressure to mimic or participate in viral trends that involve hateful content or actions that may constitute hate crimes

Talking to young people about hateful content online: 

  • Teach them how to cope with hateful content when they do see it

  • Empower them to tell you when they see things that are hurtful or upsetting

  • Talk to them about how to report online hate crimes

  • Encourage them to think about ‘dark humour’ and how it can be hurtful to other people

To report online material that promotes terrorism or extremism, you can use the Home Office Anonymous Reporting Portal 

To report hate crime and online hate material, you can use the True Vision Reporting Portal