New research published yesterday (10 November 2021) by The Social Switch Project, a collaboration between the charities Catch22 and Redthread, shows that 97 per cent of Catch22’s child sexual exploitation referrals have an online or social media element, with substantial increases related to online grooming and abuse. The research was undertaken by Dr Faith Gordon (@Dr_FaithG), who found that more than 70 per cent of young people had seen violent or explicit content during the 2020 and 2021 lockdowns, including videos of suicide, nudity and extreme violence. Young people described social media as ‘toxic’, and the research found it had a negative impact on their mental health and wellbeing.
Young people cited unwanted contact online from adults, companies and bots, as well as cyberbullying, threats and the sharing of explicit content. However, only 40 per cent of the young people interviewed said they reported online harms; many did not report because they did not know how to, or because they had previously been ignored or had negative experiences. Some also described receiving responses long after making a complaint, which caused them to relive the event or incident. The data provides much-needed context on the wide-reaching implications of the pandemic and the need for services and training for professionals, parents and guardians during this time. The researchers divided their findings into four key themes.
Exposure to unwanted content
- Children and Young People (CYP) are too often exposed to unwanted content online, including graphic imagery and videos.
- CYP perceive some platforms as more negative and “toxic” than others.
- Perceptions of harm differ by age: CYP feel that younger children are most ‘at risk’ of harm online.
Unwanted contact
- CYP speak about the behaviour of others in online spaces, and in particular the unwanted contact they received from adults and, on occasion, from other CYP, commercial companies or bots.
- CYP refer to coming across other users who were clearly significantly older or younger than the age they claimed to be online.
- Unwanted contact takes the form of cyberbullying, threats, sharing of explicit content and harassment. CYP blamed the lack of restrictions on platforms.
Unwanted surveillance and use of data
- While CYP say that some peers want to be ‘noticed’ online and many seek validation through the number of followers or ‘likes’ they receive, CYP are generally forward-thinking in their discussions. They want the option to delete their previous content and refer to wanting a ‘right to be forgotten’.
- CYP are concerned about what happens with the content that they post online, often referring to their job prospects and whether previous content would impact upon those.
- CYP want their privacy to be respected and do not want their data being used without their full knowledge, understanding and consent. They question how their data is being used by companies, agencies, law enforcement and others. Some described phone seizures by police during investigations as a cause for concern, including the timescales for returning devices and the extent of information extracted.
Unreasonable delay in action and lack of redress
- CYP have felt responsible for reporting content, as they do not want other CYP to view it and be equally distressed. However, they felt redress was often not possible, as the “damage” is usually already done once an incident has occurred.
- CYP stated that they sometimes did not hear back after making a complaint, or received responses a long time afterwards, which caused them to relive the event or incident.
- Several CYP did not know how to seek redress. Others felt it was pointless to complain if the company sent only an automated response or if nothing came of the complaint.
The key messages from the report are summarised below.
- Young people want to see better training for professionals and guardians in relation to online behaviour
- Young people felt responsible for reporting content, but also felt the ‘damage’ had already been done
- Children and young people want to see improved monitoring, swift action and accountability from tech organisations, rather than the responsibility being placed on the user
- Police are ‘one step behind’ developments in technology and so need to develop stronger relationships with tech companies
- As well as harms, young people highlighted significant benefits to their online world – in their education, their social lives and in their identity.
The report sets out seven key recommendations, which include improved regulation and legislation for social media companies, greater responsibility so that tech companies are held accountable for inaction, and the involvement of young people in panels consulted on tackling online harms and on the development of games, new content and online spaces.