Report Harmful Content
In this article, SWGfL explains the role of its national reporting centre in providing advice about all types of online harm and signposting users to the correct services.
In the digital age, the internet has become an integral part of our lives, providing a wealth of information and opportunities for communication and entertainment. However, alongside its benefits, the online world also harbours harmful content that can pose risks to individuals, especially vulnerable groups such as children and young people. To combat this issue, SWGfL, supported by the Welsh Government, established Report Harmful Content, the national reporting centre that assists and empowers anyone over the age of 13 to report harmful content online.
Report Harmful Content is a service that allows users to report various forms of harmful online content, helping to build a safer and more inclusive online environment. By actively using this platform, individuals contribute to combating online harms, holding offenders accountable, and fostering a culture of social responsibility. Together, we can work towards a safer and more enjoyable online experience for all.
Harmful content is anything online that causes a person distress or harm.
Report Harmful Content (RHC) provides advice about all types of online harm, signposting users to the correct services and highlighting public reporting routes for non-criminal content. Where the response from industry isn’t what was expected, Report Harmful Content can follow this up directly with our network of industry partners.
Online harm encompasses a wide range of content and can be very subjective. What may be harmful to one person might not be considered an issue by someone else. To address this, we studied the community guidelines of several different platforms and concluded that the following areas of legal but harmful content are most likely to violate platform terms:
- threats
- impersonation
- bullying or harassment
- self-harm or suicide content
- online abuse
- violent content
- unwanted sexual advances
- pornographic content
When should you report harmful content to the police?
If you are unsure about whether to contact the police, consider the following questions:
- Is someone in immediate danger?
- Has a threat to someone’s life been made?
- Has someone’s safety been compromised?
- Is someone being forced to take part in sexual behaviours online?
If you answered yes to any of these questions, we recommend contacting the police as an emergency. It is always best to contact the police by dialling 999 if you, or a person you are helping, is in immediate danger. You can report other, non-emergency situations (i.e. those that do not require an immediate police response) by dialling 101.
When does harmful content become criminal content?
It’s not always easy to determine when harmful content becomes criminal in nature. UK laws relating to online safety date back as far as the 1960s, so there isn’t always a clear set of criteria for determining whether content is criminal. You can find out more about UK laws surrounding online behaviour on the Report Harmful Content website.
In addition to this, interpretation of harmful behaviour online is subjective; what may be harmful to one person might not be considered an issue by someone else. This makes it harder to understand when exactly harmful behaviour crosses the threshold into criminal behaviour.
The eight types of harmful content we accept reports for are not always specific criminal offences in UK law. However, criminal laws can apply to harassment or threatening behaviour. For example, if you receive threatening, obscene, or repeated messages and fear for your safety, this is against the law and you should contact the police. Context is taken into consideration, and the police determine their response on a case-by-case basis.
How do I submit a report to Report Harmful Content?
Before you submit a report to us, it is essential that you have reported the material to the social media platform at least 48 hours beforehand. You can find guidance on how to do this in the report section of our website. If you have already made a report to a social media platform and would like the outcome reviewed by us, you can submit a report via our reporting wizard.
Report Harmful Content can escalate unsuccessful reports that have been made to online platforms. If the platform has responded to your report stating that no rules have been broken (platform rules are commonly called community guidelines or community standards, and breaches of them are referred to as violations), we will ask you to show us their response. From here, Report Harmful Content practitioners will review the content and/or screenshots where it is appropriate and legal to do so.
We are often asked, ‘Why do I need to show you platform responses to my report?’ We understand that this can sometimes feel frustrating, and we want to explain the main reasons behind this request:
- Online platforms offer many different reporting routes depending on the type of violation being experienced. Viewing the platform’s response helps practitioners identify whether the correct reporting route has been used, and signpost accordingly if a more applicable route is available. This is also a requirement of our partnership work with online platforms, which allows us to keep doing what we were set up to do.
- We can share this information with online platforms in the context of specific cases, providing them with valuable feedback that helps them improve and streamline the reporting process.
- It reduces the risk of cases being over-reported, which can have a detrimental effect.
How do we escalate reports?
It is important to remember that our role as a service is to mediate only once the correct methods of reporting have been attempted. That said, if you have made a report and waited over a week for a response, we may be able to look into it further. Unfortunately, the fact that a platform has not removed content does not mean we can have it automatically reviewed and removed. Practitioners review content against online platforms’ own community guidelines and, in some cases, we may reach the same conclusion (i.e., that no violations have been found). However, in this scenario, we will always try to offer you a further explanation of why the platform may have come to this decision. Where possible, we will also offer further support, advice, and signposting towards more relevant services.
In other cases, after reviewing the content reported, we may disagree with an online platform’s initial decision. If this is the case, Report Harmful Content will escalate your case to our contacts at the relevant platform. To do this effectively, we will often ask (if you have not already provided them) for further details about what has happened. It is often difficult to judge what is happening from a single URL (website address), so as well as the URLs themselves, we need to know the context behind the situation. Additional screenshots, or links to the relevant comments or images being reported, may also be needed.
Although we can escalate content with our contacts at online platforms and make a case for action to be taken, we are unable to action content removals ourselves. There is sometimes a delay between the time we escalate content and a platform’s response, which is due to the platform’s own moderation processes and is outside our control. We really do appreciate that this may cause some anxiety and frustration, but please rest assured that we will continue to chase escalations and update our clients along the way towards resolution.
For further information, the Report Harmful Content website provides advice on how to report both illegal content and legal but harmful content. Organisations, including schools, can also download the Report Harmful Content Button for free to signpost their users towards the service. To get a better understanding of additional reporting routes, please visit the SWGfL Reporting Hub. You can also use REIYA, the online chatbot, for additional support.
SWGfL
SWGfL is a not-for-profit charity ensuring everyone can benefit from technology free from harm. Forming one third of the UK Safer Internet Centre, our experts advise schools, public bodies and industry on appropriate actions to take with regard to safeguarding and advancing positive online safety policies.
SWGfL has been at the forefront of online safety for the past two decades, delivering engaging presentations and training to a wide variety of audiences nationally and internationally. Our work has brought online safety to the centre of public attention, ensuring everyone can develop their understanding of what online safety truly means in an ever-changing world.