Social media, news and young people’s wellbeing
In this article, Internet Matters shares findings from their new report exploring the impact of online news on children and young people’s wellbeing.
News consumption has shifted dramatically from traditional media like TV and print to online platforms and social media, especially among younger generations.
Social media offers 24/7 access to news, which keeps young people informed and connected with the world. However, the sheer volume, and variable quality, of this information, exacerbated by the rise of mis- and disinformation and AI-generated content, can pose risks to children.
Where are young people getting their news from?
Social media is young people’s main source of news, with 68% of young people who consume news saying so. Young people tell us that the online world is critical to keeping them informed about current events (74%) and over two-thirds (67%) say that social media is where they learn about breaking news. This makes it important that they receive quality news information to keep them connected to the world.
Even on social media, children continue to get their news from established news outlets like the BBC or ITV (41%). These accounts are also the most trusted source of news information on social media for young people (55%).
Children also see and hear news from influencers' and content creators' accounts, on a par with established news outlets (40%). However, the trust gap is significant: only 16% of children say they trust these creators as a source of news.
Social media algorithms
The role of social media algorithms cannot be overstated. They play a major part in children's passive news consumption, serving news content directly into their feeds whether children want to see it or not. In fact, children told us they often get news from accounts they don't follow, and 40% said they do not follow news accounts at all.
“It’s not like I am searching for it [news], it just comes up on my feed randomly.” (Boy, 15)
A constant stream of news
Six in ten (61%) children who consume news on social media tell us they had seen a story that worried or upset them in the past month. When we asked children what had upset them, 56% cited news stories about war and conflict, 19% violence and death, and 10% crisis events.
Young people highlighted to us how algorithms contributed to this, pushing unwanted and sometimes distressing content onto users.
“The other day I was scrolling, and I saw on a news account someone in a shop was just stabbing people and it was blurred out, but not blurred enough…” (Boy, 15)
“If there are certain things you don’t want to see, they will still come up [on your feed] and there is not much you can do to stop it.” (Girl, 17)
This constant stream of news content affects young people’s wellbeing. Nearly half (47%) of children say that seeing news content on social media gives them new problems to think about, and 41% feel overwhelmed by this news content.
The proliferation of AI-generated content and fake news
Fake news is growing: over a quarter of children say they have seen a fake or AI-generated news story, and a further 41% say they think they have but are not sure. Social media amplifies this problem because it is a largely unverified environment where traditional journalistic standards do not apply. Unlike traditional media, posts do not have to be fact-checked, and weak regulation in this area allows misinformation to spread quickly.
On an individual level, this is problematic for children's wellbeing. Children report feeling confused (30%) or even embarrassed (10%) after falling for a fake or AI-generated news story. Nearly a third (31%) of young people said that seeing a fake news story led them to distrust other news content.
This content also has wider societal consequences. We saw this with the Southport incident in 2024, where misinformation about the identity of the perpetrator went viral on social media and led to riots. The House of Commons Science and Technology Committee concluded that these riots were spurred on by the “viral spread of harmful misinformation”, amplified by recommender systems on major platforms. It does not come as a surprise that 63% of children are worried about the growth of fake news and AI-generated content.
Children need the media literacy skills to tackle these challenges
It is more critical than ever that children and young people have the media literacy skills they need to safely and responsibly navigate news in the online world.
Media literacy skills to identify fake news
Our research shows that young people from higher income households are more likely to report having seen AI-generated or fake news content, with 36% reporting this compared to 20% from lower income households.
As fake news is ubiquitous, this difference most likely reflects gaps in recognition rather than gaps in exposure. Children and young people from higher income households are also more likely to pay closer attention to the news and to access it from a wider range of established outlets. With more opportunities to verify information and a broader knowledge base, these children may be better able to identify fake news stories.
This highlights the importance of equipping all children to navigate the online news environment, including how to verify news information on social media. The need for this is stark: only 34% of children know to verify news information on social media by checking other sources, such as established news outlets.
Media literacy skills to stay safe online
Media literacy is also about staying safe online and having positive experiences. This includes being empowered to use tools that protect them. Yet only 18% of children and young people told us that they used blocking, filtering, or reporting tools to respond to upsetting information in their feeds. This is in line with our previous research on children's experiences of reporting harm online, which shows that although 71% of children say they have experienced reportable harms, only a third (36%) have reported these experiences to the platform.
Beyond just reporting harm, it is also important to teach children how to mitigate and prevent encountering harm. For instance, only 16% of children know how to reset their social media feeds to display different content.
Supporting children to navigate the news
Children require support from the trusted adults around them to navigate this complex and rapidly changing space.
Parents
We know that parents are the people children turn to when something goes wrong online. 84% of children also say they have spoken to their parents about how to tell if online news is true.
But parents too can be vulnerable to fake news and AI-generated content. Parents should have access to the information they need to support children to navigate this ever-changing news environment.
Check out our resources on fake news and misinformation and Find the Fake, a quiz to play as a family to learn and test knowledge on fake news.
Teachers
Schools play a crucial role in shaping how children engage with digital content. Yet only 56% of children who consume news say they’ve been taught how to verify it at school.
Media literacy must be woven into the curriculum at every stage, empowering pupils to question, analyse, and understand the online information they encounter. With the right support from government, teachers can deliver vital media literacy education effectively and confidently.
Take a look at Digital Matters, our interactive platform that supports teachers to teach children about online safety including navigating mis- and disinformation.
Internet Matters
Internet Matters is a not-for-profit that helps families stay safe online, providing guidance and resources for parents, carers and professionals. As part of this work, Internet Matters conducts regular research with children and parents and uses these insights to champion the views and interests of families, making evidence-based recommendations to all those with influence over children's digital lives.