5. AI companion apps

Advice from CEOP Education at the National Crime Agency

AI companion apps are like online friends, girlfriends or boyfriends that people can chat with any time. They’re powered by artificial intelligence (AI) and are designed to feel personal and friendly. 

Unlike general-purpose AI chatbots such as ChatGPT (which can help with answering questions or completing tasks like booking tickets), companion apps are designed to build a personal relationship with the user, often focusing on emotional support and friendship. This can keep people engaged and coming back to the app for more conversation.

AI companion apps may feel comforting, especially during times of loneliness, stress, or curiosity. Young people might use them for:

  • emotional and mental health support
  • practising conversations or flirting
  • help and advice
  • entertainment

Some companion apps even let users design and create their own character or avatar to talk to, and let them share and receive images and videos which can feel exciting or personal.

Whilst AI companion apps may have some benefits, there are also potential risks to be aware of:

  • They’re trained to please the user, which means they often won’t say no, even to talking about harmful or sexual topics.
  • Some companion apps can expose children and young people to sexual or harmful content, even when they didn’t ask for it.
  • They can give unrealistic expectations about relationships, sex and consent.
  • Talking to a companion app that always agrees with you can affect how someone feels about themselves and others.
  • Some people may become emotionally dependent or reliant on the app, which can make real, offline relationships harder to build.
  • The apps can provide misinformation. They may give incorrect advice or information because they do not truly understand context the way humans do, or because they will say something false simply to please the user.
  • The apps often collect personal data, including chat history, images and emotional responses. It is worth thinking carefully about what personal information you share with the app.

There aren’t special laws just for AI companion apps, but they do still have to follow existing rules. These include data protection laws (UK GDPR) which make sure that personal information is kept safe, and the Online Safety Act, which requires some apps and platforms to protect users, especially children and young people, from harmful content.

If an AI companion app makes you feel uncomfortable, unsafe or confused, you can:

  • Report harmful content directly to the app.
  • Stop talking to the app and/or delete it.
  • Talk to a trusted adult, such as a parent, carer or teacher, about how you’re feeling.
  • Call Childline on 0800 1111 or visit www.childline.org.uk for confidential support.

Visit www.themix.org.uk for free information and support for young people. 

It’s important to remember that AI companions are not real people. If an AI companion app has made you feel uncomfortable, it’s normal to feel scared, sad, embarrassed or anxious. But remember, it’s never your fault and you should not feel guilty for deleting the app or asking for help.

If you are feeling lonely, there are other options available. You could visit Childline’s moderated message boards to speak to other young people (and Childline counsellors) about your feelings.