
Artificial intelligence (AI) is becoming a familiar presence in the tools and platforms used by learners, teachers and families. Schools and settings in Wales are already engaging with AI (sometimes knowingly, often invisibly) through translation tools, accessibility features, generative chatbots, learning apps, and wider digital services.

These recommendations are adapted from guidance used across the Welsh public sector, with a focus on safe, ethical and purposeful use of AI in schools. They align with Welsh values, the Curriculum for Wales, safeguarding responsibilities, and the practical realities faced by staff and learners.

AI can bring real benefits when used thoughtfully, improving efficiency, enabling creativity, supporting inclusion, and building digital literacy. But it also brings risks, including bias, misinformation, over‑reliance, inappropriate data use, and unequal access. The aim is not to use AI more, but to use it better, in ways that strengthen learning, uphold wellbeing, and support every learner to thrive.

We believe that AI developments and use should be:

  • Ethical: Trustworthy and aligned with the values of the people of Wales, while meeting international standards of safety, responsibility, and fairness.
  • Empathetic: Inclusive, accessible, and designed to benefit all, with particular focus on supporting the most vulnerable and disadvantaged.
  • Enterprising: Innovative, collaborative, and committed to strengthening Wales’ research, development, and entrepreneurial ecosystem.
  • Effective: Transparent, accountable, and governed responsibly, ensuring AI delivers measurable impact and public value.
The following recommendations can help schools put these values into practice.

  • Schools may discover that far more AI is in use than expected, from classroom apps and translation tools to learners independently experimenting with tools such as ChatGPT, Google Gemini or Microsoft Copilot. A good starting point is to create a simple internal register of AI tools used by staff or learners.

    This should include both free tools and paid-for platforms, and should be developed jointly with the Data Protection Officer (DPO) and the digital and safeguarding leads to assess actual usage and risks. Wherever the use of AI involves personal data, data protection law applies. It is important to ensure that tools comply with standards set by the Information Commissioner’s Office (ICO) before using them for school-related work. Because AI is also accessible on personal devices, schools need a clear, simple set of rules defining what information can and cannot be entered into AI tools. This protects learners, supports staff confidence, and avoids accidental data breaches.

    A regularly updated register also helps leaders understand risks, spot trends, and provide informed guidance. You can use the ‘Generative AI in schools’ policy template to support you with these considerations and your wider AI policies and practice.
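
    As a purely illustrative sketch (the fields here are suggestions for each school to adapt with its DPO, not a prescribed format), a single register entry might record:

    • Tool: a free generative chatbot
    • Used by: learners, independently
    • Personal data permitted: none
    • Reviewed by: DPO and digital lead
    • Conditions of use: outputs checked for accuracy before being relied on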

  • Schools can benefit from having a diverse group who can advise on AI use. A school-level AI panel could include teachers, governors, digital leads, safeguarding staff, ALN coordinators, support workers and, crucially, learner voice.

    This group does not need to be large or formal. It may sit within an existing digital group, curriculum team, or student council sub-group. What matters is that it represents a range of perspectives and feeds into leadership decisions made by the senior leadership team (SLT) or the Governing Body.
    The panel can:

    • review emerging trends
    • advise on tools staff or learners are using
    • explore cultural and Welsh language considerations
    • ensure safeguarding implications are understood
    • recommend when external support is needed

    This creates shared ownership and avoids decisions being made in isolation.

  • Knowledge, understanding and confidence with AI can vary widely. Some staff and learners are enthusiastic; others feel uncertain or wary. Schools should create safe opportunities to explore AI using age-appropriate tools and examples. For primary learners the focus may be on understanding simple AI concepts and testing them together in a classroom setting, while secondary learners might be ready to engage with more complex and independent applications of AI.
    AI literacy includes:

    • understanding how AI works at a basic level
    • recognising the limitations of AI tools
    • questioning outputs
    • checking information for accuracy
    • spotting bias or harmful content
    • understanding when tools should not be used

    Staff should model these behaviours, reinforcing the digital literacy skills in the Digital Competence Framework (DCF), a cross-curricular component of the Curriculum for Wales.

    Governors and support staff should also be included in training, as they frequently handle communications and data where AI misuse could pose risks.

  • AI can be powerful, but new tools should not be adopted simply because they exist. Schools should routinely ask:
    “Do we need this tool now, or at all?”

    When considering new tools, reflect on:

    • educational benefit
    • cultural relevance (does the AI tool understand Wales?)
    • quality of Welsh language support
    • environmental impact
    • inclusivity and accessibility
    • alternatives that may be simpler

    To benefit from enhanced protections and controls, consider whether the tool is available through the Hwb Platform. See the guidance on Copilot Chat, Google Gemini and NotebookLM for further information. Pilot tools on a small scale first, with clear success criteria. Work with the AI panel to evaluate impact before deciding whether wider adoption is appropriate.

  • Schools should ensure their approach to AI aligns with established guidance and frameworks used across Wales.

    These frameworks support accountability, fairness, transparency and child-centred practice. They help schools assess new tools, understand risks, and make informed decisions about procurement or use. The AI Plan for Wales provides further context about how AI will be scaled up responsibly across Wales.

  • AI responsibilities should be clearly defined. For example:

    • The Headteacher may act as Senior Information Risk Owner (SIRO).
    • The digital lead may act as Information Asset Owner (IAO).
    • Staff should be clear on their responsibilities when using AI tools.

    Job descriptions may need light updates to reflect oversight, risk management and proper data use. Most importantly, every member of staff should understand the school’s simple rules for AI and demonstrate responsible practice. Ensure that those who already deal with risk are aware that AI use is now part of their role (this may include a specific sub-committee of the Governing Body, members of the senior leadership team, or non-teaching staff).

  • Clear communication helps build understanding and trust among staff, pupils, parents and governors. Schools should explain:

    • how AI is used and why
    • what safeguards are in place
    • what learners should expect
    • how risks are managed
    • how families can support safe use at home

    This could be through newsletters, school websites, parent events, pupil assemblies or classroom discussions. Transparency reassures the community and strengthens digital wellbeing.

  • A “critical friend”, such as a local authority digital lead, governor, cluster digital coordinator, or external advisor, can help schools sense‑check plans and highlight risks that may otherwise be missed.

    Periodic external review promotes continuous improvement and ensures schools keep pace with emerging guidance, especially in such a fast-moving area.

  • AI literacy should become part of ongoing professional learning, similar to safeguarding or GDPR awareness. Training does not need to be technical. It should help staff:

    • understand benefits, risks and limitations
    • evaluate AI-generated content
    • spot misinformation
    • protect personal and sensitive data
    • use tools confidently but cautiously

    For learners, AI awareness should sit naturally within lessons, just as the DCF does, supporting safe, ethical and curious engagement.

  • These simple, memorable rules may help staff and learners decide what information can be entered into AI tools:

    1. Is the information public or general knowledge?

    Using free tools may be appropriate, though outputs should not go unchecked or be passed off as one’s own work.

    2. Does it include personal or sensitive information?

    Only use secure, approved tools and platforms. Remember, learners’ personal data must never be entered into AI tools.

    3. Not sure?

    Ask. The AI panel can consult the Data Protection Officer or digital lead.
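
    As a purely illustrative example: asking a free chatbot for general ideas for a history lesson falls under rule 1, and the output should still be checked; drafting a letter home about an individual learner falls under rule 2, meaning only a secure, approved platform should be used and the learner’s personal details should be kept out of the prompt altogether.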

    These rules reduce risk while keeping space for creativity, exploration and learning.

Closing thoughts

AI is changing rapidly, and schools are navigating uncertainties as well as opportunities. By taking an approach that is thoughtful, transparent and grounded in the values of Welsh education, schools can help learners develop the skills, confidence and critical thinking they need to thrive in a digital world.

CDPS 

The Centre for Digital Public Services (CDPS) was established in 2020 as an arm’s-length body of Welsh Government to help deliver the Digital Strategy for Wales, with a focus on Mission 1: digital services.

CDPS supports people, teams and organisations to create sustainable digital transformation, and make public services:

  • designed around the people who use them
  • simple, secure, and convenient
  • accessible, inclusive, and cost-effective