
Responsible AI Teaching - Part 4: Safe and ethical AI use for secondary school students

By Tim Bradbury posted 20 days ago


How do we prepare secondary students to use AI safely and responsibly outside the classroom?

That’s the focus of our latest Five-Minute CPD Drop, created in partnership with the Good Future Foundation. In this video, Alex draws on insights from a focus group of UK secondary teachers to explore the challenges of AI in both supervised (school) and unsupervised (home/online) environments.


Key themes from teachers

From our discussions, three big issues stood out:

  • Ethics & safeguarding
    Teachers want frameworks for ethical AI use, balancing opportunities for learning with clear safeguarding measures.
  • Unsupervised AI use
    Students are already encountering AI through apps like TikTok, Snapchat, and Instagram, and through large language models at home. In these unsupervised spaces, they’re more vulnerable to misinformation, exploitation, and inappropriate content.
  • Critical evaluation
    Educators agreed that it’s not enough to use AI in class – we must help students question what they see, hear, and create with AI.

Why supervised vs. unsupervised matters

In school, students benefit from firewalls, teacher oversight, and structured learning. Away from school, they often navigate AI alone – in social media, games, or when completing homework.

This gap makes it vital that pupils leave school with:

  • Digital resilience – knowing how to cope when things go wrong
  • Critical literacy – recognising bias, misinformation, and manipulative algorithms
  • An ethical compass – understanding privacy, consent, and fairness

As Laura Knight (Sapio Ltd) notes, young people’s first experiences of AI may well be in unregulated spaces. That makes building strong internal guidance and resilience skills in school even more important.


Practical classroom activities

Alex shares three simple but powerful activities to embed critical conversations about AI:

  • Is it AI or fake?
    Students evaluate whether text or images are AI-generated or human-made, learning to spot tell-tale signs and challenge authenticity.
  • Who sees my data?
    Students map where AI touches their lives – from chatbots to social media to adaptive learning platforms – sparking discussion on what data is collected and how it’s used.
  • Concept cartoons
    Using four characters with different viewpoints, pupils debate AI issues such as “How does AI train on my data?” This builds oracy, critical thinking, and awareness of different perspectives.

Beyond the classroom

The Good Future Foundation supports this work with a student council that brings youth voices into AI discussions, and through its AI Quality Mark – a Bronze, Silver, Gold, and Progress award framework that schools can apply for, free of charge. This helps schools embed AI in ways that are safe, ethical, and future-focused.

Watch the video above to see Alex share more ideas and examples of how to bring oracy into your AI teaching.

Supporting resources, certification and evaluation

You can download the supporting activity for this session via the link below:

https://community.stem.org.uk/viewdocument/responsible-ai-teaching-activity-4?CommunityKey=0f32484b-dc5a-4266-8526-01a09365a63a&tab=librarydocuments


What did you think of this session? How will it impact your classroom practice? Let us know after each session and receive your digital badge and certificate of completion: https://forms.office.com/e/CezgKHhv8Q
