When AI tools like ChatGPT started showing up in schools, we all knew we couldn’t just ignore them. AI is in our schools whether we’re ready or not. We need to figure out how it fits into our work, how to use it responsibly, and how to help students think critically about the tools they’re already using.

My go-to book for LC lesson plans lately is “AI in the Library: Strategies, Tools and Ethics for Today’s Schools” by Elissa Malespina. It’s a guide I know I’ll revisit often as AI continues to evolve in education.

This week’s LC focus was on AI-generated video, including deepfakes. The timing is fitting: OpenAI’s release of Sora 2, its powerful text-to-video generator, makes it easier than ever to create convincing synthetic media. The normalisation of deepfakes raises an urgent question: how do we navigate a world where even video can’t be trusted?

Sora 2 generates short, realistic videos with synchronised voices, ambient sound, and cinematic motion. It is currently invite-only, but OpenAI has confirmed it will open to the public soon. The Cameo feature adds another layer of risk: with a one-time scan of your face and voice, Sora lets you insert yourself, or others, into generated videos. The app itself functions like TikTok, with users creating clips of up to ten seconds. The videos we scroll past may look real, may even sound real, but that doesn’t make them true.

Don’t take these videos at face value. Instead, we all need to develop the skills to slow down, question, and verify. That’s the only way to be ready for a world where seeing is no longer believing.