AI, Education and Copyright: Insights from EDUtech 2025

June 17, 2025

EDUtech 2025 brought together thousands of educators, edtech developers and policy leaders to explore the future of learning. Unsurprisingly, artificial intelligence dominated the agenda — from keynote stages to classroom demos. 

As delegates, we saw firsthand the enormous potential of AI in education, as well as the critical questions it raises around policy, pedagogy, and copyright. Here’s what we took away. 

 

Human-led, AI-supported: A shift in teaching roles 

Sal Khan, founder of Khan Academy, opened the event with a strong message: AI is not here to replace teachers — it’s here to assist them. 

He likened AI to another transformative human invention with inherent risks and huge potential benefits: fire. 

Used well, AI could free teachers to focus more on the human side of learning: motivation, feedback and support. Teaching, he argued, is one of the most secure professions because of its uniquely human core. 

However, when asked about the data used to train AI tools like Khanmigo, Khan acknowledged a legal grey area. Khan Academy uses OpenAI’s models and does not train its own. Copyright concerns, he said, will “likely be settled in court” — but warned that if responsible actors are slowed down, less ethical platforms may move ahead faster. 

This highlights why licensing clarity and robust copyright frameworks remain so important as the sector evolves. 

 

Strategy first: Implementing AI with purpose 

Professor Rose Luckin (University College London) struck a more cautious note. She encouraged educators to focus not just on what AI can do, but on why and how we use it.

Her guidance: 

  • Be strategic and collaborative in implementation 
  • Build clear policies around AI use 
  • Define a theory of change for each tool 
  • Ensure decisions are rooted in pedagogy, not novelty 

Her message resonated with providers across sectors: successful use of AI starts with professional judgement and clear values. 

 

A confronting question: What happens if we get it wrong? 

Dr Danny Liu (University of Sydney) posed a sobering challenge to the audience: 

“What will the headlines be in five years’ time if we mess up AI?” 

It was a moment that brought together the threads of risk, ethics and responsibility. As the tools evolve rapidly, Liu reminded us that educators — not engineers — must lead the conversation on how AI is used in learning. 

Liu also introduced a memorable metaphor: the AI x Assessment Menu — a conceptual tool showing how AI can ethically support learning across multiple phases. 

The menu includes: 

  • Soups (Critical Friend) – reflection prompts, study tips 
  • Entrees (Getting Started) – brainstorming, outlines 
  • Bread Service (Literature Engagement) – summarising, decoding jargon 
  • Mains (Content Creation) – drafting text, creating visuals 
  • Lighter Mains (Analysis) – interrogating data, exploring arguments 
  • Coffees (Editing) – grammar, tone, shortening 
  • Desserts (Feedback) – rubric-based review or draft comments 

But it was Liu’s framing that resonated most: 

“Our role isn’t to remove all the junk food from the table [i.e. ban AI in education]. It’s to teach students how to eat a balanced diet.” 

Rather than banning tools or fearing misuse, educators should help students develop critical thinking, ethical awareness, and judgment — skills that are essential both for academic integrity and lifelong learning. 

At Copyright Agency, we see this as a clear parallel to how content is managed in education: not by restricting access, but by supporting lawful, informed and respectful use. 

 

AI in practice: Tools built with teachers, not for them 

Among the most impressive demonstrations were AI tools developed with — and for — the teaching profession: 

  • NSW EduChat and SA’s EdChat: platforms helping teachers write communications, interpret policies and streamline admin, all built with guardrails around data privacy and copyright 
  • SA’s LEAP App: an assessment tool that reduced English proficiency testing time from 30 minutes to just 42 seconds, without uploading or repurposing any third-party content 

These examples show what responsible, policy-aligned AI use can look like in education — where compliance, clarity and educator trust are baked into design. 

 

Where copyright fits in 

As AI becomes more embedded in education, it raises important questions about how content is used, shared and generated. We’ll continue to monitor developments and keep providers informed as the picture becomes clearer. 

 

Final reflections 

AI in education is not just a technology shift — it’s a shift in thinking. As Sal Khan reminded us, we must treat it like fire: powerful, transformative, and needing clear boundaries. 

With the right frameworks in place — including copyright — education providers can embrace new tools while protecting the integrity of teaching, learning and content creation. 

At Copyright Agency, we’re committed to: 

  • Supporting educators as they navigate change 
  • Protecting creators whose work powers learning 
  • Enabling innovation that respects legal and ethical boundaries 

As EDUtech 2025 made clear, we’re entering a new era — and our collective role is to ensure it’s one built on balance, integrity and care.