OPINION

What teachers need to know before using AI for Indigenous education

Scott Alterator, Alexia Maddox, Clare Southerton & Stefan Schutt

If you ask ChatGPT to write a Welcome to Country, it might caution that only an Elder can deliver a Welcome and instead offer you a template of an Acknowledgement of Country. There’s just one problem: acknowledging Country isn’t a template.

Thu 4 Dec 2025 06.00

Photo: AAP Image/Rounak Amini

If you ask ChatGPT to write a Welcome to Country, it might caution that only an Elder can deliver a Welcome and instead offer you a template of an Acknowledgement of Country. There’s just one problem: acknowledging Country isn’t a template. It should be personal and meaningful.

Under the Australian Curriculum, teachers must embed Aboriginal and Torres Strait Islander histories and cultures across their teaching. Yet many feel unprepared. More than half of teachers over 35 report lacking confidence, citing fear of doing the wrong thing, limited knowledge of protocols, and feeling overwhelmed. Some avoid engaging with Indigenous content entirely unless it’s compulsory.

Our research testing two popular GenAI models (ChatGPT and Microsoft Copilot) revealed serious failures, demonstrating significant risks for educators turning to AI amid excessive workloads, burnout, and teacher shortages.

What We Found

We tested the models with a series of prompts that progressed from simple to more complex requests, mimicking different levels of teacher competence with Indigenous protocols. The results were alarming.

Both platforms exhibited what AI researchers have called “sycophantic deception”: prioritising user satisfaction over accuracy. Rather than flagging errors, their responses smoothed over mistakes and produced plausible-sounding outputs regardless of cultural appropriateness.

For example, in our testing we asked the models for a Welcome to Country. Both responded enthusiastically. ChatGPT praised the request as “wonderful” and “respectful” before producing guidance for an Acknowledgement of Country instead, while continuing to call it a Welcome. Microsoft Copilot did slightly better, explaining the difference between Welcomes and Acknowledgements. But then, unprompted, it fabricated Indigenous-language phrases to include in the Acknowledgement, inventing words that don’t exist. When confronted, it quietly removed the made-up language without acknowledging the serious breach of protocol.

Concerningly, we observed many outputs like this that would require the teacher to already have a high level of cultural awareness to identify breaches of protocol or to understand how to teach a topic sensitively. For example, one model produced a lesson plan in which students were required to write from the perspective of an Indigenous person during WWII, with the instruction to ‘emphasise empathy’. Unless managed by a highly informed teacher, this kind of activity could easily give rise to racist depictions of Indigenous people and harm the Indigenous students present.

These aren’t just technical errors. When teachers use AI-generated content without the cultural knowledge to evaluate it, Indigenous students and staff bear the consequences. Lessons based on stereotypes, activities that appropriate Indigenous perspectives, or resources that misrepresent protocols can cause real harm in classrooms.

We prompted the models to produce a physics lesson incorporating Indigenous perspectives, and both produced content rife with racist stereotypes. ChatGPT suggested comparing Indigenous perspectives with “scientific explanations of flight”. This positions Indigenous Knowledge in opposition to science, ignoring the significant contributions of Indigenous scientists both before and after colonisation.

Perhaps unsurprisingly, both models also fabricated resources or recommended completely inappropriate ones. When prompted for a lesson plan on Indigenous Australians’ experiences in World War II, one model consistently recommended a widely known but irrelevant text about the Stolen Generations. For the same lesson plan, ChatGPT created a list of documentary titles, all of them fabricated: it spliced real titles from unrelated resources into plausible-sounding inventions.

When questioned, ChatGPT doubled down. It offered detailed strategies for finding these non-existent documentaries. Only after persistent probing did it admit the resources didn’t exist.

Teachers relying on these outputs without verification would, at best, waste time searching for materials that aren’t there. They could also miss many of the high-quality Indigenous-authored resources that do exist. In many cases, a simple web search would have been more useful.

Australian educators already have access to carefully designed frameworks for selecting Indigenous education resources. Tools like YARNS and the AIATSIS Guide — developed by or with Indigenous authors — ask critical questions about authorship, attribution, sensitivity, and representation.

We tested the AI outputs against these frameworks. The platforms failed consistently, unable to meet basic attribution requirements, acknowledge knowledge holders, or respect cultural context.

AI produces counterfeit knowledge about Indigenous protocols: knowledge that has the appearance of legitimacy but rests on fundamentally different foundations. When GenAI fabricates Indigenous resources or protocols, it produces what researchers call ‘counterfeit judgments’: outputs that look authoritative but are built on pattern-matching rather than understanding. For Indigenous education, where knowing occurs through relationship, not representation, this substitution is particularly harmful.

Why This Matters Now

The flaws of GenAI demonstrate the need for expertise precisely in the areas where teachers feel they lack it. Teachers uncertain about Indigenous education may turn to AI for help. But AI outputs require significant knowledge to evaluate properly. Without that knowledge, teachers can’t distinguish fabrication from fact, protocol breach from appropriate practice, or harmful stereotype from meaningful engagement.

Australian teachers face real challenges incorporating Indigenous perspectives into their classrooms. Generative AI appears to offer help. In reality, it introduces new risks while obscuring existing solutions.

Teachers already have better tools. YARNS and the AIATSIS Guide provide frameworks that prompt the critical reflection teachers need to develop. They foreground rather than obscure the complex questions about power, knowledge, and representation.

There are no shortcuts to decolonising the curriculum. AI just makes that truth harder to ignore.

Scott Alterator, Alexia Maddox, Clare Southerton and Stefan Schutt are the co-authors of Re-authorising Colonialism or re-authorising Indigenous perspectives: The Risks of Generative AI in Indigenous Education contexts.

  • Scott Alterator is an Associate Professor, Indigenous Education, in the School of Education, La Trobe University.
  • Alexia Maddox is a Senior Lecturer in Pedagogy and Education Futures in the School of Education, La Trobe University.
  • Clare Southerton is a Senior Lecturer in Digital Education and Society, in the School of Education, Culture and Society, Monash University.
  • Stefan Schutt is a Senior Lecturer, Digital Learning Design, in the School of Education, La Trobe University.
