Designing Inclusive Assessments in the Age of AI
Practical strategies for designing assessments that centre human thinking
AI doesn’t just change how students complete assessments — it changes who is included, who is disadvantaged, and whose voice becomes invisible if we’re not intentional.
This section offers inclusive, AI-aware design moves that support diverse learners without diluting academic challenge.
1. Assess the Human Layer, Not the AI Layer
Shift marks toward:
- decision-making
- critique
- explanation
- context
- reflection
- localisation
These are the parts that AI cannot authentically produce.
Try:
Instead of “Write 800 words on X,” shift to:
“Use any tools you like to explore X. Then explain:
- what you agreed with,
- what you disagreed with, and
- what you added from your own context.”
Outcome:
Students with AI access don’t gain an unfair advantage; students without it aren’t disadvantaged.
2. Use Multi-Modal Expression as a Default Option
Let learners demonstrate understanding through:
- audio explanation
- short video
- annotated screenshots
- mind maps
- paragraph + voice note combos
Why it matters:
Multi-modal options reduce barriers for ESOL learners, neurodivergent learners, and anxious writers.
3. Require Evidence of Process — Not Just Product
Ask for:
- prompt attempts
- drafts
- AI comparisons
- revision notes
- “what I changed and why” blurbs
- mistake analysis
This protects integrity and reveals thinking.
4. Ground Tasks in Aotearoa Contexts
AI is weakest when tasks are:
- local
- specific
- cultural
- personal
- contextual
- relational
Examples:
- “Apply this idea to your whānau, workplace, or community.”
- “Explain using an example from Aotearoa.”
- “How would this concept work in your rohe (region)?”
Outcome:
Learners must bring themselves into the task.
5. Build Choice Into Assessment Pathways
Choice increases engagement and reduces inequity.
Offer:
- an AI-supported pathway (critique + curate + reflect)
- an AI-free pathway (build + revise + reflect)
Both pathways align to the same learning outcomes.
6. Scaffold AI Literacy Directly Into the Assessment
Examples:
- “Highlight one moment AI was wrong — and correct it.”
- “Describe how AI shaped or challenged your thinking.”
- “What cultural perspective did AI miss?”
- “What did you remove because it didn’t fit your community?”
Outcome:
Critical AI literacy becomes part of the assessment, not a bolt-on.
7. Design for Transparency, Not Policing
Instead of detection software, use:
- reflective disclosures
- voice explanations
- annotated drafts
- face-to-face kōrero
What you’re assessing:
authenticity + understanding + reasoning.
Not surveillance.
Kaupapa Māori Lens — Aromatawai that Uplifts Mana
A culturally anchored layer that builds on LP3’s “Mana Deep Dive” without repeating it.
🪶 1. Mana Motuhake as the Assessment Anchor
Design tasks that preserve the learner’s voice, not overwrite it.
Prompt:
“What part of this mahi could only have come from you or your whānau?”
This keeps identity central and prevents AI homogenisation.
🪶 2. Whakapapa as Evidence of Learning
Instead of asking, “Did you use AI?”
Ask, “What is the whakapapa of your ideas?”
Encourage mapping:
- people
- places
- experiences
- texts
- conversations
- AI outputs
Whakapapa restores relational depth.
🪶 3. Whanaungatanga in Assessment Conversations
Use collaborative interpretation:
“What did our class agree the AI missed?”
“What did your tīpuna teach you that AI wouldn’t know?”
This centres relational learning over tool dependence.
🪶 4. Tapu + Taonga
Some assessments are human-only:
- whakapapa
- lived experience
- spiritual reflections
- cultural practices
Encourage learners to decide:
“Is this something I should give to a machine?”
This builds ethical discernment.