Will Schools Exist in Ten Years?

I finally met Captain Kirk at a fan event this past summer in the Bay Area.
To me, William Shatner was more than an actor—he was the embodiment of a childhood imagination that dared to chart galaxies. The Captain’s hunger to explore, his willingness to consult Spock, his readiness to push boundaries—these qualities shaped how I thought about learning. To this day, my annual viewing of Star Trek II: The Wrath of Khan is no mere nostalgia; watching it with my son is a ritual of lineage, reminding us of what loyalty, sacrifice, and wisdom demand.
After our conversation, I thanked Mr. Shatner. He lingered, spoke with fans, teased nervous questioners, even sparred—jokingly—with Neil deGrasse Tyson. He opened a hailing frequency across generations, asking us, “Do you love what you do—and how do you reckon with change?”
That was more than a pleasantry. It was a challenge, a mirror, a subspace transmission summoning me onto the ship.
In that moment I was transported back to my early teaching days at The Athenian School. During a faculty in-service, we had watched a Mike Wesch talk about culture change—how a simple census book disrupted tribal life in Papua New Guinea, foreshadowing how social media would reshape our shared sense of humanity. After it ended, Bruce Hamren, a wise senior faculty member, gently placed his hand on my shoulder and said, “It’s just another professional development theme. Keep doing what you do.”
For more than a decade, I did. The tools kept changing—learning management systems, multimedia portfolios, a UC-approved blended learning course on Hip-Hop History that I taught—but the paradigm held: information flows, teachers mediate, students learn.
When I came to Dunn School as Head of School, I carried that same formula: supporting progressive education rooted in student voice, humility, and relational culture. Even in rural Santa Barbara County, we thrived—not loudly, but deeply. We layered on innovation: entrepreneurship, STEM racing, forensics—but our core remained familiar.
But today the horizon has shifted.
At a California Association of Independent Schools (CAIS) board meeting, during a break, I sketched what I called the “six great disruptions”: smartphones, social media, narrowcasting, cancel culture, the pandemic—and artificial intelligence. The room quieted. The last one silenced us.
Why? Because for generations, schools claimed authority through control of information. A library card or a Britannica tome was precious. Then came computer labs. Then the internet. Then smartphones—each moment eroding the gatekeeping. Now, AI doesn’t just provide information—it provides answers. Immediate, polished, persuasive. Why wrestle with ambiguity when a chatbot gives you closure? Why pay tuition for what a machine offers gratis?
This is no longer a mere disruption — it’s an existential question. Bruce’s old reassurance—“just another theme”—no longer suffices.
If schools are to endure, we must relinquish being gatekeepers of knowledge. The new currency is authentic humanity.
Authentic Humanity: The Bridge Between Teaching Humanity and AI
The framing is not original. At Teaching Humanity (my initiative for authentic education in an AI era), I argue that in an age of accelerating AI, our highest intelligence is still humanity: voice, dignity, and courage. My book Speaking Truth, Teaching Humanity is part memoir, part field guide: designed to help educators and leaders lean purposefully into what AI cannot replicate. AI might simulate empathy, but it cannot live a story. It might riff on nuance, but it cannot own moral risk. It might speed up production, but it cannot bear witness.
In ten years, schools will survive, but only if they double down on what differentiates educators from machines.
Starship Dunn: A Metaphor for the Future of School
Let’s return to Star Trek. Imagine Dunn School as a starship traversing unknown regions. AI is our computational core—our version of Data on the bridge—gleaming with predictive power, memory, and speed. But we can’t let the mission rest solely on that entity.
We also need:
- A Kirk: moral courage, risk-taking, vision
- A Deanna Troi: emotional attunement, reading what’s behind the data
- A Spock: rigorous logic, disciplined questioning
We cannot let AI plot every warp jump. Our students might learn through holodecks—immersive, design-centered simulations of dystopias, ecological collapses, or alternate cultures—but only educators can help translate those worlds: “What did you see? What surprised you? What felt unjust? What made you uncomfortable?” That interpretive layer, that connective tissue, is the educator’s prime directive.
Spock once said: “The needs of the many outweigh the needs of the few, or the one.” Not just a captain’s maxim—also wisdom for schooling. We must teach our students to see beyond themselves, to be of service, to weigh collective flourishing above individual convenience. That is what we teach—and what no algorithm can.
So I ask again: Will schools exist in ten years?
Only if we venture beyond familiar orbits.
The New Mission: Curriculum for Humanity
Let me sketch what this shift could look like in practice—pulling forward themes from CATDC’s AI + education work (human-centered design, assessment redesign, ethics, UDL, leadership) as well as practices from Speaking Truth, Teaching Humanity.
1. Start with Twelve Questions of Belonging (Voice, Dignity, Courage)
From Speaking Truth, Teaching Humanity, I carry twelve guiding questions that ground curricular design:
- What accountability do we owe to future generations?
- Whose voices are centered or erased?
- What risks are students asked to take?
- How do power and identity show up?
- When is silence necessary, and when is it complicity?
- Whose humanity is assumed or denied?
- What metaphors of justice undergird our work?
- How do we respond when trust fractures?
- When failure emerges, how do we notice it?
- How do we scaffold empathy alongside critique?
- What does courage demand today?
- How do we anchor moral imagination?
In every unit—from AI ethics to global histories—these should be our warp coordinates. The questions invite us not only to teach content, but to feel and wrestle.
2. Redesign Assessments: Beyond the Anti-AI Trap
I’m a product of CATDC-facilitated learning (from the CATDC Leadership Fellows to its work in Equity and Inclusion), and I continue to follow the resources the collaborative shares in its online offerings (see Keeping Humanity at the Center of AI in Schools). CATDC emphasizes intelligent assessment redesign: not just “AI-proof” tests, but tasks where process, metacognition, and iteration are visible.
In CATDC’s “AI and the Teaching of Writing” workshop, facilitated by Eric Hudson, educators are given something rare in schools: time to rethink rather than react. Together, they take a signature writing assessment and gently deconstruct it, piece by piece, until its true purpose is visible. Then AI is invited into the process, not as a shortcut, but as a mirror. By testing each component with generative tools, they surface the “vulnerable zones” where a chatbot could complete the work without the student ever thinking. From there, it is time to rebuild. The goal is to design moments of authentic authorship: annotations where students must respond to AI suggestions, defend their divergences, name their choices, and reflect on their voice. Together we carve out AI-resistant spaces that only a human can fill: lived experience, moral reasoning, local wisdom, personal stake. The result is not simply an “updated assignment,” but a renewed invitation for students to write—with agency, awareness, and accountability.
At Dunn, I imagine a “Holodeck Simulation + Reflective Port” assessment: students inhabit an AI-generated utopian city; after exploring multiple versions, they build a community design and then defend which scenes they’d collapse or preserve—and why. Their reflections would hinge on empathy dilemmas, environmental justice, and systems trade-offs. No chatbot can spin that—but an educator can guide the navigation.
3. AI Literacy & Guardianship, Not Gatekeeping
AI literacy must be woven into every grade. Students and educators need a shared vocabulary: “hallucination,” “prompt reflexivity,” “adapter bias,” “chain-of-thought.” But we also practice guardrails: when to require source-checking, when to require annotations about AI use, when to ban AI from certain parts of the work. CATDC’s emphasis on bias, safety, and ethics must be central—not afterthoughts.
One exercise inspired by Speaking Truth, Teaching Humanity is the “AI Autobiography”: students imagine their own identity through a bot, and draft what they’d say about justice, empathy, and failure. Then they compare that to their human self: where do they diverge? What is missing? That bridge invites humility and awareness.
4. UDL + Responsive Design
AI offers potential to reduce barriers (text transformations, translation, scaffolding). But we must demand that AI-generated supports come with footnotes and coaching, not erasure of effort. CATDC’s connection to UDL urges us to scaffold output, not abdicate craft.
For instance, let AI generate three versions of a draft paragraph (formal, narrative, poetic). Then students compare, annotate strengths/weaknesses, and choose one to revise further—showing how rhetorical choices matter. That scaffolding invites agency.
5. Leadership, Change, and Culture of Inquiry
AI adoption is not plug-and-play. It cannot be delivered as a policy memo or a list of tools. It must begin where all lasting change in schools begins — with teachers in conversation. In our approach, we don’t start with mandates; we start with teacher crews. We invite them to gently deconstruct a signature writing assessment, pulling it apart until its purpose is clear. Then, rather than banning AI, we bring it into the room as a kind of diagnostic tool — testing each part of the assignment to see where a student might bypass true thinking. This is not about policing; it is about understanding.
From that understanding, something powerful happens: teachers begin reimagining assessments to preserve what only a human can do. Students are asked to annotate AI suggestions, defend their divergences, and name their choices. Leadership, in turn, must build AI norms that reflect the school’s deepest values — voice, responsibility, belonging — because meaningful adoption is not about saving time; it’s about safeguarding purpose.
And that brings us to the heart of it all…
6. Belonging, Dignity, Voice, Moral Risk
Here is where Teaching Humanity comes alive. The core differentiator: schools must commit to dignity in dialogue, to discomfort, to failing together. As I recount in Speaking Truth, Teaching Humanity, my debate teacher Tommie Lindsey’s mantras still echo: “We debate ideas, not people.” “The story is the argument.” We must build Darkness Circles (small dialogue spaces) in AI-inflected classes: asking students to wrestle with whether AI should judge creative work, who owns voice, what “originality” even means.
I also tell the story of a night when our debate team was fractured by racial tension. Most leaders would have said, “Move on.” Lindsey said, “No one gets on the bus.” We stood in the cold. He asked: “What did you hear? How do we stand with each other without standing over each other?” That is moral education. Machines can offer perspectives—but they cannot feel the tension, carry the silence, or sustain trust.
Return to the Bridge: Why Schools Matter
We must not pretend this is easy. The existential threat is real. But when I look at students—at their hunger, fragility, curiosity—I do not see the end of schooling. I see a ship whose mission is more urgent than ever.
Let me close with a revised Star Trek metaphor:
Imagine the universe of knowledge as a vast nebula of stars and possibilities. AI provides new propulsion systems, new sensors. But the mission is not just faster travel—it is exploration of meaning. When the Enterprise approaches a planet, it doesn’t just scan and leave. It tries to understand the ecology, the beings, their stories, their contradictions.
That is what schools must become: explorers of the human frontier. We teach not because information is scarce—but precisely because meaning is rare. We cultivate not because algorithms cannot compute—but because wisdom demands courage and discernment.
And yes—if you haven’t yet, I hope you consider reading Speaking Truth, Teaching Humanity, which weaves stories of the legendary Tommie Lindsey, classrooms that refused apathy, and exercises to reclaim dignity in the AI age. Let it be a compass for your own away mission.
In ten years, will schools exist?
They will—if we resolve to go where no classroom has gone before.
Kirk’s hailing frequency still echoes in my head. Will we answer it with dignity, audacity, and humanity?
I believe our response has to be, “Risk is our business.” Let our voyage be toward human flourishing, not just efficiency.
Engage.

Kalyan Ali Balaven is a national voice in progressive education, known for centering belonging, moral courage, and whole-student education. As Head of School at Dunn School and author of Speaking Truth, Teaching Humanity, Balaven connects his own story with the legacy of Tommie Lindsey, showing why authentic humanity must be at the center of education in the age of AI.