For all the hype about AI replacing teachers, the reality is that teachers are irreplaceable. Teaching is a deeply human endeavor. The teachers who can leverage this human element through authentic learning will ultimately prepare students for an unpredictable world forged by AI.
Let’s Not Lose the Human Element
A few weeks ago, I watched a cringe-worthy Apple commercial. Like many viewers, I had a visceral reaction to the message because it created this false dichotomy between the digital and the physical, the technological and the human, the compressed and the expansive. I immediately sat down, sketched out a response, and then used my trusty MacBook to turn it into a video.
If you’re not familiar with the Apple commercial, there is a house full of colorful, vibrant, idiosyncratic objects that are then crushed by a giant industrial machine. The iPad lives. It’s so thin. So compressed. So sexy and skinny. It’s also breakable. Drop it down a set of stairs and you get a spiderwebbed screen, but let’s not look at that.
The message was clear. Why bother with the clunky, the human, the messy, the colorful, the vibrant? Just opt for an iPad. Why strum your fingers on an acoustic guitar learning the chord progression to “Stairway to Heaven” like your father did (and maybe grandpa?) when there’s GarageBand? Why spend hours playing Space Invaders with your friends at the arcade when the App Store can get it to you so much cheaper and faster? Why embrace the memory of emptying your spit valve in the marching band as you look up, embarrassed at your crush staring back at you, when you could just stare at the likes on your latest social media update?
It’s so inefficient to hold onto the tangible in an age of automation.
I get it.
I love tech.
I’ve been using a Mac since my childhood. The Apple Ecosystem is how I do so much of my creative work, like crafting videos, making podcasts, and creating books.
And yet . . .
We have a grand piano in the living room. It takes up far too much space and is filled with far too many memories. It needs a tune-up, not an update, and a man, a very old man, fiddles around for hours to make this thing feel timeless. But when someone pounds those keys, I swear I can hear entire generations before me.
It’s a sacred object that I don’t want to crush.
The most powerful iPad ever?
I’m not looking for power.
The thinnest iPad ever?
I don’t want more compression.
Give me a can of spray paint so Micah and I can make art in the garage. Give me an acoustic guitar and see if I can play “New Kid in Town” and feel the memories of my dad singing that song to me while I fell asleep. Give me a whole truckload of hardback books, covered with annotations, that are passports to all kinds of worlds.
Forget compression. I want duct tape. I want cardboard. Give me a makerspace full of random objects that can’t be digitized. I want an acoustic guitar next to a campfire while we have another IPA. Life is short. We don’t need to compress it any more than it already is.
In a world of automation, I don’t want to crush creativity.
Of course we should enjoy the tech. It’s amazing. But also . . . let’s embrace the lo-fi, the vintage, the tangible, the messy, the human. I share this because I keep seeing the message that AI will spark all kinds of amazing innovations in education. But I worry that if we center innovation on machine learning, we run the risk of chasing the fleeting novelty of techno-futurism instead of preparing students for the deeply human skills they will need in a changing world.
The False Promise of Techno-Futurism
When I was in middle school, a teacher brought us all to the front of the class and held out a giant golden disc. “This is the future of education,” he said. “Someday, you’ll be able to pick up one of these discs, take it to a pod with a TV, and learn exactly what you need to learn. You won’t even need teachers. This will change learning forever.”
It did not change learning forever. Years later, I saw the same argument re-emerge with the rise of the internet and search engines. We could just ask Jeeves and he’d answer our questions. Rest in Peace, Jeeves. Later, it was one-to-one devices and then leveled reading programs for intervention. Now, it’s the promise of personalized learning with generative AI. Students will sit in front of a computer with an AI tutor and receive the exact instruction they need without the guidance of a teacher.
Artificial Intelligence futurists are predicting that AI will provide students with this entirely personalized educational experience – one that has the potential to replace teachers. However, this vision for the future mistakes adaptive learning for personalized learning.
Adaptive learning is clean and fast and efficient (not unlike that giant machine that compressed every human element in the iPad commercial). Personalized learning, by contrast, is messy and human. It’s filled with inefficiencies. It’s built on student empowerment.
Learning Is About More Than Just Content Delivery
A year ago, I explored this idea of personalized versus adaptive learning in a blog post and a video.
It’s interesting that after I posted the video, I began to get angry comments on YouTube. Some of these were polite, such as, “Yeah, you need to revisit this claim. I have heard heads of AI companies even suggest that education and health care the largest impacted industry. Imagine individualized teachers for students on a computer.”
Others were more skeptical about the role of teachers:
I think you have an extremely idealised view of teachers tbh, most of the teachers I’ve had were terrible and put me off the subject. The only reason they were necessary to me before was because if I tried teaching myself a subject by reading and researching, I’d always have questions that I’d need answering, and I’d have no one to ask (I couldn’t find the answers online since they were too specific and niche, and no one seems to know the answer when I ask on sites like Quora). But now that I can ask the AI anything I want, and it always explains it far better than any human does whenever I ask them online, and explains it instantly, teachers are obsolete for me now (which is fantastic, since it opens the doors to education for me now since I don’t have to pay for courses or private tutors).
One person told me, “This whole video is just pure cope” and another wrote, “Eventually, AI will replace teachers. Teaching is becoming a dead career.”
I get it. These are flaming rants from strangers in the YouTube comment section. But dig underneath a little more and you’ll see an underlying message that education is merely content delivery. Teachers deliver the content, give feedback on mistakes through assessment, and then re-teach.
If that’s the model of learning, then AI might eventually do things better than teachers. If teaching is merely a set of instructions, worksheets (digital or otherwise) and feedback loops, then, yes, AI might surpass humans. But I actually don’t think that’s what teaching and learning is all about. The future of learning is not merely about acquiring a pre-determined set of knowledge and then being tested on it. Instead, it’s about learning the skills, concepts, and mindsets that will be needed in an unpredictable world.
What Will Students Need in the Future?
In the past, students could depend on a simple formula. Do well in school, get the right degree from a university, and climb a corporate ladder.
With automation and AI, the ladder is gone and in its place is a maze. Our students will need to navigate this maze of an unpredictable world. The bad news is that the rules have changed. The exciting news is that our students will get to re-write the rules.
So, given this new reality, we want students to learn the content at a deep level while also developing the critical skills they will need as they navigate this maze. There are no simple answers here. We still need students to learn things like phonics and math facts. There’s nothing wrong with traditional approaches. But we also need to help students become empowered, self-directed learners, and they can develop that capacity through authentic and meaningful learning.
One example is PBL. Project-based learning is a teaching method that focuses on active, experiential learning through the completion of real-world projects. Students are given a problem or challenge to solve, and they work collaboratively to design, create, and present their solution. A key distinction of PBL is that students learn through the project rather than doing a culminating project. When coupled with design thinking, they develop deeper empathy as well.
If this seems like too much, you can do sprints and mini-projects instead.
Another option might be inquiry-based learning. Here, students ask questions, engage in research, and share their findings with others. You could do something like a single Wonder Day project.
But on a simpler level, you can take short question breaks where students follow this process:
- Generate a list of questions
- Rally Robin: Each student asks and answers questions
- Stand up, Hand up, Pair up: Students walk around answering questions
- Answer questions as a whole class
Another option might be to run a Socratic Seminar. Socratic seminars are a democratic, student-centered approach to class discussions. They can be used at any grade level with any subject area. In a Socratic Seminar, members meet in a circle (or more likely an oval, because, let’s be real, circles are really hard to create) and share their insights. Participants do not raise their hands or wait to be called on. Because there’s no discussion leader, each member can comment or ask follow-up questions of one another. This approach can be empowering for participants because they own the conversation. Unlike a typical class discussion, the conversation moves fluidly back and forth rather than having to go through the teacher.
Notice that each of these options requires human interaction. They are messy and unpredictable. They contrast sharply with the efficiency of a lone student sitting in front of a screen with headphones, moving through a predetermined adaptive learning program.
Embracing a Blended Approach
The best creators are going to know how to use A.I. in a way that still allows them to retain their humanity. This feels like a daunting task, but I’m inspired by a phenomenon in competitive chess. A.I. will nearly always beat a human. But when chess is played in teams, the fully automated A.I. teams rarely win. Neither do the all-human teams. The winning teams are nearly always a combination of A.I. and human. If that’s true of an isolated system like chess, how much more true will that be in a complicated world where the systems are constantly evolving?
The goal, then, is to use a blended approach that incorporates elements of both humanity and machine learning.
This starts by asking what AI does well and then leveraging what we, as humans, do well.
What Does AI Do Well?
When we consider AI, there are certain elements that it does really well:
- Synthesizing information
- Generating examples
- Role-playing
- Creating systems
- Using predictive analytics
- Analyzing a problem
- Helping with conceptual understanding
A blended approach will likely need to embrace some of these elements of AI. However, that doesn’t mean we outsource all of the synthesis, analysis, or predictive analytics to a machine. We want to avoid cognitive atrophy. It’s also important to remember that AI outputs can contain biases and misinformation. While an app like Consensus does a great job using vetted resources, that limitation is still present in all forms of generative AI.
We use the term “intelligence” to describe A.I. But a chatbot isn’t sentient. It’s not affective. It will do no thinking without a prompt. When I leave the room, the chatbot is not daydreaming or making plans for the future or feeling insecure about that super awkward thing that happened yesterday. A chatbot feels no shame, has no hopes, and experiences no loss. It can generate a love poem but it can’t be heartbroken. And yet, those are all major aspects of human cognition. For this reason, we need to ask, “What do humans do well?”
What Do Humans Do Well?
The following are four key areas where humans excel.
1. Context
AI lacks contextual knowledge. It’s a bit like interacting with a toddler who has an encyclopedic knowledge of the world but has never left the room. This toddler has only interacted with you. Truth be told, AI is actually a little closer to a parrot that repeats a bunch of phrases without any meaning or intent behind them. Because AI is built on algorithms shaped by predictive analytics, it lacks the contextual knowledge of a classroom. Generative AI often struggles to fully understand context because it learns from patterns in data rather than truly grasping meaning.
On a basic level, generative AI has difficulty maintaining coherence over extended interactions or fully grasping situational nuance, which can lead to errors or inappropriate responses. Chatbots don’t understand the heated debates in your local school board or the institutional knowledge of a teacher or the sheer number of cultural values present in a classroom community. AI hasn’t taken the time to learn just how direct the Dutch style of communication can be or the sheer number of kata a student from Japan has learned.
This can be a challenge for teachers using Project-Based Learning (PBL), which requires deep understanding and connecting ideas across different subjects. Since AI might not always catch these nuances, teachers need to be the experts in facilitating the understanding of context. Similarly, teachers leading a discussion or Socratic Seminar will need to bring their contextual knowledge into the process in a way that a machine simply cannot do.
2. Divergent Thinking
Humans tend to be better than AI at divergent thinking because AI algorithms are built on probabilistic thinking. Humans can think abstractly, draw on personal experiences, and incorporate emotional and cultural nuances into their decision-making processes (see context above). This allows them to generate unique ideas and solutions that are often unexpected and innovative.
By contrast, AI typically follows patterns in its training data and lacks the ability to truly innovate beyond them. This is a bit simplistic, but AI tends to generate things that are vanilla and predictable. Its outputs are more derivative.
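If it helps to see why probability-driven generation leans toward the vanilla, here is a minimal, purely illustrative sketch. The word list and probabilities are invented for the example (they don’t come from any real model), but the idea holds: a system that picks the next word in proportion to how often it has seen it will land on the most common choice most of the time.

```python
import random

# A toy, made-up next-word distribution for the phrase "The cat sat on the ___".
# Real language models learn distributions like this from massive datasets;
# these probabilities are invented purely for illustration.
next_word_probs = {
    "mat": 0.62,        # the most statistically common continuation
    "couch": 0.20,
    "roof": 0.12,
    "trampoline": 0.05,
    "manifesto": 0.01,  # the "divergent" choice almost never gets picked
}

def sample_next_word(probs: dict[str, float]) -> str:
    """Pick a word in proportion to its probability, the way a text generator does."""
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

# Sample many times: the output clusters around the predictable, high-probability word.
samples = [sample_next_word(next_word_probs) for _ in range(1000)]
for word in next_word_probs:
    print(f"{word:12s} chosen {samples.count(word)} times out of 1000")
```

The point isn’t the code. It’s that anything optimizing for the statistically likely continuation will almost never reach for “manifesto” on its own, and that’s exactly where a teacher’s divergent thinking comes in.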
Teachers can think divergently and take the vanilla to make it their own. They can work with the basic convergent thinking of the chatbot and then come up with ideas and resources that shift away from the norm. In other words, teachers can be wildly and unabashedly different.
I love to create visual writing prompts. Some of these are just odd (like the snail detectives), and this one has my own unique stamp on it as someone who has learned that my greatest weaknesses can actually be hidden strengths.
As we move forward, we need to embrace the divergent thinking of innovative teachers. But we can also encourage our students to think divergently by doing small divergent thinking challenges and by creating moments of creative constraint.
3. Curiosity
On the surface, a chatbot might seem like it is curious. It can generate questions, for example. But human curiosity is different. Curiosity involves a desire to learn or know more, driven by emotional and cognitive processes that AI does not possess. It’s almost like an itch that we need to scratch. It’s something innate in our humanity and, honestly, something that many animals have as well. Something nags at you and you just have to figure it out.
By contrast, AI operates based on algorithms and data patterns, responding to inputs with outputs trained from vast datasets. The AI only answers once you have asked the question. Machine learning lacks self-awareness. It’s incapable of the intrinsic motivation that fuels human curiosity. AI experiences no joy when it finds an answer and no mild anxiety when an answer is elusive.
While AI can be designed to explore data or problems in ways that might appear “curious,” it does so without genuine interest or consciousness—it’s simply executing programmed instructions.
If we want students to embrace wonder and curiosity, that requires a human – a teacher – to be a part of that process.
4. Empathy
One of the aspects rarely mentioned in conversations about AI is just how often it fails to “read the room” with a person. An AI tutor might be pleasant and incredibly patient, but it can’t read a student’s body language the way a teacher does. If we think about intrinsic and extrinsic motivation, many teachers have an uncanny ability to figure out what is actually motivating a student in the moment.
Or consider the role of scaffolds for exceptional learners and multilingual students. True, teachers can use AI to design scaffolds and supports. But this will also require teachers to know students at a relational level. They can take the basic “vanilla” scaffolds and then modify them based on what they know about a particular student’s interests, goals, growth, and strengths. Some of the best teachers I know are adept at modifying strategies in the moment based on how a student responds.
But this empathy piece is bigger than motivation or scaffolding. Ask anyone to describe their favorite teacher and there’s often a story that barely touches on the academics. For me, it was when Mrs. Smoot said, “When you hide your voice you rob the world of creativity. I’m not going to let you do that.” For my son, it was a teacher who said, “You have an eye for design. You are an artist, Micah.” I know countless people who have stories of a teacher who sat down and listened to them as they opened up about a family tragedy or who saw something no one else could see.
I remember showing my son the amazing feedback I could get from a chatbot. It was a video script about the things I had learned from our adopted greyhound, who had just passed away. His response: “That’s awful feedback. The only good feedback here is, ‘I’m sorry for your loss. Do you want to talk about it?’ I don’t write for a machine. I write for a person.”
Our humanity, as imperfect as it may be, is a gift to our students. In an age of A.I., our students still need a human to listen and empathize; to experiment and adapt; to make mistakes and apologize. They will need a guide who can build a relationship and help them navigate a complex world.
Strategy: Rewrite the Prompts
I recently led a workshop on the future of writing in a world of AI. One of my favorite activities involved redesigning writing prompts to center on these human elements:
- Choose a Standard Writing Prompt: Start with a typical writing prompt that you would normally use in your classroom.
- Add Human Context: Revise the prompt to include elements that relate to human experiences, emotions, or backgrounds. Encourage students to consider how personal experiences and emotions influence responses.
- Encourage Divergent Thinking: Modify the prompt to allow for multiple correct answers or interpretations. Ask students to explore alternative outcomes, perspectives, or creative solutions. Consider using something like creative constraint to push them to think more divergently.
- Stimulate Curiosity: Include questions in the prompt that make students question or delve deeper into the topic. Encourage them to research or imagine possibilities beyond the obvious. Provide some sentence stems for them to ask their own questions.
- Foster Empathy: Direct the prompt towards understanding and connecting with others’ situations. This could involve writing from another person’s perspective or considering the impact of actions on others.
- Feedback and Discussion: After rewriting your prompts, use the 20-minute peer feedback system to get feedback on how you might revise them.
Note that this doesn’t mean students won’t cheat and use AI for answers. This will not solve the issue of academic integrity. But it does require a deeper human element that machine learning cannot replicate.
Teachers Are the Future of Learning
Earlier, I mentioned the authentic and meaningful learning that students will need as they navigate an unpredictable future. Whether it’s a project-based learning unit or a class discussion, it is the teacher, as the artist and the problem-solver and curator, who sparks innovation.
The future of education can’t be found in a gadget or an app or a program or a product. It doesn’t require a think tank full of pundits. No, the future of education can be found in your classroom. Your classroom is packed with creative potential. You have all the innovation you need right there in your room.
It’s what happens when you experiment. It’s what happens when you give your students voice and choice. It’s what happens when you abandon the scripted curriculum and take your students off-road in their learning. It’s what happens when you teach to your students rather than teaching to the test. It’s what happens when you unleash the creative power of all of your students — when you make the bold decision to let them make things and design things and solve problems that they find relevant.
Sometimes it’s messy and even confusing. It often looks humble. But understand this: every time your students get the chance to be authors, filmmakers, scientists, artists, and engineers, you are planting the seeds for a future you could never have imagined on your own. And that right there is the beauty of creative classrooms. That’s the power of innovative teachers. That is why the future of education is you.
Get the FREE eBook!
Subscribe to my newsletter and get A Beginner’s Guide to Artificial Intelligence in Education. You can also check out other articles, videos, and podcasts in my AI for Education Hub.