
Machine learning is changing our world in profound ways. It will impact the way we learn and the way we teach. In today's article and podcast, I want to share a few big ideas on what that might look like. I'll share some questions we might consider along with four trends I think we will see in the teaching profession. Next week, I plan to zoom in a little more on specific aspects of teaching and how AI might redefine them (lesson planning, assignment design, standards, assessment, admin tasks, etc.). But I'm starting with the big picture first.

How will AI transform the teaching profession?

Listen to the Podcast

If you enjoy this blog but you’d like to listen to it on the go, just click on the audio below or subscribe via iTunes/Apple Podcasts (ideal for iOS users) or Spotify.

 

How is AI Changing the Profession?

Yesterday, our department had a discussion about machine learning and teacher preparation. At one point, a colleague of mine said, “Our course goals and common assessments focus on teaching teachers how to design lesson plans and assessments but when I talk to teacher candidates, they’re not doing in-depth lesson planning. In many cases, they’re not doing lesson planning at all. They’re using Teachers Pay Teachers or an adopted curriculum. Or they’re making lessons with Magic School.”

This led to a deep conversation about how we need to adapt to generative AI within the courses we teach. In the immediate future, this involves using a common assessment that asks teachers to design a lesson plan but also analyze a preexisting lesson and modify it based upon contextual and personal knowledge of one’s students.

In the long run, though, we are asking bigger questions: "How is generative AI changing the role of teachers? How is it changing the way we learn? How do we adapt to that reality as a program?"

I work with an amazing group of innovative educators who are tackling these questions from a mindset of curiosity rather than fear. I’m excited about the work that we are doing and the conversations we are having with district partners and pre-service teachers. But it also has me reflecting on those big questions and I thought I would share my thoughts with anyone who is interested in a similar dialogue. First, though, I want to share a pushback on this idea that AI might somehow replace the role of the teacher.

 

AI Will Not Replace Teachers

When I was in middle school, a teacher brought us all to the front of the class and held out a giant golden disc. “This is the future of education,” he said. “Someday, you’ll be able to pick up one of these discs, take it to a pod with a tv and learn exactly what you need to learn. You won’t even need teachers. This will change learning forever.”

It did not change learning forever. Nobody is using laser discs. They have become obsolete. Meanwhile, teachers continue to be relevant. Ask anyone to name a person who had a profound role in their life and nine times out of ten, you'll hear about a teacher. That has not changed and that will not change.

Years later, I saw the same argument re-emerge with the rise of the internet and search engines. We could just ask Jeeves and he'd answer our questions. Rest in Peace, Jeeves. Later, it was one-to-one devices and then leveled reading programs for intervention. Now, it's the promise of personalized learning with generative AI.

For the last two years, I've watched as tech gurus have promised a bold new future where we won't even need teachers. Students will sit in front of computers doing tailored, individualized work in isolation. Meanwhile, we'll do away with formal requirements for teacher licensure and hire contract workers who pass a background check to walk around the rows of computers keeping students on task.

This is supposed to be a techno-futurist utopia, but to me it feels more like a bleak, machine-driven dystopia. It strips learning of the human, the tactile, and the authentic. I've written before about why I don't think AI will replace teachers. Instead, I think the human element will become ever more central in a world of automation. Students will need deeply human skills that machines lack:

In a world of AI, our students will need to become really good at what AI can't do and really different with what it can do.

Given this reality, I'm actually hopeful for the future of education in a world of AI. If we avoid the trap of techno-futurism, we can re-imagine the teaching profession in a way that honors our humanity. But what does this look like? Let's start with a few driving questions.

 

Three Driving Questions for Educators to Consider

In competitive chess, AI will nearly always beat a human. But when chess is played in teams, the fully automated AI teams rarely win. Neither do the all-human teams. The winning teams are nearly always a combination of AI and human. If that's true of an isolated system like chess, how much more true will it be in a complicated world where the systems are constantly evolving? The goal, then, is to use a blended approach to teaching that incorporates elements of both humanity and machine learning. As educators, we want to be human-centered but tech-informed.

This Venn diagram shows the overlap of AI and the human voice, with the word "blended" in the middle.

This starts by asking what AI does well and then leveraging what we, as humans, do well.

 

1. What Do We Do Better Than Machines?

One thing humans do really well that machines struggle with is context. AI lacks contextual knowledge. It's a bit like interacting with a toddler who has an encyclopedic knowledge of the world but has never left the room and has only ever interacted with you. Truth be told, AI is actually closer to a parrot that repeats a bunch of phrases without any meaning or intent behind them. Because AI is built on algorithms shaped by predictive analytics, it lacks the contextual knowledge of a classroom. Generative AI often struggles to fully understand context because it learns from patterns in data rather than truly grasping meaning.

Humans also tend to be better than AI at divergent thinking. AI algorithms are built on probabilistic thinking. Humans can think abstractly, draw on personal experiences, and incorporate emotional and cultural nuances into their decision-making (see context above). This allows us to generate unique ideas and solutions that are often unexpected and innovative. By contrast, AI typically follows programmed patterns and lacks the ability to truly innovate beyond its training data. This is a bit simplistic, but AI tends to generate things that are vanilla and predictable. Its outputs are more derivative.

We also excel at curiosity. Follow a toddler around anywhere and you'll see them ask non-stop questions. Curiosity involves a desire to learn or know more, driven by emotional and cognitive processes that AI does not possess. It's almost like an itch we need to scratch: something nags at you and you just have to figure it out. It's innate in our humanity and, honestly, something that many animals have as well. While AI can be designed to explore data or problems in ways that might appear "curious," it does so without genuine interest or consciousness; it's simply executing programmed instructions. If we want students to embrace wonder and curiosity, that requires a human – a teacher – to be a part of the process.

We are also empathetic in a way that machines simply cannot be. One aspect rarely mentioned in conversations about AI is just how often it fails to "read the room" with a person. An AI tutor might be pleasant and incredibly patient, but it can't read a student's body language the way a teacher does. We can program AI chatbots to be "prosocial," but we can never train them to feel what we feel.

As we think about a blended approach to teaching, we need to center the profession on these characteristics. We need to ask, “How can we restructure the teaching profession to center on elements like empathy, curiosity, divergent thinking, and contextual knowledge?” In my current 506 course, I ask students to revise an AI-generated lesson plan with a focus on empathy (and student knowledge), contextual understanding, and divergent thinking.

After focusing on the human element, we can ask, “What do we want to automate?”

 

2. What Do We Want to Automate?

When we consider AI, there are certain elements that it does really well:

  • Synthesizing information
  • Generating examples
  • Role-playing
  • Creating systems
  • Using predictive analytics
  • Analyzing problems
  • Helping with conceptual development

A blended approach will likely need to embrace some of these elements of AI. However, that doesn't mean we outsource all of the synthesis, analysis, or predictive analytics to a machine. We want to avoid cognitive atrophy. This is something I explore in the following Instagram video:

 


If you like this type of content, would you consider subscribing to my Instagram?

In other words, just because we can automate something doesn't mean we should automate it. There are certain aspects of teaching that we all, as professionals, should fight to keep. This isn't a new phenomenon. As educators, we have had to battle for professional autonomy. We've seen this with scripted curriculum and air-tight policies that reduce teachers to automatons who deliver lessons. I created the following video exploring this concept a few years ago:

But there’s some nuance here. In working with so many preservice and current teachers, I’ve learned that we all have a different approach to the creativity of teaching. As a result, we all have different aspects of the profession that we would like to automate. Some teachers want to create all lessons from scratch. Others love packaged curriculum that they can then modify based on their knowledge of students. Some love creating assignments and assessments from scratch. Others would love AI to do that work for them. I, for one, can’t imagine letting a machine make my slideshows or videos. However, I am happy to let a machine learning system create an initial rubric that I modify.

The bottom line is that we need to ask, "What should we automate, and what do we not want to lose?"

3. What Are the Challenges and Opportunities of Machine Learning?

We need to be honest about the pros and cons of machine learning. We can now provide targeted instruction and instant feedback in a way that was once nearly impossible. However, we need to recognize the risk of AI making tasks so targeted and efficient that students don’t experience necessary productive struggle. We can now provide instruction at students’ reading levels with more simplicity and clarity. But we also need to be cognizant of the role of confusion in deeper learning. Students can use AI as a co-creation tool but they can also slip into cognitive atrophy. They can use it to find answers but also as a quick way to cheat. They can find answers instantly in a way that is more streamlined than a Google search but they also need to know about hallucinations and biases within the answers, which is why prompt engineering is more important than ever.

We need to avoid the two extremes of techno-futurism and lock-it-and-block-it:

As I think about these big questions, it has me wondering about the larger trends that we need to consider for the teaching profession at both the K-12 and higher education levels. Here are a few that come to mind.

Four Trends We Might See in the Teaching Profession

The following are four ways the teaching profession might evolve in the upcoming years. I'd like to recognize ahead of time that nobody knows for sure exactly how the technology will change the teaching landscape. These are merely current trends that I imagine will accelerate in the next 5-10 years.

 

1. Educators Will Need to Be Curators

With AI, so many more tasks will be automated. From lesson planning to assessments to things like classroom newsletters, we no longer need to do these same tasks from scratch. This is why curation is so important. Curators are able to look at the pros and cons of a piece of content and ultimately ask, “What is best?” Curators know how to organize different pieces as mash-up artists. They’re able to be critical without being critics and celebratory of what works without falling for the latest fad.

As teachers engage in lesson planning, assessment, and scaffolding instruction, they will need to engage in this curation process. The following video explores this curation process:

In other words, teachers will need to be both critical and curious. They’ll need to be knowledgeable and wise. But like any great curator, they’ll need to figure out how to make the AI-generated content more relatable. Which leads to the next point . . .

2. Educators Will Need to Tap Into Empathy and Relational Knowledge

When ChatGPT first came out, I got really excited about the way it could provide targeted feedback on my writing. It was an amazing feedback tool. So, at one point, I sent it the script for the following video and asked for feedback:

I then called my middle child, Micah, over to me and said, "Check out this amazing feedback. It's practical. It's specific. It's actionable. This is a game changer."

He shook his head. “Dad, that’s awful feedback.”

“What do you mean?” I asked.

"The only correct feedback from a teacher should be, 'I'm sorry for your loss. Do you want to talk about it?'"

He then said, “I don’t write for machines. I write for people. And the best feedback I ever got was what convinced me I could be a writer. It was three words. Ms. Reddiger said, ‘That moved me.’ It wasn’t practical or actionable or whatever but it was the best.”

It’s a reminder that AI can provide targeted feedback. It can generate amazing leveled readers. It can craft phenomenal feedback. But it can’t empathize. It can’t build relational knowledge with students. In a world of machine learning, this skill will be more important than ever.

3. Educators Will Need to Contextualize the Learning

Previously, I mentioned context as an area where humans beat AI. As educators, we will need to take AI-generated content and modify it based on our contextual knowledge. Take a small example of word choice. Imagine you're doing a classroom newsletter and it auto-generates as "Dear parents." Based on your contextual knowledge, you realize that many of your students live with aunts, uncles, older siblings, or grandparents. Some live in a foster care situation. The institutional term would be "guardian." Okay, but that doesn't capture the essence of how your community views caregivers. So, instead, you write "Dear families." That's a small example of contextualizing AI-generated content. If you're creating a decodable reader, you might connect it to local interests. You might modify the sentence stems it creates for a Socratic Seminar based on a lesson you taught your class last week. The bottom line is that contextual knowledge is now at a premium, and we, as educators, will need to build a bridge between generative AI and our immediate context.

 

4. Educators Will Need to Be Adaptable

Right now, there is a big push toward more "traditional" practices in education. I see this all the time on X, where critics create a straw man version of student-centered learning and contrast it with something traditional. So, "PBL is bad. It leads to extraneous cognitive load and wastes time." Never mind the proponents of PBL who demonstrate how we can reduce cognitive load and even use direct instruction strategically. The message seems to be, "Let's return to the glorious good old days of teaching."

Some of this corrective action is necessary. I see value in limiting cell phone usage, for example. I’ve written before about the need for old-school tools and vintage technology.

At the same time, we need to recognize that technology has always impacted the way we learn. New tools ultimately transform the learning process. We need to take a “vintage innovation” approach that embraces the overlap of the tried and true and the never tried, that sees value in the best practices and the next practices.

In my experience, many teachers are facing change fatigue. Between the rise of social media and then COVID-19 and then the rise of AI, many teachers are saying, “I just want to get back to the good old days of teaching.”

But those days won’t return. The context will continue to evolve. For this reason, we need to adapt to the changes we face. We need to play the New Teacher Card:

So many teachers have described the feeling of “being a new teacher all over again.” This feeling is going to accelerate as generative AI continues to disrupt every aspect of our lives. But we, as educators, can be adaptable. We model for our students what it means to be open to new ideas and to iterate along the way. We can be resilient when faced with challenges. We can experiment with new ideas.

I feel for the teachers who long for former days. I have felt this myself at times. But we can’t wish away the current context or put on blinders and pretend that our world isn’t changing. Instead, we need to be creative and innovative.

 

The Future of Education Is You

When I first learned about generative AI (back in late 2015), I was overwhelmed. I would talk to computer scientists, programmers, and engineers, and I would ask, "What does the future of education look like?"

Over time, I shifted from scared and nervous to cautiously optimistic. I think we tend to underestimate the creative potential of teachers. Collectively, our society fails to grasp just how innovative educators are on a regular basis. I genuinely believe that the future of education can’t be found in an app or a system or an AI chatbot. The future of education is already in the classroom.

This is why it's so important that we empower teachers to empower students. This is why teachers need to be key stakeholders in the conversation about AI integration. Next week, I'll be sharing some ideas for how AI might redefine various aspects of the teaching profession, but I do so with a profound recognition that it will be teachers who reimagine the profession at the ground level.

Get the FREE eBook!

With the arrival of ChatGPT, it feels like the AI revolution is finally here. But what does that mean, exactly? In this FREE eBook, I explain the basics of AI and explore how schools might react to it. I share how AI is transforming creativity, differentiation, personalized learning, and assessment. I also provide practical ideas for how you can take a human-centered approach to artificial intelligence. This eBook is highly visual. I know, shocking, right? I put a ton of my sketches in it! But my hope is that you find this book practical and quick to read. Subscribe to my newsletter and get A Beginner's Guide to Artificial Intelligence in Education. You can also check out other articles, videos, and podcasts in my AI for Education Hub.


 

John Spencer

My goal is simple. I want to make something each day. Sometimes I make things. Sometimes I make a difference. On a good day, I get to do both.

5 Comments

  • Oscar Lang says:

    AI is just another tool added to our collection of teaching resources. However, high-tech apps and access to the internet in third-world countries are a challenge.
    Specialised ICT training should be part of the B.Ed degree. Internet access should be free in third-world countries.
    Teachers should be adaptable and embrace AI in the classroom.

  • Tom Panarese says:

    Some pushback on your point re: Twitter (sorry, I can't with Elon) and the "good ol' days" line. While I am sure that there are people who are looking to go back to those days, I think there are also a number of us who have gotten tired of years of "educators" on social media chastising teachers for still using the older tools they have been using for years. Add to that administrators who take five minutes to walk through a room and ding you because they walked into the middle of some direct instruction or because you weren't demonstrating whatever they recently read in an article.

    I personally think AI is going to be as much of a burden on learning as cell phones were, because the rhetoric surrounding it at the moment has the exact same points from the exact same people. Plus, it's another means by which teachers can be devalued by those who are "influencers," "thought leaders," and "decision makers" in our field (many of whom are grifting for a buck).

    But maybe I’m just an old man yelling at a cloud.

  • I have an excellent experience
