
Note that this isn’t simply the question, “How will AI change teaching in the future?” This starts with the question, “How is AI already changing the teaching profession?”

Listen to the Podcast

If you enjoy this blog but you’d like to listen to it on the go, just click on the audio below or subscribe via iTunes/Apple Podcasts (ideal for iOS users) or Spotify.

 

How Will Generative AI Transform the Teaching Profession?

We can’t predict exactly how AI will change every aspect of teaching. But here are a few ideas of what it might look like in various aspects of the teaching profession.

 

Redefining the Lesson Planning Process

In terms of lesson planning, teachers will need to be both creators and curators. In terms of creation, teachers will need to know how to create their own lessons from scratch. This helps them find their voice as educators and define what good learning looks like in their class. But they’ll also use AI as a co-creation tool within this process. They might use AI to expand their lesson plans and add details like objectives or new learning tasks. They might use AI to receive feedback on the pacing or alignment of their lesson plans.

In terms of curation, teachers will need to take AI-generated lessons and analyze them for quality, always returning to the driving question, “Is this the best option for my students?”

But they will also need to modify the lessons. I use the metaphor of ice cream to describe how they might take AI-generated lessons and adapt them based on their own expertise.

A great set of reflection questions might be:

  • Does the lesson align with curriculum standards and learning objectives?
    Teachers would want to ensure that the AI-generated lesson plan meets specific state or district standards and aligns with the intended learning goals for students.
  • Is the lesson plan appropriately differentiated for diverse learners?
    Teachers might check whether the AI lesson plan offers modifications or accommodations for students with different learning needs, such as ELLs, students with IEPs, or advanced learners. This is a chance to start with your knowledge of students and ask, “How should I design scaffolds based on what I know about _______?” Again, the more specific the better.
  • Does the content promote student engagement and critical thinking?
    Teachers should evaluate whether the AI-generated activities and materials encourage active participation and higher-order thinking skills, rather than focusing solely on rote memorization. This is a great chance to ask, “What actually engages my current class and how do I modify it?”
  • Are the assessments and feedback mechanisms meaningful and aligned with the lesson’s objectives?
    Teachers would want to know if the formative and summative assessments included in the AI plan truly measure student understanding and if the feedback provided helps guide improvement.
  • Are the pacing and structure of the lesson realistic for the classroom environment?
    Teachers should assess whether the timing of activities and transitions is manageable within their specific class period and whether it takes into account the needs of their students. Again, you know your immediate context the best. Is this too slow or too fast for them?

Note that teachers will need to lean into the areas that they do well, like empathy (Does this fit with what my students need? Will they find this engaging?), contextual understanding (Is this relevant to my local context?), and divergent thinking (Can I add some interesting creative elements that will make things more engaging?). In my AI for Education elective course, I had students take an AI-generated lesson and modify it based on their knowledge of their students and the context. I also asked for at least three original ideas that the AI didn’t consider. Again, this taps into the concepts of divergent thinking, empathy, and contextual knowledge.

Re-imagining Differentiated Instruction

When I was first learning to teach, I discovered differentiated instruction and fell in love with the concept. I would craft lessons that provided the precise scaffolds and supports every student needed. I would modify our informational texts to match each student’s reading level. I would break down tasks and create handouts to reduce cognitive load and support students’ executive function skills.

And then . . .

I started student teaching. I fell into the trap of “teaching to the middle.” I couldn’t track every student. I couldn’t spend hours creating every single scaffold and support. I attended IEP meetings and worked hard to design learning supports, but I always had a lingering feeling that I was failing.

As I shifted toward student empowerment, I began to embrace the idea of Universal Design for Learning (UDL).

I started to realize that I could provide supports for students but also make them universally accessible. I could empower my students to self-select the scaffolds and supports they needed. This was a game-changer in terms of the logistics of differentiated instruction. I started to watch my students move through higher and higher levels of the Zone of Proximal Development (ZPD).

And yet, I still felt like I was failing. But now, with generative AI, it is easier than ever to design scaffolds and supports for students.

Here are a few more ideas:

  • Providing additional handouts to facilitate task analysis and executive function
  • Using AI to help schedule small groups
  • Using AI speech recognition software as an assistive technology to help students with writing
  • Using AI image generators to help students who need a more concrete example of what they are learning in class
  • Designing targeted skill practice. For example, you might use a chatbot to generate word problems for students who struggle with 2-step equations, or you might use it to create a high-interest non-fiction text at a student’s reading level with sample questions (see the sketch after this list)
  • Using AI to modify assignments to reduce cognitive load (fewer steps) while still encouraging students to access grade-level content.
  • Using AI to create skill practice that students can engage in when they need additional intervention throughout a larger project
  • Using AI to reduce the amount of work while still maintaining a high challenge level. For example, a student with dyscalculia might need fewer problems but can still master the math content at the same grade level.
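
To make the word-problem idea concrete, here is a minimal sketch of the kind of prompt a teacher might assemble. The function name, parameters, and sample values are illustrative assumptions rather than a prescribed workflow; you would paste the resulting prompt into whatever AI tool your district has vetted.

```python
# A minimal sketch of building a differentiated word-problem prompt.
# Everything here (function name, parameters, sample values) is illustrative;
# swap in your own skill, reading level, and student interest.

def build_word_problem_prompt(skill: str, reading_level: str,
                              interest: str, count: int = 5) -> str:
    return (
        f"Write {count} word problems that practice {skill}. "
        f"Use a {reading_level} reading level and build the problems around "
        f"{interest}, since that is what this student cares about. "
        "Put the answer key at the end, separate from the problems."
    )

# Example: targeted practice for a student who struggles with 2-step equations.
print(build_word_problem_prompt(
    skill="solving 2-step equations",
    reading_level="5th grade",
    interest="skateboarding",
))
```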

None of these supports should replace the goals within an Individualized Education Program (IEP). We don’t want to replace educators with algorithms. We can, however, use AI as a starting place for designing more personalized scaffolds and supports. While we want to take a UDL approach, we also want to craft supports that are tailored to individual students. Here, the AI platform saves time and makes the differentiation process more feasible for teachers. It works like an assistant, creating something general that you can then modify based on your own expertise and knowledge of students. In other words, you can design individualized scaffolds for a specific student within seconds, based on your own list of the supports that student needs.

Similarly, we can use AI to design front-loaded vocabulary, translations, sentence stems, and other scaffolds for multilingual students.

We can also use AI to create leveled readers. We might take an older text (like a primary source) and have the AI create leveled readers from it. We might do the same with a text we have written ourselves. We could feed it multiple texts and have it create a mash-up, or start with an open-ended question and have the AI design leveled texts from that. In each case, we will likely need to revise based on our own expertise and our knowledge of our students. But AI does make this type of differentiation far more feasible than it has been in the past.

I also think teachers are going to get more imaginative in how they modify similar assignments. They might find that some students do well with complex directions and others with simplified directions, and they’ll train AI systems to tailor specific directions to meet the needs of specific students. As a middle school teacher, I used to break down tasks for students who struggled with task analysis. Then we would move from my breakdown to a shared process and ultimately work toward students doing the task analysis on their own. AI systems do such a great job with task analysis that I imagine this whole process is going to change.

We also want to empower students to use age-appropriate chatbots for their own differentiated learning. Here, they go to a chatbot to ask questions, find new explanations, clarify misconceptions, and engage in skill practice. As educators, we will likely be working with stakeholders (parents and guardians) to use these tools at home in a way that creates new supports.

And yet, these same tools that allow us to create differentiated scaffolds can easily become tools for cheating. Which leads to my next thought . . .

 

Revising Our Assignment Design

One of my biggest concerns with AI is that students will cheat. I know that we aren’t supposed to be concerned about cheating. It’s just a tool. No different from a calculator or a slide rule. But it’s not the same. It’s a large language model (LLM) that mimics human writing, and students will use it to cheat.

It’s easy to lean on tools like AI detectors and checkers to see if students are cheating. But these tools are often unreliable, and we run the risk of falsely accusing a student of cheating when the detector mistakenly flags a text as machine-generated. If an AI checker is 85% accurate and you teach 180 students who each write ten essays in a semester, that’s roughly 270 cases of either cheating that goes undetected or a false accusation of cheating. The result is a complete loss of trust.
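
To see where that number comes from, here is a quick back-of-the-envelope sketch. It assumes the 85 percent accuracy applies uniformly to every essay, which is a simplification, but it shows how fast the errors pile up.

```python
# Back-of-the-envelope math for the AI-checker example above.
# Assumption: the 85% accuracy applies uniformly to every essay.

students = 180
essays_per_student = 10
accuracy = 0.85

total_essays = students * essays_per_student      # 1,800 essays in a semester
misclassified = total_essays * (1 - accuracy)     # essays the checker gets wrong

print(f"{total_essays} essays, roughly {misclassified:.0f} misclassifications")
# -> 1800 essays, roughly 270 misclassifications:
#    cheating that slips through, or students falsely accused.
```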

What we can do is revise the assignments in a way that focuses on the human elements (context, empathy, curiosity, etc.).

I recently led a workshop on the future of writing in a world of AI. One of my favorite activities involved redesigning writing prompts to center on these human elements:

  • Choose a Standard Writing Prompt: Start with a typical writing prompt that you would normally use in your classroom.
  • Add Human Context: Revise the prompt to include elements that relate to human experiences, emotions, or backgrounds. Encourage students to consider how personal experiences and emotions influence responses.
  • Encourage Divergent Thinking: Modify the prompt to allow for multiple correct answers or interpretations. Ask students to explore alternative outcomes, perspectives, or creative solutions. Consider using something like a creative constraint to push them to think more divergently.
  • Stimulate Curiosity: Include questions in the prompt that make students question or delve deeper into the topic. Encourage them to research or imagine possibilities beyond the obvious. Provide some sentence stems for them to ask their own questions.
  • Foster Empathy: Direct the prompt towards understanding and connecting with others’ situations. This could involve writing from another person’s perspective or considering the impact of actions on others.
  • Feedback and Discussion: After rewriting your prompts, use the 20-minute peer feedback system to gather ideas on how you might revise them further.

Note that this doesn’t mean students won’t cheat and use AI for answers. This will not solve the issue of academic integrity. But it does require a deeper human element that machine learning cannot replicate.

As educators, we need to design systems that spell out exactly how and when students can use AI. Here’s an example of a color-coding system I use in some of my assignments:

  • Blue: AI-generated text
  • Green: AI-generated text revised by a human
  • Pink: Human-generated text edited by AI (think Grammarly or spell check)
  • Black: Human-generated text (with no modifications)

As a professor, I can look at an assignment and see, in a clearly visual way, the interplay between AI and human thinking. I can see the way an AI-generated idea sparked an entirely new line of thinking that then led to something fully human. I can also see where students made significant modifications to their work. I’m still in the process of refining this system and incorporating it into all of my assignments.

But there’s also a more subtle element at work, and it has nothing to do with cheating. If students use AI on auto-pilot, they run the risk of cognitive atrophy. Cognitive atrophy happens any time we lose the ability to engage in a mental process due to inactivity. In a world of artificial intelligence, we need to be cognizant of the dangers of cognitive atrophy so that we can continue to engage in curiosity, creativity, and deeper learning. As we re-imagine our assignment design, we need to consider how we might integrate AI into existing instructional strategies that incorporate creativity and curiosity. For example, we might incorporate the student use of AI into something like project-based learning (PBL).

Here’s a sample of what this might look like:

  • Generating additional questions: Toward the beginning of a project, a student might start with a list of research questions they have. They can then go to AI to get a list of additional questions. Or they could use AI to refine their questions to be more specific. If they’re writing interview questions, they could ask the AI to refine them to be more open-ended or to convey more critical thinking. Notice how they’re not outsourcing the inquiry but they are using AI as a tool.
  • Clarifying misconceptions during research: Sometimes students struggle with conceptual understanding. AI can function in a similar way to Wikipedia, in that it’s not the best source but it is a great starting place when students are trying to develop a schema.
  • Restating research in simpler terms: If students are doing text-based research, they might see a website with great research. They’ve looked at the reliability of the source and explored the bias. Unfortunately, the source contains technical language and dense grammatical structures. Students can use AI to simplify the language.
  • Navigating ideas: After students have engaged in a deep dive brainstorm, they can go to AI and ask for additional ideas. Students can then analyze these ideas and incorporate them into their design.
  • Generating project plans: ChatGPT is really good at taking a larger task and breaking it down into smaller tasks. After they have navigated ideas, students can use AI as a starting place for a project plan with dates and deadlines (see the sketch after this list). They can then modify this based on their skill level, group dynamics, etc.
  • Prototyping: If students are writing code, they might start with AI and then modify the code to make it better. They could mash up two examples. In this way, the AI functions like an exemplar within a project. The critical idea is that it should occur after students have engaged in ideation.
  • Coming up with group roles: Students can use AI as a starting place for group roles and then modify them to fit the group. Afterward, they can negotiate norms and consequences for breaking norms. The group can then use AI to create group contracts with norms, roles, and consequences.
  • Project management: Students can take the tasks and the progress they’ve made and use AI to help them determine what to do next and what they might need to change to stay on schedule.
  • Receiving feedback: I’ve been surprised at how well AI does in giving quality feedback. While peer feedback should remain a student-to-student endeavor, groups sometimes fall victim to groupthink. AI is a great tool for helping groups avoid groupthink.
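
Here is a minimal sketch of the project-plan idea from the list above. The function and the sample project are illustrative assumptions; students would paste the generated prompt into whatever chatbot the class has approved and then revise the plan it returns.

```python
# A minimal sketch of turning a project idea into a prompt for a draft task plan.
# The function name and sample project are illustrative, not a fixed recipe.

from datetime import date, timedelta

def build_project_plan_prompt(project: str, days_available: int) -> str:
    deadline = date.today() + timedelta(days=days_available)
    return (
        f"We are a group of students working on this project: {project}. "
        f"We have until {deadline:%B %d} to finish. "
        "Break the project into small tasks, suggest a date for each task, "
        "and note which tasks can happen at the same time. "
        "Keep the plan short enough to fit on one page so we can revise it ourselves."
    )

print(build_project_plan_prompt(
    project="a documentary about water use in our neighborhood",
    days_available=21,
))
```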

These are just a few ideas, and they’re based largely on how I might use AI within PBL. As we craft assignments, we will need to consider how we might incorporate AI as a co-creation tool and as a tool for curiosity. We will need to bring students into the conversations about how to use AI ethically and wisely.

But we also need to step away from AI entirely at times and embrace the lo-fi. Years ago, I wrote a book called Vintage Innovation, which explored the overlap of the old and the new. I know you’re not supposed to have a favorite book, but it’s my favorite book I’ve written because it captures the nuance I want to hold onto as we redesign our assignments in light of AI. I think it’s important to remember that relevance isn’t simply “flashy and new.” It’s often “better and different.” So, we might have times where we go off-screen and embrace the physical and tactile. We might say, “Kids are on screens too much and we need to step away from that.”

As we redesign our lessons, we also need to refine how we deliver instruction.

 

Refining How We Deliver Instruction

For years, teachers have struggled to deliver instruction that precisely matches a student’s skill level. If we think of the flow model, the goal is for the level of challenge and the level of skill to meet.

With AI, it is now easier than ever for teachers to craft lessons that meet a student’s skill level. It’s no surprise that many technology evangelists are hopeful that the technology might replace teachers in the role of instructional delivery.

Artificial Intelligence futurists are predicting that AI will provide students with an entirely personalized educational experience – one that has the potential to replace teachers. However, this vision for the future mistakes adaptive learning for personalized learning.

Adaptive learning is clean and fast and efficient (not unlike that giant machine that compressed every human element in the iPad commercial). Personalized learning, by contrast, is messy and human. It’s filled with inefficiencies. It’s built on student empowerment.

As we think about the changing role of instructional delivery, we will need to choose wisely when to use adaptive learning and when to use personalized learning. We will need to pay close attention to the role of productive struggle. In an age of instant answers, we need students to develop resilience and wade into confusion. We need to make sure that they are engaged in collaborative learning rather than sitting in silos doing adaptive learning worksheets. We need to make sure that personalized learning leads to the skills that students will need as they navigate an uncertain future. But that also means we need to be open to revising our content standards as well.

 

Revising Our Content Standards

We need to recognize that the standards themselves are going to change as a result of AI. In my book, The A.I. Roadmap, I explore specific ways that machine learning will impact each subject. Our tools shape the way we engage in research and discover new insights into science, history, geography, literature, etc. But they also impact the ways we learn those subjects. An early-19th-century education relied more heavily on rhetoric and memorization than our current educational models do. And while there are benefits in embracing aspects of classical education, we need to recognize that stakeholders in the field continue to redefine their own learning as new technologies emerge. For this reason, we might need to reach out to the following groups:

  • researchers and professors at the university level and ask, “How has machine learning changed the way you study ______?”
  • practitioners in the field and ask, “How has machine learning changed the way you do ______?”
  • business people and entrepreneurs and ask, “What skills are you looking for in light of the changes brought about by machine learning?”
  • people who work in civic spaces and non-profits and ask, “Given these changes within our world, what would you like to see us address?”

We might find a greater emphasis on human skills like collaboration, creativity, empathy, and contextual understanding. Many schools are already addressing these realities by creating graduate profiles. We might also see a more interdisciplinary approach, where students work on projects that combine multiple disciplines in a creative way. However, we will also see the standards change within the subject areas themselves. We’ve already seen an emphasis on STEM-related standards with the NGSS, and we have seen the inclusion of digital literacy, information literacy, and visual literacy in the CCSS and TEKS standards for ELA. Both sets of standards now emphasize the role of online research in informational text writing.

But what about AI? Here’s a potential way we might integrate AI into writing. 

  • Using AI for initial conceptual understanding
  • Using AI for summarizing key information
  • Engaging in information literacy and prompt engineering during the research process
  • Using AI to generate an initial outline and then modifying the outline
  • Using AI to generate initial text but then revising it and adding to it
  • Using AI for the revision process
  • Finding and refining one’s voice in a sea of bland AI content
  • Differentiating between when you should and should not use AI tools for writing
  • Redefining information literacy to include AI tools and larger concepts of deep fakes

We might see new standards emerge around prompt engineering.

We might see these show up within the ELA and social studies standards. But we will continue to see big revisions in the ISTE standards as well. I’ve been impressed by the ways the ISTE standards have been evolving to reflect the changing landscape of technology, including the significant impact of artificial intelligence. While the core principles of the standards remain consistent, the specific skills and competencies required have shifted to emphasize the ethical, responsible, and creative use of AI. They have incorporated ideas of data usage, privacy, copyright, and environmental concerns. They’ve also addressed bias and hallucinations in a way that helps students understand the nature of AI.

 

Reconsidering the Role of Assessment

Machine learning is going to transform assessment in significant ways. For a deeper dive, check out these five trends.

A sketchnote with all five ideas: 1. Less grading, more assessment. 2. Empowering students to own the assessment process. 3. Faster feedback. 4. Increased differentiation of assessments. 5. Predictive analytics.

On a basic level, AI promises to make the assessment process smoother and more feasible. We can use AI tools to do auto-grading, for example. We can use these tools to design assessment protocols, like quizzes and rubrics. This can then free us up, as educators, to provide more one-on-one feedback and comments on student work. But that just scratches the surface of machine learning and assessment.

We can use machine learning to review data and look at trends. Note that we need to be FERPA, COPPA, and CIPA compliant. But with properly vetted tools, we can get a better picture of what students know and don’t know. We can use it as a diagnostic tool to find areas where they need intervention in things like phonics or mathematical processes. I’m currently working with computer scientists on a tool that will measure student fluency levels but also find trends on what needs to be retaught (phonics, blending, etc.) in a way that’s grounded in the science of reading so that students can get specific intervention.

But we need to go beyond an efficiency mindset and ask the bigger question, “What does assessment look like when students have access to AI tools?” In other words, how can we use AI to increase student ownership of the assessment process?

When this occurs, students improve their metacognition and develop a deeper understanding of what they know and what they don’t know.

So, what does this look like for students? Students now have a tool they can use to ask for instant feedback. Here, the AI becomes another set of eyes. They can ask for critical feedback but also affirmation on key areas of a product they create. They can ask for diagnostic feedback on a math problem and learn exactly where they went wrong. But they can also use AI tools as a form of retrieval practice. Here’s how it works:

  • Students upload their previous work and ask AI for trends on what to study.
  • Students then ask the AI for skill practice (this works best when using a tool that has been trained on academic content data).
  • Students ask the AI to quiz them, getting progressively harder as they master the content. For every wrong answer, the AI provides feedback on why the answer was wrong (see the sketch after this list).
  • Over time, students can shift to open-ended questions with feedback on how right or wrong their response was (on a sliding scale).
  • Students can use AI to set goals and track progress over time.
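
Here is a minimal sketch of what that quiz loop might look like. The ask_model() function is a stand-in for whatever approved chatbot or API your school uses; it is stubbed out here so the structure of the loop is visible.

```python
# A minimal sketch of the retrieval-practice loop described above.
# ask_model() is a placeholder for your school's approved AI tool.

def ask_model(prompt: str) -> str:
    # Stub: swap in a real call to an approved chatbot or API.
    return f"[model response to: {prompt[:60]}...]"

def retrieval_practice(topic: str, rounds: int = 3) -> None:
    difficulty = "easy"
    for round_number in range(1, rounds + 1):
        question = ask_model(
            f"Write one {difficulty} practice question about {topic}. "
            "Do not include the answer."
        )
        print(f"Round {round_number} ({difficulty}): {question}")

        answer = input("Your answer: ")
        feedback = ask_model(
            f"The question was: {question}\nThe student answered: {answer}\n"
            "Say whether the answer is right, and if it is wrong, explain why."
        )
        print(feedback)

        # Ratchet the difficulty up as the student works through the rounds.
        difficulty = "medium" if round_number == 1 else "hard"

retrieval_practice("2-step equations")
```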

Ultimately, every student’s use of AI as an assessment tool will vary. But our job as educators will be to model these approaches so that they can leverage generative AI in a way that helps build metacognition while also ensuring productive struggle.

 

Automating Administrative Tasks

Teachers have jam-packed schedules, and AI can help reduce stress and save time. Ideally, we choose to use technology to automate tasks that we hate doing. I love having an automatic transmission and feel zero nostalgia for a stick shift. I’m glad I have a washing machine and a dishwasher. The same is true of our jobs. There are certain mind-numbing, repetitive tasks that we hate doing. If AI can help automate data tracking, that’s great. If it can speed up grading, I’m on board. I love using AI tools for scheduling and breaking students up into groups.

There are other tasks that we might not want to automate entirely. Here, we start with an AI-generated process and then modify it based on our contextual and relational knowledge of our students and of the larger community. For example, you might provide generative AI with a bullet-point list and a template for a classroom newsletter. But then you’ll revise the draft to reference conversations you had in class or to add your personal voice. You might even make two versions, long form and short form, and let families decide which option they want to read.
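
As a sketch of that newsletter workflow, here is one way to hand the AI your bullet points and a template. The bullets, the template, and the function name are all placeholder assumptions; the point is that the AI drafts from your raw material and you revise from there.

```python
# A minimal sketch of drafting a classroom newsletter from bullets + a template.
# The bullets, template, and function name are placeholders, not a prescribed format.

BULLETS = [
    "Field trip permission slips due Friday",
    "Science fair projects start next week",
    "Thank you to the families who volunteered at the book fair",
]

TEMPLATE = """Greeting
Classroom highlights (2-3 sentences)
Upcoming dates
Closing note from the teacher"""

def build_newsletter_prompt(bullets: list[str], template: str) -> str:
    bullet_text = "\n".join(f"- {item}" for item in bullets)
    return (
        "Draft a classroom newsletter for families using this template:\n"
        f"{template}\n\nHere are this week's updates:\n{bullet_text}\n"
        "Write two versions, one long form and one short form, "
        "and keep the tone warm and plain-spoken."
    )

print(build_newsletter_prompt(BULLETS, TEMPLATE))
```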

It’s important that we provide teachers with the autonomy to decide what gets automated and what they choose to do from scratch. We need to give them the time and the freedom to find workflows that work for them. But this requires intentionality.

It’s important that we remain intentional about how we use AI in these domains. Automation is great but we need to ask, “What might we lose when we turn to technology to do this task?” It’s also important that we don’t pile on new responsibilities once we see teachers implement AI for these administrative tasks. My fear is that once these AI strategies become pervasive, leaders will say, “Now that you have free time, we need you to ______” and simply add new items to their already full plates.

 

Engaging in Dialogues About the Ethical Use of AI

AI will change nearly every aspect of the job, but not every aspect. There are so many areas that should remain the same: crafting meaningful lessons, building relationships, using hands-on vintage tools. As educators, we need to be at the forefront of any conversation about transformation, tech, and teaching.

Ultimately, all of these changes in the teaching profession require intentionality. We need to engage in hard conversations about when and how we will use AI moving forward. We need to wrestle with the ethical implications in a way that is neither idealistic nor naive.

Nobody knows exactly how AI will change the teaching profession. However, my hope is that we lean into the expertise of current classroom teachers and ask them to reimagine the role of the teacher and the tasks they do. My hope is that we retain the human element but also give teachers the freedom to integrate AI ethically. Ultimately, we need to give teachers the professional autonomy they deserve because empowered teachers empower students.

Get the FREE eBook!

With the arrival of ChatGPT, it feels like the AI revolution is finally here. But what does that mean, exactly? In this FREE eBook, I explain the basics of AI and explore how schools might react to it. I share how AI is transforming creativity, differentiation, personalized learning, and assessment. I also provide practical ideas for how you can take a human-centered approach to artificial intelligence. This eBook is highly visual. I know, shocking, right? I put a ton of my sketches in it! But my hope is you find this book to be practical and quick to read. Subscribe to my newsletter and get A Beginner’s Guide to Artificial Intelligence in Education. You can also check out other articles, videos, and podcasts in my AI for Education Hub.

Fill out the form below to access the FREE eBook:

 

John Spencer

My goal is simple. I want to make something each day. Sometimes I make things. Sometimes I make a difference. On a good day, I get to do both.

