With AI becoming a helpful tool in education, it’s easy to see the benefits of quick access to information, personalized support, and the efficiency that technology can bring. However, as we embrace these tools, we also want to preserve and nurture the learning experiences that AI doesn’t fully replicate. Skills like embracing confusion, navigating boredom, and practicing extended focus are essential for students not only to succeed academically but to thrive in an AI-rich world.
Listen to the Podcast
If you enjoy this blog but you’d like to listen to it on the go, just click on the audio below or subscribe via iTunes/Apple Podcasts (ideal for iOS users) or Spotify.
Technology Can Make the Learning Process More Efficient
When I was in the eighth grade, I spent an entire year working on a multimedia project for the National History Day competition. As a student, I treated learning as a consumer commodity. My goal was to put in the bare minimum effort to get the highest possible grade. I was a decent test taker and completed just enough worksheet packets to earn a B and keep my parents satisfied.
Like many students with the Gifted label, I equated success with speed and accuracy. I defined myself as a good reader because I scored at the top of the class in reading fluency. This wasn’t the teacher’s fault. She actually used large cats instead of numbers for our literacy circles and reading groups. But I knew it was better to be a Lion than an Ocelot. And I could see the difference in speed and mistakes.
So fast forward to this History Day Project. Mrs. Smoot gave me a warning. “John, you’re going to get frustrated. You’re going to have to learn new skills and try new things. You’ll have to problem-solve challenges on your own. You’re going to work harder than you’ve ever worked before.” She was right.
And at first, I hated it. I wanted instant answers. I wanted step-by-step directions. But slowly, as I realized that I had extended time to work on this project, I began to embrace the longer journey. I fell in love with research and read every book and article I could find about the integration of baseball. I interviewed former players. I wrote out my script and reworked it over and over again. Along the way, I realized that learning isn’t only about speed and accuracy. It’s about depth. It’s about critical thinking and curiosity. It includes confusion and nuance and productive struggle and even boredom.
As I look back at this project, I realize that the learning process was slow, but so was the technology. When I read articles, I had to get a ride to the Fresno State library (Go Bulldogs!), find the information in a card catalog, visit with the librarian, and ultimately navigate the dark art of microfiche and microfilm. If I wanted a portable copy, it cost money. I had a limit on the number of books I could check out as well.
I spent hours in front of a typewriter and an APA formatting book making sure I didn’t make a mistake on my expansive bibliography. I wrote each draft of my script by hand. I used a camera and had to wait for optimal lighting to take photographs that I then converted to slides. We went to a radio studio to record our scripts and I edited the reel-to-reel tape using a razor blade and tape.
I can now do all of these tasks quickly. I sketch out entire sketch videos by hand and edit them with a computer. I make slideshows in minutes. I record podcasts in my office and add background music on the fly. I can interview experts for free without asking how much money a long-distance phone call will cost. I can research with Google Scholar and even get summaries of articles in advance using Consensus. The multimedia creative process is cheaper and faster than ever.
Is efficiency always a good thing?
As we think about AI, it’s easy to see just how fast and efficient the learning process might become in the future. Instant answers. Targeted tutoring. Quality feedback in seconds. I’ve already seen how students can use AI tools to fix audio issues and create jump cuts for videos. However, there is a danger in using AI to transform the entire learning process without thinking about the cost. I’ve written before about the two traps to avoid: Lock in and Block It and Techno-Futurism.
If we begin with the question, “What can technology do faster?” we run the risk of cognitive atrophy, where we allow the algorithms to do the thinking for us. I struggle with spatial reasoning because I have allowed the map app on my phone to do all of my navigation for me. I worry that the same thing might occur when students use AI as the starting place for their writing (especially given the reality that we often learn through writing rather than write after learning).
But I think there’s a more subtle danger in the efficiency of AI. We often use speed metaphors in education. At the start of every school year, we hear about summer learning loss. The two biggest education policy initiatives I witnessed as a middle school teacher were No Child Left Behind and Race to the Top. For all the talk of “lifelong learning” in our mission and vision statements, the message was clear. We need to move quickly. We have lots to cover and kids are falling behind. I don’t want to minimize the challenges of a student who is years behind their grade level in reading. I don’t want to ignore the need to prepare students for university, trade schools, and future jobs. But I still wonder if there is a cost to America’s educational obsession with speed.
What do we lose when students spend hours in front of computers completing digital worksheets in adaptive learning programs? Consider this contrast between personalized learning and adaptive learning:
In this adaptive learning model, built on speed and accuracy, students might not develop some of the vital human skills that they will need in a world of smart machines. However, we can mitigate this by taking a blended approach that emphasizes human skills and integrates AI with intentionality.
What Are the Human Skills We Don’t Want to Lose?
The following are nine deeply human skills that have become vital in a world of machine learning. Note that this is not an exhaustive list. I’d love for you to share additional human skills in the comments at the bottom of this article. Also, these aren’t particularly groundbreaking. Schools have been engaged in hard conversations for decades as they craft and refine their Graduate Profiles, sharing a set of core competencies their students will need in an uncertain future.
1. Confusion as a Catalyst for Learning
As a professor, I strive for clarity. I use visuals to clue students in to key information. I keep a consistent format for the agendas and slideshows. Each semester, I do a UX Design Audit to reduce extraneous cognitive load, making my courses easier to navigate:
It’s frustrating when students are confused. And yet, confusion has a few surprising benefits. It pushes students to slow down and think more deeply about the content. In the process, this struggle to figure things out helps the learning to stick. It’s why I forget entire sections of textbooks but it’s really hard to forget a confusing parable. It’s why certain key scientific concepts still click because of the confusion, curiosity, and discovery that occurred decades ago when I conducted a science experiment back in AP Physics.
Confusion often leads you into a place of nuanced understanding. It slows the learning process down in a way that leads to deeper, sustained focus. Confusion can lead students to question assumptions, explore multiple solutions, and develop a resilient learning mindset. If we think about AI, a chatbot’s instant answers might lead students to sidestep confusion rather than work through it, potentially missing out on deeper understanding. On the other hand, we can leverage confusion through AI for deeper learning (something we’ll explore next week). Here, students can ask questions and follow-up questions. They can share their own theories and use AI to add additional information. They can slow down the process deliberately through the use of a prompt engineering process like the FACTS Cycle:
As educators, we can embrace confusion by doing short History Mystery activities or by sharing a counterintuitive scientific concept that will push students to ask questions and even formulate hypotheses. As this occurs, students experience a necessary delay that might even lead to productive struggle.
2. Productive Struggle as a Means to Mastery
Looking back on that History Day Project, I loved it. I learned so much at such a deep level. And yet, there were moments I really hated it. I remember one moment when I walked into class and told my teacher, “I have writer’s block and I don’t feel like writing. I think I’m just going to join the class instead.”
She responded with, “The nice thing about writing is you can do it whether you feel like it or not. Writer’s block is just a code word for fear. You write your way through that fear.” But then she added, “I want you to give me 200 mediocre words and then we can revisit it.”
I learned a key lesson about fear and creativity. I realized that often the best way to develop grit is to be given slack. The permission to make mistakes can lead to productive struggle, which can ultimately lead to a creative breakthrough. Productive struggle builds critical problem-solving skills and persistence. It can help lead to a growth mindset.
By offering instant solutions, AI risks diluting this valuable phase, which often leads students to “aha” moments where they genuinely understand concepts. This is why it’s critical that we focus on the role of AI and productive struggle.
3. Slower Learning for Lasting Knowledge
As mentioned before, students can easily internalize the belief that speed and accuracy are the only routes to academic success. While fluency (whether it is reading fluency, computational fluency, or creative fluency) is key, we need to value the role of “slow learning” for depth and mastery. The slower, reflective learning process helps students internalize and apply knowledge in complex ways.
It also makes the learning more “sticky.” From an information processing standpoint, it’s the idea of building in more moments of recall and retention.
In a world of machine learning, we will need to tackle what are often called “wicked problems,” where the solution leads to new challenges. These types of wicked problems require deeper, sustained focus that moves beyond instant answers. However, when AI speeds up access to information, it can disrupt this process, sometimes trading depth for surface-level engagement. Here’s where it helps to develop a “snailed it” mindset.
Here’s where boredom can play a critical role as well. Boredom encourages students to self-generate ideas and explore their interests. In a world of instant entertainment, this embrace of boredom is not just a skill. It’s also a habit and a mindset. It’s the ability to say, “I’m going to avoid stimulus right now to create the space needed to slow down and solve problems.”
4. Divergent Thinking to Move Beyond the Algorithm
I want you to try something. Go to a chatbot and ask it to create an image of a left-handed painter and see if it works. I’ve tried this with multiple platforms and most of the time, I end up with something like this:
Similarly, if you ask AI to generate an image of a married couple on their wedding day, most of the time you’ll get an image that looks a bit like this. They’re both white, young, heterosexual, and thin. He’s nearly always taller than her. The wedding usually looks fancy and the clothing is a traditional western style of dress (often distinctly American). Also . . . for some reason she nearly always looks like an influencer and he looks like a Property Brother.
So, what’s going on? Large Language Models (LLMs) use predictive analytics to determine the most likely answer to a question. This can be great within tight parameters where you are looking for increased accuracy. But it can also lead to bias because the “most likely” often ignores certain groups, like left-handed artists. So, as we think about problem-solving, generative AI uses a convergent thinking process that leads to the most common answer. By contrast, we, as humans, have the potential to engage in divergent thinking.
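If it helps to see why “most likely” crowds out less common cases, here’s a toy sketch in Python. The words and probabilities are invented purely for illustration and don’t come from any real model; the point is simply that a system that always picks the single most probable option will never show you the left-handed painter.

```python
import random

# Toy next-word distribution a model might assign after the prompt
# "an image of a painter holding the brush in their ___ hand."
# These words and probabilities are invented for illustration only.
next_word_probs = {"right": 0.88, "left": 0.10, "either": 0.02}

def most_likely(probs):
    """Greedy selection: always return the single most probable option."""
    return max(probs, key=probs.get)

def sample(probs):
    """Weighted sampling: less common options occasionally surface."""
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights, k=1)[0]

print(most_likely(next_word_probs))                   # always "right"
print([sample(next_word_probs) for _ in range(10)])   # mostly "right", sometimes "left"
```

Real models add some randomness when they generate, but the pull toward the statistically common answer is the same convergent pattern described above.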
One of the best ways to think about divergent thinking is that people “think outside the box” by repurposing the box.
As teachers, we can integrate divergent thinking into our lessons by limiting supplies in strategic ways, by doing short divergent thinking challenges, or by requiring students to find alternative methods for solving problems.
5. Developing Your Own Voice in a Predictive World
Have you ever read an article and immediately recognized it was created using generative AI? The verb tenses are too consistent and the author never moves from past to present like I just did. The sentence length doesn’t vary. But there’s something else you can’t put your finger on. The article feels flat and, well, boring. Generative AI often creates work that can seem vanilla at first.
Here’s where voice is so critical. In order to stand out in a crowd, our students will need to develop their own unique voices based on their personalities and experiences and personal tastes. If we think about writing, this means we need to think intentionally about how and when we will integrate AI into each piece of the writing process. For a deeper dive, check out this article I wrote about two years ago.
6. Cultivating Empathy for Better Creativity
For decades, computer scientists have worked toward perfecting the pro-social robot. The journey began with ELIZA in the 1960s. Created by Joseph Weizenbaum, ELIZA used basic pattern matching to respond to users in the style of a Rogerian psychotherapist. ELIZA was super basic but people responded to it socially, leading to the now-famous ELIZA Effect.
In later decades, Sony’s AIBO, a robotic dog, acted pro-socially in a way that seemed to offer true companionship. The humanoid NAO robot was actually integrated into therapy. More recently, robots like SoftBank’s Pepper have been incorporated into healthcare, retail, and education. These algorithms seem to pick up on human emotions in ways that we sometimes fail to do. Okay, in ways that we often fail to do.
And yet . . .
That’s not empathy. We can anthropomorphize robots like Wall-E or the Wild Robot, but a robot cannot feel loneliness. It cannot feel heartache. It cannot weep with a close friend who gets that dreaded cancer diagnosis and feel that gut-wrenching reality in person. At the end of the day, a robot is still just a set of algorithms responding to predictive analytics.
Empathy is a deeply human endeavor. It allows us to feel what others feel and engage in perspective-taking. It is part of what it means to be human. But it’s also necessary for the creative process. Empathy is an embedded aspect of the design thinking process as students engage in what Tim Brown describes as “designing with” rather than “designing for.” As educators, we can create PBL units that incorporate design thinking in a way that I describe as PBL by design:
7. Contextual Understanding
Generative AI chatbots tend to struggle with context because the LLMs use massive data sets that are largely decontextualized. The information is broad and expansive rather than specific and localized. These platforms rely on patterns in data rather than genuine understanding of the immediate context. While they excel at recognizing patterns and generating text based on immediate prompts, they lack real-world awareness, memory across long interactions, and the ability to infer deeper meaning or nuance. This often leads to inconsistencies, misinterpretations, overly literal responses, and answers that aren’t tailored to any kind of cultural awareness.
In other words, generative AI can’t “read the room.” Chatbots don’t know what was just spoken in a group. They don’t know what local controversies are playing out in the community. They don’t have a handle on the in-the-moment issues affecting people’s lives. This is why contextual understanding is so critical in a world of AI.
Context helps us judge if the information is relevant, accurate, and useful in the moment. The key piece there is in the moment. It’s the physical, social, and relational dynamics occurring in the moment. So, in a design thinking project, understanding who the audience is, what problem is being solved, or how new information fits with what we already know allows us to make smarter decisions. When students know the context, they can see how a solution might create additional problems (the wicked problem notion).
But contextualization isn’t just about awareness. It’s what allows us to find an audience and reach them in a relevant way. It’s the ability to take a concept and communicate it to a specific group. It’s what helps us market products or solutions to a community that will actually need them.
If this sounds a bit like the last idea of empathy, there’s definitely an overlap. But while empathy tends to be individual and relational, contextualization includes navigating systems and structures as well. This is yet another reason I love PBL. It’s a chance for students to discover the world by exploring their local community. It’s a chance to connect relationally but also study how specific systems work around them.
8. Wisdom in a Sea of Information
Wisdom is more important than ever in our current world of amusement and novelty. Trends shift quickly. However, philosophy is timeless. People are starting to realize that some of the best voices are actually vintage. I’ve met several people who are reading Seneca and embracing the ideas of stoicism. In a world where we have so much and still don’t feel happy, there’s this vintage voice from the past providing some real answers.
I’m not suggesting that stoicism is the answer. But I do think there’s this danger in viewing newer ideas as being inherently better than anything classical. There’s a chorus of voices from the past and, if we’re open to it, they raise some great questions for us to grapple with. And I guess that’s the point. It’s not that we buy into one specific philosophy but that we learn how to engage in philosophical dialogue. The cool thing about Socrates is not that he finds the answers. In fact, he almost never does. Rather, it’s that he asks all the right questions. This is why philosophy might be one of the core courses of the future for secondary students.
Back in 2016, I wrote an article about the need for philosophy in an age of AI. Generative AI can provide us with quick answers but it can’t ask the harder questions that we need to ask. It can tell us “what is” but it cannot tell us “what is best.” Philosophy pushes us to explore the nature of reality and truth. It helps us discover our moral values and our sense of what is right.
Artificial intelligence continues to blur the lines between ourselves and our machines. For example, in an era with self-driving cars, who decides the value of a life when avoiding a car crash? What does it mean to have an original thought? What does this mean for plagiarism and copyright? How do you make sense out of ideas like attribution and originality?
If we want students to think ethically about how to use AI, they will need to engage in Socratic conversations that tackle hard questions:
This doesn’t mean we abandon our content. But it does mean we can integrate Socratic Seminars and philosophical questions into our content area in a way that sparks curiosity and ultimately leads to wisdom.
9. Extended Focus in a Distracted World
Deep work is increasingly rare but essential for tackling complex tasks and understanding nuanced topics. AI, especially if used too passively, can pull focus away from sustained engagement, reinforcing a pattern of quick, fragmented interactions. This is why deeper learning is so important in a world of distraction. When students develop the habit and skill of sustained, deeper work, they can tune out the noise and engage in deeper problem-solving and creativity.
What does this look like?
Note that these nine skills are interconnected. Empathy involves contextual understanding. Deeper learning includes divergent thinking as students engage in problem-solving where they are faced with confusion and boredom. We can implement inquiry-based learning:
We might implement a shorter problem-based learning mini-unit:
Or we might do a full-scale PBL unit that incorporates aspects of design thinking and service learning:
But it can also be something smaller. We can revise our assignment design in subtler ways.
Revising Our Assignment Design
I recently led a workshop on the future of writing in a world of AI. One of my favorite activities involved redesigning writing prompts to center on these human elements:
- Choose a Standard Writing Prompt: Start with a typical writing prompt that you would normally use in your classroom.
- Add Human Context: Revise the prompt to include elements that relate to human experiences, emotions, or backgrounds. Encourage students to consider how personal experiences and emotions influence responses.
- Encourage Divergent Thinking: Modify the prompt to allow for multiple correct answers or interpretations. Ask students to explore alternative outcomes, perspectives, or creative solutions. Consider using something like creative constraint to push them to think more divergently.
- Stimulate Curiosity: Include questions in the prompt that make students question or delve deeper into the topic. Encourage them to research or imagine possibilities beyond the obvious. Provide some sentence stems for them to ask their own questions.
- Foster Empathy: Direct the prompt towards understanding and connecting with others’ situations. This could involve writing from another person’s perspective or considering the impact of actions on others.
Note that this doesn’t make the prompts AI-proof but it does make them AI-resistant, and, more importantly, human-centered.
Get the FREE eBook!
Subscribe to my newsletter and get A Beginner’s Guide to Artificial Intelligence in Education. You can also check out other articles, videos, and podcasts in my AI for Education Hub.
Fill out the form below to access the FREE eBook: