by Anita Charles
Imagine that you live five miles from a grocery store. You can walk to it, ride a bicycle, or drive. Which do you choose?
Let’s add these details: you have a full-time job, the route has steep hills, it’s wintertime, and you need to buy several bags of food. The choice is obvious: you will gladly drive the distance. Driving is efficient, saving time and energy, and a car provides warmth and space. Even though you need a car loan, insurance, registration, and a driver’s license, and even though the car spews emissions and could be dangerous, you will almost always choose the car. We live busy and complicated lives. We appreciate modern conveniences, even if there are negative trade-offs to which we reluctantly concede.
Enter Artificial Intelligence, stage right.
AI is a near-miraculous, quick, and convenient way to get from point A (a problem, question, or task) to point B (the solution, answer, or resolution). Ask teachers, “Would you rather handwrite daily lesson plans, build your own templates and type in your lessons, or hit a button that generates a lesson for you?”, and most of us will take the path of least resistance. AI can – with decent (though far from perfect) accuracy – generate lessons, rubrics, tiered readings, games, worksheets, and assessments. And for students, AI can – with decent (though far from perfect) accuracy – write papers, answer worksheet questions, provide outlines, build annotated bibliographies, do math, and so on.
The internet is replete with articles about using AI in teaching and learning, and on the pros and cons of Large Language Models. (And ChatGPT is quite capable of tooting its own horn, so to speak.) However, the issue is not whether teachers are taught to use AI, but rather whether our teachers are examining AI through a critical lens: asking where, how, and when AI can be leveraged for better teaching and learning; asking where, how, and when AI diminishes or interferes with teaching and learning; and then problem-solving ways to articulate and address the challenges while still finding room for the benefits.
Return to the grocery store analogy: imagine that, at the touch of a button, you can get your weekly groceries delivered directly to your door. Someone at the store guesses at what you want and sends it to your home. Sometimes the order is off – the wrong product, too many bananas, too little bread. Sometimes they send a bland, flavorless equivalent. And it costs more, of course. But still: the touch of a button! Imagine the time you can get back; imagine freeing your psyche of the worry; imagine barely having to think at all.
The difference between the grocery store and a classroom is that learning is, at its very core, about thinking. The question is not “to do AI or not to do AI”; the question is how and why and when are we choosing to use AI as teachers, or helping our students know how and why and when to use it.

Recently, a college student (I will call her “Maria”) in one of my courses submitted a project proposal that read like AI, and I asked her to meet with me. I said, “Do you trust me?” She nodded. I continued, “You are not in any trouble, I just would like to know your process in doing this work. Please show me very specifically how you use AI.” Maria pulled out her laptop, pulled up her paid version of ChatGPT, and showed me her process, as she narrated:
“I start by writing my own answers to the questions. Then I put my answers and the questions into ChatGPT and ask it to grade me on a scale of 1 to 10. And then I ask it to make the paper a 10. When it fixes up my answers, I put that back in and ask the same question – rate this and then fix it up. I only submit something when it finally reaches a 10 on a draft.”
I asked her to consider what AI might count as a “10” on a paper for which it has no nuanced context: “Is it possible that AI’s 10 and my 10 are two different things?” I also asked her to explain some of the AI-produced responses: “What does this sentence mean to you?” She could not always answer that question. I asked her for some of her original work, and told her that it was more authentically her and showed her own thinking. I asked if she was learning anything from AI, and she said she was not.
Then I asked: “What if you submitted only your own draft, and you didn’t get a good score on it from me?” Maria’s head sank down, and she said, “I would be upset with myself.” I replied, “No. You would rewrite it using feedback, and learn and grow from that process, and be proud of yourself for your hard work.”
I had a granola bar beside me, and I pointed to it. “Your own thinking and writing are like that granola bar – full of texture and rough edges and flavors. When you put it through AI and ask it for a better bar, AI smooths down the bar more and more each time, taking out the nuts and grains, until you are left with something bland and tasteless that no longer resembles what it was meant to be. It may appear prettier, but it has lost its identity.”
Note that in this conversation, I needed first to establish a relationship of mutual trust before we could engage in meaningful dialog. We dug deeply into Maria’s sometimes-fragile identity as a student, into whether AI has the capacity to make any work an objective 10, and into what the purpose of an assignment is. We were working together on developing “AI Literacy.” One conversation won’t resolve Maria’s dependency on AI, but it has started to shift her thinking toward critical reflection and engagement.
We can gladly accept an easier way to get our groceries, and we are willing to tolerate a range of error in what we get. Similarly, AI can save time and energy and spare us the drudgery of tasks that distract from real learning. But when AI leads us to compromise the quality and depth of learning or teaching, tempting us to bypass rather than augment our learning, then it’s time to remind ourselves and our students to get back on the bike: slow down, exert our own energy, stretch those muscles, and embrace the joy of the challenge of learning, of new growth, and of engaging critically with AI in the classroom.

Some strategies to build AI Literacy across K-16 classrooms:
- Teach AI Literacy skills to students. Assist students in learning to ask probing questions around the use of AI for schoolwork.
- Build in-class assignments and careful guardrails around AI. Teachers need to learn about AI in order to engage in conversations with students about what it can and can’t do.
- Help students understand that schooling is about thinking: not merely a means to an end (such as college or employment), but a process of thinking and growing. Bypassing the hard work of learning only diminishes our capacity as critical and creative thinkers.
- Draft AI policies in collaboration with students and families.
- Remember that AI can never replace trusting human relationships with students. Build on those relationships to help students in their decision-making around AI.

Dr. Anita Charles is Director of Secondary Teacher Education and Senior Lecturer at Bates College. She previously taught for many years across multiple grades and levels, including first grade, high school English and French, and alternative education for “at risk” youth. She received two Fulbright Scholarships to India where she taught university courses and researched inclusive education; and she has also studied educational nonprofits in Zambia and Tanzania. Her interests in the field of education include Teacher Education, Literacy, Special Education, and Comparative/International Education. She has five grown children and resides in Lewiston, Maine, with her husband.