In My AI Era

For many educators, the rise of AI tools like ChatGPT feels like a tidal wave. Everywhere we turn, there's a new tool promising to revolutionize lesson planning, personalize instruction, or "save teachers time." And yet, beneath the buzz, there's a quieter undercurrent: hesitation.

It's not that teachers aren’t curious — they are. It's that this curiosity is often accompanied by deep-seated concerns: What if I misuse it? What if it gives me bad information? What if it undermines what I know about my students and what they need? These questions aren’t a sign of resistance — they’re a sign of responsibility.

When I asked educators and system leaders about AI’s role in the classroom, their reflections surfaced this tension. Many were energized by the possibilities: AI can help learners break down complex ideas, clarify misunderstandings, and generate new approaches to problem-solving. It can even act as a cognitive coach — a tool that prompts students to probe more deeply for meaning and connection.

One systems leader expressed hope that AI might help teachers offload some of the cognitive burden of unit planning, allowing them to spend more time thinking deeply about student learning. But they also raised a red flag: AI-generated plans might look strong but conceal weak internalization of the concepts by teachers themselves — which in turn could flatten instruction rather than deepen it. A polished plan doesn't always mean powerful teaching.

This concern is echoed in broader conversations about AI becoming a shortcut — a stand-in for intellectual labor. Will students see AI as a way to avoid thinking, or can we help them engage it as a partner in thinking? Will teachers use it to enhance their instructional vision, or to simply check boxes?

A former systems-level leader who now serves on the board of a charter school shared another perspective. They see AI as a potentially transformative tool for low-income communities, where access to supplemental individualized academic support — such as SAT prep courses or after-school programs like Kumon Learning Centers — can be scarce. In their view, AI offers a free, on-demand way for students to get help with complex homework topics. But they also cautioned: this same tool, if overused or misused, can stifle creativity, particularly in digital arts and other intellectual or expressive processes.

This dual nature of AI — its ability to support or suppress — places educators in a uniquely powerful position. Teachers are not just implementers of new tech; they are designers of learning experiences. In my view, AI works best when positioned as a thought partner, not a shortcut. I believe it should help students wrestle with ideas, not bypass them.

To do that, we must shift the cultural conditions under which AI enters classrooms. Teachers often work in high-stakes environments where instructional design is expected to be precise, perfect, and individual. In that context, using AI can feel risky — like inviting in an unvetted co-teacher. And when that “co-teacher” is trained on dominant cultural narratives, educators who serve culturally and linguistically diverse students rightly ask: Will this help me see my students' thinking and learning more clearly — or obscure it further?

Last summer, a friend and I taught a group of undergraduate humanities and social science majors at Howard University how to use ChatGPT to apply data science to their research interests. We didn’t present AI as an all-knowing authority. We used it as a tool for inquiry. Students adapted code with ChatGPT, but also questioned its output, debated its suggestions, and developed original ideas alongside it. AI didn’t replace their thinking — it helped stretch it.

So perhaps the better question isn’t “Should we use AI for planning, designing, and implementing teaching and learning?” but rather: Under what conditions can AI support equitable, relevant, and learner-centered classrooms?

To build those conditions, we need:

  • Time for teachers to explore and test AI tools in low-stakes, supportive environments.

  • Structures that promote teacher metacognition and professional learning, not just content output.

  • Transparency around data privacy, student information, and ethical use.

  • Dialogue about bias, authorship, and the role of human judgment in learning design and implementation.

  • And most of all, a reimagining of AI as a collaborative tool that supports, rather than substitutes for, meaningful thinking.

Hesitation, then, isn’t the opposite of innovation. It’s the beginning of it — the pause before a bold step, the space where educators imagine something better.

Tamyra Walker