Let’s Not Try to Outsmart AI: A Sociologist’s Reflection on Teaching and Learning in the GenAI Era
February 24, 2026
Elena Neiterman, University of Waterloo
In early January, when most people were still finalizing their New Year’s resolutions, I sat in front of my computer preparing to teach Gender, Sex, and Health. While the two years during which I did not teach this course might seem like a short pause, for me they marked a shift in the pedagogical era, a shift shaped by the rapid proliferation of generative AI.
I am not new to teaching, or so I thought. As a teaching faculty member with 15 years of experience, I have taught thousands of students across dozens of different courses. It felt like I’d done it all. Like I knew it all. Until it didn’t.
When AI Entered the Classroom
When generative AI crept into the classroom, few of us were prepared. As with many other new and exciting educational tools, some instructors jumped at the opportunity to experiment with this technology, while others felt the harms of GenAI were so significant that it should be banned outright.
The media didn’t help either. Headlines reported on the excessive use of GenAI in education, where instructors used GenAI to create and grade assignments, while students relied on ChatGPT or other similar platforms to complete these tasks. Some articles openly questioned whether academic institutions themselves would soon become obsolete. And somehow, amidst all this buzz, we were expected to teach.
After spending some time in blissful denial, I had to face reality. A sudden and rather unexpected improvement in students’ writing hinted they might be using GenAI. Yet, our traditional tools like Turnitin were strikingly ineffective in detecting AI use. I spent hours sorting through plagiarism reports that were neither accurate nor reliable. I was not fond of this detective work – I chose a career in teaching, not policing.
Almost simultaneously, I also saw a dramatic jump in students’ grades on online tests and quizzes. True, there might have been a chance that we lucked out and admitted into our program a cohort of prodigies… but it was much more likely that my assessments were no longer aligned with the new learning landscape.
Facing GenAI Reality
Like many instructors, I struggled with what to do. Workshops about GenAI left me feeling overwhelmed. I learned that ChatGPT could plan my lessons, generate lecture slides, compose exam questions, and design class activities and assignments. Students, meanwhile, could use AI to write assignments and tests, to summarize readings without ever reading them, and to replace my “exam prep” sessions by generating exam questions from my slides.
In short, it could do it all. So, where did that leave me? What role could I still have when seemingly anything and everything I do might be replaced by AI?
Rethinking the Purpose of Teaching
This new reality forced me to rethink my teaching. For years, I strongly believed in the value of writing as a tool for developing critical thinking skills. Was it still important, I wondered, to hone students' writing skills when Grammarly, ChatGPT, and even Outlook could do it for them?
How essential is it today to memorize dates and facts when most of us do not remember the phone numbers of our family members and friends?
What was the purpose of banning GenAI from my courses when employers across the country now expect students to be experts in AI?
More importantly, what could I still offer to students that GenAI could not? What did I want them to learn in my classroom?
After going through this professional identity crisis, I ultimately settled on two skills: critical thinking and interpersonal communication.
Teaching Critical Thinking in an AI World
True, ChatGPT can generate content, but – at least for now – it cannot truly think. We – humans – still need to develop the ability to critically assess the outputs provided by AI, interrogating their accuracy, validity, and biases.
Redesigning my assignments, I invited students to use GenAI, but also to critique it. What was overlooked or taken for granted in the paper they produced with the assistance of GenAI? Where were its blind spots, and how could we, as sociologists, detect them? What prompts should we use to receive an effective output from GenAI? What can we learn from this process about our own assumptions and biases?
In the era of information overload, these questions help students learn to think and act like sociologists.
Re-establishing Human Connection
The other skill I integrated into my teaching is interpersonal communication. Today’s students have lived through online learning during the COVID-19 pandemic and the rise of always-on technology. They crave human connection, yet many of them struggle with social anxiety.
No AI can replace the embodied, spontaneous, sometimes awkward but always human experiences of peer interaction. Learning is about exchanging ideas, and today this exchange among humans seems more important than ever.
So, I cut down on my “sage-on-the-stage” time in class, leaving more space for small-group discussions and think-pair-share exercises. Even in large classes, I strive to create small pockets of community. Class discussions do not always work (especially if you happen to teach an early morning required class), but when they do, they create a special energy in the room, the kind of energy that I have yet to experience communicating with ChatGPT.
Moving Forward
I am not naïve. I know these strategies are neither perfect nor permanent. GenAI is improving daily.
Designing my assignment for Gender, Sex, and Health, I asked ChatGPT, “How can I make it AI-proof?” The response was surprisingly thoughtful: ask students to select specific quotes from the readings and connect them to class discussions.
Great idea, I thought, until I realized AI can also help students find these quotes and link them to the course content.
I cannot outsmart GenAI. I am not that talented or witty. Most importantly, I do not want to play this game. And perhaps that is the point.
What I can do is teach students to be critical consumers of AI-generated information. All technologies reflect the biases and assumptions of humans who built them.
Learning sociology has always been about questioning how the world works. GenAI is now a part of this world. We owe it to our students to question what it has to offer. And this is where our work as educators begins.
Interested in discussing this topic in more detail?
Join our next CSSH Teaching Conversation Circle on Tuesday, March 3, 2026, 12:30-1:30 EST.
Register here: https://uottawa-ca.zoom.us/meeting/register/w67lErbOQu2w7QSvrmCuZg