
AI in the Classroom? For Faculty, It's Complicated

01/30/2026 | February 2026 Perspectives
Illustration by Jessica Castillo.

With artificial intelligence (AI) tools like ChatGPT, Copilot, and NotebookLM now widely available to students, UW faculty face new challenges and opportunities in the classroom. Perspectives newsletter editor Nancy Joseph spoke with three College of Arts & Sciences professors to discuss the impact of AI on their teaching and on student learning. The consensus? It’s complicated.

The roundtable participants:

Megan Callow, Teaching Professor, UW Department of English, and Director of Writing at UW Seattle
Ian Schnee, Teaching Professor, UW Department of Philosophy
Casey Self, Teaching Professor, UW Department of Biology

For the purposes of this conversation, AI refers to artificial intelligence tools used with intentionality to help with assignments and/or assist with studying.

Discussing AI in the classroom are (from left) professors Megan Callow (English), Ian Schnee (Philosophy), and Casey Self (Biology).
What’s your general approach to AI use by students in your classes?  

Self: I specifically ask students not to use AI in my classes. AI will be an important tool for many fields, and students should understand how to engage with it ethically, but that's hard to do in classes where AI doesn't make sense pedagogically. Still, we do discuss AI in the context of decision-making. Many students in my classes are going into health care careers, and I've always had conversations with them about how every day they are making little decisions that can prepare them for big decisions as health care professionals. The availability of AI tools adds to that. When they have three finals and a time crunch, they might think, "There's a tool I can use, and it's unfair that this faculty member is saying that I can't use it." And so AI is providing an opportunity to talk about ethical decision-making in a way that I never had before.

Callow: I agree with Casey that it's essential for us to introduce this conversation about intention and choice-making into the classroom now. But my pedagogical approach is different. As a writing teacher and as someone who has come around to believing that teaching critical AI literacy is now part of my job, I encourage students to experiment with AI and I give them structured spaces in which to do that. We have lots of conversations about what it can do and what it can't do and the considerations around all these choices. While Casey is talking about the ethical or moral consequences of choices, I'm also focusing on their rhetorical consequences.

As a writing teacher and as someone who has come around to believing that teaching critical AI literacy is now part of my job, I encourage students to experiment with AI and I give them structured spaces in which to do that.

Megan Callow, Teaching Professor, UW Department of English; Director of Writing at UW Seattle
Can you give an example?

Callow: In my fall class, we were playing around with NotebookLM as a tool to organize content for a multimodal research project that students were doing. One student built a website, and during our group conference, we all remarked on how flat it was. It felt like most of it was informed by Wikipedia, and it was totally depersonalized, even though it was a very personal project. The student realized he had brought too many of NotebookLM's suggestions into the final project. There was an aha moment where he realized the content it produces is not that great and that he could do better than that.

Are there times when AI is beneficial for your students?

Callow: Ian and I are conducting a study with my English Department colleague Calvin Pollak to assess this. We have three groups of students working on projects — one using Copilot, one using NotebookLM, and a control group using no tool at all. I ask students in the first two groups to use the AI tool on the front end of their writing (for topic development and keyword generation) and on the back end (for editing), but not for drafting. We have found that the AI tools can be useful in generating keywords that the student can then bring to research databases. We also find these tools useful for organizing ideas.

Ian, you’ve been involved in several AI pedagogy studies and have kept up with others. What has stood out to you?

Schnee: A big problem with student use of AI is that students often use it to offload the cognitive work of learning. There is a good study from MIT (Kosmyna et al., 2025) that looked at EEG brain activity and found that it was suppressed when students were using AI for essay writing without proper guidance. On the other hand, with proper guidance it can be beneficial. A paper from Harvard (Kestin et al., 2025) found that AI tutors actually outperformed active-learning instruction. They had one quiz section with an instructor using active-learning best practices and another quiz section that used an AI tutor. The students with the AI tutor outperformed the other group. But it's important to note that this was a customized AI tutor, fine-tuned to do things like encourage a growth mindset and sequential thinking. It would not just give the students the answers. Instead, it engaged in interaction with them that led to the learning results.

Have you incorporated AI tutors into your courses?

Schnee: I have. I used the RAG (retrieval-augmented generation) framework to make a customized AI tutor for my large logic class, which has 200 to 400 students. I provided all the source material for the AI tutor — my course materials, my textbook, my notes, hundreds of YouTube videos, transcripts, and PowerPoints — and customized instructions to structure its answers. I allowed the students to use the AI to do their homework too — I'm against unenforceable AI policies — but we had a control. There were written in-person exams with no AI allowed, so I could rigorously see if students were actually using the tutor bot to learn. They had the threat of that and knew their grade would tank if they didn't take some responsibility for their AI use. Lastly, I devoted 15 minutes of every lecture to metacognition — learning how to learn and how to use the AI as an aid to learning. I had such positive responses to that from students. It was their favorite part of the class. And the result is that I saw a one-letter-grade improvement in my students' performance.
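For readers curious about the mechanics, the sketch below shows the basic shape of a retrieval-augmented generation loop: relevant passages are pulled from a course corpus and packaged with tutor instructions before anything reaches the language model. It is not Schnee's actual tutor; the toy passages, the crude word-overlap scoring (real systems typically use vector embeddings), and the TUTOR_INSTRUCTIONS prompt are all invented for illustration.

```python
# A minimal sketch of a retrieval-augmented generation (RAG) loop.
# NOT Schnee's actual implementation: the passages, scoring, and
# instructions below are invented for illustration.
import math
import re
from collections import Counter

# Toy "course corpus." A real tutor would chunk lecture notes, textbook
# sections, video transcripts, and slides into passages like these.
PASSAGES = [
    "Modus ponens: from P and 'if P, then Q', infer Q.",
    "A truth table lists every combination of truth values for the atomic sentences.",
    "An argument is valid when the conclusion is true in every row where all premises are true.",
]

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def score(query, passage):
    # Crude lexical-overlap relevance; production systems typically use
    # vector embeddings and a nearest-neighbor index instead.
    q, p = Counter(tokenize(query)), Counter(tokenize(passage))
    return sum((q & p).values()) / math.sqrt(len(tokenize(passage)) + 1)

def retrieve(query, k=2):
    # Return the k passages most relevant to the student's question.
    return sorted(PASSAGES, key=lambda p: score(query, p), reverse=True)[:k]

# Customized instructions shape the tutor's behavior, echoing the goals
# mentioned above: guide the student rather than hand over answers.
TUTOR_INSTRUCTIONS = (
    "You are a logic tutor. Do not give final answers outright; ask "
    "guiding questions, encourage a growth mindset, and ground your "
    "replies in the provided course material."
)

def build_prompt(question):
    context = "\n".join("- " + p for p in retrieve(question))
    return f"{TUTOR_INSTRUCTIONS}\n\nCourse material:\n{context}\n\nStudent: {question}"

if __name__ == "__main__":
    # A production tutor would send this prompt to a chat-model API;
    # here we simply print it to show the assembled result.
    print(build_prompt("How do I check whether an argument is valid?"))
```

Running the script prints the assembled prompt, which is the piece a deployed tutor would hand to the model along with the kind of customized instructions Schnee describes.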

...I devoted 15 minutes of every lecture to metacognition — learning how to learn and how to use the AI as an aid to learning. ...It was their favorite part of the class.

Ian Schnee, Teaching Professor, UW Department of Philosophy
When you’re committing class time to discussing these things, does that mean sacrificing content you would otherwise teach?

Schnee: I view this as a false dichotomy. The same pattern comes up regardless of AI if you devote time to active learning: you're going to decrease content coverage in class, but students learn it better. So ironically, devoting 15 minutes of class time to something seemingly irrelevant led to better learning of the thing I wanted them to learn anyway.

Does the UW have policies or guidelines related to using AI in the classroom, or is every professor on their own?

Self: A couple of years ago, the Faculty Council on Teaching & Learning was asked whether we should have a policy on AI, and my response was that any policy we write will be out of date by the time the ink dries. The Faculty Senate decided that the use of AI falls under faculty autonomy — that faculty are best positioned to understand how new technology fits into their existing pedagogy and should have freedom to decide what goes in and out of their classes. So that's the current lay of the land. There are programs through the Center for Teaching and Learning, or Teaching@UW, the tri-campus enterprise, that provide incentives and guidance to faculty to help integrate AI into their courses, but other than that, it's on faculty to independently decide how to fit AI into their pedagogy.

That could be intimidating for faculty who are not tech savvy.

Callow: There was lots of anxiety at first. I resented that I had to learn about this thing when I already can't cover everything that I want to cover in my classes. But I came around to accepting that I would be defaulting on my responsibility to teach different kinds of literacies to my students if I decided not to teach it or refused to acknowledge it in my classes. I came to understand that critical AI literacy resonates so much with the other kinds of literacies that I'm already teaching. Students are making rhetorical choices when they use these tools. There's digital literacy involved, and information literacy and media literacy and citation literacy. Generative AI is providing new contexts for me to continue conversations about all of these issues that I'm already passionate about.

Given that not all faculty welcome AI use in their classes, is the range of faculty responses to AI confusing for students?

Self: Where I see students confused is when faculty haven't made their decision about AI use clear. If faculty don't want to use AI, that's okay. You get to decide not to use it. But say it with your full chest and expect your students to ask questions. You need to be able to have a conversation about that. And if you are using AI, set your expectations clearly for where and how you're using it so that students know what the boundaries are, and when they hit that boundary, you can give them clear guidance.

If faculty don't want to use AI, that's okay. ...But say it with your full chest and expect your students to ask questions. You need to be able to have a conversation about that.

Casey Self, Teaching Professor, UW Department of Biology

Callow: This confusion is not new. Students are used to being confused by seemingly arbitrary, shifting expectations from different faculty. So what we have to do is train them to take an inquisitive stance, so that every class they go into, they can be perceptive about the cultural norms and expectations of that situation.

Schnee: I agree with everything Megan and Casey have been saying about this. I would just add that there's additional uncertainty for students beyond differences between instructors. In terms of how much to use AI for coursework, students don't necessarily know the norms for cheating anymore — when is using AI considered cheating? — and whether AI is going to matter more later on than the primary course content. So there are a multitude of interesting dimensions of uncertainty that go to the core of student identity and ethics. Also, in my 200- to 300-person class last year, one-third of students were conscientious objectors who did not want to use my tutor bot. They would explain why, and they were not required to use it. So students themselves are not homogeneous. There are a lot of different viewpoints among students right now.

Any final thoughts about AI in the classroom?

Self: Just that off-the-shelf AI tools are not necessarily a good fit for all classes, so thoughtfully integrating AI is a big undertaking. It might not be something every faculty member is in a place to do at any given time.

Callow: And there is still so much in development for the world of teaching and learning. As Ian said, students have a lot of uncertainty about what the expectations are, and that's because we as teachers have a lot of uncertainty about it. There's huge variability, and we're still figuring out how we feel and how to teach it and whether to teach it, so I hope that we can all give ourselves some grace as we work this out.

Curious about student perspectives on AI? Check out "What Students Really Think About AI."
