Does AI challenge academic honesty? What are its environmental impacts? Are there implications for access and equity? These are some of the questions that the Arts and Sciences Advisory Council for Students (ASACS) — an advisory group composed of undergraduate and graduate students — has been exploring in meetings with Dianne Harris, dean of the College of Arts & Sciences, and faculty members from the College’s four divisions.
These meetings have been messy, generative, and vital for establishing an open dialogue on AI between students and the College.
“I think there's no getting away from AI now, because it's basically the next wave of everything,” says ASACS member and international studies major Priya Khaira. “But it's important that we understand what we stand to lose when we use these services more and more.”
“A topic that keeps getting brought up is the idea that students are losing academic skills because of AI,” adds psychology and philosophy major Lucy Nowicki, the 2025-2026 ASACS Chair. “Regardless of how you feel about AI, it’s the topic of the moment. Everyone's thinking about it.”
Talking with Faculty about AI
When Arts & Sciences faculty came to ASACS meetings to share how their departments were handling AI in their courses, their varied experiences highlighted the challenges of addressing AI in the College.
“We asked them directly what guidance they’ve received on AI,” said Nowicki. “Some have received guidance, others haven’t heard anything. It's not consistent. This year, we’re hoping to bring together individuals at a higher department level who may be able to help with a top-down communication plan that gives clarity to different departments and facilitates a larger message about AI.”
Nowicki and Khaira agree that a campus-wide AI policy is not realistic because each department is so different. But they say that zero-tolerance policies also hurt students.
“AI is relevant, it’s going to stay, and it can be used in helpful, good ways,” says Nowicki. “For those faculty members with zero-tolerance policies, I wish they would have a more thoughtful approach to how AI could be used. Be open to the conversation. Honestly, I think this would actually stop inappropriate use of AI in the classroom.”
Khaira agrees. “Right now, AI is kind of the elephant in the room. It needs to be addressed and integrated in a way where students are still learning.”
Students’ Varied Approaches to AI
ASACS members also wanted to learn more about their fellow students’ feelings about AI. Nowicki designed an anonymous survey that was sent to a cross-section of undergraduate and graduate students. The 82 respondents included representatives from every division within Arts & Sciences as well as students studying engineering, education, health sciences, and the environment.
When asked how often they used AI tools (including ChatGPT, Grammarly, and similar services) on a weekly basis, 45% reported never using them. When asked how often they used AI tools that were explicitly banned by the instructor, 78% reported never using them.
Students were offered the opportunity to elaborate on their responses, and more than one-third of respondents added more thoughts. “They were passionate enough about disliking AI or using AI that they had more to say. This is why students need to be included in the dialogue about AI, especially if and when policies are created,” says Nowicki.
Many of the survey respondents were concerned about AI’s impact on critical thinking skills. “AI is different from other tools in that it solves the entire problem for you or writes the entire essay. It is a direct discouragement of your own critical thinking skills,” wrote one respondent. Another agreed, writing simply: “Using AI is cheating yourself out of the opportunity to learn and think critically.”
Concerns also included the impact of AI on the environment, ethical and copyright infringement implications, and safety concerns. “It worries me when students use AI in civil engineering classes,” wrote one student. “What we do directly puts people in harm’s way if something is calculated wrong or not up to code.”
Some students, however, pointed out that AI is here to stay and can be used as a tool, not a solution.
“I use AI mainly to help elaborate on ideas or concepts I don’t understand, generate study questions, or to help get started on writing assignments. I don’t usually use it when it is not allowed, and I don’t use it to do entire assignments for me,” wrote one student.
Another noted that “AI can help you learn without mentoring, but relying on AI to directly solve problems will only hurt your critical thinking skills.” Some students mentioned services like Grammarly, which can be used to check for spelling and syntax errors: small tweaks that improve the overall readability of an essay without offloading critical thinking. These students were aligned in their belief that, because AI is ubiquitous, students must know how to use it ethically to prepare themselves for the workforce.
“I feel that AI is like any other technology. It’s neutral,” wrote one student. “I use it to generate study questions, and I put my essays into it and have it grade my work. That said, I don’t use it to generate my work. I think it’s a great study tool and assistant and should be limited to that.”
For her part, Nowicki uses AI — ChatGPT specifically — but only sparingly. “The biggest way it helps me is with writing emails, particularly outreach emails. It helps me develop a professional, formal tone when reaching out to potential speakers and guests for ASACS meetings.”
Khaira is even more skeptical. “I'm really wary about using it because I come from a privacy background, so it makes me nervous. And that's why I've tried to stick with only using ChatGPT, because I feel like the more platforms you engage with, the more places your data is being stored. I sometimes use it to help with research. I usually don't like to use it when writing, because I feel like it takes away from that innate skill.

“I do think that if we can be open about talking about it as a tool, not as a replacement, we'd be on a better course.”
Curious about faculty perspectives on AI? Check out “AI in the Classroom? For Faculty, It’s Complicated.”