Faculty Tackle AI in the Classroom and Beyond
In pondering the stunning growth of artificial intelligence tools at the beginning of the Center for Teaching and Advising’s “Enhancing Our AI Literacy” conference, some faculty members expressed excitement at incorporating the powerful new tools into their classes. Others, though, like Associate Professor Karen Petruska (Communication Studies), joked that they went “straight to existential threat” when considering AI’s possible ramifications for teaching and learning.
It’s safe to say that when the couple dozen faculty members, administrators and staffers left the day-long discussion in August — part of CTA’s Teaching and Learning Symposium 2023 — most attitudes landed somewhere between excitement and fear. But everyone walked away knowing that AI is here to stay.
What happens next as the Gonzaga community adapts to this dynamic new tool is a story that will unfold differently across campus. The overall message for faculty at the symposium, though, was that there are incredible opportunities to teach students to use AI in ways that enhance their creativity and academic rigor. And the same goes for teachers.
“In teaching, there’s always another thing to learn about,” said CTA Director Nichole Barta (Kinesiology & Sport Management) in a session she co-hosted with Justin Marquis (Instructional Design and Delivery) delving into AI as a “new science of learning.” “But I think about the students, and they show up to class every day thinking the same thing.”
Chase Bollig (English) is an early adopter and embracer of AI. He used it in a first-year writing course last year, learning how students use AI now and what the potential is for using ChatGPT or Google’s Bard AI tool as an automated researcher, assistant and writer. He led a session on “AI as Part of a Writing-Enriched Tool Kit” and noted that his students found it frustrating at times, and some had their own ethical questions about how much AI use in an assignment is too much.
Other sessions at the symposium included Marquis guiding faculty through how to design assignments that incorporate AI, and research librarians Anthony Tardiff and Candise Branum exploring “AI-Powered Research: Helpful or Hallucination?”
“Keeping up with the evolution of AI can be daunting,” Barta said, reflecting on the conference a few days later. “Faculty already have so many responsibilities, and the thought of mastering yet another area can feel overwhelming. This becomes particularly significant when we see how AI can influence how some faculty deliver their assessments.”
How to Engage, or Not?
AI offers both challenges and opportunities across college campuses. If you’re in, say, admissions, you might have work to do in determining how much of a potential student’s essay is the student’s work and how much was generated by ChatGPT. A professor in the classroom, conversely, might find students’ ability to generate Socratic debates with themselves extremely valuable for honing arguments for a paper or presentation.

Barta outlined four ways for teachers to react to the seemingly unstoppable tide of AI:
1. “Ban” AI completely from the classroom, knowing students will most likely continue to use it and challenge any findings by the “AI detectors” currently available.
2. Attempt to “evade” AI use via proctored exams, which are cost-prohibitive, or by having students disclose when they use AI in their work, though that will become more difficult as AI becomes more pervasive.
3. “Adapt” to AI’s arrival by developing “new methods of assessment, new policies and guidelines.”
4. “Embrace” it, recognizing that AI is here to stay and will “pervade all aspects of our work.”
“We need to initiate a process of building ethical systems and trust in those systems,” Barta said, relaying some research she’d done. And while some might want to ban or evade AI in the classroom, or jump right in trying to use the tools often, she encouraged all teachers to do some research and “be clear about what AI can do in your field before you start inserting it into class. Let students know where it’s flawed, what its strengths are.”
Steve Conant, a lab specialist and anatomy instructor in the School of Health Sciences, considers himself an “embracer” of all things AI. He’s partial to Google’s Bard tool and attends Google conferences regularly to learn about ways to use its tools in the lab and classroom.
“For me, it’s a tool like a calculator is a tool,” Conant said of AI. “I want to figure out how we immediately can get on this.”
The Human Element
Justin Marquis is director of GU’s Instructional Design and Delivery office and chair of its Academic Integrity Board, and both roles will have him addressing AI to varying degrees.
“It’s here,” Marquis told the attendees of the Enhancing Our AI Literacy event. “We don’t have a choice. How do we make the best of it?”
That answer is still in development, but Marquis has researched and worked with AI more than most, including for classes he’s taught, and he noted that while many consider AI something to replace human work and ingenuity, in reality it’s a tool that’s only as good as the humans working with it.
Marquis suggested a few ways to bring the humanity of the student into any project using artificial intelligence, and said that “open and transparent conversation” about the positives and negatives of AI is needed. For example, ChatGPT now allows link-sharing for the conversations it produces, so a teacher can ask a student to show the process of how they used the tool in an assignment. He also noted that a typical paper “written” by an AI tool will only deliver C-level work; it’s the expertise and extra research a student can add that will make that paper an “A.”
There are some inherent risks to using AI. Because these tools learn from data, and that data carries the biases of the humans who collected it, an AI tool will replicate and reflect the bias in the data. That’s where the humanity of the students and the expertise of faculty members must come into play.
Even so, he’s found AI well worth using, relying on it to create resources for students and to generate ideas for assignments in his classes. Marquis illustrated the possibilities by describing how he created a fictional pickleball league and used AI to help draft bylaws, rules, financial plans, press releases and scripts for league coaches, effectively doing weeks of work in just a couple of hours.
With that exercise in efficiency and creativity in mind, Marquis believes AI offers new ways to engage students, fresh means for them to re-engage post-pandemic, and new ideas in how they approach their classes.
“It can springboard your process in interesting ways,” Marquis said.