Community Members Share Thoughts on Generative AI

By CARLEE CANSECO, KAYLEE GONG, and KEVIN THANT

Artificial Intelligence (AI) has long been a term with futuristic connotations. However, with the rise of tools such as ChatGPT and Google’s Gemini, generative AI technologies have become increasingly prominent in everyday life.

Along with this rise, Exeter is seeing an uptick in concerns regarding academic dishonesty and plagiarism. To adapt to rapidly changing technologies, Exeter’s administration has tried implementing new AI policies at the department and faculty level. However, these policies remain unclear and continue to vary from teacher to teacher and course to course.

As part of her master’s degree, Instructor in English Christina Breen conducted thesis research on academic integrity. Since completing the degree last April, she has served as a fellow for the Phillips Exeter Center for Teaching and Learning. Her research examines department policies on AI and engages the community in conversation about AI use.

“We don’t really have our finger on the thumb of how prevalent this is among our students,” Breen commented. “It’s percolating up to the CCC… and if there are 20, 30 cases amongst 1100 kids in this school, there’s no way only 30 kids are using these tools.”

“There’s been a significant increase in academic integrity cases with the CCC,” Breen continued. In her opinion, there has been a noticeable association between developments in generative AI and academic dishonesty. Especially after advancements such as GPT-4, more and more students are turning to artificial intelligence to assist with or complete their assignments.

However, some believe the shift toward AI does not necessarily help students complete their assignments or achieve better grades. “I think that some students are trying to use AI in my classes, but I am pretty sure they aren’t getting good grades when they do so,” Instructor in History and Department Chair Betty Luther-Hillman reflected.

“Most of the papers that I’ve seen that used AI were simply not very good papers; the AI did not help the student’s grade on the essay and, in many cases, made the paper worse,” she continued. Other humanities teachers also voiced similar sentiments on using AI in writing pieces. They have found that it is easy to spot AI-written essays due to their lack of quality.

There are also concerns that generative AI takes away from the critical learning experience of writing. “Even partial dependence on AI when you are writing an English paper means that you are skipping over critical stages of your own thought process and gaining only the limited expertise of harvesting and adapting others’ ideas and words to your own purpose,” Instructor in English and Department Chair Barbara Desmond explained.

This critical thinking and articulating stage of development is inevitably challenging, and many students are tempted to skip it by turning to AI. The problem is only made worse by these tools’ accessibility. ChatGPT can be reached with a simple Google search and can provide an answer in seconds. Grammarly Premium, a commonly used web browser extension, can change sentence structure and content with the click of a button.

Several teachers note that this accessibility poses a significant risk to students. “I think the most significant challenge is that it’s so easy to use because it’s right there, and the temptation must be really high,” Instructor in Mathematics Jarad Schofer remarked. “When it can solve something for you or do something for you in a matter of seconds, it can be hard to stay away from it.”

“I totally understand the temptation to use it, but students should be aware that AI is not always accurate in what it writes,” Luther-Hillman commented.

An anonymous student further articulated their views on AI in a different light. “Obviously the usage of AI is wrong and unjust, but it often helps students reduce their stressful homework load and relieve the difficulties of an Exonian’s lifestyle,” the student said. “However, when the school punishes the students with an even more dramatic CCC case, it just enforces a negative feedback loop. CCC cases cause students more stress, which is the exact reason why they used the AI. The school should focus on finding a better alternative, both enforcing no AI and punishing it.”

As more and more students are beginning to turn to artificial intelligence for academic purposes, the Exeter faculty are changing the structure of their classes and assessments.

“Some teachers have moved away from grading any out-of-class work,” remarked Instructor in Chemistry Jeannette Lovett on changes in the science department. Lovett emphasized that teachers were weighing in-class assignments more heavily while making lab activities, which are completed at home, a minor grade.

However, this phenomenon is not unique to the sciences. The mathematics department also maintains an 80/20 grade weight split between in-class and out-of-class work. A shift toward in-class work has even reached departments such as English, where take-home assignments have traditionally constituted a significant part of the course.

“For the first time in 15 years at Exeter this year, I’ve given tests,” Breen commented. It is a shift she never expected to make.

Similarly, Turnitin, a tool the history department has long used to detect plagiarism and AI usage, has recently been adopted by the English department. “The nature of the assignments, how they are given, and how they are graded has changed,” Breen elaborated.

However, teachers recognize AI’s usefulness, especially in the learning process. In her research, Breen recognized that “teachers are really eager to learn more about these tools. They feel that they’re generally ill-equipped across all of our departments. In general, our faculty is craving more education about these tools.”

On campus, Lovett leads an Artificial Intelligence working group formed by interested community members. The group meets to discuss developments in AI and its usage on campus. Lovett agreed with Breen’s statement. “One of our goals for this winter is to think more about professional development for faculty and how we get faculty and even staff more comfortable with these tools to help them and help students develop perhaps an ethical framework.” 

She further commented on the rise in academic dishonesty throughout the campus: “We’ve been focusing much more on the rise in academic dishonesty cases. I feel like that sort of puts it in a very negative light where it could be a great tool for us and for students and learning.”

AI usage is a nuanced topic, and teachers across departments have different views. “Currently, I think it’s being used more to cheat than as a tool,” Instructor in Mathematics Eric Bergofsky said. “However, I think it could be used as a tutor. So, instead of walking over to the learning center, if I have tried a problem and given it a reasonable attempt on my own and I’m stuck, you ask a computer for a hint.”

Breen acknowledged the benefits of using AI. “If they [students] use a tool to have a dialogue and to continue and to further their critical thinking and further their skills, then it’s enhancing their learning.”

An anonymous student agreed. “I often use ChatGPT to help explain math problems or concepts I don’t understand,” the student said. “I think, when used properly, ChatGPT may actually have positive effects on our community and even reduce the burden of the learning center or teachers to help explain basic concepts.”

Another anonymous student feels that AI policies need to be clearly communicated in each class. They believe the AI policies outlined by departments are effective, yet “depending on the teacher, the class usually doesn’t spend much time going over the policies together.” The student added, “It’s just up to the students to read the rules and understand it on their own.” In other words, ensuring that students’ understanding of AI policies aligns with what each department has laid out is crucial for integrity and learning within classrooms.

Whether or not AI is beneficial, there’s no doubt that it will be an integral part of our lives. “It’s tough because, as you can imagine, there are some foundational skills that technologies can do well, but in order for you to grow as an individual and be able to do higher-order thinking, you do need to be able to do some of those basic foundational skills on your own as well to help make the connections and build on those foundational skills,” Lovett remarked.

“I just hope that our school will roll up their sleeves and try to learn more about this technology so that we can use it for good,” Breen concluded. “There’s a way that it can be harnessed for good. It’s not going away. You are now a generation using AI for virtually every part of your lives. And the sooner we teach you how to use it responsibly, the better.”
