Exeter Adapts to New ChatGPT Policies
By ISABEL EVANS, ELLEN JIN, and ADELLE PITTS
With the recent rise of AI, schools have been questioning how its use might affect academic settings. At a school as rigorous as Phillips Exeter Academy, faculty and students are concerned that AI may negatively affect the way students learn.
One of the most popular AI chatbots is ChatGPT, released by OpenAI on Nov. 30, 2022. The application lets users ask questions, generate images, write essays, and much more. Teachers worldwide are trying to limit AI use in the classroom for fear of students abusing its capabilities, but some have questioned this response. Is ChatGPT a threat to education, or can it be used to improve learning?
While views on the topic vary, most members of the community agree that AI tools can hinder student learning and thinking, though they may be helpful to a certain degree.
“The downsides of AI are many for students writing essays and narratives, as so much important learning happens in the midst of struggle and failure,” Barbara Desmond, the Chair of the English Department, said. “AI tools make it easy, even if you are only using them to get started: you never have to have that unsettling experience of staring at a blank screen.”
Hannah Hofheinz, the Chair of the Department of Religion, agreed that using AI detracts from students’ learning and undermines the originality of their writing, stating, “We want your ideas, your interests, your passions, your thinking, and we want to build that up. That’s true speaking, that’s true idea creation, that’s true writing. That’s true imagining. So if you use AI, you have short-circuited that.”
Hofheinz also believes that AI is something that needs to be explored and has the potential to bring positive changes to the academic setting. “I think the school must take time to do the necessary learning about the technology. Demystify the technology. We should understand how it works. It is not magic, it is simply something that was created,” she said.
Many people on campus agree with Hofheinz, and some have already put changes into motion in response to the rise of AI. Instructor in Science A.J. Cosgrove said, “I rarely ask students to complete a traditional ‘lab report.’ Instead, I’ve moved to lab proficiency assessments. I’ve removed the potential for academic dishonesty, and I’m forced to think critically about what lab skills students should become proficient in.”
When asked whether they use ChatGPT or other AI tools, students said they mostly turn to them for factual information. “I would say more recently I started using ChatGPT to find synonyms for words for papers. But that’s the extent of where I use it for homework,” upper Sophia Wang said.
Senior Emilie Carranza added, “I don’t really use it for homework except as an extra resource. Sometimes, I’ll search up chemistry theories or I’ll ask it to give me problems for math.”
Several students expressed concerns over the reliability of using AI tools. “I think that if you don’t know how to do your homework, you’re better off going to class and figuring out how to do it than you are using ChatGPT, especially because ChatGPT is usually wrong,” upper Sarah Huang said.
“Even if you did try to use it to write essays for class, it’s not a reliable tool and most teachers can recognize that,” Carranza added.
Wang also discussed the common occurrences of AI’s inaccuracy. When asked about seeing other students using AI, she said, “Sometimes people use AI to try to solve math problems, but the answer often comes out wrong. The answers from ChatGPT should be taken with a grain of salt because it’s outputting answers from a random website online. However, it is still sometimes helpful to see the process lined up for you, even if the numbers are wrong.”
Director of Studies and Instructor in Science Jeanette Lovett agreed, saying, “While AI can provide information, it might not always be accurate or up-to-date. Students should learn to verify information from reliable sources.”
Lovett also mentioned the benefits of using AI: “Students can use generative AI to get assistance with homework or for understanding complex topics. It can provide explanations, examples, and help clarify doubts. And it is very friendly and non-judgmental.”
However, Lovett also believes in AI’s potential to harm students’ learning. “Students might become overly reliant on AI, hindering the development of critical thinking and problem-solving skills,” she said. “It’s crucial to encourage independent thought and research.”
The rise of ChatGPT has pushed different departments at Exeter to develop their own AI policies. Lovett said, “In the spring of 2023, an ad hoc faculty group discussed acceptable use of emerging generative AI technologies like ChatGPT. We came to the conclusion that our existing Academic Honesty policy already has the language to encompass generative AI. It uses more general language about first requesting permission from a teacher for the use of any tools/resources, as well as the need for citations.”
Individual departments also have their own policies regarding AI. “AI is a very new thing,” Chair of the Modern Languages Department Fermin Perez-Andreu said. “We are starting to learn about it in our department. We still need time to see how it can be productively introduced in the classroom and how we can avoid its problematic aspects. So far, maybe some of us have started to use it to create practice exercises for readings or videos we watch as part of our classes, but we haven’t gotten much further.”
Hofheinz also commented on the policies in the religion department. “We’re doing it on a term-by-term basis, but there is a religion department policy which essentially states that AI can only be used with advance and explicit permission from an instructor,” she said. “Every term we’re returning to it and rethinking it. Because our biggest concern is that we respect what our goals are — promoting individual thinking, that is.”
AI has raised concerns about how it may affect student learning, but the community generally agrees that regulated use could benefit students. Lovett concluded, “Overall, incorporating generative AI into education requires a balanced approach that leverages its benefits while addressing potential challenges and promoting a well-rounded learning experience.”