A recent experiment conducted at UMass Amherst found that using AI to complete assignments did not significantly improve students' grades. However, students who were allowed to use AI showed more engagement in the subject material, attended class more often, and participated more actively in class. The study suggests that traditional factors such as a student's cumulative grade point average are more effective in determining academic performance.

AMHERST — Since the advent of ChatGPT and similar AI models, college professors have scrambled to keep students from coasting on assignments now that chatbots can seemingly do all the work for them. But how much does using AI actually help improve students’ grades?

Not by very much, according to a recent semester-long experiment conducted at the University of Massachusetts Amherst that sought to examine the impact of AI on student learning. The experiment was led by Christian Rojas, professor and chair in the Department of Resource Economics.

“The inspiration had to do with the intense discussions that we see out there about the effects of AI on learning,” Rojas said in an interview. “I’ve also been invested in AI for a couple of years now, and I wanted to have greater involvement with the technology.”

Rojas, along with his academic colleagues Rong Rong and Luke Bloomfield, conducted the experiment on students in two different time slots of the same class taught by Rojas, a course on antitrust economics.

Students in the morning class were forbidden from using AI for assignments and in the classroom, while the afternoon students received permission and structured guidance on AI usage throughout the course.

Rojas chose to grant AI access to the afternoon class, he said, because he had observed that students in the afternoon section of the course historically performed slightly worse than morning students.

Students in both classes were told they were taking part in an academic study, but were only told the study had to do with pedagogy and education, not AI.

“The only difference between the two classes was the AI usage,” Rojas said. “We made sure they received exactly the same lectures, slides, homework and exams, all of that.”

But when it came to the course’s two major exams, as well as overall final grades, the professors found there were no major differences between the two classes. Rather than the usage of AI, more traditional factors such as a student’s cumulative grade point average proved far more effective in determining their academic performance in the class, the study found.

Although the students may not have improved academically from using AI, they showed other signs of improvement when it came to their studies, Rojas said. He noticed that students in the AI class appeared more engaged in the subject material, showed up for class more often and would participate more actively in class.

“We can’t say this in the [study], because we don’t have a real variable for this,” Rojas said. “But just going into the two classrooms as the semester evolved became like two different experiences.”

In surveys distributed to students who took the course, those in the AI class also reported a more positive outlook on the technology, as well as fewer weekly hours spent studying outside of class. Students in the AI class also reported more frequently using AI outside the class to edit and check their original work for grammatical issues, rather than using it to obtain a complete answer.

“They don’t do better in the exams, but they spend less time doing the work,” Rojas said. “They get to the same finish line, but they’re able to do it in more efficient ways and with more engagement.”

Of course, this study covers just a small sample of the total number of students attending UMass, and an even smaller sample of college students across the country, many of whom are likely using AI. But Rojas said the consistent pattern observed in how students interacted in class depending on whether AI use was permitted was certainly worthy of further conversation.

He said in the future he would like to do an expanded version of such a study, such as tracking students across their freshman or sophomore year and then determining how they do later in college.

“Seeing these kids engaged in and coming to me and talking about AI and how they could use it, it just opened up this transparency,” Rojas said. “As an instructor, I would tell the class how I use AI to help me with certain tasks. So there was transparency on both sides, and I think that was very important.”

Alexander MacDougall is a reporter covering the Northampton city beat, including local government, schools and the courts. A Massachusetts native, he formerly worked at the Bangor Daily News in Maine.