Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.
Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a subsequent math test than students who didn't have access to it. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic they had been learning.
A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on the test afterward, these AI-tutored students did no better: students who did their practice problems the old-fashioned way, on their own, scored just as well.
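For a rough sense of what "hints without divulging the answer" can look like in practice, here is a minimal sketch of a hints-only tutor built on the OpenAI chat API. This is not the researchers' actual setup; the system prompt, the model name, and the `tutor_hint` helper are illustrative assumptions.

```python
# Minimal sketch of a "hints-only" math tutor (illustrative, not the study's
# actual implementation). Requires the openai package and an OPENAI_API_KEY
# environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Assumed system prompt: guide the student, never state the final answer.
TUTOR_SYSTEM_PROMPT = (
    "You are a math tutor. Guide the student toward the solution with "
    "questions and incremental hints. Never state the final answer, "
    "even if the student asks for it directly."
)

def tutor_hint(problem: str, student_message: str) -> str:
    """Return a hint for the given practice problem without revealing the answer."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
            {"role": "user", "content": f"Problem: {problem}\n\n{student_message}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(tutor_hint("Solve for x: 3x + 7 = 22", "I'm stuck. What should I do first?"))
```

The behavioral difference between this and stock ChatGPT is entirely in the system prompt, which is why the study could compare the two with the same underlying model.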
I'm guessing the school had a previous connection with some of the study authors.
I skimmed the paper and didn't see any mention of language. I'd be more interested to know whether the students were using ChatGPT in English or Turkish, and how that would affect performance, since I assume the model is trained on significantly more English-language data than Turkish.
GPTs are designed with translation in mind, so I could see them being extremely useful for providing instruction on a topic in a student's native language rather than in English.
But they haven’t been around long enough for the novelty factor to wear off.
It’s like computers in the 1980s… people played Oregon Trail on them, but they didn’t really help much with general education.
Fast forward to today, and computers are at the core of many facets of education, giving students access to knowledge and skills they'd otherwise never be able to learn.
GPTs will eventually go the same way.