
What Happens if Using AI to Cheat Becomes the Norm?

AP Photo/Michael Dwyer

A little more than two years ago, the Supreme Court ended affirmative action in university admissions. Immediately, people on the left began looking for some kind of workaround, a way to game the system that wouldn't expose universities to lawsuits. The most obvious one involved the personal essays submitted with applications. The Washington Post reported at the time that admissions offices were thinking about using those essays as a way to maintain an unfair admissions system without admitting to doing so.

Before the ruling, admission leaders and lawyers around the country strategized for months over what would be permissible. Shannon R. Gundy, assistant vice president for enrollment management at the University of Maryland, which until now has considered race as a factor, told higher education leaders in April that universities should advise counselors on strategies for writing letters of recommendation and students about writing admission essays.

“Right now, students write about their soccer practice, they write about their grandmother dying,” Gundy said. “They write about the things that are personal to them. They don’t write about their trials and tribulations, they don’t write about the challenges that they’ve had to experience, and they don’t know how to and they don’t want to. We’re going to have to educate students in how to do that.”

In light of the new focus on personal essays as a way to keep some form of affirmative action alive, I wrote this post asking a simple question: What if students just lie? Because it seemed to me that some schools were creating an incentive structure in which lying would pay off.

Some students are probably already lying or exaggerating in order to create a compelling essay. But in the past, students still had to submit their SAT scores and grades, and those numbers mattered a lot. Only the students with top grades and scores had a chance at Harvard, regardless of what story they had to tell. Maybe a great essay could give them an advantage, but it wasn't enough on its own to get them in.

But if test scores are increasingly abandoned and the importance of grades is discounted in an effort to include more minority students, what's left as the core of the application is a bunch of essays written (perhaps?) by students who know their best chance at acceptance is a story about their struggles with racism or some other -ism.

Of course there are students who may genuinely have struggled with racism. I’m not denying or discounting that. I’m just pointing out that, with the importance of the essay being elevated, we create a major incentive for some people to just…make something up.

What didn't occur to me in mid-2023 was just how easy AI would make it to cheat. ChatGPT was first released at the end of 2022, so it was still pretty new back then. But now there is plenty of evidence that students at every level are using ChatGPT all the time. One embarrassing example just happened at the University of Illinois Urbana-Champaign. The story involves students, mostly freshmen, enrolled in an introductory data science course.

About 1,200 students take the course, which is divided into two sections that meet on Monday, Wednesday and Friday.

Attendance and participation in the lectures count for 4 percent of the final grade in the class, which is primarily taken by first-year students.

To track the engagement of the class, the professors created an application known as the Data Science Clicker that requires students to log in on their phones or computers and, when prompted by a QR code, answer a multiple-choice question in a certain amount of time, usually about 90 seconds.

But in early October, the professors said they began to grow suspicious when dozens of students who were absent from class were still answering the questions.

In fact, lots of students in the class had figured out that they could just cheat. They could fake their attendance by answering the Data Science Clicker questions without ever going to class. The two professors who teach the course decided to issue a warning to about 100 students. Soon they started receiving apologies. But there was a problem.

...there was one problem, a glaring one: They had not written the emails. Artificial intelligence had, according to the professors, Karle Flanagan and Wade Fagen-Ulmschneider, an academic pair known to their students and social media followers as the Data Science Duo.

The students got their comeuppance in a large lecture hall on Oct. 17, when the professors read aloud their identical, less-than-genuine apologies from a projector screen, video from that class showed. Busted.

Here's an Instagram video in which the two professors talk about receiving the identical AI-written apologies from about 80% of the students who'd been warned about cheating on their attendance.


The most disturbing part of this is that nothing happened to any of these students. Of course, the school has a student code of conduct that forbids cheating, but in this case the school decided these attempts to game the system aren't covered and won't be punished. Even the professors teaching the course are joking that the whole thing is a "life lesson."

Here's a question someone at the school might want to look into: How many of the students who used AI to write their apologies also used AI to write their admissions essays just a few months earlier? My guess is it's close to 100%. In fact, I'd bet a significant percentage of the students who weren't caught cheating did the same.

So I'm going to renew my objection to making personal essays the centerpiece of college applications at a time when they can be completely faked in a matter of minutes. Not only can AI do all the work, but for all we know the sob stories these students are telling are also completely fabricated. How would the admissions office know? Who is going to tell them that a touching story about some kid overcoming personal adversity was a lie?

In a system where cheating is both very important to advancement and very easy to pull off, relying on students to police themselves is nothing short of insane. As the story above shows, we're already much further down this road than most people think. Cheating isn't something a handful of bad students do anymore; it's becoming commonplace.

And the worst part is that the more commonplace this gets, the higher the penalty is for not cheating. Why? Because writing good essays is hard. An 18-year-old who doesn't cheat is much more likely to turn in something real but imperfect. ChatGPT may not be artificial general intelligence yet, but it can probably write a neater, more polished essay than the average 17-year-old. And it can certainly do it a lot faster. Non-cheaters have to plan ahead and decide where to use their time and effort. Cheaters don't have to worry about running out of time. They can whip up that essay the morning it's due without doing any reading or writing.

What happens a few years from now when all of the "smartest people in the room" turn out to be middling sociopaths who have no compunction about cheating? I know there are people who are worried about what AI is going to do to all of us as a society. I just don't think they realize how far gone we already are.
