
Detecting when students use AI for homework is this school year’s newest challenge. Here’s how one Utah district is dealing with it.

Provo School District, like other districts, banned ChatGPT from school-issued laptops. Now it is considering a different approach.

As Utah kicks off a new school year, educators are less stressed about students resorting to “the dog ate my homework” excuses — their more pressing worry is that a chatbot did the homework.

That scenario is one of many concerns driving the discussions as the Provo School District begins writing a new policy on artificial intelligence for its students.

“It’s not sensible to send home a writing assignment,” said Suzy Cox, director of innovative learning, “and expect that the writing that comes back was crafted by that individual student without the help of AI.”

The concern is primarily around generative AI, which can create original articles, images, music and more by learning from existing data. ChatGPT is the most widely used generative AI platform. With a single prompt, it can produce an essay on the War of 1812, solve a complex math problem or write a Shakespearean-style play within seconds.

“AI is not a fad,” Cox said. “This is not a technology that’s going away in a couple of months. This is transformative.”

She compared the influence of AI on society to the impact of the tractor. “Once you have the tractor, you’re not going back to the horse and the hoe,” Cox said. “You’re going to stick with that tractor, and that’s where we are right now.”

Since ChatGPT’s release last year, educators haven’t had much time to tackle questions like whether to allow its use in the classroom or how the tool affects student learning and creativity.

Many districts, Provo included, initially responded by banning the technology altogether — at least on school-issued laptops.

However, conversations are evolving as local and state officials recognize the need to adapt educational approaches to prepare students for careers that will inevitably involve AI.

“Our students have got to learn how to use this stuff,” Cox said. “They have to understand how it works, how it’s created, have opportunities to create it themselves, and along with that, how to use it ethically.”

Despite this need, few schools have established policies regarding AI use. A global survey by the United Nations Educational, Scientific and Cultural Organization found that fewer than 10% of the schools and universities surveyed had formal guidance on generative AI applications.

As the Provo School District looks to develop its own policy, officials are considering factors far beyond issues like cheating on homework, such as student privacy.

“Ultimately, our use has to be intentional, safe and equitable,” Cox told school board members last week. “So those are the principles that are driving what we’re talking about now.”

Student privacy

Current AI platforms available to students are not bound by data privacy agreements with the state or school districts, Cox said.

But schools and school districts must comply with the federal Family Educational Rights and Privacy Act, or FERPA, which protects the privacy of student records.

This raises legal issues and prevents teachers from requiring students to use AI for classwork, Cox said, or even allowing them to experiment with it.

Teachers can play with it in front of students, Cox explained, to show them how it works, but they can’t let students use it themselves if parents haven’t signed off.

Equity between students

While parents can individually give permission for their children to use AI programs on devices they or their family own, this leads to inequity among students, Cox said. Those with the means to access multiple devices can easily get around restrictions on school devices, and they will, but others may not have the same opportunities.

“Without being naive that students are going to use it,” asked board member Megan Van Wagenen, “how do we then safeguard against the equity issue?”

“I don’t know,” Cox responded. “It’s really so variable. Because we have a lot of parents who don’t have a lot of awareness around AI, and so their kids may be using it completely without their awareness. We have other parents who are really on top of it.”

It’s something the district will have to continue to navigate, she said, and educate parents about.

Detecting when students use AI

While software exists that can determine whether work was produced with the help of AI, it’s not particularly effective.

Plagiarism is easier to detect because copied text can be traced to an existing source. Content generated by AI, by contrast, is original and has no source to match.

“Even ChatGPT is getting to the point where it can’t recognize its own work,” Cox said. “AI is changing so quickly that the software can’t keep up with it.”

That leaves teachers in a position where it’s the student’s word against a report claiming it detected AI.

“There’s really no way that we can prove that one way or another,” Cox said.

Despite the limitations, Cox recommended the board purchase AI/plagiarism detection software, citing Copyleaks as a possible option. On its website, the company touts itself as the only “enterprise AI detection solution” and promises 99.1% accuracy.

How can teachers use AI?

As the Provo District mulls over the ethical use of AI for students, it also recognizes potential benefits for teachers.

“It has to be part of our instruction,” Cox said. “We as administrators, teachers and staff can and should be using it to increase our efficiency and our effectiveness and to improve our students’ experiences in our classrooms.”

AI can be used to assist with day-to-day tasks and offer ideas for lesson plans. Part of using AI as an instructional tool, she said, is recognizing its permanent impact on the practice of teaching, also referred to as pedagogy.

“We’re going to have to change our pedagogy,” Cox said.

If, for example, schools continue to prioritize the quality of writing a student can produce on their own, then how teachers approach writing must change. That might mean more writing done in the classroom, changes to the types of writing students do and closer monitoring of their work, Cox said.

“[AI] is affecting not just writing,” she said. “It’s in art, it’s in everything … It’s permeating multiple fields.”

And it will continue to, in ways both expected and unexpected.