
What does academic integrity look like amid AI?

When considering academic integrity, schools should also consider how AI is increasingly used outside the classroom and in the workforce, so that students gain the skills they will need in the future, said Stanford University professor Victor Lee during a recent webcast discussion with ISTE+ASCD CEO Richard Culatta.


[Image: Student using ChatGPT. Photo by Matheus Bertelli, Pexels]

A big myth circulating about the relationship between AI and academic integrity is the idea that the emergence of generative AI tools has opened a “Pandora’s box” of student cheating, said Victor Lee, associate professor at the Graduate School of Education at Stanford University, during a recent ISTE+ASCD webinar. Lee, who is also the faculty lead for the Stanford Accelerator for Learning’s AI and Education initiative, said research shows that cheating — reported for decades among as many as 60% to 80% of students — has not increased with the emergence of AI.

More clarity is needed, however, when it comes to identifying acceptable uses of AI for students, Lee said. Students may be using AI tools to help them study, to help explain concepts they don’t understand or to get feedback on assignments before they are turned in to a teacher. 

Acknowledging that cheating is a longstanding problem and that AI is not to blame, Richard Culatta, CEO of ISTE+ASCD, suggested that schools and educators address some of its key causes by:

  • Creating assessments that are meaningful and relevant to students.
  • Engaging students in the creation of school and class norms around academic integrity.
  • Giving students agency around how they demonstrate knowledge. 

Lee noted that when setting academic integrity policies, schools should consider how AI is increasingly used outside of school and in the workforce. When students see these uses but are not learning how to use the tools in school, there is a disconnect.

If assessments are one piece of evidence in an argument about how much students know, then having students demonstrate that they know when and how to use AI tools intelligently could be another, Lee said.

AI concerns in the humanities vs. STEM

A comparison of the emergence of AI in humanities classrooms with the integration of calculators in math instruction suggests that such tools can help educators and students go deeper with learning, Culatta noted. When calculators are used for basic addition and subtraction, teachers have more time to focus on complex topics. Lee stated that the research shows humanities teachers may have more concerns around AI when it comes to student writing, but STEM education is also affected and STEM educators will also need to think about teaching about and with AI tools.  

Establishing norms around academic integrity

It is critical to involve students in setting norms around academic integrity, Culatta said.

Today’s students are busier than ever, juggling sports, extracurriculars and part-time jobs. Faced with an assignment that doesn’t feel relevant, or one that is particularly high stakes, a student might not see the difference between getting help with grammar and writing from an in-person tutor (allowed) and an AI tool (not allowed), especially when the latter is more readily available. A guideline for AI use could require disclosure or transparency around where and when such tools are used, Culatta said.

Critical thinking and AI literacy

When teachers and schools consider policies around AI use, they might also consider allowing its use for the type of tasks for which students might be required to use it in the future, Lee said, noting that critical thinking is still essential for students in reviewing anything that AI produces. Students should be learning what AI can do and what its limitations are, Lee said. 

The dos and don’ts of student AI use

When establishing norms, Culatta suggested that educators look at their overall rules around technology use and focus on the ways students should be using AI and technology rather than on prohibitions. He shared positive examples such as: “We expect you to use technology to fact-check. We expect you to use technology to bring new ideas to class.”

Culatta also offered an analogy for younger students, likening AI to a dump truck and students’ brains to sports cars: each is the right choice in different contexts. Applied to an example assignment, students can be encouraged to use the dump truck for the heavy lifting of style and formatting, while reserving the sports car for the more complex tasks that are critical to the project.

Other examples of effective AI use in schools discussed in the webinar included honing debate skills and academic arguments with chatbots and using AI as a tutor to quiz students on a topic. AI tools are also increasingly being used to create visual representations of the concepts students are learning, Culatta noted.

AI in assessments

When creating assessments, striving to create those that are AI-proof might be taking the wrong approach, according to Culatta.

“The goal should instead be to create authentic assessments, which AI can help create,” Culatta said. He recommended plugging a standard into AI and asking for suggestions on assessing the standard. Using AI for brainstorming lets teachers consider ideas that would work best for their students and their goals. 

This tactic also allows the tool to contribute rather than drive a project, Lee said. He gave the example of encouraging students to use AI in the first phase of a project, thus allowing them to progress more rapidly to the later phases of the project. 

Time and space for learning

In closing, Culatta called on decision-makers to ensure that teachers have the time, space and resources to learn and experiment with generative AI. “Time and space to fail and time and space to reflect on those failures and share and grow,” he said.