
ISTE+ASCD, day 1: Ethical AI, sunshine committees and chatting with Paul Revere

Ideas for culturally-relevant content, improving support for teachers and staff, navigating the tension between generative AI and instruction, and other lessons from ISTELive and ASCD Annual Show & Conference 2025.



Expo hall at ISTELive and ASCD Annual 2025

Photo credit: Kanoe Namahoe

Helping parents make friends with AI. Differentiated learning. Workforce culture. Ethics and AI.

The Henry B. Gonzalez Convention Center in San Antonio buzzed with activity Monday as educators engaged in sessions, exchanged ideas and checked out the latest technology wares at this year’s ISTELive and ASCD Annual Show & Conference.

Here are our takeaways from day one at the show.

Differentiated learning: Design toward the edges

At a Turbo Talk, Eric Carbaugh, a professor at James Madison University, outlined some of the tensions that arise where generative AI and differentiated instruction meet. One of these is making sure AI is not used without a metacognitive piece that helps students understand the why of what they are learning. “We don’t want to short-circuit that pathway to expertise,” Carbaugh said, noting that the same goes for teachers.

Carbaugh encouraged educators to think about creating classrooms that support differentiation by designing toward the edges. “Rather than aiming down the middle, thinking about how you’re designing outward to try to meet more kids where they are as often as possible. That’s really the goal if we’re aiming for maximum growth,” he said.

Potential uses of AI in differentiation include providing scaffolding experiences tied to students’ readiness to learn. AI can also be used effectively to adjust text complexity to meet students’ needs. “To me, this is one of the really big game changers for AI use,” he said.

Other ideas included using AI to develop choice-based activities or to provide feedback to students, Carbaugh said, cautioning that educators should ensure this use does not short-circuit what teachers know about their students’ needs. AI tools can also serve as brainstorming partners or help teachers proactively develop strategies to help students stretch past known sticking points, he said.

“Ultimately, we’re trying to live in that middle ground, where DI meets AI, where we understand why we need to do this, we understand what it looks like and recognize that AI is a tool. It doesn’t in itself differentiate – the teachers do that,” he said. 

Bridging gaps: Culturally-relevant content

Preserving the Indigenous languages of the Marianas — Chamorro and Carolinian — is a priority for the Commonwealth of the Northern Mariana Islands Public School System, said Riya Nathrani, instructional technology coach for CNMI PSS, during a panel discussion about practical AI implementation strategies moderated by ISTE+ASCD Senior Director of Innovative Learning Jessica Garner. 

“We want to ensure that our students know the languages, so that they are able to carry [them] on for future generations,” said Nathrani.

The challenge, though, was a lack of resources and materials to teach the languages effectively. The team turned to AI for help. It became “an idea bank, where they could get activities and lesson ideas and write stories, and then translate [them] into the languages,” said Nathrani. The project helped build a foundation from which teachers could create materials without having to start from scratch.

CNMI PSS teachers are also using AI to generate images of Pacific Islanders and create culturally-relevant materials. 

“[I]t’s hard to be what you cannot see,” said Nathrani. “[I]f you don’t really see yourselves reflected in that curriculum or in that role or in that leadership position, [you] won’t really aspire to do those things or to be in those roles.”

Nathrani gave the example of a science teacher who was doing a lesson on ocean biodiversity and wanted to highlight the oceans and coral reefs surrounding their islands. Unfortunately, the textbook did not include this information. The teacher used AI to create content and stories related to the Pacific Islands.

“[T]hat was really meaningful to our students,” said Nathrani. “[N]ow they could really see how it was so relevant to their lives and their surroundings.”

Bring on the sunshine!

Elyse Hahne, a K-5 life skills teacher in Texas’ Grapevine-Colleyville school district, suggested school leaders take steps to improve their workplace culture by creating a sunshine committee to help support and show gratitude for teachers and staff.

These committees can use surveys to gather ideas about staff interests and the ways in which they’d like to be supported. Ideas for events and activities can be found and shared in social media groups or through word of mouth, Hahne said. 

Whether through words of appreciation, gifts or acts of service, school leaders should be intentional about their approach and honor people’s preferences and cultures, Hahne said. They can also reach out to community partners to help make events and activities more affordable.

The value of showing kindness and improving the culture extends to students as well, Hahne said. “As leaders we get to model this, whether in the classroom or out of the classroom. The kids are watching and they want to see us being nice to each other, and they’ll reciprocate.” 

Schooling parents on AI

How do you help parents adjust to the presence of AI in their children’s learning?

“[Parents] just need to be aware that these are the tools that are expected to be used in the class,” said Alicia Discepola Mackall, supervisor of instructional technology at Ewing Public Schools, during the panel discussion with Garner. 

Mackall referenced different ways schools are helping parents get comfortable with AI, including hosting AI academies or classroom demonstrations. These tactics can go a long way in building knowledge and nurturing support.

“[T]o be honest, a lot of people don’t know what [AI] is. They don’t understand, right?” said Mackall. “So having teachers and students show parents what they’re doing with AI might shift perspective further.”

Sharing AI-use guidance resources with parents can help quell safety concerns, said Mackall. She also encouraged educators to show parents how they can use AI in their own daily lives. “Starting with meal planning, so that people can start to see the power of it and not be quite as afraid of it,” said Mackall. “[Make] it accessible to them.”

Demonstrate how AI tools can spark students’ curiosity and help them think and question along new lines, Mackall advised. She gave the example of a conversation she had with her daughter, a third-grader, who was using SchoolAI as part of a history lesson. Mackall’s daughter and her classmates were engaging in conversations with historical figures.

“She came home and [said], ‘Mom, did you know that there was a girl who actually did a longer ride than Paul Revere?’ I was like ‘Who told you that?’ and she said, ‘Paul Revere,’” Mackall recounted.

Using AI to deliver creative learning experiences like this helps learning stick. Parents want to support that, said Mackall. 

“As a parent, that’s exactly what I want my kid to be doing,” said Mackall. “I want them to be questioning. Even if a parent’s not parenting how we think they should be, they still want what’s best for their kids, right? So I think it’s our job to invite them in virtually and show them what we might be able to do with tools like this and thinking like this.”

Exploring ethics and AI

The emergence of AI has damaged the social contract between teachers and students, said university lecturer, author and consultant Laurel Aguilar-Kirchhoff at an innovator talk with teacher librarian and program director Katie McNamara. Aguilar-Kirchhoff shared a personal story of being accused of AI plagiarism by a professor in a graduate course. After she explained and provided evidence showing that she had used an AI tool not to plagiarize but to merge two documents, the instructor admitted she wasn’t keeping up with the technology but did not restore the points she had deducted from Aguilar-Kirchhoff’s grade. “Our contract here has been broken,” she said.

Concerns around ethics at the classroom level also include the privacy piece. “Every time a student uses AI for practice, for writing something, or whatever they’re doing, data is being collected about that learner,” Aguilar-Kirchhoff said. While schools and districts will mostly handle the vetting process, it’s important for educators to consider these implications as well and find out how the data is being stored and used when deciding to use a tool, McNamara advised.

To address concerns around bias in AI, educators can help students understand the algorithms being used and have them ask critical questions about what AI is producing. “We know that not only is it representing the biases in our society, in our world, but also it can perpetuate that because the AI outputs do impact societal problems,” Aguilar-Kirchhoff said.

But addressing these and other concerns about AI use does not mean avoiding it. “We have to prepare students for the future, critical thinking, digital literacies, digital citizenship and media literacy,” Aguilar-Kirchhoff said.

To access the benefits of AI in an ethical way, educators should consider their own practice and as lifelong learners ensure that they are building capacity and knowledge around AI, she said. And as with all edtech, they should be thinking about the specific tools they are using and why. “Because when we have that intentionality, you know it’s not just the next new thing,” she said.