As generative AI has become increasingly accessible to students, there’s been a lot of concern over how it will change education. At New England Innovation Academy, we turned a critical but open eye toward ChatGPT the moment it burst onto the scene. As a school centered on innovation and technology, we wanted to moderate our reaction as we worked to understand the pros, the cons and everything in between.
At NEIA, we approach everything from a human-centered design perspective. So, when we started conversations about generative AI, we did what we always do: look to the wisdom of our community. We spent time listening and iterating our response based on the ideas of our students, working to leverage the opportunities and understand the pitfalls of generative AI. Then, we spent much of the summer integrating it into our discussions and making the curriculum changes necessary to acknowledge that AI is part of our students’ toolboxes and that, quite frankly, using it well will be a valuable skill for them in the future.
Homework, in particular, has generated a lot of worry about how AI is being used to cheat. There’s no doubt: AI has made traditional homework assessments more challenging for teachers everywhere. But instead of taking an unyielding line, forbidding the use of AI and adding “run the work through an AI checker” to our list of assessments, we’ve continued to reflect and iterate with intention on how the existence of AI can help us grow as educators. In short: What are we asking students to do in assignments, why are we asking them to do it, and what skills are we prioritizing?
A fresh approach to homework
Since our founding three years ago, the team at NEIA has always taken a unique approach to homework (or, as we call it, learning extensions). Our intention is to allow our students to explore passions outside of their school day, with the understanding that our curriculum, projects and real-world learning are not confined to the classroom.
Our challenge to our staff has always been this: Keep learning extensions to a minimum (no more than 30 minutes a night in any given class) and avoid assigning them simply to bury students in work or to catch up on what was missed during class time. At the core is a strategy to ensure that the scope of what teachers plan fits within the time allotted for their class. There isn’t a no-homework policy written into our guidelines; we just don’t want our teachers to feel they must assign homework. Any learning extensions are designed to deepen work already done in class or to prepare for the next day, not to introduce new material. By approaching homework with thoughtful rigor instead of overwhelming amounts of busy work, we’re able to create a policy that allows students to pursue interests outside of the classroom.
AI’s impact on learning extensions
With our technology and innovation mission, our response to AI was likely more tempered than most. When students began using ChatGPT, we immediately understood that they’d need to be able to use this technology as adults and that integrating it into our curriculum, rather than resisting it, was essential. Thinking through our AI policy with our students, and using it responsibly and effectively, has been core to our work. But of course, this is more easily done in the classroom than it is when sending students home with assignments.
While we are using a human-centered design approach to create an AI policy with our entire school community, the emergence of AI almost immediately impacted how we teach and how we assign our learning extensions.
Instead of banning AI, we’re taking it into account when designing our lessons or learning extensions. For example, we are challenged to think about how we create learning extensions and assignments that exercise a student’s critical thinking while also allowing them to use the tools at their disposal. We’re actually bringing AI into our lessons to help build on student learning, like using it as a verifier after brainstorming ideas or as a kick-starter for future research.
When we design our assignments, we as educators are challenged to figure out how to create work that can’t simply be done with AI, and I’d argue that’s a positive development. For example, it’s difficult to get AI to follow a specific set of writing guidelines or rules, so even if a student uses AI to help generate an assignment, they’ll need to review it carefully and think critically about its contents before turning it in. Finally, when we use generative AI, we require our students to always cite the resources they are using.
Integrating responsible sourcing guidelines
Once educators accept that AI will be used on a given assignment, we gain the liberty to create guardrails around it and to teach students how to think critically about AI-generated work. We score our students on a number of competencies, and over the summer, we added the proper citation of sources to that list. AI is permissible, we have explained, as long as it’s properly credited.
This allows us to have open conversations about the use of AI as a source and its credibility, and to co-create our AI policies with the students. Properly citing and crediting research is more important than ever, as is understanding the source of your information or material. Our goal is not to restrict students from using AI, but rather to show them how to use it effectively and ethically.
The integration of AI into the classroom is stretching the imaginations of both teachers and students — but that stretch and challenge can be a good thing! It can help us create a curriculum that prepares students to be the innovators of the future and gives them a head start in leveraging new technology that they wouldn’t have if we banned or ignored it.
Ben Farrell is the assistant head of school and director of the upper school at New England Innovation Academy, a Massachusetts day and boarding school for grades six through 12 where students develop their passion and prepare for their best future.
Opinions expressed by SmartBrief contributors are their own.