
5 ways districts can center human connection in their AI strategy

Generative AI can be a game-changer in education, but schools still need to safeguard human connection, writes Julia Freeland Fisher.



(Image: A colorful, computerized ray of light passing through a computer monitor. Pixabay)

As more generative AI tools hit the market this year, schools will be working to craft policies that keep up with the technology. But in that rush, there’s a risk of missing the forest for the trees. How schools use this new technology is critical, but so is the impact that new technologies could have on students’ offline lives.

As AI becomes increasingly capable of simulating empathy, attunement, personalized support, and on-demand advice, many of the attributes we historically held to be inherently human are no longer uniquely so. For example, researchers have already found that well-trained bots can alleviate loneliness on par with human connection and outperform some doctors in their bedside manner.

The same shift may be starting in education. In a report out this month, Anna Arsenault and I analyzed navigation and guidance tools emerging in the age of AI. Among our findings? Some providers are building bots that take on more emotional, motivational, and esteem-support roles. At the same time, we heard that districts and colleges aren’t demanding tools focused on scaling human connection. These technological capabilities and market incentives make for a perfect storm: as the tech gets better and better at performing human-like tasks, the risk grows that students’ social needs are met by bots rather than humans.

With that in mind, districts’ AI policies and strategies, particularly when it comes to the use of chatbots, need to take human connection into account. Here are five considerations for leaders:

  1. Reinvest saved resources back into relationships. In many cases, AI promises tempting efficiencies where human-driven processes and interventions have been time- and money-intensive. While efficiencies typically replace human costs, some institutions are pouring those resources back into human connection. For example, Georgia State University, which uses Mainstay chatbots to increase student persistence, put revenue gained from enrollment increases toward hiring more staff for student support. “Some people think we’re trading technology and getting rid of staff members,” said Tim Renick, head of the university’s National Center for Student Success. “Our advisor ratio 10 years ago when we didn’t have the technology was, in some cases, a thousand students to every academic advisor. Now we’re down to 350 to one. …The technology has allowed us to hold on to students, which means holding on to tuition dollars which allows us to plow more resources into hiring more people.”
  2. Invest in tech that connects. Not all schools, especially K-12 schools, have the luxury of recouping dollars to put back into staff. But even with those constraints, schools can still take a human-first approach to adopting new technologies. In fact, some tools are expanding schools’ pools of support by recruiting and training mentors, coaches, and experts to supplement scarce in-house staff. For example, Backrs, Career Village, College Advising Corps, and Let’s Get Ready are all using AI to connect students to additional human supports.
  3. Ensure access to #human support. Some providers have taken pains to ensure that anytime a student is chatting with a bot, a human is just one click away. A number of tools let students interacting with a chatbot type “#human” to be routed automatically to a human advisor or coach. Advisors can also reinforce that choice. “At the end of my meeting I always say, ‘You can always chat [with] Blu, but if Blu makes a mistake or you want to talk to a real person, [we’re] behind the chat and we’ll be able to answer you,’” said Maria Francisco, an advisor with the nonprofit Bottom Line. This not only gives students agency to choose the modality of support they prefer but also requires systems to properly staff and support those human connections.
  4. Escalate based on avoidant and over-reliant behaviors, not just acute emergencies. Districts using chatbots and the companies building them often take pains to ensure that trigger words suggesting a student may intend to harm themselves or someone else immediately raise a red flag to an adult. But there’s a yellow flag emerging in the space that schools would be wise to pay more attention to: students preferring to interact with bots rather than humans and using chatbots to avoid human interaction outright. That’s a trickier line to draw, but schools could implement escalation protocols when students appear to be forming emotional bonds with bots, or cap how long chat sessions can run; a rough sketch of how this kind of routing and flagging might work appears after this list.
  5. Renew district-wide commitments to connection. As AI technology rapidly advances, what we used to deem uniquely human is quickly becoming less so. Given that trajectory, the question the field will face in the coming years is not whether bots should lend students various vital forms of support, but whether our schools stand for the value that everyone deserves human help and connection as well. In the past, that commitment may have been implicit, since schools are social hubs in their communities. But as bots in both consumer and edtech applications take on more and more social support roles, schools and their stakeholders can no longer take human connection as a given. “I think if everybody is on board with AI doing the mundane things in life, we’ll actually have time to build relationships,” said Tiffany Green, Founder and CEO of the nonprofit Uprooted Academy. “It will only work if everybody from a systems level is on board and saying we are all willing to step away from the control of the mundane of the knowledge piece and work on doing the relational.” In other words, a strong vision, backed by metrics, that schools will safeguard, deepen, and diversify students’ relationships provides a critical counterpoint to AI.
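To make the escalation ideas in items 3 and 4 concrete, here is a minimal sketch, in Python, of how a district’s chatbot wrapper might route messages. The names, keyword lists, and thresholds (route_message, StudentSession, MAX_BOT_ONLY_SESSIONS, and so on) are hypothetical illustrations, not features of any tool named above; a real deployment would set them with counselors and follow the district’s crisis-response policy.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical keyword lists and thresholds -- districts would tune these
# with counselors and align them with existing crisis-response protocols.
CRISIS_KEYWORDS = {"hurt myself", "kill myself", "hurt someone"}
HANDOFF_COMMAND = "#human"
MAX_SESSION_MINUTES = 30      # cap on a single bot conversation
MAX_BOT_ONLY_SESSIONS = 10    # sessions in a row with no human contact

@dataclass
class StudentSession:
    student_id: str
    started_at: datetime
    bot_only_streak: int = 0  # consecutive sessions with no human handoff

def route_message(session: StudentSession, message: str, now: datetime) -> str:
    """Decide whether a message stays with the bot or escalates to a person."""
    text = message.lower()

    # Red flag: possible harm to self or others -> notify an adult immediately.
    if any(phrase in text for phrase in CRISIS_KEYWORDS):
        return "RED_FLAG: notify counselor immediately"

    # Student opt-out: typing the handoff command routes to a human advisor.
    if HANDOFF_COMMAND in text:
        session.bot_only_streak = 0
        return "HANDOFF: connect to human advisor"

    # Yellow flag: signs of over-reliance on the bot, such as very long
    # sessions or many sessions in a row without any human contact.
    session_length = now - session.started_at
    if (session_length > timedelta(minutes=MAX_SESSION_MINUTES)
            or session.bot_only_streak >= MAX_BOT_ONLY_SESSIONS):
        return "YELLOW_FLAG: prompt advisor check-in"

    return "CONTINUE: bot reply"

# Example: a student who has leaned on the bot for many sessions in a row.
if __name__ == "__main__":
    session = StudentSession("s123", datetime(2024, 5, 1, 9, 0), bot_only_streak=12)
    print(route_message(session, "Can you help me pick classes?",
                        datetime(2024, 5, 1, 9, 5)))
    # -> YELLOW_FLAG: prompt advisor check-in
```

The point is less the specific thresholds than that the red-flag, “#human” handoff, and yellow-flag paths are explicit, reviewable decisions a district makes, rather than defaults left to a vendor.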

As schools strive to reach each and every student, AI could be a game-changer. But as GenAI begins to emulate human connection in impressive and startling ways, schools must be aware of the social implications. If schools safeguard connection in these ways, students can have the best of both worlds: a technology that extends human potential and a connected community to live into that potential together.

Opinions expressed by SmartBrief contributors are their own.


