Very few school districts were prepared for virtual learning at the scale we experienced last year. Many districts had laid some groundwork: Over the last few years, the number of districtwide 1:1 initiatives has more than doubled, but still fewer than half of districts had actually met that goal when the pandemic hit.
Like many districts, Chesapeake Public Schools in Virginia was delivering fully virtual learning while building the infrastructure necessary to manage the technology. The learning curve was steep.
When I came on board as the chief technology officer, my first task was to get a handle on the technology that had been purchased. What was being used? What was effective? I also needed to assess the interoperability of the data coming out of all that technology. What became clear was that after a year of scrambling to identify and implement technology on the fly, no one really had a complete picture of what we had and what was being used. Here’s how we got a handle on it all.
Auditing our capabilities
We decided to implement CatchOn, which could deliver the real-time data and learning analytics needed for critical volume and usage insights across the district. Its data-rich dashboard helped us work with building leaders and individual educators on what, how and why specific platforms and programs were used in the classrooms.
We spent about two months just tracking down all the programs, platforms, apps and all the contracts and licenses for each. The resulting reports were eye-opening. We identified purchases schools had made that had never even been delivered. We saw purchases coming in at the school level, department level, through our Title I department and from individual teachers.
The scope of the technology in use was vast. For many of these platforms, though, the adoption and usage data didn’t always align with licensing. For example, a principal may have purchased a platform intended for an entire department, but only a handful of staff members were using it.
Beginning conversations with data, not justification
The data analytics tool allowed us to approach conversations in a solutions-focused rather than tension-inducing way. Now we could say, “We have 500 licenses for this program, but only 200 are actively being used. What do you think we should do?” Instead of feeling the need to justify a past choice, the educator now homes in on the best step forward.
Sometimes it’s a simple adjustment. Sometimes we can dig further into the data to see who is and isn’t using a program and discuss whether to implement new or better training for late adopters.
Targeted outreach and PD
We’d been using a licensed version of Google Workspace for Education, so we maintained continuity for students and teachers by choosing Google Meet for synchronous learning. That meant getting a significant number of outliers using another video conferencing platform to make the shift. We sent out emails, and we held trainings. But every week, when we pulled up the usage data, we continued to see the other application pop up.
We got a bit more granular and discovered that the hold-outs were mostly middle-school educators. We went to them directly and asked about their challenges or concerns in switching to Google Meet. Then we built grade-level-specific training tailored to their unique use of video. We highlighted the feature sets that would be most valuable to them and the types of content they were delivering. Soon, we began to see greater adoption.
Many districtwide adoptions and standardizations can feel very top-down. Data can help us identify which groups are adopting a new program so we can use them as champions for conversations about their experiences with their peers who are slower to adopt. Those personal connections and conversations can ensure that staff feel excited, rather than reluctant, about the new technology. That is a key difference.
Teacher feedback on tech purchases
I recently sat in a meeting where a mathematics department was trying to select a supplemental curriculum resource. They wanted to identify a tool that could be used for maybe 15 minutes three times a week to address material students may have missed during remote learning. The tool needed to allow students to work independently, adaptively and at their own pace.
The teachers identified three tools based solely on name recognition. I looked up those platforms in our analytics tool, and we reviewed our usage data across elementary, middle and high school. None was being used. We reached out to educators the data identified as using similar platforms and asked about their experiences. Did they like the platform? Did students like the platform? Were they seeing growth? It was a great example of how we can identify effective platforms at the classroom level and then scale them up from there.
As technology director, I want to provide the training our educators need to use technology effectively. I do not want to provide training and force our teachers to use a platform or tool just because we bought it. This new method lets the data and application usage trends drive the change.