I recently had the opportunity to meet, and learn from, Tom Guskey. Along with Guskey’s amazing work on assessment and grading — On Your Mark is a tremendously worthwhile read — he has also done phenomenal work regarding key components of effective professional learning. His book Evaluating Professional Development sets the stage for rethinking our design of workshops based on the evaluation measurements we use, and this text has proven helpful to me as our agency continues to strive to make the learning sessions we offer as valuable as possible.
Guskey’s evaluation methodology is based on five levels of evaluation, ranging from basic “temperature checks” to impact measures and a focus on student outcomes. As an agency that believes in collecting and using feedback regarding the decisions we make, we have generally done a respectable job across the board in addressing the first few levels. As I realized while listening to Tom Guskey, however, only a few of our programs have found effective ways to truly evaluate the upper levels focused on application and outcomes.
Upon reflecting on Guskey’s learning session, I’ve come to realize there are at least three steps we can take to broaden our evaluation methods to come closer to reaching all five benchmarks:
Tie our workshop evaluations to new workshop offerings.
In our curriculum department, we use workshop evaluations to collect feedback after every session, whether it is a multi-day institute or a one-day info session. We use these evaluations in a number of ways. Sometimes they serve purely as feedback for our facilitators; other times, we employ a bit of an “if you liked this, then you might like this” approach, reaching out to past participants to let them know about upcoming workshops tied to their potential areas of focus. What we’ve done regularly over the past few years, and must continue to do, is show that the evaluations we ask participants to complete are used to design new and more innovative learning sessions. This is important, particularly for the initial levels of Guskey’s professional development evaluation structure, where we are collecting general event information from participants: these workshop evaluations are the heart and soul of the data we collect, so they must be seen as valuable by those who complete them. After all, we won’t have the opportunity to collect impact data if our workshops aren’t meeting needs and learners aren’t attending. Since these evaluations are the tools we use most often, participants need to see their value before we can see how the learning opportunities we hold influence practice.
Visit district representatives to understand their culture, context and community for learning.
This can also be done at the building level. For us to best understand the learning needs of those we support, we must understand their cultures and communities, as well as the context for learning, on their turf. In the past, we’ve spent considerable time visiting with our district representatives in their schools, gaining insight into how we can help support them and asking questions about the work they are doing, hope to do and need to do. What is so fascinating, and perhaps not entirely surprising, is that we often learn the most from our colleagues when we are visiting with them in their spaces. Certainly, we learn much when meeting at our organization or in a neutral space; generally, though, people tend to be the most open when the environment is not only familiar, but their own. Since Guskey’s third level requires us to understand an organization’s foundation, these types of meetings tend to be very helpful for filling the gaps in our knowledge and designing learning opportunities that meet not only individual needs, but organizational ones as well.
Research application and outcomes in the future.
One area where we have room to grow is in asking past workshop participants about their application of learning since they joined us. To begin assessing this — and later, by connection, how their use of learning impacts student outcomes — we have to rethink the tools we use to collect participant information. Rather than rely on just a wrap-up workshop evaluation, we have to design a tool that we can use to check in with participants a number of months after the event. This would allow us to gauge not only what they learned, but also how they are using it. It doesn’t have to be lengthy; a few questions may be all that is needed. Here’s what we’re piloting now; I would appreciate hearing your feedback! Opportunities exist for us to grow this data collection, too. In the future, we might conduct interviews, engage in visitations for those participants who are interested or create focus groups of past attendees. For the time being, however, this survey will help us move slightly closer to levels four and five of Guskey’s professional development evaluation model.
One of the most important lessons I took from Guskey’s session was the importance of thinking of the levels as a scaffold. All early levels have to be in place if we are going to be capable of assessing impact and outcomes. Another key lesson was that each level is important. Though our ultimate goal is to see how educators are using what they have learned, and thereby to determine how student learning and practice have changed, we need to understand what participants have thought of the experience, what they have learned and their organizational needs and influences in order to provide the most well-rounded professional learning possible. How are you evaluating professional learning in your work? Please share all ideas, processes and thoughts!
Fred Ende (@fredende) is the assistant director of Curriculum and Instructional Services for Putnam/Northern Westchester BOCES in Yorktown Heights, N.Y. Fred blogs at www.fredende.blogspot.com, Edutopia, ASCD EDge and SmartBrief Education. His book, Professional Development That Sticks, is available from ASCD. Visit his website: www.fredende.com.