
#NRPAConference: How to make your program surveys smarter

Customer surveys are usually flawed, but they don't have to be. The 2019 NRPA Annual Conference offers some tips.



Surveys are a must-have for organizations that offer events and activities, including parks and recreation departments and other government agencies. You want to know what people liked, what they didn’t like and what they wish could be added. Resources are finite, so these findings can help organizations optimize their offerings.

But have you considered that your results could be warped by the types of questions you ask, the available responses and even the ordering of your questions?

Most of us haven’t thought enough about this, said Ron Vine, who presented 10 tips for better surveys on Tuesday at the 2019 NRPA Annual Conference in Baltimore, Md. Vine has decades of experience in parks and rec and as a consultant, and he offered quick tips from his work with 10 local parks and rec agencies.

Here are some of the highlights that can help with your customer and user surveys and programming, regardless of your organization type.

Take a district approach, not a program approach

In many agencies, every program formulates its own survey without talking to anyone else about it. That isn’t inherently bad; every program has specific questions it needs to ask. But when you have 10 programs asking the same question eight different ways, Vine said, you can’t compare the results.

Consistent design of shared questions is essential for getting comparable data. Vine focused on programs within a district communicating and collaborating on question design, but the approach applies within a single program, too. For instance, if you offer a yoga class every quarter and aren’t thinking about consistent question design, you won’t be able to compare results against previous classes, much less against the rest of your district’s offerings.
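To make that concrete, here is a minimal sketch in Python, using entirely hypothetical program names and data, of what consistent design buys you: when every program records the same satisfaction question on the same 1-to-5 scale, the results can be summarized and compared directly.

```python
from statistics import mean

# Hypothetical responses to one shared question -- "How satisfied were you
# with this program?" -- asked identically, on the same 1-5 scale, by
# different programs (and quarters) in the same district.
responses = {
    "yoga_q1": [4, 5, 3, 4, 4],
    "yoga_q2": [5, 4, 4, 5, 3],
    "swim_lessons": [3, 3, 4, 2, 3],
}

# Because the question and scale are identical everywhere, the averages
# are directly comparable across programs and across quarters.
for program, scores in responses.items():
    print(f"{program}: mean satisfaction {mean(scores):.2f}")
```

If one program had used a 1-10 scale or reworded the question, none of those numbers could be lined up side by side.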

Good surveys also distinguish between “important” and “interesting” questions. An important question gets to the heart of your offerings, he says, such as “Which program spaces did you use when you came here?” with follow-up questions that get at satisfaction levels.

Be clear and consistent

Clarity and consistency were behind almost all of Vine’s examples of crucial survey tactics that are simple but easy to get wrong. For instance, you need to think through:

  • Is “neutral” an answer? Vine feels either including or excluding it is fine, but don’t mix and match within a survey. 
  • “Not applicable”? Consistency matters here, Vine says, and asking whether a question applies at all helps you see how much weight to give the answers. He gave the example of a pool where much of the response was “dissatisfied,” but it turned out only 10% of respondents had used the pool (see the sketch after this list). 
  • When to use yes/no questions. People like these types of questions, but Vine recommends against them unless the information is strictly factual. For opinions, you’ll get better information with a different approach. His example: Rather than ask, “Was your instructor on time, yes or no?” ask whether the instructor was on time “always, mostly, sometimes, never.”
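Vine’s pool example is easy to reproduce. Below is a minimal sketch, with invented counts and labels, of separating “not applicable” answers from real ones before summarizing satisfaction, so a loud minority doesn’t read as a district-wide problem.

```python
from collections import Counter

# Hypothetical pool-satisfaction answers: "n/a" means the respondent
# never used the pool. The counts are invented for illustration.
pool_answers = (
    ["dissatisfied"] * 7 + ["satisfied"] * 3  # the 10% who used the pool
    + ["n/a"] * 90                            # everyone else
)

applicable = [a for a in pool_answers if a != "n/a"]
share_applicable = len(applicable) / len(pool_answers)

# Report satisfaction only among the people the question applied to,
# alongside how many that is: 70% dissatisfied sounds alarming until
# you see it reflects just 10% of respondents.
print(f"Question applied to {share_applicable:.0%} of respondents")
print(Counter(applicable))
```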

Think about the big picture with:

  • Question ordering. Just as a book needs a logical sequence, so does a survey. Poor ordering can confuse respondents or reduce the completion rate. Vine’s rule of thumb: go “macro to micro” with your questions.
  • Limiting open-ended questions. Too many open-ended questions make a survey look daunting, Vine says, and people won’t fill them out. An exception is at the end of a survey, perhaps following a question gauging satisfaction level. For example, he says, you might ask people who were somewhat satisfied, “What could we do to improve that?” (see the short sketch after this list).
  • Focusing on the program, but also zooming out. He talked about “core” questions that gather information useful to the entire agency, such as which facilities respondents used besides the program’s own. Design these questions carefully and consistently so the results are useful and comparable.
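The open-ended follow-up Vine describes in the second item above amounts to simple branching logic. A minimal sketch, with hypothetical answer labels:

```python
def follow_up(satisfaction):
    # Hypothetical branching: only respondents who weren't fully satisfied
    # see the single open-ended prompt, and it comes at the end of the survey.
    if satisfaction in ("somewhat satisfied", "somewhat dissatisfied",
                        "very dissatisfied"):
        return "What could we do to improve that?"
    return None

print(follow_up("somewhat satisfied"))  # What could we do to improve that?
print(follow_up("very satisfied"))      # None
```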

 

James daSilva is the longtime editor of SmartBrief’s leadership newsletter and blog content. Contact him at @James_daSilva or by email.

If you enjoyed this article, sign up for NRPA SmartBrief and SmartBrief’s email for city and county management, among SmartBrief’s more than 200 industry-focused newsletters.