I just finished teaching a month of classes for NSDF. They collect survey responses at the end of each class, which I think is a great idea. That is something I always planned to do when I was running Music City Motion (but never actually did), and I think it is a good idea for every event, with one caveat: make sure it’s a useful survey.
The results from the class I taught were mixed: most people had good reactions to the class, a few had stellar reactions, and a few were disappointed to varying degrees. That didn’t bother me much (though my partner was quite upset that we didn’t get rave reviews from everyone). What bothered me is that, because of the way the survey was formatted, I can’t use the results to learn or improve anything.
All I know from the survey is that a certain percentage of the students were “very satisfied” with the class, yet a few of those “very satisfied” students thought our teaching was confusing and disorganized (and I will admit, there were times when I was confusing and disorganized). We don’t even know whether it was me or my partner who was more confusing to the students (I’m guessing it was me, simply because I talked much more than she did).
If you are going to survey, which I recommend you do, survey well and prepare your survey carefully (beyond just making sure the spelling and grammar are correct). There are a few things you need to consider as you build the survey.
- Goals for the Survey – What do you want to accomplish? What do you hope to get from the survey results? Do you want to know what you should change? Do you just want an ego stroke? Before you make your survey, you need to know how you plan to use the results, and ask your questions from that vantage point. The more you want to get out of the survey, the more complex and deliberate your questioning needs to be.
- Audience – How many people will take the survey? This is important because you can’t use the same survey with a small group as with a large group. Small groups are more difficult to evaluate because one or two extreme opinions (positive or negative) will skew the results.
With a small group you have to ask much more open-ended questions (especially if you don’t want to give them a survey that is pages long) in order to get useful information. With a larger group you can get away with fewer open questions, as long as you ask specific questions that can give you what you’ve decided you want to know.
- Precise Questions – Don’t ask a bunch of generic questions unless you want generic answers. “Please rate your instructors” isn’t going to tell you much unless you follow up with questions about specific areas of performance. You could easily wind up with very high or very low ratings on very good or very bad instructors because one or two things dominated the opinions of the students. Unless you break down the rating, you can’t know why the instructors got that rating, and the instructors don’t learn anything useful about how their teaching was perceived.
- No Double Questions – Don’t lump questions together if each question could have a separate answer. “I thought the instructors were well organized and easy to understand” is a bad question because it has two parts: well organized, and easy to understand. Each of those, while related, is different and could easily have a different answer. When you write out your survey, look for these “and” clauses. If you’ve asked two questions, either break it into two separate questions or figure out which one you actually want to know and leave the other out.
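Hunting for those “and” clauses can even be semi-automated. Here is a minimal sketch in Python (the draft questions and the simple substring check are hypothetical examples; a flagged question still needs human judgment, since not every “and” joins two separate ideas):

```python
# Minimal sketch: flag draft survey questions that join two ideas with
# "and" so a human can review whether each one should be split in two.
# (The questions below are hypothetical examples.)
draft_questions = [
    "I thought the instructors were well organized and easy to understand",
    "The venue was easy to find",
    "The pace and the material were appropriate",
]

# A crude substring check; it will over-flag harmless uses of "and",
# so treat each hit as a prompt for review, not an automatic error.
flagged = [q for q in draft_questions if " and " in q.lower()]

for question in flagged:
    print("Review:", question)
```

Running this would flag the first and third questions for review, leaving it to you to decide whether they are genuinely double-barreled.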
- Test – For my real job, I do lots of testing: split testing, multivariate testing, etc. I have learned the value of testing everything I do that provides me feedback, so that the feedback I get is as useful as possible. It’s the old “measure twice, cut once” adage.
After you’ve built your survey, give it to a sample group (if you are going to survey at the end of something, I’d recommend asking for some volunteers midway through) and see if the responses you get are usable (you may also learn something you can improve upon before your event/class/etc. is over). If your results aren’t that helpful, you’ll be able to see pretty easily which questions are at fault. Then change them, and if you have time, test again.
- Dissect the Results – Don’t just tabulate results and report the percentage of responses to each question. Drop all the raw data into Excel and use filters to see how the answers to one question relate to the answers to another, and which open responses came with which satisfaction ratings. That is where the real value is found.
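The same dissection works in any tool that can filter and cross-tabulate. Here is a rough sketch using Python and pandas instead of Excel (the column names and the handful of responses are hypothetical examples, not real survey data):

```python
# Rough sketch of cross-filtering survey results with pandas.
# (Column names and responses below are hypothetical examples.)
import pandas as pd

# Raw survey data: one row per respondent.
responses = pd.DataFrame({
    "overall_satisfaction": ["very satisfied", "satisfied", "very satisfied",
                             "dissatisfied", "satisfied"],
    "instructor_clarity":   ["confusing", "clear", "clear",
                             "confusing", "clear"],
    "open_comment":         ["Pace was too fast", "", "Loved the drills",
                             "Hard to follow", "More review time please"],
})

# Cross-tabulate two questions to see how answers interact, e.g. how
# many "very satisfied" students still found the teaching confusing.
crosstab = pd.crosstab(responses["overall_satisfaction"],
                       responses["instructor_clarity"])
print(crosstab)

# Filter the open comments by satisfaction rating so you can read
# them in the context of how satisfied that respondent was overall.
comments = responses.loc[
    responses["overall_satisfaction"] == "very satisfied", "open_comment"
]
print(comments.tolist())
```

This is exactly the “very satisfied but confusing” pattern from my own survey: the headline percentage hides it, but a cross-tabulation surfaces it immediately.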
- Don’t Wait for Perfection – If you don’t have time to do all the above, go ahead and survey anyway. But promise yourself to evaluate and modify your survey in light of the results before you conduct your next one. It is often better to survey imperfectly than not to survey at all; just remember to take your results with a grain (or whole shaker) of salt.
There are many FREE resources on the web to help you build your survey. If you search, you could probably even build your survey just by copying and rewording some of the sample questions you’ll find out there.
If you survey well, your results will be far beyond the ego stroke that most surveys provide. You’ll learn what to change and how to change it. You’ll learn why something (or someone) is satisfactory to some and unsatisfactory to others. And that is when a survey becomes useful.