Time

Access

Familiarity

Comfort

When you want people to give you information, each of these can be an asset or a barrier. Together they represent one of the most exciting puzzles evaluators get to solve: gathering high-quality data while making participation easy, or even fun. Here are some ways to solve this puzzle:
  • When are your participants available? When possible, build evaluation activities into your program activities so that no one has to take extra time to share information. Alternatively, acknowledge the time investment by thanking participants and offering whatever token of appreciation you can afford.
  • What do you know about where participants are? Have forms or activities right where they are – in their homes, at your site, or during well-attended community activities.
  • How familiar are your participants with the issues you are asking about? Can they be expected to know the finer points of your programming, or do you need to give them some background information? Match questions to people’s actual knowledge, and when you aren’t sure whether they are familiar with a particular issue, provide them with a face-saving “out.” For example, we can assume nearly everyone with children in school is familiar with the school start and end times; they may be less familiar with the curriculum or activities. Ensure your questions don’t imply they should know something they may not.
  • What will make participants most comfortable? Talking as a group may be fantastic for some, petrifying for others. Similarly, some will like the privacy of an anonymous survey, while others want to make sure you know the opinions are theirs alone. Think through the setting, language, and style that will make people feel most comfortable.
An example: We worked with a program serving Native American students in northwestern Minnesota. The youth in the program had struggled with attendance, and the program was designed to increase their participation and success in school. Not all of the kids wanted to participate in the program. We determined that (1) there were few times when all of the kids were together; (2) they would be uncomfortable talking with an evaluator, whom they associated with the justice system; and (3) the kids were already sharing a lot of information through writing and storytelling. For our data sources, we used a youth survey, circulated by staff and submitted anonymously; staff members’ own observations; school attendance records; and the written stories of the youth (shared with permission). The result was a rich, comprehensive picture of the program’s results without overburdening the youth.