So far in this series, we’ve addressed data security, consent, and bad survey questions. Now we turn to a problem that isn’t specific to evaluation: over-promising and under-delivering. Two factors, however, make evaluation particularly vulnerable to this problem:
  • We rely on others to share information and data with us
  • The data we receive may not be structured the way we expect, requiring more manipulation than planned or leaving us unable to answer certain questions at all
These factors don’t excuse bad service, however. A client recently described a previous experience in which an evaluator promised to deliver findings that she, in turn, could present at a community gathering, only to tell her at the last minute that nothing was ready to share. Whenever possible, we try to avoid the over-promise/under-deliver problem by:
  • Clearly articulating what our clients are expecting and when. After a couple of experiences in which we weren’t clear what our clients wanted, and ultimately went through more iterations and drafts than either of us cared for, we created a tool to help us talk through expectations for each deliverable. While very simple (including questions about audience, tone, style and length), it helps ensure that we are in agreement about the final products.
  • Knowing as much as we can about available data before determining how it will be used. For example, we had a client whose information was spread across a wide variety of tables and documents, all of which informed one another. As we prepared a report outline for the client, we summarized which elements we had used and described where gaps remained in the information we had hoped for.
