By Samantha Grant
By now, we are all convinced of the importance of doing evaluation of our programs. I hope we've all begun to collect data to inform our stakeholders and ourselves about how our programs are doing. I have blogged about practical evaluation in youth programs, and the theme of evaluation has been echoed by others in their posts.
Let's assume that you are collecting and analyzing data about your program -- what next? I argue that you must put in as much effort in communicating data as you did in collecting it.
Before making choices about how to package your data, think about:
- What data do you have?
- Who is the target audience for the data?
- What do you want your target audience to know?
Evergreen says that we have a bad habit of making our communication of data boring: "The disconnect lies between our desire to have our findings used and our methods of presenting them." Are you boring your audience with data?
In youth work we have the tendency to be so pleased that we've conducted evaluations that we neglect to think about use and communication. What good are the data if we can't communicate them in a compelling manner? How can we best create communications with users in mind?
Here are some ways to create more compelling communications of data. Compare them with what you do:
- Jot down the key messages from your evaluation. Build your presentation around these. Think about how you can make these 2-3 ideas stick.
- Ask youth to help. Chances are they will be able to help you get your creative juices flowing. Plus the act of engaging others in discussing your communication methods has to help you break out of your presentation rut.
- For an oral presentation, follow the 10-minute rule: if you can't get your point across in 10 minutes, restructure what you're talking about; otherwise your audience will be snoozing.
- Think about creating two evaluation reports -- one with more depth for stakeholders who want the details and another short, 1-2 page summary that can be shared widely.
-- Samantha Grant, evaluation director
Hi Samantha,
This blog post makes me so happy! "I argue that you must put in as much effort in communicating data as you did in collecting it." I couldn't have said it better. So true. And in my experience, asking youth for their ideas about how to present it usually brings up more options, and much more creative options, than I could ever devise on my own. I'd love to see how your reporting looks after you try some of the strategies you suggested!
Stephanie
Thanks for your comment, Stephanie. I hope this post will encourage some people to jazz up their presentation of data. Right now I'm in the middle of a report, and although the funders just want a written document, I hope to make it meaningful and then spend time talking about the results with a team. We can then take that information and share it in creative ways. How about others: how do you balance the need for creativity with grant stipulations?
Sam, you have hit on a big practice dilemma in youth work. I think getting to the purpose of why we evaluate is also key. What if we reframed the purpose to include our natural curiosity? Could we answer "I wonder if..." or "I wonder why...?" questions as we start the evaluation process? We have moved to conducting evaluation primarily to please our funders, which is not always motivating. It then becomes a check-the-box, drudgery kind of task. Thinking of evaluation as a way of reflecting, learning and sharing what we are learning is far more inviting and useful.
Hi Deborah,
I appreciate that you are seeing the value of evaluation in building a reflective practice. If we always view evaluation reports as meeting funders' needs, we lose sight of the value that an evaluation can give: information that can support the program. The one thing I warn the groups I work with about is not to follow every "interesting" lead, because it often leads us to collect information that isn't useful. In reflecting about evaluation, if you think about what information you need, whether to improve your program or to show its effects, you are more likely to collect useful information that can be translated into communication pieces. Have others struggled with focusing on why to evaluate?
What are your thoughts on youth participatory evaluation? Do stakeholders and policy makers value this method? Do you have any additional resources that may shed light on this?
Thanks for this interesting topic.
Hi Brian,
I apologize for the late response. I see that technology isn’t helping me out because my first response didn’t make it through.
Some of the suggestions that I offered are based in a participatory approach. The key when involving participants, whether youth or adults, is to get feedback that makes an evaluation more grounded in the culture of the program and also increases buy-in during the process. In the field, there are mixed views on participatory evaluation. The biggest factor that weighs against this approach is the control that is lost when involving a team of participants. My suggestion is to look at what you value most in your current evaluation project: the ability to be responsive to the program or the ability to have a tight evaluation design. Sometimes one weighs more heavily than the other.
I would encourage you to check out resources from the American Evaluation Association on this topic. In addition, the Center for Youth Development published a research report on using youth as quality assessors. Although this wasn’t a fully participatory project, it is a nice example.
http://www.extension.umn.edu/youth/research/quality/docs/Minnesota-4-H-Quality-Improvement-Study.pdf
Thanks for your questions!
Samantha