
Brief programs can make a lasting impression. How can we measure that?

By Betsy Olson

While searching through childhood pictures, I came across photos of a clowning workshop I attended as an eight-year-old, and the strong, happy memories came flooding back: our tumbling presentation, my parents laughing in the audience. Those memories resonate with me as I prepare for an upcoming youth leadership presentation, and they have me thinking about how to evaluate brief programs.

Measuring the impressions such programs leave can be tricky. However, keeping a few considerations in mind can simplify the process of evaluating brief programs, defined here as those lasting fewer than eight hours.

Understand the potential impact

Start with a full understanding of the program’s potential. Stephanie Wilkerson and Carole Haden developed a model outlining the relationship between STEM (Science, Technology, Engineering and Math) out-of-school time program outcomes and program duration. The model uses the steps in the change process to isolate the outcomes you can expect from a program based on its duration. It builds on other researchers (Patton; Rubin; Reisman & Clegg) who have identified awareness, interest and attitudes as the outcomes that can occur during the initial stages, as people first consider a change.

Develop realistic measures

For an evaluation, this means that the measures we use should focus on participants’:
  • Awareness of program elements
  • Interest in taking action on the program topic
  • Attitudes toward the issues presented in the program

Before including increases in skills or knowledge as a part of the evaluation, consider the complexity of the skills your program is designed to build. Some audiences have the capacity to gain significant skills and knowledge in a two-hour program, but complex skills or knowledge of broad concepts can take longer to develop. Be cautious with the expectations for participants during short programs.

Choose a tool

After identifying what you want to measure, select an evaluation tool. The most common tool is a survey, and it’s popular because it’s easy to interpret, easy to report and familiar to most participants. If you use this tool, there are a few cautions:

  1. Don’t use surveys with children younger than eight years old. Developmentally, a survey is very difficult for this age group: children take the questions very literally, they are extremely suggestible and their attention span is limited.
  2. Be cautious when surveying youth younger than 11. They have a hard time interpreting statements about groups, such as “Children your age are creative,” and their answers will be strongly skewed toward their own point of view. Craft survey questions about each participant’s own experience, and avoid negatively worded statements for this age group, such as “I had never heard about the engineering process until this class.”
  3. Avoid double-barreled questions that ask about two things at once; for example, “This session was useful and informative.”
  4. Be thoughtful when selecting the Likert scale to use for your survey.

Surveys are common, so they can also be a bit boring. For brief programs, surveys can be very short because there is a limit to the amount of change or impact that you can measure. This brevity can open up the possibility of using interactive evaluation tools such as a building block bar chart.

Here’s how: Write one Likert scale response (a simple Likert scale is “strongly agree, agree, disagree, strongly disagree”) on each index card and display them on a table. Give each participant one block for each question. Post or read an evaluation question and ask participants to stack their blocks next to the card that describes their answer. Repeat for each question.
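
If you jot down the final block counts next to each card, even a few lines of code can turn them into a chart for your report. Below is a minimal sketch in Python; the questions and counts are hypothetical, made up for illustration, and the script simply tallies the blocks per response and prints a text bar chart.

    # Minimal sketch for summarizing a building-block bar chart activity.
    # The questions and block counts below are hypothetical examples.
    SCALE = ["strongly agree", "agree", "disagree", "strongly disagree"]

    # Blocks counted next to each index card, one dict per evaluation question.
    results = {
        "I learned something new about engineering.":
            {"strongly agree": 9, "agree": 5, "disagree": 2, "strongly disagree": 0},
        "I want to try an activity like this again.":
            {"strongly agree": 11, "agree": 4, "disagree": 1, "strongly disagree": 0},
    }

    for question, counts in results.items():
        print(question)
        for response in SCALE:
            n = counts.get(response, 0)
            # One "#" per block makes a quick text bar chart for the report.
            print(f"  {response:>17}: {'#' * n} ({n})")
        print()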

Other evaluation methods like focus groups, diaries or interviews take more time out of a brief program, but you might still use them for short programs that repeat. The Harvard Family Research Project offers some perspective on choosing tools.

Have you evaluated brief programs? How? Do they pose challenges for you in terms of outcome evaluation? Do you have any more tips to share?

-- Betsy Olson, Extension educator


Comments

  1. Great post. I have been doing a lot of STEM programming lately and have wondered how I can measure what I am doing. I find it interesting when you say to be cautious when surveying youth under 11. Besides watching how the questions are worded, how would you suggest capturing their input? Thank you for the great resources.

    1. Hi Krista,
      Great question. I’m glad you have been collecting outcomes from the youth in your STEM programming. The main concern when phrasing survey questions for youth ages 8-11 is to be sure the questions can be understood literally and use language common to that group. For example, if youth call your program 4-H Day Camp and the question refers to it as STEM Camp, youth in this age group may not make the connection. They are also more likely to answer questions the way they think you want them answered, so if you are really trying to get a read on their interpretation of the event, an interview might work better.
      For programs with a short duration, interviews can be hard to fit into the schedule but are a great way to collect feedback from youth. Select a small sample and sit down with them for a brief interview. This is a great way to collect formative evaluation data or information on how to improve your program.
      If you are collecting information on how the program impacted participants, surveys are still the easiest way to do that, I think. I recommend reading the survey over with a youth in the age group you are targeting and asking what they think the questions mean before you use the survey to collect information. That way you can word questions in a way that you know a young person will understand.
      Thanks, Betsy

  2. Excellent post, Betsy! I was going over some evaluations from a 4th-6th grade after-school session and was bummed by how the evaluations were coming back. I could see their excitement and engagement during the session, so I was struggling with why the evaluations were so bad...until I read your post. I wish I had known these tips last week so I could have used the block method you described. This is a post I will keep handy for future programs so I don’t have this frustration again!

    1. Hi Amy,
      I have had that same frustration. It is really difficult to capture energy, excitement and engagement with a paper evaluation. Sometimes an open-ended question, or a recorded discussion of that question, can capture the engagement you see in participants. A quote is better at communicating excitement than even the most impressive percentage. It can take a bit more work to collect quotes and responses to open-ended questions, but if they capture the energy of your participants, it is worth it. Let me know how the building block evaluation works for you!

      Thanks,
      Betsy

