By Allison Riley, PhD, MSW, Senior Vice President, Programming and Evaluation at Girls on the Run International. Girls on the Run is a physical activity-based positive youth development program that inspires girls to be joyful, healthy, and confident using a fun, experience-based curriculum that creatively integrates running.
The Afterschool Alliance is pleased to present the seventh installment of our "Evaluating afterschool" blog series, which answers some of the common questions asked about program evaluation and highlights program evaluation best practices. Be sure to take a look at the first, second, third, fourth, fifth, and sixth posts of the series.
My two-year-old daughter and I like to take walks together when I get home from work. Whether we are headed to see the neighbor’s chickens or visit a friend, we always have some goal in mind when we walk out of the door, though my toddler typically doesn’t take the most direct path. Even if I try to rush her along so we can more quickly reach our destination, she is sure to pause when a good learning opportunity comes her way. When I follow my daughter’s lead, our walks are purposeful yet flexible, and I always learn more, too.
As it turns out, my daughter’s approach to a walk translates well to my workday world. As someone who’s spent my career evaluating youth programming, I have learned the importance of having a clear purpose and goals for a project while being flexible and responsive to information gathered during the evaluation process. Let’s look at a recent Girls on the Run study as an example.
Choosing a path
In 2013, Girls on the Run International embarked on a journey to identify evaluation measures that align with our curriculum in preparation for a rigorous external evaluation study of our program’s effectiveness.
When our preliminary study revealed inconsistencies in how the program was being delivered across our program sites nationwide, we had two choices: proceed with the external evaluation study as planned, or pause to address the delivery inconsistencies first.
While it was tempting to proceed as planned, we understood that without addressing these delivery inconsistencies, we would not be able to reach our ultimate program impact goal. So, although it wasn't in the original plan, we shifted our resources into a National Coach Training initiative designed to equip our network of more than 50,000 volunteers with the tools and resources they needed to provide a supportive climate and implement our curriculum. After developing and launching this strategy, we were able to redirect our focus back to the external study.
Making it happen
Here are a few strategies to help ensure your evaluations have a clear goal while remaining flexible and responsive to information you uncover along the way.
For us, being flexible and responsive was not the easy choice, but in the end, we were able to maximize program impact through a focus on program quality and then measure that impact with a rigorous, independent study. I suppose my daughter had the right approach all along: purposeful, yet flexible. We will make a great team on the next Take Your Daughter to Work Day.