Afterschool Snack, the afterschool blog. The latest research, resources, funding and policy on expanding quality afterschool and summer learning programs for children and youth. An Afterschool Alliance resource.
Recent Afterschool Snacks
JUL 3, 2017 | IN THE FIELD

Highlights from Policy Studies Associates’ afterschool report

By Elizabeth Tish

Policy Studies Associates (PSA) conducts research in education and youth development. This spring, PSA published a short report on afterschool program quality and effectiveness, reviewing more than 25 years of afterschool program evaluations it has completed.

The report further substantiates the benefits of afterschool for three groups of stakeholders: students, families, and schools. The full details are available in the report, which covers the following topics in depth:

Afterschool programs work for students

  1. Increase school attendance and ease transitions into high school
  2. Offer students project-based learning opportunities
  3. Improve state language and math assessment scores while developing teamwork skills and personal confidence

Afterschool programs work for families

  1. Provide safe spaces for enriching activities and academic support
  2. Make it easier for parents to keep their job
  3. Provide an option for parents to miss less work

Afterschool programs work for schools

  1. Enhance the effectiveness of the school and reinforce school-day curriculum
  2. Create a college-going and career-inspiring culture in the school
  3. Foster a welcoming school environment

Want to find out which evaluations these statements came from? Visit the PSA brief, Afterschool Program Quality and Effectiveness: 25 Years of Results.

learn more about: Evaluation and Data
JUN 6, 2017 | RESEARCH

Evaluating afterschool: Answering questions about quality

By Charlotte Steinecke

By Jocelyn Michelsen, Senior Research Associate at Public Profit, an Oakland, California-based evaluation consultancy focused on helping high-performing organizations do their best, data-driven work with children, youth, and families.

The Afterschool Alliance is pleased to present the fifth installment of our "Evaluating afterschool" blog series, which answers some of the common questions asked about program evaluation. Be sure to take a look at the first, second, third, and fourth posts of the series.

Raise your hand if this scenario sounds familiar: you keep up with new research on afterschool by reading articles and newsletters, following thought leaders, and attending conferences—but it is still hard to sort through all the information, let alone implement new strategies. Research often seems out of touch with the realities of programs on the ground, and while many anecdotal examples are offered, how-to guidelines are few and far between.

As an evaluator consulting with diverse afterschool programs across the San Francisco Bay Area and beyond, I frequently hear this frustration from program leaders. There is a real gap between the research and the steps that staff, leadership, and boards can take to build quality in their own programs. Additionally, it can be hard to sift through the research to get to the ‘why’—why implement these recommendations, why invest time and resources, why change?

learn more about: Evaluation and Data
APR 28, 2017 | RESEARCH

What you need to know about the GAO's afterschool report

By Jen Rinehart

The U.S. Government Accountability Office (GAO) released a report on 21st Century Community Learning Centers on April 26 highlighting the benefits of afterschool participation and calling on the U.S. Department of Education to update its performance measures and data collection. The report confirms that participation in afterschool programs improves student behavior and school attendance, and that the broad range of benefits from afterschool is most evident among students who attend their afterschool program for more than 60 days. The report also highlights the essential role that Community Learning Center grants play in helping afterschool programs leverage much-needed support from a range of community partners.

Afterschool community is committed to quality

Many afterschool providers have demonstrated their dedication to continuously improving their programs by adopting quality standards and utilizing continuous improvement tools. An array of program evaluations clearly demonstrates that quality programs are making a difference for children and youth. In fact, had the GAO selected a larger body of research on which to base its conclusions, including a wider array of state Community Learning Centers evaluations and other large studies of afterschool, its conclusions about program effectiveness would have been even stronger.

Widespread agreement that 21st CCLC performance measures need an update

In the years leading up to the reauthorization of the Elementary and Secondary Education Act (ESEA), we spent a great deal of time convening the afterschool field to gather input about the vision of 21st CCLC in ESEA reauthorization. In that process, it became clear that there is broad consensus from the field around the need for updated 21st CCLC performance measures and data collection. That consensus is echoed in the GAO report, which recommends broadening the measures to include classroom behavior, school day attendance, and engagement. Improved alignment between Community Learning Centers program objectives and performance measures will help afterschool programs more effectively demonstrate their role in supporting student success, which is essential for ongoing public support.

Technical assistance should expand

The GAO report also calls for the department to update and expand the technical assistance offered to grantees. That’s another change that the afterschool community pushed hard for—and won—in the reauthorization of ESEA. By implementing the changes called for in the reauthorization of 21st CCLC, the department can bring improvements to professional development, data collection, and program evaluation as early as the school year that begins this fall.

Continued federal investment is vital

More than anything, this new report underscores the need to continue the federal investment in quality afterschool programs, which keep kids safe, inspire them to learn, and help working families. The Trump administration should abandon its indefensible proposal to defund Community Learning Centers—which would take afterschool and summer learning programs away from 1.6 million kids, devastating low-income families and communities—and instead implement the GAO’s recommendations.

learn more about: Department of Education, Evaluations
APR 11, 2017 | IN THE FIELD

Evaluating afterschool: Find your best data-collection strategy

By Nikki Yamashiro

By the Y-USA Achievement Gap Programs Evaluation Team.

The Afterschool Alliance is pleased to present the fourth installment of our "Evaluating afterschool" blog series, which turns to program providers in the field to answer some of the common questions asked about program evaluation. Be sure to take a look at the first, second, and third posts of the series.

 
Photo courtesy of Lori Humphreys, VP of Child Care, YMCA of East Tennessee.

At YMCA of the USA, our Achievement Gap Programs evaluation team provides a comprehensive evaluation strategy, training, tools, and support to hundreds of local Ys doing the important work of delivering Achievement Gap Signature Programs to thousands of children. The Achievement Gap Afterschool Program has expanded to over 130 sites since it began in the 2012-2013 school year and is currently serving over 7,000 children across the nation.

Organization leadership, funders, and community partners are often eager to see the data that comes out of program evaluation, and it is not uncommon for organizations to need additional guidance and resources to start the data collection process. We'd like to share what we think are data collection essentials for this important and possibly overwhelming part of the evaluation process.

The first step is for the program’s primary stakeholders to define program goals and benchmarks. Identifying the questions that should be answered about the program helps to focus on what matters most for the program.

Before you start

  • Be organized: Develop a plan from start to finish before diving into the data collection process. How will data be collected? What tools will be used to collect data? Who will do the data collection, entry, and analyses? How will the data be reported? Include a general timeline for each activity in the plan (a minimal sketch follows this list).
  • Be realistic: As a data collection plan is developed, be realistic about the resources you can dedicate to the project and plan accordingly. When you can, simplify.
  • Be clear: All data collection processes should include a clear explanation for why the data is being collected and how the data will be protected and reported. Clarity of purpose ensures that staff, parents, and participants are fully informed on the program’s data collection practices.
  • Be concise: When developing tools to collect data, stay focused on only gathering essential information that relates to goals or the questions stakeholders have agreed upon. Collecting information that you don’t plan to use takes up precious time when creating data collection tools, when users fill out the tools, and when the data is analyzed.
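
For teams comfortable with a little scripting, a plan like this can even be written down as structured data, so that gaps are caught before collection begins. The sketch below is purely illustrative; the activities, tools, owners, and dates are hypothetical, and a spreadsheet or planning tool works just as well.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Activity:
    """One activity in a hypothetical data collection plan."""
    name: str    # what is being collected or produced
    tool: str    # how the data will be collected
    owner: str   # who does the collection, entry, or analysis
    start: date  # general timeline for the activity
    end: date

# A made-up plan covering collection through reporting.
plan = [
    Activity("Parent survey", "paper form", "site coordinator",
             date(2017, 9, 5), date(2017, 9, 29)),
    Activity("Survey data entry", "spreadsheet", "program assistant",
             date(2017, 10, 2), date(2017, 10, 13)),
    Activity("Analysis and report", "spreadsheet", "evaluation lead",
             date(2017, 10, 16), date(2017, 11, 3)),
]

# Catch gaps before collection starts: every activity needs an
# owner and a timeline that makes sense.
for activity in plan:
    if not activity.owner:
        print(f"No owner assigned: {activity.name}")
    if activity.end < activity.start:
        print(f"Timeline runs backward: {activity.name}")
```

However the plan is recorded, the point is the same: each activity has an owner and a window on the calendar before anyone hands out a survey.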
learn more about: Evaluations, Guest Blog
NOV 30, 2016 | STEM

New report: Documenting the impact of afterschool STEM

By Melissa Ballard

Afterschool programs support students’ success in STEM (science, technology, engineering and math) in a multitude of ways—by helping them become interested and engaged, develop tangible STEM skills, and begin to see themselves as potential contributors to the STEM enterprise. While afterschool programs across the country are working hard to measure the impact they’re having on youth, we know that program evaluation is no small task—requiring a professional evaluator, getting staff on board, and ensuring student and parent participation.

Our new report “The impact of afterschool STEM: Examples from the field” compiles some of the most telling studies on how afterschool STEM programs are engaging students. Fifteen afterschool programs—diverse in size, structure, and approach—shared their evaluation data with us, thereby adding to the growing evidence that afterschool programs are crucial partners in bolstering student success in STEM education.

Here's a sample of the impacts you can read about in the report:

  • After participation in Girlstart, a Texas afterschool program, girls perform better on the state science and math tests compared to non-participants. Further, participants demonstrate a continued interest in STEM—Girlstart girls enroll in advanced 6th and 7th grade science and math courses at significantly higher rates than non-participants and 89 percent want to return to Girlstart After School in the next school year.
  • Youth members of The Clubhouse Network report that they have learned how to use more technology (91 percent), are more confident using technology (88 percent), and use technology more often (84 percent) as a result of their Clubhouse experience. Almost 90 percent of youth in the Clubhouse’s Start Making! initiative felt they were better at solving hard problems, and had more skills to design, make or create projects.
  • After participating in Explore the Bay, an environmental and marine science afterschool program, 81 percent of students said that they were really interested in learning about plants and animals and 89 percent of students surveyed reported that they wanted to take better care of their environment.

To read about more impacts of afterschool STEM, see the full report.

learn more about: Evaluations, Science
NOV 1, 2016 | IN THE FIELD

Evaluating afterschool: Evaluation as a mission-driven investment

By Robert Abare

The Afterschool Alliance is pleased to present the second installment of our "Evaluating afterschool" blog series, which turns to program providers in the field to answer some of the common questions asked about program evaluation. Be sure to take a look at the first post of the series, which explores evaluation lessons from Dallas Afterschool.

This post is written by Jason Spector, senior research & evaluation manager for After-School All-Stars, a national afterschool program serving more than 70,000 low-income, at-risk students across 11 states and the District of Columbia.

The After-School All-Stars of South Florida celebrated Lights On Afterschool 2016 with the Miami Marlins.

I recently left a meeting thinking I’m no longer doing the job I was hired to do. But for a professional evaluator of afterschool programs, change is a good thing.

When I joined After-School All-Stars (ASAS) to launch our national evaluation department two and a half years ago, my primary goal was to measure and support ASAS’ outcomes as the organization entered into an expansion phase. While I currently maintain this responsibility, our national evaluation team is now focused on examining program quality as opposed to outcomes measurement. Why the change? Simply put, we realized our top priority was to boost our quality, because when we do, the impact and outcomes will follow. 

This type of shift is not an easy decision for a nonprofit to make. As nonprofits move toward more advanced outcomes measurement to satisfy increasingly savvy funders, leaders everywhere are faced with some critical questions:

  1. Should I deepen my organization’s investment in evaluation?
  2. What can I expect to receive in return?

These questions carry an assumption that an investment in evaluation is inherently not an investment in your organization’s mission and programs. Furthermore, many program leaders assume that evaluations must yield large positive outcomes in order to attract new funders and compensate for the “cost” of not putting dollars directly into program operations. But this logic fails to consider the many benefits evaluations afford organizations. 

learn more about: Evaluations, Guest Blog
SEP 28, 2016 | IN THE FIELD

Evaluating afterschool: Tips for getting started from Dallas Afterschool

By Robert Abare

Program evaluation can be an overwhelming and intimidating undertaking for afterschool program providers, who face questions ranging from where to start to what to do with evaluation results, and everything in between. To answer some of the common questions afterschool program providers raise about evaluation, and to make evaluation more approachable, the Afterschool Alliance has started a new blog series, "Evaluating afterschool," on program evaluation best practices. For this blog series, the Afterschool Alliance turns to program providers in the field who can offer tips and lessons learned on their evaluation journey.

The first blog of this series is written by Rachel Johns, the research and evaluation manager at Dallas Afterschool in Dallas, Texas. Dallas Afterschool promotes, expands, and improves the quality of afterschool and summer programs in low-income neighborhoods in our community.

This spring, Dallas Afterschool released findings from the 2014-2015 school year as part of an ongoing, engaged evaluation process. Our dynamic partnership with the Center on Research and Evaluation at Southern Methodist University has allowed us to explore questions about how to improve the quality of afterschool programs effectively and efficiently, and how the quality of an afterschool program might affect students in our context. As we enter our fourth year of this evaluation, we'd like to share some of what we have learned in the process.         

Considerations for practitioners

While an evaluation as extensive as Dallas Afterschool’s may not be practical for all organizations due to financial or human capacity constraints, there are many ways to get more out of any evaluation process.

  1. Clearly define the questions you want answered and circle back to them often. These questions are the guidepost for your evaluation and can help keep you focused on the pieces of data and the analyses that matter most. Evaluation becomes less useful when it lacks direction or tries to address too many questions.
  2. Plan for more time than you think you need. If you know what questions your evaluation is asking and what data needs to be collected to answer those questions, then you have a great start. Collecting your own data can make scheduling simple, but if you rely on colleagues to collect some of it, plan for an extra week buffer. Competing priorities can make data collection fall to the back burner, but good data collection is essential for a useful evaluation. Additionally, the amount of time it takes to clean that data to make it ready for analysis can be hard to estimate. When data is derived from many different sources or is collected inconsistently, you never know what you might find or need to correct.
  3. Regularly monitor your data to save yourself a headache later (see the sketch after this list). Especially if several people are collecting and entering data, regular monitoring of the data can give you the opportunity to retrain before a lot of time is wasted on data “cleaning” and correcting work that has already been done.
  4. Provide more support than you expect people will need. Some people may not need training or support, but you never know who will. You may need to document protocols for data collection, provide periodic trainings, or help staff and stakeholders to understand the process and the results.
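
As a minimal illustration of point 3, a short script run against the data every week or two can surface entry problems while retraining is still cheap. The file and column names below are hypothetical, and the same checks can be set up in a spreadsheet just as easily.

```python
import csv

# Hypothetical weekly attendance file with columns:
# student_id, week_of, days_attended (0-5 per week).
with open("attendance.csv", newline="") as f:
    for row in csv.DictReader(f):
        student = row.get("student_id", "").strip()
        days = row.get("days_attended", "").strip()
        # Flag rows that would otherwise surface months later
        # as "cleaning" work during analysis.
        if not student:
            print(f"Missing student ID: {row}")
        elif not days.isdigit() or int(days) > 5:
            print(f"Days attended looks wrong for {student}: {days!r}")
```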

Leveraging a university partner

Dallas Afterschool partners with a local university to access expertise in evaluation design and analysis, as well as to enhance our self-reflection with external perspectives. Though choosing a university partner and engaging with them throughout the evaluation process may be daunting or even confusing, consider the following to maximize your organization’s benefit and enjoyment of the process.

  1. Know what you want. Do you simply need a report for a specific grant requirement, or are you looking for a thought partner to challenge your assumptions about your program and help you make it even better? Many evaluators jump at the chance to help a program that truly desires to improve and is willing to engage with them throughout the entire process.
  2. Develop a symbiotic relationship. Find out what research the university is interested in doing that your organization might be able to help with. Are they working on anything that might benefit your field or an issue related to your population? By opening your program to engage in research or evaluations that align with your mission but extend beyond your own evaluation, you can develop a relationship with your university partner that is beneficial to both entities and potentially addresses systemic issues that your program could not affect on its own.
  3. Trust their academic expertise but challenge the practical application of results. University partners can provide excellent direction on the design and methods of your evaluation, but you know your population best. If they propose an angle for the evaluation that doesn’t seem especially useful to your program or its participants, push back and work together to find an angle that does. Evaluators want their work to be used to help programs and the people they serve, so don’t be shy.
learn more about: Evaluations, Community Partners
SEP 15, 2016 | RESEARCH

New report: Participation in summer learning programs yields positive outcomes

By Erin Murphy

A new report shows that high levels of participation in summer learning programs can provide positive benefits for low-income students’ math and language arts performance and social-emotional skills. Last week, The Wallace Foundation released Learning from Summer: Effects of Voluntary Summer Learning Programs on Low-Income Urban Youth, the third and final report analyzing the outcomes of its National Summer Learning Project.

This report, conducted by the RAND Corporation, is part of a six-year study offering the first-ever assessment of the effectiveness of voluntary, no-cost summer learning programs on the academic achievement, social-emotional competencies, and behavior of low-income, urban, elementary students. In fall 2013, third-grade students enrolled in one of five urban school districts—Boston, Dallas, Jacksonville (FL), Pittsburgh, or Rochester (NY)—were selected to participate in the study. Half of the students were invited to participate in summer programming while half were not, and data on academic performance, social-emotional skills, behavior, and attendance was collected on both groups through the end of seventh grade.

Key findings on summer learning programs:

  • Students who were “high-attenders”—those attending a summer program at least 20 days—saw near- and long-term positive effects on math assessments throughout the study.
  • High-attenders saw near- and long-term positive effects on language arts assessments after the second summer of programming.
  • High-attenders saw positive benefits for their social and emotional skills after the second summer of programming.
  • When programs focused on math or language arts, students saw lasting positive gains in those subjects. Students who received at least 25 hours of math instruction or 34 hours of language arts instruction during the summer outperformed, on fall assessments, students who did not receive the same level of instruction in the relevant subject, and these positive effects lasted into the spring after the second summer.
  • Simply inviting students to attend did not lead to substantial long-term benefits, because of high rates of non-participation and low attendance.
Infographic courtesy of the Wallace Foundation.