Afterschool Snack, the afterschool blog. The latest research, resources, funding and policy on expanding quality afterschool and summer learning programs for children and youth. An Afterschool Alliance resource.
July 28, 2017

RESEARCH

Guest blog: Q&A with an afterschool researcher

By Guest Blogger

In May, the proposed FY2018 budget eliminated funding for the 21st Century Community Learning Centers (21st CCLC) initiative, the only federal funding stream dedicated to before-school, afterschool, and summer learning programs. The budget justified the elimination by asserting that there is no demonstrable evidence that 21st CCLC programs have a positive impact on the students who attend them. Although we have highlighted the existing body of research underscoring the difference 21st CCLC programs are making in the lives of participating students, we decided to go directly to the source and ask someone who has conducted evaluations of 21st CCLC programs for 14 years.

We posed a few questions to Neil Naftzger of the American Institutes for Research (AIR) about his evaluation work related to 21st CCLC programs specifically and the afterschool field broadly. Below are his answers to two of the questions we asked, with our emphasis added in bold, which establish that there is in fact clear evidence that 21st CCLC programs work for students.

What are the strongest findings across your research on 21st CCLC programs? Do you see any important non-academic benefits from afterschool and summer learning programs?

Between 2011 and 2016, AIR completed a series of analyses designed to explore the relationship between regular participation in the 21st CCLC program and youth performance on a variety of outcomes pertaining to academic achievement and positive behaviors. These analyses were undertaken as part of a series of statewide 21st CCLC evaluation contracts held by AIR in five states: New Jersey, Oregon, Rhode Island, Texas, and Washington. The purpose of these analyses was to assess how regular participation in the program related to school-related outcomes by comparing youth who participated in 21st CCLC programming for 60 days or more during a given school year with similar youth attending the same schools who did not attend programming. The hypothesis underpinning these analyses was that youth participating in 21st CCLC for 60 days or more would demonstrate more desirable school-related outcomes than similar, non-participating youth attending the same schools.

In relation to academic achievement-related outcomes, this hypothesis—that youth regularly participating in 21st CCLC programs would perform better academically than their peer counterparts who did not participate in programs—was supported in terms of state assessment scores in mathematics, cumulative GPA, and credits earned toward graduation. Program effects were small, but largely commensurate with what would be expected given how much time youth spent in programming. These findings were replicated across multiple samples and years.

For example, of the fifteen analyses conducted to assess the relationship between regular program participation and mathematics state assessment scores, twelve were found to be statistically significant and indicative of 21st CCLC potentially having a positive effect on state assessment scores in mathematics. By comparison, few potential positive effects were found for state assessment scores in reading, and one negative effect was found for this outcome.

Our hypothesis was also supported with respect to school-day absences and disciplinary referrals: youth regularly participating in 21st CCLC demonstrated fewer of both relative to similar youth not participating in programming. The potential program effects ranged from relatively small to large depending on the state and year in which the analyses were completed. For example, youth participating in programming for 60 days or more demonstrated anywhere from 15 percent to 70 percent fewer absences than similar youth not enrolled in programming, with an average of 40 percent fewer absences across the analyses under consideration. Only one of the six analyses related to school-day absences yielded a non-significant result.

Results for disciplinary referrals were similar. All five of the analyses conducted to assess the relationship between regular program participation and school-day disciplinary incidents were found to be significant and indicative of 21st CCLC-funded programs potentially supporting fewer disciplinary incidents. Youth participating in programming for 60 days or more demonstrated anywhere from 5 percent to 72 percent fewer disciplinary incidents than similar youth not enrolled in programming, with an average of 24 percent fewer incidents across the analyses under consideration.

Overall, these results suggest a positive relationship between regular participation in 21st CCLC and desirable school-related outcomes in both academic achievement and behavior.

How are studies like yours best used (e.g., by policymakers, by program managers, by the field)?

I think the evaluation work we have done in relation to the 21st CCLC program can be used by key stakeholders to gain an understanding of what impact recent implementation of the program may be having on participating youth, especially in light of how the field has grown and developed during the span of the past decade as states have crafted and deployed quality improvement systems. Our work also highlights where data collection and analysis efforts should be targeted to more fully understand how the program may be contributing to the positive development of participating youth. With this understanding, federal, state, and local stakeholders can formulate more meaningful expectations for how these programs are likely to impact youth and support data collection and analysis efforts that look for evidence of positive outcomes in the places where they are likely to occur.

In addition, some of our work has focused on examining the relationship between (a) observed program quality, measured with some of the tools previously referenced [this information can be found in the full Q&A] that states are using as the foundation of their quality improvement systems (i.e., the YPQA and the APT-O), and (b) school-related outcomes. The hypothesis guiding this work was that youth who participated regularly in higher-quality afterschool programs would demonstrate better functioning on a variety of youth outcomes than similar youth attending lower-quality programs. This hypothesis was largely borne out: participation in higher-quality programs was associated with longer duration of attendance in afterschool programs, fewer school-day disciplinary referrals, and better academic achievement. In light of these findings, key stakeholders involved in the design and implementation of 21st CCLC would seemingly be well served to further support and incentivize program participation in quality improvement processes underpinned by these types of tools.

These are only two of the questions we posed to Neil. Next week, we'll publish his answers to our other questions, including what changes he would like to see in 21st CCLC data collection and what he sees for the future of afterschool evaluation.