AIR researchers and technical assistance consultants have a unique connection to the federally funded Nita M. Lowey 21st Century Community Learning Centers (21st CCLC) program and have contributed to it for almost the full life of the program. Specifically, we:
- Conduct rigorous mixed-methods evaluations of statewide 21st CCLC systems that are grounded in research and aligned to state goals and objectives;
- Support research on the importance of program quality; on youth experiences in programming; on youth knowledge, attitudes, and beliefs; and on school-related outcomes;
- Develop data systems that support federal reporting requirements, facilitate statewide evaluation activities, and support programs' ability to monitor, refine, and learn;
- Provide ongoing, evidence-based technical assistance related to youth development practices, continuous quality improvement efforts, and evaluation efforts.
- Share what we know to inform policy decisions and program practice.
Learn more about how we work with the 21st Century Community Learning Centers program:
- History and collaboration with the program
- Statewide work across the country
- Research studies and support for the program
- Survey results showing the connection between program quality and youth outcomes
Our History and Collaboration with the 21st CCLC Program
AIR began working with nationwide 21st CCLC data in 2003, acting on behalf of the U.S. Department of Education (ED) to collect and analyze program data from more than ten thousand centers across the country. Using the Profile and Performance Information Collection System (PPICS), built and maintained by AIR through 2014 (when the system was retired), we gathered information on program operations, attendance levels, activity provision, staffing, and youth outcomes, both in aggregate and (in select states) at the student level.
AIR also conducted the National Partnership for Quality Afterschool study from 2004 to 2009. This study of high-functioning 21st CCLC programs was funded by ED to:
- Develop resources and professional development that addressed issues relating to the establishment and sustainability of afterschool programs.
- Provide models and indicators of promising practices.
- Identify other descriptive information that local sites could access when planning new afterschool programs or improving existing ones.
Findings from the study were used to guide the partnership’s work, including the development of tools, models, and technical assistance for 21st CCLC grantees. The Afterschool Toolkit created as part of the project was disseminated nationally.
In 2009, AIR began conducting statewide evaluations of the 21st CCLC program. Since then, we have partnered with Illinois, Nevada, New Jersey, Ohio, Oregon, Rhode Island, South Carolina, Texas, and Washington. In addition, we have conducted research studies involving 21st CCLC programs in Massachusetts, Minnesota, and Wisconsin. Almost all of our statewide evaluation contracts remain active, with many in their second, third, or even fourth contract period. Our statewide evaluation work has focused on five key activities:
- Developing key performance indicators
- Measuring social and emotional learning and related outcomes
- Linking quality programming to youth outcomes
- Assessing the program’s impact on school-related outcomes
- Developing data collection infrastructure
In states where we provide technical assistance and support for quality improvement, we focus on providing tools and professional development that help 21st CCLC programs adopt research-supported practices and use data to drive program improvement efforts.
21st CCLC Statewide Work Across the Country
Our work with Illinois began in 2002, when AIR served as the primary training partner for Illinois 21st CCLC grantees through statewide and regional training events. In 2007, AIR’s support for the Illinois State Board of Education’s (ISBE) 21st CCLC program expanded to include program management, development of a peer mentoring process, and statewide conferences. In 2013, AIR’s work with ISBE’s 21st CCLC program grew again, and AIR launched the Illinois Quality Afterschool project. Currently, we work in partnership with ISBE to ensure that 21st CCLC grantees in Illinois receive the necessary assistance—including training, tools, resource materials, and expertise—to deliver high-quality afterschool programs that can strengthen student engagement and academic achievement. A key component of this work is the design, development, and provision of professional development opportunities, including an annual webinar series, regional and statewide workshops, and a large-scale annual conference for grantee teams.
In Nevada, we began a statewide evaluation in January 2016. AIR works in partnership with the Nevada Department of Education (NDE) to design and implement a comprehensive evaluation of 21st CCLC programs that helps local grantees improve afterschool program design and delivery and helps state officials understand the connections between high-quality programming, student engagement, the development of skills and beliefs, and academic achievement. Additionally, AIR supports the implementation of a continuous quality improvement process (CQIP); provides resources, training, and technical assistance to support local evaluation efforts; and advises NDE staff on data collection efforts, such as youth survey development, that align with both the latest research and federal data submission requirements (GPRA measures).
Our work with New Jersey began in 2009 and is ongoing. Our evaluation assesses grantee progress on key program indicators and overall program impact. AIR’s data collection includes grantee-level action research and self-assessment data (collected to facilitate ongoing program improvement efforts), alongside youth and staff surveys. Starting in 2021, AIR is also reviewing local evaluation reports to provide individual grantees with constructive feedback. For the state-level program evaluation, propensity score matching is used to assess program impact on school-related outcomes relative to a nonparticipant comparison sample. AIR also analyzes youth survey outcomes in correlational analyses.
In Ohio (2015-2020), we evaluated the state program on implementation progress and program impact, with a particular focus on literacy. On an annual basis, we also reviewed all local evaluation reports for grantees entering their second year, providing individualized feedback to help grantees strengthen their local evaluations. AIR collected data directly from grantees on basic program operations and attendance patterns and administered youth and teacher surveys. Matching these records with data maintained by the Ohio Department of Education, we used propensity score matching to create a non-participant comparison group and assess youth outcomes in a quasi-experimental evaluation design.
We began working with the Oregon Department of Education (ODE) in 2011, first on an evaluation of its statewide 21st CCLC program. The evaluation was designed to answer research questions related to program implementation and program impact while also supporting the development of a comprehensive Leading Indicator system that would help ODE and individual 21st CCLC grantees use data collected as part of the evaluation for program improvement efforts. In 2013, AIR used the evaluation findings and recommendations (which suggested ODE create a statewide Continuous Quality Improvement Process, or CQIP) to conduct a statewide needs assessment related to quality practice. The goal of this two-year project was to understand grantees’ quality improvement efforts and to support ODE and its stakeholders in implementing a continuous improvement process for all grantees across the state. In 2016, we then worked with the state and local grantees to develop a comprehensive CQIP. This two-year endeavor involved developing a process and supporting tools for the CQIP and determining what implementation looks like relative to key quality constructs.
In Rhode Island (2016-2020), AIR worked with the Rhode Island Department of Education (RIDE) to conduct a state-level evaluation focused on the relationship between program quality and youth outcomes. As part of this effort, AIR collected observational data from a select group of grantees, along with a series of youth and teacher surveys. The observation data and select surveys were used to derive center-level program quality scores. These scores, along with survey data concerning youth social and emotional outcomes, were combined with RIDE-collected 21st CCLC program data (including individual student participation data) and analyzed using a range of analytic techniques, notably Rasch modeling, hierarchical linear modeling, and propensity score matching. The propensity score matching involved use of a non-participant sample in a quasi-experimental evaluation design.
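Several of the evaluations above pair propensity score matching with a non-participant comparison group to support a quasi-experimental design. As a rough illustration of the mechanics only (not AIR's actual analysis), the sketch below estimates each student's probability of program participation from covariates and then pairs each participant with the closest-scoring non-participant; all variable names and data here are synthetic and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic student records: two covariates and a participation flag.
# (Hypothetical data; a real analysis would use administrative records.)
n = 1000
prior_score = rng.normal(50, 10, n)          # prior achievement
attendance = rng.uniform(0.7, 1.0, n)        # school attendance rate
participated = (rng.random(n) < 0.3).astype(int)

# Standardize covariates and add an intercept for a stable logistic fit.
X = np.column_stack([prior_score, attendance])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(n), Xs])

# Step 1: estimate propensity scores (probability of participating,
# given covariates) with a logistic model fit by Newton's method.
w = np.zeros(Xb.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-Xb @ w))
    grad = Xb.T @ (participated - p)
    hess = Xb.T @ (Xb * (p * (1 - p))[:, None])
    w += np.linalg.solve(hess, grad)
pscore = 1 / (1 + np.exp(-Xb @ w))

# Step 2: greedy 1:1 nearest-neighbor matching without replacement --
# each participant is paired with the non-participant whose propensity
# score is closest, and that comparison student is then removed.
treated = np.flatnonzero(participated == 1)
control = np.flatnonzero(participated == 0)
available = list(control)
matches = {}
for t in treated:
    pool = np.array(available)
    j = np.argmin(np.abs(pscore[pool] - pscore[t]))
    matches[t] = pool[j]
    available.pop(j)

# School-related outcomes (e.g., test scores) for the matched pairs
# would then be compared to estimate the association between
# participation and those outcomes.
print(f"{len(matches)} participants matched to comparison students")
```

The quality of such a design rests on the covariates used in step 1; statewide evaluations can draw on the student-level records that state education agencies maintain, which is why the matching is described as occurring after linking program data to state records.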
Our work in South Carolina began in 2016 and focuses on working with the South Carolina Department of Education (SCDE) to evaluate the state’s 21st CCLC program. The evaluation facilitates efforts to develop quality afterschool program design and delivery, optimize the use of program data to monitor the progress that programs are making toward desired outcomes, and conduct rigorous impact analyses comparing how youth participating in 21st CCLC perform on critical school-related outcomes with similar youth who are not participating in the program. Critical to this task was the development of the South Carolina 21st CCLC Data Portal, an online interface where each subgrantee submits center-level data to support both statewide evaluation efforts and federal data submission requirements.
Since 2011, AIR has worked with the Texas Education Agency (TEA) on a statewide evaluation of the state’s 21st CCLC programs, branded statewide as the Afterschool Centers on Education (ACE). AIR and TEA are working to assess how ACE programming supports key developmental experiences for students in programming and how program participation is associated with school-related outcomes. A key component of this work is understanding which center characteristics are especially associated with positive student outcomes. Our research design includes (a) an analysis of the relationship between ACE center characteristic data and youth outcomes, along with the development of key indicators used to guide and inform local evaluation efforts; (b) evaluation of the program impacts and implementation practices of ACE centers; and (c) dissemination and application of best practices that lead to improved student outcomes, along with training and technical assistance to increase the capacity of ACE grantees and centers to locally evaluate progress on goals and outcomes that lead to program improvement.
AIR has worked with the 21st CCLC program in Washington since 2011. We have provided technical assistance and training for grantees on the use of federal reporting systems, helped the state collect Washington-specific student attendance information, and conducted impact analyses examining the relationship between participation in the program and improved outcomes. We also developed and implemented a leading indicators system designed to create reports on the performance of grantees on a variety of indicators related to programming quality, developed specifications for a data dashboard, and conducted a survey of youth skills and beliefs. Most recently, we have deployed the Washington 21st CCLC Data Portal to support data submission requirements for statewide and federal purposes, and have revised the local evaluation guidelines while providing appropriate training and technical assistance.
Research Studies and Support for the 21st CCLC Program
AIR also conducts rigorous research studies of program quality and youth outcomes. Most recently, we examined the relationship between sustained attendance in high-quality 21st CCLC–funded programming and the social and emotional development of participating youth. With support from the Charles Stewart Mott Foundation, the Quality to Youth Outcomes Study was a four-year project conducted in collaboration with the states of Massachusetts and Minnesota. The study followed a cohort of elementary-aged youth across two years of sustained participation in 21st CCLC programming at 54 centers identified as higher-quality programs. Study results supported two hypotheses: (a) that sustained enrollment in high-quality 21st CCLC programs was related to growth on a set of social and emotional outcomes, particularly those related to cognitive engagement and executive function, as measured by the Survey of Academic and Youth Outcomes Teacher Survey (SAYO-T), and (b) that growth on these social and emotional outcomes was associated with the development of literacy skills in the early elementary grades.
AIR conducted the National Scan of 21st CCLC Data, Impact, and Quality Improvement Systems Project from 2013 to 2015. The project explored how states collect and use data about the 21st CCLC programs they fund and oversee, and it initiated a database of results from robust 21st CCLC evaluations undertaken by states and local 21st CCLC grantees, which AIR continues to maintain.
Tools to Understand the Connection Between Program Quality and Youth Outcomes
To explore the relationship between program quality and youth outcomes, AIR has studied the relationship between afterschool program quality, youth engagement in programming, and changes in a series of youth development and related outcomes measured through the Youth Motivation, Engagement, and Beliefs Survey. Funded by the Raikes Foundation and the Washington Office of Superintendent of Public Instruction (OSPI), the study included youth in Grades 4–9 from 11 21st CCLC afterschool programs. Findings suggested that higher levels of program quality were associated with more positive youth experiences in programming and that these positive experiences were associated with improved youth development outcomes.