Research Design, Analysis, and Reporting
AIR has conducted scientifically rigorous research since 1946 by applying our expertise in research design, data collection, and quantitative and qualitative analysis to determine program effectiveness and identify best practices.
Our evaluations provide concrete and practical recommendations to align and improve programs, systems, and work processes.
Government agencies and other stakeholders around the world draw on AIR's broad range of expertise and resources to help them decide whether to continue, modify, or terminate policies and programs.
Randomized controlled trials
Randomized controlled trials are usually the strongest research design for studies that seek to identify the net impact of a program or policy. In such trials, AIR researchers randomly assign individuals, teachers, schools, or communities either to implement a pilot program or to continue what they were doing prior to the study. Random assignment allows us to identify a program's causal effect, free of the biases that could otherwise compromise evaluation results and the decisions of the agencies that fund and implement these programs and policies.
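The logic behind random assignment can be illustrated with a small simulation. The sketch below is purely hypothetical (the effect size, sample size, and outcome scale are invented for illustration): even when an unobserved trait such as prior ability strongly influences outcomes, a coin-flip assignment balances that trait across groups in expectation, so a simple difference in mean outcomes recovers the true program effect.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 5.0  # assumed program effect, in test-score points (hypothetical)

def simulate_trial(n=1000):
    """Randomly assign n participants to treatment or control and
    return the difference in mean outcomes."""
    treatment, control = [], []
    for _ in range(n):
        ability = random.gauss(50, 10)        # unobserved confounder
        if random.random() < 0.5:             # coin-flip assignment
            treatment.append(ability + TRUE_EFFECT + random.gauss(0, 5))
        else:
            control.append(ability + random.gauss(0, 5))
    return statistics.mean(treatment) - statistics.mean(control)

# Across many replications, the difference-in-means estimator centers
# on the true effect, even though ability is never observed.
estimates = [simulate_trial() for _ in range(200)]
print(round(statistics.mean(estimates), 2))
```

Any single trial's estimate varies by chance, but randomization guarantees there is no systematic tilt toward one group, which is what makes the design credible for causal claims.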
It takes a great deal of experience, creativity, and diligence to successfully implement a randomized controlled trial in a real-life program or policy context. At AIR, we specialize in working with our clients and stakeholders to make these studies work with as little disruption and friction as possible.
In practice, randomized controlled trials may not always be feasible for various reasons. In these cases, AIR employs rigorous quasi-experimental alternatives to random assignment, such as regression discontinuity designs, propensity score matching, and comparative interrupted time series.
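One of the quasi-experimental designs named above, regression discontinuity, exploits a cutoff rule: units scoring at or above a threshold receive the program, and the jump in outcomes at the threshold estimates the program's local effect. The sketch below is a minimal illustration with invented numbers, not any particular AIR study; it fits a least-squares line on each side of the cutoff and compares the two predictions at the boundary.

```python
import random

random.seed(0)

CUTOFF = 50.0       # hypothetical eligibility threshold on a test score
TRUE_EFFECT = 4.0   # assumed program effect (invented for illustration)

# Simulate students: outcomes rise with the running variable (score),
# plus a jump of TRUE_EFFECT for those at or above the cutoff.
data = []
for _ in range(5000):
    score = random.uniform(0, 100)
    treated = score >= CUTOFF
    outcome = 0.3 * score + (TRUE_EFFECT if treated else 0.0) + random.gauss(0, 2)
    data.append((score, outcome))

def fit_at_cutoff(points, x0):
    """Ordinary least-squares line through (score, outcome) pairs,
    evaluated at the cutoff x0."""
    n = len(points)
    mx = sum(s for s, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((s - mx) ** 2 for s, _ in points)
    sxy = sum((s - mx) * (y - my) for s, y in points)
    slope = sxy / sxx
    return my + slope * (x0 - mx)

# Fit separately within a bandwidth on each side of the cutoff; the
# gap between the two fitted lines at the cutoff is the effect estimate.
BAND = 10.0
below = [(s, y) for s, y in data if CUTOFF - BAND <= s < CUTOFF]
above = [(s, y) for s, y in data if CUTOFF <= s < CUTOFF + BAND]
estimate = fit_at_cutoff(above, CUTOFF) - fit_at_cutoff(below, CUTOFF)
print(round(estimate, 2))
```

Fitting separate lines rather than simply averaging the two bands removes the bias that the underlying trend in scores would otherwise introduce; real applications also involve bandwidth selection and robustness checks beyond this sketch.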
Qualitative data collection and analysis
Many critical evaluation questions concerning programs and policies cannot be answered using quantitative outcome measures alone. For this reason, most of the studies we design and implement include extensive qualitative data collection and analysis, using in-depth interviews, observations, focus groups, portfolio reviews, and budget analyses.
Qualitative methods help us answer important questions about whether and how well an intervention is implemented, whether stakeholders understand and support the intent of policies and interventions, and what barriers may prevent organizations from implementing programs with fidelity.
Making results relevant
In all our research, we place great emphasis on making the results relevant and accessible to our clients and to key stakeholders in the programs and policies we evaluate. We conduct qualitative and quantitative analyses that address practical policy questions and produce results that are compelling and informative. These include analyses of differential impacts (on subgroups and sites), mediating processes, and variability in findings across research units such as schools, hospitals, and neighborhoods.
When reporting our research, we ensure that our findings directly address the concrete research questions that prompted the study in the first place. We supplement our primary research findings with rich contextual information about program settings, sample characteristics, and barriers to or facilitators of successful implementation. This way, AIR researchers ensure that their work has practical utility and relevance to their audiences.
Linking Research and Application Through Implementation Science
AIR is unique in the extent to which we provide both evaluation and technical assistance in many substantive policy areas. With the growing availability of high-quality program and participation data, we are increasingly able to connect these two strands of our work in a continuous effort to assess and improve the effectiveness of our technical assistance and of the programs and policies it supports. AIR uses the latest implementation science methods and works with other research institutions and agencies to advance this emerging field of research.
Examples of Our Work in Research Design
Randomized Controlled Trials
- Broadening Access to Algebra I: The Impact on Eighth Graders Taking an Online Course
- Early Childhood Stimulation Program Evaluation in Bangladesh
- The Mathematics Professional Development (PD) Impact Study
- Ohio Gifted and Talented Study (Regression Discontinuity Study)
- Jefferson County Strategic Compensation Evaluation
- Deeper Learning Study (Propensity Score Matching)
- Evaluation of Cooperating Districts Initiative
- Early experiences of new enrollees in the health marketplaces
Improving Research Methods
- IES Research and Methods Grant to Examine the Robustness of CITS in Practice