Define Evidence of Effectiveness: Examples

Evidence Frameworks: RFA's State and Local Workforce Fellowship 

Evidence frameworks look different across the workforce field, as government agencies determine how many tiers they’ll use, the specific criteria for each, and the role that equity will play. As part of their participation in Results for America’s State and Local Workforce Fellowship, Pennsylvania, Colorado, and Texas each developed or strengthened evidence frameworks to inform their contracting and grantmaking processes.

Here’s an overview of the evidence continuums the Colorado Workforce Development Council and Pennsylvania’s Department of Labor and Industry have adopted:

Pennsylvania and Colorado

Strong Evidence

Two or more rigorous evaluations support the program model.

Moderate Evidence

At least one evaluation report demonstrates that an intervention or strategy has been tested using a well-designed and well-implemented experimental or quasi-experimental design and shows evidence of effectiveness on one or more key workforce outcomes. The evaluations should be conducted by an independent entity external to the organization implementing the intervention.

Preliminary Evidence

At least one evaluation report demonstrates that an intervention or strategy has been tested using a well-designed and well-implemented pre/post-assessment without a comparison group, or a post-assessment comparison between intervention and comparison groups, and shows evidence of effectiveness on one or more key workforce outcomes. The evaluation may be conducted either internally or externally.

Pre-preliminary Evidence

Program performance data for the intervention show improvements in one or more key workforce outputs or outcomes.

See Colorado Workforce Development Council’s evidence continuum in action on pages 5-6 of this Reskilling, Upskilling, and Next-skilling Workers (RUN) grant, and review the Pennsylvania Department of Labor and Industry’s statewide Evidence of Effectiveness definitions.

After publishing the statewide Evidence of Effectiveness definitions with the Pennsylvania Department of Labor and Industry, the Pennsylvania Workforce Development Board conducted a field survey to learn how the definitions affected local partners. The survey collected data from over 100 respondents across 12 workforce and education program types. Metrics included:

  • Self-assessed organizational/programmatic evidence rating (pre-preliminary, preliminary, moderate, or strong)

  • Organization’s current budget for evaluation

  • History of participating in external evaluations

Respondents also shared that staff expertise and capacity, data access, and non-staff-related costs were the main barriers to building and using evidence. These barriers could be mitigated through dedicated funds for evidence building within grant awards and improved technical assistance opportunities. The survey process ensured the state understood what barriers, supports, and resources would be required to apply the Evidence of Effectiveness definitions equitably.

Texas

The Texas Workforce Commission (TWC) chose to use a wider range of tiers beyond high, middle, and low in its evidence framework to include applicants with promising practices or the desire to build evidence of effectiveness through their work. TWC allows providers to receive funding for newer programs if they build evidence of effectiveness over the grant or contract period through improved data collection and evaluation. For a more detailed look at Texas’ Evidence Framework, click here.

High Evidence

At least two approved rigorous studies must show that the program produces positive and meaningful outcomes, with a high degree of confidence that the outcome is primarily caused by the program.

Moderate Evidence

Program must be supported by an approved rigorous evaluation which finds that the program has a positive and meaningful outcome, with a modest degree of confidence that the outcome is primarily caused by the program.

Performance

Program must provide historical output and outcome data for at least two years, along with assessments and post-program follow-up to demonstrate effectiveness.

Experience

Programs do not perform evaluations of participant success or collect data on the effectiveness of the program. Support for program effectiveness comes from anecdotal success stories or other testimonials.

New

Entirely new programs with no evidence of effectiveness or evaluation data. Applicants must explain why the program will achieve positive, measurable outcomes and demonstrate that there is sufficient capacity to collect data and track outcomes.

Other Strong Examples of State Evidence Frameworks

Minnesota Management and Budget Office Evidence Framework

The Results First Initiative within the state of Minnesota’s Management and Budget Office created an evidence framework to rate the state’s programs and services. In this framework, programs and services are considered evidence-based if they fall in the “Proven Effective” or “Promising” categories. Ratings for each program are published in the Minnesota Inventory, which serves as a quasi-state evidence clearinghouse.

Note: For Minnesota, research includes “programs and services previously evaluated and featured in a national clearinghouse or meta-analysis (with respect to the nature, length, frequency, and target population).”

Proven Effective

A Proven Effective service or practice offers a high level of research on effectiveness for at least one outcome of interest. This is determined through multiple qualifying evaluations outside of Minnesota or one or more qualifying local evaluations. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.

Promising

A Promising service or practice has some research demonstrating effectiveness for at least one outcome of interest. This may be a single qualifying evaluation that is not contradicted by other such studies but does not meet the full criteria for the Proven Effective designation. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.

Theory Based

A Theory Based service or practice has either no research on effectiveness or research designs that do not meet the above standards. These services and practices may have a well-constructed logic model or theory of change. This ranking is neutral. Services may move up to Promising or Proven Effective after research reveals their causal impact on measured outcomes.

Mixed Effects

A Mixed Effects service or practice offers a high level of research on effectiveness for multiple outcomes; however, that research shows contradictory effects across those outcomes. This is determined through multiple qualifying studies outside of Minnesota or one or more qualifying local evaluations. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.

No Effect

A service or practice rated No Effect has no impact on the measured outcome or outcomes of interest. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.

Proven Harmful

A Proven Harmful service or practice offers a high level of research showing that program participation adversely affects outcomes of interest. This is determined through multiple qualifying evaluations outside of Minnesota or one or more qualifying local evaluations. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.

Tennessee

The evidence framework used by the state of Tennessee’s Office of Evidence and Impact considers programs evidence-based if they are supported by at least one rigorous evaluation. The framework is used as part of a program inventory that identifies how funding is allocated across programs, the services provided, and any evidence tied to each program’s outcomes. For Tennessee, rigorous evaluations include randomized controlled trials and quasi-experimental designs that use comparison groups.

Strong Evidence

Two or more rigorous evaluations support the program model.

Evidence

At least one rigorous evaluation supports the program model.

Outcomes

Data collected over time demonstrate a change or benefit for participants.

Outputs

Process measures to support continuous improvement.

Logic Model

“If we do X, Y, and Z activities, then we expect to see A, B, and C results.”

New Mexico

In 2019, the state of New Mexico passed the Evidence and Research Based Funding Requests Act, which defined four tiers of evidence and required state agencies to categorize sub-programs according to these tiers and report on the amount allocated for each of these evidence tiers. Each year, New Mexico’s Legislative Finance Committee oversees this work, providing budget guidance for agencies, recommendations for evidence-based programs through their Legislating for Results framework, training, and technical assistance.

Evidence-Based

A program or practice: (1) incorporates methods demonstrated to be effective for the intended population through scientifically based research, including statistically controlled evaluations or randomized trials; (2) can be implemented with a set of procedures to allow successful replication in New Mexico; and (3) when possible, has been determined to be cost beneficial.

Research-Based

A program or practice has some research demonstrating effectiveness, but does not yet meet the standard of evidence-based.

Promising

A program or practice, based on statistical analyses or preliminary research, presents potential for becoming research-based or evidence-based.

Lacking Evidence of Effectiveness

Programs or practices that do not fall into the other three evidence tiers.

Sample Evidence Frameworks from Federal Agencies

The U.S. Department of Labor’s Clearinghouse for Labor Evaluation and Research (CLEAR) includes descriptive, implementation, and impact studies of workforce development and employment-related programs on a wide variety of topics. CLEAR rates the evidence presented in impact studies as high, moderate, or low depending on how confident readers can be that the study outcomes are attributable to the program. Review CLEAR’s rating criteria here.

The U.S. Department of Education’s What Works Clearinghouse (WWC) reviews and summarizes studies of education programs – including postsecondary career and technical education programs – and assigns those programs to strong or moderate evidence tiers.

AmeriCorps has a common evidence framework for funding decisions in the Senior Corps and AmeriCorps state and national programs, including pre-preliminary, preliminary, moderate, and strong evidence tiers. In FY22, 64% of competitively awarded funds were invested in interventions with moderate and strong evidence. In 2023, thirty states used this federal definition of evidence in their most recent AmeriCorps grant applications.

RFA's Evidence Honor Roll

Get access to a catalog of programs that define and prioritize evidence of effectiveness with RFA's "Honor Roll of State Grant Programs that Define and Prioritize Evidence." State agencies can nominate their state grant program(s) for potential inclusion using this form. Nominations are reviewed and the honor roll is updated on a rolling basis.