Assessing organizational readiness for evidence-based interventions: Prevention program assessment

Stamatakis, K., McQueen, A., Filler, C., Boland, E., Dreisinger, M., Brownson, R. C., & Luke, D. (2012). Measurement properties of a novel survey to assess stages of organizational readiness for evidence-based interventions in community chronic disease prevention settings. Implementation Science, 7(65), 1-10.

Description

This brief survey, the Prevention Program Assessment, consists of 23 questions in four domains (awareness, adoption, implementation and maintenance) to help users assess an organization's readiness to disseminate and implement evidence-based interventions in community and public health practice settings.

The tool measures four domains representing stages of readiness for evidence-based interventions (EBIs), based on Rogers' Diffusion of Innovations (1995) theory and the RE-AIM framework. Rogers' Diffusion of Innovations theory (see Dobbins' framework) describes five stages in adopting and implementing an innovation:

  • Knowledge phase — make people aware of the innovation
  • Persuasion phase — assess potential benefits and risks of the innovation
  • Decision phase — adopt the innovation
  • Implementation phase — use organizational changes and strategies to support the innovation
  • Confirmation phase — evaluate the innovation and changes made to support the innovation

The tool also uses constructs from the RE-AIM framework, which is used to plan and evaluate implementation of EBIs by examining the reach, effectiveness, adoption, implementation and maintenance or sustainability of public health interventions.

The Prevention Program Assessment tool assesses organizational readiness for EBIs according to four stages:

  • Awareness — recognizing the need and availability of sources of EBIs, assessing community awareness and organizational awareness
  • Adoption — assessing the extent of evidence use in decision making
  • Implementation — adapting and implementing the intervention to meet community needs
  • Maintenance — making activities and resources available for ongoing support of the innovation

A separate scale representing organizational climate as a potential mediator of readiness for EBIs was also included in the survey.

Each domain contains four to nine items, each rated on a seven-point scale.

Other tools to assess organizational readiness to support innovation include:

  • Program Sustainability Assessment Tool — used to assess an organization's capacity to sustain an innovation (some items from this tool were used to develop the Prevention Program Assessment Tool)
  • Organizational Readiness to Change Assessment (ORCA) tool and the Alberta Context Tool (ACT) — longer, more comprehensive instruments based on the PARIHS framework and designed for use in clinical settings; these tools also assess the organization's capacity for facilitation to support an evidence-based practice

Steps for Using Method/Tool

The tool consists of 23 questions to assess organizational readiness for evidence-based interventions (EBIs) in the following domains:

1. Awareness

  • Community awareness (community considers health issue a problem, considers intervention a solution)
  • Organization awareness (leadership and staff aware of sources of EBIs)

2. Adoption

  • Organization leadership encourages use of EBIs
  • EBIs are readily adopted
  • Supervisor expects research evidence
  • Organization currently uses research evidence
  • Organization has access to help in using research evidence

3. Implementation

  • Organization has resources available to implement EBI
  • Intervention has support of leadership
  • Organization adapts EBI to meet community needs
  • Intervention has support of community leadership
  • Staff has necessary skills

4. Maintenance

  • Resources are available to sustain EBI
  • Organization evaluates EBI

A separate scale representing organizational climate as a potential mediator of readiness for EBIs was also included in the survey.
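The domain structure above lends itself to straightforward scoring. The sketch below is illustrative only: it assumes each item is answered on the tool's seven-point scale and that a domain score is the mean of its item responses (the actual scoring rule should be taken from the original instrument); the domain item counts and response values shown are hypothetical.

```python
# Illustrative scoring sketch for a readiness survey such as the
# Prevention Program Assessment. Assumes (hypothetically) that each
# domain score is the mean of its 1-7 item responses.

def score_domains(responses):
    """Return the mean response per domain.

    responses: dict mapping domain name -> list of item ratings (1-7).
    """
    scores = {}
    for domain, items in responses.items():
        if any(not 1 <= r <= 7 for r in items):
            raise ValueError(f"ratings in {domain!r} must be on the 1-7 scale")
        scores[domain] = sum(items) / len(items)
    return scores

# One hypothetical respondent:
example = {
    "awareness": [5, 6, 4, 5],
    "adoption": [3, 4, 4, 3, 4],
    "implementation": [6, 5, 6, 5, 6],
    "maintenance": [2, 3, 3, 2],
}
print(score_domains(example))
```

A profile like this one (high awareness and implementation, low maintenance) would suggest the organization can deliver an EBI but may struggle to sustain it.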

Evaluation

The Prevention Program Assessment tool was tested by respondents sampled to represent program areas in chronic disease prevention, including respondents from state health departments, local health departments and community-based organizations. The researchers analyzed surveys completed by 243 individuals, with 92 of these respondents completing the tool a second time to assess test-retest reliability. Psychometric properties were measured by:

  • Confirmatory factor analysis (CFA) for construct validity
  • Intraclass correlation for test-retest reliability and Cronbach's alpha for internal reliability

Validity

Confirmatory factor analysis (CFA) was used to test and refine the readiness scales. The measurement model was first tested on the initial four-factor model based on the four stages of organizational readiness. The model was adapted several times using multiple fit indices to evaluate model fit (chi-square/degrees of freedom, comparative fit index, root mean square error of approximation and associated 90% confidence interval). CFA found good fit for the five-item adoption and implementation scales, and the analysis resulted in revisions of the awareness and maintenance scales. The awareness scale was split into two two-item scales, representing community and agency awareness. The maintenance scale was split into five- and four-item scales, representing resource maintenance and evaluation maintenance, respectively. The model for the final revised scales approached good fit, with most factor loadings >0.6 and all >0.4.
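Two of the fit indices named above can be computed directly from a model's chi-square statistic. The sketch below shows the standard formulas for the chi-square/degrees-of-freedom ratio and RMSEA; the chi-square and degrees-of-freedom values used are hypothetical and are not taken from the study (only the sample size of 243 comes from the paper).

```python
import math

def chisq_df_ratio(chisq, df):
    """Chi-square / degrees-of-freedom ratio; smaller values suggest better fit."""
    return chisq / df

def rmsea(chisq, df, n):
    """Root mean square error of approximation for a model fitted to n cases.

    Standard formula: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    By convention, values at or below roughly 0.06 indicate good fit.
    """
    return math.sqrt(max(chisq - df, 0.0) / (df * (n - 1)))

# Hypothetical inputs (NOT the study's results): chi2 = 100, df = 50,
# fitted to the paper's sample of 243 respondents.
print(chisq_df_ratio(100, 50))         # 2.0
print(round(rmsea(100, 50, 243), 4))   # 0.0643
```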

Reliability

Test-retest reliability of the readiness scales, measured by intraclass correlation, was moderate to good across scales, ranging from 0.47 to 0.71. Internal reliability, measured by Cronbach's alpha, ranged from 0.66 to 0.78.
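Cronbach's alpha has a closed-form definition that is easy to verify by hand: for k items, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below implements this formula on a small hypothetical data set; it does not use the study's data.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: list of columns, one list of respondent ratings per item.
    Uses sample variance (ddof = 1), as is conventional.
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(item_scores)
    totals = [sum(row) for row in zip(*item_scores)]  # total score per respondent
    item_var_sum = sum(var(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical 3-item scale, four respondents; the items move in lockstep,
# so alpha is 1.0 (up to floating-point rounding).
items = [
    [1, 2, 3, 4],  # item 1 ratings across respondents
    [2, 3, 4, 5],  # item 2
    [3, 4, 5, 6],  # item 3
]
print(round(cronbach_alpha(items), 6))  # 1.0
```

Alphas in the 0.66-0.78 range reported for this tool indicate that items within each scale are correlated but far from redundant.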

These summaries are written by the NCCMT to condense and to provide an overview of the resources listed in the Registry of Methods and Tools and to give suggestions for their use in a public health context. For more information on individual methods and tools included in the review, please consult the authors/developers of the original resources.

We have provided the resources and links as a convenience and for informational purposes only; they do not constitute an endorsement or an approval by McMaster University of any of the products, services or opinions of the external organizations, nor have the external organizations endorsed their resources and links as provided by McMaster University. McMaster University bears no responsibility for the accuracy, legality or content of the external sites.
