Program planning, implementation and evaluation tools

A summary of Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes™: Promoting accountability through methods and tools for planning, implementation, and evaluation. Pittsburgh, PA: RAND Corporation. Retrieved from http://www.rand.org/pubs/technical_reports/TR101.html

How to cite this NCCMT summary:

National Collaborating Centre for Methods and Tools (2010). Program planning, implementation and evaluation tools. Hamilton, ON: McMaster University. (Updated 29 April 2011). Retrieved from http://www.nccmt.ca/registry/view/eng/71.html.

Keywords: Adapt, Implement, Evaluate, Program planning, Situational assessment, Organizational capacity and management

Tool: Knowledge, Planning, Doing, Evaluating

These summaries are written by the NCCMT to condense the resources listed in the Registry of Methods and Tools, provide an overview of their content and suggest how they might be used in a public health context. For more information on individual methods and tools, please consult the authors/developers of the original resources.

Relevance for Public Health

This tool will be helpful for program planners, managers and program evaluation specialists. It can be used throughout the program planning, implementation and evaluation phases to inform continuous quality improvement strategies. For instance, the tools in this resource could help users refine an existing diabetes prevention program to meet the needs of a changing population.

Description
The Getting to Outcomes (GTO) approach helps practitioners improve the accountability and quality of their programs. The approach was developed to address the gap between research and practice by building capacity at the individual practitioner and program levels (e.g., choosing evidence-based practices and planning, implementing, evaluating and sustaining effective practices). The Getting to Outcomes 2004: Promoting Accountability through Methods and Tools for Planning, Implementation, and Evaluation report contains both methods and tools. This summary statement covers the tools from the report (refer to the Methods for Program Planning, Implementation and Evaluation summary statement for the methods).

The approach is based on traditional evaluation methods, empowerment evaluation, results-based accountability and continuous quality improvement. While traditional evaluation methods typically use external evaluators working at arm's length from practitioners, empowerment evaluation supports collaborative relationships between evaluators and practitioners. Empowerment evaluation is based on the notion that program success is more likely when evaluators collaborate with practitioners and provide them with the tools and opportunities to plan, implement, evaluate and develop a continuous quality improvement system themselves.

Results-Based Accountability (RBA) focuses on the results of programs and what can be learned from program impacts and program effectiveness, rather than process or output information. Continuous Quality Improvement (CQI) is a technique within Total Quality Management (TQM). It is based on principles of quality improvement, error and cost reduction and increasing client satisfaction.

The GTO approach consists of 10 accountability questions: six planning questions (steps 1-6), two evaluation questions (steps 7-8) and two questions that address using data to improve and sustain programs (steps 9-10). The accountability questions are as follows:

  1. Needs and Resources: What are the underlying needs and conditions in the community?
  2. Goals: What goals, target populations and objectives (i.e., desired outcomes) will address the needs and change the underlying conditions?
  3. Best Practice: Which evidence-based models and best practice programs can you use to reach your goals?
  4. Fit: What actions do you need to take so that the selected program "fits" the community context?
  5. Capacities: What organizational capacities are needed to implement the program?
  6. Plan: What is the plan for this program?
  7. Process: How will you assess the quality of program implementation?
  8. Outcomes: How well did the program work?
  9. Continuous Quality Improvement: How will you incorporate continuous quality improvement strategies?
  10. Sustainability: How will effective programs be sustained?

The three key features of the GTO system are as follows:

  1. The GTO approach can be used at any stage of program planning, implementation and evaluation. Practitioners can choose which accountability questions are helpful at a particular point in time, depending on the development stage of their programs.
  2. The GTO approach is not linear. Although presented in a linear fashion, the GTO system can be used in an iterative way, where learning may feed back into earlier steps.
  3. The GTO approach promotes cultural competence in programming. Practitioners can incorporate the ethnic/cultural characteristics, experiences, norms and values of intended populations at each program development stage.
Implementing the Method/Tool
Steps for Using Method/Tool

The 10 sections of the resource provide a number of useful tools to help users plan, implement and evaluate programs.

A) Needs and Resources: What are the underlying needs and conditions in the community?
Data Collection Plan Tool (p. 22)
A clear data collection plan helps ensure that data collection stays on track. The key components of a data collection plan include the following (an illustrative sketch follows the list):

  • risk/protective factors to be assessed
  • indicators used for measurement
  • how/where to get the data
  • persons responsible
  • dates by which key tasks are to be completed
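
To make these components concrete, here is a minimal sketch of one row of such a plan, written in Python. The factor, indicator, source, role and date shown are hypothetical examples, not values from the GTO report.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataCollectionItem:
    """One row of a data collection plan; fields mirror the list above."""
    factor: str       # risk/protective factor to be assessed
    indicator: str    # indicator used for measurement
    source: str       # how/where to get the data
    responsible: str  # person responsible
    due: date         # date by which the key task is to be completed

plan = [
    DataCollectionItem(
        factor="Youth physical inactivity",                 # hypothetical
        indicator="% of students active < 60 minutes/day",  # hypothetical
        source="Annual school health survey",               # hypothetical
        responsible="Program coordinator",
        due=date(2011, 9, 30),
    ),
]

for item in plan:
    print(f"{item.indicator} ({item.source}) -> {item.responsible} by {item.due}")
```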

B) Goals: What are the goals, target populations and objectives (i.e., desired outcomes) that will address the needs and change the underlying conditions?
The ABCDE Method for Writing Objectives (p. 36)

  • A - Audience (the population for whom the desired outcome is intended)
  • B - Behaviour (a clear statement of expected behaviour change)
  • C - Condition (what is the time frame for implementation and measurement?)
  • D - Degree (how much change is expected?)
  • E - Evidence (how will the change be measured?)
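
As an illustration, here is one hypothetical objective broken out against the five ABCDE fields; every value below is invented for the example and is not drawn from the GTO report.

```python
# A hypothetical objective expressed field-by-field with the ABCDE method.
objective = {
    "audience":  "adults aged 45-64 at risk of type 2 diabetes",  # A
    "behaviour": "will increase their weekly physical activity",  # B
    "condition": "within six months of program start",            # C
    "degree":    "by at least 30 minutes per week on average",    # D
    "evidence":  "as measured by self-reported activity logs",    # E
}

# Assemble the fields into a single objective statement.
statement = "{audience} {behaviour} {degree} {condition}, {evidence}.".format(**objective)
print(statement.capitalize())
```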

Goals Tool (p. 40)
This tool organizes program goals, objectives and intended populations, linking each goal to a specific objective and population.

C) Capacities: What organizational capacities are needed to implement the program?
Capacity Tools (p. 75-76)
The developers outline a four-step process to assess organizational capacity across human, technical, fiscal and structural/linkage capacities. The process includes identifying the specific aspects of capacity to be assessed, determining the requirements for each capacity item, determining whether capacity is sufficient and choosing strategies to enhance capacity (an illustrative sketch follows).
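
The following is a minimal sketch of that four-step process as a checklist, assuming a simple comparison of required versus current capacity; all capacity items, requirements and strategies are hypothetical.

```python
# Hypothetical capacity assessment:
# (capacity type, specific aspect, required, current, strategy if short).
assessment = [
    ("human",      "trained facilitators",    2,  1,  "recruit or train one more facilitator"),
    ("technical",  "evaluation expertise",    1,  1,  None),
    ("fiscal",     "annual budget ($1,000s)", 20, 15, "apply for a supplementary grant"),
    ("structural", "referral partnerships",   3,  3,  None),
]

for ctype, aspect, required, current, strategy in assessment:
    # Steps 1-3: the aspect, its requirement, and whether capacity is sufficient.
    if current >= required:
        status = "sufficient"
    else:
        # Step 4: choose a strategy to enhance the capacity.
        status = f"gap ({current}/{required}) -> {strategy}"
    print(f"{ctype:10s} | {aspect:24s} | {status}")
```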

D) Plan: What is the plan for this program?
Planning Tool (p. 87-92)
The Planning Tool helps users link specific program components to identified objectives and identify specific tasks within each component. The tool outlines the following elements of a program plan:

  • providing a program name and summary
  • identifying program components
  • identifying anticipated outputs
  • planning each program component
  • identifying target groups
  • choosing collaboration partners
  • maximizing program integration
  • identifying implementation barriers
  • completing summary checklists

E) Process: How will the quality of program implementation be assessed?
Process Evaluation Planning Tool (p. 96)
Several process evaluation questions are listed with their corresponding data collection activities:

  • Did the program follow the basic plan for service delivery?
  • What are the program characteristics?
  • What are participants' characteristics?
  • What is the participants' satisfaction with the program?
  • What is the staff's perception of the program?
  • What were the individual participants' dosages of the program?
  • What were the program components' levels of quality?

Implementation Tool (p. 110-112)
The Implementation Tool helps users determine whether the program was delivered according to the program plan. This tool elaborates on questions posed in the Process Evaluation Planning Tool, providing information on the following:

  • monitoring component outputs
  • planning activities for each program component
  • recruitment of target groups
  • collaboration partners
  • program integration

F) Outcomes: How well did the program work?
Comparisons of the Common Evaluation Designs (p. 123)
This table is useful for assessing the advantages and disadvantages of different outcome evaluation designs when deciding which design suits your program. The developers recommend a pre-post design with a comparison group; if that is not feasible, they recommend a simple pre-post design (a worked sketch follows the list). The following evaluation designs are assessed:

  • post only (measure outcomes after the program has been delivered)
  • pre-post (measure change by comparing baseline levels to outcomes after the program has been delivered)
  • retrospective pre-post (participants rate themselves at the end of the program on pre- and post-levels)
  • pre-post with comparison group (to compare changes in participants relative to a similar group who did not receive the intervention)
  • pre-post with control group (participants are randomly assigned to a control group or intervention group from the same target population)
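
To illustrate why the comparison-group design is preferred, here is a small worked sketch contrasting the two recommended designs; all outcome scores are fabricated for the example.

```python
from statistics import mean

# Fabricated outcome scores (e.g., a 0-100 knowledge test).
program_pre,    program_post    = [52, 60, 55, 58], [68, 75, 66, 72]
comparison_pre, comparison_post = [54, 59, 53, 57], [58, 63, 55, 61]

# Pre-post design: change observed in the program group alone.
program_change = mean(program_post) - mean(program_pre)

# Pre-post with comparison group: subtract the change that occurred
# anyway in a similar group that did not receive the program.
comparison_change = mean(comparison_post) - mean(comparison_pre)
adjusted_effect = program_change - comparison_change

print(f"Pre-post estimate:            {program_change:+.1f}")
print(f"Comparison-adjusted estimate: {adjusted_effect:+.1f}")
```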

Data Collection Methods at a Glance (p. 124-125)
This table highlights strengths and weaknesses of data collection methods, both quantitative and qualitative. While quantitative methods typically seek to answer who, what, where and how much, qualitative methods answer why and how.

Linking Design, Collection and Analysis at a Glance (p. 131)
This section links specific evaluation designs (post, pre-post and pre-post with comparison group or pre-post with control group) with the appropriate data collection method, data analysis method and the number of groups involved.

Outcome Evaluation Tool (p. 133-135)
The Outcome Evaluation Tool is designed to organize the following information (a brief analysis sketch follows the list):

  • summary of the needs assessment findings, including relevant risk and protective factors
  • the target group (including numbers)
  • objectives
  • measures used
  • evaluation design selected
  • sample size: the number of people assessed in the evaluation
  • data analysis method
  • pre- and post-scores and their differences
  • interpretation of the results
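
As one way to fill in the pre- and post-scores row, here is a short sketch that computes mean pre/post scores and their difference for paired participant data. The paired t-test shown is a common analysis choice, not one mandated by the GTO report, and the scores are fabricated (requires SciPy).

```python
from statistics import mean
from scipy.stats import ttest_rel  # paired-samples t-test

# Fabricated paired scores for the same six participants.
pre  = [48, 55, 51, 60, 57, 49]
post = [56, 61, 55, 66, 60, 54]

differences = [b - a for a, b in zip(pre, post)]
t_stat, p_value = ttest_rel(post, pre)

print(f"Mean pre-score:  {mean(pre):.1f}")
print(f"Mean post-score: {mean(post):.1f}")
print(f"Mean difference: {mean(differences):+.1f}")
print(f"Paired t-test:   t = {t_stat:.2f}, p = {p_value:.3f}")
```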

G) Continuous Quality Improvement: How will continuous quality improvement strategies be incorporated?
CQI Tool (p. 141)
Summarizing and using the information learned from planning, implementing and evaluating the program informs a continuous quality improvement strategy. The CQI Tool summarizes findings from earlier steps and helps determine how this information will inform changes to program implementation. The following questions are listed:

  • Have the needs of the target group/resources in the community changed?
  • Have the goals/desired outcomes/target population changed?
  • Are new and improved evidence-based/best practice technologies available?
  • Does the program continue to fit with your agency (both philosophically and logistically) and your community?
  • Have the resources available to address the identified needs changed?
  • How well did you plan? What suggestions do you have for improvement? 
  • How well was the program implemented? How well did you follow the plan you created? What were the main conclusions from the process evaluation?
  • How well did the program reach its outcomes? What were the main conclusions from the outcome evaluation?
Who is Involved

Using this resource requires the involvement of many individuals and resources, since evaluation and continuous quality improvement are long-term, ongoing processes. Individuals who may be involved include program directors, program managers, public health nurses, public health nutritionists, health promotion officers, program coordinators, research and evaluation specialists and project specialists.

Conditions for Use
Not specified
Evaluation and Measurement Characteristics
Evaluation
Information not available
Validity
Not applicable
Reliability
Not applicable
Methodological Rating
Not applicable
Method/Tool Development
Developer(s)

Matthew Chinman, PhD
Pamela Imm
Abraham Wandersman

RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA
15213

Method of Development
Getting to Outcomes 2004: Promoting Accountability Through Methods and Tools for Planning, Implementation, and Evaluation is one of three components designed to support practitioners in using evidence-based programs. The manual can be used in conjunction with face-to-face training and on-site technical assistance.

The purpose of the GTO approach is to enable organizations to modify their structure to better support the integration of evidence-based practices into routine operations. The overarching goal of the GTO approach is to transform organizations into learning organizations that are open to development and change.
Release Date
2004
Contact Person/Source
Matthew Chinman, PhD
RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA
15213
email: Matthew_Chinman@rand.org

Resources

Title of Primary Resource
Getting to Outcomes 2004: Promoting Accountability Through Methods and Tools for Planning, Implementation and Evaluation
File Attachment
None
Web-link
Reference

Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes™: Promoting accountability through methods and tools for planning, implementation, and evaluation. Pittsburgh, PA: RAND Corporation. Retrieved from http://www.rand.org/pubs/technical_reports/TR101.html

 

Type of Material
Report
Format
On-line Access
Cost to Access
None 
Language
English
Conditions for Use
Copyright © 2004 RAND Corporation 

Title of Supplementary Resource
Getting to outcomes: A results-based approach to accountability
File Attachment
None
Web-link
Reference
Wandersman, A., Imm, P., Chinman, M., & Kaftarian, S. (2000). Getting to outcomes: A results-based approach to accountability. Evaluation and Program Planning, 23, 389-395.
Type of Material
Journal article
Format
Periodical
Cost to Access
Journal article purchase 
Language
English
Conditions for Use
Copyright © 2000 Elsevier Science Ltd. 

Title of Supplementary Resource
The getting to outcomes demonstration and evaluation: An illustration of the prevention support system
File Attachment
None
Web-link
Reference
Chinman, M., Hunter, S. B., Ebener, P., Paddock, S. M., Stillman, L., Imm, P., et al. (2008). The getting to outcomes demonstration and evaluation: An illustration of the prevention support system. American Journal of Community Psychology, 41, 206-224.
Type of Material
Journal article
Format
Periodical
Cost to Access
Journal article purchase 
Language
English
Conditions for Use
Copyright © 2008 Springer Science+Business Media, LLC 

Title of Supplementary Resource
Getting to Outcomes 2004: Appendices
File Attachment
None
Web-link
Reference
Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes 2004: Appendices. Pittsburgh, PA: RAND Corporation. Retrieved from http://www.rand.org/pubs/technical_reports/TR101.html
Type of Material
Report
Format
On-line Access
Cost to Access
 
Language
English
Conditions for Use
Copyright © 2004 RAND Corporation 

Title of Supplementary Resource
Getting to Outcomes: 10 Steps for Achieving Results-Based Accountability
File Attachment
None
Web-link
Reference
Wiseman, S., Chinman, M., Ebener, P., Hunter, S., Imm, P., & Wandersman, A. (2007). Getting to outcomes: 10 steps for achieving results-based accountability (No. TR-101/2). Santa Monica, CA: RAND Corporation. Retrieved from http://www.rand.org/pubs/technical_reports/TR101.2/
Type of Material
Report
Format
On-line Access
Cost to Access
 
Language
English
Conditions for Use
Copyright © 2004 RAND Corporation 

Title of Supplementary Resource
Preventing Underage Drinking: Using Getting to Outcomes with the SAMHSA Strategic Prevention Framework to Achieve Results.
File Attachment
None
Web-link
Reference

Imm, P., Chinman, M., Wandersman, A., Rosenbloom, D., Guckenburg, S., & Leis, R. (2007). Preventing underage drinking: Using Getting to Outcomes with the SAMHSA strategic prevention framework to achieve results (No. TR-403-SAMHSA). Santa Monica, CA: RAND Corporation. Retrieved from http://www.rand.org/pubs/technical_reports/TR403/

 

Type of Material
Report
Format
On-line Access
Cost to Access
 
Language
English
Conditions for Use
Copyright © 2004 RAND Corporation 

