
Basic Guide to Program Evaluation (Including Many Additional Resources)

© Copyright Carter McNamara, MBA, PhD, Authenticity Consulting, LLC. Adapted from the Field Guide to Nonprofit Program Design, Marketing and Evaluation. This document provides guidance toward planning and implementing an evaluation process for for-profit or nonprofit programs. There are many kinds of evaluations that can be applied to programs, for example goals-based, process-based, and outcomes-based. In addition to the articles on this page, see the Library's blogs with posts related to program evaluation: the Business Planning Blog, the Building a Business Blog, and the Strategic Planning Blog. Note that much of the information in this section was gleaned from various works of Michael Quinn Patton.

Evaluation Toolkit At NSTTAC, we are committed to data-based decision making, and we view evaluation as a tool for improving our work. For some, evaluation and data analysis can seem an overwhelming task; we created the NSTTAC Evaluation Toolkit with that in mind. We want to assist transition educators and service providers in improving their programs and services by determining what is working, what is not working, and what needs changing or replicating. This toolkit will show you how. If you would like to make your own Evaluation Toolkit notebook, download and print the cover and spine label. NSTTAC Evaluation Toolkit (pdf, 469 KB; updated 8/2010); Section 2: NSTTAC Capacity Building Model (pdf, 26 KB); Section 3: Taxonomy for Transition Programming (pdf, 409 KB); Team Planning Tool for Improving Transition Education and Services (Word); Section 4: Student-Focused Planning (pdf); Section 5: Student Development (pdf, 23 KB); Section 7: Family Involvement (pdf, 28 KB).

Evaluations: Commission Guidance Documents Evaluating EU cohesion policy. Guidance Documents for the 2014–2020 funding period. Impact evaluation centre. In any programme, the crucial questions are "what do you want to change?" and "how would you know if you have changed it?" Impact evaluation in DG Regional Policy falls into two broad categories. The "theory-based" impact approach, which follows each step of the intervention logic and focuses on the mechanisms leading to the observed change, is particularly appropriate for answering the question "why?" The two approaches are complementary, and the most useful impact evaluations draw on a mix of methods: counterfactual methods to quantitatively estimate an impact, and theory-based methods to understand the underlying mechanisms and the context of an intervention, thus helping to modify or generalize it to other contexts. Counterfactual approach; Frequently asked questions (FAQs); Counterfactual impact evaluations of Cohesion Policy; Theory-based approach; Guidance Documents 2007-2013.
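The counterfactual idea above can be illustrated with a minimal sketch (not part of the Commission guidance; the data and function names are hypothetical): the estimated impact is the difference between outcomes observed in the supported group and outcomes in a comparable unsupported group standing in for the counterfactual.

```python
# Minimal illustration of a counterfactual impact estimate:
# impact = mean outcome of supported group - mean outcome of comparison group.
def average(values):
    return sum(values) / len(values)

def estimated_impact(treated_outcomes, control_outcomes):
    """Naive difference-in-means estimate of programme impact."""
    return average(treated_outcomes) - average(control_outcomes)

# Hypothetical firm-level outcomes (e.g. jobs created) after an intervention:
treated = [12, 15, 11, 14]
control = [10, 9, 11, 10]
print(estimated_impact(treated, control))  # prints 3.0
```

Real counterfactual evaluations must also ensure the comparison group is genuinely comparable (e.g. via matching or a discontinuity design); a raw difference in means is only the starting point.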

Indicators: Definition and Use in a Results-Based Accountability System. This brief defines and explores the role of indicators as an integral part of a results-based accountability system. The brief shows how indicators enable decision makers to assess progress toward the achievement of intended outputs, outcomes, goals, and objectives. About This Series: these short reports are designed to frame and contribute to the public debate on evaluation, accountability, and organizational learning. I. An indicator provides evidence that a certain condition exists or certain results have or have not been achieved (Brizius & Campbell, p. A-15). II. Indicators can measure inputs, process, outputs, and outcomes. Outcome indicators measure the broader results achieved through the provision of goods and services. III. Choosing the most appropriate indicators can be difficult. Does this indicator enable one to know about the expected result or condition? General Information on Indicators: Brizius, J.; Friedman, M. (1995, July). Information on Child and Family Indicators.

Evaluation Logic Model The logic model is at the center of UW-Extension Program Development. It displays the sequence of actions that describe what the program is and will do, and how investments link to results. We include five core components in this depiction of the program action: INPUTS: resources, contributions, and investments that go into the program. OUTPUTS: activities, services, events, and products that reach people who participate or who are targeted. OUTCOMES: results or changes for individuals, groups, organizations, communities, or systems. ASSUMPTIONS: the beliefs we have about the program, the people involved, and the context, and the way we think the program will work. EXTERNAL FACTORS: the environment in which the program exists, including a variety of external factors that interact with and influence the program action. In UW-Extension, we use the logic model in planning, implementation, evaluation, and communication.
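The five components above form a simple structure, sketched here as an illustrative data model (the class and field names are ours, not UW-Extension's):

```python
# Illustrative sketch of the five core logic-model components as a data structure.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list                 # resources, contributions, investments
    outputs: list                # activities, services, events, products
    outcomes: list               # results or changes, individual through system level
    assumptions: list = field(default_factory=list)       # beliefs about how it works
    external_factors: list = field(default_factory=list)  # surrounding environment

model = LogicModel(
    inputs=["staff time", "funding"],
    outputs=["workshops delivered"],
    outcomes=["improved participant knowledge"],
    assumptions=["participants attend regularly"],
    external_factors=["local economic conditions"],
)
```

The point of the structure is the left-to-right chain: a planner fills in inputs through outcomes, then checks the assumptions and external factors that could break the links between them.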

INTRODUCTION A Guide to Developing and Using Performance Measures in Results-based Budgeting. By Mark Friedman. Prepared for The Finance Project, May 1997. About the Author: Mark Friedman served for 19 years in the Maryland Department of Human Resources, including six years as the Department's chief financial officer. "Cheshire Cat," Alice began, "would you tell me, please, which way I ought to go from here?" -- Lewis Carroll. Hours after the last familiar sign, the driver kept up a steady pace. -- Anon. "Thank God we don't get the government we pay for." -- Will Rogers. Will Rogers' cynicism about the performance of government still captures a common, if not always constructive, part of public life at the end of the 20th century. The title of this paper contains a crucial distinction between two types of accountability: accountability for results and accountability for performance. This paper is part of a series of papers published by The Finance Project on the subject of results accountability.

American Evaluation Association

CIRCABC - 2014-2020

Outcome-based commissioning - Yorkshire & the Humber Joint Improvement Partnership - Developing Intelligent Commissioning. Basing all decisions on outcomes is a key principle for commissioners, although in many settings it remains aspirational. Most people are now working towards an outcomes-based approach to commissioning, and all will need to be aware that this is central to the government's approach to public expenditure commissioning. Outcome-based commissioning means putting in place a set of arrangements whereby a service is defined and paid for on the basis of a set of agreed outcomes. It means shifting the basis on which services are purchased and resources allocated from units of service provision (hours, days, or weeks of a given activity) for pre-defined needs to what is needed to ensure that the outcomes desired by service users are met. The development of commissioning for quality and outcomes, with payment linked to work done, was a vision of the Commissioning framework for health and wellbeing, published in 2007. What is an outcome? Individual outcomes - e.g.

Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) | Virginia Tech

Let's be frank about how much care we can - and cannot - afford. Investment in prevention and perfect efficiency will not free up the resources to meet all social care needs. The government's social care reform plans offer us the opportunity to be honest about that, says consultant Colin Slasberg. The level of funding for social care matters. However welcome investment in prevention, and however efficient services can be made through best practices, any suggestion that these measures between them will solve the chronic funding shortfall in social care is optimistic to the point of recklessness. However, the draft Care and Support Bill creates an opportunity to put the relationship between needs and funding on a basis that holds authentic promise for the future. The key is that councils will have both a duty to meet some needs that have been assessed, and a power to meet all the others. The White Paper supporting the bill suggests that how much need a council meets beyond the national minimum will be a matter of choice for councils.

Related:  Health Assessment