
Basic Guide to Program Evaluation (Including Many Additional Resources)

© Copyright Carter McNamara, MBA, PhD, Authenticity Consulting, LLC. Adapted from the Field Guide to Nonprofit Program Design, Marketing and Evaluation. This document provides guidance on planning and implementing an evaluation process for for-profit or nonprofit programs -- there are many kinds of evaluations that can be applied to programs, for example goals-based, process-based and outcomes-based. Sections of this topic include online guides, outcomes evaluation and general resources; also see Evaluations (many kinds) and related Library topics. In addition to the articles on this page, the Library's Business Planning, Building a Business and Strategic Planning blogs have posts related to program evaluation. Note that much of the information in this section was gleaned from various works of Michael Quinn Patton.

Indicators: Definition and Use in a Results-Based Accountability System This brief defines and explores the role of indicators as an integral part of a results-based accountability system. The brief shows how indicators enable decision makers to assess progress toward the achievement of intended outputs, outcomes, goals, and objectives. About this series: these short reports are designed to frame and contribute to the public debate on evaluation, accountability, and organizational learning. An indicator provides evidence that a certain condition exists or that certain results have or have not been achieved (Brizius & Campbell, p. A-15). Indicators can measure inputs, process, outputs, and outcomes; outcome indicators measure the broader results achieved through the provision of goods and services. Choosing the most appropriate indicators can be difficult: does this indicator enable one to know about the expected result or condition?
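As a concrete illustration only (a hypothetical Python sketch; the brief is conceptual and supplies no code), an indicator can be modeled as a named measure of a particular kind - input, process, output or outcome - with a target and an observed value:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Hypothetical record pairing a measure with its intended result."""
    name: str
    kind: str       # "input", "process", "output" or "outcome"
    target: float   # the intended result
    actual: float   # the observed result

    def achieved(self) -> bool:
        # Evidence that the intended result has, or has not, been reached.
        return self.actual >= self.target

# An outcome indicator for a hypothetical job-training program:
placement = Indicator(
    name="participants employed six months after exit",
    kind="outcome",
    target=0.70,
    actual=0.63,
)
print(placement.achieved())  # False: progress falls short of the target
```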

INTRODUCTION A Guide to Developing and Using Performance Measures in Results-based Budgeting By Mark Friedman Prepared for The Finance Project May 1997 About the author: Mark Friedman served for 19 years in the Maryland Department of Human Resources, including six years as the Department's chief financial officer. "Cheshire Cat," Alice began, "would you tell me, please, which way I ought to go from here?" -- Lewis Carroll Hours after the last familiar sign, the driver kept up a steady pace. -- Anon. "Thank God we don't get the government we pay for." -- Will Rogers Will Rogers' cynicism about the performance of government still captures a common, if not always constructive, part of public life at the end of the 20th century. The title of this paper contains a crucial distinction between two types of accountability: accountability for results and accountability for performance. This paper is part of a series of papers published by The Finance Project on the subject of results accountability.
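To make that distinction concrete (a hypothetical illustration, not drawn from the paper itself): results accountability attaches to population-level conditions that no single agency controls, while performance accountability attaches to measures of a particular program or agency.

```python
# Hypothetical measures only, for illustration.
# Accountability for results: population-level conditions of well-being.
results_measures = {
    "child poverty rate in the state": 0.17,
    "share of children entering school ready to learn": 0.62,
}

# Accountability for performance: how well one program does its work.
performance_measures = {
    "families served by the home-visiting program": 1240,
    "visits completed per worker per month": 14.5,
    "share of clients rating the service as helpful": 0.88,
}

for name, value in {**results_measures, **performance_measures}.items():
    print(f"{name}: {value}")
```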

Program Development and Evaluation – University of Wisconsin-Extension This course provides a holistic approach to planning and evaluating education and outreach programs. It helps program practitioners use and apply logic models – a framework and way of thinking that helps us improve our work and be accountable for results. One companion guide shows how to seamlessly import data from Qualtrics into MAXQDA, covering export options in Qualtrics, data cleanup in Excel, and import options in MAXQDA (by Ellen Bechtol & Christian Schmieder, Program Development & Evaluation Unit, UW-Cooperative Extension). Another shows how to pull a random sample from a MAXQDA dataset using the random cell function in Excel; a programmatic alternative is sketched below.
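The guide itself uses Excel's random cell function; as an alternative sketch, assuming the dataset has been exported to CSV (the file name, sample size and seed below are hypothetical), the same sample can be drawn programmatically:

```python
import pandas as pd

# Load the exported dataset (hypothetical file name).
data = pd.read_csv("maxqda_export.csv")

# Draw a reproducible random sample of 25 rows - analogous to assigning
# each row a random number in Excel and sorting by it.
sample = data.sample(n=25, random_state=42)

sample.to_csv("maxqda_sample.csv", index=False)
```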

InformalScience: informal learning projects, research and evaluation. Chapter 4: Questionnaire Design. No survey can achieve success without a well-designed questionnaire. Unfortunately, questionnaire design has no theoretical base to guide the marketing researcher in developing a flawless questionnaire; all the researcher has to go on is a lengthy list of do's and don'ts born of the experience of other researchers past and present. Hence, questionnaire design is more of an art than a science. This chapter is intended to help the reader understand the attributes of a well-designed questionnaire and adopt a framework for developing questionnaires. A brief account of the key attributes of a sound questionnaire serves as the opening section of the chapter. On exploratory questionnaires: if the data to be collected is qualitative or is not to be statistically evaluated, it may be that no formal questionnaire is needed. [Figure 4.1: The steps preceding questionnaire design]

Outcome-based commissioning - Yorkshire & the Humber Joint Improvement Partnership - Developing Intelligent commissioning Outcome-based commissioning Basing all decisions on outcomes is a key principle for commissioners, although in many settings it remains aspirational. Most people are now working towards an outcomes-based approach to commissioning and all will need to be aware that this is central to the government’s approach to public expenditure commissioning. Outcome-based commissioning means putting in place a set of arrangements whereby a service is defined and paid for on the basis of a set of agreed outcomes. It means shifting the basis on which services are purchased and resources allocated from units of service provision (hours, days or weeks of a given activity) for pre-defined needs to what is needed to ensure that the outcomes desired by service users are met. The development of commissioning for quality and outcomes, with payment linked to work done, was a vision of the Commissioning framework for health and wellbeing, published in 2007. What is an outcome? Individual outcomes - e.g.
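A minimal numerical sketch of that shift, with entirely hypothetical figures and outcome names: under unit-based purchasing the payment follows the hours delivered, while under outcome-based commissioning it follows the agreed outcomes actually achieved.

```python
# Unit-based purchasing: pay per unit of service provision.
hours_delivered = 120
rate_per_hour = 18.0
unit_based_payment = hours_delivered * rate_per_hour  # 2160.0

# Outcome-based commissioning: pay against agreed outcomes actually met.
agreed_outcomes = {
    "living independently at home": True,
    "reduced hospital admissions": True,
    "sustained social participation": False,
}
payment_per_outcome = 900.0
outcome_based_payment = payment_per_outcome * sum(agreed_outcomes.values())

print(unit_based_payment, outcome_based_payment)  # 2160.0 1800.0
```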

National Technical Assistance Center on Transition Selected Evaluation Terms Evaluation terminology can be confusing to even the most seasoned of evaluators. This resource provides commonly accepted definitions of evaluation terms frequently used in the out-of-school time field. For "real life" examples of how these terms are used, check our Out-of-School Time Evaluation Database, currently offering detailed evaluation profiles of over 20 out-of-school time programs nationwide. Accountability: a public or private agency, such as a state education agency, that enters into a contractual agreement to perform a service, such as administering 21st Century Community Learning Center programs, will be held answerable for performing according to agreed-on terms, within a specified time period, and with a stipulated use of resources and performance standards. Benchmark: (1) an intermediate target to measure progress in a given period using a certain indicator; (2) a reference point or standard against which to compare performance or achievements.

Survey Research Design - How to Conduct Surveys The survey research design is often used because of its low cost and easily accessible information. Introduction: conducting accurate and meaningful surveys is one of the most important facets of market research in the consumer-driven 21st century. Businesses, governments and media spend billions of dollars on finding out what people think and feel. Accurate research can generate vast amounts of revenue; bad or inaccurate research can cost millions, or even bring down governments. The survey research design is a very valuable tool for assessing opinions and trends. Television chat shows and newspapers are usually full of facts and figures gleaned from surveys, but often no information is given as to where the figures come from or what kind of people were asked. A cursory examination usually shows that the results are often manipulated or carefully sifted to distort them to match the whims of their owners. Methodologies include face-to-face and mail surveys.
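One standard sanity check when such figures are quoted without methodological detail is the margin of error for a reported proportion - a textbook formula, not something the article itself supplies:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A headline claim of "54% agree" from 100 respondents carries roughly
# a +/- 9.8 point margin of error; from 1,000 respondents, about 3.1 points.
print(round(margin_of_error(0.54, 100), 3))   # 0.098
print(round(margin_of_error(0.54, 1000), 3))  # 0.031
```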

Let's be frank about how much care we can - and cannot - afford Investment in prevention and perfect efficiency will not free up the resources to meet all social care needs. The government's social care reform plans offer us the opportunity to be honest about that, says consultant Colin Slasberg. The level of funding for social care matters. However welcome investment in prevention may be, and however efficient services can be made through best practice, any suggestion that these measures between them will solve the chronic funding shortfall in social care is optimistic to the point of recklessness. However, the draft Care and Support Bill creates an opportunity to put the relationship between needs and funding on a basis that holds authentic promise for the future. The key is that councils will have both a duty to meet some needs that have been assessed, and a power to meet all the others. The White Paper supporting the bill suggests that how much need a council meets beyond the national minimum will be a matter of choice for councils.
