What will your program achieve? How will you tell?
Evidence-based Program Evaluation
Updated December 14, 2011
Mary Campbell-Zopf, Ohio Arts Council, [email protected]
Craig Dreeszen, Ph.D., Dreeszen & Associates, [email protected]
1
Your Instructors
2
Mary Campbell-Zopf is deputy director at the Ohio Arts Council and has been with the agency since 1989. Between 1989 and 2011, Ms. Campbell-Zopf helped secure $12 million for state-level arts, arts education, and international programming. She also played a central role in an agency-wide effort to expand the OAC’s International Program through a $1.2 million USDE grant, for which she served as the evaluation manager and participated in professional development activities with educators and arts administrators in Chile.
In 2006, Campbell-Zopf oversaw development of The Appreciative Journey: A Guide to Developing International Cultural Exchanges. Ms. Campbell-Zopf has an enduring interest in strategic planning and program evaluation, which led to the publishing of the seven-volume series, Focusing the Light: the Art and Practice of Planning in 2009.
Ms. Campbell-Zopf has also been active at the state and national levels in arts education, including serving on numerous state advisory committees for content standards, curriculum, learner assessment, as well as on national advisory committees for the U.S. Department of Education, the National Endowment for the Arts, the National Assembly of State Arts Agencies and the Arts Education Partnership.
Craig Dreeszen directs Dreeszen & Associates, a consulting firm in Northampton, Massachusetts. He is a planner, educator, program evaluator, and organizational development consultant. For twelve years he directed the Arts Extension Service at the University of Massachusetts.
Since 1986 he has evaluated 30 programs for foundations and state arts agencies and provided strategic planning support to over 50 public and nonprofit organizations, mostly community-based cultural organizations and the agencies that fund and support them.
Dr. Dreeszen is a contributing editor of Fundamentals of Arts Management, 5th Edition, and author of its chapters “Program Evaluation,” “Strategic Planning,” and “Board Development.” He also wrote the online courses “Strategic Planning” and “Outcome-Based Program Evaluation,” as well as other arts management books and articles.
Craig Dreeszen earned his Ph.D. in regional planning and M.A. in organizational development from the University of Massachusetts Amherst.
Our intended learning outcomes
• Understand logic and language of evaluation
• Define outcomes and indicators
• Design feasible program evaluations
• Answer funders’ questions: “How will you know if your proposed project will succeed?” and “What results did your project achieve?”
Muse Machine artist Michael Bashaw, Thom Meyer photographer
3
Evaluation Part I Outline
• Why evaluate
• Link planning and evaluation
• Evaluation types
• Monitoring activities and measuring outcomes
• Logic of evaluation
• Language of evaluation
• Evaluation questions
• Outcomes and indicators
4
5
Benefits of Evaluation
• Greater insight into people's needs and interests
• Increased efficiency, economy, and effectiveness
• Improved visibility leading to greater public support
• Increased funding
6
An Effective Evaluation
• Structured on defined evaluation questions and information needs
• Logical, defined method of collecting data
• Unbiased and culturally sensitive
• Positive and useful process
Why Evaluate Programs?
• Be accountable
– Funders expect you to deliver results as promised
– “If your project is funded, how will you know if it succeeds?”
• Improve programs
– Use evaluation to tell what is working and what needs improvement
– Measuring results is critical to program planning
7
Pre-test
How confident are you right now that you could answer the funder’s question:
“If we fund your project, how will you know if it is successful?”
1. Confident
2. Somewhat confident
3. Not very confident
4. Not at all confident
9
Evaluation in plain English
What changes will your program make in the world? = Outcomes
What evidence would you accept? = Indicators
Did your program make the difference you intended?
= Outcome-based program evaluation
10
Planning and Evaluation
• Planning determines goals and objectives
• Outcome-based evaluation measures achievement of planned objectives
11
Planning and Evaluation
Planning → plan objectives; Evaluation → observe outcomes
Just like financial management:
Budgeting → project budget; Accounting → report actual revenues & expenses
12
Build evaluation into your plans
"Would you tell me which way I ought to go from here?" asked Alice.
"That depends a good deal on where you want to get," said the Cat.
"I really don't care where," replied Alice.
"Then it doesn't much matter which way you go," said the Cat.
Lewis Carroll
13
Evaluation Defined
Evaluation
The process of gathering objective evidence about a program and using that evidence to make judgments that guide decisions.
14
Girl Mask, OAC AIR artist Kate Kern
Assessment Defined
Learner assessment
The process of describing, collecting, recording, scoring, and analyzing information about student knowledge, skills and dispositions against instructional objectives and standards of quality.
15
OAC AIR artist Joshua Brown teaches dance residency
This is Not a Pipe by Rene Magritte
Don’t mistake evaluation for reality. It is a flawed approximation, so keep your perspective.
16
Two likely situations
Program has clear objectives
– Determine extent to which objectives are achieved
– Note unanticipated outcomes
Program objectives are not clear
– Discover outcomes
– Clarify objectives for planning
17
Two kinds of program evaluations
Formative
– in-progress evaluation for better programming
– may be informal
– often done internally
– may be oral
Summative
– final report for accountability
– more formal
– often an outside evaluator
– usually written
18
What can be evaluated?
Observable outcomes -- tangible results
Indicators -- evidence of intangible results
Not long-range or general goals
Exceptions:
Some evaluations are not feasible, given time and $
“Goal” may be used to describe almost any intention
22
Glossary
• Outcome: a specific result attributable to your program - a specific benefit to participants (delight audiences)
• Activity: what program participants do to achieve outcome (attend concert)
• Outputs: immediate products of activities that may not be a specific benefit to participants (present 2 concerts for 400 people)
• Indicators: measurable evidence of outcome achievement (positive response on exit poll)
23
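The four glossary terms can be sketched as a small data structure, using the slides' own concert example. This is a hypothetical illustration for readers who think in code; none of it comes from the presentation.

```python
# A minimal sketch (assumption: one record per program) of the glossary's
# four terms: activity, outputs, outcome, and indicators.
from dataclasses import dataclass, field

@dataclass
class ProgramLogic:
    activity: str                 # what participants do (attend concert)
    outputs: list[str]            # immediate products of the activity
    outcome: str                  # the specific benefit to participants
    indicators: list[str] = field(default_factory=list)  # measurable evidence

concert = ProgramLogic(
    activity="Attend concert",
    outputs=["Present 2 concerts for 400 people"],
    outcome="Delight audiences",
    indicators=["Positive response on exit poll"],
)
print(concert.outcome)  # Delight audiences
```

The structure makes the slide's distinction concrete: outputs describe what the program produced, while the outcome names the benefit and the indicators name the evidence for it.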
Near-equivalent terms
Outcome: objective (in planning), short-term outcome, intermediate outcome, long-term outcome, impact, result, performance expectation
Indicator: evidence, performance measure, metrics
Activity: output, task, product
24
Writing outcomes
If the goal is: “Connect elders’ learning experiences with heritage through the arts.”
What’s wrong with this outcome?“A guest historian will use theatrical techniques in a three-part workshop.”
25
Writing objectives as outcomes
Objective framed as activity: “A guest historian will use theatrical techniques in a three-part workshop for elders.”
If your draft plan describes activities only, add the intended result of these activities. “We do ____ to achieve ____.”
If you don’t do this when planning programs, you must do so later when evaluating outcomes.
Objective framed as outcome:“A guest historian will use theatrical techniques in a three-part workshop to help elders illuminate their personal stories through the heritage of their times.”
26
Activity yields outcome
Goal “Use the arts to transform learning.”
Activity“Poet in residence works with senior center residents to write reflective journals.”
Intended outcome:“Residents will improve writing skills through journal writing and poetry.”
27
Indicators are evidence of outcome
Outcome:
– Participants will encounter the poet in residence and write reflective journals
Indicators:
– Instructors will report the number of participants who produced journals
– Instructors read journals and observe the number that are reflective
– May use a rubric to standardize observations
28
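As a hypothetical illustration (the journal data are invented), the two indicators on this slide reduce to simple counts:

```python
# Invented data: each journal is marked reflective (True) or not (False)
# by the instructor.
journals = {"Ada": True, "Ben": False, "Cy": True, "Dee": True}

produced = len(journals)             # indicator 1: journals produced
reflective = sum(journals.values())  # indicator 2: journals judged reflective
share = reflective / produced
print(f"{produced} journals, {reflective} reflective ({share:.0%})")
# prints: 4 journals, 3 reflective (75%)
```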
Rubrics
Rubric for Outcome: Students learn dance step
1. Not yet - little evidence student can accomplish step
2. Almost - student exhibits technique but with weakness
3. Meets standard - student exhibits technique
4. Exceeds standard - meets standard with enthusiasm
29
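Once observations are scored against the rubric, summarizing them is mechanical. A minimal sketch with invented student scores:

```python
# The slide's 1-4 dance-step rubric as a lookup table.
RUBRIC = {
    1: "Not yet",
    2: "Almost",
    3: "Meets standard",
    4: "Exceeds standard",
}

scores = [3, 2, 4, 3, 1]  # invented observations, one per student
meets_or_exceeds = sum(1 for s in scores if s >= 3)
print(f"{meets_or_exceeds} of {len(scores)} students meet the standard")
# prints: 3 of 5 students meet the standard
```

This is what the slide means by standardizing observations: two instructors using the same rubric should assign the same level to the same performance.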
Relax
• You don’t have to evaluate every program– Start with one
• You don’t have to measure every outcome– Start with a few
• Reject indicators too expensive or time-consuming to measure
– Gather data in the routine management of programs
30
Challenging evaluations
• If contact with beneficiaries is short term
– measure short-term outcomes and outputs
• If outcomes will take a long time to achieve
– measure progress
• If outcomes are intangible -- joy, for example
– use qualitative measures -- sample opinions
31
Some results more profound than others
Output: How many participated?
Satisfaction: Did they enjoy the program?
Short-term outcome: Did they learn what was intended?
Long-term outcome: Did they use new learning?
Impact: What impact on lives, on community?
32
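The five levels above form an ordered scale, from shallow counts to deep change. A hypothetical sketch of that ordering:

```python
# The slide's five result levels, shallowest to deepest (an assumed
# encoding, not from the presentation).
LEVELS = ["output", "satisfaction", "short-term outcome",
          "long-term outcome", "impact"]

def depth(level: str) -> int:
    """0 = shallowest (output), 4 = deepest (impact)."""
    return LEVELS.index(level)

print(depth("impact") > depth("satisfaction"))  # deeper levels rank higher
```

An evaluation plan might note the deepest level it can feasibly measure, per the "Relax" and "Challenging evaluations" slides.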
Proving results
It is easier to report outcomes than to prove your program caused them.
• Cause and effect may be difficult to prove
• Other factors may have influenced outcomes
• You can report a correlation
– We ran the program and observed this result…
Cleveland Public Theatre Y-Haven Project Performance
33
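The slide's advice about reporting a correlation rather than claiming cause can be illustrated with a sketch (pre/post scores are invented; the calculation shows an observed change, not proof that the program caused it):

```python
# Invented pre- and post-program rubric scores for the same participants.
pre = [2.1, 2.4, 1.9, 2.0]
post = [3.0, 3.2, 2.5, 2.8]

# Average change, reported as an observation, not a causal claim.
change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"Average score rose by {change:.2f}; "
      "we ran the program and observed this result.")
```

Wording the finding this way ("we ran the program and observed this result") reports the correlation honestly while leaving open that other factors may have contributed.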