
IWIP PARTICIPATORY PROGRAM EVALUATION


Placement Student:

Mercedes Sharpe Zayas (MScPl Candidate)

Participants:

Sara Abdulhussain

Vessey George

Soraya Hoggarth-Bryan

Adilya Ibragimova

Amal Kanafani

Nipa Kar

Paridokht Shahcheraghi

Irum Siddiqui

Supervisors:

Alfred Jean Baptiste (TCCLD)

Charles Levkoe (University of Toronto)


TABLE OF CONTENTS

Evaluation Brief 4

Context 5
  Regent Park
  TD Centre of Learning
  Immigrant Women's Integration Program

Participatory Program Evaluation 8
  Theoretical Background
  Module Overview
  Toolkit

IWIP Participatory Evaluation 16
  Results
  Concerns

Recommendations 25

Conclusion 27

Appendix: Module Content 29


EVALUATION BRIEF

The TD Centre of Learning is interested in finding ways to cyclically evaluate its programs. The process of measuring the Centre's success is critical in ensuring that the long-term goals of the Centre—increasing social inclusivity and encouraging community engagement—are met. This requires a complex evaluation method and close collaboration with key stakeholders. A participatory evaluation process will allow the Centre to define its goals, measure its success, and refine its programming on a regular basis. This process will also help the Centre continue to meet the shifting needs of Regent Park's social landscape.

Last year, a graduate studio group from Ryerson University's School of Urban and Regional Planning completed a report that outlined the development of a logic model (consulting with the Centre's staff, participants, and decision-makers in order to define its goals and describe how its activities are believed to achieve those goals) and an evaluation tool to measure the Centre's success in achieving its goals. The reflections from the student report confirmed the validity of the Centre's goals and the positive influence that the Centre is having on participants.

For my placement, I developed one of the last stages of the Ryerson report by collaborating with participants and facilitators of the Immigrant Women's Integration Program (IWIP) to develop and implement a participatory program evaluation strategy. Following in the tradition of Participatory Action Research (PAR) (Reardon 1998; Burgess 2006; Cahill 2007), a participatory evaluation model was developed and conducted as a module of the IWIP program. This process involved creating a collective Theory of Change for the program, conducting a participatory evaluation in class, completing evaluation surveys for quantitative data, and taking part in a film project for qualitative data. The purpose of this participatory evaluation model with IWIP is to equip the women with the necessary skills to conduct a program evaluation in their own placements, while simultaneously providing a case study for the larger task of developing a program evaluation for the TD Centre of Learning.


CONTEXT

REGENT PARK

Canada's first and largest public housing estate is currently undergoing a socially mixed revitalization project.

Central to the theory of social mix is the notion that reducing the area's concentration of poverty by diversifying the local economy will increase social opportunities for all residents. Yet the mixed-income housing strategy has been a contentious process. As critical housing scholar Martine August (2014, 1330) has remarked, "socially mixed public housing redevelopment does not tackle the causes of advanced marginalization and enduring inequality" that have daily impacts on the lives of public housing tenants. These inequalities include ongoing constraints on labour market opportunities, housing maintenance funds, and social safety nets. Therefore, social strategies are crucial to ensure that the extant social support networks are not compromised by the area's revitalization, but rather fostered and empowered.


TD CENTRE OF LEARNING

In 2006, the Regent Park Neighbourhood Initiative's study, Embracing a Changing Landscape: A Community Effort in Planning for a New Regent Park, ranked education as one of the community's top three social priorities for revitalization. The Daniels Centre of Learning, now known as the TD Centre of Learning, was created to address this challenge by providing academic, recreational, skill-building, and leadership programs to residents of Regent Park. Since it began operating in 2010, the Centre has worked towards educating, connecting, and empowering residents from different cultural backgrounds, faiths, and generations. The programs provide participants with the skills and knowledge to enhance their employability, health, and confidence in navigating civil society.

The Centre provides a wide range of programs, classes, and activities, including but not limited to: academic upgrading, language classes, computer classes, leadership training, speaker series, health and wellness programs (e.g. yoga, nutrition), skills workshops, and mentoring courses. The TD Centre of Learning not only develops participants' skills and knowledge, but it also works to expand and strengthen residents' social connections. This joint focus helps to build more equitable and inclusive structures for the residents of Regent Park by fostering subjective and collective well-being. The Centre's adult education initiatives aim to help residents access and actively participate in the shifting social and economic life of their community.

EXPANDING AND STRENGTHENING SOCIAL CONNECTIONS
FOSTERING SUBJECTIVE AND COLLECTIVE WELL-BEING
DEVELOPING SKILLS AND KNOWLEDGE


IMMIGRANT WOMEN’S INTEGRATION PROGRAM

The Immigrant Women’s Integration Program (IWIP) is a one-year adult education initiative that actively seeks to engage recently immigrated women within their communities across Toronto, enhance their research and leadership skills, and ultimately improve social inclusivity.

The unique aspect of this program is its emphasis on integration as a mechanism for engaging with diversity. As one of the trainees observed, "community engagement does not mean simply learning how to support members of the same ethnic background, but rather a broader sense of embracing people of different ethnicities, different cultures, different sexual orientations, different religious and political perspectives." In essence, these women are training in how to combat, and refrain from disclosing, prejudice, bias, and discrimination.

INTEGRATION


PROGRAM EVALUATION

Non-profit and non-governmental organizations have taken on a critical role in the delivery of social services. The welfare reforms of the 1990s in the US and Canada greatly restructured social service delivery through the processes of devolution and privatization, which shifted responsibility to community agencies at the local level (Trudeau 2008). Such state restructuring involved dismantling public programs, downloading public responsibility to lower tiers of government, and privatizing accountability and responsibility (ibid.). Non-profits have the ability to intervene in this process by increasing accessibility and fostering democratic support for citizens' (or non-citizens') everyday lives. Yet community-level agencies are also subject to high competition over sustainable grants and risk having their objectives compromised by the agencies that provide funding.

One potential mechanism to evade these negative consequences is program evaluation. Program evaluation has become a commonplace method for learning about the implementation of social policies and programs, their intended and unintended effects, and for whom and under what circumstances they are most effective (Mark, Henry, and Julnes 2000). The primary role of evaluation is to collect data that enhances and supplements the efforts of democratic actors as they seek social betterment, thereby making it a key mechanism for ensuring equity and social change (ibid.). An evaluation can consider the political and economic context of a program, the implementation of a program, or the outcome of a program. Regardless of the approach taken, program evaluations are vital for demonstrating the value of a program to funding bodies and targeting areas of improvement to ensure better delivery of services.


PARTICIPATORY ACTION RESEARCH

For the purposes of evaluation at the TD Centre of Learning, a participatory approach was developed with the Immigrant Women's Integration Program (IWIP). Participatory program evaluation develops from the field of participatory action research (Whyte 1991; Hall 1993), which aims to dismantle the asymmetrical power dynamics traditionally associated with conducting research by working in collaboration with a community. Rather than reinforcing the professional-expert model, which restricts community input, participatory evaluation regards community members as experts whose lived experiences and problem-solving capacities will help improve the long-term impact of a program.

PARTICIPATORY VS. CONVENTIONAL EVALUATION

Who conducts the evaluation?
Participatory: Community residents, project staff, and other stakeholders
Conventional: Program managers, professional evaluators, and outside experts

What are the benefits?
Participatory: Local knowledge; builds knowledge, skills, and relationships among community residents and stakeholders
Conventional: Independent judgment; standardized indicators allow comparison with other research findings

What are the costs?
Participatory: Time, energy, and commitment from local residents; training, skill development, and support of many players
Conventional: Consultant and expert fees; loss of critical information that only stakeholders can provide


As Kenneth Reardon (1998, 59) describes,

"Participatory action research focuses on the information and analytical needs of society's most economically, politically, and socially marginalized groups and communities, and pursues research on issues determined by the leaders of these groups. It actively involves local residents as co-investigators on an equal basis with university-trained scholars in each step of the research process, and is expected to follow a nonlinear course throughout the investigation as the problem being studied is 'reframed' to accommodate new knowledge that emerges. This research generally requires the examination of a number of research questions in a serial fashion, and is best accomplished through research designs that combine quantitative and qualitative methods."


While this type of research takes time, commitment, and effective communication between everyone involved, the TD Centre of Learning has strong precedents for participatory evaluation. The Ryerson report (2014) identified that the Centre already offers programs that teach participants survey design, data collection, and analysis, such as the development of Community Resource and Needs Assessment Reports in IWIP.

Therefore, a participatory program evaluation module was designed and implemented with the IWIP trainees. The module taught an overview of program evaluation, along with data collection and analysis methods, to help the trainees develop evaluation skills for their community service sector internships. The module was participatory in nature because it actively involved the trainees in collecting data and analyzing evidence about the program through in-class exercises. Although the pilot of the module focused on evaluating the IWIP program itself, this module can eventually be extended to the Centre as a whole.


PROGRAM EVALUATION MODULE

OVERVIEW

Participants will learn how to design and implement program evaluations for non-profit organizations. The evaluation module will focus on process, frameworks, indicators, data collection, and the presentation of findings. The exercises will involve practical and participatory applications of the lessons learned, in order to evaluate the Centre's programs.

GOALS

To provide participants with the knowledge and application of evaluation and assessment tools in order to measure program impact. Additionally, the participatory approach aims to actively engage the trainees as stakeholders in the Centre's evaluation, in order to increase commitment to community engagement.


MODULE SECTIONS

1. PURPOSE AND PROCESS
Performance Measures: Demonstrate understanding of why program evaluation is essential to program growth in the current socioeconomic context of service provision, as well as the process of understanding how to evaluate. A participatory framework will be emphasized as a method for enhancing community engagement.

2. EVALUATION FRAMEWORK AND OUTCOMES
Performance Measures: Demonstrate understanding of how to design and prepare an evaluation framework through participatory exercises. Trainees will focus on collectively editing a program description, developing a participatory theory of change, and writing an evaluation brief.

3. DATA COLLECTION/DOCUMENTATION
Performance Measures: Demonstrate understanding of critically developing indicators and collecting both quantitative and qualitative data. Students will explore evaluation tools such as surveys, focus groups, and alternative methods for collecting data.

4. ANALYSIS AND FINDINGS
Performance Measures: Demonstrate understanding of data interpretation and the presentation of findings. The students will present the findings of their program evaluation and develop recommendations.


PROGRAM EVALUATION TOOLKIT

THEORY OF CHANGE

A Theory of Change (TOC) is a backwards-mapping tool that identifies the long-term goal of a program, and then defines the inputs, outputs, and outcomes necessary to reach that goal. Central to this model is the articulation of the assumptions hidden within the causal steps of each branch, and the identification of interventions needed to address those assumptions.

Free software is available on the Theory of Change website.

[Figure: Theory of Change diagram. A goal (impact) at the top branches down through outcomes and outputs to inputs, with assumptions and interventions noted along the branches.]
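For readers who want to record a Theory of Change branch programmatically, the following is a minimal sketch, assuming a simple Python representation of one branch. The node fields, example labels, and the print_branch helper are illustrative only and are not part of the Theory of Change software mentioned above.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TOCNode:
    """One box in a Theory of Change branch: an input, output, outcome, or the goal."""
    label: str
    level: str                                              # "input", "output", "outcome", or "impact"
    assumptions: List[str] = field(default_factory=list)    # hidden causal assumptions on this link
    interventions: List[str] = field(default_factory=list)  # actions meant to address those assumptions
    leads_to: Optional["TOCNode"] = None                    # the node this one is expected to produce

# Backwards mapping: start from the long-term goal, then attach the outcomes,
# outputs, and inputs needed to reach it (labels are hypothetical).
impact = TOCNode("Community empowerment", "impact")
outcome = TOCNode(
    "Greater understanding of the community", "outcome",
    assumptions=["Survey indicators will reflect community needs"],
    interventions=["Adapt survey questions after hypothesis testing"],
    leads_to=impact,
)
output = TOCNode("CRNA Report", "output", leads_to=outcome)
inputs = TOCNode("CRNA module (facilitators, training time)", "input", leads_to=output)

def print_branch(node: TOCNode) -> None:
    """Walk one branch from the inputs up to the goal, listing assumptions and interventions."""
    while node is not None:
        print(f"{node.level.upper():8} {node.label}")
        for a in node.assumptions:
            print(f"         assumption:   {a}")
        for i in node.interventions:
            print(f"         intervention: {i}")
        node = node.leads_to

print_branch(inputs)

Reading the branch from inputs up to the impact mirrors the backwards-mapping exercise: each causal link is only as strong as the assumptions attached to it.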


SURVEYS
Primarily quantitative, but can also collect qualitative data through open-ended or free-response questions.
Advantages: Easy to collect, compare, and analyze; participant anonymity
Disadvantages: Possible response or sample bias; potential for low response rate
Recommended Software: QuestionPro or Google Forms

DOCUMENT ANALYSIS
Primarily quantitative review of program applications, finances, memos, minutes, etc. Can also collect qualitative data in the form of documented narratives.
Advantages: Provides comprehensive and historical information without interrupting the program
Disadvantages: Information may be incomplete and restricted

FOCUS GROUPS
Primarily qualitative, but can also collect quantitative data by numerically coding participant responses or observations using "dotmocracy."
Advantages: Collects common impressions; efficient range and depth of information in a short time
Disadvantages: Requires a well-trained facilitator; difficult to analyze

CASE STUDIES
Primarily qualitative, but can also collect quantitative data by coding observations, conducting surveys, or analyzing documents.
Advantages: In-depth interviews and longitudinal observations fully depict participants' experience in the program
Disadvantages: Depth rather than breadth; time-consuming


RESULTS

The participatory program evaluation module was conducted over two five-hour sessions with the IWIP trainees in January of 2015. The module was complemented by a Theory of Change module in November of 2014, and a film project in March of 2015 that documented the women's narratives as recent immigrants, as members of their communities, and as IWIP trainees.

This year's program began with 8 trainees, yet 3 discontinued their studies over the course of the year for personal reasons (as indicated by the smaller points).


THEORY OF CHANGE

As part of the program evaluation module, the trainees created several branches of a participatory Theory of Change (TOC) model for IWIP. The TOC developed from the logic model set forth in the Ryerson Report (2014), and was extremely informative for developing key evaluation questions. The participatory nature of the process, however, was highly labour intensive and, due to time constraints, focused only on the Community Resource and Needs Assessment (CRNA) Report. This indicates that the participatory process of developing the TOC should be treated as a work-in-progress as the students move through the different modules of the program.

[Figure: Participatory Theory of Change branch for the CRNA module]
Inputs: CRNA module
Output: CRNA Report
Outcomes: Gain skills in data collection, analysis, and presentation; greater understanding of the community
Impact: Community empowerment
Assumptions: Survey indicators (Urban HEART) will reflect community needs; women gain community knowledge and can implement change
Interventions: Use hypothesis testing to reflect on community problems and adapt survey questions; share CRNA Report findings with community members or city representatives
Indicators: Issues are reflected in survey results; recommendation is investigated and/or implemented


KEY EVALUATION QUESTION

Is the program reaching its goals of improving well-being, increasing social inclusivity, and encouraging trainees to become active in their communities?

SURVEY

The pilot program evaluation survey was developed by a workshop of Ryerson students last year. The pilot survey was coded to reflect the Ryerson logic model (2014) and according to the Australian Bureau of Statistics guidelines for measuring social capital indicators.

SOCIAL CAPITAL

Social capital is a contested term, most intricately tied to Pierre Bourdieu's (1986) critique of its role in reproducing structures of privilege and relationships of power. However, given the current neoliberal condition of service provision and the increasing pressure and competition for external funding, it has become strategic for non-profits to match the financial language of funding bodies in order to demonstrate a social return on investment. Social capital indicators have therefore become a standardized form for measuring and evaluating social networks.
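As a rough illustration of what coding survey responses against social capital indicator domains can look like in practice, here is a hypothetical Python sketch. The question keys, domain groupings, and response values are invented for the example and do not reproduce the actual IWIP survey or the ABS framework categories.

# Hypothetical sketch: grouping Likert-scale answers (1-5) into indicator
# domains and averaging each domain across submitted surveys.
from statistics import mean

DOMAINS = {
    "well_being": ["q_self_esteem", "q_optimism", "q_stress_management"],
    "civic_engagement": ["q_community_leadership", "q_political_interest"],
}

responses = [  # one dict per completed survey
    {"q_self_esteem": 4, "q_optimism": 5, "q_stress_management": 3,
     "q_community_leadership": 4, "q_political_interest": 2},
    {"q_self_esteem": 5, "q_optimism": 4, "q_stress_management": 4,
     "q_community_leadership": 5, "q_political_interest": 3},
]

def domain_scores(surveys):
    """Average the items belonging to each domain across all submitted surveys."""
    scores = {}
    for domain, items in DOMAINS.items():
        values = [s[item] for s in surveys for item in items if item in s]
        scores[domain] = round(mean(values), 2)
    return scores

print(domain_scores(responses))
# e.g. {'well_being': 4.17, 'civic_engagement': 3.5}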


MEASURES OF WELL-BEING

The survey was conducted with the five remaining trainees (above), as well as one of the participants who discontinued her studies to pursue a graduate degree (below). A comparison of the two distributions suggests that the Centre has a strong impact on self-esteem, optimism, health, and reduced stress for those who continue with the program. By contrast, the self-management categories are not necessarily tied to the Centre, but rather to individual choice.


MEASURES OF CIVIC ENGAGEMENT AND SOCIAL INCLUSIVITY

A comparison of the distributions of the current trainees (above) and a former trainee (below) suggests that the Centre has a strong impact on capacity, action, and confidence in community leadership. According to the ABS framework, this correlates with network transactions in common action and negotiation. By contrast, current trainees appear to have a weaker interest in the political system than the former trainee. Although this is largely reliant on personal interest, there are several methodological and structural explanations that can be explored.


METHODOLOGICAL LIMITATIONS

Only 6 surveys were submitted for review; it is therefore difficult to make generalized conclusions about trends in the data with such a small sample size.

The survey was distributed a month before the trainees developed their social justice workshop, in which they researched universal human rights and immigrant political representation.

STRUCTURAL LIMITATIONS

The reported lack of interest in the political system points to a larger structural issue of immigrant political representation. As one of the respondents argued, the root cause of political apathy among new immigrants is the waiting time for achieving citizenship. Similarly, one of the trainees observed that interest tends to be higher in federal elections than in municipal or provincial elections because changes to immigration policy are made at the federal level.

"Despite Toronto's inclusive civic culture, the city has a problematic pattern of immigrant political incorporation." (Siemiatycki 2011, 1215)


CONCERNS FOR THE CENTRE

By the time the participatory evaluation was being conducted, the program was experiencing turnover in its coordinators, several of the trainees had discontinued their studies, and only half of the remaining trainees were attending classes. The question on top of everyone's mind was about access to and coordination of the program. Collectively, the class evaluated some of the factors that could either support or inhibit class attendance.

FACTORS SUPPORTING ATTENDANCE

1. AFFORDABILITY The free nature of the course opens up accessibility, but could also disincentivize a sense of responsibility.

2. NEW PEDAGOGICAL METHODS AND STYLES The trainees were exposed to the integration of theory and practice, or praxis, yet varying levels of education also greatly shaped how trainees engaged with the course material.

3. SKILL-BUILDING Developing technological skills in programs such as Excel and PowerPoint, along with technical skills in report writing with APA citation, were highly valued among current and former trainees.

4. INTEGRATING INTO TORONTO The theoretical content of the course, along with the Centre's physical location in Regent Park, encouraged the trainees to familiarize themselves with a new social and cultural landscape.


FACTORS INHIBITING ATTENDANCE

1. FAMILY AND EMPLOYMENT Nearly all the women had trouble completing homework due to familial and employment obligations. The Centre intervened by providing child care assistance and employment training.

2. VARYING LEVELS OF EDUCATION Some of the trainees had higher levels of training in human rights discourse or technological skills prior to the program, and therefore did not feel challenged by the material.

3. COMMUNICATION BARRIERS Not only were several trainees struggling with language barriers, but there were also technological issues, notably excessive email notifications whenever the schedule changed.

4. COORDINATION AND ORGANIZATION Due to the number of coordinators, facilitators, and interns involved, one of the main issues that arose was the lack of clear, concise, and cohesive instruction and structure.

"The IWIP program is a wonderful initiative in design and, to a lesser extent, in delivery. I felt that I wanted to participate in a program that was more challenging academically. Also, it was quite a challenge on my part trying to balance part-time employment and attending a full-time program for which I was not receiving monetary assistance." -Discontinued IWIP Trainee


All of these factors could potentially inhibit the cycle of learning that the Centre is aiming to sustain.


RECOMMENDATIONS

1. Improving the communication, coordination, and organization of the IWIP program

One of the main issues that arose during the participatory process was a significant lack of communication, coordination, and organization among the different tiers of operation at the Centre. Due to the new or transient role of many coordinators and facilitators, it was difficult to achieve a cohesive course structure. Here are four potential interventions that can help address these issues:

Cohesive training process
All instructors, facilitators, and placement students should go through a training process to ensure that they are consistent in delivery style and content. This will overcome issues of miscommunication and discrepancies in teaching, thereby making the content more accessible to students.

Establishing institutional memory
There should be a collective platform that allows new staff to access former course content and structure. The Centre needs to implement a learning management system that provides a platform for course delivery, content, and management; a portal system for communication; and a system to record and analyze student assessment results and evaluations.

Collaboration Strategy with the Chang School of Continuing Education at Ryerson University
This year, the Centre partnered with Ryerson University to provide technical courses and guidance for the trainees in an institutional setting. Continuing with this collaboration strategy would not only improve the program structure, but it could also incentivize attendance. Trainees would benefit from this community-university partnership by training with, and gaining recognition from, Ryerson University.

Repurposing the course structure
Connecting the modules, spreading out the due dates, and offering more individual guidance.

LEARNING MANAGEMENT SYSTEM

In order to improve communication and organization for the course, the Centre should incorporate a learning management system that coordinates emails, course content, and assessment tools. Institutional examples include Blackboard and BearTracks; however, non-profit options, such as SUMAC, are also of great value. The information specialist should attempt to establish a platform that encompasses the organizational features of these programs.


2. Integrating structures of care

The Centre must remain cognizant of the potential structural barriers that trainees are facing as a result of the immigration process. The Centre could integrate structures of care by providing recommendations for social service resources, as well as options for students with mental or physical health concerns.

Online Course Structure
One of the respondents expressed experiencing difficulties within a traditional bricks-and-mortar institution due to her post-traumatic stress disorder, which was triggered by the refugee claim process. She believes that the Centre should prioritize creating an online program for IWIP, as was done at the start of the semester over GoTo Meeting. However, one of the facilitators argued that being physically present in the classes is integral to the learning process. Therefore, the course coordinators must evaluate whether or not this is a feasible option.

Self-Care Workshop
As the trainees enter their community service sector internships, it is necessary to work through the potential threat of compassion fatigue, or the emotional burnout that is often correlated with affective labour. Discussing and sharing practices of self-care is particularly relevant in a culture where working, and its ostensible rewards, are valued above all else, including mental and physical health, social bonds, justice, and so on.

3. Different streams of education

Although each of the trainees was university educated, they all exhibited different levels of understanding based on the fields in which they were trained. Therefore, the Centre should either be transparent about the level of understanding that a trainee should have, or should offer two streams for introductory and advanced social justice education.

"The instruction, training, and learning approach could be improved by taking into consideration the different learning styles of the participants. An emphasis should be placed on preparing trainees not only to be job-seekers, but also to be job creators—entrepreneurs for the non-profit sector." -IWIP Trainee


CONCLUSION

The Immigrant Women's Integration Program offers a unique opportunity for recently immigrated women to learn how to become leaders within their new communities. By addressing some of the concerns raised in the participatory evaluation process, the Centre can increase its long-term impact on community empowerment, resilience, and capacity throughout Toronto. The program evaluation module provides a model for the Centre to continually evaluate its programs, while simultaneously equipping the trainees with valuable evaluation skills.


ONLINE RESOURCES

Evaluation Framework: http://betterevaluation.org/

Theory of Change: http://www.theoryofchange.org/what-is-theory-of-change/how-does-theory-of-change-work/

Participatory Evaluation: http://depts.washington.edu/ccph/pdf_files/Evaluation.pdf

WORKS CITED

August, M. (2014) "Challenging the rhetoric of stigmatization: the benefits of concentrated poverty in Toronto's Regent Park," Environment and Planning A 46(6), pp. 1317–1333.

Bourdieu, P. (1986) "The forms of capital." In J. Richardson (Ed.) Handbook of Theory and Research for the Sociology of Education. New York: Greenwood.

Burgess, J. (2006) "Participatory action research: First-person perspectives of a graduate student," Action Research 4(4), pp. 419–437.

Cahill, C. (2007) "The Personal is Political: Developing new subjectivities through participatory action research," Gender, Place and Culture 14(3), pp. 267–292.

Hall, B. (1993) "From Margins to Center: The Development and Purpose of Participatory Planning in the United States and Canada," American Sociologist 23(4), pp. 15–28.

Mark, M., Henry, G., and Julnes, G. (2000) Evaluation: An Integrated Framework for Understanding, Guiding and Improving Policies and Programs. San Francisco, CA: Jossey-Bass.

Reardon, K. (1998) "Participatory Action Research as Service Learning," New Directions for Teaching and Learning 73, pp. 57–64.

Ryerson Studio Workshop. (2014) "Daniels Centre of Learning: Building Community through Education." Retrieved from: <https://testtccld-my.sharepoint.com/personal/ryan_tccld_org/_layouts/15/guestaccess.aspx?guestaccesstoken=SXOi6dc8nqZyoGzD0IxYwXOk2k%2fxgj1fE6UxZYGqLhc%3d&docid=0bb0d92704b844601b2141cf4ca63b2d6>

Siemiatycki, M. (2011) "Governing immigrant city: Immigrant political representation in Toronto," American Behavioral Scientist 55(9), pp. 1214–1234.

Trudeau, D. (2008) "Junior partner or empowered community? The role of nonprofit social service providers amidst state restructuring," Urban Studies 45(13), pp. 2805–2827.

Whyte, W. F. (1991) Participatory Action Research. Thousand Oaks, Calif.: Sage.


APPENDIX: PROGRAM EVALUATION COURSE CONTENT

1. PURPOSE AND PROCESS

Socioeconomic Context – Non-profits play a significant role in social service provision due to the devolution and privatization of government services. Program evaluation is a key mechanism for securing funding in the age of rollback neoliberalism.

What is an evaluation? – Evaluation is an applied inquiry process for collecting and synthesizing data that culminates in conclusions about the value, significance, or quality of a program, policy, or plan. It is both empirical and normative.

What is an evaluation for? – Program evaluation is implemented to ensure that the goals of a program are being met, and indicates any potential areas of improvement. The primary role of evaluation is to enhance and supplement the efforts of democratic actors as they seek social betterment.


What is the difference between Monitoring and Evaluation?
Monitoring is a form of performance management. It is heavily focused on tracking the progress of a program, and is by and large quantitative (e.g. attendance). Evaluation, by contrast, goes into greater analytical depth by asking causal questions, exploring value judgments, and using progress markers. It is therefore not only quantitative but also qualitative.

Different types of Evaluation
Formative – Conducted halfway through a project for learning and improvement.
Summative – Conducted at the end of a project to measure accountability, worth, and outcomes.
Real-Time – Continuous evaluation; it is use-based and ensures both improvement and enhancement through timeliness and learning. Unlike the other two categories, this form of evaluation is used to change behaviour while the program is taking place.

Problems with evaluation
Use and Relevance – If you are conducting a formative or summative evaluation, your results will likely be produced too late for use.
Ethics and Access – An evaluation does not require ethics approval, nor does it specify who has access to the findings.
Objectivity – The 'facts' presented in an evaluation could potentially be steeped in value judgments depending on how the evaluator framed the data. This is often an issue when the program evaluation is client-controlled: the client wants to hear positive responses, and might go to coercive ends to reach this type of outcome.

There is a trade-off between value-free evaluation (objective) and evaluation for equity (subjective).


The scope of an evaluation varies between context, implementation, and outcome.

Context Evaluation – evaluating the context of the program (e.g. needs assessment)

Used early in an intervention for:
Assessing the needs, assets, and resources of a target community in order to plan relevant and effective interventions within the context of the community
Identifying the political atmosphere and human services context of the target area to increase the likelihood that chosen interventions will be supported by community leaders and local organizations

Used in a mature intervention for:
Gathering contextual information to modify project plans and/or explain past problems
Identifying the political, social, and environmental strengths and weaknesses of both the community and the project
Examining the impact of changing federal and state climates on project implementation and success


SCOPE AND APPROACH CONTINUED...

Implementation Evaluation – evaluating how the program is implemented (e.g. new programs, pilot projects, etc.)
Identifying and maximizing strengths in development
Identifying and minimizing barriers to implementing activities (e.g. theory of change)
Determining if project goals match target population needs
Assessing whether available resources can sustain project activities (e.g. program coordinator, consistent teachers, etc.)
Measuring the performance and perceptions of the project (good in theory, not in practice)
Determining the nature of interactions between staff and clients
Ascertaining the quality of services provided by the project
Documenting systemic change
Monitoring clients' and other stakeholders' experiences with the project, and their satisfaction with and utilization of project services

Outcome Evaluation – evaluating the outcome of a project
Demonstrating the effectiveness of your project and making a case for its continued funding or for expansion/replication
Helping to answer questions about what works, for whom, and in what circumstances, and how to improve program delivery and services
Determining which implementation activities and contextual factors are supporting or hindering outcomes and overall program effectiveness

Participatory approaches – involving the stakeholders in the design and implementation of the evaluation
Most Significant Change – data collection involves gathering and analyzing stories of change. Engages in a storytelling process with the community through interviews or focus groups.
Outcome Mapping – data collection involves journaling and mapping. Participants create an "outcome journal" for progress markers, a "strategy journal" for strategy maps, and a "performance journal" for organizational practices.
Contribution Analysis – data collection involves collecting evidence that supports or denies the causal assumptions of a collective theory of change.
Impact Evaluation – data collection involves assessing the intended and unintended changes that can be attributed to a particular intervention, such as a project, program, or policy.
Social Return on Investment – data collection involves measuring extra-financial value (i.e., environmental and social value not currently reflected in conventional financial accounts) relative to resources invested. It accounts for stakeholders' views of impact, and puts financial 'proxy' values on those impacts identified by stakeholders which do not typically have market values.


2. EVALUATION FRAMEWORK AND OUTCOMES

Framework developed from betterevaluation.org (below).


Develop a Program Description – It is helpful to develop an initial description of the project, program, or policy as part of beginning an evaluation. Checking this with different stakeholders can be a helpful way of beginning to identify where there are disagreements or gaps in what is known about it.

IN CLASS EXERCISE: Provide a prepared program description for the program being evaluated, and ask the class to collectively edit the description. Pay attention to the rationale for the program (the issue being addressed, what is being done, who is intended to benefit), the scale of the intervention, the resources allocated to the implementation, the roles of partner organizations and other stakeholders, the implications of contextual factors (geographic, social, political, economic, and institutional circumstances), and significant changes that have occurred over time (contextual factors or lessons learnt).

Develop a Logic Model or Theory of Change – A logic model is a simple mapping tool that can set the stage for measurement. It starts by identifying the intended goal or impact of a program, and then backwardly maps the outputs, inputs, and resources necessary to reach the intended outcomes. A Theory of Change is similar to a logic model, but further complicates the causal assumptions by identifying key interventions and branching out to alternative methods of reaching the final goal.

IN CLASS EXERCISE: Ask students to collectively map a theory of change for the program. Remember to identify underlying assumptions, potential unintended results, and interventions.


Framing an evaluation involves being clear about the boundaries of the evaluation, and requires key evaluation questions. For example: Why is the evaluation being done? What are the broad evaluation questions it is trying to answer? What are the values that will be used to make judgments about whether it is good or bad, better or worse than alternatives, or getting better or worse?

IN CLASS EXERCISE: Create a collective evaluation brief by identifying primary intended users (who will use the evaluation), deciding purposes and intended uses, specifying the key evaluation questions, and determining what 'success' looks like.


Managing an evaluation involves agreeing on how decisions will be made for each cluster of the evaluation (from framing an evaluation to reporting and supporting use) and ensuring they are implemented well.

IN CLASS EXERCISE: If the IWIP trainees are evaluating another program, or the Centre itself, they have to identify and engage the stakeholders, establish decision-making processes, and develop planning documents for the evaluation.

3. DATA COLLECTION/DOCUMENTATION

Demonstrates understanding of data collection and the various documentation styles. The module will encompass both a quantitative and a qualitative approach.

Step 1 – Determine what data collection strategy you will use (e.g. survey, focus groups, interviews, journaling, mapping, etc.)

Step 2 – Develop indicators. An indicator is the actual variable being measured, such as average test scores or proficiency in a particular skill. The indicators stage focuses on how to measure the implementation and effectiveness of the initiative. By collecting data on each outcome, the initiative can identify what is or isn't happening, and find out why. Each indicator has four parts: population, target, threshold, and timeline.


Simply put, for each indicator you want to ask:

Population: Who is changing? (e.g. students graduating from the program)
Target: How many do we expect will succeed? (e.g. 90% of the enrolled students)
Threshold: How much is good enough? (e.g. a $12 per hour job for at least six months)
Timeline: By when does this outcome need to happen? (e.g. perhaps within two months of graduation)
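A minimal sketch, assuming the indicator is recorded as a small Python data structure, shows how the four parts could be written down and checked against observed results; the names and figures below simply echo the hypothetical examples above.

from dataclasses import dataclass

@dataclass
class Indicator:
    """One indicator split into its four parts: population, target, threshold, timeline."""
    population: str    # who is changing
    target: float      # the proportion expected to succeed (0 to 1)
    threshold: str     # what counts as "good enough"
    timeline: str      # by when the outcome needs to happen

    def met(self, successes: int, total: int) -> bool:
        """Check whether the observed success rate reaches the target."""
        return total > 0 and successes / total >= self.target

# Hypothetical indicator echoing the example above.
employment = Indicator(
    population="Students graduating from the program",
    target=0.9,
    threshold="A $12 per hour job held for at least six months",
    timeline="Within two months of graduation",
)

# Hypothetical observed result: 5 of 6 graduates met the threshold in time.
print(employment.met(successes=5, total=6))  # False, since 0.83 < 0.90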

A key component of this step is a critical understanding of the metrics being used. For example, the developed surveys are attempting to measure social capital indicators. What are the potential drawbacks of measuring social capital? (e.g. reproducing systems of power, applying capitalist ideologies to social relations and rendering them utilitarian, etc.) What are the potential benefits of social capital? (e.g. providing a proxy value for an otherwise unquantifiable phenomenon, speaking the same rhetoric as funding bodies, etc.)

Step 3 – Collect and/or retrieve data

IN CLASS EXERCISE: Distribute and complete the surveys developed for the course. Students may also adapt the surveys if they believe questions are missing or existing questions are not relevant, or incorporate other, more qualitative forms of data collection.


4. ANALYSIS AND FINDINGS


Step 1: Analyze and visualize data

Manually input answers into Excel, or transfer the survey to an online format that can automatically generate figures.

Understand causes – Analyze the data to answer causal questions about what has produced the outcomes and impacts that have been observed. How does the data correlate with your indicators? Do you notice any key trends? Remember to consider counterfactual data and alternative causes.

Visualize your data using graphs, infographics, maps, etc.

Step 2: Synthesize – Combine data to form an overall assessment of the merit or worth of the intervention, or to summarize evidence across several evaluations. Provide recommendations for future improvement.

Step 3: Report – Develop and present findings in ways that are useful for the intended users of the evaluation, and support them in making use of them. Disseminate the report(s) and support use of the evaluation findings.
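To make these steps concrete, here is a hedged sketch of Step 1 and the start of Step 2 in Python, assuming the survey answers have been exported to a CSV file. The file name, column names, and benchmark value are hypothetical and would need to be adapted to the actual survey export.

# Illustrative sketch of Step 1 (analyze and visualize) and the start of Step 2
# (synthesize) for a small survey.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("iwip_survey_responses.csv")    # one row per respondent
likert_columns = ["self_esteem", "optimism", "health", "community_leadership"]

# Step 1: analyze - mean, spread, and count for each Likert-scale item (1-5).
summary = df[likert_columns].agg(["mean", "std", "count"]).round(2)
print(summary)

# Step 1: visualize - bar chart of the mean score per item.
summary.loc["mean"].plot(kind="bar", ylim=(1, 5), title="Mean score per survey item")
plt.ylabel("Mean response (1-5)")
plt.tight_layout()
plt.savefig("survey_means.png")

# Step 2: synthesize - flag items whose mean falls below a chosen benchmark as
# candidates for recommendations in the written report (Step 3).
benchmark = 3.5
low_items = [c for c in likert_columns if summary.loc["mean", c] < benchmark]
print("Below benchmark:", low_items)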