
EVALUATION OF THE QUICK ACCESS SERVICE AT CHILD & ADOLESCENT

SERVICES IN HAMILTON

Dina Bednar, M.A.

Drew Dane, Ph.D. C. Psych.

Gord Greenway, B.A., M.S.W.

Louise Oke, B.A., M.Sc.

Margarita Rabinovich, M.A.

Report for Evaluation Capacity Building Grant 2008-2009 (ECBG)-888

Submitted April 28, 2009


EXECUTIVE SUMMARY

The Quick Access Service (QAS) at Child and Adolescent Services in Hamilton

offers clients the opportunity to ‘drop in’ during one morning or one afternoon/evening

each week for one session of therapy within one week of their referral being completed

at Hamilton’s single point access referral service for children’s and developmental

services, CONTACT Hamilton. The purpose of the session is three-fold:

1. Begin to intervene to bring about changes desired by the family.

2. Find out what clients are hoping for and determine whether further service is necessary and which of the services offered by Child and Adolescent Services are most appropriate.

3. Offer clients suggestions for community or information resources which may be helpful to them.

Theoretical ideas which underpin our approach to the QAS are primarily collaborative,

solution focused and strengths based. We aim to help families become more aware of

strengths and resources they already possess, to understand what they want and need

from therapy, and to build hope. All clinicians and student interns at the clinic work in

the Quick Access Service between two and three times each month.

Since we began the QAS in 2007, we have used evaluation forms after the

session for clients aged 10 or older (including parents, siblings and anyone else who

may have attended the session). We have reviewed these forms periodically although

we have not yet analyzed this data except in the most basic way. While we believe,

based on client feedback and clinicians’ experience, that the service is helpful to clients

and that we are mostly fulfilling our objectives for the service, we would like more formal and

conclusive evidence that this is the case. Our agency is struggling to keep service levels

stable with fewer clinicians and administrative staff, so it is more important than ever that

we know that the service we are providing is working as effectively as possible for our

clients.

The proposed evaluation will look at whether a single session intervention decreases

parent ratings or, for adolescents, self ratings of symptoms one month after the session


using a non-randomized control group (clients who are referred but do not attend the

QAS). The evaluation will also explore client satisfaction and the impact of the

intervention on hopefulness, coping skills, general self-efficacy and parenting self-

efficacy and understanding of the problem. The study utilizes standardized measures

including the Brief Child and Family Phone Interview (BCFPI), clinic-designed pre-

session questionnaires and post-session evaluation forms, and a follow-up phone

interview which will contain some quantitative and some qualitative data. The design

has been chosen to shed light on both outcome and process questions. We are asking

whether hopefulness, self-efficacy, parenting self-efficacy, coping, knowledge and

understanding of the problem and awareness of strengths and resources are positively

impacted by the Quick Access intervention and whether symptoms are reduced one

month after the session. We also plan to look at client perceptions of the service,

including satisfaction, to describe our client population in some detail and identify the

main components of the service (active ingredients). Finally, we hope to make

distinctions among groupings of clients: for example, whether there are characteristics of clients who

require only one session, of clients who never attend Quick Access although they were referred

(non-attenders), or of clients for whom the service is more or less helpful.

Our clinic has come a long way over the term of the grant in terms of our

readiness for evaluation. Our Research Committee, made up of interested staff and

students, the program manager and the project leader, has been most instrumental in

creating the framework. We have learned a lot about the connections between the QAS

intervention and hoped-for outcomes for our clients. We have also begun to understand

the many steps of the evaluation process and the importance of some staff working on

research and evaluation in an ongoing way.

We have enjoyed collaborating within our Research Committee, with Susan

Kasprzak from the Centre, with other agencies who have capacity building grants and


with other agencies who are doing single session therapy and evaluating same. These

relationships have allowed for rich learning opportunities and have helped us to face the

sometimes formidable challenges of doing evaluation in a clinic setting. We appreciated

very much the Centre’s focus on building collaborative relationships to support each

other's efforts and would encourage the Centre to do even more of this in future. It was

very helpful to us when Susan provided the names of others who were doing capacity

building planning for brief interventions. We would have liked even more "local"

direction on where to find others doing evaluations and doing single session therapy.

The recommendations which we have for the clinic at this point, which could help

us continue to grow our capacity for evaluation and keep the evaluation relevant and

useful, are:

1. Support some staff in using some of their time towards research and evaluation activities on an ongoing basis.

2. Make sure that you have regular updates with the whole staff and try to make these updates fun and interesting. Allow time for questions, comments and discussion. Invite different perspectives on services and on evaluation.

3. Form a research committee and encourage staff and students to attend even occasionally if that is all that is possible.


TABLE OF CONTENTS

INTRODUCTION

PROGRAM LOGIC MODEL

METHODOLOGY

DISCUSSION & LESSONS LEARNED

CONCLUSIONS & RECOMMENDATIONS/NEXT STEPS

KNOWLEDGE EXCHANGE

REFERENCE LIST

APPENDICES

Appendix A: Program Logic Model: Evaluation of the Quick Access Service, Child & Adolescent Services, Hamilton, ON
Appendix B: Outcome Evaluation Matrix
Appendix C: Process Evaluation Matrix
Appendix D: Quick Access Service Client Information Fact Sheet
Appendix E: Licensing Agreement for the Young Child Outcome Rating Scale, Young Child Session Rating Scale, Child Outcome Rating Scale and Child Session Rating Scale
Appendix F: Coping Self-Efficacy Scale
Appendix G: Tool to Measure Parenting Self-Efficacy
Appendix H: Quick Access Service Parent/Caregiver Questionnaire
Appendix I: Quick Access Service Youth Questionnaire
Appendix J: Quick Access Service Collateral Questionnaire
Appendix K: Quick Access Service Client Evaluation
Appendix L: Brief Child and Family Phone Interview Adolescent Form (BCFPI-A) and Brief Child and Family Follow-Up Interview (BCFPI)
Appendix M: Follow-up Interview
Appendix N: Draft Quick Access Service Evaluation Information and Consent Form
Appendix O: Draft Therapist Checklist
Appendix P: Description of ethics review process for Public Health Services, City of Hamilton


INTRODUCTION

The Quick Access Service (QAS) at Child and Adolescent Services was

developed between January and June 2007 and began operation in July 2007. This

service offers most clients of Child and Adolescent Services the opportunity to attend

one session of therapy within a week of their referral being completed at Hamilton’s

single point access referral service for children’s and developmental services,

CONTACT Hamilton. The purpose of the session is three-fold:

1. Begin to intervene to bring about changes desired by the family.

2. Find out what clients are hoping for and determine whether further service is

necessary and which of the services offered by Child and Adolescent Services

are most appropriate.

3. Offer clients suggestions for community or information resources which may be

helpful to them.

Our clients are children between the ages of two and eighteen, and their families,

referred to us from CONTACT Hamilton, usually for counseling services. Theoretical

ideas which underpin our approach to the QAS are primarily collaborative, solution

focused and strengths based. We aim to help families become more aware of strengths

and resources they already possess, to understand what they want and need from

therapy, and to build hope.

The QAS came about for a number of reasons. Prior to starting QAS, referrals

from CONTACT Hamilton were made directly to one of our services. In a moderate

proportion of cases we found that clients were placed in a service which did not really fit

their needs. On these occasions clients might have been on a waiting list for service for

months only to find when service began that they were not in the service that could be

most helpful to them. This led to staff needing to stretch themselves in ways which

weren’t always comfortable in order to be responsive to client needs. We also found that


fairly frequently clients arrived for their first session of therapy with unrealistic or unclear

expectations of how therapy could benefit them. These expectations could lead to

frustrations for both clients and clinicians as they sought to develop a realistic treatment

plan.

Another reason for beginning the QAS was to address the reality of the wait for

service which many of our clients endure. We wanted to give clients the opportunity to

have one session as soon as possible after they sought help. In this way we hoped to

eliminate the need to wait for those who only required a single session and to weed out

clients whose expectations for therapy were unrealistic or inappropriate, thus shortening

our waiting list. For those clients who needed to wait for further service we hoped that

the QA session could help them begin to make changes while they waited: trying out

ideas discussed in the QA session, learning more about their child's mental health issue

(including possibly attending one of our psycho-educational groups), and accessing other

relevant community resources.

Since we started the QAS, we have used pre-session questionnaires to gather

clients' views on the problem, the child's and family's strengths, ideas about change and the

impact of the problem, as well as QA session evaluation forms. These forms have been

used by clinicians before and after sessions. We have not yet collated and analyzed the

data from any of these forms. The use of these forms fits with our collaborative

approach by letting clients know that their perceptions and experience are important to

us.

We have several reasons for planning this evaluation of our Quick Access

Service. While we believe, based on client feedback and clinicians’ experience, that the

service is helpful to clients and that we are mostly fulfilling our objectives for the service, we

would like more formal and conclusive evidence that this is the case. The QAS is time

consuming for staff and clients and makes scheduling more complicated, especially for


clinicians. Because of budget constraints, resulting in the loss of two full-time clinical

positions, one part-time consultant psychologist and one administrative position over the past three

years, our clinic is very much striving to 'do more with less'. Clinicians' caseloads are full

to capacity and yet waitlists continue to lengthen. Given the investment we are making

in this service it is important that the service is as useful and effective as possible.

We designed the service using the forms and format employed at ROCK (Reach

Out Centre for Kids) in the Halton region as well as our knowledge of solution focused

brief interventions and an awareness of the common factors which contribute to therapy

outcome (Hubble and Miller, 1999). In particular, we wanted to pay attention to building

a collaborative and respectful therapeutic relationship in the first session and to get

change started quickly, thus raising hope and engagement in the therapeutic process.

It is critical that we discover how successful we are in fulfilling our objectives and

whether our approach is working for clients as we intend it to. One possible positive

outcome of the evaluation of the service and making changes based on the evaluation

outcome would be an increase in the number of successful single sessions (not referred

for further service at our agency) thus reducing the number of clients waiting for service

while also having satisfied clients. We may also find that results of the evaluation,

especially client feedback, lead to changes in how we provide the service including

allowing clients to attend more than one quick access session while they wait for service

or as an alternative to engaging in ongoing therapy.

The most relevant stakeholders for this evaluation project include clients, staff

and student interns of Child and Adolescent Services and CONTACT Hamilton. Child

and Adolescent Services is part of the Family Health Division of Public Health Services

for the City of Hamilton. The Director of our division, Debbie Sheehan, is keenly

interested in the project, and other Public Health Service managers are also aware of

and interested in the project. The evaluation is also relevant to ROCK in Halton because


they offer a similar service and are in the process of evaluating their walk-in service.

Over the course of working on this project we have connected with project leaders at two

other agencies, Dan Bajorek at Point in Time in Haliburton and Diane Barrett at George

Hull in Toronto, whose agencies also have Capacity Building Grants to evaluate brief

interventions. We attended a one day meeting of service providers (children’s mental

health and family service agencies) providing walk-in therapy in Ontario on April 1, 2009

and there joined a Community of Practice whose members are intervening in similar ways and are

also either actively evaluating their services or planning evaluations.

There is a growing body of international literature on research and evaluation of

single session therapy with adults, children and families. Some of the studies done thus

far have looked at client satisfaction with service and self ratings of change through

evaluation forms or follow-up interviews (Boyhan, 1996; Hampson, O’Hanlon, Franklin,

Pentony, Fridgant & Heins, 1999; Miller and Slive, 2004; Miller, 2008; Talmon, 1990). All

of these researchers found high rates of satisfaction with their single session therapy

interventions and positive ratings of change for most clients. Approximately half or more of the

client subjects in these studies did not require more service than the single session. All

of these studies were limited in the conclusions which could be drawn by their failure to

include pre-post outcome ratings through standardized measures, by examining a

limited range of outcomes and by not using control groups. Other researchers have

looked at client satisfaction in combination with some outcome measures but without a

control group (Coverley, Garralda & Bowman, 1995; Denner & Reeves, 1997; Sommers-

Flanagan, 2007; Theodor & Paolini, 1999). These studies had small sample sizes (<35)

but reported positive satisfaction ratings and good outcome ratings.

Two studies in children’s mental health clinics have used more rigorous research

designs, looking at satisfaction and outcome using standardized measures with sample

sizes between 78 and 94 and employing no-treatment or different-treatment comparison


groups (Engel, Barwick, Urajnik, Cohen, Summer, Reid & Khatun, 2008; Perkins, 2006;

Perkins & Scarlett, 2008). Ruth Perkins in Australia found significant improvements on

standardized parent and clinician ratings one month after one session of treatment when

compared to a waitlist control group. She went on to follow-up with the same subjects at

18 months and found that changes had been maintained. At the Yorktown Child and

Family Centre in Toronto in a study supported by The Provincial Centre of Excellence for

Child and Youth Mental Health, Engel et al. (2008) found significant reductions on all

Brief Child and Family Phone Interview (BCFPI) subscales two weeks after a single

walk-in therapy session when compared to clients who received ‘service as usual’

including an intake interview. These results were maintained or improved at three month

follow-up.

One study from Australia (Campbell, 1999) looked at changes in the reported

nature of problems and coping following single session intervention as well as two family

functioning moderating variables: family structure and family pride. While limited by a

small sample size (38), Campbell reported, "Our results strongly support the hypothesis

that single session interventions can be very effective at reducing the presenting

problem and at increasing the sense of coping.” (p. 192). Campbell also found that the

moderating factor of family pride substantially strengthened the positive effect of the single session

intervention. This result raises interesting questions for future

research, including how to identify more moderating variables, how to possibly impact

the moderating variables before intervention, and how to adapt interventions to have

better results with different groups of families.

Of the above-mentioned studies which included children or adolescents as

identified clients in their samples (Boyhan, 1996; Campbell, 1999; Engel et al., 2008;

Hampson et al, 1999; Miller & Slive, 2004; Miller, 2008; Perkins, 2006; Perkins &

Scarlett, 2008; Talmon, 1990; Theodor & Paolini, 1999) it is worth noting that none


mention seeking children’s opinions about the intervention or about how they were

doing. All of these studies relied on parent and/or clinician ratings of either satisfaction

or child and family functioning or both. This is a gap which we plan to begin to address

in our evaluation.

The objective of the current study is to expand on the research of Campbell

(1999) by examining individual and family variables that may serve as mediators or

moderators of the impact of single session treatment on outcome. If mediating variables

which lead to symptom change can be identified, this could inform decisions about

therapeutic goals and interventions. Identifying moderator variables that influence

treatment outcome may facilitate such decisions as who may need more extensive

services and which kinds of clients may achieve good results from single session

therapy.

Two moderating factors we plan to look at are: coping self-efficacy and parenting

self-efficacy. Coping is defined as behavioural or cognitive efforts to manage situations

that are appraised as stressful (Lazarus & Folkman, 1984). Perceived self-efficacy is

defined by Bandura (1997) as a belief about one’s ability to perform a specific behaviour.

Social cognitive theory holds that beliefs about personal efficacy determine the

acquisition of knowledge on which skills are founded (Bandura, 1997). Given that a big

part of our single session intervention is directed at impacting parents' and youths' sense

of their own ability to help themselves, coping self-efficacy may both impact the

effectiveness of the intervention and change as a result of the intervention. As

outlined in a recent review paper (Jones & Prinz, 2005), research has shown that

parental self-efficacy is related to parenting competence, parental stress and

adjustment, and children’s behavioural functioning. Furthermore, treatment outcome

studies have demonstrated that parental self-efficacy has improved as a function of

parent management training or family based interventions, and post-intervention


improvements in parenting and child behaviour were associated with increases in

parental self-efficacy (for a thorough review, see Jones & Prinz, 2005). Therefore, we

anticipate that parental self-efficacy may be enhanced by involvement in single-session

therapy, and that this improvement in turn may mediate the link between treatment and

symptom change at one-month follow-up. In addition, pre-intervention levels of parental

self-efficacy may moderate the impact of therapy, influencing who benefits maximally

from the experience. We also plan to look at pre-existing differences on a number of

variables relevant to children’s mental health (some of which include family structure and

income level, parents' education level, internalizing and externalizing factors, caregiver

depression, family functioning and protective factors such as recreational or spiritual

involvement) and at process variables such as client satisfaction as moderators that may

predict who responds best to single session therapy.
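To make the mediator/moderator logic concrete, the following is a minimal sketch, in Python using the pandas and statsmodels libraries, of how such analyses are often run. All column names and the file name are hypothetical placeholders rather than our actual variables, and the simple regression approach shown is only one standard way these questions could be tested.

```python
# Illustrative sketch only: column names (attended, topse_pre, topse_change,
# bcfpi_change) and the file name are invented placeholders.
# 'attended' is assumed to be coded 0 (non-attender) or 1 (attender).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("qas_evaluation.csv")  # hypothetical merged data file

# Mediation, product-of-coefficients logic:
# path a: does attendance predict change in parental self-efficacy?
path_a = smf.ols("topse_change ~ attended", data=df).fit()

# path b (with direct effect c'): does self-efficacy change predict symptom
# change at one-month follow-up, controlling for attendance?
path_b = smf.ols("bcfpi_change ~ attended + topse_change", data=df).fit()

indirect = path_a.params["attended"] * path_b.params["topse_change"]
print("estimated indirect (mediated) effect:", indirect)

# Moderation: does pre-intervention parental self-efficacy alter the size
# of the treatment effect? Tested with a treatment-by-moderator interaction.
moderation = smf.ols("bcfpi_change ~ attended * topse_pre", data=df).fit()
print(moderation.summary())
```

In this logic, mediation is suggested when both the a and b paths are reliable, while moderation is indicated by a reliable interaction term; a bootstrap test of the indirect effect would be a natural refinement.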

Other areas which have been explored in research as relevant to therapeutic

outcome are: client engagement including accessing and using the client’s theory of

change (Manthei, 2007; Metcalf, Thomas, Duncan, Miller & Hubble, 1996; Trunzo,

2006), parent perceptions of barriers to care (Hoagwood, 2005; Kazdin, Holland &

Crowley, 1997), and the importance of the shared belief of therapist and client(s) in the

model of treatment (Hyun-nie & Wampold, 2001). We have designed our Quick Access

Service in order to pay attention to issues of client engagement, fit between client and

therapist/therapy and to address some barriers to service. These issues will be

addressed in our evaluation primarily through our pre and post session questionnaires

and follow-up interview. In summary, the following list describes the variables we have

included as either short- or intermediate-term outcomes, mediators or moderators, and

process outcomes.

1. Intermediate-term outcomes at one month follow-up: parental self-efficacy, coping self-efficacy and symptom change.

2. Short-term post-intervention outcomes: coping self-efficacy, hopefulness and awareness of strengths.


3. Mediators of treatment impact: parental self-efficacy, coping self-efficacy, hopefulness and pre-existing strengths.

4. Moderators of treatment impact: pre-intervention coping and parental self-efficacy; client demographic characteristics; parental or family functioning; client perceptions of therapist and therapy.

5. Process outcomes: client satisfaction; characteristics of clients served; characteristics of clients not attending; active ingredients of service.

Program Logic Model: Evaluation of the Quick Access Service, Child and Adolescent Services, Hamilton, ON

LONG-TERM GOAL: Increased independence in managing own issues/problems in a constructive and effective manner and in a timely fashion

Inputs (Quick Access Program):
- MCYS funding
- Staff: 1 manager to attend QAS team meetings; 8-10 clinicians per week (FTE clinicians do three shifts per month of 3.5-4 hours; PTE clinicians do one shift every other week); 1 admin staff
- Time: manager, 2 hrs per week; per clinician, 3 to 8 hours per week based on number of clients seen, plus QAS summary report (1 hour each) and QAS team meeting (1 hour); admin staff, 25 hrs/wk
- Material resources: file folder, label; forms per file: QAS info sheet, consent to treatment, client profile, D/B transfer sheet, questionnaires (for each family member, except children under 12), QAS report, QAS Evaluation (based on number of family members)
- Budget: no additional budget is required

Activities:
- Start the change process through collaborative intervention
- Understand families' and youths' wants and needs for service
- Discuss issues such as who should attend service (family or individual), focus of session, length of service and issues to work on
- Assess needs and refer those with priority needs for more intensive service
- Provide written information/resources relating to presenting issues
- Identify other supports/services
- Provide advocacy where needed
- Assist in envisioning life outside of presenting issues
- Normalize parents' experience by discussing age-appropriate child development and the impact of mental health issues on families and youth
- Help to normalize the child/youth's experiences
- Focusing on existing strengths and resources, develop skills/strategies for parenting
- Discuss methods of coping

Outputs: 1 session, 1 ½ hours, 1:1 therapist to youth and/or family

Target: Children/youth, ages 2-18 years, and their families (12+ years can self-refer) (30-40% CAS referrals)

Short-term outcomes:
- Increased motivation for attending subsequent services
- Increased sense of competence and self-efficacy in resolving issues
- Increased understanding of presenting issues
- Increased coping skills
- Increased access of appropriate services and supports
- Increased hopefulness for resolving issues

Intermediate outcomes:
- Increased attendance at and completion of subsequent programs/services
- Accessing formal and informal supports and services as needed
- Decreased severity of symptoms
- Increased implementation of new strategies/skills


METHODOLOGY

The evaluation framework was developed by members of our research

committee with help and guidance from Susan Kasprzak, Centre of Excellence

Evaluation Consultant. Our research committee was comprised of the following staff

members: Louise Oke-Clinical Therapist (Project Leader), Gord Greenway-Program

Manager, Margarita Rabinovich-Clinical Therapist, Dina Bednar-Clinical Therapist, Dr.

Drew Dane-Clinical Psychologist, Karen Timmerman-Clinical Therapist, Danielle

Clement-Contract Therapist and Research Assistant, and Diane Ribbins-Administrative

Assistant. Danielle Clement’s contract with our agency ended at the end of January and

Karen Timmerman had to resign from the research committee because of other

commitments but both were actively involved with the first half of the planning of the

framework. A social work intern from 2007-2008, Rhiannon Jones, developed a logic

model for the QAS which served as a basis for the logic model we developed further with

Susan Kasprzak.

All members of the research committee are staff with an interest in supporting

research and evaluation at the clinic and they represent the spectrum of services which

the clinic offers: Quick Access, Complex Trauma, Forensic, Solution Oriented Family

Therapy and Psychology. The research committee in its present form began meeting

about one year prior to receiving the capacity building grant and was instrumental in

deciding to apply for the grant and completing the application. We usually meet at least

once per month but met more frequently between October 2008 and April 2009 while we

were actively working on the evaluation framework. All decisions about research

questions, design and measures were discussed together and reached collaboratively.

Louise Oke and Danielle Clement did most of the between meeting work, including

liaising with Susan Kasprzak but, especially as the framework took shape, other

committee members and even some other staff took pieces to do separately and then


brought them back to the committee. All members of the research committee attended

the meeting with Susan Kasprzak to develop the logic model in October.

Developing the logic model (see the Program Logic Model above or Appendix A) for the Evaluation of the

Quick Access Service really helped us to become clearer about what we are trying to do

and strategies we use to achieve our objectives in our Quick Access service. Because

Quick Access is a relatively new service at our clinic we are still in the process of

experimenting with it and discovering what works and what does not. During the

process of developing and implementing QAS, clinic staff had many discussions of its

purpose and the strategies we would employ in the sessions. Creating the logic model

helped us to examine these purposes and strategies more closely and make explicit

links between interventions and hoped-for outcomes for our clients. We realized that we

have several types of interventions built into the Quick Access session and that different

client presentations influence which interventions we focus on. We also learned that

some of the interventions may dovetail to have impacts on more than one separate but

related outcome.

The following are our research questions divided into two sections: outcome and

process.

OUTCOME:

1. Do clients feel more hopeful and motivated to pursue/complete service?

2. Do clients feel more confident in their abilities to deal with the problems after the QAS session?

3. Are coping skills increased?

4. Are clients more aware of their families' strengths and resources?

5. What is the impact of the QAS session on knowledge and understanding of the problem?

6. What is the impact of the QAS on symptom severity at one month after intervention?

7. Are clients utilizing their own strengths, resources and skills to resolve their problems more at one month after intervention?

8. Are there particular characteristics of clients or types of problems which show better outcomes from the QAS intervention? (Does QAS work better for some clients than it does for others?)


PROCESS:

1. What are clients' perceptions of delivery of the program? Satisfaction?

2. Who are our clients, including the characteristics of clients who attend QAS and those who are referred but do not attend?

3. What are the main components of the service (active ingredients) and is the service being delivered as intended?

4. Who are the clients who no longer require service?

The proposed evaluation will look at whether a single session intervention decreases

parent ratings (or, for adolescents, self-ratings) of symptoms one month after the session

using a non-randomized control group (clients who are referred but do not attend the

QAS). The evaluation will also explore client satisfaction and the impact of the

intervention on hopefulness, coping skills, general self-efficacy and parenting self-

efficacy and understanding of the problem. The study utilizes standardized measures,

clinic designed pre-session questionnaires and post session evaluation forms and a

follow-up phone interview which will contain some quantitative and some qualitative

data. The design has been chosen to shed light on both outcome and process

questions.

The outcome and process evaluation matrices contain detailed descriptions in

tabular form of the design and measures as they relate to the research questions. See

appendices B & C for these matrices. Copies of all measures and the research consent

form are also attached as Appendices D & E. Data will be collected at four different

times for most subjects. Our referring agency, CONTACT Hamilton, completes a Brief

Child and Family Phone Interview (BCFPI, Cunningham, Boyle, Hong, Pettingil &

Bohaychuk, 2009) with a caregiver or for self-referred adolescents with the adolescent

themselves during their intake phone interview. The BCFPI is used across Ontario and

British Columbia as a standardized intake interview. It has excellent reliability and

validity and is sensitive to change over relatively short periods of time (reference +

personal communication Peter Pettingil). The completed BCFPI and summary graph are

part of the referral package we receive on each client from CONTACT Hamilton before


clients attend the QAS. We receive an electronic copy of BCFPI data on disk every

three months from CONTACT Hamilton.

Before the QA session parents will complete the Parent Questionnaire, Coping

Self-Efficacy Scale (CSES, Chesney, Neilands, Chambers, Taylor & Folkman, 2006) and

the Tool to measure Parenting Self-Efficacy (TOPSE, Kendall & Bloomfield, 2005). Both

the CSES and the TOPSE have been shown to be reliable and valid measures which

are well suited to evaluating clinical intervention (Chesney et al, 2006; Scherbaum,

Cohen-Charash & Kern, 2006; Kendall & Bloomfield, 2005). Adolescents (>12) will

complete the Youth Questionnaire and the Coping Self-Efficacy Scale (CSES). Children

aged 10-12 will complete the Youth Questionnaire and the Child Outcome Rating Scale

(CORS, Duncan, Miller & Sparks, 2003). Children aged 6-10 will complete the CORS

and children from 3-5 will complete the Young Child Outcome Rating Scale (YCORS,

Duncan, Miller, Huggins and Sparks, 2003). These paper and pencil forms will be

completed with research staff or students in our conference room. Clients will sit at

tables and snacks and beverages will be provided. Children will be assisted to complete

their forms and kept busy with simple games or drawing when they are finished and they

are waiting for their parents to finish up. Based on pilot testing we expect the pre-

session questionnaires to take parents between 20 and 30 minutes, adolescents

between 10 and 20 minutes and children between two and 10 minutes.

After the session, also in the conference room with assistance from research

staff and students, parents and adolescents will complete the Quick Access Service

Evaluation Form and the CSES again. Children aged 10-12 will complete the Quick

Access Service Evaluation Form and the CORS for the second time. Children aged 6-

10 will complete the CORS again and the Child Session Rating Scale (CSRS, Duncan,

Miller and Sparks, 2003) and children from 3-5 will complete the YCORS again and the

Young Child Session Rating Scale (YCSRS, Duncan et al., 2003). Again, based on our


pilot testing, we expect parents and adolescents to take 10-20 minutes while children will

likely take between two and 10 minutes.

Before clients leave, after filling out the post-session forms, research staff will

arrange a phone appointment to do the follow-up interview and Post BCFPI in about one

month with the person who did the BCFPI at intake. We modeled the questions for the

Follow-up Interview on the questions used by Moshe Talmon for his study on single

session therapy (1990). The follow-up interview includes one yes/no question, six Likert-scale

questions and nine short-response open-ended questions. Interviewers will write down

responses in space provided on the form. Two questions combine a scale with an open-

ended response. Questions inquire about changes clients have noticed, hope, changes in

view of the problem and reflections on the Quick Access Session. Some of the scaling

questions are the same as questions asked on the Quick Access Service Evaluation

Form. The Follow-up BCFPI will be administered and answers will be entered directly

into the BCFPI database by the researcher. The phone interview including the post

BCFPI should take 20-45 minutes to complete. When the phone interview is complete

the interviewer will remind the client that they will be receiving the CSES and TOPSE

questionnaires in the mail with a self-addressed, stamped envelope within a couple of

days. They will also be informed that when we receive the completed questionnaires we

will mail them a Tim Horton's gift certificate for five to ten dollars (depending on budget)

as a token of thanks for their participation through to the end. After the QAS session

therapists will fill out the Therapist Checklist about the session and the clients. For at

least one third of the QAS sessions student observers will also fill out the Therapist

Checklist during and/or after the session.

For clients referred to Quick Access who do not attend, data will be collected at

two times: during the CONTACT Hamilton referral interview and one month after the

middle date of when they could have attended the QAS. These clients complete the


BCFPI at intake as do the others. Non-attenders will be called at one month follow-up to

ask why they did not attend, whether they have sought service elsewhere and whether

they are willing to do the Follow-up BCFPI. These subjects will be offered the same incentive if

they agree to do the follow-up interview. If non-attenders express a desire to attend the

QAS during the phone interview they will be given the opportunity to do so.

Data collection at the clinic will take place between October 2009 and February

2010. On average we get between 25 and 30 QAS clients each month. All of these clients

will be approached when they come to the QAS session and asked if they are willing to

be part of the evaluation. Research staff or students will explain what is involved before

and after the session and at the one month follow-up point and if clients agree they will

sign a consent form. We hope to get data on 100 QAS sessions, which will mean at least

two sets of questionnaires per session; five months at 25-30 clients per month yields roughly 125-150 potential sessions, so this target allows for some refusals.

Because our data set is quite complex and can be broken down into data subsets,

we plan to seek at least two undergraduate psychology students and at least one

master’s level applied or clinical psychology or social work student to help with the

evaluation and use subsets of the data for their theses. If we are successful at receiving

the implementation grant we will approach Department Chairs in August and hope to

interest students who are working on theses in the September 2009 to April or August

2010 academic year. At this point we know that we will approach Department Chairs at McMaster

University in the graduate Social Work program, at the University of Guelph in Child and

Family Studies and Applied Psychology, and at Brock University in clinical psychology,

but we will likely approach others also.

The planned evaluation will create a rich database which will answer some basic

and more nuanced questions. We will focus our analysis in several different areas.

Descriptive analysis of the client group who attend the QAS and those who are referred

but do not attend and comparison between these two groups will provide a starting


place. We will also look at the immediate impact of the session by comparing pre- and post-

session questionnaires, including the Coping Self-Efficacy Scale and the Children's and

Young Children’s Outcome Rating Scales. To examine outcome we will compare

BCFPI subscale scores at intake with BCFPI subscale scores at one month post session

and then compare follow-up BCFPI subscale scores of those who received the

intervention with those who did not (non-attenders). We will also consider several

mediating and moderating variables for treatment effect at one month outcome. These

variables will include severity of problem (from pre BCFPI), pre-existing hope, change in

ratings of self-efficacy pre and post session and satisfaction with service. Analysis of

variance or analysis of covariance will be the main tests used to compare groups and to

look at changes over time (pre and post). Regression will be used to look at the impact

of moderating and mediating variables. Qualitative analyses will be done to categorize

answers to open ended questions about the presenting problem, family and individual

strengths and perceptions of the Quick Access Service.
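As an illustration of how these planned comparisons might look in practice, here is a hedged sketch, with invented column and file names, of two of the core tests: a paired comparison of pre- and post-session coping self-efficacy, and an ANCOVA-style model comparing follow-up BCFPI scores of attenders and non-attenders while controlling for intake severity.

```python
# Illustrative sketch of the planned outcome tests; all column names
# (cses_pre, cses_post, bcfpi_intake, bcfpi_followup, attended) are invented.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("qas_evaluation.csv")  # hypothetical merged data file

# Immediate impact: paired comparison of coping self-efficacy scores
# before and after the single session, for attenders only.
attenders = df[df["attended"] == 1]
result = stats.ttest_rel(attenders["cses_post"], attenders["cses_pre"])
print(f"pre/post CSES: t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

# One-month outcome: ANCOVA-style model comparing attenders with
# non-attenders on follow-up BCFPI, controlling for intake severity.
ancova = smf.ols("bcfpi_followup ~ attended + bcfpi_intake", data=df).fit()
print(ancova.summary())
```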

As with any research design in an applied setting, this plan contains some

limitations. For our control group we are relying on participants who did not receive

service from the clinic and therefore have no relationship with us or personal investment

in the clinic, so they may not be willing to take the time to answer a follow-up BCFPI.

The number of non-attenders is about one quarter of those who do participate, so the

pool of potential non-attender subjects will be relatively small. We will compare non-

attenders and attenders on the pre-BCFPI to get a sense of whether they are equivalent.

If they are not equivalent on certain factors, we may be able to use these variables as

covariates and statistically control for them.
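The equivalence check could be as simple as comparing the two groups' intake scores directly. The sketch below, using the same invented column names as above plus a hypothetical caregiver_education variable, shows one way to test baseline equivalence and then carry a non-equivalent variable forward as a covariate.

```python
# Illustrative sketch of the baseline equivalence check; all column names
# are invented placeholders.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("qas_evaluation.csv")  # hypothetical merged data file
att = df[df["attended"] == 1]
non = df[df["attended"] == 0]

# Welch's t-test on intake severity: are the groups comparable at baseline?
result = stats.ttest_ind(att["bcfpi_intake"], non["bcfpi_intake"], equal_var=False)
print(f"intake BCFPI, attenders vs non-attenders: p = {result.pvalue:.3f}")

# Any variable on which the groups differ (here, a hypothetical caregiver
# education measure) can be entered as an additional covariate.
adjusted = smf.ols(
    "bcfpi_followup ~ attended + bcfpi_intake + caregiver_education", data=df
).fit()
print(adjusted.summary())
```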

Subjects who did receive the intervention may feel a personal allegiance to their

Quick Access therapist and may therefore answer more positively. If they are returning

to the clinic for further service they may also believe it is in their best interest to be


positive about their experience of Quick Access. Other clients may be hesitant to notice

too much improvement if they believe that improving may restrict their access to further

service. We will try to deal with these possible influences on answers by assuring clients

that only identifying numbers rather than names will be attached to research

questionnaires, and by making it clear that the decision of the Quick Access team about

further service will not be in any way related to answers to research questionnaires.

Because these forms are being used clinically now, work remains to be done on the pre-

and post-session forms to ensure anonymity. Another potential limitation in terms of

outcome is that we are relying primarily on caregivers' (usually mothers') ratings in using

the BCFPI. However, mothers' ratings tend to be the most reliable and useful indicators

of their children’s mental health concerns (Chuck Cunningham, 2008).

DISCUSSION AND LESSONS LEARNED

"From overwhelmed to budding competence" summarizes our experience of

developing this framework. At the beginning the whole project seemed amorphous and

ill defined. At the start we even considered evaluating another service instead, or QAS and

another service together. We learned quickly and repeatedly that evaluation is a time- and labour-

intensive business, harder than it looks, and that establishing priorities and scaling down

are essential steps in the process. Developing the logic model and establishing

research questions were crucial steps in this project gradually taking a more realistic and

manageable shape. As project leader, I found it invaluable to have the research

committee meet regularly to discuss these and other questions. I was very aware that

this evaluation needed to be relevant to the whole clinic and was both unwilling and

unable to make decisions on my own. The Centre’s stepwise format for working through

the planning, and consultation with Susan, provided much-needed structure for us to work

within. The discussions with our committee, however, allowed us to learn together about


the process of evaluation while making the research questions and design relevant to us

and to our vision of the intervention we’re evaluating.

In bouncing around ideas about design and measures and then sharing our ideas

with Susan, we learned directly about what types of designs and measures allow for

different conclusions to be drawn. Despite the greater time involved, we decided to

include standardized measures beyond the BCFPI alone. The design which emerged is

really a hybrid of locally developed questionnaires, most of which we are already using

for clinical and evaluation purposes, with standardized measures to enrich the design.

We also discovered that the Quick Access intervention and our hoped-for outcomes for

clients really do seem to work together.

In reading about approaches to single session therapy in many other places and

settings we learned that there is a lot of commonality across these settings in the design

of the intervention and hoped-for outcomes, and that our Quick Access Service fits in

well. Reading more broadly and deeply about single session therapy and research and

evaluation of single session therapy renewed our enthusiasm for the intervention and

gave us new and different ideas which can be used to make improvements to our

service. The commitment that others have to evaluation and the work they’ve already

done also helped us to see the value of evaluation and to see evaluation as doable.

Reading Moshe Talmon’s book, Single Session Therapy (1990) helped to crystallize

some of the key ingredients of single session therapy and led us to a follow-up interview

format which made sense to us in providing answers for some of the questions which

most interested us.

Most of the communication about the framework happened within our clinic and

this in itself was challenging. We are a small clinic, but even so it was difficult to

find times for the research committee to meet, partly because of the restrictions which

QAS commitments put on clinicians' time. Meetings and e-mail were valuable methods


of communicating for the research committee. I learned from experience that it is

important to follow an agenda and be well prepared for meetings in order to get the most

out of them. Attendance at research committee meetings by members was generally

consistent, which helped a lot. Because of time constraints we were unable to have an

administrative staff person at meetings to take minutes, which meant that a committee

member had to take minutes and make sure they were distributed afterwards. This was

sometimes a problem.

Our full staff meets only bi-monthly, so when information needed to be

communicated to the whole staff it had to happen during those meetings, and time was

limited because those meetings have many items on their agendas. It was

challenging to communicate enough to the staff as a whole so that they could

understand our progress and the plan. At a recent staff meeting when we described the

plan in more detail and staff had the opportunity to look at the revisions to clinic forms

we have made there was a mixture of quiet support and skepticism. When some staff

heard the scope of the proposed project they asked (reasonably) how the data would be

used and whether it would wind up “gathering dust on the shelf”. This meeting brought

home for us the importance of proving the value of the evaluation to staff and continuing

to work to gain their interest and support. It is even more important that the evaluation

has real practical value because of the budget constraints which have resulted in

clinicians being stretched to their maximum. Staff must know that if some clinical staff

will be using even a small proportion of their time to evaluate service, the evaluation

will be useful for them and for our clients. If we go ahead with the evaluation it will be

necessary to intentionally block time to bring staff up to speed in terms of the design and

rationale for the project. In the meantime we plan to bring some of the best readings we

came across to share with other staff and begin to have discussions about them.


Communication with stakeholders outside of the clinic was mostly informal. Gord

Greenway meets fairly regularly with staff from CONTACT Hamilton and other local

agencies like Woodview Children’s Centre and Charlton Hall. At these meetings he

would occasionally talk about our project. I talked occasionally with Kathy deJong,

Rhiannon Jones and Karen Young from ROCK in Halton and discussed our progress

and theirs. Two research committee members and one other staff person were

able to attend the one day meeting at ROCK about walk-in therapy and evaluation of

same. This meeting was invaluable to us from both a clinical and evaluation standpoint.

It gave us the opportunity to hear about different ways to approach walk-in therapy and

allowed questions to be answered and problems to be addressed directly by talking to

others with similar experience. I made a contact from that meeting, Gillian MacKay,

evaluation consultant from Catholic Family Services in Hamilton, with whom I've since met

to discuss evaluation of our services and to trade our locally made questionnaires and

forms. As mentioned earlier, I also had phone and e-mail contact with two other

agencies who are working on capacity building evaluation grants for brief interventions.

It was both useful and reassuring to share experiences with them and may lead to some

continued connection.

Our clinic is already benefiting from the work done thus far towards the

evaluation project. We have established a research office where resources and data

can be organized and stored. Through the process of seeking relevant literature we

discovered that through our Public Health Library Service we are able to log onto a large

academic and medical database from our own computers and search for articles

ourselves. Last year we arranged to get SPSS installed on several computers at the

clinic. Planning the project has also led us to create a plan to involve students in the

evaluation which is something we have wanted to do for years. Members of the

research committee revised all of the forms which clients fill out before and after QAS


sessions so that they are shorter, easier to complete and more relevant to the

intervention. Research committee members have begun to share some of the different

ideas about how to manage and support QAS that they have gleaned from their reading

and from talking to others doing similar work. Our hope is that the life which the

evaluation project has breathed into members of the research committee, regarding both

Quick Access and the value of reflecting on and evaluating our work, will be felt by and

shared with other staff, students and clients.

One of the most important lessons learned for us has been that doing relevant,

useful and interesting evaluation of high quality is demanding. The reason that

evaluations are not done that often in small organizations has to do with the amount of

work involved to do a good job but, more than that, with the lack of internal structure to

support this work. In small agencies where there is little to no direct support for research

it is very difficult to carve out enough time from normal workloads to allow projects to get

started and seen through. At our agency we have thought about doing evaluation

research for many years and have even done some small projects without receiving

additional funding or support. We routinely use evaluation forms, clinic-wide and adapted

for individual services, groups, etc., and look at the data we receive. Yet not until we had

some funding which allowed us to dedicate two days a week of the project leader’s time

could we really plan an evaluation as it should be planned: going through steps logically

from start to finish. Even with these resources in place it was difficult for me as project

leader to guard the days I blocked off for research so that they did not get eaten up by

other clinical meetings or phone calls. Clinical work has urgency to it whether or not

clients are in crisis. Timing is crucial in clinical work whereas, at least in planning an

evaluation, timing only became crucial close to the deadline. Nevertheless, even having

the opportunity to do some work on the evaluation every week and occasionally devote


full days to it allowed us to finally build some momentum and continuity in the efforts we

put in.

Another important lesson we learned was that planning an evaluation required us

to reflect on our practice and to become more intentional in what we do. For the

research committee it was very much a process of discovery and learning to develop the

logic model and research questions and to consider which measures seemed most

relevant. The culture at our clinic overall is one which supports ongoing learning and

professional development, delivery of respectful and relevant services to clients, and

accountability and intentionality. Certainly for the research committee and for many of

the other staff at the clinic finally having the opportunity to plan this evaluation has

allowed us to start to put some of our values and hopes into action. For me, I have a

sense of pride that we are doing something of our own to look critically at, and prove the

value of, our own services. Being the “little cousin” children’s mental health service in

town alongside McMaster Children’s Hospital with their connection with the Offord

Centre for Studies on Child Development can be intimidating.

Our experience of working with the Centre of Excellence for Child and Youth

Mental Health has been a positive one. The structure laid out at the outset was well

thought out and logical. The process has almost seemed magical in some ways

because at the beginning it was impossible to imagine having a plan which makes sense

on many levels and seems doable. We have come a long way in a short time. The

tasks of filling in the tables/matrices were very useful in making things concrete and

understanding research principles in action. Working with Susan Kasprzak was a

pleasure. She was respectful and encouraging while bringing a different and useful

perspective to the planning of the project. I very much liked having some “milestones”

along the way when Susan would check in about progress. With a light and flexible

touch she helped us stay on track with deadlines. Susan was very accommodating and


generous with her time including continuing to be available for phone calls and draft

review during "crunch time" in April. I attended three of the four webinars (I was ill for

one) and found them and the supplementary material very useful. Some of the

information was not new to me but it was good to have a review to reassure ourselves

on the committee that we were covering all the bases as well as possible. I also used

some of the toolkits and resources available online from the Centre, especially in the

early stages when we were trying to map out what was involved with doing an

evaluation.

CONCLUSIONS AND RECOMMENDATIONS/NEXT STEPS

One insight which we have gained through this process is that if an agency wants

to evaluate itself there needs to be commitment from all levels of the agency to the

evaluation process. Only with strong commitment from management and staff will those

involved directly with the evaluation process be able to devote enough of their time and

energy to the evaluation to get it done. I have felt as though I had this commitment

through the term of the capacity building grant in many ways: through the help and

involvement of the research committee, through my manager, Gord Greenway, being

flexible about my workplan and methods (for example allowing me to write this report at

home), and through other staff, in their patience with demands on my time and their

interest in the evaluation.

The next insight is closely related to the first one: evaluation must be relevant to

the clients, staff and managers of the clinic in which it is done. Evaluations should be

done to answer questions which are relevant to those delivering the service because

they can see the concepts and issues being evaluated as relevant to the clients that they

serve. This reminds me of something I often hear from parents about their children,

“He/she does well in the subjects they like.” I often respond to parents that I think we all

29

do well at things we like and it’s much easier to learn something if you’re interested in it.

Staff need to see the evaluation as in some way making their workplace and their job

better.

The other lesson learned through this process has been how gratifying and

helpful it is to talk with others who are involved in delivering similar interventions and

evaluating their services. Through these connections, many useful shortcuts and ideas can be shared, and a sense that one's work has meaning and coherence develops. Especially in difficult economic times, when funding sources expect ever more to be done with less and demand proof that it works, it is helpful and revitalizing to talk with other

agencies facing similar challenges. It helps to lessen feelings of isolation and

powerlessness.

The recommendations we have for the clinic, in order to continue to grow its capacity for evaluation and make it useful, flow out of the insights we have gained and our reflections on the process:

1. Support some staff in using some of their time towards research and evaluation activities on an ongoing basis.

2. Make sure that you have regular updates with the whole staff and try to make these updates fun. Allow time for questions/comments/discussion. Invite different perspectives on services and on evaluation.

3. Form a research committee and encourage staff and students to attend, even occasionally if that is all that is possible.

In terms of our relationship with the Centre, we found it very helpful to have Susan

come to the clinic to do the logic model with us. A longer visit or another visit would

have been great. Having the opportunity to talk to past recipients of grants and awards

in similar settings who have evaluated similar services or asked similar research

questions would have been a welcome addition partway through the process. At times we felt

that there were issues related to doing research in clinical settings which were not

particularly well explored within the resources available through the Centre. The divide


between the academic world of research and the practical, messy, logistically crazy

world of the clinic remained in play at times in our relationship with the Centre.

The next step planned at this point is to apply for the Evaluation Grant so that we can carry out the plan. Over the next few months we plan to continue sharing ideas with other staff members by holding some in-service meetings where we discuss articles or chapters about single session therapy and aspects of our intervention. Our hope is to work towards more consistency in the way the QAS is delivered before we begin the evaluation. We have already begun using our revised pre- and post-session questionnaires for the QAS, and we will pay close attention to the responses we get and how they differ from the information we got from the older versions. Our clinic is involved in a process of re-visioning our mandate, roles and services, and as part of this the research committee and other interested staff members will begin putting together some descriptive statistics of "who are our clients" using our intake BCFPI data and the Kinark database over the summer.
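
As a rough, illustrative sketch of the kind of "who are our clients" summary we have in mind, the Python/pandas snippet below shows how a de-identified intake export might be profiled. The file name, column names and subscale labels are hypothetical placeholders for illustration only, not the actual fields of our BCFPI or Kinark exports.

    import pandas as pd

    # Load a de-identified intake export (assumed: CSV, one row per client).
    df = pd.read_csv("bcfpi_intake_export.csv", parse_dates=["referral_date"])

    # Restrict to a referral window, e.g. October 2007 to October 2008.
    window = df[(df["referral_date"] >= "2007-10-01") &
                (df["referral_date"] <= "2008-10-31")]

    # Basic demographics.
    print(window["age"].describe())                       # age distribution
    print(window["gender"].value_counts(normalize=True))  # proportions by gender

    # Mean scores on assumed BCFPI subscale columns (placeholder names).
    subscales = ["externalizing_t", "internalizing_t", "family_functioning_t"]
    print(window[subscales].mean().round(1))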

KNOWLEDGE EXCHANGE

I have described in detail some of the knowledge exchange activities we have engaged in thus far, including connecting with two other agencies holding capacity building grants, joining a community of practice for walk-in therapy and sharing information with our clinic at staff meetings. I have not yet mentioned that at the end of January, as part of the start of our re-visioning process as a clinic, Danielle Clement and I analyzed BCFPI data on clients referred between October 2007 and October 2008 and presented some of this data to clinic staff as a PowerPoint Jeopardy game. Questions were designed to encourage teams of staff members to make their best guess about qualities and characteristics of the clients we see, from demographic information to family and individual subscales.

individual subscales. Staff really enjoyed this activity and it got them talking and

31

interested in how more knowledge of characteristics of our clients could influence our

decisions about service provision. It also demonstrated areas in which clinical

experience proved valuable in predicting statistics and other areas where staff were

surprised at what the statistics showed. This is not the section for mentioning new skills gained, but I must note that this is the first time we have successfully analyzed BCFPI data at the clinic ourselves rather than relying on Peter Pettingil to do it for us! I sent Peter a copy of the Jeopardy PowerPoint and he loved it. In terms of sharing information about our evaluation plan, we have already shared our Evaluation Matrices with Dan Bajorek at Point in Time, and I plan to send a copy of this report and our measures to Joh Nelson and Ann McCarthy of Hands TheFamilyHelpNetwork in North Bay, who were at the walk-in meeting on April 1 and are interested in designing an evaluation for their service.

If we receive funding to carry out the evaluation we have planned, we will have a kick-off half-day meeting and potluck lunch in September to share the plan and logistics with staff. We will also plan periodic updates with staff about how the research is going and to hear any comments and concerns they may have. Because we are involved in the preliminary stages of a re-visioning process as a clinic, we plan to make results from the evaluation central to our review of the Quick Access Service. We hope to complete this process by the end of 2010, and I believe that because of this the chances of our results being closely reviewed and used to implement changes are high. We will have a series of presentations of results to staff and managers as different parts of the analysis are completed, and we will discuss changes and their application to clinical practice. We will also encourage discussion of the evaluation results at various meetings/committees within our agency, including team meetings, the Planning and Advisory Committee, re-visioning committee meetings and, of course, the Research Committee. I very much hope we have the opportunity to present our results at one of the Walk-in Therapy Community of Practice meetings in late 2010. Once our results are complete, we will hold an open forum for interested children's mental health agencies and stakeholders in our region.


REFERENCE LIST

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.

Boyhan, P. (1996). Clients' perceptions of single session consultations as an option to waiting for family therapy. Australia & New Zealand Journal of Family Therapy, 17(2), 85-96.

Campbell, A. (1999). Single session interventions: An example of clinical research in practice. Australia & New Zealand Journal of Family Therapy, 20(4), 183-194.

Chesney, M. A., Neilands, T. B., Chambers, D. B., Taylor, J. M. & Folkman, S. (2006). A validity and reliability study of the Coping Self-Efficacy Scale. British Journal of Health Psychology, 11(3), 421-437.

Coverley, C. T., Garralda, M. E. & Bowman, F. (1995). Psychiatric intervention in primary care for mothers whose schoolchildren have psychiatric disorder. British Journal of General Practice, 45, 235-237.

Cunningham, C. E., Boyle, M., Hong, S., Pettingil, P. & Bohaychuk, D. (2009). The Brief Child and Family Phone Interview (BCFPI): Rationale, development and description of a computerized children's mental health intake and outcome assessment tool. Journal of Child Psychology and Psychiatry, 50(4), 416-423.

Denner, S. & Reeves, S. (1997). Single session assessment and therapy for new referrals to CMHTs. Journal of Mental Health, 6(3), 275-280.

Duncan, B. L., Miller, S. D. & Sparks, J. A. (2003). The Child Outcome Rating Scale. Ft. Lauderdale, FL: Authors.

Engel, K., Barwick, M., Urajnik, D., Cohen, L., Sumner, G., Reid, G. & Khatun, J. (2008). Walk-in mental health care: Clinical practice and research summary for the West End WI Counselling Centre. PowerPoint presentation, Provincial Walk-in Meeting, Reach Out Centre for Kids, April 1, 2009.

Hampson, R., O'Hanlon, J., Franklin, A., Pentony, M., Fridgant, L. & Heins, T. (1999). The place of single session family consultations: Five years' experience in Canberra. Australia & New Zealand Journal of Family Therapy, 20(4), 195-200.

Hoagwood, K. E. (2005). Family-based services in children's mental health: A research review and synthesis. Journal of Child Psychology and Psychiatry, 46(7), 690-713.

Hubble, M. A., Duncan, B. L. & Miller, S. D. (Eds.) (1999). Introduction to The heart and soul of change. Washington, DC: American Psychological Association.

Hyun-nie, A. & Wampold, B. E. (2001). Where oh where are the specific ingredients? A meta-analysis of component studies in counseling and psychotherapy. Journal of Counseling Psychology, 48(3), 251-258.

Jones, T. L. & Prinz, R. J. (2005). Potential roles of parental self-efficacy in parent and child adjustment: A review. Clinical Psychology Review, 25, 341-363.

Kazdin, A. E., Holland, L. & Crowley, M. (1997). Family experience of barriers to treatment and premature termination from child therapy. Journal of Consulting and Clinical Psychology, 65(3), 453-463.

Kendall, S. & Bloomfield, L. (2005). Developing and validating a tool to measure parenting self-efficacy. Journal of Advanced Nursing, 51(2), 174-181.

Lazarus, R. S. & Folkman, S. (1984). Stress, appraisal, and coping. New York: Springer.

Manthei, R. J. (2007). Clients talk about their experience of the process of counseling. Counselling Psychology Quarterly, 20(1), 1-26.

Metcalf, L., Thomas, F., Duncan, L., Miller, S. & Hubble, M. (1996). What works in solution-focused brief therapy: A qualitative analysis of client and therapist perceptions. In S. Miller, M. Hubble & B. Duncan (Eds.), Handbook of solution focused brief therapy (pp. 335-349). San Francisco: Jossey-Bass.

Miller, J. K. (2008). Walk-in single session team therapy: A study of client satisfaction. Journal of Systemic Therapies, 27(3), 78-94.

Miller, J. K. & Slive, A. (2004). Breaking down the barriers to clinical service delivery: Walk-in family therapy. Journal of Marital and Family Therapy, 30(1), 95-103.

Perkins, R. (2006). The effectiveness of one session of therapy using a single-session therapy approach for children and adolescents with mental health problems. Psychology and Psychotherapy: Theory, Research and Practice, 79, 215-227.

Perkins, R. & Scarlett, G. (2008). The effectiveness of single session therapy in child and adolescent mental health. Part 2: An 18-month follow-up study. Psychology and Psychotherapy: Theory, Research and Practice, 81, 143-156.

Scherbaum, C. A., Cohen-Charash, Y. & Kern, M. J. (2006). Measuring general self-efficacy: A comparison of three measures using item response theory. Educational and Psychological Measurement, 66, 1047-1063.

Sommers-Flanagan, J. (2007). Single-session consultations for parents: A preliminary investigation. The Family Journal: Counseling and Therapy for Couples and Families, 15(1), 24-29.

Talmon, M. (1990). Single-session therapy. San Francisco, CA: Jossey-Bass.

Theodor, F. & Paolini, A. (1999). The results of some preliminary research examining single session consultation. The Hincks-Dellcrest Centre, Sheppard Site.

Trunzo, A. (2006). Engagement, parenting skills, and parent-child relations as mediators of the relationship between parental self-efficacy and treatment outcomes for children with conduct problems. Ph.D. dissertation, University of Pittsburgh, School of Social Work.


APPENDIX A

Program Logic Model: Evaluation of the Quick Access Service, Child and Adolescent Services, Hamilton, ON


APPENDIX B

Outcome Evaluation Matrix


APPENDIX C

Process Evaluation Matrix


APPENDIX D

Quick Access Service Client Information Fact Sheet


APPENDIX E

Licensing agreement for, and copies of, the Young Child Outcome Rating Scale (YCORS), Young Child Session Rating Scale (YCSRS), Child Outcome Rating Scale (CORS) and Child Session Rating Scale (CSRS)


APPENDIX F

Coping Self-Efficacy Scale (CSES)


APPENDIX G

Tool to Measure Parenting Self-Efficacy (TOPSE)


APPENDIX H

Quick Access Service Parent/Caregiver Questionnaire


APPENDIX I

Quick Access Service Youth Questionnaire


APPENDIX J

Quick Access Service Collateral Questionnaire


APPENDIX K

Quick Access Service Client Evaluation


APPENDIX L

Brief Child & Family Phone Interview Adolescent Form (BCFPI-A) and Brief Child and Family Follow-Up Survey (BCFPI)


APPENDIX M

Follow-Up Interview


APPENDIX N

Draft Quick Access Service Evaluation Information and Consent Form


APPENDIX O

Draft Therapist Checklist


APPENDIX P

Description of ethics review process for Public Health Services-City of Hamilton