SOCW 671 # 8
Transcript of SOCW 671 # 8
Single Subject/System Designs; Intro to Sampling
Single-Subject Designs
Evaluation designs that involve arrangements in which repeated observations are taken before, during, and/or after an intervention.
These observations are compared to monitor the progress and assess the outcome of that service.
Logic of Single Subject/System Designs
Unlike experimental designs that involve experimental and control groups, single-system designs have one identified client/system.
This identified client/system may be an individual or group
These designs are based on a time-series
Use on the Micro Level of Social Work Practice
If you are practicing at the micro level, this likely will be the most common method to use.
Directly related to client progress
Measurement Issues
Need to specify targets of intervention by having an operational definition of the target behavior
Triangulation - the use of two or more indicators or measurement strategies when confronted with a multiplicity of measurement options
Self-report scales are often used; these have pluses and minuses
Unobtrusive Measurement Preferred
Will want to reduce bias and reactivity through the use of unobtrusive measurement (observing and recording behavioral data in ways that are, by and large, not noticeable to the person being observed).
First Need Baseline (control phase) Measures
Pattern should not reflect a trend of dramatic improvement to the degree that it suggests the problem is nearing resolution
Should have many measurement points
Chronologically graphed data should be stable
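The stability requirement above can be checked with a quick trend estimate. A minimal sketch, assuming baseline scores on a problem measure are recorded in chronological order (the function name, data, and threshold are illustrative, not from the slides):

```python
def baseline_trend(scores):
    """Least-squares slope of chronologically ordered baseline scores.

    A slope near zero suggests a stable baseline; a strong downward
    slope on a problem measure would suggest the problem is already
    improving before the intervention starts.
    """
    n = len(scores)
    x_mean = (n - 1) / 2                 # mean of time indices 0..n-1
    y_mean = sum(scores) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(scores))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

stable = baseline_trend([6, 7, 5, 6, 7, 6])     # hovers around 6
improving = baseline_trend([9, 8, 6, 5, 3, 2])  # dramatic downward trend
```

A near-zero slope (like the first series) supports proceeding to the intervention phase; a pronounced trend (like the second) suggests the baseline is not yet a usable control phase.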
Alternative Designs
AB
ABAB
Multiple Baseline & Successive Interventions
Multiple Component
AB: Basic Single-Subject Design
Collect data during baseline period
Collect data during intervention
Problem is that it does not control well for history
ABAB: Withdrawal/Reversal Design
Two problems:
Improvement in target behavior may not be reversible even when the intervention is withdrawn
Practitioner may be unwilling to withdraw something that appears to be working
Multiple-Baseline Design (Successive Interventions)
Consists of several different interventions
The interventions are staggered: each intervention is applied one after another in separate phases
The interventions are applied to different target problems, settings, or individuals
Multiple-Component Design
Combines elements of the experimental replication and successive-intervention designs
Can be used with or without baselines
Purpose is to compare the relative effectiveness of two different interventions
Problem: it is difficult to infer that only one component produced the change in target behavior
Data Analysis
Two-standard-deviation band approach (Shewhart chart)
Chi-square, t-test, & ANOVA
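One way the t-test applies here is to compare all baseline-phase observations against all intervention-phase observations. A minimal sketch of a pooled two-sample t statistic; the function name and scores are made up for illustration:

```python
from statistics import mean, variance

def phase_t(a, b):
    """Pooled two-sample t statistic comparing baseline-phase (a)
    and intervention-phase (b) observations; returns (t, df)."""
    na, nb = len(a), len(b)
    # pooled variance across the two phases (variance() uses n-1)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    se = (sp2 * (1 / na + 1 / nb)) ** 0.5   # standard error of the difference
    return (mean(a) - mean(b)) / se, na + nb - 2

# hypothetical weekly anxiety ratings: baseline (A), then intervention (B)
t, df = phase_t([6, 7, 5, 6, 7, 6], [4, 3, 4, 2, 3, 3])
```

Note that a plain t-test assumes independent observations, an assumption time-series data from one client can violate, which is one reason the band approach on the next slide is popular for single-system data.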
Shewhart Chart
Mean level of baseline data is identified
Two-standard-deviation bands are constructed above and below the mean line
These bands are extended into the intervention phase
If two successive observations during intervention fall outside a band, there is a significant change
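The band procedure above can be sketched in a few lines. The data and function name are illustrative, not from the course materials:

```python
from statistics import mean, stdev

def two_sd_band_check(baseline, intervention):
    """Two-standard-deviation band approach: flag a significant change
    when two successive intervention-phase observations fall outside
    the mean +/- 2 SD bands built from the baseline phase."""
    m = mean(baseline)
    s = stdev(baseline)                    # sample SD of baseline data
    lower, upper = m - 2 * s, m + 2 * s
    outside = [y < lower or y > upper for y in intervention]
    # two successive True values => significant change
    significant = any(a and b for a, b in zip(outside, outside[1:]))
    return m, (lower, upper), significant

# hypothetical weekly anxiety ratings
m, bands, sig = two_sd_band_check([6, 7, 5, 6, 7, 6], [4, 3, 4, 2, 3, 3])
```

Here every intervention observation falls below the lower band, so the change would be flagged as significant.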
Complicating Factors
Carryover – occurs when the effects obtained in one phase appear to carry over into the next phase
Contrast – when the subject reacts to the difference between the two interventions or phases
Order of presentation – when the order of the phases by itself may be part of a causal impact
Incomplete data – when a subject or client does not “fit” nicely into the phase time frame
Training phase – client may not have the prerequisite skills for full participation in the intervention when it begins
Causality Criteria in Single Subject (System) Designs
Temporal arrangement
Co-presence of the intervention & desired change in target behavior
Repeated co-presence of the intervention and the manifestations of the desired change
Consistency over time
Conceptually and practically grounded in scientific/professional knowledge
Design Validity & Reliability
Replication is very useful
Statistical Conclusion Validity: Did Change Occur?
Internal Validity: Was Change Caused by the Intervention?
Construct Validity: Were the Intervention and Measurement of Outcomes Accurately Conducted?
Intro to Sampling
Non-probability
Probability
Non-probability
Reliance on available subjects
Quota sampling
Snowball sampling
Selecting informants
Probability
Simple random
Systematic
Stratified
Cluster
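The first three probability designs can be sketched against a hypothetical sampling frame. The frame, seed, and stratum variable below are made up for illustration:

```python
import random

random.seed(0)  # reproducible illustration

# hypothetical sampling frame: 100 client records tagged with a region
frame = [{"id": i, "region": "urban" if i % 2 else "rural"} for i in range(100)]
n = 10  # desired sample size

# Simple random: every element has an equal chance of selection
simple = random.sample(frame, n)

# Systematic: random start, then every k-th element of the frame
k = len(frame) // n
start = random.randrange(k)
systematic = frame[start::k][:n]

# Stratified: draw proportionally within each stratum (here, region)
strata = {}
for person in frame:
    strata.setdefault(person["region"], []).append(person)
stratified = []
for group in strata.values():
    stratified.extend(random.sample(group, round(n * len(group) / len(frame))))
```

Cluster sampling differs in that whole naturally occurring groups (e.g., agencies or neighborhoods) are randomly selected first, and elements are then drawn from within the chosen clusters.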
Issues in Program Evaluation
Evaluation as Representation
Program evaluation is not the program, only a snapshot of it
Organizations are complex; therefore evaluations often focus on select services
Evaluations can go beyond a consumer focus and may review staff, community relations, continuing education, etc.
Common Characteristics
Program models
Resource constraints
Evaluation tools
Politics and ethics
Cultural considerations
Presentation of evaluation findings
Common Characteristics (continued)
Program models
Need a blueprint as expressed by a logic model
Program survival requires that evaluation be performed to maintain contracts
Outputs and outcomes monitored
Outputs are non-client-related objectives
Outcomes are client-related objectives
Infrastructure-related objectives serve a program maintenance function
Common Characteristics (continued)
Resource constraints
Insufficient time, staff, money, or evaluation know-how
Typical implementation time:
Needs assessment: 3 to 6 months
Evaluability assessment: 3 to 6 months
Process evaluation: 12 to 18 months
Outcome evaluation: 6 to 12 months
Cost-benefit analysis: 1 to 2 months