Page 1

DOCUMENT RESUME

ED 395 207 CE 071 735

AUTHOR Ntiri, Daphne Williams, Ed.
TITLE Evaluation in Adult Literacy Research. Project ALERT. Phase II.
INSTITUTION Detroit Literacy Coalition, MI.
SPONS AGENCY Michigan State Dept. of Education, Lansing. Office of Adult Extended Learning.
REPORT NO ISBN-0-911557-13-X
PUB DATE 96
NOTE 104p.; For the phase I report, see CE 071 734.
PUB TYPE Reports Evaluative/Feasibility (142)
EDRS PRICE MF01/PC05 Plus Postage.
DESCRIPTORS Adult Basic Education; *Adult Literacy; Black Education; Blacks; Community Education; Community Organizations; Cultural Differences; Educational Needs; Educational Research; Evaluation Criteria; *Literacy Education; Literature Reviews; Material Development; *Models; Pilot Projects; Program Development; *Program Evaluation; Questionnaires; *Self Evaluation (Groups); *Self Evaluation (Individuals); Student Attitudes; Tables (Data); Teacher Attitudes; Tutors; Voluntary Agencies
IDENTIFIERS 353 Project; African Americans; Detroit Literacy Coalition MI

ABSTRACT
This document contains an evaluation handbook for adult literacy programs and feedback from/regarding the evaluation instruments developed during the project titled Adult Literacy and Evaluation Research Team (also known as Project ALERT), a two-phase project initiated by the Detroit Literacy Coalition (DLC) for the purpose of developing and testing a user-friendly program evaluation model for evaluating literacy operations of community-based organizations throughout Michigan under the provisions of Section 353 of the Adult Education Act. The handbook, which is intended for literacy program administrators, explains a six-step process for constructing literacy evaluation tools tailored to particular programs. Presented next is a 23-item reference list. The remainder of the document consists of feedback obtained during the pilot tests of the project-developed self-evaluation instruments to be completed by students and tutors participating in volunteer literacy programs. Appendixes constituting approximately 50% of this document contain the following: letter to Michigan literacy providers; glossary; sample questionnaire cover letter; ALERT feedback form; final revised self-evaluation program tutor and student survey questionnaires; self-evaluation program questionnaire; DLC policy statement, overview, board of directors, mission profile, and services and primary goals; and list of responding literacy sites. (MN)

***********************************************************************
Reproductions supplied by EDRS are the best that can be made from the original document.
***********************************************************************

Page 2

U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.

Minor changes have been made to improve reproduction quality.

Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

4,"PERMISSION TO REPRODUCE THISMATERIAL HAS BEEN GRANTED BY

/0-61

TO THE EDUCATIONAL RESOURCESINFORMATION CENTER (ERIC)."

[Cover: EVALUATION IN ADULT LITERACY RESEARCH, Project ALERT, Phase II]

Page 3

Detroit Literacy Coalition

EVALUATION

IN

ADULT LITERACY RESEARCH

Project ALERT

Phase II

Daphne Williams Ntiri
Editor & Project Director

Page 4

EVALUATION

IN

ADULT LITERACY RESEARCH

Project ALERT

Phase II

Daphne Williams Ntiri, Ph.D.
Editor and Project Director
Associate Professor
Interdisciplinary Studies Program
Wayne State University

Gregory Robinson
Gladys Coleman Walden
Research Associates
College of Education
Wayne State University

Weimo Zhu, Ph.D.
Evaluation Consultant
Assistant Professor
College of Education
Wayne State University

Sponsored by Office of Adult Extended Learning Services
Michigan State Department of Education
Ron Gillum, Director
Ken Walsh, Supervisor
Gloria Grady Mills, Literacy Coordinator

Cover Illustration by Shirley Woodson Reid

Page 5

EVALUATION

in

ADULT LITERACY RESEARCH

Project ALERT
PHASE II

Phase II is a continuation of the work initiated in 1994 with the Project ALERT team. Phase II sets out to test the evaluation surveys developed by the 1994 team and to collect feedback on the reactions from both tutors and tutees to the structure and appropriateness of the inquiries. Funded by the State Department of Education and executed through the Detroit Literacy Coalition at Wayne State University, the Project ALERT team went beyond the confines of the metropolitan Detroit area to seek answers from stakeholders as far away as Muskegon, Grand Rapids and Ironwood. The hope was to enlist the reactions and criticisms of many Michiganders in the process of reshaping the evaluation tools.

Daphne W. Ntiri, Associate Professor of Social Science, Interdisciplinary Studies Program, Wayne State University, who directed this project, is also a State Literacy Facilitator and the Director of the Detroit Literacy Coalition, a community-based organization in the metro Detroit area. This organization meets the needs of at-risk Detroiters through the collaborative networking of its fifteen member local agencies. Dr. Ntiri has written extensively on Adult Education/Literacy in the Third World, especially Africa, and, as Co-Director of the Council for Excellence in Adult Learning (CEAL), Wayne State University, is presently formulating strategies to advance long-term professional education here in Michigan.

Main entry under title: EVALUATION IN ADULT LITERACY RESEARCH, Project ALERT

1. Literacy - Periodicals 2. Evaluation - Periodicals 3. Adult Education - United States - Periodicals 4. Michigan - Periodicals I. Ntiri, Daphne Williams

ISBN: 0-911567-13-X

© 1996 by Detroit Literacy Coalition, Wayne State University, Detroit, MI 48202

Copyright law prohibits the reproduction, storage, or transmission in any form by any means of any portion of this publication without the express written permission of the Detroit Literacy Coalition and of the copyright holder (if different) of the part of the publication to be reproduced. The Guidelines for Classroom Copying endorsed by Congress explicitly state that unauthorized copying may not be used to create, replace, or substitute for anthologies, compilations, or collective works.

First Edition

Printed in the United States of America

Printed on Recycled Paper

Page 6

The DLC Statement of Purpose
The purpose of the Detroit Literacy Coalition (DLC) is to promote the total eradication of illiteracy in our community and to advance the cause of reading and writing both for work and pleasure. The DLC is committed to improving the quality of life of every Michigan citizen and resident by ensuring that increased access to basic education is available to those deprived or those who for some reason or other missed out on early educational opportunities. Its central tenet is that total literacy is fundamental to the well-being of Detroit society.

Program Mission
In the absence of a sound, scientific and viable instrument to measure the effectiveness of statewide volunteer adult literacy programs, Project ALERT embarked on the first of a two-part evaluation model plan to address program evaluation needs. The objective was to create a user-friendly evaluation model that can be applied statewide to volunteer literacy programs.

Page 7

TABLE OF CONTENTS

LIST OF TABLES xi

LIST OF FIGURES xiii

TO THE READER xv

PREFACE xvii

I. INTRODUCTION 1

A. Program Goals 1

B. Program Management 2

II. EVALUATION HANDBOOK FOR ADULT LITERACY PROGRAMS 3

Bibliography 24

III. FEEDBACK ON EVALUATION

Part I 25
Part II 29


APPENDICES

A. Letter to Michigan Literacy Providers 45
B. Glossary 49
C. Sample of Cover Letter Accompanying Questionnaire 53
D. ALERT Feedback Form 57
E. Final Revised Self-Evaluation Program Tutor and Student Survey Questionnaires 61

F. Self-Evaluation Program Questionnaires 73

G. DLC Policy Statement and DLC Overview 87
H. DLC Board of Directors 91

I. DLC Mission Profile 95

J. Services and Primary Goals 99

K. List of Responding Literacy Sites 103

Page 8

LIST OF TABLES

Table 1. Six Basic Steps for Conducting Adult Literacy Program Evaluation 5

Table 2. Differences Between Summative and Formative Evaluations 9

Table 3. Evaluation Methods Chart 23

Table 4. Demographics of Respondents 30
Table 5. Characteristics of Respondents' Agencies/Institutions 31

Table 6. Chi-Square Statistics of Feedback of Respondents on Kind of Agency/Institution 41

Table 7. Chi-Square Statistics of Feedback of Respondents on Educational Background 42

Table 8. Chi-Square Statistics of Feedback of Respondents by Years in Adult Literacy 42

Table 9. Chi-Square Statistics of Feedback of Respondents on Michigan Certification 43

Table 10. Chi-Square Statistics of Feedback of Respondents by Gender 43

Page 9

LIST OF FIGURES

Figure 1. Feedback on Tutor Questionnaire: Length of the Instrument 33

Figure 2. Feedback on Tutor Questionnaire: Appropriateness of Questions 34

Figure 3. Feedback on Tutor Questionnaire: Clarity of the Instrument 35

Figure 4. Feedback on Tutor Questionnaire: Applicability of the Instrument 36

Figure 5. Feedback on Student Questionnaire: Length of the Instrument 37

Figure 6. Feedback on Student Questionnaire: Appropriateness of Questions 38

Figure 7. Feedback on Student Questionnaire: Clarity of the Instrument 39

Figure 8. Feedback on Student Questionnaire: Applicability of the Instrument 40

Page 10

To the Reader

This is a sequel to an earlier study, Evaluation in Adult Literacy Research: Project ALERT, undertaken in 1994 under the auspices of the Detroit Literacy Coalition, a nonprofit organization based at Wayne State University, Detroit, Michigan. Evaluation in Adult Literacy Research, Project ALERT, Phase II is dedicated to adult educators in administration, instruction and volunteer service who take an interest in advancing program quality and effectiveness. It has focused on the development of evaluation criteria and evaluation instruments and their application to volunteer literacy programs.

Because of the diversity in adult education programming, it seems almost impossible to establish a standard evaluation model that would be applicable to all programs and populations. Given the diversity in providers, delivery systems, course offerings, support services and adult education learners, evaluation outcomes are impacted differently. It is as a result of this complexity that the Project ALERT team embarked on a mission to fill a void, to create a user-friendly model designed to be adaptable and flexible, and to provide effective feedback for program improvement and quality.

The first part of the evaluation research study conducted in 1993/94 was the creation of a set of evaluation instruments for volunteer adult literacy programs. Modification to the instruments led to the completion of a set of simplified, time-sensitive and ready-to-be-tested questionnaires for tutors and tutees in volunteer literacy programs. This current work continues the past work and is presented in two parts. Part I presents a handbook to literacy administrators and other stakeholders on how to construct literacy evaluation tools best suited for their programs. Part II shares reactions to a set of survey questionnaires tested with tutors and tutees in the field. The reactions were shared on a feedback form. These were not about any program per se but about the instrument itself, its structure, validity, relevance and objectivity. The feedback has assisted with the generation of a refined and scientifically tested set of instruments.

The intent of this work remains giving evaluation a central place in adult literacy programming, which in turn will ensure quality assessment and make data available for decisions in program improvement.

Daphne Williams Ntiri
Editor & Project Director

Page 11

PREFACE

The MICHIGAN STATE DEPARTMENT OF EDUCATION, OFFICE OF ADULT EXTENDED LEARNING SERVICES contracted with the DETROIT LITERACY COALITION (DLC) to develop and pilot a program evaluation model that can be applied to literacy operations of community-based organizations statewide under the provisions of Section 353 of the Adult Education Act. Officially titled Community-Based Literacy Organization Program Evaluation, the project assumed a new name during the course of the research: Adult Literacy and Research Evaluation Team (ALERT). Because there are several other adult literacy non-profit agencies in the area with a vital interest in the project, and because DLC is the key community-based Adult Education organization in the city, the DLC's access to research and experimentation was significantly enhanced.

The Detroit Literacy Coalition is a ten-year-old 501(c)3 organization that works with key literacy non-profit organizations in Detroit to bring about collective change and eradicate illiteracy. The DLC's network of services includes: grants support, staff training and development, technical assistance in adult literacy resource development and leadership programs. It serves hundreds of metropolitan Detroiters at various levels of functional illiteracy. The DLC works with over sixteen direct literacy providers including the City of Detroit, the Detroit Public Library, the Detroit Public Schools/Adult Education Department, the Archdiocese-supported Dominican Literacy Project, Literacy Volunteers of America, AKA Reading programs, Wayne State University, Wayne County Community College, Highland Park Community College and various churches, corporations and businesses. The DLC gives special attention to sheltered and disabled populations such as seniors and the deaf. A successfully implemented DEAF OPTIONS program in 1991 contributed to the improvement of literacy education among the deaf.

The organization meets the needs of at-risk Detroiters through the collaborative efforts of its members and through the over 1,000 volunteers who serve as instructors, tutors and helpers. The Detroit population includes an estimated 200,000 functional illiterates. Family and intergenerational illiteracy is most common. Other affected populations include teenage mothers who contribute to the growth of family illiteracy in significant numbers; the economically depressed who have been denied or have denied themselves the opportunities for a sound education; the physically and emotionally challenged who have not always been situated to benefit from special education services; and masses of dislocated workers suffering from recent shifts in global trade and technology.

The DLC holds community activities such as the Literacy Summit, which assembled literacy providers and supporters to discuss and review the problem of illiteracy in this society, and an Annual Walk-A-Thon that promotes community consciousness about illiteracy in our communities. The DLC hosted the 1992 Annual State Literacy Conference that brought over 200 state practitioners and theorists together to discuss issues and map new trends in Literacy and Adult Education.

Page 12

I. INTRODUCTION

A. Program Goals

1) To field test a pair of evaluation instruments for tutors and tutees among various populations in over 100 volunteer literacy programs around the State of Michigan

2) To obtain feedback on instrument structure, content relevance and appropriateness through a newly designed feedback form

3) To refine or reconstruct the existing models based on data from the feedback form

4) To develop a how-to manual for literacy practitioners and administrators who may not have the knowledge of developing or applying evaluation instruments

5) To develop a strong research base for evaluation of volunteer literacy programming

6) To disseminate the outcomes through publications and conferences

The Outlook

Participation and Attrition of Low-Literate Adults in ABE Programs

An estimated 18 to 28 million low-literate American adults have never entered Adult Basic Education (ABE) programs for a myriad of reasons (Hunter & Harman, 1979). Personal barriers such as low self-esteem and miscalculating the importance and relevance of an education are often considered primary explanations for participation and attrition irregularities. Though ABE information is generally available through the public libraries present in every city, lack of immediate access to information about ABE programs is often given by nonreaders as a reason for nonparticipation.

The actual attrition rate ranges from 40% to 60% for adult literacy programs. In the Farra study (1988), 15% of adults referred to a literacy program failed to make contact, and 5% failed to enroll after contacting the program (Bean, Partanen, Wright & Aaronson, 1989). According to the United States Department of Education, the largest of the three adult literacy education programs, Adult Basic Education (ABE), serves only 8% of its targeted population. Another study of ABE enrollees shows that 35% did not complete 12 hours of instruction, and of those going beyond 12 hours, 50% completed less than 48 hours (Dirkx & Jha, 1994).

Knowledge about the systematic differences among low-literate adults suggests that a variety of recruitment strategies may be required. Fingeret (1989) examined seven assumptions of the American middle class about nonreading adults: dignity, diversity, intelligence, culture, the ability to collaborate, aspirations, and resistance to the educational system. Research has demonstrated that these assumptions are biased and lack validity, since nonliterate learners can be manipulative and are known to hold positions of power, all the while keeping their illiteracy a secret.

Page 13

B. Program Management

The grant award was from July 1994 to May 1995.

Project ALERT - Phase II was directed by:

Dr. Daphne W. Ntiri
Associate Professor, Interdisciplinary Studies, Wayne State University
Executive Director, Detroit Literacy Coalition
Literacy Facilitator, Region 4, State of Michigan

Other members:

Gregory Robinson
Graduate Student
Council for Excellence in Adult Learning, College of Education, Wayne State University

Gladys Coleman Walden
Teacher, Detroit Public Schools/ABE
Council for Excellence in Adult Learning, College of Education, Wayne State University

Weimo Zhu, Ph.D.
Assistant Professor, College of Education, Wayne State University

Clerical:

Carla Jordan
Crystal Lomax

Personnel Functions:

Project Coordination and Presentation
Dr. Daphne W. Ntiri

Handbook Development
Gregory Robinson

Instrumentation & Implementation
Gladys Coleman Walden

Technical & Statistical Analysis
Weimo Zhu, Ph.D.

Sponsored by Office of Adult Extended Learning, Michigan Department of Education

EDITOR & PROJECT DIRECTOR
Project ALERT
Daphne Williams Ntiri, Ph.D.
Wayne State University/Detroit Literacy Coalition

ADVISORY BOARD
Anna Meritt, Director
Literacy Volunteers of America - Detroit

Gloria Grady Mills
Michigan State Department of Education

Guerin Montilus
Professor of Anthropology
Wayne State University

Sister Marie Schoenlein
Dominican Literacy Center

Members of the Advisory Board were consulted in different capacities: some for advice on selection of sites for interviews, others for input on instrument structure, and yet others for careful consideration in meeting the state statute. The Director and staff would like to acknowledge the critical support of the Advisory and regular DLC Boards in this significant undertaking.

Page 14

EVALUATION HANDBOOK
for
ADULT LITERACY PROGRAMS

Introduction

Various models of evaluation have been reviewed in Adult Education. Such models include the predetermined objectives approach advanced by Tyler (1949); goal-free evaluation of Michael Scriven (1972); Kirkpatrick's hierarchy of evaluation (1967); and naturalistic evaluation of Egon Guba (1978) and Brookfield (1991). That program evaluation is central to the ongoing effectiveness of adult education programs is unquestionable. It is a regular and vital segment of the adult education discourse because of its political implications, the practical realities, and its connection with the funding cycle.

Evaluation data assume significance in the decisions of government and public sector agencies on the issue of support of adult education and literacy programs. Unfortunately, a "standard" evaluation model for all adult literacy programs cannot hold, given the varied characteristics of the programs and their clientele. According to the literature, there is no standard evaluation that can be validated as a "model." Since most adult literacy programs are short-term and informal, such characteristics tend to be handicaps to the evaluation process and therefore to the subsequent strengthening of the program.

Why are evaluations so relevant and necessary to the survival of adult education programs? Several arguments exist to support the relevance and need for program evaluation. Reasons include to:

judge the effectiveness of ongoing programs
seek realistic ways to enhance existing programs
assess the value of innovative program initiatives
increase the effectiveness of program management
improve cost efficiency
meet various accountability objectives

The focus of any educational evaluative procedure remains the generation of factual data relative to program effectiveness. If not conducted competently and scientifically, the evaluation outcomes lack meaning and can be condemned as worthless. Evaluation, therefore, should provide feedback and answer questions that aid in the assessment of methods used to meet program objectives, and thereby facilitate program improvement.

In preparation for an evaluation, there should be ample discussion of what is to be evaluated, and whether the evaluation will be conducted internally or externally. In either case, the following elements are necessary:

the beliefs and expectations of sponsor/employer and evaluator
the gathering of information
the formulation of a plan
the organization of relative goals and objectives
the time frame allotted and budget constraints

Page 15

Table 1
Six Basic Steps for Conducting Adult Literacy Program Evaluation

1. DEVELOP A PRIMARY PLAN OF ACTION

2. SELECT AN EVALUATION INSTRUMENT

3. GATHER RELEVANT DATA

4. COLLECT SUPPORT & FEEDBACK DATA

5. ANALYZE AND PROCESS THE DATA

6. PREPARE FINAL REPORT

Page 16

STEP ONE - Develop a Primary Plan of Action

Planning is the first important step in developing any project. Therefore, the focus of the entire planning stage should reflect the ideas of stakeholders and the evaluator. By focusing on a specific plan, the team can narrow objectives and bring greater clarity to the evaluation process. Also, by clarifying goals and objectives, and by sharing all relevant information concerning expected outcomes with stakeholders, the team will sharpen its understanding about the total process and develop a cohesive approach derived from distinct yet related perspectives.

Most experts agree that planning a program evaluation is a complex task. A handbook alone cannot provide everything one needs to know about evaluating a particular literacy program. The diversity in volunteer literacy organizations makes it impossible to recommend a single evaluation procedure for all programs. However, what is of note is that the evaluation process must be tailored to the particular program. Consequently, project evaluators agree that the following elements should be incorporated into your primary plan of action:

Plot out ideas and define problems or concerns.

Develop and work out a primary plan agenda for discussion.

Set an initial meeting between evaluators and sponsors.

Discuss beliefs and expectations.

Share relative documentation or data.

Relate any time, political or budget constraints.

Initiate your evaluation primary plan of action.

Use common sense.


These points are in keeping with an old but wise adage that states: "If you don't know where you're going, how will you know when you get there?"

Following these steps, the next issue concerns planning a primary meeting. Please note the questions below:

1) What questions are you trying to answer from the study, and what are the political motives?

2) What are the resources made available to you?

3) By what system will the data be distributed and collected?

4) How and by whom will the information be reported?

5) How will the results be used? (Promote or cut programs?)

6) How will the instruments be selected?

7) How much time and money are available?

8) Who will analyze the data, and by what process?

Oftentimes, evaluators find that initial objectives or specific plans may need changing because critical information was lacking. So, be prepared for this possibility and guard against this obstacle by remaining objective in your approach. Further, consider the following options:

Continue to update and maintain reasonable contact with stakeholders or sponsors.

Set alternate plans during your first meeting.

Know that data collection procedures are always subject to a certain degree of unreliability.

Incorporate ideas from previously successful projects in order to modify your study.

Page 17

Prepare your mind "up-front" for the job.

Of great significance is the question: how do evaluators manage the diversity of ideas, beliefs and expectations that differ between themselves and sponsors?

The following suggestions may prove useful:

1) Remember, there is no such thing as too much communication. Don't assume anything. ASK!

2) Make sure everyone understands the same definitions that relate to the evaluation process (e.g., summative vs. formative).

3) Be ready to explain and define the differences between evaluation, assessment, and measurement procedures. Although all are related to evaluation, they all have distinct functions and characteristics.

Defining Evaluation

Simply defined, evaluation means the systematic process of collecting and analyzing data. L.R. Gay, a noted researcher and expert in the field, adds, "It is, however, the only part of the definition of evaluation on which everyone agrees" (p. 6). Studies indicate that there exists a host of distinct differences when defining evaluation. Gay further comments, "Examination of evaluation literature makes it very evident very quickly that there are almost as many definitions of evaluations as there are evaluation experts" (p. 6).

He offers these two definitions:

1) Evaluation is the systematic process of collecting and analyzing data in order to determine whether, and to what degree, objectives have been, or are being, achieved.
2) Evaluation is the systematic process of collecting and analyzing data in order to make decisions and improve program outcomes.

Remember, terms such as assessment, measurement and diagnostic take on different meanings and functions. See the glossary in the back of this handbook for a better understanding of these terms. Don't allow the sponsors or other members of the team to be confused by these terms. Their focus might change and your original objectives could become altered.

Based on experiences with the tremendous diversity in Michigan's volunteer adult literacy programs, ALERT found that the "formative" rather than the "summative" approach to evaluation worked best for this project.

The ALERT definitions for these terms are offered below:

Formative: An evaluation in which the program's objective is to generate outcome data that represent an ongoing look at program operation in order to improve the program through adjustment or change.

Summative: When evaluation results contain information given as only a final or definitive assessment of the program's effectiveness and as a basis on which funding decisions are made. This assessment generally includes areas such as student achievement, levels of proficiency, group scores, test results, performance standards, grading scales, etc.

For additional information and clarity on these terms, see the table on the following page.

See the Appendix for an ALERT sample that provides a listing of agenda items used during ALERT's primary planning meeting.

Page 18

Table 2
The Differences Between Summative and Formative Evaluations

Category | Formative Approach | Summative Approach
Purpose | Improvement-oriented | Certification of program utility
Key players | Program staff, administration | Potential sponsors or funding agencies
Evaluation team selection | Usually internally selected | Usually externally selected
Measures | Ongoing, often informal | Specific time limits; valid/reliable
Data collection measures | Frequent, either quantitative or qualitative | Limited, usually quantitatively designed
Size of samples | Often small | Usually large
Assessment questions | Defines participants; improvement-based; examines strengths and weaknesses | Specific outcomes; student achievement test scores
Sample questions | What is working? What needs to be improved? How do participants feel? | What results occur? With whom? Under what conditions? At what cost?

Page 19

It is strongly recommended that if one's objective is to conduct an evaluation that measures such items as student achievement, test scores, or other quantitative values, a "summative" framework be used after the completion of a semester.

On the other hand, if one's objective is to demonstrate that an ongoing program is worthwhile and the main concern is to evaluate the operational procedures themselves, then a "formative" approach might prove more useful.

Below are a few more hints to consider before going on to Step Two. Double check to be sure:

that all involved parties and/or sponsors are informed about exactly when the project will start and what agencies and populations will be surveyed

that you know how each member of the team feels about the process

that you determine and agree on each team member's role in the process

that you specify the length of the entire project, and schedule additional meeting dates for the planning committee and

that you prepare a cover letter to accompany your evaluation that is clear in stated goals and mission for the project.

Summary: Developing a Primary Plan of Action

(Step One)

1. Brainstorm to plot out ideas and define problem areas.

2. Set a meeting date between evaluators and sponsors.

3. Write an agenda to discuss goals and objectives, beliefs and expectations.

4. Share any related data or documentation.

5. Relate all time, political, or budgetary constraints.

6. Formulate and initiate the evaluation plan of action.

Page 20

STEP TWO: Selecting Evaluation Instruments

Determining what evaluation instrument is best suited for your particular adult literacy program is very critical. The instrument selected should be capable of evaluating the overall program objectives.

In general, it is more convenient to use a standardized instrument that has been designed to evaluate similar programs than to construct your own. It is a good idea to find out what evaluation resources are available before deciding on a tool or initiating the instrument development process. Gay (1980) notes, "There are situations, however, for which use of available instruments is impractical, inappropriate, or both" (p. 110), so be prepared to add survey questions to a standardized form or design a totally different form altogether.

As we mentioned in the previous section, "formative" approaches work best for evaluating operational procedures. It is impossible to evaluate every single component of a literacy program, as there are too many considerations. The aim and intent should be to narrow the variables and determine which elements are the most important in meeting assessment objectives. Therefore, the instrument should reflect those goals and objectives set during the primary planning stage.

Keep in mind that you don't want to overwhelm respondents by making questionnaires or interview questions too long. Try, as much as possible, to limit sections of the questionnaires to 10-15 questions. If the questionnaire requires answers for different categories, use no more than three sections. Some of the general areas you might want to focus on include: Who is served by the program? How were they chosen to participate? How is the program funded? What type of services are offered?
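The length guideline above can be checked mechanically while a questionnaire is being drafted. The following Python sketch is purely illustrative and is not part of the ALERT instruments; the class names, section titles and sample questions are hypothetical, and the three-section and 15-question limits simply restate the guideline in this step.

# Minimal sketch (not an ALERT instrument): represent a draft questionnaire
# and warn when it exceeds the length guideline above (at most three
# sections, roughly 10-15 questions per section). All names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Section:
    title: str
    questions: List[str] = field(default_factory=list)

@dataclass
class Questionnaire:
    name: str
    sections: List[Section] = field(default_factory=list)

    def length_warnings(self, max_sections: int = 3, max_questions: int = 15) -> List[str]:
        # Collect warnings rather than raising, so a draft can still be reviewed.
        warnings = []
        if len(self.sections) > max_sections:
            warnings.append(f"{len(self.sections)} sections (guideline: at most {max_sections})")
        for s in self.sections:
            if len(s.questions) > max_questions:
                warnings.append(f"Section '{s.title}' has {len(s.questions)} questions (guideline: at most {max_questions})")
        return warnings

# Example: a hypothetical tutor questionnaire draft.
draft = Questionnaire("Tutor Survey", [
    Section("Background", ["How long have you tutored?", "What training have you received?"]),
    Section("Program", ["Are learning materials suitable?", "Is the facility adequate?"]),
])
print(draft.length_warnings() or "Draft is within the length guideline.")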

To determine if the instrument will enhance a formative evaluation procedure, consider these 10 major points:

1. Does it identify student needs?

2. Does it evaluate any additional services offered (e.g., counseling)?

& Does it evaluate whether the curric:ulum design is meeting student goals?

4. Does it evaluate the facility (e.g., for comfort, available technology, location)?

& Does it evaluate if learning supplies or materials are suitable and available?

& Does it evaluate whether tutors are trained and effective?

7. Does it evaluate the extent to which learning is achieved?

& Does it evaluate what outside barriers might prevent students from meeting goals?

9. Does it evaluate how the program has enhanced or promoted growth and opportunity in students' lives?

10. Does it evaluate the specific needs or major strengths of your program?

Page 21

Most of the 10 elements mentioned above are included in the ALERT models. ALERT designed three different instruments in an effort to cover these critical areas. In most evaluation processes, only one of these instruments is needed. The first model is a very lengthy form designed to address a spectrum of literacy program concerns. It is complete with questionnaires for evaluation of administrators, tutors, and tutees. The second model is a shorter version of the first, and does not evaluate administrative staff. The last revision, or third model, is totally different from the first two, and it particularly addresses the feedback concerns solicited from over 100 literacy providers in the state.

A review of all three ALERT models (examples of the first two models are provided in the Appendix) is recommended before deciding which tool might work best for your particular program. The first model serves as a prototype for the other two. It is comprehensive and designed for lengthy evaluation projects. The shorter revised models are less formative and are time-sensitive, but also cover many of the most critical areas.

A full sample of the third model is included in the Appendix.

The following criteria might further assist you in determining what type of instrument will work best for your particular evaluation project.

1. Does the instrument significantly increase your knowledge and understanding of the elements to be investigated?

2. Can the information from the questionnaire be categorically or systematically used to enhance the analysis process in the most efficient and timely fashion?

3. Can the data collected best be used in a qualitative or quantitative manner (this should be decided before the actual evaluation procedures begin)?

4. Determine which instrument is easiest to understand and use. Is it time-sensitive and user-friendly, especially the questions designed for the students?

5. Will the information and knowledge enhance eventual understanding about the overall program? Could this lead to better decision-making practices?

6. Please remember that if the standardized instrument is not appropriate for your particular literacy program, then it may become necessary to design or develop your own.

Page 22

If this does become the case, keep these hints in mind:

1. Because of the uniqueness of this handbook, the information provided can be used in almost any literacy evaluation project, usually with very few alterations.

2. A good practice is to include the ideas, suggestions and reflections of staff, sponsors and students to help with tool development.

3. Don't be afraid to utilize the mass of literature on literacy research and evaluation procedures published by experts in the field.

4. As the primary evaluator, remember that your opinions are valuable. Follow basic guidelines, but use common sense and experience because you must feel comfortable with what you are doing.

Before you go to the next step, please pilot-test the instrument for validity and reliability. Contact a literacy provider in your group or area by sending an announcement letter and follow up with a phone call. Explain that he or she was chosen to pilot-test the instrument, and then send a sample of the questionnaire, requesting a speedy return. Be sure to ask some open-ended questions as well.

Summary: Selecting Evaluation Instruments
(Step Two)

1. Brainstorm to determine if the study will be qualitative or quantitative in nature.

2. Seek a standardized instrument that could best evaluate the program's objectives.

3. Complete the standard instrument yourself to make sure that it is time-sensitive and user-friendly.

4. Be prepared to make adjustments to the standard form or to design a separate form altogether.

5. Design and send out a letter to a selected literacy provider asking for assistance to pilot-test the instrument.

6. Upon return of the pre-test instrument, review the findings and share responses with stakeholders or sponsors.

7. Make any necessary adjustments to the instrument or to procedures.

Page 23

STEP THREE: Gathering Relevant Data

The third step in an evaluation project is gathering relevant data. This process entails how one goes about collecting the information surveyed. An evaluation is only as good as the information gathered. A common weakness in many evaluation projects is the focus of too much attention on too many areas by evaluators with too little time and limited information. Don't make this mistake! Your results are likely to be invalid. The process of gathering and collecting the most useful and relevant data becomes critical.

This step serves as a conceptual menu that directs the evaluator toward specific categories of relevant data to enhance the collection and analysis process. In this process, it is a good idea to compile a listing of all potential literacy providers you want to survey. Before you send out your questionnaire, you must first send out a short, direct cover letter. Some main points to be addressed are: the name of the organization conducting the study; the reason for the study; some very brief instructions; a return address and a deadline date; and a thank-you in advance for participating. (A sample of the ALERT cover letter is provided in the Appendix.)
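The main points listed above can be kept consistent across mailings by generating each letter from a simple template. The sketch below is illustrative only; it is not the ALERT cover letter (which appears in the Appendix), and every name, address and date in it is a hypothetical placeholder.

# Illustrative sketch only -- not the ALERT cover letter (see the Appendix).
# A template keeps the points listed above (organization, reason for the
# study, brief instructions, return address, deadline, advance thanks)
# consistent for every provider contacted. All values are hypothetical.
COVER_LETTER = """\
Dear {contact_name},

{organization} is conducting a study of volunteer adult literacy programs.
{reason}

Please complete the enclosed questionnaire and return it to
{return_address} by {deadline}.

Thank you in advance for participating.

Sincerely,
{sender}
"""

letter = COVER_LETTER.format(
    contact_name="Program Director",
    organization="Example Literacy Coalition",
    reason="Your feedback will help refine a self-evaluation instrument for tutors and students.",
    return_address="123 Example St., Detroit, MI",
    deadline="May 1",
    sender="Project Evaluation Team",
)
print(letter)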

Once you have allowed enough time for respondents to receive the cover letter, it is a good idea to make a follow-up phone call to ensure that the letter was received and to answer any questions the recipient might have.

Now you are ready to send out your questionnaires. A questionnaire is just one of many methods of gathering information, but most experts agree that it is probably the best method. The questionnaire survey is frequently used as a polling method to gather facts. It is very important that you send the questionnaire out in a timely fashion, allowing program facilitators adequate opportunity to respond by your return date.

A good evaluator will always respect privacy, confidentiality, and the rights of the individual being surveyed. To assure respondents that your study is ethical and will be conducted with sensitivity and integrity, explain the procedure to tutors and students during interviews, or with program facilitators if interviewing will not be done. Always ask for permission to use data that might be considered confidential. Remember that evaluations can be extremely provocative and even threatening. If a potential respondent feels threatened, he/she might not respond with honesty, or might not respond at all.


Be sure that the directions request a realistic return date, and that it is clear and easy to find in the package. Note also that rarely are 100% of the questionnaires distributed returned. In fact, in most cases, only about 60% of potential respondents respond after the first contact. To assure that a greater percentage of samples eventually will be returned, be sure to:

allow additional time for late responses

prepare to visit some sites

send out a second cover letter

make additional follow-up calls

When a follow-up plan is in place, usually another 20-30% of samples are returned. Efficiency, then, is a very important element in the process. A balance between available time to collect the data, cost requirements, and coding and organizing results must be considered.
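The return rates quoted above can be turned into simple planning arithmetic when budgeting follow-up time. The short sketch below only illustrates that arithmetic; the roughly 60% first-contact and 20-30% follow-up figures are the approximations discussed in this step, and the mailing size is a hypothetical example.

# Rough planning arithmetic only: estimate how many questionnaires may come
# back, using the approximate return rates discussed above (about 60% after
# the first contact, and roughly another 20-30% of the mailing after follow-up).
def expected_returns(mailed, first_contact_rate=0.60, follow_up_rate=0.25):
    first_wave = mailed * first_contact_rate
    second_wave = mailed * follow_up_rate
    return {
        "mailed": mailed,
        "after_first_contact": round(first_wave),
        "after_follow_up": round(first_wave + second_wave),
    }

# Example: a hypothetical mailing to 100 literacy providers.
print(expected_returns(100))
# {'mailed': 100, 'after_first_contact': 60, 'after_follow_up': 85}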

There are a number of methods to gather information and collect data. Please decide early on which techniques will work best for your particular situation.

Page 24

Keep in mind these constraints: budget, schedule, and availability and reliability of resources. Reliability here represents the elements involved in the judgment by sponsors in such areas as the items being evaluated, the examiners, the examinees, and the accuracy of outcomes.

Patterns of a reliable study can take on three different dimensions: objective, direct, and structural. The objective approach involves an implicit frame of reference where the concepts of accuracy and error are both considered. In the direct approach, respondents and data collectors agree on the method and purpose of the collection process. In a structural framework, however, specific kinds of responses are determined relevant.

Summary: Gathering Relevant Data
(Step Three)

1. Compile a list of all literacy providers to be evaluated.

2. Send each provider a cover letter announcing who you are and your intentions.

& Make follow-up phone calls to assure that the letter was received, and answer questions.

4. Send out the evaluation instrument or start interviewing respondents.

5. Have a follow-up plan in place to assure that the maximum number of responses are returned and collected.

6. Immediately codify or organize responses as they are returned.

7. After the deadline date has expired, prepare to send all data to the analysis expert.

Page 25

STEP FOUR: Soliciting Support and Feedback Data

Once the collection of data and data analysis process are complete, fine-tuning of efforts is needed to test the variations among diverse target populations. For example, a standardized instrument that worked fine for evaluating an outreach literacy center may have little relevance, if any, when evaluating prison populations or workplace literacy programs.

In other words, because of differences in target groups in literacy programs, a second revision of the instrument may prove necessary. This is often not required until the primary evaluation process has been completed. The second revision assures stakeholders that a fair assessment of these groups can be implemented, and that, in so doing, evaluation services delivered to the general literacy population are equal to services delivered to the special targeted population.

Remember, however, that no evaluation tool can cover all of the assessment and measurement needs of so many different literacy providers. In many cases, once a feedback questionnaire is circulated among some of the targeted populations, it can provide substantive data that can add to the validity of a survey without greatly complicating procedures.

There are several techniques that one can use to gain greater feedback from the targeted groups. However, a good evaluator will narrow the choices in order to stay on pace with the team's objectives. The bases for implementing such changes usually fall on the need to provide more impressionistic evidence, because certain literacy providers or facilitators will be dissatisfied with the way efforts were conducted in relation to their particular program. A feedback approach is less threatening and more inviting.

The following elements should be considered before attempting the support and feedback process:

How to identify where target problems and populations exist.

What types of procedures or services to provide to target groups.

How to assess the qualifications and competencies of staff.

How to encourage cooperation from the target program's staff and students.

How to ensure strict confidentiality in order to maintain reliability and validity.

How to use new evidence.

Page 26

Summary: Soliciting Support and Feedback Data
(Step Four)

1. Based on the responses to the initial survey, determine which programs can add greater validity to the study if questions are more generally designed to address their needs. Also, identify programs that did not respond or were omitted in the first study because they serviced a special group.

2. Design and send out another cover letter along with a copy of the first questionnaire to these groups, asking them for their input.

3. As soon as responses are returned, outline and collate the feedback data where appropriate.

4. Eliminate unnecessary or non-useful feedback, and determine which responses could enhance the process without complicating procedures in great detail.

5. Use the relevant feedback data to revise the original questionnaire.

Page 27

STEP FIVE: Processing, Analyzing, and Summarizing Data

The data analysis, or interpretation of results, is vital to the evaluation process. A well-developed analysis leads directly to the formulation of conclusions that can impact decision-making practices. Accurate statistical interpretation, therefore, plays a major role in the analysis process as well. The statistics in evaluative research involve the process of describing, synthesizing and interpreting the quantitative data. There are a number of ways to analyze data. Determining which techniques will work for your particular study greatly depends on such factors as type of evaluation, evaluation questions, and design of the evaluation instrument. To avoid conflicts in interpretation of the data, arrange meetings with the statistical analyst.

The procedure that you select for analyzing the data will differ, depending on whether the information collected is quantitative or qualitative. In general, a quantitative analysis measures the extent to which data are reported numerically, while qualitative data are, in most cases, gathered through open-ended answers and are value-based (see the glossary for a better definition of these terms).

A good analyst will, upon completion of collecting all data, select and organize the information according to the evaluation questions. Also, if appropriate, he/she will collate into specific categories the data obtained from each source or method. For example, all the data might be coded to reflect differences in gender and age between tutors and students.

If the members of your evaluation staff are not qualified to prepare the analysis, then it is better to hire someone who knows firsthand how to handle the process. If this is not possible, be sure that the evaluator selected for the job is well-trained and has the time and patience to do the job.

The analysis should be constructed and presented in a way that addresses the project's evaluation objectives. The ALERT project, for example, was quantitative in design and formative in nature; therefore, all statistical evidence was taken and collated to measure the quality of the ongoing programming and to determine the significance of the outcomes provided. In order to achieve these ends, it was necessary that the data given in this stage be presented simply, accurately, scientifically, and clearly. It is also a good idea to identify all themes that might be suggested by the findings. Also, wherever possible, provide charts, tables, and graphs to express data from different lists or counts in order to aid sponsors and stakeholders in understanding and interpreting the information. Review the bibliography of this handbook for more resource assistance.
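For readers who want to see what the quantitative side of such an analysis can look like in practice, the sketch below runs a chi-square test of independence on coded feedback counts, in the same spirit as the chi-square statistics reported in Tables 6 through 10. It is illustrative only: the counts are made-up placeholders, not ALERT data, and it assumes the scipy library is available.

# Illustrative only: a chi-square test of independence on coded feedback
# counts, similar in spirit to the statistics in Tables 6-10. The counts
# below are made-up placeholders, not ALERT data.
from scipy.stats import chi2_contingency

# Rows: respondent gender; columns: rating of questionnaire length
# ("too long", "about right", "too short"). Hypothetical counts.
observed = [
    [12, 30, 5],   # female respondents
    [8, 22, 4],    # male respondents
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")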

Page 28

Summary: Processing, Analyzing, and Summarizing Data
(Step Five)

1. Select a well-trained staff member for the task, or hire an analysis expert.

2. Select and organize the data according to the evaluation questions and the types of organizations responding.

3. Wherever needed, make sure the information is collated into specific sub-groups.

4. Select the educational analysis computer program that best meets your needs.

5. Make sure that once the analysis process is completed, the information is clear and easy to interpret.

6. Provide charts, tables, and/or graphs to help validate and express the findings to stakeholders or other sponsors.

7. Prepare the final analysis so it can be incorporated into the final report.

Page 29

STEP SIX: The Final Report

The last step in an adult literacy evaluation is the final report. The writing of the report should remain as objective as possible. The report should not contain subjective opinions, remarks, or statements. There is also no place for emotionalism in the report. As much as possible, stay away from personal pronouns such as "I," "my," and "we." For example, instead of saying, "We randomly selected samples," the correct expression would be, "The sample was randomly selected."

The report should include information or a summary from each step of the project, but it should be written in a straightforward style containing clear and simple language. Make sure that the information presented in the final report is consistent with the facts illustrated in the study and with the tables, charts, and graphs representing the findings. Additional comments should be made (as briefly as possible) for every instrument or research tool that may have been used in the study.

A section describing the methods used in the study is always an excellent addition to any evaluation report. When describing the methodology, include in your statement comments concerning the targeted populations, the measuring instrument used, pre-plan statements, the actual evaluation design, specific procedures followed, and any limitations, biases, or problems that might have occurred.

Stakeholders or sponsors should be made aware of all findings and recommendations before results are published. Where applicable, their suggestions and interpretations of the findings should be included in the summary. Remember, their involvement can add credence to the study and will provide them a sense of teamwork as members of the research group.

Finally, be sure to detail all relevant recommendations and conclusions. The recommendations in any evaluation report take the form of suggestions concerning ways to improve the program or the evaluation project itself.


Summary: The Final Report (Step Six)

1. Review all data from every step in the process and summarize the findings.

2. Share the findings and your recommendations with stakeholders or sponsors.

3. Use a computer (or other word processor) to type all the data.

4. Allow at least two people to proofread the final report before it is published.

5. Make sure that the report contains all these components:

A. An introduction, mapping goals and mission of the project

B. A methodology section describing the questionnaire process and the targeted population

C. Data analysis of results (e.g., graphs, tables)

D. All recommendations, conclusions and summaries.


Table 3
Evaluation Methods Chart

GROUP INTERVIEWS
Advantages:
- Stimulates thinking and sharing of ideas and experiences.
- Provides a greater range for diversity pertaining to the subject.
- Provides a consensus about the program.
Disadvantages:
- Limits confidentiality.
- Is more difficult to organize.
- Can be considered a threatening environment.
- May require a more skilled or particularly trained interviewer.

INDIVIDUAL INTERVIEWS
Advantages:
- More likely to allow for sincere responses.
- May make participants more willing to disclose personal information.
- Allows interviewers to control how and when questions are asked.
Disadvantages:
- Usually is more time-consuming.
- Is generally more difficult to analyze.
- Will require a skilled interviewer.
- Participants can sometimes feel threatened.

QUESTIONNAIRE SURVEYS
Advantages:
- Large amounts of data can be collected in short periods of time.
- Is usually less expensive than other forms of surveying.
- Is better for maintaining confidentiality.
- Makes the task of analyzing and summarizing data less difficult.
Disadvantages:
- Data are restricted by the way certain questions are asked.
- Greater planning time is needed.
- Return rates are usually slower coming in.

QUANTITATIVE EVALUATION
Advantages:
- Allows for more samples, which adds more credence to the survey.
- Deals with specific numbers.
- Provides better information if results are presented in table or graph form.
- Data are usually easier to collect, and collection is done in shorter periods of time.
- Is generally easier to summarize and analyze data.
Disadvantages:
- Does not allow participants to respond objectively.
- Requires a skilled quantitative analysis expert.
- Provides greater limitations in the amount of information that can be collected and analyzed.
- Can be seen as more threatening.

QUALITATIVE EVALUATION
Advantages:
- Allows for participants to respond objectively and freely.
- Allows for open-ended responses, which add direct credibility to the study.
- Interviewers can get different personal views on the same subject.
- Can control when and how questions are asked.
- Allows interviewers to gain a better interpretation of the way respondents actually feel.
Disadvantages:
- Substantial planning time is required.
- Return rates are usually very low.
- Participants may not be directly familiar with the program.
- Is more difficult to analyze.
- Is limited in confidentiality.
- May be difficult to organize.
- Usually reflects quality outcomes only.

JOURNALS/LOGS
Advantages:
- Offers immediate reaction to events or activities.
- Requires little effort to collect data.
- Is less expensive to administer.
- Offers records of change over time.
Disadvantages:
- Data are unreliable and more subjective.
- Are more difficult to analyze.
- More training may be required in order to record information.


BIBLIOGRAPHY

Bean, R. M., Partanen, J., Wright, F., & Aaronson, J. (1989). Attrition in Urban Basic Literacy Programs and Strategies to Increase Retention. Adult Literacy and Basic Education, 13(3), 146-154.

Bloom, B., et al. (1971). Formative and Summative Evaluation of Student Learning. New York: McGraw-Hill.

Brookfield, S. D. (1991). Understanding and Facilitating Adult Learning. San Francisco: Jossey-Bass.

Dirkx, J. M., & Jha, L. R. (1994). Completion and Attrition in Adult Basic Education: A Test of Two Pragmatic Prediction Models. Adult Education Quarterly, 45(1), 269-285.

Farra, H. E. (1988). Demographic, Economic and Educational Characteristics of Adults Who Responded to the Adult Literacy Media Campaign: An Urban Profile. Unpublished doctoral thesis, University of Pittsburgh.

Fingeret, A. (1989). The Social and Historical Context of Participatory Literacy Education. New Directions for Continuing Education, 42. San Francisco: Jossey-Bass.

Freire, P. (1987). The Importance of the Act of Reading. Journal of Education, 165, 5-11.

Gay, L. R. (1979). Educational Evaluation & Measurement: Competencies for Analysis and Application. Columbus, OH: C. Merrill.

Guba, E. G. (1978). Toward a Methodology of Naturalistic Inquiry in Educational Evaluation. CSE Monograph Series in Evaluation, No. 8. Los Angeles: Center for the Study of Evaluation, University of California.

Hunter, C. S. J., & Harman, D. (1979). Adult Literacy in the United States. New York: McGraw-Hill.

Kirkpatrick, D. L. (1967). Evaluation in Training. In R. Craig & L. Bittel (Eds.), Training and Development Handbook. New York: McGraw-Hill.

Lerche, R. S. (1985). Effective Adult Literacy Programs: A Practitioner's Guide. New York: Cambridge.

Ntiri, D. W. (1994). Evaluation in Adult Literacy Research: Project ALERT. Detroit, MI: Detroit Literacy Coalition.

Payne, D. A. (1994). Designing Educational Project and Program Evaluations: A Practical Overview Based on Research and Experience. Norwell, MA: Kluwer Academic Publishing Group.

Rossi, P. H., & Freeman, H. E. (1982). Evaluation: A Systematic Approach. Beverly Hills, CA: Sage.

Scriven, M. (1972). Pros and Cons about Goal-Free Evaluation. Evaluation Comment, 3(4), 1-4.

Scriven, M. (1981). Evaluation Thesaurus. Pt. Reyes, CA: Edgepress Inverness.

Stecher, B. M., & Davis, W. A. (1987). How to Focus an Evaluation. Newbury Park, CA: Sage.

Steel, S. M. (1977). Contemporary Approaches to Program Evaluation. Washington, DC: Capital.

True, J. A. (1989). Finding Out: Conducting and Evaluating Social Research. Belmont, CA: Wadsworth.

Tyler, R. W. Basic Principles of Curriculum and Instruction. Chicago: University of Chicago Press.

UNESCO. (1990). Evaluating a Literacy Training Program. Pacific, Thailand: UNESCO.

Worthen, B. R., & Sanders, J. R. (1973). Educational Evaluation: Theory and Practice. Worthington, OH: Charles A. Jones.


FEEDBACK ON EVALUATION QUESTIONNAIRE
PART I

METHODOLOGY

This second phase of Project ALERT set out to collect reactions to the evaluation model (see Appendix). The prototype developed in the first phase was created through interviews with administrators, staff and students at primary literacy providers (Detroit Public Schools, Adult Education Division; Literacy Volunteers of America; Dominican Literacy Center) to ensure that the critical priority areas of volunteer literacy program instruction and administration were covered, that is, attendance, personal goals, counseling, and motivation.

To maximize the share of reactions to the surveys, an instrument called a feedback form was designed (see Appendix). The first page of the feedback instrument collected data on both the program and the respondent. It was categorized into the following:

area setting of the program

years in adult literacy

level of education

state teacher certification

gender

ethnic/racial composition

The second page solicited information about the student and tutor evaluation survey questionnaires with regard to length and structure of the instrument, appropriateness of questions, and clarity and applicability of the questionnaire to the respective program populations.

Instrumentation

Evaluation in Adult Literacy Research: Project ALERT - Phase II deals with three separate survey instruments, which have been discussed:

1. A Self-Evaluation Program Tutor Survey Questionnaire

2. A Self-Evaluation Program Student Survey Questionnaire

3. A Feedback Form

Population

A set of all three instruments mentioned under Instrumentation, accompanied by a self-addressed envelope, was sent to 221 (N=221) literacy program participants throughout Michigan before November 14, 1995. Listings were provided by the Michigan Literacy, Inc. directory (134 adult literacy organizations) and the State Action Plan Team (25 Michigan student support groups). Respondents were asked to study the survey questionnaires and answer the questions on the feedback instrument, with all their comments, no later than December 31, 1994. A total of 64 subjects (30%) returned the survey.

Revised Student and Tutor Self-Evaluation Program Questionnaire

The responses received allowed for the creation of revised student and tutor program survey questionnaires, each of which was divided into three parts:

Part I Identification

Part II Program Information


Part III Program Issues

All three parts of the revised evaluation models can be given in segments. Part I can be given when participants first enter the program. Part II is designed to be answered after participants have been in the program for a minimum of two instructional sessions. Part III was developed for participants who have had a minimum of four instructional sessions.

It is advisable that the tutor's revised evaluation be completed either at one time or in segments. However, the student's revised evaluation model should preferably be completed in segments over a short period of time, because of reading and attention levels, and to minimize anxiety and frustration.

Feedback Comments

The feedback comments are extensive and reflect common concerns of adult educators in the field about designing a model evaluation instrument. They have been used to reshape the outlook of the original evaluation model. Four areas of concern are noted:

tutee/student readability

word usage

relevance and applicability of questions in the tutor questionnaire to volunteer literacy programs

questions for tutors on tutees' thinking patterns and perceptions

Tutee Readability/Word Usage

The evidence shows that tutees/students had difficulty comprehending the instruments. A significant number of them thought that the student evaluation form was not "new reader friendly" and that, as a result, comprehension was limited, the length was frightening, and the language was difficult. For example, some of the tutors' comments showed preferences for the use of certain words:

Use the word "answer" instead of "response"

Use "helped to learn" instead of "gave youinformation"

Use "signing up" instead of "enrollment"

Use the word "tutor" instead of "instructor"

The tutors shared that tutees would be unable to complete the questionnaire without assistance. For one responding tutor, the tutees she was in contact with found the form "scary" and "hard to read," and "it made them nervous." She continued that one of the students, reading the word "reservation," meaning restraint, confused this term with an Indian reservation in the next county. Equally, words like "purpose," "response," "sources," and "referral" are at a higher reading level and thus cannot be fathomed by new readers.

On the issue of word usage in the questionnaire, there were concerns related to ambiguity and difficulty levels. The commentaries pointed to difficulties in distinguishing differences in words and shades of meaning.

What is the difference between "to a great extent" and "to a good extent"?

What is the difference between "just right" and "challenging"?

Words like "purpose" and "response" are complex words for real beginners. Yes, the tutor becomes the reader for the student, depending on the level of difficulty.

What about a category for "don't know"?

Tutors' Speaking for Tutees

The tutors were quite candid and forthright about the shortcomings of the instruments. Though fully cognizant of the limitations of the tutees/adult learners with limited reading and writing skills, they (the tutors) were opposed to being asked about what the tutees/students think or what the tutees/students perceive about their learning processes. The reasoning behind their opposition touched on factors that relate to validating perceptions of others who are less fortunate. Further, they complained about the instrument's failure to address tutor training needs and the volunteer nature of employment for many of the tutors. Some added that some questions bordered on irrelevance and inappropriateness. The word "center" was used to refer to literacy programs, but in essence not all literacy programs see themselves as, or call themselves, centers. The reactions that follow articulate these points more directly and restate some of the questions as they should have been presented:

The instrument asks nothing about support or training needs for the tutors

I would personally like to see more questions for the tutor regarding what kind of feedback they would like from the organization that they do not get. I would like to see more questions geared toward improving retention and making it a positive experience for the tutor

Some of these questions have nothing to do with tutor satisfaction or problems/concerns. It seems to be more of a validation of the student questionnaire

How about these types of questions instead of the ones given:

o average number of times student cancels/misses a session

o average number of times tutor cancels/misses a session

o average number of sessions scheduled during the month

On the question of whether the tutee recommends the program to others, or about the perceptions of tutees by tutors (questions 9 and 13), many expressed frustration and resentment. They noted:

many of the questions (9, 12, 13, 14, 16, 17, 18) do not ask tutors what they think but rather ask the tutors what the students think or what the students perceive. Is it fair, on a formal questionnaire, to ask any person what another person thinks?

our tutors cannot generalize about students in the program because they know only their own student well.

some of the questions (9, 10, 11, 12, 14, 17, 18) are written in the plural form. Since most tutors in the state are trained to work with only one student at a time, they might feel this is a questionnaire for teachers, not tutors.

some questions, such as opinion of student participants, cannot be answered collectively. What is the purpose of asking questions relating to students without asking them? That's difficult and impractical.

On the question, "Please indicate the extent to which 'your' student expectations are being met in the achievement of their goal," one respondent would rather have had this question asked:

What is the extent to which program/staff are meeting the tutor's expectation/need for support?

The instrument elicited responses that exposed deep-seated limitations within the adult literacy volunteer sector. The sentiments are strong on the part of the tutors about expanding tutor training and enhancing classroom techniques to upgrade results. One of the main objectives of evaluation is to generate concrete data that will improve services and outcomes. Questions that surface in this inquiry are: Did we learn from this survey? How will the data be used to effect outcomes? What other questions are outstanding that need to be included? Should more time have been spent on asking respondents about instructional packaging and delivery in the classroom, or classroom interactions between tutor and tutee? Should there have been a focus on the impact of technology on the adult education classroom? How about support services and their links with student performance?

As mentioned in the beginning, there is nothing like a "standard" evaluation that can be uniformly applied to all programs. The intent of this field exercise was to send out sensors, get results, and interpret them as part of a mechanism toward program improvement.

References

Padak, N. D., & Padak, G. M. What works: Adult literacy program evaluation. Journal of Reading, 34(5), 374-379.

Michigan Literacy, Inc. (1994). Directory of Michigan Literacy Organizations. Lansing, Michigan.


FEEDBACK ON EVALUATION QUESTIONNAIRE
PART II

PROCESS

Survey Instrument Development

A survey instrument (see Appendix) was developed for this project. The instrument consisted of two major parts: respondents' demographics and their feedback on the evaluation questionnaires. There were six questions in the first part, focusing on agency, personnel background, and the characteristics of their literacy programs. The second part of the survey was divided into two subcategories: feedback on the tutor survey evaluation questionnaire and feedback on the student survey evaluation questionnaire. There were four questions in each subcategory, focusing on the quality of the questionnaires (see Appendix for details).

Data Analysis

The respondents' backgrounds and their feedback on the questionnaires were analyzed first using descriptive statistics. Group differences in feedback were further examined using chi-square tests. More specifically, the statistical differences in the feedback on both questionnaires by kind of agency/institution (rural, urban, other), education (less than college, college, graduate), years in adult literacy (5 or fewer years, 6-10 years, more than 10 years), Michigan certification (elementary, secondary, other, none), and gender (male, female) were examined using chi-square tests, and the type I error of the tests was set at 0.05.
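As a rough illustration of the kind of chi-square test described here (the counts, labels, and grouping below are hypothetical, not the ALERT data), a contingency table of feedback category by agency setting can be tested at the 0.05 level as follows:

    from scipy.stats import chi2_contingency

    # Hypothetical contingency table: rows are agency settings, columns are
    # feedback on questionnaire length ("too long", "just right", "too short").
    observed = [
        [7, 15, 2],   # rural
        [6, 16, 1],   # urban
        [3,  9, 1],   # suburban/other
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.2f}")

    # Using the study's type I error rate of 0.05 as the decision threshold.
    if p < 0.05:
        print("Feedback differs significantly across agency settings.")
    else:
        print("No statistically significant difference across agency settings.")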

Results and Discussion (Findings)

A total of 64 subjects returned the survey. The majority of the respondents were female (87.1%), highly educated (85.3%), experienced in adult literacy (six or more years, 72.7%), and Michigan state certified (90.2%). Most of the agencies/institutions were either rural (40.7%) or urban (39.0%), and they served a variety of racial/ethnic groups. Detailed descriptions of respondent demographics and agency/institution characteristics are displayed in Table 4 and Table 5, respectively.


Table 4
Demographics of Respondents

Variable                        Number      %

Gender
  Male                                     12.9
  Female                                   87.1

Highest Degree Earned
  High School                      3        4.9
  Associate's                      5        8.2
  Bachelor's                      23       37.7
  Master's                        25       41.0
  Doctorate                        2        3.3
  Other                            3        4.9

Years in Adult Literacy
  Less than 2                      3        5.1
  2 - 5                           13       22.0
  6 - 10                          26       44.1
  11 or more                      17       28.8

MI State Certification
  Elementary                      10       19.6
  Secondary                       17       33.3
  Vocational                       2        3.9
  Combined                         9       17.7
  Other                            5        9.8


Table 5
Characteristics of Respondents' Agencies/Institutions

Characteristic                      Number       %

Kind of Agency/Institution
  Rural                               24        40.7
  Urban                               23        39.0
  Suburban                             5         8.5
  Other                                7        11.9

Racial/Ethnic Composition           Average Approximate Percentage
  African American                              24.61
  Asian/Pacific                                  1.53
  Caucasian                                     65.81
  Hispanic                                       4.23
  Native American                                2.07
  Arab/Other                                     1.75


TUTOR/STUDENT QUESTIONNAIRE FEEDBACK

TUTOR EVALUATION

Overall, the feedback on the tutor evaluation questionnaire was very positive. The majority of respondents thought that the length of the questionnaire was "just right" (65.0%). Many of them also thought that the appropriateness of the questions was "just right" (42.3%). Regarding clarity of the evaluation instrument, most of them thought it was "just right" (51.9%). Finally, many of them (39.6%) believed that the questionnaire was applicable to their program(s).

Still, the respondents pointed out that the evaluation questionnaire could be improved. A number of respondents believed that the instrument was "too long" (28.0%), the questions were "not appropriate" (7.7%), the instrument was "not clear" (11.1%), or the questionnaire was "not applicable" (8.3%). The detailed feedback on the tutor evaluation questionnaire is illustrated question by question in Figures 1, 2, 3, and 4, respectively.

STUDENT EVALUATION

Similar feedback was found on the student evaluation questionnaire. Again, the overall feedback was positive. The majority of students believed that the length of the questionnaire was "just right" (65.8%), the questions were appropriate (52.6%), the questionnaire was clear (56.8%), and the questionnaire was applicable (52.6%).

A small proportion of students also believed that the student evaluation questionnaire could be improved, and thought that the instrument was "too long" (34.2%), the questions were "not appropriate" (26.3%), the instrument was "not clear" (10.8%), or the questionnaire was "not applicable" (5.3%). The detailed feedback on the student evaluation questionnaire is illustrated question by question in Figures 5, 6, 7, and 8, respectively.

[Figure 1. Feedback on Tutor Questionnaire: Length of the Instrument (pie chart; too short 6.0%, too long 28.0%)]

[Figure 2. Feedback on Tutor Questionnaire: Appropriateness of Questions (just right 42.3%, not appropriate 7.7%, to be improved 50.0%)]

[Figure 3. Feedback on Tutor Questionnaire: Clarity of the Instrument (just right 51.9%, not clear 11.1%, to be improved 37.0%)]

[Figure 4. Feedback on Tutor Questionnaire: Applicability of the Instrument (just right 39.6%, not applicable 8.3%, to be improved 52.1%)]

[Figure 5. Feedback on Student Questionnaire: Length of the Instrument (categories: just right, too long)]

[Figure 6. Feedback on Student Questionnaire: Appropriateness of Questions (just right, too difficult 26.3%, to be improved)]

[Figure 7. Feedback on Student Questionnaire: Clarity of the Instrument (just right 56.8%, not clear 10.8%, to be improved 32.4%)]

[Figure 8. Feedback on Student Questionnaire: Applicability of the Instrument (categories: just right, not applicable, to be improved)]


No statistical differences were found among groups in terms of their feedback on both evaluation questionnaires. Chi-square statistics by group are summarized in Table 6 (kind of agency/institution), Table 7 (educational background), Table 8 (years in adult literacy), Table 9 (Michigan certification), and Table 10 (gender), respectively.
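For reference, the statistic reported in the tables that follow is the standard Pearson chi-square for a contingency table (a general formula, not specific to the ALERT data):

\[
\chi^2 = \sum_{i}\sum_{j}\frac{(O_{ij}-E_{ij})^2}{E_{ij}},
\qquad
E_{ij} = \frac{(\text{row } i \text{ total})\times(\text{column } j \text{ total})}{N},
\]

where O and E are the observed and expected cell counts and N is the total number of respondents; the null hypothesis of no association is rejected when the p-value for the chi-square statistic at the stated degrees of freedom falls below the 0.05 type I error rate used in this study.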

Table 6
Chi-Square Statistics of Feedback of Respondents by Kind of Agency/Institution

Feedback                  Chi-Square Value    df     p

Tutor Questionnaire
  Length                        3.07           4    .55
  Appropriateness               5.69           5    .22
  Clarity                       6.26           4    .18
  Applicability                 5.84           4    .21

Student Questionnaire
  Length                        2.31           4    .32
  Appropriateness               3.14           4    .54
  Clarity                       2.16           4    .71
  Applicability                 3.05           5    .55


Table 7
Chi-Square Statistics of Feedback of Respondents by Educational Background

Feedback                  Chi-Square Value    df     p

Tutor Questionnaire
  Length                        5.37           6    .60
  Appropriateness               3.32           6    .77
  Clarity                       2.60           6    .86
  Applicability                 4.69           6    .58

Student Questionnaire
  Length                        2.06           2    .36
  Appropriateness               3.98           4    .41
  Clarity                       4.20           4    .38
  Applicability                 4.41           4    .39

Table 8
Chi-Square Statistics of Feedback of Respondents by Years in Adult Literacy

Feedback                  Chi-Square Value    df     p

Tutor Questionnaire
  Length                        0.82           4    .94
  Appropriateness               2.50           4    .64
  Clarity                       4.93           4    .29
  Applicability                 4.41           4    .35

Student Questionnaire
  Length                        0.78           2    .68
  Appropriateness               3.55           4    .47
  Clarity                       3.57           4    .47
  Applicability                 3.97           4    .41


Table 9
Chi-Square Statistics of Feedback of Respondents by Michigan Certification

Feedback                  Chi-Square Value    df     p

Tutor Questionnaire
  Length                        7.18           6    .30
  Appropriateness               8.14           6    .23
  Clarity                       3.32           6    .77
  Applicability                 3.66           6    .72

Student Questionnaire
  Length                        1.74           3    .63
  Appropriateness               3.74           6    .71
  Clarity                       4.93           6    .55
  Applicability                 4.42           6    .62

Table 10
Chi-Square Statistics of Feedback of Respondents by Gender

Feedback                  Chi-Square Value    df     p

Tutor Questionnaire
  Length                        0.68           2    .71
  Appropriateness               1.02           2    .60
  Clarity                       1.44           2    .49
  Applicability                 0.98           2    .61

Student Questionnaire
  Length                        0.11           1    .73
  Appropriateness               0.26           2    .88
  Clarity                       0.75           2    .69
  Applicability                 0.76           2    .69


Conclusion

Overall, the respondents' feedback on both questionnaires was positive. A small proportion of respondents believed that the questionnaires could be further improved. The respondents' background and the characteristics of their agency had no impact on their feedback.

REFERENCE

Ntiri, Daphne W. (1994). Evaluation in Adult Literacy Research: Project ALERT. Detroit, MI: Detroit Literacy Coalition, WSU.


APPENDIX A

Letter to Michigan Literacy Providers


December 7, 1994

Dear Michigan Literacy Provider:

Re: Proposed Survey instruments for Statewide Application in Adult Literacy Programs

The Detroit Literacy Coalition (DLC) has just completed phase one of a two-part series evaluation model plan to address adult literacy program evaluation needs. The objective is to create a user-friendly evaluation model that can be applied statewide to volunteer literacy programs.

At this time, we would like to invite your participation in this process of improving the efficiency of our model by completing the feedback questionnaires enclosed. The data you provide will allow us to refine our instruments and develop effective, impartial and user-friendly tools that can be applied to diverse populations around the state. We recognize, however, that differences will exist because of the critical areas and concerns of so many different adult literacy programs affecting disadvantaged social and economic groups such as dislocated workers, school drop-outs, drug rehabilitation clients, illiterate immigrants, teenage mothers and prison populations in urban and rural contexts.

This is a great opportunity for you to participate in the creation of an effective tool. Remember that "evaluation is not to prove but to improve." Your cooperation and participation are greatly appreciated.

Completed feedback questionnaires should be forwarded no later than December 30, 1994 to:

Detroit Literacy Coalition
Wayne State University
6001 Cass Ave., Room 451
Detroit, Michigan 48202
Tel: (313) 577-4612 (Crystal) or (313) 577-3923

Sincerely,

Daphne W. Ntiri, Ph.D.
Associate Prof. & Director, DLC
Director, Project ALERT

cc: Project ALERT Team Members
Gladys Coleman Walden
Gregory Robinson


APPENDIX B

Glossary


Glossary

Adults: Those people who have attained the legal and chronological status of adulthood, and are purposefully exploring the field of knowledge. In addition, these fields of knowledge are taking place in an organized group setting (Brookfield, 1991, p. 2).

ALERT: An acronym that stands for Adult Literacy Evaluation and Research Team; the group that researched, analyzed and conducted this effort.

Analysis: Careful examination of a whole and its parts, including the relationships among them.

Assessment: A type of evaluation that is supposed to be based on a system of value-free checking or ascertainment of the extent to which the worthwhileness, goals and objectives of a program are being met. A non-judgmental check to determine whether or not certain goals have been attained.

DLC: The Detroit Literacy Coalition, a community-based organization that meets the literacy needs of at-risk Detroiters through the collaborative networking of its 15 member local agencies.

Evaluation: In general, research done to assess the worth of a program; also, the process of determining the merit or value of something, or the product of that process.

Feedback data: Information solicited from literacy providers after the primary survey is completed, in order to determine whether evaluation efforts have addressed the most critical concerns, and to identify ways in which improvements can be made.

Formative evaluation: An evaluation usually performed as ongoing programming is being conducted and/or during the developmental or improvement phase of the program. This process is designed to address operational procedures, and can be conducted either by the in-house staff or by an external evaluator.

Illiteracy: An inability to read, write, and comprehend short, simple sentences; however, to a greater extent this lack of basic skills may lead to an individual's inability to accomplish the kinds of basic and pervasive tasks necessary for everyday adult living (Lerche, 1985, p. 1).

Literacy: An individual's ability to read, write, speak, compute and solve problems at levels of proficiency necessary to function on the job and in society.

Measurement: (as it relates to literacy) The process of determining the extent of related factors, by comparison, estimation or judgment, and by reducing the data to numbers that indicate strength or frequency of occurrence.

Pilot-test: The procedures involved in pre-testing an item or tool in order to determine its effectiveness.


Qualitative evaluation: An evaluation that is more concerned with the quality of the program, and with surveying information that cannot be measured (e.g., feelings, values, standards, etc.).

Quantitative evaluation: An evaluation concerned with amounts; the process of assessing information about things that can be measured exactly and objectively.

Stakeholders: Organized groups, individuals, or other parties that have an interest politically or professionally, and that usually work for or reside within the geographical limits of the program.

Sponsors: The persons or organizations that initiate funds or provide the necessary resources used to conduct the evaluation.

Summative evaluation: An evaluation containing information offered only as a final or definitive judgment of the program's effectiveness and used as the basis on which funding decisions can be made; usually conducted for the benefit of external audiences or decision-makers (e.g., funding agency, historian).

Synthesis: A drawing together of related items to form a coherent whole.

TABE: The test of adult basic education that is the most widely used standardized test, and is commonly used to determine program eligibility.

Tutee: A student in an adult literacy program who is being tutored.

Tutor: Usually a volunteer; however, all references to tutor in this document apply equally to alternatively funded persons working as tutors (e.g., work-study students).

Volunteer for Literacy: Includes tutor and non-tutor, or any person who works for no pay in order to help fight against the problem of illiteracy.


APPENDIX C

Sample of Cover Letter Accompanying Questionnaire


DETROIT LITERACY COALITION
ADULT LITERACY EVALUATION RESEARCH TEAM

(Project ALERT)

Dear (Sample of cover letter accompanying questionnaire)

The Detroit Literacy Coalition (DLC) is a 501(c)3 nonprofit organization addressing the needs of literacy providers and adult learners. It is currently conducting a study on Adult Literacy Evaluation and Research in an attempt to design and develop a statewide evaluation model.

Your facility has been randomly chosen as a pilot site to pre-test the instrument and to solicit your input into the design of the instrument. We believe that "evaluation is not to prove, but to improve," so we encourage your cooperation in the development of an effective, impartial evaluation tool. We will be contacting you by phone in the next few days to follow up on this correspondence.

We thank you in advance for your cooperation and support.

Sincerely,

Gregory Robinson
ALERT Coordinator


APPENDIX D

ALERT Feedback Form


DETROIT LITERACY COALITION (DLC)

WAYNE STATE UNIVERSITY
6001 Cass Avenue - Room 451

Detroit, MI 48202

Tel: (313) 577-4612 (Crystal) (313) 964-1078 (Carla) FAX: (313) 577-8585

Adult Literacy Evaluation and Research Team (ALERT)
Feedback for Statewide Use

This feedback form has been developed so that literacy stakeholders can provide feedback on the Tutor and Student Survey Evaluation Questionnaires enclosed. We encourage you to be as candid as possible. Feel free to attach additional feedback, duplicate, and share the evaluation questionnaires and this feedback form with colleagues and students. Please return your completed feedback form/s to DLC in the enclosed self-addressed envelope by December 7, 1994.

Name Title

Literacy Agency/Institution

Address City Zip Code

Instructions: Agency personnel, please answer the following by circling the appropriate number.

1. Is your agency/institution ...1. Rural 2. Urban 3. Suburban 4. Other

2. Number of years in adult literacy...

1. less than 2   2. 2 - 5   3. 6 - 10   4. 11 or more

3. Highest degree earned...1. High School 2. Associate 3. Bachelor 4. Master 5. Doctorate

4. Do you have Michigan state...

1. Elementary Certification 2. Secondary Certification 3. Vocational Certification

4. Other

5. Racial/ethnic composition of your clientele and approximate percentage...

1. African American   2. Asian/Pacific Islander   3. Caucasian

4. Hispanic/Latin 5. Native American

6. Your gender...1. Female 2. Male


Tutor and Student Evaluation Questionnaire Feedback Forms

The following are feedback forms for the Tutor and Student Survey Evaluation Questionnaires. Instructions: Circle the appropriate number.

Tutor Survey Evaluation Questionnaire Feedback Form

1. Length of the evaluation instrument:  1. too short   2. too long   3. just right

2. Appropriateness of questions:  1. not appropriate   2. adequate but could be improved   3. just right

3. Clarity of the evaluation instrument:  1. not clear   2. adequate but could be improved   3. just right

4. Applicability of questionnaire to your program population:  1. not applicable   2. adequate but could be improved   3. just right

Comments

Student Survey Evaluation Questionnaire Feedback Form
(Personnel: please assist students as necessary)

1. Length of the evaluation instrument:  1. too short   2. too long   3. just right

2. Appropriateness of questions:  1. too difficult   2. adequate but could be improved   3. just right

3. Clarity of the evaluation instrument:  1. not clear   2. adequate but could be improved   3. just right

4. Applicability of questionnaire to your program population:  1. not applicable   2. adequate but could be improved   3. just right

Comments

We appreciate your feedback. Thank you.


APPENDIX E

Final Revised Self-Evaluation Program

Tutor and Student Survey Questionnaires


Self-Evaluation Program
Tutor Survey Questionnaire

PART I. IDENTIFICATION

Please place an X in the boxes of all that apply.

1.1 Area of program involvement
[ ] a. tutor/instructor      [ ] b. coordinator
[ ] c. administration        [ ] d. other

1.2 Nature of engagement in the program
[ ] a. full time             [ ] b. part time
[ ] c. volunteer             [ ] d. paid

1.3 Literacy tutor/instructor training
[ ] a. Laubach Literacy International (LLI)    [ ] b. Literacy Volunteers of America (LVA)
[ ] c. Elementary MI State Certification       [ ] d. Secondary MI State Certification
[ ] e. MI State Reading Endorsement            [ ] f. other

1.4 Training and/or experience in adult education
[ ] a. yes                   [ ] b. no

1.5 Length of time with the program
[ ] a. less than a year      [ ] b. 1 - 2 years
[ ] c. 3 - 5 years           [ ] d. 6 - 10 years
[ ] e. 11 or more

1.6 Highest degree earned
[ ] a. High School/GED       [ ] b. Associate
[ ] c. Bachelor              [ ] d. Master
[ ] e. Doctorate


PART II. PROGRAM INFORMATION

Please place an X in all the boxes that apply.

2.1 Your student/s represent/s the following population/s
[ ] a. rural                 [ ] b. semi-rural
[ ] c. semi-urban            [ ] d. prison/correctional
[ ] e. teenage mothers       [ ] f. illiterate immigrants
[ ] g. dislocated workers    [ ] h. drug rehabilitation
[ ] i. other

2.2 Primary racial/ethnic group you service
[ ] a. African American      [ ] b. Asian/Pacific Islander
[ ] c. Caucasian             [ ] d. Hispanic
[ ] e. Native American

2.3 Instructional session length
[ ] a. less than 60 minutes        [ ] b. 60 minutes - 90 minutes
[ ] c. 91 minutes - 120 minutes    [ ] d. more than 2 hours
[ ] e. other

2.4 Instructional session meeting day/s
[ ] a. Monday        [ ] b. Tuesday
[ ] c. Wednesday     [ ] d. Thursday
[ ] e. Friday        [ ] f. Saturday
[ ] g. Sunday

2.5 Instructional session scheduling
[ ] a. once a week         [ ] b. 2 times a week
[ ] c. 3 times a week      [ ] d. 4 - 5 times a week
[ ] e. twice a month       [ ] f. other

2.6 Instructional session day parts
[ ] a. Morning       [ ] b. Afternoon (noon - 4:59)
[ ] c. Evening

2.7 Average number of students serviced per instructional session
[ ] a. 1             [ ] b. 2
[ ] c. 2 - 3         [ ] d. 4 - 5
[ ] e. 6 or more

2.8 Instructional session location
[ ] a. student's home      [ ] b. tutor's home
[ ] c. library             [ ] d. literacy center
[ ] e. work                [ ] f. church
[ ] g. community center    [ ] h. other

2.9 Materials used in your program
[ ] a. print               [ ] b. computer
[ ] c. audio/cassette      [ ] d. video/films
[ ] e. other


PART III. PROGRAM ISSUES

Please rank the following literacy program issues. Place an X in the number box that is most in agreement with your opinion.

BAD   OK   GOOD   VERY GOOD   EXCELLENT
 1     2      3        4           5

Q. Program Issues                                   1   2   3   4   5

3.1 Material for tutoring sessions

3.2 Tutor training

3.3 Support from the program staff or office

3.4 Student participation in lesson planning

3.5 Student attendance

3.6 Instructional session location

3.7 Homework assignments

3.8 Discuss student's literacy goals

3.9 Student's level of motivation

3.10 Help student reach literacy goals

3.11 Student referral to other support services

3.12 Rank of overall program operations


Self-Evaluation Program
Student Survey Questionnaire

PART I. IDENTIFICATION

Please answer the questions by putting an X in the box.

Please contact your tutor or teacher if you need help reading.

1.1 How long have you been in the program?
[ ] a. less than one month    [ ] d. 7-12 months
[ ] b. 2-4 months             [ ] e. more than 1 year
[ ] c. 5-6 months

1.2 Why did you start this program?
[ ] a. to get a job                     [ ] e. read to children
[ ] b. please yourself                  [ ] f. law/DSS
[ ] c. get a GED/High School Diploma    [ ] g. learn to read and write
[ ] d. read the Bible                   [ ] h. other

1.3 What age group do you belong to?
[ ] a. 16-18   [ ] e. 31-35   [ ] i. 51-60
[ ] b. 19-21   [ ] f. 36-40   [ ] j. 61-65
[ ] c. 22-25   [ ] g. 41-45   [ ] k. 66-70
[ ] d. 26-30   [ ] h. 46-50   [ ] l. 71-plus

1.4 What racial or ethnic group do you belong to?
[ ] a. African American         [ ] d. Hispanic
[ ] b. Asian/Pacific Islander   [ ] e. Native American
[ ] c. Caucasian

1.5 Are you a parent?
[ ] a. yes   [ ] b. no

1.6 Do you have a job?
[ ] a. yes   [ ] b. no

1.7 What is your gender?
[ ] a. female   [ ] b. male

1.8 What is the last grade you attended in school?


PART II. PROGRAM INFORMATION

Please answer the questions by putting an X in the box.

Please contact your tutor or teacher if you need help reading.

2.1 How did you find out about the program?
[ ] a. word of mouth             [ ] e. church
[ ] b. poster/flyer/brochure     [ ] f. community center
[ ] c. radio                     [ ] g. newspaper
[ ] d. television                [ ] h. other

2.2 How long are your classes?
[ ] a. less than 60 minutes        [ ] d. more than 2 hours
[ ] b. 60 minutes - 90 minutes     [ ] e. other
[ ] c. 90 minutes - 120 minutes

2.3 What day/s of the week do you have your classes?
[ ] a. Monday       [ ] e. Friday
[ ] b. Tuesday      [ ] f. Saturday
[ ] c. Wednesday    [ ] g. Sunday
[ ] d. Thursday

2.4 What part of the day do you have your classes?
[ ] a. morning   [ ] b. afternoon (noon - 4:59)   [ ] c. evening

2.5 How many times do you meet for class?
[ ] a. once a week               [ ] e. once a month
[ ] b. two times a week          [ ] f. twice a month
[ ] c. three times a week        [ ] g. other
[ ] d. four - five times a week

2.6 Where are your class meetings?
[ ] a. at home         [ ] e. work
[ ] b. tutor's home    [ ] f. church
[ ] c. library         [ ] g. community center
[ ] d. school          [ ] h. other


2.7 How many students are there in your class?
[ ] a. 1 (myself)   [ ] d. 4-5
[ ] b. 2            [ ] e. 6 or more
[ ] c. 3

2.8 Do you and your tutor/teacher plan your classes together?
[ ] a. yes   [ ] b. no   [ ] c. sometimes

2.9 What materials do you use in class?
[ ] a. print (books, paper)   [ ] d. video
[ ] b. computer               [ ] e. other
[ ] c. audio/cassette

2.10 How many times have you stopped attending this program?
[ ] a. 0          [ ] d. three times
[ ] b. once       [ ] e. four times or more
[ ] c. twice

2.11 If you were to drop out of this program, what would be the most likely reason?
[ ] a. tutor/teacher                  [ ] g. weather
[ ] b. feels like I am not learning   [ ] h. no support
[ ] c. location of classes            [ ] i. work schedule
[ ] d. time of classes                [ ] j. family
[ ] e. child care                     [ ] k. other
[ ] f. transportation

2.12 Would you attend a meeting for adult students, like yourself, to talk about learning?
[ ] a. yes   [ ] b. no   [ ] c. maybe


PART III. ATTITUDES TO THE PROGRAM

Please rank the following literacy attitude issues. Place an X in the number box that is most in agreement with your opinion.

BAD   OK   GOOD   VERY GOOD   EXCELLENT
 1     2      3        4           5

Q. How Do You Feel?                                 1   2   3   4   5

3.1 Your attendance

3.2 Your tutor's/teacher's attendance

3.3 Learning materials (books, workbooks, papers)

3.4 Tutor's/teacher's help in reaching your reading goals

3.5 Program's help with problems (refer to other agencies)

3.6 Home assignments

3.7 Pace of the class

3.8 Class meeting location

3.9 Understand your tutor/teacher

3.10 Life improved since beginning the program

3.11 Growth in your reading ability

3.12 Growth in your writing ability

3.13 Growth in your confidence since beginning the program

3.14 Improvement in your relationship/s with others since beginning the program

3.15 How you feel about yourself since beginning the program

3.16 Rate your classes

3.17 Rate your tutor

3.18 Rate the whole program

3.19 Recommend the program to others


APPENDIX F

Self-Evaluation Program Questionnaires


SELF-EVALUATION PROGRAM
TUTOR SURVEY QUESTIONNAIRE

Please check only one response in all the responses below unless stated otherwise.

1. Please indicate the type of agency to be surveyed.

2. Please indicate your position in this program.

3. Please indicate the length of time you have been in this program.

a. Less than 6 months

b. 6-12 months

c. 12-18 months

d. 18-24 months

e. More than 2 years

4. Please indicate which of the following is the most effective source of information for knowledge about the center.

a. Word of mouth

b. Poster or flyer

c. Radio or television

d. Referral

e. Other (specify)

5. Please indicate one major reason for students enrolling in the program.

a. Driver's license

b. Independent normal life

c. Employment opportunity

d. Law/government requirement

e._ Other (specify)


6. Please indicate to what extent this program provides you with instructional material.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Other (specify)

7. Please indicate the average number of times that a student is absent from a session.

a. Less than 2 times a month

b. 2-3 times a month

c. 4-5 times a month

d. 6-7 times a month

e. More than 7 times a month

8. Please indicate to what extent a student's interaction with other students is beneficial.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

9. Please indicate to what extent your effort as an instructor plays a role in achieving students' goal.

a. To a great extent

b._ To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all


10. Please indicate to what extent the program provides counseling and guidance to students.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

11. Please indicate to what extent your students do their home assignments on a regular basis.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

12. Please indicate how the students perceive the level of work assignments from you.

a. Too hard

b. Challenging

c. _ Just right

d. _ Easy

e. Don't know

13. Please indicate the extent to which student expectations are being met in the achievement of their goal.

a. _ To a great extent

b. To a good extent

C. To a moderate extent

d. To a little extent

e. Not at all


14. Please indicate how the students perceive the pace of the program they are in.

a. Too fast

b. Fast

c. Just right

d. Slow

e. Too slow

15. Please indicate, in your opinion, one major barrier that plays a role in student absenteeism from class sessions.

a. Baby-sitter problem

b. Transportation

c. Self/family health

d. Fear of failure

e. Other (specify)

16. Please indicate to what extent the learning experiences provided specifically address student goals.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

17. Please indicate to what extent the program is beneficial to the students in the program.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all


18. Please indicate how you think your students will rate the program they are in.

a. Excellent

b. Very good

c. Good

d. Average

e. Not bad

19. Please indicate if students recommend the program to others.

a. Yes, without reservation

b. Yes, with some reservations

c. No

d. It depends (explain)

20. Please make comments if you have any.


SELF-EVALUATION PROGRAM
STUDENT SURVEY QUESTIONNAIRE

1. Please indicate the kind of program you are currently enrolled in.

2. Please indicate the purpose of your enrollment in this program.

Please check only one response in all the responses below unless stated otherwise.

3. Please indicate the length of time you have been in this program.

a. Less than 6 months

b. 6-12 months

c. 12-18 months

d. 18-24 months

e. More than 2 years

4. Please check all sources that gave you information about this program.

a. Word of mouth

b. Poster or flyer

c. Radio or television

d. Referral

e. Other (please specify)

5. Please indicate one major reason or goal for your enrollment in this program.

a. Driver's license

b. Independent normal life

c. Employment opportunity

d. Law/government requirement

e. Other (please specify)


6. Please indicate to what extent this program provides you with instructional material.

a. To a great extent

b. To a good extent

C. To a moderate extent

d. To a little extent

e. Not at all

7. Please indicate the number of times you were absent from the program.

a. Less than 2 times a month

b. 2-3 times a month

c. 4-5 times a month

d. 6-7 times a month

e. More than 7 times a month

8. Please indicate to what extent the program allows you to interact with other students.

a. To a great extent

b. To a good extent

C. To a moderate extent

d. To a little extent

e. Not at all

9. Please indicate to what extent your instructor's efforts play a role in achieving your goal.

a. To a great extent

b. To a good extent

C. To a moderate extent

d. To a little extent

e. Not at all


10. Please indicate to what extent the program provides you counseling and guidance.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

11. Please indicate to what extent you have time to work on your assignment at home.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

12. Please indicate the level of work assignments in this program.

a. Too hard

b. Challenging

c. Just right

d. Easy

e. Don't know

13. Please indicate the extent to which your expectations are being met in achieving your goals.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all


14. Please rate the pace of the program you are in.

a. Too fast

b. Fast

c. Just right

d. Slow

e. Too slow

15. Please indicate one major barrier that plays a role in your class attendance.

a. Baby-sitter problem

b. Transportation

c. Self/family health

d. Fear of failure

e. Other (please specify)

16. Please indicate to what extent you think this program will help you in achieving your goals.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all

17. Please indicate to what extent your performance at home, at work, or at some other place has improved since you entered the program.

a. To a great extent

b. To a good extent

c. To a moderate extent

d. To a little extent

e. Not at all


18. Please rate the program you are in.

a. Excellent

b. Very good

c. Good

d. Average

e. Not bad

19. Would you recommend this program to others?

a. Yes, without reservations

b. Yes, with some reservations

c. No

d. It depends (please explain)

20. Please make comments if you have any.


APPENDIX G

DLC Policy Statement and DLC Overview


DETROIT LITERACY COALITION

POLICY STATEMENT

The Detroit Literacy Coalition was created in 1985 to strengthen and enhance the awareness and commitment to literacy in the City of Detroit. With the partnership of business, labor, educational, public and private agencies, and the community-at-large, the Detroit Literacy Coalition aims to establish a unified, comprehensive and coordinated plan to address the illiteracy problem of Detroit.

OVERVIEW

"The mission of the Detroit Literacy Coalition is to enrich the lives of individualsand families by helping them achieve their full potential through literacy by sup-porting, promoting and ensuring the availability and accessibility of literacy serv-ices, thus contributing to a workforce that will ensure a strong economy and apromising future for the City of Detroit."


APPENDIX H

DLC Board of Directors


DETROIT LITERACY COALITION

PRESIDENT
Dr. ROSA MALLETT
Wayne County RESA

DIRECTOR
Dr. DAPHNE WILLIAMS NTIRI
Wayne State University

STAFF
PATRICIA TOMICH, Events Coordinator
CARLA JORDAN, Clerical
Wayne State University

BOARD OF DIRECTORS

Lt. JOHN AUTREY, Detroit Police Department

VERNITA BEVERLY, WTVS-Channel 56

RUTH BIERSDORF, Detroit Public Library

THOMAS E. COOK, MEA/NEA Great Cities Program (Retired)

GLORIA EDMONDS, AKA Reading Experience Program

EDITH EUBANKS, Neighborhood Services

ANTHONY HARRIS, Graduate Student, Wayne State University

PATRICK LINDSAY, Chrysler Corporation

GLORIA GRADY MILLS, Michigan State Department of Education

Dr. VERONA MORTON, Highland Park Community College

MARY ANN REILLY, United Way

Sr. MARIE SCHOENLEIN, Dominican Literacy Project

IDA SHORT, Schoolcraft College


APPENDIX I

DLC Mission Profile


Detroit Literacy Coalition

Mission Profile

Statement of Purpose

The purpose of the Detroit Literacy Coalition (DLC) is to promote the total eradication of illiteracy in our community and to advance the cause of reading and writing both for work and pleasure. The DLC is committed to improving the quality of life of every Michigan citizen and resident by ensuring that increased access to basic education is available to those deprived or those who for some reason or other missed out on early educational opportunities. Its central tenet is that total literacy is fundamental to the well-being of Detroit society.

For the DLC, the ongoing re-evaluation of our values and perspectives of the world in which we live demonstrates the need to equip Detroiters with more than the basic literacy skills to prepare for an increasingly technologically advanced world that requires sophisticated levels of knowledge for the workplace. Functional illiterates or "prisoners of silence," according to Jonathan Kozol, live on the edge of society. The DLC recognizes that the responsibility to provide for the marginalized sectors who are constantly deprived rests on our volunteers, tutors and all who can help.

The DLC recognizes that there is a "New Order" in the Detroit workplace. Whereas under the old order functional illiterates could hide behind their ignorance, in the new order the introduction of computers and the constantly changing standards and regulations for industrial workers make greater educational achievement necessary for our employees.

The DLC, as a pivotal force in literacy advocacy in Detroit, will gain consistent, ongoing financial support that will be distributed to member literacy providers.

The DLC is aware of the nontraditional family structure that exists today. Family and intergenerational literacy are a part of the plan of the DLC. The emphasis on the family, including parenting skills, is incorporated in the planning and execution of literacy programs around the city.

The DLC will be the communications and information clearinghouse for literacy material.

The DLC articulates two strong words, "trust" and "love," that are lacking in the lives of many. Associated DLC providers are encouraged to offer programs that are situated in warm and friendly environments to help learners feel more comfortable.


The DLC maintains an interest in filling the gaps in the existing literature to form a sound conceptual base for literacy research and advancement in Michigan and the nation.

The DLC's mission is primarily to serve the City of Detroit in generating city-wide awareness and coordinating adult literacy resources and services. The DLC has taken the leadership in meeting the needs of at-risk Detroiters through the collaborative efforts of its members. As part of its mission to expand services and foster a healthy alliance with the associated literacy providers (Detroit Public Schools, Detroit Public Libraries, churches, prisons, fraternities and clubs, community groups) it promotes public/private sector partnerships; it fosters greater corporate awareness of adult functional illiteracy; it works toward increasing business involvement in literacy activities; it encourages volunteerism; it provides technical and networking assistance; it promotes adult literacy services through local and national media and through contacts with state agencies; and it stages highly visible activities such as the Literacy Summit and the Walkathon to boost the image of the organization and make it a household word.

In short, the DLC has set the goal of alleviating illiteracy, that is, reducing the number of functional illiterates from 200,000 to less than 50% of that figure over the next few years in the metro Detroit area. This is particularly desirable for drop-out youth, special populations, and ethnic enclaves where illiteracy seems to predominate. Ultimately, our goal is to make the DLC an exemplary model that can be replicated elsewhere.


APPENDIX J

Services and Primary Goals


Services and Primary Goals

Special Events: To organize and present special events such as the Literacy Summit and the Walkathon that increase awareness in the community about the scourge of illiteracy.

Publications: To produce and circulate publications such as newsletters, directories and other information documents (legislative, administrative, curricular) that will keep literacy providers and the public informed and knowledgeable.

Research: To generate data bases and disseminate results of studies about literacy and education in general for metro Detroit, paying close attention to issues that will help alleviate the literacy dilemma.

Community Services: To offer to neighborhood groups and literacy providers skills and technical assistance such as program development and management, tutor training, grant writing, board development, curriculum development, data collection, etc.

Grantswriting and Fundraising: To identify and obtain funds from a variety of sources that can make the DLC the central grant redistributing agency to promote literacy in the city.

Leadership Development: To be in a position to demonstrate leadership and to organize workshops, colloquia and symposia designed to promote leadership among literacy providers.

Organizational Development: To assist with organizational development of new and smaller groups that wish to join the literacy bandwagon. Such services often include: participant development, board development, curriculum development, marketing and grantwriting, promotion, and information sharing.

Information Sharing: To serve as the information clearinghouse for literacy materials.

Public Policy Strategies: To increase public awareness of the problem of illiteracy with full endorsement by the City of Detroit and the Common Council.

Dialoguing: To promote exchanges among Michigan's literacy providers, such as the voluntary literacy councils, adult basic education programs, state library systems, etc.


APPENDIX K

List of Responding Literacy Sites


Adult Basic Education
Ishpeming, Negaunee, Nise Comm. Ed.
101 Pioneer Ave.
Negaunee, MI 49849

Adult Literacy Program
Janet Fulton
Alpena County Library
211 N. First Avenue
Alpena, MI 49707

African Heritage Cultural Center
Patricia Noni Gees
Detroit Public Schools
21511 W. McNichols
Detroit, MI 48219

Alger Area Literacy Council
Herbert M. Ingraham
R.R. 1, Box 1022
Munising, MI 49862

Allendale Community Education
Claire Voetberg
6633 Lake Michigan Drive
Allendale, MI 48401

Antrim-Kalkaska Literacy Council
Toni Wayda
201 State Street
Mancelona, MI 49659

Branch County Literacy Council
Colleen Knight
9501 E. Chicago
Coldwater, MI 49036

Cheboygan County Libraries for Literacy
Fred Brickley
Indian River Library, P.O. Box 160
Indian River, MI 49749

Cheboygan County Libraries for Literacy
Cindy Lou Poquette
3546 S. Straits Highway, P.O. Box 160
Indian River, MI 49749


Detroit Public Schools; Adult Ed.
Lura Burns
3700 Pulford
Detroit, MI 48207

Detroit PS Comm. Based Ed East
Sharon Gogle
Harris School
3700 Pulford
Detroit, MI 48207

Dominican Literacy
Sister Marlene Lieder, OP
9400 Courville
Detroit, MI 48224

Dominican Literacy Center
Donald J. Bilinski
9400 Courville
Detroit, MI 48224

Dominican Literacy Center
Donald Johnson
9400 Courville
Detroit, MI 48224

Dominican Literacy Center
Nancy Mayer
9400 Courville
Detroit, MI 48224

Dominican Literacy Project
Sister Marie Schoenlein
9400 Courville
Detroit, MI 48224

Down River Literacy Council
Mary Lou Prevost
Southgate Adult Education
14101 Leroy
Southgate, MI 48195

Dukette Learning Center
Mary Jane Baumann, IHM
530 W. Pierson Road
Flint, MI 48505


Evart Community Education
Carol Wojcik
161 1/2 N. Main Street
Evart, MI 49631

Even Start Family Education Program
Denise Schmitz-Enking
131 E. Aurora Street
Ironwood, MI 49938

Montcalm Adult Reading Council
Ollivette Kassouni
1401 Van Deinse
Greenville, MI 48838

Grand Blanc Community Education
Mark R. Wallen
G-11920 S. Saginaw
Grand Blanc, MI 48439

Industrial Sites
Barbara Hubert
20018 Wisconsin
Detroit, MI 48221

Kent County Literacy Council
Judy Zainea
60 Library Plaza, N.E.
Grand Rapids, MI 49503

Leelanau Literacy Council
Hazel Le Anderson
502 N. Traverse Lake Road
Cedar, MI 49621

Linden Community Education
7801 Silver Lake Road
Linden, MI 48451

Literacy Council of Bay County
Ann Cotten
401 Banger St.
Bay City, MI 48706

Literacy Council of Calhoun County
Lynn Blake
450 North Avenue
Battle Creek, MI 49017

Literacy Council of Midland, MI
Diane H. Kau
220 E. Main Street, Suite 206
Midland, MI 48640

Literacy Network of Kalamazoo County
Denise Duquette
1000 E. Paterson
Kalamazoo, MI 49080

Literacy Now!
Verona Morton
Highland Park Community College
Glendale at Third
Highland Park, MI 48203

Ludington Public Library Lit. Program
Barbara Sutton
217 E. Ludington
Ludington, MI 49431

LVA - Detroit
Heidi Hatcher
330 Fisher Building
Detroit, MI 48202

LVA - Sanilac Council
Martha Carpenter
46 North Jackson
Sandusky, MI 48422

LVA - Sanilac Council
Grace Temple
46 North Jackson
Sandusky, MI 48471

MI Catholic Health Systems
Beverly Ciokajlo
53 Chandler
Highland Park, MI 48203


Michigan Literacy Inc.
Donna Audette
P.O. Box 30007
Lansing, MI 48909

Montcalm Adult Reading Council
Virginia Schantz
205 S. Franklin
Greenville, MI 48838

Montcalm Adult Reading Council, c/o Adult Education
Olivette Kassouni
1401 Van Deinse
Greenville, MI 48838

Mott Adult - Sarvis Center
Kathryn Williams
12037 Juniper Way, #530
Grand Blanc, MI 48439

Oakland Literacy Council
William E. Engel
17 S. Saginaw
Pontiac, MI 48342

Oakland Literacy Council
Nancy B. Geddes
17 S. Saginaw
Pontiac, MI 48342

Oakland Literacy Council
Cathryn Weiss
17 S. Saginaw
Pontiac, MI 48342

Osceola County Literacy Council
Marilyn Allen
9890 Three Mile Road
Sears, MI 49679

Project Lead - Adrian Pub. Library
Janet Vern
143 E. Monroe
Adrian, MI 49221


Project Literacy
Deborah Del Zappo
P.O. Box 175
Muskegon Oceana County
Pentwater, MI 49449

R.E.A.D. Alpena County Library
Susan Plowman
211 N. First Avenue
Alpena, MI 49707

Reading for Adults Literacy Program
Doris E. Lance
Alpena County Library
211 N. First Avenue
Alpena, MI 49707

Reggie McKenzie Foundation
Pat Smith
20 Bartlett
Highland Park, MI 48203

Retired Senior Volunteer Program
Tambrete D. Phillips
P.O. Box 1025
Gaylord, MI 49735

Roscommon County Literacy Council
Jan Montei
P.O. Box 320
Roscommon, MI 48653

Schoolcraft College
Linda Tolbert
18600 Haggerty
Livonia, MI 48025

Shiawassee Adult Literacy Association
Maxine Capitan/Karen Fuoss
515 W. Main Street
Owosso, MI 48867

South Haven Area Literacy Council
Kathy Steffler
600 Elkenburg
South Haven, MI 49090


South Haven Area Literacy Council
Louise Wepfer
600 Elkenburg
South Haven, MI 49090

South Kent Adult Learning Center
Lorraine Rozegnal
319 Fairbanks
Grand Rapids, MI 49508

South Kent Community Education
Doris Hackett
3529 S. Division Avenue
Wyoming, MI 49548

South Kent Comm. Ed. Basic Lit.
George Jasperse
3121 Maple Villa SE
Grand Rapids, MI 49508

Straits Area Community Adult Ed
Ann Stafford
840 Portage Road, P.O. Box 254
St. Ignace, MI 49781


Tuscola Literacy Council
Ellen Toner
4117 Doerr Road
Cass City, MI 48867

Volunteers for Adult Literacy
Mollie J. Hembruch
Mott Adult High School
1231 E. Kearsley Street
Flint, MI 48503

Wayne County Adult Literacy Network
Dr. Rosa Willett
P.O. Box 8073
3500 Van Born Road
Wayne, MI 48184-2497

West Branch Rose City Adult Education
Sally Campbell
960 S. M-33
West Branch, MI 48661

WSU/Council for Excellence in Adult Learning
2727 2nd Avenue, Suite 107
Detroit, MI 48202


Detroit Literacy Coalition
Detroit, Michigan

ISBN: 0-911557-13-X
