
If we plan and conduct our assessment projects at every step as if learning matters most -- not just the students’ learning, but ours, as well -- then the distance between means and ends will be reduced and our chances of success increased. - Thomas Angelo


Table of Contents

Preface
Introduction
  What is SLO Assessment?
    A. Definition of Basic Terms
    B. Myths and Facts about Assessment
  Why Do We Assess?
    A. Benefits of Assessment
    B. SACS Expectations
    C. Creating a Culture of Continuous Improvement
SLO/PLO Assessment Process at Lone Star College System
  Flowchart for SLO/PLO Assessment Process
  Basic Requirements
    A. Course Selection Requirements & Curriculum Team Responsibilities
    B. Section Selection Requirements & Program Chair/Director/SLO/PLO Campus Liaison Responsibilities
    C. Data Collection Requirements & Faculty Responsibilities
    D. Campus-Wide vs. System-Wide Recommendations
    E. Implementation of Recommendations
SLO/PLO Assessment Timeline
Writing the SLO/PLO Assessment Plan
  Creating Student Learning Outcome Statements
    A. Guideline for Creating an SLO Statement
    B. Bloom’s Taxonomy and Domains of Learning
  Designing Assessment Methods
    A. Selecting Measures
      (A) Defining Measures
      (B) How to Select Measures
      (C) Examples of Measures Aligned with Student Learning Outcomes
    B. Designing Assessment Rubrics
      (A) Rubric Defined
      (B) How to Create a Rubric
      (C) Examples of Rubrics
  Setting Criteria and Target Outcomes
    A. Setting Criteria
    B. Setting Target Outcomes
Writing the Closing-the-Loop Report
  How to Report on Actual Results
  How to Interpret Results
  How to Make Recommendations
Continuous Quality Improvement
  How to Report on the Implementation of Recommendations
Assessment Tool: Compliance Assist
  Web Link to Compliance Assist
  SLO Form: Guide to Data Entry
  PLO Form: Guide to Data Entry
  Compliance Assist Navigation Tool
Glossary of Terms
References
Resources
Appendices
  Appendix A: AA/AS PLO-Course Alignment Map
  Appendix B: Workforce Program & AAT PLO-Course Alignment Map
  Appendix C: Curriculum Team Form Worksheet
  Appendix D: Instructions on How to Fill out the e-Form for Section Selection
  Appendix E: Detailed Explanation of the SLO/PLO Process with Requirements
  Appendix F: Worksheet for Creating Course SLO Assessment Plans and Reports
  Appendix G: Worksheet for Creating PLO Assessment Plans and Reports
  Appendix H: SLO/PLO Assessment Self-Assessment Rubric
  Appendix I: Samples of Exemplary Course SLO Plans and Reports
  Appendix J: Samples of Exemplary PLO Plans and Reports
  Appendix K: AAHE Nine Principles of Good Practice for Assessing Student Learning


Preface

It is easy to think that we have to conduct student learning outcome (SLO) assessment simply because of accreditation requirements. Looking beyond those requirements, however, faculty members sincerely care about student learning, and they care about exploring ways to gauge whether students are learning. More and more faculty members are embracing assessment of student learning as classroom inquiry and as “a feature of pedagogical imperative,” aimed at improving students’ learning experience.

Assessment at its best should not be cumbersome additional work that pulls faculty away from engaging students in learning. Instead, it should be an integral part of classroom learning activities, part of the assignments that faculty give students to gauge their mastery of learning. As Thomas Angelo puts it, “If we plan and conduct our assessment projects at every step as if learning matters most -- not just the students’ learning, but ours, as well -- then the distance between means and ends will be reduced and our chances of success increased.”

Lone Star College faculty members have increasingly taken ownership of assessing student learning, and they continue to share success stories about using SLO assessment results for continuous improvement of their teaching practices and academic programs. To sustain a culture of continuous improvement and to maintain consistency across the Lone Star College System, we have compiled this SLO Assessment Handbook, which serves the following functions:

First, the handbook provides a common set of expectations regarding the SLO and PLO assessment processes at Lone Star College System. Second, it serves as a resource to aid the campuses in their assessment efforts. Finally, it acts as a communication piece for our internal and external stakeholders, updating them about our assessment processes and accomplishments. As such, it is a living document that will be updated on a regular basis.

Additionally, the Office of Strategic Planning and Assessment provides online resources and regular professional development workshops on student learning outcome assessment. We want to support you in any way we can as you engage yourselves in the collective inquiry into continuous improvement of student learning and student success.


Introduction

What is SLO Assessment?

A. Definition of Basic Terms

SLO: Acronym for Student Learning Outcome. An SLO statement explains what the student is learning, including the accumulated and demonstrated knowledge, skills, abilities, behaviors, and habits of mind, as a result of actively participating in the course or program of study.

Currently, Lone Star College uses the SLO acronym to refer to course-level learning outcome assessment.

PLO: Acronym for Program Learning Outcome. PLOs address the question, “What will students know or be able to do when they exit the program?”

Currently, Lone Star College has three categories of PLOs:

(1) PLOs for Workforce Programs. The assessment data are tracked using the PLO Assessment Forms in Compliance Assist;

(2) PLOs for the Associate of Arts in Teaching program. The assessment data are tracked using the PLO Assessment Forms in Compliance Assist;

(3) PLOs for Associate of Arts/Associate of Science degrees. The AA/AS PLOs are assessed by course-level SLOs, so the assessment data are tracked using SLO Assessment Forms in Compliance Assist.

Direct Measures: Refer to the type of data that can be used to directly measure students’ knowledge and skills (examples: exams, essays, and skill demonstrations).

Indirect Measures: Refer to the type of data that can be used to infer student learning or achievement (examples: surveys, interviews, and graduation rates).

Program-Level Measures: Refer to assignments assessing students’ knowledge and skills at the end of the program, not embedded in any particular course (examples: comprehensive exit exams, licensure exams, and capstone assignments).

Course-Level Measures: Refer to assignments embedded in specific courses selected to measure PLOs (examples: exams and essays in a history or an English course).

B. Myths and Facts about Learning Outcome Assessment

Student learning outcome assessment began with state-level mandates in the 1980s and became part of the accreditation requirements in the 1990s. These external requirements may not be in tune with the traditional teaching practices that some faculty were used to, thus leading to some misunderstandings. To clarify the main purposes and the core task of assessment, we need to dispel some myths about student learning outcome assessment.


Myth: Learning outcome assessment data may negatively impact faculty’s job security.
Fact: The assessment data are meant to be used to gain insights into student learning and to improve student learning. They are not meant to be used as a threat to faculty’s jobs.

Myth: Assessment is an additional burden and pulls faculty’s attention away from teaching.
Fact: Student learning outcome assessment should be an integral part of teaching. Assignments given to students for SLO assessment should be part of the course assignments or program-level assignments used to gauge whether students are learning what they are expected to learn. It is up to faculty, working collaboratively, to design assignments for SLO assessment that are meaningful, not cumbersome.

Myth: Grades are enough to assess student learning.
Fact: Grades are not enough to assess student learning, for three reasons. First, course grades sometimes include extraneous elements, such as extra credit, that lie outside of learning. Second, grades do not provide a targeted assessment of defined skills or knowledge; for example, a grade of “B” does not offer any information as to what specific area a student is excelling or deficient in. Finally, grades are indirect measures of learning and are therefore insufficient for assessing student learning.

Myth: Assessment is not worth the time.
Fact: Using assessment data to reveal where the focus of teaching and learning should be will save time rather than waste it. Faculty members assess students’ learning all the time, with or without the requirements for SLO assessment. The key is to integrate SLO assessment with faculty’s routine assessment of students’ performance, while focusing more on gauging whether students have mastered the well-defined learning outcomes.

Why Do We Assess?

To further clarify why we should conduct learning outcome assessment, we should take a closer look at the benefits of assessment and SACS/COC expectations.

A. Benefits of Assessment

The benefits of learning outcome assessment are manifold. Assessment allows us to collect evidence to demonstrate quality teaching and learning; it helps us link courses together in a coherent sequence; and it opens up dialogue about what was taught, why it was taught, and what the standards and expectations should be. The assessment data shed light on areas for improvement in terms of curriculum and instruction. The assessment processes also challenge students to take ownership of their learning. Specifically, assessment benefits both faculty and students in the following ways:


Benefits for Students:

Students will be able to
- Learn clear expectations about what’s important in a course or program
- Be evaluated in a consistent and transparent way
- Gain assurance that there is common core content across all sections of a course
- Make better decisions about what program of study to pursue based on outcomes results

Benefits for Faculty:

Faculty will be able to
- Answer the question: Did my students learn what I wanted them to learn?
- Determine what's working and what's not working in their courses or programs
- Engage in valuable interdisciplinary and intercampus discussions
- Provide powerful evidence to justify needed resources to maintain or improve programs
- Provide reassurance that all faculty teaching a particular high-demand course agree to address certain core content

B. SACS/COC Expectations

Assessing student learning outcomes and using the assessment data for continuous improvement are required by SACS/COC. See SACS/COC Comprehensive Standard 3.3.1 below:

Comprehensive Standard 3.3.1

3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)

3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 educational support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate.

(SACS/COC Resource Manual, 2012, p. 48)

C. Creating a Culture of Continuous Improvement

While it is important to comply with the accreditation requirements, it is more important to assess student learning for the sake of students. To make assessment meaningful, systematic, and ongoing, Lone Star College has implemented consistent processes, frameworks, and tools for assessment at different levels and in various forms.

In order to create and sustain a culture of continuous improvement, Lone Star College engages in both informal and formal assessment. Informal assessment is constant and ongoing (embedded in everything we do). Formal assessment is a systematic process and occurs on a regularly scheduled basis. LSC has numerous formal assessment processes, for example, the Annual Cycle of Effectiveness Process (ACE), the Program Review Process, and the Student Learning Outcome assessment process. This Handbook will limit its discussion to student learning outcome assessment at both the program level and the course level.


SLO/PLO Assessment Process at Lone Star College

As a multi-campus institution, Lone Star College has instituted an SLO/PLO assessment process that ensures both system-wide consistency and campus flexibility. Consistency is maintained by system-wide curriculum teams, who select the same SLOs and the same courses for assessment each year. Campuses have the flexibility to design methods of assessment, collect and analyze data, and use data for continuous improvement. If a campus recommends any changes to the curriculum, the relevant system-wide curriculum team will be engaged in the discussion and will decide whether or not the change should be made and implemented across the system. (The flowchart reflecting this balanced approach appears here in the original handbook.)


Basic Requirements

A. Course Selection Requirements & Curriculum Team Responsibilities:

1) Each Curriculum Team selects two courses and four SLOs for each annual assessment cycle (each course assesses two SLOs, and both courses need to be assessed and reassessed before rotating to another set of two courses). See the course SLO statements at the link below: http://www.lonestar.edu/class-search.htm

2) Selected courses and SLOs need to be aligned with PLOs (AA/AS PLOs and Workforce PLOs). See the AA/AS PLO-Course Alignment Map and the Workforce Program & AAT PLO-Course Alignment Map at the SLO website: http://www.lonestar.edu/student-learning-outcomes.htm. Also, see the alignment maps in Appendix A and Appendix B.

3) Curriculum Team Chairs should fill out the Curriculum Team Form (see Appendix C for a sample) in Compliance Assist and communicate the selected courses and SLOs to their related disciplines at each campus prior to the beginning of each assessment cycle;

4) Curriculum Team Chairs will also document Curriculum Team SLO recommendations and/or decisions in the Curriculum Team Form at the end of each assessment cycle.

B. Section Selection Requirements & Program Chair/Director/SLO/PLO Campus Liaison Responsibilities

Program chairs/directors should identify the course sections to be assessed for each course selected for assessment according to the following standards:

1) 25% of sections need to be identified for SLO assessment;

2) If the course has four or fewer sections, then all sections need to be assessed (see the sketch following this list);

3) Course sections selected need to include main campus, center, online, and dual enrollment sections;

4) Program chairs/directors and campus SLO/PLO liaisons should document the selected sections in the e-Form provided by the Office of Research and Institutional Effectiveness (ORIE). See instructions on how to fill out the e-Form in Appendix D;

5) Program chairs/directors and campus SLO/PLO liaisons should also communicate with the faculty teaching the selected course sections before the semester begins and explain the method of assessment, the evaluation instrument, etc.
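The 25% rule and the four-or-fewer rule above lend themselves to a quick calculation. The following sketch is a hypothetical helper (not part of any LSC tool) showing one way to compute how many sections to select; rounding up on the 25% rule is an assumption, since the handbook does not specify how fractional counts are handled.

    import math

    def sections_to_assess(total_sections: int) -> int:
        """Return how many sections to select for SLO assessment.

        Handbook rules: if a course has 4 or fewer sections, assess
        all of them; otherwise, identify 25% of sections. Rounding up
        is an assumption -- the handbook does not say how to handle
        fractional counts.
        """
        if total_sections <= 4:
            return total_sections
        return math.ceil(total_sections * 0.25)

    print(sections_to_assess(18))  # 5 (25% of 18, rounded up)
    print(sections_to_assess(3))   # 3 (four or fewer: assess all)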


C. Data Collection Requirements & Faculty Responsibilities

Faculty members chosen to assess should submit their SLO results using the SLO Data Collection Tool.

The flowchart for the process of submitting SLO data shows the following steps:

1) SPA sends the SLO Data Collection Tool link to faculty through SLO Assessment Leaders;

2) Faculty submit the SLO data to the SLO Data Collection Tool;

3) From the SLO Data Collection Tool, SPA generates the aggregated reports by course and by campus;

4) Faculty discuss the aggregated reports and make recommendations;

5) Discipline SLO Assessment Liaisons submit aggregated results, data interpretation, and recommendations to Compliance Assist.

Instructions on how to use the SLO Data Collection Tool are available from the SLO website:

http://www.lonestar.edu/student-learning-outcomes.htm

D. Campus-Wide vs. System-Wide Recommendations

1) Campus faculty should make recommendations based on SLO data analysis and interpretation.

2) Recommendations can be made in the areas of instructional strategies, curriculum revision, student support, and faculty professional development, to name a few.

3) If the campus recommendations call for curriculum changes, the System Curriculum Team should meet to discuss the recommendations and suggested changes.

4) System-wide curriculum teams should also meet to share best practices and recommendations implemented at each campus.

5) Recommendations made by the system-wide curriculum teams need to be documented in Compliance Assist in the Recommendation section of the Curriculum Team Form, and these recommendations should also be reflected in the Recommendation section of the SLO assessment form completed by each campus.

6) Campus SLO liaisons are responsible for submitting SLO results and the use of results (recommendations) in Compliance Assist.


E. Implementation of Recommendations

After the recommendations are made, you still need to implement them during the subsequent assessment cycle, when the course is reassessed in a second year of assessment. You will need to report on the implementation at the end of that subsequent cycle, after you reassess the course. However, the report on the implementation of the recommendations should be documented in the Implementation section of the SLO form for the same cycle in which the recommendations were made. See the illustration below:

2013-14 SLO Closing-the-Loop Report:
- Report the Actual Result
- Interpret Data
- Make Recommendations
- Report on the Implementation (at the end of the 2014-15 cycle)

For a detailed description of the requirements involved in the assessment process, please refer to Appendix E.

SLO/PLO Assessment Timelines

Lone Star College System’s SLO assessment data (at the course level) are mainly collected during Fall semesters. Spring semesters are used to catch up on courses that were offered in Fall but were cancelled for various reasons, or on courses that are only offered in Spring. Spring semesters are also used to discuss data and strategies for implementing changes. The SLO assessment loop is closed at the end of the academic year. Likewise, PLO assessment is an annual process; data collection may occur at the course level and/or the program level, and the PLO assessment loop is also closed at the end of each academic year. See the timelines for SLO and PLO assessment presented below.

(Timeline charts for the SLO and PLO assessment cycles appear here in the original handbook.)

Writing the SLO Assessment Plan

Campus SLO/PLO liaisons, in collaboration with program chairs/directors and faculty, are responsible for developing the SLO assessment plans for courses and SLOs selected by the system-wide curriculum team. Please note that SLOs selected should support the assessment of the AA/AS PLOs.

Workforce program directors are responsible for engaging faculty in developing PLO assessment plans for their respective programs (PLO statements for each program should be identical across the system).

Please refer to the following chart for key components of the SLO/PLO assessment plans:

SLO Assessment Plan Components:
- AA/AS PLO Alignment
- SLO Statement
- Method of Assessment (A): Measure (Student Work/Performance Used for Assessment)
- Method of Assessment (B): Instructor’s Evaluation Method (Rubric or Scoring Guide)
- Criterion (for Individual Student’s Mastery of SLO)
- Target Outcome (Expected Group Success Rate)

PLO Assessment Plan Components:
- PLO Statement
- Program or Course Level Measure
- Direct or Indirect Measure
- Method of Assessment (A): Measure (Student Work/Performance Used for Assessment)
- Method of Assessment (B): Instructor’s Evaluation Method (Rubric or Scoring Guide)
- Criterion (for Individual Student’s Mastery of SLO)
- Target Outcome (Expected Group Success Rate)

After the assessment plans are developed, the campus SLO/PLO liaisons are responsible for entering the assessment plans into Compliance Assist.

For detailed explanations on how to create the SLO/PLO assessment plans, see the following sections:

Creating Student Learning Outcome Statements

The student learning outcome (SLO) is a statement that explains what the student is learning, including the accumulated and demonstrated knowledge, skills, abilities, behaviors, and habits of mind, as a result of actively participating in the course or program of study.

A. Guideline for Creating an SLO Statement:

To create an SLO statement, keep in mind the following:

Focus on what the student can do. Don't address what was taught or presented; address the observable outcome you expect to see in the student.

Use active verbs. Active verbs are easier to measure. For instance, suppose you want the students to understand how to correctly use a microscope: the word "understand" is not measurable. Can you measure understanding? Instead, try to imagine the outcome - Students will focus and display an image on the microscope. For this you can both develop criteria and measure ability.

Include an assessable expectation. It helps if you have clearly defined expectations concerning the criteria related to that outcome. In the above example, some of the criteria related to using the microscope would include:

- a clearly focused image

- correct lighting adjustment of the diaphragm and condenser

- appropriate magnification for the object

- an evenly distributed specimen field

- clearly located object identified by the pointer

- a written identification

Share the outcomes with faculty from other disciplines and within your own discipline. This helps focus the meaning of the statements. For instance, in the above criteria, faculty may ask for clarification of "appropriate magnification."

Share the outcomes with your students. Students need to clearly understand what is expected; they are unfamiliar with discipline-specific language. This helps focus the clarity of the statements.

Modify as you learn from experience. Leave the word "DRAFT" at the top of your SLOs to remind yourself, and to communicate to others, that you are actively improving them.
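To illustrate the "use active verbs" guideline in code, the short sketch below flags hard-to-measure verbs in a draft SLO statement. The vague-verb list is only a sample drawn from the discussion above ("understand" is not measurable); it is not an official LSC list.

    # Flag hard-to-measure verbs in a draft SLO statement.
    # The vague-verb list is illustrative only, not an official list.
    VAGUE_VERBS = ("understand", "appreciate", "be aware of", "learn about")

    def flag_vague_verbs(statement: str) -> list[str]:
        """Return any vague verbs found in a draft SLO statement."""
        lowered = statement.lower()
        return [verb for verb in VAGUE_VERBS if verb in lowered]

    draft = "Students will understand how to correctly use a microscope."
    flags = flag_vague_verbs(draft)
    if flags:
        print("Consider replacing vague verb(s):", flags)
        # A measurable rewrite: "Students will focus and display
        # an image on the microscope."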

B. Bloom’s Taxonomy and Domains of Learning

Student learning outcomes for a course or a program should address different domains of learning and various levels of learning. Therefore, it is necessary to understand domains and taxonomy of learning, as defined by Bloom and his colleagues. The following are three domains widely recognized in the field of learning:

Cognitive domain (Bloom, 1956; Anderson & Krathwohl, 2000), which defines knowledge classification.

Psychomotor domain (Gronlund, 1970; Harrow, 1972; Simpson, 1972), which defines physical skill or task classification.

Affective domain (Krathwohl, Bloom, and Masia, 1964), which defines behaviors that correspond to attitudes and values.

The following three tables lay out the taxonomies (classifications) of these three domains. They are arranged to proceed from the simplest to more complex levels. Each level contains general learning outcomes and examples of verbs that can be used to compose learning outcomes:


Cognitive Domain: Learning Outcomes Related to Knowledge

Remembering: Students are able to retrieve, recognize, and recall relevant knowledge from long-term memory; e.g., find out, learn terms, facts, methods, procedures, concepts.
Sample verbs: Acquire, Define, Distinguish, Draw, Find, Label, List, Match, Read, Record

Understanding: Students can construct meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining; students can also demonstrate understanding of implications and applications of terms, facts, methods, procedures, concepts.
Sample verbs: Compare, Demonstrate, Differentiate, Fill in, Find, Group, Outline, Predict, Represent, Trace

Applying: Students can carry out or use a procedure through executing or implementing; can apply practice or theory; solve problems; use information in new situations.
Sample verbs: Convert, Demonstrate, Differentiate between, Discover, Discuss, Examine, Experiment, Prepare, Produce, Record

Analyzing: Students can break material into constituent parts; determine how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing; can take concepts apart, break them down, analyze structure, recognize assumptions and poor logic, and evaluate relevancy.
Sample verbs: Classify, Determine, Discriminate, Form generalizations, Put into categories, Illustrate, Select, Survey, Take apart, Transform

Evaluating: Students can make judgments based on criteria and standards through checking and critiquing; can set standards; judge the use of standards, evidence, rubrics; can accept or reject on the basis of criteria.
Sample verbs: Argue, Award, Critique, Defend, Interpret, Judge, Measure, Select, Test, Verify

Creating: Students can put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure through generating, planning, or producing; put things together; bring together various parts; put information together in a new and creative way.
Sample verbs: Arrange, Blend, Create, Deduce, Devise, Organize, Plan, Present, Rearrange, Rewrite


Psychomotor Domain: Learning Outcomes Related to Skills

Observe: Students translate sensory input into physical tasks or activities.
Sample verbs: Hear, Identify, Observe, See, Smell, Taste, Touch, Watch. (Usually no outcomes or objectives are written at this level.)

Model: Students are able to replicate a fundamental skill or task.
Sample verbs: Attempt, Copy, Follow, Imitate, Mimic, Model, Reenact, Repeat, Reproduce, Show, Try

Recognize Standards: Students recognize standards or criteria important to performing a skill or task correctly.
Sample verbs: Check, Detect, Discriminate, Differentiate, Distinguish, Notice, Perceive, Recognize, Select

Correct: Students use standards to evaluate their own performances and make corrections.
Sample verbs: Adapt, Adjust, Alter, Change, Correct, Customize, Develop, Improve, Manipulate, Modify, Practice, Revise

Apply: Students apply the skill to real-life situations.
Sample verbs: Build, Compose, Construct, Create, Design, Originate, Produce

Coach: Students are able to instruct or train others to perform the skill in other situations.
Sample verbs: Demonstrate, Exhibit, Illustrate, Instruct, Teach, Train


Affective Domain: Learning Outcomes Related to Attitudes, Behaviors, and Values

Receiving: Students become aware of an attitude, behavior, or value.
Sample verbs: Accept, Attend, Describe, Explain, Locate, Observe, Realize, Receive, Recognize

Responding: Students exhibit a reaction or change as a result of exposure to an attitude, behavior, or value.
Sample verbs: Behave, Comply, Cooperate, Discuss, Examine, Follow, Model, Present, Respond, Show, Study

Valuing: Students recognize value and display this through involvement or commitment.
Sample verbs: Accept, Adapt, Balance, Choose, Differentiate, Defend, Influence, Prefer, Recognize, Seek, Value

Organizing: Students determine a new value or behavior as important or a priority.
Sample verbs: Adapt, Adjust, Alter, Change, Customize, Develop, Improve, Manipulate, Modify, Practice, Revise

Characterizing: Students integrate consistent behavior as a naturalized value in spite of discomfort or cost; the value is recognized as a part of the person’s character.
Sample verbs: Authenticate, Characterize, Defend, Display, Embody, Habituate, Internalize, Produce, Represent, Validate, Verify

Source: Adapted from http://www.craftonhills.edu/~/media/Files/SBCCD/CHC/Faculty%20and%20Staff/SLOs/Step%201/Blooms%20Taxonomy%20and%203%20Domains%20of%20Learning.pdf

It is recommended that student learning outcomes address relevant outcomes for each of these domains, as appropriate to the course or the program. The essence of student learning outcomes lies in focusing on the results you want from your course or program rather than on what you will cover in the course or program.


C. SLO Examples

Remembering
- Students will be able to describe the history, purpose, and scope of physical therapy.
- Students will be able to identify the social, political, economic, and cultural influences and differences that affect the development process of the individual.

Understanding
- Students will be able to distinguish important aspects of the western moral theories, from the virtue theory of ancient philosophers to the modern theories of act ethics.
- Students will be able to explain georeferencing of photos or images to maps.
- Students will be able to summarize the principles of magnetism.

Applying
- Students will be able to implement a plan of care for patients and families within legal, ethical, and regulatory parameters.
- Students will be able to apply appropriate organizational skills for selected types of speeches.
- Students will be able to conduct basic laboratory experiments involving classical mechanics.

Analyzing
- Students will be able to analyze and design electrical and electronic circuits and systems, using knowledge of mathematics and basic sciences.
- Students will be able to analyze self-practice in relation to the roles of the professional nurse.

Evaluating
- Students will be able to use critical thinking and a systematic problem-solving process for providing comprehensive care.
- Students will be able to evaluate economic events that apply to the preparation of financial statements.

Creating
- Students will be able to integrate the pathophysiological assessment findings to formulate a field impression.
- Students will be able to coordinate human information and material resources in providing care for patients and their families.

For a list of Program Learning Outcomes, please use the following link: http://www.lonestar.edu/images/01162014-List-of-PLOs.pdf


Designing Assessment Methods

Assessment methods refer to the use of direct or indirect measures (data sources) and an evaluation instrument (such as a rubric or a survey) to gauge students’ mastery of student learning outcomes. Appropriate use of assessment methods requires faculty members to have the following two skills:

1) Selecting and designing direct measures (e.g., assignments or exams) or indirect measures (e.g., surveys or interviews) at the course level or program level;

2) Selecting and designing evaluation instruments (e.g., a rubric, a score distribution guide, or a survey).

How to select measures and how to design assessment rubrics are discussed below.

A. Selecting Measures

(A) Defining Measures

Direct Measures: Look at actual student work (assignments or exams) that can be used to directly measure students’ knowledge and skills.
Examples: Pre-test, post-test, comprehensive subject matter exam, licensure exam, portfolio, thesis, writing assignment, capstone assignment, and performance demonstration (recital, art exhibit, or science project).

Indirect Measures: Refer to the type of data that can be used to infer students’ knowledge and skills.
Examples: Surveys, interviews, focus group studies, document analyses, students’ self-reports.

Program-Level Measures: Refer to assignments or tests that assess students’ knowledge and skills at the end of the program, not embedded in any particular course.
Examples: Exit exams (standardized, licensure/professional exams), thesis, dissertation, capstone course paper or project, portfolios.

Course-Level Measures: Refer to assignments or tests that are given in specific courses. These course-level data can be used to measure students’ knowledge and skills at the end of a course or a program.
Examples: Exams, tests, rubric-based assignments (projects, essays, portfolios, presentations, and performance demonstrations).


(B) How to Select Measures

Depending on what levels of learning you want to measure, you may give objective assignments, subjective assignments, or performance-based assignments. The following chart provides an overview of what kinds of assignments to give based on the levels of learning in the cognitive domain:

Remembering & Understanding (knowledge):
- Objective assignments (those with right-or-wrong or best answers): multiple-choice questions, true-or-false questions.
- Subjective assignments (those that entail varied responses, with no right or wrong answer): essays, presentations, portfolios.

Applying (skills):
- Performance-based assignments: performance demonstrations, artwork, music performances, projects, presentations, internship projects.

Analyzing, Evaluating, Creating:
- Subjective assignments: essays, essay questions, research papers, presentations, portfolios, internship projects, capstone projects.


(C) Examples of Measures Aligned with Student Learning Outcomes

Depending on what the student learning outcomes are, you may select measures that entail objective or subjective responses from students. See the examples below:

SLO: Students will be able to describe the history, purpose, and scope of physical therapy. (Demonstrating Knowledge)
Measures: Exam on the history and purpose of physical therapy; essay describing the scope of physical therapy.

SLO: Students will be able to distinguish important aspects of the western moral theories, from the virtue theory of ancient philosophers to the modern theories of act ethics. (Demonstrating Comprehension)
Measures: Exam, or essay questions on the exam, summarizing the principles of ethical theories; essay comparing the similarities and differences between the ancient and modern theories.

SLO: Students will be able to apply appropriate organizational skills for selected types of speeches. (Application of Skills)
Measure: Demonstration of public speaking in front of the class.

SLO: Students will be able to analyze a variety of issues that influence learning in secondary schools. (Analysis)
Measure: Essay analyzing which educational movements affected minority students’ learning.

SLO: Students will be able to coordinate human information and material resources in providing care for patients and their families. (Synthesis)
Measure: Internship in a clinical setting providing patient care under the supervision of an experienced physical therapist.


B. Designing Assessment Rubrics

(A) Rubric Defined

A rubric is a systematic scoring guide used to evaluate student performance. It specifies levels of quality in identified areas.

A rubric is needed if an assignment entails varied responses rather than one correct answer, for example:
- Essay
- Essay questions on an exam
- Research paper
- Oral presentation
- Portfolio
- Demonstration of critical thinking skills
- Demonstration of technical skills
- An art show, etc.

A rubric is NOT needed if assignments or exam questions entail only right or wrong answers; in that case, a score distribution guide indicating the distribution of points for correct answers is used instead, for example:
- Math questions
- Multiple-choice questions
- True-or-false questions
- Fill-in-the-blanks tests
- Matching exercises or tests

(B) How to Create a Rubric

There are many kinds of rubrics, but the most commonly used rubric is the descriptive rubric. A descriptive rubric should include at least three essential parts:

1) A Set of Indicators of Learning
2) A Scale
3) A Set of Descriptors

1) Indicators of Learning

Indicators of learning are traits or features of an assignment that an instructor wants to make judgments on. Indicators of learning should be aligned with the instructional objectives. If an instructor’s objectives in teaching an essay are related to helping students compose an essay that is focused, coherent, organized, etc., then these features should serve as indicators of learning.

Example 1: Indicators of Learning for an Essay in an English class:
- Focus
- Coherence
- Organization
- Sentence Structure
- Word Choice


Example 2: Indicators of Learning for Critical Thinking Skills:
- Explanation of issues
- Evidence
- Influence of context and assumptions
- Student’s position (perspective, thesis/hypothesis)
- Conclusions and related outcomes (implications and consequences)

Example 3: Indicators of Learning for an Oral Presentation:
- Organization
- Subject Knowledge
- Graphics
- Mechanics
- Eye Contact
- Elocution

2) Scale

A scale indicates points to be assigned in scoring a piece of work on a continuum of quality. High numbers are typically assigned to the best work. Scale examples:
- Needs Improvement (1)... Satisfactory (2)... Good (3)... Exemplary (4)
- Beginning (1)... Developing (2)... Accomplished (3)... Exemplary (4)
- Needs Work (1)... Good (2)... Excellent (3)
- Novice (1)... Apprentice (2)... Proficient (3)... Distinguished (4)

3) Descriptors

Descriptors describe signs of performance at each level; the description needs to include both strengths and weaknesses (weaknesses should be described particularly at lower levels of performance).

Example 1: Word Choice (in an essay):
4 -- Vocabulary reflects a thorough grasp of the language appropriate to the audience. Word choice is precise, creating a vivid image. Metaphors and other such devices may be used to create nuanced meaning.
3 -- Vocabulary reflects a strong grasp of the language appropriate to the audience. Word choice is accurate, but may be inappropriate in a couple of places.
2 -- Vocabulary reflects an inconsistent grasp of the language and may be inaccurate or inappropriate to the audience.
1 -- Vocabulary is typically inaccurate and inappropriate to the audience. Word choice may include vague, non-descriptive, and/or trite expressions.

Example 2: Depth of Discussion (in a research paper):
4 -- In-depth discussion and elaboration in all sections of the paper.
3 -- In-depth discussion and elaboration in most sections of the paper.
2 -- The writer has omitted pertinent content, or content runs on excessively. Quotations from others outweigh the writer’s own ideas excessively.
1 -- Cursory discussion in all the sections of the paper, or brief discussion in only a few sections.
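A descriptive rubric’s three parts (indicators, a scale, and descriptors) map naturally onto a simple data structure. The sketch below is a minimal illustration using abbreviated descriptors from the Word Choice example above; it does not represent any official LSC rubric format.

    # Minimal representation of one indicator of a descriptive rubric:
    # scale points map to descriptors. Descriptors are abbreviated
    # from the Word Choice example above; illustration only.
    word_choice = {
        "indicator": "Word Choice",
        "descriptors": {
            4: "Thorough grasp of language; precise, vivid word choice.",
            3: "Strong grasp of language; accurate with occasional lapses.",
            2: "Inconsistent grasp; may be inaccurate or inappropriate.",
            1: "Typically inaccurate and inappropriate; vague or trite.",
        },
    }

    def describe_score(rubric_row: dict, score: int) -> str:
        """Look up the descriptor for a score on one indicator."""
        return f"{rubric_row['indicator']} ({score}): {rubric_row['descriptors'][score]}"

    print(describe_score(word_choice, 3))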


(C) Examples of Rubrics

Example 1: AACC Critical Thinking VALUE Rubric (A Snapshot)

Indicator: Explanation of Issues
- Capstone (4): Issue/problem to be considered critically is stated clearly and described comprehensively, delivering all relevant information necessary for full understanding.
- Milestone (3): Issue/problem to be considered critically is stated, described, and clarified so that understanding is not seriously impeded by omissions.
- Milestone (2): Issue/problem to be considered critically is stated, but the description leaves some terms undefined, ambiguities unexplored, boundaries undetermined, and/or backgrounds unknown.
- Benchmark (1): Issue/problem to be considered critically is stated without clarification or description.

Indicator: Evidence
- Capstone (4): Information is taken from source(s) with enough interpretation/evaluation to develop a comprehensive analysis or synthesis.
- Milestone (3): Information is taken from source(s) with enough interpretation/evaluation to develop a coherent analysis or synthesis.
- Milestone (2): Information is taken from source(s) with some interpretation/evaluation, but not enough to develop a coherent analysis or synthesis.
- Benchmark (1): Information is taken from source(s) without any interpretation or evaluation.

Example 2: A Research Paper Rubric (A Snapshot)

Indicator: Topic Focus
- Expert: The topic is focused narrowly enough for the scope of this assignment. A thesis statement provides direction for the paper, either by statement of a position or hypothesis.
- Proficient: The topic is focused but lacks direction. The paper is about a specific topic, but the writer has not established a position.
- Apprentice: The topic is too broad for the scope of this assignment.
- Novice: The topic is not clearly defined.

Indicator: Depth of Discussion
- Expert: In-depth discussion and elaboration in all sections of the paper.
- Proficient: In-depth discussion and elaboration in most sections of the paper.
- Apprentice: The writer has omitted pertinent content, or content runs on excessively. Quotations from others outweigh the writer’s own ideas excessively.
- Novice: Cursory discussion in all the sections of the paper, or brief discussion in only a few sections.


Example 3: An Oral Presentation Rubric (A Snapshot)

Indicator: Eye Contact
- 1: Student reads all of the report with no eye contact.
- 2: Student occasionally uses eye contact, but still reads most of the report.
- 3: Student maintains eye contact most of the time but frequently returns to notes.
- 4: Student maintains eye contact with the audience, seldom returning to notes.

Indicator: Elocution
- 1: Student mumbles, incorrectly pronounces terms, and speaks too quietly for students in the back of the class to hear.
- 2: Student's voice is low. Student incorrectly pronounces terms. Audience members have difficulty hearing the presentation.
- 3: Student's voice is clear. Student pronounces most words correctly. Most audience members can hear the presentation.
- 4: Student uses a clear voice and correct, precise pronunciation of terms so that all audience members can hear the presentation.

See fully developed samples of rubrics from the links below:
http://www.lonestar.edu/images/5b-SampleRubric-AACC.pdf
http://www.lonestar.edu/images/5c-Samples_of_Rubric.pdf

Setting Criteria and Target Outcomes

A. Setting Criteria (for an individual student’s mastery of the SLO):

To determine whether students have successfully met the learning outcomes, we need to specify what score or rating an individual student needs in order to be considered as meeting the learning outcome. See the examples below:

Example 1:
- Measure: Exam on the history and purpose of Physical Therapy
- Evaluation Method: Score weighting guide with scores assigned to each exam question
- Criterion for Meeting the SLO: A score of 70 or above on the exam

Example 2:
- Measure: Essay describing the scope of physical therapy
- Evaluation Method: Grading rubric specifying levels of quality in identified areas
- Criterion for Meeting the SLO: A score of 3 or above on a scale of 1-4


B. Setting Target Outcomes (for group success):

To determine whether students as a group have successfully met the learning outcomes, we need to specify what percentage of the group should meet the learning outcome. See the examples below:

Example 1:
- Measure: Exam on the history and purpose of Physical Therapy
- Evaluation Method: Score weighting guide with scores assigned to each exam question
- Criterion for Meeting the SLO: A score of 70 or above on the exam
- Target Outcome: 85% of the students will score 70 or above on the exam.

Example 2:
- Measure: Essay describing the scope of physical therapy
- Evaluation Method: Grading rubric specifying levels of quality in identified areas
- Criterion for Meeting the SLO: A score of 3 or above on a scale of 1-4
- Target Outcome: 70% of the students will score a 3 or above on their essays.
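The distinction between a criterion (a per-student cutoff) and a target outcome (a required group success rate) reduces to a few lines of code. The sketch below is a hypothetical illustration of Example 2 above; the essay scores are invented for demonstration.

    def target_outcome_met(scores, criterion=3, target_rate=0.70):
        """Check whether a target outcome is met.

        criterion   -- per-student cutoff (e.g. 3 or above on a 1-4 rubric)
        target_rate -- required group success rate (e.g. 70% of students)
        """
        meeting = sum(1 for s in scores if s >= criterion)
        rate = meeting / len(scores)
        return rate >= target_rate, rate

    # Invented rubric scores (1-4 scale) for ten students
    essay_scores = [4, 3, 2, 3, 4, 3, 1, 3, 4, 2]
    met, rate = target_outcome_met(essay_scores)
    print(f"{rate:.0%} scored 3 or above; target met: {met}")
    # -> 70% scored 3 or above; target met: True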

Samples of PLO and SLO Assessment Plans

Sample 1: PLO Assessment Plan for AA in Political Science

PLO 1:
- PLO Statement: Students will understand the American politics subfield of political science.
- Measure (Student Work/Performance): GOVT2304 (Intro to Political Science) essay question on American politics
- Teacher's Evaluation Method: Essay assessment rubric
- Criterion (cut-off score considered as meeting the PLO for each student): A score of 3 or above on the essay (on a scale of 1-5)
- Target Outcome (group success: % of the group meeting the PLO): 80% of the students answering the essay question will get a score of 3 or above.

PLO 2:
- PLO Statement: Students will understand the political theory subfield of political science.
- Measure (Student Work/Performance): GOVT2304 essay question on political theory
- Teacher's Evaluation Method: Essay assessment rubric
- Criterion: A score of 3 or above on the essay (on a scale of 1-5)
- Target Outcome: 80% of the students answering the essay question will get a score of 3 or above.

PLO 3:
- PLO Statement: Students will demonstrate a level of civic engagement.
- Measure (Student Work/Performance): Internship project
- Teacher's Evaluation Method: Comments on strengths and weaknesses of the student's internship performance in an evaluation letter
- Criterion: 70% of the comments in the evaluation letter are positive.
- Target Outcome: 95% of the students will get a positive evaluation letter from their internship supervisor.


Sample 2: SLO Assessment Plan for ENGL 1302 Rhetoric and Composition Course

SLO 1:
- SLO Statement: Students will be able to present a valid argument using convincing evidence.
- Measure (Student Work/Performance): Persuasive essay
- Teacher's Evaluation Method: Essay grading rubric
- Criterion (cut-off score considered as meeting the SLO for each student): A score of 3 or above (1-5 scale)
- Target Outcome (group success: % of the group meeting the SLO): 80% of the students taking the course will receive a score of 3 or above.

SLO 2:
- SLO Statement: Students will be able to demonstrate appropriate use of standard English.
- Measures: a. Grammar quiz; b. Writing portfolio
- Teacher's Evaluation Methods: a. Quiz grading standard; b. Portfolio grading rubric
- Criteria: a. A score of 70 on the quiz; b. A score of 3 or above (1-5 scale)
- Target Outcome: 75% of the students taking the ENGL 1302 course will receive a score of 70 on the quiz and a score of 3 or above on the portfolios.

SLO 3:
- SLO Statement: Students will apply researched information in writing a research paper.
- Measures: a. Summary of readings; b. Research paper
- Teacher's Evaluation Methods: a. Summary grading rubric; b. Research paper grading rubric
- Criteria: a. A score of 3 or above (1-5 scale); b. A score of 3 or above (1-5 scale)
- Target Outcome: 75% of the students taking the ENGL 1302 course will receive a score of 3 or above on the summary assignment and on the research papers.

Writing the Closing-the-Loop Report

Campus SLO/PLO liaisons work with the deans and chairs to engage the faculty in discussing data and using results for further action. The key components of a Closing-the-Loop Report include the following:

- Report on Actual Results
- Interpretation of Results
- Recommendations for Improvements
- Report on Implementation

The following sections present more detailed discussion on how to create the Closing-the-Loop Reports:


How to Report on Actual Results

A. Present the aggregated data on the following:

a. Total # of students enrolled (for SLO assessment) or total # of students eligible for assessment (for PLO assessment)
b. Total # of students assessed (those who turned in assignments for assessment)
c. Total # of students meeting success criteria
d. % of assessed students who met success criteria (# meeting criteria divided by # assessed)

Example:
- # of students enrolled: 100
- # of students assessed: 80
- # of students meeting success criteria: 70
- % of assessed students who met success criteria: 70/80 = 87.5%

B. Draw objective conclusions based on the results by doing the following:

a. Compare the target outcome with the actual results and identify the difference between the two;
b. Comment on whether the target outcome was met, partially met, or not met.

Examples:

Example 1: Our target outcome was that 70% of students enrolled would pass the final exam on the use of various ceramic materials, production methods, and firing processes. Our actual results showed that 85% of our students passed the final. We exceeded the target.

Example 2: We targeted 80% of students being able to score a 3, and 100% being able to score a 2 or above, on a scale of 1-3. Our actual results showed that 54% of the students scored a 3 (Excellent), and 100% of the students scored a 2 (Satisfactory) or above. The target was partially met.
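The aggregated figures in part A reduce to simple counts and one division. As a minimal sketch (the per-student record format is hypothetical, not the format used by the SLO Data Collection Tool), the example numbers above could be computed like this:

    # Aggregate actual results from per-student records.
    # The record format is hypothetical, for illustration only.
    students = (
        [{"assessed": True, "met_criteria": True}] * 70     # assessed, met criteria
        + [{"assessed": True, "met_criteria": False}] * 10  # assessed, did not meet
        + [{"assessed": False, "met_criteria": False}] * 20 # no assignment turned in
    )

    enrolled = len(students)                              # 100
    assessed = sum(1 for s in students if s["assessed"])  # 80
    met = sum(1 for s in students if s["met_criteria"])   # 70
    pct_met = met / assessed * 100                        # 87.5

    print(f"Enrolled: {enrolled}, Assessed: {assessed}, "
          f"Met criteria: {met} ({pct_met:.1f}%)")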

How to Interpret Results

Once we have reported the actual results, we need to engage faculty in discussing and interpreting the data so that we can act on the data for improvement. To interpret the data, address the following question: What factors contributed to the results? (Why was the target met or not met?) See the examples illustrated below:


Actual Results: Our target outcome was that 70% of students enrolled would pass the final exam on the use of various ceramic materials, production methods, and firing processes. Our actual results showed that 85% of our students passed the final. We exceeded the target.
Interpretation of Data: Further analysis of the data indicated that 15% of the students (16 students) did not meet the outcome, and of these 16 students, 10 did not describe the firing processes adequately. This raised questions about the teaching method and the assessment method: Is verbal explanation of the firing processes sufficient? What is the best way to test students’ mastery of the firing processes?

Actual Results: We targeted 80% of students being able to score a 3, and 100% being able to score a 2 or above, on a scale of 1-3. Our actual results showed that 54% of the students scored a 3 (Excellent), and 100% of the students scored a 2 (Satisfactory) or above. The target was partially met.
Interpretation of Data: Faculty discussed the actual results and found that students who failed to score a “3” failed to demonstrate critical thinking skills when designing the intervention plan, which prompted the need to implement critical thinking activities in OTHA2302.

How to Make Recommendations

Recommendations for further action to improve teaching and learning should be based on the interpretation of data. See the examples below:

Interpretation of Data: Further analysis of the data indicated that 15% of the students (16 students) did not meet the outcome, and of these 16 students, 10 did not describe the firing processes adequately. This raised questions about the teaching method and the assessment method: Is verbal explanation of the firing processes sufficient? What is the best way to test students’ mastery of the firing processes?
Recommendations:
1. Use visual aids to explain the firing process;
2. Use video to demonstrate the firing process;
3. Test students’ mastery of the firing processes by having students display ceramic artwork produced by applying the firing processes.

Interpretation of Data: Faculty discussed the actual results and found that students who failed to score a “3” failed to demonstrate critical thinking skills when designing the intervention plan, which prompted the need to implement critical thinking activities in OTHA2302.
Recommendation: Faculty will design student critical thinking activities and incorporate these activities in OTHA2302.


Recommendations are typically made in the following areas:
- Instructional Strategies
- Curriculum Changes
- Student Support
- Teacher Development
- Assessment Methods

See examples of different types of recommendations below:

Student Support:
(1) Continue to advise and motivate students: encourage attendance and encourage completion of coursework.
(2) Offer tutoring during office hours.

Teacher Development:
(1) Provide professional development for faculty on collaborative learning.
(2) Implement a faculty mentoring system.

Assessment Methods:
(1) Using one capstone project to evaluate all PLOs may not be adequate. We will add simulation questions that simulate the state exams to assess each PLO.
(2) We will revise the grading rubric to target more discrete skills so that we can identify which area is most challenging to students.


Continuous Quality Improvement

Continuous quality improvement entails making data-driven decisions and taking action based on data to improve student learning, teaching, and support services. This means that recommendations made based on data need to be implemented, and the implementation results will be used for further decision making and action.

How to Report on the Implementation of Recommendations

The LSCS SLO/PLO assessment process requires that recommendations made during the current cycle be implemented during the subsequent assessment cycle. Ways to implement the recommendations include the following:

A. Make the recommendation a departmental requirement;
B. Implement the recommendation in the classroom;
C. Form a taskforce to complete the recommended project;
D. Other activities.

See examples of reporting on the implementations below:

Example 1 (Departmental Requirement): The recommendation, “Provide professional development to adjunct faculty with regard to the course content standards set by the department and require them to abide by the standards,” was implemented by the chair, who organized the professional development event and communicated the requirements to the adjunct faculty in Fall 2013.

Example 2 (Faculty Classroom Activity): The recommendation, “Faculty will design student critical thinking activities and incorporate these activities in OTHA2302,” was implemented by faculty, who designed and implemented the critical thinking activities in OTHA2302 in Fall 2013.

Example 3 (Taskforce Activity): A taskforce was formed with full-time and part-time faculty representatives to address the recommendation, “Revise the grading rubric to target more discrete skills so that we can identify which area is most challenging to students.” The revised rubric was adopted in Fall 2013. Using the revised rubric, the faculty were able to identify the “translation process” as the most challenging area for students, so instruction was adjusted to focus more on teaching the “translation process.”

Due to the time needed for implementing the recommendations, there is a one-year lapse between the time when recommendations are made and the time when reports on implementation are entered in Compliance Assist.


Assessment Tool: Compliance Assist

Web Link to Compliance Assist

Lone Star College uses Compliance Assist as the assessment tool to store SLO/PLO assessment plans and reports. You may access Compliance Assist via the link below:
https://lonestar.compliance-assist.com/Planning/login.aspx?ReturnUrl=%2fPlanning%2findex.aspx

Compliance Assist Navigation Tool

To create SLO or PLO forms or enter data in Compliance Assist, please use the Compliance Assist Navigation Tool, which can be accessed from the link below:
http://www.lonestar.edu/spa#ca


Glossary of Terms

Assessment

The systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development. (Palomba & Banta, 1999)

An ongoing process aimed at understanding and improving student learning. It involves making our expectations explicit and public; setting appropriate criteria and standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and using the resulting information to document, explain, and improve performance. (Angelo, 1995)

Benchmarking

An actual measurement of group performance against an established standard at defined points along the path toward the standard. Subsequent measurements of group performance use the benchmarks to measure progress toward achievement. (New Horizons for Learning)

Classroom Assessment

The systematic and on-going study of what and how students are learning in a particular classroom; often designed for individual faculty who wish to improve their teaching of a specific course. Classroom assessment differs from tests and other forms of student assessment in that it is aimed at course improvement, rather than at assigning grades. (National Teaching & Learning Forum)

Criterion-Referenced Assessment

An assessment where an individual's performance is compared to a specific learning objective or performance standard and not to the performance of other students. Criterion-referenced assessment tells us how well students are performing on specific goals or standards rather than just telling how their performance compares to a norm group of students nationally or locally. In criterion-referenced assessments, it is possible that none, or all, of the examinees will reach a particular goal or performance standard.

Direct Assessment

Gathers evidence about student learning based on student performance that demonstrates the learning itself. Can be value added, related to standards, qualitative or quantitative, embedded or not, using local or external criteria. Examples are written assignments, classroom assignments, presentations, test results, projects, logs, portfolios, and direct observations. (Leskes, 2002)

Embedded Assessment

A means of gathering information about student learning that is built into and a natural part of the teaching-learning process. Often uses for assessment purposes classroom assignments that are evaluated to assign students a grade. Can assess individual student performance or aggregate the information to provide information about the course or program; can be formative or summative, quantitative or qualitative. Example: as part of a course, expecting each senior to complete a research paper that is graded for content and style, but is also assessed for advanced ability to locate and evaluate Web-based information (as part of a college-wide outcome to demonstrate information literacy). (Leskes, 2002)

Evaluation

The use of assessment findings (evidence/data) to judge program effectiveness; used as a basis for making decisions about program changes or improvement. (Allen, Noel, Rienzi & McMillin, 2002)

Formative Assessment

The gathering of information about student learning, during the progression of a course or program and usually repeatedly, to improve the learning of those students. Example: reading the first lab reports of a class to assess whether some or all students in the group need a lesson on how to make them succinct and informative. (Leskes, 2002)

Indirect Assessment

Acquiring evidence about how students feel about learning and their learning environment rather than actual demonstrations of outcome achievement. Examples include surveys, questionnaires, interviews, focus groups, and reflective essays. (Eder, 2004)

Learning Outcomes

Operational statements describing specific student behaviors that evidence the acquisition of desired knowledge, skills, abilities, capacities, attitudes or dispositions. Learning outcomes can be usefully thought of as behavioral criteria for determining whether students are achieving the educational objectives of a program, and, ultimately, whether overall program goals are being successfully met. Outcomes are sometimes treated as synonymous with objectives, though objectives are usually more general statements of what students are expected to achieve in an academic program. (Allen, Noel, Rienzi & McMillin, 2002)

Norm-Referenced Assessment

An assessment where student performance or performances are compared to a larger group. Usually the larger group or "norm group" is a national sample representing a wide and diverse cross-section of students. Students, schools, districts, and even states are compared or rank-ordered in relation to the norm group. The purpose of a norm-referenced assessment is usually to sort students and not to measure achievement towards some criterion of performance.

Performance Criteria

The standards by which student performance is evaluated. Performance criteria help assessors maintain objectivity and provide students with important information about expectations, giving them a target or goal to strive for. (New Horizons for Learning)

Portfolio

A systematic and organized collection of a student's work that exhibits to others the direct evidence of a student's efforts, achievements, and progress over a period of time. The collection should involve the student in selection of its contents, and should include information about the performance criteria, the rubric or criteria for judging merit, and evidence of student self-reflection or evaluation. It should include representative work, providing a documentation of the learner's performance and a basis for evaluation of the student's progress. Portfolios may include a variety of demonstrations of learning and have been gathered in the form of a physical collection of materials, videos, CD-ROMs, reflective journals, etc. (New Horizons for Learning)

PLO

Acronym for Program Learning Outcome. PLOs address the question, “What will students know or be able to do when they exit the program?”

Currently, Lone Star College has three categories of PLOs:
1. PLOs for Workforce Programs. The assessment data are tracked using the PLO Assessment Forms in Compliance Assist;
2. PLOs for the Associate of Arts in Teaching program. The assessment data are tracked using the PLO Assessment Forms in Compliance Assist;
3. PLOs for Associate of Arts/Associate of Science degrees. The AA/AS PLOs are assessed through course-level SLOs, so the assessment data are tracked using SLO Assessment Forms in Compliance Assist.

Program Evaluation

Determination of the adequacy of the program in fulfilling its mission, goals, and objectives.

Qualitative Method

A qualitative method of assessment collects narrative data, which are analyzed by looking for recurring themes. Examples of this method include interviews, observations, and focus group studies.

Quantitative Method

A quantitative method of assessment collects data that can be summarized into meaningful numbers and analyzed statistically. Examples include test score comparisons, analysis of survey ratings, and comparisons of the number of events.

Rubric

Specific sets of criteria that clearly define for both student and teacher what a range of acceptable and unacceptable performance looks like. Criteria define descriptors of ability at each level of performance and assign values to each level. Levels referred to are proficiency levels which describe a continuum from excellent to unacceptable product. (System for Adult Basic Education Support)

SLO

Acronym for Student Learning Outcome. An SLO statement explains what the student is learning, including the accumulated and demonstrated knowledge, skills, abilities, behaviors, and habits of mind, as a result of actively participating in the course or program of study.

Currently, Lone Star College System uses the SLO acronym to refer to course-level learning outcome assessment.

Standards

Sets a level of accomplishment all students are expected to meet or exceed. Standards do not necessarily imply high quality learning; sometimes the level is a lowest common denominator. Nor do they imply complete standardization in a program; a common minimum level could be achieved by multiple pathways and demonstrated in various ways. (Leskes, 2002)

Summative Assessment

The gathering of information at the conclusion of a course, program, or undergraduate career to improve learning or to meet accountability demands. When used for improvement, impacts the next cohort of students taking the course or program. Example: examining student final exams in a course to see if certain specific areas of the curriculum were understood less well than others. (Leskes, 2002)

Value Added

The increase in learning that occurs during a course, program, or undergraduate education. Can either focus on the individual student (how much better a student can write, for example, at the end than at the beginning) or on a cohort of students (whether senior papers demonstrate more sophisticated writing skills, in the aggregate, than freshmen papers). Requires a baseline measurement for comparison. (Leskes, 2002)


References

Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes assessment handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.

Angelo, T. A. (1995). Reassessing (and defining) assessment. The AAHE Bulletin, 48(2), 7-9.

Angelo, T. A. (1999, May). Doing assessment as if learning matters most. The AAHE Bulletin: www.aahebulletin.com/public/archive/angelomay99.asp

Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. White Plains, NY: Longman.

DeMars, C. E., Cameron, L., & Erwin, T. D. (2003). Information literacy as foundational: Determining competence. JGE: The Journal of General Education, 52(4), 253.

Eder, D. J. (2004). General education assessment within the disciplines. JGE: The Journal of General Education, 53(2), 135.

Leskes, A. (2002). Beyond confusion: An assessment glossary. Peer Review, 4(2/3).

McTighe, J., & Ferrara, S. (1998). Assessing learning in the classroom. Washington, D.C.: National Education Association.

National Center for Research on Evaluation, Standards & Student Testing (CRESST). Glossary.

National Teaching & Learning Forum. Classroom assessment techniques.

New Horizons for Learning. (2002). Glossary of assessment terms.

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.

Smith, K., & Harm, T. (2003). Clarifying different types of portfolios. Assessment & Evaluation in Higher Education, 28(6), 625.

Southern Association of Colleges and Schools. (2012). Principles of accreditation: Foundations for quality enhancement (2012 ed.). Decatur, GA: Commission on Colleges of the Southern Association of Colleges and Schools.

Suskie, L. (2009). Assessing student learning. San Francisco: Jossey-Bass.

System for Adult Basic Education Support. Glossary of useful terms.

Resources

Internal Websites:
LSC SLO Webpage: http://www.lonestar.edu/student-learning-outcomes.htm
LSC IE Website: http://www.lonestar.edu/institutional-effectiveness.htm
Course Learning Outcome List: http://www.lonestar.edu/refresh_learning_outcomes.html
Program Learning Outcome List: http://www.lonestar.edu/images/01162014-List-of-PLOs.pdf

External Websites:
AACC Publication Webpage: http://www.aacu.org/value/publications.cfm
National Institute for Learning Outcomes Assessment Website: http://www.learningoutcomeassessment.org/
Stephen F. Austin State University Assessment Resources Webpage: http://www.sfasu.edu/assessment/rubrics.asp
Texas A&M University Assessment Resources Webpage: http://assessment.tamu.edu/resources/resources_index.html
UT-Austin Assessment Webpage: https://www.utexas.edu/provost/planning/assessment/iapa/workshops.html
Valencia College Institutional Assessment Webpage: http://valenciacollege.edu/academic-affairs/institutional-effectiveness-planning/institutional-assessment/

Books:
Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.
Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco, CA: Jossey-Bass.
Banta, T. W. (Ed.). (2007). Assessing student achievement in general education: Assessment update collections. San Francisco: Jossey-Bass.
Banta, T. W. (Ed.). (2007). Assessing student learning in the disciplines: Assessment update collections. San Francisco: Jossey-Bass.
Banta, T. W. (Ed.). (2011). A bird's-eye view of assessment: Selections from Editor's Notes. San Francisco: Jossey-Bass.
Bishop-Clark, C., & Dietz-Uhler, B. (2012). Engaging in the scholarship of teaching and learning. Sterling, VA: Stylus Publishing, LLC.
Maki, P. L. (2010). Assessing for learning (2nd ed.). Sterling, VA: Stylus Publishing, LLC.
Nichols, J. O. (1995). A practitioner's handbook for institutional effectiveness and student outcomes assessment implementation. Flemington, NJ: Agathon Press.
Nichols, J. O., Nichols, K. W., et al. (2005). A road map for improvement of student learning and support services through assessment. New York: Agathon Press.
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.
Palomba, C. A., & Banta, T. W. (2001). Assessing student competence in accredited disciplines: Pioneering approaches to assessment in higher education (1st ed.). Sterling, VA: Stylus.
Persellin, D. C., & Daniels, M. B. (2014). A concise guide to improving student learning. Sterling, VA: Stylus Publishing, LLC.
Stevens, D. D., & Levi, A. J. (2013). Introduction to rubrics (2nd ed.). Sterling, VA: Stylus Publishing, LLC.
Suskie, L. (2009). Assessing student learning. San Francisco: Jossey-Bass.

Appendices


Appendix A

AA/AS PLO-Course Alignment Map
(Worksheet with Examples: AA/AS PLOs Aligned with Course SLOs)

Directions:
1. All academic transfer courses' SLO assessment needs to be aligned with the AA/AS Program Learning Outcomes (PLOs) laid out in the left-hand column;
2. Workforce programs and the AAT program have their own sets of PLOs and a different worksheet;
3. Use the worksheet to identify courses in the blue row and SLO numbers in the blank cells (see examples from Fall 2012 and Fall 2013 in the first five columns next to each PLO).

Worksheet columns: Aligned Courses/SLOs (Examples from Fall 2012 & Fall 2013 Assessed Courses); Aligned Courses/SLOs (Identify Fall 2014 New Course/SLOs).

For questions, call Jinhao Wang at 832-813-6255 or email at [email protected]

AA/AS PLOs (EEOs)

Communication (example courses: ENGL 2328, SPCH 1315, PHIL 1301, COMM 1307, SPCH 1311)
1.1 To understand and demonstrate writing and speaking processes through invention, organization, drafting, revision, editing, and presentation. (Aligned: SLO1)
1.2 To understand the importance of specifying audience and purpose and to select appropriate communication choices. (Aligned: SLO1, SLO2)
1.3 To understand and appropriately apply modes of expression, i.e., descriptive, expositive, narrative, scientific, and self-expressive, in written, visual, and oral communication. (Aligned: SLO2)
1.4 To participate effectively in groups with emphasis on listening, critical and reflective thinking, and responding. (Aligned: SLO1)
1.5 To understand and apply basic principles of critical thinking, problem solving, and technical proficiency in the development of exposition and argument. (Aligned: SLO2, SLO3)
1.6 To develop the ability to research and write a documented paper and/or to give an oral presentation. (Aligned: SLO5)

Humanities, Visual and Performing Arts (example courses: ARTS 2346, PHIL 1310, DRAM 1310, ENGL 2322, HUMA 1301)
2.1 To demonstrate awareness of the scope and variety of works in the arts and humanities. (Aligned: SLO2, SLO2, SLO2, SLO4)
2.2 To understand those works as expressions of individual and human values within an historical and social context. (Aligned: SLO2)
2.3 To respond critically to works in the arts and humanities. (Aligned: SLO3, SLO2)
2.4 To engage in the creative process or interpretive performance and comprehend the physical and intellectual demands required of the author or visual or performing artist.
2.5 To articulate an informed personal reaction to works in the arts and humanities. (Aligned: SLO3)
2.6 To develop an appreciation for the aesthetic principles that guide or govern the humanities and arts.
2.7 To demonstrate knowledge of the influence of literature, philosophy, and/or the arts on intercultural experiences. (Aligned: SLO4)

Social and Behavioral Sciences (example courses: ANTH 2301, GOVT 2306, HIST 2301, ECON 2302, CRIJ 2313)
3.1 To employ the appropriate methods, technologies, and data that social and behavioral scientists use to investigate the human condition.
3.2 To examine social institutions and processes across a range of historical periods, social structures, and cultures. (Aligned: SLO1, SLO1)
3.3 To use and critique alternative explanatory systems or theories. (Aligned: SLO2, SLO1, SLO2)
3.4 To develop and communicate alternative explanations or solutions for contemporary social issues. (Aligned: SLO1)
3.5 To analyze the effects of historical, social, political, economic, cultural, and global forces on the area under study. (Aligned: SLO2)
3.6 To comprehend the origins and evolution of U.S. and Texas political systems, with a focus on the growth of political institutions, the constitutions of the U.S. and Texas, federalism, civil liberties, and civil and human rights. (Aligned: SLO1)
3.7 To understand the evolution and current role of the U.S. in the world.
3.8 To differentiate and analyze historical evidence (documentary and statistical) and differing points of view. (Aligned: SLO2)
3.9 To recognize and apply reasonable criteria for the acceptability of historical evidence and social research.
3.10 To analyze, critically assess, and develop creative solutions to public policy problems. (Aligned: SLO6)
3.11 To recognize and assume one's responsibility as a citizen in a democratic society by learning to think for oneself, by engaging in public discourse, and by obtaining information through the news media and other appropriate information sources about politics and public policy.
3.12 To identify and understand differences and commonalities within diverse cultures.

Multicultural Competencies (example courses: EDUC 2301, EDUC 1301, HUMA 1303, SOCI 2319, ANTH 2347)
4.1 Demonstrates knowledge of those elements and processes that create and define culture. (Aligned: SLO1, SLO2)
4.2 Develops an understanding of the values, practices, beliefs, and responsibilities of living in a multicultural world. (Aligned: SLO2, SLO2)
4.3 Develops cross-cultural understanding, empathy, and communication. (Aligned: SLO1)
4.4 Demonstrates an understanding of the underlying unity of diverse cultural expressions and their influences on cross-cultural interactions. (Aligned: SLO1, SLO1)

Math (example courses: MATH 1316, MATH 2413, CETT 1402, CHEM 1412, ACNT 2303)
5.1 To apply arithmetic, algebraic, geometric, higher order thinking, and statistical methods to modeling and solving real-world situations. (Aligned: SLO1, SLO5)
5.2 To represent and evaluate basic mathematical information verbally, numerically, graphically, and symbolically. (Aligned: SLO2)
5.3 To expand mathematical reasoning skills and formal logic to develop convincing mathematical arguments. (Aligned: SLO1, SLO2)
5.4 To use appropriate technology to enhance mathematical thinking and understanding and to solve mathematical problems and judge the reasonableness of the results. (Aligned: SLO2)
5.5 To interpret mathematical models such as formulas, graphs, tables and schematics, and draw inferences from them. (Aligned: SLO2)
5.6 To recognize the limitations of mathematical and statistical models. (Aligned: SLO2)
5.7 To develop the view that mathematics is an evolving discipline, interrelated with human culture, and understand its connections to other disciplines. (Aligned: SLO5)

Natural Sciences (example courses: BIOL 1408, BIOL 1406, CHEM 1412, GEOL 1405, PHYS 2425)
6.1 To understand and apply methods and appropriate technology to the study of natural sciences. (Aligned: SLO1, SLO6, SLO6, SLO1, SLO5)
6.2 To recognize scientific and quantitative methods and the differences between these approaches and other methods of inquiry and to communicate findings, analyses, and interpretation both orally and in writing. (Aligned: SLO4, SLO10, SLO15)
6.3 To identify and recognize the differences among competing scientific theories. (Aligned: SLO5)
6.4 To demonstrate knowledge of the major issues and problems facing modern science, including issues that touch upon ethics, values, and public policies. (Aligned: SLO3)
6.5 To demonstrate knowledge of the interdependence of science and technology and their influence on, and contribution to, modern culture. (Aligned: SLO6)

Appendix B

Workforce Program & AAT PLO-Course Alignment Map (Worksheet)**

Directions:
1. List your program's PLOs in the 1st column;
2. Identify courses in your program that are used for assessing PLOs and put them in the top row;
3. Identify program-level assignments or tests that are used to assess PLOs as well (see the example below the worksheet).

Worksheet columns: Program Learning Outcomes | Course 1 | Course 2 | Course 3 | Course 4 | Program Level Assignment1* | Program Level Assignment2* | Program Level Assignment3*

* Including exit exams, capstone projects, portfolios, final reports, etc.
**Adapted from the UT-Austin Program Assessment Model
Example: AA in Business Field of Study PLO and Course Alignment Map
(Columns: BUSI 1301, BUSI 1307, BUSI 2301, BUSI 2304, BUSI 2372, Program Level Assignment1*, Program Level Assignment2*)

PLO1: Apply critical-thinking skills and innovative problem-solving to complex issues in business. (BUSI 1301: SLO2; BUSI 1307: SLO4; BUSI 2301: SLO3; BUSI 2304: SLO12; BUSI 2372: SLO7; Assignment1: Graduate Portfolio; Assignment2: Graduate Interview)

PLO2: Demonstrate oral and written proficiency in presenting management issues and solutions. (BUSI 1301: SLO1; BUSI 1307: SLO4; BUSI 2301: SLO6, SLO7; BUSI 2304: SLO7; BUSI 2372: SLO7; Assignment1: Graduate Portfolio; Assignment2: Graduate Interview)

PLO3: Demonstrate leadership and workplace interpersonal skills. (BUSI 1301: SLO5; BUSI 1307: SLO2; BUSI 2301: SLO3; BUSI 2304: SLO2; BUSI 2372: SLO1; Assignment1: Graduate Portfolio; Assignment2: Graduate Interview)
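The alignment map above is, in effect, a nested mapping from PLOs to courses to SLOs. For programs that want to sanity-check their map outside Compliance Assist, a small script can represent and query it. The sketch below is a minimal, hypothetical illustration in Python (the ALIGNMENT_MAP name and structure are assumptions for illustration, not part of any LSC tool), encoding the first two rows of the Business example:

```python
# Hypothetical sketch: the AA in Business alignment map above as a nested
# mapping, so alignments can be listed or checked programmatically.
# Names and structure are illustrative, not part of Compliance Assist.
ALIGNMENT_MAP = {
    "PLO1": {
        "BUSI 1301": ["SLO2"], "BUSI 1307": ["SLO4"], "BUSI 2301": ["SLO3"],
        "BUSI 2304": ["SLO12"], "BUSI 2372": ["SLO7"],
        "Program Level": ["Graduate Portfolio", "Graduate Interview"],
    },
    "PLO2": {
        "BUSI 1301": ["SLO1"], "BUSI 1307": ["SLO4"],
        "BUSI 2301": ["SLO6", "SLO7"], "BUSI 2304": ["SLO7"],
        "BUSI 2372": ["SLO7"],
        "Program Level": ["Graduate Portfolio", "Graduate Interview"],
    },
}

# Example query: which course SLOs and program-level measures support PLO2?
for course, measures in ALIGNMENT_MAP["PLO2"].items():
    print(course, "->", ", ".join(measures))
```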

Appendix C

Curriculum Team Form (Worksheet)

Directions:
1. Curriculum Team Chairs are responsible for filling out this form prior to each academic year;
2. This form is located in Compliance Assist within the SLO Forms tab;
3. The purpose of this form is to build consistency across the system for courses and SLOs selected for assessment;
4. Disciplines at each campus must follow the exact wording and # of the SLOs laid out in this form;
5. Each year, each discipline should fill out two Curriculum Team Forms: one for the new course selected and the other for the re-assessed course;
6. Each form consists of two SLOs (to be filled out prior to the beginning of each assessment cycle) and a report on Curriculum Team work, such as recommendations, comments, and documentation (to be filled out at the end of each assessment cycle).

Form fields:
Discipline: (e.g., Biology)
Course: (e.g., BIOL 1406)
Academic Year:
Learning Outcome # (Drop-Down Menu in CA): Enter the actual # listed beside the SLO on the course SLO list
AA/AS PLO Alignment (Drop-Down Menu in CA):
Student Learning Outcome Statement:
Learning Outcome # (Drop-Down Menu in CA): Enter the actual # listed beside the SLO on the course SLO list
AA/AS PLO Alignment (Drop-Down Menu in CA):
Student Learning Outcome Statement:
Curriculum Team Work - Recommendations, Comments, Documentation:

For questions, call Jinhao Wang at 832-813-6255 or email at [email protected]
Appendix D

Section Selection E-Form Guide

1. Open the link to the e-Form sent to you via email. The link will take you to a screen that looks like the following:
[Screenshot: Section Selection E-Form start screen]
2. Click on the button labeled "Section Selection E-Form".
3. Enter the campus, subject, and catalog number; then click the "Select Sections" button.
4. A new screen will appear with all the sections for the subject and catalog number at the chosen campus. If there are more sections than can appear on the screen, there will be a scroll bar on the right-hand side of the screen to scroll down.
5. Click on the box in the "SLO Section" column to select the class section for assessment.
6. Once all sections are selected, they are auto-saved, so you can simply close the window using the "x" in the top right corner.

7. This will take you back to the Section Selection E-Form:
a. If you need to select sections for another subject and catalog number, change the Subject and Catalog Number fields, click the "Select Sections" button, and follow steps 4-6 to select the sections; or
b. If you are finished selecting your sections, close the "Section Selection E-Form" using the "x" in the upper right-hand corner.
8. At this point, you are done selecting sections and can close the web browser window. The form will continue to be available if you need to select more sections later.
Appendix E

Detailed Explanation of the SLO/PLO Assessment Processes

(For each step below: explanation of the process, responsible personnel, timeline, and tool.)

1. Selecting the courses and SLOs for assessment:
A. Select two courses and 4 SLOs each academic year (one new course with 2 SLOs and one re-assessed course with 2 SLOs);
B. Selected courses and SLOs need to be aligned with PLOs (AA/AS PLOs and Workforce PLOs).
Responsible Personnel: Curriculum Team Chairs are responsible for filling out the Curriculum Team Forms for the courses and SLOs selected so that campus disciplines can follow the exact SLO # and statements for their SLO assessment plans.
Timeline: Before the start of Fall semesters (preferably on the Spring Curriculum Team Work Day).
Tools: Curriculum Team Forms; AA/AS Course & PLO Alignment Worksheet; Workforce Program Course & PLO Alignment Worksheet.

2. Planning:
A. Academic disciplines develop SLO assessment plans for the selected two courses with 4 SLOs (create a total of 4 SLO Assessment Forms in Compliance Assist);
B. Workforce programs, besides creating 4 SLO Assessment Forms, also create PLO Assessment Forms for all PLOs. The SLO assessment data should support the PLO assessment;
C. Assessment plans include such elements as course-PLO alignment, SLO or PLO statements, methods of assessment, criteria for meeting the outcome, and target outcomes.
Responsible Personnel: Deans, chairs/program directors, faculty, and discipline SLO assessment liaisons.
Timeline: May 1 for Fall SLO assessment and for PLO assessment; December 1 for Spring SLO assessment.
Tools: SLO Assessment Forms and PLO Assessment Forms in Compliance Assist.

3. Sampling:
A. 25% of sections need to be identified for SLO assessment (see the sketch following this step);
B. If the course has 4 or fewer sections, then all sections need to be assessed;
C. Section selection needs to include online, off-site, and dual enrollment sections;
D. Faculty members teaching the selected sections need to be notified and trained to conduct assessment activities and collect assessment data.
Responsible Personnel: Chairs/program directors are responsible for identifying the sections selected for SLO assessment, using the e-Form.
Timeline: Sections selected and faculty notified by the Fall Official Day for Fall assessment and by the Spring Official Day for Spring assessment.
Tool: e-Form for Section Selection.
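As a minimal illustration of the sampling rule in step 3, here is a hypothetical Python sketch. The handbook does not specify how a fractional section count is rounded, so rounding up is an assumption made for this example:

```python
import math

def sections_to_assess(total_sections: int) -> int:
    """Number of sections to select under the step 3 sampling rule."""
    if total_sections <= 4:
        # Courses with 4 or fewer sections: assess every section.
        return total_sections
    # Otherwise identify 25% of sections. Rounding up is an assumption,
    # chosen so at least a quarter of the sections are always covered.
    return math.ceil(0.25 * total_sections)

# Examples: 3 sections -> assess all 3; 10 sections -> assess 3.
print(sections_to_assess(3), sections_to_assess(10))
```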
4. Assessing:
A. Faculty will conduct learning activities and administer assignments designed to measure the degree of mastery of the SLOs or PLOs;
B. Faculty will score the assignments using rubrics or a scoring guide;
C. Faculty will calculate the number of students participating in the assessment, the number meeting the scoring criteria, and the % meeting the criteria;
D. % meeting the criteria = number met / number assessed (see the sketch below).
Responsible Personnel: Faculty members are responsible for assessing students' learning and collecting data.
Timeline: Throughout the semester.
Tools: Training PowerPoint presentations on how to design assessment methods and how to develop rubrics, available on the SLO webpage.
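The calculation in step 4D is a simple ratio. As a worked illustration, here is a minimal sketch (a hypothetical Python helper, not part of the SLO Data Collection Tool):

```python
def percent_meeting_criteria(number_met: int, number_assessed: int) -> float:
    """Step 4D: % meeting the criteria = number met / number assessed."""
    if number_assessed <= 0:
        raise ValueError("number_assessed must be a positive count")
    return 100.0 * number_met / number_assessed

# Worked example using the DRAM 1351 figures from Appendix I:
# 36 of 44 assessed students met the criterion.
print(f"{percent_meeting_criteria(36, 44):.1f}%")  # -> 81.8% (reported as 81%)
```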

5. Submitting Data:
A. SPA will send out the SLO Data Collection Tool link to deans, chairs, and program directors, who will forward the link to selected faculty to submit their SLO assessment data;
B. SPA will generate the aggregated reports and send these reports to VPIs to distribute;
C. If a workforce program requires program-level data to assess its PLOs, the program director will be responsible for collecting these data.
Responsible Personnel: Deans, chairs/program directors, faculty, SPA.
Timeline: Fall semester's data submission: from November 1 to the end of the semester. Spring semester's data submission: from April 1 to the end of the semester.
Tool: SLO Data Collection Tool.

6. Interpreting Data and Making Recommendations:
A. Engage faculty in discussing the data and analyzing what factors impacted the results, and make recommendations based on the interpretation;
B. Recommendations can be made in the areas of instructional strategies, curriculum revision, student support, faculty professional development, etc.
Responsible Personnel: Deans, chairs/program directors, faculty, discipline SLO assessment liaisons.
Timeline: Before February 15 for Fall assessment; before September 15 for Spring assessment.
Tools: SLO Assessment Forms and PLO Assessment Forms in Compliance Assist.

7. System-Level Sharing and Decision Making:
A. If the campus recommendations impact curricular changes, then the Curriculum Team meets and makes recommendations;
B. Recommendations made by the system-wide curriculum teams need to be documented in the Curriculum Team Work section of the Curriculum Team Form in Compliance Assist.
Responsible Personnel: Curriculum Teams.
Timeline: Before February 15 for Fall assessment; before September 15 for Spring assessment.
Tool: Curriculum Team Work section of the Curriculum Team Form in Compliance Assist.
8. Implementing Recommendations:
A. Implement recommendations during the subsequent year;
B. Report on the implementation at the end of the subsequent year in Compliance Assist, in the Implementation section of the SLO Assessment Form and/or PLO Assessment Form. (For example, the 2013-14 recommendations were implemented in 2014-15, so the report on the implementation should occur at the end of the 2014-15 cycle and be put in the 2013-14 SLO/PLO Assessment Forms.)
Responsible Personnel: Deans, chairs/program directors, faculty.
Timeline: Implement recommendations during the subsequent Fall and/or Spring; report on implementations before September 15.
Tools: SLO Assessment Forms and PLO Assessment Forms in Compliance Assist.

9. Closing-the-Loop Reporting:
A. The closing-the-loop reports include reports on the actual results, interpretation of data, recommendations, and implementation of recommendations (the implementation part of the report occurs at the end of the subsequent year);
B. If system-wide curriculum teams made recommendations on curriculum changes, then these recommendations and the implementation of the recommendations need to be documented in the SLO/PLO Assessment Forms as well.
Responsible Personnel: Deans, chairs/program directors, faculty, discipline SLO assessment liaisons.
Timeline: Before February 15 for Fall assessment; before September 15 for Spring assessment.
Tools: SLO Assessment Forms and PLO Assessment Forms in Compliance Assist.
Appendix F

Student Learning Outcome Assessment Form

Academic Year: __________________________ Semester: ______________________________
Campus: ________________________________ Discipline: ______________________________
Course Prefix & Number: _________________ SLO Number: ___________________________
AA/AS PLO Alignment: _____________________________________________________________

Please fill out Section 1 at the beginning of the semester.

Section 1
Student Learning Outcome Statement (word-for-word from syllabus):
Method of Assessment (A) – Measure: Student Work/Performance Assessed:
Method of Assessment (B) – Instructor's Evaluation Method:
Criteria – What defines successfully meeting the SLO for each student?
Target Outcome – Expected Group Success Rate:

Please fill out section 2 at the end of the semester.

Section 2

Total Number of Students Enrolled on Official Day:

Total Number of Students Assessed:

Total Number of Students Meeting Success Criteria:

Percentage of Assessed Students Who Met Success Criteria:

Actual Results: Was the student learning outcome met?

Interpretation of Results

Recommendations for Improvements

Report on Implementation (via ACE objectives and action steps or other activities): Report on what recommendations have already been implemented:

Additional Comments:

Progress Status:

Please include all supporting documents: rubrics, class data, assessment tools, or other documents containing information pertaining to SLO data collection and results.


Appendix G

Program Learning Outcome Assessment Form

Academic Year: __________________________ Start Date: ____________ End Date: ___________
Campus: ________________________________ Program: _________________________________
PLO Number: ____________________________

Please fill out Section 1 at the beginning of the assessment period.

Section 1
Program Learning Outcome Statement:
Level of Assessment (Course Level or Program Level):
Course Name if Embedded at Course Level:
Type of Assessment (Direct or Indirect):
Method of Assessment (A) – Measure: Student Work/Performance:
Method of Assessment (B) – Instructor's Evaluation Method:
Criteria – What defines successfully meeting the PLO for each student?
Target Outcome – Expected Group Success Rate:

Please fill out section 2 at the end of the assessment period.

Section 2

Total Number of Students Eligible for PLO Assessment:

Total Number of Students Assessed:

Total Number of Students Meeting Success Criteria:

Percentage of Assessed Students Who Met Success Criteria:

Actual Results: Was the program learning outcome met?

Interpretation of Results

Recommendations for Improvements

Report on Implementation (via ACE objectives and action steps or other activities)

Additional Comments:

Please include all supporting documents: rubrics, class data, assessment tools, or other documents containing information pertaining to PLO data collection and results.


Appendix H

Self-Assessment Rubric for Completing SLO/PLO Assessment Plans and Reports

Plan Elements

SLO or PLO Statement
Beginning (1): SLOs or PLOs are too broad and abstract. They are not measurable (e.g., "Students will understand the history of Western Civilization").
Developing (2): SLOs or PLOs are broad, but can be measurable if rubrics are well developed (e.g., "Students will demonstrate entry-level skills as vocational therapists").
Acceptable (3): SLOs or PLOs are specific and measurable (e.g., "Students will be able to critically analyze a theatrical production using the cited elements as the basis for analysis").
Exemplary (4): SLOs or PLOs are specific and measurable, addressing various levels of learning and different learning domains (see Appendix I and J for examples).

Methods of Assessment (A): Assignments given to students to gauge the mastery of the SLO or PLO
Beginning (1): Confuses course grades with SLO or PLO assessment assignments, or assignments cannot produce evidence to show students' mastery of an SLO or PLO.
Developing (2): Part of a large assignment can be used to assess students' mastery of an SLO or PLO, but the whole assignment's score is used to gauge students' mastery of the particular SLO or PLO (e.g., a final exam score is used to demonstrate students' mastery of an SLO when only an essay question embedded in the exam can gauge the mastery of that SLO).
Acceptable (3): Assignment is specific and relevant to the SLO or PLO and can produce evidence to demonstrate the mastery of the SLO or PLO (e.g., to assess the SLO in the row above, the assignment requires students to attend a theatrical production at LSC-CyFair and write a critique of the production).
Exemplary (4): Assignment is specific and relevant. It also stimulates deep learning (e.g., an internship report that must demonstrate that the student has learned something new and relevant to leadership skills in the business world).

Methods of Assessment (B): Scoring Method/Evaluation Rubric
Beginning (1): No rubric is designed or no scoring method is defined.
Developing (2): Rubric is too holistic to gauge the specific level of quality of student work, or the scoring guide that lays out the score distribution is not justified.
Acceptable (3): Rubric can appropriately gauge different levels of student performance, or the scoring guide appropriately assigns the weight of scores according to the difficulty levels of the assignment.
Exemplary (4): Rubric can appropriately gauge different levels of student performance, or the scoring guide appropriately assigns the weight of scores according to the difficulty levels of the assignment. Additionally, the evaluation instrument is validated by research and best practices in the field.

Criterion (for Individual Student's Mastery of SLO)
Beginning (1): No criterion for an individual student's mastery of the SLO or PLO is defined.
Developing (2): Criterion for an individual student's mastery of the SLO or PLO is vaguely defined.
Acceptable (3): Criterion for an individual student's mastery of the SLO or PLO is clearly defined.
Exemplary (4): Criterion for an individual student's mastery of the SLO or PLO is clearly defined and is both realistic and challenging based on research and best practices in the field.
Target Outcome (Expected Group Success Rate)
Beginning (1): No target for group success (% of the group mastering the SLO or PLO) is defined.
Developing (2): Target for group success is vaguely defined.
Acceptable (3): Target for group success is clearly defined.
Exemplary (4): Target for group success is clearly defined. The target set is both realistic and challenging based on research and best practices in the field.

Close-the-Loop Report Elements

Results
Beginning (1): Aggregated results are presented in a confusing way or not presented at all.
Developing (2): Aggregated results are presented, but either the # or % of the group mastering each SLO or PLO is missing, or there is a calculation error in the data presentation.
Acceptable (3): Aggregated results (# and % of the group mastering each SLO or PLO) are presented in a clear and understandable way.
Exemplary (4): Aggregated results (# and % of the group mastering each SLO or PLO) are presented in a clear and understandable way. The conclusion is objectively drawn based on the comparison between the target outcome and the actual results.

Interpretation of Results
Beginning (1): No factors that impact the results are identified.
Developing (2): Factors that impact the results are identified, but they are arbitrary and irrelevant to the issues addressed by the SLO or PLO.
Acceptable (3): Factors that impact the results are identified.
Exemplary (4): Factors that impact the results are identified. Analysis of the factors is relevant and insightful, based on faculty feedback and research.

Recommendations
Beginning (1): Recommendations are not made (e.g., "No recommendation this time").
Developing (2): Recommendations are made, but they are broad and not actionable (e.g., "Data will be shared with faculty and administrators").
Acceptable (3): Recommendations are specific and actionable, but it is not clear whether they address the factors identified (e.g., when the factor identified was a lack of professional development for adjunct faculty, the recommendation was to "increase the recruitment effort for adjunct faculty").
Exemplary (4): Recommendations are specific and actionable, addressing factors identified from the data interpretation (e.g., when ENGL 1302 students were found to have a hard time integrating sources of information in their research papers, the recommendation was to "implement critical thinking activities in ENGL 1301, a prerequisite course for ENGL 1302").

Report on Implementation of Recommendations (One Year Later)
Beginning (1): Fails to report on the implementation of recommendations.
Developing (2): Report on the implementation of recommendations is not specific, nor is it in the past tense (e.g., "Discuss data with faculty and administrators").
Acceptable (3): Report on the implementation of recommendations is specific, using the past tense (e.g., "Two adjunct job fairs were hosted in Fall 2013 and Spring 2014 respectively").
Exemplary (4): Report on the implementation of recommendations is specific, using the past tense. In addition, follow-up strategies to examine the impact of the implementation are identified (e.g., "The English Department implemented critical thinking activities in ENGL 1301 in Fall 2013. Students from the ENGL 1301 Fall 2013 cohort will be tracked to see their performance in ENGL 1302 with regard to their ability to integrate sources in the research paper").

Appendix I

Sample of Exemplary SLO Plans and Reports: DRAM 1351

(Worksheet guidance: outcomes must be specific and measurable, addressing different levels of learning; the measure must be relevant and able to produce evidence of mastery; the evaluation method must have a well-defined rubric or scoring guide; criteria and target outcomes must be realistic and challenging; actual results must draw an objective conclusion; the interpretation must identify the factors that contributed to the results; recommendations must be specific and actionable; the implementation report must be specific and use the past tense.)

Student Learning Outcome: Students will demonstrate the theatrical production process through performance.
Method of Assessment (A) – Measure: Students will memorize lines from a monologue or scene, conduct a character analysis, and perform their roles authentically.
Method of Assessment (B) – Instructor's Evaluation Method: Rubric for observing students' acting.
Criteria: A score of 2 or above on a scale of 1-3.
Target Outcome: 80% of students will score 2 or above.
Actual Results: 81% (36 out of 44) scored 2 or above, 1% higher than the targeted 80%.
Interpretation of Results: Students who didn't meet the criterion were the ones who didn't put the effort into learning their lines or conducting the necessary script analysis to successfully complete scene and character analysis.
Recommendations: Place greater emphasis upon the pre-production process of developing a scene or monologue for performance.
Implementation of Recommendations: At the Curriculum Team Work Day in January 2014, the Drama Department across the system added the pre-production assignment to the DRAM 1351 course.

Student Learning Outcome: Students will read, analyze, and interpret scripts.
Method of Assessment (A) – Measure: Students will demonstrate mastery of objectives, actions, and motivations through written scene analysis and performed scene work.
Method of Assessment (B) – Instructor's Evaluation Method: Rubrics for observation of the demonstration and criteria evaluation of the written work.
Criteria: An average score of 2 or above on the demonstration and written work.
Target Outcome: 80% of students will score an average of 2 or above.
Actual Results: 95% (21 out of 22) scored an average of 2 or above, 15% higher than the targeted 80%. Target was exceeded.
Interpretation of Results: Although most students met the criteria, some lacked fluency in demonstrating the scenes.
Recommendations: Schedule more time to rehearse the scenes both inside and outside the classroom.
Implementation of Recommendations: Social groups were formed to meet outside of class for rehearsal in the Spring 2014 DRAM 1351 classes.
Sample of Exemplary SLO Plans and Reports: BIOL 1408

Student Learning Outcome: Identify and compare the macromolecules of life.
Method of Assessment (A) – Measure: Exam.
Method of Assessment (B) – Instructor's Evaluation Method: Scoring guide.
Criteria: A score of 70 on the exam.
Target Outcome: 70% of the students will score at least 70 on the exam.
Actual Results: The target outcome of 70% passing the questions on the exam was met; 87% of students (52 out of 60) passed.
Interpretation of Results: (a) The target outcome was met due to the following factors: instructors used a variety of strategies, including active learning, labs, homework, and lecture. (b) Although the target outcome was met, the curriculum committee felt that more discrete skills should be analyzed to identify students' weak areas so that teaching strategies can be used to improve those discrete areas.
Recommendations: Recommend that the curriculum committee evaluate the elements of the outcome, identify more discrete areas for improvement, and then implement teaching strategies that address the areas identified as needing improvement.
Implementation of Recommendations: More specific questions on the assessment were used in the classroom to determine areas where students were lacking understanding. Topics related to basic concepts were identified as needing re-teaching, so instructors were able to re-teach these topics to better enhance student understanding.

Student Learning Outcome: Illustrate how the molecules involved in inheritance are synthesized and used.
Method of Assessment (A) – Measure: 5 embedded questions in an exam.
Method of Assessment (B) – Instructor's Evaluation Method: Scoring guide.
Criteria: Correctly answer 3 or more of the 5 questions.
Target Outcome: 70% of the students will correctly answer 3 or more of the 5 questions.
Actual Results: 70% of students (42 out of 60) met the criterion, so the outcome was met.
Interpretation of Results: Although students met the criterion, faculty felt that the assessment questions and the rubric were not discrete enough to pin down which areas students didn't do well in.
Recommendations: We would like to suggest a different assessment assignment and rubric to measure each part of the outcome. This would allow us to look at specific parts of the outcome to know student success in each area.
Implementation of Recommendations: After using a more discrete rubric, instructors determined that students struggled with the process of translation. Knowing this helped instructors adjust instruction to focus more on helping students better understand this complicated process.

Appendix J

Sample of Exemplary PLO Plan and Report: Machining Technology
(Worksheet fields and guidance are the same as in Appendix I.)

Program Learning Outcome: Analyze technical issues as related to machine tool manufacturing.
Method of Assessment (A) – Measure: Examinations and lab projects.
Method of Assessment (B) – Instructor's Evaluation Method: Scoring guide for exams using a 1-100 scale; rubrics for lab projects on a 1-5 scale.
Criteria: An average score of 90 or above on exams, and an average of 3 or above on lab projects.
Target Outcome: 100% of students will score an average of 90 or above on exams and an average of 3 or above on lab projects.
Actual Results: The target was 100% of students demonstrating mastery of this PLO, and the actual result was 100% (26 out of 26 students). Target was met.
Interpretation of Results: 100% of students successfully met the outcome, and this was due to the intensive hands-on training on a well-defined set of skills.
Recommendations: Faculty will continue to provide hands-on experience and practice to students on how to analyze, formulate, troubleshoot, and incorporate technical manufacturing practices.
Implementation of Recommendations: Students were provided hands-on experience and practice on how to choose machines, tooling, and workholding for a particular part.

Program Learning Outcome: Design a machined part for the development of technical requirements.
Method of Assessment (A) – Measure: Each student will produce a 2-dimensional part on CAD-CAM, conducting proper tooling selection, selecting a proper tool path, and running a successful simulation.
Method of Assessment (B) – Instructor's Evaluation Method: Rubrics for direct inspection and observation, based on efficient G & M codes produced.
Criteria: An average score of 3 or above on a scale of 1-5 for sketching a machined workpiece.
Target Outcome: 100% of students will score an average of 3 or above.
Actual Results: The target was 100% of students demonstrating mastery of this PLO, and the actual result was 100% (26 out of 26 students). Target was met.
Interpretation of Results: All students were successful at sketching a machined workpiece; however, students need to be exposed to more tools and more practice.
Recommendations: Faculty will train students on using technology such as SolidWorks, Edgecam, or Microsoft Paint to create sketches.
Implementation of Recommendations: Students were asked to create a sketch of a machine part of their own creation. The part was to contain milling, drilling, and tapping operations for a milling machine, while a lathe part was to contain facing, turning, grooving, threading, and boring operations.
Page 67: If we plan and conduct our assessment projects at every ...

Program Learning

Outcomes

(Must be specific and measurable, addressing different levels of learning)

Method of Assessment (A) -

Measure: Student

Work/Performance

(Must be relevant and able to produce evidence to gauge the mastery of PLOs)

Method of Assessment (B) -

Instructor's Evaluation

Method

(Must have well-defined rubric or scoring guide)

Criteria - What

defines

successfully

meeting PLO for

each student?

(Must be realistic and challenging)

Target Outcome -

Expected group

success rate (Must be realistic and challenging)

Actual Results: Was the

program learning

outcome met? (Must draw objective conclusion based on the actual results)

Interpretation of Results

(Must identify what factors contributed to the results)

Recommendations

(Suggestions for change and improvement--Must be specific and actionable)

Implementation of

Recommendations

(Must report on what has been implemented. Must be specific and use past tense)

Evaluate manufacturing system problems, plans and solutions for cost effective production.

Each student will produce a 5 axis part via the proper G & M codes running successfully with no errors. Also conducting proper tooling selection, selecting the proper tool path and running a successful simulation prior to actual production.

Rubrics for inspection and observation of 5 Axis programing

An average score of 3 or above on a scale of 1-5 for 5 Axis programing

100% of students will score an average of 3 or above

The target was 100% of students demonstrating mastery of this PLO, and the actual result was 100% (26 out 26 students). Target was met.)

Although all students met the outcome, some struggled with the basics. Improvements should be to give them more time for practice and stick to the very basics of the subject.

Require students to do a homework project-- to perform a cost analysis on a part in order to problem solve in ways to bring cost down.

Students were required in Spring 2014 course to perform a cost analysis on a part in order to problem solve in ways to bring cost down.

Communicate effectively to a group of peers the technical aspects of part manufacturing.

Each student will present to the class his/her completed project (capstone project) with the use of computer and and video technology.

Rubric for assessing the capstone project

An average score of 3 or above on a scale of 1-5 for class presentation

100% of students will score an average of 3 or above

The target was 100% of students demonstrating mastery of this PLO, and the actual result was 100% (26 out 26 students). Target was met.)

Although all students were successful, some were not as fluent as they should be.

Each class day, students will be advised and given the opportunity to present their projects as they build them, which will help them prepare for their final.

Each student gave a presentation of manufacturing processes, including green machining practices, required to machine a part.

63LSC SLO Handbook (2nd ed.)
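A note on how the numbers in these entries relate: the criteria set the bar each individual student must clear, while the target outcome states the share of the group expected to clear it; the actual result is simply the observed share compared against that target. The short Python sketch below works through PLO 1's logic under stated assumptions -- the student names, scores, and the helper meets_criteria are all hypothetical, invented here for illustration; the handbook does not prescribe any code or tool for this calculation.

    # Illustration only: checks PLO 1's dual criteria (exam average >= 90
    # and lab-rubric average >= 3) for each student, then compares the
    # group success rate to the 100% target outcome.
    from statistics import mean

    # Hypothetical scores: exams on a 1-100 scale, lab rubrics on a 1-5 scale.
    students = {
        "Student A": {"exams": [92, 95, 91], "labs": [4, 5, 3]},
        "Student B": {"exams": [90, 93, 96], "labs": [3, 4, 4]},
        "Student C": {"exams": [88, 94, 97], "labs": [5, 4, 5]},
    }

    EXAM_THRESHOLD = 90   # criteria: average exam score of 90 or above
    LAB_THRESHOLD = 3     # criteria: average lab rubric score of 3 or above
    TARGET_RATE = 1.00    # target outcome: 100% of students meet the criteria

    def meets_criteria(record):
        # A student meets the PLO only when both averages clear their thresholds.
        return (mean(record["exams"]) >= EXAM_THRESHOLD
                and mean(record["labs"]) >= LAB_THRESHOLD)

    met = sum(meets_criteria(r) for r in students.values())
    actual_rate = met / len(students)

    print(f"Actual result: {met} out of {len(students)} students ({actual_rate:.0%}).")
    print("Target was met." if actual_rate >= TARGET_RATE else "Target was not met.")

For the rubric-only PLOs (2 through 4), the same pattern holds with a single 1-5 threshold in place of the two thresholds used here.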


Appendix K
AAHE Nine Principles of Good Practice for Assessing Student Learning

1. The assessment of student learning begins with educational values. Assessment is not an end in itself but a vehicle for educational improvement. Its effective practice, then, begins with and enacts a vision of the kinds of learning we most value for students and strive to help them achieve. Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what's easy, rather than a process of improving what we really care about.

2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. Learning is a complex process. It entails not only what students know but what they can do with what they know; it involves not only knowledge and abilities but values, attitudes, and habits of mind that affect both academic success and performance beyond the classroom. Assessment should reflect these understandings by employing a diverse array of methods, including those that call for actual performance, using them over time so as to reveal change, growth, and increasing degrees of integration. Such an approach aims for a more complete and accurate picture of learning, and therefore firmer bases for improving our students' educational experience.

3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. Assessment is a goal-oriented process. It entails comparing educational performance with educational purposes and expectations -- those derived from the institution's mission, from faculty intentions in program and course design, and from knowledge of students' own goals. Where program purposes lack specificity or agreement, assessment as a process pushes a campus toward clarity about where to aim and what standards to apply; assessment also prompts attention to where and how program goals will be taught and learned. Clear, shared, implementable goals are the cornerstone for assessment that is focused and useful.

4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. Information about outcomes is of high importance; where students "end up" matters greatly. But to improve outcomes, we need to know about student experience along the way -- about the curricula, teaching, and kind of student effort that lead to particular outcomes. Assessment can help us understand which students learn best under what conditions; with such knowledge comes the capacity to improve the whole of their learning.

5. Assessment works best when it is ongoing, not episodic. Assessment is a process whose power is cumulative. Though isolated, "one-shot" assessment can be better than none, improvement is best fostered when assessment entails a linked series of activities undertaken over time. This may mean tracking the progress of individual students, or of cohorts of students; it may mean collecting the same examples of student performance or using the same instrument semester after semester. The point is to monitor progress toward intended goals in a spirit of continuous improvement. Along the way, the assessment process itself should be evaluated and refined in light of emerging insights.


6. Assessment fosters wider improvement when representatives from across the educational community are involved. Student learning is a campus-wide responsibility, and assessment is a way of enacting that responsibility. Thus, while assessment efforts may start small, the aim over time is to involve people from across the educational community. Faculty play an especially important role, but assessment's questions can't be fully addressed without participation by student-affairs educators, librarians, administrators, and students. Assessment may also involve individuals from beyond the campus (alumni/ae, trustees, employers) whose experience can enrich the sense of appropriate aims and standards for learning. Thus understood, assessment is not a task for small groups of experts but a collaborative activity; its aim is wider, better-informed attention to student learning by all parties with a stake in its improvement.

7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about. Assessment recognizes the value of information in the process of improvement. But to be useful, information must be connected to issues or questions that people really care about. This implies assessment approaches that produce evidence that relevant parties will find credible, suggestive, and applicable to decisions that need to be made. It means thinking in advance about how the information will be used, and by whom. The point of assessment is not to gather data and return "results"; it is a process that starts with the questions of decision-makers, that involves them in the gathering and interpreting of data, and that informs and helps guide continuous improvement.

8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. Assessment alone changes little. Its greatest contribution comes on campuses where the quality of teaching and learning is visibly valued and worked at. On such campuses, the push to improve educational performance is a visible and primary goal of leadership; improving the quality of undergraduate education is central to the institution's planning, budgeting, and personnel decisions. On such campuses, information about learning outcomes is seen as an integral part of decision making, and avidly sought.

9. Through assessment, educators meet responsibilities to students and to the public. There is a compelling public stake in education. As educators, we have a responsibility to the publics that support or depend on us to provide information about the ways in which our students meet goals and expectations. But that responsibility goes beyond the reporting of such information; our deeper obligation -- to ourselves, our students, and society -- is to improve. Those to whom educators are accountable have a corresponding obligation to support such attempts at improvement.

Authors: Alexander W. Astin; Trudy W. Banta; K. Patricia Cross; Elaine El-Khawas; Peter T. Ewell; Pat Hutchings; Theodore J. Marchese; Kay M. McClenney; Marcia Mentkowski; Margaret A. Miller; E. Thomas Moran; Barbara D. Wright
