
Working Group Report: Standard 14 Page 1

STANDARD 14: ASSESSMENT OF STUDENT LEARNING

REPORT OF WORKING GROUP 4: EDUCATIONAL

OUTCOMES—THE MEASUREMENT OF INSTITUTIONAL EFFECTIVENESS AND LEARNING ABILITY

Chair: John K. Bechtold Vice Chair: Stephen J. Tricamo; Katia Passerini (moved to RASC, Fall 2010) Institutional Research Advisor: Eugene P. Deess Committee Members: William Barnes, Phyllis Bolling, Nicholas Carlson, John M. Cays, Carol S. Johnson, James Lipuma, Thomas G. Moore, Sharon E. Morgan, Mary Kate Naatus, Marvin K. Nakayama, Naomi G. Rotter, Oleksandr Rudniy, Mohamad A. Saadeghvaziri, Davida Scharf Final Report Submitted: May 31, 2011

Prepared for the Middle States Commission on Higher Education Reaccreditation 2012


TABLE OF CONTENTS

14.0 WORKING GROUP ASSESSMENT CHECKLIST FOR STANDARD 14

14.1 INTRODUCTION
  14.1.1 Précis: NJIT Planning and The ViSTa Model of Tactics and Metrics
  14.1.2 An Overview of Group 4's Standard 14 Charge and Questions Addressed

14.2 SELF-STUDY INQUIRY AND OUTCOMES
  14.2.1 The ViSTA Model of Institutional Assessment and the Assessment of Student Learning
  14.2.2 Towards a Cohesive Plan for the Measurement of Learning Ability
  14.2.3 Outcomes Assessment within the NJIT Colleges
  14.2.4 The Office of Institutional Research and Planning and the Assessment of Student Learning
    14.2.4.1 Exams
    14.2.4.2 Surveys
    14.2.4.3 Internal Data Systems
    14.2.4.4 Benchmark Comparisons
  14.2.5 Innovative Outcomes Research in Student Performance

14.3 CRITICAL ANALYSIS AND CONCLUSIONS

14.4 COLLABORATION WITH OTHER WORKING GROUPS

14.5 RECOMMENDATIONS FOR IMPROVEMENT
  14.5.1 Recommendations Table: Standard 14: Assessment of Student Learning


14.0 WORKING GROUP ASSESSMENT CHECKLIST FOR STANDARD 14

Fundamental Elements of Student Learning, from Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards of Accreditation (Philadelphia, PA: MSCHE, 2009).

Team evaluation scale: 4 = Exemplary; 3 = Emerging Excellence; 2 = Meets Standard; 1 = Developing Competency.

Fundamental Element | Team Evaluation
clearly articulated statements of expected student learning outcomes (see Standard 11: Educational Offerings), at all levels (institution, degree/program, course) and for all programs that aim to foster student learning and development | 3
a documented, organized, and sustained assessment process to evaluate and improve student learning | 4
assessment results that provide sufficient, convincing evidence that students are achieving key institutional and program learning outcomes | 3
evidence that student learning assessment information is shared and discussed with appropriate constituents and is used to improve teaching and learning | 3
documented use of student learning assessment information as part of institutional assessment | 3


14.1 INTRODUCTION

14.1.1 Précis: NJIT Planning and The ViSTa Model of Tactics and Metrics

The Reviewer's Report on the 2007 periodic review notes, "Decision-making at NJIT is clearly driven by continuous assessment of institutional effectiveness at multiple levels, including comprehensive coverage of student learning outcomes." It describes the practices that NJIT applies to learning outcomes assessment as "varied, exemplary, and in some cases, unique." Specific assessment practices cited in the report include the first-of-its-kind electronic storage of student design work in the College of Architecture and Design (CoAD), the portfolio approach to tracking student writing, capstone evaluation by practicing engineers in NCE, and the NJIT university-wide course evaluation system. These practices have since been enhanced, and new practices have been developed and implemented to create a robust, data-driven, university-wide model of student learning outcomes assessment. Emblematic of the NJIT commitment to assessment of student learning is the creation of an assessment plan, described in Section 14.2.5, to accompany the recent recommendations of the Task Force on Retention and Graduation.

14.1.2 An Overview of Group 4's Standard 14 Charge and Questions Addressed

The steering committee and Working Group 4 jointly developed the following set of research questions, which challenged all of us to reflect on ongoing practices of student learning assessment at NJIT and on how those practices have been used in decision-making to ensure continuous improvement.

1. What evidence demonstrates that there is campus support for the assessment of student learning? (Section 14.2.1)

2. How do our current student learning outcomes ensure that there is consistent quality of admitted students? (Section 14.2.5)

3. How does NJIT document that the institution’s students have levels of knowledge and skills that are consistent with the NJIT mission? (Section 14.2.4-14.2.5)

4. What evidence is used to document that the General University Requirements are assessed through performance-based measures? (Section 14.2.4-14.2.5)

5. What evidence is used to document that degree programs are assessed through performance-based measures? (Section 14.2.5)

6. What evidence exists that our current student learning outcomes lead to curriculum transformation in terms of consistently achieved program objectives? (Section 14.2.4.4)

7. How has assessment of student learning influenced instruction? (Section 14.2.4.4)


8. What strategies have we used to demonstrate that current student learning outcomes lead to curricular adjustments in terms of increased student engagement? (Section 14.2.2-14.2.4.4)

9. How do our current student learning outcomes lead to adjustments in terms of increased student retention? (Section 14.2.2-14.2.4.4)

10. How do our current student learning outcomes lead to informed decisions about curricular planning and resource allocation? (Section 14.2.4)

11. During the period of the present self-study, how have we articulated the need to develop a university-wide outcomes assessment plan that unifies existing efforts? How is such a plan being developed? (Section 14.2.4.4)

These questions prompted us to document evidence that performance-based measures are used to assess degree programs, General University Requirements, and other components of student learning, and thereby to demonstrate that our graduates possess knowledge and skills consistent with the NJIT mission. Another key theme was to demonstrate that the results of these assessment activities are used to drive changes and adjustments that improve all aspects of student life at NJIT. Our research required soliciting input from all programs and departments, both academic and non-academic, which led to a university-wide discussion, and hence greater awareness, of student learning assessment. Selected examples, identified by this study, of how assessment has led to university-wide transformation are summarized in the following table:

Table 14.0. Transformation of Student Learning Assessment at NJIT

Targeted Area (Status in 2007) | Assessment or Research Driving Change | Transformation (Status in 2011)
Writing Placement | Writing Placement Exam study, Dept. of Humanities | New placement tool; substantial reduction in remediation
Information Literacy | Collaboration of librarians and Dept. of Humanities | Adoption of Institute Information Literacy Plan across the curriculum
Course Evaluation System (pen-and-pencil) | IRP analysis found low response rate and high cost | Web-based, paperless system; used to grade teaching effectiveness in performance-based merit
Testing Core Competencies (grades in GUR classes only) | Not compliant with MSCHE and the Voluntary System of Accountability | Nationally-normed ETS exams administered, in addition to grades in GUR classes
Learning Goals and Core Competencies | Self-study found commonalities, but not cohesion, across colleges and programs | Institutional Learning Goals and Core Competencies clearly articulated by the Committee on Academic Affairs
IRP Surveys | Desire for cost-effectiveness, efficiency, and transparency | All IRP surveys are web-based; reports are posted online
Program Review (Committee on Department and Program Review) | Compliance with regional and program accrediting agencies requires a comprehensive plan for program review | Design and implementation of a cohesive plan for measurement of student learning
Distance Learning | |
Student Satisfaction | Student Satisfaction Survey | Some examples: enhanced offerings in student dining, more social activities
Student Engagement | National Survey of Student Engagement | Birth of learning communities: Connections, Miniversity
Retention | IRP data showed low retention and graduation rates | Creation of Task Force on Retention and Graduation Rates
Academic Advising | Student feedback indicated "prescriptive" advising in several colleges was too impersonal | Creation of Task Force on Retention and Graduation Rates
Student Intervention | Low passing rates in some classes contribute to slow progress toward degree completion | Common exam scores in first- and second-year math and physics courses uploaded to Moodle for access by CAPE, to aid students as necessary
Instruction | Wide variation in student evaluation of instructor performance | Student evaluation of instructor performance made a key component of the new Performance-Based Merit System

14.2 SELF-STUDY INQUIRY AND OUTCOMES

14.2.1 The ViSTA Model of Institutional Assessment and the Assessment of Student Learning

The NJIT Program Review Process, the centerpiece of the university's student learning assessment initiative, has four objectives: to provide a forum for the assessment and improvement of all degree-granting and university programs; to demonstrate continuous improvement in the delivery of educational curricula; to promote a culture of assessment by building a cohesive assessment strategy; and to create a central, web-based repository for assessment design and supporting documents. These objectives have yielded a university-wide forum for collaboration in which we share the visions we have for our programs, the strategies we use to achieve our program goals, the tactics by which we implement those goals, and the metrics we use to measure our success. Our methods of assessment of educational effectiveness are thus aligned with our ViSTa model of assessment of institutional effectiveness, as described in Working Group Report, Standard 7.

14.2.2 Towards a Cohesive Plan for the Measurement of Learning Ability

The NJIT Program Review Process

The NJIT Program Review Process provides a cohesive educational assessment framework. It was developed in response to the trends influencing assessment of student learning at NJIT, including regional and program accreditation, the strategic plans of the university and colleges, and funded educational research. A discussion of the role of program review in educational offerings may be found in Working Group Report, Standard 11.

The NJIT Strategic Plan 2010-2015 set a strategic priority to enhance the quality of academic and campus life for the university community, with the objective of refining outcomes assessment efforts in student learning to achieve continuous curricular improvement. Seven tactics were proposed, with corresponding metrics to measure progress toward meeting that objective (Deess, Elliot, Deek, Tricamo, 2010). Although outcomes assessment has been taking place at the department and program level for many years, our planning has yielded a cohesive, university-wide outcomes assessment model.

Our Office of Academic Affairs has articulated to the NJIT community our Core Values, Institutional Learning Goals, and General University Requirement Core Competencies. A standing committee for outcomes assessment was formed to identify and develop assessment methods that directly measure student learning with regard to the Institutional Learning Goals and Core Competencies, to document how findings lead to curricular improvement, and to communicate these results to all stakeholders.

The Committee on Department and Program Assessment and the Subcommittee on Assessment

These two committees were formed to develop a university-wide process of program review that is consistent with university strategic planning initiatives and that provides accountability and transparency. What has emerged from these efforts is a highly structured process by which all of NJIT's degree programs are reviewed on a five-year rotating cycle. The vision of the program review process is to answer the following questions:

• What is the cohesive framework of program assessment at NJIT?
• How are degree program and course goals integrated and articulated?
• How are learning outcomes tied to these goals?
• How have planned assessment processes been implemented?
• How do assessment results provide convincing evidence of student learning?
• How do program administrators use assessment in decision making?
• What actions have been taken as a result of assessment?
• How have program assessment results been communicated to a variety of stakeholders?

Details of this process and its implementation are given in "NJIT Program Review Process: Guidelines" (Deess, Deek, Elliot, 2011). The Committee on Department and Program Assessment has also launched a Moodle site and web-based repository for transparent documentation of the review process.

Assessment of NJIT Core Competencies

NJIT also has in place a process to assess our GUR Core Competencies, both within GUR classes and using nationally-normed exams. A thorough description of NJIT's General University Requirements is given in Working Group Report, Standard 12, on General Education. The four performance-based Core Competencies are Writing, Reading/Critical Thinking, Quantitative Reasoning, and Information Literacy. The first three are tested using the ETS Proficiency Profile, and Information Literacy is tested using the iSkills exam. The use of these nationally-normed exams fulfills our commitment as a member of the Voluntary System of Accountability and enables us to measure our performance against peer institutions nationwide (Subcommittee on Assessment, 2011). Information literacy is also assessed according to the metrics established in the Institute IL Plan.

14.2.3 Outcomes Assessment within the NJIT Colleges

Programs in four of NJIT's six colleges undergo accreditation by external agencies, and each of these colleges is required to document compliance with an outcomes assessment standard as part of its cyclical review. Though each agency has a different focus, there are many commonalities in outcomes assessment, which are also consistent with NJIT's student learning assessment initiative. Charts documenting the most recent accreditation reports for these programs, as well as responses to recommendations from the appropriate accrediting agencies, are provided below for NCE, CoAD, SoM, and CCS.


Although programs within CSLA do not undergo agency accreditation, each program does undergo NJIT program review on a regular basis. Program review documents for each department are given below. CSLA is also deeply involved in strategic planning, in which student learning assessment is an integral part of all program goals. The current CSLA Strategic Plan, 2009-2014, builds on the gains of the previous plan and is congruent with the NJIT Strategic Plan. The core values identified in both plans are closely aligned, and a mapping of NJIT Core Values across all science and liberal arts units within CSLA reveals this integration (Deek, 2010). Strategic and academic plans, as well as balanced scorecards for each program within CSLA, are posted publicly on the web for the sake of transparency.

14.2.4 The Office of Institutional Research and Planning and the Assessment of Student Learning

As noted in the 2007 PRR, "decision making at NJIT is clearly driven by continuous assessment of institutional effectiveness at multiple levels, including comprehensive coverage of student learning outcomes. The thorough assessment mechanism that evaluates progress on all components of the strategic plan is but one instance of the thorough-going culture of evidence that characterizes the institution." The Office of Institutional Research and Planning oversees institutional assessment and the assessment of student learning outcomes at NJIT. With assessment functions housed in a single department, the university has developed an integrated assessment program in which the various types of assessment, which one might call genres, serve multiple pragmatics. For example, studies of learning outcomes may have implications for advising and resource allocation, in addition to curricular reform. Similarly, indirect measures of student learning outcomes, such as satisfaction surveys, may suggest areas for special attention in the assessment of learning outcomes.

Along these lines, the audience for student learning outcomes assessment is not hermetically sealed off from the audience for the assessment of student services or advisement. Even beyond shared governance, a university has integrated administration, as any complex institution must. The table below outlines the integrated nature of assessment at NJIT.

Table 14.1 Integrated Assessment Model at NJIT

Genre | Form | Audience | Pragmatics
Exams | iSkills | Program directors/academic administration | Assessment of student learning/curricular development/resource allocation
Exams | ETS Profile | Program directors/academic administration | Assessment of student learning/curricular development/resource allocation
Exams | Business Management field test by ETS | SOM program director | Assessment of student learning/curricular development/resource allocation
Surveys | NSSE | Senior administration/chairs | Curricular development/resource allocation
Surveys | Course evaluation | Faculty/chairs/deans/senior administration | Curricular development/resource allocation/faculty hiring/salary increases
Surveys | Enrolling Students | Admissions office/deans | Admissions projections and targeting/marketing/communications plan
Surveys | Student satisfaction | Student services administration/academic administration | Improving campus life/promoting community/resource allocation/advisement
Surveys | Graduating Students | Student services administration/academic administration | Assessment of student learning/curricular development/resource allocation
Surveys | Alumni | Chairs/deans/senior administration | Assessment of student learning/curricular development/resource allocation
Surveys | Employers | Chairs/deans/senior administration | Assessment of student learning/curricular development/resource allocation
Surveys | Intern supervisors | Chairs/deans/senior administration | Assessment of student learning/curricular development/resource allocation
Surveys | Educause (IT) | IT department/senior administration | IT service improvement/resource allocation
Surveys | PolyDasher | Senior administration | Overall institutional assessment
Surveys | Custems | Senior administration | Overall institutional assessment
External Data Systems | Efficiency table | President/provost | Resource allocation
Internal Data Systems | Grade distribution report | Chairs/deans/provost | Academic administration
Internal Data Systems | Enrollment | Admissions/budget/provost | Academic administration/academic planning/resource allocation/facilities planning
Internal Data Systems | Graduation report | Chairs/deans/provost/student services | Academic administration/academic planning
Internal Data Systems | Retention report | Chairs/deans/provost/student services | Academic administration/academic planning
Internal Data Systems | Migration report | Chairs/deans/provost/student services | Academic administration/academic planning
Internal Data Systems | Retention/graduation report | Chairs/deans/provost/student services | Academic administration/academic planning
Internal Data Systems | Diversity report | Provost/student services | Admissions planning
Benchmark comparisons | Comprehensive benchmark report | President/provost | Overall institutional assessment

14.2.4.1 Exams

As part of NJIT's increasing commitment to the Voluntary System of Accountability (VSA), the university has adopted the annual use of two exams to evaluate student learning outcomes. These constitute the centerpiece of formal learning outcomes assessment university-wide. The ETS Proficiency Profile, the standard exam included for the VSA, has eight components: critical thinking, reading, writing, mathematics, humanities, social sciences, natural sciences, and essay. These components, as well as the overall score, are nationally normed and serve as a general measure of learning outcomes. Compared to research universities across the nation, NJIT students perform at the mean (448).

Figure 14.0. Sample Student Learning Assessment: ETS Proficiency Profile
[Summary statistics: maximum 497, median 449.5, minimum 400; mean 448.16, std. dev. 24.48, N = 32.]
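As an illustrative aside, the descriptive statistics reported with Figure 14.0 (median, mean, standard deviation, N) are standard summary measures; a minimal sketch of how such figures are computed from a set of exam scores is shown below. The scores here are made-up placeholder values, not actual ETS data, and this is not NJIT's analysis pipeline.

```python
# Minimal sketch: compute the kind of summary statistics shown in
# Figure 14.0 from a list of exam scores. Scores are hypothetical.
import statistics

scores = [400, 421, 433, 449, 450, 462, 478, 497]  # placeholder values

n = len(scores)
mean = statistics.mean(scores)
std_dev = statistics.stdev(scores)   # sample standard deviation
median = statistics.median(scores)
minimum, maximum = min(scores), max(scores)

print(f"N={n}  mean={mean:.2f}  std dev={std_dev:.2f}")
print(f"min={minimum}  median={median}  max={maximum}")
```

With these placeholder scores the mean (448.75) and median (449.5) happen to land near the figure's reported values, which is coincidental to the illustration.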


In fall 2011 the same ETS Proficiency Profile exam will be administered to entering freshmen. Comparison of these results with those of the seniors should indicate the learning of general skills over the course of university study. Obviously, this exam measures only general skills, not the disciplinary skills that form the core of learning at a technically oriented university, but it still affords insight into general strengths or weaknesses of the curriculum. Individual programs bridge this gap in different ways, as discussed in section 14.2.4.6. The School of Management supplements the ETS Proficiency Profile with a Management Field Test by ETS to test disciplinary skills in the field of business. That exam is used to modify the disciplinary curriculum directly. The university also administers the iSkills exam, a test of information literacy appropriate to the internet age. This exam is intended both to assess the overall relevance of the NJIT curriculum to information literacy in the internet world and to test the effectiveness of the targeted information literacy curriculum. Like the ETS Proficiency Profile, it is administered to seniors and to freshmen. As with the ETS Proficiency Profile, NJIT performs at or near the national norm (260) for peer schools. Further analysis will be conducted after fall 2011 results are available.

Figure 14.1. Sample Student Learning Assessment: ETS iSkills
[Summary statistics: maximum 450, median 260, minimum 60; mean 253.35, std. dev. 83.97, N = 212.]

14.2.4.2 Surveys

The Office of Institutional Research and Planning conducts surveys annually, analyzes the data, and provides information to senior management to aid data-driven decision making. Surveys are web-based to ensure high response rates, and reports summarizing the data are made available to the NJIT community on the IRP web site (IRP, Reports, 2011).


University-wide Course Evaluation Survey

Course evaluations, as discussed in Working Group Report, Standard 7, serve as indirect measures in the assessment of student learning. The opening four questions of the survey ask students to evaluate the following:

• The quality of the course textbook(s).
• The quality of other instructional materials (handouts, multimedia, etc.).
• The extent to which the course content is current and relevant.
• The overall educational value of the course.

Although these are not direct measures of student learning outcomes, they play an important part in deciding whether a course should be redesigned to better meet the needs or interests of students. In particular, we know that NJIT students are highly goal-oriented, with a preference for learning that can be applied in their careers. A course that a majority of students see as not current or relevant is in need of redesign. Similarly, if students do not see the educational value of a course, it may warrant review. The question "My preparation for this course from previous courses" is also an important tool for indirectly measuring the learning achieved in prerequisite courses. Three surveys discussed in Working Group Report, Standard 7, offer more direct measures for the assessment of student learning outcomes.

(a) Graduating Student Survey

The graduating student survey is largely an indirect measure, although students competing for post-graduation jobs are increasingly aware of the preparation they received through their study. This survey focuses on the extent to which the program in which they participated met their career goals and provided the knowledge they need to compete successfully in the job market.

(b) Alumni Survey

The alumni survey asks the usual questions about graduates' current employment status, and it also encourages alumni to reflect on the quality of the curriculum at NJIT. As graduates who have been in their careers for up to seven years, they have a much clearer understanding of their own strengths and weaknesses and of those of the program from which they graduated. For this reason the measure approaches a direct measure of student learning outcomes.

(c) Survey of Employers of NJIT Graduates and Survey of Employers of Co-op/Intern Students

The survey of employers is a direct measure of the quality of NJIT graduates' preparation for careers. In this survey, actual employers of NJIT graduates are asked to reflect on the preparation of these employees compared to that of employees who graduated from other schools. The skills they can assess are both disciplinary, such as applied engineering competency, and abstract abilities such as leadership, communication, and teamwork.

Although not directly tied to a formal assessment of student learning, these three surveys have provided important information about NJIT students. In the past, their results have led to an increased emphasis on teamwork and communication skills and, for several engineering programs, to curriculum updates that produce graduates with knowledge more germane to the contemporary market.

(d) The National Survey of Student Engagement (NSSE)

As with course evaluations, monitoring the process of teaching through NSSE is central to assuring quality; however, it is not used directly to assess student learning outcomes. NSSE allows academic administrators to view the actual process of student learning and to evaluate the quality of the curriculum according to measures of student participation, contact between faculty and students, and the like. Since 2004, NJIT has participated in NSSE every two years as a way to benchmark our institution nationally against our peers. NSSE addresses level of academic challenge, active and collaborative learning, student-faculty interaction, enriching educational experiences, and supportive campus environment. The results of the NSSE survey were disseminated to and discussed with senior administration, the Committee on Academic Affairs, and the Retention Subcommittee to aid in improving the student experience. Historically, NJIT has outperformed our Carnegie Classification peers in the area of active and collaborative learning (ACL) for both first-year students and seniors. For 2008 and 2010, first-year student means were consistently higher than the NSSE survey mean (see table below).


Table 14.2. NSSE Results, 2008 and 2010

Note: The '+' symbol indicates that the institution's score is higher than that of the respective comparison group (p < .05); the '-' symbol indicates a score lower than the comparison group; a blank space indicates no significant difference.

Results from the NSSE survey also show that NJIT excels in key areas of classroom interaction compared to the Carnegie Classification peers, the writing consortium, and NSSE participants as a whole. There is reason for NJIT to take pride in the diversity of the campus and the challenge of the academic experience.


Table 14.3. NSSE Results, SIG Writing, 2010

First-Year Students                                                             | NJIT | Writing Consortium | Carnegie Class | NSSE 2010
1b ACL Made a class presentation                                                | 41%  | 29% | 27% | 35%
1g ACL Worked with other students on projects during class                      | 53%  | 44% | 45% | 46%
1j ACL Tutored or taught other students (paid or voluntary)                     | 27%  | 16% | 18% | 16%
1u EEE Had serious conversations w/ students of another race or ethnicity       | 68%  | 49% | 54% | 52%
1v EEE Had serious conversations w/ students of other relig./politics/values    | 64%  | 53% | 56% | 55%

Seniors                                                                         | NJIT | Writing Consortium | Carnegie Class | NSSE 2010
3c LAC Wrote at least one paper or report of 20 pages or more                   | 58%  | 48% | 47% | 50%
1h ACL Worked with classmates outside of class to prepare class assignments     | 72%  | 62% | 62% | 60%
1j ACL Tutored or taught other students (paid or voluntary)                     | 32%  | 22% | 22% | 21%
1u EEE Had serious conversations w/ students of another race or ethnicity       | 67%  | 53% | 56% | 54%
7h EEE Completed a culminating senior experience (capstone, thesis, comp. exam) | 50%  | 34% | 31% | 33%

Moreover, additional NSSE results found in the Digital Archive reveal that 53% of first-year students frequently discuss readings or ideas from coursework outside of class and work with other students on projects in class. Fifty-one percent work with peers outside of class. Forty-one percent of first-year students report that they make frequent presentations in class. By their senior year, 47% of students have participated in some form of practicum, internship, field experience, co-op, or clinical assignment. For 2010, our first-year students scored higher than the NSSE survey mean in level of academic challenge. Eighty-six percent of first-year students feel that NJIT places substantial emphasis on academics. Sixty-two percent report that they work harder than they expected in order to meet faculty expectations. Fifty-seven percent say that their exams strongly challenge them to do their best work.


There are also important aspects of the educational experience that can be improved. Although 74% of first-year students believe that NJIT is committed to their academic success, and 47% feel well supported in terms of their social needs, NJIT underperforms in the area of supportive campus environment (SCE) for both first-year students and seniors. For seniors, NJIT also falls short with respect to student-faculty interaction (SFI). During their first year, only 17% of students participate in a learning community. To meet this challenge, NJIT will roll out a Learning Communities initiative in Fall 2011 to integrate students better into the NJIT campus experience. With regard to diversity, 68% of first-year students frequently have serious conversations with students of a different race, and 64% have serious conversations with students who differ from them in religious, political, or personal beliefs.

Assessing the Writing Curriculum

NSSE provides an excellent tool for assessing the effectiveness of writing instruction. By joining a consortium of schools with similar writing programs, NJIT can comparatively assess the quality of the writing curriculum. Again, this does not measure student learning directly, but by promoting an environment in which the factors associated with learning are present, the program can be expected to be effective. The results also indicate areas of the teaching curriculum that may be improved.

Table 14.4. NSSE Results, Writing Across the Curriculum, 2010

NSSE 2010 Mean Comparisons (a): NJIT compared with CSWC. Refer to the CSWC consortium codebook for response option values.

1. During the current school year, for how many of your writing assignments have you done each of the following?

Item                                                                     | Variable | Class | NJIT Mean | CSWC Mean | Sig (b) | Effect size (c)
1a. Brainstormed (listed ideas, mapped concepts, prepared an outline, etc.) to develop your ideas before you started drafting your assignment | SWC1001A | FY | 3.55 | 3.52 |     | .02
                                                                         |          | SR | 3.46 | 3.38 |     | .07
1b. Talked with your instructor to develop your ideas before you started drafting your assignment | SWC1001B | FY | 3.12 | 2.97 | *   | .14
                                                                         |          | SR | 2.95 | 2.86 |     | .09
1c. Talked with a classmate, friend, or family member to develop your ideas before you started drafting your assignment | SWC1001C | FY | 3.34 | 3.34 |     | .01
                                                                         |          | SR | 3.15 | 3.24 |     | -.09
1d. Received feedback from your instructor about a draft before turning in your final assignment | SWC1001D | FY | 3.47 | 3.28 | **  | .15
                                                                         |          | SR | 2.97 | 2.91 |     | .05
1e. Received feedback from a classmate, friend, or family member about a draft before turning in your final assignment | SWC1001E | FY | 3.27 | 3.27 |     | .00
                                                                         |          | SR | 2.86 | 2.96 |     | -.09
1f. Visited a campus-based writing or tutoring center to get help with your writing assignment before turning it in | SWC1001F | FY | 2.20 | 2.14 |     | .05
                                                                         |          | SR | 1.67 | 1.75 |     | -.07
1g. Used an online tutoring service to get help with your writing assignment before turning it in | SWC1001G | FY | 2.08 | 1.68 | *** | .34
                                                                         |          | SR | 1.58 | 1.46 | *   | .12
1h. Proofread your final draft for errors before turning it in           | SWC1001H | FY | 4.04 | 4.29 | *** | -.26
                                                                         |          | SR | 4.07 | 4.36 | *** | -.29

Scale: 1=Never; 2=Sometimes; 3=Often; 4=Very Often
(a) Weighted by gender and enrollment status (and size for comparisons)
(b) * p<.05, ** p<.01, *** p<.001
(c) Mean difference divided by the pooled SD
CSWC = Consortium for the Study of Writing in College
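The conventions in the table notes can be made concrete. The following is a minimal sketch, not part of the report's methodology (the function names are illustrative): footnote (c) defines the effect size as the mean difference divided by the pooled standard deviation, and footnote (b) maps p-values to the star notation used in the Sig column.

```python
# Illustrative sketch of the table-note conventions; names are hypothetical.
from statistics import mean, stdev

def pooled_sd(a, b):
    """Pooled standard deviation of two samples a and b."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5

def effect_size(a, b):
    """Footnote (c): mean difference divided by the pooled SD."""
    return (mean(a) - mean(b)) / pooled_sd(a, b)

def sig_flag(p):
    """Footnote (b): * p<.05, ** p<.01, *** p<.001."""
    if p < 0.001:
        return "***"
    if p < 0.01:
        return "**"
    if p < 0.05:
        return "*"
    return ""
```

Read this way, a positive effect size means the NJIT mean exceeds the comparison-group mean, and an entry such as "*** .34" indicates a difference significant at p<.001 of about a third of a pooled standard deviation.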

NSSE also affords an opportunity to see whether the way in which writing is taught is consistent with the character of the NJIT experience. The table below shows that technical aspects of science writing, including the use of numerical or statistical data and the inclusion of drawings and multimedia presentations, are well emphasized. This confirms that the writing curriculum is consistent with the likely career needs of students.

Table 14.5. NSSE Results, SIG Writing, 2010

NSSE 2010 Mean Comparisons (a): NJIT compared with CSWC.

2. During the current school year, in how many of your writing assignments did you:

Item                                                                     | Variable | Class | NJIT Mean | CSWC Mean | Sig (b) | Effect size (c)
2a. Narrate or describe one of your own experiences                      | SWC1002A | FY | 2.77 | 2.74 |     | .04
                                                                         |          | SR | 2.42 | 2.50 |     | -.08
2b. Summarize something you read, such as articles, books, or on-line publications | SWC1002B | FY | 3.22 | 3.22 |     | .00
                                                                         |          | SR | 3.01 | 3.18 | **  | -.17
2c. Analyze or evaluate something you read, researched, or observed      | SWC1002C | FY | 3.63 | 3.58 |     | .06
                                                                         |          | SR | 3.49 | 3.63 | *   | -.15
2d. Describe your methods or findings related to data you collected in lab or field work, a survey project, etc. | SWC1002D | FY | 3.06 | 2.75 | *** | .26
                                                                         |          | SR | 3.25 | 2.92 | *** | .28
2e. Argue a position using evidence and reasoning                        | SWC1002E | FY | 3.47 | 3.33 | *   | .13
                                                                         |          | SR | 3.11 | 3.11 |     | .00
2f. Explain in writing the meaning of numerical or statistical data      | SWC1002F | FY | 2.72 | 2.35 | *** | .31
                                                                         |          | SR | 2.84 | 2.57 | *** | .24
2g. Write in the style and format of a specific field (engineering, history, psychology, etc.) | SWC1002G | FY | 2.80 | 2.70 |     | .08
                                                                         |          | SR | 3.33 | 3.26 |     | .05
2h. Include drawings, tables, photos, screen shots, or other visual content into your written assignment | SWC1002H | FY | 2.81 | 2.31 | *** | .44
                                                                         |          | SR | 3.43 | 2.81 | *** | .54
2i. Create the project with multimedia (web page, poster, slide presentation such as PowerPoint, etc.) | SWC1002I | FY | 2.87 | 2.39 | *** | .43
                                                                         |          | SR | 3.22 | 2.91 | *** | .28

Scale: 1=Never; 2=Sometimes; 3=Often; 4=Very Often
(a) Weighted by gender and enrollment status (and size for comparisons)
(b) * p<.05, ** p<.01, *** p<.001
(c) Mean difference divided by the pooled SD

The table below indicates some areas of the classroom experience that are ripe for improvement. These include clarity in expectations and communication between instructor and student.

Table 14.6. NSSE Results, SIG Writing, 2010

NSSE 2010 Mean Comparisons (a): NJIT compared with CSWC.

During the current school year, for how many of your writing assignments has your instructor done each of the following?

Item                                                                     | Variable | Class | NJIT Mean | CSWC Mean | Sig (b) | Effect size (c)
Provided clear instructions describing what he or she wanted you to do   | SWC1003A | FY | 3.79 | 3.97 | *** | -.21
                                                                         |          | SR | 3.64 | 3.93 | *** | -.35
Explained in advance what he or she wanted you to learn                  | SWC1003B | FY | 3.59 | 3.73 | *   | -.15
                                                                         |          | SR | 3.42 | 3.71 | *** | -.31
Explained in advance the criteria he or she would use to grade your assignment | SWC1003C | FY | 3.64 | 3.90 | *** | -.28
                                                                         |          | SR | 3.64 | 3.90 | *** | -.28
Provided a sample of a completed assignment written by the instructor or a student | SWC1003D | FY | 2.82 | 2.95 |     | -.11
                                                                         |          | SR | 2.49 | 2.77 | *** | -.24
Asked you to do short pieces of writing that he or she did not grade     | SWC1003E | FY | 2.97 | 2.65 | *** | .28
                                                                         |          | SR | 2.09 | 2.20 |     | -.09
Asked you to give feedback to a classmate about a draft or outline the classmate had written | SWC1003F | FY | 3.08 | 3.01 |     | .06
                                                                         |          | SR | 2.24 | 2.30 |     | -.05
Asked you to write with classmates to complete a group project           | SWC1003G | FY | 2.75 | 2.61 | *   | .12
                                                                         |          | SR | 2.74 | 2.74 |     | .00
Asked you to address a real or imagined audience such as your classmates, a politician, non-experts, etc. | SWC1003H | FY | 2.90 | 2.81 |     | .07
                                                                         |          | SR | 2.52 | 2.60 |     | -.07

Which of the following have you done or do you plan to do before you graduate from your institution?

Prepare a portfolio that collects written work from more than one class  | SWC1004A | FY | .35 | .11 | *** | .72
                                                                         |          | SR | .17 | .21 | *   | -.11
Submit work you wrote or co-wrote to a student or professional publication (magazine, journal, newspaper, collection of student work, etc.) | SWC1004B | FY | .09 | .04 | **  | .23
                                                                         |          | SR | .08 | .10 |     | -.06

Scale: 1=Never; 2=Sometimes; 3=Often; 4=Very Often
(a) Weighted by gender and enrollment status (and size for comparisons)
(b) * p<.05, ** p<.01, *** p<.001
(c) Mean difference divided by the pooled SD

14.2.4.3 Internal Data Systems

IRP uses internal data systems at the university to conduct a range of studies into grade distribution, enrollment, retention, and graduation that directly or indirectly reflect curricular effectiveness. Many of these are routine products shown in the appendix and available on the web (IRP, Data Analysis, 2011). Two studies worthy of note are comprehensive investigations of the remedial process at NJIT and analyses of factors affecting retention at the university. These studies, conducted over the past four years, have resulted in a comprehensive restructuring of the placement process and the system for remedial education. Finally, an innovative study developed by IRP shows the migration of students through programs over time. This simple chart allows deans and chairs to see how students vote with their feet by flowing out of some programs and into others. Although this is not a direct measure of student learning, it again contributes to a robust understanding of the educational curriculum and offers clues to possible unforeseen problems.

14.2.4.4 Benchmark Comparisons

Benchmark comparisons are a key component of assessment at both the student outcomes and institutional levels. This process is delineated in the Working Group Report for Standard 7 and deserves note here only as a component of assessing student learning outcomes. Retention and graduation issues should not be considered in isolation. They are key student outcomes, and it is vital to understand them in a comparative framework. For example, the current NJIT six-year graduation rate of 55% is extraordinary compared to the 37% rate in 1999; however, relative to benchmarks there is still room for improvement.


14.2.5 Innovative Outcomes Research in Student Performance

College of Architecture and Design Portfolio Evaluation

The College of Architecture and Design (CoAD) at NJIT was the first school in the United States to store all student design work electronically on a custom-designed, web-based information retrieval system known as the Kepler Project. The system provides students, faculty, and administrators access to all work produced in CoAD courses. It enables students to create portfolios of their work, makes it easier for faculty to monitor student achievement and assess teaching outcomes, and gives CoAD administrators access to student work for longitudinal studies and outcomes assessment. The system was fully implemented in the spring 2007 semester.

Writing Placement Examination Study

This study was undertaken in 2008 by members of the Department of Humanities in conjunction with the Office of Institutional Research and Planning to assess the mechanism for placement in freshman writing classes. The impetus for this work was the loss of the statewide basic skills test with the dissolution of the NJ Department of Higher Education. A rigorous analysis of data accumulated over a decade provided layers of evidence that the writing portion of the SAT, traditionally used as an admissions tool, can in fact be used reliably for placement. With this evidence, the research team has developed an organized and sustained process for writing placement, including advanced placement. This innovative, data-driven system has been widely disseminated across the higher education community and has opened avenues of research for early-career faculty at NJIT.

Portfolio Studies

The Department of Humanities in the College of Science and Liberal Arts has developed a system for assessing basic components of the General University Requirements (GURs) using online portfolios. Overall, portfolio assessment allows a broader range of skills to be assessed than a single grade for a single artifact. A schedule of assessments creates a continual improvement loop: when an assignment is added or removed, the effect of the change can be seen in the following assessment, and each assessment highlights areas that can be improved. The assessment has helped to keep the GUR courses up to date and has maintained curricular unity within an otherwise diverse group of faculty and lecturers. Because the system uses two independent scorers, the analytic outcomes, which can be mined for statistical data, are much more likely to be unbiased than portfolios scored by the instructors or by an administrative committee. It is notable that this is the only such portfolio scoring system now in operation. For more information, see the publications in the Appendix regarding the stages of development of the overall portfolio assessment system.
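Agreement between two independent scorers of the kind described above is commonly quantified with Cohen's kappa. The sketch below is illustrative only; the report does not specify which reliability statistic the portfolio system uses, and the function name is hypothetical.

```python
# Cohen's kappa for two raters scoring the same set of portfolios.
# Illustrative sketch, not the report's actual reliability procedure.
from collections import Counter

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected agreement between two raters' score lists."""
    assert len(scores_a) == len(scores_b) and scores_a
    n = len(scores_a)
    # Observed agreement: fraction of portfolios given the same score.
    po = sum(x == y for x, y in zip(scores_a, scores_b)) / n
    # Expected chance agreement, from each rater's marginal distribution.
    ca, cb = Counter(scores_a), Counter(scores_b)
    pe = sum((ca[s] / n) * (cb[s] / n) for s in set(ca) | set(cb))
    return (po - pe) / (1 - pe)
```

A kappa near 1 indicates the two scorers agree far beyond chance; values near 0 suggest the rubric or scorer training needs revisiting.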


Information Literacy Plan

The NJIT Information Literacy Plan grew out of the collaborative work of NJIT librarians and Humanities faculty on information literacy instruction and assessment begun in 2005. The plan outlines an instruction and assessment cycle for eight information literacy learning outcomes in the foundational writing courses and in each undergraduate program. An IL assessment rubric was developed by librarians and is used to measure program effectiveness, as evidenced by documents in a sample of student portfolios. The Institute IL Plan was approved in May 2009 by the Undergraduate Curriculum Review Committee and the Committee on Academic Affairs.

Core Competencies Research Group

An integral part of the NJIT plan for the measurement of learning ability is the systematic investigation of student performance on our Core Competencies: Writing, Reading and Critical Thinking; Quantitative Reasoning; and Information Literacy. This group collaborated on a study design to carry out this investigation. Recommendations and rationales were then made to the Subcommittee on Assessment for its consideration and implementation (Subcommittee on Assessment, 2011).

Task Force on Retention and Graduation

As noted in Section 8.2.16, on February 3, 2011, President Altenkirch created a Task Force on Undergraduate Retention and Graduation. The Task Force was charged to examine recent national findings on college completion rates, benchmark NJIT's retention and graduation rates, and identify strategies to improve those rates. The final report of the Task Force was submitted on May 15, 2011. As Working Group 8 observed, the four-month Task Force effort embodies the timely response and collaborative efforts that are integral to the NJIT sense of mission-centered planning, assessment, and action. The Task Force approved four recommendations, each of which has precise outcomes based on the ViSTA model: to examine and improve the placement rate of students into credit-bearing courses and develop an evidence-based plan to substantially reduce the remediation rate for first-year students; to create a unified, professional advisement system for undergraduate students; to continue to contribute to the state's economic competitiveness by producing professionals who graduate in a timely fashion and contribute to workforce development; and to permanently establish a mechanism to examine issues related to retention and graduation under a shared governance structure. Remarkably, the recommendations were already under implementation even as the committee was deliberating and preparing its findings. To ensure a timely pattern of graduation, for example, each of the following tactics is already in place: placement of students in a first-semester curriculum of no more than 16 credits (exceptions follow well-defined criteria); assurance that course prerequisites are met throughout the curriculum, including the calculus prerequisite for Physics; use of full-time lecturers and tenure-track and tenured faculty for key courses in the first year and for undergraduate majors; increased opportunities for project-based learning; increased opportunities for cooperative learning; and increased exposure to research and professional development within program curricula.

To support the learning community initiative, a key part of the effort to increase retention and graduation rates, an assessment plan has been designed and is now in place. This plan, using an embedded assessment technique, will collect and analyze data on student performance as it relates to the criteria applied in placement for courses and majors. To allow national comparison of student performance and fulfill the NJIT commitment to the Voluntary System of Accountability, the ETS Proficiency Profile and the iSkills assessment will be used, along with locally developed measures such as common examinations in mathematics and course portfolios in writing classes, to better understand levels of student performance.

Learning Communities

NJIT implemented learning communities for the fall 2011 freshman class. This plan is intended to foster the experience of community among regular students who are not members of the Honors College or participants in EOP. These learning communities are structured groups of students with similar interests, such as students in the same major. The implementation plan includes detailed assessment of learning community effectiveness in speeding progress toward graduation and promoting actual learning success. Questions designed to establish baseline characteristics were built into the student satisfaction survey implemented in spring 2011, before the learning communities were formed. In spring 2012, IRP will compare the responses of learning community participants with those of students who would have been selected for learning communities had they existed in 2010. Other variables, including GPA, course success rate, credits accumulated, and exam performance, will also be compared.

Faculty-Driven Outcomes Assessment Research

Outcomes assessment has become a very active area for faculty researchers at NJIT. Indeed, many faculty from a wide spectrum of disciplines have published articles in this area. A directory of faculty publications in the field of outcomes assessment research has been created in the Digital Archive.

Selected Case Studies of Other Assessment Activities

Examples of assessment of student learning outcomes at the department and program level are summarized in the Digital Archive.


14.3 CRITICAL ANALYSIS AND CONCLUSIONS

Group 4 concludes that NJIT has a cohesive plan for the assessment of student learning, and mechanisms are in place to ensure that assessment is used to drive curricular change for the benefit of our students. Expected student learning outcomes are clearly articulated at all levels and are consonant with both the NJIT mission and the standards of higher education. The Program Review Process, an initiative of the Office of the Provost, ensures that data-driven assessment activities are organized and sustained. The Office of Institutional Research and Planning plays a key role in collecting and analyzing data and in distributing the results of those analyses to appropriate stakeholders for consideration and recommended action. NJIT has a history of performing innovative outcomes research on student performance, and many faculty members from various disciplines have published in this area. In summary, NJIT continues to apply practices of learning outcomes assessment that are varied, exemplary, and in some cases unique. Indeed, NJIT is poised to become a national leader in SLA efforts for science and technology universities.

14.4 COLLABORATION WITH OTHER WORKING GROUPS

In scheduled meetings hosted by the Rapid Assessment and Steering Committee, our Working Group collaborated with other groups. Collaboration was also strengthened through meetings with the self-study consultant (Robert Clark). Asynchronous communication was fostered through the open source content management system (Moodle); in that platform, the Working Groups collaboratively reviewed each stage of the planning and reporting process, from question design to outlines of the Working Group Reports, to edited review, to final copy. Working Group 4's Chair and Vice Chair have consulted with the Chairs and Co-Chairs of other groups, as well as with IRP, sharing data and other information resources to enhance the content of reports with related questions and scope.
14.5 RECOMMENDATIONS FOR IMPROVEMENT

14.5.1 Recommendations Table: Standard 14: Assessment of Student Learning

RECOMMENDATION 1: Ensuring Sustainability of Assessment of Student Learning at NJIT

VISION (the desired future for the recommendation): Strengthen the existing coordinated structure and culture of student learning assessment.

STRATEGY (the methodology recommended to achieve the vision): Employ the ViSTA approach to student learning assessment to improve student learning, coordinated by the Office of the Provost.

TACTIC (the specific action recommended to implement the strategy): Strengthen the centrality of SLA in accreditation-based periodic program reviews and departmental/college self-studies in order to ensure that assessment results are being used to strengthen the curriculum for the benefit of students. Further develop the NJIT Core Competencies assessment activities toward the same end.

ASSESSMENT (the metric recommended to measure achievement of the vision): 100% participation of degree programs in the Program Review Process, overseen by the Subcommittee on Assessment, by the end of the five-year cycle. Conduct a longitudinal study of student performance on the NJIT Core Competencies, via the ETS Proficiency Profile and iSkills exams, in order to determine learning trends and improve curricular delivery.

RECOMMENDATION 2: Improve analysis and reporting of data in support of assessment of student learning outcomes to drive curricular transformation and to inform stakeholders of assessment results.

VISION: Data on assessment will continue to be used to inform and formulate recommendations for curricular change.

STRATEGY: The depth of data analysis by IRP should be used to model best practices.

TACTIC: Comprehensive reports on all assessment measures will be disseminated on the web to appropriate stakeholders who, in turn, will make recommendations for change.

ASSESSMENT: IRP reports published annually online on all SLA efforts, with special emphasis on Program Assessment and Core Competency Assessment. Annual report by the UCRC on curricular changes to be delivered to the Office of the Provost.

RECOMMENDATION 3: Become a national leader in SLA efforts for science and technology universities.

VISION: Achieve national prominence for NJIT in SLA of degree programs and Core Competencies.

STRATEGY: Strengthen the Program Review Process.

TACTIC: Expand the sphere of influence of the Subcommittee on Assessment.

ASSESSMENT: Present and publish SLA findings at national conferences.


References

Altenkirch, Robert. Task Force on Retention and Graduation. Newark: NJIT, 2011. Web.

Deess, Perry, and Norbert Elliot. "Assessment of Writing Ability at a Science and Technology University: Implementing an Effective Accreditation Process." Middle States Commission on Higher Education Annual Conference. Philadelphia, PA. December 10, 2010.

Deek, Fadi. CSLA Program Goals. Newark: NJIT, 2010. Web.

Deess, Eugene, Fadi Deek, and Norbert Elliot. NJIT Program Review Guidelines. Newark: NJIT, 2011. Web.

Deess, Eugene, Norbert Elliot, Stephen Tricamo, Fadi Deek, and the Self-Study Steering Committee. The NJIT Program Review Process: Toward a Cohesive Educational Assessment Framework. Newark: NJIT, 2010. Web.

Educational Testing Service. iSkills Assessment Case Study: Preparing Students for Today's Information-Based Economy at New Jersey Institute of Technology. Princeton: ETS, 2009.

Gatley, Ian, et al. Final Report: Task Force on Undergraduate Retention and Graduation. Newark: NJIT, 2011. Web.

Institutional Research and Planning (IRP). Assessment of NJIT Core Competencies. Newark: NJIT, 2010. Web.

Institutional Research and Planning (IRP). NJIT Program Review Process: Guidelines. Newark: NJIT, 2010. Web.

Institutional Research and Planning (IRP). Data Analysis. Newark: NJIT, 2011. Web.

Institutional Research and Planning (IRP). IRP Reports. Newark: NJIT, 2011. Web.

Johnson, Carol, et al. "Undergraduate Technical Writing Assessment: A Model." Programmatic Perspectives 2.2 (2010): 110-151.

Katz, Irvin R., Norbert Elliot, Davida Scharf, Yigal Attali, Donald Powers, Heather Huey, Kamal Joshi, and Vladimir Briller. "Information Literacy Assessment: Case Study of a Multi-Method Approach." ETS Research Report RR-08-33. Princeton, NJ: Educational Testing Service, 2008.

Klobucar, Andrew, et al. "Representation, Automation, and Application: The Search for Valid Writing Assessment." New Directions in International Writing Research. Ed. Charles Bazerman. Anderson, SC: Parlor Press. Forthcoming.

Middaugh, Michael. Planning and Assessment in Higher Education: Demonstrating Institutional Effectiveness. San Francisco: Jossey-Bass, 2010.

Middle States Commission on Higher Education. Student Learning Assessment: Options and Resources. 2nd ed. Philadelphia, PA: MSCHE, 2007.

Mislevy, R. J. "Validity by Design." Educational Researcher 36 (2007): 463-469.

Mislevy, R. J., R. G. Almond, and J. F. Lukas. A Brief Introduction to Evidence-Centered Design. ETS Research Report RR-03-16. Princeton, NJ: Educational Testing Service, 2003.

Scharf, Davida, et al. "Direct Assessment of Information Literacy Using Writing Portfolios." The Journal of Academic Librarianship 33.4 (2007): 462-478.

Subcommittee on Assessment. Assessment of the NJIT Core Competencies. Newark: NJIT, 2011. Web.