Evaluating the Quality of Online Programs


Page 1: Evaluating the Quality of Online Programs

Diane Ruiz Cairns, Ed.S.
Lawrence Technological University, eLearning Services

Page 2: Agenda

• Overview
• Why monitor quality?
• Methods for monitoring the quality of online programs
• The Lawrence Tech eLearning Services experience
• Next steps

Page 3: Overview

• Program evaluation
• Making data-driven decisions
• Supporting performance improvement
• Aligning resources, performance, and strategic goals
• Adding measurable value

Page 4: Overview

• The quality of online programs impacts student retention, enrollment, and graduation rates
• The online environment includes:
  • Institutional technology
  • Course development
  • Course structure
  • Teaching and learning
  • Student and faculty support
  • Methods of evaluation and assessment

Page 5: Overview

• Measures of success include:
  • Enrollment reports
  • Assessment of learning
  • Student evaluation surveys
  • Informal feedback

Page 6: Why Monitor Quality?

• Results-based approach
• Measurable results
• Effective course content
• Efficiencies in operation
• Teaching effectiveness

Page 7: Why Monitor Quality?

• Alignment of methods for measuring and assuring quality
• Stability of online programs
• Impact on student satisfaction
• Value of online programs

Page 8: Evaluating Quality of Online Programs

• Views and methods for evaluating programs vary
• Adopting a comprehensive tool or method brings alignment
• A validated, industry-recognized tool can assist with reliability
• Establish requirements before adopting a tool

Page 9: Evaluating Quality of Online Programs

• Repeatable data collection yields meaningful data
• Take a comprehensive approach
• Multiple collection cycles support reasonable, responsible data

Page 10: Evaluating Quality of Online Programs

• Requires careful planning
• Data collection focuses on the mega, macro, and micro levels
• Systems approach
• Data collected include:
  • Effective course content
  • Efficiencies in operation
  • Teaching effectiveness

Page 11: Evaluating Quality of Online Programs

• Mega
  • Success of the online program at meeting university enrollment goals
  • Support of teaching and learning goals
• Macro
  • Technological infrastructure
  • Individual courses' support of teaching and learning guidelines
  • Faculty engagement
• Micro
  • Instructional design impact
  • Student, faculty, and staff use of technology
  • Student and faculty support services
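As a concrete illustration of the three Kaufman-style levels, here is a minimal sketch in Python (the metric names are hypothetical examples; the presentation does not prescribe any implementation) of tagging each collected metric with its level so results can be rolled up separately:

```python
# A minimal sketch: tagging evaluation metrics by level
# (mega / macro / micro) so they can be reported separately.
# Metric names are hypothetical, not taken from the presentation.

EVALUATION_LEVELS = {
    "mega": [   # institutional / societal outcomes
        "enrollment_vs_university_goal",
        "support_of_teaching_and_learning_goals",
    ],
    "macro": [  # program and organizational performance
        "lms_infrastructure_uptime",
        "course_alignment_with_guidelines",
        "faculty_engagement",
    ],
    "micro": [  # individual courses, students, faculty, staff
        "instructional_design_impact",
        "technology_use_by_students_faculty_staff",
        "support_service_usage",
    ],
}

def metrics_for(level: str) -> list[str]:
    """Return the metrics collected at a given evaluation level."""
    return EVALUATION_LEVELS[level]

for level in ("mega", "macro", "micro"):
    print(f"{level}: {', '.join(metrics_for(level))}")
```

Keeping the level explicit makes it easy to verify that data collection covers all three levels rather than clustering at the micro level.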

Page 12: Evaluating Quality of Online Programs

• Planning
• Timeline
• Participation
• Communication, communication, communication
• Conduct the evaluation
• Plan for interventions

Page 13: Monitoring the Online Program

• Create a dashboard
  • Seven to ten metrics
• Method for reporting (communicating)
• Data collection periods
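To make the dashboard idea concrete, here is a minimal sketch, assuming a simple traffic-light status rule; the metric names, values, and thresholds are invented for illustration and do not come from the presentation:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One dashboard metric: what is measured, the target, and how often."""
    name: str
    value: float
    target: float
    collection_period: str  # e.g. "per term", "monthly"

    def status(self) -> str:
        # Hypothetical traffic-light rule: green at or above target,
        # yellow within 10% of it, red otherwise.
        if self.value >= self.target:
            return "green"
        if self.value >= 0.9 * self.target:
            return "yellow"
        return "red"

def report(metrics: list[Metric]) -> None:
    """A simple reporting (communicating) method: one line per metric."""
    for m in metrics:
        print(f"{m.status():6} {m.name}: {m.value} "
              f"(target {m.target}, collected {m.collection_period})")

# Example dashboard with invented values; a real one would carry 7-10 metrics.
report([
    Metric("online enrollment", 940, 1000, "per term"),
    Metric("LMS uptime %", 99.7, 99.5, "monthly"),
    Metric("course evaluation mean", 4.1, 4.0, "per term"),
])
```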

Page 14: Example of Data Collection Schedule

Page 15: Building a Dashboard

• Elements of organizational performance:
  • Enrollment goals
  • Teaching and learning goals
  • Graduation rates
  • Employment outcomes
  • Technological metrics: uptime, types of support calls
  • Quality of teaching and learning
  • Faculty engagement
  • Faculty training and participation
  • Student evaluation survey data
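Before building the dashboard itself, it can help to inventory these elements. The sketch below maps each element to a data source and a collection period; all sources and periods are illustrative assumptions, not Lawrence Tech's actual arrangements:

```python
# Hypothetical inventory: dashboard element -> (data source, collection period).
# Sources and periods are illustrative assumptions only.
DASHBOARD_ELEMENTS = {
    "enrollment goals":            ("registrar reports",          "per term"),
    "teaching and learning goals": ("assessment of learning",     "per term"),
    "graduation rates":            ("institutional research",     "annual"),
    "employment outcomes":         ("alumni/career survey",       "annual"),
    "uptime":                      ("LMS monitoring",             "monthly"),
    "support call types":          ("help desk ticket system",    "monthly"),
    "faculty training":            ("eLearning Services records", "per term"),
    "student evaluation surveys":  ("course survey system",       "per term"),
}

for element, (source, period) in DASHBOARD_ELEMENTS.items():
    print(f"{element:30} {source:28} {period}")
```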

Page 16: Dashboard Examples

Page 17: Lawrence Tech Experience

• Sloan-C
  • Evaluation of online program organization
  • Self-assessment
• Baldrige Education Performance Excellence
  • Evaluation of the education organization
  • Assessed by Baldrige evaluators
• Blackboard Exemplary Course Rubric
  • Evaluation of course development
  • Self-assessment
• Quality Matters
  • Evaluation of course development
  • Assessed by qualified evaluators

Page 18: Lawrence Tech Experience

• Operational quality
• Course quality
• Course delivery quality
• Documenting standards
• Identifying metric requirements
• Adopting industry standards:
  • Sloan-C
  • Blackboard Exemplary Course Rubric
  • QM course design
  • Baldrige (future)

Page 19: Getting Started

• Why do this?
• What will you do with the data?
• Benchmarking
• Building the team
• Confirming the plan
• Collecting data: deciding what data
• Reporting results
• Engagement across campus services

Page 20: Confirming

• Monitoring schedule
• Reinforcement of quality measures
• Integration
• Policies and practices for monitoring, evaluating, and assessing
• Managing and planning for change
• Oversight

Page 21: Change

• Be an agent of change
• Look through the lens of students, employers, accrediting bodies, and stakeholders
• Define critical success practices

Page 22: Timeline

Page 23: Conclusion

• Confirm metrics
• Begin program evaluation
  • Sloan-C
• Develop the dashboard
• Report
• Refine
• Apply interventions

Page 24: Dashboard Data

Page 25: Dashboard Data

Page 26: Discussion

"There is nothing wrong with change, if it is in the right direction." -- Winston Churchill

Page 27: References

• Cokins, G. (2008, April 3). How are balanced scorecards and dashboards different? Information Management. Retrieved April 12, 2014, from http://www.information-management.com/news/10001076-1.html?zkPrintable=true
• Cowan, K. (2013, December 15). Higher education's higher accountability. Accreditation and Standards, Winter(2014). Retrieved from http://www.acenet.edu/the-presidency/columns-and-features/Pages/Higher-Education%27s-Higher-Accountability.aspx
• Dessinger, J. C., & Moseley, J. L. (2004). Confirmative evaluation: Practical strategies for valuing continuous improvement. San Francisco, CA: John Wiley & Sons.
• Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston, MA: Pearson Education.
• Frigo, M. (2012). The balanced scorecard: 20 years and counting. Strategic Finance, 49-53.
• Griggs, V., Blackburn, M., & Smith, J. (2012). The educational scorecard: The start of our journey. The Electronic Journal of Business Research Methods, 10(2), 121-131.
• Guerra-López, I. (2007). Evaluating impact: Evaluation and continual improvement for performance improvement practitioners. Amherst, MA: HRD Press.
• Hell, M., Vidačić, S., & Garača, Ž. (2009). Methodological approach to strategic performance optimization. Management, 14(2), 21-42.
• Hughes, K. E., & Pate, G. R. (2013). Moving beyond student ratings: A balanced scorecard approach for evaluating teaching performance. American Accounting Association, 28(1), 49-75.
• Kaufman, R., Guerra, I., & Platt, W. A. (2006). Practical evaluation for educators: Finding what works and what doesn't. Thousand Oaks, CA: Corwin Press.
• Kaufman, R., Oakley-Browne, H., Watkins, R., & Leigh, D. (2003). Strategic planning for success: Aligning people, performance, and payoffs. San Francisco, CA: Jossey-Bass/Pfeiffer.
• Kesler, G., & Kates, A. (2011). Leading organization design: How to make organization design decisions to drive the results you want. San Francisco, CA: Jossey-Bass.
• Laureate Education, Inc. (Producer). (2011a). Assessment and accountability in education: Dashboards, part 1. Baltimore, MD: Author.
• Laureate Education, Inc. (Producer). (2011b). Assessment and accountability in education: Dashboards, part 2. Baltimore, MD: Author.
• Popham, W. J. (2008). Transformative assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
• Shelton, K. (2010). A quality scorecard for the administration of online education programs: A Delphi study. Journal of Asynchronous Learning Networks, 14(4), 36-62.
• Shelton, K., & Saltsman, G. (2005). An administrator's guide to online education. USDLA Book Series on Distance Learning.
• The Sloan Consortium (2012). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group and Quahog Research Group. Retrieved from http://sloanconsortium.org/publications/survey/changing_course_2012
• U.S. Department of Education, NCES (2011, October 5). Learning at a distance: Undergraduate enrollment in distance education courses and degree programs (NCES 2012-154). Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2012154
• United States Government Accountability Office (2011). Higher education: Use of new data could help improve oversight of distance education (GAO-12-39). Retrieved from http://www.gao.gov/assets/590/586340.pdf
• U.S. News & World Report (2014, January 7). Online education. Retrieved from http://www.usnews.com/education/online-education