Evaluation of Respondus LockDown Browser Online Training Program

Angela Wilson
EDTECH 505-4173
August 4, 2013


Table of Contents

Learning Reflection
Executive Summary
Purpose of the Evaluation
  Purpose
  Evaluation Questions
  Stakeholders
Background Information
  Rationale for RLDB Training Program
  RLDB Training Program
    Definition of Respondus LockDown Browser
    Program Objectives
    Program Personnel
    Previous Programs
    Program Characteristics
Description of Evaluation Design
  Evaluator's Program Description
  Evaluation Design
Results
Discussion of Results
Conclusions and Recommendations
References
Appendices
  Evaluator's Program Description (original)
  e-Learning Services Unit Action Plan
  Interview Transcript
  Respondus LockDown Browser Training
  Help Desk Ticket
  Feedback Forms

 


Learning Reflection

I've learned so much about the evaluation process it's hard to know where to begin my reflection. I basically went from "evaluation is not the same as assessment and testing, but I don't really know what it involves" to gaining a solid understanding that evaluation is a complex process involving specific elements of pre-planning, data collection, analysis and reporting, and drawing conclusions to help inform decision making about a program, process, or product. I've developed a better understanding of formative and summative evaluation strategies, and I plan to utilize both in an upcoming evaluation I've been tasked with conducting over the 2014 spring semester regarding training in the use of the Quality Matters standards for online course design. After taking this course, I feel confident I'll actually be able to do this!

It turns out I had a lot of incorrect notions about research and evaluation, data collection, and interpreting results. Prior to this course, I understood "research" to be only those activities conducted in highly controlled environments by scientists in lab coats in an effort to "prove" that X causes Y. I now realize this to be an incomplete and partially incorrect understanding of research. Making observations of things happening in the real world is research, too. (As are many other activities that I won't take the space to spell out here.) Neither research nor evaluation can "prove" X causes Y, but both can help us identify X factors that make a strong case for why Y happened. I also had the false impression that quantitative data were the only data useful to an evaluation, and that statistical significance was necessary for results to be meaningful. Of course, now I realize that qualitative data are just as useful in an evaluation, and that practical significance is just as important (if not more so, in some cases) as statistical significance. If the results of data collection can't help you make a decision about something in the real world, then what was the point of doing the evaluation?

This course definitely provided me opportunities to develop skills in support of standard 5 of the AECT standards for competent educational technologists. Standard 5.1, Problem Analysis, is defined as the ability to "collect, analyze, and interpret data to modify and improve instruction." In order to conduct this evaluation report, I had to identify a program to evaluate and determine questions I might ask that would result in data useful for making decisions about the program. I collected data in various forms and analyzed these data in relation to the goals of the evaluation. I then made suggestions for improving the program based on this analysis. Standard 5.3, Formative and Summative Evaluation, requires candidates to "integrate formative and summative evaluation strategies and analyses into the development and modification of instruction." I engaged in summative evaluation in this evaluation report by gathering existing data on a training program in order to evaluate the program in its current state. Hopefully, the suggested changes will be made to the training program and a second cycle of evaluation that includes formative measures can be conducted.


Executive Summary

This report evaluates the Respondus LockDown Browser (RLDB) training program for online teachers at Lewis-Clark State College (LCSC). The RLDB is a secure browser for delivering tests in online courses. The training program is a 19-minute video tutorial delivered online via the college's Professional Development Training website. Participants complete a short quiz to verify completion of the training, and an e-Learning Services support staff member checks the participant's submitted answers against a key. The training program was initiated in the spring semester of 2012 and has since been completed by 12 online instructors at LCSC. The RLDB tool is in use in 42 unique online sections as of the spring 2013 semester.

The RLDB training program was evaluated for cost effectiveness, training effectiveness, and teacher satisfaction with the training. Data were collected on the cost of the software license, teacher pay rates, completion rates, the number of courses using RLDB, help desk support records, and training development and implementation. In addition, six participant feedback surveys about the online training were available.

At the end of the spring 2013 semester, the cost per course to implement the RLDB was $36. It cost an average of $15 in wages per teacher to complete the training, with a total of $180 invested in training to date. The training materials were found to align with the training assessment measures. With regard to training effectiveness and participant satisfaction, all participants reported an increase in their knowledge about RLDB, their skill in using RLDB, and the usefulness of the RLDB tool to their jobs after completing the training. Participants also reported satisfaction with the overall training, the specific training materials, and the trainer's presentation.

Despite the participants' positive attitudes about the training, the evaluator recommends adjusting the training objectives to more directly assess participants' ability to correctly configure the RLDB tool. The evaluator also recommends examining the ways in which the RLDB tool and the training are advertised to online teachers at LCSC to see if participation in the training and use of the RLDB tool can be increased in coming semesters, making the tool more cost effective to license.


Purpose of the Evaluation

Purpose
As part of its mission, the e-Learning Services department at Lewis-Clark State College (LCSC) offers training and support for the Blackboard Learn Course Management System (CMS) and related technologies. This includes live hands-on training workshops and asynchronous training delivered via the institution's Professional Development Training (PDT) website. When a license for a third-party tool such as Respondus LockDown Browser (RLDB) is purchased by the college and integrated into the Blackboard CMS, it falls to e-Learning Services to provide training for instructors in the use of the new tool. Tools that require a yearly subscription fee, such as RLDB, are subject to periodic review to determine if the subscription should be continued. This evaluation report looks at tool usage, effectiveness of training, and cost in an effort to guide decisions about the continued use of the tool.

Evaluation Questions
This evaluation seeks to answer the following questions:

• How many instructors have completed the training since the institution first licensed the RLDB tool?
• In how many courses is the tool being used?
• Is the RLDB correctly configured in the courses in which it is in use?
• Do the stated goals of the training align with the assessment measures, and are these training goals being met?
• Do participants perceive the training to be effective in terms of teaching them the skills they need to implement the RLDB tool?
• Is the software cost effective based on training requirements and the number of courses in which it is in use?

Stakeholders
Director of e-Learning Services: The director is responsible for authorizing the continued licensing of the RLDB software. She will be interested in the results of the evaluation report to help guide her decision to renew the software license for another cycle.

LCSC Instructors: The online instructors will be impacted if the director of e-Learning Services chooses not to renew the RLDB license based on the evaluation results. Instructors would have to seek alternative methods for securing their online tests, possibly through local division funds. To that effect, division chairs also have a stake in the evaluation. Currently, the money for test security software comes out of the e-Learning Services department's budget.

e-Learning Instructional Designer: The e-Learning ID designed the RLDB training. If it is decided that the training needs to be modified in an attempt to increase tool use or improve participant satisfaction, the e-Learning ID will be responsible for making changes to the training.


Background Information

Rationale for RLDB Training Program
LCSC's e-Learning Services department oversees online courses delivered through the Blackboard course management system. As part of LCSC's accreditation process, online instructors are required to arrange for midterm and final exams to be proctored if the exams are delivered through Blackboard. Other tests delivered in Blackboard may also be proctored at the instructor's discretion. To take proctored exams, students must come to e-Learning Services on the LCSC campus, one of six LCSC outreach centers in Idaho, or other approved proctor locations as determined by e-Learning Services. Since the 2011 fiscal year, LCSC e-Learning Services has overseen proctoring services for an average of 2,100 exams each semester (excluding summer semesters).

With such a high volume of test takers requiring proctoring services throughout a given term, it can be difficult for proctoring center personnel to observe all student activity on the testing computers at all times. As a result, e-Learning Services reports between thirty and forty incidents per semester, across all proctoring sites, of students accessing websites or other restricted computer programs outside of the Blackboard test environment during proctored exams.

In response to instructor concerns about the security of their online tests, e-Learning Services licensed the Respondus LockDown Browser at the start of the spring 2012 semester. The license costs $3,000 per year and is paid for out of local funds from e-Learning Services. The license allows for unlimited download and installation of the RLDB on computers at proctor sites approved by e-Learning Services. To enable a Blackboard test to require use of the RLDB, an instructor must enter unique settings and a password using the RLDB "dashboard" inside the Blackboard course environment. Blackboard also contains a built-in password setting for tests that can conflict with the RLDB settings if not properly configured. Because of this potential conflict, the director of e-Learning Services mandated that online instructors complete training in the use of the RLDB tool before the tool is enabled and supported in their courses.

RLDB Training Program

Definition of Respondus LockDown Browser
Respondus (2013) describes the software as follows:

    Respondus LockDown Browser™ is a custom browser that locks down the testing environment within Blackboard, ANGEL, Desire2Learn, Canvas, Moodle, and Sakai. When students use Respondus LockDown Browser they are unable to print, copy, go to another URL, or access other applications. When an assessment is started, students are locked into it until they submit it for grading.

Students must open the RLDB and log in to their Blackboard course using this browser before they can start a RLDB-secured test. If a student attempts to open a RLDB test in a different browser, she will receive a message notifying her that use of the RLDB is required. The proctor must enter the RLDB password to initiate the test for the student. Once initiated, the student cannot access any websites, any other areas of the Blackboard course, or any programs on the computer unless they are "white listed" in the exam settings.

Program Objectives
The objectives of the RLDB training are as follows:

• Define LockDown Browser.
• Offer reasons to use LockDown Browser.
• Identify some best practices in the use of LockDown Browser.

Program Personnel
The Instructional Designer (ID) for e-Learning Services was tasked with developing instructor training for the RLDB. During the 2012 spring semester, the ID offered three live, one-hour training workshops for the RLDB. A total of three instructors attended one live session; the other two sessions had no attendees. Due to the lack of attendance and instructor requests to have access to training at their convenience, the ID was tasked with creating an online training program. The ID developed the online training program after the midterm point of the spring 2012 semester. According to the ID, he invested approximately two hours planning and developing the online training.

The Blackboard Administrator for e-Learning Services reviewed the content of the online training program developed by the ID. The Blackboard Administrator reported investing approximately half an hour reviewing the training materials and communicating with the ID before giving his approval.

Previous Programs
The three live workshops were the only training conducted prior to the development of the online training program. No data regarding the effectiveness of the live training or participant opinions of it were collected at any point. According to the e-Learning Services ID, instructor word of mouth about the desire for online training prompted the change in the training delivery model.

Program Characteristics
The RLDB training program is housed on LCSC's Professional Development Training (PDT) website and is available to all faculty, adjunct faculty, and staff employed by LCSC. The training includes a downloadable PDF containing very basic steps for using the RLDB tool and a 19-minute tutorial video. Upon watching the tutorial, participants are asked to respond to the following questions and submit their answers via email to the LCSC Blackboard Help Desk for verification.

1. What is Respondus LockDown Browser?

2. Why might you use LockDown Browser for a test?

3. Which of the following are best practices with LockDown Browser?
   A. Send Distance Learning the Respondus-generated password
   B. Vet the test with a student account
   C. Choose not to set a password in the Respondus dashboard
   D. Enter links into the question field, if you want students to view them
   E. Check the "Respondus LockDown Browser" box on the Proctored Exam form

When asked about the process for verifying participant completion, the e-Learning ID indicated he verifies the correctness of the answers and, if necessary, follows up by email with any participants who answered incorrectly. In the email, he would indicate the incorrect answers, supply the correct ones, and ask the participant whether they have any additional questions or concerns about configuring the tool. If the participant requested additional help, he would schedule a one-on-one appointment to answer questions and review the RLDB configuration process hands-on. To date, all participants who have completed the RLDB training have correctly answered all questions.

After a participant has completed the online training, the e-Learning ID notifies the PDT program manager, Julie Crea, of the participant's successful completion. Crea then emails the participant a link to complete the online evaluation form used to evaluate online trainings offered through the PDT website.

 


Description of Evaluation Design

Evaluator's Program Description

Program Objectives
The following objectives are listed in the description of the online training:

1. Define LockDown Browser.
2. Offer reasons to use LockDown Browser.
3. Identify some best practices in the use of LockDown Browser.

Activities to Observe
• Configuration of RLDB tests inside Blackboard courses
• Presence of help desk tickets related to use of RLDB

Data Sources
• Feedback forms completed by training participants
• Completion records (numerical count of instructors who have completed the training)
• Data gathered from Blackboard on the number of courses using the RLDB
• Blackboard help desk tickets of reported user errors encountered when implementing RLDB
• Informal interview with the Director of e-Learning Services
• Informal interview with the e-Learning Services Instructional Designer
• Idaho Statesman Idaho Salaries Database

Population Sample
• All instructors who have completed the online RLDB training

Data Collection Design
• Gather feedback form data during evaluation process
• Gather completion data during evaluation process
• Gather data on number of courses using RLDB during evaluation process
• Gather help desk ticket data during evaluation process
• Conduct informal interviews

Data Analysis
• Compare pre-training and post-training ratings from feedback forms
• Report numerical data for completion
• Report numerical data for courses using RLDB
• Descriptive analysis of help desk tickets (I may be able to categorize the nature of help desk requests and graph incidents by "issue.")
• Compare ticket issues to training objectives and describe disparities

Audience
• e-Learning Services Director
• e-Learning Services Instructional Designer


Evaluation Design
This is a summative evaluation based on the decision-making model, employing systems analysis and goals-based methods of data collection and analysis.

According to Boulmetis and Dutwin (2011), the decision-making model is used to make decisions regarding the future use of a program and is "well suited to summative evaluation" (p. 107). Data collection occurred at the end of the program cycle and is limited to existing records and interviews. The "big picture" question being evaluated in this report is as follows:

• Is the software cost effective based on training development, completion requirements, and the number of courses in which it is in use?

To determine cost effectiveness, I asked the e-Learning Services director about the cost of the RLDB per year. I queried the Idaho Statesman Idaho Salaries database (http://tools.idahostatesman.com/salaries/) using the search parameters "Faculty" and "Lewis-Clark State College" to determine the average hourly pay for faculty at LCSC. Lastly, I asked the e-Learning ID about the average time investment expected of participants completing the RLDB online training, the amount of time he spent developing the training, and his salary.

Systems analysis is used to examine a program's efficiency in terms of input, throughput, and output. It looks at elements such as the number of participants entering the program, their performance, and the results of the program in terms of completion rates and changes in participant attitudes and behaviors. This model can be used to determine if a program is achieving its goals in an efficient manner (Boulmetis & Dutwin, 2011). Systems analysis was used to answer the following questions in this evaluation:

• How many instructors have completed the training since the institution first licensed the RLDB tool?
• In how many courses is the tool being used?
• Is the RLDB correctly configured in the courses in which it is in use?

As part of the data collection process, I asked the e-Learning ID for data on completion rates and courses using the RLDB. He provided me with a copy of the "Respondus LockDown Browser Approved List," which identifies the instructors who have completed the training and the courses in which the RLDB tool is being used. I conducted a short interview with the e-Learning Services director for background information about the RLDB. She also provided me with a copy of e-Learning Services' "Unit Action Plan," which contains data on test proctoring. I also accessed the "Bugnet" ticketing system used to track technical support issues and searched for tickets related to the RLDB tool.

This evaluation also involved goals-based analysis of the data collected. According to Boulmetis and Dutwin (2011), goals-based evaluation requires foreknowledge on the evaluator's part of the evaluated program's goals or objectives, and examines whether or not the participants have met the stated objectives of the program by the end of the program. Goals-based evaluation was used to answer the following questions in this evaluation:



• Do the stated goals of the training align with the assessment measures, and are these training goals being met?

• Do participants perceive the training to be effective in terms of teaching them the skills they need to implement the RLDB tool?

To evaluate the goals of the RLDB training program, I asked the e-Learning ID for copies of the participant feedback forms completed for the online training. I also asked the e-Learning ID about the development and implementation of the training, and whether or not any participants had failed to complete the training objectives. I accessed the online training program and retrieved a copy of the program’s objectives and the assessment measures used to determine participants’ attainment of the objectives.

 


Results

Figure 1. Respondus LockDown Browser Approved List

There are currently 12 approved users of the Respondus LockDown Browser out of 169 instructors who teach online for LCSC (headcount retrieved from http://www.lcsc.edu/e-learning/headcounts/index.htm). As of spring 2013, the tool has been implemented in 42 unique course sections. e-Learning Services oversees an average of 450 unique course sections per semester.

Figure 2. Use by semester

The RLDB was implemented in nine courses in the spring semester of 2012. It was implemented in nineteen additional courses in the fall semester of 2012, for a cumulative total of 28 courses. By the end of the 2013 spring semester, the RLDB had been implemented in fourteen more courses, for a cumulative total of 42 courses.


Figure 3. Cost per course per semester

Semester    Number of Courses    Cost per Course
SP2012      9                    $167
FA2012      28                   $54
SP2013      42                   $36

Based on a $3,000 per year license fee ($1,500 per semester), the cost per course for the 2012 spring semester was $167. The cost per course dropped to $54 for the 2012 fall semester. The spring 2013 cost per course was $36.

Training Costs
The e-Learning ID reported the expected completion time for the online training to be approximately 45 minutes. This estimate includes accessing the training, viewing the instructional video (18 min 56 sec), and answering the training questions. According to the Idaho Statesman Idaho Salaries database, the modal hourly wage for LCSC instructors is $20.00. The resulting cost per participant for completing the training is $15.00. To date, the total cost of instructor training is $180.00.

Employee Cost
The e-Learning ID reported spending two hours planning and developing the training program. His hourly wage at the time was $13.25, for a one-time cost of $26.50 to develop the training.
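The figures above reduce to simple arithmetic. The following is a minimal sketch reproducing those calculations; all dollar amounts and counts are taken from this report, and the variable names are illustrative only.

```python
# Sketch of the cost arithmetic reported above. All figures come from this
# report; variable names are illustrative.

LICENSE_FEE_PER_YEAR = 3000                      # RLDB site license
LICENSE_FEE_PER_SEMESTER = LICENSE_FEE_PER_YEAR / 2

# Cumulative count of course sections using RLDB at the end of each semester
courses_using_rldb = {"SP2012": 9, "FA2012": 28, "SP2013": 42}

for semester, courses in courses_using_rldb.items():
    print(f"{semester}: ${LICENSE_FEE_PER_SEMESTER / courses:.0f} per course")
# SP2012: $167 per course; FA2012: $54 per course; SP2013: $36 per course

# Instructor training cost: 45 minutes at the $20/hr modal faculty wage
cost_per_participant = (45 / 60) * 20            # $15.00 per participant
total_training_cost = 12 * cost_per_participant  # 12 completers -> $180.00

# One-time development cost: 2 hours of the ID's time at $13.25/hr
development_cost = 2 * 13.25                     # $26.50
```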

Help Desk Tickets
There was one help desk ticket related to the Respondus LockDown Browser.

Ticket description: "The Respondus Lockdown browser doesn't seem to be 'locking' students out of tests. 2012FA-NU-341 Exam 1 (located on week 4) is a respondus-configured test, but if the instructor enters the password in a non-respondus browser the system will let the student in to take the test."


Figure 4. Program Objectives and Assessments Comparison

Objective: Define LockDown Browser.
Assessment measure: What is Respondus LockDown Browser?

Objective: Offer reasons to use LockDown Browser.
Assessment measure: Why might you use LockDown Browser for a test?

Objective: Identify some best practices in the use of LockDown Browser.
Assessment measure: Which of the following are best practices with LockDown Browser?
  A. Send Distance Learning the Respondus-generated password
  B. Vet the test with a student account
  C. Choose not to set a password in the Respondus dashboard
  D. Enter links into the question field, if you want students to view them.
  E. Check the "Respondus LockDown Browser" box on the Proctored Exam form.

Program Assessment Results
The e-Learning ID reported that all participants who completed the online training assessment answered the assessment items correctly.

Figure 5. Participant Feedback About Training Effectiveness

On a scale of 1 to 5, with 5 being the highest, the average participant rating of overall knowledge of RLDB went from 2.2 before training to 4.6 after training, an average increase of 2.4 points. The average rating of participants' skill set for using the RLDB went from 2.7 before training to 4.5 after training, an average increase of 1.8 points. The average rating of the usefulness of RLDB for participants' jobs went from 3.2 before training to 4.7 after training, an average increase of 1.5 points.

Figure 6. Satisfaction with Training Content and Presentation

On a scale of 1 to 5, with 5 being the highest, the average score for satisfaction with overall content was 4, the average for satisfaction with course materials was 3.8, and the average for satisfaction with the presentation of the training was 3.8.
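The gains in Figure 5 are simple differences of means. A small sketch using the mean ratings reported above (the six raw feedback forms are not reproduced here, so the reported category means stand in for the raw data):

```python
# Sketch of the pre/post comparison above, using the reported mean ratings
# (1-5 scale, n = 6 feedback forms); raw form data are not listed here.

reported_means = {
    "knowledge of RLDB": (2.2, 4.6),
    "skill using RLDB":  (2.7, 4.5),
    "usefulness of RLDB": (3.2, 4.7),
}

for category, (pre, post) in reported_means.items():
    print(f"{category}: {pre} -> {post} (mean gain {post - pre:+.1f})")
# Gains: +2.4, +1.8, and +1.5 points, matching the figures reported above.
```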


Discussion of Results

How many people have completed the RLDB training? In how many courses has the RLDB tool been correctly configured?

The RLDB training program was implemented in the spring semester of 2012. Since then, only twelve instructors out of the 169 who teach online have completed the training program. The RLDB tool was in use in 42 unique course sections out of approximately 430 unique sections for the spring 2013 semester. This is approximately 7% of online instructors and just under 10% of courses for the most recent term for which data were available.
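As a quick check of these adoption rates (a sketch; the counts come from this report):

```python
# Adoption rates implied by the counts above; counts come from this report.
instructors_trained, online_instructors = 12, 169
sections_with_rldb, spring_2013_sections = 42, 430

print(f"{instructors_trained / online_instructors:.1%} of online instructors")    # 7.1%
print(f"{sections_with_rldb / spring_2013_sections:.1%} of spring 2013 sections") # 9.8%
```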

According to the Blackboard Help Desk ticketing system, only one instructor reported a problem with configuring the RLDB for an exam since the spring 2012 semester. The staff member who replied to the ticket included a copy of the corrective measures sent to the instructor in the ticket comments, and the absence of follow-up comments suggests the instructor was able to correct the RLDB configuration error. The data suggest the tool is correctly configured in all courses in which it is in use.

Do the stated goals of the training align with the assessment measures, and are these training goals being met? Do participants perceive the training to be effective in terms of teaching them the skills they need to implement the RLDB tool?

An analysis of the program objectives and the assessment measures provided in the training indicates that the two align. The e-Learning ID, the person responsible for evaluating the correctness of participant responses to the training questions, reported that all participants to date have correctly answered the training questions. Thus, it appears that participants are meeting the training objectives.

There were six feedback forms submitted for the online training program. Quantitative data from these forms indicate participants were satisfied with the overall content, course materials, and presentation of the training. Participants rated these categories on a scale of 1 to 5, with 5 being the highest; the average satisfaction scores were 4, 3.8, and 3.8, respectively. Participants were also asked to rate themselves, before and after the training, in terms of knowledge about the tool, skill with using the tool, and usefulness of the tool for their job. The feedback data show an increase in all three areas after training, with the averages rising by 2.4, 1.8, and 1.5 points, respectively. These results suggest that participants perceived the training to be effective at teaching them the skills they need to correctly configure the RLDB tool.

One piece of qualitative data supports the quantitative results regarding perceived training effectiveness. One participant wrote on the feedback form, "If you plan to use the Respondus LockDown Browser, you definitely want to attend this PDT. It will help you avoid making a lot of mistakes when setting up a test and informing students how to use the RLD browser."


What costs are associated with the training program and the RLDB tool?

There are several costs associated with the RLDB tool, including cost of training development, the software licensing fee, and cost related to instructor completion of the training. The following table summarizes these costs.

• Development of training by the e-Learning ID (2 hours at $13.25/hr): one-time cost of $26.50
• RLDB software license fee: $3,000 per year
• Cost per course for software, as of the 2013 SP semester: $36
• Instructor training (assuming a 45-minute completion time and an average hourly rate of $20): $15 per participant, $180 total to date

 

 


Conclusions and Recommendations

Immediate Conclusions

• Without access to the e-Learning Services budget, it is uncertain whether the $36-per-course cost to license the RLDB software warrants renewal in the coming fiscal year. That said, I would suggest requiring the use of the RLDB in all online courses for high-stakes midterm and final exams. This would reduce the cost per course, increasing the financial appeal to the director of e-Learning Services of renewing the software. It would also more fully address the concerns instructors reportedly have about test security in online courses.

• It is difficult to draw conclusions of any practical significance with only six responses to the feedback survey. I would suggest that the requirements for being allowed to use RLDB in a course be expanded to include mandatory completion of the survey so the evaluation for the next software license renewal cycle contains more data to analyze.

• While the feedback survey provides participants with the opportunity to rate their experience with the training, I feel the use of a generic survey form is not as effective at gauging participant attitudes as would be a survey tailored specifically to the RLDB training. I suggest the training developer create his own feedback survey for use with participants, using prompts that mirror the specific language of the training objectives.

Long-Range Planning

• Despite the seeming success of the training in terms of teaching people how to use the RLDB tool, I must express concern about the apparent inadequacy of the training objectives. The objectives focus on defining the tool and its uses, but no objective requires participants to actually demonstrate proper use of the tool. I suggest reworking the training objectives and materials to include a performance assessment measure. This could be something as simple as participants accessing a "sandbox" course and correctly configuring an exam in that course to use the RLDB tool. While this would increase the support burden for e-Learning Services in terms of verifying the correctness of participant performances, it would be a more authentic measure of their ability to use the tool.

• To increase use of the RLDB tool, I would suggest examining the methods used by e-Learning Services to promote awareness of the RLDB tool and the necessary training. It’s possible online instructors are simply unaware the tool exists, or they are unaware of what training is available.

Evaluation Insights
Upon completing this evaluation, I was surprised at how few people had actually completed the training. I was further surprised that only half of this already small number of participants completed the feedback survey. I can't help but feel the quantitative results are somewhat meaningless in terms of drawing practical conclusions about the effectiveness of the training. Rather than providing useful data about the program's training effectiveness, the results (or lack thereof) seem to indicate a weakness in promoting the existence of the RLDB and the availability of training. It would have been beneficial to investigate how the training program is promoted on the LCSC campus and how it is promoted to off-site instructors.


References

Boulmetis, J., & Dutwin, P. (2011). The ABCs of evaluation. San Francisco, CA: Jossey-Bass.

Respondus (2013). Respondus LockDown Browser. Retrieved from http://www.respondus.com/products/lockdown-browser/

 

 


Appendices  

Evaluator's Program Description (original)

Evaluation of 'Respondus LockDown Browser' Training

The Respondus LockDown Browser (RLDB) prevents students from navigating to external websites or accessing other programs while taking an exam in a Blackboard course environment. The eLearning Services department requires faculty to complete RLDB training prior to configuring the tool for use with exams delivered through Blackboard. This training is offered asynchronously online through the institution's professional development training website.

Program Objectives
The following objectives are listed in the description of the online training:

1. Define LockDown Browser.
2. Offer reasons to use LockDown Browser.
3. Identify some best practices in the use of LockDown Browser.

Activities to Observe
• Configuration of RLDB tests inside Blackboard courses
• Presence of help desk tickets related to use of RLDB

Data Sources
• Training evaluation surveys completed by faculty
• Completion records (numerical count of faculty who have completed the training)
• Data gathered from Blackboard on the number of courses using the RLDB
• Blackboard help desk tickets of reported user errors encountered when implementing RLDB

Population Sample
• All faculty who have completed the online training

Data Collection Design
• Gather pre/post participant survey data during evaluation process
• Gather completion data during evaluation process
• Gather data on number of courses using RLDB during evaluation process
• Gather help desk ticket data during evaluation process

Data Analysis
• Compare pretest and posttest ratings
• Report numerical data for completion
• Report numerical data for courses using RLDB
• Descriptive analysis of help desk tickets (I may be able to categorize the nature of help desk requests and graph incidents by "issue.")
• Compare ticket issues to training objectives and describe disparities

Audience
• e-Learning Services director (she is ultimately responsible for the quality of training provided to online faculty)
• e-Learning Services Blackboard Support Technician (creator of the training)

 


e-Learning Services Unit Action Plan

Unit Action Plan (UAP), Fall 2012

Name of Unit: e-Learning Services (Community Programs/Provost)
Reporting Body: Director, e-Learning Services

Mission Description: e-Learning Services (e-L) facilitates delivery of instruction and services to faculty, staff, students, and businesses in the following areas:

• Internet Instruction (100% online instruction)
• Interactive Video Conferencing (traditional campus course using IVC technology)
• Hybrid/Blended Learning Courses (reduced traditional campus component with 30% or more online delivery)
• Lecture-Enhanced Courses (traditional campus course with less than 30% online component enhancement)
• Non-matriculated courses/programs (Workforce Training online courses, Professional-Technical Academy Fundamentals for Health Professions (high school) online courses, as well as SBDC NxLeveL Online)

In addition, e-L Course Management provides administrative support of the campus learning management system (Blackboard), development of online and technology-enhanced courses, faculty/student technical support, and faculty training.

<Extraneous content removed>

Testing Services
The following chart provides information regarding test proctoring from FY11 to current numbers for FY13 (noting that SP13 numbers are unavailable at the time of this report). It is anticipated that test proctoring numbers for FY13 will exceed those of FY12:

[Bar chart: "Proctored Exams by Semester," showing Summer, Fall, and Spring counts for FY11, FY12, and FY13; y-axis ranges from 0 to 3,000 exams]


Interview Transcript

Transcript of interview with the director of e-Learning Services, July 22, 2013

Angela: Why did the college choose to purchase the Respondus LockDown Browser?
Director: Mostly because faculty were worried about the security of their online tests. Especially the Nursing Division.
Angela: Do you get that a lot? People cheating on tests?
Director: I wouldn't say a lot, but it does happen. It's hard to keep an eye on everyone and do everything else in the office. And the testing centers are in the same boat.
Angela: So how many incidents would you say you get in a semester?
Director: Oh, probably thirty or forty. Sometimes more, sometimes less. A lot less for those that are using the Respondus now.
Angela: How much does the RLDB software cost per year?
Director: It's around $3000 a year. It's site-wide so any of our centers can put it on their computers. And students can use it at home, too. Though I don't think any of them do.
Angela: And that came from your department?
Director: Yes. We paid for it, and the whole campus can use it in their online courses. We also do all the training for it, as you know.


Respondus LockDown Browser Training


Help Desk Ticket


Feedback Forms

 
