Generating, Understanding, and Using the Peregrine Academic
Services’ Reports for COMP Exams, APA, and ALC Services
Peregrine Academic Services
www.PeregrineAcademics.com
2/3/2014 Peregrine Academic Services 1
Purpose
The purpose of this report is to help our clients:
– Generate the required reports from the new Client Admin site (updated in October 2013) and select the variables for those reports
– Understand how the data and analyses are presented
– Know the strengths and limitations of the data, both the client data and the aggregate data
– Apply the results for the continuous improvement of the academic program (e.g., learning outcomes evaluation)
Table of Contents
• Available Reports
• Glossary of Key Terms
• Sending/Receiving Reports
I. COMP Exam Reports
A. Client Admin & the Individual Student Data Report
B. Internal Analysis and Benchmarking
1) Internal Analysis Report
2) Executive Summary: Internal Analysis Report
C. External Comparison and Benchmarking
1) External Comparison Report
2) Executive Summary: External Comparison Report
D. Program/Cohort Comparison Report
E. Longitudinal Analysis Report
F. Pairwise Report
II. ALC Reports
A. Client Admin & Raw Data
B. ALC Student Summary Report
III. APA Reports
A. Client Admin & Raw Data
B. APA Student Summary Report
IV. Grade Scale Report
Available Reports
1. Individual Student Data Report. An Excel file with the student-by-student results showing the percent
correct for each topic and subject.
2. Internal Analysis Report. A report of a selected group of exams with the selection of one aggregate
pool at a time with both an analysis of means and an analysis of frequencies.
3. Executive Summary: Internal Analysis Report. An abbreviated summary report of the internal analysis
report.
4. External Comparison Report. A report of a selected group of exams comparing the results against one
or more aggregate pools.
5. Executive Summary: External Comparison Report. An abbreviated summary report of the External
Comparison Report.
6. Program/Cohort Comparison Report. A side-by-side comparison of the results between two or more
academic programs or cohorts of students where there is overlap of topics on the student exams.
7. Longitudinal Analysis Report. A side-by-side comparison of different exam periods (up to 5 exam
periods can be shown on the report).
8. Pairwise Report. Shows student-by-student results when the same students who took the Inbound
Exam also complete a Mid-point or Outbound Exam.
9. ALC Report. A student-by-student summary of the student’s ALC module results.
10. APA Report. A student-by-student summary of the student's APA results.
11. Grade Scale Report. A report based upon the client school’s COMP exam results used to determine a
school-specific grading scale based on percentile scoring.
Glossary of Key Terms
• Aggregate Pools. The aggregate pool is the data set used for external benchmarking and comparisons and is based on
the results from accredited institutions.
• Assessment Period. The date range for the report, which includes all the exams administered within these dates. For
synchronous schools, the assessment period is generally based upon the semester or quarter. For asynchronous
schools, the assessment period is generally annual, semiannual, or quarterly. School officials determine the
assessment period.
• Cohort. A group of students based upon a demographic factor such as degree program, course delivery modality,
campus location, program start date, etc. We provide cohort-level analysis based upon cohort categories identified at
the start of the exam cycle.
• Frequency of Questions Correct. For outbound exams, the frequency of questions correct is calculated for each
subject within a CPC topic. The formula is: (Number of Questions Correct / Number of Questions Offered) * 100. In
order to provide a relative index for understanding these data, an average of questions correct is shown for the aggregate
pool selected for the Analysis Report. To see the comparisons for other pools, the Analysis Report can be re-run with a
different pool selected.
• Inbound Exam. A student exam administered early in the student's program, usually during their first or second core
course, that measures the student's knowledge level at the beginning of their academic program.
• Outbound Exam. A student exam administered at the end of the student's academic program, usually within their last
course, that measures the student's knowledge level at the end of their academic program.
• Percentage Change. The percentage change between two scores. For inbound/outbound testing, the percentage
change is calculated using the following formula: (Outbound Score / Inbound Score) - 1.
• Percentage Difference. The percentage difference between a school's outbound student results and the aggregate,
calculated using the following formula: Aggregate Score – School Score.
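As a quick sketch, the three formulas above can be computed directly; the scores in the usage lines are hypothetical and only illustrate the arithmetic:

```python
# Sketch of the three glossary formulas. The example scores below are
# hypothetical, not taken from any report.

def frequency_correct(num_correct, num_offered):
    """Frequency of Questions Correct: (Correct / Offered) * 100."""
    return (num_correct / num_offered) * 100

def percentage_change(inbound_score, outbound_score):
    """Percentage Change for inbound/outbound testing: (Outbound / Inbound) - 1."""
    return (outbound_score / inbound_score) - 1

def percentage_difference(aggregate_score, school_score):
    """Percentage Difference vs. the aggregate: Aggregate Score - School Score."""
    return aggregate_score - school_score

print(frequency_correct(42, 60))                # 70.0 percent correct
print(round(percentage_change(50.0, 60.0), 4))  # 0.2, i.e. a 20% gain
print(percentage_difference(65.0, 61.5))        # 3.5 points below the aggregate
```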
• Percentile. Percentiles are shown within the subject level analysis based upon the frequency of questions answered correctly. The measure is used to establish relevancy of the school’s score with the selected aggregate pool used for the Analysis Report. The percentile benchmarks indicate to what level an average score is needed in order to be at the 80th, 85th, 90th, or 95th percentile, which school officials can subsequently use for academic benchmarking and for setting performance targets. A percentile rank is the percentage of scores that fall at or below a given score and is based on the following
formula: ((NumValuesLessThanScore + (0.5 * NumValuesEqualScore)) / TotalNumValues) * 100. When shown,
the percentile rank of the school’s exam sample of the subject/subtopic/topic score to the aggregate pool is
based on using exam results within the aggregate pool grouped by school and calculated using samples of 30
exams. The percentile rank is not a ranking based on the number of individual schools included within the
aggregate pool, rather it is a percentile ranking compared to the exam results included within the aggregate
pool.
The percentile benchmark values are calculated using the Empirical Distribution Function with Interpolation
based upon the Excel Function of PERCENTILE.INC (array,k) with the following formula: (n-1)p=i+f where i is
the integer part of (n-1)p, f is the fractional part of (n-1)p, n is the number of observations, and p is the
percentile value divided by 100. The percentile benchmark then is the required score of questions correct to
be at a specific percentile value (80th, 85th, 90th, or 95th) and is based on interpolation.
• Percent Change Comparison. The percent difference between the school's percent change between inbound and outbound exam results and the aggregate pool's percent change between inbound and outbound exam results. The percent change comparison represents a relative learning difference between the specific school and demographically similar schools.
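The percentile rank formula and the PERCENTILE.INC-style interpolation described above can be sketched as follows; the list of scores is hypothetical:

```python
# Sketch of the two percentile formulas from the glossary.
# The score list is hypothetical.

def percentile_rank(scores, score):
    """((NumValuesLessThanScore + 0.5 * NumValuesEqualScore) / TotalNumValues) * 100."""
    num_less = sum(1 for s in scores if s < score)
    num_equal = sum(1 for s in scores if s == score)
    return (num_less + 0.5 * num_equal) / len(scores) * 100

def percentile_benchmark(scores, p):
    """Empirical Distribution Function with Interpolation, as in Excel's
    PERCENTILE.INC(array, k): (n-1)p = i + f, then interpolate between
    the i-th and (i+1)-th sorted values."""
    xs = sorted(scores)
    n = len(xs)
    rank = (n - 1) * p   # p is the percentile value divided by 100
    i = int(rank)        # integer part
    f = rank - i         # fractional part
    if i + 1 < n:
        return xs[i] + f * (xs[i + 1] - xs[i])
    return xs[i]

scores = [40, 45, 50, 55, 60, 65, 70, 75, 80, 85]
print(percentile_rank(scores, 60))                   # 45.0 (4 below + half of 1 equal)
print(round(percentile_benchmark(scores, 0.85), 2))  # score needed for the 85th percentile
```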
• Topic. A broad category of a Common Professional Component (CPC) Area, often associated with a
course or courses within a degree program.
• Subtopic. For the CPCs of Economics and Management, there are identified subtopics. For the CPC
topic of Economics, the subtopics are Macroeconomics and Microeconomics. For the CPC topic of
Management, the subtopics are Operations/Production Management, Human Resource Management,
and Organizational Behavior. NOTE: When analyzing and evaluating the sub-topic scores, the
cumulative totals of the subtopic scores (percentages) will not equal the topic score. The subtopic
scores are based on the number of questions answered correctly for that specific subtopic. For
example, getting 2 out of 3 questions correct for the subtopic of Human Resource Management is a score
of 66.66%, 3 out of 4 correct on Organizational Behavior is 75%, and 1 out of 3 on Operations/Production
Management is 33.33%. The total Management topic score, however, is 2+3+1 = 6 out of 10, or 60%.
• Subjects. For each CPC topic and subtopic, questions are grouped using 4-8 subject areas. Subjects
generally correspond to the school's learning outcomes associated with each CPC topic. In using these
data, think of the Subject as the Learning Outcome without the verb. The school then sets their
specific benchmarks based on the subject-level scores (frequencies) in conjunction with the topic/sub-
topic level scores (means).
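The Management example above can be checked by working the arithmetic directly:

```python
# Working the Management subtopic example from above: the subtopic
# percentages are each out of a different number of questions, so they
# do not average to the topic score.

subtopics = {
    "Human Resource Management": (2, 3),       # (correct, offered)
    "Organizational Behavior": (3, 4),
    "Operations/Production Management": (1, 3),
}

for name, (correct, offered) in subtopics.items():
    print(f"{name}: {correct}/{offered} = {correct / offered * 100:.2f}%")

total_correct = sum(c for c, _ in subtopics.values())  # 2 + 3 + 1 = 6
total_offered = sum(o for _, o in subtopics.values())  # 3 + 4 + 3 = 10
print(f"Management topic: {total_correct}/{total_offered} "
      f"= {total_correct / total_offered * 100:.0f}%")  # 6/10 = 60%
```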
• Inbound/Mid-Point/Outbound Exams. These are COMP
exams that span the topics of the academic degree program.
• Pre/Post Tests. These are tests administered with either an
APA writing style service or an ALC module that cover only
the material included in that specific learning module.
• Course-level Test. Upon request, we can develop specific
course-level (or concentration/specialization-level) tests that
include only one or two topics, usually 40-60 questions in
length.
Aggregate Pools (Used with COMP Exam Reports): The aggregate pool is the data set used for external benchmarking and comparisons.
Pools Based on Program Delivery Modality (for each academic degree level):
1. Traditional. The majority of the program is delivered at a campus location at an established college or university. The majority of the students are recent high school graduates, typically 18-22 years old. Courses are taught on a semester or quarter basis, typically Monday through Friday.
2. Online. The majority of the program is delivered online to students and there is little, if any, requirement for the students to go to a campus location any time during their college or university experience. The majority of the students are considered non-traditional, meaning they tend to be older, may have some college credit prior to starting their program, and are often working adults completing their degree program.
3. Blended. The program is delivered to students using a combination of online and campus-based instruction and/or the program is delivered in an accelerated format. The course term is typically 4 to 8 weeks. Campus-based instruction tends to be either at night or on weekends with generally longer sessions. The student population tends to be non-traditional, meaning they tend to be older, may have some college credit prior to starting their program, and are often working adults completing their degree program.
Pools Based on Location (for each academic degree level) :
1. Outside-US. Includes colleges and universities outside of the United States. Program delivery is usually campus-based; however, the aggregate pool includes some blended programs and online programs.
2. Regional/Country. Includes colleges and universities outside of the United States from specific regions (e.g. Latin America, Europe, Asia, etc.) or from specific countries (e.g. Mongolia). Program delivery is primarily campus-based; however, the pools may include some blended and online course delivery.
3. US. Includes all US-based schools and programs.
Pools Based on Institute Characteristics (for each academic degree level) :
1. Large Private. This aggregate pool includes large, privately owned universities within the United States.
2. HBCU. Includes colleges and universities that are designated as Historically Black Colleges and Universities.
3. Private. US schools that are privately owned.
4. Public. US schools that are government owned.
5. Faith-based. US schools that have a specific religious affiliation or association.
Pools Based on Masters Programs:
1. Masters-MBA. Includes programs that are designated as Master of Business Administration.
2. Masters-MS. Includes programs that are designated as Master of Science.
3. Masters-MA. Includes programs that are designated as Master of Arts.
Pools Based on Dual-Accreditation Affiliation (for each academic degree level):
1. IACBE. Includes business schools and programs affiliated with the International Assembly for Collegiate Business Education. Where available, this pool is further divided by IACBE Region.
2. ACBSP. Includes business schools and programs affiliated with the Accreditation Council for Business Schools and Programs. Where available, this pool is further divided by ACBSP Region.
3. AACSB. Includes business schools and programs accredited by the Association to Advance Collegiate Schools of Business.
Sending/Receiving Reports: Pull or Push?
In general, at the start of each month we will "push" to you the previous month's
raw data (along with the monthly invoice). This is an Excel file that we send you.
Although you can also download this file from your Client Admin site, we push this
report to you each month so that you are aware of the previous month's activity.
In general, you use a “pull” system for the more detailed analysis and summary
reports, meaning you request these reports from us directly or pull these reports
from your Client Admin site. It is difficult for us to push these reports to you because
we don’t always know your exam/cohort/term dates. When requesting a report, be
sure to indicate all of the variables needed to generate the report (e.g., aggregate
pools you want to see, the date ranges, any cohort/program variables, etc.). As the
old saying goes, we can’t hear you think; thus, please let us know if you need a
report and you want us to send the report to you.
I. COMP Exam Reports
A. Client Admin & the Individual Student Results Report
B. Internal Analysis and Benchmarking Report
C. Executive Summary: Internal Analysis Report
D. External Comparison Report
E. Executive Summary: External Comparison Report
F. Program/Cohort Report
G. Longitudinal Report
H. Pairwise Report
Client Admin Site: Variable Selections
Generating a report generally requires 3-5 steps. Each step is sequential and builds upon the previous selection.
1. Select the date range
2. Select the Academic Degree Level
3. Select any Program/Course Options
4. Select one or more comparison aggregate pools
5. Select the output (PDF report or Excel Raw Data)
From the Client Admin Site, you have unlimited report generation capabilities. You can re-run reports based on different sets of responses to variables.
As new reports are added to the Client Admin Site with the SaaS re-write, more selections will be available.
COMP Exam Individual Student Results: Excel File
The report shows all student-by-student results for the selected period based on the selection criteria used when generating the report. These data are often used in conjunction with the school's student database for additional demographic analysis (e.g., comparing results based on age, gender, ethnicity, etc.).
NOTE: When the Business Topic of Management is used, the total management score will not equal the sum of the subtopic scores because the total score is based on X/10 correct, whereas a subtopic score is based on X/3 or X/5 correct. The same applies to Economics with its two subtopics.
Student ID # is used in addition to the student e-mail address for any future pairwise reporting and to allow greater ease for the school to integrate results with the school’s databases.
Internal Analysis Report
COMP Exam Reports
Report Applications: Internal Analysis Report
• The purpose of this report is primarily associated with internal benchmarking; however, one aggregate comparison is shown to help establish the relativity of the data.
• This report is used mostly for evaluations of learning outcomes, using the frequency-correct data for such analyses.
Report Section: Introduction
Within the introduction section, there is content related to how to use and understand the report, tips and techniques for academic analysis, and how to interpret the exam scores. The end of the report includes the glossary of terms.
Report Section: Inbound/Outbound Overview
The first graph is a side-by-side overview of the exam results. If Inbound Exams are included, this graph will display both the inbound and outbound exam averages. The topic averages and the sub-topic averages will both be shown (the topics of Management and Economics include sub-topics).
Report Section: Score/Completion Time Scatter Plots
The scatter plots are sorted by Exam Completion Time (low to high) and plotted as such, with the scores for shorter completion times shown to the left and the scores for longer completion times shown towards the right. The Y axis is exam score and the X axis is completion time. Scatter plots are shown for both Inbound and Outbound Exam results for total, topic, and subtopic.
Report Section: Exam Summary Table
The Exam Summary Table is an overview of each topic/subtopic, showing the percent correct, a comparison with the aggregate data, a percentile rank, and the percentile benchmarks used for learning outcomes evaluation. Similar tables are shown for each topic, with similar subject-level data.
Report Section: Topic/Subtopic Analysis
For each topic/subtopic for both inbound and outbound exams, the reported data include:
1. Inbound Exam/Outbound Exam Score Comparison side-by-side
2. Table of the Assessment Summary Statistics
3. Scatter Plot of the Score (Y Axis) and the Completion Time (X Axis)
4. Bar Graph of the Subjects within the Topic/Subtopic compared to the selected aggregate pool used with the report.
5. A Frequency Analysis Table of the questions offered on the exam.
Report Section: Topic/Subtopic Analysis
The Frequency Analysis Table also shows the percentile rank and the percentile benchmarks based on the selected aggregate pool. For percentile ranking calculations and for the percentile benchmarks shown for the selected aggregate pool, results are subject to sample size limitations. In general, percentile rankings and percentile benchmarks should be used with caution when making programmatic changes based on the results if the sample of Questions Offered for the aggregate pool is less than 300 for a specific subject.
Topic/Subtopic Analysis – Statistical Artifacts
The formulas used for percentile calculations are shown within the glossary of terms. Shown on this slide, however, are two statistical artifacts that could appear on your reports, where the percentile rank seems "off" when compared to the calculated values for the percentile benchmarks.
Artifact #1: Because different formulas are used to calculate the school's percentile rank and the required score for specific benchmarks, the school's rank can appear lower or higher than the required score for a percentile benchmark. When calculating the percentile rank, we use the school's score and simply calculate the percent of scores that are at or below that score. When we calculate the percentile benchmark, we use an interpolation function to determine the required score for a specific percentile. Therefore, we use two different formulas for the percentile values: the first counts how many scores are at or below the given score; the second interpolates to calculate the desired score. Both use the same distribution list of scores, arranged in sequence from low to high. When we developed the distribution tables, we used five decimal places. When we calculated the benchmarks, we also calculated to five decimal places. We show, however, two decimal places in the table.
Artifact #2: Due to sample size limitations and rounding, the school's rank can be less than the required score for a higher percentile benchmark. The lower the number of exams in the pool, the more often these situations will occur. In this example, the school's score is 56.52% and the 85th percentile benchmark is also shown as 56.52. Both calculations are correct; the issue concerns sample size. With only 586 questions offered in the pool, we have a distribution sample of 15 values. When we do the rank calculation (the 81st), it comes out "low" due to the sample size and the values within the distribution. When we calculate the benchmarks (interpolation), the actual 85th benchmark to five decimal places is 56.52377, but rounds to 56.52 in the table. The school's score displays as 56.52, but the full number is 56.52173 (52/92 correct). The school's value is below the benchmark of 56.52% for the 85th percentile, but due to rounding, it looks like the school's score should be at the 85th percentile.
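Artifact #2 can be reproduced with the numbers given above (52 of 92 correct vs. the 56.52377 benchmark):

```python
# Reproducing Artifact #2 with the numbers from the slide: both values
# display as 56.52 at two decimal places, yet the school's full score
# is genuinely below the 85th-percentile benchmark.

school_score = 52 / 92 * 100   # 56.52173..., the school's full-precision score
benchmark_85 = 56.52377        # 85th-percentile benchmark, to five decimal places

print(round(school_score, 2), round(benchmark_85, 2))  # both display as 56.52
print(school_score < benchmark_85)                     # True: still below the 85th
```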
Learning Outcome (LO) Evaluation: Example of a Process
Suppose you had an LO of "MBA graduates will have a comprehensive understanding of business finance and accounting principles." How do you measure the LO using our COMP exam services?
Consider the following steps:
1. Select the exam topic(s) that correspond to the LO functional area(s). In this example, it would be the Business Finance and the Accounting topics.
2. Look at the subject areas within these two topics and identify the key areas (~3-6 subjects) that you believe are important for your LO.
3. Look at the scores (percent correct) for the key areas (subjects) that you selected. What is the percentile rank of the score? Where do you think you should be?
4. Set your benchmark. If you are at or above benchmark, great. If you are below benchmark, consider what steps you might take to improve the score.
There are many ways to use these reports for your LO evaluations. Here are a few tips:
"Subjects" are essentially learning outcomes without the verb. Not every subject will directly correspond to a school's learning outcome, but most will.
Most LOs are not written to be quantifiably measured. What you need to do is identify specific targets (benchmarks) that are quantifiable.
A measurable benchmark might include:
A percent correct score that equals a desired percentile.
A topic-level score that equals a desired percentile.
A topic-level score that is within X% of a comparison aggregate pool score.
A total exam score that is within X% of a comparison aggregate pool score.
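A minimal sketch of the benchmark-checking step; the subject names, scores, and 60% target here are all hypothetical:

```python
# Hypothetical benchmark check: compare each selected subject's percent
# correct against a target score (e.g. the score required for a desired
# percentile). None of these names or numbers come from an actual report.

benchmark = 60.0  # hypothetical target score

subject_scores = {
    "Financial Statement Analysis": 64.3,
    "Cost Accounting": 55.1,
    "Capital Budgeting": 61.8,
}

for subject, score in subject_scores.items():
    status = "at/above benchmark" if score >= benchmark else "below benchmark"
    print(f"{subject}: {score:.1f}% ({status})")
```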
Executive Summary: Internal Analysis Report
COMP Exam Reports
Report Applications: Executive Summary: Internal Analysis Report
• Since the full Internal Analysis Report can be upwards of 150 pages depending upon the exam sets selected, the purpose of the Executive Summary: Internal Analysis Report is to show a short summary of the internal analysis results.
• This report can then be distributed to both internal and external stakeholders.
• The report includes an intro section on how to interpret and use the exam scores and a glossary of terms at the end.
• Data are reported only to the topic/subtopic level (no subject-level analysis).
Report Section: Analysis Summary
If the data include both inbound and outbound exams, there are two sets of data shown: a summary graph and a summary table. You can select one aggregate for comparison purposes and re-run the report as required to see comparisons with other aggregate pools.
External Comparison Report
COMP Exam Reports
Report Applications: External Comparison Report
• The purpose of the external comparison report is to show the school's inbound/outbound exam results compared to one or more selected aggregate pools. Up to 8 aggregate pools can be selected for comparison purposes.
• Comparisons include a comparison of the scores and a comparison of percent change (when inbound exams are included).
Report Section: Introduction
Within the introduction section, there is content related to how to use and understand the report, tips and techniques for academic analysis, and how to interpret the exam scores. The end of the report includes the glossary of terms.
Report Applications: Summary Comparisons
For both inbound and outbound exam sets, overview graphs are provided for both the comparison
of the means and the comparison of percent change (only available with inbound exams).
Different graph sets are provided for each of the selected aggregate pools.
Report Section: Topic/Subtopic Comparisons
Comparisons are shown for each topic and subtopic and include both a comparison of the score means and a comparison of the percent change from inbound to outbound (if inbound exams are included).
Learning Outcome (LO) Evaluation: Example of a Process
Suppose you had an LO of "MBA graduates will have a comprehensive understanding of business finance and accounting principles." How do you measure the LO using our COMP exam services?
Consider the following steps:
1. Select the exam topic(s) that correspond to the LO functional area(s). In this example, it would be the Business Finance and the Accounting topics.
2. Look at the topic-level scores for these two topics. How do your exam results compare to other exam results based on the selected aggregate pool?
3. Set your benchmark. If you are at or above benchmark, great. If you are below benchmark, consider what steps you might take to improve the scores.
There are many ways to use these reports for your LO evaluations. Here are a few tips:
"Subjects" are essentially learning outcomes without the verb. Not every subject will directly correspond to a school's learning outcome, but most will.
Most LOs are not written to be quantifiably measured. What you need to do is identify specific targets (benchmarks) that are quantifiable.
A measurable benchmark might include:
A percent correct score that equals a desired percentile.
A topic-level score that equals a desired percentile.
A topic-level score that is within X% of a comparison aggregate pool score.
A total exam score that is within X% of a comparison aggregate pool score.
Executive Summary: External Comparison Report
COMP Exam Reports
Report Applications: Executive Summary: External Comparison Report
• Since the full External Comparison Report can be upwards of 60 pages, this executive summary report is a condensed version that just shows the overall summaries compared to the different aggregate pools.
• This report can then be distributed to both internal and external stakeholders.
• The report includes an intro section on how to interpret and use the exam scores and a glossary of terms at the end.
• Data are reported only to the topic/subtopic level in summary form (no individual graphs).
Report Section: Executive Summary : External Comparison Report
There are 3 charts in this report. The first is a summary chart showing the inbound/outbound comparisons for all the topics/subtopics included on the assessment. The next two graphs are side-by-side comparisons of the school’s results (mean scores and percent change) with the selected aggregate pools.
Program/Cohort Report
COMP Exam Reports
Report Applications: Program/Cohort Report
• The purpose of this report is to show a side-by-side comparison of exam results for different programs or cohorts of students and then compare these results with one or more selected aggregate pools.
• A "program" is an academic program (e.g., BA in Business Economics). Usually, these are different exams, but they could also be pull-down menus that students use when they start the exam.
• A "cohort" could be any number of areas, including specializations, campus locations, online vs. on-campus students, etc. Students use pull-down menus to indicate their cohort at the start of the exam.
Report Section: Introduction
The Glossary of Terms section describes the terms used within the report.
Report Section: Total Score Summary
A side-by-side comparison of the exam results TOTAL SCORE is shown for each of the selected cohorts/programs. The green line (and data point shown above the graph) is for the selected aggregate pool used for relative comparisons. More than one aggregate pool can be shown.
NOTE: Usually, cohorts will have the same topics on exams and therefore the total score comparison is valid. However, different exams with Programs will often have different topics, and the total score comparison may not be valid (in which case refer to the topic-by-topic comparisons).
Report Section: Topic/Subtopic Analysis
3 of 4
In this example, the exams for each of the 5 compared programs included the topic “Business Ethics”. The green line (and the data point shown above the graph) is for the selected aggregate pool. More than one pool can be selected when generating this report.
Report Section: Topic/Subtopic Analysis
4 of 4
In this example, only 4 out of the 5 selected programs included the exam topic “Business Finance”. Therefore, there are no data for the Information Systems program.
NOTE: Usually, cohorts will have the same topics on exams; however, program-level exams will often have different topics, but with overlap.
Longitudinal Analysis Report
COMP Exam Reports
Report Applications: Longitudinal Analysis Report
• The purpose of this report is to show a side-by-side comparison of exam results for different exam periods (e.g., semester, quarter, year) and then compare these results with one or more selected aggregate pools.
• The user can select up to 5 different exam periods, each defined by a specific date range.
• Aggregate data are available for both Inbound and Outbound exam averages; however, we do not have similar aggregate pools for mid-point exams.
• This report is typically used in accreditation submissions to show data points over time.
Report Section: Introduction
1 of 4
The Glossary of Terms section describes the terms used within the report.
Report Section: Total Score Summary
2 of 4
A side-by-side comparison of the exam results TOTAL SCORE is shown for each of the selected exam periods. The green and purple lines (and data values) are for the selected aggregate pools used for relative comparisons. More than one aggregate pool can be shown.
NOTE: We do not maintain aggregate pool data for mid-point exams because the application and administration of these mid-point exams is not consistent between schools.
Report Section: Topic/Subtopic Analysis
3 of 4
Similar graphs are shown for each topic and subtopic included on the exam.
NOTE: We do not maintain aggregate pool data for mid-point exams because the application and administration of these mid-point exams is not consistent between schools.
Report Section: Trend Line Analysis
4 of 4
Shown is the linear trend line for the Outbound exam results, along with the R² value.
[Figure: “Outbound Exam Scores: TOTAL SCORE”: yearly scores for 2010-2014 on a 0-100 scale, with a linear trend line fitted through the data points; R² = 0.9868.]
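The arithmetic behind such a trend line can be sketched as follows, assuming an ordinary least-squares fit (the usual method for a linear trend line with an R² value). The yearly scores below are hypothetical, not the data from the figure:

```python
# Illustrative sketch: least-squares trend line and R² for yearly
# Outbound exam TOTAL SCORE averages. Values are hypothetical.

def linear_trend(years, scores):
    """Return (slope, intercept, r_squared) of the least-squares line."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(scores) / n
    ss_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, scores))
    ss_xx = sum((x - mean_x) ** 2 for x in years)
    slope = ss_xy / ss_xx
    intercept = mean_y - slope * mean_x
    # R^2 = 1 - SS_residual / SS_total
    ss_tot = sum((y - mean_y) ** 2 for y in scores)
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(years, scores))
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

years = [2010, 2011, 2012, 2013, 2014]   # hypothetical exam periods
scores = [52.0, 55.5, 58.0, 61.5, 64.0]  # hypothetical mean scores

slope, intercept, r2 = linear_trend(years, scores)
print(f"slope={slope:.2f} points/year, R²={r2:.4f}")
```

An R² close to 1 means the yearly averages lie nearly on a straight line, which is what accreditation reviewers typically look for as evidence of a consistent trend.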
Pairwise Report
COMP Exam Reports
Report Applications: Pairwise Report
• If the school uses both inbound and outbound exams, then pairwise reporting is possible once the individual students who completed an inbound exam also complete an outbound exam.
• Pairwise reporting is generally possible only after 2-3 years of testing, because it will usually take that long for a student starting a program (inbound exam) to then graduate from the program (outbound exam).
• The purpose of the report is to show a summary of individual student results over time.
Report Section: Introduction
1 of 3
The Glossary of Terms section describes the terms used within the report.
Report Section: Inbound/Outbound Summary
2 of 3
For the group of reported students, this chart provides a comparison of the average inbound scores with the average outbound scores, both total score and by each topic/subtopic.
Report Section: Student Analysis
3 of 3
For each selected student, the side-by-side inbound/outbound exam results are shown.
Shown for each topic/subtopic are:
– Percent Difference
– Percent Change
– Inbound Average (based on this set of results)
– Outbound Average (based on this set of results)
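A minimal sketch of the pairwise arithmetic, assuming the common definitions of these terms (percent difference as the gap in percentage points, percent change as the relative gain over the inbound score); the report's Glossary of Terms is authoritative:

```python
# Sketch of the per-student pairwise statistics (assumed definitions).
# Scores are percent-correct values from the inbound and outbound exams.

def pairwise_stats(inbound, outbound):
    """Percent difference (points) and percent change (relative %)."""
    pct_difference = outbound - inbound
    pct_change = (outbound - inbound) / inbound * 100
    return pct_difference, pct_change

# Hypothetical student: 48% inbound, 66% outbound on a topic.
diff, change = pairwise_stats(48.0, 66.0)
print(f"Percent Difference: {diff:.1f} points")  # 18.0 points
print(f"Percent Change: {change:.1f}%")          # 37.5%
```

Reporting both values matters: a 18-point gain looks modest on a 0-100 scale but represents a 37.5% improvement over the student's inbound baseline.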
FAQ for COMP Exams
Question Response
How should COMP Exams be graded? We have a suggested grading scale that is based on a broad range of data from a variety of institutions. After you have administered 100 or more exams, you may want to consider using the Grade Scale Report to determine your own normed grading scale.
How does grading an exam differ from analyzing exam results?
Exams should be graded on a normed scale, and grading applies to a specific student's exam. Analyzing exam results, by contrast, is done when you look at a group of exams, or a data set.
Can we see the data for the aggregate pools? Yes. On your Client Admin site, you can download and use the Excel files with all of the aggregate pool information. There are 2 worksheets within the file per pool: one for the summary statistics and one for the percentile benchmarks.
When will you next update the aggregate pools? We update the pools every July, and these pools are then “fixed” through the following June. Starting in 2014, these will be 3-year rolling pools so that we maintain a sample of about 100,000 exams in the database used to generate the aggregate pools.
How can I understand the relative standing of my school's percent correct scores?
The calculated percentile rank and the associated percentile benchmarks will help you determine the relative standing of your specific exam results.
Can I pick and choose the subjects I want on the exams?
You can select only which topics to include on the exam, not the specific subjects within a topic; subject-level selection would risk invalidating the comparative analysis.
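The percentile-rank lookup mentioned in the FAQ above can be illustrated with a sketch. The benchmark values and the lookup rule here are assumptions on our part; the downloaded aggregate-pool Excel files define the actual percentile benchmarks:

```python
# Sketch (assumed mechanics): locating a school's mean percent-correct
# score against an aggregate pool's percentile benchmarks.
from bisect import bisect_right

# Hypothetical (percentile, score) benchmarks, ascending by score.
benchmarks = [(10, 44.0), (25, 49.5), (50, 55.0), (75, 60.5), (90, 65.0)]

def percentile_rank(score):
    """Highest benchmark percentile at or below the given score."""
    scores = [s for _, s in benchmarks]
    i = bisect_right(scores, score)
    return benchmarks[i - 1][0] if i else 0  # below all benchmarks -> 0

print(percentile_rank(58.0))  # 50  (at or above the 50th, below the 75th)
```

A school averaging 58% correct would thus sit between the 50th and 75th percentile of this hypothetical pool, which is the kind of relative-standing statement the benchmarks support.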
II. Academic Leveling Course (ALC) Student Summary Reports
A. Raw Data
B. ALC Student Summary Report
Client Admin Site: Variable Selections
Generating a report generally requires 3-5 steps. Each step is sequential and builds upon the previous selection.
1. Select the date range
2. Select the Academic Degree Level
3. Select any Program/Course Options
4. Select one or more comparison aggregate pools
5. Select the output (PDF report or Excel Raw Data)
From the Client Admin Site, you have unlimited report generation capabilities. You can re-run reports based on different sets of responses to variables.
As new reports are added to the Client Admin Site (.Net re-write), more selections will be available.
ALC Individual Student Results Report: Excel File
The raw data show all student-by-student results for the selected period, based on the selection criteria used when generating the report. These data are often used in conjunction with the school's student database for additional demographic analysis (e.g., comparing results based on age, gender, ethnicity, etc.).
ALC modules (pre/post tests) should be graded AND analyzed on a traditional 0-100% grading scale. Unlike the COMP Exam, which is normed and assesses retained knowledge, the ALC module pre/post tests evaluate newly learned knowledge as a validation of learning and successful completion of the module/course.
Student ID # is used in addition to the student e-mail address for any future pairwise reporting and to allow greater ease for the school to integrate results with the school’s databases.
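A minimal sketch of the demographic analysis described above, joining the exported raw data to the school's own database by Student ID. All field names and values here are hypothetical; in practice both sides would come from the Excel export and the school's student information system:

```python
# Sketch: average post-test score grouped by a demographic field,
# joining exported results to a school roster by Student ID.
from collections import defaultdict

raw_results = [  # rows as exported student-by-student results (hypothetical)
    {"student_id": "1001", "module": "Accounting", "post_test": 82.0},
    {"student_id": "1002", "module": "Accounting", "post_test": 74.0},
]
roster = {  # school database keyed by the same Student ID (hypothetical)
    "1001": {"gender": "F", "age": 29},
    "1002": {"gender": "M", "age": 41},
}

by_gender = defaultdict(list)
for row in raw_results:
    demo = roster.get(row["student_id"])
    if demo:  # skip students missing from the roster
        by_gender[demo["gender"]].append(row["post_test"])

averages = {g: sum(v) / len(v) for g, v in by_gender.items()}
print(averages)  # {'F': 82.0, 'M': 74.0}
```

Because the Student ID appears in both files, the join needs no name matching; the same pattern works for age bands, ethnicity, campus location, or any other roster field.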
ALC Student Summary Report
ALC Exam Reports
Report Applications: ALC Student Summary Report
• The purpose of this report is simply to summarize the total set of results and to show student-by-student results. The score results for each student are the same as those listed on the individual student completion certificates.
• The dates of the report are listed on the report cover.
• The student names and ID numbers included within the report are also listed on the report cover.
Report Section: Introduction
The Glossary of Terms section describes the terms used within the report.
1 of 3
Report Section: Report Summary
The Report Summary section summarizes the Pre-test/Post-test results for the selected students, showing average scores for each of the ALC Modules included with the report.
These scores are the means of the scores of the students selected for the specific report.
2 of 3
Report Section: Student Results
3 of 3
For each student, a summary of the student's results is shown for each ALC Module. If no data are shown, as with the post-test score for Quant/Statistics in this example, the student has not completed the post-test within the reported period.
The red vertical line marks the total average score, that is, the average of the ALC module post-test scores. This line is shown to indicate the relative standing of the scores.
FAQ for ALC Services
Question Response
How should the ALC Pre/Post Tests be graded?
The ALC Pre/Post tests should be graded on a traditional 0-100% grading scale, similar to any other assignment.
III. American Psychological Association (APA) Writing Style
Student Summary Report
A. Raw Data
B. APA Student Summary Report
Client Admin Site: Variable Selections
Generating a report generally requires 3-5 steps. Each step is sequential and builds upon the previous selection.
1. Select the date range
2. Select the Academic Degree Level
3. Select any Program/Course Options
4. Select one or more comparison aggregate pools
5. Select the output (PDF report or Excel Raw Data)
From the Client Admin Site, you have unlimited report generation capabilities. You can re-run reports based on different sets of responses to variables.
As new reports are added to the Client Admin Site (.Net re-write), more selections will be available.
APA Individual Student Results Report: Excel File
The raw data show all student-by-student results for the selected period, based on the selection criteria used when generating the report. These data are often used in conjunction with the school's student database for additional demographic analysis (e.g., comparing results based on age, gender, ethnicity, etc.).
The APA Competency Exam should be graded AND analyzed on a traditional 0-100% grading scale. Unlike the COMP Exam, which is normed and assesses retained knowledge, the APA Competency Exam evaluates newly learned knowledge as a validation of learning and successful completion of the APA course.
Student ID # is used in addition to the student e-mail address for any future pairwise reporting and to allow greater ease for the school to integrate results with the school’s databases.
APA Writing Style Services Report
APA Reports
Report Applications: APA Student Summary Report
• The purpose of this report is simply to summarize the total set of results and to show student-by-student results. The score results for each student are the same as those listed on the individual student completion certificates.
• The dates of the report are listed on the report cover.
• The student names and ID numbers included within the report are also listed on the report cover.
Report Section: Introduction
The Glossary of Terms section describes the terms used within the report.
1 of 3
Report Section: Report Summary
The Report Summary section summarizes the competency exam results for the selected students, showing average scores for each of the subjects included within the APA Competency Exam.
These scores are the means of the scores of the students selected for the specific report.
2 of 3
Report Section: Student Results
3 of 3
For each student, a summary of the student's results is shown, including the total exam score and the subject-level scores. These are simple percent correct values.
The red vertical line marks the total average score, that is, the average of the subject-level scores. This line is shown to indicate the relative standing of the scores.
FAQ for APA Services
Question Response
How should the APA Competency Exam be graded?
The APA Competency Exam should be graded on a traditional 0-100% grading scale, similar to any other assignment.
What is the difference between Average Exam Score and Total as shown on page 5 of the Report?
The Average Exam Score (total score) is the mean of the total scores received by the students on the exam. The Total is the mean of the subject-level scores, which is then used on the following pages for the student-by-student results. Total is thus used to show the relative standing of the scores: which student results were above or below the overall averages.
Can I use the APA Competency Exam with my Outbound COMP Exam?
Yes. Although the competency exam and APA course are most typically used at the start of a student's program, some schools require that the APA Competency Exam be included with the Outbound exam as another direct measure of learning. When used this way, it is a separate 50-question exam with a separate pricing structure.
IV. Grade Scale Report
The Grade Scale Report
Report Applications: The Grade Scale Report
For Inbound Exams, we recommend that you grade the exam only Pass/Fail for completion, perhaps awarding 5 course points or extra-credit points.
For Outbound Exams, we do recommend that you incentivize the exam with grading in order to encourage the students to do their very best on the exam. Such a grade could be as high as 10% of a course grade. Grading, however, must be done on a scale because this is a normed exam with an average degree of difficulty of 55-60%. We have a suggested grading scale that can be used for grading purposes of the Outbound Exam.
The Grade Scale Report
After you have used our exam services, if you wish to hone your grading scale, there is an option to generate your own grading scale based on your exam results.
This simple report takes your student results and sorts them to generate the 60th-99th percentile marks, which you can then use for your grading purposes.
Since there are many grading scales used in higher education, this report shows only the percentile scores; you can then determine what scores correspond to the actual letter (and point) grades: A, A-, B+, B, B-, C+, C, C-, D+, D, D-, F.
NOTE: You should have at least 50 completed Outbound Exams for this report to be statistically meaningful.
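A sketch of how such percentile marks might be derived from a school's own sorted results. The nearest-rank method and the score values below are assumptions for illustration, not the report's documented algorithm:

```python
# Sketch: nearest-rank percentile marks from a school's own Outbound
# exam total scores, for building a local grading scale.

def percentile_marks(scores, percentiles=(60, 70, 80, 90, 99)):
    """Nearest-rank percentile scores from percent-correct values."""
    ordered = sorted(scores)
    n = len(ordered)
    marks = {}
    for p in percentiles:
        rank = max(1, round(p * n / 100))  # 1-based nearest-rank index
        marks[p] = ordered[min(rank, n) - 1]
    return marks

# Hypothetical set of 10 scores (a real run should use 50 or more).
scores = [41, 47, 50, 53, 55, 57, 59, 62, 66, 71]
print(percentile_marks(scores))  # {60: 57, 70: 59, 80: 62, 90: 66, 99: 71}
```

The school would then map these marks onto its own letter grades, e.g., scores at or above the 90th percentile mark earn an A-range grade.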
Peregrine Academic Services
Box 741
Gillette WY 82717
Phone: (307) 685-1555
Fax: (307) 685-0141
Toll Free: 1-877-260-1555
Your Valued Partner for Academic Preparedness