The Many Threats to Test Validity David Mott, Tests for Higher Standards and Reports Online Systems...
TE S T S F O R H I G H E R ST A N D A R D S
PROVIDE FOCUS + FACILITATE ACHIEVEMENT
The Many Threats to Test Validity
David Mott, Tests for Higher Standards and Reports Online Systems Presentation at the Virginia Association of Test
Directors (VATD) Conference, Richmond, VA, October 28, 2009
The Many Threats to Test Validity
In order for a test or assessment to have any value whatsoever, it must be possible to make reasonable inferences from the score. This is much harder than it seems. The test instruments, the testing conditions, the students, the score interpreters, and perhaps Fate ALL need to be working together to produce data worth using. Many specific threats will be delineated, a number of solutions suggested, and audience participation is strongly encouraged.
Validity and Value come from the same Latin root. The word has to do with being strong, well, good.
Validity = Value
Initial Attitude Adjustment
Amassing Statistics
"The Government are very keen on amassing statistics — they collect them, raise them to the nth power, take the cube root and prepare wonderful diagrams. But what you must never forget is that every one of those figures comes in the first instance from the village watchman, who just puts down what he damn well pleases."
(J. C. Stamp (1929). Some Economic Factors in Modern Life. London: P. S. King and Son)
Distance from Data
"I have noticed that the farther one is from the source of data, the more likely one is to believe that the data could be a good basis for action."
(D. E. W. Mott (2009). Quotations.)
The Examination
as shown by the Ghost of Testing Past
Validity — Older Formulations
1950’s through 1980’s:
content validity, concurrent validity, predictive validity, construct validity
Lee J. Cronbach
Content Validity — Refers to the extent to which a measure represents all facets of a given social construct. Social constructs include reading ability, math computation proficiency, optimism, driving skill, etc. It is a more formal term than face validity, which refers not to what the test actually measures but to what it appears to measure: whether a test "looks valid" to the examinees who take it, the administrative personnel who decide on its use, and others.
Concurrent Validity — Refers to a demonstration of how well a test correlates with a measure that has previously been validated. The two measures may be for the same construct, or for different, but presumably related, constructs.
Predictive Validity — Refers to the extent to which a score on a scale or test predicts scores on some criterion measure. For example, how well do your final benchmarks predict scores on the state SOL Tests?
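Predictive validity is typically quantified as the correlation between predictor and criterion. A minimal sketch, with made-up benchmark and state-test scores (the data and the function name are illustrative, not from this presentation):

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Made-up paired scores: district benchmark percent vs. state scaled score
benchmark = [62, 70, 75, 80, 88, 91]
state_score = [400, 420, 435, 450, 480, 495]
r = pearson_r(benchmark, state_score)  # close to 1.0 for this sample
```

A high r supports using the benchmark to predict state results; a low r is itself a validity warning about one measure or the other.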
Construct Validity — Refers to whether a scale measures or correlates with the theorized underlying psychological construct (e.g., "fluid intelligence") that it claims to measure. It is related to the theoretical ideas behind the trait under consideration, i.e., the concepts that organize how aspects of personality, intelligence, subject-matter knowledge, etc. are viewed.
Validity — New Formulation
1990’s through now
Six aspects or views of Construct Validity:
content aspect, substantive aspect, structural aspect, generalizability aspect, external aspect, consequential aspect
Samuel Messick
Validity — New Formulation
Six aspects or views of Construct Validity
Content aspect – evidence of content relevance, representativeness, and technical quality
Substantive aspect – theoretical rationales for consistency in test responses, including process models, along with evidence that the processes are actually used in the assessment tasks
Structural aspect – judges the fidelity of scoring to the actual structure of the construct domain
Generalizability aspect – the extent to which score properties and interpretations generalize to related populations, settings, and tasks
External aspect – includes converging and discriminating evidence from multitrait-multimethod comparisons as well as proof of relevance and utility.
Consequential aspect – shows the values of score interpretation as a basis for action and the actual and potential consequences of test use, especially in regard to invalidity related to bias, fairness, and distributive justice
Administration Validity
"Administration Validity" is my own term.
A test administration or a test session is valid if nothing happens that causes a test, an assessment, or a survey to fail to reflect the actual situation.
Test-session validity is an alternate term.
Administration Validity
Many things can come between the initial creation of an assessment from valid materials and the final uses of the scores that come from that assessment.
Imagine a chain that is only as strong as its weakest link. If any link breaks, the value of the whole chain is lost.
This session deals with some of those weak links.
Areas of Validity Failure
Areas of Validity Failure
We create a test out of some "valid" items. Consider some of the realities most of us face: we either have some "previously validated" tests, or we have a "validated" item bank we make tests from. Let's assume that they really are valid, that is, the materials have good content matches with the Standards/Curriculum Frameworks/Blueprints, and so on.
Areas of Validity Failure
Some examples of problems that can creep in through the supposedly "mechanical" aspects of creating a test from a bank.
Here are two items from a Biology benchmark test we recently made for a client:
Two Biology Items
Bio.3b
5. Which organic compound is correctly matched with the subunit that composes it?
A maltose – fatty acids
B starch – glucose
C protein – amino acids
D lipid – sucrose
Bio.3b
6. Which organic compounds are the building blocks of proteins?
A sugars
B nucleic acids
C amino acids
D polymers
Two Biology Items
Standard BIO.3b: The student will investigate and understand the chemical and biochemical principles essential for life. Key concepts include b) the structure and function of macromolecules.
Two Biology Items (answers marked)
Bio.3b
5. Which organic compound is correctly matched with the subunit that composes it?
A maltose – fatty acids
B starch – glucose
C protein – amino acids *
D lipid – sucrose
Bio.3b
6. Which organic compounds are the building blocks of proteins?
A sugars
B nucleic acids
C amino acids *
D polymers
A Life Science Item
LS.6c
12. In this energy pyramid, which letter would represent producers?
A A
B B
C C
D D
[Figure: energy pyramid with its levels labeled A, B, C, D]
The Same Life Science Item, "Randomized"
LS.6c
12. In this energy pyramid, which letter would represent producers?
A C
B D
C A
D B
[Figure: energy pyramid with its levels labeled A, B, C, D]
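The "randomized" item above shows what goes wrong when answer options that are themselves figure labels get shuffled: choice A now reads "C", and so on. A sketch of the safeguard, with hypothetical helper names of my own: shuffle options while remapping the key, and exclude items whose options merely reference figure labels.

```python
import random

def shuffle_options(options, answer_index, rng=random):
    """Shuffle answer options, returning the new option order and
    the remapped index of the correct answer."""
    order = list(range(len(options)))
    rng.shuffle(order)
    return [options[i] for i in order], order.index(answer_index)

def options_are_figure_labels(options, labels=("A", "B", "C", "D")):
    """True when every option is just a figure label; such items
    should be skipped by the randomizer entirely."""
    return all(opt.strip() in labels for opt in options)

# The biology item shuffles safely: the key follows the options.
item5 = ["maltose - fatty acids", "starch - glucose",
         "protein - amino acids", "lipid - sucrose"]
new_opts, new_key = shuffle_options(item5, 2)
assert new_opts[new_key] == "protein - amino acids"

# The pyramid item must not be shuffled.
assert options_are_figure_labels(["A", "B", "C", "D"])
```

The design point is that randomization needs item-level metadata (does this item reference a graphic?), not a blanket shuffle.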
Moving from test creation
to test administration
What Can Fail in the Test Administration Process
What Can Fail in the Test Administration Process
Students aren’t properly motivated:
  Random responding
  Patterning responses
  Unnecessary guessing
  Cheating
Let’s look at what some of these look like:
One Student's Item Analysis
Item scores (1 = correct, 0 = incorrect):
1:1  2:0  3:0  4:0  5:0  6:1  7:0  8:0  9:0  10:0  11:0  12:0  13:0
14:0  15:1  16:0  17:0  18:1  19:0  20:1  21:0  22:0  23:0  24:0  25:1
Total: 6
What happened here?
Another Student's Item Analysis
Item scores (1 = correct, 0 = incorrect):
1:0  2:0  3:0  4:1  5:0  6:1  7:0  8:0  9:0  10:0  11:0  12:0  13:1
14:0  15:0  16:1  17:0  18:1  19:1  20:0  21:0  22:0  23:1  24:0  25:0
Total: 7
What happened here?
Yet Another Student's Item Analysis
Item scores (1 = correct, 0 = incorrect):
1:1  2:1  3:1  4:1  5:1  6:1  7:1  8:1  9:1  10:1  11:1  12:1  13:1
14:0  15:1  16:1  17:1  18:1  19:0  20:0  21:0  22:1  23:0  24:0  25:1
Total: 19
What happened here?
What Can Fail in the Test Administration Process
Students or teachers make mistakes:
  Stopping before the end of the test
  Getting off position on answer sheets
  Giving a student the wrong answer sheet
  Scoring a test with the wrong key
Let’s look at what some of these look like:
Yet, Yet Another Student's Item Analysis
Item scores (1 = correct, 0 = incorrect, – = blank), on an answer sheet with positions 26–30 beyond the 25-item test:
1:1  2:1  3:1  4:1  5:1  6:1  7:–  8:0  9:0  10:0  11:1  12:0  13:0
14:0  15:0  16:0  17:0  18:0  19:1  20:0  21:0  22:0  23:0  24:0  25:0  26:0
Total: 8
What happened here?
What happened here?
Moving to diagnosing students’ needs
Results for a Three-Standard Test

Standard      4.1        4.2        4.3
              Subtotal   Subtotal   Subtotal   Total
Student 1        3          2          4         9
Student 2        4          3          5        12
Student 3        3          1          5         9
Student 4        5          0          5        10
Student 5        5          0          4         9
Student 6        5          2          5        12
Student 7        4          1          4         9
Student 8        4          0          3         7
Student 9        4          1          3         8
Student 10       5          2          4        11
Student 11       5          1          4        10
Student 12       5          0          5        10
Average        .87        .22        .85       .64
What is the obvious conclusion about these test results?
Results for a Three-Standard Test (item level)

Standard    4.1 4.1 4.1 4.1 4.1  4.3 4.3  4.3 4.3 4.3  4.2 4.2 4.2 4.2 4.2
Item          1   2   3   4   5    6   7    8   9  10   11  12  13  14  15  Total
Student 1     1   1   0   1   0    1   1    1   0   1    1   1   0   0   0     9
Student 2     1   0   1   1   1    1   1    1   1   1    1   1   1   0   0    12
Student 3     1   1   1   0   0    1   1    1   1   1    1   0   0   0   0     9
Student 4     1   1   1   1   1    1   1    1   1   1    0   0   0   0   0    10
Student 5     1   1   1   1   1    1   1    1   0   1    0   0   0   0   0     9
Student 6     1   1   1   1   1    1   1    1   1   1    1   1   0   0   0    12
Student 7     0   1   1   1   1    1   1    1   0   1    1   0   0   0   0     9
Student 8     1   1   1   0   1    0   1    1   1   0    0   0   0   0   0     7
Student 9     1   1   1   1   0    1   1    0   1   0    1   0   0   0   0     8
Student 10    1   1   1   1   1    1   1    1   0   1    1   1   0   0   0    11
Student 11    1   1   1   1   1    1   1    0   1   1    1   0   0   0   0    10
Student 12    1   1   1   1   1    1   1    1   1   1    0   0   0   0   0    10
Average     .92 .92 .92 .83 .75  .92 1.00 .83 .67 .83  .67 .33 .08 .00 .00   .64
What do you think now?
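The tell in the item-level table is a block of near-zero item averages at the end of the test as administered. That check can be automated; a sketch assuming a 0/1 score matrix like the one above (function names and threshold are my own, not a TfHS/ROS feature):

```python
def item_p_values(score_matrix):
    """Proportion of students answering each item correctly,
    in administered item order."""
    n_students = len(score_matrix)
    n_items = len(score_matrix[0])
    return [sum(row[i] for row in score_matrix) / n_students
            for i in range(n_items)]

def trailing_low_run(p_values, threshold=0.10):
    """Indices of the unbroken run of very low p-values at the end of the
    test: a possible sign students ran out of time, rather than evidence
    of weakness on whatever standard those items happen to assess."""
    run = []
    for i in range(len(p_values) - 1, -1, -1):
        if p_values[i] <= threshold:
            run.append(i)
        else:
            break
    return list(reversed(run))

# Tiny made-up matrix: 4 students x 6 items, last two items unreached
scores = [[1, 1, 0, 1, 0, 0],
          [1, 0, 1, 1, 0, 0],
          [0, 1, 1, 0, 0, 0],
          [1, 1, 1, 1, 0, 0]]
p = item_p_values(scores)      # [0.75, 0.75, 0.75, 0.75, 0.0, 0.0]
flagged = trailing_low_run(p)  # [4, 5]
```

When the flagged run lines up with one standard, as with 4.2 above, the per-standard subtotal is measuring item position, not mastery.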
The chain has many links
Nearly any of them can break
Try to find the weakest links in your organization’s efforts
Fix them – one by one
What are some of my solutions to all of this?
To the problems of mistakes in test creation:
  Use test blueprints
  Be very careful of automatic test construction
  Read the test carefully yourself and answer the questions
  Have someone else read the test carefully and answer the questions
  Use “Kid-Tested” items *
* Future TfHS initiative
What are some of my solutions to all of this?
Be careful when reading reports – look past the obvious
For problems of careless, unmotivated test taking by students (even cheating):
  Make the test less of a contest between the system/teacher and the student and more of a communication device between them
  Watch the students as they take the test, and realize that proctoring rules necessary for high-stakes tests are possibly not best for formative or semi-formative assessments
  Look for/flag pattern marking and rapid responding *
* Future TfHS/ROS initiative
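Flagging pattern marking and rapid responding could be sketched like this. The thresholds and function names are my own illustrative choices, and the run check only catches the simplest pattern (one letter repeated), not cycles like ABCDABCD:

```python
def longest_same_option_run(responses):
    """Length of the longest run of identical answer choices."""
    best = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def flag_student(responses, seconds_per_item, run_limit=6, min_median_sec=2.0):
    """Flag pattern marking (a long same-letter run) and rapid
    responding (median answer time under min_median_sec)."""
    flags = []
    if longest_same_option_run(responses) >= run_limit:
        flags.append("pattern marking")
    median = sorted(seconds_per_item)[len(seconds_per_item) // 2]
    if median < min_median_sec:
        flags.append("rapid responding")
    return flags

# A student who marked C seven times in a row, about a second per item:
flags = flag_student(list("CCCCCCCABD"), [1.2] * 10)
```

Flags like these are prompts for a conversation with the student, not automatic invalidation.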
Here is a graph showing the timing of student responses to an item.
[Graph: "Number of Responses Over Time to a Rather Easy Item"; x-axis: Time (in sec), 0 to 6.5; y-axis: Number of Responses, 0 to 16]
For online tests it is possible to screen for rapid responding *
* Future TfHS/ROS initiative
[Graph: "Number of Responses Over Time to a Rather Easy Item"; x-axis: Time (in sec), 0 to 6.5; y-axis: Number of Responses, 0 to 16]
A major new way of communicating!
Let the students tell you when they don’t know or understand something – eliminate guessing
New MC scoring scheme: *
  1 point for each correct answer
  0 points for each wrong answer
  ⅓ point for each unanswered question
  Students mark where they run out of time
* Future TfHS/ROS initiative
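The scheme is mechanical to score. A sketch, with a made-up all-"A" key (the code and its names are mine, not the TfHS/ROS implementation): 11 correct answers plus 6 deliberate blanks give 13.00 points, which is 52% of 25 items, or 65% of an effective length of 20 if the student marked running out of time at item 20.

```python
from fractions import Fraction

def score_new_scheme(responses, key, ran_out_at=None):
    """Score under the 1 / 0 / one-third scheme: 1 point per correct
    answer, 0 per wrong answer, 1/3 per question deliberately left
    blank (None). Items after the student's run-out-of-time mark are
    excluded from the effective test length."""
    n = len(key) if ran_out_at is None else ran_out_at
    score = Fraction(0)
    for given, correct in zip(responses[:n], key[:n]):
        if given is None:
            score += Fraction(1, 3)
        elif given == correct:
            score += 1
    return score, n  # raw score and effective test length

# 11 correct, 6 deliberate blanks, 3 wrong, then 5 wrong past the
# student's run-out-of-time mark at item 20
key = ["A"] * 25
responses = ["A"] * 11 + [None] * 6 + ["B"] * 3 + ["B"] * 5

score, n = score_new_scheme(responses, key)
assert (score, n) == (13, 25)                                # 13/25 = 52%
score20, n20 = score_new_scheme(responses, key, ran_out_at=20)
assert (score20, n20) == (13, 20)                            # 13/20 = 65%
```

Exact fractions avoid the rounding drift that ⅓-point awards would accumulate in floating point.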
A major new way of communicating!
Students have to be taught the new rules
Students need one or two tries to get the hang of it
Students need to know when the new scoring applies
It is better for students to admit not knowing than to guess
Continued
One Student's Item Analysis
Answering and scoring under the new scheme: 11 correct answers plus 6 questions deliberately left unanswered (⅓ point each) give a score of 13.00.

Test length: 25      Regular    1, ⅓, 0
Score                11         13.00
Percent              44.00%     52.00%

Corrected for test length (student marked running out of time at item 20):
Percent              55.00%     65.00%
Humor
Time flies like an arrow;
fruit flies like a banana.
We sometimes need to take a 90° turn in our thinking
My contact information
David Mott – [email protected]
TfHS website – www.tfhs.net
ROS website – rosworks.com