The Essentials of e-asTTle
2015
ILPs
Console Reports
Marking and data input
The SOLO Taxonomy
What Next Profile
Rules of engagement
GLPs
Indiv. Question Analysis
Assessment Misconceptions
It’s like High Jump
Creating writing tests
Creating adaptive tests
Creating customised tests
Designing tests
Copying tests, similar tests
ILP Writing
Tabular Report
Progress Reports
Manage Students - Groups etc.
Curr. Levels Report
Naming tests
Target Setting
Student Result Summary
Accessing e-asTTle
Seminar Overview
• Welcomes, Introductions
• Principles of e-asTTle
• How e-asTTle works - rules of engagement
• Thinking differently about testing – Item Response Theory
• Assessment misconceptions
• Test creation
• Customised and Adaptive Tests
• Copy tests and Similar tests
• Writing test overview
• Administration considerations
• Reports & next steps
• Interpreting the reports (incl. Target Setting and Student Result Summary) for multiple purposes
• Reflection and Evaluation
Basic Rules of Engagement
• The data needs to be used to inform teaching and learning.
• Students should use this assessment and its reports to help understand their current skills and knowledge and to inform their learning goals.
• To provide accurate results and useful data, testing needs to be done by current achievement level, not year group.
• The results need to be interpreted alongside other evidence to ensure that good teaching and learning decisions are made.
• A student’s test score is NOT an Overall Teacher Judgment; it does not come close to encompassing the broad set of skills and knowledge described in the standards.
Think differently about testing – changing our hardwired thinking
Hardwired #1 – “Everyone needs to do the same test for a fair comparison”
• If students of widely varying ability take the same test, useful information is really only gained in the ‘average’ sector.
• Getting too many answers right or wrong provides us with very little information.
Implication
• Test students on current achievement levels, not age.
• It may be necessary to create up to four different tests for some cohorts. Each student is then assigned to a test which is challenging for him/her.
Hardwired #2 – “100% is the goal?”
• If we want to get good information from e-asTTle, the students need to get some answers wrong.
• A student who gets 100% correct gives little information on his/her next learning steps, and e-asTTle has to extrapolate to give an indication of achievement. This is not necessarily an accurate score.
Implication
• We must assign a test that is challenging for each and every student.
Hardwired #3 – “Everyone must take the test at the same time in the same place”
• e-asTTle provides the opportunity/challenge to test students at different times and places, especially when taking an online test.
• Students can even sit a test at home if that is something you want to happen.
• The MAIN PURPOSE of the test will dictate when, where and how testing can take place.
Hardwired #4 – “e-asTTle means you take the test on the computer”
• In many cases computer-based testing is the best way forward.
• However, paper-based testing should also be considered, depending on your situation.
• Paper-based testing is the recommended method for areas such as Geometry or Measurement where interaction with graphics is required.
• Use of the reading passage booklet is a great hybrid option.
Creating Tests
The following tests can be created:
• A Customised Test
• An Onscreen Adaptive Test
• A Writing Prompt
The following actions can also occur:
• Copy an Existing Test
• Create a Similar Test
• Add Numeracy Data
Customised Tests
Customised Test
• This test is customised by the user. The process allows the person creating the test to select from a range of curriculum strands, different processes, and time and control features, such as online versus paper.
Designing customised tests
Differentiating Customised Tests – the planning stage
Test          | Curriculum level mix (L2–L6) | Duration
Easy (A)      | 80% / 20%                    | 30 min
Easy/Mid (B)  | 50% / 50%                    | 30 min
Mid/Diff (C)  | 50% / 50%                    | 36 min
Difficult (D) | 50% / 50%                    | 36 min
Differentiated Customised Tests – Example 1
Test          | Curriculum level mix (L2–L6) | Duration
Easy (A)      | 80% / 20%                    | 30 min
Easy/Mid (B)  | 50% / 50%                    | 30 min
Mid/Diff (C)  | 10% / 60% / 30%              | 36 min
Difficult (D) | 10% / 60% / 30%              | 36 min
Differentiating Customised Tests – the implementation stage
Test          | Planned mix (L2–L6) | Actual questions | Duration
Easy (A)      | 80% / 20%           | 17 / 5           | 30 min
Easy/Mid (B)  | 50% / 50%           | 10 / 12          | 30 min
Mid/Diff (C)  | 20% / 60% / 20%     | 7 / 16 / 4       | 36 min
Difficult (D) | 20% / 60% / 20%     | 6 / 17 / 7       | 36 min
What can you tell me about these tests?
1. “Reading test”
2. “Maths March 12”
3. “CL4PAPSIdMar12”
4. “AL23NknNOpAlgNov11”
Naming tests
It is well worth coming up with a system for naming tests, particularly if you are making them for other teachers to use as well. For example, if I created a mainly Level 3 Probability and Statistics test which is the second most difficult test in a group of tests, it could be named like this:
BL3ProbStatsMarch12
Facilitator modelling
• Facilitator to model online the creation of the following test:
Name: CL3/4PSPAIdMar15
Levels: L3 50%, L4 50%
Duration: 40 min
Strands: Proc & Strat., Purpose & Aud., Ideas
Attitude: Interest Reading
Paper or Online: Online
Time to have a go… Create a customised test
• Think about a group of students from your school and design a test that will be challenging for them.
• Plan on the design page first.
• e-asTTle Training site – Teacher login: https://training.e-asttle.education.govt.nz/SCWeb/login.faces
• e-asTTle Training site – Student login: https://training.e-asttle.education.govt.nz/StudentWeb/login.faces
• e-asTTle on TKI: http://e-asttle.tki.org.nz/
Online Adaptive tests
Online Adaptive Test
The user can choose strands and curriculum levels, and the test adapts twice during the test to give easier or harder sets of questions to the student. There is less control over the content and type of questions. It is completed online and only has closed questions.
Adaptive test – how it works
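The slides say only that the test "adapts twice", serving an easier or harder question set depending on how the student is doing. As an illustration of how multi-stage branching of that shape can work, here is a minimal sketch; the difficulty bands, thresholds, and function are illustrative assumptions, not e-asTTle's actual algorithm.

```python
def next_stage_difficulty(current: int, proportion_correct: float,
                          up: float = 0.8, down: float = 0.4) -> int:
    """Pick the difficulty band for the next question set.

    The three-band adjustment and the up/down thresholds are assumptions
    for illustration only. Bands are clamped to 0 (easiest) .. 4 (hardest).
    """
    if proportion_correct >= up:
        current += 1      # doing well: serve a harder set
    elif proportion_correct < down:
        current -= 1      # struggling: serve an easier set
    return max(0, min(4, current))

# A test that "adapts twice": start mid-band, then adjust after each of
# the first two question sets.
stage1 = 2
stage2 = next_stage_difficulty(stage1, 0.9)   # 3: student did well
stage3 = next_stage_difficulty(stage2, 0.3)   # 2: harder set proved too hard
```

The point of the branching is the same as the "Hardwired" slides above: every student spends most of the test on items near their own level, which is where the information is.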
Time to have a go… Create an adaptive test
• Use a group of students from your school and design a test that will be suitable (challenging) for them.
• Again, plan on the design sheet first.
• What will you have to do differently this time?
Creating a writing prompt
• e-asTTle writing assesses students’ writing from Years 1–10.
• It assesses their ability to write continuous texts across a variety of communicative purposes: describe, explain, recount, narrate and persuade.
• It assesses generic writing competence rather than writing specific to any learning area.
• Writing tests can only be completed on paper.
• Students respond to a prompt.
• The time given for a writing test is up to 40 minutes and is preset.
Creating a similar, or copying a test. Why would you do this?
• Copying a test allows you to use the same test with a different group of students. Copying rather than reassigning the same test allows you to keep the two sets of data separate.
• Creating a similar test means that you can create a test using the same settings as previously but it will result in a new set of questions.
• A better option if testing the same set of students again is to create a slightly harder test. This will be a better fit for the students as it acknowledges the progress they have made throughout the year.
• Closed questions (i.e. multi-choice or true/false questions) do not require any data input or marking; the programme marks them automatically.
• Open questions are marked question by question, with student responses clustered.
Online tests – Marking open response questions
Entering Paper Tests data
Entering e-asTTle Writing scores
e-asTTle Reports
The Console Reports
The Console Comparisons Report (Māori)
The Console Comparisons Report (Schools like mine)
The Multi-test Console Report
The Individual Learning Pathway Report
The ILP Report - Student Speak
• One standard error of measurement around the student’s score (±15)
• This is like the 'margin of error' reported in political polls
• Two out of three times the student’s true score will lie somewhere between the top and the bottom of the red circle
• A difference in scores needs to be >22 to be ‘statistically significant’
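The >22 figure follows from the ±15 standard error: when comparing two independently measured scores, the errors combine in quadrature, giving √(15² + 15²) ≈ 21.2. A quick check of that arithmetic, with an illustrative helper (the function name and the example scores are mine):

```python
import math

SEM = 15  # one standard error of measurement on an e-asTTle score

# The error on the DIFFERENCE of two independent scores adds in
# quadrature: sqrt(15^2 + 15^2) ~ 21.2, which is why a change needs
# to exceed ~22 points to be statistically significant.
se_difference = math.sqrt(2) * SEM

def significant_change(score_a: int, score_b: int) -> bool:
    """True if the gap between two scores clears the combined error."""
    return abs(score_b - score_a) > se_difference

print(round(se_difference, 1))         # 21.2
print(significant_change(1500, 1510))  # False: +10 is within the noise
```

This is the same logic as the political-poll margin of error mentioned above, applied to the gap between two polls rather than a single one.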
The Individual Learning Pathway Report- Norm information
The Individual Learning Pathway Report- Criterion information
When a dash (“–”) appears in the e-asTTle ILP it means that the student has failed to get more than 3 questions correct in that strand.
If a dash appears in the overall score it means the student has not achieved three correct answers in the test.
Strengths
• Unexpectedly correct
• Harder than student’s overall ability
Gaps
• Unexpectedly wrong
• Easier than or equal to the student’s overall ability
Achieved
• Correct as expected
• Easier than or equal to the student’s overall ability
To Be Achieved
• Wrong as expected
• Harder than student’s overall ability
(Quadrant axes: Correct vs Incorrect; Easy vs Hard items for this student)
The Individual Learning Pathway Report- Interpreting the quadrants
• Harder than the student’s ability but unexpectedly answered correctly.
• Given the student’s overall asTTle score, these items are more difficult than his/her overall ability.
• This quadrant displays the student’s unexpected strengths that should be exploited in future teaching and learning.
Strengths
• Easier than the student’s ability but unexpectedly answered incorrectly.
• The teacher needs to investigate to determine the nature of the gap e.g. carelessness, skipping items, not taught.
• The teacher should either eliminate it as a concern or put a remedial plan in place; the student should learn quickly and fill in the gap.
Gaps
• Easier than the student’s ability and, as expected, answered correctly.
• Given the student’s overall asTTle score these are the items that were expected to be answered correctly and were.
• “The Green Light”. The teacher can confidently give the student more challenging work in these areas.
Achieved
• Harder than the student’s ability and, as expected, answered incorrectly.
• Given the student’s overall asTTle score, these are the items we expected him/her not to get right, and he/she did not.
• These are the areas the student still has to achieve in, and where it is expected the teacher will carry out more teaching.
To Be Achieved
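The four quadrant rules above reduce to a two-way split on correctness and on whether the item sits above or below the student's overall score. A minimal sketch of that classification (the function name and the numeric asTTle-scale values used in the demo are assumptions for illustration):

```python
def ilp_quadrant(correct: bool, item_difficulty: int, student_ability: int) -> str:
    """Classify one test item into an ILP quadrant, per the slides' rules.

    'Harder' means the item's difficulty on the asTTle scale exceeds the
    student's overall score; 'easier or equal' covers everything else.
    """
    harder = item_difficulty > student_ability
    if correct and harder:
        return "Strengths"          # unexpectedly correct
    if correct and not harder:
        return "Achieved"           # correct as expected
    if not correct and not harder:
        return "Gaps"               # unexpectedly wrong
    return "To Be Achieved"         # wrong as expected

# Demo with hypothetical values for a student whose overall score is 1400:
print(ilp_quadrant(True, 1500, 1400))   # Strengths
print(ilp_quadrant(False, 1300, 1400))  # Gaps
```

Note the asymmetry: the informative quadrants are the two "unexpected" ones, which is why the interpretation slides below focus on Strengths and Gaps.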
The Individual Learning Pathway Report- Interpreting the quadrants
“90 second analysis” or “shared analysis”
The Group Learning Pathway Report
Individual Question Analysis
Individual Question Analysis – Student Speak
The Group Learning Pathway Report
The Curriculum Levels Report
(Example sublevel scores shown on the report: 3P, 3A, 4A, 2A)
The Curriculum Level Report (aka Skyline)
The Curriculum Levels Report
The What Next Profile – in e-asTTle online
The What Next Profile – on the TKI site
http://assessment.tki.org.nz/Assessment-tools-resources/What-Next
The What Next Profile
The Curriculum Levels Report
The Tabular Report – Excel file (.csv)
The Tabular Report – Cut scores
• Use sub levels and scores
• Consider standard error of measurement
2A (1374) → 3B (1390): a difference of +16
The Tabular Report – Cut scores
2P (1295) → 2P (1342): a difference of +47
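One reading of the two examples above: a sublevel jump (2A to 3B) can ride on a small, non-significant score gain, while a large, significant gain can leave the sublevel unchanged (2P to 2P). That is why the slide says to use sublevels and scores together, and to consider the standard error of measurement. A quick check against the >22 significance bar (the helper name is mine):

```python
THRESHOLD = 22  # points needed for a statistically significant difference

def describe(pair):
    """Compare one before/after result pair of (sublevel, score) tuples."""
    (lvl_a, a), (lvl_b, b) = pair
    gain = b - a
    return (f"{lvl_a} ({a}) -> {lvl_b} ({b}): +{gain}, "
            f"significant: {gain > THRESHOLD}")

# The two score pairs from the slides:
print(describe((("2A", 1374), ("3B", 1390))))  # sublevel rises, yet +16 is not significant
print(describe((("2P", 1295), ("2P", 1342))))  # sublevel unchanged, yet +47 is significant
```

A score just below a cut and a score just above it describe nearly the same student; the raw scores, with their error bars, carry the real information.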
The Progress Reports
The Progress Report – for 1 student
The Progress Report – for a group
The Progress Report – more than two assessments
Jade Battle’s overall score = 3B
Strengths
• Any element sublevel score two sublevels or more above the student’s overall score.
• In Jade Battle’s case this would be scores of 3A and above.
Gaps
• Any element sublevel score two sublevels or more below the student’s overall score.
• In Jade Battle’s case this would be scores of 2P and below.
Achieved
• Any element sublevel score within a sublevel of the student’s overall score.
• In Jade Battle’s case this would be 2A, 3B or 3P.
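The sublevel rules above are just ordinal arithmetic on the B/P/A bands. A minimal sketch, assuming the standard ordering of sublevels within each curriculum level (the function name is mine):

```python
# Ordinal scale of curriculum sublevels: B (basic), P (proficient),
# A (advanced) within each level, so ... 2B, 2P, 2A, 3B, 3P, 3A ...
SUBLEVELS = [f"{lvl}{band}" for lvl in range(1, 7) for band in "BPA"]

def classify_element(element: str, overall: str) -> str:
    """Apply the report's rule: two or more sublevels above the overall
    score is a Strength, two or more below is a Gap, within one sublevel
    is Achieved."""
    diff = SUBLEVELS.index(element) - SUBLEVELS.index(overall)
    if diff >= 2:
        return "Strengths"
    if diff <= -2:
        return "Gaps"
    return "Achieved"

# Jade Battle's overall score is 3B (from the slide):
print(classify_element("3A", "3B"))  # Strengths (two sublevels above)
print(classify_element("2A", "3B"))  # Achieved  (one sublevel below)
print(classify_element("2P", "3B"))  # Gaps      (two sublevels below)
```

Running the three example scores from the slide through the rule reproduces exactly the classifications given for Jade Battle.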
The Individual Learning Pathway Report
- Writing
Manage Students Area
Manage students – Student details
Download Student Logins
• By PDF or CSV (Excel).
• As a PDF you can print it out as a card, hand it to the student, and receive it back once the password has been changed.
• To reset a password, go into “Manage existing students”, choose the students, and then click “reset password”.
Manage students – Using groups
• Some groups will automatically be created when SMS data is imported - most often Class, Year, etc.
• However, you can create custom groups that enable the viewing of reports etc. for that particular group.
• By doing this you DO NOT need to test a certain group again. You can use existing data but just view reports for the selected students in your group
• Examples of custom groups could be:
• Your top reading group
• An extension or remedial group
• Māori boys or similar groups that need to be monitored
• Target students
Manage Students Area
– Student result summary
Target setting
Target setting
The important learning conversation to be had:
What do we need to focus on to help you achieve this target?
Accessing e-asTTle
Surface and Deep in e-asTTle – The SOLO Taxonomy
• e-asTTle uses the SOLO Taxonomy – Structure of Observed Learning Outcomes (SOLO)
• SURFACE (increase in quantity): Unistructural, Multistructural
• DEEP (change of quality): Relational, Extended Abstract
Surface Processes
Unistructural: requires the knowledge or use of only one piece of given information, fact, or idea, obtained directly from the problem.
Multistructural: requires knowledge or use of more than one piece of given information, fact, or idea, each used separately, or two or more distinct steps, with no integration of the ideas. This is fundamentally an unsorted, unorganised list.
Deep Processes
Relational: integration of more than one piece of given knowledge, information, fact, or idea. At least two separate ideas are required that, working together, will solve the problem.
Extended Abstract: a higher level of abstraction. The items require the student to go beyond the given information, knowledge, or ideas and deduce a more general rule or proof that applies to all cases.
Maths Example – Algebra: Patterns in Number
Given:
Houses: 1, 2, 3
Sticks: 5, 9, __
• How many sticks are needed for 3 houses? UNISTRUCTURAL
• How many sticks are there for 5 houses? ______ MULTISTRUCTURAL
• If 52 houses require 209 sticks, how many sticks do you need to be able to make 53 houses? ______ RELATIONAL
• Make up a rule to count how many sticks are needed for any number of houses. EXTENDED ABSTRACT
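The extended-abstract rule for this pattern is sticks = 4 × houses + 1: the first house costs 5 sticks, and each attached house shares a wall and adds 4. Encoding it confirms the given values and answers the earlier questions on the ladder:

```python
def sticks(houses: int) -> int:
    """Extended-abstract rule for the matchstick-house pattern:
    the first house takes 5 sticks; each additional attached house
    shares a wall, so it adds only 4 more."""
    return 4 * houses + 1

print(sticks(3))   # 13  (unistructural: extend the given table by one)
print(sticks(5))   # 21  (multistructural: several separate steps)
print(sticks(52))  # 209 (matches the value given in the problem)
print(sticks(53))  # 213 (relational: 209 sticks for 52 houses, plus 4)
```

The SOLO ladder is visible in the code's uses: the lower levels read values off the table one at a time, while the extended-abstract level is the function itself, a rule that works for any number of houses.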
Reading Example – Goldilocks and the Three Bears
• “Whose house did Goldilocks go into?” UNISTRUCTURAL
• “What are three aspects about the way the bears live that tell us that the story is not a real-life situation?” MULTISTRUCTURAL
• “Goldilocks eats the baby bear’s food, breaks his chair, and sleeps in his bed. What does this tell us about the kind of person she is?” RELATIONAL
• “Why do nursery tales allow wild animals to act in human fashion?” EXTENDED ABSTRACT
SOLO Taxonomy
Hattie, J.A.C., & Brown, G.T.L. (2004, September). Cognitive processes in asTTle: The SOLO taxonomy. asTTle Technical Report #43, University of Auckland/Ministry of Education.
• The curriculum sublevel score does not seem to change the likelihood of scoring higher in surface or deep thinking.
• There appears to be a link between a student’s performance relative to the mean of his/her year group and the depth-of-thinking score. If the score is significantly below the national norm, the student has a greater likelihood of a higher deep score than surface score.
Surface vs Deep
Question | SOLO taxonomy classification | e-asTTle classification
“Whose house did Goldilocks go into?” | Unistructural | Surface
“What are three aspects about the way the bears live that tell us that the story is not a real life situation?” | Multistructural | Surface
“Goldilocks eats the baby bear’s food, breaks his chair, and sleeps in his bed. What does this tell us about the kind of person she is?” | Relational | Deep
“Why do nursery tales allow wild animals to act in human fashion?” | Extended Abstract | Deep
• What this suggests is that while surface questions are cognitively less demanding they may often require more involvement with the text than their deeper counterparts.
• This could explain why students scoring below the mean score for reading achievement do poorly with surface questions in comparison to deep questions.
• It could be that they simply do not have the reading comprehension and processing skills needed to consistently answer questions directly related to text.
Summary of student year group data
Year group No. of students
Year 4 220
Year 5 242
Year 6 244
Year 7 253
Year 8 225
Year 9 164
Year 10 139
Total students 1487
Year | aRs Score | Surface Score | Deep Score | Difference (Surface − Deep)
4 | 466 | 448 | 492 | −44
All Students - aRs Scores Compared to the National Mean Score
 | Deep score higher than surface | Surface score higher than deep | Deep and surface scores the same
aRs more than 15 points below national aRs mean | 61% | 25% | 14%
aRs within 15 points of national aRs mean | 36% | 36% | 28%
aRs more than 15 points above national aRs mean | 43% | 37% | 20%
The analyses of the asTTle reading test data point toward a number of key trends. Some concern must be raised at the nature of the Year 10 data given its enormous skew towards higher deep scores, but given that the rest of the data is representative of normal asTTle results we can observe that:
• In depth of thinking within asTTle, a student has a far greater likelihood of scoring higher in deep thinking than in surface thinking.