Met Project 2012_asking Students About Teaching
7/30/2019 Met Project 2012_asking Students About Teaching

MET project
Asking Students about Teaching: Student Perception Surveys and Their Implementation
Policy and Practice Brief
ABOUT THIS REPORT: Based on the experience of the Measures of Effective Teaching (MET) project, its partners, and other districts, states, and organizations, this report is meant for practitioners and policymakers who want to understand student surveys as potential tools for teacher evaluation and feedback, as well as the challenges and considerations posed by their implementation. MET project findings on the reliability and validity of student surveys are explored in depth in a separate research report, Learning about Teaching: Initial Findings from the Measures of Effective Teaching Project.
ACKNOWLEDGMENTS: This document is based on the collective knowledge of many individuals who shared their experiences with student surveys for the benefit of their peers in other states, districts, and organizations. Thanks to:
Ryan Balch, Vanderbilt University
Kelly Burling and colleagues, Pearson
Carrie Douglass, Rogers Family Foundation
Ronald Ferguson, Harvard University and the Tripod Project for School Improvement
Tiffany Francis, Sam Franklin, and Mary Wolfson, Pittsburgh Public Schools
James Gallagher, Aspire Public Schools
Alejandro Ganimian, Harvard University
Hilary Gustave, Denver Public Schools
Monica Jordan and Rorie Harris, Memphis City Schools
Kevin Keelen, Green Dot Public Schools
Jennifer Preston, North Carolina Department of Public Instruction
Rob Ramsdell and Lorraine McAteer, Cambridge Education
Valerie Threlfall and colleagues, Center for Effective Philanthropy
Ila Deshmukh Towery and colleagues, TNTP
Deborah Young, Quaglia Institute for Student Aspirations
Sara Brown Wessling, Johnston High School in Johnston, Iowa
ABOUT THE MET PROJECT: The MET project is a research partnership of academics, teachers, and education organizations committed to investigating better ways to identify and develop effective teaching. Funding is provided by the Bill & Melinda Gates Foundation. For more information, visit www.metproject.org.
September 20
Contents
Why Ask Students about Teaching? 2
Measure What Matters 5
Ensure Accuracy 11
Ensure Reliability 14
Support Improvement 18
Engage Stakeholders 21
Appendix: Tripod 7 Cs Survey Items 23
Why Ask Students about Teaching?

No one has a bigger stake in teaching effectiveness than students. Nor are there any better experts on how teaching is experienced by its intended beneficiaries. But only recently have many policymakers and practitioners come to recognize that, when asked the right questions in the right ways, students can be an important source of information on the quality of teaching and the learning environment in individual classrooms.
A New Focus on the Classroom

As states and districts reinvent how teachers are evaluated and provided feedback, an increasing number are including student perception surveys as part of the mix. In contrast to the kinds of school climate surveys that have been commonly used in many systems, these student perception surveys ask students about specific teachers and specific classrooms. District-wide administration of such surveys has begun in Denver, Memphis, and Pittsburgh. Statewide pilots have taken place in Georgia and North Carolina. And student surveys are now used for feedback and evaluation by the New Teacher Center, TNTP (formerly The New Teacher Project), and such charter management organizations as Aspire Public Schools and Green Dot Public Schools.
There are many reasons for this new interest. Teaching is a complex interaction among students, teachers, and content that no one tool can measure. The search for different-but-aligned instruments has led many to use student surveys as a complement to such other tools as classroom observations and measures of student achievement gains.

Analysis by the Measures of Effective Teaching (MET) project finds that teachers' student survey results are predictive of student achievement gains. Students know an effective classroom when they experience one. That survey results predict student learning also suggests surveys may provide outcome-related results in grades and subjects for which no standardized assessments of student learning are available.

Further, the MET project finds student surveys produce more consistent results than classroom observations or achievement gain measures (see the MET project's Gathering Feedback policy and practitioner brief). Even a high-quality observation system entails at most a handful of classroom visits, while student surveys aggregate the impressions of many individuals who've spent many hours with a teacher.
Student surveys also can provide feedback for improvement. Teachers want to know if their students feel sufficiently challenged, engaged, and comfortable asking them for help. Whereas annual measures of student achievement gains provide little information for improvement (and generally too late to do much about it), student surveys can be administered early enough in the year to tell teachers where they need to focus so that their current students may benefit. As feedback tools, surveys can be powerful complements to other instruments. Surveys might suggest students' misunderstandings aren't being addressed; done well, observations can diagnose a teacher's specific attempts at clarification.
Imperatives for Implementation

Benefits aside, student perception surveys present their own set of challenges and considerations to states and districts. Some of these relate to the survey instrument itself. Not every survey will produce meaningful information on teaching. Not to be confused with popularity contests, well-designed student perception surveys capture important aspects of instruction and the classroom environment. Rather than pose, "Do you like your teacher?" typical items might ask the extent to which students agree that "I like the way the teacher treats me when I need help" and "If you don't understand something, my teacher explains it another way." Survey development is a sophisticated, data-driven process.
But even a good instrument, implemented poorly, will produce bad information. Attending to issues such as student confidentiality, sampling, and accuracy of reporting takes on greater urgency as systems look toward including student surveys in their evaluation systems. The care with which systems must administer surveys in such contexts is akin to that required in the formal administration of standardized student assessments. Smooth administration and data integrity depend on piloting, clear protocols, trained coordinators, and quality-control checks.
The following pages draw on the experience of the MET project, its partners, and other leading school systems and organizations to clarify four overriding requirements of any system considering student surveys as part of formal feedback and evaluation for teachers:

1. Measure what matters. Good surveys focus on what teachers do and on the learning environment they create. Surveys should reflect the theory of instruction that defines expectations for teachers in a system. Teachers with better survey results should also have better outcomes on measures of student learning.

2. Ensure accuracy. Student responses should be honest and based on clear understanding of the survey items. Student confidentiality is a must. Accuracy also means that the right responses are attributed to the right teacher.

3. Ensure reliability. Teachers should have confidence that surveys can produce reasonably consistent results and that those results reflect what they generally do in their classrooms, not the idiosyncrasies of a particular group of students. Reliability requires adequate sampling and an adequate number of items, but without overtaxing students.

4. Support improvement. Measurement for measurement's sake is wasted effort. Teachers should receive their results in a timely manner, understand what they mean, and have access to professional development resources that will help them target improvement in areas of need. Student surveys are as much about evaluating systems of support for teachers as they are about diagnosing the needs within particular classrooms.
Addressing these requirements amid finite resources and competing concerns necessarily involves trade-offs. Although the study of student surveys has intensified, research to date cannot offer hard and fast rules for striking the right balance on many issues. Highlighted throughout this report are some of the varied approaches that systems are taking. But implementation in the field is a work in progress. Individual districts and states are piloting before deciding how to deploy surveys formally and at scale. This document concludes by emphasizing the importance of stakeholder engagement for achieving positive results.
It will be important to monitor how student surveys behave as they move into the realm of formal evaluation. The Tripod survey studied by the MET project, developed by Harvard researcher Ronald Ferguson, has been refined over 11 years of administration in many thousands of classrooms as a research and professional development tool. But only recently have systems begun to consider its use as part of formal teacher evaluation.
Any instrument, when stakes are attached, could distort behavior in unwanted ways, or produce a less accurate picture of typical practice. One can imagine a teacher who, consciously or not, acts more lenient if student surveys are factored into evaluation, even though well-designed surveys stress a balance of challenge and support. This is further reason for multiple measures. It lets teachers stay focused on effective teaching and not on any one result.
Benefits of Student Perception Surveys
1. Feedback. Results point to strengths and areas for improvement.
2. Face validity. Items reflect what teachers value.
3. Predictive validity. Results predict student outcomes.
4. Reliability. Results demonstrate relative consistency.
5. Low cost. Expense of administration is minimal.
Measure What Matters

If the objective of measurement is to support improved student outcomes, then the measures used should relate to those outcomes, and to the means teachers use to help students achieve them. It's a problem if a measure has no relationship to how teachers in a system are expected to teach, or to the outcomes for which they're held accountable. Making sure those relationships exist should be part of adopting a student perception survey.
Qualities of Teaching
To measure something broad like teaching requires breaking the activity down into discrete components that together represent a theory about the instruction that supports student learning. Developers of student perception surveys often refer to these components as "constructs": different aspects of teaching and the learning environment, the quality of which are indicated by responses to multiple related questions about students' observations and feelings.
Consider the Tripod survey employed in the MET project study. Tripod is designed to measure teaching, student engagement, school norms, and student demographics. To measure teaching, the survey groups items under seven constructs, called the 7 Cs: Care, Control, Challenge, Clarify, Confer, Captivate, and Consolidate. For each, the survey poses a series of statements, asking students' level of agreement on a five-point scale. Here are two items under Clarify:

"My teacher has several good ways to explain each topic that we cover in class."

"My teacher knows when the class understands and when we do not."
An important quality of any measure of teaching is its ability to differentiate among classrooms. If all teaching looks the same, the instrument is not providing much information for feedback.

As shown in figure 1, students who completed the Tripod survey as part of the MET project perceived clear differences among teachers. The chart shows the percentage of students agreeing with one statement from each construct among classrooms at the 75th and 25th percentiles in terms of how favorably their students responded.

Note the results for the item: "The comments I get help me know how to improve." Among classrooms at the 75th percentile in terms of favorable responses to that statement, more than three-quarters of students agreed. Among classrooms at the 25th percentile, well under half of students agreed. These results don't show how survey results relate to student achievement (that is addressed further on), but the contrasts suggest that students are capable of discerning
Figure 1: Students Perceive Clear Differences among Teachers. Percentage of students agreeing with each statement.
significant differences among classrooms.
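The differentiation check behind figure 1 can be sketched in a few lines. This is a hypothetical illustration, not the MET project's actual analysis; all classroom data below are invented:

```python
from statistics import quantiles

def classroom_agreement(responses_by_classroom):
    """Fraction of students in each classroom agreeing with one item.

    responses_by_classroom: dict mapping classroom id to a list of
    booleans (True = the student agreed with the statement).
    """
    return {
        room: sum(answers) / len(answers)
        for room, answers in responses_by_classroom.items()
    }

def differentiation_spread(responses_by_classroom):
    """Agreement rate at the 25th vs. 75th percentile classroom.

    A wide gap suggests the item distinguishes among classrooms;
    a narrow gap suggests it provides little feedback information.
    """
    rates = sorted(classroom_agreement(responses_by_classroom).values())
    q1, _, q3 = quantiles(rates, n=4)  # 25th, 50th, 75th percentiles
    return q1, q3

# Invented data: four classrooms, agreement flags for one item.
rooms = {
    "A": [True] * 8 + [False] * 2,   # 80% agree
    "B": [True] * 3 + [False] * 7,   # 30% agree
    "C": [True] * 6 + [False] * 4,   # 60% agree
    "D": [True] * 5 + [False] * 5,   # 50% agree
}
p25, p75 = differentiation_spread(rooms)
```

In a real system the same comparison would be run per item across hundreds of classrooms, as in the MET project chart.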
Tripod is one of the oldest and most widely used off-the-shelf survey instruments, and the only one studied by the MET project. Some school systems have drawn from Tripod items included in the MET project as they adopt student surveys. But other tools exist. As shown in the table on pages 7 and 8, highlighting four currently available off-the-shelf surveys that ask about specific classrooms, instruments share many similar structures and constructs. Differences in emphasis often relate to a survey's particular origins. Many specifics on the number of items and their wording are moving targets as developers adjust their instruments based on ongoing research and system needs.
An important aspect of any measure of teaching is its ability to distinguish among classrooms. Measures that fail this test provide little information for feedback. The chart shows the percentage of students agreeing with one statement from each of the Tripod student perception survey's constructs among classrooms ranked at the 75th and 25th percentiles in terms of how favorably their students responded. Clearly, the students were able to discern significant differences among classrooms in terms of the qualities measured by the survey.
Student Perception Surveys with Classroom-Level Items

Tripod (www.tripodproject.org)

Background: Developed by Harvard researcher Ronald Ferguson; distributed and administered by Cambridge Education. Online and paper administration. Grew out of a study of student engagement and related teaching practices. Versions for three grade bands: K-2; 3-5; 6-12. The MET project study found Tripod predictive of achievement gains and able to produce consistent results for teachers (see the MET project's Learning about Teaching).

Number of items: Core are approximately 36 items in the Tripod 7 Cs at the secondary level; fewer at earlier grades. Additional items ask about students' engagement, background, and academic beliefs. Full version includes 80+ items; shorter forms available.

Constructs: The 7 Cs: 1. Care; 2. Control; 3. Clarify; 4. Challenge; 5. Captivate; 6. Confer; 7. Consolidate. Also includes additional engagement items on: academic goals and behaviors; academic beliefs and feelings; social goals and behaviors; social beliefs and feelings.

Response options: Grades 3-5/6-12: 1. No, never/Totally untrue; 2. Mostly not/Mostly untrue; 3. Maybe, sometimes/Somewhat; 4. Mostly yes/Mostly true; 5. Yes, always/Totally true. Grades K-2: No, Maybe, Yes.

Sample items: Clarify: "My teacher explains difficult things clearly." Consolidate: "My teacher takes the time to summarize what we learn each day." Control: "Our class stays busy and doesn't waste time."

YouthTruth (www.youthtruthsurvey.org)

Background: Developed and distributed by the Center for Effective Philanthropy. Started as a school-focused tool at the secondary level; now includes classroom-level items adapted from YouthTruth Rigor & Relationships constructs and from Tripod 7 Cs highlighted by the MET project study. Versions for grades 6-8 and 9-12; online and paper. TNTP is using YouthTruth to administer Tripod-based items as one component in evaluation to determine if TNTP-trained novice teachers are recommended for licensure. Analysis for predictive validity and reliability of classroom-level items as administered by YouthTruth is forthcoming.

Number of items: Version used by TNTP includes 25 items.

Constructs: Constructs drawn from Tripod 7 Cs highlighted by the MET project study, plus: Rigor; Relationships.

Response options: 1. Strongly disagree; 2. Somewhat disagree; 3. Neither agree nor disagree; 4. Somewhat agree; 5. Strongly agree.

Sample items: Rigor: "The work that I do in this class really makes me think." Relationships: "My teacher is willing to give extra help on school work if I need it."

(continued on p. 8)
Student Perception Surveys with Classroom-Level Items (continued)

My Student Survey (www.mystudentsurvey.com)

Background: Developed by Ryan Balch, Expert Fellow at Vanderbilt University. Based on research-based teaching practices and validated observation rubrics such as the Framework for Teaching. Versions for grades 4-5 and 6-12. A developer-led study found results reliably consistent and predictive of student achievement and student engagement when administered to 15,000 students in Georgia (see the My Student Survey site for the research report).

Number of items: 55 items.

Constructs: 1. Presenter; 2. Manager; 3. Counselor; 4. Coach; 5. Motivational Speaker; 6. Content Expert.

Response options: 1. Never; 2. Sometimes; 3. Often; 4. Almost always; 5. Every time.

Sample items: Presenter: "When presenting new skills or ideas in class, my teacher tells us about mistakes that students often make." Coach: "My teacher gives us guidelines for assignments so we know how we will be graded (grading rules, charts, rubrics, etc.)."

iKnowMyClass (www.iKnowMyClass.com)

Background: Developed by Russell Quaglia at the Quaglia Institute for Student Aspirations in Portland, ME, as a tool for teacher feedback. Online administration only; no capacity to link individual student results to other data. Focus on student engagement and relationships. Classroom-level survey available for grades 6-12; a version for grades 3-5 is planned for release September 2012. A developer-led study validated the 6-12 version for construct validity (items within each construct found to be related), but no study yet on the tool's ability to predict student outcomes. Quaglia is also working with the Pearson Foundation to adapt its school-level MyVoice survey for the classroom level (grades 3-5 and 6-12) (see www.myvoice.pearsonfoundation.org).

Number of items: Full and short versions for grades 6-12. Full: 50 items. Short: 20 items. Grades 3-5 version to have 27 items.

Constructs: 1. Engagement; 2. Relevance; 3. Relationships; 4. Class Efficacy; 5. Cooperative Learning Environment; 6. Critical Thinking; 7. Positive Pedagogy; 8. Discipline Problems.

Response options: 1. Strongly disagree; 2. Disagree; 3. Undecided; 4. Agree; 5. Strongly agree.

Sample items: Class Efficacy: "I feel comfortable asking my teacher for individual help about the things we are learning." Cooperative Learning: "The teacher encourages students to work together." Positive Pedagogy: "I am encouraged to use my imagination."
Checking for Alignment with a Theory of Instruction

Under a new evaluation system designed with input from teacher leaders and administrators in the Memphis City Schools, stakeholder perceptions, including student survey responses, account for 5 percent of teachers' overall ratings. Other components include:

Classroom observations: 40%
Student achievement/learning gains: 50%
Evidence of content knowledge: 5%

Designers of the evaluation system recognized the importance of alignment between the different components so that teachers received consistent messages about their performance. In adopting Tripod as its student survey, they mapped the instrument's constructs and items to 11 dimensions in the district's evaluation framework that relate to teaching and the classroom environment. They found all of the dimensions represented to some degree in the survey. Examples in the table below illustrate the alignment.
Alignment of Evaluation Framework and Survey Components

Memphis City Schools Teacher Evaluation Rubric | Tripod Student Survey

Evaluation rubric dimension / survey construct: Use strategies that develop higher-level thinking skills | Challenge
Example indicator/item: Questions require students to apply, evaluate, or synthesize | "My teacher wants me to explain my answers: why I think what I think."

Evaluation rubric dimension / survey construct: Check for understanding and respond appropriately during the lesson | Clarify
Example indicator/item: If an attempt to address a misunderstanding is not succeeding, the teacher, when appropriate, responds with another way of scaffolding | "If you don't understand something, my teacher explains it another way."
Predicting Outcomes

Measuring what matters also means capturing aspects of teaching and the learning environment that relate to desired student outcomes. If not, the resulting feedback and evaluation won't support the improvements a school system cares about, or worse, they may be counterproductive. This refers to predictive validity: the extent to which survey results predict which teachers will have more student achievement gains. For a survey to be predictively valid, it means that, on average, the teachers who get the most favorable survey responses are also those who are helping students learn the most. If students perceive differences among teachers, those differences should generally predict student outcomes.
The MET project's analysis of Tripod found this to be true for the sample of teachers it studied. MET project researchers ranked teachers based on the proportion of their students who gave favorable responses to the items within each of the 7 Cs. When they then looked at those same teachers' student achievement gains when teaching other students, they found that those identified as being in the top 25 percent
based on Tripod results had students who were learning the equivalent of about 4.6 months of schooling more in math over the course of the year, measured using state tests, than the students of teachers whose survey results were in the bottom 25 percent. Tripod's predictive power accounts for less than half as many months' difference when it came to gains based on state English language arts (ELA) tests (a common finding among many measures), but a clear relationship was found nonetheless.
The positive relationship between teachers' Tripod results and their student achievement gains is further shown in figure 2.
Systems that use student surveys should similarly test for predictive validity by comparing teachers' survey results with their student achievement gains, as they should for any measure they use. Moreover, checking for predictive validity needs to be an ongoing process. Over time, alignment with student outcomes could deteriorate. This could happen if somehow teachers altered their actions in ways that improved their survey results, but without improving their underlying performance on practices associated with better outcomes. In such a situation, a system would see that teachers' rankings based on survey results bore little relationship to their students' learning gains. Such misalignment could signal the need for survey refinement.
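The ongoing check described above could be sketched as a rank correlation between one year's survey rankings and the following year's value-added scores. This is a simplified illustration, not the MET project's methodology: the classic Spearman formula below assumes no tied values, and all teacher data are invented:

```python
def ranks(values):
    """Rank values 1..n (no tie handling; illustration only)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1))."""
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Invented teacher data: year-1 Tripod percentile, year-2 value-added score.
tripod_year1 = [12, 35, 50, 68, 90]
value_added_year2 = [-0.03, -0.01, 0.01, 0.02, 0.06]
rho = spearman(tripod_year1, value_added_year2)

# A correlation that drifts toward zero across administrations could
# signal the misalignment described above and a need for refinement.
```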
Figure 2: Teachers' Tripod Student Survey Results Predict Achievement Gains

[Two scatterplot panels: "Tripod Results and Math Achievement Gains" and "Tripod Results and English Language Arts Achievement Gains." X-axes: Percentile Rank on Tripod, Year 1 (0-100). Y-axes: Math and ELA Value-Added Scores, Year 2 (-0.05 to 0.11).]
These slopes represent the relationship between teachers' rankings based on Tripod student survey results from one year and their value-added scores, a measure of student achievement gains, from the following year. As shown, teachers with more favorable Tripod results had better value-added scores. Achievement gains here are based on state assessments.
Ensure Accuracy

For results to be useful, they must be honest. It's of no benefit if survey responses reflect misunderstanding of what's being asked, or simply what students think others want them to say. Worse yet are results attributed to the wrong teacher. Honest feedback requires carefully worded items and assurances of confidentiality. Standardized procedures for how surveys are distributed, proctored, and collected are a must. Accurate attribution requires verification.
Item Wording

Survey items need to be clear to the students who respond to them. A well-crafted item asks about one thing. One wouldn't pose the statement: "This class stays busy all the time and the students act the way the teacher wants." Good items also avoid double negatives, and their language is age appropriate. Wording for similar items may vary depending on the grade level. Note the slight difference in explicitness of the following two items from Tripod:

Grades 3-5: "When my teacher marks my work, he/she writes on my papers to help me understand."

Grades 6-12: "The comments that I get on my work in this class help me understand how to improve."
Student perception survey development involves discussion with students to determine if they're interpreting the items as intended. Through systematic cognitive interviews, survey developers probe both students' understanding of each item and whether the item addresses its desired objective. Items that aren't clear are reworded and tested again. During survey administration, students are generally told not to answer items they don't understand, as a further check against meaningless results and to indicate which items may need clarification.
Younger Students and Special Populations

Well-designed surveys account for the fact that not all students read at the same grade level. Even still, school systems often involve special education teachers and teachers of English language learners (ELL) in reviewing an instrument's survey items before they are used. For some ELL students, surveys that are translated into their native language may need to be available.

Special care is also required when surveying the youngest students, some of whom are not yet able to read. Although not part of the MET project analysis, Tripod has a K-2 version of its survey, which differs from the grades 3-5 and 6-12 forms in three respects:

It has fewer items overall;

It has three response options (yes/no/maybe) instead of five; and
Items are worded more simply, for example:

K-2: "My teacher is very good at explaining things."

Grades 3-5: "If you don't understand something, my teacher explains it another way."

Survey administration in grades K-2 also requires accommodations. Given the need to ensure confidentiality while also reading items to younger students, someone other than a class teacher must proctor. Often, young students are administered surveys in small groups, not entire classes, to allow adequate support from proctors. Even with such accommodations, results should be monitored for indications that items are being understood. A lack of consistency in responses to items within the same construct, for example, might signal the need for different wording.
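One standard way to quantify the within-construct consistency just described is Cronbach's alpha, an internal-consistency statistic this report does not name but which survey analysts commonly use for exactly this check. A minimal sketch with invented responses:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Internal-consistency estimate for one construct's items.

    item_scores: list of per-item lists, each holding one score per
    student (e.g., 1-5 agreement). A low alpha for a construct could
    flag items that students are interpreting in different ways.
    """
    k = len(item_scores)
    totals = [sum(per_student) for per_student in zip(*item_scores)]
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Invented data: three items from one construct, four students each,
# responding consistently across the items.
consistent_items = [[5, 4, 2, 1], [5, 5, 2, 1], [4, 4, 1, 1]]
alpha = cronbach_alpha(consistent_items)
```

Students who answer the construct's items coherently push alpha toward 1; haphazard responses pull it toward 0, the signal that rewording may be needed.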
Ensuring Confidentiality

Confidentiality for students is a non-negotiable if surveys are part of formal feedback and evaluation. If students believe their responses will negatively influence how their teachers treat them, feel about them, or grade them, then they'll respond so as to avoid that happening. More fundamentally, students shouldn't be made to feel uncomfortable. They should be told, in words and actions, that their teachers will not know what individual students say about their classrooms.

Consistently applied protocols are essential for providing students such assurance. Although in many situations teachers will distribute student perception surveys in their own classrooms, no teacher should receive back a completed survey form that would allow the teacher to identify who filled it out. In the MET project, following procedures generally employed in administering Tripod, paper surveys were distributed to students with their names on peel-off labels that they removed before completing them. All that remained on the form when they finished were unique bar codes to let researchers link their responses to other data collected for the study but which no school personnel could use to identify respondents. Students also placed their completed forms in opaque envelopes and sealed them.
Confidentiality also requires setting a minimum number of respondents for providing teachers with results. If results for a class are based on surveys from only three or four students, then a teacher could conjecture that a highly favorable or unfavorable result for an item reflects how all of those students responded individually. (Although less likely, this still could happen if all students in a class give the same highly unfavorable or favorable response to a teacher.)
Accurate Attribution

Systems must be certain about which teacher and class each completed survey belongs to. Part of ensuring this requires making sure students have the right teacher and class in mind when responding, through verbal and written instructions. For classes taught by more than one teacher, systems may
Procedures for Confidentiality

Denver Public Schools is piloting a student survey for use in a district-wide teacher evaluation system slated for full implementation in the 2014-15 school year. As part of the pilot, paper surveys were administered in about 130 schools in fall 2011 and spring 2012. One goal of the pilot is to hone administration guidelines. Here are a few ways current guidelines address confidentiality:

Scripts are provided to teachers on survey administration that assure students: "I will never get to see your individual student answers, only a class summary of the responses." Students are also told the surveys will help teachers and principals understand what they do well and how they may be able to improve.

Teachers assign a student the task of collecting all completed surveys, sealing them in an envelope, and taking them to the school's survey coordinator.

Results from classes in which fewer than 10 students respond are not reported back to teachers because low response rates may suggest how individual students responded. Teachers are told to strive for classroom response rates of at least 80 percent. (Teachers with fewer than five students in a class do not participate.)
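Denver's guidelines amount to a simple reporting rule. The thresholds below are taken from the sidebar; the function name and return strings are hypothetical:

```python
MIN_RESPONSES_TO_REPORT = 10   # below this, results are withheld
MIN_CLASS_SIZE = 5             # smaller classes do not participate
TARGET_RESPONSE_RATE = 0.80    # rate teachers are asked to strive for

def reporting_decision(class_size, responses):
    """Decide whether a class's survey results go back to the teacher."""
    if class_size < MIN_CLASS_SIZE:
        return "class does not participate"
    if responses < MIN_RESPONSES_TO_REPORT:
        return "results withheld (too few respondents)"
    if responses / class_size < TARGET_RESPONSE_RATE:
        return "results reported; response rate below 80% target"
    return "results reported"
```

Encoding the rule this way makes the confidentiality floor an automatic property of the reporting pipeline rather than a manual judgment call.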
need to decide who is the teacher of
record (the one primarily responsible
for instruction) or administer separate
surveys for each teacher in a classroom.
A significant challenge is posed by linking
each student's individual responses
to a particular teacher and class.
Doing so while ensuring confidentiality
generally requires that each survey
form be assigned a unique student
identifier before a student completes it.
While systems can administer surveys
without this feature and still attribute
a classroom's group of surveys to the
right teacher, they wouldn't be able
to compare how the same students
respond in different classrooms or at
different points in time. Nor could they
connect survey results with other data
for the same students. Such connections
can help in assessing the effectiveness
of interventions and the reliability of the
survey itself. They also allow for more
data integrity checks.
Distributing surveys with individual
identifiers for each student means
generating lists of the students in each
class ahead of time, and then verifying
them. As simple as that may sound,
determining accurate class rosters for
every teacher in a district can be vexing.
Until recently, few stakes have been
attached to the accuracy of the student
and teacher assignments recorded in
the information systems from which
rosters would be generated. Class lists
from the start of the school year may
be out of date by the time a survey is
administered.
Rosters must be verified at the point in
time that surveys are administered, with
procedures in place to assign unique
identifiers to eligible students in a class
who aren't listed. (Typically, systems
don't survey students unless they've
been with a teacher for at least several
weeks to ensure familiarity.) Many
places implementing paper student
surveys employ procedures similar to those
used by the MET project in administering
Tripod, in which each teacher receives a
packet that includes:
A class roster to check and correct if
needed;

Survey forms with unique identifiers
for all students on the roster; and

A set of extra forms that each have
unique identifiers to assign to eligible
students in the class not listed in the
roster.
Similar verification procedures were
used for online administration in the
MET project, but with teachers distributing
login codes with unique identifiers
instead of paper forms.
How the Pittsburgh Public Schools Ensures Accuracy of Survey Attribution

Before survey administration:
After determining which periods to survey in each school, central office creates lists showing what each teacher teaches during those periods and which students are in those classes.
Principals check and correct these lists before survey forms are printed for each class to be surveyed.

During survey administration:
Teachers review and correct roster lists they receive with the survey forms for their classes.
Teachers are given extra survey forms with unique identifiers to assign to students missing from rosters provided.

After survey administration:
Completed surveys go to Cambridge Education (distributors of Tripod), which manages survey procedures and works with district administrators to quality-assure data before they are released to teachers.
Like many systems implementing student surveys, Pittsburgh Public Schools (PPS) has opted to assign
unique student identifiers to each survey form to allow for more quality-control checks and more precise
tracking of results. That means ensuring the accuracy of each teacher's student roster in those sections to
be surveyed. This figure shows the multiple points at which PPS checks class information during each survey
administration.
Figure 3
Ensure Reliability

Consistency builds confidence. Teachers want to know that any assessment of their practice reflects what they typically do in their classrooms, and not some quirk derived from who does the assessing, when the assessment is done, or which students are in the class at the time. They want results to be reliable.
Capturing a Consistent View
In many ways, the very nature of student
perception surveys mitigates against
inconsistency. The surveys don't ask
about particular points in time, but
about how students perceive what typically
happens in a classroom. Moreover,
a teacher's results average together
responses from multiple students.
While one student might have a particular
grudge, it's unlikely to be shared
by most students in a class; or if it is,
that may signal a real issue that merits
addressing.
It should perhaps not be surprising, then,
that the MET project found Tripod to be
more reliable than student achievement
gains or classroom observations (the
instruments studied to date). When
researchers compared multiple results
on the same measure for the same
teachers, the student surveys were
more likely than achievement gain measures
or observations to demonstrate
consistency. While a single observation
may provide accurate and useful
information on the instruction observed
at a particular time on a particular day,
it represents a small slice of what goes
on in a classroom over the course of the
year. In the context of the MET project's
research, results from a single survey
administration proved to be significantly
more reliable than a single observation.
Numbers of Items
Although student surveys can produce
relatively consistent results, reliability
isn't a given. Along with the quality of the
items used, reliability is in part a function
of how many items are included in a
survey. Both reliability and feedback can
be enhanced by including multiple items
for each of a survey's constructs.
Online or Paper?
Online and paper versions of the same survey have been
found to be similarly reliable. Instead, the question of which
form to use, or whether to allow for both, is largely a matter
of capacities. Online surveys allow for more automated
processing and are less prone to human error, but they
require an adequate number of computers within each school
and sufficient bandwidth to handle large numbers of students
going online at the same time.
Adapting and Streamlining a Survey Instrument
After initially administering a version of the Tripod survey
similar to that used in the MET project (with some 75
items), Denver Public Schools (DPS) officials heard from
some teachers concerned that the survey took too long
for students to complete. For this and other reasons, the
system engaged teachers in deciding how to streamline
the tool while still asking multiple items within each Tripod
construct. The result for 2011–12: a survey that includes
three questions for each of the Tripod's 7 Cs. (The district's
prekindergarten through grade 2 version includes nine
items total, with the construct of Care as the only one with
multiple items.) The table below compares the items under
Care in the upper elementary versions of the Tripod-MET
project survey with those in DPS's instrument to illustrate
the district's adaptation. The school system is assessing
results to determine the reliability of the streamlined tool
and the value teachers see in the feedback provided.
Tripod-MET project version: answers on a 5-point scale
(1. Totally untrue; 2. Mostly untrue; 3. Somewhat true; 4. Mostly true; 5. Totally true).

Care items:
I like the way my teacher treats me when I need help.
My teacher is nice to me when I ask questions.
My teacher gives us time to explain our ideas.
If I am sad or angry, my teacher helps me feel better.
My teacher seems to know when something is bothering me.
The teacher in this class encourages me to do my best.
My teacher in this class makes me feel that he/she really cares about me.

Denver Public Schools adaptation: answers on a 4-point scale
(1. Never; 2. Some of the time; 3. Most of the time; 4. Always).

Care items:
My teacher is nice to me when I need help.
If I am sad or angry, my teacher helps me feel better.
The teacher in this class encourages me to do my best.
The Tripod surveys administered in
the MET project included more than 75
items, but analysis for reliability and
predictive validity was based on 36 items
that related to the 7 Cs (see the appendix
for a complete list of the 7 C items in the
MET project's analysis). Other questions
asked about students' backgrounds,
effort, and sense of self-efficacy. To
understand how different-but-related
items can add informational value, consider
these two under the construct of
Clarify in the Tripod survey:

If you don't understand something,
my teacher explains it another way.

My teacher has several good ways
to explain each topic that we cover in
class.
Both relate to a similar quality, but in
different ways. A favorable response on
both presumably is a stronger indication
than on just one. Moreover, one can imagine
a teacher who often has several good
ways to explain topics but who doesn't
recognize when students misunderstand.
Different results for somewhat
different-but-related items can better
pinpoint areas for improvement.
At the same time, too many items might
result in a survey taking too much
instructional time to complete. Such
concerns lead many systems to shorten
their surveys after initial piloting. In
doing so, the balance they strive for is
to maintain reliability and the promise
of useful feedback while producing a
streamlined tool. Forthcoming analyses
by MET project partners will explore in
greater detail the relationship between
reliability and number of items.
Sampling
Even a comparatively reliable measure
could be made more so by using
bigger samples. When the MET project
analyzed a subset of participating
classrooms in which Tripod surveys
were given in December and March,
researchers found significant stability
across the two months for the same
teacher. However, averaging together
results from different groups of
students for the same teacher would
reduce the effects of any variance due
to the make-up of a particular class.
MET project research partners now are
analyzing results to determine the pay-off
in increased reliability from doing so.
But in practice, many school systems
are including two survey administrations
per year in their pilots as they weigh the
costs of the approach against the benefit
of being able to gauge improvement.
Sampling for Teachers Who Teach Multiple Sections
Deciding which students to survey for a teacher who teaches a self-contained classroom is straightforward: survey the entire
class. But a more considered approach is needed for teachers who teach multiple sections. Systems have experimented with
varied approaches:
Survey all students on all of their teachers. One option
that requires minimal planning to implement is to survey
all students on all of their teachers. But doing so would
have individual students completing surveys on several
teachers, which could prompt concerns of overburdening
students. Aspire Public Schools has recently tried
this approach but only after reducing the number of
questions in its survey.
Survey one class period per teacher with each
administration. While reducing the potential for
overburdening students, this approach still results in
some students completing surveys on more than one
teacher because typically in a school there is no one
class period during which all teachers teach. Whenever
surveys are given during more than one period, chances
are some students will complete surveys on more
than one teacher. Pittsburgh Public Schools, which
has surveyed twice a year, determines guidelines for
teachers at each school on which period to survey based
on analysis of class rosters meant to minimize the
burden on students. The district surveys during different
periods in the fall and spring to capture the views of
different students.
Survey a random sample across sections. After trying
various strategies, Green Dot Public Schools settled
on one designed to minimize the burden on students
while addressing concerns teachers expressed that one
section may not be adequately representative. Twice
a year, Green Dot uses software to randomly draw the
names of at least 25 students from all of those taught by
each teacher. It does so in such a way that no student in
a school is listed for more than two teachers. Charter
management organization officials use those lists to
create surveys that are then administered to students in
"advisory" classes, a kind of homeroom period focused
on study skills and youth development. So different
students in the same advisory complete surveys for
different teachers, but across all advisories, each
teacher has at least 25 randomly selected students
who are surveyed. Green Dot stresses that the strategy
requires a highly accurate and up-to-date student
information system.
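Green Dot's constraint (at least 25 randomly drawn students per teacher, with no student listed for more than two teachers) can be met with a simple greedy randomized draw. The following is a minimal sketch, not Green Dot's actual software; the function name and the tie-breaking strategy are assumptions for illustration:

```python
import random
from collections import defaultdict

def draw_survey_sample(rosters, per_teacher=25, max_per_student=2, seed=0):
    """Draw a random survey sample for each teacher while capping how many
    teachers' lists any one student appears on."""
    rng = random.Random(seed)
    load = defaultdict(int)  # number of teachers each student has been drawn for
    sample = {}
    # Handle the smallest rosters first; they have the least room to maneuver.
    for teacher in sorted(rosters, key=lambda t: len(rosters[t])):
        pool = list(rosters[teacher])
        rng.shuffle(pool)
        # Prefer students drawn for fewer teachers so far. The sort is stable,
        # so the shuffle remains the tie-breaker within each load level.
        pool.sort(key=lambda s: load[s])
        picked = [s for s in pool if load[s] < max_per_student][:per_teacher]
        for s in picked:
            load[s] += 1
        sample[teacher] = picked
    return sample
```

A greedy pass like this can fall short when rosters overlap heavily; a real implementation would verify that each teacher actually received the minimum sample and re-draw if not.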
Support Improvement

Accurate measurement is essential, but it is insufficient to improve effectiveness. Imagine the athlete whose performance is closely monitored but who never receives feedback or coaching. Likewise, without providing support for improvement, school systems will not realize a return on their investments in student surveys. That means helping teachers understand what to make of their results and the kinds of changes they can make in their practice to improve them.
The MET project has not completed
analysis on the effectiveness of feedback
and coaching, instead focusing initial
research on what it takes to produce
the kind of valid and reliable information
on which such support depends. But
examples of how survey results can be
used in feedback and support may be
found among those school systems at
the forefront of implementation.
Making Results Meaningful

Meaning comes from specificity, points
of reference, and relevance. It's of little
help to a teacher to be told simply "you
scored a 2.7 out of 4.0 on Care." To
understand what that means requires
knowing each question within the construct,
one's own results for each item,
and how those results compare with
those of other teachers. It also requires
knowledge of how Care, as defined by
the instrument, relates to the overall
expectations for teachers within the
school system.
For these reasons, how and when
results are presented to teachers is
critically important if student surveys
are to support improvement. Data
displays should be complete and easy to
interpret, allowing teachers to see the
strengths and weaknesses within their
classrooms and to see how the evidence
supports those judgments. Ideally, they
should integrate results from multiple
measures, calling out the connections
among results from surveys and classroom
observations, student achievement
gains, and other evaluation components.
The online teacher-evaluation information
system developed by Green Dot
Public Schools illustrates how survey
results can be presented meaningfully.
As shown in the excerpt in Figure
4, teachers who access their results via
the platform can compare their results
on each item to school-wide averages
and to average results across all Green
Dot schools. Overall averages are
provided, as well as the distribution of
responses for each item, showing the
extent to which students in a class feel
similarly.
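The per-item summary in such a display (an average on the response scale plus the distribution of answers) is simple to compute. The following is a sketch assuming responses are coded 1–4 from Strongly Disagree to Strongly Agree, as in Green Dot's display; the function is illustrative, not Green Dot's code:

```python
from collections import Counter

# Assumed coding of the 4-point response scale shown in the display.
LABELS = {1: "Strongly Disagree", 2: "Disagree", 3: "Agree", 4: "Strongly Agree"}

def summarize_item(responses):
    """Return (average score, percentage distribution) for one survey item."""
    counts = Counter(responses)
    n = len(responses)
    average = round(sum(responses) / n, 2)
    distribution = {label: round(100 * counts.get(code, 0) / n, 1)
                    for code, label in LABELS.items()}
    return average, distribution
```

Running the same routine over a teacher's class, all classes in the school, and all classes system-wide would yield the three comparison rows shown for each item.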
Another screen on Green Dot's platform
shows teachers their survey
results organized by each indicator in
the charter management organization's
teacher evaluation rubric, for
example, showing their average results
for all items related to the indicator
of communicating learning objectives
to students. Elsewhere a high-level
dashboard display presents to teachers
summary results on each of Green Dot's
measures (observations, surveys, and
student achievement gains) as well as
overall effectiveness ratings.
Coaching
Rare are the individuals who get better
at something by themselves. For most
people, improvement requires the
example and expertise of others. While
student surveys can help point to areas
for improvement, they can't answer the
question: Now what? Motivation without
guidance is a recipe for frustration.
Unfortunately, education has a generally
poor track record when it comes to
providing effective professional development.
Although teachers now employ
cycles of data-based instructional planning
to support their students' learning,
school systems have lagged far behind
in implementing cycles of professional
growth planning for teachers. One-shot
workshops remain the norm in many
places.
Even many school systems at the forefront
of implementing student surveys
have yet to meaningfully integrate the
tools into robust coaching and planning
structures. One example of what
such a structure might look like comes
from a research project under way
in Memphis, and soon to expand to
Charlotte, N.C.: the Tripod Professional
Figure 4. Green Dot's Display of Student Survey Results

[Figure: For each of four survey items ("I can ask the teacher for help when I need it," "I like the way we learn in this class," "I know what I need to do in order to be successful," and "I know how each lesson is related to other lessons"), the display shows average scores on a 1–4 scale for the teacher, school-wide, and system-wide, alongside the distribution of responses from Strongly Disagree to Strongly Agree.]
Development (PD) research project,
which includes several MET project
partners: Cambridge Education (distributors
of the Tripod survey), Memphis
City Schools and the Charlotte-Mecklenburg
Schools, and researchers
from the University of Virginia, Harvard,
and the University of Chicago. Funding
comes from the Bill & Melinda Gates
Foundation.
The Tripod PD research project combines
the use of student surveys with
a video-based one-on-one coaching
cycle (called MyTeachingPartner)
created at the University of Virginia
by developers of the Classroom
Assessment Scoring System (CLASS),
one of several observation instruments
studied by the MET project. In each cycle,
teachers submit recordings of themselves
engaged in teaching, and then
they work with trained coaches by phone
and e-mail over two to three weeks to
analyze their instruction and create
plans for implementing new practices in
the classroom.
In the PD project, teachers and their
coaches begin the first cycle informed
by the teacher's Tripod survey results,
and coaching protocols focus video
analysis through each cycle on the
components of teaching represented
in Tripod's 7 Cs. So a coach might use
a one-minute excerpt from the video
of the teacher's lesson that illustrates
clarification to organize dialogue around
how the observed practice supports
student understanding. To implement
action plans, teachers will have access
to a video library of clips showing
exemplary practice within each of
Tripod's 7 Cs. Participants in the Tripod
PD coaching process will go through
eight cycles in a school year.
To evaluate the support's effectiveness,
the research project includes a randomized
experiment in which teachers who
receive coaching will be compared to
those who don't in terms of their student
achievement gains and changes in
their Tripod results. Along with one-on-one
coaching, the pilot is testing
the effectiveness of support provided
via an online community that includes
discussion forums and access to articles
related to each C, as well as access to
the MyTeachingPartner video library.
Full implementation of the project is
taking place during the 2012–13 school
year, and results of the research are to
be released in 2014.
Figure 5. Memphis MyTeachingPartner-Tripod PD Coaching Cycle

Before the first cycle, teachers and coaches are provided the teacher's Tripod student survey results.

1. Teacher submits video of self teaching lessons.
2. Coach reviews video and identifies excerpts reflecting Tripod survey constructs.
3. Teacher reviews excerpts and responds to coach's prompts.
4. Teacher and coach discuss responses to prompts and video excerpts.
5. Teacher and coach create action plan for implementing new practices.
Listening and Responding to Teachers in Pittsburgh
To communicate with teachers on elements of the Pittsburgh Public Schools'
new evaluation system (called RISE, for Research-Based Inclusive System of
Evaluation), the district relies heavily on teams of informed teacher leaders at
each school, called RISE leadership teams, who also participate in a district-wide
committee that helps plan Pittsburgh's new measures of teaching.
In school-level conversations with teachers organized through these teams, district
leaders have identified specific concerns and ideas for addressing them. For example,
they learned in these discussions that teachers wanted more guidance on how to
explain to students the purpose of the surveys.
In addition, system leaders have prepared a comprehensive FAQ document that
addresses many of the questions teachers ask. Among the 35 questions now
answered in the document:
What is the Tripod student survey?

What kind of questions are on the survey?

Where can I find more information about Tripod?

How has the Tripod survey been developed?

When will the survey be administered?

How many of my classes will take the survey at each administration point?

Who will lead survey administration at each school?

What are the key dates for survey administration?

How do I know that the results are accurate?

When will I see results?

Will principals see teacher survey results this year? Who else will see the results?

How does the survey fit with our other measures of effective teaching?

Will the survey data be included as part of my summative rating?

Where can teachers and principals access survey training materials?
* Reverse coded item. Agreement represents an unfavorable response.
Appendix: Tripod 7 C Survey Items Included in MET Project Analysis

Secondary Version
CARE

My teacher in this class makes me feel s/he really cares about me.
My teacher seems to know if something is bothering me.
My teacher really tries to understand how students feel about things.

CONTROL

Student behavior in this class is under control.
I hate the way that students behave in this class.*
Student behavior in this class makes the teacher angry.*
Student behavior in this class is a problem.*
My classmates behave the way my teacher wants them to.
Students in this class treat the teacher with respect.
Our class stays busy and doesn't waste time.

CLARIFY

If you don't understand something, my teacher explains it another way.
My teacher knows when the class understands, and when we do not.
When s/he is teaching us, my teacher thinks we understand when we don't.*
My teacher has several good ways to explain each topic that we cover in class.
My teacher explains difficult things clearly.

CHALLENGE

My teacher asks questions to be sure we are following along when s/he is teaching.
My teacher asks students to explain more about the answers they give.
In this class, my teacher accepts nothing less than our full effort.
My teacher doesn't let people give up when the work gets hard.
My teacher wants me to explain my answers: why I think what I think.
In this class, we learn a lot almost every day.
In this class, we learn to correct our mistakes.
Upper Elementary Version
CARE

I like the way my teacher treats me when I need help.
My teacher is nice to me when I ask questions.
My teacher in this class makes me feel that s/he really cares about me.
If I am sad or angry, my teacher helps me feel better.
The teacher in this class encourages me to do my best.
My teacher seems to know if something is bothering me.
My teacher gives us time to explain our ideas.

CONTROL

My classmates behave the way my teacher wants them to.
Our class stays busy and does not waste time.
Students behave so badly in this class that it slows down our learning.*
Everybody knows what they should be doing and learning in this class.

CLARIFY

My teacher explains things in very orderly ways.
In this class, we learn to correct our mistakes.
My teacher explains difficult things clearly.
My teacher has several good ways to explain each topic that we cover in this class.
I understand what I am supposed to be learning in this class.
My teacher knows when the class understands, and when we do not.
This class is neat: everything has a place and things are easy to find.
If you don't understand something, my teacher explains it another way.

CHALLENGE

My teacher pushes us to think hard about things we read.
My teacher pushes everybody to work hard.
In this class we have to think hard about the writing we do.
In this class, my teacher accepts nothing less than our full effort.
Upper Elementary Version (continued)

CAPTIVATE

School work is interesting.
We have interesting homework.
Homework helps me learn.
School work is not very enjoyable.* (Do you agree?)

CONFER

When s/he is teaching us, my teacher asks us whether we understand.
My teacher asks questions to be sure we are following along when s/he is teaching.
My teacher checks to make sure we understand what s/he is teaching us.
My teacher tells us what we are learning and why.
My teacher wants us to share our thoughts.
Students speak up and share their ideas about class work.
My teacher wants me to explain my answers: why I think what I think.

CONSOLIDATE

My teacher takes the time to summarize what we learn each day.
When my teacher marks my work, s/he writes on my papers to help me understand.
Secondary Version (continued)

CAPTIVATE

This class does not keep my attention; I get bored.*
My teacher makes learning enjoyable.
My teacher makes lessons interesting.
I like the way we learn in this class.

CONFER

My teacher wants us to share our thoughts.
Students get to decide how activities are done in this class.
My teacher gives us time to explain our ideas.
Students speak up and share their ideas about class work.
My teacher respects my ideas and suggestions.

CONSOLIDATE

My teacher takes the time to summarize what we learn each day.
My teacher checks to make sure we understand what s/he is teaching us.
We get helpful comments to let us know what we did wrong on assignments.
The comments that I get on my work in this class help me understand how to improve.
© 2012 Bill & Melinda Gates Foundation. All Rights Reserved.
Bill & Melinda Gates Foundation is a registered trademark
in the United States and other countries.
Bill & Melinda Gates Foundation
Guided by the belief that every life has equal
value, the Bill & Melinda Gates Foundation
works to help all people lead healthy,
productive lives. In developing countries, it
focuses on improving people's health and
giving them the chance to lift themselves out
of hunger and extreme poverty. In the United
States, it seeks to ensure that all people,
especially those with the fewest resources,
have access to the opportunities they need to
succeed in school and life. Based in Seattle,
Washington, the foundation is led by CEO Jeff
Raikes and Co-chair William H. Gates Sr.,
under the direction of Bill and Melinda Gates
and Warren Buffett.

For more information on the U.S. Program,
which works primarily to improve high school
and postsecondary education, please visit
www.gatesfoundation.org.