
GOVERNING BOARD AND NAEP RESOURCES

Table of Resources and Links

Attached documents are listed with page numbers. Click on the active links below to access unattached documents.

Page No.

National Assessment Governing Board: Authority and Organization

• NAEP Law
• Board By-laws
• National Assessment Governing Board: Composition and Responsibilities ...................3
• National Assessment Governing Board Nominations .....................................................4
• Board Staff Organization
• NAEP Organizational Model (relationship to other organizations)
• National Assessment Governing Board Current Contracts .............................................5

NAEP Schedule of Assessments

• NAEP Schedule of Assessments (latest version) .............................................................6
• History of Changes to the NAEP Schedule of Assessments ............................................7
• Recent NAEP releases

General Web-based Resources

• Home page of Governing Board web site
• Home page of the Nation’s Report Card web site
• Materials for previous Board meetings

Board Policies for NAEP

• General Policy: Conducting and Reporting NAEP
• Framework Development
• Item Development and Review
• Developing Student Performance Levels for NAEP
• Reporting, Release, and Dissemination of NAEP Results
   o Guidelines for the Initial Release of The Nation's Report Card
   o Resolution on Reporting 12th Grade Academic Preparedness for College
   o Resolution on Reporting on Preparedness of 12th Grade Students
• Collection and Reporting of Background Data by NAEP
• NAEP Testing and Reporting on Students with Disabilities and English Language Learners
• Trial Urban District Assessment: Eligibility Criteria and Selection Procedures
   o List of Eligible TUDA Districts
• Resolution on Linking NAEP and International Assessments


NAEP Assessment Design

• Overview of NAEP Assessment Design ..........................................................................8
• NAEP Alliance Contractors .............................................................................................11

Selected Board-commissioned research reports and papers (from most to least recent)

• Technical Report: NAEP 12th Grade Preparedness Research
• Technical Panel on 12th Grade Preparedness Research – Final Report
• The Future of 12th Grade NAEP: Report of the Ad Hoc Committee on Planning for NAEP 12th Grade Assessments in 2009
• Redesigning the National Assessment of Educational Progress

Previous “Inside NAEP” presentations

• Developing NAEP Frameworks: A Look Inside the Process
• Developing NAEP Test Questions
• Introduction to Validity

Glossary of Acronyms and Other Terms .....................................................................................12


National Assessment Governing Board

Composition
The Board is non-partisan, with 26 members representing gender, geographic, and racial-ethnic diversity. The specific categories of members are specified in the NAEP law:

– Policymakers: governors or former governors (2), state legislators (2), chief state school officers (2), local school district superintendent (1), state (1) and local (1) school board members, nonpublic school administrator or policymaker (1)

– Educators: classroom teachers (3), principals (2), curriculum specialists (2)

– Public: general public representatives (2), parents (2), business representative (1)

– Technical experts: testing and measurement experts (3)

The director of the Institute of Education Sciences serves as an ex-officio 26th member.

Responsibilities
The responsibilities of the Board are mandated by Congress and include:

• Test Development

– Select subject areas to assess

– Develop assessment objectives and test specifications

– Ensure all items are free from bias

– Have final authority on appropriateness of all items

• Technical Methodology

– Develop appropriate student achievement levels

– Design the methodology of the assessment to ensure that assessment items are valid and reliable

• Reporting and Dissemination

– Develop guidelines for reporting and disseminating results

– Plan and execute the initial public release of NAEP reports

– Take appropriate actions needed to improve the form, content, use, and reporting of results

3



National Assessment Governing Board Nominations

Annual Nominations Timeline

Early August – Annual call for Board nominations for terms beginning October 1 of the following year.
Late October – Nominations due.
November to February – Board reviews nominees.
March – Board action on finalists.
Late Spring to late Summer – Secretary reviews finalists.
Early Fall – Secretary announces Board member appointments for terms beginning October 1.
October 1 – Newly appointed Board members begin their terms.

Members and Categories by Term Expiration Date

Terms expiring 2014:
David Driscoll* – General Public (Incl. Parents)
Shannon Garrison – Fourth Grade Teacher
Brent Houston** – Secondary School Principal
Hector Ibarra – Eighth Grade Teacher
Tom Luna** – Chief State School Officer

Terms expiring 2015:
Andres Alonso** – Local School Superintendent
Louis Fabrizio* – Testing & Measurement Expert
Terry Holliday – Chief State School Officer
Dale Nowlin – Twelfth Grade Teacher
Fielding Rolston – State School Board Member
Susan Pimentel* – Curriculum Specialist
Cary Sneider – Curriculum Specialist
(Vacancy) – Business Representative

Terms expiring 2016:
Anitere Flores* – State Legislator (Republican)
Rebecca Gagnon – Local School Board Member
Andrew Ho – Testing & Measurement Expert
Terry Mazany – General Public Representative
Joseph O’Keefe – Non-public School Administrator or Policymaker

Terms expiring 2017:
Lucille Davy – General Public (Incl. Parents)
James Geringer – Governor (Republican)
Doris Hicks* – Elementary School Principal
Tonya Miles* – General Public (Incl. Parents)
Ronnie Musgrove* – Governor (Democrat)
W. James Popham* – Testing & Measurement Expert
Leticia Van de Putte* – State Legislator (Democrat)

* Member currently serving 2nd term; not eligible for reappointment
** Member has taken/will be taking a new position; not eligible for reappointment
Updated 4/22/2014

4


NATIONAL ASSESSMENT GOVERNING BOARD CURRENT CONTRACTS

Columns: Area of Work | Contractor | Staff Member (grouped by Board Committee & Activity)

Committee on Standards, Design and Methodology (COSDAM)
• College Course Content Analysis – 12th grade preparedness research | Educational Policy Improvement Center (EPIC) (Year 2 of 2 Years) | Michelle Blair
• Evaluating Reading and Math Frameworks – 12th grade preparedness research | Human Resources Research Organization (HumRRO) (Year 2 of 2 Years) | Mary Crovo
• Content Alignment Studies: ACT Explore | Procurement in process | Michelle Blair
• Achievement Levels Descriptions | WestEd | Sharyn Rosenberg
• Achievement Levels Setting (TEL) | NCS Pearson Inc. (Year 1 of 15 months) | Sharyn Rosenberg
• Data Sharing Agreements | Via states and NAEP Alliance contractors ETS and Westat | Sharyn Rosenberg

Reporting and Dissemination Committee
• Outreach and Dissemination | Reingold Inc. (Year 2 of 5 Years) | Stephaan Harris
• World Wide Web Services | Quotient (Year 2 of 5 Years) | Stephaan Harris, Mary Crovo
• 12th Grade Preparedness Reports | Widmeyer Communications (Year 4 of 5 Years) | Mary Crovo, Michelle Blair

Executive Committee
• Business Policy Task Force: Business Outreach | Noral Group | TBD
• NAGB/Council of Chief State School Officers: State Outreach | Council of Chief State School Officers | Mary Crovo, Michelle Blair

NAGB Operations (not assigned to a Committee)
• Support services | AFYA | Mary Crovo

Updated July 8, 2014

5


NAEP Schedule of Assessments – Approved August 3, 2013

Year | National | State
2005 | Reading, MATHEMATICS, Science, High School Transcript Study | Reading (4, 8), MATH (4, 8), Science (4, 8)
2006 | U.S. History, Civics, ECONOMICS (12) | (none)
2007 | Reading (4, 8), Mathematics (4, 8), Writing (8, 12) | Reading (4, 8), Math (4, 8), Writing (8)
2008 | Arts (8), Long-term trend | (none)
2009 | READING, Mathematics*, SCIENCE**, High School Transcript Study | READING (4, 8, 12), Math (4, 8, 12), SCIENCE (4, 8)
2010 | U.S. History, Civics, Geography | (none)
2011 | Reading (4, 8), Mathematics (4, 8), Science (8)**, WRITING (8, 12)** | Reading (4, 8), Math (4, 8), Science (8)
2012 | Economics (12), Long-term trend | (none)
2013 | Reading, Mathematics | Reading (4, 8, 12), Math (4, 8, 12)
2014 | U.S. History (8), Civics (8), Geography (8), TECHNOLOGY AND ENGINEERING LITERACY (8)** | (none)
2015 | Reading, Mathematics, Science** | Reading (4, 8), Math (4, 8), Science (4, 8)
2016 | Arts (8) | (none)
2017 | Reading, Mathematics, Writing** | Reading (4, 8, 12), Math (4, 8, 12), Writing (4, 8, 12)
*New framework for grade 12 only. **Assessments involving test administration by computer.
NOTES: (1) Grades tested are 4, 8, and 12 unless otherwise indicated, except that long-term trend assessments sample students at ages 9, 13, and 17 and are conducted in reading and mathematics. (2) Subjects in BOLD ALL CAPS indicate the year in which a new framework is implemented or the assessment year for which the Board will decide whether a new or updated framework is needed. (3) In 2009, 12th grade assessments in reading and mathematics at the state level were conducted as a pilot in 11 volunteering states (AR, CT, FL, IA, ID, IL, MA, NH, NJ, SD, WV). For 2013, 13 states agreed to participate (with MI and TN added). (4) The Governing Board intends to conduct assessments at the 12th grade in World History and Foreign Language during the assessment period 2018-2022.

6


History of Changes to the NAEP Schedule of Assessments

Historical Schedule Changes
The major schedule changes adopted by the Board over the last 10 years are listed below:

1. Added grade 4 and 8 state-level reading and mathematics every two years. (No Child Left Behind; 2002) [Prior to NCLB, state assessments at grades 4 and 8 were given every two years, with reading and writing in one biennium and mathematics and science in the next. Therefore, these subjects and grade 12 in reading and mathematics were tested once every four years.]
2. Added the High School Transcript Study (HSTS) as a regularly scheduled study. (2005)
3. Scheduled U.S. history, civics, and geography on a once-every-four-years cycle. (2005)
4. Added Technology & Engineering Literacy (TEL) to the NAEP subjects assessed. (2005)
5. Added grade 12 state-level reading and mathematics for volunteer states with a periodicity of every four years. (2008)
6. Adjusted the periodicity of science to correspond to the periodicity of TIMSS for the purpose of conducting international benchmarking studies in both mathematics and science. (2010)
7. Scheduled Writing as a technology-based assessment, beginning with national data collections only and delaying fourth grade in order to complete a special study. (2010)

Other schedule changes and program adjustments, due primarily to budget constraints and/or technical challenges, have been considered in the development of the draft schedule:

• Postponing the state-level writing assessment.
• Postponing various other assessments/studies (World History, Foreign Language, HSTS, Long-Term Trend).
• Assessing fewer grade levels in non-required subject areas (U.S. history, civics, and geography; writing; TEL).
• Changing the sample size and reporting depth for states in reading and writing, referred to as focal and non-focal subject reporting.

Guiding Principles for Schedule Changes
Principles that have guided planned updates to the NAEP schedule of assessments include:

1. Follow the guidance in the NAEP Act (303(b)(2)),
2. Continue to cover a broad range of subject areas, and
3. Administer all assessments using technology beginning in 2017.

Guidance for the schedule is found in Sec. 303(b)(2) of the NAEP Act, which addresses the use of random sampling (A), testing in reading and mathematics at grades 4 and 8 once every two years (B), and testing in reading and mathematics at grade 12 at regularly scheduled intervals (at least as often as prior to NCLB) (C).

After this initial guidance, Sec. 303(b)(2)(D) provides for including other subjects in grades 4, 8, and 12 to the extent time and resources allow, calling for assessments “… in regularly scheduled intervals in additional subject matter, including writing, science, history, geography, civics, economics, foreign languages, and arts, and the trend assessment described in subparagraph (F).”

Summary last updated: May 2014

7


Overview of NAEP Assessment Design

The content and format for each NAEP subject-area assessment are determined by a NAEP assessment framework, developed under the Governing Board’s direction. Key features of the structure of NAEP assessments include:

Long Test, Short Student Test Booklet

– Each student gets a small part of the test

– No individual student scores

Common Block Structures Across Subjects

– Items are within blocks; blocks are within booklets
– Example: At grade 4, Reading has 10 blocks and Math has 10 blocks

Test Question Types

– Multiple-choice

– Open-ended

– Computer-based tasks (Writing, Science, TEL)

Contextual Questions

– Student, teacher, administrator questionnaires

Student Booklet Block Design
While some NAEP assessments are conducted on a technology-based platform (TEL, Writing), for paper-based assessments NAEP uses a focused balanced incomplete block (BIB) or partially balanced incomplete block (pBIB) design to assign blocks or groups of cognitive items to student booklets. Because of the BIB and pBIB booklet designs and the way NAEP assigns booklets to students, NAEP can sample enough students to obtain precise results for each test question while generally consuming an average of about an hour and a half of each student's time.

The "focused" aspect of NAEP's booklet design requires that each student answer questions from only one subject area. The "BIB" or "pBIB" design ensures that students receive different interlocking sections of the assessment forms, enabling NAEP to check for any unusual interactions that may occur between different samples of students and different sets of assessment questions.

In a BIB design, the cognitive blocks are balanced; each cognitive block appears an equal number of times in every possible position. Each cognitive block is also paired with every other cognitive block in a test booklet exactly the same number of times. In a pBIB design, cognitive blocks may not appear an equal number of times in each position, or may not be paired with every other cognitive block an equal number of times. NAEP booklet design varies according to subject area (e.g., geography, mathematics, reading, science, U.S. history, writing).
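
To make the pairing property concrete, here is a minimal Python sketch (an illustration only, not NAEP's operational booklet-assembly procedure; the block labels are hypothetical). It builds two-block booklets in which every pair of cognitive blocks appears together exactly once; a full BIB design would also balance how often each block appears in each position.

```python
from itertools import combinations
from collections import Counter

def paired_booklets(blocks):
    """Assign cognitive blocks to two-block booklets so that every pair of
    blocks appears together in exactly one booklet (the pairing property of
    a balanced incomplete block design). Alternating the order of each pair
    only roughly balances block positions; a true BIB balances them exactly."""
    booklets = []
    for i, (a, b) in enumerate(combinations(blocks, 2)):
        booklets.append((a, b) if i % 2 == 0 else (b, a))
    return booklets

blocks = [f"R{i}" for i in range(1, 11)]          # e.g., 10 grade 4 reading blocks
booklets = paired_booklets(blocks)

pair_counts = Counter(frozenset(b) for b in booklets)
print(len(booklets))                               # 45 booklets for 10 blocks
print(all(count == 1 for count in pair_counts.values()))  # True: each pair used once
```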

8


Once the instrument developer has laid out the configuration of all blocks for each booklet in a booklet map with the following column headings:

• Booklet number
• Cognitive block 1
• Cognitive block 2
• Contextual question directions
• General student contextual questions
• Subject-specific contextual questions

the number of rows (booklet numbers) provides the booklet spiral design information needed for the bundling of the student booklets.

Source: http://nces.ed.gov/nationsreportcard/tdw/instruments/cog_blockdesign.aspx
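
As a rough illustration of how such a booklet map might be represented, the sketch below lists a few rows under the column headings named above; the block and questionnaire labels are invented for the example.

```python
# Each row of this hypothetical booklet map mirrors the columns listed above:
# booklet number, cognitive block 1, cognitive block 2, contextual question
# directions, general student contextual questions, subject-specific questions.
booklet_map = [
    (1, "R3", "R7", "CD1", "GEN", "SUBJ-R"),
    (2, "R7", "R1", "CD1", "GEN", "SUBJ-R"),
    (3, "R1", "R5", "CD1", "GEN", "SUBJ-R"),
]

# The number of rows drives the spiral used to bundle student booklets.
spiral_length = len(booklet_map)
print(spiral_length)  # 3 booklets in this toy map
```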

NAEP Assessment Sample Design
Each assessment cycle, a sample of students in designated grades within both public and private schools throughout the United States (and sometimes specified territories and possessions) is selected for assessment. In addition, in state assessment years, of which 2007 is an example, the samples of public schools and their students in each state are large enough to support state-level estimates. In all cases, the selection process utilizes a probability sample design in which every school and student has a chance to be selected, and standard errors can be calculated for the derived estimates.

Public School Selection in State Assessment Years
The selection of a sample of public school students for state assessment involves a complex multistage sampling design with the following stages:

• Select public schools within the designated areas,
• Select students in the relevant grades within the designated schools, and
• Allocate selected students to assessment subjects.

The Common Core of Data (CCD) file, a comprehensive list of operating public schools in each jurisdiction that is compiled each school year by the National Center for Education Statistics (NCES), is used as the sampling frame for the selection of sample schools. The CCD also contains information about grades served, enrollment, and location of each school. In addition to the CCD list, a set of specially sampled jurisdictions is contacted to determine if there are any newly formed public schools that were not included in the lists used as sampling frames. Considerable effort is expended to increase the survey coverage by locating public schools not included in the most recent CCD file.

As part of the selection process, public schools are combined into groups known as strata on the basis of various school characteristics related to achievement. These characteristics include the physical location of the school, extent of minority enrollment, state-based achievement scores, and median income of the area in which the school is located. Stratification of public schools

9


occurs within each state. Grouping schools within strata by such selected characteristics provides a more ordered selection process with improved reliability of the assessment results.

On average, a sample of approximately 100 grade-eligible public schools is selected within each jurisdiction; within each school, about 60 students are selected for assessment. Both of these numbers may vary somewhat, depending on the number and enrollment size of the schools in a jurisdiction, and the scope of the assessment in the particular year. Students are sampled from a roster of individual names, not by whole classrooms. The total number of schools selected is a function of the number of grades to be assessed, the number of subjects to be assessed, and the number of states participating.
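
The sketch below is a simplified, hypothetical illustration of the selection stages just described (stratified selection of schools, then selection of students from each school's roster of individual names). NAEP's operational design also involves probability-proportional-to-size selection, subject allocation, and sampling weights, none of which are shown here.

```python
import random

def two_stage_sample(schools, n_schools=100, students_per_school=60, seed=0):
    """Toy two-stage probability sample: group schools into strata, draw a
    proportional school sample from each stratum, then draw students from
    each selected school's roster."""
    rng = random.Random(seed)

    # Group schools into strata by characteristics related to achievement.
    strata = {}
    for school in schools:
        strata.setdefault(school["stratum"], []).append(school)

    # Allocate the school sample across strata in proportion to stratum size.
    selected_schools = []
    for members in strata.values():
        share = max(1, round(n_schools * len(members) / len(schools)))
        selected_schools.extend(rng.sample(members, min(share, len(members))))

    # Within each selected school, sample students from the roster of names.
    return [
        (school["id"], student)
        for school in selected_schools
        for student in rng.sample(school["roster"],
                                  min(students_per_school, len(school["roster"])))
    ]

# Hypothetical frame: 400 schools in 4 strata, 80 grade-eligible students each.
frame = [{"id": i, "stratum": i % 4, "roster": list(range(80))} for i in range(400)]
print(len(two_stage_sample(frame)))  # roughly 100 schools x 60 students
```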

Private School Selection in State Assessment Years
In years in which state-level samples are drawn for public schools, private schools are classified by type (e.g., Roman Catholic, Lutheran), and are grouped for sampling by geography (Census region), degree of urbanization of location, and minority enrollment. About 700 private schools, on average, are included, with up to 60 students per school selected for assessment. These samples are not large enough to support state-level estimates for private schools. Thus, inferences for private schools are limited to the national level, even in years when public school assessments are state-specific.

A national sample of private schools in all grades is then drawn from a list compiled through the Private School Universe Survey (PSS), which is a mail survey of all U.S. private schools carried out biennially by the U.S. Census Bureau under contract to NCES. The PSS list is updated for new schools only for a sample of Roman Catholic dioceses.

National-Only Assessment Years
In years when the NAEP samples are intended only to provide representation at the national level and not for each individual state, the public and private school selection process is somewhat different. Rather than selecting schools directly from lists of schools, the first stage of sampling involves selecting a sample of some 50 to 100 geographic primary sampling units (PSUs). Each PSU is composed of one or more counties. They vary in size considerably, and generally about 1,000 PSUs are created in total, from which a sample is selected. Within the set of selected PSUs, public and private school samples are selected using similar procedures to those described above for the direct sampling of schools from lists. The samples are clustered geographically, which results in a more efficient data collection process. The selection of PSUs is not necessary when the sample sizes are large in each state, as in state assessment years.

Source: http://nces.ed.gov/nationsreportcard/tdw/sample_design/default.aspx

NAEP Alliance Contractors
NAEP is conducted by the Assessment Division of NCES, which also works with a series of contractors. The following chart presents the structure of the collaboration among these contractors.

10

[Chart: NAEP Alliance Contractors]

To learn more about NAEP contractors in addition to the NAEP Alliance contractors, visit: http://nces.ed.gov/nationsreportcard/contracts/history.aspx

11


Glossary of Acronyms and Other Terms

The following acronyms and terms are commonly used in the work of the National Assessment Governing Board.

AASA American Association of School Administrators

ACT Formerly American College Testing

ADC Assessment Development Committee (Board Committee responsible for test development on all NAEP subjects)

AERA American Educational Research Association

AFT American Federation of Teachers

AIR American Institutes for Research

ALDs Achievement Level Descriptions

ALS Achievement Levels Setting

ARRA American Recovery and Reinvestment Act of 2009

AYP Adequate Yearly Progress (From the No Child Left Behind Act)

BOTA Board on Testing and Assessment, National Academy of Sciences

CCSS Common Core State Standards

CCSSO Council of Chief State School Officers

CGCS Council of the Great City Schools

COSDAM Committee on Standards, Design and Methodology (Board committee responsible for technical issues)

CRESST Center for Research on Evaluation, Standards, and Student Testing (Research Center at UCLA)

DAC Design and Analysis Committee (Advisory panel to ETS on technical issues in NAEP operations)

ECS Education Commission of the States (First NAEP contractor and organization supporting state policy leaders)

EIMAC Education Information Management Advisory Consortium (Advisory committee to CCSSO, mostly state testing directors)

ELs or ELLs English Learners or English Language Learners (Pronounced "Ls"; formerly called Limited English Proficient or LEP)

ELPA English Language Proficiency Assessment (Also ELPA21)

EPIC Education Policy Improvement Center

ESEA Elementary and Secondary Education Act

ETS Educational Testing Service

FAR Federal Acquisition Regulations

GAO Government Accountability Office

GPO Government Printing Office

GSA General Services Administration

HSTS High School Transcript Study (A special NAEP data collection)

IEP Individualized Education Plan (A required document under the Individuals with Disabilities Education Act, which specifies learning objectives for an individual student identified as having a disability)

IES Institute of Education Sciences (The Department of Education office in which NCES is located. The Director of IES is an ex-officio member of the Governing Board.)

12


IRA International Reading Association

IRT Item Response Theory (A theory for design, analysis, and scoring of tests)

KaSA Knowledge and Skills Appropriate (A series of NAEP research studies to improve measurement precision)

KSA Knowledge, Skill, and/or Ability (A statement describing a subset of academic content)

LEP Limited English Proficient (Term formerly used for an English Language Learner)

LTT Long Term Trend Assessment (Series of NAEP tests that began in the early 1970s)

MST Multi-stage Testing (A testing format where subsets of test items are presented to students based on item difficulty and student performance)

NAE National Academy of Education

NAEP National Assessment of Educational Progress (Pronounced "nape")

NAESP National Association of Elementary School Principals

NAGB National Assessment Governing Board (Pronounced "nag bee")

NAS National Academy of Sciences

NASBE National Association of State Boards of Education

NASSP National Association of Secondary School Principals

The Nation’s Report Card Alternate reference for NAEP assessments

NCES National Center for Education Statistics (Project office for NAEP in the U.S. Department of Education and IES)

NCLB No Child Left Behind Act of 2001

NCME National Council on Measurement in Education

NCTE National Council of Teachers of English

NCTM National Council of Teachers of Mathematics

NEA National Education Association

NEA National Endowment for the Arts

NEH National Endowment for the Humanities

NGSS Next Generation Science Standards

NRC National Research Council

NSBA National School Boards Association

NSLP National School Lunch Program

NVS NAEP Validity Studies Panel

OGC Office of the General Counsel (in the U.S. Department of Education)

OMB Office of Management and Budget

PARCC Partnership for Assessment of Readiness for College and Careers

PIRLS Progress in International Reading Literacy Study

PISA Program for International Student Assessment

POC Principal Operating Components (Divisions of the U.S. Department of Education)

PTA Parent Teacher Association

13


R&D Reporting and Dissemination Committee (Board Committee responsible for NAEP reporting issues)

RFP Request for Proposals

RP Response probability (probability of correct response on a test question)

RTT Race to the Top (also referred to as RTTT)

SBAC SMARTER Balanced Assessment Consortium

SD Students with Disabilities

SES Socio-economic Status

TBA Technology-based Assessment

TEL Technology and Engineering Literacy (A content area assessed by NAEP)

The Department United States Department of Education

The Secretary Secretary of Education (Honorable Arne Duncan during the Obama administration)

TIMSS Trends in International Mathematics and Science Study

TUDA Trial Urban District Assessment (NAEP component that measures students in large urban districts)

14


Welcome
Margaret Honey, President & CEO, New York Hall of Science

Claire Shulman, Queens Borough President, 1986 – 2002

Seth Dubin, Trustee & President Emeritus, New York Hall of Science

Rick Bonney, Director of Program Development and Evaluation, Cornell Laboratory of Ornithology

Andy Fraknoi, Chair, Astronomy Department, Foothill College

Sheila Grinell, Consultant to science centers

Ira Flatow, Host, Science Friday

Mary Crovo, Deputy Executive Director, National Assessment Governing Board

Preeti Gupta, Director of Youth Learning and Research, American Museum of Natural History

Dennis Schatz, National Science Foundation & Pacific Science Center

Ron Ottinger, Noyce Foundation

Mickey Friedman


Tribute to Alan J. Friedman

Mary Crovo, Deputy Executive Director National Assessment Governing Board

June 14, 2014

I am honored to speak today on behalf of Alan Friedman's colleagues and friends on the

National Assessment Governing Board and on behalf of the Board staff. We have some

individuals here today representing the Board and its staff. Alan was a member of the Governing

Board from October 2006 until September 2013. He made immense contributions to our work.

I am Mary Crovo, the Board's Deputy Executive Director. I worked closely with Alan as staff to

the Assessment Development Committee, which he chaired.

As others have noted already, Alan Friedman was a brilliant and charming man. He was

also perceptive and persistent and a man of ideas. His insight and creativity helped point our

Board in new directions. His tact and flair and ability to explain helped lead us there with a

minimum of fuss and discord.

The Board on which Alan served has 26 members, appointed by the U.S. Secretary of

Education. It sets policy for the National Assessment of Educational Progress, which has the

acronym NAEP [N – A – E – P]. The program also is called the Nation's Report Card, and

assesses representative samples of students in elementary, middle, and high school grades.

NAEP reports its results for the nation, the states, and 21 large urban districts, including, of

course, the largest—New York City. By law, the NAEP program cannot report results for

individual students, teachers, or schools.

NAEP covers a broad range of subjects, and Alan added to that in an important way by

spearheading development of a new Technology and Engineering Literacy Assessment or TEL.

The assessment was given for the first time this year—to more than 20,000 8th

grade students

across the nation. In effect, Alan became the chief spokesman for TEL, explaining the general

concepts of technology and engineering to Board members and the public. He even made a

video about it, which is posted on YouTube.

NAEP also collects a considerable amount of background or contextual information on

students, teachers, and schools. Alan added to that too by pushing us to gather more data on out-

of-school learning experiences. And very unusually for a Board member, he co-authored a

report just last year, based on NAEP findings, on context and instruction in science education.

That report is on the Board’s website, and it sings with Alan's distinctive voice in its prose.

The Governing Board is bipartisan and independent. Its membership was established by

Congress as a sort of a Noah's Ark of the interests involved in education. Included are state and

local officials, classroom teachers, test and measurement experts, representatives of business and

nonpublic schools, and four "representatives of the general public," the category in which Alan

served.


2

In his years on the Board’s Assessment Development Committee Alan reviewed several

thousand test questions for NAEP. These included questions on science, of course, where he was

a bona fide expert, but also in reading, mathematics, U.S. history, writing, and other subjects.

Alan went over those very carefully too and made perceptive comments. This may sound like a

terribly tedious task, this review of test questions. But not so! Alan’s favorite NAEP questions

were the dynamic, computer-based tasks that measure student achievement in science,

technology, and engineering. After a six-hour closed door review session, Alan would close the

committee meeting with our traditional chime, but not before apologizing for making so many

comments and edits. But then Alan would say, “these were terrific!—I learned so much about

these engaging topics.”

He also represented the Board at several press release events for NAEP reports. At one

release, for a report called “Science in Action,” Alan brought a yellow remote-controlled model

helicopter, and flew it around the room to illustrate a point about science and technology. A few

months later, by request, he flew the helicopter again at a Board meeting—to underscore his

point.

Even after his term ended last September, Alan remained involved in the Governing

Board's work. He served as a facilitator at our parent leader summit in January. He chaired the

planning committee for our 25th anniversary symposium at the end of February, and spoke at a

lively panel session on innovation in NAEP, which he had done so much to foster.

Back in the spring of 2008, the Governing Board held one of its quarterly meetings in

New York City. Alan invited the Board members to this wonderful Hall of Science for a

memorable tour and lunch. Our Board members were amazed by all that Alan and the staff had

accomplished since the museum reopened in 1986. The Board’s favorite part of the tour was

crawling through the tunnel to enter the inflatable planetarium, to gaze at the simulated night sky

views.

At our Board meeting last month, after Alan's sudden passing, our Chairman David

Driscoll reminisced about the Hall of Science visit and the model helicopter and also about

Alan's work on TEL. But he said Alan Friedman's contribution had been broader than that.

Almost every time the Board debated something, the Chairman said, and here I quote, “Alan

would wait, and somehow, near the end of the discussion, he would weigh in, and always

brought things to a logical conclusion."

The memorial resolution that our Board passed notes Alan's leadership roles and

substantive accomplishments. And it continues: "His rigorous intelligence, engaging wit, and

enormous passion for nurturing others earned him the respect, admiration and affection of his

fellow Board members...."

Alan Friedman was "esteemed as a trusted colleague," the resolution says, "inspiring as a

creative teacher, revered as a knowledgeable, supportive mentor, and valued as a responsive

friend.... He touched the hearts and minds of children and adults...in encouraging ways that made

them strive to be better."


3

In his many years on the Board, members and staff not only came to know Alan well, but

also his wife Mickey, who accompanied him to many of our meetings. Mickey enjoyed taking in

a new museum exhibit or revisiting a favorite venue. We express our sorrow and share our

sympathy with her. Alan Friedman was a wonderful man. We are grateful that he touched our

lives. He will be missed.

Thank you very much.

[Written by Larry Feinberg and Mary Crovo]


National Assessment Governing Board

Approved Unanimously May 17, 2014

Resolution in Memory of Alan J. Friedman

Whereas, Alan J. Friedman served as a member of the National Assessment Governing Board

from October 1, 2006 through September 30, 2013; and

Whereas, during his service on the Governing Board, Alan J. Friedman carried out numerous

leadership assignments with wisdom, skill, and tact, as a member of the Reporting and

Dissemination Committee, the Nominations Committee, and the Executive Committee; as an

officer, serving first as Vice Chair and then rising to Chair of the Assessment Development

Committee; and as Chair of the Board’s 25th Anniversary Planning Committee; and

Whereas, he has left a legacy of substantive, lasting accomplishments through his work on the

Governing Board, including the 2014 Technology and Engineering Literacy Assessment, the

2011 computer-based Writing Assessment, the 2009 Science Assessment, the 2013 General

Policy for the Conduct and Reporting of the National Assessment of Educational Progress

(NAEP), and as a workshop facilitator at the Governing Board’s 2014 National Parent Education

Summit, among many others; and

Whereas, the example of his rigorous intelligence, engaging wit, and enormous passion for

nurturing others earned him the respect, admiration, and affection of his fellow Board members;

and

Whereas, Alan J. Friedman was esteemed as a trusted colleague, inspiring as a creative teacher,

revered as a knowledgeable, supportive mentor, and valued as a responsive friend; and

Whereas, he was a brilliant, charming storyteller, using analogy, illustration, and props—most

famously a model flying helicopter—to convey complex ideas in simple terms and advance the

positions he championed with persuasive precision; and

Whereas, Alan J. Friedman touched the hearts and minds of the children and adults he

encountered in encouraging ways that made them strive to be better;

Therefore, be it resolved that the National Assessment Governing Board express its grateful

recognition of the important contributions to NAEP and our nation’s children made by Alan J.

Friedman, and that the Board convey to his family the deep sorrow and sincere sympathy felt

upon his untimely death; and

Be it further resolved that a copy of this resolution be entered permanently into the minutes of

the National Assessment Governing Board.


See the following information posted at http://nysci.org/friedmancenter/

THE ALAN J. FRIEDMAN CENTER FOR THE DEVELOPMENT OF YOUNG SCIENTISTS

There was no one else like Alan Friedman. As a scientist, educator, museum leader, mentor and friend, he was an inspiration to so many people in so many ways. To honor his memory, the Noyce Foundation is making a $500,000 grant to the New York Hall of Science to establish the Alan J. Friedman Center for the Development of Young Scientists. And in response to the many people who have expressed a wish to make a contribution in Alan’s memory, the Noyce Foundation is also offering a matching challenge of up to $250,000 to further advance the Center.

At the core of Alan’s vision for the New York Hall of Science was the commitment to provide the opportunity for high school and college students to develop their interests in science by sharing the experience of discovery with others. For nearly 30 years, the brilliance of that vision has been proven through the many programs Alan created and inspired, most notably the Science Career Ladder.

The Alan J. Friedman Center for the Development of Young Scientists will encompass both the Science Career Ladder program and the recently launched Science Career Ladder Institute. The Institute provides Explainers with intensive support through career preparation, pre-professional mentorship and apprenticeships, field trips, networking opportunities and research-methods training. In the future, additional programs will be created to cultivate the interests and careers of young scientists in ways we can only imagine.

Knowing that Alan meant so much to all of us, we take inspiration from knowing that the Center will enable his vision to stay with us long into the future.

For more information, email to [email protected]. To donate, please fill in the form below and select “I wish to make a gift to the Alan J. Friedman Center for the Development of Young Scientists.”


National Assessment Governing Board Assessment Development Committee

July 31 – August 1, 2014

AGENDA

Thursday, July 31, 2014

8:30 am – 4:00 pm

Closed Session ACTION: Science Items and Interactive Computer Tasks (ICTs)

Kathleen Scalise, ETS

Secure material sent under separate cover

Friday, August 1, 2014

9:45 – 9:50 am

Welcome, Introductions, and Agenda Overview

Shannon Garrison, Chair

9:50 – 10:10 am

Discussion on NAEP Writing Assessment

Committee Discussion

Attachment A

10:10 – 11:00 am

Update on the 2014 NAEP Technology and Engineering Literacy (TEL) Assessment

William Ward, NCES Jonas Bertling, ETS

Attachment B

11:00 – 11:50 am

Transitioning to NAEP Technology-Based Assessments in Reading and Mathematics

Eunice Greer, NCES

Attachment C

11:50 – 12:15 pm

NAEP and Next Generation Science Standards: A Comparison Study

Teresa Neidorf, AIR

Attachment D

12:15 – 12:30 pm

NAEP Item Review Schedule

Mary Crovo, Governing Board Staff


Attachment A

Is The Nation’s Report Card “College and Career Ready”?

After nearly a decade of research, the National Assessment Governing Board (NAGB) released in May the first outcomes of its efforts to use the results of the 2013 12th grade National Assessment of Educational Progress (NAEP) to report on the academic preparedness of U.S. 12th graders for college. It found that only 38% of 12th graders meet its preparedness benchmark in reading, and 39% meet its preparedness benchmark in math. NAGB’s effort to track college readiness in the United States is uniquely important, as it has the only assessment program that reports on the academic performance of a representative national sample of high school students.

That said, the group that issues the Nation’s Report Card deserves a grade of “Incomplete” for its work. Reading and math are obviously necessary indicators of academic preparation for college and careers after high school, but higher education and employers say they are not enough. When it comes to the ability to complete college-level work (and to being career ready), writing skills are essential. Yet, despite the fact that NAGB also administers a 12th grade writing test, it inexplicably chose not to include writing as an indicator of readiness.

If NAEP wants to remain the “gold standard” for assessment, NAGB must remedy this situation quickly. Postsecondary institutions and systems throughout the nation assess writing in order to determine whether students have the academic skills to succeed in first-year courses. According to ACT, approximately one third of ACT test takers do not meet its readiness standard for English Composition. Recent data from Florida indicates that 32% of first-year students are placed into developmental writing courses. Using preparedness indicators that do not include writing will not only provide incomplete information to the public but will send the wrong signal about the importance of writing for high school graduates. And states that assess writing need an independent external benchmark they can rely on, which NAEP has always provided with its reading and math assessments.

Unfortunately, the current NAEP 12th grade writing assessment, starting with the Writing Framework that guides the development of test items, will need substantial revisions to be a valid indicator of academic preparedness. One of the most important advances made through the development of the Common Core State Standards (CCSS) English Language Arts/Literacy standards is the understanding that preparation for both postsecondary education and careers requires the ability to read texts of appropriate complexity and mobilize evidence from the text to make a clear and logical written argument. Achieve’s earlier research with states on college and career readiness for the American Diploma Project provides a strong foundation for expecting high school

2


students to be able to write coherent arguments supported by evidence from credible sources. The CCSS are quite explicit on this issue, building the idea of "writing to sources" into the grade-by-grade progression of the writing standards. Focus groups of postsecondary faculty conducted by the PARCC assessment consortium powerfully underscored the importance of these skills.

While NAGB does not need to align its assessments and their frameworks to the CCSS, it does need to pay careful attention to the evidence upon which they rest.

A review of the 2011 NAEP Framework and sample items makes clear that the assessment does not address the ability of students to draw on evidence to make persuasive arguments. In fact, the released 2011 12th grade items do not come close to assessing writing to sources.

One item asks students to write an essay describing how they use technology. It includes a prompt that presents survey data on how students use computers, but it doesn't require use of or reference to the data in order to respond to the prompt.

12th Grade NAEP Writing Prompt Write an essay for a college admissions committee about one kind of information or communications technology you use. Describe what it is and explain why the technology is important to you. Develop your essay with details so the admissions committee can understand the value of this technology. You may use information from the presentation in your essay.

NAGB’s web site shows several sample responses, including one that was rated Effective (the highest score), one rated Competent, and one rated Adequate. None of those highly ranked essays made any use of the survey data presented in the question. Those data were window dressing. In short, this item does not require students to read anything (except the question), nor to make an argument based on the evidence provided.

Another item asks 12th grade students to write a persuasive letter to the local council on whether or not to build a discount store in the area. It, too, is of limited value for assessing

12th Grade NAEP Writing Item The following article recently appeared in your local newspaper. Write a letter to the local council members arguing for or against the building of Big Discount in your area. Support your argument and defend it against the arguments the opposing side might make.

3


preparedness in writing. It asks students to read a contrived "newspaper article" regarding plans to build a store in the community. First, the text is considerably less complex than what 12th graders should be able to handle and even less complex than what would be found in many newspaper stories.

And while students are expected to marshal evidence to support their positions, the sample responses include assertions about evidence and facts, but with no sources cited, and no useful evidence provided in the article students were asked to read. Students could simply make up evidence for their response. That’s not the type of preparation for college we should encourage.

If NAGB wants to make a significant contribution to the national conversation about college readiness, it will have to quickly step up its game. Both multi-state assessment consortia, PARCC and SBAC, have developed assessments that incorporate “writing to sources” into their high school assessment programs, and many states will begin to administer them next year.

PARCC 11th Grade Sample Writing Task Today you will read a biography of Abigail Adams, and then you will read two examples of correspondence between Abigail and her husband, John Adams, who served as President of the United States from 1797 to 1801. As you read these texts, you will gather information and answer questions that will help you understand John and Abigail Adams’s relationship and opinions. When you are finished reading, you will write an analytical essay. Question: Both John and Abigail Adams believed strongly in freedom and independence. However, their letters suggest that each of them understood these terms differently based on their experiences. Write an essay that explains their contrasting views on the concepts of freedom and independence. In your essay, make a claim about the idea of freedom and independence and how John and Abigail Adams add to that understanding and/or how each illustrates a misunderstanding of freedom and independence. Support your response with textual evidence and inferences drawn from all three sources.

Sending the right signal to the public and to state policymakers about the importance of assessing writing for college readiness is particularly important now as some states are contemplating buying off-the-shelf tests or creating their own.

In addition, if NAGB is serious about having a complete indicator of college readiness, it should revise the schedule for administering the 12th grade writing assessment. The last 12th grade writing assessment was given in 2011, and it is not scheduled to be administered again until 2017. Every six years simply isn’t enough.

4


In the decade it took NAGB to conduct its academic preparedness research, states moved rapidly to make college and career readiness the mission of their K-12 systems, and a national priority. Today, every state has adopted college- and career-ready standards in literacy and mathematics, either the CCSS or their own state standards. And states are working to develop and administer tests that measure college-ready skills – and are honored by postsecondary institutions – to high school students statewide. Twenty states have raised course-taking requirements for high school graduation, and many are working to incorporate indicators of college-readiness into their accountability and reporting systems.

In short, the states are way out in front on promoting and assessing college readiness. NAGB doesn’t have a moment to waste.

5


Attachment B

Technology and Engineering Literacy (TEL) Assessment Update

In 2014, the first-ever national NAEP Technology and Engineering Literacy (TEL) assessment

was conducted at grade 8 and administered on computers. The 2014 NAEP TEL Framework

broadly defines technology and engineering literacy as the capacity to use, understand, and

evaluate technology as well as to understand technological principles and strategies needed to

develop solutions and achieve goals. This Framework served as the guide for the development of

the TEL assessment and defines what students should know and be able to do with technology.

As with other NAEP subject-area assessments, TEL also includes a survey questionnaire

component for students and school administrators. The TEL survey questionnaires capture

important contextual factors that relate to achievement and help better understand and interpret

the achievement results.

Historically, NAEP has designed its contextual questionnaires around single questions, and

questionnaire results have been reported as single questions as well. NCES has suggested

changing the approach from questionnaires with only a single question indicator per construct to

a balanced approach where indices based on aggregation of several questions are also developed

to add more robust policy-relevant reporting elements to the NAEP survey questionnaires.
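
As a rough illustration of what aggregating several questions into an index can mean, the sketch below simply averages one student's responses to a few related survey items; the item names and the 1-to-4 scale are invented for the example, and operational NAEP indices would be constructed with formal scaling methods.

```python
def questionnaire_index(responses, items):
    """Toy index: average a respondent's answers to several related survey
    questions instead of reporting each question on its own."""
    values = [responses[item] for item in items if item in responses]
    return sum(values) / len(values) if values else None

# Hypothetical out-of-school-learning items on a 1-to-4 agreement scale.
student_responses = {"builds_models": 3, "visits_museums": 4, "fixes_devices": 2}
index_items = ["builds_models", "visits_museums", "fixes_devices"]
print(questionnaire_index(student_responses, index_items))  # 3.0
```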

NAEP survey questionnaire indices will allow for the creation of a more robust

database with important contextual variables. These indices will also add value to the Nation’s

Report Card with potential new reporting elements on additional outcome variables that could

serve as a basis for sub-group comparisons, trend analyses, and extended reporting. This

approach aligns with the Policy Statement on NAEP Background Questions and the Use of

Contextual Data in NAEP Reporting, which was unanimously adopted by the National

Assessment Governing Board in August 2012. Further, the approach is similar to the practice

applied in international large-scale assessments (e.g., Programme for International Student

Assessment [PISA], Trends in International Mathematics and Science Study [TIMSS]) or student

and teacher surveys (e.g., Gallup Student Poll, Teaching and Learning International Survey

[TALIS]).

In this session, since TEL is the first assessment where the indices approach is being

implemented, findings for six potential TEL questionnaire indices (including out-of-school

learning and self-efficacy) will be presented, supported by data from the 2014 TEL assessment.

This session will serve as a basis for the development and discussion of an approach for

questionnaire indices not only for TEL but also for future NAEP assessments. Challenges,

constraints, and decision points in the process will be outlined. Additionally, NCES will provide

an updated timeline for post-2014 TEL administration activities.


Attachment C

Update on Technology-Based Assessments (TBA) White Paper

To help plan NAEP’s transition from its current paper-based assessments to technology-based assessments, a “White Paper” is being written that will describe the overall approach being taken to accomplish this transition and its rationale. There are many reasons why this transition must begin now for NAEP’s core subject areas: mathematics, reading, and science (the writing assessment is already technology based). Perhaps the most important reason, however, is that assessment and learning in schools across the country have already started this transition. For NAEP to remain relevant and meaningful in the broader educational landscape, the program must begin now to convert to technology-based assessments that reflect how students are being prepared for postsecondary work and academic experiences.

Of particular concern to the “Nation’s Report Card,” with its decades of valuable performance trends, is the ability to maintain trend lines well into the future. As such, the program is planning a multistep process that will carefully and thoughtfully implement this important transition in a manner that is most likely to protect this valuable aspect. Whether trends can be maintained across paper-based and technology-based modes of administration is clearly an empirical question. All due care is being taken, however, to increase the likelihood that this important objective is achieved, and that NAEP will maintain its reputation as the gold standard of educational assessments.

In addition to the careful attention being paid to maintaining performance trend lines across paper-based and technology-based administration modes, the transition to TBA is being informed by the expert guidance of subject-area, cognitive-science, and measurement experts. This transition presents numerous opportunities to enhance the measurement of framework objectives and possibly increase the program’s relevance as a measure of preparedness for postsecondary pursuits. In addition, TBA presents numerous possibilities to extend and enhance NAEP’s reporting capabilities and opportunities. To these ends, the White Paper will focus on subject-specific issues and opportunities for leveraging technology delivery to enhance NAEP’s measurement and reporting goals. The paper is expected to be completed toward the end of summer 2014.


Attachment D

Update on the Comparison Study of NAEP and the Next Generation Science Standards

The release of the Next Generation Science Standards (NGSS) in 2013 and the National Research Council (NRC) report on Developing Assessments for the Next Generation Science Standards in 2014 are leading to major changes in state curricula and assessments, in response to the NGSS emphasis on the integration of scientific and engineering practices with disciplinary core ideas in science. To inform ongoing discussions of NAEP’s role in emerging national systems of large-scale assessments in science, technology, engineering, and mathematics (STEM), NCES is conducting a comparison study of the NGSS with the NAEP frameworks in technology and engineering literacy (TEL) and science, as well as relevant aspects of the mathematics framework. The goal of the study is to provide evidence of the extent to which the STEM frameworks in NAEP are aligned with the content and the scientific and engineering practices in the NGSS.

At the last Governing Board meeting in May 2014, AIR provided a presentation to the Governing Board’s Assessment Development Committee on plans for the NGSS/NAEP framework comparison study. Since then, an expert panel with experience in NGSS and NAEP was convened and a meeting was conducted in July to provide comparison data on the similarity of content and the alignment of the scientific and engineering practices in the NGSS and NAEP frameworks. At the August ADC meeting, AIR will share some initial outcomes and feedback from the expert panel meeting and provide an update on the status of analysis and reporting plans.


NGSS/NAEP Expert Panel

Name – Affiliation
• Alicia Alonzo – Michigan State University, College of Education, Department of Teacher Education
• Rodger Bybee – Director Emeritus, Biological Sciences Curriculum Study
• George DeBoer – AAAS, Director, Project 2061, Washington, DC
• Jacob Foster – MA Department of Elementary and Secondary Education
• Brett Moulding – Director, Building Capacity for State Science Education (CCSSO)
• Kathleen Scalise – University of Oregon, College of Education, Department of Educational Methodology, Policy & Leadership
• Jacqueline Smalls – Education Professional Development Consultant; formerly Langley STEM Education Campus, DC Public Schools


National Assessment Governing Board Executive Committee

July 31, 2014 4:30-5:30 pm

AGENDA

4:30 – 4:35 pm
Welcome, Introductions, Comments, and Agenda Overview
David Driscoll, Chair
Full Board Agenda and Action Items
Cornelia Orr, Executive Director

4:35 – 4:37 pm
Nomination of Vice Chair
Lou Fabrizio

4:37 – 4:40 pm
Updates: NAGB Staffing, NAEP Budget, NAEP Reauthorization
Cornelia Orr

4:40 – 5:00 pm
Committee Topics: Issues and Challenges
Assessment Development – Shannon Garrison
COSDAM – Lou Fabrizio
Reporting and Dissemination – Andrés Alonso
Nominations – Tonya Miles

Closed Session

5:00 – 5:30 pm
NAEP Schedule of Assessments & Budget – Closed Session Discussion for Board Action
Cornelia Orr, NAGB; Peggy Carr, NCES
Attachment A

5:30 pm Adjourn


Attachment A

NAEP Schedule of Assessments

At the July 31-August 2, 2014 meeting of the National Assessment Governing Board, action will be taken on the NAEP Schedule of Assessments through 2024. In addition to the discussions held at the May 15-17, 2014 meeting, the Governing Board will meet in closed session to review the proposed schedule in full consideration of current budget constraints projected for 2015, 2016, and 2017. Prior to taking action on the staff-proposed schedule, the Board will have two opportunities to review and discuss the proposal, which will be sent under separate cover.

Closed Session Discussions
1. Executive Committee meeting – Thursday, July 31, 2014, 4:30-5:30 p.m.
2. Board meeting – Friday, August 1, 2014, 3:30-5:00 p.m.

Board Action – Saturday, August 2, 2014

Three background resources are provided herein.

Attachment A1 – The existing NAEP Schedule, approved August 2013
This attachment is the currently adopted schedule of the Board, which includes assessments scheduled through 2017.

Attachment A2 – The NAEP/NAGB Budget 101 Webinar from July 2013
This attachment contains the PowerPoint slides used in the webinar given by Peggy Carr last year. The purpose of the webinar was to describe the complexities of planning and implementing the NAEP program budget, especially as tasks cross fiscal and calendar years. Slides 7-10 provide an apt illustration of why advance program and budget planning are critical to the success of NAEP.

Attachment A3 – The NAEP Schedule of Assessments History
Attachment A3 includes the Introduction & Background, Discussion, and Historical Schedule Changes sections, which are repeated verbatim from the May 2014 Governing Board meeting materials.


Attachment A1

NAEP Schedule of Assessments – Approved August 3, 2013

Year | National | State
2005 | Reading, MATHEMATICS, Science, High School Transcript Study | Reading (4, 8), MATH (4, 8), Science (4, 8)
2006 | U.S. History, Civics, ECONOMICS (12) |
2007 | Reading (4, 8), Mathematics (4, 8), Writing (8, 12) | Reading (4, 8), Math (4, 8), Writing (8)
2008 | ARTS (8), Long-term trend |
2009 | READING, Mathematics*, SCIENCE**, High School Transcript Study | READING (4, 8, 12), Math (4, 8, 12), SCIENCE (4, 8)
2010 | U.S. History, Civics, Geography |
2011 | Reading (4, 8), Mathematics (4, 8), Science (8)**, WRITING (8, 12)** | Reading (4, 8), Math (4, 8), Science (8)
2012 | Economics (12), Long-term trend |
2013 | Reading, Mathematics | Reading (4, 8, 12), Math (4, 8, 12)
2014 | U.S. History (8), Civics (8), Geography (8), TECHNOLOGY AND ENGINEERING LITERACY (8)** |
2015 | Reading, Mathematics, Science** | Reading (4, 8), Math (4, 8), Science (4, 8)
2016 | Arts (8) |
2017 | Reading, Mathematics, Writing** | Reading (4, 8, 12), Math (4, 8, 12), Writing (4, 8, 12)

*New framework for grade 12 only. **Assessments involving test administration by computer.
NOTES: (1) Grades tested are 4, 8, and 12 unless otherwise indicated, except that long-term trend assessments sample students at ages 9, 13, and 17 and are conducted in reading and mathematics. (2) Subjects in BOLD ALL CAPS indicate the year in which a new framework is implemented or an assessment year for which the Board will decide whether a new or updated framework is needed. (3) In 2009, 12th grade assessments in reading and mathematics at the state level were conducted as a pilot in 11 volunteering states (AR, CT, FL, IA, ID, IL, MA, NH, NJ, SD, WV). For 2013, 13 states agreed to participate (with MI and TN added). (4) The Governing Board intends to conduct assessments at the 12th grade in World History and Foreign Language during the assessment period 2018-2022.


Attachment A2

NAEP Budget 101
National Assessment Governing Board Webinar

Peggy G. Carr, Associate Commissioner
July 30, 2013

Purpose
• To facilitate understanding of factors affecting the NAEP budget

Topics
• NAGB and NCES budget responsibilities
• NAEP budget history
• Budgeting principles
• Cost drivers


NAEP Budget – Single Budget Line / Split Funding
• Total (FY14): $132.3M
• NCES Commissioner, Administration: $124.6M
• National Assessment Governing Board: $7.7M

Governing Board and NCES
Collectively, the Governing Board and NCES are responsible for:
• National and state assessments in reading and mathematics in 4th and 8th grades every 2 years
• National assessments in reading and mathematics at 12th grade at least every 4 years
• Assessments in other subjects (to the extent that time and resources allow)


NAEP Assessment Cycle
Framework Development → Item Development → Pilot Testing → Sampling & Data Collection → Scoring, Analysis, & Reporting

NAEP Assessment Timeline

Year 2013
• Item Development: 2015 Science; 2017 Mathematics; 2017 Reading
• Pilot Testing: 2014 TEL (8); 2015 Mathematics (4, 8); 2015 Reading (4, 8)
• Sampling & Data Collection: 2013 Mathematics; 2013 Reading
• Scoring, Analysis, & Reporting: 2012 LTT Mathematics; 2012 LTT Reading; 2012 Economics (12); 2013 Mathematics (4, 8); 2013 Reading (4, 8)

Year 2014
• Item Development: 2017 Mathematics; 2017 Reading; 2019 Science; 2018 U.S. History; 2018 Civics; 2018 Geography
• Pilot Testing: 2015 Science
• Sampling & Data Collection: 2014 U.S. History (8); 2014 Civics (8); 2014 Geography (8); 2014 TEL (8)
• Scoring, Analysis, & Reporting: 2013 Reading (12); 2013 Mathematics (12)

Year 2015
• Item Development: 2018 Economics (12); 2019 Reading; 2019 Mathematics; 2020 LTT Mathematics; 2020 LTT Reading; 2021 Writing
• Pilot Testing: 2017 Mathematics; 2017 Reading; 2017 Writing
• Sampling & Data Collection: 2015 Reading (4, 8); 2015 Mathematics (4, 8); 2015 Science; 2015 HS Transcript Study
• Scoring, Analysis, & Reporting: 2014 U.S. History (8); 2014 Civics (8); 2014 Geography (8); 2014 TEL (8); 2015 Reading (4, 8); 2015 Mathematics (4, 8)


Decision Milestones
(2015 row of the NAEP Assessment Timeline above, shown as context for the 2015 Science milestones that follow)

2015 Science Milestones
• Dec 2013 – Design Summit
• May 2014 – Notify Schools
• Jun 2014 – Print Assessment Materials
• Aug 2014 – Complete Training Materials
• Nov 2014 – Train Field Staff
• Dec 2014 – Ship Assessment Materials


NCES NAEP Contracts
• NAEP Alliance (8) – Critical Path
• Auxiliary Contracts (11+49) – Critical Path Support


NAEP Budget
[Chart: NAEP budget in $ millions by fiscal year, through the 2013 sequester.]

History of the NAEP Budget
[Chart: NAEP budget history annotated with the 1990 trial state assessment, 2002 NCLB, the 2009 full schedule, and the 2013 sequester.]


Budgeting Principles
• Maintain the Gold Standard
• Implement Governing Board policies
• Maintain efficient design and methodology
• Invest for the future
• Ensure start-up funds for next cycle
• Maintain a positive cash flow


Cost Drivers
• Type of assessment – National, State, TUDA
• Subject(s) to be assessed
• Grade(s) to be assessed
• Number of schools
• Number of students
• Number of accommodation sessions
• Mode of testing
• Types of items
• Method of reporting
• Number of released items
• New frameworks


Attachment A3

NAEP Schedule of Assessments [i]

Introduction and Background

The Governing Board’s authorizing legislation (P.L. 107-279, Section 302) defines the duties of the Board, one of which is selecting “the subject areas to be assessed (consistent with section 303(b))” (Section 302(e)(1)(A)), generally referred to as the NAEP schedule of assessments.

In the Board’s general policy [ii] for conducting and reporting NAEP, the first goal is “To serve as a consistent external, independent measure of student achievement by which results across education systems can be compared at points in time and over time.” The policy further provides (emphasis added):

“National, state, and local education leaders and policymakers—public and private—rely on NAEP data as an independent monitor of student achievement and as a way to compare performance across education systems. For NAEP to serve in this role, NAGB, in consultation with NCES and stakeholders, periodically establishes a dependable, publicly announced assessment schedule of at least ten years in scope. The schedule specifies the subject or topic (e.g., High School Transcript Study), grades, ages, assessment year, and sampling levels (e.g., national, state) for each assessment.

“The NAEP schedule of assessments is the foundation for states’ planning for participation in the assessments. It is the basis for NCES operational planning, annual budget requests, and contract statements of work. In making decisions about the NAEP schedule of assessments, NAGB includes the wide range of important subjects and topics to which students are exposed. NAGB also considers opportunities to conduct studies linking NAEP with international assessments.

“As the NAEP authorizing legislation provides, assessments are conducted in reading and mathematics, and, as time and resources allow, in subjects such as science, writing, history, civics, geography, the arts, foreign language, economics, technology and engineering literacy and other areas, as determined by NAGB. The goal for the frequency of each subject area assessment is at least twice in ten years, to provide for reporting achievement trends.” [iii]

Discussion

In Article VII.2.a. of the Governing Board By-laws, the Executive Committee is given the responsibility for “recommending the selection of subject areas to be assessed, for consideration by the full Board.” The Board has made changes in the subjects to be assessed based on changes to legal requirements (e.g., No Child Left Behind), important content considerations (e.g., the Arts and Technology and Engineering Literacy), and new initiatives (e.g., linking to international assessments). Schedule adjustments also have been needed to meet the NAEP budget constraints.

The most recent fiscal uncertainty has not provided the opportunity for the Board to adopt a prospective schedule extending 10 years into the future. In fact, the current schedule (Attachment A) only extends until 2017, four years from now. At the May 2014 Governing Board meeting the full Board will discuss a first draft of a schedule that extends through 2024 (Attachment B). The attached draft is proposed to elicit discussion and to inform the development of a proposed schedule by the Executive Committee. No action will be taken at the May 2014 meeting.

Historical Schedule Changes

This section highlights the major schedule changes that have occurred in the last 10 years. Below is a list of the big decisions that have contributed to the current schedule.

1. Added grade 4 and 8 state-level reading and mathematics every two years. (No Child Left Behind; 2002) [Prior to NCLB state assessments at grades 4 and 8 were given every two years with reading and writing in one biennium and mathematics and science in the next. Therefore, these subjects and grade 12 in reading and mathematics were tested once every four years.]

2. Added the High School Transcript Study (HSTS) as a regularly scheduled study. (2005)

3. Scheduled U.S. history, civics and geography on a once every four years cycle. (2005)

4. Added Technology & Engineering Literacy (TEL) to the NAEP subjects assessed. (2005)

5. Added grade 12 state-level reading and mathematics for volunteer states with a periodicity of every four years. (2008)

6. Adjusted the periodicity of science to correspond to the periodicity of TIMSS for the purpose of conducting international benchmarking studies in both mathematics and science. (2010)

7. Scheduled Writing as a technology based assessment, beginning with national data collections only and delaying fourth grade in order to complete a special study. (2010)

Other schedule changes and program adjustments due primarily to budget constraints and/or technical challenges have been considered in the development of the draft schedule:

• Postponing the state-level writing assessment.
• Postponing various other assessments/studies (World History, Foreign Language, HSTS, Long-Term Trend).
• Assessing fewer grade levels in non-required subject areas (U.S. history, civics, and geography; writing; TEL).
• Changing the sample size and reporting depth for states in reading and writing, referred to as focal and non-focal subject reporting.

[i] The Introduction and Background, Discussion, and Historical Schedule Changes sections are repeated verbatim from the May 2014 Governing Board meeting materials.
[ii] General Policy: Conducting and Reporting the National Assessment of Educational Progress, National Assessment Governing Board, August 3, 2013, page 5.
[iii] Ibid.


National Assessment Governing Board Committee on Standards, Design and Methodology

August 1, 2014 9:45 am – 12:30 pm

AGENDA

9:45 – 10:45 am
NAEP Testing and Reporting on Students with Disabilities and English Language Learners
Lou Fabrizio, COSDAM Chair; Andrés Alonso, R&D Chair; Grady Wilburn, NCES
[Joint meeting with Reporting and Dissemination]
Attachment A

10:45 – 10:55 am
Break

10:55 – 11:00 am
Introductions and Review of Agenda
Lou Fabrizio, COSDAM Chair

11:00 – 11:30 am
ACTION: Technology and Engineering Literacy (TEL) Achievement Levels Descriptions
Lou Fabrizio, COSDAM Chair; Edys Quellmalz, WestEd
Attachment B

11:30 am – 12:10 pm
TEL Achievement Levels Setting Contract
Paul Nichols, Pearson
Attachment C

12:10 – 12:20 pm
The Future of Academic Preparedness Research
David Driscoll, Chairman of the Board

12:20 – 12:25 pm
Questions on Information Items
Sharyn Rosenberg, NAGB Staff
See below

12:25 – 12:30 pm
Other Issues and Questions
COSDAM Members

Information Items:
• Update on Academic Preparedness Research – Attachment D
• Development of White Paper on Maintaining Trends During Transition to Technology-Based Assessments – Attachment E
• Update on Evaluation of NAEP Achievement Levels Procurement – Attachment F


Attachment A

NAEP Testing and Reporting on Students with Disabilities

In this joint session, the Committee on Standards, Design, and Methodology (COSDAM) and Reporting and Dissemination Committee (R&D) will discuss a proposed edit to the 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities and English Language Learners, as well as alternatives to the policy for adjusting scores for students excluded from taking the National Assessment of Educational Progress (NAEP). The proposal addresses concerns about a particular part of the policy not being implemented and the possible impact the policy could have on students and schools involved in NAEP. A brief history and background are below.

The Policy In Brief

The March 2010 Governing Board policy on NAEP Testing and Reporting on Students with Disabilities (SD) and English Language Learners (ELL) was intended to reduce exclusion rates and provide more consistency across jurisdictions in which students are tested on NAEP. The policy promoted sound reporting of comparisons and trends (the policy statement is included as Attachment B2). The policy limits the grounds on which schools can exclude students from NAEP samples to two categories: for SD, only those with the most significant cognitive disabilities, and for ELL, only those who have been in U.S. schools for less than a year.

Previously, schools excluded students with Individualized Education Programs (IEPs) that called for accommodations on state tests that NAEP does not allow, primarily the read-aloud accommodation on the Reading assessment. Under the current Board policy, schools could not decide to exclude students whose IEPs for state tests specify an accommodation not allowed on NAEP. Instead, such students had to take NAEP with allowable accommodations. Additionally, parents and educators were encouraged to permit them to do so, given that NAEP provides no scores and causes no consequences for individuals, but needs fully representative samples to produce valid results for the groups on which it reports. By law, individual participation in NAEP is voluntary and parents may withdraw their children for any reason.

Inclusion Rates and Implementation

During the December 2013 Board meeting, COSDAM and R&D met in joint session to discuss the 2013 student participation data for grades 4 and 8 Reading and Mathematics. There had been large increases in inclusion rates over the past ten years, and the Board’s first inclusion rate goal of 95 percent of all students in each sample was met in almost all states in 2013. However, 11 states and eight districts failed to meet the Board’s second goal of testing at least 85 percent of students identified as SD or ELL.

Contrary to Board policy, NCES has continued to permit schools to exclude students whose IEPs called for accommodations that NAEP does not allow. NCES believes changing this practice could be detrimental to students, increase refusals, change NAEP’s target population, and be counter to current statistical procedures. The Committees asked the staffs of NAGB and NCES to consider possible policy and operational changes and what their impact might be, as well as a timeline for possible Board action.
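For illustration, the sketch below shows how the two inclusion goals described above could be checked for a single jurisdiction. The function names and sample counts are hypothetical; they are not NAEP data or an official procedure.

```python
# Illustrative sketch: checking a jurisdiction's sample against the Board's
# two inclusion goals (95 percent of all sampled students; 85 percent of
# students identified as SD or ELL). Counts below are invented.

def inclusion_rate(included, sampled):
    """Percentage of sampled students who were included in the assessment."""
    return 100.0 * included / sampled

def check_goals(all_included, all_sampled, sd_ell_included, sd_ell_sampled):
    overall = inclusion_rate(all_included, all_sampled)
    sd_ell = inclusion_rate(sd_ell_included, sd_ell_sampled)
    return {
        "overall_rate": round(overall, 1),
        "meets_95_goal": overall >= 95.0,
        "sd_ell_rate": round(sd_ell, 1),
        "meets_85_goal": sd_ell >= 85.0,
    }

# Hypothetical state sample: 2,400 of 2,475 students tested overall,
# 310 of 380 SD/ELL students tested.
print(check_goals(2400, 2475, 310, 380))
```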

The staffs of NAGB and NCES have had several conversations about the implementation of the SD/ELL policy. The policy could be clarified by revising the language about converting excluded students to refusals. The fourth implementation guideline for students with disabilities states, “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures.” NCES asserts that it is technically incorrect to apply a weight class adjustment [1] that combines students who did not participate due to receiving accommodations on their state tests that are not allowed on NAEP with students who refused for other reasons. The former group cannot be assumed to be randomly missing, which is a necessary assumption for the current NAEP statistical procedures.

Policy Alternatives and Moving Forward

In the May 2014 COSDAM session, Grady Wilburn of NCES and Rochelle Michel from Educational Testing Service (ETS) presented three alternative methods for adjusting scores for students who were excluded from NAEP, contrary to the Board policy:

• “Expanded” population estimates. Improving upon the methodology of the full population estimates (FPEs) and incorporating additional data from NAEP teacher and school contextual questionnaires and from school records (e.g., state test scores for individual students).

• Modified participation A. Administering only the contextual questionnaire to excluded students and using that additional information to predict how the students would have performed on the cognitive items.

• Modified participation B. Administering the contextual questionnaire in the selected subject (i.e., Reading) in conjunction with an assessment in a different subject (e.g., Mathematics) and using both sources of information to predict how the students would have done on the Reading assessment.

COSDAM members expressed serious reservations about implementing any of the three procedures for the following reasons: current concerns about collecting student data; the potential for jeopardizing trend reporting; increased costs; and the threat of depressing scores due to a change in the population of tested students. There was general consensus that NCES’ current practice on this particular aspect of the policy—encouraging schools to include more students in NAEP even when they receive accommodations on their state tests that are not allowed on NAEP, but still allowing schools to exclude such students if they insist—was acceptable.

The Committee asked whether it is possible to identify students who do take the NAEP Reading assessment despite receiving a read-aloud accommodation on their state tests. Peggy Carr, Associate Commissioner of NCES, noted that the SD questionnaire will be modified for 2015 to capture this information. The Committee agreed with a suggestion from member Andrew Ho that, instead of classifying students as refusals when they do not take the assessment because a particular accommodation is not allowed, the policy be edited to reflect that the number of such students be tracked and minimized to the extent feasible.

At this August 1 joint session, COSDAM and R&D members will discuss proposed edits to the policy to address ongoing concerns and questions about implementation.

[1] This refers to a set of units (e.g., schools or students) that are grouped together for the purpose of calculating nonresponse adjustments. The units are homogeneous with respect to certain characteristics, such as school size, location, public/private status, student age, sex, and student disability status.
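For illustration, the sketch below shows a simplified weight class nonresponse adjustment of the general kind the footnote describes: within each class of similar students, the base weights of respondents are inflated so that they also represent the nonrespondents in that class. The class labels, base weights, and response statuses are invented, and actual NAEP weighting procedures are considerably more elaborate.

```python
# Simplified, hypothetical weight class nonresponse adjustment.
from collections import defaultdict

def adjust_weights(students):
    """students: list of dicts with 'weight_class', 'base_weight', 'responded'.
    Returns adjusted weights (0 for nonrespondents)."""
    total = defaultdict(float)       # sum of base weights per class
    responding = defaultdict(float)  # sum of base weights of respondents per class
    for s in students:
        total[s["weight_class"]] += s["base_weight"]
        if s["responded"]:
            responding[s["weight_class"]] += s["base_weight"]

    adjusted = []
    for s in students:
        if not s["responded"]:
            adjusted.append(0.0)
        else:
            factor = total[s["weight_class"]] / responding[s["weight_class"]]
            adjusted.append(s["base_weight"] * factor)
    return adjusted

sample = [
    {"weight_class": "small_public_G4", "base_weight": 40.0, "responded": True},
    {"weight_class": "small_public_G4", "base_weight": 40.0, "responded": False},
    {"weight_class": "small_public_G4", "base_weight": 40.0, "responded": True},
]
print(adjust_weights(sample))  # [60.0, 0.0, 60.0]
```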


Materials
• The March 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities and English Language Learners, with the proposed edit – Pages 5-10
• An excerpt of the 2015 NAEP Questionnaire about Students with Disabilities – Page 11
• 2013 national and state inclusion rates for NAEP Reading, grades 4 and 8 – Pages 12-13
• 2013 TUDA inclusion rates for NAEP Reading, grades 4 and 8 – Page 14
• 2013 national and state inclusion rates for NAEP Mathematics, grades 4 and 8 – Pages 15-16
• 2013 TUDA inclusion rates for NAEP Mathematics, grades 4 and 8 – Page 17
• Minutes from the May 2014 COSDAM session on this topic – Pages 18-19
• Minutes from the December 2013 Joint COSDAM and R&D session on this topic – Pages 20-22


Adopted: March 6, 2010

National Assessment Governing Board

NAEP Testing and Reporting on Students with Disabilities and English Language Learners

Policy Statement

INTRODUCTION

To serve as the Nation’s Report Card, the National Assessment of Educational Progress (NAEP) must produce valid, comparable data on the academic achievement of American students. Public confidence in NAEP results must be high. But in recent years it has been threatened by continuing, substantial variations in exclusion rates for students with disabilities (SD) and English language learners (ELL) among the states and urban districts taking part.

Student participation in NAEP is voluntary, and the assessment is prohibited by law from providing results for individual children or schools. But NAEP’s national, state, and district results are closely scrutinized, and the National Assessment Governing Board (NAGB) believes NAEP must act affirmatively to ensure that the samples reported are truly representative and that public confidence is maintained. To ensure that NAEP is fully representative, a very high proportion of the students selected must participate in its samples, including students with disabilities and English language learners. Exclusion of such students must be minimized; they should be counted in the Nation’s Report Card. Accommodations should be offered to make the assessment accessible, but these changes from standard test administration procedures should not alter the knowledge and skills being assessed.

The following policies and guidelines are based on recommendations by expert panels convened by the Governing Board to propose uniform national rules for NAEP testing of SD and ELL students. The Board has also taken into consideration the views expressed in a wide range of public comment and in detailed analyses provided by the National Center for Education Statistics, which is responsible for conducting the assessment under the policy guidance of the Board. The policies are presented not as statistically-derived standards but as policy guidelines intended to maximize student participation, minimize the potential for bias, promote fair comparisons, and maintain trends. They signify the Board’s strong belief that NAEP must retain public confidence that it is fair and fully-representative of the jurisdictions and groups on which the assessment reports.


POLICY PRINCIPLES

1. As many students as possible should be encouraged to participate in the National Assessment. Accommodations should be offered, if necessary, to enable students with disabilities and English language learners to participate, but should not alter the constructs assessed, as defined in assessment frameworks approved by the National Assessment Governing Board.

2. To attain comparable inclusion rates across states and districts, special efforts should be made to inform and solicit the cooperation of state and local officials, including school personnel who decide upon the participation of individual students.

3. The proportion of all students excluded from any NAEP sample should not exceed 5 percent. Samples falling below this goal shall be prominently designated in reports as not attaining the desired inclusion rate of 95 percent.

4. Among students classified as either ELL or SD a goal of 85 percent inclusion shall be established. National, state, and district samples falling below this goal shall be identified in NAEP reporting.

5. In assessment frameworks adopted by the Board, the constructs to be tested should be carefully defined, and allowable accommodations should be identified.

6. All items and directions in NAEP assessments should be clearly written and free of linguistic complexity irrelevant to the constructs assessed.

7. Enhanced efforts should be made to provide a short clear description of the purpose and value of NAEP and of full student participation in the assessment. These materials should be aimed at school personnel, state officials, and the general public, including the parents of students with disabilities and English language learners. The materials should emphasize that NAEP provides important information on academic progress and that all groups of students should be counted in the Nation’s Report Card. The materials should state clearly that NAEP gives no results for individual students or schools, and can have no impact on student status, grades, or placement decisions.

8. Before each state and district-level assessment NAEP program representatives should meet with testing directors and officials concerned with SD and ELL students to explain NAEP inclusion rules. The concerns of state and local decision makers should be discussed.


IMPLEMENTATION GUIDELINES

For Students with Disabilities

1. Students with disabilities should participate in the National Assessment with or without allowable accommodations, as needed. Allowable accommodations are any changes from standard test administration procedures, needed to provide fair access by students with disabilities that do not alter the constructs being measured and produce valid results. In cases where non-standard procedures are permitted on state tests but not allowed on NAEP, students will be urged to take NAEP without them, but these students may use other allowable accommodations that they need.

2. The decision tree for participation of students with disabilities in NAEP shall be as follows:

NAEP Decision Tree for Students with Disabilities

BACKGROUND CONTEXT
1. NAEP is designed to measure constructs carefully defined in assessment frameworks adopted by the National Assessment Governing Board.
2. NAEP provides a list of appropriate accommodations and non-allowed modifications in each subject. An appropriate accommodation changes the way NAEP is normally administered to enable a student to take the test but does not alter the construct being measured. An inappropriate modification changes the way NAEP is normally administered but does alter the construct being measured.

STEPS OF THE DECISION TREE
3. In deciding how a student will participate in NAEP:
a. If the student has an Individualized Education Program (IEP) or Section 504 plan and is tested without accommodation, then he or she takes NAEP without accommodation.
b. If the student’s IEP or 504 plan specifies an accommodation permitted by NAEP, then the student takes NAEP with that accommodation.
c. If the student’s IEP or 504 plan specifies an accommodation or modification that is not allowed on NAEP, then the student is encouraged to take NAEP without that accommodation or modification.
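For illustration only, the following sketch renders the three steps of the decision tree above as code. The field names and the list of allowable accommodations are hypothetical, not an actual NAEP data specification.

```python
# Schematic, hypothetical rendering of decision-tree steps (a)-(c) for a
# student with an IEP or Section 504 plan.

def naep_participation(plan_accommodation, allowed_on_naep):
    """plan_accommodation: accommodation named in the IEP/504 plan, or None
    allowed_on_naep: set of accommodations NAEP permits for the subject."""
    if plan_accommodation is None:
        # Step (a): tested without accommodation
        return "Takes NAEP without accommodation"
    if plan_accommodation in allowed_on_naep:
        # Step (b): the plan's accommodation is permitted by NAEP
        return f"Takes NAEP with accommodation: {plan_accommodation}"
    # Step (c): the plan specifies something NAEP does not allow
    return "Encouraged to take NAEP without that accommodation or modification"

# Hypothetical list of allowable accommodations for one subject
ALLOWED_READING = {"extended time", "small group", "breaks during testing"}
print(naep_participation("extended time", ALLOWED_READING))
print(naep_participation("read aloud - most or all", ALLOWED_READING))
```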


3. Students should be considered for exclusion from NAEP only if they have previously been identified in an Individualized Education Program (IEP) as having the most significant cognitive disabilities, and are assessed by the state on an alternate assessment based on alternate achievement standards (AA-AAS). All students tested by the state on an alternate assessment with modified achievement standards (AA-MAS) should be included in the National Assessment.

4. [Proposed edit] The number of students who do not take the assessment because a particular accommodation is not allowed should be tracked and minimized to the extent possible. [Current text proposed for deletion: “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures.”]

5. NAEP should report separately on students with Individualized Education Programs (IEPs) and those with Section 504 plans, but (except to maintain trend) should only count the students with IEPs as students with disabilities. All 504 students should participate in NAEP.

At present the National Assessment reports on students with disabilities by combining results for those with an individualized education program (who receive special education services under the Individuals with Disabilities Education Act [IDEA]) and students with Section 504 plans under the Rehabilitation Act of 1973 (a much smaller group with disabilities who are not receiving services under IDEA but may be allowed test accommodations).* Under the Elementary and Secondary Education Act, only those with an IEP are counted as students with disabilities in reporting state test results. NAEP should be consistent with this practice. However, to preserve trend, results for both categories should be combined for several more assessment years, but over time NAEP should report as students with disabilities only those who have an IEP.

* NOTE: The regulation implementing Section 504 defines a person with a disability as one who has a physical or mental impairment which substantially limits one or more major life activities, has a record of such an impairment, or is regarded as having such an impairment. 34 C.F.R. § 104.3(j)(1).

6. Only students with an IEP or Section 504 plan are eligible for accommodations on NAEP. States are urged to adopt policies providing that such documents should address participation in the National Assessment.

For English Language Learners

1. All English language learners selected for the NAEP sample who have been in United States schools for one year or more should be included in the National Assessment. Those in U.S. schools for less than one year should take the assessment if it is available in the student’s primary language.


One year or more shall be defined as one full academic year before the year of the assessment.

2. Accommodations should be offered that maximize meaningful participation, are responsive to the student’s level of English proficiency, and maintain the constructs in the NAEP framework. A list of allowable accommodations should be prepared by NAEP and furnished to participating schools. Such accommodations may be provided only to students who are not native speakers of English and are currently classified by their schools as English language learners or limited English proficient (LEP).

3. Bilingual versions of NAEP in Spanish and English should be prepared in all subjects, other than reading and writing, to the extent deemed feasible by the National Center for Education Statistics. The assessments of reading and writing should continue to be in English only, as provided for in the NAEP frameworks for these subjects.

4. Staff at each school should select from among appropriate ELL-responsive accommodations allowed by NAEP, including bilingual booklets, those that best meet the linguistic needs of each student. Decisions should be made by a qualified professional familiar with the student, using objective indicators of English proficiency (such as the English language proficiency assessments [ELPA] required by federal law), in accordance with guidance provided by NAEP and subject to review by the NAEP assessment coordinator.

5. Schools may provide word-to-word bilingual dictionaries (without definitions) between English and the student’s primary language, except for NAEP reading and writing, which are assessments in English only.

6. NAEP results for ELL students should be disaggregated and reported by detailed information on students’ level of English language proficiency, using the best available standardized assessment data. As soon as possible, NAEP should develop its own brief test of English language proficiency to bring consistency to reporting nationwide.

7. Data should be collected, disaggregated, and reported for former English language learners who have been reclassified as English proficient and exited from the ELL category. This should include data on the number of years since students exited ELL services or were reclassified.

8. English language learners who are also classified as students with disabilities should first be given linguistically-appropriate accommodations before determining which additional accommodations may be needed to address any disabilities they may have.


RESEARCH AND DEVELOPMENT

The Governing Board supports an aggressive schedule of research and development in the following areas:

1. The use of plain language and the principles of universal design, including a plain language review of new test items consistent with adopted frameworks.

2. Adaptive testing, either computer-based or paper-and-pencil. Such testing should provide more precise and accurate information than is available at present on low-performing and high-performing groups of students, and may include items appropriate for ELLs at low or intermediate levels of English proficiency. Data produced by such targeted testing should be placed on the common NAEP scale. Students assessed under any new procedures should be able to demonstrate fully their knowledge and skills on a range of material specified in NAEP frameworks.

3. A brief, easily-administered test of English language proficiency to be used for determining whether students should receive a translation, adaptive testing, or other accommodations because of limited English proficiency.

4. The validity and impact of commonly used testing accommodations, such as extended time and small group administration.

5. The identification, measurement, and reporting on academic achievement of students with the most significant cognitive disabilities. This should be done in order to make recommendations on how such students could be included in NAEP in the future.

6. A study of outlier states and districts with notably high or low exclusion rates for either SD or ELL students to identify the characteristics of state policies, the approach of decision makers, and other criteria associated with different inclusion levels.

The Governing Board requests NCES to prepare a research agenda on the topics above.

A status report on this research should be presented at the November 2010 meeting of the Board.


Excerpt from the 2015 NAEP Questionnaire about Students with Disabilities

What accommodations does STUDENT receive on the state test for Reading?

If a student is not assessed on the state test in Reading, base the response on how the student is assessed in the classroom in Reading.

NOTE: For a description of how each accommodation is conducted in NAEP, place your cursor over the name of each accommodation. Choose all that apply.

□ Student does not receive any accommodations
□ Extended time
□ Small group
□ One on one
□ Read aloud in English – directions only
□ Read aloud in English – occasional
□ Read aloud in English – most or all
□ Breaks during testing
□ Must have an aide administer the test
□ Large print version of the test
□ Magnification
□ Uses template/special equipment/preferential seating
□ Presentation in Braille
□ Response in Braille
□ Presentation in sign language
□ Response in sign language
□ Other (specify)

In 2015, the information that is captured will allow us to distinguish between accommodations allowed on the NAEP Reading Assessment (e.g., Read aloud in English – directions only) and accommodations not allowed on the NAEP Reading Assessment (e.g., Read aloud in English – occasional, Read aloud in English – most or all). In 2013, a single item asked whether students received any Read aloud accommodation (directions only/occasional/most or all); therefore, it was not possible to distinguish between accommodations allowed by NAEP and accommodations not allowed by NAEP.


National Center for Education Statistics
Inclusion rate and confidence interval in NAEP reading for fourth- and eighth-grade public and nonpublic school students, as a percentage of all students, by state/jurisdiction: 2013

Columns: State/jurisdiction; Grade 4 inclusion rate, 95% confidence interval (lower, upper) | Grade 8 inclusion rate, 95% confidence interval (lower, upper). A “1” following a rate and a “[2]” following a jurisdiction name refer to the footnotes following the table.

Nation 97 1 97.3 97.6 | 98 1 97.7 98.0
Nation (public) 97 1 97.2 97.5 | 98 1 97.5 97.9
Alabama 99 1 98.3 99.3 | 99 1 98.2 99.3
Alaska 99 1 97.9 99.0 | 99 1 98.1 99.0
Arizona 99 1 98.3 99.3 | 99 1 98.0 98.9
Arkansas 99 1 98.4 99.2 | 98 1 97.3 98.6
California 97 1 96.7 98.1 | 97 1 96.7 98.1
Colorado 98 1 97.9 98.9 | 99 1 98.4 99.2
Connecticut 98 1 97.8 98.9 | 98 1 97.2 98.4
Delaware 95 1 94.3 96.1 | 97 1 95.8 97.1
Florida 97 1 96.1 97.8 | 98 1 97.4 98.7
Georgia 95 1 93.7 96.2 | 96 1 95.2 97.0
Hawaii 98 1 97.6 98.6 | 98 1 97.4 98.5
Idaho 99 1 98.0 98.9 | 98 1 97.8 98.8
Illinois 99 1 98.3 99.1 | 99 1 98.1 98.9
Indiana 98 1 96.4 98.3 | 98 1 97.4 98.6
Iowa 99 1 98.4 99.2 | 99 1 98.1 99.2
Kansas 98 1 97.5 98.7 | 98 1 97.7 98.7
Kentucky 97 1 96.4 97.5 | 97 1 95.9 97.4
Louisiana 99 1 98.4 99.2 | 99 1 98.3 99.1
Maine 98 1 97.7 98.7 | 98 1 97.9 98.9
Maryland 87 85.9 88.3 | 91 89.4 91.7
Massachusetts 97 1 96.7 97.8 | 98 1 97.1 98.4
Michigan 96 1 95.0 97.1 | 96 1 95.1 97.5
Minnesota 97 1 96.5 97.9 | 98 1 97.0 98.2
Mississippi 99 1 99.0 99.7 | 99 1 98.9 99.5
Missouri 99 1 98.2 99.2 | 99 1 98.5 99.3
Montana 97 1 96.5 97.6 | 98 1 97.0 98.3
Nebraska 96 1 95.4 97.2 | 97 1 96.2 97.7
Nevada 98 1 98.0 98.9 | 99 1 98.6 99.3
New Hampshire 97 1 96.7 98.0 | 97 1 96.5 97.6
New Jersey 98 1 97.4 98.9 | 97 1 96.4 98.1
New Mexico 99 1 98.6 99.3 | 98 1 97.8 98.7
New York 99 1 97.9 99.1 | 99 1 98.6 99.4
North Carolina 98 1 97.4 98.7 | 98 1 97.6 98.8
North Dakota 96 1 95.3 96.5 | 96 1 94.9 96.4
Ohio 97 1 96.3 98.2 | 98 1 96.8 98.4
Oklahoma 98 1 97.5 98.8 | 99 1 98.0 99.0
Oregon 98 1 96.8 98.1 | 99 1 98.0 99.0
Pennsylvania 98 1 96.9 98.3 | 98 1 97.6 98.7
Rhode Island 99 1 98.2 99.0 | 99 1 98.2 99.0
South Carolina 98 1 97.3 98.9 | 98 1 97.5 98.6
South Dakota 98 1 97.1 98.3 | 97 1 96.1 97.7
Tennessee 97 1 96.0 97.6 | 97 1 96.0 97.5
Texas 95 1 94.0 96.0 | 96 1 95.6 97.2
Utah 97 1 96.1 97.6 | 97 1 96.0 97.7
Vermont 99 1 98.3 99.2 | 99 1 98.6 99.4
Virginia 98 1 97.9 98.9 | 99 1 98.1 99.0
Washington 97 1 96.2 97.9 | 98 1 96.8 98.1
West Virginia 98 1 97.6 98.7 | 98 1 97.6 98.6
Wisconsin 98 1 97.8 98.8 | 98 1 97.8 98.8
Wyoming 99 1 98.3 99.1 | 99 1 98.5 99.1
Other jurisdictions
District of Columbia 98 1 97.6 98.9 | 98 1 97.6 98.6
DoDEA [2] 94 93.2 94.8 | 96 1 95.3 96.9

1 The state/jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
2 Department of Defense Education Activity (overseas and domestic schools).
NOTE: The overall national results include both public and nonpublic school students. The national (public) and state/jurisdiction results include public school students only. Data for DoDEA schools are included in the overall national results, but not in the national (public) results.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Reading Assessment.

Page 58: National Assessment Governing BoardAdministering the contextual questionnaire in the selected subject (i.e., Reading) in conjunction with an assessment in a different subject (e.g.,

National Center for Education Statistics
Inclusion rate and confidence interval in NAEP reading for fourth- and eighth-grade public school students, as a percentage of all students, by jurisdiction: 2013

                                Grade 4                        Grade 8
Jurisdiction                    Rate     Lower    Upper        Rate     Lower    Upper
Nation (public)                 97 (2)   97.2     97.5         98 (2)   97.5     97.9
Large city (1)                  97 (2)   96.1     97.1         98 (2)   97.2     97.9
Albuquerque                     99 (2)   98.8     99.5         98 (2)   97.1     98.6
Atlanta                         99 (2)   98.5     99.1         99 (2)   98.4     99.3
Austin                          96 (2)   94.1     97.4         97 (2)   95.6     97.4
Baltimore City                  84       81.8     86.3         84       81.1     85.8
Boston                          96 (2)   94.9     96.3         97 (2)   95.8     97.3
Charlotte                       99 (2)   98.4     99.5         98 (2)   97.6     98.8
Chicago                         99 (2)   97.6     99.1         98 (2)   97.5     99.0
Cleveland                       95 (2)   94.5     96.0         96 (2)   95.5     97.3
Dallas                          83       75.3     88.5         96 (2)   95.4     97.3
Detroit                         95 (2)   92.7     96.0         94 (2)   92.6     95.7
District of Columbia (DCPS)     98 (2)   96.7     98.5         97 (2)   96.4     98.3
Fresno                          98 (2)   96.7     98.3         97 (2)   96.0     97.6
Hillsborough County (FL)        99 (2)   98.3     99.3         98 (2)   97.2     98.7
Houston                         94 (2)   91.0     95.5         96 (2)   95.3     97.0
Jefferson County (KY)           95 (2)   92.5     96.3         96 (2)   94.4     96.7
Los Angeles                     98 (2)   96.6     98.7         97 (2)   96.4     98.0
Miami-Dade                      95 (2)   92.3     97.4         97 (2)   95.2     98.3
Milwaukee                       96 (2)   93.8     97.3         96 (2)   94.4     97.1
New York City                   98 (2)   97.4     99.0         99 (2)   97.8     99.0
Philadelphia                    96 (2)   94.9     97.1         96 (2)   93.0     98.0
San Diego                       98 (2)   96.6     98.4         97 (2)   96.2     98.3

Lower and Upper are the bounds of the 95% confidence interval.
(1) Large city includes students from all cities in the nation with populations of 250,000 or more including the participating districts.
(2) The jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
NOTE: DCPS = District of Columbia Public Schools.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Reading Assessment.


National Center for Education Statistics
Inclusion rate and confidence interval in NAEP mathematics for fourth- and eighth-grade public and nonpublic school students, as a percentage of all students, by state/jurisdiction: 2013

                                Grade 4                        Grade 8
State/jurisdiction              Rate     Lower    Upper        Rate     Lower    Upper
Nation                          99 (1)   98.5     98.7         99 (1)   98.4     98.6
Nation (public)                 98 (1)   98.4     98.6         98 (1)   98.3     98.5
Alabama                         99 (1)   98.1     99.4         99 (1)   98.6     99.2
Alaska                          99 (1)   98.4     99.2         99 (1)   98.5     99.2
Arizona                         99 (1)   98.3     99.1         99 (1)   98.2     99.1
Arkansas                        99 (1)   98.3     99.1         98 (1)   97.5     98.5
California                      98 (1)   97.4     98.6         99 (1)   98.0     98.9
Colorado                        99 (1)   98.3     99.2         99 (1)   98.3     99.3
Connecticut                     99 (1)   98.1     99.1         98 (1)   97.4     98.4
Delaware                        98 (1)   97.1     98.5         99 (1)   98.2     99.0
Florida                         98 (1)   97.5     98.6         98 (1)   97.7     98.8
Georgia                         99 (1)   97.9     99.0         98 (1)   97.7     99.0
Hawaii                          99 (1)   98.3     99.1         98 (1)   97.8     98.8
Idaho                           99 (1)   98.3     99.0         99 (1)   98.5     99.3
Illinois                        99 (1)   98.4     99.4         99 (1)   98.6     99.3
Indiana                         98 (1)   97.9     98.9         98 (1)   97.7     98.8
Iowa                            99 (1)   98.8     99.6         99 (1)   98.8     99.5
Kansas                          98 (1)   97.9     98.8         98 (1)   97.7     98.8
Kentucky                        99 (1)   98.0     99.0         98 (1)   97.2     98.5
Louisiana                       99 (1)   98.3     99.3         99 (1)   98.5     99.2
Maine                           98 (1)   97.3     98.4         99 (1)   98.2     99.0
Maryland                        99 (1)   98.6     99.3         98 (1)   97.7     98.7
Massachusetts                   98 (1)   97.3     98.5         98 (1)   97.1     98.6
Michigan                        98 (1)   97.3     98.6         98 (1)   95.8     98.6
Minnesota                       99 (1)   98.1     99.0         98 (1)   97.6     98.8
Mississippi                     99 (1)   98.7     99.5         99 (1)   98.5     99.6
Missouri                        99 (1)   98.0     99.0         99 (1)   98.2     99.1
Montana                         98 (1)   97.8     98.7         99 (1)   98.0     99.0
Nebraska                        98 (1)   97.6     98.8         98 (1)   97.6     98.6
Nevada                          99 (1)   98.1     99.0         99 (1)   98.4     99.3
New Hampshire                   99 (1)   98.3     99.1         99 (1)   98.5     99.3
New Jersey                      99 (1)   98.3     99.2         98 (1)   97.7     98.8
New Mexico                      99 (1)   98.2     99.2         98 (1)   97.9     98.8
New York                        99 (1)   98.1     99.2         98 (1)   97.1     98.7
North Carolina                  99 (1)   98.3     99.1         99 (1)   98.2     99.1
North Dakota                    97 (1)   96.8     97.9         97 (1)   96.5     97.5
Ohio                            99 (1)   98.2     99.0         98 (1)   98.0     98.9
Oklahoma                        98 (1)   97.5     98.6         98 (1)   97.7     98.9
Oregon                          98 (1)   97.2     98.4         99 (1)   97.9     99.0
Pennsylvania                    98 (1)   97.8     98.8         98 (1)   97.4     98.9
Rhode Island                    99 (1)   98.4     99.2         99 (1)   98.5     99.2
South Carolina                  99 (1)   98.2     99.3         99 (1)   98.0     99.1
South Dakota                    99 (1)   98.0     99.0         99 (1)   98.2     99.1
Tennessee                       99 (1)   98.0     99.1         98 (1)   97.7     98.7
Texas                           98 (1)   97.9     98.7         98 (1)   97.4     98.6
Utah                            99 (1)   98.1     99.2         98 (1)   97.9     98.9
Vermont                         99 (1)   98.2     99.0         99 (1)   98.8     99.4
Virginia                        98 (1)   98.0     98.9         99 (1)   98.6     99.2
Washington                      98 (1)   97.0     98.4         98 (1)   97.3     98.5
West Virginia                   98 (1)   97.6     98.8         98 (1)   97.8     98.7
Wisconsin                       98 (1)   97.7     98.6         98 (1)   97.9     98.9
Wyoming                         99 (1)   98.6     99.3         98 (1)   98.0     98.9
Other jurisdictions
District of Columbia            99 (1)   98.1     99.0         99 (1)   98.5     99.4
DoDEA (2)                       98 (1)   97.9     98.7         99 (1)   98.4     99.2

Lower and Upper are the bounds of the 95% confidence interval.
(1) The state/jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
(2) Department of Defense Education Activity (overseas and domestic schools).
NOTE: The overall national results include both public and nonpublic school students. The national (public) and state/jurisdiction results include public school students only. Data for DoDEA schools are included in the overall national results, but not in the national (public) results.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Mathematics Assessment.


National Center for Education Statistics
Inclusion rate and confidence interval in NAEP mathematics for fourth- and eighth-grade public school students, as a percentage of all students, by jurisdiction: 2013

                                Grade 4                        Grade 8
Jurisdiction                    Rate     Lower    Upper        Rate     Lower    Upper
Nation (public)                 98 (2)   98.4     98.6         98 (2)   98.3     98.5
Large city (1)                  98 (2)   98.0     98.4         98 (2)   97.9     98.4
Albuquerque                     99 (2)   98.1     99.3         98 (2)   97.8     99.0
Atlanta                         99 (2)   98.4     99.4         99 (2)   98.8     99.6
Austin                          98 (2)   97.0     98.6         98 (2)   97.4     98.6
Baltimore City                  98 (2)   96.9     99.2         98 (2)   96.6     99.1
Boston                          96 (2)   95.4     97.0         97 (2)   96.7     98.0
Charlotte                       99 (2)   97.6     99.4         99 (2)   97.8     99.2
Chicago                         99 (2)   98.3     99.3         99 (2)   98.0     99.2
Cleveland                       96 (2)   94.8     96.5         97 (2)   96.6     98.0
Dallas                          98 (2)   96.8     98.3         98 (2)   96.7     98.2
Detroit                         95 (2)   93.3     96.1         96 (2)   94.4     96.9
District of Columbia (DCPS)     98 (2)   97.1     98.6         98 (2)   97.4     98.9
Fresno                          99 (2)   98.5     99.5         98 (2)   97.5     98.8
Hillsborough County (FL)        99 (2)   98.1     99.3         99 (2)   97.8     99.2
Houston                         98 (2)   97.1     98.8         98 (2)   97.1     98.3
Jefferson County (KY)           98 (2)   97.4     98.8         98 (2)   97.5     98.9
Los Angeles                     98 (2)   97.0     98.7         98 (2)   97.8     98.9
Miami-Dade                      98 (2)   96.5     98.4         98 (2)   97.0     98.3
Milwaukee                       97 (2)   95.2     97.6         96 (2)   93.6     97.4
New York City                   99 (2)   98.0     99.1         98 (2)   97.4     98.8
Philadelphia                    97 (2)   95.1     97.6         96 (2)   92.6     98.2
San Diego                       99 (2)   97.7     99.1         98 (2)   96.8     98.3

Lower and Upper are the bounds of the 95% confidence interval.
(1) Large city includes students from all cities in the nation with populations of 250,000 or more including the participating districts.
(2) The jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
NOTE: DCPS = District of Columbia Public Schools.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Mathematics Assessment.


National Assessment Governing Board Committee on Standards, Design and Methodology

May 16, 2014

EXCERPT

COSDAM Members: Chair Lou Fabrizio, Vice Chair Fielding Rolston, Lucille Davy, James Geringer, Andrew Ho, Terry Holliday, James Popham, and Leticia Van de Putte.
Governing Board Staff: Michelle Blair and Sharyn Rosenberg.
Other Attendees: John Easton, Director of the Institute of Education Sciences and ex officio member of the Governing Board. NCES: Peggy Carr, Arnold Goldstein, Dana Kelly, Daniel McGrath, and Grady Wilburn. AIR: Fran Stancavage. CRP: Carolyn Rudd. ETS: Rochelle Michel and Andreas Oranje. HumRRO: Lauress Wise. Optimal Solutions Group: Lipika Ahuja. Pearson: Brad Thayer. Westat: Keith Rust.

NAEP Testing and Reporting on Students with Disabilities

Mr. Fabrizio noted that the session would focus on a particular challenge associated with the March 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities (SDs) and English Language Learners (ELLs). The policy was intended to reduce exclusion rates and provide more consistency across jurisdictions in which students are tested on NAEP to promote sound reporting of comparisons and trends. The policy limits the grounds upon which schools can exclude students to two categories—for SDs, only those with the most significant cognitive disabilities, and for ELLs, only those who have been in U.S. schools for less than one year. Although schools cannot limit student participation on any other grounds, individual participation in NAEP is voluntary by law and parents may withdraw their children for any reason.

The policy states, “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures.” Under NAEP data analysis procedures, a weight class adjustment is used to account for students who refuse to take the assessment, but excluded students have no impact on estimated scores. Contrary to the Board policy, NCES has continued to permit schools to exclude students whose Individualized Education Programs (IEPs) call for accommodations that NAEP does not allow. NCES asserts that it is technically incorrect to apply a weight class adjustment that combines students who did not participate due to receiving accommodations on their state tests that are not allowed on NAEP with students who refused for other reasons.

Grady Wilburn of the National Center for Education Statistics (NCES) and Rochelle Michel from Educational Testing Service (ETS) presented three alternative methods for adjusting scores for students who were excluded from NAEP, contrary to the Board policy. The first method, “Expanded” population estimates, would improve upon the methodology of the full population estimates (FPEs) and incorporate additional data from NAEP teacher and school contextual questionnaires and from school records (e.g., state test scores for individual students). The second method, Modified participation A, would involve administering only the contextual questionnaire to excluded students and using that additional information to predict how the students would have performed on the cognitive items. The third method, Modified participation B, would involve administering the contextual questionnaire in the selected subject (i.e., Reading) in conjunction with an assessment in a different subject (e.g., Mathematics) and using both sources of information to predict how the students would have done on the Reading assessment.

COSDAM members expressed serious reservations about implementing any of the three procedures due to the following reasons: current concerns about collecting student data; the potential for jeopardizing trend reporting; increased costs; and the threat of depressing scores due to a change in the population of tested students. There was general consensus that NCES’ current practices on this particular aspect of the policy—encouraging schools to include more students in NAEP even when they receive accommodations on their state tests that are not allowed on NAEP, but still allowing schools to exclude such students if they insist—was acceptable. The committee asked whether it is possible to identify students who do take the NAEP Reading assessment despite receiving a read-aloud accommodation on their state tests. Peggy Carr, Associate Commissioner of NCES, noted that the SD questionnaire will be modified for 2015 to capture this information.

Andrew Ho suggested the following edit to the policy: “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures be tracked and minimized to the extent possible.” The committee agreed with Mr. Ho’s suggestion. Mr. Fabrizio asked that this recommendation be shared with the Reporting and Dissemination Committee in joint session during the August 2014 meeting.
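The distinction drawn above, that refusals are handled through a weight class adjustment while exclusions have no impact on estimated scores, can be made concrete with a small illustration. The sketch below uses entirely hypothetical records and simplified weight classes: the weights of assessed students are inflated to stand in for refusals in the same class, while excluded students contribute nothing to the estimate. NAEP's operational weighting procedures are considerably more elaborate than this.

# Minimal sketch on hypothetical data: a weight-class nonresponse adjustment.
# Refusals transfer their weight to assessed students in the same weight class;
# excluded students are simply dropped from the estimate.
students = [
    # (weight_class, base_weight, status, score); status is assessed/refusal/excluded
    ("SD", 10.0, "assessed", 240), ("SD", 10.0, "assessed", 255),
    ("SD", 10.0, "refusal", None), ("SD", 10.0, "excluded", None),
    ("ELL", 12.0, "assessed", 230), ("ELL", 12.0, "refusal", None),
]

def adjusted_mean(rows):
    """Weighted mean after spreading refusal weight over assessed students."""
    weighted = []
    for weight_class in {row[0] for row in rows}:
        in_class = [row for row in rows if row[0] == weight_class]
        eligible_weight = sum(w for _, w, status, _ in in_class if status != "excluded")
        assessed = [(w, score) for _, w, status, score in in_class if status == "assessed"]
        factor = eligible_weight / sum(w for w, _ in assessed)
        weighted += [(w * factor, score) for w, score in assessed]
    return sum(w * s for w, s in weighted) / sum(w for w, _ in weighted)

print(round(adjusted_mean(students), 1))  # estimate based only on assessed students

Because the excluded records never enter the computation, the group estimate behaves as if those students had scored at the average of the students who were assessed, which is the concern the three alternative methods described above attempt to address.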


National Assessment Governing Board Committee on Standards, Design and Methodology

December 6, 2013

EXCERPT

JOINT MEETING WITH REPORTING AND DISSEMINATION COMMITTEE

Attendees

COSDAM Members: Chair Lou Fabrizio, Vice Chair Fielding Rolston, Lucille Davy, Andrew Ho, Terry Holliday, and James Popham.
Reporting and Dissemination Committee Members: Acting Chair Terry Mazany (Vice Chair of the Reporting and Dissemination Committee), Anitere Flores, Rebecca Gagnon, Tom Luna, Tonya Miles, and Father Joseph O’Keefe.
Governing Board Staff: Executive Director Cornelia Orr, Michelle Blair, Larry Feinberg, Stephaan Harris, and Sharyn Rosenberg.
Other Attendees: John Easton, Director of the Institute of Education Sciences and ex officio member of the Governing Board. NCES: Commissioner Jack Buckley, Gina Broxterman, Patricia Etienne, Arnold Goldstein, Andrew Kolstad, and Daniel McGrath. AIR: Victor Bandeira de Mello, George Bohrnstedt, Markus Broer, and Cadelle Hemphill. ETS: Andreas Oranje, John Mazzeo, and Lisa Ward. Hager Sharp: David Hoff, Debra Silimeo, and Melissa Spade Cristler. HumRRO: Steve Sellman and Laurie Wise. Optimal Solutions Group: Rukayat Akinbiyi. Reingold: Amy Buckley, Erin Fenn, Sarah Johnson, and Valeri Marrapodi. Virginia Department of Education: Pat Wright. Westat: Chris Averett and Keith Rust. Widmeyer: Jason Smith.

Lou Fabrizio, Chair of the Committee on Standards, Design and Methodology (COSDAM), called the meeting to order at 10:02 a.m. and welcomed members and guests. The purpose of the joint session was to discuss implementation in the NAEP 2013 assessments of the Governing Board policy on NAEP Testing and Reporting on Students with Disabilities (SD) and English Language Learners (ELL).

Larry Feinberg, of the Governing Board staff, described the March 2010 policy, which was intended to reduce exclusion rates and provide consistency across jurisdictions in how students are tested to promote sound reporting of comparisons and trends. The policy limits the grounds on which schools can exclude students from NAEP samples to two categories—for SD, only those with the most significant cognitive disabilities, and for ELL, only those who have been in U.S. schools for less than a year. He noted that previously, schools could exclude students with Individualized Education Programs (IEPs) that called for accommodations on state tests that NAEP does not allow because they would alter the construct NAEP assesses. The most widely used of these were having the test read aloud for the Reading assessment and using a calculator for all parts of the Mathematics assessment.

Under the current Board policy, schools can no longer decide to exclude students whose IEPs for state tests specify an accommodation not allowed on NAEP. Instead, such students should take NAEP with allowable accommodations. Parents should be encouraged to permit them to do so, given that NAEP provides no scores and causes no consequences for individuals but needs fully representative samples to produce valid results for the groups on which it reports. By law, individual participation in NAEP is voluntary and parents may withdraw their children for any reason. When parents refuse to allow children to participate in NAEP, scores are imputed based on reweighting the performance of other students with similar characteristics. However, when students are excluded, they do not impact group scores at all, and, in effect, are considered to achieve at the group average.

Grady Wilburn, of NCES, presented 2013 participation data for grades 4 and 8 Reading and Mathematics. He noted large increases in inclusion rates over the past ten years, and said the Board’s inclusion goals—95 percent of all students in each sample and 85 percent of students identified as SD or ELL—had been met in almost all states. According to calculations by Keith Rust, of Westat, converting exclusions in reading to refusals would produce a statistically significant change in only one state, Maryland. However, Peggy Carr, Associate Commissioner of Assessment at NCES, said the impact would be much greater in some of the urban districts in TUDA, whose 2013 results have not yet been released.

In accordance with Board action, Mr. Wilburn said NCES had also published scores based on full-population estimates, (FPEs), which adjust state and district averages by imputing scores for excluded SD and ELL students based on the performance of similar SD and ELL students who are tested. Member Andrew Ho said these estimates should be given more emphasis as a way to give consistency to trends and make it clear when score changes are likely to have been caused by changes in exclusion rates. Ms. Carr said improvements were possible in the models for imputing FPEs. Mr. Wilburn explained that, contrary to the Board policy, NCES had continued to permit schools to exclude students whose IEPs called for accommodations that NAEP does not allow, in most cases, read-aloud. NCES believes changing this practice would increase refusals, impact reported trends, change NAEP’s target population, and violate sound psychometric procedures. For mathematics in 2013, NCES introduced a new option for students whose IEPs call for a calculator accommodation, where schools could choose to have these students take two calculator-active NAEP blocks, even if those were not the blocks that would have been randomly assigned through the matrix sampling design. Mr. Feinberg said this change, by reducing exclusions, had also impacted some reported trends.
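The full-population estimates described above impute scores for excluded SD and ELL students from the performance of similar SD and ELL students who were tested. In its simplest possible form, that idea can be sketched as assigning each excluded student the mean score of tested students in the same reporting group and then averaging over everyone; the records and grouping below are hypothetical, and the operational FPE methodology is model-based rather than this naive group-mean substitution.

from statistics import mean

# Hypothetical records: (group, score) for tested students; two students were excluded.
tested = [("SD", 228), ("SD", 241), ("ELL", 219), ("ELL", 233), ("neither", 262)]
excluded_groups = ["SD", "ELL"]

# Naive full-population-style estimate: give each excluded student the mean
# score of tested students in the same group, then average over all students.
group_means = {g: mean(s for gg, s in tested if gg == g) for g, _ in tested}
full_population = [s for _, s in tested] + [group_means[g] for g in excluded_groups]

print(round(mean(s for _, s in tested), 1))  # estimate that ignores excluded students
print(round(mean(full_population), 1))       # full-population-style estimate

In this toy example the excluded students come from lower-scoring groups, so the full-population-style figure is lower than the estimate that ignores them, which is one way changes in exclusion practice can move reported averages.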


Jack Buckley, the Commissioner of Education Statistics, noted that it is not clear who gets to define NAEP’s target population. He said NCES and the Board disagree about whether it should include students whose IEPs specify accommodations that NAEP does not allow. Mr. Wilburn said NCES plans to publish a technical memo that will focus on how refusal and exclusion issues impact NAEP participation and performance. The memo will include total participation rates that summarize non-participation from all causes—exclusions, refusals, and absence (which is the largest category). The memo will also provide data on the proportion of exclusions based on NAEP not allowing a state-provided accommodation. There was additional discussion on the impact that exclusion and refusal changes would have on TUDA districts. Terry Mazany, the acting chair of the Reporting and Dissemination Committee, conveyed a message from Andrés Alonso, the Committee chair who was not present. He said Mr. Alonso, former superintendent of Baltimore schools, had urged that policy changes impacting NAEP exclusions and scores should be highlighted in NAEP reports to provide context for interpreting results and that historical data should be provided. The Committees asked the staffs of NCES and NAGB to consider possible policy changes and what their impact might be. Lou Fabrizio, chair of the Committee on Standards, Design and Methodology, asked staff to prepare recommendations for moving forward and a timeline for possible Board action.


Attachment B

ACTION: Technology and Engineering Literacy (TEL) Achievement Levels Descriptions (ALDs)

The first step in setting achievement levels for the 2014 grade 8 NAEP Technology and Engineering Literacy (TEL) Assessment is to finalize the achievement levels descriptions (ALDs) for grade 8. There are specific requirements for achievement level descriptions for NAEP. The achievement levels describe what students should know and be able to do to meet the requirements at each of three levels of achievement (Basic, Proficient, and Advanced). These TEL ALDs must be aligned with the content of the TEL Framework and the policy descriptions of each achievement level.

The preliminary achievement levels descriptions for each assessment area (Technology and Society; Design and Systems; and Information and Communication Technology) were developed as part of the Technology and Engineering Literacy Framework development project1. The Governing Board requires expert evaluation of these descriptions in order to develop final descriptions for use in the achievement levels setting process. The final descriptions will be for the overall Technology and Engineering Literacy Assessment rather than separately by assessment area.

A statement of objectives was developed, and a Request for Quotations was issued in early February. A contract was awarded to WestEd on April 4, 2014. The project staff works within WestEd’s Science, Technology, Engineering, and Mathematics (STEM) program and has an extensive understanding of curriculum, instruction, and assessment issues in these content areas. Project staff have led framework development projects, conducted studies of educational technology programs, and developed assessments on the use of educational technology. Most significantly, they managed the development of the 2014 NAEP TEL Framework.

In early May, experts in Design and Systems, Information and Communication Technology, and Technology and Society were convened to review and revise the descriptions included in the 2014 TEL Framework. Following the refinement of the TEL ALDs, WestEd sought public comment from TEL education stakeholders and other key individuals and organizations. The content experts then reconvened in a virtual meeting to review the public comment and further revise the descriptions. The draft descriptions were sent to COSDAM members and were discussed via teleconference on July 3rd. Per the Committee’s request, a few edits were made, and the revised version of the TEL ALDs was distributed to COSDAM in mid-July.

At the August 2014 meeting, Edys Quellmalz from WestEd will provide an overview of the process used to develop the TEL ALDs. COSDAM will take action on the TEL ALDs for use in the TEL achievement levels setting process.

1 http://www.nagb.org/publications/frameworks/technology/2014-technology-framework.html


Technology and Engineering Literacy Achievement Levels Descriptions

Basic: Eighth grade students performing at the Basic level should be able to use common tools and media to achieve specified goals and identify major impacts. They should demonstrate an understanding that humans can develop solutions by creating and using technologies. They should be able to identify major positive and negative effects that technology can have on the natural and designed world. Students should be able to use systematic engineering design processes to solve a simple problem that responsibly addresses a human need or want. Students should distinguish components in selected technological systems and recognize that technologies require maintenance. They should select common information and communications technology tools and media for specified purposes, tasks, and audiences. Students should be able to find and evaluate sources, organize and display data and other information to address simple research tasks, give appropriate acknowledgement for use of the work of others, and use feedback from team members (assessed virtually).

Proficient: Eighth grade students performing at the Proficient level should be able to understand the interactions among parts within systems, systematically develop solutions, and contribute to teams (assessed virtually) using common and specialized tools to achieve goals. They should be able to explain how technology and society influence each other by comparing the benefits and limitations of the technologies’ impacts. Students should be able to analyze the interactions among components in technological systems and consider how the behavior of a single part affects the whole. They should be able to diagnose the cause of a simple technological problem. They should be able to use a variety of technologies and work with others using systematic engineering design processes in which they iteratively plan, analyze, generate, and communicate solutions. Students should be able to select and use an appropriate range of tools and media for a variety of purposes, tasks, and audiences. They should be able to contribute to work of team collaborators (assessed virtually) and provide constructive feedback. Students should be able to find, evaluate, organize, and display data and information to answer research questions, solve problems, and achieve goals, appropriately citing use of the ideas, words, and images of others.

Advanced: Eighth grade students performing at the Advanced level should be able to draw upon multiple tools and media to address complex problems and goals and demonstrate their understanding of the potential impacts on society. They should be able to explain the complex relationships between technologies and society and the potential implications of technological decisions on society and the natural world. Given criteria and constraints, students should be able to use systematic engineering design processes to plan, design, and use evidence to evaluate and refine multiple possible solutions to a need or problem and justify their solutions. Students should be able to explain the relationships among components in technological systems, anticipate maintenance issues, identify root causes, and repair faults. They should be able to use a variety of common and specialized information technologies to achieve goals, and to produce and communicate solutions to complex problems. Students should be able to integrate the use of multiple tools and media, evaluate and use data and information, communicate with a range of audiences, and accomplish complex tasks. They should be able to use and explain the ethical and appropriate methods for citing use of multimedia sources and the ideas and work of others. Students should be able to contribute to collaborative tasks on a team (assessed virtually) and organize, monitor, and refine team processes.


Attachment C

Technology and Engineering Literacy (TEL) Achievement Levels Setting

A Request for Proposals (RFP) was issued on March 24, 2014 to set achievement levels for the 2014 grade 8 NAEP Technology and Engineering Literacy (TEL) Assessment. On July 2, 2014, a 15-month contract in the amount of $1.1 million was awarded to NCS Pearson (Pearson); the press release is available at http://www.nagb.org/newsroom/press-releases/2014/release-20140702.html. Pearson has subcontracted with edCount, LLC, a woman-owned small business that focuses on standards, assessment, and accountability, and Conference Solutions, LLC, a woman-owned small business with expertise in planning meetings.

In this session, Pearson will provide an introduction and overview of the TEL ALS contract, including key staff and project milestones.


Attachment D

NAEP 12th Grade Academic Preparedness Research

Phase 1 Research

The first phase of the Governing Board’s research on academic preparedness is now complete; results from more than 30 studies are available at: http://www.nagb.org/what-we-do/preparedness-research.html. During the August 2013 meeting, the Board voted on a motion to use the phase 1 research on academic preparedness for college in the reporting of the 2013 grade 12 national results for reading and mathematics, released on May 14, 2014. The motion, validity argument, and phase 1 final report are now available on the aforementioned website.

Phase 2 Research

The second phase of the Governing Board’s research on academic preparedness currently consists of the following studies that are planned or underway:

• Statistical linking of NAEP and ACT. Sample: National; FL, IL, MA, MI, TN. August 2014 Update: Page 27.

• Longitudinal statistical relationships: Grade 12 NAEP. Sample: FL, IL, MA, MI, TN. August 2014 Update: Page 28.

• Statistical linking of NAEP and EXPLORE. Sample: KY, NC, TN. August 2014 Update: Page 29.

• Longitudinal statistical relationships: Grade 8 NAEP. Sample: KY, NC, TN. August 2014 Update: Page 30.

• Content alignment of NAEP and EXPLORE. August 2014 Update: Page 31.

• College Course Content Analysis. August 2014 Update: Page 33.

• Evaluating Reading and Mathematics Frameworks and Item Pools as Measures of Academic Preparedness for College and Job Training (Research with Frameworks). August 2014 Update: Pages 34-36.

Brief overviews and informational updates are provided for each study.


National and State Statistical Linking Studies with the ACT

The Governing Board is planning to partner with ACT, Inc. to conduct a statistical linking study at the national level between NAEP and the ACT in Reading and Mathematics. Through a procedure that protects student confidentiality, the ACT records of 12th grade NAEP test takers in 2013 will be matched, and through this match, the linking will be performed. A similar study at the national level was performed with the SAT in 2009. There will not be a national statistical linking study performed for NAEP and the SAT in 2013.

In addition, the state-level studies, begun in 2009 with Florida, will be expanded with 2013 NAEP. Again using a procedure that protects student confidentiality, ACT scores of NAEP 12th grade test takers in the state samples in partner states will be linked to NAEP scores. We are in the planning stages with five states to be partners in these studies at grade 12: Florida, Illinois, Massachusetts, Michigan, and Tennessee. In three of these states (IL, MI, TN), the ACT is administered to all students state-wide, regardless of students’ intentions for postsecondary activities.

Draft Research Questions for National and State Statistical Linking Studies with the ACT:

1. What are the correlations between the grade 12 NAEP and ACT student score distributions in Reading and Math?

2. What scores on the grade 12 NAEP Reading and Math scales correspond to the ACT college readiness benchmarks? (concordance and/or projection)

3. What are the average grade 12 NAEP Reading and Math scores and interquartile ranges (IQR) for students below, at, and at or above the ACT college readiness benchmarks?

4. Do the results differ by race/ethnicity or gender?

August 2014 Update: The data sharing agreements are still in the process of being finalized.
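Research questions 1 and 2 involve a correlation between matched NAEP and ACT scores and a concordance between the two score scales. As a purely illustrative sketch of one common concordance approach, equipercentile linking, the code below finds the NAEP score that sits at the same percentile rank as an ACT cut score; the data are simulated, the cut score of 22 is used only as an example, and the actual study methods will be determined as the work proceeds.

import numpy as np

rng = np.random.default_rng(0)

# Simulated matched records standing in for NAEP grade 12 math and ACT math scores.
naep = rng.normal(150, 35, 5000).clip(0, 300)
act = (naep / 300 * 36 + rng.normal(0, 3, 5000)).clip(1, 36)

# RQ1-style summary: correlation between the matched score distributions.
print("correlation:", round(float(np.corrcoef(naep, act)[0, 1]), 2))

# RQ2-style concordance (equipercentile): the NAEP score at the same percentile
# rank as an example ACT cut score of 22.
act_cut = 22
percentile_rank = (act < act_cut).mean() * 100
naep_equivalent = np.percentile(naep, percentile_rank)
print("NAEP score concorded to ACT", act_cut, "is about", round(float(naep_equivalent)))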


Longitudinal Statistical Relationships: Grade 12 NAEP

In addition to the linking of ACT scores to NAEP 12th grade test scores in partner states, the postsecondary activities of NAEP 12th grade test takers will be followed for up to six years using the state longitudinal databases in Florida, Illinois, Massachusetts, Michigan, and Tennessee. These studies will examine the relationship between 12th grade NAEP scores and scores on placement tests, placement into remedial versus credit-bearing courses, GPA, and persistence.

Draft Research Questions for Longitudinal Statistical Relationships, Grade 12 NAEP:

1. What is the relationship between grade 12 NAEP Reading and Math scores and grade 8 state test scores?

2. What are the average grade 12 NAEP Reading and Math scores and interquartile ranges (IQR) for students with placement in remedial and non-remedial courses?

3. What are the average grade 12 NAEP Reading and Math scores (and the IQR) for students with a first-year GPA of B- or above?

4. What are the average grade 12 NAEP Reading and Math scores (and the IQR) for students who remain in college after each year?

5. What are the average grade 12 NAEP Reading and Math scores (and the IQR) for students who graduate from college within 6 years?

August 2014 Update: The data sharing agreements are still in the process of being finalized.


State Statistical Linking Studies with EXPLORE

In 2013, linking studies between 8th grade NAEP in Reading and Mathematics and 8th grade EXPLORE, a test developed by ACT, Inc. that is linked to performance on the ACT, are planned with partners in three states: Kentucky, North Carolina, and Tennessee. In all three of these states, EXPLORE is administered to all students state-wide during grade 8.

Draft Research Questions for State Statistical Linking Studies with EXPLORE:

1. What are the correlations between the grade 8 NAEP and EXPLORE scores in Reading and Math?

2. What scores on the grade 8 NAEP Reading and Math scales correspond to the EXPLORE college readiness benchmarks (concordance and/or projection)?

3. What are the average grade 8 NAEP Reading and Math scores and the interquartile ranges (IQR) for students below, at, and at or above the EXPLORE college readiness benchmarks?

August 2014 Update: The data sharing agreements are complete; we are in the process of obtaining the data to begin the analyses.


Longitudinal Statistical Relationships: Grade 8 NAEP

In 2013, the Governing Board will also expand the state-level studies by partnering with two states at grade 8. Again using a procedure that protects student confidentiality, secondary and postsecondary data for NAEP 8th grade test takers in the state samples in partner states will be linked to NAEP scores. These studies will examine the relationship between 8th grade NAEP scores and scores on state tests, future ACT scores, placement into remedial versus credit-bearing courses, and first-year college GPA.

Two states will be partners in these studies at grade 8: North Carolina and Tennessee.

Draft Research Questions for Longitudinal Statistical Relationships, Grade 8 NAEP:

1. What is the relationship between NAEP Reading and Math scores at grade 8 and state test scores at grade 4?

2. What are the average NAEP Reading and Math scores and the interquartile ranges (IQR) at grade 8 for students below the ACT benchmarks at grade 11/12? At or above the ACT benchmarks?

3. What are the average NAEP Reading and Math scores and the interquartile ranges (IQR) at grade 8 for students who are placed in remedial and non-remedial courses in college?

4. What are the average NAEP Reading and Math scores (and the IQR) at grade 8 for students who obtain a first-year college GPA of B- or above?

5. What is the relationship between grade 8 NAEP Reading and Math scores and grade 12 NAEP Reading and Math scores? (contingent on feasibility of sampling the same students in TN and NC)

August 2014 Update: The data sharing agreements are complete; we are in the process of obtaining the data to begin the analyses.


Content Alignment Study of Grade 8 NAEP Reading and Mathematics and EXPLORE

Content alignment studies are a foundation for the trail of evidence needed for establishing the validity of preparedness reporting, and are, therefore, considered a high priority in the Governing Board’s Program of Preparedness Research. The alignment studies will inform the interpretations of preparedness research findings from statistical relationship studies and help to shape the statements that can be made about preparedness. Content alignment studies were recommended to evaluate the extent to which NAEP content overlaps with that of the other assessments to be used as indicators of preparedness in the research.

We plan to conduct an alignment study of grade 8 NAEP Reading and Mathematics and ACT EXPLORE. Results from this content alignment study will be particularly important for interpreting the findings from the statistical linking studies of NAEP and EXPLORE.

August 2014 Update: ACT has agreed to having a NAEP and EXPLORE content alignment study performed by an independent third party. A Request for Proposals (RFP) was released on June 20th, and proposals are due on August 4th. We intend to award the contract by the end of September.


OVERVIEW OF REFERENCED ASSESSMENTS

For additional background information, the following list presents a brief description of the assessments referenced in the phase two academic preparedness research studies. In each case, only the mathematics and reading portions of the assessments are the targets for analysis, although analyses with the composite scores may be conducted.

ACT – The ACT assessment is a college admissions test used by colleges and universities to gauge the knowledge and skills of applicant pools; it includes Reading, English, Mathematics, and Science tests. ACT has College Readiness Standards that connect reading or mathematics knowledge and skills, and the probability of a college course grade of “C” or higher (0.75) or “B” or higher (0.50), with particular score ranges on the ACT assessment.

ACT EXPLORE – ACT EXPLORE assesses academic progress of eighth and ninth grade students. It is a component of the ACT College and Career Readiness System and includes assessments of English, Mathematics, Reading, and Science. ACT EXPLORE has College Readiness Standards that connect reading and mathematics knowledge and skills and probabilities of a college course grade of “C” or higher (0.75) or “B” or higher (0.50) by the time students graduate high school with particular score ranges on the EXPLORE assessment.

SAT – The SAT reasoning test is a college admissions test produced by the College Board. It is used by colleges and universities to evaluate the knowledge and skills of applicant pools in critical reading, mathematics, and writing. The SAT has calculated preparedness benchmarks, defined as the SAT scores corresponding to a 0.65 probability of earning a first-year college grade-point average of 2.67 (B-) or better.
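Each benchmark above is defined as the test score at which a model-estimated probability of a college outcome reaches a stated threshold (0.75 and 0.50 for the ACT standards, 0.65 for the SAT benchmark). As a hedged illustration of that definition only, the sketch below takes an invented logistic model relating a score to the chance of a B- or better first-year GPA and solves for the score where the predicted probability equals 0.65; the coefficients are not estimated from any real data.

import math

# Invented logistic model: P(first-year GPA >= B-) = 1 / (1 + exp(-(a + b * score))).
a, b = -6.0, 0.005          # illustrative intercept and slope on an SAT-like scale
target_probability = 0.65   # probability threshold that defines the benchmark

# Solve target = 1 / (1 + exp(-(a + b * score))) for score.
logit = math.log(target_probability / (1 - target_probability))
benchmark_score = (logit - a) / b
print(round(benchmark_score))  # score at which the model predicts a 0.65 probability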


COLLEGE COURSE CONTENT ANALYSIS

Project Status Update

Contract ED-NAG-12-C-0003

The College Course Content Analysis (CCCA) study is one of a series of studies contributing to the National Assessment of Educational Progress (NAEP) Program of 12th Grade Preparedness Research conducted by the National Assessment Governing Board (NAGB). The purpose of the CCCA study is to identify a comprehensive list of the reading and mathematics knowledge, skills, and abilities (KSAs) that are pre-requisite to entry-level college mathematics courses and courses that require college level reading based on information from a representative sample of U.S. colleges. The Educational Policy Improvement Center (EPIC) is the contractor working for the Board to conduct this study.

Another goal of the CCCA study is to extend the work of the two previous preparedness studies—the Judgmental Standards Setting (JSS)1 study, implemented in 2011 and the Job Training Program Curriculum (JTPC) study, implemented in 2012. The CCCA study is designed so the results can be compared to the JSS and JTPC studies, reporting on how this new information confirms or extends interpretations of those earlier studies. The design of the CCCA study is based on the JTPC study but with modifications based on the lessons learned.

August 2014 Update: The project is now complete (see May 2014 COSDAM materials for Executive Summary). The final report is now available on the Governing Board’s website at: http://www.nagb.org/content/nagb/assets/documents/what-we-do/preparedness-research/judgmental-standard-setting-studies/College_Course_Content_Analysis.pdf.

1 National Assessment Governing Board. (2010). Work Statement for Judgmental Standard Setting Workshops for the 2009 Grade 12 Reading and Mathematics National Assessment of Educational Progress to Reference Academic Preparedness for College Course Placement. (Higher Education Solicitation number ED-R-10-0005).


EVALUATING READING AND MATHEMATICS FRAMEWORKS AND ITEM POOLS

AS MEASURES OF ACADEMIC PREPAREDNESS FOR COLLEGE AND JOB TRAINING

Project Status Update

July 10, 2014

Contract ED-NAG-13-C-0001

The National Assessment Governing Board contracted with the Human Resources Research Organization (HumRRO) in June 2013 to conduct three tasks related to research on 12th grade preparedness:

1. Evaluation of the Alignment of Grade 8 and Grade 12 NAEP to an Established Measure of Job Preparedness: In its June 2009 report, Making New Links: 12th Grade and Beyond, the Technical Panel on 12th Grade Preparedness Research recommended that content alignment studies be conducted to examine the structure and content of various assessments relative to NAEP. The purpose of such content alignment would be to determine whether the scores on NAEP and the other assessments convey similar meaning in terms of the knowledge and skills of examinees. In fact, the panel specifically recommended that content alignment studies be conducted between NAEP and WorkKeys to determine the correspondence between the content domain assessed by NAEP and that of WorkKeys. If the alignment is relatively high, or even moderately high in some cases, then statistical relations between NAEP and WorkKeys may allow for the interpretation of NAEP results in terms of how WorkKeys would typically be interpreted. Using WorkKeys as a measure of job training preparedness allows the comparison of findings from this research to findings from previous content alignment studies with WorkKeys. This would provide a cross-validity check with NAEP grade 12 and also expand the content alignment study by using NAEP grade 8 as well. This study will extend prior analysis of the relation of NAEP to WorkKeys by including the NAEP grade 8 assessments and by expanding the method for assessing content alignment. The study method will follow the Governing Board content alignment design document for preparedness research studies, with some modifications. The two-pronged approach includes alignment of: (a) WorkKeys to the NAEP frameworks, and (b) NAEP items to the framework from which WorkKeys was developed.

2. O*NET Linkage Study: This study a) identified relevant linkages between the National Assessment of Educational Progress (NAEP) and training performance requirements for selected occupations, and b) compared the levels of knowledge, skills, and abilities (KSAs) required for the relevant NAEP content to the levels of KSAs required for the relevant job training content.

For this study, tasks (i.e., performance requirements) for each occupation were extracted from O*NET. The O*NET, or Occupational Information Network, is the U.S. Department of Labor’s occupational information database. The O*NET contains standardized descriptions of 974 occupations, including the five occupations that are the focus of the National Assessment Governing Board’s (Governing Board) program of research on job preparedness. Because the O*NET descriptors provide a “common language” for describing similarities and differences across occupations, it is a very useful resource for this study.

Occupational experts from each of the target occupations reviewed the O*NET task lists for their appropriateness to job training. This review was necessary because the O*NET tasks describe job performance requirements, but not training performance requirements, and the focus of the Governing Board’s research is preparedness for job training. Based on the feedback from the occupational experts, edits were made to the O*NET task lists to ensure their applicability to job training. Next, occupational experts used these lists to identify NAEP content that is relevant (“linked”) to training performance requirements. The occupational experts also identified the training performance requirements that are relevant (“linked”) to NAEP content. Irrelevant content was removed from further consideration. Finally, trained project analysts used academically-relevant KSAs from O*NET to systematically rate the levels of KSAs needed for the relevant NAEP content and the levels of KSAs needed for the relevant job training content. Disconnects between the levels of KSAs needed for NAEP and the levels needed for job training were flagged for discussion.

3. Technical Advisory Panel (TAP) Symposium: HumRRO assembled a technical advisory panel (TAP) of five experts in educational measurement and five experts in industrial-organizational (I-O) psychology to review extant research and to generate ideas for commissioned papers on preparedness. The TAP met in Washington D.C. in late October 2013. This brainstorming session included presentations by Governing Board and HumRRO staff describing findings from previous studies and descriptions of other studies currently underway, followed by an open discussion of issues and possible additional areas of investigation. Each panelist was asked to use this information to propose a paper that he/she could develop. TAP members submitted nine proposals from which Governing Board staff commissioned five papers. Panelists have several months to develop the papers. The TAP will reconvene in a late summer 2014 symposium during which authors will present their papers and the entire panel will discuss implications for preparedness research. HumRRO will produce a proceedings document summarizing the commissioned papers and discussion. (A list of TAP members is included on the next page.)

In addition, HumRRO will produce a comprehensive project report at the conclusion of the contract in December 2014.

Work completed as of August 2014:

• Evaluation of Alignment of Grade 8 and 12 NAEP to an Established Measure of Job Preparedness: Held two rounds of workshops to evaluate alignment between grade 8 and 12 NAEP Reading and Mathematics and WorkKeys. Four 6-person panels were convened in Louisville, Kentucky in June 2014 and the process was replicated in Alexandria, Virginia in July 2014. Analysis is underway.

• O*NET Linkage: This task was completed in April 2014; see May 2014 COSDAM materials for details.

• TAP Symposium: Governing Board staff reviewed proposals submitted by TAP panelists and commissioned four (4) papers to be completed by the panelists. Draft papers are under development. Authors will present final papers at the second TAP meeting on August 20, 2014.


Technical Advisory Panel (TAP) Members

John Campbell, Professor of Psychology, University of Minnesota (Member, NAGB Technical Panel on 12th Grade Preparedness Research, 2007-2008)
Michael Campion, Herman C. Krannert Professor of Management, Purdue University
Gregory Cizek, Professor of Educational Measurement and Evaluation, University of North Carolina at Chapel Hill
Brian Gong, Executive Director, Center for Assessment (National Center for the Improvement of Educational Assessment, Inc.)
Ronald Hambleton, Distinguished University Professor, Educational Policy, Research, & Administration, and Executive Director, Center for Educational Assessment, University of Massachusetts at Amherst
Suzanne Lane, Professor, Research Methodology, University of Pittsburgh School of Education
Barbara Plake, University Distinguished Professor Emeritus, University of Nebraska-Lincoln
Ann Marie Ryan, Professor of Psychology, Michigan State University
Nancy Tippins, Senior Vice President, CEB Valtera


Attachment E

White Paper on Transition to Technology Based Assessments

To help plan NAEP’s transition from its current paper-based assessments to technology-based assessments (TBA), the National Center for Education Statistics (NCES) has commissioned a white paper that will describe the overall approach being taken to accomplish this transition and its rationale. There are many reasons why this transition must begin now for NAEP’s core subject areas: mathematics, reading, and science (the writing assessment is already technology based). Perhaps the most important reason, however, is that assessment and learning in schools across the country have already started this transition. In order for NAEP to remain relevant and meaningful in the broader educational landscape, the program must begin now to convert to technology-based assessments that reflect how students are being prepared for post-secondary work and academic experiences.

Of particular concern to the “Nation’s Report Card,” with its decades of valuable performance trends, is the ability to maintain trend lines well into the future. As such, the program is planning a multistep process that will carefully and thoughtfully implement this important transition in a manner that is most likely to protect this valuable aspect. Whether or not trends can be maintained across paper-based and technology-based modes of administration is clearly an empirical question. All due care is being taken, however, to increase the likelihood that this important objective is achieved, and that NAEP will maintain its reputation as the gold standard of educational assessments.

In addition to the careful attention being paid to maintaining performance trend lines across paper-based and technology-based administration modes, the transition to TBA is being informed by the expert guidance of subject-area, cognitive-science, and measurement experts. This transition presents numerous opportunities to enhance our measurement of framework objectives, and possibly increase the program’s relevance as a measure of preparedness for post-secondary pursuits. In addition, TBA presents numerous possibilities to extend and enhance NAEP’s reporting capabilities and opportunities. To these ends, the white paper will focus on subject-specific issues and opportunities for leveraging technology delivery to enhance NAEP’s measurement and reporting goals. The white paper is expected to be completed towards the end of summer 2014.
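Whether trend lines can be carried across administration modes is, as noted above, an empirical question. One common way such questions are studied is a bridge design in which randomly equivalent groups take the paper and technology-based forms and the new-mode scores are placed on the old-mode scale. The sketch below shows a simple mean/sigma (linear) linking on invented numbers, purely to illustrate that idea; it is not a description of the methodology the white paper will propose.

from statistics import mean, stdev

# Invented bridge-study results from randomly equivalent groups of students.
paper_scores = [268, 275, 281, 259, 290, 272, 266, 284]
digital_scores = [262, 270, 277, 254, 288, 268, 261, 280]

# Mean/sigma linking: express digital-mode scores on the paper-mode metric.
slope = stdev(paper_scores) / stdev(digital_scores)
intercept = mean(paper_scores) - slope * mean(digital_scores)

def to_paper_scale(digital_score):
    """Place a digital-mode score on the paper-mode reporting scale."""
    return slope * digital_score + intercept

print(round(to_paper_scale(270), 1))  # a digital score of 270 on the paper scale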


Attachment F

Update on Evaluation of NAEP Achievement Levels Procurement

Objective

To receive a brief informational update from NCES on the current status of the procurement being planned to evaluate NAEP achievement levels. Ongoing updates will be provided at each COSDAM meeting.

Background

The NAEP legislation states:

The achievement levels shall be used on a trial basis until the Commissioner for Education Statistics determines, as a result of an evaluation under subsection (f), that such levels are reasonable, valid, and informative to the public.

In providing further detail, the aforementioned subsection (f) outlines:

(1) REVIEW-

A. IN GENERAL- The Secretary shall provide for continuing review of any assessment authorized under this section, and student achievement levels, by one or more professional assessment evaluation organizations.

B. ISSUES ADDRESSED- Such continuing review shall address--

(i) whether any authorized assessment is properly administered, produces high quality data that are valid and reliable, is consistent with relevant widely accepted professional assessment standards, and produces data on student achievement that are not otherwise available to the State (other than data comparing participating States to each other and the Nation);

(ii) whether student achievement levels are reasonable, valid, reliable, and informative to the public;

(iii) whether any authorized assessment is being administered as a random sample and is reporting the trends in academic achievement in a valid and reliable manner in the subject areas being assessed;

(iv) whether any of the test questions are biased, as described in section 302(e)(4); and


(v) whether the appropriate authorized assessments are measuring, consistent with this section, reading ability and mathematical knowledge.

(2) REPORT- The Secretary shall report to the Committee on Education and the Workforce of the House of Representatives and the Committee on Health, Education, Labor, and Pensions of the Senate, the President, and the Nation on the findings and recommendations of such reviews.

(3) USE OF FINDINGS AND RECOMMENDATIONS- The Commissioner for Education Statistics and the National Assessment Governing Board shall consider the findings and recommendations of such reviews in designing the competition to select the organization, or organizations, through which the Commissioner for Education Statistics carries out the National Assessment.

In response, a procurement has been planned to administer an evaluation of NAEP achievement levels. The last update COSDAM reviewed on this topic was in May 2014.

In the following brief written update, NCES provides the Committee with a summary of the status of this procurement.

Evaluation of NAEP Achievement Levels

The National Center for Education Evaluation and Regional Assistance (NCEE), part of the Institute for Education Sciences (IES), will administer the Evaluation of the NAEP Achievement Levels. NCEE and the Department of Education’s Contracts and Acquisitions Management (CAM) office began this procurement during fiscal year 2014. A solicitation was released in early May (https://www.fbo.gov/index?s=opportunity&mode=form&tab=core&id=f0bbb548714e156f0b2773fcbce6214d&_cview=0) and amended mid-June to revise the proposal due date to July 9, 2014.

According to the statement of objectives, a report (produced within 18 months of contract award), “…should provide sufficient information upon which the Commissioner of NCES can determine if the trial designation of the NAEP reading and mathematics achievement levels at grades 4, 8, and 12 should be removed or whether the trial designation should be continued” (page 4). The statement of objectives also includes a 6-month option to extend the contract to 24 months; if this option is exercised, the contractor would plan and conduct dissemination events to communicate the conclusions of the final report to various groups of stakeholders.


National Assessment Governing Board

Reporting and Dissemination Committee

August 1, 2014
9:45 am – 12:30 pm

AGENDA

9:45 – 10:45 am Joint Session with COSDAM:
NAEP Testing and Reporting on Students with Disabilities and English Language Learners
Lou Fabrizio, COSDAM Chair
Andrés Alonso, R&D Chair
Grady Wilburn, NCES

Attachment A

10:45 – 11:10 am ACTION: Governing Board Communications Plan

Stephaan Harris, NAGB Staff
Amy Buckley, Reingold

Attachment B

11:10 am – 12:30 pm Review of Core Contextual Questions for 2017 NAEP Administration

Andrés Alonso, R&D Chair
Stephaan Harris, NAGB Staff

Attachment C

Information Items:

• Overview of Webinar Release of Performance of Fourth‐Grade Students in the 2012 NAEP Computer‐Based Writing Pilot Assessment (Attachment D)

• Timing of Board Input on 2014 NAEP Report Cards (Attachment E)

• Projected Schedule for Future NAEP Reports (Attachment F)


NAEP Testing and Reporting on Students with Disabilities

In this joint session, the Committee on Standards, Design, and Methodology (COSDAM) and Reporting and Dissemination Committee (R&D) will discuss a proposed edit to the 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities and English Language Learners, as well as alternatives to the policy for adjusting scores for students excluded from taking the National Assessment of Educational Progress (NAEP). The proposal addresses concerns about a particular part of the policy not being implemented and the possible impact the policy could have on students and schools involved in NAEP. A brief history and background are below.

The Policy In Brief

The March 2010 Governing Board policy on NAEP Testing and Reporting on Students with Disabilities (SD) and English Language Learners (ELL) was intended to reduce exclusion rates and provide more consistency across jurisdictions in which students are tested on NAEP. The policy promoted sound reporting of comparisons and trends (the policy statement is included as Attachment B2). The policy limits the grounds on which schools can exclude students from NAEP samples to two categories—for SD, only those with the most significant cognitive disabilities, and for ELL, only those who have been in U.S. schools for less than a year.

Previously, schools excluded students with Individualized Education Programs (IEPs) that called for accommodations on state tests that NAEP does not allow, primarily the read-aloud accommodation on the Reading assessment. Under the current Board policy, schools could not decide to exclude students whose IEPs for state tests specify an accommodation not allowed on NAEP. Instead, such students had to take NAEP with allowable accommodations. Additionally, parents and educators were encouraged to permit them to do so, given that NAEP provides no scores and causes no consequences for individuals, but needs fully representative samples to produce valid results for the groups on which it reports. By law, individual participation in NAEP is voluntary and parents may withdraw their children for any reason.

Inclusion Rates and Implementation

During the December 2013 Board meeting, COSDAM and R&D met in joint session to discuss the 2013 student participation data for grades 4 and 8 Reading and Mathematics. There had been large increases in inclusion rates over the past ten years, and the Board’s first inclusion rate goal—95 percent of all students in each sample—was met in almost all states in 2013. However, 11 states and eight districts failed to meet the Board’s second goal of testing at least 85 percent of students identified as SD or ELL.

Contrary to Board policy, NCES has continued to permit schools to exclude students whose IEPs called for accommodations that NAEP does not allow. NCES believes changing this practice could possibly be detrimental to students, increase refusals, change NAEP’s target population, and be counter to current statistical procedures. The Committees asked the staffs of NAGB and NCES to consider possible policy and operational changes and what their impact might be, as well as a timeline for possible Board action.

The staffs of NAGB and NCES have had several conversations about the implementation of the SD/ELL policy. The policy could be clarified by revising the language about converting excluded students to refusals. The fourth implementation guideline for students with disabilities states, “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures.” NCES asserts that it is technically incorrect to apply a weight class adjustment1 that combines students who did not participate due to receiving accommodations on their state tests that are not allowed on NAEP with students who refused for other reasons. The former group cannot be assumed to be randomly missing, which is a necessary assumption for the current NAEP statistical procedures.

Policy Alternatives and Moving Forward

In the May 2014 COSDAM session, Grady Wilburn of NCES and Rochelle Michel from Educational Testing Service (ETS) presented three alternative methods for adjusting scores for students who were excluded from NAEP, contrary to the Board policy:

• “Expanded” population estimates. Improving upon the methodology of the full population estimates (FPEs) and incorporating additional data from NAEP teacher and school contextual questionnaires and from school records (e.g., state test scores for individual students).

• Modified participation A. Administering only the contextual questionnaire to excluded students and using that additional information to predict how the students would have performed on the cognitive items.

• Modified participation B. Administering the contextual questionnaire in the selected subject (i.e., Reading) in conjunction with an assessment in a different subject (e.g., Mathematics) and using both sources of information to predict how the students would have done on the Reading assessment.
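None of the three methods is specified operationally in these materials, but the two modified participation options share a common prediction step: a model relating contextual questionnaire responses to performance is estimated for tested students and then applied to the students who were excluded. The sketch below illustrates only that step; the data and variable names are hypothetical, and ordinary least squares stands in for the conditioning and plausible-values machinery NAEP actually uses.

```python
import numpy as np

# Hypothetical inputs: contextual-questionnaire responses coded as numbers.
# x_tested / y_tested: questionnaire responses and scores for tested students.
# x_excluded: questionnaire responses for students excluded contrary to the policy.
x_tested = np.array([[1.0, 0.0, 3.0], [0.0, 1.0, 2.0], [1.0, 1.0, 1.0], [0.0, 0.0, 4.0]])
y_tested = np.array([214.0, 196.0, 188.0, 230.0])
x_excluded = np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 1.0]])

def add_intercept(x):
    return np.hstack([np.ones((x.shape[0], 1)), x])

# Fit a least-squares prediction model on tested students only.
beta, *_ = np.linalg.lstsq(add_intercept(x_tested), y_tested, rcond=None)

# Predict scores for the excluded students from their questionnaire responses.
y_imputed = add_intercept(x_excluded) @ beta

# An "adjusted" group mean that counts the excluded students via their
# predicted scores instead of leaving them out of the estimate entirely.
adjusted_mean = np.concatenate([y_tested, y_imputed]).mean()
print(round(adjusted_mean, 1))
```

Under Modified participation B, scores from the assessment in the other subject would simply enter the model as an additional predictor.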

COSDAM members expressed serious reservations about implementing any of the three procedures due to the following reasons: current concerns about collecting student data; the potential for jeopardizing trend reporting; increased costs; and the threat of depressing scores due to a change in the population of tested students. There was general consensus that NCES’ current practices on this particular aspect of the policy—encouraging schools to include more students in NAEP even when they receive accommodations on their state tests that are not allowed on NAEP, but still allowing schools to exclude such students if they insist—were acceptable.

The Committee asked whether it is possible to identify students who do take the NAEP Reading assessment despite receiving a read-aloud accommodation on their state tests. Peggy Carr, Associate Commissioner of NCES, noted that the SD questionnaire will be modified for 2015 to capture this information. The Committee agreed with a suggestion from member Andrew Ho that, instead of classifying students as refusals when they do not take the assessment because a particular accommodation is not allowed, the policy be edited to reflect that the number of such students be tracked and minimized to the extent feasible. At this August 1 joint session, COSDAM and R&D members will discuss proposed edits to the policy to address ongoing concerns and questions about implementation.

1 This refers to a set of units (e.g., schools or students) that are grouped together for the purpose of calculating nonresponse adjustments. The units are homogeneous with respect to certain unit characteristics, such as school size, location, public/private, student's age, sex, and student disability status.
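As a concrete illustration of this footnote, the following minimal sketch applies a weight class nonresponse adjustment: within each class, the base weights of participating students are inflated so that they also carry the weight of students who refused. The classes, weights, and students are hypothetical; as noted above, the adjustment is defensible only if nonparticipants within a class can be treated as missing at random.

```python
from collections import defaultdict

# Hypothetical sampled students: (weight_class, base_weight, participated).
students = [
    ("small_public_grade4", 120.0, True),
    ("small_public_grade4", 120.0, False),   # refusal
    ("small_public_grade4", 130.0, True),
    ("large_public_grade4", 300.0, True),
    ("large_public_grade4", 310.0, False),   # refusal
]

# Total base weight and responding base weight per weight class.
total = defaultdict(float)
responding = defaultdict(float)
for cls, weight, participated in students:
    total[cls] += weight
    if participated:
        responding[cls] += weight

# Nonresponse adjustment factor per class: total weight / responding weight.
factor = {cls: total[cls] / responding[cls] for cls in total}

# Participants carry the adjusted weight; nonparticipants carry none.
adjusted = [(cls, weight * factor[cls] if participated else 0.0)
            for cls, weight, participated in students]
for cls, weight in adjusted:
    print(cls, round(weight, 1))
```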


Materials

The March 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities and English Language Learners, with the proposed edit

Pages 5-10

An excerpt of the 2015 NAEP Questionnaire about Students with Disabilities

Page 11

2013 national and state inclusion rates for NAEP Reading, grades 4 and 8

Pages 12-13

2013 TUDA inclusion rates for NAEP Reading, grades 4 and 8

Page 14

2013 national and state inclusion rates for NAEP Mathematics, grades 4 and 8

Pages 15-16

2013 TUDA inclusion rates for NAEP Mathematics, grades 4 and 8

Page 17

Minutes from the May 2014 COSDAM session on this topic

Pages 18-19

Minutes from the December 2013 Joint COSDAM and R&D session on this topic

Pages 20-22


Adopted: March 6, 2010

National Assessment Governing Board

NAEP Testing and Reporting on Students with Disabilities and English Language Learners

Policy Statement

INTRODUCTION

To serve as the Nation’s Report Card, the National Assessment of Educational Progress (NAEP) must produce valid, comparable data on the academic achievement of American students. Public confidence in NAEP results must be high. But in recent years it has been threatened by continuing, substantial variations in exclusion rates for students with disabilities (SD) and English language learners (ELL) among the states and urban districts taking part.

Student participation in NAEP is voluntary, and the assessment is prohibited by law from providing results for individual children or schools. But NAEP’s national, state, and district results are closely scrutinized, and the National Assessment Governing Board (NAGB) believes NAEP must act affirmatively to ensure that the samples reported are truly representative and that public confidence is maintained. To ensure that NAEP is fully representative, a very high proportion of the students selected must participate in its samples, including students with disabilities and English language learners. Exclusion of such students must be minimized; they should be counted in the Nation’s Report Card. Accommodations should be offered to make the assessment accessible, but these changes from standard test administration procedures should not alter the knowledge and skills being assessed.

The following policies and guidelines are based on recommendations by expert panels convened by the Governing Board to propose uniform national rules for NAEP testing of SD and ELL students. The Board has also taken into consideration the views expressed in a wide range of public comment and in detailed analyses provided by the National Center for Education Statistics, which is responsible for conducting the assessment under the policy guidance of the Board. The policies are presented not as statistically-derived standards but as policy guidelines intended to maximize student participation, minimize the potential for bias, promote fair comparisons, and maintain trends. They signify the Board’s strong belief that NAEP must retain public confidence that it is fair and fully-representative of the jurisdictions and groups on which the assessment reports.


POLICY PRINCIPLES

1. As many students as possible should be encouraged to participate in the National Assessment. Accommodations should be offered, if necessary, to enable students with disabilities and English language learners to participate, but should not alter the constructs assessed, as defined in assessment frameworks approved by the National Assessment Governing Board.

2. To attain comparable inclusion rates across states and districts, special efforts should be made to inform and solicit the cooperation of state and local officials, including school personnel who decide upon the participation of individual students.

3. The proportion of all students excluded from any NAEP sample should not exceed 5 percent. Samples falling below this goal shall be prominently designated in reports as not attaining the desired inclusion rate of 95 percent.

4. Among students classified as either ELL or SD a goal of 85 percent inclusion shall be established. National, state, and district samples falling below this goal shall be identified in NAEP reporting.

5. In assessment frameworks adopted by the Board, the constructs to be tested should be carefully defined, and allowable accommodations should be identified.

6. All items and directions in NAEP assessments should be clearly written and free of linguistic complexity irrelevant to the constructs assessed.

7. Enhanced efforts should be made to provide a short clear description of the purpose and value of NAEP and of full student participation in the assessment. These materials should be aimed at school personnel, state officials, and the general public, including the parents of students with disabilities and English language learners. The materials should emphasize that NAEP provides important information on academic progress and that all groups of students should be counted in the Nation’s Report Card. The materials should state clearly that NAEP gives no results for individual students or schools, and can have no impact on student status, grades, or placement decisions.

8. Before each state and district-level assessment NAEP program representatives should meet with testing directors and officials concerned with SD and ELL students to explain NAEP inclusion rules. The concerns of state and local decision makers should be discussed.


IMPLEMENTATION GUIDELINES

For Students with Disabilities

1. Students with disabilities should participate in the National Assessment with or without allowable accommodations, as needed. Allowable accommodations are any changes from standard test administration procedures, needed to provide fair access by students with disabilities that do not alter the constructs being measured and produce valid results. In cases where non-standard procedures are permitted on state tests but not allowed on NAEP, students will be urged to take NAEP without them, but these students may use other allowable accommodations that they need.

2. The decision tree for participation of students with disabilities in NAEP shall be as follows:

NAEP Decision Tree for Students with Disabilities

BACKGROUND CONTEXT

1. NAEP is designed to measure constructs carefully defined in assessment frameworks adopted by the National Assessment Governing Board.

2. NAEP provides a list of appropriate accommodations and non-allowed modifications in each subject. An appropriate accommodation changes the way NAEP is normally administered to enable a student to take the test but does not alter the construct being measured. An inappropriate modification changes the way NAEP is normally administered but does alter the construct being measured.

STEPS OF THE DECISION TREE

3. In deciding how a student will participate in NAEP:
a. If the student has an Individualized Education Program (IEP) or Section 504 plan and is tested without accommodation, then he or she takes NAEP without accommodation.
b. If the student’s IEP or 504 plan specifies an accommodation permitted by NAEP, then the student takes NAEP with that accommodation.
c. If the student’s IEP or 504 plan specifies an accommodation or modification that is not allowed on NAEP, then the student is encouraged to take NAEP without that accommodation or modification.
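Read as a procedure, steps a through c can be summarized in the following sketch. It is illustrative only, with hypothetical labels; the policy text above remains the authoritative statement.

```python
def naep_participation(plan_accommodations, naep_allowed):
    """Steps a-c of the decision tree for a student with an IEP or Section 504 plan."""
    if not plan_accommodations:
        # Step a: the student is tested without accommodation.
        return "takes NAEP without accommodation"
    if set(plan_accommodations) <= set(naep_allowed):
        # Step b: every accommodation in the plan is permitted by NAEP.
        return "takes NAEP with those accommodations"
    # Step c: the plan specifies an accommodation or modification not allowed on NAEP.
    return "is encouraged to take NAEP without that accommodation or modification"

# Example: an IEP calling for read-aloud on Reading, which NAEP does not allow.
print(naep_participation({"read aloud (most or all)"}, {"extended time", "small group"}))
```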


3. Students should be considered for exclusion from NAEP only if they have previously been identified in an Individualized Education Program (IEP) as having the most significant cognitive disabilities, and are assessed by the state on an alternate assessment based on alternate achievement standards (AA-AAS). All students tested by the state on an alternate assessment with modified achievement standards (AA-MAS) should be included in the National Assessment.

4. Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures. [Proposed edit: The number of students who do not take the assessment because a particular accommodation is not allowed should be tracked and minimized to the extent possible.]

5. NAEP should report separately on students with Individualized Education Programs (IEPs) and those with Section 504 plans, but (except to maintain trend) should only count the students with IEPs as students with disabilities. All 504 students should participate in NAEP.

At present the National Assessment reports on students with disabilities by combining results for those with an individualized education program (who receive special education services under the Individuals with Disabilities Education Act [IDEA]) and students with Section 504 plans under the Rehabilitation Act of 1973 (a much smaller group with disabilities who are not receiving services under IDEA but may be allowed test accommodations).* Under the Elementary and Secondary Education Act, only those with an IEP are counted as students with disabilities in reporting state test results. NAEP should be consistent with this practice. However, to preserve trend, results for both categories should be combined for several more assessment years, but over time NAEP should report as students with disabilities only those who have an IEP.

6. Only students with an IEP or Section 504 plan are eligible for accommodations on NAEP. States are urged to adopt policies providing that such documents should address participation in the National Assessment.

For English Language Learners

1. All English language learners selected for the NAEP sample who have been in United States schools for one year or more should be included in the National Assessment. Those in U.S. schools for less than one year should take the assessment if it is available in the student’s primary language. One year or more shall be defined as one full academic year before the year of the assessment.

* NOTE: The regulation implementing Section 504 defines a person with a disability as one who has a physical or mental impairment which substantially limits one or more major life activities, has a record of such an impairment, or is regarded as having such an impairment. 34 C.F.R. § 104.3(j)(1).

2. Accommodations should be offered that maximize meaningful participation, are responsive to the student’s level of English proficiency, and maintain the constructs in the NAEP framework. A list of allowable accommodations should be prepared by NAEP and furnished to participating schools. Such accommodations may be provided only to students who are not native speakers of English and are currently classified by their schools as English language learners or limited English proficient (LEP).

3. Bilingual versions of NAEP in Spanish and English should be prepared in all subjects, other than reading and writing, to the extent deemed feasible by the National Center for Education Statistics. The assessments of reading and writing should continue to be in English only, as provided for in the NAEP frameworks for these subjects.

4. Staff at each school should select from among appropriate ELL-responsive accommodations allowed by NAEP, including bilingual booklets, those that best meet the linguistic needs of each student. Decisions should be made by a qualified professional familiar with the student, using objective indicators of English proficiency (such as the English language proficiency assessments [ELPA] required by federal law), in accordance with guidance provided by NAEP and subject to review by the NAEP assessment coordinator.

5. Schools may provide word-to-word bilingual dictionaries (without definitions) between English and the student’s primary language, except for NAEP reading and writing, which are assessments in English only.

6. NAEP results for ELL students should be disaggregated and reported by detailed information on students’ level of English language proficiency, using the best available standardized assessment data. As soon as possible, NAEP should develop its own brief test of English language proficiency to bring consistency to reporting nationwide.

7. Data should be collected, disaggregated, and reported for former English language learners who have been reclassified as English proficient and exited from the ELL category. This should include data on the number of years since students exited ELL services or were reclassified.

8. English language learners who are also classified as students with disabilities should first be given linguistically-appropriate accommodations before determining which additional accommodations may be needed to address any disabilities they may have.


RESEARCH AND DEVELOPMENT

The Governing Board supports an aggressive schedule of research and development in the following areas:

1. The use of plain language and the principles of universal design, including a plain language review of new test items consistent with adopted frameworks.

2. Adaptive testing, either computer-based or paper-and-pencil. Such testing should provide more precise and accurate information than is available at present on low-performing and high-performing groups of students, and may include items appropriate for ELLs at low or intermediate levels of English proficiency. Data produced by such targeted testing should be placed on the common NAEP scale. Students assessed under any new procedures should be able to demonstrate fully their knowledge and skills on a range of material specified in NAEP frameworks.

3. A brief, easily-administered test of English language proficiency to be used for determining whether students should receive a translation, adaptive testing, or other accommodations because of limited English proficiency.

4. The validity and impact of commonly used testing accommodations, such as extended time and small group administration.

5. The identification, measurement, and reporting on academic achievement of students with the most significant cognitive disabilities. This should be done in order to make recommendations on how such students could be included in NAEP in the future.

6. A study of outlier states and districts with notably high or low exclusion rates for either SD or ELL students to identify the characteristics of state policies, the approach of decision makers, and other criteria associated with different inclusion levels.

The Governing Board requests NCES to prepare a research agenda on the topics above.

A status report on this research should be presented at the November 2010 meeting of the Board.


Excerpt from the 2015 NAEP Questionnaire about Students with Disabilities

What accommodations does STUDENT receive on the state test for Reading?

If a student is not assessed on the state test in Reading, base the response on how the student is assessed in the classroom in Reading.

NOTE: For a description of how each accommodation is conducted in NAEP, place your cursor over the name of each accommodation. Choose all that apply.

□ Student does not receive any accommodations
□ Extended time
□ Small group
□ One on one
□ Read aloud in English – directions only
□ Read aloud in English – occasional
□ Read aloud in English – most or all
□ Breaks during testing
□ Must have an aide administer the test
□ Large print version of the test
□ Magnification
□ Uses template/special equipment/preferential seating
□ Presentation in Braille
□ Response in Braille
□ Presentation in sign language
□ Response in sign language
□ Other (specify)

In 2015, the information that is captured will allow us to distinguish between accommodations allowed on the NAEP Reading Assessment (e.g., Read aloud in English – directions only) and accommodations not allowed on the NAEP Reading Assessment (e.g., Read aloud in English – occasional, Read aloud in English – most or all). In 2013, a single item asked whether students received any Read aloud accommodation (directions only/occasional/most or all); therefore, it was not possible to distinguish between accommodations allowed by NAEP and accommodations not allowed by NAEP.
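Because the 2015 item separates the read-aloud categories, a response can be mapped to whether the state-test accommodation is allowed on the NAEP Reading assessment. The sketch below is illustrative only; it classifies just the three read-aloud categories named above, and the response strings are hypothetical stand-ins for the questionnaire’s actual coding.

```python
# Illustrative mapping of the 2015 read-aloud categories to whether NAEP Reading allows them.
READ_ALOUD_ALLOWED_ON_NAEP_READING = {
    "Read aloud in English - directions only": True,
    "Read aloud in English - occasional": False,
    "Read aloud in English - most or all": False,
}

def receives_disallowed_read_aloud(responses):
    """True if any checked read-aloud category is not allowed on NAEP Reading.
    Categories not listed in the mapping are left unflagged."""
    return any(not READ_ALOUD_ALLOWED_ON_NAEP_READING.get(r, True) for r in responses)

# Example: a student whose state-test accommodations include occasional read-aloud.
print(receives_disallowed_read_aloud(["Extended time", "Read aloud in English - occasional"]))
```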


National Center for Education Statistics

Inclusion rate and confidence interval in NAEP reading for fourth- and eighth-grade public and nonpublic school students, as a percentage of all students, by state/jurisdiction: 2013

State/jurisdiction             Grade 4: inclusion rate (95% CI)    Grade 8: inclusion rate (95% CI)
Nation                         97 1 (97.3, 97.6)                   98 1 (97.7, 98.0)
Nation (public)                97 1 (97.2, 97.5)                   98 1 (97.5, 97.9)
Alabama                        99 1 (98.3, 99.3)                   99 1 (98.2, 99.3)
Alaska                         99 1 (97.9, 99.0)                   99 1 (98.1, 99.0)
Arizona                        99 1 (98.3, 99.3)                   99 1 (98.0, 98.9)
Arkansas                       99 1 (98.4, 99.2)                   98 1 (97.3, 98.6)
California                     97 1 (96.7, 98.1)                   97 1 (96.7, 98.1)
Colorado                       98 1 (97.9, 98.9)                   99 1 (98.4, 99.2)
Connecticut                    98 1 (97.8, 98.9)                   98 1 (97.2, 98.4)
Delaware                       95 1 (94.3, 96.1)                   97 1 (95.8, 97.1)
Florida                        97 1 (96.1, 97.8)                   98 1 (97.4, 98.7)
Georgia                        95 1 (93.7, 96.2)                   96 1 (95.2, 97.0)
Hawaii                         98 1 (97.6, 98.6)                   98 1 (97.4, 98.5)
Idaho                          99 1 (98.0, 98.9)                   98 1 (97.8, 98.8)
Illinois                       99 1 (98.3, 99.1)                   99 1 (98.1, 98.9)
Indiana                        98 1 (96.4, 98.3)                   98 1 (97.4, 98.6)
Iowa                           99 1 (98.4, 99.2)                   99 1 (98.1, 99.2)
Kansas                         98 1 (97.5, 98.7)                   98 1 (97.7, 98.7)
Kentucky                       97 1 (96.4, 97.5)                   97 1 (95.9, 97.4)
Louisiana                      99 1 (98.4, 99.2)                   99 1 (98.3, 99.1)
Maine                          98 1 (97.7, 98.7)                   98 1 (97.9, 98.9)
Maryland                       87   (85.9, 88.3)                   91   (89.4, 91.7)
Massachusetts                  97 1 (96.7, 97.8)                   98 1 (97.1, 98.4)
Michigan                       96 1 (95.0, 97.1)                   96 1 (95.1, 97.5)
Minnesota                      97 1 (96.5, 97.9)                   98 1 (97.0, 98.2)
Mississippi                    99 1 (99.0, 99.7)                   99 1 (98.9, 99.5)
Missouri                       99 1 (98.2, 99.2)                   99 1 (98.5, 99.3)
Montana                        97 1 (96.5, 97.6)                   98 1 (97.0, 98.3)
Nebraska                       96 1 (95.4, 97.2)                   97 1 (96.2, 97.7)
Nevada                         98 1 (98.0, 98.9)                   99 1 (98.6, 99.3)
New Hampshire                  97 1 (96.7, 98.0)                   97 1 (96.5, 97.6)
New Jersey                     98 1 (97.4, 98.9)                   97 1 (96.4, 98.1)
New Mexico                     99 1 (98.6, 99.3)                   98 1 (97.8, 98.7)
New York                       99 1 (97.9, 99.1)                   99 1 (98.6, 99.4)
North Carolina                 98 1 (97.4, 98.7)                   98 1 (97.6, 98.8)
North Dakota                   96 1 (95.3, 96.5)                   96 1 (94.9, 96.4)
Ohio                           97 1 (96.3, 98.2)                   98 1 (96.8, 98.4)
Oklahoma                       98 1 (97.5, 98.8)                   99 1 (98.0, 99.0)
Oregon                         98 1 (96.8, 98.1)                   99 1 (98.0, 99.0)
Pennsylvania                   98 1 (96.9, 98.3)                   98 1 (97.6, 98.7)
Rhode Island                   99 1 (98.2, 99.0)                   99 1 (98.2, 99.0)
South Carolina                 98 1 (97.3, 98.9)                   98 1 (97.5, 98.6)
South Dakota                   98 1 (97.1, 98.3)                   97 1 (96.1, 97.7)
Tennessee                      97 1 (96.0, 97.6)                   97 1 (96.0, 97.5)
Texas                          95 1 (94.0, 96.0)                   96 1 (95.6, 97.2)
Utah                           97 1 (96.1, 97.6)                   97 1 (96.0, 97.7)
Vermont                        99 1 (98.3, 99.2)                   99 1 (98.6, 99.4)
Virginia                       98 1 (97.9, 98.9)                   99 1 (98.1, 99.0)
Washington                     97 1 (96.2, 97.9)                   98 1 (96.8, 98.1)
West Virginia                  98 1 (97.6, 98.7)                   98 1 (97.6, 98.6)
Wisconsin                      98 1 (97.8, 98.8)                   98 1 (97.8, 98.8)
Wyoming                        99 1 (98.3, 99.1)                   99 1 (98.5, 99.1)
Other jurisdictions
District of Columbia           98 1 (97.6, 98.9)                   98 1 (97.6, 98.6)
DoDEA 2                        94   (93.2, 94.8)                   96 1 (95.3, 96.9)

1 The state/jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
2 Department of Defense Education Activity (overseas and domestic schools).
NOTE: The overall national results include both public and nonpublic school students. The national (public) and state/jurisdiction results include public school students only. Data for DoDEA schools are included in the overall national results, but not in the national (public) results.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Reading Assessment.


National Center for Education Statistics

Inclusion rate and confidence interval in NAEP reading for fourth- and eighth-grade public school students, as a percentage of all students, by jurisdiction: 2013

Jurisdiction                   Grade 4: inclusion rate (95% CI)    Grade 8: inclusion rate (95% CI)
Nation (public)                97 2 (97.2, 97.5)                   98 2 (97.5, 97.9)
Large city 1                   97 2 (96.1, 97.1)                   98 2 (97.2, 97.9)
Albuquerque                    99 2 (98.8, 99.5)                   98 2 (97.1, 98.6)
Atlanta                        99 2 (98.5, 99.1)                   99 2 (98.4, 99.3)
Austin                         96 2 (94.1, 97.4)                   97 2 (95.6, 97.4)
Baltimore City                 84   (81.8, 86.3)                   84   (81.1, 85.8)
Boston                         96 2 (94.9, 96.3)                   97 2 (95.8, 97.3)
Charlotte                      99 2 (98.4, 99.5)                   98 2 (97.6, 98.8)
Chicago                        99 2 (97.6, 99.1)                   98 2 (97.5, 99.0)
Cleveland                      95 2 (94.5, 96.0)                   96 2 (95.5, 97.3)
Dallas                         83   (75.3, 88.5)                   96 2 (95.4, 97.3)
Detroit                        95 2 (92.7, 96.0)                   94 2 (92.6, 95.7)
District of Columbia (DCPS)    98 2 (96.7, 98.5)                   97 2 (96.4, 98.3)
Fresno                         98 2 (96.7, 98.3)                   97 2 (96.0, 97.6)
Hillsborough County (FL)       99 2 (98.3, 99.3)                   98 2 (97.2, 98.7)
Houston                        94 2 (91.0, 95.5)                   96 2 (95.3, 97.0)
Jefferson County (KY)          95 2 (92.5, 96.3)                   96 2 (94.4, 96.7)
Los Angeles                    98 2 (96.6, 98.7)                   97 2 (96.4, 98.0)
Miami-Dade                     95 2 (92.3, 97.4)                   97 2 (95.2, 98.3)
Milwaukee                      96 2 (93.8, 97.3)                   96 2 (94.4, 97.1)
New York City                  98 2 (97.4, 99.0)                   99 2 (97.8, 99.0)
Philadelphia                   96 2 (94.9, 97.1)                   96 2 (93.0, 98.0)
San Diego                      98 2 (96.6, 98.4)                   97 2 (96.2, 98.3)

1 Large city includes students from all cities in the nation with populations of 250,000 or more including the participating districts.
2 The jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
NOTE: DCPS = District of Columbia Public Schools.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Reading Assessment.


National Center for Education Statistics

Inclusion rate and confidence interval in NAEP mathematics for fourth- and eighth-grade public and nonpublic school students, as a percentage of all students, by state/jurisdiction: 2013

State/jurisdiction             Grade 4: inclusion rate (95% CI)    Grade 8: inclusion rate (95% CI)
Nation                         99 1 (98.5, 98.7)                   99 1 (98.4, 98.6)
Nation (public)                98 1 (98.4, 98.6)                   98 1 (98.3, 98.5)
Alabama                        99 1 (98.1, 99.4)                   99 1 (98.6, 99.2)
Alaska                         99 1 (98.4, 99.2)                   99 1 (98.5, 99.2)
Arizona                        99 1 (98.3, 99.1)                   99 1 (98.2, 99.1)
Arkansas                       99 1 (98.3, 99.1)                   98 1 (97.5, 98.5)
California                     98 1 (97.4, 98.6)                   99 1 (98.0, 98.9)
Colorado                       99 1 (98.3, 99.2)                   99 1 (98.3, 99.3)
Connecticut                    99 1 (98.1, 99.1)                   98 1 (97.4, 98.4)
Delaware                       98 1 (97.1, 98.5)                   99 1 (98.2, 99.0)
Florida                        98 1 (97.5, 98.6)                   98 1 (97.7, 98.8)
Georgia                        99 1 (97.9, 99.0)                   98 1 (97.7, 99.0)
Hawaii                         99 1 (98.3, 99.1)                   98 1 (97.8, 98.8)
Idaho                          99 1 (98.3, 99.0)                   99 1 (98.5, 99.3)
Illinois                       99 1 (98.4, 99.4)                   99 1 (98.6, 99.3)
Indiana                        98 1 (97.9, 98.9)                   98 1 (97.7, 98.8)
Iowa                           99 1 (98.8, 99.6)                   99 1 (98.8, 99.5)
Kansas                         98 1 (97.9, 98.8)                   98 1 (97.7, 98.8)
Kentucky                       99 1 (98.0, 99.0)                   98 1 (97.2, 98.5)
Louisiana                      99 1 (98.3, 99.3)                   99 1 (98.5, 99.2)
Maine                          98 1 (97.3, 98.4)                   99 1 (98.2, 99.0)
Maryland                       99 1 (98.6, 99.3)                   98 1 (97.7, 98.7)
Massachusetts                  98 1 (97.3, 98.5)                   98 1 (97.1, 98.6)
Michigan                       98 1 (97.3, 98.6)                   98 1 (95.8, 98.6)
Minnesota                      99 1 (98.1, 99.0)                   98 1 (97.6, 98.8)
Mississippi                    99 1 (98.7, 99.5)                   99 1 (98.5, 99.6)
Missouri                       99 1 (98.0, 99.0)                   99 1 (98.2, 99.1)
Montana                        98 1 (97.8, 98.7)                   99 1 (98.0, 99.0)
Nebraska                       98 1 (97.6, 98.8)                   98 1 (97.6, 98.6)
Nevada                         99 1 (98.1, 99.0)                   99 1 (98.4, 99.3)
New Hampshire                  99 1 (98.3, 99.1)                   99 1 (98.5, 99.3)
New Jersey                     99 1 (98.3, 99.2)                   98 1 (97.7, 98.8)
New Mexico                     99 1 (98.2, 99.2)                   98 1 (97.9, 98.8)
New York                       99 1 (98.1, 99.2)                   98 1 (97.1, 98.7)
North Carolina                 99 1 (98.3, 99.1)                   99 1 (98.2, 99.1)
North Dakota                   97 1 (96.8, 97.9)                   97 1 (96.5, 97.5)
Ohio                           99 1 (98.2, 99.0)                   98 1 (98.0, 98.9)
Oklahoma                       98 1 (97.5, 98.6)                   98 1 (97.7, 98.9)
Oregon                         98 1 (97.2, 98.4)                   99 1 (97.9, 99.0)
Pennsylvania                   98 1 (97.8, 98.8)                   98 1 (97.4, 98.9)
Rhode Island                   99 1 (98.4, 99.2)                   99 1 (98.5, 99.2)
South Carolina                 99 1 (98.2, 99.3)                   99 1 (98.0, 99.1)
South Dakota                   99 1 (98.0, 99.0)                   99 1 (98.2, 99.1)
Tennessee                      99 1 (98.0, 99.1)                   98 1 (97.7, 98.7)
Texas                          98 1 (97.9, 98.7)                   98 1 (97.4, 98.6)
Utah                           99 1 (98.1, 99.2)                   98 1 (97.9, 98.9)
Vermont                        99 1 (98.2, 99.0)                   99 1 (98.8, 99.4)
Virginia                       98 1 (98.0, 98.9)                   99 1 (98.6, 99.2)
Washington                     98 1 (97.0, 98.4)                   98 1 (97.3, 98.5)
West Virginia                  98 1 (97.6, 98.8)                   98 1 (97.8, 98.7)
Wisconsin                      98 1 (97.7, 98.6)                   98 1 (97.9, 98.9)
Wyoming                        99 1 (98.6, 99.3)                   98 1 (98.0, 98.9)
Other jurisdictions
District of Columbia           99 1 (98.1, 99.0)                   99 1 (98.5, 99.4)
DoDEA 2                        98 1 (97.9, 98.7)                   99 1 (98.4, 99.2)

1 The state/jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
2 Department of Defense Education Activity (overseas and domestic schools).
NOTE: The overall national results include both public and nonpublic school students. The national (public) and state/jurisdiction results include public school students only. Data for DoDEA schools are included in the overall national results, but not in the national (public) results.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Mathematics Assessment.


National Center for Education Statistics

Inclusion rate and confidence interval in NAEP mathematics for fourth- and eighth-grade public school students, as a percentage of all students, by jurisdiction: 2013

Jurisdiction                   Grade 4: inclusion rate (95% CI)    Grade 8: inclusion rate (95% CI)
Nation (public)                98 2 (98.4, 98.6)                   98 2 (98.3, 98.5)
Large city 1                   98 2 (98.0, 98.4)                   98 2 (97.9, 98.4)
Albuquerque                    99 2 (98.1, 99.3)                   98 2 (97.8, 99.0)
Atlanta                        99 2 (98.4, 99.4)                   99 2 (98.8, 99.6)
Austin                         98 2 (97.0, 98.6)                   98 2 (97.4, 98.6)
Baltimore City                 98 2 (96.9, 99.2)                   98 2 (96.6, 99.1)
Boston                         96 2 (95.4, 97.0)                   97 2 (96.7, 98.0)
Charlotte                      99 2 (97.6, 99.4)                   99 2 (97.8, 99.2)
Chicago                        99 2 (98.3, 99.3)                   99 2 (98.0, 99.2)
Cleveland                      96 2 (94.8, 96.5)                   97 2 (96.6, 98.0)
Dallas                         98 2 (96.8, 98.3)                   98 2 (96.7, 98.2)
Detroit                        95 2 (93.3, 96.1)                   96 2 (94.4, 96.9)
District of Columbia (DCPS)    98 2 (97.1, 98.6)                   98 2 (97.4, 98.9)
Fresno                         99 2 (98.5, 99.5)                   98 2 (97.5, 98.8)
Hillsborough County (FL)       99 2 (98.1, 99.3)                   99 2 (97.8, 99.2)
Houston                        98 2 (97.1, 98.8)                   98 2 (97.1, 98.3)
Jefferson County (KY)          98 2 (97.4, 98.8)                   98 2 (97.5, 98.9)
Los Angeles                    98 2 (97.0, 98.7)                   98 2 (97.8, 98.9)
Miami-Dade                     98 2 (96.5, 98.4)                   98 2 (97.0, 98.3)
Milwaukee                      97 2 (95.2, 97.6)                   96 2 (93.6, 97.4)
New York City                  99 2 (98.0, 99.1)                   98 2 (97.4, 98.8)
Philadelphia                   97 2 (95.1, 97.6)                   96 2 (92.6, 98.2)
San Diego                      99 2 (97.7, 99.1)                   98 2 (96.8, 98.3)

1 Large city includes students from all cities in the nation with populations of 250,000 or more including the participating districts.
2 The jurisdiction’s inclusion rate is higher than or not significantly different from the National Assessment Governing Board goal of 95 percent.
NOTE: DCPS = District of Columbia Public Schools.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2013 Mathematics Assessment.


National Assessment Governing Board Committee on Standards, Design and Methodology

May 16, 2014

EXCERPT

COSDAM Members: Chair Lou Fabrizio, Vice Chair Fielding Rolston, Lucille Davy, James Geringer, Andrew Ho, Terry Holliday, James Popham, and Leticia Van de Putte.
Governing Board Staff: Michelle Blair and Sharyn Rosenberg.
Other Attendees: John Easton, Director of the Institute of Education Sciences and ex officio member of the Governing Board. NCES: Peggy Carr, Arnold Goldstein, Dana Kelly, Daniel McGrath, and Grady Wilburn. AIR: Fran Stancavage. CRP: Carolyn Rudd. ETS: Rochelle Michel and Andreas Oranje. HumRRO: Lauress Wise. Optimal Solutions Group: Lipika Ahuja. Pearson: Brad Thayer. Westat: Keith Rust.

NAEP Testing and Reporting on Students with Disabilities

Mr. Fabrizio noted that the session would focus on a particular challenge associated with the March 2010 Board policy on NAEP Testing and Reporting on Students with Disabilities (SDs) and English Language Learners (ELLs). The policy was intended to reduce exclusion rates and provide more consistency across jurisdictions in which students are tested on NAEP to promote sound reporting of comparisons and trends. The policy limits the grounds upon which schools can exclude students to two categories—for SDs, only those with the most significant cognitive disabilities, and for ELLs, only those who have been in U.S. schools for less than one year. Although schools cannot limit student participation on any other grounds, individual participation in NAEP is voluntary by law and parents may withdraw their children for any reason.

The policy states, “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures.” Under NAEP data analysis procedures, a weight class adjustment is used to account for students who refuse to take the assessment, but excluded students have no impact on estimated scores. Contrary to the Board policy, NCES has continued to permit schools to exclude students whose Individualized Education Programs (IEPs) call for accommodations that NAEP does not allow. NCES asserts that it is technically incorrect to apply a weight class adjustment that combines students who did not participate due to receiving accommodations on their state tests that are not allowed on NAEP with students who refused for other reasons.

Grady Wilburn of the National Center for Education Statistics (NCES) and Rochelle Michel from Educational Testing Service (ETS) presented three alternative methods for adjusting scores for students who were excluded from NAEP, contrary to the Board policy.

The first method, “Expanded” population estimates, would improve upon the methodology of the full population estimates (FPEs) and incorporate additional data from NAEP teacher and school contextual questionnaires and from school records (e.g., state test scores for individual students). The second method, Modified participation A, would involve administering only the contextual questionnaire to excluded students and using that additional information to predict how the students would have performed on the cognitive items. The third method, Modified participation B, would involve administering the contextual questionnaire in the selected subject (i.e., Reading) in conjunction with an assessment in a different subject (e.g., Mathematics) and using both sources of information to predict how the students would have done on the Reading assessment.

COSDAM members expressed serious reservations about implementing any of the three procedures due to the following reasons: current concerns about collecting student data; the potential for jeopardizing trend reporting; increased costs; and the threat of depressing scores due to a change in the population of tested students. There was general consensus that NCES’ current practices on this particular aspect of the policy—encouraging schools to include more students in NAEP even when they receive accommodations on their state tests that are not allowed on NAEP, but still allowing schools to exclude such students if they insist—were acceptable.

The committee asked whether it is possible to identify students who do take the NAEP Reading assessment despite receiving a read-aloud accommodation on their state tests. Peggy Carr, Associate Commissioner of NCES, noted that the SD questionnaire will be modified for 2015 to capture this information.

Andrew Ho suggested the following edit to the policy: that the sentence “Students refusing to take the assessment because a particular accommodation is not allowed should not be classified as exclusions but placed in the category of refusals under NAEP data analysis procedures” be revised to read, “The number of students who do not take the assessment because a particular accommodation is not allowed should be tracked and minimized to the extent possible.” The committee agreed with Mr. Ho’s suggestion. Mr. Fabrizio asked that this recommendation be shared with the Reporting and Dissemination Committee in joint session during the August 2014 meeting.


National Assessment Governing Board Committee on Standards, Design and Methodology

December 6, 2013

EXCERPT

JOINT MEETING WITH REPORTING AND DISSEMINATION COMMITTEE

Attendees

COSDAM Members: Chair Lou Fabrizio, Vice Chair Fielding Rolston, Lucille Davy, Andrew Ho, Terry Holliday, and James Popham.
Reporting and Dissemination Committee Members: Acting Chair Terry Mazany (Vice Chair of the Reporting and Dissemination Committee), Anitere Flores, Rebecca Gagnon, Tom Luna, Tonya Miles, and Father Joseph O’Keefe.
Governing Board Staff: Executive Director Cornelia Orr, Michelle Blair, Larry Feinberg, Stephaan Harris, and Sharyn Rosenberg.
Other Attendees: John Easton, Director of the Institute of Education Sciences and ex officio member of the Governing Board. NCES: Commissioner Jack Buckley, Gina Broxterman, Patricia Etienne, Arnold Goldstein, Andrew Kolstad, and Daniel McGrath. AIR: Victor Bandeira de Mello, George Bohrnstedt, Markus Broer, and Cadelle Hemphill. ETS: Andreas Oranje, John Mazzeo, and Lisa Ward. Hager Sharp: David Hoff, Debra Silimeo, and Melissa Spade Cristler. HumRRO: Steve Sellman and Laurie Wise. Optimal Solutions Group: Rukayat Akinbiyi. Reingold: Amy Buckley, Erin Fenn, Sarah Johnson, and Valeri Marrapodi. Virginia Department of Education: Pat Wright. Westat: Chris Averett and Keith Rust. Widmeyer: Jason Smith.

Lou Fabrizio, Chair of the Committee on Standards, Design and Methodology (COSDAM), called the meeting to order at 10:02 a.m. and welcomed members and guests. The purpose of the joint session was to discuss implementation in the NAEP 2013 assessments of the Governing Board policy on NAEP Testing and Reporting on Students with Disabilities (SD) and English Language Learners (ELL).

Larry Feinberg, of the Governing Board staff, described the March 2010 policy, which was intended to reduce exclusion rates and provide consistency across jurisdictions in how students are tested to promote sound reporting of comparisons and trends. The policy limits the grounds on which schools can exclude students from NAEP samples to two categories—for SD, only those with the most significant cognitive disabilities, and for ELL, only those who have been in U.S. schools for less than a year.

He noted that previously, schools could exclude students with Individualized Education Programs (IEPs) that called for accommodations on state tests that NAEP does not allow because they would alter the construct NAEP assesses. The most widely used of these were having the test read aloud for the Reading assessment and using a calculator for all parts of the Mathematics assessment. Under the current Board policy, schools can no longer decide to exclude students whose IEPs for state tests specify an accommodation not allowed on NAEP. Instead, such students should take NAEP with allowable accommodations. Parents should be encouraged to permit them to do so, given that NAEP provides no scores and causes no consequences for individuals but needs fully representative samples to produce the valid results for the groups on which it reports. By law, individual participation in NAEP is voluntary and parents may withdraw their children for any reason.

When parents refuse to allow children to participate in NAEP, scores are imputed based on reweighting the performance of other students with similar characteristics. However, when students are excluded, they do not impact group scores at all, and, in effect, are considered to achieve at the group average.

Grady Wilburn, of NCES, presented 2013 participation data for grades 4 and 8 Reading and Mathematics. He noted large increases in inclusion rates over the past ten years, and said the Board’s inclusion goals—95 percent of all students in each sample and 85 percent of students identified as SD or ELL—had been met in almost all states. According to calculations by Keith Rust, of Westat, converting exclusions in reading to refusals would produce a statistically significant change in only one state, Maryland. However, Peggy Carr, Associate Commissioner of Assessment at NCES, said the impact would be much greater in some of the urban districts in TUDA, whose 2013 results have not yet been released.

In accordance with Board action, Mr. Wilburn said NCES had also published scores based on full-population estimates (FPEs), which adjust state and district averages by imputing scores for excluded SD and ELL students based on the performance of similar SD and ELL students who are tested. Member Andrew Ho said these estimates should be given more emphasis as a way to give consistency to trends and make it clear when score changes are likely to have been caused by changes in exclusion rates. Ms. Carr said improvements were possible in the models for imputing FPEs.

Mr. Wilburn explained that, contrary to the Board policy, NCES had continued to permit schools to exclude students whose IEPs called for accommodations that NAEP does not allow, in most cases, read-aloud. NCES believes changing this practice would increase refusals, impact reported trends, change NAEP’s target population, and violate sound psychometric procedures.

For mathematics in 2013, NCES introduced a new option for students whose IEPs call for a calculator accommodation, where schools could choose to have these students take two calculator-active NAEP blocks, even if those were not the blocks that would have been randomly assigned through the matrix sampling design. Mr. Feinberg said this change, by reducing exclusions, had also impacted some reported trends.


Jack Buckley, the Commissioner of Education Statistics, noted that it is not clear who gets to define NAEP’s target population. He said NCES and the Board disagree about whether it should include students whose IEPs specify accommodations that NAEP does not allow.

Mr. Wilburn said NCES plans to publish a technical memo that will focus on how refusal and exclusion issues impact NAEP participation and performance. The memo will include total participation rates that summarize non-participation from all causes—exclusions, refusals, and absence (which is the largest category). The memo will also provide data on the proportion of exclusions based on NAEP not allowing a state-provided accommodation.

There was additional discussion on the impact that exclusion and refusal changes would have on TUDA districts. Terry Mazany, the acting chair of the Reporting and Dissemination Committee, conveyed a message from Andrés Alonso, the Committee chair who was not present. He said Mr. Alonso, former superintendent of Baltimore schools, had urged that policy changes impacting NAEP exclusions and scores should be highlighted in NAEP reports to provide context for interpreting results and that historical data should be provided.

The Committees asked the staffs of NCES and NAGB to consider possible policy changes and what their impact might be. Lou Fabrizio, chair of the Committee on Standards, Design and Methodology, asked staff to prepare recommendations for moving forward and a timeline for possible Board action.


Attachment B

NATIONAL ASSESSMENT GOVERNING BOARD 2014 STRATEGIC COMMUNICATIONS PLAN

August 1, 2014

In 2014 and beyond, the National Assessment Governing Board seeks to focus its communication efforts strategically and cost effectively to “Make Data Matter” for various target audiences. The Board is well-positioned to increase the impact of its outreach, but it must prioritize its audiences and identify its objectives for each, while integrating innovative strategies to elevate the Board’s work—and NAEP—as a thought leader in education. Reingold proposes three goals the Board can pursue to amplify its outreach efforts.

I. Make a Connection With Target Audiences
II. Engage Audiences Between Report Card Releases
III. Maximize Impact Through Innovation

Reingold’s assumption in developing strategic priorities for the Board is that reporting and dissemination activities must support a vision to make an impact in education through engagement with NAEP that will enable the use, discussion, and sharing of NAEP data and information. A time-phased action plan, including specific outreach tactics and metrics, will be developed with Governing Board staff on the Board’s approval of this strategic communications plan.

The members of the Reporting and Dissemination Committee have identified three key audiences it believes the Board should focus on—parents; teachers and administrators; and policymakers—as each of these audiences is in a position to make an impact through NAEP data. Working with staff, we will identify the Board’s goals and expectations of each audience and the key messages needed to engage each one effectively. Potential outcomes of the audience-focused outreach are listed below:

Parents
• Understand the value of NAEP and its implication for parents.
• Ask informed questions about their child’s education and the school system.
• Use NAEP to consider out-of-school factors that might affect their child’s education.
• Share NAEP information and messages with their parent peers.

Teachers and Administrators
• Understand the value of NAEP and its implication for teachers and administrators.
• Use NAEP to influence change within their classroom or school system.
• Educate parents about NAEP data and resources.
• Share and distribute NAEP information to their peers.


Policymakers
• Understand the value of NAEP and its implication for education policy.
• Use and cite NAEP data in policy decisions, public statements, and white papers.
• Distribute NAEP information and messages to constituents and peers to help advocate for change.

It is important to remember that messages and calls to action are intended to move the Board’s priority audiences along an engagement continuum, from awareness and education to trial, buy-in, and, ultimately, action. But creating the right messages is only the beginning. It is critical to know which information to deliver first, which should follow, and who are the most credible messengers. We will lay out a cohesive, practical, comprehensive roadmap for reaching the Board’s target audiences that identifies how to take advantage of existing opportunities, what new strategies to develop, and optimal methods of dissemination. The action plan will include a variety of opportunities to connect with each audience to maximize the reach and frequency of each message. The proposed strategies involve cultivating and leveraging partnerships that will include stakeholders or champions. There will also be collaboration with the National Center for Education Statistics (NCES) to ensure efforts are not duplicated, with Board and NCES staff coordinating on roles, responsibilities, and resources on various strategies as needed.

To illustrate the strategies identified above, below we discuss what the execution of each one could involve for the Board’s three priority audiences.

I. Make a Connection With Target Audiences

The goal is personal and powerful: “Communicate the Value of NAEP.” This means going beyond the distribution of NAEP data to highlighting, developing, and sharing relevant messages, content, stories, and calls to action for key audiences. Communicating the “So what?” and “Why should we care?” can help the Board move beyond the scores and headlines to clarify the value of NAEP and its important role as an indicator of student achievement.

Develop key messages and calls to action for priority audiences. The Governing Board’s audience is widely diverse—in their knowledge of and experience with NAEP, in their intended uses and consumption of data and information, and in their communications networks, favored channels, and approaches. With these differences in mind, it is imperative that the Governing Board tailor messages for each of its audiences to inspire deeper engagement with NAEP data. Instead of a one-size-fits-all approach, we will define and continually test and adjust the messages that are the most relevant to each audience.

Example of the strategy in action for parents: Include the tailored messages and calls to action on the website’s “Information For” parent pages. The parent landing page could have calls to action including “Learn about NAEP,” “Download NAEP resources,” or “Test yourself on NAEP questions.” The page could also have a section devoted to the Board’s assessment literacy efforts (including resources, information and questions to ask) once outreach strategies from the work group are finalized.


Example of the strategy in action for teachers and administrators: The American Federation of Teachers and the National Education Association could include a NAEP toolkit with messages for teachers on their websites in a resources section.

Example of the strategy in action for policymakers: Minneapolis Board of Education and Governing Board member Rebecca Gagnon could use and reference data from Science in Action: Hands-On and Interactive Computer Tasks From the 2009 Science Assessment in a discussion with the Minnesota Department of Education and the Minnesota Education Technology Task Force about the importance of science computer labs.

Impact metrics: The number of downloads of materials such as a PowerPoint or frequently asked questions PDF; number of clicks on links for calls to action (e.g., “Test yourself on NAEP questions”); number of champions—that is, advocates—who commit to using or distributing the NAEP messaging and toolkit.

Expand communications beyond reporting on the scores. We need to get beyond the typical report presentations of the data and find meaningful ways to elevate the data (and their implications) through materials, messaging, and outreach activities. We will identify and highlight hidden gems of NAEP data, connecting the dots between data and practice and leveraging resources to reach specific audiences to deliver important messages in a meaningful and memorable way. The Governing Board must be a storyteller that educates its audiences about the relevance of NAEP data and resources in a way that resonates with their interests and needs and prompts action.

Example of the strategy in action for parents: Develop a parent leader discussion guide to assist parent leaders in using NAEP and other assessment data in their conversations with school administrators about improving student achievement for all children.

Example of the strategy in action for teachers and administrators: Develop an interactive Prezi presentation (a visually animated storytelling tool for presenting ideas and messages) on NAEP achievement gap data from the recent 2013 Mathematics and Reading, Grade 12 report card for New Leaders, a national nonprofit organization that develops transformational school leaders and designs effective leadership policies and practices for school systems across the country.

Example of the strategy in action for policymakers: Governing Board member Anitere Flores could host a Florida Senate session on parent involvement in education to highlight NAEP contextual variables data in reading from the 2013 Mathematics and Reading, Grade 12 report card. For example, when asked whether students discussed what they read, students who reported discussing their reading every day or almost every day had higher reading scores.


Impact metrics: The number of guides distributed at stakeholder conferences or downloaded from the website; number of groups posting the guide on their websites; number of Prezi and data downloads; parent-submitted testimonials and feedback on using the guide to speak with school and district leaders.

Tell the NAEP story through user testimonials. NAEP data become more impactful when stakeholders learn how others use the data to fulfill their missions and advance their educational goals. Working through key groups, we will collect and disseminate real-life testimonials from the priority audiences to become an authentic author of the NAEP story.

Example of the strategy in action for parents: Collaborate with National PTA to solicit testimonials from parents about how they use NAEP and other assessment data, and then promote the testimonials through the Board’s and PTA’s online networks. These testimonials and other NAEP information could also be featured on the websites of other national education groups, encouraging parents to learn about different assessments their children might take and how the data can be used.

Example of the strategy in action for teachers and administrators: Coordinate with elementary school principal and Board member Doris Hicks and the future Board member chosen for the secondary school principal slot to collaborate with the National Association of Elementary School Principals and the National Association of Secondary School Principals to solicit testimonials from principals and teachers within their districts about how they use NAEP and the importance of at-home and out-of-school activities that enhance learning, then promote testimonials through the school communication channels.

Example of the strategy in action for policymakers: Collaborate with the National Association of State Boards of Education to collect testimonials from state board members on how data, including NAEP data, are used to inform policy-level decisions and improvements.

Impact metrics: The number of NAEP user testimonials received; number of testimonial views online; number of social media shares and engagement; quality of the engagements and comments about parents using data.

Potential action taken by key audiences under this goal: Using NAEP materials and resources on organization websites to inform questions of school and education leaders about school curriculum and district progress; downloading NAEP sample questions to test student knowledge or supplement classroom lessons;

II. Engage Audiences Between Report Card Releases

The goal is ongoing and impactful: “Continual Engagement.” This means building tangible connections—outside of report card release events—between NAEP and its stakeholders, and equipping them with the insight, information, and tools to make a difference in educational quality and student achievement. This important strategy cannot be executed by staff alone, and will require the contributions of Board members and the partnership of stakeholder groups and other NAEP champions, including former Board members.

Expand the report card release life cycle. There is great opportunity for the Governing Board to enliven data and engage target audiences by taking a comprehensive, reimagined view of releasing and reporting on NAEP results that goes beyond the one-day release event. The entire life cycle of an assessment—from developing the framework to fielding assessments to disseminating results—offers content and commentary that, if shared more strategically, will powerfully support the NAEP brand and use of NAEP by target audiences. The Board can both enhance the report card releases and extend the life cycle to make meaningful connections with target audiences by developing pre- and post-release content, and by recording and sharing video or audio that teases out and illuminates NAEP data.

Example of the strategy in action for parents: For each report card release, develop a highlight reel with panelist quotes, select data points, and facts on reading, mathematics, and science contextual variables to send to parent stakeholder groups to distribute to their networks and on the Web.

Example of the strategy in action for teachers and administrators: Governing Board member Terry Mazany could host a meeting with the executive director of the Chicago Principals & Administrators Association to discuss the value of NAEP state and TUDA achievement data.

Example of the strategy in action for policymakers: Host a briefing with the California State Board of Education on the performance of fourth-grade students in the NAEP 2012 Writing Grade 4 Pilot with a diverse panel to include California fourth-grade teacher and Governing Board member Shannon Garrison, the executive director of the National Writing Project, and authors Carol Bedard and Charles Fuhrken.

Impact metrics: The numbers of video views and shares; number of groups posting the video; quality of comments and conversations under the video; feedback from stakeholder groups about the impact of the video and parent engagement with the content; number of participants at the meeting or briefing.

Leverage partnerships with stakeholder organizations and champions. As a trusted messenger of information to key audiences, the Governing Board needs to mobilize its existing networks, engaging stakeholder groups and champions to share and shape future outreach. Stakeholders and champions are diverse and can be from education associations or news outlets like NBC News. They could also be politicians, celebrities, athletes, or prominent individuals like First Lady Michelle Obama. We will help the Board identify key partnership opportunities for its priority audiences and develop specific recommendations for engagement, to put their distinct capabilities to work in promoting NAEP and extending the Governing Board’s reach. For example, we could keep working with the Alliance for Excellent Education to produce and promote post-release webinars, provide data infographics to the National Council of Teachers of Mathematics, and collaborate with the National Council of La Raza in sponsoring Facebook chats, in addition to consistently pursuing new opportunities with key stakeholder organizations.

Example of the strategy in action for parents: Collaborate with NBC News’ Education Nation and Pearson on their Parent Toolkit (www.parenttoolkit.com), including NAEP materials, graphics, and downloadable resources on the website that position the Governing Board as an authoritative source of information on student assessment data.

Example of the strategy in action for teachers and administrators: Collaborate with Danica McKellar, actress, author, and STEM education advocate, to submit an article to the National Science Teachers Association’s NSTA Express newsletter on the importance of STEM education and girls’ involvement in STEM, and include data from NAEP’s Technology and Engineering Literacy assessment.

Example of the strategy in action for policymakers: Arrange for James Geringer and/or Ronnie Musgrove, Board members and former governors, to present at the annual National Governors Association conference on an important policy issue affecting states in which NAEP data and contextual variables are relevant. Additionally, the Board and the governors can collaborate with the Center on Education Policy to include NAEP reading data and contextual variables (such as frequency of discussing what they read or finding reading enjoyable) in their research papers, publications, and annual progress report.

Impact metrics: The number of clicks on the NAEP content; number of downloads of NAEP materials; use of presented NAEP data by governors and state policy leaders in media citations, state websites and other materials; volume of referral traffic from the Parent Toolkit site back to the Governing Board’s website; Education Nation engagement that identifies stories of the Toolkit in action; number of newsletter opens and clicks; number of research report downloads.

Equip, empower, and display thought leadership. The Governing Board and NCES are well-positioned as thought leaders among researchers and many national policymakers but could expand their influence with other audiences, such as parents, local policymakers, and education practitioners. Governing Board members and staff should be seen by media representatives and stakeholders as valued spokespeople on educational assessment and achievement, including specific topics such as computerized assessments, achievement gap trends, 12th-grade academic preparedness, and the importance of technology and engineering literacy. The Board can also continually secure speaking engagements at a variety of events such as the International Reading Association’s annual conference or local PTA chapter meetings, or pitch quotes for inclusion in news articles and op-eds on relevant topics.

Example of the strategy in action for parents: Work with Board member and parent Tonya Miles to develop and pitch op-eds that connect NAEP data with important year-round education events, emphasizing the role parents can play in raising student achievement. During Black History Month, pitch a piece to HuffPost Parents that spotlights achievement gap success stories, or pitch a piece about technology and engineering skill-building beyond the classroom to Sacramento Parent magazine.

Example of the strategy in action for teachers and administrators: Co-host a webinar discussion on NAEP state achievement trends with the American Federation of School Administrators, with members weighing in on state-level changes and education initiatives that are aimed at increasing achievement.

Example of the strategy in action for policymakers: Submit a proposal to the National School Boards Association’s annual conference for a Board member and NCES to co-host a breakout session to share and discuss the recent 2013 Mathematics and Reading, Grade 12 report card, academic preparedness data, and recent graduation rate research.

Impact metrics: The numbers of op-ed placements, shares, and comments; quality of user engagements and comments; number of follow-up questions from readers; number of new emails collected (from a “Subscribe to the Governing Board” call to action); number of webinar and conference participants and follow-up requests.

Potential action taken by key audiences under this goal: Inspired by an op-ed on racial achievement gaps, exploring gaps in their own districts and talking with school leaders about parity of resources; noting performance trends in subjects by state and/or urban district and then using that knowledge to inform state, local, or school district-level decisions regarding academic programs.

III. Maximize Impact Through Innovation

The goal is proactive and cutting-edge: “Lead the Way.” This means reaching and making meaningful connections with priority audiences, customizing events, fostering and driving online conversations, and creating tech-savvy materials with compelling content.

Customize release event formats. Report cards are not one-size-fits-all; innovative release event strategies are needed to achieve the specific goals of each release. Each release event strategy should have distinct goals, audiences, messages, materials, strategies, and tactics to Make Data Matter. The Governing Board has expanded the report card release event structure from physical events for every release to include webinars and live-streaming during events, a post-release social media Facebook chat, and an online town hall event. We will continue to refine this approach to customizing every release to maximize the immediate release impact and create a sustained conversation that continues to reach and engage key audiences.

Example of the strategy in action for parents: Host a Google Hangout for parents after a NAEP release that can feature panelists from the National Council of La Raza talking about the importance of parent involvement in education, and encourage parent participants to share how they use data to help their students achieve.


Example of the strategy in action for teachers and administrators: Develop a Twitter town hall guide (NAEP data points, question-and-answer content, best-practice tips, and facilitation instructions) for teachers and school administrators to host their own facilitated chats with parents and the school district on state-level NAEP data and areas for application.

Example of the strategy in action for policymakers: Host an in-person round-table discussion with members of the Massachusetts Mayors’ Association on the latest state-level NAEP reading and mathematics results and their state-based implications.

Impact metrics: The number of promotions of the online events and shares of the URL; numbers of event participants and total users viewing them or reached; numbers of comments or participants sharing their testimonials; number of follow-up testimonials received for inclusion in materials or on the website.

Engage in the online conversation. It is important to be aware of the conversations on important education issues, but to influence and help shape public understanding and perceptions, the Governing Board needs to participate in the conversation with key messages. We will help the Governing Board foster conversations through real-time engagement on social media platforms, develop content such as an article written by a Governing Board member to post on NAEP’s upcoming blog coordinated by NCES, and create a strategy to join or host online chat events, sponsor Q&A sessions, or solicit feedback. Champions are key to the success of this effort, providing greater reach and often a more powerful story than the Governing Board can tell alone.

Example of the strategy in action for parents: Hold a webinar with the Governing Board’s Education Summit for Parent Leaders attendees and parent leader champions to review the NAEP website workshop tutorial and obtain feedback through a moderated chat on how they have used NAEP data since the event. Compile feedback to create a one-pager and share it with participants.

Example of the strategy in action for teachers and administrators: Collaborate with the National Council of Teachers of Mathematics (NCTM) on an online Q&A chat session based on the NAEP Mathematics Curriculum Study data, educating NCTM about the wide variance of content in mathematics courses and books with the same name. Board member and math teacher Dale Nowlin could be a participating panelist.

Example of the strategy in action for policymakers: Reach out to the National Governors Association (NGA) on Twitter and provide NGA with content and data about the 2013 Mathematics and Reading, Grade 12 report card.

Impact metrics: Numbers of campaign participants and user submissions; numbers of engagements (“likes,” comments, shares, retweets, views) for the multimedia submissions; quality of comments on the multimedia submissions; growth in the Governing Board social media audience and number of engaged users discussing assessment data.

Create multimedia, digital content and materials. The Governing Board must present messages, graphics, and images that resonate with target audiences. A wealth of materials has been developed by the Governing Board and NCES, and the first step will be to audit and catalog resources that may be repurposed through outreach and promotional activities. For the materials gaps that are identified, it is imperative to develop interactive, multimedia content and materials that deliver key messages to target priority audiences and include a call to action. Examples include infographics that embellish key report card findings to facilitate understanding and encourage engagement with NAEP data among nonexperts; videos, Prezi, and other presentation tools allowing exploration of the relationships between ideas and numbers and visual presentations of NAEP; and an email newsletter with new content and specific calls to action.

Example of the strategy in action for parents: Create a “NAEP for Parents” email newsletter with information on the latest report card data and trends, multimedia content such as video clips or NAEP data user testimonials, and links to other resource or news content and the interactive data maps on the Board’s parent Web pages, to be distributed bimonthly or consistently throughout the year.

Example of the strategy in action for teachers and administrators: Create an infographic with “hidden data” gems from the NAEP Grade 8 Black Male Students report and accompanying language to share with the National Alliance of Black School Educators to post on social media.

Example of the strategy in action for policymakers: Work with Board member Terry Holliday to create an interactive presentation at CCSSO’s annual large-scale assessment conference on NAEP computer-based assessments, or work with Board member Tom Luna to distribute the dynamic 12th-grade preparedness video highlighting the new college preparedness data to Chiefs for Change members.

Impact metrics: Email open rate; numbers of email shares, clicks from email to website, and new email subscribers; number of release participants who list the email as their referral source; numbers of email replies or responses with inquiries about NAEP or acquiring NAEP materials and resources; number of video and infographic views and shares.

Potential action taken by key audiences under this goal: Using contextual data to influence out-of-school factors that have been shown to correlate with achievement; using curriculum study findings to investigate course rigor and influence change for exposure to challenging subject matter.

By pursuing these three fundamental communication goals and identifying priority strategies and tactics, the Governing Board can more effectively reach its target audiences to Make Data Matter and, ultimately, make an impact.


Attachment C

Core Contextual Questions: Committee Review and Feedback

Historically, NAEP has designed its contextual questionnaire around single questions, and questionnaire results were reported as single questions as well. The National Center for Education Statistics (NCES) is developing modules for the 2017 core contextual questions. During the Reporting and Dissemination (R&D) Committee’s February 2014 meeting, NCES presented initial plans to develop core contextual modules, including the following: Socio-Economic Status, Technology Use, School Climate, Grit, and Desire for Learning.1 During the Committee’s May meeting, NCES discussed the comprehensive research used in question development and further described the five potential modules capturing opportunity to learn and non-cognitive student factors relevant to student achievement that are proposed for future NAEP Core survey questionnaires.

The Committee members will review contextual items and provide feedback to NCES for discussion at the August 1 meeting. During the week of July 14, members will participate in a webinar to learn how to navigate the embargo site (open on July 17) that hosts the items members will be able to access and study before the meeting. This review would include existing questions that are currently in the core questionnaire pool along with draft questions intended to measure respective modules. Committee input from this first review will help inform core contextual questionnaire development before cognitive labs are administered later this year. Following cognitive labs, the Committee would review core contextual questions two more times prior to the 2017 operational assessment administration. This would include a review prior to 2016 pilot testing and a final review prior to the 2017 operational assessment.

Attached is a high-level schedule of contextual item development, including the Committee’s opportunities for providing feedback at key junctures during the process.

1 This module was previously referred to as “Need for Cognition” during the February 28, 2014 presentation.


Timeline for Contextual Item Development and R&D Committee Review

This table represents a timeline for the review of contextual modules for 2017 NAEP.

ITEM DEVELOPMENT & PRE-TESTING
• 07-08/2014: R&D review of existing item pool and draft items
• 08/2014: Continuation of item development for cognitive labs* based on R&D and Questionnaire Standing Committee** input
• 10/2014: OMB*** fast-track review of items in cognitive labs
• 11/2014-02/2015: Pre-testing of new and revised items for cognitive labs*
• 03/2015: Analysis of pre-testing data and decisions for pilot questionnaires

PILOT
• 04/2015: R&D clearance review for pilot
• 05/2015: OMB*** review of items for pilot
• 01/2016-03/2016: Pilot administration
• 2016: Analysis of pilot data and decisions for operational

OPERATIONAL
• 04/2016: R&D clearance review for operational
• 05/2016: OMB*** review of items for operational
• 01/2017-03/2017: Operational administration
• 2017: 2017 grade 4 and 8 reporting
• 2018: 2017 grade 12 reporting

* Cognitive labs allow NCES to study how respondents understand, mentally process, and answer survey questions.
** The Questionnaire Standing Committee provides guidance for contextual questionnaires and is similar to a subject area standing committee that would provide guidance for a specific subject.
*** Office of Management and Budget approval is needed for federal agencies that collect survey data from 10 or more people.


Attachment D

Overview of Webinar Release of Performance of Fourth-Grade Students in the 2012 NAEP Computer-Based Writing Pilot Assessment

The National Center for Education Statistics (NCES) conducted the NAEP Grade 4 Writing Computer-Based Assessment (WCBA) Study to determine if fourth-grade students could write effectively on the computer and to examine if technology-based assessments could be administered to fourth-grade students in the future. The study’s findings as well as sample computer tasks are slated to be posted online by NCES around July 21, 2014.

To bring attention to this study, the National Assessment Governing Board is hosting a webinar slated for July 25th. The webinar panel will discuss the significance of this study, the implications for technology and writing education, and lessons learned that could especially inform educators and policymakers as they make decisions on related student technology and assessments. The following panelists are confirmed for the webinar.

• Shannon Garrison, Fourth-Grade Teacher, Solano Avenue Elementary School, Los Angeles; Member, National Assessment Governing Board
• Elyse Eidman-Aadahl, Executive Director, National Writing Project
• Karen Cator, President and CEO, Digital Promise
• Cornelia Orr, Executive Director, National Assessment Governing Board (moderator)

Due to the Board materials deadline, an overview and media coverage report of the webinar will not be available for posted materials but can be distributed, shared, and discussed at the R&D Committee meeting on August 1, 2014.


Attachment E

Timing of Governing Board Input to 2014 Report Cards

The Reporting and Dissemination Committee has voiced a desire to offer ideas and topics for inclusion in the 2014 Report Cards at an early stage in their development, before it becomes too late in the process for significant revision. Four NAEP Report Cards are planned from the 2014 data collection:

• Civics

• Geography

• U.S. History

• Technology and Engineering Literacy (TEL)

The National Center for Education Statistics (NCES) will begin considering designs and content for the civics, geography, and U.S. history reports in November 2014 and for the TEL report in December 2014. At least one of these reports must be approved and delivered to the Governing Board for scheduling its release within one year of the end of data collection, i.e., by March 31, 2015.

The November 2014 Board meeting would be an ideal time to hold such a discussion for Committee members. While 2014 results will not be available, the 2010 reports on civics, geography, and U.S. history, as well as the 2014 student, teacher, and school survey questionnaires for these subjects and TEL, could be used for reference.

As previously discussed, NCES would welcome ideas for relevant issues and topics that Board members would like the reports to address before work begins in earnest on those reports. Having this discussion at this early stage would enable NCES and its contractor (ETS) to plan to include data, assuming such data were collected, that would shed light on the issues of interest.


Attachment F

Upcoming NAEP Reports as of August 2014

Initial NAEP Releases
• 2014 Puerto Rico (December 2014)
• 2014 Meaning Vocabulary (December 2014)
• 2014 Civics Report Card (May 2015)
• 2014 Geography Report Card (May 2015)
• 2014 U.S. History Report Card (May 2015)
• 2014 Technology & Engineering Literacy Report Card (November 2015)

Other NAEP Reports
• Focus on NAEP: 12th Grade Participation & Engagement (July 2014)
• Focus on NAEP: Sampling (July 2014)
• Performance of Fourth-Grade Students in the 2012 NAEP Computer-Based Writing Pilot Assessment (July 2014)
• Linking NAEP and TIMSS 2011 Mathematics and Science Results for the 8th Grade (Technical Report) (August 2014)
• 2013 School Composition and the Black-White Achievement Gap Report (September 2014)
• Focus on NAEP: Simpson’s Paradox (September 2014)
• NAEP Grade 8 Black Male Students Through the Lens of the National Assessment of Educational Progress (November 2014)
• Focus on NAEP: English Language Learners (December 2014)

International Reports
• Comparative Indicators of Education in the United States and Other G-20 Countries (August 2014)


[Figure: 2014 NCES Assessment Data Release Timeline. The chart plots, by month, the NAEP Report Cards and Other NAEP Reports released in 2014: Focus on NAEP: Grade 12 Participation & Engagement; Focus on NAEP: Sampling; Grade 4 Writing Pilot Website; 2011 Linking NAEP and TIMSS Grade 8 Technical Report; 2013 Achievement Gaps Report; Focus on NAEP: Simpson’s Paradox; Grade 8 Black Male Students Report; Focus on NAEP: English Language Learners; Puerto Rico: 2013 Math Grades 4 & 8; and 2014 Meaning Vocabulary.]


[Figure: 2015 NCES Assessment Data Release Timeline. The chart plots, by month, the NAEP Report Cards scheduled for release in 2015: the 2014 Civics, 2014 Geography, 2014 U.S. History, and 2014 Technology & Engineering Literacy (TEL) Report Cards.]


Releases in 2014
• Focus on NAEP: 12th Grade Participation & Engagement
• Focus on NAEP: Sampling
• Performance of Fourth-Grade Students in the 2012 NAEP Computer-Based Writing Pilot Assessment
• Linking NAEP and TIMSS 2011 Mathematics and Science Results for the 8th Grade (Technical Report)
• 2013 School Composition and the Black-White Achievement Gap Report
• Focus on NAEP: Simpson’s Paradox
• NAEP Grade 8 Black Male Students Through the Lens of the National Assessment of Educational Progress
• 2014 Puerto Rico
• 2014 Meaning Vocabulary
• Focus on NAEP: English Language Learners

Assessment Data Collection Schedule, 2014
• U.S. History: Grade 8
• Civics: Grade 8
• Geography: Grade 8
• Technology and Engineering Literacy: Grade 8


Releases in 2015
• 2014 Civics Report Card
• 2014 Geography Report Card
• 2014 U.S. History Report Card
• 2014 Technology & Engineering Literacy Report Card


Common Core State Standards Assessment Consortia

Introduction

At this meeting, representatives of the PARCC and SBAC Common Core State Standards (CCSS) Assessment Consortia will discuss several topics on which they are working that are of interest to the Governing Board. This discussion will be in a moderated question and answer format. After introductory presentations, an opportunity for questions from Board members will be provided.

The Partnership for Assessment of Readiness for College and Careers (PARCC) will be represented by Jeff Nellhaus, PARCC’s Director of Policy, Research, and Design.

The SMARTER Balanced Assessment Consortium (SBAC) will be represented by Joe Willhoft, SBAC’s Executive Director.

ATTACHMENTS

Attachment A provides background information on the establishment of the two Common Core State Standards Assessment Consortia.

Attachment B is a map showing current consortia states as recently published in the Education Week Curriculum Matters blog by Catherine Gewertz. The complete blog post can be found at http://blogs.edweek.org/edweek/curriculum/2014/06/tennessee_quits_parcc.html.

Attachment C includes the bio for the PARCC presenter, Jeff Nellhaus, and the slides for his presentation.

Attachment D includes the bio for the SBAC presenter, Joe Willhoft, a summary of the recently completed SBAC field test, and a description of the plans for SBAC online Achievement Level Setting, including facts and FAQs.


Attachment A

Background

The Race to the Top Assessment Program, authorized under the American Recovery and Reinvestment Act of 2009 (ARRA), provided funding to consortia of States to develop assessments that are valid, support and inform instruction, provide accurate information about what students know and can do, and measure student achievement against standards designed to ensure that all students gain the knowledge and skills needed to succeed in college and the workplace. These assessments are intended to play a critical role in educational systems; provide administrators, educators, parents, and students with the data and information needed to continuously improve teaching and learning; and help meet the President's goal of restoring, by 2020, the nation's position as the world leader in college graduates.

In September of 2010, the U.S. Department of Education awarded two Comprehensive Assessment Systems grants to the Partnership for Assessment of Readiness for College and Careers (PARCC) Consortium and the Smarter Balanced Assessment Consortium (SBAC). The consortia are to develop and implement assessment systems by the 2014-2015 school year. In addition, PARCC and SBAC were each provided a supplemental grant award to support the work in their approved applications and to successfully transition to the new standards and assessments. Each supplemental award included activities focused on:

• Developing gap analyses between current and new standards; curriculum analysis tools; professional development related to the new standards and assessments, including support for educators to better understand the content of the new standards; and state and local assessment audits to determine which assessments will no longer be needed.

• Enhancing technology to be used in the assessment systems, including assessment delivery.

• Supporting educator understanding and use of assessment results, and other steps needed to build the professional capacity to implement more rigorous common standards.

On January 7, 2011, PARCC and SBAC each entered into a Cooperative Agreement with the Department regarding these grants. The agreement is intended to provide for substantial communication, coordination, and involvement between the Department and the grantee to ensure the success of the grant.


Attachment B Published 6-20-2014


Attachment C

Partnership for Assessment of Readiness for College and Careers (PARCC)

http://www.PARCConline.org

Biography of Jeffrey Nellhaus, PARCC Director of Policy, Research, and Design

Jeffrey Nellhaus serves as the Director of Policy, Research and Design for the PARCC assessment consortium. PARCC (Partnership for the Assessment of Readiness for College and Careers) is one of two state consortia that received Race to the Top Assessment Grants to design and develop next generation student assessment systems based on the Common Core State Standards in English Language Arts & Literacy and Mathematics.

Before joining PARCC, Mr. Nellhaus spent nearly 25 years with the Massachusetts Department of Elementary and Secondary Education (MA DESE), where he held the positions of Deputy Commissioner, Acting Commissioner, and Associate Commissioner for Curriculum and Assessment. While at the MA DESE, Mr. Nellhaus directed the design, development, and implementation of the Massachusetts Comprehensive Assessment System (MCAS), and the development of the Massachusetts Curriculum Frameworks, which include the Common Core State Standards. For his work on MCAS he was awarded the Manuel Carballo Governor’s Award for Excellence in Public Service.

Mr. Nellhaus has served on the National Validity Studies Panel for the National Assessment of Educational Progress (NAEP) and on Technical Advisory Committees for the states of Maine, Kentucky, and Rhode Island. He has also served on the Technical Advisory Committee on Standard Setting for NAEP and on the Growth Model Peer Review Panel for the U.S. Department of Education.

Prior to joining the Massachusetts Department of Education, Mr. Nellhaus was a Peace Corps Volunteer in India, taught chemistry and mathematics in a public high school in Vermont, and directed a federally-funded educational program in Thailand for Cambodian and Laotian refugees preparing to resettle in the U.S.

Mr. Nellhaus holds a B.S. in Chemistry from the University of Massachusetts, an M.S. in Science Teaching from Antioch Graduate School of Education, and an Ed.M. in Administration, Policy and Planning from Harvard Graduate School of Education.

Attachment: PARCC Update for NAGB


Attachment D

SMARTER Balanced Assessment Consortium (SBAC)

http://www.smarterbalanced.org/

Biography of Joe Willhoft, SBAC Executive Director

Mr. Willhoft is the Executive Director of the SMARTER Balanced Assessment Consortium, one of two federally-funded consortia that are developing a new generation of state assessment systems aligned to the Common Core State Standards. Prior to this appointment he was the assistant superintendent for assessment and student information for the state of Washington. His responsibilities included design and implementation of Washington’s assessment program and collection and reporting of student information for the state’s longitudinal data system. Before working at the state level, Joe directed assessment and evaluation activities at the district level for more than twenty years, primarily in Tacoma Public Schools in Washington and in Maryland.

Joe earned his doctorate in educational measurement and statistics from the University of Maryland. He is past president of the Maryland Assessment Group, the Washington Educational Research Association, and the American Educational Research Association Classroom Assessment Special Interest Group. He has been involved in multiple collaborative data and assessment efforts, including having served on technical advisory committees in several states and the Technical Work Group for the most-recent congressionally-mandated evaluation of the National Assessment of Educational Progress (NAEP). He previously served as co-chair of the NAEP Policy Advisory Task Force, a collaborative effort of the National Assessment Governing Board and the Council of Chief State School Officers.

Other Attachments:

SBAC Field Test Report

SBAC Online Panel for Achievement Level Setting


National Assessment Governing Board

Nominations Committee

August 2, 2014

7:30 – 8:15 am

AGENDA

Closed Session 7:30 – 8:15 am

7:30 – 7:35 am

Welcome, Introductions, and Agenda Overview

Tonya Miles, Chair

7:35 – 8:15 am

Status of Nominations for Board Terms Beginning on October 1, 2014 and Planning for 2015 Cycle

Mary Crovo, Governing Board Staff


Inside NAEP: Expanding NAEP Scales to Improve Measurement

The Board has recently heard about efforts that are underway to expand NAEP item pools and scales in order to more accurately measure the knowledge and skills of students across the ability distribution. At the May 2014 Board meeting, NCES provided an update on the Knowledge and Skills Appropriate (KaSA) studies that were intended to more reliably measure the mathematical knowledge and skills of students in Puerto Rico. For the KaSA studies, new items were designed and added to the mathematics item pool that targeted accurate measurement at the lower end of the ability distribution.

The KaSA studies exemplify an approach to targeted item development and administration that NAEP can build upon as the program begins to implement multi-stage adaptive testing designs. Multi-stage testing, or MST, is a test assembly and administration approach in which students are administered a set of items that are targeted to their ability levels. Specifically, based on responses to an initial set of test questions, lower-ability students would receive a set of relatively easier items that measure more accurately at the lower end of the ability distribution and higher-ability students would receive more difficult items that measure more accurately at the higher end. Targeting assessment content in this manner improves the measurement of students’ abilities, as it more precisely homes in on what students know and are able to do. In contrast, administering items that are much too easy or too difficult for a student yields little information about that student’s knowledge and skills.
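To make the routing logic concrete, the following is a minimal sketch of how a two-stage design of this kind could assign second-stage item blocks; the block contents, item counts, and number-correct routing threshold are hypothetical illustrations, not features of NAEP's actual MST design.

```python
# Minimal illustration of two-stage adaptive (multi-stage) routing.
# All block contents and the routing threshold are hypothetical; NAEP's
# operational MST design will differ in its blocks, scoring, and cut points.

ROUTING_BLOCK = ["R1", "R2", "R3", "R4", "R5", "R6"]   # initial items (placeholder IDs)
EASIER_BLOCK  = ["E1", "E2", "E3", "E4", "E5", "E6"]   # targets the lower ability range
HARDER_BLOCK  = ["H1", "H2", "H3", "H4", "H5", "H6"]   # targets the higher ability range

ROUTING_CUT = 4  # hypothetical: route "high" if at least 4 of 6 routing items are correct


def second_stage_block(routing_responses: list[bool]) -> list[str]:
    """Choose the second-stage block from performance on the routing block.

    Students who answer fewer routing items correctly receive the easier block,
    which is more informative for them; stronger students receive the harder one.
    """
    number_correct = sum(routing_responses)
    return HARDER_BLOCK if number_correct >= ROUTING_CUT else EASIER_BLOCK


if __name__ == "__main__":
    # A student who answered 2 of the 6 routing items correctly is routed
    # to the easier block, where measurement is more precise for that student.
    print(second_stage_block([True, False, True, False, False, False]))
```

In an operational design, routing would typically rely on a provisional ability estimate rather than a simple number-correct score, and more than two second-stage blocks could be used.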

The Board raised several questions about the expansion of NAEP scales, including whether administering easier or more difficult items might artificially increase or decrease scores. This presentation will address those questions by providing an overview of how NAEP scales can be expanded to improve measurement without compromising the program’s ability to report valid scores that can be compared and interpreted on a common metric. The presentation will explain how a common scale is established despite the fact that students receive different sets of NAEP items that vary in difficulty. Specific topics to be addressed in the presentation include:

• The basic principles of the NAEP assessment design, including the assembly of test forms.

• How the relationship between students’ ability and the assessment items, including difficulty, is characterized.

• The concepts of test information and measurement error, which tell us how well the assessment is estimating ability at specific levels of the ability distribution (see the illustrative formulas after this list).
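For reference, the formulas below show one standard way these concepts are defined, using the two-parameter logistic (2PL) item response model as a simplified illustration; NAEP's operational scaling models differ in detail, but the logic is the same.

```latex
% Simplified illustration using the 2PL model (a_i = discrimination, b_i = difficulty).
\[
  P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
\]
\[
  I_i(\theta) = a_i^{2}\,P_i(\theta)\bigl(1 - P_i(\theta)\bigr), \qquad
  I(\theta) = \sum_i I_i(\theta), \qquad
  \mathrm{SE}(\theta) = \frac{1}{\sqrt{I(\theta)}}
\]
```

Because the item information I_i peaks when a student's ability is near the item's difficulty, assigning relatively easier blocks to lower-ability students and harder blocks to higher-ability students raises total test information, and therefore precision, exactly where each student is being measured.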


Improving the Contextual Questionnaires for the National Assessment of Educational Progress:

PLANS FOR NAEP CORE CONTEXTUAL MODULES

A WHITE PAPER TO THE NATIONAL ASSESSMENT GOVERNING BOARD

Jonas P. Bertling, Ph.D.

MAY 2014 (DRAFT)


CONTENTS

1 Introduction ..................................................................................................................................................................... 3

2 Overview of Key Factors Relevant to Student Achievement ........................................................................ 6

3 Modules proposed for future Core Questionnaires ....................................................................................... 12

3.1 Socio-Economic Status (Module 1) ............................................................................................................. 14

3.2 Technology Use (Module 2) ........................................................................................................................... 18

3.3 “School Climate” (Module 3) .......................................................................................................................... 21

3.4 “Grit” (Module 4) ................................................................................................................................................ 24

3.5 “Desire for Learning” (Module 5) ................................................................................................................ 26

4 References ....................................................................................................................................................................... 29

NOTE

This white paper is in draft form.


1 Introduction

This memo describes the plans to develop core contextual questionnaire modules for the 2017 National Assessment of Educational Progress (NAEP) technology-based survey questionnaires. Two main goals for this memo are, first, to describe a proposed revised general questionnaire approach that focuses on questionnaire modules and indices in addition to stand-alone questions and, second, to describe five potential modules capturing opportunity to learn and noncognitive student factors relevant to student achievement that are proposed for future NAEP Core survey questionnaires. Evidence from the research literature supporting the selection of these modules will be provided.

We thereby directly address the National Assessment Governing Board’s policy principles laid out in their 2012 policy statement, particularly the principles that “NAEP reporting should be enriched by greater use of contextual data derived from background or non-cognitive questions asked of students, teachers, and schools” (National Assessment Governing Board, 2012, p. 2).

Proposed Revision of General Questionnaire Approach

Historically, NAEP has designed its contextual questionnaires around single questions, and questionnaire results were therefore reported as single questions as well. A revised approach is presented that is more balanced, one that provides a mixture of both breadth and depth of coverage. That is, in addition to single questions that are important for providing context for student achievement, indices based on aggregating data from several questions will add more robust policy-relevant reporting elements to the NAEP survey questionnaires. Indices can be clustered into a number of distinct modules that each focus on a specific area of contextual variables (e.g., socio-economic status). This approach is not entirely new – the existing core questionnaires already contain several questions on multiple topics. In the existing approach, however, no aggregate indices were created for reporting. While additional questions will be needed to capture all modules proposed here, the main difference between the existing and newly proposed approach is aggregating questions into indices that build several modules. This approach directly addresses the National Assessment Governing Board’s call for making better use of the NAEP contextual variables, specifically the first implementation guideline that, “clusters of questions will be developed on important topics of continued interest” (National Assessment Governing Board, 2012, p. 2). Table 1 summarizes the differences between the current and proposed approaches in terms of both questionnaire design and reporting.


Table 1 - Proposed revision of general questionnaire approach

            Current Approach    Proposed Approach
Design      Single questions    Modules of questions and select single questions
Reporting   Single questions    Indices based on multiple questions and select single questions

The proposed modules will comprise multiple questions on the same topic. While this marks a shift

to the approach to questionnaire design in NAEP, the central interest remains the same, that is

assessing topics related to student achievement. The NAEP subject area assessments focus on

measuring what students know and can do. The NAEP survey questionnaires capture relevant

contextual data for evaluating the achievement results that can help educators and policy makers

better understand the circumstances under which learning and instruction take place. In addition,

the proposed modules can add value to the NAEP survey questionnaires by capturing student,

teacher, and school factors that might not only be interpreted as important achievement predictors,

but that may also represent goals of education, and related outcomes, by themselves (see e.g.,

“Defining and Selecting Key Competencies”, Rychen & Salganik, 2003; “Key Education Indicators”,

Smith & Ginsburg, 2013). Enhanced questionnaire designs with questions being spiraled across

multiple forms will be considered for future technology-based assessments, in line with the

National Assessment Governing Board’s implementation guideline that, “whenever feasible,

assessment samples should be divided (spiral sampling) (…) in order to cover more topics without

increasing respondent burden” (National Assessment Governing Board, 2012, p. 3). Spiraling

approaches are the standard practice for the cognitive (subject area) tests in educational large-scale

assessments (Comber & Keeves, 1973; OECD, 2013). Recent research findings suggest that

questionnaire spiraling can substantially increase content coverage of survey questionnaires with

very small to negligible impact on the overall measurement model, including conditioning and

estimation of plausible values (see e.g., Adams, Berezner, & Lietz, 2013; Kaplan & Wu, 2014;

Monseur & Bertling, 2014; Almonte et al., 2014). Different possible spiraling designs for the 2017

NAEP questionnaires are currently being explored.
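To make the spiraling idea concrete, the following minimal sketch (in Python) illustrates one way questionnaire modules could be rotated across booklet forms so that each student answers only a subset while all modules are covered across the sample. The module names, the number of forms, and the modules-per-form value are illustrative assumptions, not the actual 2017 NAEP design.

# --- Illustrative sketch only: rotating ("spiraling") questionnaire modules
# --- across booklet forms. Module names, form count, and modules-per-form are
# --- hypothetical assumptions, not the operational NAEP 2017 design.
from itertools import cycle

MODULES = ["SES", "Technology Use", "School Climate", "Grit", "Desire for Learning"]
MODULES_PER_FORM = 3   # each student sees only 3 of the 5 modules
NUM_FORMS = 5

def build_forms(modules, per_form, num_forms):
    """Rotate the module list across forms; with num_forms equal to the number
    of modules, every module appears in the same number of forms."""
    forms = []
    for f in range(num_forms):
        start = f % len(modules)
        rotated = modules[start:] + modules[:start]
        forms.append(rotated[:per_form])
    return forms

forms = build_forms(MODULES, MODULES_PER_FORM, NUM_FORMS)

# Hand out forms to sampled students in cyclic ("spiral") order.
form_cycle = cycle(range(NUM_FORMS))
students = [f"student_{n:03d}" for n in range(1, 11)]
assignment = {s: forms[next(form_cycle)] for s in students}

for i, form in enumerate(forms, 1):
    print(f"Form {i}: {', '.join(form)}")

A design along these lines keeps per-student burden constant while, across the sample, every module is administered to a comparable number of students.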

The idea of questionnaire indices (or modules) is not new. It is the current practice for other large-

scale assessments and surveys to aggregate multiple questions into scale indices, and analyze

relationships with achievement results and group differences based on these questionnaire indices,

in addition to analyzing responses to single questions.
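To illustrate the aggregation step itself, the minimal sketch below (Python, with hypothetical item names and responses) averages standardized responses to several Likert-type items into one scale index; operational questionnaire indices in NAEP or PISA would typically be derived with IRT scaling rather than a simple mean of z-scores.

# --- Illustrative sketch only: collapsing several Likert-type questions into
# --- one scale index by standardizing each item and averaging. Item names and
# --- responses are hypothetical; operational indices usually rely on IRT scaling.
import statistics

def standardize(values):
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd if sd > 0 else 0.0 for v in values]

# Hypothetical 1-4 Likert responses of five students to three related items.
responses = {
    "belonging_item_1": [3, 4, 2, 4, 1],
    "belonging_item_2": [2, 4, 3, 4, 2],
    "belonging_item_3": [3, 3, 2, 4, 1],
}

z = {item: standardize(vals) for item, vals in responses.items()}
n_students = len(next(iter(responses.values())))

# The reported index for each student is the mean of that student's z-scores.
index = [statistics.mean(z[item][i] for item in responses) for i in range(n_students)]
print([round(x, 2) for x in index])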


Since the year 2000, the Organization for Economic Co-operation and Development’s (OECD)

Programme for International Student Assessment (PISA; e.g., OECD, 2013) has been providing

various questionnaire indices based on a 30 minute student questionnaire, plus additional indices

from a school principal questionnaire, as well as a number of optional questionnaires (e.g.,

Information and Communications Technology (ICT) Familiarity questionnaire) that are

administered in selected countries only. Example indices from PISA 2012 are Attitudes towards

school (4 items), Sense of Belonging (8 items), Perseverance (4 items), Openness for Problem

Solving (4 items), and Mathematics Self-Efficacy (8 items). PISA also includes an index of economic, social, and cultural status that is based on several questionnaire components. With PISA 2012, OECD

introduced several new item formats for increased cross-cultural validity of the derived

questionnaire indices, among them Anchoring Vignettes to adjust Likert type responses (Bertling &

Kyllonen, 2013), Topic Familiarity items with overclaiming correction (Kyllonen & Bertling, 2013),

and Situational Judgment Tests to measure students’ problem solving approaches (Bertling, 2012;

see Kyllonen & Bertling, 2013, for an overview). The International Association for the Evaluation of

Educational Achievement (IEA) follows a very similar approach with their international large-scale

assessments. Both the Trends in International Mathematics and Science Study (TIMSS; e.g., Martin,

Mullis, & Foy, 2008) and the Progress in International Reading Literacy Study (PIRLS; e.g., Foy &

Drucker, 2011) include numerous questionnaire indices. While PISA assesses only 15-year olds,

TIMSS and PIRLS are administered at grades 4 and 8. At both grades, questionnaire indices are

primarily based on matrix questions, i.e., questions that comprise a general item stem plus multiple

sub-items. Example indices from TIMSS are Home Resources for Learning (5 items) and School

Emphasis on Academic Success (5 items). The Gallup Student Poll measures Hope, Engagement, and

Wellbeing of fifth- through twelfth-graders in the United States, with 5 to 8 items per index.

Contextual modules with questionnaire indices can add value to the NAEP survey questionnaires in

several ways. Modules create more robust reporting through aggregating items into indices. Use of

scale indices to describe contextual factors instead of single items is not only beneficial from a

measurement perspective (e.g., indices will minimize wording effects of individual contextual

questions), but will also enhance the relevance of NAEP to policy makers, educators, and

researchers by enriching NAEP reporting and potentially providing trend data on important

noncognitive student factors as well as alternative outcomes of formal and informal education.


2 Overview of Key Factors Relevant to Student Achievement

The NAEP statute requires that contextual factors included in the NAEP survey questionnaires must

be directly related to the appraisal of academic achievement. A simple way to think of student

achievement is as a function of student factors and opportunity to learn factors, and their interplay.

Student factors can be further divided into a student’s cognitive ability and “noncognitive factors”

capturing a student’s attitudes towards school and learning, interest, motivation, self-related

competency beliefs, and other dispositions relevant to learning and achievement. The term

“noncognitive factors” will be described in more detail in the following section.

Opportunity to Learn (OTL) describes whether a student is exposed to opportunities to acquire

relevant knowledge and skills. It was originally defined quite narrowly as whether students had

sufficient time and received adequate instruction to learn (Carroll, 1963; see also Abedi et al., 2006). Several different aspects of the OTL construct have been highlighted since then, broadening the definition of the term. In this memo we use a broad definition of OTL as all

contextual factors that capture the cumulative learning opportunities a student was exposed to at

the time of the assessment. These factors comprise both learning opportunities at school and

informal and formal learning outside of school. Examples for opportunities to learn at school are

exposure to relevant content, access to resources for learning, and exposure to a positive school

climate that encourages learning. Outside of school, a student's family socio-economic background

(SES) and the family academic climate/home academic resources can determine opportunities to

learn. For example, while a student’s mathematical reasoning ability will be a core driver for

performance on a mathematics test, whether or not the student has been exposed to relevant

learning material, has access to the resources needed, and received support for this learning as

needed might play an equally large or even larger role for the student’s success. Student factors and

opportunity to learn factors can interact as students may differ in how they make use of the

opportunities provided, and learning opportunities may help learners develop abilities and shape

their attitudes. Figure 1 shows a graphical illustration of this general model.


Figure 1 – A Simplified Model of Student Achievement. [The figure depicts student factors (Ability and "Noncognitive Factors") and Opportunity to Learn (At School, Outside of School) as contextual variables leading to Achievement as the output variable.]

Note. Contextual variables can be input, process, or outcome variables at the systems level, school level,

classroom level, or individual student level. Complex moderation or mediation pathways are not shown.

This graphical illustration is simplified in several ways: it does not illustrate the multilevel

structure with data sources at different levels (such as system level, school level, classroom level

and individual level variables) and different types of variables (input, process, output) as

distinguished in more complex models, such as the Context-Input-Process-Output (CIPO) model (Purves, 1987; OECD, 2013). It also does not depict the possible pathways of moderation and

mediation that might characterize the interplay between the components shown. In other words,

not all factors depicted in this model necessarily have a direct influence on achievement; effects can be indirect, i.e., mediated through other factors, or variables can act as moderators that affect the relationship between other variables. For instance, noncognitive student factors (e.g., mindset, academic

perseverance) might mediate the relationship between SES and achievement. Moreover,

achievement outputs might take the role of input variables for noncognitive or other student

factors when, for instance, students with higher achievement levels develop stronger

noncognitive factors (for instance, self-efficacy beliefs). In the context of this memo the model can

provide a useful basis for categorizing the different contextual factors relevant to achievement and



aligns with other schematic models proposed in the literature (e.g., Farrington et al., 2012;

Heckman & Kautz, 2013).

Despite the importance of general cognitive ability and content knowledge to student achievement in school, educational, psychological, and econometric research over the past decades has shown that psycho-social variables, or so-called "noncognitive skills" or "noncognitive factors," are of key importance for success in K-12 and beyond (Almlund, Duckworth, Heckman, & Kautz, 2011; Heckman, Stixrud, & Urzua, 2006; Richardson et al., 2012), and that their effects on achievement are in a range comparable to those of cognitive ability (e.g., Poropat, 2009). Success in school and beyond

depends, for instance, on applying effort and being committed to succeed and persist during

adversity, seeing learning as an opportunity, and respecting and understanding others. Related

educational, and especially psychological, research has focused on noncognitive factors for many

years, and numerous theories on the respective constructs have been proposed and investigated.

Economics literature has only recently focused more on noncognitive skills. Here, the increased

interest in these skills can be explained based on studies showing the predictive value of constructs

beyond classical cognitive measures of reading and mathematics for important academic and

workforce-related outcomes. While the term “noncognitive” is currently the most widely used term

to describe student factors outside of those commonly measured by aptitude tests, it might

reinforce a false dichotomy between traditional academic factors and psycho-social variables when,

in fact, almost all aspects of human behavior can be linked to cognition (Borghans, Duckworth,

Heckman, & Weel, 2008). Given its wide use and the current lack of a widely accepted alternative

term, we use “noncognitive factors” here to refer to skills, strategies, attitudes, and behaviors that

are distinct from content knowledge and academic skills, as described by Farrington et al. in their

2012 report for the Consortium on Chicago School Research, "Teaching Adolescents to Become

Learners: The Role of Noncognitive Factors in Shaping School Performance”. Alternative labels that

have been used in the literature are “non-intellectual correlates of GPA” (Richardson et al., 2012),

“Personality” (Heckman et al.) or “incentive enhancing preferences” (e.g., Bowles, Gintis & Osborne,

2000) to describe parameters “that shift the employee’s best response function upward, leading an

employee to work harder at every wage rate and holding all else constant” (p. 4). In the context of

educational large-scale assessments, this definition can be modified to relate to all student factors

that motivate a student to study harder, be more actively engaged in learning, and achieve higher

grades, but also in a broader sense, factors that make a student more successful in education, better

prepared for adult life as a student and/or member of the workforce, and an active citizen,

potentially including factors such as subjective well-being. Most taxonomies of so-called “21st


Century Skills” (e.g., National Academy of Sciences/National Research Council) include

noncognitive factors as well.

The National Assessment Governing Board’s first policy principle in their 2012 Policy Statement on

NAEP Background Questions and the Use of Contextual Data in NAEP Reporting explicitly highlights

the importance of “non-cognitive questions asked of students, teachers, and schools” for enriched

NAEP reporting (National Assessment Governing Board, 2012, p. 1). We propose to include, in

addition to the subject-specific contextual factors, several domain-general noncognitive student

factors in future NAEP questionnaires to broaden the coverage of relevant variables and increase

the policy relevance of the NAEP database and reports.

Several larger literature reviews and meta-analyses have recently highlighted the importance of

noncognitive factors. Richardson et al. (2012) identified 42 noncognitive factors relevant to student

achievement and proposed clustering these into the following five conceptually overlapping, but

distinct, research domains: (1) personality traits, (2) motivational factors, (3) self-regulated

learning strategies, (4) students’ approaches to learning, and (5) psychosocial contextual

influences. Meta-analytical correlations in the range of approximately .20 or larger with Grade Point

Average (GPA) were found for 10 noncognitive factors out of the 42 factors investigated:

Performance self-efficacy, Academic self-efficacy, Grade goal, Effort regulation, Strategic approaches

to learning, Time/study management, Procrastination, Conscientiousness, Test anxiety, and Need

for cognition. Correlations with achievement for these noncognitive factors are in the same range as

the meta-analytical correlation between general intelligence and GPA. When controlling for

cognitive ability, several studies reported conscientiousness to be the strongest

predictor of achievement (O’Connor & Paunonen, 2007; Poropat, 2009), and as a “comparatively

important predictor” (Poropat, 2009, p. 330) in direct comparison with general intelligence. It was

suggested that effort regulation might be the driving force behind these relationships with

achievement (Richardson & Abraham, 2009). Other reviews have drawn similar conclusions

highlighting goal setting and task-specific self-efficacy as the strongest predictors of GPA (Robbins

et al., 2004). A classification of noncognitive factors that seems especially helpful in the context of

NAEP is the recent work by the University of Chicago Consortium on Chicago School Research

(CCSR). The authors of the report suggest a similar, though slightly different, classification of

student success factors compared to the classification suggested by Richardson and others. The five

clusters of success factors identified are: Academic Behaviors, Academic Perseverance, Academic

Mindsets, Learning Strategies, and Social Skills (Farrington et al., 2012). While some of the research

on noncognitive factors (e.g., Heckman & Kautz, 2013; Nyhus & Pons, 2005; O’Connor & Paunonen,


2007; Paunonen & Ashton, 2001; Poropat, 2009; Roberts, Kuncel, Shiner, Caspi, & Goldberg, 2007)

focuses heavily on personality and the so-called Big Five or OCEAN model (Openness,

Conscientiousness, Extraversion, Agreeableness, Neuroticism; Costa & McCrae, 1992; McCrae &

Costa, 1989), which was seen primarily as a set of stable person characteristics in a large part of the

traditional literature, Farrington et al. emphasize the malleability of noncognitive student factors,

and the importance of teaching in fostering noncognitive factors that help students become active

learners who succeed in school. This view is consistent with recent findings from individual

differences researchers providing ample validity evidence for the malleability, amenability to interventions, and lifetime changes of noncognitive factors (e.g., Heckman & Kautz, 2013; Specht, Egloff, & Schmukle, 2011). As Farrington et al. (2012) describe, social investments in the development

of noncognitive factors may “yield payoffs in improved educational outcomes as well as reduced

racial/ethnic and gender disparities in school performance and educational attainment” (p. 5).

Dweck et al. (2011) highlight that educational intervention and initiatives can “have transformative

effects on students’ experience and achievement in school, improving core academic outcomes such

as GPA and test scores months and even years later” (p. 3). Several researchers have described

effective techniques to positively impact noncognitive factors such as self-efficacy beliefs in various

contexts (e.g., Abraham, 2012; Ashford, Edmunds, & French, 2010; Bandura, 1997) and have also

highlighted the specific importance of teachers’ behaviors such as setting grades, providing

constructive feedback and promoting mastery experiences, especially at early grades (Chen et al.,

2000; Lent & Brown, 2006; Stock & Cervone, 1990). Research suggests that performance-focused

interventions show larger expected effects on students’ academic achievement than more general

counseling services (Richardson et al., 2012). Further, the CCSR model aligns well with

multidimensional models of students’ school engagement (Appleton, Christenson, Kim, & Reschly,

2006; Fredricks, Blumenfeld, & Paris, 2004), with the three main engagement components being behavioral engagement, emotional engagement, and cognitive engagement. Academic behaviors

and perseverance relate to behavioral engagement, and academic mindsets and learning strategies

capture cognitive engagement as well as aspects of emotional engagement. The first cluster

described in the CCSR review, Academic behaviors, comprises behaviors such as going to class,

doing homework, organizing materials, participating in class, and studying. Academic perseverance

(cluster 2; also referred to as "grit") is described as "a student's tendency to

complete school assignments in a timely and thorough manner, to the best of one’s ability, despite

distractions, obstacles, or level of challenge. (...) It is the difference between doing the minimal

amount of work to pass a class and putting in long hours to truly master course material and excel


in one’s studies.” (p. 9). Academic perseverance is conceptualized as a direct antecedent to

academic behaviors. Academic mindsets (cluster 3) are described as “the psycho-social attitudes

and beliefs one has about oneself in relation to academic work” (p. 9) and thereby give rise to

academic perseverance. Four key academic mindsets highlighted by Farrington et al. (2012) are (1)

"I belong in this academic community", (2) "My ability and competence grow with my effort", (3) "I

can succeed at this”, and (4) “This work has value for me”. Learning strategies (cluster 4) are

processes or tactics that help students leverage academic behaviors to maximize learning. Four

groups of learning strategies distinguished by Farrington et al. (2012) are: study skills,

metacognitive strategies, self-regulated learning, and goal-setting. Social skills (cluster 5) are

conceptualized as interpersonal qualities that have mostly indirect effects on academic

performance by affecting academic behavior, with key social skills being empathy, cooperation,

assertion, and responsibility (Farrington et al., 2012). Farrington et al. (2012) propose a model “as

a simplified framework for conceptualizing the primary relationships” (p. 13) for how these five

noncognitive factors affect academic performance within a classroom context. In their model,

academic mindsets build the foundation for the emergence of academic perseverance that may

result in academic behaviors which, as a next step, lead to academic performance. While

Farrington et al.'s focus clearly is on noncognitive factors, but their model also includes classroom factors and socio-cultural context factors that provide a foundation for student learning and may shape noncognitive factors. These factors capture the OTL factors described earlier in this section

and illustrated in Figure 1.


3 Modules proposed for future Core Questionnaires

Based on a review of the research literature, as well as a review of approaches for other large-scale

assessments, five potential modules, each comprising related constructs, are suggested for future

core contextual questionnaires. These modules are (1) Socio-Economic Status (SES), (2) Technology

Use, (3) School Climate, (4) Grit, and (5) Desire for Learning1. Modules may differ in their scope, in

terms of the number of questions needed on the questionnaire. SES, Technology Use, and School

Climate will likely comprise variables at multiple levels (e.g., school level, classroom level, and

individual level) and, therefore, be represented by questions across all respondent groups, while

Grit and Desire for Learning are primarily student-level constructs and, therefore, might require

fewer questions. Table 2 shows how these modules fit in with the overall model of student

achievement described in the previous section. Some modules capture variables spanning both

student and OTL factors. Technology use, for instance, includes an ability component (Familiarity

with technology), a noncognitive component (Attitudes towards technology), and an OTL

component (Access to technology).

Main criteria for selecting these modules were the following:

(a) Factors captured in each module should have a clear relationship with student

achievement. Student factors with no clear or low correlations with achievement based

on the published research are discarded from inclusion. This criterion directly refers to

the NAEP statute. Modules with a strong research foundation based on several studies

(ideally, meta-analyses) and established theoretical models will be favored over

modules with less research evidence regarding the relationship with achievement or

modules with a less established theoretical foundation.

(b) Factors captured in each module should be malleable and actionable in terms of possible

interventions in and outside the classroom.

(c) Factors should be amenable to measurement with survey questionnaires. Some of the

factors summarized above (e.g., social skills, learning strategies) might require other

assessment strategies to provide meaningful and reliable measures.

1 In an earlier presentation of potential modules the term "Need for Cognition" (NFC) was used. We suggest using the more general term "Desire for Learning" to replace the previous term, as it is less technical and broader than NFC, with NFC as one possible facet.


(d) Modules suggested for inclusion in the Core Survey Questionnaires should focus on

those student and OTL factors that are domain-general, meaning that they are not

specific to one of the NAEP subject areas but, first, apply equally to all subject area

assessments and, second, cannot be measured better as part of the subject-specific

questionnaires.

These modules also show high alignment with the modules suggested by the National Assessment

Governing Board’s first implementation guideline for questions and questionnaires (“Clusters of

questions will be developed on important topics of continuing interest, such as student motivation

and control over the environment, use of technology, and out-of school learning, which could be

used regularly or rotated across assessment cycles”, National Assessment Governing Board, 2012,

p. 2) as well as the “Key Education Indicators” (KEI) suggested by Smith and Ginsburg (2013).

Technology was suggested as one module and is also proposed in this memo. Motivation was

suggested as a module and is captured by the two proposed modules of Grit and Desire for Learning

in this memo. Grit captures predominantly students’ motivation to work hard, apply effort, and self-

regulate their learning. Desire for learning captures intrinsic motives and general learning

motivation. Out of school activities play a role in several modules, but are primarily covered in the

Technology Use module. Out of school activities related to specific subject-areas are suggested for

inclusion in the subject-specific questionnaires, which is in line with current NAEP practices. The

Technology and Engineering Literacy (TEL) and Science survey questionnaires, for instance,

include several questions specifically targeted at learning opportunities and activities outside of

school. School climate was suggested as one KEI and is captured in this memo.

Several important noncognitive and OTL factors are not suggested as possible modules for the core

questionnaires as they can be better measured if questions are contextualized within the subject-

area questionnaires. This applies, for instance, to self-efficacy, self-concept, confidence, and interest,

or to OTL factors such as availability of resources for learning and instruction, and curriculum

content. Contextual factors specific to a NAEP subject area are proposed to be measured via the

subject-specific questionnaires, in line with current NAEP practices. Table 2 lists not only the

suggested domain-general modules, but also examples for the domain-specific indicators that are

considered for future survey questionnaires. For each subject area, an Issues Paper (not part of this

document) further lays out the contextual variables relevant to each subject area and the subject-

specific questionnaires. In the following section, the proposed modules will be described in more

detail.


Table 2 – Overview of integration of suggested modules with achievement model; numbers in parentheses

indicate the five modules (1: SES, 2: Technology Use, 3: School Climate; 4: Grit; 5: Desire for Learning).

Foundational Skills/Abilities
  Domain-general* (Core Questionnaires):
    • Familiarity with Technology (2)
  Domain-specific** (Subject Area Questionnaires):
    • Learning Strategies

Noncognitive Student Factors
  Domain-general* (Core Questionnaires):
    • Grit (4), including:
      o Perseverance
      o Passion for long-term goals
      o Effort regulation, self-control, Procrastination (-)
    • Desire for Learning (5), including:
      o Need for Cognition
      o Curiosity
      o Openness
    • Attitudes towards Technology (2)
  Domain-specific** (Subject Area Questionnaires):
    • Self-Efficacy
    • Self-Concept
    • Confidence
    • Interest
    • Achievement Motivation, Grade Goal
    • Locus of Control

Opportunity to Learn (OTL) – At School
  Domain-general* (Core Questionnaires):
    • Access to Technology (2)
    • School Climate (3), including:
      o Physical and emotional safety
      o Teaching and learning
      o Interpersonal relationships
      o Institutional environment
  Domain-specific** (Subject Area Questionnaires):
    • Resources for Learning and Instruction
    • Organization of Instruction
    • Teacher Preparation

Opportunity to Learn (OTL) – Outside of School
  Domain-general* (Core Questionnaires):
    • Socio-Economic Status (1), key components:
      o Home Possessions (including access to technology (2) and family academic resources)
      o Parental Education
      o Parental Occupation
  Domain-specific** (Subject Area Questionnaires):
    • Out of school educational opportunities

Note. *Basic student background characteristics, such as race/ethnicity, are not included in this

overview table; **This list of domain-specific indicators is not exhaustive; domain-specific

contextual factors are described in the Issues Papers for each subject area.

3.1 Socio-Economic Status (Module 1)

Socio-economic status (SES) is a legislatively mandated reporting category in NAEP and questions

about SES have been included in all past NAEP survey questionnaires. Along with background

variables such as gender, age, and race/ethnicity, SES-related variables are also among the standard

questions and reporting categories in other large-scale assessments by OECD and IEA (e.g., PISA,

TIMSS).


SES has been described as an individual’s access to resources for meeting needs (Cowan & Sellman,

2008), the social standing or class of an individual or group, or as a gradient that reveals inequities

in access to and distribution of resources (American Psychological Association, 2007). The first

research on SES emerged in the 1920s when Taussig (1920) analyzed father’s occupational status

and observed that students of families with low income or low-status jobs demonstrated lower

achievement in school. Sims (1927) and Cuff (1934) took a more comprehensive approach using a

score card consisting of 23 survey questions that also included home possessions (e.g., books), rooms in

the home, cultural activities, and parents’ educational attainment. Since then multiple approaches

to SES have been taken, and more complex statistical models were applied (e.g., Ganzeboom et al.,

1992; Hauser & Warren, 1997). Two large meta-analyses of studies published before 1980 (White,

1982) and between 1990 and 2000 (Sirin, 2005) consistently demonstrated medium to strong

relationships between SES and achievement, and further showed that parental educational

attainment was the most commonly used measure for SES, followed by occupational status and

family income. Sirin (2005) suggested six categories to group indicators of SES (numbers in

parentheses denote the number of studies identified by Sirin): parental educational attainment (30

studies), parental occupational status (15 studies), family income (14 studies), free or reduced-

price lunch (10 studies), neighborhood (6 studies), and home resources (4 studies). OECD reports

an Index of Economic, Social, and Cultural Status (ESCS) in its PISA reports that is based on three

main components: the highest parental education (indicated as the educational attainment of the

parent with the higher educational attainment; classified using the ISCED coding), the highest

parental occupation (indicated as the occupational status of the parent with the higher occupational

status; classified using the ISCO coding), and an index of home possessions (derived as a composite

of approximately 20 items about various wealth possessions, cultural possessions, and home

educational resources, plus a measure of the total number of books in the home). While different

studies have taken slightly different approaches to the measurement of SES, a common element

across the various definitions and measurement approaches for SES is the distinction of the so-

called “Big 3” components: education, income, and occupation (APA, 2007; Cowan & Sellman, 2008;

OECD, 2013). In 2012, NCES created an Expert Panel that completed a white paper entitled,

Improving the Measurement of Socioeconomic Status for the National Assessment of Educational

Progress: A Theoretical Foundation.2 Based on a comprehensive review and analysis of the literature

2 The SES Expert Panel White Paper is available at http://nces.ed.gov/nationsreportcard/pdf/researchcenter/Socioeconomic_Factors.pdf


the NAEP SES Expert Panel (2012) suggested the following consensus definition that is adapted for

this memo:

“SES can be defined broadly as one’s access to financial, social, cultural, and human capital

resources. Traditionally a student’s SES has included, as components, parental educational

attainment, parental occupational status, and household or family income, with appropriate

adjustment for household or family composition. An expanded SES measure could include

measures of additional household, neighborhood, and school resources.” (p. 14)

SES indicators can be defined at different levels, with the systems level (e.g., the general wealth of

an economy and spending on education), school level (e.g., a school’s funding situation and the

availability and quality of educational resources), and individual level (e.g., home possessions)

being three key levels described in the literature (e.g., OECD, 2013). An example for another level is

neighborhood SES. Studies often compare socio-economically advantaged with disadvantaged

students. OECD considers students socio-economically advantaged if their ESCS index falls into the

top quartile (i.e., the top 25 percent) in their country or economy, and socio-economically

disadvantaged if their ESCS falls into the bottom quartile (OECD, 2013). That is, the

definition of being advantaged or disadvantaged is, ultimately, relative to a reference population.
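A minimal sketch of this quartile rule, using hypothetical ESCS values and Python's statistics module, is shown below; the cut points are always computed within the reference population in question, and the boundary handling here is a simplification.

# --- Illustrative sketch only: labeling students as socio-economically
# --- advantaged (top quartile) or disadvantaged (bottom quartile) relative to
# --- the ESCS distribution of a reference population. ESCS values are hypothetical.
import statistics

def classify_by_quartile(escs_values):
    q1, _, q3 = statistics.quantiles(escs_values, n=4)  # quartile cut points
    labels = []
    for v in escs_values:
        if v >= q3:
            labels.append("advantaged (top quartile)")
        elif v <= q1:
            labels.append("disadvantaged (bottom quartile)")
        else:
            labels.append("middle half")
    return labels

escs = [-1.2, -0.4, 0.1, 0.6, 1.3, -0.9, 0.3, 1.8]
for value, label in zip(escs, classify_by_quartile(escs)):
    print(f"{value:5.1f}  {label}")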

The relationship between SES and student achievement has been well documented in the research

literature (Bryant, Glazer, Hansen, & Kursch, 1974; Coleman et al., 1966; Cowan & Sellman, 2008;

Cuff, 1934; Harwell & Holley, 1916; Kieffer, 2012; LeBeau, 2010; Lynd & Lynd, 1929; Singh, 2013;

Sirin, 2005; White, 1982). This relationship can go in both directions. SES determines students'

opportunity to learn and what skills they acquire, and the distribution of skills across the

population can have significant implications for the distribution of economic and social outcomes

within societies (OECD, 2013). Data from OECD’s Survey of Adult Skills (PIAAC), for instance, shows

that individuals with literacy scores on the highest level are “almost three times as likely to enjoy

higher wages than those scoring at the lowest levels, and those with low literacy skills are also

more than twice as likely to be unemployed” (OECD, 2013, p. 26). Recursive models and more

complex path models have been proposed to explain the observed relationships with achievement

based on additional variables such as personal aspirations, peer effects, cultural and social capital,

and variables concerning home academic climate and cognitively challenging home environments

(e.g., Blau & Duncan, 1967; Reynolds & Walberg, 1992; Spaeth, 1976; Levin & Belfield, 2002;

Coleman, 1988).

The availability of SES as a contextual variable enables researchers and policy makers to study

educational equity and fairness issues, making the existence of a reliable and valid SES measure an


important indicator that can help monitor achievement gaps. PISA 2012 results indicate that

socio-economic status strongly relates to achievement (“Socio-economically advantaged students

and school tend to outscore their disadvantaged peers by larger margins than between any other

two groups of students”, OECD, 2012, p. 34). At the same time, the socio-economic gradient (defined

as the relationship between SES and performance, OECD, 2013) can be altered by policies targeted

at increasing educational equity. PISA results show, for instance, that increasing educational equity

goes along with increased achievement overall in a majority of countries (OECD, 2013). SES is further an important covariate for examining the effects of other variables on achievement, and serves as a matching variable in educational intervention studies (NAEP SES Expert Panel, 2012).
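Because the gradient is simply the slope of a regression of achievement on an SES index, it can be sketched in a few lines of Python; the SES and score values below are hypothetical.

# --- Illustrative sketch only: the "socio-economic gradient" as the slope of a
# --- simple linear regression of achievement scores on an SES index. All data
# --- points are hypothetical.
import statistics

ses    = [-1.5, -0.8, -0.2, 0.0, 0.4, 0.9, 1.6]
scores = [238.0, 245.0, 252.0, 255.0, 261.0, 268.0, 279.0]

def gradient(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

slope = gradient(ses, scores)
intercept = statistics.mean(scores) - slope * statistics.mean(ses)
print(f"gradient: {slope:.1f} score points per SES unit (intercept {intercept:.1f})")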

Current NAEP practice is to measure SES through a set of proxy variables that only partly capture

the “Big 3” components. Out of the three main components of SES, education, occupation, and

income, NAEP currently assesses parental education (based on student reported data) and

household income via several proxy variables including books in the home, household possessions

(both student reported), and school reported eligibility for the National School Lunch Program

(NSLP; 2008), as well as Title 1 status. For reporting purposes, all of these are treated as individual

variables, rather than as a composite index similar to the index of economic, social, and cultural

status (ESCS) that is reported by OECD based on PISA.

After reviewing the current SES indicators used in NAEP, the NAEP SES Expert Panel (2012)

concluded with four key recommendations for future SES developments in NAEP: First, developing

a core SES measure based on the “Big 3” indicators (family income, parental educational

attainment, and parental occupational status); second, considering development of an expanded SES measure, which could include neighborhood and school SES variables; third, focusing on SES composite measures rather than relying on single proxy measures; and fourth, exploring possibilities

of using data from the U.S. Census Bureau, such as the American Community Survey (ACS), to link

to NAEP. Similar suggestions had been made earlier, particularly to create a composite measure

rather than relying on single proxy measures (Barton, 2003), and to use data linked from other

sources, such as the U.S. Census, to provide more accurate data on income, parental educational

attainment, and parental occupation (Hauser & Andrew, 2007).
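As a minimal sketch of a composite along these lines, the Python example below standardizes hypothetical values for the "Big 3" components and averages them; an operational NAEP index would require validated codings (e.g., ISCED/ISCO) and a formal scaling model rather than this shortcut.

# --- Illustrative sketch only: a composite SES measure built from the "Big 3"
# --- components (parental education, parental occupation, family income) by
# --- z-standardizing each component and averaging. Values and variable names
# --- are hypothetical; an operational index would use validated codings.
import statistics

students = [
    {"parent_education_years": 16, "occupation_status": 65, "family_income": 82000},
    {"parent_education_years": 12, "occupation_status": 40, "family_income": 35000},
    {"parent_education_years": 14, "occupation_status": 52, "family_income": 54000},
    {"parent_education_years": 18, "occupation_status": 78, "family_income": 110000},
]

def zscores(values):
    m, sd = statistics.mean(values), statistics.pstdev(values)
    return [(v - m) / sd for v in values]

components = {key: zscores([s[key] for s in students]) for key in students[0]}
ses_composite = [
    statistics.mean(components[key][i] for key in components)
    for i in range(len(students))
]
print([round(x, 2) for x in ses_composite])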

At the current stage of item development for the 2017 technology-based core survey

questionnaires, main considerations for future development are the design of parental occupation

questions and a possible update of existing questions on both household income and education. In

this context, we are pursuing a potential link between NAEP and Census that will allow us to obtain

SES-related information without increasing student burden. A special study will be conducted in


2015 to link NAEP with the Early Childhood Longitudinal Study (ECLS) for grade 4 students. A short

supplemental questionnaire will be administered to all ECLS students, including new questions on

parental education and parental occupation. Furthermore, re-evaluating the validity of the NSLP

measure and some of the key traditional SES questions, such as the number of books in the home, is

a priority for future development. In particular, the availability of digital technologies has changed

the use of physical books and created new alternative indicators of wealth.

With the 2017 Core Survey Questionnaires we attempt to present an SES composite index that captures the "Big 3" components of SES and adds value relative to OECD's ESCS index by improving the validity of the parental education and occupation measures and, if feasible, by combining student-reported data with other data sources in creating the index. These plans directly address the

National Assessment Governing Board’s implementation guideline that, “The development and use

of improved measures of socio-economic status (SES) will be accelerated, including further

exploration of an SES index for NAEP reporting” (National Assessment Governing Board, 2012, p.

3).

In addition, we attempt to further explore creation of an extended SES measure that might also

include psychological variables (such as coping mechanisms and perceptions of the environment; see also SES Expert Panel, 2012) and potentially a subjective SES measure. In doing so we respond to

the NAEP SES Expert Panel’s recommendation that, “psychological variables and some subjective

measures of SES may be useful contextual and potentially explanatory variables that could help

interpret NAEP scores.” (NAEP SES Expert Panel, 2012, p. 17). Such an extension would correspond

to an SES model with an emphasis on social gradients and individuals’ positions relative to others

that was described by the American Psychological Association Task Force on Socioeconomic Status

as a potential alternative to the traditional materialist SES model (APA, 2007a).

3.2 Technology Use (Module 2)

Over the next few years, NAEP will fully transition from paper-and-pencil assessments to

technology-based assessments (TBAs). This represents not only a change in administration format,

but also signals the introduction of potentially new and expansive content in the subject area

assessments that reflect the way students are being prepared for post-secondary technology-rich

environments. Teaching and learning in and outside of the classroom increasingly involve using a

variety of digital technologies, such as internet resources, laptops, tablets, and smart phones.


As all NAEP assessments move to technology-based delivery, discerning to what extent students

have access to digital technology, are familiar with it, and whether students have positive attitudes

regarding the use of technology for learning, is especially important. Thus far, two NAEP

assessments, namely the 2011 Writing assessment and the 2014 TEL assessment, have been

administered via computers. When one examines the contextual variables from these assessments

that were designed to measure previous access and exposure to computers, there is only a single

contextual item measuring computer access that is common to both assessments – “Is there a

computer in your home?” There are no common items that measure familiarity with computers or

other relevant technologies across the assessments. With this suggested module, the intent is to

develop a set of indicators that help evaluate and monitor over time how prepared students are, in

a narrow sense, to take a technology-based assessment and, more generally, to deal with digital

technologies in their everyday life, both at school and outside of school. Self-efficacy regarding

major use cases of computer software in and outside the classroom, as well as keyboarding skills,

will be considered as part of this module as well.

The literature shows that access to technology at school and outside of school is linked to student

achievement (Clements, 1999; Clements and Sarama, 2003; Salerno, 1995). For example, studies

find that access to technology in the home is linked with improved achievement in mathematics and

reading (Espinosa, Laffrey, Whittaker, & Sheng, 2006; Hess & McGarvey, 1987), as well as other

achievement indicators such as graduating from high school (Fairlie, 2005). Specifically, Fairlie

(2005) found that children who had access to a computer at home were more likely to graduate

from high school. Researchers also find that access to technology at school is positively related to achievement; that is, students who have access to technology at school tend to demonstrate

higher levels of achievement (Lowther, Ross, & Morrison, 2003; Mackinnon & Vibert, 2002; Siegle &

Foster, 2001). Interestingly, Lowther et al. (2003) also found that in addition to general access to

technology, student achievement is also influenced by whether students have their own laptop or

have to share a computer with other classmates. Specifically, these authors found that students who had access to their own laptop in the classroom were more likely to have higher Problem-

Solving, Science, and Writing scores than students who had access to shared classroom computers.

One encouraging finding shows that at-risk students attending a school where a 1:1 laptop program

is implemented (i.e., one laptop is provided to each student) demonstrate the highest gains in

Writing (Zheng, Warschauer, & Farkas, 2013).

While access to technology does have several educational implications, most notably on student

achievement, the literature also shows that familiarity with technology (i.e., knowing how to access


and search the Internet, use functions in Word, Excel, etc.) is crucial to student academic success

(Cuban, Kirkpatrick, & Peck, 2001) and shapes students' attitudes about technology (Peck, Cuban, &

Kirkpatrick, 2002). Familiarity with technology, often referred to as computer literacy, technology

literacy, or information and communications technology (ICT) literacy (i.e., knowledge about computers and

other related technology), encompasses a wide range of skills from basic knowledge/skills such as

starting a computer, opening software programs (e.g., Word or Excel) or opening a web browser

(e.g., Internet Explorer) to more advanced skills such as advanced programming.

OECD conceptualizes ICT literacy as the “availability and use of information and communications

technology (ICT), including where ICT is mostly used, as well as on the students’ ability to carry out

computer tasks and their attitudes towards computer use” (OECD, 2009). ICT literacy is considered

within the context of the home and at school; for example, the 2009 ICT questionnaire included

items related to devices available to students, activities, or tasks that students complete (e.g., home:

“Download music, films, games or software from the Internet”; school: “Post your work on the

school’s website”). In PISA, the importance of ICT literacy for learning and instruction is reflected

by a special questionnaire for students that is administered in addition to the regular student

questionnaire in a growing number of countries (45 countries in 2009). The optional ICT

questionnaire includes socio-economic factors (e.g., access to technology devices at home and

technology equipment at school), familiarity with specific tasks (e.g., using a spreadsheet or

creating a presentation), and attitudes towards computers (e.g., “it is very important to me to work

with a computer”) (OECD, 2009). Students who were more confident in their ability to perform

routine ICT tasks (e.g., open a file or save a file) and Internet tasks (e.g., browse the internet or use

email) also tended to demonstrate higher levels of mathematics and reading proficiency (OECD,

2003; 2010). PISA also includes questions in the school principal questionnaire asking about the

availability of computers in schools and whether principals experience a shortage of computers

that might negatively impact instruction in their school (OECD, 2010).

In line with these results, other studies such as Cuban et al. (2001) and Peck et al. (2002) found that

increased technology literacy is positively associated with several non-cognitive factors such as

self-confidence and motivation to excel in school. Similarly, another study found that students who

have access to and use technology also report higher participation rates in class, more interest in

learning, and greater motivation to do well in class (Trimmel & Bachmann, 2004). In addition,

students also believe that the use of laptops, and technology in general, positively affects their study

habits and general academic learning (Demb, Erickson, & Hawkins-Wilding, 2004).


3.3 “School Climate” (Module 3)

School climate is a concept that captures a variety of experiences from the learning environment. It

is best thought of as a multidimensional construct. School climate refers to the quality and

character of school life. It sets the tone for all the learning and teaching done in the school

environment (National School Climate Center, 2013) and thereby also represents an important

opportunity to learn factor. School climate not only sets the tone for learning and teaching in the

school, but may also relate to student subjective well-being (defined as "people's experiences of

their lives as desirable”, Diener and William, 2006, p. 28) and happiness at school. The Gallup

Student Poll, for instance, includes a set of questions addressing student well-being. Several studies

demonstrated the strong impact that a student’s well-being and sense of belonging in a school or

classroom can have on achievement (Battistich, Solomon, Kim, Watson, & Schaps, 1995; Cohen &

Garcia, 2008; Furrer & Skinner, 2003; Goodenow, 1992; Goodenow & Grady, 1993; McMillan &

Chavis, 1986; Ryan & Deci, 2000; Solomon, Watson, Battistich, Schaps, & Delucchi; 1996; Wentzel &

Asher, 1995; Wentzel & Caldwell, 1997). In particular, the feeling of being part of a school or classroom community can have considerable psychological benefits for students and make them more likely to engage in productive academic behaviors. School climate can have an impact on

students’ academic mindsets and thereby, indirectly, impact academic perseverance and behaviors

(Farrington et al., 2012).

The literature suggests some common areas to address with any school climate measure (e.g.

Clifford, Menon, Condon, and Hornung, 2012; Cohen et al. 2013; Haggerty, Elgin, and Woodley,

2010; Voight and Hanson, 2012). One of the latest reviews by Cohen et al. (2013) identifies four

areas of focus: safety (emotional and physical), teaching and learning, interpersonal relationships,

and the institutional environment. The various sub-dimensions for these four areas are discussed

below.

Safety includes the sub-dimensions of rules and norms, sense of physical security, and sense of

social-emotional support. Rules and norms are measured by indicators of how clearly rules about

physical violence, verbal abuse, harassment, and teasing are communicated and enforced (e.g.,

“Rules in this school are made clear to students”). Sense of physical security refers to a sense that

students and adults feel safe from physical harm in the school (e.g. “Students feel safe in this

school"). Sense of social-emotional security is measured by indicators of students feeling safe

from verbal abuse, teasing, and exclusion (e.g. “Students left me out of things to make me feel


badly”). The contextual questionnaires in TIMSS and PIRLS, for instance, include a scale that

captures whether students feel that they are bullied at school.

Teaching and learning includes the sub-dimensions of support for learning, and social and civic

learning. Support for learning includes indicators of several different types of teaching practices

that provide varied opportunities for learning, encourage students to take risks, offer constructive

feedback, and foster an atmosphere conducive to academic challenge (e.g. “My teachers will always

listen to students' ideas”). Social and civic learning is measured by indicators of civic knowledge,

skills, and dispositions such as effective listening, conflict resolution, and ethical decision making

(e.g. “I can always find a way to help people end arguments”).

Interpersonal relationships include the sub-dimensions of respect for diversity, social support from

adults, and social support among students. Respect for diversity is measured by indicators of

mutual respect for individual differences at all levels of the school (e.g. “Students respect those of

other races”). Social support from adults is measured by indicators of supportive relationships

between adults and students, high expectations for student success, willingness to listen to

students, and personal concern for students (e.g. “Adults who work in this school care about

students”). Social support among students refers to the level of peer relationship or friendship

between students (e.g. “Students are friendly with each other”).

Institutional environment includes the sub-dimensions of school connectedness or engagement and

physical surroundings. School connectedness or engagement refers to whether the students

positively identify with the school and the norms for broad participation in school life (e.g. “I am

happy to be at this school”). The physical surroundings sub-dimension refers to how appealing the

school's facilities are and whether the school has adequate resources and materials (e.g. "This

school has clean and well–maintained facilities and property.”)
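As an illustration of how such sub-dimensions could be summarized, the sketch below (Python) averages hypothetical student item responses into sub-dimension scores and then into a simple school-level climate profile; the item names, response scale, and equal weighting are assumptions for illustration, not a validated school climate instrument.

# --- Illustrative sketch only: summarizing student responses (1-4 agreement
# --- scale) into school-climate sub-dimension scores and a school-level
# --- profile. Item names, scale, and equal weighting are hypothetical.
import statistics

SUBDIMENSIONS = {
    "safety": ["rules_clear", "feel_safe"],
    "teaching_and_learning": ["teachers_listen"],
    "interpersonal_relationships": ["adults_care", "students_friendly"],
    "institutional_environment": ["happy_at_school"],
}

# Hypothetical responses from three students at one school.
student_responses = [
    {"rules_clear": 3, "feel_safe": 4, "teachers_listen": 3,
     "adults_care": 4, "students_friendly": 3, "happy_at_school": 4},
    {"rules_clear": 2, "feel_safe": 3, "teachers_listen": 2,
     "adults_care": 3, "students_friendly": 2, "happy_at_school": 2},
    {"rules_clear": 4, "feel_safe": 4, "teachers_listen": 4,
     "adults_care": 4, "students_friendly": 4, "happy_at_school": 3},
]

def student_scores(responses):
    """Mean item response per sub-dimension for one student."""
    return {dim: statistics.mean(responses[item] for item in items)
            for dim, items in SUBDIMENSIONS.items()}

# School-level profile: average the per-student sub-dimension scores.
per_student = [student_scores(r) for r in student_responses]
school_profile = {dim: round(statistics.mean(s[dim] for s in per_student), 2)
                  for dim in SUBDIMENSIONS}
print(school_profile)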

A great deal of research on school climate has been conducted in the United States at the national

level. The School Climate Surveys (SCLS) will pilot new questionnaires with middle and high

schools in 2015. Longitudinal surveys (such as the Early Childhood Longitudinal Study, ECLS-K)

include measures of school climate on their student, teacher, and school administrator survey

instruments. State-wide surveys are also common. States such as Alaska, California and Delaware

have undertaken item development efforts to develop their own surveys of school climate

(American Institutes of Research, 2011; Bear & Yang, 2011; Hanson, 2011). The PISA student

questionnaire includes several measures of school climate, such as Student-Teacher-Relations, Sense

of Belonging, and Disciplinary Climate that have been consistently used in the survey since 2000.


PIRLS and TIMSS report several indices related to school climate as well (e.g., Students Bullied at

School Scale; School Discipline and Safety Scale). Finally, there are nonprofit organizations such as

the National School Climate Center (http://www.schoolclimate.org) and the Center for the Study of

School Climate (http://www.schoolclimatesurvey.com) that assist schools with assessing school

climate and developing strategies for improving it at their school. Item development for the

proposed school climate module will consider using existing questions from other surveys where

appropriate to further strengthen the linkage between NAEP and other large-scale assessments and

surveys, as called for in the National Assessment Governing Board’s implementation guidelines for

future survey questionnaires (“NAEP will include background questions from international

assessments, such as PISA and TIMSS, to obtain direct comparison of states and TUDA districts to

educational practices in other countries”, National Assessment Governing Board, 2012, p.3).

Research has shown a relationship between several of the sub-dimensions of school climate and

student achievement. Information on school-level factors which help improve schools, and thereby

also positively affect student learning, is of high policy relevance. A positive school climate creates

an environment that is conducive to student learning and achievement. A positive school climate has been shown to increase students' motivation to learn (Eccles et al., 1993). It has also been

shown to moderate the impact of socioeconomic context on academic success (Astor, Benebnisty,

and Estrada, 2009).

There has been research showing that each of the sub-dimensions of school climate affects student achievement. In the area of safety, schools without supportive norms, structures, and relationships are more likely to experience violence and victimization, which are often associated with reduced academic achievement (Astor, Guerra, and Van Acker, 2010). The relationships that a student encounters at all levels in school also have an effect on student achievement. Students' perceptions of teacher-student support and student-student support are positively associated with GPA (Jia et al., 2009). The student-teacher relationship, even as early as kindergarten, portends future academic success (Hamre and Pianta, 2001). Positive perceptions of the racial climate in a school are also associated with higher student achievement, while a negative racial climate can negatively influence college preparation (Griffin and Allen, 2006).

Perhaps some of the strongest predictors of achievement related to school climate concern the teaching and learning practices in a school. Several correlational studies have shown a positive relationship between school climate in this area and academic achievement in elementary school (Sterbinsky, Ross, and Redfield, 2006), middle school (Brand, Felner, Shim, Seitsinger, and Dumas, 2003), and high school (Stewart, 2008). Research shows that positive school climate not only

contributes to immediate student achievement but also has effects that endure for years (Hoy, Hannum, and Tschannen-Moran, 1998). Specific types of social and civic learning practices have been shown to be related to higher achievement. For example, evidence-based character education programs are associated with higher achievement scores for elementary students. One meta-analysis of 700 positive youth development, social-emotional learning, and character education programs found that socio-emotional learning led to a gain of 11-17 percentile points in achievement (Payton et al., 2008). There is also research suggesting that the institutional environment is related to achievement. School connectedness or engagement has been shown to be predictive of academic outcomes (Ruus et al., 2007).

A school climate measure for NAEP should take into account the major focus areas and sub-dimensions reviewed above. Future NAEP contextual questionnaires will need to concentrate on a selection of the most important sub-dimensions, and different respondent groups might be more appropriate for measuring different sub-dimensions.

3.4 “Grit” (Module 4)

One key finding from the research literature reviewed in the previous section is that academic perseverance is one of the strongest predictors of achievement. This module focuses not only on academic perseverance but combines perseverance with other, related factors that are subsumed under the construct “Grit”. Grit is defined as perseverance and passion for long-term goals (Duckworth, Peterson, Matthews, and Kelly, 2007). Grit can contribute to understanding student achievement beyond variables related to SES and other OTL factors. It is related to conscientiousness, defined as the degree to which a person is hard working, dependable, and detail oriented (Berry et al., 2007), but focuses on the facets of perseverance, industriousness, self-control, and (negatively) procrastination, which are among the facets most strongly related to achievement (e.g., Barrick, Stewart, and Piotrowski, 2002). Students’ persistence even on difficult tasks (perseverance; e.g., not putting off difficult problems, not giving up easily), general work ethic (industriousness; e.g., preparing for class, working consistently throughout the school year), and a low level of procrastination are not only among the strongest noncognitive predictors of GPA (Richardson et al., 2012) but are also important predictors of success in higher education and the workforce in general (e.g., Heckman, Stixrud, & Urzua, 2006; Lindqvist & Vestman, 2011; Poropat, 2009; Roberts et al., 2007).

Meta-analyses (e.g., Poropat, 2009) have shown that perseverance and related personal characteristics predict educational success to a degree comparable to cognitive ability measures. In other words, a prediction of a person’s educational outcomes, such as GPA, based on a score

reflecting the person’s level of perseverance is about as accurate as a prediction of the same

outcome based on a person’s IQ.

Grit goes beyond what is captured with these conscientiousness facets by including the capacity to

sustain both the effort and interest in projects that take months or even longer to complete. Grit is a

noncognitive factor that may explain why some individuals accomplish more than others of equal

intellectual ability. Early psychologists recognized that there are certain factors that influence how

individuals utilize their abilities. William James suggested that psychologists should study both the

different types of human abilities and the means by which individuals utilize these abilities (James,

1907). Galton studied the biographical information of a number of eminent individuals and

concluded that high achievers had “ability combined with zeal and with capacity for hard labor”

(Galton, 1892). There are also more recent examples in modern psychology that demonstrate

renewed interest in the trait of perseverance (Peterson and Seligman, 2004). Howe (1999) studied

the biographical details of geniuses such as Einstein and Darwin and concluded that perseverance

must be as important as intelligence in predicting achievement. Similarly, Ericsson and Charness (1994) found that in chess, sports, music, and the visual arts, dedicated or deliberate practice was an important predictor of individual differences in performance. Interestingly, these studies suggest that perseverance predicts achievement over and above the contribution of intelligence.

Grit is related to some of the Big Five personality traits. In particular, it shares some commonality

with the trait of conscientiousness. In contrast to conscientiousness, however, grit focuses on long-

term endurance. Grit may also be similar in certain aspects to an individual’s “need for

achievement” (McClelland, 1961). Need for achievement refers to an individual’s drive to complete manageable goals that provide immediate feedback on performance. While the idea of

working towards a goal may be similar between need for achievement and grit, individuals high in

grit are more likely to set long-term goals and continue to pursue these goals even without any

positive feedback.

Grit has been measured in different settings, with both children and adults, and similar measuring instruments are available for both groups. The questionnaire has been administered both on the Web and by pencil and paper. The studies used to validate the measure were conducted on a variety of populations (Duckworth, Peterson, Matthews, and Kelly, 2007; Duckworth and Quinn, 2009), including visitors to a website providing free information about psychological research, undergraduate students majoring in psychology, incoming United States Army cadets, and children ages 7-15 participating in a national spelling bee. Grit is highly relevant to NAEP as a noncognitive factor

that explains individual differences in achievement. Students higher in grit may develop different study habits that allow them to use more of their intellectual ability than other students with similar levels of intelligence. Duckworth, Peterson, Matthews, and Kelly (2007) have provided some evidence in this direction. When SAT scores were held constant, grit had roughly the same association with GPA as SAT scores did. These findings suggest that what students may lack in general cognitive ability, as reflected in traditional test scores, may be made up for in “grittiness”. They also found that children higher in grit were more likely to advance to higher rounds in a national spelling bee than children who were lower in grit. Furthermore, this relationship was mediated by the number of hours that the children practiced on the weekend; that is, children higher in grit seem to be more likely to spend time practicing on weekends, which leads to better achievement in the spelling bee. Other studies have shown that undergraduate students higher in grit have higher GPAs than students lower in grit (Duckworth, Peterson, Matthews, and Kelly, 2007). This was true even though grit was associated with lower SAT scores. In addition, U.S. military cadets who are higher in grit have been shown to be less likely to drop out than cadets who are lower in grit (Duckworth, Peterson, Matthews, and Kelly, 2007). This relationship holds even after controlling for other factors such as Scholastic Aptitude Test (SAT) scores (as mentioned earlier), high school rank, and Big Five personality characteristics.
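To illustrate what “controlling for” covariates such as SAT scores means in analyses of this kind, the following sketch fits an ordinary least squares regression on simulated data. The variable names, coefficients, and data are hypothetical and are not taken from the studies cited above; the sketch only shows the general form of such an adjustment.

```python
# Hypothetical illustration (not the cited authors' analysis): estimating the
# association between grit and an outcome (GPA) while holding SAT constant.
# All data below are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 500

sat = rng.normal(0, 1, n)                        # standardized SAT score (simulated)
grit = 0.2 * sat + rng.normal(0, 1, n)           # grit, weakly related to SAT (simulated)
gpa = 0.4 * sat + 0.3 * grit + rng.normal(0, 1, n)  # outcome (simulated)

# Design matrix with an intercept, SAT, and grit; ordinary least squares.
X = np.column_stack([np.ones(n), sat, grit])
coef, *_ = np.linalg.lstsq(X, gpa, rcond=None)

# coef[2] is the association between grit and GPA with SAT held constant,
# analogous to the "SAT scores held constant" comparisons described above.
print(f"intercept={coef[0]:.2f}, SAT={coef[1]:.2f}, grit={coef[2]:.2f}")
```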

3.5 “Desire for Learning” (Module 5)

Desire for Learning is proposed as a second main domain-general noncognitive student factor. It adds to Grit in that need for cognition assesses whether individuals see learning as an opportunity and approach learning situations at school and outside of school with an academic mindset that helps them apply effort, persevere, and refrain from procrastinating. As highlighted in the overview section of this paper, grit and academic perseverance are key factors in student achievement in the classroom. At the same time, the research suggests that “an isolated focus on academic perseverance as a thing unto itself may well distract reformers from attending to student mindsets and the development of learning strategies that appear to be crucial to supporting students’ academic perseverance” (Farrington et al., 2012, p. 27). We therefore suggest including “Desire for Learning” as an additional module that will provide policy-relevant data on students’ mindset in terms of their need for cognition, curiosity, and intrinsic motivation to learn and grow. Desire for learning plays an essential role in teaching students to become truly engaged learners, as highlighted by the authors of the CCSR review on noncognitive factors:

“Teaching adolescents to become learners requires more than improving test scores; it means

transforming classrooms into places alive with ideas that engage students’ natural curiosity and

desire to learn in preparation for college, career, and meaningful adult lives. This requires schools

to build not only students’ skills and knowledge but also their sense of what is possible for

themselves, as they develop the strategies, behaviors, and attitudes that allow them to bring their

aspirations to fruition” (Farrington et al., 2012, p. 77). Desire for learning relates to cognitive engagement in the multidimensional model of school engagement described earlier in this paper, particularly students’ motivation to learn, intrinsic motivation, and task valuing in school (Ames, 1992; see also Eccles et al., 1993: subjective value of learning scale), and mastery goal orientations (Wentzel, 1998).

A main theoretical basis for the relevance of desire for learning comes from research on so-called “Need for Cognition”. Drawing on earlier work in social psychology, particularly the work of Cohen (e.g., Cohen, 1957), Cacioppo and Petty (1982) described the need for cognition construct (that is, “the tendency for an individual to engage in and enjoy thinking,” p. 116), introduced a scale to measure it, and presented evidence for the scale’s validity. For example, their first study showed that university faculty had higher scores on the need for cognition than assembly line workers did. A review of work in the ensuing 12 years (Cacioppo, Petty, Feinstein, and Jarvis, 1996) found that the construct had been examined in more than 100 empirical studies; work on the need for cognition has continued to the present day. The original scale for measuring need for cognition included 34 items, but Cacioppo, Petty, and Kao (1984) introduced a shorter version with 18 items that appeared to be just as reliable as the original.

More than 30 studies have examined the reliability of scale scores, most of them using Cronbach’s

alpha; these studies generally find that the scale has high reliability. Numerous studies have also

examined the factorial structure of the original or short forms of the need for cognition scale; most

of them find a single dominant factor, with a few exceptions. For example, Tanaka, Panter, and

Winborne (1988) argue for three dimensions—cognitive persistence, cognitive confidence, and

cognitive complexity. Generally, researchers have treated the need for cognition as a one-

dimensional construct. Those who are high on need for cognition enjoy effortful cognitive

endeavors and engage in them; those who are low on need for cognition do not enjoy such

endeavors and try to avoid them.
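As a point of reference, the internal-consistency index reported in most of these reliability studies, Cronbach's alpha, can be computed directly from a matrix of item responses. The sketch below uses simulated responses to a hypothetical 18-item form; the data and scale length are illustrative only.

```python
# Minimal sketch of Cronbach's alpha, the internal-consistency statistic most
# of the reliability studies cited above report. Responses here are simulated;
# real studies would use observed 1-5 ratings from respondents.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scored responses."""
    k = items.shape[1]
    sum_item_variances = items.var(axis=0, ddof=1).sum()
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))                          # latent need for cognition (simulated)
responses = trait + rng.normal(scale=1.0, size=(200, 18))  # 18 noisy items (simulated)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```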

The need for cognition scale has been translated into several languages (including German, Dutch,

and Turkish) and has been administered in a variety of settings. The original items were designed

for self-administration. Respondents are presented with 18 or 34 statements (“Thinking is not my

idea of fun”) and are asked to rate each statement on a five-point scale, ranging from “extremely

uncharacteristic” to “extremely characteristic”. The items are balanced in the sense that half of the

statements indicate the presence of the need for cognition and half indicate the lack of it.
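For illustration, scoring a balanced scale of this kind typically involves reverse-coding the negatively worded statements before summing across items. The sketch below assumes a hypothetical set of reversed item positions and placeholder responses; it does not reproduce the actual need for cognition instrument.

```python
# Illustrative scoring of a balanced self-report scale such as an 18-item need
# for cognition short form: half the statements are worded positively and half
# negatively, so negatively worded items are reverse-coded before summing.
# Item positions and responses below are placeholders, not the actual instrument.

RESPONSE_MIN, RESPONSE_MAX = 1, 5   # "extremely uncharacteristic" .. "extremely characteristic"
NEGATIVELY_WORDED = {1, 3, 5, 7, 9, 11, 13, 15, 17}  # hypothetical positions of reversed items

def score_scale(responses: list[int]) -> int:
    """Sum item responses after reverse-coding negatively worded items."""
    total = 0
    for position, value in enumerate(responses):
        if position in NEGATIVELY_WORDED:
            value = RESPONSE_MIN + RESPONSE_MAX - value  # 1<->5, 2<->4, 3 stays 3
        total += value
    return total

# Example: a respondent who rates every statement "extremely characteristic"
print(score_scale([5] * 18))  # positive items stay high, negative items reverse to low
```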

A few studies have included the need for cognition items in large-scale mail surveys (Verplanken, 1989, 1991, reports their use in mail surveys in the Netherlands), and the items would seem to lend themselves to computerized administration (such as a web survey). The vast majority of studies using the scale have administered it to undergraduates. The only studies we have found that used the items with respondents in the age range of the NAEP participants were conducted in Germany (Bertrams and Dickhäuser, 2009; Preckel, Holling, and Vock, 2006). Some of the items in the English version could exceed the vocabulary of the typical fourth grader. Thus, an adapted version of the scale might need to be developed for use with the NAEP student samples.

Several studies show that desire for learning/need for cognition is related to achievement in school (e.g., Bertrams and Dickhäuser, 2009; Preckel, Holling, and Vock, 2006; see also Petty and Jarvis, 1996) and is one of the stronger predictors of GPA based on meta-analytic data (Richardson et al., 2012). There are several pathways that could account for the link between desire for

learning/need for cognition and academic success. Need for cognition reflects willingness to

expend cognitive effort and this is clearly a prerequisite for mastering difficult material. In

addition, persons with higher desire for learning engage in more effortful cognitive processing and

seek out information more than their counterparts who are low in desire for learning/need for

cognition (Cacioppo et al., 1996). Finally, those high on need for cognition also have higher intrinsic

motivation to perform challenging cognitive tasks (Cacioppo et al., 1996). Whatever the exact

causal path, need for cognition does seem to predict academic achievement, whether measured by

GPA or standardized test scores.

Desire for Learning also captures aspects of Openness, reflecting people’s willingness to make adjustments to existing attitudes and behaviors once they have been exposed to new ideas or situations (Flynn, 2005). PISA 2012 includes a four-item openness for problem solving scale (e.g., “I like to solve complex problems”) that shows some conceptual overlap with the Need for Cognition (NFC) scales described above. Based on PISA 2012 data, correlations of this scale with achievement are among the largest across all noncognitive indices included in the PISA questionnaires.

4 References

Abedi, J. (2006). Psychometric issues in the ELL assessment and special education eligibility. The Teachers College Record, 108, 2282–2303.

Abraham, C. (2012). Mapping change mechanisms and behavior change techniques: A systematic

approach to promoting behavior change through text. In C. Abraham & M. Kools (Eds.),

Writing health communication: An evidence–based guide for professionals (pp. 99–116).

London, England: Sage

Adams, R. J., Lietz, P., & Berezner, A. (2013). On the use of rotated context questionnaires in

conjunction with multilevel item response models. Large–scale Assessments in Education,

1(5).

Almonte, D. E., McCullough, J., Lei, M., & Bertling, J. P. (2014, April). Spiraling of contextual

questionnaires in the NAEP TEL pilot assessment. In: Bertling, J.P. (Chair). Spiraling

Contextual Questionnaires in Educational Large–Scale Assessments (Coordinated Session).

2014 NCME Conference, Philadelphia.

Almlund, M., Duckworth, A., Heckman, J. J., & Kautz, T. (2011). Personality Psychology and

Economics. IZA Discussion Papers 5500. Retrieved from

http://ssrn.com/abstract=1765666.

American Institutes for Research (2011). 2011 School Climate and Connectedness Survey: Statewide

report. Retrieved from http://alaskaice.org/school-climate/survey.

American Psychological Association. (2007a). Office on socioeconomic status. Retrieved from

http://www.apa.org/pi/ses/homepage.html.

American Psychological Association. (2007b). Report of the APA Task Force on socioeconomic status.

Washington, DC: Author. Retrieved from http://www2.apa.org/pi/ SES_task_force

report.pdf.

Ashford, S., Edmunds, J., & French, D. (2010). What is the best way to change self–efficacy to

promote lifestyle and recreational physical activity? A systematic review with meta–

analysis. British Journal of Health Psychology, 15, 265–288.

Astor, R. A., Benbenisty, R., & Estrada, J.N. (2009). School violence and theoretically atypical schools:

The principal’s centrality in orchestrating safe schools. American Educational Research

Journal, 46, 423-461.

Astor, R. A., Guerra, N., & Van Acker, R. (2010). How can we improve school safety research?

Educational Researcher, 39, 69-78.

Barrick, M. R., Stewart, G. L., & Piotrowski, M. (2002). Personality and job performance: Test of the

mediating effects of motivation among sales representatives. Journal of Applied Psychology,

87, 43–51.

Barton, P.E. (2003). Parsing the achievement gap: Baselines for tracking progress. Princeton, NJ:

Policy Information Center, Educational Testing Service.

Battistich, V., Solomon, D., Kim, D., Watson, M., and Schaps, E. (1995). Schools as communities,

poverty levels of student populations, and students’ attitudes, motives, and performance: A

multilevel analysis. American Educational Research Journal, 32, 627–658.

Bear, G., and Yang, C. (2011). Delaware School Climate Survey technical manual. Retrieved from

http://wordpress.oet.udel.edu/pbs/wp-content/uploads/2011/12/Final-Technical-Manual.pdf.

Bertrams, A. & Dickhäuser, O. (2009). High–school students’ need for cognition, self–control

capacity, and school achievement: Testing a mediation hypothesis. Learning and Individual

Differences, 19, 135-138.

Berry, C. M., Ones, D. S., & Sackett, P. R. (2007). Interpersonal deviance, organizational deviance,

and their common correlates: A review and meta–analysis. Journal of Applied Psychology,

92, 410-424.

Blau, P. M., & Duncan, O. D. (1967). The American Occupational Structure. New York: Wiley and Sons.

Borghans, L., Duckworth, A. L., Heckman, J. J., & Ter Weel, B. (2008). The economics and psychology

of personality traits. Journal of Human Resources, 43, 972-1059.

Bowles, S., Gintis H., & Osborne, M. (2001). The Determinants of Earnings: A Behavioral Approach.

Journal of Economic Literature, 39, 1137–1176.

Brand, S., Felner, R., Shim, M., Seitsinger, A., & Dumas, T. (2003). Middle school improvement and

reform: Development of validation of a school–level assessment of climate, cultural

pluralism and school safety. Journal of Educational Psychology, 95, 570–588.

Bryant, E.C., Glazer, E., Hansen, M.A., & Kursch, A. (1974). Associations between educational

outcomes and background variables [Monograph]. Denver, CO: National Assessment of

Educational Progress.

Cacioppo, J. T., Petty, R. E., Feinstein, J. A., and Jarvis, W. B. G. (1996). Dispositional differences in

cognitive motivation: The life and times of individuals varying in need for cognition.

Psychological Bulletin, 119, 197–253.

Cacioppo, J. T., Petty, R. E., & Kao, C. F. (1984). The efficient assessment of need for cognition.

Journal of Personality Assessment, 48, 306–307.

Carroll, J. (1963). A model of school learning. The Teachers College Record, 64, 723–723.

Chen, G., Gully, S. M., Whiteman, J. A., & Kilcullen, R. N. (2000). Examination of relationships among

trait–like individual differences, state–like individual differences, and learning performance.

Journal of Applied Psychology, 85, 835–847.

Clements, D. (1999). Young children and technology: In dialogue on early childhood science,

mathematics, and technology education. Washington, DC: American Association for the

Advancement of Science.

Clements, D. H., & Sarama, J. (2003). Young children and technology: What does the research say?

Young Children, 58, 34-40.

Clifford, M., Menon, R., Condon, C. & Hornung, K. (2012). Measuring School Climate for Gauging

Principal Performance: A Review of the Validity and Reliability of Publicly Accessible Measures.

American Institute for Research. Retrieved from http://www.air.org/resource/measuring-

school-climate-gauging-principal-performance.

Cohen, A. R. (1957). Need for cognition and order of communication as determinants of attitude

change. In C. Hovland (Eds.), The order of presentation in persuasion. New Haven, CT: Yale

University Press.

Cohen, J. (2013). School climate and culture improvement: A prosocial strategy that recognizes,

educates, and supports the whole child and the whole school community. In Brown, Cor-

rigan & Higgins–D’Alessandro (Eds.), The Handbook of Prosocial Education: Rowman and

Littlefield Publishing Group.

Cohen, G. L., & Garcia, J. (2008). Identity, belonging, and achievement: A model, interventions,

implications. Current Directions in Psychological Science, 17, 365-369.

Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Weinfield, F. D., et al. (1966). Equality of educational opportunity (2 vols.). Washington, DC: U.S. Government

Printing Office.

Comber, L. C., & Keeves, J. P. (1973). Science Education in Nineteen Countries. Stockholm: Almquist

& Wiksell.

Costa, P. T., & McCrae, R. R. (1992). Revised NEO Personality Inventory (NEO–PI–R) and NEO Five–Factor Inventory (NEO–FFI) professional manual. Odessa, FL: Psychological Assessment

Resources, Inc.

Cowan, C. D., & Sellman, S. W. (2008). Improving the quality of NAEP socioeconomic status information: Report on research activities. Alexandria, VA: Human Resources Research Organization (HumRRO). Retrieved February 2012 from http://www.humrro.org/corpsite/publication/improving-quality-naep-socioeconomic-status-information-report-research-activities.

Cuban, L., Kirkpatrick, H., & Peck, C. (2001). High access and low use of technologies in high school

classrooms: Explaining an apparent paradox. American Educational Research Journal, 38,

813-834.

Cuff, N. B. (1934). The vectors of socio-economic status. Peabody Journal of Education, 12, 114-117.

Demb, A., Erickson, D., & Hawkins–Wilding, S. (2004). The laptop alternative: Students' reactions

and strategic implications. Computers and Education, 43, 383-401.

Diener, E., & Tov, W. (2006). National accounts of well-being. In K. C. Land, A. C. Michalos, & M. J. Sirgy (Eds.), Handbook of social indicators and quality of life research (pp. 137-156). New York, NY: Springer.

Duckworth, A. L., Peterson, C., Matthews, M. D., & Kelly, D. R. (2007). Grit: Perseverance and passion for long–term goals. Journal of Personality and Social Psychology, 92, 1087-1101.

Duckworth, A. L., & Quinn, P. D. (2009). Development and validation of the Short Grit Scale (Grit–S).

Journal of Personality Assessment, 91, 166-174.

Dweck, C. S., Walton, G. M., and Cohen, G. L. (2011). Academic tenacity: Mindsets and skills that

promote long–term learning. White paper prepared for the Gates Foundation. Seattle, WA.

Eccles, J. S., Wigfield, A., Midgley, C., Reuman, D., MacIver, D., & Feldlaufer, H. (1993). Negative

effects of traditional middle schools on students’ motivation. Elementary School Journal, 93,

553-574.

Ericsson, K. A., & Charness, N. (1994). Expert performance: Its structure and acquisition. American Psychologist, 49, 725-747.

Espinosa, L. M., Laffey, J. M., Whittaker, T., & Sheng, Y. (2006). Technology in the home and the

achievement of young children: Findings from the Early Childhood Longitudinal Study. Early

Education and Development, 17, 421-441.

Fairlie, R. (2005). The effects of home computers on school enrolment. Economics of Education

Review, 24, 533-547.

Farrington, C.A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T.S., Johnson, D.W., & Beechum,

N.O. (2012). Teaching adolescents to become learners. The role of noncognitive factors in

shaping school performance: A critical literature review. Chicago: University of Chicago

Consortium on Chicago School Research.

Foy, P., & Drucker, K. T. (2011). PIRLS 2011 user guide for the international database. Supplement 1.

International version of the PIRLS 2011 background questionnaires and curriculum

questionnaire. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Lynch School of

Education, Boston College.

Furrer, C., & Skinner, E. (2003). Sense of relatedness as a factor in children’s academic engagement

and performance. Journal of Educational Psychology, 95, 148-162.

Galton, F. (1892). Hereditary Genius: An inquiry into its laws and consequences. London: Macmillan.

Ganzeboom, H. B., De Graaf, P. M., & Treiman, D. J. (1992). A standard international socio–economic

index of occupational status. Social Science Research 21, 1-56.

Ginsburg, A., & Smith, M. S. (2013, December 5-7). Key Education Indicators (KEI): Making Sense of

NAEP Contextual Variables. Presentation at National Assessment Governing Board Meeting,

Baltimore.

Goodenow, C. (1992). Strengthening the links between educational psychology and the study of

social contexts. Educational Psychologist, 27, 177–196.

Goodenow, C., & Grady, K. E. (1993). The relationship of school belonging and friends’ values to academic motivation among urban adolescent students. Journal of Experimental Education, 2, 60–71.

Griffin, K. A. & Allen, W. R. (2006). Mo’ money, mo’ problems?: High achieving Black high school

students? Experiences with resources, racial climate, and resilience. Journal of Negro

Education, 75, 478-494.

Haggerty, K., Elgin, J & Woodley, A. (2010). Social–emotional learning assessment measures for middle school youth. Social Development Research Group, University of Washington and the Raikes Foundation.

Harwell, M. R., & Lebeau, B. (2010). Student eligibility for a free lunch as an SES Measure in

education research. Educational Researcher, 39, 120-131.

Hauser, R. M., & Andrew, M. (2007). Reliability of student and parent reports of socioeconomic

status in NELS–88. Working paper presented at ITP seminar at the University of Wisconsin–

Madison. Retrieved from http://www.wcer.wisc.edu/itp/Spring%20 08%20

seminar/HauserNELS–SES%20measurement_070607a.pdf.

Hauser, R. M., & Warren, J.R. (1997). Sociological indexes for occupations: A review, update, and

critique. Sociological Methodology, 27, 177-298.

Heckman, J., & Rubinstein, Y. (2001). The importance of noncognitive skills: Lessons from the GED

testing program. American Economic Review, 91, 145–149.

Heckman J. J., Stixrud, J., & Urzua, S. (2006). The Effects of Cognitive and Noncognitive Abilities on

Labor Market Outcomes and Social Behavior. Journal of Labor Economics 24, 411-482.

Heckman, J. J., & Kautz, T. (2013). Fostering and Measuring Skills: Interventions That Improve

Character and Cognition. IZA Discussion Paper No. 7750. Bonn: Institute for the Study of

Labor.

Hess, R. D., & McGarvey, L. J. (1987). School-relevant effects of educational uses of microcomputers

in kindergarten classrooms and homes. Journal of Educational Computing Research, 3, 269-

287.

Hanson, T. L. (2011). Measurement analysis of California School Climate, Health, and Learning

Survey (Cal–SCHLS) for staff. Unpublished document.

Hamre, B. K., & Pianta, R. C. (2001). Early teacher–child relationships and the trajectory of

children’s school outcomes through eighth–grade. Child Development, 72, 625–638.

Hoy, W. K., Hannum, J., & Tschannen–Moran, M. (1998). Organizational climate and student

achievement: A parsimonious and longitudinal view. Journal of School Leadership, 8, 336–

359.

James, W. (1907, March 1). The energies of men. Science, 25, 321–332.

Kaplan, D. & Wu, D. (2014, April 2-6) Imputation Issues Relevant to Context Questionnaire Rotation.

In: Bertling, J.P. (Chair). Spiraling Contextual Questionnaires in Educational Large–Scale

Assessments (Coordinated Session). 2014 NCME Conference, Philadelphia.

Lent, R. W., & Brown, S. D. (2006). On conceptualizing and assessing social cognitive constructs in

career research: A measurement guide. Journal of Career Assessment, 14, 12-35.

Levin, H. M., & Belfield, C. R. (2002). Families as contractual partners in education. UCLA Law

Review, 49, 1799-1824.

Lindqvist, E., & R. Vestman (2011). The labor market returns to cognitive and noncognitive ability:

Evidence from the Swedish enlistment. American Economical Journal: Applied Economics, 3,

101-128.

Lowther, D. L., Ross, S. M., & Morrison, G. M. (2003). When each one has one: The influences on

teaching strategies and student achievement of using laptops in the classroom. Educational

Technology Research and Development, 51, 23–44.

Lynd, R. S., & Lynd, H. M. (1929). Middletown: A study in American culture. New York: Harcourt

Brace.

McCrae, R. R., & Costa, P. T. (1989). The structure of interpersonal traits: Wiggin’s circumplex and

the five–factor model. Journal of Personality and Social Psychology, 45, 586–595.

Mackinnon, G. R., & Vibert, C. (2002). Judging the constructive impacts of communication

technologies: a business education study. Education and Information Technology, 7, 127–

135.

Mattison, E., & Aber, M.S. (2007). Closing the achievement gap: The association of racial climate

with achievement and behavioral outcomes. American Journal of Community Psychology, 40,

1-12.

McClelland, D. C. (1961). The achieving society. Oxford, England: Van Nostrand.

McMillan, D.W., & Chavis, D. M. (1986). Sense of community: A definition and theory. Journal of Community Psychology, 14, 6–23.

Monseur, C. & Bertling, J. P. (2014, April 2-3). Questionnaire rotation in international surveys:

findings from PISA. In: Bertling, J.P. (Chair). Spiraling Contextual Questionnaires in

Educational Large–Scale Assessments (Coordinated Session). 2014 NCME Conference,

Philadelphia.

Nyhus, E., & Pons, E. (2005). The effects of personality on earnings. Journal of Economic Psychology,

26, 363–384.

National Assessment Governing Board. (2012). Policy statement on NAEP background questions and

the use of contextual data in NAEP reporting. Washington, DC: U.S. Department of Education,

National Assessment Governing Board.

National School Lunch Program. (2008). Fact sheet. Retrieved from

http://www.fns.usda.gov/cnd/governance/notices/naps/NAPs.htm.

OECD (2013). PISA 2012 assessment and analytical framework: Mathematics, reading, science,

problem solving and financial literacy. Paris: OECD Publishing. Retrieved from

http://dx.doi.org/10.1787/9789264190511-en.

O’Connor, M.C., & Paunonen, S. V. (2007). Big five personality predictors of postsecondary academic

performance. Personality and Individual Differences, 43, 971-990.

OECD (2003). Are students ready for a technology-rich world? What PISA studies tell us. Retrieved

from

http://www.oecd.org/education/school/programmeforinternationalstudentassessmentpisa/35995145.pdf.

OECD (2009). Creating effective teaching and learning environments: First results from the Teaching and Learning International Survey (TALIS). Paris: OECD Publishing.

OECD (2009). PISA 2009 assessment framework: Key competencies in reading, mathematics and

science. Retrieved from http://www.oecd.org/pisa/pisaproducts/44455820.pdf.

OECD (2010a). PISA 2009 Results: Overcoming Social Background - Equity in Learning Opportunities

and Outcomes (Volume II). Retrieved from http://dx.doi.org/10.1787/9789264091504-en.

OECD (2011). PISA 2009 Results: Students on Line: Digital Technologies and Performance (Volume

VI). Paris: OECD Publishing.

Paunonen, S. V. (2003). Big five factors of personality and replicated predictions of behavior.

Journal of Personality and Social Psychology, 84, 411-424.

Paunonen, S. V., & Ashton, M. C. (2001a). Big five factors and facets and the prediction of behavior.

Journal of Personality and Social Psychology, 81, 524-539.

Paunonen, S. V., & Ashton, M. C. (2001b). Big five predictors of academic achievement. Journal of

Research in Personality, 35, 78-90.

Payton, J., Weissberg, R.P., Durlak, J.A., Dymnicki, A.B., Taylor, R.D., Schellinger, K.B., & Pachan, M.

(2008). The positive impact of social and emotional learning for kindergarten to eighth–grade

students: Findings from three scientific reviews. Chicago, IL: Collaborative for Academic,

Social, and Emotional Learning.

Peck, C., Cuban, L., & Kirkpatrick, H. (2002). Techno–promoter dreams, student realities. Phi Delta

Kappan, 83, 472-480.

Peterson, C., & Seligman, M. E. P. (2004). Character Strengths and Virtues: A Handbook and

Classification. Washington, D.C.: APA Press and Oxford University Press.

Poropat, A. E. (2009). A meta–analysis of the five-factor model of personality and academic

performance. Psychological Bulletin, 135, 322–338.

Preckel, F., Holling, H., & Vock, M. (2006). Academic underachievement: Relationship with

cognitive motivation, achievement motivation, and conscientiousness. Psychology in the

Schools, 43, 401-411.

Purves, A. C. (1987). I.E.A. an agenda for the future. International Review of Education, 33, 103–107.

Reynolds, A. J., & Walberg, H. J. (1992). A structural model of science achievement and attitude: An

extension to high school. Journal of Educational Psychology, 84, 371-382.

Richardson, M. & Abraham, C. (2009). Conscientiousness and achievement motivation predict

performance. European Journal of Personality, 23, 589–605.

Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’

academic performance: A systematic review and meta–analysis. Psychological Bulletin, 138,

353-387.

Robbins, S. B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, A. (2004). Do psychosocial and

study skill factors predict college outcomes? A meta–analysis. Psychological Bulletin, 130,

261–288.

Roberts, B. W., Kuncel, N. R., Shiner, R., Caspi, A. & Goldberg, L. R. (2007). The power of

personality: The comparative validity of personality traits, socioeconomic status, and

cognitive ability for predicting important life outcomes. Perspectives on Psychological

Science, 2, 313–345.

Ruus, V., Veisson, M., Leino, M., Ots, L., Pallas, L., Sarv, E., & Veisson, A. (2007). Students’ well–being,

coping, academic success, and school climate. Social Behavior & Personality: An International

Journal, 35, 919–936.

Ryan, R. M., & Deci, E. L. (2000). Self–determination theory and the facilitation of intrinsic motivation, social development, and well–being. American Psychologist, 55, 68–78.

Rychen, D.S., & Salganik, L.H. (Eds). (2003). Defining and selecting key competencies. Cambridge, MA:

Hogrefe & Huber.

Saucier, G., & Goldberg, L. R. (2002). Assessing the big five: Applications of 10 psychometric criteria

to the development of marker scales, In B. de Raad and M. Perugini (eds.), Big Five

Assessment (pp. 29–58). Goettingen, Germany: Hogrefe & Huber.

Salerno, C. (1995). The effect of time on computer-assisted instruction for at-risk students. Journal

of Research on Computing in Education, 28, 85–97.

SES Expert Panel. (2012). Improving the Measurement of Socioeconomic Status for the National

Assessment of Educational Progress: A Theoretical Foundation. White Paper prepared for the

National Center for Education Statistics. Washington, DC: U.S. Department of Education.

Siegle, D., & Foster, T. (2001). Laptop computers and multimedia and presentation software: Their effects on student achievement in anatomy and physiology. Journal of Research on

Technology in Education, 34, 29–37.

Sims, V. M. (1927). The measurement of socioeconomic status. Bloomington, IL: Public School Printing

Co.

Singh, M. (2013). A longitudinal study of a state–wide reading assessment: The importance of early

achievement and socio–demographic factors. Educational Research and Evaluation, 19, 4-18.

Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta–analytic review of

research. Review of Educational Research, 75, 417-453.

Solomon, D., Watson, M., Battistich, V., Schaps, E. & Delucchi, K. (1996). Creating classrooms that

students experience as communities. American Journal of Community Psychology, 24, 719–

748.

Spaeth, J. L. (1976). Cognitive complexity: A dimension underlying the socioeconomic achievement

process. In W. H. Sewell, R. M. Hauser, & D. L. Featherman (Eds.), Schooling and achievement in American society (pp. 103–131). New York: Academic Press.

Specht, J., Egloff, B., & Schmukle, S. C. (2011). Stability and change of personality across the life

course: The impact of age and major life events on mean–level and rank–order stability of

the Big Five. Journal of Personality and Social Psychology, 101, 862–882.

Sterbinsky, A., Ross, S. M., & Redfield, D. (2006). Effects of comprehensive school reform on

student achievement and school change: A longitudinal multi–site study. School

Effectiveness and School Improvement, 17, 367–397.

Stewart, E. B. (2008). School structural characteristics, student effort, peer associations, and

parental involvement: The influence of school- and individual-level factors on academic

achievement. Education & Urban Society, 40, 179–204.

Stock, J., & Cervone, D. (1990). Proximal goal setting and self–regulatory processes. Cognitive

Therapy and Research, 14, 483–498.

Tanaka, J. S., Panter, A., & Winborne, W. C. (1988). Dimensions of the need for cognition: Subscales

and gender differences. Multivariate Behavioral Research, 23, 35–50.

Taussig, F. W. (1920).Principles of economics. Newcastle: Cambridge Scholars Publishing.

Trimmel, M., & Bachmann, J. (2004). Cognitive, social, motivational and health aspects of students

in laptop classrooms. Journal of Computer Assisted Learning, 20, 151–158.

Voight, A. & Hanson, T. (2012). Summary of existing school climate instruments for middle school.

San Francisco: REL West at WestEd.

Verplanken, B. (1989). Involvement and need for cognition as moderators of beliefs-attitude-

intention consistency. British Journal of Social Psychology, 28, 115–122.

Wentzel, K. R., & Asher, S. R. (1995). The academic lives of neglected, rejected, popular, and

controversial children. Child Development, 66, 754–763.

Wentzel, K. R., & Caldwell, K. (1997). Friendships, peer acceptance, and group membership:

Relations to academic achievement in middle school. Child Development, 68, 1198–1209.

White, K. R. (1982). The relation between socioeconomic status and academic achievement.

Psychological Bulletin, 91, 461–481.

Zheng, B., Warschauer, M., & Farkas, G. (2013). Digital writing and diversity: The effects of school

laptop programs on literacy process and outcomes. Journal of Educational Computing

Research, 48, 267–299.

NATIONAL ASSESSMENT GOVERNING BOARD MEETING EVENTS
July 31 – August 2, 2014

The Fairfax at Embassy Row, 2100 Massachusetts Avenue, NW, Washington, DC 20008
Staff Office: Hunt Room, 2nd Floor

DATE AND TIME | EVENT | LOCATION | DETAILS

Thursday, July 31 8:30 am – 4:00 pm

Assessment Development Committee

Fairfax at Embassy Row: Churchill Room (2nd Floor)

Closed Session

Thursday, July 31 2:00 – 4:00 pm

Assessment Literacy Work Group

Fairfax at Embassy Row: Salon (1st Floor)

Open Session

Thursday, July 31 4:30 – 5:30 pm

Executive Committee

Fairfax at Embassy Row: Balcony (2nd Floor)

Open Session; Closed Session 5:00 – 5:30 pm

Friday, August 1: 8:30 – 9:30 am (Committee meetings: 9:45 am – 12:30 pm); Closed Working Lunch Session: 12:45 – 1:45 pm; Open Session: 1:45 – 3:15 pm; Closed Session: 3:30 – 5:00 pm

Full Board Meeting General Session

Fairfax at Embassy Row: Ballroom (1st Floor)

Open Session. Committee Rooms (all on 2nd Floor): ADC: Churchill; Joint Meeting COSDAM & R&D: Balcony; COSDAM: Whitehall; R&D: Balcony

Friday, August 1 6:00 – 9:30 pm

Full Board Working Dinner

RIS, 2275 L Street, NW, Washington, DC 20037; (202) 730-2500

We will convene in the hotel lobby at 5:30 pm and either walk or share taxis to the restaurant.

Saturday, August 2 7:30 – 8:15 am

Nominations Committee

Fairfax at Embassy Row: Balcony (2nd Floor)

Closed Session

Saturday, August 2 8:30 am – 12:00 pm

Full Board meeting

Fairfax at Embassy Row: Ballroom (1st Floor)

Open Session

GROUND TRANSPORTATION
Governing Board Quarterly Board Meeting

July 31 – August 2, 2014

The Fairfax at Embassy Row; 2100 Massachusetts Avenue, NW; Washington, DC 20008 (202) 293-2100

Shuttle Service from BWI Thurgood Marshall Airport, Dulles International Airport and Ronald Reagan National Airport
Super Shuttle provides shuttle service from BWI Thurgood Marshall Airport (BWI), Dulles International Airport (IAD) and Washington Reagan National Airport (DCA) to the Fairfax at Embassy Row. For pick up, claim your luggage and proceed to Ground Transportation/Shared Ride Vans. Reservations are not required for transportation to the hotel but are required for transportation to the airport; 24-hour notice is preferred. Reservations can be made on-line at www.supershuttle.com or by calling toll free (800) 258-3826. The one-way fare is $40 from BWI, $31 from Dulles and $16 from Reagan.

Taxi Service
Arrivals and Departures via BWI Thurgood Marshall Airport and Ronald Reagan National Airport: Several taxi companies provide service from BWI Thurgood Marshall Airport (BWI) and Ronald Reagan National Airport (DCA). The one-way trip from BWI takes approximately one hour and the fare is approximately $135.00. The one-way fare from Reagan is approximately $25 and travel time is approximately 20 minutes. Taxi stands are located outside the airport and hotel.
Arrivals and Departures via Dulles International Airport: Washington Flyer Taxi Service (703) 661-6655 provides taxi service from Dulles International Airport. The one-way fare is approximately $75-85 per person and travel time is approximately 40 minutes. Upon arrival at Dulles, proceed to the baggage claim/arrivals area on the lower level of the Main Terminal and proceed to the Washington Flyer taxi stand. A curbside representative will assist you with coordinating service. For return trips to Dulles from the Fairfax at Embassy Row, Yellow Taxi Cab Company (703) 534-1111 provides taxi service. The hotel bellman will assist you with service.

Public Transportation: Metrorail
The Fairfax at Embassy Row Hotel is metro rail accessible via the Dupont Circle Metro station on the Red Line. Exit the train station using the Connecticut Avenue and Q Street exit. Walk one block west on Q Street. Turn left on 21st Street. Walk a short distance south on 21st Street to the hotel.

MARC Penn Line Stations (www.amtrak.com)
Amtrak trains provide service to the BWI Marshall Rail Station, where free shuttles serve the airport terminal. Shuttle stops are located on the lower level terminal roadway between door numbers 1 & 2, 8 & 9, 14 & 15, and 17 & 18. Shuttle buses from the BWI Marshall terminal building to the rail station operate every 12 minutes from 5:00 am to 1:00 am daily and every 25 minutes between 1:00 am and 5:00 am daily. Buses stop adjacent to the rail station garage, directly in front of the rail station. The BWI Rail Station is located one mile from the terminal building. To contact the BWI Marshall Rail Station, please call 410-672-6169. For Amtrak schedules and information call 800-USA-RAIL (800-872-7245).

Parking
Valet parking is available in the hotel's parking garage. Parking rates are $47.80 overnight and $40.00 for day parking.