
Project LEAPP


Learning from Educators At the Peak of their Profession

The National Board for Professional Teaching Standards Assessment

as Professional Development

University of Oregon


The quality of a child’s education depends in great part on the quality of her teachers.

This common-sense notion is supported in empirical studies of students from elementary to post-

secondary schools (Darling-Hammond, 2000; Wilson, Floden, & Ferrini-Mundy, 2001;

Hanushek, Kain, & Rivkin, 2001). Teachers who (a) have a solid grounding in the content

material of the courses they teach, (b) have mastered an array of pedagogical techniques, and (c)

demonstrate high verbal ability are better able to plan and deliver lessons that convey the key

concepts of their curriculum to students with diverse needs (Wilson et al., 2001). Teachers who

lack these skills are less effective, particularly with students with a history of low academic

achievement.

Unfortunately, too many teachers lack these vital skills, a problem particularly rampant in

rural and urban districts (Esch, Shields, & Young, 1999). It is no wonder, then, that much of the

focus of modern education reform has been on improving the quality of the teaching force in

American schools. A Nation at Risk, the influential report published in 1983 by the National

Commission on Excellence in Education, called for the establishment of standards for the

teaching profession as a key component of education reform. At that time, states began

increasing teaching licensure requirements, and a series of reform initiatives gained momentum

(Fuhrman, 2004).

Despite these reform efforts, however, American students continue to score in the middle

or lower tier of students worldwide on international assessments (National Center for Education

Statistics [NCES], 2004), and public faith in the school system continues to decline (Fuhrman,


2004). Many places in America experience a dearth of qualified applicants to open teaching

positions, and the schools with the most difficulty finding and retaining highly qualified teachers

tend to be those in the poorest rural and urban locations in the country (Rice, 2003). Clearly,

finding effective ways to help working and pre-service teachers develop the skills they lack is a

critical area for research; the challenge is how to achieve this goal. This study seeks to provide

insight into that very question.

Project LEAPP, Learning from Educators at the Peak of their Profession, focuses the

power of two methodologies on one central question: What components (including specific

content requirements of and cognitive processes used in the portfolio and assessment center

exercises as well as more general procedural aspects of completing the year-long assessment) of

the National Board for Professional Teaching Standards assessment affect teachers’

professional development? Phase 1 employs the qualitative theory-building power of multiple

case studies (Yin, 1998) to generate knowledge about a topic and a population that previously

have not been studied in quite this way. This initial phase will allow for more thorough

identification and articulation of the possible components of the National Board assessment that

will be examined in the second half of the study. Phase 2 harnesses the power of a large-scale

survey analyzed with Hierarchical Linear Modeling to better understand the degree to which the

professional development prompted by participation in the National Board (NBPTS) assessment

differs by types of teacher involved in the assessment.

Three nested variables will be included in analyses: (a) teachers from different

certification areas, (b) teachers from different racial/ethnic groups, and (c) male vs female

teachers. These three variables were chosen because previous research in education suggests that

there may be systematic differences among these groups of teachers. Teachers who work


in different content areas and at different grade levels may differ systematically from their

colleagues who teach other subjects or work with other grade levels (Shen & Poppink, 2003).

Burroughs (2000) reported that teachers from ethnic minority groups experienced unique

challenges with the NBPTS assessment, and it seems likely that these differences in experience

with the assessment process could result in different aspects of the assessment influencing their

professional development. Finally, there is a long history of research in psychology and

sociology on gender differences (e.g., Casey, 1996; Colom, Contreras, Arend, Leal, & Santacreu,

2004; Kimura, 1999). Given the many ways in which men and women have been reported to

differ, it is reasonable to investigate the degree to which gender affects the ways in which the

NBPTS assessment influences teachers’ professional development.

Results of this study will be directly applicable to teachers who will be pursuing National

Board certification in the future. The relevance does not end there, however; the results

will also lay the groundwork for future research on both pre-service and in-service teacher

professional development.

Theoretical Grounding: Teacher Professional Development

Professional development is generally considered a vital piece in efforts to improve

schools. On average, U.S. school districts spent 3% of total expenditures on teacher professional

development activities throughout the 1990s. These expenditures roughly equate to $200 per

pupil (Killeen, Monk, & Plecki, 2002) and are in addition to the private funds individual teachers

spend to enhance their own education. Given the steady allocation of resources and time to

professional development activities, it is essential that the content and structure of professional

development be grounded in empirical evidence of effectiveness. Without such empirical

grounding, teacher educators and professional developers are left without a sufficient knowledge


base on which to build their staff development activities. This lack of knowledge leads to

inefficient use of teachers’ professional development time, decreased probability that trainings

will result in appreciable development of new skills, and a potential for spending money on

activities without a corresponding increase in teacher skills. From a practical as well as a

philosophical position, such practices are untenable.

Cohen and Hill (2001) and Cohen, Hill, and Kennedy (2002) reported that professional

development activities are more likely to be successful when they require teachers to focus over

time on their own practices and link such practices to the standards teachers are trying to meet.

Analysis of professional development programs conducted by Bollough, Kauchak, Crow, Hobbs,

and Stoke (1997) suggests such programs result in an improvement of teaching practices when

they require teachers to engage in analysis and reflection rather than mere demonstration of

teaching techniques: When teachers are asked to explain and defend their practices through

structured analysis and reflection, they are more likely to change those practices to include

newly-taught knowledge and skills.

Likewise, Darling-Hammond and McLaughlin (1996) reported that effective professional

development shares six common features. It (a) engages teachers in concrete, experimental tasks;

(b) features inquiry and reflection; (c) involves collaboration and interaction with other

educators; (d) is closely connected to teachers’ work with their own students; (e) involves

activities that are sustained, ongoing, and intensive; and (f) is explicitly connected to other

components of school change and improvement.

The Role of the National Board for Professional Teaching Standards

Since 1994, the National Board for Professional Teaching Standards (NBPTS) has

offered a voluntary program of advanced certification to K-12 teachers in the United States. At


the end of the 2004 assessment cycle, a total of 40,209 teachers had earned National Board

certification (http://www.nbpts.org/nbct/nbctdir_byyear.cfm, retrieved April 11, 2005). To earn

NBPTS certification, teachers must hold a valid teaching license, be teaching at least half time in

an accredited school, have a minimum of three years’ teaching experience, hold a baccalaureate

degree from an accredited institution, and complete a year-long assessment. The assessment

includes a lengthy written test and a portfolio of work samples and reflective essays in which

teachers describe their approach to curriculum design, instruction, and assessment and provide

evidence (in the form of annotated student work samples) linking their teaching practices to

student learning (http://www.nbpts.org/candidates/guide/1 eligib.html, retrieved May 19, 2004).

Not every teacher in the United States, however, is a candidate for NBPTS certification.

In fact, the organization expressly states that its goal is to certify the very best teachers in the

United States, those teacher-leaders at the peak of their profession who contribute to education in

ways beyond the expectations held for teachers in general (http://www.nbpts.org/about/index.

cfm). Given the NBPTS’s arguably elitist attitude, what might the Board have to offer to the

wider population of American teachers?

The answer lies in one of the more promising consequences of the National Board

assessment process. Namely, teachers who participate in the NBPTS assessment overwhelmingly

report experiencing significant professional growth as a result of their participation. At the end of

the 2000-2001 assessment cycle, the NBPTS surveyed all teachers who had participated in the

certification process that year. Fully 80 percent of the teachers who responded to the

survey reported that they benefited more from participating in the National Board assessment

than from any other professional development in which they had engaged (NBPTS, 2001).


The profound learning that accompanies NBPTS certification is echoed in almost every

article about the National Board assessment process (e.g., Bond, Smith, Baker, & Hattie, 2000;

Darling-Hammond, 1999; Haynes, 1995; Tracz, Sienty, & Mata, 1994; Vandevoort, Amrein-

Beardsley, & Berliner, 2004). In a recent multiple case study, for instance, Place and Coskie

(2004) examined the ways in which Portfolio Entry 1 from the Early Childhood/Generalist

certificate influenced six elementary school teachers’ literacy instruction. All six teachers reported

that participating in the assessment (a) strengthened their literacy-teaching skills, (b) helped them

gain more in-depth insight about the needs of individual students, and (c) reinforced the

importance of aligning curriculum, instruction, and assessment in their teaching. Clearly, the six

teachers believed that completing the National Board assessment improved their teaching in

tangible ways.

Of particular relevance to the study I am proposing, Place and Coskie (2004) also

examined which specific components of completing this portfolio entry prompted

teachers’ professional growth. They reported three themes that spanned the experiences of all

participants in the study. First, the writing prompts provided by the NBPTS to guide teachers in

their work on Portfolio Entry 1 served as a helpful scaffold for teachers in the study, encouraging

them to think deeply and recognize the complexity of the teaching process. Second, the

requirement that teachers select two students to use as exemplars in their examination of their

teaching helped participants gain more profound understanding of their students’ varying needs.

Finally, having to reflect in writing on their teaching practices assisted participants in the

analytic process (Place & Coskie, 2004).

The insights gleaned from Place and Coskie’s multiple case study suggest the need for an

expanded study, one which extends both the sample of National Board teachers beyond


elementary teachers earning their Early Childhood/Generalist certification and the unit of

analysis to the entire assessment rather than focusing exclusively on a single component of the

portfolio requirement. Project LEAPP will address both needs, extending to the other certificate

areas our knowledge of the aspects of the NBPTS assessment that teachers report as most

influential for developing their teaching skills and enhancing confidence in the generalizability

of the findings reported to date.

Overview of Project LEAPP

Project LEAPP will provide a profile of National Board certified teachers’ insights on the

activities and strategies that assisted them in their own development as professional educators. In

doing so, Project LEAPP will add to the knowledge base on teacher professional development,

an area that has considerable implications for in-service as well as pre-service teacher education.

Given the profound importance of ensuring a high-quality teaching force, this study is not merely

interesting from a scientific perspective; it is imperative from a practical standpoint as well.

The Purpose of the Study

The purpose of this study is to develop a theoretical understanding of the components of

the NBPTS assessment that affect teachers’ professional development. The need to discover and

then articulate the specific components of the assessment to be examined in both Phase 1 and

Phase 2 is the compelling rationale for selecting a mixed-methods design for this study. As yet,

the complex and interwoven network of components that might promote teachers’ professional

development during the year-long process of completing the National Board assessment has not

been articulated. Although I might discuss the general

categories in which I expect such components to emerge, hypothesizing the specific themes that I


expect to discover during the Phase 1 development of grounded theory would be premature at

this stage.

Of course, it is possible to identify potentially relevant components of the assessment.

For example, the process of analyzing one’s teaching practices in relation to a set of standards

for the teaching profession might be expected to promote heightened awareness of the

knowledge and skills expected of highly qualified teachers. This heightened awareness might

logically result in increased attention to gaining the knowledge or skills one is lacking. Likewise,

it might be logical to suggest that the very act of gathering student work samples as evidence of

one’s progress towards meeting those standards might promote an awareness in the individual

teacher of personal strengths and weaknesses, which might lead to an effort to address identified

weaknesses. The array of components that might promote teachers’ professional development is,

quite understandably, vast. To remain true to the qualitative tradition of developing grounded

theory as described by Strauss and Corbin (1998), I will refrain from further predictions at this

time and turn instead to a discussion of some of the grouping variables that will move to the

forefront of analysis, particularly during Phase 2 of the study.

Selection of the grouping variables used during Phase 2 of the study draws from literature

on practices that result in substantive improvement of professional skills and from studies of

ethnic group and gender differences that might alter the ways in which people learn such skills.

In addition, reports of differences among teachers who are drawn to different content areas and

grade levels suggest that these two variables are also worthy of attention. While Phase 1 of the

study will explore the lived experiences of 10 teachers seeking Board certification to gain a deep

understanding of their professional development, Phase 2 will examine the degree to which the


findings hold true for the larger population of Board certified teachers. Results from the study

will:

• Provide information on the degree to which the professional development prompted by

participation in the NBPTS assessment differs by types of teachers (teachers from

different certification areas, teachers from different racial/ethnic groups, and male vs

female teachers, as well as the interaction of those factors), and thus:

o Help staff developers design programs to assist teachers with their professional

development, and

o Provide information to teacher education programs across the country that are

aligning their curriculum, instruction, and assessment to the NBPTS standards and

assessment.

Research Questions

Project LEAPP will address two research questions:

1. What components of the NBPTS assessment (including specific content requirements of

and cognitive processes used in the portfolio and assessment center exercises as well as

more general procedural aspects of completing the year-long assessment) prompt

teachers’ professional development?

2. To what degree does the professional development prompted by participation in the

NBPTS assessment differ by types of teacher involved in the assessment (teachers from

different certification areas, teachers from different racial/ethnic groups, and male vs

female teachers, as well as the interaction of these factors)?

Methods


Project LEAPP will take place in two distinct phases, separated by methodology as well

as time. Phase 1 of the study will use an exploratory qualitative multiple case study design (Yin,

1998) to gather information about the components of the NBPTS assessment that affect teachers’

professional development while they are in the process of completing their portfolio, preparing

for the assessment center exercises, and then taking the assessment center exercises. Information

gleaned from Phase 1 will be used to develop a survey instrument vital to the second phase of the

study. I will interview 10 teachers currently participating in the NBPTS assessment for a

minimum of 6 hours each over a period of 12 months, from September 10, 2005, through August

10, 2006. In addition, I will collect samples of their reflective analysis portfolio entries for

document analysis. Phase 2 of the study, a causal-comparative quantitative design, will use the

results of a national survey delivered via the Internet to investigate the generalizability of the

themes that emerge during the first phase of the study for sub-populations of Board-certified

teachers (see Appendix B). Data for Phase 2 of the study will be gathered between November 7

and December 7, 2006 (see Appendix C for timeline).

Setting, Participants, and Instrument Development

Phase 1. For Phase 1 of this study, the Director of Research for the National Board has agreed to

provide demographic information (subject area and level in which the teacher is seeking

certification, gender, ethnicity, and school where the teacher is employed) for all teachers within

a 300-mile radius of where I am located in the Pacific Northwest (D. Lussier, personal

communication, October 13, 2004). This geographic region was selected for convenience

and in an effort to keep the cost of the study within reason. In keeping with National Board

regulations to protect anonymity of teachers engaging in the certification process, no names will

be provided with the list of potential participants. Using the list provided, I will select a


purposeful sample of 10 teachers to allow for in-depth case studies. More teachers would

significantly increase the time and resources needed to gather and analyze data, yet 10 should

provide sufficient information, particularly if selection procedures result in a diverse group.

Participants will be selected to obtain the most diverse sample possible, considering all of

the following variables: type of certificate attempted (both age level and content area will be

considered), school type (rural, urban, suburban), school size (fewer than 200, 200 to 800, and

more than 800 students), gender, and ethnicity. Following established protocol, the National Board

will act as the liaison between the researcher and the potential participants, mailing the invitation

to participate in the case study to all selected teachers so that they can contact me if they are

willing to participate in the study. This process will be repeated until a suitably diverse group of

10 teachers currently engaging in the National Board assessment process is recruited.
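
To make the selection logic concrete, the sketch below shows one way such a maximally diverse pick of 10 teachers could be made programmatically. It is only an illustration under assumed attribute names (the Board’s list may label these fields differently), not a procedure specified in this proposal, and the final selection will also involve researcher judgment.

```python
# Illustrative sketch only: greedily build a maximally diverse purposeful sample
# from the de-identified list provided by the National Board. Attribute names
# below are hypothetical.
import random

ATTRIBUTES = ["certificate_area", "age_level", "school_type",
              "school_size", "gender", "ethnicity"]

def diversity_gain(candidate, selected):
    """Number of attribute values this candidate adds that are not yet represented."""
    covered = {(a, person[a]) for person in selected for a in ATTRIBUTES}
    return sum((a, candidate[a]) not in covered for a in ATTRIBUTES)

def select_diverse_sample(roster, n=10, seed=2005):
    """Pick n candidates, each time taking the one that adds the most
    unrepresented attribute values; ties are broken at random."""
    rng = random.Random(seed)
    pool, selected = list(roster), []
    while pool and len(selected) < n:
        best_gain = max(diversity_gain(c, selected) for c in pool)
        ties = [c for c in pool if diversity_gain(c, selected) == best_gain]
        choice = rng.choice(ties)
        selected.append(choice)
        pool.remove(choice)
    return selected

# Toy example with two fictional candidates:
roster = [
    {"certificate_area": "EA/English Language Arts", "age_level": "11-15",
     "school_type": "rural", "school_size": "under 200", "gender": "F",
     "ethnicity": "White"},
    {"certificate_area": "AYA/Mathematics", "age_level": "14-18",
     "school_type": "urban", "school_size": "over 800", "gender": "M",
     "ethnicity": "Black"},
]
print(select_diverse_sample(roster, n=2))
```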

Phase 2 instrument development. Phase 2 of the study will involve an online survey instrument,

developed at the completion of Phase 1. Using information gleaned from the qualitative case

studies, I will write approximately 30 questions to gather data about the components of the

National Board assessment process that teachers report most influenced their professional

development. I will also include approximately 10 questions to gather demographic data such as

participants’ gender and ethnicity, as well as the year, subject-area, and grade level for which

they received National Board certification.

All questions written for the survey will be evaluated by twelve external reviewers: two

primary grade teachers, two middle school teachers, two high school teachers, and six university

researchers familiar with survey research. Reviewers will be asked to provide feedback on the

appropriateness of the language as well as the comprehensibility and organization of the survey

questions. I will revise the survey questions as needed, based on the external review. In addition,


the online survey will be pilot tested with the 10 teachers who participated in Phase 1 of the

study. To evaluate the technical adequacy of the survey instrument, I will compare the responses

obtained during piloting to the responses provided by the same individuals during Phase 1. I will

also gather qualitative feedback from the 10 teachers on the survey instrument during piloting

and make changes to the survey instrument as needed prior to sending the survey to the larger

sample of participants.

Phase 2 setting and participants. The survey will be administered via the Internet using

InfoCounts, an online survey database tool that allows automatic capture of respondents’ data. I

will first email each of the teachers whose email addresses were provided to me by the National

Board to inform them of the study and request their cooperation. Two days later, I will embed

a link to the survey directly in the text of an email sent to all teachers. All the

email addresses will be stored in a database linked to the web-based survey so I can automate the

process of sending additional requests to participate. Once subjects have completed the survey,

their responses will be entered into the database, the computer will send an email thanking them

for their participation, and their email address will be removed from the follow-up email request

list. Email contact information will be retained for all subjects until the end of the study to allow

for follow-up questions if they are needed.
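
The proposal specifies InfoCounts as the survey platform, so no InfoCounts-specific code is shown here. The sketch below is a generic, hypothetical illustration of the tracking workflow just described, written with Python’s standard library under assumed details (a local SMTP server, placeholder addresses, and made-up table and column names): keep a table of invitees, email the survey link only to those who have not responded, and drop an address from the reminder list once a response is recorded.

```python
# Hypothetical illustration of the follow-up tracking described above; the
# actual study will rely on the InfoCounts service rather than this code.
import smtplib
import sqlite3
from email.message import EmailMessage

SURVEY_URL = "https://example.org/leapp-survey"   # placeholder link
SENDER = "researcher@example.edu"                 # placeholder address

def init_db(path="invitees.db"):
    """Create (if needed) a table of invitee addresses and response status."""
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS invitees "
                "(email TEXT PRIMARY KEY, responded INTEGER DEFAULT 0)")
    return con

def send_survey_link(con, smtp_host="localhost"):
    """Email the survey link to every invitee who has not yet responded."""
    pending = [row[0] for row in
               con.execute("SELECT email FROM invitees WHERE responded = 0")]
    with smtplib.SMTP(smtp_host) as smtp:
        for address in pending:
            msg = EmailMessage()
            msg["From"], msg["To"] = SENDER, address
            msg["Subject"] = "Project LEAPP survey"
            msg.set_content(f"Please complete the survey at {SURVEY_URL}")
            smtp.send_message(msg)

def mark_responded(con, address):
    """Record a completed survey so the address drops off the reminder list."""
    con.execute("UPDATE invitees SET responded = 1 WHERE email = ?", (address,))
    con.commit()
```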

The survey will be administered to all teachers certified by the NBPTS since the 1998-

1999 assessment cycle who have received Board certification in a certificate area with at least

200 certified teachers. Although 1837 teachers received National Board certification prior to the

1998-1999 assessment cycle, insufficient numbers of teachers were certified in each of the

earlier years’ assessment cycles to meet the analytical requirements of the multilevel modeling

I will use for analyses. In addition, subjects will be limited to certification areas in which at


least 200 teachers have been certified to allow for sufficient power in the analyses I will be

running. As in Phase 1, the Director of Research for the National Board has agreed to cooperate in

providing access to this population by sending me complete email addresses of the

approximately 38,000 teachers who have been certified by the Board since 1999 in credential

areas where a minimum of 200 certificates have been awarded (see Appendix A).
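
As a small illustration of these sampling-frame rules (a sketch under assumed column names, not code from the study), the two restrictions could be applied as follows if the Board’s roster were loaded into a pandas DataFrame:

```python
# Hypothetical sketch: restrict the roster to teachers certified in 1999 or
# later whose certificate area has at least 200 certified teachers overall.
# Column names ("certification_year", "certificate_area") are assumptions.
import pandas as pd

def build_sampling_frame(roster: pd.DataFrame, min_per_area: int = 200) -> pd.DataFrame:
    area_counts = roster["certificate_area"].value_counts()
    large_areas = area_counts[area_counts >= min_per_area].index
    recent = roster[roster["certification_year"] >= 1999]
    return recent[recent["certificate_area"].isin(large_areas)]
```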

Analysis

Phase 1. All interviews will be audio-recorded and transcribed in full. During each interview,

draft copies of the participants’ reflective portfolio entries will be collected for later analysis.

Both interview transcripts and documents from Phase 1 of the study will be coded and analyzed

during the course of the study using the methods outlined by Strauss and Corbin (1998) and

Miles and Huberman (1994) in order to develop grounded theory. Two researchers (myself and a

trained associate) will code the data from Phase 1 independently and compare findings, reaching

at least 90% agreement on the emerging themes. Participants will be asked to provide insights

into the correspondence between analyses and their own understanding of their lived experience.

Ongoing analyses will be used to guide subsequent interviews, and emerging themes will be

explored in greater depth with all participants in order to maximize the information gleaned from

Phase 1 of the study. This triangulation of data sources and analysis, coupled with regular

member checking, will add to the reliability of the findings.
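
As an illustration of how the 90% criterion might be computed (a sketch with invented codes; the proposal does not prescribe a particular statistic beyond percent agreement), simple agreement can also be supplemented with Cohen’s kappa, which corrects for agreement expected by chance:

```python
# Illustrative sketch: percent agreement and Cohen's kappa for two coders who
# each assigned one theme code to each transcript segment. Codes are invented.
from collections import Counter

def percent_agreement(codes_a, codes_b):
    return sum(a == b for a, b in zip(codes_a, codes_b)) / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement; a stricter companion to percent agreement."""
    n = len(codes_a)
    p_observed = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

coder_1 = ["reflection", "standards", "evidence", "reflection", "standards"]
coder_2 = ["reflection", "standards", "evidence", "standards", "standards"]
print(percent_agreement(coder_1, coder_2))  # 0.80 for this toy example
print(cohens_kappa(coder_1, coder_2))       # about 0.69
```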

Phase 2. Survey results from Phase 2 of the study will be analyzed using Hierarchical Linear

Modeling (HLM). This statistical procedure is appropriate because the data have a nested

structure: responses from individual teachers are nested within groups, which are in turn nested

within larger groups (see Figure 1 for a diagram showing the hierarchical structure of the data). HLM

allows us to predict outcomes for members of groups while taking into account the


characteristics of both the members and the groups (Arnold, 1992). To increase the reliability of

the findings, HLM analysts recommend a minimum of 150 data points in each of the clusters of

nested variables (Bryk & Raudenbush, 1992). This study falls well within those parameters,

substantially exceeding the recommended sample size.
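
Although the model cannot be finalized until the Phase 1 components are known, a minimal sketch may clarify the form the analysis could take. The specification below is one plausible formalization of the nesting shown in Figure 1 (my illustration, not the study’s committed model): teachers are treated as level-1 units nested within certificate areas at level 2, with dummy-coded gender and ethnicity entered as teacher-level predictors of a survey scale score for a given assessment component.

```latex
% Illustrative two-level specification. Y_{ij} is teacher i's score, in
% certificate area j, on a survey scale for one assessment component.
% Level 1 (teachers):
Y_{ij} = \beta_{0j} + \beta_{1j}\,\text{Female}_{ij}
       + \beta_{2j}\,\text{Black}_{ij}
       + \beta_{3j}\,\text{OtherEthnicity}_{ij} + r_{ij}

% Level 2 (certificate areas):
\beta_{0j} = \gamma_{00} + u_{0j}, \qquad
\beta_{kj} = \gamma_{k0} + u_{kj} \quad (k = 1, 2, 3)
```

Whether the gender and ethnicity slopes are allowed to vary randomly across certificate areas (the u terms at level 2) or are treated as fixed, and whether additional levels are warranted, would be decided once the survey components have been identified.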

Figure 1

Nested Variables from Phase 2 of Study

[Figure 1 is a diagram: scores on the survey (indicating which components of the NBPTS assessment have been most influential) are nested within gender (female, male), within ethnicity (white, black, other), within each certificate area (labeled Certificate Area #1 of 20, #2 of 20, and so on).]

Plan to Control for Threats to Validity

Differential selection and attrition. Survey research carries with it the threat of differential

selection even when the sampling plan attempts to control for this threat. In this study, all

NBPTS certified teachers who have been certified since the 1998-1999 assessment cycle in a

certification area in which a minimum of 200 certificates have been awarded will be included in

the survey. This exhaustive sampling plan will not account for differential response rates,

however. Previous survey research with NBPTS teachers has reported response rates of 41%

(NBPTS, 2001a) to 53% (NBPTS, 2001b). No financial incentives were provided to NBPTS

teachers in those studies to encourage participation, and none will be offered here, so a similar

response rate is anticipated in this study.
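
For a rough sense of scale (my arithmetic applied to figures cited elsewhere in this proposal, not new data), applying those response rates to the approximately 38,000 eligible teachers described above yields

```latex
0.41 \times 38{,}000 \approx 15{,}600
\qquad \text{to} \qquad
0.53 \times 38{,}000 \approx 20{,}100
```

completed surveys. Even if responses were spread evenly across the 25 eligible certificate areas, which they will not be exactly, that would average roughly 620 to 800 respondents per area, comfortably above the 150-per-cluster guideline cited for the HLM analyses at the certificate-area level; cell sizes for the smaller ethnicity-by-gender subgroups will, of course, be thinner.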



To check for differential selection caused by systematic variation in response rates, a

random sample of 100 non-responders will be selected from all teachers in the original sample

who have not responded to the email survey within the three-week window during which data

will be collected. These non-responders will be contacted by telephone and asked to complete an

abbreviated five-question survey pulled directly from the larger survey sent to the whole group. Their

responses will be recorded and checked for systematic variation that would indicate deviation

from the larger sample of respondents.
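
As an illustration of how that check might be carried out (a hypothetical sketch with invented item names and data, not an analysis specified in the proposal), each abbreviated item could be compared across web respondents and telephone-contacted non-responders, for example with Welch’s t-test:

```python
# Illustrative sketch: compare telephone answers from sampled non-responders
# with web answers to the same items. Item names and scores are invented.
from scipy import stats

def compare_groups(web, phone, items, alpha=0.05):
    """web / phone: dicts mapping item name -> list of scores for that group."""
    flagged = []
    for item in items:
        t, p = stats.ttest_ind(web[item], phone[item], equal_var=False)
        print(f"{item}: t = {t:.2f}, p = {p:.3f}")
        if p < alpha:
            flagged.append(item)
    return flagged  # items on which non-responders appear to differ

web_scores = {"item_1": [4, 5, 3, 4, 5, 4], "item_2": [2, 3, 3, 4, 2, 3]}
phone_scores = {"item_1": [4, 4, 3, 5], "item_2": [3, 3, 2, 4]}
print(compare_groups(web_scores, phone_scores, items=["item_1", "item_2"]))
```

With only five items and a modest telephone sample, a descriptive comparison of means, or a correction for the multiple tests, may serve just as well.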

Resentful demoralization and compensatory rivalry. Because this is not an intervention study and

subjects will complete the survey as individuals rather than as members of groups, neither

resentful demoralization nor compensatory rivalry should threaten the validity of findings.

History. Data will be collected from the entire sample during a three-week window in November

and December of 2006. Although it is impossible to control for the possibility of an unrelated

event influencing respondents’ results during such a time, the large sample size and geographic

diversity of the sample should reduce the potential impact of history as a threat to validity.

Maturation. The threat of maturation confounding the results should be minimized by the three-

week window in which data collection will occur.

Testing. There is no reason to believe that the survey through which data are collected for Phase

2 of the study will have an influence on teachers’ perception of which aspects of completing the

NBPTS assessment affected their professional development. The survey is a low-stakes measure,

and because teachers’ responses will be analyzed in group format rather than individually, there

is no reason to believe that this threat to validity will play a role in this study.

Instrumentation. As described earlier, the survey instrument used in Phase 2 of the study will

undergo content review and validation for use with the target population prior to being


administered to the research sample. These steps should result in minimal impact from this threat

to validity.

Statistical regression. Because the survey instrument will be administered to the entire sample

only once, the research design does not allow for control of statistical regression. However, the

large sample size should reduce the potential impact of this threat to validity.

Experimenter bias. There is a potential for experimenter bias in this study. I received my NBPTS

certification in Adolescence and Young Adulthood English Language Arts in 2002. My

experiences with the NBPTS assessment process might influence data collection and analysis

during Phase 1 as well as instrument development during Phase 2 of the study. The steps I will

take to reduce this threat to validity are described below.

Phase 1

Because I will be conducting all interviews and participating in the qualitative analyses

during Phase 1, taking steps to control the potential for researcher bias is critically important.

First, I will enlist a colleague who has expertise in teacher education but is unfamiliar with the

National Board assessment to assist with developing questions for both initial and follow-up

interviews. Second, a different trained colleague who is not Board certified will participate in

analysis of interview transcripts and documents gathered during Phase 1. Third, I will conduct

regular member checks of results of analyses to ensure that analyses appropriately capture

participants’ lived experiences without undue influence from my own experience with the

NBPTS assessment.

Phase 2

Systematic steps to control for researcher bias also are included in the plan for instrument

development during Phase 2 of the study. The survey instrument will undergo standard content


review and validation procedures. It will be evaluated for appropriateness and bias by a

minimum of six non-Board certified K-12 teachers and six researchers familiar with survey

design. These evaluations will be used to make needed revisions, and the survey will be piloted

with the 10 Board certified teachers who participated in Phase 1 of the study.

Discussion

Significance of the Work, Especially as it Relates to Education

This study extends the research on the link between the NBPTS assessment and teacher

professional development in three key ways.

1. The study is not limited to elementary teachers; it samples from teachers in all subject

areas and all levels.

2. The study harnesses the power of both qualitative and quantitative methodology; multiple

case studies inform development of a survey instrument that is then administered to the

entire population of teachers certified by the NBPTS since 1999. Previous research on

how participating in the NBPTS assessment affects teachers has focused on either

quantitative (Bond, Smith, Baker, & Hattie, 2000; NBPTS, 2001) or qualitative (Place &

Coskie, 2004) methodology, but not both.

a. Case studies offer insights into key aspects of the assessment to explore with the later

survey (Yin, 1999; Miles & Huberman, 1994).

b. Survey results will be analyzed using Hierarchical Linear Modeling (Bryk &

Raudenbush, 1992), a powerful statistical procedure that will account for the multiple

layers into which the data are nested to provide evidence of the extent to which

components of the assessment differentially promote professional development for


different sub-groups of Board-certified teachers. These analyses will inform theory

building as well as theory delimitation.

3. The study broadens the scope of the research on the ways in which participating in the

NBPTS assessment serves as professional development by examining not only the

portfolio but also the assessment center exercises. Previous research in this area has been

limited to the portfolio. Because different components of the assessment may work

conjunctively to promote professional development, examination of the complete

assessment package is important. Unless all parts of the assessment are explored, possible

interaction effects with important implications may be overlooked.

The findings from this study will be relevant to a wide audience. The NBPTS might use the

findings to inform its assessment process and to assist in the development of future

assessment exercises. In addition, school districts interested in supporting the professional

development of their teaching staff might use the results to help guide them in their choice of

activities. Finally, universities engaged in teacher education may find that some of the activities

they currently require could be revised to prompt more significant learning in their teachers.

Proposed Committee Members

The committee will be composed of four members with differing areas of expertise (see

Table 1). Three of the members are faculty in the College of Education; the outside member will

come from the sociology department at the University of Oregon.

Table 1

Proposed Committee Members

Name            Role             Rationale
Jerry Tindal    Chair            Advisor, familiar with quantitative research
Ron Beghetto    Member           Familiar with mixed methodology and teacher preparation
Paul Yovannof   Member           Statistical expertise
Jean Stockard   Outside member   Familiar with qualitative research

Plan for Completion of the Literature Synthesis

The literature synthesis needs to be expanded in several key areas. First, the section on

professional development needs considerable work. Additional research on factors related to

teacher professional development needs to be added, and each of the sections that exist in the

current manuscript needs to be enlarged. Second, empirical studies on the ways in which teachers

certified in different content areas and grade levels differ from one another need to be added.

Third, literature on gender differences related to learning and the types of activities most likely to

promote professional growth for men and for women needs to be added. Finally, any additional

research on the NBPTS assessment published since the date I finished this prospectus will need

to be synthesized.


References

Arnold, C.L. (1992). An introduction to hierarchical linear models. Measurement and Evaluation

in Counseling and Development, 25, 58-90.

Bollough, R. V., Kauchak, D., Crow, N., Hobbs, S., & Stoke, D. (1997). Professional

development schools: Catalysts for teacher and school change. Teaching and Teacher

Education, 13(2), 153–169.

Burroughs, R. (2000). Communities of practice and discourse communities: Negotiating

boundaries in NBPTS certification. Teachers College Record, 102, 344-375.

Bryk, A. S., & Raudenbush, S. W. (1992). Hierarchical linear models. Newbury Park, CA: Sage.

Casey, M.B. (1996). Gender, sex, and cognition: Considering the interrelationship between

biological and environmental factors. Learning & Individual Differences, 8, 39-53.

Cohen, D.K., & Hill, H.C. (2001). Learning policy: When state education reform works. New

Haven, CT: Yale University Press.

Cohen, D.K., Hill, H.C., & Kennedy, M. (2002). The benefit to professional development.

American Educator, 26(2), 22-25.

Colom, R., Contreras, M. J., Arend, I., Leal, O. G., & Santacreu, J. (2004). Sex differences in

verbal reasoning are mediated by sex differences in spatial ability. The Psychological Record, 54,

365-372.

Darling-Hammond, L., & McLaughlin, M. (1996). Policies that support professional

development in an era of reform. In M. McLaughlin & I. Oberman (Eds.), Teacher learning:

New policies, new practices (pp. 202-218). New York: Teachers College Press.


Darling-Hammond, L. (2000). Teacher quality and student achievement: A review of state policy

evidence. Education Policy Analysis Archives, 8(1). Retrieved June 16, 2004, from

http://epaa.asu.edu/epaa/v8n1/.

DeBacker, T. K. & Nelson, R. M. (2000). Motivation to learn science: differences related to

gender, class type, and ability. Journal of Educational Research, 93, 245-254.

Esch, C., Shields, P., & Young, V. (1999). Strengthening California’s Teacher Information

System. The Center for the Future of Teaching and Learning; research conducted by SRI

International.

Fuhrman, S. (Ed.). (2001). From the Capitol to the classroom: Standards-based reform in the

states. The National Society for the Study of Education Yearbooks.

Hanushek, E., Kain, J., & Rivkin, S. (2001). Why public schools lose teachers (Working Paper

No. W8599). National Bureau of Economic Research.

Killeen, K.M., Monk, D.H., & Plecki, M.L. (2002). School district spending on professional

development: Insights available from national data (1992-1998). Journal of Education

Finance, 20, 25-50. Retrieved August 3, 2004, from www.depts.washington.

edu/ctpmail/PDFs/JEFArticle-KKDMMP.pdf.

Kimura, D. (1999). Sex and cognition. Cambridge, MA: The MIT Press.

Miles, M. & Huberman, A. (1994). Qualitative data analysis: A sourcebook. Thousand Oaks,

CA: Sage.

National Board for Professional Teaching Standards. NBCTs by year. Retrieved October 31,

2004, from http://www.nbpts.org/nbct/nbctdir_byyear.cfm

National Center for Education Statistics. (2004). Trends in international mathematics and science

study. Retrieved October 18, 2004, from http://nces.ed.gov/timss/


National Commission on Excellence in Education. (1983, April). A Nation at Risk. Washington,

DC: U.S. Department of Education.

Place, N. & Coskie, T. (2004). Learning from the national board portfolio process: What

teachers discovered about literacy teaching and learning. Retrieved October 14, 2004, from

http://www.cstp-wa.org/Accomplishedteaching/Impact_studies/impact.ht

Rice, J. K. (2003). Teacher quality: Understanding the effectiveness of teacher attributes.

Washington, D.C.: Economic Policy Institute.

Shen, J. & Poppink, S. (2003). The certification characteristics of the public teaching force:

National, longitudinal, and comparative perspectives. Educational Horizons, 81, 130-137.

Strauss, A. & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for

developing grounded theory, 2nd ed. Thousand Oaks, CA: Sage.

U.S. Department of Education. (1991). AMERICA 2000: An Education Strategy. Washington,

DC: Author.

U.S. Department of Education. (2001). No Child Left Behind. Washington, DC: Author.

Wilson, S. W., Floden, R. E., & Ferrini-Mundy, J. (2001). Teacher preparation research:

Current knowledge, gaps, and recommendations, a research report prepared for the U.S.

Department of Education and the Office for Educational Research and Improvement.

University of Washington: Center for the Study of Teaching and Policy.

Yin, R.K. (2003). Case study research: Design and methods (3rd ed.). Applied Social Research

Methods Series, Volume 5. Thousand Oaks, CA: Sage.


Appendix A

The National Board offers certificates in the following 27 areas. Numbers in brackets

indicate the total number of Board certified teachers in that area as of the end of the

2003-2004 assessment cycle.

• Generalist / Early Childhood (ages 3–8) [8184]

• Generalist / Early Adolescence [647]

• Generalist / Middle Childhood [7894]

• Middle Childhood Generalist (ages 7–12) [7894]

• Art Early and Middle Childhood (ages 3–12) [351]

• Art Early Adolescence through Young Adulthood (ages 11–18+) [815]

• Career and Technical Education Early Adolescence through Young Adulthood

(ages 11–18+) [1639]

• English as a New Language Early and Middle Childhood (ages 3–12) [310]

• English as a New Language Early Adolescence through Young Adulthood (ages

11–18+) [135]

• English Language Arts Early Adolescence (ages 11–15) [2691]

• English Language Arts Adolescence and Young Adulthood (ages 14–18+) [2690]

• Exceptional Needs Specialist (ages birth–21+) [2984]

• Library Media Early Childhood through Young Adulthood (ages 3–18+) [1089]

• Literacy: Reading-Language Arts Early and Middle Childhood (ages 3–12) [342]

• Mathematics Early Adolescence (ages 11–15) [1274]

• Mathematics Adolescence and Young Adulthood (ages 14–18+) [1694]

• Music Early and Middle Childhood (ages 3–12) [501]


• Music Early Adolescence through Young Adulthood (ages 11–18+) [320]

• Physical Education Early and Middle Childhood (ages 3–12) [393]

• Physical Education Early Adolescence through Young Adulthood (ages 11–18+)

[317]

• School Counseling Early Childhood through Young Adulthood (ages 3-18+)

[350]

• Science Early Adolescence (ages 11–15) [1198]

• Science Adolescence and Young Adulthood (ages 14–18+) [1820]

• Social Studies-History Early Adolescence (ages 11–15) [826]

• Social Studies-History Adolescence and Young Adulthood (ages 14–18+) [1184]

• World Languages Other than English / Early and Middle Childhood [51]

• World Languages Other than English / Early Adolescence through Young

Adulthood [510]


Appendix B

Nested Structure of the Variables Examined in Phase 2 of Project LEAPP

• Different Components of the NBPTS Assessment (# of groups to be determined

during Phase 1)

• Certification Area (# of groups = 25; see Appendix A—all certification areas with

at least 200 teachers will be included in analyses)

• Teacher’s Ethnicity (# of groups = 3; White, Black, or Other)

• Teacher’s Gender (# of groups = 2; Male or Female)

[Diagram: scores on the survey (indicating which components of the NBPTS assessment have been most influential) nested within gender (female, male), within ethnicity (white, black, other), within each certificate area (Certificate Area #1 of 25, #2 of 25, and so on).]


Appendix C

Timeline for Dissertation

Task Date To Be Completed

Make initial contact with NBPTS October 13, 2004

Write prospectus April 23, 2004

Defend proposal August 11, 2005

Complete Human Subjects Clearance August 31, 2005

Recruit participants for Phase 1 September 20, 2005

Complete case studies with 10 participants from Phase 1 August 10, 2006

Analyze data from Phase 1 September 10, 2006

Write draft of survey for Phase 2, based on Phase 1 information September 15, 2006

Content review of survey (minimum 12 content experts) October 5, 2006

Revise and pilot survey with a small sample of Board-certified teachers (n = 10) October 15, 2006

Analyze data from pilot and revise survey instrument October 30, 2006

Obtain complete email addresses for all teachers certified by the NBPTS since 1999 October 30, 2006

Send initial introductory email to all potential subjects November 5, 2006

Send 1st email containing link to the survey to all potential subjects November 7, 2006

Send 2nd email containing survey link to all subjects who have not yet responded November 14, 2006

Send 3rd email containing survey link to all subjects who have not yet responded November 21, 2006


Analyze data from Phase 2 December 20, 2006

Conduct telephone follow-up of random sample of 100 non-respondents to check for differential selection / response rate interaction January 28, 2007

Write Results section of dissertation March 30, 2007

Write Discussion section of dissertation April 30, 2007

Defend dissertation May 24, 2007