Research Design in Qualitative/Quantitative/Mixed Methods



8/20/2019 Research Design in Qualitative/Quantitative/Mixed Methods

http://slidepdf.com/reader/full/research-design-in-qualitativequantitativemixed-methods 1/36


10. Research Design in Qualitative/Quantitative/Mixed Methods

Michael R. Harwell

University of Minnesota

The editors of this handbook assert that educational inquiry is weakly connected to the study of promising ideas. The implication is that much educational research has a “nuts and bolts” character that limits creative problem generation or problem solving and discourages risk taking. The consequences of this practice are missed opportunities to advance our understanding of important educational problems and solutions. This is a bold assertion, but it is one that is supported by an examination of published and unpublished educational studies.

A key player in this process is a study’s research design. In many educational studies, research design reflects a “cookie cutter” approach in which the same designs are repetitively and narrowly applied in ways that may limit what is studied and how it is studied. Research-based ideas that cannot be easily mapped onto a small number of designs acceptable to funders and professional journals are likely to be abandoned or modified to fit widely used research designs.

The assertion of this chapter is that the study of promising ideas in education would be well served by the increased use of research designs that draw on multiple traditions of inquiry. In some cases, these designs are currently available, and the challenge is for more researchers to employ them (or continue to employ them) in a rigorous manner. In other instances, new designs will need to be developed. The net effect is that the catalog of research designs available to educational researchers is expanding dramatically. This change does not threaten or undermine the value of designs that are currently widely used, but rather it increases the pool of research designs available to support rigorous inquiry.

This chapter explores the role of research design in the study of promising ideas in education, assuming two conditions are present. First is that the idea being pursued is worth pursuing, that is, an idea that could reasonably lead to, and support, the formation of important educational questions and identification of solutions to important educational problems. Second is that every phase of the research process linked to studying a promising idea is rigorous. These conditions simplify an examination of the role of research design in the study of promising ideas.

Section III. Opportunities and Challenges in Designing and Conducting Inquiry

Characteristics of three research methodologies (qualitative methods, quantitative methods, mixed methods) and their role in studying ideas believed to be worth studying are described. I use two studies to illustrate strengths in the research design, as well as opportunities to enhance the results, and give greater attention to mixed methods because of their relative newness and potential. Three ways that researchers can enhance the ability of research designs to better support the study of promising ideas in educational studies are described. I conclude by arguing that mixed methods offer an especially promising path toward using research design in ways that support rigorous inquiry.

Research Design

In educational research, it is usually possible (and certainly popular) to characterize a research study’s methodology as qualitative; as quantitative; or as involving both qualitative and quantitative methods, in which case it is typically referred to as mixed methods. The term research design is widely used in education, yet it takes on different meanings in different studies. (The terms research method and research design will be used interchangeably in this chapter.) For example, in one study, research design may reflect the entire research process, from conceptualizing a problem to the literature review, research questions, methods, and conclusions, whereas in another study, research design refers only to the methodology of a study (e.g., data collection and analysis). Perhaps not surprisingly, there is variation within and between methodologies in how research design is defined. However, this variation does not affect an examination of the role of research design in promoting rigorous study of promising ideas and, thus, a single definition of research design is not adopted in this chapter. I assume that research questions are the driving force behind the choice of a research design and any changes made to elements of a design as a study unfolds.

Identifying a study’s research design is important because it communicates information about key features of the study, which can differ for qualitative, quantitative, and mixed methods. However, one common feature across research designs is that at one or more points in the research process, data are collected (numbers, words, gestures, etc.), albeit in different ways and for different purposes. Thus, qualitative studies are, among other things, studies that collect and analyze qualitative data; quantitative studies are, among other things, studies that collect and analyze quantitative data; and so on.

Crotty (1998) described four key features to consider in research design: the epistemology that informs the research, the philosophical stance underlying the methodology in question (e.g., post-positivism, constructivism, pragmatism, advocacy/participatory; see Morgan, 2007), the methodology itself, and the techniques and procedures used in the research design to collect data. These features inform the descriptions of research designs below.

Qualitative Research Methods

Qualitative research methods focus on discovering and understanding the experiences, perspectives, and thoughts of participants—that is, qualitative research explores meaning, purpose, or reality (Hiatt, 1986). In other words,

qualitative research is a situated activity that locates the observer in the world. It consists of a set of interpretive, material practices that make the world visible. These practices transform the world. They turn the world into a series of representations, including field notes, interviews, conversations, photographs, recordings, and memos to the self. At this level, qualitative research involves an interpretive, naturalistic approach to the world. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them. (Denzin & Lincoln, 2005, p. 3)

Central to this inquiry is the presence of multiple “truths” that are socially constructed (Lincoln & Guba, 1985). Qualitative research is usually described as allowing a detailed exploration of a topic of interest in which information is collected by a researcher through case studies, ethnographic work, interviews, and so on. Inherent in this approach is the description of the interactions among participants and researchers in naturalistic settings with few boundaries, resulting in a flexible and open research process. These unique interactions imply that different results could be obtained from the same participant depending on who the researcher is, because results are created by a participant and researcher in a given situation (pp. 39–40). Thus, replicability and generalizability are not generally goals of qualitative research.

Qualitative research methods are also described as inductive, in the sense that a researcher may construct theories or hypotheses, explanations, and conceptualizations from details provided by a participant. Embedded in this approach is the perspective that researchers cannot set aside their experiences, perceptions, and biases, and thus cannot pretend to be objective bystanders to the research. Another important characteristic is that the widespread use of qualitative methods in education is relatively new, dating mostly to the 1980s, with ongoing developments in methodology and reporting guidelines (Denzin, 2006). The relative newness of this methodology also means that professional norms impacting research, including evidence standards, funding issues, and editorial practices, are evolving (see, e.g., Cheek, 2005; Freeman, deMarrais, Preissle, Roulston, & St. Pierre, 2007). Good descriptions of qualitative methods appear in Bogdan and Biklen (2003), Creswell (1998), Denzin and Lincoln (2005), Miles and Huberman (1994), and Patton (2002).

There are several categorizations of research designs in qualitative research, and none is universally agreed upon (see, e.g., Denzin & Lincoln, 2005). Creswell (2003) listed five strategies of inquiry in qualitative research that I treat as synonymous with research design: narratives, phenomenological studies, grounded theory studies, ethnographies, and case studies. Creswell also described six phases embedded in each research design that are more specific than those suggested by Crotty (1998), but still encompass virtually all aspects of a study: (1) philosophical or theoretical perspectives; (2) introduction to a study, which includes the purpose and research questions; (3) data collection; (4) data analysis; (5) report writing; and (6) standards of quality and verification.

Journals that publish qualitative methodology papers and qualitative research studies in education include Qualitative Research, Qualitative Inquiry, Field Methods, American Educational Research Journal, Educational Researcher, and the International Journal of Qualitative Studies in Education. Examples of the use of qualitative research designs are provided by Stage and Maple (1996), who used a narrative design to describe the experiences of women who earned a bachelor’s or master’s degree in mathematics and opted to earn a doctorate in education; Gaines (2005), who explored the process of interpreting interviews and media collected during the author’s visits to India in ways that took into account his identity; Harry, Sturges, and Klingner (2005), who used the methods of grounded theory to develop a theory providing a new perspective on ethnic representation in special education; Brown (2009), who studied the perspectives of university students identified as learning disabled; and Chubbuck and Zembylas (2008), who examined the emotional perspective and teaching practices of a White novice teacher at an urban school.

These studies reflect several important features of qualitative research, including a focus on discovering and understanding the experiences, perspectives, and thoughts of participants through various strategies of inquiry. The studies were also conducted in naturalistic settings in which inquiry was flexible and guided by participants’ comments, which in some instances were used to construct explanations of their views and perspectives. An important feature of several of these studies is their use of elements of different strategies of inquiry.

Quantitative Research Methods

Quantitative research methods attempt to maximize objectivity, replicability, and generalizability of findings, and are typically interested in prediction. Integral to this approach is the expectation that a researcher will set aside his or her experiences, perceptions, and biases to ensure objectivity in the conduct of the study and the conclusions that are drawn. Key features of many quantitative studies are the use of instruments such as tests or surveys to collect data, and reliance on probability theory to test statistical hypotheses that correspond to research questions of interest. Quantitative methods are frequently described as deductive in nature, in the sense that inferences from tests of statistical hypotheses lead to general inferences about characteristics of a population. Quantitative methods are also frequently characterized as assuming that there is a single “truth” that exists, independent of human perception (Lincoln & Guba, 1985).
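The deductive logic described above, computing a test statistic from sample data and judging it against what probability theory predicts under the null hypothesis, can be sketched in a few lines of Python. The scores and group labels below are invented for illustration and are not drawn from any study cited in this chapter; the statistic shown is Welch's t, one common way to compare two group means.

```python
import math
from statistics import mean, variance

def two_sample_t(sample_a, sample_b):
    """Welch's t statistic: difference in means over its standard error."""
    n_a, n_b = len(sample_a), len(sample_b)
    se = math.sqrt(variance(sample_a) / n_a + variance(sample_b) / n_b)
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical test scores for two instructional groups.
group_a = [78, 85, 90, 72, 88, 81, 94, 76]
group_b = [70, 74, 82, 68, 79, 73, 77, 71]

t = two_sample_t(group_a, group_b)
print(round(t, 2))  # a large |t| casts doubt on the hypothesis of equal means
```

A researcher would compare t to a reference distribution to obtain a p value; the point here is only that the inference runs from sample statistics to a claim about population parameters.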

Trochim and Land (1982) defined quantitative research design as the

glue that holds the research project together. A design is used to structure the research, to show how all of the major parts of the research project—the samples or groups, measures, treatments or programs, and methods of assignment—work together to try to address the central research questions. (p. 1)

Definitions of quantitative research design are complicated by the fact that this term is often used to identify the experimental design reflecting the arrangement of independent and dependent variables associated with data collection. Older categorizations of experimental designs tend to use the language of analysis of variance in describing these layouts—for example, a single-factor, completely between-subjects, fixed-effects design or a factorial split-plot design with between- and within-subject effects (Kempthorne, 1952; Winer, 1962). These descriptions were typically linked to an experimental design in which subjects were randomly assigned to treatment conditions, although quasi-experimental designs in which intact groups (e.g., African American and White students) are compared, and correlational designs in which no definable groups are present, also received attention (e.g., Campbell & Stanley, 1963).

More recent categorizations of research designs largely abandon the analysis-of-variance framework and instead rely on a trichotomy: randomized designs or, equivalently, randomized controlled trials in which participants are assigned at random to treatment conditions, quasi-experimental designs, and correlational designs (Pedhazur & Schmelkin, 1991; What Works Clearinghouse, 2008). These newer categorizations also tend to focus on conditions needed to justify strong causal inferences (Schneider, Carnoy, Kilpatrick, Schmidt, & Shavelson, 2007; Shadish, Cook, & Campbell, 2002).

A quantitative research design also involves phases that are superficially similar to those offered by Crotty (1998) and Creswell (2003) for qualitative research, but that are quite different in purpose and execution: (1) introduction to a study that includes the purpose and research questions; (2) theoretical perspectives or models; (3) methodology that encompasses sampling and an evaluation of external validity, instrumentation that may include an evaluation of construct validity, experimental design that includes an evaluation of internal validity and data collection, and data analysis that includes an evaluation of statistical conclusion validity; (4) reporting the results; and (5) conclusions and implications (Pedhazur & Schmelkin, 1991; Shadish et al., 2002).

Quantitative methods have a long history, dating to at least the 1930s, that has produced strong professional norms that impact research activities, such as the criteria used to make funding decisions (What Works Clearinghouse, 2008) and decisions about the kinds of studies and results likely to be published (Bozarth & Roberts, 1972; Leech, Morgan, Wang, & Gliner, 2010; Rosenthal, 1979).

Good descriptions of quantitative methods appear in Bryman (2004); Kerlinger (1964); Balnaves and Caputi (2001); Gay, Mills, and Airasian (2005); and the Research Methods Knowledge Base (Trochim, 2006). There are also several journals that publish quantitative methodology papers and quantitative research studies in education, including Psychological Methods, Journal of Educational and Behavioral Statistics, British Journal of Mathematical and Statistical Psychology, American Educational Research Journal, American Journal of Evaluation, and Educational Evaluation and Policy Analysis.

Examples of quantitative research designs in the educational literature are provided by Howell, Wolf, Campbell, and Peterson (2002), who used a randomized design to study the effectiveness of school vouchers in improving achievement; Jacob and Lefgren (2004), who used a quasi-experimental (regression discontinuity) design to study the relationship between receiving a remedial education and achievement; and Garner and Raudenbush (1991), who used a correlational design to examine the relationship between neighborhood characteristics and achievement. These studies reflect several important features of quantitative research, including an assumption that researchers set aside their experiences, perceptions, and biases in conducting the study and drawing conclusions, data collection that did not treat the researcher as the data collection instrument, and the use of hypothesis testing to make deductive inferences about characteristics of a population.

In sum, research design in education has moved from an almost exclusive reliance on quantitative methods to a more varied approach that includes qualitative research methods. As Lincoln and Guba (1985) pointed out, both qualitative and quantitative research methods emphasize truth, consistency, applicability, and neutrality while taking different procedural approaches to assure quality. Thus, it might be imagined that qualitative and quantitative methods have been promoted to educational researchers in the spirit of “Use the methodology or methodologies that is (are) most appropriate and helpful for your purposes,” but that is not quite the case. Rather, educational research in the last two decades has been home to considerable conflict fueled by advocates of each methodology who have emphasized weaknesses of the other in terms of epistemology, importance for the field, permissible inferences, and so forth (see, e.g., Berkenkotter, 1991; Denzin & Lincoln, 2005; Gage, 1989; Howe, 2009; Maxwell, 2004; Morse, 2006; Pegues, 2007). Much of the debate has been useful in encouraging researchers to think about features of these methodologies important for their work and for the field (e.g., Moss et al., 2009), but its prolonged and sometimes acrimonious nature has likely prompted many researchers to simply sit it out.

Mixed Methods

The qualitative versus quantitative debate has coincided with the rapid development of mixed methods, which combine qualitative and quantitative methods in ways that ostensibly bridge their differences in the service of addressing a research question. The roots of mixed methods are typically traced to the multi-trait, multi-method approach of Campbell and Fiske (1959, cited in Teddlie & Tashakkori, 2009, p. 31), although it is considered a relatively new methodology whose key philosophical and methodological foundations and practice standards have evolved since the early 1990s (Tashakkori, 2009).

Johnson and Turner (2003) have argued that the fundamental principle of mixed methods research is that multiple kinds of data should be collected with different strategies and methods in ways that reflect complementary strengths and non-overlapping weaknesses, allowing a mixed methods study to provide insights not possible when only qualitative or quantitative data are collected. Put another way, mixed methods research allows for the “opportunity to compensate for inherent method weaknesses, capitalize on inherent method strengths, and offset inevitable method biases” (Greene, 2007, p. xiii).

While mixed methods research combines qualitative and quantitative methods in ways that draw on the strengths of both traditions of inquiry, it is a clear step away from the boundaries and practices of those traditions, especially those linked to quantitative methods. According to Johnson and Onwuegbuzie (2004),

Mixed methods research is formally defined here as the class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study. Mixed methods research also is an attempt to legitimate the use of multiple approaches in answering research questions, rather than restricting or constraining researchers’ choices (i.e., it rejects dogmatism). It is an expansive and creative form of research, not a limiting form of research. It is inclusive, pluralistic, and complementary, and it suggests that researchers take an eclectic approach to method selection and the thinking about and conduct of research. (pp. 17–18)

This definition highlights the potential value of mixing multiple elements of qualitative and quantitative methods, as well as the potential complexity of doing so.

Caracelli and Greene (1997) identified three typical uses of a mixed methods study: (1) testing the agreement of findings obtained from different measuring instruments, (2) clarifying and building on the results of one method with another method, and (3) demonstrating how the results from one method can impact subsequent methods or inferences drawn from the results. These purposes appear in some form in many mixed methods studies in diverse fields including education (Taylor & Tashakkori, 1997), psychology (Todd, Nerlich, McKeown, & Clarke, 2004), criminology (Maruna, 2010), nursing and health sciences (O’Cathain, 2009), family research (Greenstein, 2006), and business (Bryman & Bell, 2007). In part, recent increases in the number of mixed methods studies can be attributed to increases in funding (Creswell & Plano Clark, 2007; Plano Clark, 2010). Still, journal articles and funding programs offer abundant evidence that widespread acceptance and funding of mixed methods in educational research is a work in progress (e.g., Alise & Teddlie, 2010; National Research Council, 2002, 2005; O’Cathain, Murphy, & Nicoll, 2007; What Works Clearinghouse, 2008).

Despite their growing popularity, there is not widespread agreement on exactly what constitutes a mixed methods study (Morse, 2010). For example, some authors insist that a mixed methods study is any study with both qualitative and quantitative data, whereas other authors say a mixed methods study must have a mixed methods question, both qualitative and quantitative analyses, and integrated inferences (Tashakkori, 2009). There is also disagreement regarding various aspects of mixed methods, such as when mixing should occur (e.g., at the point of designing a study, during data collection, during data analyses, and/or at the point of interpretation).

Still other authors have criticized the whole idea of mixed methods (Denzin, 2006; Sale, Lohfeld, & Brazil, 2002; Smith & Hodkinson, 2005), criticism which is sometimes framed in terms of the response of advocates of a particular “stance” to arguments for mixing methods: the purist stance, the pragmatic stance, and the dialectical stance (Greene & Caracelli, 1997; Johnson & Onwuegbuzie, 2004; Lawrenz & Huffman, 2002). Those adopting a purist stance argue that mixed methods are inappropriate because of the incompatibility of the worldviews or belief systems (paradigms) (Tashakkori & Teddlie, 2003) underlying qualitative and quantitative methods; that is, qualitative and quantitative methods are studying different phenomena with different methods (Smith & Hodkinson, 2005). Some purists have also raised concerns that mixed methods designs leave qualitative methods in the position of being secondary to quantitative methods (Denzin, 2006; Giddings, 2006; Yin, 2006).

Researchers who adopt a pragmatic stance argue that paradigm differences are independent of, and hence can be used in conjunction with, one another in the service of addressing a question (Johnson & Onwuegbuzie, 2004; Morgan, 2007). Wheeldon (2010) summarizes this view:

Instead of relying on deductive reasoning and general premises to reach specific conclusions, or inductive approaches that seek general conclusions based on specific premises, pragmatism allows for a more flexible abductive approach. By focusing on solving practical problems, the debate about the existence of objective “truth,” or the value of subjective perceptions, can be usefully sidestepped. As such, pragmatists have no problem with asserting both that there is a single “real world” and that all individuals have their own unique interpretations of that world. (p. 88)

However, the pragmatic stance also has its critics (Mertens, 2003; Sale et al., 2002). Dialectical researchers argue that multiple paradigms are compatible and should be used, but their differences and the implications for research must be made clear (Greene & Caracelli, 1997). It is important to emphasize that a researcher who adopts a dialectic stance would, other things being equal, draw on the same procedures for mixing as one adopting a pragmatic stance. The issue of stances is certainly not settled, and additional developments on this topic continue to appear in the mixed methods literature (e.g., the design stance of Creswell & Plano Clark, 2007).

Mixed methods designs seem especially firmly rooted in the evaluation literature. An early important paper in this area was Greene, Caracelli, and Graham (1989), which highlighted five major purposes of (or justifications for) a mixed methods evaluation. One is triangulation, which examines the consistency of findings, such as those obtained through different instruments, and which might include interviews and surveys. According to Greene et al., triangulation improves the chances that threats to inferences will be controlled. A second purpose is complementarity, which uses qualitative and quantitative data results to assess overlapping but distinct facets of the phenomenon under study (e.g., in-class observations, surveys), and a third is development, in which results from one method influence subsequent methods or steps in the research; for example, interviews with teachers might suggest that an additional end-of-year assessment be added. A fourth purpose of a mixed methods evaluation is initiation, in which results from one method challenge other results or stimulate new directions for the research; for example, teacher interviews might challenge results provided by administrators in a school district. The fifth and last purpose is expansion, which may clarify results or add richness to the findings.

A number of frameworks for mixed methods have appeared in this literature, many of which have built on the work of Greene et al. (1989) (Caracelli & Greene, 1997; Creswell, 2003; Johnson & Onwuegbuzie, 2004; Lawrenz & Huffman, 2002; Leech & Onwuegbuzie, 2007; Morse, 2010; Newman, Ridenour, Newman, & DeMarco, 2003; Tashakkori & Teddlie, 2003). These frameworks differ in many ways, but they all successfully convey a sense of the large number of methodological tools available to researchers. However, none of these frameworks has been widely adopted.

The framework of Creswell (2003) is used here to categorize research designs in mixed methods, but this choice should not be interpreted to mean that it has been widely endorsed by mixed methods theoreticians and practitioners. This framework is described next, and studies by Howell et al. (2002) and Norman (2008) are used to illustrate strengths in their research design, as well as opportunities to enhance their results by employing a mixed methods approach. In doing so, I assume that different paradigms can be matched to answer research questions of interest, and thus I adopt a pragmatic stance. It is important to emphasize that there are many complexities not captured by these examples, including the nature and timing of the mixing and the use of elements from multiple mixed methods approaches in a given research design.

Creswell (2003) described six somewhat overlapping mixed methods research designs, referred to as strategies of inquiry, that guide the construction of specific features of a mixed methods study. The designs vary in whether qualitative and quantitative data are collected sequentially or concurrently, the weight given one kind of data or another, when the mixing is done, and the extent to which a theoretical perspective (e.g., post-positivism, constructivism) is present and guides the research design.

The first mixed methods approach described in Creswell (2003) is the sequential explanatory design, in which qualitative data are used to enhance, complement, and in some cases follow up on unexpected quantitative findings. In this approach, the focus is on interpreting and explaining relationships among variables, and the work may or may not be guided by a particular theoretical perspective. Quantitative data are collected and analyzed first, followed by the collection and analysis of qualitative data, meaning that qualitative and quantitative data are not combined (mixed) in the data analysis; rather, integration takes place when the findings are interpreted. In general, results are interpreted in ways that usually give more weight to the quantitative component. The separate phases of design, data collection, and reporting for qualitative and quantitative data are considered strengths, because this arrangement is relatively easy to implement. The weaknesses of this approach are the time and resources needed for separate data collection phases (as opposed to only collecting qualitative or quantitative data) and the expertise needed to integrate the qualitative and quantitative findings. Morgan (1998) suggested that the sequential explanatory design is the most frequently used mixed methods approach.

As an example, consider the quantitative study of Howell et al. (2002). These authors were interested in learning if students who received vouchers allowing them to enroll in a private school (and who subsequently enrolled) showed improved achievement compared to students who did not receive a voucher and attended a public school. To explore this question, these authors examined achievement data for more than 4,000 students in three U.S. cities in which vouchers were randomly awarded via a lottery. This arrangement produced a randomized design in which receiving a voucher (yes, no) was the key independent variable, and student achievement was the key dependent variable. No theoretical model was offered. The results showed an average increase in the achievement of African American students who received a voucher and subsequently enrolled in a private school compared to African American students who did not receive a voucher, but not for any other groups in the study. From a quantitative perspective, this study has many strengths, particularly random assignment, that enhance causal arguments.
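The core quantitative comparison in a randomized design of this kind is a difference in group means. The following is a minimal sketch, not Howell et al.'s actual analysis; the achievement scores and group sizes are invented for illustration, and the helper `welch_t` is a hypothetical name.

```python
# Sketch of comparing mean achievement for voucher recipients vs. non-recipients
# in a randomized design. All scores below are fabricated for illustration.
from statistics import mean, stdev

voucher = [62, 58, 71, 65, 60, 68, 63, 70]   # treatment group (received voucher)
control = [57, 55, 64, 59, 52, 61, 58, 60]   # control group (no voucher)

def welch_t(a, b):
    """Welch's t statistic for a difference in two group means."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

effect = mean(voucher) - mean(control)       # estimated treatment effect
t_stat = welch_t(voucher, control)
print(f"mean difference = {effect:.2f}, t = {t_stat:.2f}")
```

In the actual study the comparison would be run separately by subgroup (e.g., African American students vs. others), which is how the differential voucher effect was detected.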

Howell et al. (2002) also commented that the reasons vouchers improved the academic performance of African American students and not other students were unclear. Is this difference related to a student's view of his or her academic skills, peer related, due to motivation differences or strong parental support for education, the student's interactions with school staff, or other factors? The Howell et al. comment suggests that a sequential explanatory research design would allow these authors to search for explanations of the finding that vouchers improved the academic performance of African American students but not other students. For example, collecting qualitative data at the end of the study through interviews of a purposively sampled group of parents of students who had and had not received a voucher could be helpful in enhancing and complementing the quantitative findings.

A second mixed methods approach, the sequential exploratory design, is essentially the reverse of the sequential explanatory design, with quantitative data used to enhance and complement qualitative results. This approach is especially useful when the researcher's interest is in enhancing generalizability, and it may or may not be guided by a theoretical perspective. Creswell (2003) pointed out that instrument construction is an example of this approach, in that a draft of an instrument (survey, test) is piloted with a small number of cases who often provide important qualitative feedback about their experience with the instrument, followed, after appropriate modifications of the instrument, by using the instrument to collect quantitative data. The quantitative results are then used to enhance, complement, and possibly extend the earlier pilot results. The strengths and weaknesses of this approach are similar to those of the sequential explanatory design.

As an example, consider the mixed methods study of Norman (2008). Norman was interested in uncovering the perspectives of college academic counselors when advising newly admitted students on the appropriate mathematics course with which to begin their college mathematics career, and integrating this information with quantitative information reflecting the high school mathematics curriculum students completed and their scores on a college mathematics placement test. Norman's work was motivated by anecdotal evidence that students who completed a standards-based mathematics curriculum in high school were more likely to be advised to begin college with a less difficult mathematics course than comparable students who completed a traditional high school mathematics curriculum. Standards-based curricula in high school focus on a variety of mathematics topics every school year in a way that emphasizes problem solving and small-group work, and de-emphasizes algorithmic manipulation. Traditional curricula typically focus on algebra, algorithms, and repetition and depend heavily on the teacher for student learning (Schoenfeld, 2004).

Differential advising of students as a function of the high school mathematics curriculum they completed implies that some students are advised to begin their college mathematics with a course they should not take (e.g., precalculus rather than Calculus I). The implication of this practice is that students and colleges may be spending time, money, and effort on unneeded courses.

Norman (2008) adapted a theoretical decision-making model (Miller & Miller, 2005) to the process of how academic counselors make decisions, and used this model and a case study approach (Creswell, 1998). Norman purposively sampled 6 counselors and 24 students and collected data via in-depth interviews with the counselors and students that included their observations of the advising process, and high school and college records of students' mathematics course taking and grades. Norman reported that the counselors generally misinterpreted the mathematics portions of high school transcripts, which had important implications for advising a student on which college mathematics course to begin with. For example, a student whose transcript indicated that his or her highest completed high school mathematics course was "Integrated Mathematics IV," a standards-based course, was generally advised to start with a precalculus mathematics course, even though Integrated Mathematics IV is a precalculus course and the student should have been advised to enroll in Calculus I. Norman also reported that counselors who looked at transcripts of students who completed a traditional high school mathematics curriculum, in which the highest mathematics course completed was listed as precalculus, were more likely to recommend that a student enroll in Calculus I. Norman suggested that counselor views toward standards-based curricula may be related to working in a mathematics department, because mathematics departments have generally been quite critical of standards-based curricula (Roitman, 1999; Schoenfeld, 2004).

Norman (2008) also collected and analyzed quantitative data for a sample of more than 1,000 college freshmen that included information on the high school mathematics curriculum they completed; their score on a college mathematics placement exam; and the difficulty of their first college mathematics course, which was captured using a 4-point Likert variable (1 = a course that should have been completed in high school, which is sometimes referred to as a developmental course; 4 = a course whose difficulty exceeded that of Calculus I). The results of these analyses suggested that the curriculum a student completed was related to his or her mathematics placement score and the difficulty level of the student's first college mathematics course. In particular, students who completed a standards-based high school mathematics curriculum were more likely to enroll in a less difficult college mathematics course compared to students who completed a traditional curriculum.
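The kind of tabulation this analysis implies can be sketched as follows. This is a hedged illustration, not Norman's actual analysis: the student records and the mean-difficulty summary are invented, and a real analysis of 1,000+ students would use regression or ordinal models rather than raw group means.

```python
# Hypothetical records: (high school curriculum, difficulty of first college
# mathematics course on Norman's 1-4 scale). All values are fabricated.
from collections import defaultdict

records = [
    ("standards-based", 2), ("standards-based", 2), ("standards-based", 3),
    ("standards-based", 1), ("traditional", 3), ("traditional", 4),
    ("traditional", 3), ("traditional", 2),
]

by_curriculum = defaultdict(list)
for curriculum, difficulty in records:
    by_curriculum[curriculum].append(difficulty)

# Mean difficulty of the first course, by curriculum type
mean_difficulty = {c: sum(d) / len(d) for c, d in by_curriculum.items()}
print(mean_difficulty)
```

A lower mean for the standards-based group in output like this would mirror the reported pattern of those students enrolling in less difficult first courses.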

Norman's (2008) study reflects several exemplary features of mixed methods research, including using a case study approach that focused on discovering and understanding the experiences, perspectives, and thoughts of counselors and students, and the presence of quantitative data highlighting the possible consequences of counselors' misunderstanding the information on a student's high school transcript. While the exact timing of the quantitative and qualitative data collection in Norman's study was unclear, it appears the data were collected relatively close in time, suggesting that the results of one data collection did not influence the other data collection. However, the main question focused on counselors, and collecting and analyzing data for counselors first, intentionally using this information to guide the collection and analysis of quantitative data for students, and integrating the two sets of findings suggest that a sequential exploratory design would have been of value. For example, information obtained from the qualitative data about counselors' misunderstanding of high school mathematics course taking could have been used in developing questions for a student survey, such as on whether students thought their counselor misunderstood their high school course taking.

A third mixed methods approach is the sequential transformative design, in which either qualitative or quantitative data may be collected first. Here, the theoretical perspective underlying the methodology is critical to the conduct of the study, and the chosen methods should serve the theoretical perspective. Once again, qualitative and quantitative data are analyzed separately, and the findings are integrated during the interpretation phase. This approach is often used to ensure that the views and perspectives of a diverse range of participants are represented or when a deeper understanding of a process that is changing as a result of being studied is sought. Its strengths and weaknesses are similar to those of the sequential explanatory design.

For example, the decision-making model used by Norman (2008) suggested that collecting data for students who are affected (sometimes adversely) by the recommendations provided by counselors is important in evaluating and improving the advising process. Norman commented that most students follow the recommendation of which mathematics course to take, even if they disagree with it. Here, the study could focus on discovering and understanding students' experiences with the advising process and its implications for their college experience. A case study approach, using a purposively chosen sample of students who began their college mathematics course taking at different difficulty levels, would be appropriate. Information obtained from interviews and student academic records could be used to inform the construction of a survey to be sent to a representative sample of students. The survey results could be used to improve decision making by counselors and to enhance generalizability.

A fourth mixed methods approach is the concurrent triangulation design, which is used when the focus is on confirming, cross-validating, or corroborating findings from a single study. Qualitative and quantitative data are collected concurrently, such that weaknesses of one kind of data are ideally offset by strengths of the other kind. Typically, equal weight is given to the two kinds of data in mixing the findings, although one kind of data can be weighted more heavily. The qualitative and quantitative data are analyzed separately, and mixing takes place when the findings are interpreted. Important strengths of this approach are the ability to maximize the information provided by a single study, for example, when interest is in cross-validation, and a shorter data collection period compared to the sequential data collection approaches. Important weaknesses include the additional complexity associated with collecting qualitative and quantitative data at the same time and the expertise needed to usefully apply both methods. Discrepancies between the qualitative and quantitative findings may also be difficult to reconcile.

In the Howell et al. (2002) study, the primary finding was that the achievement of African American students who received a voucher to attend private school was on average higher than that of African American students who did not receive a voucher, and that this difference did not emerge for other student groups. Adopting a concurrent triangulation design could provide an explanation for these findings by collecting qualitative data in the form of interviews with parents of students who did and did not receive a voucher, and quantitative data in the form of student test scores and background information. This would offer an opportunity to corroborate findings from this study with respect to the improved achievement of African American students but not other students. For example, a plausible outcome of concurrent data collection is that the qualitative data suggest that parents of African American students appeared to be more committed to, and enthusiastic about, their students' education in general and the voucher program in particular, than parents of other students, and that this enthusiasm persisted throughout the school year. (Achievement tests used in this study were given at the end of the school year.) In this instance, the qualitative and quantitative information would provide corroborating evidence that the improved achievement of African American students could be attributed in part to receiving a voucher and enrolling in a private school, and in part to the support, encouragement, and motivation of students' parents.

The fifth approach is the concurrent nested design, in which qualitative and quantitative data are collected concurrently and analyzed together during the analysis phase. Greater weight is given to one kind of data, in the sense that one kind of data is typically embedded in the other. However, there may or may not be a guiding theoretical perspective. A popular application of this approach is with multilevel structures (Tashakkori & Teddlie, 2003), in which different levels or units of an organization are studied. Strengths of this approach include the shorter data collection period and the multiple perspectives embedded in the data, whereas weaknesses include the level of expertise needed to execute the study successfully, especially in mixing the qualitative and quantitative data within the data analysis, and difficulties in reconciling conflicting results from the qualitative and quantitative analyses.

In this design, qualitative and quantitative data are mixed in the analysis phase, a process that can take many different forms (see, e.g., Bazeley, 2009; Tashakkori & Teddlie, 2003). Caracelli and Greene (1993) described four strategies to mix qualitative and quantitative data in the analysis. One is data transformation, in which qualitative data are transformed to quantitative data or quantitative data are transformed into narrative, and the resulting data are analyzed. In Norman's (2008) study, this could involve transforming (i.e., rescaling) qualitative data in the form of interviews, field notes, and so on to a quantitative form that captures key themes in these data. Typically, the transformed qualitative data exhibit a nominal or ordinal scale of measurement.
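In code, data transformation of this kind amounts to mapping text to nominal or ordinal codes. The sketch below is purely illustrative: the interview excerpts, the theme (counselor skepticism toward standards-based curricula), the keyword rules, and the function name `code_skepticism` are all invented; real qualitative coding would rest on a codebook and human judgment, not keyword matching.

```python
# Illustrative data transformation: qualitative excerpts coded to an ordinal
# variable. Excerpts and coding rules are fabricated for this example.
def code_skepticism(excerpt: str) -> int:
    """Map an excerpt to an ordinal code: 0 = none, 1 = mild, 2 = strong."""
    text = excerpt.lower()
    if "never" in text or "waste" in text:
        return 2
    if "unsure" in text or "doubt" in text:
        return 1
    return 0

excerpts = [
    "I doubt Integrated Mathematics IV prepares students for calculus.",
    "These courses work fine in my experience.",
    "Standards-based classes are a waste; I never advise Calculus I from them.",
]
codes = [code_skepticism(e) for e in excerpts]
print(codes)
```

The resulting ordinal variable could then enter a quantitative analysis alongside placement scores and course difficulty.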

A second data-mixing strategy described by Caracelli and Greene (1993) is typology development, in which the analysis of one kind of data produces a typology or set of categories that is used as a framework in analyzing the other kind of data. In Norman's (2008) study, analyses of the qualitative data could produce themes that allow a variable with nominally scaled categories to be developed, in which the categories provide an explanation of why participants became counselors. This variable could then be used in the quantitative analysis.

A third data-mixing strategy is extreme case analysis, in which extreme cases identified with one kind of data are examined with the other kind, with the goal of explaining why these cases are extreme. For example, multilevel analyses of quantitative data in Norman's (2008) study may suggest that some counselors are, with respect to the sample of counselors, statistical outliers (e.g., students linked to these counselors disproportionately began their college mathematics study with courses that should have been completed in high school). Qualitative data could be used to try to explain why these counselors appeared to disproportionately steer students to developmental college mathematics courses.
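A simple screen for such extreme cases, much cruder than the multilevel analysis the text mentions, is to flag counselors whose rate of developmental placements lies far from the sample mean. The counselor IDs and rates below are hypothetical.

```python
# Flag counselors whose developmental-placement rate is extreme relative to
# the sample (|z| > 1.5). All IDs and rates are invented for illustration.
from statistics import mean, stdev

developmental_rate = {        # share of each counselor's students placed
    "C1": 0.10, "C2": 0.12,   # into developmental mathematics courses
    "C3": 0.09, "C4": 0.11,
    "C5": 0.45,               # a possible extreme case
}

rates = list(developmental_rate.values())
mu, sigma = mean(rates), stdev(rates)
outliers = [c for c, r in developmental_rate.items() if abs(r - mu) / sigma > 1.5]
print(outliers)
```

The flagged counselors would then become the purposive sample for qualitative follow-up interviews.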

The fourth data-mixing strategy described by Caracelli and Greene (1993) is data consolidation/merging, in which a careful review of both kinds of data leads to the creation of new variables or data sets expressed in a qualitative or quantitative metric. The merged data are then used in additional analyses. In Norman's (2008) study, a review of the qualitative and quantitative data may suggest new variables. For example, a review of the counselor and student data may suggest constructing a variable capturing the extent to which students assert themselves in the advising process. Examples of data mixing appear in Caracelli and Greene (1993); Sandelowski, Voils, and Knafl (2009); and Tashakkori and Teddlie (2003).
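Mechanically, consolidation/merging means joining records from the two strands on a shared key and deriving new variables from the combined data. The sketch below is a hypothetical illustration: the counselor codes, student records, and the derived variable name `counselor_skepticism` are all invented.

```python
# Merge counselor-level qualitative codes with student-level quantitative
# records to create a new analysis variable. All values are fabricated.
counselor_codes = {"C1": 2, "C2": 0}   # coded skepticism (0-2 ordinal scale)

students = [
    {"id": "S1", "counselor": "C1", "placement": 55},
    {"id": "S2", "counselor": "C2", "placement": 60},
]

# Attach each student's counselor code as a new variable on the student record.
merged = [
    {**s, "counselor_skepticism": counselor_codes[s["counselor"]]}
    for s in students
]
print(merged)
```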

Norman's (2008) study seems ready-made for a concurrent nested design, given the inclusion of both counselors and students. A key outcome in this study was the difficulty level of a student's first college mathematics course. Quantitative evidence of a relationship between a student's high school mathematics curriculum (standards-based, traditional) and the difficulty level of his or her first college mathematics course could be mixed with qualitative data obtained concurrently from information provided by counselors. For example, qualitative data could be obtained using a narrative approach, in which counselors talked about the events in their lives that led them to become counselors and to continue to advise students. Transforming the qualitative counselor data such that the transformed variables reflect important themes allows this information to be directly included (mixed) with student variables in a quantitative multilevel data analysis, in which students are treated as nested within counselors (Raudenbush & Bryk, 2002). These analyses could explore the impact of nesting and the impact of counselor variables on student data and provide a powerful explanation of how and potentially why students enrolled in a particular college mathematics course.

The sixth mixed methods approach is the concurrent transformative design. As with the sequential transformative design, there is a clearly defined theoretical perspective that guides the methodology. In this approach, qualitative and quantitative data are collected concurrently and can be weighted equally or unequally during the integration of findings. Qualitative and quantitative data are typically mixed during the analysis phase. Strengths include a shorter data collection period, whereas weaknesses include the need to transform data so that it can be mixed in the analysis phase and difficulties in reconciling conflicting results using qualitative and quantitative data.
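The nesting idea behind the multilevel analysis described above (students within counselors) can be sketched as a variance decomposition: how much of the outcome variance lies between counselors versus within them. This pure-Python sketch uses invented course-difficulty data and is not a fitted multilevel model; actual analyses would use software such as HLM (Raudenbush & Bryk, 2002) or statsmodels' MixedLM.

```python
# Decompose variance in course difficulty (1-4 scale) into between-counselor
# and within-counselor parts. Data are fabricated; groups are equal-sized,
# which keeps the unweighted means below honest.
from statistics import mean

nested = {                 # each counselor's students' first-course difficulty
    "C1": [1, 2, 1],
    "C2": [3, 4, 3],
}

grand = mean(d for group in nested.values() for d in group)
between = mean((mean(g) - grand) ** 2 for g in nested.values())
within = mean((d - mean(g)) ** 2 for g in nested.values() for d in g)
print(f"between = {between:.3f}, within = {within:.3f}")
```

A large between-counselor share, as in this toy example, is exactly the signal that counselor-level variables (such as transformed narrative themes) belong in the model.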

As an example of applying a concurrent transformative design, Norman (2008) reported evidence that some students were unhappy with the recommendations they received but felt powerless to change the process. Suppose that the goal of the study was to develop a theory explaining this sense of powerlessness in ways that could improve the advising process. Norman's adaptation of the Miller and Miller (2005) decision-making model includes a component that calls for students who are concerned about their recommendation to be heard. This could take the form of collecting qualitative information from students reflecting their experiences with the advising process and their view of its impact on their mathematics course taking, and simultaneously collecting qualitative data for a group of counselors. Developing a theory explaining the students' feelings of powerlessness would require that student and counselor responses be carefully examined and categorized to suggest themes to guide and inform the construction of the theory. This information could then be used to inform the construction of a survey that would provide quantitative information for a representative sample of newly admitted college students to enhance generalizability. The strengths and weaknesses of this approach are similar to those of the other concurrent approaches.

Other examples of mixed methods studies include Buck, Cook, Quigley, Eastwood, and Lucas (2009); Day, Sammons, and Gu (2008); Onwuegbuzie, Bustamante, and Nelson (2010); and Onwuegbuzie et al. (2007). Good descriptions of mixed methods can be found in Bazeley (2009), Creswell (2003), Creswell and Plano Clark (2007), Greene (2007), Reichardt and Rallis (1994), and Tashakkori and Teddlie (2003). The relative newness of mixed methods means that journals that concentrate on publishing mixed methods methodology papers and mixed methods studies in education are evolving. They currently include the Journal of Mixed Methods Research, International Journal of Multiple Research Approaches, Qualitative Research Journal, American Educational Research Journal, Educational Researcher, and Educational Evaluation and Policy Analysis.

In sum, the mixed methods approach offers a collection of flexible research designs that seem well suited to support rigorous examinations of promising ideas. The six designs of Creswell (2003) draw on the strengths of qualitative and quantitative methods to enhance inquiry in ways unlikely to occur with singular applications of these methods. Still, it is important to emphasize that mixed methods design continues to face a number of significant challenges (Bryman, 2007; Creswell, 2009; Lawrenz & Huffman, 2002; Tashakkori, 2009).

One important challenge is resolving outstanding disagreements over appropriate paradigms. The mixed methods literature contains a number of "mission accomplished" statements implying that important philosophical differences have been resolved, primarily by adopting a pragmatic stance (e.g., Carey, 1993; Creswell & Plano Clark, 2007; Haase & Meyers, 1988; Tashakkori & Teddlie, 2003). Yet it is clear that important differences remain (Sandelowski, 2001; Yin, 2006). A related challenge is achieving better agreement on what characterizes a mixed methods study and its components. For example, what are the components of a mixed methods study (research questions, design, data collection, data analyses, interpretation), when should the components be mixed, and how (and how much) should they be mixed to justify the label mixed methods (see, e.g., Bazeley, 2009; Bryman, 2007; Johnson & Onwuegbuzie, 2004; Morse, 2010; Tashakkori & Teddlie, 2003; Wheeldon, 2010)? Perhaps a single comprehensive framework outlining the characteristics and components of mixed methods studies will emerge; on the other hand, it is entirely possible that a crazy quilt pattern of frameworks will continue to define mixed methods studies.

THE STUDY OF PROMISING IDEAS AND RESEARCH DESIGN

Earlier descriptions of qualitative, quantitative, and mixed methods suggest three related ways for researchers to enhance the ability of research designs to better support the study of promising ideas. These represent both suggestions and challenges.

First, and most important, is for researchers to consciously shed narrow definitions of research design and to embrace more flexible designs that support rather than constrain the study of promising ideas. This may mean using a familiar research design in an unfamiliar way, such as substantially modifying elements of a design, for example, the way that data are collected, analyzed, or interpreted in a qualitative study, or changing features of the intervention being studied or the hypotheses being tested in the middle of a quantitative study because of preliminary evidence that a different research direction would be more productive. It may mean using a well-known research design to support exploratory rather than confirmatory work, adopting key elements of research designs used in other fields, such as dosage-escalation studies in medicine (Whitehead, Patterson, Webber, Francis, & Zhou, 2001), or abandoning the singular use of qualitative or quantitative methods in a study in favor of a mixed methods approach.

Part of the challenge of embracing more flexibility in research designs is technical, to the extent that it requires additional expertise on the part of a researcher or suggests the need to develop new research designs. For example, a researcher may plan to conduct a quantitative study to compare learning outcomes of a treatment group whose members receive a promising intervention against those of a control group using longitudinal data. However, in the service of developing a better understanding of the intervention, the researcher may decide to add a preliminary component to the study in which single-subject methods (Kratochwill & Levin, 1992) will be used to examine the learning trajectories of a small number of purposively sampled participants. This would require expertise in single-subject methodology that a researcher may or may not possess. Similar examples can be constructed for qualitative and mixed methods studies.

Still, the greater challenge for many researchers is likely to be modifying personal norms defining scholarship and the adequacy of a contribution to the field. This may require a researcher to step outside the methodological boundaries he or she has been trained to honor and, in some cases, enforce, and to embrace research designs the researcher may have been taught, and have taught others, are inferior. Embracing other methodologies in ways that cross disciplinary, funding, and publication lines, for example, serving as a member of a multidisciplinary research team that includes (and values) individuals with expertise in qualitative or quantitative methods, will require a certain amount of risk taking but will help to move the field toward more flexible designs.

Second, researchers can work (or continue to work) to modify professional norms in their roles as authors, manuscript reviewers, journal editors, panel members evaluating grant proposals, and so forth, to allow a greater range of studies and findings to be supported. This will require an artful balancing between encouraging risk taking (e.g., studies employing innovative interventions or emerging methods of qualitative inquiry) and the need to satisfy standards of research design and reporting that help to ensure the integrity of study results. Interestingly, there is growing evidence of support for doing just this. Some of this evidence appears in studies published in journals, such as the American Educational Research Journal, Educational Evaluation and Policy Analysis, and the Journal of Mixed Methods Research, that rely on multidisciplinary research teams. Other evidence is in the form of new funding programs at the U.S. Department of Education (primarily supporting quantitative research), such as Race to the Top and Investing in Innovation, which focus on innovative and ambitious approaches to educational change that draw on multiple disciplines.

An especially important challenge for modifying professional norms lies in the practices of educational research journals that typically publish quantitative studies. The filters enforced by the editorial process of these journals often reinforce existing and narrow professional norms. Evidence of their success can be found in many quantitative meta-analyses. In a typical quantitative meta-analysis, a sample of published and unpublished studies of a common phenomenon is collected, examined, and combined, for example, studies that have introduced interventions (vouchers, increased teacher professional development) designed to reduce the achievement gap between African American and White students. Variables capturing key features of each study, such as sample size, percentage of the sample that was African American, and characteristics of the interventions are developed, and each study is coded using these variables (Cooper, Hedges, & Valentine, 2009). Quantitative analyses of the resulting data typically provide evidence of the impact of editorial practices.
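Once studies are coded, combining them typically reduces to a weighted average of effect sizes. The sketch below shows one common form, an inverse-variance (fixed-effect) pooled estimate, with fabricated study names, effects, and variances; real meta-analyses (Cooper, Hedges, & Valentine, 2009) involve many more steps, such as heterogeneity tests and random-effects models.

```python
# Inverse-variance pooling of coded study effect sizes. All values invented.
studies = [
    {"name": "Study A", "effect": 0.30, "variance": 0.01},
    {"name": "Study B", "effect": 0.10, "variance": 0.04},
    {"name": "Study C", "effect": 0.20, "variance": 0.02},
]

# More precise studies (smaller variance) get larger weights.
weights = [1 / s["variance"] for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
print(f"pooled effect = {pooled:.3f}")
```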

For example, limiting the analyses to published studies will typically reveal that all (or virtually all) reported at least one statistically significant result (see, e.g., Borman, Hewes, Overman, & Brown, 2003; Fukkink & de Glopper, 1998; Purdie, Hattie, & Carroll, 2002), a reflection of the well-documented practice of publishing studies with statistically significant findings (Bozarth & Roberts, 1972; Hewitt, Mitchell, & Torgerson, 2008; Leech et al., 2010; Rosenthal, 1979). Other analyses will often reveal that only a handful of research designs were used in the pool of sampled studies.

The absence of published studies that failed to find evidence of a statistically significant effect and the prevalence of studies employing similar research designs send a powerful message to researchers: Risk taking is risky, and small successes are valued more than large failures. This reinforces a "cookie cutter" approach to research designs and provides further evidence of the need to strengthen the connection between educational inquiry and the study of promising ideas.

Third, researchers can consciously use techniques and procedures that enhance the depth and breadth of a study's findings. Advances in data collection, data analysis, report writing, and standards of inquiry and verification have been rapid, and in some cases stunning, and employing the techniques and procedures that reflect these advances should enhance the role of research design in supporting the study of promising ideas. For example, advances in computer hardware and software have significantly expanded the ways that data can be collected and analyzed, especially non-numerical data such as words and gestures in qualitative research (Bazeley, 2009; Creswell & Plano Clark, 2007), and the accessibility of techniques to control selection bias and take missing data into account in quantitative research (What Works Clearinghouse, 2008). Advances in standards of quality and verification enhance the transparency and integrity of results (see, e.g., Creswell & Plano Clark, 2007; Shadish et al., 2002).

CONCLUSION

The pursuit of promising ideas in educational research sounds noble, and it is. In this spirit, the task of this chapter was not to reexamine or reignite the conflict between qualitative and quantitative methods, nor to assess the peacemaking capacity of mixed methods, but rather to examine the role of research design in supporting the rigorous study of ideas that are believed to be worth studying.

An examination of qualitative and quantitative methods in education suggests that singular applications of these methodologies will continue to play an important role in research studying new ideas. Still, there is good reason to believe that mixed methods do indeed represent, as Teddlie and Tashakkori (2003) argued, a "third methodological movement" (p. 5), which is only now beginning to mature "as a well-established methodological alternative with agreed-on foundations, design, and practices" (p. 287).

In arguing for the value of mixed methods, Creswell and Plano Clark (2007) wondered what would happen if

quantitative researchers paid more attention to the incredible range of hypotheses that qualitative researchers have generated for them? And what if qualitative researchers spent more time exploring the range of phenomena that quantitative researchers have sought to define and test? (p. 59)

These are important and intriguing questions, and it seems clear that flexible research designs that aggressively support the study of promising ideas are needed to help answer them. Mixed methods designs seem especially well suited for this task. Whether mixed methods will someday enjoy an equal partnership with qualitative and quantitative research in education or supplant these methodologies is unclear and in many ways unimportant. What is important is expanding the catalog of research designs available to support rigorous inquiry. Still, change that leads to more researchers embracing more flexible research designs more conspicuously depends on modifying norms in ways that broaden the range of studies and findings supported in education. That change is underway is undeniable; what is less clear is the scope and pace of the change.

In sum, mixed methods research offers an especially promising path toward using research design in ways that support rigorous examinations of promising educational ideas. The time to fully embrace mixed methods designs has come.

REFERENCES

Alise, M. A., & Teddlie, C. (2010). A continuation of the paradigm wars? Prevalence rates of methodological approaches across the social/behavioral sciences. Journal of Mixed Methods Research, 4, 103–126.

Balnaves, M., & Caputi, P. (2001). Introduction to quantitative research methods: An investigative approach. Thousand Oaks, CA: Sage.

Bazeley, P. (2009). Mixed methods data analysis. In E. Halcomb & S. Andrew (Eds.), Mixed methods research for nursing and the health sciences (pp. 84–118). London: Wiley-Blackwell.

Berkenkotter, C. (1991). Paradigm debates, turf wars, and the conduct of sociocognitive inquiry in composition. College Composition and Communication, 42, 151–169.

Bogdan, R. C., & Biklen, S. K. (2003). Qualitative research for education: An introduction to theories and methods (4th ed., pp. 7–42). Boston: Allyn & Bacon.

Borman, G. D., Hewes, G. M., Overman, L. T., & Brown, S. (2003). Comprehensive school reform and achievement: A meta-analysis. Review of Educational Research, 73, 125–230.

Bozarth, J. D., & Roberts, R. R. (1972). Signifying significant significance. American Psychologist, 27(8), 774–775.

Brown, S. (2009). Learning to read: Learning disabled post-secondary students talk back to special education. International Journal of Qualitative Studies in Education, 22, 85–98.

Bryman, A. (2004). Social research methods (2nd ed.). Oxford, UK: Oxford University Press.

Bryman, A. (2007). Barriers to integrating quantitative and qualitative research. Journal of Mixed Methods Research, 1, 8–22.

Bryman, A., & Bell, E. (2007). Business research methods. New York: Oxford University Press.

Buck, G., Cook, K., Quigley, C., Eastwood, J., & Lucas, Y. (2009). Profiles of urban, low SES, African American girls’ attitudes toward science: A sequential explanatory mixed methods study. Journal of Mixed Methods Research, 3, 386–410.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.

Caracelli, V. J., & Greene, J. C. (1993). Data analysis strategies for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15, 195–207.

Caracelli, V. J., & Greene, J. C. (1997). Crafting mixed-method evaluation designs. In J. Greene & V. Caracelli (Eds.), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. New directions for evaluation (No. 74, pp. 19–32). San Francisco: Jossey-Bass.

Carey, J. W. (1993). Linking qualitative and quantitative methods: Integrating cultural factors into public health. Qualitative Health Research, 3, 298–318.

Cheek, J. (2005). The practice and politics of funded qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), The SAGE handbook of qualitative research (3rd ed., pp. 387–409).

Chubbuck, S. M., & Zembylas, M. (2008). The emotional ambivalence of socially just teaching: A case study of a novice urban schoolteacher. American Educational Research Journal, 45, 274–318.

Cochran, W. G., & Cox, G. M. (1950). Experimental designs. New York: Wiley.

Cooper, H., Hedges, L. V., & Valentine, J. C. (2009). The handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage.

Creswell, J. W. (2003). Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Creswell, J. W. (2009). Mapping the field of mixed methods research. Journal of Mixed Methods Research, 3, 95–108.

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process. London: Sage.

Day, C., Sammons, P., & Gu, Q. (2008). Combining qualitative and quantitative methodologies in research on teachers’ lives, work, and effectiveness: From integration to synergy. Educational Researcher, 37, 330–342.

Denzin, N. K. (2006). The elephant in the living room: Or extending the conversation about the politics of evidence. Qualitative Research, 9, 139–160.

Denzin, N. K., & Lincoln, Y. S. (2005). Introduction. In N. K. Denzin & Y. S. Lincoln (Eds.), The SAGE handbook of qualitative research (3rd ed., pp. 1–29). Thousand Oaks, CA: Sage.

Freeman, M., de Marrais, K., Preissle, J., Roulston, K., & St. Pierre, E. A. (2007). Standards of evidence in qualitative research: An incitement to discourse. Educational Researcher, 36, 25–32.

Fukkink, R. G., & de Glopper, K. (1998). Effects of instruction in deriving word meaning from context: A meta-analysis. Review of Educational Research, 68, 450–469.

Gage, N. L. (1989). The paradigm wars and their aftermath: A “historical” sketch of research on teaching since 1989. Educational Researcher, 18, 4–10.

Gaines, E. (2005). Interpreting India, identity, and media from the field: Exploring the communicative nature of the exotic other. Qualitative Inquiry, 11, 518–534.

Garner, C., & Raudenbush, S. J. (1991). Neighborhood effects on educational attainment: A multilevel analysis of the influence of pupil ability, family, school, and neighborhood. Sociology of Education, 64, 251–262.

Gay, L. R., Mills, G., & Airasian, P. W. (2005). Educational research: Competencies for analysis and applications (8th ed.). Upper Saddle River, NJ: Merrill.

Giddings, L. S. (2006). Mixed-methods research: Positivism dressed in drag? Journal of Research in Nursing, 11(3), 195–203.

Greene, J. C. (2007). Mixed methods in social inquiry. New York: Wiley.

Greene, J. C., & Caracelli, V. J. (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. New directions for evaluation (No. 74, pp. 19–32). San Francisco: Jossey-Bass.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation design. Educational Evaluation and Policy Analysis, 11, 255–274.

Greenstein, T. N. (2006). Methods of family research (2nd ed.). Thousand Oaks, CA: Sage.

Haase, J. E., & Myers, S. T. (1988). Reconciling paradigm assumptions of qualitative and quantitative research. Western Journal of Nursing Research, 10, 128–137.

Harry, B., Sturges, K. M., & Klingner, J. K. (2005). Mapping the process: An exemplar of process and challenge in grounded theory analysis. Educational Researcher, 34, 3–13.

Hewitt, C., Mitchell, N., & Torgerson, D. (2008). Listen to the data when results are not significant. British Medical Journal, 336, 23–25.

Hiatt, J. F. (1986). Spirituality, medicine, and healing. Southern Medical Journal, 79, 736–743.

Howe, K. R. (2009). Positivist dogmas, rhetoric, and the education science question. Educational Researcher, 38, 428–440.

Howell, W. G., Wolf, P. J., Campbell, D. E., & Peterson, P. E. (2002). School vouchers and academic performance: Results from three randomized field trials. Journal of Policy Analysis and Management, 21, 191–217.

Jacob, B. A., & Lefgren, L. (2004). Remedial education and student achievement: A regression discontinuity analysis. Review of Economics and Statistics, 86, 226–244.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33, 14–26.

Johnson, R. B., & Turner, L. A. (2003). Data collection strategies in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 297–319). Thousand Oaks, CA: Sage.

Kempthorne, O. (1952). Design and analysis of experiments. New York: Wiley.

Kerlinger, F. (1964). Foundations of behavioral research. New York: Holt.

Kratochwill, T. R., & Levin, J. R. (1992). Single-case research design and analysis: New directions for psychology and education. Hillsdale, NJ: Lawrence Erlbaum.


Lawrenz, F., & Huffman, D. (2002). The archipelago approach to mixed method evaluation. American Journal of Evaluation, 23, 331–338.

Leech, N. L., Morgan, G. A., Wang, J., & Gliner, J. A. (2010). Statistical significance and evidence-based approach: Implications of non-significant results. Journal of Non-significant Results in Education, 1, 1–12.

Leech, N. L., & Onwuegbuzie, A. J. (2007). A typology of mixed methods research designs. Quality & Quantity: International Journal of Methodology, 43, 265–275.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Maruna, S. (2010). Mixed method research in criminology: Why not go both ways? In A. R. Piquero & D. Weisburd (Eds.), Handbook of quantitative criminology (pp. 123–140). New York: Springer.

Maxwell, J. A. (2004). Causal explanation, qualitative research, and scientific inquiry in education. Educational Researcher, 33, 3–11.

Mertens, D. M. (2003). Mixed methods and the politics of human research: The transformative-emancipatory perspective. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 135–164). Thousand Oaks, CA: Sage.

Miles, M. B., & Huberman, M. A. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Miller, M., & Miller, T. (2005). Theoretical application of Holland’s theory to individual decision-making styles: Implications for career counselors. Journal of Employment Counseling, 41, 20–28.

Morgan, D. L. (1998). Practical strategies for combining qualitative and quantitative methods: Applications to health research. Qualitative Health Research, 3, 362–376.

Morgan, D. L. (2007). Paradigms lost and pragmatism regained: Methodological implications of combining qualitative and quantitative methods. Journal of Mixed Methods Research, 1, 48–76.

Morse, J. M. (2006). The politics of evidence. Qualitative Health Research, 16, 395–404.

Morse, J. M. (2010). Simultaneous and sequential qualitative mixed method designs. Qualitative Inquiry, 16, 483–491.

Moss, P. A., Phillips, D. C., Erickson, F. D., Floden, R. E., Lather, P. A., & Schneider, B. L. (2009). Learning from our differences: A dialogue across perspectives on quality in education research. Educational Researcher, 38, 501–517.

National Research Council. (2002). Scientific research in education (R. J. Shavelson & L. Towne, Eds.). Committee on Scientific Principles for Education Research. Washington, DC: National Academies Press.

National Research Council. (2005). Advancing scientific research in education (L. Towne, L. L. Wise, & T. M. Winters, Eds.). Committee on Research in Education. Washington, DC: National Academies Press.

Newman, I., Ridenour, C. S., Newman, C., & DeMarco, G. M. P. (2003). A typology of research purposes and its relationship to mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 167–188). Thousand Oaks, CA: Sage.

Norman, K. (2008). High school mathematics curriculum and the process and accuracy of initial mathematics placement for students who are admitted into one of the STEM programs at a research institution. Unpublished dissertation, University of Minnesota, Twin Cities.

O’Cathain, A. (2009). Mixed methods research in the health sciences: A quiet revolution. Journal of Mixed Methods Research, 3, 3–6.

O’Cathain, A., Murphy, E., & Nicoll, J. (2007). Why, and how, mixed methods research is undertaken in health services research in England: A mixed methods study. BMC Health Services Research, 7. Available at http://www.biomedcentral.com/1472-6963/7/85

Onwuegbuzie, A. J., Bustamante, R. M., & Nelson, J. A. (2010). Mixed research as a tool for developing quantitative instruments. Journal of Mixed Methods Research, 4, 56–78.

Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M. T., Filer, J. D., Wiedmaier, C. D., & Moore, C. W. (2007). Students’ perceptions of characteristics of effective college teachers: A validity study of a teaching evaluation form using a mixed-methods analysis. American Educational Research Journal, 44, 113–160.

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.

Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach. Hillsdale, NJ: Lawrence Erlbaum.

Pegues, H. (2007). Of paradigm wars: Constructivism, objectivism, and postmodern stratagem. Educational Forum, 71, 316–330.

Plano Clark, V. L. (2010). The adoption and practice of mixed methods: U.S. trends in federally funded health-related research. Qualitative Inquiry, 16, 428–440.

Purdie, N., Hattie, J., & Carroll, A. (2002). A review of the research on interventions for attention deficit hyperactivity disorder: What works best? Review of Educational Research, 72, 61–99.

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Newbury Park, CA: Sage.

Reichardt, C. S., & Rallis, S. F. (Eds.). (1994). The qualitative–quantitative debate: New perspectives. New Directions for Program Evaluation (No. 61). San Francisco: Jossey-Bass.

Roitman, J. (1999). Beyond the math wars. Contemporary Issues in Mathematics Education, 36, 123–134.

Rosenthal, R. (1979). The “file drawer problem” and tolerance for null results. Psychological Bulletin, 86, 638–641.

Sale, J. E. M., Lohfeld, L. H., & Brazil, K. (2002). Revisiting the quantitative–qualitative debate: Implications for mixed methods research. Quality & Quantity, 36, 43–53.

Sandelowski, M. (2001). Real qualitative researchers do not count: The use of numbers in qualitative research. Research in Nursing & Health, 24, 230–240.

Sandelowski, M., Voils, C. I., & Knafl, G. (2009). On quantitizing. Journal of Mixed Methods Research, 3, 208–222.

Schneider, B., Carnoy, M., Kilpatrick, J., Schmidt, W. H., & Shavelson, R. J. (2007). Estimating causal effects using experimental and observational designs. Washington, DC: American Educational Research Association.

Schoenfeld, A. H. (2004). The math wars. Educational Policy, 18, 253–286.

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference (2nd ed.). Boston: Houghton Mifflin.

Smith, J. K., & Hodkinson, P. (2005). Relativism, criteria, and politics. In N. K. Denzin & Y. S. Lincoln (Eds.), The SAGE handbook of qualitative research (3rd ed., pp. 915–932). Thousand Oaks, CA: Sage.

Stage, F. K., & Maple, S. A. (1996). Incompatible goals: Narratives of graduate women in the mathematics pipeline. American Educational Research Journal, 33, 23–51.

Tashakkori, A. (2009). Are we there yet? The state of the mixed methods community. Journal of Mixed Methods Research, 3, 287–291.

Tashakkori, A., & Teddlie, C. (2003). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

Taylor, S. J., & Tashakkori, A. (1997). Toward an understanding of teachers’ desire for participation in decision making. Journal of School Leadership, 7, 1–20.

Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Thousand Oaks, CA: Sage.

Todd, Z., Nerlich, B., McKeown, S., & Clarke, D. D. (2004). Mixing methods in psychology: The integration of qualitative and quantitative methods in theory and practice. East Sussex, UK: Psychology Press.

Trochim, W. M. K. (2006). Research methods knowledge base (2nd ed.). Available at http://www.socialresearchmethods.net/kb/

Trochim, W. M. K., & Land, D. A. (1982). Designing designs for research. The Researcher, 1, 1–6.

What Works Clearinghouse. (2008). Procedures and standards handbook (version 2.0). Washington, DC: U.S. Department of Education. Available at http://ies.ed.gov/ncee/wwc/help/idocviewer/doc.aspx?docid=19&tocid=1

Wheeldon, J. (2010). Mapping mixed methods research: Methods, measures, and meaning. Journal of Mixed Methods Research, 4, 87–102.

Whitehead, J., Patterson, S., Webber, D., Francis, F., & Zhou, Y. (2001). Easy-to-implement Bayesian methods for dose-escalation studies in healthy volunteers. Biostatistics, 2, 47–61.

Winer, B. J. (1962). Statistical principles in experimental design. New York: McGraw-Hill.

Yin, R. K. (2006). Mixed methods research: Are the methods genuinely integrated or merely parallel? Research in the Schools, 13, 41–47.


11. INTELLECT, LIGHT, AND SHADOW IN RESEARCH DESIGN

JOHN P. BEAN

Indiana University

As people are known by the company they keep, scholars should be known by the ideas they keep. A poultice against stale ideas resides in the generation of new ones. Creating valuable new ideas, ideas that transform and energize a discipline, is the province of researchers. Ideas, which are directly related to research topics, are also directly constrained or expanded by research design. Big ideas change the way a generation of researchers thinks and works, and these ideas transform practice. Little ideas are refinements of big ideas, if they are of any use at all. The tension between studying big ideas that do not fit neatly into existing methods and studying safe and small ideas that produce predictable but trivial results creates a backdrop against which scholarship evolves.

Research, as a form of scholarship and creative work, is at the core of the academic enterprise. It is through research that universities contribute knowledge to society. Research provides the basis for what is taught in the disciplines and how members of a discipline understand their professional work. The design of research has a direct effect on what is discovered, the ideas that are created, and what forms of research contain legitimate professional information that is then passed on to the next generation of scholars in a field. The promise of research is that it will give us, as a society, what we need to know to improve our lives.

An axiological question arises for those planning to do research: Is research of value because it is true or because it is useful? Truth, the meaning of which is contested by philosophers, the existence of which is contested by postmodernists, and the use of which is contested by critical theorists, might be unattainable. I use the term truth as shorthand to mean that which is consistent with observation—if observations can be made—and is identified through procedures accepted in the discipline.

THE BRIGHT PROMISE OF KNOWLEDGE

Basic research emphasizes the search for disciplinary truth, whereas applied research emphasizes finding out something useful. Both goals are attractive, and one does not preclude the other. Conjoined with the axiological question is a metaphysical one: Is there an objective reality out there that we can discover, or is the world a product of the imagination, constructed in the minds of individuals and groups? Researchers design studies based on what they believe knowledge to be. The search for an objective truth involves a different path from the search for individual meaning or a consensus about intersubjective meaning.

In the professions, as opposed to the basic disciplines, utility is necessary. Professionals provide service to a client based on superior knowledge developed from a long study of the disciplinary research. According to Shils (1984), what separates academic knowledge from common knowledge is that academic knowledge is developed by a rigorous methodology. Researchers in a pure discipline (Biglan, 1973) attempt to establish truth in that discipline. Researchers in a profession have as their purpose not to attain pure knowledge but rather praxis, that is, to attain the best knowledge that can be applied in service of their clients’ needs. To offer education as a societal good, money is spent, programs are funded, and teachers are trained and hired. If research is to inform these processes, then it is difficult to escape from pragmatism and positivism.

Research that advances methodology is of value to a field by developing better researchers, but such research is not always of direct use to a researcher’s clientele. A field that emphasizes internal debates about philosophy, methodology, and definitions can be very lively but is in danger of becoming irrelevant. Well-designed research should deliver new understandings and new theories. The ultimate test of its value to the public will not rest with internal elaboration or with faculty members charming other faculty members; rather, it will be seen with improving understanding, teaching, learning, and organizing in a heterogeneous society. In what follows, I discuss some of the primary considerations that should inform research designs.

Theories

Educational researchers are interested in finding out how one thing is related to another; describing a set of phenomena; and establishing a basis on which to make claims, predictions, and explanations. Braithwaite (1955) writes that the purpose of science is theory and that, by extension, the purpose of research is to contribute theories or refinements of existing theories to science. Theory is a kind of abstraction, a simplification of reality that applies in similar circumstances and not just to the specific case at hand. For researchers, theories focus attention, limit choices, and provide explanations—characteristics that give good theories a central role in research design. For actors in the educational environment, they are practical for the same reasons.

Theories about social behavior have inherent limits. These are identified by Thorngate (1976) and elaborated on by Weick (1979). Thorngate developed a postulate of commensurate complexity in which there are trade-offs among a theory being general, a theory being accurate, and a theory being simple. A theory cannot be all three simultaneously; general accurate theories are not simple, accurate simple theories are not general, and simple general theories are not accurate. Weick provides examples of each. In developing a research design, the theory used, or the one the researcher is trying to develop, has more important effects on research design than does anything except the topic chosen for the research. Theory drives hypotheses. The choice of a theory to use or develop reflects the researcher’s interest in being general, simple, or accurate and shapes the study accordingly. From my observations, I would suggest that educational research errs on the side of being simple and, with luck, accurate.

Theory is emphasized in research because it provides explanation. Without meaningful descriptions of the situation—that is, without identifying new ideas to be understood and related to each other by theories—research would not move forward. Designing research that identifies the ways in which people in a given situation view their worlds is a sensible starting place for meaningful research. Without important things to be studied, no theories would need to be developed, and a rigorous methodology to estimate relationships based on theory would not be necessary.

Topics and Ideas

How does one go about selecting substantive issues connected by a theory? Texts covering research in education or the behavioral sciences have tended to either be mute on the question or provide only minimal advice (Gall, Borg, & Gall, 2002; Kerlinger, 1973; Krathwohl, 1988), with some improvement lately (Creswell, 2007). The selection of topics for study is neither innocent nor rational. It is not innocent, because being selected gives a topic legitimacy, creates the power of knowledge for those affected by the topic, and creates invisibility for those excluded. It is not rational, because researchers choose topics to study based on professional interests, not professional mandates, and on self-interest based on what researchers value, are curious about, or perceive they will profit from. What is studied in a discipline becomes what is taught in the discipline. Just as what is included in the curriculum is a political decision as well as an educational decision, what is studied is not neutral; it implies that what is studied is valuable.

It is axiomatic that the most important part of any study is the choice of a topic; that is, research findings depend on what is being researched. A good topic, when well studied, improves descriptions in a field, better explains how theories operate in the discipline, and shows how this knowledge can be applied to benefit clients and society as a whole. The research problem to be addressed and the statement of the purpose of the study focus research activities and limit the scope of the study. If the purpose is too broad, then the research cannot be accomplished with reasonable effort. If the purpose is too narrow, then the study is trivial. The statement of purpose is the most important sentence in a research proposal. Researchers need to avoid making Type III errors—asking the wrong question or not asking the right question—or what I consider to be Type IV errors, that is, studying the wrong thing.

Because we are well trained, we academics can justify studying practically anything. Politicians can mock this attribute of the academy, in the past handing out “Golden Fleece” awards for research that seemed to be fleecing the taxpayer. Rather than focusing exclusively on what we do study, it is important to recognize what we choose not to study. Nearly 30 years ago, Gardner (1983) identified eight areas of intelligence: verbal/linguistic, logical/mathematical, visual/spatial, physical/kinesthetic, interpersonal/social, musical, intrapersonal/emotional, naturalistic, and spiritual. Of these, verbal/linguistic and logical/mathematical topics have dominated educational research, while the others have received little study or have been the province of specialized fields (e.g., music).

The topics we choose reflect a worldview of educational research: study the specific rather than the general, emphasize topics related to linguistic and logical cognitive development, and use methods that favor such studies. We also favor topics that reflect social justice, cognitive development that leads students into economically productive lives, and those topics that fit within predetermined disciplinary specialties. If education deals with the whole student, as Bowen (1977) suggests, it would behoove researchers to look for topics that consider more kinds of intelligence than those associated with linguistic and mathematical reasoning. Having become obsessed with certainty in our research, we have eschewed broad research topics that require integrating various components of human intelligence in favor of being able to say with great certainty something of diminutive importance. Highly specialized methodologies require highly refined topics, which provide little opportunity for transformational outcomes and the development of big ideas.

Choosing Topics

Identifying a research problem is the starting place for research. Research problems involve unresolved real-world conditions or situations, and it is the research problem that is nested between the topic and the purpose. Theoretical research problems deal with “we don’t know why,” descriptive research problems deal with “we don’t know what,” and practical research problems deal with “we don’t know how.” The best researchers are attracted to uncertainty, paradoxes, anomalies, contradictions, and ambiguities in the field. The significance of a problem is often based on the way in which it intersects with theoretical uncertainty and practical importance.

Probably the best way of finding a problem to study is to do extensive reading in an important topical area and find out what is poorly understood. Many articles contain sections that give suggestions for future research, and many have


retention program, such as developing study skills, but otherwise would have experiences no different from those of the control group. After a given time period, the researcher would find out whether the retention rate for students who participated in the retention program was significantly different from that for students who did not. This information would be used to support or negate the hypothesis.
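The comparison just described is, in essence, a test of the difference between two proportions. The sketch below shows one standard way to carry it out, a two-proportion z-test; the function name and the enrollment figures are hypothetical, invented for illustration rather than drawn from the chapter.

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions.

    Returns (z, p_value). Uses the pooled proportion for the standard
    error, the textbook large-sample formulation.
    """
    p_a = success_a / n_a
    p_b = success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 140 of 200 program students retained vs. 120 of 200 controls.
z, p = two_proportion_z_test(140, 200, 120, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented figures the test rejects the null hypothesis at the conventional .05 level, which is exactly the kind of evidence the experimental scientist described here would use to support the hypothesis that the program raises retention.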

The conceptual theorist is involved with research that is impersonal, value-free, disinterested, imaginative, and problematic, involving multiple causation, purposeful ambiguity, and uncertainty. The theorist is interested in the conflict between antithetical imaginative theories, comprehensive holistic theories, and ever-expanding research programs to produce conflicting schemas using dialectical and indeterminate logics (Mitroff & Kilmann, 1978, p. 56). A theorist conducting retention research would provide at least two theoretical explanations of retention behavior, use survey methods to gather information, analyze the data using statistics, find out whether the data supported one theory more than the other, and use that information to make more theories. Much of the empirical research reported in educational journals is a combination of the scientist and theorist—theory guiding social science inquiry into educational structures and processes.

The conceptual humanist (although I find social humanist to be a more accurate description) approaches research as a value-constituted, interested activity that is holistic, political, and imaginative; where multiple causations are present in an uncertain and problematic social environment; and with a deep concern for humanity. This approach recognizes the importance of the relationship between the inquirer and the subject and has the aim of promoting human development on the widest possible scale. The normative outcomes of such research would be economic plenty, aesthetic beauty, and human welfare. Similar to an action researcher, the social humanist prefers small-group dynamics where both the inquirer and the participants learn to know themselves better and work together to improve the situation (Mitroff & Kilmann, 1978, p. 76). A retention researcher using this approach could develop an ongoing program of action-oriented ethnographic research studies, where the researcher comes to better understand how the issues facing students contribute to their leaving, and tries to alleviate those conditions. The purpose is to increase the overall retention rate with the belief that students who complete college lead richer lives.

An individual humanist addresses inquiry as a personal, value-constituted, interested, and partisan activity, engaging in poetic, political, acausal, and nonrational discourse in pursuit of knowledge. Intense personal knowledge and experience are highly valued, aiming to help this person to know himself or herself uniquely and to achieve her or his own self-determination. The logic of the unique and singular has mythical, mystical, and transcendental overtones that operate as counternorms to the CUDOS (Mitroff & Kilmann, 1978, p. 95). A retention study from this perspective would try to develop a detailed understanding of a single student in the full context of his or her life. It could take the form of an "N of 1" case study: a phenomenological inquiry into who the student is, what the student finds at school, and how staying or leaving school would be better for this particular individual.

The purpose of presenting these four perspectives (and many more can be imagined) is to illustrate that there is no best way in which to study a topic. Different kinds of studies make different kinds of assumptions about what is important to know, serve different needs for different people involved in the studies, and produce different kinds of outcomes. The four perspectives were presented in what was once considered the normative order of acceptability: science-based research, theory development, action research, and phenomenology. One may be no more correct than the others. Some are more acceptable to certain audiences than to others, and each produces a particular outcome that favors some stakeholders more than it does others.

Methodology and the Scientific Approach

Methodology is often considered the core of research design. Kerlinger (1973, cited in Daniel, 1996) described as one of the research myths the idea that research design and research methods were synonymous, even though many researchers held this view. Methodology is the tool used to accomplish part of the study, specifically, how to obtain and analyze data. It is subservient to choosing an important topic to study, matching the research problem and the methodology, and knowing what the results mean and how they can be applied. To do good research, the methodology used should be appropriate for the problem addressed. This is a necessary condition but not a sufficient one. An elegantly analyzed data set that was composed of ambiguously measured data that addressed a question of trivial importance is not likely to enter the annals of great research.

Educational research is part of the social science research tradition, a tradition that was influenced by research in the natural sciences. The natural sciences use the scientific method to solve research problems or support a perspective. The method contains a series of sequential steps similar to the following: Identify a problem, gather information from the literature about this question, develop a hypothesis in the context of a theory, collect data related to the hypothesis, analyze the data, and draw a conclusion related to the truthfulness of the hypothesis and correctness of the theory.

Scientists, as logical purists, build arguments on falsifiability and the law of the excluded middle. This law states that A and not-A cannot exist simultaneously. But if A stands for "this program helps students to learn" and not-A stands for "this program does not help students to learn," then both can be true, as in the case of aptitude–treatment interactions; that is, a treatment could be effective for a high-aptitude student but not effective for a low-aptitude student. If both are true, then the law of the excluded middle is violated and falsifiability cannot be demonstrated. This situation is problematic for scientific research in education.

Education is not a scientifically based process, partly because the term education is ideological and idiosyncratic, much different from the term temperature. At best, scientific research can shed light on narrowly defined educational behaviors, and researchers can hope for, but cannot guarantee, a cumulative effect. When a government policy assumes that education is equivalent to improving the score on a test, the society will not have a moral compass and will not be educated. Feyerabend (1993) holds the view that if we do not separate scientific research and the state, as we have separated the church and the state, irreparable harm will be done.

In the same way that social science research has imitated the natural sciences, educational research has imitated social science research. Research in these areas may be separated more by the topic studied than by the rigor of the methodology. As with social science research in general, educational research might pretend a level of control so as to carry out an experiment. Researchers give themselves solace that "other things are equal," or "other effects are random," or "spuriousness is not a problem," and proceed as if the social world were simple and understandable in the same way that the traditional world of the pure sciences can be.

A Traditional Approach to Designing Educational Research

Graduate programs in education typically are arranged in departments that reflect academic subspecialties such as history of education, sociology of education, anthropology of education, counseling psychology, experimental psychology, higher education, school administration, and curriculum and instruction. Most of these programs require a course in research design appropriate for their field. My discussion revolves around quantitative and qualitative approaches to research, terms that reify categories that are themselves overlapping and arbitrary, but are commonly used to describe research courses.

A simple way of looking at a proposal is to see how it answers the three questions posed earlier: What is this study about? Why is the study important? How will the researcher conduct the study? The study itself must cover these three questions and answer two additional questions: What did the researcher find? What do the findings mean? Research designs revolve around a limited number of elements. Their exact use and exposition vary depending on the particular approach taken. The purpose and promise of these elements have been identified and discussed by a number of textbooks such as those of Gall and colleagues (2002) and Creswell (2007). These texts identify many of the issues facing researchers, especially those who are new to the process.

Although not suitable for all studies, well-designed quantitative research usually addresses the areas presented in the following outline:


I. Introduction to the topic of the research
   a. Background and context in which that topic has been studied
   b. Importance of studying the topic, including
   c. Practical value of the study
   d. Research problem to be addressed
   e. Purpose of the study
   f. Objectives or questions to be addressed
   g. Definitions and related constructs
   h. Assumptions used in the study
   i. Limitations of the study
   j. Scope of the study

II. Using the literature to build conceptual arguments
   a. Relevant theories to guide the study
   b. The findings of other researchers that identify important variables related to the purpose
   c. Identification of the dependent variable, independent variables, and theories linking these variables

III. Methodology to be used in the study
   a. Site, sample, or selection procedure for respondents, including
   b. How the data will be gathered
   c. How the data will be measured
   d. How the data will be analyzed
   e. Why these methods are appropriate

[This information usually completes the design of a research proposal and appears as the first part of a finished study. Finished studies also include the following.]

IV. Findings
   a. A description of the sample actually analyzed in the study
   b. Description of the data
   c. Treatment of missing cases
   d. Possible or known biases
   e. Description of how the researcher met the assumptions required to use the chosen statistics
   f. Presentation of the data
   g. Support or lack of support for the hypotheses or theories used
   h. Discussion of the findings
   i. Conclusions

V. Summary of the study
   a. Practical implications of the study
   b. Areas for future research

Qualitative research can involve using the five research traditions identified by Creswell (1998), namely biography, ethnography, grounded theory, case study, and phenomenology, which can be used singly or in combination. General headings appropriate for qualitative studies include the topic, focus and purpose, significance, related literature, methodology, presentation of the data, interpretation of the data, and conclusions. Detailed headings would be similar to the following:

I. Topic to be studied
   a. The overall interest focusing on what will be explained or described
   b. Organizing metaphor (like grounded theory)
   c. The mystery and the detective
   d. Hermeneutic elements
   e. The significance of the study
   f. Why the reader should be interested

II. Getting information away from the source
   a. Relevant literature
   b. Theories
   c. Findings for content area, themes, foci, or analogous situations

III. The selected qualitative method
   a. How information is obtained
   b. How sense is made from it
   c. Site selection
   d. Informant selection
   e. Data collection
   f. Data analysis
   g. Data evaluation

[The methodology can be in a separate chapter, part of the first chapter, in an appendix, or woven into the chapters that present respondent information. This section is followed by one or more chapters which do the following.]

IV. Present the text as natural answers to natural questions
   a. Presentation formats include stories, tables, interviews, documents, narratives, photographs, videotapes, vignettes, texts of various kinds, descriptions, and routines (see Schwartzman [1993], from which some of these headings were taken)
   b. Raw data can be presented without comment, presented and interpreted simultaneously, or presented and interpreted in a later section

V. Findings
   a. Conclusions
   b. Recommendations for others


The structure of qualitative studies is more idiosyncratic than the more formal structure of traditional quantitative studies. In qualitative research, the sample is the study, and the reasons for selecting the sample need to be emphasized.

Most researchers, when approaching a topic they care about, have tentative hypotheses about what causes what or predispositions to thinking that the world operates according to certain principles that also apply in this area. People bias their observations based on their experience. All of us know that we have certain biases, and we can try to counter those biases in our research by looking for evidence that directly contradicts what we expect. In the unconscious, there is a second set of biases of which, by definition, we are not aware. Sometimes peer readers can help the researcher to discover what is missing or what is inappropriately under- or overemphasized in the study.

After analyzing the data in a quantitative study, the researcher presents the findings. Typically, this is rather straightforward, because the data to be gathered and the analyses proposed for the data were specified in the proposal for the study. For qualitative researchers, the data, the findings, and the method might not be distinct. The narrative that presents selected questions and answers can represent findings based on data that came from the method by which questions were developed. As the previous sentence suggests, it is a convoluted process. The presentation might revolve around respondents' experiences and understandings, a chronology of events, or themes supported by respondents' statements. In ethnographic studies, the descriptions of lives in context can stand on their own (Lawrence-Lightfoot, 1995). Thick descriptions (Geertz, 1973) might provide greater insight into the education of a student in a school than would an analysis of variables. In most studies, some analysis of the descriptions is expected. This predisposition is part of the legacy of pragmatism; researchers in education are expected to identify how knowledge gained from the study can improve professional practice.

In designing a study, the background, context, importance of the topic, and presumed practical value of the study come from the literature written about the topic or analogous literatures in similar fields. For example, the study of college student retention can be considered to be analogous to the study of turnover in work organizations, and the literature in one area can be used to reinforce the literature in the other area (Bean, 1980). The use of literature in quantitative studies, however, can differ substantially from the use of literature in qualitative studies.

In a quantitative study, the literature is used to identify the importance of the dependent variable, relevant independent variables, and theories that bind these factors together, to justify the use of statistical procedures, and to provide a context for the discussion. In qualitative studies, a premium is placed on the ability to see what is before the researcher. Our ability to observe is both heightened and diminished by our prior knowledge and expectations (Bean, 1997). It is heightened by making ourselves aware of important details to observe, and it is diminished because we focus only on those details. Due to the preconceived notions of the researcher, those factors actually influencing the respondents' world might not be identified. When the literature shapes the way in which we view the world, what is actually before us is replaced by what we expect to see.

A review of the literature, as a stand-alone section summarizing research in the topical area, makes little sense. The literature, as a compendium of related information, should be used to advance arguments related to the importance of the subject. It should identify topical areas that are either well or poorly understood, identify and describe relevant theories, identify and describe appropriate methodologies to study the topic, describe dependent and independent variables if relevant, provide definitions, and provide a context to discuss the findings from the study.

Google Scholar, Dissertation Abstracts International, ERIC Documents, the ISI Web of Knowledge, and the proceedings of relevant professional organizations all can be used to access current research. A condemning retort is that the literature in a study is dated. This phrase has some interesting subtexts. The first assumption is that the most recent research is the best research and that previous research is irrelevant. A second assumption is that all research is of limited generalizability over time, so that if it is older than, say, 5 years, it is irrelevant. In either case, dated research is of marginal value. By extension, the research that a person is currently conducting is also of marginal value because it will be useful for only 5 years. This planned obsolescence of research becomes a justification for a frenzied increase in the rate of publication and is counterproductive in terms of identifying important and durable ideas in the field. Literature should not be weighed, nor considered dated after 5 years.

In traditional quantitative research, the topic contains the dependent variable, and the factors associated with it identify the independent variables that have been found to have important effects on the dependent variable. In studies that are not codifications (that is, not extensive reviews of the literature for the heuristic purpose of organizing what is known about a topic), citing the literature should be done for the purpose of building an argument, not simply to show familiarity with the canon.

Since the 1960s, the number of statistical analyses available for researchers to include in their designs has increased dramatically. Five commercial statistical packages bear the initials SAS, SPSS, BMDP, GLIM, and HLM. The development of these statistical packages has allowed ever more complex analyses to be performed. National data sets from the National Center for Educational Statistics (NCES) and other sources have provided the opportunity to bring order to vast amounts of data.

For the description of large-scale phenomena, these data sets can be very valuable. For analyzing the causes of behavior, however, the attempt to gain a broad vision masks individual or small-group differences. Longitudinal studies almost always suffer from decay; that is, measures may differ from year to year and respondents drop out of the study. So the comparisons from year to year might not be the result of what people report; rather, they might be the result of changes in who is doing the reporting.

The availability of data and the means to analyze them raised the level of expectation in some journals that such analyses should be the norm. What is certain is that during the past 50 years, the sophistication of analyses has increased. The literature shifted from normed surveys that reported frequencies, to chi-squares, to analyses of variance (ANOVAs) and simple correlations, to factor analysis and multiple regression, to causal modeling with ordinary least squares path analysis, to maximum likelihood used in linear structural relations (LISREL) modeling, and to generalized linear modeling (GLIM) and hierarchical linear modeling (HLM).

The increase in complexity is associated with an increase in agitated exchanges between statisticians about whose method is correct. An improved methodology has not been matched by these studies becoming more influential in policy making or practice (Kezar, 2000). The debate is sometimes invisible to the public, taking place between the author of a piece of research and the consulting editors who review the research, and sometimes it appears in journals such as the Educational Researcher.

Data

Quantitative studies require data that can be used in statistical analyses. The sources of data can vary widely: historical documents, governmental records, organizational records, interviews, standardized surveys, questionnaires developed as part of the research protocol for a particular study, unobtrusive measures, observations, participant observation, and so on. The quality of the research depends on the quality of the data analyzed; data analysis has only a secondary influence.

The quality of the data varies greatly. Good research design requires that the researcher understand the strengths and weaknesses of the data. Historical data can reflect the biases and ideological preferences of those who recorded it. People who provide data can intentionally distort it to put themselves in a better light, for example, reporting that they had higher grades than they actually did. Survey data might come from a biased sample reflecting only the experiences of high–socioeconomic status respondents. Questions in a survey might be ambiguously written, or a single item might contain two questions with different answers, for example, "How satisfied are you with your salary and fringe benefits?" Survey data that require a forced-choice response might not represent the real interests of the respondent. A respondent might have no opinion on most of the questions and refuse to answer them. Other respondents might not want to reveal personal information and so might misrepresent their actual incomes, whether they have ever plagiarized, or how much they use drugs or alcohol. Although the questionnaire is not missing any data, the data provided might be intentionally inaccurate.


In other cases, respondents might not understand the questions, might not care about the answers given, or might become fatigued while filling out the questionnaire, so that the accuracy of the responses is different for the beginning and the end of the questionnaire. A well-written question should reflect one bit of information about the respondent unambiguously and reliably, and the answer to the question should match observable facts.
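Some of the response problems described above leave mechanical traces that can be screened for before analysis. The sketch below is one hypothetical screening pass (the thresholds, respondent IDs, and flag labels are invented for illustration): it flags respondents who skipped many items or gave the identical answer throughout, a pattern often associated with fatigue.

```python
# Hypothetical survey responses on a 1-5 scale; None marks a skipped item.
responses = {
    "r01": [4, 4, 4, 4, 4, 4, 4, 4],                  # identical answers throughout
    "r02": [3, None, 4, None, None, 2, None, None],   # mostly refused
    "r03": [2, 3, 4, 3, 5, 2, 4, 3],                  # plausible variation
}

def flag(answers, max_missing=0.25):
    """Return a problem label for one respondent's answers, or None."""
    given = [a for a in answers if a is not None]
    if len(given) < len(answers) * (1 - max_missing):
        return "too many missing items"
    if len(set(given)) == 1:
        return "no variation (possible straight-lining)"
    return None

for rid, answers in responses.items():
    problem = flag(answers)
    if problem:
        print(rid, problem)
```

Such checks catch only the visible symptoms; as the text notes, a complete questionnaire can still be intentionally inaccurate.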

It is acceptable to use problematic data if the analyst understands and acknowledges the problems that exist in the data. For example, a data set might not be random but might be completely representative of one subgroup in the population studied. The bias in this sample can make conclusions drawn about the well-represented group accurate, but the conclusions would not apply to the whole population. Although not representative, the data might be useful to see whether a hypothesized relationship exists at all, that is, as a test of theory.

Data gathered from face-to-face interviews for qualitative research have the potential to yield a gold mine of insights into people's lives and situations. There is no substitute for prolonged and focused conversations between trusted parties to discover what is important to the interviewees and how respondents understand key elements in their own lives. When mishandled, interview data can reflect what the interviewees think the interviewers want to hear, normatively appropriate responses, the fears and biases of the interviewees, and the fears and biases of the interviewers. Data flaws become limitations of the study for which the only response is to caution the reader that the results are far from certain.

Ethics and Institutional Review Boards

Before proceeding with an examination of research methods, there are some ethical and legal considerations that have obtrusively entered the development of a research protocol. In line with designing research to be useful, it should also be designed to be ethical. The most obvious ethical problems arise when a research procedure causes harm to those who are asked or forced to participate in the process. There are several well-known cases of abuse, including psychological studies where participants were put in unusually stressful situations (Baumrind, 1964; Milgram, 1974) and medical research where participants were given diseases or intentionally denied treatment (Jones, 1993).

The bureaucratic response to these ethical violations was to create rules that would include everybody doing any kind of research that involved contact with living people. Bureaucratic actors, evaluating research they are not conducting themselves, become the gatekeepers of ethical behavior. This responsibility is misplaced; researchers themselves should be responsible for protecting the interests of participants in their studies. I am not naïve enough to think that all researchers are ethical or that institutional review boards (IRBs) or protection of human subjects committees will go away. The problem is that ethical judgments about research have been made extrinsic to the research process. Researchers need to design research that does just what the committees want: to protect the participants of a study from harm. If researchers are not socialized to provide these protections, IRBs might not help. The enforcement system used, which involves taking federal support away from ethical researchers because they happen to be at an institution where one person did not comply with the guidelines, is a collective punishment which is itself unethical. IRBs have the enormous power of being able to block research, and the potential for abusing power must be kept under constant scrutiny.

For qualitative researchers especially, complying with a written informed consent form can damage the trust required to conduct a study. The study of any group that dislikes authority is made impossible, or at least less reliable, by asking participants at the outset to sign a form that says, "You should know that this researcher does not intend to hurt you." A journalist and an ethnographer can conduct and publish identical studies. However, the journalist needs no informed consent from those who are interviewed for the story, whereas the ethnographer at a research institute needs IRB permission to ask the same questions. The journalist is protected by freedom of speech, whereas academic freedom, according to IRB rules, provides no such protection for the researcher.

While much of the antagonism between IRBs and the faculty involves what are seen as nuisance parameters, as hurdles to be jumped, IRB guidelines constitute a direct attack on academic freedom. When a faculty member has to get permission to "do research as he or she sees fit," there is no academic freedom. There are three traditional threats to academic freedom: economics, religion, and politics. One could argue that IRBs are simply a new form of the economic threat, being that failure to follow IRB guidelines will result in the loss of substantial federal funding for research at the institution. Too often, I fear, these offices displace their goals of protecting human subjects and replace them with the goal of making researchers follow their rules. The direct and opportunity costs to institutions in wasting faculty time are real, but the greater problem is researcher self-censorship: not doing research because of the fear that it will not be approved by the IRB. Academic freedom is the most central value in American higher education, and anything that puts it at risk needs greater justification than IRBs have provided for their behavior. Regardless of what happens to IRBs, research should be designed to protect everyone, to benefit the participants in the research, and to protect society from ignorance.

Generalizability

Generalizability is the central bulwark of the scientific research in education approach. In a 2002 National Research Council report, the editors observed, "Regularity in the patterns across groups and across time—rather than replication per se—is a source of generalization. The goal of such scientific methods, of course, remains the same: to identify generalized patterns" (p. 82).

Generalizability is a powerful statistical tool that allows researchers to make predictions about patterns of behavior in a population, such as the percentage of people who will vote as independents, based on a measure of that behavior taken from a sample of the population. It is attractive to policy makers because it suggests the extent to which a particular solution will work everywhere in the population. As the behavior in question gets more complicated, such as how students learn ethical behavior, generalization is of more limited value. Lincoln and Guba (1985) describe this limitation well:

    Generalizations are nomothetic in nature, that is, lawlike, but in order to use them—for purposes of prediction or control, say—the generalizations must be applied to particulars. And it is precisely at that point that their probabilistic, relative nature comes into sharpest focus. (p. 116)

Does what works 90% of the time for the participants in a study work for one particular teacher in one particular class dealing with one particular subject? Tutoring generally helps students to learn how to read, but for a student who is acting out against authority, and who views the tutor as an authority figure, tutoring might prevent the student from learning to read.
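The sample-to-population inference discussed above (estimating, say, the percentage of independent voters from a sample) can be sketched as a simple confidence interval. The poll counts and function name are hypothetical, and the computation illustrates the limitation as much as the tool: it bounds a population rate while saying nothing about any particular case.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Approximate 95% confidence interval for a population proportion,
    assuming a simple random sample (normal approximation)."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# Hypothetical poll: 180 of 600 sampled voters identify as independents.
low, high = proportion_ci(180, 600)
print(round(low, 3), round(high, 3))
```

The interval is a statement about the population rate, a nomothetic regularity in Lincoln and Guba's sense; applying it to one particular voter, classroom, or student is precisely where its probabilistic nature shows.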

As a "reductionistic fallacy" (Lincoln & Guba, 1985, p. 117), generalization simplifies decision making and simultaneously reduces the understanding of the particular. Teachers operate in particular environments, and the findings from a "scientific" study with a high degree of generalizability do not ensure a program's utility in a given classroom. The purpose of scientific research is to eliminate uncertainty so that the operator can predict and control the future. Applied to education, this goal is not a research norm or a teaching norm but rather a political norm.

The Shadow of Research Design

Research is seductive because it promises to give the participants, as producers or consumers, those things that they imagine they want. We are seduced by research, viewing it as a beatific process by which we can glimpse the bright light of pure knowledge. Scholars would have no agenda other than furthering knowledge, a value shared by those who fund research, publish it, and base policy on it. It would be a collegial environment where differences exist only about approach, all participants share the ultimate goals of research, and no ethical problems exist. These utopian goals include a greater understanding of individual and group processes in a given discipline, with the potential to apply these findings to improve both individual lives and society collectively. Researchers would design their studies for the sole purpose of sharing information to better understand the issues at hand and distribute the knowledge widely so that it can improve practice.

The shadow of research design comes as a series of dispositions and paradoxes when the person designing research must make decisions for which the search for disciplinary truth provides no direction. A researcher has some control, but not complete control, over deciding what research to conduct. A researcher has limited control, or no control, over how research is funded, how it is evaluated, and how it is used. The shadow of research appears when one confronts the lack of creativity in research and psychological barriers to the free flow of ideas. It occurs when a researcher encounters difficulties related to the disciplinary research environment and the primary and secondary social environments associated with the research.

The Loss of Creativity

If the world were static, then creativity would not be necessary; what worked in the past would continue to work in the future. In a dynamic social world existing in a turbulent ecology, the generation of new ideas is necessary for survival. In the natural world, mutation is a random process, and selection occurs where the new form’s fit to the natural environment gives it advantages over existing forms. In the social world, creativity is the source of variation and must be present before selection can take place. Without creativity in identifying problems to be addressed or methods to be used, a field of study would atrophy.

If research has a core more important than anything else, it is creativity. Without creativity, researchers would only repeat themselves. Without creativity, the questions we pose, the starting place for research design, would be endlessly repetitive. Creativity allows the clientele of researchers—be they the public, practitioners, or other researchers—to bring new ideas into their intellectual or practical lives. They can agree or disagree with each other’s findings. They can find fault in their methodologies. But the new ideas remain as work to be examined, understood, enacted, selected, and retained for use (Weick, 1979).

Getzels and Csikszentmihalyi (1976) describe problem finding as being at the heart of the creative process. Educational researchers who invent the best problems have the greatest chance of contributing to their fields. A good problem implies the way in which it should be studied. A superb methodology will not make up for a poor research problem. Structured processes for becoming more creative that emphasize steps to be followed have been identified (Parnes, 1992). However, the content of the steps is not well understood. If it were, then everyone would be creative and have plenty of excellent problems around which to design research.

To be creative, as opposed to simply novel, a researcher should be well-versed in substantive knowledge of the topic and the limitations of the chosen methodology. Creativity has at its foundation a sense of play—of suspending normal constraints so as to see new patterns, possibilities, or connections. Play is usually an “idea non grata” in a workaholic environment, although the hermeneutical philosopher Gadamer considered play to be an expression of great seriousness (Neill & Ridley, 1995). Play is characterized by just those things that are likely to lead a researcher into creative work, including taking risks, testing new ideas in safety, avoiding rigidity, and suspending judgment (Schwartzman, 1978).

A risk-averse, judgmental, assessment-oriented environment focused on short-term gains will have a negative effect on creativity. If proposals are assessed by published criteria, then how can new projects that do not fit established criteria be funded? We live in a judgment-rich environment, where we have been socialized for years into viewing work as something that will be graded. Peer reviews, editorial reviews, administrative reviews, and granting agency reviews occur regularly. Faculty work can be assessed on an annual basis, with the expectation of products in hand making the time frame for completing work within a year or less. In graduate schools, students are steered out of creative projects because such projects are too risky. It is unfortunate when research is designed not out of the possibility of success but rather out of the fear of failure. In the process, creativity—researchers’ best friend and asset—is shunted to the rear. Academic reproduction (Bourdieu, 1984/1988; Bourdieu & Passeron, 1990) ensures reproduction, not evolution. Creativity suffers in the current context of conducting research, and producing valuable new understandings is ever more difficult.

Fear and the Researcher’s Ego

There are a number of personal factors that affect research design. Morgan (1997) describes “psychic prisons” as a metaphor for the ways in which our imaginations become trapped. Whatever neuroses we have can emerge in the task of developing research. Researchers can fixate on certain ideas, repress others, idealize states, or project their own views on the data. A frequently occurring form of projection occurs when the conclusions of a research study are not connected to the data. Researchers project their beliefs onto the data, concluding what they wanted to conclude before they began conducting the study.

Fear has an unacknowledged influence on research design that manifests itself in a variety of ways. The first is internal censorship. Certain topics and methods are never given serious consideration because to do so would be to invite trouble, at least in the minds of the researchers. For example, during the 1970s, many people did not consider qualitative research to be an appropriate form of educational research. Fearing rejection by colleagues, granting agencies, advisers, or editors, researchers steered themselves away from the use of qualitative research. It was not surprising that much of the emphasis of Lincoln and Guba’s (1985) Naturalistic Inquiry is a justification for, and not an explanation of, this kind of study. Researchers engaged in self-censorship by avoiding Black studies, women’s studies, GLBT (gay, lesbian, bisexual, transgendered) studies, and the study of emotional aspects of organizational behavior (Fineman, 2000).

Fear also underlies what has been called the “imposter syndrome” (Harvey & Katz, 1985), where researchers might fear that they are fakes. This problem can show up in an obsessive need to review the literature because a researcher “doesn’t know enough yet.” A researcher might fear not being professional enough or not being thorough enough and might analyze data in an endless pattern of trivial changes. This fear is a pathology, not a motivator.

Research can also be conducted in service to the ego, not the discipline, where the researcher is driven by the extrinsic value of research. This drive results in designing research for maximum visibility, regardless of substance. Finding the smallest publishable unit in a data set inflates one’s résumé but clutters journals. The ego thrives on high levels of productivity. The discipline thrives on high levels of quality. The current research market defines what is acceptable, and clever marketing may be more important to one’s ego than a quiet but long-term contribution to the field.

There is a competitive aspect to designing research. Instead of a “best knowledge for the discipline” model, it involves “I got there first,” “I’m right and you’re wrong,” “I win the argument,” “My theory is right,” “I got the grant and you didn’t,” “My university is ranked higher than your university,” and the like. These are the concerns of the ego, extrinsic to the creation of knowledge, casting a shadow on research. From the point of view of designing research to discover knowledge, it is bizarre that information is not shared. From the point of view that research is not about improving knowledge but rather is about supporting the ego, making a name for oneself, and providing research overhead to one’s institution, it makes perfect sense. The impulse is to design research in order to win some imaginary (or real) competition, not because it is vital to the field.

Disciplinary Norms and Groupthink

The kind of study a researcher can conduct depends on the development of the field. Mature fields, such as the arts and sciences, medicine, and engineering (Parsons & Platt, 1973), have a long tradition of theories and methods that are thought to be appropriate to use when conducting research. Cultural studies, critical theory, and other postmodern approaches have preferred methods that keep other disciplines vital by challenging their traditions. Research norms become institutionalized through accepting papers for professional meetings and publication. A disciplinary language develops, and a kind of parochialism develops in citation patterns: Cite from journals in the field only. Disciplines within education and in the professions have become ever more specialized. Research followed suit and led the way to disciplinary specialization.

Research design reflects this specialization in topic and method. Specialization can have the advantage of accuracy and the disadvantage of triviality. Researchers who venture outside the norms can be transformational if they are lucky or can be ignored or ridiculed if they are not. New ideas are sometimes blocked by the disciplinary equivalent of groupthink. Groupthink, first described by Janis (1972), includes many factors that limit creativity and risk taking, including sharing stereotypes that guide the decision, exerting direct pressure on others, maintaining the illusion of unanimity and invulnerability, and using mind guards to protect the group from negative information.

Groupthink is more than a norm; it is an exclusionary process designed to protect the group from outside influence. Groupthink in research can limit the topics studied and the methodology used. The long period during which editors silenced the voices of women; African Americans; and the gay, lesbian, bisexual, and transgendered community in education is one example. Another currently exists among those in education who support only “scientific research” (National Research Council, 2002). Berliner (2002) suggests that the problem is not one of science but rather one of politics and money. Those who label good research in education as “scientific” are stuck in groupthink, as are those who consider the research method as essential and all else as trivial. When methodology precedes identifying the problem to be studied, groupthink wins and research suffers.

Methodology and Methodological Correctness

At the extreme, the result is “methodological correctness,” a term I coin as a play on “political correctness.” It is associated with taking oneself very seriously and is related to academic fundamentalism, where skepticism is replaced by dogma. Methodological correctness means that the purpose of research is to optimize methodology. It is an example of goal displacement, where the purpose of research is no longer to find out something important but rather to use method flawlessly. The hegemony of methodologists in determining the value of research has a chilling effect on exploring new approaches to research, on studying topics not studied previously, and on studying topics that do not lend themselves to study using preferred methods.

Institutionalized methodological correctness takes the form of guidelines, where if the guidelines are not followed, the result is funding not being given or results not being taken seriously. The U.S. Department of Education has provided A User-Friendly Guide, one that is not “friendly” at all, that can be summarized as follows: The only rigorous evidence that can be used to evaluate an educational intervention comes from research using randomized controlled trials (Institute of Education Sciences, 2003).

Simple solutions are often wrong. Randomization means that individual student differences will not be a factor in the research and that all kinds of students can expect to benefit equally from the program. The results are designed to mask individual differences to see whether the program worked for the majority. It works if the mean of the criterion variable for the treatment group is significantly higher than the mean of the control group. Like randomization, means are designed to mask individual differences. Berliner (2002) makes the point that there is a “ubiquity of interactions” and that a program could have remarkable positive effects on a small segment of the treated population, none of which would be discovered by this research design. A program could benefit gifted students, African American students, girls, athletes, or special needs students in a manner invisible to scientific methods.

Much of the contentiousness about educational research design centers on whether the research is scientific, a desideratum identified by the National Research Council’s (NRC) publication of Scientific Research in Education (2002). The debate revolves around using scientific methodologies to examine educational issues. The NRC’s position is generally supported by some (Feuer, Towne, & Shavelson, 2002; Slavin, 2002) and cautioned against or rejected by others (Berliner, 2002; Erickson & Gutierrez, 2002; Olson, 2004; St. Pierre, 2002). Research design, research funding, and politics are interconnected (Burkhardt & Schoenfeld, 2003). The Obama administration has done much to restore the importance of scientific knowledge in policy making, but one can never assume that such a change is permanent.

A research article, like the tip of an iceberg, contains only a small percentage of the information that the author encountered in the study. Given this situation, research becomes an enactment of the map–territory relationship, that is, the relationship between the object studied and the symbol for that object—the research report (Bateson, 2000). How complete does the symbol need to be to represent some objective reality? Borges (1998), in his story “On Exactitude in Science,” provides a fictional example of an empire that was so enamored of mapmaking that the cartographers were encouraged to make maps that were larger and more accurate. In the end, they made a map that was so detailed, it needed to be exactly the same size as the land it described. As a map, it was perfectly useless.

In this case, greater accuracy and greater methodological correctness diminished utility. Bateson (2000) argues that maps are useful not because they are literal representations but rather because they are in some way analogous to reality. Research provides a map, an analog of reality. If Bateson is right, then it might be more appropriate to design and evaluate research not on the basis of how correct the methodology is or how literally it represents reality, but rather on how useful it is for understanding and acting in our environments.

The Primary Research Audience

Research design is affected by the primary research audience for the study. For doctoral students, the primary audience is their advisors and other members of their research committees. For faculty, the primary audience is journal and publishing house editors and grantors. Refereed journal editors are the gatekeepers of much of the research that is published, which in turn influences what is taught and who is tenured and promoted at schools that value research.

Recognizing this power, a researcher responds to the real or imagined preferences for topic or method of this audience. The obvious way of finding editorial preferences is to read the journal, see what is being published, and use a similar approach to one’s own study. Doctoral students would be prudent to read dissertations directed by a prospective dissertation advisor to see what these preferences actually are. This situation begs the question, should these gatekeepers set the research agenda? Editors of research journals usually have been successful researchers in their fields and have published widely. The advisory board that hires an editor increases a journal’s prestige by hiring the most prestigious editor it can find. The editor then seeks out other successful researchers in the field and brings them on board. This selection procedure produces a conservative bias: It rewards what has worked in the past.

One model for the editorial process is that reviewers have had long experience in the field and make prudent judgments about what studies will advance educational practice or knowledge. Another model views editorial decisions as being on show because what editors approve is published. The imposter syndrome is ever-present: “How do I, as an editor, make decisions that will make me look like I know what I’m doing?” The ordinary response is risk aversion: “If I don’t take chances, I’m least likely to look like an imposter.” Editors are likely to reject methodologically flawed research in favor of methodologically correct research. Imaginatively flawed research, research whose consequences are trivial for the discipline or practitioners, can be published if the methods are correct but with predictable disdain from the public (Kezar, 2000). I have heard of no cases where an editor has written an author saying, “The ideas in this article are so compelling that I’m going to publish it even though it contains obvious methodological flaws.” Editorial referees work at the pleasure of the editor, and if they are to be retained, they work in line with the editorial vision. Reviewers are often shown the comments of other referees so that they can compare their responses. Feedback provides an implicit pressure to conform.

The upward drift in methodology can be considered paradoxical. To get published, authors use sophisticated methodologies. The newer and more complex the method is, the fewer the people who will be able to evaluate the article, and the fewer the practitioners who will be able to understand the research and judge whether using the results would be beneficial. In attempting to gain the approval of other researchers, a researcher might not care whether an article advances practice in the field. Good research can do both; some publications do neither.

The Secondary Research Audience

It is a desirable state when secondary research audiences—other researchers, practitioners, and the public—are more important than the primary ones. From an altruistic perspective, it is for these audiences that the research is conducted. Research should be designed to be useful to the discipline and to advance theoretical or empirical understanding of what is happening in some area of education. Does this research provide new ideas, new understandings, and new practices that advance the ways in which professionals and practitioners in the field can serve the public good? An affirmative answer would justify the use of public and philanthropic resources in pursuit of educational knowledge. Good research should benefit everybody.


A measure of the value of research is not just acceptability to the editorial process, that is, the merit indicated by its publication; rather, the impact of a piece of research on theory or practice becomes the ultimate measure of its value or worth. Research that does not meet at least minimal methodological acceptability is not published and does not become available to its potential audience. Assuming that it does reach a larger audience, does it affect future research in the field?

A well-designed study should include, at the end of the article, recommendations for future research, but in practice, these recommendations tend to focus on narrow methodological concerns, such as improving a questionnaire and using a different sample. The implicit recommendation for future researchers is that they continue to advance the theoretical orientation of the line of research. A second concluding section should deal with the practical applications of the study. The form of the application is along these lines: “If your educational world is similar to the one in which this study was conducted, here are the things you should do, based on my findings, that would improve educational practice and understanding in your world.”

Educational research can be influential not because of its quality but rather because the findings confirm what policy makers already believe. This situation is distressing because it means that an excellent study will affect policy only if policy makers want it to affect policy. When two studies are excellent but lead to opposite conclusions, policy makers lose confidence in the research and return to intuition to set policy. The politics of educational research seems to be one of its salient features (Cooper & Randall, 1999).

CONCLUSION

The reporting of research can be viewed as storytelling, as part of a mythic process of identifying who we are. In storytelling, we seek to remember the past, invent the present, and envision the future (Keen & Valley-Fox, 1989). Research can be viewed as a similar process in remembering the past by examining the literature; inventing the present by conducting the study and describing the findings; and envisioning the future where this research influences thought, policy, and practice.

To design research is to make a map, an analogy of what happens in the world. Research design depends on what is being studied and what the researcher wants to find out. The double entendre of “wants to find out” is intentional. The researcher wants to find out something about, say, how to improve literacy rates in rural areas. The researcher also wants to find out that his or her hypothesis is true, for example, that tutoring improves literacy.

The choice of the topic focuses the endeavor. The choice of method limits what can be discovered, emphasizing some possibilities and eliminating others. Each choice involves trade-offs. Each methodology chosen should, if done well, supply some beneficial information. There is one best way in which to find out something extremely simple, such as the mean length of time it takes students to memorize a list of spelling words. As the question addressed becomes broader and more complex, it can be studied using a variety of designs. There is no best way in which to study education; each approach emphasizes some things and is silent on others. Political and research ideologies can drive research or be ignored.

Research could be designed purely on the basis of curiosity if the researcher wants to know something. The methodology is likely to be emergent as the researcher plays with the topic, thinking of it without preconception; delighting in possibility; and creating an ongoing dialogue with the topic, methods, other researchers in the field, the persons being studied, and so on.

Research can also be designed around extrinsic reasons: “How can I make myself famous, promoted, tenured, or rich on the basis of my research?” For that research, the researcher should let money and disciplinary popularity lead the endeavor. For research to affect policy, one should follow the money out of governmental or other granting agencies and heed their guidelines for topics and methods. Research should be designed to meet their expectations using methods they prefer. An effective presentation of the results might demand that they be presented in the most simple or most mystifying forms.

Designing research for altruistic purposes, to benefit humanity, is more complicated, because what benefits one group might not benefit another. Any discovery can have wonderful unanticipated consequences. Basic research has grand possibilities, but the environment must thrive on patience and failure—on trying many new things that do not work to find the few that do. Such environments are rare. Research designed to solve well-defined problems—applied research—can also benefit humanity. Other applied research is intended to profit the patent holder. Research designed to provide an educational environment that will save humanity should get top billing, but who could agree on what that research would be?

In the near future, methodological correctness will likely maintain its salience. I would expect that humanistic and aesthetic values will be neglected in research in the face of issues of social justice and pragmatism. Capitalistic elements related to the costs of education and the ways in which the education system provides a suitable labor force for the nation’s economy will likely be emphasized. Whatever work we do or neglect, our research must refresh our intellect.

REFERENCES

Bateson, G. (2000). Steps to an ecology of mind: Collected essays in anthropology, psychiatry, evolution, and epistemology. Chicago: University of Chicago Press.

Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s behavioral study of obedience. American Psychologist, 19, 421–423.

Bean, J. P. (1980). Dropouts and turnover: The synthesis and test of a causal model of student attrition. Research in Higher Education, 12, 155–187.

Bean, J. P. (1997, March). How painting can inform qualitative inquiry. Paper presented at the meeting of the American Educational Research Association, Chicago.

Berliner, D. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18–20.

Biglan, A. (1973). The characteristics of subject matter in different academic areas. Journal of Applied Psychology, 57, 195–203.

Borges, J. L. (1998). Collected fictions (A. Hurley, Trans.). New York: Penguin Books.

Bourdieu, P. (1988). Homo academicus (P. Collier, Trans.). Stanford, CA: Stanford University Press. (Original work published 1984)

Bourdieu, P., & Passeron, J. C. (1990). Reproduction in education, society, and culture (R. Nice, Trans.). London: Sage.

Bowen, H. R. (1977). Investment in learning. San Francisco: Jossey-Bass.

Braithwaite, R. (1955). Scientific explanation. Cambridge, UK: Cambridge University Press.

Burkhardt, H., & Schoenfeld, A. H. (2003). Improving educational research: Toward a more useful, more influential, and better-funded enterprise. Educational Researcher, 32(9), 3–14.

Cooper, B., & Randall, E. (1999). Accuracy or advocacy: The politics of research in education. Thousand Oaks, CA: Corwin.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage.

Creswell, J. W. (2007). Educational research (3rd ed.). Thousand Oaks, CA: Sage.

Daniel, L. G. (1996). Kerlinger’s research myths. Practical Assessment, Research, & Evaluation, 5(4). Available at http://pareonline.net/getvn.asp?v=5&n=4

Erickson, F., & Gutierrez, K. (2002). Culture, rigor, and science in educational research. Educational Researcher, 31(8), 21–24.

Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific culture and educational research. Educational Researcher, 31(8), 4–14.

Feyerabend, P. (1993). Against method (3rd ed.). New York: Verso.

Fineman, S. (2000). Emotions in organizations. Thousand Oaks, CA: Sage.

Gall, M. D., Borg, W., & Gall, J. P. (2002). Educational research: An introduction (7th ed.). Boston: Allyn & Bacon.

Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books.

Geertz, C. (1973). Thick description: Toward an interpretive theory of culture. In C. Geertz, The interpretation of cultures (pp. 3–32). New York: Basic Books.

Getzels, J. W., & Csikszentmihalyi, M. (1976). The creative vision: A longitudinal study of problem finding in art. New York: Wiley.

Harvey, J., & Katz, C. (1985). If I’m so successful, why do I feel like a fake? The imposter phenomenon. New York: St. Martin’s.

Institute of Education Sciences. (2003). Identifying and implementing educational practices supported by rigorous evidence. Washington, DC: U.S. Department of Education.

Janis, I. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Boston: Houghton Mifflin.

Jones, J. (1993). Bad blood: The Tuskegee syphilis experiment (Rev. ed.). New York: Free Press.


Keen, S., & Valley-Fox, A. (1989). Your mythic journey. New York: Putnam.

Kerlinger, F. (1973). Foundations of behavioral research (2nd ed.). New York: Holt, Rinehart & Winston.

Kezar, A. (2000). Higher education research at the millennium: Still trees without fruit? Review of Higher Education, 23, 443–468.

Krathwohl, D. R. (1988). How to prepare a research proposal: Guidelines for funding and dissertations in the social and behavioral sciences (3rd ed.). Syracuse, NY: Syracuse University Press.

Lawrence-Lightfoot, S. (1995). I’ve known rivers: Lives of loss and liberation. New York: Penguin Books.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Merton, R. K. (1973). The normative structure of science. In N. W. Storer (Ed.), The sociology of science (pp. 267–278). Chicago: University of Chicago Press. (Original work published 1942)

Milgram, S. (1974). Obedience to authority. New York: Harper & Row.

Mitroff, I. I., & Kilmann, R. H. (1978). Methodological approaches to social science. San Francisco: Jossey-Bass.

Morgan, G. (1997). Images of organization (2nd ed.). Thousand Oaks, CA: Sage.

Myers-Briggs, I. (1962). Manual for the Myers-Briggs Type Indicator. Princeton, NJ: Educational Testing Service.

National Research Council. (2002). Scientific research in education (R. J. Shavelson & L. Towne, Eds.). Committee on Scientific Principles for Education Research. Washington, DC: National Academies Press.

Neill, A., & Ridley, A. (Eds.). (1995). The philosophy of art. New York: McGraw-Hill.

Olson, D. R. (2004). The triumph of hope over experience in the search for “what works”: A response to Slavin. Educational Researcher, 33(1), 24–26.

Parnes, S. J. (1992). Source book for creative problem solving. Buffalo, NY: Creative Foundation Press.

Parsons, T., & Platt, G. M. (1973). The American university. Cambridge, MA: Harvard University Press.

Schwartzman, H. (1978). Transformations: The anthropology of children’s play. New York: Plenum.

Schwartzman, H. (1993). Ethnography in organizations (Qualitative Research Methods Series, No. 27). Newbury Park, CA: Sage.

Shils, E. (1984). The academic ethic. Chicago: University of Chicago Press.

Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21.

St. Pierre, E. A. (2002). “Science” rejects postmodernism. Educational Researcher, 31(8), 25–27.

Thorngate, W. (1976). In general vs. it all depends: Some comments on the Gergen–Schlenker debate. Academy of Management Journal, 25, 185–192.

Weick, K. (1979). The social psychology of organizing (2nd ed.). Reading, MA: Addison-Wesley.