
Journal of Science Education and Technology, Vol. 7, No. 3, 1998

Secondary Students' Dynamic Modeling Processes: Analyzing, Reasoning About, Synthesizing, and Testing Models of Stream Ecosystems

Steven J. Stratford,1,4 Joseph Krajcik,2 and Elliot Soloway3

In this paper, we explore dynamic modeling as an opportunity for students to think about the science content they are learning. We examined the "Cognitive Strategies for Modeling" (CSMs) in which students engaged as they created dynamic models. We audio- and videotape-recorded eight pairs of ninth grade science students and analyzed their conversations and actions. In analyzing appropriate objects and factors for their model, some students merely enumerated potential factors whereas others engaged in rich, substantial, mindful analysis. In reasoning about their models, students discussed relationships in depth, concentrated only on the most important key relationships, or encountered difficulty distinguishing between causal and correlational relationships. In synthesizing working models, students mapped their model to aid visualization, focused on their goal, or talked about their model's appearance or form. Students attempted to articulate explanations for their relationships, but sometimes their explanations were shallow. In testing their models, some students tested thoroughly but only a few persisted in debugging their model's behavior so that it matched their expectations. In our conclusion we suggest that creating dynamic models has great potential for use in classrooms to engage students in thought about science content, particularly in those thinking strategies best fostered by dynamic modeling: analysis, relational reasoning, synthesis, testing and debugging, and making explanations.

INTRODUCTION AND OBJECTIVES

How often do students in our secondary science classrooms really have the opportunity to think about the content they are supposed to learn? David Perkins and others have suggested that learning is a consequence of thinking, and that understanding that goes beyond the information given (Bruner, 1973) comes about through using knowledge in performances of understanding (Perkins, 1992; Gardner, 1991). But how can educators provide students with opportunities to reflect upon science content? And what might processes of thinking about science content look like? These are questions that have driven our research into dynamic modeling in science classes. The research reported here, focusing on students' dynamic modeling processes ("Cognitive Strategies for Modeling"), is part of a larger study of dynamic modeling in secondary science classrooms (Stratford, 1996b; Stratford, Krajcik, and Soloway, 1997), in which we have investigated processes of students' dynamic modeling efforts, the products of those efforts, and relationships between process and product. In this paper we focus on the students' modeling processes, presenting results of a study of the cognitive strategies in which ninth-grade science students engaged as they used a learner-centered dynamic modeling tool (called Model-It) to make original models based upon stream ecosystem scenarios.

KEY WORDS: Dynamic modeling; cognitive processes; simulation; Model-It.

1 Maranatha Baptist Bible College, 703 Western Meadows Dr., Watertown, WI 53098. e-mail: [email protected]

2 School of Education Building, University of Michigan, 610 E. University, Ann Arbor, Michigan 48109. e-mail: [email protected]

3 Advanced Technologies Laboratory, University of Michigan, 1101 Beal Ave., Ann Arbor, Michigan 48109-2110. e-mail: [email protected]

1059-0145/98/0900-0215$15.00/0 © 1998 Plenum Publishing Corporation

PROBLEM AND RATIONALE

Systems thinking was developed many years ago (Forrester, 1968) and has often been promoted since that time as a way of thinking about and understanding complex systems. Several computer tools were developed to support systems thinking through the creation of dynamic models (specifically, Dynamo and STELLA). Attempts have been made to introduce systems thinking and dynamic modeling to middle and secondary school students (Roberts, Andersen, Deal, Garet, and Shaffer, 1983), but research is scarce. We do know that it is difficult and time-consuming to engage all students in dynamic modeling (Roberts and Barclay, 1988; Mandinach and Cline, 1994; Schecker, 1993). However, with microcomputers becoming more readily available to science students, with the increasing processing power of those computers (Soloway and Pryor, 1996), and with developments in theory and implementation of learner-centered software (Jackson, Stratford, Krajcik, and Soloway, 1996), the opportunity exists to try again.

Why is the exploration of dynamic modeling in science classes significant? First, many science curriculum topics dealing with systems, such as ecology, weather, climatology, and biology, may be enhanced by creating, manipulating, and exploring computer models of those systems (Roberts et al., 1983). Other curricula with systems-related content, such as history or economics, may also be enhanced with computer models. Second, creating dynamic models should engage students in combining isolated, fragmented, inert knowledge about poorly understood concepts and relationships into larger, more clearly understood constructs by allowing them to represent, reconstruct, and explore that knowledge within a computer model. Third, creating models should provide students with opportunities to think about and discuss scientific phenomena: breaking them down into pieces, considering how (and why) those pieces are related, incorporating those pieces into computer models, and verifying those models by comparing their behavior to reality (Stratford, Krajcik, and Soloway, 1996). Finally, creating models may allow students to come face-to-face with fundamental issues of scientific models such as their accuracy, limitations, and usefulness (Gilbert, 1991; Stratford, 1996a). All of these reasons suggest that constructing models, and in particular, constructing dynamic models, may help students better understand the science content we want them to learn.

Our exploration of dynamic modeling has focused on the cognitive strategies in which students engage as they create dynamic models, strategies we have called "Cognitive Strategies for Modeling." These strategies include analyzing, relational reasoning, synthesizing, testing/debugging, and explaining. In the process of creating a model, one would expect to see someone analyzing the phenomenon being modeled, breaking it down conceptually into relevant, related parts. Relationships (usually causal) between those parts have to be reasoned out, identified, and clearly defined. As the model is then created with the computer, those parts, and the relationships between them, are synthesized conceptually back together into a computerized representation of the phenomenon. To verify that the model works as intended, and that its behavior matches that of the phenomenon, the model should be thoroughly tested and debugged. And throughout the process of building and testing a model, explanations for why parts are related (that is, explanations for the mechanisms underlying the causal relationship between two parts) certainly exist in the mind of the builder(s); otherwise the model would be totally random. Such explanations may be articulated orally or documented in writing.
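The cycle just described (analyze the phenomenon into parts, reason out causal relationships, synthesize them into a runnable whole, then test and debug) can be sketched in code. The sketch below is our own illustration, not Model-It's implementation; the Factor and Model classes, the clamped linear update rule, and the factor names are invented assumptions.

```python
# Illustrative sketch of the analyze / reason / synthesize / test cycle.
# NOT Model-It's implementation: class names, factor names, and the
# clamped linear update rule are our own assumptions.

class Factor:
    """A named quantity with a current value clamped to a [lo, hi] range."""
    def __init__(self, name, value, lo=0.0, hi=100.0):
        self.name, self.value, self.lo, self.hi = name, value, lo, hi

class Model:
    def __init__(self):
        self.factors = {}        # analyzing: the parts of the phenomenon
        self.relationships = []  # reasoning: (cause, effect, sign) triples

    def add_factor(self, name, value):
        self.factors[name] = Factor(name, value)

    def relate(self, cause, effect, sign):
        # synthesizing: wire the parts together into one runnable whole
        self.relationships.append((cause, effect, sign))

    def step(self):
        # testing: advance the model one tick and observe its behavior
        for cause, effect, sign in self.relationships:
            c, e = self.factors[cause], self.factors[effect]
            e.value = min(e.hi, max(e.lo, e.value + sign * 0.1 * c.value))

m = Model()
m.add_factor("rainfall", 50.0)
m.add_factor("runoff", 0.0)
m.relate("rainfall", "runoff", +1)  # "as rainfall increases, runoff increases"
for _ in range(5):
    m.step()
# debugging check: runoff should respond to rainfall
assert m.factors["runoff"].value > 0.0
```

Running the model and comparing its behavior against an expectation, as in the final assertion, is the testing-and-debugging step in miniature.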

RESEARCH QUESTIONS

The research questions of this study, then, were as follows:

• In what Cognitive Strategies for Modeling (analyzing, reasoning, synthesizing, testing/debugging, and explaining) do ninth-grade science students engage as they create dynamic models of stream ecosystem phenomena?

• What are the characteristics and qualities of the Cognitive Strategies for Modeling in which they engaged?


SETTING

The participants in our study were 16 ninth graders, enrolled in a public school in a midwestern college town. They were chosen by their 3 science teachers to possess a range of characteristics (ability, gender, and race) roughly representative of the group of 100 ninth grade students from which they were drawn. They were also selected for being likely to cooperate with data collection procedures (daily videotape recordings and pre-/post-interviews), being able to work well with others, and being relatively "talkative" (to ensure rich videotape and interview data). The ninth grade science class in which they were enrolled was taught following a curriculum called Foundations of Science (Heubel-Drake et al., 1995), with the goal of engaging students in long-term inquiry into non-trivial driving questions. In the three months prior to the research activities reported in this paper, students investigated an authentic, meaningful question: "Is our water safe?" It was authentic because they used a local creek for their investigatory activities, and it was meaningful because the water flowing in that creek was part of the watershed from which their town's drinking water was obtained. The class and curriculum were enhanced by ubiquitous computer technologies (portable computers, networks, and printers in the laboratory, digital data collection and display devices, off-the-shelf productivity software, and custom-designed and -programmed research software). Their investigations included chemical assessments (collecting and testing water samples), conducting biological and physical habitat surveys and assessments, and reporting their results to peers, to their teachers, and to the community. Most of their project work was done in groups using computers, so they were used to working with others and were reasonably proficient with computer operations. They did not, however, have any formal classroom instruction about models in science or about dynamic modeling.

METHOD, CATEGORIES, AND ANALYSIS PROCEDURES

Method

Model-It, described in Appendix A and elsewhere (Jackson et al., 1996; Stratford, 1996b), provided the dynamic modeling computer environment. Eight pairs of students (8 male, 8 female; 3 African American, 1 Asian, 4 Caucasian; 2 mixed-gender pairs; 3 male and 3 female same-gender pairs) were chosen as focus groups whose conversations and actions on the computer were videotaped throughout the study. Participants, as part of their classroom activities, used a written guide along with Model-It on the computer for 6 to 8 daily 50-minute class periods. [Note that a range is given here because some students finished working through the guide more quickly than others.] The purpose of the guide was to help them learn how to use the software to make dynamic models, and it was written in such a way as to require mindful, directed activity with the software. Then, during the following 2 or 3 class periods, they created models based upon their choice of one of five stream ecosystem scenarios (or a model of their own choice). For example, one scenario suggested a model of cultural eutrophication: "When excess phosphorus from human sources is added to a stream (cultural eutrophication), algae blooms can result. Build a model that includes algae and bacteria population objects, along with stream factors such as dissolved oxygen and total phosphorus. Also include a possible source of the phosphorus in your model." In like manner, each scenario briefly described an ecological phenomenon (e.g., cultural eutrophication) and suggested a couple of objects or factors to help them get started in their analysis. Their teachers communicated an expectation to the students that they should attempt to enter explanations and descriptions into the appropriate explanation and description boxes provided in the software, for all of the objects, factors, and relationships they included in their model. The videotape from the independent modeling sessions (about 11 hours of total footage) comprised the data for our study.
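To make the eutrophication scenario concrete, the following toy simulation sketches the causal chain a student might build: excess phosphorus fuels algae growth, decaying algae feed bacteria, and bacterial respiration depletes dissolved oxygen. This is our own hedged illustration; the coefficients, initial values, and 30-day horizon are invented and are not taken from Model-It or from any student's model.

```python
# Hypothetical toy rendering of the eutrophication scenario's causal chain.
# All coefficients and initial values below are invented for illustration.

phosphorus = 80.0   # excess phosphorus from a human source (held constant)
algae = 10.0
bacteria = 5.0
oxygen = 90.0       # dissolved oxygen in the stream

for day in range(30):
    algae += 0.05 * phosphorus   # phosphorus fuels algae growth (the bloom)
    bacteria += 0.02 * algae     # decaying algae feed bacteria
    oxygen -= 0.03 * bacteria    # bacterial respiration consumes oxygen
    oxygen = max(oxygen, 0.0)    # dissolved oxygen cannot go negative

# A student testing this model would look for an algae bloom and a drop in
# dissolved oxygen, then compare that behavior against the real stream.
print(round(algae, 1), round(bacteria, 1), round(oxygen, 1))
```

The point of such a sketch is the qualitative shape of the behavior (bloom, then oxygen decline), which is what the scenario asks students to capture.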

Categories of Analysis

The five Cognitive Strategies for Modeling we have associated with dynamic modeling are analyzing, relational reasoning, synthesizing, testing/debugging, and explaining. The modeling-related meanings for each of these categories are found in Table I, along with examples of the kinds of behaviors we took as evidence for those strategies. Here we briefly discuss the contents of the table.

Analyzing strategies include statements or actions in which students divide the scenario or phenomenon into parts, identify important components, or in


Table I. Cognitive Strategies for Modeling (CSMs): Categories, Criteria, and Examples

Analyzing
Definition: identifying factors or objects; creating factors/objects in the Factor Factory/Object Editor; making judgments (comparing and contrasting); interpreting the model's behavior when testing; drawing conclusions; critiquing what works
Criteria and examples: Students talk about factors or objects and discuss which are relevant (or not relevant) to their scenario; they discuss relevant minimum, maximum, or initial values. Students create factors or objects using the Factor Factory or Object Editor. Students talk about how things are alike or different (e.g., discussing whether a relationship should be "about the same" or "more and more"); students compare their model to the real world. Students discuss what the behavior they're observing means, either in terms of the factors and relationships they created, or in terms of how the real world works. Students discuss an issue and come to some conclusion. Students make comments like "it's not working" or "it's working" and talk about what they think is right or wrong with it; they make reference to whether their model is accurate or realistic.

Relational Reasoning
Definition: creating relationships in the Relationship Maker; making cause and effect statements; discussing or selecting relationships; predicting what should happen
Criteria and examples: Students create relationships using the Relationship Maker (immediate or rate, text or table view). Students say "this affects that" or "this makes that go up or down"; they use words like "increases," "decreases," "a causes b," "makes more," "makes less." Students select which relationships they should include in their model; they discuss possible relationships to include (or exclude); they discuss whether a relationship should be immediate or rate. Students say things like "it should do ... when we run it" or "it's going to ..."; or, they say "it didn't do what I thought it would."

Synthesizing
Definition: deciding how the model should work as a whole; discussing or commenting on the model's representation in the Factor Map; discussing or commenting on the Concept Map representation; connecting ideas
Criteria and examples: Students discuss the model as a whole (e.g., "our model shows how weather affects stream depth"); they remind themselves about what their model is supposed to do. Students look at their Factor Map and discuss the overall shape (e.g., "look, these make a long chain," or, "in our model the main factor is ...") or configuration of relationships (e.g., "look, this factor depends on everything else"); the same applies in reference to the Concept Map. Students discover or think of relationships between factors that they hadn't considered before (e.g., "I wonder if there's a relationship between ...").

Testing and Debugging
Definition: testing the model; trying possible solutions
Criteria and examples: Students run their model with meters and/or graphs after they have constructed one or more relationships. Students are not satisfied with their model's behavior (either they want to improve it or they think something is wrong with it), so they modify something and re-test it.

Explaining
Definition: explaining why or how parts are related (causally or correlationally); giving examples; stating evidence; justifying an argument; elaborating or demonstrating ideas; describing what was observed
Criteria and examples: Students talk about how or why a relationship works using words like "because" (e.g., "a affects b because c"). Students give examples with their explanations (e.g., "things in the weather such as clouds, rain, and wind affect the stream"). Students refer to data, experience, or common sense to support an explanation. Students make a logical argument to support an idea or explanation. Students restate an idea or demonstrate it on the computer. Students observe something and describe what they saw (e.g., "It's like ..." or "I saw it do ...").


which they attempt to make sense of or pass judgment on their model's behavior. Analyzing strategies, then, are statements or actions such as: identifying factors or objects, creating Factors or Objects in Model-It, making judgments about the difference between parts, interpreting the model's behavior, drawing conclusions about the model, or critiquing what works and what doesn't.

Relational reasoning strategies consist of statements or actions related to reasoning about the relationships between parts of the scenario or phenomenon, reasoning about the relationships between factors or objects in the model, or making a reasoned prediction about the behavior of a model. Reasoning strategies include statements or actions such as: creating relationships with the Relationship Maker, making cause and effect statements, discussing or selecting relationships, and predicting what should happen when the model runs.
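Table I notes that a relationship in the Relationship Maker can be "immediate or rate." As we understand the distinction, an immediate relationship makes the effect a direct function of the cause's current value, while a rate relationship changes the effect gradually so that it accumulates over time. The sketch below is our own simplification of that idea, not Model-It's actual semantics; the 0.8 and 0.1 coefficients are arbitrary.

```python
# Our simplification of the "immediate or rate" choice noted in Table I.
# The real Relationship Maker defines relationships qualitatively; these
# hard-coded formulas and coefficients are illustrative assumptions only.

def immediate(cause_value):
    """Immediate: the effect is a direct function of the cause's current
    value, recomputed every time step (e.g., rainfall driving runoff)."""
    return 0.8 * cause_value

def rate(effect_value, cause_value, dt=1.0):
    """Rate: the cause nudges the effect each time step, so the effect
    accumulates over time (e.g., a growth rate acting on a population)."""
    return effect_value + 0.1 * cause_value * dt

runoff = immediate(50.0)        # tracks the cause directly
algae = 10.0
for _ in range(3):
    algae = rate(algae, 50.0)   # accumulates step by step
```

The distinction matters when students reason about populations: a population usually needs a rate relationship, because its size depends on its own history, not just on the current value of some cause.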

Synthesizing strategies are statements or actions related to viewing the content, behavior, or form of a model as a whole, or to making connections between previously unconnected ideas. This includes the following strategies: deciding how the model should work as a whole, discussing or commenting on the model's representation in the Factor Map or in a concept map, and making connections between ideas (e.g., realizing that factors are related).

Testing and debugging encompasses strategies related to verifying that a model works, or to figuring out why it doesn't work. It includes the strategies of testing the model using Model-It's built-in testing facilities, and of changing existing factors and relationships through additions, deletions, or modifications.

Finally, explaining strategies are associated with talking or writing about why a relationship exists, that is, about the reason(s) why one factor causes changes in another. So explaining strategies are those that involve telling why or how parts of a phenomenon are related (or typing explanations into an explanation box in Model-It), illustrating statements with examples, stating supportive evidence or justifying an argument logically, elaborating on or demonstrating ideas, or giving witness to something they have personally experienced or observed.

Analysis Procedures

The goal of the data analysis was to create narratives capturing the characteristics and quality of each focus group's Cognitive Strategies for Modeling. To that end, the analysis proceeded through several stages. First, in the descriptive stage, transcriptions of conversations were annotated to include non-verbal interactions with the computer and then divided into episodes according to shifts in modeling activity (such as when shifting from creating a relationship to testing the model). Then, for each episode a descriptive account was composed to summarize the main happenings in that episode. Next, we iteratively analyzed each descriptive account for instances in which students engaged in Cognitive Strategies for Modeling, a process that involved writing, refining, and categorizing narratives, discussing analyses with colleagues, and producing summaries. In each narrative, the episode's happenings were interpreted in terms of the modeling strategies in which the students were apparently engaging. The goal of the final, synthetic phase was to identify patterns in the entire analysis, in order to compose a story about the strategies in which each pair of students engaged as they constructed their model. The stories were illustrated with examples from students' modeling sessions, drawing upon our episode descriptions, our analytic narratives, and transcript data.

In order to help exemplify our analysis procedure, in Appendix B we have provided a short sample transcript episode and examples of how it was analyzed.

RESULTS

The results of our analysis for each of the 8 groups of students are summarized in Table II and discussed below. Because of length considerations, we will expand on the results for only 3 of the groups.

We will elaborate on the results from Cory and Dan, Cathy and Connie, and Nicole and Mark, because we feel these cases will provide the reader with a flavor for the range of Cognitive Strategies for Modeling in which the pairs of students engaged. Cory and Dan's Cognitive Strategies for Modeling represent a moderate quality; Cathy and Connie represent a high level of quality in their strategies; and Nicole and Mark represent mixed levels of high and some low quality Cognitive Strategies for Modeling.


Table II. Results of Analysis of Cognitive Strategies for Modeling (Pairs Treated in Detail in this Paper are Bold)

Cory and Dan
Analyzing: Analysis accompanied by shallow causal reasoning
Relational reasoning: Carefully analyzed and reasoned about certain key causal relationships
Synthesizing: Used the Factor Map as a catalyst for synthesis
Testing/debugging: A few instances
Explaining: No written, a few oral causal explanations during data collection

Cathy and Connie
Analyzing: Carefully analyzed factors and relationships for realism
Relational reasoning: Reasoned about how factors should be related to each other
Synthesizing: Commented on or discussed the big-picture synthesis of their model
Testing/debugging: Engaged in mindful testing and questioning
Explaining: Articulated explanations to justify decisions and help each other understand what they were observing

Rachel and Sam
Analyzing: Engaged in analysis while referring to a science content source
Relational reasoning: Engaged in correlational and causal reasoning
Synthesizing: Referred to the Factor Map to provide a synthetic overview of the model
Testing/debugging: Engaged in unsatisfactory (to them) testing
Explaining: Formulated non-causal and causal explanations

Denise and Mary
Analyzing: Received help on scenario analysis
Relational reasoning: Received help on causal reasoning
Synthesizing: Tested the model as a synthesized whole
Testing/debugging: Tested the model as a synthesized whole
Explaining: Made both causal and non-causal explanations

Phil and Gary
Analyzing: No verbal analysis or discussion
Relational reasoning: Some
Synthesizing: Model never completely synthesized
Testing/debugging: Only one (limited) instance of testing
Explaining: Mostly reiterative explanations
Comment: Solo cognitive modeling strategies

Nicole and Mark
Analyzing: Analyzed scenarios in depth
Relational reasoning: Discussed every causal relationship while creating it
Synthesizing: Evaluated proposed factors and relationships against the goal
Testing/debugging: None
Explaining: Supported arguments with causal and correlational explanations

George and Carl
Analyzing: Analysis focused on modeling techniques rather than content
Relational reasoning: Confusion about causality resolved by software capability
Synthesizing: Just-in-time completion and final testing for verification
Testing/debugging: Reckless relationship-building early necessitated extensive testing
Explaining: Some explanations
Comment: Early mutual work, later solo work

Nancy and Andrea
Analyzing: Minimal scenario analysis as they created their concept map
Relational reasoning: Failed to causally analyze relationships
Synthesizing: Synthesized a concept map, not a model
Testing/debugging: None
Explaining: Made reiterative and factual rather than causal explanations

Cory and Dan

Cory and Dan were two students of average ability and achievement. They created a model of the impact of urban runoff containing human and animal waste on stream quality. Their analysis of the scenario was accompanied by some rather shallow causal reasoning, though they did carefully reason about several key causal relationships in their model. They used the Factor Map as a catalyst for synthesizing their model. They only tested their model a few times. They wrote no explanations and articulated only a few oral ones while they created their model.

Shallow Causal Reasoning. Cory and Dan engaged in several rounds of analysis before coming to a decision about which scenario to model, during which they considered, in rapid succession, the chemical test, the macroinvertebrate/water quality, and the food chain scenarios. They settled on modeling the effect of rainfall runoff on a stream ecosystem. They sketched out a rough model that included components from several scenarios, but apparently drawn primarily from the rainfall runoff scenario. However, their first day's conversation did not include any indication that they were considering causality at all; not until the very end of the first modeling session, when they started testing their model, did they begin to talk about how or why rainfall causes changes in runoff.

Factor Map Used as a Tool for Synthesis and Analysis. They asked a classroom helper why their model wasn't working; actually, they had forgotten how to open a meter to test it. Once they tested their model, they subsequently opened the Factor Map and generated numerous ideas for extending their model, such as: gravity affects rainfall, salt affects total solids, dog and geese feces affect fecal coliform in the runoff, and so on. Viewing the Factor Map apparently helped them think of many ideas for their model, ideas they subsequently discussed and some of which they implemented.

Carefully Reasoned About Key Relationships. The relationships between runoff, parks, animal feces, fecal coliform, and water quality formed the core of their model, and there were several occasions on which they explored those relationships in depth. For example, in one episode midway through their modeling sessions, they tested their model so far and found that rainfall always equaled runoff; Dan argued that runoff should be less, "cause if it rains not all of that will go into the stream." They subsequently changed the relationship to more closely reflect Dan's understanding of the relationship. In the brainstorming session mentioned above, they articulated several causal explanations, including: road salt enters the stream through runoff, making the water quality go down; and geese and dog feces are washed into the stream by runoff, making water quality decrease.

In another episode, Cory tried to create a relationship between animals and fecal coliform. Dan argued that "it's not just 'animals.' It doesn't matter how many animals there are. ... It's about the animal waste." They proceeded to create an "animal waste" factor and a relationship between "animal waste" and "fecal coliform." Although they were not engaged in a deep level of causal reasoning, still it was evident that they were carefully analyzing their factors and attempting to link them in logically causal ways.

A Few Instances of Testing. There were only a few instances of testing: one near the end of the first session, in which they forgot about using meters, and one near the beginning of the second session, in which they asked for help and then went on to the Factor Map to brainstorm and create more relationships. Most of their testing sessions seemed satisfactory to them, or helped them analyze what else they needed to do to their model. However, their testing was inadequate because it did not reveal several conceptual flaws in their final model.

No Written, Few Oral Causal Explanations. They did not type in any explanations during either of their modeling sessions. Neither did they express many oral explanations about the relationships they were modeling, particularly on the first day. They created numerous relationships without articulating any explanations about how or why the relationships worked the way they did. It wasn't until the second day that they made any oral explanations, and even then there weren't very many.

In summary, Cory and Dan represent a moderate quality of modeling strategies. They engaged in quite a bit of scenario analysis, but considerations of causality and underlying explanations were not very evident. The Factor Map seemed to help them view their model as a whole and to energize their analysis. Their testing seemed satisfactory to them, but they actually only tested superficially. Most importantly, however, they spent time reasoning about several key relationships, and took care to express them in reasonable and logical ways in their model.

Cathy and Connie

Cathy and Connie created a model of cultural eutrophication and algae blooms. Cathy was a high achiever, and Connie was average. Together, they carefully analyzed factors and relationships for realism; they reasoned about how factors were related to each other; they engaged in mindful testing and questioning; they commented on and discussed the big-picture synthesis of their model; and they articulated oral and written explanations to help each other understand what they were observing.

Carefully Analyzed Factors and Relationships for Realism. Cathy and Connie repeatedly discussed their scenario in order to select appropriate factors and relationships, all the while considering whether they were being realistic. They talked about whether "phosphates" and "nitrates" should be factors or objects; whether they should include either or both "bacteria" and "algae" as population objects; and how to define a "fertilizer" factor in a realistic fashion.

Cathy: "there's no real measure for fertilizer runoff, so I guess I'll just leave it as 0 to 100 [the default] for now. 100 gallons of fertilizer, that would be spectacular," among other things.

In later episodes, they often expressed concern for realism. For example, they had some trouble getting their algae population to work the way they thought it should. In order to make the algae rate of growth high enough to make the population of algae grow, they had to raise rainfall (which increased fertilizer runoff into the stream) to what they felt was an unrealistically high level. Again, near the end of their modeling sessions, when they discovered that their model showed a level of 0 dissolved oxygen in the stream while there were still organisms living, Cathy observed that that was "totally unrealistic." The idea that their model should be realistic was apparently never far from their thoughts.

Careful Reasoning About Relationships Between Factors. In many episodes, Cathy and Connie discussed how certain factors were related to each other, as they created relationships between them. For example, in the process of creating relationships between dissolved oxygen and organisms in the stream, Cathy realized she didn't understand how dissolved oxygen might affect different organisms differently, so she discussed it with Connie and with her teacher until she was satisfied that she understood. In a later episode, they discussed relationships between algae and bacteria, but Connie was confused about how to connect them. Cathy carefully explained how algae should affect bacteria: "No, but see the algae count makes bacteria grow faster. See, that's the thing: rate of growth is always going to affect bacteria count. How fast they are growing is what makes the population higher." Her explanation showed that she understood not only the relationship between algae and bacteria (larger quantities of algae will lead to increased rates of growth of bacteria), but also how to model the relationship in the computer.

Discussing the Big-Picture Synthesis of Their Model. Each of the students made comments indicative of a "big-picture" view of their model. For example, early in their modeling session, when they were working on their concept map, Connie mused out loud about what their "main" factor might be, but Cathy said they didn't have a main factor, and proceeded to read from the map: "fertilizer runoff affects total phosphates and nitrates which increase algae and plants which decrease DO (dissolved oxygen) which kill everything." (Note that some of her later comments reveal that she knew it was not dissolved oxygen that kills living creatures, but the lack of it.) Connie understood what Cathy said, because later she commented, "We don't really have a web, we have a chain."

Mindful Testing and Questioning. When Cathy and Connie encountered situations in which their model didn't behave as they expected it to behave, they treated these situations as problems to be solved. Early on they encountered a problem in the level of nitrates in their model: something was causing it to drop to nearly zero. They ran numerous tests, modified the model several times, and asked for assistance from a classroom helper before finally discovering the source of the problem and fixing it. In another situation their algae population wasn't growing the way they thought it should grow. Cathy suggested that they remove a certain relationship and replace it with another. That didn't work, and they hypothesized that they needed to modify one relationship to make it somewhat stronger. They tried this solution, tested their model, and were finally satisfied with its behavior. At no time did they encounter unexpected or perplexing model behavior without trying to understand, explain, and correct it if necessary.

Articulating Helpful Explanations. Cathy and Connie justified and explained their decisions, plans, and observations to each other on numerous occasions. For example, to explain the decision to set the initial value of their nitrates factor at 0.1, Cathy explained, "It's where unpolluted streams are usually at. You see 0.2 is going to be quite a bit more, I mean, if it's only found in really small levels." Another time, Connie argued for an "increases more and more" relationship between rainfall and runoff, explaining, "If there's more rain, the rain would build up and then wash it all away ... it would only do a little bit at first." They both consistently attempted to help each other understand what they were thinking.

In summary, Cathy and Connie engaged in high quality modeling strategies. Their analysis was thorough, their reasoning was causal, their synthesis was driven by a mental picture of the model as a whole, their explanations were supportive and valuable, and they tested and debugged their model until it worked to their satisfaction.

Nicole and Mark

Nicole and Mark chose to design a model on their own to show how weather factors can affect the depth and temperature of a stream. Both students were of average achievement and ability. They substantially analyzed their scenario, and discussed (sometimes argued about) causal relationships as they created them, supporting their arguments with explanations (both causal and correlational); and they evaluated proposed factors and relationships against their overall main goal.

Analyzed Scenarios in Depth. Nicole and Mark began their modeling session by analyzing several of the possible scenarios in great depth. In turn, they considered the water quality test scenario, the macroinvertebrate indicator of water quality scenario, and the fertilizer runoff scenarios before finally deciding on their own to do a weather model. For each scenario, they discussed it as if they were really making a model of it. For example, when they talked about water quality tests, they discussed several possible relationships between riffles, dissolved oxygen, and biochemical oxygen demand; when they considered the macroinvertebrate scenario they looked up the pollution tolerance indices for various taxa. Even after finally settling on a model of how weather affects a stream, they generated, considered, and discarded many, many factors, including dissolved oxygen, phosphates, nitrates, fecal coliform, and even "rubbish."

Discussing Causal Relationships. Every time they created a relationship, Nicole and Mark discussed it. For example, they had a discussion about how cloud cover is related to air temperature: Nicole claimed there was a causal relationship ("If cloud cover increases, there will be less sunlight") but Mark countered with an exception ("Sometimes cloud cover keeps the hot air in"). Similar discussions occurred as they considered relationships between "cloud cover" and "rainfall rate" (Mark: "sometimes when the sky is cloudy, it doesn't rain, it doesn't always rain"), and between "stream temperature" and "water quality." Sometimes their discussions were more like arguments, but they engaged in dialogue about every relationship they put into their model, and many more.

Evaluating Factors and Relationships Against a Goal. Because Nicole and Mark chose to create a model of their own that was not already described in a scenario, they found it necessary to constantly make sure they were making progress toward a final model. During their work together, Mark tended to suggest factors and relationships that weren't directly related to weather; Nicole often had to remind him that they were doing a weather model, and persuade him why his idea didn't fit into the model they were constructing. For example, in an early episode, Mark suggested that they include a "trash" factor, and even went so far as to actually create a factor for it. However, eventually they decided to discard it from their model, because, as Nicole put it, "What does it [trash] have to do with the weather, Mark?" Nicole asked a similar question when Mark wanted to create a "ducks" factor; and when he wanted to include acid rain, Nicole said, "I don't think we should use acid rain because it doesn't have to do with anything we are doing . . . . We are trying to see how weather affects temperature and the depth of the stream." Nicole's comments helped keep them on track and ensure a coherently synthesized model.

Arguments Supported with Both Causal and Correlational Explanations. Nicole and Mark supported their (numerous) arguments with a mixture of causal and correlational explanations. For example, Nicole argued (correlationally) that increased cloud cover would decrease air temperature because there was less sunlight. A similar correlational explanation was presented for the relationship between cloud cover and rainfall rate (more clouds make more rain). Mark explained somewhat correlationally later on that lower temperatures were good for water quality because macroinvertebrates prefer cooler temperatures; he explained causally that deeper streams have lower temperatures because "on deep stuff, the sunlight is hitting the top of the water and it doesn't always make it down to the bottom." This explanation translated into the following in the explanation box: "The surface is warmer because the sun can reach it better, and if the stream is deeper, the bottom will be cooler." Thus, their explanations were a mixture of both causal and correlational statements.



In summary, Nicole and Mark engaged in some Cognitive Strategies for Modeling at a high quality, and in some at a lower quality. They analyzed their scenario in depth, both in terms of relevant factors and in terms of how those factors were related. They remained focused on their final goal and produced a coherently synthesized model. However, though they articulated a substantial number of explanations, they did not distinguish between causal explanations and correlational explanations. Also, they did not engage in any model testing and debugging at all.

Summary of Findings

This is a summary of findings from all eight pairs of students. Most pairs of students engaged in an analysis of appropriate objects and factors for their model. Some of their analyzing strategies were limited to identifying and creating factors whereas others' analysis strategies were richer, more substantial, and more mindful. Most engaged in relational reasoning about their factors, though again a range was evident: some discussed every relationship in depth, some concentrated on only the most important key relationships, and a few were either unaware of or confused about the difference between causal and correlational relationships. Most were able to synthesize a working model, employing a range of strategies: they used the Factor Map as a tool to aid their visualization, they focused on their goal, or they found ways to talk about their model's appearance or form. Similarly, most attempted to articulate explanations for their relationships, but sometimes explanations were shallow or even non-existent. Most tested their model, though some tested their model much more substantially and thoroughly than others. Only a few persisted in their debugging to fine-tune their model's behavior to match their expectations.

DISCUSSION AND IMPLICATIONS

These findings indicate that creating dynamic models has great potential for use in classrooms to engage students in thought about the science content they are supposed to learn, particularly in those thinking strategies best fostered by dynamic modeling: analyzing, relational reasoning, synthesizing, testing and debugging, and explaining. This work builds upon other related work (Jackson et al., 1996; Mandinach, 1989; Miller et al., 1993; Roberts, 1981), expanding the base of dynamic modeling research at the secondary level (Stratford, 1997), providing a closer look at thinking strategies employed by students as they create dynamic models, and informing software design and classroom instruction.

The STELLA research (Mandinach and Cline, 1994) explored how students may benefit from systems thinking and from creating models of complex systems, but not the cognitive strategies in which students engage. Miller and colleagues (1993) looked at just one type of cognitive strategy, reporting that students engaged in sophisticated causal reasoning as they created models with IQON. Our research extends this prior research in dynamic modeling by investigating and reporting on a much wider range of cognitive strategies. In this study, it was found that most students engaged in some or all of the Cognitive Strategies for Modeling at some time during their modeling sessions. Most engaged in analyzing their scenario, and were able to select and create appropriate factors. Most engaged in relational reasoning as they created relationships between factors. Most were able to synthesize a working model. Most tested their model, some to a lesser, others to a greater degree. Most groups engaged in explaining their relationships, though the depth of their explanations was sometimes rather shallow and sometimes more correlational than causal.

Thus, students do engage in Cognitive Strategies for Modeling when they create dynamic models. This supports our idea that dynamic modeling can be a performance for understanding (Perkins, 1992) for students in science classrooms, engaging them in analyzing, reasoning about, and synthesizing the content they are learning. Engaging in such activities allowed students to "go beyond the information given" (Bruner, 1973) in the scenarios, building upon what they knew to produce a synthesis of that knowledge in the form of a model. Thus these results also show that, using Model-It, students engaged in the same general kinds of cognitive activities that systems thinking and system dynamics modeling are claimed to support (Forrester, 1978; High Performance Systems, 1992), though not in as rigorous a form.

Model-It's design provides strong support for dynamic modeling, structuring the task into easily understood subtasks. However, additional support is necessary for all students to progress beyond making somewhat superficial relationship connections (that are often based on correlations) toward creating and



articulating causal explanations for every relationship. In addition, since testing is critical for verifying a model's behavior, Model-It needs to support not only the act of testing but also its necessity. Further design and implementation of these revisions are being made: the most recent version of Model-It (now renamed "Theory Builder" and in the process of being field-tested) includes, for example, scaffolds to support more in-depth analysis and to support testing and debugging.

In conclusion, this research demonstrates that creating dynamic models is a classroom activity that fosters students' engagement in higher-level thinking performances such as analyzing, reasoning, synthesizing, testing/debugging, and explaining. Constructing dynamic models provides opportunities for them to think about, use, and reflect upon the science content knowledge gained during classroom instruction and investigations.

ACKNOWLEDGMENTS

This research has been supported by the National Science Foundation (RED 9353481) and the Rackham School of Graduate Studies at the University of Michigan, Ann Arbor.

APPENDICES

Appendix A: Description of Model-It Software

Model-It is based upon an "ecological modeling" simulation engine (Silvert, 1993) in which variables are paired using functional mathematical relationships; the simulation engine averages the effects of multiple functions to derive resultant output values. Using Model-It, the user creates objects with which he or she associates measurable, variable quantities called factors and then defines relationships between those factors to show the effects of one factor upon another. Relationships can model immediate effects or effects over time. Model-It provides facilities for testing a model and a "Factor Map" for visualizing it as a whole.
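The averaging behavior just described can be made concrete with a short sketch. The following is our own illustration, not Model-It's actual implementation; the function name `step` and the dictionary layout are assumptions made for the example.

```python
def step(factors, relationships):
    """Advance a toy simulation one time step.

    `factors` maps factor names to current values. `relationships` maps an
    affected factor name to a list of functions; each function computes that
    factor's suggested new value from the full factor dictionary. As the
    engine description above says, the effects of multiple functions are
    averaged to derive the resultant output value.
    """
    new_values = dict(factors)  # factors with no incoming effects are unchanged
    for affected, effects in relationships.items():
        suggestions = [f(factors) for f in effects]
        new_values[affected] = sum(suggestions) / len(suggestions)
    return new_values

# Example: two immediate effects on "water quality" are averaged.
factors = {"rainfall": 40.0, "temperature": 20.0, "water quality": 50.0}
relationships = {
    "water quality": [
        lambda f: 100.0 - f["rainfall"],           # more rain, lower quality
        lambda f: 100.0 - 2.0 * f["temperature"],  # warmer, lower quality
    ],
}
print(step(factors, relationships)["water quality"])  # (60 + 60) / 2 = 60.0
```

Each relationship contributes one "vote" for the affected factor's next value, and the engine takes the mean, which is one simple way to combine several functional relationships acting on the same factor.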

Creating Objects. Typically, objects are chosen to correspond with observable features of a system being studied: trees, fish, weather, people, water, golf courses, and so on. Model-It allows the user to associate a graphic, icon, or photograph with each "object," so that each becomes visually associated with what it actually represents. Figure 1 shows a Model-It screen that represents the objects in a sample lake model: a picture of a lake, the cattails representing plants, a fish graphic representing fish, sun/clouds/rain representing weather, and the faucet (somewhat whimsically) representing water runoff into the lake. It is also possible to specify whether an object is an "environment," "individual," or a "population" object;5 in the example model, plants and runoff might be singular, "individual" objects, whereas fish might be created as a "population," giving it special preprogrammed behaviors and relationships.6 Figure 2 shows how the fish population object is created with Model-It's "Object Editor." Note that Model-It does not place any constraints upon the selection of objects or choice of domain: it is content-free, entirely directed by the preferences and choices of whoever is creating the model. The Model-It program is distributed with several sample environment pictures and a selection of graphics that can be used to create objects, particularly in the domain of stream ecosystems.

Fig. 1. The Simulation Window, showing weather, rainfall, fish, and plant objects superimposed on a lake object.

Creating Factors. Next, the user selects and creates "factors," each one associated with a specific object. Factors are usually measurable: the temperature of the stream, the speed of the wind, the number of people, the size of the golf course. Factors can also be "calculable" (that is, mathematical constructs), such as the water quality of a stream or the rate of growth of a population. Distinguishing relevant from irrelevant factors is an important part of the analysis of the problem. An object that seems to have only one factor at one level of analysis may in fact be decomposable into more factors at a deeper level of analysis. Figure 3 shows how one factor, pond depth, would be created with the "Factor Factory." If the 'minimum' and 'maximum' values are unknown, Model-It supplies a generic, default range of 0 to 100. A deeper analysis of the pond depth factor itself might engage the

5The only difference between "environment" and "individual" objects is that the "environment" object's graphic occupies the background in the Simulation Window, whereas "individual" objects' graphics overlay the background. Typically a model has only one environment object, but it is not required to have any at all.

6Specifically, Model-It creates three factors and two relationships with each population. One factor is called "count" and keeps track of how many individuals are in the population; one is called "rate of growth" and the third is called "rate of decay." The two "rate" factors are each related to "count" with special rate relationships so that if "rate of growth" is larger than "rate of decay," the population count will grow, and vice versa. Thus the population may be affected by creating relationships that manipulate the rates of "growth" and "decay."
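The population bookkeeping described in footnote 6 can be sketched in a few lines. This is our own illustration of the stated behavior, not Model-It code; the function name and the clamp at zero are assumptions.

```python
def advance_population(count, rate_of_growth, rate_of_decay, steps=1):
    """Per footnote 6: at each time step the growth rate is added to the
    count and the decay rate subtracted, so the population grows exactly
    when rate_of_growth exceeds rate_of_decay (clamped at zero here)."""
    for _ in range(steps):
        count = max(0.0, count + rate_of_growth - rate_of_decay)
    return count

print(advance_population(50.0, 3.0, 1.0, steps=5))  # 50 + 5*(3-1) = 60.0
print(advance_population(50.0, 1.0, 3.0, steps=5))  # declines to 40.0
```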

Fig. 2. Creating a Fish population object with the Object Factory.

Fig. 3. Creating a "pond depth" factor with the Factor Factory.



modeler in a consideration of what values might be more realistic, of what additional information might be needed to determine those values, or of how values might differ between locales. The 'initial value' of a factor may be an actual value, or may simply represent the modeler's choice of how high or low that factor should start when the model is run. The 'units' and 'description' entry fields are fields in which the user may include relevant information in the model, but that information is not used by the simulation engine.
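The fields described above can be summarized in a small sketch. The dataclass below is our own illustrative stand-in for what a factor carries, not Model-It code; the field names merely mirror the Factor Factory entries discussed in the text.

```python
from dataclasses import dataclass

@dataclass
class Factor:
    """Illustrative stand-in for the quantities a Model-It factor carries."""
    object_name: str
    name: str
    minimum: float = 0.0      # Model-It's generic default range is 0 to 100
    maximum: float = 100.0
    initial: float = 0.0      # an actual value, or just how high/low to start
    units: str = ""           # informational only; unused by the engine
    description: str = ""     # informational only; unused by the engine

pond_depth = Factor("pond", "depth", minimum=0.0, maximum=10.0,
                    initial=5.0, units="meters")
print(pond_depth.name, pond_depth.initial)  # depth 5.0
```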

As an aside, factors intended to be used as rates (to be discussed shortly) should be created with relatively small ranges in comparison to the possible values of the variables they are likely to affect. If we assume for a moment that "suspended solids" in the pond might be defined with a range of, say, zero to five hundred pounds of solids, then the range of the factor representing the average rate at which solids are entering the stream from the runoff should be quite a bit smaller, say, zero to three.

Defining Relationships—Immediate. After selecting and defining at least two factors of one or more objects, any two factors may be associated by creating "relationships," defined by selecting from among two types and several variations.7 One type of relationship that can be chosen is the "immediate" relationship, which works as follows: changes in the value of the causal factor are immediately reflected in the value of the affected factor regardless of what happened before in previous time steps.8 Immediate relationships may be defined with one of two orientations ("increases" or "decreases") and a selection of variations (e.g., "a little," "a lot," "about the same," "more and more," or "less and less"). These orientations and variations are selected with simple pull-down menus. By default, Model-It creates all immediate relationships as "increases about the same." Figure 4 shows how to define an immediate

7There are constraints upon the kind and nature of relationships that may be created between factors, because of the way relationships are internally represented mathematically. For example, multiplicative relationships, in which factors are multiplied together (as in, for example, a calculation of bank interest), are not possible in Model-It.

8When a model is run, a time counter ticks off arbitrarily sized "time steps." Each time step may represent a minute, hour, day, or whatever interval is conceptually sensible to the user. "Immediate" relationships operate essentially independently of time steps. "Rate" relationships, on the other hand, are driven by the time counter, so that the rate value is added to the affected factor once every time step.

relationship between the pond's suspended solids and its depth, using the "Relationship Maker." An "immediate decreases about the same" relationship has been selected, following the reasoning that suspended solids eventually settle out and decrease the depth of the pond. Note that the Relationship Maker screen provides a simultaneous graphical representation of the variation selected by the user as a scaffold for making the selection. Figure 5 shows how an immediate relationship can be defined between the fish population's count and the plants in the pond, using a table of values, following the reasoning (notice the explanation in the lower left) that if there are few fish living in the pond, plants will probably maintain a stable level, but too many plant-eating fish may lead to a decrease in the quantity of plants.
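One way to picture the orientations and variations is as simple curves over a normalized causal value. The particular shapes below are our assumption for illustration only; Model-It's internal curve definitions are not given in this paper, and only three of the menu variations are sketched.

```python
def immediate(x, orientation="increases", variation="about the same"):
    """Return an affected factor's normalized value for causal value x in
    [0, 1], under one plausible reading of the pull-down menu choices."""
    shapes = {
        "about the same": x,        # linear response
        "more and more": x ** 2,    # accelerating response
        "less and less": x ** 0.5,  # decelerating response
    }
    y = shapes[variation]
    # "decreases" mirrors the curve: high causal values give low outputs.
    return y if orientation == "increases" else 1.0 - y

print(immediate(0.25))                               # 0.25
print(immediate(0.25, variation="more and more"))    # 0.0625
print(immediate(0.25, "decreases"))                  # 0.75
```

The "decreases about the same" choice discussed for Fig. 4 would correspond here to the mirrored linear curve.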

Defining Relationships—Rate. The other type of relationship is called the "rate" relationship: at each time step, the value of one factor is added onto [or subtracted from] another factor's value. Figure 6 shows how to define a rate relationship between the average rate of runoff and the amount of suspended solids in the pond. The words on the left hand side of the window verbalize the relationship, saying that at each time step, the value of the average rate of runoff will be added to the amount of solids in the pond. This is why (as mentioned earlier) the runoff rate's range needs to be defined with a much smaller range than the total solids' value: if it were relatively large (say, one-fourth or one-half of the total solids' value), the total solids factor could reach its maximum value within a few time steps, a situation that might be realistic in catastrophes, but not particularly useful conceptually under more equilibrated conditions. When the user attempts to create a rate relationship between two such "mismatched" factors, Model-It produces a warning about the possible problem.
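The rate mechanism, and the reason mismatched ranges are a problem, can be illustrated with a short sketch of our own (the function is not Model-It code; the clipping at the maximum is an assumption):

```python
def run_rate(total, rate, maximum, steps):
    """At each time step the rate value is added to the affected factor,
    which is clipped at that factor's maximum."""
    history = [total]
    for _ in range(steps):
        total = min(maximum, total + rate)
        history.append(total)
    return history

# A small rate (0-3 against 0-500 lbs of solids) accumulates gradually...
print(run_rate(0.0, 3.0, 500.0, steps=5))
# [0.0, 3.0, 6.0, 9.0, 12.0, 15.0]

# ...whereas a rate of half the total's range saturates within two steps,
# the "catastrophe" behavior the text warns about.
print(run_rate(0.0, 250.0, 500.0, steps=3))
# [0.0, 250.0, 500.0, 500.0]
```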

Another potential problem exists when the user tries to create an immediate relationship to a factor that is already affected by a rate relationship (and vice versa). The simulation engine is not able to process a factor affected by both kinds of relationships.

Fig. 4. Creating an immediate relationship between "pond suspended solids" and "pond depth," using the Text View.

Fig. 5. Creating an immediate relationship between "fish count" and "plant quantity" using the Table View.

Fig. 6. Creating a rate relationship between "water average rate of runoff" and "pond depth."

Relationships, conceptually speaking, may be causal or correlational. For example, a relationship between pond depth and fish rate of decay might be more correlational than causal, because the depth of the pond itself won't cause changes in the population, but might be correlated with those changes. Model-It, however, does not require the user to make the difference between causal and correlational relationships explicit, although the user is encouraged to reflect upon how or why one factor affects another by the presence of a field in which "explanations" may be typed. Causal explanations should describe the mechanism behind a relationship; consider, for example, the explanation for the relationship between suspended solids in the pond and pond depth in Fig. 4. It says "Suspended solids eventually settle to the bottom of the pond, reducing the pond depth." The key is that more suspended solids cause eventual reduced pond depths by the mechanism of settling. On the other hand, an explanation such as "Fish die if the pond gets too shallow" doesn't articulate the mechanism relating fish populations and pond depth (fish deaths may be caused by reduced oxygen levels or by raised temperatures resulting from shallowness, but not by shallowness itself). So, although a relationship between fish rate of decay and pond depth may be observable and modelable, at best the relationship reflects a correlation, not a cause. What is important is that the user understand the difference.

Fig. 7. Running the pond model, showing Meters and Graph Windows ("weather rain" set to zero).

Fig. 8. Running the pond model ("weather rain" set to three).

Fig. 9. The Factor Map.

Testing the Model. In Model-It, the user tests his or her model interactively using several graphical tools. One tool, called a "meter," presents a continuous display of a factor's current value at the current time step. Multiple meters may be displayed while testing. Meters have a special property: if a factor has no other factors affecting it (i.e., it is an independent variable), its meter is a "slider" whose value can be adjusted while the model is running. This allows the user to test his or her model while it is running, setting values for independent factors to see what happens, and perhaps generating additional questions or hypotheses for further investigation.

A second tool, called a "Simulation Graph," presents a line graph display of the changing values of up to five factors over many time steps. Figures 7 and 8 show two tests conducted by Chris, our modeler, using meters and graph windows. The first test (Fig. 7) shows what happened when the model ran with the runoff factor set to zero: suspended solids stayed low, pond depth stayed high, fish grew, and plants declined and eventually stabilized at a low level. In the second test (Fig. 8), the model ran with runoff initially set at a value of three. Notice that the amount of suspended solids gradually increased with each time step (because of the rate relationship between runoff rate and suspended solids), which in turn eventually caused the depth of the stream to decrease. Because the pond was becoming shallower, the fish eventually died off, allowing the amount of plants to increase.

The Factor Map Overview. Finally, Model-It provides an overview of the entire model called the "Factor Map." Every factor of every object in the model appears in the Factor Map with its name and a small icon-sized graphic of its associated object. Figure 9 shows a Factor Map. Relationships are represented by arrows between factors; immediate relationships are solid black arrows, and rate relationships are gray. Factors can be moved anywhere on the screen and

adjusted so they are close to factors they affect so relationship lines don't cross, or so they are in an aesthetically pleasing arrangement. The Factor Map doesn't provide visual cues for relationships beyond the arrows between related factors (e.g., whether a relationship was defined with "increases" or "decreases"); however, double-clicking directly on the relationship arrow opens the Relationship Maker window, where its definition can be viewed or modified. The Factor Map's visual display allows the user to view his or her model as a whole; it assists him or her, then, in planning the model and synthesizing objects, factors, and relationships into a conceptual unit.9

Appendix B: Sample Data Analysis

The sample data analysis follows this section of transcript in the listing below, which has been annotated from the video. The underlined passages are discussed below:

1. {begin episode 10 creating rate affects depth, 51:20}
2. M: OK, rainfall rate affects depth. Weather . . . stream . . . (selects rainfall rate affects depth relationship)
3. N: Rainfall rate affects stream depth. Is it rate?
4. M: I don't know—Wait . . . why would it be, well [gets notifier message when they tried to choose rate] it could be both, but if it starts—as soon as the first . . . as soon as it starts to rain, it's going to change the stream depth. So how would that be a rate . . . never mind. I just don't understand how it could be a rate.
5. N: Because the depth keeps increasing as it keeps raining.
6. M: There's also an immediate. It affects it immediately, too.
7. N: No, immediate is where it never changes. . . . Immediate is where it changes once and it never changes again.
8. M: OK.
9. N: And rate is when it changes once and keeps changing—that's what . . .
10. M: Fine, fine, fine. I'm looking for something else. Do that, and I'm gonna—cause I don't think we will have enough factors if we just do that.
11. N: Well we have to think of the—you guys, stop it. [long pause, typing the explanation: "As the rainfall rate increases, the depth of the stream also increases." They make it a rate relationship.]
12. M: What?
13. N: OK, what about a new relationship?
14. {end episode 10}
15. {begin episode 11 discussing factor map & looking up t rate of change, 53:05}
16. M: New relationship, . . . wait, go to the factor map and see what we have so far. [goes to factor map] Wow. Oh, wow. OK. You want to move that rainfall rate.
17. N: Where?
18. M: Just move it.
19. N: Do you know what I'm going to do? I'm going to—cool, watch this.
20. M: But you see that still goes through rainfall rate. [moving stuff around on screen]
21. N: Oh, I see.
22. M: Oh, that's cool. Air temperature affects cloud cover which affects stream temperature. [he read it wrong] Oh, look, that's like in the middle.
23. N: Right. This affects both. [referring to cloud cover, which affects stream temp and air temp]
24. M: OK. Now, so we're not going to do acid rain. [N: No.] So there's no use for it being there.
25. N: Right. That's what I said. [deletes acid rain, confirms with notifier] I'm going to move it so it looks—so it looks like the arrow's not just going through it so it looks like the arrow is supposed to be like that.
26. M: OK, um, T-rate of change? Temperature, well, yeah, we could do that. What affects the temperature rate of change?
27. N: Well, the air temperature.
28. M: The stream temperature. I guess.
29. N: How did we do the rate of change last time? Why don't you look in here and find out how we did the t rate of change last time—what affects the t rate of change.
30. M: What packet would that be in?
31. N: Oh, my god. I would have no idea. It's not going to be in that one.
32. M: Yeah, you're right. Probably around 6 then. [looking in Guide packets] The T-rate of change affects . . . OK, air temperature affects T-rate of change.
33. N: Air temperature affects . . .
34. M: No, no, no.
35. N: Oh I didn't pick that one. OK. [connects air temperature with t-rate of change to make a relationship]
36. M: OK, wait, now we just have to find out why. OK, about the same, increases. [setting up air temp affects t rate in rel maker] OK, it means, OK, as the temperature gets higher, the water, OK. Oh, it's easy. As the water, or as the air temperature gets higher, the water changes temperature faster. Then if it gets lower, it changes. [typing explanation]
37. N: Now wait, as the air temperature . . . wait, as the air temperature of the water changes . . .
38. M: No.
39. N: As the air temper . . . changes . . .
40. M: Gets higher. [typing explanation: "as the air temperature gets higher, the water changes temperature faster"]
41. N: Higher, the water changes . . .
42. M: No, the water temperature changes, or the water changes temperature faster. [typing]
43. N: Faster?
44. M: Faster. And it's like as the air temperature gets lower, the water temperature, water changes temperature slower. The water changes the temperature slower. Slower, slowly. [types the rest: "as the air temp gets lower the water changes the temperature slower"]
45. N: Slower. OK?
46. M: Yeah. Enthralled.
47. {end episode 11}

9 The version of Model-It used by students in this study had a partially functional Factor Map: although factors could be moved around and relationships examined, the spatial arrangement was not saved with the model. Thus, factors had to be rearranged each time a model was opened.

Listing: Transcript for Chapter II Sample Data Analysis

Episode Division. The selection divides into two episodes at line 14, where the students switch from working out a relationship to looking at their Factor Map.

Episode 1 Descriptive Account. In this episode, they create and explain one relationship: rainfall rate affects stream depth.

Because Mark is now persuaded not to think about doing acid rain, he suggests the idea that rainfall rate affects stream depth. They immediately get stuck on the rate vs. immediate problem. Nicole argues for rate: "Because the depth keeps increasing as it keeps raining." Mark argues for immediate: "as soon as it starts to rain, it's going to change the stream depth." In the end Nicole's argument wins, as she demonstrates an understanding of rate vs. immediate that Mark can't counter: "Immediate is where it changes once and it never changes again—And rate is when it changes once and keeps changing." Mark assents and they leave it at rate. One of them finishes typing the explanation.

Episode 2 Descriptive Account. In this episode, Nicole and Mark look at their factor map, then create a relationship from air temp to t rate of change.

First they go to the factor map to see what they have so far. After arranging a few factors, Mark notices the connection between air temp, cloud cover, and stream temp, but interprets it incorrectly. Nicole points out that cloud cover affects the other two. Mark asks whether they're doing acid rain; Nicole says no, so they delete it. Mark notices t-rate of change and suggests that they create a relationship for it. He asks Nicole what affects it—she says air temp, he says stream temp. Mark looks up how they did it in the Guide, and finds that air temp affects stream t rate of change. Nicole connects the two and starts the relationship maker. Mark, having read the Guide, explains how it works: "As the water, or as the air temperature gets higher, the water changes temperature faster. Then if it gets lower, it changes." So they make it 'increases about the same' and type an explanation to that effect.

Episode 1 Event Identification. Constructing content—relationships.

Episode 2 Event Identification. Constructing content—relationship; obtaining information—looking up conceptual and procedural.

Episode 1 Analytical Narrative. Here they are confused about the rate vs. immediate concept. Both of them have reasonable ideas, but neither applies them to Model-It in sensible ways. Both attempt to justify their ideas and to make persuasive arguments; however, they don't go to the next level and try to find out which is right.

Episode 2 Analytical Narrative. Here they are generating another idea for their model. They realize that they lack the information needed to connect t rate of change to the rest of their model, so they consult the Guide. Mark looks up more than just how to do it (increases about the same); he also looks up the explanation, and reads it to Nicole so they both know and so she can type the explanation.

Episode 1 Cognitive Strategies for Modeling. Relational reasoning: making cause/effect statements, creating relationships with Relationship Maker; explaining: how parts are related (causally/correlationally), justifying an argument.

Episode 2 Cognitive Strategies for Modeling. Relational reasoning: creating relationships with Relationship Maker, discussing/selecting relationships, making cause/effect statements; synthesizing: discussing the model's representation in the Factor Map, deciding how the model should work as a whole; explaining: how parts are related (causally/correlationally).
