
A Sample of Best Practices in University Lower Division Science Education

CSE Brownbag, 18 October

Dan Bernstein, University of Kansas, [email protected]

Overview of session

• Uses of class time
• Capturing out of class time
• Alternative course designs
• Discussion of implementation
  – Costs and benefits
  – Cultural context
  – Recommendations

• Disclaimer

Using class time
• Pause for problems / interaction
• Mazur is the poster child
• Survey of KU clicker users
  – Attendance and pop quizzes
  – Check for understanding ranked third
  – No plans for how to proceed
• Pollock argued that the main benefit is collaboration during breaks
  – "Stop learning and listen to me"

• Could be routine feature of classes

Group work tricky in large classes

• Managing groups requires planning

• One KU professor takes on Budig 120

• Tim Shaftel (Business) inserts group days

• Random assignment to fours w/ warmup

• Works a problem on the big screen

• Breaks out for comments and suggestions

• Roams the room for solutions

• Consider the video

Tutorials - U of Washington PEG
• Lillian McDermott and colleagues

• Crafted generative problem tutorials

• Intended to replace lecture/problem sessions [TA doing the problem]

• Active engagement in figuring conceptual features of physics

• Consider these examples

Collisions in 2-D

Progressive questions per setup
Focus on explanations

More challenging particulars

Magnetism -- with magnets

Progressively complex

Steve Pollock, Physics, University of Colorado

• Replaced typical discussion for Intro to Physics with Tutorials.

• Result: very high learning gains, by national standards. (“The final score matches what our junior physics majors get on this hard exam!”)

Colorado -- BEMA pretest

BEMA = "Brief E&M Assessment," a validated, research-based survey of conceptual elements of E&M. Blue data: F04 (N=319), pretest average 26%.

[Figure: histogram of matched BEMA pretest scores, Fa04 (CU scoring); x-axis: Score (%), y-axis: % of students; series: PreF04]

BEMA post -- Comparable to Grad Students

F04 (N=319) 26% -> 59%, S05 (N=232) 27% -> 59%

"Posttest results yield an impressive replication for two semesters. High by nat'l standards (typical trad courses, post score = 30-40%!)"
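For scale, physics education researchers commonly summarize pre/post results with a normalized gain; the slides report only raw averages, so the following LaTeX calculation is an illustrative reading of the F04 numbers rather than the presenter's own analysis:

\[
\langle g \rangle \;=\; \frac{\text{post} - \text{pre}}{100\% - \text{pre}}
\;=\; \frac{59\% - 26\%}{100\% - 26\%} \;\approx\; 0.45
\]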

[Figure: histograms of matched BEMA posttest scores, Fa04 vs. Sp05 (CU scoring); x-axis: Score (%), y-axis: % of students; series: PostF04, PostS05]

Pre/post FMCE (Sp04)

[Figure: pre/post FMCE score distributions, Sp04; x-axis: Score (%), y-axis: # of students; series: Pre, Post]

This is their research area

Inquiry laboratories

• Related to the tutorials -- constructivist model of understanding

• Taken to a full hands-on laboratory

• Joe Heppert, Jim Ellis, Jan Robinson

• Engage students in process

• Embedded, inductive, open-ended

High End -- Studio Physics

• Hands-on discovery in place of lecture

• Reorganize even very large classes

• Two hour blocks of time

• Measure conventional and conceptual skills

• Taking inquiry lab, constructivist model to the whole experience

• Robert Beichner, NC State, one example

Teaching space very different

Conventional exam questions

MC items - Studio v. three lecturers

Studio => comparable problem sets

Failure ratio: Lecture/Studio

FCI gain - Highly replicable

Semester gain by class rank

Outside of Class Time

• Some use technology

– Center for Academic Transformation

• Others based on peers

– Community building

– Metacognitive coaching

Carol Twigg invested Pew funds
• Re-gifted the money for course redesign

• Focused on technology as tool

• Emphasized saving money through efficient non-human or lower cost human delivery

• Committed to evaluation by learning and completion rates

• Increased success and/or difficulty of course

• Tracked learning downstream in curriculum

• Decreased rates of D, F, and Withdrawal
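Several results below are reported as changes in the DFW rate. As a point of reference (the notation here is ours, not the source's), that rate is typically the share of enrolled students who finish with a D or F or who withdraw:

\[
\text{DFW rate} \;=\; \frac{N_D + N_F + N_W}{N_{\text{enrolled}}}
\]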

Carnegie Mellon -- Statistics
• Created StatTutor program

• Open-ended intelligent tutoring software
  – Gives feedback on individual paths
  – Focuses on decision making en route

• Aimed for high levels of skill not previously attainable

• 22% increase in scores

• Critical skill is selecting appropriate statistics to use

High rates of success

• Replicated in two course offerings (N>400)

• Selection error rates dropped from ~9 to <1

• Two skills not attempted before reached 70% correct

Ohio State University - Statistics

• Buffet of options for >3000 students / year

• Discovery laboratories, small groups, small lectures, training modules, video reviews

• All take common examinations

• Learning was greater than comparable daytime lecture based course

• Greatly enhanced retention of students

• Fewer W’s, F’s, and I’s

• Modular credit (1-5), reducing full retakes

Tutorial outperformed day class
• Large class equaled smaller night class
• Fewest failures
• Maintained large enrollment

Penn State - Statistics
• Reduced lectures from 3 to 1 per week

• Replaced with computer lab time
  – Computer-mediated workshops
  – Extended practice in computerized testing

• Lecture: Exam pre-post was 50% => 60%

• Redesign: Exam pre-post was 50% => 68%

• Selection of correct tool: 78% => 87%

• DFW rate: 12% => 10%

• 2200 students per year

University of Iowa - Chemistry
• 1300 students / year

• Pressure from Engineering and Pharmacy

• Fewer lectures, modular content, active participation, computer simulations

• Inquiry based laboratories

• No difference on common exam items

• Am Chem Soc exams: 58 => 65, 52 => 61

• DFW: 24-30% => 13%

• DF: 16% => 9%; W: 9% => 4%

U Mass - General Biology
• 700 students / semester

• Lectures: 3 => 2, add review session

• Inquiry lab already in place

• Interactive class technology, online quizzes

• Peer tutoring and supplemental instruction

• Use ClassTalk network for students

• Exams: 61% => 73% correct

• Questions: 23% => 67% required reasoning

• DF: 37% => 32%

Peer-led workshop groups
• Northwestern University Biology course

• Based on legendary work of Uri Treisman

• Peers prepared as facilitators at UT Austin

• Led group problem solving 2 hr / week

• Majority students outperformed controls
  – Steady improvement across exams

• Minority students outperformed controls
  – Improvement noted on last exam
  – Historic controls show decline over course
  – 3rd exam exceeds majority controls

Wendi Born @ CTE 7 November

Supplemental Instruction
• Peer-led sessions with trained facilitators

• Part content and part metacognition
  – Study skills
  – Learning about learning

• Designed at UMKC by Deanna Martin
  – Addresses high failure rates among minority students in professional programs
  – Identifies at-risk courses, not at-risk students

• Lani Guinier on the canary

Key Characteristics
• All students invited, not targeting weakest

• Always with faculty cooperation

• Sessions begin right away
  – Not associated with having problems

• Minority students:
  – SI participants have a 2.02 GPA in these courses
  – Non-SI participants have 1.55 in the same courses

• DFW rate:
  – Non-SI at 43%, SI at 36%

Huge international following

Nebraska - Intro to Chemistry
Non-SI students had more than double the failure rate:
83% passed with SI, 57% without (failure rates of roughly 17% vs. 43%)

“Universal” aid, like Studio Physics

[Universal] Design for Success

• Presume students can learn

• Discount need to sort or differentiate

• Maximize overall course performance

Benjamin Bloom promoted mastery
• Based on practice and feedback
• Divide course into many smaller units
• Take examinations and get results
• Require taking exam again until high score
• IFF 95% correct => study next unit

Fred Keller promoted mastery
• IFF 95% correct => study next unit
• Course grade is number of units passed
• No penalty for repeating and learning
• All who pass 12 units => grade of A
• Do A work on 10 units => grade of C
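A minimal Python sketch of the Keller-style grading rule just described, assuming a 12-unit course and a 95% mastery criterion per unit; the slide states only that 12 units earn an A and 10 earn a C, so the intermediate cutoff and all names below are illustrative assumptions.

# Minimal sketch of a Keller-style mastery grading rule; names and the B cutoff are assumptions
def unit_mastered(score_pct, criterion=95.0):
    """A unit counts only once the unit exam meets the mastery criterion; retakes carry no penalty."""
    return score_pct >= criterion

def course_grade(units_passed):
    """Course grade is simply the number of units mastered, mapped to letters."""
    if units_passed >= 12:
        return "A"        # all 12 units mastered
    if units_passed >= 11:
        return "B"        # assumed intermediate cutoff (not stated on the slide)
    if units_passed >= 10:
        return "C"        # A-level work on 10 units => grade of C
    return "below C"      # keep working; repeating carries no penalty

# Example: best-so-far scores on 12 units, only 10 of which meet the 95% bar => grade of C
scores = [97, 100, 96, 95, 98, 99, 95, 96, 97, 100, 80, 60]
print(course_grade(sum(unit_mastered(s) for s in scores)))  # prints "C"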

Also taught conventional lecture

Mastery class
• 95% contingency
• No penalty for learning
• Immediate feedback
• No lecture required

Lecture class
• Same exam questions
• Two attempts per test
• In-class feedback
• No contingency

Total amount learned

• Nearly twice as many at the high end of learning
• Virtually no one failed to learn
• Maximized learning for many students

Shown in both amount and accuracy

• Many more questions answered

• Took 12 15-item tests
• Lecture was three tests of 20 items each
• Certify more learning
• Overall percent correct also higher

No magic -- students studied better

• They put in more time to their learning

• The course asked for more work

• Note that they report doing the reading more

• Preparation for class is a key issue (more on this later)

• Guideline in US -- 2 hours outside for every 1 hour in class

Major meta-analysis of 100 studies

• Kulik, Kulik, & Bangert-Drowns

• Consistently more learning

• More time on task

• Greater retention over time

• Lower completion rates when used without deadlines and incentives

Placement and Prerequisites

• Variation on the same theme

• Languages require competence

• Tracking skill downstream in the curriculum

• Using mastery criteria for foundation courses

• Requires some coordination within and between units

• Could benefit from tutorials and SI

Marginal gains not clear
• Are these effects additive?
• Maybe they all help the same students in the same way

Ernest Boyer:

The work of the scholar remains incomplete until it is understood and used by others.

Challenges in teaching science
• Do we really want success? Grade inflation?

• How do we handle the coverage/depth issue?

• What about the resources?
  – Space, funds, faculty time

• Students should also be responsible

• Are these technologies transferable/robust?

• What about bureaucratic hurdles?
  – Remedial courses/tutorials, undergrad TAs
  – Semester-based credit and tuition

http://www.cte.ku.edu

Your Insights?

Three functions of grading

• Certification of learning

• Motivation for learning

• Differentiation among learners

• Each has a legitimate purpose

• No one system does all well

Variability in conventional course

• Students learn at different rates
• When course ends, fast learners get best results
• Very good at identifying fast learners (differentiate)
• Less good at motivating for more work

Variability in a mastery course

• Everyone learns until material is mastered
• Reward is for work; subjective probability of success
• Very good at certification of learning
• Provides incentive for studying, no penalty if slow

When is mastery the right approach?

• Foundation courses -- want knowledge
• Programs in which rate of learning is not a criterion for success
• Situations in which performance will not be timed
• Professions in which high skill is expected
• Why tolerate ineffective teaching?
• If we don't care or think it cannot be learned by all

How much can a student learn?

• Boundaries are time, effort, and capacity
• Time and capacity are fixed
• Your leverage into learning is effort
• Organize a system that allows extra work
• Honor that work when it succeeds
• May lose some identification of capacity
• Greatly expand the amount of learning

Scholarship Assessed (1997)
• All forms of scholarship include:
  – Clear goals
  – Adequate preparation
  – Appropriate methods
  – Significant results
  – Reflective critique
  – Effective presentation

Glassick, Huber, & Maeroff

Communities of inquiry on learning

• Being very public with teaching, in the same sense as a center for research

• Faculty need another lens to complement the student voice and provide converging measures

• Have an external community that values this work

• Stresses our existing skills at intellectual inquiry as basis of exploration

Building a community to discuss ideas about teaching

• Workshops and seminars for faculty members

• Straightforward process of peer interaction
• Exchange ideas around three themes
• Provide resources for exploration
• Written product of thinking and planning

Collaborative Working Seminars

• Discussion of shared issues with colleagues

• Time for reading and searching

• Targets for writing and sharing

• Intentional plan is the product