

Criticality in writing up research for publication

Sara Hennessy

Hong Kong University, October 2017

OUTLINE
• Critical approaches to writing up and interpreting research
• Need for more analysis, rigour, attention to bias, confounds – example from classroom dialogue research
• Examples from Edtech research – BJET, tablet use and learning gains


Critical: a questioning and sceptical stance

Analytical versus descriptive
• Are terms carefully defined, and are diverse uses of them problematised?
• Is the account analytical and underpinned by theory, or purely descriptive? Is there a theory of change?
• Do the authors critique the literature and present a balanced account, or does the review gloss over known controversies / ambiguities?

Critical approach to theory
• Is the perspective open-minded, questioning and objective?
• How do the theoretical assumptions and explanations of the case compare with alternative explanations?
• Are links made in the findings/discussion/conclusions back to the conceptual framework?


Is there evident bias?
• Do the RQs convey genuine inquiry, or do they assume that the intervention/application is a Good Thing?
• Are author agendas, assumptions, values and contextual factors acknowledged?
• Were there any potential sources of bias? What measures were taken to counter them? E.g. were any counter-examples sought when collecting and analysing data?
• Does reporting seem selective?

Is the reporting comprehensive?
• Is there enough detail about sampling and context?
• Is there enough information about what both teachers and students were doing?
• Is it clear what the analyses actually involved and how the findings were derived from them? Are the analyses explicit?
• Is the account reflective and evaluative?


Was the design rigorous?
• Are limitations in the design acknowledged?
• Could reactivity have played a role? Did novelty value increase motivation?
• Is self-report triangulated?
• Are analyses reliable? Is there participant validation? Is there corroboration by an independent researcher?
• How robust are the claims: how adequately is the argument supported by the evidence provided?
• Was there any kind of control in an intervention design?
• Are there threats to validity or reliability?
• Control variables: does inadequate control of extraneous factors threaten the validity of theoretical inferences from the data?


CLASSROOM DIALOGUE: DOES IT REALLY MAKE A DIFFERENCE TO STUDENT OUTCOME?

Christine Howe, Sara Hennessy, Neil Mercer

Confounding factors – thanks to Hattie!

General Approach
• Record authentic lessons, and analyze dialogue involving teachers to reflect current views about productive features
• Relate naturally occurring variation in dialogue to measures of student outcome, controlling for other potential influences


Recording and Analyzing

• 72 primary classrooms
• Recorded two lessons per classroom
• After achieving inter-judge reliability, analyzed all dialogue in which the teacher was involved (a reliability sketch follows this list)
• Derived variables suitable for multi-level modeling
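The slides do not show how inter-judge reliability was computed; as a hedged illustration, agreement between two coders is often quantified with a statistic such as Cohen's kappa before the full corpus is coded. All codes and data below are invented, not the study's actual scheme.

```python
# Minimal sketch of an inter-judge reliability check via Cohen's kappa.
# The dialogue codes and ratings here are hypothetical.
from sklearn.metrics import cohen_kappa_score

# Two judges independently code the same sample of dialogue turns,
# e.g. "Q" = open question, "E" = elaboration, "O" = other.
judge_a = ["Q", "E", "O", "Q", "E", "E", "O", "Q", "O", "E"]
judge_b = ["Q", "E", "O", "Q", "O", "E", "O", "Q", "O", "E"]

kappa = cohen_kappa_score(judge_a, judge_b)
print(f"Cohen's kappa = {kappa:.2f}")  # ~0.7 or above is commonly treated as acceptable
```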

Outcome measures: attainment and attitudes
• SAT scores: Reading, Spelling & grammar, Maths
• Customised cognitive measures: Science, Reasoning
• PASS: 50-item questionnaire assessing attitudes to school and self-as-learner


Initial identification and assessment of control variables
• Identified c.150 influences on student outcome
  - Eliminated the majority because of low effect sizes, irrelevance for the age group, or incorporation in other variables
• Assessed the remaining 32 variables via:
  - Teacher questionnaire, e.g. use of testing, homework
  - Student questionnaire or test, e.g. parental involvement, prior attainment, prior attitudes
  - Teacher observation, e.g. behaviour management
  - Student observation, i.e. small-group activity
• Each classroom scored for each variable

Final selection
• Multiple regression using backward elimination to exclude control variables not related to the 11 dialogue variables (a sketch of this procedure follows the list)
• Multi-level modeling to exclude control variables not related to the 6 outcome measures
• Only three variables remained, and even these did not apply with every dialogue variable:
  - Start-of-year attainment scores
  - Start-of-year PASS scores
  - Group work scores
• These variables were included in analyses as appropriate
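As a hedged sketch of what backward elimination can look like in practice (the variable names and the simple OLS setting below are invented; the study's actual analyses used multi-level models and are not reproduced here), the procedure repeatedly drops the least significant control variable until every remaining predictor is significant.

```python
# Sketch of backward elimination over candidate control variables.
# Column names ("prior_attainment", "elaboration_rate", ...) are hypothetical.
import statsmodels.api as sm

def backward_eliminate(df, outcome, candidates, alpha=0.05):
    """Return the candidate predictors retained for `outcome`."""
    retained = list(candidates)
    while retained:
        X = sm.add_constant(df[retained])
        fit = sm.OLS(df[outcome], X).fit()
        pvals = fit.pvalues.drop("const")   # ignore the intercept
        worst = pvals.idxmax()              # least significant predictor
        if pvals[worst] <= alpha:           # everything left is significant
            break
        retained.remove(worst)              # drop it and refit
    return retained

# Hypothetical usage:
# controls = ["prior_attainment", "prior_attitudes", "group_work"]
# kept = backward_eliminate(classroom_df, "elaboration_rate", controls)
# The outcome analyses would then use multi-level models, e.g.
# statsmodels.formula.api.mixedlm(formula, data, groups=...).
```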


“So what?”
• Is the research innovative, or at least original?
• What is the significance and contribution to existing knowledge? Theory? Methodology? Empirical data?

Is the research generalisable / replicable / scalable / sustainable?
• Who is the audience? Is the work relevant and current?
• Does the sampling strategy permit empirical generalisation to a larger population?
• Is awareness of limitations on generalisability explicit?
• Are there clear conclusions that generalise beyond the specific case/context? Do they apply in other institutions?
• Is the work applicable to, or of interest in, other countries? Is the focus overtly parochial?


British Journal of Educational Technology (BJET)

“BJET publishes theoretical perspectives and critical reviews, methodological developments and high quality empirical research that demonstrate whether and how applications of instructional/educational technology systems, networks, tools and resources lead to improvements in formal and non-formal education at all levels, from early years through to higher, technical and vocational education, professional development and corporate training.”

6000-word papers; 14% acceptance rate; impact factor 2.41.

Is the focus technical or linked to pedagogy?
• Has the innovation been tested with real users?
• Are there convincing learning outcomes?
• Are there explicit implications for practice? Policy?


Technology has no agency or “impact”!

• Many reports of interventions are initiated by “how can I use tablets / wikis / individual response systems / interactive whiteboards / the VLE…” rather than by educational need
• What matters is how it is used, by and with whom, how often, for what purpose, under what conditions, with what support…
• What is the role of the teacher? Has pedagogy-focused professional development been offered?
• What cultural shift in teacher and learner roles is necessary?

Is there consideration of barriers to equity and inclusion?

• Is edtech serving only the “privileged” in developing countries who already have access? (MOOCs: Liyanagunawardena, Williams & Adams, 2013)
• Barriers: SES and gender inequity, rural/urban divide, language, computer literacy, bandwidth and intermittent connectivity/electricity, maintenance, technical support, gatekeepers/stakeholders, culturally appropriate content, institutional support and capacity, teacher motivation…


Criticality in your area

Systematic review of evidence for learning through tablet use in schools

Motivation:
(i) Schools are investing large sums of money
(ii) Assumption: tablets have a positive “impact”
(iii) Policies advocating the use of tablets do so without referring to existing evidence

Aim: To critically review literature that reports on the use of tablets by children in school, with a particular focus on learning outcomes

Haßler, B., Major, L. & Hennessy, S. (2015). Tablet use in schools: a critical review of the evidence for learning outcomes. Journal of Computer Assisted Learning, 32(2), 139-156.


Review Methodology

Systematic review (SR) involves methodically collecting and critically analysing multiple research studies.

Advantages: identifying gaps in research, limiting bias, repeatability.

SRs aim to produce reports about evidence that are usable and reliable.

[RQ1] Do the subject knowledge and skills of students increase following the use of tablets to support educational activities?
[RQ2] What factors contribute to (un)successful use of tablets?

• Studies rated for high/medium/low relevance and methodological trustworthiness, including:
  - rigour in design
  - description of context
  - appropriate and explicit sampling strategy and data collection methods
  - appropriate and explicit data analysis & interpretation
  - credible claims with sufficient evidence
• Minimum quality threshold of “medium” on both dimensions (a sketch of this screen follows the list)
• Only 11 studies were judged of ‘high’ trustworthiness, and only 6 were high on both
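To make the inclusion rule concrete, here is a minimal sketch (the data and column names are invented, not the review's actual coding sheet): a study is kept only if both its relevance and its trustworthiness rating reach “medium”.

```python
# Sketch of the "medium or better on both dimensions" screen.
import pandas as pd

rank = {"low": 0, "medium": 1, "high": 2}

studies = pd.DataFrame({
    "study":           ["A", "B", "C", "D"],
    "relevance":       ["high", "medium", "low", "high"],
    "trustworthiness": ["medium", "high", "high", "low"],
})

included = studies[
    studies["relevance"].map(rank).ge(rank["medium"])
    & studies["trustworthiness"].map(rank).ge(rank["medium"])
]
print(included["study"].tolist())  # ['A', 'B'] pass both screens
```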


Findings (in brief)

Many potentially relevant studies were excluded from the analysis (>100 were found, but many focused on motivation rather than learning, or did not meet the criteria).

23 studies included in the final set:
• 16 reported positive learning outcomes
• 7 reported no difference in, or negative, learning outcomes

In principle, tablets (like other digital technologies) can be viably used to support students of all ages…

However…
• The fragmented nature of the current knowledge base, and the scarcity of generalisable, rigorous studies, make it difficult to draw firm conclusions:
  - Many studies provided a ‘lessons learned’ account or described an approach, without any empirical evidence
  - Lack of control groups and baseline measures
  - Limited detail of the activities that learners engaged in
  - Limited scope and duration
  - The optimal student-to-device ratio is unresearched and unclear
  - Little systematic exploration of theoretical frameworks


However… (continued)
  - Research conducted in ‘controlled’ environments without the teacher
  - Little information on the added value of tablets over other mobile technologies (including unique features such as accelerometer and GPS sensors)

• Explanations of how/why tablet use can improve learning remain elusive (i.e. technical, not pedagogical, factors were the focus)

This has parallels with many technology-based education projects, which often focus narrowly on hardware/software rather than on teaching practices.

https://www.bera.ac.uk/blog/tablets-in-schools-tools-for-learning-or-tantalising-toys

An example: the Kenyan Primary Math and Reading Initiative (PRIMR) programme studied the effectiveness of three interventions for literacy (Piper & Kwayumba, 2014):
• tablets for teacher educators
• tablets for teachers
• eReaders for students

Gains in student learning outcomes were shown for all three treatments, with no statistical difference between groups in terms of learning gains (although cost per student varied).

The findings highlight learning gains due to the instructional approach, not the use of technology, since the tablet programme was “not noticeably different from the base, non-ICT PRIMR intervention”.


“Education is on the brink of being transformed through learning technologies; however, it has been on that brink for some decades now” (Diana Laurillard, 2008)

We need to move beyond research studies and reports of new instructional and pedagogical approaches, or emerging educational technologies, that present a “victory narrative” rather than problematising.

The 3 Rs: Rigour, Robustness, Reflection

QUALITY takes time
QUALITY NOT QUANTITY!


Questions

1. Are there effective ways to support authors in bringing in a critical edge? Is modelling helpful?
2. What steps can you as a researcher take to be more systematic and rigorous, to minimise bias, triangulate findings, and deal with confounds? To present more critical and analytical reports?
3. Think about a piece of your own research: what can you do to strengthen its quality for publication?