
Evaluation and Festival-based Public Engagement

Dr Eric Jensen, Assistant Professor of Sociology

University of Warwick
e.jensen@warwick.ac.uk


Website for Evaluation Examples

• http://warwick.academia.edu/ericjensen

Cambridge Science Festival: External Evaluation

• On-site survey (large-scale, single page A4)

• Online follow-up post-festival survey (small scale, in-depth qualitative questions)

• Post-festival focus group (recruited from survey sample; in-depth qualitative discussion)

*See report on Cambridge Science Festival website for details / results

Big picture on evaluation within non-formal learning settings

• Full-scale evaluation unrealistic as a continuous activity for some institutions / festivals.

–May need to bring in external expertise

–May need additional training / skills

Bare minimum (pre-evaluation): be able to specify intended outcomes, and the specific connections between content, delivery approaches and those outcomes (checked against current knowledge).

Big Picture continued

• If possible, formative evaluation of engagement ideas before full public rollout.

e.g. focus groups, other pre-testing of ideas

• If possible, summative evaluation to address 'how' and 'why' questions within engagement.

These issues hold implications beyond any particular festival, so it is important to share results.

Evaluation Research

• Evaluation = sub-category of 'social research' (thus all principles of social research apply)

• Distinguishing feature of evaluation: focus on objectives / claimed outcomes (which practitioners specify)

• The research process begins with a concept / idea that a researcher is interested in measuring or observing (e.g. 'learning' or 'interest in art').

Translating Objectives into Research Questions

The Evaluation Process: 1st steps

• The process of translating abstract / general ideas or concepts into concrete, measurable variables is very important.

• Easier said than done.

• Define concepts by what they 'do'.

• How would you know that a particular kind of change has happened?

• Think about how you would know if your intervention has been successful.

Quantitative Evaluation Methods

Are used to answer counting-related questions:

• How many? What proportion? What percentage?

-Survey Research

-Structured Observation (visitor tracking)

Qualitative Evaluation Methods

Any study involving non-counting data (e.g. words, drawings, etc.)

– Can be converted into quantitative data

Evaluation Research Approaches: Defining Terms

Qualitative Evaluation Research

• Qualitative Interviews

• Focus Groups

Data Analysis

- Must be systematic to avoid tendency to select quotes based on personal bias and preferences.

- Can convert qualitative data into quantitative data through content analysis
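As a minimal sketch of that conversion (the coding scheme and coded responses below are hypothetical, not from the Cambridge evaluation): once each open-ended answer has been assigned a code, content analysis reduces to counting codes.

```python
from collections import Counter

# Hypothetical codes assigned to open-ended survey responses during
# content analysis (one code per response, for simplicity).
coded_responses = [
    "learning", "social", "learning", "entertainment",
    "social", "learning", "entertainment", "learning",
]

counts = Counter(coded_responses)
total = len(coded_responses)

# Report each coded category as a count and a percentage of responses.
for code, n in counts.most_common():
    print(f"{code}: {n} ({n / total:.0%})")
```

The systematic step is the coding frame itself; the counting is trivial once every response, not just the quotable ones, has been coded.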

Sampling

• Sometimes we can study every case: e.g. the whole festival visitor population.

• But most of the time this would be too difficult or time consuming.

• So we usually study just a sample of the cases that we are interested in.

• What is most important in selecting a sample is that it is representative of the population.

• When a sample is representative we can make statements / claims about the population based on the sample.

What is a Representative Sample?

• To be representative, the sample should accurately reflect the whole population of research interest (e.g. not selecting evening events only, or weekends only).

• The ideal is that every member of the population has an equal chance of being included in the sample.
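A minimal sketch of that ideal, assuming a hypothetical sampling frame of visitor IDs: `random.sample` draws without replacement, giving every member of the frame the same chance of inclusion.

```python
import random

# Hypothetical sampling frame: one entry per festival visitor.
population = [f"visitor_{i}" for i in range(1, 501)]

random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(population, k=50)  # simple random sample, n = 50

print(len(sample))       # sample size
print(len(set(sample)))  # equals sample size: no visitor drawn twice
```

In practice the hard part is building a complete frame; sampling only from, say, the weekend events reintroduces exactly the bias described above.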

An incredibly brief introduction to surveys as an evaluation method

The Survey Process

1. Clearly identify your aims

2. Select your population and sample

3. Select how data will be collected

4. Build your questionnaire

5. Pilot the survey and re-design accordingly

6. Conduct main survey

7. Analyse data and report results

Evaluation Survey Design Flaws (Avoid!)

• Weak construct validity: survey measures that are not sound indicators of the particular outcomes they are meant to capture.

• Non-specific effects: improvements or changes from effects not specific to the factor or treatment under study (this makes pre- and post-visit data collection the ideal).

Survey Design Flaws (Avoid!) continued

• Demand characteristics: the tendency of respondents to alter their responses in accord with what they believe to be the researchers' desired results.

• Experimenter expectancy effect: the tendency of investigators to unintentionally bias results in accordance with their hypotheses. (Allow for negative results!)

Principles of question design

• Survey question response options need to be:

Exhaustive – everyone fits into at least one category.

Exclusive – everyone fits into only one category (unless specifically required to 'tick as many as apply').

Unambiguous – the options mean the same to everyone, so all responses are comparable.
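These first two properties can be checked mechanically; a minimal sketch with hypothetical age-band response options (the bands below are illustrative, not from any actual survey):

```python
# Hypothetical age-band response options as (low, high) inclusive ranges.
bands = [(0, 17), (18, 24), (25, 34), (35, 54), (55, 120)]

def matching_bands(age):
    """Return every band that the given age falls into."""
    return [b for b in bands if b[0] <= age <= b[1]]

# Exhaustive: every plausible age matches at least one band.
# Exclusive: no age matches more than one band.
for age in range(0, 121):
    assert len(matching_bands(age)) == 1, f"age {age} fits {matching_bands(age)}"

print("options are exhaustive and mutually exclusive")
```

A common failure this catches is bands like 18–25 and 25–34, where a 25-year-old fits two categories.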

Questionnaire Design: Considerations

• Decide on your questions

• Decide on type of question response (e.g. Likert, multiple choice, open-ended) and refine the wording

• Decide on sequence of questions and overall layout of the survey form.

Keep the end in mind

• Ensure clear connection between questions and research aims and analysis

The Survey Form

• Give your questionnaire form a title

• Provide a brief introductory statement

• Contact and return (in the case of postal surveys) information should be included on the questionnaire

• Number individual questions to aid in the data entry and analysis process later on.

• Be consistent in phrasing and try not to use too many different question types in order to avoid confusing respondents.

Questionnaire Layout

• Don't put too many questions on any one page of the survey.

• Response rate can be negatively affected by questionnaires that seem too long at the outset, so ensure that there are no unnecessary questions in the final version.

• Use italics and bold consistently: e.g. for instructions and for the questions or category headings

• Ensure a logical and simple structure for the questionnaire, avoiding unnecessarily complex and changing question types.

Question Types

• Open-ended

– What interested you in attending the festival today?

• Ranked response

– Rank your preference from amongst the following options

• Multiple versus single response

– Specify 'select one' or 'tick all that apply'

• Likert scale (rating scales): 1–5, 1–7, 1–9
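Likert ratings are ordinal, so reporting the full response distribution (and perhaps the median) is usually safer than a mean; a minimal sketch with hypothetical 1–5 ratings (the question wording and data below are invented for illustration):

```python
from collections import Counter
from statistics import median

# Hypothetical 1-5 Likert responses to a statement such as
# "The event increased my interest in science" (1 = strongly disagree).
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

dist = Counter(ratings)
for point in range(1, 6):
    print(f"{point}: {dist.get(point, 0)}")  # how many chose each scale point

print("median:", median(ratings))  # ordinal data: median over mean
```

Showing the distribution also reveals polarised responses that a single summary number would hide.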

Final Notes on Questions: Part 1

• Ensure questions are as brief as possible.

• Use plain language. Avoid jargon, assumptions of specialist knowledge.

• Minimise ambiguity in the questions and response options

Final Notes on Questions: Part 2

• Ensure you don't have any double-barrelled questions (e.g., 'What interested you in visiting the festival this year and last year?').

• Avoid leading questions!!!

– Leading questions such as 'Do you agree that Durrell is doing important work to save animals from extinction?'

Top Tips for Evaluation Methods

Take home points / Top Tips

• Evaluation requires very clear, specific and measurable objectives

Beware of 'Raising Awareness' and 'Inspiring Interest'!

• Quantitative Methods

Get the design right at the beginning! (e.g. pilot testing)

• Sampling

Equal probability of selection!

• Surveys

– Good for large samples and large claims / statements, but think carefully about question design!

– Avoid self-report

Take home points / Top Tips continued

Qualitative Evaluation Methods

Stay open-ended and be a sponge of information!

Focus groups

Learn from the disagreements and help the quiet people to speak!

Interviews

Deep understanding of small number of people
