Workshop 1: Introduction, Case Studies and Context (for wiki)
TRANSCRIPT
Grace Kimble
8-10 October 2013
House of Astronomy, Heidelberg, Germany
EU Universe Awareness International Workshop: Evaluation sessions

Welcome! Please write:
• Name
• Country
• What you want from the workshop
Introductions
Workshop goals
Participants will:
1.1 Understand evaluation key ideas
1.2 Share evaluation case studies
1.3 Learn about evaluation context
2.1 Recap evaluation methods
2.2 Carry out (video) interviews
2.3 Consider data analysis
3.1 Review evaluation reports
3.2 Present demographic information using mapping software
3.3 Consider evaluation strategies
3.4 Present evaluation information
Workshop 1 Aims
1.1 Understand evaluation key ideas
1.2 Share evaluation case studies
1.3 Learn about evaluation context
Workshop 1 activities
- Using the wiki: evaluationunawe.wikispaces.com
- What you already know / questions you have
1.1 To understand evaluation key ideas
- Overview: key words and organisation (GK)
1.2 To share evaluation case studies
- Case studies: present web links, evaluation findings and gaps
1.3 To learn about evaluation context
- Research about evaluation and examples (GK)
- Review: return to mind map
General reference: Personal Meaning Mapping
Falk, J. H. (2003). Personal meaning mapping.
In G. Caban, C. Scott, J. H. Falk & L. D. Dierking (Eds.), Museums and creativity: A study into the role of museums in design education. Sydney: Powerhouse Publishing.
Specific astronomy education reference:
Lelliott, A. (2008). Data collection outside and inside the classroom: Personal Meaning Mapping. University of the Witwatersrand, South Africa.
Nonkululeko, Grade 7
Before a visit to an astronomy science centre:
• Listed nine planets together with some brief facts, e.g. Jupiter is the biggest planet and Mercury is the closest planet to the Sun.
• Referred to stars as being “a lighting thing” created by God, and that they are our “friends, family and negbour” (sic).
• Placed stars at the galaxy and Milky Way.
• Stated that space consists of open space, containing planets, stars, the galaxy and the Milky Way.
When probed about her PMM, she confirmed that “God created stars so that it can shine at night”. Although she knew the term galaxy she was unable to explain its meaning or its relationship to the term Milky Way. She further referred to a spaceship and rocket, although she found difficulty in expressing herself here. She also appeared to have differing ideas on aliens. Having said she doesn’t believe in them in the structured interview, she mentioned that some planets have them in the PMM.
Lelliott, A. D., Rollnick, M., & Pendlebury, S. (2005). Investigating learning about astronomy: a school visit to a science centre. Paper presented at the 13th Annual SAARMSTE Conference, Windhoek, Namibia.
After her visit to an astronomy space centre:
• “Saw which bottle goes high and low”: a reference to the ‘Coke bottle rockets’ which students used in an activity.
• Additional planets to the nine named ones.
• Additional facts about the nine planets.
• Black spots on the Sun.
• Various features of Mars: water, land, and orbit.
• A description of the Moon landing and the time taken to get there.
• A star bigger than the Sun.
How can children respond to new experiences?
Draw
Talk
Play games
Answer questions
Educator impact on teachers; Teacher impact on children’s learning
Wouter Schrier and Erik Arends
Teacher training session in Leiden
Dumfries Primary School
Teacher interview, Dumfries
After training session by Libby McKearney and Mark Bailey
• Randomised controlled trials, e.g. pre/post test of knowledge.
Example: Two-Group Pretest-Posttest Comparison Study

              Pretest score   Posttest score
Treatment A   Apre            Apost
Treatment B   Bpre            Bpost
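The two-group comparison above can be sketched numerically: compute each group's average gain (posttest minus pretest) and compare them. The scores below are illustrative placeholders, not workshop data:

```python
# Sketch: analysing a two-group pretest/posttest comparison study.
# The score lists are invented for illustration.
from statistics import mean

treatment_a = {"pre": [4, 5, 3, 6, 5], "post": [7, 8, 6, 9, 8]}
treatment_b = {"pre": [5, 4, 5, 6, 4], "post": [6, 5, 6, 7, 5]}

def mean_gain(group):
    """Average posttest-minus-pretest score for one treatment group."""
    return mean(post - pre for pre, post in zip(group["pre"], group["post"]))

gain_a = mean_gain(treatment_a)  # average of Apost - Apre
gain_b = mean_gain(treatment_b)  # average of Bpost - Bpre
print(f"Gain A: {gain_a:.1f}, Gain B: {gain_b:.1f}")
```

Comparing mean gains rather than raw posttest scores controls for groups that start at different knowledge levels; a real study would add a significance test.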
• Affective responses: Likert scale
• Closed questions in surveys
• Multiple choice, e.g. Sadler (1992), administered to 1,400 school students. Result: Project STAR curriculum materials.
1.1 To understand evaluation context and key ideas
- Overview: key words and organisation (GK)
Benefits? Disadvantages?
QUANTITATIVE = NUMERICAL OUTPUT
• Interviews
• Observations
• Drawings
• Interpretive
• Images
• Video
QUALITATIVE = TEXT / VISUAL OUTPUT
Definitions
• Quantitative
• Qualitative
• Front-End
• Formative
• Summative
Case study
1. Who were the participants?
   a) Age
   b) Number of participants
2. What was the activity? Website address:
   a) Goals
   b) Description
   c) Time of day
   d) Activity length
   e) Who delivered it?
   f) Format (e.g. resource, school session, festival)
   g) How was it advertised?
3. Where did it take place?
   • Location
   • Country
4. How did you evaluate it?
•What did you want to know?
•Who were you evaluating it for?
•Which methods did you use?
•What did you find out?
•What were the challenges?
•How did you communicate what you found?
•What are the implications of the evaluation?
http://goo.gl/GyaeCc
Workshop 1 activities
1.2 To share evaluation case studies
- Case studies
Please show a weblink about your activity if possible.
•What did you find out?
•Were there gaps in what you were able to find out?
Context: Levels of evaluation
• International policy
• National policy
• Regional policy
• Network of (in)formal learning organisations
• Cluster of schools
• Informal learning organisation
• Curriculum
• Enrichment programme
• E-learning programme
• Resource
• Intervention
• Group of informal educators
• Group of teachers
• Group of parents
• Informal educator
• Teacher
• Parent
• Group of children
• Child
Globalisation and Policy Research in Education (Rizvi, 2009).
View of knowledge
When technology illuminates objective knowledge, the role of the educator can focus on the subjective.
View of knowledge
Patton, 1997: ‘You get what you measure’
Measurement needs to adapt to what educators are doing in the current context
UNAWE evaluation framework
UNAWE evaluation framework
Domains of learning
UNAWE Evaluation epistemology
‘Scientific realism’ - Robson 2002
Moves beyond objective, modernist positivism to acknowledge the importance of the social and historical factors.
Chatterji, 2009
Extended Term, Mixed Method (ETMM) approaches
1. Pragmatic choice of methods
2. Mixture of qualitative and quantitative data; triangulation
3. Synthesis of a common framework to organise complementary data
4. Does not attempt to generalise between contexts; applicability of results is specific to context.
Moves beyond randomised controlled trials.
Issues in global evaluation
• Why? Purpose
• Acceptable evidence
• Role of stakeholders
• Standards
• Key questions
• Methods
• Collaboration
• Evaluator role
There are different and conflicting schools of thought on how to do educational evaluation
Schwandt, 2009
At the core of the IOCE vision is the belief that evaluation as a practice can best be strengthened by the collective and professional efforts of colleagues working together in organised ways.
International Organisation for Co-operation in Evaluation, 2008
Tensions in a globalised era: ontological perspectives (Levin-Rosalis et al., 2009: 191)

Criteria      Structure                                Human Agency
Function      Control, supervision, accountability     Learning, understanding
Goal          Standardisation, universality            Variance, difference, diversity and peculiarity
Frame         Structural/macro perspective             Diagnostic, i.e. pupil level
Focus         Products, conceptual definitions         Processes, local meanings
Benefit       Sorting, accountability                  Strengthening, autonomy
Outcomes      Knowledge/professionalism                Strengthening/autonomy
Methodology   Scientific, quantitative, RCT            Responsive, diversified
Inquiry       Analytic                                 Holistic
Locus         External                                 Internal
Context: Levels of evaluation
• International policy
• National policy
• Regional policy
• Network of (in)formal learning organisations
• Cluster of schools
• Informal learning organisation
• Curriculum
• Enrichment programme
• E-learning programme
• Resource
• Intervention
• Group of teachers
• Group of parents
• Teacher
• Parent
• Group of children
• Child
Structural perspective
Human agency perspective
Key idea: democracy in evaluation
1.3 To learn about evaluation research and examples
- Research about evaluation and examples (GK)
Democracy in UNAWE evaluation
Tensions in a globalised era: ontological perspectives (Levin-Rosalis et al., 2009: 191)

Criteria      Structure                                Human Agency
Function      Control, supervision, accountability     Learning, understanding
Goal          Standardisation, universality            Variance, difference, diversity and peculiarity
Frame         Structural/macro perspective             Diagnostic, i.e. pupil level
Focus         Products, conceptual definitions         Processes, local meanings
Benefit       Sorting, accountability                  Strengthening, autonomy
Outcomes      Knowledge/professionalism                Strengthening/autonomy
Methodology   Scientific, quantitative, RCT            Responsive, diversified
Inquiry       Analytic                                 Holistic
Locus         External                                 Internal
1.3 To learn about evaluation research and examples
- Research about evaluation and examples (GK)
Local human agency perspective
Integration with structural perspective
Categorisation for communication
UNAWE evaluation framework
Domains of learning
Evaluation research notes
Workshop 1 activities
- Review: return to mind map
What’s new? Try out? To share?
Workshop 1 Aims:
1.1 To understand evaluation context
- Front-end, formative and summative evaluation
- Quantitative and qualitative data
1.2 To share evaluation case studies
1.3 To place existing evaluation experiences in wider contexts
Analysis: extent, breadth, depth, and mastery
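The Personal Meaning Mapping dimensions listed above can be sketched as a rough scoring routine. The map data and the counting rules are illustrative assumptions, not Falk's official rubric; 'mastery' is a holistic expert judgement, so it is left to a human rater:

```python
# Sketch: scoring a Personal Meaning Map on three of the four analysis
# dimensions (extent, breadth, depth). Map data is hypothetical, echoing
# the pre-visit example in the text; each key is a concept the child wrote,
# each list item an elaborating idea attached to it.
pmm = {
    "planets": ["Jupiter is the biggest planet",
                "Mercury is the closest planet to the Sun"],
    "stars": ["a lighting thing", "shine at night"],
    "galaxy": [],  # term used, but with no elaborating ideas
}

# Extent: total vocabulary/ideas used (concepts plus attached ideas).
extent = len(pmm) + sum(len(ideas) for ideas in pmm.values())
# Breadth: number of distinct concepts on the map.
breadth = len(pmm)
# Depth: richness of the single most elaborated concept.
depth = max(len(ideas) for ideas in pmm.values())

print(f"extent={extent}, breadth={breadth}, depth={depth}")
```

Scoring the same child's map before and after a visit, as in the Lelliott study, lets the four dimensions show where the experience changed the child's thinking.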
Three workshops. Aims for participants:
1.1 To understand evaluation key ideas
1.2 To share evaluation case studies
1.3 To learn about evaluation context

Tomorrow:
2.1 To recap evaluation methods
2.2 To carry out (video) interviews
2.3 To consider data analysis

3.1 To review evaluation reports
3.2 To present demographic information using mapping software
3.3 To consider evaluation strategies
3.4 To present evaluation information