Participatory Pattern Workshops: Application to formative e-assessment (Southampton, Oct. 2009)

DESCRIPTION

Participatory Pattern Workshops: Application to formative e-assessment. An LKL seminar at Southampton University, Oct. 2009.

TRANSCRIPT

  • 1. Southampton, Oct. 2009 Participatory Pattern Workshops: Application to formative e-assessment
  • 2. Problem: The Design Divide. The critical resource is not the capacity to produce, but the knowledge to do it right: the gap between those who have the expertise to develop high-quality tools and resources and those who don't (Mor & Winters, 2008*).
  • 3. Solution... (in architecture)
  • 4. What is a pattern?
    • At its simplest, it is a
      • Generalised solution to a problem
      • Follows a specific structure
    Diagram: Context (when, where, who); Problem (what are we trying to achieve / solve?); Solution (cookbook: ingredients, procedure, expected outcomes).
  • 5. A Burda pattern... Season: Fall; For: Women; Garment Type: Dress; Style: Classic, Evening Wear, Romantic; Material: Taffeta. "If I copy a dress, I can only create the same dress. If I have a pattern, I can create many dresses" (Yim Ping LENDEN).
  • 6. As for software
  • 7. Participatory Methodology for Practical Design Patterns
    • Problem
      • Acceleration -> need for effective protocols for sharing of design knowledge
    • Context
      • interdisciplinary communities of practitioners engaged in collaborative reflection on a common theme of their practice.
      • blended setting: co-located meetings + on-line collaborative authoring system.
    Son, this was my dad's mobile. I want you to have it.
  • 8. The Participatory Pattern Workshops Methodology
  • 9. Collaborative reflection workshop: Facilitate an ongoing design-level conversation between designers and practitioners involved in diverse aspects of the problem domain. Open, trusting and convivial, and at the same time critical, focused and output-directed.
  • 10. Collaborative reflection workshop
  • 11. Case Stories Workshop Engender collaborative reflection among practitioners by a structured process of sharing stories of successful practice.
  • 12. Tell us about...
    • A specific incident
    • That happened to you
    • Where you confronted a challenge / problem
    • And resolved it successfully
  • 13. Be a STARR
    • Situation
      • Describe the context in detail.
    • Task
      • What was the problem you were trying to solve?
    • Action
      • What did you do to solve it?
    • Results
      • What happened? Did you succeed? Did you adjust?
    • & Reflections
      • What did you learn?
    http://www.slideshare.net/yish/star-case-study-template http://patternlanguagenetwork.myxwiki.org/xwiki/bin/view/Cases/
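    A case story in this template can be held as a small structured record. The sketch below is a minimal illustration in Python; the field names simply mirror the STARR headings and are not the schema of the Planet authoring platform.

      # Minimal sketch of a STARR case story as a structured record.
      # Field names mirror the STARR headings; illustrative only, not the
      # schema used by the Planet platform.
      from dataclasses import dataclass

      @dataclass
      class CaseStory:
          situation: str    # describe the context in detail
          task: str         # the problem you were trying to solve
          action: str       # what you did to solve it
          results: str      # what happened; did you succeed or adjust?
          reflections: str  # what you learned

          def as_text(self) -> str:
              # Render the story under its STARR headings.
              return "\n".join(f"{name.upper()}: {value}"
                               for name, value in vars(self).items())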
  • 14. Tell it like it was
    • You don't know what:
      • would have happened...
      • could have happened...
      • should have happened...
      • will happen.
    • You DO know, and only YOU know: what happened.
  • 15. Three hats
  • 16. Pattern Mining Workshop Shift from anecdotes to transferable design knowledge by identifying commonalities across case stories, and capturing them in a semi-structured form.
  • 17. The core template
    • Context
      • Where, when, who (all the things you can't change)
    • Problem (pick one!)
      • We want to do A under condition B but are constrained by C
    • Solution (in any order that works for you)
    Diagram: Context (when, where, who); Problem (what are we trying to achieve / solve?); Solution (cookbook: ingredients, procedure, expected outcomes).
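    The core template itself can be held the same way. Below is a hypothetical sketch that also links a pattern back to the case stories it was mined from; the field names are illustrative, not the Planet data model.

      # Hypothetical sketch of the core pattern template: context (the things
      # you can't change), one problem, one solution, plus links back to the
      # case stories the pattern was mined from. Not the Planet data model.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class DesignPattern:
          name: str
          context: str      # where, when, who
          problem: str      # "we want to do A under condition B but are constrained by C"
          solution: str     # cookbook: ingredients, procedure, expected outcomes
          case_urls: List[str] = field(default_factory=list)  # supporting case stories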
  • 18. Force Mapping
    • Name the forces
    • Give them icons
    • Plot the links and mark + / -
    Forces: constraints or factors that influence the problem. The difficulty of solving the problem arises from tensions between competing forces.
  • 19. Forces
    • Actors
    • Beliefs
    • Conditions
    • Desires
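    A force map can be thought of as a small signed graph: forces (actors, beliefs, conditions, desires) as nodes, with + / - links between them, where the negative links mark the tensions the pattern has to resolve. A rough sketch, assuming nothing about how the workshops actually record their maps; the example forces are invented.

      # Rough sketch of a force map as a signed graph: forces are nodes,
      # links carry +1 (reinforcing) or -1 (tension). Purely illustrative.
      from collections import defaultdict

      class ForceMap:
          def __init__(self):
              self.forces = {}                 # name -> category (Actor/Belief/Condition/Desire)
              self.links = defaultdict(dict)   # name -> {other name: +1 or -1}

          def add_force(self, name, category):
              self.forces[name] = category

          def link(self, a, b, sign):
              self.links[a][b] = sign
              self.links[b][a] = sign

          def tensions(self):
              # Pairs of forces joined by a negative link: where the problem's difficulty lives.
              return {(a, b) for a, nbrs in self.links.items()
                      for b, s in nbrs.items() if s < 0 and a < b}

      # Example forces (invented for illustration):
      fm = ForceMap()
      fm.add_force("tutor time", "Condition")
      fm.add_force("rich personal feedback", "Desire")
      fm.link("tutor time", "rich personal feedback", -1)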
  • 20. Future Scenarios Workshop Validate design patterns by applying them to novel real problems in real contexts.
  • 21. the cycle of design research
  • 22. the cycle of design research
  • 23. formative e-assessment: case stories, design patterns, and future scenarios Caroline Daly, Harvey Mellar, Yishay Mor, Norbert Pachler, Institute of Education, University of London http://feasst.wlecentre.ac.uk/
  • 24. Overview
    • Scoping study commissioned by JISC
    • Short term, small budget, intended to inform future funding frameworks
    • Established a committed user group of higher-education teachers & researchers
    • Adopted and adapted the Planet Project's Participatory Methodology for Practical Design Patterns, and used the Planet platform
  • 25. Methodology
    • Desk research
      • Literature review
      • Comparing frameworks
    • Five Practical Enquiry Days
      • Combination of collaborative reflection, report back from team, and guest plenaries
      • Launch day, three Planet workshops, developers' day
  • 26. What is formative e-assessment?
    • No consistent view in the literature
      • From: practice assessment, or serial (or repeated) summative assessment
      • To: synonymous with learning
    • The use of digital means to support formative assessment
    • Formative features of assessment, which are afforded by specific features of digital media
  • 27. OK, so what do we mean by Formative Assessment?
    • An assessment functions formatively when evidence about student achievement elicited by the assessment is interpreted and used to make decisions about the next steps in instruction that are likely to be better, or better founded, than the decisions that would have been made in the absence of that evidence
    • (Dylan Wiliam)
  • 28. Formative = feedback + moments of contingency
    • "... These create "moments of contingency," in which the direction of the instruction will depend on student responses. Teachers provide feedback that engages students, make time in class for students to work on improvement, and activate students as instructional resources for one another."
    • (Leahy, Lyon, Thompson, and Wiliam 2005)
  • 29. Wiliam's five strategies
  • 30. Conversational Framework (Laurillard)
  • 31. Animated diagram: the Conversational Framework in full use, through a combination of teaching methods (lecture/book/web resource + tutorial/discussion environment + fieldwork/lab/simulation + collaboration environment). It shows the exchanges between teacher, learner and other learners at the discursive/theoretical and practice/practical levels, and illustrates how iterating between theory and practice, and between the teacher, the learner and other learners, is expected to bring the learner's conception progressively closer to the teacher's.
  • 32. Evidence Centred Assessment Design (Mislevy)
    • PRO:
    • high level of abstraction, so room for implementation flexibility
    • delivery process was implementation independent
    • use of design patterns
    • ANTI:
    • full architecture was beyond our needs
    • somewhat removed from the more open-ended forms of work we were seeing in practice
    • too close to implementation detail for the stage we were at
    • did not provide a theory of learning, or of formative assessment
  • 33. A few cases
    • Creature of the week
    • CoMo
    • Post 16 String Comparison
    • Open Mentor
    • ...
  • 34. Creature of the week (Judy Robertson)
    • Situation
      • Large class (138 first- and second-year computer science students). Assignment: create a virtual pet in Second Life.
    • Task
      • Engage and motivate the students
      • show examples of good work which others could learn from
      • show students their work is valued.
      • build a sense of community.
    http://purl.org/planet/Cases/creatureoftheweek
  • 35.
  • 36. CoMo (Niall Winters, Yishay Mor)
    • Situation
      • Royal Veterinary College.
      • Students undertake hospital rotations as part of their training.
    • Task
      • Allow students to capture critical incidents in text and image.
      • Support sharing of clinical experiences and co-reflection.
    http://purl.org/planet/Cases/CoMo
  • 37.
  • 38. Post 16 string comparison (Aliy Fowler)
    • Situation
      • A grammar school has been piloting the string-comparison approach to language teaching at post-16, for AS and A2 level students.
      • Sixth Form level, grammatical consolidation and whole-sentence translation.
    • Task
      • Allow students to practise written language independently and receive feedback on errors in order to improve their language skills.
    http://purl.org/planet/Cases/Post16stringcomparison
  • 39. Solution
    • A bespoke string (sequence) comparator was designed; it uses fine-granularity sequence comparison to compare correct language strings to a user's answer. Students answer questions and the comparator marks up errors in their input, using colour coding (and font style) to highlight the different types of error. If an answer contains errors, the student is given a second attempt in which to correct the submission based on the feedback received.
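    The comparator in the case is bespoke, but the general idea of fine-granularity comparison with colour-coded markup can be sketched with Python's standard difflib; the markup classes below are invented for illustration and are not Aliy Fowler's tool.

      # Illustrative sketch only: fine-granularity comparison of a model answer
      # against a student's answer, marking up extra, missing and wrong segments
      # so they can be colour-coded. Not the bespoke comparator from the case.
      from difflib import SequenceMatcher

      def mark_up_errors(correct: str, answer: str) -> str:
          out = []
          for tag, i1, i2, j1, j2 in SequenceMatcher(None, correct, answer).get_opcodes():
              if tag == "equal":
                  out.append(answer[j1:j2])
              elif tag == "insert":    # extra text in the student's answer
                  out.append(f'<span class="extra">{answer[j1:j2]}</span>')
              elif tag == "delete":    # text missing from the student's answer
                  out.append(f'<span class="missing">{correct[i1:i2]}</span>')
              else:                    # "replace": wrong word or ending
                  out.append(f'<span class="wrong">{answer[j1:j2]}</span>')
          return "".join(out)

      # Invented example input:
      print(mark_up_errors("Ich habe das Buch gelesen", "Ich hab das Buch gelest"))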
  • 40. Open mentor (Denise Whitelock) http://purl.org/planet/Cases/OpenMentor
  • 41. A few patterns..
    • Try Once, Refine Once
    • Feedback on Feedback
    • Classroom display
  • 42. Try Once, Refine Once (Aliy Fowler) http://purl.org/planet/Patterns/TryOnceRefineOnce
  • 43. Problem: Lack of immediate feedback for students leads to fossilisation of errors and misconceptions. However, providing immediate feedback in an iterative fashion can also hinder effective learning, since students are able to "grope their way" step-by-step to a correct solution without necessarily having to think about each answer as a whole.
  • 44. Context
    • Class size
      • Large (30-300)
    • Content
      • Skills / facts
    • Mode of instruction
      • Blended / on-line. Computer tested.
  • 45. Solution
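    The solution slide's content does not survive in the transcript; in prose, judging from the string-comparison case above, the student submits once, receives marked-up feedback, and then has exactly one refinement attempt. A minimal sketch of that interaction, with placeholder functions for answering, feedback and marking.

      # Minimal sketch of the Try Once, Refine Once protocol: one initial attempt,
      # feedback on errors, then exactly one refinement attempt. The answering,
      # feedback and marking functions are placeholders, not part of the pattern text.
      def try_once_refine_once(question, get_answer, give_feedback, is_correct):
          first = get_answer(question)
          if is_correct(question, first):
              return first, "correct on first attempt"
          give_feedback(question, first)     # e.g. colour-coded error markup
          second = get_answer(question)      # single refinement opportunity
          if is_correct(question, second):
              return second, "corrected after feedback"
          return second, "still incorrect; no further iteration (avoids step-by-step groping)"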
  • 46. Feedback on Feedback (Linda McGuigan) http://purl.org/planet/Patterns/FeedbackonFeedback
  • 47. Problem
    • Good feedback should -
    • Alert learners to their weaknesses.
    • Diagnose the causes and dynamics of these.
    • Include operational suggestions to improve the learning experience.
    • Address socio-emotive factors.
    Tutors know this, but are pressed for time, or are not aware of their feedback strategies. Large teaching organisations are not equipped to provide tutors with personal feedback on their teaching.
  • 48. Context
    • Large scale, technology supported, graded courses
      • many tutors instructing many students.
    • Feedback is mediated by technology that allows it to be captured and processed in real time
    • Topic of study is subject to both grading and formative feedback.
  • 49. Solution
    • Embed a mechanism in the learning and teaching system that regularly captures tutor feedback, analyses it, and presents tutors with a graphical representation of the types of feedback they have given. Ideally, this should also include constructive advice on how to shift from less to more effective forms.
    • In computer supported environments (e.g. VLEs), this mechanism could be integrated into the system, providing tutors with immediate analysis of their feedback, as well as long-term aggregates.
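    In a VLE, the capture-and-analyse mechanism could be as simple as classifying each feedback comment by type and aggregating the counts per tutor, ready to chart. A rough sketch; the category names and keyword cues below are invented for illustration, not a scheme from the project.

      # Rough sketch: aggregate the types of feedback each tutor gives so the
      # profile can be charted back to them. Categories and keyword cues are
      # invented; a real system would use the course team's own tagging scheme.
      from collections import Counter, defaultdict

      CATEGORIES = {
          "weakness":      ["error", "incorrect", "weak"],
          "diagnosis":     ["because", "caused by"],
          "suggestion":    ["try", "next time", "you could"],
          "socio-emotive": ["well done", "good effort", "don't worry"],
      }

      def classify(comment: str) -> str:
          text = comment.lower()
          for category, cues in CATEGORIES.items():
              if any(cue in text for cue in cues):
                  return category
          return "other"

      def feedback_profile(comments_by_tutor):
          # comments_by_tutor: dict of tutor -> list of feedback comments.
          profile = defaultdict(Counter)
          for tutor, comments in comments_by_tutor.items():
              for comment in comments:
                  profile[tutor][classify(comment)] += 1
          return profile   # counts per type, ready to plot per tutor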
  • 50. Classroom Display http://purl.org/planet/Patterns/Classroomdisplay
  • 51. Problem
    Using learner-generated content:
    • rewards participation
    • relates to learners' personal experiences
    • offers a window on student conceptions.
    But:
    • works need to be collated in a single, easy-to-access location
    • learners may be uncomfortable about presenting their work in public
    • there may be legal or other restrictions on sharing work.
  • 52. Context
    • Class size:
      • Small / medium (6-60)
    • Mode of instruction:
      • Blended (preferable)
    • Time frame
      • Continuous, over a period
    • Pedagogy
      • Involves construction / media production
  • 53. Solution
  • 54. Augmented domain map
  • 55. Example scenario
    • When using Try Once, Refine Once, there is a risk that high-achievers do not receive feedback.
    • So -
    • Use Showcase Learning to celebrate students' work and provoke feedback from peers and tutors.
    • Use Feedback on Feedback to alert tutors to the problem.
  • 56. Conclusions
    • Tip of the iceberg
    • Practitioners (educational / software) acknowledge the value of patterns, when served with side dishes of cases + scenarios
    • Collaborative elicitation of patterns from cases could be a potent form of professional development.
  • 57. Thank you
    The Pattern Language Network project: http://patternlanguagenetwork.org
    The Learning Patterns project: http://lp.noe-kaleidoscope.org/
    The formative e-assessment project: http://feasst.wlecentre.ac.uk/
    This presentation: http://www.slideshare.net/yish/ppw_feasst_sh
    Yishay Mor: people.lkl.ac.uk/yishay, [email protected], @yishaym
    Niall Winters: www.lkl.ac.uk/naill, [email protected], @nwin
    Harvey Mellar: lkl.ac.uk/people/mellar.html, [email_address]