Adopting WebPA: implications for assessment strategy & course management for cohorts in Brand Management

Ian Hesketh (Academic & Business Liaison) & Alessandro Feri (Lecturer in Marketing)

159 students · 3 tutorial groups · 40 self-elective groups

As we look to adopt peer assessment in online courses, a review of assessment strategy and course management is required alongside implementation.

In Brand Management, the course tutors and I looked at the elements that needed to be in place in order to make effective use of WebPA, which is available as a Moodle plugin.

The course grade comprised an Individual Report weighted at 50% and a group component carrying the remaining 50%, itself split into a Group Report (80%) and a WebPA Peer Assessment (20%).

After adding the grade items we applied the weightings to Moodle's Gradebook calculations.
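As a quick check of how these nested weightings resolve, the arithmetic can be sketched as follows (plain Python, purely illustrative; the item names are hypothetical and nothing here is part of the Moodle configuration itself):

    # Course-level weights implied by the nested category structure described above.
    # Item names are hypothetical; the values come from the weightings stated in the text.
    course_weights = {"individual_report": 0.50, "group_category": 0.50}
    group_split = {"group_report": 0.80, "webpa_peer_assessment": 0.20}

    effective = {"individual_report": course_weights["individual_report"]}
    for item, share in group_split.items():
        effective[item] = course_weights["group_category"] * share

    print(effective)
    # {'individual_report': 0.5, 'group_report': 0.4, 'webpa_peer_assessment': 0.1}
    # i.e. the WebPA element is 20% of the group category but at most 10% of the course total.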

Tutors developed new Grademark rubrics for the Group & Individual Reports.

Categories and Items settings were then configured for calculation purposes.

The next stage was to design WebPA Assessment Criteria forms. As with many peer assessment approaches we focussed on participation, communication and time management, informed by Pollard et al.'s (2014) model of reflective teaching, with the aim of fostering objective reflection against broader group roles and functions.

The fourth criterion considered Contribution in terms of ideas and quality of work; although a single element in the Peer Assessment, its linkage to the criteria used for the Group assessment rubric was intended to reinforce student awareness of specific assessment requirements.

These forms can be attached to one or more assessments.
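For illustration, the resulting form might be represented as below (a hypothetical sketch; only the four criterion names come from the design described above, and the point scale is invented):

    # Hypothetical rendering of the WebPA Assessment Criteria form described above.
    # Only the four criterion names come from the text; the point scale is illustrative,
    # not the form actually used on the course.
    criteria_form = [
        {"criterion": "Participation",                            "scale": (0, 5)},
        {"criterion": "Communication",                            "scale": (0, 5)},
        {"criterion": "Time management",                          "scale": (0, 5)},
        # Deliberately linked to the Group Report rubric to reinforce awareness
        # of the specific assessment requirements.
        {"criterion": "Contribution (ideas and quality of work)", "scale": (0, 5)},
    ]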

We had to decide how to form work groups: for the UK-based instance we allowed students to choose a group from within the larger tutorial group.

This was facilitated by the use of access-restricted Group Choice activities in Moodle, with the final groups replicated in WebPA.

One member of each work group submits the Group Report, and tutors grade the work using Grademark.

Within WebPA the score for each group is set, a Marking Sheet is created and a suite of reports becomes available. One of these, the individualised score, is imported into Moodle to form part of the category and course totals.

We found that after import a small number of students were concerned about their grade, largely because they believed that, having all awarded each other the highest possible scores within WebPA, they would receive the full peer assessment element of the group component (20% of the category total, a maximum of 10% towards the course total).

In essence the individualised score is capped by each group's performance against the criteria for the Group Report, rather than being a simple award of additional marks based on peer assessment. This needs to be reviewed to improve student understanding of the various processes comprising the grading strategy, or the opportunity for improved feedback that peer assessment offers (Brown, 1997) may be missed.

As suggested by Falchikov (2005), we are looking to involve students in the review process, which may improve both the ongoing acceptance and the reliability of the group work rubrics and WebPA assessment forms.

As with many quantitative and qualitative approaches to group assessment, available marks are initially viewed as evenly distributed between group members, and students adjust the proportion of each criterion allocated to their peers. In this sense the WebPA element can be viewed as a mechanism for students confirming tutor-awarded grades if they all award each other top marks; variation in individual scores stems from the level of disagreement within any given group.

A situation therefore arises where a low-performing group can assess each other equally positively but not gain as much as they expected from the exercise, whereas a high-performing but fractious group can see greater variation and marked 'winners' and 'losers'.
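To make the mechanism concrete, here is a minimal sketch of that proportional split (plain Python, illustrative only; it is not the WebPA plugin's exact algorithm and omits its PA weighting, non-submission penalty and score-capping options, and the group data are invented):

    # Minimal sketch of a proportional peer-assessment split, assuming every
    # member rates every member. Not the WebPA plugin's exact algorithm.
    def individualised_marks(group_mark, ratings):
        """ratings: {rater: {ratee: points awarded}} for one work group."""
        factors = {member: 0.0 for member in ratings}
        for awards in ratings.values():
            total = sum(awards.values())          # normalise each rater's awards
            for ratee, points in awards.items():
                factors[ratee] += points / total  # an average member ends up at 1.0
        return {member: round(group_mark * factor, 1) for member, factor in factors.items()}

    # Unanimous top marks: every factor is 1.0, so each member simply receives
    # the tutor-awarded group mark (the 'confirmation' case described above).
    unanimous = {m: {"A": 5, "B": 5, "C": 5} for m in ("A", "B", "C")}
    print(individualised_marks(60, unanimous))   # {'A': 60.0, 'B': 60.0, 'C': 60.0}

    # A fractious group with the same group mark sees marked 'winners' and 'losers'.
    fractious = {"A": {"A": 5, "B": 3, "C": 2},
                 "B": {"A": 4, "B": 5, "C": 3},
                 "C": {"A": 5, "B": 3, "C": 4}}
    print(individualised_marks(60, fractious))   # {'A': 75.0, 'B': 58.0, 'C': 47.0}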

With non-cooperative groups seeing greater levels of variation in individual scores than cooperative groups, there should be reflection both on the construction of peer assessment criteria, so that they include objective or measurable elements, and on whether the mechanisms for group formation can negatively impact performance.

Conclusion & further work

The issues of self-selection versus allocation will form the basis of a comparative analysis following completion of the peer assessment for the Hong Kong occurrence, where student group choice was disallowed.

Areas of particular interest are the gender balance of groups and its impact on peer evaluation. Webb (1984) discussed gender dominance of groups in relation to levels of cooperation, and using WebPA we will be able to cross-reference group perceptions of colleague contributions with tutor-based evaluation of group performance. However, we posit that individuals require time to identify their 'natural' group and team roles, yet within a modular environment it is unusual for the same group to be formed for multiple assignments. As this may affect an individual's ability to contribute, further work on the design and scoring of group work exercises is required.

Student access to the group exercise rubrics was largely left to the mechanism within Grademark, and we should further ensure that students are aware of it. WebPA assumes that all members of a group receiving the same score for a piece of work is unfair; however, there is a risk that without careful review and implementation of group work rubrics and assessment criteria we could simply be switching from one type of unfairness to another.

References

• Brown, G. (1997). Assessing student learning in higher education. London: Routledge.

• Falchikov, N. (2005). Improving assessment through student involvement: Practical solutions for higher and further education teaching and learning. London: Routledge Falmer.

• Pollard, A., Black-Hawkins, K., Pollard, P. A., & Hodges, G. C. (2014). Reflective teaching in schools (4th ed.). London: Continuum Publishing.

• Webb, N. M. (1984). Sex differences in interaction and achievement in cooperative small groups. Journal of Educational Psychology, 76(1), 33-44.
