Work-place based assessments (WBAs): an update
ETHICS/EDUCATION
Kevin Hayes
Emily Easter
Abstract
Work-place based assessments (WBAs) are now embedded in the curricula of nearly all Medical Royal Colleges as the principal tools for the assessment of ongoing clinical training. They are also increasingly being used in most medical schools. They have been in place for more than 4 years, yet multiple problems still exist in their utility and deliverability. There appears to be a significant mismatch between the original educational theory supporting the WBA tools and their practical day-to-day usage. Most of these problems relate to their rushed implementation and apparent lack of clear purpose, compounded by conflict with the increasing pressures of clinical service delivery. Most recent work in this area has concentrated on defining their role in keeping with the supporting educational evidence and on redesigning the forms to reflect this. This review outlines the educational rationale behind these proposed and ongoing changes and how they are hoped to influence the practical future direction of WBAs.
Keywords anchor statements; AoMRC; formative; summative; Supervised Learning Events; Work-place based assessments (WBAs)
Introduction
So, where are we then, more than 4 years after the implementation of WBAs as the "way forward" in the assessment of clinical training? On a positive note, data from the Foundation programme, and indeed several Royal Colleges, tell us that these tools are being used in very large numbers in the UK and that the e-portfolio in particular captures these data extremely well. Trainee and trainer surveys confirm earlier findings that both groups generally find them educationally valid, and there also appears to be a significant increase in the number of "trained" trainers.
On a less positive note, a significant number of trainees and trainers remain unclear as to the purpose of WBAs, a point backed up by persistent and significant variations in their use between and within specialities. The single biggest problem remains deliverability, as time pressures continue to prevent these assessments occurring as regularly as would be ideal.
Kevin Hayes MRCOG is a Senior Lecturer and Consultant in Obstetrics
and Gynaecology and Medical Education at St George’s University of
London, London, UK. Conflicts of interest: none.
Emily Easter BSc (Hons) is a penultimate year medical student at
St George’s University of London, London, UK. Conflicts of interest: none.
OBSTETRICS, GYNAECOLOGY AND REPRODUCTIVE MEDICINE 22:7 205
These issues have recently been addressed and the way forward
outlined.
Why the need for change?
There is no doubt that doctors' abilities tested in the actual day-to-day environment are more predictive of real performance than assessment in a controlled and deconstructed way. WBAs certainly fulfil this role; they do, however, need improvement. In June 2010 a meeting took place in London between the Academy of Medical Royal Colleges (AoMRC), the Conference of Postgraduate Medical Deans (COPMeD), the Medical Schools Council and the General Medical Council (GMC). It had become apparent that the current WBA situation could not continue.
Nearly all Medical Royal Colleges were represented, as the issues were truly cross-collegiate and indeed continuous with undergraduate education. In fact, in its 2011 guidance document "Assessment in undergraduate medical education", the GMC outlined the importance of WBAs in medical school training in preparing students for the Foundation programme. The London meeting was an attempt to redress the obvious continuing issues surrounding WBAs, including: a lack of clear purpose leading to variable practice, particularly in relation to the Annual Review of Competence Progression (ARCP); the naming of these tools; the wording of the assessment forms themselves; and issues around deliverability.
Re-defining the purpose of WBAs
One of the single biggest sources of confusion with WBAs has
been whether they are formative, summative or both. The tools
were originally designed essentially as formative ones, where activities were observed and contemporaneous feedback was given: assessment for learning. Nearly all work in this area has
reported that these tools perform well in this respect and that the
feedback given to trainees enhances learning. There has also
been considerable use of these tools in a summative fashion. The
original OSATs form had a pass/fail judgement (since removed)
and performance in WBAs has been used to a varying degree at
annual reviews as a possible barrier to progression. It is vital to
make it clear to all concerned whether a WBA is being used for
formative or summative purposes.
What’s in a name?
Formative WBAs are planned to be re-named Supervised
Learning Events (SLEs). This describes their formative nature
and avoids the word assessment which in itself can lead to
undesirable assessment-orientated behaviour. The whole
emphasis should therefore become the feedback and learning
elements rather than a “tick box exercise”. These events will be
part of a trainee’s portfolio but it is only the occurrence of them
that is documented and not the outcome i.e. that they occur and
when they occur. They are designed to occur regularly
throughout a period of training and not all clumped together just
before an ARCP, and regular educational supervisor meetings
should help to ensure that this is happening. There will be no minimum number of SLEs required, but the more that are performed, the more useful it is for a trainee's development. It is not envisaged that there will be any specific sanction if regular SLEs fail to occur, but their record helps to form part of an overall picture of a trainee, especially one in difficulty, where engagement in the educational process may be lacking. In O&G, mini-CEXs and Case Based Discussions (CBDs) fit this model particularly well and are likely to be used as such.
Summative WBAs will continue to be called WBAs, as they are by definition assessments that count towards attainment of training milestones. OSATs for surgical and technical procedures and Team Observations (TOs) are likely to be used in this way.
TOs will not be expanded upon as their place is well established
and no changes in their current use are envisaged. As a surgical
speciality we have an obligation to ensure our trainees are
competent at particular procedures at particular times in their
training, hence the mandatory and summative nature of the
assessments. The RCOG curriculum has defined the core proce-
dural competencies required by certain times in the training
programme.
So, how many OSATs are required to say a trainee is competent at a procedure? As a general rule, the more OSATs a trainee performs per procedure (e.g. diagnostic laparoscopy), the more reliable our assessment of their competence will be.
However, as has already been stated, there has to be a degree of
pragmatism regarding numbers as the pressure of clinical service
is the biggest barrier to doing these assessments. Hence a minimum of three OSATs judged as competent, by at least two different assessors, for each core procedure per year is felt to be necessary. Decisions on minimum numbers will need to
be speciality specific and indeed medical school specific to reflect
their own curricula. Increasing complexity of cases will be
assessed in the later years of training so that advanced trainees
are being judged on more challenging procedures e.g. a placenta
praevia section rather than an elective breech section. The
process is designed to be driven by trainees and they should
request summative OSATs when they feel ready to do them
within any year of training. There will be a global judgement
decision on the assessment form of “competent” or “working
towards competence” for any given procedure. For both the
assessed and assessors, it is vital that honest and constructive
judgements are made and that “working towards competence”
judgements are not seen as “failure” but rather as part of surgical
development over time. A trainee working towards competence
will simply be required to perform extra WBAs until the required
number of competent procedures has been attained. Where
persistent surgical/procedural issues remain, this will clearly
indicate a role for targeted training for an individual trainee.
The distinction between SLEs and WBAs has been compared to driving lessons (they have occurred but their outcome is not known to an assessor) and the driving test (a documented summative judgement on safe, competent driving ability) respectively.
This duality of purpose is not a problem as long as it is clear
beforehand to all concerned which type of assessment is being
used. There is currently a study looking at this change in the
Foundation programme and the findings are awaited.
Documentation changes
Three major changes are proposed to the current assessment forms (paper or electronic): they will all lose any numerical rating or Likert-type scales; anchor statements will be introduced to help assessors make judgements; and the area for feedback will become the main feature. Clearly, in time, all trainees will be using an e-portfolio and paper forms will become obsolete.
It has long been known that rating scales induce particular types of assessment behaviour, namely trainees wanting to score highly and/or get top marks in everything, and score inflation ("right-sided marking") by trainers. The "score" then becomes the focus of the whole process. By simply removing scores from all assessments, these behaviours are removed and the focus automatically becomes feedback and advice on areas for development, i.e. formative assessment for learning.
Narrative descriptors (or anchor statements) regarding levels of ability will aid assessors in making judgements: they serve to guide the assessor on the observed activity only, rather than make them "score" a trainee. These statements will vary, but examples for laparoscopy could be: "Able to perform the basic procedures but needs significant help with additional manoeuvres" or "Is able to perform the entire procedure safely and without the need for supervision".
The current forms have a limited area for feedback; this is going to increase significantly to form the single biggest area, in line with the educational ethos: verbal and written feedback will be a mandatory part of all SLEs and WBAs.
SLEs, WBAs and the ARCP process
A trainee's requirements for the ARCP in all years of training are now well defined in the RCOG curriculum, including the current WBAs being used. When a trainee comes to an O&G ARCP in the future, the main supporting documentation will be the Educational Supervisor's (ES) report (in the process of being revamped and improved), their logbook, TO forms and a mandatory number of WBAs specified for each particular stage. It is the overall review of these factors that determines the ARCP outcome for a trainee. As described, SLEs will not form any part of the documentation: any major issues with engagement with the process should be subsumed into the ES report if deemed necessary, usually when a trainee is in difficulty. Again, different
specialities and medical schools will have the responsibility of
deciding on their own assessment strategies and requirements.
Assessing more difficult things in the future?
A particular area of interest currently is the assessment of human
factors in day-to-day work. While it is clearly desirable to assess
that a trainee can perform a competent and safe Ventouse
delivery, it is arguably more important to assess the decision
making, team working and situational awareness of the trainee
on the labour ward that day, who decides to perform the
procedure in the first place. None of our current assessment tools truly addresses these factors. The Acute Care Assessment Tool (ACAT), used by the physicians to assess many of these traits, initially looked promising. The RCOG assessment subcommittee piloted its use in a labour ward setting, but it became apparent that even enthusiastic units found it too time consuming and that it could not realistically be delivered. The Non-Technical Skills for Surgeons (NOTSS) assessment tool, which is less time consuming and also assesses these other important skills, is currently under consideration as a plausible alternative, with a pilot study ongoing; watch this space!

Practice points
• WBAs are now being used from undergraduate training onwards and are here to stay
• Existing WBA tools are being renamed to reflect whether they are being used for formative or summative purposes
• They will be known as Supervised Learning Events (SLEs) and Work-place based assessments (WBAs) accordingly
• SLE forms are changing to reflect an emphasis on feedback, and numerical rating scales are being removed
• WBAs will be a mandatory part of training, with a specified minimum number to be performed
• Pilot work and consultation are already occurring nationally to support these changes
The way forward
One of the biggest mistakes of the original WBA implementation was unrealistic timescales leading to a rushed start date, with inadequate time and resources devoted to training and information provision. This led to most of the current issues, with the lack of an explicit, clear purpose top of the list. It would appear that the mistakes of the past have been heeded, and a clear message at the AoMRC meeting was not to do the same again. Time is being taken, with no final decisions until data are forthcoming from the current work in the Foundation programme, individual Royal Colleges and medical schools to back up these significant changes and provide evidence of their utility and deliverability.
The individual Colleges are “getting the message out there” via
College Tutors meetings and web-based information and seeking
the views and comments of these pivotal local educational
members. Training will need to be provided nationally in a systematic way for all trainees and trainers, including consultation on the new forms as well as on the practical changes to daily assessment practice. Ongoing evaluation will clearly also be vital. This work will again be the role of the individual Colleges, and the RCOG is intimately involved in the proposed changes.
Deliverability, particularly due to lack of time in trainers’ job
plans, continues to present a significant barrier to WBAs. Trusts
are in difficult financial positions and hence educational issues
tend to be secondary to income generating clinical activities.
While these difficulties are acknowledged at Academy and
Collegiate level, there is no easy solution to the problem. Much
work needs to be done at national and local levels to ensure
adequate time and resources are allocated to training. Ultimately
Trusts need well trained doctors and common sense dictates that
the situation will reach equilibrium in time.
Summary
WBAs are a highly valid way of assessing day-to-day real clinical
practice and nearly all Royal Colleges are committed to their use.
Medical schools are increasingly using these tools, and the future should see a seamless transition in the use of WBAs from medical school into postgraduate training. Much needed changes to current WBA theory and practice are under way, moving to a clearly defined purpose that distinguishes between assessment for learning and assessment of learning. This is reflected in changes to the names of the tools and the forms used to document them. WBAs will also hopefully now link in with
the end of year ARCP process in a transparent and uniform way
regardless of the region in which a trainee works. The appro-
priate process, time and resources appear to be in place this time
around to ensure that these changes make our clinical assess-
ment of doctors robust and fit for purpose. The mistakes of the
past will hopefully soon be a distant memory but local Trust
engagement is also vital to ensure that these tools are actually
deliverable in a meaningful way.
FURTHER READING
Assessment in undergraduate medical education. London, UK: General
Medical Council, 2011.
Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Med Educ 2012; 46: 28–37.
Crossley J, Marriott J, Purdie H, Beard JD. Prospective observational study to evaluate NOTSS (Non-Technical Skills for Surgeons) for assessing trainees' non-technical performance in the operating theatre. Br J Surg 2011; 98: 1010–20.
Hayes K. Work-place based assessments. Obstet Gynaecol Reprod Med 2010; 21: 52–4.
Jenkins J. Workplace based assessment forum supporting paper. London,
UK: General Medical Council, 2010.
The state of basic medical education. London, UK: General Medical
Council, 2010.
Workplace based assessment forum outcomes. London, UK: Academy of Medical Royal Colleges (AoMRC), 2010.
© 2012 Elsevier Ltd. All rights reserved.