Regional Assessment Workshop: Ability · Understanding every child… from Reception to Y12


Regional Assessment Workshop: Ability

#GLRAW

Assessment: a whole-pupil approach model:

• Attainment

• Ability

• Identifying barriers to learning

Standardised Age Scores (SAS) are vital for the meaningful comparison of assessment data.

The Complete Digital Solution (CDS®) reflects and supports that model in a ready-made digital package of the relevant assessments, offering critical insights and contextualised data on individuals and groups.

A whole-pupil approach, supported by the CDS

Progress Test in English®

Progress Test in Maths®

Progress Test in Science®

Dyscalculia Screener

Dyslexia Screener

Kirkland Rowell®

PASS®

Cognitive Abilities Test 4®

New Group Reading Test®

Single Word Spelling Test

Now including the GL Assessment Baseline®

Understanding every child…

From Reception to Y12

Whole-pupil approach:

Ability, exemplified by CAT4

Cognitive Abilities Test 4

By the end of the session you will:

● Understand the data from all the CAT4 reports and their implications for the school, groups and individuals;

● Be able to use CAT4 indicators appropriately to set meaningful and challenging targets and provide a ‘value added’ measure;

● Be able to use SAS to identify the appropriate pace and challenge for all students, from those needing ‘stretch’ (Gifted and Talented) to those likely to need support;

● Understand the benefits and possible pitfalls of external stakeholder reporting;

And in common with some of the other workshops, you will:

● Begin developing simple and effective ways to communicate data in school;

● Be able to understand common standardised terms and appreciate the implications of using standardised data;

● Be able to use the ‘combination assessment’ reports to highlight underachievement and coasting;

● Be able to combine assessment data from Attainment, Ability and Surveys to better understand an individual’s needs and strengths;

● Have a comprehensive set of tools to help you cascade the workshop learning back in school.

Session and post-session task

As we go through the module it may be useful to jot down your thoughts on next steps.

Some possible development areas include:

● Tracking, monitoring and progress;

● Informing teaching and learning (internal stakeholders);

● Reporting to external stakeholders;

● Integration into the school’s assessment structure.

Thoughts on data communication might include: who, when, what, why, and how.

Measuring Ability: Cognitive Abilities Test (Edition 4)

● Thoughts on data communication (up and down) must be carefully considered.

● It might include: who, when, what, why, and how.

● As we go through the module, keep these in mind regarding the data from the reports and what the next steps might be.

Data Communication

How are ability assessments different from measures of attainment?

Ability is about:

● Basic ‘building blocks’ of learning — ability to recognise similarities, analogies, patterns and relationships; used to make sense of the world, whenever and wherever we learn;

● General, transferable skills. Material is kept simple, clear and familiar with minimal specialised content or knowledge — context reduced;

● Important for CAT4’s use is this contrast with attainment tests, which assess specific skills/knowledge that have been directly taught as part of the curriculum — context laden.

Why use ability measures?

● To learn more about a student’s ability level, measured outside of current performance levels and unencumbered by many learning barriers.

● To identify performance/ability mismatches and explore reasons behind them.

● To identify cognitive strengths and areas where understanding will need support with teaching and learning.

Verbal Reasoning

Verbal Classification

Verbal Analogies

Quantitative Reasoning

Number Analogies

Number Series

Spatial Ability

Figure Analysis

Figure Recognition

Non-Verbal Reasoning

Figure Classification

Figure Matrices

How can ability be measured?

- Reasoning tests in CAT4

What do we learn from the individual battery scores?

● Are there any problems that might arise in getting accurate reasoning measures from a particular battery?

● Do you think any one battery might show:

• a better correlation to future examination success than the others? – implications

• the best indicator of potential performance, if there were no barriers to learning?

CAT4 Group Reporting

CAT4 results will help you decide about the pace of learning that is right for a student and whether additional support or challenge is needed. This includes what Standardised Age Scores might tell us about the identification of Gifted and Talented (G&T) and other ability groups.

Appropriate Challenge?

Case study students (mean SAS)

Student Mean (SAS)

Sara Shafiq 125

Natasha Aransola 81

Samera Kan 116

Lara Sandford 114

Mia Shimizu 114

Kareena Kahn 108

Susan McGregor 110

Zaynab Ashfiq 98

Kirsty Freeman 133

Standardised Age Scores — a national distribution/benchmark

The ‘average’ is set to a SAS of 100; 50% of students get this score or higher.

A SAS of >125 is achieved only by the 5% of students with the highest ability, who are likely to form part of a gifted and talented group.

Ability grouping   Stanine(s)    Percentage of pupils
Very High          9             4%
Above Average      7 and 8       12% + 7% = 19%
Average            4, 5 and 6    17% + 20% + 17% = 54%
Below Average      2 and 3       7% + 12% = 19%
Very Low           1             4%

CAT4 standardised scores and stanines
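To make these figures concrete, here is a minimal sketch in Python (not part of the workshop materials) of how a SAS converts to a national percentile and stanine, assuming the usual normal model with mean 100 and standard deviation 15; the stanine cut-offs below are the standard percentile boundaries.

from statistics import NormalDist

# SAS is conventionally standardised to a mean of 100 and an SD of 15.
SAS_DIST = NormalDist(mu=100, sigma=15)

# Standard stanine boundaries as cumulative percentiles:
# stanine 1 = bottom 4%, 2 = next 7%, ..., 9 = top 4%.
STANINE_CUTOFFS = [4, 11, 23, 40, 60, 77, 89, 96]

def sas_to_percentile(sas: float) -> float:
    """Percentage of the national sample scoring below this SAS."""
    return SAS_DIST.cdf(sas) * 100

def sas_to_stanine(sas: float) -> int:
    """Map a SAS onto the nine-point stanine scale."""
    pct = sas_to_percentile(sas)
    return 1 + sum(pct >= cut for cut in STANINE_CUTOFFS)

print(round(sas_to_percentile(125)))  # ~95: a SAS of 125 sits in the top ~5%
print(sas_to_stanine(130))            # 9: the 'Very High' ability grouping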

Case study students (SAS)

Student Verbal Quantitative Non-verbal Spatial Mean

Sara Shafiq 130 120 122 126 125

Natasha Aransola 80 79 83 81 81

Samera Kan 113 116 115 120 116

Lara Sandford 97 111 121 126 114

Mia Shimizu 122 111 112 112 114

Kareena Kahn 105 114 105 110 108

Susan McGregor 108 103 117 113 110

Zaynab Ashfiq 83 97 115 95 98

Kirsty Freeman 131 131 133 136 133

● Should very high ability scores ‘trigger’ G&T discussions? If so, across the board or in specific batteries?

● Are different SAS scores better ‘indicators’ for G&T for some subjects than others?

● Why might students with very high SAS scores not be presenting as G&T?

● Look at mean SAS and ‘battery’ SAS for students already identified as G&T. Are there any ‘mismatches’?

Helping identify potential G&T students

G&T possible implications

● If you have high-ability pupils who have not up to now been identified as Gifted and Talented (G&T), how does their performance compare to those who have been identified previously? Are their targets challenging enough?

● Looking at students who have previously been identified as G&T, yet appear to have a lower ability profile, how are they performing compared to others of a similar ability profile?

● Have they thrived or struggled as a result of the greater challenge?

Verbal/Non-verbal Standardised Age Score differences

Student Verbal Non-verbal

Ruby Williams 97 136

Kirsty Freeman 131 133

Sara Shafiq 130 122

Lara Sandford 97 121

Susan McGregor 108 117

Samera Kan 113 115

Zaynab Ashfiq 83 115

Mia Shimizu 122 112

Kareena Kahn 105 105

Natasha Aransola 80 83

Some important points on Verbal Reasoning scores

● For students with high scores in the Non-verbal, Quantitative or Spatial batteries, if the Verbal scores are in stanines 1 to 3 then the pupils are still likely to experience problems across the curriculum.

Recommendation:

● Students gaining Verbal scores in stanines 1–3 should have a reading assessment administered, to gain further insight into how accessible curriculum material is to them (a minimal filtering sketch follows this list).

● How can we work with a student’s cognitive strengths, and challenge and scaffold weaker areas?

● What are the implications for specific subject areas?

● Generalisations may help as a first step (handout).
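As a minimal sketch of how that recommendation could be applied to a cohort export (the records and field names below are hypothetical, not a GL export format):

# Hypothetical cohort records; field names are illustrative only.
students = [
    {"name": "Zaynab Ashfiq", "verbal_stanine": 3, "nonverbal_stanine": 7},
    {"name": "Lara Sandford", "verbal_stanine": 5, "nonverbal_stanine": 8},
]

# Flag a follow-up reading assessment when the Verbal battery sits in
# stanines 1-3, even where other batteries are strong.
needs_reading_assessment = [s["name"] for s in students if s["verbal_stanine"] <= 3]
print(needs_reading_assessment)  # ['Zaynab Ashfiq']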

Cognitive strengths and weaknesses

The ‘Learning Profile’

Implications for teaching and learning

Personalising learning issues

Figure Classification & Figure Matrices

• Use both verbal and spatial thought processes in combination

• Good indication of overall ability, independent of reading skill or spoken English

• Shape distinctions are easy, so difficulty lies in the reasoning aspect

Figure Analysis & Figure Recognition

• Involve creating, remembering and mentally manipulating precise mental images

• Difficulty lies in the visualising aspect

• Good indication of potential to do well in particular subjects and careers

Non-Verbal Reasoning and Spatial Ability

The Verbal Reasoning and Spatial Ability batteries form the basis of this analysis, and the profiles are expressed as a mild, moderate or extreme bias for verbal or spatial learning or, where no bias is discernible (that is, when scores on both batteries are similar), as an even profile.

Learning profiles
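A hedged sketch of this banding logic follows; the numeric thresholds are illustrative assumptions only, since GL derive the actual bands from the standardisation data.

def learning_profile(verbal_sas: float, spatial_sas: float) -> str:
    """Classify a verbal/spatial profile from the two battery SAS values.

    Band thresholds are illustrative assumptions, not GL's published cut-offs.
    """
    diff = verbal_sas - spatial_sas
    side = "verbal" if diff > 0 else "spatial"
    gap = abs(diff)
    if gap < 8:
        return "even profile"
    if gap < 15:
        return f"mild {side} bias"
    if gap < 25:
        return f"moderate {side} bias"
    return f"extreme {side} bias"

print(learning_profile(97, 126))   # Lara Sandford: extreme spatial bias
print(learning_profile(131, 136))  # Kirsty Freeman: even profile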

Are all relative difference (‘bias’) students the same?

[Scatter graph: Verbal SAS and stanine plotted against Spatial SAS and stanine, with regions of increased verbal bias and increased spatial bias running from lower to higher general ability.]

Example: Daniel’s scatter graph

Extreme verbal bias

Moderate verbal bias

Mild verbal bias

No bias

Mild spatial bias

Moderate spatial bias

Extreme spatial bias

Daniel Rivera

How much can ability alone tell us?

Background information

Daniel is a student with English as his first language. His attendance is very good. His teachers report that he is generally considered a reasonable, but not outstanding, student. In terms of extra-curricular activities he is captain of the football team and a member of the art class.

Mark Wilkinson on YouTube

https://www.youtube.com/watch?feature=player_embedded&v=ab1_8SwUMNM

Mark Wilkinson’s experience

Unlike Daniel from the prior example, Mark felt a huge ‘disconnect’ with his learning experience.

• Students with an extreme verbal preference may be characterised as sequential learners – learning best through step-by-step instruction, repetition, consolidation and review of information.

• The spatial learner may be the antithesis, making it difficult to attend to his or her learning needs, which will be the exception in most classrooms.

It’s not just links to STEM subjects — the Spatial Learner may have other needs

Sequential                               Spatial
Written or oral directions               Visual directions
Step-by-step learning                    Whole-to-part learning
Benefits from repetition and revision    Once acquired, learning sticks
Computation                              Concepts
Algebra                                  Geometry
Phonics                                  Sight words
                                         May be late or later developer

The Spatial Learner

• Are those pupils who have a strong spatial bias (compared to verbal) displaying the key characteristics of this learner profile?

• Are there reports of the students looking disengaged and distracted? Could this be when they are processing/thinking really hard?

• Have they had opportunities to explore their STEM potential?

The Spatial Learner in the classroom

Tracking and monitoring, target-setting, value-added measure

Indicators: a national benchmark

Target-setting: using national benchmarked performance levels

How might indicators help in the target-setting process?

• The CAT4 indicators are derived from the performance of students who scored similarly on the CAT in the past.

• They make very useful benchmarks of performance, both for the individual and for value-added for groups and the school.

Performance indicators
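The derivation can be sketched as follows, using a small hypothetical set of historical records: group past students into CAT4 mean-SAS bands, then read off the distribution of the grades those students eventually achieved. That distribution is also what a ‘chances chart’ displays.

from collections import Counter, defaultdict

# Hypothetical historical records: (CAT4 mean SAS, eventual GCSE grade).
history = [(112, 6), (114, 7), (110, 6), (113, 5), (111, 6), (115, 8)]

def sas_band(sas: float, width: int = 5) -> int:
    """Bucket SAS into bands (e.g. 110-114) so similar students pool together."""
    return int(sas // width) * width

# Distribution of outcomes per ability band: the basis of indicators
# and 'chances' charts.
chances = defaultdict(Counter)
for sas, grade in history:
    chances[sas_band(sas)][grade] += 1

band = chances[sas_band(112)]
total = sum(band.values())
for grade, n in sorted(band.items()):
    print(f"grade {grade}: {n / total:.0%} of similar past students")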

Why not just use indicators directly as the students’ targets?

• Indicators are a statistical indication of average national performance linked to an ability level, not a hard prediction of an individual’s actual results

• Progress charts emphasise to students the range of achievable outcomes

• The importance of students’ motivation

• Not using indicators to label students as actual or potential failures

• Setting the indicators in the context of other relevant factors and assessment information

• Appropriate level of challenge set for school / cohort / individual?

• Personalisation factors taken into account

• Targets set/agreed, academic mentoring meetings

• Teachers report regularly on current attainment

• Discrepancy with target monitored/reviewed, with reference to how long until achievement: is the learner on track for their target?

• Who coordinates monitoring and feeds back?

Other issues on tracking

Academic Mentoring: using ‘chances’ charts with individual students

The conversation …

● What grades appear to be easily within your reach? How do your expectations compare to the outcomes shown in the progress chart?

● What grade could be achieved with more effort/support? Negotiate a 'target for attainment'

● What aspects of the work do you find difficult or relatively easy? Where would you benefit from extra help? How can I better support you?

● What are the learning steps over the next half-term if we are to achieve this target (‘targets for learning’)? All the principles of assessment for learning apply

Measuring Value-added

Indicators: A national benchmark

Which pupils in general (school) or within subject areas outperform the indicated ‘most likely’ and ‘challenging’ indicators? This can significantly support departments in their self-evaluation and development planning.

● You could split this analysis to look at groups of learners, e.g. boys/girls, SEN, EAL, disadvantaged students.
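One way this group analysis might look in code (the records, field names and numeric grade scale are hypothetical): take each student’s residual against their ‘most likely’ indicator, then average by group.

from collections import defaultdict

# Hypothetical records: actual grade, 'most likely' indicated grade, group label.
results = [
    {"group": "EAL", "actual": 6, "indicator": 5},
    {"group": "EAL", "actual": 5, "indicator": 5},
    {"group": "SEN", "actual": 4, "indicator": 5},
]

# Value-added residual: positive means outperforming the indicated grade.
by_group = defaultdict(list)
for r in results:
    by_group[r["group"]].append(r["actual"] - r["indicator"])

for group, residuals in by_group.items():
    print(group, sum(residuals) / len(residuals))
# EAL +0.5 (outperforming the indicators), SEN -1.0 (underperforming)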

Back in school: value-added for different groups of pupils?

● Different ability levels?

● Those that may have joined the school recently?

● Those who have missed schooling or particular classes for long periods?

● Who had extra support / intervention?

● Who followed a particular scheme of work?

● Whose teachers used different teaching practices?

You could even go as far as to look at other identified student groups.

Getting the all-round view

Triangulation of standardised data

How can we triangulate data from attainment and ability?

● Via SAS with other standardised assessments

● Giving position within national sample range (SAS and percentile)

● Via indicator comparison with current performance levels

● Teacher assessment (TA) or assessment generated

For example, many schools have adopted a GCSE number/Grade tracking system from the start of KS3

Combination reports are designed to compare and contrast data from two or more areas of assessment.

For example:

• The CAT4 and Progress Test in English

• Progress Test in Maths and New Group Reading Test

Where students are over/under-performing against average benchmark indicators (a discrepancy), identifying the causes of these discrepancies could deliver further useful insight.

How can ‘Combination Reports’ highlight performance issues?

What is the format of the Combination reports?

Some example case-studies

Student   CAT4 Quantitative   PTM   Non-verbal   Spatial   Verbal   CAT4 Mean   NGRT   PTE
Barry     107                 85    98           108       112      106         110    –
Miles     78                  79    68           70        77       74          88     93
Daisy     106                 91    112          125       92       109         –      –
Kyle      80                  110   88           100       82       92          –      –
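Because every column above is on the common SAS scale, a discrepancy check can be sketched directly; the 10-point flag threshold below is an illustrative assumption, not a published GL criterion.

# SAS records from the table above; None marks cells missing in the source.
students = {
    "Barry": {"cat4_mean": 106, "pte": None},
    "Miles": {"cat4_mean": 74, "pte": 93},
}

FLAG_GAP = 10  # illustrative threshold, not a published GL criterion

for name, s in students.items():
    if s["pte"] is None:
        continue  # leave gaps in the source data as gaps
    gap = s["pte"] - s["cat4_mean"]
    if gap >= FLAG_GAP:
        print(f"{name}: attainment {gap} SAS points above ability - explore why")
    elif gap <= -FLAG_GAP:
        print(f"{name}: attainment {abs(gap)} SAS points below ability - possible underachievement")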

Further triangulation

• Have any students in this group received high levels of academic support at school and/or home which will have helped them to achieve at a higher level than might have been predicted from their ability in quantitative reasoning?

• This might be in the form of extra lessons, parental input or very good classroom teaching.

• Do any of the students in this group show high academic motivation which will have impacted positively on their learning during lessons and during the assessment tasks?

• Does this group include slow processors of information who would have benefitted from PTM being untimed but who would struggle to complete the CAT4 tasks in the time allocated?

• Extra time is not an option for CAT4, as it is the combination of the difficulty of the tasks and the time allocated to complete them that contributes to the score and, in turn, the student profile.

Case-Study: Data Triangulation

An illustrative student data story

Space for your notes

Next Steps

The Regional Assessment Workshop (RAW) training may be only one step in a longer process. We have assembled materials to reinforce the RAW training and to facilitate your cascade of the training to your colleagues. We hope that you will find them very useful.

● Unlimited access for you and your colleagues to our recently produced e-learning modules. These are tiered to support colleagues:

• new to the assessment

• those administering the assessment

• and those interpreting the data

Next Steps

Supporting materials for presentations and work with colleagues in school:

• A complete electronic copy of all the slides used at the Regional Assessment Workshop

• A glossary of statistical terms used in standardised assessments

• A visual illustration of standardised scores and score distribution

• A fully annotated Student Data Story – a case study illustrating the power of data triangulation in gaining a ‘whole-pupil view’

• A further unannotated Student Data Story for practice interpretation

• Practitioner case studies

• Ideas and tasks for further work ‘back in school’

Q & A

Thank You
