
Feedback Information

A progress and attitudinal measure for KS4

Yellis

Centre for Evaluation & Monitoring


Contents

Feedback that you receive via the Secondary+ secure website will include:

Assessment Feedback

• Raw Test Scores

• Band Profile Graphs

• List of Underaspirers

• Standardised and Ranked Scores for each pupil

• Individual Pupil Record Sheets

• Attitudinal Comparison Graphs

Predictions Feedback

• Predictions to GCSE based on each pupil’s Yellis performance

• Subject Chances Graphs, for each subject

• PARIS software

Value-added Feedback

• Value-added Graphs

• Relative Ratings Graphs

• Pupil Level Residuals

• PARIS


Assessment Feedback

This section includes examples of the type of assessment feedback that schools will receive following the Yellis Assessment.

Assessment feedback contains

• Raw Test Scores

• Band Profile Graphs

• List of Underaspirers

• Standardised and Ranked Scores for each pupil

• Individual Pupil Record Sheets

• Attitudinal Comparison Graphs

If you are using the Yellis Computer Adaptive Baseline Test, your feedback will usually be available within a short period following the end of the testing window. For paper-based Yellis testing, results are usually available within 4 weeks of test completion.

All your feedback can be downloaded from our secure Secondary+ website using your username and password. This website holds all previous years’ feedback for each school and thus allows schools to easily and quickly replace any mislaid files or to obtain updated files. It also provides a full range of documentation.

Yellis is part of a suite of monitoring systems established by CEM. Part of Durham University, CEM is recognised around the world for its pioneering educational research and school monitoring. It is a trusted supplier of educational assessments delivering over 1 million tests to pupils each year across 36 nations.

Raw Test Scores

This report provides the school with details of the performance of each student in the Yellis Baseline Test. Individual scores are given, as a percentage, for the maths, vocabulary and non-verbal sections of the Yellis test. However, it is the overall Yellis test score (the average of the maths and vocabulary scores) that is currently used in all Yellis analyses. If a student did not attempt any question in the vocabulary or maths sections then, although their name will appear on the list, they will not have a Yellis test score or band.


The Yellis test score for a student should always be interpreted using your knowledge of the student's academic performance in class, course-work and homework. Evidently a student's performance in a particular test on a particular day may be influenced by a variety of factors - for example the student could be feeling ill or upset by an incident inside or outside school. Such events may affect a student's performance in the Yellis test, leading to an untypical result for the student concerned, and so, if possible, staff should be aware of this when interpreting the results from the baseline test.

The Test Scores should always be used as an aid to your staff's professional judgements - not as a replacement for them.

Each student is also assigned to a Yellis Band, labelled from A to D, where A is the highest and D is the lowest, depending on their Yellis test score. The Chances Graphs give an indication of the range of GCSE grades a student with a given Yellis Band may attain (based on the real data from the most recent similar cohort). These Chances Graphs can be used to raise expectations or challenge students at all levels of ability.


Band Profile Graphs

Band Profile Graphs are included with the Yellis Feedback, to give a quick overview of the cohort’s assessment performance.

• The pupils’ overall Yellis Test scores are given a band – A to D (where A is the highest and D is the lowest).

• The bands are constructed using quartiles, each band containing 25% (a quarter) of the pupils in the sample.

• A pupil scoring in band A has scored in the top 25% of the sample, for example (a short illustrative sketch of the banding follows below).
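To make the banding concrete, here is a minimal sketch of quartile banding applied to a set of overall Yellis scores. It is purely illustrative: the scores and the function name are invented, and CEM's actual cut-offs come from the full national sample rather than from a single school's data.

```python
import numpy as np

def assign_bands(scores):
    """Assign quartile bands A-D to overall Yellis test scores (illustrative only)."""
    scores = np.asarray(scores, dtype=float)
    # Quartile cut-offs computed from the sample supplied
    q1, q2, q3 = np.percentile(scores, [25, 50, 75])
    return np.where(scores >= q3, "A",
           np.where(scores >= q2, "B",
           np.where(scores >= q1, "C", "D")))

# A handful of made-up overall Yellis test scores (percentages)
example_scores = [65, 58, 44, 31, 72, 49, 55, 38]
print(list(zip(example_scores, assign_bands(example_scores))))
```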

[Example Band Profile Graph: Yellis Institution 5003, Year 10 Class of 2010, report produced 01/04/2009. A bar chart of the percentage of pupils falling into each band, D to A.]

The example chart shows the performance of Year 10 (Class of 2010) pupils at the school in the Yellis tests, displaying the percentage of pupils who fell into each quartile band. Band A is the highest band, for pupils scoring 62% or over; Band B is the next highest, for pupils scoring 50% to 62%; Band C is the next, for pupils scoring 39% to 50%; Band D is the lowest band, for pupils scoring under 39%.

A school with an absolutely average cross-section of pupils will have 25% of its pupils in each band. The chart should be used in conjunction with the Chances Graphs for the Year 10 Class of 2010 pupils and the individual test results for each pupil.

What these graphs tell us

The graphs show the proportion of pupils falling into each of the four performance bands A – D. They allow schools to see how the pupils in their school broadly performed compared to the Yellis sample of pupils and thus provide an overview of the attainments of their own pupils.

A school with a profile like the standardisation sample, in terms of average attainment and range of performance, would have 25% of pupils in each band.

List of Underaspirers

Within the Baseline Yellis questionnaire students are asked a series of questions regarding their educational expectations for the future. From the answers to these questions a ‘Likelihood of Staying in Education’ (LSE) score is assigned to each student. For example, a student who indicated that they are ‘very likely’ to remain in education will have an LSE score of 5 for this question, and a student who indicated that they are ‘very likely’ to get a job will have an LSE score of 1 for this question. The overall LSE score for a student is the average LSE across these questions.

Calculations have shown that there is a positive relationship between the LSE and Yellis test score. Indeed one can state that if a student attains a ‘high’ Yellis test score then they are more likely to have a ‘high’ LSE score and vice versa. An underaspiring student is a student whose LSE score is more than 1 standard deviation below the average LSE score of all other students with this particular Yellis test score and they have not indicated that they are ‘likely’ or ‘very likely’ to remain in education post-16. In other words, an underaspirer is a student whose aspirations are low, given their abilities.
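A rough sketch of that definition in code is given below. It is illustrative only: the column names are invented, pupils are grouped by their rounded Yellis score as a stand-in for "this particular Yellis test score", and CEM's actual calculation may differ in detail.

```python
import pandas as pd

def flag_underaspirers(df):
    """Flag pupils whose LSE score is more than 1 SD below the mean LSE of
    pupils with the same (rounded) Yellis test score, and who have not said
    they are 'likely' or 'very likely' to stay in education post-16."""
    df = df.copy()
    grp = df.groupby(df["yellis_score"].round())["lse"]
    low_lse = df["lse"] < (grp.transform("mean") - grp.transform("std"))
    not_staying = ~df["post16_intention"].isin(["likely", "very likely"])
    df["underaspirer"] = low_lse & not_staying
    return df

# Made-up pupils for illustration
pupils = pd.DataFrame({
    "name": ["Ali", "Beth", "Cal", "Dina"],
    "yellis_score": [62.0, 61.8, 62.3, 41.0],
    "lse": [4.8, 4.6, 2.0, 3.4],
    "post16_intention": ["very likely", "likely", "unlikely", "likely"],
})
print(flag_underaspirers(pupils)[["name", "underaspirer"]])
```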



Why is it useful?

Following suggestions made by users in the early days of Yellis, a list of underaspiring students is included in the baseline feedback. Many schools carefully discuss aspirations and intentions with these students in order to understand what might be causing them to have lower aspirations than would be expected given their academic ability.

It is important that we consider the consequences of labelling pupils as ‘underaspirers’. Does singling out a pupil as an underaspirer, with subsequent intervention from the school, actually lead to a more positive attitude towards education? Of real concern is that intervention may do more harm than good, resulting in the pupil switching off from education even further, so we would recommend careful, sensitive use of the list.

Standardised and Ranked Yellis Scores and Bands

What are standardised scores?

The results for each Yellis assessment are standardised to have a mean of 100 and a standard deviation of 15. That is, the average pupil in the sample will score 100 and the vast majority of pupils (roughly 95%) will score between 70 and 130. These scores allow you to judge how well your pupils have performed on the Yellis assessments relative to one another, and also how they have performed relative to the sample average. Yellis standardised scores are available as a PDF and an Excel spreadsheet.
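As a simple illustration of what this kind of standardisation does, the sketch below rescales a set of raw scores to a mean of 100 and a standard deviation of 15. The real Yellis norms are based on the full national sample, not on the handful of made-up scores used here.

```python
import numpy as np

def standardise(raw_scores, target_mean=100.0, target_sd=15.0):
    """Rescale raw scores to a mean of 100 and SD of 15 (illustrative only)."""
    raw = np.asarray(raw_scores, dtype=float)
    z = (raw - raw.mean()) / raw.std()   # z-scores relative to this sample
    return target_mean + target_sd * z

raw = [34, 48, 41, 62, 55, 39]           # made-up raw overall Yellis percentages
print(np.round(standardise(raw)))        # rescaled scores centred on 100
```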

An example Standardised Yellis Test Scores report (Yellis Institution 9999, Year 10 Class of 2010, report produced 20/01/2009) is shown below. It covers Year 10 testing, Autumn 2008, for GCSE examinations in 2010, giving the band and scores of candidates in the Yellis tests (D is the lowest band, A is the highest). If no band is listed, the pupil failed to answer at least one section (Vocabulary and/or Mathematics) of the Yellis test. All scores are standardised with a mean of 100 and standard deviation of 15. Results should be read in conjunction with the Chances Graphs booklet.

Surname     Forename(s)  Sex  Band  Maths  Vocab  Patterns  Total
ANTELOPE    Andrew       M    A     127    117    127       124
BEAR        Bruce        M    C     100    90     103       95
CAMEL       Carrie       F    A     112    115    114       115
DODO        Denise       F    B     100    107    93        103
EMU         Edward       M    A     117    104    90        112
FLAMINGO    Freddy       M    B     100    115    95        108
GANNET      Grace        F    B     107    112    109       110
HARE        Harry        M    B     128    87     111       110
IBEX        Iona         F    D     91     71     111       80
JAGUAR      Jenny        F    C     91     94     90        92
KANGAROO    Kelly        F    B     103    102    98        103
LEOPARD     Lenny        M    B     92     110    90        101
MANATEE     Martin       M    A     108    112    103       111
NATTERJACK  Nina         F    C     93     92     85        92
OCELOT      Ophra        F    B     102    114    98        108
PANDA       Peter        M    B     98     107    111       102
RACOON      Rita         F    B     115    100    95        109
SHREW       Sade         F    D     84     81     77        81
TOUCAN      Thomas       M    A     120    109    109       116


Why are they useful?

Comparison between the Vocabulary, Maths and Patterns scores can highlight pupils who are:

• able but underperforming

• lower ability pupils that work hard to achieve above their potential

• ‘gifted’ students within your institution

Individual Pupil Records (IPRs)

Individual Pupil Records display, for each pupil, a personalised spreadsheet summarising all the Yellis assessment information for that pupil.

What the IPR tells us…

For each sub-assessment and overall score in Vocabulary, Maths and Patterns:

• The pupil’s raw percentage score.

• The pupil’s standardised score in each subject, together with 95% confidence bands. (Confidence bands represent the range of probable scores for that pupil if the test were to be repeated many times. There is only a 5% chance that the score would fall outside this range.)

• The Yellis band (A to D) into which the pupil’s performance falls.

• The percentile (the percentage of the sample that this pupil performed better than) corresponding to the pupil’s performance.

• The stanine (one of nine divisions, with 9 being highest) into which the pupil’s performance falls.

Why is this useful?

The graphical representation of a pupil’s performance on the Yellis Assessments allows you to quickly and easily identify any areas for concern, particularly between Vocabulary, Maths and Patterns. The IPR provides an easy visual way to identify discrepancies in performance which could highlight the able pupil that does not try hard in lessons or, conversely, the low ability pupil that goes the extra mile.

Where the confidence bands do not overlap between assessment sections this indicates that the pupil has performed significantly better or worse in that area compared to other areas. A significantly low score on a particular section may mean that intervention is needed in order that the pupil achieves his or her full potential and could have implications for teaching and learning.
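The logic of the non-overlap rule can be sketched as follows. The standard error of measurement used below is a made-up placeholder, not the published Yellis figure, so the sketch only illustrates the idea.

```python
def confidence_band(score, sem=5.0):
    """Approximate 95% confidence band around a standardised score.
    The SEM of 5.0 is a placeholder, not the actual Yellis value."""
    return (score - 1.96 * sem, score + 1.96 * sem)

def bands_overlap(score_a, score_b, sem=5.0):
    """True if the two 95% bands overlap, i.e. no clear difference between areas."""
    lo_a, hi_a = confidence_band(score_a, sem)
    lo_b, hi_b = confidence_band(score_b, sem)
    return lo_a <= hi_b and lo_b <= hi_a

# Vocabulary 98 versus Mathematics 109: do the bands overlap?
print(bands_overlap(98, 109))   # True under these assumptions, so no significant difference
```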

An example of a Yellis pupil’s Individual Pupil Record is given below.


The Individual Pupil Recordsheet (IPR) contains all the Yellis baseline information, both raw and standardised, for a particular student. The IPRs are designed so that each may be printed and placed in a student's personal folder in the school office. A typical IPR would look as follows:

[Example Individual Pupil Record for Martin Manatee: nationally-standardised scores with 95% confidence bands, plotted on a scale from 50 to 150, for Vocabulary, Mathematics, Patterns and the Total Score, alongside the raw percentage score, band, percentile and stanine for each.]

Attitudinal Comparison Graphs

Being a good school is about more than just academic results. Indeed, other aspects of a school’s work may well be far more important for the effects they can have on students’ lives, and on society as a whole. Everybody agrees that school ethos is important, but how do you measure it? How do your students’ experiences, attitudes and backgrounds compare to those of pupils in other schools?

Yellis Questionnaires

As part of the Baseline Yellis test, your students will have been given the opportunity to complete a short questionnaire that looks briefly at their attitudes towards education and their aspirations for the near future.

Further attitudinal questionnaires are available through Yellis and these are designed to provide you with information that you will find useful in the process of Self-Evaluation.


Each of these questionnaires covers many areas – for example attitudes to school, particular lessons and homework, quality of school life, feelings of fear in school, home background and support for education from parents/guardians, career plans and aspirations for the future.

Feedback from the Yellis Questionnaires

Feedback from each Yellis questionnaire, including the short Baseline questionnaire, takes the form of a set of graphs that illustrate how the responses of your students to the Attitudinal Questionnaire differ when compared to all Yellis students in the same cohort. In addition, for schools who have participated in Yellis for two or more years, we provide sets of attitudinal trend graphs which are designed to enable you to easily track how the responses of your students to items on the questionnaire compare to those of all other Yellis students over time.

Evidently there is no reason to expect that the attitudes of your students should be exactly in line with the Yellis average. Moreover, the raw responses of each cohort to the questions posed are likely to differ simply because of the variation in the sample of students in the school. This means that, in any single year, the raw data should be treated as an estimate, with some uncertainty attached, because a different sample of students arrives each year. To get a fuller picture of the attitudes of your students, we would recommend that you refer to the attitudinal trend graphs.


[Example attitudinal feedback graphs: Basic Comparison Graphs for ‘Likelihood of Staying in Education’ (likely to stay on at school, likely to go to a college, likely to take a vocational course, likely to repeat or take new GCSEs), Basic Comparison Graphs by Gender for ‘Cultural Capital’ (more than 50 books in the house, read for pleasure, borrow books from the school library), and Basic Comparison Trend Graphs for ‘Likelihood of Staying in Education’ covering 1997 to 2001. Each graph plots the school’s average (%) against the overall Yellis average (%), split by gender where relevant (Your Boys, Your Girls, Yellis Boys, Yellis Girls).]


Interpreting the data on the comparison graphs

Each school’s data is shown in the foreground as a bar graph that is plotted against a background of the data from the whole Yellis sample. In this way these graphs present simple comparative data - whilst maintaining student confidentiality.

The details of the information on each comparison graph are as follows:

• The questionnaire item is shown on the left of the bar chart.

• The number immediately following the questionnaire item is the percentage of the students in the whole Yellis sample that responded to the items as indicated. This is further illustrated by the background area of the graph.

• The number on the right hand side of the graph corresponds to the percentage of your students who fell into a certain category according to their response to the questionnaire item. It is this number that is used to determine the length of the bar for this item.

• If a symbol appears at the end of a bar and the bar is coloured red, then your data is significantly above or below that of the ‘background’ Yellis data (at the 95% confidence level).

• If no symbol appears on a bar and the bar is coloured blue, then your data is not significantly different from the ‘background’ Yellis data (a rough sketch of this kind of comparison is given below).
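The feedback does not spell out the statistical test behind the red/blue colouring, but as a rough illustration of how a school percentage could be compared with the background percentage at roughly the 95% level, a simple one-sample proportion test looks like this (the function name and the figures are invented):

```python
from math import sqrt

def differs_from_background(school_pct, background_pct, n_pupils, z_crit=1.96):
    """Illustrative check of whether a school's percentage differs from the
    background Yellis percentage at roughly the 95% level."""
    p0 = background_pct / 100.0
    p = school_pct / 100.0
    se = sqrt(p0 * (1 - p0) / n_pupils)   # standard error under the background rate
    z = (p - p0) / se
    return abs(z) > z_crit, round(z, 2)

# e.g. 80% of 120 pupils say they are likely to go to a college, against a 74% background
print(differs_from_background(80, 74, 120))   # (False, 1.5) - not significant here
```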


Predictions Feedback

This section includes examples of the type of predictions and chances graph feedback that you will receive from the Yellis Tests.

Predictions feedback contains:

• GCSE Predictions

• Chances Graphs

• PARIS

Please note that for predictions to be made from Yellis scores, a pupil must have made a valid attempt at both the Vocabulary and Mathematics sections of the Yellis baseline test.

The Yellis predictions feedback includes the following:

GCSE Predictions

The Predictions Spreadsheet is a quick way to access the GCSE predictions for your students. It contains details of your students’ test scores together with their GCSE predictions, on the QCA or 8-Point scale, for “selected” subjects (as indicated by the student when taking the Yellis test, or as pre-loaded by the school) and “all” subjects. The workbook also contains details of the regression equation coefficients.

[Example GCSE Predictions spreadsheet (page 1 of 2): Yellis Institution 9999, Year 10 Class of 2010. For each pupil it lists surname, forename, date of birth, gender, raw Maths, Vocabulary, Patterns and overall Yellis scores (%), the Yellis band, standardised scores and percentiles, and an underaspirer flag.]

Why is it useful?

There are three main areas of target setting: school, subject and student targets, which are all intrinsically linked to each other. The Yellis GCSE Predictions spreadsheet is designed to quickly help schools with this process.

Having a pupil’s predicted grades, up to two years in advance of final exams, can be an invaluable aid to setting these targets, and achieving them. Your Yellis predictions can provide an accountable base for target setting, both at the school and Local Authority level.
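Because the predictions come from a regression of GCSE points on the Yellis test score, applying them amounts to plugging a pupil's score into the subject's regression equation. The sketch below is only an illustration: the coefficients are invented placeholders, whereas the real coefficients are supplied in the Yellis predictions workbook.

```python
# Hypothetical per-subject regression coefficients (intercept, slope) on the 8-point scale
COEFFS = {
    "English": (1.20, 0.085),
    "Mathematics": (0.40, 0.098),
}

GRADES = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]   # 8-point scale, U=0 ... A*=8

def predict_gcse_points(subject, yellis_score):
    """Predicted GCSE points = intercept + slope * Yellis test score (illustrative)."""
    intercept, slope = COEFFS[subject]
    return intercept + slope * yellis_score

def nearest_grade(points):
    """Round predicted points to the nearest grade on the 8-point scale."""
    return GRADES[max(0, min(8, round(points)))]

points = predict_gcse_points("Mathematics", 54)   # a pupil scoring 54% on the Yellis test
print(round(points, 2), nearest_grade(points))    # roughly 5.69, i.e. closest to a grade B
```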


Chances Graphs

Each student who has taken the Yellis test is assigned to a band, labelled from A to D, depending on their score. These bands were created by dividing the whole Yellis sample into four groups with an equal number of students in each group - with the highest scoring students in band A and the lowest scoring students in band D.

When a group of Yellis students take their GCSEs it may reasonably be expected that the students in the higher Yellis bands will tend to perform better in their GCSEs than students in the lower Yellis bands - the Chances Graphs show to what extent this is the case.

The Chances Graphs are constructed using the data gathered from the actual performance of a specific cohort of Yellis students in their GCSE examinations e.g. the Yellis bands and GCSE results for the Year 10 Class of 2009 cohort can be used to produce Chances Graphs for the Year 10 Class of 2011 cohort. For each subject analysed, the four graphs (one for each Yellis quartile band) illustrate the range of grades actually attained by these students for a given GCSE subject. These graphs can be used to provide a rough guide to what might be reasonably expected for similar students who have taken the Yellis test but have yet to take their GCSEs.
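In computational terms, a chances graph is a cross-tabulation of the previous cohort's Yellis bands against the GCSE grades those students went on to achieve, expressed as percentages. A minimal sketch, with invented column names and data:

```python
import pandas as pd

GRADES = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]

def chances_table(prior_cohort):
    """Percentage of students attaining each GCSE grade within each Yellis band,
    computed from a previous cohort's actual results (illustrative sketch)."""
    counts = pd.crosstab(prior_cohort["band"], prior_cohort["gcse_grade"])
    counts = counts.reindex(columns=GRADES, fill_value=0)
    return counts.div(counts.sum(axis=1), axis=0).mul(100).round(1)

prior_cohort = pd.DataFrame({
    "band":       ["A", "A", "A", "B", "B", "C", "C", "D", "D", "D"],
    "gcse_grade": ["A*", "A", "B", "B", "C", "C", "D", "C", "D", "E"],
})
print(chances_table(prior_cohort))
```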

Chances Graphs are produced in four different sets depending on the data sent by your school. Currently we offer:

• Yellis to GCSE Chances Graphs

• Yellis to Short Course GCSE Chances Graphs

• Yellis to Vocational GCSE Chances Graphs

• KS3 to GCSE Chances Graphs.

An example set of Chances Graphs (Yellis to GCSE Full Course, Year 10 Class of 2010 Predictions) for Art & Design, based on data collected from 25,677 'Year 10 Class of 2008' students, shows the percentage of students attaining each GCSE grade within each Yellis band:

Band (Yellis Test Score)         U   G   F   E   D   C   B   A   A*
Band A (score > 61.8)            0   0   0   1   4   16  26  33  21
Band B (50.5 < score <= 61.8)    0   0   1   3   8   28  27  22  10
Band C (38.7 < score <= 50.5)    0   1   3   7   14  34  24  13  4
Band D (score <= 38.7)           1   2   6   14  21  35  14  5   2


Why are they useful?

Chances Graphs should always be used when discussing “target-setting” with students, staff and parents, as they can be used to raise expectations at all levels of ability and challenge the more-able students, for example:

A ‘band D’ student might expect to gain between a GCSE grade D and a grade F in a given GCSE subject. Consultation of the appropriate Chances Graph may indicate that some students of similar ability generally do achieve grades higher than these, i.e. within the A*-C range. Discussing this with a student may help to raise his/her expectations.

A ‘band A’ student might think that he or she can achieve a grade A or B at GCSE without really trying too hard. Consultation of the appropriate Chances Graph will show that not all students of similar ability generally do get a grade A or B at GCSE - indeed it may show that a significant proportion of students can be expected to get a grade D or E. Staff can then use the Chances Graphs to remind such a student that he/she must continue to work hard to ensure that they get the higher grades.

The Chances Graphs, combined with (i) teachers’ knowledge of the student and (ii) the Yellis predicted grade, can help set realistic targets.

PARIS

PARIS, which stands for Predictions And Reporting Interactive Software, is an Excel workbook and runs under the versions of Excel found in Microsoft Office 97 and later. PARIS can be used to produce several different reports, either formatted ready for printing or as worksheets within the PARIS workbook, that enable users to easily produce pupil level, subject level and school level reports that look at the potential performance of students in their GCSE examinations.

Why is it useful?

PARIS is an analysis and reporting tool comprising an Access database to store details and an interface (Excel or custom) to explore and interpret the data. PARIS allows users to work with and customise their Yellis data using a wide variety of analysis options. A range of different analysis types are possible, including potential performance, intermediate performance and value-added performance. Reports can be based on both predicted and achieved GCSEs and are presented in tabular and graphical formats.

A variety of reports can be created that, used in addition to the standard Yellis feedback, provide valuable feedback reports.

Institution Reports:

• Institution Summary

• Standardised Residual Graphs

• Intake Profile

Subject Reports:

• Statistical Process Control Charts (SPC)

• Scatter Graphs

• Summary Reports

• Full detail Reports

Student Reports:

• Subject Predictions and Value-Added data

• Optional Progress chart

• Chances Graphs

• Custom sets of data (e.g. teaching groups)


• Interim results (e.g. mock exams)

[Example PARIS report]


Value-added Feedback

This section includes examples of the type of value-added feedback that schools will receive following the receipt and analysis of GCSE data from a Yellis cohort.

Value-added feedback contains:

• Value-added Graphs

• Relative Ratings Graph

• Pupil Level Residuals

• PARIS

What is value-added?

Value-added measures the amount of progress made by pupils over a certain period of time, compared against pupils of similar abilities in other schools.

Why is value-added useful?

If one school increases the achievement level of its pupils more than other schools do, then its pupils gain an additional advantage. Value-added is a fairer way of looking at GCSE performance than raw results alone.

Once pupils have taken their GCSE examinations, their results are collected at CEM and value-added analysis is carried out. Feedback is provided at the student, subject and school level, with the data for each baseline cohort analysed separately.

Value-added Graphs

The value-added graph shows the average value-added score (i.e. raw residual) for each subject. This graph shows how well pupils in each subject achieved in comparison with similar pupils in the same subject in other schools. Average progress is indicated by there being no difference between the statistically predicted score and the actual score - resulting in a value-added score of zero for that subject.
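In other words, the figure plotted for each subject is simply the mean of its pupils' residuals (actual minus predicted GCSE points). A small sketch with invented numbers; in practice the predicted points would come from the Yellis regression:

```python
import pandas as pd

def subject_value_added(results):
    """Mean residual (actual minus predicted GCSE points) for each subject,
    an illustrative version of the figures shown on the value-added graph."""
    results = results.copy()
    results["residual"] = results["actual_points"] - results["predicted_points"]
    return results.groupby("subject")["residual"].mean().round(2)

results = pd.DataFrame({
    "subject":          ["English", "English", "Maths", "Maths", "History"],
    "predicted_points": [5.2, 4.1, 5.8, 4.4, 5.0],
    "actual_points":    [6,   4,   5,   5,   5],
})
print(subject_value_added(results))   # English +0.35, History 0.00, Maths -0.10
```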

[Example Value Added Graph: Yellis to GCSE (Full Course), Yellis Institution 5011, Year 10 Class of 2007, report produced 25/09/2007. A horizontal bar chart of the average residual for each GCSE subject, from Art & Design through to Welsh, plotted on a scale from -2.0 to +2.0, together with the mean of subject residuals and the mean of pupil residuals.]

Value Added measures answer the question: 'How well did students in my particular subject perform in comparison with similar students in the same subject in other schools?'


Value-added graphs are produced in four different formats:

• Yellis to GCSE

• Yellis to Short Course GCSE

• Yellis to Vocational GCSE

• KS3 to GCSE

Why are they useful?

The value-added graphs show the average value-added score for each subject. The graphs show how well pupils in each subject achieved in comparison with similar pupils in the same subject in other schools. Average progress is indicated by there being no difference between the statistically predicted score and the actual score - resulting in a value-added score of zero for that subject.

Relative Ratings Graph

The Relative Ratings Graph compares departmental performances within a school, taking into account subject difficulties and subject enrolments. It shows how well students in a particular subject performed in comparison with their performances in other subjects taught in the school. Since it is a comparison within the school, there will be as much on the positive side as on the negative side.

How Relative Ratings are calculated

The Relative Ratings approach was developed in Scotland by Alison Kelly for the SOED, and can be used to compare performances of departments within a school with each other, using only their GCSE results.

Relative Ratings contrast with Value-added measures in the sense that they compare departments with other departments within school, as opposed to comparing each department with the same departments in other schools. However, we would generally expect the pattern of Relative Ratings to be similar to Value-added measures, providing we realise that Value-added scores could all be positive or all be negative but Relative Ratings must roughly balance positive with negative.

A simplified version of calculating Relative Ratings would be to follow these steps:

• calculate a pupil’s average GCSE score

• calculate a ‘correction factor’ for each subject the pupil took, by taking their average GCSE score away from that subject score

• repeat this for every pupil in the school

• average the correction factors for each subject

This method would give us an apparent ‘relative difficulty’ for the school of each GCSE subject, i.e. how difficult pupils found that subject compared to the other subjects. We could repeat the procedure using every school in the analysis to find a ‘universal’ relative difficulty. The difference between the school relative difficulty and the universal relative difficulty would give the Relative Ratings for the school.
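The simplified method described above translates directly into code. The sketch below uses invented data and, as the text notes, ignores which combinations of subjects pupils took, so it is not the full Kelly method used in the Yellis analysis:

```python
import pandas as pd

def simple_relative_ratings(school, everyone):
    """Simplified Relative Ratings: for each subject, the school's mean
    'correction factor' (subject points minus the pupil's mean points)
    minus the same quantity computed over all schools."""
    def correction_factors(df):
        pupil_mean = df.groupby("pupil")["points"].transform("mean")
        return (df["points"] - pupil_mean).groupby(df["subject"]).mean()
    return (correction_factors(school) - correction_factors(everyone)).round(2)

# Made-up GCSE points (8-point scale) for one school and for the whole sample
school = pd.DataFrame({
    "pupil":   [1, 1, 1, 2, 2, 2],
    "subject": ["English", "Maths", "History"] * 2,
    "points":  [6, 5, 7, 4, 5, 4],
})
everyone = pd.DataFrame({
    "pupil":   [10, 10, 10, 11, 11, 11, 12, 12, 12],
    "subject": ["English", "Maths", "History"] * 3,
    "points":  [5, 5, 6, 4, 3, 4, 6, 6, 7],
})
print(simple_relative_ratings(school, everyone))
```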


[Example Relative Ratings Graph: Yellis Institution 5055, Year 10 Class of 2005. A horizontal bar chart of the relative rating for each GCSE subject, from Art & Design through to Welsh, plotted on a scale from -2.0 to +2.0.]

Relative Ratings answer the question: How well did students in my particular subject perform in comparison with their performance in other subjects taught in this school?

The problem with this method is that it does not take account of combinations of subjects. The full method of calculating Relative Ratings, the one devised by Kelly and used in the YELLIS analysis, does take into account which combinations of subjects pupils took.

If a department has performed well in comparison with the overall Yellis sample, then the Value-added score will be positive, but it doesn’t necessarily follow that the Relative Rating will be positive - it depends on (i) how the other departments within school have performed, and (ii) the relative difficulty of the subject.


Pupil Level Residuals

Using the YELLIS Baseline Test Score as input, the GCSE score as output and the most recently calculated regression equation for a GCSE subject to calculate the residual, the reports for the Pupil Level Residuals look something like this:


[Example Pupil Level Residuals report: Yellis to GCSE (Full Course), English Literature. For each pupil it lists surname, forename, Yellis score, predicted grade, actual grade, raw residual and standardised residual.]

Predicted grade is calculated on the basis of the Yellis Test Score for the student and the general pattern of results in this subject this year in the Yellis sample. The 8-point GCSE scale is used in all calculations: A*=8, A=7, B=6, C=5, D=4, E=3, F=2, G=1, U=0.

In this report the GCSE grade, expressed as a score (A*=8, A=7, B=6, etc.) attained by a student, is compared with his/her predicted GCSE grade. The difference between the two - the value-added or “residual” – represents the progress made by the student relative to other similar students.

If a student made "average progress" then there would be no difference between the average attained GCSE grade of similar students and the GCSE grade attained by the student in question, i.e. the value-added score or residual would be exactly zero. This means that the student's progress was exactly in line with other students in the Yellis sample of similar ability taking the same subject.

Why is this useful?

A positive residual indicates that the student attained a better GCSE grade than that attained on average by students with the same YELLIS test score. Study of the residuals at the student level recognises the fact that a grade E or F at GCSE may have been a great achievement for particular students - a much fairer method of looking at performance than the rather crude method of simply looking at the number of A*-C's attained, for example.

Likewise a negative residual indicates that the progress made by a student was less than the average progress made by students of similar ability. Careful interpretation of the data may be required for students who made significantly less than average progress, since this may be attributable to any number of reasons - for example, a student may have been excluded from school for the majority of Year 11 and only allowed to return to sit his/her GCSE examinations, or the student may simply not have felt very well on the day of the examination.
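Putting the scale and the residual together for a single pupil, with invented numbers (the predicted points would come from the Yellis regression for the subject):

```python
# The 8-point GCSE scale used in all Yellis value-added calculations
POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1, "U": 0}

def pupil_residual(actual_grade, predicted_points):
    """Raw residual = attained GCSE points minus the points predicted from
    the pupil's Yellis score (illustrative only)."""
    return POINTS[actual_grade] - predicted_points

# A pupil predicted 4.6 points (between a D and a C) who went on to attain a grade B
print(pupil_residual("B", 4.6))   # about +1.4: better progress than similar pupils on average
```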

PARIS

Each year an updated database is made available to schools; it includes GCSE grades and value-added figures.
