
Trial of practical assessment in Leaving Certificate Physics, Chemistry, Biology – 2017

Chief Examiner’s report – Physics

This report is based on the marks assigned by the examiners in the trial, on the reports submitted by the examiners and advising examiners (TPA72, TPA73 and TPA97), and on a review of the student work produced in the trial.

1. Introduction

1.1 Purpose of trial

The State Examinations Commission (SEC), in consultation with the National Council for Curriculum and Assessment (NCCA), was asked by the Department of Education and Skills (DES) to carry out a trial of practical assessment in Biology, Chemistry and Physics. The NCCA in its draft subject specifications (2013) for these subjects recommended that there should be a 90-minute externally assessed practical examination, worth 30% of the overall marks in each subject. Candidates would be assessed as they perform practical tasks and on a task booklet they complete during the practical session. As well as the practical assessment there would continue to be a written examination, worth 70%, in each of the sciences. The purpose of the trial was to assess the feasibility of including the proposed model of practical assessment as a component of the Leaving Certificate examinations in the future. The trial was therefore concerned with estimating roll-out costs, assessing value versus cost, and establishing the validity of the practical tasks generated for the students and whether such tasks could have ongoing validity.

1.2 Description of practical examination

The trial involved students from Year 2 of Leaving Certificate and took place in October 2017. The practical examination in Physics was organised into sessions. The trial involved up to three 90-minute sessions in an examination centre (laboratory) in a day. Each session involved up to twelve students, each working on a different task observed by one external examiner. The examiner awarded marks (up to 15%) for the practical performance observed. The booklets completed by the students during the practical were collected and awarded marks (up to 15%) later.

1.3 Number of students and the number of schools

All 712 post-primary schools were invited to express interest in participating in the trial. Responses came from over two hundred schools. The trial aimed to involve some schools in just one of the subjects, Biology, Chemistry and Physics, while others would trial practical examination in two or all three subjects. Twenty-eight was the minimum number of schools required, as per Figure 1.


Figure 1 – Schools in the Science trial

Thirty of the applicant schools were selected to give a good geographical spread and a range of school types (size, gender of students, management type, language of instruction, DEIS status, etc.). The initial cohort of trial students in the schools selected was about 1600. The number of students that actually participated was about 1100. The fall-off is attributed to students opting out of the trial. Students aged 18+ could opt out by informing their subject teacher. Students under age 18 at the time of the trial could also opt out of participation with the permission of their parent or guardian, provided in a letter to the school. The opt-out was offered because involvement in the trial was not rewarded, to avoid any stressful impact on students who were preparing for their Leaving Certificate, and to ensure positive engagement with the practical assessment by those students who did take part.

2. The tasks

2.1 Structure of tasks

Each student in a session was assigned a task at random. The task was presented in a 4-page task booklet. For the Physics trial fourteen tasks were produced: twelve (designated P1, P2, P3 … P12) that were used in the trial and two (P13 and P14) that were issued to trial schools as sample tasks to help the students prepare for the trial. The level of difficulty and time required were intended to be the same for each task. The tasks were based on the mandatory experiments in the current Physics syllabus. Fourteen Biology and fourteen Chemistry task booklets were similarly produced for the Biology and Chemistry trials. A common separate Instructions and information for the practical examination sheet was issued to each student in the trial, and a common format/layout was used for the booklets in each subject and across the three subjects, although the body of the tasks in each subject was distinct to that subject. All tasks were presented at Common Level. In Physics each task had a name that stated what was to be done in that task, followed by a list of instructions with questions and room for answers interspersed with the instructions. The external examiner evaluated the individual practical performances as the students did the tasks, each student working on their own. Students could ask for help if they needed it but the examiners did not question the students directly. Pseudodata was available for students who could not generate their own data. Certain help or the issue of pseudodata


may have been penalised by the examiner. The task booklets were used by the students to record observations, data and data analyses, and to answer questions either during or after their experimental work, or a combination of both. Graphs were drawn on standard SEC graph paper. Tidying up the work stations at the end of the practical examination session was part of each task. At the end of the session the task booklets and any graph paper issued were collected by the examiner to be marked later. For students whose normal mode of working in the laboratory involves using a PC or tablet, and who, based on consultation with the SEC in the context of the Junior Certificate examinations, are likely to be granted reasonable accommodations with respect to the use of a computer for the written Leaving Certificate examinations in June 2018, PDF-fillable versions of some tasks in each subject were prepared so that students could access the trial in that way. One Physics student in the trial used the digital versions of the tasks.

2.2 Direct and indirect assessment

Direct assessment awarded marks for practical skills that are difficult or impossible to test in a written examination, e.g.

• selection, manipulation and/or assembly of suitable apparatus,
• taking accurate measurements,
• tabulating, graphing and performing other analysis of one's own data,
• working safely and efficiently in a laboratory and using resources economically,
• following procedures or devising one's own method to carry out an investigation, etc.

Direct assessment was carried out by an examiner observing students as they carried out practical work. Indirect assessment gave credit, through the written work in the task booklets and accompanying graph paper, for answers to questions on practical work and safety and for analysis of experimental data collected by the students (or provided by the examiner). Indirect assessment was carried out by an examiner marking, after the practical session, the task booklets produced in that session.

3 The trial practical examination

3.1 Description of what happened during the examination

Students were invited to enter the examination centre (laboratory) about ten minutes before the session start time. Students had been randomly pre-assigned by the examiner to a certain task on a Direct Assessment Mark Sheet, TPA-6. As students entered they were directed to the corresponding numbered workstation. They were given a pre-printed sticky label with their Trial ID number to wear during the practical session. Students were reminded to put on appropriate safety clothing and that mobile phones were not permitted in the examination centre. Students were asked to read the ‘Instructions and Information’ sheet on their workstations. Students were reminded that they were allowed to ask for help, by putting up their hand, and that, depending on the help needed, such help might involve losing some marks. The examiner distributed the task booklets, each student getting a different task (or with the minimum degree of duplication). The students were reminded that the first ten minutes of the examination session was only for reading the task booklet, preparation and planning.


Students were given permission to start, were informed when the first ten minutes were up, and were notified that they would be alerted again ten minutes from the end and that the last five minutes would be for clearing up. During the first ten minutes students read through their task and collected the apparatus, chemicals and other materials they needed. They were not permitted to assemble the apparatus or to start work on the task or on completing the booklet during that time. During the next 75 minutes of the practical examination students assembled the appropriate equipment and manipulated resources, carried out experimental activities, and recorded observations and measurements.

The examiner used a clipboard and a Direct Assessment Mark Sheet to assign marks and make notes discreetly as he/she moved around the laboratory. The examiner attempted to make a fair assessment of each student's practical abilities under five categories according to the assessment objectives and recorded these assessments on the mark sheet. The examiner attempted to give equal attention to all students. There should have been no communication between the examiner and a student other than exchanges about a request for help by a student or a student calling the examiner to examine a particular activity. The examiners did, however, intervene in situations that threatened the safety of the student or of other students or compromised the efficient running of the examination centre. Where a student needed help that involved a penalty, the help required was given to enable the student to progress. The penalty was recorded in the appropriate place on the mark sheet.

As the session progressed the examiner had ample opportunities to observe the students' selection of apparatus, chemicals and other materials, assembly of apparatus, use of apparatus, and safe and efficient way of working in the laboratory. Towards the end of the session the examiner awarded marks in each of these four categories to each student based on the student's overall 90-minute performance. The lowest mark in each category was 0; there was no negative marking in any category, even when penalty marks were applied for errors or help given. For some specific tasks, there were relatively few opportunities for the examiner to observe a student making a measurement or recording an observation. The examiner may have asked a student doing these tasks to call him/her to observe key moments or, if appropriate, e.g. if time permitted, may have asked the student to repeat a key step involving measurement or observation. Marks for this category were awarded to each student based on the student's overall 90-minute performance, and penalties were deducted where help was given or data provided. The lowest possible mark in this category, even when penalties applied, was also 0.

Ten minutes before the end of the session the students were reminded of the time and that the last five minutes were designated for tidy-up. An announcement was made when just five minutes were left that students should stop working on the tasks and start cleaning and tidying their workstations.


At the end of the examination session the examiner instructed students to leave their task booklets, with any graph paper used inserted inside the booklet, at their workstations. The examiner completed his/her session mark sheet and collected the session booklets, which were sent to SEC Athlone by registered post.

4 Assessment objectives of practical examination

4.1 Direct assessment

4.1.1 Direct assessment objectives

The 60 marks available for direct assessment were awarded by the external examiner who supervised the practical session in five assessment objective categories, as follows.

1. Selection of apparatus, chemicals & other materials
   - apparatus suitable for task
   - sufficient apparatus appropriate to task
   - chemicals/other materials needed for task

2. Assembly of apparatus
   - correct assembly
   - manipulative skills in assembly

3. Use of apparatus
   - candidate carries out task as directed
   - manipulation of apparatus during conduct of task
   - co-ordination and dexterity in the use of equipment
   - apparatus used appropriately

4. Observations/measurements
   - correct observation/measurement technique
   - accurate observations/measurements
   - sufficient repetition where appropriate

5. Working safely & efficiently & cleaning up
   - personal safety and safety of others
   - economic and safe use of resources
   - tidy work practices
   - task carried out in the correct sequence
   - task completed within the given time
   - adherence to safe work practices in relation to electrical appliances, glassware, hot liquids, chemicals, spillages, etc.
   - cleaning work area

Each objective carried up to 12 marks and the marks that could be assigned by the examiner in each objective were 12, 8, 4 or 0 according to the marking key given in Table 1 below.



Table 1 – Direct Assessment marking key

DIRECT ASSESSMENT MARKING KEY
High level of achievement:     12
Moderate level of achievement:  8
Low level of achievement:       4
Not achieved:                   0

4.1.2 Direct assessment moderation

Moderation of marks according to an advising examiner's instructions was applied to the initial marks assigned by the examiners. Moderation was based on a comparison of the marks applied by an advising examiner and an examiner marking a session in parallel without conferring. Where the examiner's marks differed from the adviser's to a degree that exceeded a previously set tolerance, an adjustment, indicated by the adviser, was applied to all the marks of the examiner.
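The moderation step can be illustrated with a minimal sketch. The tolerance value, the additive form of the adjustment and the function name below are illustrative assumptions; the report does not specify these details.

# Hedged sketch of the moderation described above: an examiner's session marks
# are compared with an advising examiner's parallel marks for the same session,
# and if the average difference exceeds a tolerance, a uniform adjustment is
# applied to all of that examiner's marks. Tolerance and additive adjustment
# are assumptions for illustration only.

def moderate(examiner_marks, adviser_marks, tolerance=4):
    """Return examiner marks, adjusted if they drift beyond the tolerance."""
    # Mean difference between the two parallel markings of the same session.
    diffs = [e - a for e, a in zip(examiner_marks, adviser_marks)]
    mean_diff = sum(diffs) / len(diffs)
    if abs(mean_diff) <= tolerance:
        return examiner_marks  # within tolerance: the marks stand
    # Shift every mark by the adviser-indicated adjustment, clamped to [0, 60].
    return [min(60, max(0, m - mean_diff)) for m in examiner_marks]

# Example: an examiner marking 6 marks higher on average than the adviser.
print(moderate([56, 60, 48, 52], [48, 54, 44, 46]))  # -> [50.0, 54.0, 42.0, 46.0]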

4.2 Indirect assessment

The 60 marks available for indirect assessment were awarded to the work in a student's task booklet by an external examiner in a number of assessment objective categories, in line with the key Physics syllabus objectives: (i) knowledge, (ii) understanding, (iii) skills, (iv) competence and (v) attitudes (see Physics syllabus, pages 6 and 24). Each examiner applied a marking scheme for the tasks that had been discussed and agreed at a marking conference. The work of each examiner was monitored by an advising examiner during the marking.

5 Practical examination outcomes – direct assessment

5.1.1 Statistics

A total of seventeen schools participated in the Physics trial. Thirty-eight examining sessions were conducted. The size of the sessions varied from 5 to 12 students; the average number of candidates per Physics session was 8.8.

5.1.2 Average overall total score for direct assessment

The number of students who participated in the Physics trial was 292. Seven students were examined through the medium of Irish.



The rate of uptake and the average direct assessment mark for each task are given in Table 2.

Task | No. attempts | No. attempts (%) | Rank order of no. of attempts | Average mark out of 60 (and as %) | Rank order of average marks | Task Topic
1  | 38 | 13.0 | 1  | 51.8 (86.3) | 8  | Converging lens
2  | 35 | 12.0 | 2  | 52.3 (87.2) | 5  | Curved mirror
3  | 27 |  9.2 | 3  | 52.1 (86.8) | 6  | Refractive index
4  | 17 |  5.8 | 11 | 47.2 (78.7) | 11 | Specific heat capacity of a liquid
5  | 19 |  6.5 | 10 | 45.9 (76.5) | 12 | Specific latent heat of a substance
6  | 22 |  7.5 | 7  | 52.5 (87.5) | 4  | To investigate variation in the thermometric property of a material with temperature
7  | 27 |  9.2 | 3  | 51.1 (85.2) | 10 | To measure g, the acceleration due to gravity, by free fall
8  | 21 |  7.2 | 8  | 52.8 (88.0) | 3  | To measure the resistivity of the material of a wire
9  | 26 |  8.9 | 5  | 54.5 (90.8) | 1  | To investigate the laws of equilibrium for a set of coplanar forces
10 | 25 |  8.6 | 6  | 53.8 (89.7) | 2  | Boyle's law
11 | 21 |  7.2 | 8  | 52.0 (86.7) | 7  | To verify that acceleration is proportional to force
12 | 14 |  4.8 | 12 | 51.2 (85.3) | 9  | To verify the principle of conservation of momentum

Table 2 – Frequency and average direct assessment mark for each task, Physics

The overall average direct assessment mark across the tasks was 51.7, or 86.2% (with a standard deviation of 8.2 marks, or 13.7%), and a mark range of [24, 60]. Graph 1, below, shows the distribution of direct assessment marks (expressed as a percentage).
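For illustration, the summary statistics quoted above and the cumulative frequency curve of Graph 1 can be computed as in the following sketch; the marks listed are placeholders, not the trial data.

# Minimal sketch of the summary statistics and cumulative frequency curve.
# `marks` is a placeholder for the 292 direct assessment marks out of 60,
# which are not reproduced in this report.
import statistics

marks = [52, 60, 45, 38, 57, 60, 24, 49]          # illustrative values only
pct = [100 * m / 60 for m in marks]               # express as percentages

print(f"mean={statistics.mean(pct):.1f}%  sd={statistics.pstdev(pct):.1f}%  "
      f"range=[{min(marks)}, {max(marks)}]")

# Cumulative frequency: number of students at or below each 10% boundary.
for bound in range(0, 101, 10):
    print(bound, sum(1 for p in pct if p <= bound))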


Graph 1 – Direct Assessment Mark Distribution (%) – Physics
[Histogram: No. of students (0–80) against direct assessment mark (0–100%).]

Graph 1 – Cumulative frequency curve, Direct Assessment, Physics
[Cumulative curve: No. of students (0–300) against direct assessment mark (0–100%).]


Graph 2 shows good correlation between the examiners’ original marks and the moderated marks.

Graph 2 – Comparison of examiners' marks and marks following moderation
[Scatter plot: examiner mark (0–70) against moderated mark (0–70); fitted line y = 0.9604x + 1.6472, R² = 0.9579.]


Average score for each sub-part of direct assessment

Table 3 shows the average mark assigned by examiners in each of the direct assessment objectives. These results show that most students were able to select suitable apparatus, chemicals and other materials and assemble the equipment, even by trial and error, to carry out their tasks. Some errors, inaccuracies and poor techniques were observed by examiners as students used the apparatus and took measurements. Most students worked safely and efficiently. The vast majority left their work stations as they found them.

Assessment Objective | Average mark out of 12 | Average mark %
Selection of apparatus, chemicals & other materials | 10.9 | 90.8
Assembly of apparatus | 10.8 | 90.0
Use of apparatus | 9.6 | 78.3
Observations / measurements | 9.1 | 75.8
Working safely & efficiently & cleaning up | 11.3 | 94.2

Table 3 – Average scoring in direct assessment objectives

5.1.3 Commentary on the statistics

Examiner reports recorded that students in the trial engaged very enthusiastically, and direct assessment results show that most were able to work competently and safely in the laboratory and complete their tasks. As the tasks were written for Common Level, while approximately 83% of students were expected to sit Leaving Certificate Physics in June 2018 at Higher Level, it is not surprising that most students were able to exhibit a high level of achievement in direct assessment. 48.7% of the students achieved 90% or more of the available direct assessment marks; 19.9% achieved full marks. This was in line with marking criteria whereby full marks were awarded for reaching a high (but not necessarily flawless) level of achievement in each objective. The range of the average direct assessment marks across the tasks used in the trial was [45.9, 54.5]. This quite narrow range suggests that the tasks were reasonably similar in level of practical difficulty. On full roll-out, it would be expected that the numbers taking each task would be similar and the results could be scaled to ensure that the same average mark was achieved in each


task, thus addressing issues related to reliability associated with different candidates attempting different tasks.

5.2 Comments on direct examining based on examiners' reports

Selection of apparatus, chemicals & other materials

Students succeeded in selecting the correct equipment, chemicals and materials for their tasks and most scored very well in this area of assessment. The students were able to find the equipment in its usual location in the laboratory unless the trial took place in a laboratory unfamiliar to them. In such cases it was easier when the equipment was laid out on benches for the practical assessment. This suggested that most of the students were familiar with working in a laboratory. It was recommended by examiners that apparatus should be left in its usual location in the laboratory but that the examiner should check before the sessions that sufficient equipment was available. Where equipment is stored out of sight in cupboards or drawers, labels should indicate the contents of the cupboard or drawer, but where possible students should be assessed where they usually perform practical work. Examiners reported no issues with tasks being impossible to do because of a lack of equipment or chemicals, although some improvisation did occur. In particular, some schools found it difficult to carry out both task P11 and task P12 because of a lack of availability of mechanics equipment.

Assembly of apparatus

Examiners reported very good competence by students in arranging and assembling apparatus. Examiners reported student problems with assembling the apparatus in P4 and P5, which involve heating. Some students had problems with tasks P11 and P12, depending on whether or not light-gates were being used. When undertaken by one student working alone instead of by a pair of students working together, tasks P11 and P12 may require more planning. If practical assessment, as trialled, were introduced, teaching and learning practices would change to give students more practice working without a partner in some lessons based on practical work. Some students found the setup of the one electricity task (P8) to be quite difficult.

Use of apparatus

Technical errors were found in a number of tasks. Errors in taring mass balances and errors of parallax in reading metre sticks and thermometers were common. (Many examiners penalised these under the next heading, however.) As mentioned above, tasks P11 and P12 were problematic, depending on the equipment used; similarly problematic were tasks where datalogging equipment was used, such that the students were not taking direct measurements themselves or subsequently manipulating the data. Some students found the use of the micrometer and the multimeter in task P8 to be a challenge. At present, the successful execution of experiments in school is generally considered to be the responsibility of the teacher and not of the individual student. When experiments fail to


deliver the expected product or result, the student is not usually required to troubleshoot the situation themselves and is not often required to care about the outcome, and may even, if queried, attribute blame to the equipment, to the materials, to his or her partner or to the teacher. If practical assessment, as trialled, were introduced, students would realise that in laboratory lessons they need to accept responsibility for the conduct and outcomes of their own experiments as they practise for external practical assessment.

Observations / measurements

Examiners reported some errors in reading scales on thermometers and metre sticks, as mentioned previously. In the case of tasks P1 and P2, errors in measuring object and image distances were observed. In the case of task P3, a large number of students displayed confusion regarding which angles to measure. Most mechanics experiments did not trouble students who were used to the equipment they were using.

Working safely & efficiently & cleaning up

There were no reports from the examiners of any student behaving in a dangerous manner in the laboratory, risking themselves and others. The majority of students acted safely and efficiently, completed their tasks in the time allowed and had plenty of time to complete their booklets. Not all students were always busy and on their feet throughout the 90-minute session, which indicated the possible need for the introduction of sub-tasks in the Physics tasks. The students worked quietly with minimal interaction between them. Most students completed the task booklet at the end, while it may have been more efficient to partially complete it as they proceeded through their tasks. Some Physics students requested white coats and laboratory glasses but many did not. Where these were worn, students also had a tendency to remove the safety glasses when completing the task booklets towards the end, forgetting that others were still working near them. While Physics tasks may not lead to the same safety concerns as those found in Chemistry and Biology, students carrying out tasks P4, P5 and P6 had to be aware of safety issues arising from the use of hot plates or similar. All students should be aware of safety issues arising from any work being carried out in a laboratory, even if it is not being carried out by that student. If practical assessment, as trialled, were introduced, it would prompt discussion about best, safe laboratory practice. A detailed set of protocols could be issued by the SEC to alert schools, teachers, students and examiners to a common, acceptable safety code.


6 Practical examination outcomes – indirect assessment

6.1 Statistics

6.1.1 Average mark overall for indirect assessment

The overall average indirect assessment mark was 31.1, or 51.8% (with a standard deviation of 12.8 marks, or 21.3%), and a range of [3, 60]. The average mark value, the large standard deviation and the large range indicate that the task booklets discriminated between different levels of student ability. Graph 3 shows the distribution of the indirect assessment marks, expressed as percentages.

Graph 3 – Indirect Assessment Mark Distribution (%) – Physics
[Histogram: No. of students (0–50) against indirect assessment mark (0–100%).]


Graph 3 – Cumulative frequency curve, Indirect Assessment – Physics
[Cumulative curve: No. of students (0–300) against indirect assessment mark (0–100%).]

6.1.2 Average mark for each task

Table 4 summarises the 292 student performances in indirect assessment in the trial. Seven students submitted task booklets completed in Irish. Only 3.4% of the students achieved 90% or more of the available marks, meaning that this part of the assessment was significantly more discriminating than the direct assessment part. The range of the average indirect assessment mark per task was [25.3, 36.7]. Such fluctuations in results from task to task are understandable. On full roll-out, it would be expected that the numbers taking each task would be similar and the results could be scaled to ensure that the same average mark was achieved in each task, thus addressing issues related to reliability associated with different candidates attempting different tasks.
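A minimal sketch of such scaling, assuming a simple additive shift of each task's marks to the overall mean (the report does not specify the method), is as follows.

# Hedged sketch: rescale each task's marks so that every task has the same
# average, addressing reliability concerns when different candidates attempt
# different tasks. The additive shift is an illustrative assumption.

def scale_to_common_mean(marks_by_task):
    """marks_by_task: dict mapping task id -> list of marks out of 60."""
    all_marks = [m for marks in marks_by_task.values() for m in marks]
    target = sum(all_marks) / len(all_marks)      # overall mean mark
    scaled = {}
    for task, marks in marks_by_task.items():
        shift = target - sum(marks) / len(marks)  # bring the task mean to target
        scaled[task] = [min(60, max(0, m + shift)) for m in marks]
    return scaled

# Example with two illustrative tasks of unequal difficulty.
print(scale_to_common_mean({"P4": [40, 50], "P9": [55, 60]}))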



Task | No. attempts | No. attempts (%) | Rank order of no. of attempts | Average mark out of 60 (and as %) | Rank order of average marks | Task Topic
1  | 38 | 13.0 | 1  | 29.8 (49.7) | 9  | Converging lens
2  | 35 | 12.0 | 2  | 30.2 (50.3) | 8  | Curved mirror
3  | 27 |  9.2 | 3  | 25.6 (42.7) | 11 | Refractive index
4  | 17 |  5.8 | 11 | 25.3 (42.2) | 12 | Specific heat capacity of a liquid
5  | 19 |  6.5 | 10 | 26.1 (43.5) | 10 | Specific latent heat of a substance
6  | 22 |  7.5 | 7  | 34.2 (57.0) | 3  | To investigate variation in the thermometric property of a material with temperature
7  | 27 |  9.2 | 3  | 32.1 (53.5) | 5  | To measure g, the acceleration due to gravity, by free fall
8  | 21 |  7.2 | 8  | 33.1 (55.2) | 4  | To measure the resistivity of the material of a wire
9  | 26 |  8.9 | 5  | 36.7 (61.2) | 1  | To investigate the laws of equilibrium for a set of coplanar forces
10 | 25 |  8.6 | 6  | 36.3 (60.5) | 2  | Boyle's law
11 | 21 |  7.2 | 8  | 31.6 (52.7) | 7  | To verify that acceleration is proportional to force
12 | 14 |  4.8 | 12 | 32.0 (53.3) | 6  | To verify the principle of conservation of momentum

Table 4 – Frequency and average indirect assessment mark for each task, Physics

6.1.3 Commentary on the statistics

There was a wide variation in the quality of student answering, with marks for the indirect assessment varying from 3 to 60 out of 60. Each examiner marked approximately 36 booklets, and therefore the number of booklets marked by each examiner for each of the tasks used in the trial was in low single figures. Some examiners marked no examples of some tasks. Examiners commented on the booklets being only partially completed by students, in particular some of the more complex mathematical and graphical parts, which existed in all tasks. Given the small numbers of booklets involved, examiners' general impressions were often contradictory; e.g. some examiners reported students having an excellent understanding of percentage error and some reported that this was a source of difficulty for students.


In Section A of the Leaving Certificate Physics examinations, some students who can correctly describe the procedures carried out in a mandatory experiment are then unable to correctly explain the scientific rationale behind the procedures and are unable to perform the associated calculations without errors. Similarly in this trial, some students who were able to carry out a practical task very well, and scored well in direct assessment, struggled with the associated calculations and with explaining the rationale governing the procedures followed; such students consequently scored poorly in indirect assessment. This is an unsurprising outcome of the trial, as the students involved were preparing for the current examination format and had not prepared for a practical assessment in the way that might be expected were a practical component to be part of the summative assessment of the subject specification. The design of the tasks for Common Level meant that certain items known to be very challenging to Ordinary Level students, e.g. drawing graphs, using graphs, and manipulating and using mathematical formulae, were poorly answered by a significant minority of the trial cohort. Examiners commented that the work of most students did not indicate that time was a factor in any failure to properly complete the task booklet.


7 Direct and indirect assessment – contrast and comparison

7.1 Analysis of overall student performance

The number of students who participated in the Physics trial was 292. The average total practical assessment mark was 89.3 out of 120, or 74.4%, with a mark range of [35, 120]. Table 5 shows the average total mark in each task used in the trial. The range of average marks was [72.0, 91.2]. This range is broader than would be acceptable in a full roll-out situation, suggesting that overall the tasks were not of equal difficulty. However, the total number of students involved in the trial was small, the statistics were possibly influenced by the different states of readiness of the students for assessment given the timing of the trial, and fluctuations in results from task to task are understandable. On full roll-out, it would be expected that the numbers taking each task would be similar, at approximately 650 per task. The results could be scaled to ensure that the same average mark was achieved in each task, thus addressing issues related to reliability associated with different candidates doing different tasks.

Task | Average DA mark out of 60 (and as %) | Average IA mark out of 60 (and as %) | Average total mark out of 120 (and as %) | Rank order of average marks | Task Topic
1  | 51.8 (86.3) | 29.8 (49.7) | 81.6 (68.0)  | 9  | Converging lens
2  | 52.3 (87.2) | 30.2 (50.3) | 82.5 (68.8)  | 8  | Curved mirror
3  | 52.1 (86.8) | 25.6 (42.7) | 77.8 (64.8)  | 10 | Refractive index
4  | 47.2 (78.7) | 25.3 (42.2) | 72.5 (60.4)  | 11 | Specific heat capacity of a liquid
5  | 45.9 (76.5) | 26.1 (43.5) | 72.0 (60.0)  | 12 | Specific latent heat of a substance
6  | 52.5 (87.5) | 34.2 (57.0) | 86.7 (72.25) | 3  | To investigate variation in the thermometric property of a material with temperature
7  | 51.1 (85.2) | 32.1 (53.5) | 83.2 (69.3)  | 6  | To measure g, the acceleration due to gravity, by free fall
8  | 52.8 (88.0) | 33.1 (55.2) | 85.9 (71.6)  | 4  | To measure the resistivity of the material of a wire
9  | 54.5 (90.8) | 36.7 (61.2) | 91.2 (76.0)  | 1  | To investigate the laws of equilibrium for a set of coplanar forces
10 | 53.8 (89.7) | 36.3 (60.5) | 90.2 (75.2)  | 2  | Boyle's law
11 | 52.0 (86.7) | 31.6 (52.7) | 83.5 (69.6)  | 5  | To verify that acceleration is proportional to force
12 | 51.2 (85.3) | 32.0 (53.3) | 83.2 (69.3)  | 6  | To verify the principle of conservation of momentum

Table 5 – Average total mark for each task, Physics

7.5% of the students in the trial were awarded 90% or more of the available marks; 30.1% were awarded more than 80%. For comparison, the H1 rate at Higher Level Leaving Certificate Physics in 2017 was 10.7% and the combined H1 and H2 rate was 27.3%. Graph 4 shows how the distribution of the sum of the direct and indirect assessment marks compares with the distributions of marks for direct and for indirect assessment.

Graph 4 – Comparison of the mark distribution for the sum of the direct and indirect assessment marks with the mark distributions for direct assessment and indirect assessment
[Overlaid distributions: No. of students (0–80) against marks (0–100%), for direct assessment, indirect assessment and total.]


Graph 4 – Comparison of the cumulative frequency curve for the sum of the direct and indirect assessment marks with the cumulative frequency curves for direct assessment and indirect assessment
[Cumulative curves: No. of students (0–300) against marks (0–100%), for direct assessment, indirect assessment and total.]

Graph 5 compares the direct assessment marks and the indirect assessment marks. The scatter displayed on the graph indicates that two different skill sets were examined. However, the scores are not fully independent of each other, as the task booklets were completed at the same time as the direct assessment. Some students may have worked through their tasks very well but at a rate that allowed too little time to complete the task booklets, resulting in low indirect assessment scores. Other students, who may have had to repeat parts of their task in order to complete it, could have been awarded a high direct assessment mark but at the expense of their indirect assessment mark.
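For illustration, the least-squares line and R² reported on Graph 5 can be computed as in the following sketch; the data points shown are placeholders, not the trial data.

# Hedged sketch of the comparison behind Graph 5: a least-squares line and R²
# for direct assessment (DA) marks against indirect assessment (IA) marks.
# The four points are illustrative; the report's fit over the full cohort
# was y = 0.3349x + 40.686 with R² = 0.1471.
import numpy as np

ia = np.array([20.0, 30.0, 40.0, 55.0])   # indirect marks (x), illustrative
da = np.array([45.0, 50.0, 52.0, 58.0])   # direct marks (y), illustrative

slope, intercept = np.polyfit(ia, da, 1)  # least-squares straight line
pred = slope * ia + intercept
r2 = 1 - np.sum((da - pred) ** 2) / np.sum((da - np.mean(da)) ** 2)
print(f"y = {slope:.4f}x + {intercept:.4f}, R² = {r2:.4f}")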



Graph 5 – Comparison of direct assessment marks and indirect assessment marks
[Scatter plot: direct assessment mark (0–60) against indirect assessment mark (0–60); fitted line y = 0.3349x + 40.686, R² = 0.1471.]


7.2 Attainment of Key Syllabus Objectives, Engagement and Performance

Cognitive Domain

Knowledge and understanding

Some short response questions in every task required recall of facts, equations, formulae, symbols, units or basic physics terminology, e.g. task P1 (Name another type of lens.) and task P8 (Use the Formulae and Tables booklet to find the formula for the resistivity of a wire. Explain the notation.). Many of these short response items were designed to be accessible to students of all levels. Most students answered these short response items well. In some cases the answers were to be found in the Formulae and Tables booklets provided.

The tasks also required students to demonstrate their broader or deeper knowledge and understanding of the principles and theories underlying the practical tasks, for example in task P5 (What was the advantage of allowing a large change in temperature? Was there any disadvantage in doing so?) and in task P9 (How did you account for atmospheric pressure in your experiment?). Examiners noted that the quality of students' answers to some of these more challenging questions may have depended significantly on the method they used to carry out the experiments and the equipment available to them.

The trial provided an opportunity for students to use the equipment available to them to guide them in answering questions which in a written paper would be entirely theoretical. For example, in task P3 (For what angle of incidence does the light ray not change direction?) students were observed to use the equipment to try to determine an answer. This is illustrative of how practical assessment can facilitate and promote methods of investigative learning that are not always to the forefront during current laboratory practical sessions.

Application and Analysis

In six of the tasks students were asked to draw a graph, although they could also have drawn a graph to help answer questions in two further tasks. Examiners reported that many students were able to draw good graphs of their own data and, where it was provided, of the pseudodata. Drawing graphs of experimental data is an activity with which all Higher Level Physics students would be familiar and which is given significant emphasis in the current syllabus and on recent Higher Level examination papers. However, many Ordinary Level students would not have the same degree of aptitude or experience in drawing graphs, which led to a significant loss of marks in the indirect assessment portion of the marking for such students.

For all tasks, students were required to perform calculations. Calculations were required on students' own primary data, and/or on modified data, and/or on data obtained from a graph the student had drawn. An example of a calculation based on primary data is in task P4 (Assuming no heat loss to the surroundings, calculate the specific heat capacity of the liquid you used, using the data obtained in the experiment); an example of data modification is in task P7 (With reference to the formula you have written above, perform the appropriate mathematical operation on the measurements you have taken.); an example of a calculation


on data obtained from a graph is in task P3 (Use your graph to calculate the refractive index of the material used in the experiment.). Performance in calculations reported by examiners varied from calculations not attempted, to calculations not completed or poorly attempted with multiple errors, to calculations that were attempted well with minor mathematical slips, to calculations that were correctly executed. This was the expected outcome given that the tasks were set at Common Level and undertaken at a time in the school calendar when students were not fully prepared and practised in calculations of the type they might be expected to perform in Section A of the Leaving Certificate examination papers in June. Examiners noted, however, that significant errors were observed in the calculations attempted in tasks with which students would be expected to be familiar at this point in their studies, e.g. task P3. It is possible that students' performance in data analysis and calculations would improve once the students had had an opportunity to consolidate their learning on completion of the syllabus.

The complex calculations in tasks P4 and P5 proved a significant challenge for many students; this challenge was increased if students were using a mechanical method ("method of mixtures") to carry out these experiments rather than an electrical method. The current syllabus allows students/teachers/schools to choose what method they use to approach experiments. The choice of method had a potential impact on students' performance in the direct assessment portion for most or all tasks; however, for tasks such as P4 and P5 this choice of method also seems to have had a serious impact on students' performance in calculations. (This problem of tasks showing a varying degree of difficulty depending on the experimental method used was not limited to these tasks, but is best exemplified here.)

Synthesis and Evaluation

Students, in each task, were required to evaluate their own data and decide whether to work on their own data or to request pseudodata, subject to possible penalty at the direct assessment stage. This necessitated their evaluation of the quality of their own work. The penalty was not applied if the examiner deemed the data problem to be not of the student's making, e.g. due to instrumental error that could not be promptly corrected. Students could subsequently be awarded full marks in the indirect assessment for correct evaluation of their own data (even if flawed) and/or of the pseudodata. However, the production of flawed data (whether resulting in a penalty during the direct assessment or not) could lead to problems in students' ability to answer questions in the indirect assessment. When a student's poor data meant that their graph did not show a linear relationship, the student frequently made no attempt to draw a line of best fit, meaning that no marks could be awarded for calculating a slope. However, it was noted by examiners that some students had enough time to notice that their graph did not look as expected, gather further (better) data and draw an improved graph.

Some tasks asked students to examine their experiment in ways that they may not have previously considered. An example is in task P8 (Suggest a reason, other than experimental error, why you may get a different value for resistivity if the same measurements were taken


on a different day.), where some examiners noted that students' responses to a slightly unusual question were better than expected. It seems possible that the act of carrying out an experiment may help students to better construct abstract evaluations, compared to their ability to consider such issues in a traditional written examination.

Psychomotor Domain

This trial provided new opportunities to assess student achievement in the psychomotor domain in terms of speed and precision of action, physical dexterity and organisation, and experimental techniques, i.e. to assess achievement of the practical skills and competences specified in the syllabus. It is not possible to assess these psychomotor skills by means of a written examination only.

Perception and Readiness to Act

Arising out of the need to design tasks that could be completed by all students, irrespective of the experimental method they had previously used in class (and, relatedly, of the equipment available to them), Physics tasks had to be written in what might appear to be a generalised and unspecified way. For example, in task P11, students were asked to "measure the acceleration (a) for a certain force (F), using an appropriate method. Alert your examiner to check what you are doing in this step. Record in the table below the values for any distances, times and/or velocities that you measured or calculated during this experiment that will help you to calculate the acceleration. Include in the table the units you have used." Depending on the method chosen, students may have been measuring different sorts of distances or velocities. It is unclear whether such generalisation in task design might have led to problems with examiners assessing students' psychomotor skills. Examiners reported some situations where students were using physical methods previously unseen by the examiner. In the case of most tasks, students' previous experience in carrying out the experiment would have been as part of a small group. The change to working individually was a challenge to some students, not only cognitively but also mechanically. This challenge was more significant for some tasks (e.g. P11 and P12).

Guided Response, Mechanism and Complex Overt Response

In each session students were required to carry out complex tasks based on mandatory experiments that they had previously carried out and to use skills or mechanisms at which they had become proficient by practice. The laboratory techniques demonstrated by the students, e.g. measuring distances, masses and times, reading a thermometer, setting up a circuit, setting up a reflection/refraction arrangement, setting up trolleys/air-tracks, etc., were guided responses or mechanisms that were achieved by imitation, trial and error, and repetition before the trial. Students demonstrated competence in guided response and mechanism in the trial by responding appropriately, without detailed instructions, to very limited and generalised stimuli in the tasks.


Typically instructions were of the form:

• Select the apparatus required for this experiment.
• Set up the apparatus. Alert your examiner to check what you are doing in this step.
• Using an appropriate method, measure … . Record your results in the table below. Alert your examiner to check what you are doing in this step.

High scores in the second and third assessment objectives in Table 3 indicate high levels of achievement in the areas of guided response and mechanism. However, various errors in technique were observed, including readings not taken at eye level, failure to crush or dry the ice, measuring inappropriate distances, angles, temperatures, masses and times, incorrect use of the multimeter, failure to account for zero error in the micrometer or mass balance, etc. Some candidates were observed to have performed their tasks very quickly and accurately and in a highly coordinated manner. However, no distinction was made in direct assessment between a high level of practical achievement and a complex overt response, i.e. an expert performance.

Adaptation and Origination

The tasks were based on a selection of the mandatory experiments on the current syllabus. While in each case some adaptable proficiency was required of the students, an attempt was made during task design to write the tasks in such a generalised way that students were not disadvantaged, either by the method previously practised (typically chosen for them by their teacher) or by the equipment available to them. Students were, however, required to modify their approach to troubleshoot situations that did not go to plan. Examiners noted that many students demonstrated high levels of achievement in solving unforeseen problems due to slight unfamiliarity with equipment, conditions, etc. As students of the current syllabus are not expected to carry out unseen experiments, the trial did not assess students' ability to design and execute new approaches that would apply acquired skills to a new practical situation, e.g. using equipment or material not previously encountered.


7.3 Key issues associated with individual tasks

Task P1 (To investigate experimentally the relationship between object distance, image distance and focal length for a converging lens.)

Students typically arranged and used the equipment well. When they found that, for a chosen object distance, no real image could be formed, they moved the object, lens and/or screen until a real image was found. Some errors were noted in finding the sharpest image and in students measuring distances that were not to the centre of the lens. Typical errors were found in the data analysis section of the task, i.e. handling fractions, failure to invert for the final answer, and averaging of object and image distances rather than focal lengths. Many students seemed unaware of how to find an approximate focal length at the start of the experiment. Safety issues did not seem to be apparent.

Task P2 (To investigate experimentally the relationship between object distance, image distance and (i) focal length, (ii) magnification for a curved mirror.)

The design of this task was very similar to task P1 above, with most of the questions repeated, as would be expected for these parallel experiments. Examiners noted the same issues arising both in direct and indirect assessment. Again, students usually arranged and used the equipment well. When they found that, for a chosen object distance, no real image could be formed, they moved the object, mirror and/or screen until a real image was found. Some errors were noted in finding the sharpest image and in students measuring distances that were not to the centre of the mirror. Typical errors were found in the data analysis section of the task, i.e. handling fractions, failure to invert for the final answer, and averaging of object and image distances rather than focal lengths. Many students seemed unaware of how to find an approximate focal length at the start of the experiment. Safety issues did not seem to be apparent.

Task P3 (To investigate the relationship between angles of incidence and angles of refraction and to determine experimentally the refractive index and critical angle of translucent materials.)

Examiners reported that an unexpectedly large part of the student cohort made fundamental errors in carrying out this experiment: failing to draw normals, measuring two angles of incidence (on either side of the transparent block) and no angle of refraction, and/or measuring the angles between the rays and the block. Because many school laboratories might be without a protractor and students might not otherwise have one available to them, examiners were instructed to bring a small protractor with them. As is observed in candidate responses in written examinations, some students either failed to calculate the sines of the angles measured or, more confusingly, calculated the sines but did not plot them on a graph. Calculating a slope was a challenge for many students. Students used the equipment available to help them determine the angle of incidence for which the light ray does not change direction; i.e. asking this question in a practical examination made it a significantly different question than if it were asked in a purely written examination, and one which allowed students to demonstrate creativity and originality. Safety issues did not seem to be apparent.
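The data-analysis error reported for tasks P1 and P2, i.e. averaging object and image distances rather than focal lengths, can be illustrated with a short sketch; the distances used are illustrative values, not trial data.

# Hedged sketch of the data analysis expected in tasks P1/P2. Each (u, v)
# pair gives a focal length from 1/f = 1/u + 1/v; the focal lengths should
# be averaged, not the distances themselves. Distances (cm) are illustrative
# values for a lens of focal length about 10 cm.

pairs = [(15.0, 30.0), (20.0, 20.0), (30.0, 15.0)]  # (u, v) object/image distances

# Correct: compute f for each pair, then average the focal lengths.
focal_lengths = [1 / (1 / u + 1 / v) for u, v in pairs]
f_correct = sum(focal_lengths) / len(focal_lengths)

# Reported error: average u and v first, then apply the formula once.
u_avg = sum(u for u, _ in pairs) / len(pairs)
v_avg = sum(v for _, v in pairs) / len(pairs)
f_wrong = 1 / (1 / u_avg + 1 / v_avg)

print(f"correct f = {f_correct:.2f} cm, erroneous f = {f_wrong:.2f} cm")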


Task P4 (To determine the specific heat capacity of a liquid.)

Examiners reported that this task proved to be a challenge for many students, both in carrying out the experiment and in performing the data analysis and calculation. As noted previously, each of these challenges was increased if students were using a mechanical method rather than an electrical one. However, students using an electrical method also had some challenges: in one case a student who had previously used a commercially supplied joulemeter was now trying to set up a circuit using an ohmmeter and ammeter, due to the joulemeter having recently broken; this challenge is inappropriate at this level. Safety issues were apparent in the choice of safety equipment, experimental technique and safe disposal of hot materials.

Task P5 (To determine experimentally the specific latent heat of a substance.)

The design of this task was quite similar to task P4 above, as might be expected for these experiments. Although the current syllabus specifies that candidates measure the specific latent heat of fusion of ice and the specific latent heat of vaporisation of water in separate experiments, the new draft specification combines these experiments. In task design, the choice was made to allow students to measure either specific latent heat. Examiners noted that those students who chose to measure the specific latent heat of vaporisation of water faced more significant technical (and safety) challenges than those who chose to measure the specific latent heat of fusion of ice. In both cases, as with task P4 above, the data analysis and calculations proved to be a significant challenge for all but the most able candidates and highlighted the difficulty of a Common Level examination of this sort. Again, safety issues were apparent in the choice of safety equipment, experimental technique and safe disposal of hot materials.

Task P6 (To investigate variation in the thermometric property of a material with temperature (as defined by a mercury thermometer) and use this property in the design of a thermometer.)

While most students who carried out this task were observed to use an uncalibrated alcohol-in-glass thermometer, some were observed to measure an electrical property, typically the resistance of a conductor or semiconductor, or the emf of a thermocouple. Students typically arranged and used the equipment well and took accurate measurements, although those measuring electrical properties made more errors. Students drew graphs well but often showed errors in using their graph to determine the temperature of the room. The question which asked students what a calibration curve was proved to be very challenging, and stimulated much discussion among the team of examiners. Safety issues were apparent in the choice of safety equipment, experimental technique and safe disposal of hot materials.

Task P7 (To measure g, the acceleration due to gravity, by free fall.)

Examiners found that although many students had initial difficulties setting up the free-fall apparatus and timing circuit, trial and improvement meant that they were usually able to proceed without examiner intervention. Errors were observed in measuring the correct height, either in not reading the metre stick at eye level or, more commonly, in not measuring from the bottom of the object to the trap-door.
Those students measuring falling heights and falling times using traditional techniques were typically awarded high marks during the direct assessment, but often lost marks in manipulating data, drawing an appropriate graph,
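For reference, the following is a minimal sketch of the standard relations underlying the data analysis in the thermal tasks (P4, P5) and the free-fall task (P7). The symbols are the conventional ones and are not taken from the task booklets, which are not reproduced in this report; in an electrical method for P4 the energy supplied is assumed to be measured as VIt (or read from a joulemeter).

$$
\begin{aligned}
\text{P4:}\quad & E = mc\,\Delta\theta, \qquad E = VIt \ \text{(electrical method)} \\
\text{P5:}\quad & E = m\,l \ \text{(fusion or vaporisation)} \\
\text{P7:}\quad & s = \tfrac{1}{2}gt^{2}, \ \text{so a graph of } s \text{ against } t^{2} \text{ is a straight line of slope } g/2
\end{aligned}
$$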

However, the small number of students who used “data-logging” methods to calculate g (using a “picket-fence” falling object and light sensors attached to a computer running specialist software) had immense difficulty in showing what measurements they took and how they used them; in some cases the computer drew the graph and calculated g, meaning that these students were unable to demonstrate data-handling abilities and did not gain a high mark during the indirect assessment. Safety issues did not seem to be apparent.

Task P8 (To measure the resistivity of the material of a wire.)

This was the only task based on the electricity portion of the current syllabus, which is typically not covered by students until sixth year, and some schools reported that candidates had not yet done this experiment in class. Examiners reported that students were generally proficient in setting up and using the equipment, although some had to request help in the appropriate use of a digital multimeter. Some students chose a mechanical callipers rather than a micrometer to measure the diameter of the wire and were penalised appropriately. In drawing and using a graph of R against l to calculate resistivity, most students faced a challenge they had not met before; nevertheless, examiners reported their surprise at how well many candidates understood how to draw and use the appropriate graph. Many students showed excellent understanding and synthetic skills in explaining why they might have obtained a different value for resistivity if the same measurements had been taken on a different day. Safety issues did not seem to be apparent.

Task P9 (To investigate the laws of equilibrium for a set of coplanar forces.)

This was the task which gained the highest average mark in both the direct and the indirect assessment. Examiners reported that students had little difficulty in selecting and arranging the apparatus and in taking measurements of distances and weights; some made errors in reading upward forces by reading from the “mass” side of the scale rather than the “force” side. The calculations of moments were a challenge to many students, who often made repeated attempts and often failed to get the moments to balance. A common error was a failure to account for the turning moment due to the weight of the metre stick itself. The last question (Which of the quantities you have measured contains the largest percentage error? Explain your answer.) allowed students to demonstrate their ability to evaluate and synthesise, and proved to be a highly discriminating question. Safety issues did not seem to be apparent.

Task P10 (To verify Boyle’s law.)

Examiners reported that students’ performance in the direct assessment portion of this task often depended on the experimental method they chose, i.e. the method made available to them by their school. Those using a traditional Boyle’s law apparatus (pump, valve, pressure gauge, oil, trapped gas) often had difficulties in taking readings and/or in using the apparatus in a reliable way. Those using a gas syringe attached to a digital pressure sensor typically experienced little or no such difficulty and consequently had the potential to be awarded higher marks during the direct assessment. Depending on the method used (and on the type of pressure gauge used), students had greater or lesser scope to explain how they accounted for atmospheric pressure in their experiment. Because no student was using dangerously high pressures, safety issues did not seem to be apparent for this task.
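Again for reference only, a sketch of the standard relations on which the data analysis for tasks P8–P10 rests, in conventional notation rather than that of the task booklets:

$$
\begin{aligned}
\text{P8:}\quad & R = \frac{\rho l}{A}, \quad A = \frac{\pi d^{2}}{4}, \ \text{so the slope of the } R\text{–}l \text{ graph is } \rho/A \\
\text{P9:}\quad & \sum \text{clockwise moments} = \sum \text{anticlockwise moments, about any point, at equilibrium} \\
\text{P10:}\quad & pV = \text{constant (fixed mass, constant temperature)}, \qquad p = p_{\text{gauge}} + p_{\text{atmospheric}}
\end{aligned}
$$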

Task P11 (To verify that acceleration is proportional to force.)

This was the only task in which data was given to the student as part of the task. This choice was made during task design because it was felt that asking students to repeat the same procedure – using, say, ticker-tape to determine an acceleration – six times would be a poor use of examination time. Students usually used either a runway (slightly elevated to counter friction) and trolley or a linear air-track and riders, together with either a ticker-tape timer or electronic light-gates; in some cases the task of setting up the light-gate circuitry was a significant challenge for students. It was felt that for this task the fact that students had to work alone, rather than as part of a small group, led to significant physical and organisational challenges. Some students used data-logging methods, which was an issue for students and examiners only when the data-logger was set up not just to measure distances and times but also to calculate acceleration directly. Safety issues did not seem to be apparent.

Task P12 (To verify the principle of conservation of momentum.)

As with the similar task P11 above, students usually used either a runway (slightly elevated to counter friction) and trolley or a linear air-track and riders, and either a ticker-tape timer or an electronic light-gate; again, in some cases the task of setting up the light-gate circuitry was a significant challenge for students. Some students used data-logging methods, which was an issue for students and examiners only when the data-logger was set up not just to measure distances and times but also to calculate velocity directly. For many schools it was not possible to make both task P11 and task P12 available simultaneously, usually because the school had access to only one air-track or runway system. Safety issues did not seem to be apparent.
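For reference, a sketch of the relations these two mechanics tasks examine, again in conventional notation rather than that of the task booklets:

$$
\begin{aligned}
\text{P11:}\quad & F = ma, \ \text{so for constant mass a graph of } a \text{ against } F \text{ is a straight line through the origin} \\
\text{P12:}\quad & m_{1}u_{1} + m_{2}u_{2} = m_{1}v_{1} + m_{2}v_{2} \ \text{(no external forces)}
\end{aligned}
$$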

8. Findings

8.1 Key messages in relation to the practical assessment and future practical assessment

General

• Students’, physics teachers’ and school managements’ responses to the notion of practical assessment of the senior sciences, and to the model trialled, were positive.

• There is general agreement that the teaching and learning of Physics would be enhanced by the introduction of practical assessment and that students would have a greater sense of involvement in, and ownership of, all the practical work they do.

• There is a perception that at the moment some schools do not carry out some or all of the mandatory practical work and that their students still perform well in the Leaving Certificate Physics examinations. This discourages some teachers from investing a lot of time and effort in practical work, and some students doubt the importance of the practical work in supporting their understanding of theory.

• At present, practical Physics classes are perhaps taken less seriously than theory classes, even by ambitious students. Even though students may enjoy practical work, they are not always fully incentivised to work skilfully, to work independently, to consider extensions to the experiments being conducted, or to devise improved methodologies for them.

• An increased emphasis on practical assessment in Physics would help students acquire transferable skills of observation and measurement, manual dexterity, and a good attitude to safety based on a balanced approach to hazards and their associated risks, that can be applied outside the laboratory.

Designated Support Teacher (DST) and Laboratory Preparation

• DST support of practical assessment, before and during the assessment, is critical.

• DSTs in the trial engaged very positively and were very supportive of the role of the examiner. DSTs were asked to account for and check all equipment, to arrange it suitably and to tidy away equipment as necessary after the session(s).

• The introduction of practical assessment in the science subjects would require a more planned approach by schools’ science departments to laboratory and storeroom management, the ordering of equipment, access to laboratories, etc.

• The introduction of practical assessment in Physics would require appropriate continuous professional development to be made available for teachers.

Examiners and Students

• Students in the trial were reported to have engaged very positively with the trial; the majority of students remained for the full session, although many had finished their work some time before the end of the 90 minutes.

• In order to avoid system failures, the examiners would need a time allocation in each school before the commencement of the practical sessions to check that all equipment necessary was available and in proper working condition.

• Examiner training would need to involve a practice session to familiarise examiners with the tasks.

• A student from a non-exam class acting as a helper (to call the DST, etc.) during examining sessions is recommended.

• Physics tasks should be designed to have two sub-tasks, the larger one similar to those used during this trial and the smaller one to assess a different aspect of practical skills.

• While it is possible to assess 12 students in a session, it is difficult to observe all relevant work by any one student in a group of 12, especially if one or more students require a significant amount of help. Examiner guidelines should be provided on the maximum time that could be spent helping a student in a session.

• In an actual practical assessment it is likely that students would be more nervous. However, it is also likely that they would have prepared more thoroughly for the assessment.

• It should be accepted that the direct assessment will typically be high-scoring and undiscriminating, while the indirect assessment will be far more discriminating.

• Where a student in a session had a catastrophic experience, of their own making or as a result of an error by another student – e.g. a collapse of assembled apparatus, or a breakage involving a spillage of a solution that could not be dealt with quickly, say within 5 minutes – it should be possible to reschedule the student to a later session (doing another task) to avoid claims of systems failure. The examiner could make a judgement about whether or not the student should start the later session carrying a penalty of 4 marks for an error.

• Some practical work specified as Higher Level in the current syllabus is designated common to both levels in the draft subject specifications on which practical assessment in the future will be based. Two marking schemes, differentiated at Higher Level and Ordinary Level, could be used: at Ordinary Level, weaker students could be given more credit for procedures and observations, with fewer marks assigned to detailed conclusions, explanations, data analysis and calculation, while at Higher Level more marks could be allocated for data handling and fewer for recording observations, explaining safety precautions, etc.

• Careful thought and attention needs to be given to redesigning parts of the draft subject specification to allow for fair practical assessment in Physics. This may mean that, for each practical activity, one specific experimental methodology may have to be detailed. A fair and reliable practical examination cannot be designed for a cohort of students who are carrying out significantly different experiments, using different techniques and different equipment and, consequently, facing calculations of widely varying degrees of difficulty.

• The use of data-logging equipment and/or computer software to perform calculations (as opposed to measurements) is to be discouraged. Students should become adept at analysing their data and performing calculations, and should expect to be assessed on data analysis and calculations arising from experimental work in any future Physics examination.
