
Connecting Competence and Quality: Scored Assessment in Year 12 VET

Patrick Griffin, Shelley Gillis, Leanne Calvitto

The University of Melbourne Assessment Research Centre


TABLE OF CONTENTS

Acknowledgements
Executive Summary
    Recommendations
The Project and Background
    Assessment and Reporting
    Standards Referencing
    The Notion of Competence
        Assumptions
    Changed Focus for ‘VET in Schools’
    Recording and Reporting
Trials
    Score Development
    Differential Weighting
The Outcomes
    Differentiating Scores
    Competency Interpretation
    Compatibility with System Practices
        New South Wales (NSW)
        Victoria
        Tasmania
        Australian Capital Territory (ACT)
        Queensland
        South Australia
        Western Australia
        The Assessment Model
Implications
    National
    Consistency
    Systems
    Teachers’ Practices
    Employers
    Parents
    Students
References
Appendix A
    Standards Referenced Frameworks
    Level Descriptions and Distributions
    Hospitality Units
    Distributions
Appendix B
    Standards Referenced Frameworks
    Level Descriptions and Distributions
    Business Studies Units
    Distributions
Appendix C
    Standards Referenced Frameworks
    Level Descriptions and Distributions
    Information Technology Units
    Distributions
Appendix D
    Standards Referenced Frameworks
    Level Descriptions and Distributions
    Metal and Engineering Units

Acknowledgements

The project team acknowledges the many teachers and students who contributed to this important study. We also thank the staff at ANTA who gave considerable support and members of ACACA agencies and Departments of Education throughout Australia, who encouraged, supported and in some cases provided examination data that was so important to the overall interpretation of the project results. It is also important to acknowledge the contribution of the students who agreed that their examination data could be analysed as part of this project. They have made an important contribution.


Executive Summary

The Federal Government, together with the NSW Department of Education and Training, launched a project in 2000 involving all state and territory education systems and ACACA agencies to examine expanding opportunities for youth and to identify ways of obtaining greater industry and university recognition of achievement in ‘VET in schools’ courses. In the report of the first phase of the study, Griffin, Gillis, Keating and Fennessy (2001) made a range of recommendations to the national working party headed by the then Director General, Dr. Ken Boston, for change in assessment and reporting in ‘VET in schools’ courses.

Three areas of recommendation included the development of:

1. standards-referenced frameworks for reporting performance in units of competence;

2. appropriate tasks and recording procedures that capture the complexity of workplace requirements and expectations with respect to the units of competence; and

3. reporting strategies that enable the retention of the competency decision and allow for quality of performance to be recognised in a differentiating score that could be used for university selection purposes.

Griffin et al (2001) recommended that a standards-referenced system be used for interpreting and reporting student performances in VET subjects, such that any differentiating score used for selection procedures should also be directly interpreted in terms of the competencies demonstrated. They further recommended that the reporting method should be linked to nationally endorsed training packages and acknowledge an underpinning developmental continuum of competence that incorporated the designation of competent/not-yet-competent status of the student. It was also recommended that such continua be developed for each unit of competence.

Item response modelling (IRM) was recommended as a relevant technique to enable school-based assessment and central examination data to be combined into a single differentiating score when both were used for assessment. In this project it has been possible to replace the IRM approach with a judgment model using subject matter experts (SMEs) to emulate the logic of the empirical approach. It was also necessary that a national credential continue to be issued by accredited agencies, since a national system of differentiated scores should not prevent registered training organisations from issuing the nationally recognised credentials and giving recognition to the competence of the persons assessed.

The initial report identified that these conditions were necessary if two fundamental purposes of ‘VET in schools’ assessment, central to this project, were to be met. Recognition of competence was a mandatory requirement. In addition, the project team was required to investigate a method of providing a differentiating score that could be used primarily for university selection. Both purposes were to be encompassed under the heading of “Greater Industry and University Recognition of Achievement in ‘VET in Schools’ Courses”. As such, the selection purpose of the assessment was foregrounded, and the project team set out to develop an assessment and reporting system that could provide both types of information, and to trial it nationally. It was expected that this would give credence to both recognition of competence and differentiation between students for the purposes of selection into university, other forms of further education, or indeed any other context where selection and differentiation were required.

Two other aspects of the study were important. The first was that the study, and the assessment model that arose from it, should ensure that ‘VET in schools’ courses retained, and perhaps even enhanced, their credibility within industry, further education and universities. It was also required that the model accommodate all state and territory approaches for ‘VET in schools’ teaching and assessment and all state and territory systems for developing university entrance scores.

The initial study was followed by a pilot study of the materials and approach. This was conducted in each participating state and territory, and workshops and consultations were conducted to ascertain the efficacy of the approach. The study recruited schools, and developed and reviewed materials in schools, Industry Training Advisory Bodies (ITABs) and enterprises. Approaches were made to all school systems to obtain access to the schools, and workshops were conducted with teachers and systems in Victoria, the ACT and WA. A web site was established for schools, and contact was made with all states through teleconferences.


The pilot study established a large sample of schools volunteering to participate and two states agreeing to provide examination data. Teacher reaction was supportive and enthusiastic, and materials were endorsed at workshops. The consulted ITABs supported the project by providing materials and by nominating personnel, the subject matter experts (SMEs), to assist in developing materials for the trials. The pilot study left little doubt regarding the overall support for the model proposed in the initial report.

Given this background, the project trialed assessment materials in Victoria, New South Wales, Queensland, the Australian Capital Territory, Tasmania, South Australia and Western Australia. Sixty schools across four industries received the materials and tested them in the classroom. The materials and the record forms were retained by the schools over a whole school year.

The data enabled a calibration of units using a series of quality indices linked to performance criteria. A process of SME judgments was developed and trialed to calibrate the criteria. The judgment-based calibration was compared with an empirical approach using IRM. The close match of the two procedures led us to believe that an SME approach could be used as a cost-efficient, valid approach to defining and calibrating quality criteria, which in turn led to the derivation of differentiating, interpretable scores based on judgments of performance quality. The criteria were directly derived from the elements and performance criteria in the national training packages.
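The report does not state which statistic was used to compare the judgment-based and empirical calibrations, so the following is only a minimal sketch, with invented numbers, of how an SME panel's ordering of quality criteria might be checked against IRM-estimated difficulties using a rank correlation.

    # Illustrative sketch only: the rankings and difficulty estimates below are
    # hypothetical, not data from the project.
    from scipy.stats import spearmanr

    # SME panel's ordering of six quality criteria for one unit (1 = easiest).
    sme_ranks = [1, 2, 3, 4, 5, 6]

    # Hypothetical IRM (Rasch) difficulty estimates for the same criteria, in logits.
    irm_difficulties = [-1.8, -0.9, -0.2, 0.4, 1.1, 1.6]

    rho, p_value = spearmanr(sme_ranks, irm_difficulties)
    print(f"Rank agreement between SME and IRM calibrations: rho = {rho:.2f}")

A high rank correlation of this kind is the sense in which the two calibration procedures can be said to closely match.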

Recommendations

1. It is recommended that there be a national approach to assessment for ‘VET in schools’ subjects and that the model proposed in this report be adopted nationally. The procedure will need the endorsement of the Australian National Training Authority. At present, the assessment approach leads to a dichotomous decision of ‘competent/not-yet-competent’ for each unit in the training packages. This dichotomous decision has disenfranchised students from further and/or higher education and resulted in an inferior status being attached to ‘VET in schools’ subjects. For some students this was a lifelong decision that closed off career options.


2. The model requires a modification of the idea of competence. It needs to incorporate the idea of students being able to demonstrate varying levels of performance. Support is needed for the notion that, while a decision of competent/not-yet-competent can still be made, there is room for expanding the possible number of levels of performance to enable a differentiated score to be assigned to students on the basis of the quality of their performance.

3. Standards-referenced frameworks need to be developed for every unit of competence. The development of these unit-level frameworks needs to be undertaken at a national level in order to standardise the reporting frameworks available to assessors. ITABs can be shown how to achieve this with minimum effort using a panel of subject matter experts. Experience in this project shows that each unit requires between half an hour and two hours of effort to develop a weighted rubric and scoring procedure that is equivalent to the outcome of a large-scale survey and empirical calibration process.

4. The project has illustrated that there is no need to change existing assessment methods that address the competencies outlined in the national training packages. It is not necessary to require the Australian National Training Authority, the Business Services Training Advisory Board and other agencies involved in the revision of the Workplace Trainer and Assessor Training Package to change their procedures, their advice, or the kinds of assessment strategies they provide. Current methods of assessment should be retained, but the training of assessors needs to incorporate the approach to interpreting standards-referenced frameworks.

5. Training in assessment is needed for teachers involved in ‘VET in schools’. Teachers will need to be provided with training in competency-based assessment that allows them to recognise and give credence to the quality of the performances they observe. Such training would also empower teachers to teach beyond the minimum level and would have flow-on implications for curriculum development. Teachers will need to be encouraged to recognise the higher levels of competence and to teach to levels beyond the minimum defined in training packages that have focused on a competence dichotomy.

6. Recording and reporting procedures for competence assessment also need to be modified without losing the idea of the competence decision. The competent/not-yet-competent decision should be retained. However, the concept of competence needs to be expanded, such that it incorporates the notion of adjusting the performance to the expectations of the workplace. It further needs to allow for the idea that different levels of performance can be accommodated within a revised definition of competence.

7. The same audit and sign-off procedures as are currently implemented in a competency assessment should be retained. That is, both the assessor and the assessee must agree upon the level of performance demonstrated, using the same range of evidence and the evidence guide as used in the current competency-based assessment system.

8. Reporting can be expanded to allow for performance quality to be documented and the basis of a score to be communicated to stakeholders. A transcript indicating performance quality should be provided for a subject or a national qualification as part of the certificate issued. For stand-alone units, the level of performance should be documented on any certificate of attainment. This profile of performance should become the nationally standardised reporting procedure used in competency-based assessment.

9. In completing the trials and the development of the assessment model, the project team has both established and followed a set of principles that are recommended for incorporation into competency-based assessment.

I. The system of assessment and reporting must be situated in a theory of learning and assessment.

This principle underlines the belief that a theory of learning, whether it is in the workplace or in the classroom, is important to the learning and assessment process used. Too often a so-called ‘good idea’ is the dominant force driving change. Without a solid theoretical basis we have often seen the eventual abandonment and rejection of procedures with the often-quoted statement ‘it seemed like a good idea at the time’. The theory of learning needs to be developmental, and a theory of assessment or measurement has to be consistent with and support such a theory of developmental learning. These were outlined by Griffin (2004).

II. The procedure and assessment must satisfy both criterion- and norm-referenced interpretation.

The model of assessment must support both the purpose of selection or differentiation and the recognition of competence, as applications of the two interpretation frameworks. The reporting in the project recommends both a differentiating score and a standards-referenced framework that indicates the quality of performance associated with the score. It emphasises both forms of reporting and interpretation.

III. The model, approach used, assessment method, materials and decisions must be transparent and externally verifiable through a formal audit process.

This is important to give credibility to an assessment and reporting model and to ensure that the competency assessment is acceptable to all relevant parties and stakeholders. Each unit of competency was examined and validated by national ITABs, ensuring that the method and materials used were consistent and reasonable. The link between the school-based assessment and the central examination also illustrated that the assessment procedures and data were verifiable using statistical, or IRM, procedures.

IV. The assessment procedure and the model must be resource-sensitive in both development and application.

The procedures and research methodology used in this trial study could not be used for routine assessments in the future. The project took more than four years and involved a national study with hundreds of teachers, numerous industry personnel and thousands of students. The model proposed in 2001 by Griffin, Gillis, Keating and Fennessy has been trialed and explored in terms of its capacity to accommodate selection and recognition in each of the state and territory systems. It has used sophisticated computer technology to analyse large-scale data collections. This cannot be applied at the national level by every ITAB for every unit of competence.

In addressing this principle we undertook to find a method that could be used by individuals, organisations and institutions that did not require access to, or even familiarity with, the highly sophisticated approaches of item response modelling (IRM). The resulting method applied the logic of IRM to a developmental assessment procedure. It yielded a weighted score in which the weighting was based upon the differentiating power of the criteria (a sketch of this weighting and scaling logic follows this list of principles). The study illustrated that, with practice and experience, the procedure could be carried out by a panel of subject matter experts in less than an hour, with results comparable to the intensive research-oriented empirical analysis employed in the project. The empirical process was important to validate the judgment-based approach.

V. The model and the approach to assessment and reporting must accommodate, with minimal change, the existing assessment procedures that workplace assessors have been trained to use.

Introducing large-scale changes to either the procedures or the psyche of workplace assessors would create problems for systems attempting to implement this model. Asking people either to reject or to change their existing understanding of competency-based assessment, or to remove and replace their existing materials and procedures, would fail. It is for this reason that we focused on how people record their judgment and how they communicate that judgment to various stakeholders. The procedure retains existing approaches to assessment and uses existing assessment materials. We have changed only the method of recording observations and of reporting the level of performance quality and the score aligned to it.

VI. The model and its procedures should be accessible to subject matter experts.

The subject matter expert was defined as a person nominated and approved by a national industry training advisory body who understands the training package, its workplace implementation and the manner in which the competency units, elements and performance criteria are usually manifested in the workplace. He or she was expected to be a person who could recognise and describe differences in the quality of the performance demonstrated in the workplace. If, on the other hand, the model had to be developed by theoretical specialists and remained inaccessible to SMEs, demanding instead the input of those who possess and have access to the highly sophisticated materials, techniques and software required for this project, only a few people would be able to carry out the task. It would also be prohibitively expensive.

The model that was trialed, after being proposed in the initial report, made the procedures for development and implementation accessible to everyone. If panels of experts were assembled to develop these procedures, the recommended system would be low cost and low effort in development. The procedures used in this project and elsewhere have enabled us to show that subject matter experts are able to develop equivalent procedures and equivalent materials, and that they were able to analyse and differentiate on the basis of difficulty to an extent equivalent to that obtained from the computer analyses. Two postgraduate theses have been written on this topic, one by Bateman (2003) and a second by Connally (2004), illustrating exactly this point.

VII. The procedure must have both face and construct validity.

Face validity must be based on evidence of the extent to which the procedure and the interpretation of evidence mimic workplace performances. It should look like the right thing to do. If it meets this condition it will generally have support among people in vocational education and in the workplace. In this regard, criteria based on the training package elements and performance criteria have greater validity than those based on generic approaches to assessment, such as the method or the general skill underpinning the student performance.

Construct validity requires that the underlying continuum of increasing quality of performance define the ability to adapt and to demonstrate different levels of performance depending upon differing expectations. In other words, construct validity demands that the levels of competence and levels of performance defined for units and subjects do in fact define a differentiating continuum that enables distinct levels of performance in workplace procedures to be identified and reported.

VIII. The procedures must be demonstrably fair, equitable and unbiased.

Fairness relates to the ANTA principles of flexibility and fairness; equitable means that, across all systems and all states, the assessment procedure should be applied in much the same way. Unbiased has a particular meaning: it requires that the results of the assessment be unrelated to factors such as gender, ethnicity, location or any other secondary variables that should not be taken into account in making competency statements.

For this reason it is important to be able to show that a national system of assessment is not affected by local, state or systemic factors and that the interpretation of the competency scales is identical across all systems of education. The analyses across states showed that there was little or no differential effect attributable to state location. In the few instances where differences due to location were identified, those differences could be controlled; where differences due to location are unknown, they cannot be controlled.


IX. The model must be communicative and satisfy the information needs of stakeholders, within a quality assurance context that must also be accommodated.

Bearing in mind that this project was initiated in order to obtain differentiated scores for students that would feed into a university selection process, it is important that universities obtain information they can use. In general, it is not the university that is the stakeholder for the direct assessment data. The universities’ admissions council in each state scales assessment data and produces the university admissions index, whether it is a UAI, a TER, an ENTER score, a TEE or an OP. The scaling procedures used by the university admissions committees generally take care of differences in difficulty between subjects and develop the scaled rank scores or bands for university selection. The model must provide these bodies with an appropriate differentiating score. Sometimes, and in some systems, there is an intermediate step where the data are standardised, as they are in Victoria, for instance, to a mean of 30 and a standard deviation of 7. This standardisation of scores must be able to be carried out to accommodate the score differentiation requirement. Other states have varying approaches to scaling the scored assessments; a more detailed description is provided by Griffin et al (2000) and a summary is included later in this report.

In addition, the assessment system must be able to produce the competent/not-yet-competent result and record for each student. For registered training organisations that use grading systems, the assessment model also had to accommodate such requirements in order to be consistent with extant national systems.

X. The scores and assessments must be amenable to statistical and/or consensus moderation to ensure consistency of decisions and accuracy of scores.

Statistical moderation can be achieved through scaling and standardisation procedures, whether or not an external scaling test is used. The purpose of moderation is to bring score distributions into alignment. In this instance a national moderation approach is possible and would be needed within each industry to ensure nationally comparable standards and interpretations. Statistical moderation and consensus moderation can also help to ensure that scores are sufficiently reliable and accurate to provide data suitable for scaling and the development of percentile ranks or bands.

Consensus moderation is particularly important at a local level, where judgments are applied and interpreted in terms of performance quality descriptions within a standards-referenced framework. These must be demonstrably present and transparent in any implementation of the system and able to produce evidence of local consistency.

Consistency measures are a problem in competency-based assessment. Existing studies of this issue and of reliability have not addressed it directly. There have been calls for a new paradigm for reliability, but the issue remains that reliability is the extent to which errors of judgment, measurement or observation are controlled. Studies of reliability and consistency have generally focused on the process, in the belief that if the process is consistent then reliability is underpinned and improved. This, however, is a statement of faith, or a ‘believe me’ approach, so common in competency-based assessment. Bateman (2003) indicated that this can be an approach, but there is no evidence of the extent to which it helps reliability in CBA.

This project has examined consistency from a number of points of view. It has explored measures of reliability and applied them to the judgments of the expert panels, using a measure called the standard error of judgment (SEj). While SEj provides a measure, it is still unclear how to interpret it, and further studies are needed. The issue of reliability will remain vexed as long as there is a judgment process and an element of ‘trust’ embedded in the process.
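The report does not publish the weighting formula or the scaling code, so the following is a minimal sketch, under stated assumptions, of the logic described in principles IV, IX and X: quality-criterion ratings are weighted by an SME-assigned index of differentiating power, summed to a raw unit score, and the cohort's raw scores are then rescaled to a chosen mean and standard deviation (here the Victorian convention of mean 30, standard deviation 7). All names, weights and ratings are invented for the illustration.

    # Illustrative sketch only: weights, ratings and the scaling convention are
    # assumptions, not the project's published procedure.
    from statistics import mean, pstdev

    def unit_score(ratings, weights):
        """Weighted raw score for one unit: each quality-criterion rating (e.g. 0-3)
        is multiplied by an SME-assigned weight reflecting its differentiating power."""
        return sum(r * w for r, w in zip(ratings, weights))

    def standardise(scores, target_mean=30.0, target_sd=7.0):
        """Linearly rescale a cohort's raw unit scores to a chosen mean and standard
        deviation, assuming the raw scores are not all identical."""
        m, sd = mean(scores), pstdev(scores)
        return [target_mean + target_sd * (s - m) / sd for s in scores]

    # Hypothetical cohort: three students rated on four quality criteria for one unit.
    weights = [1.0, 1.5, 2.0, 2.5]                    # harder criteria differentiate more
    cohort = [[3, 2, 1, 0], [3, 3, 2, 1], [2, 1, 1, 0]]

    raw = [unit_score(r, weights) for r in cohort]    # raw weighted unit scores
    print(standardise(raw))                           # rescaled to mean 30, SD 7

Any further scaling into percentile ranks or admissions bands would then be carried out by the tertiary admissions bodies described under principle IX.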


The Project and Background

The current project was carried out in four industries: Business Services, Metal and Engineering, Hospitality, and Information Technology. The feasibility report (Griffin, Gillis, Keating & Fennessy, 2001) and pilot study (Griffin & Gillis, 2001; 2002) outlined the procedures that were to be trialed nationally to examine the efficacy of a differentiated scoring system for ‘VET in schools’. The trials and the recommendations of the earlier project have shown how current approaches to competency-based assessment could yield a differentiated score in addition to the recognition of competence, without altering the fundamentals of the competency-based approach but instead focusing on a customisation of the record-keeping and reporting frameworks.

Assessment and Reporting

The assessment model (Griffin & Nix, 1990) adopted in this study defines the process of assessment and reporting as a purposeful process of observing, interpreting, recording and communicating outcomes to stakeholders. In this case, the purpose of assessment was to provide recognition of competence, selection through score differentiation, or both. The observations used to obtain this recognition or differentiation were based on the procedures already adopted by assessors in the workplace, endorsed by ANTA and national ITABs, and outlined in the training package for workplace trainers and assessors, in either its earlier development or its 2004 version. This report does not enter into detailed discussion of assessment methods. Suffice to say that each industry and each assessor group should continue to rely upon their industry expertise and knowledge to develop appropriate assessment methods for making decisions about the competence or non-competence of an assessee. The purpose of the project was to show how those assessment methods and the endorsed procedures could yield a differentiating score at the same time as providing recognition of competence.

Standards Referencing

Interpretation has always been an issue in competency-based assessment. Ever since competency-based assessment was introduced in the early 1990s and defined as an example of criterion-referenced or criterion-based assessment, it has assumed a limited and possibly misleading view of criterion referencing. In their 2001 report, Griffin et al illustrated how the definition of criterion-referenced assessment could itself be used to justify the competent/not-yet-competent dichotomy and still yield a differentiating score.

Glaser (1981) argued that criterion referencing incorporated a view of competence as the ability to progress along a continuum of increasing competence. His definition was taken as the basis for the development of materials and the interpretation of competence in this project. In a sense the project team was not arguing that the interpretation should shift away from criterion-based assessment, although it has been recommended that a standards-referenced approach be used. A standards-referenced approach is recent terminology for a form of criterion-referenced interpretation. It is important to note also that criterion referencing is not an assessment method and not a testing procedure; it is an interpretation framework.

Criterion-referenced interpretation has an interesting history in Australia. In the early 1980s in Victoria, the development of subject profiles within the school system was an extensive and theoretical approach to the development of criterion-referenced interpretation frameworks, leading to national profiles and curriculum statements. Profiles then led to a few years of ‘outcomes-based education’. Outcomes were defined in terms of increasing levels of competence within discipline areas in the school curriculum and were described using ‘progress maps’, but progress maps were generated by a small number of people capable of conducting item response model (IRM) analyses using sophisticated computer programs. Standards referencing was first proposed in Queensland in the late 1980s and early 1990s (Sadler, 1987) but gained great credibility with the McGaw (1997) and later Masters (1998) reports regarding the NSW Higher School Certificate.

The distinction between profiles and standards-referenced frameworks is difficult to identify; they may be the same thing. Hence, if a criterion-referenced framework or a standards-referenced framework were to be adopted, then recording methods of competent and not-yet-competent would need to be expanded so that the records of achievement by students in ‘VET in schools’ programs would record their level of performance as well as the competence dichotomy. Assessment and reporting strategies, interpretation methods and recording procedures will all need to reflect this extension of the current approach to interpreting evidence and predicting the quality of workplace performance. It will be necessary to communicate the level of performance in a way that is meaningful for purposes of both recognition and selection/differentiation. However, it may not be necessary to have the same method of communication for both purposes.

The current understanding of recognition requires that a person be described as having achieved a status of competent or not-yet-competent for units and elements in a training package. However, a continuum of competence at a unit level enables a report and a communication to be provided that indicate how well an assessee has performed on that unit of competence and that differentiate among those people classified as competent. While it is possible to maintain current methods of reporting and recording of competent and not-yet-competent, the system trialed in this project provides for future extensions of this classification system to allow the quality of the assessee’s expected workplace performance to be reported to stakeholders.

There are numerous stakeholders in this system. Students have a right to know whether or not they have been judged as having achieved a particular competence or whether they require additional training. They also expect to receive information about the opportunities that are available for training. There is also the teacher or trainer, who may also be the assessor, and who requires information on individuals and aggregated information about groups.

Employers wishing to make decisions about training for employees need to know what training plans are required, and this could be based upon the number of people in their employ who have not yet reached the level of competence required in that workplace. Employers wishing to induct new staff into their workplace may also wish to differentiate between applicants on the basis of the quality of their predicted workplace performance.

Universities make decisions about which students to select into their courses. Most use a ranking system based on Year 12 examinations to select candidates. Where a ranking system is used, it is usually a percentile rank reported at an assumed accuracy of two decimal places. So fine is the distinction between candidates seeking entrance to university that the errors of measurement must be minimised, and the quality and reliability of the assessment data need to be high, in order to allow such fine-grained discrimination. Item response modelling was able to assist in obtaining fine-grained scores. In order to achieve the goals of the project and to implement the procedures trialed, there was a need for some supplementation of existing practices in recording and reporting competency-based assessment.

The Notion of Competence

Griffin et al recommended in 2001 that quality of performance needed to be recognised in the definition of competence, and the idea of competence as a dichotomy was revisited. Competence has generally been defined as the capacity to meet the standard of performance expected in the workplace. This was a serviceable definition of competence in the introductory period of competency-based assessment and training. However, the experience of industry, educators and administrators has led to recognition that there is no fixed standard expected across the many workplaces within an industry throughout the country. Employers exploit their competitive advantage by arguing and insisting that their workers are able to demonstrate superior performance against the competencies in training packages and that they expect and achieve higher standards than their competitors. If this is the case, then it becomes difficult to sustain the argument that there is a single defining point on a continuum that indicates that competence (the standard expected in the workplace) has been achieved.

The idea of a continuum was also important for other reasons. Regardless of whether there are two levels defined as competent and not-yet-competent, a continuum that consists of only two levels does not have a single, stable and invariant cut point across all workplaces in the same industry, even for the same competence. So we proposed a different view of competence. We suggested that competence could be defined as a person’s capacity to adjust their performance to the varying demands expected in workplaces. This definition of competence incorporates the previous one, that a person can meet the standard expected in the workplace, but it also says that a person experiencing different workplaces can adjust their level of performance to the varying standards encountered across those workplaces. Where this is the case, we could also incorporate, in a definition of competence, the idea of variability of expectations and of different levels of performance on a continuum of increasing competence. In order to develop the continuum, a series of assumptions is made (Griffin, 1997) that underpin the use and development of continua of increasing competence.

Assumptions

1. A set of underlying continua can be constructed that describe development or growth in specific domains of learning. The continua define constructs that are measurable and have direction and units of magnitude.

2. The continua do not exist in and of themselves; they are empirically constructed to assist in explaining observations of learned behaviour.

3. Each continuum can be defined by a cohesive set of indicative behaviours representing levels of proficiency in the area of learning. These behaviours can be demonstrated through the performance of representative tasks that can be regarded as either direct or indirect indicators of competence.

4. Not all behaviours can be directly observed. Related, indirect behaviours can be used, along with directly observable behaviours, to describe competency or ability at any point on the continuum.

5. The indicators (behaviours or task descriptions) may be ordered along a continuum according to the amount of proficiency, competence or ability required to demonstrate a satisfactory performance or success on each task.

6. People can be ordered along the continuum according to the behaviours they are able to exhibit or the tasks that they are able to perform, and the quality of the performance. The behaviours that cluster at points on the continuum can be interpreted to provide a substantive interpretation of the level of proficiency or ability of people at the same point on the continuum.

7. It is not necessary to identify or to observe all possible behaviours or indicators in order to define the continuum. The continuum can be defined by any representative, cohesive sample of indicators that covers a range of levels on the continuum.

8. There is no one correct sample of indicators, tasks, test items or pointers that exclusively defines the continuum or the domain, although there may be a set of indicators that is generally agreed upon as important in defining the continuum. Once the continuum is identified, together with the underpinning construct, samples of indicative tasks can be interchanged.

9. While the indicators that are used to define the continuum are homogeneous and cohesive as a set, there is no causal or dependent relationship between them. It is neither necessary nor obligatory to observe lower order indicators in order to observe higher order behaviours. The existence of higher order indicators implies the ability to demonstrate lower order indicative behaviour. The relationship is probabilistic, not causal.

There is an advantage to this approach to defining the developmental continuum or the standards-referenced framework. The theory underpinning IRM analyses, on which this approach was modelled, argues that when the competence of the person is equal to the demands of the task (that is, both are located at the same point on the continuum) the odds of success are 50/50. From this it can be deduced that, if the person were to improve a little, he or she would have a better than even chance of succeeding on tasks at that point on the continuum. It could be argued that the main outcome of training is to increase the odds of success at each of these competency levels. The demonstrated performance level is defined by the clusters of tasks or quality criteria at levels on the continuum. Moreover, the 50/50 odds at the transition points can be linked to a change in the required performance quality, and this can be translated directly into an implication for training: if the skill changes, the training strategy changes. However, these odds were not considered suitable for competency-based assessment, so all analyses were conducted so that the assessments were calibrated at a 95/05 chance of success, and the interpretation was adjusted accordingly to indicate that the person had a very high chance of performing to the described level of quality.
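The algebra behind these odds is not spelt out in the report, but it follows directly from the Rasch model that underpins IRM; the 2.94-logit margin shown below is a standard property of that model under the 95/05 calibration described above, not a figure taken from the project's analyses.

    % Rasch model: probability that a person of ability \theta succeeds on a
    % criterion of difficulty \delta (both located on the same logit continuum).
    P(\text{success} \mid \theta, \delta) = \frac{e^{\theta-\delta}}{1+e^{\theta-\delta}}

    % When \theta = \delta the odds are even (P = 0.5, the 50/50 case).
    % Calibrating instead at a 95/05 chance of success requires
    \frac{P}{1-P} = \frac{0.95}{0.05} = 19
    \quad\Longrightarrow\quad
    \theta - \delta = \ln 19 \approx 2.94 \ \text{logits}.

In other words, reporting at the 95/05 calibration describes the quality levels a person is very likely to demonstrate, rather than those at which success is merely an even bet.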

The first point (task or criteria grouping) is justified on statistical and conceptual grounds if the criteria have behaved in a cohesive manner that enables an interpretation of an underpinning continuum. This is sometimes described as behaving in a Rasch-like manner because such cohesion is also a requirement of the Rasch (1961) model (IRM) analysis. The second point (labelling the skills) is based on conceptual rather than statistical grounds. If the criteria within a group do not suggest a meaningful and unifying set of skills or competencies, the set of criteria may need to be ‘adjusted’ to make the interpretation clearer. That is, some items may need to be omitted because, despite statistically appropriate qualities, they may not be conceptually relevant to the underlying continuum or to identifiable and comprehensible levels within it. This is a far more powerful reason for omitting or adjusting criteria in an assessment recording procedure than any statistical analysis; under these circumstances, they might not belong in the assessment at all. These procedures can, at times, also identify gaps in the indicator set.

There is a further advantage to this procedure. If the qualitative analysis undertaken on the IRM results ‘back translates’ to match or closely approximate an original SME-developed continuum, it can also be used as evidence of validity. The technique of ‘levels’ has been used sparingly but is increasingly emerging in international studies; for example, Greaney and others used the procedure in their report on the ‘Education For All’ project (Greaney, Khandker, & Alam, 1990).

The project team proposed a standards-referenced framework to enable employers and workers to benchmark and aspire to perform at levels beyond a minimum acceptable level that may have been interpreted via the training packages. What was sought was the capacity to train and develop workers’ skills to higher levels and to expect training strategies to achieve levels of excellence that existing training packages neither define nor demand. This trial has introduced these ideas into competency assessment at Certificates I and II. That having been said, it should also be pointed out that the methodology used in this project arose from another study by Griffin, Gillis, Connally, Jorgensen and McArdle (2000, 2003), which explored the application of a standards-referenced model at the Advanced Diploma level of the AQF.

Changed Focus for ‘VET in Schools’

An important change that this project proposed was that the difference between ‘VET in schools’ subjects and those subjects often regarded as mainstream academic subjects be completely removed. If more VET subjects are to be used for university selection, the current practice in many systems of clearly classifying VET subjects outside the university preparation stream will need to be removed. VET subjects should be made available to all students and included in the procedures used to calculate the university entrance score. Their exclusion has lowered the esteem of VET subjects, despite considerable efforts having been made in many systems to expand the range of subjects taken as equivalent to academic mainstream subjects. This project, and the methodology employed, has demonstrated that there is no difference between VET and any other subjects and that all subjects should be included in a single pool from which students select according to their abilities, interests and aspirations.

Differentiation among the subjects in terms of difficulty, desirability or prerequisites is a matter for another forum. Tertiary admission councils, in their scaling procedures, are able to take into account the difficulty levels of different subjects. The purpose of the project was to make sure that the subject was at least eligible for such treatment and that students were not forced into lifelong decisions about which subjects they might take and which career paths were open or closed to them.

Recording and Reporting

A major change that can result from this project is the focus on record-keeping and reporting: record-keeping in terms of levels of competence; reporting in terms of scores and levels of performance quality. In order to do this, the project team took the training package unit, element and performance criteria and, also taking into account the Training Package Range of Variables and the Evidence Guide as sources of information, worked collaboratively with the industry training advisory boards of each of the four industries engaged in the project to develop a new level of criteria. These have been called the ‘quality criteria’. This approach expanded the performance criteria, which describe the job tasks to be performed, and addressed the issue of how well each job task was done. Each performance criterion was then examined. Some performance criteria had two identifiable levels of quality, some three, some four, but in all cases the project followed a set of principles and a set of procedures for defining the criteria. When a set of quality criteria is combined with a performance criterion as a rating scale, the composite is called a rubric. The procedures and rules for the rubrics were outlined in the initial project by Griffin et al (1997). Rubrics were written and linked to the performance criteria. Figure 1 illustrates the link between subjects, units, performance criteria and quality criteria.

Figure 1: The hierarchy of criteria for interpreting differentiating scores in a training package. [Diagram: each Subject comprises Units; each Unit comprises Elements; each Element comprises Performance Criteria; each Performance Criterion is expanded into several Quality Criteria. Rubric = performance criterion and quality criteria combination.]
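The hierarchy in Figure 1 can be read as a simple nested data structure. The sketch below is illustrative only: the class names are invented, and the example unit fragment is abridged from the hospitality unit used in the later figures; it is not part of the project's materials.

    # Illustrative only: class names and the abridged example are not project materials.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Rubric:
        """A performance criterion combined with its ordered quality criteria."""
        performance_criterion: str
        quality_criteria: List[str]          # ordered from lowest to highest quality

    @dataclass
    class Element:
        name: str
        rubrics: List[Rubric] = field(default_factory=list)

    @dataclass
    class Unit:
        code: str
        name: str
        elements: List[Element] = field(default_factory=list)

    # Abridged fragment of the hospitality unit shown in Figures 2 and 3.
    unit = Unit(
        code="THHCOR02",
        name="Work in a socially diverse environment",
        elements=[Element(
            name="Communicate with customers and colleagues from diverse backgrounds",
            rubrics=[Rubric(
                performance_criterion=("Value customers and colleagues from different "
                                       "cultural groups and treat them with respect and sensitivity."),
                quality_criteria=[
                    "Describes the key characteristics of a broad range of different cultural groups",
                    "Explains the significance of cultural diversity and values when dealing "
                    "with colleagues and customers within the hospitality industry",
                ],
            )],
        )],
    )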


Representatives of ITABs were shown how to write rubrics according to the following rules (Griffin, 1997); these are illustrated in Figures 2 and 3. Rubrics must:

1. reflect levels of quality of performance. Each recognisably different level of quality needs to be defined within each task or criterion. They should reflect the quality of cognitive, affective or psychomotor learning demonstrated in the students’ performances;

2. enable an inference to be made about developmental learning. They should not be just counts of things right and wrong;

3. discriminate between levels of learning and performance quality;

4. be based on an analysis of samples of performance, and the samples should cover a diverse range of levels of performance;

5. be written in a language that is unambiguous and easily understood by all appropriate assessors. The language should be descriptive, enable inference and avoid the use of comparative terms;

6. be written such that students can verify their own performance against the rubrics;

7. be developmental, so that each successive level code implies a higher level of performance quality;

8. be internally coherent, such that they consistently describe performances in the same domain of learning;

9. reflect the level of performance quality (or difficulty) relative to all other rubrics and codes as stipulated in a quality matrix; and

10. lead to reliable and consistent judgments across judges. To this effect no task or sub-task should have more than four or five levels. If more levels are required, the task or sub-task should be split for coding purposes and two sets of rubrics developed.

The examples that follow illustrate these principles from the point of view of the units in the Hospitality Training Package. The same principles and examples apply to all four industries. They have all been included in the materials previously provided to the Industry Training Advisory Boards and approved by them.

Figure 2: An example of units, elements and performance criteria.

Unit THHCOR02: Work in a socially diverse environment

ELEMENT 1: Communicate with customers and colleagues from diverse backgrounds.
1.1 Value customers and colleagues from different cultural groups and treat them with respect and sensitivity.
1.2 Take into consideration cultural differences in all verbal and non-verbal communication.
1.3 Communicate through the use of gestures or simple words in the other person's language, where language barriers exist.
1.4 Obtain assistance from colleagues, reference books or outside organisations when required.

ELEMENT 2: Deal with cross cultural misunderstandings
2.1 Identify issues which may cause conflict or misunderstanding in the workplace.
2.2 Address difficulties with the appropriate people and seek assistance from team leaders or others where required.
2.3 Consider possible cultural differences when difficulties or misunderstandings occur in the workplace.
2.4 Make efforts to resolve misunderstandings, taking account of cultural differences.
2.5 Refer issues and problems to the appropriate team leader/supervisor for follow-up.

Each of these elements and performance criteria was then expanded to address the issue of 'how well' these performance indicators could be demonstrated. In each case a number of levels of performance quality (quality criteria) were defined by the specialist panels nominated by the ITABs. This was the first time such an exercise had been attempted and there was a great deal of uncertainty among the panel members regarding whether there was a developmental sequence among the indicators for any specific performance criterion. In some cases there was a tendency to use 'steps taken' as indicators of quality, and this was discussed and remedied. The uncertainty of the development remained, however, and a compromise was reached in the design of the sheet used for recording observations. Some ITABs wanted every quality criterion to be recorded as present or absent. While this demonstrated a lack of confidence in the panels' definitions of development and

performance quality, it also created a large task for teachers, who now had to record every quality criterion rather than choose the one that best matched the student's performance. Not surprisingly, there was some resistance by teachers to this task, as it multiplied the amount of recording by a factor of approximately three, rather than keeping it the same as the existing requirement to record every performance criterion. To some extent this led to some teacher non-compliance. Nevertheless, this was the response mode required by the ITABs and the data were collected in this manner. It allowed a check on the developmental sequence of the indicators, and the data could also be recoded to allow an examination of the indicators as a rating scale, as designed initially. The recommended format is in Figure 3a. The format actually used to allow for the developmental process is illustrated in Figure 3b.

Figure 3a: An example of the unit, element, performance criteria and quality criteria

using a rating scale response format.

Figure 3b: An example of the unit, element, performance criteria and quality criteria

using a checklist response format.

Performance Scoring Sheet

Unit Code

THHCOR02B

Unit Name Work in a socially diverse environment

Which of the following best describes your relationship to the candidate.

I am the student’s:

Teacher Trainer Workplace Supervisor

ELEMENT 1: Communicate with customers and colleagues from diverse backgrounds.

Yes

No

1.1 Value customers and colleagues from different cultural groups and treat them with respect and sensitivity.

Can you confirm that the student can:

Maintain a patient, courteous and helpful manner when dealing with customers/colleagues from a range of diverse backgrounds, even under situations of time pressure.

Describe the key characteristics of a broad range of different cultural groups in the Australian society and the principles that underpin cultural awareness.

Explain the significance of cultural diversity and values when dealing with colleagues and customers within the hospitality industry.

1.2 Take into consideration cultural differences in all verbal and non-verbal communication.

Can you confirm that the student can:

Describe a range of verbal and non-verbal communication strategies that are appropriate to a socially diverse environment.

Apply knowledge of different cultures and cultural characteristics when communicating with colleagues and customers.

1.3 Communicate through the use of gestures or simple words in the other person’s language, where language barriers exist.

Can you confirm that the student can:

Use appropriate gestures or simple words in the other person’s language to try to overcome language barriers.

Apply a range of communication strategies to try to overcome language barriers.

1.4 Obtain assistance from colleagues, reference books or outside organisations when required.

Can you confirm that the student can:

Refer customers to a colleague or team leader when experiencing difficulties communicating with customers from diverse cultural backgrounds.

Obtain external assistance when communication blockages cannot be overcome within the establishment.

Obtain timely assistance from colleagues, reference books or outside organisations when required whilst maintaining customer satisfaction.

(The sheet is annotated to show its parts: Unit, Element, Performance Criterion, Quality Indicator and Response Boxes.)

Trials

The trials were conducted in four industries using a total of 56 competency units.

Seventeen units in Metal and Engineering, 15 units in Information Technology, 12

units in Business Administration and 14 units in Hospitality were developed for the

trials. Sixty schools were approached to participate in the project on the advice of

each of the state jurisdictions. There were nine schools approached in the Australian

Capital Territory, twelve in New South Wales, five in Queensland, six in South

Australia, thirteen in Victoria, five in Tasmania and ten in Western Australia. The

schools were distributed over the four industries and the details of this were reported

as part of the pilot study (Griffin & Gillis, 2002).

Score Development

A method of defining quality criteria for each performance criterion was developed

which in turn led to the derivation of a raw score at both unit and subject levels. The

scores at unit level or performance criterion level needed to be weighted. The decision

on how to weight the scores at unit or performance criterion level was informed by the theoretical perspective of item response modelling (IRM). IRM weights the criteria

(or score points) according to the criterion’s capacity to differentiate or discriminate

between students. This was an important point and consistent with the overall goal of

the project to provide a differentiating score.

If the scores were simply added across criteria within a unit and then across units

within a subject, the greatest contribution to a total score would have been made by

the performance criteria that had the largest number of quality criteria. The greatest

contribution to a subject score would therefore be made by the units that had the largest number of elements, performance criteria and quality criteria. In order to increase the relative importance of a unit, a larger number of quality criteria would need to be defined. It might be argued that this could be a correct procedure, but it generally leads to a practice of insisting that all criteria have the same number of levels or score points, and to sometimes arbitrary methods of weighting being used to influence the importance of the rubric. This reflects a belief that giving a criterion more levels of quality raises its importance or influence. However, the

number of levels of quality for a specific performance criterion is not necessarily

always an indication of how well a unit, an element or a performance criterion would

differentiate between students. Applying the logic of IRM is an important procedure

in making this assessment model accessible. It also provided a way of defining the

underlying continuum in terms of a developing competence linked to the training

package. IRM weights the rubric according to its discriminating or differentiating

influence. A differentiating score is the precise outcome required in this project and it

made sense to use a weighting method directly related to that purpose.

Thus, as the purpose of this project was to produce a differentiating score, the scores

within performance criteria were weighted on the basis of their capacity to

differentiate between students. In adopting this approach we have used the purpose of

the project as a mode of weighting scores. Having produced a weighted score,

allowing for differentiation, it was then possible to standardise the scores across units

to provide a subject score and then to scale those standardised subject scores to

produce a university entrance score (where scores were required) or, in the case of

Queensland, an OP band, as defined in the initial report by Griffin, Gillis, Keating and

Fennessy (2001).
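To make the weighting step concrete, the following sketch illustrates the general idea in Python. It is not the IRM calibration used in the project: a simple item-total correlation stands in for the discrimination weight, and the ratings, like every name in the snippet, are invented for illustration.

import numpy as np

# Hypothetical quality-criterion codes awarded to five students on the four
# performance criteria of one unit (rows = students, columns = criteria).
ratings = np.array([
    [1, 0, 1, 2],
    [2, 1, 1, 2],
    [3, 2, 2, 3],
    [1, 1, 0, 1],
    [2, 2, 1, 3],
], dtype=float)

total = ratings.sum(axis=1)

# Discrimination proxy: how strongly each criterion separates students,
# measured as its correlation with the unit total.
weights = np.array([np.corrcoef(ratings[:, j], total)[0, 1]
                    for j in range(ratings.shape[1])])
weights = np.clip(weights, 0.0, None)      # ignore negatively discriminating criteria
weights = weights / weights.sum()          # normalise the weights

weighted_unit_scores = ratings @ weights   # one differentiating score per student
print(np.round(weighted_unit_scores, 2))

Criteria that do little to separate students contribute little to the weighted score, which is the property the project relied on when it used IRM to weight the rubrics.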

However, as a number on its own (even as a differentiated score) a score has little

substantive meaning unless it is linked to the competence it is meant to represent in

the relevant training package. The debate on grading, while antithetical to the idea of

levels of quality performance, opens the possibility of using a system of reporting that

looks like grading but which describes in performance terms how well a person has

performed in a unit or a subject. The differentiation between grading and a standards-

referenced framework was reported by Griffin and Gillis (2002). In essence the

difference is that grading is a normed approach to reporting and it is usually reported

as a letter grade ranging from A to F, for example, with the grades determined by a

distribution and no real substantive interpretation of the letter grades other than

relative locations in the distribution. By comparison, a standards-referenced

framework has no a priori distribution across the levels. Ideally, all students could

demonstrate performance at the highest possible level. This is an important distinction

between grading and standards referencing. The model has provided methods that

enabled an interpretative and differentiating score to be provided to employers,

universities, schools, ACACA agencies and a range of other audiences.

Differential Weighting

Figure 4: The SME judgment matrix approach - a simulated matrix allowing for weighting by differentiating power.

Weighting and interpreting the criteria are important. The weighting of scores needs

to be based upon the scoring rubric’s capacity to differentiate amongst students. It

ought not to be a raw score in which every criterion has the same weight, nor should it be a weighting that is artificial, based upon an arbitrary notion of relative importance or criticality. Given the requirement of the project to produce a score that differentiates between students on the quality of their performance, the weighting procedure developed and used by a panel of subject matter experts was an important approach to

moderation of judgments. Item response modelling generally does this by mapping

task or score point difficulty against student ability. The SME judgment matrix

achieves the same thing. It is similar to the logic embedded in the development of the

SAI (Subject Achievement Indicator) in Queensland.

In a three- or four-point rating scale, it is possible to determine how difficult it is for

students at a level of competence to demonstrate a performance of a quality that can

earn each score point compared to the difficulty of scoring any other criterion’s score.

A score of four on a particular criterion, for example, might be extraordinarily

difficult to obtain whereas a score of four on another criterion might be relatively

simple to obtain. The score of four that is more difficult to obtain therefore identifies

the more capable people. The score on the easier criterion does not identify the more

capable people but does identify and describe the highest level of performance for that

specific criterion and helps to identify and describe the performance of students at

lower levels of performance.

Each column in the IRM matrix shown in Figure 5 represents a performance criterion.

The labels 1.1 and so on represent the element and performance criterion within the

unit. For example, Element 1 had four criteria; Element 2 had five. The numbers in

the vertical columns are codes for the quality criteria, which are also summarised in

the cell of the matrix. The codes can be replaced by the written form of the quality

criteria and in many cases the codes only are used to represent the relative difficulty

of the criteria. The numerical codes become the score assigned to the student

performance on each performance criterion. The height of the code indicates how

difficult it is to achieve that score or how much the quality criterion can discriminate

between students. The most difficult performances are at the top and the easiest ones

are at the bottom. A qualitative analysis of the descriptions of the collection of

quality criteria for performances at the top provides a description of a type of

performance and this is inserted at the right of Figure 6. Similarly, quality indicator

codes at the bottom are qualitatively analysed and so on. The number of levels is a

matter of judgment using the nature of the descriptions as a guide as well as an

inspection of the way the criterion codes cluster vertically.
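A minimal sketch of how such a matrix can be laid out is given below; the difficulty values and codes are invented, and the three bands are arbitrary illustrative cut points rather than anything derived from the trial data.

# (performance criterion, quality-criterion code) -> judged or calibrated difficulty
difficulty = {
    ("1.1", 1): -1.5, ("1.1", 2): 0.4, ("1.1", 3): 1.8,
    ("1.2", 1): -0.9, ("1.2", 2): 0.6,
    ("1.3", 1): -1.2, ("1.3", 2): 1.1,
    ("1.4", 1): -0.8, ("1.4", 2): 0.1, ("1.4", 3): 1.6,
}

bands = [(-float("inf"), -0.5), (-0.5, 0.8), (0.8, float("inf"))]  # low, middle, high
criteria = sorted({pc for pc, _ in difficulty})

# Print the matrix with the most difficult band at the top and the easiest at the bottom.
for lo, hi in reversed(bands):
    cells = []
    for pc in criteria:
        codes = [str(code) for (p, code), d in difficulty.items() if p == pc and lo <= d < hi]
        cells.append(",".join(codes) or ".")
    print("  ".join(f"{cell:>4}" for cell in cells))
print("  ".join(f"{pc:>4}" for pc in criteria))

Clusters of codes that end up at a similar height are then read together and interpreted as a theme, in the same way the SME panels worked with criteria written on post-it notes.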

Figure 5: An example SME judgment matrix showing differentiation by quality criteria.

The SME procedure requires that the panel members place each score point (or

quality criterion) in the vertical dimension using a judgment of the relative difficulty

of each criterion. This can also be achieved by writing the quality criteria on post-it

notes and placing them on a wall allowing the height to represent the relative

difficulty of the rubric. This simple procedure is readily learned and implemented.

Clusters of rubrics in generally similar horizontal levels or heights are then identified

as they spread across performance criteria. The SMEs interpret the clusters of score

points or rubrics in the same way as they would for an IRM analysis and identify any

common theme that the clusters suggest. This is a panel procedure and definitely not

an individual task. The themes become the bands or the levels of performance quality

that students might have demonstrated for each unit. Each vertical column still represents a performance criterion and each entry in the vertical column represents a quality criterion. Again it is emphasized that this is an example and the process applies to ALL industries studied in this project.

In Figure 5, for Hospitality unit THHCOR02B, the quality criteria run from the most difficult at the top to the easiest at the bottom: 'Explains significance' and 'Uses experience'; 'Applies knowledge of different cultures' and 'Seeks timely assistance whilst maintaining customer satisfaction'; 'Describes key characteristics', 'Applies a range of communication strategies' and 'Explores a variety of factors'; 'Describes strategies to deal with', 'Evaluates strategies' and 'Constructive, concise reporting'; 'Seeks external assistance'; 'Maintains patience, courteousness' and 'Promptly conveys'; 'Describes verbal and non-verbal', 'Lists a range of situations', 'Identifies blockages' and 'Refers issues'; and, at the bottom, 'Uses appropriate gestures and simple words' and 'Refers customer to colleagues'. The columns are the performance criteria 1.1 to 2.5 (culture, communication, gestures, assistance, conflict, assistance, differences, resolve and refer).

Figure 6: An example IRM empirical matrix of calibrated quality indicators.

The score codes provide a way of deriving a differentiating score for the unit. In

Figure 7, the score ranges of 1 to 3, 4 to 7, 8 to 12 and 13 to 19 indicate levels 1, 2, 3

and 4 respectively. Using discrimination or the capacity to differentiate amongst the

students as a method of weighting made two things possible. First, it was possible to

derive a raw score that differentiated between students on the basis of the quality of

the performance; the raw score sub-ranges indicate a specific level of performance quality and lead to a direct interpretation in terms of the competencies in the training package, reinforcing the validity of the procedure. Thus, the principles set out in this report were adhered to.

In Figure 6, the distribution of students is mapped against the calibrated quality criteria for performance criteria 1.1 to 2.5 of unit THHCOR02B, with band interpretations ranging from 'Demonstrates limited ability to work in a socially diverse environment', through 'At this level the student is patient, courteous and helpful when dealing with customers from a range of socially diverse backgrounds' and 'At this level the student can apply a range of communication strategies to overcome language barriers', to 'At this level the student can use their experience to select from a range of strategies the most appropriate for handling cultural misunderstandings and avoid conflict'.
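A score conversion of the kind shown in Figure 7 amounts to a simple lookup; the sketch below uses the ranges quoted above (1 to 3, 4 to 7, 8 to 12 and 13 to 19 for levels 1 to 4) and treats a raw score of zero as level 0, which is an assumption made for illustration.

def unit_level(raw_score: int) -> int:
    """Convert a raw unit score into a performance level using illustrative cut ranges."""
    cut_ranges = {1: range(1, 4), 2: range(4, 8), 3: range(8, 13), 4: range(13, 20)}
    for level, scores in cut_ranges.items():
        if raw_score in scores:
            return level
    return 0  # below the minimum score points in the rubric

for raw in (0, 2, 5, 9, 15):
    print(raw, "-> level", unit_level(raw))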

Figure 7: Developing a score conversion for a competency unit.

In this project, the interpretation of the standards-referenced frameworks derived from

the SME matrices was cross-checked against those developed from the empirical IRM

analysis.

Figure 8: Example of interpretation of the bands in a calibrated IRM unit analysis.

In this example (shown in Figure 8), four levels of performance were identified. Level

zero was generally classified as a level occupied by students who did not meet even the minimum score points in the rubric. This could (and perhaps should) be renamed, but in this report the zero designation is used for emphasis.

The themes or band descriptions are at times quite long. The length and detail of each of the score points are important to convey the kinds of performances and the level of quality being demonstrated. However, it would become cumbersome to do this in every case. A solution would be to summarise the detail into a short nutshell statement (Griffin, Smith & Ridge, 2002) as a way of communicating the general idea of the level of performance. These are shown on the right of Figure 8.

THHCOR02B Work in a socially diverse environment

At this level the student can use his or her experience to select, from a range of strategies, the most appropriate for handling cultural misunderstandings and avoiding conflict. He or she can explain the events leading to cross cultural misunderstandings, and apply knowledge of different cultures and cultural characteristics when communicating with colleagues and customers. (Level 3: Avoids, and when required, resolves cultural misunderstandings.)

At this level the student can apply a range of communication strategies to overcome language barriers. He or she can explain the significance of cultural diversity and values when dealing with colleagues and customers. The student can also list situations that could result in cross-cultural breakdowns, and explore factors that may have led to difficulties and report them where necessary. (Level 2: Displays cultural awareness and sensitivity.)

At this level the student describes and uses a range of verbal and non-verbal communication strategies that are appropriate to a socially diverse environment. The student can describe the key characteristics of a range of cultural groups and the principles that underpin cultural awareness. (Level 1: Uses various communication strategies when dealing with diverse groups.)

At this level the student is patient, courteous and helpful when dealing with customers from a range of socially diverse backgrounds. The student can refer customers to a colleague or team leader when experiencing difficulties. (Level 0: Requires support to work in a socially diverse environment.)

This was an important step because the nutshell statements can be used to describe

and code performances at unit level. These can be aggregated across units to obtain

scores for subjects or certificates. This enabled us to shift the focus of the assessment

from performance criteria to units of competence and to report a score for a subject or

certificate as a combination of units. It is therefore possible to use the nutshell

statements derived from the unit qualitative analysis to report by unit, and to

aggregate clusters of units for a subject or certificate and report this as a level of

performance and a differentiating score for purposes of scaling and incorporating into

university selection procedures. The model therefore had considerable built-in

flexibility. In some cases, systems or ITABS might consider this as a starting point for

assessment and reporting. Figure 9 illustrates the comparison of empirical and SME

panel approaches and shows how interpretations were cross-checked. There were

some differences between the SME-developed frameworks and the empirical

interpretations. In a few cases differences were minimal or non-existent. In one or two

instances it was necessary to restructure both the SME matrix and the level descriptions, relying on the empirical analysis rather than the subject matter analysis.

Figure 9: Comparing the IRM and SME interpreted matrices.

SME levels (highest to lowest): Level 4, Applies knowledge and experience to avoid and resolve cultural misunderstanding; Level 3, Displays cultural awareness and sensitivity to minimize conflict/misunderstanding; Level 2, Displays patient, courteous and helpful behaviour when dealing with diverse groups; Level 1, Displays politeness, and refers to others when experiencing difficulties.

IRM levels (highest to lowest): Level 3, Avoids, and when required, resolves cultural misunderstandings; Level 2, Displays cultural awareness and sensitivity; Level 1, Uses various communication strategies when dealing with diverse groups; Level 0, Requires support to work in a socially diverse environment.

The match in this example is close, and robust to the differences in the relative

placements of the criteria in the matrix. The overall interpretation of the levels was

not affected. This meant that the methodology could deliver a report for students for

each unit, providing information about the level of performance quality and a

differentiating score. The role of the teacher has been simplified as well. The teacher

continues to rate the student on each performance criterion but instead of ticking as

each is observed, the teacher selects a description of the performance quality that best

matches the student performances, using an apparent rating scale, as shown in Figure

10, to indicate how well the performance criterion has been demonstrated. The task is

then to aggregate the ratings to obtain a score for the unit. A simple score conversion

chart (as illustrated in Figure 7) is then used to indicate which unit performance level

the score represents, and which unit nutshell statement to report. These levels can

then be aggregated across units to produce the subject score.
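The aggregation across units is equally mechanical. In the sketch below the level code awarded for each unit is used as its score and the unit scores are summed into a subject raw score; the unit codes are taken from the Hospitality units used in the trials, but the levels are invented.

# Hypothetical unit performance levels for one student in a Hospitality subject.
student_unit_levels = {
    "THHCOR01B": 3,
    "THHCOR02B": 2,
    "THHCOR03B": 3,
    "THHGHS01B": 1,
}

subject_raw_score = sum(student_unit_levels.values())
print("subject raw score:", subject_raw_score)

A subject-level conversion chart, built in the same way as the unit chart, could then turn this raw score into a subject level or grade.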

If the unit performance descriptions are written in a horizontal fashion, the

development of the record form takes shape. This is illustrated in Figure 10. For Unit

THHCOR02B, Works in a Socially Diverse Environment, the levels demonstrate

performances that increase in quality and sophistication. This can be done for all units

but note that not all units had the same number of performance levels. In trials it was

possible to identify three or four levels of performance quality in some units. It may

well be that, with practice, an equal number of levels of performance for each unit

might be identified to ease the reporting and recording task, but this may be an

artificial and unnecessary constraint on the rubric and loses the advantage of

differentiating weighting.

It is also possible to produce a standards-referenced framework for reporting the

students’ performances in a subject or certificate. The levels of performance quality

defined for each unit can in turn be used to obtain a score for the subject. The

previous unit approach to interpretation can be used to develop an interpretable raw

score for the subject or certificate. In this example, twelve units were aggregated from

the hospitality industry training package, and a total score based on summing the unit

scores was obtained. It was also possible to differentiate and weight the unit rubric

levels on the basis of the difficulty of demonstrating that level of performance for the

unit of competence and this was done both empirically and using the SME judgment

approach. This provided the opportunity to interpret themes aggregated across units

within subjects or certificates rather than across performance criteria within units.

Figure 10a: The record sheet for collected units for subjects or certificates.

Figure 10b: The record sheet for collected units for subjects or certificates.

Figure 10c: The record sheet for collected units for subjects or certificates.

Figure 10d: The record sheet for collected units for subjects or certificates.
These unit descriptions for each industry were then mapped onto a matrix (one for

each industry), by placing the rating scale descriptions from the record sheets in

Figures 10a to 10d as the rubrics. Precisely the same procedure was used. An IRM

process was used to calibrate the rating scales or score points from Figures 10a to 10d.

This is a process of using the differentiation power of the unit level Standards

Referenced Frameworks (SRF) as a weight in order to place them in the matrix.

In this case the columns of the matrix represent competency units. The entries in the

cells of the matrix represent the levels of performance for each unit of competence.

The height of the rubric or SRF band in the matrix column represents how difficult it

is for a student to achieve this level of performance. Students of a commensurate level

of ability or competence can attain the rubrics at approximately the same level. It is

assumed that the set of rubrics (or cluster) at the same relative level of competence is

underpinned by a common kind of competence. The clusters of unit level descriptions

were then interpreted in the same way as the previous qualitative analysis of quality

criteria across performance criteria within units. Using the code assigned to the level

within a unit as a score also enables the development of a subject or certificate score

based on aggregates of performances in training package units. The subject scores

could be represented as a frequency distribution with a mean and standard deviation,

and can be standardised to produce the raw score for scaling purposes. Each of the

subjects is scored and the distributions presented in this report. What is shown is that

it is possible to obtain a differentiating score from a set of competency units and, if

IRM procedures are used, it does not matter which units are combined to form the

differentiating score. There are some differences between state approaches to

marking but this is to be expected at this stage. Practice and moderation should

minimize this phenomenon. The scoring has been done with no moderation at all and

no feedback or training for the teachers assigning the scores. This is unusual, but it is a simple and inexpensive issue to address.
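The standardisation itself is a one-line transformation; the following sketch standardises a set of simulated subject raw scores to the mean of 30 and standard deviation of 7 used for the illustrative distributions in Figures 11a to 11d.

import numpy as np

rng = np.random.default_rng(0)
raw = rng.integers(5, 45, size=223).astype(float)   # simulated subject raw scores

standardised = 30 + 7 * (raw - raw.mean()) / raw.std()
print(round(standardised.mean(), 2), round(standardised.std(), 2))   # approximately 30.0 and 7.0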

Figure 11a: Illustrative standardized score of the Hospitality data (using a mean of 30

and a standard deviation of 7).1

(Histogram of converted Hospitality subject scores: mean = 30.00, standard deviation = 7.00, N = 223.)

Figure 11b: Illustrative standardized score of the Business Studies data (using a mean

of 30 and a standard deviation of 7).1

(Histogram of converted Business Studies subject scores: mean = 30.00, standard deviation = 7.00, N = 323.)

Figure 11c: Illustrative standardized score of the Metal and Engineering data (using a mean of 30 and a standard deviation of 7).1

(Histogram of converted Metal and Engineering subject scores: mean = 30.00, standard deviation = 7.00, N = 56.)

Figure 11d: Illustrative standardized score of the Information Technology data (using

a mean of 30 and a standard deviation of 7).1

(Histogram of converted Information Technology subject scores: mean = 30.00, standard deviation = 7.00, N = 295.)

The SRF interpretations can also be used for describing the grades that emerge from

this analysis. In addition, it is possible to produce gist or nutshell statements for

subjects or certificates to enable reporting of quality performances at an aggregate

level for public communication. The more detailed analysis of the subjects would be

pertinent for teachers and employers but for recording purposes the subject grade

interpretation might use a summary format. This process is illustrated in Figures 12

and 13.

In general, there were sufficient matches between the SME approach and the IRM

analyses across units within industries to confidently argue that the SME procedure

was successful and relatively inexpensive to operate. The evidence and extent of

matching SME matrices and the Rasch model analyses leads us to predict that the

subject matter specialists, with feedback, can become even more skilled at developing

the matrices, defining the rubrics and judging their relative difficulty such that a

weighted differentiating score is developed. This then led us to argue that a

differentiating score could be produced and interpreted for any unit in the training

packages. Very few matrices and standards-referenced frameworks required

substantial reworking. In other words, matrices were interpreted in the same way,

yielding substantially the same standards framework, regardless of whether the

analysis was conducted using a SME or an empirical IRM approach.
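One simple way the agreement between the two approaches could be quantified, although it is not how the project formally reported it, is a rank correlation between the panel's ordering of the quality criteria and the empirically calibrated difficulties; the values below are invented.

from scipy.stats import spearmanr

sme_rank = [1, 2, 3, 4, 5, 6, 7, 8]                              # panel ordering, easiest to hardest
irm_difficulty = [-1.9, -1.1, -0.4, 0.2, 0.1, 0.9, 1.4, 2.2]     # calibrated difficulties (logits)

rho, _ = spearmanr(sme_rank, irm_difficulty)
print(f"rank agreement: rho = {rho:.2f}")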

Figure 12: IRM matrix using unit descriptions for differentiating score development.

Figure 13: Interpretation of the IRM matrix across units - A standards-referenced

framework for a Year 12 Hospitality subject using only the SRF and IRM analysis.

This means that the model led to a process that was quick and inexpensive and that draws on subject matter expertise rather than the measurement or statistical expertise of a small number of people outside the industry, who in turn would have to consult the SMEs to interpret sophisticated statistical analyses. Figure 14 presents the percentage distribution across levels for the subject 'Hospitality Operations'. This indicates that scores or levels can be presented for units or subjects. In the discussion that follows, it also becomes evident that the score or level can be derived independently of which units were combined into the

aggregate subject. The distributions only represent the calibration samples and IN NO

WAY represent the population of hospitality students.

Grade A
Competency Description: Manages and deals with issues as they arise. Shows high level skills in dealing with customers and work team members; solves problems as they arise, is aware of nuances in customer and staff interactions and is able to act accordingly and handle atypical telephone calls. Prepares ingredients for an extensive range of hot and cold beverages and maintains and monitors equipment usage and functionality.
Reporting Summary: Skilled in dealing with clients and team members, knowledgeable about stock and presentation.

Grade B
Competency Description: Manages quality control with stock and preparation of financial transactions. Presents food correctly and with style, shows understanding of industry issues and contributes to workplace health, safety and security. Avoids and resolves misunderstanding with colleagues and others.
Reporting Summary: Takes control of quality and finances, for presentation and industry issues including OHS and IR.

Grade C
Competency Description: Manages stock in accordance with OHS and enterprise requirements. Evaluates products, services and promotional initiatives. Processes financial transactions, maintains efficient workflow in tools and food, telephone calls and systems. Knows industry, legal and ethical implications, OHS hygiene risk management, and shows cultural awareness and sensitivity with colleagues and customers.
Reporting Summary: Manages stock, services and promotions. Maintains workflow and equipment, knows industry and ethical issues.

Grade D
Competency Description: Maintains stock and supplies, product/service and knowledge of the industry. Follows procedures for cash, food portions and presentations, OHS and hygiene regulations and telephone calls, handling and storage of foods, and emergency situations. Demonstrates logical workflow in preparation and knife handling. Is patient, courteous and helpful with colleagues and customers.
Reporting Summary: Maintains supplies and stock; equipment, customers, OHS and food specialities, linked to industry and its tools.

Grade E
Competency Description: Is aware of kitchen stock and supplies and non-alcoholic beverages. Informs customers of products and services, uses correct garnishes and sauces and follows equipment safety procedures. Can communicate on the telephone, update knowledge of industry, risks, storage and OHS procedures. Is polite with colleagues and customers.
Reporting Summary: Knows about supplies and stock; customers, OHS and food specialities, linked to industry.

Grade F
Competency Description: With assistance and advice, can receive supplies, inform customers, select garnishes/sauces and follow safety procedures. With assistance, can communicate on the telephone, update hospitality industry knowledge, follow workplace hygiene, health safety and security procedures, and show politeness with colleagues and customers.
Reporting Summary: Learning to deal with supplies, workmates and customers, use the telephone, check industry information and OHS.

Figure 14: Percentage distributions across levels in the standards-referenced framework

at a subject level.

The Outcomes

Differentiating Scores

The purpose of the project was to develop a method of generating differentiating scores

for ‘VET in schools’ subjects such that it could be used in Year 12 university selection

procedures compatible with each state system. The outcome has been a school-based assessment model for ‘VET in schools’ subjects that can be subjected to statistical and consensus moderation procedures, and that can be differentially weighted, standardised and scaled to produce a universities admission index appropriate for education systems

across Australia. Each jurisdiction would be able to use a moderated score system that

would enable a scaled university entrance score to be obtained. Anecdotal feedback in

the trials as well as direct evaluations in the pilot study (Griffin & Gillis, 2002) have

shown that teachers prefer the quality criterion approach to the checklist of performance

criteria and there was some relief in being able to use a standards-referenced framework.

The demand for materials far exceeded the production for the trials and this was a

powerful indicator of the acceptance by teachers. In Western Australia and in Victoria,

where parallel generic criteria were in use, the teacher reaction was clear. They preferred

to use the competency-based criteria in this model.

(Figure 14: bar chart of the percentage of the calibration sample at each grade, A to F, with each bar labelled by its reporting summary from Figure 13.)

A subset of units was selected for each industry (those with the largest response rate) and

the subset was treated as if it was a composite set used for a subject on a national

curriculum. Score distribution details are provided for Hospitality in Table 1, as well as

details of the subject score if the units under consideration were to be aggregated into a

single subject. Similar tables for other industries in this study are included in appendices.

Given the mean and standard deviation of the unit and subject scores, as shown in Table

1, it is possible to standardise this distribution to any mean and dispersion for any scaling

exercise. It was also possible to produce composite scores based on the combination of

central examination and school-based assessment. This set of scores and the score range

for levels in a standards-referenced framework are shown in Table 1. This is an example

only. It is based on an artificial analysis of the NSW HSC and Victoria VCE data for the

Hospitality subject in which the examination and school-based assessments were

combined. The distribution of scores does not represent any overall state distribution

because a calibration sample used to establish the properties of the SRF data collected

from this project was merged with each state's central examination data. It illustrates that

the procedure can be carried out and that statistical moderation of the school based

assessment is possible with IRM analysis.

Table 1: Distribution and Score Properties for Subject Aggregation

Hospitality Levels -> F E D C B A
Unit Ni Max Level Max Cut Score
School Based 14 0.84 9.35 8.10 44 4 9 22 33 40 44

Table 2: Distribution and Score Properties for Hospitality Units

Hospitality (levels 0 1 2 3 4)
Unit Code / Unit Description / Ni / Max Level / Max Raw Score

THHCOR01B Work with colleagues and customers. 26 0.95 50 2 9 25 41 50
THHCOR02B Work in a socially diverse environment 10 0.87 19 3 7 12 19
THHCOR03B Follow health, safety and security procedures. 11 0.86 18 3 8 15 18
THHGHS01B Follow workplace hygiene procedures 6 0.83 15 3 9 12 15
THHHC001B Develop and update hospitality industry knowledge 10 0.92 22 2 7 15 22
THHGGA01B Communicate on the telephone. 13 0.90 24 4 10 18 24
THHBH01B Provide housekeeping services to guests. 11 26 10 15 20 26
THHBH03B Prepare room for guests 20 43 11 21 29 43
THHBKA01B Organise and prepare food 16 0.92 36 3 6 20 32 36
THHBKA02B Present food 10 0.90 24 4 15 24
THHGFA01B Process financial transactions 15 0.95 35 6 11 20 35
THHGCS02B Promote products and services to customers 12 30 6 21 28 30
THHBFB10B Prepare and serve non alcoholic beverages 10 0.94 22 7 12 17 22
THHBKA03B Receive and store kitchen supplies 15 0.93 26 5 10 15 26

Competency Interpretation

The interpretation of the scaled score enables a direct and easily communicated

description of the performance in competency terms. This can be provided to employers,

students, parents and teachers. Decisions can be made about which level on the scale might

be considered to represent a decision of ‘competent’, although this is not consistent with a

notion of competence as the standard expected in the workplace. Rather, the assumption

has been that competence is the capacity to adjust performance to the work place

requirements rather than demonstrate a fixed level of performance.

Compatibility with System Practices

The project has also examined how it is possible to combine school-based assessment and

central examination assessments using data from the NSW and Victorian systems. It must

be emphasised that central examinations do not represent a requirement of this

competency model. School-based assessment alone can be used in the states and territory

systems where there is no central examination, such as in the ACT and Queensland. It

also shows, however, that if there is a scaling test that is used to moderate or to

standardise school-based assessment, then it is possible to use item response modelling,

subject to the constraints and assumptions of that particular procedure, to undertake the

scaling.

New South Wales (NSW)

Within the NSW HSC, VET in schools is delivered as either Board-developed courses

derived from national training packages and presented for the HSC as "industry

curriculum frameworks” or Board-endorsed courses based on national training packages

and/or TAFE or national VET modules.

Each framework specifies the range of industry-developed units of competency from the

relevant training package(s) identified as suitable for HSC purposes and the combinations

of these units that comprise particular HSC courses. Each framework contains several

Board-developed HSC courses, which relate to that industry area. At least one of these

courses must be a 240-indicative-hour course, which provides four units of study over the

two years of the HSC. Shorter, 120-hour (2 unit) courses are also included. Some

frameworks also offer extension courses in addition to a 240-hour course.

All courses within industry curriculum frameworks feature competency-based

assessment. Students who meet assessment requirements are eligible for the relevant AQF

certificate or statement of attainment. No mark is reported for competency-based

assessment as currently required by the Training Package. Students undertaking 240-

indicative-hour framework courses may undertake an optional, centrally set, standards-

referenced, written HSC examination. For these students, the result obtained in the

examination is reported as a mark on the HSC and may be included in the Universities

Admission Index (UAI). There is no requirement for an a priori assessment of

competence. The examination is directly linked to units of competence and it is possible

to interpret the examination directly from the rubrics and the multiple-choice items.

Workplace assessment directly linked to the performance criteria of the training package unit makes no contribution to the score. The proposed model directly contributes to the differentiating score and links the central examination to the workplace assessment without adding to the workload of teachers/assessors, who need only record performance at the performance criterion level using the rating scale approach.

Figure 15: Concurrent IRM calibration of school-based assessment and central

examination in NSW: Hospitality.

(Variable map for the SRF and the NSW examination: the student ability distribution is shown on the left; SRF unit score thresholds, prefixed 's', and HSC examination item thresholds, prefixed 'e', are located by difficulty in the two right-hand panels.)

Figure 16: VCAA assessments and SRF Food and Beverages.

Figure 17: VCAA assessments and SRF Commercial Cookery.

Figure 17a: Business Studies concurrent calibration - SRF and examination.

(Variable map: SRF unit score thresholds, prefixed 's', and NSW examination item thresholds, prefixed 'e', calibrated concurrently on a common scale.)

As shown in Figure 15, using data from NSW, it was possible to produce a weighted

standardised score incorporating both central and school-based assessments in developing

a scaled score for input to the UAI. (A similar analysis will be reported in a supplement

dealing with the Victorian data).

In Figure 15, the student score distribution is shown on the left. On the right side of the

figure, two panels of codes are presented. The left panel represents the school-based

assessment (SRF) (all with a prefix ‘s’) and the right panel (EXAM) presents the central

HSC examination (all with a prefix ‘e’). The relative height on the display of particular

codes indicates the relative difficulty of the performance task or the exam question. The

relative height of the student code (X) represents the ability of the student. In the school-

based assessment, the code (sa.b) indicates ‘s’ for school; the first symbol (a) represents

the unit number and the second (b) is the score obtained for that particular unit. On the

right-hand codes (ea.b), the ‘e’ represents the examination; the first number immediately

following the ‘e’ is the question number on the examination and the second is the score

obtained. Where there is no score after the item code, this represents a question in which

it was only possible to score 1 or 0 for a dichotomously scored item. In the near normal

distribution of student outcomes, each ‘x’ represents approximately 50 students. There are

additional students at the top and bottom of this distribution, but too few to warrant an ‘x’

in the chart. It can be seen that there is very little differentiation of students about the

centre of the distribution and this is reinforced when school-based assessment is

combined with the central examination.

The central examination is more discriminating than either the school-based assessment

or the combined data. It is also clear that the central examination is generally more

difficult than school-based assessment. This is indicated by the relative heights of these

two distributions of score codes (or quality criteria). A comparison of the mean ability

and difficulty levels shows that the school-based assessment is more closely matched to

the student ability mean than the central examination.

The figure shows, however, that it is possible to link the school-based assessment and the

examination to obtain a combined differentiating score as well as an interpretation based

on the combined assessment strategy. It also illustrates that it is possible to statistically

adjust (moderate) the performance level of the student given the combined exam and

school-based assessment.
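The statistical adjustment can be pictured with the familiar linear form of moderation, sketched below: a school group's assessment scores are rescaled so that their mean and spread match the same group's results on the common external examination. This is only a stand-in for the concurrent IRM calibration illustrated in Figure 15, and all scores are simulated.

import numpy as np

school_scores = np.array([12.0, 15.0, 18.0, 14.0, 19.0, 16.0])   # school-based assessments
exam_scores = np.array([45.0, 58.0, 70.0, 50.0, 75.0, 62.0])      # the same students' exam marks

moderated = (exam_scores.mean()
             + exam_scores.std() * (school_scores - school_scores.mean()) / school_scores.std())
print(np.round(moderated, 1))   # school ranking preserved; exam-scale mean and spread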

Victoria

In Victoria a set of generic criteria is used for Year 12 VET assessment. Most school

assessment is based on school-assessed coursework. The model for assessing VCE VET

studies consists of workplace, coursework and an external examination. Scored

assessment is optional. If students elect to receive a score they must do both the

coursework and the examination (where the content is based on and related to the

requirements set out in the Evidence Guide of the relevant Competency Unit). Scores

from both are used to calculate a study score which in turn is used to develop the ENTER

score.

Assessment of Work Performance can involve observation of the student conducting a

range of work tasks or practical activities in a workplace or an appropriate simulated

environment. Assessors are required to base their assessment judgment on structured

observation and a range of possible tasks including oral questioning. A set of oral

questions prepared prior to assessment is used to collect evidence of a student’s

underpinning knowledge, application of key information and skills in the workplace.

Three separate task scores are reported to VCAA using the available range of 5-25. Each

school assessment task is weighted by the nominal hours related to the units of

competency assessed by it, and this is a significant difference from the weighting used in this

project (weighting by differentiating power).

The external central assessment entails a 90-minute end of year external examination in

each subject. Examinations consist of 35-60 items and are based on the units of

competence that make up units 3 and 4 of the VET study. Test items are designed to

measure underpinning knowledge and understanding as designated in the Evidence

Guides of the relevant competency standards but some content is drawn down from

Certificate III or supplemented by recommendations from external specialist panels

(SMEs).

The VCAA assigns study scores to students who satisfactorily complete units 3 and 4 of a

VCE study. These study scores give students a ranking in the cohort of students taking

that study across the state in that year. The ranking of the student’s performance in the

cohort is determined by the student’s performance in the graded assessment for that study.

The study scores are scaled to a range of 0 to 50 with a mean of 30 and a standard

deviation of 7. Students with the highest score in any study unit are assigned a score of

50, which then indicates that a student has finished at the top of the cohort. A study score

of 0 indicates that the student has finished at the bottom. Within these anchor scores, the

distribution is scaled to obtain a mean of 30 and a standard deviation of 7. The VCAA

does not determine any measure of overall performance in the VCE. Study scores only

provide a basis for the ranking of students in each study.

VET programs are treated in the same manner as other VCE studies in the calculation of

the ENTER, with the scaled score able to be counted directly in the calculation of the

ENTER as one of the best three scaled scores other than English.

Although the sets of five criteria provided by VCAA vary somewhat for each assessment

task type, all specify the requirement for application of underpinning knowledge. The

other criteria relate to generic competencies such as problem solving,

communication/interpersonal skills, organisational skills, evaluation skills, etc. The mix

of criteria and the levels of performance described on each criterion are common across

VCE VET studies.

The current assessment consists of two or three parts, depending on student decisions.

The first is the competency-based assessment, which is recorded as competent or not yet

competent. The second is a differentiating assessment based on generic criteria. The third

is a central examination, which also yields a differentiating score. The generic criteria

contribute up to 34% of the overall study score. The examination contributes the

remainder. Only students who are assessed as competent may elect to sit for the central

examination.

This combination of assessments means that there is NO competency unit based

assessment contributing to the Year 12 VCE assessment in the VET study scores, other

than the decision of competent/not-yet-competent. It is also evident that there are many

students who have been assessed as competent but who have little or no underpinning knowledge, and this tends to undermine confidence in the assessment.

The proposed model ensures that evidence of competence is taken into account in the

assessment and that the assessment based on the units of competence contributes to the

VCE assessed score. It does not mean that the generic criteria should be abandoned. In fact, the complementary nature of the three components (SRF, examination, generic criteria) could be considered a well-rounded approach to assessment. However, it became clear during the trials that the teachers did not want to add assessment to the current system; more was not considered to be better. It is therefore recommended that the VCE assessment consider replacing the generic criteria at the VCE study level with unit-based criteria such as those in the study record sheets presented in this report. SME panels could

prepare these rating scales and evidence guides in a short amount of time and with

minimal training. Teachers can use the record sheets with relative ease and expressed a

preference for them during the field trials and the pilot studies.

In Figure 17b, the Victorian Examination in Business Studies is concurrently calibrated

with the standards-referenced data. It is clear that the SRF approach differentiates as

much as the short answer questions and more than school assessed coursework. The

extent to which it differentiates is indicated by the vertical spread of the rubrics. A

comparison with Figure 17a shows that the indicators discriminate more in Victoria than

in NSW, indicating a slightly different approach to marking in the two systems.


Figure 17b: Standards-referenced framework calibration with examination (Victorian data).


Tasmania

The Tasmanian Qualifications Authority (TQA) supervises the external subject

examination that assesses against criteria in the syllabus expressly reserved for that

purpose. The ratings from the external examination and from the school are used to

determine the final assessment. The requirements for an award in each syllabus vary

across subjects but the award classifications remain as Satisfactory Achievement, High

Achievement and Outstanding Achievement (SA, HA and OA).

The Tasmanian education system uses stand-alone VET programs, where students can enrol in Certificate I or II (or III) depending on the school's scope as a QERTO. The VET

programs are based on the relevant industry Training Packages. The schools carry out all

assessment for VET. There are no external examinations or moderation processes

involved in the assessment of VET in Tasmania. VET programs conducted in schools

assess outcomes as either competent or not yet competent. They do not grade outcomes.

However, the Tasmanian Qualifications Authority (TQA) has approved a model for

scaling subject scores to be used for tertiary entrance score calculation. The introduction

of subject score scaling is in response to evidence of differences in the degree of

difficulty between subjects used in the calculation of the tertiary entrance score. The

technical approach to scaling is based on Rasch analysis as detailed on the TQA web site.

Discussion of the new scaling approach has so far centred on general education subjects, and at this stage it is not known whether the same procedures will apply to all subjects or to VET

in particular. However, it is feasible that the same approach could be used. The three

subject grades, SA, HA and OA, are ordered categories of student achievement. These

grades are then calibrated using the Rasch partial credit or rating scale model. Tasmania

used this procedure for the first time in 2000 with the Year 11 general education subjects

and it will be an important test bed for both the practicability and the meaningfulness of

the approach modeled in this report. If the ordered categories were to be interpreted

substantively for each subject, it would present the TQA with the opportunity to develop

an overall standards-referenced framework for reporting purposes.

Separate subject algorithms are used to allocate subject awards (SA, HA and OA) to

students. These awards are Rasch scaled for all pre-tertiary subjects to identify any


anomalous cases. These cases are temporarily removed from the analysis so that they do

not distort the results. Anomalous cases may occur if a subject does not 'fit' the Rasch

model, i.e., it is not measuring the same underlying characteristic of "general academic

ability". This happens only in a small number of cases. Rasch analysis identifies these for

treatment by a slightly different method. The Rasch analysis is conducted on the

remaining results and determines the award cut-off points for each subject (difficulty

values). The subject difficulty values are adjusted to be on a 20-point scale by making the

average CA equal to 7.0 and average EA equal to 20.0. These results are applied to all of

the students who undertook any subjects that were not excluded. The subjects that had

been excluded are reintroduced by determining their award cut-off points by statistically

comparing student performances in other subjects as determined above. Then the award

cut-off or difficulty values are inserted and readjusted (if necessary) so that the average

CA and EA requirements are retained at 7.0 and 20.0 respectively. This produces the

required table of award sub-scores (between cut-off points) and the sub-score values are

applied to students to determine the subject contributions to the TE Score.
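The adjustment of the award cut-off points can be pictured with the simple two-point linear rescaling sketched below, which maps Rasch cut-offs (in logits) so that the average CA cut-off equals 7.0 and the average EA cut-off equals 20.0. The subject names and logit values are hypothetical, and the sketch omits the TQA's treatment of anomalous subjects.

    import numpy as np

    def rescale_cutoffs(cutoffs_by_subject, ca_key="CA", ea_key="EA",
                        ca_target=7.0, ea_target=20.0):
        """Linearly rescale Rasch award cut-offs (logits) so that the average
        CA cut-off equals 7.0 and the average EA cut-off equals 20.0.
        Illustrative only; not the TQA algorithm."""
        ca_mean = np.mean([c[ca_key] for c in cutoffs_by_subject.values()])
        ea_mean = np.mean([c[ea_key] for c in cutoffs_by_subject.values()])
        slope = (ea_target - ca_target) / (ea_mean - ca_mean)
        intercept = ca_target - slope * ca_mean
        return {subj: {award: slope * logit + intercept
                       for award, logit in c.items()}
                for subj, c in cutoffs_by_subject.items()}

    # Hypothetical logit cut-offs for two subjects
    cutoffs = {"Maths":   {"SA": -1.2, "CA": 0.3, "HA": 1.1, "EA": 2.4},
               "English": {"SA": -0.8, "CA": 0.6, "HA": 1.5, "EA": 2.1}}
    print(rescale_cutoffs(cutoffs))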

The only change required in Tasmania is the adoption of the record sheets for the combined units in stand-alone VET subjects for the award of the three levels (SA, HA and OA). The IRM logic and approach have already been adopted.

Australian Capital Territory (ACT)

Assessment in the ACT Year 12 Certificate rests entirely on college/school-based

assessment. Courses consist of a number of units determined by the BSSS. Students are

given grades for each Unit and a Unit Score for “T-tertiary” courses is provided to the

BSSS. Grades are subject to consensus moderation and the Course Scores are statistically

moderated against an external ability test for the purpose of selection into university.

Where a nationally recognised VET course meets the requirements of the ACT BSSS, the

course can be adapted to meet the requirements of the BSSS Frameworks by inclusion of

additional underpinning knowledge. Assessment of the adapted course is graded. Separately, schools assess student performance in units of competency without grading (C/NYC) and issue the appropriate

Certificate or Statement of Attainment in conjunction with the BSSS. It is possible for

students to concurrently gain a vocational outcome, a grade and a score in a 'T' subject


with embedded VET components and assessment in the workplace. Teachers set a group

of assessment tasks, aligning them with vocational outcomes.

Teachers allocate a mark or grade for each assessment task within a Unit. The

Framework documents stipulate the weighting of each task type “that could be

constructed to meet the assessment criteria” and these are combined to provide Unit

scores to calculate the Universities Admission Index (UAI). In addition, teachers

determine a grade used for reporting purposes.
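The combination of task marks into a unit score can be sketched as a simple weighted sum, as below. The task types and weightings shown are hypothetical; the actual weightings are those stipulated in the relevant Framework document.

    def unit_score(task_marks, weights):
        """Combine task marks (each out of 100) into a weighted unit score.
        The weights stand in for the Framework weightings of each task type
        (hypothetical here) and should sum to 1.0."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[task] * mark for task, mark in task_marks.items())

    # Hypothetical task types and weightings
    marks = {"project": 72, "workplace_task": 65, "test": 80}
    weights = {"project": 0.4, "workplace_task": 0.4, "test": 0.2}
    print(round(unit_score(marks, weights), 1))   # weighted unit score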

Grade descriptors are provided to schools/colleges in each Framework, together with the principles that underpin the development of the unit grade descriptors.

All graded assessment is college-based. The Australian Scaling Test (AST) is employed

to facilitate the comparison of ‘T’ Course groups across ACT colleges for university

selection. The AST comprises two one-and-a-half-hour multiple-choice tests and a written test, and measures scholastic aptitude. Students are awarded a course score

for each ‘T’ Course completed. Course scores are based on unit scores reported over the

duration of a student's program of study. In turn, unit scores are a summation of marks

given by teachers on each assessment task (essay, project, work place tasks, test, etc)

required of students in a course unit. Where VET components are included and assessed

in a ‘T’ course they directly contribute to the University Admission Index (UAI). The

course scores indicate the relative ranking of students within a group but are not designed

to show a level of achievement in that course. VET units included in the UAI require a

score and this is readily available from the proposed model. In the ACT this would mean

that they would also be scaled and moderated using the AST, and this may cause some

difficulties given the academic nature of the AST and the performance nature of the VET

courses. However, if this works for the broad range of subjects involved in Year 12 and

the range of schools awarding marks then there seems to be no reason why it would not

work for the VET subjects as well. A Rasch analysis, however, might identify non-fitting

subjects and the Tasmanian approach could be of assistance.


Queensland

There are two high-stakes assessment regimes operating in the senior school: one a

system of externally moderated, subject-specific school-based assessment; the other an

external cross-curriculum test (the QCS Test) developed by QBSSSS, which is

administered statewide under standardised conditions. QBSSSS is responsible for the

statewide comparability of results in Board subjects and the standards reported in the

external cross-curriculum test. Results in Board subjects, Board-registered subjects and

the QCS Test appear on the senior certificate.

Assessment for results in Board and Board-registered subjects is only school-based.

Results in Board subjects form the basis of the results produced by QBSSSS for use in

selection into further education. Teachers are guided by criteria and standards set in

centrally developed syllabuses to assess student performance in each subject. A system of

local and state panels moderates subject assessment standards within schools and for the

state as a whole. Panels do this by reviewing samples of students’ work at each level of

achievement and providing advice to schools about the standards they should use to

determine students’ results in terms of syllabus criteria. Moderation of a random sample

of student work samples each year provides evidence of the success of the standards-

setting procedures.

An examination of syllabuses in Board-developed subjects with embedded VET

components and Study Area Specifications (SAS) showed that teachers are provided with

subject-related criteria and standards for grading tasks on ‘knowledge and understanding’,

and ‘reasoning’ components. While there is considerable evidence that teachers of Board

subjects have become adept over time in using criteria and standards-based assessment,

the same may not be true for VET, so there would be an important role for the proposed

model in Queensland.

The syllabuses for Board subjects with embedded VET and Study Area Specifications

(SASs) also have a set of criteria related to practical tasks or skills. The criteria and


standards in these documents are not for the purpose of grading tasks but for allocating

students to one of five levels or bands of achievement at the end of Year 12.

Teachers in Queensland have developed a high level of expertise in using criteria and

standards-based assessment since external examinations were abolished about 25 years

ago. Their capacity to identify and use criteria for competencies is not considered to be

problematic. If standards are supplied by the QBSSSS for VET subjects within a scored

VET paradigm, it is reasonable to expect that Queensland teachers will quickly become

adept at moderating student work samples against those standards. This should lead

quickly to a stable system of scored assessment in the VET subjects, in much the same

way as Board subjects are scored at present.

A cross-curriculum examination, the Queensland Core Skills Test (QCS), is used to

compare the strength of the variation of performances from different groups of students

within and between schools, to generate an Overall Position (OP) and Field Position (FP)

from assessments in Board subjects for tertiary selection purposes. The OP, a single piece

of information, represents relative overall achievement. OPs provide a comparison of

students across the state in terms of their overall achievement in senior studies. All Board

subjects are weighted equally in the computation of the OP. An FP indicates a student’s

rank order position based on achievement in Board subjects where subjects are weighted

unequally in up to five areas of study that emphasise particular areas of knowledge and

skills. The starting point for all assessment is the teachers' estimate of each student's position within each subject within the school, referenced first to the Level of Achievement criteria and then to a Subject Achievement Index within the Level of Achievement (SAI within the LOA). As far as this project is concerned, the teachers'

roles in assigning this for VET subjects are the only focus of change. The amount of

change is minimal.

Subject results are defined in terms of subject-specific criteria and standards as one of

five categories labeled: Very Limited Achievement, Limited Achievement, Sound

Achievement, High Achievement and Very High Achievement. They are linked to specified

standards and moderated using statewide moderation procedures. These categories are too broad for calculating OPs, so Subject Achievement Indices (SAIs) are used for eligible students (those undertaking university-oriented courses). An SAI is an intra-school rank within the subject on a scale from 200 to 400; if there are fewer than 14 students, the SAIs are reported in levels and indicate the relative positions and relative distances between students within a school. The

Levels of Achievement are determined first and then the SAIs are worked through within

the LoAs. Schools submit their SAIs to the Board of Studies, which then scales the

scores for an OP.

While this is a complex procedure, it uses many characteristics of the proposed model.

The standards-referenced scale is central to each subject. Students are allocated both a

level of achievement for the subject (parallel to the proposed SRF band scale for the

subject) and then allocated a relative performance measure or scale score within a level of

achievement. This parallels the use of the matrix and derived unit scores. With minimal

professional development, if any at all, records of performance sheets could be used to

inform the SAIs, which would then be submitted to the Board for scaling. The only

modification would be the development of the scale on the record form to be within a

range of 200 to 400. This is a simple procedure, which would allow a national approach

to scored assessment to be introduced for VET subjects.
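A minimal sketch of such a rescaling is shown below, assuming a simple min-max linear mapping of record-form scores onto the 200-400 range for a school group. It is illustrative only; in practice SAIs also reflect teacher judgment within the Levels of Achievement.

    def to_sai_scale(record_scores, lo=200, hi=400):
        """Map record-form scores for a school group onto the 200-400 SAI range
        with a simple min-max linear transform (illustrative only)."""
        mn, mx = min(record_scores), max(record_scores)
        if mx == mn:                       # degenerate case: all scores equal
            return [lo + (hi - lo) // 2] * len(record_scores)
        return [round(lo + (hi - lo) * (s - mn) / (mx - mn)) for s in record_scores]

    print(to_sai_scale([9, 14, 22, 27, 31]))   # e.g. [200, 245, 318, 364, 400]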

More importantly, it would allow VET subjects to be treated as mainstream subjects in Queensland and would end the practice that forces students who wish to take a VET program while remaining eligible for university entrance to study two Year 12 programs concurrently, with consequent disruption when work placements are required. Students report the need to make up time in the university-oriented program when they miss classes in the parallel course. This project offers the Queensland system an opportunity to cease this dual program and to provide VET subjects with parity of esteem. Little professional development would be needed if the SME-based scales were developed on a national basis; teachers need only complete a rating scale to develop an SAI for the students in VET subjects in Queensland.


It will help students who are currently required to undertake two courses if they wish to

study VET AND gain entry to university. The proposed model meets all these

requirements and retains the standards-referenced approach coupled with the statewide

moderating approach and the central scaling procedure.

South Australia

VET in school agreements are entered into by schools for delivery of agreed VET

programs under the auspices of the Department of Education, Training and Employment.

Schedules in the Agreement include obligations of both parties, details of the accredited

program, and quality assurance plans. Two VET in Schools Agreements (VISAs) have

been developed to comply with the Australian Recognition Framework (ARF)

requirements: one for government schools and another for non-government schools

intending to deliver VET under the auspices of TAFE.

Any combination of vocational education and training units of competence successfully

achieved with a total of 50 nominal hours of instruction is granted status for one SACE unit.

The packaging of the combinations of units of competency is a school-student decision. A

maximum of eight units, from a total of 22 units required for the SACE, may contribute

towards the completion of the SACE. SSABSA also packages vocational education and

training from Training Packages and accredits these as ‘SSABSA VET subjects’. These

subjects are centrally developed, school-assessed, and externally moderated. Students

can complete two SACE units at Stage 1, and two SACE units at Stage 2 in each of these

subjects. At Stage 1, successful achievement is reported as either ‘Satisfactory

Achievement’ or ‘Recorded Achievement’. At Stage 2, students receive:

a grade (A, B, C, D, or E);

a subject achievement score (out of 20);

a verbal description (outstanding, very high, high, competent, marginal, low

achievement);

a scaled score (out of 20).

Scaled scores are used in the calculation of the University Aggregate and the Tertiary

Entrance Rank. All SSABSA subjects at Stage 2 are accepted for university entrance with


the exception of Stage 2 Community Studies, which uses 'work-required assessment' with

achievement expressed as a level ('Satisfactory Achievement' or 'Recorded Achievement')

rather than a score and therefore it cannot contribute to the calculation of the University

Aggregate. Subjects are placed in one of two categories of Higher Education Selection

Subjects (HESS) (General and Restricted).

Subjects in the SACE are either Publicly Examined Subjects (PES) that use an external

examination, Publicly Assessed Subjects (PAS) or School Assessed Subjects (SAS). PAS

subjects are 30% externally examined. SAS subjects are wholly examined in the school.

There is a restriction on the number of subjects that can be taken in this mode. VET

components can be embedded in PES or PAS subjects but are usually wholly assessed in

schools. For tertiary selection, students must have four subjects, but three should be

either PAS or PES. This will automatically restrict the number of VET subjects available

for tertiary selection if they are allocated to the SAS category exclusively.

Assessment of 'VET embedded subjects' is on the same basis as other SSABSA general

education subjects. Assessment in ‘SSABSA VET subjects’ at Stage 2 level consists of three assessment components: a folio; a special study; and a demonstration or a work placement journal. Each of these is weighted in order to contribute to the overall assessment. The criteria, like those used in the ACT, are relatively generic and are scored on a 20-point scale. For example:

20 (A) – Outstanding Achievement. The Chief Assessor, in consultation with the moderation panel, may deem that a student’s achievement is outstanding against all learning outcomes. The score of 20 reflects achievement that is consistently beyond the criteria for the A-grade band.

17-19 (A) – Very High Achievement

14-16 (B) – High Achievement

The other score ranges are linked to other grades in the same way.

Scaled scores are derived from Stage 2 SACE subjects. The proposed model should

provide a simple approach that ensures a scored assessment within the units or stand-

alone VET subjects can be directly related to the training package. In South Australia


there seems to be little or no change to procedures other than the adoption of the scored assessment record sheets and the standardization of subject scores ready for scaling.

Western Australia

In Western Australia, schools award a letter grade to students for each subject. The

processes used to determine these grades are moderated by the Curriculum Council to

ensure comparability of standards of student achievement. Achievement of a grade

entitles a student to credit towards Secondary Graduation for the Western Australian

Certificate of Education (WACE). VET Studies (based on national training package units

of competency) are included in student study programs in addition to Curriculum Council

subjects. Schools have a choice as to whether they embed the VET Studies in Curriculum

Council subjects or offer them in addition to Curriculum Council subjects.

If the VET modules are completed by a student in addition to Curriculum Council

subjects they are grouped into 55-hour or 110-hour blocks using the notional hours for

modules to form half and full VET subject equivalents respectively. Students would

normally undertake VET in a chosen industry area; however, VET modules from any

industry area may be grouped to form Curriculum Council subject equivalents. Up to

40% of a student’s program of study (i.e., four out of ten full-year subjects) may consist of VET subject equivalents, and up to 25% (i.e., two out of eight full-year subjects) can be VET subjects that meet the minimum standard of C or better required for graduation.

The Assessment Model

Teachers of Year 11 and Year 12 collect information on the students’ performance from

the beginning of the year using semester examinations, classroom tests, in-class work,

assignments and practical work. At the end of the year, teachers submit assessments

based on this information to the Curriculum Council. In the case of Assessment Structure

Subjects, schools provide a numerical school assessment (0-100) and a grade (A-E) for

each subject completed by students. For tertiary entrance (TEE) subjects, there is also an

external examination.


For Common Assessment Framework (CAF) subjects, defined standards of performance

describe what students are expected to be able to do. The assessment format comprises outcomes, tasks for measuring performance of the outcomes, and criteria describing Very High (V), High (H) and Satisfactory (S) performance for judging that performance. The performance criteria define the standard of

performance expected for each outcome. Three levels of student performance are defined

and are illustrated through annotated work samples that are published as assessment

support materials.

V: at least 50% of ratings are at a Very High level, and at least 50% of the remainder

are at a High level or better.

H: at least 50% of ratings are at a High level or better, and at least 50% of the

remainder are at a Satisfactory level or better.

S: more than 50% of ratings are at a Satisfactory level or better.

ND: more than 50% of ratings are at a Not Demonstrated level.
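Read literally, these allocation rules can be sketched as follows. The function is an illustrative reading of the rules quoted above rather than the Curriculum Council's own implementation.

    def caf_grade(ratings):
        """Allocate a CAF grade (V, H, S or ND) from a list of ratings,
        following the decision rules quoted above. Ratings are the strings
        'V', 'H', 'S' or 'ND'. Illustrative reading only."""
        n = len(ratings)
        order = {"ND": 0, "S": 1, "H": 2, "V": 3}
        def at_least(level, pool):
            return sum(order[r] >= order[level] for r in pool)

        n_v = at_least("V", ratings)
        rem_v = [r for r in ratings if order[r] < order["V"]]
        if n_v >= n / 2 and at_least("H", rem_v) >= len(rem_v) / 2:
            return "V"

        n_h = at_least("H", ratings)
        rem_h = [r for r in ratings if order[r] < order["H"]]
        if n_h >= n / 2 and at_least("S", rem_h) >= len(rem_h) / 2:
            return "H"

        if at_least("S", ratings) > n / 2:
            return "S"
        return "ND"

    print(caf_grade(["V", "V", "H", "S", "V", "H"]))   # -> 'V'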

For subjects that may contribute to the TER, the results are the 50:50 composite of TEE

results and school assessments. All marks/assessments are scaled/moderated (see below)

before they are used for university admission purposes.

Curriculum Officers visit a sample of schools to review assessment and grading

procedures and ensure that statewide comparability of standards is achieved. Seminars

for teachers focus on generic assessment programs, tasks and student work samples to demonstrate good practice, highlight subject/learning area matters related to assessment, and build common understandings that underpin comparability.

Industry-related vocational subjects are outcome-based, affording students the opportunity to demonstrate what they are able to do. The outcomes are adapted in such

a way as to reflect the particular vocational emphasis of the subject.

The performance criteria for the CAF subjects are consistent with those developed in the

proposed model and almost no change would be needed for the rating scales. They are

essentially based on the same idea except that the record forms arise from quality


indicators endorsed by the national ITAB. An example is shown below; it shows the same idea applied in a Design and Drafting vocational unit. This project has in fact provided the

basis of the assessment criteria for four other areas of VET units for the schools and the

Curriculum Council. It is a simple process to implement the model.

Satisfactory: The student applies elements of a design process to produce a computer assisted drafting solution to a design problem.

High: The student structures the key elements of a design process to produce a detailed computer assisted drafting solution, which shows development of design, to solve a design problem.

Very High: The student independently selects and applies a design process to produce a solution, structures the key elements of the process, demonstrates progression through the process and produces varied solutions in solving design problems.

There would seem to be little reason why the proposed model could not be adopted in

Western Australia. Record sheets can be developed for additional national training

package units and incorporated into the common assessment frameworks that define

standards of student performance in terms of outcomes at the completion of the subject. The criteria have already been developed in at least four VET areas.

Implications

National

At the national level, the model offers an opportunity to combine quality with competence

and to encourage industry to pursue excellence in training and target setting within the

training package frameworks.

From an educational point of view, it is possible to benchmark at a state level against all

other states and to monitor national levels of competence and performance among senior

secondary students in VET in schools. It is possibly the only area of the school

curriculum where this would now be possible without expensive additional testing

programs. The methodology, however, could be applied well beyond the ‘VET in

schools’ programs.

These developments have implications for a range of organisations and individuals. There

is a need for an acceptance of levels of performance that go beyond the two levels of

competent and not-yet-competent. Moreover, it is necessary to accept that a score can be

obtained for each unit and for each combination of units making up a subject or


certificate. It is important that the scores for discrete tasks or performances are weighted

on the basis of their capacity to differentiate between students, because this is the method

of transforming numerical scores into a standards-referenced framework. When these are

also based directly on training package competencies rather than generic skills or

assessment methods, the validity of the assessment is enhanced and the relevance of the

assessment is clear to teachers, students and other audiences.
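As a simple classical illustration of weighting tasks by their capacity to differentiate, the sketch below uses corrected item-total correlations as weights. This is a proxy only; the project itself relies on IRM calibration rather than this index, and the score matrix is hypothetical.

    import numpy as np

    def discrimination_weights(task_scores):
        """Weight each task by its corrected item-total correlation, a simple
        classical proxy for its capacity to differentiate between students.
        `task_scores` is a 2-D array (students x tasks). Illustrative only."""
        X = np.asarray(task_scores, dtype=float)
        weights = []
        for j in range(X.shape[1]):
            rest = X.sum(axis=1) - X[:, j]          # total excluding task j
            r = np.corrcoef(X[:, j], rest)[0, 1]    # corrected item-total r
            weights.append(max(r, 0.0))             # ignore negative values
        w = np.array(weights)
        return w / w.sum()                          # normalise to sum to 1

    # Hypothetical scores for five students on three tasks
    scores = [[3, 2, 4], [1, 1, 2], [4, 3, 4], [2, 2, 3], [0, 1, 1]]
    print(discrimination_weights(scores).round(2))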

Consistency

Classical measures of consistency include the index of reliability known as Cronbach's alpha. This index ranges in value from 0.0 to 1.0. Zero indicates that there is no

consistency and that the assessment is entirely random. More recently, item response

theories have added indices of ‘separation’ reliability and they provide some interesting

interpretations. The item separation reliability index is closely related to the Cronbach index

but indicates the extent to which the items or, in this case, the rubrics are separated along

the underpinning construct or continuum. A value of zero indicates that the rubrics are all

clustered at the same point and each indicates the same level of performance. Adding

more items or rubrics at the same level does not add to the assessment. As the items are

increasingly separated, the value of the index rises to 1.0. At 1.0 the difficulty levels of the rubrics or items are completely separated and each one adds new meaning to the nature of

the continuum or the construct. Interpreting the meaning of each rubric or item then adds

to the meaning of the construct and this can also be interpreted as a measure of the

validity of the construct. Item response theory also provides a measure of person

separation. The index also ranges from 0.0 to 1.0 with similar meaning to the values for

the item separation index. In the case of persons, however, perfect separation on the

underlying variable would indicate that it is possible to accurately identify differences

between the persons assessed. It is therefore possible to use this as a measure of criterion

validity. Item response modeling therefore tends to merge the ideas of reliability and

validity in ways that classical approaches cannot. Values of zero do not always mean that

there is no separation of the indicators. Low values are common when partial credit rubrics are used and the range of one or more rubrics is restricted and falls within the range of another rubric.

This lowers the overall separation index of the indicators and indicates that there is

redundancy among the set of indicators.
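For readers unfamiliar with the classical index, a minimal sketch of the Cronbach alpha calculation is given below (the separation indices require an IRM calibration and are not reproduced here). The rating matrix is hypothetical.

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for a students x items score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        X = np.asarray(item_scores, dtype=float)
        k = X.shape[1]
        item_var = X.var(axis=0, ddof=1).sum()
        total_var = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical ratings for five students on four rubrics
    ratings = [[2, 3, 2, 3], [0, 1, 1, 1], [3, 4, 3, 4], [1, 2, 1, 2], [2, 2, 3, 3]]
    print(round(cronbach_alpha(ratings), 2))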


There is another form of consistency needed in this project. The SME judgments of the

indicator level need to be consistent with the empirical calibration of the level of the

indicator. The validity and reliability of the methodology relies on the SME approach

yielding the same definition of the continuum as the IRM analysis. To check this, it has

been proposed that SME panels nominate the relative difficulty of each of the rubrics.

This was tested by comparing the SME placement of rubrics at levels of performance and

then empirically calibrating their relative difficulty using item response modeling (IRM).

A measure of consistency has been proposed for this comparison and it was called the

Standard Error of Judgment (SEj). It is the square root of the average squared difference between the SME level and the IRM level; in more common language, it is akin to a standard deviation of the differences between the SME and IRM placements. There is a need for a new study to see if this measure is appropriate

and to determine how to assess the significance of its value. The example provided

illustrates that the SEj for the Hospitality unit used in this project for illustration was 0.27.
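A minimal sketch of the SEj calculation as defined above is given below, using hypothetical SME and IRM level placements for six rubrics.

    import numpy as np

    def standard_error_of_judgment(sme_levels, irm_levels):
        """SEj: the square root of the average squared difference between the
        SME-assigned level and the IRM-calibrated level of each rubric."""
        sme = np.asarray(sme_levels, dtype=float)
        irm = np.asarray(irm_levels, dtype=float)
        return float(np.sqrt(np.mean((sme - irm) ** 2)))

    # Hypothetical level placements for six rubrics
    sme = [1, 2, 2, 3, 3, 4]
    irm = [1, 2, 3, 3, 3, 4]
    print(round(standard_error_of_judgment(sme, irm), 2))   # 0.41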

Zero would indicate a perfect match, but it is not known what the overall distribution

statistics are for the SEj and hence it is not possible to state how impressive or otherwise

this figure is. Given the small value (1.0 would indicate an average of one level

difference) it does suggest that there was a close match and that the SME group could be

regarded as making valid judgments of relative level of difficulty. There is still a lot of

work to do on this index to determine how to judge its significance and importance, but

this would be a very technical study and outside the scope of this project. In the table below, the reliability indices are reported for the Hospitality assessments. In Victoria and NSW, where the assessments are combined with central examinations, the reliabilities of the combined assessments are also reported; the reliabilities for the SRF are not state-based, so only a national reliability is reported for those measures.


Table 3. Reliabilities and Distributions of the Hospitality Assessments in NSW and Victoria

Unit*                 Alpha   Item sep.  Case sep.  Score mean  Score SD  Level distribution (F, E, D, C, B, A)
SRF (all)             0.84    0.55       0.78       9.35        8.1       4, 9, 22, 34, 36, 44
SRF (Vic)             -       -          -          8.55        8.29      11, 21, 27, 33, 40
SRF + FB Vic Exam     0.96    0.64       0.86       93.28       39.8      -
SRF + CC Vic Exam     0.96    0.71       0.89       65.27       41.7      -
FB Exam Only          0.95    0.86       0.94       98.51       34.7      -
CC Exam Only          0.95    0.7        0.94       78.9        34        -
SRF + NSW Exam        0.76    0.86       0.79       32.1        7.09      -
SRF (NSW)             -       -          -          10.56       8.3       8, 16, 22, 24, 29
NSW Exam Only         0.76    0.98       0.82       32.15       6.77      -

* SRF = Standards Referenced Model; FB = Food and Beverage; CC = Commercial Cookery

The SRF has high reliability and validity indices, lending considerable support to the

recommendation for its use, whether it is used alone or in combination with a central

exam or other forms of assessment. It is likely that the use of a mean square error will

ultimately yield a better measure of agreement and consistency of the judges’ estimates

than other more rudimentary methods such as the colour coding of assigned levels as

shown in Figure 18 below. The colour coding procedure used by Bateman (2003) and

Connally (2004) relies on further judgment to assess the level of agreement and might be

based more on intuition than measurement. Nevertheless, the figure is shown to illustrate

the current state of expertise and to encourage further studies of measures of consistency

and agreement and the standard error of judgment (SEj). Horizontal lines separate the

IRM based levels. The SME levels are colour coded.


[Variable map: rubric difficulty locations grouped into IRM levels (Levels 1 to 3), with the SME-assigned levels (Levels 0 to 4) colour coded.]

Figure 18: Colour coding of the match between SME and IRM relative difficulty.

Consistency is an issue that needs to be addressed. In making decisions about

competency, there is a need not only to be consistent but to appear to be so as well. A

national system of assessment, such as in the proposed model, offers an opportunity to

monitor consistency at a national level, which is important given the link to a national

credential. The following analyses have not been previously reported in any context.

They show that, for the most part, the competency assessment for a subject or for a

differentiating score is unaffected by the selection of units or by how many units are used.

This supports the flexibility of the states to select units most suitable for their curriculum.

Figure 19 relates student ability or competence (horizontal axis) to the standards-

referenced band assigned for each unit.


Figure 19: Competence and the selection of units.

Figure 20, however, shows that the competence assessed in a unit can differ considerably

depending on the location of the assessment. In the ACT, for example, a student has to

demonstrate a high level of performance quality in order to be assessed as competent. In

most other states a lower level of performance quality is required in order for the student

to be assessed as competent. In this study the difference due to location is controlled, but

in most assessments, where the weighted differentiating score is not used and just a

decision of competent is made, this difference is uncontrolled and the consistency of

decision-making is variable. This may reflect upon points raised earlier in this report.

There is no fixed level of competence. That decision varies according to the expectations

of the employer for the workplace and according to the demands of the curriculum in the

school system. The important thing is whether the student can meet the expectations of

the workplace or (in this case) the expectations of the school system. Nevertheless, this

model provides the methodology for moderation of decisions and assessments and allows

the systems to be aligned with respect to scores assigned for equal competence. This is a

powerful argument for the adoption of a single approach that allows for differences but

does not penalize students due to the location of their assessment.


Figure 20. Competency and location of assessment.

Figures 19 and 20 show that there are different views of competence across systems and

action needs to be taken to address this lack of consistency. It is not an artifact of the

current procedure. It is a hidden aspect of competency assessment, which relies on

judgment in context. The effect may be exacerbated if further research allowed for an

examination of this effect across workplaces and across assessors. The chart points out

that there are different interpretations of competence and its manifestation across state

systems. It cannot be described as a weakness of the proposed model. On the contrary, it

is a strength. Previous investigations of consistency of competency assessment have not

focused on outcomes or on the performances of the assessees. They tend to have

examined the procedures and materials. Even with constant process and materials,

differences exist in the interpretation of competence. If a consistent national process is

used and the model is adopted it will become possible for national standards to be set and

monitored in competency assessment. Hence the model not only provides an opportunity

for scored assessment, it adds the notion of quality to competence and allows monitoring

of standards and the existence of bias in competency assessment. This, more than any

other reason, makes it compelling for state and territory systems to adopt a single approach. The model explored in this project is offered as an example of one that

provides each of the properties required in a competency assessment system and adheres

to the principles recommended.

In this project the ability of the student or the level of performance has been controlled for

the effect of the location. Without the weighted differentiating score and IRM calibration,

this effect would have to be controlled through moderation. Consistency of competence


assessment is an issue that still needs to be resolved. The methods displayed in this report

have shown a possible approach.

Systems

It is important to be cognizant of the need for a national system of assessment with

respect to the national training packages. In Australia, systems of education are

differentiated in terms of philosophy, approach, and methods of implementing

educational curricula. It may be that incentives have to be offered to state systems in

order for them to adopt a national approach to differentiated scores and to gain greater

recognition for students and their performances or achievements in VET subjects. Some

systems, such as Victoria, have invested large amounts of resources in existing

approaches based on generic criteria indirectly related to the training packages. Teachers

in both the trials and the pilot study expressed frustration with the approach and a desire

to use the trial materials. It would be relatively simple to change the criteria to fit with an

approach based on the training packages rather than on assessment methods. The validity

of the proposed system may be sufficient incentive.

Teachers’ Practices

Teachers have indicated their enthusiasm for the quality indicator model and its power to

communicate competency in terms consistent with trends in general education. To adopt

it, they will need to assess and record in terms of the levels of performance demonstrated

by students and the scoring procedures. This is not a particularly arduous task. Teachers

all over Australia have become accustomed to standards frameworks through outcomes-

based education, profiles or competence assessment for many years. The proposed model

would bring into line the ‘VET in schools’ subjects with the academic and mainstream

subjects and enable the teachers to adopt a nationally uniform method of assessment,

recording and reporting of achievements across the curriculum.

Employers

For employers, the implications are straightforward. The model represents a change in

both logic and approach to interpreting assessment evidence, recording and developing

training plans. There are benefits in terms of more detailed knowledge of employee

performance and the capacity or potential for employees to be trained and perform at


levels above the minimum outlined in the training packages. With some employers and

registered training authorities, it may have implications for training plans where

competitive advantage is an important aspect.

Parents

Parents are often placed in a difficult situation when students wish to pursue a vocational

credential. There is no doubt that this, in the minds of many parents, is an inferior choice.

The students are often persuaded not to do so and are encouraged to undertake courses in

which they have little or no interest. This should never be the case, and it would not be necessary if parity and the scoring of the subjects were reinforced such that all subjects were eligible for all possible options.

This would make VET subjects more attractive, because not only would they be eligible

for any further career, academic or study choice, they would also lead to a national

credential. It also means that, for VET subjects in the curriculum, there would be a need

to continue to ensure that the rigor of assessment was appropriate to the kinds of

decisions that need to be made on the basis of performance in those subjects.

Students

For students, the model removes the discrimination, the lack of parity, and the necessity to make, in the senior secondary years, lifelong decisions about whether they will or will not pursue an academic or vocational education program. While it is true that only a

small percentage of VET students wish or even attempt to obtain university selection, the

fact that many or most VET subjects are closed to this option means that there will always

be a ‘second class’ label attached to VET subjects and hence to students who specialise in

those subjects. This may not be important to many students but it is important that

students are offered the opportunity to simply choose subjects at the senior secondary

level and not to choose life career paths at this level. It is important that all options

remain open to all students and bringing ‘VET in schools’ subjects into line with all

other subjects will be an important means of achieving this. The model proposed in this

project makes this feasible and retains the competency assessment and reporting required

and used extensively in industry.


References

Bateman, A. (2003). A validation of multi source assessment of higher order competency assessment. Unpublished Masters thesis, Faculty of Education, University of Melbourne.

Connally, J. (2004). A multi source assessment of higher order competencies. Unpublished Doctoral thesis, Faculty of Education, University of Melbourne.

Glaser, R. (1981). The future of testing: A research agenda for cognitive psychology and psychometrics. American Psychologist, 36(9), 923-936.

Glaser, R. (1963). Instructional technology and the measurement of learning outcomes:

Some questions. American Psychologist, 18, 519-521.

Greaney, V., Khandker, S.R., & Alam, K. (1999). Bangladesh: Assessing basic skills.

Dhaka: University Press.

Griffin, P. (1995). Competency assessment: Avoiding the pitfalls of the past. Australian

and New Zealand Journal of Vocational Education, 3(2), 33 - 59.

Griffin, P. (1997). Developing assessment in schools and workplace. Paper presented at

the Inaugural Professorial Lecture, Dean's Lecture Series, Faculty of Education,

The University of Melbourne, September 18.

Griffin, P., & Gillis, S. (2001). Competence and quality: Can we assess both? Upgrading

Assessment: A National Conference on Graded Assessment, Melbourne, Kangan

Batman Institute of TAFE.

Griffin, P., & Gillis, S. (2002). Scored assessment for Year 12. Report of the pilot study.

Assessment Research Centre.

Griffin, P., Gillis, S., Connally, J., Jorgensen K., & McArdle D. (2000). A multi source

approach to assessing higher order competences. A project report to the Australian

Research Council. Canberra, ARC.

Griffin, P., Gillis, S., Connally, J., Jorgensen K., & McArdle D. (2003) A multi source

approach to assessing higher order competencies. A project report to the

Australian Research Council. Canberra, ARC.

Griffin, P., Gillis, S., Keating, J., & Fennessy, D. (2001). Assessment and reporting of VET courses within senior secondary certificates. In Creating expanded opportunity for youth: Greater recognition for VET courses in industry and university. Sydney: New South Wales Department of Vocational Education and Training.


Griffin, P., & Nix, P. (1991). Educational assessment and reporting: A new approach. Sydney, NSW: Harcourt Brace Jovanovich.

Griffin, P., Smith, P., & Ridge, N. (2002). The literacy profiles in practice: An

assessment approach. Portsmouth: Heinemann.

Masters, G.N. (2002). Fair and meaningful measures? A review of examination

procedures in the NSW Higher School Certificate. Camberwell: Australian

Council for Educational Research.

McGaw, B. (1997). Shaping their future: recommendations for reform of the Higher

School Certificate. Sydney, NSW: Department of Training and Education

Coordination.

Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests.

Copenhagen: Danish Institute for Educational Research (Expanded edition, 1980.

Chicago: University of Chicago Press).

Sadler, R. (1987). Specifying and promulgating achievement standards. Oxford Review of

Education, 13(2), 191-209.

Sadler, R. (1998). Formative assessment: Revisiting the territory. Assessment in

Education: Policies, Principles and Practice, 5, 77-84.


Appendix A

Standards Referenced Frameworks

Level Descriptions and Distributions

Hospitality Units

Unit*        Alpha   Item sep.  Case sep.  Score mean  Score SD  Level distribution (F, E, D, C, B, A)
SRF (all)    0.84    0.55       0.78       9.35        8.1       4, 9, 22, 34, 36, 44
SRF (Vic)    -       -          -          8.55        8.29      11, 21, 27, 33, 40
SRF (NSW)    -       -          -          10.56       8.3       8, 16, 22, 24, 29


Unit Code    Unit Description                                    Ni   Alpha  Mean   SD     Level cut scores (Levels 0-4)  SEj
THHCOR01B    Work with colleagues and customers                  26   0.95   43.67  11.40  2, 9, 25, 41, 50               1.50
THHCOR02B    Work in a socially diverse environment              10   0.87   17.88  4.93   3, 7, 12, 19                   0.95
THHCOR03B    Follow health, safety and security procedures       11   0.86   18.04  3.99   3, 8, 15, 18                   1.15
THHGHS01B    Follow workplace hygiene procedures                 6    0.83   15.62  3.84   3, 9, 12, 15                   0.45
THHHCO01B    Develop and update hospitality industry knowledge   10   0.92   18.91  7.24   2, 7, 15, 22                   0.85
THHGGA01B    Communicate on the telephone                        13   0.90   22.73  5.18   4, 10, 18, 24                  1.22
THHBH01B     Provide housekeeping services to guests             11   -      -      -      10, 15, 20, 26                 -
THHBH03B     Prepare room for guests                             20   -      -      -      11, 21, 29, 43                 -
THHBKA01B    Organise and prepare food                           16   0.92   34.37  6.99   3, 6, 20, 32, 36               1.71
THHBKA02B    Present food                                        10   0.90   23.57  5.11   4, 15, 24                      0.58
THHGFA01B    Process financial transactions                      15   0.95   30.16  11.13  6, 11, 20, 35                  1.21
THHGCS02B    Promote products and services to customers          12   -      -      -      6, 21, 28, 30                  0.95
THHBFB10B    Prepare and serve non alcoholic beverages           10   0.94   21.98  7.69   7, 12, 17, 22                  0.88
THHBKA03B    Receive and store kitchen supplies                  15   0.93   17.09  7.42   5, 10, 15, 26                  1.61


Standardized scores of Hospitality using a mean of 30 and a standard deviation of 7.

[Histogram of converted scores: Mean = 30.00, Std. Dev. = 7.00, N = 323.]


Grade A
Manages and deals with issues as they arise. Shows high level skills in dealing with customers and work team members; solves problems as they arise, is aware of nuances in customer and staff interactions and is able to act accordingly and handle atypical telephone calls. Prepares ingredients for an extensive range of hot and cold beverages and maintains and monitors equipment usage and functionality.
Reporting summary: Skilled in dealing with clients and team members; knowledgeable about stock and presentation.

Grade B
Manages quality control with stock and preparation of financial transactions. Presents food correctly and with style, shows understanding of industry issues and contributes to workplace health, safety and security. Avoids and resolves misunderstanding with colleagues and others.
Reporting summary: Takes control of quality and finances, for presentation and industry issues including OHS and IR.

Grade C
Manages stock in accordance with OHS and enterprise requirements. Evaluates products, services and promotional initiatives. Processes financial transactions, maintains efficient workflow in tools and food, telephone calls and systems. Knows industry, legal and ethical implications, OHS hygiene risk management, and shows cultural awareness and sensitivity with colleagues and customers.
Reporting summary: Manages stock, services and promotions. Maintains workflow and equipment; knows industry and ethical issues.

Grade D
Maintains stock and supplies, product/service and knowledge of the industry. Follows procedures for cash, food portions and presentations, OHS and hygiene regulations and telephone calls, handling and storage of foods, and emergency situations. Demonstrates logical workflow in preparation and knife handling. Is patient, courteous and helpful with colleagues and customers.
Reporting summary: Maintains supplies and stock; equipment, customers, OHS and food specialities, linked to industry and its tools.

Grade E
Is aware of kitchen stock and supplies and non-alcoholic beverages. Informs customers of products and services, uses correct garnishes and sauces and follows equipment safety procedures. Can communicate on the telephone, update knowledge of industry, risks, storage and OHS procedures. Is polite with colleagues and customers.
Reporting summary: Knows about supplies and stock; customers, OHS and food specialities, linked to industry.

Grade F
With assistance and advice, can receive supplies, inform customers, select garnishes/sauces and follow safety procedures. With assistance, can communicate on the telephone, update hospitality industry knowledge, follow workplace hygiene, health safety and security procedures, and show politeness with colleagues and customers.
Reporting summary: Learning to deal with supplies, workmates and customers; use the telephone; check industry information and OHS.


THHCOR01B Work with colleagues and customers

Level 4: Anticipates, monitors and resolves difficult situations when dealing with others
At this level the student can acknowledge covert signs of customer dissatisfaction, respond appropriately to a situation where a customer displays anger, and where necessary apply conflict resolution strategies. The student can monitor work team goals against enterprise requirements and directions, anticipating difficulties that may arise in achieving tasks, provide constructive feedback to team members to improve work group functioning, and utilise the diverse composition of the work team to help achieve goals.

Level 3: Displays cultural sensitivity and high quality service
At this level the student can adjust the use of spoken language in response to subtle observations, show cultural sensitivity when communicating with people from diverse backgrounds, and speak in a way that conveys sincerity. The student can supply accurate information to customers in a manner that enhances acceptance of the product or service, and deal with situations involving unreasonable needs and requests in a manner that maintains customer satisfaction. At this level the student can promote trust and respect of team members through demonstration of consistently high performance standards, and propose initiatives for enhancing the quality of service.

Level 2: Deals with difficult situations in a positive and sensitive manner
At this level the student can communicate with appropriate oral and body language, and apply knowledge of different cultures when communicating. The student can actively seek information on customer needs, including signs of customer dissatisfaction, and respond to complaints in a positive and sensitive manner. The student can prioritise and undertake designated work to ensure work team goals are met, performing day to day activities in a positive and professional manner to promote trust, support and respect.

Level 1: Communicates and interacts with others in a positive and supportive manner
At this level the student can maintain a positive, courteous and cooperative manner at all times, asking questions and considering non verbal communication. When dealing with colleagues, the student can acknowledge and respond to feedback, provide support upon request, seek assistance when difficulties arise in achieving tasks, and work in collaboration with team members to identify work team goals.

Level 0: Demonstrates limited ability to work with colleagues and customers
At this level the student can communicate with customers and colleagues in a polite, professional and friendly manner and maintain high standards of personal presentation. His or her manner is consistent but does not take account of customer needs. The student requires assistance to learn how to work in a collegial manner and is unsure of details or where to find out information.


THHCOR02B Work in a socially diverse environment

Level 3: Avoids, and when required, resolves cultural misunderstandings
At this level the student can use his or her experience to select, from a range of strategies, the most appropriate for handling cultural misunderstandings and avoiding conflict. He or she can explain the events leading to cross cultural misunderstandings, and apply knowledge of different cultures and cultural characteristics when communicating with colleagues and customers.

Level 2: Displays cultural awareness and sensitivity
At this level the student can apply a range of communication strategies to overcome language barriers. He or she can explain the significance of cultural diversity and values when dealing with colleagues and customers. The student can also list situations that could result in cross-cultural breakdowns, and explore factors that may have led to difficulties and report them where necessary.

Level 1: Uses various communication strategies when dealing with diverse groups
At this level the student describes and uses a range of verbal and non-verbal communication strategies that are appropriate to a socially diverse environment. The student can describe the key characteristics of a range of cultural groups and the principles that underpin cultural awareness.

Level 0: Requires support to work in a socially diverse environment
At this level the student is patient, courteous and helpful when dealing with customers from a range of socially diverse backgrounds. The student can refer customers to a colleague or team leader when experiencing difficulties.


THHCOR03B Follow health, safety and security procedures

Level 3: Actively contributes to the management of workplace health, safety and security
At this level the student can analyse and evaluate the level of risk associated with emergency situations, suggesting the most appropriate control or preventative measures, and can accurately report details of emergency situations. Further, he or she contributes to the management of workplace health, safety and security.

Level 2: Follows correct health, safety and security procedures and can deal with emergency situations
At this level the student can describe a range of workplace hazards, their consequences and appropriate preventative measures. He or she follows correct health, safety and security procedures, and explains the legal implications of disregarding those procedures. The student can identify and deal with emergency situations, following workplace emergency and evacuation procedures.

Level 1: Identifies and reports OHS issues requiring attention
At this level the student can correctly describe the OHS requirements of both employers and employees, and describe major causes of workplace-related accidents. He or she can promptly identify issues requiring attention and communicate these to the appropriate person.

Level 0: Requires support to follow occupational health, safety and security procedures
At this level the student maintains personal hygiene and grooming and can explain the importance of this to the industry. Any suspicious behaviour or unusual occurrences can be correctly reported.


THHGHS01B Follow workplace hygiene procedures

Level 3: Understands hygiene regulation and its impact on the industry
At this level the student can provide an overview of relevant hygiene legislation. Further, he or she can explain the impact of legislation and regulations on the hospitality industry, and the legal implications of not following workplace hygiene procedures.

Level 2: Applies corrective action to minimise or remove hygiene risks
At this level the student can describe hygiene control procedures and principles, and the consequences of hygiene risks. The student can apply corrective action to minimise or remove risks.

Level 1: Follows workplace standards for handling and storage of foods
At this level the student can describe and follow workplace standards for handling and storage of foods, and can identify general hazards and hygiene risks in food handling. The student can describe processes for minimising hygiene risks, and can promptly report risks to the appropriate personnel.

Level 0: Demonstrates ability to follow basic hygiene procedures
At this level the student can follow workplace hygiene procedures, identify common hygiene risks and list factors that contribute to hygiene problems.


THHHCO01B Develop and update hospitality industry knowledge

At this level the student can evaluate the key factors that impact on services offered, and apply industry knowledge to enhance service delivery. The student can explain legal and ethical issues as related to their workplace, and suggest strategies for ensuring these obligations are met. They can critique information from a range of sources to update knowledge and understanding of the hospitality industry, and describe emerging issues and explain their impact on the industry.

Level 3 Demonstrates high awareness and understanding of a range of industry related issues, including current and emerging issues

At this level the student can describe quality assurance processes and systems, and apply specific information on the hospitality industry to enhance the quality of work performance. The student can identify a range of legal and ethical issues that impact on the hospitality industry, and actively seek information on these issues. The student can carry out independent research to update industry knowledge, identifying issues of current concern to the hospitality industry and explaining their implications for the enterprise.

Level 2 Maintains specific knowledge of the industry, including legal, ethical, and current issues of local concern

At this level the student can select appropriate sources of information on the hospitality industry, and apply this information in day to day activities. They can describe the function of a range of industry sectors, and access information on relevant sectors as required. The student can enhance service quality through communication with customers, and carry out workplace activities according to legal and ethical industry practices.

Level 1 Accesses specific information on relevant sectors of work when required

At this level the student can select some sources of information regarding the hospitality industry, and can apply this information in day to day activities. He or she can describe some functions of industry sectors but cannot enhance service quality through communication with customers.

Level 0 Demonstrates limited ability to develop and update hospitality industry knowledge


THHGGA01B Communicate on the telephone

At this level the student can ask well focused questions to establish the purpose of calls and affirm meaning. He or she can apply problem solving skills to answer atypical inquiries, and can determine the level of urgency of incoming calls to ensure timely follow-up. When making calls the student can obtain all necessary information to facilitate accurate, efficient and purposeful communication. The student can use advanced technical features of the enterprise telephone system to efficiently handle both incoming and outgoing calls.

Level 3 Efficiently handles incoming and outgoing calls, including those that are urgent and atypical, and uses advanced features of a telephone system

At this level the student can ask questions to establish the purpose of calls, and promptly respond to inquiries and apply problem solving skills where necessary. He or she can maintain accurate records of received calls and select the most appropriate communication strategy to relay messages. The student can identify and report threatening or suspicious calls, and determine the most appropriate action. The student can maintain professionalism at all times, even when dealing with difficult callers.

Level 2 Applies appropriate communication, record keeping and problem solving skills when using the telephone

At this level the student can establish the purpose of routine calls and offer assistance to the caller in a professional and friendly manner. He or she can repeat caller details to ensure mutual understanding, and relay messages to the nominated person within an acceptable timeframe. The student can use a range of resources to obtain correct telephone numbers, and clarify the purpose of calls prior to calling. The student can use basic technical features of the enterprise telephone system.

Level 1 Provides basic assistance to routine calls and inquiries

At this level the student can answer incoming calls promptly, clearly and politely and in accordance with enterprise procedures. The student attempts to meet caller requests and where necessary transfers the call to the appropriate person. When making calls, the student can obtain correct telephone numbers from accessible resources, and clearly convey his or her name, company and the purpose of the call.

Level 0 Demonstrates basic abilities to communicate on the telephone


THHBH01B Provide housekeeping services to guests

At this level the student can efficiently integrate multiple activities, ensuring minimal disruption to workflow, customise arrangements to resolve specific problems and suggest appropriate follow-up actions where necessary. The student can also ensure that current guests’ names are known and that individual guests are fully satisfied with the equipment set-up and confident in their ability to use it for their needs.

Level 4 Efficiently manages workflow, problem solves and provides a personalised service to guests

At this level the student can maintain a polite and friendly manner when handling requests, make appropriate apologies and implement available support procedures. The student can also develop procedures to ensure the timely delivery of items and organise the items for pickup in combination with other services.

Level 3 Performs and arranges multiple activities to ensure high quality and timely services

At this level the student can use the guest’s last name and correct title to acknowledge them, note details in the most appropriate format and negotiate collection times around other activities. The student can also advise guests on appropriate equipment usage, manage set-up and answer questions to ensure understanding.

Level 2 Responds appropriately to housekeeping requirements and provides advice to guests on room and housekeeping equipment

At this level the student can demonstrate knowledge of customer service standards, security procedures, and procedures for reporting malfunctions and equipment use. The student can also appropriately address a guest who has not been identified, record the details of requests using standard procedures, satisfactorily communicate timelines for meeting them and make respectful apologies.

Level 1 Understands basic, standard procedures for providing housekeeping services

At this level the student can demonstrate some functions of industry sectors but cannot enhance service quality through communication with guests. The student can generally record the details of requests, but industry procedures are not always followed.

Level 0 Demonstrates limited ability to provide housekeeping services to guests


THHBH03B Prepare room for guests

At this level the student can accurately estimate the required supplies and select or order them accordingly. The student can efficiently reset rooms to meet guest needs, systematically validate those rooms that require service and effectively manage problems involving room access. The student can also ensure pests are properly removed, defects are reported and, where possible, appropriate action for repair or replacement is suggested. Knowledge of the correct cleaning chemicals, equipment and procedures can also be applied, incorporating quality checks throughout.

Level 4 Performs quality control checks when preparing rooms

At this level the student can correctly select and replenish supplies, load trolleys in an efficient and safe manner, attractively replace room supplies and store supplies in a way to facilitate stock management. The student can also replace bed linen quickly and effectively, thoroughly check all furniture, fixtures and fittings and ensure the prompt return of guest items.

Level 3 Efficiently and safely prepares room for guests

At this level the student can demonstrate knowledge of equipment types and requirements as well as relevant customer service and security procedures. This includes reporting pests and unusual or suspicious items to the proper person and appropriately reporting damaged items identified. The student can also effectively liaise with housekeeping staff, remove stains sufficiently and ensure that ordered supplies arrive within an acceptable timeframe.

Level 2 Performs standard procedures for preparing rooms for guests

At this level the student can demonstrate knowledge of enterprise procedures for loading and selecting supplies for trolleys. The student can also ensure that all equipment is clean and prepared for future use and sufficient supplies are readily available to replenish rooms. Beds and mattresses can be properly stripped, rooms cleaned and all items stored with consideration to safety and security.

Level 1 Checks and replaces supplies safely, cleans room and strips bed

At this level the student can generally communicate with housekeeping staff and clean rooms in the correct order with minimal disruption to guests.

Level 0 Requires support to prepare room for guests


THHBKA01B Organise and Prepare Food

At this level the student can improvise when ingredients are unavailable due to spillage, breakage or spoilage. He or she can apply cutting and shaping techniques appropriate to the style of cuisine, ensuring minimal wastage. The student can correctly clean, prepare and fillet fish, and clean and prepare seafood, with consideration to hygiene and OHS requirements.

Level 4 Improvises ingredients when required, applies cutting and shaping techniques appropriate to the style of cuisine, and cleans and prepares seafood with consideration to hygiene and OHS

At this level the student can use logical and time efficient workflow in preparation of food, including principles of sequencing, organising, co-operation and teamwork, and can effectively resolve situations where equipment fails or is unavailable. The student can consider quality, suitability, consistency, hygiene and wastage when preparing and portioning a range of foods. He or she can use a range of knife handling techniques, and a variety of cutting and shaping techniques to prepare fruit and vegetables, meat and poultry with consideration to hygiene and OHS requirements.

Level 3 Uses logical and time-efficient workflow in preparation of food and displays a range of knife handling, cutting and shaping techniques

At this level the student can safely and hygienically assemble and use a range of equipment. He or she can describe the characteristics of basic food products, ingredients and a range of menu types, and identify ingredients according to standard recipes, recipe cards and instruction sheets. The student can assemble the correct quantity, type and quality of ingredients in a logical sequence. He or she can prepare and portion a range of foods, including fruit and vegetables, dairy products, meat, poultry and seafood, quickly and accurately with consideration to quality, hygiene, suitability, consistency and wastage.

Level 2 Prepares a range of food quickly and accurately with consideration to quality, hygiene, suitability, consistency and wastage

At this level the student can ensure that the correct equipment is safely assembled and ready for use, and that correct procedures for maintenance and cleaning of equipment have been followed. The student can describe the characteristics of basic food products, ingredients and a range of menu types using appropriate terminology.

Level 1 Follows correct equipment safety procedures and assembles ingredients for menu items

At this level the student can generally ensure that the equipment is assembled and that cleaning of equipment has been attempted. The student can assemble the correct quantity, type and quality of ingredients for menu items, including accurately measuring and sifting dry goods, and preparing meat correctly, safely and hygienically.

Level 0 Demonstrates ability to assemble equipment and prepare some ingredients


THHBKA02B Present food

At this level the student can describe the characteristics of food products and menu types. He or she can explain the importance of recipe cards in ensuring quality, and describe and use a range of portion control measures and equipment. The student can present food using both classical and innovative styles, with consideration to colour, contrast and temperature. The student can implement problem solving strategies to overcome unanticipated shortages, and can participate in a team to ensure logical and time-efficient work flow and to maintain quality.

Level 2 Presents food using classical and innovative styles with consideration to colour, contrast and temperature

At this level the student can correctly portion food according to standard recipes and instructions, arrange garnishes and sauces with consideration to colour, contrast and presentation, and recommend alternative garnishes for menu items. He or she can select appropriate crockery for menu items, and ensure there are sufficient quantities of crockery for food service. The student can list factors that influence the effectiveness of team functioning and display effective teamwork with all service staff. The student can explain the importance of OHS and hygiene regulations for food preparation, service and equipment usage, and ensure adherence to these regulations.

Level 1 Portions and presents food according to standard recipes and instructions, OHS and hygiene regulations and food presentation requirements

At this level the student can list and select ingredients for menu items in a logical and sequential manner, and can describe a range of appropriate garnishes and sauces. The student recognises the importance of teamwork within the kitchen.

Level 0 Demonstrates ability to select appropriate garnishes/sauces when presenting food


THHGFA01B Process financial transactions

At this level the student can process a range of non-cash transactions in accordance with enterprise and financial institution procedures, and prepare and issue point of sale receipts including all relevant tax details. He or she can apply correct enterprise procedures for handling declined automated transactions with sensitivity and tact. Where appropriate, the student can accurately determine the register reading or print-out.

Level 4 Processes a range of non-cash transactions in accordance with enterprise and financial institution procedures

At this level the student can receive and check cash float against appropriate documentation, and describe the process for resolving inconsistencies between cash float and documentation. The student can conduct all transactions within enterprise speed requirements whilst maintaining customer service standards, and can describe enterprise procedures for responding to customer claims of incorrect change. He or she can count cash and calculate non-cash documents accurately, maintaining records in accordance with enterprise procedures, and can describe enterprise procedures for reconciling takings, including dealing with variances in the reconciliation process.

Level 3 Conducts timely transactions, counts cash, calculates non-cash documents and maintains accurate records

At this level the student can apply basic numeracy skills when processing financial transactions. The student can describe a range of payment methods and explain the benefits of recording all transactions. He or she can describe and follow enterprise procedures when making cash payments, issuing automated receipts, separating and securing cash float from takings, recording takings and removing and transporting cash and documents.

Level 2 Follows enterprise procedures when processing automated receipts and cash payments, and removing and recording takings from register/terminal.

At this level the student can describe enterprise procedures for handling cash payments and the importance of checking cash floats. The student can describe enterprise security procedures for removing and transporting cash and non-cash documents. When reconciling, the student can count cash and calculate non-cash documents accurately.

Level 1 Receives cash payments, issues correct change and records transactions in a timely manner

At this level the student can describe some procedures for handling cash payments and checking cash floats. The student can receive cash, issue correct change that has been electronically calculated, and record all transactions within an appropriate timeframe.

Level 0 Requires support to process financial transactions


THHGCS02B Promote products and services to customers

At this level the student can apply conflict resolution strategies to deal with situations where customers respond angrily to sales initiatives and explain any legal issues that need to be considered when selling.

Level 4 Applies conflict resolution strategies

At this level the student can evaluate products, services and promotional initiatives, using a range of data sources, and propose subsequent sales strategies for consideration in future planning. He or she can also enhance the customer’s acceptance of the product/service and employ upselling and cross-selling techniques whilst maintaining satisfaction. The student can create the opportunity to acquire specialised knowledge.

Level 3 Evaluates products, services and promotional initiatives. Successfully employs upselling and cross selling techniques

At this level the student can use a variety of research techniques to update and maintain product/service knowledge and explain the importance of sharing knowledge with colleagues. The student can also obtain customer preferences through active listening and questioning, provide helpful information and create opportunities to promote products and services. Product and service evaluations and promotions on offer can be integrated into sales strategies.

Level 2 Actively researches and maintains product/service knowledge

At this level the student can update product/service knowledge through other colleagues and readily accessible written material, and communicate it to colleagues. The student can also supply accurate information in response to customer queries and inform customers of possible extras and add-ons.

Level 1 Supplies accurate and readily available information to customers on products and services.

At this level the student has not yet consolidated the ability to update and communicate product/service knowledge. With supervision, the student can generally respond to customer inquiries in a polite and courteous manner.

Level 0 Requires support to promote products and services to customers


THHBFB10B Prepare and serve non alcoholic beverages

At this level the student can prepare ingredients and equipment for an extensive range of hot and cold beverages, and can prepare drinks efficiently during busy periods. He or she can ensure customer satisfaction, modifying recipes as required. The student can conduct regular inspections of machinery and equipment, monitor inefficient usage, and anticipate problems.

Level 3 Prepares ingredients for an extensive range of hot and cold beverages and maintains and monitors equipment usage and functionality

At this level the student can communicate with customers and staff to ensure that desired drinks are prepared to customer satisfaction, and demonstrates an understanding of enterprise practices for assembling ingredients and equipment. The student can ensure that quality control is maintained during busy periods, and can effectively resolve situations where drink-making equipment fails or is unavailable, ensuring minimal disruption to workflow.

Level 2 Customises drinks to meet specific requests and maintains quality control during busy periods

At this level the student can demonstrate knowledge of preparation methods and recipes for a range of beverages, and can prepare drinks in a logical and efficient order to maximise quality and presentation. He or she can develop procedures for preparing and presenting drinks in response to organisational and customer requests, and maximise drink presentation through use of appropriate glassware, crockery and garnishes. The student can clean machinery and equipment with consideration to its long-term life cycle.

Level 1 Prepares and serves a range of beverages in a logical, efficient and presentable manner

At this level the student can prepare ingredients and equipment for a small range of beverages. He or she can name drinks if prompted by customers and clean equipment in preparation for next use.

Level 0 Demonstrates limited ability to prepare and serve non alcoholic drinks


THHBKA03B Receive and store kitchen supplies

At this level the student can record and report variations in deliveries to ensure the timely replacement of goods. The student can implement enterprise criteria for assessing the quality and suitability of a range of products, and for managing stock levels. The student can utilise knowledge of ordering processes and procedures, including expected delivery times and waiting periods, to ensure that stock is replenished within the necessary time frame. He or she can label, monitor and rotate stock in accordance with enterprise requirements.

Level 3 Manages stock to ensure timely use and replacement of goods

At this level the student can check the match between received goods, delivery dockets and purchase orders using enterprise criteria, and record and report variations and discrepancies to the relevant person. He or she can describe typical problems that can arise with rotating and maintaining supplies, and the optimum conditions for maintaining storage areas, and can dispose of damaged or expired stock in accordance with enterprise requirements, legislation and OHS requirements.

Level 2 Identifies, records and reports incoming stock variations and discrepancies and disposes of damaged or expired stock in accordance with OHS legislation and enterprise requirements

At this level the student can explain enterprise procedures for checking incoming supplies. He or she can explain the individual storage requirements of various food and other products, including ideal storage temperatures and conditions and which products need to be separated, and can prioritise storage requirements of different commodities and other items as delivered. The student can describe and apply safety and hygiene requirements when moving and rotating stock.

Level 1 Prioritises storage requirements of various foods and maintains stock with consideration to usage, safety and hygiene

At this level the student can inspect incoming stock for damage, use-by dates and quantity, and maintain records in accordance with enterprise procedures. The student can store all chemicals, equipment and non-food products in accordance with enterprise procedures. He or she can explain the benefits of stock control, and identify and report any problems promptly.

Level 0 Follows standard procedures for inspecting, storing and recording incoming stock


Distributions

Note: this distribution does NOT represent a national sampling distribution. It shows the results of only the students who studied this unit.

[Bar chart: THHCOR02B Work in a socially diverse environment, distribution of students across national levels (scale 0 to 60), with categories "Requires support to work in a socially diverse environment", "Uses various communication strategies when dealing with diverse groups", "Displays cultural awareness and sensitivity" and "Avoids, and when required, resolves cultural misunderstandings".]


Record Sheet for Hospitality units considered in this study


[Item–person map: the student distribution (left) is plotted against the level thresholds of the Hospitality units H1–H14 (right) on a common logit scale from approximately –7.0 to +7.0.]

Variable map of Hospitality units
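
The variable map above places students and the level thresholds of each Hospitality unit on a single logit scale, so that the difficulty of reaching a given level can be compared directly with the distribution of student ability. As background to how such a scale is read, the sketch below shows the dichotomous Rasch model's success probability as a function of the gap between a student's ability and a threshold's difficulty; it is an illustrative formula only, not the polytomous model actually fitted in the study.

```python
import math

def rasch_probability(theta: float, delta: float) -> float:
    """Dichotomous Rasch model: probability that a person of ability theta
    succeeds on an item (threshold) of difficulty delta, both in logits."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# A student 1 logit above a threshold has roughly a 73% chance of exceeding it.
print(round(rasch_probability(1.0, 0.0), 2))
```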


Concurrent calibration of school based assessment and the Victorian Food and Beverage exam.


Concurrent calibration of school based assessment and the Victorian Commercial Cookery exam.


Concurrent calibration of school based assessment and the NSW Hospitality exam.

[Item–person map: school-based assessment criteria (s1–s10) and NSW Hospitality exam items (e1–e25) calibrated on a common logit scale from approximately +6.0 to –8.0, shown in School and Exam columns; the student distribution appears on the left.]


Appendix B

Standards Referenced Frameworks

Level Descriptions and Distributions

Business Studies Units

Unit        Nc    Ni    Alpha   Item   Case   Mean    SD     Score levels (F-A cut-offs)
SRF (All)   223   12    0.89    0.52   0.76   12.09   10.4   9, 18, 30, 41, 45
SRF (VIC)   33    12    -       -      -      15.28   6.71   5, 11, 26, 32, 37, 41
SRF (NSW)   35    12    -       -      -      12.09   10.4   16, 18, 26, 33, 37, 41
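
In the table above, Alpha is an internal-consistency reliability estimate for the unit scores. As a minimal sketch of how a Cronbach's alpha of this kind is computed from a cases-by-items matrix of criterion scores, the Python fragment below applies the standard formula; the data shown are illustrative only and are not drawn from the study.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a cases-by-items matrix of scores."""
    k = scores.shape[1]                          # number of items (criteria)
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data only: six students scored 0-3 on four criteria.
demo = np.array([[3, 2, 3, 2],
                 [1, 1, 2, 1],
                 [2, 2, 2, 3],
                 [0, 1, 1, 0],
                 [3, 3, 2, 3],
                 [1, 0, 1, 1]])
print(round(cronbach_alpha(demo), 2))
```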


Unit Code      Unit Description                                     Ni   Alpha   Mean    SD      Level max cut scores (Levels 0-3)   SEJ
BSBCMN105A     Use business equipment                               10   0.89    17.67   7.19    3, 7, 19, 22                        1.17
BSBCMN202A     Organise and complete daily work activities           9   0.90    28.13   6.14    2, 12, 16, 28                       1.13
BSBCMN203A     Communicate in the workplace                          8   0.87    19.06   5.38    6, 13, 20                           1.05
BSBCMN204A     Work effectively with others                          8   0.92    19.86   5.79    2, 7, 12, 20                        1.17
BSBCMN205A     Use business technology                              10   0.93    22.72   6.86    3, 12, 21, 25                       0.85
BSBCMN206A     Process and maintain workplace information           11   0.96    21.72   7.01    7, 18, 22                           0.72
BSBCMN207A     Prepare and process financial/business documents     15   0.94    24.86   11.04   3, 13, 29, 34                       0.89
BSBCMN208A *   Deliver a service to customers                        8   -       -       -       7, 14, 24                           0.74
BSBCMN211A *   Participate in workplace safety procedures            5   -       -       -       5, 9, 12                            0.71
BSCMN212A      Handle mail                                          12   0.94    35.70   11.35   6, 12, 41                           1.04
BSBCMN213A     Produce simple word processed documents              10   0.92    30.14   6.55    3, 17, 31                           1.09
BSBCMN306A     Produce business documents                           11   0.92    29.27   6.69    5, 12, 23, 31                       1.22
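
Each row above lists a maximum cut score for each level of the unit. As an illustration of how a raw unit total might be mapped onto a reported level under one plausible reading of these cuts (each value taken as the highest total still within that level), the sketch below uses the BSBCMN105A cuts from the table; the decision rule itself is an assumption rather than a rule stated in the report.

```python
from bisect import bisect_left

def level_for_total(total: int, level_max_cuts: list[int]) -> int:
    """Return the level whose band contains the raw total, assuming each cut
    is the maximum total for that level (highest band is clamped at the top)."""
    idx = bisect_left(level_max_cuts, total)
    return min(idx, len(level_max_cuts) - 1)

# Cuts for BSBCMN105A (levels 0-3), taken from the table above.
bsbcmn105a_cuts = [3, 7, 19, 22]
print(level_for_total(2, bsbcmn105a_cuts))   # level 0
print(level_for_total(7, bsbcmn105a_cuts))   # level 1
print(level_for_total(20, bsbcmn105a_cuts))  # level 3
```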


[Histogram of converted scores (mean = 30.00, std. dev. = 7.00, N = 223); horizontal axis: converted score (0.00 to 60.00), vertical axis: frequency.]

Standardized scores of Business using a mean of 30 and a standard deviation of 7.
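
The converted scores in the histogram above are a linear rescaling of the unit scores onto a reporting scale with a mean of 30 and a standard deviation of 7. A minimal sketch of that kind of conversion, assuming the usual z-score rescaling (the underlying scores are not reproduced here):

```python
import numpy as np

def rescale(scores: np.ndarray, target_mean: float = 30.0, target_sd: float = 7.0) -> np.ndarray:
    """Linearly rescale a set of scores to a given mean and standard deviation."""
    z = (scores - scores.mean()) / scores.std(ddof=1)
    return target_mean + target_sd * z

# Illustrative raw totals only.
raw = np.array([5.0, 12.0, 17.0, 23.0, 30.0])
print(np.round(rescale(raw), 1))
```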


BSBCMN105A Use business equipment

At this level the student can undertake routine maintenance of business equipment in accordance with the operating manual instructions. When required, the student can arrange for routine servicing. The student can also, under direct instruction, maintain records of equipment/resources.

Level 3 Carries out routine maintenance of equipment, organises servicing when necessary

At this level the student can select, from a range of business equipment, the most appropriate for a task. The student can also explain the organisation’s policies, plans and procedures that relate to the use and storage of business equipment and can apply problem solving skills to determine appropriate repair actions on routine faults and breakdowns.

Level 2 Selects equipment, explains policies, plans and procedures; describes repairs

At this level, the student can list common equipment faults for a range of business equipment and describe routine maintenance procedures that are consistent with manufacturer requirements. The student can also, under supervision, select and operate, from a range of business equipment, the most appropriate for completing the task.

Level 1 Learning to list faults and maintenance procedures, and to operate appropriate equipment

At this level, the student can describe the functions and operational requirements for a range of business equipment that are consistent with manufacturer requirements. The student can also, under supervision, report repairs that are outside of the operator’s responsibility.

Level 0 Describes functions and equipment requirements; learning to report repairs


BSBCMN202A Organise and complete daily work activities

The student at this level can apply effective negotiation strategies to build trust with colleagues and can demonstrate a willingness to compromise. The student can complete tasks in accordance with organisational requirements and within designated timelines. He or she can also anticipate factors that may impact on work requirements and can build into his or her workplan potential contingencies to minimise adverse impacts on achieving goals and outcomes. The student can suggest ways in which the work goals and those of the organisation can be better matched. The student can also minimise resource wastage and workplace disruption when using business technology. Furthermore, the student self-evaluates personal development opportunities and appropriately adjusts his or her own work performance.

Level 3 Completes tasks by adjusting for influencing factors; minimises wastage and disruption; seeks opportunities and adjusts performance

The student at this level checks the consistency of work goals and plans against organisational requirements and can adjust his or her own work performance as a result of comparisons with team and organisational standards. The student can identify factors that impact on work requirements and plan daily workplace activities involving use of business technology to minimise workplace disruption.

Level 2 Checks work performance and goals against team and organisational standards; identifies factors impacting on work requirements

The student at this level understands and complies with designated work goals and plans, and displays an understanding of the organisation’s and workgroup’s plans, responsibilities and accountabilities. The student can also assess and prioritise his or her workload to achieve allocated timeframes and use business technology efficiently and effectively to complete tasks in accordance with organisational requirements. When difficulties arise in achieving allocated tasks, the student seeks assistance from supervisors and/or colleagues and adjusts his or her work practices according to feedback obtained from others.

Level 1 Complies with work responsibilities, priorities and technology; uses assistance and advice

The student at this level can comply with work goals and plans designated by a superior. The student can recognise when assistance is required to complete allocated tasks.

Level 0 Learning to comply with work goals and plans; recognises when assistance is required


BSBCMN203A Communicate in the workplace

At this level, the student can improve communication in the workplace by evaluating the appropriateness of the methods used. The student can also successfully discuss complex issues and produce appropriate, timely and coherent written documents that use correct style, grammar and word choice.

Level 2 Improves communication in the workplace with appropriate, timely and coherent documents

At this level, the student can complete routine correspondence within designated timelines using clear and concise language that is presented in accordance with organisational requirements for style, format and accuracy. The student can confirm that the intended meaning of the written information has been understood, making appropriate modifications where necessary. The student can also display persuasive speech using appropriate word choice and body mannerisms, and can also clarify instructions or enquiries to ensure mutual understanding of requirements and expectations.

Level 1 Completes routine correspondence with clear meaning; uses persuasive and clear language for enquiries and understanding

At this level, the student can speak clearly and ask questions for clarification. He or she can also apply the most appropriate communication method to convey information and ideas. In addition, he or she can respond promptly and appropriately to routine instructions or enquiries, and can collect information relevant to work duties from a range of sources.

Level 0 Learning to clarify information and ideas, instructions and enquiries


BSBCMN204A Work effectively with others

At this level, the student can promote trust, confidence, cooperation and good relationships by demonstrating high standards and strong interpersonal skills. The student can also apply a range of communication strategies to elicit constructive feedback and the sharing of information, as well as utilise the diverse background of the team composition to assist with achieving the workgroup’s goals. The student can also identify strategies for improvement based on group debriefings.

Level 3 Promotes trust and uses communication strategies for feedback and the sharing of information in individual or group debriefings

At this level, the student can promote and apply good relationships by applying appropriate interpersonal and communication skills, without discrimination, when dealing with colleagues and customers, including those from diverse backgrounds. The student can also provide support to team members when requested, prioritise his or her own work and pass on relevant information to ensure designated goals are met. The student can also identify areas for improvement.

Level 2 Supports team members, prioritises work and information and identifies areas for improvement

At this level, the student can demonstrate good interpersonal skills, seeking appropriate assistance and acting upon constructive criticism. The student can also make constructive contributions to workgroup goals and tasks in accordance with organisational requirements.

Level 1 Seeks assistance and responds to criticism, making contributions to goals and tasks

At this level, the student can promote cooperation and good relationships by working in a positive manner and seek appropriate assistance when difficulties arise.

Level 0 Learning to work in a positive manner and to seek appropriate assistance


BSBCMN205A Use business technology

At this level, the student can develop and implement procedures for ensuring data management, data storage, routine technology maintenance, and the replacement of technology consumables in accordance with organisational requirements. The student can also develop solutions to overcome basic problems with applications.

Level 3 Maintains and stores data; maintains and replaces technology as required

At this level, the student can select and use a range of technology and input devices, with consideration to efficiency and safety. The student can also apply and ensure valid and appropriate data management and storage methods. Information about applications, prevention of difficulties and replacement procedures can be obtained. The student can also undertake routine maintenance and develop a process for reporting faults.

Level 2 Selects technology for efficiency and safety; manages data storage and replacement; undertakes maintenance and reports faults

The student at this level can identify suitable technology and software applications to maximise task outcomes, as well as apply a range of processes to process and organise data, such as those associated with generalising the identification, opening, generation and amendment of files. The student can also use a range of input devices within organisational requirements and actively promote a safe work environment.

Level 1 Identifies technology and software to assist outcomes; processes and organises data safely

At this level, the student can identify the technology and software requirements of tasks, including ergonomic requirements. The student can also seek information on how to deal with problems as they occur.

Level 0 Identifies technology requirements of tasks and information regarding dealing with problems


BSBCMN206A Process and maintain workplace information

At this level the student can develop or recommend systems for the efficient management of information.

Level 2 Develops and recommends systems for efficient management of information

At this level, the student can integrate multiple sources of technology to obtain information efficiently and effectively, and can explain the organisational requirements related to security and confidentiality. When processing workplace information, the student can use appropriate business equipment and technology effectively, and can select and use the most appropriate mode of information transfer. In relation to maintaining information systems, the student can also develop and implement routine procedures for efficient file construction and management, including the identification, removal or relocation of dead or inactive files.

Level 1 Uses technology to handle information with security and confidentiality in file construction and management

At this level, the student can use simple procedures and business technology to collect, appraise, process, collate, transfer and dispatch information in accordance with organisational requirements, timelines and guidelines.

Level 0 Learning to use procedures and technology to collect, appraise, process, collate, transfer and dispatch information


BSBCMN207A Prepare and process financial/business documents

The student can develop efficient and standardised procedures for preparing and processing banking documentation and invoices in accordance with organisational auditing requirements. The student can also answer some routine creditor enquiries and apply the appropriate procedures.

Level 3 Develops procedures for documentation and invoices; answers creditor enquiries and applies procedures

The student at this level can accurately and efficiently process, record, verify and reconcile petty cash transactions and banking documentation, as well as apply flexible, efficient and accurate procedures for preparing, distributing and reconciling invoices.

Level 2 Independently handles cash transactions, invoices and banking documentation

The student at this level can use a range of procedures to routinely check and process petty cash claims and vouchers with accuracy and efficiency, as well as complete appropriate banking documentation. The student can also apply routine procedures for filing invoices as well as detecting errors.

Level 1 Follows directions to check and process invoices, cash claims, banking documentation and vouchers efficiently

At this level, the student can use appropriate procedures to verify and balance petty cash transactions and prepare and process simple banking documentation.

Level 0 Learning to use procedures to verify and balance cash transactions and prepare and process simple documentation


BSBCMN208A Deliver a service to customers

At this level, the student can identify both the overt and covert needs of customers and alleviate any of their qualms through a range of strategies. The student can also troubleshoot to avoid unnecessary assistance and recommend ways to improve service delivery.

Level 4 Identifies both overt and covert needs of customers and alleviates their qualms; troubleshoots to avoid unnecessary assistance and recommends ways to improve service delivery

At this level, the student can demonstrate a professional and supportive demeanour to the customer and provide information relevant to them through a range of strategies. The student can also prioritise his or her workload and contribute to a positive working environment. The student can draw upon his or her own experiences at work to identify improvement strategies.

Level 3 Maintains a professional and supportive demeanour, provides relevant information to the customer and prioritises workload for improvement

At this level, the student can establish a convivial but professional rapport with customers that facilitates identifying needs and encourages the raising of concerns. The student can document all facts in accordance with organisational requirements.

Level 2 Establishes customer rapport in regard to needs and concerns and documents facts

At this level, the student can identify customer needs and their urgency, supportively listen to complaints and provide prompt service in line with organisational requirements. The student can also identify the need for assistance and act upon customer service instructions.

Level 1 Identifies customer needs and urgency, listens to complaints, provides service and acts on customer service instructions

The student understands explanations of customer needs and complaints, and can follow directions to provide service.

Level 0 Follows directions in providing service and customer liaison


BSBCMN211A Participate in workplace safety procedures

At this level, the student can analyse and evaluate the level of risk of workplace hazards and recommend appropriate measures to be taken. The student can also discuss relevant OHS issues with the designated personnel and actively contribute to their management in the workplace.

Level 3 Analyses and evaluates hazards, recommends solutions to OHS issues and contributes to their management

At this level, the student not only follows workplace OHS procedures but also understands the consequences of workplace hazards, including appropriate control and preventative measures. The student can also interpret OHS symbols for assessing and controlling risks.

Level 2 Follows OHS procedures and understands the consequences; interprets OHS symbols

At this level, the student can identify and follow hazard reporting and OHS procedures in accordance with enterprise policy and legislative requirements.

Level 1 Identifies and follows hazard reporting and OHS procedures in accordance with requirements

The student understands hazards when they are identified and linked to OHS procedures.

Level 0 Learning OHS procedures and hazard identification

BSCMN212A Handle mail

At this level the student can prioritise his or her own workplace activities to facilitate the timely delivery of emergency and electronic mail. The student can also apply problem solving strategies for distributing inaccurately addressed mail, as well as record, report and follow up damaged, suspicious or missing mail items. The student can also prepare, process and record bulk mail, as well as organise, process and implement quality checks on the dispatch of atypical, urgent and electronic mail. Email attachments can also be prepared according to organisational requirements.

Level 2 Deals with emergency and electronic mail; handles inaccurately addressed, damaged, suspicious and bulk mail; quality checks dispatch

The student at this level can demonstrate the correct procedures for opening, checking, registering, sorting and prioritising the distribution of incoming mail. The student can also dispatch standard mail, as well as list a range of delivery options for same day deliveries and evaluate a range of electronic mail options.

Level 1 Follows procedures for opening, checking, registering, sorting and prioritising incoming mail

At this level the student can describe policies on opening and checking incoming mail and dispatching standard outgoing mail. The student can also identify the designation for clearly addressed mail and distribute items marked as urgent/confidential.

Level 0 Learning policies on opening, checking and distributing incoming mail and dispatching outgoing mail


BSBCMN213A Produce simple word processed documents

At this level, the student can recommend new enhancements to document presentation and can organise his or her own workload to ensure the accurate and timely production of documents. The student can use a range of procedures to enhance the presentation, readability and accuracy of documents, and can overcome non-routine problems with document production. The student can also ensure the organisation’s mailable requirements are met within agreed timelines. The student can recommend new energy and resource conservation techniques.

Level 2 Enhances document presentation, produces accurate documents, meets mail and conservation requirements

At this level, the student can make appropriate adjustments to workspace, furniture and equipment to meet ergonomic requirements. The environmental benefits of paper conservation can be explained and the workspace organised to meet OHS requirements. The student checks and clarifies document requirements with relevant personnel, uses simple word processing functions to enter and format text for consistency and applies organisational conventions for naming and saving files efficiently.

Level 1 Makes ergonomic adjustments to workspace; clarifies document requirements; correctly names and saves word processing files

At this level, the student can demonstrate a range of conservation techniques to minimise wastage according to organisational and statutory requirements and adjust furniture and equipment to suit personal comfort and needs. The student can recognise when assistance is required to overcome difficulties with document presentation and production.

Level 0 Develops ways to minimise waste and adjusts workspace; seeks assistance with document presentation and production

BSBCMN306A Produce business documents

The student at this level can efficiently manage and research organisational requirements for information entry, storage, output and quality of document presentation, in combination with other tasks. Input devices can be efficiently used and integrated into other work tasks, and the student can generalise procedures, use multiple functions and cross check the production of documents in line with organisational requirements and deadlines.

Level 3 Manages data entry, storage, output and quality control via integrated tasks, generalised procedures and multiple functions

At this level the student can research and select the most appropriate software, resources and procedures to optimise the design, production and storage of business documents with consideration to efficiency, consistency, presentation, data integrity and task requirements.

Level 2 Selects software, resources and procedures to optimise design, production and storage of business documents

The student at this level can typically use a range of software to produce documents and can appropriately modify the physical work environment to suit ergonomic requirements. The student can identify a range of procedures for laying out and producing documents to meet task requirements and solves problems using readily accessible information sources.

Level 1 Produces documents and modifies workspace; solves problems using information sources

At this level, the student can typically produce documents using a limited range of software and design a document for data entry efficiency and presentation. The student can identify the means by which documents can be stored and applications exited without loss of data.

Level 0 Learning document software to design documents with data entry and storage


Distributions

Note: this distribution does NOT represent a national sampling distribution. It shows the results of only the students who studied this unit.

[Bar chart: BSBCMN105A Use business equipment, distribution of students across national levels (scale 0 to 30), with categories "Describes functions and equipment requirements; learning to report repairs", "Learning to list faults and maintenance procedures, operates appropriate equipment", "Selects equipment, explains policies, plans and procedures; describes repairs" and "Routine maintenance of equipment records and routine servicing".]


Record Sheet for Business units considered in this study


Variable map of Business units


Concurrent calibration of school based assessment and the Victorian Business Studies exam.


Concurrent calibration of school based assessment and the NSW Business Studies exam.

[Item–person map: school-based assessment criteria (s1–s11) and NSW Business Studies exam items (e1–e22) calibrated on a common logit scale from approximately +5 to –10, shown in SRF and NSW Exam columns; the student distribution appears on the left.]


Appendix C

Standards Referenced Frameworks

Level Descriptions and Distributions

Information Technology Units

Unit        Alpha   Mean   SD     Score levels (F-A cut-offs)
SRF (All)   0.57    7.08   5.55   5, 15, 21, 35, 42
SRF (NSW)   0.78    8.25   5.98   7, 11, 20, 27, 35, 38
SRF (Vic)   0.57    7.08   5.55   3, 5, 12, 16, 22


Unit Code     Unit Description                                           Ni   Alpha   Mean    SD      Level max cut scores (Levels 0-5)   SEJ
ICAITTW001B   Work effectively in an IT environment                       7   0.89    13.61   3.75    5, 9, 14                            0.60
ICAITTW002B   Communicate in the workplace                                8   0.83    16.69   3.59    3, 8, 13, 16                        0.87
ICAITTU005B   Operate computer hardware                                   8   0.77    18.64   3.27    2, 9, 13, 18                        0.82
ICAITU007B    Maintain equipment and consumables                         11   0.90    24.32   5.74    6, 14, 20, 26                       1.15
ICAITU012B    Design organisational documents using computer packages    10   0.90    21.41   5.01    4, 8, 15, 23                        0.98
ICAITSO15B    Install software applications                               9   -       15.66   2.77    4, 17, 23                           0.78
ICAITSO17B    Maintain system integrity                                  16   0.95    38.67   9.41    3, 9, 18, 24, 38, 41                1.37
ICAITSO24B    Provide basic system administration                        11   -       24.70   5.57    4, 10, 15, 20, 25                   1.11
ICAITSO25B    Run standard diagnostic tests                               6   -       15.56   3.50    5, 12, 15                           0.71
ICAITU128A    Operate a personal computer                                24   0.83    42.27   5.04    4, 9, 19, 40, 44                    1.5
ICAITU129A    Operate a word processing application                      21   0.86    53.08   7.58    3, 13, 41, 47, 52, 56               1.49
ICAITU129A    Operate a word processing application                      29   0.95    61.51   14.81   3, 16, 48, 65, 70                   1.36
ICAITU132A    Operate a presentation package                             30   -       35.02   4.88    2, 24, 43, 55                       1.21
ICAITUO19B    Migrate to new technology                                   8   -       20.05   6.73    5, 16, 22                           0.87
ICAITU126A    Use advanced features of computer applications             13   0.96    29.3    11      5, 11, 20, 30, 36                   1.16


[Histogram of converted scores (mean = 30.00, std. dev. = 7.00, N = 295); horizontal axis: converted score (10.00 to 45.00), vertical axis: frequency.]

Standardized scores of Information Technology using a mean of 30 and a standard deviation of 7.


SRF for Subject Level IT


ICAITTW001B Work effectively in an IT environment

At this level the student can discuss a range of issues that impact on the IT industry, and can evaluate a range of career choices. They can explain the capabilities of the IT equipment/software and operating system supported by the organisation, and how the IT functions contribute to the achievement of the larger organisational goals, and can suggest changes to policies and procedures to help achieve organisational goals.

Level 2 Discusses issues impacting on the IT industry; can link hardware and software to work goals

At this level the student can explain the interrelationships of key players in the IT organisation, and maintain high customer service standards. They can describe the IT equipment/software and operating system supported by the organisation, and maintain records of equipment, location and service requirements in accordance with organisational requirements.

Level 1 Explains the relationships among workplace roles; describes hardware and software

At this level the student can explain the function of key players of an IT organisation and describe a range of career pathways within the industry. They can list typical organisational codes of conduct and key functions offered or supported within the organisation. They can further promote the organisation by displaying a positive, courteous and helpful manner at all times.

Level 0 Learning about IT organisation and career options, developing skills that promote the industry


ICAITTW002B Communicate in the workplace

At this level the student can facilitate client satisfaction through engaging in conversation, asking well focused questions and paraphrasing the speaker’s ideas. The student can also apply problem solving techniques to answer atypical inquiries.

Level 3 Ensures client satisfaction and applies problem solving techniques

At this level the student can select the most appropriate communication medium to maximise understanding of client needs. The student can ensure that essential information is recorded accurately and concisely, and any required follow-up action is taken in accordance with organisational policy.

Level 2 Selects appropriate medium for communication and follows up action required

At this level the student can use the most appropriate language and tone to aid communication by adjusting spoken language and displaying the appropriate body language and use of gestures. The student can respond to client inquiries promptly and refer requests to the appropriate personnel.

Level 1 Spoken communication is polite, prompt and appropriate

At this level the student can communicate with clients in a polite, professional and courteous manner and ask questions to clarify client needs. The student can attempt to meet all reasonable client requests within acceptable enterprise timeframes.

Level 0 Learning to communicate with clients


ICAITU005B Operate computer hardware

Level 4: Evaluates hardware configurations to improve performance referenced to organisational standards
At this level the student can use, test and evaluate hardware configurations according to required outcomes. The student can perform keyboarding activities that exceed organisational standards.

Level 3: Preventative maintenance of consumables; can adjust hardware configuration to improve outcomes
At this level the student can demonstrate monitoring and preventative maintenance of hardware consumables, and make suggestions to change hardware configurations to improve outcomes. The student can perform keyboarding activities that meet organisational speed and accuracy standards.

Level 2: Selects appropriate hardware for non-routine tasks; can install hardware consumables and link procedures to OHS
At this level the student can select appropriate hardware for performing non-routine tasks, and use and test hardware according to required outcomes. The student can diagnose the hardware consumables required and install replacements. The student can also make suggestions to improve occupational health and safety conditions.

Level 1: Explains peripheral functions for specific routine tasks
At this level the student can explain the functions of a range of office peripherals, and identify task requirements and appropriate procedures for meeting requirements. The student can select appropriate hardware for routine tasks and use hardware to increase operational efficiency. The student can also demonstrate correct ergonomic use of equipment.

Level 0: Learning to identify the appropriate approach and materials for the client
At this level the student can identify a range of office peripherals and use hardware to produce required outcomes.


ICAITU007B Maintain equipment and consumables

Level 3: Anticipates replacement and maintenance of consumables and hardware; develops plans for inventory systems
At this level the student can anticipate the need to replace consumables, and utilise knowledge of ordering processes and procedures to ensure that stock is replenished within necessary timeframes. The student can differentiate between consumable installation and hardware malfunctions. The student can explain the impact of equipment maintenance on current and future organisational needs, and suggest ways to improve inventory systems.

Level 2: Peripheral and consumable cleaning, storage and testing follows supplier procedures, including non-routine steps
At this level the student can explain the importance of ensuring disks and peripherals are cleaned in accordance with vendor and organisational requirements. The student can apply routine procedures to test the functioning and efficiency of consumables, and apply strategies to overcome problems when replacing consumables. The student can recognise non-routine maintenance indicators and report problems to appropriate personnel. The student can describe the storage requirements of a range of peripherals and ensure the adequacy of supply levels to satisfy future needs.

Level 1: Maintains inventory and records of consumables, storage and maintenance; produces stock inventories
At this level the student can record material accessed in appropriate inventory systems. The student can clean disks and peripherals according to vendor and organisational requirements, and correctly store peripherals that are infrequently used. The student can access and maintain records of stock purchased, and can refer to relevant manuals when replacing and maintaining consumables and supplies. The student can implement and keep accurate records of routine maintenance, and can describe preventative maintenance procedures for a range of equipment.

Level 0: Learning the importance of clean peripherals; follows instructions for replacements
At this level the student can explain the importance of cleaning disks and office peripherals, and can identify and access appropriate cleaning materials. The student can replace consumables when instructed and monitor a range of standard office peripherals requiring maintenance.


ICAITU012B Design organisational documents using computer packages

Level 4: Suggests improvements to procedure; monitors client satisfaction
At this level the student can suggest improvements to organisational design guidelines, and develop mechanisms for evaluating client satisfaction.

Level 3: Evaluates document configurations and uses software options to improve product
At this level the student can evaluate document configuration protocols and software suitability, and propose alternatives as necessary. The student can use appropriate software features to track changes and maintain version control, and can save documents and exit applications whilst ensuring data integrity.

Level 2: Uses a range of applications and shortcuts suited to task and checks the suitability of the product
At this level the student can correctly configure business documents, and identify and implement organisational design guidelines to ensure that organisational requirements are met. The student can evaluate a range of software packages, and can use shortcuts to launch software applications. The student can check that client requirements are satisfied, and where necessary refer clients to the appropriate person.

Level 1: Follows direction to use a range of software to develop standard documents
At this level the student can identify a range of suitable software packages, use software to design documents consistent with organisational requirements, and store documents in appropriate locations to facilitate retrieval.

Level 0: Learning to use standard menus to open, develop, and save files
At this level the student can use menus to launch software applications, and can use standard organisational procedures to open and amend files, save documents, and close applications without any loss of data.


ICAITSO15B Install software applications

Level 2: Contextualises client needs regarding upgrades and their commercial demands
At this level, the student can refine and further contextualise client needs, justify installation procedures and update the client when appropriate. The student can also justify the application in terms of commercial demands and budget constraints.

Level 1: Makes summary notes of client consultations regarding software and hardware upgrades
At this level, the student can create summary notes of communication with the client and advise the client on procedures relevant to client needs. The student can also solve software acquisition problems, anticipate potential disruptions that may occur and advise the client of the relevant procedures. When installing the software or upgrade, the student can list the features of the computer and systems environment and explain the guidelines for it.

Level 0: Learning to recognise and explain when upgrades are installed
At this level, the student can confirm the success of an upgrade installation and evaluate the application. When installing the software or upgrade, the student can explain the procedures undertaken. The student can also list the features of the clients' activities, needs and requirements.


ICAITSO17B Maintain system integrity

Level 5: Evaluates the effectiveness of records and procedures for access, security and legality
At this level the student can evaluate the effectiveness of organisational recording procedures. The student can independently research organisational vulnerability and advanced virus protection alternatives, and analyse virus infection patterns, suggesting enhanced protection and preventative procedures.

Level 4: Recommends valid improvements to security and backup procedures
At this level the student can recommend improvements to organisational backup and restore procedures. The student can establish a register of software licences, and suggest systems and policies for reporting illegal software.

Level 3: Maintains records of file security and legalities, backups and issues encountered
At this level the student can maintain records of software usage patterns and virus infections. The student can report illegal software to the appropriate person. The student can describe organisational requirements for restoring system backups, and identify issues encountered during the restoration process.

Level 2: Maintains virus and access security of software, systems and files
At this level the student can obtain virus updates, maintain logs of virus protection activities, and repair infected files using relevant software. The student can determine software licences, maintain accurate records of licence numbers and locations, and check personal computers and networks for illegal software, recording any illegal software found.

Level 1: Chooses the appropriate location for file and system backups
At this level the student can create and store secondary backups in alternative locations. The student can maintain virus protection procedures, and monitor software licences. The student can restore system backups according to organisational guidelines.

Level 0: Learning to back up data files
At this level the student can carry out file back-ups onto a secure medium and restore backups when necessary.


ICAITSO24B Provide basic system administration

Level 4: Analyses procedures and documentation regarding access, legality and backup, and their compatibility with client systems and needs
At this level the student can analyse security and other documentation to identify improvements to organisational security procedures. The student can ensure that the organisation of software licence records promotes accuracy and transparency. The student can explain how various backup procedures are compatible with organisational needs, and identify improvements to existing backup procedures.

Level 3: Ensures that clients understand security access and legality issues
At this level the student can ensure that clients understand security conditions surrounding password usage, and provide additional information and explanations to maximise client understanding. The student can maintain records of security access to ensure system integrity. The student can develop procedures for identifying illegal software on both personal and networked machines, and record observations of illegal software usage.

Level 2: Controls security access and records of backups
At this level the student can issue security access passwords and instructions to clients. The student can verify licences by crosschecking additional resources, and develop systems for recording and reporting the presence of illegal software. The student can identify local network environments for system backups, and record backup procedures in accordance with organisational guidelines.

Level 1: Provides documentation for clearance and software licensing issues
At this level the student can provide security documentation and access to clients, and explain the rationale for obtaining client clearance. The student can determine licence validity through reference to supplied documentation and software, maintain software licence records, and check individual machines and network servers for illegal software.

Level 0: Learning to work with clients regarding legality and guidelines for installing software
At this level the student can obtain client requirements and clearance in accordance with organisational guidelines. The student can report the presence of illegal software to the appropriate personnel. The student can describe organisational guidelines for backup procedures and conduct backups at regular intervals.


ICAITSO25B Run standard diagnostic tests

Level 2: Documents changes made and justifies recommendations for action
At this level the student can document predictive maintenance undertaken, and record recommendations for further action. The student can suggest and justify improvements to virus reporting and removal policies and procedures.

Level 1: Records and reports system diagnostics and modifications leading to maintenance; updates virus protection
At this level the student can record and report any problems encountered when running system diagnostics, and record any changes to system configuration. The student can further undertake predictive maintenance in line with organisational guidelines. The student can conduct regular updates of virus software, confirm virus protection is enabled, and implement protocols for removing, documenting and reporting viruses.

Level 0: Learning diagnostics and configurations for organisational system needs
At this level the student can execute diagnostics programs and configure systems as indicated by diagnostics. The student can describe relevant organisational policies and procedures for virus protection, identification and removal.


ICAITU128A Operate a personal computer

Level 4: Evaluates alternative organisational and system strategies for locating and printing files
At this level the student can suggest changes to organisational guidelines to improve efficiency. The student can apply a range of strategies to find or locate deleted or temporary back-up files for data recovery. The student can adjust printer configuration to maximise effectiveness, and evaluate the suitability of printing alternatives.

Level 3: Uses standard and other functions for help, file modification and search
At this level the student can use available help functions to access system features, and identify additional sources of help as needed. The student can apply strategies for viewing file components, and can modify files using a range of appropriate techniques. The student can locate files using a variety of tools and file criteria, and can search for and restore deleted files.

Level 2: Uses a range of strategies to open, navigate, name, shortcut and relocate files
At this level the student can use a range of strategies for opening, resizing and closing windows, and can navigate between various open windows on the desktop. The student can view folder and directory components, and use appropriate directory names for specific purposes. The student can select, open and rename files using appropriate shortcut features, and use operating systems to locate and move files across directories.

Level 1: Explains simple procedures to start, log on, format and copy, and name file structures
At this level the student can explain standard boot and log-on procedures. The student can customise desktop features to comply with organisational guidelines, and can format and copy files to disk as necessary. The student can use copy and move functions, and rename directories and folders.

Level 0: Learning to erase, format and print; open and close basic applications
At this level the student can erase and format disks and print information from an installed printer. The student can close applications without loss of data and shut down the computer in accordance with the manufacturer's instructions.


ICAITU129A Operate a word processing application

Level 5: Selects appropriate format and procedure to merge documents linked to other applications
At this level the student can create mailing lists, utilising the most appropriate application and layout, from alternative data sources. The student can create and merge documents in any form or layout, and can import and activate links to objects from other applications to allow information flow.

Level 4: Creates mailing lists and formats and merges documents according to SOPs
At this level the student can create a mailing list, containing appropriate field names and all relevant information, in a format appropriate for merging. The student can create and successfully merge a number of documents, including labels and envelopes. The student can ensure that all information and formatting is consistent with organisational requirements for a range of documents.

Level 3: Creates templates for display modes and links to external applications; imaging is consistent with copyright laws
At this level the student can create document templates, explain the function of a range of page display modes, and customise displays to allow multiple document viewing within an application. The student can apply non-standard formatting tools to enhance the presentation of tabulated data, and format objects linked from external applications. The student can ensure that all image formatting is consistent with copyright laws, and that sources are acknowledged where appropriate.

Level 2: Manipulates documents, text and tables, headers and footers; imports and formats, previews and saves in web format
At this level the student can use a variety of techniques to manipulate documents, including customising text and tables, adjusting margin sizes, inserting headers and footers, and importing and formatting images and objects from other applications, in order to enhance presentation. The student can preview documents for printing, make necessary adjustments, and select the most appropriate printer options. The student can also save documents in various formats for website posting.

Level 1: Uses techniques to open and format files, edit and insert graphics, preview and save in new and existing folders
At this level the student can open new documents and use a variety of techniques to carry out simple formatting and editing tasks, such as inserting tables and images. The student can preview documents for printing and select basic print options. The student can save documents to the appropriate location, creating folders or directories where required, without loss of data.

Level 0: Learning to use select and preview procedures and graphic placement in text
At this level the student can select text using a variety of techniques and preview documents in print preview mode. The student can change the position and size of a graphic in accordance with organisational requirements.


ICAITU130A Operate a spreadsheet package

Level 4: Extends repertoire of skills, using complex spreadsheets in and across workbooks and using and evaluating formulae
Students at this level can use basic built-in functions to discover more complex functions, apply them, and compare the consequences of different types and methods of referencing and formulas. They can predict and prevent formula problems before error prompts appear and evaluate the benefits of different types of data organisation.

Level 3: Uses a range of automated methods in and across spreadsheets and modifies output to suit the task
Students at this level can use a range of automated procedures for performing data manipulation and spreadsheet functions, including linking across spreadsheets. Modifications and a range of outputs can be made to suit particular tasks.

Level 2: Selects procedures to suit data entry and number manipulation on and across spreadsheets
Students at this level can use a variety of procedures to perform numerous basic spreadsheet functions. Data can be manipulated for particular purposes and linked across spreadsheets. Students can develop routines for minimising errors and increasing efficiency, and a range of outputs can be produced.

Level 1: Uses setting and format options to alternate the method of entering data
Students at this level can identify and describe basic setting options, format options and data types, as well as the broader roles of the package. The student can vary the method for entering values for different purposes and to increase efficiency, and use document save and close features without loss of data.

Level 0: Learning to enter data into cells and use print preview
Students at this level can enter numbers, text and symbols in cells and use basic functions of print preview mode. The student can also describe differences between files, directories and folders.


ICAITU132A Operate a presentation package

Level 3: Explains changes and customisation and matches them to audience needs; templates and printouts customised to audience needs
Students at this level can typically justify changes to page display modes, orientations, sizes and preset animations, as well as the selection of slide transition, animation and multimedia effects, in terms of the presentation's impact on the targeted audience. The student can also customise presentation templates and sequence slides to suit particular presentation demands. Printouts are used as feedback for future printing.

Level 2: Uses formatting, multimedia, notes and layout matched to task and audience; files saved and named; templates customised
Students at this level can typically select appropriate features to automate the production of a simple presentation, as well as apply a range of standard formatting functions using various techniques. The student can modify page display modes, objects, preset animation effects, predefined styles, slide layout, font and colour to suit the task. The student can also insert multimedia effects, add notes for personal use, print in the required format and ensure that the saved file has a meaningful name, directory and file structure. The student can also describe the target audience and presentation requirements, as well as the pros and cons of different formatting options, and can customise presentation templates and sequence slides to suit particular presentation demands.

Level 1: Adapts presentation to suit the task; copies slides between presentations using importing and navigation procedures
Students at this level can typically use a variety of functions to copy slides across and within presentations, a range of basic onscreen navigation tools, standard formatting functions and the save and close features. The student can import objects such as tables, view multiple slides simultaneously and use print preview.

Level 0: Learning to use presentation styles appropriate to the task and to use simple tools
Students at this level can select default style options to meet task requirements and explain the features of various tools such as help, search and replace, spell check, undo and simple formatting tools, as well as the range of situations in which they can be used.


ICAITUO19B Migrate to new technology

Level 2: Evaluates documentation to select, apply and configure technology from a new source to an unsolved problem
At this level the student can evaluate and document how upgraded technology can be configured and used to enhance organisational productivity and efficiency. The student can evaluate the application of specialised features, and explore customisation and configuration options to maximise system effectiveness. The student can research new technology not directly available, including through vendors, internet libraries and external advisors.

Level 1: Explains and compares technologies, relates them to a solution, and can consult information sources
At this level the student can explain the benefits of the new technology through a comparison with existing technology, and how the technology relates to the solution of organisational problems. The student can apply advanced features of new technology to solve organisational problems, and consult a range of information sources to determine the full range of benefits of new technology.

Level 0: Learning to link current technology to new editions and explain features of new technology
At this level the student can explain aspects of current knowledge that can be used to explore new technology. The student can describe and use specialised functions and advanced features of new technology to meet organisational needs.


ICAITU126A Use advanced features of computer applications

Level 4: Transfers data between applications, macros and templates; evaluates and modifies problem-solving procedures
At this level the student can suggest ways to improve the transfer of data between applications, and can create advanced objects, macros and templates to automate activities. The student can make suggestions for updating manuals and training materials. The student can evaluate the efficiency of problem-solving and troubleshooting procedures, and develop feedback processes to revise and modify problem-solving strategies.

Level 3: Explores and evaluates resources of technical information and their potential usefulness for enhancing PC performance
At this level the student can explore external sources to identify updates and advanced technical solutions, and supply troubleshooting results and alert messages to technical support. The student can generalise performance enhancement processes to organisational and network systems contexts. The student can evaluate the effectiveness of PC configurations and customise PC environments to suit user needs.

Level 2: Automates operations; uses search engines and forums to gather data to enhance PC performance
At this level the student can create shortcut keys to automate operations, and use search engines and discussion forums to gather information on application usage. The student can describe problems and possible solutions, and can access technical support resources and facilities to assist problem solving. The student can configure the computing environment as required, and identify ways in which the computing environment can be customised to enhance PC performance.

Level 1: Refers to online documentation to match formats to applications; transfers data and modifies objects, macros and templates to meet user needs
At this level the student can select suitable data formats for particular applications, efficiently transfer data between applications, and use advanced features of applications to present data. The student can design and modify objects, macros and templates, and can refer to on-line documentation to overcome difficulties. The student can identify user and organisational requirements for configuring the computing environment.

Level 0: Learning to understand data formats, macros and templates and to apply these to predefined operations
At this level the student can describe, compare and contrast different data formats. The student can explain the nature and functionality of objects, macros and templates. The student can display extended knowledge of application functionality and tools, and can use shortcuts and other features to perform predefined operations.


Distributions

Note: these distributions do NOT represent national sampling distributions; each shows the results of only the students who studied that unit.

ICAITTW001B Work effectively in an IT environment
[Bar chart: distribution of students across national levels 0, 1 and 2 (axis scale 0-50).]
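As the note above stresses, each chart is tallied only over the students who attempted the unit, level by level. A trivial, hypothetical tally of that kind is sketched below; the student results are invented.

    # Minimal sketch: tallying the level distribution for one unit, counting only
    # students who actually studied it. The results below are hypothetical.
    from collections import Counter

    unit_levels = {"stu01": 2, "stu02": 1, "stu03": 0, "stu04": 2}
    distribution = Counter(unit_levels.values())
    for level in sorted(distribution):
        print(f"Level {level}: {distribution[level]} students")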


Variable map for IT SRF Subject level
[Item-person variable map: thresholds for criteria I1-I15 (levels .1 to .4) located against the distribution of students on the common logit scale; the plot is not reproducible in text form.]
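The thresholds plotted on variable maps of this kind are usually interpreted as the step parameters of a Rasch-family partial credit model, with student ability and criterion thresholds expressed on the same logit scale. The standard form of that model is reproduced below as a reader's aid; it is a textbook formulation rather than a result reported in this appendix. Here theta_n is the ability of student n and delta_ik is the k-th step threshold of criterion i, which has a maximum score of m_i.

    P(X_{ni} = x) = \frac{\exp \sum_{k=0}^{x} (\theta_n - \delta_{ik})}
                         {\sum_{h=0}^{m_i} \exp \sum_{k=0}^{h} (\theta_n - \delta_{ik})},
    \qquad x = 0, 1, \dots, m_i, \quad \text{with } \sum_{k=0}^{0} (\theta_n - \delta_{ik}) \equiv 0.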


Concurrent calibration of school-based assessment and the NSW Information Technology exam.
[Variable map from the concurrent calibration: thresholds for the school-based SRF criteria (s1-s13) and the NSW exam items (e1-e22) located on a common logit scale against the student distribution; the plot is not reproducible in text form.]
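Concurrent calibration here means that the school-based criteria and the examination items are analysed together in a single run, so that every threshold is estimated on the same logit scale. The fragment below only illustrates how such a joint data set is assembled; the column names and scores are hypothetical, and the actual Rasch estimation carried out for the project is not shown.

    # Minimal sketch: assembling one response matrix for concurrent calibration.
    # Column names and scores are hypothetical; missing responses remain NaN so a
    # student who sat only one of the two instruments is still included.
    import pandas as pd

    school_based = pd.DataFrame(
        {"s1": [2, 1, 3], "s2": [1, 1, 2]},      # school-based SRF criterion scores
        index=["stu01", "stu02", "stu03"],
    )
    exam = pd.DataFrame(
        {"e1": [1, 0, 1], "e2": [3, 2, 4]},      # exam item scores
        index=["stu01", "stu02", "stu04"],
    )

    # One matrix, one scale: calibrating all columns together places school-based
    # and exam thresholds on the same logit continuum.
    combined = school_based.join(exam, how="outer")
    print(combined)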


Appendix D

Standards Referenced Frameworks

Level Descriptions and Distributions

Metal and Engineering Units

Metal and Engineering    Levels: E  D  C  B  A
(Each row lists the values as tabulated in the source, ending with the level cut scores.)

School Based           17   .91  12.47   9.31   12  17  27  37  42
NSW Exam               56   .83  30.29   9.68   11  26  50  76  101
School and NSW Exam    39   .79  30.61   8.05    8  22  40  59


Score conversion chart for MERS units

Metal and Engineering (levels 0 to 4). Each row lists the unit code and title followed by the values as tabulated in the source, with Sej as the final figure where reported.

MEM1.1FA    Undertake interactive workplace communication                           4  12  20  26  33   1.27
MEM1.2FA    Apply principles of Occupational Health and Safety in work environment  3   7  18   1.05
MEM1.3FA    Apply quality procedures                                                3   7  12   0.91
MEM1.4FA    Plan to undertake a routine task                                        4  13  18  21   1.02
MEM2.1C12A  Apply quality systems                                                   5  11  17   1.03
MEM2.5C11A  Measure with graduated devices                                          3   7  14   1.04
MEM2.8C10A  Perform computations                                                    3   6  11  18   1.11
MEM5.5AA    Carry out mechanical cutting                                            8  15  25   1.04
MEM7.5AA    Perform general machining                                              10  13  25  33   0.83
MEM9.2AA    Interpret technical drawing                                             7  11  18   0.97
MEM9.3AA    Prepare basic engineering drawing                                       6  11  16   0.94
MEM9.4BA    Electrical/electronic detail drafting                                   3   5   8
MEM9.5AA    Basic engineering detail drafting                                       4   9  14   1.71
MEM9.9BA    Create 2D drawings using computer aided design system                   9  18  29   0.77
MEM9.10BA   Create 3D models using computer-aided design system                     4  11  15  18
MEM18.1AB   Use hand tools                                                          5  14  17   0.79
MEM18.2AA   Use power tools/hand held operations                                    4  10  17  20   1.00
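If the Sej column is read as the relative weight each unit carries when unit results are combined into a subject score (an assumption here; the derivation and use of the weights are discussed under Differential Weighting in the body of the report), the aggregation step looks like the sketch below. The unit scores are hypothetical; the weights are the Sej values for three of the units in the chart above.

    # Minimal sketch: combining unit-level scores into a subject score using
    # per-unit weights. Treating the Sej column as the weight is an assumption
    # made for illustration; the unit scores below are hypothetical.
    def weighted_subject_score(unit_scores, unit_weights):
        total_weight = sum(unit_weights[u] for u in unit_scores)
        weighted_sum = sum(unit_scores[u] * unit_weights[u] for u in unit_scores)
        return weighted_sum / total_weight

    scores  = {"MEM1.1FA": 22.0, "MEM1.3FA": 9.0, "MEM18.1AB": 15.0}   # hypothetical
    weights = {"MEM1.1FA": 1.27, "MEM1.3FA": 0.91, "MEM18.1AB": 0.79}  # Sej values above
    print(round(weighted_subject_score(scores, weights), 2))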


[Histogram of converted scores for the Metal and Engineering units (Mean = 30.00, Std. Dev. = 7.00, N = 56).]


STANDARDS REFERENCED FRAMEWORKS FOR TRIAL UNITS (MERS)

MEM1.1FA Undertake interactive workplace communication

Level 4: Uses a range of communication strategies to discuss complex issues
Can discuss complex issues with unfamiliar audiences whilst utilising a range of communication strategies to assist with establishing rapport, gaining mutual understanding and encouraging contribution from all parties. The student can describe and evaluate a range of communication strategies appropriate to individuals, groups and situations both familiar and unfamiliar to the individual.

Level 3: Checks information for accuracy and communicates to a range of audiences
Can select and apply the most appropriate technique for communicating ideas, sourcing information and reporting information. The student can also evaluate the quality of information received in terms of accuracy, consistency and intent, as well as summarise and communicate findings or outcomes to a range of audiences.

Level 2: Organises information according to function and encourages others to contribute
At this level, the student listens without interruptions and demonstrates understanding through statements or questions posed back to the speaker. The student can also organise information according to the type of speech event and the function of the message. He or she can encourage others to contribute to discussions by displaying appropriate body language, listening skills and questioning techniques, and evaluate the views and opinions of others whilst demonstrating respect and understanding.

Level 1: Summarises findings and explains goals to others
Can focus attention on the speaker when listening and, where necessary, ask closed questions for clarification. The student is also able to summarise findings and key issues, can constructively contribute to discussions involving familiar and unfamiliar contexts and explain the goals and aims to others using a range of appropriate communication strategies.

Level 0: Communicates effectively and obtains easily accessible information
Can select appropriate communication techniques and discuss a range of topics and content areas with a third party. The student is also able to collect relevant information from easily accessible sources and can constructively contribute to discussions.


MEM1.2FA Apply principles of OHS in work environment

Level 2: Actively contributes to the implementation of workplace OHS
Can report, to designated personnel, OHS issues and implications relevant to the operation of equipment, including those related to defects. The student can also describe a range of workplace hazards, their consequences and appropriate preventative measures.

Level 1: Follows OHS procedures and can handle emergency situations
Can interpret and describe the meaning of safety signs and symbols and start, operate and shut down equipment in accordance with company policy and procedures and OHS requirements. The student can also follow the organisation's hazard reporting procedures, as well as evacuation procedures for dealing with fire and other emergencies.

Level 0: Demonstrates ability to follow basic OHS procedures
Can describe and follow workplace emergency and evacuation procedures. The student can also wear and store protective equipment in accordance with company procedures.

MEM1.3FA Apply quality procedures

Level 2: Applies appropriate strategies to ensure quality of products and services
Can undertake workplace activities within the appropriate framework and in accordance with customer requirements and organisational standards. The student can also describe and apply appropriate strategies and activities when workplace activities do not meet quality requirements.

Level 1: Applies quality checks on products and services
Can explain the organisational implications of not taking personal responsibility for his or her own quality of work. The student can apply quality checks on products and services to ensure product specifications are met in the appropriate timeframe.

Level 0: Understands the importance of quality requirements
Can explain the quality requirements of his or her own job and undertake workplace activities to conform with specifications. The student can also explain the implications of not meeting customer requirements.


MEM2.1C12A Apply quality systems

Level 2: Monitors the quality of products and services
Can work with general reference to established procedures, monitor the quality of the product or service during operation and list examples of common defects.

Level 1: Reports problems and works in accordance with standard operating procedures
Can typically report defects detected and explain the reasons for following process improvement procedures. The student can also explain the benefits of good customer/supplier relationships and carry out work in accordance with standard operating procedures.

Level 0: Attains job instructions and works in a consistent manner
Can typically obtain the job or work instructions in accordance with workplace procedures and carry out work in a manner consistent with the improvement of process and customer/supplier relationships.

MEM1.4FA Plan to undertake a routine task

Level 3: Justifies work plans and understands importance of validating outcomes
Can prepare and justify plans, assess and report potential problems, and explain the reasons for final checking of outcomes against requirements and specifications.

Level 2: Checks outcome against requirements
Can apply questioning techniques to clarify and confirm specifications for the task, where necessary. He or she is able to check the conformance of planned steps and outcome with instructions and relevant specifications, and justify any revisions to the plan.

Level 1: Develops an appropriate work plan and reports potential problems
Can explain the functions, importance and purposes of job specifications. He or she is able to accurately describe and sequence the steps to be undertaken to complete the task, and to revise these steps if necessary to meet task requirements. The student reports potential problems with meeting task requirements to the appropriate person.

Level 0: Accesses relevant instructions and checks against requirements
Can access relevant instructions, specifications and projected task outcomes, and check these against task requirements.


MEM7.5AA Perform general machining

Level 3: Ensures timely output, sharpens tools and adjusts machinery as required
Can typically sharpen tools with the correct geometry for the range of materials required by the job, and make the appropriate adjustments to machinery in response to carrying out routine maintenance. The student can ensure that production is waste- and time-efficient.

Level 2: Develops a sequence of operations, correctly mounts and positions tools and identifies potential problems
Can develop an appropriate sequence of operations and justify the selection of measuring devices with respect to job specifications. The student can mark references/datum points on appropriately selected materials and correctly mount and position tools. The student can typically explain the requirements of the drawing, instructions and specifications, and can typically identify potential problems that could have an impact on machine performance.

Level 1: Identifies damaged tools and works in a manner consistent with OHS requirements
Can justify his or her selection of tools, identify worn or damaged cutting tools and explain the importance of using sharpened cutting tools. The student can fully clamp or position work and operate the machine in a safe manner that satisfies the manufacturer's instructions, job requirements and OHS requirements.

Level 0: Identifies tools, materials and methods required; describes procedures for operating and maintaining the machine
Can typically identify the operations, machines/tools, materials and method of job holding required for the job. The student can describe the correct procedures for clamping or positioning work, as well as those for maintaining and operating machines. The student can explain the reasons for marking out materials and can typically identify the tolerances of relevant devices.

MEM2.5C11A Measure with graduated devices

Level 2: Ranks suitable devices and suggests alternative measures when necessary
Can typically rank suitable devices in order of preference and justify the ranking. The student can also identify appropriate measures to be taken when accurate adjustment of measuring devices is not possible, and explain its importance.

Level 1: Makes allowances for external conditions when handling and storing devices
Can typically select the most appropriate measuring device and technique, and make allowances for the local conditions when handling and storing them. The student can typically explain the effects of inappropriate use, handling and storage of measuring devices.

Level 0: Selects, uses and stores suitable measuring devices
Can select, use and store suitable measuring devices in accordance with manufacturer's specifications or standard operating procedures.


MEM2.8C10A Perform computations

Level 3: Converts dimensions, creates and interprets complex charts
Can express given ratios and proportions in terms of whole numbers, fractions and decimal fractions. The student can also produce charts and graphs with appropriate scales, and correctly mark and label all the key features required. The student can make inferences from charts or graphs based on a visual appraisal, as well as numerically describe evident trends and relationships.

Level 2: Estimates approximate answers and interprets charts
Can use an appropriate technique, such as rounding off, for estimating approximate answers, as well as select and use appropriate formulas for particular applications. The student can perform calculations involving ratios and proportions, and interpret charts to provide answers to basic questions.

Level 1: Produces simple charts and applies appropriate formulas
Can perform rounding operations, produce simple charts and use appropriate formulas for particular applications.

Level 0: Demonstrates ability to perform basic computations and understand charts
Can substitute values for terms in a formula, apply estimation procedures to check their calculations and select required information directly from charts.

MEM5.5AA Carry out mechanical cutting

Level 2: Correctly operates the machine, considering tool selection, potential problems and positioning
Can typically make the necessary adjustments to the cutting machine and then start, operate and stop it in accordance with standard operating procedures, manufacturer's instructions and OHS requirements. The student can also appropriately justify the selection of tools, identify potential sources of tool defects and use measuring equipment to correctly position equipment.

Level 1: Justifies cutting method and correctly loads and adjusts the machine
Can select and justify the most appropriate cutting method to meet job requirements and identify procedures for setting up the cutting machine. The student can apply standard operating procedures to load and adjust the cutting machine.

Level 0: Demonstrates ability to select tools and use the machine
Can typically describe at least four cutting methods and the function of stops and guards in a cutting machine. The student can also identify the job specifications and tasks required. The student can select the most appropriate tool and describe standard operating procedures and OHS requirements for starting, operating and stopping machinery, and for adjusting cutting machines for operation.


MEM9.2AA Interpret technical drawing

Level 2: Identifies lack of consistency between drawings, requirements and equipment
Can identify variations between the drawing, job requirements and/or related equipment. The student can explain the reasons for validating the drawing against job requirements, related equipment and version currency. The student can also explain the reasons for, and importance of, using standard symbols.

Level 1: Interprets dimensions and views of drawings; suggests alternative materials
Can explain the relationship between the views in the drawing and appropriately respond to instructions within the diagram. The student can also identify the dimensions of key features and suggest alternative materials suitable to local needs or conditions.

Level 0: Identifies and validates basic features from drawings
Can identify the unit of measurement used in the drawing and, from it, the materials required for the job. The student can typically identify, from the drawing, the drawing version, commonly used symbols and the components, assemblies or objects it contains. The student can correctly check and validate the drawing against job requirements and related equipment, and identify and follow instructions as required.

MEM9.3AA Prepare basic engineering drawing

Level 2: Correctly completes drawings and parts lists; identifies and reports potential problems
Can complete drawings and parts lists in accordance with SOPs and customer requirements, including copying, issuance, handling and cataloguing. He or she can also identify potential problems and report them to appropriate personnel, and justify the selection of drafting equipment and method.

Level 1: Collects all necessary information and evaluates equipment
Can collate the information needed for production of drawings in accordance with workplace requirements, and apply drafting principles to produce or change a drawing to conform with SOPs. He or she can also explain the reasons for recording drawings and parts lists, as well as the consequences of inappropriate handling and storage of drawings. The student can list and evaluate drafting equipment and describe the potential consequences of inappropriate or incomplete part description numbering outside SOPs.

Level 0: Lists SOPs for recording, handling and storing drawings; obtains relevant job specifications
Can list appropriate drafting principles and SOPs for recording, handling and storing the drawings and parts lists. The student can obtain relevant job specifications and list the alternative methods of drawing which are most appropriate to job requirements.


MEM9.4BA Electrical/electronic detail drafting

Level 3: Produces detailed schematics and justifies components and materials chosen
Can produce detailed electrical/electronic schematics and drawings to meet the requirements of AS1102. He or she can also justify, in accordance with system requirements and workplace procedures, the components and/or materials chosen.

Level 2: Identifies all system components and specifications
Can identify the relative positioning of electrical/electronic components in a schematic drawing, and use this information to identify all system component specifications, including circuit specifications.

Level 1: Identifies basic features of schematics and appropriate catalogues
Can identify symbols used in electrical/electronic schematics and describe their meaning and functions. The student can also identify appropriate suppliers' catalogues using design specifications.

Level 0: Demonstrates limited ability to analyse schematic drawings
Requires assistance to interpret symbols and components of schematic drawings.

MEM9.5AA Basic engineering detail drafting

Level 2: Produces layout, assembly and component drawings in accordance with specifications
Can produce layout, assembly and component drawings in conformance with specifications, understand the importance of using standard symbols in engineering drawings, and select and justify the chosen components and/or materials in accordance with workplace procedures and system requirements.

Level 1: Prepares drawings in conformance with AS1100
Can prepare drawings using appropriate projections and views in accordance with AS1100 or equivalent, and list the requirements of AS1100 with respect to dimensions, tolerances and labels relevant to the component, layout and/or assembly. The student can label and dimension drawings using supplied tolerances.

Level 0: Reads schematic drawings and identifies procedures, projection and components
Can identify the appropriate projection for the drawing purpose, and the specifications for the components, layout and/or assembly. The student can read schematic diagrams and identify appropriate manufacturers'/suppliers' catalogues using design specifications, as well as identify procedures for the production of component, layout and/or assembly drawings. The student can identify symbols used in engineering detail drafting.


MEM9.10BA Create 3D models using computer-aided design system

Level 4: Creates and manipulates entities in 3D space to suit job specifications
Can create and manipulate entities in 3D space by selecting the most appropriate procedures and system features to meet job requirements. These selections can be justified accordingly.

Level 3: Justifies modifications to 3D models and correctly extracts properties of 3D shapes
Can apply the correct techniques for modifying models in 3D space to meet job requirements, as well as explain his or her reasoning. The student can also correctly extract the physical properties of shapes created in 3D space to meet job requirements.

Level 2: Lists procedures for extracting and replicating physical properties of 3D shapes
Can justify the selection of coordinate system and orientation in terms of job specifications, as well as the use of different formats when saving drawing files. The student can also identify the procedures for creating entities in 3D space and apply those involving ruled and revolved surfaces. A range of procedures for extracting the physical properties of 3D shapes can also be identified.

Level 1: Identifies appropriate coordinate system and alternative orientation models
Can identify an appropriate coordinate system to design a 3D model for job specifications, and a range of alternative orientation models appropriate for the system in use. The student can also save files in various formats in accordance with SOPs.

Level 0: Requires support to create 3D models
Requires assistance to design a 3D model. The student can use basic features of design software.

MEM9.9BA Create 2D drawings using computer aided design system

Level 2: Produces 2D drawings using the full capacity of the software; creates and modifies macros
Can create and modify macros to maximise system efficiency, and create 2D drawings using the full capability of the available software system. The student can also extract supplementary data from drawings and evaluate a range of printing options for drawing files.

Level 1: Identifies properties of a drawing; uses a range of software features to create a 2D drawing
Can create 2D drawings utilising the range of features of the resident software, customise the system variables using references where necessary, and explain the reasons for customising menus and system defaults. The student can also identify the properties of shapes/sections/features from a drawing and explain the procedures for linking selected items. The student can also explain the importance of the standard features in a bill of materials.

Level 0: Produces 2D drawings and customises the database to suit
Can produce a 2D drawing with an appropriate bill of materials, save files in basic formats and identify features of the CAD software system, including the relevant database. The student can also appropriately customise menus and drawing defaults and link selected items within the database. The student can create detailed views using a determination of scale and explain the procedures for extracting data from drawn shapes/features.


MEM18.1AB Use hand tools

Level 3: Produces accurate output in accordance with job requirements
Can typically use hand tools with consideration to the accuracy, appearance and ultimate purpose or function of the output. The student can discuss the importance of routine maintenance on overall enterprise functions and productivity.

Level 2: Uses tools in a waste-efficient manner; describes common faults and the importance of maintenance
Can rank the appropriateness of a range of suitable hand tools and can justify the ranking. The student can describe common faults and explain the importance of routine maintenance of hand tools. The student can use appropriate tools to produce desired outcomes with minimal material wastage.

Level 1: Repairs, sharpens and stores tools correctly
Can correctly carry out marking and repairing of unsafe/faulty tools and correctly maintain/sharpen hand tools where appropriate. The student can store tools in the appropriate location and explain the importance of correct storage of hand tools in terms of maintaining tool capacity, OHS and productivity.

Level 0: Selects and uses appropriate tools
Can typically select and use appropriate hand tools to produce desired outcomes to job specifications, and follow all safety procedures at all times. The student can identify the storage location of a range of hand tools and list the requirements and techniques for operational maintenance of hand tools.

MEM18.2AA Use power tools/hand held operations

Level 3: Adjusts tools to improve productivity
Can adjust power tools, including their alignment, to ensure productivity. He or she can apply OHS, SOPs and the manufacturer's safety requirements when operating a range of power tools.

Level 2: Explains importance of maintenance; secures tools and materials in accordance with specifications
Can explain the selection of power tools, and discuss such issues as the importance of operational maintenance for productivity. The student can secure the job in accordance with specifications and use the power tools in the correct sequence of operations.

Level 1: Nominates tools needed and maintains and sharpens tools where necessary
Can identify the power tools needed to meet particular job requirements; can explain the importance of correct storage and describe the importance of securing materials prior to using power tools, as well as the correct sequencing of operations; can maintain/sharpen power tools where appropriate.

Level 0: Uses power tools in a sequence consistent with SOPs
Can use nominated power tools following a set sequence of operations; describe the personal protective clothing and safety equipment to be used; list and carry out SOPs for marking and repairing faulty tools; and list the requirements and techniques for the operational maintenance of a range of power tools.


Level band nutshells

Band A: Uses and evaluates communication strategies. Creates and interprets charts, graphs and drawings.
Can discuss complex issues with unfamiliar audiences using a range of communication strategies to assist in rapport, mutual understanding and encouraging contribution from parties. Capable user of drawings, measuring devices, tools and charts. Works with general reference to established procedures, monitors the quality of the product or service during operation and lists examples of common defects.

Band B: Contributes to OHS; quality checks and improvements. Chooses and adjusts measuring devices and communication strategy.
Can contribute to OHS issues relevant to the operating equipment; describe hazards, their consequences and preventative measures. Develops efficient work plans to maximise efficiency. Can identify and make adjustments to appropriate measuring devices.

Band C: Evaluates the information and communicates findings. Applies quality procedures. Uses routine maintenance and proper storage. Uses simple charts.
Can select, apply and justify a communication technique for sourcing and reporting information; evaluate information received in terms of accuracy, consistency and intent; and summarise and communicate findings or outcomes to a range of audiences.

Band D: Explains, listens and communicates appropriately with the audience and adjusts the message to meet audience needs.
Describes standard operating procedures for loading, adjusting and operating machinery. Can explain the pros and cons of a range of communication strategies and apply the most suitable to convey information for a given context. Displays attentive body language without interruptions and asks open-ended questions for clarification. Can organise information according to the type of speech event and the function of the message. When preparing reports, the student can customise the reporting style and content according to the needs of the audience.

Band E: Participates in discussions led by others and collects information from easily obtainable forms and sources.
Can select appropriate communication techniques, discuss a range of topics and content areas with a third party, focus attention on the speaker when listening and, where necessary, ask closed questions for clarification. The student is also able to collect relevant information from easily accessible sources, prepare and administer verbal or written reports, summarise findings and key issues, and constructively contribute to discussions involving familiar and unfamiliar contexts.

Band F: Follows instructions and identifies expected outcomes.
Can access relevant instructions, specifications and projected task outcomes, and check these against task requirements.


Variable map for MERS


Concurrent calibration of school-based assessment and central examination in NSW.
