
Jeppesen Aviation English Testing

Procedures Manual for Aviation English Testing

Version 2, December 2011


Aviation English Testing Manual

Table of Contents

Section 1   Introduction .......................................................... Page 9
    Overview of the Versant™ Aviation English Test ............................... Page 10
    1.1  Test Description ........................................................ Pages 10-11
    1.2  Solutions That Include the Aviation English Test ........................ Page 12
    1.3  Test Quality ............................................................ Page 12
    1.4  Score Reporting ......................................................... Page 12
    1.5  Versant Test Administration ............................................. Pages 12-13
Section 2   Pearson System Requirements ......................................... Page 14
Section 3   Jeppesen Aviation English Test Compliance with ICAO Standards ....... Pages 15-24
Section 4   Aviation English Testing Facilities
    4.1  Hardware Requirements ................................................... Page 25
    4.2  Facility Security ....................................................... Page 26
Section 5   Aviation English Test Security Procedures ........................... Page 27
Section 6   Identification Verification Procedures
    6.1  Identification Verification ............................................. Page 28
    6.2  Non-Compliance .......................................................... Page 28
    6.3  Identification Records .................................................. Page 28
Section 7   Additional Testing Procedures
    7.1  Pre-Testing Procedures .................................................. Page 29
    7.2  Testing Procedures ...................................................... Page 29
    7.3  Post-Testing Procedures ................................................. Pages 29-30
Section 8   CDT Client Installation Guide
    8.1  Introduction ............................................................ Page 31
    8.2  System Requirements ..................................................... Pages 31-32
    8.3  Installation of CDT ..................................................... Page 32
    8.4  Verifying CDT Operation ................................................. Page 32
    8.5  Check Audio and Headset ................................................. Page 32
    8.6  Complete a Test and Verify Uploading of Responses ....................... Page 33
    8.7  Uninstall Older Versions ................................................ Page 33
Section 9   Internet Delivered Tests Client User Guide (CDT Client, Version 2.6)
    9.1  Introduction ............................................................ Page 34
    9.2  Preparing for Test Administration ....................................... Pages 34-35
    9.3  Downloading Tests ....................................................... Page 35
    9.4  Getting Started ......................................................... Page 35
    9.5  Pre-Loading TINs Before Scheduled Testing ............................... Pages 35-36
    9.6  On-Demand Downloading of TINs ........................................... Page 36
    9.7  View Tests .............................................................. Page 36
    9.8  Taking a Test ........................................................... Pages 36-37
    9.9  Troubleshooting ......................................................... Pages 37-38
Section 10  Test Invigilator Course Syllabus .................................... Pages 39-41
Section 11  Miscellaneous Forms
    11.1  Practice Placer Briefing ............................................... Pages 42-43
    11.2  Briefing Instructions for ICAO Aviation English Testing ................ Pages 43-44
    11.3  Briefing Instructions Declaration ...................................... Page 45


Jeppesen’s Procedures Manual for

Aviation English Testing

This manual provides an overview of all procedures applicable to Aviation English Certification Testing for pilots and air traffic control personnel, demonstrating compliance with the Aviation English proficiency requirements described by ICAO. It is to be made available to any person who will be involved in invigilating Aviation English Certification tests, and a copy should be kept at every facility where certification tests will be conducted. All certification testing should be conducted in accordance with the procedures outlined in this manual; if there is any doubt about any of the procedures described here, contact Jeppesen Academy for clarification. This manual also serves as Jeppesen Academy's application for NAA approval of the Aviation English Certification Test.

This manual will be kept current with any changes made to the program. Notification and copies of changes will be provided to the NAA under any of the following circumstances:

• Change in the testing procedures, facilities, or equipment described
• Change in ownership
• Change in name or location

Refer to the Record of Revisions page at the front of this manual, and to the page footers for information regarding when each page was updated after its original submission.


Corporate Information

Jeppesen Academy is a division within Jeppesen Sanderson Inc., a wholly owned subsidiary of Boeing.

Corporate Address:

55 Inverness Drive East, Englewood, Colorado 80112

Phone: (303) 799-9090 Fax: (303) 328-4114

Email: [email protected]

Corporate Structure:

President & CEO: Mark Van Tine

Executive Vice President: Thomas Wede

Course Director: Kirk Quong Sing

Should any changes occur in ownership, name, or location, we will notify the NAA in writing within 10 days to ensure continuation of the course approval.


Aviation English Testing Procedures Manual
List of Pages

Section-Page   Revision #   Revision Date        Section-Page   Revision #   Revision Date

1-1 ORIGINAL 24/10/2008 3-23 ORIGINAL 24/10/2008

1-2 ORIGINAL 24/10/2008 3-24 ORIGINAL 24/10/2008

1-3 ORIGINAL 24/10/2008 4-25 ORIGINAL 24/10/2008

1-4 ORIGINAL 24/10/2008 4-26 ORIGINAL 24/10/2008

1-5 ORIGINAL 24/10/2008 5-27 ORIGINAL 24/10/2008

1-6 ORIGINAL 24/10/2008 6-28 ORIGINAL 24/10/2008

1-7 ORIGINAL 24/10/2008 7-29 ORIGINAL 24/10/2008

1-8 ORIGINAL 24/10/2008 7-30 ORIGINAL 24/10/2008

1-9 ORIGINAL 24/10/2008 8-31 ORIGINAL 24/10/2008

1-10 ORIGINAL 24/10/2008 8-32 ORIGINAL 24/10/2008

1-11 ORIGINAL 24/10/2008 8-33 ORIGINAL 24/10/2008

1-12 ORIGINAL 24/10/2008 8-34 ORIGINAL 24/10/2008

1-13 ORIGINAL 24/10/2008 8-35 ORIGINAL 24/10/2008

2-14 ORIGINAL 24/10/2008 8-36 ORIGINAL 24/10/2008

3-15 ORIGINAL 24/10/2008 8-37 ORIGINAL 24/10/2008

3-16 ORIGINAL 24/10/2008 8-38 ORIGINAL 24/10/2008

3-17 ORIGINAL 24/10/2008 8-38 ORIGINAL 24/10/2008

3-18 ORIGINAL 24/10/2008 8-39 ORIGINAL 24/10/2008

3-19 ORIGINAL 24/10/2008 8-40 ORIGINAL 24/10/2008

3-20 ORIGINAL 24/10/2008 8-41 ORIGINAL 24/10/2008

3-21 ORIGINAL 24/10/2008 8-42 ORIGINAL 24/10/2008

3-22 ORIGINAL 24/10/2008 9-43 ORIGINAL 24/10/2008

9-44 ORIGINAL 24/10/2008 11-59 ORIGINAL 24/10/2008


9-45 ORIGINAL 24/10/2008 11-60 ORIGINAL 24/10/2008

9-46 ORIGINAL 24/10/2008 11-61 ORIGINAL 24/10/2008

9-47 ORIGINAL 24/10/2008 11-62 ORIGINAL 24/10/2008

9-48 ORIGINAL 24/10/2008 ALL VERSION 2 04/15/2012

9-49 ORIGINAL 24/10/2008

9-50 ORIGINAL 24/10/2008

9-51 ORIGINAL 24/10/2008

9-52 ORIGINAL 24/10/2008

9-53 ORIGINAL 24/10/2008

9-54 ORIGINAL 24/10/2008

9-55 ORIGINAL 24/10/2008

10-56 ORIGINAL 24/10/2008

10-57 ORIGINAL 24/10/2008

10-58 ORIGINAL 24/10/2008


Aviation English Testing Procedures Manual
Record of Revisions

Manual Holder: ______________________________

This Record of Revision page must be retained in the front of the manual indicated above. Upon receipt of revisions, the holder must insert the revised page(s) in the appropriate section; enter the revision number, revision date, insertion date, and initials of the person inserting the revision. Outdated pages (those being replaced) should be removed and destroyed.

Revision #   Revision Date   Insertion Date   By        Revision #   Revision Date   Insertion Date   By

The following representative at Jeppesen Academy is responsible for the maintenance of this document and prompt distribution of changed pages to the NAA (National Aviation Authority) in accordance with NAA regulatory requirements.

Kirk Quong Sing, Course Director 303/328-6356


Section 1: Introduction

New ICAO Language Proficiency Standards

In 1997, the International Civil Aviation Organization (ICAO) recognized the importance of improving communication between pilots and air traffic controllers. This was in response to numerous accidents and incidents in which a lack of English proficiency was a contributory factor. The Air Navigation Commission (ANC) established a committee to review existing provisions related to air-ground and ground-ground voice communication in international civil aviation contexts. As a result, amendments were made to strengthen the language proficiency requirements for pilots and air traffic controllers in ICAO Annexes 1, 6, 10, and 11.

Changes made:

• Annex 10 describes which language(s) shall be used for radiotelephony communication: the language of the ground station or English. This means that proficiency in both ICAO phraseology and plain English is required.

• Annexes 6 and 11 establish that all personnel must comply with the ICAO language proficiency requirements stipulated in Annex 1.

• Annex 1 describes the language proficiency and testing requirements and contains a rating scale with six proficiency levels. It also describes how this will affect personnel licensing.

What the new standards mean for the industry:

Non-native speakers and native speakers of English must demonstrate a minimum language proficiency at ICAO Level 4 (Operational) as a licensing requirement.

For air traffic service personnel and pilots whose first language is not English, testing will be required to determine language proficiency according to the ICAO Proficiency scale. Personnel will need to demonstrate ability to use the language specific to all aspects of radiotelephony communication.

Member States who do not comply with the new licensing requirements will be required to notify ICAO, which may limit international recognition of licenses.

When changes take effect:

All ICAO Member States have been given until March 8, 2008 to fulfill the necessary training requirements to allow personnel to meet mandatory testing and licensing requirements.



Overview of the Versant™ Aviation English Test

1 Introduction

The Jeppesen/Versant™ Aviation English Test (VAET) allows employers to quickly, objectively, and accurately assess the ICAO English language proficiency levels of pilots and air traffic controllers in approximately 30 minutes by telephone or computer.

1.1 Test Description

The Jeppesen product line provides tests of spoken language for adult non-native speakers. Tests are delivered over the telephone or on a computer and are scored automatically. During a test, the system presents a series of verbal prompts at a conversational pace and elicits spoken responses.

1.1.1 Test Construct

The Aviation English Test is intended to measure facility in spoken aviation English and common English in the aviation domain. This is the ability to understand spoken English both within aviation radiotelephony phraseology and on topics related to aviation (such as movement, position, time, duration, weather, animals, etc.), and to be able to respond appropriately in intelligible English at a native-like conversational pace. This definition relates to what occurs during the course of a spoken conversation, as schematized in Figure 1, adapted from Levelt (1989).

[Figure 1. Conversational processing components in listening and speaking.]

Spoken language facility is essential to successful oral communication: if language users cannot track what is being said, extract meaning as speech continues, and then formulate and produce a relevant and intelligible response in real time, they will not be able to interact effectively in operational communication. In the administration of the Aviation English Test, the Pearson testing system presents a series of discrete prompts to the test taker over the telephone at a native conversational pace. These integrated "listen-then-speak" items require real-time receptive and productive processing of spoken language forms. Because the Aviation English Test requires real-time language processing, it measures the degree of automaticity in basic encoding and decoding of oral language. Automaticity is the ability to access and retrieve lexical items, to build phrases and clause structures, and to articulate these without conscious attention to the linguistic code (Cutler, 2003; Jescheniak, Hahne, and Schriefers, 2003; Levelt, 2001). Automaticity is required in order for the speaker/listener to pay full attention to actions and to what needs to be said rather than thinking through how the spoken message should be structured. Automaticity is critical for non-native English speakers since aviation professionals need to be able to handle complicated and unexpected turns of events without having to consciously attend to the process of understanding and producing the language.


1.1.2 Test Structure

The VAET contains seven tasks or sections that test takers must complete:

Part A. Reading – Aviation
Part B. Reading – Common English
Part C. Repeat
Part D. Short Questions
Part E. Readbacks
Part F. Corrections and Confirmations
Part G. Story Retellings

The tasks in the Aviation English Test provide direct measures of the test taker's listening and speaking ability. The ICAO manual states that tests that assess only phraseologies or only plain English are not appropriate. The Aviation English Test addresses this requirement by including both aviation-specific phraseology tasks and common English tasks.

1.1.3 Test Versions

There are two versions of the Aviation English Test that can be used by your organization:

1. Aviation English Certification Test: a robust certification of the ICAO English language proficiency levels required of pilots and air traffic controllers.
2. Aviation English Practice Test: a low-stakes version of the test that allows individuals to practice before taking the certification test. This version can also be used as part of training programs for initial screening and placement.


1.2 Solutions That Include the Aviation English Test

Jeppesen can work with an aviation employer to design an overall solution for preparing employees and certifying that they meet the ICAO requirement for language proficiency. The Aviation English Test is included in these solutions in some of the following ways:

• As a testing solution to certify that employees meet the ICAO language requirement
• As a discrete practice activity within a training curriculum (i.e. the practice test is an instructional exercise)
• As part of a complete training and certification program (combining both of the above)
• As a pre-screen of employees to gain insight into how ready they are for a final exam

1.3 Test Quality

All tests have been developed to provide a high-quality assessment instrument for spoken language.

1.3.1 Research-based scientific development process makes the approach credible

The tests of spoken language have been designed and built by experts in aviation, applied linguistics, test development, speech recognition, and language processing. The test has gone through a rigorous test development process, followed by extensive field testing, test review, and analysis. With the scoring system, the test has been validated with reference to human judgments of proficiency.

1.3.2 Proven validity and reliability for accurate, meaningful results

Each test undergoes validation and reliability research studies. Studies on other Jeppesen tests show that they are valid measures of the spoken language abilities of non-native speakers when their results are compared to native speakers. The studies have also shown these tests to demonstrate very strong reliability statistics.

1.3.3 Objective, fair assessment for a more robust approach

The testing system allows you to evaluate all employees on an equal basis with objective, unbiased scoring. Test items were designed to be region neutral. The content specification also requires that both native speakers and proficient non-native speakers find the items very easy to understand and to respond to appropriately. The items have been designed to cover a broad range of skill levels and skill profiles.

1.3.4 A legally defensible test for high-stakes employment decisions

The test complies with the legal and professional guidelines in the Uniform Guidelines for Employee Selection Procedures and the Principles for the Validation and Use of Personnel Selection Procedures governing the use of tests for decision-making in organizations.

1.4 Score Reporting

Versant tests are automatically scored by the patented grading system, which provides numeric scores and performance levels that describe the test taker's facility in the spoken language of the test; that is, the ability to understand spoken English on everyday topics and to respond appropriately at a native-like conversational pace in intelligible language.

1.5 Versant Test Administration

Tests can be administered over the telephone or on a computer. Instructions for the test are spoken over the testing system in an examiner voice and are also presented verbatim on a printed test paper during telephone administration and on the computer screen during computer administration. Test items themselves are presented in various native speaker voices that are distinct from the examiner voice.


1.5.1 Telephone Administration

Telephone administration is supported by a test paper that includes general instructions and an introduction to the test procedures, plus instructions and information that are specific and unique to the test. Included in this unique information are the phone number to call, the Test Identification Number, the spoken instructions written verbatim, item examples, and the printed sentences for the Reading section (if applicable). When the test taker calls into the testing system, the system asks the test taker to use the telephone keypad to enter the Test Identification Number on the test paper. This identification number keeps the test taker's information secure. An examiner voice speaks all the instructions for the test. The spoken instructions for each section are also printed verbatim on the test paper to help ensure that test takers understand the directions. Test takers interact with the test system in English, working through all parts of the test until they complete the test and hang up the telephone.

1.5.2 Computer Administration

For computer administration, the computer must have an Internet connection and our Computer-Delivered Test (CDT) software (see Section 2 for system requirements). To use the computer version of a test, Jeppesen can work with an aviation employer to complete a setup process for any testing computers and verify that they meet the standards required for the test. During the test administration, the test taker is fitted with a microphone headset. The system allows the test taker to adjust the volume and calibrate the microphone before the test begins. The instructions for each section are spoken by an examiner voice and are also displayed on the computer screen. Test takers interact with the test system in English, speaking their responses into the microphone. When the test is finished, the test taker submits the test for scoring.


Section 2: System Requirements

2.1 For taking tests by telephone

To ensure voice quality is optimal for testing, we recommend that calls are made over a fixed-line or landline. We do not recommend using mobile or cellular telephones, or calling services that utilize voice-over-IP (VoIP), as the quality of transmission can be uneven to an extent that can affect the test-taker’s score.

2.2 For taking tests by computer

To use the Computer-Delivered Tests, the software must be downloaded and installed on a machine that meets the following system requirements:

• Windows® XP SP3+, Vista, or 7
• Broadband Internet connection
• Pentium® III at 600 MHz or higher
• 512 MB of RAM
• 5 GB of free disk space
• Screen resolution of at least 1024 x 768
• Web browser: Internet Explorer 7.0 or higher
• Network security access allowing the CdtClient.exe application to reach https://www.VersantTest.com (port 443)
• Soundcard / audio driver that can play audio (headphones recommended)
• Soundcard / audio driver with recording and playback capabilities that has been certified to work with the version of Windows being run on the test computer (Note: the following audio drivers may not work: Conexant HD)
• Head-mounted USB headset with microphone and headphones compatible with the requirements below:

Headphone features:
• Sound mode: Stereo
• Ear piece: Double
• Driver unit size: 32 mm
• Frequency response: 20-20000 Hz
• Impedance: 32 ohms

Microphone features:
• Frequency response: 100-12000 Hz
• Impedance: 3320 ohms

To administer Computer-Delivered Tests, Jeppesen requires that a technical point of contact be available to work with us to install the CDT software and verify the computer configuration at least one week before administering tests.
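Before scheduled testing, two of the requirements listed above can be verified quickly on the testing computer with a short script. The following Python sketch is illustrative only and is not part of the Pearson CDT software; it assumes nothing beyond the endpoint (www.VersantTest.com), port (443), and 5 GB free-disk figure stated in the list above.

    # Illustrative pre-installation check (not part of the Pearson CDT software).
    # It verifies only two requirements stated above: HTTPS access to
    # www.VersantTest.com on port 443, and at least 5 GB of free disk space.
    import socket
    import ssl
    import shutil

    CDT_HOST = "www.VersantTest.com"   # endpoint named in the requirements above
    CDT_PORT = 443                     # HTTPS port that network security must allow
    MIN_FREE_DISK_GB = 5               # free disk space requirement from the list above

    def check_network(timeout_s=10):
        """Return True if a TLS connection to the CDT host can be established."""
        try:
            with socket.create_connection((CDT_HOST, CDT_PORT), timeout=timeout_s) as raw:
                with ssl.create_default_context().wrap_socket(raw, server_hostname=CDT_HOST):
                    return True
        except OSError:
            return False

    def check_disk(path="C:\\"):
        """Return True if the given drive has at least the required free space."""
        free_gb = shutil.disk_usage(path).free / 1024 ** 3
        return free_gb >= MIN_FREE_DISK_GB

    if __name__ == "__main__":
        print("Network access to CDT host:", "OK" if check_network() else "BLOCKED")
        print("Free disk space:", "OK" if check_disk() else "INSUFFICIENT")

Running the script on the configured testing computer before the technical verification call can surface firewall or disk-space problems early; it does not replace the verification performed with Jeppesen.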



Section 3: Jeppesen Aviation English Test Compliance with ICAO Standards

The Manual on the Implementation of ICAO Language Proficiency Requirements, Doc 9835 (2004), offers guidelines for testing the language proficiency of air traffic control personnel and flight crews involved in flight operations. This document identifies standards associated with testing and relates those to the Versant Aviation English Test (VAET) with the purpose of verifying the test's compliance with the ICAO standards.

3.1 VAET Is a Proficiency Test of Speaking and Listening (6.6.3 a)

VAET is a speaking and listening test. During the test administration, the Pearson testing system presents a series of recorded prompts to the test taker at a native conversational pace from several different speakers with a range of native accents and speaking styles. These integrated "listen-then-speak" items require real-time receptive and productive processing of spoken language forms. The VAET does not test knowledge about the language, but rather tests the test taker's facility with the language; that is, the ability to understand spoken English on work-related topics and to respond appropriately and intelligibly in spoken English at a fully functional conversational pace. In this way, the VAET directly evaluates speaking and listening through an audio exchange, as deemed appropriate by the ICAO manual.

The VAET is supported by a test paper that presents instructions, examples, some items, and call signs in print. The purpose of the test paper is to facilitate the test taker's understanding of the tasks. During test administration, the instructions and examples are spoken over the testing system. These instructions and examples are written verbatim on the test paper to enhance understanding. At the beginning of the test, there are some items that the test taker is asked to read aloud. These items provide information about the test taker's pronunciation and fluency and present a familiar task that is a comfortable introduction to the interactive mode of the test as a whole. Some of the items in the test require the use of a call sign. To alleviate memory burden while taking the test, these call signs are printed on the test paper.

3.2 VAET Is Based on the ICAO Rating Scale and Holistic Descriptors (6.6.3 b)

ICAO Rating Scale

Each of the six language skills in the ICAO Rating Scale is measured separately during the VAET and reported separately using the ICAO Scale Levels. The automated scoring produces multiple, fully independent measures from the same set of responses. For this reason, tasks often contribute to more than one subscore. The use of multiple item types for subscores maximizes score reliability. The added advantage of evaluating language skills independently is that a subscore is not confounded by features of other language skills. For example, a heavy accent will not affect the evaluation of the content of what the test taker says. As specified by the standards, the scoring is based on a conjunctive model in which the Overall score is the lowest subscore level (a minimal sketch of this rule is given after the subcomponent descriptions below). The specific tasks in the test are designed to provide information about the test taker's ability on one or more of the following measurement subcomponents: Pronunciation, Structure, Vocabulary, Fluency, Comprehension, and Interactions.



Pronunciation

Responses longer than a single word or short phrase are used to estimate Pronunciation. The system extracts information about the stress and segmental forms of the words in the responses and the pronunciation of the segments in the words within their lexical and phrasal context. These measures are scaled according to native and non-native distributions and then are rescaled such that they optimally predict human judgments of pronunciation.

Structure

For the Structure subcomponent, the goal is to measure the ability to control basic and complex grammatical structures. In a task section designed to assess these skills, called Repeat, the test taker listens to a sentence and then tries to repeat it verbatim. As the sentences increase in length and complexity, the task becomes increasingly difficult for less proficient speakers who are less familiar with the language's phrase structures and common syntactic forms. Generally, repetition of material is constrained by the size of the linguistic unit (e.g. "the very large descending aircraft") that a person can process in an automatic or nearly automatic fashion. The Repeats include both basic and complex grammatical structures and sentence patterns. The other task section that extracts information about Structure is called Readbacks. In this task, test takers are presented with a radiotelephony message and are asked to give an appropriate readback to confirm their understanding of the message. The test taker is expected to produce a readback using ICAO phraseology. Readbacks measure the Structure subcomponent in predictable, work-related language.

Vocabulary

For Vocabulary, test takers are asked to listen to an orally presented story or incident that deals with an aviation-specific topic and then describe what happened in their own words. Some topics are designed to be common, concrete, and familiar, while others are designed to be more abstract and unfamiliar. Test takers must identify words in phonological and syntactic context, extract propositions, and then paraphrase what was said. Scoring of the Story Retellings focuses on the test taker's vocabulary range and accuracy.

Fluency

Constrained responses longer than a single word or short phrase are used to assess Fluency. Although the same responses used to estimate Pronunciation ability are used here, the scoring is independent. For Fluency, features such as rate of speaking and the position and length of pauses are analyzed. The measures are scaled according to native and non-native distributions and then are rescaled such that they optimally predict human judgments of fluency based on the ICAO Rating Scale.

Comprehension

For the Comprehension subcomponent, the test assesses the test taker's ability to understand common and work-related words and concepts in sentence context. To do this, the test taker is presented with a series of questions that can be answered with a word or short phrase. An example of a Short-Answer Question is, "Is land that's almost entirely surrounded by water a peninsula or a pond?" The question requires the test taker to identify the lexical items of the question in phonological and syntactic context, infer the demand proposition, and then say an appropriate response. Lexical items are based on the ICAO list of priority lexical domains, including topics such as animals, numbers, movement, time, transportation, and weather. Since items are recorded in different native and non-native voices, the test taker must be able to comprehend a range of speech varieties.


Interactions

Finally, two different aspects of speech contribute to the Interactions subscore: the content of what is said and the test taker's response time. In a task called Corrections and Confirmations, the test taker hears a radiotelephony message, either from the air traffic controller's perspective or the pilot's perspective. Then, the test taker hears a readback, which might contain the correct information, wrong information, or a request for more information. The test taker is expected to respond to the message appropriately using ICAO phraseology when possible. If the response contains wrong information, the test taker is expected to provide correct information. Some items reflect routine communications/situations; others cover less routine communications/situations, and a small proportion explore unexpected communications/situations. The level descriptors in the ICAO Manual also include response time as an important aspect of the Interactions scale, so the Interactions subscore also incorporates a measure of the initial latency of the test taker's response.

Together, the tasks in the VAET provide direct measures of the test taker's listening and speaking ability, according to the six subskills in the ICAO Rating Scale.
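As a concrete illustration of the conjunctive model referenced in Section 3.2, the sketch below derives an Overall result as the lowest of the six ICAO subscores. It is an illustration of the stated rule only, not Pearson's scoring implementation, and the subscore values in the example are invented.

    # Illustrative sketch of the conjunctive model described in Section 3.2:
    # the Overall result is the lowest of the six ICAO subscore levels.
    # Not Pearson's scoring code; example values are invented.
    ICAO_SUBSKILLS = (
        "Pronunciation", "Structure", "Vocabulary",
        "Fluency", "Comprehension", "Interactions",
    )

    def overall_icao_level(subscores):
        """Return the Overall ICAO level as the minimum of the six subscore levels (1-6)."""
        missing = [skill for skill in ICAO_SUBSKILLS if skill not in subscores]
        if missing:
            raise ValueError("Missing subscores: %s" % missing)
        return min(subscores[skill] for skill in ICAO_SUBSKILLS)

    # Example: Level 5 on every skill except Pronunciation (Level 3) yields Overall Level 3.
    example = {"Pronunciation": 3, "Structure": 5, "Vocabulary": 5,
               "Fluency": 5, "Comprehension": 5, "Interactions": 5}
    assert overall_icao_level(example) == 3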

3.3 Holistic Descriptors

The ICAO manual provides five holistic descriptors that characterize the kinds of linguistic behaviors that proficient speakers should exhibit. Test takers who score well on the VAET should meet these criteria.

First, proficient speakers shall "communicate effectively in voice-only (telephone/radiotelephone) and in face-to-face situations." In the VAET administration, test takers interact with the testing system over the telephone. This voice-only administration paradigm enhances authenticity by offering a close simulation of radiotelephony communication. In both telephone and radiotelephony interactions, no facial cues or body language can aid communication. "Communications without such cues are considered to be more difficult and challenging, requiring a higher degree of language proficiency, than face-to-face interactions" (ICAO, 2004, Section 2.7.1). Given that speakers require a higher degree of language proficiency for voice-only communication, it follows that test takers who score high on the VAET will be able to communicate as effectively, if not more effectively, in face-to-face situations.

Second, proficient speakers shall "communicate on common, concrete and work-related topics with accuracy and clarity." The content of the VAET was designed to address work-related topics in the aviation domain. For the common English tasks such as Repeats and Short Answer Questions, the ICAO list of priority lexical domains guided the choice of lexical items and topics. In addition, transcripts of conversations from actual pilots and air traffic controllers were used as reference material for creating items in the Readback, Confirmation and Correction, and Story Retellings sections of the test. The scoring of the VAET is such that both accuracy and clarity are assessed across multiple tasks. For accuracy, the scoring algorithms of the VAET evaluate whether or not the test taker understood the prompt and responded with appropriate content. For clarity, the VAET assesses whether or not the test taker speaks like a native or favorably-judged non-native with regard to pronunciation and fluency. Producing accurate lexical and structural content is important, but excessive attention to accuracy can lead to disfluent speech production and can also hinder oral communication; on the other hand, inappropriate word usage and misapplied syntactic structures can also hinder communication. These different dimensions of spoken communication are measured independently in the VAET.

Third, proficient speakers shall "use appropriate communicative strategies to exchange messages and to recognize and resolve misunderstandings (e.g. to check, confirm, or clarify information) in a general or work-related context." The tasks in the VAET are designed specifically to engage test takers in checking, confirming, and clarifying information. For example, the Readback task asks test takers to listen to a radiotelephony message and read back the information in ICAO phraseology. The task provides a means for test takers to show their understanding of how to confirm information using standard communication protocols. In the Confirmations and Corrections task, the test taker listens to a short dialogue between an air traffic controller and a pilot and continues the dialogue. For some of the items, the test taker must resolve a misunderstanding by correcting information that was misinterpreted by one of the speakers. Other items elicit affirmative confirmations to questions, while others request clarification. The test taker must decide how to respond to the dialogue appropriately and in a timely fashion.


Fourth, proficient speakers shall "handle successfully and with relative ease the linguistic challenges presented by a complication or unexpected turn of events that occurs within the context of a routine work situation or communicative task with which they are otherwise familiar." To handle the linguistic challenges associated with unexpected situations, test takers not only must have access to a vocabulary that extends beyond memorized phraseologies, but must also be able to process language in an automatic fashion. In normal everyday conversation, language processing components that function automatically are extremely efficient. Speakers go from building a clause structure to phonetic encoding, for example, in 40 milliseconds (Van Turennout, Hagoort, and Brown, 1998). Automaticity entails the ability to access and retrieve lexical items, to build phrases and clause structures, and to articulate these without conscious attention to the linguistic code (Cutler, 2003; Jescheniak, Hahne, and Schriefers, 2003; Levelt, 2001). Automaticity is required in order for the speaker to be able to pay attention to what needs to be said rather than to how the encoded message is to be structured. The VAET uses items that are both routine and unexpected to measure the test taker's control of core language processing. By presenting items at a conversational pace and measuring response time in addition to the accuracy of the content of the response, the VAET measures basic encoding and decoding of oral language as performed in real time. This performance predicts a more general spoken language facility, which is essential to successful communication during a complication or unexpected turn of events.

Finally, proficient speakers shall "use a dialect or accent which is intelligible to the aeronautical community." The VAET measures the intelligibility of the test taker's speech through the Pronunciation subscore. The algorithms for Pronunciation scoring focus on the test taker's manner of speaking as opposed to the content of the test taker's responses. The algorithm is designed to predict how members of the aeronautical community would judge the pronunciation of test takers. Thus, if a test taker is unintelligible to many listeners because of a heavy accent or dialect, the test results will reflect this in the Pronunciation score.

3.4 VAET Is Reliable, Valid, and Practical (6.1.1)

Details regarding the reliability of the VAET and evidence of its validity will be available in the Technical Manual. For the production-level English test built on the same testing framework, the split-half reliability of the Overall score is 0.97, and the alternate-forms reliability is 0.96. Construct validity is supported by the large separation between the scores of native and non-native English speakers and by the high correlation (0.97) between human-generated and machine-generated scores.

When considering the practicality of test administration, the benefits of automation are clear. Automation makes large-scale administrations possible and ensures rating consistency. No time constraints are introduced by the difficulties of hiring, training, and monitoring qualified professionals to interview test takers and rate responses. The testing system is available 24 hours a day, seven days a week. This can help with efficient decision making and enhanced training.

3.5 The VAET Development Team Follows the Code of Ethics Outlined by the International Language Testing Association (ILTA) (6.4.1)

The development team for the VAET follows the code of ethics outlined by ILTA as standard development practice. In order to ensure respect for the humanity and dignity of each test taker, tests are developed from data representing both genders and a range of ages, races, ethnicities, and language backgrounds. By using data from a wide variety of users, steps are taken to ensure that the scoring models in the test do not discriminate against any particular group. Further, all items are reviewed by experts for any content that might discriminate against individual test takers. All information gathered about test takers is kept confidential and is only shared with authorized score users. Research subjects consent to participate without coercion and are free to withdraw from the research at any time.

The development team often attends and presents research at conferences focused on language testing in order to develop its knowledge and share information with other professionals. By presenting at conferences, the team also strives to educate the professional community about language proficiency and language testing. Examples include presentations at the International Aviation Training Symposium in Oklahoma City, USA (2006) and the ICAO Aviation Language Proficiency Seminar in Hong Kong (2006). The test developers accept the responsibility of not misusing professional knowledge and of ensuring the integrity of the profession of language testing. The team strives to improve the quality of language testing through innovation and technology and acknowledges its obligations to society, including the accurate reporting of test results and consideration of the potential effects of its work on all stakeholders.

Test Development of the VAET Involved Representatives from All Stakeholders as well as Language Experts (6.5.1, 6.5.2, 6.5.3)

The VAET test development team consisted of numerous language testing experts and relevant stakeholders in aviation language testing. Table 1 presents the members of the development team and their qualifications.


[Table 1. VAET development team members and their qualifications.]

In addition, the recorded voices for the test included two retired air traffic controllers, two active air traffic controllers, one retired commercial pilot and two active commercial pilots. Over 1200 professional air traffic controllers and pilots world-wide participated in the VAET’s trial and validation research studies.

3.6 Test Development of the VAET Followed a Rigorous Process (6.5.4)

The VAET Technical Manual will provide details of the rigorous process used to develop the test. As an overview, the process entails clearly defining the test construct, creating a detailed test specification, writing an item specification to be used by item writers, developing the items, coordinating review of the items by trained professionals, revising the items, creating a data collection system, collecting feedback on the test administration from stakeholders and revising the test instructions, trialing the items with both native and non-native English speakers from a variety of countries, analyzing the data, making a final selection of items, enabling automatic scoring of the items, creating score reports, and validating the test. Only items that are answered easily by native English-speaking professionals and that effectively discriminate among non-native speakers are included in the test. Due to the automatic nature of the scoring, no rater training program or process to ensure inter-rater reliability is needed. All scores are stored in databases. The process aligns with those recommended by the ICAO Manual (6.5.4) and by the Japan Language Testing Association (JLTA) referenced in Appendix D of the ICAO Manual.


3.7 VAET Tests Language Use in a Broader Context than in the Use of ICAO Phraseology Alone (6.6.3 d)

In the VAET, both ICAO phraseology and plain English appear throughout the test. For example, the Repeats section presents common English sentences, while the Readbacks section uses ICAO phraseology. In the Corrections and Confirmations section, ICAO phraseology is used except in those cases in which the test taker is presented with a non-routine situation and must use plain English to clarify, correct, or confirm information. The test follows the recommended model in the ICAO Manual, in which a test is "comprised of a mix of both aviation-specific content alongside less aviation-specific content" (Section 6.8.7).

3.8 Theory of Language Forms the Basis of the VAET (6.6.4)

The VAET is based on a psycholinguistic view of language competence, which emphasizes the development of first and second language skills and the real-time processes that underlie the performance of these skills. Since the 1960s, many production and perception studies have focused on the confirmation (or disconfirmation) of the "psychological reality" of structures and processes that have been hypothesized in linguistic research. This research has uncovered and quantified many robust phenomena of skilled (native-like) language performance, as well as a fairly clear structure of elementary processes that can be identified in appropriate experimental contexts. Eysenck & Keane (1995) review studies that show that complex cognitive skills that are used in language become automatic in skilled performance, and therefore do not absorb any of the attentional capacity of the speaker-hearers. A corollary of a psycholinguistic view relevant to testing is that attention to core language production limits attention to content; therefore the measurement of fluent, automatic control of core language will predict the complexity of content that can be produced or understood in real time.

This psycholinguistic view forms the basis of the VAET test construct: facility in spoken English in the aviation domain; that is, the ability to understand spoken English on work-related topics and to respond appropriately and intelligibly in spoken English at a fully functional conversational pace. Facility in spoken English is essential to successful radiotelephony communication: if language users cannot track what is being said, extract meaning as speech continues, and then formulate and produce a relevant and intelligible response in real time, they will not be able to interact in effective communication, especially when attention is focused on aviation-related tasks at hand. Because the Aviation English Test requires real-time language processing, it measures the degree of automaticity in the encoding and decoding of the core elements of oral language. Automaticity is the ability to access and retrieve lexical items, to build phrases and clause structures, and to articulate responses without conscious attention to the linguistic code (Cutler, 2003; Jescheniak, Hahne, and Schriefers, 2003; Levelt, 2001). Because cognitive capacity is limited, automaticity is required in order for the speaker/listener to be able to pay attention to the unfolding situation rather than paying attention to what needs to be said or how the encoded message should be structured.

3.9 The VAET Uses a Semi-Direct Testing Procedure (6.7.5)

The ICAO Manual outlines two different testing procedures: direct and semi-direct testing. In direct tests, the test taker is presented with elicitation prompts from a tester or interacts with other test takers while being observed by one or more raters. In the semi-direct testing procedure, pre-recorded prompts are presented to the test taker, and responses are usually recorded and later analyzed. As described in the ICAO Manual, both procedures directly assess the test taker's speaking or listening ability and are appropriate for assessing language proficiency (6.7.10).

The VAET uses a semi-direct administration procedure. Recorded prompts from a variety of voices are presented to the test taker over the telephone. Each prompt elicits a response from the test taker that is recorded digitally and then analyzed. Because the voices are all recorded, presentation of test material is consistent across test administrations. The recordings also allow test items to be presented in different voices and therefore reflect a controlled range of accents and speaking styles of aviation professionals.

One concern regarding semi-direct tests is that they can become compromised. However, the VAET is not made up of static test forms, but rather of dynamically generated forms that are unique from one test administration to the next. Each test is created from a pseudo-random selection of items from a large item pool, limiting item exposure. As recommended in Section 6.7.9 of the ICAO Manual, each test performance is audio recorded and can be referenced for verification and record-keeping purposes.
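To illustrate the dynamic form generation just described, the sketch below draws a pseudo-random selection of items from an item pool for each administration. The section names, item identifiers, and per-section counts are hypothetical placeholders, not the actual VAET item bank or test specification.

    # Hypothetical sketch of dynamic form generation: each administration draws a
    # pseudo-random selection of items from a large pool, so forms differ between
    # administrations and item exposure is limited. Section names and item counts
    # are placeholders, not the real VAET item bank or specification.
    import random

    def generate_test_form(items_by_section, items_per_section, seed=None):
        """Draw one test form: a without-replacement sample from each section's pool."""
        rng = random.Random(seed)
        form = {}
        for section, pool in items_by_section.items():
            form[section] = rng.sample(pool, items_per_section[section])
        return form

    # Example with invented pools: 3 Repeat items and 2 Readback items per form.
    pool = {
        "Repeat": ["RPT-%03d" % i for i in range(200)],
        "Readbacks": ["RBK-%03d" % i for i in range(150)],
    }
    form_a = generate_test_form(pool, {"Repeat": 3, "Readbacks": 2})
    form_b = generate_test_form(pool, {"Repeat": 3, "Readbacks": 2})
    # form_a and form_b will almost certainly differ, illustrating limited item exposure.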


3.10 Note about Technology (6.9.2)

In Section 6.9.2, the authors of the ICAO Manual offer a cautionary note regarding the use of computers to rate speech. The two specific areas of concern are the reliability of speech recognition technology and the evaluation of Interactions using computerized rating. Each of these concerns is addressed in turn.

Concern #1: Speech recognition technology is not reliable.

Automatic speech recognition technology has been deployed in many domains over the past few decades, including dictation systems, call center applications, and voice dialing on cell phones. Unfortunately, some users' experiences with these systems have been negative because of inconsistent recognition. Some studies have also been conducted to evaluate the suitability of using speech recognition in the aviation domain; however, these studies were not focused on testing, and as such measured success via recognition error rates as opposed to assessment of language proficiency. Such experiences and research results create a preconception of the limitations of speech recognition technology.

In the VAET, speech recognition is a small part of a larger system of spoken-language processing technologies integrated to evaluate an examinee's spoken performance. Each response helps the system assign a level of speaking ability to the examinee. Since 77 items contribute to this process independently, the assigned score becomes very reliable as all the items are considered. For the VAET, the acoustic models are different from those used in other speech recognition engines because they are trained specifically on samples of non-native English speakers. This means that the VAET is expecting (heavily) accented speech and can understand what non-native speakers are intending to say, regardless of poor pronunciation. The examinee's pronunciation is analyzed and scored separately.

An empirical test of how well the speech recognition and speech processing technologies are performing is a comparison of scores generated from speech processing technologies with scores generated by human experts. Such an experiment was conducted with the Versant for English test. Test scores were gathered from test takers using the speech recognizer and other speech processing models. Separately, scores were generated for the same test takers from careful transcriptions and expert ratings of pronunciation and fluency. The correlation between the two sets of scores was 0.97. Such a high correlation indicates that machine scores are virtually indistinguishable from scoring that is done by careful human transcriptions and repeated independent human judgments. A scatter plot is shown in Figure 1.

[Figure 1. Scatter plot of machine-generated Overall scores and human grades, n = 50, r = 0.97.]

A similar correlation is expected for the VAET, since both tests are built within the same technical framework. Thus, the speech recognition technology used in Versant tests to assign a score of spoken language performance is extremely accurate, as seen from comparisons with human-generated scores.
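The comparison described above reduces to computing a Pearson correlation between machine-generated and human-generated scores for the same test takers. The sketch below shows that calculation; the input score lists are placeholders, not data from the Versant for English study.

    # Sketch of the comparison described above: the Pearson correlation between
    # machine-generated Overall scores and human grades for the same test takers.
    # The example scores are placeholders, not data from the published study.
    from math import sqrt

    def pearson_r(machine_scores, human_scores):
        """Pearson product-moment correlation between two paired lists of scores."""
        n = len(machine_scores)
        mean_m = sum(machine_scores) / n
        mean_h = sum(human_scores) / n
        cov = sum((m - mean_m) * (h - mean_h) for m, h in zip(machine_scores, human_scores))
        var_m = sum((m - mean_m) ** 2 for m in machine_scores)
        var_h = sum((h - mean_h) ** 2 for h in human_scores)
        return cov / sqrt(var_m * var_h)

    # Placeholder example:
    print(round(pearson_r([34, 55, 62, 71, 80], [36, 52, 60, 74, 78]), 3))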


Concern #2: A computer cannot assess the subskill Interactions.

The ICAO rubrics highlight two main dimensions for assessing Interactions: the speed of the response and how appropriate the response is. For response latency, different levels provide different descriptions of response speed. At the operational level, for example, responses are "immediate," whereas at the elementary level, "response time is slow." In interviews, these judgments would be made from subjective impressions. In contrast, the Versant machine records precisely how many milliseconds it takes the test taker to respond to an item. From the machine's point of view, seven seconds is always seven seconds. Latency measurements are then compared to the same measurements from proficient speakers and are rescaled. The calibrations are done the same way for each test taker, so there is no inconsistency or bias. In this way, the machine is more reliable than humans at measuring response latency.

The assessment of whether the response is appropriate is simply a matter of identifying what the test taker has said and judging the appropriateness of the content of the response. This analysis can be done automatically by using speech processing technologies (recognizing what the test taker has said) and Item Response Theory (IRT), which produces an estimate of the test taker's ability according to the item's difficulty. The analysis is based on more than just one rater's impression of what is appropriate, but rather on comparisons of the test taker's response to hundreds of responses from native and non-native pilots and air traffic controllers of all proficiency levels. These comparisons help to precisely pinpoint the test taker's level of ability through robust statistical analyses. Other aspects of the Interactions rubric, such as dealing with "an unexpected turn of events" and "checking, confirming, or clarifying" information, are more closely associated with the tasks in the test as opposed to whether or not a machine can score the performance. The VAET includes tasks that present predictable situations as well as unexpected events and provides ample opportunities for test takers to deal with apparent misunderstandings by clarifying, confirming, and correcting information.

Aside from technical arguments, there is also a philosophical element to the issue. It is still difficult for many to accept that a machine might be as good as, if not better than, a human being at performing complex tasks. The debate about whether or not machines can evaluate Interactions, which may involve not only skill but intuitive human judgment, is similar to the debate about whether a machine could ever beat a human player at chess. In 1995, the world chess champion, Garry Kasparov, insisted that no computer would ever beat him because, in his view, there is more to playing chess than mathematical calculations. He claimed that there is a certain 'art' to winning. Two years later, after a match against IBM's chess machine, Deep Blue, he was proven wrong. The machine's algorithms were not based on artificial intelligence and therefore did not approach the problem the same way a human would. Instead, the designers took advantage of what computers do best: applying massive computing power to a search problem. In a similar vein, an automated analysis of Interactions exploits the very things that machines do well: computers are very good at measuring time, and automatic speech recognition can be characterized as a special type of search problem. In addition, scores can be scaled accurately because the computer encodes information about which items are surprising or difficult for non-natives, based on actual data from examinees.

The key dimensions that define the Interactions subskill are how immediate and how appropriate the responses are. Because a machine can measure the latency of speech and can evaluate whether or not a response is appropriate (in both predictable and unexpected situations), a machine can, in fact, measure ability on the Interactions scale. Although there are preconceived notions regarding the limitations of speech recognition technology, the two concerns mentioned in the ICAO Manual have been overcome and should not prevent the aviation community from taking advantage of all the benefits offered by automated language testing, including consistent administration, the ability to handle large volumes of test takers, unbiased and consistent scoring, and immediate results.
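As a rough illustration of the latency calibration described above, the sketch below expresses a test taker's average response latency relative to the distribution of latencies from proficient speakers. The z-score formulation and the example values are assumptions made for illustration only; the actual Versant calibration is proprietary.

    # Rough illustration of the latency calibration described above: a test
    # taker's average response latency is expressed relative to the distribution
    # of latencies measured from proficient speakers. The z-score formulation and
    # the example values are assumptions for illustration; the actual Versant
    # calibration is proprietary.
    from statistics import mean, stdev

    def scaled_latency(test_taker_latencies_ms, proficient_latencies_ms):
        """Average response latency as a z-score against proficient-speaker latencies.

        Values near zero indicate response times comparable to proficient speakers;
        large positive values indicate the slow responses penalized by the
        Interactions descriptors.
        """
        mu = mean(proficient_latencies_ms)
        sigma = stdev(proficient_latencies_ms)
        return (mean(test_taker_latencies_ms) - mu) / sigma

    # Invented example latencies in milliseconds:
    print(scaled_latency([900, 1200, 800], [600, 700, 650, 720, 680]))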


3.11 References

Balogh, J., Bernstein, J., Suzuki, M., & Lennig, M. (2006). Automatically scored spoken language tests for air traffic controllers and pilots. Paper presented at the International Aviation Training Symposium (IATS), Oklahoma City.

Chiang, K. (2006). A case for an automated aviation English test. ICAO Aviation Language Proficiency Seminar, Hong Kong.

Cutler, A. (2003). Lexical access. In L. Nadel (Ed.), Encyclopedia of Cognitive Science, Vol. 2. London: Nature Publishing Group, 858-864.

Eysenck, M., & Keane, M. (1995). Cognitive Psychology. Hove, UK: Psychology Press.

International Civil Aviation Organization (2004). Manual on the Implementation of the ICAO Language Proficiency Requirements, First Edition, Document 9835.

Jescheniak, J. D., Hahne, A., & Schriefers, H. J. (2003). Information flow in the mental lexicon during speech planning: Evidence from event-related brain potentials. Cognitive Brain Research, 15(3), 261-276.

Levelt, W. J. M. (2001). Spoken word production: A theory of lexical access. PNAS, 98(23), 13464-13471.

Van Turennout, M., Hagoort, P., & Brown, C. M. (1998). Brain activity during speaking: From syntax to phonology in 40 milliseconds. Science, 280, 572-574.


Section 4

Aviation English Testing Facilities

This section contains the requirements for the setup of a facility to be used as a testing facility for Aviation English Certification tests. It also outlines the specifications for the hardware that will be used for conducting certification tests using the Computer Delivered Tests application (CDT Client). Each facility intended to be used as a testing facility for Aviation English certification tests shall be set up in accordance with the guidelines specified in this section and shall consult this section to crosscheck for compliance. Each testing facility shall offer the following:

• Temperature-controlled testing room
• Adequate lighting
• Full ventilation
• Appropriate insulation to prevent distractions from activities in adjacent rooms
• Internet connection
• Designated person to coordinate software download and system setup

4.1 Hardware Requirements

To use Pearson’s Computer Delivered Tests, the Pearson software must be downloaded and installed on a personal computer that meets the following system requirements (a short self-check sketch follows this list):

• Windows® XP SP3+, Vista, or 7
• Broadband Internet connection
• Pentium® III at 600 MHz or higher
• 512 MB of RAM
• 5 GB free disk space
• Screen resolution of at least 1024 x 768
• Web browser: Internet Explorer 7.0 (or higher)
• Network security access to allow the CdtClient.exe application to access https://www.VersantTest.com (port 443)
• Soundcard / audio driver that can play audio (headphones recommended)
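As a convenience, the sketch below shows one way an administrator could spot-check a candidate computer against two of the requirements above: the 5 GB of free disk space and the network security access to https://www.VersantTest.com on port 443. It is only an illustrative sketch using the Python standard library; the drive letter and timeout are assumptions, and it does not replace the full requirements list or the official installation checks.

import shutil
import socket

def check_disk_space(path="C:\\", required_gb=5):
    """Check that the drive holding the test software has at least the
    required free space (5 GB per the list above)."""
    free_gb = shutil.disk_usage(path).free / (1024 ** 3)
    return free_gb >= required_gb, round(free_gb, 1)

def check_versant_reachable(host="www.VersantTest.com", port=443, timeout=10):
    """Confirm that an outbound TCP connection to the Versant service on
    port 443 is permitted by the local network security settings."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

ok_disk, free_gb = check_disk_space()
print("Free disk space:", free_gb, "GB -", "OK" if ok_disk else "below the 5 GB minimum")
print("Versant service reachable on port 443:", check_versant_reachable())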



Furthermore, each testing facility shall make available to the test applicant a head-mounted headset with microphone and headphones compatible with the following requirements:

Headset Type
  Connectivity: Cable
  Usage: Consumer
  Compatibility: Computer
  Design: Head Mounted

Headphone Features
  Sound Mode: Stereo
  Ear Piece: Double
  Driver Unit Size: 32 mm
  Frequency Response: 20 - 20000 Hz
  Impedance: 32 ohms

Microphone Features
  Frequency Response: 100 - 12000 Hz
  Impedance: 3320 ohms

The testing facility shall assure that all equipment necessary to conduct certification tests is maintained properly and in good working order prior to administering any certification tests. Testing equipment shall be tested on a regular basis and any malfunctioning item shall be replaced immediately.

4.2 Facility Security

Each aviation English testing facility shall assure that each test applicant is provided a secure testing environment that is free of any distractions to the applicant and in accordance with the procedures outlined in the Aviation English Test Security Procedures section (Section 5). If a testing facility is not able to comply with any of the guidelines specified in this section, it will not be allowed to administer any certification tests until such time as compliance can be guaranteed.


Section 5

Aviation English Test Security Procedures

This section contains the procedures to be followed by test invigilators during the Jeppesen Aviation English certification test process. Strict adherence to these procedures is required to assure proper conduct of the test and to prevent any security-related irregularities that would compromise the quality of the certification test. These procedures are applicable to any test invigilator, whether a Jeppesen invigilator or a designated representative. Jeppesen will have the responsibility of assigning a qualified test invigilator, and will assure that the invigilator or designated representative has been properly trained and is competent to administer certification tests, as outlined in the Jeppesen Aviation English Test Invigilator Syllabus in Section 10. The Jeppesen test invigilator or a designated representative shall assure compliance with the following items:

• Verify the test applicant’s identification document; this document must contain a photograph of the test applicant.
• The test invigilator will verify that the identification document presented by the test applicant is valid and that the name of the applicant is clearly readable.
• Assure that the identification document presented is issued by an official state or country agency.
• If applicable, the invigilator will also verify the test applicant’s company identification.
• If the test invigilator has any doubt concerning the validity of the identification document presented, he or she shall have the applicant present an alternate form of identification.
• The test invigilator shall not commence testing if the applicant is unable to present any form of identification, or when in doubt about the validity of the identification document presented.
• The test invigilator shall assure that the test system has been properly set up and that all hardware and software required for the testing process is functioning properly.
• The test invigilator shall assure that the applicant has been properly briefed on testing procedures, and will not commence the test until he or she is assured the applicant has understood the instructions.
• The test invigilator shall assure that the test is taken solely by the test applicant, and that no assistance is provided to the test applicant by any other person.
• The test invigilator shall assure that the test is conducted in a secure environment, free of any distractions to the test applicant.
• The test invigilator shall immediately stop the test should there be any improper conduct by the test applicant during the certification test.



Section 6

Identification Verification Procedures

This section outlines the procedures to be followed during the Aviation English Certification testing process as they relate to test applicant identification verification. These procedures are to be followed and adhered to by any person who administers aviation English certification tests, whether a Jeppesen test invigilator or a designated representative. The procedures outlined in this section complement the specific testing security procedures set forth in the Aviation English Test Security Procedures section (Section 5).

6.1 Identification Verification

It will be the responsibility of the aviation English certification test invigilator to verify the identification document presented by the test applicant, in compliance with the following:

• Assure that the identification document presented is issued by an official state or country agency.
• Assure that the document presented is a valid passport, official identification card, or driver’s license.
• Assure that the identification document contains a clearly visible photograph.
• Assure that the name listed in the document is the same as the name that will be listed on the certificate of proficiency.
• Assure that the document presented is still valid and has in no way been tampered with.
• If the test applicant is an employee of an airline or any other aviation organization, the test invigilator shall verify the individual’s company identification card, in addition to the official document of identification.
• If the test applicant is an employee of any Air Traffic Control organization, the test invigilator shall verify the individual’s credentials, in addition to the official document of identification.

6.2 Non-Compliance

No test invigilator will be allowed to administer any certification tests for any applicant who is either unable to present any identification documentation, or who presents a document that is not in compliance with the guidelines outlined above. Should there be any doubt as to the authenticity of the identification documents presented, the test invigilator will be required to ask the test applicant for alternate forms of identification.

6.3 Identification Records

No certification test invigilator will be allowed to administer any certification tests without first making a photocopy of the identification document presented by the test applicant; this photocopy shall be kept on file at the testing facility, along with all other testing records of the test applicant.



Section 7

Additional Testing Procedures

This section contains additional procedures to be followed when conducting Aviation English Certification Tests. It also outlines the procedures for grading, issuance of Certificates of Proficiency, and the storage of Certification Test results. Any facility that conducts Aviation English Certification Tests will be required to follow the procedures outlined in this section, and it is the responsibility of the testing facility to assure that any person acting as a test invigilator is familiar with these procedures. No testing facility shall be allowed to conduct any certification tests if it is unable to comply with the procedures outlined in this section.

7.1 Pre-Testing Procedures

No testing facility shall commence any certification tests unless it is in compliance with the following:

• Assure that the testing facility has been properly set up in accordance with the procedures outlined in the Aviation English Testing Facilities section (Section 4).
• Assure that the identification documents of the test applicant have been properly verified in accordance with the procedures outlined in the Identification Verification Procedures section (Section 6).
• Assure that a qualified test invigilator is available at the testing facility.
• Assure that the test applicant is properly briefed on how the Certification test will be conducted.

7.2 Testing Procedures

Each testing facility shall make available a qualified test invigilator who will assure compliance with the following:

• Assure that each test applicant is assigned a valid Test Identification Number (TIN).
• Verify that the test applicant is familiar with the instructions on how to execute the Certification test.
• Assure that the test is being conducted in compliance with the procedures outlined in the Aviation English Test Security Procedures section (Section 5).
• Assure that the test applicant will be monitored while the Certification test is being executed.

7.3 Post-Testing Procedures

The procedures to be followed by any testing facility after completion of any Certification test are as follows:

• Provide Jeppesen Academy with the Test Identification Number and identification of the test applicant.
• The testing facility shall advise the test applicant of the grading and proficiency level achieved as soon as such results have been received from Jeppesen Academy.



• In the case where a test applicant has not achieved the minimum proficiency level required, the test invigilator shall conduct an evaluation of the test applicant’s results, identify the areas of weakness, and, where appropriate, schedule the applicant for remedial training.
• Any test applicant who does not achieve the minimum required proficiency level shall be rescheduled for certification testing at such time as deemed appropriate by the test invigilator.
• The test facility shall assure that each applicant who has successfully achieved the minimum required proficiency level is issued a certificate of proficiency that positively identifies the test applicant, indicates the level of proficiency achieved, and shows when the test applicant would have to take the next certification test (if applicable).
• The test facility shall keep on file all relevant documents for each test applicant, including copies of identification documents, test grading, and the certificate of proficiency.
• The test facility shall keep on file all relevant documents for each test applicant for a period of no less than 36 months.

7.3.1 Appeals Process (to be Provided Prior to Testing)

Should a test taker or their company representative disagree with the test results, the following steps will be taken:

• The test administrator will listen to the Open Questions via ScoreKeeper to check whether there is any static or other obvious issue with the quality of the captured sound file.
  o If so, a no-charge re-test is made available to the original test taker.
• If no obvious issue with the test audio is observed, then within 2 business days the proctor submits a request to Pearson Support for a review of the test.
• Pearson Support will listen to both the test questions and the test taker’s responses and check for any audio or technical issue with the test.
  o If they find any suspected technical or audio issue, a no-charge re-test is made available to the original test taker.
• If no obvious technical or audio issue is detected, the test will be reviewed by a Pearson linguistic expert to determine whether the test score is accurate.
  o If the test score is deemed accurate, the test taker or their company representative is notified via email that the test score stands as is.
  o If, after the full review of the test, there is any question that the test was not accurately scored, a no-charge re-test is made available to the original test taker.
  o A review analysis report is provided to the test taker or their company representative explaining the findings of the review.


Section 8

CDT Client Installation Guide

8.1 Introduction

This installation guide is written for administrators of Computer Delivered Tests. It describes the system requirements and installation procedure for the CDT and related components.

Pearson, the Pearson logo, and Computer Delivered Tests (CDT) are trademarks of Pearson Corporation. Copyright © 2012 Pearson Corporation. All rights reserved.

8.2 System Requirements

Before installing the CDT, make sure that your computer meets the following minimum requirements.

8.2.1 Hardware

• Windows® XP SP3+, Vista, or 7
• Broadband Internet connection
• Pentium® III at 600 MHz or higher
• 512 MB of RAM
• 5 GB free disk space
• Screen resolution of at least 1024 x 768
• Web browser: Internet Explorer 7.0 (or higher)
• Network security access to allow the CdtClient.exe application to access https://www.VersantTest.com (port 443)
• Soundcard / audio driver that can play audio (headphones recommended)
• Soundcard / audio driver with recording and playback capabilities that has been certified to work with the version of Windows being run on the test computer (Note: the following audio drivers may not work: Conexant HD)

Head-mounted headset with microphone and headphones compatible with the requirements below:

Headphone Features
  Sound Mode: Stereo
  Ear Piece: Double
  Driver Unit Size: 32 mm
  Frequency Response: 20 - 20000 Hz
  Impedance: 32 ohms

Microphone Features
  Frequency Response: 100 - 12000 Hz
  Impedance: 3320 ohms

8.2.2 Calculate and Verify Bandwidth for Expected Testing Volumes

The CDT program operates by downloading a test to the local machine and then uploading the responses once the test is complete for scoring. This requires network access to an Internet connection of sufficient bandwidth to accommodate the volume of concurrent testing that you plan to conduct in your test center. (Note: In addition to a real-time mode, CDT also supports an option that allows you to pre-load tests, complete the tests offline, and then reconnect later to upload results for scoring.) To ensure your test center has adequate Internet bandwidth, please consult the document “Network and Bandwidth Requirements.”
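The arithmetic behind such a bandwidth check is straightforward, as the illustrative sketch below shows. The per-test payload size and the transfer window used in the example are assumptions chosen only to demonstrate the calculation; the authoritative figures are in the “Network and Bandwidth Requirements” document mentioned above.

def required_bandwidth_mbps(concurrent_tests, payload_mb_per_test, window_minutes):
    """Rough sustained bandwidth needed so that every concurrent test can
    download its materials (or upload its responses) within the window."""
    total_megabits = concurrent_tests * payload_mb_per_test * 8  # MB -> megabits
    return total_megabits / (window_minutes * 60)                # megabits per second

# Example with illustrative numbers only: 10 concurrent tests, an assumed
# payload of 15 MB each, to be transferred within a 5-minute window.
print(round(required_bandwidth_mbps(10, 15, 5), 1), "Mbps")  # 4.0 Mbps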



8.3 Installing CDT

Before testing can begin, you must download and install the CDT client from Pearson’s website onto all of the computers that you plan to use for Versant testing. The whole process typically only takes a few minutes per computer. Follow these steps for each computer:

1. Go to http://www.VersantTest.com/technology/platforms/cdt/index.jsp

2. Click to download the CDT Client application (approx. 10 MB download).

3. When the dialogue box appears, select Run to download and install CDT on the computer.

4. Click Next in each of the following dialog boxes until the installation is completed.

5. Once installation of the CDT client is complete, an icon will automatically appear on your desktop and in the Start Menu.

8.4 Verifying CDT Operation

After installing CDT onto a computer, you should complete a sample test, making sure to check the headset audio and microphone, so that you can verify CDT is working properly before you arrange for live testing with candidates.

8.4.1 Launch CDT and Download a Test

1. Ensure your computer is connected to the Internet.
2. Start CDT by double-clicking the CDT Client icon on your desktop or from the Start Menu.
3. Enter the Test Identification Number for a sample test (your account manager can provide samples for verifying CDT installations).
4. Monitor the test download progress on the screen to determine the typical download speed for your Internet connection.

8.5 Check Audio and Headset

1. Check that your headset and microphone are plugged in, configured properly, and that the sound and volume are turned on and up.

2. Follow the instructions for the audio check to ensure your computer can play the test audio at an acceptable level for completing the test.

3. For speaking tests, follow the instructions to check that your microphone is functioning properly.


8.6 Complete a Test and Verify Uploading of Responses

1. With your computer connected to the Internet, complete a sample test and click Finish.
2. Monitor the test upload in the Administrator menu to determine the typical upload speed for your Internet connection.
3. Check the score on www.VersantTest.com or in your ScoreKeeper account to ensure the test was properly uploaded and scored.

8.7 Uninstallation of Older Versions

If you have previously installed a version of CDT or the older IDT software for running Versant tests, follow these steps to uninstall it:

1. Launch Java Web Start from the Windows Control Panel: Start => Settings => Control Panel => Java.
2. On the General tab, under Temporary Internet Files, click Delete Files.
3. Be sure all boxes are checked and click OK.
4. Click OK again to close the Java Control Panel.
5. Open Windows Explorer and select the C: drive.
6. Delete the entire C:\Ordinate folder.


Section 9

Computer Delivered Tests Test Administrator’s Guide

9.1 Introduction

Pearson’s Computer Delivered Test (CDT) program, using patented Ordinate® speech processing technology, enables test administrators to deliver Versant language tests on a test center computer and upload completed tests for scoring.

This Guide is written for administrators of CDT. It explains how to:

• Configure Testing Center Computers
• Download Tests
• Take a Test

This Guide assumes that you have successfully installed the CDT program on each of the computers on which you intend to administer tests. If you have not completed the installation process, please consult the CDT Installation Guide, which can be downloaded from Pearson’s website: www.VersantTest.com/technology/platforms/cdt/

9.2 Preparing for Test Administration

Before administering a test, prepare the computer for test delivery. Preparation involves three steps:

• Verify screen resolution settings
• Verify that the microphone is working and that the volume is properly set
• Verify bandwidth of your Internet connection for your expected testing volume

9.2.1 Verify screen resolution settings

1. Open the Control Panel.
2. In the Control Panel, click Display.
3. Click the Settings tab.
4. Check the screen resolution setting in the Screen Area. It should be at least 1024 x 768.
5. Move the slider to adjust the settings to the required minimum if needed.
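If you prefer to confirm the setting from a script rather than through the Control Panel, the sketch below reads the primary display resolution through the standard Windows GetSystemMetrics API and compares it with the 1024 x 768 minimum. It assumes a Windows test computer, consistent with the system requirements, and is offered only as an optional cross-check.

import ctypes

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # GetSystemMetrics indices for primary width and height

def primary_resolution():
    """Return the primary display resolution on Windows via user32.GetSystemMetrics."""
    user32 = ctypes.windll.user32
    return user32.GetSystemMetrics(SM_CXSCREEN), user32.GetSystemMetrics(SM_CYSCREEN)

width, height = primary_resolution()
meets_minimum = width >= 1024 and height >= 768
print("Screen resolution:", width, "x", height, "-", "OK" if meets_minimum else "below 1024 x 768")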


9.2.2 Verify microphone volume level

The microphone volume level for Windows XP and Windows 7 is adjusted automatically by the CDT Client. For Windows Vista, the microphone volume level must be adjusted manually by following these steps:

1. In the Start Menu, locate the search box, type Speech Recognition Options, and press Enter.
2. In the Speech Recognition Options window, click on Set up microphone.
3. Follow the instructions provided and click Next.
4. Read the full sentence that appears on the screen into the microphone, then click Next.
5. Click Finish to complete your microphone calibration.

9.2.3 Verify bandwidth for expected testing volume

The CDT program operates by downloading a test to the local machine and then uploading the responses once the test is complete for scoring. This requires network access to an Internet connection of sufficient bandwidth to accommodate the volume of concurrent testing that you plan to conduct in your test center. (Note: In addition to a real-time mode, CDT also supports an option that allows you to pre-load tests, complete the tests offline, and then reconnect later to upload results for scoring.) To ensure your test center has adequate Internet bandwidth, please consult the document “Network and Bandwidth Requirements.”

9.3 Downloading Tests

After the CDT software has been installed and each computer has been configured, testing can proceed. For test takers to be able to take a test, a Test Identification Number (TIN) must be entered for each test to download the appropriate test materials. There are two administrative options for downloading TINs:

1. Pre-loading TINs before scheduled testing: As the test administrator, you can pre-load the computer with tests before your scheduled testing. We recommend this method if:
   a. You have any concerns about the speed or performance of your Internet connection.‡
   b. You plan to conduct testing on multiple computers at the same time.‡
   c. You need to conduct testing on a computer that cannot be connected to the Internet during the test (for example, if you conduct testing with a laptop in a remote location).

2. On-demand downloading of TINs: If you can stay connected to the Internet and have enough bandwidth for the volume of testing you plan to do,‡ you may also provide TINs to the test takers and instruct them to enter the TIN as the first step of their test. The test materials will then download on demand. This download will need to complete before the test taker can begin the test.

Note that the same TINs cannot be downloaded to multiple computers. If a batch or TIN was mistakenly downloaded to the incorrect computer, please contact Pearson Support ([email protected]) to release the TIN(s) from that computer.

9.4 Getting Started

1. Ensure the computer is connected to the Internet.
2. Start CDT by double-clicking the CDT Client icon on the desktop.
3. Go to Menu located on the top right of the screen.
4. Click on Administration.
5. Enter your ScoreKeeper Username and Password to enter the Administrator’s configuration preferences and click Enter. If you do not have a ScoreKeeper Username and Password, contact your Jeppesen Account Manager.


9.5 Pre-loading TINs Before Scheduled Testing

Tests can be pre-loaded by batch, or by Test Identification Numbers (TINs).

1. In the Administrator menu, click Download Tests.
2. To download an entire batch of tests:
   a. Click the Batch Key tab.
   b. Enter the alpha-numeric batch key into the space provided (Note: the batch key can be found by going to the batch in your ScoreKeeper account and clicking the Test Materials link).
   c. Click Enter. CDT will then download all of the unused TINs in that batch.
3. To download individual Test Identification Numbers (TINs) from multiple batches:
   a. Click the TIN(s) tab.
   b. Enter each TIN or a comma-separated list of TINs.
   c. Click Enter.

9.6 On-Demand Downloading of TINs

To download tests on-demand, you may also provide TINs to the test takers and instruct them to enter the TIN as the first step of their test.

1. Start CDT, enter the TIN in the space provided, and click Enter.
2. The TIN will then begin to download. The download progress will be displayed. Do not exit CDT while the TIN is downloading.
3. Once the TIN is fully downloaded, CDT will display the headset (and, for speaking tests, microphone) check.

9.7 View Tests

Once you have downloaded tests to the computer, you can view all the unused tests that are available for use on this computer.

1. In the Administrator menu, click View Tests.
2. Verify that the total number of tests available matches your records.
3. Click Refresh TIN List to check the availability of the local test inventory against your ScoreKeeper account.

9.8 Taking a Test

To begin a test, the administrator or test taker must enter a valid Test Identification Number (TIN) into the Welcome screen of the CDT Client.

1. Start CDT and go to the Home page.
2. Give the test taker a TIN.
3. The test taker enters the TIN in the space provided and clicks Enter to start the test.

If the TIN has been pre-loaded, then the test will begin immediately, starting with an audio check. If not, then the TIN will be downloaded before the test can begin.


9.8.1 Audio Volume and Microphone Check

At the beginning of each test, test takers will go through an audio volume check. For speaking tests, test takers will also go through a microphone volume check.

1. The test taker is instructed to put on their headset (for speaking tests) or headphones (for writing tests) as shown on the screen.
2. Click Next to proceed.
3. The test taker listens to a passage and is instructed to move the slider to change the volume. Note: Test takers can also change the volume during the test in the upper right corner of the screen.
4. Click Next to proceed.
5. For speaking tests, test takers are asked to read a sentence into their microphone to verify that it is working correctly.
6. If CDT cannot verify that the microphone is working correctly, an error message will appear. Follow the instructions to complete the microphone check. It is important to verify that the microphone is working properly. If the microphone is unable to properly record the test taker’s responses, then that test’s score may be affected. If error messages persist, exit CDT and check the microphone volume and functioning using your computer’s audio and microphone controls (see Section 9.2.2).
7. Once the calibration checks are complete, the test will begin.

9.8.2 Completing the Test

When the test taker is finished, the application will prompt the test taker to click Finish. If the computer is connected to the Internet when the test taker clicks Finish, the test responses will be automatically uploaded to the Versant system for scoring. Note: if the test taker exits CDT before clicking Finish, the test will not be completed and sent for scoring. If the Internet connection was disabled during testing, administrators must reconnect to the Internet and launch the CDT Client for the response files to be automatically submitted for scoring. Keep CDT open until all responses have been uploaded. You can check the status of completed tests by doing the following:

1. Go to Menu located on the top right of the screen.
2. Click on Administration.
3. Enter your ScoreKeeper Username and Password to access the Administrator’s configuration preferences and click Enter. If you do not have a ScoreKeeper Username and Password, contact your Account Manager.
4. In the View Tests section, click Refresh TIN List.
5. Look at the “Sessions” information. If there are tests that still need to be uploaded for scoring, they will be listed here. If you see “No local sessions/response,” then all completed tests have been uploaded and you can exit CDT.

9.8.3 Checking Scores

Test scores are available within minutes after the voice files are successfully submitted to the Versant system. To view scores, go to www.VersantTest.com and log into your ScoreKeeper account.

9.9 Troubleshooting

This section contains suggestions for troubleshooting problems. If you encounter any problem that cannot be resolved, contact our Technical Support team at [email protected] or your Jeppesen account manager.


9.9.1 Microphone Errors

Microphone volume levels for Windows XP and Windows 7 users will automatically adjust in the CDT Client. For Windows Vista users, follow the steps in Section 9.2.2 to verify the microphone volume level. If the volume levels have been verified but error messages are still occurring, it could be for one of the following reasons:

1. The microphone has not been positioned correctly as shown on the computer screen.
2. The microphone may not be compatible or working properly with the computer. Exit the CDT Client and go to the Control Panel. Select Sound and Audio Devices and then click on the Voice tab. Next, click on the Test Hardware button and follow the steps on the screen to make sure the microphone is working properly.

9.9.2 Downloading Tests

If you are experiencing difficulties downloading tests, it could be for one of the following reasons:

1. The Batch Key may not be valid. The Batch Key is a 12-character alpha-numeric code that is unique to each batch. The Batch Key can be found in your ScoreKeeper account under List Batches >> Test Materials for each batch.

2. The Batch has no available tests. If the batch has already been downloaded to your computer, it is recommended to refresh the TIN list before using those TINs (see Section 9.7), to ensure the most up-to-date list of unused TINs is displayed. Only unused TINs will be downloaded to the computer.

3. The TIN(s) or Batch cannot be located. The TIN(s) or entire batch may have already been used. Log into your ScoreKeeper account, go to List Batches >> View Scores, and select Click here to see a list of all tests in this batch to ensure the TIN(s) or batch are unused.

9.9.3 Scoring Delays

Test scores are typically available in ScoreKeeper within minutes of completing a test and uploading the responses to the Versant Testing System. If scores are not available within 15 minutes, it could be for one of the following reasons:

1. Your Internet connection may have been interrupted. Exit the CDT Client, re-connect to the Internet, and open the CDT Client. Tests will be automatically uploaded to the Versant testing system and the scores will become available shortly.

2. The Versant testing system may be experiencing a brief outage. Check with your Account Manager to see if there is any scheduled maintenance downtime and when the system will become available. Once the outage is over, tests that were administered during the outage window will be scored on a first-come, first-served basis to the best of our scoring capacity.


Section 10

Test Invigilator Course Syllabus

10.1 Applicability

This section sets forth the requirements of an approved Aviation English Test Invigilator Course. The course is intended for individuals who wish to invigilate the certification tests taken by aviation personnel to demonstrate the required level of English proficiency, as outlined in the relevant appendices of the ICAO SARPs and ICAO Document 9835. The duration of the course is one day.

10.2 Course Objective

At the conclusion of the training curriculum, the individual will be able to successfully demonstrate his or her knowledge of the invigilation of Aviation English Certification Tests. Upon successful completion of the course, the individual will be issued a Certificate of Qualification, which serves as proof that the individual has undergone the required training and has reached the competency level required for Aviation English Certification test invigilation.

10.3 Facilities

Training will be conducted in a facility that can comfortably seat the students, is equipped with headsets with boom microphones, and is maintained at a comfortable temperature.

10.4 Training Aids

Training aids include:

• Laptop Computer
• Projector, flipchart or whiteboard
• Computer Delivered Test (CDT) User Guide

10.5 Curriculum

10.5.1 Testing Facility Setup

Learning Objective

Discuss the importance of having a secure, comfortable testing environment, where test applicants are not subjected to any unnecessary distractions and are able to fully concentrate on the certification test. Items to check for include:

• Adequate Facility Space
• Facility Tidiness
• Facility Noise Level


10.5.2 Test Administration Preparation

Learning Objective

Learn how to properly set up the workstations on which the tests will be conducted and how to verify the working status of the required hardware. Items to verify include:

• Microphone and Headset functionality
• Screen resolution settings
• Audio settings

10.5.3 Administering a Test

Learning Objective

Learn how to successfully launch the CDT Client application and execute and administer a test. This section will include the following items:

• Launching the CDT Client
• Downloading and executing a test
• Downloading multiple tests
• Administering previously downloaded tests
• Adjusting microphone and sound playback volume levels
• Pilot Briefing
• Security

10.5.4 Administrative Functions

Learning Objective

Become familiar with procedures to handle administrative functions on the CDT Client Application, and monitor the status of the system during the test upload process. The following items will be included in the discussion:

• Accessing the CDT configuration panel
• Changing the administrator password
• Configuring data transport
• Administering tests in offline mode
• Monitoring upload status
• Configuring display settings


10.5.5 Troubleshooting

Learning Objective

Obtain knowledge on how to adequately resolve any irregularities specifically related to the CDT Client Application. Explanations and solutions will be offered for the following problems:

• Calibration Failure
• Password Recovery
• Upload Problems
• Test Data Recovery


Section 11

Miscellaneous Forms

This section contains additional forms and instructions related to Aviation English Testing.

11.1 Aviation English Practice Placer Test Instructions

Welcome to the ICAO Aviation English proficiency evaluation. Please read these instructions carefully and ask for clarification on any items that you do not fully understand. The test will take approximately 10 minutes to complete. The Jeppesen English proficiency evaluation is presented by computer over the telephone. It is designed to assess your ability to understand and use ICAO phraseologies as well as common English. The test is intended to measure the six subskills in the ICAO language proficiency scale (Pronunciation, Structure, Vocabulary, Fluency, Comprehension, and Interactions).

Procedure: First, take time to read the whole test paper. If there are words or sentences that you don't understand, you may use a dictionary or ask someone for help. When you are ready to begin the test, use an appropriate telephone to call the telephone number printed on the test paper. When asked, enter the Test Identification Number using the buttons on your telephone keypad. You should take the test on your own by following the instructions given over the telephone. Relax, concentrate, and do your best. If you do not know how to respond to a test item, just be silent or say “I don’t know.”

Test Sections: The test has eight sections (Parts A, B, C, D, E, F, G, and H), as follows:

Part A: AVIATION READING. You will be asked to read three aviation-specific messages in a random order from among those printed in Part A. Read the messages out loud smoothly and naturally.

Part B: COMMON ENGLISH READING. You will be asked to read three sentences in a random order from among those printed in Part B. Read the sentences out loud smoothly and naturally.

Part C: REPEAT. You will hear 14 sentences in this section, one at a time. Repeat each sentence you hear, exactly as you hear it. If you can’t repeat the whole sentence, repeat as much of the sentence as you can. Example: When you hear: “My next flight is on Saturday.” You say: “My next flight is on Saturday.”

Part D: SHORT ANSWER QUESTION. In this section, you will hear 18 short questions. Answer each question with a single word or a short phrase of two or three words. Example: When you hear: “Where in the airplane do the pilots control the aircraft?” You say: “cockpit” or “in the cockpit”

Part E: READBACK. You will hear eight radio messages in this section, one at a time. The radio messages use ICAO phraseologies. When you hear a message, say an appropriate readback using ICAO phraseologies. Example: Cessna 29 (The call sign is printed on the test paper.) When you hear: “Cessna two niner, exit taxiway Hotel.” You say: “Exit taxiway Hotel, Cessna two niner.” Or: “Exiting taxiway Hotel, Cessna two niner.”



Part F: CORRECTIONS and CONFIRMATIONS. You will hear 3 radio communication exchanges between two speakers, Speaker 1 and Speaker 2. These speakers are a pilot and an air traffic controller. You need to respond to Speaker 2’s response as if you were Speaker 1. Speaker 2’s response may contain wrong information, a question, or a request. If the response contains wrong information, correct that information. If the response includes a question or request, respond appropriately. Depending on the situation, you may answer the questions using only ICAO phraseologies or a combination of ICAO phraseologies and common English. Example: East Global Air 295 (The call sign is printed on the test paper.) When you hear: (Speaker 1) “East Global Air 295, contact Atlanta Radar 122.15.” (Speaker 2) “Atlanta Radar 122.5, East Global Air 295.” You say: “East Global Air, negative, contact Radar 122.15.” Or: “East Global Air 295, I say again, 122.15.”

Part G: STORY-RETELLING. You will hear three stories. After each story, retell it as best you can in your own words in English. Each story will be read only once. After you hear a beep, you will have 30 seconds to retell the story.

Part H: OPEN QUESTIONS. You will hear one question. After you hear a beep, speak your opinion as fully and clearly as you can. You will have 30 seconds to answer. Express your opinion and supporting reasons in clear, coherent English. Try to speak for the whole 30-second period.

How the test is scored: The Aviation English Test scores are based on the exact words that you speak, as well as the pace, fluency, and pronunciation of those words as combined in phrases and sentences. Give quick, smooth, loud responses. Note that some test items have more than one correct answer. When you hear “Thank you for calling the Pearson Testing System” again, the test is complete; you may hang up.

Suggestions: Place your call to the Pearson testing system on a good telephone in a suitable location. Choose a location that is quiet and where you will not be interrupted. Hold the phone as shown in the figure below and speak in a loud, steady voice. Use a land-line, push-button telephone in good working order that is set to “tone” (not “pulse”). Newer phones are generally better than older phones. Do not use a cordless, cellular, or VOIP phone.

Tel (practice test): +1 415 738 3800. Scores for the practice tests can be viewed at www.Pearson.com by entering the Test Identification Number (TIN) in the field on the right side of the page and clicking Get Score.


11.2 Briefing Instructions for ICAO Aviation English Testing

Welcome to the ICAO Aviation English proficiency evaluation. Please read these instructions carefully and ask the test invigilator for clarification on any items that you do not fully understand. The test will take approximately 30 minutes to complete. The Jeppesen English proficiency evaluation is presented by computer over the telephone. It is designed to assess your ability to understand and use ICAO phraseologies as well as common English. The test is intended to measure the six subskills in the ICAO language proficiency scale (Pronunciation, Structure, Vocabulary, Fluency, Comprehension, and Interactions).

Procedure: First, take time to read the whole test paper. If there are words or sentences that you don't understand, you may use a dictionary or ask the test invigilator for help. You should take the test by following the instructions given. Relax, concentrate, and do your best. If you do not know how to respond to a test item, just be silent or say “I don’t know.”

Test Sections: The test has eight sections (Parts A, B, C, D, E, F, G, and H), as follows:

Part A: AVIATION READING. You will be asked to read three aviation-specific messages in a random order from among those printed in Part A. Read the messages out loud smoothly and naturally.

Part B: COMMON ENGLISH READING. You will be asked to read three sentences in a random order from among those printed in Part B. Read the sentences out loud smoothly and naturally.

Part C: REPEAT. You will hear 14 sentences in this section, one at a time. Repeat each sentence you hear, exactly as you hear it. If you can’t repeat the whole sentence, repeat as much of the sentence as you can. Example: When you hear: “My next flight is on Saturday.” You say: “My next flight is on Saturday.”

Part D: SHORT ANSWER QUESTION. In this section, you will hear 18 short questions. Answer each question with a single word or a short phrase of two or three words. Example: When you hear: “Where in the airplane do the pilots control the aircraft?” You say: “cockpit” or “in the cockpit”

Part E: READBACK. You will hear eight radio messages in this section, one at a time. The radio messages use ICAO phraseologies. When you hear a message, say an appropriate readback using ICAO phraseologies. Example: Cessna 29 (The call sign is printed on the test paper.) When you hear: “Cessna two niner, exit taxiway Hotel.” You say: “Exit taxiway Hotel, Cessna two niner.” Or: “Exiting taxiway Hotel, Cessna two niner.”

Part F: CORRECTIONS and CONFIRMATIONS. You will hear 3 radio communication exchanges between two speakers, Speaker 1 and Speaker 2. These speakers are a pilot and an air traffic controller. You need to respond to Speaker 2’s response as if you were Speaker 1. Speaker 2’s response may contain wrong information, a question, or a request. If the response contains wrong information, correct that information. If the response includes a question or request, respond appropriately. Depending on the situation, you may answer the questions using only ICAO phraseologies or a combination of ICAO phraseologies and common English. Example: East Global Air 295 (The call sign is printed on the test paper.) When you hear: (Speaker 1) “East Global Air 295, contact Atlanta Radar 122.15.” (Speaker 2) “Atlanta Radar 122.5, East Global Air 295.” You say: “East Global Air, negative, contact Radar 122.15.” Or: “East Global Air 295, I say again, 122.15.”


Part G: STORY-RETELLING. You will hear three stories. After each story, retell it as best you can in your own words in English. Each story will be read only once. After you hear a beep, you will have 30 seconds to retell the story.

Part H: OPEN QUESTIONS. You will hear one question. After you hear a beep, speak your opinion as fully and clearly as you can. You will have 30 seconds to answer. Express your opinion and supporting reasons in clear, coherent English. Try to speak for the whole 30-second period.

How the test is scored: The Aviation English Test scores are based on the exact words that you speak, as well as the pace, fluency, and pronunciation of those words as combined in phrases and sentences. Give quick, smooth, loud responses. Note that some test items have more than one correct answer. When you hear “Thank you for calling the Pearson Testing System” again, the test is complete.

End

11.3 Briefing Instructions Declaration

I hereby declare that I have been thoroughly briefed and understand the instructions and conditions of the Aviation English proficiency evaluation which I am about to take.

Full Name: _____________________________________
Airline: ________________________________________
Staff Number: __________________________________
Test Identification Number: ________________________
Signature: ______________________________________
Date: ______________ Location: ___________________
Witness/Invigilator: ______________________________