
Language Testing
http://ltj.sagepub.com/

Ofra Inbar-Lourie
Constructing a language assessment knowledge base: A focus on language assessment courses
Language Testing 2008 25: 385
DOI: 10.1177/0265532208090158
The online version of this article can be found at: http://ltj.sagepub.com/content/25/3/385

Published by SAGE Publications: http://www.sagepublications.com



Constructing a language assessment knowledge base: A focus on language assessment courses

Ofra Inbar-Lourie, Tel-Aviv University & Beit Berl Academic College, Israel

The competencies required for conducting assessment in the educational context have recently been reformulated in view of social constructivist perspectives and the acknowledgement of the role of classroom assessment in promoting learning. These changes have impacted the knowledge base language assessors need to obtain, and hence the contents of language assessment courses. This paper considers the possible components of a language assessment literacy knowledge base, and proposes the establishment of a core knowledge framework for courses in language assessment.

Keywords: assessment culture, assessment literacy, language assessment courses, language assessment literacy, language testing courses

Testing concepts have until recently formed the framework within which various facets of evaluating language knowledge/ability were researched and discussed. This is clearly reflected in the name of this very journal, now celebrating its anniversary. It is also language testing experts who have been disseminating knowledge in courses on the theoretical and practical aspects of language testing the world over. The last few years, however, have seen the introduction of assessment terminology into the language evaluation research discussion, signaling not merely a semantic change but a profound conceptual one, with assessment perceived to be an overarching term used to refer to 'all methods and approaches to testing and evaluation whether in research studies or educational contexts' (Kunnan, 2004, p. 1). With this broader understanding comes a recognition of the need for multiple forms of assessment for collecting information for various purposes in diverse contexts (Huerta-Macias, 1995).

    Address for correspondence: Ofra Inbar-Lourie, School of Education, Tel-Aviv University, Tel-Aviv 69978, Israel; email: [email protected]

© 2008 SAGE Publications (Los Angeles, London, New Delhi and Singapore) DOI: 10.1177/0265532208090158


However, this conceptual shift goes beyond notions of alternative assessment (or 'alternatives in assessment', Brown & Hudson, 1998), perceiving the language evaluation process as a socially constructed activity embedded in the local context, with teachers, students and other community members recognized as meaningful assessment partners (Leung, 2004; Lynch, 2001; Lynch & Shaw, 2005; McNamara & Roever, 2006).

This change of thinking has significant implications for the knowledge base in the field and hence for courses that have hitherto been referred to as language testing courses and lately include titles which incorporate an assessment perspective (such as 'Language Assessment and Evaluation', Read & Erlam, 2007; or 'Assessment in the Language Classroom', O'Loughlin, 2006). While some available courses have retained their language testing orientation and deal mostly with issues pertaining to the design and use of tests as the means for determining language proficiency (Bailey & Brown, 1995, and see Brown & Bailey, this issue), others incorporate additional assessment components, as well as an examination of the social roles of tests and testers in the assessment process (for example, Kleinsasser, 2005).

This paper will focus on the knowledge base in courses in language assessment geared primarily for language teachers but also for other stakeholders who oversee the language assessment process in educational contexts. The term 'assessment', and hence 'language assessment courses', will be used hereafter to distinguish such courses from those operating with a more traditional language testing paradigm.

The paper will begin by surveying the components of assessment literacy in general education and the implications this holds for language assessment literacy. It will then focus on the crux of the issue, i.e., the assessment skills and understandings currently perceived as vital for conducting language assessment in educational settings. These competencies will then be linked to research on language testing and assessment courses, culminating with a proposal for establishing a core language assessment knowledge base.

I Assessment literacy

Views about the knowledge base required for conducting assessment in education have been influenced by socio-cultural approaches towards teaching and learning and the growing recognition of the social role of assessment in education (Broadfoot, 1996; 2005; National Research Council, 2001).


In a review of educational learning theory, from behaviorist to cognitive social-constructivist models, Shepard (2000) observes the gap between psychometric testing environments, particularly those employing high-stakes standardized tests (i.e., 'testing cultures', Wolf, Bixby, Glenn, & Gardner, 1991), and contemporary learning environments which follow Vygotskian theories and are referred to as 'learning cultures'. Learning cultures are grounded in interpretive epistemology, which views reality as the subject of social construction. Learning is perceived as a culturally situated and mediated activity, with language occupying a fundamental role as a mediating tool (Lantolf, 2000). The learning community jointly co-constructs knowledge, and teachers' feedback plays a central role in supporting and promoting students' language learning (for a detailed description of the constructivist class, see Windschitl, 2002).

The assessment environment congruent with learning cultures is referred to as an 'assessment culture' (Dochy & Segers, 2001; Shepard, 2000). Experts and practitioners who function within an assessment culture share epistemological suppositions about the dynamic nature of knowledge, as well as assumptions about students, teaching and learning. Students are viewed as active, empowered partners in the assessment process who monitor their own learning, provide feedback to their peers and set criteria for evaluating progress. The teacher's role is geared to formulating and scaffolding learning on the basis of on-going feedback from internal and external assessment sources. Learning development is monitored and recorded regularly, focusing on both the process and product dimensions, and student evaluation is provided in the form of a profile rather than a numerical score (Birenbaum, 1996; Wolf et al., 1991). Seeing that learning and assessment are viewed as intertwined, assessment culture highlights the notion of 'assessment for learning', emphasizing formative assessment practices (Black & Wiliam, 1998) and introducing suitable assessment tools such as Dynamic Assessment (Lantolf & Poehner, 2008).

Conducting assessment within this constructivist, contextually-situated framework necessitates internalization of assumptions and beliefs about assessment as a social practice and a social product (Filer, 2000), and awareness of the possible intended or unintended consequences particular assessment measures and actions may have for specific groups and individuals. It also requires appreciation of the role of assessment in the instructional-learning cycle (Brookhart, 2003; Gipps, 1994), and underscores the teacher's dual (and often fuzzy) role as both teacher and assessor of curriculum attainment, in addition to being (in the case of the language teacher) 'a facilitator of language development' (Rea-Dickins, 2007, p. 193).


Since assessment also operates on the external institutional level for ranking, monitoring and placement purposes, teachers and administrators are expected to be familiar with external assessment formats, assessment procedures and data analysis, so as to interpret the results and feed them into their teaching. They also need to gain understanding of the competing and often contradictory forces at play between the testing and assessment cultures. This is especially noticeable in contexts where practitioners function simultaneously within two non-compatible cultures: encouraged within their classrooms to pursue socio-culturally based classroom pedagogy and assessment practices (often referred to as alternative assessment, Hargreaves, Earl & Schmidt, 2002), while concurrently required by external authorities to abide by the rules of testing cultures (McKay & Brindley, 2007). Many issues in this dual, complex reality are far from resolved. Teasdale and Leung (2000) analyze how this mixed discourse has contributed to a muddled view of the meanings attributed to assessment (p. 176). This is most prominent in the area of validity, and the debate centers on whether to employ positivist forms of inquiry or broaden the scope, using methodologies that are compatible with the assumptions that underlie interpretive assessment cultures (Lynch & Shaw, 2005).

In view of these changed perspectives, a reformulation of the competencies needed for conducting assessment in the educational context is called for. McMillan (2000) lists eight assessment principles currently deemed fundamental for teachers and school administrators, attempting to form a comprehensive body of knowledge which takes into account the different perspectives and allows teachers and administrators access to the issues at hand. The principles include reference to the potential tensions which impact decision-making in assessment, that is, tensions between formative and summative assessment, between criterion- and norm-referenced approaches, between traditional and alternative assessment formats, and between external standardized testing versus classroom tests. Additional principles refer to the formative role of assessment in instruction, to the importance of using multiple means for assessing learners, and to the need for fair, ethical, valid, and at the same time efficient and feasible, assessment practices (McMillan, 2000).

Similar views are reflected in the Standards for Teacher Competence in Educational Assessment of Students (followed by Standards for Educational Administrators), published already in 1990 by the American Federation of Teachers, the National Council on Measurement in Education and the National Education Association.


The standards refer to activities occurring prior to, during and following instruction, involving decision-making in the school and school district, as well as in a wider community of educators. They range from the ability to choose and develop assessment tools to match instruction to the skillful administration, scoring and interpretation of externally and internally administered assessment procedures. They also incorporate standards for utilizing assessment data for various teaching purposes, for reporting results to different parties, and for recognizing 'unethical, illegal, and otherwise inappropriate assessment methods and uses of assessment' (1990, p. 7).

In order to operationalize and implement this conceptual framework one needs to gain literacy in assessment concepts, skills and strategies. Assessment literacy (Boyles, 2005; Malone, 2008; Stoynoff & Chapelle, 2005) is the ability to understand, analyze, and apply information on student performance to improve instruction (Falsgraf, 2005). Becoming assessment literate requires the attainment of a toolbox of competencies, some practical and some theoretical, on why, when and how to go about constructing a variety of assessment procedures (Boyles, 2005; Hoyt, 2005). Being literate in assessment thus means having the capacity to ask and answer critical questions about the purpose for assessment, about the fitness of the tool being used, about testing conditions, and about what is going to happen on the basis of the results. The manner and process of acquiring assessment literacy, similar to professional development initiatives in socio-cultural pedagogy (see Teemant, Smith, Pinnegar & Egan, 2005), assume a constructivist learning approach, whereby the participants and assessment experts form a knowledge community by discussing, critiquing and questioning fundamental issues relevant to their context.

The concepts and derived assessment skills outlined above are perceived to be core competencies in the area of assessment. The question is what additional knowledge language assessors require, and whether or to what extent such knowledge is incorporated into language assessment courses.

II Language assessment literacy

The very existence of language assessment courses indicates that expertise in language assessment requires additional competencies. The language assessment knowledge base in fact comprises layers of assessment literacy skills combined with language-specific competencies, forming a distinct entity that can be referred to as language assessment literacy (henceforth, LAL).


Some of the LAL competency dimensions focus on the trait, or the 'what' of language testing and assessment, while others relate to the method, the 'how' (Shohamy, 2008). However, understanding the 'what' and performing the 'how' necessitates appreciation of the background and reasoning behind the actions taken, that is, the 'why'. Each of these aspects is rooted in language-related considerations as well as in general education and assessment and testing cultures.

Discussion of LAL needs to be considered with reference to current assessment developments, in particular the support for assessment-for-learning approaches in many parts of the world (Assessment Reform Group, 2002; Leung, 2004; Davison, 2007). Language assessors, particularly teachers, are expected to engage in classroom assessment practices, report on learners' progress aligned with external criteria, as well as prepare learners for external examinations. To comply with these demands, Brindley (2001a) offers an outline for programs for professional development in language assessment, which focuses on the knowledge components required for conducting language assessment in an educational context. The outline is modular in that it acknowledges different assessment needs, some of which are regarded as core and some as optional, and provides a useful framework for considering the components of the LAL knowledge base. The review that follows uses the outline as a basis for analyzing and discussing LAL competencies in relation to the assessment knowledge dimensions mentioned above: the reasoning or rationale for assessment (the 'why'), the description of the trait to be assessed (the 'what'), and the assessment process (the 'how').

1 The why

The first core module in Brindley's proposal provides the background and rationale for assessment. The focal point is the social, educational, and political aspects of assessment in the wider community, looking at questions of accountability, standards, ethics and the role in society of standardized competitive examinations and tests (Brindley, 2001a, p. 129). This social perspective concurs with what McNamara (2006) refers to as the 'social turn' that the language assessment field has taken in the last decade. It implies epistemological shifts regarding knowledge construction as well as critical views on the role of language tests in society, on the need to democratize assessment and on the responsibility of language testers (Kunnan, 2000; Lynch, 2001; McNamara & Roever, 2006; Shohamy, 2001).


Lynch and Shaw (2005) illustrate some of these issues by framing their discussion on the use of the portfolio within an assessment paradigm that is based on different suppositions than those which typify traditional testing cultures. The paradigm offered is that of an assessment culture, with the concepts of validity and ethics analyzed in terms of power relations.

However, it is important to note that the concepts of language or language assessment are not explicitly referred to in Brindley's outline as central social themes in the listings of topics to be discussed within this module. I would like to argue that the role of language evaluation in impacting on decision-making in various areas (e.g. civil, vocational, educational) needs to be accentuated and critically discussed in this initial phase of the professional development and knowledge attainment and construction of future language assessors. This is crucial, for it will foreground and shape perceptions of language as a social entity in terms of both the trait to be assessed and the assessment process.

2 The what

The second core module in the program (Brindley, 2001a), entitled 'Defining and describing proficiency', presents the trait, the theoretical basis for language tests and assessment, including the concepts of validity and reliability and a critical evaluation of models of language knowledge. In practice this implies that language assessors are expected to be well versed in current theories and research findings regarding various facets of language knowledge and use, so as to skillfully implement assessment measures that are compatible with these current perspectives. For example, in the case of second language learners, language assessors need to be aware of current evidence regarding the role of the first language and culture in acquiring additional languages (Cummins, 2001). Such knowledge will guide them in ensuring that the needs of immigrant students are aptly accommodated in testing situations (Solano-Flores & Trumbul, 2003). Also pertinent to making and implementing decisions in language assessment are current debates about the norms of English as an International Language (EIL) (Canagarajah, 2006; Elder & Davies, 2006) and the linguistic competence of the multilingual speaker (Canagarajah, 2007). One could reasonably argue that these topics should form part of the knowledge base of any informed language expert or practitioner. However, expertise in LAL requires, in addition, that these theories, approaches and controversies be grafted onto competencies in assessment.


Furthermore, since present assessment paradigms strive for integration with teaching, knowledge about current language teaching pedagogy (in contrast with general pedagogical skills) is also part of LAL. Language assessors need to be familiar with contemporary theories about the learning, teaching and assessment of grammar, for example, so as to be able to design suitable assessment measures (Purpura, 2004). Likewise, knowledge about integrated language and content models (e.g., Kaufman & Crandall, 2005; Met, 1999), and the specific assessment considerations such curricular models call for, is essential for designing integrated content-based tasks which adequately reflect this construct (Byrnes, 2008; Cushing Weigle & Jensen, 1997).

3 The how

The method or 'how' angle in the Brindley model is introduced in two non-core modules, each branching off in a different direction. Participants can choose (according to their context and needs) between taking the 'Constructing and evaluating language tests' module, which focuses on test development and analysis, or a criterion-referenced focus in the module 'Assessment in the language curriculum'. Each module details the skills learners will gain so as to perform the relevant assessment or testing process using different assessment procedures. Since the modules are optional, some of the participants may acquire knowledge in only one area, thus being denied the full range of possibilities that the language assessment field offers. The underlying assumption is that certain audiences typically engage in one type of assessment environment (in the case of teachers this would probably mean assessment rather than tests), and therefore it suffices to choose only one of the modules. The question is whether such a policy is valid. The division offered can be likened to choosing between either quantitative or qualitative research paradigms rather than acquiring initial understandings and skills in both their underlying assumptions. Since at present both research traditions are being employed in most disciplines, including in the language assessment domain (where qualitative research is also in use in large-scale assessment; see, for example, Taylor, 2007), restricting the knowledge base to a single tradition would be considered problematic, particularly in view of mixed-method research designs.

Stakeholders functioning in language assessment cultures need to attain knowledge about the different facets of language assessment, both large-scale examinations and classroom assessment, to deliver appropriate assessment and interpret assessment outcomes (Inbar-Lourie, 2008).


The extent of the knowledge gained may differ and vary in scope and intensity depending on individual circumstances and needs, as well as personal inclinations and beliefs. In addition, partaking in the critical appraisal and interpretation of the discourse that characterizes various forces in internal and external assessment (Brindley, 2001b; Johnson & Kress, 2003; Leung & Rea-Dickins, 2007) also necessitates a sound, comprehensive knowledge base in the different assessment orientations.

The fifth and last component mentioned in the Brindley model extends beyond assessment and testing techniques, to include only those professionals whose context requires delving into more advanced planning and exploration of assessment initiatives and research (Brindley, 2001a, pp. 129–130). However, what is currently evident from reports on the implementation of language assessment reforms is that classroom practitioners are also expected to actively participate in assessment initiatives for evaluating language ability. This is apparent, for example, in the Common European Framework of Reference for Languages: Learning, Teaching, Assessment (CEFR) (Council of Europe, 2001), designed to act as 'a frame of reference in terms of which different qualifications can be described, different language learning objectives can be identified, and the basis of different achievement standards can be set out' (Morrow, 2004, p. 7). Functioning within the framework assumes not merely familiarity with its different levels and descriptors, with language teaching skills and assessment know-how. It also presumes that language assessors (in many cases teachers) can adapt the framework for their teaching and assessment purposes (Little, 2005), and thus function at what Brindley (2001a) characterizes as an advanced and more specialized level.

The Brindley (2001a) proposal discussed at length in the preceding paragraphs offers interesting insights and guidelines. It sets out core competencies and allows for expansion of the assessment knowledge base beyond what is normally included in a core assessment course. Similar principles are apparent in Fulcher and Davidson (2007), whose advanced resource book is divided into an 'Introduction' section, which presents key concepts in language testing and assessment, an 'Extension' of these issues and concepts, and a third section entitled 'Exploration' which builds on the knowledge gained in the first two parts.

To sum up, though formed on the assessment literacy knowledge base, LAL can be said to constitute a unique, complex entity.


The fact that it is concerned specifically with language means that considerations of assessment purpose, trait and method must be understood in light of contemporary theories in language-related areas. The next section explores whether or to what extent these competencies in fact form part of language assessment courses.

III Language assessment courses

Clearly the general purpose of language testing and assessment courses is to train language experts in language assessment concepts, skills and strategies, or in LAL. However, language assessment courses do not come in one shape or size. The target audience differs considerably, as do the course contents. The course may be intended for teachers (practicing or in training) who are responsible for both teaching and assessment in an on-going manner, or for researchers and testing experts. What is known about the nature and contents of language assessment courses? Not very much, as little research has been targeted specifically at the courses and their objectives and contents.

One of the few research studies on language testing courses was conducted by Bailey and Brown in 1995, specifically targeting lecturers teaching language testing courses. The survey questionnaire used related to a number of issues: the hands-on experience students get in the course; the general topics attended to; item analysis; descriptive statistics; test consistency; test validity; textbooks used; and perceived students' attitudes towards the testing course. The topics in each section were presented from a measurement orientation, presumably reflecting the researchers' views on what language testing courses are likely to incorporate. All the items in the hands-on experience section, for example, focus on different stages of the testing process. Findings for that section showed that the activity attended to most was test critique, and then, ordered by frequency of use, were item writing, interpreting test scores, revising and then scoring tests, administering tests and test taking. Additional findings showed that measuring the different skills is attended to most frequently, followed by proficiency testing (Bailey & Brown, 1995).
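To make concrete the kind of hands-on work that course topics such as 'item analysis' and 'descriptive statistics' typically involve, the following minimal sketch computes two classical item statistics (item facility and a discrimination index) for dichotomously scored test items. It is purely illustrative and not drawn from the Bailey and Brown survey; the data, function name and choice of statistics are assumptions made for the example.

```python
# Illustrative sketch (not from the article): classical item analysis for
# a matrix of 0/1-scored responses, rows = test takers, columns = items.
import numpy as np

def item_analysis(responses: np.ndarray) -> list:
    """Return facility and discrimination statistics for each item."""
    totals = responses.sum(axis=1)
    stats = []
    for i in range(responses.shape[1]):
        item = responses[:, i]
        rest = totals - item              # total score excluding this item
        facility = float(item.mean())     # proportion answering correctly
        # point-biserial discrimination: correlation of item with rest score
        discrimination = float(np.corrcoef(item, rest)[0, 1])
        stats.append({"item": i + 1,
                      "facility": round(facility, 2),
                      "discrimination": round(discrimination, 2)})
    return stats

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scored = (rng.random((30, 5)) > 0.4).astype(int)  # hypothetical data
    for row in item_analysis(scored):
        print(row)
```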

The questionnaire, composed more than a decade ago, does not consider any form of assessment other than tests, nor other issues relevant to testing or assessment. In that sense it differs from the standards for assessment mentioned above (1990), which incorporate the use of a variety of assessment tools as well as reference to ethical conduct.


Furthermore, the only mention of language-related competencies in the questionnaire is 'measuring the different skills' and 'proficiency testing' (p. 253), while all the other topics are applicable to general courses in educational testing, thus relinquishing to a great extent claims that would support the recognition of language testing as a distinct construct. The researchers do in fact note the Standards for Student Assessment for Teachers mentioned above, commenting that 'Although the items in the questionnaire were not developed with these standards in mind, it behooves us, as a profession, to consider drafting or adapting a similar set of standards dealing specifically with language assessment' (Bailey & Brown, 1995, p. 250).

While the Bailey and Brown study surveyed a substantial number of respondents about their language testing courses, researchers in two more recent studies report directly and in more detail on their own language assessment course. The methodology used in both cases is narrative analysis, but from different perspectives: while one study (O'Loughlin, 2006) explores the course from the point of view of two of the course participants (using their written contributions on an online forum), the second reflects on the experience of transforming the course from the instructor's perspective (Kleinsasser, 2005). The descriptions of both courses allow insight into the course objectives and contents as well as the manner in which the courses were conducted.

In the case of the O'Loughlin (2006) study, the course investigated was a postgraduate TESOL course in Australia entitled 'Assessment in the Language Classroom'. The course goals were to enable students to develop (a) a sound understanding of key concepts in second language assessment; (b) their ability to critically evaluate existing assessment documents; and (c) their capacity to design or adapt assessment instruments for their particular teaching contexts (p. 73). The course included both practical components and discussion of conceptual themes, such as social issues in language testing. The researcher reports that two of the students, whose narratives form the research focus of the study, both attained the course objectives. However, differences emerged as to their willingness and capacity to embrace new ideas in the area of language assessment. These differences are attributed to personal background and professional experience and context, emphasizing the need to consider the learners' diverse cultural background and experiences when planning and conducting the course (O'Loughlin, 2006). Hence the approach that is advocated in this research with regard to teaching language assessment courses is a learner-centered one.

The second study, by Kleinsasser (2005), provides an insightful description of how the language testing and assessment course that had been taught for a number of years in a Master of Arts language program was transformed from a content-focused to a learner-centered and teaching-content-focused course.


The course put into practice constructivist notions of student involvement at every stage, with teacher mediation in the form of on-going negotiation. The course contents included a wide range of topics, ranging from hands-on item writing, construct validity, providing feedback, developing standards and assessing reasoning skills, to notions of impact and critical perspectives on language testing, and, as Kleinsasser notes (p. 83), changing thinking 'from the use of psychometrics to a focus on (second language) educational assessment'. Through the discussions, critiquing and writing of testing and assessment items, a professional community of practice was formed: 'Our community was interpreting, expressing and negotiating meaning about the practices and theories related to assessment materials while experiencing the non-linear process of assessment materials development' (p. 90). The final course evaluation was also a joint, collaborative venture, with the students taking an active part in the decision-making process, including the final course grade.

The two research studies (Kleinsasser, 2005; O'Loughlin, 2006) report on courses which appear to incorporate LAL themes. What is particularly striking, especially in Kleinsasser (2005), is the manner in which learning was conducted and knowledge constructed, exemplifying traits associated with both learning and assessment cultures.

IV Acquiring language assessment literacy: A common framework

Acquiring expertise in the different aspects of language assessment is clearly a multifaceted process of learning about and mastering diverse competencies in different areas, as was demonstrated above. Course instructors thus face the difficult dilemma of choosing and prioritizing the issues which will be included in or excluded from the course syllabus. Based on the above discussion of LAL, it is suggested that language assessment courses focus on learning, negotiating, discussing, experiencing and researching a core language assessment framework. This framework construes language assessment not as a collection of assessment tools and forms of analyses, but rather as a body of knowledge and research grounded in theory and epistemological beliefs, and connected to other bodies of knowledge in education, linguistics and applied linguistics.


The core competencies will reflect current views about the social role of assessment in general and language assessment in particular and contemporary views about the nature of language knowledge, and give due emphasis to both classroom and external assessment practices. The focus and intensity would vary and depend on the target audience, but an introduction to the core components will be obtained by all participants, including discussion of some of the unresolved controversies and tensions in the field.

To borrow a language learning metaphor, mastering proficiency in the discrete skills of language assessment (i.e., specific item types or data analysis procedures) will not necessarily lead to acquiring LAL. On the other hand, integration of the conceptual basics in the educational and language-related zones will enable would-be language assessors to gain an initial footing in the field and speak the language of assessment. Additional competencies can be developed, refined and elaborated upon for specific purposes, such as constructing high-stakes assessment instruments or setting and implementing language assessment initiatives. The life-long learning approach which characterizes learning in this era (Kalantzis, Cope & Harvey, 2003) is applicable to language assessors who, similar to other professional groups, need to constantly inform and update their knowledge in the area of measurement and language theories vis-à-vis new research findings, new technological innovations and approaches in related areas.

A report by the Committee on Assessment and Evaluation in Education in the National Israeli Academy for Sciences and Humanities (2005) establishes a common framework or knowledge base in assessment and evaluation. The framework is based on shared concepts intended to guide the development of curricula and professional development programs in the area. The core framework also branches off to outline the specific knowledge base required for different professional groups: teachers, principals, assessment and evaluation coordinators, program evaluators and psychometricians. The knowledge base includes issues in educational assessment and large-scale assessment, statistics, ethical issues, and program evaluation. The framework details precisely which of the knowledge components need to be acquired as declarative or procedural knowledge, with different levels of proficiency or mastery in each.

Considering the major changes that have occurred in the last decade in educational and language assessment, perhaps it is time that the language testing and assessment profession takes a similar initiative.


The time is ripe for revisiting the notion of drafting or adapting a set of standards for language assessment competencies, as was expressed by Bailey and Brown in their 1995 paper. Such core competencies would be intended for language assessment professionals in educational contexts, and may also detail particular skills for specific capacities. Some of the skills can be integrated with other program components in language education programs and with research studies in various areas in applied linguistics and language pedagogy. Such an initiative will greatly facilitate the meaningful construction of language assessment courses and make a major contribution to the field at large.

V References

Assessment Reform Group. (2002). Assessment for learning: Ten principles. Available at: http://arg.educ.cam.ac.uk/CIE3.pdf
Bailey, K. M. & Brown, J. D. (1995). Language testing courses: What are they? In A. Cumming & R. Berwick (Eds.), Validation in language testing (pp. 236–256). Clevedon, UK: Multilingual Matters.
Birenbaum, M. (1996). Assessment 2000: Towards a pluralistic approach to assessment. In M. Birenbaum & F. Dochy (Eds.), Alternatives in assessment of achievements, learning processes and prior knowledge (pp. 3–29). Boston, MA: Kluwer Academic.
Black, P. & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan International. www.pdkintl.org/kappan/kbla9810.htm
Boyles, P. (2005). Assessment literacy. In M. Rosenbusch (Ed.), National Assessment Summit Papers (pp. 1–15). Ames, IA: Iowa State University. http://www.nflrc.iastate.edu/nva/newsite/ciaa/assessment_papers_Boyles.html
Brindley, G. (2001a). Language assessment and professional development. In C. Elder, A. Brown, K. Hill, N. Iwashita, T. Lumley, T. McNamara, & K. O'Loughlin (Eds.), Experimenting with uncertainty: Essays in honour of Alan Davies (pp. 126–136). Cambridge: Cambridge University Press.
Brindley, G. (2001b). Outcomes-based assessment in practice: Some examples and emerging insights. Language Testing, 18(4), 393–408.
Broadfoot, P. (1996). Education, assessment and society. Buckingham: Open University Press.
Broadfoot, P. (2005). Dark alleys and blind bends: Testing the language of learning. Language Testing, 22(2), 123–141.
Brookhart, S. M. (2003). Developing measurement theory for classroom assessment purposes and uses. Educational Measurement: Issues and Practice, 22(4), 5–13.
Brown, J. D. & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32(4), 653–675.
Byrnes, H. (2008). Assessing content and language. In E. Shohamy (Ed.), Language testing and assessment (Vol. 7), N. Hornberger (General Editor), Encyclopedia of language and education (2nd ed., pp. 37–52). New York: Springer Science and Business Media.
Canagarajah, S. (2006). Changing communicative needs, revised assessment objectives: Testing English as an international language. Language Assessment Quarterly, 3(3), 229–242.
Canagarajah, S. (2007). The ecology of global English. International Multilingual Research Journal, 1(2), 89–100.
Committee on Assessment and Evaluation in Education (2005). The knowledge base for assessment and evaluation in education. A framework for curricula: Academic studies and professional development programs. Israel Academy of Science and Humanities. http://www.academy.ac.il/data/projects/34/Assessment_and_Evaluation.pdf
Council of Europe (2001). Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge: Cambridge University Press.
Cummins, J. (2001). Negotiating identities: Education for empowerment in a diverse society (2nd ed.). Los Angeles: California Association for Bilingual Education.
Cushing Weigle, S. & Jensen, L. (1997). Issues in assessment for content-based instruction. In M. A. Snow & D. Brinton (Eds.), The content-based classroom: Perspectives on integrating language and content (pp. 201–12). White Plains, NY: Longman.
Davison, C. (2007). Views from the chalkface: English language school-based assessment in Hong Kong. Language Assessment Quarterly, 4(1), 37–68.
Dochy, F. & Segers, M. (2001). Using information and communication technology (ICT) in tomorrow's universities and using assessment as a tool for learning by means of ICT. In H. J. Van der Molen (Ed.), Virtual university? Educational environments of the future (pp. 67–83). London: Portland.
Elder, C. & Davies, A. (2006). Assessing English as a lingua franca. Annual Review of Applied Linguistics, 26, 282–304.
Falsgraf, C. (2005, April). Why a national assessment summit? New Visions in Action, National Assessment Summit. Meeting conducted in Alexandria, VA. http://www.nflrc.iastate.edu/nva/worddocuments/assessment_2005/pdf/nsap_introduction.pdf
Filer, A. (Ed.). (2000). Assessment: Social practice and social product. London: RoutledgeFalmer.
Fulcher, G. & Davidson, F. (2007). Language testing and assessment: An advanced resource book. London: Routledge.
Gipps, C. (1994). Beyond testing: Towards a theory of educational assessment. London: Falmer Press.
Hargreaves, A., Earl, L. & Schmidt, M. (2002). Perspectives on alternative assessment reform. American Educational Research Journal, 39(1), 69–95.
Hoyt, K. (2005, April). Assessment: Impact on instruction. New Visions in Action, National Assessment Summit, Alexandria, VA.
Huerta-Macias, A. (1995). Alternative assessment: Responses to commonly asked questions. TESOL Journal, 5(1), 8–11.
Inbar-Lourie, O. (2008). Language assessment culture. In E. Shohamy (Ed.), Language testing and assessment (Vol. 7), N. Hornberger (General Editor), Encyclopedia of language and education (2nd ed., pp. 285–300). New York: Springer Science and Business Media.
Johnson, D. & Kress, G. (2003). Globalisation, literacy and society: Redesigning pedagogy and assessment. Assessment in Education, 10(1), 5–14.
Kalantzis, M., Cope, B. & Harvey, A. (2003). Assessing multiliteracies and the new basics. Assessment in Education, 10(1), 15–26.
Kaufman, D. & Crandall, J. (2005). Standards and content-based instruction: Transforming language education in primary and secondary schools. In D. Kaufman & J. Crandall (Eds.), Content-based instruction in primary and secondary school settings: Case studies in TESOL (pp. 1–7). Alexandria, VA: TESOL Publications.
Kleinsasser, R. C. (2005). Transforming a postgraduate level assessment course: A second language teacher educator's narrative. Prospect, 20, 77–102.
Kunnan, A. J. (Ed.). (2000). Fairness and validation in language assessment: Selected papers from the 19th Language Testing Research Colloquium, Orlando, Florida. Cambridge, UK: Cambridge University Press.
Kunnan, A. J. (2004). Regarding language assessment. Language Assessment Quarterly, 1(1), 1.
Lantolf, J. P. (2000). Second language learning as a mediated process. Language Teaching, 33(2), 79–96.
Lantolf, J. P. & Poehner, M. E. (2008). Dynamic assessment. In E. Shohamy (Ed.), Language testing and assessment (Vol. 7), N. Hornberger (General Editor), Encyclopedia of language and education (2nd ed., pp. 273–284). New York: Springer Science and Business Media.
Leung, C. (2004). Developing formative teacher assessment: Knowledge, practice and change. Language Assessment Quarterly, 1(1), 19–41.
Leung, C. & Rea-Dickins, P. (2007). Teacher assessment as policy instrument: Contradictions and capacities. Language Assessment Quarterly, 4(1), 6–36.
Little, D. (2005). The Common European Framework and the European Language Portfolio: Involving learners and their judgment in the assessment process. Language Testing, 22(3), 321–336.
Lynch, B. K. (2001). Rethinking assessment from a critical perspective. Language Testing, 18(4), 351–372.
Lynch, B. & Shaw, P. (2005). Portfolios, power and ethics. TESOL Quarterly, 39(2), 263–297.
Malone, M. E. (2008). Training in language assessment. In E. Shohamy (Ed.), Language testing and assessment (Vol. 7), N. Hornberger (General Editor), Encyclopedia of language and education (2nd ed., pp. 225–240). New York: Springer Science and Business Media.
McKay, P. & Brindley, G. (2007). Educational reform and ESL assessment in Australia: New reforms, new tensions. Language Assessment Quarterly, 4(1), 69–84.
McMillan, J. H. (2000). Fundamental assessment principles for teachers and school administrators. Practical Assessment, Research & Evaluation, 7(8). Retrieved November 9, 2007 from http://PAREonline.net/getvn.asp?v=7&n=8
McNamara, T. (2006). Second language testing and assessment: Introduction. In E. Hinkel (Ed.), Handbook of research in second language teaching and learning (pp. 775–778). Mahwah, NJ: Lawrence Erlbaum Associates.
McNamara, T. & Roever, C. (2006). Language testing: The social dimension. Oxford: Blackwell Publishing.
Met, M. (1999). Content-based instruction: Defining terms, making decisions. College Park, MD: National Foreign Language Center.
Morrow, K. (Ed.) (2004). Insights from the Common European Framework. Oxford: Oxford University Press.
National Research Council (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment, J. Pelligrino, N. Chudowsky, & R. Glaser (Eds.), Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press. http://www.nap.edu/openbook.php?record_id=10019&page=R1
O'Loughlin, K. (2006). Learning about second language assessment: Insights from a postgraduate student on-line subject forum. University of Sydney Papers in TESOL, 1, 71–85.
Purpura, J. E. (2004). Assessing grammar. Cambridge: Cambridge University Press.
Rea-Dickins, P. (2007). Learning or measuring? Exploring teacher decision-making in planning for classroom-based language assessment. In S. Fotos & H. Nassaji (Eds.), Form-focused instruction and teacher education: Studies in honour of Rod Ellis (pp. 193–210). Oxford: Oxford University Press.
Read, J. & Erlam, R. (2007). Language Assessment and Evaluation course description. http://www.arts.auckland.ac.nz/subjects/index.cfm?P=9003
Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4–14.
Shohamy, E. (2001). The power of tests. Harlow, England: Pearson Education.
Shohamy, E. (2008). Introduction to Volume 7: Language testing and assessment. In E. Shohamy (Ed.), Language testing and assessment (Vol. 7), N. Hornberger (General Editor), Encyclopedia of language and education (2nd ed., pp. xiii–xxii). New York: Springer Science and Business Media.
Solano-Flores, G. & Trumbul, E. (2003). Examining language in context: The need for new research and practice paradigms in the testing of English language learners. Educational Researcher, 32(2), 3–13.
Standards for teacher competence in educational assessment of students. (1990). American Federation of Teachers, National Council on Measurement in Education, National Education Association. http://www.unl.edu/buros/bimm/html/article3.html
Stoynoff, S. & Chapelle, C. A. (2005). ESOL tests and testing: A resource for teachers and program administrators. Alexandria, VA: TESOL Publications.
Taylor, L. (2007, April). Two by two: Paired interaction in large-scale proficiency assessment. Paper presented at the American Association for Applied Linguistics annual conference, Costa Mesa, CA.
Teasdale, A. & Leung, C. (2000). Teacher assessment and psychometric theory: A case of paradigm crossing? Language Testing, 17(2), 163–84.
Teemant, A., Smith, M., Pinnegar, S. & Egan, M. W. (2005). Modeling sociocultural pedagogy in distance education. Teachers College Record, 107(8), 1675–1698.
Windschitl, M. (2002). Framing constructivism in practice as the negotiation of dilemmas: An analysis of the conceptual, pedagogical, cultural, and political challenges facing teachers. Review of Educational Research, 72(2), 131–175.
Wolf, D., Bixby, J., Glenn, J. & Gardner, H. (1991). To use their minds well: Investigating new forms of student assessment. Review of Research in Education, 17, 31–74.