
METRICS · Measuring Information Literacy: Beyond the Case Study

by Christopher Stewart

Available online 13 April 2011

Christopher Stewart is Assistant Professor, Graduate School of Library and Information Science, Dominican University, 7900 W. Division St., River Forest, IL 60305, USA <[email protected]>.

Last month, I attended a lecture at which the speaker spoke about the need for more effective advocacy efforts by libraries of all types. Advocacy should be centered on showing the value of the activities of the enterprise.1 We have entered an era when our institutions—public and private—must not only define the good they do, but also the effect of this good or, put another way, the value this good provides to stakeholders. In general, higher education's accountability for outcomes has been the focus of much public debate in recent years. Major reports such as Measuring Up2 and the 2006 report of then-Secretary of Education Margaret Spellings3 reflect the public's growing frustration with higher education's ability to deliver outcomes that include career preparation, critical thinking skills, and the ability to become lifelong learners. The library can play an important part in helping students develop these skills.

As academic librarians, we have much experience in measuring what we consider library quality indicators. Much of this measurement is based on comparing past performance with current performance—a form of “maintenance learning.”4 While this type of assessment is useful for some areas, input/output performance indicators play only a supporting role in the overall value proposition of the academic library. The library's value proposition should also be tightly integrated with the institution's desired outcomes, primarily (though not exclusively) educational outcomes. Through the library's mission and core activities, we know that the library has an important role to play in learning outcomes, both its own and the institution's. Information literacy is arguably the most direct way for the library to show value in supporting an institution's learning outcomes.

The ongoing growth of library instruction programs can be traced to the 1970s, when baby boom enrollment peaked on college campuses, and expanding curricula required new and larger library collections and services.5 Users required new skills to navigate increasingly complex information landscapes, and librarians began to assume more active teaching roles. In 2000, the Association of College & Research Libraries released its Information Literacy Competency Standards for Higher Education.6 These standards frame information literacy to include five core competencies, each followed by a series of demonstrable skills: the ability to determine the extent of the information needed; the ability to effectively access information; the ability to critically evaluate information; the ability to incorporate information into one's work; the ability to use information to accomplish research or other goals; and the ability to understand legal and ethical issues surrounding the discovery and use of information.7 Since the creation of the ACRL standards, libraries have used them as a baseline for instruction design as well as a tool for advocating for information literacy to be incorporated into institutional learning outcomes.

There are numerous examples of library outcomes for information literacy, especially among large research universities. The University of Illinois Libraries, for example, contextualize information literacy not only as developing “competent research skills,” but also as enabling students to “become independent critical thinkers and more successful in their academic pursuits.”8 Purdue University Libraries, a longtime innovator in promoting information literacy across the curriculum, offer a range of resources for instructors, CORE (Competencies for Online Research Education), and a one-credit course on “core concepts of information retrieval, analysis, and organization.”9

Higher education accrediting bodies have also begun to express (indirectly and directly) the importance of information literacy as a learning outcome. The Middle States Commission on Higher Education (MSCHE) specifically addresses information literacy as an outcome by explaining that, “in making the case that students are information literate, it is the institution's responsibility to ensure that information literacy goals are defined and the various elements scattered across the curriculum are identified as part of a coherent whole.”10 MSCHE further states that “several skills collectively referred to as ‘information literacy’ apply to all disciplines in an institution's curricula.”11

It is fair to assume, then, that in the academic library community and elsewhere, a great deal of attention is being paid to information literacy and its relationship to institutional educational outcomes. How, then, are we measuring our success in teaching information literacy? Furthermore, how are we expressing these efforts to the profession, to the institutions we serve, and to the higher education community as a whole? There are numerous examples in the professional literature of successful case studies in teaching information literacy. One recent study at the Mansfield Library at the University of Montana, for example, compared the information literacy levels of first-year students with those of students in capstone courses. Results showed a strongly positive impact of information literacy instruction on students' research skills and demonstrated the relationship of these skills to the curriculum.12 Other reports outline the differences between library instruction and discipline-based learning within academic departments.13 These reports and articles, while useful, support a point made in the recent ACRL research report, The Value of Academic Libraries, prepared by Megan Oakleaf.

Although libraries have long taught and assessed information literacy, most of the published evidence of the impact of libraries on student learning is sporadic, disconnected, and focused on limited case studies.14

So, while we do an excellent job of assessing information literacy programs with the assumption that the results may be applied at other institutions, our efforts to illustrate our successes to broader audiences have been limited. Limited, but not invisible. There are several current, planned, and conceivable options for assessing and demonstrating the efficacy and value of library information literacy programming to larger institutional audiences. To this point, Oakleaf reminds us that, as higher education is increasingly required to show and deliver value to the various constituencies it serves, the library cannot afford to be seen as distinct from institutional goals for learning outcomes.

In general, there are two measurement strategies that have potential as tools to show the impact of information literacy to broader institutional audiences. These strategies can be broadly categorized into direct and indirect methods. One direct method used to gather information on how students use information is the transaction-based online survey. One such tool, Measuring the Impact of Networked Electronic Services (MINES), was recently described in this column.15

As the user accesses an online resource, he or she is given the opportunity to answer a short list of questions about how and why the resource is being used, as well as some demographic information about the user. Inferences about the library's information literacy efforts can be made by coupling user demographics (self-reported) with the types of resources being used and why. Another direct method for assessing information literacy is the Standardized Assessment of Information Literacy Skills (SAILS), which was developed several years ago at Kent State University. SAILS measures skills in alignment with the ACRL's Information Literacy Competency Standards for Higher Education by focusing on eight skill sets.16 SAILS is a web-based exam that can be completed in approximately 35 minutes. There are two test administration periods a year. Since pilot testing ended in 2006, SAILS has been used by approximately 250 institutions.17 SAILS seeks to answer questions about the relevance of information literacy to student success, the library's role in information literacy, and whether a student possesses the skills necessary to be considered information literate.18 SAILS has good potential for showing information literacy outcomes to broad institutional audiences. It would be beneficial to have examples of how SAILS data have been reported to parent institutions as a way of showing the value of information literacy in student learning outcomes. Finally, another more recently developed tool for direct assessment of information literacy is James Madison University's Information Seeking Skills Test (ISST), which all students are required to complete by the end of their first year.19
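To make the idea of coupling self-reported demographics with resource use concrete, the following is a minimal sketch of how transaction-based survey responses might be cross-tabulated. The field names ("status", "purpose", "resource_type") and response categories are hypothetical illustrations, not drawn from the MINES instrument itself.

```python
# Minimal sketch: cross-tabulating transaction-based survey responses.
# Field names and categories are hypothetical, for illustration only.
from collections import Counter

responses = [
    {"status": "undergraduate", "purpose": "course assignment", "resource_type": "database"},
    {"status": "graduate", "purpose": "thesis research", "resource_type": "e-journal"},
    {"status": "undergraduate", "purpose": "course assignment", "resource_type": "e-journal"},
    {"status": "faculty", "purpose": "sponsored research", "resource_type": "database"},
]

# Count how often each user group reports each purpose of use.
usage = Counter((r["status"], r["purpose"]) for r in responses)

for (status, purpose), count in sorted(usage.items()):
    print(f"{status:15s} {purpose:20s} {count}")
```

A summary of this kind—who is using which resources and why—is the sort of evidence that can be rolled up for institutional audiences rather than remaining inside the library.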

Other direct methods for measuring information literacy outcomes, while conjectural at this point, may prove to be powerful metrics in the coming years. Oakleaf contends that many of the input/output measures we currently use to assess library performance do not “resonate with many higher education stakeholders.”20 Oakleaf argues that future studies to show library value, including information literacy outcomes, should be linked with existing data available in student information systems. These data could be linked with data on library use, such as whether a student attended library instruction sessions, as well as other ways in which the library was used, to show the value of the library in student learning outcomes.21
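As a rough illustration of the kind of data linkage Oakleaf describes, the sketch below joins a hypothetical roster of instruction-session attendees with hypothetical student records and compares mean GPA for attendees and non-attendees. All identifiers, fields, and figures are invented for illustration and do not represent a real study.

```python
# Minimal sketch: linking library instruction attendance to student records.
# All student IDs, GPA values, and field names are hypothetical.
from statistics import mean

# Records that might come from a student information system.
students = {
    "s001": {"gpa": 3.4}, "s002": {"gpa": 2.9},
    "s003": {"gpa": 3.7}, "s004": {"gpa": 3.1},
}

# IDs that might come from library instruction session rosters.
attended_instruction = {"s001", "s003"}

attendee_gpas = [rec["gpa"] for sid, rec in students.items() if sid in attended_instruction]
other_gpas = [rec["gpa"] for sid, rec in students.items() if sid not in attended_instruction]

print(f"Mean GPA, attended instruction: {mean(attendee_gpas):.2f}")
print(f"Mean GPA, did not attend:       {mean(other_gpas):.2f}")
```

A simple comparison like this shows association rather than causation; in practice such linkages would need to account for other factors and for student privacy.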

While direct measurements of information literacy programs are effective, academic libraries may not have the resources to administer these kinds of assessments. Indirect methods offer an alternative. Indirect methods involve user reporting of activities, opinions, and goals reached in relation to information literacy. One of the most prevalent service quality assessment instruments used in academic libraries, LibQUAL, allows for the inclusion of up to five auxiliary items to be added to the core 22 questions in the instrument. These items can include questions about information literacy, such as whether the library has taught the respondent “how to access, evaluate, and use information.”22 Responses to these questions can be separated by different user demographics, including undergraduate, graduate, and faculty. Tools such as LibQUAL that report the effectiveness of the library in teaching users to be effective researchers can be used to show relationships between information literacy and broader student learning outcomes.
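As a sketch of how responses to such an added item might be broken out by user group, the example below tabulates mean agreement scores for an information literacy question. The scores, group labels, and scale are invented for illustration and are not taken from a LibQUAL dataset.

```python
# Minimal sketch: summarizing responses to an added information literacy item
# by respondent group. Scores and group labels are invented for illustration.
from collections import defaultdict
from statistics import mean

responses = [
    ("undergraduate", 7), ("undergraduate", 6), ("undergraduate", 8),
    ("graduate", 7), ("graduate", 5),
    ("faculty", 6),
]

scores_by_group = defaultdict(list)
for group, score in responses:
    scores_by_group[group].append(score)

for group, scores in sorted(scores_by_group.items()):
    print(f"{group:15s} n={len(scores):3d}  mean score={mean(scores):.2f}")
```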

Another quantitative tool that has been used to show the effectiveness of information literacy programs is the National Survey of Student Engagement (NSSE). NSSE “actively collects information about student participation in programs that institutions provide for their learning and personal development.”23 In 2006, as part of an experimental project, four information-specific items were placed on the NSSE survey for a consortium of 33 participating institutions. Approximately 14,000 students responded.24 Results supported “modest to high significant positive relationships between the two information literacy scales and eight scales derived by NSSE, particularly among seniors with gains in practical knowledge and general education.”25
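For readers who want to see what computing such scale relationships looks like in practice, here is a minimal sketch that calculates a Pearson correlation between a hypothetical information literacy scale and a hypothetical engagement scale; the scores are invented and are not NSSE results.

```python
# Minimal sketch: Pearson correlation between two survey scales.
# The scale scores below are invented; they are not NSSE data.
from statistics import correlation  # available in Python 3.10+

info_literacy_scale = [2.1, 3.4, 2.8, 3.9, 3.2, 2.5, 3.7]
engagement_scale    = [2.4, 3.1, 2.9, 3.8, 3.0, 2.6, 3.5]

r = correlation(info_literacy_scale, engagement_scale)
print(f"Pearson r = {r:.2f}")
```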

While the 2006 NSSE results were promising, no further instruments containing questions about information literacy have been developed since. The next fully revised version of the NSSE survey is being piloted and is scheduled for release in 2013. However, it is unlikely that information literacy questions will be included.26

According to Robert Gonyea, Associate Director of the Indiana University Center for Postsecondary Research and coordinator of NSSE reporting and research, however, there is a strong possibility that, beginning with the new survey in 2013, NSSE will offer a module that contains a set of questions related to information literacy.27 This module would be available as an “electable” set of questions that, like the other questions in NSSE, focus on activities that are indicators of student engagement in learning and other areas.28 Institutions could add these questions, which, because NSSE is administered at the beginning and end of a student's undergraduate education, could provide valuable information on the link between information literacy and broader educational outcomes. Like LibQUAL, the questions would be optional. Unlike LibQUAL, NSSE results are seen by a far wider audience of institutional stakeholders.

In the coming years, the academic library will be increasingly called upon to demonstrate its value and role in the educational outcomes of the institution. This we know, and we have responded well in recent years by further defining information literacy, devising programs, speaking to the importance of information literacy as a core component of higher learning, and, finally, measuring the effectiveness of information literacy instruction through case studies and other primarily localized investigations. We know what to do, how to do it, and why it is important. What remains missing, however, is the proliferation and use of effective, broad-scale methods for assessing our information literacy efforts. Inclusion in national surveys is a good start. As pressure continues on higher education to illustrate its value to a range of internal and, equally important, external stakeholders, the library can be involved in shaping expectations for what constitutes a quality postsecondary education. We have an opportunity to contribute to the discussion. Let's take it.

NOTES AND REFERENCES

1. Haycock, Ken. 2011. Advocacy Revisited: New Insights Based on Research and Evidence. Presented at Dominican University's Graduate School of Library and Information Science (GSLIS) annual Follett Lecture, February 9, River Forest, IL.

2. Anon. Measuring Up 2008. http://measuringup2008.highereducation.org/ (March 2, 2011).

3. The Secretary of Education's Commission on the Future of Higher Education. 2006. A Test of Leadership: Charting the Future of U.S. Higher Education, A Report of the Commission Appointed by Secretary of Education Margaret Spellings. http://www.ed.gov/about/bdscomm/list/hiedfuture/index.html (March 2, 2011).


4. Bennis, Warren and Nanus, Burt. 2005. Leaders: Strategies for Taking Charge. New York: HarperCollins, pp. 180–181.

5. Budd, John M. 2005. The Changing Academic Library: Operations, Culture, Environment. Chicago, IL: Association of College & Research Libraries, p. 237.

6. Anon. ACRL | Information Literacy Competency Standards for Higher Education. http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm (February 25, 2011).

7. Ibid.

8. University Library, University of Illinois at Urbana-Champaign. Resources for Faculty and Instructors. http://www.library.illinois.edu/infolit/faculty.html (February 24, 2011).

9. Anon. Research Guides - Purdue University Libraries. http://www.lib.purdue.edu/rguides/instructionalservices/online.html (February 24, 2011).

10. Middle States Commission on Higher Education. Developing Research & Communication Skills: Guidelines for Information Literacy in the Curriculum, Executive Summary, p. 6.

11. Ibid., p. 42.

12. Samson, Sue. Information Literacy Learning Outcomes and Student Success. Journal of Academic Librarianship 36, no. 3: 202–210.

13. Harvey, Diane. 2007. Assessing for Improvement. Presented at the I&O Brown Bag, November 11, Duke University. http://www.slideshare.net/harveydiane/learning-outcomes-assessment-for-library-instruction (February 23, 2011).

14. Association of College and Research Libraries, and Megan Oakleaf. 2010. The Value of Academic Libraries: A Comprehensive Research Review and Report, p. 7.

15. Stewart, Christopher. 2011. Keeping Track of It All: The Challenge of Measuring Digital Resource Usage. Journal of Academic Librarianship 37, no. 2: 174–176.

16. Anon. Project SAILS—Standardized Assessment of Information Literacy Skills. https://www.projectsails.org/abouttest/aboutTest.php?page=aboutTest (February 22, 2011).

17. Anon. Project SAILS—Standardized Assessment of Information Literacy Skills. https://www.projectsails.org/sails/history.php?page=aboutSAILS (February 22, 2011).

18. Anon. 2009. Introduction to the SAILS Test. Presented at the American Library Association Annual Conference, July 13, Chicago, IL. https://www.projectsails.org/events/past.php?page=events (February 22, 2011).

19. Anon. JMU—General Education Information Literacy Competency. http://www.jmu.edu/gened/info_lit_general.shtml (March 8, 2011).

20. Association of College and Research Libraries, and Megan Oakleaf. 2010. The Value of Academic Libraries: A Comprehensive Research Review and Report, p. 41.

21. Ibid., p. 43.

22. Association of Research Libraries. 2006. LibQual+ 2006 Survey: Illinois Institute of Technology, p. 46.

23. Anon. About NSSE. http://nsse.iub.edu/html/about.cfm (March 8, 2011).

24. Gratch-Lindauer, Bonnie. 2007. Information literacy-related student behaviors: Results from the NSSE items. C&RL News 68, no. 7 (August): 432–441, p. 432.

25. Ibid., p. 433.

26. Gonyea, Robert. 2011. Interview with Robert Gonyea. March 2.

27. Ibid.

28. Ibid.