altmetrics for kla final

Altmetrics and their Influence on Collection Development
Sarah Sutton, Rachel Miles, Stacy Konkiel, Michael Levine-Clark

Posted on 15-Apr-2017









Sarah welcomes the audience, explains Stacy and Michael's roles...and (next slide)....

Outline
- What are altmetrics?
- Introduction to our research study
- Study results: subject liaison duties; use of metrics for collection development
- Questions

...Sarah briefly describes the organization of our talk.

Alternative Metrics, or Altmetrics: What Are They?
Altmetrics use a variety of methods for measuring scholarly impact, including:
- Web-based references
- Article views/downloads
- Mentions in the news and on social media (Sutton, 2014): Twitter, Facebook, blogs, mainstream media, and forums. Evidence for LinkedIn, Pinterest, and Q&A sites is insufficient (Thelwall et al., 2013).
- Social bookmarking services and social reference managers, like Mendeley.
Examples of shared media:
- Videos
- Conference presentations
- Data sets
- Infographics
- Source code



Bibliometrics vs. Altmetrics: Journal Impact Factor (JIF)
- Average number of citations to articles published in a journal
- Counts academic articles and reviews within a journal
- Limitation: impact is measured within a single field of study
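The "average citations per article" idea on this slide reduces to simple division. A minimal illustrative sketch, using the standard two-year JIF window (citations received this year to items the journal published in the previous two years, divided by the number of citable items in those two years); the function name and the example numbers are invented:

```python
# Sketch of the two-year Journal Impact Factor described above.
# All numbers are invented for illustration.
def impact_factor(citations_to_recent_items, citable_items):
    """Average citations per citable item over the two-year window."""
    return citations_to_recent_items / citable_items

# e.g., 450 citations in 2016 to 150 articles published in 2014-2015
print(impact_factor(450, 150))  # → 3.0
```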


Bibliometrics vs. Altmetrics: h-index
- Measures individual researchers
- Relates the number of articles published to the number of citations
- Example sources: Web of Science, Google Scholar Metrics, Microsoft Academic Search

Criticisms:
- Easily manipulated
- Limited to a single field of study
- Incapable of measuring online impact

Example of an h-index. Image courtesy of Microsoft Academic Search.
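The relation between articles and citations behind the h-index can be sketched in a few lines: a researcher has index h if h of their papers each have at least h citations. A minimal sketch (function name and example counts are illustrative):

```python
# Minimal sketch of the h-index: the largest h such that h papers
# each have at least h citations.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # this paper still clears the bar at its rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # → 4 (four papers with ≥4 citations)
```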

Bibliometrics vs. Altmetrics: Citation Counts
- Google Scholar and many digital libraries sort search matches by most highly cited articles, and offer other faceted search options, such as most recent or a specific year/range.
- Citations accrue over months and years, and therefore are not an indication of recent research impact.
- Researchers are interested in important and current research in their fields (Thelwall et al., 2013).


Advantages of Measuring Online Impact
- Impact is measured more rapidly than with citation counts: altmetrics capture the immediate impact of research and current trends, with social media mentions appearing directly after publication, or even before.
- Supports Open Access (OA) initiatives: encourages sharing research across news media platforms via social media and communicating research to the general public.
- Most research measured by altmetrics uses nontechnical language, so altmetrics could measure the societal impact of research (Thelwall et al., 2013).


Advantages of Measuring Online Impact
- Rather than predicting future citations, altmetrics likely capture a different and unique aspect of research visibility and impact (Thelwall et al., 2013).
- Bornmann (2015) found that the more tags an article had in F1000Prime, the more likely the article was to be used for instruction rather than as a scholarly citation.
- Possible indicator of translating complicated research into unambiguous information for the public's use and in the classroom (Bornmann, 2015).


Advantages of Measuring Online Impact
- Supports faculty in understanding the impact of their research.
- Assists faculty in pursuing tenure or promotion.
- Assists evaluation committees and administrators in measuring research and scholarly and creative achievements for tenure.

Image courtesy of Texas Tech University

Disadvantages and Controversies Surrounding Altmetrics
- Gaming altmetrics: the potential to manipulate the popularity of an article (Roemer & Borchardt, 2015). This can damage the purpose and accuracy of altmetrics, and publishers can provide funding to draw attention to articles.
- Lacks standardization (Roemer & Borchardt, 2015); see NISO's Altmetrics Assessment Initiative.
- Estimating future citations is not a reality yet: the correlation between citations and altmetrics scores is still weak (Barnes, 2015). However, this may not be the ultimate purpose of altmetrics.


Preliminary Data: Subject/Liaison Librarians
- 210 liaison librarians; 575 liaison areas identified.
- Most liaison librarians are responsible for multiple liaison areas.
- Some liaison librarians mentioned liaison areas in multiple disciplines (e.g., humanities, engineering, and life sciences).
- However, most liaison librarians were responsible for liaison areas in the same or similar disciplines. Example: one survey participant was responsible for the liaison areas astronomy, biochemistry, chemistry, and physics.

The Survey: Purposes
- Assess academic librarians' current usage of research metrics in the course of their work.
- Gather information about how librarians who conduct their own research or read others' research articles use such metrics.
- Research metrics include journal impact factors, article citation counts, and emerging forms of metrics such as altmetrics.

The Survey: Population and Respondents
- Population: 13,000 academic librarians at 4-year colleges or universities
- Respondents: 709 librarians (~5% response rate)

The response rate is low but high enough to draw statistical conclusions; we will need to test for goodness of fit between the population and the respondents.
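The ~5% figure quoted above follows directly from the population and respondent counts on the slide. A quick arithmetic check (variable names are my own):

```python
# Cross-check of the response rate quoted on the slide:
# 709 respondents from a population of 13,000 academic librarians.
population = 13_000
respondents = 709

response_rate = respondents / population
print(f"Response rate: {response_rate:.1%}")  # → Response rate: 5.5%
```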

The Survey: What We Asked
- Title and job duties
- Tenure status
- Familiarity with evaluative research metrics (JIF, usage, altmetrics)
- Uses for evaluative research metrics (JIF, usage, altmetrics), both for professional duties and when evaluating their own scholarly work

The Survey: Preliminary Results
- Title: liaison librarian/subject specialist
- Job duties: collection development
- Familiarity with journal impact factors, citation counts, usage metrics, and altmetrics
- Use of journal impact factors, citation counts, usage metrics, and altmetrics for collection development

What we're going to cover today: Rachel's going to present her analysis of the disciplines in which our respondents have liaison duties, which we explored in order to identify differences in the use of metrics for collection development purposes across disciplines. I'm going to present what we discovered about those differences.

Percentage of Liaison Areas Identified

209 of 708 respondents replied that the role that best described their job was liaison librarian. 187 of the 209 provided at least one liaison area.

Percentage of Liaison Areas Identified in the Hard Sciences
- Health sciences, 74 mentions: medicine, veterinary science, dentistry, dietetics, food science, surgery, human sciences, nursing, plus 36 others!
- Applied sciences, 47 mentions: engineering (30), architecture (9), agronomy (6), other (3)
- Formal sciences, 37 mentions: mathematics (17), computer science (13), statistics (7)
- Physical sciences, 39 mentions: chemistry (17), biochemistry (7), physics (13), physical sciences, general (2)
- Life sciences, 27 mentions: biology (21), plant science (3), other (2)
- Earth & space sciences, 26 mentions: earth & environmental sciences (19), astronomy & astrophysics (7)
- Science, unspecified: 3 mentions

Percentage of Liaison Areas Identified in the Humanities
Cultural and gender studies, with 53 mentions, had 30 different types of studies! Overall, the humanities had 174 mentions, or 30 percent of all mentions.

Percentage of Liaison Areas Identified in the Social Sciences
The social sciences had the fewest mentions in the survey, with 60 mentions, or about 10 percent. Political science had the most mentions, at 19, and also included world politics, global/international studies, global affairs, European Union studies, and peace studies.

Percentage of Liaison Areas Identified in the Professions
Professional areas had 89 mentions, or 16 percent. Business had the most mentions but also included economics, finance, management, and marketing.
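The discipline shares quoted in the last few slides can be cross-checked against the 575 total liaison-area mentions reported earlier. A quick arithmetic sketch (counts taken from the slides, variable names my own):

```python
# Cross-check of discipline shares against the 575 total mentions.
mentions = {"humanities": 174, "social sciences": 60, "professions": 89}
total_areas = 575

for discipline, count in mentions.items():
    print(f"{discipline}: {count}/{total_areas} = {count / total_areas:.0%}")
# → humanities: 174/575 = 30%
# → social sciences: 60/575 = 10%
# → professions: 89/575 = 15%
```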

More about our respondentsAlmost all (97.3%) work full time in academic libraries at a 4 year college or university.

Most (62.3%) have worked as a librarian for 11 or more years and almost half of those (49.1%) have been working as a librarian for more than 21 years.

More than 50% have collection development duties.

A total of 387 respondents (54.7%) had either regular (more than once per month) or occasional (less than once per month) collection development duties in their jobs.

Those with Collection Development Responsibilities' Familiarity with Metrics

In terms of their familiarity with the metrics we asked about, the respondents with regular (more than once per month) or occasional (less than once per month) collection development duties are very similar to our respondents as a whole. (Scale: 1 = never heard of them, 5 = I'm an expert.) Very few of our respondents felt they had expert knowledge of any of these metrics, but most were fairly comfortable with JIF, usage, and citation counts (peaks at 4), while most were only moderately familiar with altmetrics (peaks at 3).

All Respondents' Familiarity with Metrics

Same scale here: 1 = never heard of them, 5 = I'm an expert. The citation counts line doesn't show because it is exactly the same as the usage metrics line.

Have you ever used journal impact factors for any of the following purposes?

Those respondents with collection development responsibilities used journal impact factors more often than the respondents overall to make collection development decisions.

How often do you evaluate materials using the following indicators of research impact in the context of your collection development work?

Altmetrics and expert peer reviews seem to share a trend: the number of respondents who either never (1) or always (5) use them is greater than the number who sometimes use them. This is in contrast to JIF, citations, qualitative measures, and downloads/page views, which peak in the middle (3, "sometimes"), and usage, which peaks at "almost always" (4).

What we still want to know...
- Are there variances in the use of altmetrics for collection development between liaison areas?
- Are there variances in FAMILIARITY WITH altmetrics between librarians with duties other than collection development (e.g., reference, instruction, assessment, scholarly communication)?
- Are there variances in THE USE OF altmetrics between librarians with duties other than collection development (e.g., reference, instruction, assessment, scholarly communication)?

a. Collection development (i.e., selecting and purchasing books, journals, etc. for faculty or students)
b. Instruction (i.e., teaching workshops and one-shot instruction sessions, etc.)
c. Reference services (i.e., staffing the reference desk, answering reference questions via email or in one-on-one consultations, etc.)
d. Scholarly communication support (i.e., helping faculty and students choose research software, tools, and which journals to publish in; helping scholars understand how to measure research impact)
e. Assessment (i.e., gathering and reporting statistics and qualitative studies to understand the success of library-based resources and programs)


Sarah Sutton: [email protected]
Rachel Miles: [email protected]