Gathering Evidence to Demonstrate Impact and Reputation. Sponsored by Academic Affairs. September 6, 2012.

Upload: heather-coates


TRANSCRIPT

Page 1: Gathering Evidence to Demonstrate Impact

Gathering Evidence to Demonstrate Impact and Reputation

Sponsored by Academic Affairs

September 6, 2012

Page 2: Gathering Evidence to Demonstrate Impact

Agenda

• Overview
  – What is impact?
  – Traditional impact metrics
  – Non-traditional evidence

• Hands-on
  – Institutional Repositories
  – Web of Knowledge
  – Cited Reference Search
  – Google Scholar

• Panel Q&A

Page 3: Gathering Evidence to Demonstrate Impact

What can librarians do for you?

• Guide you to quality sources of impact evidence

• Assist you in interpreting the context of impact evidence for your scholarly products

• Assist you in planning dissemination of your scholarly products


Page 4: Gathering Evidence to Demonstrate Impact

What is impact?


Page 5: Gathering Evidence to Demonstrate Impact

What is impact?

• Garfield distinguished between impact and influence (Leydesdorff, 2009)

• “Experience has shown that in each specialty the best journals are those in which it is most difficult to have an article accepted, and these are the journals that have a high impact factor.” Garfield, 2000

• Effects and outcomes, in terms of value and benefit, associated with the use of knowledge produced through research (Beacham et al., 2005)


Page 6: Gathering Evidence to Demonstrate Impact

Measuring impact

• Proxy for expert evaluation
• Typically citation-based; however, citations ≠ quality
• Levels of evidence
  – Journal-level
  – Article-level
  – Scholar-level
• How can you use these in your dossier?
  – What is the value of these metrics?


Page 7: Gathering Evidence to Demonstrate Impact

TRADITIONAL IMPACT METRICS


Page 8: Gathering Evidence to Demonstrate Impact

Impact Factor (ISI)

• Journal-level metric
• Average number of citations received per paper
• Timeframe: previous 2 years
• Updates: annually
• Issues: discipline-dependent; citation counts per paper are not normally distributed, so the mean is not a valid summary; journal self-citation; can be affected by editorial policies
• Use to: provide a range for your discipline
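As a worked illustration of the two-year window, a minimal sketch of the calculation; the counts here are hypothetical, not real JCR data:

```python
# Hypothetical two-year Impact Factor calculation for a journal in 2012:
# IF(2012) = citations in 2012 to items published in 2010-2011,
#            divided by citable items published in 2010-2011.
citations_2012_to_2010_2011 = 350   # assumed citation count
citable_items_2010 = 60             # assumed article count
citable_items_2011 = 80             # assumed article count

impact_factor = citations_2012_to_2010_2011 / (citable_items_2010 + citable_items_2011)
print(round(impact_factor, 2))  # → 2.5
```

Note how the division by a raw count is exactly why a few highly cited papers can dominate the average, as the issues above describe.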


Page 9: Gathering Evidence to Demonstrate Impact

h-index

• Scholar-level metric
• Attempts to measure both the productivity and impact of the published works
• Timeframe & updates: depend on source
• Can be manually determined using citation databases or calculated automatically
  – Web of Science, Scopus, Google Scholar
  – Google Scholar has broader coverage, but smaller databases tend to be more accurate
• Use to: compare to other scholars at your stage
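For readers who want to check an automatically calculated value by hand, a minimal sketch of the h-index definition, using hypothetical citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank       # this paper still clears the threshold
        else:
            break          # papers are sorted, so no later one can
    return h

# Hypothetical scholar with six papers:
print(h_index([25, 8, 5, 3, 3, 1]))  # → 3
```

This is the same procedure you would perform manually in Web of Science or Scopus: sort the papers by citation count and find where the count drops below the rank.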

Page 10: Gathering Evidence to Demonstrate Impact

i10 index (Google Scholar)

• One of several Google Scholar Metrics (GSM)
• Scholar-level metric
• Number of publications with at least 10 citations
• Sources: unknown; no master list; could change
• Timeframe: articles published 2007-2011, indexed in Google Scholar as of April 1, 2012
• Updates: unclear (info accurate as of April 2012)
• Use to: compare to other scholars in your discipline
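The definition reduces to a single count; a minimal sketch with hypothetical citation counts:

```python
def i10_index(citations):
    """Number of publications with at least 10 citations each."""
    return sum(1 for cites in citations if cites >= 10)

# Hypothetical scholar with six papers:
print(i10_index([25, 12, 10, 9, 3, 1]))  # → 3
```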


Page 11: Gathering Evidence to Demonstrate Impact

Eigenfactor & Article Influence scores

• Eigenfactor = journal-level metric
• Article Influence = journal-level metric (the journal's citation influence per article)
• Timeframe: previous 5 years
• Updates: annually
• Based on the number of incoming citations; citations from highly ranked journals are weighted to make a larger contribution
• Source: Journal Citation Reports (JCR)
• Adjusts for differences across disciplines
• Use to: provide a range for your discipline
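To show what "citations from highly ranked journals count for more" means mechanically, here is a deliberately simplified sketch in the spirit of the Eigenfactor approach: power iteration on a column-normalized citation matrix. The three-journal matrix is hypothetical, and the real algorithm adds refinements (removal of journal self-citations, a teleportation term, article-count normalization) that are omitted here.

```python
# C[i][j] = citations from journal j to journal i (hypothetical data).
C = [
    [0, 4, 2],
    [1, 0, 3],
    [1, 2, 0],
]
n = len(C)

# Column-normalize so each citing journal distributes one unit of influence.
col_sums = [sum(C[i][j] for i in range(n)) for j in range(n)]
M = [[C[i][j] / col_sums[j] for j in range(n)] for i in range(n)]

# Power iteration: repeatedly pass influence along citation links until stable.
rank = [1.0 / n] * n
for _ in range(100):
    rank = [sum(M[i][j] * rank[j] for j in range(n)) for i in range(n)]

print([round(r, 3) for r in rank])
```

A citation arriving from a journal with a high `rank` value contributes more than one from a low-ranked journal, which is the key difference from raw citation counting.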

Page 12: Gathering Evidence to Demonstrate Impact

NON-TRADITIONAL EVIDENCE


Page 13: Gathering Evidence to Demonstrate Impact

“Altmetrics”

• Includes things like page hits, downloads, Twitter mentions, etc.
• Timeframe: immediate to short-term impact
• Sources: focus is on social media
• Use to: provide a measure of more immediate impact, and of impact outside the discipline and academia
• Includes formal metrics, such as:
  – Total-impact.org
  – PLoS article-level metrics


Page 14: Gathering Evidence to Demonstrate Impact

Informal metrics

• Acceptance rates for journals
• Visibility of item or scholar
• Ownership count (libraries)
• Indexed in major databases
• View & download statistics
• Editors/sponsoring organizations
• Use to: supplement traditional metrics


Page 15: Gathering Evidence to Demonstrate Impact

Other relevant evidence

• Scholarship of Teaching & Learning
  – Learning object repositories, instructional content, innovative use of technology, syllabi, etc.
• Grey literature
  – Conference materials, white papers, unpublished reports
• Impact on the community
  – Media; changes in policy, law, or programs


Page 16: Gathering Evidence to Demonstrate Impact

Context: Defining quality

• Timeframe
• Scope (source coverage, metric level)
• Source/authority
• Reliability/accuracy
• Relevance

Trends: interdisciplinary; new scholarly products; impact on diverse populations, communities, problems; collaborative work; training students

Page 17: Gathering Evidence to Demonstrate Impact

References

• Beacham, B., Kalucy, L., & McIntyre, E. (2005). Focus on Understanding and Measuring Research Impact. Retrieved from http://www.phcris.org.au/phplib/filedownload.php?file=/elib/lib/downloaded_files/publications/pdfs/phcris_pub_3236.pdf

• Delgado López-Cózar, E., & Cabezas-Clavijo, Á. (2012). Google Scholar Metrics: An unreliable tool for assessing scientific journals. El profesional de la información, 21(4), 419-427.

• Garfield, E. (1986). Journal impact vs. influence: A need for 5-year impact factors. Information Processing & Management, 22(5), 445.

• Garfield, E. (2000). The use of JCR and JPI in measuring short and long term journal impact. Presented at the Council of Science Editors Annual Meeting, San Diego, CA.

• Leydesdorff, L. (2009). How are new citation-based journal indicators adding to the bibliometric toolbox? Journal of the American Society for Information Science and Technology, 60(7), 1327-1336.
