Role of the library in research evaluation

Jenny Delasalle, Academic Support Manager (Research)

University of Warwick

Today’s session will cover…

• Overview of research evaluation in the UK context.

• Lots of stuff about bibliometrics!
• How librarians’ expertise is relevant
• What can a library do, and what should a library do, with regard to research evaluation?
• Discussion!

Who is interested in research evaluation?

• HEFCE, through REF 2014: funding and reputation
• University management: central, department heads.
  – Indication of staff performance
  – Recruitment and “head hunting”!
  – Targeting of support
  – Demonstrating capabilities/accomplishments
  – University rankings
• Researchers themselves: collaborations
  – Peer reviewers on REF panels, etc.

Possible measures…

• “Bibliometrics”
  – Number of outputs (in quality publications)
  – Number of citations / calculations based on this
  – Write on a card some appropriate measures
• Involvement as a peer reviewer
• Journal editorships
• Research grant applications
• Research income
• Prestigious awards
• PhD supervision load

About REF 2014

• “Led by expert review, informed by metrics.”
• They are looking for “Impact”: citations are just one measure.
• 65% outputs, 20% Impact, 15% Environment
• Adapting to disciplinary differences: 36 UoAs
• Panel criteria published in Jan 2012.
• Not every researcher eligible for submission.

Output-based measures (bibliometrics, webometrics, altmetrics)

– Paper counts
– Journal impact factors
– The h-index
– Citation scores at article level
– Visitor numbers (or other info) for online articles
– And many others… e.g. blog entries, tags, etc.

Role for Librarians, advising on use?

Main sources of citation data

• Web of Science – Thomson Reuters (used for THE rankings)

• Scopus – Elsevier (used for REF 2014, in Sciences)

• Google Scholar… various tools available. (Used for REF 2014, Computer science only)

About WoS & Scopus citation dataScopus WoS

Approx. 17,000 journals Approx. 11,000 journals

Citations in journals since 1996

Relatively poor coverage of conferences?

Broadest subject coverage Science & Social sciences origins.

Scimago journal rank JCR Impact factor

Google Scholar

• Beware:
  – We’ve no idea what’s included or excluded, or how it works (how far back does the data go?)
  – Data is inconsistent and there are no efforts at standardisation
  – Data includes multiple entries, false hits, reading lists, etc.
  – It lacks the sophisticated search functionality of Scopus & WoS
• Benefits:
  – It includes citation data: external analysis tools.
  – Best Arts and Humanities coverage.
  – It identifies material which is not yet indexed by WoS.
  – Google search options are easy to learn / already familiar.
  – It has an ‘advanced’ search option.
  – It’s fast at bringing you search results.

Be careful of citation measurement: motivations for citations

• Paying homage to experts
  – Especially those likely to be peer reviewers!
  – Lend weight to own claims
• Credit to peers whose work you have built upon
• Provide background reading
• Criticising/correcting previous work
• Sign-posting under-noticed work
  – (own paper which would affect your h-index!)

• Self citations!

Citation patterns

• Most publications have few or no citations.
• Variety across the disciplines.
• Therefore comparisons within a discipline are most useful.
• Percentages against a world average within each discipline are more useful than basic numbers (see the sketch below).
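A minimal sketch, not part of the original slides, of the normalisation the last bullet describes: express a citation count as a percentage of a world average for the discipline. The averages and counts used here are invented for illustration.

def percent_of_world_average(citations, world_average):
    """Express a citation count as a percentage of the discipline's world average."""
    return 100 * citations / world_average

# Invented figures: the raw counts favour the life-sciences paper,
# but the normalised percentages tell the opposite story.
print(percent_of_world_average(12, 30))  # life-sciences paper -> 40.0
print(percent_of_world_average(6, 4))    # mathematics paper   -> 150.0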

About the h-index

• Invented by Jorge E. Hirsch, a physicist, in 2005

• Algorithm to calculate quality and sustainability of research output

• Calculated using number of publications and number of citations per output

• A researcher with an index of h has published h papers each of which has been cited by others at least h times

• E.g. an h-index of 20 means there are 20 published papers, each with at least 20 citations

Example h-index

E.g. Professor X has a total of 10 publications (a code sketch of this calculation follows below):

Publication 1: 20 cites
Publication 2: 18 cites
Publication 3: 11 cites
Publication 4: 7 cites
--------------------------------------------- h-index: 4
Publication 5: 4 cites
Publication 6: 3 cites
Publications 7, 8, 9, 10: 0 cites
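A minimal Python sketch (not from the original slides) of the calculation above: sort the citation counts in descending order and find the largest rank h at which a paper still has at least h citations. The counts are Professor X's from the example.

def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Professor X's 10 publications from the slide above
professor_x = [20, 18, 11, 7, 4, 3, 0, 0, 0, 0]
print(h_index(professor_x))  # -> 4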

Other than the h-index

• M-index = h/n, where n is the number of years since the first published paper.
• C-index: accounts for quality of the citations
• G-index: more weight to highly cited articles
• H-1-index: how far away from gaining 1 more point on the h-index
• E-index: surplus citations in the h set!
• Contemporary h-index: recent activity
• Google’s i10-index: no. of papers with at least 10 citations (the m-index and i10-index are sketched below).
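Two of these variants have simple enough definitions to sketch directly; this is an illustration rather than anything from the slides, and the 8-year career length below is an assumption.

def m_index(h, years_since_first_paper):
    """m-index = h / n, where n is the number of years since the first published paper."""
    return h / years_since_first_paper

def i10_index(citation_counts):
    """Google Scholar's i10-index: number of papers with at least 10 citations."""
    return sum(1 for cites in citation_counts if cites >= 10)

professor_x = [20, 18, 11, 7, 4, 3, 0, 0, 0, 0]
print(m_index(4, 8))           # h of 4 over an assumed 8-year career -> 0.5
print(i10_index(professor_x))  # three papers with 10+ citations -> 3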

The author’s perspective

• A record of what you have published will be useful for:
  – your CV.
  – providing information to University data-gathering exercises (Department level or REF).
  – web pages that describe your work.
• Keeping an eye on who is citing your work helps you to:
  – identify future potential collaborators.
  – maintain awareness of other research in your field and interpretations of your work.
  – become aware of which articles are influencing your research profile the most.

Advice to researchers: tell a good story…

• List of your articles and no. of citations for each.
• Average citation no. for papers over 2 years old?
• Is it high or low for your discipline?
• Compare your article’s citation count to the average for the journal your article appears in, for the year of publication (a rough sketch of these calculations follows below).
• Add context: who has cited your work? Anyone particularly impressive?!
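A rough sketch of assembling that story, using hypothetical articles and invented journal figures (nothing here comes from the slides): average the citations of papers more than two years old, then compare one article against the average for its journal and year.

# Hypothetical articles: (title, year published, citations)
articles = [
    ("Paper A", 2007, 14),
    ("Paper B", 2009, 6),
    ("Paper C", 2011, 1),
]

this_year = 2012  # year of the evaluation

# Average citation count for papers more than 2 years old
older = [cites for _, year, cites in articles if this_year - year > 2]
print("Average citations (papers over 2 years old):", sum(older) / len(older))  # -> 10.0

# Compare Paper A against the (invented) average for its journal in 2007
journal_2007_counts = [3, 5, 9, 12, 2, 7]
journal_average = sum(journal_2007_counts) / len(journal_2007_counts)
print("Paper A vs journal average:", round(14 / journal_average, 2))  # -> 2.21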

Other things to measure…

• No. of articles with no citations at all?
• No. of joint articles, and who the co-authors are, to indicate collegiality and interdisciplinarity.

• No. of articles published in a quality (high impact factor?) journal.

• (not only articles, or even outputs!)

PLoS example (1)

PLoS example (2)

WRAP example

Gaining visitors to your paper

• Boost your Google juice!
  – Put a link to your paper everywhere you can: Wikipedia? Academia.edu & other profile sites
  – Get someone else to cite your paper, even in a draft paper online: Google Scholar will pick it up, and having GScholar citations seems to help rankings.
  – Get your papers into an OA repository as quickly as possible: date sensitive

Shanghai Academic Ranking of World Universities

• Quality of Education: alumni of an institution winning Nobel Prizes and Fields Medals - 10%
• Quality of Faculty:
  – Staff of an institution winning Nobel Prizes and Fields Medals - 20%
  – Highly cited researchers in 21 broad subject categories - 20%
• Research Output:
  – Papers published in Nature and Science* - 20%
  – Papers indexed in Science Citation Index-expanded and Social Science Citation Index - 20%
• Per Capita Performance: per capita academic performance of an institution - 10% (a weighted-sum sketch follows below)
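A minimal sketch of how those weights combine into an overall score; the indicator scores below are invented, and each is assumed to already be on a 0-100 scale.

# ARWU indicator weights from the slide above
weights = {
    "alumni":  0.10,  # Alumni winning Nobel Prizes / Fields Medals
    "award":   0.20,  # Staff winning Nobel Prizes / Fields Medals
    "hici":    0.20,  # Highly cited researchers
    "n_and_s": 0.20,  # Papers in Nature and Science
    "pub":     0.20,  # Papers indexed in SCIE / SSCI
    "pcp":     0.10,  # Per capita performance
}

# Invented indicator scores (0-100) for a hypothetical institution
scores = {"alumni": 40, "award": 25, "hici": 60,
          "n_and_s": 55, "pub": 80, "pcp": 50}

overall = sum(weights[k] * scores[k] for k in weights)
print(overall)  # 4 + 5 + 12 + 11 + 16 + 5 = 53.0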

Discussion topics!

• Who to:
  – Researchers, University administrators, HoDs, etc.
• About:
  – The data sources (Library subscriptions!)
  – How to calculate an h-index
  – Other measures available: which to use when
  – Characteristics of highly cited articles… “career tips”!
• How:
  – Partnership arrangements/meetings
  – Online guides
  – Training sessions: in the Library, advertised, or as invited to departments
  – Enquiries/consultations

Reading list!

• Auckland, M. (2012) RLUK Re-skilling for Research. http://www.rluk.ac.uk/content/re-skilling-research (accessed 20 March 2012)

• My blog: http://blogs.warwick.ac.uk/libresearch

• JISCMAIL lis-bibliometrics list
