TRANSCRIPT
Informetrics and information seeking
research in Africa
Research indicators and performance management in African R&D systems:
what role can information scientists play?
Keynote presentation at the University of Zululand
15th Information Studies (IS) Annual Conference
3-5 September
Peermont Metcourt Hotel at Umfolozi
Empangeni
Andrew M. Kaniki, Executive Director: Knowledge Fields Development
The National Research Foundation, Pretoria
Indicators Defined
Indicators provide evidence that a certain condition exists or that certain results have or have not been achieved:
• They enable individuals/decision-makers to assess progress towards the achievement of intended outputs, outcomes, goals, and objectives;
• They have been closely associated with programme management; and
• More specifically, they are associated with the monitoring and evaluation of programmes, as part of a results-based accountability system
Brizius, J. A., & Campbell, M. D. (1991). Getting results: A guide for government accountability. Washington, DC: Council of Governors Policy Advisors.
Strategic Context:
Towards the knowledge society/economy
• African governments have set themselves the objective of transforming their countries and the continent into a knowledge society that competes effectively in a global system
• The agenda… to become a knowledge society is increasingly driven by Africans. The African Union and, more specifically, NEPAD are setting the agenda (Britz et al. (2006) Africa as a knowledge society: a reality check. International Information & Library Review.)
• The South African Government has set itself the objective of
transforming South Africa into a knowledge society that competes effectively in a global system
• cf: National R&D Strategy
• National Plan for Higher Education (NPHE)
• Human Resources Development Strategy for SA
• Ten year innovation plan
… Towards the knowledge society/economy: a renewed interest in S&T funding
After the decline in support for S&T development in Africa during the 1990s, there is now a new realisation among most role-players of the importance of developing Science, Technology and Innovation (STI)/R&D capacity in developing countries. High-profile reports outlining new visions, priorities and directions for African STI have emerged, particularly the UNESCO Higher Education, Research and Innovation: Changing Dynamics report (2009), NEPAD’s African Innovation Outlook (2010 and 2014), the UN Rio+20 report (2012), and the World Bank Africa Strategy on strengthening competitiveness and employment.
These reports call for the international community’s intervention to assist in promoting technology development, transfer and utilisation in Africa, and to support African countries in developing effective STI institutions and the concomitant capacity to become global knowledge partners. The UN Millennium Project report (2009) argues that STI underpins every one of the Millennium Development Goals (MDGs) and is therefore a prerequisite for sustainable development.
Mouton, J (20 June 2014) Science granting councils in Sub-Saharan Africa supported by the IDRC. NRF Presentation, slide 10
Strategic Context: Towards the knowledge society/economy
• Research [knowledge] and development, and innovation
seen as drivers to the knowledge economy
• Increasingly African R&D [and innovation] systems, their
stakeholders (people/ researchers) and structures are
paying or “being forced” to pay attention to the
management of research and development (R&D) and its
performance.
[Figure: South Africa Research, Development and Innovation System value chain – a spectrum from curiosity-driven to needs/market-driven research: basic research and open research (public appropriability; agency core grants, NRF, National Research Facilities) through applied research, directed research, technology development and product development, to commercialisation and wealth creation (THRIP, TIA)]
Paying attention to management of R&D: Knowledge policies
(Knowledge) policies are high-level and long-term normative statements of desired end-states and require operationalisation through frameworks, strategies and plans.
– Such policies (and related strategies and plans) have differential impact: ranging from strong to weak, relatively direct to very indirect and both intended and unintended.
– Knowledge policies impact differentially on the different dimensions (nature, shape and volume) of knowledge outputs.
Johann Mouton, Measuring research performance at SA universities: context, trends and consequences. SATN Conference, 3 October 2013
Paying attention to management of R&D: Knowledge policies – SA examples
– National knowledge policies
• White Paper on Science and Technology (1996)
• Education White Paper 3: Programme for the transformation of Higher Education (1997)
– National strategies/ plans / frameworks
• National Plan on Higher Education (2001)
• National R&D Strategy (2002)
• DoE Research Funding Framework (1987/ 2003)
• HEQC Audit Framework (2004)
• DST Ten Year Innovation Plan (2007)
• National Development Plan (2012)
How do we know whether we have achieved, are making progress, or are regressing? … Tell the story
• R&D indicators are an integral part of assessing research performance.
• Impact/performance is measured at different levels, e.g.:
– Individual: personal promotions; reputation (D Jacobs/B Damonse); the professorship debate!
– Institutional: rankings
– National: summary South African dataset research profile (InCites data), ± 30 years cumulative (1979–2014): most productive authors; most cited authors; most active subject areas
– International
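National-level rankings such as "most productive authors" and "most cited authors" reduce to simple tallies over a publication dataset. A minimal sketch with hypothetical author/citation records (the names and figures below are illustrative, not from the InCites data the slide refers to):

```python
from collections import Counter

# Hypothetical records: (author, citations) pairs for one country's output.
records = [
    ("Author A", 12), ("Author B", 3), ("Author A", 7),
    ("Author C", 0), ("Author B", 25), ("Author A", 1),
]

# Most productive: count papers per author.
papers = Counter(author for author, _ in records)

# Most cited: sum citations per author.
citations = Counter()
for author, cites in records:
    citations[author] += cites

print(papers.most_common(1))     # [('Author A', 3)]
print(citations.most_common(1))  # [('Author B', 28)]
```

Real analyses add author-name disambiguation and fractional counting for co-authored papers, but the ranking logic is this simple.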
Key criteria or standards for good indicators (1)
• Availability of data for an indicator;
• Currency and consistency & continued collection of data;
• Feasible to collect and analyse data for an indicator;
• If not currently collected, availability of cost effective instruments for data collection and or ability to develop them;
• Indicator set is coherent and balanced overall;
[Figure: Number of peer-reviewed publications (smoothed), 1984–2012, by institution: Albany Museum, Ditsong-NHM, Iziko, NMB, PE Museum]
Hamish Robertson (2014) The state of museums research in South Africa. ICOM-SA, Durban
[Figure: Total pages per author (smoothed), 1984–2012, for the same five museums]
Hamish Robertson (2014) The state of museums research in South Africa. ICOM-SA, Durban
Key criteria or standards for good indicators, cont’d (2)
• Indicator is needed and useful;
– Is there evidence that this indicator is needed at national level?
– Which stakeholders need and would use the information collected by this indicator?
– How would information from this indicator be used?
– What effect would this information have on planning and decision-making?
Total research publications output (1990 – 2010)
Rank  University                          Ratio to UP   Total output   Share (%)
(1)   University of Pretoria                   –           19168.98      14.40
(2)   University of KwaZulu-Natal            1:1.11        17393.17      13.07
(3)   University of Cape Town                1:1.13        17010.91      12.78
(4)   University of the Witwatersrand        1:1.16        16531.75      12.42
(5)   University of Stellenbosch             1:1.34        14286.35      10.73
(6)   University of South Africa             1:2.0          9521.44       7.15
(7)   University of the Free State           1:2.6          7434.73       5.58
(8)   University of Johannesburg             1:2.8          6875.00       5.16
(9)   North-West University                  1:3.3          5854.44       4.40
(10)  Rhodes University                      1:4.1          4632.11       3.48
(11)  University of the Western Cape         1:6.1          3162.17       2.38
(12)  NMMU                                   1:6.4          2998.01       2.25
(13)  University of Limpopo                  1:9            2110.54       1.59
(14)  Tshwane University of Technology       1:16           1225.44       0.92
(15)  University of Zululand                 1:18           1084.04       0.81
(16)  University of Fort Hare                1:18           1081.26       0.81
(17)  CPUT                                   1:23            850.33       0.64
(18)  Walter Sisulu University               1:37            517.11       0.39
(19)  Durban University of Technology        1:41            464.10       0.35
(20)  Central University of Technology       1:54            356.84       0.27
(21)  University of Venda                    1:63            302.57       0.23
(22)  Vaal University of Technology          1:86            222.45       0.17
(23)  Mangosuthu Technikon                   1:445            42.78       0.03
(The ratio column expresses each institution’s output relative to the University of Pretoria’s.)
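The "Relative share" column in the table above is each institution's output as a fraction of the national total. A minimal sketch of that arithmetic in Python, using hypothetical figures for three institutions (the real table uses DHET-weighted publication units for 1990–2010):

```python
# Hypothetical weighted publication-output units per institution.
outputs = {
    "University P": 19168.98,
    "University Q": 17393.17,
    "University R": 1084.04,
}

# Relative share (%) = institution's output / national total * 100.
total = sum(outputs.values())
shares = {u: round(100 * v / total, 2) for u, v in outputs.items()}

for university, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{university}: {share}%")
```

Rounding each share independently means the column may not sum to exactly 100.00, which is why published tables sometimes carry a footnote to that effect.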
Key criteria or standards for good indicators, cont’d (3)
• Has technical merit;
• Indicator is fully-defined;
• Field-tested or used in practice
Sources of data for Research Performance Indicators:
Impact of knowledge policies and strategies on research (1)
Sourced from: Johann Mouton, Measuring research performance at SA universities: context, trends and consequences. SATN Conference, 3 October 2013
Knowledge policies and strategies have been devised to impact different dimensions of research (knowledge production).
1. The nature of research (basic, applied, strategic, Mode 1 and Mode 2): Steering research production into more collaborative, industry-linked research (e.g. THRIP, Innovation Fund grants)
2. The quality of research: Emphasis on quality-assurance systems and processes (HEQC Audits/ Assaf Journal Reviews/NRF Rating)
3. The shape of research (scientific field distribution): Greater emphasis on SET fields, prioritising IKS, Space science, Astronomy, etc.
4. The volume of research (magnitude of outputs): Increasing the volume of research outputs and of knowledge outputs (masters and doctoral graduates)
Sources of data for Research Performance Indicators:
Impact of knowledge policies and strategies on research (2)
5. Research efficiency (or productivity): Whether individual (and institutional) productivity has increased: are we getting more outputs given the inputs (funding, etc.)?
6. Research collaboration: Whether scientists and scholars pursue more collaboration, both nationally and internationally
7. The visibility and (international) impact of research: The extent to which research in SA is more visible, recognised and cited internationally.
8. The transformation of research: The extent to which more female and black scientists and scholars participate in research production, i.e. whether we have expanded the human capital base of research.
What’s the common thread in these
indicators, their data & analyses?
• Bibliometric/s
• Informetric/s
• Scientometric/s
So, what role/opportunities for information scientists/ professionals
• Issues related to international competitiveness
– What is it that SA wants to do in international competitiveness?
– Who is doing what?
– How is South Africa performing?
– Reliable measures of impact and competitiveness?
– Interpretation of international measures of performance and ranking systems (Max Price M&G article; THES; Shanghai)
• What does the impact factor mean? The h-index?
• Reliability of measure?
• Difference of bibliometric measures in different fields
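The two measures the slide asks about are easy to state precisely. A minimal sketch with hypothetical citation counts: the h-index (Hirsch) is the largest h such that an author has h papers with at least h citations each, and the two-year journal impact factor is a simple ratio of citations to citable items.

```python
def h_index(citations):
    """Largest h such that the author has h papers
    with at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def impact_factor(citations_to_prev_two_years, items_in_prev_two_years):
    """Two-year impact factor for year Y: citations in Y to items
    from Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / items_in_prev_two_years

print(h_index([10, 8, 5, 4, 3]))  # 4
print(impact_factor(250, 100))    # 2.5
```

The field-dependence flagged in the last bullet falls straight out of these definitions: fields with denser citation habits (e.g. biomedicine) produce systematically higher values than fields with sparse citation habits (e.g. mathematics), so raw cross-field comparisons mislead.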
…what role/opportunities for information scientists/ professionals cont’d
• Identification of reviewers (for rating & grant proposals)
– Use of ISI Web of Knowledge
– Scopus
• Collecting and interpreting indicators
– Citation metrics
– Discipline-specific
• ISI – 8 282 journals (2007)
– SET: 77% (6 417 journals)
– SS&H: 23% (1 865 journals)
• Scopus – 17 000 journals (2007)
– Life sciences: 20% (3 400 journals)
– Physical sciences: 32% (5 500 journals)
– Health sciences: 31% (5 300 journals)
– Social sciences: 16% (2 800 journals)
…what role/opportunities for information scientists/ professionals? cont’d
• Measure and provide evidence of progress and/or regression in international competitiveness
• Need to identify and provide information and guidance on who is involved in the system (nationally and internationally)
• Provide understanding of publication patterns between and among disciplines, i.e. the sequence and arrangement of joint/shared authorship; varying research outputs; differentiated impact factors, etc.
But…NB!!!
Bibliometrics, informetrics, scientometrics …
NOT EXCLUSIVELY for Information scientists/ professionals
Therefore…
Value-add, find a niche & partner:
• Liaison with research offices
• Liaison with researchers
• Pre-evaluation of outputs – especially books (accuracy of data reported – bibliographic information)
• Rating system/grant applications – support for the application process
• DHET subsidy – facilitation of appropriately completed submissions
• Research integrity – data sources, references, etc.
• Scientific data preservation and management