
Page 1: Engineering Sciences

Engineering Sciences

F. Lapicque (for J. Seville)

IPN METODIKA

www.metodika.reformy-msmt.cz

Page 2: Engineering Sciences

Outline of the presentation

• 1- Metodika: how was it understood?

• 1.1. By surveyed Research units

• 1.2. By the panel members

Survey: 10 main questions

NB: when agreed

• 2- A brief comparison with the system used in France for evaluation…


Page 3: Engineering Sciences

1- Metodika- How was it understood?

1.1 by the surveyed research units

Question 1- Overall point of view

• Too many questions and too little space for commenting on strategy and future

• The formulation of the questions could be improved

• Overlap between questions/items

Point of view of the panel members

• Some reluctance from the RUs in answering the items

• Combined questions on qualitative and quantitative aspects in the main part of the application


Page 4: Engineering Sciences

1- Metodika- How was it understood?

1.1 by the surveyed research units


Q2- Formulation of the five criteria

• Research excellence and research performance are not clearly distinguished

o one refers to the very best outputs

o the other to a collection of data (money, papers, citations, etc.)

• Things are not clearly defined and formulated on the form.

• Some prefer the CAS evaluation

Q3- Evaluation of excellence: 1 or 2% of output?

• Discussion on the figure: is it 2%? Or 3%?

• First define excellence: the term is often used, but with other meanings and purposes

Q4- Bibliometric report

• A number of answers and comments, but hardly understandable

• Personal comment: this strongly depends on areas and topics!

Page 5: Engineering Sciences

1- Metodika- How was it understood?

1.1 by the surveyed research units


Q5- Compare Metodika to the former "coffee grinder"

• RUs find Metodika interesting, also because of the presence of external experts; however, it is more complex.

• Questions about the experts' impartiality and experience in the job. How were they recruited?

• For others, Metodika is better and more complete, with various weights, the values of which are sometimes not agreed.

Q6- Comments and suggestions on Metodika?

Does this question not just emphasise Q5?!

• Most answers: please make it simpler

• Please provide evaluation experts of sufficient quality

• Would like things to be better prepared and organised

Page 6: Engineering Sciences


1- Metodika- How was it understood?

1.1 by the surveyed research units


Q7- Has the panel reached a correct view on…?

• Well, it is often expressed that RUs have the feeling of having been under-evaluated.

(My point of view: yes, but 30% of the items were not really correctly filled in! For this reason, we mentioned "with potential to…")

• The report is sometimes reproached for being too neutral and too formal, with just a collection of data but no mention of the evaluation of outstanding research outputs.

Q8- Are Panel’s recommendations useful?

• To some extent, but a number of things were already known

(OK, but they were actually poorly presented!)

• Often a regret that the experts do not give means for improvement

(probably right, but sometimes difficult for us, e.g. useless research…)

Page 7: Engineering Sciences

1- Metodika- How was it understood?

1.1 by the surveyed research units

Q9- Estimation of time

• Very diverse answers, actually: from altogether more than 40 man-days to approximately 40 hours.

• Nevertheless, regardless of the time spent, the selection of papers, data, etc. has sometimes been poorly documented


Personal comment

• Both interest and reluctance from the evaluated RUs in considering the test

• Too bad that no visits could be organised in some areas

Page 8: Engineering Sciences

1- Metodika- How was it understood?

1.2 by the panel members


Q4- Do you consider visits important?

Yes, for sure for both the visited units and the experts!

Q5- Is the grading of outputs by independent reviewers useful?

• Yes, absolutely: 70% of the reviews made seem really good

• However, the selection of outputs made by the RU may be questionable

Q6- Assessment of the RU: based on the selected outputs or on the whole lot?

"Output" alone does not suffice; would prefer to have both

Judgements based on bibliometrics

• Engineering sciences differ from other disciplines on this point

• Within engineering sciences, paper production rates may differ quite a lot, so…

Page 9: Engineering Sciences

1- Metodika- How was it understood?

1.2 by the panel members

Q7- Do you consider bibliometric data important in your area?

• Yes, within limits and with clever evaluation

• Avoid overpublication and citation hunters!


Q8- Do you consider a calibration exercise necessary and useful?

• Certainly

• Presumably too short; panel members would have preferred to contribute to its design

Q9- Comments on the pilot testing

• Great commitment and enthusiasm of the organising team

• Unfortunately, the quality of the reports transmitted was sometimes insufficient, but this might be understood…

• Presumably a lack of time for the organisers

Page 10: Engineering Sciences

2- Comparison with the current system in France: "HCERES"


Haut Conseil de l'évaluation de la recherche et de l'enseignement supérieur (High Council for the Evaluation of Research and Higher Education)

• Formerly AERES; the evaluation system in France for more than 10 years

• Evaluation carried out every five years ("plan quinquennal", five-year plan)

• For establishments (universities), research units, or degree programmes (formations/diplomas)

Focus on the evaluation of research units (CNRS* and university)

UPR (CNRS research laboratories)

UMR (mixed CNRS-university research units)

EA (research teams)

(*) or other public research centres

Page 11: Engineering Sciences

2- Comparison with the current system in France: "HCERES"


The application: three files

1- Evaluation form.

Short and sharp, with a general presentation of the lab, a summary and balance per group/team for the 5 years, involvement in education through research, and strategy.

2- Appendices

A more detailed description with all data and quantitative information: outcomes, contracts, organisational aspects (safety, staff, training for staff members)

3- Data on the present period and on the future

Topics, human resources, budget

Page 12: Engineering Sciences

2- Comparison with the current system in France: "HCERES"


Criteria for evaluation

• Production of knowledge and scientific level

• Attractiveness and reputation in the community

• Interaction with the societal, economic and cultural environment

• Organisation within the RU

• Commitment to education through research

• Strategy and scientific prospects for the next 5 years

NB: the criteria do not correspond to sections of the application files

Page 13: Engineering Sciences

2- Comparison with the current system in France: "HCERES"


Time schedule

Example of the LRGP (chemical engineering, Nancy, France)

Jul. 2015: Start collecting and updating references, contracts and PhD data

Jan. 2016: Balance and summary written per group; general presentation of the lab

Mar-May 2016: Preparation of the appendices

June-Aug 2016: Exchanges with the University

Sept. 2016: Final version of the application

Oct 2016: Validation of the various documents. Transfer to reviewers

Dec. 2016: Two-day intensive visit, with large emphasis on lab organisation and life, in addition to strategy and prospects

Page 14: Engineering Sciences

2- Comparison with the current system in France: "HCERES"


Evaluation, marks and levels

• A detailed report is prepared by the evaluation committee (6-10 persons), with a point-by-point assessment

• Evaluation is made both for the whole lab (330 p.) and per group (av. 65 p.)

• Comments and suggestions for the next period

Formerly: marks A, B and C, later A+, A, A-, etc.

No more marks now, but a significant impact on funding, reputation, success in applications, recruitment, PhD grants, etc.

Page 15: Engineering Sciences

Thank you for your attention!

www.metodika.reformy-msmt.cz