
Page 1: Assessing the Programme performance: What worked, what can be improved?

Assessing the Programme performance: What worked, what can be improved?

Tarja Richard

MED JTS Coordinator (interim)

‘ANALYSIS, EVALUATION AND DEVELOPMENT OF THE MED PROGRAMME’

Campobasso, June 13th 2011

Page 2

1. Traditional Calls:
- Challenges;
- Assessment: methods, tools, selection procedure, evaluation, adjustments;
- Strengths and weaknesses of projects;

2. Strategic call:
- Challenges;
- Assessment: selection procedure, expectations;
- Results;

3. Future perspectives?

CONTENT

Page 3

Ensure project quality: offer the possibility to improve applications with potential:
- 2-step selection procedure;
- Conditions / Recommendations.

Ensure fair assessment:
- Grid of criteria.

TRADITIONAL CALLS: CHALLENGES

High number of applications
Different evaluators

Page 4

General evaluation / individual evaluation;
Discussion / double evaluation;
Based on a grid of criteria + an internal scale of values;
Eligibility and quality check in the PRESAGE system.

1. Coherence with the Axis – political frameworks;
2. Coherence with MED objectives;
3. Geographical balance – transnationality;
4. Quality of project / implementation;
5. Partnership;
6. Budget.

TRADITIONAL CALLS ASSESSMENT: METHOD AND TOOLS

Page 5

1st call for projects – 2 phases – 11 months (273 eligible projects out of 531), assessed by 4 Project Officers:

Launch PA (1 month) → Sub. PA (273/531) (4 months) → Pre-S (109) (1 month) → Launch A (1 month) → Sub. A (4 months) → Selection (50/108)

2nd call for projects – 2 phases – 11 months (330 eligible projects out of 447), assessed by 4 Project Officers:

Launch PA (1 month) → Sub. PA (330/447) (4 months) → Pre-S (90) (1 month) → Launch A (1 month) → Sub. A (4 months) → Selection (52/76)

TRADITIONAL CALLS ASSESSMENT: SELECTION PROCEDURE

Page 6

2-step application selection procedure:

No real improvement of the whole application between phases, despite the conditions set by the Selection Committee;

The draft application and the final one were too similar: the 1st phase was too heavy;

Long process and low programming rate (from 978 pre-proposals to 102 programmed projects).

TRADITIONAL CALLS ASSESSMENT: WHAT HAD TO BE IMPROVED?
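The 'low programming rate' quoted above can be checked with simple arithmetic. A minimal sketch, using only the figures given in this presentation (the per-call counts come from the selection-procedure slide):

```python
# Figures quoted in the presentation: 978 pre-proposals over the two
# traditional calls, 102 projects eventually programmed.
pre_proposals = 978
programmed = 102

print(f"Overall programming rate: {programmed / pre_proposals:.1%}")  # → 10.4%

# Per-call funnel: submitted pre-applications -> finally selected projects.
calls = {
    "1st call": {"submitted": 531, "selected": 50},
    "2nd call": {"submitted": 447, "selected": 52},
}
for name, c in calls.items():
    print(f"{name}: {c['selected'] / c['submitted']:.1%} of submitted pre-applications programmed")
```

The two calls are consistent with the totals (531 + 447 = 978 submissions; 50 + 52 = 102 programmed projects), so roughly one pre-proposal in ten survived the whole procedure.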

Page 7

Taking better advantage of the 2-step selection procedure: recommendations drafted at the pre-selection stage.

Improving the coherence of evaluations:
- between evaluations;
- between evaluators;
through a common understanding of criteria, a common scale of values and harmonized comments;

Internal grid (scale of values); double evaluation / discussion.

TRADITIONAL CALLS ASSESSMENT: ADJUSTMENTS IN TOOLS AND METHODS FOR THE 2ND CALL

Page 8

Open calls for all Priority Axes: high number of applications – 'mass treatment'.

No possibility for exchanges between programme bodies (JTS and NCPs) and applicants in the application phase.

Projects having a weak correspondence with programme objectives.

Partnerships often not sufficiently managed by the Lead Partner and not strong from the beginning, leading to numerous partner changes in projects.

TRADITIONAL CALLS: STRENGTHS AND WEAKNESSES OF 'TRADITIONAL' PROJECTS

Page 9

A capitalisation approach was not foreseen by projects from their beginning, but the need to capitalise and to share experience is clear in many projects.

Spontaneous clusterisation is happening with part of the projects.

Many project results are lost after the end of activities: the programme needs to create an 'electronic library' to improve the use of deliverables and results.

Projects are of varying content quality; the key problem is that the operational programme has not been precise and concentrated enough to call for targeted activities.

TRADITIONAL CALLS: STRENGTHS AND WEAKNESSES OF 'TRADITIONAL' PROJECTS

Page 10

STRATEGIC CALL: CHALLENGES

Low number of applications
Terms of reference

Ensure project quality: offer the possibility to improve applications with potential:
- 2-step selection procedure;
- Conditions / Recommendations.

Ensure fair assessment:
- Grid of criteria.

Page 11

1 phase including a draft proposal – 11 months.

Launch (1 month) → Submission of drafts (4 months) → Applications (5 months) → Selection.

35 draft proposals for the two objectives (renewable energies and maritime safety).

12 final applications submitted; 3 projects programmed in 2011.

Evaluation by 4 Project Officers: evaluation grid adapted to strategic projects; seminar with Lead Partners in the draft proposal phase; complementary opinions from an external expert.

Two meetings of the Selection Committee.

STRATEGIC CALL ASSESSMENT: SELECTION PROCEDURE FOR THE 1ST CALL

Page 12

Coherence with MED Programme and Terms of reference;

Need-based approach;

Realism / feasibility of objectives;

Connection with existing public policies;

Innovative approach / added value;

Coherence of the partnership to reach the objectives of the project;

Impact of the project / long term perspectives;

(Perspective North / South).

STRATEGIC CALL ASSESSMENT: CONTENT EXPECTATIONS FOR STRATEGIC PROJECTS

Page 13

The methodology, with built-in exchanges with the applicants and Terms of Reference, was successful in:

- reducing the number of proposals;
- better focusing the contents.

The call for maritime safety projects was not fruitful: none of the 5 eligible proposals was selected, and the call was relaunched.

The priorities/objectives of the 1st call were topics that had not received enough proposals, or only proposals of insufficient quality. The question of the relevance of programme priorities to key stakeholders must be raised if the call for strategic projects does not bring quality proposals.

STRATEGIC CALL RESULTS: EVALUATION OF THE 1ST CALL FOR STRATEGIC PROJECTS

Page 14

Limited range of priorities. Even more focused Terms of Reference: limits of the 'top-down' approach?

Quality follow-up: beyond administrative management.

Exchanges between the programme (JTS and MC) and projects in the application phase:
- conditioned by the number of proposals received;
- conflict of interest if evaluators participate in project development?

More added value for the money spent.
Complementarity of transnational projects with cross-border and regional programming?
How to attract new structures with no experience of ETC projects but with content potential?

STRATEGIC CALL RESULTS: HOW CAN WE IMPROVE?

Page 15

Programming:

Preparing the OP with a strategic approach to the cooperation space;
Few priorities, but with an integrated approach;

Pilot projects shared with regional programmes;

Capitalisation and clusterisation in-built in projects since the beginning.

Management and implementation:

External expertise in evaluating project proposals;

Different roles for new and for more experienced partners in a project;

JTS composed of management and content units.

FUTURE PERSPECTIVES?

Page 16

Joint Technical Secretariat
MED PROGRAMME

[email protected]

www.programmemed.eu

THANK YOU FOR YOUR ATTENTION!