Call management Master Class, 17-18 June 2013


Transcript of Call management Master Class, 17-18 June 2013

Page 1: Call management Master Class, 17-18 June 2013

Call management Master Class, 17-18 June 2013

Organised by WP3 Mutual Learning

Date: 17 June at 14.00 to 18 June at 13.00

Venue: KoWi premises

Rue du Trone 98, 8th floor

Brussels, Belgium

Participants: KBBE ERA-NETs call managers and PLATFORM WP3

Chair: Ulla Sonne Bertelsen

Organisation: Ulla Sonne Bertelsen, Christian Listabarth and Christine Bunthof

Master Class rationale and plan

Starting point after the June 2012 meeting: Partners decided that WP3 task 3.1 should focus on call organisation (‘Efficient call management for optimal use of resources’) and welcomed the approach of organising a master class for the KBBE ERA-NETs. The target group for this Master Class would be the persons in the ERA-NETs who work with call management.

The content of the Master Class: To be finally decided in April at the PLATFORM Annual Workshop, when all partners and a larger group of network coordinators are present, after a discussion of what would benefit the KBBE ERA-NETs the most.

Suggestion: Organisers will pick a case, call managers will identify differences in current procedures, discuss added value compared to extra efforts, and provide recommendations for the PLATFORM consortium.

Expected outcome of the Master Class: The call managers will be familiar with procedures of other ERA-NETs and get ideas for improvements of their own call procedure.

Page 2: Call management Master Class, 17-18 June 2013

8 May 2013 


Agenda for the call management Master Class

17-18 June 2013

Organised by WP3 Mutual Learning

Date: 17 June at 11.00 to 18 June at 15.00

Venue: KoWi premises

Rue du Trone 98, 8th floor

Brussels, Belgium

(map on page 4)

Programme outline

Monday June 17

Welcome & introduction

A. Support tools for matchmaking and handling of applications and evaluation
B. Evaluation and ranking of applications
C. Selection of recommended applications, funding decision and project negotiation

Tuesday June 18

C. Selection of recommended applications, funding decision and project negotiation – continued

D. Joint project monitoring
Overview of results and last inputs

Page 3: Call management Master Class, 17-18 June 2013


Detailed Agenda

Monday June 17

11.00  Welcome and tour de table

INTRO. A quick introduction to the ERA-LEARN call toolbox and how to get help for the topics that will not be discussed in this Master Class, and an introduction to the topics of the Master Class.

11.30  A. Support tools for matchmaking and handling of applications and evaluation

The topic includes a discussion of security standards (the system, the handling of the proposals, and storage and access).

1) CASE. Iver Thysen from ICT-AGRI will introduce their Meta Knowledge Base, which is being transformed into an open-source system for matchmaking and online submission and evaluation of applications. It can be used by all ERA-NETs.

2) SHARING. Round table in which the participants share their working methods and experiences, for example: Do you assist in matchmaking? Do you use an electronic submission system? How do you secure the data?

3) MODELS. The practices are summarised into a limited set of models.

4) DISCUSSING. To get more insight into the differences, the advantages and disadvantages, and the applicability and cost-effectiveness of the approaches.

5) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group.

13.00‐13.30  Sandwich lunch  

13.30  B. Evaluation and ranking of applications

Quality assurance of your evaluation and ranking procedure is a key factor in joint funding of high-quality international research projects. The evaluation process is critical for the selection of high-quality projects, and some call managers see it as the most stressful period in a call procedure.

1) CASE. Paul Beckers from ERA-CAPS will introduce their method of expert and panel evaluation, which is based on best practice in previous ERA-Net Plant Genomics (ERA-PG) calls, and the possibility to outsource the peer review process.

2) SHARING. Round table in which the participants share their working methods and experiences, for example: What are the pros and cons of introducing a pre-proposal step? How is your eventual first-step evaluation made? How do you identify and work with the reviewers? Do you use a panel meeting, with the same or different people than those who reviewed the individual proposals? Examples of centralised, semi-centralised and decentralised evaluations. What is the basis of your ranking list? Do you use categories (high quality / middle / low) or a full ranking (1, 2, 3, etc.)? Do you include a rebuttal process?

3) MODELS. The practices are summarised into a limited set of models.

4) DISCUSSING. How to ensure sufficient scientific quality in the evaluation of the proposals. What should be the mandates of external peers versus review panel members? To get more insight into the differences, the advantages and disadvantages, and the applicability and cost-effectiveness of the approaches.

5) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group.

 

 

 

Page 4: Call management Master Class, 17-18 June 2013


16.30  C. Selection of recommended applications, funding decision and project negotiation – part I

Part I concerns the selection of projects and the funding decision.

1) CASE. Nicolas Tinois from FACCE-JPI will introduce different experiences on selection of applications and funding decisions.

2) SHARING – SELECTION AND FUNDING DECISION ONLY. Round table in which the participants share their working methods and experiences, for example: What do you do to ensure comparability of projects (e.g. big and small projects, projects on different topics, or inter-person differences in using scales)? How do you handle gaps in funding and the distribution of top-up funding?

3) MODELS. The practices are summarised into a limited set of models.

4) DISCUSSING. How to ensure a common scale for the evaluation. To get more insight into the differences, the advantages and disadvantages, and the applicability and cost-effectiveness of the approaches.

5) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group.

18.00  End of the first day 

20.00  Dinner in the centre of Brussels: Hemispheres, Rue Léopold 29, http://www.hemispheres-resto.be/

Tuesday June 18

9.00  C. Selection of recommended applications, funding decision and project negotiation – part II

Part II concerns the project negotiation. The distributed-pot mode of funding, which is used most often in ERA-NET calls, combined with unbalanced participation in proposals compared to national budget commitments, may create complex puzzles for funders.

1) CASE. Nicolas Tinois from FACCE-JPI will introduce different experiences on negotiation with the selected projects.

2) SHARING – PROJECT NEGOTIATION ONLY. Round table in which the participants share their working methods and experiences: finalisation of the decisions in time, demands for changes, demands before project start.

3) MODELS. The practices are summarised into a limited set of models.

4) DISCUSSING. How to keep a strict procedure? How to spend as much of the allocated funds as possible? To get more insight into the differences, the advantages and disadvantages, and the applicability and cost-effectiveness of the approaches.

5) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group.

12.00  D. Joint project monitoring. Methods of following up the projects funded by the ERA-NETs during their lifetime

Most funders have monitoring procedures for funded projects in place. How can national monitoring systems be replaced or transformed so that they best serve the transnational character of the ERA-NET programme?

1) CASE. Christine Bunthof will introduce how the funded research projects were monitored in ERA-PG.

2) SHARING. Round table in which the participants share their working methods and experiences: requirements from projects, national or transnational evaluation, timing of reporting.

3) MODELS. The practices are summarised into a limited set of models.

4) DISCUSSING. Can a joint monitoring system replace national procedures? How? Alternatively, could national systems be harmonised when monitoring jointly funded projects? How to act if different funding bodies give a different assessment (particularly relevant when a go/no-go decision is to be made)? To get more insight into the differences, the advantages and disadvantages, and the applicability and cost-effectiveness of the approaches.

5) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group.

Page 5: Call management Master Class, 17-18 June 2013


12.30‐13.00  Sandwich lunch 

  Monitoring topic continued 

14.00  Overview of the results achieved and last input to the combined list of recommendations  

15.00  End of meeting 

Location of the meeting:

Page 6: Call management Master Class, 17-18 June 2013

Call management Master Class, 17-18 June 2013 Organised by WP3 Mutual Learning

Participant List for Report (columns: Name, Email, Network, Day 1, Day 2)

Alex Percy-Smith [email protected] ERA-ARD yes yes

Alois Egartner [email protected] EUPHRESCO yes yes

Anabel de la Peña [email protected] IPM yes yes

Anna I. Macey (BBSRC, SO) [email protected] ERA-CAPs yes yes

Annette Kremser [email protected] ERASynBio yes yes

Dominique Vandekerchove [email protected] ANIHWA yes no

Elfriede Fuhrmann [email protected] RURAGRI no yes

Ignacio Baanante Balastegui [email protected] ERASysAPP yes yes

Iver Thysen [email protected] ICT-AGRI yes yes

Johannes Bender [email protected] SUMFOREST yes yes

Katerina Kotzia [email protected] CORE Organic and COFASP yes yes

Marie Ollagnon [email protected] ARIMNET yes yes

Marion Karrasch-Bott [email protected] ERA-IB yes yes

Marta Norton [email protected] ERA-MBT yes yes

Matte Brijder [email protected] ERA-NET BIOENERGY yes yes

Mika Kallio [email protected] WOODWISDOM-NET yes yes

Nicolas Tinois [email protected] FACCE-JPI yes yes

Paul Beckers [email protected] ERA-CAPs yes yes

Petra Schulte [email protected] ERA-MBT yes yes

Veronika Deppe [email protected] ETB yes yes

Organisers

Christine Bunthof [email protected] PLATFORM yes yes

Christian Listabarth [email protected] PLATFORM yes yes

Ulla Sonne Bertelsen [email protected] PLATFORM yes yes

ERA-NETs not attending: EMIDA, SAFEFOODERA, FORESTERRA, BiodivERsA

Page 7: Call management Master Class, 17-18 June 2013

PLATFORM workshop, 17 & 18/6/2013

• ERA LEARN Toolbox

• Concept & Status Quo

• Workshop issues

WP 3 Mutual Learning

Christian Listabarth, Brussels, June 17/18 2013

Focus of this workshop: To provide

• specific contributions

• complementary contributions

• PLATFORM created contributions

The ERA LEARN background (01/2011)

ERA-LEARN Consortium – ERA-NET participations per partner:
FFG: 22, EI: 7, RCN: 39, DLR: 22, AKA: 22, VDI/VDE: 6

• General expertise and experience

• ERA NET scope is not sectorial nor specialized

• Basic and applied science

• Provide generic rather than specific and normative information on calls

PLATFORM consortium: > 20 partners

ERA-NET participations: mostly 1, but up to 3 (4) participations

• Specific expertise and experience

• ERA NET scope is sectorial and specialized

• Basic and applied science

• Provide specific and recommending information on calls

Toolbox, first considerations


Not all phases will be tackled today

Selected topics (Paris, April 2013)

=> other topics: use toolbox

=> attend a training workshop (September/October 2013)

Toolbox will be updated during summer

=> Contribute to it

Huge selection of documents

=> available for ERA LEARN?

All phases of a call are covered

Toolbox, structure

Why this structure?

• Top priority: easy to use
• Not another large PDF document
• Call execution implies thinking ahead and corresponding planning in advance – many crosslinks between the elements of the call cycle
• Module structure is suitable for quick updates and extensions
• Interactive online manual is a convenient solution

Assembling & maintaining the toolbox

[Diagram: the toolbox draws on the ERA-NET community via NETWATCH (http://netwatch.jrc.ec.europa.eu), on ERA-NET coordinators via questionnaires and online feedback, on ERA-LEARN internal expertise, and on PLATFORM internal expertise.]

Outcome of the workshop

• Recommendations on good practice resulting from specific experience
• Results should contribute to and fit into the ERA-LEARN scheme
• Results should improve the toolbox


Page 8: Call management Master Class, 17-18 June 2013

ICT-AGRI Call Submission and Administration

PLATFORM Master Class, 17-18 June 2013

Iver Thysen, [email protected]

ICT-AGRI, DASTI

ICT-AGRI Meta Knowledge Base with Call Submission and Administration
• The ICT-AGRI Meta Knowledge Base is a complete site for running and administrating the ERA-NET. See it at http://db ictagri.eu
• Call Submission and Administration is an important part of the site
• The MKB was developed by the ERA-NET, implemented in proprietary PHP scripts and hosted by a commercial web hosting company
• A new version is built on the Drupal open source content management system
• The new version is open source and can be downloaded free of charge. It is modular, with many facilities to choose among.
• It includes a configurable Call Submission and Administration system
• Downloads are available from September 2013, the call system from October 2013

Page 9: Call management Master Class, 17-18 June 2013
Page 10: Call management Master Class, 17-18 June 2013

ERA-NET for Coordinating Action in Plant Sciences

Case: ERA-CAPS Joint Call for Proposals 2012

Evaluation and ranking

Paul Beckers, ERA-CAPS Coordinator, Call Secretariat, DFG

Master Class Call Management, 17-18 June 2013, Brussels

Outline
• Facts and figures first call for proposals
• Steps in evaluation and ranking
• Basic principles in peer review
• Outsourcing peer review; ESF case

Facts and figures, first joint Call

• 16 ERA-CAPS partner organisations or ERA-CAPS-associated organisations
• 15 countries (inside the EU plus New Zealand)
• 110 eligible proposals
• Average proposal: 5 teams
• Parallel Call from NSF encouraging mutual collaboration with ERA-CAPS consortia

Facts and figures, first joint Call

• Molecular plant science
• No specific topics
• Non-exclusive themes of common interest to several of the funding organisations:
  • Food Security
  • Non-food crops
  • Adaptation to a changing climate
  • Biotic/abiotic stresses

Parallel ERA-CAPS and NSF Call

Steps in the evaluation and ranking process:
one-step submission & eligibility check -> External Peer Review -> Rebuttal -> Review Panel

Submission & eligibility check

Submission of full proposals through the electronic platform at DFG (Elan)
- Guidelines for submission by a single coordinator

Eligibility check by
- Call Secretariat: completeness and format
- National Call Coordinators: compliance with national regulations
- Guidelines for Application / Call Notice

External Peer Review

Requirements for external referees
- independent scientific experts (working in a personal capacity and not representing any organisation)
- at least three years of scientific activity after his/her PhD
- publications in refereed journals, text books, invited lectures, awards, leading academic positions, etc.
- to be checked on personal webpages, PubMed, Google Scholar, etc.
- preferably three external referees per proposal

Recruitment from various sources
- existing databases (ERA-PG, NFOs, etc.)
- National Call Coordinators
- Review Panel members
- coordination by the Call Secretariat

Page 11: Call management Master Class, 17-18 June 2013

External peer review

Criteria for ER and RP

• novel, innovative research within the scientific scope of the call

• scientific quality of the project

• feasibility of the project

• transnational added value and complementarity of expertise

• scientific track record and potential of applicants

• adequacy of resources, cost effectiveness and project management

• economic, societal and environmental relevance (if appropriate)

External Peer Review

Rating by external referees (overall score and corresponding recommendation):
A (excellent): Fund
B (very good): Fund
C (good): Fund if available budget
D (average): Do not fund
E (poor): Do not fund

Rebuttal

Applicants are allowed to comment on the external review reports
- maximum 500 words per report
- within 7 days after receipt
- only to resolve uncertainties that could arise when reading the External Referees' reports

Review Panel

Nomination of candidates
- National Call Coordinators propose names
- geographical spread
- coverage of the call topics of the submitted proposals
- workload around 25 proposals per member
- three readers (1 first and 2 second) per proposal
- 110 × 3 / 25 ≈ 14 members, plus chair

Mandate: provide a consolidated evaluation summary and rating for each proposal and provide a grouped list for further consideration in the Moderating Panel
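The workload arithmetic on this slide (110 proposals, three readers each, roughly 25 proposals per member) can be written out explicitly. The following sketch is illustrative only and is not part of the ERA-CAPS material; the function name and parameters are ours.

```python
# Illustrative sketch (not from the ERA-CAPS slides): estimating the Review Panel
# size from the workload figures quoted above.
import math

def panel_size(n_proposals: int, readers_per_proposal: int, max_load_per_member: int) -> int:
    """Smallest number of panel members (excluding the chair) such that no member
    has to read more than max_load_per_member proposals."""
    total_readings = n_proposals * readers_per_proposal
    return math.ceil(total_readings / max_load_per_member)

# ERA-CAPS first call: 110 eligible proposals, 3 readers each, ~25 proposals per member
print(panel_size(110, 3, 25))  # -> 14, plus a chair
```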

Review Panel

First reader
- judgment on the quality of the referee reports
- short overall evaluation summary of the proposal and the referee reports
- preliminary rating based on the referee reports and own judgment

Second readers
- brief comments rather than a full summary, and a preliminary rating

All RP members provide their assessments and scores to the Call Secretariat in advance of the meeting

Review Panel

Criteria for ER and RP: the same seven criteria as listed above under the external peer review.

Review Panel

Additional considerations:

• Proposals should take into account and search for solutions to the problems of data sharing and management

• There should be demonstration of due consideration of the issues of intellectual property protection and envisaged exploitation

• Exchange of personnel for significant periods during the lifetime of the project will be encouraged

Review Panel

Score definitions and guidance:

4 – 5: Novel and innovative work that is at the forefront internationally and likely to have a significant impact in the field. Would always be expected to be successful under international peer review. Would be within the top 10% of proposals considered by the RP. AND: the project team demonstrates complementarity of expertise and the proposed collaboration demonstrates transnational added value in addition to true co-operation. -> Fund

3 – 3.9: Novel and innovative work that is highly internationally competitive in a significant proportion of the research proposed and will answer important questions in the field. Would be expected to be in the top 30% of proposals considered by the RP. AND: the project team demonstrates complementarity of expertise and the proposed collaboration demonstrates transnational added value in addition to true co-operation. -> Fund if available budget

2 – 2.9: Likely to advance the field of knowledge, leading to publication in international journals. AND: the project team demonstrates complementarity of expertise and true cooperation within the collaboration. Worthy of support if sufficient funds are available. -> Fund if available budget, and if national regulations allow

1 – 1.9: Scientifically sound but not internationally competitive. Unlikely to lead to significant advancement in the field. OR: significant concerns regarding cooperation within the project team. -> Do not fund

0.1 – 0.9: Non-competitive, unlikely to lead to advancement in the field. OR: a scientifically flawed project. OR: a clear lack of collaboration within the proposed team. -> Do not fund

0: Outside the scientific scope of the call. -> Do not fund

Page 12: Call management Master Class, 17-18 June 2013

Review Panel

Suggested grouping for the call funding recommendation after RP deliberation: A+ and A (highly recommended), B, C (not recommended)

Scores going into the Review Panel meeting:
Proposal number | Acronym | ER score 1 | ER score 2 | ER score 3 | RP score 1 | RP score 2 | RP score 3 | Average score
001 | ZZZ | C | C | D | 2 | 2 | 2 | 2
002 | YYY | B | B | C | 3 | 2 | 3 | 2.7
003 | XXX | A | B | A | 4 | 4 | 4 | 4
004 | WWW | A | A | A | 5 | 4 | 5 | 4.7
005 | VVV | A | C | C | 3 | 3 | 2 | 2.7
… | … | … | … | … | … | … | … | …

Grouping during the Review Panel meeting:
Proposal number | Acronym | ER score 1 | ER score 2 | ER score 3 | RP score 1 | RP score 2 | RP score 3 | Average score | Group
4 | WWW | A | A | A | 5 | 4 | 5 | 4.7 | A+
3 | XXX | A | B | A | 4.5 | 4.5 | 4.5 | 4.5 | A+
2 | YYY | B | B | C | 3 | 2 | 3 | 2.7 | B
5 | VVV | A | C | C | 3 | 3 | 2 | 2.7 | B
1 | ZZZ | C | C | D | 2 | 2 | 2 | 2 | C
… | … | … | … | … | … | … | … | … | …

After discussion RP provides grouped list of recommended proposals
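To make the mechanics of the grouping concrete, here is a minimal sketch (not part of the ERA-CAPS material). It averages the three RP reader scores per proposal and sorts the proposals into a provisional grouped list; the group thresholds are assumptions chosen only to reproduce the example rows above, and in practice the final grouping is agreed in the panel discussion.

```python
# Minimal sketch of averaging RP reader scores and producing a provisional grouped
# ranking. Thresholds are illustrative assumptions, not the official ERA-CAPS rules.
def provisional_grouping(proposals):
    """proposals: list of dicts with 'number', 'acronym' and 'rp_scores' (three readers)."""
    graded = []
    for p in proposals:
        avg = sum(p["rp_scores"]) / len(p["rp_scores"])
        if avg >= 4.0:
            group = "A+"   # highly recommended
        elif avg >= 3.0:
            group = "A"
        elif avg > 2.0:
            group = "B"
        else:
            group = "C"    # not recommended
        graded.append({**p, "average": round(avg, 1), "group": group})
    # Highest average first, to obtain the grouped list presented to the funders
    return sorted(graded, key=lambda row: row["average"], reverse=True)

example = [
    {"number": 1, "acronym": "ZZZ", "rp_scores": [2, 2, 2]},
    {"number": 2, "acronym": "YYY", "rp_scores": [3, 2, 3]},
    {"number": 4, "acronym": "WWW", "rp_scores": [5, 4, 5]},
]
for row in provisional_grouping(example):
    print(row["number"], row["acronym"], row["average"], row["group"])
# -> 4 WWW 4.7 A+, then 2 YYY 2.7 B, then 1 ZZZ 2.0 C
```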

Basic principles of Peer Review

• Excellence
• Impartiality
• Transparency
• Appropriateness for purpose
• Efficiency and speed
• Confidentiality
• Ethical and integrity considerations

ERA-CAPS complies with these principles by means of:
• Code of Conduct on Conflicts of Interest for ER and RP
• Declaration of Impartiality and Confidentiality for RP members
• Call Notice including criteria and scope
• National regulations for eligibility
• Guidelines for Applications
• Guidelines for Evaluation and Selection
• Transparent communication about the outcome

Page 13: Call management Master Class, 17-18 June 2013


ESF Peer Review Services

Background

Since 2005 ESF has coordinated and implemented the PR of 5,000+ scientific proposals (27% from external contracts)
- Various types of schemes and instruments

ESF has formalised the best practices in PR in the European Peer Review Guide
- Consensus involving 34 European RPOs and RFOs
- Based on a detailed survey and discussion: http://www.esf.org/coordinating-research/mo-fora/completed-mo-fora/peer-review.html

ESF Standards in Peer Review

Quality: proven ability to efficiently manage PR operations
• Advice on the PR approach and structure (e.g. one or two stages, with or without external peer review, rebuttal phase, …)
• Proven methodology in finding the relevant expertise/reviewers
• Improved (and improving) process (not routine business)

Integrity
• Probity (adequate management of conflicts of interest, confidentiality, equity of treatment, …)
• Independent process
• Transparent process

Infrastructure
• Online submission of proposals
• Automated interactions with applicants and evaluators
• Online submission of evaluation reports

Outsourcing Peer Review

Tailored approach
• There is no 'one size fits all' in PR
• Process and structure have to adapt to and best serve the specificities of a given call (scope of the call, size of the community, objectives of the programme, international aspect, …)

Common procedures in the context of international calls
• Streamlines the process and avoids duplication of effort
• Avoids misunderstandings and procedural misalignment between different processes
• Allows better monitoring of the process and early information
• Allows the use of common platforms

Outsourcing Peer Review

Standards and comparability
• Consistent methodology allows equality of treatment among various programmes and project homogeneity
• The scientific community welcomes processes implemented by external independent bodies

Better use of resources
• PR is too operational for programme secretariats (they should focus on more strategic and tactical issues)
• Opportunity to avoid peaks of resource-demanding activities
• Savings due to scale effects and experience
• Well-defined deliverables

Different levels of ESF support

Basic: provision of written expert reviews in the frame of an already defined call (no panel, no ranking)

Intermediate: full-scale elaboration, management and implementation of the scientific assessment process, resulting in prioritised list(s) and funding recommendations

Full: end-to-end peer review process elaboration and implementation, from call management (gathering and handling of proposals) to final funding recommendations.

Call Process


Some past and current external peer review activities at ESF

ERA-NETs
• Europolar (European Polar Consortium, 2009)
• HERA – Humanities in the European Research Area (2008-2012)

FP7 FET Flagship
• FP7 Graphene Flagship call for extension (2013-2014)

Other
• European Space Agency microgravity programme (2009-2010)
• University of Torino and University of Piemonte Orientale (2012)
• AXA Research Fund (2013)

Contact for Peer Review activities at ESF:

Mr Nicolas Walter ([email protected])

Page 14: Call management Master Class, 17-18 June 2013

Selection of recommended applications, funding decision and project negotiation

PLATFORM Master Class, Brussels, 17-18 June 2013

Nicolas Tinois, Project Management Jülich


[Diagram: after the evaluation, the outcome is either a barrel model (categories A–C) or a ranking list; each leads to a funding recommendation, which may be preceded by a redress or rebuttal step.]

[Diagram: barrel model – proposals are grouped into categories A, B and C; the recommendation either keeps the three groups (A / B / C) or combines them into a single ordered list (1. A, 2. A, 3. A, 4. B, 5. B, 6. B, 7. C, 8. C, 9. C).]


[Diagram: separate ranking lists per topic (Topic 1: Prop ZC, Prop XY, Prop DL; Topic 2: Prop LK, Prop HT, Prop GF; Topic 3: Prop PS, Prop NT, Prop SL; Topic 4: Prop RV, Prop UJ, Prop SN), either with one recommendation per topic or with the topic lists merged into a single recommendation list (1. Prop LK, 2. Prop PS, 3. Prop NT, 4. Prop RV, 5. Prop HT, 6. Prop ZC).]


Part I

Context: peer-review or internal evaluation? Ranking list, binding or not (?), or categories (A, B, C…)?

Funding decision: meeting of funders? Schedule of the meeting? How is the decision/recommendation taken for each project? Is it a “decision” or a “recommendation”?

How to ensure comparability of projects: big/small? Different topics?

How do you deal with gaps in funding?

Part I: ETB

Context: proposals are evaluated internally by each involved programme owner, and the outcomes are discussed jointly in an evaluation meeting. 3 categories: A, B (as few as possible) and C -> commonly accepted as funding decision; B proposals are treated on a case-by-case basis. This is a funding decision recommendation.

Gaps in funding: try to "save" the project (condition for a positive recommendation: the partner from …. "cannot require funding" or "must leave the consortium" …). Very smooth process thanks to the availability of the internal evaluations before the meeting (bilateral discussions possible).

Part I: ERA-IB

Context: peer review (pre- and full proposals) by a commonly selected panel of 10-12 experts; additional external reviews only on demand by the panel -> ranking in categories (A, B, C, D, E)
-> A gets the highest priority and is usually funded completely
-> B is divided into B+ and B; B+ gets higher priority than B, but B proposals are still considered "fundable"
-> C – E are rejected

Funding decision: meeting of funders directly after the expert panel meeting; the panel suggestions are taken as a "recommendation"; problems / funding gaps are addressed at the meeting, but usually only solved afterwards; solutions: budget cuts, replacement or omission of partners if possible, subcontracts; sometimes a raise of national budgets

How to ensure comparability of projects: size limit of proposals: min 3 / max 8 partners. Different topics? Yes, to be compared and assessed by the panel

Page 15: Call management Master Class, 17-18 June 2013

Part I: SUSFOOD

Pre-proposal: internal evaluation – ranking list per topic
Full proposal: peer review – ranking list per topic
Funding decision: meeting of experts (peer review) – not yet planned in detail
How to ensure comparability of projects: 3 different topics
How do you deal with gaps in funding: discussion

Part I: FACCE-JPI Knowledge Hub

Context: unique proposal! Peer-review evaluation. At the end of the evaluation meeting we have "recommended for funding provided that 1) …. 2) ….". The outcome of the evaluation is accepted as it is by the funders. It then requires a re-submission step (see project negotiation).

Part I: ANIHWA

Context: peer review (web conference for each call topic); a ranking list is built for each of the call topics

Funding decision: meeting of the funders shortly after the end of the evaluation. The first-ranked project for each topic is recommended for funding. Then, if possible, the ranking list is followed (with the possibility to "skip" one project if there is a budget gap in one country). The aim is also to have similar success rates in all topics.

Gaps in funding: "skip" the project, or try to e.g. change one partner (subcontracting could also be an option)


Part II

Finalisation: is the whole process finalized with the recommendation meeting? Are any steps still needed afterwards?

How to deal with provisional positive funding recommendation? / Changes required by the funders

Changes after funding recommendation (e.g. one partner leaving the consortium): which consequences?

Part II: ETB

Finalisation: the recommendation is finalised within a few days (if e.g. a project can be positively recommended provided that …, this is checked quickly with the coordinator). Then communication to the applicants, and the projects go to national negotiation.

Changes required by the funders have to be dealt with as soon as possible in order to have one unique recommendation to be communicated timely for all proposals.

Changes after funding recommendation (e.g. one partner leaving the consortium): this is dealt with on a bilateral (trilateral…) national/regional level.

Part II: ERA-IB

Finalisation: meeting minutes are agreed with the panel and the funders within 1 week after the evaluation meeting -> the evaluation summary from the minutes is sent to the applicants -> the projects enter national negotiations
How to deal with a provisional positive funding recommendation / changes required by the funders? -> close collaboration between the funders during these national negotiations and weekly follow-up of the national contracts is important, but difficult!
Changes after the funding recommendation (e.g. one partner leaving the consortium): discussed among the concerned project partners and funders


Part II: SUSFOOD

Finalisation: the report of the selection meeting will be brought into agreement with the experts and sent afterwards to the funders

Changes required by the funders have to be dealt with as soon as possible

Changes after funding recommendation (e.g. one partner leaving the consortium): not yet decided

Part II: FACCE-JPI Knowledge Hub

Finalisation: the provisional positive funding recommendation is explained by the Chair of the evaluation to the coordinators during a meeting; re-submission of the proposal; the Chair checks that the conditions have been respected and provides the final funding recommendation.

Changes after the funding recommendation: mostly not accepted; still, these are dealt with on a national basis during the national project negotiation.

Page 16: Call management Master Class, 17-18 June 2013

Part II: ANIHWA

Finalisation: recommendation letters are sent a couple of weeks after the meeting. Meanwhile, bilateral exchanges take place to solve "problematic" cases on a case-by-case basis, such as provisional funding recommendations.

Changes after the funding recommendation (e.g. one partner leaving the consortium): this is also discussed on a case-by-case basis.


Thank you for your attention!

Nicolas Tinois, Project Management Jülich

Page 17: Call management Master Class, 17-18 June 2013

PLATFORM

Master class on call management

D. Joint project monitoring – CASE presentation: ERA-PG

Christine Bunthof, [email protected], +31 (0) 317 480996, Wageningen UR

PLATFORM Master Class Call Management, 17-18 June 2013, Brussels

Co-ordination and co-operation between national plant genomics research programmes

2004 – 2009 (72 months)

2004: 12 partners (11 countries)

2006: + 5 partners (4 countries)

Coordinator:

What did we want to achieve from the ERA-PG common programme?

• Delivery of excellent science in transnational collaboration – true transnational working
• Transparency of process with minimal bureaucracy
• Synergy by collaboration – maximisation of return on funding agency investments
• Joint programme design and operation
• Stimulation of industrial participation and cooperation
• Enhanced profile for EU science in global terms – elevated competitiveness

Plant Genomics – overarching themes to unify partners:
• Abiotic and biotic stress
• Genomic tools, technologies and resources
• High-value crops and non-food crops
• Crop and forage plants for low-input systems
• Use of models and model-crop translation
• Yield stability and genetic potential
• Quality traits
• Other topics?

ERA-PG First Call (2006): Structuring Plant Genomic Research in Europe

Funders from BE, DE, IT, NL, DK, FI, NO, PT, UK, FR, ES
Broad and inclusive research themes
Sub Call A: broad call for publicly funded research in plant genomics
Sub Call B: trilateral partnership and beyond; the future for European Public-Private Partnerships in Europe
Central evaluation & selection procedure, two-stage
Final funding decisions by national funding organisations
Start of projects in 2007

Sub Call A: 70 PP, 44 FP
15 granted projects; total granted budget 22 M€; 77 partners (10 countries)
Average: 4.8 organisations/project, 3.9 countries/project, 1.5 M€/project

Sub Call B: 36 PP, 30 FP
14 granted projects, of which 13 PPP; total granted budget 16.6 M€; 111 partners (8 countries), of which 31 companies
Average: 7.9 organisations/project, 3.4 countries/project, 1.2 M€/project

ERA-PG Second Call (2008): Strengthening the European Research Area in Plant Genomics – integrating new technologies in plant science

Funders from UK, DE, NL, PT, BE, FI, AT, CA, IL; allocated budget (preliminary): ~ € 15 mln
Central evaluation & selection procedure
Final funding decisions by national funding organisations
Start of projects in 2009

2nd Call: 54 FP
15 granted projects; total granted budget xx M€; xxx partners (x countries), of which xx companies
Average: x organisations/project, x countries/project, x M€/project

Project reports

Challenge: a unified system complying with different national rules and systems

Content of report

Language

Electronic form or pdf

Purpose: Go/no go decision or ‘just’ assessment

Evaluated by peers or desk officers

Rules for national financial reporting


Project reporting in ERA PG

• Mid term Report, End Report

• CS+NCP working group to develop the ERA-PG reporting forms to ensure the reports comply with national rules

• No additional national reporting required (although exceptions)

• Project coordinator submits report to ERA PG Call Secretariat

• CS communicates with the coordinator if extra information is needed

• CS sends to NCPs for assessment

• If all is fine, CS sends letter of acceptance to coordinator

• If not, actions might be needed following national rules

Page 18: Call management Master Class, 17-18 June 2013


Project reports

Second challenge: how to organise the work after the ERA-NET project has ended.

Options:
(1) the secretariat stays alive (cost issue)
(2) distribute tasks among funders
(3) no joint reporting

>> In ERA-PG, tasks were distributed over 'lead funders'
>> An intranet was maintained for documents and communications

Mid-term go/no-go evaluation

>> For Sub Call B projects, a mid-term closed seminar was held in April 2009. Project presentations by consortium leaders. An expert panel appointed to evaluate the progress of the projects asked questions during the seminar and gave a written assessment of the projects' progress to the funding bodies for feedback to the consortia.


Status seminars

>> To increase networking among the researchers involved in the programme and create a forum for academics, companies, funding bodies, and other stakeholders.

>> First Seminar at PGEMs 2007, Tenerife. Kick-off of the projects of the first call

>> Second Seminar at PGEMs 2009, Lisbon. Progress presentations of the projects of the first call, kick-off presentations of the projects of the second call


Impact evaluation

Not possible time-wise to do within the ERA-PG project

PLATFORM

Thank you for your attention

Christine Bunthof, [email protected], +31 (0) 317 480996, Wageningen UR

PLATFORM Master Class Call Management, 17-18 June 2013, Brussels

Page 19: Call management Master Class, 17-18 June 2013

Screenshots of online tools – www.euphresco.org

Alois Egartner Austrian Agency for Health and Food Safety

(AT-AGES, Partner 3, [email protected])

EUPHRESCO – EUropean PHytosanitary REsearch COordination


Timetable

Automatic date calculation & email reminders

Details after extension
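As an aside, the "automatic date calculation & email reminders" shown in the screenshot can be illustrated with a minimal sketch. This is our illustration of the general idea only, not EUPHRESCO's actual implementation, and the reminder offsets are invented.

```python
# Illustrative sketch only (not the EUPHRESCO tool itself): deriving reminder dates
# automatically from a timetable deadline.
from datetime import date, timedelta

def reminder_dates(deadline: date, offsets_days=(30, 14, 3)):
    """Dates on which reminder e-mails could be sent, counted back from the deadline."""
    return [deadline - timedelta(days=d) for d in offsets_days]

# Example: a topic-suggestion deadline with reminders 30, 14 and 3 days in advance
print(reminder_dates(date(2013, 9, 30)))
```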


Entering topic suggestions

Table that collects all topic suggestions

Individual topic windows: collect all details on the topic and allow partners to enter information and decisions

- Topic details
- Topic documents
- Merged topic suggestions
- Comments
- Contributors (potential funders)
- …

Toolbook (supporting documents / tools)

Templates for every required kind of document

Separated into:
- Research Initiation [end: funding consortia and agreed topics are given]
- Research Implementation


Toolbook

Research Initiation (~20 files)

Research Implementation (>60 files)


Page 20: Call management Master Class, 17-18 June 2013

 

WP3 Mutual learning  Task 3.1:  Good practices for ERA‐NET activities 

 

UB, 12 June 2013

Compilation of answers to questionnaire sent May 2012 to PLATFORM partners and to non-responders in May 2013

Filled in by:

Bioenergy (Kees Kwant), CORE Organic (Ulla Sonne Bertelsen), ERASynBio (Annette Kremser), ERA-CAPS (Paul Wiley), ERA-IB-2 (Karen Görner) incl. examples, ETB-PRO (Christian Listabarth), EUPHRESCO (Alan Inman) incl. examples, FORESTERRA (Rocío Lansac), ICT-AGRI (Per Mogensen), MARINEBIOTECH (Steinar Bergseth), SAFEFOODERA (Mads Peter Schreiber), SUSFOOD (Marie Russel), RURAGRI (Thomas Dax), WoodWisdom (Mika Kallio), ANIHWA (Sabine Dues), ERASySAPP (Klaus Michel), ERA ARD (Alex Percy-Smith), ARIMNET (Marie Ollagnon)

NOT by Biodiversa, CoFASP, ERA-SysBio

Page 21: Call management Master Class, 17-18 June 2013


1. Call planning

1.1. Do you normally use a one- or two-step call procedure?

[x] One
ETB-PRO, EUPHRESCO, RURAGRI, ERA ARD, ERA-CAPS
FORESTERRA: It varies with the ERA-NET; it depends on the number of proposals expected or on the agreement of the partners
ERASySAPP: One (however, this is not yet finally decided by the boards for the first call in November 2013)

[x] Two
Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ICTAGRI, ERA-IB-2, WoodWisdom, ANIHWA
FORESTERRA: It varies with the ERA-NET; it depends on the number of proposals expected or on the agreement of the partners
ARIMNET: Non-compulsory letters of intent + call for full proposals

1.2. Which funding model? link

[x] virtual
Bioenergy, CORE Organic, ERASynBio, ERA-CAPS, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD, RURAGRI, ANIHWA, ERA ARD, ARIMNET
ERASySAPP: virtual common pot with fresh money and in-kind funding options

[x] real
CORE Organic (test call in COII), SAFEFOODERA, EUPHRESCO

[x] mixed
SAFEFOODERA, WoodWisdom
EUPHRESCO: We also use a non-competitive (NC) mechanism, which is our most used approach, less occasionally VP, and more rarely RP (only two funders so far have done an RP together)

1.3. Your call administration is?

[x] funded by the EC

CORE Organic, ETB-PRO, ICTAGRI, ERA-IB-2, SUSFOOD, RURAGRI, ANIHWA, ERASySAPP, ERA ARD, ARIMNET

[x] self-financed (please describe how)

Page 22: Call management Master Class, 17-18 June 2013


Bioenergy, SAFEFOODERA, EUPHRESCO
FORESTERRA: Each partner is funded by the respective national funding organisation. ERA-NET members' funding agencies use their own national resources (Multilateral Research Projects, national calls, for instance)
WoodWisdom: At the moment our project is running under the ERA-NET Plus scheme and, according to the rules, "The consortium may choose to use part of the EU contribution to support activities of management or monitoring during the two phases as long as the partners replace this part with national contributions to the Joint call budget."
ARIMNET: Labour costs and travel costs of the representatives of the funding agencies not involved in the initial consortium of ARIMNet

[x] no calls at the moment

ERASynBio, ERA-CAPS, MARINEBIOTECH

1.4. How many calls have you launched?

[0] ERASynBio, ERA-CAPS, MARINEBIOTECH, FORESTERRA, ERASySAPP (plan three calls)
[1] RURAGRI
[2] SAFEFOODERA, ICT-AGRI, ERA ARD
EUPHRESCO: A VP pilot in 2008 (five topics) and an RP pilot in 2008 (2 topics). Since then all projects have been directly commissioned, though competition is still considered
[3] ERA-IB-2
[4] CORE Organic, WoodWisdom
[5] Bioenergy
[7] ETB-PRO

1.5. Any further remarks

[x]
ERASynBio: ERASynBio is planning a call for 2013 and has just started preparation/design.
EUPHRESCO: EUPHRESCO-1: 11 projects from the 2008 pilots (4 x NC; 5 x VP; 4 x NC projects); then an extra 6 NC projects commissioned in 2009. EUPHRESCO-2: 1st round of topic selection resulted in 10 x NC projects in 2011; 2nd round of topic selection currently under way in 2012.
SUSFOOD: SUSFOOD plans to launch two calls within its duration

2. Call preparation

2.1 The toolbox describes a number of documents Link. Which of the following documents do you use in your ERA-NET:

Call announcement (call text)
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, SUSFOOD, ANIHWA, ARIMNET
[ ] no

Guidelines for applicants
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, SUSFOOD, ANIHWA, ARIMNET
[ ] no

FAQs
[x] yes: Bioenergy, SAFEFOODERA, ETB-PRO, ICTAGRI, FORESTERRA
[x] no: CORE Organic, ERASynBio, ERA-IB-2, EUPHRESCO, ANIHWA

Proposal forms
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, ANIHWA
[ ] no

Feedback letters
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, ICTAGRI, EUPHRESCO, FORESTERRA, ANIHWA
[x] no: ERA-IB-2

Evaluation checklist & reporting forms
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, ANIHWA
[ ] no

Guidelines for evaluation
[x] yes: Bioenergy, CORE Organic, ERASynBio, SAFEFOODERA, ICTAGRI, EUPHRESCO, FORESTERRA, ANIHWA, ARIMNET
[x] no: ERA-IB-2

Description of evaluation meeting
[x] yes: Bioenergy, CORE Organic, SAFEFOODERA, ETB-PRO, FORESTERRA, ANIHWA
[x] no: ERASynBio, ICTAGRI, ERA-IB-2

Non-disclosure agreement (NDA)
[x] yes: CORE Organic, ERASynBio, ETB-PRO, ERA-IB-2, EUPHRESCO, ANIHWA
[x] no: Bioenergy, ICTAGRI, FORESTERRA (it is included in another document)

ERA-CAPS: [no calls so far]

2.2 Do you have more documents in connection to the call than listed?

[x] Yes (please list and provide a copy)

CORE Organic: No-conflict-of-interest declaration
ERA-CAPS: Likely to also have a Material Transfer Agreement (not drafted yet)
EUPHRESCO: EUPHRESCO toolbox on the EUPHRESCO website ... www.euphresco.org
FORESTERRA: Eligibility and contacts of funding organisations; Guidelines for electronic submission
WoodWisdom: If we look at the supporting docs listed under Chapter 2.1.1 of the Manual and tools for call implementation, our documents are otherwise pretty much similar, but now, being under the ERA-NET Plus scheme, we have signed a Consortium Agreement incl. some key elements focussing on the call implementation/management.

Page 24: Call management Master Class, 17-18 June 2013


ARIMNET: Eligibility of funding organisations; Guidelines for electronic submission
ETB-PRO: Some information on IPR, partnering opportunities, description of eligibility
ERA-IB-2: All the above documents were drafted during ERA-IB-1 (for the 1st joint call) and amended as needed, and were not based on ERA-LEARN. Additional docs:
- Memorandum of Understanding for the joint call (internal use only)
- Implementation Agreement (internal use only)
- Peer review forms (attached)
- Rebuttal forms (attached)
WoodWisdom: We have not analysed the examples in the toolbox that carefully, but most probably there are no big differences since all the ERA-NETs are running under the same basic rules. See also: WoodWisdom-Net 2, Report No. 1/2012: HANDBOOK OF THE WOODWISDOM-NET RESEARCH PROGRAMME

[x] No

Bioenergy, ERASynBio, SAFEFOODERA, ICTAGRI, RURAGRI, ANIHWA, ERASySAPP, ERA ARD

2.3 Do you find your documents more suitable for your calls than the examples in the toolbox?

[x] Yes (please explain below and provide a copy)
CORE Organic: They are tailor-made to our needs and agreed to by all partners
ICTAGRI
EUPHRESCO: We have used and tested them

[x] Maybe
Bioenergy, WoodWisdom, ERA ARD

[x] No
ERASynBio, SAFEFOODERA, FORESTERRA, RURAGRI
ETB-PRO: Most of them are from ETB
ERASySAPP: So far the documents do not yet exist – we can hand them in if wanted by PLATFORM
ARIMNET: No, they are just complementary. It was a way to adapt the general process to the special case of ARIMNet

2.4 Could you expect your consortium to adapt to standardised KBBE ERA-NET call documents if developed within WP3 and agreed by all PLATFORM partners?

[x] Yes, I expect my ERA-NET would be that flexible

ICTAGRI, FORESTERRA

[x] Maybe (please elaborate)

RURAGRI, ANIHWA, ERASySAPP, ARIMNET

Page 25: Call management Master Class, 17-18 June 2013


Bioenergy: In case it is more suitable, it can be considered
CORE Organic: Our partners would like harmonisation
ERASynBio: Something like a core document with several options would indeed be useful. In general I think most ERA-NETs are not easy to compare
ERA-CAPS: Any standardised documents would need to retain a degree of flexibility to allow tailoring to the specific needs of the ERA-NET
SAFEFOODERA: Some of the partners have strict funding rules (food safety agencies)
ERA-IB-2: I would ask for comments from the partners on such draft documents. If any real concerns can be addressed, then it should be fine.
EUPHRESCO: If they were an improvement and if it did not take time or need adaptation

[x] No (please give your reasons)
ETB-PRO: ETB has developed a call procedure specific for SMEs, so the probability is low that it would fit with most other ERA-NETs. However, it could be feasible if more than one "standard type" is developed.
SUSFOOD: Maybe (we will be able to answer this question once we have a first experience in calls)

3. Submission link

3.1 Do you use an electronic system for submission and sharing of applications?

[x] Yes (if you can recommend it to be used by other ERA-NETs, please describe your system + how and what you use it for)
ERA-CAPS, SAFEFOODERA, ETB-PRO, ICTAGRI, RURAGRI
CORE Organic: eracall.eu system
ERASynBio: in-house solution only suitable for the special call
MARINEBIOTECH: A proprietary system used by the call-responsible partner will most probably be used.
SUSFOOD: We plan to use an electronic submission system via the SUSFOOD online portal.
WoodWisdom: In connection with all our calls we have used an Electronic Submission System (ESS, developed by a small Finnish IT house), which has turned out to be not only reasonably priced, but very reliable and easy to use. The system is more like a document handling system where authorised users can access all the proposals and evaluation statements. In the application phase each coordinator is asked to register in the system (basic data only) and, once the registration is completed, the coordinator gets a project key which allows him to upload his proposal into the ESS database. For the proposal there is a separate template in Word format, and the coordinators are asked to fill in this template and then upload it into the ESS database. Once the call is closed, all relevant project-specific information is collected manually from the proposal documents into a separate Excel table. This table is then used in the evaluation phase. For the evaluators there are individual logins, and the evaluators can access all the proposals in the database. If any funding agency would like to give an external expert access to one specific project only, it is also possible to get a project-specific login. Once the evaluation statements are completed, they are all uploaded into the ESS database as well, where each evaluator can access them.

Page 26: Call management Master Class, 17-18 June 2013


ANIHWA: A total online submission tool with partnering and expert database linked in; give me a call to get more information: +49 24 61 61 9286
ERASySAPP: Although we still have not decided on an EOSS, I can strongly recommend the JUELICH OESS, including its functionalities for reviewer administration and the scientist metadata base
ARIMNET: We used it:
- To gather all the call documents, create online forms and a forum (searching for partners) for the applicants.
- To allow reviewers to get access to the proposals and evaluate them online.
- To allow members of the Evaluation Committee to study the evaluations made by the reviewers.
- To allow the funders to get access to the proposals and check their national eligibility online.
- … and as archives now.
It was a very good way to limit the number of email exchanges.

[x] No
Bioenergy, ERA-IB-2, FORESTERRA
EUPHRESCO and ERA ARD: submissions by email

3.2 Do you think your consortium would be interested in a joint electronic call submission system (if funding can be found...)?

[x] Yes

Bioenergy, CORE Organic, MARINEBIOTECH, FORESTERRA, ARIMNET

[x] Maybe (please list your minimum requirements)

ICTAGRI

ERA-CAPS: We wouldn't discount it, but it would need to be very user-friendly, non-bureaucratic, and not based on current EC grant submission systems. Having a centralised system may result in a doubling of effort if individual organisations need to add the details to their own systems in order to process payments etc.
ERA-IB-2: We did use one before (in ERA-IB-1), but stopped doing so as it was found too expensive for the little added value. A system must be lean enough to be economic, enable secure uploading of proposals and evaluation reports, and be accessible by the funding partners as well as the external evaluators. It should also number proposals automatically and enable filtering by country. If any of this has to be done manually/via e-mail, there is too little added value regarding security or time-saving to merit the costs.
EUPHRESCO: If there was no cost and there was a benefit; but unlikely, I think
SUSFOOD: Maybe (more details after our first call experience)
RURAGRI: Easy access to the technical personnel responsible for the call submission system; availability of information about call applicants to the Call Secretariat, with appropriate access for NCPs and evaluators; adaptation possibilities to thematic requirements, etc.

Page 27: Call management Master Class, 17-18 June 2013


WoodWisdom: technically reliable; user-friendly; otherwise including all basic functionalities of any call submission system

[x] No
ERASynBio, SAFEFOODERA
ETB-PRO: The submission tool is the basis of the ETB joint database (for eligibility, evaluation and monitoring), so using anything else would be an additional job.
ANIHWA: No, as one is already in use!
ERASySAPP: There are already many excellent tools out there, no need to spend more tax money on this issue!

4. Evaluation link

4.1 Which type of evaluation is used in your ERA-NET? Link to explanation

[x] Centralised evaluation
ERASynBio, ANIHWA, ERASySAPP, ERA ARD, ARIMNET
WoodWisdom: In the case of ERA-NET Plus calls, the EC has imposed some specific requirements: an international peer review process is mandatory (in Step 2), and the final evaluation criterion is excellence.

[x] Semi-centralised evaluation

Bioenergy, CORE Organic, SAFEFOODERA, ERA-IB-2, MARINEBIOTECH, FORESTERRA, SUSFOOD (probably), RURAGRI

[x] Decentralised evaluation

EUPHRESCO

ETB-PRO (with a centralized consensus meeting)

4.2 How do you select the experts for the evaluation?

Please describe the process briefly and highlight issues you find especially well functioning in your ERA-NET.

Bioenergy: each national partner recommends 1 or 2 experts
CORE Organic: each national partner recommends 1 or 2 experts; the list is approved by all partners; the CB leader and the call secretariat create the expert panel of 3-4 experts based on their expertise, research area and nationality (to cover both north and south); the panel composition is unknown to the partners
ERASynBio: I will ask the partners to provide me with a list of possible experts
ETB-PRO: experts are mostly in-house experts
ICTAGRI: MKB and EAG


ERA-IB-2 Each funding organisation suggests a certain number of experts, stating their names, contact details and describing their fields of expertise. Afterwards, partners vote for up to 5 experts from the complete list, and the 12 experts with the highest number of votes will form the panel. In case of draws for the last few to be selected, we check which fields of expertise are not yet well represented, and if this brings no solution, which countries are under- or already overrepresented. The experts for the peer reviews of the proposals are, at a later stage, suggested by the members of the expert panel.
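
For illustration of the voting step just described, a minimal sketch could look as follows (Python; the expert names, fields, ballots and panel size are invented for the example and this is not the ERA-IB-2 implementation):

    from collections import Counter

    # Hypothetical ballots: each funding organisation votes for up to 5 experts.
    ballots = [
        ["A", "B", "C", "D", "E"],
        ["B", "C", "F", "G", "H"],
        ["A", "C", "F", "I", "J"],
    ]
    field = {"A": "enzymes", "B": "enzymes", "C": "fermentation", "D": "downstream",
             "E": "enzymes", "F": "fermentation", "G": "downstream", "H": "enzymes",
             "I": "downstream", "J": "fermentation"}
    panel_size = 4  # 12 in the description above; smaller here to keep the example short

    votes = Counter(v for ballot in ballots for v in ballot)
    panel, remaining = [], set(votes)
    while len(panel) < panel_size and remaining:
        best = max(votes[e] for e in remaining)
        tied = [e for e in remaining if votes[e] == best]
        covered = Counter(field[e] for e in panel)
        # Tie-break: prefer the expert whose field is least represented on the panel so far.
        pick = min(tied, key=lambda e: covered[field[e]])
        panel.append(pick)
        remaining.remove(pick)
    print(panel)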

EUPHRESCO We ask partners to nominate peer reviewers, if we have a peer review stage; see the EUPHRESCO-1 WP4 report at www.euphresco.org

MARINEBIOTECH Experts will be proposed by the consortium partners and chosen according to the expertise needed in the panel as determined by the applications

FORESTERRA Funding partners decide the composition of the Scientific Evaluation Committee (international reviewers)

RURAGRI Experts for evaluation have been selected through all ERA-NET partners; that means every partner could suggest 1-2 experts for each of the three priority themes of the Call. This provided a potentially large number of experts, which was in the end only partly used by countries. In addition, experts from non-participating EU countries and from the research community were approached to complement the expert database. Experts had to provide information on their fields of expertise and their availability for the respective evaluation period and meeting participation. The evaluation process and criteria were summarised in the attached information document to provide insight into the requirements and the contents of the evaluation work (document 15).

WoodWisdom Each participating country can suggest experts for the joint expert pool. Depending on the call topics and after joint discussions, the persons in charge of the task will then contact the selected experts. The external experts are used for the evaluation of the full proposals only; the evaluation of the pre-proposals (eligibility and suitability) is done by the participating funding organisations.

ANIWHA Nominated by the funders involved; criteria are discussed within the group and experts are nominated according to the agreement.

ERASySAPP via a metadata base with experts and on the basis of personal recommendations

ERA ARD NB! The link doesn’t work anymore, so it is difficult to answer this question.

ARIMNET We selected two kinds of experts: those of the Evaluation Committee and the reviewers. The scientific Evaluation Committee (EC) was constituted to evaluate the proposals submitted in response to this joint transnational call and to propose a final ranking of these proposals. The Programme Call Board (CB) nominated the members of the EC, proposed by ARIMNet's members. It was decided that the EC Chair should be drawn from non-participating countries. The EC members were internationally recognised scientists chosen for their scientific or technical expertise. The members of the EC could not represent the nominating parties or adopt national considerations. In order to avoid any conflict of interest, the EC members could not apply to this call.

During the first EC meeting, the members completed the database of referees (prepared by the ARIMNet members) and assigned 4 referees per proposal. When possible, they assigned:


- 2 Mediterranean referees and 2 international ones for each of the 79 eligible proposals,

- referees native of (or working in) countries not involved in the evaluated project.

As far as possible, they assigned only one proposal per referee.

They also assigned 2 EC members to each proposal.

The Call Office contacted 1 Mediterranean and 1 international expert per proposal; when one expert was not available, the Call Office contacted other experts identified by the EC members (the main criteria were expertise first and then country).
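
For illustration of the assignment constraints described by ARIMNET above (one Mediterranean and one international referee per proposal, from countries not involved in the project, and at most one proposal per referee), a minimal sketch with invented referee data:

    # Hypothetical referee pool: name -> (region, country). Not the ARIMNet database.
    referees = {
        "R1": ("Mediterranean", "MA"), "R2": ("Mediterranean", "TN"),
        "R3": ("international", "DE"), "R4": ("international", "UK"),
        "R5": ("Mediterranean", "IT"), "R6": ("international", "SE"),
    }
    assigned = set()  # referees already holding a proposal

    def pick_referees(project_countries, per_region=1):
        # Pick per_region Mediterranean and international referees for one proposal.
        chosen = []
        for region in ("Mediterranean", "international"):
            pool = [name for name, (reg, country) in referees.items()
                    if reg == region and country not in project_countries and name not in assigned]
            chosen += pool[:per_region]
            assigned.update(pool[:per_region])
        return chosen

    print(pick_referees({"IT", "FR"}))  # e.g. ['R1', 'R3']
    print(pick_referees({"MA", "DE"}))  # e.g. ['R2', 'R4']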

4.3 Are there criteria only assessed by the funders and not by the experts?

[x] Yes, please list

Bioenergy 1. Eligibility, 2. Fit in National Program

ERA-CAPS E.g. eligibility and remit

CORE Organic: Eligibility and fit in national programmes for pre-proposals, and relevance for full proposals

ERA-IB-2 Compatibility with the relevant national/regional funding programmes

MARINEBIOTECH Do not know yet

FORESTERRA Scientific aspects (quality, relevance, novelty, impact, consortium)

Management (added value of cooperation, risk, dissemination activities…)

SUSFOOD Probably (at least eligibility criteria)

[x] No

ERASynBio, ETB-PRO, ICTAGRI, EUPHRESCO, ANIWHA

4.4 Do you think your consortium would be able to agree to use standard scientific evaluation criteria for KBBE ERA-NETs? They could for example be grouped in three: criteria for development, for research and development, and for pure research projects. link to examples

[x] Yes

ICTAGRI, ERA-IB-2, FORESTERRA

MARINEBIOTECH With the possibility to add specific criteria as needed and subtract ones not needed for the specific call. A list could be provided with “Obligatory” criteria and “can be used” criteria + “Own” criteria as needed.

[x] Maybe (please list your minimum requirements in addition to the ones in the example)


Bioenergy We fund applied research, which might not fit the scientific criteria of other ERA-NETs

ERA-CAPS As for previous standard forms, there would need to be a degree of flexibility that allows the evaluation criteria to be tailored to the specific needs of the ERA-NET

SAFEFOODERA

CORE Organic If partners would have influence

ETB-PRO We could agree on scientific standards, but we would insist on additional application, exploitation and economic criteria

SUSFOOD Maybe (minimum requirement list available only once we have an experience)

RURAGRI Dependent on set of criteria; processes etc. Minimum requirements: transparency of each evaluation criteria assessment, information by experts, check of Conflicts of Interest etc.

WoodWisdom If the call would emphasise e.g. the participation of young researchers and/or SMEs, there should be a place for such evaluation criteria, too. And, of course, if EC rules would allow following such standard evaluation criteria.

ANIWHA As this link is not working properly (none of the links are working properly), I cannot check your example!

[x] No

ERASynBio, EUPHRESCO,

ERASySAPP No, we think that this is not recommendable since each call has specific stand alone criteria, a general proposal or guideline is fine – nothing more nothing less

ERA ARD Rather doubtful

5 Funding decisions link
Which agreements have you made beforehand to avoid problems during project selection?

[x] Memorandum of understanding for each call (please share latest version)

ERA-IB-2, FORESTERRA, SUSFOOD (we plan to have such agreements), RURAGRI, WoodWisdom, ANIWHA, ERASySAPP, ARIMNET

[x] Rules are well defined in the Consortium Agreement

Bioenergy, CORE Organic, ERASynBio (not yet produced), SAFEFOODERA, WoodWisdom, ERASySAPP

ERA ARD A Consortium Agreement was mandatory but the form is not prescribed – only a suggestion given. The CA binds the consortium together but does not involve the funders. The funders use contracts to bind themselves to the researcher consortium.


[x] No formalised agreement is needed. Decisions are made in mutual understanding at meetings

ETB-PRO, ICTAGRI, EUPHRESCO

[x] None/other kind (please explain)

ERA-IB-2, MARINEBIOTECH (not yet relevant)

6 After the call link

6.1 Monitoring and management of funded projects link
6.1.1 How do you manage scientific/technical changes to the funded projects?

[ ] Centralised procedure where the ERA-NET project manager accepts or rejects the request from the project coordinator and informs the funding bodies

[x] Centralised procedure where all funding bodies agree

CORE Organic, SAFEFOODERA, RURAGRI, ANIWHA, ARIMNET (for example for the change of a scientist... sending of CV and explanations to all)

[x] National procedure with communication between national project partner and national funding body

Bioenergy, ICTAGRI, ERA-IB-2, FORESTERRA, RURAGRI, ERA ARD

ETB-PRO => but data are shared within the consortium

ERASySAPP That is up to now our favourite solution

[x] Changes are only reported in annual/mid-term/final reports

EUPHRESCO, FORESTERRA, ERA ARD

[x] Other, please explain

ERASynBio Not decided yet

ERA-IB-2 National decisions/requests to the funding organisation are forwarded to the Call Secretariat, who informs the other relevant funding organisations. Certain decisions (such as cost-neutral extensions) are taken individually, but communicated (at least additionally) by the Call Secretariat.

6.1.2 Does your ERA-NET organise seminars where researchers report on their project progress?

[x] Yes

Bioenergy, CORE Organic, ERASynBio, ERA-CAPS, SAFEFOODERA, ERA-IB-2, MARINEBIOTECH (will do …), FORESTERRA, ANIWHA

[x] No

ETB-PRO, ICTAGRI

EUPHRESCO Encouraged or arranged, e.g. presentations to the EC Plant Health Standing Committee for policy-relevant projects


6.1.3 Does your ERA-NET monitor the progress of the funded projects by participating in their individual project meetings?

[x] Yes

Bioenergy, EUPHRESCO, RURAGRI

CORE Organic The funding body of the country where the meeting is held participates, if possible

FORESTERRA Workshop to present results

WoodWisdom A special activity was launched during Call 1 of the WW-Net Research Programme to monitor the funded projects: a project observer from one of the participating funding organisations is assigned to each of the selected projects to monitor the progress in trans-national cooperation on behalf of the participating funding organisations and to provide a communication link between the project and the WW-Net Research Programme. Due to the positive experiences and feedback from the projects this activity will be further continued in connection with the WW-Net+ Joint Call. Among other things the observers may participate in the project meetings if appropriate.

ANIWHA Yes, sometimes

ERASySAPP This is intended but yet undecided

ARIMNET To some of the kick off meetings when possible

[x] No

Bioenergy, ERASynBio, ERA-CAPS, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, MARINEBIOTECH (can be done, but no obligation), ERA ARD

6.1.4 How is the reporting structured?

[x] common reporting only

MARINEBIOTECH

[x] common reporting + national reporting

Bioenergy, CORE Organic, ERA-CAPS, SAFEFOODERA, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, RURAGRI, WoodWisdom, ANIWHA, ERASySAPP, ERA ARD, ARIMNET

ETB-PRO in fact it is reverse: national reporting as the basis of common reporting (data are shared and can be discussed within the consortium if problems occur)

[ ] only national reporting

ERASynBio Not decided yet

6.1.5 Are all funding bodies involved in the evaluation of the common reports?

[x] Yes (please explain)

SAFEFOODERA, ICTAGRI, EUPHRESCO, FORESTERRA, ERASySAPP


Bioenergy Each participant reports nationally and the coordinator reports to the ERA-NET

[x] No (please explain)

RURAGRI, WoodWisdom, ERA ARD

ERASynBio Not decided yet

ETB-PRO usually only the involved parties

ERA-IB-2 Only those involved in funding the respective project

MARINEBIOTECH Don’t know yet…, but would go for “No” as the call administrator follows up on the reporting and asks for support nationally if there are deviations to discuss.

ANIWHA only those who are supporting the project

6.1.6 Is the common reporting connected to funds? Will funds be withheld until the common reporting is approved?

[ ] Yes

SAFEFOODERA, MARINEBIOTECH, FORESTERRA, SUSFOOD (probably yes)

[x] No

ETB-PRO, WoodWisdom, ARIMNET

[x] Different from country to country

Bioenergy, CORE Organic, ERA-CAPS, ICTAGRI, ERA-IB-2, EUPHRESCO, RURAGRI, ANIWHA, ERA ARD

ERASySAPP Still undecided

6.1.7 Looking ahead towards Horizon 2020, the ERA-LEARN toolbox recommends harmonisation with FP7 requirements because the reporting and monitoring requirements accepted by the Commission should be sufficient for most of the participating funding agencies. Harmonisation with FP7 requirements will also ensure easier access to comparable statistics. Finally – when an ERA-NET evolves into ERA-NET+ or another kind of joint programming including Commission R&D funding – the FP7 requirements will have to be fulfilled. It is consequently prudent and effective to establish common reporting systems which at a minimum meet FP7 reporting requirements.

Do you think your consortium would agree to amend your reporting towards a harmonisation with the Commission's reporting requirements?

[x] Yes

Bioenergy, ETB-PRO, ICTAGRI, MARINEBIOTECH, FORESTERRA, ANIWHA, ARIMNET

ERASynBio But only if reporting stays lean

CORE Organic With additional questions on added value and feedback to COII


ERA-IB-2 Yes, but additional (short) national reports would still be necessary for some partners

SUSFOOD Probably yes

ERA ARD Possibly but not if the requirements are too stringent and time consuming

[x] No

SAFEFOODERA, RURAGRI, ERASySAPP

EUPHRESCO We will almost certainly not be eligible for or attract EU funds via an ERA-Net+ approach since our calls are so small, plus many projects are also non-competitive.

WoodWisdom The applicants value the light ERA-NET reporting procedure we have had, the EC requirements might become too heavy.

6.2 Monitoring of call implementation link
Does your ERA-NET evaluate the call procedure and management?

MARINEBIOTECH Don’t know yet

[x] Yes, both external and internal

CORE Organic, ETB-PRO, ERA-IB-2

SUSFOOD Probably – not decided yet, but at least there will be an internal evaluation via our scorecards

ANIWHA Yes, both external and internal

[x] Only internal

Bioenergy, ERASynBio, ERA-CAPS, SAFEFOODERA, ICTAGRI, EUPHRESCO, ARIMNET

[ ] No

6.3 Call results link
Do you make the statistical analysis suggested by ERA-LEARN?

[x] Yes

ERASynBio, SAFEFOODERA, ETB-PRO, FORESTERRA

SUSFOOD (we will when data are available)

[x] A part of it (please list)

ARIMNET

ERA-IB-2 Number of submitted proposals (stage 1 & 2); number of applicants involved; number of funded projects and overall success rate; total costs and requested (or accepted) funding; composition of consortia in terms of organisation type (e.g. SME, IND, RES, HE)

[x] No


Bioenergy, CORE Organic, ICTAGRI, EUPHRESCO

ANIWHA I do not know as I cannot check your statistical reports

6.4 Dissemination link
6.4.1 How do you disseminate the results of the call?

[x] The result is published on the website of the ERA-NET

Bioenergy, CORE Organic, ERASynBio, ERA-CAPS, SAFEFOODERA, ETB-PRO, ICTAGRI, ERA-IB-2, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD (probably yes), ANIWHA, ERASySAPP

WoodWisdom Printed newsletter, e-newsletter and website of the project and of most participating funding agencies.

ERA ARD Dissemination is primarily the responsibility of the research consortium. No, the result is published on the website of the ERA-NET

ARIMNET websites of partners, emailings, internal communication

[x] In an additional format (please explain)

ERA-CAPS

ETB-PRO Press release, brochure, launch articles in national and international journals

ERA-IB-2 National and ERA-IB-2 press release, presentations at relevant events (conferences etc.)

MARINEBIOTECH On national websites as decided by each partner. A common text will be provided for local adjustments.

FORESTERRA leaflet

RURAGRI it is planned to disseminate the results to a wider audience; however, this will take place only after contracts are finalized (July/ August 2013).

6.4.2 Do you assist the funded projects in disseminating the results obtained?

[x] Yes (please explain)

SAFEFOODERA, EUPHRESCO

CORE Organic Leaflets and a CORE Organic II subsite for each project. Translation of project outcomes when possible. Exploring a way to improve use of the results.

ERA-CAPS via Grantholders’ workshops

ERA-IB-2 it is planned to make the final seminar a public (so far: closed, only call participants & funders) event in the future and thus use it for dissemination

MARINEBIOTECH In newsletters and websites relevant for the theme.

FORESTERRA Web, leaflet, video

WoodWisdom Perhaps the main help is that we organise annual open seminars with participants from industry, academia and policy-making organisations in the forest-based sector. The seminars have been very successful, with a lot of participants. The seminar presentations are published on the WW-Net website and the information on the seminar is disseminated via newsletters, too. Earlier we also provided some projects with a website service, i.e. each project had a chance to use a WW-Net platform to establish a website for itself.

ERASySAPP In case of outstanding results the partners would certainly try to raise some attention to the results – either via newsletter, website, printed materials or other means.

[x] No

Bioenergy, ERASynBio, ICTAGRI, ANIWHA, ERA ARD, ARIMNET (not yet)

If “No”, would you like to do that if appropriate tools were developed?

ETB-PRO SMEs do not want this

RURAGRI No, but maybe too early to answer

[x] Yes (good ideas are welcome)

Bioenergy, ICTAGRI, RURAGRI

[ ] No

ERASynBio, ANIWHA

ERA ARD scientific results should be disseminated by scientific groups whereas experience of working together through an ERA Net should be reported by the ERA Net

6.5 Analysing impact link
Does your ERA-NET have any experience in making impact assessments of 1) your calls, 2) the ERA-NET, or 3) the funded projects?

MARINEBIOTECH Don’t know yet

SUSFOOD We consider that the first impact to assess is the impact of the SRA

[x] Yes to 1, 2 and 3 (please explain)

Bioenergy, ETB-PRO (link is based on ETB)

WoodWisdom WoodWisdom-Net 2 – Report No. 2/2012, EVALUATION OF THE WOODWISDOM-NET RESEARCH PROGRAMME, PHASE 1, published on May 29, 2012. The evaluation was done by invited external experts, and the report includes e.g. the following issues regarding the impacts of the WoodWisdom programme on the development of the European innovation environment: competencies developed; transnational added value and its development during the WWN lifetime; impacts on networking within and outside the forest-based sector; how to increase industry involvement; how to support innovation. Also: WoodWisdom-Net Report No. 3/2008, REPORT ON WOODWISDOM-NET SELF-EVALUATION.

[ ] Yes to one or two of them (please explain)


[x] No experience

CORE Organic, ERASynBio, ERA-CAPS, SAFEFOODERA, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, RURAGRI, ANIWHA, ERASySAPP, ERA ARD, ARIMNET

7 The ERA-LEARN toolbox itself link
7.1 The utility of the toolbox is

[x] useful, particularly for new ERA-NETs

Bioenergy, ERASynBio, ERA-CAPS, SAFEFOODERA, ETB-PRO, ERA-IB-2, EUPHRESCO, FORESTERRA

[x] useful also for advanced ERA-NETs

CORE Organic, ICTAGRI, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD

[ ] should focus on one of the former target groups

7.2 Is there something missing in the toolbox?

[x] Yes (please explain)

CORE Organic A possibility for questions and answers would be great

ERASynBio ERA-NETs are not only executing calls, but also training and education measures etc. Examples for that would also be welcomed

ARIMNET Database of reviewers. Implementation and follow-up of the projects (tool to follow up deliverables, tool to check the real implementation of the funding of the research teams by the funding agencies..)

[x] Maybe

MARINEBIOTECH Have not had time to look deeply into it. Should be edited when needed

[x] No

Bioenergy, ERASynBio, SAFEFOODERA, ICTAGRI, ERA-IB-2, EUPHRESCO, FORESTERRA, RURAGRI, ERASySAPP

WoodWisdom Actually we have not been using the ERA-LEARN toolbox that actively and therefore it is impossible to say if something is missing in the toolbox.

7.3 Have you identified areas of the toolbox that can be improved?

[ ] Yes (please explain)

[x] No

Bioenergy, CORE Organic, SAFEFOODERA, ICTAGRI, EUPHRESCO, FORESTERRA, RURAGRI, WoodWisdom, ERASySAPP


8 Problems
8.1 Have you experienced/identified any problems/specific items in your ERA-NET that you would like WP3 to take up as an issue for mutual learning?

[x] Yes (please explain)

Bioenergy dissemination of results and impact assessment

CORE Organic harmonisation of procedures, dissemination of results

ERASynBio in general: pitfalls, risks etc. in ERA-NET execution

EUPHRESCO Always interested in experiences of the Real Pot

FORESTERRA Lack of reaction of some partners

RURAGRI information in the selection process, particularly with regard to failing projects; harmonise eligibility rules

ARIMNET Constitution of the database of reviewers. Implementation and follow-up of the projects (tool to follow up deliverables, tool to check the real implementation of the funding of the research teams by the funding agencies..)

[ ] No

ERA-CAPS, SAFEFOODERA, ETB-PRO, ICTAGRI, ANIWHA, ERASySAPP

WoodWisdom On the other hand there might be some, but it would need a further analysis to find them.

9 Mapping activities Link
9.1 Has your ERA-NET made mapping exercises?

[x] Yes (please list)

ERA-CAPS, EUPHRESCO, SAFEFOODERA, FORESTERRA, ERA ARD

Bioenergy foresight studies, R&D prioritisation

CORE Organic in the first period: mapping of research programmes

ERASynBio currently planning

ETB-PRO scientific area, national landscapes, political priorities, individual programs

ICTAGRI Using the MKB

MARINEBIOTECH doing it as part of the CSA

SUSFOOD We will use a database (MKB) filled in by researchers and the Partners for mapping existing research

RURAGRI Mapping activities were based on FP5, FP6, FP7 projects, other EU and international projects, and national relevant projects; research priorities and institutions and involved an in-depth discussion of a group of experts. The strong focus on this activity provided a good survey on relevant research activities and gaps in the research field.

WoodWisdom WoodWisdom-Net Report 1/2005. DESIGN OF COMMON EVALUATION SYSTEM – OVERVIEW OF EVALUATION PRACTISES AND GUIDELINES FOR COMMON RESEARCH ACTIVITIES


WoodWisdom-Net Report No. 1/2006. OVERVIEW ON BARRIERS THAT HINDER TRANSNATIONAL COOPERATION AND MODELS FOR FUTURE COOPERATION

WoodWisdom-Net Report No. 2/2006. NATIONAL AND TRANSNATIONAL RESEARCH PROGRAMMES OF THE WOODWISDOM-NET PARTNERS IN THE FIELD OF WOOD AND FORESTRY RESEARCH

WoodWisdom-Net Report No. 3/2006. GUIDELINES FOR A PROGRAMME MANAGER EXCHANGE PROGRAMME VISITS/ACTIVITIES

WoodWisdom-Net Report No 1/2007. GUIDELINES FOR OPENING OF FACILITIES OR LABORATORIES

WoodWisdom-Net Report No. 1/2008. BEST PRACTISES OF RESEARCH PROGRAMMES OF THE WOODWISDOM-NET PARTNERS IN THE FIELD OF WOOD AND FORESTRY RESEARCH

WoodWisdom-Net Report No. 2/2008. FORESIGHT REPORT – OVERVIEW OF FORESIGHT STUDIES IN WOODWISDOM-NET COUNTRIES

ANIWHA not ready yet

ERASySAPP systems biology centers, systems biology companies, stakeholders; still to be started by partners

ARIMNET   Mapping of existing research programmes and research capacity on Mediterranean agriculture in the ARIMNet countries

[x] No

ERA-IB-2

9.2 If yes, would you recommend other ERA-NETs to do the same?

[x] Yes

ERA-CAPS, ICTAGRI, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD, RURAGRI, ANIWHA, ERASySAPP

ETB-PRO (we found this was the basis to start joint actions)

[x] Not exactly the same (please explain)

Bioenergy, SAFEFOODERA

ERASynBio depends on the maturity of the scientific field

WoodWisdom Although quite a lot of mapping exercises have been done already, the systems are changing and therefore it might be useful at least to update the existing data. In any case the mapping exercises teach you a lot about other organisations, and the more you know about the other participating organisations, the easier it is to trust each other and to build future collaboration.

ERA ARD There have been many mapping exercises and any new mapping must add new knowledge otherwise the mapping takes over from the research

ARIMNET It was very time consuming and not totally satisfying (difficulties to obtain the data, and data that could be compared between countries, difficulties to define the scope of this regional ERA-Net –What is Mediterranean and what is not-, …)


[x] No

CORE Organic

10 Development of strategic research agenda (SRA)
10.1 Does your ERA-NET have an SRA?

[x] Yes (please provide structure, focus, target group and document)

ERASynBio work just started

ETB-PRO, FORESTERRA, ERA ARD

ICTAGRI It is not finished yet

EUPHRESCO Yes (please provide structure, focus, target group and document) www.euphresco.org (a Deliverable from EUPHRESCO-1)

RURAGRI https://www.ruragri-era.net/lw_resource/datapool/_items/item_45/ruragri_sra.pdf (see attachment RURAGRI_SRA.pdf)

ANIWHA not ready yet

ERASySAPP We use the SRA from our predecessor ERA-Net ERASysBIO and will update and develop that further

[x] No

Bioenergy, CORE Organic, SAFEFOODERA, ETB-PRO, ERA-IB-2, MARINEBIOTECH, WoodWisdom, ARIMNET

ERA-CAPS Our ERA-NET is based on the Plants for the Future ETP SRA

SUSFOOD (No, not yet, but it will be)

10.2 Do you have experiences from elaborating SRA that you would like to share for mutual learning?

[x] Yes (please provide)

ICTAGRI [Public consulting]

RURAGRI This was a long discussion, given the wide scope of topics and diversity of national interests. It requires a much longer process and more guidance than anticipated. In the end, the conclusions for a common call are rather difficult if so many partners and diverse interests have to be reconciled.

ERA ARD We are holding a very important International conference on 5th June 2013 in Brussels. This is very forward looking and strategic thinking

[x] No

Bioenergy, CORE Organic, ERASynBio, ERA-CAPS, SAFEFOODERA, ERA-IB-2, EUPHRESCO, FORESTERRA, SUSFOOD, WoodWisdom, ANIWHA, ERASySAPP, ARIMNET

10.3 How and where is your SRA used?


[x] please provide organizations, institutions that use your SRA and what is the application area for its use

ERASynBio will be used to identify routes of funding

EUPHRESCO The SRA sets out a general approach and framework of the common research areas/agendas shared by partners, plus aspects about implementation. One major use is to set out clearly what types of projects EUPHRESCO can/will take forward and what is more suitable for EU funding; hence the document is made available to the EU (DG-SANCO and DG-Research; EU CWP of the COPHS).

11 Interaction with stakeholders
11.1 Which stakeholders are involved in your ERA-NET?

[x] The organisations/people subscribing for our newsletter are considered our stakeholders. Please explain who they are.

Bioenergy, WoodWisdom, ICTAGRI

ERASynBio no NK yet

ERA-CAPS Scientists, policy makers, industrialists

SAFEFOODERA European organisations for food producers, restaurants, researchers, agencies for food safety,

[x] The ones requesting it or suggested by a partner

CORE Organic, ERA-CAPS, FORESTERRA, SUSFOOD, RURAGRI, WoodWisdom, ERASySAPP

[x] Others, please explain how you identify this group

ERA-CAPS We have the European Plant Science Organisation as an observer in the ERA-NET. EPSO represents the plant science community. The European Commission is also a stakeholder

ETB-PRO All institutions/organizations having a stake in the ERA-NET: Program owners (policy level) and agencies (operative level); applicants (SMEs, ROs); Clusters and SME organizations; EC.

ERA-IB-2 Each national/regional ERA-IB-2 partner holds their own stakeholder lists, and invitations to stakeholder events are distributed from the WP leader to all partners, and from each partner to their national/regional stakeholders. This ensures maximum outreach, as central data storage would limit the lists due to data protection considerations.

EUPHRESCO We have very little ‘industry’ stakeholder involvement. However we have key stakeholders involved in our Expert Advisory board, consisting of EPPO, EFSA, SANCO and 3 Governing Board members (MS reps, but also some are part of COPHS, i.e. chief Officers of Plant Health Services)

MARINEBIOTECH Funding agencies and relevant stakeholders like companies, organisations, NGOs, etc are identified through the partner’s networks.

SUSFOOD We have decided to involve several types of stakeholders (JPIs, ERANETS, Projects, NGOs, SCAR experts, ETPs, research associations, industries …)


WoodWisdom In essence there are two distinct groups of stakeholders – WoodWisdom-Net+ internal and external stakeholders.

Internal Communication

i. Consortium members

ii. European Commission – WoodWisdom-Net+ Project Officer

External Communication

i. Scientific community

ii. Forest-Based Sector Technology Platform (FTP)

iii. Industry

iv. Other R&D funders

v. Other European initiatives

vi. Other associations at the national, European and international level

vii. The media

ARIMNET ARIMNet chose to open the identification of the potential priorities on thematic areas, and the assessment of their relevance, to stakeholders and experts invited to participate in a Conference, giving them an opportunity to provide a valuable and thoughtful contribution to the envisaged discussions. Previously a working group composed of the Coordinator, INIA, GDAR, CIHEAM and ICARDA had been working on its preparation. All the ARIMNet's partners, in their respective countries, proposed the stakeholders they considered most representative from the different groups and sectors: national and regional agricultural research services, research councils, agricultural universities or university agricultural departments, agro and agrofood industry, regional and international organizations, farmers' organizations, NGOs (consumer organizations and other organizations involved in environmental issues).

11.2 How and when do you involve stakeholders?

[x] When prioritising the research topics for the call

Bioenergy, ERA-CAPS, ERA-IB-2, MARINEBIOTECH, FORESTERRA, SUSFOOD, WoodWisdom, ANIWHA, ERA ARD, ARIMNET

[x] They are partners or associated partners of the consortium and attend our governing/steering meetings

CORE Organic, ERA-CAPS, MARINEBIOTECH, FORESTERRA, ANIWHA

[x] They receive newsletters or other information about the ERA-NET and projects funded

ERASynBio, ERA-CAPS, ETB-PRO, ICTAGRI, ERA-IB-2, MARINEBIOTECH, FORESTERRA, SUSFOOD, RURAGRI, WoodWisdom, ERASySAPP

[x] Other, please explain

SAFEFOODERA We have stakeholder meetings discussing research priorities

ERA-IB-2 Through the Advisory Board for ERA-IB-2

EUPHRESCO EAB invited to annual project meetings. We write 6-monthly progress reports to COPHS.


SUSFOOD The stakeholders are gathered in an External Advisory Group which will meet jointly with our Governing Board.

WoodWisdom For example, they are invited to our annual research programme seminars, and there are some joint activities such as among other things combining seminars (e.g. WoodWisdom-Net Research Programme seminar in Stockholm on 11th November 2009 was organised in association with the Forest-Based Sector Technology Platform Conference).

ERA ARD We have a Southern advisory group to provide input and contact as the ERA-ARD Net is one of the few working with partners outside Europe

11.3 Are these stakeholders involved in dissemination of the results from the funded projects?

[x] Yes (please explain how)

FORESTERRA, RURAGRI, WoodWisdom

Bioenergy workshops

ERA-CAPS Via EPSO

EUPHRESCO alerts to research reports made via EPPO website and its own ‘Reporting Service’

SUSFOOD probably (we plan to involve them; the dissemination paths are still to be defined)

[x] No

CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, ICTAGRI, ANIWHA, ERASySAPP, ARIMNET

[x] Other, please explain

ERA-IB-2 They are a target group rather than a medium

MARINEBIOTECH they are encouraged to disseminate newsletters etc.

12 In general, should mutual learning in PLATFORM

[x] maintain the ERA-LEARN concept

Bioenergy, ERASynBio, SAFEFOODERA, ETB-PRO, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD

[x] and amend / improve the content

CORE Organic, ERASynBio, SAFEFOODERA, ETB-PRO, EUPHRESCO, MARINEBIOTECH, FORESTERRA, SUSFOOD

[x] be more normative

ICTAGRI


[x] aim at “framework conditions”

ERASynBio, ERA-IB-2, MARINEBIOTECH, FORESTERRA

[x] if yes, do you think this aim is realistic within KBBE relevant ERA-NETs

ICTAGRI, FORESTERRA

[x] if yes, what is your priority / flexibility towards reaching this aim, please explain

ERASynBio personal communication is most worthy in my eyes

MARINEBIOTECH Follow ERA-LEARN / PLATFORM and use the Netwatch guidelines in planning and discussing with the ERA-net partners. Feed experience back to the PLATFORM / Netwatch activities.

Where would you place your own ERA-NET


Report of PLATFORM Master Class on Call Management  Version: Final, 2 September 2013 

 

Report of the Master Class on call management 17-18 June 2013

Organised by WP3 Mutual Learning

Date: 17 June at 11.00 to

18 June at 15.00

Venue: KoWi premises Rue du Trone 98, 8th floor

Brussels, Belgium

Dinner: 17 June at 20:00, Restaurant Hemispéres

Report contents

Monday June 17

Welcome & Introduction

A. Support tools for matchmaking and handling of applications and evaluation
B. Evaluation and ranking of applications
C. Selection of recommended applications, funding decision and project negotiation – Part I: selection of projects and funding decision

Tuesday June 18

C. Selection of recommended applications, funding decision and project negotiation – Part II: project negotiation
D. Joint project monitoring

Any other topics that were brought up by participants

List of presentations

Participants


Monday June 17

 

Welcome & Introduction 

 

Welcome by Christine Bunthof (P1) with a short introduction to PLATFORM, followed by a tour de table of the participants.

 

A quick introduction to the ERA-LEARN call toolbox was given by Christian Listabarth (P9), including how to get help on the topics that will not be discussed in this Master Class. He announced that ERA-LEARN will organise a workshop for ERA-NET newcomers in September/October this year, in which all topics would be covered. Topics relevant to this Master Class were selected by the PLATFORM partners, and an introduction to the topics of the Master Class was presented.

ERA-LEARN started in 2009 and established, among other things, a toolbox for call implementation and coordination, and organised the annual ERA-NET and Joint Programming conferences; ERA-LEARN feeds information into the Netwatch portal (http://netwatch.jrc.ec.europa.eu/web/lp/learning-platform). It collects and provides generic information but does not make recommendations; PLATFORM could therefore complement the toolbox with recommendations based on the expertise of the participants.

The ERA-LEARN call toolbox has a modular structure with six sections. In the toolbox, all uploaded documents of PLATFORM participants could be made available to the ERA-NET community as complementary information, and a set of recommendations (the outcome of the Master Class) could be interlinked with the relevant sections of the toolbox. This workshop aims at recommending good practices in call management and tries to collect relevant documents and templates from the various ERA-NETs (to be provided to ERA-LEARN).

 

The general structure of the meeting was introduced by Ulla Sonne Bertelsen (P11). Each item was introduced by the session chair. After a case presentation, a Round Table was held. The questions and remarks led to reflections and further sharing of information and ideas. The session chair guided the discussions towards conclusions and recommendations, which were collected and collated for a report on good practices and recommendations.

 

 

A. Support tools for matchmaking and handling of applications and evaluation

Chair: Ulla

The session focussed on the use and performance of electronic submission and evaluation systems (ESS). This also touched upon security standards (the system, the handling of the proposals, and storage and access).

1) CASE  

Iver Thysen from ICT‐AGRI (Information Technology and Robotics in Agriculture) introduced 

the “Meta Knowledge Base” which is being transformed into an open source system for 

matchmaking and on‐line submission and evaluation of applications. It can be used by all 

ERA‐NETs. The ERA‐NET ICT‐AGRI collects metadata of projects and partners and provides it 

Page 47: Call management Master Class, 17-18 June 2013

Report of PLATFORM Master Class on Call Management  Version: Final, 2 September 2013 

on‐line for about 1100 users at the moment. Through a Member login one can join forces in 

a secured setting. There are opportunities to create private or public project‐proposals. The 

tool can also serve for matchmaking; one can share and search for profiles. The coordinator 

(ICT AGRI) is currently exploring a suitable open‐source platform (DRUPAL) in order to 

adapt the system and to share data with the community outside ICT‐AGRI. Once ready, it 

will be available for free use by other ERA‐NETs. Key issues are to fill and maintain the 

database; at current stage the partner‐profiles are rather advanced; available actual project 

information is limited. Consortium information of the projects can be posted by 

coordinator. National regulations such as funding schemes, themes, and eligible funding are 

provided automatically if the data have been entered into the system.  The electronic 

application forms available are online  as well as a tool for on‐line evaluation. The system 

also provides a tool to automatically calculate budgetary consequences of different 

proposal selections supporting the discussions and decision taking during selection of 

proposals and helps maximising the number of funded proposals.  
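
As a purely illustrative sketch of what such a budget-consequence check does (the country codes, amounts and function name below are invented for the example and are not taken from the ICT-AGRI system):

    from collections import defaultdict

    # Hypothetical data: requested funding per proposal, broken down by country (EUR).
    proposals = {
        "P01": {"DE": 400_000, "DK": 250_000},
        "P02": {"DE": 300_000, "FR": 350_000},
        "P03": {"DK": 200_000, "FR": 150_000},
    }
    national_budgets = {"DE": 600_000, "DK": 500_000, "FR": 400_000}

    def budget_consequences(selection):
        # Sum the requested funds per country and report what remains of each budget.
        demand = defaultdict(int)
        for pid in selection:
            for country, amount in proposals[pid].items():
                demand[country] += amount
        return {c: (demand[c], national_budgets[c] - demand[c]) for c in national_budgets}

    # Compare two alternative selections during the funders' discussion.
    for selection in (["P01", "P02"], ["P01", "P03"]):
        print(selection, budget_consequences(selection))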

 

2) DISCUSSION

Discussion of working methods and experiences along the questions: do you assist in matchmaking? Do you use an electronic submission system? How do you secure the data?

Call management systems in use. A quick survey of other operational call management systems yielded: PTJ uses an ESS (PETERAS); this system is adjustable through a subcontracted ICT company, and it is not possible to make it open source. However, if the costs of the subcontractor and the workforce spent on adaptation by Project Management Jülich, Germany, are carried, the system is available for other networks. It does not provide a database for the selection of referees. Apart from Project Management Jülich there are other providers of ESS too, mostly offering the ESS including specific modifications on demand: DLR, VDI/VDE, AIT (all GE), NWO (NL), and a Basque private company. The ERA-LEARN toolbox will update the chapter on ESS during the summer. ERA-NET Bioenergy used a drop-box system (access by user key and password) to handle some 30 proposals successfully. However, the ESS is also a sign of prestige, so e.g. large companies could be reluctant to submit proposals to a drop-box. Most ERA-NETs are already using an ESS; others look for a simple and cost-efficient system. Some ERA-NETs (SUMFOREST/CoFASP) are exploring the market for existing ESSs that would fit their own purposes.

Matchmaking methods. WoodWisdom-Net organised a brokerage event for matchmaking during their last conference: every participant gave a short flash presentation of project ideas or partner competences. Afterwards all flash presentations were made public on the WoodWisdom-Net website. The overall feeling was that it worked quite well for finding matching collaboration partners. Participants were charged a small fee, so there were zero costs. ERASynBio, like some other ERA-NETs, offers factsheets for interested partners to profile themselves on the ERA-NET webpage.

Data protection. How about data protection when using an open source system? Open source systems can be installed on your own server and you carry your own responsibility for the data. Data protection/security is more of a concern once you go into an open-source system. Though it depends on the competition within the area, it was generally felt that one should not over-react on security issues. A web-based drop-box protected by user key and password is also considered safe, but submission and proposal handling by e-mail is not considered sufficiently secure.

In order to protect novel ideas, access to proposals should be limited to evaluators, who should declare confidentiality. Adequate confidentiality regulations must be defined. National contact points should, however, have exclusive access to all data.

 

3) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group:

- Electronic submission and management tools: there is no one-size-fits-all model; more advertisement of existing open source models and/or service providers is needed.
- The community should get ready for the new ERA-NET Plus top-up scheme (funding distribution tool). Simple systems and profiles are recommended.
- Existing models are either tailor-made or adjusted from existing National Funding Organisations' systems.
- There is a need for a budgetary tool for financial recommendations after ranking. ERA-SysBio Plus uses a system based on Excel sheets (contact Bernhard Gillissen); this is available in the ERA-LEARN toolbox [1].
- Confidentiality of proposal data should be guaranteed during the assessment phase; security of data is a relevant issue to be addressed.
- A list of systems should be provided on ERA-LEARN (e.g. the PTJ system PTERAS, open source systems, etc.).
- Aiming for a single cost-efficient meta-data system for all ERA-NETs would be beneficial.

 

B. Evaluation and ranking of applications

Chair: Christine

The session was introduced with the message that quality assurance of the evaluation and ranking procedure is a key factor in the joint funding of high-quality international research projects. The evaluation process is very critical for the selection of high-quality projects, and is seen by some call managers as the most stressful period in a call procedure.

 

1) CASE

Part I

Paul Beckers from ERA-CAPS introduced their method of expert and panel evaluation, which is based on best practice in the previous ERA-NET Plant Genomics (ERA-PG) calls. They used a one-step full proposal submission procedure using an adjusted ESS of the German Science Foundation (DFG). So far there has been one thematically open call within the molecular plant sciences area. An eligibility check was performed according to the criteria published in the Call Notice and National Regulations; external evaluators were used to obtain expert opinions (2-3 independent experts, non-numerical rating). Different sources for evaluator recruitment were presented; the procedure included a rebuttal step where applicants are allowed to briefly comment on the anonymous review reports. A Review Panel assessed the proposals, review reports and rebuttals, leading to a grouped ranking (a list that is numerical but grouped). The procedure complies with the basic principles of peer review (Code of Conduct on Conflicts of Interest; Application Guidelines; Evaluation and Selection Guidelines).

[1] http://netwatch.jrc.ec.europa.eu/web/lp/learning-platform/toolbox/call-implementation/funding-decisions/funding-decisions

 

Part II

Peer Review Services offered by the European Science Foundation (ESF) were presented as a possibility for outsourcing. ESF has longstanding experience in international peer review and produced a reference document on best practices [2]. The service they offer is modular and can be tailored to specific needs. Cost aspects need to be assessed on a case-by-case basis.

 

2) DISCUSSION on methods and experiences

Pre-proposals v. full proposals (one-step v. two-step procedure). For a combination of reasons, from a too heavy burden on the scientific community and administrative staff to a negative cost-benefit analysis, ERA-IB skipped the external review step over time and only used an expert panel with a rebuttal step. Most other ERA-NETs use a pre-proposal step but still use external peers for full proposals; pre-proposals enable you to increase the success rate and to reduce the efforts to be made by applicants, reviewers and administrative staff. In various ERA-NETs either the Funding Agencies or the Review Panel decides on pre-proposals. Applicants tend to object to rejection by administrators; they want an expert opinion as a basis for rejection.

External peers. There is a need for databases of qualified peers; EUREKA may give access to their sources. The EC provides access to its evaluator databank on request. The process of getting access to the EC database is described in the ERA-LEARN toolbox [3].

Evaluation criteria and comparability of scores. It is important to provide strict and clear guidelines on evaluation criteria (quality and weighting), and to provide exact explanations for the use of scores (in almost all ERA-NETs discrepancies in the use of scores have been observed). It is good practice to reduce the range of scores: evaluators like it and are forced to take decisions. In general, except for very narrow topics and within a panel in which each panellist knows every single proposal, the comparability of proposals to justify a final ranking list was heavily challenged.

Rebuttal. A rebuttal stage has been welcomed by most of the science community. Some of the ERA-NETs used it; others are considering doing so. All who employed rebuttals found it a good experience rather than a burden.

Outsourcing peer review. The interest in outsourcing depends on the actual costs. ERASME and CORNET used the EUREKA office for their peer review; outsourcing may be an option. The European Science Foundation could be contacted to discuss the needs and to obtain quotes for their services.

[2] http://www.vr.se/download/18.2ab49299132224ae10680001647/European+Peer+Review+Guide.pdf
[3] http://netwatch.jrc.ec.europa.eu/web/lp/learning-platform/toolbox/call-implementation/evaluation/procedures/decentralised-evaluation-carried-out-by-national-programmes


 

3) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from the group:

- There is a need for transparent communication of National Regulations regarding eligibility.
- A two-step process (pre- and full proposals) is the preferred model because it decreases disappointment on the applicants' side and reduces the workload for ER (external reviewers) and RP (review panel).
- Frequently a success rate of at least 25%, but preferably 30-40%, for full proposals is aimed for. Anything below 25% makes the cost-benefit balance negative. A two-step procedure makes it possible to reduce the number of full proposals being submitted (a small worked example follows this list).
- The number of expected proposals determines whether a pre-proposal stage is needed.
- The result of the pre-proposal assessment could be formatted as a non-binding recommendation in case national regulations do not allow a rejection without external peer review or without a full evaluation.
- Review Panel evaluation remains desirable in most cases.
- There is a need for a clear Code of Conduct on Conflicts of Interest and Guidelines for Evaluation and Selection.
- Give clear instructions to ER and RP about scoring criteria and a limited range of scores in order to avoid artefacts in the ranking.
- Access to the EC expert database (and to the EUREKA database – to be explored) is possible; access rules to expert databases should be shared.
- To spread the workload in the search for external referees, some ERA-NETs agreed to do it proportionally to the proposals: countries that had most applicants involved in proposals should provide most ER names.
- A rebuttal step is recommended because it adds value to the quality of assessment and to transparency.
- Even symbolic fees paid to the Review Panel members would value their indispensable contribution to the quality assurance of the evaluation process.
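
The success-rate guidance above can be illustrated with a back-of-the-envelope calculation; the numbers below are invented for illustration only and do not come from any specific call:

    # Hypothetical figures illustrating the 30-40% full-proposal success-rate target.
    fundable_projects = 12           # projects the joint budget can roughly support
    target_success_rate = 0.33       # aim between 0.30 and 0.40 at the full-proposal stage
    full_proposals_to_invite = round(fundable_projects / target_success_rate)
    print(full_proposals_to_invite)  # about 36 full proposals to invite after the pre-proposal stage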

 

 

C. Selection of recommended applications, funding decision and project negotiation – part I

Chair: Christian

Part I concerns the selection of projects and the funding decision.

1) CASE

Nicolas Tinois from FZ-JUELICH introduced different experiences on the selection of applications and funding decisions. Different methods to get to a recommended funding list were addressed. Evaluation results can be presented either as a (binding) ranking list or as a group model. The possibility to redress a proposal was offered in specific cases; redressing a proposal means fulfilling a condition in order to become eligible for funding, e.g. replacing a partner who is not eligible.

Various models are applied:
- ETB, CORE Organic: group model
- SUSFOOD: binding ranking list per topic (i.e. more than one ranking list); e.g. ANIHWA tries to fund at least one proposal per topic
- FACCE-JPI: one unique proposal

 

2) DISCUSSION

The differences between rebuttal (a process during evaluation) and redress (a process after the first or second evaluation step) were explained. Redressing is meant to be more flexible, to replace partners who drop out for unforeseen reasons (e.g. SMEs who went bankrupt).

The new ERA-NET scheme under Horizon 2020 requires that the funding recommendations be based on a binding ranking list. This would leave no room for using the group model, which most participants considered the best model for funding a maximum of good projects. Often the restricted availability of funding from one (or more) partner(s) prevents other partners from funding projects that are evaluated as good projects but are not ranked high enough. An optimisation of funds is, therefore, not possible. At the same time, the reliability of ranking objectively and absolutely within a group of, e.g., excellent projects was heavily doubted by the participants.

Groups of similar project quality help to avoid jumping along a poorly trusted, absolute ranking list, and are considered the single most appropriate outcome of a thorough evaluation.

As for the application of ERA-NET Plus, successful applicants to FP7 mentioned that the global funding of the consortium must not be overestimated in the ERA-NET Plus application, because of the potential drop-out of funds considering the distribution of effective funds following the binding ranking list. In the ERA-NET Plus scheme, for example, a maximum of 8 M€ top-up funding from the EC should be matched by 16 M€ actually deployed (not potentially available) for funding from the national partners.
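
Purely as an illustration of the figures quoted above, and assuming for this sketch only that the top-up scales proportionally with the nationally deployed funds (an assumption, not a statement of the official rules):

    # 8 M EUR maximum EC top-up against 16 M EUR actually deployed by national partners.
    max_topup, national_deployed = 8_000_000, 16_000_000
    print(max_topup / (max_topup + national_deployed))  # top-up is one third of the combined pot
    # If partners end up deploying only 12 M EUR, a proportional top-up would shrink as well:
    print(12_000_000 * max_topup / national_deployed)   # 6,000,000 EUR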

The question of how much of the top-up funding should be spent to fill gaps in the ranking list and how much will be attributed to national funding budgets was discussed. This must be decided before the funding recommendations are made, but it is evident that the more top-up funding goes to gap-filling, the better a ranking list can be served.

 

 

3) Preliminary CONCLUDING AND RECOMMENDING

- The recommendation for the new ERA-NET scheme should be in favour of group scoring, without making any concessions to the quality of the evaluation of the research proposals.
- Groups must be defined as equally ranked proposals.
- Two-step evaluation and panel ranking is the preferred model/best practice; maintain best practice and reduce unnecessary ranking to assure the optimal spending of taxpayers' money.

 

 

*****End of day 1 of the master class***** 


Tuesday June 18  

Conclusions of the discussions of Part C.I 

 

Introduction by Christian to summarise, continue and conclude the discussion of the previous day: a two-step evaluation with a final panel ranking, in which panellists know most or all projects, was identified as best practice. It is absolutely important to aim at the highest quality of the evaluation process, but final scores are considered to introduce an artificial hierarchy.

Final scores of a ranking list with minute differences tend to be arbitrary, due to (i) different individual use of scores and (ii) a poor comparability of projects.

Variable geometry in the provision of funds often leads to the rejection of even highly ranked projects. The group model was the preferred model of the evaluation outcome, because it allows optimising the funding recommendations.

A possible solution would require the following characteristics: (i) to avoid any loss of, and keep, the highest quality of evaluation; (ii) to reduce unnecessary (and doubtful) hierarchy in the ranking and allow for equally ranked proposals; (iii) to consolidate the dichotomy of “ideal nuances of scientific quality” v. “the pool of proposals considered cutting edge/excellent”.

Reflections from the brainstorming session among participants on how to solve the current 

proposed EC regulations of the ERA‐NET Plus scheme: 

It is virtually impossible for a Review Panel to provide a rational basis for distinguishing 

between proposals of which  the overall quality  is very much alike.   This  leads  to an 

artificial  ranking  of  neighbouring  applications  which  does  not  reflect  the  actual 

scientific appraisal of the proposals. Experience shows that there is no rational basis to 

rank one proposal over the other among a category that  is defined narrowly enough. 

E.g. among ‘very good’ proposals any further distinction is often arbitrary. 

Like the EC, the national procedures for proposal selection aim for excellence, and the intention of national research funding organisations is to provide funds for the best transnational project proposals. Therefore, any loss of, or concession on, the quality of the evaluation of the research proposals is to be avoided.

It is a top priority that national funds and the top-up funding are used effectively in order to benefit the ERA.

Options that provide solutions to the problems that arise from binding ranking lists are: 

Ranking proposals at equal ranks, as this would reduce the controversy over artificial hierarchy and help optimize the use of the available funds.
Creating multiple parallel calls with different scopes or participation possibilities instead of one rigid format, as this would allow the desired flexibility in ERA-NET Plus implementation.
An approach with an overall ranking list and subtopic ranking lists, where the allocation of national money starts with the top-ranked projects; top-up funding can then be used to cover topics that would otherwise not be funded.

These options are constrained by EC documents that already state that there can only be one call 

with one single and binding ranking list in ERA‐NET Plus in Horizon 2020. 



 

The community is in favour of keeping the flexibility of different evaluation outcomes and models for ERA-NET calls. This would foster a greater impact of the investment provided through national funds.

 

C. Selection of recommended applications, funding decision and project negotiation – part II 

Chair: Ulla

Part II concerns the project negotiation.   

The distributed pot mode of funding, which is used most often in ERA-NET calls, combined with participation in proposals that is unbalanced relative to the national budget commitments, sometimes creates complex puzzles for funders.

 

1) CASE  

Nicolas Tinois presented his experiences with the negotiation of the single proposal for a FACCE-JPI Knowledge Hub. This proposal had to be revised based on the recommendations of the evaluators.

 

2) DISCUSSION 

Negotiating proposal budgets. In general it is good practice to avoid cutting back parts of projects to save money (i.e. to guarantee that national budgets are not exceeded), as such cuts are perceived as disrespectful towards applicants and evaluators. If necessary, any revised proposal should be re-evaluated by (part of) the Review Panel.

The National Funding Organisations (NFOs) should check their financial mandates before the Moderating Panel meeting to speed up the process of funding decisions. To increase transparency and to avoid raising wrong expectations, NFOs should publish their funding ceilings and the project costs they expect to fund. Expected budget reductions should be communicated to the applicants.

National funding decisions. One of the perceived bottlenecks in the negotiations is obtaining the national funding decisions at the agreed time; such delays are no longer justifiable in view of the general ambition of smooth and lean public funding procedures. In cases where a signed consortium agreement (CA) is mandatory before the project can start or a funding decision can be taken, this may cause additional delay. How simple the CA can be, and how quickly it can be signed, depends on the complexity of the consortium issues.

 

3) CONCLUDING AND RECOMMENDING  

Various templates for Consortium Agreements are available on ERA-LEARN/NetWatch; specific templates might be added, but ultimately the applicants themselves will have to negotiate their Consortium Agreement individually.
Clear communication by NFOs about maximum fundable amounts is necessary to avoid budget cuts at a late stage.
In general it is good practice to avoid cutting back parts of projects.
Political power must be mobilised to streamline the timelines of national funding decision procedures in order to get projects started (simultaneously).

 



D. Joint project monitoring. Methods of following up the projects funded by the ERA-NETs during their lifetime

Chair: Christian

Monitoring of jointly funded projects is mandatory in the ERA‐NET Plus scheme, at least 

within the life span of the ERA‐NET action. Most national funders have monitoring 

procedures for projects in place. The question is how to replace or transform national 

monitoring systems in order to best serve the transnational character of the ERA‐NET calls. 

There are different levels of monitoring, which are often dealt with all together: (i) during 

the lifetime of a project, (ii) after the project end (impact assessment), and (iii) impact of 

the ERA-NET instrument. Project monitoring can be used as an instrument to guide a consortium towards better project performance, while assessments of impact are rather ex-post evaluations. This Master Class item focused on the monitoring of project performance.

 

1) CASE 

Christine Bunthof introduced how ERA-NET Plant Genomics (ERA-PG) monitored the funded research projects. The objective of the ERA-PG funding programme was to foster excellent science, transnational collaboration, and synergy in investments. ERA-PG launched calls in 2006 and in 2008 with a combined budget of over 55 million euros. In 2006 there were two sub-calls to serve the schemes wished for by different funding organisations: Sub Call A aimed at curiosity-driven/basic science, with funders financing academic teams, and Sub Call B aimed at innovation-driven research by public-private partnerships, in which private partners were also eligible for funding. In 2008 there was a second call for basic science. In the process of developing a common, effective and efficient set of procedures for the monitoring of funded projects, a working group was established that took into account the current practices of the funders involved. The aim was to agree on the purpose, content, frequency and form of the monitoring, and to produce a unified system for reporting. There were differences between the Sub Calls in the objective of the reporting/monitoring, rooted in the national schemes that were brought together in the ERA-PG calls. In short, in Sub Call A monitoring was about justification of the spent budget, while in Sub Call B the follow-up process involved a mid-term evaluation with a go/no-go decision. Most steps of the process for Sub Call A and Sub Call B were the same, including a common procedure for collecting reports whereby the Call Secretariat sent mid-term report forms to the lead PI of the funded projects. The submitted reports were uploaded to a restricted part of the intranet and the funding organisations were notified. The secretariat and the National Contact Points assessed the reports. Because projects had started at different times, with almost a year between the start of the first and the last one, this work was spread out over a long period of time.

Extra required steps were incorporated into the procedure for the 14 projects of Sub Call B. An expert panel read the reports, and during a closed seminar for Sub Call B the PIs presented the projects while the evaluation panel discussed the progress and gave recommendations. The seminar also served as a networking opportunity for the participants.

The organisation of the monitoring after the contract with the Commission ended was a challenge. There was no direct continuation and no funding organisation offered to take on the whole task. The option chosen was to distribute the task previously carried out by the



project management office (incl. the call secretariat) among the funding organisations, so that the organisation funding the lead PI of a research consortium became, as 'lead funding organisation', responsible for communication with the consortium and with the other funding organisations, continuing the established way of working. Having the tasks thus distributed among a few NFOs works, but is not ideal. The recommended solution would instead be to keep the original secretariat running, if needed supported by fees, and of course with National Contact Points still in place in all organisations.

Grant-holder meetings were held bi-annually, back-to-back with large conferences on plant genomics, so that the project leaders could present their project plans or progress. This was felt to be beneficial by the scientific community and the national funding organisations and provided good networking opportunities.

The evaluation of the ERA-PG Programme (programme impact assessment) is taken up in one of the Work Packages of the ERA-NET Coordination Action in Plant Sciences, ERA-CAPS. Furthermore, at its first programme seminar ERA-CAPS invited the consortia funded under the second call of ERA-PG to present their project results.

  

2) DISCUSSION 

The discussion on working methods and experiences with reporting and monitoring 

covered the following topics:   

Impact assessment. WoodWisdom-Net has developed some parameters to measure impact. Annual seminars are used to collect and follow scientific progress; these may replace annual content reporting, which needs to be decided among the NFOs.
The transnational aspect is considered the highest added value of impact assessment.

Monitoring. The question was raised whether a go/ no‐go decision has an impact on the 

performance of the consortium. It was felt that the reporting requirement enables the coordinator to solve potential problems in the consortium. Only PTJ uses go/no-go decisions, for large-scale five-year programmes. No further experience with this measure

among stakeholders was expressed. Most programmes adhere to the national funding 

regulations which have an annual reporting requirement and are only entitled to withdraw 

money if a grant‐holder is not performing well or at all.  CORE Organic uses web‐based 

evaluation meetings instead of physical meetings. ETB / ICT‐AGRI has a national monitoring 

system in place which is shared among the involved NFOs. ICT‐AGRI developed an on‐line 

monitoring tool for uploading reports and publications and offering a discussion forum.

BBSRC has no annual monitoring; reports are only submitted at the end of a grant. The gap between the end of the ERA-NET project and the end of the monitoring process needs to be bridged; frequently, national funding organisations that volunteer to contribute to this part are entrusted with the task. The question arose of what happens in case of discrepancies among NFOs in their assessment of project progress. In ERA-PG, issues that could potentially result in a funder not accepting a report were generally resolved by the Secretariat asking the lead PI to provide more detailed information or to explain deviations from the original plans more clearly. Some points are only relevant at the national level (e.g. eligibility of expenses, or hiring within a certain time after the grant decision); in such cases, lack of compliance was dealt with at the national level.



Impact of the ERA-NET instrument. An impact assessment of the FP6 ERA-NETs, commissioned by the EC, has been carried out by Matrix Insight – Rambøll4. The study involved an extensive amount of work, including surveys among all ERA-NETs, many interviews, and analysis. As

far as any of the master class participants knew, there is no such extensive study planned 

for FP7 ERA‐NETs. Reports are available on Netwatch that map and monitor the ERA‐NETs 

and the Commission recently updated its report on ERA‐NET, ERA‐NET Plus and JPIs and 

their joint calls5.  The bioeconomy ERA‐NETs welcome these monitoring and summary 

reports. The question came up whether an extensive study on FP7 ERA-NETs, as was done for FP6 ERA-NETs, would be useful in view of Horizon 2020. The ERA-NETs present at the PLATFORM Master Class would have a general interest in a potential impact assessment of FP7 ERA-NETs and would contribute to surveys. It is, however, difficult to see the added value of such an assessment in the perspective of Horizon 2020, since the framework is already set. It would only be useful if highlighting the rationale of the FP7 ERA-NET instrument (alongside ERA-NET Plus) could affect the making of policy and regulations. Furthermore, it was

recognized that reports demonstrating the impact could contribute to promoting ERA‐NETs 

as instruments, ultimately to leverage national funding for ERA‐NETs. 

 

 

3) CONCLUDING AND RECOMMENDING. Concrete conclusions and recommendations from 

the group: 

Financial reporting is better placed at, and will stay at, the national level; monitoring of the scientific progress is felt to be useful at the transnational level.
Project management is an integral part of the application and has been evaluated at the start of the project. One should trust the project coordinator to implement it as proposed, and no separate monitoring of project management is required.
Financing of the ex-post evaluation and monitoring is felt to be a bottleneck and needs to be addressed. This relates to the discussion about the development of self-sustained ERA-NETs.
There is no joint impact assessment tool for the ERA-NETs; they do this individually.
Creating a legacy of what has been achieved in the various ERA-NETs and emphasising the good aspects for future collaborative funding initiatives would be highly valued.
Use of conferences for mid-term and end-term meetings is recommended.
There is strong support for developing an on-line common database for monitoring; existing models like the one developed for ICT-AGRI (Meta Knowledge Base) should be considered (see the sketch below for the kind of information such a database might hold).
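Purely as an illustration of what such a common monitoring database might record, and not a description of the actual ICT-AGRI Meta Knowledge Base, a minimal sketch with assumed field names could look as follows:

```python
# Hypothetical sketch of a shared on-line monitoring record; all field names
# are illustrative assumptions, not the ICT-AGRI Meta Knowledge Base schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Report:
    period: str                      # e.g. "mid-term" or "final"
    submitted: date
    accepted_by_funders: bool = False

@dataclass
class ProjectRecord:
    acronym: str
    era_net: str
    lead_pi: str
    funders: list[str] = field(default_factory=list)       # participating NFOs
    reports: list[Report] = field(default_factory=list)    # uploaded reports
    publications: list[str] = field(default_factory=list)  # DOIs or references
    forum_posts: list[str] = field(default_factory=list)   # discussion items

# Example entry with made-up data
record = ProjectRecord("EXAMPLE", "ICT-AGRI", "J. Doe", funders=["DK", "DE"])
record.reports.append(Report("mid-term", date(2013, 6, 1)))
```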

 

4 Published in: FP6 ERA-NET Study – Summary of the Impact Assessment Study of the ERA-NET scheme under the Sixth Framework Programme (EC report EUR 23909 EN, June 2009) and four volumes composing the full report. All can be retrieved at http://ec.europa.eu/research/evaluations (search in Documents on the topic 'ERA-NET').
5 Report on ERA-NET, ERA-NET Plus and JPIs and their joint calls, Directorate B – European Research Area, B.4 – Joint Programming; May 2013.



End of the official agenda: a general request was presented to the participants to upload relevant documents (in the context of the workshop) to the PLATFORM Intranet. Any problems with PLATFORM logins should be forwarded to Linda Oud (PLATFORM project administrator).

 

Any other topics that were brought up by participants: Selection of call topics 

 

1. Iver Thysen presented a case on how to select topics for a Call for Proposals.   

As an example to demonstrate the need, he mentioned ICT-AGRI, which did not receive project suggestions on the topics it thought were most needed when launching broad calls. They decided to do topical calls instead. From a stakeholder consultation they derived a Strategic Research Agenda and Action Plan. In order to implement the plan, they would like to identify and involve new funders and partners. They used the ICT-AGRI online tool to consult the scientific community to identify bottlenecks in its specific scientific area. This led to a harmonised set of topics that were useful for dedicated calls.

 

2. Alois Egartner presented the EUPHRESCO On‐line tool for the topic suggestion and selection process.  

To identify and select topics and build consortia, EUPHRESCO developed its own procedure with eight different phases within each round of research initiation (= call). These phases are independent of the funding mechanisms that are ultimately used and require a one-year period from the 'preparation of initiation' until the 'funding decision phase', in which partners decide on their support for the agreed topics.
An online timetable (with e-alerts) informs partners about the phases and the required actions. In the phase 'initial identification of topic suggestions', partners can propose topics via the online tool on the EUPHRESCO website. After overlapping topic suggestions have been merged by the call coordinators ('reviewers'), partners can join the suggested topics and thereby inform the community about their potential funding interest in the phase 'joining listed topic suggestions'. This is followed by three automated topic selection steps (separate phases), after which only those topics remain in the process that fulfil the minimum criteria: sufficient partner interest, an assigned Topic-Coordinator and an agreed short topic description. The 'funding decision phase', in which partners finally decide on their funding participation and thereby form the funding consortia, closes EUPHRESCO's topic suggestion and selection process. See http://www.euphresco.org/public/calls/index.cfm.

 

Industrial participations registered in several ERA-NETs (ERA TransBio, ERA-PG, WoodWisdom-Net) could be collected by consulting the databases of those ERA-NETs; this information might be used to identify potential industrial stakeholders for topic selection and for partnering in project proposals.

 



List of presentations

The presentations have been distributed to the participants of the Master Class and are available for 

the PLATFORM partners on the project intranet. 

1. ERA‐LEARN Toolbox. Christian Listabarth 

2. ICT‐AGRI Call Submission and Administration. [CASE for item A]  Iver Thysen 

3. ERA‐CAPS Joint Call for proposals 2012. Evaluation and ranking [CASE for item B] Paul Beckers 

4. ESF Peer Review Services [Contribution for item A] Paul Beckers 

5. Selection of recommended applications, funding decision and project negotiation [CASES for 

Item C] Nicolas Tinois 

6. ERA‐PG joint project monitoring [CASE for item D]  Christine Bunthof 

7. EUPHRESCO Online tool for topic selection [Extra topic] Alois Egartner 



 

Call management Master Class, 17-18 June 2013 Organised by WP3 Mutual Learning

Participant List

  Name  Email  Network 

1  Alex Percy‐Smith  [email protected]  ERA‐ARD 

2  Alois Egartner   [email protected]  EUPHRESCO 

3  Anabel de la Peña  [email protected]  IPM 

4  Anna Macey   [email protected]  ERA‐CAPS 

5  Annette Kremser  a.kremser@fz‐juelich.de  ERASynBio 

6  Dominique Vandekerchove 1)  [email protected]  ANIHWA 

7  Elfriede Fuhrmann 2)  [email protected]  RURAGRI 

8  Ignacio Baanante Balastegui   [email protected]   ERASysAPP 

9  Iver Thysen  [email protected]  ICT‐AGRI 

10  Johannes Bender   [email protected]  SUMFOREST  

11  Katerina Kotzia  [email protected]  CORE Organic and COFASP 

12  Marie Ollagnon  [email protected]  ARIMNET 

13  Marion Karrasch‐Bott  m.karrasch@fz‐juelich.de  ERA‐IB 

14  Marta Norton  [email protected]  ERA‐MBT 

15  Matte Brijder  [email protected]  ERA‐NET BIOENERGY 

16  Mika Kallio  [email protected]   WoodWisdom‐Net 

17  Nicolas Tinois   n.tinois@fz‐juelich.de   FACCE‐JPI 

18  Paul Beckers  [email protected]  ERA-CAPS 

19  Petra Schulte  petra.schulte@fz‐juelich.de    ERA‐MBT 

20  Veronika Deppe   v.deppe@fz‐juelich.de  ETB 

  Organisers    

21  Christine Bunthof  [email protected]  PLATFORM 

22  Christian Listabarth   [email protected]  PLATFORM 

23  Ulla Sonne Bertelsen  [email protected]  PLATFORM 

1)  day 1 only.  2)  day 1 only 

ERA‐NETs not attending: EMIDA, SAFEFOODERA, FORESTERRA, BiodivERsA.