M&E Manual for SMEDA
December 2012

Table of Contents

Acknowledgements ............................................................................................................... 5

List of Abbreviations .............................................................................................................. 6

M&E GLOSSARY ..................................................................................................................... 6

1. Introduction ....................................................................................................................... 8

2. Background ........................................................................................................................ 8

2.2 Importance of SME Sector ........................................................................................................ 8

2.3 SMEs in Pakistan ...................................................................................................................... 8

2.4 Importance of Monitoring and Evaluation ................................................................................. 8

2.5 Monitoring and Evaluation Regime for Public Sector Development Projects ............................... 9

3. M&E Regime at SMEDA – Situation Analysis ....................................................................... 9

3.1 M&E Challenges for SMEDA .................................................................................................... 11

M&E FRAMEWORK FOR SMEDA ........................................................................................... 12

4. Monitoring and Evaluation and its Link with Planning ....................................................... 12

4.1 Project Planning and M&E ...................................................................................................... 14

4.2 Existing Planning Regime at SMEDA ........................................................................................ 14

4.3 Strengthening Planning Processes at SMEDA ........................................................................... 14

4.4 Adopting the Results Framework at SMEDA ............................................................................ 15

4.5 Developing the Results Framework ......................................................................................... 15

5. Scope of the M&E Framework for SMEDA ........................................................................ 17

5.1 Activities/projects to be monitored and evaluated .................................................................. 17

5.2 Responsibility for M&E ........................................................................................................... 18

5.3 Timing of M&E Activities ......................................................................................................... 18

5.4 M&E Approach and Methods .................................................................................................. 18

5.5 Resources for M&E ................................................................................................................. 18

6. M&E APPROACH METHODS AND TOOLS ........................................................................... 18

6.1 Types of Evaluation ................................................................................................................ 22

6.2 M&E Stages ............................................................................................................................ 22

7. INDICATORS ..................................................................................................................... 23

8. IMPLEMENTING THE M&E REGIME FOR A PROJECT .......................................................... 25

STEP 1 – KEY QUESTIONS .............................................................................................................. 26

STEP 2 – M&E APPROACH AND METHODOLOGY ........................................................................... 26

STEP 3 – SELECTING INDICATORS .................................................................................................. 26

STEP 4 – DATA COLLECTION .......................................................................................................... 27

STEP 5 – TIMEFRAME ................................................................................................................... 27

STEP 6 – RESOURCES .................................................................................................................... 27

STEP 7 – IMPLEMENTATION .......................................................................................................... 27

9. M&E TIMING AND RESPONSIBILITY .................................................................................. 28

10. INSTITUTIONAL ARRANGEMENT ..................................................................................... 30

11. USE OF TECHNOLOGY – M&E MIS ................................................................................... 31

12. ACTION PLAN FOR M&E FRAMEWORK ROLL-OUT ........................................................... 32

APPENDIX A – REPORTING AND RECORD MANAGEMENT GUIDELINE FOR SMEDA TRAINING ACTIVITIES ........................................................................................................................... 34

1. Record Management ........................................................................................................ 34

2. Reporting Requirements .................................................................................................. 35

2.1 Training Inception Report ....................................................................................................... 35

2.2 Course participant Data Sheet ................................................................................................. 36

2.3 Monthly Training Progress Report ........................................................................................... 36

2.4 Training Completion Report .................................................................................................... 37

FORM A – ATTENDANCE SHEET ............................................................................................ 38

FORM C – INSTRUCTOR’S CV ................................................................................................ 41

FORM E – COURSE PARTICIPANT DATA SHEET ...................................................................... 44

FORM F – TITLE SHEET ............................................................................................................ 0

APPENDIX B – M&E TOOLKIT FOR SMEDA TRAINING ACTIVITIES ............................................. 1

1. Monitoring Visit SOPs......................................................................................................... 1

2. Reporting SOPs .................................................................................................................. 2

FORM I – MONITORING VISIT FORM ....................................................................................... 3

FORM II – ATTENDANCE SHEET ............................................................................................... 5

FORM III – TRAINEE FEEDBACK FORM ..................................................................................... 7

FORM IV – MONITORING VISIT REPORT FORM ....................................................................... 8

APPENDIX C – M&E TOOLKIT FOR SMEDA ADVISORY SERVICES ............................................ 10

Client Registration Form ...................................................................................................... 12

Advisory Service Feedback Form .......................................................................................... 13

APPENDIX D - SOFTWARE USER REQUIREMENT SPECIFICATIONS FOR SMEDA’s M&E MIS ..... 15

APPENDIX D – DRAFT RESULTS FRAMEWORK FOR SMEDA .................................................... 19

References ........................................................................................................................... 22

Acknowledgements

This report has drawn on the international literature available on monitoring and evaluation within the development sector's context. While relevant footnotes have been provided at some places within the report, it was not possible to credit the relevant guides and handbooks at every stage. In particular, this framework relies heavily on: Guidelines for Project Management, Planning Commission of Pakistan; Handbook on Planning Commission, Planning Commission of Pakistan; Handbook on Planning, Monitoring and Evaluating for Development Results, UNDP, 2009; The Monitoring and Evaluation Handbook for Business Environment Reform, IFC; Monitoring and Evaluating Projects: A Step-by-step Primer on Monitoring, Benchmarking, and Impact Evaluation, Grun, Rebekka E., 2006; 'Malaysian Experiences of Monitoring in Development Planning', Discussion Paper, Hussain, Datuk Zainul Ariff, Implementation Coordination Unit, Jabatan Perdana Menteri, Putrajaya, Malaysia; A Guide for Project M&E, International Fund for Agriculture Development (IFAD), 2002; and Guidelines for Preparing a Design and Monitoring Framework, Asian Development Bank, 2007.

In addition, this report has been developed with valuable support provided by USAID, in particular from Mr. Iqbal Ahmad Raja from ASP and from Mr. Khurram Khan, Ms. Nadia and Ms. Hameedullah Khan from SMEDA. A lot of useful information was also provided by various SMEDA officials during the meetings.

List of Abbreviations

AiD Associates in Development (Pvt.) Ltd.

ASP Assessment and Strengthening Program

ERKF Economic Revitalization of Khyber Pakhtunkhwa and FATA

IFAD International Fund for Agriculture Development

IFC International Finance Corporation

LUMS Lahore University of Management Sciences

M&E Monitoring and Evaluation

MIS Management Information System

MoI Ministry of Industries

PC Planning Commission

PISDAC Pakistan Initiative for Strategy Development & Competitiveness

PKR Pakistani Rupee

PMES Project Monitoring and Evaluation System

PPMEIU Project Planning Monitoring Evaluation and Implementation Unit

PSDP Public Sector Development Programme

RBM Results Based Management

RSPN Rural Support Programs Network

SME Small and Medium Enterprise

SMEDA Small and Medium Enterprise Development Authority

UNDP United Nations Development Programme

USAID United States Agency for International Development

M&E GLOSSARY

INPUTS - The resources (people, money, expertise, technology and information) that will be used to deliver the activities/tasks of the project/program. It is usual to monitor inputs and activities, providing information for analysis and ultimately data for an evaluation.

ACTIVITIES OR TASKS - The actions taken or the work performed as part of an intervention. For example, the provision of technical advice, training sessions, facilitation of meetings or events etc. Activities utilize inputs, such as funds, technical assistance and other types of resources to produce specific outputs. Essentially activities or tasks are what the project will ‘do’.

OUTPUTS - The immediate results derived from the activities of the project. Outputs might be experienced directly by those targeted by the intervention (e.g. training or advice) or indirectly through products like reports, mapping of a situation, etc.

OUTCOMES - The short-term and medium-term results of an intervention's outputs, usually requiring the collective effort of partners. Outcomes represent changes in conditions that occur between the completion of outputs and the achievement of impact. It is usual to evaluate outcomes, providing information for analysis and ultimately data for impact assessment.

IMPACT - Positive and negative, long-term results/benefits for identifiable population groups produced by an intervention, directly or indirectly, intended or unintended.

IMPACT ASSESSMENT - Seeks to capture impacts that have occurred and, ideally, to differentiate those changes that are attributable to the project/intervention from those caused by external factors. It can take place throughout the project/program, but usually towards or after the end of a project/program, and is undertaken by those not involved in the project implementation.

BASELINES - A set of factors or indicators used to describe the situation prior to a development intervention and act as a reference point against which progress can be assessed or comparisons made. These are sometimes referred to as benchmarks.

INDICATORS - A quantitative and/or qualitative variable that allows the measurement and verification of changes produced by a development intervention relative to what was planned.

TARGETS - Indicators are a means by which change will be measured; targets are definite ends or amounts, which will be measured. A target is an explicit statement of the desired and measurable results expected for an indicator at a specified point in time. Targets should be expressed in terms of quantity, quality and time.

MILESTONES - Significant points in the lifetime of a project. A particular point in the project by which specified progress should have been made.

1. Introduction

The Small and Medium Enterprise Development Authority (SMEDA) is the leading government agency for the development and promotion of Small and Medium Enterprises (SMEs) in Pakistan. SMEDA was initially established in 1998; in 2002 the SMEDA Ordinance was promulgated, giving it the status of an independent authority and a corporate entity functioning under the Ministry of Industries (MoI). The USAID-funded Assessment and Strengthening Program (ASP)1 is working with SMEDA to build its capacity. Amongst SMEDA's other institutional weaknesses, the absence of a robust monitoring and evaluation regime severely hampers the organization's capacity to monitor and manage the quality of its projects and measure their impact. To address this weakness, this M&E framework has been developed with ASP's support to strengthen SMEDA's capacity to effectively monitor its initiatives and create a more visible and meaningful impact in future. The framework covers the overall M&E regime that SMEDA would adopt, along with a brief action plan, and would be adopted in the coming months.

2. Background

2.2 Importance of SME Sector

Although the SME sector plays a critical role in all economies, it assumes even greater significance in developing countries. Besides being a major growth driver and provider of employment, it also acts as a nursery for future corporations and a testing and breeding ground for new technologies and innovation.

2.3 SMEs in Pakistan

In Pakistan, the SME sector (companies with fewer than 100 employees) constitutes nearly 90% of all 3.5 million private firms2. These firms employ 80% of the non-agricultural labor force and claim approximately a 40% share of annual GDP. The small-scale sector in Pakistan has largely grown up as an informal sector, with enterprises in construction, wholesale, retail, trading, hotels, transport, communications and storage industries in urban areas. These SMEs face a host of challenges, ranging from the absence of intellectual property rights to the rising cost of doing business, and SMEDA has been endeavoring to address these challenges to facilitate the sector's growth.
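As a rough check of what the figures above imply, the quoted share of SMEs can be turned into an absolute count. This is a back-of-the-envelope illustration based on the SMEDA estimates quoted above, not an official statistic:

```python
# Back-of-the-envelope reading of the figures quoted above.
# The shares are SMEDA estimates; the derived count is illustrative only.
TOTAL_PRIVATE_FIRMS = 3_500_000  # total private firms in Pakistan (SMEDA estimate)
SME_SHARE_PCT = 90               # SMEs as a share of all private firms, in percent

# Integer arithmetic keeps the derived count exact
sme_firms = TOTAL_PRIVATE_FIRMS * SME_SHARE_PCT // 100
print(f"Implied number of SMEs: ~{sme_firms:,}")  # ~3,150,000
```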

SMEDA was established in 1998 under the Ministry of Industries and Production. The organization's main objective is to facilitate and encourage small and medium entrepreneurs. SMEDA provides entrepreneurs with business and sector development services such as legal advisory services, financial services, industrial information services and technical assistance. The organization is spread across all four provinces of Pakistan and provides entrepreneurs with various training programs. The Policy and Planning Wing of SMEDA assists in policy drafting and research that can facilitate small and medium-sized entrepreneurs.

2.4 Importance of Monitoring and Evaluation

Monitoring and evaluating projects and activities helps an organization assess its performance, calibrate and refine ongoing activities, and inform the development of future programs and activities. Monitoring and evaluation are also closely linked with planning: effective project plans clearly lay out the results framework, determine expected outputs and outcomes, and identify relevant indicators. This framework then forms the basis for further M&E activities. It is also important to distinguish between monitoring and evaluation, as the two generally encompass different activities. Monitoring is an ongoing process through which relevant stakeholders get regular feedback on the progress being made under a particular project, program or activity, whereas evaluation is usually a rigorous and often independent assessment of either completed or ongoing activities to determine the extent to which they have achieved their stated objectives. Monitoring can be done in-house by an organization or through a third-party monitor, whereas evaluations are mostly done by independent reviewers. Evaluations can be done at the beginning of a project (baseline evaluation), during the course of a project (mid-year, year-end or mid-term evaluations), at the end of a project (end-term evaluation) or even a few months or years after the completion of a project (impact assessment).

1 'Assessment and Strengthening Program (ASP) for Civil Society Organizations and Government of Pakistan' is a five-year USAID-funded initiative aimed at strengthening the capacity of local organizations. The program is being implemented by three Pakistani organizations, including LUMS, RSPN and AiD.
2 SMEDA estimates.

To ensure effective delivery of results, SMEDA, like any other development organization, requires a concrete monitoring and evaluation regime. This would allow the organization to regularly evaluate the outcomes of its activities and would help in assessing the strengths and weaknesses of various projects, informing the planning of future initiatives.

2.5 Monitoring and Evaluation Regime for Public Sector Development Projects

The Planning Commission of Pakistan is the country's leading planning agency and is responsible for preparing the Public Sector Development Programme (PSDP), facilitating implementation by line agencies, and developing policy guidelines for project preparation, review, appraisal, monitoring and evaluation. All the PC formats used by public agencies in both federal and provincial governments are developed by the Planning Commission. The Planning Commission has a dedicated Monitoring and Evaluation Wing headed by the Member (Implementation and Monitoring), who is assisted by a Director General, Directors, Project Directors and a number of Monitoring Officers.

The PC has developed specific monitoring and reporting formats; apart from the PC-I to PC-V forms, PW-002 (Project Profile) and PW-003 (Monitoring Report) are frequently in use. The Planning Commission has also developed the Project Monitoring and Evaluation System (PMES), an MIS tool to monitor and review the progress of various projects. All federal ministries have web-based access to PMES, through which they can upload information regarding ongoing projects.

3. M&E Regime at SMEDA – Situation Analysis

SMEDA delivers its functions through four distinct divisions: Outreach; Business and Sector Development Services; Central Support; and Policy and Planning. The four provincial offices fall under the Outreach Division of SMEDA and engage in a number of diverse activities supporting the SME sector.

Although SMEDA undertakes a wide variety of initiatives, its activities can be broadly classified under three areas:

I. Key Routine Activities - These are routine activities undertaken by various divisions to support SME development such as development of sector strategies; provision of business advisory and legal services; formulation of pre-feasibilities; financial services software support; managing the Information Resource Center; development of research reports and special publications; undertaking trainings, workshops and seminars, etc.

M&E of Key Routine Activities - Currently, there is no formal monitoring regime employed by SMEDA for its key routine activities. Over the past few years, all divisions have been making their annual plans and presenting their progress at the end of each year. But beyond this exercise, there is no periodic monitoring or evaluation being done of any of these initiatives.

II. PSDP Projects - In recent years, the financial support provided to SMEDA through its regular budget has shrunk, with an increasingly low share of funding for development activities. Consequently, SMEDA has been seeking development budget under the Public Sector Development Programme (PSDP). To date, SMEDA has worked on 29 PSDP projects, of which five have been completed, three are new and the rest are operational. Amongst the operational projects, allocations have been made against only 15 projects for 2012-13.

M&E of PSDP Projects - The PSDP projects are monitored as per Planning Commission guidelines to some extent. In particular, these projects are monitored in three ways:

A) Reporting, Monitoring and Evaluation by Planning Commission (PC) – All PSDP projects are approved through formal PC-Is and are monitored by the Planning Commission in some way. Although progress on all PSDP projects is supposed to be reported through PC-III (a and b) forms, this has not been done to date by SMEDA, or indeed elsewhere in the public sector. For some of the completed projects, PC-IVs (Completion Reports) have been prepared. SMEDA does, however, report progress on all PSDP projects regularly through the Project Monitoring and Evaluation System (PMES), the Planning Commission's MIS. In addition, PC staff also conduct some on-site monitoring of selected projects.

[Figure: SMEDA organogram – CEO at the top, with the four divisions (Outreach; B&SDS; Central Support; Policy & Planning) and their units, including the provincial offices (Punjab, Sindh, KPK, Balochistan), Industry Support Programme, Financial Services, Training Services, Sector Development, Legal Services, Technical/Innovation, E-Services/IIN, Information Resource Center, Accounts, Human Resources, Administration & PR, MIS/IT, CEO Secretariat, Planning & Coordination, Policy Development, Donor Coordination, Research, and PPMEIU]

B) M&E Regime of the Ministry of Industries – MoI, being SMEDA's parent ministry, also conducts some monitoring of all the PSDP projects. This includes the submission of a cash plan, work plan and progress report by each project to the ministry, for onward submission to and approval by the Planning Commission. As per instructions issued by the Planning Commission, all projects now prepare their work plans in accordance with fixed ceilings for financial releases each quarter (20% in each of the first and second quarters and 30% in each of the third and fourth quarters); however, releases are often constrained by resource availability at the Planning Commission and frequently fall short of the requested amounts. In addition, the ministry reviews each project at the end of each quarter, using information collected through a template developed by the Ministry. The Ministry also conducts sporadic monitoring visits to the development projects. Within the ministry, the office of the Joint Secretary (Development) is responsible for monitoring activities and has also been implementing a project to strengthen them.
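The fixed quarterly ceilings described above translate into a simple phasing calculation. The sketch below is purely illustrative; the PKR 10 million allocation is a hypothetical figure, not an actual project budget:

```python
# Work-plan phasing under the Planning Commission's fixed quarterly release
# ceilings: 20% in each of the first two quarters, 30% in each of the last two.
QUARTERLY_CEILINGS_PCT = [20, 20, 30, 30]  # percent of annual allocation

def phase_releases(annual_allocation_pkr: int) -> list[int]:
    """Maximum release per quarter for a given annual allocation (PKR)."""
    assert sum(QUARTERLY_CEILINGS_PCT) == 100  # ceilings must cover the full year
    return [annual_allocation_pkr * pct // 100 for pct in QUARTERLY_CEILINGS_PCT]

# Hypothetical example: a PKR 10 million annual allocation
print(phase_releases(10_000_000))  # [2000000, 2000000, 3000000, 3000000]
```

As the text notes, actual releases are often lower than these ceilings, depending on resource availability at the Planning Commission.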

C) SMEDA's Internal Monitoring Regime - In order to monitor SMEDA's PSDP projects and to coordinate with the Ministry's and Planning Commission's monitoring activities, a dedicated unit (developed under a separate PSDP project), the Project Planning, Monitoring, Evaluation and Implementation Unit (PPMEIU), tracks the progress of each project using an in-house developed proforma. The unit is thinly resourced and staffed with only two officials.

III. Donor-Funded Projects - Over the years SMEDA has undertaken a number of projects with bilateral and multilateral support, such as the USAID-funded Pakistan Initiative for Strategy Development and Competitiveness (PISDAC). Currently SMEDA is involved in three such projects: two funded by UNDP (Early Recovery and Restoration of Flood Affected Communities in Pakistan, and Supporting Pro-Poor Governance for Legal Empowerment of the Poor) and one funded by the World Bank, namely Economic Revitalization of Khyber Pakhtunkhwa and FATA (ERKF).

M&E of Donor-Funded Projects - All the donor-funded projects are monitored through monitoring regimes in accordance with donor agencies’ requirements and have fairly well developed results frameworks.

3.1 M&E Challenges for SMEDA

Some of the key issues identified during the inception phase include the following:

Absence of an M&E Regime for SMEDA's Key Routine Activities - A review of the existing monitoring regime makes clear that while donor projects are monitored through the M&E regimes employed by the donor agencies, and the PSDP projects are under the scrutiny of the Planning Commission, the Ministry of Industries and SMEDA's own PPMEIU, no monitoring or evaluation has been done of any of SMEDA's key routine activities, such as the provision of business advisory and legal services; formulation of pre-feasibilities; financial services software support; development of research reports and special publications; and trainings, workshops and seminars. This underscores the need for a sound M&E framework that covers SMEDA's routine activities and provides a solid evidence base for results achievement and better planning of future initiatives.

Excessive Focus on Monitoring Brick-and-Mortar Projects – Another important aspect of the existing monitoring regime for PSDP projects is its overwhelming focus on brick-and-mortar activities and on funds utilization; there is rarely any monitoring or evaluation of softer interventions. For instance, the Planning Commission and MoI regimes may cover the construction and initial operation of a common facility center (CFC), but there is no mechanism in place to measure, review and monitor the services provided through the CFC. The proposed M&E framework needs to cover these critical activities, which form the core services offered by SMEDA for SME development and promotion.

Planning Vacuum and Absence of a Results Framework – It is interesting to note that while SMEDA has a clear vision and mission statement as well as some very crisp strategic objectives, there is no detailed results framework that would lay out the chain of results. While it may be simple enough to develop a monitoring and evaluation framework for various SMEDA interventions, it would be very hard to implement any such regime in the absence of a sound results framework. Moreover, while all divisions make their annual plans, these plans are mostly expressed in terms of outputs rather than desired results. This calls for strengthening the planning process at SMEDA to identify and lay out indicators against which progress can then be measured through an M&E framework.

Limited Capacity and Resources for M&E – The availability of capacity and resources is a major constraint on implementing a robust M&E framework in the organization. The PPMEIU is currently staffed with only two people, and SMEDA has also been facing resource shortages in recent years. It will be critical to ensure that adequate resources and capacity are available to implement the proposed M&E regime; viable options include developing a new PC-I for adoption of the new framework, extending the old PC-I, or seeking donor assistance.

Unidentified Indicators and Baselines – While each initiative and project undertaken by SMEDA can have its own desired results, it will be critical for the proposed framework to identify some key indicators that reflect the contributions SMEDA has made, and will continue to make, to SME development. Once these key indicators are identified, it will be important to identify primary and secondary sources for measuring them and to set a baseline.

Embedding the Overall M&E Cycle – It is important to note that M&E frameworks are not just meant for monitoring, highlighting deviations from desired results or uncovering irregularities. The framework should also ensure that results are linked with project management for exception management and course correction. Moreover, these results should feed into the planning process to improve future projects. The proposed M&E framework should therefore cover the whole M&E cycle.

M&E FRAMEWORK FOR SMEDA

4. Monitoring and Evaluation and its Link with Planning

M&E provides government officials, development professionals, the private sector and civil society with better means for learning from past experience, improving service delivery, planning and allocating resources, and demonstrating results as part of accountability to key stakeholders. Although evaluation is distinguished from monitoring, the two are interdependent. Monitoring presents what has been delivered, while evaluation answers the question "what has happened as a result of the intervention?" Impact evaluation is a particular aspect of evaluation, focusing on the ultimate benefits of an intervention. Monitoring gives information on where a policy, program or project stands at any given time (or over time) relative to its targets and outcomes, focusing in particular on efficiency and the use of resources. While monitoring provides records of activities and results, and signals problems to be remedied along the way, it is descriptive and may not be able to explain why a particular problem has arisen, or why a particular outcome has occurred or failed to occur.

Evaluation deals with questions of cause and effect. It assesses or estimates the value, worth or impact of an intervention and is typically done on a periodic basis, perhaps annually or at the end of a phase of a project or program. Evaluation looks at the relevance, effectiveness, efficiency and sustainability of an intervention. It provides evidence of why targets and outcomes are or are not being achieved and addresses issues of causality.

What are Monitoring, Evaluation and Impact Assessment?3

Monitoring Regular systematic collection and analysis of information to track the progress of program implementation against pre-set targets and objectives. Did we deliver?

Clarifies program/project objectives

Links activities and their resources to objectives

Translates objectives into performance indicators and sets targets

Routinely collects data on these indicators, compares actual result with targets

Reports progress to managers and alerts them to problems

Evaluation Objective assessment of an ongoing or recently completed project, program or policy, its design, implementation and results. What has happened as a result?

Analyzes why intended results were or were not achieved

Assesses specific casual contributions of activities to results

Examines implementation process explores unintended results

Provides lessons, highlights significant accomplishments or program potential and offers recommendations for improvement

Impact Assessment Impact assessment assesses what has happened as a result of the intervention and what may have happened without it - from a future point in time. Have we made a different and achieved our goal?

Seeks to capture and isolate the outcomes that are attributable to (or caused by) the program/project

Will review all foregoing M&E activities, processes, reports and analysis

Provides an in-depth understanding of the various causal relationships and the mechanisms through which they operate

May seek to synthesize, compare, contrast a range of interventions in a region, timeframe, sector or reform area
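The routine monitoring loop described above — collect data on indicators, compare actual results with targets, and alert managers to problems — can be sketched in a few lines. This is an illustrative sketch only; the indicator names, target values and 90% alert threshold are hypothetical assumptions, not SMEDA's actual data or rules.

```python
# Illustrative monitoring sketch: compare actual indicator values with
# pre-set targets and flag shortfalls for managers (hypothetical data).

def monitor(targets, actuals, alert_threshold=0.9):
    """Return per-indicator progress and a list of alerts.

    An alert is raised when an actual value falls below
    alert_threshold * target (default: 90% of target).
    """
    report, alerts = {}, []
    for indicator, target in targets.items():
        actual = actuals.get(indicator, 0)
        progress = actual / target if target else 0.0
        report[indicator] = progress
        if progress < alert_threshold:
            alerts.append(f"{indicator}: {actual} of {target} ({progress:.0%})")
    return report, alerts

# Hypothetical annual-plan targets vs. results to date
targets = {"SMEs advised": 500, "Training sessions": 40}
actuals = {"SMEs advised": 510, "Training sessions": 28}
report, alerts = monitor(targets, actuals)
# "Training sessions" is at 70% of target, so it is flagged
```

The point of the sketch is the separation monitoring requires: targets are fixed at planning time, actuals are collected routinely, and the comparison is mechanical and reportable.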

3 The Monitoring and Evaluation Handbook For Business Environment Reform

4.1 Project Planning and M&E Effective monitoring and evaluation of a project or program depends on how effectively it has been planned and on the clarity of its desired results. Often it is not clear what a particular project or activity aims to achieve. In many ongoing projects at SMEDA, the outputs have been clearly earmarked, but the intended outcome and impact are either not clear or there are gaps in the results chain. Planning, monitoring and evaluation processes should be geared towards ensuring that results are achieved, not towards ensuring that all activities and outputs get produced as planned. Moreover, these individual project results must be in line with SMEDA's overall goals and vision.

4.2 Existing Planning Regime at SMEDA It is interesting to note that while SMEDA has a clear vision and mission statement as well as some very crisp strategic objectives, there is no detailed results framework, which would lay out the chain of results. While it may be simpler to develop a monitoring and evaluation framework for various SMEDA interventions, it would be very hard to implement any such regime in the absence of a sound results framework. Moreover, while all divisions make their annual plans, these plans are mostly in terms of outputs and not desired results. This calls for strengthening the planning process at SMEDA to identify and lay out indicators against which progress can then be measured through an M&E framework.

4.3 Strengthening Planning Processes at SMEDA Results based management (RBM) is a good approach for synthesizing planning, monitoring and evaluation, and has also been adopted by the Planning Commission of Pakistan. All PC-Is now have a section on desired results, although these are seldom given due consideration in project planning.

RBM4 is defined as ‘a broad management strategy aimed at achieving improved performance and demonstrable results’, and has been adopted by many multilateral development organizations, bilateral development agencies and public administrations throughout the world. RBM is an ongoing process, encompassing continuous feedback, learning and improving. Existing plans are regularly modified based on the lessons learned through monitoring and evaluation, and future plans are developed based on these lessons.

Monitoring is also an ongoing process. The lessons from monitoring are discussed periodically and used to inform actions and decisions. Evaluations should be done for programmatic improvements while the project/program is still ongoing and also inform the planning of new projects/programs. This ongoing process of implementing, learning and improving is what is referred to as the RBM life-cycle approach.

4 At some places, RBM is also referred to as Managing for Development Results (MfDR) to place the emphasis on development rather than organizational results.

Figure 1: The RBM Life Cycle Approach5

4.4 Adopting the Results Framework at SMEDA Before an effective M&E regime can be adopted, it is critical that SMEDA adopt a results framework for all its projects as well as for the overall organization6. This is not a very difficult task, as all the PSDP projects do have defined objectives and relatively clear outputs laid out in their respective PC-Is. Other key routine activities are captured in annual plans made by all four divisions, with clearly set targets. There is a need to link these targets, often set for outputs, to the desired outcome and impact.

4.5 Developing the Results Framework This section provides some guidance on developing a results framework for various projects and activities at SMEDA. However, detailed technical guidance is also available through a number of internationally available resources on implementing RBM. To begin with, the management of SMEDA needs to clearly identify the desired impact and outcome of every project and other activities, as well as the planned outputs and inputs (or activities) for them. Indicators for all targets also need to be identified, along with the source of their verification. The following matrix provides some guidance on these terms:

PARAMETER: QUESTIONS THAT THE RELEVANT PROJECT MANAGER OR DIVISION GM SHOULD ASK

IMPACT (vision, goal, objective, longer term outcome, long-term results)

What are we trying to achieve? Why are we working on this problem? What is our overall goal?

5 Handbook on Planning, Monitoring and Evaluating for Development Results; UNDP 2009

6 While this report presents some guidance on how to adopt the results framework, the scope of the assignment does not cover technical assistance in adoption.

OUTCOME (first, positive result or immediate result, prerequisites, short- and medium-term results)

Where do we want to be in three-to-five years? What are the most immediate things we are trying to change? What are the things that must be in place first before we can achieve our goals and have an impact?

OUTPUT (interventions)

What are the things that need to be produced or provided through projects or programs for us to achieve our short- to medium-term results? What are the things that different stakeholders must provide?

ACTIVITIES (inputs, actions)

What needs to be done to produce these outputs?

INDICATORS (measure, performance measurement, performance standard)

How will we know if we are on track to achieve what we have planned?

MEANS OF VERIFICATION (data sources, evidence)

What precise information do we need to measure our performance? How will we obtain this information? How much will it cost? Can the information be monitored?

The outcome and impact part answers the 'why' question, while the outputs present the answer to what is going to be done under any given project or program. Together, they define the desired results. The inputs and activities, on the other hand, answer how all of this is going to be done and what resources would be required to achieve it (please see the figure below).
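As a minimal sketch, the results chain described in the matrix above can be represented as a simple nested data structure linking impact, outcomes, outputs, activities, indicators and means of verification. All statements and indicator names below are hypothetical examples, not SMEDA's actual results framework.

```python
# Hypothetical results framework as a nested structure: the "why"
# (impact, outcomes), the "what" (outputs) and the "how" (activities),
# each level carrying indicators with a means of verification.

results_framework = {
    "impact": {
        "statement": "Strengthened SME sector contribution to the economy",
        "indicators": [{"name": "New jobs created by SMEs",
                        "verification": "commissioned SME survey"}],
    },
    "outcomes": [{
        "statement": "Improved performance of assisted SMEs",
        "indicators": [{"name": "% cost savings in assisted SMEs",
                        "verification": "follow-up survey"}],
        "outputs": [{
            "statement": "Business advisory services delivered",
            "target": 500,
            "indicators": [{"name": "Number of SMEs advised",
                            "verification": "PPMEIU monitoring forms"}],
            "activities": ["Recruit advisors", "Run provincial help desks"],
        }],
    }],
}

def verify_chain(rf):
    """Check for the gap noted in the text: every level of the chain
    must carry at least one indicator with a means of verification."""
    levels = [rf["impact"]] + rf["outcomes"] + [
        o for oc in rf["outcomes"] for o in oc["outputs"]]
    return all(lvl["indicators"] and
               all(i.get("verification") for i in lvl["indicators"])
               for lvl in levels)
```

A check like `verify_chain` makes the "gaps in the results chain" mentioned in Section 4.1 mechanically detectable: a project with outputs but no outcome indicators would fail it.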

Figure 2: Planning and Implementation in Results Framework7

7 Handbook on Planning, Monitoring and Evaluating for Development Results; UNDP 2009

5. Scope of the M&E Framework for SMEDA This section presents the scope of the proposed M&E framework for SMEDA. It provides guidance as to what should lie within this scope; however, it would be for SMEDA's management to modify this proposed scope.

5.1 Activities/projects to be monitored and evaluated It is suggested that all activities undertaken by the four divisions of SMEDA be brought under this M&E framework. As explained earlier, the activities performed by SMEDA can be categorized under three major areas: key routine activities; PSDP projects; and donor-funded projects/programs.

Key Routine Activities - For key routine activities, it is suggested that SMEDA should develop an overall results framework for the organization and these activities (and their desired outputs) should be clearly linked with these organizational results. For instance, if a key intended outcome of SMEDA is to ensure strengthened capacity of SMEs in Pakistan, a key output can be to provide capacity building to business development service providers as well as to provide direct advisory services, with clearly defined targets. Currently, besides the annual planning activity and periodic review meetings, there is no M&E system in place for these activities. Therefore, once the results framework for SMEDA as a whole and then for different activities performed by different divisions is ready, the M&E of these activities can be undertaken.

PSDP Projects - For PSDP projects, the Planning Commission (and even the Ministry of Industries) has a robust enough M&E framework and it is therefore suggested that this proposed framework should not be applied to these projects during their implementation period. However, there is a need to bring these projects under this proposed M&E framework, once their PC-Is are completed and these projects (such as common facility centers, etc.) are handed over to partner organizations or work as independent entities.

Donor-funded Projects - For donor-funded projects, the respective donors have their own results frameworks and there is no need to bring them under this M&E framework. However, there is a need to review on an ongoing basis that these results frameworks are in line with the overall SMEDA vision and results and are clearly defined.

5.2 Responsibility for M&E While the overall responsibility for the M&E framework should be entrusted to the PPMEIU, there is a need for delegated responsibility and a flexible arrangement for each project. There is also a need to strengthen PPMEIU to enable it to take on these increased responsibilities.

Key Routine Activities – For key routine activities, the M&E responsibility should be entrusted to PPMEIU. While the respective divisions should continue with their annual planning and quarterly review exercise, the PPMEIU should be the custodian of all this work and should act as the driving force.

PSDP Projects - For PSDP projects, the PPMEIU is already providing assistance in carrying out M&E work during the life of the project. As suggested earlier, these projects should be monitored and evaluated even after the completion of the PC-I, to improve their functioning and to assess their usefulness; the respective PC-Is should therefore lay out these post-completion reporting, monitoring and evaluation requirements. These requirements may identify an extended or limited role for PPMEIU; in either case, PPMEIU should ensure that all PC-Is duly address this aspect.

Donor-funded Projects - For donor-funded projects, the PPMEIU should just act as a quality assurance unit to ensure that their results framework and M&E plans are in line with SMEDA’s organizational requirements.

5.3 Timing of M&E Activities The timing of M&E activities is covered in detail in Section 9.

5.4 M&E Approach and Methods Section 6 presents an elaborate discussion on this aspect.

5.5 Resources for M&E Currently, SMEDA is facing a severe resource crunch and there is a need to identify resources to implement this M&E framework. The existing PPMEIU already works through a development budget, provided through a dedicated PC-I. It is suggested that this PC-I be modified in light of the requirements given in this report and submitted to seek additional funding.

Additionally, SMEDA should ensure that all future PC-Is, especially the ones offering soft services, should dedicate at least 3-7% of their total cost for M&E activities. This fund can then be allocated to PPMEIU for further use.

6. M&E APPROACH, METHODS AND TOOLS Monitoring and evaluation are complementary yet distinct aspects of assessing the results of a development intervention. The function of monitoring is largely descriptive, and its role is to provide the data and evidence that underpin any evaluative judgments. As noted earlier, monitoring is ongoing, providing information on where a policy, program or project is at any given time (and over time) relative to its respective targets and outcomes. The function and role of evaluation is to build upon monitoring data, bring together additional information and examine whether or not the project results have been achieved.

Monitoring - Monitoring includes periodically collecting information, through the use of appropriate tools, at specific points in the process. This information is then used to assess various parameters against the initial plan or set standards. There are a number of tools or instruments that can be used in M&E, and in most projects more than one tool can be used. Some of these tools and approaches are complementary; some are substitutes. Some have broad applicability, while others are quite narrow in their uses. The choice of which is appropriate for any given context will depend on a range of considerations, including the uses for which M&E is intended, the main stakeholders who have an interest in the M&E findings, the speed with which the information is needed, and the cost. Different tools/instruments have strengths and weaknesses as methods of collecting different types of data, in their use with different types of stakeholders, and in their application to different types of indicators and target groups.

Following is a list of various data collection tools followed by a matrix on their various attributes:

DATA COLLECTION TOOL DESCRIPTION EXAMPLES

Monitoring forms - A form to record observations of an M&E officer/manager against certain pre-selected parameters. Qualitative information can also be collected and converted into categorical information (e.g. a scale of 1 to 5). Such forms can also be sent to project staff for reporting purposes.

Template developed by PPMEIU staff to review project progress

Sample Surveys - Collect a range of data through questionnaires with a fixed format that are delivered by post, electronically, over the telephone, or through face-to-face interviews.

Can be used with a range of subjects such as households (socio-economic survey); a sector (business management survey); or an activity (enterprise survey).

A sample of SMEs is surveyed for data on regulatory issues faced or their use of a specific CFC.

Quantitative data is produced on average time and cost, and perceptions.

The enterprise survey is a core example.

Group Interviews / Focus Groups - Collect largely qualitative data through structured discussions amongst small groups of pre-selected participants.

Usually these groups comprise no more than 12 people, and the sessions last up to 3 hours.

These discussions are managed by an appointed facilitator who is not a participant.

A sample of SMEs participates in a focus group and provides qualitative feedback on regulatory bottlenecks.

Individual Interviews - Collect a range of data through face-to-face discussions with individual stakeholders, often called 'informants'.

These can be "open" interviews or "structured" interviews, with questionnaires as part of a sample survey. They can vary in length and be held over a number of sessions.

Often, stakeholders who are viewed as being critical to the success of a project or program will be selected for interview; these are often called 'key informant' interviews.

A sales tax official provides information about the sales tax regime for SMEs.

An industry association representative provides feedback on the usefulness and relevance of a common facility center.

Case Studies - Collection of data, usually through face-to-face interviews with a particular individual, business, group, location or community, on more than one occasion and over a period of time.

The questioning involves open-ended and closed questions and the preparation of 'histories'.

A sample of SMEs as well as officials of a CFC provide information about the success of a business center or a common facility center run by SMEDA.

Participant Observation - Data is collected through observation, where the M&E person takes part in an event or attends a place or situation and assesses what is happening through what they see.

May involve some questioning for clarification. Observations may take place over a period of time through a number of visits.

PPMEIU staff or a third party reviews records from a provincial SMEDA office for business plan service requests and the completion time taken by staff.

Tracer Studies - A range of data collection methods is used to collect different types of data on an individual, group or community to determine the effects of an intervention over a longer period.

A sample of SMEs is tracked over time using a combination of methods cited above.

Criteria | Surveys | Rapid appraisal | Participant observation | Case studies | Focus groups
Coverage - scale of applicability | High | Medium | Low | Low | Low
Representative | High | Medium | Low | Low | Low
Ease of quantification | High | Medium | Medium/Low | Low | Low
Ability to isolate/measure non-project causes of change | High | Low | Low | Low | Low
Speed of delivery | Low | High | Medium | High | High
Expense of design and delivery | High | Medium | Medium | Low | Medium
Ability to cope with the attribution problem | High | Medium | Low | Low | Medium
Ability to capture qualitative info | Medium | High | High | High | High
Ability to capture causal processes | Low | High | High | Medium | Medium
Ability to understand complex processes (e.g. institution building) | Minimal | Medium | High | Medium | Medium
Ability to capture diversity of perceptions | Medium | High | Medium | Low | Medium
Ability to elicit views of diverse/disadvantaged groups | Medium | Medium | High if targeted | High if targeted | Medium
Ability to capture unexpected impacts | Low | High | High | High | High
Degree of participation encouraged by method | Medium | High | Medium | Medium | High
Potential to contribute to stakeholder capacity building | Medium | High | Low | Medium to Low | High
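The comparison matrix above lends itself to a simple selection aid: encode the ratings and rank methods against the criteria that matter for a given M&E exercise. The ratings below transcribe a few rows of the matrix for four of the methods; the numeric scale and the choice of criteria to weigh are illustrative assumptions, not part of the source handbook.

```python
# Rank data collection methods by summing ratings on selected criteria.
# Ratings transcribed from a few rows of the comparison matrix above;
# the numeric scale (Low=1, Medium=2, High=3) is an assumption.

SCALE = {"Low": 1, "Medium": 2, "High": 3}

ratings = {  # criterion -> {method: rating}
    "coverage":    {"Surveys": "High", "Rapid appraisal": "Medium",
                    "Case studies": "Low", "Focus groups": "Low"},
    "speed":       {"Surveys": "Low", "Rapid appraisal": "High",
                    "Case studies": "High", "Focus groups": "High"},
    "qualitative": {"Surveys": "Medium", "Rapid appraisal": "High",
                    "Case studies": "High", "Focus groups": "High"},
}

def rank_methods(criteria):
    """Total the ratings of each method over the chosen criteria,
    highest-scoring method first."""
    totals = {}
    for c in criteria:
        for method, rating in ratings[c].items():
            totals[method] = totals.get(method, 0) + SCALE[rating]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# If speed of delivery and qualitative depth matter most, the
# qualitative methods outrank surveys; if coverage matters, they do not.
ranked = rank_methods(["speed", "qualitative"])
```

This is only a decision aid: as the section notes, in most projects more than one tool is used, and an unweighted sum is the crudest possible weighting scheme.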

Evaluation - Evaluation usually involves using a number of different data collection tools to obtain a range of quantitative and qualitative information about the outcomes and impact of a project or program. For example, surveys may be complemented by focus group discussions, a small number of detailed case studies, and in-depth interviews with key informants. This performs a checking role, triangulating the information collected by combining multiple data sources and methods. In this way it helps to overcome the bias that comes from using only one source and method of data collection.

Evaluation Criteria8

Criteria Definitions Core questions

Relevance - The extent to which the activity is responsive to the priorities and policies of the target group, entrepreneur or SME

Does the intervention address needs?

Is it consistent with the policies and priorities of major stakeholders?

Is it compatible with other efforts?

Does it complement, duplicate or compete?

Effectiveness - The extent to which a development effort attains its objectives and the degree to which desired outcomes are achieved through the services provided

Are the desired objectives being achieved at outcome and impact/goal level?

Does it add value to what others are doing?

Efficiency - The operational and administrative efficiency of projects and services provided.

Are we using the available resources wisely and well?

What is the efficiency of communication mechanisms, knowledge management and coordination with other organizations, donors, etc.?

How can we measure outputs – both qualitative and quantitative – in relation to inputs?

Sustainability - Measuring whether the benefits of an activity are likely to continue after SMEDA funding has been withdrawn.

Will the outcomes and impacts be sustained after external support has ended?

Will activities, outputs, structures and processes established be sustained?

8 IFC's The Monitoring and Evaluation Handbook for Business Environment Reform.

What are the key questions for evaluation?

"Evaluation is the systematic and objective assessment of an ongoing or completed project, program or policy, its design, implementation and results. The aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability. An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and development partners."

Source: Development Assistance Committee (DAC) of the OECD

Impact - The positive and negative changes produced by a development intervention, directly or indirectly, intended or unintended.

What changes, positive or negative, have occurred?

Are these changes attributable to the initiative?

6.1 Types of Evaluation Review (or project) evaluation and impact evaluation are the two main types of evaluation. The following matrix summarizes the main components of each.

DESCRIPTION

Review Evaluation: Focuses on outcomes in terms of effectiveness, efficiency and relevance. Examines whether the activities have delivered the planned outputs and whether these outputs have in turn led to outcomes that are contributing to the purpose of the project. Is typically carried out towards or at the end of projects, or after their completion. Reviews are usually carried out by those 'outside' the project in an effort to enhance objective accountability, but may also involve insiders in order to enhance lesson learning.

Impact Evaluation: Focuses on relevance, effectiveness, efficiency and sustainability in relation to project goals. Impact evaluations can also be carried out to assess and synthesize the outcomes of several initiatives together on a thematic, sector or program basis to examine their overall impact.

CRITERIA

Review Evaluation: Program/Project Outcomes. Impact Evaluation: Program/Project Goals or Impact.

MEASURING

Review Evaluation: Have the SMEs been given due support and facilitation, and have they been positively impacted? Impact Evaluation: Have the improvements in SMEs led to an increased contribution to the economy, leading to growth and poverty alleviation?

6.2 M&E Stages The monitoring and evaluation work needs to be carried out at multiple stages, depending upon project requirements. These stages include the following:

Baseline or Initial Mapping - If a project involves undertaking a baseline or mapping exercise then the findings from this work need to be analyzed and reported quickly because they form an integral base from which the project proceeds and will often determine what tasks will be progressed and which will not.

Pilot Phase - A project may involve a pilot phase, where something is tested with a group or in a particular locality before the project is 'rolled out' further. Again, it is important that the analysis of M&E data from this pilot is undertaken thoroughly and quickly, as the findings from it are needed to inform the progression of the project.

Mid-term or Periodic Evaluation/Review - Key findings from periodic evaluation work, usually from the mid-term of the project onwards, need to be analyzed and reported in a timely manner, as they illustrate whether the outputs of the project are being achieved and whether process issues are progressing. The findings from these mid-term evaluations inform the ongoing validity of the M&E plan for assessing the outcomes and impact of the project. If initial findings show that the project is not achieving, or is achieving in an unexpected way, then the M&E plan may need to be reviewed and updated for the end-of-project evaluation activities.

End of Project Evaluation/Review - This is usually the most substantive analysis, as it brings all of the above together as well as undertaking end-of-project data collection, analysis and reporting. This is the key period of activity for M&E work if findings are to be processed and reported in a timely manner after the end of the project. Resources therefore need to be in place and tasks managed well during this period.

Impact Assessment or Post-project Evaluation/Review – This comes a certain period of time after project completion, to assess the impact of the project.

7. INDICATORS To measure something, it is important to have a unit or variable 'in which' or 'by which' a measurement is made, i.e. an indicator. The fundamental challenge for managers and officials at SMEDA (and for M&E officials in general) is to develop appropriate performance indicators which measure project performance. These indicators measure the things that projects do, what they produce, the changes they bring about and what happens as a result of these changes. In order to choose indicators, decisions must be made about what to measure. Having the right indicators underpins effective project implementation and good M&E practice; therefore time, effort, debate and thought should be given to their identification, selection and use.

Firstly, there is a need to distinguish indicators for different levels of assessment, that is, monitoring, evaluation and impact indicators. The former (monitoring indicators) concern tracking the progress of project implementation and primarily relate to inputs and activities. The latter two (evaluation and impact indicators) relate to measuring the results of the project: the outputs, the outcomes and, ultimately, the impact. Each aspect of implementing a project or program has typical types of indicators illustrating performance at each project level, as the following table shows.

LEVEL OF INDICATORS | GENERAL EXAMPLES | EXAMPLES FOR SME SECTOR

Input/Activities | Human resources; Financial resources; Material resources; Training | Training for SMEDA staff; PSDP and regular budgets; donor funding; software for SMEs such as SMAP

Outputs | Recommendations/Plans; Products; Studies/Reports; Legislation drafted | Training sessions; establishment of a CFC; provision of business advisory services; development of business plans; etc.

Outcomes | Change in knowledge and/or behavior; Improved practices; Increased services; Legislation passed | Better performance of SMEs; initiation of businesses as a result of business plan formation; cost savings and efficiency gains as a result of adopting SMAP

Impact | Increased sales; Increased profitability; Increased employment | Greater number of new SMEs; growth in SMEs; increased exports/imports by SMEs

Indicators for M&E should be SMART:

Specific - Reflect what the project intends to change and are able to assess performance

Measurable - Must be precisely defined; measurement and interpretation is unambiguous

Attainable - Achievable by the project and sensitive to change

Relevant - Relevant to the project in question

Time bound - Describes when a certain change is expected
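The SMART checklist above can be operationalised as a simple review aid when vetting proposed indicators: the reviewer records a yes/no judgement for each criterion and the gaps are flagged. The indicator text and judgements below are hypothetical; the code captures judgements, it does not replace them.

```python
# Illustrative SMART review aid: record a reviewer's yes/no judgement
# for each criterion against a proposed indicator and flag the gaps.

SMART = ["specific", "measurable", "attainable", "relevant", "time_bound"]

def review_indicator(name, judgements):
    """judgements: dict of criterion -> bool, filled in by the reviewer.
    Returns (passes, list of failed criteria)."""
    failed = [c for c in SMART if not judgements.get(c, False)]
    return (not failed, failed)

# Hypothetical indicator proposed for a CFC project
ok, gaps = review_indicator(
    "Number of SMEs accessing the CFC by December 2013",
    {"specific": True, "measurable": True, "attainable": True,
     "relevant": True, "time_bound": True},
)

# An indicator with no deadline fails the time-bound check
ok2, gaps2 = review_indicator(
    "SMEs perform better",
    {"specific": False, "measurable": False, "attainable": True,
     "relevant": True},
)
```

A criterion left out of the judgements dict is treated as not met, so an incompletely reviewed indicator cannot pass by default.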

7.1 SMEDA Related Indicators – As discussed earlier, there are a number of indicators which SMEDA can track on a regular basis; some of these SME-related indicators are listed below. However, it must be noted that there may not be secondary sources available for these indicators, and SMEDA may need to commission surveys to collect this data.

Number of Small and Medium Enterprises in Pakistan

Number of Medium Enterprises

Number of Medium Manufacturing Enterprises

Number of Medium Services Enterprises

Number of Medium Women Enterprises

Number of Small Enterprises

Number of Small Manufacturing Enterprises

Number of Small Services Enterprises

Number of Small Women Entrepreneurs

Number of Rural Enterprises

Total Employment in Medium Enterprises

Total Employment in Small Enterprises

Per Unit Employment Registered

Total Entrepreneurial Activity Index9

Necessity Entrepreneurial Activity Index

9 Total Entrepreneurial Activity Index: Measures the number of people currently setting up a business, or owning/managing a business existing up to 3.5 years, relative to the adult population aged 18-64 years.

Necessity Entrepreneurial Activity Index: Measures the number of people involved in entrepreneurial activity out of necessity, relative to the adult population aged 18-64 years.

Opportunity Entrepreneurial Activity Index: Measures the number of people involved in entrepreneurial activity out of opportunity, relative to the adult population aged 18-64 years.

Male Total Entrepreneurial Activity Index: Measures the number of men involved in entrepreneurial activity, relative to the male adult population aged 18-64 years.

Female Total Entrepreneurial Activity Index: Measures the number of women involved in entrepreneurial activity, relative to the female adult population aged 18-64 years.

Opportunity Entrepreneurial Activity Index

Male Total Entrepreneurial Activity Index

Female Total Entrepreneurial Activity Index

Some other proposed indicators for various levels of assessment for SMEDA projects include the following:

Output Indicators

Number of SMEs receiving advisory services

Number of improved regulations for SMEs

Number of participants in workshops, training events, seminars, conferences

Number of participants reporting satisfied or very satisfied with workshops, training, seminars, conferences, etc.

Number of business plans and feasibilities developed

Number of women participants in workshops, training events, seminars, conferences, etc.

Number of CFCs established

Number of SMEs accessing CFCs for provision of services

Revenues of self-sustaining facilities, such as Gujranwala Business Center, etc.

Outcome Indicators

Credit disbursed to SMEs

Percentage of cost savings or increase in revenue in SMEs receiving advisory services

Number of businesses starting as a result of business plan development

Number of downloads of feasibility studies

Impact Indicators

Number of new jobs created by SMEs

New investment in SMEs

Contribution of SME sector in economy
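Several of the proposed indicators above can be computed directly from survey records once data collection is in place. As an illustrative sketch with entirely hypothetical records, the outcome indicator "percentage of cost savings in SMEs receiving advisory services" might be derived as follows:

```python
# Hypothetical follow-up survey records for SMEs that received
# advisory services: operating costs before and after the intervention.

records = [
    {"sme": "A", "cost_before": 100_000, "cost_after": 90_000},
    {"sme": "B", "cost_before": 250_000, "cost_after": 225_000},
    {"sme": "C", "cost_before": 80_000,  "cost_after": 80_000},
]

def mean_cost_saving_pct(rows):
    """Average percentage cost saving across surveyed SMEs.
    Rows with a zero baseline cost are skipped to avoid division by zero."""
    savings = [(r["cost_before"] - r["cost_after"]) / r["cost_before"]
               for r in rows if r["cost_before"]]
    return 100 * sum(savings) / len(savings)

saving = mean_cost_saving_pct(records)  # A saves 10%, B 10%, C 0%
```

Note the attribution caveat from the evaluation criteria table: a computed saving shows change in assisted SMEs, not that the advisory services caused it; that question belongs to impact evaluation.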

8. IMPLEMENTING THE M&E REGIME FOR A PROJECT A number of projects and initiatives are already underway across SMEDA's four divisions, and the organization is also planning a number of new projects. There is a need to implement a robust M&E regime for each of these ongoing and new initiatives. While the previous sections discuss in detail how to carry out monitoring and evaluation, this section presents a step-by-step approach for the relevant Project Managers or General Managers (of the four SMEDA divisions) on how to implement the M&E regime for a particular project or activity.

In order to start monitoring and evaluation for any ongoing or new project or activity, there are seven steps involved:

STEP 1 - Identify the key questions to be asked and answered by the M&E

STEP 2 - Agree the overall M&E approach and methodology

STEP 3 - Choose the appropriate indicators

STEP 4 - Select tools and instruments for data collection and analysis

STEP 5 - Plan clear time frames with milestones

STEP 6 - Identify people and other resources for undertaking the M&E

STEP 7 - Implement the M&E plan

All these steps are summarized here:

STEP 1 – KEY QUESTIONS The first step in developing any M&E regime is to identify the key questions that need to be answered through M&E. For instance, in the case of establishing a CFC, the manager may be interested in knowing the number of SMEs accessing the center, the volume of business of the CFC, and the impact on client SMEs in terms of enhanced revenues or profits or reduced costs. The answers to these questions would then help SMEDA officials in reviewing the progress of their projects and would feed into better planning. While identifying these questions, the relevant Project Manager/Director or General Manager may want to ask the following:

Does this project have a results framework? If it does, then what are the key outputs and the desired outcome and impact for the project? If it does not, then what should the desired results be?

Has SMEDA or any other organization undertaken a similar project earlier? If yes, then what has been the learning?

How would the M&E results feed into the project cycle? Is the design flexible enough to incorporate the learning?

STEP 2 – M&E APPROACH AND METHODOLOGY This step includes developing the overall design of the M&E regime, such as the target population, sampling, control points, etc. Some of the questions that need to be answered at this stage include the following:

What is going to be the best approach for M&E? What is the counterfactual?

What has been learned from previous M&E designs?

What is going to be the baseline? Would secondary data suffice, or is primary data needed?

How will the sample be selected?

What will be the periodicity of data collection?

STEP 3 – SELECTING INDICATORS Selecting the right indicators is of paramount importance. Once the key questions have been identified, these need to be translated into indicators and then targets (in the results framework). Indicators are then measured to demonstrate that the project is or is not doing what it set out to do. The key questions that need to be asked at this stage include the following:

Does SMEDA use any core indicators? (Currently it does not. Please see the section on indicators)

What should be the right mix of quantitative, core and customized, activity and process indicators?

How can indicators be appropriately disaggregated for various dimensions, such as for gender, geography, etc.?
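As a rough illustration of the disaggregation question above, a customized indicator such as "number of SMEs trained" can be split by gender or geography with a few lines of code. The records and field names below are hypothetical, not SMEDA data:

```python
from collections import Counter

# Hypothetical monitoring records for one indicator;
# field names are illustrative assumptions only.
records = [
    {"sme_id": 1, "gender": "F", "province": "Punjab"},
    {"sme_id": 2, "gender": "M", "province": "Sindh"},
    {"sme_id": 3, "gender": "F", "province": "Punjab"},
    {"sme_id": 4, "gender": "M", "province": "Punjab"},
]

# Overall indicator value: total SMEs trained.
total_trained = len(records)

# The same indicator disaggregated by gender and by geography.
by_gender = Counter(r["gender"] for r in records)
by_province = Counter(r["province"] for r in records)

print(total_trained)    # 4
print(dict(by_gender))  # {'F': 2, 'M': 2}
```

The point of the sketch is that a well-designed record, with one row per SME and explicit dimension fields, makes any disaggregation a simple grouping operation rather than a separate data collection exercise.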

STEP 4 – DATA COLLECTION
Although this is the simplest step, it is also quite critical, because the information that will feed the M&E results is collected at this stage. Data collection needs to be undertaken at different times: prior to and during project implementation, at fixed points, and at and after the end of the project. Some of the key considerations here include:

In case of secondary data, are the sources reliable?

In case of primary data, who will be collecting the data? How can data integrity be ensured?

How often should the various data sets be collected?

Who will be responsible for subsequent steps such as data entry, consolidation, cleaning and analysis?
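One simple way to support data integrity before consolidation and analysis is an automated completeness check on incoming records. The sketch below is illustrative only; the required fields are assumptions, not a prescribed SMEDA format:

```python
# Minimal completeness check for collected records before data
# entry is consolidated; the required fields are illustrative.
REQUIRED_FIELDS = ["sme_id", "date", "indicator", "value"]

def find_incomplete(records):
    """Return (index, missing_fields) for every record that has
    missing or blank required fields."""
    problems = []
    for i, rec in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS
                   if rec.get(f) in (None, "")]
        if missing:
            problems.append((i, missing))
    return problems

records = [
    {"sme_id": 7, "date": "2012-12-01", "indicator": "revenue", "value": 150000},
    {"sme_id": 8, "date": "", "indicator": "revenue"},  # blank date, no value
]
print(find_incomplete(records))  # [(1, ['date', 'value'])]
```

Running such a check at the point of data entry lets the responsible person return faulty forms to the field while corrections are still possible, rather than discovering gaps at analysis time.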

STEP 5 – TIMEFRAME
It is essential to plan clear time frames with milestones. The key considerations at this stage include:

What are the key M&E milestones and when should they be reached?

How can the activities leading to these milestones be designed?

What are the implications of delay?

STEP 6 – RESOURCES
Considering the prevalent resource crunch at SMEDA, resource identification is critical. It is important to have a realistic M&E regime in place, so that resource availability does not become a problem. Some of the important questions at this stage include:

Who is going to bear the costs or provide resources for these activities? Is PPMEIU going to be responsible for any of the activities, or will the project team take the lead?

Is there any provision for M&E costs in the PC-I (in case of a PSDP project)?

STEP 7 – IMPLEMENTATION
Once the M&E plan is approved, the next stage is to undertake the M&E activities. This includes assigning M&E tasks and responsibilities; preparing terms of reference for any external resources needed; initiating baseline work; data collection; etc.
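Under the assumption that a project team keeps its plan in a structured, machine-readable form (the manual does not prescribe one), the seven steps above could be captured in a simple record like the following; all names are illustrative:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a structured M&E plan record that
# mirrors the seven steps described above.
@dataclass
class MEPlan:
    key_questions: list = field(default_factory=list)     # Step 1
    methodology: str = ""                                 # Step 2
    indicators: list = field(default_factory=list)        # Step 3
    data_tools: list = field(default_factory=list)        # Step 4
    milestones: dict = field(default_factory=dict)        # Step 5
    responsibilities: dict = field(default_factory=dict)  # Step 6

    def is_ready_to_implement(self):
        # Step 7 can start once steps 1-6 all have content.
        return all([self.key_questions, self.methodology,
                    self.indicators, self.data_tools,
                    self.milestones, self.responsibilities])

plan = MEPlan(
    key_questions=["How many SMEs access the CFC?"],
    methodology="Quarterly monitoring with a baseline survey",
    indicators=["SMEs served per quarter"],
    data_tools=["CFC visitor log"],
    milestones={"baseline": "Q1"},
    responsibilities={"data collection": "Project Team"},
)
print(plan.is_ready_to_implement())  # True
```

The readiness check simply encodes the rule implied by the step sequence: implementation should not start until each of the earlier planning steps has produced something concrete.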

9. M&E TIMING AND RESPONSIBILITY
Timings for various M&E activities may vary from project to project, but here are some broad guidelines:

M&E Component | Activity | Responsibility | Timeframe

Monitoring
- Results Framework for Program and Thematic Areas | Relevant SMEDA GM / Project Manager | Start of project (for new projects)
- Monitoring Plan Templates for SMEDA | PPMEIU | As early as possible
- M&E Framework for a Project | Relevant Division | At project design stage
- Monitoring Plan Development | Project Team | At project kick-off stage
- Third Party Monitoring for Special Projects/Interventions | GM Central Services / Project Team | As and when required
- Periodic Monitoring Reports | Project Team / M&E Focal Person (if any) | Preferably on quarterly basis
- Annual Monitoring Report (for all projects) | Project Team / M&E Focal Person (if any) | At year end
- Project Completion Report | Project Team and PPMEIU | End of project

Evaluations
- Baseline Evaluation | Project Team, with assistance from PPMEIU | Prior to project
- Mid-Term Evaluation (preferably through third party) | GM Central Services / Project Team / PPMEIU | At mid-point of project life
- End-Term Evaluation (preferably through third party) | GM Central Services / Project Team / PPMEIU | End of project

Reviews / Assessments
- Review of Programs/Projects/Systems | PPMEIU | As and when required
- Impact Assessment | PPMEIU | At least a year or more after project completion

Studies
- Studies to assess needs or conditions of a specific project and/or define baseline indicators or milestones | PPMEIU | As and when required

10. INSTITUTIONAL ARRANGEMENT
Developing a robust results framework, as part of effective project planning, is a critical pre-requisite to adopting an M&E framework. Once the results framework is in place, developing baselines and monitoring the progress of major activities and results form the foundation for the M&E framework. While each SMEDA Division and the respective project teams should be responsible for monitoring their own projects, the PPMEIU should work with divisions and project teams to improve the quality, timeliness and analysis of monitoring data, and also communicate the analysis to internal and external stakeholders.

PPMEIU – The overall M&E framework at SMEDA should be driven by the PPMEIU; however, all divisions, project teams and even donor agencies (for their funded projects) should monitor the projects within their own domains for timely delivery of results. The PPMEIU would report to the senior management of SMEDA after aggregating results from the various projects. For strengthening the PPMEIU, its requirements should be assessed and consolidated into a PC-I for seeking funding from the government. The PPMEIU will then be staffed and resourced as per these requirements.

Capacity Development – The PPMEIU is currently short of staff and resources, and its capacity must be built in order for it to take on an enhanced role. Moreover, once strengthened, the PPMEIU should work closely with the four SMEDA divisions and project teams to develop their monitoring capacities, train their resources and help them develop tools for tracking various indicators.

Monitoring MIS – SMEDA should also develop a dedicated MIS for its M&E framework to keep track of various indicators. The MIS would be accessible to project teams, where they can directly feed their information/data.

High Level Monitoring Committee – For important projects, and even for monitoring broad organizational indicators, SMEDA may form a Monitoring Committee, which should supervise the PPMEIU. This would also establish the independent character of the unit. The proposed committee will:

meet periodically with various project teams to assess progress towards planned results;

conduct occasional field monitoring missions to gauge achievements and constraints;

identify any lessons or good practices;

reflect on how well project results are improving the situation on the ground;

identify capacity development needs among various divisions and project teams related to data collection, analysis, monitoring and reporting; and

advise PPMEIU to provide capacity development support related to monitoring to implementing departments, as per identified needs.

Donor Assistance – The PPMEIU should also develop a technical assistance plan to seek assistance from donors. SMEDA can then coordinate with ASP, FIRMS and other donor-funded projects to seek this support.

Following is the overall schematic of the institutional arrangements:

11. USE OF TECHNOLOGY – M&E MIS

The proposed monitoring and evaluation framework for SMEDA will greatly benefit from a strong technological backbone for record keeping, data analysis and review of information. This could be provided through a customized MIS for SMEDA, which can be accessible to various project teams and divisions within SMEDA.

The proposed software user requirement specifications are attached at Appendix A.
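As a rough illustration of what such an MIS might record, the sketch below sets up a minimal indicator-tracking table. The schema is an assumption for illustration, not SMEDA's specification (which is defined by the user requirements in Appendix A):

```python
import sqlite3

# Illustrative sketch of a minimal M&E MIS backing store;
# table and column names are assumptions, not SMEDA's spec.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE indicator_values (
        project     TEXT NOT NULL,
        indicator   TEXT NOT NULL,
        period      TEXT NOT NULL,   -- e.g. '2012-Q4'
        value       REAL NOT NULL,
        entered_by  TEXT NOT NULL    -- project team entering data
    )
""")

# A project team feeds its data directly into the MIS.
conn.execute(
    "INSERT INTO indicator_values VALUES (?, ?, ?, ?, ?)",
    ("CFC Project", "SMEs served", "2012-Q4", 42, "Project Team"),
)

# A senior-management view: aggregate an indicator across periods.
total = conn.execute(
    "SELECT SUM(value) FROM indicator_values WHERE indicator = ?",
    ("SMEs served",),
).fetchone()[0]
print(total)  # 42.0
```

Even this tiny schema shows the two roles the text assigns to the MIS: standardized data entry by project teams, and aggregated visibility of results for PPMEIU and senior management.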

[Schematic: Institutional Arrangements – M&E Framework for SMEDA. It shows the High Level Monitoring Committee supervising the PPMEIU (strengthened through PSDP funding); SMEDA senior management; the overall M&E framework, M&E MIS, training and templates, and tracking of organizational indicators; participating agencies/departments with M&E plans for various projects/interventions, monitoring individual projects serving SMEs; donors providing technical support/financing through donor coordination; and private sector service providers/third parties conducting reviews/assessments.]

12. ACTION PLAN FOR M&E FRAMEWORK ROLL-OUT
In order to adopt the proposed M&E framework at SMEDA, the following steps have been identified. They need not be taken in the order presented.

Approval of M&E Framework – This framework has been developed for SMEDA with assistance from USAID, based on the requirements identified by SMEDA officials; however, it needs to be formally approved by SMEDA before its implementation. SMEDA may consider it an organic document and start implementation; changes can be made later, as and when needed.

Strengthening of PPMEIU – This is a critical requirement for the implementation of this framework. The PPMEIU seems like an ideal candidate to house the M&E capacity of SMEDA; however, the unit is currently severely understaffed and lacks adequate resources. There is a need to supplement the PPMEIU's capacity and provide more resources – both human and financial – in order to enable the unit to take on a more active role. For this purpose, it is proposed to prepare a new PC-I to seek PSDP support or donor assistance.

Establishment of High Level Monitoring Committee – As discussed earlier, to embed M&E in the organization's culture and to give the PPMEIU its due standing, there is a need to establish a high level monitoring committee, which can supervise the PPMEIU, agree on organizational indicators, provide momentum to various M&E activities and give credibility to the PPMEIU's independent functioning.

Planning Forum – Results Framework for Existing Projects – As discussed, a clear results framework, both for the organization as a whole and for individual projects, is needed for effective monitoring and evaluation. While this practice can be introduced for new projects, there is also a need to develop, improve or validate results frameworks for existing activities and projects. It is therefore proposed that SMEDA conduct a one-time planning forum, attended by relevant project teams and even external stakeholders, to develop the results framework for SMEDA as well as for individual projects. These results frameworks would then form the basis for M&E activities.

Identification of Organizational Indicators – Once the Planning Forum is conducted and a high level monitoring committee is established, there is a need to identify high-level indicators that are expected to be impacted by SMEDA's performance. These indicators can then be tracked.

Establishment and Deployment of MIS – A robust MIS would be critical to implement the M&E framework, as it would not only standardize reporting formats but would also provide better access and visibility into project results. The proposed MIS would be accessed by all project teams and would be updated regularly.

Annual Planning and Quarterly Review – SMEDA currently engages in an annual planning exercise with subsequent review meetings. This practice should not only be continued but should also be streamlined and conducted every quarter. The targets for each year and quarter should be clearly linked with the agreed organizational targets.

Periodic and Annual Monitoring Reports – PPMEIU should develop consolidated periodic (quarterly and annual) reports, based on information collected from various projects and for organizational indicators. These reports can then be shared with a number of stakeholders to inform them about SMEDA’s achievements.

Capacity Development Activities – Once PPMEIU is strengthened itself, it should conduct various capacity development activities, such as trainings and workshops for project teams across SMEDA to build capacity for undertaking effective M&E activities.

APPENDIX A – REPORTING AND RECORD MANAGEMENT GUIDELINE FOR SMEDA TRAINING ACTIVITIES

1. Record Management
The respective SMEDA Divisions/Provincial Offices are bound to maintain records of the following documents on their own for a minimum period of five years after the completion of a training activity/workshop, and have an obligation to provide them for inspection on request from SMEDA HO:

1. Duly filled profile forms of all workshop participants, with allied documents such as copies of CNIC or business registration documents, etc.
2. All receipts/records of stipends paid to workshop participants or fees charged
3. Copy of the attendance sheet, duly filled for each workshop, in the prescribed format (the actual attendance sheet will be submitted to SMEDA HO/PPMEIU along with the Training Completion Report)
4. Copies of all reports submitted to SMEDA HO/PPMEIU

Instructions for Recording & Maintaining Attendance Records

1. The attendance record is a critical part of training records, and SMEDA HO/PPMEIU attaches high importance to the integrity and accuracy of this information. Therefore, all respective SMEDA Divisions/Provincial Offices should duly comply with this process.

2. The attendance of workshop participants and the trainer has to be recorded in the prescribed manner (Form A), in the form of a sheet. The sheet should be prepared before the submission of the inception report and should be placed at a conspicuous place in the classroom/training site.

3. The attendance sheet should be signed by each course participant every day, followed by the signatures of the trainer, the training in-charge (from the respective SMEDA office) and the monitoring officer (on the day of his visit).

4. The attendance has to be completed within the first half hour of the day's training session; whoever comes in after that should be marked absent.

5. To mark a course participant absent, the box should be crossed out with a diagonal, right after all present workshop participants have signed their presence.

The first step is that all workshop participants sign in front of their names in the appropriate day's column.

Then the instructor should sign under the column of course participant signatures after counting and verifying the number of workshop participants.

The sheet should have a box allocated for the signatures of a monitoring officer, be it from SMEDA HO/PPMEIU or a representative of the third party monitoring organization contracted by SMEDA HO/PPMEIU.

2. Reporting Requirements
The respective SMEDA Divisions/Provincial Offices are expected to submit the following reports. All reports should be submitted through e-mail, with a signed hard copy via courier.

REPORT | FREQUENCY & TIMING | CONTENTS
Training Inception Report | Submitted only once; at least 7 days before the start of the workshop | Indicative number of workshop participants; detailed workshop plan (Form B); instructor CVs (Form C); compliance sheet (Form D)
Course Participant Data | Submitted only once; after the workshop | Course Participant Data Sheet (Form E)
Monthly Training Progress Report | Submitted every month, within 7 days after the month end | Course participant attendance sheet for the month (Form A); progress on workshop plan (Form B); key issues: participant dropouts, change of trainer, or any other issue
Training Completion Report | Submitted once, after training completion | See Section 2.4

All the reports should be submitted with a title sheet, as per the prescribed format (Form F).

2.1 Training Inception Report
The Training Inception Report is to be submitted at least 7 days before the commencement of training. SMEDA HO/PPMEIU will review the status of all the measures and provide the respective SMEDA Divisions/Provincial Offices with feedback on the sufficiency of their preparedness for beginning the training. However, this is not to be seen as an obstacle in the path of commencing the training; the respective SMEDA Divisions/Provincial Offices should start the training 7 days after the submission of the inception report if no feedback has been received. All received feedback, however, has to be sufficiently addressed through necessary steps and reported back before commencing the training.

Please see Forms B, C and D for the inception report format.

I. Indicative Number of Workshop Participants: The respective SMEDA Division/Provincial Office will mention the indicative number of workshop participants in the inception report; however, this number may vary, and exact information is expected to be covered in the course participant data sheet submitted subsequently.

II. Detailed Workshop Structure: The respective SMEDA Division/Provincial Office needs to include a detailed workshop structure (Form B) with the inception report, covering the plan for both the lessons and on-job placement training.

III. Instructor's CV: The CV of the instructor should also be part of the inception report, in the prescribed format (Form C).

IV. Compliance Sheet: The respective SMEDA Divisions/Provincial Offices should duly fill in the attached compliance sheet (Form D) after carefully reading the minimum expected standards to be fulfilled, as per the contractual obligations. In case a particular standard is not met, the respective SMEDA Divisions/Provincial Offices should clearly mention it in the sheet, along with the reasons for non-compliance. Misstatement or misrepresentation of facts on the compliance sheet, however, would be a gross violation and may result in severe penalty, including the possibility of termination of contract.

2.2 Course Participant Data Sheet
Respective SMEDA Divisions/Provincial Offices must submit a printed, signed copy of the course participant data sheet, with information on each enrolled course participant, using the Course Participant Data Sheet (Form E). Although the Course Participant Data Form asks for limited information about the workshop participants, the respective SMEDA Division/Provincial Office should maintain detailed forms of all workshop participants for a period of five years, as mentioned in the record management section earlier. SMEDA HO/PPMEIU reserves the right to request this information at any time during the training or afterwards. Furthermore, if a course participant drops out after the submission of the course participant profile, this must be reported in the relevant section of the Monthly Progress Report to SMEDA HO/PPMEIU.

The course participant data form has been designed to accommodate data for six workshop participants within a page. A serial number has to be generated to inform SMEDA HO/PPMEIU of all workshop participants inducted so far. The "Reference No." is a number generated by the respective SMEDA Division/Provincial Office for its own record keeping; it is to be included for SMEDA HO/PPMEIU's record and recall purposes.

The detailed address of the workshop participants, identifying Union Council and Tehsil, is to be included on the form.

2.3 Monthly Training Progress Report
The respective SMEDA Division/Provincial Office has to submit a Monthly Training Progress Report providing information about the progress of the training plan. This report comprises the following components:

I. Attendance Record for the Month: This component should include the monthly attendance record (Form A), with a summary sheet clearly mentioning: (i) the total number of workshop sessions conducted during the month; (ii) the total number of workshop participants; and (iii) the number of workshop participants who have completed the monthly training with at least 80% attendance.

II. Progress on Workshop Structure: Each monthly report should include a duly filled Form B, showing the progress on the workshop structure.

III. Key Issues: Each monthly report should include key issues faced during the training, such as course participant dropouts, change of trainers, any other highlights, etc.
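The 80% attendance threshold used in the monthly summary (and in the "Mark X (if below 80%)" column of Form A) can be computed mechanically. A minimal sketch, with hypothetical figures:

```python
# Sketch of the Form A summary logic: flag participants whose
# attendance falls below the 80% monthly threshold.
def attendance_summary(days_attended, sessions_held):
    """Return (percentage, mark_x), where mark_x mirrors the
    'Mark X (if below 80%)' column of Form A."""
    pct = 100.0 * days_attended / sessions_held
    return pct, pct < 80.0

sessions_in_month = 20  # hypothetical number of sessions held
for name, attended in [("Ahmad", 18), ("Ali", 14)]:
    pct, mark_x = attendance_summary(attended, sessions_in_month)
    print(name, pct, "X" if mark_x else "")
# Ahmad: 18/20 sessions = 90.0% -> no X
# Ali:   14/20 sessions = 70.0% -> marked X
```

Automating this small calculation removes one common source of disagreement between the monthly summary sheet and the underlying attendance records.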

2.4 Training Completion Report
After the completion of classroom training, the respective SMEDA Division/Provincial Office may arrange internal and/or external assessment of each course participant.

The original attendance sheet has to be submitted with the Training Completion Report, and a copy maintained for five years.

This report will be submitted at the end of the training cycle for each workshop.

FORM A – ATTENDANCE SHEET
(To be maintained in the form of a sheet, duly kept, with each sheet showing records for a month)

Roll # | Name | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | … | Last Day of Workshop | Total Days Attended | Mark X (if below 80%)
1 | Ahmad | Course participant's signature
2 | Ali | Course participant's signature
3 | |
4 | | Course participant's signature
5 | | Course participant's signature

Instructor's Name | Instructor's Signature
Centre In-charge's Name | Centre In-charge's Signature
Monitoring Officer | Monitoring Officer's Signature

FORM B – WORKSHOP STRUCTURE & PROGRESS
For time period of workshop: ____________

Week | Lessons Planned | Lessons Delivered | Signature (Instructor) | Date
1 | 1. 2. 3. | 1. ☐ 2. ☐ 3. ☐ | |
2 | … | … | |

FORM C – INSTRUCTOR'S CV

Name of Instructor: ____________ [Paste picture here]
Trade: ____________
CNIC No.: ____________
Contact Address: ____________
Contact Number: Mobile # ____________ Phone # ____________

Relevant Qualification
Qualification | Grade | Institute | Year

Work & Teaching Experience
Designation | Organisation | Duration (From / To)

FORM D – COMPLIANCE SHEET [10]

No. | Parameter | Compliance | Standard | Comment (please comment if you mark 'No' on compliance)

EQUIPMENT & CONSUMABLES
1 | Equipment for training (name of equipment and quantity: a) b) c)) | Yes/No | Equipment installed/available at training site and ready to use |
2 | Consumables: a) b) c) | Yes/No | Ready for use and distribution |
3 | Training manual | Yes/No | Ready and printed in sufficient numbers for all workshop participants |

SITE PREPARATION
4 | Attendance Sheet | Yes/No | Attendance sheet is ready according to instructions |
5 | Blackboard/Whiteboard | Yes/No | Installed; chalk/marker available |
6 | Chairs | Yes/No | Separate chairs/benches available for all workshop participants |
7 | Light | Yes/No | Room sufficiently lit by natural light; at least one source of artificial light |
8 | Ventilation | Yes/No | Ideally at least one air outlet other than the door; at least one fan in the room |
9 | Access | Yes/No | Access to the training centre is available through public transport |
10 | Lab | Yes/No | Equipment installed, work-stations ready (lighting and ventilation standards also apply) |
11 | Toilets | Yes/No | Clean, functional, with availability of water and soap |
12 | Drinking water | Yes/No | Basic filtered water with a cooler and glass |

[10] The respective SMEDA Divisions/Provincial Offices may contact the SMEDA HO/PPMEIU for guidance and elaboration.

FORM E – COURSE PARTICIPANT DATA SHEET
(The following block is repeated six times per page, once for each workshop participant)

Sr. No: ____ Reference No: ____ [Paste picture here]
Name: ____________________________
Gender: M/F Age: _____ CNIC: ______________
Father's Name: _________________
Detailed address: ______________________________
Highest Educational Qualification: _______
Monthly Income (Rs.): ___________
Name of Business: ______________

FORM F – TITLE SHEET

Title sheets for all reports must carry the following information:

WORKSHOP INFORMATION SHEET

Subject: __________________

Centre (location): _________

Start date: _______________

End date: _______________

Participants (No.): ___________

Male ☐ Female ☐ Mixed ☐

Relevant SMEDA Manager: _________


APPENDIX B – M&E TOOLKIT FOR SMEDA TRAINING ACTIVITIES

1. Monitoring Visit SOPs
A monitoring visit may be conducted by SMEDA's own authorized staff or by third party staff specifically hired for the purpose. For the purposes of this toolkit, the term 'monitoring officer' includes any such person having the authority to monitor a training course conducted by any SMEDA division or on its behalf. Any monitoring officer, while conducting a monitoring visit, will perform the following two functions:

1. Monitor the training by visiting the training facility/venue and the on-going workshop, and record observations (or have the trainees record them, in the case of feedback) in the prescribed forms. The monitoring toolkit includes the following three monitoring forms:

Form I: Monitoring Visit Form (enclosed)
Form II: Attendance Sheet (enclosed)
Form III: Trainee Feedback Form (enclosed)

In addition, the Monitoring Officer must take pictures of the workshop in session.

2. Once the observations are recorded, the Monitoring Officer will report his findings as per the enclosed Form IV, provided to SMEDA, based on at least one (or more) visits per workshop/course.

All these forms are self-explanatory, but following are some additional details regarding each form:

Form I: Monitoring Visit Form
The Monitoring Officer will carefully inspect the training facility/venue and the workshop/course, and will observe compliance against each quality parameter, as per the standard mentioned in the form. The Monitoring Officer will also ask the trainees and the instructor to fill in Form II (Attendance Sheet) in front of him.

In case the training provider (SMEDA division or relevant office) has not complied with a given quality parameter, the Monitoring Officer may record comments in this regard. The form includes some key parameters relating to trainee and instructor attendance and identity verification, as well as management of the study plan for the course. The second part of Form I deals with workshop/course-specific and training facility/venue-specific parameters.

While monitoring the attendance, the Monitoring Officer must comply with the following standard:

"If an hour has lapsed into the start of the training, the attendance should have been marked with trainees' signatures, absences crossed out and cross-signed by the instructor. If two hours have lapsed, the facility/venue in-charge should also have cross-signed the day's attendance. If all guidelines have been met, then the officer is to sign at the bottom of the day's column."

Form II: Attendance Sheet
The Monitoring Officer will get the Attendance Sheet (Form II) signed by all trainees and instructors.

Form III: Trainee Feedback Form
The Monitoring Officer will request the instructor and training facility/venue staff to leave the room and will explain the feedback form (Form III) to the trainees. He will also assure them of the anonymity of their feedback. The Monitoring Officer will then distribute copies of the Feedback Form to all trainees and moderate its completion: he will read out each question, explain it, and describe the two or three given options. He will also ask the trainees to record any additional observations they may have regarding any aspect of the training.

2. Reporting SOPs

Form IV: Monitoring Visit Report Form
Form IV provides a template for the Monitoring Officer to report back to SMEDA on the findings of the monitoring visit(s). The third party monitoring firm should ensure that at least one monitoring visit is conducted per workshop/course.

FORM I – MONITORING VISIT FORM

TRAINEE ATTENDANCE
A) Attendance sheet placed at a conspicuous place: Yes/No
   If No, please comment:
B) Training attendance duly filled for the day and for previous days: Yes/No
   If No, please comment:
C) Number of trainees counted in workshop/course: ____
   Number of trainees as per attendance sheet: ____
   If any discrepancy, please record comments:
D) Do signatures on Form II match signatures on the Attendance Sheet? Yes/No
   If No, please comment:

TRAINEE IDENTIFICATION
E) Are trainees carrying ID cards? Yes/No
   If No, please comment:

INSTRUCTOR IDENTIFICATION
F) Is the trainer carrying his/her CNIC? Yes/No
   If No, please comment:
G) Is the trainer's identity verified through CNIC or any other photo ID? Yes/No
   If No, please comment:

STUDY PLAN
H) Is the study plan maintained up to the monitoring visit? Yes/No
   If No, please comment:

No. | Parameter | Compliance (Yes/No) | Compliance Standard | Comment (if 'No' on compliance)

Workshop/course-Specific Parameters
1 | Equipment for training (name of equipment and quantity: a) b) c)) | Yes/No | Equipment installed/available at training site and ready to use |
2 | Consumables: a) b) c) | Yes/No | Ready for use and distribution |
3 | Training manual | Yes/No | Ready and printed in sufficient numbers for all trainees |
4 | Blackboard/Whiteboard | Yes/No | Installed; chalk/marker available |
5 | Chairs | Yes/No | Separate chairs/benches available for all trainees |
6 | Light | Yes/No | Room sufficiently lit by natural light; at least one source of artificial light |
7 | Ventilation | Yes/No | Ideally at least one air outlet other than the door; at least one fan in the room |
8 | Lab | Yes/No | Equipment installed, work-stations ready (lighting and ventilation standards also apply) |

Training Facility/venue-Specific Parameters
9 | Structural integrity of premises | Yes/No | Building without cracks or leakages |
10 | Access | Yes/No | The building is easily accessible to trainees through public transport |
11 | SMEDA banner/board display | Yes/No | SMEDA banner and board clearly display the funded workshop/course |
12 | Drinking water | Yes/No | Basic filtered water with a cooler and glass |
13 | Toilet | Yes/No | Clean, functional, with availability of water and soap |

FORM II – ATTENDANCE SHEET

Roll # | Name | Day 1 | Day 2+ | Total Workshop/Course Days Attended (till date) | Mark X (if below 80%)
1 | Ahmad | Trainee's signature
2 | Ali | Trainee's signature
3 | |
4 | | Trainee's signature
5 | | Trainee's signature

Instructor's Name | Instructor's Signature
Facility/venue In-charge's Name | Facility/venue In-charge's Signature
Monitoring Officer | Monitoring Officer's Signature


FORM III – TRAINEE FEEDBACK FORM
Give comments in front of each question if you have marked any specific problem:

1. Does the instructor regularly come to the workshop/course?

Yes ☐ Not always ☐ No ☐

2. Are you able to follow the coursework?

Yes ☐ Not always ☐ No ☐

3. Do you think what you’re learning here will be useful in improving the performance of your business?

Yes ☐ Not sure ☐ No ☐

4. Do you get to perform the practical on the equipment and tools to your satisfaction?

Yes ☐ Not always ☐ No ☐

5. Do you get consumables to perform practicals?

Yes ☐ Not always ☐ No ☐

6. Have you received the course manual?

Yes ☐ No ☐

7. What is the amount of fee that you have paid for the course?

8. What are the three lessons that you learnt in this course?

1. ________________

2. ________________

3. ________________


FORM IV – MONITORING VISIT REPORT FORM

MONITORING REPORT
Training Organizer (Division/Office):
Training Location:
Shift: Morning ☐ Afternoon ☐
Month:
Facility/venue Supervisor:
Mobile No.:

VISIT 1 2 3 4 5

Visit Date

Visit Time

No. Parameter COMPLIANCE (Yes/No for each visit)

A Attendance Sheet Placement

B Attendance Sheet Filling

C Trainee Count Accuracy

D Signatures Accuracy

E Trainees ID

F Instructor ID

G Instructor ID Verification

H Study Plan Management

Workshop/course-Specific Parameters

1 Equipment

2 Consumables

3 Training manual

4 Blackboard/Whiteboard

5 Chairs

6 Light

7 Ventilation

8 Lab

Training Facility/venue-Specific Parameters

9 Structural integrity of premises

10 Access


11 SMEDA banner/board display

12 Drinking water

13 Toilet

Trainee Feedback

RESPONSE (percentage of respondents): Yes/Received | Not sure/Not always/Received with problems | No

Number of Responses

1 Instructor attendance

2 Trainees following coursework

3 Training useful for improving business

4 Opportunity to perform practical

5 Availability of consumables

6 Provision of Course Manual

7 Payment of Fee

8 Highlights
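The response percentages in the table above can be derived from the collected Form III answers. A minimal sketch, assuming answers are recorded as plain strings:

```python
from collections import Counter

def summarize_feedback(responses):
    """Aggregate trainee answers to one Form III question into
    response categories with their percentage shares.
    `responses` is a list of strings such as "Yes", "Not always", "No"."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

# Example: 8 of 10 trainees answered "Yes" to instructor attendance
summary = summarize_feedback(["Yes"] * 8 + ["Not always"] * 1 + ["No"] * 1)
print(summary)  # {'Yes': 80.0, 'Not always': 10.0, 'No': 10.0}
```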

Details of Monitoring Officer

Name & Signature:

Designation:

Organization (SMEDA or third party):


APPENDIX C – M&E TOOLKIT FOR SMEDA ADVISORY SERVICES

Introduction:

SMEDA provides a number of advisory services to SMEs, such as technical advisory, legal and contract services, and business plan and feasibility development.

The following framework provides guidelines on how the advisory and business support services will be monitored to ensure compliance with quality standards and service parameters. There are separate requirements for each component of the advisory services, further categorized into reporting, monitoring, and evaluation requirements. The deliverables listed under the reporting requirements must be submitted in both hard and soft form to SMEDA HO/PPMEIU within the stipulated time.

The monitoring will be carried out by the PPMEIU as per the developed formats.

Reporting Requirements:

DELIVERABLE | DEADLINE | CONTENTS

Inception Report | Within 15 days of contract signing or after client/SME registration | A) A brief description of services, including areas covered; B) Client’s information (as per Annex I), collected through a registration form; C) Picture of client; D) Date of signing; E) Details of services provided; F) List of resource persons, if used by SMEDA; G) Copy of the contract

Monthly Progress Reports | One report for each month, combining information for various clients/SMEs | A) A brief description of progress on services provided; B) Expected date of completion of respective cases

Advisory Service Completion Report | 10 days after service completion | A) Highlights of the case; B) Explanations for any issues; C) Client feedback on service provision (Annex II)

Client Registration Form

Q. No

Questions

1 Date of Accessing Service – Initial Meeting

2 Name of Relevant SMEDA Office

3 Address of Relevant SMEDA Office

4 Name of Client

5 Gender

6 Address of Client

7 CNIC Number of Client

8 Date of Birth of Client

9 Contact No. of Client

10 Has the client received any services from SMEDA before?

1= Yes, 2= No

11 How did the client learn about the relevant service that has been requested? (open response)

12 Name and Description of SME/Business

13 Business Registration Number

14 NTN Number

15 Average Turnover

16 Number of Employees


Advisory Service Feedback Form

Rate questions 1 to 8 on the following scale:

1 = Very poor

2 = Poor

3 = Neutral

4 = Good

5 = Very good

1 Behavior of SMEDA officials?

2 Relevance of services provided?

3 Utility of services provided in solving your business problem?

4 Value for money?

5 Quality of services provided?

6 Availability of relevant SMEDA officials throughout the contract?

7 Availability of relevant technical resources throughout the contract?

8 Overall satisfaction with the services?

Q. Have you requested any SMEDA services in the last 12 months?

Yes No

Q. In your opinion, is SMEDA well positioned to provide advisory services?


Yes No

Q. If no, please suggest how advisory services can be improved

_____________________________________________________________________________________

_____________________________________________________________________________________

_____________________________________________________________________________________

_______________________________________________________.

Q. What additional services would you like us to offer?

_____________________________________________________________________________________

_____________________________________________________________________________________

_____________________________________________________________________________________

_______________________________________________________.

Q. Will you come to SMEDA again for any service?

Yes No

If no (please explain)

_____________________________________________________________________________________

_____________________________________________________________________________________

_____________________________________________________________________________________

_____________________________________________________________________________________

___________________.

Q. Will you discuss your experience with SMEDA with other business/SMEs/friends?

Yes No


APPENDIX D - SOFTWARE USER REQUIREMENT SPECIFICATIONS FOR SMEDA’s M&E MIS

1. BACKGROUND

The Small and Medium Enterprise Development Authority (SMEDA) is the leading government agency for developing and promoting Small and Medium Enterprises (SMEs) in Pakistan. SMEDA undertakes a number of SME development and promotion activities through its four provincial offices. It is also implementing a number of PSDP projects, and a few other projects are funded by donors. SMEDA is implementing a new M&E regime to monitor its activities and projects. The M&E regime would provide ongoing information about various projects to senior management and would help project teams improve the quality of their initiatives.

2. INTRODUCTION

In order to implement and embed the new M&E regime, SMEDA intends to develop a management information system (MIS) for monitoring various projects. The software is aimed at providing a robust, scalable, and economical design for monitoring SMEDA’s projects across Pakistan. Currently, the M&E regime is maintained through manual forms, apart from the PMES maintained and managed by the Planning Commission of Pakistan.

3. PURPOSE OF THIS DOCUMENT

This document is intended to develop and share a common understanding of the proposed software and to enable the technical team (to be hired or in-house) to better understand the software’s functional requirements. The document would also help in assessing the efficacy of the delivered software against the provided specifications. It covers only software requirement analysis; further details can be acquired from a detailed project scope document (to be prepared by SMEDA) and from user manuals/documentation (to be developed later by the technical team).

4. PROJECT SCOPE

The objectives of the project include capitalizing on e-governance in a simple yet powerful way to automate critical components of the M&E regime; ensuring access to relevant information; reducing workload; improving project performance; refining project design; better targeting of SMEs; and facilitating efficient record management. Based on these objectives, the proposed software would consist of a robust database hosting monitoring data, records, and related information, with the capability to produce customized reports and to provide relevant information to senior management for informed policy and management decisions.

5. SOFTWARE FUNCTIONALITY REQUIREMENTS

5.1 KEY FUNCTIONS


Overall functionality of the proposed software would include the following:

A. STORAGE OF INFORMATION

Storage of monitoring data collected through various monitoring tools

Storage of reporting data, as submitted by project teams

Storage of results framework and targets/indicators for various projects

Storage of information on ongoing and previous SMEDA projects as well as various initiatives

Option to populate a particular data field through cross-reference
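The storage functions above could rest on a relational schema along these lines. This is an illustrative sketch only; the actual tables and fields would come from SMEDA’s detailed project scope document:

```python
import sqlite3

# Illustrative schema: projects hold indicators, indicators hold
# monitoring records, so one record cross-references its project.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (
    project_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    status     TEXT                 -- e.g. 'ongoing' or 'completed'
);
CREATE TABLE indicator (
    indicator_id INTEGER PRIMARY KEY,
    project_id   INTEGER REFERENCES project(project_id),
    description  TEXT,
    target       REAL
);
CREATE TABLE monitoring_record (
    record_id    INTEGER PRIMARY KEY,
    indicator_id INTEGER REFERENCES indicator(indicator_id),
    visit_date   TEXT,
    value        REAL
);
""")
```

A field can then be populated through cross-reference by joining `monitoring_record` to `indicator` and `project`.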

B. UPDATING AND CONSOLIDATING INFORMATION

Updating the above categories of information when changes occur in any part of the database

Consolidating information from various sources, such as monitoring and reporting, and presenting it in a consolidated fashion, with the option to view disaggregated information

C. ANALYSIS

Identifying exceptions

Measuring progress against results

Highlighting areas lacking adequate progress
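The analysis functions above (measuring progress against results and highlighting lagging areas) can be sketched as follows. The indicator names and the 50% flagging threshold are illustrative assumptions, not SMEDA values:

```python
def progress_report(indicators, threshold=0.5):
    """Measure progress against targets and flag indicators lagging
    behind the threshold. `indicators` maps an indicator name to a
    tuple of (achieved, target)."""
    flagged = []
    for name, (achieved, target) in indicators.items():
        ratio = achieved / target if target else 0.0
        if ratio < threshold:
            flagged.append((name, round(ratio, 2)))
    return flagged

lagging = progress_report({
    "SMEs assisted": (120, 200),   # 60% of target: on track
    "CFCs operational": (1, 5),    # 20% of target: flagged as lagging
})
print(lagging)  # [('CFCs operational', 0.2)]
```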

D. INFORMATION RETRIEVAL

Creation and printing of progress reports

Creation and printing of monitoring reports

Aggregated information searchable through any of the stored parameters

Disaggregated information searchable through any of the stored parameters

Customized reports

5.2 SCALABILITY

Although the proposed MIS would initially serve only the M&E regime, it should have the provision to scale up to cover other organizational modules if SMEDA so desires in the future.

5.3 FLEXIBILITY

The following options would not be added initially, but the software should have the capability to build them in at a later stage:

Addition of any further parameters/forms

Web-based access for data entry and retrieval

5.4 AUDIT TRAIL


The system should be able to completely trace any change and to raise alerts on any exceptional level of activity. For this reason, the software must have audit trail capability: it should maintain a record of system activity, both by the system itself and by specific users (including intrusions or attempted intrusions), to facilitate the detection of security violations and performance issues. Such capability may entail recording and maintaining user logs.
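An audit trail of this kind can be kept as an append-only log. The sketch below, with assumed function names, records each user action as one JSON line and supports a simple per-user activity count for exception alerts:

```python
import json
import datetime

def log_action(logfile, user, action, record_id=None):
    """Append one entry to the audit trail. Entries are written as
    JSON lines, so the log can be searched and replayed later."""
    entry = {
        "timestamp": datetime.datetime.now().isoformat(),
        "user": user,
        "action": action,        # e.g. 'login', 'update_record'
        "record_id": record_id,
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(entry) + "\n")

def count_user_actions(logfile, user):
    """Count a user's logged actions, e.g. to flag unusual activity."""
    with open(logfile) as f:
        return sum(1 for line in f if json.loads(line)["user"] == user)
```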

5.5 INTERFACE REQUIREMENTS

Although it may be simpler to build a system without any web interface, the requirement for such an interface is likely to arise for the following reasons:

Providing senior management, such as the CEO/GMs (and even MoI officials) based in Lahore and Islamabad, direct access to stored information

Providing users in the field flexible access to stored information and the ability to print respective reports

Flexibility in locating a centralized server remotely and accessing it from different locations

Therefore, the software should preferably have a web-based interface, or at least the capacity to integrate such an interface at a future stage.

Beyond web functionality, other requirements for a specific interface would depend on the access granted, as per the user categories defined below.

6. LEVEL OF SECURITY

The security of the software would be critical; it must therefore have adequate security measures to prevent breaches and mitigate security risks. Security should be duly built in, which translates into:

Dependability (Correct and Predictable Execution): Justifiable confidence can be attained that software, when executed, functions only as intended;

Trustworthiness: No exploitable vulnerabilities or malicious logic exist in the software, either intentionally or unintentionally inserted;

Resilience (and Survivability): If compromised, damage to the software will be minimized, and it will recover quickly to an acceptable level of operating capacity;

Conformance: A planned and systematic set of multi-disciplinary activities will be undertaken to ensure software processes and products conform to requirements and applicable standards and procedures

7. INFORMATION/DATA BACKUP

The proposed software and database should have due backup arrangements to recover data in case of any loss, inadvertent deletions or data corruption. Moreover, the backup arrangements should also have the ability to recover data from an earlier time, according to a pre-defined data retention policy.


In order to ensure adequate backup arrangements, a mix of data repository models may be used. It is recommended that a combination of ‘full system imaging’ and ‘incremental data storage’ be used: full system imaging captures the whole dataset at a specific point in time, whereas incremental data storage facilitates more frequent and feasible backups.
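The recommended mix of full system imaging and incremental storage implies a backup schedule. The sketch below assumes, purely for illustration, one full image per week (on Sunday) with incremental backups on the other days:

```python
import datetime

def backup_type(day, full_backup_weekday=6):
    """Choose between the two repository models: a weekly full system
    image (Sunday by assumption, weekday()==6) and lighter incremental
    backups on the remaining days."""
    return "full" if day.weekday() == full_backup_weekday else "incremental"

# A week's schedule mixes six incremental backups with one full image
week = [datetime.date(2012, 12, 3) + datetime.timedelta(days=i) for i in range(7)]
print([backup_type(d) for d in week])
```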

8. CLASSES OF USERS

Primarily, there would be five classes of users of the proposed software, as listed below:

Class I Users

This class of users would be responsible for the upkeep of the system and for any troubleshooting. The access granted to these system administrators would be suggested by the technical team and approved by GM (CS).

Class II Users

This class of users would be responsible for managing the monitoring and reporting records in their respective areas; they should have access to all information relating to their projects and the authority to change it. This class would primarily consist of the relevant Project Managers for reporting/monitoring information and PPMEIU monitoring staff for monitoring information.

Class III Users

This class of users would do the bulk of the work on the system, both storing information and changing records, but should only have access to initiate such changes; the changes would need to be validated by Class II users before becoming part of the permanent record. Class III users may be members of project teams. These users should also have the authority to generate customized reports relevant to their respective projects.

Class IV Users

Senior management would not need access to individual records but would need extensive analytical information for decision making. Therefore, this class of users would have access to view or change anything except individual monitoring/reporting records. These users would also be able to view draft (initiated) changes that have not yet been validated by the respective Class II users.

Class V Users

This class of users would not be directly involved with the records but would have broad access to view aggregated information. Such users may include GMs and Provincial Chiefs who are not working in or responsible for particular projects.
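The five user classes map naturally onto role-based permission checks. The sketch below uses assumed permission labels, not SMEDA policy:

```python
# Illustrative mapping of the five user classes to permissions;
# both the class labels and permission names are assumptions.
PERMISSIONS = {
    "I":   {"administer_system"},
    "II":  {"view_records", "change_records", "validate_changes"},
    "III": {"view_records", "initiate_changes", "generate_reports"},
    "IV":  {"view_analytics", "view_draft_changes"},
    "V":   {"view_aggregates"},
}

def can(user_class, permission):
    """Check whether a user class holds a given permission."""
    return permission in PERMISSIONS.get(user_class, set())

# A Class III user may initiate a change but not validate it
print(can("III", "initiate_changes"), can("III", "validate_changes"))
```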


APPENDIX E – DRAFT RESULTS FRAMEWORK FOR SMEDA

Results Framework – SMEDA 2012-17

OUTCOME AREA 1 - Growth and development of existing SMEs

OUTCOMES INDICATORS MILESTONES & TARGETS

2013 2014 2015 2016 2017

Facilitation of credit flow to SMEs

Credit Guarantee Scheme Launched

# of banks engaged under the scheme

Credit provided with guarantee under Credit Guarantee Scheme

No. of SME units awarded

Performance & Credit rating of SMEs

Improving competitiveness of SMEs

Loans provided for technology up-gradation of SMEs

Enhancing competitiveness of SMEs with lean manufacturing and other TQM techniques

Quality improvement in SMEs in terms of relevant indicators such as wastage, etc.

Introducing new product/ process designs in SMEs

Increasing energy efficiency in SMEs by X%


Technology adoption in SME clusters

Incubation of new business ideas

Promotion of SMEs through cluster based approach

CFCs constructed and commissioned

CFCs operational

# of SMEs accessing CFCs

New SMEs set up under and registered in a particular cluster

Ratings in independent evaluation of various cluster development activities

Marketing support to SMEs

Entrepreneurs provided support for participation in International Fairs/ Exhibitions

Number of Domestic Fairs & Exhibitions organized/ co- sponsored

Evaluation & restructuring of existing delivery mechanism

Study on existing SME delivery mechanisms

Indicators coming out of study

OUTCOME AREA 2 - Creation of new enterprises

OUTCOMES INDICATORS MILESTONES

2013 2014 2015 2016 2017

Creation of new Enterprises

New SMEs registered

Operationalizing new SMEs which have benefited from SMEDA’s support

Conducting a beneficiary survey in new SMEs and ratings

Number of new industries with new SMEs

Initial support to strengthen new SMEs

# of SMEs assisted

Success rate of new SMEs

OUTCOME AREA 3 - Growth and development of Rural and Cottage Industries

OUTCOMES INDICATORS MILESTONES

2013 2014 2015 2016 2017

Growth and development of rural and cottage industries

Gross sales of rural and cottage industries

Annual gross sales of rural and cottage industries as against the production value of rural and cottage industries

Number of SMEs assisted in rural and cottage industries

Export of rural and cottage industries products

Demonstration of new technology through R&D intervention in rural and cottage industries

Conducting an independent evaluation of rural and cottage industries support initiatives by SMEDA and ratings


Transfer of innovative technology in rural and cottage industries

Initiating actions on handholding/technological support to model SMEs

Development of improved machines/ process /services in rural and cottage industries

OUTCOME AREA 4 - Skill and Entrepreneurship Development and Quality Upgradation

OUTCOMES INDICATORS MILESTONES

2013 2014 2015 2016 2017

Training of Entrepreneurs Number of persons trained

Number of persons trained in specific industries

Number of people employed in SME sector

Productivity improvement in SMEs

Tracer studies and success rates

References

SMEDA’s Pre-Award Assessment Report; USAID; January 2010

SMEDA’s official organization brief

SMEDA presentation for Deputy Prime Minister

Annual program presentation template


SMEDA's latest budget 2012-13

Third Party Evaluation Report of SMEDA; Shah, Salman

PC-I for PPMEIU

PC-I for Gujranwala Business Center

PSDP Project list with approved costs, throw forward, current allocations, etc.

Guidelines for Project Management; Planning Commission of Pakistan

Handbook on Planning Commission; Planning Commission of Pakistan

Handbook on Planning, Monitoring and Evaluating for Development Results; UNDP 2009

The Monitoring and Evaluation Handbook for Business Environment Reform; IFC

Monitoring and Evaluating Projects: A step-by-step Primer on Monitoring, Benchmarking, and Impact Evaluation; Grun, Rebekka E.; 2006

‘Malaysian Experiences of Monitoring in Development Planning’; Discussion Paper; Hussain, Datuk Zainul Ariff; Implementation Coordination Unit, Jabatan Perdana Menteri, Putrajaya, Malaysia

A Guide for Project M&E; International Fund for Agricultural Development (IFAD); 2002

Guidelines for Preparing a Design and Monitoring Framework; Asian Development Bank; 2007