Vendor Performance Management


Description

Report created by the Canadian Association of Management Consultants for the Ministry of Government Services for the Province of Ontario.

Transcript of Vendor Performance Management

Page 1: Vendor Performance Management

CMC - Canada February 2012

Vendor Performance Management Study

Prepared by:

R.A. Malatest & Associates Ltd. & Gerald Ford, Ontario Advocacy Committee Member, CAMC

TABLE OF CONTENTS

1.1 BACKGROUND............................................................................................................................................... 1 1.2 APPROACH .................................................................................................................................................... 1 1.3 RESEARCH LIMITATIONS ............................................................................................................................... 2

2 RESPONDENT PROFILE ........................................................................................................................ 2

2.1 RESPONDENT CHARACTERISTICS .................................................................................................................. 3 2.2 EXISTING VENDOR PERFORMANCE MANAGEMENT SYSTEMS ..................................................................... 12

3 MEASUREMENT ................................................................................................................................... 17

4 PERFORMANCE ELEMENTS .............................................................................................................. 22

5 IMPLICATIONS ..................................................................................................................................... 24

6 FAIRNESS ............................................................................................................................................... 27

7 SUMMARY AND CONCLUSIONS ....................................................................................................... 29

7.1 KEY ELEMENTS........................................................................................................................................... 29 7.2 DESIGN CONSIDERATIONS .......................................................................................................................... 32 7.3 DISCUSSION POINTS .................................................................................................................................... 34 7.4 IMPLEMENTATION CONSIDERATIONS .......................................................................................................... 35

8 BIBLIOGRAPHY .................................................................................................................................... 36

9 APPENDIX A .......................................................................................................................................... 38


EXECUTIVE SUMMARY

INTRODUCTION

1.1 Background

The Ontario government advised vendors through a White Paper in spring 2010 of its intention to develop and implement a vendor performance management program for consulting services. This intention was formalized in notices as part of Requests for Proposals for new Vendor of Record (VOR) arrangements for consulting services in spring 2011. CMC-Ontario provided initial input through its submission to the Ontario government in July 2010 on the “Modernization of Ontario’s Consulting Services Vendor of Record (VOR) Program.” After posting of the VOR Requests for Proposals, CMC-Ontario commissioned a research project to assist the Association in advising the government on developing a robust, fair and transparent framework for working with vendors, managing risk, and helping vendors understand the expectations of government clients.

Although the specific aims of such programs differ across jurisdictions and services, vendor performance evaluation programs generally serve to monitor the performance of vendors, support the management of contracts, cut costs and mitigate risks, foster better communication, and enhance the value of such services by providing timely and structured feedback to vendors (Ontario Realty Corporation, 2010; Office of the Procurement Ombudsman, 2010; The Department of Housing, 2006; Survey Analytics, 2011; Weber, 1996). Vendor performance management programs establish a mutually beneficial relationship between vendor and client and promote continuous improvement in the quality of goods and services (The Department of Housing, 2006; Office of the Procurement Ombudsman, 2010). In addition, vendor management programs increase the vendor’s competitive advantage, improve stakeholder satisfaction, and increase performance visibility (Survey Analytics, 2011).

The objective of the current research project was to provide CMC-Ontario (the Ontario Institute of the Canadian Association of Management Consultants) with information and recommendations that would enable the organization to articulate a position on a performance management approach for management consulting services in the public sector. The intention is for CMC-Ontario to table this paper with the Ontario government as input to the government’s plan to implement a vendor performance management program for consulting services within the next six to twelve months.

1.2 Approach

The Vendor Performance Management Study was a mixed-mode project consisting of a survey, interviews, and a literature review. The objective of the survey was to collect representative data from sector stakeholders in a statistically valid way. An electronic link to an online survey was sent to potential respondents, and a total of 119 respondents completed the survey in full. A further 184 respondents accessed the survey but did not complete it. The results from nine valid incomplete cases were added to the base of 119 completed cases, and the remaining 56 invalid cases were discarded. The current report contains the valid responses for each survey question.

Three telephone interviews were also completed with knowledgeable sector stakeholders. The objective of these interviews was to discuss the procurement and management of management consulting contracts, the use of existing vendor performance management systems, and features and best practices of potential vendor performance management systems. These interviews were also intended to inform the development of the survey instrument.


A literature review was conducted at the beginning of the project to understand existing vendor performance management systems and to collect baseline data on best practices in the field. Over 20 sources were consulted during the development of this review.

1.3 Research Limitations

Effort was made during the research project to gather information that was both representative and reliable. However, as with all research endeavors, some considerations should be noted.

The total sample size limits the level of detail possible in the data analysis. The survey used a census approach, which means that the resulting data is drawn from a convenience sample: those who were aware of the study and who opted to participate. A total of 128 respondents provided input, but these respondents were not randomly selected.

Since the data was collected from a non-random sample, a margin of sampling error cannot be calculated. Furthermore, analysis of subgroups is not possible, with the exception of a distinction between buyers (clients) and providers (vendors) of management consulting services, although one question was also analyzed by sector.
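For illustration only, the margin of sampling error that a simple random sample of this size would carry can be computed with the standard formula for a proportion. The sketch below is not from the report; it is included to clarify why no margin is quoted here — the formula assumes random selection, which a self-selected convenience sample violates.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    valid only under simple random sampling (p=0.5 is worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# A random sample of n = 128 would carry roughly a +/- 8.7 percentage
# point margin, but the figure is meaningless for this study's sample.
print(round(margin_of_error(128) * 100, 1))  # -> 8.7
```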

2 RESPONDENT PROFILE

Respondents were asked to provide some background on their work and professional experiences relative to management consulting. This section provides details on those respondent characteristics.


2.1 Respondent Characteristics

Figure 2.1.0 Respondent Characteristics

n=119

QA1: Are you a buyer or provider of management consulting services?

Management consulting services refers to both the industry and practice of helping organizations improve their performance, primarily through the analysis of existing organizational problems and the development of plans for improvement.

The North American Industry Classification System description of management consulting is: “Establishments primarily engaged in providing advice and assistance to other organizations on management issues, such as strategic and organizational planning; financial planning and budgeting; marketing objectives and policies; human resource policies, practices and planning; and production scheduling and control planning.”

Buyers (clients) were defined as users of management consulting services, while providers (vendors) were defined as suppliers of those services. Relatively equal proportions of providers (44%) and buyers (43%) responded to the survey; the remaining 13% of respondents were neither providers nor buyers of management consulting services.



Figure 2.1.1 Previous Management Consulting Experience

Provider n=68

Buyer n=67

QA1a: Have you ever been a Client (Buyer) of management consulting services?

QA1b: Have you ever been a Vendor (Provider) of management consulting services?

Both providers and buyers were asked whether they had ever bought or provided management consulting services, respectively. A much greater proportion of providers had previously been in a buyer role, with 62% of providers responding that they had previously been buyers of management consulting services. Conversely, 82% of buyers had never been in a position where they had provided management consulting services.



Figure 2.1.2 Respondent Fields and Specialties

n=119

QA2: What is your field or specialty?

Respondents were asked to identify their field or specialty. Half of all respondents are in supply chain management, while 23% of respondents are in management consulting. Smaller proportions are from the human resources (2%), project management (6%) and information technology (4%) fields.



Figure 2.1.4 Respondent Sectors

n=249 (multiple response permitted)

QA2a: What sectors are you involved with?

Respondents were asked to identify the sectors that they are involved with. There was a dispersion of answers to this question, with broad representation across various industry sectors: 22% are involved in private companies, with relatively equal proportions involved in the municipal government (12%), health care (11%) and academic (i.e., university/college) (10%) sectors. 16% of respondents are involved in the broader public sector, while 14% are involved in the federal and provincial government sectors. Smaller proportions are involved with school board (7%) and other (8%) sectors.



Figure 2.1.5 Years of Respondent Experience

n=119

QA3: How many years have you been active in this field?

Respondents were asked to indicate how many years they had been involved in their field. Respondents were relatively experienced, with 30% having 11 to 20 years of experience and 25% having 20 to 29 years of experience in their field. Only 10% had less than five years’ experience. Respondents had an average of 18.2 years of experience.



Figure 2.1.6 Respondent Memberships or Professional Designations

N=180 (multiple response permitted)

QA4: Do you possess any of the following memberships or designations?

Respondents were asked to identify their memberships in professional associations or their professional designations. Equal proportions of respondents hold Ontario Institute of the Purchasing Management Association of Canada (OIPMAC) (17%) and Canadian Association of Management Consultants (CMC) (17%) designations. 13% have membership in the Ontario Public Buyers Association (OPBA) and 10% have membership in the National Institute of Governmental Purchasing (NIGP). Smaller proportions of respondents were members of, or possessed designations from, the Canadian Public Procurement Council (CPPC) (3%), the Ontario Educational Collaborative Marketplace (OECM) (3%), the Healthcare Supply Chain Network (HSCN) (2%), or a construction association (1%). 14% of respondents stated that this question was not applicable to their situation.



Figure 2.1.7 Degree of Involvement with Management Consulting Contracts

N=119

QF1: How many management consulting contracts have you been a part of in the last year?

Respondents were asked to indicate how many management consulting contracts they had been a part of in the last year. 50% of respondents had been a part of between one and five contracts in the last year, while 18% had not been involved in any management consulting contracts. Smaller proportions were involved in 6-10 contracts (14%), 11-15 contracts (12%) or 16 or more contracts (6%).



Figure 2.1.9 Number of Performance Management Evaluations

N=119

QF2: During the course of a typical year, how many performance management evaluations are performed at your company or organization?

Respondents were asked to indicate how many vendor performance evaluations are performed during the course of a typical year at their company or organization. On average, companies and organizations complete 18 performance management evaluations per year. 43% of respondents stated that no vendor performance evaluations are typically done, while 34% stated that between 1 and 10 vendor performance evaluations are typically done at their company or organization. Smaller proportions indicated that they typically performed 11-25 vendor evaluations (10%), 26-49 vendor evaluations (6%) or 50 or more vendor evaluations (8%).

The survey also asked respondents to state the average value of these contracts. The mean (average) value was $379,844.



Figure 2.1.10 Importance of Vendor Performance Management

N=119

G2: Overall, do you see the management of vendor performance as an important activity to measure, evaluate and improve the performance of vendors?

Respondents were asked whether they saw the management of vendor performance as an important activity to measure, evaluate and improve the performance of vendors. 92% of respondents stated that they saw it as an important activity, while 8% stated that they did not.



2.2 Existing Vendor Performance Management Systems

Figure 2.2.1 Existing Vendor Performance Management Policies

n=67

QA5: Does your company or organization have a vendor performance management policy for outside consultants?

Buyers of management consulting services were asked if their company or organization had a vendor performance management policy for outside consultants. While three-quarters of buyers do not have a vendor performance management policy, 25% do. The remainder of this subsection pertains to this 25% and the existing vendor performance management policies possessed by those companies or organizations.

Figure 2.2.2 Utilization of Performance Contractual Clauses

n=17 (caution: small base)

QA7: Does your company utilize performance contractual clauses for managing vendor performance?

Of companies and organizations with existing vendor performance management policies, 94% utilize performance contractual clauses for managing vendor performance. This and subsequent figures must be interpreted with caution because of the small base sizes.



Figure 2.2.3 Utilization of Performance Incentives

N=17 (caution: small base)

QA8: Does your company or organization use positive and negative performance incentives for managing vendor performance?

Of companies and organizations with existing vendor performance management policies, 65% use positive and negative performance incentives for managing vendor performance. 12% use only positive performance incentives, while 6% use only negative performance incentives. 18% of companies and organizations use neither positive nor negative performance incentives.



Figure 2.2.4 Utilization of Tools and Resources

N=17 (caution: small base)

QA10: Which of the following tools and resources are available to your organization for managing vendor performance?

Companies and organizations with existing vendor performance management policies were asked which tools and resources were available to their organization for managing vendor performance. Just over three-quarters said they used forms (76%) and templates (76%), 71% said they used progress meetings, and 65% of companies and organizations used performance documentation. Lesser proportions used checklists (41%), third party verification (35%) or user guides and manuals (29%).



Figure 2.2.5 Automation of Forms and Templates

Forms n=19 (caution: small base)

Templates n=17 (caution: small base)

QA10b and QA10c: Please specify your applications

MS Excel was the most commonly used application for managing vendor performance forms and templates: 47% of those with existing vendor performance management systems used MS Excel. 42% used MS Word as their application for managing vendor performance forms, while 35% used MS Word for managing vendor performance templates. 11% used other software applications for managing vendor performance forms, including CMIC and a SAP Contracts Database. 18% of those companies or organizations with existing vendor performance management systems used other software applications for managing vendor performance templates, including MS Access, Cordys Software, and online surveys.

Figure 2.2.6 Evaluation Triggers for Existing Vendor Performance Management Systems



N=138

QF2a: What triggers an evaluation?

Companies and organizations with existing vendor performance management policies were asked what triggers a vendor evaluation. Issues with contract performance (32%) and contract value (24%) were the leading evaluation triggers. Complex projects trigger vendor evaluations at 12% of the companies and organizations with existing vendor performance management policies, while periodic triggers occur at 9% of the companies and organizations. Client dissatisfaction and project profiles were triggers at 6% of the companies and organizations, while new vendors, repeat vendors, and length of engagement each triggered evaluations at 3% of the companies and organizations. Only 3% of companies and organizations with existing vendor performance management policies trigger evaluations on every project.
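As a purely illustrative sketch (the report describes no such rules), the leading triggers respondents reported could be combined into a simple evaluation policy. The function name and the value threshold are hypothetical parameters, not figures from the study.

```python
def should_evaluate(performance_issues: bool,
                    contract_value: float,
                    complex_project: bool,
                    value_threshold: float = 100_000.0) -> bool:
    """Trigger a vendor evaluation on any of the leading criteria
    respondents cited: contract performance issues, high contract
    value, or project complexity. The threshold is an assumed value."""
    return (performance_issues
            or contract_value >= value_threshold
            or complex_project)

# A high-value contract triggers an evaluation even with no issues:
print(should_evaluate(False, 250_000.0, False))  # -> True
# A small, uncomplicated, well-performing contract does not:
print(should_evaluate(False, 40_000.0, False))   # -> False
```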



3 MEASUREMENT

This section contains findings on respondent preferences as to when and how vendor performance should be measured.

Figure 3.0.1 Vendor Performance Evaluation on Every Project?

N=119

QC1: In your opinion, should vendor performance be measured on every project?

Providers (83%) were more likely than buyers (78%) and respondents who were neither buyers nor providers (69%) to state that vendor performance should be measured on every project. Overall, approximately 4 out of 5 respondents (79%) state that vendor performance should be measured on every project.



Figure 3.0.2 Vendor Performance Evaluation Triggers

n=34

QC1: If no, what do you think should trigger a formal evaluation?

Respondents who said that performance should not be measured on every project were asked what should trigger a formal evaluation. 32% of these respondents said that formal evaluations should be triggered by poor performance, while 16% said that formal evaluations should be periodic. 14% said that formal evaluations should occur at the end of a contract, while 11% said formal evaluations should occur upon request. 5% said that evaluations should in fact occur on every project, while 3% said that the contract value should be the trigger. 2% said that they did not know, and 11% said this question was not applicable to their situation.



Figure 3.0.3 Vendor Performance Evaluation Timelines

N=153

QC2: When should vendor performance be evaluated?

Respondents were asked when vendor performance should be evaluated. 43% said that vendor performance should be evaluated at the completion of each stage or phase of the project. 16% of respondents stated that vendor performance should be evaluated at the end of the contract, while 8% said it should be evaluated on a quarterly basis. Lesser proportions were seen for other periodic evaluations, including monthly (1%), biannually (4%), annually (7%), or every other year (1%). Other respondents stated that it would depend on the length (5%), value (2%) and complexity (1%) of the project/contract. 3% stated that vendor evaluations should occur at times that are specified in the contract.



Figure 3.0.4 Evaluation Report Duration

N=119

QC3: How long should an evaluation report take to complete?

Respondents were asked to state how long, in minutes, an evaluation report should take to complete. 71% of respondents said that an evaluation report should take between 10 and 30 minutes to complete, while 18% stated that it should take between 30 minutes and an hour. Lesser proportions stated that evaluation reports should take less than 10 minutes (7%) or longer than an hour (4%). The median response was 30 minutes, while the mean was 32.5 minutes.
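The two summary figures differ because duration responses are typically right-skewed: a few long answers pull the mean (average) above the median. A minimal sketch with hypothetical durations (not the survey's actual responses) illustrates the gap:

```python
import statistics

# Hypothetical evaluation-report completion times in minutes (illustrative
# only): most responses cluster near 30, with one long outlier.
times = [15, 20, 25, 30, 30, 30, 35, 40, 45, 90]

median = statistics.median(times)  # half the responses fall at or below this
mean = statistics.mean(times)      # pulled upward by the 90-minute outlier
print(median, mean)  # -> 30.0 36.0
```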



Figure 3.0.5 Vendor Performance Evaluation Measurement Scales

N=125

QC4: Please indicate your preference with respect to the following scales used to measure satisfaction with vendor performance.

Respondents were asked to indicate their preferences with respect to several types of scales used to

measure satisfaction with vendor performance. Providers were more likely to state that an expectation

scale should be used to measure vendor performance (65%), with smaller proportions of providers who

preferred numeric (19%) and qualitative (i.e. very good to very poor) scales (7%). Buyers were equally

divided between the expectation (43%) and numeric scales (43%), with only 11% preferring the

qualitative scale. Respondents who were neither buyers nor providers were more likely to prefer the

numeric scale (47%) over the expectation (41%) or qualitative scale (6%). Overall, 52% of respondents

preferred the expectation scale while lesser proportions preferred the numeric (33%) and qualitative

scale (9%).



4 PERFORMANCE ELEMENTS

This section contains findings on vendor performance elements.

Figure 4.0.1 Vendor Performance Element Ratings
(values are Top Box % / Average (mean))

Element | Overall (n=128) | Providers (Consultant) (n=56) | Buyers (Clients) (n=55) | Neither (n=17)
A. Effective communication throughout engagement | 59.2% / 9.2 | 69.6% / 9.5 | 58.2% / 9.3 | 29.4% / 8.8
B. Quality of resources | 45% / 8.9 | 53.6% / 9.3 | 41.8% / 9.0 | 29.4% / 8.1
C. Availability of resources to carry out contract | 46.7% / 9.0 | 51.8% / 9.1 | 47.3% / 9.1 | 23.5% / 8.7
D. Quality of the final deliverables | 69.7% / 9.5 | 66.1% / 9.6 | 74.5% / 9.6 | 58.8% / 9.4
E. Providing value added services | 22.7% / 8.1 | 21.7% / 8.2 | 25.5% / 8.1 | 11.8% / 7.8
F. Maintaining timelines/deadlines | 44.1% / 9.1 | 33.9% / 9.0 | 56.4% / 9.3 | 41.2% / 9.3
G. Budget/cost control | 47.9% / 9.1 | 39.3% / 9.0 | 50.9% / 9.3 | 58.8% / 9.3
H. Having a vendor contact for dispute resolution | 30.8% / 8.1 | 25.0% / 8.1 | 38.2% / 8.4 | 17.6% / 7.8

N=various (see table)
QB1: In general, please indicate how important the following elements are for evaluating vendor performance?

Overall, respondents rank the quality of the final deliverables (69.7%) and effective communication throughout the engagement (59.2%) as the most important elements for measuring vendor performance, as measured through the top box score. Budget/cost control (47.9%), availability of resources to carry out the contract (46.7%), quality of resources (45%) and maintaining timelines/deadlines (44.1%) followed, with similar overall average and top box scores. Having a vendor contact for dispute resolution (30.8%) and providing value added services (22.7%) are considered to be the least important elements for evaluating vendor performance.
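The "top box" and mean figures reported above can be illustrated with a short calculation. The sketch below assumes a 1-10 importance scale with 10 as the "top box" (the report does not state the scale explicitly), and the example ratings are hypothetical:

```python
def summarize_ratings(ratings, top=10):
    """Return (top-box %, mean) for one performance element.

    Top box % = share of respondents giving the highest possible rating.
    Assumes a 1..10 scale with 10 as the top box; illustrative only.
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    top_box_pct = 100.0 * sum(1 for r in ratings if r == top) / len(ratings)
    mean = sum(ratings) / len(ratings)
    return round(top_box_pct, 1), round(mean, 1)

# Hypothetical ratings from ten respondents for a single element
print(summarize_ratings([10, 10, 10, 10, 10, 10, 9, 9, 8, 7]))  # (60.0, 9.3)
```

A "top box" summary of this kind deliberately discards mid-scale information, which is why the report pairs it with the mean.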


Figure 4.0.2 Vendor Performance Evaluation Rankings
(values are Rank 1 / Rank 2 / Rank 3)

Element | Overall (n=126) | Providers (Consultant) (n=55) | Buyers (Clients) (n=54) | Neither (n=17)
A. Effective communication throughout engagement | 26% / 19% / 14% | 30% / 20% / 9% | 18% / 20% / 17% | 24% / 19% / 19%
B. Quality of resources | 9% / 18% / 11% | 9% / 25% / 9% | 11% / 7% / 9% | 0% / 13% / 19%
C. Availability of resources to carry out contract | 11% / 8% / 8% | 7% / 5% / 11% | 13% / 9% / 7% | 12% / 6% / 6%
D. Quality of the final deliverables | 38% / 23% / 14% | 43% / 16% / 18% | 33% / 26% / 15% | 41% / 31% / 6%
E. Providing value added services | 2% / 5% / 8% | 2% / 11% / 9% | 2% / 0% / 6% | 0% / 0% / 13%
F. Maintaining timelines/deadlines | 3% / 12% / 25% | 0% / 9% / 25% | 4% / 19% / 26% | 6% / 6% / 19%
G. Budget/cost control | 8% / 13% / 16% | 4% / 5% / 18% | 16% / 19% / 17% | 12% / 25% / 6%
H. Having a vendor contact for dispute resolution | 3% / 2% / 1% | 4% / 4% / 0% | 0% / 0% / 0% | 6% / 0% / 6%
I. Other, Specify | 1% / 2% / 3% | 0% / 4% / 0% | 2% / 0% / 4% | 0% / 0% / 6%

N=various (see table)
QB2: Please rank the three most important elements for evaluating vendor performance in general.

Respondents were asked to rank the three most important elements for evaluating vendor performance in general. As with the top box scores, the quality of the final deliverables (38.3%) and effective communication throughout the contract (25.8%) were ranked first by the greatest number of respondents overall. The third most common first-ranked element was the availability of resources to carry out the contract (10.8%). The quality of the final deliverables (22.7%) and effective communication throughout the engagement (18.5%) were the most common second-ranked elements, although the quality of resources (17.6%), budget and cost control (13.4%) and maintaining timelines/deadlines (11.8%) were still important to a considerable number of respondents overall. Maintaining timelines/deadlines (25.2%) and budget/cost control (16.0%) were the most common third-ranked elements, while quality of the final deliverables (14.3%) and effective communication throughout the contract (14.3%) tied for third place among third-ranked elements.

In examining the overall results horizontally, it can be concluded that while maintaining timelines/deadlines and budget/cost control are important elements, they do not figure as prominently in evaluating vendor performance as the quality of the final deliverables or effective communication throughout the engagement. Moreover, the fact that maintaining timelines/deadlines and budget/cost control gain importance from Rank 1 to Rank 3 suggests that they consistently appear among the top three elements, even when communication and quality of deliverables rank higher.

Comparing results across those who are providers, buyers and neither providers nor buyers of management consulting services, it is clear that providers rank effective communication throughout the engagement more highly than buyers. Providers are also more likely to rank the quality of resources as an important element (2nd rank) relative to buyers, and to rank maintaining timelines/deadlines and


budget/cost control as less important elements relative to buyers. 59% of both buyers and providers rank the quality of the final deliverables either first or second out of all performance elements.

5 IMPLICATIONS

This section contains findings on respondent preferences for how vendor performance measurement systems should be used.

Figure 5.0.1 Performance Incentive Preferences
N=119; QD1: Should a vendor performance measurement system use positive and negative performance incentives? E.g. bonuses and/or financial penalties.

Overall, respondents were divided as to whether vendor performance measurement systems should use positive and negative performance incentives, with 45% stating that they should not be used and an equal proportion stating that both positive and negative performance incentives should be used. Buyers and respondents who were neither providers nor buyers were more likely to state that positive and negative performance incentives should be used, with 51% and 50%, respectively, stating that they should be used. 56% of providers think that performance incentives should not be used, while 37% of providers think that both positive and negative performance incentives should be used. Much smaller proportions of respondents overall thought that only negative (5%) or only positive (5%) performance incentives should be used.

[Figure: performance incentive preferences by respondent group (Provider / Buyer / Neither / Total) — yes, positive only: 6% / 4% / 6% / 5%; yes, negative only: 2% / 6% / 13% / 5%; yes, both positive and negative: 37% / 51% / 50% / 45%; no, neither: 56% / 39% / 31% / 45%]


Figure 5.0.2 Implications for Receiving Subsequent RFPs and RFSs
N=119; QD2: Should a vendor performance measurement system be used to determine which vendors receive subsequent RFPs and RFSs?

There were high levels of consensus that vendor performance measurement systems should be used to determine which vendors receive subsequent RFPs and RFSs, with relatively equal proportions of providers (75%) and buyers (76%) in agreement that this constitutes a good business practice. Respondents who were neither providers nor buyers were even more likely to state that vendor performance measurement systems should be used for this purpose (88%).

[Figure: should the system determine which vendors receive subsequent RFPs and RFSs? (Provider / Buyer / Neither / Total) — yes: 75% / 76% / 88% / 77%; no: 25% / 24% / 13% / 23%]


Figure 5.0.3 Probation for Vendors with a Poor Performance Record
N=119; QD3: Should a vendor with a poor performance record be put on probation for a defined period of time?

Buyers and providers were somewhat at odds over whether vendors with poor performance records should be put on probation for a defined period of time, with 86% of buyers but only 71% of providers in agreement that this constituted a good business practice. Respondents who were neither providers nor buyers were the most likely to state that vendors with a poor performance record should be put on probation for a defined period of time, with 94% in agreement. Overall, approximately 4 in 5 respondents (81%) agreed that probationary periods should be used to respond to poor vendor performance.

[Figure: should vendors with a poor performance record be put on probation? (Provider / Buyer / Neither / Total) — yes: 71% / 86% / 94% / 81%; no: 29% / 14% / 6% / 19%]


6 FAIRNESS

Figure 6.0.1 Preferences with Respect to Dispute Resolution
N=239 (multiple responses permitted); QE1: What types of dispute resolution should be available?

Relatively similar responses were received from the respondent groups on the types of dispute resolution that should be available. This was a multiple-response question, so percentages should be viewed as the degree of agreement rather than as the percentage of those who agreed. Client/vendor meetings (44% overall) and debriefings on the vendor evaluation report (31% overall) are seen as the leading types of dispute resolution that should be available. Across the respondent groups, relatively similar proportions were in agreement that there should be a designated ombudsman in the client organization. The respondent groups were most misaligned on whether a third party should be brought in to facilitate, mediate and arbitrate in the event of a dispute between the provider and buyer of management consulting services: 18% of providers thought that this type of dispute resolution should be available, as compared to 13% of buyers and just 8% of respondents who were neither providers nor buyers.

[Figure: preferred dispute resolution types (Provider / Buyer / Neither / Total) — client/vendor meetings: 42% / 45% / 50% / 44%; debriefing on evaluation report: 29% / 32% / 35% / 31%; bring in third party: 18% / 13% / 8% / 15%; designated ombudsman: 7% / 8% / 4% / 7%; other: 4% / 1% / 4% / 3%]


Figure 6.0.2 Additional Remarks
N=44; QG3: Is there anything else that you’d like to add in regards to vendor performance management?

At the end of the survey, respondents were asked whether there was anything else that they’d like to add with regard to vendor performance management. This question received a range of open-ended remarks. 18% of respondents to this question stated that a standardized approach to managing vendor performance is needed across sectors and fields. 20% cautioned that vendor performance management can become resource intensive and/or bureaucratic if improperly designed and implemented. 16% cautioned that an unbiased approach is needed so that vendor performance evaluations are a fair and effective management tool. 11% stated that vendor performance management is needed as a business practice for relationship building. 5% suggested that training is needed for users of any given vendor performance management system, and a similar proportion (presumably of providers) suggested that buyers also need to be managed. 2% cautioned that the public and private sectors are different and that an approach used in one sector may not necessarily be effective in the other.



7 SUMMARY AND CONCLUSIONS

7.1 Key Elements

As previously mentioned, the research for this project included interviews and a literature review in addition to the survey research. These supplementary activities provided data that is useful for informing, validating and confirming the survey findings contained in the report. While clear direction is provided by the majority of the survey findings, this summary and the following conclusions also draw upon that supplementary material.

Respondents are in agreement that the management of vendor performance is an important activity for measuring, evaluating and improving the performance of vendors. Any vendor performance management system will include not only the tools with which vendor performance can be measured, but also the processes and knowledge with which such measurements can be integrated into the operations and business practices of an organization. The fact that only a quarter of buyers represented in the survey have a vendor performance management policy represents an opportunity for the development of vendor performance management systems.

Existing performance management systems characteristically utilize performance contractual clauses for managing vendor performance. Such clauses are used as a framework against which performance can be measured. Approximately two-thirds of companies and organizations with existing performance management policies utilize both positive and negative performance incentives, suggesting that performance incentives (both positive and negative) are a best practice in terms of managing vendor performance. Forms, templates, progress meetings and performance documentation are the most commonly utilized tools and resources. It should be noted, however, that other survey findings and the interviews have cautioned that user guides and manuals (and system training) ought to be considered part of an effective vendor performance management system.

Among companies and organizations with existing vendor performance management systems, poor contract performance, budget thresholds and complex projects were generally the most prevalent vendor evaluation triggers. While providers and buyers are generally in agreement that vendor performance evaluations should be used uniformly on every project, those who suggested that vendor performance evaluations should not be utilized on every project were most likely to state that poor vendor performance should be the evaluation trigger. This finding suggests that vendor performance evaluations could become punitive in the absence of a uniform trigger system. Indeed, in our review of the literature, vendor performance programs generally used a complaint-based system in which clients only complete evaluation forms if they have a negative experience and wish to submit a complaint. Informants noted that performance evaluations should be ongoing because performance cannot be improved at the end of the contract. Mid-contract evaluations were also seen as opportunities for the vendor to ‘make good’ on their contractual obligations. Indeed, respondents were most likely to state that vendor performance should be evaluated at the completion of each stage or phase of the project.

The current study identified a mean value of $379,844 as the average value of respondents’ contracts for which performance evaluations were completed. Since budget thresholds are a more objective trigger mechanism than poor performance, a threshold figure ought to be considered for vendor performance evaluation systems. In reviewing the literature, many broader public sector organizations utilize budget thresholds in order to determine which procurement opportunities ought to be made publicly available to all suppliers. The Agreement on Internal Trade (AIT), for instance, utilizes a figure of $100,000 as the threshold for goods and services procurement opportunities which must be made available on an electronic tendering system readily accessible by all suppliers across Canada. Given the need to automate vendor performance measurement systems and in the interests of


developing synergies between the current initiative and public sector regulations, a budget threshold of $100,000 is similarly recommended as the trigger for vendor performance evaluations in the management consulting sector.
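A threshold-based trigger of this kind is straightforward to express. As a minimal sketch (the function and constant names are illustrative, and "in excess of $100,000" is read as a strict comparison):

```python
RECOMMENDED_THRESHOLD = 100_000  # CAD; the $100,000 trigger recommended above

def evaluation_required(contract_value, threshold=RECOMMENDED_THRESHOLD):
    """Return True when a contract's value triggers a vendor performance
    evaluation.

    A budget threshold is an objective, uniform trigger, unlike a
    'poor performance' trigger that risks punitive use.
    """
    return contract_value > threshold

print(evaluation_required(379_844))  # the study's mean contract value -> True
print(evaluation_required(45_000))   # below the threshold -> False
```

The design choice here mirrors the report's reasoning: the rule depends only on contract value, so every vendor on a qualifying contract is evaluated, not just those a client is unhappy with.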

Respondents are generally in agreement that a vendor performance evaluation should take, on average, approximately 30 minutes to complete. Respondents were more divided on the scales used to measure performance, with many (52%) suggesting that an expectation scale ought to be used. Nevertheless, qualitative comments and data from the interviews suggested that an expectation-based scale will not capture differences between vendors because of the central tendency (i.e. most vendors will ‘meet expectations’). A number of qualitative comments and interview informants suggested that a numeric scale (i.e. 1-4) would be best in order to avoid that central tendency. Indeed, a Vendor Performance Tracking Report produced by the State of Florida’s MyFloridaMarketPlace demonstrates (Appendix A, Figure 9.0.1) that approximately two thirds of all vendors fall between 2.81 and 3.20 on the five-point scale used by that jurisdiction. Greater dispersion and avoidance of the central tendency would be the main benefits of a four-point numeric scale.

Also recurring across jurisdictions is the use of a numeric rating scale to evaluate vendor performance. For instance, the Department of Housing (2006) in Atlanta, Georgia uses a numeric performance rating scale from 0-4 (0 representing unsatisfactory performance and 4 representing excellent performance), including an “N/A” option. Each numeric rating is defined in a legend describing the levels of performance. The same technique is used by MyFloridaMarketPlace.com (2005) and the Government of Tasmania (2001), and by the Ontario Realty Corporation (2010), which uses a numeric scorecard and scorecard guide to obtain a Vendor Performance Rating score.
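As an illustration of such a legend-backed numeric scale, the sketch below models a 0-4 scale with an "N/A" option. Only the endpoint labels (0 = unsatisfactory, 4 = excellent) come from the source described above; the intermediate labels are assumptions:

```python
# 0-4 numeric rating scale with an N/A option, in the style described above.
RATING_LEGEND = {
    0: "Unsatisfactory",
    1: "Poor",       # assumed label, not quoted from the source
    2: "Fair",       # assumed label, not quoted from the source
    3: "Good",       # assumed label, not quoted from the source
    4: "Excellent",
    None: "N/A - not applicable to this engagement",
}

def describe(rating):
    """Translate a numeric rating (or None for N/A) into its legend text."""
    if rating not in RATING_LEGEND:
        raise ValueError(f"rating must be 0-4 or None, got {rating!r}")
    return RATING_LEGEND[rating]
```

Defining every score in a legend like this is what the cited programs use to keep evaluators' interpretations of each numeric level consistent.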

There were some clear findings regarding the performance elements of the vendor performance management system. Quality of the final deliverables and effective communication throughout the engagement were identified as priorities, suggesting that these two elements should contain the most ‘attributes’ or question dimensions in order to reflect their importance with respect to vendor performance. Budget/cost control, availability of resources to carry out the contract, quality of resources, and maintaining deadlines/timelines are also important elements. Value added services and having a vendor contact for dispute resolution were considered the least important elements. Regardless of the specific elements, the performance criteria should always be disclosed to vendors in advance, and the performance criteria must be simple and easily applied (Kestenbaum & Straight, 1995). Informants also noted that performance management systems require clear definitions of each score item.

Informants noted during the interviews that the recognition of a need for expertise that is not available in house was generally the impetus for the procurement of management consulting contracts. It is generally thought that the ‘misunderstandings’ that may lead to performance issues begin with the development of a scope of work. Because the scope of work is used to define client expectations, problems introduced there generally surface at the end of the contract, when the budget has been used up. It is at this point that clients may recognize a gap between their expectations and the consultant’s deliverables, when there is no time or money left to rectify the performance issue. The high rating in the current survey for “effective communication throughout the engagement” suggests that the gap between client expectations and consultants’ deliverables may be addressed through the management of client expectations via effective communication. One informant noted, additionally, that management consultants often function with a vague scope of work. This issue also highlights the importance of communication for defining service, work and deliverable parameters. Within this context (and particularly when clients have not defined their need) there may be an opportunity to introduce performance criteria associated with innovation and creativity with respect to


how the consultant produces the desired outcome. The emphasis here, again, is on the quality of the final deliverables.

The overarching performance criteria among vendor performance programs reviewed in the literature tend to concern the quality of performance, partnership between client and vendor, delivery of services, and cost. For example, the Ontario Realty Corporation (2010) utilizes a performance scorecard in which vendors are rated on a scale of 1-5 based on quality, partnership, and value for money. A similar scorecard is used by the Department of Housing (2006) in Georgia, which rates vendors based on satisfaction, quality, business relations and timeliness. Of all performance measures, quality of service is the most difficult to evaluate (Kestenbaum & Straight, 1995), and it is the factor that generally receives the most weighting overall (Stueland, 2004). An analysis of the consequences of contract administration problems for contracted services revealed that poor performance was the leading cause of contract delays of more than 10 days (18.4%) and a leading cause of contract termination (17.7%), amongst contract administration problems including wrong products, delays, definitions of acceptance, change orders, conflicts, risks, subcontracts and costs (Davison & Sebastian, 2009). Poor performance was also the third leading cause of contract delays of less than 10 days (16.7%) and the leading cause of contract termination (13.8%) for professional services contracts (Davison & Sebastian, 2009).


Figure 7.1.2 Performance Incentive Preferences by Sector
N=240 (multiple responses permitted); QD1: Should a vendor performance measurement system use positive and negative performance incentives? E.g. bonuses and/or financial penalties?

[Figure: incentive preferences by sector (Government / Institution / Private company) — yes, positive only: 4% / 0% / 9%; yes, negative only: 3% / 3% / 3%; yes, both positive and negative: 38% / 53% / 48%; no, neither: 56% / 45% / 40%]

Because respondents were equally divided (i.e. 45% / 45%) as to whether a vendor performance management system ought to include performance incentives, the researchers examined the question against categories formed by aggregating the respondent sectors. In this context, respondents who worked in government sectors (municipal, provincial, federal and the broader public sector) were more likely to suggest that vendor performance measurement systems should not use positive and negative performance incentives (56%) than respondents in the institutional (academic, school board, health care; 45%) and private company (40%) sectors. However, respondents are generally in agreement that a vendor performance management system should be integrated into procurement operations so as to determine which vendors receive or do not receive subsequent RFPs and RFSs. There were also high levels of agreement that vendors with a poor performance record should be put on probation for a defined period of time.

Respondents are in agreement that client/vendor meetings and debriefings on the vendor evaluation report ought to be available as dispute resolution techniques. Providers are more likely to seek third party involvement, while smaller numbers are in agreement that there should be a designated ombudsman in the client organization for dispute resolution purposes.

7.2 Design Considerations

Qualitative comments received through the survey and interviews suggest that vendor performance ought to be measured against clauses in the contract. Generally these clauses are provided with tender/RFP/RFS packages to prospective vendors. An informant revealed, for instance, that “a proactive approach to vendor performance is required. Essentially, it needs to be disclosed that the successful bidder will be subject to a performance evaluation in the bid documents, along with a disclosure of the tools that will be available to the client in assessing the vendor. And then, a consistent application of the proper contract administration functions is important throughout the life of the project.” Comments received, however, indicate that publicly available performance policies are not a best practice with respect to management consulting services, and very few companies or organizations represented in the survey have publicly available (via the internet) performance policies. However, the integration of performance policies with evaluation processes and metrics into service agreements, as mentioned above, is a widespread best practice.

The majority of programs reviewed in the literature, including the Ontario Realty Corporation (2010), the Department of Housing (2006), MyFloridaMarketPlace.com (2005), and the Ministry of Transportation (MTO, 2007), utilize a performance scorecard and scoring guide, a performance rating system, and a weighting technique. The scorecard is used by clients to evaluate a vendor’s performance by applying evaluation criteria that are aligned with the various performance components. The scoring guide assists clients with completing the scorecard by outlining the criteria used to evaluate those performance components. Benchmarking techniques are also used as tools to assign weighting to performance components. Weighting is a common measurement technique used to evaluate the importance of each performance criterion relative to the others in order to provide vendors with a total score. Weighting scales include percentage per criterion and numerical values (Stueland, 2004).
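The percentage-weighting technique described above can be sketched as follows. The criterion names, weights and scorecard values are illustrative, not taken from any of the cited programs:

```python
def total_score(ratings, weights):
    """Weighted total score for one vendor scorecard.

    Each criterion's rating is multiplied by its percentage weight; the sum
    is divided by 100 so the result stays on the original rating scale.
    """
    if set(ratings) != set(weights):
        raise ValueError("ratings and weights must cover the same criteria")
    if sum(weights.values()) != 100:
        raise ValueError("percentage weights must sum to 100")
    return sum(ratings[c] * weights[c] for c in ratings) / 100

# Illustrative criteria, percentage weights, and scorecard values (1-5 scale)
weights = {"quality": 40, "partnership": 20, "delivery": 20, "cost": 20}
ratings = {"quality": 4, "partnership": 3, "delivery": 5, "cost": 4}
print(total_score(ratings, weights))  # (4*40 + 3*20 + 5*20 + 4*20) / 100 = 4.0
```

Giving "quality" the largest weight in the example mirrors the literature's observation (Stueland, 2004) that quality generally receives the most weighting overall.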

Other comments point to the existence of other performance measurement tools and resources, such as the Better Business Bureau, financial stability, level of responsiveness to bid invitations, project team member skills, safety records, and references, as components of an overall vendor performance management system. It is not clear on the basis of these comments whether these tools and resources are formally or only informally part of the vendor performance management systems at play within the respondent companies and organizations.

There are several options with respect to the use and implications of various metrics and measurement systems. The Ontario Realty Corporation (2010), for example, uses the average of all of a particular vendor’s scorecards over a three-year period to derive its Vendor Performance Rating (VPR). The VPR can be applied in a Request for Qualification, Tender, Proposal, or Services (Ontario Realty Corporation, 2010). The Ministry of Transportation (MTO, 2007) employs a similar approach in its Consultant Performance and Selection System (CPSS) by measuring past performance through a Corporate Performance Rating (CPR), which is the weighted average of the consultant’s appraisals over three years. Informants also expressed the view that performance management systems ought to include the capacity to reflect trends.
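A trailing three-year average of the kind the Ontario Realty Corporation describes can be sketched like this. The date arithmetic and the simple unweighted mean are assumptions (MTO's CPR, by contrast, weights each appraisal):

```python
from datetime import date, timedelta

def vendor_performance_rating(scorecards, as_of, window_years=3):
    """Average of all scorecard scores dated within the trailing window.

    `scorecards` is a list of (date, score) pairs; returns None when no
    scorecards fall inside the window. The unweighted mean is an assumption.
    """
    cutoff = as_of - timedelta(days=365 * window_years)
    recent = [score for when, score in scorecards if cutoff <= when <= as_of]
    return sum(recent) / len(recent) if recent else None

# Hypothetical scorecard history for one vendor
cards = [
    (date(2008, 1, 15), 2.0),  # outside a 3-year window ending 2012-02-01
    (date(2010, 6, 30), 3.5),
    (date(2011, 11, 5), 4.5),
]
print(vendor_performance_rating(cards, as_of=date(2012, 2, 1)))  # (3.5 + 4.5) / 2 = 4.0
```

Because old scorecards age out of the window, a rating computed this way reflects recent trends rather than a vendor's whole history, which is the behaviour informants asked for.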

The Office of the Procurement Ombudsman (2010) found in its study of vendor performance that a few organizations, such as Public Works and Government Services Canada - Real Property Branch (PWGSC-RPB) and the National Capital Commission (NCC), use key performance indicators (KPIs) as measures of vendor performance. It defines a KPI as “a key measure of performance for a specific activity that is pre-identified by the organization, and is used for determining the success of the vendor in meeting its contractual obligations” (Office of the Procurement Ombudsman, 2010, p. 9).


7.3 Discussion Points

DISCUSSION POINT ONE: CMC-Ontario should assist the Ministry of Government Services Supply Chain Management Division in developing a vendor performance management system which can be used by the broader public sector and its suppliers, including the buyers (clients) represented in the current report. 75% of buyers represented in the current report do not have an existing vendor performance management system, and 92% of respondents in the current survey agree that the management of vendor performance is an important activity for measuring, evaluating and improving the performance of vendors. In the words of one informant, a standardized approach to the measurement of vendor performance is long overdue.

DISCUSSION POINT TWO: The vendor performance measurement system ought to utilize performance contractual clauses that are made available to vendors within the bid documents.

DISCUSSION POINT THREE: Given the opportunity to develop the vendor performance measurement system for the broader public sector, it ought not to include positive and negative performance incentives. Instead, vendors with a poor performance record ought to be put on probation for a defined period of time. The vendor performance management system also ought to be used to determine which vendors receive or do not receive subsequent RFPs.

DISCUSSION POINT FOUR: Vendor evaluations ought to be triggered by contract values in excess of $100,000, and ought to be undertaken at the completion of each stage or phase of the project. This will mitigate punitive evaluations and place the emphasis on continuous improvement.

DISCUSSION POINT FIVE: The vendor performance evaluation system should be designed in such a way that individual vendor evaluations take approximately 30 minutes to complete. Vendor evaluation reports ought to use a four-point numeric scale for measuring performance in order to avoid the ‘central tendency.’

DISCUSSION POINT SIX: “The quality of the final deliverables” and “effective communication throughout the contract” ought to include the most attributes or question dimensions in the evaluation form in order to reflect the importance of these performance elements.

DISCUSSION POINT SEVEN: Vendors and clients ought to have recourse to client/vendor meetings and

debriefings on the vendor evaluation report as dispute resolution techniques. Evaluators ought to be

trained on the use of the vendor performance management system, and evaluations ought to be signed.


7.4 Implementation Considerations

In the open-ended comments, respondents at companies and organizations with existing vendor

performance management policies for outside consultants noted that various personnel were

responsible for undertaking vendor performance evaluations: project managers,

procurement/purchasing staff, business services, corporate managers and even presidents. Informants

noted that vendor performance evaluations should be a business practice or “collaborative tool

which will form the basis of conversation on how to improve both the vendor and the organization’s

ability to manage the particular contract.” Informants envisioned a system in which business and

finance arms, as well as procurement and the procuring department, would have access to and retain

custody of performance evaluations. One informant noted that performance evaluations should be signed.

Respondents in both the interviews and the surveys noted that all personnel involved in undertaking

vendor performance evaluations should be trained on how the system works at all levels. Respondents

cautioned that personnel should be aware of the implications of their vendor evaluations and of how

evaluation results are used within various areas of the company or organization. Personnel should also

be trained on the use of all performance management tools and resources to ensure a consistent

application from one project to the next. Likewise, vendors ought to be informed, through clauses in

their formal contract, of the performance metrics and schedule and of the resources available to

them. According to informants, training should focus on

and emphasize the need for consistency with respect to both use and application.

Although corrective action measures are not well documented in the literature, the Ministry of

Transportation (MTO, 2007) states that infraction reports are issued only for serious contract breaches,

such as failure to comply with the terms and conditions of the agreement; failure to provide adequate

organization, co-operation, personnel or equipment; failure to comply with standards and legislation;

and delayed delivery or failure to complete the project in a timely manner. Stueland (2004) suggests

that in order for a vendor performance program to be successful, a vendor performance policy must be

in place, enforced, and publicly available.

The Ontario Realty Corporation (2010) and the Government of Tasmania (2001) state that if a vendor

management program is to be effective, it must be standardized, streamlined and consistent, and that

it is fundamental to the process that the information be timely, accurate, and a true reflection of

performance. The Office of the Procurement Ombudsman (2010) also identifies the use of automated

systems as a best practice among many organizations, such as DCC and the Government of

Newfoundland and Labrador. The use of an automated system makes it manageable to control vendor

performance, as many organizations deal with a large number of contracts at any given time (Office of

the Procurement Ombudsman, 2010). By contrast, very few of the companies and organizations with

existing performance measurement systems represented in the current survey had automated forms

(11%) or automated templates (18%) for conducting vendor evaluations.


8 BIBLIOGRAPHY

Aberdeen Group, Inc. (2002). The Supplier Performance Measurement Benchmarking Report: Measuring

Supply Chain Success. iSource.

Government of Tasmania. (2001). Performance Reports for Prequalified Contractors and Consultants.

Retrieved from The Government Purchasing Information Gateway:

http://www.purchasing.tas.gov.au/buyingforgovernment/getpage.jsp?uid=4C1F9B61B1F4F980CA256C9400148B03

Kestenbaum, M. I., & Straight, R. L. (1995). Procurement Performance: Measuring Quality, Effectiveness,

and Efficiency. In Public Productivity & Management Review (pp. 200-215). Armonk: M.E.

Sharpe, Inc.

Ministry of Transportation (MTO). (2007, October). RAQS Consultant: Consultant Performance &

Selection System (CPSS). Retrieved from Ministry of Transportation:

https://www.raqsa.mto.gov.on.ca/login/raqs.nsf/english/Text/RAQSPages/B.+Consultant+Heading+-+F.+Consultant+Performance+and+Selection+System+(CPSS)?OpenDocument

MyFloridaMarketPlace.com. (2005). Contract Administrators' Meeting. Tallahassee: State of Florida.

MyFloridaMarketPlace.com. (2011, April). 04_01_2011 vendor performance tracking report. Retrieved

from Department of Management Services - State of Florida:

http://www.dms.myflorida.com/business_operations/state_purchasing/vendor_information/vendor_performance_tracking_vpt/vpt_tracking_reports/04_01_2011_vendor_performance_tracking_report.pdf

Office of the Procurement Ombudsman. (2010). Study on a management approach to vendor

performance. In Chapter 6: Procurement practices review (pp. 9-14). Ottawa: Office of the

Procurement Ombudsman.

Ontario Realty Corporation. (2010, June 2). Vendor Performance Program. Retrieved from Ontario Realty

Corporation: http://www.ontariorealty.ca/Doing-Business-With-Us/Strategic-Sourcing----Bid-Opportunities/Vendor-Performance-Program.htm

Shirouyehzad, H. (2011, April). Efficiency and ranking measurement of vendors by data envelopment

analysis. International Business Research, 4(2), 137-146.

Stueland, V. J. (2004). Supplier evaluations best practices and creating or improving your own

evaluation. ISM's 89th Annual International Supply Management Conference. San Antonio: Wells

Fargo Services Company.

Survey Analytics. (2011). Vendor performance management. Retrieved from Survey Analytics enterprise

research platform: http://www.surveyanalytics.com/


The Department of Housing. (2006). Improving vendor performance: Vendor performance reports in

contract administration. Georgia: Georgia Institute of Technology.

Weber, C. A. (1996). A data envelope analysis approach to measuring vendor performance. Supply Chain

Management, 1(1), 28-39.


9 APPENDIX A

Figure 9.0.1 The Central Tendency

Overall Rating Range      Share of Evaluations
>= 1.00 to <= 2.80         3%
>= 2.81 to <= 3.20        66%
>= 3.21 to <= 5.00        31%

N = 5,105

Source: State of Florida MyFloridaMarketPlace Vendor Performance Tracking Report
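As an illustrative sketch only, the banding used in Figure 9.0.1 could be reproduced from raw overall ratings as follows. The sample ratings below are invented for the example; the report's actual distribution (N = 5,105) is the one shown in the figure.

```python
from collections import Counter

# Illustrative sketch of the Figure 9.0.1 bands, tallied from hypothetical
# overall ratings on the 1.00-5.00 scale. The ratings are invented.

ratings = [2.5, 3.0, 3.0, 3.1, 2.9, 3.5, 4.2, 3.0, 2.95, 3.15]

def band(score: float) -> str:
    """Assign an overall rating to one of the three Figure 9.0.1 bands."""
    if score <= 2.80:
        return ">= 1.00 to <= 2.80"
    if score <= 3.20:
        return ">= 2.81 to <= 3.20"
    return ">= 3.21 to <= 5.00"

shares = Counter(band(s) for s in ratings)
for name in sorted(shares):
    print(f"{name}: {100 * shares[name] / len(ratings):.0f}%")
```

A distribution dominated by the middle band, as in the invented sample here, is exactly the central-tendency pattern the figure documents.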


Figure 9.0.2 Consequences of Contract Administration Problems for Contracted Services

Contract Administration Problem   No Effect   Delay < 10 Days   Delay > 10 Days   Cost Increase < 10%   Cost Increase > 10%   Termination
Wrong Product                     48.8%       23.1%             7.9%              8.7%                  4.1%                  7.4%
Delays                            30.8%       29.7%             18.1%             10.9%                 4.7%                  5.8%
Definition of Acceptance          38.7%       22.7%             16.4%             9.8%                  5.5%                  7.0%
Change Order                      31.8%       17.8%             12.5%             20.8%                 10.6%                 6.4%
Conflict                          31.3%       25.7%             17.3%             8.1%                  7.7%                  9.9%
Other Sources                     48.5%       17.0%             12.4%             11.2%                 7.9%                  2.9%
Poor Performance                  26.7%       18.1%             18.4%             10.8%                 8.3%                  17.7%
Risk of Failure/Termination       33.0%       21.9%             13.0%             8.9%                  8.1%                  15.2%
Subcontractors                    41.5%       19.0%             11.7%             12.5%                 8.1%                  7.3%
Costs                             29.2%       14.4%             12.5%             22.9%                 12.2%                 8.9%
Consequences                      35.6%       21.0%             14.2%             12.5%                 7.8%                  9.0%

N = 2,228

Source: Davison, B. and Sebastian, R.J., An Analysis of the Consequences of Contract Administration Problems for Contract Types, 2009.


Figure 9.0.3 Consequences of Contract Administration Problems for Professional Services

Contract Administration Problem   No Effect   Delay < 10 Days   Delay > 10 Days   Cost Increase < 10%   Cost Increase > 10%   Termination
Wrong Product                     56.9%       14.4%             13.0%             5.1%                  6.5%                  4.2%
Delays                            26.3%       17.6%             27.8%             13.7%                 11.8%                 2.7%
Definition of Acceptance          41.9%       15.7%             18.3%             10.9%                 9.2%                  3.9%
Change Order                      26.0%       12.7%             17.9%             19.5%                 20.7%                 3.3%
Conflict                          31.0%       18.4%             21.5%             10.0%                 12.6%                 6.5%
Other Sources                     52.7%       14.5%             11.4%             10.0%                 7.7%                  3.6%
Poor Performance                  27.9%       16.7%             19.7%             13.4%                 8.6%                  13.8%
Risk of Failure/Termination       35.7%       15.3%             17.6%             10.6%                 8.2%                  12.5%
Subcontractors                    41.5%       14.1%             14.5%             11.2%                 9.1%                  9.5%
Costs                             25.4%       13.6%             14.3%             21.3%                 17.6%                 7.7%
Consequences                      35.8%       15.3%             17.8%             12.8%                 11.4%                 6.9%

N = 2,264

Source: Davison, B. and Sebastian, R.J., An Analysis of the Consequences of Contract Administration Problems for Contract Types, 2009.