KMPI-00-Antecedentes

Journal of Information Science, 32 (1) 2006, pp. 17–38. DOI: 10.1177/0165551506059220
Mu-Yen Chen and An-Pin Chen, 'Knowledge management performance evaluation: a decade review from 1995 to 2004'
The online version of this article can be found at: http://jis.sagepub.com/content/32/1/17
Published by SAGE Publications (http://www.sagepublications.com) on behalf of the Chartered Institute of Library and Information Professionals.
Additional services and information for the Journal of Information Science:
Email alerts: http://jis.sagepub.com/cgi/alerts
Subscriptions: http://jis.sagepub.com/subscriptions
Reprints: http://www.sagepub.com/journalsReprints.nav
Permissions: http://www.sagepub.com/journalsPermissions.nav
Citations: http://jis.sagepub.com/content/32/1/17.refs.html
Version of Record: Feb 13, 2006
Downloaded from jis.sagepub.com at Pontificia Universidad Javeriana, Biblioteca Alfonso Borrero Cabal, S.J., on April 19, 2012.


Transcript of KMPI-00-Antecedentes


  • Knowledge management performance evaluation: a decade review from 1995 to 2004

    Journal of Information Science, 32 (1) 2006, pp. 17–38 © CILIP, DOI: 10.1177/0165551506059220

    Mu-Yen Chen and An-Pin Chen

    Institute of Information Management, National Chiao Tung University, Taiwan

    Received 25 March 2005; revised 9 May and 1 June 2005

    Abstract.

    In this paper, the development of knowledge management (KM) was surveyed, using a literature review and classification of articles from 1995 to 2004. With a keyword index and article abstract, we explored how KM performance evaluation has developed during this period. Based on a scope of 108 articles from 80 academic KM journals (retrieved from six online databases), we surveyed and classified methods of KM measurement, using the following eight categories: qualitative analysis, quantitative analysis, financial indicator analysis, non-financial indicator analysis, internal performance analysis, external performance analysis, project-orientated analysis and organization-orientated analysis, together with their measurement matrices for different research and problem domains. Future development directions for KM performance evaluation are presented in our discussion. They include: (1) KM performance measurements have tended towards expertise orientation, while evaluation development is a problem-orientated domain; (2) different information technology methodologies, such as expert systems, knowledge-based systems and case-based reasoning, may be able to evaluate KM as simply another methodology; (3) the ability to continually change and obtain new understanding is the driving power behind KM methodologies, and should be the basis of KM performance evaluations in the future.

    Keywords: knowledge management; performance evaluation; literature survey

    1. Introduction

    A knowledge-based economy is emerging, and knowledge management (KM) is being rapidly disseminated in academic circles, as well as in the business world. While an increasing number of companies have launched knowledge management initiatives, a large proportion of these initiatives retain a technical perspective. The problem with this type of focus is the exclusion and neglect of potential benefits that may be derived from knowledge management. The following types of question are proliferating: is it really worthwhile to invest in KM? Has our implementation of KM been a success? Is our KM system productive and effective?

    Recent surveys indicate that issues such as measuring the value of KM and evaluating KM performance are of great importance to managers in places like Asia [1], the United States [2] and the United Kingdom [3]. Given the increasing role of KM in upgrading business competition, the interest of managers in measuring and evaluating both KM performance and its benefits is not surprising [4]. This brings up an important research issue: how do most firms that have initiated KM develop appropriate metrics to gauge the effectiveness of their initiative? In other words, there is

    Correspondence to: Mu-Yen Chen, Institute of Information Management, National Chiao Tung University, 1001 Ta Hsueh Road, Hsinchu, Taiwan 30050. E-mail: [email protected]


  • Knowledge management performance evaluation

    a need for metrics to justify KM initiatives. Our research objective was therefore to analyze a variety of evaluation perspectives, in order to estimate knowledge management performance.

    This paper is focused on surveying knowledge management development through a literature review and classification of articles from 1995 to 2004, in order to explore KM performance evaluation during that period. This survey methodology is valuable for a number of reasons. First, it brings an issue into focus by defining and detailing its various characteristics. Second, the results of a survey are typically quantified and therefore amenable to statistical analysis. Third, statistical inference allows one to extend the results obtained from a sample of respondents to a large population, thereby permitting a wider application. Fourth, the survey methodology is fast and straightforward, compared to many other research methods. In addition, our goal was not only to examine the research trend in KM performance evaluation changes, but also to understand the gap between the academic and the business world. For this reason we used a questionnaire survey to investigate high-technology organizations and verify if there really exists a gap in academic research. As a result, the questionnaire survey had a significant relationship with the dependent variables. In other words, it was reasonable to use qualitative, quantitative, internal performance, external performance, project-orientated and organization-orientated analyses to evaluate KM performance. Hence, these six perspectives will be used in our literature survey methodology to classify and evaluate KM performance.

    There are two reasons for choosing this period to survey knowledge management development. First, the knowledge spiral was proposed to corporations and organizations in 1995, and this model plays important roles, not only in conducting academic research studies, but also in creating, exploiting and recycling knowledge within the business environment. Second, there is no doubt that KM existed prior to 1995. KM has been rapidly disseminated in academic circles as well as in the business world because of Nonaka and Takeuchi's influential book. However, while an increasing number of companies launched knowledge management initiatives, KM still remained in the theory, model, and application development phases. Recently, the research trend has moved towards how to measure KM performance. Therefore, our goal was to examine the research trend in KM performance evaluation changes, using two phases, distinguishing the first five years (1995–1999) from the second five years (2000–2004). Finally, we hope that the distinction between these two five-year periods will be evident. The history of KM evolution, over the past decade, can be seen in Figure 1.

    This literature survey began in January 2005. It was based on a search for 'knowledge management' in the keyword index and article abstract within the ISI, Elsevier SDOS, IEEE Xplore, EBSCO, Ingenta and Wiley InterScience online databases, for the period from 1995 to 2004, in which 3699 articles were found. After topic filtering, there remained 108 articles, from 80 journals, related to the keyword 'knowledge management performance evaluation'. In addition, ISI Web of Knowledge, an integrated Web-based platform, provides high-quality content and the tools to access, analyze, and manage research information. We also used the keyword and subject category functions to analyze the selected articles. We found that the first five fields were Management (20%), Computer Science and Information Systems (19.2%), Information Science and Library Science (16.2%), Operations Research and Management Science (12.7%), and Computer Science and Artificial Intelligence (10.4%). Based on the scope of these 108 articles, from 80 academic KM journals, this paper surveyed and classified KM measurements into the following eight categories: qualitative analysis, quantitative analysis, financial indicator analysis, non-financial indicator analysis, internal performance analysis, external performance analysis, project-orientated analysis and organization-orientated analysis, together with their measurement matrices for different research and problem domains.

    The basic underlying assumption is that knowledge may be viewed from a unified perspective; it circulates in the organization, creating knowledge assets, and influences the performance of the organization. It has multifaceted characteristics, such as: state of mind, object, having access to information, or the potential for influencing future action. We summarized the distinctions between these perspectives about knowledge in Table 1.

    The remainder of the paper is organized as follows. Section 2 explains the methodology used to classify KM measurements into the above eight categories. Section 3 presents the survey results of KM performance evaluation, based on the above categories. In Section 4, a discussion offers suggestions for the future development of KM performance evaluation, while in Section 5 we present a brief conclusion.


  • M.Y. CHEN AND A.P. CHEN

    2. Methodology of the literature review

    2.1. Measures

    In this study, we have examined various KM performance evaluation perspectives, using path analysis with structural equation modeling (SEM). A survey, conducted among existing KM project members, who were corporate employees, was analyzed using LISREL software. Path analysis is an extension of the regression model, used to test the fit of the correlation matrix against two or more causal models, which are being


    Fig. 1. The KM process [5–12].

    Table 1. Diverse perspectives of knowledge and implications for KM [7, 13]

    State of mind: Knowledge is the state of knowing and understanding. KM involves enhancing an individual's learning and understanding through the provision of information.

    Object: Knowledge is an object to be stored and manipulated. The key KM issue is building and managing knowledge stocks.

    Process: Knowledge is a process of applying expertise. The KM focus is on knowledge flows and the process of creation, conversion, circulation and carrying out.

    Access to information: Knowledge is a condition of access to information. The KM focus is organized access to, and retrieval of, content.

    Capability: Knowledge is the potential to influence action. KM is about building core competencies and understanding strategic know-how.


    compared by the researcher. A regression is done for each variable in the model that is dependent on others, which the model indicates as causes. The regression weights, predicted by the model, are compared with the observed correlation matrix for the variables, and a goodness-of-fit statistic is calculated. The best fitting, of two or more models, is selected by the researcher as the best model for the advancement of the theory. Path analysis, in the main, has two models: Path Analysis with Observed Variables (PA-OV) and Path Analysis with Latent Variables (PA-LV). In this study, we adopted the PA-OV model to observe each performance evaluation perspective, because there were no latent variables in the PA-OV model; all variables in this model were manifest variables.
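    The model-comparison step described above can be sketched numerically. The following is a minimal illustration with hypothetical correlation values (not the study's data): a candidate model's implied correlations are compared with the observed ones through a root-mean-square residual, one simple goodness-of-fit measure.

```python
import numpy as np

# Observed correlations among three illustrative variables (hypothetical values).
observed = np.array([
    [1.00, 0.40, 0.30],
    [0.40, 1.00, 0.50],
    [0.30, 0.50, 1.00],
])

# Correlations implied by a candidate path model (also hypothetical).
implied = np.array([
    [1.00, 0.38, 0.28],
    [0.38, 1.00, 0.52],
    [0.28, 0.52, 1.00],
])

def fit_discrepancy(obs, imp):
    """Root-mean-square residual between observed and model-implied
    correlations (off-diagonal elements only) -- a simple fit measure."""
    mask = ~np.eye(obs.shape[0], dtype=bool)
    return float(np.sqrt(np.mean((obs[mask] - imp[mask]) ** 2)))

# Of two or more candidate models, the one with the smaller discrepancy fits better.
print(fit_discrepancy(observed, implied))
```

    LISREL reports chi-square-based fit statistics rather than this raw residual, but the underlying logic is the same: the smaller the gap between the implied and observed matrices, the better the model.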

    Our research has emphasized the evaluation of knowledge management performance. We reviewed the literature on the following categories: qualitative analysis, quantitative analysis, internal performance analysis, external performance analysis, project-orientated analysis and organization-orientated analysis. Moreover, we obtained 21 items representing various dimensions underlying KM performance evaluation, and these were used to form the initial item pool for the scale in the questionnaire. To make sure that important aspects of KM performance were not omitted, we conducted experience surveys and personal interviews with two professionals, four college teachers, and two CKOs (chief knowledge officers). They were asked to review the initial item list of the scale, and approved the items. Consequently, the 21-item list was considered to constitute a complete domain for KM performance measurement.

    In addition, we designed a mediation variable, 'Knowledge Management Evaluation' (KME), which presented the total score from the previous six main items. The higher the KME score, the better the knowledge management evaluation. Although it is important to understand the KME score, our aim is to analyze KM performance in the business. In other words, KME is just an evaluation tool to present the KM performance. As a result, the variable 'Knowledge Management Performance' (KMP) is dependent on the KME status, which uses a 10-point scale, scored by the administrator. Again, the higher the score, the more successful the knowledge management performance. Finally, we also designed a moderator, 'Time', to prove that there was a significant relationship between KME and KMP through T-value analysis. Therefore, there was a total of nine observation variables in our model.

    For that reason, our exploratory instrument involved 24 items (as shown in the Appendix), with the one variable, KMP, capturing perceived overall performance given the success of the knowledge management criterion. The instrument was developed using a 10-point Likert-type scale, with anchors ranging from 'strongly unimportant' to 'strongly important'. For each question, respondents were asked to circle the response which best described their level of agreement. The KMP can be used to analyze the criterion-related validity of the instrument and to measure overall significant relationships prior to detailed analysis. After careful examination of the results of experience surveys and interviews, the statements were further adjusted to make their wording as precise as possible.

    2.2. Subjects

    The data used to test the research model was obtained mainly from five international organizations: the Taiwan Semiconductor Manufacturing Corporation (TSMC), the United Microelectronics Corporation (UMC), the Coretronic Corporation, the Trea Autotech Corporation, and IBM Taiwan. Each company already had KM implemented, and each respondent had experience involving KM projects or in using a KM system. The respondents completed a self-administered, 24-item questionnaire. For each question, respondents were asked to circle the response which best described their level of agreement with the statements. Of the 300 surveyed, 269 useful responses were returned, and thus the response rate was 89.6%. The respondents averaged 34 years of age and had an average of five years' experience in KM; the male-to-female ratio was approximately 1.8 to 1. Thirty-nine percent had completed a college or university degree only, while 45 percent had obtained postgraduate degrees. Table 2 gives the detailed information for the questionnaire survey.

    2.3. Model assumptions

    The KME and KMP in our model were dependent variables. When the KME is a dependent variable, the Qualitative (X1), Quantitative (X2), Internal Performance (X3), External Performance (X4), Project-orientated (X5), Organization-orientated (X6), and Time (X7) variables are all independent variables. When the KMP is a dependent variable, the Time (X7) and KME (X8) variables are independent variables. The regression equations are shown as follows:


    Y1 = b1X1 + b2X2 + b3X3 + b4X4 + b5X5 + b6X6 + b7X7 + a1 (1)

    Y2 = b7X7 + b8X8 + a2

    where a is the intercept, and b is the regression coefficient.
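    As a sketch of how equations of this form can be estimated by ordinary least squares (the data below is simulated; the coefficient values are illustrative, loosely echoing the standardized estimates reported later in Table 3, and all variable constructions are our own assumptions, not the study's raw responses):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 269  # number of usable questionnaire responses in the survey

# Hypothetical standardized scores for the six perspectives (X1..X6) and Time (X7).
X = rng.normal(size=(n, 7))

# Simulated outcomes for illustration only; the true data is not reproduced here.
kme = X @ np.array([0.25, 0.21, 0.04, 0.04, 0.18, 0.13, 0.17]) \
      + rng.normal(scale=0.5, size=n)
kmp = 0.2 * X[:, 6] + 0.38 * kme + rng.normal(scale=0.5, size=n)

def ols(design, y):
    """Least-squares fit with an intercept column; returns (intercept, coefficients)."""
    A = np.column_stack([np.ones(len(y)), design])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[0], beta[1:]

a1, b = ols(X, kme)                                       # equation (1): Y1 = KME
a2, (b7, b8) = ols(np.column_stack([X[:, 6], kme]), kmp)  # Y2 = KMP on X7 and X8
```

    With simulated data of this size, the fitted coefficients land close to the generating values, which is the sense in which path analysis compares model-predicted weights with the observed data.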

    Because path analysis is an SEM application, theSEM equation is shown below:

    y = α + Βy + Γx + ζ (2)

    where α is the intercept, Β is the matrix of regression coefficients among the dependent variables, Γ is the matrix of regression coefficients between the dependent and independent variables, and ζ is the disturbance term.

    The model, shown in Figure 2, has four assumptions, which are described below:
    (1) The variables can all be observed.
    (2) Each dependent variable has an explained disturbance (ζ1, ζ2).
    (3) y1 has seven structural parameters, from γ11 to γ17.
    (4) y2 has two structural parameters, γ27 and β21.

    2.4. Path and T-value analysis

    The relationships related to the performance evaluation perspectives were tested using LISREL path


    Table 2. Details of the questionnaire survey

                           TSMC       UMC        Coretronic  Trea Autotech  IBM, Taiwan
    Industry               IC Manuf.  IC Manuf.  LCD-TFT     Automation     IT
    Survey                 60         60         60          60             60
    Respondent department  KM         KM         KM          IT             KM
    Position               CKO and    CKO and    CKO and     Manager and    Consultants
                           Members    Members    Members     Members
    Average experience     6          4          4           6              5
    Average age            38         36         31          31             34

    Fig. 2. Path analysis model.


    analysis. This provided estimates of the parameters and tests of fit for the linear structural equation model; Figure 3 shows these model relationships as well as the fit measures for the model. The chi-square statistics tested the hypothesis that the proposed model could generate the observed data. What this means is that when the chi-square values are statistically significant, the observed data can be closely patterned by the hypothesized model. Figure 4 shows the T-value analysis between the six performance evaluation perspectives and KME and KMP. The main findings are described as follows:
    (1) Qualitative analysis and quantitative analysis are important in evaluating KM. This is because γ11 and γ12 are 0.25 and 0.21, respectively, and similarly, γ11 (t = 4.38) and γ12 (t = 3.70) are significant at the 0.001 level, showing that both the qualitative and quantitative analyses are observable indicators for KM evaluation. In addition, γ15 (t = 3.01) and γ16 (t = 2.20) are also significant at the 0.01 and 0.05 levels, respectively.

    (2) None of the independent variables had a direct effect on KMP. However, these independent variables may influence the KMP through the mediation variable KME. As shown in Figure 2, all independent variables had significance in the chi-square statistic tests, especially X1, X2, X5 and X6.

    (3) Time (X7) had a high total-effects value in this questionnaire analysis. The γ17 and γ27 were 0.13 and 0.2, respectively. In the meantime, X7 had an indirect effect, with KMP (Y2), of 0.06. Therefore, the total effect was 0.26 between Time (X7) and KMP (Y2). The γ27 (t = 4.52) also had significance at the 0.001 level, showing Time as a critical observable indicator for KM evaluation and performance.
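    The decomposition behind the Time finding can be checked arithmetically using the standardized coefficients reported in Table 3: the indirect effect is the product of the two paths Time → KME and KME → KMP, and the total effect adds the direct path.

```python
# Effect decomposition for Time (X7) on KMP (Y2), using the
# standardized coefficients reported in Table 3.
direct_time_kmp = 0.2    # direct path: Time -> KMP
time_kme = 0.17          # path: Time -> KME
kme_kmp = 0.38           # path: KME -> KMP

indirect = round(time_kme * kme_kmp, 2)  # 0.17 * 0.38 = 0.0646, i.e. 0.06
total = direct_time_kmp + indirect       # 0.26, matching the reported total effect

print(indirect, total)
```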

    2.5. Results

    The detailed questionnaire analyses are summarized in Table 3. From these statistics, some interesting facts may be noted:
    (1) Qualitative and quantitative analyses had the most significant relationships with KME and KMP. The aim of the quantitative analysis was to present the extent of the impact on both decision making and task performance, using historical data that is easily available, relevant, accurate and timely. This type of evaluation may avoid the drawbacks of qualitative analysis, especially in the subjective judgments of empirical results. Therefore, KM performance was measured by two quantitative methods: financial and non-financial indicators. Using two indicators to analyze the data was more effective than using quantitative analysis alone.

    (2) Project-orientated and organization-orientated analyses had the second most significant relationships with KME and KMP.

    (3) Internal and external performance analyses had the relationships with the lowest significance to KME and KMP. However, we found that γ13 (t = 0.61) and γ14 (t = 0.78) still had positive relationships with KME and KMP.

    (4) In this research, we used an interview-by-questionnaire methodology in five well-known international organizations. Each company had already implemented KM, and each respondent had experience involving KM projects or in using a KM system. Our questionnaire survey included six independent variables: qualitative analysis, quantitative analysis, internal performance analysis, external performance analysis, project-orientated analysis and organization-orientated analysis. In addition, we designed a mediation variable, 'Knowledge Management Evaluation' (KME), since it presented the total score from the previous six independent variables. As shown in Table 3, the independent variables had a significant relationship with the KME in the chi-square statistic tests. In other words, it was suitable and reasonable to use qualitative, quantitative, internal performance, external performance, project-orientated and organization-orientated analyses to evaluate KM performance. Hence, these six perspectives were used in our research to classify and evaluate KM performance. Most important of all, we used the results from path analysis to ensure our classification of KM performance evaluation was compatible with our literature survey. In this way, the gap between the academic and business worlds has been reduced.

    3. Knowledge management evaluation perspective

    3.1. Qualitative analysis

    A universally accepted definition of KM does not yet exist. While there is debate as to whether knowledge itself is a cognitive state, a process or an object, the



    Fig. 4. T-value analysis.

    Fig. 3. LISREL path diagram for KM performance evaluation. (Qual: qualitative; Quan: quantitative; IP: internal performance; EP: external performance; PO: project-orientated; OO: organization-orientated.)


    description of KM as a process is based on understanding an organization as a knowledge system [14]. As Polanyi observed, 'we can know more than we can tell' [15]. The notion of tacit knowledge was introduced by Polanyi, a philosopher made known to a larger audience by being quoted in the writings of Kuhn in 1962 [16]; this notion has since had a renaissance, due to the writings of Nonaka [17] and Nonaka and Takeuchi [11].

    While referring to and building on the arguments of Polanyi, different scholars have arrived at contradictory conclusions. Cook and Brown argue, in what they claim is agreement with Polanyi, that explicit and tacit are two distinct forms of knowledge, and that one form


    Table 3. Summary of path analysis

                                      Knowledge management     Knowledge management
                                      evaluation (Y1)          performance (Y2)
    Independent variables             Standardized  t-values   Standardized  t-values

    Qualitative (X1)
      Direct effects                  0.25          4.38***    –             –
      Indirect effects                –             –          0.09          3.29***
      Total effects                   0.25          4.38***    0.09          3.29***

    Quantitative (X2)
      Direct effects                  0.21          3.70***    –             –
      Indirect effects                –             –          0.08          3.24**
      Total effects                   0.21          3.70***    0.08          3.24**

    Internal performance (X3)
      Direct effects                  0.04          0.61       –             –
      Indirect effects                –             –          0.01          0.60
      Total effects                   0.04          0.61       0.01          0.60

    External performance (X4)
      Direct effects                  0.04          0.78       –             –
      Indirect effects                –             –          0.02          0.77
      Total effects                   0.04          0.78       0.02          0.77

    Project-orientated (X5)
      Direct effects                  0.18          3.01**     –             –
      Indirect effects                –             –          0.07          2.75**
      Total effects                   0.18          3.01**     0.07          2.75**

    Organization-orientated (X6)
      Direct effects                  0.13          2.20*      –             –
      Indirect effects                –             –          0.05          2.09*
      Total effects                   0.13          2.20*      0.05          2.09*

    Time (X7)
      Direct effects                  0.17          3.56***    0.2           1.37
      Indirect effects                –             –          0.06          3.15**
      Total effects                   0.17          3.56***    0.26          4.52***

    Knowledge management evaluation (X8)
      Direct effects                  –             –          0.38          6.75***
      Indirect effects                –             –          –             –
      Total effects                   –             –          0.38          6.75***

    * t-value > 1.96, p < 0.05. ** t-value > 2.58, p < 0.01. *** t-value > 3.29, p < 0.001.


    cannot be made out of, or changed into, the other [18]. In contrast, Tsoukas, also building on Polanyi, claims that tacit and explicit knowledge are mutually constituted and should not be viewed as two separate types of knowledge [19]. In a critique of Nonaka, Tsoukas further argues that tacit knowledge is not explicit knowledge internalized. In fact, tacit knowledge is inseparable from explicit knowledge, since tacit knowledge is the necessary component of all knowledge. It seems that most scholars share the opinion of Cook and Brown, that it is useful to treat tacit knowledge as separate from explicit knowledge.

    Consequently, because of the methodology, qualitative means are suitable to measure tacit knowledge. Qualitative research includes an array of approaches that share some non-quantitative methods. Several have a well-established and widely respected place in the social sciences and social work. The qualitative research approach has been refined by using the outcomes of pilot studies and reviews by researchers in organizational learning. For example, the success of knowledge sharing in organizational culture is not only technological but also related to behavioral factors [20–22]. Besides, expert interviews, critical success factors (CSFs) and questionnaires are used to implement qualitative methods in exploring specific human problems.

    From the organizational perspective, attention to an organization's internal controls has increased significantly since the 1990s. Although management is ultimately responsible for ensuring that internal controls are adequate, managers often lack knowledge of internal control concepts. Changchit et al. used a questionnaire in examining an expert system which could facilitate the transfer of internal control knowledge to management [23]. The results indicated that expert systems are viable aids for transferring internal control knowledge to managers whose work experience is outside the field of accounting and control systems. Longbottom and Chourides reported on interviews with key staff within organizations, at various stages of approaching and deploying KM programs [24, 25]. In a follow-up paper, the research investigated issues concerning the CSFs and measurements of KM, establishing practical and key factors likely to enhance successful implementation. It accessed a range of critical factors and identified appropriate measures over five organizational perspectives: strategy; human resource management; information technology; quality; and marketing [26]. Furthermore, ASSESS is a prototype decision support system for managing tacit assessment knowledge, which uses knowledge management system techniques [27]. These techniques include databases, internet architecture and artificial intelligence. The qualitative analysis framework methodology is categorized in Table 4.

    3.2. Quantitative analysis

    Returning to the literature, we learn that Nonaka and Takeuchi defined explicit knowledge or codified knowledge as knowledge that can be articulated through formal language, including grammatical statements, mathematical expressions, specifications and manuals [11]. Such explicit knowledge, they concluded, can be transmitted easily and formally between individuals. Choo suggested that explicit knowledge is knowledge made manifest through language, symbols, objects and artifacts [28].

    The aim of quantitative analysis is to present the extent of the impact on both decision making and task performance, using historical data that is easily available, relevant, accurate and timely. This evaluation can avoid the drawbacks of qualitative analysis, especially in the subjective judgment of empirical results. Therefore, a quantitative research approach is designed to represent a tangible, visible and comparable ratio. In other words, quantitative analysis can be used to measure the explicit knowledge of an organization or an individual, with both financial and non-financial indicators; this is discussed below. Table 5 shows the benefits and classes of KM, with qualitative and quantitative indicators.

    3.2.1. Financial indicator analysis. Traditional quantitative methods focus on well-known financial measures, such as analyses of financial statements, payback periods, return on investment (ROI), net present value (NPV), return on knowledge (ROK), and Tobin's q. These methods are best suited to measure daily transaction processing system values.
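    Most of the indicators named above reduce to short formulas. A minimal sketch follows; the cash-flow figures are illustrative and do not come from any of the surveyed studies.

```python
# Minimal sketches of the financial indicators named above (illustrative inputs).

def npv(rate, cash_flows):
    """Net present value of cash flows, where cash_flows[0] occurs now (time 0)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(gain, cost):
    """Return on investment: net gain over cost."""
    return (gain - cost) / cost

def payback_period(cash_flows):
    """Number of periods until cumulative inflows repay the initial outlay
    (cash_flows[0] is the negative investment); None if never repaid."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None

def tobins_q(market_value, replacement_cost):
    """Tobin's q: market value of the firm over replacement cost of its assets."""
    return market_value / replacement_cost

project = [-1000, 400, 400, 400, 400]    # hypothetical KM project cash flows
print(round(npv(0.1, project), 2))       # value at a 10% discount rate, ~267.95
print(payback_period(project))           # repaid during period 3
```

    The text's later point follows directly from these definitions: Tobin's q, for instance, depends on a replacement-cost denominator that conventional accounting understates when intangibles are expensed rather than capitalized.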


    Table 4. Qualitative analysis

    Research methodology        Authors
    Questionnaire               Changchit et al. (2001)
    Expert interviews           Longbottom and Chourides (2001); Longbottom and Chourides (2002)
    Critical success factors    Chourides et al. (2003)
    Decision support system     Mitri (2003)

    See References [23–27].


    The goal is to expand business leaders' knowledge of the key drivers of customer satisfaction and business process excellence, strengthening their skills to develop profitable growth strategies based on customer value added (CVA) [29]. Laitamaki and Kordupleski used an ROI index to evaluate KM projects and performance in CVA. From the managerial perspective, Stein et al. deployed a knowledge-based system, which was designed to automate tasks previously performed manually, train new staff members, and capture knowledge, to enable a university organization to improve services. Performance evaluation used NPV to diagnose the project outcome. Finally, the system could be viewed as an estimation tool, giving a competitive advantage to the organization [30].

    From an empirical point of view, it is well known that Tobin's q ignores replacement costs for intangible assets, because of the accounting treatment of intangibles [31]. Tangible assets are capitalized and reported on firms' balance sheets. In contrast, intangibles are expensed, i.e. written off on the income statement, along with regular expenses such as wages, rents and interest. As a result, the book value of assets does not reflect the stock of intangibles resulting from cumulative investments; market value does, however. In fact, it is a fairly common practice, in studies using Tobin's q as a measure of corporate performance, to correct the denominator of q for the presence of such intangibles. Examples include knowledge capital [31, 32] and customer assets [33]. Villalonga also used Tobin's q to test empirically the hypothesis that the greater the intangibility of a firm's resources, the greater the sustainability of its competitive advantage [34]. The results suggest that intangibles can help firms to maintain a persistent advantage.

    Stein et al. also presented a knowledge-based system to assist furnace production staff in diagnosing and correcting faults in electron beam guns, which are used to melt titanium [35]. This project used payback period analysis to measure future cash flows over a three-year period. In financial strategy, bank failure prediction is an important issue for the regulators of the banking industry. Very often, bank failures are due to financial distress. Early warning systems (EWS) may be able to identify the inherent traits of financial distress, based on financial covariates derived from publicly available financial statements. An EWS is a knowledge management system which uses a knowledge base to aid in bank regulation [36].
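    The denominator correction for Tobin's q described above can be sketched numerically. The figures below are illustrative only and are not taken from the cited studies; the point is simply that counting an estimate of the stock of knowledge capital in the denominator shrinks the apparent excess of market value over book assets.

```python
def tobins_q(market_value: float, replacement_cost: float,
             intangibles: float = 0.0) -> float:
    """Tobin's q: market value over replacement cost of assets.

    Passing an estimate of the stock of intangibles (e.g. knowledge
    capital) corrects the denominator, in the spirit of the studies
    cited above.
    """
    return market_value / (replacement_cost + intangibles)

# Illustrative figures only (units of, say, $ millions).
raw_q = tobins_q(500.0, 200.0)               # 2.5: large apparent excess value
corrected_q = tobins_q(500.0, 200.0, 150.0)  # 500 / 350, roughly 1.43
```

    The gap between the raw and corrected values is one way to read the "stock of intangibles" argument in [31, 32].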

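    Payback period analysis, as used by Stein et al. above, can be sketched as follows. The cash flows are hypothetical, and the linear interpolation within the recovery year is a common convention rather than necessarily the one used in [35].

```python
def payback_period(initial_outlay, cash_flows):
    """Years until cumulative cash flows recover the initial outlay.

    Interpolates linearly within the recovery year; returns None if the
    outlay is never recovered over the horizon given.
    """
    cumulative = 0.0
    for year, cf in enumerate(cash_flows, start=1):
        prior = cumulative
        cumulative += cf
        if cumulative >= initial_outlay:
            return year - 1 + (initial_outlay - prior) / cf
    return None

# Hypothetical project: 100 up front, three years of cash flows.
print(payback_period(100.0, [40.0, 50.0, 60.0]))  # about 2.17 years
```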
    Unfortunately, evaluation methods which rely on financial measures are not well suited for complicated IT applications. These systems typically seek to provide a wide range of benefits, including many that are intangible in nature. For example, it is difficult to quantify the full value of a point-of-sale (POS) system [37] or an enterprise resource planning (ERP) system [38].

    A number of researchers have written about the use of option models in IT investment decision making. The pioneering work of Dos Santos [39] employed Margrabe's exchange option model [40] to value an IS project using a novel technology for testing. He argued that the option model would be better than NPV for evaluating the new IT project. Similarly, Kambil et al. [41] used the Cox–Rubinstein binomial option pricing model [42] to determine whether or not a pilot project should be undertaken.
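    A minimal sketch of the binomial option valuation mentioned above, here for a European call, may make the mechanics concrete. The parameter values are illustrative; mapping an IT project's expected benefits and costs onto the underlying price S and strike K is the modelling step the cited papers address.

```python
import math

def crr_call(S, K, r, sigma, T, n):
    """European call value under the Cox-Ross-Rubinstein binomial model.

    S: underlying value, K: strike, r: risk-free rate, sigma: volatility,
    T: time to maturity in years, n: number of binomial steps.
    """
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor per step
    d = 1.0 / u                           # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    value = 0.0
    for j in range(n + 1):                # sum over terminal nodes
        payoff = max(S * u**j * d**(n - j) - K, 0.0)
        value += math.comb(n, j) * p**j * (1.0 - p)**(n - j) * payoff
    return math.exp(-r * T) * value

# As n grows, this approaches the Black-Scholes value (about 10.45 here).
value = crr_call(100.0, 100.0, 0.05, 0.2, 1.0, 500)
```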

    For a software platform, several options may be relevant. In an analogy to Kester's growth options for firms [43], Taudes investigated the evaluation of software growth options [44], which could bring valuable software platform benefits.
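    The Black–Scholes valuation applied to IT investments in the timing studies discussed below can be sketched with the standard closed form. This is illustrative only: real-options applications map the present value of expected IT benefits to S and the deployment cost to K, which is itself a modelling judgment.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes value of a European call (no dividends)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

print(round(black_scholes_call(100.0, 100.0, 0.05, 0.2, 1.0), 2))  # 10.45
```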

    Benaroch and Kauffman [37] investigated the problem of investment timing, using the Black–Scholes model in a real-world case study dealing with the development of a point-of-sale (POS) debit service. Their contribution did not ask whether an investment should be undertaken, but when to exercise the option held, i.e. when to implement a particular IT solution. In a follow-up paper [45], they used sensitivity analysis to probe Black–Scholes valuation for IT investment opportunities. Taudes et al. [38] also compared NPV with the Black–Scholes valuation method, which they used to evaluate the decision to continue with SAP R/2 or to switch to SAP R/3. Their results also indicated that, in the absence of a formal evaluation of the time option, traditional

    Journal of Information Science, 32 (1) 2006, pp. 17–38 © CILIP, DOI: 10.1177/0165551506059220

    Table 5. Benefits of qualitative and quantitative indicators

    Knowledge management benefits

    Qualitative index                     Quantitative index

    Improving employees' skills           Decreasing operation costs
    Improving quality strategies          Decreasing product cycle time
    Improving core business processes     Increasing productivity
                                          Increasing market share
    Developing customer relationships     Increasing shareholder equity
    Developing supplier relationships     Increasing patent income


  • M.Y. CHEN AND A.P. CHEN

    approaches to evaluating information technology investments would have produced the wrong recommendations. The methodology of financial index analysis is categorized in Table 6.

    3.2.2. Non-financial indicator analysis. Measurement requires a medium- to long-term commitment from both senior management and the entire staff, and potentially offers little impact on financial performance in the short term. The drivers underpinning knowledge performance measures, such as teamwork, learning, communication, knowledge processes, tools and techniques, require non-financial performance measures to ensure that progress is being made, as well as to determine where and when to take corrective action. In fact, non-financial methods are quite different from traditional financial statement analysis, using non-financial indexes such as the frequency of employee logins to the knowledge base, how many times each employee comes up with a proposal, how many topics are posted on the KMS discussion board, the level of customer satisfaction, the depth of employee loyalty, and the number of communities of practice (CoP) within the company. These indexes are all related to behavioral factors and system usage. Moreover, non-financial indexes are as important as financial indexes, and belong to quantitative analysis.

    One good thing about KM is that a company can retain the knowledge it has acquired, even after the source of the knowledge (the employee) has moved on. In terms of human resource training, focus must be placed on developing people who are capable of turning internal knowledge into organizational knowledge. Performance appraisal aims at bringing organizational improvement through effectively directing employee behavior. Yahya and Goh investigated performance appraisal characteristics and their respective association with KM [46]. The feedback generated was then used for the purpose of promoting or encouraging better KM practices, especially from the knowledge transfer phase to the knowledge application phase.

    Communities of practice have begun to play an increasingly important role in modern, knowledge-intensive organizations. CoPs foster knowledge development and creative interaction among highly specialized experts, helping to channel their efforts towards the areas of most need. Smits and Moor presented a Knowledge Governance Framework, which focused on how to define, measure, and use performance indicators for KM in a CoP. The results were successful and offer useful guidelines for KM procedures [47].

    To manage knowledge successfully, it must be measured. It is not always clear how this can be done, however, as proper measurements may not exist and, indeed, knowledge may be immeasurable. To address this issue, Ahn and Chang assessed the contribution of knowledge to business performance, rather than trying to measure the value of knowledge directly [1]. They provided a way to assess the contribution of knowledge to business performance by employing products and processes as intermediaries. Product knowledge is directly related to a company's specific product. Process knowledge is associated with the activities performed at each stage in a value chain, from inbound logistics to customer care. In the same way, Holt et al. used four metrics to assess organizational knowledge: individual, context, content and process knowledge measures [48]. These approaches enable us to relate knowledge to business performance more explicitly, and provide valuable insight into how knowledge may be strategically managed.

    Organizational performance must usually be defined using non-monetary metrics, making it relatively difficult to measure. Though it can be measured indirectly, using intermediate measures such as the number of new ideas, the number of new products, job satisfaction levels, and the contribution of knowledge management activities to organizational performance, these are difficult to translate into tangible benefits. Organizational performance is as important as financial performance; organizational quality can indirectly influence financial performance, serving as a moderating factor. The methodology of non-financial index analysis is categorized in Table 7.


    Table 6. Financial indicator analysis

    Research methodology    Authors

    Return on investment    Laitamaki and Kordupleski (1997)
    Net present value       Stein et al. (2001)
    Tobin's q               Ittner and Larcker (1998); Hall et al. (2000); Lev (2001); Villalonga (2004)
    Payback period          Stein et al. (2003)
    Financial statements    Tung et al. (2004)
    Options                 Benaroch and Kauffman (2000); Taudes et al. (2000)

    See References [29–38].



    3.3. Internal performance analysis

    Internal performance measurement methods focus on process efficiency and goal achievement efficiency. These methods evaluate KM performance through the gap between target and current values. Well-known methods include ROI, NPV, the balanced scorecard (BSC), performance-based evaluation, activity-based evaluation and other models.

    Underlying Kaplan and Norton's concept of the BSC was the idea that all aspects of measurement have their drawbacks; however, if companies offset some of the drawbacks of one measure with the advantages of another, the net effect can lead to decisions resulting in both short-term profitability and long-term success [49–51]. As a result, they suggested that financial measures be supplemented with additional ones reflecting customer satisfaction, internal business processes and the ability to learn and grow. Many scholars have discussed the use of a balanced scorecard approach in determining a business-orientated relationship between strategic KM usage and IT strategy and implementation [52–54]. They have applied IT investment to KM by creating a KM scorecard focusing on both the current financial impact of intellectual capital on core processes, and future earnings capabilities in structural or human capital.

    Most research on KM has been limited to individual levels or knowledge transfer within organizations. However, firm innovation capability is the most important determinant of product performance and competitive ability. Competitive advantage has a significant positive economic value for a firm, and the main purpose of KM is to retain sustainability. Cavusgil et al. used three items to measure innovation performance [55]. They measured whether the innovation project had succeeded in achieving its main objectives: financial and ROI. The contribution lay in its performance-based measures and the values of inter-firm relationships in tacit knowledge transfer, as well as innovation capability.

    As mentioned earlier, valuable knowledge resides within individual employees and is critical to an organization's ability to solve problems and create new knowledge. In a sense, KM can be viewed as an activity which acts as a constituent of a community, performing one's task by using tools or technology [56, 57]. Some KM studies have taken an IT perspective, where enterprise information portals (EIPs) are gateways that streamline access to information, thereby easing the task of transforming data into knowledge and thus increasing KM efficiency. Kim et al. stressed the importance of activity theory by using it to evaluate EIP systems in the context of knowledge integration or application [58]. The analysis revealed that EIP functions, from a KM activity perspective, remain underdeveloped.

    Many measurement systems have failed to be effective because they are disparate, often measuring activities that are of local or individual interest to a manager, rather than a key activity for the business. Pervaiz et al. proposed a model founded upon a continuous improvement methodology. This model utilized a Deming-type PDCA (plan–do–check–act) cycle [59]. The proposed measurement framework enabled the effective and efficient leveraging of knowledge assets. The methodology of internal performance analysis is categorized in Table 8.

    3.4. External performance analysis

    External performance measurement methods always compare a firm with benchmark companies, primary


    Table 7. Non-financial indicator analysis

    Research methodology                                             Authors

    Human resource training                                          Yahya and Goh (2002)
    Communities of practice                                          Smits and Moor (2004)
    Product and process knowledge assessment                         Ahn and Chang (2004)
    Individual, context, content and process knowledge assessment    Holt et al. (2004)

    See References [46–48].

    Table 8. Internal performance analysis

    Research methodology              Authors

    Balanced scorecard                Van Grembergen and Vander Borght (1997); Martinsons et al. (1999); Fairchild (2002)
    Performance-based evaluation      Cavusgil et al. (2003)
    Activity-based evaluation         Kuutti (1996); Hasan and Gould (2001); Kim et al. (2002)
    Plan–do–check–act (PDCA) cycle    Pervaiz et al. (1999)

    See References [52–59].



    competitors or the industry average. For example, benchmarking is the process of determining who is the very best, who sets the standard, and what that standard is. When we apply the benchmarking concept to business, the following types of questions are asked: 'Which company has the best manufacturing operation?' and 'How do we quantify that standard?' With benchmarking or best practice methodologies, firms can understand their KM performance by comparison with competitors. Thus, firms can retain a competitive advantage and expand the gap between themselves and their competitors.

    Traditionally, benchmarking has been described as a practice that promotes imitation. However, according to a more recent approach, benchmarking looks outside a firm's boundaries, to enable comparison with others in terms of both practice and performance, in order to acquire both explicit and tacit knowledge [60–62]. Such newly acquired knowledge, once integrated with a firm's prior internal knowledge, may create new knowledge that can give rise to improvements and innovations. Benchmarking is also seen as a tool for identifying, understanding and adopting best practices, in order to increase the operational performance of intellectual capital (IC) [63, 64]. From an organizational learning perspective, benchmarking is concerned with enhancing organizational performance by establishing standards against which processes, products and performance can be compared and consequently improved [65]. Furthermore, all the organizational factors examined in both sectors proved to be statistically significant when comparing world-class and potentially winning companies with their competitors; this adds weight to the argument that the existence of organizational learning within a company is an essential ingredient in the quest for superior performance.

    In the endlessly hyped knowledge age of the new millennium, evaluators are being asked to generate lessons learned and best practices. Lessons learned (local knowledge about what works) can be converted to best practices (universal knowledge about what works, at least by implication of being best). Lessons learned represent principles extrapolated from multiple sources, and increase transferability in the form of cumulative knowledge that can be adapted and applied to new situations. The internal validity of any single source of knowledge must be judged by the criteria appropriate for that type of knowledge. Thus, practitioner wisdom and evaluation studies may be internally validated in different ways [66]. On the other hand, the best practice approach is an essential component of KM. It provides an opportunity to retain and use knowledge, even when an expert has left the organization. Asoh et al. investigated how governments could deliver more innovative services to a demanding public [67]. They felt that governments must be involved in the deployment of new services, such as e-Government and e-Commerce. Active management of knowledge assets is mandatory for success. A suggested implementation approach highlights leadership, culture, technology and best practice measurements as critical success factors. The methodology of external performance analysis is categorized in Table 9.

    3.5. Project-orientated analysis

    Since projects characteristically involve the development of new products and new processes, obvious opportunities may present themselves for novel ideas to emerge and for cross-functional learning to occur, thereby enhancing the organization's innovative capacity and potential. On the other hand, recent studies of knowledge management and organizational learning in project environments have emphasized instead the difficulties of learning from projects, not only within individual projects, but also across and between projects [68].

    Some articles have set out to examine the significance of social factors in enhancing knowledge management capabilities in the construction industry [69, 70]. Bresnen et al. revealed that processes of the capture, transfer and learning of knowledge in project settings rely very heavily upon social patterns, practices and processes, in ways which emphasize the value and importance of adopting a community-based approach to managing knowledge [69]. Bresnen et al.'s paper made a contribution to the development of knowledge management theory within project environments.


    Table 9. External performance analysis

    Research methodology    Authors

    Benchmarking            Pemberton et al. (2001); Chai et al. (2003); Leung et al. (2004); Massa and Testa (2004); Carrillo (2004); Marr (2004)
    Best practices          Patton (2001); Asoh et al. (2002)

    See References [60–67].



    In recent years, after the term was proposed, numerous individuals and organizations have been trying to put more science behind the art of knowledge management. Rubenstein-Montano et al. found that current project management frameworks did not typically employ a systems approach [71]. For this reason, they suggested that frameworks should be developed within a systems context. With this in mind, Liebowitz provided some useful frameworks to help project managers and others to conceptualize and implement knowledge management initiatives [72]. In the strategy approach, Kamara et al. described a framework for selecting a KM strategy that is appropriate to the organizational and cultural context of KM projects [73]. This approach underscores the fact that knowledge management is not an end in itself, but a means towards the solution of business problems that mitigates inefficiencies and improves the innovative capacity of a company.

    Nevertheless, project organizations require particularly systematic and effective knowledge management if they are to avoid knowledge fragmentation and loss of organizational learning [74]. Kasvi et al. dealt with knowledge management and knowledge competence in project organizations, particularly from a programmer's perspective [75]. Finally, they made a contribution by presenting the Learning Programme Model. In order to systematically manage the knowledge created within a project, the project itself must be systematically managed by the model. The methodology of project-orientated analysis is categorized in Table 10.

    3.6. Organization-orientated analysis

    With the increasing importance of effective knowledge management in organizations, it has become increasingly important for organizations to be able to measure their state of the art on this subject. Organization-orientated analysis is focused on the entire organization, on the multi-dimensional and multi-layering aspects of the firm. In the horizontal perspectives, KM performance evaluation is focused on the leadership, cultural, technological and process dimensions. In the vertical perspectives, KM performance evaluation is focused on the strategy, management and implementation layers. The primary objective is to estimate the level of KM performance from the perspective of the whole organization. KM performance evaluation is carried out using the Skandia AFS (Assurance and Financial Services) model, technology tools and all four perspectives of the BSC.

    Most organizations have only a vague understanding of how much they have invested in intellectual capital (IC), let alone what they may receive from those investments. Standard financial accounting systems do not allow for the easy estimation of intellectual capital investments. Without methods to measure intellectual capital, many firms are ignorant of its full potential. Among the most widely used approaches for IC management and reporting are the so-called Intangible Asset Monitor by Sveiby and the IC approach by Edvinsson and Van Buren, originally introduced by the insurance company Skandia [76–78]. These models are designed to measure human, innovation, process and customer capital, and represent a major step toward providing precisely the information that firms and their stakeholders need to foresee the future. Thus, these IC models can help visualize the knowledge-production process of research organizations [79]. In addition, some firms have also used the BSC, originally developed for strategic management, control and performance measurement, for IC management and reporting [80–82].

    Knowledge management tools can support the performance of applications, activities or actions, such as knowledge generation, codification or transfer, and also promote and enable the knowledge process in order to improve decision making. Ruggles claimed that knowledge codification is the capture and representation of knowledge, so that it can be accessed, reused and transferred, either by an individual or by an organization [83]. Jackson investigated 59 knowledge management tools and examined both the software and technology approaches for knowledge management [84]. Wensley simply discounted any tool that was not web-based, believing that KM tools would only be utilized in an internet environment [85]. Tyndale also evaluated a wide variety of such tools, by examining the literature


    Table 10. Project-orientated analysis

    Research methodology               Authors

    Social patterns                    Edelman et al. (2001); Bresnen et al. (2003)
    KM project management framework    Rubenstein-Montano et al. (2001); Kamara et al. (2002); Liebowitz and Megbolugbe (2003)
    KM project management model        Vartiainen et al. (1999); Kasvi et al. (2003)

    See References [70–75].



    related to the selection and evaluation of the KM tools available in the software market [86].

    Van Den Hooff et al. presented the Knowledge Management Scan, which is an adequate instrument for diagnosing organizations, with the results providing sufficient insight into the organization's knowledge processes [87]. In practice, the scan is repeatedly translated into concrete activities, which improve such processes. The methodological reflection, which is important in the execution of the scan, will lead to an instrument that is practically and scientifically valid; moreover, it will give much valuable insight into the subject of knowledge management. The methodology of organization-orientated analysis is categorized in Table 11.

    4. Discussion, limitations and suggestions

    4.1. Discussion

    KPMG reported that the reasons for the creation of knowledge management initiatives cited by most companies are to facilitate better decision making, increase profits and reduce costs [88]. However, KM suffers from the same challenges as many other management issues: it assumes that knowledge is a 'thing' which is amenable to being managed by a 'manager'. It must first be determined which KM process is key to achieving a competitive advantage, and second, which measurement method is the most appropriate to appraise KM performance.

    KM performance measurement methods comprise broad categories of research issues. Method development has been diverse, due to researchers' backgrounds, expertise and problem domains [89]. On the other hand, some means of analysis have common measurement concepts and methodologies. For example, the NPV measurement method is used in both financial and internal performance analysis. In addition, the BSC measurement method is used in internal performance and organization-orientated analysis. This indicates that the development trend in evaluation methods is also diverse, due to the authors' research interests and ability in the methodology and problem domains. This directs the development of KM performance measurement towards an expertise orientation.

    Furthermore, some evaluation methodologies overlap to a high degree. For example, financial statement analysis, ROI, ROK, payback period and option evaluation methods are all quantitative methods, with different concepts and methodologies, which evaluate KM within a common problem domain. This indicates that these evaluation methods are the major trend for KM development, and that many methodologies are focused on these problems. This can direct the development of KM evaluation towards a problem domain orientation.

    As shown in Table 12, we gathered statistics for thisKM performance evaluation survey research from 1995to 2004. We divided this data into eight categories ofKM performance evaluation methodologies. Our goal


    Table 12. A review of articles evaluating KM performance: 1995–2004

    Approach                   Paper amount    1995–1999    2000–2004

    Qualitative                14              4            10
    Quantitative               26              10           16
    Internal performance       16              6            10
    External performance       15              4            11
    Project-orientated         17              5            12
    Organization-orientated    20              9            11
    Total                      108             38           70

    Table 11. Organization-orientated analysis

    Research methodology    Authors

    Technology              Ruggles (1997); Jackson (1999); Wensley (2000); Tyndale (2002)
    Process                 Van Den Hooff et al. (2003)
    Intellectual capital    Edvinsson (1997); Sveiby (1998); Van Buren (1999); Leitner and Warden (2004)
    BSC                     De Gooijer (2000); Johanson et al. (2001); Bukh et al. (2002)

    See References [76–87].



    was to examine the research trend in KM performance evaluation changes, using two phases to distinguish the first five years (1995–99) from the second five years (2000–2004). In Figure 5, we can see the change between these first and second five-year periods. The main findings can be described as follows:

    (1) KM performance evaluation is becoming more important. Articles published over the last five years are almost double those of the five years previous to that. This shows that research topics have changed from KM creation, transformation and implementation to the evaluation of KM performance.

    (2) Quantitative analysis is the primary methodology used to evaluate KM performance; indeed, most research articles in the last five years have involved quantitative analysis. Traditionally, most scholars have suggested financial indicators to display the value of KM; now, more and more scholars are insisting on evaluating KM performance using non-financial indicators, in a social and behavioral sciences approach.

    (3) Firms are now highlighting the KM performance of competitors, through benchmarking or best practices, rather than internally auditing KM performance via the BSC. In Table 12, we can see that articles outlining the external performance approach have grown quite substantially. These results allow us to infer that, in the future, firms will carefully consider their own KM performance, as well as that of their competitors. For this reason, firms are now using an external performance approach to replace the original BSC framework, using benchmarking or best practices to integrate the four perspectives of BSC activities. It is now evident that past, present and future KM performance can be measured by an external performance approach.

    (4) Firms may begin to focus more on project management measurement than on the entire organization. In Table 12, it can be seen that project-orientated articles have grown considerably, showing that measurement and control of the achieved percentage of scheduled progress in KM project management is becoming a major focus. Measurement of the entire organization's KM performance is very difficult from process, leadership, culture or technology perspectives; it is obvious that better efficiency and effectiveness in KM performance can be reached through a project-orientated approach.

    In this paper, most of the articles discussed came from management science and social science journals found on the five online databases, while a few came from computer and information science journals. It is hoped that different research fields will begin to


    Fig. 5. KM development trend analysis.



    publish KM performance evaluation articles, in order to broaden the horizon of academic and practical KM studies.

    4.2. Limitations

    This analysis and research into KM performance evaluation has several limitations. First, a literature review of this broad category is difficult, due to the extensive background knowledge required to study, classify and compare these articles. Although limited in background knowledge, this paper has presented a brief literature review of KM from 1995 to 2004, in order to explore how KM performance evaluations developed throughout this period. Thus, the first limitation of this article is its scope, in that surveys that focused solely on a specialty domain, or which were proprietary, and therefore inaccessible, were excluded.

    Second, the scope of our investigation was further diminished by utilizing only surveys for which we were able to obtain the original documents. Some academic journals listed in the Science Citation Index (SCI) and the Social Science Citation Index (SSCI), as well as other practical reports, were not included in this survey. This weakens our conclusions somewhat, and notably our proposition that the epistemological foundation of survey research in KM performance evaluation turns around eight categories.

    Third, the analysis of these eight categories constitutes a possible limitation, and may require submission to other researchers for further validation.

    Fourth, non-English publications were not considered in this survey, so the effects of different cultures on the development of KM performance evaluations were not determined. Many other KM performance evaluations have been published and developed, in addition to those discussed in this article.

    4.3. Suggestions

    (1) Integration of cultural perspective. In this survey, we obtained questionnaire data from five international high-technology organizations. As such, we infer that KM performance evaluation development is closer to that of the information technologies. However, confusion between knowledge and information underlies many of the problems caused by information technology. As Brown and Duguid note, knowledge entails a knower, but people treat information as independent and self-sufficient [90]. They argue it is difficult to separate knowledge from information.

    In addition, Nardi and O'Day define an information ecology as a system of people, practices, values, and technologies in a particular local environment [91]. Their goal is to change the way people look at information technology. A key to thoughtful action is to ask more 'know-why' questions before jumping to the more straightforward 'know-how' questions. Since we are heading into a totally technology-dominated world, it is very important that we not only know how to use a certain technology, but why we use it. Therefore, by trying to understand technology this way, we will be able to communicate our thoughts to others and find ways to use technology much more effectively.

    (2) Integration of information technologies. KM is an interdisciplinary research issue. Thus, future KM performance evaluation developments will be integrated with information technologies, especially for high-technology organizations; cross- or inter-disciplinary research may offer more methodologies to investigate the challenges of KM performance evaluation.

    (3) Integration of options. A major challenge lies in designing models and theories to evaluate the performance and value of KM. Traditional financial analysis has long relied on NPV, simple cost-benefit analysis, critical success factors and other less-structured techniques to make assessments. Thus, our literature survey has critically reviewed the case for using option pricing as a basis for KM performance analysis, evaluating its merits in a real-world business setting.
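The binomial lattice of Cox, Ross and Rubinstein [42], which underlies such real-options analyses, can be sketched in a few lines. This is a minimal illustration only; the project figures below are hypothetical and are not drawn from our survey data.

```python
import math

def crr_option_value(s0, k, r, sigma, t, steps=100):
    """Value a European call on an asset worth s0, with exercise cost k,
    using the Cox-Ross-Rubinstein binomial lattice."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))   # up-move factor per step
    d = 1.0 / u                           # down-move factor per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)              # one-step discount factor
    # Payoffs at the terminal nodes (j = number of up-moves).
    values = [max(s0 * u**j * d**(steps - j) - k, 0.0)
              for j in range(steps + 1)]
    # Roll back through the lattice to the valuation date.
    for step in range(steps, 0, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

# A KM initiative framed as a growth option: asset value 1.0M, exercise
# (deployment) cost 1.2M, 5% risk-free rate, 40% volatility, 2-year horizon.
print(round(crr_option_value(1.0, 1.2, 0.05, 0.40, 2.0), 4))
```

With more steps the lattice value converges to the continuous-time option price; framing the KM investment's payoff as a call option in this way captures the value of managerial flexibility that a plain NPV calculation ignores.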

    (4) Other computer science methodologies. The definition of KM performance evaluation in this survey is not complete, because other methodologies, such as artificial intelligence, were not included. Data mining and soft computing methods are other research technologies that may be used to solve problems in social studies. Thus, computer science methodologies may include a KM performance evaluation category in the future.

    (5) Evolution as a source of development. Social and technical evolution may empower the development of KM performance evaluation. Continuing to create, convert, circulate and implement KM processes may be necessary for the successful development of KM. Most importantly, the more KM development is encouraged, the better KM performance evaluation will be.

    Journal of Information Science, 32 (1) 2006, pp. 17–38 © CILIP, DOI: 10.1177/0165551506059220


    5. Conclusions

    This paper was based on a literature review of knowledge management performance evaluation from 1995 to 2004, using a keyword index search. The development of KM performance measurement has tended towards expert orientation, while KM evaluation development is a problem-oriented domain. Different information technology methodologies, such as artificial intelligence, may be another way of implementing KM performance evaluation. Integration of knowledge-based systems, expert systems and data mining technologies may also increase our understanding of this subject. The ability to continually change, and to gain new insights into the power of effective KM performance evaluation, will be the core of future KM research.

    Acknowledgement

    The authors gratefully thank the Editor and anonymous reviewers for their valuable comments and constructive suggestions.

    References

    [1] J.H. Ahn and S.G. Chang, Assessing the contribution of knowledge to business performance: the KP3 methodology, Decision Support Systems 36 (2004) 403–16.

    [2] U. Schultze and D.E. Leidner, Studying knowledge management in information systems research: discourses and theoretical assumptions, MIS Quarterly 26(3) (2002) 213–42.

    [3] M. Shin, T. Holden and R.A. Schmidt, From knowledge theory to management practice: towards an integrated approach, Information Processing and Management 37 (2001) 335–55.

    [4] E. Brynjolfsson, A.A. Renshaw and M.V. Alstyne, The matrix of change, Sloan Management Review 38(2) (1997) 37–54.

    [5] M. Alavi, KPMG Peat Marwick U.S.: one giant brain, Case 9-397-108 (Harvard Business School, Boston, MA, 1997).

    [6] T. Beckman, A methodology for knowledge management. In: M.H. Hamza (ed.), Proceedings of the IASTED International Conference on AI and Soft Computing, Banff, Canada, 1997 (ACTA Press, Calgary, 1997) 29–32.

    [7] A.P. Chen and M.Y. Chen, Integrating option model and knowledge management performance measures: an empirical study, Journal of Information Science 31(5) (2005) 381–93.

    [8] M.Y. Chen, M.J. Tsai and H.R. Wu, The research of KM operation module in science and technology industry – case study of TSMC. In: S-C.T. Chou (ed.), Proceedings of the 12th International Information Management Conference (CSIM Press, Taipei, 2001) A-75.

    [9] T.H. Davenport, D.W. Long and M.C. Beers, Successful knowledge management projects, Sloan Management Review 39(2) (1998) 43–57.

    [10] J. Liebowitz, Key ingredients to the success of an organization's knowledge management strategy, Knowledge and Process Management 6(1) (1999) 37–40.

    [11] I. Nonaka and H. Takeuchi, The Knowledge-Creating Company (Oxford University Press, New York, 1995).

    [12] K. Wiig, Knowledge management: where did it come from and where will it go? Expert Systems with Applications 13(1) (1997) 1–14.

    [13] M. Alavi and D.E. Leidner, Review: knowledge management and knowledge management systems: conceptual foundations and research issues, MIS Quarterly 25(1) (2001) 107–36.

    [14] R.M. Grant, Prospering in dynamically-competitive environments: organizational capability as knowledge integration, Organization Science 7(4) (1996) 375–87.

    [15] M. Polanyi, The Tacit Dimension, Knowledge in Organizations (Butterworth-Heinemann, Newton, MA, 1996).

    [16] T.S. Kuhn, The Structure of Scientific Revolutions (University of Chicago Press, Chicago, 1962).

    [17] I. Nonaka, A dynamic theory of organizational knowledge creation, Organization Science 5(1) (1994) 14–37.

    [18] S.D.N. Cook and J.S. Brown, Bridging epistemologies: the generative dance between organizational knowledge and organizational knowing, Organization Science 10(4) (1999) 381–400.

    [19] H. Tsoukas, The firm as a distributed knowledge system: a constructionist approach, Strategic Management Journal 17 (1996) 11–25.

    [20] R.J. Calantone, S.T. Cavusgil and Y. Zhao, Learning orientation, firm innovation capability, and firm performance, Industrial Marketing Management 31 (2002) 515–24.

    [21] M. Hertzum, The importance of trust in software engineers' assessment and choice of information sources, Information and Organization 12 (2002) 1–18.

    [22] G. Walsham, Knowledge management: the benefits and limitations of computer systems, European Management Journal 19(6) (2002) 599–608.

    [23] C. Changchit, C.W. Holsapple and R.E. Viator, Transferring auditors' internal control evaluation knowledge to management, Expert Systems with Applications 20 (2001) 275–91.

    [24] D. Longbottom and P. Chourides, Knowledge management: a survey of leading UK companies. In: P. Hermel (ed.), Proceedings of the 2nd MAAOE International Conference, Versailles, France, 2001 (UVSQ Press, Versailles, 2001) 113–26.


    [25] D. Longbottom and P. Chourides, Climbing new heights: conquering K2, Knowledge Management Magazine 17 June (2002).

    [26] P. Chourides, D. Longbottom and W. Murphy, Excellence in knowledge management: an empirical study to identify critical factors and performance measures, Measuring Business Excellence 7(2) (2003) 29–45.

    [27] M. Mitri, Applying tacit knowledge management techniques for performance assessment, Computers & Education 41 (2003) 173–89.

    [28] C.W. Choo, The Knowing Organization (Oxford University Press, New York, 1998).

    [29] J. Laitamaki and R. Kordupleski, Building and deploying profitable growth strategies based on the waterfall of customer value added, European Management Journal 15(2) (1997) 158–66.

    [30] E.W. Stein, M.P. Manco and S.A. Manco, A knowledge-based system to assist university administrators in meeting disability act requirements, Expert Systems with Applications 21(2) (2001) 65–74.

    [31] B. Lev, Intangibles: Management, Measurement, and Reporting (Brookings Institution Press, Washington, DC, 2001).

    [32] B.H. Hall, A. Jaffe and M. Trajtenberg, Market Value and Patent Citations: A First Look. Working Paper No. 7741 (National Bureau of Economic Research, Cambridge, MA, 2000).

    [33] C.D. Ittner and D.F. Larcker, Are non-financial measures leading indicators of financial performance? An analysis of customer satisfaction, Journal of Accounting Research 36 (1998) 1–35.

    [34] B. Villalonga, Intangible resources, Tobin's q, and sustainability of performance differences, Journal of Economic Behavior and Organization 54(2) (2004) 205–30.

    [35] E.W. Stein, M.C. Pauster and D. May, A knowledge-based system to improve the quality and efficiency of titanium melting, Expert Systems with Applications 24(2) (2003) 239–46.

    [36] W.L. Tung, C. Quek and P. Cheng, GenSo-EWS: a novel neural-fuzzy based early warning system for predicting bank failures, Neural Networks 17(4) (2004) 567–87.

    [37] M. Benaroch and R.J. Kauffman, A case for using real options pricing analysis to evaluate information technology project investments, Information Systems Research 10(1) (1999) 70–86.

    [38] A. Taudes, M. Feurstein and A. Mild, Options analysis of software platform decisions: a case study, MIS Quarterly 24(2) (2000) 227–43.

    [39] B.L. Dos Santos, Justifying investments in new information technologies, Journal of Management Information Systems 7(4) (1991) 71–90.

    [40] W. Margrabe, The value of an option to exchange one asset for another, Journal of Finance 33(1) (1978) 177–86.

    [41] A. Kambil, J. Henderson and H. Mohsenzaden, Strategic management of information technology investments: an option perspective. In: R.D. Banker et al. (eds), Strategic Information Technology Management: Perspectives on Organization Growth and Competitive Advantage (Idea Publishing Group, Hershey, 1993) 161–78.

    [42] J. Cox, S. Ross and M. Rubinstein, Option pricing: a simplified approach, Journal of Financial Economics 6 (1979) 229–63.

    [43] W.C. Kester, Today's options for tomorrow's growth, Harvard Business Review 62 (1984) 153–61.

    [44] A. Taudes, Software growth options, Journal of Management Information Systems 15(1) (1998) 165–85.

    [45] M. Benaroch and R.J. Kauffman, Justifying electronic banking network expansion using real options analysis, MIS Quarterly 24(2) (2000) 197–225.

    [46] S. Yahya and W.K. Goh, Managing human resources toward achieving knowledge management, Journal of Knowledge Management 6(5) (2002) 457–68.

    [47] M. Smits and A.D. Moor, Measuring knowledge management effectiveness in communities of practice. In: R. Sprague (ed.), Proceedings of the 37th Hawaii International Conference on System Sciences, 2004 (IEEE Press, Big Island, 2004) 80236b [Abstract only].

    [48] D.T. Holt, S.E. Bartczak, S.W. Clark and M.R. Trent, The development of an instrument to measure readiness for knowledge management. In: R. Sprague (ed.), Proceedings of the 37th Hawaii International Conference on System Sciences (IEEE Press, Big Island, 2004) 80238b [Abstract only].

    [49] R. Kaplan and D. Norton, The balanced scorecard: measures that drive performance, Harvard Business Review 70(1) (1992) 71–9.

    [50] R. Kaplan and D. Norton, Putting the balanced scorecard to work, Harvard Business Review 71(5) (1993) 134–42.

    [51] R. Kaplan and D. Norton, Using the balanced scorecard as a strategic management system, Harvard Business Review 74(1) (1996) 75–85.

    [52] A.M. Fairchild, Knowledge management metrics via a balanced scorecard methodology. In: R. Sprague (ed.), Proceedings of the 35th Hawaii International Conference on System Sciences, 2002 (IEEE Press, Big Island, 2002) 243 [Abstract only].

    [53] M. Martinsons, R. Davison and D. Tse, The balanced scorecard: a foundation for the strategic management of information systems, Decision Support Systems 25 (1999) 71–88.

    [54] W. Van Grembergen and D. Vander Borght, Audit guidelines for IT outsourcing, EDP Auditing (1997) 1–8.

    [55] S.T. Cavusgil, R.J. Calantone and Y. Zhao, Tacit knowledge transfer and firm innovation capability, The Journal of Business & Industrial Marketing 18(1) (2003) 6–21.

    [56] H. Hasan and E. Gould, Support for the sense-making activity of managers, Decision Support Systems 31(1) (2001) 71–86.


    [57] K. Kuutti, Activity theory as a potential framework for human-computer interaction research. In: B. Nardi (ed.), Context and Consciousness: Activity Theory and Human-Computer Interaction (MIT Press, Cambridge, MA, 1996) 9–22.

    [58] Y.J. Kim, A. Chaudhury and H.R. Rao, A knowledge management perspective to evaluation of enterprise information portals, Knowledge and Process Management 9(2) (2002) 57–71.

    [59] K.A. Pervaiz, K.L. Kwang and Z. Mohamed, Measurement practice for knowledge management, Journal of Workplace Learning 11(8) (1999) 304–11.

    [60] K.H. Chai, M. Gregory and Y. Shi, Bridging islands of knowledge: a framework of knowledge sharing mechanisms, International Journal of Technology Management 25(8) (2003) 703–27.

    [61] H.N. Leung, W.K. Chan and W.B. Lee, Benchmarking the role-modification process for successful knowledge transfer, Benchmarking 11(6) (2004) 601–9.

    [62] S. Massa and S. Testa, Innovation or imitation? Benchmarking: a knowledge-management process to innovate services, Benchmarking 11(6) (2004) 610–20.

    [63] F.J. Carrillo, Capital cities: a taxonomy of capital accounts for knowledge cities, Journal of Knowledge Management 8(5) (2004) 28–46.

    [64] B. Marr, Measuring and benchmarking intellectual capital, Benchmarking 11(6) (2004) 559–70.

    [65] J.D. Pemberton, G.H. Stonehouse and D.J. Yarrow, Benchmarking and the role of organizational learning in developing competitive advantage, Knowledge and Process Management 8(2) (2001) 123–35.

    [66] M.Q. Patton, Evaluation, knowledge management, best practices, and high quality lessons learned, American Journal of Evaluation 22(3) (2001) 329–36.

    [67] D. Asoh, S. Belardo and R. Neilson, Knowledge management: issues, challenges and opportunities for governments in the new economy. In: R. Sprague (ed.), Proceedings of the 35th Hawaii International Conference on System Sciences, 2002 (IEEE Press, Big Island, 2002) 129 [Abstract only].

    [68] R.J. DeFillippi, Project-based learning, reflective practices and learning outcomes, Management Learning 32(1) (2001) 5–10.

    [69] M. Bresnen, L. Edelman, S. Newell, H. Scarbrough and J. Swan, Social practices and the management of knowledge in project environments, International Journal of Project Management 21 (2003) 157–66.

    [70] L. Edelman, M. Bresnen, S. Newell, H. Scarbrough and J. Swan, The paradox of social capital: structural, cognitive and relational dimensions. In: R.A. Bettis (ed.), Reinventing Strategic Management: Old Truths and New Insights: Strategic Management Society, 21st Annual International Conference, San Francisco, 2001 (Blackwell, San Francisco, 2001) 153–74.

    [71] B. Rubenstein-Montano, J. Liebowitz, J. Buchwalter, D. McCaw, B. Newman and K. Rebeck, A systems thinking framework for knowledge management, Decision Support Systems 31(1) (2001) 5–16.

    [72] J. Liebowitz and I. Megbolugbe, A set of frameworks to aid the project manager in conceptualizing and implementing knowledge management initiatives, International Journal of Project Management 21 (2003) 189–98.

    [73] J.M. Kamara, C.J. Anumba and P.M. Carrillo, A CLEVER approach to selecting a knowledge management strategy, International Journal of Project Management 20 (2002) 205–11.

    [74] M. Vartiainen, M. Hakonen, A. Simola, A. Kokko and T. Rantamaki, Learning project model and transfer of experiences. In: P. Chung (ed.), Proceedings of the 6th International Product Development Conference, 1999 (Cambridge University Press, Cambridge, 1999) 774–86.

    [75] J.J. Kasvi, M. Vartiainen and M. Hailikari, Managing knowledge and knowledge competences in projects and project organizations, International Journal of Project Management 21 (2003) 571–82.

    [76] L. Edvinsson, Developing intellectual capital at Skandia, Long Range Planning 30(3) (1997) 366–73.

    [77] K.E. Sveiby, Intellectual capital: thinking ahead, Australian CPA 68(5) (1998) 18–22.

    [78] M.E. Van Buren, A yardstick for knowledge management, Training & Development 53(5) (1999) 71–8.

    [79] K.H. Leitner and C. Warden, Managing and reporting knowledge-based resources and processes in research organizations: specifics, lessons learned and perspectives, Management Accounting Research 15 (2004) 33–51.

    [80] P.N. Bukh, M.E. Johansen and J. Mouritsen, Multiple integrated performance management systems: IC and BSC in a software company, Singapore Management Review 24(3) (2002) 21–33.

    [81] J. De Gooijer, Designing a knowledge management performance framework, Journal of Knowledge Management 4(4) (2000) 303–10.

    [82] U. Johanson, M. Martensson and M. Skoog, Measuring to understand intangible performance drivers, European Accounting Review 10(3) (2001) 407–37.

    [83] R. Ruggles, Knowledge Management Tools (Butterworth-Heinemann, Oxford, 1997).

    [84] C. Jackson, Process to product – creating tools for knowledge management. In: Y. Malhotra (ed.), Knowledge Management for Business Model Innovation (Idea Group Publishing, Hershey, 1999) 402–13.

    [85] A. Wensley, Tools for knowledge management. In: D. Chauvel and C. Despres (eds), Proceedings of the BPRC 2000 Conference on Knowledge Management: Concepts and Controversies (Butterworth-Heinemann, Warwick, 2000) 21.

    [86] P. Tyndale, A taxonomy of knowledge management software tools: origins and applications, Evaluation and Program Planning 25 (2002) 183–90.

    [87] B. Van Den Hooff, J. Vijvers and J. De Ridder, Foundations and applications of a knowledge management scan, European Management Journal 21(2) (2003) 237–46.

    [88] KPMG, Knowledge Management Research Report (1998).

    [89] S.H. Liao, Knowledge management technologies and applications – literature review from 1995 to 2002, Expert Systems with Applications 25 (2003) 155–64.

    [90] J.S. Brown and P. Duguid, The Social Life of Information (Harvard Business School, Boston, MA, 2000).

    [91] B.A. Nardi and V.L. O'Day, Information Ecologies: Using Technology with Heart (MIT Press, Cambridge, MA, 1999).


    Appendix: Questionnaire

    1 2 3 4 5 6 7 8 9 10

    I. Qualitative of Method

    Questionnaire

    Expert Interviews

    Critical Success Factors

    Decision Support System

    II. Quantitative of Method

    Financial Indicator Analysis of Revenue

    Financial Indicator Analysis of Cost

    Non-Financial Indicator Analysis of Human

    Non-Financial Indicator Analysis of Process

    III. Internal Performance of Method

    Balanced Scorecard (Internal Business Perspective)

    Performance-based Evaluation

    Activity-based Evaluation

    Plan-Do-Check-Act Cycle

    IV. External Performance of Method

    Benchmarking

    Best Practices

    V. Project-Oriented Method

    Social Patterns

    KM Project Management Framework

    KM Project Management Model

    VI. Organizational-Oriented Method

    Technology

    Process

    Intellectual Capital

    BSC


    VII. Knowledge Management Performance

    Knowledge Management Evaluation

    Knowledge Management Evaluation of Method

    Time of Performance

    Respondents will checkmark the level that best describes their degree of agreement with the statements. The higher the level, the more important the evaluation of knowledge management.
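Aggregating the checkmarked levels into per-item importance scores is straightforward; a minimal sketch follows, using hypothetical responses (the actual survey data are not reproduced here), with each respondent recorded as a mapping from item to chosen level.

```python
# Hypothetical responses: each respondent maps a questionnaire item
# to the agreement level (1-10) they checkmarked.
responses = [
    {"Questionnaire": 8, "Expert Interviews": 7, "Benchmarking": 6},
    {"Questionnaire": 9, "Expert Interviews": 5, "Benchmarking": 7},
    {"Questionnaire": 7, "Expert Interviews": 6, "Benchmarking": 8},
]

def mean_importance(responses):
    """Average the 1-10 agreement levels per item across all respondents."""
    levels_by_item = {}
    for resp in responses:
        for item, level in resp.items():
            levels_by_item.setdefault(item, []).append(level)
    return {item: sum(levels) / len(levels)
            for item, levels in levels_by_item.items()}

print(mean_importance(responses))
# → {'Questionnaire': 8.0, 'Expert Interviews': 6.0, 'Benchmarking': 7.0}
```

Higher averages then indicate the evaluation methods that respondents consider more important for knowledge management.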

    Personal Data

    Age: _______ Sex: □ Male □ Female

    Company: ________________ Department: ________________ Position: ________________ Experience: _______ years

    Education: □ High School □ College □ University □ Postgraduate
