
Cloud-based Facility Management Benchmarking

Sofia Pereira Martins

Thesis to obtain the Master of Science Degree in

Information Systems and Computer Engineering

Supervisor: Prof. Paulo Jorge Fernandes Carreira

Examination Committee

Chairperson: Prof. José Luís Brinquete Borbinha
Supervisor: Prof. Paulo Jorge Fernandes Carreira
Member of the Committee: Prof. Inês Flores-Colen

May 2015


Acknowledgments

First of all, I must thank my advisor, Professor Paulo Carreira, who contributed excellent ideas and resources and connected me with the right people. I thank him for his full and expert support during the development of this thesis, which made this work all the more interesting and important.

Thank you to Sara Oliveira for all her help on the subjects of Facility Management and Benchmarking, and thank you to APFM for helping with the distribution of the questionnaires.

A big thank you to Bernardo Simoes, who helped me learn Ruby on Rails from scratch and had patience even when I was being obtuse. Thank you to all my friends for being there through the fun and the working times, and a special thanks to Andreia Santos, who was my great companion throughout the course. I also need to thank Guilherme Vale for his help, especially at the end of this thesis.

I am especially thankful to my family, who have supported me through everything all my life. A big special thank you to my brother, Andre Martins, who has always guided me and helped me become who I am today.


Resumo

Facility expenses, such as equipment maintenance or space cleaning, constitute a large slice of organizations' base cost. Modern Facility Management uses specific software, such as Computer Aided Facility Management software, to identify and optimize the performance of facilities by benchmarking different categories of non-core business indicators.

However, a major problem is that it is not clear to organizations whether such software can bring optimizations in terms of performance, that is, it is not clear that the practice of Facility Management is directly linked to the performance of the best in their industry. In general, there is a need for a benchmarking solution that brings together the Facility Management indicators of distinct facilities, identifying the contrasts and similarities between them.

This thesis studies the problem of Facilities Management benchmarking, through the existing tools for facility benchmarking and taking into account both the scientific literature and current industry practice, and proposes a Cloud solution to integrate the Key Performance Indicators of distinct facilities. Specifically, this thesis systematizes and prioritizes a list of the most used indicators and then develops a Cloud solution that integrates indicators of distinct facilities, validating the usefulness of the aforementioned list. The solution is validated through a set of usability and performance tests, with promising results.

Using this solution, each organization will be able to track its ranking relative to other organizations and to know whether it can further optimize its Facility Management practice. This is expected to lead to a global improvement in the operation of buildings.

Keywords: Facilities Management, Key Performance Indicators, Benchmarking, Cloud Computing, Data Analysis, Measurements, Standards


Abstract

Facilities expenditures, such as equipment maintenance or space cleaning, constitute a big slice of organizations' base cost. Modern Facilities Management employs specific software, such as Computer Aided Facility Management software, to identify and optimize facilities performance by benchmarking different categories of non-core business indicators.

A pervasive problem, however, is that organizations do not know whether they can bring those optimizations to new levels of performance, i.e., they do not perceive whether their Facilities Management practice is in line with the performance of the best in their industry. Overall, there is a need for a benchmarking solution that brings the Facilities Management indicators of distinct facilities together, identifying the contrasts and similarities between them.

This thesis studies the problem of Facilities Management benchmarking, reviewing the existing tools for facilities benchmarking and taking into account both the scientific literature and current industry practice, and proposes a Cloud-based solution to integrate the Key Performance Indicators of distinct facilities. Specifically, the thesis systematizes and prioritizes a list of the most commonly used indicators and then develops a Cloud-based solution that integrates indicators from distinct facilities, validating their usefulness. The solution is validated through a set of usability and performance tests, with promising results.

Using this solution, organizations will be able to track their ranking with respect to other organizations and to know whether they can further optimize their Facilities Management practice. This is expected to lead to a global improvement in the operation of facilities.

Keywords: Facilities Management, Key Performance Indicator, KPI, Benchmarking, Cloud

Computing, Data Analysis, Measurements, Standards


Contents

Acknowledgments
Resumo
Abstract
List of Tables
List of Figures
Glossary

1 Introduction
1.1 Motivation
1.2 Problem Statement
1.3 Methodology and Contributions
1.4 Document Structure

2 Concepts
2.1 Facilities Management
2.2 FM Software Systems
2.3 Benchmarking
2.4 Key Performance Indicators
2.5 Cloud Computing
2.6 Cloud Computing Concepts
2.7 Cloud Computing Models
2.8 Cloud Computing Benefits
2.9 Cloud Computing for Facility Management (FM)

3 State-of-the-art
3.1 International Standards for FM Benchmarking
3.2 Benchmarking
3.2.1 Benchmarking in Business
3.2.2 Benchmarking in FM
3.3 Existing Software Solutions
3.3.1 ARCHIBUS
3.3.2 PNM Soft Sequence Kinetics
3.3.3 Discussion
3.4 Scientific Literature
3.5 Analysis
3.5.1 Priority Analysis
3.5.2 Normalized FM Indicators

4 Solution Proposal
4.1 Overview
4.2 Requirements
4.2.1 Elicitation Interviews
4.2.2 User Requirements
4.2.3 Functional Requirements and Constraints
4.2.4 Defining a System Domain Model
4.2.5 Creating the Interfaces Conceptual Model
4.3 Solution Architecture
4.3.1 Server Side
4.3.2 Database Layer
4.3.3 Client Interface
4.4 Implementation Process
4.4.1 Languages and Frameworks
4.4.2 Version Control
4.4.3 Test Driven Development
4.4.4 Deployment
4.5 Solution Implementation Issues
4.5.1 Database Schema
4.5.2 VAT ID and ZIP Code Validation
4.5.3 User Authentication
4.5.4 Seed and Fixture for Testing
4.5.5 KPIs Computation
4.5.6 Implementation of Granular Metrics
4.5.7 Excel File Import and Values Verification
4.5.8 Google Places Auto-complete
4.5.9 Metrics List Improvement
4.5.10 Login and Register Modal

5 Evaluation
5.1 Validation of the Indicators
5.2 Usability Tests
5.2.1 Defining Scenarios and Tasks
5.2.2 Usability Tests Results
5.3 Performance Tests
5.3.1 Cache Efficiency Tests
5.4 Discussion

6 Conclusions
6.1 Lessons Learned
6.2 Future Work

Bibliography

A Indicators Tables
A.1 Indicators Table 1
A.2 Indicators Table 2

B Questionnaires
B.1 Routine Cleaning Quality Questionnaire
B.2 Special Cleaning Quality Questionnaire
B.3 Users Questionnaire about KPIs
B.4 Usability Testing: Google Form
B.5 Usability Testing: Scenarios and Tasks

C Prototype
C.1 Measures Screen
C.2 Benchmarking Reports Screen
C.3 User Details Screen
C.4 Home Screen Prototype
C.5 Facility Details Screen
C.6 Sign In Screen


List of Tables

2.1 SMART characteristics for performance indicators.
3.1 List of relevant International Organization for Standardization (ISO) standards for Facilities Management and Maintenance.
3.2 List of European (EN) FM and Maintenance Standards.
3.3 Characteristics of distinct FM software packages.
3.4 List of KPIs covered by the market-leading FM software packages.
3.5 Use of the different metrics in UK benchmarking.
3.6 The final list of 23 Key Performance Indicators (KPIs) to support operational requirements of organizations.
3.7 List of KPIs covered by the literature.
3.8 Full list of KPIs covered by the literature.
3.9 Final list of Proposed KPIs.
4.1 Functional system requirements.
4.2 System Constraints.
5.1 Usability Testing Tasks.
5.2 Qualitative Testing Questionnaire.
5.3 Usability Testing Users' Information.
5.4 Results from performance testing the Benchmarking Page.
A.1 Key Performance Indicators for FM.
A.2 List of reports of the ARCHIBUS FM software package.
B.1 Example of a Routine Cleaning Quality Questionnaire.
B.2 Example of a Special Cleaning Quality Questionnaire.


List of Figures

1.1 Outline of the research methodology followed by this thesis.
2.1 Arrangement of the six classes of software applications for FM.
3.1 Organization of measurement methods into categories.
3.2 International Facility Management Association (IFMA) Benchmarking Methodology, adapted from Roka-Madarasz.
3.3 Measurement Categories ordered by importance.
3.4 FM KPIs organized according to frequency and importance.
4.1 FM benchmarking system architecture.
4.2 System Domain Model.
4.3 Mockup Figures.
4.4 Mockup Figures.
4.5 Systems architecture overview.
4.6 Database arrangements overview.
4.7 Top Bar specifics.
4.8 Examples of User Details and Facility Details screens rendered on the Inner Page.
4.9 Examples of Metrics and Indicators Report screens rendered on the Inner Page.
4.10 Database Schema.
4.11 VAT ID Validation Code.
4.12 Fixture Code.
4.13 Seed Code.
4.14 Granular Metric Code.
5.1 Results obtained through the KPI questionnaire.
5.2 Results obtained through the Usability Testing.
5.3 Results obtained through the Usability Questionnaire.
B.1 First page of the users' questionnaire about KPIs.
B.2 Second page of the users' questionnaire about KPIs.
B.3 Google Form from Usability Testing, first page.
B.4 Google Form from Usability Testing, second page.
B.5 Scenarios and Tasks from Usability Testing.
C.1 Measures prototype screen.
C.2 Benchmarking Reports prototype screen.
C.3 User details editing screen.
C.4 Home screen for unauthenticated users.
C.5 Facility Details screen.
C.6 Sign In screen.


List of Acronyms

BAS Building Automation Systems
BCIS Building Cost Information Service
BIM Building Information Models
CAD Computer Aided Design
CAFM Computer Aided Facility Management
CC Cloud Computing
CMMS Computerized Maintenance Management Systems
EMS Energy Management Systems
ERP Enterprise Resource Planning
FM Facility Management
FTE Full Time Equivalent
GEA Gross External Area
GIA Gross Internal Area
IaaS Infrastructure as a Service
ICS International Classification of Standards
ICT Information and Communication Technology
IFMA International Facility Management Association
IMS Incident Management Systems
ISO International Organization for Standardization
IT Information Technology
IWMS Integrated Workspace Management System
KPI Key Performance Indicator
NFA Net Floor Area
NIA Net Internal Area
PaaS Platform as a Service
PI Performance Indicator
RICS Royal Institution of Chartered Surveyors
SaaS Software as a Service
TDD Test Driven Development


Chapter 1

Introduction

Facility Management (FM) is a non-core activity that supports organizations in the pursuit of their objectives (core business), and is considered "the practice of coordinating the physical workplace with the people and work of the organization, integrating the principles of business administration, architecture, and the behavioral and engineering sciences" [1].

Over the past three decades, FM has seen significant development due to a number of factors, ranging from an increase in construction costs to increased performance requirements by users and owners, as well as a greater recognition of the effect of space on productivity [2]. Consequently, FM has matured into a professional activity and it is estimated that, along with related activities, it represents more than 5% of the GDP of both European countries and the US [3].

Since facilities-related expenditure is a big slice of organizations' base cost, FM has been equipping itself with appropriate tools [4, 5], such as Computer Aided Facility Management (CAFM), Building Information Models (BIM), Computerized Maintenance Management Systems (CMMS), Computer Aided Design (CAD), Building Automation Systems (BAS), Energy Management Systems (EMS), Enterprise Resource Planning (ERP), and Integrated Workspace Management Systems (IWMS).

These software applications rely on large amounts of integrated data from which measures and indicators can be extracted and, with them, Key Performance Indicators (KPIs) calculated. Indeed, these KPIs give important insight into the functioning of FM activities, as keeping track of KPIs is one aspect of quality control [6], and they enable organizations to compare one another on the performance of their facilities and services, since organizations have to perform better than their competitors while operating at lower costs.

According to the European FM norm EN 15221-7, benchmarking can be defined as "part of a process which aims to establish the scope for, and benefits of, potential improvements in an organization through systematic comparison of its performance with that of one or more other organizations" [7]. It has also been defined as the search for "industry best practices that lead to superior performance" [8], and it has a key role to play in FM [9]. Benchmarking can be used either for performance comparison between distinct organizations or between facilities within the same organization. A facility can also be compared with itself at different points in time. Therefore, many organizations perform regular benchmarking to reassess their overall costs [9].


Benchmarking provides important advantages to FM, such as the justification of energy consumption and costs, the identification of weaknesses/threats, strengths/opportunities, and best practices, and the addition of value to facilities by integrating them into CAFM systems and supporting maintenance management. Although some organizations have their own benchmarking software, such software is not compatible across distinct organizations, and there is no centralized mechanism to integrate all these data. Therefore, benchmarking between FM organizations is not yet possible.

Accurate benchmarking requires the comparison of appropriate measures and indicators. Thus, the importance of standards resides in the creation of specifications that normalize how companies manage their FM data, enabling compatibility of analysis results between organizations and ensuring that there is no misinterpretation between different organizations regarding a given measurement.

Furthermore, similarly to other management disciplines, it is still not clear which are the most important KPIs to be used in FM. Moreover, the success of a benchmarking strategy depends on a number of organizational factors, such as the management support behind it, the personality of the managers, the organizational structure, and the FM contract [9].

To date, the comparison between facilities of distinct organizations remains impossible: data are represented differently, organizations use distinct indicators, and it is still not clear which are the most important KPIs for organizations in each sector. The same holds for the field of FM.

Another recent trend is cloud solutions [10], which are being employed successfully in sectors such as energy, maintenance, and space management [11, 12, 13]; even churches use them now [14]. These cloud solutions present several benefits, such as savings in Information Technology (IT) costs and maintenance (since no installation of equipment or software, nor their maintenance by the organization's IT department, is necessary), strong integration capabilities, short time-to-benefit, and scalable computation on demand that keeps up with customer needs [15, 16]. Specifically for benchmarking, cloud applications enable the integration of data from web data sources and the processing of data in a way that is accessible to everyone, anywhere [17].

A benchmarking application that enables organizations to send facilities-related information in a standardized format, to be processed and presented graphically, is a valuable addition, since it enables every organization's benchmark results to be compared with one another by creating interfaces that can communicate with the software that captures the measurements. This makes a ranking between organizations possible, which would generate healthy competition and spur the improvement of each organization's FM.
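Purely as an illustration of what such a standardized submission could look like, the following Ruby snippet builds a JSON document of facility measures for one reporting period; every field name here is an assumption made for the example, not the format adopted later in this work.

    require "json"

    # Hypothetical standardized payload of facility measures for one
    # reporting period; all field names are illustrative assumptions.
    payload = {
      facility: {
        name: "Headquarters",
        country: "PT",
        gross_floor_area_m2: 12_500
      },
      period: "2014-Q4",
      measures: {
        cleaning_cost_eur: 18_300,
        maintenance_cost_eur: 42_750,
        energy_consumption_kwh: 310_000,
        occupants_fte: 640
      }
    }

    # The cloud application would validate such a document and store the
    # measures for KPI computation and cross-organization ranking.
    puts JSON.pretty_generate(payload)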

Therefore, the central motivation for this work is to study the migration of facilities benchmarking to the cloud using the latest computing technologies and to design a solution in which a cloud application receives all important data from multiple sources; these data correspond to the various metrics necessary for the calculation of a set of KPIs, identified through an analysis of related work. With this information, the solution also carries out a benchmark comparison between distinct organizations in the industry according to the obtained KPIs.

This thesis analyses the literature and standard benchmarking surveys to understand which indicators are most commonly used in FM. These indicators were validated through the study of scientific works and existing standards and through interviews with FM experts. This process was also useful to understand which features should be implemented in the solution. The interface mockups of a Cloud-based solution for FM Benchmarking were validated by FM and design experts. The solution was implemented in Ruby on Rails and validated with regard to usability and performance.

1.1 Motivation

Using a Cloud solution that aggregates benchmarking information of distinct facilities and ranks them according to their performance, facility managers will gain a deeper insight into their own FM areas. The following two scenarios illustrate this point.

Scenario 1 Consider an organization that has applied FM and where benchmarking has been practiced for some time. This organization decides to use the application proposed in this document and, through it, verifies that its position rates well below expectations. Thus, seeing their ranking, they become motivated to improve (as they now have a perception of their room for improvement), both globally and in particular, at the indicator level.

Scenario 2 Consider two distinct organizations that are using the cloud benchmarking application presented in this document. The first organization has come first in the ranking for some time. Suppose that the second organization takes its place in the rank and that the first organization wants to regain its position. This creates healthy competition among the participants (who do not know the identity of one another), leading to a scenario of dynamic improvement.

There are some solutions, detailed in Section 3.3 of Chapter 3, that address facility management and facility benchmarking. However, today's solutions are difficult to use, their information is difficult to read and understand, and none of them can tell an organization its position in the market relative to its competition.

1.2 Problem Statement

As made clear before, there is as yet no agreement about which KPIs should be applicable in each sector. According to Hinks and McNay [18], the lack of generalized sets of data and industry-wide sets of KPIs results in poor comparability of performance metrics across organizations and industries. Furthermore, there is still a lack of FM solutions that enable integrating data from different organizations in a way that brings gains for them. Organizations continue to use distinct software to support their FM and KPI gathering, which hinders the aggregation and analysis of data.

Our hypothesis is that a cloud-based and vendor-independent FM benchmarking solution will enable organizations to know their positioning and to compare the performance of distinct facilities (in the case of facilities managed by the same entity), through a set of metrics provided by the facility managers, while at the same time engaging them in performance-improvement behaviors.

This thesis aims at developing a platform for comparing KPIs between organizations. Our solution will thus create a vendor-independent FM Benchmarking system that uses international standard indicators to compare distinct organizations and enables them to improve themselves through this comparison.

1.3 Methodology and Contributions

The methodology of this thesis starts with an analysis of the literature and of standard benchmarking surveys, followed by the systematization of the most commonly used indicators in FM. These indicators then undergo a prioritization to identify the most relevant ones. The prioritization and validation make use of scientific studies, existing standards for FM, and interviews with FM experts, analyzing the most important indicators on a theoretical and practical level and determining which features should be implemented in the proposed solution.

Then, a set of mockups of the final solution is developed and validated through interviews with FM Benchmarking experts and design experts. Finally, a Cloud-based prototype is implemented and undergoes validation through performance testing and usability tests with users.

This process is depicted in Figure 1.1. More specifically, the contributions of this document are as follows:

• A rigorous comparative study between the different Facilities Management standards;

• An evaluation and comparison between the main FM benchmarks and their corresponding indicators;

• The identification, through interviews with experts, of the main benchmarking indicators of interest for benchmarking distinct facilities;

• A survey of the main software tools for FM benchmarking;

• The design of the architecture of the Cloud benchmarking application and its implementation;

• An evaluation of the developed application in terms of performance and usability.


Figure 1.1: Outline of the research methodology followed by this thesis. Each box with a specific color corresponds to a stage of development. The lines indicate the order in which the stages were applied.

1.4 Document Structure

The remainder of this thesis is organized as follows. Chapter 1 introduces the contents of this document. Chapter 2 then explains in detail some important subject matter in the area, such as FM, Benchmarking, KPIs, and Cloud Computing (CC). Chapter 3 discusses standards organizations and benchmarking standards for FM, gives an overview of existing benchmarking solutions, presents the scientific literature with a study of several papers related to FM Benchmarking (to understand the most used and most useful KPIs for organizations), and discusses the normalization and prioritization of KPIs to conclude which indicators should be used internationally across organizations. Chapter 4 describes the process undertaken to achieve the final prototype solution. Then, in order to validate the proposed solution, Chapter 5 describes the set of tests performed. Lastly, Chapter 6 summarizes the work performed and points out directions for future work.


Chapter 2

Concepts

This section introduces key aspects of FM indicators, Benchmarking, and Cloud applications. The discipline of FM and its importance is introduced, followed by a description of the distinct software systems that support FM activities. These software packages gather KPIs that can be used for Benchmarking, enabling the comparison of performance aspects such as operating costs, maintenance and cleaning activities, space utilization, energy consumption, or administrative costs. Cloud Computing is introduced since it is increasingly being applied to FM.

2.1 Facilities Management

FM can be understood as the result-oriented management of facilities and services, securing best value in the provision of services and making the organization more efficient by creating better conditions at lower expenditure. Thus, a Facility Manager deals "with the proper maintenance and management of the facility in a way that expectations of the occupant, owner or portfolio manager are met, and maximum value from the facility is provided and maintained for all stakeholders" [19].

FM aims at creating and maintaining an effective built and equipped environment to support the successful business operation of an organization, i.e., its core business [20], by offering employees better working conditions and rationalizing expenditures related to the facility. While core activities are bound to the central business of the organization and its strategy, FM is a non-core activity that supports the organization in the pursuit of its objectives, and is considered by the IFMA to be "the practice of coordinating the physical workplace with the people and work of the organization, integrating the principles of business administration, architecture, and the behavioral and engineering sciences" [1].


2.2 FM Software Systems

FM is supported by specialized software, such as CAFM packages, that tracks space usage, cable pathways, employee locations, security, and access control.

CAFM systems are often integrated with CAD, which is important to support the planning and monitoring of spaces and the activities in them, and with a database back-end that contains non-graphical data about the spaces. CAFM software also enables managing changes to the space, since changes can be tried and tested on the computer before they are made, which can avoid future problems. CAFM systems can be populated from BIM information containing spatial relationships, geographic information, quantities, and component properties [21], and they also interface with other systems such as CMMS to manage preventive maintenance activities. They also help decrease the time from task request to task completion, increase the speed and accuracy of information related to each task, and provide improved cost and trend analysis [22].

These systems bring along several benefits, such as: i) efficient completion of operational sequences (entry and analysis of data), ii) increased worker productivity (by determining property improvements), iii) potential cost savings (in areas such as cleaning contracts and energy consumption), iv) analysis of cost information, v) support for management decisions, vi) precise valuation of fixed assets, and vii) optimization of space utilization.

Beyond these software systems there are others, for instance Incident Management Systems (IMS), which are used by operators (who register incidents) or technicians (who deal with occurrences and close them) to register, centralize, and follow each occurrence's status. These systems can generate an automatic work backlog for each technician, which improves work efficiency. They also provide reporting services, statistics, incident status, etc. IMS can be integrated with CMMS, or with the incident management systems of third-party service suppliers, to fill in work order requests.
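As a toy illustration of such backlog generation, the following Ruby sketch groups open incidents by their assigned technician; the records and field names are invented for the example and do not reflect any particular IMS.

    # Hypothetical incident records as an IMS might store them.
    incidents = [
      { id: 1, status: :open,   assignee: "ana", summary: "HVAC noise, floor 2" },
      { id: 2, status: :closed, assignee: "rui", summary: "Broken door handle" },
      { id: 3, status: :open,   assignee: "rui", summary: "Flickering lights" }
    ]

    # Keep only open occurrences and group them per technician,
    # yielding one automatic work backlog per assignee.
    backlog = incidents
      .select { |incident| incident[:status] == :open }
      .group_by { |incident| incident[:assignee] }

    backlog.each do |technician, items|
      puts "#{technician}: #{items.map { |i| i[:summary] }.join('; ')}"
    end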

A CMMS creates and associates maintenance plans, which can be grouped by type of device, for each piece of equipment of an organization, such as air conditioning units, escalators, or furniture. Some CMMSs assist in performing contract management and equipment management. Some CMMSs can be integrated with BAS to obtain equipment usage metrics; however, some information cannot be retrieved directly from the BAS, although it can be obtained using gathering devices.

BAS manage every aspect of a building's environment by automatically controlling the devices installed in the building. Facility managers can remotely command and supervise automation sub-systems in real time. A BAS keeps a history log of the status of each device and defines alarm conditions that can be used to detect malfunction symptoms or device failures, and it can be integrated with CMMS or CAFM tools to report space usage metrics obtained from occupancy sensors.

EMS gather all energy consumption information, such as electric current consumption, from energy meter devices installed throughout the building. This information can be analyzed in detail and enables facility managers to analyze consumption variation within comparable periods of time. BAS and EMS can be connected together to create consumption profiles and determine the relative contribution of devices or groups of devices to the overall energy consumption.
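A minimal sketch of this last computation, with invented device groups and readings, could look as follows in Ruby:

    # Hypothetical EMS readings aggregated by device group (in kWh);
    # both the grouping and the figures are illustrative assumptions.
    consumption_kwh = { hvac: 182_000, lighting: 74_000, elevators: 21_000, other: 33_000 }
    total = consumption_kwh.values.sum.to_f

    consumption_kwh.each do |group, kwh|
      share = 100 * kwh / total
      puts format("%-10s %8d kWh %5.1f%%", group, kwh, share)
    end
    # => hvac accounts for 58.7% of the total, lighting for 23.9%, and so on.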


Figure 2.1: Arrangement of the six classes of software applications for FM. The interoperability between the classes is represented by a line between them. IWMS is a group composed of three classes: CMMS, ERP, and CAFM.

There are also ERP systems; although they are not considered FM applications, they are very important in the FM environment, where the management of suppliers, logistics, accounting, billing, and orders takes place.

IWMS suites integrate the functionality of ERP for FM with CAFM and CMMS in one single application. The arrangement of all the software systems referred to above can be seen in Figure 2.1.

Most of these systems today are web-based, enabling easier entry of facility data, which can then be analyzed and consulted anywhere. From the analysis and benchmarking of these data emerges a set of KPIs, over a core of skills aligned with business objectives, as a way to measure current levels of performance and achievement.

2.3 Benchmarking

Benchmarking has been defined by Camp as the search for "industry best practices that lead to superior performance" [8]. In facilities, it is a process that compares performance aspects such as operating costs, maintenance and cleaning activities, space utilization, energy consumption, or administrative costs. It uses previously established metrics, identifies differences and alternative approaches, and assesses opportunities for improvement and change. Overall, it is a process that gives organizations instruments to comprehend how they are performing, both internally and towards customers.

Benchmarking of facilities brings many advantages for organizations, such as increasing the awareness of the need for change and restructuring, an improvement in and of itself, by forcing organizations to examine their present processes, and not wasting time and resources when someone else has already done it better, faster, and cheaper. The indicators used must all be SMART [23], as presented in Table 2.1.

Historically, FM started with a focus on performance measurement, which had three broad purposes: ensuring the achievement of goals and objectives; evaluating, controlling, and improving procedures and processes; and comparing and assessing the performance of different organizations, teams, and individuals. Over the years, however, these purposes shifted towards quality and consumer satisfaction [5], translated into a set of requirements: knowing what clients require of the organization's processes; involving key stakeholders in the benchmarking processes (users, investors, regulators, product manufacturers, designers [24]); eliminating resistance to change; and, finally, accepting that results are not always available instantaneously.

Benchmarking is not a simple aggregation of a few indicators; it should also include the meaning and the purpose of each indicator. Camp lists four fundamental steps for benchmarking [8]:

• Knowing the operation, to evaluate internal operational strengths and weaknesses.

• Knowing the industry leaders or competitors, to learn the strengths and weaknesses of the competition.

• Incorporating the best, to emulate the strengths of the leaders in the competition.

• Gaining superiority, to go beyond the best practices installed and be the best of the best.

Benchmarking allows comparing results between organizations, which can potentially result in an improvement and enhancement of FM for each of the considered organizations. However, buildings can easily demand more than their occupants and management are prepared to afford, mainly because they have not defined the level of management they regard as reasonable [25]. Accordingly, it is essential to have specific standards of measurement and specific metrics to ensure a common understanding of performance and to identify performance gaps [26].

To summarize, a number of issues still have to be addressed in order to achieve appropriate and efficient FM benchmarking, such as: i) the role of standards, ii) performance criteria, and iii) verification methods within the overall regulatory system [27]. Clearly, the FM benchmarking process requires a planning phase that decides which data to collect [28].
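As a small illustration of gap identification, the Ruby sketch below ranks one facility's value of a cost KPI (where lower is better) against anonymous peer values; both the data and the percentile convention are assumptions made only for the example.

    # Percentile rank of one organization's KPI value among its peers,
    # for a KPI where lower values are better (e.g., maintenance cost
    # per square meter). Peer values and the test figure are made up.
    def percentile_rank(own_value, peer_values)
      return nil if peer_values.empty?
      better_or_equal = peer_values.count { |peer| own_value <= peer }
      (100.0 * better_or_equal / peer_values.size).round(1)
    end

    peers = [14.2, 9.8, 11.5, 17.3, 8.9, 12.4]  # EUR per m2, hypothetical
    puts percentile_rank(10.7, peers)           # => 66.7 (better than two thirds of peers)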

2.4 Key Performance Indicators

The first step of benchmarking is establishing performance objectives and metrics. Measurements are a direct representation of the scale of the organization and its measurable items (for an organization's internal usage), while indicators are quantifiable metrics that reflect the achievement of goals by the organization (external usage). The services and deliverables provided by an organization are measured according to Performance Indicators (PIs) [29]. It is also important to distinguish between PIs and KPIs. PIs are collected in many complex systems, such as manufacturing, marketing, and sales, among others [6].

It is known that indicators are not always perfect and may have problems of definition and interpretation. Nevertheless, they are important as they give insight into how well a system is functioning,

which is one aspect of quality control [6]. PIs inform what the current performance is, while KPIs inform how to increase performance. KPIs are measures that provide essential information about the performance of facility services delivery, and they are established in order to measure performance and monitor progress over time [30]. Therefore, KPIs represent a set of metrics that focus on the most critical aspects of organizational performance [31].

Specific: in the sense that they are well defined and clearly understood.
Measurable: meaning that there is a well-defined process that enables KPI tracking.
Attainable: to be achievable with the data available.
Realistic: in the sense that it can be measured at a reasonable cost.
Time driven: in the sense that it corresponds to a time interval.

Table 2.1: SMART characteristics for performance indicators.

KPIs should be distinguished according to the distinct roles within an organization. It is also common for associate directors or heads of advisers to have custom KPIs for their specific responsibilities. Furthermore, there are various generic KPIs for other professionals and for managerial personnel, based on business measures.

Accordingly, FM departments must have their own KPIs, aligned with the core business KPIs [32]. The main typical FM KPIs (over a unit of time), according to Teicholz [32], are: i) Operational Hours, ii) Response Times, iii) Rework, iv) Value Added, v) Number and performance of suppliers, vi) Employee satisfaction, vii) Innovations (new processes), viii) Customer satisfaction, ix) Plan versus actual on contracts, and x) Number of items on task lists.
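The sketch below, in Ruby (the language in which the prototype is later implemented), shows one plausible way of deriving a time-bound cost KPI from two recorded measures; the names, figures, and rounding are assumptions for illustration, not the computation used by the solution itself.

    # A KPI as a small value object: specific (one well-defined ratio),
    # measurable (derived from two recorded measures), and time driven
    # (bound to a reporting period). All names here are hypothetical.
    KPI = Struct.new(:name, :unit, :period, :value)

    def maintenance_cost_per_m2(cost_eur, floor_area_m2, period)
      raise ArgumentError, "floor area must be positive" unless floor_area_m2.positive?
      KPI.new("Maintenance cost per square meter", "EUR/m2", period,
              (cost_eur / floor_area_m2.to_f).round(2))
    end

    kpi = maintenance_cost_per_m2(42_750, 12_500, "2014-Q4")
    puts "#{kpi.name} (#{kpi.period}): #{kpi.value} #{kpi.unit}"
    # => Maintenance cost per square meter (2014-Q4): 3.42 EUR/m2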

2.5 Cloud Computing

Cloud Computing (CC) is on the rise and is increasingly being applied in various fields. CC provides an environment that enables resource sharing in terms of scalable infrastructures, middleware and application development platforms, and value-added business applications [33].

2.6 Cloud Computing Concepts

The Cloud is an aggregation, in a single place, of a set of resources such as networks, servers, applications, data storage, and services, which the end user can access and use "with minimal management effort or service provider interaction" [34]. The main goal of CC is to make better use of distributed resources, combining them to achieve higher throughput and to be able to solve large-scale computation problems [35]. The following characteristics contribute to this goal [36]:

On-demand self-service enables the consumer to acquire computing capabilities, such as server time and network storage, as necessary, without involving human interaction with the service provider.

Broad network access, in the sense that capabilities are available over the network and accessed through standard mechanisms that promote heterogeneous use by thin and thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).

Rapid elasticity means that computing capabilities are provisioned and released in order to scale rapidly outward and inward proportionally with demand. Therefore, the capabilities available often appear to be unlimited to the user and can be appropriated in any quantity at any time.

Resource pooling, whereby computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. The customer has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (country, state, or data center). Examples of resources include storage, processing, memory, and network bandwidth.

Measured service, to control and optimize resources by type of service, such as storage, processing, bandwidth, and active user accounts. Thus, resource usage can be monitored, controlled, and reported, providing transparency for both the provider and the consumer of the service.

2.7 Cloud Computing Models

Today several service models are available, namely Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) [37]:

Infrastructure as a Service (IaaS) the consumer has the capability to acquire processing, storage, networks, and other fundamental computing resources, and is able to deploy and run arbitrary software, which can include operating systems and applications. An example is Amazon EC2, which provides a web service interface to easily request and configure capacity online [38]. At a higher layer, IaaS covers computational, storage, and network services; an example of a storage service is Amazon's Dynamo [39].

Platform as a Service (PaaS) the consumer has the capability to deploy onto the cloud infrastructure consumer-created or acquired applications, created using programming languages, libraries, services, and tools supported by the provider. Examples are Google's App Engine [40] and Heroku [41].

Software as a Service (SaaS) the applications are accessible from various client devices through either a thin client interface, such as a browser, or a program interface. The consumer does not manage or control the underlying cloud infrastructure, including the network, servers, operating systems, etc.

Application developers can either use the PaaS layer to develop and run their applications or directly use the IaaS infrastructure [37].
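To make the PaaS notion concrete: on a platform such as Heroku, the unit of deployment is the application itself, while the provider supplies and manages the machines, runtime, and routing, typically injecting settings such as the listening port through the environment. A minimal, purely illustrative Ruby sketch of such an application (the service name and response are invented) could be:

    require "webrick"

    # A PaaS provider conventionally injects the port to listen on via
    # the environment; everything below the application itself (servers,
    # operating system, scaling) is managed by the platform.
    server = WEBrick::HTTPServer.new(Port: ENV.fetch("PORT", "8080").to_i)
    server.mount_proc("/") do |_request, response|
      response.body = "FM benchmarking service up\n"
    end
    trap("INT") { server.shutdown }
    server.start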


2.8 Cloud Computing Benefits

CC exempts organizations from the burden of installing software. Infrastructure requirements are thus low, because the service is sold on demand and the infrastructure can be rented rather than bought [11]. Also, cloud applications are managed and updated by the provider, who takes care of server maintenance and security issues [11]. This removes the burden of the IT infrastructure and, consequently, lowers costs. Because the cloud enables access to applications from both mobile and desktop devices, personnel can work flexibly anywhere and anytime; work traveling restrictions, such as different time zones or access to the software, are no longer an issue. Therefore, functionality, maintenance, and availability are the main advantages of CC [16].

In short, cloud solutions present several benefits that keep up with customer needs and have pushed many residential and commercial solutions to the cloud, such as:

• Savings in IT and maintenance costs, since it avoids the overhead of acquiring and maintaining hardware, software, and IT staff

• Easy access and up-to-date data, because applications can be accessed from anywhere in the world with an Internet connection and a browser, i.e., without having to download or install anything

• Short time-to-benefit, with quick deployment of IT infrastructures and applications. The software deployment times and resource needs associated with rolling out end-user cloud solutions are significantly lower than with on-premise solutions

• Improved business processes, with better and faster integration of information between different entities and processes

• Scalable computation on demand, to cope with constantly changing environments and usage patterns.

As with any new technology, CC brings some issues that may prove adverse if not taken care of. The major concerns about CC are security and privacy, network performance, and reliability. However, if the correct service model (IaaS, PaaS, or SaaS) and the right provider are selected, the payback can far outweigh the risks and challenges. The cloud's implementation speed and its ability to scale up or down quickly mean that companies can react to changing requirements faster than ever before.

2.9 Cloud Computing for FM

Today CC is being applied in FM to customize scheduling and reporting, ensure health and safety compliance, manage assets, and lower the costs of managing teams [42], but also to book meeting rooms and offices and to handle catering [43]. There is also sector-specific cloud-based software, for example for maintenance management, providing service, asset, and procurement management solutions for facility managers, such as preventive maintenance schedules [44, 45]. Specifically for FM Benchmarking, CC brings a set of advantages [17] with respect to:


• Comparing information between the system parties

• Comparing the facility with international metrics/standards

• Simplifying data sharing between various stakeholders—without the Cloud, data would be stored

on in-house computers, and could not be shared easily with others, making the integration more

difficult.

Thus, it is clear that CC is already being applied to support FM activities and is likely to bring a set of new features that will increase workforce and operational efficiency.


Chapter 3

State-of-the-art

This section discusses benchmarking literature and standards in FM. Moreover, it overviews existing software solutions with respect to their benchmarking features. To understand which KPIs are most useful to organizations, the section discusses the normalization and prioritization of KPIs and derives a set of indicators to be used internationally across organizations.

3.1 International Standards for FM Benchmarking

ISO is the largest developer of voluntary International Standards, covering all aspects of technology and business. ISO has formed joint committees to develop different kinds of standards according to the commission they join: the IEC (International Electrotechnical Commission) or ASTM (American Society for Testing and Materials). The International Classification of Standards (ICS) is a structure for catalogs of international, regional, and national technical standards and other normative documents, developed and maintained by ISO. It covers every economic sector and activity where these technical standards can be used, and its objective is to facilitate the harmonization of information and ordering tools [46].

The need for FM standards has been recognized in multiple sources [47]. In this research work, we select the most important ISO standards for FM and Maintenance for those working with facilities at various stages of their life cycle; these are shown in Table 3.1, ordered by ICS. There are also standards with a specifically European scope, which can be consulted in Table 3.2.


ICS 01.110, Facilities Management:
• ISO/CD 18480-1 - Part 1: Terms and definitions
• ISO/CD 18480-2 - Part 2: Guidance on strategic sourcing and the development of agreements

ICS 01.110, Document Management:
• IEC 82045-1:2001 - Part 1: Principles and methods
• IEC 82045-2:2004 - Part 2: Metadata elements and information reference model
• ISO 82045-5:2005 - Part 5: Application of metadata for the construction and facility management sector

ICS 03.100, Risk Management:
• ISO 31000:2009 - Principles and guidelines
• ISO/TR 31004:2013 - Guidance for the implementation of ISO 31000
• IEC 31010:2009 - Risk assessment techniques

ICS 03.100, Asset Management:
• ISO 55000:2014 - Overview, principles and terminology
• ISO 55001:2014 - Management systems - Requirements
• ISO 55002:2014 - Management systems - Guidelines for the application of ISO 55001

ICS 03.080, Outsourcing:
• ISO/DIS 37500 - Guidance on outsourcing

ICS 91.010, Building Information Modeling:
• ISO/TS 12911:2012 - Framework for building information modeling (BIM) guidance
• ISO 29481-1:2010 - Information delivery manual - Part 1: Methodology and format
• ISO/AWI 29481-1 - Information delivery manual - Part 1: Methodology and format
• ISO 29481-2:2012 - Information delivery manual - Part 2: Interaction framework

ICS 91.040, Buildings and Building Related Facilities:
• ISO 11863:2011 - Functional and user requirements and performance - Tools for assessment and comparison

ICS 91.040, Buildings and Constructed Assets:
• ISO 15686-1:2011 - Service life planning - Part 1: General principles and framework
• ISO 15686-2:2012 - Service life planning - Part 2: Service life prediction procedures
• ISO 15686-3:2002 - Service life planning - Part 3: Performance audits and reviews
• ISO 15686-5:2008 - Service life planning - Part 5: Life-cycle costing
• ISO 15686-7:2006 - Service life planning - Part 7: Performance evaluation for feedback of service life data from practice
• ISO 15686-8:2008 - Service life planning - Part 8: Reference service life and service-life estimation
• ISO/TS 15686-9:2008 - Service life planning - Part 9: Guidance on assessment of service-life data
• ISO 15686-10:2010 - Service life planning - Part 10: When to assess functional performance
• ISO/DTR 15686-11 - Service life planning - Part 11: Terminology

ICS 91.040, Buildings Construction:
• ISO 15686-4:2014 - Service Life Planning - Part 4: Service Life Planning using Building Information Modeling
• ISO 6242-1:1992 - Expression of users' requirements - Part 1: Thermal requirements
• ISO 6242-2:1992 - Expression of users' requirements - Part 2: Air purity requirements
• ISO 6242-3:1992 - Expression of users' requirements - Part 3: Acoustical requirements

ICS 91.040, Performance Standards in Building:
• ISO 6240:1980 - Contents and presentation
• ISO 6241:1984 - Principles for their preparation and factors to be considered
• ISO 7162:1992 - Contents and format of standards for evaluation of performance
• ISO 9699:1994 - Checklist for briefing - Contents of brief for building design
• ISO 9836:2011 - Definition and calculation of area and space indicators

Table 3.1: List of relevant ISO standards for Facilities Management and Maintenance applicable to facilities at different stages of their life cycle, organized by ICS.

Facilities Management:
• EN 15221-1:2006 - Part 1: Terms and definitions
• EN 15221-2:2006 - Part 2: Guidance on how to prepare facility management agreements
• EN 15221-3:2011 - Part 3: Guidance on quality in facility management
• EN 15221-4:2011 - Part 4: Taxonomy, classification and structures in facility management
• EN 15221-5:2011 - Part 5: Guidance on facility management processes
• EN 15221-6:2011 - Part 6: Area and space measurement in facility management
• EN 15221-7:2012 - Part 7: Guidelines for performance benchmarking

Maintenance Management:
• NP EN 13269:2007 - Instructions for maintenance contract preparation
• NP EN 13460:2009 - Maintenance documentation
• NP EN 15341:2009 - Maintenance KPI
• NP 4483:2009 - Guide for maintenance management system implementation
• NP 4492:2010 - Requirements for maintenance services
• EN 13306:2010 - Maintenance terminology
• EN 15331:2011 - Criteria for design, management and control of maintenance services for buildings
• CEN/TR 15628:2007 - Qualification of maintenance personnel
• EN 13269:2006 - Guideline on preparation of maintenance contracts

Table 3.2: List of European (EN) FM and Maintenance standards that apply to facilities at various stages of their life cycle.


3.2 Benchmarking

Benchmarking was explained in general terms in Section 2.3 of Chapter 2; however, a more specific characterization is needed to understand the importance of benchmarking in both business and Facility Management.

3.2.1 Benchmarking in Business

Benchmarking is the process that determines who is the best. This is very important in business, for example to understand who has the best sales organization, how to quantify a standard, or who has the lowest cleaning expenses [48].

Moreover, benchmarking identifies what other businesses do to increase profit and productivity, so that those methods can subsequently be adapted in one's own organization, making the business more competitive [49]. More specifically, it is important to understand how the winner got the best results and what should be done to get there [48].

3.2.2 Benchmarking in FM

The importance of standards resides in the creation of specifications, which normalize how some activity is performed. In the case of benchmarking, standardizing how companies evaluate their FM data enables compatibility between organizations, so that a given measurement is not interpreted differently by different organizations.

Comparing results between them then becomes possible, which empowers an improvement and enhancement of facility management in each organization. Accordingly, it is essential to have specific standards of measurement and metrics to ensure a common understanding of performance and to identify performance gaps [26].

FM software from many organizations tends to use ISO standards and Royal Institution of Chartered Surveyors (RICS) measurement practices, such as the ones presented in Tables 3.1 and 3.2. Moreover, we can conclude that a cloud FM benchmarking system covering several organizations and facilities within a Portuguese scope does not yet exist; such a system could have a great impact on our country's economy, since FM represents about 5% of global Gross Domestic Product [50, 51].

RICS and BCIS Space Measurement Normalization

Collecting and analyzing concrete and reliable space measurements is of utmost importance for guaranteeing the quality of performance indicators in FM. "Inaccurate performance prediction may lead to buildings that behave worse than expected" [52]. There are many standards that specify how to perform measurements so that they can be executed in the same way by different organizations. However, reliable measurements must be performed by accredited specialists in the matter, such as members of the RICS.

RICS has a Code of Measuring Practice that deals with the practice of measurement, such as valuation techniques (zoning of shops) or special uses. It specifies measurements for i) Gross External Area (GEA): the area of a building measured externally at each floor level, ii) Gross Internal Area (GIA): the area of a building measured to the internal face of the perimeter walls at each floor level, and iii) Net Internal Area (NIA): the area within the perimeter walls at each floor level [53]. RICS provides precise definitions that permit the accurate measurement of buildings and land and the calculation of sizes (areas and volumes), presented in the RICS' Code of Measuring Practice [53].

[Figure 3.1 shows the area categories: Level Area (LA); Gross Floor Area (GFA); Internal Floor Area (IFA); Net Floor Area (NFA); Net Room Area (NRA); Technical Area (TA); Circulation Area (CA); Amenity Area (AA); Primary Area (PA); Non-Functional Level Area (NFLA); Exterior Construction Area (ECA); Partition Wall Area (PWA); and Interior Construction Area (ICA), with the Primary, Amenity, Circulation, and Technical Areas each subdivided into Restricted and Unrestricted variants.]

Figure 3.1: Organization of measurement methods into categories, i.e., the different categories into which methods and units of measurement are organized. Rentable Floor Area (RFA) = Net Floor Area (NFA) [54].

Belonging to RICS, the Building Cost Information Service (BCIS) provides built environment cost information and is the basis of early cost advice in the construction industry, offering services concerning occupancy costs, construction duration, repair costs, construction inflation, among others.

The standard EN 15221-6 also regulates area and space measurement and provides a common basis for i) planning and design, ii) area and space measurement, iii) financial assessment, and iv) benchmarking of existing, owned or leased buildings, as well as buildings in a state of planning or development [54]. It includes concepts such as rentable, lettable, leasable, equivalent rent, and corresponding terms. Methods and units of measurement are explained and illustrated per category; the distinct categories can be seen in Figure 3.1.

Today the focus is on delivering buildings at a known cost and on being able to track the reduction in costs that results from improvements in procurement [55]. For this to happen, information provided by BCIS, such as the cost per m2 of Gross Internal Floor Area for buildings and elements, is necessary.

As stated in the BCIS Elemental Standard Form of Cost Analysis [55], there must be detailed information documents about the projects, buildings, procurement, costs (Total Costs should be provided for each element and sub-element, shown separately when required and for different forms of construction), risks, and design criteria.


Figure 3.2: IFMA Benchmarking Methodology, adapted from Roka-Madarasz [4]. The first step is to identify the KPI and then use it to measure facility performance. At this point three different paths can be taken: Best-in-class Facility Performance, Own Performance, or directly to Compare. After this, the functions to benchmark have to be chosen, as well as which companies to benchmark against.

IFMA Benchmarking

The IFMA has developed a facility benchmarking methodology, useful for benchmarking current FM services, which can be seen in Figure 3.2 as shown in the report [4].

In order to measure facility performance, IFMA has established 9 KPIs, which must be easily measurable and must be defined both for monitoring the actual process and for controlling it. These Key Performance Indicators, shown in the report [4], can be seen in Table A.1 in the Appendix.

3.3 Existing Software Solutions

As specified in Section 2.1, there are many types of software for FM solutions, such as CAFM or IWMS. Well-known FM solutions like Maxpanda [56] or IBM Tririga [57] offer a simpler way to manage facilities: they centralize organizations' information, making management more efficient through business analytics and critical alerts and increasing visibility and control.

Some of them, for instance ARCHIBUS [58] or FM:Systems [59], integrate with CAD or BIM models, which is very important for visualizing department occupation and other space and occupation management areas.

Most of these systems promote their capabilities for organizational cost reduction, since they cost-justify real changes in preventive maintenance routines and predict the cost effects of preventive maintenance changes before they are made. Some permit multiple users; others make it possible for each user to access only the information appropriate to his position in the organization.

There are different sectors on which an FM system can focus, such as: i) Capital/Financial, ii) Real Estate/Retail: Construction or Project Management, iii) Space and Workplace, iv) Maintenance, v) Sustainability and Energy, vi) Move, vii) Higher Education and Public. Many of the existing solutions focus only on some of these sectors, not on all of them.

For Real Estate, typical features include the incorporation of current lease accounting standards, the tracking of dates and contractual commitments, and the management of occupancy and facility costs. For Capital/Financial, features are used to identify funding priorities within capital programs, reduce project schedule overruns, or streamline project cost accounting. In Space and Workplace it is important to have tools for space use agreements and chargeback, to increase departmental accountability for space use. Maintenance requires features to automatically route and manage both incoming and planned maintenance, while keeping internal customers up to date on the progress of their work tickets, and to streamline facility maintenance, service management, and facility condition assessments. The Sustainability and Energy sector is also very important for defining which projects will achieve the right mix of environmental benefits and cost savings, for reducing energy consumption to meet sustainability goals, and for identifying poorly performing facilities and automating corrective actions.

Characteristics covered by each package (Centralization of Organizations' Info; Business Analysis; Increased Visibility and Control; Costs Reduction; CAD/BIM Integration; Cloud Application; Benchmarking):
• Maxpanda: Centralization of Organizations' Info, Business Analysis, Increased Visibility and Control, Costs Reduction
• IBM Tririga: Centralization of Organizations' Info, Business Analysis, Increased Visibility and Control, Costs Reduction
• FM:Systems: the previous four plus CAD/BIM Integration
• Indus System: the first four plus Cloud Application
• PNMSoft: the first four plus Cloud Application and Benchmarking
• ARCHIBUS: all seven characteristics

Table 3.3: Characteristics of the distinct FM software packages.

Solutions such as Indus System [60], Manhattan Mobile Apps [61], PNMSoft [62], and ARCHIBUS have cloud-based software that enables users to access FM systems anywhere, on mobile devices, from a browser. Indus System enables users to store, share, and view drawings, space, assets, related costs, leases, and contracts just by accessing the browser.

On the other hand, ARCHIBUS and PNMSoft are both capable of showing an organization's KPIs through their web sites. These packages allow users to make normal use of their daily management software and then, when necessary, to visualize the results in a graphical web application. However, this solution is only applicable to facilities that have the ARCHIBUS or PNMSoft software installed, and only for comparison with previous results from the same facility. In contrast, with our solution, any organization can benefit from these features and one more: comparison with other organizations in the same sector. Table 3.3 presents a summary of the characteristics of the different FM software packages.

Moreover, it is very important to mention that cloud-based benchmarking solutions have been used not only in FM but also in other business sectors. For example, eBench [63] is a cloud solution for benchmarking digital marketing indicators from different brands.

3.3.1 ARCHIBUS

ARCHIBUS is a provider of Facilities Management software solutions that effectively track and analyze not only facilities-related information but also real estate. ARCHIBUS is an integrated solution that applies to organizations of several sizes and sectors (here we focus on ARCHIBUS for Educational Institutions reports; see Table A.2 in the Appendix).

The system architecture consists of three main modules. The first, named ARCHIBUS Web Central, provides live enterprise access to facilities data and enables the easy maintenance and distribution of facilities information across the entire enterprise [64]. A role-based security service ensures that when users log on, they only access information relevant to their roles in the organization.

ARCHIBUS has a .NET Windows application, named Smart Client, used by back-office personnel for data entry, data transfer, and importing and exporting data from other systems. This module integrates another component, the Smart Client Extension for AutoCAD or DWG Editor, which is very important for organizations that want to include Computer-Aided Design or BIM models.

From a technical standpoint, the software architecture of ARCHIBUS consists of a database that can be one of MS SQL Server, Oracle, or Sybase. This database communicates with application servers that can run on Tomcat, Jetty, WebLogic, or WebSphere. The Smart Client applications running on the computers of client companies communicate with the application servers through Web Services, while the Web Central applications communicate with those same servers via HTTP [64].

As part of the ARCHIBUS platform, the ARCHIBUS Performance Metrics Framework delivers KPIs and other performance data about real estate, infrastructure, and facilities using a detailed graphical view of the data. It thus offers analytical measures and productivity tools that enable decision-makers to align their portfolio with organizational strategy, spotlight underperforming business units or assets, and benchmark organizational progress toward targeted goals.

3.3.2 PNM Soft Sequence Kinetics

PNM Soft Sequence Kinetics is an Intelligent Business Process Management Suite that covers process-optimizing KPIs, dynamic process change, KPI analysis, process operation and tracking, communication with external systems, and mobile and cloud KPIs. The system has four main areas of focus: Process-Optimizing KPI, Process Operation and Visual Tracking, KPI for Process Administrators, and Mobile Process KPI.

Within Process-Optimizing KPI there are two different components:

Extra-Process Performance Analytics permits process performance tracking via runtime dashboards and displays KPIs such as process status levels or the average execution time of a process, which helps to understand how successful the process is and highlights areas requiring improvement.

Intra-Process Analytics performs the aggregation and calculation of intra-process data through real-time analytics built into the Business Rule editor, enabling routing according to the results via a simple GUI; it is a form of artificial process intelligence that sees the business teach itself how to perform better over time.

Process Operation and Visual Tracking is made possible by Flowtime, an extension of Microsoft SharePoint with a built-in process operation environment, which enables collaboration on processes in a familiar interface and includes advanced task management, delegation, and KPI monitoring capabilities with tracking views that show where the process stands.

The KPI for Process Administrators provides important indicators on process performance per type

of process [65].

PNM Soft also has a mobile application, named Mobile Portal, available as an application or as an online service, where users can access the same features provided by Sequence Kinetics Flowtime; it can be configured by each customer to meet their needs. For its cloud platform, PNM Soft uses Windows Azure.


3.3.3 Discussion

The software packages discussed above have important characteristics; however, none of them has all the features at once. We asked for the collaboration of the providers of Maxpanda, IBM Tririga, ARCHIBUS, FM:Systems, and Indus System, in order to understand which indicators each of these packages focuses on. An email was sent asking them to fill in a form with distinct indicators previously selected from the scientific literature. However, only two of the providers answered it. Table 3.4 shows the results of the questionnaire. We concluded that many Financial indicators are not considered by the providers, while Service Quality, Satisfaction, and Spatial indicators are widely applied in the packages. These differences are related to the software classification: Software B is a CMMS solution, which is why it focuses on maintenance and cleaning indicators, while Software A is a CAFM solution and therefore does not focus on those indicators.

The following list summarizes the questionnaire results; the number in parentheses indicates how many of the responding providers cover each KPI.

Financial Indicators: Total Cleaning Cost (1); Cleaning Cost per m2 (2); Total Maintenance Cost (2); FM Costs (2); Utility Costs (1); Facility Budget/Corporation Budget (1); Occupancy Cost per m2 (1); Space Costs per m2 (2); Operation Cost per m2 (2); Moving Costs (1); HSSE Costs (2); Security Costs (1); Logistic Costs (2); Hospitality Costs (2); Project Costs (Deviation) (1); Financial Ratios (1); Annual Income (1); Total Annual Facility Cost (1); Annual Cost of Energy Purchased (1); Total Environment Cost (1); Outdoor Costs (1)

Spatial Indicators: Net Floor Area per FTE (2); Percentage Net Floor Area (2); Percentage Internal Area (2); Percentage Gross Floor Area (2); Support Area (1)

Maintenance/Cleaning Indicators: Repairs VS Preventive Maintenance (1); Asset Replacement Values (1); Percentage of Area Cleaned (1)

Productivity Indicators: Core operating hours of facility (FM) (2); Uptime Facility (Business) (2); Staff Turnover (Human Resources) (2); Absenteeism (Human Resources) (2)

Environmental Indicators: CO2 emissions (2); Total Energy Consumption (2); Total Water Usage (2); Total Waste Production (2)

Service Quality Indicators: Quality of Service/Product (1); Quality of Cleaning (2); Quality of Workplace (2); Quality of Security (2); Quality of Reception and Contact Center (2); Quality of Document Management (2)

Satisfaction Indicators: Client Satisfaction (1); Satisfaction with FM (2); Satisfaction with Space (2); Satisfaction with Outdoors (2); Satisfaction with Cleaning (2); Satisfaction with Workspace (2); Satisfaction with HSSE (2); Satisfaction with Hospitality (2); Satisfaction with ICT (2); Satisfaction with Logistics (2)

Table 3.4: List of KPIs covered by the surveyed FM software packages.

There is still a lack of FM solutions that enable integrating data from different organizations in a way that brings gains to them. Today's solutions are difficult to use, their information is difficult to read and understand, and no solution can compare an organization with its competition.

A cloud solution that aggregates the benchmarking information of distinct organizations and ranks them according to their performance would therefore be highly desirable: facility managers would gain a deeper insight into their own FM areas and into their organization's market position.

3.4 Scientific Literature

Ho et al. [66] report the performance measurements and the most used indicators in Asia-Pacific organizations. Their research work rates the importance of 97 metrics on a five-point scale and indicates whether each metric was being used in the respondents' FM organizations. The metrics consisted of performance measurements and performance indicators grouped into eight categories: i) size and use of facilities, ii) maintenance, iii) refurbishment, iv) cleaning, v) energy consumption, vi) ground and environment, vii) safety and security, viii) parking. These categories are represented in order of importance in Figure 3.3, according to the study by Ho et al. [66].

Moreover, Ho et al. [66] also identified which of the 97 metrics were the most used and the most important to the organizations: the metrics with a direct financial implication were the ones with the highest ratings. The 30 highest-rated metrics relate especially to the areas of: i) Financial, such as Total Annual Facility Cost and Operational Cost, ii) Spatial, such as Gross Floor Area/Usable Floor Area and Usable Area, iii) Maintenance, such as Asset Replacement Value (Maintained), and iv) Cleaning, such as Area Cleaned and Cleanliness Status of Site.

Massheder and Finch [26] surveyed the FM benchmarking metrics used in the UK. Their work identified which metrics were used to measure the performance of the FM function. They organized the metrics into five categories: i) business metrics, ii) building performance metrics, iii) portfolio metrics, iv) acquisition metrics, v) disposal metrics. The results of their study are presented in Table 3.5. As can be seen, the most used metrics are the Business, Portfolio, and Building Performance metrics; the latter is considered the most important, with a percentage of use above 50% for 5 of the 6 metrics presented.


[Figure 3.3 orders the measurement categories (Cleaning, Refurbishment, Parking, Ground and Environment, Size and Use of Facilities, Safety and Security, Maintenance, and Energy Consumption) along a priority axis.]

Figure 3.3: Measurement categories ordered by their importance according to the results of the research of Ho et al. [66].

Business:
• Occupancy cost of operating revenue by building: 43%
• Occupancy cost of the total of labour and costs by business unit: 29%
• Occupancy cost of operating revenue by business unit: 21%
• Occupancy cost of the total sales and admin cost by business unit: 14%
• Location analysis on the basis of where key skills are available: 7%
• Location optimization (in the context of attractors and repellers): 7%

Building Performance:
• Occupancy cost per m2: 98%
• Occupancy cost per person: 79%
• Occupancy cost per building size: 64%
• m2 per person: 64%
• Itemized (occupancy) cost comparisons of m2 per person by building: 36%
• Absentee rates by building: 0%

Portfolio Metrics:
• Proportion of operational space compared to non-operational space: 49%
• Current market capital value compared to book value by building: 21%
• Current market rental value compared to rent passing by building: 14%
• Proportion of non-operational space that is sublet or assigned: 14%

Acquisition Metrics (only those who include real estate in FM):
• Costs of acquisition measured against returns: 20%
• Actual extra occupancy cost against predicted cost: 10%
• Amount of space coming on stream per unit time: 10%
• Time to find and acquire space against program: 10%
• Time to occupation against program: 10%

Disposal Metrics (only those who include real estate in FM):
• Holding costs per year: 30%
• Time to dispose of buildings against program: 30%
• Cost of disposal against savings: 20%
• Time to clear buildings against program: 20%
• Holding costs to lease end, break and/or estimated disposal date: 10%
• Number of months vacant to date: 10%
• Disposal performance measures against natural portfolio shed rate: 0%
• Months vacancy to lease end, break and/or estimated disposal date: 0%

Table 3.5: Use of the different metrics by UK benchmarking organizations, according to [26].

Another study, by Hinks and McNay [18], identified a need to establish a set of universally accepted KPIs that realistically evaluate organizational performance. The authors identified the need to clarify and prioritize the parameters and indicators correlating the views of the customer and of the departments, so as to support the operational requirements of the core financial business. To this end, they used the Delphi technique [67], used to gather expert opinion in areas where there is considerable uncertainty and/or a lack of agreed knowledge [18], in which a group from the premises departments and their internal customers was consulted using questionnaires, scenario workshops, and group discussions.

The first phase of Hinks and McNay's study consisted of a literature review, in which the authors concluded that the practical use of KPIs frequently involves industry-specific or organization-specific indicators. They also concluded that most of the indicators provided data of direct applicability for monitoring the management of FM tasks; unfortunately, these indicators were not likely to be relevant to their customers [18]. Since no single measure can adequately provide a clear performance target, it is desirable to achieve a balance between financial and non-financial measures [68]. Taking this into account, the authors selected 172 KPIs, which can be consulted in their paper [18]; furthermore, performance indicators must be comparable and sufficiently complete and objective to accurately describe the addressed FM activity [69].

These authors identified seven performance dimensions, the most common being General, Change Management, and Maintenance. The top five indicators belong to the dimensions Business, General, Change Management, Environment, and Space; although the Space dimension appears only once on the list, it contributes one of the five most important indicators according to [18]. The list, obtained through a well-structured questionnaire survey, can be found in Table 3.6.

• Business: No loss of business due to failure of premises services
• General: Customer Satisfaction
• Change Management: Completion of project to customer satisfaction
• Environment: Provision of safe environment
• Space: Effective utilization of space
• Change Management: Effectiveness of communication
• Maintenance: Reliability
• General: Professional approach of premises staff
• General: Responsiveness to problems
• General: Competence of staff
• Maintenance: Management of maintenance
• Change Management: Responsiveness of PD to changes/requirements
• Business: Value for money
• Environment: Satisfactory physical working conditions
• Equipment: Equipment provided meets business needs
• Business: Suitability of premises and functional environment
• Change Management: Quality of end product
• Maintenance: Effectiveness of helpdesk service
• Change Management: Achievement of completion deadlines
• Equipment: Correction of faults
• Maintenance: Standards of cleaning
• General: Management information
• Environment: Energy performance

Table 3.6: The final list of 23 KPIs (performance dimension: indicator) to support the operational requirements of organizations, ordered by importance according to [18]; higher-importance indicators come first.

[70] describes two projects aimed at conceiving and implementing performance measurement systems for benchmarking in the Brazilian construction industry, based on a discussion of three benchmarking projects in the United Kingdom (Key Performance Indicators Working Group, 2000), in the United States of America (Construction Industry Institute, 2000), and in Chile (Corporación de Desarrollo Tecnológico, 2002).

The objective of the above-mentioned initiatives was to measure the performance of the FM activity and to identify and evaluate best practices through the comparison of key performance indicators. To this end, a web-based online tool enabling the input of previously gathered data was developed by [70]. Tools were provided for graphically displaying the comparative performance of the companies involved.


The KPIs found in those studies were selected by combining distinct approaches: extensive reviews by a panel of experts and the publication of an initial report, selection based on previous studies, and a committee involving both industry representatives and the Construction Industry Institute. In the end, the KPI selections were essentially focused on categories such as Financial (Deviation of Cost by Project), Safety (Labor Accident Rate), Satisfaction (Client and Employee), and Performance (Project Schedule Growth). Based on the results of these surveys, the authors also conclude that the set of measures should be simple and well designed in order to support improvement and, moreover, give a comprehensive company-wide view [70].

[71] took the investigation further and developed an online benchmarking application offering analysis and simulation of organizational performance with respect to the reference values found for each sector. The reported data are related to reference values and general results, anonymizing the identity of the companies participating in the project.

In order to understand which KPIs are the most relevant for the field of FM, we combine the results of distinct authors. Table 3.7 presents the various KPIs pointed out by the scientific literature along with their frequency of reference. The most cited indicators are Financial and Spatial, followed by Maintenance and Cleaning; Quality of Service/Product and Client Satisfaction are also important indicators.

Sources considered: the USA, UK, and Chile projects [70]; IFMA [4]; Ho et al. [66]; Massheder and Finch [26]; and Hinks and McNay [18]. The number in parentheses is the number of these sources that reference each KPI.

Financial Indicators: Total Cleaning Cost (1); Cleaning Cost per m2 (1); Total Maintenance Cost (2); FM Costs (0); Utility Costs (1); Facility Budget/Corporation Budget (1); Occupancy Cost per m2 (3); Space Costs per m2 (2); Operation Cost per m2 (2); Moving Costs (1); HSSE Costs (1); Security Costs (1); Logistic Costs (0); Hospitality Costs (0); Project Costs (Deviation) (3); Financial Ratios (3); Annual Income (2); Total Annual Facility Cost (3); Annual Cost of Energy Purchased (1); Total Environment Cost (2); Outdoor Costs (1)

Spatial Indicators: Net Floor Area per FTE (4); Percentage Net Floor Area (3); Percentage Internal Area (3); Percentage Gross Floor Area (3); Support Area (1)

Maintenance/Cleaning Indicators: Repairs VS Preventive Maintenance (2); Asset Replacement Values (2); Percentage of Area Cleaned (2)

Productivity Indicators: Core operating hours of facility (FM) (0); Uptime Facility (Business) (0); Staff Turnover (Human Resources) (1); Absenteeism (Human Resources) (1)

Environmental Indicators: CO2 emissions (0); Total Energy Consumption (1); Total Water Usage (0); Total Waste Production (0)

Service Quality Indicators: Quality of Service/Product (2); Quality of Cleaning (1); Quality of Workplace (1); Quality of Security (1); Quality of Reception and Contact Center (0); Quality of Document Management (0)

Satisfaction Indicators: Client Satisfaction (2); Satisfaction with FM (0); Satisfaction with Space (0); Satisfaction with Outdoors (0); Satisfaction with Cleaning (0); Satisfaction with Workspace (0); Satisfaction with HSSE (1); Satisfaction with Hospitality (0); Satisfaction with ICT (0); Satisfaction with Logistics (0)

Table 3.7: List of KPIs covered by the literature; the number in parentheses represents the total of documents and tools that reference each KPI.

3.5 Analysis

The problem of KPI identification in the domain of FM has already been studied by distinct industries and scientific studies. However, there is no commonly agreed prioritization used in the FM industry, and determining which KPIs to use in a centralized benchmarking infrastructure for FM is still an open question.

The methodology for arriving at a list of indicators with the potential to be applied by a large number of organizations requires analyzing the frequency of distinct indicators in standard benchmarking surveys and in FM software, as presented in Tables 3.7 and 3.4. However, these tables show significant differences between them, with the exception of certain financial and spatial indicators (5 in total). It turns out that the software packages are more focused on performance, service, and satisfaction indicators, rather than on the maintenance indicators found in the literature. Thus, to identify the indicators most relevant to organizations, these indicators need to undergo a prioritization. One such prioritization can be achieved by combining scientific studies, existing FM standards, and the help of FM experts; in this way the indicators are analyzed on both a theoretical and a practical level. Finally, a list of indicators applicable to a large number of organizations is presented. KPIs are performance measurements; thus, it is important to normalize them in terms of units, computation (how they are obtained), and semantics. Indicators in categories such as Quality and Satisfaction have to be measured through audits or questionnaires using adequate scales. Table 3.8 describes how each one of the previous KPIs can be calculated, based on several references [71, 70, 72, 73, 74].

Financial Indicators:
• Total Cleaning Cost (€): sum of all cleaning costs
• Cleaning Cost per m2 (€/m2): Total Cleaning Cost / Net Room Area
• Total Maintenance Cost (€): sum of the costs of maintenance for electrical equipment, HVAC, elevators, escalators, generators, UPS, ICT maintenance, etc.
• FM Costs (€): costs of the FM department or of FM outsourcing
• Utility Costs (€): sum of the costs for water, electricity, oil, gas, and others
• Facility Budget/Corporation Budget (%): (Facility Budget / Corporation Budget) x 100
• Occupancy Cost per m2 (€/m2): Total Occupancy Cost / Net Floor Area
• Space Costs per m2 (€/m2): Total Space Costs / Net Floor Area
• Operation Cost per m2 (€/m2): Operation Cost / Net Floor Area
• Moving Costs (€): sum of planning costs such as boxing, transport, assembling, and space planning
• HSSE Costs (€): sum of the costs for health, safety, security, and environment (outsourcing + department)
• Security Costs (€): sum of physical security costs (fire prevention and protection, sensors and extinguishers) and human security costs (surveillance and reception)
• Logistic Costs (€): sum of storage and distribution costs
• Hospitality Costs (€): sum of the costs for meeting rooms, conference centers, daycare centers, gyms, apartments, etc.
• Project Costs (Deviation) (%): ((Actual Total Project Cost - Initial Predicted Project Cost) / Initial Predicted Cost) x 100
• Financial Ratios: includes various distinct KPIs such as Gross Profit Margin, Inventory Turnover, etc.
• Annual Income (€/yr)
• Total Annual Facility Cost (€/yr)
• Annual Cost of Energy Purchased (€/yr)
• Total Environment Cost (€): sum of all environment costs
• Outdoor Costs (€)

Spatial Indicators:
• Net Floor Area per FTE (m2/FTE): Net Floor Area / Number of FTE personnel
• Percentage Net Floor Area (%): (Net Floor Area / Total Level Area) x 100
• Percentage Internal Area (%): (Internal Area / Total Level Area) x 100
• Percentage Gross Floor Area (%): (Gross Floor Area / Total Level Area) x 100
• Support Area (m2)

Maintenance/Cleaning Indicators:
• Repairs VS Preventive Maintenance, by specialty (%): (Number of Corrective Maintenance actions per month / Number of Preventive Maintenance actions per month) x 100
• Asset Replacement Values, by specialty (%): (Annual Maintenance Cost / Maintained Assets Replacement Value) x 100
• Percentage of Area Cleaned (%): Area Cleaned / Net Floor Area

Productivity Indicators:
• Core operating hours of facility (hours)
• Uptime Facility, by specialty (%): (Facility Run Time (Production) / Total Available Time to Run or Produce) x 100
• Staff Turnover (%): (Number of Employee Departures (FTE) / Average Number of Staff Members (FTE) Employed) x 100
• Absenteeism (%): (Total Days Lost / Total Possible Days Worked) x 100

Environmental Indicators:
• CO2 emissions (tons): conversion of the Total Energy Consumption
• Total Energy Consumption (kWh)
• Total Water Usage (m3)
• Total Waste Production (tons)

Service Quality Indicators (scale; values obtained through audits or questionnaires):
• Quality of Service/Product
• Quality of Cleaning
• Quality of Workplace
• Quality of Security
• Quality of Reception and Contact Center
• Quality of Document Management

Satisfaction Indicators (%; values obtained through questionnaires):
• Client Satisfaction
• Satisfaction with FM
• Satisfaction with Space
• Satisfaction with Outdoors
• Satisfaction with Cleaning
• Satisfaction with Workspace
• Satisfaction with HSSE
• Satisfaction with Hospitality
• Satisfaction with ICT
• Satisfaction with Logistics

Table 3.8: Full list of KPIs covered by the literature, along with their corresponding units and descriptions [71, 70, 72, 73, 74].
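To make the normalization concrete, the following minimal sketch (in Python, with illustrative input values that are not part of the thesis data) shows how a few of the formulas in Table 3.8 could be computed from raw measurements:

    # Minimal sketch of three KPI formulas from Table 3.8 (illustrative values only).

    def cleaning_cost_per_m2(total_cleaning_cost, net_room_area):
        # Cleaning Cost per m2 (EUR/m2) = Total Cleaning Cost / Net Room Area
        return total_cleaning_cost / net_room_area

    def staff_turnover(departures_fte, average_staff_fte):
        # Staff Turnover (%) = (Employee Departures / Average Staff Employed) x 100
        return departures_fte / average_staff_fte * 100

    def absenteeism(total_days_lost, total_possible_days_worked):
        # Absenteeism (%) = (Total Days Lost / Total Possible Days Worked) x 100
        return total_days_lost / total_possible_days_worked * 100

    print(cleaning_cost_per_m2(12000.0, 3500.0))  # about 3.43 EUR/m2
    print(staff_turnover(4, 80))                  # 5.0 %
    print(absenteeism(120, 20000))                # 0.6 %

Keeping each formula in a single, named function is one way to fix the computation of an indicator, so that every organization derives the same KPI from the same measurements.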

3.5.1 Priority Analysis

In order to understand which KPIs are the most important for FM, we need to combine data from the scientific literature with the existing solutions. Figure 3.4 presents the indicators found in Tables 3.6, 3.7, and 3.4, enabling the identification of the most relevant KPIs based on importance and frequency.

Figure 3.4: FM KPIs organized according to frequency and importance.


It must be noted that this graphic is constructed from the KPIs described in the academic literature. However, some of the KPIs mentioned in the literature may not be the most suitable: sometimes there were repeated KPIs with different names, or KPIs that were already included in other, less specific KPIs; for example, the Life Planning Costs indicator is part of the HSSE Costs indicator. Moreover, some of the KPIs most frequently reported in the literature require measurements that are very difficult to obtain, for example the CO2 Emissions indicator, so organizations are unwilling to invest the effort needed to obtain them and are therefore unable to gather enough data to calculate all the previously referred KPIs.
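As a minimal sketch of the frequency side of this prioritization (using, for illustration, the reference counts of three indicators from Tables 3.7 and 3.4; the code itself is not part of the thesis tooling), the two tallies could be merged and ranked as follows:

    # Minimal sketch: merge literature and software reference counts and rank KPIs.
    literature_refs = {"Client Satisfaction": 3, "Net Floor Area per FTE": 4, "Utility Costs": 1}
    software_refs = {"Client Satisfaction": 1, "Net Floor Area per FTE": 2, "Utility Costs": 1}

    # Total number of references per KPI across both sources.
    totals = {kpi: literature_refs.get(kpi, 0) + software_refs.get(kpi, 0)
              for kpi in set(literature_refs) | set(software_refs)}

    # Print KPIs from most to least referenced.
    for kpi, total in sorted(totals.items(), key=lambda item: item[1], reverse=True):
        print(kpi, total)

The importance dimension of Figure 3.4 would then be combined with these totals, which is the judgment step performed with the help of FM experts.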

3.5.2 Normalized FM Indicators

The indicators most prevalent in the literature and in software applications were chosen for the final set of KPIs. For example, Client Satisfaction has 3 references in Table 3.7 and one reference in Table 3.4, making a total of four references; thus, Client Satisfaction was chosen for the final list. In addition, the indicator Occupancy Cost per Operation Costs was chosen, which is an aggregation of two indicators: Occupancy Cost per m2 and Operation Cost per m2. The final list of KPIs can be seen in Table 3.9.

Financial Indicators:
• Cleaning Cost per m2 (€/m2): Total Cleaning Cost / Net Room Area, or Total Cleaning Costs / Net Floor Area
• Total Maintenance Cost (€): sum of the costs of maintenance for electrical equipment, HVAC, elevators, escalators, generators, UPS, ICT maintenance, etc.
• FM Costs (€): costs of the FM department or of FM outsourcing
• Utility Costs (€): sum of the costs for water, electricity, oil, gas, and others
• Space Costs per m2 (€/m2): Total Space Costs / Net Floor Area
• Occupancy Cost (€/m2): Total Occupancy Cost / Net Floor Area
• Occupancy Cost per Operation Costs (%): (Occupancy Cost / Total Operation Costs) x 100

Spatial Indicators:
• Net Floor Area per FTE (m2/FTE): Net Floor Area / Number of FTE personnel
• Percentage Net Floor Area (%): (Net Floor Area / Total Level Area) x 100
• Percentage Internal Area (%): (Internal Area / Total Level Area) x 100
• Percentage Gross Floor Area (%): (Gross Floor Area / Total Level Area) x 100

Maintenance/Cleaning Indicators:
• Repairs VS Preventive Maintenance, by specialty (%): (Number of Corrective Maintenance actions per month / Number of Preventive Maintenance actions per month) x 100
• Asset Replacement Values, by specialty (%): (Annual Maintenance Cost / Maintained Assets Replacement Value) x 100
• Percentage of Area Cleaned (%): Area Cleaned / Net Floor Area

Productivity Indicators:
• Staff Turnover (%): (Number of Employee Departures (FTE) / Average Number of Staff Members (FTE) Employed) x 100
• Absenteeism (%): (Total Days Lost / Total Possible Days Worked) x 100

Environmental Indicators:
• Total Energy Consumption (kWh)
• Total Water Usage (m3)

Service Quality Indicators (scale; values obtained through audits or questionnaires):
• Quality of Service/Product
• Quality of Cleaning
• Quality of Workplace
• Quality of Security

Satisfaction Indicators (%; values obtained through questionnaires):
• Client Satisfaction
• Satisfaction with HSSE

Table 3.9: Final list of proposed KPIs [71, 70, 72, 73, 74].


Chapter 4

Solution Proposal

Given the limitations of the current benchmarking solutions, there is a clear need for new and innovative benchmarking applications; a new solution is in order. The solution should give FM benchmarking results to organizations through a graphical representation of their KPIs, aggregating different organizations with different FM systems. Indeed, the aggregation of data from different organizations is the most important feature of the solution.

4.1 Overview

The solution brings better insight into an organization's FM position relative to others, since it uses the gathered information to compare facilities. Organization and facility identities are kept confidential; ranking by their results, however, is still possible. This cloud-based solution is developed as a web application, divided into two parts: the server side and the client side. The server side runs directly on the hosting servers, while the client side runs in the browser as an endpoint to the server. To better understand the solution, an overview of the architecture can be found in Figure 4.1. This organization of the system enables the solution to provide:

• An authentication service to authorize the users of an organization.

• Integration of data from distinct organizations.

• Database caching for better performance.


Figure 4.1: FM benchmarking system architecture. Organizations send their data in a standard format, to be stored in a database. This information is then accessed by different users representing each organization.
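Purely as an illustration, a measurement record in such a standard format could look like the following sketch. The field names mirror the spreadsheet columns required in Section 4.2.3 (name, value, start date, end date), while the JSON shape and the submission endpoint mentioned in the comment are assumptions, not the system's actual interface:

    # Minimal sketch of a measurement record in a standard submission format.
    import json

    record = {
        "name": "Total Energy Consumption",  # must be one of the accepted measure names
        "value": 15300.0,                    # kWh for this metric
        "start_date": "2015-01-01",
        "end_date": "2015-01-31",            # at most one entry per metric per month
    }

    # The serialized list of records could then be sent to the server over HTTP.
    print(json.dumps([record]))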

4.2 Requirements

When developing an interface, the users' opinion is very important: to achieve a system that is well accepted by its final users, the whole concept has to be developed around their needs. For this reason, it is essential to establish a methodology in which every step leads to a well-designed, easy-to-use, and useful interface. Thus, our interface development process consists of the following steps:

1. Elicitation interviews with experts to elicit system requirements.

2. User requirements specification, establishing who the stakeholders are and what their needs are. A questionnaire was used to understand which metrics and indicators users would like to see in the first version of the system.

3. Functional requirements and constraints definition, specifying exactly what the system must do.

4. Definition of a system domain model, to understand how the components relate to each other.

5. Creation of a conceptual interface model, which consists of making low-fidelity sketches in order to integrate all requirements into the screens. This step is not trivial, so, to help define the essential interactions and to prioritize them, it is useful to create a conceptual model based on the previous steps.

The previous methodology was applied to create and develop the benchmarking reports interface. To support the implementation of the previous steps, we used the Confluence team collaboration software [75], which integrates several tools enabling a simple elaboration of the models, mockups, and requirements of our system. This phase enabled uncovering inconsistencies upfront, before starting the implementation phase.

4.2.1 Elicitation Interviews

Interviews can be used to verify facts, clarify ambiguity, identify requirements, and solicit opinions and ideas. They are therefore one of the most popular business analysis techniques.

For this reason, a set of elicitation interviews with experts was conducted; experts are people who have relevant knowledge to share and who are responsible for harvesting and eliciting that knowledge for potential use by others at some future point in time [76]. These interviews had the goal of identifying the system requirements and constraints, as well as the domain structure. The conclusions of these interviews are specified in Sections 4.2.3 and 4.2.4, respectively. Moreover, the elicitation interviews were used to understand and specify which interface should be used in the system. The final interface is described in Section 4.3.3.

4.2.2 User Requirements

It is crucial to understand who the users are and what their needs are. It is also important to know how users perform their tasks with current technology, understanding where there is room for improvement and where to keep existing functionality.

Some of this information is given in Chapter 3, where a study of the current technologies and of the key performance indicators was made. From that study, we derived a set of indicators to be used in the first version of the system. The validation of these indicators is presented in Section 5.1 of Chapter 5.

4.2.3 Functional Requirements and Constraints

A requirement is a set of conditions that the system must conform to. Requirements engineering is an approach to managing the requirements of systems, concerning the real-world goals for the functions and constraints of systems [77]. It is also concerned with the relationship of these factors to precise specifications of system behavior, and with their evolution over time and across system families [77]. It is a cooperative, iterative, and incremental process which aims at ensuring that: i) all relevant requirements are explicitly known and understood, and ii) all requirements are documented and specified [78].

To define the system requirements, it is essential to understand the actions of the system's users: what they will be able to do, what functions they will use, and what they need to know. These requirements may, however, change along the project.

Functional requirements specify a function or behavior that the system must be able to perform (what the system must do). Non-functional requirements, on the other hand, specify the qualities of the system (what the system must be). Table 4.1 presents the system's functional requirements.


Functional Requirements
User accesses his facility information
User edits his facility information
User registers
User logs in
User logs out
User accesses Measurements by Facility
User accesses Measurements by Site
User accesses Indicators by Facility
User accesses Indicators by Organization

Table 4.1: Functional system requirements.

Constraints
All users must be registered to log in
Users must be logged in to access their information
A user can only access his facility information
A user can only edit his facility information
Each Measurement corresponds to a Time Period
Each Indicator corresponds to a Time Period
The calculation of Indicators requires the previous existence of Measurements
The calculation of Indicators requires the previous existence of Static Measures
No more than one entry of a particular metric can be inserted per month
There must be a Measurement value for each month
The dates of the different values of a metric cannot overlap
.xls file columns must be in this order: name, value, start date, end date
.xls file cells cannot be blank
Measure names must be the ones on the presented list

Table 4.2: System Constraints.

A constraint is a factor that limits what the system can achieve [79]. For the system to function properly, a set of design constraints specifying certain system behaviors was defined. These are presented in Table 4.2.

4.2.4 Defining a System Domain Model

A system can be defined as a collection of different elements that together produce results not obtainable from the individual elements alone [80]. Thus, a system is a set of related elements and processes whose behavior satisfies customer and operational needs [81]. The value of the system is created by the relationships among the different parts and how they are interconnected [80].

Therefore, it is important to understand what the system's elements are and how they interact. We used Confluence to build a domain model of the system, presented in Figure 4.2.

4.2.5 Creating the Interface Conceptual Model

In order to develop a benchmarking system that integrates distinct facilities and organizations, a simple and user-friendly interface was developed, aiming at users' better understanding of, and satisfaction with, the platform. This interface was evaluated incrementally to enable an individual assessment of each component. The base system has two main features available to the user:

Data Insertion and Editing: lets the user create, insert, and edit organization and facility information: name, address, NIF, and all corresponding metrics. The data can be input either by hand or through an Excel file import.

Indicators Report: lets the user visualize and study all indicators calculated by the system from the previously input metrics. These indicators are shown through a set of charts supporting different types of comparisons, within a facility or between facilities.

In order to consolidate all requirements into a set of screens, low-fidelity sketches were developed using Balsamiq. These sketches depict: i) Home Page, ii) Sign Up User, iii) General User Interface, iv) Edit/View User Profile, v) Add/Edit New Facility - Facility Details, vi) Log In, vii) Facility/Site Measures, viii) Facility/Site Indicators, and ix) Add New Role. Some of these sketches are presented in Figures 4.3 and 4.4.

(a) User details screen

(b) Facility details screen

Figure 4.3: Mockup screens of distinct aspects of the application developed using Balsamiq: User and Facility details screens.


(a) Mockup Metrics management screen

(b) Indicators Management screen

Figure 4.4: Mockup screens of distinct aspects of the application developed using Balsamiq: Metrics and Indicators Management screens.

4.3 Solution Architecture

As explained before, the proposed solution is a cloud web application; its architecture therefore has two main modules: the Server Side and the Client Side. Another important component of the solution is the database and how it is organized. These three components are explained in more detail below:

Client Side: runs in the user's web browser, performing series of HTTP requests to the server. Bootstrap was used, a framework that implements responsive design best practices and enables rapid, high-quality CSS and HTML development. The charts are generated with the JavaScript library Highcharts.

Server Side: where the application runs; it is responsible for processing the data sent by the organization and storing it in the database. Ruby on Rails was used as the server-side framework, which is also responsible for generating the HTML templates sent to the Client Side.

Database: PostgreSQL was used as the relational database, which is logically divided into three parts: Input Data Staging Area, KPI Aggregated Data, and Facility Metadata.


Figure 4.5: System's architecture overview. This image shows a clear contrast between the Server Side and the Client Side of the application, and how they interact with each other.

The server and client sides interact with each other through HTTP requests. The overall architecture is illustrated in Figure 4.5, where the division between these two modules can be clearly seen.

4.3.1 Server Side

All the software logic is on the Server Side of the application, where all processing and data storage are executed. Authentication and data management and update (CRUD: Create, Read, Update, and Delete) are the Server's responsibility. There are six main entities in this solution:

User: each user can have more than one role and several facilities. Also, each user can only access his facilities' data.

Role: each role represents the part played by the user in each facility bound to that specific role. There are four distinct roles: Facility Manager, Occupant, Owner, and Service Provider.

Facility: represents a facility of a specific organization, for example, a specific building of a university.

Static Measure: facility measures that rarely change, like Net Floor Area (NFA) or FTE, unless the facility undergoes major transformations. The input of these measures is rare and, because of that, they had to be treated differently.

Measure: these measures have to be inserted into the application each month and are bound to a specific facility. Examples of this type of measure are Water or Energy Consumption, or Cleaning Costs. Because some of these measures do not cover a specific calendar month (for instance, half of January and half of February), it was important to transform them so that they represent a specific month for the benchmarking. Thus, a granular transformation of the measure from month to day was applied (we found the average value for each day), and the days were then aggregated into a specific month, as explained in Subsection 4.5.6.

Benchmarking: applying the measures and static measures, a set of metrics is calculated and used to benchmark facilities. This benchmarking is shown through several charts.

For example, a user can have the role "Facility Manager of IST". In this case, the role is bound to several facilities: the Taguspark building, the Central Building in Alameda, and other IST buildings. Each facility has its own measures and static measures and, therefore, its own benchmarking reports.
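As a minimal sketch of how these entities can be wired together with ActiveRecord (the model and association names are illustrative, not necessarily the exact ones in the prototype):

    # A user plays several roles; each role binds the user to facilities.
    class User < ActiveRecord::Base
      has_many :roles
      has_many :facilities, through: :roles
    end

    # E.g. Facility Manager, Occupant, Owner, or Service Provider.
    class Role < ActiveRecord::Base
      belongs_to :user
      has_many :facilities
    end

    # A facility owns its monthly measures and its rarely-changing static measures.
    class Facility < ActiveRecord::Base
      belongs_to :role
      has_many :measures
      has_many :static_measures
    end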

4.3.2 Database Layer

The relational database architecture initially developed had two main components: the Input Data Staging Area and the Facility Metadata. The Input Data Staging Area is responsible for storing the data sent by the organization (the measures), while the Facility Metadata stores standard facility information such as address or name.

Each time a user accesses the benchmarking page, it is necessary to compute the benchmark data and present it through charts. This process was very time- and memory-consuming, because all of the facility's indicators had to be computed each time the page was accessed, slowing the system down.

Therefore, it was essential to design a database architecture different from the previous one. It was then decided to add another component to the database: the KPI Aggregated Data. This component uses the Input Data Staging Area data to calculate the KPIs and store them; it works as a cache of indicators. Each time the user imports an .xls file with measures, all corresponding indicators are calculated and stored. Thus, it is not necessary to calculate the indicators at run-time, only to query them from the database. This update to the architecture makes it possible to deliver KPIs faster and with less memory consumption (validated in Section 5.3 of Chapter 5). Both database architectures are presented in Figure 4.6.

4.3.3 Client Interface

The Client Side runs in the browser of the user connecting to the website. The interface enables the user to interact with the application and is where the statistics about the organization are presented. Because the interface is a key element of the application's appeal, some iterations were necessary to achieve a cleaner and easier user interaction.

The first contact the user has with the solution is the Landing Page, which presents a small description of the solution's main features and where the user can Register or Log In to the site.


(a) Database without cache. It receives the data and stores it in the Input Data Staging Area.

(b) Database with cache. It receives the data and stores it in the Input Data Staging Area; then, when the KPIs are calculated, they are stored in the KPI Aggregated Data.

Figure 4.6: Database arrangements overview with and without cache.

Interface Elements

After the user logs in to the site, a more complex layout becomes available, where the distinct components are organized into three major elements. These elements are presented in Figure 4.7 and are organized as follows:

Top Bar: has three clear parts: 1. the logo (which is also a button to the Home page); 2. the display of the currently selected role (which redirects to the Role Edition page); and 3. the user name (when clicked, it opens a dropdown with buttons to the Account Edition page and to Log Out).

Left Side Bar: has five distinct parts: 4. the Details dropdown (links each listed facility to its Details page); 5. the Metrics dropdown (links each listed facility to its Metrics page); 6. the Reports dropdown (links each listed facility to its Benchmarking Reports page); 7. the Add Facility button (opens the New Facility page); and 8. the Minimize button (to minimize the side bar width).

Inner Page: consists of a single part: 10. the area of the page where the distinct screens are rendered.

Inner Pages

The Inner Page is where all the necessary screens are rendered. The solution's main screens are presented next:

• Account Edition screen: account details can be edited;

• Role Edition screen: the selected role's details can be edited;

• Add Role screen: new roles can be added;

• Facility Details screen: facility details can be edited; there is also a link to the static metrics screen;

• Facility Static Metrics screen: the static metrics are presented in a list and can be searched by name, start date, and end date;


Figure 4.7: Specification of the interface elements. 1) Logo button to the Home Page. 2) Selected Role link to the Role Edition Page. 3) User Name and buttons to the Account Edition page and Log Out. 4) Details dropdown. 5) Metrics dropdown. 6) Reports dropdown. 7) Add Facility button. 8) Minimize button. 9) Dropdown facilities list. 10) Inner Page.

• Facility Metrics screen: the metrics are presented in a list by month of year and can be searched by name, start date, and end date;

• Facility Reports screen: the indicators are presented through a set of charts and can be filtered by the facility's business sector, city and/or country, and/or the indicators' year;

• Add Facility screen: facility details can be added.

Figures 4.8 and 4.9 show examples of screens rendered in the Inner Page.

Regarding the Facility Reports screen, there are three types of charts for each indicator. The first chart shows every input metric for a specific facility, with the possibility of zooming in or out to study them better. The second chart presents the indicator over the four quarters of the year as bars, with a line representing the indicator values for the best-ranked facility. Finally, a chart with three indicators (Cleaning Cost, Space Cost, and Occupancy Cost per NFA) over three years is presented, to better compare the facility's results in that period of time.


(a) User details screen

(b) Facility details screen

Figure 4.8: Examples of User details and Facility details screens rendered in the Inner Page.


(a) Metrics management screen

(b) Indicators Report screen

Figure 4.9: Examples of Metrics and Indicators Report screens rendered in the Inner Page.


4.4 Implementation Process

From the resulting sketches, a first prototype was developed using Ruby on Rails, HTML, and the Bootstrap CSS framework. The reasons for choosing these languages and frameworks are explained in Subsection 4.4.1. The rest of this section addresses how the main features were implemented, which libraries were used, and how version control and deployment were handled.

4.4.1 Languages and Frameworks

The solution was developed in the Ruby language [82]. Ruby is a modern object-oriented language that focuses on simplicity and productivity, with an easy-to-write and easy-to-read syntax [83]. "You can concentrate on solving the problem at hand, instead of struggling with compiler and language issues" [84]. Ruby has an enormous standard library (libraries included in the Ruby distribution), a great standard documentation system (RDoc [85]), and a packaging system (Ruby Gems [86]), gems being, essentially, Ruby software packages. However, the default Ruby implementation has some problems: i) the lack of native threads, and ii) the Global Interpreter Lock (GIL). Nonetheless, these problems are solved in other Ruby virtual machine implementations such as the following:

• Rubinius: designed for concurrency with native threads, allowing code to run on several CPU cores. It is implemented atop the Low Level Virtual Machine (LLVM), and its stdlib is implemented in Ruby instead of C [87].

• JRuby: an implementation of Ruby on the Java Virtual Machine (JVM), which supports native threading but not gems with C bindings [88].

In parallel with Ruby, the Rails framework was used. Rails is a full-stack Model View Controller (MVC) framework with many features, such as Generators and ActiveRecord, that get new applications off the ground quickly, and some really great online guides for setting things up [89]. ActiveRecord is the key to the interactions with the database: plain Ruby syntax is written to interact with it, which means the database can be any Relational Database Management System and can be swapped with just a couple of lines in the configuration file [89]. Briefly, to better understand the MVC architecture: a Model consists of the application data, business rules, logic, and functions; a View can be any output representation of data, such as a table or a diagram; and the Controller mediates input, converting it into commands for the Model or the View. Rails "is designed to make programming web applications easier by making assumptions about what every developer needs to get started" [90]. Moreover, Rails makes it easy to test the application; therefore, applications tend to be tested [83]. Ruby on Rails brings the best of two worlds: programs are shorter and more readable, which makes it easier to see what is happening. In other web frameworks, a simple change to the schema could involve half a dozen changes; in Rails, the MVC architecture (reusability of code and separation of concerns) and the DRY convention (which stands for "don't repeat yourself") ensure that a piece of knowledge is expressed in just one place of the system, which reduces the changes needed [83].
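As a brief illustration of the ActiveRecord style described above (the class, column, and value names here are hypothetical):

    # Tables map to classes and rows to objects; no SQL is written by hand.
    facility = Facility.find_by(name: "Taguspark")
    january  = facility.measures.where(name: "Energy Consumption", month: 1)
    facility.update(address: "Av. Prof. Cavaco Silva")  # issues the UPDATE for us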


GitHub [91], Shopify [92], and Basecamp [93] are examples of large-scale, high-performance web applications implemented in Rails, showing how well Rails works and how Rails, along with a good system architecture, can scale. This framework also has a massive community, and very good podcasts and vodcasts, which ease learning the language and resolving bugs. Moreover, Rails will soon turn 10 years old, which gives it high maturity and consistency.

As addressed in Subsection 4.4.4, the application is deployed on Heroku, and for that reason we chose to implement the database in PostgreSQL [94].

For the frontend development, Bootstrap [95] was used, a CSS framework that brings many advantages, such as: a responsive 12-column grid, layouts, and components; many different elements such as headings, lists, tables, buttons, and forms; interactive elements with JavaScript; and good documentation. For the charts of the reports page, the Highcharts API [96] was used. Highcharts is a JavaScript library of interactive charts released in 2009 that provides very appealing and professional-looking charts [97]. Its very good and well-organized documentation (for example, on what to pass in to create a chart), allied with an active forum, makes for a great support community.

Ruby Libraries (Gems)

As explained in Subsection 4.4.1, gems are Ruby libraries. They were very useful in implementing several features of the solution, such as authentication, database seeding, and even debugging. The following Ruby gems were used:

Rails itself is also a gem.

Pry is a gem that creates a debugging session. At any point in the source code, it is possible to insert the line binding.pry. When the Ruby interpreter runs that line, it stops and opens a Pry REPL console, which allows inspecting state, variables, and even the flow of the program.
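For instance, at a hypothetical spot in the import code:

    # Gemfile: gem 'pry'. Dropping a breakpoint anywhere in the source:
    def import_metrics(file)
      rows = parse(file)   # hypothetical parsing helper
      binding.pry          # execution stops here and a Pry REPL opens
      # inside the REPL we can inspect `rows`, call methods, then `exit` to resume
      rows.each { |row| Metric.create!(row) }
    end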

Seed-fu is used to feed default values to a fresh installation of the software. Basically, the default initial data is written in the db/seed.rb file, which feeds the database through a console command (rake db:seed). Thus, it is possible to start the application with standard users, facilities, roles, and static measures, and to reset the database to those initial values at any time. Seed-fu is an even more robust seed data system than the default Rails seed feature.
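A minimal sketch of the idea (the attribute values are illustrative defaults, following the entities of Subsection 4.3.1):

    # db/seed.rb, loaded with `rake db:seed` to bootstrap a fresh installation.
    User.create!(email: "manager@example.com", password: "changeme")
    Role.create!(name: "Facility Manager", user: User.first)
    StaticMeasure.create!(name: "Net Floor Area", value: 12_500)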

Devise is a flexible authentication solution. Because it is constantly updated by several contributors, it makes authentication stronger and more stable.

Sucker-punch is a single-process Ruby asynchronous processing library, which reduces the cost of hosting on a service like Heroku. It was used to run the metrics import on a dedicated thread, without blocking the user at that point.
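A sketch of how the import can be offloaded, assuming sucker_punch 1.x (newer versions of the gem invoke jobs with perform_async instead; the import routine is hypothetical):

    # The job runs on a dedicated thread pool inside the same process.
    class MetricsImportJob
      include SuckerPunch::Job

      def perform(facility_id, file_path)
        facility = Facility.find(facility_id)
        facility.import_metrics(file_path)  # hypothetical import routine
      end
    end

    # In the controller: respond to the user immediately, import in background.
    MetricsImportJob.new.async.perform(facility.id, upload_path)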

Ransack enables the creation of both simple and advanced search forms for Rails applications, and it is regularly maintained. It was used to implement the search for Static Metrics, Metrics, and Indicators.
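A controller-level sketch of this usage (current_facility is a hypothetical helper; search predicates such as name_cont or start_date_gteq come from the names of the form fields):

    def index
      @q       = current_facility.metrics.ransack(params[:q])
      @metrics = @q.result(distinct: true)
    end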


Using Ruby gems was a well-pondered decision. They are constantly maintained by several contributors and, for that reason, they are very reliable. Moreover, gems make some difficult and time-consuming implementations a little easier, stronger, and steadier.

4.4.2 Version Control

A version control system records changes to a file over time, allowing a specific version to be recalled later [98]. More specifically, it makes it possible to revert a file, a set of files, or the whole project to a previous state, compare changes over time, and see who last modified something (who introduced an issue, and when). Version control is very important not only to integrate and control team workflows, but also to recover from problems [98].

Today's version control can be local, centralized, or distributed. Local control can be useful when there is only one person developing. However, when more than one person collaborates on a system, a centralized or distributed version control is needed. In centralized control, there is normally a server holding all versioned files [98]. With Distributed Version Control Systems such as Git, clients can check out the latest snapshot of the files or even fully mirror the repository. Therefore, when a server dies, any client repository can be copied back up to the server to restore it [98]. It also deals very well with several remote repositories, so a client can collaborate with different groups of people in different ways simultaneously within the same project.

For all the previous reasons, we chose Git [99] as our version control system. Moreover, Git is already 10 years old and was conceived by Linus Torvalds for a better and easier management of Linux kernel development versions [100]. We also used GitHub, a Git repository hosting service [91]. GitHub provides a web-based graphical interface, access control, and collaboration features, for example, wikis and basic task management tools for projects [101].

4.4.3 Test Driven Development

Testing the application is very important: it assures that features are correctly implemented, it diminishes the fear of breaking the application when refactoring the code, and it offers instant feedback about the code. Testing manually, however, is not an option, since clicking hundreds of buttons or filling in forms in a browser each time the code changes is a waste of time. Automated testing saves a lot of time and helps track all sorts of bugs.

For this solution we adopted a Test Driven Development (TDD) approach: writing the test before adding the functionality, and only then making the test pass; sometimes it is easier to solve a problem going from finish to start. TDD is excellent for building up a test suite that exercises every part of the application and that helps figure out the next implementation step.

Setting up the testing environment and the test data is very important. In Rails, this can be handled by defining and customizing fixtures (sample data). Fixtures are written in YAML and allow populating the test database with predefined data before the tests run. After this step, we were ready to implement and run distinct kinds of tests:


• Unit Tests validate every method in the application models. A test suite was created for every model of the solution.

• Integration Tests test the workflows within the application (such as creating a new user or a new facility) and the interaction among any number of controllers. With integration tests it is possible to exercise the entire stack, from the database to the dispatcher.

• Functional Tests test the different actions of each controller; see the sketch after this list. For example, we tested whether web requests (get, post, put, delete, etc.) were successful, whether the user was redirected to the right page, whether the user was successfully authenticated, whether the user could only access pages he has permission for, whether the messages displayed in the view were the appropriate ones, etc.
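The following is a minimal sketch of such a functional test, assuming Devise's test helpers and illustrative fixture names:

    require 'test_helper'

    class FacilitiesControllerTest < ActionController::TestCase
      include Devise::TestHelpers

      test "redirects anonymous visitors to the login page" do
        get :show, id: facilities(:one).id
        assert_redirected_to new_user_session_path
      end

      test "shows a facility to its signed-in owner" do
        sign_in users(:owner)
        get :show, id: facilities(:one).id
        assert_response :success
      end
    end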

4.4.4 Deployment

To provide a real-situation evaluation and benchmarking, concerning the previously mentioned tests, the prototype was deployed on the cloud provider Heroku. Heroku is a PaaS that provides a PostgreSQL database and GitHub integration, which permits a quicker and easier deployment of the application while allowing vertical and horizontal scalability on the fly [41]. At this phase, the project will not have a significant number of users, and for that reason the app runs on a single web dyno. Later, more dynos can be added for better scalability. The current dyno has 512 MB of RAM and a 1-core CPU.

4.5 Solution Implementation Issues

The solution's implementation followed the steps presented in the preceding sections. However, some specific feature implementations deserve to be described in more detail. Some of them relate to back-end development, while others relate to front-end development; both are addressed in the following subsections.

4.5.1 Database Schema

The database schema is essential to the overall architecture. Studying the domain model previously presented in Figure 4.2 made it possible to determine the most suitable database schema, which can be seen in Figure 4.10.

4.5.2 VAT ID and ZIP Code Validation

The VAT ID and ZIP Code are validated by simply specifying the precision (number of digits) expected for each field in the database migration. For example, Figure 4.11 presents the Roles table migration, which specifies that the VAT ID (the Portuguese NIF) must have an exact number of digits, cannot be null, must be unique, and must be of type Integer.
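A minimal sketch of the kind of migration Figure 4.11 depicts. The thesis stores the field as an integer; here a NUMERIC(9, 0) column is used to express the 9-digit precision of the Portuguese NIF, and the column name is an assumption:

    class CreateRoles < ActiveRecord::Migration
      def change
        create_table :roles do |t|
          # Up to exactly 9 digits, mandatory, and unique across roles.
          t.decimal :vat_id, precision: 9, scale: 0, null: false
          t.timestamps
        end
        add_index :roles, :vat_id, unique: true
      end
    end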


Figure 4.10: Database Schema.

Figure 4.11: VAT ID Validation Code. Example of how to validate the number of digits expected for a field.

4.5.3 User Authentication

User authentication is verified in each controller with a before filter. Filters are methods that run before, after, or around a controller action; a before filter thus allows a method to run before any controller action. We used it to run the require-login and authentication methods before each action, validating the user's authentication beforehand. Both methods were implemented by us and do not belong to the Devise gem discussed in Subsection 4.4.1.
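A minimal sketch of the mechanism (before_action is the Rails 4 name for a before filter; the method bodies below are assumptions):

    class ApplicationController < ActionController::Base
      before_action :require_login

      private

      # Redirect anonymous visitors before any controller action runs.
      def require_login
        redirect_to login_path unless current_user
      end

      def current_user
        @current_user ||= User.find_by(id: session[:user_id])
      end
    end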

4.5.4 Seed and Fixture for Testing

The use of seeds and fixtures was essential to run the several kinds of tests. Fixtures store data in YAML files (exemplified in Figure 4.12), enabling unit, integration, and functional tests to run against it. Seeds, on the other hand (used through the Seed-fu gem presented in Subsection 4.4.1 and illustrated in Figure 4.13), are fundamental for performance testing, feeding the database with data beforehand.
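As a sketch of how a YAML fixture like the one in Figure 4.12 is consumed in a test (the fixture key and assertion are hypothetical):

    require 'test_helper'

    # Rails loads test/fixtures/organizations.yml into the test database
    # before each test; fixtures are addressed by their YAML key.
    class OrganizationTest < ActiveSupport::TestCase
      test "organization fixture is valid" do
        org = organizations(:one)
        assert org.valid?
      end
    end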


Figure 4.12: Fixture Code. Example of a Fixture for an organization.

Figure 4.13: Seed Code. Example of a Seed for a user.

4.5.5 KPIs Computation

The Key Performance Indicators computation is defined in a Ruby file, as a library. This file changed across the several stages of the database (with or without the cache). At first, the database had no cache of indicators: all calculations were made when the user accessed the Reports screen. At that phase, the Server performed the calculations and simply sent them to the Client Side, where Highcharts used the values to build the charts. With the cache implemented, the indicators are calculated every time the user inputs new measures and are then stored in a KPI database table. Thus, when the user accesses the Reports screen, the Server only sends the indicators and no longer has to calculate them at that time.
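A minimal sketch of the cached-KPI flow; the model names and the formula are assumptions, not the exact code of the thesis:

    # Called whenever new measures are imported: compute the KPIs and store
    # them, so the Reports screen can read them back without recomputing.
    def refresh_kpis(facility, month)
      cost = facility.measures.where(name: "Cleaning Cost", month: month).sum(:value)
      nfa  = facility.static_measures.find_by(name: "Net Floor Area").value
      kpi  = Kpi.where(facility: facility, name: "Cleaning Cost per NFA",
                       month: month).first_or_initialize
      kpi.update(value: cost / nfa)
    end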

4.5.6 Implementation of Granular Metrics

An implementation of granular metrics was crucial for the KPI calculation. Users can input metrics covering 30/31 days that do not correspond to a specific calendar month. For example, the Energy Cost metric can run from 12 March to 11 April; however, the indicators always correspond to a specific month. It was therefore necessary to disaggregate the metric's value by day and then recompute a specific month's value, to be used in the indicator calculations. The code for the granulation of the metric into daily values is presented in Figure 4.14.
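An illustrative reconstruction of the month-to-day disaggregation shown in Figure 4.14, followed by the reaggregation into calendar months (the data layout is an assumption):

    # Spread a metric value evenly over the days it covers.
    def granulate(value, start_date, end_date)
      days    = (start_date..end_date).to_a   # Date ranges enumerate day by day
      per_day = value / days.size.to_f        # average value for each day
      days.map { |day| { date: day, value: per_day } }
    end

    # Reaggregate the daily values into calendar months for the KPI calculations.
    def monthly_totals(daily_values)
      daily_values.group_by { |d| [d[:date].year, d[:date].month] }.map do |(year, month), ds|
        { year: year, month: month, value: ds.inject(0.0) { |sum, d| sum + d[:value] } }
      end
    end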

4.5.7 Excel File Import and Values Verification

The import and validation of the Excel file are implemented in the Metric model. Each value read from the Excel file must be verified for consistency; then, a new metric and new granular metrics are created with that same value.
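A sketch of the import loop, assuming the Roo gem (the thesis does not name the parsing library used) and the column order fixed by the constraints in Table 4.2 (name, value, start date, end date):

    require 'roo'

    def import_metrics(facility, path)
      sheet = Roo::Spreadsheet.open(path)
      (2..sheet.last_row).each do |i|        # row 1 holds the column headers
        name, value, start_date, end_date = sheet.row(i)
        # Constraint from Table 4.2: cells cannot be blank.
        raise "blank cell on row #{i}" if [name, value, start_date, end_date].any?(&:nil?)
        facility.metrics.create!(name: name, value: value,
                                 start_date: start_date, end_date: end_date)
        granulate(value, start_date, end_date)  # see the sketch in Subsection 4.5.6
      end
    end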


Figure 4.14: Granular Metric Code. Transformation of the monthly metric into daily metrics.

4.5.8 Google Places Auto-complete

To ease the filling in of addresses in the application, the Google Places Autocomplete API is used, applied on the Client Side via JavaScript.

4.5.9 Metrics List Improvement

A need to improve the list of metrics was identified, because it was very heavy and confusing. The new list is organized by month of year, with the columns corresponding to each available metric. Clicking on a metric allows the user to edit or delete it.

4.5.10 Login and Register Modal

For a lighter and simpler landing page, a modal was implemented for user Login and Registration. When the user clicks the Register or Log In buttons, a modal popover appears in an elegant manner.


Chapter 5

Evaluation

This chapter describes the validation of the proposed solution. It is very important to validate: i) the utility of the indicators, ii) the utility of a cloud solution for benchmarking, iii) the solution's usability, and iv) the solution's performance. The validation of the indicators is presented in the next section. The existence of other cloud-based benchmarking solutions, presented in Section 3.3 of Chapter 3, eliminates the need for a study corroborating the usefulness of a cloud solution for benchmarking in FM. For the usability and performance validation of the solution, we performed Usability Tests (evaluating the application by testing it with users), Qualitative Tests (to gather users' opinions), and Performance Tests (to test the application's efficiency). This chapter also presents the test results and their analysis.

5.1 Validation of the Indicators

To understand whether the previously selected set of KPIs for the system is useful, a questionnaire was given to possible users, asking them to rate the KPIs between 0 (not useful) and 10 (very useful). These possible users can be Facility Owners, Facility Managers, Facility Providers, or Facility Occupants. The questionnaire can be consulted in Figures B.1 and B.2 of Appendix B.3. It was answered by four people from four different sectors: Facility Services, Energy, Real Estate, and Public Administration. The questionnaire results are presented in Figure 5.1. As can be seen, all indicators had a high average rating; the lowest average was 6.5 and the highest was 8.5.

From the results gathered, we can conclude that, from a general point of view, all indicators were well rated and, therefore, the list of KPIs does not need to be modified.


Figure 5.1: The minimum, maximum, and average rating for each indicator suggested in the KPI questionnaire. The average of the results obtained through the questionnaire is represented by the bullet. All indicators were well rated, with all ratings equal to or above 6.5.

5.2 Usability Tests

Usability testing has been defined as the "process of learning from users about a product's usability by observing them using the product..." [102]. Usability testing therefore provides direct insight into how users interact with the application and into common errors and difficulties. A usability test with 5-7 representative users generally finds about 80% of the problems [103]. The purpose of conducting the usability activities is to validate some assumptions about the target user audience, identify existing usability issues, and gather information on future needs [103]. The usability test process consists of giving users some tasks (between 2 and 10) along with instructions. An observer must answer quick questions from the users and record issues such as errors, time to complete each task, and the users' opinions [103, 104, 105].

These tests find ways to improve the application's interface or functionality by asking users directly what should be improved. Thus, as a result of usability tests, we can understand whether the application's interface is well designed and perceptible.


5.2.1 Defining Scenarios and Tasks

Testing scenarios give participants a background for completing the tasks, establishing introductory information and the rationale of each task. For these reasons, scenarios should be realistic, enabling test participants to relate to the situation being tested [106]. Accordingly, we present next 3 distinct scenarios:

Scenario 1: Consider you have been using the cloud-based Facility Management Benchmarking platform. You are the Facility Manager of an organization named "Testing Organization 1" that has 2 distinct facilities: "Facility A" and "Facility B". You have already input the static measures (measures such as NFA) and you need to input the non-static measures for "Facility B" to be able to see the benchmarking charts.

Scenario 2: Consider that you are the Owner of "Facility A" and that you need to know whether your Cleaning Costs performance is better than in the previous year, and what you can improve next year.

Scenario 3: Consider that you are the Owner of "Facility A" and "Facility B". You need to know whether there is another facility with better performance in "Space Experience" than your "Facility A" in 2012.

The tasks should relate to the scenarios. For each task, the time each user takes to complete it is measured. If the task is not completed after three minutes, it is considered incomplete. The tasks for each scenario are presented in Table 5.1 and were completed by each user testing the prototype. The print version can be seen in Appendix B.5.

Description (for each task, the Time taken and whether it was left Incomplete are recorded)

Scenario 1:
Task 1: Import the .xls file "facilityB.xls" for Facility B.
Task 2: Change Facility A's name to "Testing Facility A".
Task 3: Change the Facility Manager role to Owner.

Scenario 2:
Task 4: Compare your "Cleaning Costs" performance to the previous year.
Task 5: Verify what you can change to improve your next year's results.

Scenario 3:
Task 6: Verify your room for improvement in "Space Experience" in 2012 relative to other facilities.

Table 5.1: Tasks to be completed by users, each within a three-minute window.

After the usability test, each participant fills out a usability questionnaire (selected from surveys about ISO 9241 [107, 108]) covering human-system interaction aspects regarding Functionality, Design and Ease of Use, Learning, and Satisfaction. Each aspect is measured on a five-point Likert scale (from 1, "Totally Disagree", through 3, "Neither agree nor disagree", to 5, "Totally Agree"). The questionnaire is presented in Table 5.2 and the Google Form version in Appendix B.4. The last question is an open question where users can report any relevant issues not covered by the previous questions, as well as their opinion about the relevance of the proposed solution or improvements to be made.

5.2.2 Usability Tests Results

Five users with a Facility Management background were asked to complete the usability testing. Information about these users can be seen in Table 5.3.


Functionality
Question 1: The application has every expected functionality.
Question 2: The system's available information is enough to complete all tasks.

Design and Ease of Use
Question 3: I liked using the interface.
Question 4: The application interface is pleasing to use.
Question 5: The application is easy to use.
Question 6: The needed information is easy to find.
Question 7: Globally, the application is easy to use.

Learning
Question 8: It is easy to learn how to work with the application.

Satisfaction
Question 10: Generally, I can complete all tasks effectively.
Question 11: I am globally pleased with my tasks performance.
Question 12: The software forces me to apply procedures that are not related to my actual work.
Question 13: Rate your general impression of the software.

Table 5.2: Qualitative Testing Questionnaire to be answered by the participant users.

Figure 5.2: The minimum, maximum, and average time for each task completed by users. The average is represented by the bullet. All tasks were completed quickly, with all average times under two minutes.

Most tasks were completed successfully (only four out of a total of thirty tasks were incomplete). More specifically, one user could not complete Tasks 3 and 6, while two users could not complete Task 4. Users also completed each task within a relatively short period of time, as can be seen in the usability test results in Figure 5.2.

        Age  Gender  Sector
User 1  39   Female  Facility Management
User 2  40   Male    Architect
User 3  51   Male    Software Development
User 4  31   Male    Software Development
User 5  41   Female  Administrative

Table 5.3: Usability Testing Users Information.

This shows the ease of interaction between user and interface, which is backed up by the usability questionnaire results in Figure 5.3, where users gave their opinion. Furthermore, the questionnaire reveals the users' satisfaction: all aspects (Functionality, Design and Ease of Use, Learning, and Satisfaction) had good results on the Likert scale. The lowest average rating was on Question 6; all other questions had an average above three.


Figure 5.3: Average Likert-scale values for each question of the Usability Questionnaire.

5.3 Performance Tests

Performance tests aim at uncovering speed or memory problems the application may have, that is, whether it is slow or uses too much memory. Performance testing is also useful to understand where the problems come from and where the bottlenecks are.

The tool set selected for performance testing determines the value of the data collected during the tests. Each testing tool has specific characteristics that make it more or less suitable for different scenarios.

Some benchmarking tools measure the application's performance by hitting the website with several requests; the Siege Ruby plug-in does exactly that [109].

5.3.1 Cache Efficiency Tests

In order to test the efficiency of the database cache, the performance testing was divided into two stages. First, a set of tests was run in which the KPIs were requested directly from the Input Data Staging Area. In the second stage, the cache was used and the KPIs were requested from the KPI Aggregated Data. This way, it is possible to measure the performance optimization brought by the cache. For coherent testing, the workload comprised 15 concurrent users trying to access the same URL in both performance testing stages.

Base Testing: the Benchmarking Page URL of two distinct users, without the cache implemented. The results can be seen in Table 5.4.

Cached Testing: the Benchmarking Page was tested again with the same two distinct users, now with the indicators cached, so they are not calculated each time the Benchmarking Page is requested. Results can be seen in Table 5.4.

From the performance testing results, it can be concluded that, because more queries to the database are now needed to get the indicators, the number of transactions is higher, and so is the quantity of data transferred. However, it is no longer necessary to calculate all indicators each time the web page is requested.


Item                          Scenario 1        Scenario 2
                              Base     Cached   Base     Cached
Transactions (hits)           114      552      97       567
Availability (%)              100      100      100      100
Elapsed time (secs)           119.75   119.79   119.51   119.99
Data transferred (MB)         4.41     21.32    3.75     21.90
Server response time (secs)   14.70    3.17     16.83    2.94
Transaction rate (trans/sec)  0.95     4.61     0.81     4.73
Failed transactions           0        0        0        0
Longest transaction (secs)    19.43    7.25     22.08    5.06
Shortest transaction (secs)   1.77     0.77     2.25     0.73

Table 5.4: Results from performance testing on the Benchmarking Page (BP) for two distinct users (Scenarios 1 and 2). Base presents the results of the first database architecture, and Cached presents the results after the cache implementation.

Despite the fact that the elapsed times of the Benchmarking Page tests were substantially similar, other factors show the usefulness of caching the indicators. The cache allows a larger number of transactions per second. Also, the longest and shortest transactions for both users take much less time in the Cached testing than in the Base one. Moreover, instead of accessing the database each time it runs a query, Rails also has an SQL caching feature that saves several query results for later use, which further improves the application's performance.

5.4 Discussion

Through the different tests carried out on the solution, it was possible to validate the indicators' utility, the solution's usability, and the solution's performance. The indicators' validation showed that users find most of the chosen indicators useful, since all indicators had a high average rating (the lowest average was 6.5 and the highest 8.5). The usability testing revealed an easy and fast user-interface interaction. Also, most users showed great satisfaction and found the cloud-based benchmarking solution highly functional and easy to learn. Finally, the performance testing exhibited the great potential of the indicators cache, which decreases transaction times and therefore improves the solution's overall performance. Overall, the results are very promising.


Chapter 6

Conclusions

Benchmarking is a fundamental tool for continuous performance improvement in management, with great potential to be applied to Facility Management. However, the difficulty of finding a commonly accepted set of metrics, combined with the lack of accurate metrics to drive decision-making, makes this goal particularly challenging.

To date, there is no simple solution for facilities benchmarking in Europe, which is our main goal; thus, organizations continue to use distinct software to support their FM and KPI gathering, which hinders the aggregation and analysis of data.

This report proposes a new FM benchmarking cloud application for presenting KPIs and ranking FM organizations. First, the concepts of FM benchmarking are studied in detail, along with the concepts of Cloud Computing, the different standards used nationally and internationally, the related work done by other researchers in the field of FM benchmarking, and some technologies existing today in this area. Based on a state-of-the-art and literature analysis, this work develops and validates a list of KPIs that are commonly accepted in industry.

To further test our hypothesis, a working prototype of an FM benchmarking solution was developed and validated for performance and usability. The evaluation results were very promising. In particular, the chosen indicators proved very useful to users, and the performance testing showed very good performance, especially with the indicators cache. Moreover, users reported being very satisfied with the solution and found it easy to use and learn.

Overall, with the solution proposed in this project, quantifying real property performance will be easier, and organizations will have a better way to evaluate their own FM metrics, while enabling the comparison of metrics between enterprises and facilities.

6.1 Lessons Learned

We found that indicator data are very difficult to obtain, and such data are critical for a benchmarking application.

Due to the number of entities involved in this project, it was really hard to efficiently coordinate crucial project tasks, such as having the designs ready on time, getting data from facilities, or setting a date for usability testing, due to differences in schedules. Eventually, all of these small problems added to the overall project entropy and were sometimes a little overwhelming.

6.2 Future Work

Facilities benchmarking is clearly a growing area and, since there is a lack of solutions to support FM benchmarking, this work can definitely form the basis of products in this area. This is why there is such a need for an FM benchmarking cloud application. In the future, the development of this cloud-based facility management benchmarking solution will continue at INESC, with new features and some performance improvements.

An important new feature to be added to this application is an information data exchange standard that will allow different software systems to interoperate and exchange FM indicator data with the proposed application, integrating the data into our system. The manual input or import of the different metrics would then no longer be necessary, and the system would become much cleaner, simpler, and easier to use.


Bibliography

[1] International Facility Management Association. IFMA Homepage. http://www.ifmaoc.org (Last accessed on May, 2014), 2014.

[2] Sarel Lavy and Igal M. Shohet. Performance-based facility management – an integrated approach. International Journal of Facility Management, 1(1):1–14, 2010.

[3] T. Madritsch, M. May, H. Ostermann, and R. Staudinger. Computer Aided Facility Management (CAFM) as a New Branch of Decision-making Support Technologies in the Field of Facility Management. In F. Adam and P. Humphreys, editors, Encyclopedia of Decision Making and Decision Support Technologies, pages 84–92, 2008.

[4] Lívia Róka-Madarász. Facility Management benchmarking. 8th International Conference on Management, Enterprise and Benchmarking, pages 171–181, 2010.

[5] Michael Pitt and Matthew Tucker. Performance Measurement in Facilities Management. Property Management, 26:241–254, 2008.

[6] Carol Taylor Fitz-Gibbon. Performance indicators. Benchmark, 12:31, 1990.

[7] CEN. EN 15221-7: Facility Management - Part 7: Guidelines for Performance Benchmarking, 2012.

[8] Robert C. Camp. Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. Quality Press, Michigan, 1989.

[9] FM Leaders Forum. BENCHMARKING: Effective Performance Management for FM, 2013.

[10] Elizabeth Dukes. SaaS, cloud solutions and web-based software all prominent in facilities management. http://www.iofficecorp.com/blog/saas-cloud-solutions-and-web-based-software-all-prominent-in-facilities-management (Last accessed on April, 2015), 2013.

[11] Claire Baker. The Sky's the Limit for Facilities Management with Cloud Based Applications. http://www.qcdata.com/skys-limit-facilities-management-cloud-based-applications/ (Last accessed on April, 2015), 2014.


[12] Hara Software. Hara Platform. http://www.capterra.com/energy-management-software/spotlight/106849/Hara%20Platform/Hara%20Software (Last accessed on April, 2015), 2015.

[13] Utilibill. Utilibill. http://www.capterra.com/energy-management-software/spotlight/82428/Utilibill/Utilibill (Last accessed on April, 2015), 2015.

[14] Essets. Essets. https://essets.com/Churches-Religious-Institutions-Facility-Management-Software#.VSUxzVy4k6g (Last accessed on April, 2015), 2015.

[15] Novo Solutions. Novo ShareNet Cloud Platform. http://www.novosolutions.com/facility-management-software/ (Last accessed on April, 2015), 2015.

[16] Steven Rawlins. Cloud-Based Facilities Management Software: Empowering Facility Managers. http://www.buildings.com/buzz/buildings-buzz/entryid/336/cloud-based-facilities-management-software-empowering-facility-managers.aspx (Last accessed on April, 2015), February 2015.

[17] Peter S. Kimmel. Benchmarking for Facility Professionals. http://foundation.ifma.org/docs/default-source/Whitepapers/benchmarking-for-facility-professionals-ifma-foundation-whitepaper-small.pdf?sfvrsn=4 (Last accessed on April, 2015), 2015.

[18] John Hinks and Peter McNay. The creation of a management-by-variance tool for facilities management performance assessment. Facilities, 17:31–53, 1999.

[19] Godfried Augenbroe and Cheol-Soo Park. Quantification methods of technical building performance. Building Research and Information, 33(2):159–172, 2005.

[20] David G. Cotts, Kathy O. Roper, and Richard P. Payant. The Facility Management Handbook. Amacom, 2010.

[21] B. Atkin and A. Brooks. Total Facilities Management. 2009.

[22] Michael May and Geoff Williams. The Facility Manager's Guide to Information Technology. Great Britain, 2012.

[23] Arash Shahin and M. Ali Mahbod. Prioritization of key performance indicators: An integration of analytical hierarchy process and goal setting. International Journal of Productivity and Performance Management, 56:226–240, 2004.

[24] Wolfram Trinius and Christer Sjöström. Service life planning and performance requirements. Building Research and Information, 33(2):173–181, 2005.

[25] Bill Bordass, Adrian Leaman, and Paul Ruyssevelt. Assessing building performance in use 5: Conclusions and implications. Building Research and Information, 29(2):144–157, 2001.

[26] Keith Massheder and Edward Finch. Benchmarking Metrics Used in UK Facilities Management. MCB University Press, 16:123–127, 1998.


[27] Brian Meacham, Robert Bowen, and Amanda Moore. Performance-based building regulation: Current situation and future needs. Building Research and Information, 33(2):91–106, 2005.

[28] John D. Gilleard and Philip Wong Yat-lung. Benchmarking Facility Management: Applying Analytic Hierarchy Process. Facilities, 2, 2004.

[29] Dore Steenhuizen, Inês Flores-Colen, A. G. Reitsma, and Pedro Branco Lo. The Road to Facility Management. Facilities, 32:46–57, 2014.

[30] CEN. EN 15221-1: Facility Management - Part 1: Terms and definitions, 2006.

[31] David Parmenter. Key performance indicators: developing, implementing, and using winning KPIs. United States of America, 2007.

[32] Eric Teicholz. Facility Design and Management Handbook. McGraw-Hill Professional, 2001.

[33] Liang-Jie Zhang and Qun Zhou. CCOA: Cloud Computing Open Architecture. 2009 IEEE International Conference on Web Services, pages 607–616, 2009.

[34] Michael Johnson. Work Space Virtualization: What you need to know for IT operations management. [S.l.]: EMEREO PTY LIMITED, 2011.

[35] Sachin Achara and Rakesh Rathi. Security Related Risks and their Monitoring in Cloud Computing. International Journal of Computer Applications, 86:42–47, 2014.

[36] Peter Mell and Timothy Grance. Recommendations of the National Institute of Standards and Technology. The NIST Definition of Cloud Computing, 2011.

[37] A. Lenk, M. Klems, J. Nimis, S. Tai, and T. Sandholm. What's inside the Cloud? An architectural map of the Cloud landscape. 2009 ICSE Workshop on Software Engineering Challenges of Cloud Computing, 0:23–31, 2009.

[38] Amazon Web Services. Amazon Web Services EC2 Homepage. http://aws.amazon.com/ec2/ (Last accessed on May, 2014), 2014.

[39] Giuseppe DeCandia, Deniz Hastorun, Madan Jampani, Gunavardhan Kakulapati, Avinash Lakshman, Alex Pilchin, Swaminathan Sivasubramanian, Peter Vosshall, and Werner Vogels. Dynamo: Amazon's Highly Available Key-value Store. Proceedings of twenty-first ACM SIGOPS symposium on Operating systems principles, 41:205–220, 2007.

[40] Google Inc. Google App Engine. http://www.google.com/apps (Last accessed on May, 2014), 2014.

[41] Heroku. Heroku Homepage. https://www.heroku.com (Last accessed on May, 2014), 2014.

[42] GeoForm Software. GeoPal. http://www.capterra.com/facility-management-software/spotlight/135105/GeoPal/GeoForms%20Software (Last accessed on April, 2015), 2015.

63

Page 82: Cloud-based Facility Management Benchmarking · Cloud-based Facility Management Benchmarking Sofia Pereira Martins Thesis to obtain the Master of Science Degree in Information Systems

[43] Reflex Online. Reflex Facility. http://www.capterra.com/facility-management-software/

spotlight/141181/Reflex%20Facility/Reflex%20Online (Last accessed on April, 2015), 2015.

[44] WorkOasis. WorkOasis. http://www.capterra.com/facility-management-software/

spotlight/127353/WorkOasis/WorkOasis (Last accessed on April, 2015), 2015.

[45] Core 7 US. WOW! http://www.capterra.com/facility-management-software/spotlight/

118171/WOW!/Core%207%20US (Last accessed on April, 2015), 2015.

[46] ICS. International Classification for Standards, 2005.

[47] Kevin Yu, Thomas Froese, and Francois Grobler. A development framework for data models for

computer-integrated facilities management. Automation in Construction, 9(2):145 – 167, 2000.

[48] Reh, F. John. How to Use Benchmarking in Business. http://management.about.com/cs/

benchmarking/a/Benchmarking.htm (Last accessed on April, 2015), 2015.

[49] W., Alexis. How Do Businesses Use Benchmarking to Im-

prove Productivity and Profit? http://smallbusiness.chron.com/

businesses-use-benchmarking-improve-productivity-profit-493.html (Last accessed

on April, 2015), 2015.

[50] Premises and Facilities Management. Premises and Facilities Management Online. http://www.

pfmonthenet.net/article/47196/Facilities-Management--The-Secret-Service.aspx (Last

accessed on January, 2015), 2011.

[51] Per Anker Jensen. Facilities Management Comes of Age. http://www.the-financedirector.

com/features/feature84914/ (Last accessed on January, 2015), 2010.

[52] Konstantinos Papamichael. Green building performance prediction/assessment. Building Re-

search and Information, 28(5/6):394–402, 2000.

[53] RICS Property Measurement Group. Code of Measuring Practice, 2007.

[54] CEN. EN 15221-6: Facility Management - Part 6: Area and Space Measurement in Facility

Management, 2011.

[55] Building Cost Information Service of RICS. Elemental Standard Form of Cost Analysis - Principles,

Instructions, Elements and Definitions, 4 edition, 2008.

[56] Maxpanda Inc. Maxpanda Homepage, May, 2014 2014. http://www.maxpanda.net.

[57] IBM Inc. IBM Tririga Homepage, May, 2014 2014. http://www-03.ibm.com/software/products/

en/ibmtrirfacimana/.

[58] Archibus Group. Archibus Homepage, May, 2014 2014. http://www.archibus.com.

[59] FM:Systems Inc. FM:Systems Homepage, May, 2014 2014. http://www.fmsystems.com.

64

Page 83: Cloud-based Facility Management Benchmarking · Cloud-based Facility Management Benchmarking Sofia Pereira Martins Thesis to obtain the Master of Science Degree in Information Systems

[60] Indus Systems Inc. Indus Systems Homepage, May, 2014 2014. http://www.indus-systems.

com.

[61] Manhattan Software Inc. Manhattan Mobile Apps Homepage, May, 2014 2014. http://www.

manhattansoftware.com/solutions/mobile-apps.html.

[62] PNMSOFT Inc. PNMSOFT Homepage, May, 2014 2014. http://www.pnmsoft.com.

[63] eBench company. eBench homepage. http://www.ebench.com (Last accessed on May, 2015),

2015.

[64] ARCHIBUS Group. ARCHIBUS Fundamentals Training, 2014.

[65] Sequence Kinetics iBPMS. Sequence Kinetics, from Workflow to Social, Mobile and Intelligent

BPM, 2011.

[66] Daniel C.W. Ho, Edwin H.W. Chan, Nicole Y. Wong, and Man-wai Chan. Significant Metrics for

Facilities Management Benchmarking in the Asia Pacific Region. MCB University Press, 18:545–

555, 2000.

[67] Donald F. Van Eynde and Stephen L. Tucker. A Quality Human Resource Curriculum: Recom-

mendations from Leading Senior HR Executives. Human Resource Management, 36:397–408,

1997.

[68] Stanley F. Slater, Eric M. Olson, and Venkateshwar K. Reddy. Strategy-Based Performance Mea-

surement. Business Horizons, 40:37–44, 1997.

[69] Q. Wang. Activity-Based Facility Management Benchmarking. Facilities Management Benchmark-

ing On-Line Conference FMDatacom, Jan - March:Appendix A, 1998.

[70] Dayana B. Costa, Helenize R. Lima, and Carlos T. Formoso. Performance Measurement Systems

for Benchmarking in the Brazilian Construction Industry. International Symposium on Globalisation

and Construction AIT Conference Centre, Bangkok, Thailand, pages 1029–1039, 2004.

[71] Dayana Bastos Costa, Helenize R. Lima, Karina B. Barth, and Carlos T. Formoso. Performance

Measurement Systems for Benchmarking in the Brazilian Construction Industry: A Learning Ap-

proach. Porto Alegre Brasil, 2005.

[72] IPD Environment Code. Measuring the Environmental Performance of Buildings, 2010.

[73] Richard Loth. Investopidea, February, 2014 2014. http://www.investopedia.com/university/

ratios/.

[74] CEN. EN 15341: Maintenance - Maintenance Key Performance Indicators, 2005.

[75] Atlassian Inc. Confluence Team Software. https://www.atlassian.com/software/confluence

(Last accessed on May, 2015), 2015.

65

Page 84: Cloud-based Facility Management Benchmarking · Cloud-based Facility Management Benchmarking Sofia Pereira Martins Thesis to obtain the Master of Science Degree in Information Systems

[76] Greenes Consulting. Knowledge Management: Guide to Knowledge Elicitation Interviews . Tech-

nical report, Greenes Consulting, August 2010.

[77] Antoni Olive. Conceptual Modeling of Information Systems. Springer, 2007.

[78] Klaus Pohl. Requirements Engineering - Fundamentals, Principles, and Techniques. Springer,

2010.

[79] H. William Dettmer. Systems and Constraints: The Concept of Leverage. Goal Systems Interna-

tional, 2006.

[80] Cecilia Haskins. . Incose, 2012.

[81] IEEE. IEEE 1220-2005: Standard for Application and Management of the Systems Engineering

Process, 2005.

[82] Ruby. Ruby Homepage. https://www.ruby-lang.org/en/ (Last accessed on May, 2015), 2015.

[83] Sam Ruby, Dave Thomas, and David Heinemeier Hansson. Agile Web Development with Rails.

Packt Publishing, 2013.

[84] Dave Thomas, Chad Fowler, and Andy Hunt. Programming Ruby 1.9: The Pragmatic Program-

mer’s Guide. The Facets of Ruby. The Pragmatic Bookshelf, Raleigh, North Carolina, 2009.

[85] Ruby rDoc. rDOC Homepage. http://rdoc.sourceforge.net/ (Last accessed on May, 2015),

2015.

[86] Ruby Gems. Ruby Gems Homepage. https://rubygems.org/ (Last accessed on May, 2015),

2015.

[87] Rubinius. Rubinius Homepage. http://rubini.us/ (Last accessed on May, 2015), 2015.

[88] Jruby. Jruby Homepage. http://jruby.org/ (Last accessed on May, 2015), 2015.

[89] Wayne Graham. Why Ruby? Technical report, Scholar Lab, May 2010.

[90] Per Anker Jensen. Getting Started with Rails. http://guides.rubyonrails.org/getting_

started.html (Last accessed on February, 2015), 2014.

[91] Github. Github Homepage. https://github.com/ (Last accessed on May, 2015), 2015.

[92] Shoppify. Shoppify Homepage. http://www.shopify.com/ (Last accessed on May, 2015), 2015.

[93] Basecamp. Basecamp Homepage. https://basecamp.com/ (Last accessed on May, 2015),

2015.

[94] PostgreSQL. PostgreSQL Homepage. http://www.postgresql.org/ (Last accessed on May,

2015), 2015.

66

Page 85: Cloud-based Facility Management Benchmarking · Cloud-based Facility Management Benchmarking Sofia Pereira Martins Thesis to obtain the Master of Science Degree in Information Systems

[95] Bootstrap. Bootstrap Homepage. http://getbootstrap.com/ (Last accessed on February,

2014), 2014.

[96] Highcharts. Highcharts Homepage. http://www.highcharts.com/ (Last accessed on May,

2015), 2015.

[97] Joe Kuan. Learning Highcharts. Packt Publishing, 2012.

[98] Git VC. Git Documentation - Version Control. http://git-scm.com/book/en/v2/

Getting-Started-About-Version-Control (Last accessed on May, 2015), 2015.

[99] Git Inc. Git Homepage. http://git-scm.com/ (Last accessed on May, 2015), 2015.

[100] Cloer, Jennifer. Linux News Homepage. http://www.linux.com/news/featured-blogs/

185-jennifer-cloer/821541-10-years-of-git-an-interview-with-git-creator-linus-torvalds

(Last accessed on May, 2015), 2015.

[101] Klint Finley. Techrunch Homepage. http://techcrunch.com/2012/07/14/

what-exactly-is-github-anyway/ (Last accessed on May, 2015), 2015.

[102] Carol M. Barnum. Usability Testing and Research 1st. Allyn and Bacon, Inc. Needham Heights,

MA, United States of America, 2001.

[103] Laura L. Downey. Group Usability Testing: Evolution in Usability Techniques. Journal of Usability

Studies, 2:133–144, 2007.

[104] Rolf Molich, Ann Damgaard Thomsen, Barbara Karyukina, Lars Schmidt, Meghan Ede, Wilma van

Oel, and Meeta Arcuri. Comparative evaluation of usability tests. In CHI’99 extended abstracts on

Human factors in computing systems, pages 83–84. ACM, 1999.

[105] Anne Kaikkonen, Titti Kallio, Aki Kekalainen, Anu Kankainen, and Mihael Cankar. Usability Testing

of Mobile Applications: A Comparison between Laboratory and Field Testing. Journal of Usability

Studies, 1:4–16, 2005.

[106] Tech Smith. Software Usability Testing with Morae. http://assets.techsmith.com/Docs/

pdf-morae/SW-Usability-Testing-with-Morae.pdf (Last accessed on April, 2015), 2009.

[107] Gunther Gediga, Kai-Christoph Hamborg, and Heinz Willumeit. The IsoMetrics Manual. Universi-

tat Osnabruck, 1998.

[108] Reza Safdari, Hussein Dargahi, Leila Shahmoradi, and Ahmadreza Farzaneh Nejad. Comparing

Four Softwares Based on ISO 9241 Part 10. J Med Syst, 36:2787–2793, 2012.

[109] Tomasz Borowski. Siege Load Test. https://github.com/tbprojects/siege_load_test (Last

accessed on Jan, 2015), 2011.


Appendix A

Indicators Tables

A.1 Indicators Table 1

Description of Facilities: Industries represented; Facility use; Hours of operation; Number of occupants.
Sizes and Uses of Facilities: Gross area; Rentable area; Usable area; Square footage per occupant; Building efficiency rates; Workstation utilization rates; Office space per worker; Support area.
Office Space Planning: Vacancy rates; Space allocation policies; Office type and size.
Relocation and Churn: Organizational moves; Cost of moves; Churn rate.
Maintenance, Janitorial and Indirect Costs: Organizational moves; Maintenance costs (by age of facility); Percentage of replacement cost; Repair vs. preventive maintenance; Outsourcing of maintenance function; Janitorial costs; Indirect costs.
Utility Costs: Utility costs; Utility usage.
Environmental and Life Safety Costs: Environmental costs; Life-safety costs.
Support and Project Costs: Security costs; Project costs; Space planning costs; Employee amenities costs.
Financial Indicators: Replacement value of facility; Lease type and cost; Cost of operations; Cost of providing the fixed asset; Occupancy cost; Financial ratios; Total annual financial costs.

Table A.1: Key Performance Indicators for FM organized by areas within facilities operation, according to IFMA [4].
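For illustration, an indicator catalogue like the one in Table A.1 could be loaded into the prototype as seed data. The following is a minimal Ruby sketch; the INDICATOR_AREAS constant and the Indicator model are hypothetical assumptions for this example, not the prototype's actual code.

# Hypothetical seed sketch mirroring the area/indicator grouping of Table A.1.
# Indicator is assumed to be an ActiveRecord model with :area and :name columns.
INDICATOR_AREAS = {
  "Utility Costs"        => ["Utility costs", "Utility usage"],
  "Relocation and Churn" => ["Organizational moves", "Cost of moves", "Churn rate"]
  # ... remaining areas of Table A.1 omitted for brevity
}.freeze

INDICATOR_AREAS.each do |area, names|
  names.each do |name|
    # Idempotent insert, so the seed script can be re-run safely.
    Indicator.find_or_create_by!(area: area, name: name)
  end
end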


A.2 Indicators Table 2

Space Management: Rooms by building; Synchronize room percentages; Highlight rooms by department; Financial statement for charge-back; Actual cost vs. budgets for departments; Historical space usage by department.
Asset Management: Equipment standards and inventory; Depreciation schedules for assets; Equipment disposition history.
Operations and Maintenance: Work orders scheduled vs. completed; Rooms with active work orders; Service level agreements; Parts usage history; Planning board for labor resources.
Capital Budgeting: Approved projects by funding year; Available capital and expense funds; Budget by program.
Geo-spatial Views: Facilities and site infrastructure master planning; Utilities, cable plant and network management; Environmental health and safety compliance; Emergency preparedness and response.

Table A.2: List of reports of the ARCHIBUS FM [64] software package for Educational Institutions, organized by category.


Appendix B

Questionnaires

B.1 Routine Cleaning Quality Questionnaire

Routine Cleaning (options per question, from 0% to 100%):
Workspace Frequency: 0% = 1x per week; 25% = 2x per week; 50% = 3x per week; 75% = 4x per week; 100% = every day.
Toilets Frequency: 0% = < 2x per week; 25% = 2-3x per week; 50% = every day; 75% = 2x per day; 100% = > 2x per day.
Staff Supervision: 0% = poor supervision by area managers; 50% = acceptable supervision by area managers; 100% = expert supervision by area managers.
Cleaning Standard: 0% = very inconsistent and of poor standard (noticeably unclean on inspection); 25% = usually inconsistent and below standard (numerous issues to action on inspection); 50% = consistently to an acceptable standard (issues to action on inspection); 75% = usually consistent, of a high standard (few issues to action on inspection); 100% = always consistent and of high standard (very clean on inspection).
Customer Service: 0% = cleaning staff are impolite and not very helpful; 25% = cleaning staff are polite, but not very helpful; 50% = cleaning staff are polite and helpful; 75% = cleaning staff are proactive in offering service; 100% = cleaning staff go above and beyond the call of duty.
Staff Presentation: 0% = cleaning staff look untidy and are often out of uniform; 50% = cleaning staff look acceptable and occasional exceptions are promptly rectified; 100% = cleaning staff look tidy and are always in uniform.
User Complaints: 0% = monthly complaints/staff base > 20%; 25% = monthly complaints/staff base = 15-20%; 50% = monthly complaints/staff base = 10-15%; 75% = monthly complaints/staff base = 5-10%; 100% = monthly complaints/staff base < 5%.

Table B.1: Example of Routine Cleaning Quality Questionnaire. The percentages correspond to: 0% - very poor, 25% - poor, 50% - average, 75% - good, 100% - very good. Personnel should select the option closest to their situation.
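One plausible way to turn the answers of such a questionnaire into a single quality indicator is to map each selected option to its column percentage and average the results. The Ruby sketch below illustrates this; the SCALE constant and the method name are illustrative assumptions, not the prototype's actual code.

# Hypothetical scoring sketch: each answer is the index (0..4) of the
# selected option in its row; rows with only three options (e.g. Staff
# Supervision) would map to indexes 0, 2 and 4.
SCALE = [0, 25, 50, 75, 100].freeze # 0% very poor ... 100% very good

def cleaning_quality_score(answer_indexes)
  return 0.0 if answer_indexes.empty?
  scores = answer_indexes.map { |i| SCALE.fetch(i) }
  scores.sum.to_f / scores.size
end

# Example: every day (100%), 2x per day (75%), polite and helpful staff (50%).
cleaning_quality_score([4, 3, 2]) # => 75.0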


B.2 Special Cleaning Quality Questionnaire

Special Cleaning (options per question, from 0% to 100%):
Flooring Frequency (deep cleaning): 0% = < 2x per annum; 25% = 2x per annum; 50% = 3x per annum; 75% = 4x per annum; 100% = > 4x per annum.
Partitions Frequency: 0% = < 2x per annum; 25% = 2x per annum; 50% = 3x per annum; 75% = 4x per annum; 100% = > 4x per annum.
Windows Frequency: 0% = < 2x per annum; 25% = 2x per annum; 50% = 3x per annum; 75% = 4x per annum; 100% = > 4x per annum.

Table B.2: Example of Special Cleaning Quality Questionnaire. The percentages correspond to: 0% - very poor, 25% - poor, 50% - average, 75% - good, 100% - very good. Personnel should select the option closest to their situation.


B.3 Users Questionnaire about KPIs

Figure B.1: First page of the users questionnaire to understand the usefulness of the selected indicators to real users.


Figure B.2: Second page of the users questionnaire to understand the usefulness of the selected indicators to real users.

B.4 Usability Testing: Google Form


Figure B.3: First page of the Google Form questionnaire used in usability testing.


Figure B.4: Second page of the Google Form questionnaire used in usability testing.


B.5 Usability Testing: Scenarios and Tasks

Figure B.5: Scenarios and Tasks from Usability Testing.


Appendix C

Prototype

C.1 Measures Screen

Figure C.1: Measures prototype screen, showing the metrics table. New measures can be added by importing an XLS file or by manual input, and a specific KPI can be searched by name, start date, or end date.
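As a rough illustration of the import path mentioned in the caption, the Ruby sketch below loads measures from a CSV export of the spreadsheet. The column names and the Measure model are illustrative assumptions (the prototype itself accepts XLS files).

require "csv"
require "date"

# Hypothetical import sketch: assumes a CSV export with a header row
# "name,value,date" and an ActiveRecord Measure model with those columns.
def import_measures(path)
  CSV.foreach(path, headers: true) do |row|
    Measure.create!(
      name:  row["name"],
      value: row["value"].to_f,
      date:  Date.parse(row["date"])
    )
  end
end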


C.2 Benchmarking Reports Screen

Figure C.2: Benchmarking Reports prototype screen, presenting the different KPI graphs for a specific facility. Results can be filtered by business sector, year, country, or city.
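The filters shown in this screen map naturally onto a database query. The Ruby sketch below is illustrative only; the Facility and Measure models and their column names are assumptions, not the prototype's actual schema.

# Hypothetical benchmarking filter: facilities narrowed by sector and
# location, then their measures restricted to the selected year.
def benchmark_measures(sector:, year:, country:, city:)
  facilities = Facility.where(business_sector: sector, country: country, city: city)
  Measure.where(facility: facilities)
         .where(date: Date.new(year, 1, 1)..Date.new(year, 12, 31))
end

# Example call (hypothetical data):
# benchmark_measures(sector: "Banking", year: 2014, country: "Portugal", city: "Lisboa")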

C.3 User Details Screen

Figure C.3: User details editing screen. User information can be changed, as well as the details of the different user roles.


C.4 Home Screen Prototype

Figure C.4: Home screen for unauthenticated users.

C.5 Facility Details Screen

Figure C.5: Facility Details screen. A new facility can be added, or the details of the current facility edited. Attributes can also be added, that is, measures that change only sporadically, such as the NFA.


C.6 Sign In Screen

Figure C.6: Sign In Screen.
