
Proactively driving operational performance through visual management

Steven White FIOM


In an ever more competitive business landscape, businesses are on the lookout for new ways to improve performance or make efficiencies. This article looks at how Severn Trent Water approached improving performance, examining some of the key considerations when developing metrics to track and drive performance, targeted at different audiences within the business.

‘If you can’t measure it, you can’t improve it’1

Severn Trent Water is a water company operating in the UK, servicing 4.3 million homes and businesses across the Midlands and mid-Wales, providing clean water and taking waste water away. In the pursuit of improving levels of service, we set out to understand how teams were driving and tracking their own performance, sharing best practice through internal and external benchmarking and drawing from management tools, techniques and academia.

In 2008, an internal improvement team was set up to help shape the culture of Severn Trent. Continuous improvement from all staff was, and still is, actively encouraged and is supported by tools and techniques to aid and embed improvements for a lasting impact. The pace of change was driven through empowering all employees, especially those in front-line positions who are closest to customers and processes and in the perfect position to identify opportunities.

However, we found a trade-off had occurred. With teams developing metrics to drive their own performance, metrics to monitor the full end-to-end process performance had not been considered, which meant that some key performance indicators (KPIs) were not being reviewed. We realised we had a golden opportunity to learn from teams that were driving performance and apply some theoretical principles to help teams understand how they can proactively influence performance, as well as ensure measures are in place to review overall end-to-end performance.

We defined our goal statement, such that for each key commitment we should:

• Use proactive leading indicators, as well as lagging measures

• Present this information to teams that can influence them

• Provide a line of sight of how each team can influence commitment performance

Define key commitments

Our starting point was to define our key commitments as a business. These consisted of external commitments to customers, such as 'Responding to customer requests within four hours', and to regulators, such as 'How many times we leave customers without a supply of water'. Internal commitments to the board of directors or executive committee could include commitments such as reducing operating costs or increasing productivity by fixing more leaks each day, or commitments to staff such as reducing the number of accidents at work.

The water industry in the UK is regulated and a business plan is submitted to the regulator every five years. For the latest regulatory period, 2015–20, water companies submitted a business plan where, working with regulators and customers, SMART (specific, measurable, achievable, realistic and time-bound)2 commitments were identified, with targets being agreed based on acceptability to the customer, historic performance and planned interventions.

A prioritisation exercise was undertaken on these commitments to understand where the business would gain the largest benefit through focused effort. We considered the size of the gap between projected and desired performance, as well as the resulting impact of poor performance, be that service or customer failures, fines, reputational impact or licence conditions.
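
As a rough illustration of this kind of prioritisation, the sketch below scores a handful of commitments by the size of their performance gap weighted by the consequences of poor performance. The commitment names, figures and weighting scheme are invented for the example and are not Severn Trent's actual model; it is only a minimal sketch of the idea of combining gap size with impact.

```python
# Illustrative only: hypothetical commitments, gaps and impact weights.
# Score = performance gap (desired - projected, as a fraction) x impact weight.
from dataclasses import dataclass

@dataclass
class Commitment:
    name: str
    projected: float   # projected performance, e.g. 0.82 = 82% of the target level
    desired: float     # desired performance level
    impact: int        # 1 (low) to 5 (high): fines, licence risk, reputation

    @property
    def score(self) -> float:
        gap = max(self.desired - self.projected, 0.0)
        return gap * self.impact

commitments = [
    Commitment("Respond to customer requests within four hours", 0.88, 0.95, 4),
    Commitment("Reduce interruptions to water supply", 0.80, 0.90, 5),
    Commitment("Reduce office energy consumption by 10%", 0.97, 1.00, 2),
]

# Largest benefit from focused effort first
for c in sorted(commitments, key=lambda c: c.score, reverse=True):
    print(f"{c.score:.2f}  {c.name}")
```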

Once a list of commitments had been finalised, a senior management sponsor was agreed for each measure. The role of the sponsor was multifaceted: to act as a champion of the measure through supporting the activity; to help manage resistance by removing blockers and bottlenecks in the process; and, most important, to promote and communicate the importance of the activity so teams and individuals would be more willing to engage fully in the process, participate in workshops and use the newly developed measures to drive performance.

Use proactive leading indicators

We understood there may be gaps in reporting and brought people together to share existing reporting outputs and procedures formally and to share best practice. Working with key stakeholders and subject matter experts from around the business, workshops were organised to identify any gaps in reporting and define new measures. Each of the new measures was peer reviewed with the end-users, so their learning and feedback could be included in the final build. Engaging with teams at this early stage paved the way for an easier rollout process, with teams primed to accept new measures.

One of the most striking observations made during the workshops was that although we had a good understanding of our processes, we focused on lagging indicators at almost all of the levels within the business. Lagging indicators are defined as after-the-event measurements, such as counts of failure. This could be the amount of energy we use, which can only be calculated once we have used the energy. To drive performance effectively, we needed to include more leading indicators, or proactive measures. These fall under three main categories:

• Behaviours: a cultural change to encourage the business to do something differently – for example, turning off all personal computers after use

• Process – for example, installing software to put computers into hibernation mode automatically when not in use

• Asset upgrade – for example, replacing old energy-hungry equipment with more energy-efficient models

Leading indicators act as an early warning system for the business and will indicate potential future changes in performance of the lagging indicators or commitments. For example, if we have a target of reducing energy consumed in the office by 10%, but budgets have been frozen and this has arrested the upgrade of energy-efficient computing hardware and lighting, which were seen as two of the biggest contributors to performance, we can take this as a sign that we are unlikely to achieve our 10% efficiency target.
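
As a minimal sketch of how such an early-warning check might work, the snippet below compares the savings still expected from each leading initiative against the 10% commitment. The initiative names and figures are hypothetical, chosen only to mirror the frozen-budget example above, not taken from the Severn Trent system.

```python
# Hypothetical leading indicators for a 10% energy-reduction commitment.
# Each initiative carries an expected contribution and an on-track flag.
TARGET_REDUCTION = 0.10  # 10% of current consumption

initiatives = {
    # name: (expected contribution to the target, on_track?)
    "Replace energy-hungry computing hardware":  (0.05, False),  # budget frozen
    "Upgrade office lighting":                   (0.03, False),  # budget frozen
    "Behaviour change: shut down PCs overnight": (0.03, True),
}

# Only initiatives that are still on track contribute to the projection
projected = sum(saving for saving, on_track in initiatives.values() if on_track)

if projected < TARGET_REDUCTION:
    shortfall = TARGET_REDUCTION - projected
    print(f"Early warning: projected saving {projected:.0%} "
          f"falls {shortfall:.0%} short of the {TARGET_REDUCTION:.0%} target")
else:
    print("Leading indicators suggest the commitment is on track")
```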

Presenting the right information to the right people

During these exploratory workshops, we found teams of the same function but in different geographical locations had been measuring performance of the same commitment slightly differently. Reports had been created using data from different sources, with different reporting timescales and even different filters on the data, including and excluding different criteria. Each report then presented different outputs, which prevented comparative performance from being viewed and made the sharing of lessons learned or best practice more difficult. We resolved such issues by aligning standards across areas to provide consistency in reporting, enabling areas to be compared. These reports were then produced by a central team, which freed up resource in the departments.

Figure 1: Example of how teams can proactively influence performance of a commitment to reduce energy consumption by 10%

To ensure we did not overwhelm teams by expecting them to review too many KPIs, measures were targeted at specific teams rather than rolled out on a wider scale. This meant that teams would be able to focus on key metrics where they could directly influence performance, rather than more generalised 'for info' measures that do not drive specific teams into action. Where teams were relatively diverse, influencing a number of the key measures, a Pareto prioritisation exercise3 was conducted to identify which measures would have the greatest impact on performance in the first instance.
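
A minimal sketch of that kind of Pareto cut is shown below: candidate measures are ranked by their estimated impact and a team is asked to focus first on the few that account for roughly 80% of the total, with the remainder recorded for later. The measure names and impact estimates are invented for illustration only.

```python
# Illustrative Pareto prioritisation: keep the highest-impact measures
# until they cover ~80% of the total estimated impact.
candidate_measures = {
    "Energy used by personal computing": 35,
    "Energy used by heating":            25,
    "Lighting left on out of hours":     20,
    "Printer and copier usage":           8,
    "Kitchen appliances":                 7,
    "Standby load of AV equipment":       5,
}

total = sum(candidate_measures.values())
cumulative = 0.0
focus, backlog = [], []

for name, impact in sorted(candidate_measures.items(),
                           key=lambda kv: kv[1], reverse=True):
    if cumulative < 0.8 * total:
        focus.append(name)          # review and drive performance now
    else:
        backlog.append(name)        # recorded, but not rolled out immediately
    cumulative += impact

print("Focus on now:", focus)
print("Record for later:", backlog)
```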

Even though some of the lower-priority measures may not be used immediately, they were recorded for two reasons:

1. If we found performance dipping even though we were on track with our lower-level performance indicators, we may not have prioritised correctly. By keeping this record of all measures, they could be swapped, so new measures could be brought in to replace those found to have less of an impact on performance than expected.

2. Looking holistically at driving performance across the suite of previously identified commitments, if performance is sustained at an acceptable level and change initiatives have been embedded to sustain performance, these measures could be retired. When performance is sustained longer term, review of these measures could move towards a monthly or quarterly check-in to ensure performance remains stable.

Providing a line of sight of how each team can influence performance

As the suite of measures was developed at the workshops, it was represented visually, following the principles of critical to quality trees4, to make it clear how each team and each level of management within the business can contribute to business performance. We found that this visual representation of the key metrics and the teams involved brought previously disparate teams closer together, creating better relationships through a common cause and breaking down working silos as colleagues started to see how each team influences the overall performance of the commitment.

Let us look at the monthly energy bill in an office (see Figure 1). Our key commitment is to reduce energy consumption by 10% by September 2016, following the principles of SMART objectives. The four levels of the pyramid represent the layers of management in the business, from the director level, who would review the 10% reduction in energy consumption outlined in orange, through the organisation to the team leaders on the lower level of the pyramid, highlighted in purple. It is important to note that in most instances approximately 80% of the business sits at either the team management level or their direct reports, so metrics need to be targeted at these teams to maximise impact.

Whilst developing measures, if we ask the question 'What can we influence that will drive this performance?', we will identify themes of improvement. Within this example we have focused on two key areas of energy consumption that the teams can manipulate and that analysis showed would have the highest impact: energy used by key types of equipment, and energy consumed by lighting. These two measures will be monitored by senior management. In this example, it is important to note that these measures are targeted at those who can influence them. The call centre managers are responsible for the equipment they use, shown by the grey-shaded boxes, whereas office lighting is controlled by the facilities management teams, shown by the blue-shaded boxes.

The second level of the organisational structure, middle managers who report to senior management, would review measures that influence the level above. In this example, call centre middle management would have measures around energy consumed by heating and a second measure covering the energy used by personal computing. The team managers in turn would influence each of these, so there may be a target around fitting each site with thermostatic controls, which will influence energy consumption through heating. Three key measures to influence energy used through personal computers would be:

• Computers can be standby-enabled through an update that will put them into hibernation when not in use

• The percentage of computers left on overnight can be influenced by changing behaviours to ensure computers are shut down when users leave the office

• Understanding current assets and replacing them with more energy-efficient models where available

Looking at the facilities management teams, senior management would have a measure looking at the overall energy consumed by lighting, middle management would look at lighting by floor, and finally team managers may be targeted on installing energy-saving lightbulbs, all of which will enable teams to drive performance.
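
To make the worked example concrete, the nested structure below is one way the Figure 1 tree could be captured in code, with each management level owning the measures it can influence. The layout follows the walkthrough above, but the structure and the measure wording are only an illustrative sketch, not the tool Severn Trent uses.

```python
# Illustrative critical-to-quality tree for the Figure 1 example:
# commitment -> senior-management measures -> middle-management measures
# -> team-level measures that each team can directly influence.
ctq_tree = {
    "Reduce energy consumption by 10% by September 2016": {          # director level
        "Energy used by key equipment (call centre managers)": {     # senior management
            "Energy consumed by heating": [
                "Thermostatic controls installed at each site",
            ],
            "Energy used by personal computing": [
                "Computers standby-enabled via software update",
                "% of computers left on overnight",
                "Old assets replaced with energy-efficient models",
            ],
        },
        "Energy consumed by lighting (facilities management)": {     # senior management
            "Lighting energy by floor": [
                "Energy-saving lightbulbs installed",
            ],
        },
    },
}

def print_tree(node, depth=0):
    """Walk the tree so each level of management sees its line of sight."""
    if isinstance(node, dict):
        for measure, children in node.items():
            print("  " * depth + measure)
            print_tree(children, depth + 1)
    else:  # a list of team-level measures
        for measure in node:
            print("  " * depth + measure)

print_tree(ctq_tree)
```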

Lessons learnt

It has been an interesting journey and we have learnt many lessons along the way:

1. The workforce had been equipped with the tools to improve continuously and teams had embraced this. However, some teams struggled to share lessons learnt and best practice, primarily because of a lack of visibility of the roles and responsibilities of other teams. Getting a cross-section of key stakeholders together at workshops helped facilitate better working relationships between teams and the sharing of lessons learnt. Gaps in reporting were spotted and key performance metrics were developed, with the added benefit of smoothing the rollout of metrics by giving those teams the ability to shape the metrics they would be using.

2. Proactive leading measures help to arrest slipping performance before it becomes a problem.

3. When developing metrics to drive performance against commitments, it is important to understand the capacity of the teams using them. Initially, too many metrics were proposed. Some teams that influenced several key measures had around 40 measures, and it was not feasible to expect teams to review and drive performance against each of them. This was countered by reducing the targeted measures for each team to a more manageable number of metrics and, as noted earlier, swapping these with other metrics to refocus teams once performance stabilised.

4. Documenting measures and the proposed metrics that drive performance helped cement cross-team cohesion, because individuals have a line of sight to how their work impacts on other teams and, ultimately, the commitment.

In summary, there will always be competing views on which performance indicators are the most important. Agreeing KPIs and getting buy-in from the business was crucial to success, as it allowed the business to focus improvement activities in a prioritised order. Getting key stakeholders together to discuss how best to track performance around a prioritised measure, sharing best practice, standardising measures and exploring more of the proactive leading indicators got teams thinking about how their day-to-day activities could contribute to higher-level business commitments, as well as providing a line of sight of how each team can influence performance.

Performance needs to be measured before it can be improved. At Severn Trent, we now have a framework in place to monitor performance proactively against key commitments and are much more nimble, enabling us to react quickly to changing performance.

About the author

Steven White FIOM is a Senior Business Analyst, Severn Trent. He has used his black belt in Lean Six Sigma to implement a system that proactively drives performance.

References

1. DRUCKER, P (1993), The Practice of Management (reissue), Harper Business

2. DORAN, G T (1981), 'There's a S.M.A.R.T. way to write management's goals and objectives', Management Review, 70, 35

3. GALLOWAY, L, ROWBOTHAM, F and AZHASHEMI, M (2000), Operations Management in Context, Butterworth-Heinemann

4. ECKES, G (2003), Six Sigma for Everyone, John Wiley & Sons
