
Business Improvement: Analysis of Benchmarking Information Provides New Insights into How to Improve Maintenance Performance

International Maintenance Conference (ICOMS), Australia, June 2000
Richard Blayden, Principal Consultant - Maintenance, Hatch Associates Pty Ltd

Abstract
Hatch is a global consulting practice that helps clients in mining, metals and manufacturing businesses to make breakthrough improvements in their business performance. Maintenance is a critical component for organisations operating in these industries. In this paper we illustrate how our benchmarking and improvement process can unlock value for the business by making the right improvement decisions. Working under the sponsorship of its Global Maintenance Network, we have helped BHP Billiton embark on an extensive program to benchmark the performance of all of its major operations using a detailed Maintenance Evaluation process. To date, these evaluations have been applied to over 50 sites both within and outside of BHP Billiton. The benchmarking results have provided a clear picture of the strengths and improvement opportunities in maintenance over a diverse range of industrial and mining operations. They indicate that in many cases, improvement initiatives implemented over the past years have not fully delivered to expectation. By looking behind these benchmarking results to find the underlying factors that have prevented sustainable improvement, we have gained a new insight into what impedes our rate of progress, and why. Through the application of root cause analysis, we have learned that our improvement strategy has to focus more on encouraging and helping people with the necessary knowledge sharing, alignment and learning processes which enable effective improvement progress. This is in contrast to the traditional approach of continuous implementation and upgrade of systems and procedures. We have learned much from this exercise. In the short time available, this paper attempts to share some of the key elements of these learnings.

BACKGROUND
The Maintenance Evaluation* process was first developed under the sponsorship of BHP Billiton Minerals International by a team comprising BHP Billiton Minerals, BHP Billiton Engineering, and Fluor Daniel (USA). The first evaluations were conducted at the Escondida Copper operations in Chile in 1996. Since then, many other sites have undergone the evaluation process, and a few have undergone a second pass to assess improvement progress and to validate the changes made against the actual business performance improvement achieved. The evaluation process spans several weeks of preparation work leading up to an intensive two-week, on-site assessment by a team of six to eight people. The team is made up of experienced evaluators plus three or four people from the operation being evaluated. Site people are included to provide local knowledge, build ownership of the process, and continuously validate the results. This enables them to learn the process and to help share ownership of the outcomes (and associated improvement actions) throughout their organisation. * See note at end.

EVALUATION PROCESS AND RESULTS


The Maintenance Evaluation process involves a considerable data gathering and review effort followed by facilitated group discussion in the scoring, validation and analysis phases. It is conducted generally as illustrated in figure 1. Scoring is assessed by comparing the information collected against a template of best practice processes and procedures. The scored results are provided in graphic form as illustrated in figure 2.
[Figure 1: The evaluation process - team selection (site coordinator, experienced team leader, experienced evaluator, and site representatives), site pre-work, data gathering and two weeks of site discussions using question sets based on a template of 22 elements of best practice, followed by facilitated group scoring, analysis and inference with reality checks, report preparation, and presentation and discussion.]

[Figure 2: Maintenance Evaluation element scores for ~53 sites to November 1999 - scores (0-100) plotted for each element (Safety, Environment, Organisation, Work Allocation, Work Completion, Equipment Strategy, Employee Development, Budget & Cost, Performance Measurement and others), highlighting strengths (best practice) at the high end and networking and leverage opportunities (to learn and improve) at the low end.]

HOW CAN WE USE THESE RESULTS?

The immediate application of these results is to show how maintenance at the site being evaluated compares against all of the other sites already evaluated. It is then easy to identify the strengths of the site, and to highlight areas where improvement may be necessary. Information and contact names for high-scoring sites can be made available to enable networking and sharing of best practice ideas with the lower-scoring sites.

There are, of course, two important aspects to this type of benchmarking exercise. One is to see how an organisation measures up in the wider world of maintenance. The second, and probably more important, is to establish direction, targets and priorities for improving the way we do things. The common approach to identifying improvement opportunities is gap analysis: look at the low-scoring areas and determine the desired improvement actions. Figure 3 illustrates how a typical gap analysis might lead us to define particular improvement actions.
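To make the comparison step concrete, the sketch below flags a site's low-scoring elements against a database of previously evaluated sites. It is purely illustrative: the element names, scores and percentile threshold are invented for this example, and the real Maintenance Evaluation uses a richer template of 22 elements.

```python
# Illustrative sketch only: element names, scores and the threshold are
# hypothetical, not taken from the Maintenance Evaluation process itself.

def percentile(all_scores, score):
    """Fraction of previously evaluated sites scoring at or below `score`."""
    return sum(1 for s in all_scores if s <= score) / len(all_scores)

def flag_elements(site_scores, database, threshold=0.25):
    """Split a site's element scores into strengths and improvement candidates.

    site_scores: {element: score 0-100} for the site being evaluated
    database:    {element: [scores of previously evaluated sites]}
    """
    strengths, gaps = [], []
    for element, score in site_scores.items():
        p = percentile(database[element], score)
        (gaps if p <= threshold else strengths).append((element, score))
    return strengths, gaps

# Example: a site strong in Safety but weak in Planning relative to the database.
database = {
    "Safety":   [60, 65, 70, 75, 80, 85, 90, 95],
    "Planning": [40, 45, 50, 55, 60, 65, 70, 75],
}
site = {"Safety": 88, "Planning": 42}
strengths, gaps = flag_elements(site, database)
```

A low score flagged this way is only a symptom, as the paper goes on to argue; the sketch identifies where to look, not what the underlying problem is.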

[Figure 3: Traditional approach - gap analysis and improvement plan. Symptoms (low score in planning; poor performance in work control; poor measurement and improvement; etc.) are matched to solutions (recruit or train more planners; develop more (RCM) preventive maintenance; design or provide better systems; set up KPIs and failure analysis), and these are collated into an improvement plan, which is then implemented.]

The question is, does this approach lead us to the desired outcome? Can we reliably achieve sustainable improvement, simply by taking actions that appear to directly address, and therefore close the gaps (or improvement opportunities) identified by our benchmark? The answer to both these questions is generally no. We will not reliably achieve sustainable improvement simply by applying these solutions, even though they may appear to be technically correct. We will fail because we have not understood the problem that we are trying to solve.


THE SITUATIONAL ROOT CAUSE ANALYSIS


Suppose, for example, that we have identified low scores in the elements associated with maintenance work control. It would be easy to judge poor planning as a key cause of the problem. In reality, however, the low scores in these (or any other) elements are not necessarily a problem - but they are definitely symptoms. We will not identify the real problems until we probe deeper to find out why we have a particular pattern of low scores.

Almost all of the sites that have contributed to our database of benchmark information have been in operation for many years. During this time they have invested heavily in maintenance and business improvement themes. These have included various upgrades in maintenance management (CMMS) systems, procedures and technologies, and significant effort in the education and training of the people employed. Have these initiatives really worked? Figure 4 gives us another look at the overall benchmark results, but this time, let's treat them as a measure of our success in past improvement initiatives.
[Figure 4: The benchmark results regrouped as a measure of past improvement success - scores (0-100) clustered into four areas: Systems (consistently high); Concepts (Environment, Policy, Employee Development, Safety, Organisation); Planning & Work Control (Work Allocation, Work Completion, Equipment Strategy, Scheduling, Shutdown, Materials, Work Origin, Contracts, Planning); and Measurement & Analysis (Performance Measurement, Failure Analysis, Budget & Cost, Facility & Equipment, MIS Management, Drawings & Documents, Plant Acquisition, C.I. Management).]

Across the board, we see consistently high scores in the systems element. These verify the installation and availability of appropriate (CMMS) information systems. However, a large proportion of sites still score low in the elements associated with maintenance planning and work management. If the systems are available, then clearly, these sites have not learned how to make effective use of them.

A large proportion of sites also score low in the elements associated with measurement, analysis and improvement. Why? The most common reason given is that "our history data is not good enough". But look at the low scores in planning and work control - the elements where we are supposed to develop and record the data that we might later analyse. Why don't we collect good quality data? Simple: because nobody ever uses it. We have a recursive problem in a negatively reinforcing loop. When we look deeper to find out why this is happening, we find a situation where there are many of these loops, all interlinked, and all cascading in the same, negatively reinforcing direction.

A significant clue to why this is happening lies in the area highlighted as concepts. In the element of maintenance policy we tend to find statements that allude to world's best practice but give little indication of how that might be achieved. In the realm of employee development, the focus is predominantly on individual skills and capability development. What's missing here is simple. We have not sat down as a group of people to discuss and agree how we will work together. We often pull in opposing directions rather than in alignment.


THE CHALLENGE - TO DO SOMETHING DIFFERENT


Although many organisations seem to have difficulty in doing them well, the maintenance work management processes of planning, scheduling, measurement, and improvement are all basically simple. Generally, we already have the systems, procedures and skills required, so the problem does not lie in our technical ability to carry out these processes. Why do we have this apparent difficulty? Our experience has shown that in many cases, these work management processes are undermined (and therefore partially disabled) by our failure to work together effectively within the organisation. On the basis of these learnings, the approach we are now taking to help sites achieve fast and effective ramp up to sustainable improvement includes a lot more time in learning and developing strategy. This is illustrated in figure 5.

[Figure 5: The old way - evaluation and benchmarking followed directly by an improvement kick-off, leaving open the question of how to make (facilitate) it happen. The new way - situation appraisal, root cause analysis, learning, sharing and alignment, strategy development, practice and capability building, then planning and implementing the improvement, with realisation, reassurance, learning, planning and implementation stages leading to measured achievement.]

Maintenance Benchmark Evaluation
Conducting the benchmark evaluation process is an excellent way to establish a current assessment of site performance because it provides a validated baseline measure that helps the organisation to recognise the starting point and the potential for improvement.

Situation Appraisal
This is the start of the root cause analysis process where, based on the results of the benchmark evaluation, we look behind the scenes to find out what is actually happening in real time to cause the results to be as they are. This is usually presented pictorially to illustrate the interlinking of the vicious circles which cause the situation to be as it is.

Realisation / Reassurance
Just about every organisation we evaluate is made up of people who are justifiably proud of what they have built in terms of a long-standing and successful business. The idea that it may not be as good as they think is often hard to swallow. We need to help these people through a process of realisation and reflection that leads to dissatisfaction with the current situation and hence creates the desire for improvement. An important part of this process is to provide a level of reassurance that it's not all bad, and that there are things we can do without too much pain.

Learning
The learning stage is designed to share ideas and concepts on the "should be / could be" situation whilst applying further root cause analysis to elements of the current situation in order to establish and agree an improvement strategy. Once the improvement strategy has been agreed, time is then allowed for the people involved to learn about, practise and validate the improvement themes (actions) contained in the strategy.

Planning / Implementation
It may appear that implementation planning occurs late in this improvement process - this is deliberate. There is no point in planning to implement improvement initiatives unless we are confident that we are doing the right things (i.e. that they will produce the desired results), and that we know how to make them work. By deferring the implementation planning stage until later in the process, we are better assured that we have a viable plan that is owned and supported by the people involved.

MEASURED ACHIEVEMENT
The process for translating the improvement strategy into specific actions that can be aggregated into an overall improvement plan is based on the development of one or more task briefs. This is a facilitated process whereby, for each element of the improvement strategy, we work together to develop work specifications that contain the following elements:

- Task title - a concise description of what needs to be done.
- Objective - a clear statement of the specific outcome that is required.
- Purpose - a statement of the business benefits to be achieved by that outcome.
- Assessment criteria - details of how we will measure and monitor specific parameters against set targets to provide the information we need to judge whether or not we have done the task successfully.
- Issues - a list of the issues, considerations and constraints to be addressed whilst carrying out the task.
- Team - who will be responsible for (or involved in) carrying out the task.
- Time scale - when the task, or task milestones, need to be completed.

This approach helps generate a shared understanding and ownership of the improvement work that needs to be done, in terms of scope, detail and expectation. It also provides a baseline from which we are able to demonstrate progress in developing and implementing the improvement tasks, and to measure the effectiveness of those tasks in helping to achieve the expected benefits.
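To make the structure of a task brief concrete, it can be sketched as a simple record. The field names mirror the work specification elements described above; the example contents are invented for illustration and are not drawn from any actual site.

```python
from dataclasses import dataclass, field

# Illustrative only: a minimal model of the task-brief work specification.
@dataclass
class TaskBrief:
    title: str                 # concise description of what needs to be done
    objective: str             # the specific outcome that is required
    purpose: str               # the business benefits to be achieved
    assessment_criteria: list  # parameters and targets used to judge success
    issues: list = field(default_factory=list)  # considerations and constraints
    team: list = field(default_factory=list)    # who is responsible or involved
    time_scale: str = ""       # when the task or its milestones must be done

# Hypothetical example brief (contents invented for illustration):
brief = TaskBrief(
    title="Improve weekly work scheduling compliance",
    objective="Achieve and hold 85% schedule compliance within six months",
    purpose="Reduce unplanned downtime and overtime cost",
    assessment_criteria=["weekly schedule compliance %", "overtime hours"],
    team=["maintenance planner", "area supervisor"],
    time_scale="6 months",
)
```

Capturing each brief in a uniform structure like this is what allows the briefs to be aggregated into an overall improvement plan and tracked against their assessment criteria.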

THE LEARNING PROCESS


We enable the improvement process through a series of facilitated workshops that form a bridge between the evaluation (benchmark) stage and the kick-off of improvement implementation, as illustrated in figure 6. It is worth re-stating an important issue: one of the key learnings from the root cause analysis of these benchmarking results is that, by not working together effectively, we undermine (and therefore disable) our intrinsic ability to manage maintenance. This is in spite of the fact that the maintenance management processes we use are basically simple, and most individuals already have a good knowledge of them. The real problem lies in the area of group understanding and alignment. Whilst we all have good knowledge of the same subject matter, we hold it in the context of our own, individual interpretations of scope, detail, relevance and importance. Without a common context, we are always at risk of working in opposition rather than in unison.

[Figure 6: Timeline of the improvement process - the maintenance evaluation creates arousal; without follow-through this decays into lack of action and loss of impetus. Maintenance improvement workshops bridge the gap between evaluation and implementation through reflection, excitement, realisation and root cause analysis, learning and strategising, planning and implementation, with ongoing support. Timing is set to suit the site: long enough to reflect, quick enough to maintain momentum.]

We use our Strategy Based Learning (SBL) methodology to facilitate the development of a shared understanding of what improvement actions need to be implemented in a common context of why they are necessary, and how they will work. This alignment process is illustrated in figure 7 and leverages our extensive library of business process templates and learning materials designed specifically to support this learning process.

[Figure 7: The facilitated learning process (based on Strategy Based Learning (SBL™) techniques) - participants arrive with diversity: differing levels of knowledge and understanding, and differing opinions on relevance and importance. Supported by workshop materials and the sharing of industry knowledge, ideas and best practice, the group discusses current ideas, concerns and perceptions, shares ideas and concepts, applies root cause analysis, and discusses how to go forward ("teaching, technology, tension; sharing, learning, achieving"), arriving at alignment, ownership and an action plan - under-pinned by organisational commitment and support, with on-going development and implementation support.]

THE RESULTS SO FAR - SOME COMMON THEMES


We have developed and applied this improvement approach at many sites over the past two to three years. Our experience has shown that there are many recognisable patterns in the situations encountered across the individual sites. First of all, we have found that the rate at which people identify and successfully resolve problems (or barriers to improvement) can be increased dramatically, and without too much effort. This is not because people are not good at solving problems. Rather, it is because they are not good at recognising and identifying the problems in the first place, then prioritising and agreeing which ones to work on, and finally putting them to bed. Getting this side of the improvement process working more effectively has unlocked immediate value for the business, and energised the improvement process at many of the sites we have worked at.

Based on this experience, there are four common themes that are evident in almost every site improvement strategy we have developed through this approach:

1. Start Using the Information Collected So Far

Make better and more visible use of existing information to identify improvement opportunities, and to give a sense of purpose to the data recording and collection processes. This means spending a lot more time in discussion about what the information means, and how it should be interpreted and used to drive improvement. It involves working towards alignment in measuring and interpreting the current situation. The outcomes of this information analysis should also be used to identify and quantify problems that need to be resolved.

2. Start Analysing and Solving Problems

Get some early wins by identifying and resolving current problems. Focus the team on identifying and resolving the issues that inhibit progress in implementing other improvement initiatives. This follows the previous theme because the first step in resolving a problem is to recognise that the problem exists. Our experience in facilitating many workshops on the processes of problem identification, analysis and solution development has shown that significant benefits are achievable at almost every site in a very short time frame. Analysis of the results has shown that up to 97% of the solutions are non-capital, and that the benefits significantly outweigh the costs and effort required. There is one catch, however. If the solutions are non-capital, then to achieve the benefits we are dependent on changing the way people behave. That is a lot harder than simply spending capital.

3. Start Improving Work Effectiveness

This follows the problem-solving theme because improving work effectiveness usually involves implementing better systems and procedures with increased attention to detail. This implies significant extra work before any real benefit is realised. If we are already flat out busy, then we need to create the time by first resolving some of the perennial problems which, in an inefficient maintenance work management environment, consume our time unproductively, and therefore disable our capacity to work effectively.

4. Start Collecting Better Quality Information

As a result of our improved work management systems, we can now collect better quality data to drive improved use of information - i.e. recycle through the first theme.

It is worth noting the sequence in which these improvement themes are listed. Many traditional approaches to maintenance improvement implementation involve an initial thrust into improving work control systems and procedures, followed by history analysis, followed again by root cause analysis and continuous improvement. We are turning this sequence on its head, and for good reason. Whilst these key themes sound simple, they are not necessarily easy to implement. However, they do represent a simple and easily understood improvement strategy that facilitates:

- understanding of the current situation;
- effective use of information for measurement, analysis and problem identification;
- focus on achievement through visible progress in eliminating problems;
- understanding of the relevance of effective use of work control systems;
- building a hunger for better quality information.

This simplicity helps generate alignment through a common understanding and shared ownership of the improvement strategy. Get the simple themes agreed in the context of what is needed for the individual site, then use those themes as the basis for facilitated development of an appropriately detailed (and sequenced) action plan.

A further issue worth noting is the way that this approach addresses the needs of the people involved. The facilitated workshop process helps them identify, address and resolve the current issues that are important to them in their day-to-day endeavours. It helps build a shared understanding of what needs to be done and why. Their influence and involvement in co-developing the way forward demonstrates that their input is valued, and gives them ownership and control of the improvement actions which affect their future. There is no prescription or systems directive from above or outside of their immediate colleagues and their own working environment.

SUMMARY
If we continue to follow the traditional approach to improvement based on assessment, gap analysis, solution identification, and solution implementation, then the best we can hope for is more of the same. We will continue to toss solutions into the ring in the hope that things will change, but we will not be addressing the real problems. We will continue to impose large amounts of change on groups of people (who are already flat out busy) with no immediate benefits in sight. In terms of achieving significant and sustainable improvement quickly, we are probably doomed to failure from day one.

In preference to this, we are applying a strategy that focuses on the key requirement for enabling improvement - helping people to align and to develop their capability to perform better. We are doing this by helping to develop a better understanding of the current situation, then by improving the alignment in the way people share information and work together, and by focusing on identifying and addressing root causes. By following this route, we have been able to change the way groups of people think about and address their improvement needs, and we have observed a correspondingly positive effect on their progress and achievement.

Note: As part of the amalgamation of BHP Engineering into the Hatch group in November 1999, ownership of the intellectual property rights for the Maintenance Benchmarking Evaluation Process (together with many other methodologies, systems, and business process templates) has passed to Hatch Associates Pty Ltd, whilst remaining available for use within BHP Billiton through their Global Maintenance Network.
