
A framework of the factors affecting the evolution of performance measurement systems

Mike Kennerley and Andy Neely
Centre for Business Performance, Cranfield School of Management, Cranfield, UK

Keywords Performance measurement, Development, Organizational change


Abstract The effectiveness of performance measurement is an issue of growing importance to
industrialists and academics alike. Many organisations are investing considerable amounts of
resource in implementing measures that reflect all dimensions of their performance. Consideration is
being given to what should be measured today, but little attention is being paid to the question of what
should be measured tomorrow. Measurement systems should be dynamic. They have to be modified
as circumstances change. Yet few organisations appear to have systematic processes in place for
managing the evolution of their measurement systems and few researchers appear to have explored
the question, what shapes the evolution of an organisation's measurement system? The research
reported in this paper seeks to address this gap in the literature by presenting data that describes the
forces that shape the evolution of the measurement systems used by different organisations.

Introduction
Although it has long been recognised that performance measurement has an
important role to play in the efficient and effective management of
organisations, it remains a critical and much debated issue. Significant
management time is being devoted to the questions what and how should we
measure while substantial research effort, by academics from a wide variety
of management disciplines, is being expended as we seek to enhance our
understanding of the topic and related issues (Neely, 1999).
Survey data suggest that between 40 and 60 per cent of companies
significantly changed their measurement systems between 1995 and 2000
(Frigo and Krumwiede, 1999). Most of these initiatives, however, appear to be
static. Although many organisations have undertaken projects to design and
implement better performance measures, little consideration appears to have
been given to the way in which measures evolve following their
implementation (Waggoner et al., 1999). It is important that performance
measurement systems be dynamic, so that performance measures remain
relevant and continue to reflect the issues of importance to the business (Lynch
and Cross, 1991).
The authors are grateful to the Engineering and Physical Sciences Research Council (EPSRC) for the award of research grant number GR/K88637, to carry out the research reported in this paper.

International Journal of Operations & Production Management, Vol. 22 No. 11, 2002, pp. 1222-1245. © MCB UP Limited, 0144-3577. DOI 10.1108/01443570210450293
In order to ensure that this relevance is maintained, organisations need a process in place to ensure that measures and measurement systems are reviewed and modified as the organisation's circumstances change (Dixon et al., 1990). Yet few organisations appear to have systematic processes in place for managing the evolution of their measurement systems and few researchers appear to have explored the question what shapes the evolution of an organisation's measurement system.
The research reported in this paper seeks to address this gap in the literature
by presenting a framework that describes the forces that shape the evolution of
the measurement systems used by different organisations.
Following this introduction the paper consists of a further six sections. The next section discusses the literature regarding the evolution of performance measurement systems, providing the context for the research. Descriptions of the research methodology, the case study findings and the resultant framework of factors affecting the evolution of performance measures are then presented. The subsequent discussion is followed by conclusions that are drawn in the final section.

Performance measurement literature


The problem of how organisations should assess their performance has been
challenging management commentators and practitioners for many years.
Financial measures have long been used to evaluate performance of
commercial organisations. By the early 1980s, however, there was a growing
realisation that, given the increased complexity of organisations and the
markets in which they compete, it was no longer appropriate to use financial
measures as the sole criteria for assessing success.
Following their review of the evolution of management accounting systems,
Johnson and Kaplan highlighted many of the deficiencies in the way in which
management accounting information is used to manage businesses (Johnson,
1983; Kaplan, 1984; Johnson and Kaplan, 1987). They highlighted the failure of
financial performance measures to reflect changes in the competitive
circumstances and strategies of modern organisations. While profit remains the
overriding goal, it is considered an insufficient performance measure, as
measures should reflect what organisations have to manage in order to profit
(Bruns, 1998). Cost focused measurement systems provide a historical view,
giving little indication of future performance and encouraging short termism
(Bruns, 1998).
The shortcomings of traditional measurement systems have triggered a
performance measurement revolution (Eccles, 1991; Neely, 1999). Attention in
practitioner, consultancy and academic communities has turned to how
organisations can replace their existing, traditionally cost based, measurement
systems with ones that reflect their current objectives and environment. Many
authors have focused attention on how organisations can design more
appropriate measurement systems. Based on literature, consultancy experience
and action research, numerous processes have been developed that organisations
can follow in order to design and implement performance measurement systems (Bourne et al., 1999). Many frameworks, such as the balanced scorecard (Kaplan and Norton, 1992), the performance prism (Kennerley and Neely, 2000), the performance measurement matrix (Keegan et al., 1989), the results and determinants framework (Fitzgerald et al., 1991), and the SMART pyramid (Lynch and Cross, 1991) have been proposed that support these processes. The objective of such frameworks is to help organisations define a set of measures
that reflects their objectives and assesses their performance appropriately. The
frameworks are multidimensional, explicitly balancing financial and non-
financial measures.
Furthermore, a wide range of criteria has also been developed, indicating the
attributes of effective performance measures and measurement systems. These
include the need for measures to relate directly to the organisation's mission
and objectives, to reflect the company's external competitive environment,
customer requirements and internal objectives (Globerson, 1985; Wisner and
Fawcett, 1991; Maskell, 1989; Kaplan and Norton, 1993). Others make explicit
the need for strategies, action and measures to be consistent (Lynch and Cross,
1991; Dixon et al., 1990).
At the heart of the processes, frameworks and criteria discussed, as with
much that has been written on the subject of performance measurement, is the
premise that measures and measurement systems must reflect the context to
which they are applied (Neely, 1999). Indeed as Johnson (1983) observed, the
introduction of financial performance measures, such as cash flow and return
on investment, reflected the changing marketplace in which organisations
competed. At the turn of the century sole traders were giving way to owner
managers who needed to assess the return on their investment in plant and
premises.
The performance measurement revolution has prompted many
organisations to implement new performance measurement systems, often at
considerable expense. However, unlike the environment in which organisations
operate, many measurement initiatives appear to be static. Senge (1992) argues
that, in today's complex business world, organisations must be able to learn
how to cope with continuous change in order to be successful. Eccles (1991)
suggests that it will become increasingly necessary for all major businesses to
evaluate and modify their performance measures in order to adapt to the
rapidly changing and highly competitive business environment. Numerous
authors espouse the need for reflection on measures to ensure that they are
updated to reflect this continuous change (Meyer and Gupta, 1994; Ghalayini
and Noble, 1996; Dixon et al., 1990; Wisner and Fawcett, 1991). However, there
has been little evidence of the extent or effectiveness with which this takes
place. Moreover, the literature suggests that ineffective management of the
evolution of measurement systems is causing a new measurement ``crisis'', with
organisations implementing new measures to reflect new priorities but failing
to discard measures reflecting old priorities resulting in uncorrelated and
inconsistent measures (Meyer and Gupta, 1994). Furthermore, it is suggested
that organisations are drowning in the additional data that is now being collected and reported (Neely et al., 2000). As with measurement systems introduced at the turn of the century, there is a danger that failure to manage effectively the way in which measurement systems change over time will cause new measurement systems to lose their relevance, prompting a new crisis and necessitating a further measurement revolution.
This raises a crucial question. Why do performance measurement systems fail to change as organisations change, rendering them irrelevant? This is an important question to answer if history is not to be repeated and organisations are to avoid the expense of another extensive overhaul of their measurement systems.
In the last step of their nine-step process, Wisner and Fawcett (1991) acknowledge the need for performance measures to be reviewed and changed to ensure that measures remain relevant. They highlight the need to ``re-evaluate the
appropriateness of the established performance measurement systems in view
of the current competitive environment''. Bititci et al. (2000) identify the need for
performance measurement systems to be dynamic to reflect changes in the
internal and external environment; review and prioritise objectives as the
environment changes; deploy changes in objectives and priorities; and ensure
gains achieved through improvement programmes are maintained.
Dixon et al. (1990) and Bititci et al. (2000) propose audit tools that enable
organisations to identify whether their existing measurement systems are
appropriate given their environment and objectives.
Bititci et al. (2000) go on to posit that a dynamic performance measurement
system should have:
. an external monitoring system, which continuously monitors
developments and changes in the external environment;
. an internal monitoring system, which continuously monitors
developments and changes in the internal environment and raises
warning and action signals when certain performance limits and
thresholds are reached;
. a review system, which uses the information provided by internal and
external monitors and the objectives and priorities set by higher level
systems, to decide internal objectives and priorities; and
. an internal deployment system to deploy the revised objectives and
priorities to critical parts of the system.
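
To make this architecture easier to picture, the following is a minimal, purely illustrative Python sketch of the four sub-systems Bititci et al. (2000) describe. The class and method names, the threshold values and the priority rule are invented for the example; they are not part of the authors' proposal.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ExternalMonitor:
    """Continuously watches developments and changes in the external environment."""
    observations: List[str] = field(default_factory=list)

    def scan(self, event: str) -> None:
        self.observations.append(event)


@dataclass
class InternalMonitor:
    """Watches internal performance and raises an action signal when a limit is breached."""
    thresholds: Dict[str, float] = field(default_factory=dict)

    def check(self, measure: str, value: float) -> bool:
        limit = self.thresholds.get(measure)
        return limit is not None and value < limit  # True means "action signal"


@dataclass
class ReviewSystem:
    """Combines monitor output with higher-level objectives to reset internal priorities."""
    def revise_priorities(self, signals: List[str], objectives: List[str]) -> List[str]:
        # Naive rule for illustration: flagged issues are promoted ahead of standing objectives.
        return signals + [o for o in objectives if o not in signals]


@dataclass
class DeploymentSystem:
    """Deploys the revised objectives and priorities to the relevant parts of the business."""
    def deploy(self, priorities: List[str]) -> None:
        for rank, priority in enumerate(priorities, start=1):
            print(f"{rank}. {priority}")


# One hypothetical monitoring-review-deployment pass.
external = ExternalMonitor()
internal = InternalMonitor(thresholds={"on-time delivery": 0.95})
external.scan("new competitor entering core market")
signals = ["improve on-time delivery"] if internal.check("on-time delivery", 0.91) else []
review = ReviewSystem()
DeploymentSystem().deploy(review.revise_priorities(signals + external.observations,
                                                   ["reduce unit cost"]))
```

The point of the sketch is simply that the review step sits between the monitors and deployment: priorities are only revised and cascaded once a monitor raises a signal.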
Bourne et al. (2000) suggest measurement systems should be reviewed and
revised at a number of different levels. They identify the need for review of
targets and performance against them; individual measures as circumstances
change; and the set of measures to ensure that they reflect the strategic
direction.
Although the authors discussed above propose the need to review measures
and suggest techniques for such review, there is little discussion of their
application in practice, investigation of how measures actually change or of the factors that affect how effectively and efficiently performance measurement
systems change. With a few notable exceptions (Meyer and Gupta, 1994;
Townley and Cooper, 1998; Bourne et al., 2000), empirical investigation of the
evolution of measurement systems over time remains a considerable gap in
performance measurement research (Neely, 1999).
Meyer and Gupta (1994) observe that measures tend to lose their relevance
and ability to discriminate between good and bad performance over time as
performance objectives are achieved or as behaviour no longer reflects the
performance objectives underpinning the measures. They observe that failure
to effectively manage this change causes the introduction of new measures
``that are weakly correlated to those currently in place'' so that an organisation
will have a diverse set of measures that do not measure the same thing.
Townley and Cooper (1998) undertook a longitudinal study of performance
measurement in Alberta government in Canada. They observed that support
for performance measurement can diminish over time. They observe that
measurement initiatives can suffer from loss of initial enthusiasm, which is
replaced by scepticism and disillusionment. They cited a number of causes of
this including failure to manage the change appropriately, underestimating the
effort required and lack of commitment to the change. They also identified that
political issues and the involvement of employees affect success. Not only does
their study identify factors affecting the success of performance measurement
activities, but it also highlights the need for support of such activities within
the organisation.
In a case study company, Bourne et al. (2000) observed that performance
measures changed over time. They identified that changes were prompted by
existing budgetary review processes, chance, intervention of the researchers
and eventually by design, however they provide little insight into how this
change by design took place.
Despite the limited discussion of evolution in the performance measurement
literature, it is possible to draw lessons from a variety of other streams of
literature that address issues relating to the management of change (Waggoner
et al., 1999).
Based on a review of the relevant literature, Waggoner et al. (1999)
summarise the key forces driving and demanding change as: customers,
information technology, the marketplace, legislation (public policy), new
industries, nature of the work (e.g. outsourcing) and future uncertainty.
However, many authors also identify barriers to change that have received
little attention in the performance measurement literature.
Gabris (1986) identifies four categories of such barriers:
(1) process burden, where processes such as performance measurement take employees away from their actual responsibilities;
(2) internal capacity, where organisations lack the in-house capability to support an initiative;
(3) credibility anxiety, where organisations suffer from an overload of management techniques; and
(4) the ``Georgia giant syndrome'', where management techniques work only under rigorous and closely supervised control conditions.
These factors can be considered to be the organisation's readiness for change (Waggoner et al., 1999). Furthermore, Kotter (1996) argues that willingness or urgency to change throughout the organisation is necessary for such change to be effective.
Greiner (1996) categorises inhibiting factors as institutional, pragmatic, technical and financial. Numerous authors (such as Scott, 1995 and Pettigrew and Whipp, 1991) also highlight that the political nature of organisations requires further consideration, one of a number of factors demonstrating the impact that corporate culture can have on evolutionary change (Tichy, 1983).
The literature reviewed highlights the importance of managing the evolution
of performance measurement systems to ensure that they continue to reflect the
environment and objectives of the organisation. The literature suggests that
the factors affecting evolutionary change within organisations, and hence the
evolution of performance measures, are many and complex. However, these
issues can be grouped into two main themes:
(1) drivers of change (those factors that cause change to be necessary); and
(2) barriers to change (those factors that must be overcome if change is to be
effective).
These issues are summarised in Figure 1.

Figure 1. Summary of factors affecting evolution drawn from the literature

The research reported in this paper seeks to investigate these drivers of, and barriers to, evolution as they apply to performance measurement systems.

Methodology
A multiple case study approach was used to investigate the way in which performance measures actually evolve within organisations. The research involved semi-structured interviews with a total of 25 managers from a range of management functions, from seven different organisations. The companies involved in the research were from the industries shown in Table I.
The interview structure was designed to investigate the key themes
identified from the literature reviewed. As such the case studies sought to
answer the following questions:
. What factors encourage the introduction of new measures, modification of existing measures and deletion of obsolete measures?
. What factors inhibit the introduction of new measures, modification of existing measures and deletion of obsolete measures?
The companies were selected on the basis of their considerable experience in
the implementation and use of performance measures. Companies from
different industry sectors and with a wide variety of competitive and
organisational characteristics were deliberately chosen to introduce diversity
into the sample. This enabled the identification of factors affecting evolution of
measurement in a variety of different circumstances. Similarly, interviewing
managers from a number of different departments ensured that consideration
was given to the diversity of factors affecting evolution in different functional
circumstances. As a result the findings of the case studies provide a broad
understanding of the factors affecting the evolution of an organisation's
performance measures.

Case study findings


There was general consensus among the managers interviewed of the need for
performance measures to evolve over time, so that they reflect the changing
circumstances and requirements of the organisation.
The group technical and quality director of company 7 stated that: ``ensuring
that performance measures continue to reflect the issues that are important to
the organisation is important if measurement is to be useful and help
management''.
The consultancy sales manager of company 2 stated that: ``the evolution of
measures ensures that they increase in sophistication and change to reflect the
changes in behaviour we want to achieve''.
The systems analyst from company 6 indicated that: ``evolution enables us to
tackle soft issues and develop hard measures to reflect how well we are doing''.

Table I. Companies involved in the research and their industry

Company  Industry
1  Maintenance of transport infrastructure
2  Supplier of IT services
3  Supplier of stationery to retail and commercial sectors
4  Courier/global package delivery
5  Utility energy generation and supply
6  Manufacturer of food packaging
7  Manufacturer and supplier of printing machinery
Although the need for performance measures to evolve over time was recognised, the evolution of measures was managed with varying degrees of success. Findings from each of the organisations are now discussed in turn.
Company 1
In the past, company 1 had been unable to manage effectively the evolution of
performance measures. The lack of flexible information systems and inertia
throughout the organisation were found to be the main barriers to the effective
management of legacy measurement systems. The problems prompted a
company-wide initiative to establish effective performance measurement,
explicitly addressing problems that had previously been experienced. The
managing director was the major driving force behind the initiative and
extensive use was made of existing and accepted communication tools to ensure
performance measurement had the appropriate credibility. As the human
resources manager remarked: ``Effective use of the measurement system is due
to the managing director's promotion of the need for and importance of
measurement and his use of measurement to manage and communicate''.
The managing director highlighted the need for flexible systems: ``None of the commercial performance measurement software provided the required support; you must have a system that satisfies your requirements''. In-house
information systems were developed to provide data collection, analysis and
reporting systems giving flexibility not provided by systems available on the
market. Addressing these issues, and integrating performance measurement
into the strategy development and review process, provided the organisation
with a measurement system that they believed would evolve with the
business's requirements.

Company 2
Although performance measurement systems had been implemented in
company 2 for a number of years, failure to actually use new performance
measures to manage the business was seen as a major barrier to their deployment
and hence evolution. Although senior management had backed the
implementation of a balanced set of measures, the continued emphasis on
financial performance measures prevented use of the balanced measurement
system being embedded throughout the organisation. As in company 1,
company 2 used experiences of ineffective measurement practices in the past to
design a measurement system with the attributes that they considered
necessary to maintain a relevant set of performance measures in the future. To
ensure that their measures remained relevant, managers in company 2 explicitly
included a review of measures in the periodic review of business processes.
The head of business process development highlighted the importance of
having the appropriate systems to facilitate measurement activity and the
evolution of measurement systems. ``New systems have been designed from
scratch to be flexible enabling measures to be changed easily. The system
being Web-based enables worldwide access to all information allowing
information sharing. This facilitates benchmarking and the transfer of best practice. The global availability of the same reporting systems enables commonality of approach''.
Furthermore he highlighted that: ``reporting needs to be efficient to reduce
the resources required to administer measurement, allowing resources to be
dedicated to acting on the results.'' The system was designed to enable efficient
and effective data collection and reporting, minimising the effort of
measurement to ensure acceptance throughout the organisation.
According to the consultancy sales manager: ``Benchmarking of
performance against competitors (including those in new markets) has given a
common understanding of the need to improve and where improvement should
be focused. This has reduced any resistance to the change of performance
measures as the need can be demonstrated.'' This enabled the organisation to
overcome some of the people issues that had limited the development of
performance measurement activities in the past.

Company 3
The evolution of measures was not effectively managed in company 3. ``The
culture at [company 3] is a barrier to the implementation of a consistent
approach to measurement across the whole company.'' The ad hoc approach to
performance measurement that was adopted led to inconsistency in approaches
between different business units and geographical locations. The inconsistency
in measurement practices limited the comparability of performance data,
detrimentally affecting the credibility, and hence acceptance, of performance
measures. Despite attempts to change measures to reflect changing business
circumstances, managers were reluctant to use non-financial data to manage
the business. ``The overriding factor affecting the acceptance of performance
measurement is that it become a business issue so that it occupies the minds of
managers and measures are used to manage the business'' (Manager, Stationery Office Supplier). This reflects the need for managers to actively use
measures to manage the business. It was found that this would increase their
desire to ensure measures changed to remain appropriate, as their performance
would be assessed on them.
Inflexible IT systems were also found to be a major barrier to evolution. The
European customer care manager specifically noted that: ``it is not possible to
change the structure and content of the performance reports produced by the
mainframe IT system.''

Company 4
The use of performance measurement to manage the business was accepted in
company 4. However, the tendency to report too much data and produce too
many measurement reports acted as a significant barrier to evolution. The
service recovery manager stated: ``I spend too much time preparing reports for
my manager to take to board meetings. It prevents me from reviewing and
updating measures so that they remain current. Most of the reports are never
referred to, they are just a security blanket in case he is ever asked to produce the data.'' In the past key individuals had stood in the way of the use of some measures. ``This resistance was due to reluctance to provide a better
understanding of actual performance for which they were responsible. Removal
of the individuals has been the most successful way of eliminating the
problem'' (Service Recovery Manager).
The availability of people with the appropriate skills to analyse and redefine
measures was also identified as an issue. This was particularly the case when
individuals responsible for measurement left departments or the company all
together. It was recognised that measurement practices could be developed
further by planning skills development and ensuring that the appropriate skills
were maintained in the areas they were required.
Company 4 also provided an example of the effect of the design of measures
on their use. While discussing the graphical representation of one measure, the
field service manager explained: ``nobody uses this measure as they don't
understand it. I would explain it to you but I don't understand it either''. As a
result the measure was not seen as relevant and was not used.

Company 5
Extensive performance measurement implementation had been undertaken in
company 5. However, as in company 2, although senior management had
initiated the implementation of new measures, they failed to use the resultant
performance measurement data, in favour of traditional financial performance
measures. ``The previous CEO paid lip service to the scorecard but only really
focussed on the financials, hence this is where all attention was focused'' (Head
of Strategic Planning). As a result the new measures were not considered to be
important at other levels of the organisation and they were not effectively used.
Measurement reverted to financial measurement and the process of evolution
was stifled. This clearly demonstrated the need for top level support for
measurement and the need for a change in mindset of management so that
measures are used to manage the business.

Company 6
Company 6 provided the best example of managing the evolution of
measurement systems. The primary factor facilitating evolution was the
availability of resources dedicated to measurement and the management of
performance measures. ``The availability of a dedicated employee who is
responsible for the review of measures enables gaps to be identified and the
need to change existing measures as well as identifying performance
measures'' (Sales Order Manager).
The dedicated systems analyst ensured that measures were reviewed and
that action was taken to improve performance and ensure that measures were
changed to remain relevant. In addition, ``having split responsibility and
budget from operations and the IT department enables me to develop systems
that would not be justified under either department individually''. This ensured
that systems were flexible enough to change as required. The availability of a manager dedicated to measurement, who had credibility within all areas of the business, stimulated measurement activity and helped overcome barriers to the
acceptance and evolution of measurement, such as inflexible payroll structures
and high staff turnover.
Company 6 highlighted the need to create the appropriate environment in
which the use of performance measures is most effective. Weekly meetings to
review performance were open and honest discussions of performance,
including new issues requiring measurement and identifying new areas of
performance on which to focus improvement attention. ``It is important to
recruit and retain employees who are open to new ideas and are willing and
able to implement new performance measures.'' ``Use of neutral measures, that
focus on improvement and do not apportion blame, help acceptance and
adoption of measures.''

Company 7
The lack of a formal review process was considered to be the main reason that
the evolution of performance measures was not managed in company 7 (``There
is no process to review measures and identify whether or not they are
appropriate. That is a major factor affecting whether measures change in line
with organisational circumstances''). Within company 7 the leadership of the
managing director was clearly the main driver of measurement activity. ``The
ability and energy of the managing director drive measures and measurement.
He prompts other board members to review measures and ensure that they are
relevant and appropriate to the business and reflect what is important.''
The availability of management time to reflect on measures was also
considered to be a major constraint. The group technical and quality director
identified that: ``In previous years we have had too many measures. We need to
focus on fewer important objectives''. He also noted that the frequency with
which measures are reviewed is dependent on the availability of management
time. Similarly the availability of management skills is also a key determinant
of the ability to review and modify measures. This will affect when
inappropriate measures are identified and the ability to change measures to
make them appropriate. He identified the need for systems that could
accommodate a hierarchy of measures, reporting the few important measures,
but enabling analysis of the many underlying measures of the drivers of
performance.
Table II summarises the key factors that facilitate and inhibit the evolution
of performance measurement systems in each of the case study companies.
Evidence from the case study companies demonstrates the need for
companies to change their performance measures as the organisation's
circumstances change. The group technical and quality director in company 7
pointed out: ``If people don't think measures are relevant they won't use them,
so they won't evolve''. This clearly demonstrates that in order for an
organisation to have performance measures that evolve over time, they must have a set of performance measures that is effectively used throughout the organisation. Companies 3 and 5 were prevented from maintaining a relevant set of measures by senior management, who continued to use financial measures to manage the business, despite the availability of a more balanced set of measures.

Table II. Summary of case study findings

Company 1
Facilitators of evolution:
Senior management driving measurement activities
Development of in-house IT systems
Use of accepted communication media to communicate, generate feedback and involve all employees
Integration of measurement with strategy development and review
Consistent approach to measurement
Barriers to evolution:
Off the shelf systems insufficiently flexible
Availability of skills to effectively collect and analyse data

Company 2
Facilitators of evolution:
New Web-based system developed
In-house systems provide required flexibility
Measurement included in business process review
Alignment of rewards to measures
Need for measures to evolve considered important
Common understanding of objectives and the need to improve
Barriers to evolution:
Senior management inertia
Measures not used to manage the business
Time consuming and costly data collection

Company 3
Facilitators of evolution:
Enthusiastic champion of measurement
Contact with external research bodies to keep up to date with developments in measurement practices
Make measurement a business issue: manage with measures
Barriers to evolution:
Management inertia
Inflexible IT/finance systems
Incompatibility of measures/inconsistent approach
Culture: ad hoc measurement, no integrated approach or PM function

Company 4
Facilitators of evolution:
Enthusiastic champion to kick off ``measurement revolution''
The need for succession planning identified
Barriers to evolution:
Individual inertia/resistance to measurement
Time wasted producing reports
Ability to quantify performance
Measures lacking credibility

Company 5
Facilitators of evolution:
Top level management support is critical
User involvement in designing measures
Alignment of rewards
Barriers to evolution:
Measurement not used to manage the business (need new mind set)
Accounting systems focus
Inconsistent approach to measurement (due to changes in ownership and management)
Lack of flexible systems to collect and analyse data

Company 6
Facilitators of evolution:
Dedicated PM resource (review of measures to ensure action is taken, IT and operational responsibility, credible sponsor)
Integrated approach to measurement
Open and honest process for reviewing measurement
Centres of practice established to share knowledge
Involvement of those being measured/local ownership of measures
Measures linked to individual objectives
Measurement not owned by finance
Alignment of personal rewards
Away day to review measures
Barriers to evolution:
Cross-functional ownership of measures/performance
Staff/skill retention: loss of skills to analyse data and redefine measures
Payroll and union systems
Incompatible systems/measurement in different locations/business units

Company 7
Facilitators of evolution:
Top management support for measurement
The need for improved electronic reporting including hierarchy of measures and drill down facilities identified
Barriers to evolution:
No review process in place
Management time main constraint to reviewing measures
Too many measures/lack of focus
Incompatible measures barrier to effective use
Measures, actions and rewards not always aligned

Given that the availability and effective use of measures is a pre-requisite to
their evolution, a key question that remains is how do companies know when
they should change the measures they use. Each of the case study companies
approached this problem in different ways, however each company also
encountered considerable barriers to effective evolution of measures. As a
result no organisation demonstrated a complete solution to the problem of
managing the evolution of their measurement system. Analysis of the barriers
that the case study organisations encountered, and approaches that different
organisations used to overcome them, provide significant insight into the way
that the evolution of measurement systems can be managed.

Barriers to and facilitators of evolution


The previous section presents the main findings of the case studies undertaken
and Table II summarises the main barriers and facilitators of the evolution of
performance measurement systems found within the companies studied. As
presented, the findings provide an insight into the factors affecting the
evolution of measures in each of the individual organisations. However, to
draw generic lessons from the data collected it is necessary to identify common
problems encountered by the organisations studied and solutions to these
problems found in other organisations.
Some of the organisations, notably companies 2 and 7, identified the need for
management processes to ensure the review of performance measurement is
prompted and that measures are changed as appropriate.
Companies 1 and 2 overcame such problems by incorporating measurement
into regular strategy and business process reviews. In company 1, considerable
attention was focused on measurement as a key tool to support the
achievement of strategic objectives. As a result measures were a fundamental
part of the annual strategy review. Meanwhile in company 2, each business
process had clearly defined performance measures. As a result, the process
audits assessed the effectiveness of measurement and prompted remedial
action as necessary.
These comparisons suggest that absence of an effective process is a
commonly encountered barrier to the evolution of measurement systems.
Effective processes enable identification of changes in circumstances that
necessitate changes in measures and ensure that measures are appropriate.
Companies 4 and 6 both highlight the lack of the appropriate skills as a
barrier to identifying the measures that need to be changed and to the
modification of measures. Historically, high staff turnover had significantly
affected the ability of company 6 to retain the skills required to analyse
performance data, identify whether measures remain appropriate and design
new measures when necessary. The availability of management time to reflect on measures was also found to be a constraint on evolution (companies 4 and 7).
In order to overcome these issues, company 6 devoted dedicated resources to
measurement, thereby giving individuals the responsibility for ensuring that
measurement remained appropriate to the organisation's circumstances. This
dedicated resource acted as support for measurement activities, including use
of measures, analysis of data and ensuring that measures reflected the
requirements of users. The resource acted as a focal point for the development
and maintenance of internal measurement skills including the development of
appropriate information systems.
The lack of the necessary skills and human resources (people) is the second
barrier to evolution to be identified from the findings. The necessary skills
include: the ability to identify when measures are no longer appropriate to
measure the organisation's performance; and the ability to refine measures to
reflect the organisation's new circumstances.
The lack of flexibility of information systems, especially accounting
systems, was considered to be a barrier to the evolution of measures by a
number of companies (companies 1, 2, 5 and 6). Company 6 indicated that the
implementation of an ERP system resulted in loss of functionality that had
been developed to aid analysis of performance measurement data. Although
there are many software products designed to support performance
measurement on the market, company 1 in particular found ``off the shelf
systems'' to be insufficiently flexible, limiting the ability to modify measures.
Companies 1, 2 and 6 identified the need to design data collection and
reporting systems so that they facilitate the identification of inappropriate
measurement and enable the change of data collection, analysis and reporting
tools. In company 1 this was included in a strategic information system
developed in-house. Company 2 developed a Web-based system that enabled
consistent and flexible measurement on a global scale, while company 6
ensured that their dedicated measurement personnel had both operations and
information systems responsibilities, ensuring that systems were developed to
reflect the requirements of operational measurement.
Inflexible systems are the third barrier to evolution to be drawn from the
findings.
The acceptance of measurement throughout the organisation was identified
as a key prerequisite of evolution in each of the companies studied. This was
linked to the importance placed on maintaining an effective and efficient
measurement system, including the benefit derived from measurement
activities in relation to the effort required. In each case, establishing a culture
that embraced the use of performance measurement to manage the business
was crucial. Without such a culture measurement was considered to be a non-
value adding activity which was to be endured, rather than a tool to support
business decision making.
Company 1 used existing communication media to establish such a culture
throughout the organisation. Use of statutory safety reports and briefings
ensured that all employees received business performance information, while mechanisms were put in place to encourage, collect and respond to queries from all employees regarding information reported. By demonstrating that this was
an open and honest process, an average of 300 questions and suggestions per
month were generated from a work force of approximately 600, demonstrating
that measures were actually being understood and used. Similarly,
management meetings in company 6 included open discussion of the relevance
of performance measures and the way in which they could be modified and
improved to increase their utility. In both cases, developing an open and honest
culture in which measurement was used to support improvement, rather than as a tool to punish individuals, was considered crucial to the maintenance of
relevant performance measures.
Thus the data suggests that a culture that is inappropriate to the use (and change) of measures is the fourth key barrier that was identified.
This further analysis of the data identifies four key themes commonly
observed within the case study organisations. These are demonstrated through
the examples discussed. Table III presents the findings from the case studies
structured around four themes that emerge from the data collected. This
demonstrates that these themes comprehensively cover all of the case study
data as presented in Table II.

Framework of factors affecting the evolution of performance measurement systems
It is evident from the summary of the interviews shown in Table III that the
evolution of measurement systems is a complex phenomenon to study. At the
most fundamental level the research reported in this paper has identified that
before a measurement system can evolve it has to be used actively (use). It is
worth noting at this point that a performance measurement system itself
consists of several components, including:
. Individual measures that quantify the efficiency and effectiveness of
actions.
. A set of measures that combine to assess the performance of an
organisation as a whole.
. A supporting infrastructure that enables data to be acquired, collated,
sorted, analysed, interpreted and disseminated (Neely, 1998).
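
For readers who find a concrete data model helpful, the sketch below represents these three components as simple Python dataclasses. It is an interpretation for illustration only; the field names and the example measure are assumptions, not definitions taken from Neely (1998).

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Measure:
    """An individual measure quantifying the efficiency or effectiveness of an action."""
    name: str
    purpose: str
    target: float
    formula: Callable[[Dict[str, float]], float]  # how the value is derived from raw data


@dataclass
class MeasureSet:
    """The set of measures that together assess the performance of the organisation."""
    measures: List[Measure] = field(default_factory=list)


@dataclass
class Infrastructure:
    """Supporting infrastructure: acquires, collates, analyses and disseminates data."""
    raw_data: Dict[str, float] = field(default_factory=dict)

    def report(self, measure_set: MeasureSet) -> Dict[str, float]:
        return {m.name: m.formula(self.raw_data) for m in measure_set.measures}


# Hypothetical example: one delivery measure evaluated against collected data.
otd = Measure(
    name="on-time delivery",
    purpose="track reliability promised to customers",
    target=0.95,
    formula=lambda d: d["orders_on_time"] / d["orders_total"],
)
infra = Infrastructure(raw_data={"orders_on_time": 182, "orders_total": 200})
print(infra.report(MeasureSet(measures=[otd])))  # {'on-time delivery': 0.91}
```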
Assuming that these elements exist and the measurement system is actually
used then evolution of the measurement system is possible. Typically, this
starts with a trigger, which can be either internal or external in nature.
Companies 1 and 5 both provide examples of an external trigger, with
significant changes in their consumer markets following deregulation. These
changes significantly affected the assumptions underpinning the companies'
objectives and competitive priorities, necessitating realignment of measures. In
Company 6, monthly management meetings and an away day provide
examples of internal triggers which prompted review of the relevance of current measures given changes in circumstances. Other such triggers were also identified that prompted the realisation that measures were inappropriately designed for their purpose, that use of measures prompted inappropriate behaviour or that circumstances, such as competitive requirements, changed.

Table III. Recategorised summary of case study findings

Process
Facilitators of evolution:
Integration of measurement with strategy development and review (company 1)
Integration of measurement with business process review (2)
PM ``function'' the focal point of measurement activity (6)
Forum to discuss appropriateness of measures (6)
Implementation of common definitions/metrics (3, 7)
Consistent approach to measurement across all areas of the business (1)
Away day to review measures (6)
Involvement of external bodies (3)
User involvement in measurement (5)
Barriers to evolution:
Lack of proactive review process (7)
Inconsistent approach to measurement: over time (5); between locations/business units (3, 6, 7); no integrated measurement function (3)
Insufficient time to review measures: lack of management time (4, 7); too much data reported (4, 7)
The need to trend measures limits ability to change (7)
Lack of data analysis (5, 6)

People
Facilitators of evolution:
Maintain PM capability (6)
Dedicated PM resource (6): facilitation of use of measures (6); ensure action is taken (1, 2, 6); prompt review of measures (6); credible sponsor (1, 6, 7); IT and operational responsibilities (6)
Skills/succession planning (4)
Involvement of those being measured/local ownership of measures (6)
Community of users of measures (6)
Barriers to evolution:
Lack of appropriate skills: to identify appropriate measures (4, 6, 7); to design measures/quantify performance (4); to collect accurate data (1); to analyse data (6)
High staff turnover (6)
Lack of management time (4, 7)
Ownership of cross-functional measures (6)

Systems
Facilitators of evolution:
Develop in-house/customised IT systems (1, 2): flexible (1, 2); Web-based (2); electronic reporting (7); hierarchy of measures (7); linked to strategy deployment/business process review (1, 2)
Maintain internal systems development capabilities (1, 2, 6)
Integration of operations and IT (budgets, responsibility, etc.) (6)
Barriers to evolution:
Inflexible legacy systems: data collection (4, 5); reporting (4, 5)
Inflexible ERP systems: loss of functionality (6)
Inappropriate ``off the shelf'' systems (1)

Culture
Facilitators of evolution:
The need for evolution considered to be important (2, 6, 7)
Communication: use of accepted medium (1); feedback all actions (1); engage all employees (1)
Measurement integrity is encouraged: open and honest discussion of performance (6); no blame culture (6); discouragement of ``gaming behaviour'' (6)
Ongoing senior management support/champion for measurement (all companies): continued focus on measurement (1, 6); identify and remove barriers to use/change of measures (1, 6)
Establish common understanding of objectives (2)
Integration/alignment of reward systems (2)
Measurement not owned by finance (6)
Alignment of measures and rewards (2, 5, 6)
Barriers to evolution:
Senior management inertia (2, 3)
Individual inertia/resistance to measurement (4)
Ad hoc approach to measurement (3)
Lack of alignment of actions with measures (7)
Inappropriate use of measures/measures not used to manage the business (2, 5)
Rigid remuneration and union systems (6)

Once the trigger has been received, the first stage in the evolution of the measurement system is to reflect on the performance measurement system and identify whether it remains appropriate given changing organisational circumstances. This stage of the evolutionary process is known as reflection (reflect) and the research identified several barriers that prevent it from occurring in organisations, most crucially those associated with process, people, infrastructure and culture:
. Absence of an effective process. Company 7 highlighted the lack of an effective process as the main barrier to reflection, while in both
companies 4 and 7 there was insufficient management time set aside to
reflect on performance measures.
. Lack of the necessary skills and human resources. Companies 1, 4, 6 and 7
each identified a lack of appropriate skills to analyse data and identify
inappropriate measures. Company 6 specifically highlighted that high
staff turnover caused problems in retaining people with the skills
necessary to identify which measures are inappropriate. Company 4 also
highlighted that the lack of succession planning was a barrier to
reflection.
. Inflexible systems. These were identified as barriers to reflection. In
particular, company 6 found ERP system implementation led to lost
analysis functionality required to investigate performance trends and
causes of performance variances.
. Inappropriate culture. Companies 4 and 6 both highlighted individuals
who were resistant to reflection on and change to measures as they did
not want measures to more effectively reflect specific dimensions of
performance for which they were responsible. Lack of alignment of
measures with rewards was also found to be a barrier to reflection in
company 7. Alignment of measures with rewards ensures that those
being measured have an incentive to reflect on measures and prompt
their evolution.
During the reflection stage, each of the constituent parts of the performance
measurement system should be critically appraised and reviewed to ensure
that they remain appropriate to the requirements of the organisation. Many
tools and techniques have been developed to help organisations design
performance measures and measurement systems. Several of these tools can be
applied to reflect on the content of an organisation's current performance
measurement system. For example, the performance measurement record sheet
(Neely et al., 1996) lists the characteristics of a performance measure, any of
which might be affected by changes in the organisation's circumstances. Many of the performance measurement frameworks that have been proposed (Kennerley and Neely, 2000) might also support reflection on the relevance of
the set of measures used by the organisation. Furthermore, tools such as the
Performance Measurement Questionnaire (Dixon et al., 1990) are specifically
designed to help an organisation to identify the appropriateness of their
measurement system.
Reflecting on the measurement system will enable required changes to be
identified and will in turn trigger modifications (modify). In addition external
triggers, such as changes in legislative or regulatory requirements, and/or
changes in ownership can lead to the imposition of new performance measures,
which will also prompt the modification stage. In turn the modification stage
will result in changes to the constituent elements of the measurement system.
Once these changes have been enacted then the modified measurement system
can be said to have been deployed (deploy) and hence the cycle of evolution can
start again. This entire evolutionary cycle is illustrated in Figure 2, which contains a framework of the factors affecting the evolution of measurement systems.

Figure 2. Framework of factors affecting the evolution of performance measurement systems

The key to this discussion is to recognise that the case study data collected demonstrates that to manage effectively the evolution of performance measures, an organisation must consider several inter-related issues:
(1) The active use of the performance measurement system is a pre-requisite to any evolution.
(2) The performance measurement system itself consists of three inter-related elements (individual measures, the set of measures and the enabling infrastructure). Each of these elements must be considered during the evolution of the performance measurement system.
(3) There are four stages of evolution: use, reflect, modify and deploy. These form a continuous cycle.
(4) Barriers exist that will prevent the evolutionary cycle from operating. These barriers can be overcome if the evolutionary cycle is underpinned by enabling factors broadly categorised under the headings: process, people, infrastructure and culture. Specifically, a well designed measurement system will be accompanied by an explicitly designed evolutionary cycle with clear triggers and:
. process: the existence of a process for reviewing, modifying and deploying measures;
. people: the availability of the required skills to use, reflect on, modify and deploy measures;
. infrastructure: the availability of flexible systems that enable the collection, analysis and reporting of appropriate data;
. culture: the existence of a measurement culture within the organisation ensuring that the value of measurement, and importance of maintaining relevant and appropriate measures, are appreciated.
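
As a final illustration, the cycle and its enablers can be caricatured in a few lines of Python: evolution proceeds only when the measurement system is in active use and each enabling capability is present. This is a hypothetical sketch of the framework's logic, not a tool proposed in the paper; the stage and enabler names follow the text, everything else is invented.

```python
from typing import Dict, List

STAGES: List[str] = ["use", "reflect", "modify", "deploy"]
ENABLERS: List[str] = ["process", "people", "infrastructure", "culture"]


def run_evolution_cycle(in_active_use: bool, capabilities: Dict[str, bool]) -> List[str]:
    """Walk one pass of the evolutionary cycle, stopping at the first missing enabler."""
    if not in_active_use:
        return []  # active use is a pre-requisite to any evolution
    completed: List[str] = []
    for stage in STAGES:
        missing = [e for e in ENABLERS if not capabilities.get(e, False)]
        if missing:
            print(f"Cycle blocked at '{stage}': missing enablers {missing}")
            break
        completed.append(stage)
    return completed


# Example: an organisation with no review process in place stalls immediately.
print(run_evolution_cycle(True, {"process": False, "people": True,
                                 "infrastructure": True, "culture": True}))
```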

Discussion
The literature and case study data presented clearly show first, the importance
of managing measurement systems so that they change over time and second,
the complex range of interrelated factors that affect the evolution of
performance measurement systems. The literature highlights many of the
issues affecting the management of change within organisations. This paper
discusses many of these issues in the context of case study data relating to
performance measurement system evolution.
A considerable amount has been written about the design and
implementation of measurement systems, and a number of writers have identified
the need to reflect on measures to ensure that they remain relevant as the
organisation changes. The research findings echo the themes identified in the
literature concerning the external and internal drivers of change affecting
organisations and the need for organisations to have effective processes in place
to identify these changes and when they necessitate changes to measurement
systems. However, there is little discussion in the literature of what to do once
that reflection has taken place. The data collected clearly show that the process
of managing the evolution of measurement systems consists of a number of
stages that have to date received little attention. In addition to reflection,
consideration should be given to how measures are to be modified and how
modified measures are to be deployed without embarking on a wholesale
performance measurement system redesign project.
It is also clear that for measurement systems to evolve effectively there are
key capabilities that an organisation must have in place (i.e. effective processes;
appropriate skills and human resources; appropriate culture; and flexible
systems). The research demonstrates how lessons from different strands of
literature such as the need for the appropriate resources (Greiner, 1996) and
capabilities (Gabris, 1986); the appropriate culture (Tichy, 1983); willingness to
change (Kotter, 1996); and relevant processes (Bourne et al., 2000; Bititci et al.,
2000) can be drawn together into a structured framework.
The data indicates that organisations should consider these capabilities at
each stage of the evolutionary cycle, as they are fundamental to effective
evolution. However, little consideration is given to these capabilities in the
literature concerning the design and implementation of measurement systems.
It is the development and maintenance of these capabilities within an
organisation that will determine whether its measurement systems evolve
effectively. As such, reviewing the availability of these capabilities is an
important stage in the management of measurement systems over time. This
reflects the need to review and update measurement systems at three different
levels, i.e. the individual measure; the set of measures; and the supporting
infrastructure, and shows that these capabilities are an integral part of that
supporting infrastructure.
The framework presented provides a structured view of the factors affecting
the evolution of performance measures and measurement systems. It
conceptualises a very complex combination of factors affecting the evolution of
measurement systems into a manageable form.

Conclusions
Although the issue of development of effective performance measures has
received considerable attention from both academic and practitioner
communities, neither has satisfactorily addressed the issue of how performance
measures should evolve over time in order to remain relevant.
The research reported in this paper provides an understanding of how
measurement systems can be managed so that a dynamic and relevant set of
performance measures can be maintained, reflecting an organisation's
changing requirements. It provides an understanding of the factors, both
internal and external to the organisation, that facilitate and inhibit the
introduction of new measures, the modification of existing measures and
deletion of obsolete measures. These factors are presented in a framework that
illustrates the process, people, infrastructure and culture capabilities that an
organisation must demonstrate in order to manage the evolution of measures.
The paper discusses many issues of relevance to the growing literature in the
field of performance measurement while providing organisations with a
practical tool to help them establish an effective performance measurement
system. Ensuring that the evolution of measurement systems is effectively
managed over time is important if another measurement crisis and revolution
is to be avoided.
References
Bititci, U.S., Turner, T. and Begemann, C. (2000), ``Dynamics of performance measurement
systems'', International Journal of Operations & Production Management, Vol. 20 No. 6,
pp. 692-704.
Bourne, M., Neely, A., Mills, J. and Platts, K. (1999), ``Performance measurement system
implementation: an investigation of failures'', Proceedings of the 6th International
Conference of The European Operations Management Association, Venice, 7-8 June,
pp. 749-56.
Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000), ``Designing, implementing and
updating performance measurement systems'', International Journal of Operations &
Production Management, Vol. 20 No. 7, pp. 754-71.
Bruns, W. (1998), ``Profit as a performance measure: powerful concept, insufficient measure'',
Performance Measurement Theory and Practice: The First International Conference on
Performance Measurement, Cambridge, 14-17 July.
Dixon, J.R., Nanni, A.J. and Vollmann, T.E. (1990), The New Performance Challenge: Measuring
Operations for World-Class Competition, Dow Jones-Irwin, Homewood, IL.
Eccles, R.G. (1991), ``The performance measurement manifesto'', Harvard Business Review,
January-February, pp. 131-7.
Fitzgerald, L., Johnston, R., Brignall, T.J., Silvestro, R. and Voss, C. (1991), Performance
Measurement in Service Businesses, The Chartered Institute of Management Accountants,
London.
Frigo, M.L. and Krumwiede, K.R. (1999), ``Balanced scorecards: a rising trend in strategic
performance measurement'', Journal of Strategic Performance Measurement, Vol. 3 No. 1,
pp. 42-4.
Gabris, G.T. (1986), ``Recognizing management techniques dysfunctions: how management tools
often create more problems than they solve'', in Halachmi, A. and Holzer, M. (Eds),
Competent Government: Theory and Practice, Chatelaine Press, Burk, VA, pp. 3-19.
Ghalayini, A.M. and Noble, J.S. (1996), ``The changing basis of performance measurement'',
International Journal of Operations & Production Management, Vol. 16 No. 8, pp. 63-80.
Globerson, S. (1985), ``Issues in developing a performance criteria system for an organisation'',
International Journal of Production Research, Vol. 23 No. 4, pp. 639-46.
Greiner, J. (1996), ``Positioning performance measurement for the twenty-first century'', in
Halachmi, A. and Bouckaert, G. (Eds), Organizational Performance and Measurement in
the Public Sector, Quorum Books, London, pp. 11-50.
Johnson, H.T. (1983), ``The search for gain in markets and firms: a review of the historical
emergence of management accounting systems'', Accounting, Organizations and Society,
Vol. 2 No. 3, pp. 139-46.
Johnson, H.T. and Kaplan, R.S. (1987), Relevance Lost: The Rise and Fall of Management
Accounting, Harvard Business School Press, Boston, MA.
Kaplan, R.S. (1984), ``The evolution of management accounting'', The Accounting Review, Vol. 59
No. 3, pp. 390-418.
Kaplan, R.S. and Norton, D.P. (1992), ``The balanced scorecard - measures that drive
performance'', Harvard Business Review, January/February, pp. 71-9.
Kaplan, R.S. and Norton, D.P. (1993), ``Putting the balanced scorecard to work'', Harvard Business
Review, September/October, pp. 134-47.
Keegan, D.P., Eiler, R.G. and Jones, C.R. (1989), ``Are your performance measures obsolete?'',
Management Accounting (US), Vol. 70 No. 12, pp. 45-50.
Kennerley, M.P. and Neely, A.D. (2000), ``Performance measurement frameworks: a review'',
Proceedings of the 2nd International Conference on Performance Measurement,
Cambridge, pp. 291-8.
Kotter, J.P. (1996), Leading Change, Harvard Business School Press, Boston, MA.
Lynch, R.L. and Cross, K.F. (1991), Measure Up: The Essential Guide to Measuring Business
Performance, Mandarin, London.
Maskell, B. (1989), ``Performance measures for world class manufacturing'', Management
Accounting (UK), May, pp. 32-3.
Meyer, M.W. and Gupta, V. (1994), ``The performance paradox'', in Staw, B.M. and Cummings, L.L.
(Eds), Research in Organizational Behaviour, Vol. 16, JAI Press, Greenwich, CT, pp. 309-69.
Neely, A. (1998), Measuring Business Performance: Why, What and How, Economist Books,
London.
Neely, A.D. (1999), ``The performance measurement revolution: why now and where next'',
International Journal of Operations & Production Management, Vol. 19 No. 2, pp. 205-28.
Neely, A.D., Kennerley, M.P. and Adams, C.A. (2000), The New Measurement Crisis: The
Performance Prism as a Solution, Cranfield School of Management, Cranfield.
Neely, A.D., Mills, J.F., Gregory, M.J., Richards, A.H., Platts, K.W. and Bourne, M.C.S. (1996),
Getting the Measure of Your Business, Findlay Publications, Horton Kirby.
Pettigrew, A. and Whipp, R. (1991), Managing Change for Competitive Success, Blackwell,
Oxford.
Scott, W.R. (1995), Institutions and Organizations: Theory and Research, Sage Publications,
London.
Senge, P.N. (1992), The Fifth Discipline: The Art and Practice of the Learning Organization,
Century Business Press, London.
Tichy, N.M. (1983), Managing Strategic Change: Technical, Political, and Cultural Dynamics, John
Wiley & Sons, New York, NY.
Townley, B. and Cooper, D. (1998), ``Performance measures: rationalization and resistance'',
Proceedings of Performance Measurement Theory and Practice: the First International
Conference on Performance Measurement, Cambridge, 14-17 July, pp. 238-46.
Waggoner, D.B., Neely, A.D. and Kennerley, M.P. (1999), ``The forces that shape organisational
performance measurement systems: an interdisciplinary review'', International Journal of
Production Economics, Vol. 60-61, pp. 53-60.
Wisner, J.D. and Fawcett, S.E. (1991), ``Linking firm strategy to operating decisions through
performance measurement'', Production and Inventory Management Journal, Third
Quarter, pp. 5-11.