Performance indicators for measuring the effects of Smart Maintenance

Camilla Lundgren (Department of Industrial and Materials Science, Chalmers University of Technology, Gothenburg, Sweden)
Jon Bokrantz (Department of Industrial and Materials Science, Chalmers University of Technology, Gothenburg, Sweden)
Anders Skoogh (Department of Industrial and Materials Science, Chalmers University of Technology, Gothenburg, Sweden)

International Journal of Productivity and Performance Management

ISSN: 1741-0401

Article publication date: 4 July 2020

Issue publication date: 25 June 2021


Abstract

Purpose

The purpose of this study is to ensure productive, robust and sustainable production systems and realise digitalised manufacturing through the implementation of Smart Maintenance – “an organizational design for managing maintenance of manufacturing plants in environments with pervasive digital technologies”. This paper aims to support industry practitioners in selecting performance indicators (PIs) to measure the effects of Smart Maintenance, and thus facilitate its implementation.

Design/methodology/approach

Intercoder reliability and negotiated agreement were used to analyse 170 maintenance PIs. The PIs were structurally categorised according to the anticipated effects of Smart Maintenance.

Findings

Companies need to revise their set of PIs when changing manufacturing and/or maintenance strategy (e.g. reshaping the maintenance organisation towards Smart Maintenance). This paper suggests 13 categories of PIs to facilitate the selection of PIs for Smart Maintenance. The categories are based on 170 PIs, which were analysed according to the anticipated effects of Smart Maintenance.

Practical implications

The 13 suggested categories bring clarity to the measuring potential of the PIs and their relation to the Smart Maintenance concept. Thereby, this paper serves as a guide for industry practitioners to select PIs for measuring the effects of Smart Maintenance.

Originality/value

This is the first study evaluating how maintenance PIs measure the anticipated effects of maintenance in digitalised manufacturing. The methods of intercoder reliability and negotiated agreement were used to ensure the trustworthiness of the categorisation of PIs; such methods are rare in maintenance research.

Citation

Lundgren, C., Bokrantz, J. and Skoogh, A. (2021), "Performance indicators for measuring the effects of Smart Maintenance", International Journal of Productivity and Performance Management, Vol. 70 No. 6, pp. 1291-1316. https://doi.org/10.1108/IJPPM-03-2019-0129

Publisher: Emerald Publishing Limited

Copyright © 2020, Camilla Lundgren, Jon Bokrantz and Anders Skoogh

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Innovations and advancements in technology are generating a shift in the manufacturing industry, which is currently undergoing a transition towards digitalised manufacturing. Digitalised manufacturing means manufacturing in which the physical world is connected to the virtual world and in which production systems rely on computer science and advanced manufacturing technology (Kagermann et al., 2013; Xu et al., 2018). All entities of a production system are expected to be interconnected, to exchange information, make decentralised decisions and act autonomously. Attention to maintenance has accelerated accordingly, as avoiding unexpected stoppages and disruptions is critical in such systems. In fact, predictive maintenance is one of the highest-ranked business cases in the manufacturing industry (McKinsey Global Institute, 2016). Digitalised manufacturing will, in short, place new, substantial requirements on the maintenance function. To meet these requirements, it is anticipated that maintenance organisations will change (Bokrantz et al., 2017).

Smart Maintenance is “an organizational design for managing maintenance of manufacturing plants in environments with pervasive digital technologies” (Bokrantz et al., 2020a). Smart Maintenance is characterised by four core dimensions: (1) data-driven decision-making, (2) human capital resource, (3) internal integration and (4) external integration. Thus, working according to Smart Maintenance is expected to have broader effects compared to the traditional view of maintenance. Traditionally, maintenance is expected to lead to available, reliable and efficient production systems (Ylipää et al., 2017). Smart Maintenance, however, is expected to contribute to performance at plant and firm level, as well as having effects on individuals. Measuring and following up on these effects is an important aspect of working according to Smart Maintenance principles and achieving full implementation.

Performance indicators (PIs) are used to measure and follow up effects in the industrial environment. Industrial standards for maintenance PIs exist, and previous maintenance research has developed and investigated PIs and performance measurement systems for maintenance; see, for example, Kumar et al. (2013); Simões et al. (2011) and Parida and Chattopadhyay (2007). Nevertheless, these maintenance PIs were developed before the digital transition of the manufacturing industry accelerated. Accordingly, current maintenance PIs measure effects according to a traditional view of maintenance, with a focus on availability and reliability. The expectations of Smart Maintenance, on the other hand, cover a broader perspective, including, for example, firm-level and plant-level performance. This raises the question of whether existing maintenance PIs are sufficient to measure these broader anticipated effects.

The purpose of this study is to ensure productive, robust and sustainable production systems and realise digitalised manufacturing through the implementation of Smart Maintenance. Many manufacturing companies worldwide are currently updating their maintenance strategies in line with the principles of Smart Maintenance. This study analysed 170 maintenance PIs, aiming to support PI selection and bring clarity to the measuring potential of the PIs and their relation to the Smart Maintenance concept. This paper guides industry practitioners in selecting PIs to measure and follow up on the anticipated effects of Smart Maintenance, which will facilitate its implementation.

2. Theory

Traditionally, maintenance has been seen as a function that repairs equipment. This is reflected by the standard EN 13306, where maintenance is defined as “the combination of all technical, administrative and managerial actions, during the life-cycle of an item intended to retain it in, or restore it to, a state in which it can perform a required function” (CEN, 2001). Maintenance work is typically followed up and described in terms of the status of individual machines, their failure behaviours and repair times. However, in digitalised manufacturing, maintenance organisations are expected to change, which affects a broader range of aspects than machine failure behaviours. The following sections briefly describe digitalised manufacturing and explain the concept of Smart Maintenance, followed by a section on performance measurement in the manufacturing industry. The theory chapter ends with performance indicators related to maintenance.

2.1 Digitalised manufacturing

Digitalised manufacturing means manufacturing in which the physical world is connected to the virtual world and in which production systems rely on computer science and advanced manufacturing technology. This transition is anticipated to result in a fourth industrial revolution, with the German initiative Industrie 4.0/Industry 4.0 as one of the most widespread examples (Kagermann et al., 2013; Xu et al., 2018). In Industry 4.0, manufacturing operation systems are integrated with technologies and innovations such as Big Data, the Internet of Things (IoT), artificial intelligence (AI) and Cyber-Physical Systems (CPS) (Lu, 2017). The adoption of these technologies and Industry 4.0 will change the structure of the manufacturing industry, its competition rules and business models and, in general, increase performance (Wang et al., 2015; Dalenogare et al., 2018; Tirabeni et al., 2019). Production systems are anticipated to act in a decentralised and autonomous manner, with higher requirements for efficiency. To realise this, maintenance, the “procedures that make production systems work” (Groover, 2007), will be crucial to reduce the risks and consequences of unexpected stoppages. To meet the new, substantial requirements of production systems, maintenance organisations are expected to transform towards Smart Maintenance.

2.2 Smart Maintenance

Various maintenance concepts are recognised as suitable for digitalised manufacturing. These include, but are not limited to, predictive maintenance (Carnero, 2005), e-maintenance (Lee et al., 2006; Muller et al., 2008), prognostics and health management (Lee et al., 2014), Maintenance 4.0 (Kans et al., 2016) and Smart Maintenance (Munzinger et al., 2009; Bokrantz et al., 2020a). Smart Maintenance has been the subject of several empirical scenario planning studies conducted in close collaboration with industrial representatives (Bokrantz et al., 2017; Akkermans et al., 2016), thereby providing a common language between scholars and practitioners. This study refers to Smart Maintenance as “an organizational design for managing maintenance of manufacturing plants in environments with pervasive digital technologies” (Bokrantz et al., 2020a). The following sections describe Smart Maintenance, including its four underlying dimensions (the characteristics of Smart Maintenance), contextual factors (facilitating or inhibiting implementation) and the anticipated effects of Smart Maintenance (performance variables). The underlying dimensions, contextual factors and anticipated effects jointly constitute the contingency model of Smart Maintenance (Bokrantz et al., 2020b).

2.2.1 The underlying dimensions

Smart Maintenance consists of four underlying dimensions: (1) data-driven decision-making, (2) human capital resource, (3) internal integration and (4) external integration (Bokrantz et al., 2020a); see Figure 1.

First, data-driven decision-making is “the degree to which decisions are based on data” (Bokrantz et al., 2020a) and reflects how maintenance decisions are based on data. This can include both the automation and the augmentation of human decision-making. Owing to technological advancements such as machine learning (ML), falling sensor prices and the increasing availability of equipment data, maintenance decisions can increasingly be based on data instead of experience and intuition alone (Bokrantz et al., 2017; Bokrantz et al., 2020a). Second, human capital resource is defined as “unit capacity based on individual knowledge, skills, abilities and other characteristics (KSAO) that are accessible for unit-relevant performance” (Bokrantz et al., 2020a). In other words, it refers to the knowledge, skills, abilities and other characteristics of maintenance employees. Due to technological change, the requirements placed upon maintenance personnel are also changing. In particular, maintenance employees need higher levels of generic skills (such as communication and collaboration) as well as specific skills (such as data analytics) (Bokrantz et al., 2020a; Akkermans et al., 2016; Roda et al., 2018). Third, internal integration, “the degree to which the maintenance function is a part of a unified, intra-organizational whole” (Bokrantz et al., 2020a), refers to cross-functional collaboration between the maintenance function and the rest of the plant organisation. It includes such things as the sharing of data, information and knowledge, and closer synchronisation. Fourth and finally, external integration is defined as “the degree to which the maintenance function is a part of a unified, inter-organizational whole” (Bokrantz et al., 2020a). It refers to the establishment of links to external parties, especially networks and strategic partnerships. These links allow, for example, equipment data to be shared between parties, enabling the scaling of ML and the consolidation of knowledge resources (Bokrantz et al., 2020a).

2.2.2 Contextual factors

Several factors influence the adoption of Smart Maintenance, and some of them can facilitate or inhibit its implementation (Bokrantz et al., 2020b; Akkermans et al., 2016). There are three main categories of contextual variables: change, investment and interface. Implementing Smart Maintenance requires substantial change across multiple dimensions of technology, skills and organisation. Such change is influenced by cultural aspects (for example, a data-focused corporate culture) (Bokrantz et al., 2020b; Akkermans et al., 2016) and algorithm interpretability, as well as the leadership abilities of maintenance managers (Bokrantz et al., 2020b). Further, implementation requires investment in both tangible and intangible assets. Tangible assets are obtained primarily through ICT investments (sensors and IT systems, for example), which are relatively cheap. Intangible assets are needed if the technology is to be used effectively; these are obtained primarily through complementary investments (such as the training and education of employees). Such investment is typically much greater than the direct financial cost of the technology itself (Bokrantz et al., 2020b). Moreover, the success of any type of investment is influenced by the ability to quantify the effects of maintenance in accounting terms (Bokrantz et al., 2020b; Roda et al., 2018). Finally, the contextual factors relating to interfaces primarily influence the establishment of external integration. This includes digital platforms, openness and IT security (Bokrantz et al., 2020b).

2.2.3 Anticipated effects of Smart Maintenance

Adoption of Smart Maintenance is expected to lead to a broader spectrum of effects, compared to the traditional view of maintenance. These effects can be divided into three different levels: individual, plant and firm (Bokrantz et al., 2020b), as described in Figure 2.

Starting at the individual level, Smart Maintenance will change the job characteristics of maintenance and is, therefore, expected to influence job satisfaction and organisational attractiveness for prospective employees (Bokrantz et al., 2020b). For example, data-driven decision-making can make maintenance activities increasingly predictable and plannable, reducing the stress of reactive firefighting. Further, the younger generation of workers is attracted by organisational characteristics that are not typical of traditional maintenance work, such as the use of advanced IT tools and flexible organisational structures.

At the plant level, the anticipated effects comprise performance variables. The four most important of these are maintenance, manufacturing, environmental and safety performance. Briefly, maintenance performance relates to the internal efficiency of the maintenance function (Bokrantz et al., 2020b; Bengtsson and Salonen, 2016). This includes the failure behaviour of equipment, reactive and preventive actions and cost-effectiveness, and is typically reflected in indicators such as maintenance cost, time between failures and repair lead time. Manufacturing performance relates to the external effectiveness of maintenance (how maintenance influences the production system’s performance) (Bokrantz et al., 2020b; Bengtsson and Salonen, 2016). This is typically reflected in indicators such as manufacturing cost, product quality and throughput, with the emphasis on increasing productivity. The two final dimensions of performance at the plant level, environmental and safety performance, refer to such things as the service life of equipment and the avoidance of safety hazards (Bokrantz et al., 2020b).

At the firm level, the two most important dimensions are financial performance and competitive advantage (Bokrantz et al., 2020b). While these two terms are often used interchangeably, they are distinct: many firms can be profitable, but few can have a competitive advantage (Ketokivi, 2016). Financial performance is typically reflected in indicators such as return on assets (ROA) or return on investment (ROI). Competitive advantage, on the other hand, is seen in superior firms that are capable of, for example, creating economic value from inimitable internal resources or productive factors that are in limited supply (Barney, 1991).
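For reference, the two financial indicators mentioned above are commonly computed as simple ratios; the following are textbook definitions, not formulas prescribed by this paper:

```latex
% Common textbook definitions (assumed for illustration; not specified in the paper)
\mathrm{ROA} = \frac{\text{Net income}}{\text{Total assets}}, \qquad
\mathrm{ROI} = \frac{\text{Gain from investment} - \text{Cost of investment}}{\text{Cost of investment}}
```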

2.3 Measuring performance in the manufacturing industry

Sayings like “What gets measured gets done” summarise the importance of measuring performance (Lynch and Cross, 1991; from Kennerley and Neely, 2003), provided that the actions taken support the strategies and objectives of the organisation. Performance can be explained as an interaction between ability and efforts for improvement (Vroom, 1964; Porter and Lawler, 1968) (from Parida et al., 2015). Many industrial companies measure performance to support operational processes, comparing measurements to a certain target or expectation and thus learning to do better (Zairi, 1994; Ahmad and Dhafr, 2002; Rouse and Putterill, 2003). Performance is commonly divided into four dimensions: cost/productivity, time, flexibility and quality (De Toni and Tonchia, 2001).

Performance measurement is explained by Neely et al. (1995) as the process of quantifying both the effectiveness and the efficiency of actions. Effectiveness and efficiency can be described as “doing the right thing” and “doing things right”, respectively. For a manufacturing company and its production system, effectiveness can be described as producing the intended result (the product), while efficiency relates to producing the product with the least waste of time and effort, i.e. how well resources are utilised.

Measuring the performance of a manufacturing company is not done with one individual performance indicator (PI), but rather with a set of PIs. These PIs are not independent; they are often interconnected. Awareness of these relationships makes the usage of PIs more beneficial, as it brings a deeper understanding of the process and makes it more likely that the right decisions are made (Rodriguez et al., 2009; Kang et al., 2016). The selection of PIs must consider the context (i.e. reflect the strategy of the company) and be standardised, and the PIs need to be interpreted as intended to be beneficial (Azapagic and Perdan, 2000; Veleva and Ellenbecker, 2001). Neely et al. (2000) suggest five parts to consider when setting up PIs: (1) the right variables must be measured, (2) the PIs must be calculated correctly, (3) it should be possible to compare the PIs with previous results to facilitate improvement, (4) the PIs should be comparable with competitors to allow benchmarking, and (5) there should be long-term goals against which to assess the trend. If the vision is deployed through the whole organisation, and PIs are selected accordingly, the PIs will provide feedback to be used as support in any decision. The relation between the vision and the PIs is described in Figure 3 (inspired by Bititci et al., 1997).

Neely and Bourne (2000) state that “the trick is to measure as little as possible, but to ensure that you are measuring the things that matter”. Even so, the trend in many companies is towards an increasing number of PIs, resulting in difficulties in measuring and analysing the data (Schneiderman, 1999; Tangen, 2005; Salloum, 2013; Parida et al., 2015). This might be explained by the dynamics in companies and changes in strategies without corresponding changes to performance measurement (Salloum, 2013; Melnyk et al., 2014). Changes in strategies and operations require further modification of the PIs and the performance measurement system, if the system is to continue to reflect the organisational context and goals. Nevertheless, few organisations have a systematic process to ensure that the PIs and the performance measurement system are still valid in light of new strategies (Kennerley and Neely, 2003). The difficulty often lies in assessing which PIs are relevant, establishing targets, ensuring data availability and involving those who will use the PIs (Braz et al., 2011). In general, there is little guidance on selecting and implementing performance measurement (Digalwar and Sangwan, 2011). The frameworks that are available are often described very broadly, leaving it to practitioners in the manufacturing industry to solve the practical challenges (Tangen, 2004).

Currently, many industrial companies have accelerated their transition towards digitalisation, which also means changes in strategy. As PIs should follow the vision and strategy of the company, a change towards digitalised manufacturing requires a corresponding change in the set of PIs to monitor and control both performance and the implementation itself (Ante et al., 2018).

Another change in the manufacturing industry is the increasing interest in sustainable manufacturing, and companies are changing their strategies to consider sustainability at both strategic and operational levels. Naturally, sustainability performance indicators then need to be included in companies’ performance measurement systems (Winroth et al., 2016). However, sustainability is complex (an emergent property) with many interacting factors. While the individual factors can be measured and followed up, the concept of sustainability as a whole is argued to be more challenging to describe with PIs (Ehrenfeld, 2009; from Bocken et al., 2013). In addition, Bocken et al. (2013) report that companies are seldom sure whether sustainability PIs support performance improvement.

The challenge of selecting sustainability PIs has been addressed in research through the development of methodologies and guidelines to facilitate selection. As an example, Veleva and Ellenbecker (2001) proposed twenty-two PIs related to sustainability, as well as detailed guidelines for their application. Despite such detailed guidelines, some companies might still find it challenging to implement and use the PIs. The availability of resources and data, which is essential for applying the proposed methodologies, is a major barrier. However, a lack of data availability is not unique to sustainability PIs; it is representative of PI selection and usage in general (Braz et al., 2011).

2.4 Maintenance performance indicators

The maintenance performance perspective has long been viewed differently within organisations: cost, budget performance and availability from the perspectives of economists, senior management and the production department, respectively (Pintelon and Van Puyvelde, 1997). However, the traditional view of maintenance organisations and their role is to repair equipment (Ylipää et al., 2017). This has resulted in a focus on PIs that describe the failure behaviour of machines (internal efficiency), rather than on the system level (external efficiency). Maintenance performance includes both the internal and the external efficiency of the maintenance function. In spite of this, the majority of indicators used in the manufacturing industry are PIs related to the internal efficiency of maintenance (Parida and Kumar, 2006). Examples include indicators for equipment, maintenance costs and safety aspects (Muchiri et al., 2011).

Indicators are commonly divided into leading and lagging indicators (Smith, 2004; Muchiri et al., 2011; Smith and Mobley, 2011; Kumar et al., 2013). Leading indicators reflect actions carried out in the maintenance process; examples include preventive maintenance (PM) tasks conducted in relation to PM tasks planned and the number of inspections. Lagging indicators, on the other hand, measure the results of those actions, including, for example, mean time to failure (MTTF) and direct maintenance costs. Both types of indicators are described as important and useful for controlling actions according to their effects.
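To make the distinction concrete, the following minimal Python sketch computes one leading and one lagging indicator from a hypothetical maintenance record; the figures and variable names are illustrative assumptions, not data from this paper.

```python
# Illustrative sketch (hypothetical data): one leading and one lagging indicator.
from statistics import mean

# Hypothetical work-order counts for one period
pm_planned = 40      # preventive maintenance (PM) tasks planned
pm_conducted = 36    # PM tasks actually conducted

# Leading indicator: PM compliance (an action carried out in the maintenance process)
pm_compliance = pm_conducted / pm_planned

# Hypothetical operating hours between consecutive failures of one machine
hours_between_failures = [120.0, 95.5, 210.0, 160.25]

# Lagging indicator: mean time to failure (MTTF), a result of maintenance actions
mttf = mean(hours_between_failures)

print(f"PM compliance (leading): {pm_compliance:.0%}")
print(f"MTTF (lagging): {mttf:.1f} h")
```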

The literature suggests several PIs for following up maintenance actions and measuring maintenance performance. The authors of these works deal with similar PIs but lack consensus on how to categorise maintenance PIs and on the methodologies used to select them (Kumar et al., 2013). Simões et al. (2011) present a literature review that identifies 345 different PIs, of which 37 were identified as main PIs on the requirement that they occur at least twice in the reviewed literature. Examples of the main PIs are cost, overall equipment effectiveness (OEE), availability, quality, mean time between failures (MTBF), mean time to restoration (MTTR), downtime and productivity. Another review of maintenance PIs, presented by Kumar et al. (2013), categorises PIs into four major categories: (1) financial, (2) human resources, (3) indicators related to the internal process and (4) technical indicators. Coetzee (1998) (from Kumar et al., 2013) categorises maintenance PIs into four categories of efficiency: (1) machine maintenance, (2) tasks, (3) organisation and (4) profit/cost, whereas Campbell (1995) (from Kumar et al., 2013) presents three categories of PIs: (1) equipment performance, (2) cost performance and (3) process performance. Parida and Chattopadhyay (2007) suggest a more extensive structure with seven categories of maintenance PIs: (1) equipment-related indicators, (2) maintenance task-related indicators, (3) cost-related indicators, (4) impact on customer satisfaction, (5) learning and growth, (6) health, safety, security and environment (HSSE) and (7) employee satisfaction. The standard EN 15341 (CEN, 2007) is well known in the manufacturing industry and consists of over 70 PIs, categorised into technical, economic and organisational indicators. The overlapping categories and limited consensus result in a lack of clarity regarding different types of PIs.

Nevertheless, using PIs does not guarantee performance improvement; PIs must be selected so that they are aligned with the strategy of the organisation (Simões et al., 2016). For a long time, there has been a lack of consensus about which methodologies to use for PI selection (Kumar et al., 2013). Researchers have, therefore, suggested different approaches for maintenance PI selection (see, for example, Stefanovic et al., 2017; Fangucci et al., 2017; Brundage et al., 2018; Wijesinghe and Mallawarachchi, 2019). In spite of this, the most common approach in practice is to select PIs based on measures already available, which neglects their relevance (Simões et al., 2016). Evaluating the relevance of each PI and how to use them is a common challenge in the manufacturing industry. Despite the importance of linking company vision and strategic goals to PIs (Bititci et al., 1997), maintenance PIs are seldom linked to the goals of the whole organisation (Kumar et al., 2013). Muchiri et al. (2011) argue that interaction between maintenance and the rest of the organisation is needed when selecting and using PIs. A number of authors have called for more research into practical implementation and investigations of how PIs are used in the manufacturing industry (Muchiri et al., 2011; Simões et al., 2011; Kumar et al., 2013).

3. Methodology

This study aimed to analyse 170 maintenance PIs in terms of how they can be used to measure and follow up on the anticipated effects of Smart Maintenance. It started with the collection of PIs to use in the analysis, based on literature selected through a literature search. The PIs were then analysed by coding them using intercoder reliability and negotiated agreement (Campbell et al., 2013). A pre-defined coding scheme was based on the anticipated effects of Smart Maintenance. The scheme also included a category of “other”, to avoid “force-fitting” and to allow further clarification. As a final step, the “other” category was analysed further. Figure 4 shows an overview of the methodology, and the following sections describe it in more detail.

A literature selection of maintenance PIs and measures was conducted to collect PIs to analyse. A flowchart of the selection is presented in Figure 5.

The literature selection started with a literature search in Scopus. Keywords used in searching for literature were: (maintenance AND performance AND measure OR indicator); (maintenance AND “key performance indicator”); (literature review AND maintenance AND performance). The content of the retrieved literature was scanned, and the selection criteria included the mention of individual PIs, categories of PIs or a maintenance measurement system. Review papers including individual PIs and/or categories of PIs were also considered for selection. In addition, the industrial maintenance standard was selected. The selected literature is summarised in Table 1.

A gross list of PIs was created based on the selected literature. All individual PIs in the publications were added to the gross list. When PI categories were presented in a publication, any examples of individual PIs were also added to the list. Duplicates detected immediately during the creation of the gross list were not added. The gross list comprised 208 PIs. It was then sorted to remove remaining duplicates and individual PIs without a clear definition (for example, quality, which might refer to the product produced or to the maintenance task performed). The final list used in the analysis consisted of 170 PIs.
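As an illustration of this list-building step, the sketch below shows how duplicates and ambiguously defined PIs could be filtered from a gross list. It is a hypothetical reconstruction of the workflow under simple assumptions, not the authors' actual procedure.

```python
# Minimal sketch (assumed workflow): filtering a gross list of PIs into a final list.
raw_pis = [
    "MTBF", "MTTR", "OEE", "Availability", "mtbf",   # "mtbf" duplicates "MTBF"
    "Quality",                                        # ambiguous: product or maintenance task?
    "Maintenance cost/unit",
]
ambiguous = {"quality"}  # example of a PI excluded for lacking a clear definition

seen, final_list = set(), []
for pi in raw_pis:
    key = pi.strip().lower()
    if key in seen or key in ambiguous:
        continue  # skip duplicates and ambiguous PIs
    seen.add(key)
    final_list.append(pi)

print(final_list)  # ['MTBF', 'MTTR', 'OEE', 'Availability', 'Maintenance cost/unit']
```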

A clear, structured coding scheme was developed based on the anticipated effects of Smart Maintenance (Bokrantz et al., 2020b): (1) maintenance performance, (2) manufacturing performance, (3) safety performance, (4) environmental performance, (5) financial performance, (6) competitive advantage, (7) job satisfaction and (8) organisational attractiveness. As the anticipated effects of Smart Maintenance are clearly defined, a ninth code, (9) other, was added to the coding scheme to avoid “force-fitting” and allow further clarity. The coding procedure followed intercoder reliability and negotiated agreement (Campbell et al., 2013). These methods are suitable and ensure credibility when a clear coding scheme is coded by two equally knowledgeable researchers. The two researchers coded all 170 PIs in isolation from each other, using the software NVivo (version 11). To account for agreement by chance, intercoder reliability was calculated using Krippendorff’s kappa coefficient (Krippendorff, 2004) (from Campbell et al., 2013). The first coding resulted in 40 discrepancies, and the kappa coefficient was calculated at 0.68. All discrepancies were discussed, with the aim of reaching consensus on the rules to be used in the second coding. The second coding resulted in 10 discrepancies, and the kappa coefficient was calculated at 0.93. This time, the discrepancies were discussed so as to reach a consensus on how they should be categorised.
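The following Python sketch illustrates how a chance-corrected agreement coefficient can be computed for two coders. It uses Cohen's kappa as a simple stand-in, with made-up codes for ten PIs; it is not the authors' actual calculation or the exact coefficient reported in the paper.

```python
# Illustrative sketch: Cohen's kappa for two coders assigning categorical codes.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two sequences of categorical codes."""
    assert len(coder_a) == len(coder_b) and len(coder_a) > 0
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

# Toy example: hypothetical codes assigned to ten PIs by two coders
a = ["maintenance", "maintenance", "safety", "other", "manufacturing",
     "maintenance", "financial", "other", "maintenance", "environmental"]
b = ["maintenance", "manufacturing", "safety", "other", "manufacturing",
     "maintenance", "financial", "maintenance", "maintenance", "environmental"]

print(f"kappa = {cohens_kappa(a, b):.2f}")  # about 0.73 for this toy data
```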

Some of the PIs were categorised as “other”, as they did not fit into any of the eight defined categories of anticipated effects of Smart Maintenance. A further analysis was conducted for clarification and to investigate what these PIs measure. First, the contingency model of Smart Maintenance (Bokrantz et al., 2020b) was used to categorise the PIs. An inductive approach inspired by constant comparison (Glaser and Strauss, 1967) was then used to create new categories for the remaining PIs. A new category had to be defined by at least two PIs.

4. Results and interpretations

This section presents the results of the study, complemented with interpretations of the results. An overview of how the PIs were categorised is given in Figure 6.

The majority, 93 of the PIs (54.7%), were categorised as “maintenance performance”. Eleven PIs were categorised as “manufacturing performance” and 10 as “safety performance”, corresponding to 6.5% and 5.9% of all PIs, respectively. Four PIs (2.4%) were categorised as “financial performance” and four (2.4%) as “environmental performance”. Three PIs (1.8%) were categorised as “job satisfaction”. None of the PIs were categorised as “competitive advantage” or “organisational attractiveness”, while 45 PIs (26.5%) were categorised as “other”.

The PIs categorised as “maintenance performance” were characterised by describing internal efficiency in terms of failure behaviour/mode and machine repair, as well as maintenance cost-effectiveness. Examples include MTBF, MTTR, total downtime, OEE, cost of unplanned maintenance tasks and maintenance cost per unit. PIs such as “immediate corrective man-hours in relation to total maintenance man-hours” and “immediate corrective maintenance time in relation to total maintenance downtime” also reflect the production situation and can be used in various ways to describe maintenance outcomes. However, PIs directly related to production system outcomes were categorised as “manufacturing performance”, in other words, the external efficiency of maintenance. Examples include productivity, the number of reworks, rework cost, cycle time and production rate. Almost as many PIs were categorised as “safety performance”, focusing on the safety aspects of maintenance. Examples include the number of accidents, the number of incidents, the amount of compensation paid and the number of failures causing personal injury relative to the total number of failures.
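As an illustration of typical “maintenance performance” PIs, the sketch below derives MTBF, MTTR and availability from a simple, hypothetical stoppage log for one machine, following the ratio definitions listed in the appendix (total operating time per failure, total restoration time per failure). The data are invented for the example.

```python
# Illustrative sketch (hypothetical data): deriving maintenance performance PIs
# from a simple stoppage log for one machine over one period.
stoppages = [1.5, 0.5, 3.0, 2.0]          # repair time (h) for each failure
period_hours = 720.0                       # required time in the period

total_downtime = sum(stoppages)            # 7.0 h
n_failures = len(stoppages)
operating_time = period_hours - total_downtime

mtbf = operating_time / n_failures         # total operating time / number of failures
mttr = total_downtime / n_failures         # total time to restoration / number of failures
availability = operating_time / period_hours

print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.2f} h, availability = {availability:.1%}")
```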

Compared to PIs relating to the cost of carrying out maintenance, PIs in the “financial performance” category relate to “higher-level” (plant-level and firm-level) financial performance. These included “return on fixed assets” (ROFA) and “return on maintenance investment”. PIs in “financial performance” are affected by factors other than maintenance actions alone, while PIs in “maintenance performance” are strongly linked to maintenance actions.

Environmental performance includes PIs for environmental impacts, such as “number of failures causing environmental damage” and “number of failures causing potential environmental damage”, relative to the total number of failures or to time. Considering the satisfaction of maintenance employees, the following PIs were categorised as “job satisfaction”: employee satisfaction, number of employee complaints and employee turnover rate. None of the PIs were categorised as “organisational attractiveness” or “competitive advantage”.

Forty-five PIs were categorised as “other”. Some of these PIs are close to the maintenance organisation/function, such as “competence of maintenance personnel” and “number of internal maintenance personnel in relation to total number of internal employees”, but are not deemed to be (maintenance) performance. Other PIs do not immediately reflect maintenance actions; examples include “employee absentees”, “number of new ideas generated” and “number of new customers”. The spread within this category created a need for further analysis for clarification purposes.

The analysis of the PIs in “other” started with the full contingency model of Smart Maintenance. Some of the PIs were categorised as “complementary investment” and “human capital resource”. These refer to investment in intangible assets and to skill levels (the analytical, IT, social, business, adaptability and technical skills of the employees). An inductive approach was then used to create categories for the remaining PIs, resulting in the following categories: customer satisfaction, HSSE and organisational. “Customer satisfaction” refers to the satisfaction of the organisation’s end customer. “HSSE” includes PIs measuring aggregated values of health, safety, security and environment. The “organisational” category includes PIs describing the organisation and the distribution of work (the number of people and their cost). Three PIs were uncategorised. Figure 7 gives an overview of the categories and the distribution of how the PIs were categorised.

The majority of these 45 PIs, 62.2%, were categorised as “organisational”. 13.3% were categorised as “complementary investment” and 8.8% as “customer satisfaction”. “HSSE” accounts for 6.3% of the PIs, and 2.2% were categorised as “human capital resource”. The uncategorised PIs represent 6.3%.

Both “complementary investment” and “human capital resource” relate to intangible assets. “Competence of maintenance personnel” was coded as “human capital resource”, while the PIs in “complementary investment” concern efforts to maintain or increase skill levels, or capture a lack of skills. Examples include the amount of money spent on training, the number of training programmes conducted and the estimated lost time due to lack of knowledge or skills in relation to total time worked. Organisational PIs, on the other hand, describe the setup of the organisation; “number of internal maintenance personnel in relation to total maintenance personnel” and “number of maintenance employees in relation to number of supervisors” are two examples. Financial indicators that directly reflect the setup of the organisation were also included in the “organisational” category, for example “total mechanical contractor cost” in relation to “total maintenance contractor cost and percentage”.

The PIs “number of HSSE complaints” and “HSSE losses” were categorised as “other”, since “environmental performance” and “safety performance” were separate categories; the “HSSE” category was therefore created. Compared to the PIs in “environmental performance” and “safety performance”, these PIs are aggregated values of several measures. Other PIs were not strongly linked to the maintenance function but rather related to the end customer of the company. Examples of PIs in the “customer satisfaction” category are “number of complaints from customers” and “number of new customers”.

In summary, the categories of the defined effects of Smart Maintenance are presented in decreasing order of frequency: maintenance performance, manufacturing performance, safety performance, environmental performance, financial performance, job satisfaction, organisational attractiveness and competitive advantage. The analysis of “other” resulted in five more categories: organisational, complementary investment, customer satisfaction, HSSE and human capital resource. In total, 170 PIs were analysed, with 167 PIs categorised into 13 different categories; three PIs remained uncategorised. Appendix 1 shows how each PI was categorised.

5. Discussion

The purpose of this study is to ensure productive, robust and sustainable production systems and realise digitalised manufacturing through the implementation of Smart Maintenance. To invest in and effectively implement Smart Maintenance, industry practitioners need to be able to measure and follow up the anticipated effects with PIs that they actually understand and use. This paper serves as a guide for industry practitioners in selecting PIs to measure and follow up on the anticipated effects of Smart Maintenance, and thus facilitate its implementation.

Since the expectations of Smart Maintenance are broader than the traditional view of maintenance, this study evaluated whether the anticipated effects of Smart Maintenance could be measured and followed up using existing maintenance PIs. 170 PIs suggested in the maintenance literature were analysed and categorised. Using a coding procedure based on intercoder reliability and negotiated agreement (Campbell et al., 2013), 54.7% of the PIs were categorised as “maintenance performance”, 6.5% as “manufacturing performance” and 5.9% as “safety performance”. “Environmental performance” and “financial performance” accounted for 2.4% each, and “job satisfaction” for 1.8%. None of the PIs were categorised as “competitive advantage” or “organisational attractiveness”. Meanwhile, the 26.5% of PIs that could not be categorised as effects of Smart Maintenance were categorised as “other”.

The 170 maintenance PIs analysed can be used to measure (at least to some extent) the majority of the anticipated effects of Smart Maintenance. However, there were also some gaps. No maintenance PIs were categorised as measuring “competitive advantage” or “organisational attractiveness”. These effects may not have been intended to be measured by maintenance organisations themselves. Instead, integration between departments is necessary to successfully select and use relevant PIs, something that has been called for in maintenance research before (Muchiri et al., 2011). In addition, the relationships between the PIs should be investigated in order to use the PIs more beneficially and create more value from their usage (Rodriguez et al., 2009; Kang et al., 2016).

It is clear from the research that changes in strategy also require changes in the set of PIs used (see, for example, Braz et al., 2011; Simões et al., 2016). Currently, many manufacturing companies have accelerated their transition to digitalised manufacturing, meaning that their set of PIs needs to be changed accordingly. This study shows that new maintenance PIs do not necessarily need to be developed to measure and follow up on the anticipated effects of Smart Maintenance. However, to successfully implement Smart Maintenance, it is crucial to revise the set of maintenance PIs, integrate the use of PIs with other departments (especially production) and link maintenance PIs to business goals (Neely et al., 2000; Kumar et al., 2013).

Neely and Bourne (2000) state that “the trick is to measure as little as possible, but to ensure that you are measuring the things that matter”. The intention of this paper has not been to evaluate the importance of a specific PI or “how good” it is. However, during the negotiated agreement (Campbell et al., 2013), whether a PI is useful and why it should be measured was naturally discussed. This study included 170 PIs related to maintenance, identified from scientific publications and the industrial standard EN 15341 (CEN, 2007). The standard consists of over 70 PIs, the majority of them relatively complicated, with no guidance on why to use them, how to use them or how to select among them. Without such guidance, the standard could easily be misinterpreted and promote the selection of PIs based on convenience or availability. In contrast, many authors argue that the number of PIs should be kept to a minimum (see, for example, Neely and Bourne, 2000; Tangen, 2005; Parida et al., 2015). There is thus a clear need for clarification and guidance on best practice and the application of PIs.

In both academia and industry, it is important to differentiate between means and ends; in other words, between the actions that bring about effects and the effects themselves. In the literature on maintenance PIs, this distinction is described in terms of “leading” and “lagging” indicators. Leading indicators are used for actions carried out and lagging indicators for the effects of those actions (Kumar et al., 2013). However, in the literature studied and in industry, all these indicators are somewhat clumsily called “performance indicators”, which makes differentiating between means and ends problematic. The authors of this paper agree that both leading and lagging indicators should be measured, so as to control actions according to their effects (Kumar et al., 2013). However, it is important to differentiate between what is done to impact performance (means) and the performance itself (ends).

The above lack of clarity was also reflected in the coding procedure, which was based on intercoder reliability and negotiated agreement (Campbell et al., 2013). The first coding resulted in 40 discrepancies, caused by the researchers’ differing views of performance (their limited reflection on means vs ends). Negotiated agreement produced a consensus that performance is an effect of actions, a clear differentiation between means and ends. This reduced the number of discrepancies in the second coding, for which the kappa coefficient for intercoder reliability was calculated at 0.93, strengthening the credibility of the coding results.

With the conceptualisation of Smart Maintenance as a basis, the PIs were divided into eight well-defined categories of anticipated effects of Smart Maintenance. However, 26.5% of the PIs could not be categorised under these effects and so were categorised as “other”. To clarify this, more categories were created based on the contingency model of Smart Maintenance (Bokrantz et al., 2020b). For PIs that did not fit into the contingency model, an inductive approach inspired by constant comparison (Glaser and Strauss, 1967) was used. This resulted in five more categories, making a total of 13 PI categories, which gives a more detailed overview of what the PIs actually measure compared to previous literature, which commonly divides PIs into three or four categories (Kumar et al., 2013).

Three PIs did not fit into any of the categories from the Smart Maintenance model. Furthermore, it was not possible to create new categories using constant comparison (Glaser and Strauss, 1967), as a new category must be defined by at least two PIs. To avoid “force-fitting”, these three PIs therefore remained uncategorised.

To sum up, the main focus of this study has been to support industry practitioners (especially decision-makers) in selecting PIs for measuring and following up on the effects of Smart Maintenance. The majority of the anticipated effects of Smart Maintenance can be measured with the 170 maintenance PIs analysed. However, PIs must be selected according to the vision and goals of the organisation in order to achieve performance improvement (Bititci et al., 1997; Neely et al., 2000; Digalwar and Sangwan, 2011). Further, the industrial practices of selecting and using PIs need investigation (Tangen, 2004; Digalwar and Sangwan, 2011; Kumar et al., 2013). Compared to the traditional view of the maintenance function (repairing equipment), Smart Maintenance is expected to lead to a broader spectrum of effects and to change the role of the maintenance function in the corporate strategy. The strategy should be reflected in the PIs, meaning that changes in strategies and operations require a change of PIs (Braz et al., 2011; Simões et al., 2016). In contrast, many industry practitioners select and use PIs based on the availability of measurements instead of their relevance (Simões et al., 2016). For a manufacturing company implementing Smart Maintenance, there is thus a strong need to revise the set of PIs, as well as to integrate the use of PIs with the rest of the organisation.

Neely et al. (2000) present five elements to consider when selecting PIs. Methodologies for selecting PIs for sustainable manufacturing have been developed (e.g. Veleva and Ellenbecker, 2001), but not yet for maintenance in digitalised manufacturing. Nevertheless, even if methodologies exist, practical challenges in selecting and changing PIs remain (Braz et al., 2011). To understand the practical challenges and the needs in using PIs, the authors propose empirical research into the selection of PIs in digitalised manufacturing, including the integrated use of PIs between departments. The authors also suggest research to evaluate how well each PI measures each effect, in order to successfully follow up and target the anticipated effects of Smart Maintenance.

6. Conclusions

The manufacturing industry is undergoing a digital transition, and many manufacturing companies worldwide are currently updating their maintenance strategies in line with Smart Maintenance. With the purpose of ensuring productive, robust and sustainable production systems through the implementation of Smart Maintenance, this study has analysed and categorised 170 maintenance PIs to support the selection of PIs targeting the anticipated effects of Smart Maintenance.

The categorisation of the PIs was based on intercoder reliability and negotiated agreement. The results show that the majority of the maintenance PIs measure internal maintenance performance. However, PIs are available that are capable of measuring the majority of the anticipated effects of Smart Maintenance. That said, owing to the extended future view of maintenance, some gaps were identified. For example, there are no existing PIs targeting “organisational attractiveness” or “competitive advantage”, which are important for the future role of maintenance. Ultimately, it is not necessary to develop a completely new set of maintenance PIs to follow up on the effects of Smart Maintenance; it is more important to integrate the usage of PIs internally across the whole organisation than to focus on developing new PIs for “competitive advantage” and “organisational attractiveness”. Furthermore, this study produced 13 categories of maintenance PIs, providing clarity and enabling effective implementation. These categories were: (1) maintenance performance, (2) manufacturing performance, (3) safety performance, (4) environmental performance, (5) financial performance, (6) competitive advantage, (7) job satisfaction, (8) organisational attractiveness, (9) organisational, (10) complementary investment, (11) customer satisfaction, (12) HSSE and (13) human capital resource.

Clarity and the ability to differentiate between means and ends are key factors in designing an effective PI-based measurement system for Smart Maintenance, as is aligning the PIs with the vision and goals of the company. This paper serves as a guide for industry practitioners in selecting relevant PIs for each of the anticipated effects of Smart Maintenance. Such a measurement system will facilitate the implementation of Smart Maintenance and thus help manufacturing companies realise digitalised manufacturing with productive, robust and sustainable production systems.

Figures

Figure 1: The four underlying dimensions of smart maintenance: data-driven decision-making, human capital resource, internal integration and external integration

Figure 2: It is anticipated that smart maintenance will impact three different levels: individual, plant and firm

Figure 3: The relation between the company’s vision and the performance indicators to be used in the organisation

Figure 4: The methodology of this study included three major steps: the selection of maintenance PIs to use in the study, creation of the coding procedure and an analysis of the “other” category

Figure 5: The literature selection procedure used to generate a list of PIs to analyse

Figure 6: Overview of how the maintenance PIs were categorised. The left-hand part of the figure shows the number of PIs in each category and the pie chart shows the distribution of PIs in each category as a percentage

Figure 7: Overview of new categories. The left-hand part shows the number of PIs in each category, while the pie chart shows the distribution of the PIs in each category as a percentage

Table 1: Selected literature from which PIs were selected to analyse in this study

Literature (no) | Reference | Type
Literature 1 | Wireman (2005) | Book
Literature 2 | Muchiri et al. (2010) | Journal paper
Literature 3 | Muchiri et al. (2011) | Journal paper
Literature 4 | Simões et al. (2011) | Journal paper
Literature 5 | Parida and Chattopadhyay (2007) | Journal paper
Literature 6 | CEN (2007) | Maintenance standard

Appendix 1

PI | Category
Amount of money spent on training | Complementary investment
Estimated lost time due to lack of knowledge or skills/total time worked | Complementary investment
Maintenance rework due to lack of knowledge or skills/total maintenance work | Complementary investment
Number of training programs conducted | Complementary investment
Skill improvement training | Complementary investment
Total training dollars/total plant payroll | Complementary investment
Customer satisfaction | Customer satisfaction
Number of new customers | Customer satisfaction
Number of quality complaints from customer | Customer satisfaction
Percentage of customers retained | Customer satisfaction
Number of failures due to maintenance creating environmental damage/calendar time | Environmental performance
Annual volume of wastes or harmful effects related to maintenance/calendar time | Environmental performance
Number of failures causing damage to the environment/total number of failures | Environmental performance
Number of failures causing potential damage to the environment/total number of failures | Environmental performance
Return on maintenance investment | Financial performance
ROMI (return on marketing investment) | Financial performance
RONA (return on net assets) | Financial performance
ROFA (return on fixed assets) | Financial performance
Employee absentees | HSSE
HSSE losses | HSSE
Number of HSSE complaints | HSSE
Competence of maintenance personnel | Human capital resource
Number of employee complaints | Job satisfaction
Employee turnover rate | Job satisfaction
Employee satisfaction | Job satisfaction
OEE | Maintenance performance
Availability | Maintenance performance
Downtime | Maintenance performance
Failure rate | Maintenance performance
Reliability | Maintenance performance
Downtime cost | Maintenance performance
Equipment losses | Maintenance performance
Service level | Maintenance performance
Inventory cost | Maintenance performance
MTTF | Maintenance performance
Number of small stoppages | Maintenance performance
Number of big stoppages | Maintenance performance
Downtime for small stoppages | Maintenance performance
Downtime for big stoppages | Maintenance performance
Change-over time (from stop to running condition) | Maintenance performance
Number of planned maintenance tasks | Maintenance performance
Time for planned maintenance tasks | Maintenance performance
Cost of planned maintenance tasks | Maintenance performance
Number of unplanned maintenance tasks | Maintenance performance
Time for unplanned maintenance tasks | Maintenance performance
Cost of unplanned maintenance tasks | Maintenance performance
Response time for maintenance | Maintenance performance
Maintenance cost/unit | Maintenance performance
Number of stops | Maintenance performance
Number of shutdowns | Maintenance performance
Maintenance cost per estimated replacement value of the plant | Maintenance performance
Maintenance cost as a percentage of sales | Maintenance performance
Maintenance work orders on hold awaiting parts/total number of maintenance work orders | Maintenance performance
Total downtime attributed to maintenance errors/total downtime | Maintenance performance
Breakdowns caused by items that should have been inspected, serviced or a part of the PM program/total number of breakdowns | Maintenance performance
Number of repetitive equipment failures/total number of equipment failures | Maintenance performance
Estimated PM task cost/actual PM task cost | Maintenance performance
Total number of work orders generated from PM inspections/total number of work orders | Maintenance performance
Inactive stock line items/total stock line items | Maintenance performance
Maintenance labour cost on work orders/total maintenance labour costs | Maintenance performance
Maintenance labour costs planned/total maintenance labour costs | Maintenance performance
Maintenance material costs planned/total maintenance materials costs | Maintenance performance
Total maintenance cost/assets replacement value | Maintenance performance
Total maintenance cost/added value plus external costs for maintenance | Maintenance performance
Total maintenance cost/production transformation cost | Maintenance performance
(Total maintenance cost + unavailability costs related to maintenance)/quantity of output | Maintenance performance
Availability related to maintenance/total maintenance cost | Maintenance performance
Average inventory value of maintenance materials/asset replacement value | Maintenance performance
Total cost of maintenance materials/total maintenance cost | Maintenance performance
Total cost of maintenance materials/average inventory value of maintenance materials | Maintenance performance
Total maintenance cost/total energy used | Maintenance performance
Corrective maintenance cost/total maintenance cost | Maintenance performance
Preventive maintenance cost/total maintenance cost | Maintenance performance
Condition-based maintenance cost/total maintenance cost | Maintenance performance
Predetermined maintenance cost/total maintenance cost | Maintenance performance
Improvement maintenance cost/total maintenance cost | Maintenance performance
Maintenance shutdown cost/total maintenance cost | Maintenance performance
Total operating time/(total operating time + downtime due to maintenance) | Maintenance performance
Achieved up time during required time/required time | Maintenance performance
Total operating time/(total operating time + downtime related to failures) | Maintenance performance
Total operating time/(total operating time + downtime related to planned and scheduled maintenance) | Maintenance performance
Preventive maintenance causing downtime/total downtime related to maintenance | Maintenance performance
Predetermined maintenance causing downtime/total downtime related to maintenance | Maintenance performance
Condition-based maintenance time causing downtime/total downtime related to maintenance | Maintenance performance
Total operating time/number of maintenance work orders causing downtime | Maintenance performance
Total operating time/number of maintenance work orders | Maintenance performance
Total operating time/number of failures (MTBF) | Maintenance performance
Planned and scheduled maintenance time causing production downtime/planned and scheduled total maintenance time requiring downtime | Maintenance performance
Total time to restoration/number of failures (MTTR) | Maintenance performance
Production operator maintenance man-hours/total maintenance man-hours available | Maintenance performance
Planned and scheduled maintenance man-hours/total maintenance man-hours available | Maintenance performance
Production operator maintenance man-hours/total production operators man-hours | Maintenance performance
Direct maintenance personnel on shift/total downtime related to maintenance | Maintenance performance
Immediate corrective maintenance time/total downtime related to maintenance | Maintenance performance
Corrective maintenance man-hours/total maintenance man-hours | Maintenance performance
Immediate corrective maintenance man-hours/total maintenance man-hours | Maintenance performance
Preventive maintenance man-hours/total maintenance man-hours | Maintenance performance
Condition-based maintenance man-hours/total maintenance man-hours | Maintenance performance
Predetermined maintenance man-hours/total maintenance man-hours | Maintenance performance
Overtime internal maintenance man-hours/total internal maintenance man-hours | Maintenance performance
Number of work orders performed as scheduled/total number of scheduled work orders | Maintenance performance
Total man-hours spent by direct personnel on planned and scheduled activities/total man-hours planned and scheduled to direct personnel | Maintenance performance
Number of spare parts supplied by the warehouse as requested/total number of spare parts required by maintenanceMaintenance performance
Work requests remaining in request status for <5 days/total work requestsMaintenance performance
Planned work/total work doneMaintenance performance
Percentage of work orders requiring rework due to planning/ all word ordersMaintenance performance
Percentage of work orders with delayed execution due to material or manpowerMaintenance performance
Work orders with scheduled date earlier or equal to latest finish date/all work ordersMaintenance performance
Manpower utilisation (total hours spent on tasks/available hours)Maintenance performance
Manpower efficiency (time allocated to tasks/time spent on tasks)Maintenance performance
Work order turnover (number of completed tasks/number of received tasks)Maintenance performance
Backlog (number of overdue tasks/number of received tasks)Maintenance performance
Percentage of maintenance work requiring reworkMaintenance performance
Number of failuresMaintenance performance
Direct maintenance costMaintenance performance
Breakdown severity (breakdown cost/direct maintenance cost)Maintenance performance
Maintenance cost of components over manufacturing costMaintenance performance
Maintenance stock turnoverMaintenance performance
ProductivityManufacturing performance
Cycle timeManufacturing performance
Performance rate (speed of production)Manufacturing performance
Nonconforming itemsManufacturing performance
Number of return goodsManufacturing performance
Number of reworkManufacturing performance
Cost of reworkManufacturing performance
Production cost/unitManufacturing performance
Production rateManufacturing performance
Actual operation time/required operation timeManufacturing performance
Total downtime attributed to operational/total downtimeManufacturing performance
Cost of indirect maintenance personnel/total maintenance costOrganisational
Cost of suppliers/total maintenance costOrganisational
Cost of training for maintenance/number of maintenance personnelOrganisational
Internal electrical man-hours/total internal direct maintenance personnel man-hoursOrganisational
Internal instrumentation man-hours/total internal direct maintenance personnel man-hoursOrganisational
Internal mechanical man-hours/total internal direct maintenance personnel man-hoursOrganisational
Maintenance budgetOrganisational
Man-hours envisaged for proactive work/total-man hours availableOrganisational
Man-hours used for continuous improvement/total maintenance personnel man-hoursOrganisational
Man-hours used for improvement and modification/total man-hours availableOrganisational
Man-hours used for planning in a systematic maintenance planning process/total internal maintenance personnel man-hoursOrganisational
Number of indirect maintenance personnel/number of direct maintenance personnelOrganisational
Number of indirect maintenance personnel/number of internal maintenance personnelOrganisational
Number of internal direct maintenance people using software /number of internal direct maintenance personnelOrganisational
Number of internal maintenance personnel/total internal employeesOrganisational
Number of internal multi-skilled maintenance personnel/number of internal maintenance personnelOrganisational
Number of maintenance employees/number of plannersOrganisational
Number of maintenance employees/number of supervisorsOrganisational
Number of maintenance internal personnel man-hours of training/total internal maintenance man-hoursOrganisational
Percentage cost of personnel (staff cost/total maintenance cost)Organisational
Percentage cost of subcontractors (expenditure of subcontracting/total maintenance cost)Organisational
Total contractor cost/total maintenance costOrganisational
Total electrical maintenance contractor costs/total maintenance contractor costsOrganisational
Total external personnel cost spent on maintenance/total maintenance costOrganisational
Total instrumentation maintenance contractor costs/total maintenance contractor costsOrganisational
Total internal personnel cost spent on maintenance/total maintenance costOrganisational
Total mechanical maintenance contractor costs/total maintenance contractor costsOrganisational
Value of asset maintained per maintenance employeeOrganisational
Number of accidentsSafety performance
Number of incidentsSafety performance
Number of legal casesSafety performance
Number of compensation casesSafety performance
Amount of compensation paidSafety performance
Number of injuries for people due to maintenance/working timeSafety performance
Number of failures causing injury to people/total number of failuresSafety performance
Number of failures causing potential injury to people/total number of failuresSafety performance
Number of injuries to maintenance personnel/total maintenance personnelSafety performance
Man-hours lost due to injuries for maintenance personnel/ total man-hours worked by maintenance personnelSafety performance
Number of new ideas generatedUncategorised
Number of systems covered by a critical analysis/total number of systemsUncategorised
Total number of equipment items in the CMMS/EAM system/ total number of equipment items in the plantUncategorised
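Many of the indicators above are simple ratios computed from work-order and stoppage records. As a purely illustrative sketch (not part of the original study), the following Python snippet shows how a few of the ratio-based maintenance performance PIs listed above (availability, MTBF, MTTR, preventive maintenance cost share and backlog) could be computed. All data structures, field names and figures are assumptions made for the example and are not prescribed by the paper or by EN 15341.

from dataclasses import dataclass

@dataclass
class Stoppage:
    downtime_h: float      # downtime caused by the stoppage, in hours
    is_failure: bool       # True if the stoppage was an unplanned failure

@dataclass
class WorkOrder:
    cost: float            # total cost of the maintenance work order
    preventive: bool       # True for preventive, False for corrective work
    overdue: bool          # True if the work order missed its scheduled date

def maintenance_pis(operating_time_h: float, stoppages: list[Stoppage], work_orders: list[WorkOrder]) -> dict:
    downtime = sum(s.downtime_h for s in stoppages)
    failures = [s for s in stoppages if s.is_failure]
    total_cost = sum(w.cost for w in work_orders) or 1.0   # guard against division by zero
    return {
        # Availability: total operating time/(total operating time + downtime due to maintenance)
        "availability": operating_time_h / (operating_time_h + downtime),
        # MTBF: total operating time/number of failures
        "mtbf_h": operating_time_h / max(len(failures), 1),
        # MTTR: total time to restoration/number of failures
        "mttr_h": sum(s.downtime_h for s in failures) / max(len(failures), 1),
        # Preventive maintenance cost/total maintenance cost
        "pm_cost_share": sum(w.cost for w in work_orders if w.preventive) / total_cost,
        # Backlog: number of overdue tasks/number of received tasks
        "backlog_ratio": sum(w.overdue for w in work_orders) / max(len(work_orders), 1),
    }

if __name__ == "__main__":
    stoppages = [Stoppage(2.5, True), Stoppage(0.5, False), Stoppage(4.0, True)]
    work_orders = [WorkOrder(1200.0, True, False), WorkOrder(3400.0, False, True)]
    print(maintenance_pis(operating_time_h=160.0, stoppages=stoppages, work_orders=work_orders))

In practice, the same ratios would be derived from CMMS/EAM exports rather than hand-built records; the point of the sketch is only to show how the indicator definitions translate into straightforward calculations.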
References

Ahmad, M.M. and Dhafr, N. (2002), “Establishing and improving manufacturing performance measures”, Robotics and Computer-Integrated Manufacturing, Vol. 18 Nos 3-4, pp. 171-176.

Akkermans, H., Besselink, L., Van Dongen, L. and Schouten, R. (2016), “Smart moves for smart maintenance”, available at: https://pdfs.semanticscholar.org/2bd7/53952261520bdccbeb026675245a77ca400a.pdf (accessed 1 October 2018).

Ante, G., Facchini, F., Mossa, G. and Digiesi, S. (2018), “Developing a key performance indicators tree for lean and smart production systems”, IFAC-PapersOnLine, Vol. 51 No. 11, pp. 13-18.

Azapagic, A. and Perdan, S. (2000), “Indicators of sustainable development for industry: a general framework”, Process Safety and Environmental Protection, Vol. 78 No. 4, pp. 243-261.

Barney, J. (1991), “Firm resources and sustained competitive advantage”, Journal of Management, Vol. 17 No. 1, pp. 99-120.

Bengtsson, M. and Salonen, A. (2016), “Requirements and needs – a foundation for reducing maintenance-related waste”, in Koskinen K. et al. (Eds) Proceedings of the 10th World Congress on Engineering Asset Management (WCEAM 2015), Springer, Cham, pp. 105-112.

Bititci, U.S., Carrie, A.S. and McDevitt, L. (1997), “Integrated performance measurement systems: a development guide”, International Journal of Operations and Production Management, Vol. 17 No. 5, pp. 522-534.

Bocken, N., Morgan, D. and Evans, S. (2013), “Understanding environmental performance variation in manufacturing companies”, International Journal of Productivity and Performance Management, Vol. 62 No. 8, pp. 856-870.

Bokrantz, J., Skoogh, A., Berlin, C. and Stahre, J. (2017), “Maintenance in digitalised manufacturing: Delphi-based scenarios for 2030”, International Journal of Production Economics, Vol. 191, pp. 154-169.

Bokrantz, J., Skoogh, A., Berlin, C., Wuest, T. and Stahre, J. (2020a), “Smart maintenance: an empirically grounded conceptualization”, International Journal of Production Economics, Vol. 223.

Bokrantz, J., Skoogh, A., Berlin, C., Wuest, T. and Stahre, J. (2020b), “Smart Maintenance: a research agenda for industrial maintenance management”, International Journal of Production Economics, Vol. 224.

Braz, R.G.F., Scavarda, L.F. and Martins, R.A. (2011), “Reviewing and improving performance measurement systems: an action research”, International Journal of Production Economics, Vol. 133 No. 2, pp. 751-760.

Brundage, M.P., Morris, K.C., Sexton, T., Moccozet, S. and Hoffman, M. (2018), “Developing maintenance key performance indicators from maintenance work order data”, paper presented at the ASME 2018 13th International Manufacturing Science and Engineering Conference, MSEC.

Campbell, J.D. (1995), Uptime: Strategies for Excellence in Maintenance Management, Productivity Press, New York, NY.

Campbell, J.L., Quincy, C., Osserman, J. and Pedersen, O.K. (2013), “Coding in-depth semi structured interviews: problems of unitization and intercoder reliability and agreement”, Sociological Methods and Research, Vol. 42 No. 3, pp. 294-320.

Carnero, M.C. (2005), “Selection of diagnostic techniques and instrumentation in a predictive maintenance program. A case study”, Decision Support Systems, Vol. 38, pp. 539-555.

CEN (2001), EN 13306: Maintenance Terminology, European Committee for Standardization, Brussels.

CEN (2007), EN 15341: Maintenance – Maintenance Key Performance Indicators, European Committee for Standardization, Brussels.

Coetzee, J.L. (1998), Maintenance, Maintenance Publishers, Hatfield.

Dalenogare, L.S., Benitez, G.B., Ayala, N.F. and Frank, A.G. (2018), “The expected contribution of Industry 4.0 technologies for industrial performance”, International Journal of Production Economics, Vol. 204, pp. 383-394.

De Toni, A. and Tonchia, S. (2001), “Performance measurement systems-models, characteristics and measures”, International Journal of Operations and Production Management, Vol. 21 Nos 1-2, pp. 46-71.

Digalwar, A.K. and Sangwan, K.S. (2011), “An overview of existing performance measurement frameworks in the context of world class manufacturing performance measurement”, International Journal of Services and Operations Management, Vol. 9 No. 1, pp. 60-82.

Ehrenfeld, J.R. (2009), “Understanding of complexity expands the reach of industrial ecology”, Journal of Industrial Ecology, Vol. 13 No. 2, pp. 165-167.

Fangucci, A., Galante, G.M., Inghilleri, R. and La Fata, C.M. (2017), “Structured methodology for selection of maintenance key performance indicators: application to an oil refinery plant”, International Journal of Operations and Quantitative Management, Vol. 23 No. 2, pp. 89-113.

Glaser, B.G. and Strauss, A.L. (1967), The Discovery of Grounded Theory: Strategies for Qualitative Research, Aldine Transaction, New York, NY.

Groover, M.P. (2007), Automation, Production Systems, and Computer-Integrated Manufacturing, Prentice Hall Press, New Jersey, United States.

Kagermann, H., Helbig, J., Hellinger, A. and Wahlster, W. (2013), “Recommendations for implementing the strategic initiative INDUSTRIE 4.0: securing the future of German manufacturing industry; final report of the Industrie 4.0 working group”, available at: https://www.din.de/blob/76902/e8cac883f42bf28536e7e8165993f1fd/recommendations-for-implementing-industry-4-0-data.pdf (accessed 1 October).

Kang, N., Zhao, C., Li, J. and Horst, J.A. (2016), “A hierarchical structure of key performance indicators for operation management and continuous improvement in production systems”, International Journal of Production Research, Vol. 54 No. 21, pp. 6333-6350.

Kans, M., Galar, D. and Thaduri, A. (2016), “Maintenance 4.0 in railway transportation industry”, in Koskinen K. et al. (Eds), Proceedings of the 10th World Congress on Engineering Asset Management (WCEAM 2015), Springer, Cham, pp. 317-331.

Kennerley, M. and Neely, A. (2003), “Measuring performance in a changing business environment”, International Journal of Operations and Production Management, Vol. 23 No. 2, pp. 213-229.

Ketokivi, M. (2016), “Point–counterpoint: resource heterogeneity, performance, and competitive advantage”, Journal of Operations Management, Vol. 41 No. 1, pp. 75-76.

Krippendorff, K. (2004), Content Analysis: An Introduction to its Methodology, Sage Publications, Thousand Oaks.

Kumar, U., Galar, D., Parida, A., Stenström, C. and Berges, L. (2013), “Maintenance performance metrics: a state-of-the-art review”, Journal of Quality in Maintenance Engineering, Vol. 19 No. 3, pp. 233-277.

Lee, J., Ni, J., Djurdjanovic, D., Qiu, H. and Liao, H. (2006), “Intelligent prognostics tools and e-maintenance”, Computers in Industry, Vol. 57 No. 6, pp. 476-489.

Lee, J., Wu, F., Zhao, W., Ghaffari, M., Liao, L. and Siegel, D. (2014), “Prognostics and health management design for rotary machinery systems – reviews, methodology and applications”, Mechanical Systems and Signal Processing, Vol. 42, pp. 314-334.

Lu, Y. (2017), “Industry 4.0: a survey on technologies, applications and open research issues”, Journal of Industrial Information Integration, Vol. 6, pp. 1-10.

Lynch, R.L. and Cross, K.F. (1991), Measure up – the Essential Guide to Measuring Business Performance, Mandarin, London.

McKinsey Global Institute (2016), The Age of Analytics: Competing in a Data-Driven World, Executive Summary, McKinsey & Company, Brussels, available at: https://www.mckinsey.com/∼/media/McKinsey/Business%20Functions/McKinsey%20Analytics/Our%20Insights/The%20age%20of%20analytics%20Competing%20in%20a%20data%20driven%20world/MGI-The-Age-of-Analytics-Executive-summary.ashx (accessed 14 April 2020).

Melnyk, S.A., Bititci, U., Platts, K., Tobias, J. and Andersen, B. (2014), “Is performance measurement and management fit for the future?”, Management Accounting Research, Vol. 25 No. 2, pp. 173-186.

Muchiri, P.N., Pintelon, L., Martin, H. and De Meyer, A.M. (2010), “Empirical analysis of maintenance performance measurement in Belgian industries”, International Journal of Production Research, Vol. 48 No. 20, pp. 5905-5924.

Muchiri, P., Pintelon, L., Gelders, L. and Martin, H. (2011), “Development of maintenance function performance measurement framework and indicators”, International Journal of Production Economics, Vol. 131 No. 1, pp. 295-302.

Muller, A., Marquez, A.C. and Iung, B. (2008), “On the concept of E-maintenance: review and current research”, Reliability Engineering and System Safety, Vol. 93, pp. 1165-1187.

Munzinger, C., Fleischer, J., Broos, A., Hennrich, H., Wieser, J., Ochs, A. and Schopp, M. (2009), “Development and implementation of smart maintenance activities for machine tools”, CIRP Journal of Manufacturing Science and Technology, Vol. 1, pp. 237-246.

Neely, A. and Bourne, M. (2000), “Why measurement initiatives fail”, Measuring Business Excellence, Vol. 4, pp. 3-6.

Neely, A., Gregory, M. and Platts, K. (1995), “Performance measurement system design: a literature review and research agenda”, International Journal of Operations and Production Management, Vol. 15 No. 4, pp. 80-116.

Neely, A., Mills, J., Platts, K., Richards, H., Gregory, M., Bourne, M. and Kennerley, M. (2000), “Performance measurement system design: developing and testing a process-based approach”, International Journal of Operations and Production Management, Vol. 20 No. 10, pp. 1119-1145.

Parida, A. and Chattopadhyay, G. (2007), “Development of a multi-criteria hierarchical framework for maintenance performance measurement (MPM)”, Journal of Quality in Maintenance Engineering, Vol. 13 No. 3, pp. 241-258.

Parida, A. and Kumar, U. (2006), “Maintenance performance measurement (MPM): issues and challenges”, Journal of Quality in Maintenance Engineering, Vol. 12 No. 3, pp. 239-251.

Parida, A., Kumar, U., Galar, D. and Stenström, C. (2015), “Performance measurement and management for maintenance: a literature review”, Journal of Quality in Maintenance Engineering, Vol. 21 No. 1, pp. 2-33.

Pintelon, L. and Van Puyvelde, F. (1997), “Maintenance performance reporting systems: some experiences”, Journal of Quality in Maintenance Engineering, Vol. 3 No. 1, pp. 4-15.

Porter, L.W. and Lawler, E.E. (1968), Managerial Attitudes and Performance, Dorsey, Homewood, IL.

Roda, I., Macchi, M. and Fumagalli, L. (2018), “The future of maintenance within industry 4.0: an empirical research in manufacturing”, IFIP International Conference on Advances in Production Management Systems, 2018, Springer, Cham, pp. 39-46.

Rodriguez, R.R., Saiz, J.J.A. and Bas, A.O. (2009), “Quantitative relationships between key performance indicators for supporting decision-making processes”, Computers in Industry, Vol. 60 No. 2, pp. 104-113.

Rouse, P. and Putterill, L. (2003), “An integral framework for performance measurement”, Management Decision, Vol. 41 No. 8, pp. 791-805.

Salloum, M. (2013), “Explaining the evolution of performance measures – a dual case-study approach”, Journal of Engineering, Project, and Production Management, Vol. 3, p. 99.

Schneiderman, A. (1999), “Why balanced scorecards fail”, Journal of Strategic Performance Measurement, Vol. 2 No. 11, pp. 6-11.

Simões, J.M., Gomes, C.F. and Yasin, M.M. (2011), “A literature review of maintenance performance measurement: a conceptual framework and directions for future research”, Journal of Quality in Maintenance Engineering, Vol. 17 No. 2, pp. 116-137.

Simões, J.M., Gomes, C.F. and Yasin, M.M. (2016), “Changing role of maintenance in business organisations: measurement vs strategic orientation”, International Journal of Production Research, Vol. 54 No. 11, pp. 3329-3346.

Smith, R. (2004), “Key performance indicators: leading or lagging and when to use them”, in Smith and Hawkins, Lean Maintenance: Reduce Costs, Improve Quality, and Increase Market Share, Elsevier BV, Burlington, pp. 249-252.

Smith, R. and Mobley, R.K. (2011), Rules of Thumb for Maintenance and Reliability Engineers, Butterworth-Heinemann, Oxford, UK.

Stefanovic, M., Nestic, S., Djordjevic, A., Djurovic, D., Macuzic, I., Tadic, D. and Gacic, M. (2017), “An assessment of maintenance performance indicators using the fuzzy sets approach and genetic algorithms”, Proceedings of the Institution of Mechanical Engineers, Part B, Journal of Engineering Manufacture, Vol. 231 No. 1, pp. 15-27, available at: https://doi.org/10.1177/0954405415572641.

Tangen, S. (2004), “Performance measurement: from philosophy to practice”, International Journal of Productivity and Performance Management, Vol. 53 No. 8, pp. 726-737.

Tangen, S. (2005), “Analysing the requirements of performance measurement systems”, Measuring Business Excellence, Vol. 9, pp. 46-54.

Tirabeni, L., De Bernardi, P., Forliano, C. and Franco, M. (2019), “How can organisations and business models lead to a more sustainable society? A framework from a systematic review of the industry 4.0”, Sustainability, Vol. 11 No. 22, p. 6363.

Veleva, V. and Ellenbecker, M. (2001), “Indicators of sustainable production: framework and methodology”, Journal of Cleaner Production, Vol. 9 No. 6, pp. 519-549.

Vroom, V.H. (1964), Work and Motivation, John Wiley & Sons, New York, NY.

Wang, L., Törngren, M. and Onori, M. (2015), “Current status and advancement of cyber-physical systems in manufacturing”, Journal of Manufacturing Systems, Vol. 37, pp. 517-527.

Wijesinghe, D. and Mallawarachchi, H. (2019), “A systematic approach for maintenance performance measurement: apparel industry in Sri Lanka”, Journal of Quality in Maintenance Engineering, Vol. 25 No. 1, pp. 41-53.

Winroth, M., Almström, P. and Andersson, C. (2016), “Sustainable production indicators at factory level”, Journal of Manufacturing Technology Management, Vol. 27 No. 6, pp. 842-873.

Wireman, T. (2005), Developing Performance Indicators for Managing Maintenance, Industrial Press, New York, NY.

Xu, L.D., Xu, E.L. and Li, L. (2018), “Industry 4.0: state of the art and future trends”, International Journal of Production Research, Vol. 56 No. 8, pp. 2941-2962.

Ylipää, T., Skoogh, A., Bokrantz, J. and Gopalakrishnan, M. (2017), “Identification of maintenance improvement potential using OEE assessment”, International Journal of Productivity and Performance Management, Vol. 66 No. 1, pp. 126-143.

Zairi, M. (1994), “Benchmarking: the best tool for measuring competitiveness”, Benchmarking for Quality Management and Technology, Vol. 1 No. 1, pp. 11-24.

Acknowledgements

We would like to thank all individuals who contributed to this research. This work was financed by VINNOVA [grant number 2017–01652]. This work has been performed within the Production Area of Advance at Chalmers. The support is greatly appreciated.

Corresponding author

Camilla Lundgren is the corresponding author and can be contacted at: camilla.lundgren@chalmers.se

About the authors

Camilla Lundgren is a PhD student at the Department of Industrial and Materials Science, Chalmers University of Technology. Her research focuses on the implementation of Smart Maintenance, including aspects such as investments, strategy and leadership. She has industrial experience from working with automatic cleaning systems in the pulp and paper industry, and from work as a production and logistics consultant at AFRY.

Jon Bokrantz, PhD, is a Researcher at the Department of Industrial and Materials Science, Chalmers University of Technology. Jon has a background in Production Engineering and his research focuses on maintenance in digitalised manufacturing.

Anders Skoogh is Professor of Production Maintenance at the Department of Industrial and Materials Science, Chalmers University of Technology. He is a research group leader for Production Service and Maintenance Systems. Anders is also the director of Chalmers’ master’s programme in Production Engineering and a board member of the think-tank Sustainability Circle. Before starting his research career, he accumulated industrial experience as a logistics developer at Volvo Cars.