Article

MARMA: A Mobile Augmented Reality Maintenance Assistant for Fast-Track Repair Procedures in the Context of Industry 4.0

Fotios K. Konstantinidis, Ioannis Kansizoglou, Nicholas Santavas, Spyridon G. Mouroutsos and Antonios Gasteratos
1 Department of Production and Management Engineering, Democritus University of Thrace, 12 Vas. Sophias, GR-671 00 Xanthi, Greece
2 Department of Electrical and Computer Engineering, Democritus University of Thrace, Kimmeria, GR-671 00 Xanthi, Greece
* Author to whom correspondence should be addressed.
Submission received: 30 November 2020 / Revised: 13 December 2020 / Accepted: 16 December 2020 / Published: 20 December 2020
(This article belongs to the Section Advanced Manufacturing)

Abstract

The integration of exponential technologies into traditional manufacturing processes constitutes a noteworthy trend of the past two decades, aiming to reshape the industrial environment. This kind of digital transformation, which is driven by the Industry 4.0 initiative, affects not only the individual manufacturing assets, but also the involved human workforce. Since human operators should be placed at the centre of this revolution, they ought to be endowed with new tools and through-engineering solutions that improve their efficiency. In addition, vivid visualization techniques must be utilized, in order to support them during their daily operations in an auxiliary and comprehensive way. Towards this end, we describe a user-centered methodology, which utilizes augmented reality (AR) and computer vision (CV) techniques, supporting low-skilled operators in maintenance procedures. The described mobile augmented reality maintenance assistant (MARMA) makes use of the handheld's camera to locate the asset on the shop floor and generate AR maintenance instructions. We evaluate the performance of MARMA in a real use-case scenario, using an automotive industrial asset provided by a collaborating manufacturer. During the evaluation procedure, manufacturer experts confirmed its contribution as an application that can effectively support maintenance engineers.

1. Introduction

The fourth industrial revolution has grown over the last two decades, making rapid changes day by day and removing silos within organizations. Business models and manufacturing processes are transformed using digital technologies and through-engineering solutions [1,2]. The factory of the future, enabled by the German initiative of Industry 4.0 (I4.0), places the human component at the center of the value chain [3]. Operators are critical elements of the smart factory, since their intelligence cannot be replaced by machines. To that end, an Operator 4.0 should be flexible and adaptive, in order not only to perform tasks in collaboration with machines, but also to solve complex unplanned problems. Moreover, since rapidly configurable smart production systems produce huge amounts of information, the workforce should use innovative technology systems that provide better visualization of task-related information.
As cyber-physical production systems materialize, human intervention in the production process is anticipated to drop off significantly. Nevertheless, the maintenance of the assets will still be performed by operators. Hence, in the era of I4.0, maintenance operators should be supported by connected tools and intelligent systems to efficiently complete their operations, without needing to know the exact location of machines or specific information regarding the configurable topology. Traditionally, maintenance operators are highly experienced and follow paper-based technical drawings to perform a maintenance activity. In advanced cases, in which the maintenance requires knowledge and workforce from the original equipment manufacturers (OEMs), the production is usually frozen until an expert visits the plant [4].
One of the technological pillars of I4.0, which can address the aforementioned traditional procedure, is augmented reality (AR) [5]. AR technologies are rapidly being introduced into a wide variety of applications within the factory of the future. About 25% of their recent implementations in I4.0 cover tasks regarding remote assistance and maintenance, with the other 75% focusing either on visualization and simulation applications or on robot programming and human–robot interaction tasks [6]. Maintenance constitutes one of the most important activities of the production life-cycle, accounting for 60–70% of its total costs. Hence, the improvement of its efficiency and performance is among the highest priorities, leading to several implementations of AR in maintenance applications [7].
As the smart manufacturing environments become reality, handheld solutions are used to connect the operators and managers with the digital factory. Some of the applications include work order management, asset tracking, inventory management and digitized maintenance procedures. In the maintenance departments, operators, on one hand, use their mobiles to easily switch tasks and rotate jobs over a day, communicate with experts and read step-by-step digitized procedures using AR architectures [8,9], as illustrated in Figure 1. On the other hand, managers can control multiple work areas and teams via one central environment.
In this paper, we propose an efficient AR system, which can locate a component in the manufacturing plant and vividly visualize the maintenance instructions of the corresponding failure modes on the operator's mobile device. A camera acts as the sensory input of the proposed AR system, recording the activity within the industrial plant. Firstly, the asset of interest is detected on the image plane, while subsequently, a robust tracking algorithm [10] is responsible for following the position and scale of the detected asset over the subsequent frames of the recording. According to the calculated position and scale values, the 3D CAD model of the specific asset is projected on the image plane in a user-friendly way. When the operator approaches the machine, the proposed system provides instructions regarding the first steps of the maintenance procedure. Then, the system is easily piloted by the user through suitable buttons that proceed to the subsequent or former maintenance steps of the asset. When the maintenance is concluded, the system notifies the user with a confirmation message. The contributions of this paper are summarized as follows:
  • A supportive system is introduced, which can be used by unskilled operators to perform maintenance operations during night shifts.
  • The platform is executed on personal mobile phones or tablets, eliminating the investment costs of expensive AR kits.
  • The system is able to locate the asset inside the complex manufacturing shop floor without human intervention.
  • The proposed solution can replace the paper-based instructions with digital ones, exploiting AR functions to limit the retrieval times.
  • Our system is anticipated to reduce the knowledge gap between the manufacturers and maintenance operators.
The proposed application was first introduced in the I4.0 NOW crowdhackathon organized by the Hellenic Federation of Enterprises. In the competition, it was selected among the best seven teams to present their prototypes at the first conference on Industry 4.0 in Greece, supported by the Hellenic Government [11]. Since then, the application has been further improved.
The remainder of this paper is structured as follows. In Section 2, we discuss representative works that focus on AR applications in the context of Industry 4.0. Subsequently, Section 3 contains a short description of the algorithms and tools utilized in the mobile augmented reality maintenance assistant (MARMA), while in Section 4, a detailed description of the proposed methodology is presented. In Section 5, the experimental procedure is described. Lastly, in Section 6, we draw conclusions and present suggestions for future work.

2. Related Work

The utilization of cutting-edge technologies from fields like computer vision (CV) [12,13] and AR [14] can considerably enhance the capacity of an intelligent system. Such algorithms can provide useful feedback within an industrial environment, focusing on both human-centric technologies, like emotion [15] and hand pose estimation [16], and environment mapping ones [17]. Apart from AR, there are also other extended reality (xR) technologies, like virtual reality (VR) and mixed reality (MR), that are used in maintenance within the factory of the future. VR improves the efficiency of training activities and reduces the cost of training on manufacturing equipment [18,19]. In the MR field, real and virtual elements are mixed, with real objects visualized in the virtual environment, adding value through location independence and reducing the costs of international processes [20]. In particular, holograms are part of mixed reality, through which remote teams share 3D holographic data to break travel barriers and improve communication by taking data-driven decisions [21].
In the last 10 years, the number of maintenance support systems in manufacturing research has increased, with the goal of reducing human errors in assembly tasks and eliminating production downtimes. According to a contemporary review study [14], AR manufacturing research focuses on four main topics, viz., assembly guidance, maintenance procedures, logistics navigation and picking instructions. The majority of the systems refer to assembly operations in the context of maintenance, while 25% of the reviewed papers are AR systems within indoor logistics [22] and picking problems [23]. Werrlich et al. [24] developed an AR system to support the engine assembly line in an automotive manufacturing environment. In the aviation industry, the impact of AR has been evaluated by applying techniques in inspection, robot programming, maintenance and process guidance [25,26]. Specifically, Ceruti et al. proposed a framework [27] to identify faulty parts with AR and reverse engineering methodologies, which scans and prints the faulty part using additive manufacturing techniques. In addition, Freddi et al. [28] applied AR techniques to the disassembly of a tailstock, to improve the efficiency of the maintenance process. On the other hand, there is interest in the development of remote maintenance and repair support, where Mourtzis et al. [29] presented a framework for real-time communication channels between shop-floor operators and maintenance experts, using AR guidance. Aiming at real-time interaction with other users, He et al. [30] presented an AR annotation tool that maps the environment and creates notes with limited actions. Arntz et al. [31] presented an indoor-navigation AR approach to support shop-floor operators in heavy industries, providing handy 3D visual instructions during evacuations. Meanwhile, Fang et al. [32] developed a scalable mobile AR application to track the pick-by-order process and provide instructions, using global marker-based marks on the factory floor. Regarding the observation of robots, Limeira et al. [33] presented an application in ROS, where VR technologies are exploited to simulate robot picking procedures in real environments.
Improvements have also occurred in collaborative picking procedures, where Sarupuri et al. [34] presented a prototype AR system to improve the successful picking rate of forklift operators by providing real-time 3D guidelines about pallet racking. Moreover, Kapinus et al. [35] designed an AR architecture that supports shop-floor employees in programming industrial robotic tasks, visualizing instructions in a 3D mobile environment that eliminates the switching time between screens and the work environment.
Retaining safety while enhancing the performance of maintenance procedures constitutes a critical factor in the development of AR applications [36]. Kim et al. [23] examined the effectiveness of operators across AR devices while performing tasks like order picking, maintenance and assembly. Furthermore, Sanna et al. [37] compared the required times and the number of errors that occurred during assembly, maintenance and picking tasks, using paper-based instructions and a handheld AR tool. For maintenance procedures on food-processing machines, Vignali et al. [38] proposed an AR framework to ensure the safety of the employees.
The majority of the aforementioned AR systems focus on supporting shop-floor operators in different manufacturing areas, using high-cost and advanced equipment. Their value is significant, but they do not offer a well-defined process describing the creation of the 3D models, the input data acquisition, as well as the asset's projection onto the application's interface that provides a feeling of naturalism to the users. Considering the state of the art, we present an AR application that can be effectively controlled by shop-floor operators to perform fast-track repair procedures within the manufacturing plant. The proposed framework is designed to be part of a ubiquitous and interactive maintenance system that can reduce the mean time to repair (MTR) and improve production availability, by reducing unexpected breakdown times, during which external OEM experts may be required. For that reason, the application can be installed on operators' mobile handhelds and supports bring-your-own-device (BYOD) policies, forming a significant part of the Industry 4.0 philosophy. As a result, MARMA is a low-cost application that replaces paper-based maintenance instructions with AR-based ones, reducing the knowledge gap between OEMs and in-house maintenance operators.

3. Tools

In this section, a short overview of the tools exploited in the proposed AR methodology is presented. More specifically, we provide descriptions and concrete explanations regarding the exploitation of the YOLO detector, the 3D modeling software, the augmented reality SDK, the 3D engine, as well as Android Studio.

3.1. Object Detector

For years, the challenge of object detection was addressed by separately using a localization and a classification algorithm [39,40]. In 2016, Redmon et al. presented the real-time YOLO method for object detection [41]. The YOLO detector is an open-source deep learning system that combines the above two methods in a single end-to-end network, providing accurate and fast detection rates for a wide variety of frame sizes. In particular, the detection challenge is handled as a regression one, by estimating a set of proposed bounding boxes along with their corresponding class probabilities. In the following years, versions of YOLO improved in terms of accuracy and speed were published. Specifically, YOLOv3 is based on YOLOv2 but also uses logistic regression to predict the objectness score of each bounding box, and is applicable to various image resolutions. Despite the fact that YOLOv2 and YOLOv3 are robust, their model sizes are huge. To address this challenge, Redmon proposed the Tiny-YOLO architecture, whose small model size and fast speed make it suitable for embedded systems [42].
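Although the paper does not specify its exact inference stack, the following minimal Python sketch, assuming the standard Darknet files (yolov3-tiny.cfg and yolov3-tiny.weights), illustrates how a Tiny-YOLOv3 pass can be run through OpenCV's DNN module and how the regression-style output is decoded:

```python
import cv2
import numpy as np

# Load the Tiny-YOLOv3 network from the standard Darknet files (assumed paths).
net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")

def detect(frame, conf_threshold=0.5):
    """Return (x, y, w, h, confidence) boxes for one BGR frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    boxes = []
    # Each output row is [cx, cy, bw, bh, objectness, class scores...].
    for out in net.forward(net.getUnconnectedOutLayersNames()):
        for det in out:
            conf = det[4] * det[5:].max()   # objectness times best class score
            if conf > conf_threshold:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append((int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh), float(conf)))
    return boxes
```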

3.2. 3D Modeling Design

The 3D modeling software Autodesk Inventor is used to transform the paper-based sketches of machine parts into digital models for 3D representation in the proposed AR application. Autodesk Inventor Pro 2019 is a software package developed for engineering needs [43]. Engineers can create digital designs of products, molds, machines, constructions or other items, which can be integrated into simulation software for modeling a real phenomenon with a set of mathematical formulas in digital environments; for instance, applying engineering mechanics analyses related to truss, beam and frame structures. In our case, Autodesk Inventor was selected for its ability to export the 3D models to the standard .obj geometry format, which was used to import the 3D models into the application environment.
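Since .obj is a plain-text format, a short parser sketch shows what the exported geometry contains; this minimal version handles only vertices and triangular faces, ignoring normals, texture coordinates and materials:

```python
def load_obj(path):
    """Parse vertices ("v x y z") and triangular faces ("f i j k") from an .obj file."""
    vertices, faces = [], []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                vertices.append(tuple(float(c) for c in parts[1:4]))
            elif parts[0] == "f":
                # Keep only the vertex index of each "v/vt/vn" token (1-indexed).
                faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces
```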

3.3. Augmented Reality Software Development Kit (SDK)

Vuforia, launched by Qualcomm, is one of the most popular SDKs for developing AR applications for a wide variety of devices. Some representative features of Vuforia are feature tracking, image recognition, object recognition, text recognition and video playback. Vuforia uses CV algorithms to recognize objects in the image frame and present 3D models or simple visual data in a real-time interface. The direction and positioning properties of the 3D models are also included in the package. The SDK uses the camera of the handheld to extract new images and a virtual display that previews the AR frame. Placing virtual 3D objects in real-world images gives operators a feeling of immersion [44]. Among the other available AR SDKs, we preferred Vuforia due to its speed in recognizing partly covered objects, its robust object tracking and its general efficiency in low-light conditions.

3.4. 3D Engine

Unity is a powerful cross-platform 3D engine developed by Unity Technologies, supporting C# scripting, 3D and 2D graphics, as well as animations. It constitutes one of the most popular engines used for AR and VR mobile applications, supporting human–machine interaction through AR development tools. Unity was selected thanks to its compatibility with the Vuforia SDK plug-in, in order to detect and track 3D objects in AR applications [44]. Unity offers pre-defined development functions to create interactive 3D content applicable in practical scenarios. In addition, Unity ensures the flexibility to export the designed application to executable files compatible with the most typical mobile operating systems, such as iOS and Android.

4. Methodology

As mentioned above, MARMA is able to estimate the position of the asset in a plant, display the 3D CAD model of the machine at the start of the maintenance procedure and let the user navigate through the proposed maintenance steps via the available previous and next buttons on his/her handheld. Firstly, a set of features is extracted for the machine of interest from various viewpoints and distances, and is stored in the system's database as a 3D target model. Then, during the inference phase, MARMA receives a frame from the handheld's camera, extracts a set of features and compares them with the features of the 3D target model stored in its database. In case a sufficient matching score is achieved, the machine is successfully detected and the extracted frame's features are assigned to the system's tracking algorithm. The subsequent captured frames are processed only by the tracking algorithm, which measures a matching score between the frame's features and the tracked ones. In case the tracking algorithm fails, the frame's features are discarded, the algorithm returns a false state and the system processes the following frame again from the feature extraction and matching step. For every successful detection or tracking step, our method computes, according to the corresponding frame's features, the position, the orientation and the distance of the machine, in order to project the corresponding CAD model from an XML file on the image plane of the device's screen. This procedure is repeated until our method receives the user's choice to change the maintenance step. In such a case, MARMA projects the next or previous model of the XML file. The maintenance procedure is terminated when the user exits the application. An outline of our proposed method is provided in the flowchart depicted in Figure 2; a high-level sketch of this loop is given below, followed by a detailed description of the individual components of MARMA.
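The following high-level Python sketch summarizes the loop of Figure 2; the helper functions (detect_asset, track_asset, estimate_pose, render_step and read_user_input) are hypothetical placeholders for the components described in Sections 4.2–4.5, not actual MARMA functions:

```python
def marma_loop(camera, target_model, steps):
    """Detect-track-project loop sketched from Figure 2 (hypothetical helpers)."""
    step, tracked = 0, None
    while step is not None:                      # None signals that the user exited
        frame = camera.read()
        if tracked is None:
            # Full detection: feature extraction and matching against the target.
            tracked = detect_asset(frame, target_model)
        else:
            # Lightweight tracking of the previously matched features;
            # returns None on failure, forcing re-detection on the next frame.
            tracked = track_asset(frame, tracked)
        if tracked is not None:
            pose = estimate_pose(tracked, target_model)  # R, t and scale
            render_step(frame, steps[step], pose)        # project the step's CAD model
        step = read_user_input(step)             # next/previous button, or None on exit
```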

4.1. 3D Model Design Philosophy

In maintenance operations, 3D objects are used to preview the relevant steps of the maintenance task in a clear and comprehensible way. During the creation of a 3D object, it is crucial to determine the terms of use and the required design resolution, since the models rarely require highly detailed designs. In addition, keeping designs simple ensures a limited object size, which, in turn, allows the exploitation of more 3D objects in our methodology. As a general rule, the file size depends on the number of polygons, animations, materials and textures that provide a sense of realism to the user. The major factor taken into consideration for the creation of the 3D objects is MARMA's capability of operating smoothly on most mobile devices with decent processing power.

4.2. 3D Target Model

The purpose of this component is to create a 3D target model, using a descriptive and robust feature extraction technique. Hence, the Vuforia Object Scanner is employed to scan the machine from different viewpoints and distances and to extract the salient features using the FAST corner detection pipeline for each frame of the scanning process. Then, their corresponding description vectors are computed by exploiting the speeded-up robust features (SURF) algorithm [45], responsible for storing the surrounding characteristics of the selected features. The whole scanning procedure is executed under medium brightness without direct lighting, against a noise-free background, and it lasts until sufficient features have been extracted for every possible view of the machine. Subsequently, the application's quality is enhanced by scanning additional salient points under intense lighting, increasing the detection ability in various environmental conditions. Due to the adverse conditions that industrial environments display, the scanning should also include conditions with shadows and occlusions, while the overall procedure should be free of any reference point. At the end of the 3D target model creation phase, the successful completion of the machine's pattern is verified using the device's camera and the Vuforia application.
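The scanner's internals are proprietary, but the described FAST-plus-SURF step can be reproduced with OpenCV to illustrate the kind of features involved; this is a sketch assuming the non-free opencv-contrib build (which ships SURF) is installed:

```python
import cv2

# FAST supplies salient corners; SURF describes their surroundings (64-D vectors).
fast = cv2.FastFeatureDetector_create(threshold=25)
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

def extract_features(image_bgr):
    """Return FAST keypoints with SURF descriptors for one scan frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    keypoints = fast.detect(gray, None)
    keypoints, descriptors = surf.compute(gray, keypoints)
    return keypoints, descriptors
```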

4.3. Feature Matching and Tracking

Between the two available tracking techniques that Vuforia provides, the natural feature tracking (NFT) [46] method is employed. Specifically, NFT is an image- or model-based technique, which detects and tracks the natural features previously extracted from the target model itself, as described in Section 4.2. Similarly, the frame's feature detection procedure is achieved with the FAST detector, while the corresponding description vectors can arise from different approaches, like SIFT and SURF. In order to keep consistency between the tracking and 3D target model creation phases, we exploit SURF features, which lead to a more lightweight pipeline and have proven efficacy in different tasks as well [47]. During the real-time process, the system detects the region of interest within the camera's frame using the Tiny-YOLOv3 detector and crops the captured image. Then, it computes the SURF features of the cropped image and compares their descriptors against the target's ones. In case of successful matching, the system keeps tracking the camera's features in the subsequent frames, using the Vuforia SDK tracking framework.
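The matching inside the Vuforia SDK is not exposed; as a hypothetical continuation of the sketch above, the following step crops the frame with the Tiny-YOLOv3 box, describes it with SURF and applies Lowe's ratio test against the stored target descriptors (the 0.7 ratio is an assumed, typical value):

```python
import cv2

matcher = cv2.BFMatcher(cv2.NORM_L2)   # brute-force L2 matcher for SURF vectors

def match_to_target(frame_bgr, bbox, target_descriptors, ratio=0.7):
    """Crop the detected region and match its SURF descriptors to the target's."""
    x, y, w, h = bbox                                  # Tiny-YOLOv3 detection box
    keypoints, descriptors = extract_features(frame_bgr[y:y + h, x:x + w])
    if descriptors is None:
        return []
    pairs = matcher.knnMatch(descriptors, target_descriptors, k=2)
    # Lowe's ratio test: keep a match only if it clearly beats the runner-up.
    return [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
```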

4.4. Orientation and Scale

Once the 3D model is detected, the corresponding CAD model has to be suitably projected on the image plane of the mobile screen. To that end, the model's actual pose has to be estimated, including both orientation and scale computation. This is achieved through the orientation of the detected features using gradients, which are subsequently compared against the gradients of the corresponding features in the target model. Moreover, the distance between neighboring features contributes to the scale calculation. The above procedure leads to an associated rotation matrix R and a translation vector t capable of projecting every point of the 3D CAD model onto the mobile screen, according to:
$$ x_c = [R \,|\, t] \times X = \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, $$
where $x_c$ is the projection in image coordinates, $X$ the point's real-world homogeneous coordinates and $[R \,|\, t]$ the pose matrix. In order to align the camera's pixels with the projected coordinates, the camera's intrinsic matrix is exploited:
$$ K = \begin{bmatrix} f & \gamma & p_x & 0 \\ 0 & f & p_y & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}, $$
where $f$ represents the focal length and $\gamma$ the skew factor between the $x$ and $y$ axes, which equals zero. Hence, taking into consideration the calibration matrix $K$, the entire transformation matrix becomes:
$$ M = K \times T = \begin{bmatrix} f & \gamma & p_x & 0 \\ 0 & f & p_y & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}, $$
with $T$ being the pose matrix in homogeneous form. Eventually, given a point with real-world coordinates $X_i$, its equivalent coordinates $X_f$ on the image plane of the camera are:
$$ X_f = M \times X_i . $$
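To make the chain of transformations concrete, the following NumPy sketch applies the above equations end to end; the intrinsic values and the pose are illustrative assumptions, not calibrated parameters of the actual system:

```python
import numpy as np

f, px, py = 800.0, 320.0, 240.0               # assumed focal length and principal point
K = np.array([[f, 0.0, px, 0.0],
              [0.0, f, py, 0.0],
              [0.0, 0.0, 1.0, 0.0]])          # 3x4 intrinsic matrix (gamma = 0)

R = np.eye(3)                                 # rotation estimated by the tracker
t = np.array([0.05, 0.0, 0.60])               # translation in metres (assumed)
T = np.vstack([np.hstack([R, t[:, None]]),
               [0.0, 0.0, 0.0, 1.0]])         # 4x4 homogeneous pose matrix

M = K @ T                                     # full transformation matrix

X_i = np.array([0.01, 0.02, 0.0, 1.0])        # one CAD-model point, homogeneous
X_f = M @ X_i
u, v = X_f[:2] / X_f[2]                       # normalise by the homogeneous coordinate
print(f"pixel coordinates: ({u:.1f}, {v:.1f})")
```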

4.5. 3D Visualization

As the maintenance procedure of a machine consists of various steps, a set of 3D CAD models needs to be created, which should also be stored in a central local file. Within the MARMA methodology, the 3D objects and the maintenance instructions are saved in an XML file that forms the basis for the visualization of the entire maintenance assembly procedure. Based on the operator's choice, the system reads the XML file and presents the corresponding 3D object on the smartphone's display. The 3D object is visualized according to the pose estimation described in Section 4.4.
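As an illustration of this component, the sketch below reads per-step content from such a central file; the schema (a steps.xml with step, model and instruction elements) is a hypothetical stand-in, since the paper does not publish the actual format:

```python
import xml.etree.ElementTree as ET

def load_steps(path="steps.xml"):
    """Parse the hypothetical maintenance-step file into a list of dictionaries."""
    steps = []
    for step in ET.parse(path).getroot().findall("step"):
        steps.append({
            "id": int(step.get("id")),
            "model": step.findtext("model"),             # path to the phase's .obj model
            "instruction": step.findtext("instruction"), # text shown next to the object
        })
    return sorted(steps, key=lambda s: s["id"])
```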

4.6. User Interface

The visualization of the AR maintenance procedure can be realized using head-mounted displays, AR glasses, smartphones, tablets and PCs. MARMA is based on the BYOD policy, as the majority of engineers own smartphones that can load the application. In comparison with AR glasses, smartphone users know how to use their devices without extra training. A significant function of our interface is the simple interaction between the user and the smartphone. Firstly, the operator is asked to start the maintenance or exit the application. If the user decides to start the procedure, the application loads the camera frames, searching for the machine's predefined patterns. When the machine is detected, the 3D object from the XML file is loaded to demonstrate the augmented information to the user. Afterwards, the operator can choose among three simple buttons and review the progress through a suitable bar, as shown in Figure 3. The progress bar is located in the left corner of the screen and contains information about the maintenance steps that the user has completed. On the other corner of the screen, the operator can navigate through the maintenance steps; if the next arrow is pressed, the 3D object regarding the next maintenance step is loaded. Aiming to keep the content simple and clear, we have included the maintenance instructions in the XML file and preview them alongside the 3D objects. Taking into consideration that users are familiar with mobile apps, the progress bar uses a wrench to indicate progress; this minimizes the space the progress bar occupies, and the operator can be informed about the reassembly instructions by navigating from the end to the start. In addition, the available explosion button helps operators understand the structure of the machine in an elegant way, by separating the individual elements of the machine. A minimal sketch of this navigation logic is given below.
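The sketch models only the next/previous/explosion controls and the progress value that drives the wrench bar; the class and its method names are illustrative, not part of the actual implementation:

```python
class MaintenanceNavigator:
    """Tracks the current phase, the exploded-view flag and the progress value."""

    def __init__(self, n_steps):
        self.n_steps = n_steps
        self.current = 0          # index of the phase being displayed
        self.exploded = False     # state of the explosion button

    def next(self):
        self.current = min(self.current + 1, self.n_steps - 1)

    def previous(self):           # also used to review reassembly, end to start
        self.current = max(self.current - 1, 0)

    def toggle_explosion(self):
        self.exploded = not self.exploded

    def progress(self):
        return (self.current + 1) / self.n_steps   # drives the wrench progress bar
```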

4.7. Unity

The entire AR application is developed using the Unity3D software [48]. More specifically, Unity3D is exploited to import the database, including the XML file with the 3D CAD models, the 3D target model, the tracking pipeline with the Vuforia SDK, as well as the proposed user interface. In addition, it provides tools and methods for connecting the individual components and, finally, exporting the whole MARMA system to a suitable mobile application format.

5. Experimental Process

The performance of the proposed methodology is tested in a realistic use-case scenario on a compressor provided by a collaborating manufacturer. The current maintenance procedure requires an operator who reads the step-by-step instructions from a paper book and performs the whole procedure. The collaborating manufacturer recognized the capabilities of the MARMA approach and decided to apply it to a compressor. Finally, a real-time demonstration of the maintenance process with the MARMA application was presented to the operators.

5.1. Maintenance Scenario Setup

In our case study, the investigated asset is an A/C compressor, which is commonly included in the air-conditioning system of a car, as shown in Figure 4. It generates the power responsible for channeling the freon into the condenser, which transforms the refrigerant gas into liquid. During the maintenance process of the system, mechanics have to clean the region of the valve plate. The scenario setup contains the creation of the compressor's target model through the Vuforia Object Scanner described in Section 4.2, scanning the compressor in various lighting conditions. The next step refers to the design of the 3D CAD models required for the development of the specific maintenance scenario. To support the aforementioned process with AR techniques, the compressor parts should be designed as 3D CAD models. As mentioned in Section 3.2, the Autodesk Inventor software is exploited, since it provides the suitable tools for designing mechanical parts, as well as the capability of exporting them to the .obj format. Thanks to the low level of detail and the uncomplicated structure, designing the 3D models of the compressor took around 35 h. Both Figure 5 and Figure 6 demonstrate 3D exploded views of the compressor that illustrate its designed assembly parts.
After the selection of the compressor, the exact maintenance procedure to be visualized via the proposed application was defined by the manufacturer. To that end, a set of actions to be displayed on the operator's mobile device was specified. Positioning the object in the mobile application required around 25 h. In particular, the maintenance is described by the following five phases:
  • Phase 1: Remove the bolts
  • Phase 2: Remove the front face
  • Phase 3: Unscrew the sheet metal
  • Phase 4: Replace the gasket
  • Phase 5: Clean the metal flange
The traditional maintenance procedure is visualized through paper-based technical drawings, as shown in Figure 5. In addition, the compressor's manual includes step-by-step instructions above the technical drawing. Hence, during maintenance, significant time is spent on understanding the paper-based instructions, depending on the operator's level of expertise [49]. Our system uses AR to close the time gap between highly experienced maintenance managers and less experienced operators.

5.2. Demonstration

The developed application was executed on an Android smartphone with a 1.8 GHz quad-core CPU and 4 GB of RAM. For each phase, the application loads a short description and the corresponding 3D object to guide the user during the disassembly. At the beginning, the user chooses to start the maintenance, while the application loads the XML file, which includes information regarding the subsequent phases. After that, the application tries to locate the compressor's features in the handheld's frame, in order to visualize the first phase. To complete phase 1, the user has to unscrew and remove the five bolts from the front face (Figure 7a). The next phase includes the removal of the front face from the correct position (Figure 7b). After the disassembly of the external surface, the user is guided to unscrew the sheet metal (Figure 7c). Finally, the application visualizes the removal of the gasket and provides a warning message to clean the metal flange (Figure 7d). At the end of the replacement, the user can navigate back through the assembly tasks by using the back button. At any time, the user can choose the explosion option to dynamically inspect the correlation and the assemblage of the different parts (Figure 6).
During the final demonstration of MARMA at the I4.0 NOW crowdhackathon, more than 20 manufacturing experts assessed the maturity level and the usability of the application in industry. MARMA generally received positive scores, confirming its contribution as an application that can effectively support maintenance engineers, since the procedure can be completed without the need for high-skilled operators. The demonstration of the MARMA application to the interested manufacturers showed their willingness to adopt it and the potential of the system in simplifying complex maintenance procedures. Experts mentioned that MARMA can reduce the total repair time of the compressor by 30%, compared with both the paper-based and digital procedures. In addition, particular interest was paid to exploiting the application as a means of training new and unskilled maintenance operators, reducing both the required training time and the total repair time. Eventually, the integration of exponential digital technologies, like MARMA, in small–medium enterprises can strengthen their industrial competitiveness in the global landscape.
The framework of MARMA can be applied not only in manufacturing environments, but also in completely different sectors, like accident investigation [50], infrastructure [51] and education [52]. Generally, MARMA enhances the user's field of view with real-time AR-based digital information, reducing the time required to understand the procedures.
If MARMA is to be applied to a production line, which is the most significant asset of a factory, it is crucial to first perform a cost–benefit analysis [53]. The cost–benefit analysis of an AR maintenance application correlates the breakdown frequency, the cost of spare parts, the fault repair time and the external collaborator costs per machine. Based on the findings, the first machines on which to implement the MARMA framework can be chosen, ensuring valuable returns. Then, a procedure similar to the one presented in Section 5.1 has to be followed. The validity of the application depends on the probability of unexpected maintenance steps: if the maintenance process is standardized by the OEM, then MARMA will work as intended; otherwise, the success of the process relies on the experience of the operator, who will need to perform additional unplanned steps. In the implementation phase, handling the complexity of the maintenance procedure is not a time-consuming task compared with securing the availability of the 3D digital models: if the OEM provides the 3D models or the manufacturer owns the digital files, the implementation time is significantly shorter than designing them from scratch.
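As a toy illustration of such a ranking, the sketch below scores machines by expected annual repair cost from the factors named above; the formula and numbers are illustrative assumptions, not a validated cost model:

```python
def annual_repair_cost(breakdowns_per_year, repair_hours, downtime_cost_per_hour,
                       spare_parts_cost, external_expert_cost=0.0):
    """Expected yearly repair cost for one machine (illustrative model)."""
    per_event = (repair_hours * downtime_cost_per_hour
                 + spare_parts_cost + external_expert_cost)
    return breakdowns_per_year * per_event

# Rank candidate machines: the costliest ones are implemented first.
machines = {
    "press": annual_repair_cost(6, 4.0, 250.0, 800.0, 1200.0),
    "conveyor": annual_repair_cost(12, 1.5, 250.0, 150.0),
}
for name, cost in sorted(machines.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {cost:.0f} EUR/year")
```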
MARMA reduces the knowledge gap between the original equipment manufacturers and maintenance operators, with its development pipeline and usage described in detail. Considering the current applications, the system is designed in a user-friendly way, to be effectively controlled by shop-floor operators in fast-track repair procedures, and it is capable of being correlated with a ubiquitous maintenance system that can reduce the mean time to repair (MTR) and improve production availability. To the best of our knowledge, this is the first attempt to fully and concretely present an end-to-end AR-assisted maintenance system, by describing all of its individual parts, as well as the way that those cooperate and contribute to the final system.

6. Conclusions and Future Work

The paper at hand proposes an augmented reality maintenance system for fast-track repair procedures in the context of Industry 4.0. The asset of interest is detected through a simple smartphone camera, using feature matching and tracking techniques, as well as 3D modeling. When the detection is achieved, AR technologies are employed to deliver detailed instructions to the maintenance operator's device in a natural and comprehensible way. During the development, the design principles of virtualization, modularity and service orientation have been followed, providing a customized digitized maintenance service, which can be adapted to various manufacturing maintenance scenarios. MARMA was tested on an A/C compressor of the automotive industry, indicating that the 3D visualization of maintenance instructions improves the operators' user experience during maintenance.
Future work regarding the application will focus on the expansion of its provided features, supporting the integration of MARMA with a cloud-based server, where the operator can communicate with experts or load more detailed visualized instructions on the handheld. Moreover, 3D reconstruction, by scanning the object during real-time operation, is anticipated to provide the managers with the ability to create customized maintenance procedures. In the context of inter-connectivity, MARMA can be integrated with predictive alarm systems to perform maintenance operations at the right time. Such a predictive alarm system will be composed of three modules: predictive maintenance, a decision support system and MARMA. The predictive maintenance module will be responsible for monitoring the operational state of a machine within the industrial environment; the decision support system will decide on the expected maintenance scenario, while the maintenance steps will be visualized by MARMA. Beyond these developments, further testing scenarios will be considered, including different operators with various levels of expertise.

Author Contributions

F.K.K. proposed the conceptualization, designed the architecture, carried out the investigation, and prepared and wrote the manuscript; I.K. and N.S. gathered the data-sets, implemented the software and performed the experimental process; S.G.M. set the requirements of the system and accepted the architecture; A.G. supervised the research, accepted the manuscript and carried out the final proof-reading. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partly funded by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH-CREATE-INNOVATE (grant number T1EDK-02433).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Konstantinidis, F.K.; Gasteratos, A.; Mouroutsos, S.G. Vision-Based Product Tracking Method for Cyber-Physical Production Systems in Industry 4.0. In Proceedings of the 2018 IEEE International Conference on Imaging Systems and Techniques (IST), Krakow, Poland, 16–18 October 2018; pp. 1–6. [Google Scholar]
  2. Benakis, M.; Du, C.; Patran, A.; French, R. Welding Process Monitoring Applications and Industry 4.0. In Proceedings of the 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE), Vancouver, BC, Canada, 22–26 August 2019; pp. 1755–1760. [Google Scholar]
  3. Ruppert, T.; Jaskó, S.; Holczinger, T.; Abonyi, J. Enabling technologies for operator 4.0: A survey. Appl. Sci. 2018, 8, 1650. [Google Scholar] [CrossRef] [Green Version]
  4. Wlazlak, P.; Säfsten, K.; Hilletofth, P. Original equipment manufacturer (OEM)-supplier integration to prepare for production ramp-up. J. Manuf. Technol. Manag. 2019, 30, 506–530. [Google Scholar] [CrossRef]
  5. Gallala, A.; Hichri, B.; Plapper, P. Survey: The Evolution of the Usage of Augmented Reality in Industry 4.0. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Bangkok, Thailand, 17–19 May 2019; pp. 12–17. [Google Scholar]
  6. Mourtzis, D.; Vlachou, E.; Milas, N.; Xanthopoulos, N. A cloud-based approach for maintenance of machine tools and equipment based on shop-floor monitoring. Procedia CIRP 2016, 41, 655–660. [Google Scholar] [CrossRef] [Green Version]
  7. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput. Integr. Manuf. 2018, 49, 215–228. [Google Scholar] [CrossRef] [Green Version]
  8. Spranger, J.; Buzatoiu, R.; Polydoros, A.; Nalpantidis, L.; Boukas, E. Human-Machine Interface for Remote Training of Robot Tasks. arXiv 2018, arXiv:1809.09558. [Google Scholar]
  9. Muzzamil, F.; Syafrida, R.; Permana, H. The effects of smartphones on social skills in Industry 4.0. In Teacher Education and Professional Development In Industry 4.0: Proceedings of the 4th International Conference on Teacher Education and Professional Development (InCoTEPD 2019), Yogyakarta, Indonesia, 13–14 November 2019; CRC Press: Boca Raton, FL, USA, 2020; p. 72. [Google Scholar]
  10. Simonetti Ibañez, A.; Paredes Figueras, J. Vuforia v1.5 SDK: Analysis and Evaluation of Capabilities. Master’s Thesis, Universitat Politècnica de Catalunya, Barcelona, Spain, 2013. [Google Scholar]
  11. SEV. Industry 4.0: A Growth Opportunity Greece should not Miss by Hellenic Federation of Enterprises. 2019. Available online: https://en.sev.org.gr/events/industry-4-0-a-growth-opportunity-greece-should-not-miss-18-19-december-2019/ (accessed on 20 December 2020).
  12. Pérez, L.; Rodríguez, Í.; Rodríguez, N.; Usamentiaga, R.; García, D.F. Robot guidance using machine vision techniques in industrial environments: A comparative review. Sensors 2016, 16, 335. [Google Scholar] [CrossRef] [PubMed]
  13. Tsintotas, K.A.; Giannis, P.; Bampis, L.; Gasteratos, A. Appearance-Based Loop Closure Detection with Scale-Restrictive Visual Features. In ICVS 2019: Computer Vision Systems; Springer: Cham, Switzerland, 2019; pp. 75–87. [Google Scholar]
  14. Bottani, E.; Vignali, G. Augmented reality technology in the manufacturing industry: A review of the last decade. IISE Trans. 2019, 51, 284–310. [Google Scholar] [CrossRef] [Green Version]
  15. Kansizoglou, I.; Bampis, L.; Gasteratos, A. An Active Learning Paradigm for Online Audio-Visual Emotion Recognition. IEEE Trans. Affect. Comput. 2019. [Google Scholar] [CrossRef]
  16. Santavas, N.; Kansizoglou, I.; Bampis, L.; Karakasis, E.; Gasteratos, A. Attention! A Lightweight 2D Hand Pose Estimation Approach. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
  17. Balaska, V.; Bampis, L.; Boudourides, M.; Gasteratos, A. Unsupervised semantic clustering and localization for mobile robotics tasks. Robot. Auton. Syst. 2020, 131, 103567. [Google Scholar] [CrossRef]
  18. Velosa, J.D.; Cobo, L.; Castillo, F.; Castillo, C. Methodological proposal for use of Virtual Reality VR and Augmented Reality AR in the formation of professional skills in industrial maintenance and industrial safety. In Online Engineering & Internet of Things; Springer: Cham, Switzerland, 2018; pp. 987–1000. [Google Scholar]
  19. De Amicis, R.; Ceruti, A.; Francia, D.; Frizziero, L.; Simões, B. Augmented Reality for virtual user manual. Int. J. Interact. Des. Manuf. (IJIDeM) 2018, 12, 689–697. [Google Scholar] [CrossRef]
  20. Moser, T.; Hohlagschwandtner, M.; Kormann-Hainzl, G.; Pölzlbauer, S.; Wolfartsberger, J. Mixed Reality Applications in Industry: Challenges and Research Areas. In SWQD 2019: Software Quality: The Complexity and Challenges of Software Engineering and Software Quality in the Cloud; Springer: Cham, Switzerland, 2019; pp. 95–105. [Google Scholar]
  21. Aleksy, M.; Troost, M.; Scheinhardt, F.; Zank, G.T. Utilizing hololens to support industrial service processes. In Proceedings of the 2018 IEEE 32nd International Conference on Advanced Information Networking and Applications (AINA), Krakow, Poland, 16–18 May 2018; pp. 143–148. [Google Scholar]
  22. Pierdicca, R.; Prist, M.; Monteriù, A.; Frontoni, E.; Ciarapica, F.; Bevilacqua, M.; Mazzuto, G. Augmented Reality Smart Glasses in the Workplace: Safety and Security in the Fourth Industrial Revolution Era. In AVR 2020: Augmented Reality, Virtual Reality, and Computer Graphics; Springer: Cham, Switzerland, 2020; pp. 231–247. [Google Scholar]
  23. Kim, S.; Nussbaum, M.A.; Gabbard, J.L. Influences of augmented reality head-worn display type and user interface design on performance and usability in simulated warehouse order picking. Appl. Ergon. 2019, 74, 186–193. [Google Scholar] [CrossRef] [PubMed]
  24. Werrlich, S.; Nitsche, K.; Notni, G. Demand analysis for an augmented reality based assembly training. In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, Island of Rhodes, Greece, 21–23 June 2017; pp. 416–422. [Google Scholar]
  25. Eschen, H.; Kötter, T.; Rodeck, R.; Harnisch, M.; Schüppstuhl, T. Augmented and virtual reality for inspection and maintenance processes in the aviation industry. Procedia Manuf. 2018, 19, 156–163. [Google Scholar] [CrossRef]
  26. Li, S.; Zheng, P.; Zheng, L. An AR-Assisted Deep Learning Based Approach for Automatic Inspection of Aviation Connectors. IEEE Trans. Ind. Inform. 2020, 17, 1721–1731. [Google Scholar] [CrossRef]
  27. Ceruti, A.; Marzocca, P.; Liverani, A.; Bil, C. Maintenance in aeronautics in an Industry 4.0 context: The role of Augmented Reality and Additive Manufacturing. J. Comput. Des. Eng. 2019, 6, 516–526. [Google Scholar] [CrossRef]
  28. Freddi, M.; Frizziero, L. Design for Disassembly and Augmented Reality Applied to a Tailstock. Actuators 2020, 9, 102. [Google Scholar] [CrossRef]
  29. Mourtzis, D.; Siatras, V.; Angelopoulos, J. Real-Time Remote Maintenance Support Based on Augmented Reality (AR). Appl. Sci. 2020, 10, 1855. [Google Scholar] [CrossRef] [Green Version]
  30. He, F.; Ong, S.K.; Nee, A.Y. A Mobile Solution for Augmenting a Manufacturing Environment with User-Generated Annotations. Information 2019, 10, 60. [Google Scholar] [CrossRef] [Green Version]
  31. Arntz, A.; Keßler, D.; Borgert, N.; Zengeler, N.; Jansen, M.; Handmann, U.; Eimler, S.C. Navigating a Heavy Industry Environment Using Augmented Reality-A Comparison of Two Indoor Navigation Designs. In HCII 2020: Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications; Springer: Cham, Switzerland, 2020; pp. 3–18. [Google Scholar]
  32. Fang, W.; Zheng, S.; Liu, Z. A Scalable and Long-Term Wearable Augmented Reality System for Order Picking. In Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Beijing, China, 10–18 October 2019; pp. 4–7. [Google Scholar]
  33. Limeira, M.; Piardi, L.; Kalempa, V.C.; Schneider, A.; Leitão, P. Augmented Reality System for Multi-robot Experimentation in Warehouse Logistics. In Proceedings of the Iberian Robotics Conference, Porto, Portugal, 20–22 November 2019; pp. 319–330. [Google Scholar]
  34. Sarupuri, B.; Lee, G.A.; Billinghurst, M. An augmented reality guide for assisting forklift operation. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, Mexico, 19–23 September 2016; pp. 59–60. [Google Scholar]
  35. Kapinus, M.; Beran, V.; Materna, Z.; Bambušek, D. Spatially Situated End-User Robot Programming in Augmented Reality. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–8. [Google Scholar]
  36. Zacharaki, A.; Kostavelis, I.; Gasteratos, A.; Dokas, I. Safety bounds in human robot interaction: A survey. Saf. Sci. 2020, 127, 104667. [Google Scholar] [CrossRef]
  37. Sanna, A.; Manuri, F.; Lamberti, F.; Paravati, G.; Pezzolla, P. Using handheld devices to support augmented reality-based maintenance and assembly tasks. In Proceedings of the 2015 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 9–12 January 2015; pp. 178–179. [Google Scholar]
  38. Vignali, G.; Bertolini, M.; Bottani, E.; Di Donato, L.; Ferraro, A.; Longo, F. Design and testing of an augmented reality solution to enhance operator safety in the food industry. Int. J. Food Eng. 2017, 14. [Google Scholar] [CrossRef]
  39. Kyriakoulis, N.; Gasteratos, A. Color-based monocular visuoinertial 3-D pose estimation of a volant robot. IEEE Trans. Instrum. Meas. 2010, 59, 2706–2715. [Google Scholar] [CrossRef]
  40. Metta, G.; Gasteratos, A.; Sandini, G. Learning to track colored objects with log-polar vision. Mechatronics 2004, 14, 989–1006. [Google Scholar] [CrossRef] [Green Version]
  41. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  42. Womg, A.; Shafiee, M.J.; Li, F.; Chwyl, B. Tiny SSD: A tiny single-shot detection deep convolutional neural network for real-time embedded object detection. In Proceedings of the 2018 15th Conference on Computer and Robot Vision (CRV), Toronto, ON, Canada, 9–11 May 2018; pp. 95–101. [Google Scholar]
  43. Bethune, J.D. Engineering Design Graphics with Autodesk Inventor 2020; Macromedia Press: San Francisco, CA, USA, 2019. [Google Scholar]
  44. Linowes, J.; Babilinski, K. Augmented Reality for Developers: Build Practical Augmented Reality Applications with Unity, ARCore, ARKit, and Vuforia; Packt Publishing Ltd.: Mumbai, India, 2017. [Google Scholar]
  45. Bay, H.; Tuytelaars, T.; Van Gool, L. Surf: Speeded up robust features. In Proceedings of the European Conference on Computer Vision, Graz, Austria, 7–13 May 2006; pp. 404–417. [Google Scholar]
  46. Wagner, D.; Reitmayr, G.; Mulloni, A.; Drummond, T.; Schmalstieg, D. Pose tracking from natural features on mobile phones. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, 15–18 September 2008; pp. 125–134. [Google Scholar]
  47. Tsintotas, K.A.; Bampis, L.; Gasteratos, A. DOSeqSLAM: Dynamic on-line sequence based loop closure detection algorithm for SLAM. In Proceedings of the 2018 IEEE International Conference on Imaging Systems and Techniques (IST), Krakow, Poland, 16–18 October 2018; pp. 1–6. [Google Scholar]
  48. Wang, S.; Mao, Z.; Zeng, C.; Gong, H.; Li, S.; Chen, B. A new method of virtual reality based on Unity3D. In Proceedings of the 2010 18th international conference on Geoinformatics, Beijing, China, 18–20 June 2010; pp. 1–5. [Google Scholar]
  49. Park, Y.H. Human Cognitive Task Distribution Model for Maintenance Support System of a Nuclear Power Plant. Master’s Thesis, Korea Advanced Institute of Science and Technology, Daejeon, Korea, 2007. [Google Scholar]
  50. D’Anniballe, A.; Silva, J.; Marzocca, P.; Ceruti, A. The role of augmented reality in air accident investigation and practitioner training. Reliab. Eng. Syst. Saf. 2020, 204, 107149. [Google Scholar] [CrossRef]
  51. Michalis, P.; Konstantinidis, F.; Valyrakis, M. The road towards Civil Infrastructure 4.0 for proactive asset management of critical infrastructure systems. In Proceedings of the 2nd International Conference on Natural Hazards & Infrastructure (ICONHIC), Chania, Greece, 23–26 June 2019; pp. 23–26. [Google Scholar]
  52. Chen, P.; Liu, X.; Cheng, W.; Huang, R. A review of using Augmented Reality in Education from 2011 to 2016. In Innovations in Smart Learning; Springer: Singapore, 2017; pp. 13–18. [Google Scholar]
  53. Tirkel, I.; Rabinowitz, G. Modeling cost benefit analysis of inspection in a production line. Int. J. Prod. Econ. 2014, 147, 38–45. [Google Scholar] [CrossRef]
Figure 1. In Industry 4.0, maintenance operators are supported by augmented reality (AR)-enabled handhelds.
Figure 2. Flowchart of the proposed augmented reality method for fast track maintenance procedures.
Figure 3. Main maintenance interface that includes the layout of the buttons and the progress bar.
Figure 4. The investigated compressor and its location within the topology of an air-conditioning system of a car.
Figure 5. Traditional paper-based technical drawing, presenting the correlations among the assembly parts of the compressor.
Figure 6. Exploded view diagram generated by MARMA. The illustration contains the designed subparts of the compressor in the explosion option.
Figure 7. Demonstration of the maintenance steps of an A/C compressor using MARMA in a mobile device.
