Article

An Augmented Reality Symbiosis Software Tool for Sustainable Logistics Activities

by Vasileios Sidiropoulos 1,2,*, Dimitrios Bechtsis 1,2,3 and Dimitrios Vlachos 1,3
1 Department of Mechanical Engineering, Division of Industrial Management, Polytechnical School, Aristotle University of Thessaloniki, 54124 Thessaloniki, Greece
2 Department of Industrial Engineering and Management, School of Engineering, International Hellenic University, Thermi, 57001 Thessaloniki, Greece
3 Center for Interdisciplinary Research and Innovation (CIRI-AUTH), Balkan Center, Thermi, 57001 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(19), 10929; https://0-doi-org.brum.beds.ac.uk/10.3390/su131910929
Submission received: 31 August 2021 / Revised: 24 September 2021 / Accepted: 27 September 2021 / Published: 30 September 2021
(This article belongs to the Special Issue Sustainable Management and Application of E-Logistics)

Abstract

Augmented Reality (AR) is an emerging technology in the Industry 4.0 and Logistics 4.0 contexts with an important role in man–machine symbiosis scenarios. Practitioners, although already acquainted with AR technology, are reluctant to adopt AR applications in industrial operations. This stems from the fact that a direct connection to the management of sustainability goals is missing. Moreover, such a connection with economic, social, and environmental sustainability parameters sparsely appears in the AR literature. The proposed research, on one hand, presents an innovative architecture for a stable and scalable AR application that extends state-of-the-art solutions and, on the other hand, attempts to study AR technology within the framework of a sustainable business strategy. The developed system utilizes the Robot Operating System (ROS) alongside an AR mobile application to present an employee navigation scenario in warehouses and production lines. ROS is responsible for mapping the industrial facility, while the AR mobile application identifies the surrounding environment and, together with a Real-Time Location System, localizes employees in the facility. Finally, ROS identifies the shortest path between the employee and the destination point, while the AR mobile application presents the virtual path for reaching the destination.

1. Introduction

Industrial environments are demanding and full of challenges that could be addressed by digital technologies. Digital transformation is considered a fundamental process in the Industry 4.0 and Logistics 4.0 contexts and promotes the adoption of Information and Communication Technology tools. The ever-increasing complexity of industrial facilities requires highly specialized personnel and continuous training, while at the same time operational activities are time-critical processes that should be handled directly by employees in the production line. Augmented Reality (AR) is an emerging technology that redefines human–machine and human–human interactions by utilizing new hardware, such as smart glasses, or by the alternate use of commonly used devices, such as smartphones [1]. AR applications are embraced by the industry at a rapid pace, and most enterprises that are considering AR solutions are planning the transition of their AR pilots to the production phase within the next 2 years. The adoption of AR will strengthen personnel training, will improve and shorten the execution of manufacturing activities, and will further support the design of innovative processes. In recent research regarding job-specific training, 6% of the respondents utilize AR to overlay predefined instructions, while 3% use it for expert coaching and knowledge transfer, which projects instructions given by an expert. On the manufacturing side, instruction manuals on how to perform processes, operate machinery, assemble parts, and perform maintenance activities are used by 6% of the respondents. Finally, 9% stated that they used AR in the design phase of their products for review purposes [2]. In logistics and warehouses, AR already offers solutions for researchers and practitioners in multiple areas, such as picking, shipping, and inventory planning. In these scenarios, the proper implementation of AR can significantly reduce labor costs and improve overall efficiency [3]. Nonetheless, AR technology is still in its infancy, and more research needs to be conducted to create stable, scalable, and user-friendly applications in complex industrial and logistics environments. A holistic approach that tackles the limitations and considers the economic, social, and environmental impact could accelerate the adoption of Industry 4.0 and Logistics 4.0 AR applications.
For interfacing and communicating with real-world environments, AR applications use image processing techniques. These enable them to identify the objects of the real-world environment and then interoperate by using a virtual environment that acts as a digital twin. Marker-based AR is a common technique that uses markers to determine the AR objects’ positions, but it struggles in harsh industrial environments as it often loses track of the markers. The use of multiple markers is an alternative for tackling the issue, but it increases the system’s complexity and often leads to unsatisfactory results. A 2D camera for image recognition could be used for increasing accuracy but, unfortunately, dust and poor lighting often hinder the process [4]. An interesting approach is to use spatial registration to localize users and feed this output to the AR application [5]. The combination of the aforementioned methods provides improved results but increases complexity, both for users and developers. Common AR problems need to be addressed for the establishment of AR solutions. A common malfunction of AR technology is AR shifting, which occurs when image recognition techniques fail to identify the environment or when sudden, quick, and unpredictable movements interrupt the normal flow and hinder the identification process, resulting in a misaligned AR world [6]. Thus, the development of AR applications raises the following research questions: (RQ1) How can stable, scalable, and user-friendly AR applications that cope with industrial environments be developed? (RQ2) What sustainability dimensions should researchers and practitioners consider in a holistic approach towards the adoption of modern AR applications?
The proposed system describes a symbiosis scenario which enables employees to interoperate with hardware and software systems in warehouses and production lines through a mobile AR application. A specific use case is showcased for the guidance of employees in the facility layout, using the application and information from the central industrial information system (CIIS), to demonstrate the system’s functionalities. The CIIS includes information from all the static and mobile hardware equipment and information that derives from the software tools. This enables the user to directly interact with the AR application and access all the necessary information sources. The main goal is to directly utilize the AR application in the industrial facility, make it a discrete part of the process, and effectively handle real-time information from the field. With stable communication between the AR application and the CIIS, the AR application can access the information sources and present them to the user. The CIIS can support the application to overcome various AR-related problems that are widely discussed in the extant literature, such as AR shifting. The AR application was developed on the Unity platform with the AR Foundation package and its marker-less method. A Real-Time Location System (RTLS) was introduced to overcome the marker-less problems and increase the application’s performance and stability. The system’s architecture also ensures the scalability of the system both in terms of (i) expanding the application’s functionalities (e.g., creating assembly scenarios) and (ii) developing multiuser scenarios. Furthermore, the direct connection of the mobile AR application with the CIIS, which ensures scalability, could accelerate the procurement of AR-based solutions by large-scale companies.
The technological driver, although important, is further supported by sustainability indicators that motivate researchers and practitioners to use AR applications. The literature review revealed economic, social, and environmental parameters that add value to the inclusion of AR technology in operational activities. The operational activities include the minimization of warehouse layout change costs, the optimization of product localization and transportation processes, the minimization of human errors, the monitoring of quality control, the optimization of warehouse management, the strengthening of the employees’ safety, and the minimization of the natural resources’ consumption [7,8]. Although it is clear that the introduction of AR technology is a feasible and profitable action, managers are reluctant to adopt it. This stems from two main factors: (i) the implemented AR applications have limitations in harsh industrial environments; (ii) a holistic approach that targets the sustainability context is missing. The overarching aim of this study is the adoption of a holistic approach that tackles limitations and considers sustainability indicators for introducing AR technology in the industrial landscape. The contributions of the proposed research focus on the identified gap and include (i) a thorough mapping of the research area, (ii) an innovative architecture for developing AR applications, (iii) a stable and scalable AR application, and (iv) an analysis of the benefits of AR technology in the sustainability context.
The remainder of this manuscript is structured as follows. Section 2 describes the systematic literature review process, while Section 3 outlines the proposed system’s architecture for the AR symbiosis software tool. Finally, Section 4 analytically presents the developed software tool, and Section 5 discusses the conclusions and the sustainability indicators.

2. Literature Review

A systematic literature review was conducted considering Industry 4.0 and Logistics 4.0 AR applications in the sustainability context. We initially utilized the keywords “Augmented Reality” and “Industry 4.0” and extracted a heatmap with the VOSviewer 1.6.14 software tool, developed at Leiden University in the Netherlands, in order to identify the main research topics and commonly used keywords (Figure 1). As a next step, the widely adopted keywords were used (“human-machine interaction”, “training”, “human-computer interaction”, “digital twin”) along with the sustainability dimensions (economic, social, environmental) in order to systematically proceed with the literature review search in the scientific databases.
AR is considered an emerging and promising technology for industries [9,10,11], and AR technology is labelled as a serviceable future technology in the transition towards digital transformation, while [12] stated that AR can improve numerous industrial indicators, namely, production plant control, safety and security, resource matching, product design, information provision, human–machine interactions, and training. Digital technologies are used for the digital transformation of supply chains, and they are closely connected to the improvement of the social, economic, and environmental sustainability indicators [13], and AR is moving towards this direction. AR applications regarding industrial activities have been widely adopted by the research community and practitioners. Specifically, AR is widely used for assistance in assembly scenarios of industrial products and machinery. Ref. [14] developed a system for a human–robot collaborative assembly, based on a marker-based AR algorithm, to provide information to employees throughout the process. Another research effort focused distinctly on the improvement of the employees’ experience by creating AR-based assembly manuals that enabled the effective and efficient use of AR technology with a user-oriented approach [15]. Overall, AR technology in assembly scenarios is widely accepted by users and clearly has a positive influence on their satisfaction levels, their creativity, and their business performance [16]. On the other hand, AR is an advantageous technology for industrial training sessions. It has been shown that students prefer AR-based courses, and this enhances both their attendance rates and their performance [17]. To this end, some researchers developed AR applications for training. Ref. [18] developed an AR application for hand-held devices and implemented a training session for improving assembly and maintenance skills. The authors stated that the skill level of the technicians increased after the establishment of the application in the working environment. Taking a step forward, the authors identified that there was a significant performance difference between technicians trained with traditional methods and those who used the AR application. Additionally, Ref. [19] strengthened the above argument by observing a better performance when utilizing a Head-Mounted Display (HMD) for conducting a training session for maintenance tasks. In this study, the authors extended the functionalities of the application by implementing an interface for providing assistance during maintenance processes. Similar implementations have been proposed using geovisualization and utility management for demonstrating the potential outdoor use of AR technology in logistics, and these could be adopted in complex industrial environments and logistics warehouses to support the preparation and the execution of maintenance processes [20,21]. These studies highlight the advantages of AR technology in social, safety, and economic aspects and, moreover, emphasize the potential of AR as a means of monitoring, documentation, and resource management.
Another aspect of industrial activities that can take advantage of AR technology is human–robot interaction. Particularly, [22] successfully combined an AR application with ROS to allow users to control a gripper through their application. However, the authors used an HMD and marker-based AR and concluded that the system lacks flexibility, while the HMD was not widely accepted by the users. AR applications can benefit from network-based architectures and frameworks to create multiuser platforms. Ref. [23] successfully developed a multiuser application for collaborative 3D design. This approach can be transferred to industrial environments to support both experienced and novice personnel during the training phase and enable them to cooperatively complete tasks. These studies demonstrate the possibilities of AR technology in the Industry 4.0 context. In the Logistics 4.0 context, AR is also considered a breakthrough by researchers and practitioners and appears to add value to quality control, warehouse management, predictive maintenance, and remote operation activities. Refs. [7,8] utilized AR technology to support order picking tasks in a warehouse and indicated the profitable use of the technology in everyday logistics processes. Furthermore, Ref. [24] reinforced this statement by implementing an AR system for product picking activities and using agent-based simulation to support their findings. AR technology can be introduced in warehouse operations to minimize the employees’ error rate and task completion time and to increase flexibility, reliability, and adaptability in the facility layout [25]. Additionally, AR solutions can improve production processes, reduce the rate of work accidents, control environmental parameters, and help enterprises wisely utilize natural resources [26]. Nevertheless, AR technology is still in its infancy, and further research needs to be conducted to successfully introduce it to industrial environments. Specifically, the lack of scalable AR applications in real production lines is pinpointed [27], alongside the necessity for quality and reliability improvements of the proposed AR solutions [9,28]. Hardware limitations are also present regarding the AR devices’ performance after extensive use and the battery consumption during high processing power tasks, such as image recognition [4,25]. Furthermore, the important role of the human factor in the process of designing and developing AR applications is highlighted. The lack of user feedback during the development phase could have negative effects on the operational performance and the workers’ health and safety conditions [29]. Finally, it has been stated that there are also some barriers regarding the overall user experience. When virtual objects are projected in monitors or HMDs, human cognition is negatively influenced, and this hampers the ability to comprehend scale and dimensions in the AR world [30]. In addition, some users suffer from motion sickness after extensive use of AR applications, but this effect is more common for HMDs [31].
The overarching aim of this study is to establish an innovative architecture that promotes the adoption of stable and scalable AR applications in operational activities, while assessing the sustainability dimensions.

3. System Architecture

The proposed AR application supports the navigation of employees in industrial environments with the help of AR objects that are projected in the real-world environment and indicate specific destination points. Real-world scenes are dynamically captured by the camera of the mobile phone, and AR objects are positioned in specific locations. The CIIS provides all the internal information regarding the facility layout, and the AR application provides a set of regions of interest that are used as destinations. The user selects a destination and is assisted in following the shortest path from his current position to the destination point by following the AR objects that act as path indicators.
To address the shortfalls of AR implementations in harsh environments, the authors decided to fully capitalize on the CIIS resources. Many crucial functionalities, such as controlling the AR world’s objects, have been assigned to the CIIS instead of the AR application. As the AR application has mediocre computational resources, the CIIS is responsible for all the functions and the execution of heavy-duty activities. The AR application can include a continuously growing number of functionalities while the CIIS handles all the reported issues that common AR applications confront, and this paves the way for a stable and fully scalable software tool.
Taking a step forward in this direction, the Real-Time Location System (RTLS) is introduced to address the AR shifting problem. AR shifting is a known issue in AR applications [6] and occurs when the marker-less AR method fails to recognize the object under scrutiny. Unity is used as the basic software for implementing the AR recognition method, and from the literature review, it is evident that this issue appears in situations where the image recognition process fails to identify the object due to (i) abrupt device movements or (ii) the malfunction of the image processing algorithms. During AR shifting, augmented objects appear to either slip from their actual position or be misplaced in the real-world scene displayed on the mobile device. The use of the RTLS ensures that the CIIS will be informed about the actual position of the mobile device, and this information can be cross-referenced with the estimated position from the AR application to support the correct placement of the AR objects. If the difference is acceptable (lower than 20 cm in each dimension), it is assumed that the AR application did not encounter any problems and the AR world is displayed correctly. If the difference is not acceptable, the AR application has failed to properly project the objects and the AR world is not properly displayed. In this case, the CIIS sends a message to the AR application to execute the Reset and Align routine. This routine consists of functions that instantly align the coordinate systems of the CIIS and the AR application. The proposed functionality can also be used for the initial alignment of the two subsystems. In Figure 2, the proposed innovative architecture of the software tool is presented.
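A minimal Python sketch of this cross-check is given below; it is illustrative only, and the topic names and message types are assumptions rather than the authors’ implementation. A CIIS-side node subscribes to both position estimates and triggers the Reset and Align routine when any axis differs by more than 20 cm.

import rospy
from geometry_msgs.msg import Point
from std_msgs.msg import Empty

THRESHOLD_M = 0.20  # acceptable per-axis difference (20 cm)

class MisalignmentMonitor:
    def __init__(self):
        self.rtls_position = None  # last position reported by the RTLS
        self.ar_position = None    # last position estimated by the AR application
        rospy.Subscriber("/rtls/position", Point, self.on_rtls)
        rospy.Subscriber("/ar_app/estimated_position", Point, self.on_ar)
        # message that asks the AR application to run the Reset and Align routine
        self.reset_pub = rospy.Publisher("/ar_app/reset_and_align", Empty, queue_size=1)

    def on_rtls(self, msg):
        self.rtls_position = msg
        self.check()

    def on_ar(self, msg):
        self.ar_position = msg
        self.check()

    def check(self):
        if self.rtls_position is None or self.ar_position is None:
            return
        dx = abs(self.rtls_position.x - self.ar_position.x)
        dy = abs(self.rtls_position.y - self.ar_position.y)
        dz = abs(self.rtls_position.z - self.ar_position.z)
        if max(dx, dy, dz) > THRESHOLD_M:
            # the AR world has shifted: request a reset and realignment
            self.reset_pub.publish(Empty())

if __name__ == "__main__":
    rospy.init_node("misalignment_monitor")
    MisalignmentMonitor()
    rospy.spin()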

4. The Proposed Implementation

For implementing the proposed system, the authors initially selected the software tools for the development phase and elaborated on three individual subsystems, namely, (i) the CIIS, (ii) the AR application, and (iii) the RTLS.
The individual subsystems interoperate by sharing information, and their final goal is to provide guidance to employees in industrial facilities (warehouses and production lines). The first step is to align the real and the AR world using the CIIS’s information. When launching the application, the coordinate systems of the application and the CIIS must be aligned. This enables the proper localization and visualization of the AR objects when the CIIS sends their coordinates to the AR application. For the alignment, the first implementation was based on a predefined zero-point in the facility, which was common for the CIIS and the AR application. The AR application had to be launched at this point to align with the CIIS. After the RTLS addition, the coordinate systems’ alignment was dynamically accomplished without the need for a zero-point that would initialize the application. To achieve that, a routine was developed on the CIIS that used the RTLS’s data to align the CIIS’s and the AR application’s coordinate systems. The AR application captures the user’s pose and the user’s request for a destination point and communicates with the CIIS. It uses image recognition techniques and information from the mobile phone’s sensors (camera, accelerometer, and gyroscope) in order to calculate the employee’s location based on the aligned coordinate system. After receiving the input data, the CIIS processes them and produces the response (the AR path in the virtual environment). Finally, the AR application receives the system’s response and visualizes the results by creating the AR objects that form a path in the AR world for guiding the user. It should be stated that the AR objects that form the path are dynamically positioned in the AR world according to the movement of the employee who holds the mobile device and continuously indicate the destination point.

4.1. Software Tools

The requirements for the CIIS were carefully studied, and numerous robotic platforms were examined. The CIIS should be able to interoperate using industrial standards, create virtual worlds, enable the autonomous and programmable movement of objects (indicatively, robotic vehicles) in the virtual world using pathfinding algorithms, and include an API for communicating over the network with third-party tools. One suitable choice was Microsoft’s Robotics Studio®, which has already been used in research studies [32,33], as it provides all the necessary components. Nonetheless, in the proposed use case, the ROS platform was selected as it also fully covers the requirements, provides a long list of open-source libraries, and has a large and active community. It also offers ready-made communication structures that facilitate the development process and provides visualization tools, such as Rviz, in order to dynamically present the sensors’ data and the pathfinding results in the virtual environment.
The AR application was developed using the Android Operating System (OS) and a compatible software development kit (SDK) in order to be executed by the majority of mobile devices. This decision was also based on literature review findings that showcased the adoption of AR applications with similar technical specifications. Indicatively, ref. [34] stated that AR applications that operate on mobile devices could be beneficial to both novice and experienced workers, regardless of their age and experience level. The AR application’s requirements include: (i) motion and position tracking, (ii) AR text and AR objects’ placement in real time, and (iii) AR world manipulation based on the device’s movement. Initially, we considered three options for the SDK, namely, Google’s ARcore, PTC’s Vuforia, and Wikitude. Both ARcore and Wikitude met our requirements, while Vuforia lacked motion and position tracking. After a thorough study, we selected ARcore, Version 1.26.0, due to its wide adoption among researchers [35] and its active community. Finally, the application’s platform had to support ARcore and include wireless communication structures in order to exchange information with ROS. Unity provides built-in communication interfaces with ROS via the ROS# library and enables the creation of ROS messages within Unity and bidirectional communication with ROS.
The RTLS subsystem should be able to support (i) long-range signals for covering the industrial area, (ii) high position accuracy (<0.20 m), (iii) industrial areas with obstacles, as this is the most common scenario, and (iv) low-cost hardware implementation. RTLS technologies such as RFID, GPS, and WLAN were excluded due to their low accuracy. Infrared offers great accuracy, but its implementation cost is significantly high, while ultrasound also offers excellent accuracy but struggles in environments with obstacles and metal surfaces [36]. As a result, the authors selected the ultra-wideband (UWB) method, which fulfills all the RTLS needs.

4.2. ROS Implementation for the CIIS Subsystem

ROS is widely used by practitioners and academics as a robotic platform for both simulation and real-world applications. Virtual worlds can be developed using the Gazebo software, which is fully compatible with the ROS platform, and even real-world facility layouts can be digitized and handled as digital twins. For deploying our scenario, the real-world facility layout was digitized and included in the ROS ecosystem. For navigating in the industrial facility, the turtlebot3 wafflepi robotic vehicle from Robotis (Seoul, Korea) was used. It provides a set of packages for the operation of the robot as well as for its simulation in the virtual environment. Turtlebot robotic vehicles use the ros_navigation_stack and sensors (lidar and camera) in order to map the facility layout and navigate. The vehicle can receive specific commands, including the final goal that indicates the destination point, and find the shortest path in order to reach its destination.
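As a brief illustration of how a goal can be handed to the navigation stack, the following Python sketch sends a destination to the standard move_base action server; the coordinates, node name, and the assumption that move_base is running with a map frame are ours and do not reproduce the exact deployed interface.

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_destination(x, y):
    # connect to the move_base action server provided by the ROS navigation stack
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"        # occupancy grid frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0       # no specific heading required

    client.send_goal(goal)                          # the planner computes the shortest path
    client.wait_for_result()
    return client.get_result()

if __name__ == "__main__":
    rospy.init_node("send_destination_example")
    send_destination(3.5, 1.2)

The path computed by the global planner can then be read from the corresponding plan topic and forwarded to the AR application as a sequence of waypoints.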
For communicating with the AR application, ROS receives data streams for controlling the AR world and responds to the user’s requests. This communication is feasible using the rosbridge_server library, which enables low-latency and bidirectional communication over a WebSocket connection. The AR application collects and directly sends the user’s position, the user’s orientation derived from the gyroscope data of the mobile device, and, finally, the user’s request for moving towards a specific destination point in the facility layout. Finally, ROS processes all the data and sends the corresponding path as a response to the AR application.
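To make the exchange concrete, the sketch below uses the roslibpy client in Python to show the kind of traffic that flows over the rosbridge WebSocket; the deployed application performs the equivalent exchange with ROS# inside Unity, and the host address and topic names here are assumptions for illustration.

import time
import roslibpy

# connect to the rosbridge_server WebSocket (address assumed for the example)
client = roslibpy.Ros(host="192.168.1.10", port=9090)
client.run()

# continuous stream: the user's estimated pose (position and gyroscope-derived orientation)
pose_topic = roslibpy.Topic(client, "/ar_app/user_pose", "geometry_msgs/Pose")
pose_topic.publish(roslibpy.Message({
    "position": {"x": 2.4, "y": 5.1, "z": 0.0},
    "orientation": {"x": 0.0, "y": 0.0, "z": 0.71, "w": 0.71},
}))

# on-demand request: the destination point selected from the drop-down menu
goal_topic = roslibpy.Topic(client, "/ar_app/destination", "geometry_msgs/Point")
goal_topic.publish(roslibpy.Message({"x": 10.0, "y": 3.0, "z": 0.0}))

# response: the planned path, which the mobile application renders as AR objects
path_topic = roslibpy.Topic(client, "/ar_app/path", "nav_msgs/Path")
path_topic.subscribe(lambda msg: print("received %d waypoints" % len(msg["poses"])))

time.sleep(5)        # keep the connection open long enough to receive the path
client.terminate()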
The RTLS interoperates with ROS by continuously providing the user’s current position and comparing it with the AR application’s estimation. This information is retrieved by the main anchor of the RTLS, which acts as the master and stores real-time position information from all the anchors and tags in the facility layout. The main anchor is connected to ROS using a Python-based script.

4.3. Unity Implementation for the AR Application Subsystem

Unity’s AR Foundation offers an interface for AR application development for Android devices. The ARcore plugin provides two components, the AR session and the AR session origin, for the AR world operation. The AR session controls the augmented world by transforming the augmented objects according to the position and the orientation of the device in order to maintain the illusion of the augmented objects’ dynamic movement. On the other hand, the AR session origin contains the camera and all the trackable features, such as planes, detected by the image recognition process. Moreover, ARcore calculates the device’s position by a process called simultaneous localization and mapping (SLAM). ARcore detects feature points and identifies the device’s location. This information is combined with the device’s sensor data to estimate the user’s pose (position and orientation).
To overcome the AR shifting problem, a “Reset and Align” routine was developed. This routine consists of three stages, and it is triggered from ROS every time a misalignment is identified, based on the error between the real position (from the RTLS) and the estimated position (from ARcore). In the first stage, the error is calculated by subtracting these coordinates on each axis. If the error exceeds 20 cm on any axis, the second stage initiates. During this stage, the AR world is reinitialized, and all the detected feature points are deleted. Finally, during the third stage, the alignment occurs and shifts ARcore’s coordinate system according to an offset determined by the RTLS. This routine is also responsible for the initial alignment of the coordinate systems upon the AR application launch.
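A conceptual Python sketch of the three stages is given below; the actual routine is implemented on the Unity side in C#, and the ar_session object with its reset and offset methods is a hypothetical placeholder used only to outline the logic.

ERROR_THRESHOLD_M = 0.20  # 20 cm per axis

def reset_and_align(arcore_position, rtls_position, ar_session):
    # Stage 1: per-axis error between ARcore's estimate and the RTLS measurement
    error = [abs(a - r) for a, r in zip(arcore_position, rtls_position)]
    if max(error) <= ERROR_THRESHOLD_M:
        return  # the positions agree; the AR world is displayed correctly

    # Stage 2: reinitialize the AR world and delete the detected feature points
    ar_session.reset()

    # Stage 3: shift ARcore's coordinate system by the offset given by the RTLS
    offset = [r - a for a, r in zip(arcore_position, rtls_position)]
    ar_session.apply_origin_offset(offset)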
Finally, the communication was developed with the ROS# library. ROS# enables the creation of ROS messages within Unity using the C# programming language. It also provides scripts for communicating with ROS over the network through the rosbridge library. For every request and response, a message was created. Some messages were transmitted continuously (gyroscope data, user’s estimated position) while others were sent on demand (e.g., after the selection of a destination). In Figure 3, the Reset and Align routine’s outcome is demonstrated alongside the drop-down menu provided to the user for the destination selection.

4.4. RTLS Subsystem Implementation

The UWB technology was applied for the RTLS implementation. Four DWM1000 UWB modules were used as hardware equipment. One of them acted as a tag on the mobile device, and the other three acted as anchors. These modules offer a long indoor range (40 m) and high accuracy (<20 cm based on the manufacturer’s specifications), which are key features for our implementation, as we wanted to operate them in industrial facilities (logistics warehouses, production lines). The data from the UWB modules were passed to an ESP-32 microcontroller for data processing. This hardware configuration is identical for tags and anchors, and the differentiation is at the firmware level, which reduces the implementation cost. For the position calculation, the tag sends a signal to each anchor, one at a time, and upon signal arrival each anchor calculates its distance from the tag. All anchors then send the calculated distance to the main anchor. Finally, the main anchor transmits the collected data to ROS, via a USB-serial communication, and a Python-based script calculates the tag’s position using a trilateration algorithm. With this configuration, the RTLS system can efficiently locate the user at any time with a relatively low error. In our system architecture, ROS uses the RTLS’s data only to support the AR application, but it is worth mentioning that this information can be utilized for other purposes in the industry, namely, decision making, employee work evaluation, etc. Additionally, this RTLS layout is not limited by the facility’s layout and can be expanded to larger areas. Lastly, multiuser localization is possible without considerable changes other than adding tags.
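The following Python sketch outlines how such a script could read the main anchor’s measurements over USB-serial and estimate the tag’s 2D position with least-squares trilateration; the serial port, the line format, and the anchor coordinates are assumptions made for illustration and do not reproduce the authors’ exact script.

import json
import serial   # pyserial
import numpy as np

ANCHORS = np.array([[0.0, 0.0],    # anchor 1 (x, y) in metres
                    [8.0, 0.0],    # anchor 2
                    [0.0, 6.0]])   # anchor 3 (main anchor)

def trilaterate(distances):
    """Least-squares 2D trilateration from three anchor distances."""
    x1, y1 = ANCHORS[0]
    d1 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(ANCHORS[1:], distances[1:]):
        # linearized range equations relative to anchor 1
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    position, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return position  # estimated [x, y] of the tag

if __name__ == "__main__":
    port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)  # main anchor over USB-serial
    while True:
        line = port.readline().decode().strip()
        if not line:
            continue
        # assumed payload, e.g. {"d1": 3.2, "d2": 4.7, "d3": 5.1} in metres
        d = json.loads(line)
        x, y = trilaterate([d["d1"], d["d2"], d["d3"]])
        print("tag position: %.2f, %.2f" % (x, y))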

4.5. Use Case Description

After the development of the subsystems, the authors elaborated on a specific scenario for showcasing the added value of the application in a warehouse. The proposed scenario guides employees to specific locations in a warehouse to assist them during a picking process.
The CIIS was used for creating the digital twin environment of a real-world warehouse. The virtual environment was developed using ROS and the occupancy grid map for navigating in the facility layout is presented in Figure 4a,b. This enabled the automatic creation of the shortest paths in the virtual environment from a starting to a destination point using the built-in ROS routing algorithms and the navigation stack. The navigation stack takes as an input the coordinates of the destination point and calculates the shortest path from the employee’s location to the destination point. As a next step, several destination points were included in the AR application with a drop-down menu selection. The destination points were either predefined locations in the facility or dynamic, such as the robot’s position.
The employee should be correctly localized in the facility layout by explicitly providing the exact position and orientation to the CIIS. The AR application, with the use of the mobile phone’s camera, identifies the scene’s objects in the real world and estimates the employee’s position. The localization process is supported by the RTLS to reduce AR shifting shortfalls and yields the employee’s coordinates in the facility. This process is continuously executed and dynamically identifies the employee’s location to calculate the virtual path from his current position to the destination point.
After safeguarding the mapping of the facility layout, the creation of the occupancy grid map, and the dynamic identification of the employee’s position in the facility, the AR application enables the navigation to the selected destination. The destination is selected from the drop-down menu, and then the Send Goal function is executed to inform the CIIS. At this point, and from the CIIS point of view, ROS receives the destination’s coordinates and calculates the shortest path by using the employee’s position, while the AR application captures instances from the real-world facility and presents on the mobile device a virtual path (the shortest path) to reach the destination point. This is a dynamic process: every time the employee moves, his new position is retrieved, ROS calculates the shortest path in the virtual environment, and the AR application presents the updated virtual path on the mobile device.
Figure 4(a1) presents the occupancy grid with the employee’s initial position, the destination point (Exit), and the path in the virtual environment, while Figure 4(b1) presents the AR path from the employee’s initial position to the Exit. After the employee moves forward, Figure 4(a2) presents his updated position in the occupancy grid map and the updated path in the virtual environment, while Figure 4(b2) presents the updated AR path in the mobile device that leads to the destination point.
The proposed scenario improves sustainability indicators as it has social, environmental, and economic impacts. It reduces hazards, as employees can avoid obstacles and restricted areas that are predefined in the facility; it reduces energy consumption when employees move in the facility using vehicles; and, from an economic perspective, it increases efficiency indicators in the facility.

5. Discussion

5.1. AR Technology Discussion and Drawbacks

AR is an emerging technology that will redefine numerous industrial activities, including training, assembly, and human–machine interactions. AR will highly impact logistics operations, namely: warehouse design and management, product localization, picking and transportation, human error minimization, quality control assistance, facility safety maximization, and natural resource consumption. Researchers have already introduced AR technology studies to the research community and have highlighted specific AR benefits to the public. Practitioners, although already acquainted with AR technology, are reluctant to adopt AR applications in their operational activities. A holistic approach is needed that considers economic, social, and environmental indicators. Economic and environmental indicators focus on providing more effective operational activities that will have a direct impact on cost minimization and will also indirectly minimize the use of natural resources. Furthermore, they are also affected as numerous operational activities can be operated remotely through AR technology. Social indicators are of great importance as AR technology improves safety by reducing the physical interactions between the employees and the machinery, promoting collaborative tasks and human–machine interactions, and providing innovative user experiences for training, maintenance, and operational activities. This holistic approach will accelerate the use of AR applications in the logistics industry.
In order to experience the full potential of AR technology, it is urgent to overcome some technological drawbacks. During our research, we ascertained that many of the existing implementations had specific limitations. Several researchers developed their AR solutions based on the marker-based method, but this could result in malfunctions when considering modern industrial environments and flexible manufacturing systems. The scalability of AR applications is also an issue, as the proposed implementations either cannot be adopted by many concurrent users or have limitations in the development of add-on features that could expand the existing functionalities. The devices’ computational resources often struggle after long periods of use, and high-demand tasks increase battery consumption. Finally, human nature also imposes some limitations, as the continuous monitoring of virtual objects could negatively influence human cognition and could even lead to motion sickness, especially after extensive use of HMDs.

5.2. Research Results Interpretation, Implications, and Limitations

The proposed system contributes to the existing literature by providing: (i) a literature review mapping of the research area, (ii) an innovative architecture that extends existing approaches and focuses on providing stable and scalable AR implementations, (iii) an AR application that showcases the benefits and the technology’s potential, and (iv) a holistic approach for the adoption of AR technology that considers the sustainability indicators. The mapping of the existing literature clearly identifies the use cases and the limitations and provides an analysis of the sustainability indicators. The innovative architecture enables the implementation of stable and scalable symbiosis scenarios. The addition of the RTLS subsystem and the use of the CIIS’s computational resources provide a feasible solution that overcomes the AR shifting problem and makes a step towards a stable and scalable AR implementation. The developed AR application is used for indoor guidance in industrial facilities. The proposed solution provides: (i) flexibility regarding the user’s movement in the facility, as the AR application is not marker-based, and (ii) reliability, by overcoming common shortfalls of AR technology.
The literature review revealed that researchers and practitioners are not following a holistic approach that tackles limitations and considers sustainability indicators as enablers for the introduction of AR technology into the Industry 4.0 and Logistics 4.0 contexts. AR technology is a powerfully engaging medium that supports both the training of new employees and the operational activities of experienced employees, and its introduction has a direct impact on sustainability indicators. The presented use case showcased the guidance of employees through the facility, but multiple other use cases could also be supported (indicatively, picking activities and product assembly scenarios). Furthermore, from a social perspective, employees could enhance their ability to communicate with the hardware and software backbone of an industrial facility and reduce hazards. There is also a direct economic impact, as AR improves efficiency in operational activities, reduces the total transportation distance within the facility, and increases productivity. Finally, the indirect environmental impact from the minimization of transportation within the industrial facility is also important.

5.3. Recommendations and Future Research Agenda

The developed system introduces an innovative architecture for efficient and effective AR applications in industrial environments, but more research and field studies need to be conducted. Researchers should elaborate more on the AR visualizations that sometimes hinder human cognition and promote motion sickness issues in real-world conditions. Furthermore, the research community should tackle the challenging issue of battery consumption in order to extend the AR application’s lifetime. Practitioners, on the other hand, can utilize the proposed system architecture to develop AR applications and implement numerous scenarios in their facilities (indicatively, assembly and maintenance activities). The AR technology’s potential will be fully realized as practitioners benefit from deploying stable and scalable AR applications in the industrial ecosystem. Researchers and practitioners could also benefit from the new generation of mobile phones and portable devices with increased computational power and longer battery lifetime.

Author Contributions

Conceptualization, V.S., D.B. and D.V.; methodology, D.B.; software, V.S.; validation, D.B. and D.V.; formal analysis, D.B. and V.S.; investigation, D.B.; resources, V.S.; data curation, V.S.; writing—original draft preparation, D.B. and V.S.; writing—review and editing, D.B. and D.V.; visualization, V.S.; supervision, D.B. and D.V.; project administration, D.B.; funding acquisition, D.V. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by the national General Secretariat for Research and Innovation project “Dynamic Quality CONtrol on Production lines using intelligent AutonomouS vehicleS (Q-CONPASS)” MIS code: 5048497.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Briggs, B. (Global Chief Technology Officer Deloitte Consulting LLP); Henry, N. (US Consulting Chief Innovation Officer Deloitte Consulting LLP); Main, A. (Deloitte Digital Leader Deloitte Consulting LLP). Tech Trends 2019—Beyond the Digital Frontier. Deloitte Insights, no. 10th Anniversary Edition. 2019. Available online: https://www2.deloitte.com/be/en/pages/technology/enterprise-technology-and-performance/articles/tech-trends-2019-beyond-the-digital-frontier.html (accessed on 4 April 2021).
  2. Campbell, M. (PTC, Executive Vice President, Augmented Reality Products); Lang, J. (PTC, Vice President Corporate Strategy); Kelly, S. (PTC, Lead Principal Business Analyst); Immerman, D. (PTC, Business Analyst), The State of Industrial Augmented Reality 2019. Ptc, 2019; p. 7. Available online: ptc.com (accessed on 2 May 2021).
  3. Wang, W.; Wang, F.; Song, W.; Su, S. Application of Augmented Reality (AR) Technologies in inhouse Logistics. E3S Web Conf. 2020, 145, 02018. [Google Scholar] [CrossRef] [Green Version]
  4. Dianatfar, M.; Latokartano, J.; Lanz, M. Review on existing VR/AR solutions in human–robot collaboration. Procedia CIRP 2021, 97, 407–411. [Google Scholar] [CrossRef]
  5. Chen, K.; Yang, J.; Cheng, J.C.; Chen, W.; Li, C.T. Transfer learning enhanced AR spatial registration for facility maintenance management. Autom. Constr. 2020, 113, 103135. [Google Scholar] [CrossRef]
  6. Aschauer, A.; Reisner-Kollmann, I.; Wolfartsberger, J. Creating an Open-Source Augmented Reality Remote Support Tool for Industry: Challenges and Learnings. Procedia Comput. Sci. 2021, 180, 269–279. [Google Scholar] [CrossRef]
  7. Fraga-Lamas, P.; Fernández-Caramés, T.M.; Blanco-Novoa, O.; Vilar-Montesinos, M.A. A Review on Industrial Augmented Reality Systems for the Industry 4.0 Shipyard. IEEE Access 2018, 6, 13358–13375. [Google Scholar] [CrossRef]
  8. Mourtzis, D.; Samothrakis, V.; Zogopoulos, V.; Vlachou, E. Warehouse Design and Operation using Augmented Reality technology: A Papermaking Industry Case Study. Procedia CIRP 2019, 79, 574–579. [Google Scholar] [CrossRef]
  9. Jeong, B.; Yoon, J. Competitive Intelligence Analysis of Augmented Reality Technology Using Patent Information. Sustainability 2017, 9, 497. [Google Scholar] [CrossRef] [Green Version]
  10. Masood, T.; Egger, J. Augmented reality in support of Industry 4.0—Implementation challenges and success factors. Robot. Comput. Manuf. 2019, 58, 181–195. [Google Scholar] [CrossRef]
  11. Egger, J.; Masood, T. Augmented reality in support of intelligent manufacturing—A systematic literature review. Comput. Ind. Eng. 2020, 140, 106195. [Google Scholar] [CrossRef]
  12. Damiani, L.; Demartini, M.; Guizzi, G.; Revetria, R.; Tonelli, F. Augmented and virtual reality applications in industrial systems: A qualitative review towards the industry 4.0 era. IFAC-PapersOnLine 2018, 51, 624–630. [Google Scholar] [CrossRef]
  13. Bechtsis, D.; Tsolakis, N.; Vlachos, D.; Srai, J.S. Intelligent Autonomous Vehicles in digital supply chains: A framework for integrating innovations towards sustainable value networks. J. Clean. Prod. 2018, 181, 60–71. [Google Scholar] [CrossRef]
  14. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.-S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64. [Google Scholar] [CrossRef]
  15. Wang, Z.; Bai, X.; Zhang, S.; Wang, Y.; Han, S.; Zhang, X.; Yan, Y.; Xiong, Z. User-oriented AR assembly guideline: A new classification method of assembly instruction for user cognition. Int. J. Adv. Manuf. Technol. 2021, 112, 41–59. [Google Scholar] [CrossRef]
  16. Schuster, F.; Engelmann, B.; Sponholz, U.; Schmitt, J. Human acceptance evaluation of AR-assisted assembly scenarios. J. Manuf. Syst. 2021. [Google Scholar] [CrossRef]
  17. Neffati, O.S.; Setiawan, R.; Jayanthi, P.; Vanithamani, S.; Sharma, D.K.; Regin, R.; Mani, D.; Sengan, S. An educational tool for enhanced mobile e-Learning for technical higher education using mobile devices for augmented reality. Microprocess. Microsyst. 2021, 83, 104030. [Google Scholar] [CrossRef]
  18. Webel, S.; Bockholt, U.; Engelke, T.; Gavish, N.; Olbrich, M.; Preusche, C. An augmented reality training platform for assembly and maintenance skills. Robot. Auton. Syst. 2013, 61, 398–403. [Google Scholar] [CrossRef]
  19. Schwald, B.; de Laval, B. An Augmented Reality System for Training and Assistance to Maintenance in the Industrial Context. J. WSCG 2003, 11, 284–291. [Google Scholar] [CrossRef]
  20. Stylianidis, E.; Valari, E.; Pagani, A.; Carrillo, I.; Kounoudes, A.; Michail, K.; Smagas, K. Augmented Reality Geovisualisation for Underground Utilities. PFG—J. Photogramm. Remote Sens. Geoinform. Sci. 2020, 88, 173–185. [Google Scholar] [CrossRef]
  21. Jimenez, R.J.P.; Becerril, E.M.D.; Nor, R.M.; Smagas, K.; Valari, E.; Stylianidis, E. Market potential for a location based and augmented reality system for utilities management. In Proceedings of the 2016 22nd International Conference on Virtual System & Multimedia (VSMM), Kuala Lumpur, Malaysia, 17–21 October 2016; pp. 11–14. [Google Scholar] [CrossRef]
  22. Lotsaris, K.; Gkournelos, C.; Fousekis, N.; Kousi, N.; Makris, S. AR based robot programming using teaching by demonstration techniques. Procedia CIRP 2021, 97, 459–463. [Google Scholar] [CrossRef]
  23. Boonbrahm, P.; Kaewrat, C.; Boonbrahm, S. Effective Collaborative Design of Large Virtual 3D Model using Multiple AR Markers. Procedia Manuf. 2020, 42, 387–392. [Google Scholar] [CrossRef]
  24. Cirulis, A.; Ginters, E. Augmented Reality in Logistics. Procedia Comput. Sci. 2013, 26, 14–20. [Google Scholar] [CrossRef] [Green Version]
  25. Stoltz, M.-H.; Giannikas, V.; McFarlane, D.; Strachan, J.; Um, J.; Srinivasan, R. Augmented Reality in Warehouse Operations: Opportunities and Barriers. IFAC-PapersOnLine 2017, 50, 12979–12984. [Google Scholar] [CrossRef]
  26. Kościelniak, H.; Łęgowik-Małolepsza, M.; Łęgowik-Świącik, S. The Application of Information Technologies in Consideration of Augmented Reality and Lean Management of Enterprises in the Light of Sustainable Development. Sustainability 2019, 11, 2157. [Google Scholar] [CrossRef] [Green Version]
  27. Plakas, G.; Ponis, S.; Agalianos, K.; Aretoulaki, E.; Gayialis, S. Augmented Reality in Manufacturing and Logistics: Lessons Learnt from a Real-Life Industrial Application. Procedia Manuf. 2020, 51, 1629–1635. [Google Scholar] [CrossRef]
  28. Quandt, M.; Knoke, B.; Gorldt, C.; Freitag, M.; Thoben, K.-D. General Requirements for Industrial Augmented Reality Applications. Procedia CIRP 2018, 72, 1130–1135. [Google Scholar] [CrossRef]
  29. Neumann, W.P.; Winkelhaus, S.; Grosse, E.H.; Glock, C.H. Industry 4.0 and the human factor—A systems framework and analysis methodology for successful development. Int. J. Prod. Econ. 2021, 233, 107992. [Google Scholar] [CrossRef]
  30. Keil, J.; Korte, A.; Ratmer, A.; Edler, D.; Dickmann, F. Augmented Reality (AR) and Spatial Cognition: Effects of Holographic Grids on Distance Estimation and Location Memory in a 3D Indoor Scenario. PFG—J. Photogramm. Remote Sens. Geoinform. Sci. 2020, 88, 165–172. [Google Scholar] [CrossRef]
  31. Pettijohn, K.A.; Peltier, C.; Lukos, J.R.; Norris, J.N.; Biggs, A.T. Virtual and augmented reality in a simulated naval engagement: Preliminary comparisons of simulator sickness and human performance. Appl. Ergon. 2020, 89, 103200. [Google Scholar] [CrossRef] [PubMed]
  32. Tsai, W.; Sun, X.; Huang, Q.; Karatza, H. An ontology-based collaborative service-oriented simulation framework with Microsoft Robotics Studio®. Simul. Model. Pr. Theory 2008, 16, 1392–1414. [Google Scholar] [CrossRef]
  33. Matta-Gómez, A.; Del Cerro, J.; Barrientos, A. Multi-robot data mapping simulation by using microsoft robotics developer studio. Simul. Model. Pr. Theory 2014, 49, 305–319. [Google Scholar] [CrossRef] [Green Version]
  34. Marino, E.; Barbieri, L.; Colacino, B.; Fleri, A.K.; Bruno, F. An Augmented Reality inspection tool to support workers in Industry 4.0 environments. Comput. Ind. 2021, 127, 103412. [Google Scholar] [CrossRef]
  35. Blaga, A.; Militaru, C.; Mezei, A.-D.; Tamas, L. Augmented reality integration into MES for connected workers. Robot. Comput. Manuf. 2021, 68, 102057. [Google Scholar] [CrossRef]
  36. Li, H.; Chan, G.; Wong, J.K.W.; Skitmore, M. Real-time locating systems applications in construction. Autom. Constr. 2016, 63, 37–47. [Google Scholar] [CrossRef] [Green Version]
Figure 1. VOSviewer heatmap for identifying widely used keywords.
Figure 2. The proposed system’s architecture, after the Real-Time Location System addition.
Figure 3. (a) Vector before the alignment (misaligned vector); (b) Vector after the alignment (the coordinate systems are aligned); (c) The AR application’s interface where the employee selects the destination point.
Figure 4. On the left side, the virtual environment is presented (the initial (a1) and an updated (a2) position), and on the right side, the real-world environment from the mobile device that includes the AR objects for creating a virtual path and guiding the employee to the destination point (the initial (b1) and the updated (b2) virtual path, respectively).
Figure 4. On the left side, the virtual environment is presented (the initial (a1) and an updated (a2) position), and on the right side, the real-world environment from the mobile device that includes the AR objects for creating a virtual path and guiding the employee to the destination point (the initial (b1) and the updated (b2) virtual path, respectively).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
