
Study of Social Presence While Interacting in Metaverse with an Augmented Avatar during Autonomous Driving

by Gheorghe Daniel Voinea *, Florin Gîrbacia, Cristian Cezar Postelnicu, Mihai Duguleana, Csaba Antonya, Adrian Soica and Ruxandra-Cristina Stănescu
Department of Automotive and Transport Engineering, Transilvania University of Brasov, 500036 Brasov, Romania
* Author to whom correspondence should be addressed.
Submission received: 23 October 2022 / Revised: 11 November 2022 / Accepted: 16 November 2022 / Published: 20 November 2022
(This article belongs to the Special Issue Enabling Technologies and Critical Applications of Metaverse)

Abstract
In this paper, we studied the effects of using Microsoft HoloLens 2 in a Metaverse-based collaborative mixed reality environment on the driver’s social presence while using an autonomous driving system. In (semi-)autonomous vehicles, the driver becomes the system’s monitor, and the driving process becomes a secondary task. Our approach is motivated by the advent of Microsoft Mesh XR technology, which enables immersion in multi-person, shared mixed reality environments. We conducted a user study comparing the effects on social presence in two scenarios: baseline and mixed reality collaboration. In the baseline condition, participants communicated and interacted with another person through Skype/Meet installed on a mobile tablet. In the second scenario, the participants used the Microsoft Mesh application installed on HoloLens 2 to collaborate in a mixed reality environment in which each user is represented by an augmented 3D avatar. During the experiment, the participant had to perform a social-interaction tell-a-lie task and a remote collaborative tic-tac-toe game, while also monitoring the vehicle’s behavior. Social presence was measured using the Harms and Biocca questionnaire, one of the most widely used tools for evaluating the user’s experience. We found statistically significant differences for Co-presence, Perceived Emotional Interdependence, and Perceived Behavioral Interdependence, and participants were able to interact easily with the avatar in the mixed reality scenario. The proposed study procedure could be extended to assess the driver’s performance during handover procedures, especially when the autonomous driving system encounters a critical situation.

1. Introduction

The Metaverse is a participatory, immersive, dynamic, and multidimensional realm where humans are connected, socially and virtually. The Metaverse is capitalizing on four major trends. First, the integration of augmented reality (AR) with virtual reality (VR) has turned immersive technology into a widely used medium that gives users a sensation of physical presence. Second, the expansion of the Internet into human-to-device and human-to-human interfaces enables embodied social interaction. Third, the development of immersive mobile technologies, such as smartphones and head-mounted devices, is making the Metaverse more accessible and popular. Fourth, the convergence of pervasive reality with a digital world powered by artificial intelligence (AI) and blockchain enables real-time, ubiquitous social interactions [1].
The primary task of the driver is to keep the vehicle on the road and follow traffic rules and regulations in order to reach the destination safely [2]. Besides this, other tasks, referred to as secondary tasks, can also be performed, such as interacting with the navigation system, changing the radio station, or responding to or initiating phone calls and text messages. Distracted driving occurs when the driver takes their eyes off the road, including while performing these secondary tasks, and poses a significant risk to the safety of the driver and of other road participants. Cognitive distractions [3] were found to have a negative effect on hazard anticipation. A literature review on driver behavior was presented in [4]. The driver is the decision-maker and combines two types of behavior: an automatic one, which is fast and effortless, and a slower one, which is more deliberate. Driving is a safety-critical task, and the driver is also responsible for risk management [5]. In the decision-making process, the information flow activates mental models, which serve as internal representations of the current state. To perform a task, the mental model goes through a cycle of stages: perception, prediction, evaluation, and action [6].
The Society of Automotive Engineers (SAE) has defined six levels of driving automation in its J3016 standard, from SAE level 0 (no automation) to SAE level 5 (full driving autonomy). The driving process becomes a secondary task in (semi-)autonomous driving systems, as the human operator switches to supervising the process at SAE level 3 and can be completely out-of-the-loop at SAE level 4 (if certain road conditions are met) and level 5 (in all driving conditions) [7]; a minimal summary of this taxonomy is sketched below. The driving time, which becomes rather a monitoring time, can be used for other, non-driving-related tasks [8]. One type of activity in which the driver can engage while traveling is communicating and collaborating through a digital platform. This is commonly performed using conventional 2D video conferencing systems (Zoom, Google Meet, Microsoft Skype, Meta Messenger, WhatsApp). Recently, Nissan [9] presented the Invisible-to-Visible (I2V) concept, which uses mixed reality to connect drivers and other participants by displaying them as 3D augmented reality (AR) avatars inside the vehicle. The I2V concept is likely to gain more traction with the progress of the Metaverse; however, there are certain technological and social challenges ahead. One issue could be the acceptance of the new technology, which can be addressed through proper user studies. Moreover, the system should enable social presence, which has been defined as the “sense of being together with another” [10]. Social presence has also been described as an important quantification factor for 3D remote collaboration systems [11,12]. The guidelines proposed in [10] were used in this work to assess social presence.
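As a compact reference, the J3016 taxonomy mentioned above can be encoded as a simple lookup. The sketch below paraphrases the levels and is an illustration only, not a substitute for the standard’s normative definitions.

```python
# Paraphrased summary of the SAE J3016 driving automation levels (illustrative only;
# see the standard for the normative definitions).
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0       # the driver performs the entire driving task
    DRIVER_ASSISTANCE = 1   # steering OR speed support; the driver monitors
    PARTIAL_AUTOMATION = 2  # steering AND speed support; the driver monitors
    CONDITIONAL = 3         # the system drives; the driver must take over on request
    HIGH_AUTOMATION = 4     # no driver needed within certain road conditions
    FULL_AUTOMATION = 5     # no driver needed in all driving conditions

def driver_in_the_loop(level: SAELevel) -> bool:
    """Up to level 3 the human still drives or supervises; at levels 4 and 5 the
    human can be completely out of the loop."""
    return level <= SAELevel.CONDITIONAL
```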
In [13], a social robot, AIDA, is presented that uses the driver’s mobile device as its face and is designed to be a friendly in-car companion. The results of an experimental study in which participants were placed in a mock in-car environment show that AIDA users were less stressed. The effects of different types of in-vehicle intelligent agents (IVIAs) on drivers’ perceptions in autonomous driving scenarios were investigated in [14]. The results revealed that drivers were more engaged when conversational agents were present. A recent study highlighted that productivity could be improved by using AR headsets, which can provide large, flexible virtual workspaces for passengers who work while traveling by car, train, subway, or plane [15]. Therefore, we can hypothesize that it is important for future in-car companions to be embodied and interactive.
In this paper, we explore how subjects using HoloLens 2 perceive social interaction while performing two non-driving activities in a Metaverse-based mixed reality environment during autonomous driving. We present the findings of a user study in which participants engaged in an interpersonal discussion, as well as in a simple yet competitive tic-tac-toe game. To our knowledge, no studies have evaluated social presence in a collaborative, immersive mixed reality environment during autonomous driving.

2. Related Work

2.1. Collaboration in Metaverse

The idea of collaborating inside mixed reality applications is not new; studies on the subject span the last three decades. One of the earliest articles on the topic showed that two issues can be addressed thanks to the particularities of extended environments: the seamlessness of activities and the enhancement of reality [16]. Initially, due to technological limitations, researchers focused on creating system architectures that would support real-time collaboration: tools that would allow interaction and equipment that would increase immersion. This research translated into an overall improvement of presence, the main focus of most virtual and augmented reality scenarios [17].
Meanwhile, things have improved, both technologically and in terms of acceptance. Besides virtual reality (VR), augmented/extended/mixed reality has also gained popularity. The technology has evolved and become cheaper, and commercial mixed reality equipment is now at our disposal, such as Microsoft HoloLens 2, Vuzix Blade 2 [18], Magic Leap 2 [19], and many others. However, unlike VR headsets such as Oculus Quest 2, mixed reality headsets remain considerably more expensive.
Remote collaboration can be enhanced by the use of extended reality technologies [20], which provide tools that can be used to create virtual environments in which users can view data from multiple sources, including 3D virtual avatars, and interact with each other in real time [21].
Nevertheless, what capabilities does mixed reality offer, and how do they empower the field workforce and increase its efficiency? As presented in several studies [22,23], mixed reality offers time-sensitive solutions to critical issues: information is quickly accessible during on-site operations in ways that were not possible before. Mixed reality connects workers in challenging conditions with remote experts [24] to solve problems faster and more safely. Thanks to the development of communication technologies (5G), workers can now hear real-time audio and see video documentation directly where they need it. Besides architecture, engineering, construction, and operation (AECO) [25], mixed reality has also been used in mining operations [26], food production and consumption [27], and healthcare [28], and, more recently, in socializing and entertainment. The idea of mixing virtual objects with the real environment is so appealing that one of the biggest tech companies, Facebook, has rebranded itself as Meta, with the long-term goal of creating the means and technology to make the Metaverse a reality [29].

2.2. Avatar Representation

The main advantage of using AR avatars in collaborative environments is that their representation requires a small computational cost compared to 3D scanning and reconstruction. Researchers realized that it is possible to obtain levels of immersion similar to real-life imagery in communication experiments by using low-poly animations instead of continuously retrieving exact data from the involved parties. For example, although Meta is facing difficulties in creating the Metaverse experience presented by Mark Zuckerberg in late 2021, using virtual avatars to colocate users remains appealing. For their new initiative, Meta Horizon Worlds, virtual reality became the main priority [30]. On this platform, avatars have a cartoon-like appearance; however, other services propose much more realistic avatars [31].
Avatars have been represented as live video streams [32], 3D representations from depth camera captures [33], or volumetrically, using the already established concept of holoportation [34].

2.3. Metaverse for Autonomous Driving

The implementation of autonomous vehicles in a Metaverse holds great promise and has already attracted the attention of several automotive industry players, such as Nissan with their Invisible-to-Visible (I2V) concept [9] or Hyundai with their Metamobility concept [35]. The I2V vision is focused on creating an augmented reality channel between real and virtual world information in order to offer the ultimate in-vehicle experience for drivers and passengers (Figure 1). Nissan’s Omni-Sensing technology consists of a virtual hub that collects exterior information (e.g., road status, signage, weather conditions, and nearby vehicles and pedestrians) and interior information (the driver’s state of alertness, facial expressions, and body tracking). The real-world data can be used in the Metaverse by a digital twin, thereby creating a link between the virtual space and the real world. Moreover, the digital twin can be used to transfer information from the real world to an augmented or mixed reality interface. BMW’s Omniverse [36] also relies on digital twins, analytics, and AI; however, their aim is to set new standards in virtual factory planning by simulating every aspect of their manufacturing processes.
Hyundai intends to become a pioneer by establishing a connection between smart devices and the Metaverse that will allow mobility to include virtual reality (VR) and eventually help individuals overcome their physical restrictions on movement in both time and space. One of their initiatives was the launch of the Hyundai Mobility Adventure—a Metaverse space on Roblox in which users can socialize as digital characters or avatars and experience the company’s popular vehicles and future mobility solutions [37].
The artificial intelligence algorithms for autonomous driving require an enormous amount of data that should represent all the driving scenarios that can appear on the road. Tesla’s Autopilot is considered to be the most advanced due to the billions of miles used to train its neural networks. The key areas in which Tesla holds the upper hand, owing to its vast fleet and bleeding-edge technology, are computer vision, prediction, and path planning. Nonetheless, there is still not enough data for fully autonomous vehicles to understand and react to road events as humans do. The Metaverse can prove extremely efficient and safe for gathering data from driving in virtual environments, where the AI agent can be pushed to the limit by well-designed scenarios, while also reducing the carbon footprint. Recently, Oxbotica Driver was launched, offering a suite of low-energy, high-performance tools with features such as virtual world simulation, real-time data expansion, and automated discovery of difficult scenarios [38].
In a Metaverse, users interact and spend time in a virtual world. There is a need for transportation or mobility wherever there are people. The Oslo study [39] proposed a new mobility concept based on shared mobility as a service (MaaS). Although there are several pilot projects around the world, it is difficult to create cross-platform collaborations. Furthermore, connected cars will still have to face challenges similar to those smartphones have faced in the past, such as surveillance, privacy, and accountability. Simulating different MaaS strategies in a Metaverse could shorten the time frame until shared mobility is adopted on a large scale in the real world. Metaverse users could travel with virtual autonomous vehicles that simulate the behavior of a real driver, which in turn could increase their confidence in using an autonomous vehicle in the real world. The Metaverse could also be used to create in-vehicle entertainment experiences based on augmented reality and gamification, as well as creative and localized advertisements.
The role of urban mobility and autonomous driving in a smart city has been discussed in [40]. The authors found that the adoption of autonomous vehicles (AVs) may not be the smartest choice and proposed micromobility or other non-AV modes of transportation. Digital twins of real vehicles could be integrated into digital twins of smart cities, which could help predict traffic flow in real time, simulate various urban conditions, and increase road safety.
Connected cars can communicate with other vehicles, as well as with the infrastructure. The future smart city will need to be fully compliant with the needs of AVs and with the requirements of the Metaverse. A synergy between the Metaverse and the smart city could increase the quality of life and have a significant impact on decarbonization [41,42].

2.4. Social Presence

Human beings have an innate need for social connectedness. Social interactions are said to influence the internal states of a person, such as the capacity to perceive and understand others’ thoughts, intentions, and associated behaviors, thus altering the decision-making process [43]. The concept of social presence is essential for successful computer-mediated environments such as online learning, online assisted shopping, teleconferencing, gaming, and assisted driving [44].
Unlike telepresence and self-presence, social presence requires an actual experience in a virtual environment (VE) mediated by a co-present entity. Social presence describes a person’s state in an environment: the state of consciousness in a VE and the sense of being in a place [45]. “The minimum level of social presence occurs when users feel that a form, behavior, or sensory experience indicates the presence of another intelligence. The amount of social presence is the degree to which a user feels access to the intelligence, intentions, and sensory impressions of another” [46]. Further definitions of social presence can be found in [47].
Presence, per se, is a complex concept to describe and measure. A standardized instrument for assessing presence was proposed in [48]. Their tool identified six dimensions for presence: social richness, realism, transportation, immersion, social actor within medium, and medium as social actor.
A review of theories on social presence found that social presence consists of three dimensions related to interaction and behavior during VE experiences: co-presence, psychological involvement, and behavioral engagement [10]. Short et al. introduced the concept of social presence, referring both to the participants’ feeling of connectedness and to the perceived psychological distance during the interaction [49]. Social presence is primarily an interpersonal relation, with verbal and non-verbal language being a key element of interaction in a VE [50], and it is essential in shared VE interactions, where users feel not merely part of a group but fully immersed in the place [45]. Immersion is the quantifiable technological capacity of a medium to deliver “an inclusive, surrounding and vivid” [45] illusion of reality to all the participants involved in the interaction. The perception of the VE increases with the capacity to be present and engaged in virtual reality; thus, the environment delivered by the VR headset appears more intuitive, realistic, and attractive.
In their extensive review, Oh et al. [50] showed that social presence may be influenced by several factors which contribute to the overall impact on the participants. Studies showed that face-to-face interactions are preferred to computer-mediated communication. The quality of the partner’s representation in a VR interaction has a great impact on one’s “sense of social presence.” The availability of a visual representation and the visual realism of the virtual environment directly influence the level of social presence, in comparison with experiences where participants cannot see their partner’s avatar and therefore interact with an invisible partner. The more the virtual representation acts (communicates, behaves) and looks like a human being, the greater the realism factor and the stronger the impact on social presence [50]. Behavioral patterns of an individual in a VE, and the extent to which the same or similar behaviors or responses appear in normal life circumstances, can be observed and correlated with certain aspects of immersion [45]. Higher levels of social presence are perceived when participants have the opportunity to interact directly with the partner [51], and even more so when the partners are jointly manipulating a virtual object and receive haptic feedback [50].

3. Materials and Methods

A user study was conducted to evaluate how subjects assess social presence while interacting with another person by means of a 2D mobile tablet and a mixed reality HoloLens 2 headset. The interaction through the mobile tablet represents the baseline scenario, in which a traditional, well-known 2D platform (Skype/Meet) is used. The cross-platform Microsoft Mesh application was installed on the HoloLens 2 to enable collaboration between users in a mixed reality environment. Each participant was required to perform two tasks: the tell-a-lie social interaction task [52] and a tic-tac-toe game. The proposed method is relatively simple and aims to highlight how social presence is affected by a novel interaction paradigm involving mixed reality and an animated 3D avatar.

3.1. Experimental Setup

The custom simulator (Figure 2) used in this study consists of a Stewart motion platform with six degrees of freedom (MOOG 6 DOF 2000E) [53,54] and a driving seat with a Logitech steering wheel and pedals. The motion platform offers a realistic force feedback which enhances the immersion in the virtual environment. The dynamic model of the simulator is based on the Motion Cueing Algorithm [55] and details about the implementation were presented in a previous study [56].
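To make the cueing idea concrete, the following is a minimal sketch of the washout stage at the core of classical motion cueing algorithms, assuming a first-order high-pass filter; the filter order, time constant, and scaling are illustrative values, not the tuning used on the MOOG platform.

```python
# Minimal washout sketch: a high-pass filter removes sustained accelerations so the
# platform renders onset cues and then drifts back to neutral within its workspace.
# dt, tau, and scale are illustrative assumptions, not the simulator's actual tuning.
def washout(accel_samples, dt=0.01, tau=2.0, scale=0.5):
    """First-order high-pass filter applied to scaled vehicle acceleration (m/s^2)."""
    a = tau / (tau + dt)          # discrete-time filter coefficient
    y, x_prev, out = 0.0, 0.0, []
    for x in accel_samples:
        x = scale * x             # scale down to fit the platform workspace
        y = a * (y + x - x_prev)  # passes acceleration onsets, washes out steady input
        x_prev = x
        out.append(y)
    return out

# A constant 1 m/s^2 input is reproduced at onset and then washed out towards zero.
print(washout([1.0] * 500)[::100])
```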
The autonomous vehicle driving behavior was implemented with the tools provided by the open-source software platform CARLA [57]. The 0.9.13 release of CARLA was used on a VR-ready desktop computer with an RTX 3090 graphics card, an AMD Ryzen 9 5950X processor, and 32 GB of RAM. The basic autopilot function integrated in CARLA was sufficient for the purpose of this study.
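As an illustration of how such a setup is driven through CARLA’s Python API (release 0.9.13), the sketch below spawns an ego vehicle and hands it to the built-in autopilot; the host/port, vehicle blueprint, and spawn point are assumptions for illustration, as the exact configuration is not detailed in the paper.

```python
# Minimal sketch: connect to a running CARLA 0.9.13 server, spawn an ego vehicle,
# and enable the built-in autopilot. Endpoint and blueprint choice are assumptions.
import carla

client = carla.Client("localhost", 2000)  # default CARLA server endpoint
client.set_timeout(10.0)
world = client.get_world()

# Spawn a vehicle at the first predefined spawn point of the current map.
blueprint = world.get_blueprint_library().filter("vehicle.tesla.model3")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(blueprint, spawn_point)

vehicle.set_autopilot(True)  # hand longitudinal/lateral control to the autopilot
```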
In this study, the participants visualized the immersive mixed reality environment through a Microsoft HoloLens 2 headset. The mixed reality experience offered by HoloLens 2 is possible thanks to its see-through holographic lenses with a 2k resolution, a second-generation holographic processing unit (HPU), a Qualcomm Snapdragon 850 processor, an advanced set of cameras (light camera, infrared cameras, and depth sensor), a 5-channel microphone array, and built-in spatial sounds. The main features of HoloLens 2 are hand tracking, real-time eye tracking, and iris recognition, as well as voice command and control.
The Microsoft Mesh [58] preview application was used to virtually bring people together and facilitate collaborative mixed reality experiences. Mesh contains a collection of technologies and tools designed to enable rich, immersive 3D collaboration between people. Mesh allows users to interact with 3D content by capturing, storing, and displaying 3D objects. Moreover, Mesh users can move and look around in the digital space and can play with, rotate, manipulate, and point to 3D objects as if they were in the same physical space. Mesh also supports spatial audio, rendering sounds in real time from the position of their virtual source, so that the user’s eyes and ears agree on where a sound is coming from.
Microsoft Mesh provides a set of out-of-the-box cartoon avatars that can be personalized by the user. Each participant was represented in the 3D digital space as a custom cartoon avatar in an immersed way. The avatar can move and display facial expressions consistent with the user’s actions acquired through HoloLens 2 sensors. The participants can have a unique perspective in the multi-user scenario, can bring in 3D models and interact with them, as well as create annotations using a pinch gesture.

3.2. Participants and Procedure

A total of 24 subjects participated in the study: 18 men and 6 women. The age of the participants ranged from 23 to 64 years (M = 36.62, SD = 13.09). They were master’s students and professors from our university. All had previous experience with 2D video conference systems (Microsoft Skype, Google Meet), though none had experience with immersive AR headsets. Seventeen participants (70.83%) had previous experience with AR, mostly from playing AR games on mobile devices. All users volunteered to participate in the study.
As for the experimental tasks, we chose the social-interaction tell-a-lie task [52] and a tic-tac-toe game (see Figure 3). At the beginning of the experiment, we presented the aim of the research and instructed the participants on how to use the HoloLens 2. The experiment was conducted with one participant on the driving simulator using one HoloLens 2 and a researcher from our laboratory playing the role of a remote collaborator using another HoloLens 2. The whole process took approximately 20 min, of which about 5 min were allocated for the user to get accustomed to the mixed reality environment and the hand tracking capabilities of the HoloLens 2.
Because the driving task was secondary, the participants were instructed to focus their attention on the collaborative interaction with the avatar. Social presence was measured using the Harms and Biocca questionnaire [59], whose main aim is to capture the sense of being together and to assess attention and behavior during the interaction with the AR avatar. The social presence questionnaire uses a 7-point Likert scale (1: strongly disagree to 7: strongly agree).

4. Results

The responses to the Harms and Biocca social presence questionnaire were centralized in Excel and then analyzed with SPSS. The questionnaire evaluates six constructs: Co-presence (CoP), Attentional Allocation (Att), Perceived Message Understanding (Msg), Perceived Affective Understanding (Aff), Perceived Emotional Interdependence (Emo), and Perceived Behavioral Interdependence (Behv). The first statistical analysis aimed to verify the reliability of the scales by calculating Cronbach’s alpha [60]. Next, the mean and standard deviation of the responses were obtained (Table 1 and Figure 4, as box plots). Lastly, a paired t-test was applied to check for statistically significant differences between the baseline and augmented reality scenarios.
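For readers who want to reproduce the reliability check, the sketch below computes Cronbach’s alpha, $\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_i \sigma_i^2}{\sigma_t^2}\right)$, on a per-construct response matrix. The study used Excel and SPSS; this is an equivalent computation on hypothetical data, not the authors’ script.

```python
# Cronbach's alpha for one construct: rows = participants, columns = questionnaire
# items. Equivalent to an SPSS reliability analysis; the demo data are hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]                         # number of items in the construct
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 7-point Likert responses from four participants on a 3-item construct.
demo = np.array([[6, 7, 6], [5, 5, 6], [7, 6, 7], [4, 5, 4]])
print(round(cronbach_alpha(demo), 3))  # ~0.875 for this demo data
```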

4.1. Baseline Scenario

All constructs were found to be reliable, with Cronbach’s alpha values above 0.69. The results for each construct are as follows: Co-presence (M = 6.103, SD = 0.683), Attentional Allocation (M = 5.062, SD = 0.587), Perceived Message Understanding (M = 5.890, SD = 0.568), Perceived Affective Understanding (M = 4.680, SD = 0.680), Perceived Emotional Interdependence (M = 4.993, SD = 0.680), and Perceived Behavioral Interdependence (M = 5.022, SD = 0.709).

4.2. Augmented Reality Scenario

Cronbach’s alpha indicated good internal consistency for each set of questions (over 0.7). The results for Co-presence (M = 6.563, SD = 0.370) show that participants felt that they were not alone, most likely due to the dynamic and tridimensional nature of the tasks. Attentional Allocation (M = 4.897, SD = 0.599) was scored moderately, as the subjects allocated a part of their attention to the driving scene. Perceived Message Understanding (M = 6.189, SD = 0.511) had a higher value, as the process of transmitting and receiving messages was enhanced by the spatial audio offered by HoloLens 2, as well as by the mouth movements of the partner’s 3D augmented avatar when speaking. We observed smaller values and a wider variation in the responses for Perceived Affective Understanding (M = 4.862, SD = 0.679) and Perceived Emotional Interdependence (M = 4.430, SD = 0.820), mainly due to the cartoon-like representation of the avatar and the relatively short duration of the study. Furthermore, assessing the partner’s feelings can be difficult, especially when interacting with an unfamiliar person, as was the case in our study. Lastly, participants gave high ratings for Perceived Behavioral Interdependence (M = 5.917, SD = 0.727), which was to be expected given the interactive task of playing tic-tac-toe in the same virtual space.

4.3. Statistical Analysis

A paired t-test analysis revealed statistically significant differences for three constructs: CoP (t(23) = −2.991, p = 0.007), Emo (t(23) = 2.330, p = 0.029), and Behv (t(23) = −4.393, p < 0.001). No statistically significant differences were found for the Att (t(23) = 0.982, p = 0.336), Msg (t(23) = −1.726, p = 0.098), and Aff (t(23) = −0.932, p = 0.361) constructs.
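The paired comparison can be reproduced with SciPy as sketched below, assuming per-participant construct scores from both scenarios are available as equal-length arrays; the values here are hypothetical placeholders, as the raw responses are not public.

```python
# Paired (dependent-samples) t-test on per-participant construct scores from the two
# scenarios. The arrays are hypothetical stand-ins for the 24 participants' data.
import numpy as np
from scipy import stats

baseline_cop = np.array([6.1, 5.8, 6.4, 5.9, 6.3, 6.0])  # tablet scenario (illustrative)
mixed_cop = np.array([6.6, 6.4, 6.7, 6.5, 6.8, 6.3])     # HoloLens 2 scenario (illustrative)

t, p = stats.ttest_rel(baseline_cop, mixed_cop)
print(f"t({len(baseline_cop) - 1}) = {t:.3f}, p = {p:.3f}")
```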

5. Discussion

The integration of digital content with the real world through mixed reality devices has started to make its way into our society. Although XR technology has not yet become widely accessible, it has shown great promise in the industrial sector, as well as in remote collaboration. There are numerous studies on automated driving and on mixed reality experiences, but none address both subjects together. In a transition period towards highly and fully autonomous vehicles, the current study aimed to add knowledge regarding the interaction with a 3D augmented avatar while using an autonomous driving system. As we spend more time commuting, with an average of 30 to 60 min per day one-way [61,62,63], the need to take control of our time becomes imperative. Thus, the tasks that may or may not be performed while in the car will be of utmost importance. Social presence becomes a critical factor in assessing how this new driving approach could be fully accepted. An important aspect that could boost the adoption of the Metaverse is ensuring that applications offer high levels of social presence in addition to well-designed user interfaces. We therefore conducted a user study that examined social presence in two different scenarios: baseline and mixed reality collaboration. The subjective responses showed that the users preferred the interaction with the virtual avatar. Statistically significant differences were found in three of the six constructs: Co-presence, where the dynamic interaction increased the sense of presence; Perceived Emotional Interdependence, which appeared to be influenced by the avatar representation and the duration of the study; and Perceived Behavioral Interdependence, which increased due to the interactive tasks. An interesting note can be made about the increased value of Perceived Message Understanding, which was influenced by the enhanced audio and video quality of the interaction. This supports the idea that paraverbal (audio) and non-verbal (visual) language play a key role in enhancing the sense of social presence in AR interactions as an interpersonal relationship. The results could also have been influenced by the novelty of the mixed reality interaction with an animated 3D avatar.
The results also show that the participants were more likely to interact with the avatar in the AR environment than through video conference systems. The use of personal electronic devices such as smartphones during autonomous driving can lead to a feeling of social isolation, as people do not experience effective collaboration. AR headsets can provide a more immersive experience, allowing users to feel as if they are interacting with the remote collaborator rather than simply looking at them, as in video conferencing. Immersive virtual avatars can create a high degree of social presence by making it feel as if the person you are communicating with is right in front of you. They can also help create a more realistic and engaging experience, which is important for effective collaboration. AR Metaverse environments can enable passengers of future autonomous vehicles to use their travel time in new and productive ways by augmenting their real environment with digital content that allows for interactive experiences. However, more emphasis should be placed on the environmental conditions of the interaction, as the Harms and Biocca questionnaire seems to overlook this aspect and focus only on the interaction itself, an issue confirmed by another study [64].

6. Conclusions

In this paper, we evaluated social presence while interacting in a mixed reality scene with a 3D augmented avatar during autonomous driving. This work was motivated by the fact that humans will engage in non-driving activities in autonomous vehicles, as well as by the important role that the Metaverse will have in the near future. The Harms and Biocca questionnaire was used to assess the participants’ perception of collaborating in a mixed reality environment with the HoloLens 2. This initial study found that participants gave higher ratings to three of the six constructs in the mixed reality scenario.
Future studies could focus on assessing the driver’s performance in handover maneuvers while collaborating in a mixed reality scene with a 3D animated avatar, and further expand the qualitative analysis by integrating interviews with the participants to better understand their experience. The results of such studies could be used to design driver training programs to ensure that drivers are aware of the limitations of the system in order to be prepared to intervene when the autonomous driving system encounters a critical situation.

Author Contributions

Conceptualization, C.A. and G.D.V.; methodology, F.G. and M.D.; software, G.D.V. and F.G.; validation, R.-C.S., G.D.V. and C.C.P.; formal analysis, C.A.; data curation, C.C.P.; writing—original draft preparation, G.D.V. and F.G.; writing—review and editing, M.D. and C.A.; visualization, C.C.P.; supervision, A.S. and R.-C.S.; project administration, C.A. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a grant of the Romanian Ministry of Education and Research, CCCDI–UEFISCDI, project number PN-III-P2-2.1-PED-2019-4366 (431PED), within PNCDI III.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Transilvania University of Brasov.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets generated and/or analyzed during the current study are not publicly available due to confidentiality agreements but are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, Y.; Su, Z.; Zhang, N.; Xing, R.; Liu, D.; Luan, T.H.; Shen, X. A survey on metaverse: Fundamentals, security, and privacy. IEEE Commun. Surv. Tutor. 2022.
  2. Jin, L.; Guo, B.; Jiang, Y.; Wang, F.; Xie, X.; Gao, M. Study on the impact degrees of several driving behaviors when driving while performing secondary tasks. IEEE Access 2018, 6, 65772–65782.
  3. Ebadi, Y.; Fisher, D.L.; Roberts, S.C. Impact of cognitive distractions on drivers’ hazard anticipation behavior in complex scenarios. Transp. Res. Rec. 2019, 2673, 440–451.
  4. Ranney, T.A. Models of driving behavior: A review of their evolution. Accid. Anal. Prev. 1994, 26, 733–750.
  5. Luke, R.; Heyns, G.J. Reducing risky driver behaviour through the implementation of a driver risk management system. J. Transp. Supply Chain Manag. 2014, 8, 1–10.
  6. Boer, E.R.; Hoedemaeker, M. Modeling driver behavior with different degrees of automation: A hierarchical decision framework of interacting mental models. In Proceedings of the 17th European Annual Conference on Human Decision Making and Manual Control, Valenciennes, France, 14–16 December 1998; pp. 63–72.
  7. SAE International. J3016: Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles. Available online: https://www.sae.org/standards/content/j3016_202104/ (accessed on 21 October 2022).
  8. Naujoks, F.; Forster, Y.; Wiedemann, K.; Neukum, A. Improving usefulness of automated driving by lowering primary task interference through HMI design. J. Adv. Transp. 2017, 2017, 6105087.
  9. Nissan Motor Corporation. Invisible-to-Visible (I2V) Technology. Available online: https://www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/i2v.html (accessed on 21 October 2022).
  10. Biocca, F.; Harms, C.; Burgoon, J.K. Toward a more robust theory and measure of social presence: Review and suggested criteria. Presence Teleoper. Virtual Environ. 2003, 12, 456–480.
  11. Jo, D.; Kim, K.-H.; Kim, G.J. Effects of avatar and background types on users’ co-presence and trust for mixed reality-based teleconference systems. In Proceedings of the 30th Conference on Computer Animation and Social Agents, Seoul, Republic of Korea, 22–24 May 2017; pp. 27–36.
  12. Yoon, B.; Kim, H.-i.; Lee, G.A.; Billinghurst, M.; Woo, W. The effect of avatar appearance on social presence in an augmented reality remote collaboration. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 547–556.
  13. Williams, K.J.; Peters, J.C.; Breazeal, C.L. Towards leveraging the driver’s mobile device for an intelligent, sociable in-car robotic assistant. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium (IV), Gold Coast, Australia, 23–26 June 2013; pp. 369–376.
  14. Wang, M.; Lee, S.C.; Kamalesh Sanghavi, H.; Eskew, M.; Zhou, B.; Jeon, M. In-vehicle intelligent agents in fully autonomous driving: The effects of speech style and embodiment together and separately. In Proceedings of the 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Leeds, UK, 9–14 September 2021; pp. 247–254.
  15. Medeiros, D.; McGill, M.; Ng, A.; McDermid, R.; Pantidi, N.; Williamson, J.; Brewster, S. From Shielding to Avoidance: Passenger Augmented Reality and the Layout of Virtual Displays for Productivity in Shared Transit. IEEE Trans. Vis. Comput. Graph. 2022, 28, 3640–3650.
  16. Billinghurst, M.; Kato, H. Collaborative mixed reality. In Proceedings of the First International Symposium on Mixed Reality, Yokohama, Japan, 9–11 March 1999; pp. 261–284.
  17. Cadet, L.B.; Chainay, H. Memory of virtual experiences: Role of immersion, emotion and sense of presence. Int. J. Hum.-Comput. Stud. 2020, 144, 102506.
  18. Vuzix. Vuzix Blade 2 Smart Glasses. Available online: https://www.vuzix.com/products/vuzix-blade-2-smart-glasses (accessed on 21 October 2022).
  19. Magic Leap. Magic Leap 2. Available online: https://www.magicleap.com/magic-leap-2 (accessed on 21 October 2022).
  20. Xi, N.; Chen, J.; Gama, F.; Riar, M.; Hamari, J. The challenges of entering the metaverse: An experiment on the effect of extended reality on workload. Inf. Syst. Front. 2022, 1–22.
  21. Popescu, G.H.; Ciurlău, C.F.; Stan, C.I. Virtual Workplaces in the Metaverse: Immersive Remote Collaboration Tools, Behavioral Predictive Analytics, and Extended Reality Technologies. Psychosociol. Issues Hum. Resour. Manag. 2022, 10, 21–34.
  22. Cárdenas-Robledo, L.A.; Hernández-Uribe, Ó.; Reta, C.; Cantoral-Ceballos, J.A. Extended reality applications in Industry 4.0: A systematic literature review. Telemat. Inform. 2022, 73, 101863.
  23. Vasarainen, M.; Paavola, S.; Vetoshkina, L. A systematic literature review on extended reality: Virtual, augmented and mixed reality in working life. Int. J. Virtual Real. 2021, 21, 1–28.
  24. Dai, F.; Olorunfemi, A.; Peng, W.; Cao, D.; Luo, X. Can mixed reality enhance safety communication on construction sites? An industry perspective. Saf. Sci. 2021, 133, 105009.
  25. Cheng, J.C.; Chen, K.; Chen, W. State-of-the-art review on mixed reality applications in the AECO industry. J. Constr. Eng. Manag. 2020, 146, 03119009.
  26. Stothard, P.; Squelch, A.; Stone, R.; Van Wyk, E. Towards sustainable mixed reality simulation for the mining industry. Min. Technol. 2019, 128, 246–254.
  27. Chai, J.J.; O’Sullivan, C.; Gowen, A.A.; Rooney, B.; Xu, J.-L. Augmented/mixed reality technologies for food: A review. Trends Food Sci. Technol. 2022, 124, 182–194.
  28. Gsaxner, C.; Li, J.; Pepe, A.; Jin, Y.; Kleesiek, J.; Schmalstieg, D.; Egger, J. The HoloLens in Medicine: A Systematic Review and Taxonomy. arXiv 2022, arXiv:2209.03245.
  29. Fernandez, P. Facebook, Meta, the metaverse and libraries. Libr. Hi Tech News 2022, 39, 1–5.
  30. Kolesnichenko, A.; McVeigh-Schultz, J.; Isbister, K. Understanding emerging design practices for avatar systems in the commercial social VR ecology. In Proceedings of the 2019 on Designing Interactive Systems Conference, San Diego, CA, USA, 23–28 June 2019; pp. 241–252.
  31. Union Avatars. Available online: https://unionavatars.com/ (accessed on 9 November 2022).
  32. Kim, J.I.; Ha, T.; Woo, W.; Shi, C.-K. Enhancing social presence in augmented reality-based telecommunication system. In Proceedings of the International Conference on Virtual, Augmented and Mixed Reality, Las Vegas, NV, USA, 24 July 2013; pp. 359–367.
  33. Pejsa, T.; Kantor, J.; Benko, H.; Ofek, E.; Wilson, A. Room2Room: Enabling life-size telepresence in a projected augmented reality environment. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, San Francisco, CA, USA, 27 February–2 March 2016; pp. 1716–1725.
  34. Orts-Escolano, S.; Rhemann, C.; Fanello, S.; Chang, W.; Kowdle, A.; Degtyarev, Y.; Kim, D.; Davidson, P.L.; Khamis, S.; Dou, M. Holoportation: Virtual 3D teleportation in real-time. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; pp. 741–754.
  35. Hyundai Motor Company. Hyundai Shares Vision of New Metamobility Concept through Robotics and Metaverse at CES 2022. Available online: https://www.hyundai.news/eu/articles/press-releases/hyundai-shares-vision-of-new-metamobility-concept-through-robotics-and-Metaverse-at-CES-2022.html (accessed on 21 October 2022).
  36. BMW Group. BMW Group and NVIDIA Take Virtual Factory Planning to the Next Level. Available online: https://www.press.bmwgroup.com/global/article/detail/T0329569EN/bmw-group-and-nvidia-take-virtual-factory-planning-to-the-next-level?language=en (accessed on 21 October 2022).
  37. Hyundai Motor Company. Hyundai Motor Vitalizes Future Mobility in Roblox Metaverse Space, Hyundai Mobility Adventure. Available online: https://www.hyundai.com/worldwide/en/company/newsroom/hyundai-motor-vitalizes-future-mobility-in-roblox-Metaverse-space%252C-hyundai-mobility-adventure-0000016713 (accessed on 21 October 2022).
  38. Oxbotica. Oxbotica Partners with NEVS to Reshape the Future of Urban Mobility with Fleet of Shared Self-Driving All-Electric Vehicles. Available online: https://www.oxbotica.com/insight/oxbotica-partners-with-nevs-to-reshape-the-future-of-urban-mobility-with-fleet-of-shared-self-driving-all-electric-vehicles/ (accessed on 21 October 2022).
  39. COWI. The Oslo Study—How Autonomous Cars May Change Transport in Cities; COWI: Lyngby, Denmark, 2019; Volume 79.
  40. Richter, M.A.; Hagenmaier, M.; Bandte, O.; Parida, V.; Wincent, J. Smart cities, urban mobility and autonomous vehicles: How different cities needs different sustainable investment strategies. Technol. Forecast. Soc. Chang. 2022, 184, 121857.
  41. Allam, Z.; Sharifi, A.; Bibri, S.E.; Jones, D.S.; Krogstie, J. The metaverse as a virtual form of smart cities: Opportunities and challenges for environmental, economic, and social sustainability in urban futures. Smart Cities 2022, 5, 771–801.
  42. Pamucar, D.; Deveci, M.; Gokasar, I.; Tavana, M.; Köppen, M. A metaverse assessment model for sustainable transportation using ordinal priority approach and Aczel-Alsina norms. Technol. Forecast. Soc. Chang. 2022, 182, 121778.
  43. Soiné, A.; Flöck, A.N.; Walla, P. Electroencephalography (EEG) Reveals Increased Frontal Activity in Social Presence. Brain Sci. 2021, 11, 731.
  44. Cui, G.; Lockee, B.B.; Meng, C. Building modern online social presence: A review of social presence theory and its instructional design implications for future trends. Educ. Inf. Technol. 2012, 18, 661–685.
  45. Slater, M.; Wilbur, S. A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence Teleoper. Virtual Environ. 1997, 6, 603–616.
  46. Biocca, F. The Cyborg’s Dilemma: Progressive Embodiment in Virtual Environments. J. Comput.-Mediat. Commun. 1997, 3, JCMC324.
  47. Kreijns, K.; Xu, K.; Weidlich, J. Social presence: Conceptualization and measurement. Educ. Psychol. Rev. 2021, 34, 139–170.
  48. Lombard, M.; Ditton, T. At the heart of it all: The concept of presence. J. Comput.-Mediat. Commun. 1997, 3, JCMC321.
  49. Short, J.; Williams, E.; Christie, B. The Social Psychology of Telecommunications; Wiley & Sons: Toronto, ON, Canada, 1976.
  50. Oh, C.S.; Bailenson, J.N.; Welch, G.F. A Systematic Review of Social Presence: Definition, Antecedents, and Implications. Front. Robot. AI 2018, 5, 114.
  51. Skalski, P.; Tamborini, R. The role of social presence in interactive agent-based persuasion. Media Psychol. 2007, 10, 385–413.
  52. Zuckerman, M.; DePaulo, B.M.; Rosenthal, R. Verbal and nonverbal communication of deception. In Advances in Experimental Social Psychology; Elsevier: Amsterdam, The Netherlands, 1981; Volume 14, pp. 1–59.
  53. Hulme, K.; Kasprzak, E.; English, K.; Moore-Russo, D.; Lewis, K. Experiential learning in vehicle dynamics education via motion simulation and interactive gaming. Int. J. Comput. Games Technol. 2009, 2009, 952524.
  54. Stewart, D. A platform with six degrees of freedom. Proc. Inst. Mech. Eng. 1965, 180, 371–386.
  55. Reymond, G.; Kemeny, A. Motion cueing in the Renault driving simulator. Veh. Syst. Dyn. 2000, 34, 249–259.
  56. Antonya, C.; Irimia, C.; Grovu, M.; Husar, C.; Ruba, M. Co-simulation environment for the analysis of the driving simulator’s actuation. In Proceedings of the 2019 7th International Conference on Control, Mechatronics and Automation (ICCMA), Delft, The Netherlands, 6–8 November 2019; pp. 315–321.
  57. Dosovitskiy, A.; Ros, G.; Codevilla, F.; Lopez, A.; Koltun, V. CARLA: An open urban driving simulator. In Proceedings of the Conference on Robot Learning, Mountain View, CA, USA, 13–15 November 2017; pp. 1–16.
  58. Microsoft. Microsoft Mesh Overview. Available online: https://learn.microsoft.com/en-us/mesh/overview (accessed on 21 October 2022).
  59. Harms, C.; Biocca, F. Internal consistency and reliability of the networked minds measure of social presence. In Proceedings of the Seventh Annual International Workshop: Presence, Valencia, Spain, 13–15 October 2004.
  60. Hair, J.F.; Anderson, R.; Tatham, R.; Black, W. Factor analysis. In Multivariate Data Analysis; Prentice-Hall: Upper Saddle River, NJ, USA, 1998; pp. 98–99.
  61. Eurostat. Available online: https://ec.europa.eu/eurostat/web/products-eurostat-news/-/ddn-20201021-2 (accessed on 21 October 2022).
  62. Giménez-Nadal, J.I.; Molina, J.A.; Velilla, J. Trends in commuting time of European workers: A cross-country analysis. Transp. Policy 2022, 116, 327–342.
  63. Teodorovicz, T.; Kun, A.L.; Sadun, R.; Shaer, O. Multitasking while driving: A time use study of commuting knowledge workers to assess current and future uses. Int. J. Hum.-Comput. Stud. 2022, 162, 102789.
  64. Kim, K.; Schubert, R.; Hochreiter, J.; Bruder, G.; Welch, G. Blowing in the wind: Increasing social presence with a virtual human via environmental airflow interaction in mixed reality. Comput. Graph. 2019, 83, 23–32.
Figure 1. Metaverse implementation for autonomous driving systems.
Figure 2. Driving simulator.
Figure 3. Screenshots from HoloLens 2: (a) collaborative tic-tac-toe game using the Microsoft Mesh application; (b) the 3D avatar waving at the participant.
Figure 4. Box plots of the Likert scale ratings: Co-presence (CoP), Attentional Allocation (Att), Perceived Message Understanding (Msg), Perceived Affective Understanding (Aff), Perceived Emotional Interdependence (Emo), and Perceived Behavioral Interdependence (Behv).
Table 1. Mean and standard deviation (SD) values for each construct in both scenarios (mean ± SD).

| Scenario | CoP | Att | Msg | Aff | Emo | Behv |
| --- | --- | --- | --- | --- | --- | --- |
| Tablet | 6.103 ± 0.683 | 5.062 ± 0.587 | 5.890 ± 0.568 | 4.680 ± 0.680 | 4.993 ± 0.680 | 5.022 ± 0.709 |
| Mixed reality | 6.563 ± 0.370 | 4.897 ± 0.599 | 6.189 ± 0.511 | 4.862 ± 0.679 | 4.430 ± 0.830 | 5.917 ± 0.727 |
