Article

Towards a Design Space of Haptics in Everyday Virtual Reality across Different Spatial Scales

Department for Informatics, LMU Munich, 80337 Munich, Germany
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2021, 5(7), 36; https://0-doi-org.brum.beds.ac.uk/10.3390/mti5070036
Submission received: 30 April 2021 / Revised: 15 June 2021 / Accepted: 26 June 2021 / Published: 3 July 2021
(This article belongs to the Special Issue Perception and Cognition in XR)

Abstract

Virtual Reality (VR) has become a consumer-grade technology, especially with the advent of standalone headsets working independently from a powerful computer. Domestic VR mainly uses the visual and auditory senses since VR headsets make these readily accessible. Haptic feedback, however, has the potential to increase immersion substantially. So far, it is mostly used in laboratory settings with specialized haptic devices. Especially for domestic VR, there is underexplored potential in exploiting physical elements of the often confined space in which it is used. In a literature review (n = 20), we analyzed VR interaction using haptic feedback with or without physical limitations. From this, we derive a design space for VR haptics across three spatial scales (seated, standing, and walking). In our narrow selection of papers, we found inspiration for future work, which we discuss in two example scenarios. Our work gives a current overview of haptic VR solutions and highlights strategies for adapting laboratory solutions to an everyday context.

1. Introduction

In 1962, Morton Heilig invented the Sensorama [1], which can be considered a precursor to multi-modal virtual reality (VR) systems. Using a stereoscopic color display, fans, odor emitters, a stereo sound system, and a motion chair, it covered almost all human senses (except, for example, touch). Six years later, the advent of the first see-through head-mounted display (HMD) with real-time 3D computer graphics, the Ultimate Display by Ivan Sutherland [2], attracted researchers’ interest in the human senses located in the head, namely the visual and acoustic senses. Today, nearly six decades after the debut of the Sensorama, VR technology is well accessible to the public. With the rise of consumer-grade, standalone VR headsets such as the Oculus Quest 2, users have started to use them in their dynamic, everyday environments to immerse themselves in virtual environments for games and leisure. For interaction, they mainly use handheld controllers. This trend creates the challenge of transferring multi-modal immersive experiences from laboratory VR set-ups to everyday scenarios. The solutions in prior work are frequently set in spacious, isolated laboratories [3]. Moreover, compared to the high quality of the visual and auditory sensations from the HMD, the controllers only provide a low-fidelity haptic sensation that is limited to vibrotactile stimulation and hardly sufficient for an immersive experience [4]. Only limited research has investigated VR haptic technology while taking into account the physical limitations of real environments (REs); examples are redirected walking [5] and redirected touch [6]. To enhance immersive VR experiences in the user’s daily life, we lack haptic design strategies that deal with the physical constraints prevalent in real-world environments. In this review, we discuss previous work on haptic VR with or without physical limitations (e.g., the activity area in an apartment constrained by walls or furniture) and establish a design space of haptic feedback at different spatial scales. We will not use the term "scale" in its strict sense, but rather to denote different spatial categories.
Specifically, we are interested in examining the potential of utilizing haptic feedback from confined but reachable real environments (e.g., inside a car). To this end, we systematically reviewed 20 papers (from an initial corpus of 2284) retrieved from the ACM Digital Library, IEEE Xplore, and Google Scholar. We introduce a classification of seated, standing, and walking RE scales to cluster and characterize this work. In addition, we identified clusters by haptics type, haptic display, RE scale, application scenario, and evaluation and metrics. We explore this niche topic of using confined spaces for haptic sensations in everyday VR because, in the context of our own research, we realized that this field seemed underexplored. Most haptics publications focus on novel technologies, but few have examined the potential of the existing real environment. However, the resulting, relatively small body of relevant work turned out to contain inspiring ideas, as we will highlight later in this paper.
This work aims to offer readers a structured and clear overview of publications on the topic of Haptic VR, i.e., immersive VR experiences supported by haptic technologies. We especially focus on the classification of the various VR haptic technologies and feedback types in association with the different RE scales in which they are used in order to structure the state of the art. The taxonomy aims to foster and guide the future creation of new use cases in everyday Haptic VR by providing a system of essential categories and properties. The driving question behind this review is how to adapt established VR haptic solutions into transferable immersion strategies across real environments of different scales in everyday usage. Finally, we present potential design strategies in a design space and demonstrate its usage with two example scenarios in households and transportation.

2. Definition and Concepts of Haptics in Virtual Environments

The term “haptic” originates from the Greek haptesthai (to touch) and relates to force and tactile sensation [3]. For a basic understanding, we examine concepts and definitions of human haptics [7] and haptic displays [8] in virtual environments.

2.1. Human Haptics

The human haptic system consists of sensory, motor, mechanical, and cognitive components. For the scope of this study, we give a brief overview of tactile sensory and motor features, mainly based on the work by Srinivasan and Basdogan [7]. The sensory system consists of numerous receptors and nerve endings in the skin, muscles, joints, and tendons. Hand-centric tactual sensory information consists of (i) tactile information, referring to the sense of touching an object’s surface, and (ii) kinesthetic or proprioceptive information, i.e., the awareness of the spatial location and motion of the limbs and the associated forces. Accordingly, spatial and temporal force variations in the touched object, such as changes in geometry, shape, and texture, are sensed by the tactile sensors in the hand. Large-scale contours, the detection of which requires the movement of hands and arms, are sensed by kinesthetic sensors. More concretely, tactile information includes the spatial location, texture (smoothness), stiffness (while squeezing an object), and local motion (e.g., vibration) within the contact region of an object or a small-scale hand movement involving fingers and the palm. Kinesthetic information is often associated with large-scale body movement and control, such as the muscle stretch reflex, limb motion, and force manipulation (with, e.g., shoulder muscles).

2.2. Haptic Displays

Research on haptic displays for VR systems started in the late 1960s [9]. Over the last three decades, haptic display technologies for virtual reality evolved from desktop haptics (e.g., the Phantom device) and surface haptics (touchscreens) to wearable haptics (e.g., exoskeletons). Desktop haptics uses a grounded device that provokes tactile and kinesthetic stimuli. For example, the user actively moves the finger, hand, and arm (large-scale) in contact with the stylus of the tool, which is represented as a virtual pointer, pen, or cursor in the computer system. There are six degrees of freedom (DoFs) in the motion and/or force dimensions. In contrast, surface haptics often involves movement of the finger, hand, and wrist (small-scale, without the forearm or shoulder) in contact with a flat touchscreen. There are two DoFs within the planar surface in the motion and/or force dimensions. Nowadays, wearable haptics relies mostly on body-worn devices in a number of locations ranging from the hand (most common) or fingers [10] to the waist [11], feet [12], or a combination of multiple regions for the full body [13]. There are up to 22 DoFs in the motion/force dimensions, which is close to the number of DoFs found in a human hand.

3. Literature Review Methodology

The purpose of this work is to provide a structured and clear overview of how the Haptic VR solutions we found differ across the spatial scales of testing environments (which we refer to as RE scales) with or without physical constraints. We especially focus on established solutions of haptic technology in VR application systems and design strategies for VR usage in confined spaces. Based on the review results, we aim to guide future work on haptics in everyday VR across spatial scales by clarifying working definitions of RE scales and conceptual boundaries of haptic types and displays. In this sense, our review could be considered a scoping review [14].
For this purpose, we conducted a systematic search in the ACM Digital Library, IEEE Xplore, and Google Scholar as a starting point. We initially searched for related work using the original keywords “virtual reality”, “haptic feedback”, “confined space” and related terms such as “virtual environment”, “haptic sensation”, and “limited space”. The systematic process was adapted from PRISMA [15] and is shown in Figure 1. It contains four phases: identification, screening, eligibility, and inclusion.

3.1. Identification and Screening

We defined the following search query to find relevant work: Q = (“haptic feedback” OR “haptic sensation”) OR (“confined space” OR “limited space”) OR (“virtual reality” OR “virtual environment”). It contains the main keyword for the three main concepts we were interested in (haptics, VR, confined space) as well as one frequently used related term for each. The “OR” conjunction means that the initial search also retrieved results that did not contain all concepts in order to also find near misses in our process.
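To make the retrieval logic concrete, the following minimal Python sketch (not the tooling actually used to query the databases) represents the three concept groups and checks which of them a given record matches:

```python
# Minimal sketch: represent the query Q as three concept groups and check
# which groups a record's text matches. The OR semantics above means a
# record is retrieved as soon as any single group matches.

KEYWORD_GROUPS = {
    "haptics": ["haptic feedback", "haptic sensation"],
    "confined space": ["confined space", "limited space"],
    "vr": ["virtual reality", "virtual environment"],
}

def matched_concepts(text: str) -> set:
    """Return the concept groups whose keywords occur in the given text."""
    lower = text.lower()
    return {
        concept
        for concept, phrases in KEYWORD_GROUPS.items()
        if any(phrase in lower for phrase in phrases)
    }

def matches_query(text: str) -> bool:
    """OR conjunction: the record is retrieved if any concept group matches."""
    return bool(matched_concepts(text))

# Example: this abstract matches the "haptics" and "vr" groups only.
print(matched_concepts("We present haptic feedback for virtual reality games."))
```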
Using this term, we searched the ACM Digital Library (https://0-dl-acm-org.brum.beds.ac.uk/) and IEEE Xplore (https://0-ieeexplore-ieee-org.brum.beds.ac.uk/Xplore/home.jsp), as well as Google Scholar (https://0-scholar-google-com.brum.beds.ac.uk/), at data collection time (January 2021). We retrieved 2284 papers, of which 738 were duplicates or unavailable. The remaining 1546 papers were evaluated with respect to form and content criteria. Regarding the form, papers were required to be written in English and to have passed a peer-review process, which includes peer-reviewed journals, conference papers, workshops, etc. In terms of content, papers were retained if they explicitly explained a haptic feedback solution for VR interaction, VR interaction in a confined space, haptic feedback for a confined space, or a combination of these. We thus removed 1496 irrelevant papers, leaving 50 papers to review more closely.

3.2. Eligibility and Inclusion

To further identify the truly eligible papers, we formulated a rating criterion based on the occurrence and usage of keywords by the authors in the abstract. The abstract usually summarizes the core of a study: the occurrence of keywords in the abstract shows that the work deals with these topics, while the way the authors used these words indicates the potential relevance of the work to our study scope. We rated the remaining 50 papers on a scale from 1 to 5, with 1 being the least relevant and 5 being the most relevant (see Table 1). The complete rating definitions, from a score of 1 to 5, are as follows (a code sketch of this scoring follows below):
  1. No keyword in the abstract (minimum of one keyword in the full text);
  2. One keyword in the abstract, and one of the other keywords appears in the text or the presented solution is applicable to one or both of the other keywords;
  3. Two keywords in the abstract;
  4. Two keywords in the abstract, and the third keyword appears in the text or the presented solution is applicable to the third keyword;
  5. All three keywords in the abstract.
By manually rating the applicability of the presented solution to a specific concept, we used a semantic search instead of just relying on syntactic search criteria.
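As an illustration of this rating scheme, the following minimal Python sketch encodes the five definitions above; the flag applicable_to_missing stands in for the manual judgement of whether the presented solution is applicable to the concept(s) not mentioned in the abstract:

```python
# Illustrative sketch of the 1-5 relevance rating described above. Inputs are
# the sets of concepts ("haptics", "confined space", "vr") found in the
# abstract and in the full text; applicable_to_missing models the manual,
# semantic judgement of applicability to the missing concept(s).

ALL_CONCEPTS = {"haptics", "confined space", "vr"}

def relevance_score(in_abstract: set, in_full_text: set,
                    applicable_to_missing: bool = False) -> int:
    missing = ALL_CONCEPTS - in_abstract
    missing_covered = bool(missing & in_full_text) or applicable_to_missing

    if len(in_abstract) == 3:
        return 5                               # all three keywords in the abstract
    if len(in_abstract) == 2:
        return 4 if missing_covered else 3
    if len(in_abstract) == 1:
        return 2 if missing_covered else 1     # conservative fallback
    return 1                                   # no keyword in the abstract

# Example: two concepts in the abstract, the third only in the full text -> 4.
print(relevance_score({"haptics", "vr"}, {"haptics", "vr", "confined space"}))
```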
Based on the rating results, we then included the most relevant papers with a score of 4 or 5 in the final full-text review. This led to the inclusion of 20 eligible papers, while the other 30 papers were excluded. Next, we classified the 20 included papers in a spreadsheet (https://docs.google.com/spreadsheets/d/14kjK5GWdDk0f7Cp7MtR2XnRAKhikyLsjBsAvUMu5slg). To understand the state of the art of VR haptic solutions with or without physical constraints, we used the following criteria: (a) haptics type, describing the haptic sensation design in the VR system, such as tactile, vibrotactile, kinesthetic, and force feedback; (b) haptic display, implemented as a haptic interface between the user and the virtual environment, such as wearable, grounded, handheld, and autonomous devices as well as physical proxies; (c) RE scale, identified in the full text, figures, or video demos, which differentiates the testing environment with regard to its physical size, namely seated, standing, and walking scale; (d) application scenario, implying the goal of the VR system, for example, gaming, media consumption, education, simulation, and accessibility; (e) evaluation and metrics, summarizing how prior haptic solutions were evaluated, for example, by measuring performance, realism, presence, enjoyment, accuracy, comfort, workload, etc.
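To illustrate the classification, the following sketch shows one possible record structure for such a spreadsheet (not its actual schema), filled with the values of the Wireality row from Table 1:

```python
# Hypothetical record structure for the classification of one reviewed paper,
# following criteria (a)-(e) above; example values taken from Table 1.

from dataclasses import dataclass

@dataclass
class ReviewedPaper:
    system: str
    reference: int        # reference number in this article
    year: int
    haptics_type: list    # (a) tactile, vibrotactile, kinesthetic, force
    haptic_display: list  # (b) wearable, grounded, physical proxy, handheld, autonomous
    re_scale: str         # (c) seated, standing, or walking
    application: str      # (d) gaming, media consumption, education, simulation, accessibility, demo
    evaluation: list      # (e) metrics used in the original study

wireality = ReviewedPaper(
    system="Wireality", reference=34, year=2020,
    haptics_type=["force"], haptic_display=["wearable"],
    re_scale="seated", application="demo",
    evaluation=["weight", "field of reach", "spatial consistency",
                "realism", "comfort", "freedom of movement"],
)
print(wireality.re_scale, wireality.haptics_type)
```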

3.3. Limitations of the Study Regarding Scope and Detail

Before discussing and interpreting the results, we would first like to briefly reflect on some limitations of our review. Due to the specific interest in Haptic VR with or without physical constraints, the study scope is narrow. This is reflected in the large initial body of potentially relevant papers (2284), of which only 20 were included in the final selection. This small selection cannot fully represent the wide field of general work on haptic sensation in virtual environments. However, the 20 papers suffice to provide insights into transferable haptic immersion using scalable haptic solutions for everyday VR. We also deliberately left out technical details, such as operating systems, mechanical engineering, or physiological principles, in favor of a clear and structured overview of succinct length with insightful themes.

4. Results of the Literature Review

Below, we present the results of our systematic selection process and analysis according to the criteria above. We first describe the identified Haptics Types and Haptic Displays to capture the state of the art of established VR haptic solutions with or without physical constraints. We then examine how these solutions differ across RE Scales and Application Scenarios to provide the basis for the further discussion of transferable haptic immersion from testing environments to everyday confined spaces such as households and transportation. Finally, we complete the results with an overview of Evaluation and Metrics. An overview of the entire classification is given in Table 1, and results regarding specific aspects are discussed below.

4.1. Haptics Type and Haptic Display

In line with the evolution of haptic displays in virtual environments, Wearable solutions were the most prevalent in our review. In total, we identified ten Haptic VR studies deploying various body-worn garments, including four arm- [17,19,24,31], three hand- [26,29,34], one wrist- [32], one waist- [28], and one torso-worn [16] solution. This indicates that the majority of wearable solutions focus on the upper human body. The choice of the exact position depends on the study motivation and system purpose, i.e., what kind of haptic feedback is pursued. Most arm-centric wearable devices aim to offer force and/or kinesthetic feedback, depending on the emphasis of the study.
To enable active contact with walls and heavy objects, Lopes et al. [24] investigated a repulsion design (where users are pushed away from the object) and a soft design (where hands can penetrate virtual objects) involving muscles of the shoulder, arm, and wrist. ElectroCutscenes [31], in contrast, supports involuntary movements of passive users, such as raising the arms or forearms, rotating the forearm, and finger flexion/extension and grasping. The former study emphasized force feedback triggered by active users, while the latter focused on kinesthetic feedback triggered by the system acting on the body movement of passive users. The other arm-centric systems used either both force and kinesthetic feedback or force feedback only. FlexTorque and FlexTensor [19] use an arm-worn exoskeleton that exerts forces on the human arm and causes arm motion. Similarly, Kruijff et al. [17] adopted neuromuscular electrical stimulation (NMES) to achieve pseudo-haptic feedback, especially in force-related events.
In contrast, hand-centric wearable devices cover a combination of force and tactile feedback or force feedback only. The combination can be, for example, palm-centric: Delta Touch [29] relies on a mechanical, palm-worn moving platform with a vibration motor as an effector, while PuPop [26] uses a physical proxy approach via a pneumatic shape-changing interface, much like a palm-worn airbag. On the opposite side of the palm, Wireality [34] is designed to provide force feedback in case of collisions via on-hand tethering strings connected to a shoulder-mounted mechanical system. Between the hand and arm, one specific study focused on a wrist-worn solution: Tasbi [32] is a fabricated bracelet that realizes vibrotactile feedback and squeeze force feedback for virtual hand-based interactions in VR.
When moving away from the hand and arm, there are fewer tactile solutions in torso-worn or waist-worn studies, as well as fewer combinations of multiple forms of haptic feedback. The TactaVest [16] introduced a customized vest to induce vibration only on the upper body. The HapticSerpent [28] is a waist-worn, serpent-shaped robot arm that can apply forces such as push, pull, hit, scratch, and pinch on various body parts. Overall, most of the wearable solutions were designed for force feedback (including squeezing, n = 8) [17,19,24,26,28,29,32,34], followed by tactile (including vibrotactile, n = 4) [16,26,29,32] and kinesthetic feedback (n = 2) [19,31]. Combinations of force and tactile [26,29,32] or force and kinesthetic [19] feedback were shown to be feasible in prior wearable solutions.
Another predominant Haptic VR approach is to use a Grounded Device (n = 6), ranging from a stationary object [21] and a motorized surrounding platform [35] to steerable devices [18,27] and ultrasonic devices [23,25]. Among them, two hybrid solutions were combined either with a Physical Proxy [35] or with a Handheld Device [27]. All grounded solutions offer tactile feedback to VR users; however, the targeted body parts vary with the study goal and concept. Starting from the most sensitive tactile receptors on the human hand, ultrasonic technologies [23,25] demonstrated the potential of contact-free tactile sensation via mid-air interfaces. Additionally, Ochiai et al. [23] utilized a cross-field aerial haptic display based on the superposition of femtosecond-laser light and ultrasonic acoustic fields. Rather than focusing solely on non-contact tactile sensations, HapticAround [27] combines touchless tactile sensation from a grounded, steerable haptic device above the user’s head with a handheld device that generates contact tactile sensations. Such a hybrid approach increases scalability not only from contact to non-contact but also from partial to full body and from static to portable scenarios containing multiple tactile sensations such as wind, temperature, and humidity.
Similarly, another combination of a Grounded Device and a Physical Proxy is a motorized turntable [35], which allows designers or users to replace the physical proxies mounted on this rotating platform with any kind of prop based on the haptics type they want to encounter. The corresponding arm movement of reaching out to touch or manipulate the surrounding physical objects (e.g., a fishing rod) creates the resulting tactile and kinesthetic feedback. Taken together, such a combination of approaches can succeed both in depth (multiple forms of a single modality) and breadth (multiple haptic modalities) but requires a large physical space in the test environment or real-world scenario, as will be discussed in the next section.
In contrast to the bulky laboratory haptic displays, an everyday Grounded Device such as a steerable scooter [18] or a normal chair [21] is easily accessible and can provide certain haptic feedback, such as vibrotactile and/or force feedback. A chair [21] with vibration actuators on its surface and bottom is well suited to a passive user experience in VR. In contrast, a steerable scooter [18] used as a navigation input facilitates exploration and increases realism for active users in virtual environments.
Beyond the aforementioned “add-on” solutions in Haptic VR, we found approaches taking advantage of what already exists and what the user is familiar with. Such “built-in” solutions fall into two closely related categories: Physical Proxy (n = 3) and Handheld Device (n = 1). To create haptic feedback in VR, the former approach pursues a natural, intuitive haptic interface from surrounding, everyday objects that offer affordances for humans to touch and manipulate, such as a small ball [20], airbags on the floor [30], or a household cleaning robot [33]. Moreover, carry-on proxies on a consumer-grade Autonomous Device such as a cleaning robot [33] demonstrate a promising haptic design approach of module-based customization to enhance immersion in everyday VR. The latter approach simply uses the well-accepted VR hand controller as an existing haptic source: for higher-fidelity rendering beyond vibrotactile feedback, NormalTouch and TextureTouch [22] use a mechanically actuated platform that tilts so users can feel an object’s shape and extrudes so they can recognize its texture.

4.2. Real Environment Scales and Application Scenarios

To understand how the authors used the real environment and took its spatial scale into account, we examined the testing environment described in the full text, figures, or video demos and then classified the papers according to RE scale. Overall, Haptic VR solutions were distributed fairly evenly across the Standing (n = 9) [18,19,22,23,25,28,31,32,35], Seated (n = 6) [17,20,21,26,29,34], and Walking scales (n = 5) [16,24,27,30,33]. In contrast, the application scenarios are skewed towards Gaming (n = 10) [17,19,24,25,26,27,30,31,33,35], followed by Simulation (n = 2) [16,27], Media Consumption (n = 1) [21], and Accessibility (n = 1) [23]. The remaining seven papers on Haptic VR lack a specific application scenario, which we categorized as Demo [18,20,22,28,29,32,34].
The identified Walking scales range from 4.5 × 4.5 × 3 m [24] over 2 × 2 m [27] to 1.86 × 2.22 m [30], with more recent publications requiring less space. The rapid development of VR tracking systems largely facilitates the environment set-up, e.g., from outside-in (external tracking sensors) to inside-out (built-in cameras on the headset). The walkable VR scenarios consistently aim to build a realistic Gaming experience in which the user can explore diverse virtual environments by stepping, sitting, leaning, pushing, and touching, such as a Jurassic island escape [30] or dog walking [33]. In addition, a walking VR experience can enhance Education as well as the Simulation of real-world events, e.g., learning about the working process of a blacksmith [27], experiencing a battle [16], or simulating weather [27].
When scaling down to Standing and Seated spaces, there is an increasing number of Demo applications without specialized scenarios, which rather focus on technical contributions. Still, in some cases, the RE scale can be derived from the choice of the application scenario: for example, a Seated body posture is well suited to the relaxed or passive user state found in Media Consumption [21]. Thinking about haptic media, the passive user can consume or author VR haptic content from a vibrating chair with a low information capacity [21]. In contrast, a Standing posture supports the user’s awareness of content with a high information capacity, such as an aerial Braille alphabet, via sophisticated cross-field aerial haptics [23].

4.3. Evaluation of System Performance and User Perception

After implementing a haptic feedback application for VR interaction in a constrained space, it is necessary to validate whether the feedback is suitable for interaction in that space and provides a better experience for the user. Here, we describe how the utility of a haptic system for VR interaction is measured and verified, and whether it improves the user’s perception and experience. Before conducting a study to investigate the influence of haptic feedback, researchers carefully set up a suitable experiment, usually in a between-subjects design. The control group is then equipped with a different haptics type, lower-fidelity feedback, or no haptic stimuli at all.
Depending on the research question, the experimenter then mostly asks specific questions to capture the dominant influence of the haptic feedback. An example of such a question is: “How realistic did the object feel?” [34]. The majority of evaluations use Likert scales, and the questionnaire items mostly relate to enjoyment and realism. Furthermore, participants were asked to specify the reasons for their choices and to talk freely about their overall perception of the haptic sensation they experienced. A further way of evaluating a solution, in some works even the only measurement method, was to give participants a task for which a performance metric could be specified, for example, traversing an obstacle course [18]. They then performed this task in two experimental set-ups, one with and one without haptic feedback, and performance was compared, thereby measuring the impact of the haptic feedback device. User evaluations were also conducted using structured or semi-structured interviews. In some publications, a qualitative evaluation of the hardware was conducted; example criteria for a wearable solution include weight and power consumption [34].
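As a purely illustrative sketch (with made-up numbers rather than data from any reviewed study, and assuming SciPy is available), such a comparison of a performance metric and of ordinal Likert ratings between a haptic and a control group could look as follows:

```python
# Illustrative analysis sketch for a between-subjects comparison: a continuous
# performance metric (e.g., completion time of an obstacle course) and ordinal
# Likert ratings (e.g., perceived realism). All numbers are hypothetical.

from scipy import stats

completion_time_haptic = [41.2, 38.5, 44.0, 40.1, 39.7]    # seconds (hypothetical)
completion_time_control = [47.9, 45.3, 50.2, 44.8, 48.6]   # seconds (hypothetical)

# Welch's t-test for the performance metric (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(completion_time_haptic,
                                  completion_time_control, equal_var=False)
print(f"performance: t = {t_stat:.2f}, p = {p_value:.3f}")

# Ordinal ratings ("How realistic did the object feel?", 1-7 Likert scale)
# are better compared with a non-parametric test such as Mann-Whitney U.
realism_haptic = [6, 7, 6, 5, 6]
realism_control = [3, 4, 4, 2, 3]
u_stat, p_value = stats.mannwhitneyu(realism_haptic, realism_control,
                                     alternative="two-sided")
print(f"realism: U = {u_stat:.1f}, p = {p_value:.3f}")
```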
One purpose of haptic feedback in VR interaction is to enhance the user’s feeling of presence. To this end, established questionnaires are often used. Schwind et al. [36] present 15 questionnaires that are claimed to measure presence; the most frequently used ones are the Igroup Presence Questionnaire (IPQ), the Slater-Usoh-Steed questionnaire (SUS), and the Witmer and Singer questionnaire (WS) [36]. Another way to measure a participant’s presence is to observe their behavior: it is a sign of presence if the participant’s behavior in the application is analogous to their behavior in the corresponding real-world situation [37]. Physiological measurements, such as brain activity [38,39,40], electrodermal activity (EDA) and heart rate [41,42], and skin conductance and temperature [42], provide objective tools for measuring presence.

5. A Design Space for Everyday Haptic VR

Overall, there is limited research on haptic feedback for VR interaction tailored to everyday spaces with physical constraints. The most prominent solution identified in this review is the redirected touch technique by Carvalheiro et al. [20]. However, most of the established Haptic VR solutions could be applicable to confined spaces. Below, we evaluate the solutions and discuss to what extent they are applicable in limited spaces.

5.1. A Spatial Design Space

To envision a design space for Haptic VR in everyday confined spaces, we revisit the two criteria RE Scale and Haptic Display. As a design fiction [43], we illustrate all 20 studies in the largest real environment size we identified, i.e., 4.5 × 4.5 × 3 m [24] (see Figure 2). However, the following discussion will also include the other two relevant classifications, Haptics Type and Application Scenario, which are both related to the resulting motion and position of the VR user under physical constraints.
Starting from the most confined Seated scale, prior work presents promising Haptic VR solutions using a Wearable on the user’s hand [26,29,34] or arm [17] and/or a Physical Proxy worn on the palm [26] or taken from the surrounding environment within reach of the user’s arm [20], as well as a specialized focus on the most natural Grounded Device in a seated position: the chair. The restricted body movement also restrains the range of possible haptic sensations mainly to tactile and force feedback. The only kinesthetic feedback we found at this spatial scale was user redirection [20] or, more concretely, redirected touch, which evolved together with redirected walking [5]. However, such a technique is questionable in the long run: we speculate that the seated VR user might become exhausted and stop reaching out, or even be tempted to stand up or walk around to explore virtual environments outside of the tracking volume. The seated position is, by nature, an everyday posture normally associated with rest. Exploiting this fact, VR viewing with vibrotactile feedback conveys limited but sufficient information for media consumption without interrupting the user’s experience when watching immersive movies [21]. Nevertheless, kinesthetic feedback from external motion platforms is feasible in a seated VR experience, as we will see in the mobile VR example scenario below.
When expanding the available space to the Standing and further to the Walking scale, the chances to induce kinesthetic sensations increase. This is achieved via diverse haptic displays, including arm-worn EMS [31] (raising the arms sideways), a specialized human-height turntable [35] (raising the arms forward), and pneumatically actuated floor tiles as proxy objects [30] (sitting, stepping, leaning, or lying on obstacles). The additional space also offers more room for multiple sensations (i.e., multi-sensory experiences) such as wind, temperature, and humidity [18,27]. Similarly, prior studies succeeded in inducing multi-form sensations such as multi-resolution haptic images rendered by cross-fields of light and sound [23] and multiple force forms (tension, resistance, impact) from an autonomous consumer-grade robot [33].
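The following compact sketch summarizes this spatial design space as a simple lookup from RE scale to the haptic displays and haptics types suggested by the reviewed work; it condenses the discussion above and is not meant as an exhaustive or prescriptive mapping:

```python
# A condensed, non-exhaustive view of the spatial design space: for each RE
# scale, the haptic displays and haptics types that the reviewed work suggests
# are feasible under the corresponding physical constraints.

DESIGN_SPACE = {
    "seated": {
        "displays": ["wearable (hand/arm)", "physical proxy (palm, nearby objects)",
                     "grounded (chair)"],
        "haptics": ["tactile", "force", "kinesthetic (via redirected touch)"],
    },
    "standing": {
        "displays": ["wearable (EMS, wrist)", "grounded (turntable, ultrasonic)",
                     "handheld"],
        "haptics": ["tactile", "force", "kinesthetic", "multi-form (light/sound)"],
    },
    "walking": {
        "displays": ["physical proxy (floor tiles, furniture)", "autonomous device",
                     "grounded + handheld hybrid", "wearable"],
        "haptics": ["tactile", "force", "kinesthetic",
                    "multi-sensory (wind, temperature, humidity)"],
    },
}

def suggest(re_scale: str) -> dict:
    """Look up candidate displays and haptics types for a given RE scale."""
    return DESIGN_SPACE[re_scale.lower()]

print(suggest("seated")["displays"])
```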

5.2. Usage of the Design Space

To demonstrate how to use our spatial design space for Haptic VR in everyday settings, we present the following two example scenarios of domestic and mobile VR.

5.2.1. Domestic VR—Household VR Gym

In an everyday household environment, the VR user can already put on a personal consumer-grade headset and play and sweat in embodied games. When playing such rhythm games in an available standing-to-walking space at home, users could immerse themselves beyond seeing and hearing by touching a prefabricated playground. Inspired by the solutions of (carry-on) physical proxies [20,30] and autonomous devices [33], we envision a Household VR Gym application in which users select surrounding objects and label them before the exercise game, e.g., a sofa as passive haptics and a cleaning robot as active haptics, similar to today’s guardian boundary set-up. After this haptics set-up, the system can incorporate the surfaces, shapes, and motion of these physical proxies into the rhythm game. For example, the perimeter of the sofa is mapped to an exercise route in the virtual environment, while the robot (with a carry-on box) represents an avatar coach whom the user can follow.
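A minimal sketch of this haptics set-up step could look as follows; all object names, labels, and virtual roles here are illustrative assumptions rather than an existing API:

```python
# Hypothetical set-up step for the envisioned Household VR Gym: the user labels
# tracked household objects as passive or active haptic proxies, and the game
# maps each proxy to a virtual role.

from dataclasses import dataclass

@dataclass
class HapticProxy:
    name: str
    kind: str           # "passive" (static furniture) or "active" (mobile robot)
    footprint_m: tuple  # rough footprint (width, depth) in meters
    virtual_role: str   # how the rhythm game uses the proxy

proxies = [
    HapticProxy("sofa", "passive", (2.0, 0.9),
                "exercise route mapped along its perimeter"),
    HapticProxy("cleaning robot with carry-on box", "active", (0.35, 0.35),
                "avatar coach the player follows"),
]

def build_scene(labelled_proxies: list) -> None:
    """Report how each labelled proxy is woven into the virtual scene."""
    for proxy in labelled_proxies:
        print(f"{proxy.name} ({proxy.kind} haptics) -> {proxy.virtual_role}")

build_scene(proxies)
```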

5.2.2. Mobile VR—Passenger VR Relaxation

In an everyday transit context, a rear-seat passenger can already experience Holoride [44] in a VR headset and play a space shooting game that is synchronized with the real motion of the car. In addition to gaming, travel time can be used for well-being, given the passive user state [21] in the seat. We envision a Passenger VR Relaxation application in which haptics is designed in accordance with the visual sensation, aiming for a low-arousal relaxation experience while traveling [45]. At the same time, visual motion cues integrated into the virtual environment can diminish motion sickness by synchronizing with the real-time vehicle movements [46]. Grounded devices inside modern cars (e.g., the AC system) and environmental stimuli outside the car (e.g., a breeze) support multi-form tactile sensations (temperature, wind). The hybrid haptic sensation from the real world and the simulated virtual environment might create a novel embodied immersive experience.
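The following sketch illustrates one possible mapping from real-time vehicle signals to visual and tactile cues in such a relaxation experience; the signal names and the mapping itself are illustrative assumptions, not an actual vehicle or headset interface:

```python
# Hypothetical mapping for the envisioned Passenger VR Relaxation scenario:
# vehicle state drives synchronized visual motion cues (to reduce sickness)
# and reuses the car's AC as a grounded tactile display (wind, temperature).

def map_vehicle_state(speed_kmh: float, yaw_rate_deg_s: float,
                      ac_airflow: float, ac_temperature_c: float) -> dict:
    """Derive low-arousal scene parameters from the current vehicle state."""
    return {
        "scene_drift_speed_m_s": speed_kmh / 3.6,   # match real forward motion
        "scene_turn_rate_deg_s": yaw_rate_deg_s,    # match real turning
        "breeze_strength": min(ac_airflow, 0.6),    # cap airflow for relaxation
        "ambient_warmth_c": ac_temperature_c,       # reuse AC temperature
    }

print(map_vehicle_state(speed_kmh=50.0, yaw_rate_deg_s=2.0,
                        ac_airflow=0.4, ac_temperature_c=22.0))
```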

5.3. Proposed Design Strategies

After establishing the design space and using it to structure the work we have found, we would like to cautiously venture beyond the state of the art. For scalable haptic sensations in an everyday confined space, we propose the following design strategies.

5.3.1. Single RE Scale

We first consider a single RE scale, in which the user is constantly seated, standing, or walking (with physical limitations) during the VR experience. The fixed spatial scale makes it easier to use a prefabricated environment set-up. Thus, we propose a design strategy using an affordable grounded device and/or physical proxy, such as a chair, a cleaning robot, or a fan. For example, deploying physical household objects could expand the range of haptic feedback from the hand to the full body, using everyday proxies such as grounded furniture like a chair [21] or proxies carried by a reconfigured cleaning robot [33]. We call for future work on do-it-yourself solutions (just like Google Cardboard for HMDs) for accessible haptics in everyday VR within a single RE scale.

5.3.2. Hybrid RE Scales

Hybrid RE scales mean that the user presumably moves across various spatial scales in a single VR experience. In the face of such dynamic, everyday user behavior, we believe in the potential of deploying portable haptic solutions, such as affordable wearable, handheld, and autonomous devices, for ubiquitous haptic feedback in everyday VR experiences. An example of combining wearable and handheld devices is the haptic PIVOT [47], a lightweight wrist-worn pivoting mechanism that rotates a handle into and out of the user’s palm to provide a realistic haptic proxy. The flexibility of such haptic solutions allows designers to create dynamic variants of each haptic display, even in everyday VR across different spatial scales.

6. Summary

In summary, we systematically retrieved and analyzed work on haptics in VR with a special focus on confined spaces. We derived a design space and used it to describe two example scenarios. Based on the insights from our analysis, we concluded by proposing concrete design strategies. With this work, we hope to guide and inspire future work on VR in everyday environments across different scales that utilizes the properties of environments such as households and various modes of transportation.

Author Contributions

Conceptualization, J.L.; Data curation, A.M.; Formal analysis, J.L. and A.M.; Investigation, J.L. and A.M.; Methodology, J.L. and A.M.; Supervision, A.B.; Validation, J.L.; Visualization, J.L.; Writing—original draft, J.L., A.M. and A.B.; Writing—review & editing, J.L. and A.B. All authors have read and agreed to the published version of the manuscript.

Funding

Jingyi Li’s contributions were funded by the China Scholarship Council (CSC), grant number 201908080094.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sensorama Simulator (1962). Available online: http://www.freepatentsonline.com/3050870.html (accessed on 30 April 2021).
  2. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the Fall Joint Computer Conference, Part I, San Francisco, CA, USA, 9–11 December 1968; pp. 757–764. [Google Scholar]
  3. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Whitmire, E.; Benko, H.; Holz, C.; Ofek, E.; Sinclair, M. Haptic Revolver: Touch, Shear, Texture, and Shape Rendering on a Reconfigurable Virtual Reality Controller. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21–26 April 2018; pp. 1–12. [Google Scholar] [CrossRef]
  5. Nilsson, N.C.; Serafin, S.; Steinicke, F.; Nordahl, R. Natural Walking in Virtual Reality: A Review. Comput. Entertain. 2018, 16, 1–22. [Google Scholar] [CrossRef]
  6. Kohli, L.; Whitton, M.C.; Brooks, F.P. Redirected touching: The effect of warping space on task performance. In Proceedings of the 2012 IEEE Symposium on 3D User Interfaces (3DUI), Costa Mesa, CA, USA, 4–5 March 2012; pp. 105–112. [Google Scholar] [CrossRef]
  7. Srinivasan, M.A.; Basdogan, C. Haptics in virtual environments: Taxonomy, research status, and challenges. Comput. Graph. 1997, 21, 393–404. [Google Scholar] [CrossRef]
  8. Dangxiao, W.; Yuan, G.; Shiyi, L.; Zhang, Y.; Weiliang, X.; Jing, X. Haptic display for virtual reality: Progress and challenges. Virtual Real. Intell. Hardw. 2019, 1, 136–162. [Google Scholar] [CrossRef] [Green Version]
  9. Brooks, F.P., Jr.; Ouh-Young, M.; Batter, J.J.; Jerome Kilpatrick, P. Project GROPE—Haptic displays for scientific visualization. In Proceedings of the 17th Annual Conference on Computer Graphics and Interactive Techniques—SIGGRAPH ‘90, Dallas, TX, USA, 6–10 August 1990; ACM Press: New York, NY, USA, 1990. [Google Scholar] [CrossRef]
  10. Gonzalez, E.J.; Follmer, S. Investigating the detection of bimanual haptic retargeting in virtual reality. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, Parramatta, Australia, 12–15 November 2019; pp. 1–5. [Google Scholar]
  11. Cheng, C.H.; Chang, C.C.; Chen, Y.H.; Lin, Y.L.; Huang, J.Y.; Han, P.H.; Ko, J.C.; Lee, L.C. GravityCup: A liquid-based haptics for simulating dynamic weight in virtual reality. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, Tokyo, Japan, 28 November–1 December 2018; pp. 1–2. [Google Scholar]
  12. Wolf, D.; Rogers, K.; Kunder, C.; Rukzio, E. Jumpvr: Jump-based locomotion augmentation for virtual reality. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–12. [Google Scholar]
  13. Günther, S.; Makhija, M.; Müller, F.; Schön, D.; Mühlhäuser, M.; Funk, M. PneumAct: Pneumatic kinesthetic actuation of body joints in virtual reality environments. In Proceedings of the 2019 on Designing Interactive Systems Conference, San Diego, CA, USA, 23–28 June 2019; pp. 227–240. [Google Scholar]
  14. Peters, M.D.J.; Godfrey, C.M.; Khalil, H.; McInerney, P.; Parker, D.; Soares, C.B. Guidance for conducting systematic scoping reviews. Int. J. Evid. Based Healthc. 2015, 13, 141–146. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.A.; Clarke, M.; Devereaux, P.J.; Kleijnen, J.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. J. Clin. Epidemiol. 2009, 62, e1–e34. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Lindeman, R.W.; Page, R.; Yanagida, Y.; Sibert, J.L. Towards full-body haptic feedback. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology—VRST ’04, Hong Kong, China, 10–12 November 2004; Lau, R.W.H., Baciu, G., Eds.; ACM Press: New York, NY, USA, 2004; p. 146. [Google Scholar] [CrossRef]
  17. Kruijff, E.; Schmalstieg, D.; Beckhaus, S. Using neuromuscular electrical stimulation for pseudo-haptic feedback. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Limassol, Cyprus, 1–3 November 2006; pp. 316–319. [Google Scholar]
  18. Deligiannidis, L.; Jacob, R.J. The vr scooter: Wind and tactile feedback improve user performance. In Proceedings of the 3D User Interfaces (3DUI’06), Alexandria, VA, USA, 25–26 March 2006; pp. 143–150. [Google Scholar]
  19. Tsetserukou, D. FlexTorque, FlexTensor, and HapticEye: Exoskeleton haptic interfaces for augmented interaction. In Proceedings of the 2nd Augmented Human International Conference, Tokyo, Japan, 13 March 2011; pp. 1–2. [Google Scholar]
  20. Carvalheiro, C.; Nóbrega, R.; da Silva, H.; Rodrigues, R. User Redirection and Direct Haptics in Virtual Environments. In Proceedings of the 2016 ACM on Multimedia Conference—MM ’16, Amsterdam, The Netherlands, 15–19 October 2016; Hanjalic, A., Snoek, C., Worring, M., Bulterman, D., Huet, B., Kelliher, A., Kompatsiaris, Y., Li, J., Eds.; ACM Press: New York, NY, USA, 2016; pp. 1146–1155. [Google Scholar] [CrossRef]
  21. Israr, A.; Schwemler, Z.; Mars, J.; Krainer, B. VR360HD: A VR360° player with enhanced haptic feedback. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany, 2–4 November 2016; pp. 183–186. [Google Scholar]
  22. Benko, H.; Holz, C.; Sinclair, M.; Ofek, E. Normaltouch and texturetouch: High-fidelity 3d haptic shape rendering on handheld virtual reality controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; pp. 717–728. [Google Scholar]
  23. Ochiai, Y.; Kumagai, K.; Hoshi, T.; Hasegawa, S.; Hayasaki, Y. Cross-field aerial haptics: Rendering haptic feedback in air with light and acoustic fields. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 3238–3247. [Google Scholar]
  24. Lopes, P.; You, S.; Cheng, L.P.; Marwecki, S.; Baudisch, P. Providing haptics to walls & heavy objects in virtual reality by means of electrical muscle stimulation. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1471–1482. [Google Scholar]
  25. Georgiou, O.; Jeffrey, C.; Chen, Z.; Tong, B.X.; Chan, S.H.; Yang, B.; Harwood, A.; Carter, T. Touchless haptic feedback for VR rhythm games. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 553–554. [Google Scholar]
  26. Teng, S.Y.; Kuo, T.S.; Wang, C.; Chiang, C.H.; Huang, D.Y.; Chan, L.; Chen, B.Y. Pupop: Pop-up prop on palm for virtual reality. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14 October 2018; pp. 5–17. [Google Scholar]
  27. Han, P.H.; Chen, Y.S.; Lee, K.C.; Wang, H.C.; Hsieh, C.E.; Hsiao, J.C.; Chou, C.H.; Hung, Y.P. Haptic around: Multiple tactile sensations for immersive environment and interaction in virtual reality. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, Tokyo, Japan, 28 November–1 December 2018; pp. 1–10. [Google Scholar]
  28. Al-Sada, M.; Jiang, K.; Ranade, S.; Piao, X.; Höglund, T.; Nakajima, T. HapticSerpent: A wearable haptic feedback robot for VR. In Proceedings of the Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–6. [Google Scholar]
  29. Trinitatova, D.; Tsetserukou, D. DeltaTouch: A 3D Haptic Display for Delivering Multimodal Tactile Stimuli at the Palm. In Proceedings of the 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, 9–12 July 2019; pp. 73–78. [Google Scholar]
  30. Teng, S.Y.; Lin, C.L.; Chiang, C.H.; Kuo, T.S.; Chan, L.; Huang, D.Y.; Chen, B.Y. TilePoP: Tile-type pop-up prop for virtual reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, LA, USA, 20–23 October 2019; pp. 639–649. [Google Scholar]
  31. Khamis, M.; Schuster, N.; George, C.; Pfeiffer, M. ElectroCutscenes: Realistic Haptic Feedback in Cutscenes of Virtual Reality Games Using Electric Muscle Stimulation. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, Parramatta, Australia, 12–15 November 2019; pp. 1–10. [Google Scholar]
  32. Pezent, E.; O’Malley, M.K.; Israr, A.; Samad, M.; Robinson, S.; Agarwal, P.; Benko, H.; Colonnese, N. Explorations of Wrist Haptic Feedback for AR/VR Interactions with Tasbi. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–4. [Google Scholar]
  33. Wang, Y.; Chen, Z.; Li, H.; Cao, Z.; Luo, H.; Zhang, T.; Ou, K.; Raiti, J.; Yu, C.; Patel, S.; et al. MoveVR: Enabling Multiform Force Feedback in Virtual Reality using Household Cleaning Robot. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  34. Fang, C.; Zhang, Y.; Dworman, M.; Harrison, C. Wireality: Enabling complex tangible geometries in virtual reality with worn multi-string haptics. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  35. Huang, H.Y.; Ning, C.W.; Wang, P.Y.; Cheng, J.H.; Cheng, L.P. Haptic-Go-Round: A surrounding platform for encounter-type haptics in virtual reality experiences. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  36. Schwind, V.; Knierim, P.; Haas, N.; Henze, N. Using presence questionnaires in virtual reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019. [Google Scholar] [CrossRef]
  37. Slater, M. A Note on Presence Terminology. Presence Connect. 2003, 3, 1–5. [Google Scholar]
  38. Baumgartner, T.; Valko, L.; Esslen, M.; Jäncke, L. Neural correlate of spatial presence in an arousing and noninteractive virtual reality: An EEG and psychophysiology study. Cyberpsychol. Behav. 2006, 9, 30–45. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. Baumgartner, T.; Speck, D.; Wettstein, D.; Masnari, O.; Beeli, G.; Jäncke, L. Feeling present in arousing virtual reality worlds: Prefrontal brain regions differentially orchestrate presence experience in adults and children. Front. Hum. Neurosci. 2008, 2, 8. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Tjon, D.M.; Tinga, A.M.; Alimardani, M.; Louwerse, M.M. Brain activity reflects sense of presence in 360 video for virtual reality. In Proceedings of the 28th International Conference on Information Systems Development, Toulon, France, 28–30 August 2019. [Google Scholar]
  41. Wiederhold, B.K.; Jang, D.P.; Kaneda, M.; Cabral, I.; Lurie, Y.; May, T.; Kim, I.; Wiederhold, M.D.; Kim, S. An investigation into physiological responses in virtual environments: An objective measurement of presence. Towards Cyberpsychol. Mind Cogn. Soc. Internet Age 2001, 2, 175–183. [Google Scholar]
  42. Meehan, M.; Insko, B.; Whitton, M.; Brooks, F.P., Jr. Physiological measures of presence in stressful virtual environments. ACM Trans. Graph. (Tog) 2002, 21, 645–652. [Google Scholar] [CrossRef] [Green Version]
  43. Blythe, M. Research through design fiction: Narrative in real and imaginary abstracts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI ‘14, Toronto, ON, Canada, 26 April–1 May 2014; pp. 703–712. [Google Scholar] [CrossRef]
  44. Holoride: Virtual Reality Meets the Real World. Available online: https://www.audi.com/en/experience-audi/mobility-and-trends/digitalization/holoride-virtual-reality-meets-the-real-world.html (accessed on 30 April 2021).
  45. Jingyi, L.; Yong, M.; Puzhen, L.; Andreas, B. A Journey Through Nature: Exploring Virtual Restorative Environments as a Means to Relax in Confined Spaces. In Proceedings of the Creativity and Cognition—C&C ’21, Virtual Event, Italy, 22–23 June 2021. [Google Scholar] [CrossRef]
  46. McGill, M.; Ng, A.; Brewster, S. I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems—CHI ‘17, Denver, CO, USA, 6–11 May 2017; pp. 5655–5668. [Google Scholar] [CrossRef] [Green Version]
  47. Kovacs, R.; Ofek, E.; Gonzalez Franco, M.; Siu, A.F.; Marwecki, S.; Holz, C.; Sinclair, M. Haptic PIVOT: On-Demand Handhelds in VR. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Minneapolis, MN, USA, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1046–1059. [Google Scholar] [CrossRef]
Figure 1. A flow diagram of our systematic selection process (adapted from [15]).
Figure 2. A spatial design space for haptic VR across three RE scales: Seated (in orange), Standing (in blue), and Walking (in green), ranging from 2 × 2 × 1.7 m to 4.5 × 4.5 × 3 m.
Table 1. Condensed results of our systematic selection and analysis.
| System | Ref. | Year | Haptics Type | Haptic Display | RE Scale | Applic. Scenario | Evaluation and Metrics |
| --- | --- | --- | --- | --- | --- | --- | --- |
| TactaVest | [16] | 2004 | Vibrotactile | Wearable | Walking | Simulat. | Robustness, ease of use, weight, power consumption, cable management |
| NMES Arm | [17] | 2006 | Force | Wearable | Seated | Gaming | Muscle contractions, pain, excitement, utility |
| VRScooter | [18] | 2006 | Vibrotactile, Force | Grounded | Standing | Demo | Time to complete, satisfaction, presence, simulator sickness |
| FlexTorq./Tens. | [19] | 2011 | Kinesthetic, Force | Wearable | Standing | Gaming | n/a |
| Diraptics | [20] | 2016 | Tactile | Wearable | Seated | Gaming | System accuracy, execution time, space awareness |
| VR360HD | [21] | 2016 | Vibrotactile | Grounded | Seated | Media Consum. | Location/speed/direction/continuity recognition, sensory illusion |
| Nor./Tex.Touch | [22] | 2016 | Tactile | Handheld | Standing | Demo | Accuracy, realism |
| Cross-Field | [23] | 2016 | Tactile | Grounded | Standing | Accessi. | Perceptual threshold, spatial pattern recognition, scalability, resolution, safety |
| Haptics ToWall | [24] | 2017 | Force | Wearable | Walking | Gaming | Believability, impermeability, consistency, familiarity, realism, enjoyment, preference |
| Touchless Rhythm | [25] | 2018 | Tactile | Grounded | Standing | Gaming | n/a |
| PuPop | [26] | 2018 | Tactile, Force | Wearable, Physical Proxy | Seated | Gaming | Wearability, affordance, interactivity, acceptance, enjoyment, realism |
| Haptic Around | [27] | 2018 | Tactile | Grounded, Handheld | Walking | Gaming, Educat., Simulat. | Enjoyment, realism, quality, immersion |
| Haptic Serpent | [28] | 2018 | Force | Wearable | Standing | Demo | Comfort, acceptability |
| Delta Touch | [29] | 2019 | Tactile, Force | Wearable | Seated | Demo | Tactile/weight perception |
| TilePop | [30] | 2019 | Kinesthetic, Tactile | Physical Proxy | Walking | Gaming | User experience, safety |
| Electro Cutscenes | [31] | 2019 | Kinesthetic | Wearable | Standing | Gaming | Presence, realism, consistency, preference, involvement |
| Tasbi | [32] | 2020 | Vibrotactile, Force | Wearable | Standing | Demo | Utility |
| MoveVR | [33] | 2020 | Force | Physical Proxy, Autonomous | Walking | Gaming | Perception accuracy, realism, enjoyment, acceptability, user experience |
| Wireality | [34] | 2020 | Force | Wearable | Seated | Demo | Weight, field of reach, spatial consistency, realism, comfort, freedom of movement |
| Haptic GoRound | [35] | 2020 | Kinesthetic, Tactile | Grounded, Physical Proxy | Standing | Gaming | Performance |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
