Review

An Overview of Olfactory Displays in Education and Training

by
Miguel Angel Garcia-Ruiz
1,*,
Bill Kapralos
2 and
Genaro Rebolledo-Mendez
3
1
School of Computer Science and Technology, Algoma University, Sault Ste. Marie, ON P6A 4H9, Canada
2
Faculty of Business and Information Technology, Ontario Tech University, Oshawa, ON L1H 7K4, Canada
3
Tec Labs, Tecnologico de Monterrey, Monterrey 64869, Nuevo Leon, Mexico
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2021, 5(10), 64; https://doi.org/10.3390/mti5100064
Submission received: 27 August 2021 / Revised: 28 September 2021 / Accepted: 6 October 2021 / Published: 13 October 2021
(This article belongs to the Special Issue Feature Papers of MTI in 2021)

Abstract

This paper describes an overview of olfactory displays (human–computer interfaces that generate and diffuse an odor to a user to stimulate their sense of smell) that have been proposed and researched for supporting education and training. Past research has shown that olfaction (the sense of smell) can support memorization of information, stimulate information recall, and help immerse learners and trainees into educational virtual environments, as well as complement and/or supplement other human sensory channels for learning. This paper begins with an introduction to olfaction and olfactory displays, and a review of techniques for storing, generating and diffusing odors at the computer interface. The paper proceeds with a discussion on educational theories that support olfactory displays for education and training, and a literature review on olfactory displays that support learning and training. Finally, the paper summarizes the advantages and challenges regarding the development and application of olfactory displays for education and training.

1. Introduction

The human senses have been defined as the physiological capacities that provide data for perception. The five traditionally recognized senses (as classified by Aristotle over 2000 years ago) include sight, smell, taste, touch, and hearing, although we now know that we can detect stimuli beyond these five traditional senses (e.g., temperature, pain, balance, amongst others) [1]. Olfaction, that is, the sense of smell and one of the phylogenetically oldest human senses [2], allows smells (odors) to be perceived. The human olfactory system has between 6 and 10 million receptor cells, allowing it to distinguish between 2000 and 4000 different smells [3], and although the human olfactory system has been described as poor relative to other animals including dogs (e.g., a sheep dog has 220 million receptor cells), it is still very acute [4]. As Doty (2009) [5] and Dozio et al. (2021) [6] point out, the olfactory system is greatly underappreciated as a source of information and a way to interact with our environment, despite the fact that it monitors the intake of airborne agents into the human respiratory system and helps determine the flavor and palatability of foods and beverages. In addition to enhancing the quality of life, the olfactory system warns us of dangers (e.g., spoiled foods, polluted air) and mediates basic elements of communication (e.g., mother–infant interactions). In other words, olfaction is critically important for safety, nutritional status, and quality of life; its dysfunction is now known to be among the earliest “preclinical” signs of Alzheimer’s disease (AD) and sporadic Parkinson’s disease (PD) [5].
The ability of olfaction to evoke memories linked to past experiences, together with the emotions that arise from these experiences, has been well established [7]. In fact, olfaction can evoke memories more intensely than the other modalities, can convey meaning [8], and allows us to shift our attention by modulating our distance to the source based on the perceived pleasantness of a smell [9]. A particular smell can provide a powerful and sustained cue, and when it is associated with a pleasurable experience, it can offer a mechanism for recalling events and emotions [10]. As described by Ward et al. (1999) [10], Herz (1998) [11] has shown that olfaction is most closely linked to remembered emotion rather than to facts.
Olfactory displays are defined as human–computer interfaces that generate and diffuse one or more odors to a user for a purpose [12,13]. Computer-generated odors have been proposed and used in education and training settings over the past four decades, supporting memorization of information, helping immerse learners into 3D educational environments, and complementing or supplementing the other human senses. Smells can convey meaningful and useful information at the computer/digital product interface. Applications of olfactory displays include the use of smell as a warning signal and as a “mood enhancer” using aromatherapy techniques, given that smells stimulate emotional responses. In addition, olfaction is a powerful recall stimulant [14]. Smith et al. (1992) [15] demonstrated that 47 participants successfully recalled a 24-word list when smelling two scents during an initial learning and a re-learning session, suggesting that smells can provide an effective contextual cue for retrieval of verbal stimuli. Including olfactory stimuli in multimedia applications can lead to a more complex and richer user multimedia experience, since they can heighten the sense of reality and diversify user interaction modalities [16].
In a cross-modal comparison of verbal, visual, tactile, musical and olfactory stimuli as associated memory cues, Herz (1998) [11] found that odor was a superior cue to memory in terms of emotional salience. Odors have a number of technical properties that can be successfully exploited in human–computer interaction (HCI) in general, including directional properties [17], intensity, the chemical nature of the odor, and hierarchical properties, among others [18,19]. In this paper, we refer to the interface as the part of a computer or digital product where the user and the digital product meet and interact; it is where human–product or human–computer interaction takes place using one or more sensory channels [20]. Thus, an olfactory display provides users with one or more odors that convey meaningful information at a computer interface, for example, providing useful cues on how to solve a problem in an educational computer application [21]. This paper focuses on the generation and transmission of scents from olfactory displays, used for supporting education and training.
Despite the importance of olfaction in the real world and the advantages of olfactory displays, the sense of smell is one of the least used senses in HCI. This is partly due to the fact that there are important challenges regarding smell generation, diffusion, and removal, as we will discuss in this paper. In addition, the number of design methodologies and of usability and user experience (UX) testing methods for developing olfactory interfaces is still very limited. As Ghinea and Ademoye (2011) [16] point out, olfaction is “one of the last challenges which multimedia and multimodal applications have to conquer…”
Odors have been purposely used in media for several decades. Fragrances have been sporadically delivered to cinema audiences since the first movies were projected at the beginning of the twentieth century [22]. For instance, a scent of roses was dispersed with a fan to an audience during the projection of a newsreel on the Pasadena Rose Bowl game in 1906 [23]. Smell-O-Vision was a mechanical system used in the 1950s and 1960s that diffused up to 30 odors during a film, so that viewers could perceive scents as part of the movie scene, delivering narrative clues [24]. With respect to learning, smells have the potential to enhance the narrative in immersive educational applications (e.g., serious games, that is, video games whose primary purpose is education rather than entertainment, and other types of educational human–computer applications). At the beginning of the 1960s, Morton Heilig, an American cinematographer, designed and constructed the Sensorama, an electro-mechanical one-person arcade movie system that included auditory, visual, tactile, and olfactory displays [25], developed with the objective of immersing the user in a projected multimodal movie. For example, while watching a movie about riding a motorcycle through the streets of Brooklyn in New York, NY, USA, the user could smell baking pizza as the bike passed a pizzeria, and was also presented with the smell of exhaust fumes coming from nearby cars. The odors were generated from spray cans and diffused to the user using fans. Interestingly, Heilig proposed in his patent that the Sensorama could be used as an educational tool; he also stated that the Sensorama simulator could relieve teachers of the burden of teaching complex materials and could be used to train soldiers without exposing them to “potentially dangerous equipment” [25]. Unfortunately, Heilig was not able to commercialize the Sensorama due to a lack of funding at the time his patent was granted. In John Waters’ film “Polyester”, released in 1981, the audience was given “scratch and sniff” cards and asked to rub them at certain points during some scenes of the movie [26]. In 2010, audiences of the movie “Spy Kids 4”, directed by Robert Rodriguez, used Aromascope cards with numbered smells, which they had to rub when the corresponding number flashed on the screen [27].
A more recent application of olfaction in media is scent marketing, that is, the use of smells for promoting commercial products or positioning a brand [28]. For example, some magazines include scented pages of perfume advertisements. Scent marketing also uses a technique called ambient scents, where smells “do not emanate from a product, but are generally present as part of the retail environment”, generally by injecting odors into the stores’ ventilation system with the objective of stimulating consumers’ mood and influencing them to buy products [29].
The use of olfaction in computer interfaces was initially proposed, researched, and applied in the 1980s. An early example of an olfactory application in computer interfaces is the action-fiction video game titled “Leather Goddesses of Phobos”. It included a scratch-and-sniff card with seven numbered odors, where players scratched the corresponding number that was shown at certain points in the game and then sniffed the resulting odor [30], complementing the multimodal interaction with the game. Rheingold (1991) [31] pointed out that odors can be an important component in virtual reality/virtual environment applications, since olfaction can be a powerful sensory channel for enhancing presence in 3D virtual environments and improving spatial cues, which are key characteristics of virtual reality [32,33]. Only recently (owing to increased computer processing power and more efficient olfactory systems) has olfaction been effectively studied as an additional sensory modality in a 3D virtual environment for supporting a more immersive experience, using pleasant scents that are related to the virtual environment [34]. Smell-generation techniques for virtual reality/virtual environments and other types of human–computer interfaces have been developed, and are described in the next section.
Early olfaction applications proposed or employed simple techniques for generating and transmitting odors to audiences, such as using scratch-and-sniff cards and fans for diffusing odors. Over the years, as described in the following section, more elaborate odor creation and diffusion techniques have been designed and tested. Analyzing existing olfactory display techniques is necessary to understand the complexity and feasibility of the technology and to produce and use efficient and effective olfactory displays.
The paper is organized as follows: it begins with an introduction to olfaction and olfactory displays, and a review of techniques for storing, generating, and diffusing odors at the computer interface. The paper then presents a discussion on educational theories that support olfactory displays for education and training, and a literature review on olfactory displays that support education and training. Finally, the paper summarizes the advantages and challenges regarding the development and application of olfactory displays for education and training. The review considers papers on computer-based olfactory displays for education and training from the mid-1990s (when we found an early report of an application for training [32]) to the present. Within this time frame, Section 4 and Section 5 consider papers that have used computer-based olfactory displays and olfactory applications purposely deployed along with educational and training human–computer interfaces.

2. Techniques for Storing, Generating and Diffusing Odors at the Computer Interface

Kaye (2004) [18] and Yanagida (2012) [13] describe various techniques for storing, generating, and diffusing odors using mechanical (physical), chemical, or electro-chemical methods, or a combination of them. Below, we provide a brief description of several of these techniques and of related reported research:
  • Using scented waxes or oils that release odors when in contact with the air. Brewster et al. (2006) [35] used commercially available plastic cubes containing scented oils in an experiment that examined the use of odors to help memorization of features from photographs. Results showed that recall using olfaction was above chance. A slightly different technique is heating a scented liquid or wax to release its smell. This technique has been widely used in commercial aromatherapy products and in household air fresheners and aromatizers. In addition, Braga (2006) [36] proposed the design of a simple electronic odor generator interface in which a time-controlled resistor heats a scented liquid and a small fan diffuses the scent towards the user (a minimal control sketch of this heat-and-diffuse approach appears after this list).
  • Keeping a scent compressed in a bottle and spraying it into the air. This is another technique widely used in commercial household aromatizers, and it has been employed in some research projects where an odor is sprayed using an electronically controlled solenoid. Nakamoto et al. (2009) [37] developed an olfactory display system in which some video clips were “aromatized” using this technique. Tests indicated that most participants enjoyed watching and smelling the scented videos.
  • Encasing a scented substance in a cartridge and dispersing it using inkjet technology, which diffuses very small drops of the scent towards a user (Sugimoto et al., 2010) [38]. Preliminary tests indicated favorable and effective use of scented movies. Czyzewski et al. (2010) [39] used a similar technique called “cold air diffusion”, which delivers very tiny drops of scented oil, previously stored in a glass pipe and rapidly released to the environment using compressed air.
  • Producing a cool or warm scented mist by using a metal diaphragm (an ultrasonic transducer) vibrating at high frequency and placed on a scented liquid, generally water. The water droplets in the generated mist are very small, measuring a few microns in diameter, and are rapidly absorbed into the air. This technique is used by commercial off-the-shelf humidifiers [13]. A similar technology is a vaporizer that boils scented water, releasing fine steam and moisture into the air.
  • Using scratch-and-sniff stickers and other similar materials forms the basis of the smell microencapsulation technique, where users rub the sticker to break the microcapsules placed on its surface and release a smell. A number of patents have proposed the use of scratch-and-sniff stickers for learning and teaching, but to our knowledge there are no reported research projects that examine educational olfactory HCI using this technique. However, stickers, books, and other educational materials with impregnated smells have been widely used in classrooms for years to support learning (for example, see Schultz, 1987 [40]; McGee and Tompkins, 1982 [41]).
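To make the control logic behind the heat-and-diffuse approach in the first bullet above more concrete, the following minimal Arduino-style C++ sketch alternates a heating element and a small fan on fixed timers. The pin assignments, relay-switched wiring, and timing values are our own illustrative assumptions; they are not taken from Braga’s (2006) design.

// Minimal illustrative sketch (Arduino-style C++): a time-controlled heating
// element releases the scent from a scented liquid or wax, and a small fan then
// diffuses it towards the user. Pins, relay wiring, and timings are assumptions.
const int HEATER_PIN = 7;                 // hypothetical pin driving a relay for the heating resistor
const int FAN_PIN = 8;                    // hypothetical pin driving a relay for the diffusion fan
const unsigned long HEAT_MS = 5000;       // warm the scented liquid for 5 s (illustrative)
const unsigned long DIFFUSE_MS = 3000;    // run the fan for 3 s to push the scent out
const unsigned long IDLE_MS = 60000;      // wait one minute before the next emission

void setup() {
  pinMode(HEATER_PIN, OUTPUT);
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  digitalWrite(HEATER_PIN, HIGH);         // time-controlled heating releases the scent
  delay(HEAT_MS);
  digitalWrite(HEATER_PIN, LOW);
  digitalWrite(FAN_PIN, HIGH);            // fan diffuses the released scent towards the user
  delay(DIFFUSE_MS);
  digitalWrite(FAN_PIN, LOW);
  delay(IDLE_MS);
}

In practice, the heating time and power would have to be tuned to the particular scented liquid or wax used, and safety cut-offs would be needed for a classroom deployment.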
Most of the above techniques include fans for diffusing the generated smells towards the user. In addition, Yanagida (2008) [42] reported the development and use of “air cannons” for directing the odor to a user. Another reported technique is directing the odor to the user’s nose. Nakamoto and Yoshikawa (2006) [43] devised a system to generate up to eight odors that were presented along with movie scenes. The system was composed of solenoid valves activating sprays that contained odors that were synchronized with scenes from a movie. The odors were sent close to the user’s nose through a tube.
Once the odors have been generated and diffused, it is necessary in many cases to remove them from the environment. Past literature has reported very few techniques, guidelines, or recommendations on how to remove dispensed odors when using olfactory displays. Odor removal is an important part of olfactory display applications due to the intrinsic characteristics of odors: persistence (a generated odor tends to linger), multiplicity (an odor may enhance other odors), and masking (an odor may suppress another odor) (Yanagida, 2008) [42]. Computer-generated odors may persist in ambient or personal spaces and thus distract or even annoy others nearby. In addition, it may be necessary to remove a generated odor quickly at the interface to generate and diffuse another one, otherwise smell masking or multiplication will occur. Olfactory adaptation (or odor fatigue) refers to the reduction of sensitivity following stimulation, leading to the inability to distinguish a particular odor after exposure to that odor over a period of time [44]; it may occur if the odor is presented for long periods at the computer interface. Kadowaki et al. (2007) [45] have overcome this problem by devising an odor pulse ejection technique that presents a smell for very short periods of time to minimize the time the odor remains in the air. Tsai and Hsieh (2012) [21] used a computer-controlled fan placed behind a pair of odor generators to “clean the surrounding air” right after the generated smells were used. However, this technique simply uses the fan to push the odor away from the user; it does not completely remove or absorb the odor from the environment where the odors were used. Figure 1 shows a simple olfactory display interface developed by Garcia-Ruiz et al. (2021) [46], consisting of an off-the-shelf household aromatizer and a fan for diffusing the generated odor. Both devices are controlled by a microcontroller board that receives a signal via a USB port from a video game for activating the fan and the aromatizer at specific moments when the game is played.
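As a rough illustration of how such a game-triggered interface can be driven, the following Arduino-style C++ sketch (a speculative sketch, not the code used by Garcia-Ruiz et al. [46]) listens for a single-character command arriving over the USB serial link and responds with a short odor “pulse” followed by a fan run, in the spirit of the pulse-ejection and air-clearing techniques discussed above. The command character (‘S’), pin assignments, and timings are illustrative assumptions.

// Illustrative sketch: pulse an aromatizer when a game event arrives over USB
// serial, then run a fan to diffuse the odor and push the residue away.
// Assumptions: aromatizer and fan are relay-switched on pins 5 and 6, and the
// game writes the character 'S' whenever a scent should be emitted.
const int AROMATIZER_PIN = 5;
const int FAN_PIN = 6;
const unsigned long PULSE_MS = 300;       // short burst to limit lingering and olfactory adaptation
const unsigned long FAN_MS = 4000;        // fan time used to diffuse and then clear the odor

void setup() {
  pinMode(AROMATIZER_PIN, OUTPUT);
  pinMode(FAN_PIN, OUTPUT);
  Serial.begin(9600);                     // USB serial link to the game
}

void loop() {
  if (Serial.available() > 0 && Serial.read() == 'S') {
    digitalWrite(AROMATIZER_PIN, HIGH);   // brief "pulse ejection" of the odor
    delay(PULSE_MS);
    digitalWrite(AROMATIZER_PIN, LOW);
    digitalWrite(FAN_PIN, HIGH);          // move the odor towards, and then away from, the user
    delay(FAN_MS);
    digitalWrite(FAN_PIN, LOW);
  }
}

On the game side, triggering a scent then amounts to writing that single byte to the corresponding serial port at the scripted moment of gameplay.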

3. Classification of Smell Presentation in Human–Computer Interaction

Odors incorporated into human–computer interfaces can be classified into olfactory icons and smicons, according to their semantic meaning and their relationship to the context of use [12]. Smicons and olfactory icons are similar to “earcons” (musical notes that represent information with an abstract mapping) and “auditory icons” (naturally occurring sound effects that map information with direct meaning) [47], respectively, and are the equivalent of visual icons in a graphical user interface (GUI). An olfactory icon has been defined by Kaye (2004) [18] as a computer-generated odor that conveys meaningful information to its user(s) and is semantically and environmentally related to the information to be conveyed. For instance, a graphical icon of a rotten apple associated with a computer-generated smell of a rotten apple can be used to indicate that the computer trash can is full; thus, there is a direct mapping between the visual icon and the odor.
In contrast, a smicon is a computer-generated odor that has an abstract relationship with the information it represents [18]. For example, a Japanese firm developed a prototype of an olfactory fire alarm that uses a very distinctive wasabi (Japanese horseradish) smell to alert hotel guests. The olfactory alarm is intended to substitute for auditory and visual alarms in the event that some guests are deaf or deeply asleep [48]. Notably, the idea of using olfactory alarms had been proposed before [49].
Recently, electronic odor sensors (also known as odor meters or electronic noses) have been incorporated into olfactory human–computer interfaces. Odor sensors are capable of measuring odor strength and odor type [50]. This can be an effective method of collecting both qualitative and quantitative data from usability tests. However, some odor sensors require a long turnaround time, up to dozens of seconds [42]. Moreover, a disadvantage in user testing of odors is that many participants lack the training and experience needed to distinguish particular scents [51].
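As a simple illustration of how an inexpensive odor sensor could be polled by the same kind of microcontroller used to generate odors, the sketch below (again Arduino-style C++) reads a metal-oxide gas sensor module on an analog pin and reports a rough odor-strength value, for example to check whether a dispensed odor has dissipated. The specific sensor, pin, threshold, and warm-up time are illustrative assumptions; commercial odor meters and electronic noses are considerably more sophisticated and can also estimate odor type.

// Illustrative polling of a metal-oxide gas sensor module (e.g., an MQ-series
// breakout) to obtain a rough odor-strength reading. Pin, threshold, and warm-up
// period are assumptions; higher analog readings roughly indicate stronger odors.
const int SENSOR_PIN = A0;
const int ODOR_THRESHOLD = 400;           // illustrative threshold on the 10-bit ADC scale
const unsigned long WARMUP_MS = 30000;    // many gas sensors need tens of seconds to settle

void setup() {
  Serial.begin(9600);
  delay(WARMUP_MS);                       // account for the slow turnaround time noted above
}

void loop() {
  int reading = analogRead(SENSOR_PIN);   // 0-1023; higher means a stronger odor concentration
  Serial.print("odor strength: ");
  Serial.println(reading);
  if (reading < ODOR_THRESHOLD) {
    Serial.println("ambient odor level low; display area likely cleared");
  }
  delay(1000);
}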

4. Educational Theories That Take Olfactory Stimuli into Account

Olfaction has been overlooked in many educational settings, despite the fact that odors can effectively help learners memorize pieces of information, evoke memories, and trigger emotions [52], among other benefits. The use of olfaction in the classroom may include reactive olfactory displays and multimodal interfaces; one example is the delivery of an odor as a reward after solving a math problem. This is supported by the educational philosophy of Constructivism, which is based on the premise that students “learn/practice by doing”, constructing meaning and knowledge from their experiences [53].
Educational olfactory displays are supported by the Experiential Learning Theory [54], which states that “learning is the process whereby knowledge is created through the transformation of experience” [55], and involves a learner’s perception of didactic materials using all of his/her senses, including olfactory materials. The Brain-Based Learning theory states that students’ emotions are critical to learning, which can be influenced by olfactory stimuli [56,57]. Holloway (1998) [58] described an experiment on olfaction and emotions in learners conducted by Rachel Herz of the Monell Chemical Senses Center in Philadelphia, where a group of anxious students who were about to take an exam were given a list of words to remember. At the same time, the students were exposed to a particular odor (Holloway did not mention what the odor was). Another group of anxious students was given the same list of words, but this time their classroom was odorless. A week after the experiment was completed, Herz re-exposed the two groups of students to the odor and found that the group exposed to the odor for a second time improved their recall of the word list by 50% compared to the control subjects. This is consistent with previous research on using odors to improve emotional states of people when memorizing and recalling pieces of information [14]. The Cognitive-Affective Theory of Learning with Media [59] considers multimodal instructional materials, including olfactory information. As this theory states, for meaningful learning to occur, learners attend to and select relevant verbal and non-verbal information (including smell) for further processing in their working memory. After that, learners organize the multiple representations into a coherent mental model and integrate the organized information with their prior knowledge.

5. Review of Research on Educational Olfactory Applications

Computer-generated odors can be an important support for learning, since they can be useful for provoking positive emotions in learners, thus reducing their stress levels at school [60]. In addition, odors can help “enhance memory performance through better problem solving, reduce response times, produce fewer errors, increase recall, recognition, and retention, and enhance productivity, alertness, and physical performance” [61]. Youngblut et al. (1996) [62] pointed out a number of benefits of olfaction in educational virtual reality, such as reducing students’ stress and improving information retention and recall. The following paragraphs summarize examples of research projects that employ odors for supporting learning.
Olfactory applications have been used in art exhibitions for enhancing visitors’ educational experiences about multimodal art. Lai (2015) [63] developed an interactive art exhibition where patrons perceived five odors (scents of grass, baby powder, whiskey tobacco, dark chocolate, and leather), generated by mist diffusers. In one part of the exhibition, the odors were linked to artwork such as origami boxes that patrons rearranged themselves, allowing other visitors to perceive the odors in a different order. In another part, the aromas (sweet or pleasant odors) were generated according to the visitors’ walking direction, allowing other visitors to perceive them. Lai reported that the odors worked as a powerful communication medium and a complement to the other senses in artistic perception. Another example of multimodal experiences, including olfactory information, was presented at the Tate Britain art gallery in London, UK [64], where a combination of visual, touch, auditory, and olfactory stimuli was used for supporting artistic appreciation of paintings. Visitors were allowed to hold and smell 3D-printed scented objects that were related to some of the paintings that visitors could observe, while listening to related sounds. Vi et al. (2017) [64] collected and analyzed post-questionnaire data completed by 50 visitors (participants), showing that “all participants strongly acknowledged that stimulating all the senses added another layer, dimension, and perspective to the experience of the paintings and thus opened new ways of thinking and interpreting art”. Qualitative data showed that odors were useful in supporting understanding of the paintings’ meaning and artistic renderings.
Tijou et al. (2006) [65] developed a fully immersive desktop virtual reality (VR) system to investigate the effects of olfaction on learning, recall, and retention of 3D structures of organic molecules. The VR system included two commercial olfactory display devices generating up to six odors that were related to the virtual molecular models presented in the VR system, for example, a molecular model of vanillin was paired with a vanilla odor. In the VR system, the 3D graphical molecular models were displayed on both a computer monitor and a head-mounted display (HMD). Users interacted with the molecular models using a special 3D mouse or a hand movement tracking system. The paper demonstrated the feasibility of a multimodal VR system used for learning in the sciences. However, the authors [65] did not report any testing conducted with students, and proposed future work that will study the role of olfaction, interaction techniques and depth cues while learning molecular structures.
Richard et al. (2006) [66] introduced the ‘‘Nice-smelling Interactive Multimedia Alphabet’’ project that involved developing a multimodal computer application that included olfactory, visual, and auditory information. The main objective of this multimodal application was to support learning of letters of the alphabet. However, Richard et al. do not provide further details on the project.
Miyaura et al. (2011) [67] developed an olfactory display that diffused odors of ylang-ylang and peppermint using inkjet technology. An electrocardiogram was used to measure users’ concentration levels when performing basic addition tasks. The researchers reported that the odors helped to decrease errors in the additions. The aim of the olfactory display was to help learners re-engage in the addition tasks with the use of odors once a concentration lapse was detected with the electrocardiogram.
Kwok et al. (2009) [68] introduced the SAMAL (Smart Ambience for Affective Learning) system, which included the development and testing of a multimodal ambient room with visual, auditory, and olfactory stimuli. One of the objectives of SAMAL was to integrate cognitive and affective issues with the purpose of enhancing learning, and studying the emotional and affective experience of learners while perceiving multisensory stimuli coming from the SAMAL system. The ambient room included 3D stereo projection, 3D interaction using a Wii-mote, high-fidelity audio, and an olfactory display system with spray dispensers, among other interesting features. The SAMAL system provided some ambient “scenarios” to evoke different affective and cognitive states of mind and feelings. For example, a scenario called “Blue Hat Smart Ambient” displayed a 3D projection of a quiet road along with a sound effect of rain and provided students with a smell of violets. According to Kwok et al. (2009) [68], all of the multimodal stimuli were designed to promote a “feeling of calm and wakening needed for better control and direction” of students while solving a problem. In another scenario, a green apple odor was dispersed to stimulate the “fresh, liberated and free-thinking feelings needed for triggering new or wild ideas.” Preliminary findings of post-tests applied to learners showed that the SAMAL system generally did influence students’ affective experiences, improving their learning effectiveness. However, Kwok et al. (2009) [68] did not describe any effects of odors on students’ cognitive processes, and they did not describe the odor generation system in greater detail.
Garcia-Ruiz et al. (2008) [69] described a usability study that tested the integration of an odor in an educational 3D virtual environment (a virtual town with some buildings, street lamps, and roads with street name signs) developed for second language learning. Twelve computer science student participants tested the virtual environment, where they followed oral instructions in English (their native language was Spanish) about going from point A to B in the virtual environment, using a regular mouse for “walking around” in the streets of the virtual town. While walking on the virtual streets, all participants were presented with an odor of fresh mint leaves (Mentha spicata). After the test, participants completed the System Usability Scale (SUS) questionnaire—a 10-item questionnaire with five response options for respondents, from Strongly agree to Strongly disagree—which is a reliable tool for measuring the usability of a system [70]. Preliminary results indicated that all participants perceived the usability of the multimodal virtual environment as very good. In addition, participants reported that the mint odor helped them lower their anxiety when listening to the oral instructions in English. Experimental results obtained by Herz et al. (2004) [71] have shown that a mint odor (among other pleasant smells) can stimulate or activate the mood of learners, and pleasant responses to odors are learned through emotional associations. The use of mint to affect mood has also been examined in marketing applications [72]. Moreover, Ho and Spence (2005) [73] demonstrated experimentally that olfactory stimulation with mint facilitates tactile performance of complex tasks, which may be useful for supporting further training in computer simulations that require dexterity. Oliver (2012) [74] incorporated odors in a language learning seminar. The seminar explored how odors can be incorporated into teaching literary concepts of the English language at different levels, pointing out that odors can work as an important learning tool.
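Returning to the SUS questionnaire used in the study above: the overall score is computed by rescaling the ten 1–5 responses (odd-numbered items contribute the response minus one, even-numbered items contribute five minus the response) and multiplying their sum by 2.5, which yields a value between 0 and 100. The short C++ helper below is a minimal sketch of that standard calculation; the example responses are invented.

#include <array>
#include <iostream>

// Standard SUS scoring: odd-numbered items contribute (response - 1), even-numbered
// items contribute (5 - response); the sum is multiplied by 2.5 for a 0-100 score.
double susScore(const std::array<int, 10>& responses) {
  double sum = 0.0;
  for (int i = 0; i < 10; ++i) {
    sum += (i % 2 == 0) ? (responses[i] - 1)   // items 1, 3, 5, 7, 9
                        : (5 - responses[i]);  // items 2, 4, 6, 8, 10
  }
  return sum * 2.5;
}

int main() {
  // Invented example responses on the 1 (strongly disagree) to 5 (strongly agree) scale.
  std::array<int, 10> responses = {5, 1, 4, 2, 5, 1, 4, 2, 5, 1};
  std::cout << "SUS score: " << susScore(responses) << std::endl;  // prints 90
  return 0;
}

Scores above roughly 68 are commonly interpreted as above-average usability, which is consistent with the “very good” usability perception reported by the participants.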
Czyzewski et al. (2010) [39] developed a computer-controlled device capable of generating tiny drops of scented oil, previously stored in a glass pipe, and rapidly released to the environment using compressed air. This is a technique that the authors called “cold air diffusion.” The developed device can generate up to four odors simultaneously and was tested using educational software showing animated animal cartoons. This multimedia software was designed for measuring the degree of concentration in young students. The students had to concentrate on the movements of a bee character while purposely being distracted by other moving characters. In addition, a particular odor was displayed with the developed device when the animations were shown to the students. However, the authors do not describe what types of odors were used. The researchers [39] reported that although the results of an initial test were inconclusive, the test served to correct technical problems with the device and to analyze the effectiveness of the olfactory device. The researchers also pointed out that their device could further be used to support multisensory stimulation for science-based education.
Covaci et al. (2018) [75] developed a multiplayer serious game intended to teach high-school students about the seventeenth century’s Age of Discovery (a period in history when several European kingdoms started to explore the world by sea for trading goods such as spices). In a study, participants played the serious game and were required to open and smell small jars containing odors of real spices and beans, including coffee beans, cocoa beans, ginger, pepper, cinnamon, and clove. The spices were smelled at the same time that an image of each spice appeared visually in the game. Presumably, participants were required to close the jar when it was no longer needed. This research aimed to explore ways to design multisensory experiences in serious games. The researchers’ goal was to examine any differences in students’ performance and enjoyment while playing the game on desktop computers and mobile devices, in the absence or presence of olfactory stimulation. They found, through a self-reporting engagement questionnaire, that multiple sense stimulation in a serious game engaged the users, potentially improving the learning process. However, results of pre-test and post-test knowledge questionnaires showed that the olfactory feedback did not yield an improvement in performance. This was possibly due to the participants’ cultural background, which prevented them from effectively associating the images of the spices with their odors, to their limited ability to discriminate among the odor stimuli, and to the fact that students did not verbalize the odors used in the game. As Ghinea and Ademoye (2011) [16] highlighted, it is difficult to effectively measure the user experience (UX) on the quality of olfactory data presented through olfactory displays.
In the real world, our senses are constantly responding to specific physical phenomena, providing the perceptual system with data that is processed/integrated by the nervous system, producing multisensory information that allows us to acquire knowledge [76]. The integration of environmental information across multiple sensory channels (such as the olfactory channel) is critical to guiding decisions and behaviors [77]. The different senses also interact with one another and alter each other’s processing and ultimately the resulting perception. Stability of perception in everyday life is preserved by the integration of multimodal information, and the perception of synchrony in cross-modal combinations plays an important role in maintaining perceptual stability in a continually changing environment [78]. Various “real-world” studies suggest that sound can have a significant effect on the perception of the other senses. For example, sound (noise) can have an impact on the perception of food gustatory properties, food crunchiness, and food liking [79]. In other words, our construct of our environment and our ability to interact with that environment is very much determined by the interaction of our senses. An understanding of this multimodal interactivity in the real world can therefore inform our development of environments in the virtual space. Most scientific research suggests that the more modalities that are integrated into a virtual representation of our environment (including the smell modality), the greater the sense of presence or immersion in that space [1]. Additional modalities can reinforce existing information, or provide additional information that cannot be obtained by a single modality alone. For instance, although vision tends to dominate, auditory information can tell us what is behind a door, or behind or inside of our bodies. With respect to education, training, and memory, most studies acknowledge that multiple sensory inputs (including odors) result in improved processing and retention [80]. The term cross-modal interaction or integration is defined as the process of coordinating multimodal information (information that stimulates multiple senses) from different sensory channels into a final percept [81]. Signals that incorporate multiple sensory modalities enhance cognitive processes, including learning and decision-making [82]. As Flavian et al. (2021) [83] point out, “achieving multisensory digital experiences is the holy grail of human–technology interaction”, and “providing multisensory experiences in digital environments is one of the future priorities in technology development”.
However, some care should be taken when considering multimodal stimuli. As Kapralos et al. (2017) [1] describe, redundancy, that is, the repetition between modalities of the same message, can increase communication and facilitate knowledge transfer. However, due to channel limits—that is, how much information we can absorb at any one time (also known as cognitive load)—redundancy can at times decrease effectiveness. Redundancy that does not provide new information runs the risk of diminishing, rather than facilitating, communication. In other words, we have a limited cognitive ability to take in information across multiple senses at the same time, and therefore an overload of information—too much sensory stimulation—can impede task performance. Additional modalities with incongruent information add cognitive load when not carefully balanced with the main means of communication.
Klašnja-Milićević et al. (2018) [84] investigated olfaction-based applications in multimodal human–computer interfaces for learning, with the goal of determining how the senses of smell, taste, vision, and hearing interact and how they can improve memorization in a virtual reality educational application that focused on the solar system. Sixty university students participated in an experiment, where they were randomly divided into two groups. One group used an educational VR system while they consumed chocolate and/or coffee and were presented with vapors of three essential oils: citrus, rosemary, and mint. The other group used an educational VR system without the olfactory and taste stimuli. A within-groups testing protocol revealed that participants who consumed the chocolate, drank the coffee, and smelled the citrus oil vapor while using the VR learning application scored higher in a pre-/post-test.
Alkasasbeh and Ghinea (2019) [85] developed a multimodal website for learning about geography. Four odors were used to represent information about four countries (Brazil = coffee, India = curry, Japan = green tea, and South Africa = wild grass) and were delivered to web users through a dry-air scent diffuser with fans for directing the odors to the users. They hypothesized that memory and recall of information about those countries would be improved with the odors. A test was conducted with 32 participants who were randomly distributed across eight conditions (text only; audio only; olfactory media only; images; audio and text; audio and olfactory media; audio, text, and olfactory media; and text and olfactory media). The conditions depicted geographical information about the four countries. The researchers administered pre-tests and post-tests with questions about the four countries. Results showed that performance on post-test questions for which the olfactory media were synchronized with the audiovisual media was significantly better (above-average scores) than that of participants who were not provided with any olfactory stimuli. However, the questions related to olfactory media alone yielded no significant difference. The use of odors for recalling information is consistent with previous research [35]. Table 1 summarizes the research reviewed in this section.
The next section describes olfactory displays used for training. The time frame considered in the following reviews is from the mid-1990s to 2019, the period in which we identified most of the olfactory applications for training in the literature.

6. Review of Research on Olfactory Applications That Support Training

Designers and developers of immersive 3D virtual environments, including virtual simulations and serious games used for training, typically aim to faithfully recreate real-world scenarios. In fact, it has been suggested that “achieving multisensory digital experiences is the holy grail of human–technology interaction” in general [83]. However, traditional emphasis is placed on recreating the visual and (perhaps to a lesser extent) auditory scenes, while ignoring the other senses (including touch, smell, and taste) despite their importance in the real world [86]. Indeed, current digital experiences are primarily based on audiovisual stimulation and involve other sensory stimulation to a lesser extent [87]. Olfactory stimuli, in particular, are widely neglected, although the sense of smell influences many of our daily life choices, affects our behavior, and can catch and direct our attention [6]. Incorporating pleasant and congruent ambient odors into a virtual reality experience can lead to enhanced sensory stimulation, which in turn can directly (and indirectly, through ease of imagination) influence affective and behavioral reactions [82]. Finally, incorporating olfactory technologies into virtual environments has been shown to be safe and effective for targeting several aspects of psychological and physical health such as anxiety, stress, and pain [88]. Simulating the senses of touch, smell, and taste is not trivial and presents many technological challenges and issues. Recently, a large effort has been made on simulating the sense of touch (i.e., haptics), and this effort has been accelerated by the availability of consumer-level haptic devices. That being said, although greater work remains, olfaction has been applied in virtual reality (3D) learning environments, as described below.
Olfactory stimuli have also been applied to support training of people in different configurations and situations. Cater (1994) [32] developed a backpack-mounted fire-fighter training system at the Deep Immersion Virtual Environment Laboratory, which generated odors that were sent to a fire-fighter oxygen mask, to train persons to distinguish different types of smoke. However, the author reported that the system generated strong odors, causing extreme discomfort in some trainees.
A review conducted by Spencer (2006) [89] outlined a number of research projects that used artificially generated odors in medical simulations for education and training. Olfactory information is a key factor in medicine, complementing the correct diagnosis of many diseases. The review [89] argues that it is technically feasible to use odors in virtual reality simulations for medical training, thanks to recent technological advances that have led to devices for odor production at the computer interface and to efficient ways to disperse odors remotely over the Internet and local networks. It showed that adding simulated odors to a virtual reality medical simulator effectively complements medical diagnoses and enhances the training of medical students. Kent et al. (2016) [90] discussed the use of olfactory stimuli in healthcare simulators, carrying out a systematic review of the literature that identified five relevant papers describing the use of smell in medical simulations. They found that olfaction is very rarely used in medical simulations and reported mixed results across the five reviewed papers. The researchers determined that clinically relevant odors (such as the odor of iodine scrub) could be more effective than general smells for supporting training, and that smell can be very useful for improving simulation fidelity.
A compelling application of olfactory human–computer interfaces has been developed for military training. According to Vlahos (2006) [91], theme-park designers and the University of Southern California developed a virtual reality simulator to train U.S. soldiers, integrating a number of odors to enhance the ambiance of a virtual war scenario. The soldiers don an electronic scent-generating device around their necks. Each odor is activated wirelessly according to the events generated in the virtual reality war simulation. For instance, when soldiers fire a gun, they perceive the smell of gunpowder coming from the device they wear. Preliminary research indicated that the use of odors in a simulated war environment enhanced soldiers’ mental immersion, a key element that supports training.
Tsai and Hsieh (2012) [21] explored the use of two types of computer-generated odors to support the training of computer programmers in writing efficient software code. The purpose of the olfactory display was to help improve coding style and to identify code errors. Results from a test indicated that more than 80% of participants found the smell useful for identifying errors and improving their coding style. The researchers built the olfactory display using an Arduino™ microcontroller board connected to a pair of off-the-shelf household aromatizers.
Narciso et al. (2019) [92] developed a virtual reality system whose goal was to analyze how an olfactory stimulus supported the training of firefighters. The researchers set up a multimodal virtual environment simulating a closed container in which firefighters dealt with a virtual fire, with an added burnt-wood odor stimulus. The odor was generated by a SensoryCo SmX-4D™ aroma system using compressed air, with the odor preloaded into the system. A between-groups experiment showed that although the multimodal virtual environment was successful in knowledge transference overall, the addition of the smell stimulus did not significantly influence any of the measured variables (presence, cybersickness, fatigue, stress, and knowledge transfer) for the group of firefighters that experienced the smell. The researchers explained that possible causes of the low odor effectiveness were that the firefighters were not fully equipped with protective gear and that the multimodal virtual reality environment (including its olfactory display) was not immersive enough in the experiment, which was reflected in the lower reported values of fatigue and stress.
Table 2 summarizes the research reviewed in this section.

7. Advantages of Olfactory Human–Computer Interfaces

Based on the past research on olfactory displays for learning and training reviewed in this paper, we identified the following advantages:
  • Odors can effectively work as cues for solving a problem in educational software.
  • Odors can serve as an attention grabber in an educational setting.
  • Odors can be used to provoke positive emotions in students, thus lowering their stress levels at school.
  • Odors help immerse learners and trainees in educational virtual environments.
  • A number of odor properties can be successfully exploited in a computer interface, such as persistence, directionality, intensity, the chemical nature of the odor, and hierarchy, among others.
  • An olfactory display can convey useful information to students by employing either olfactory icons (odors that are semantically linked to the conveyed information) or smicons (smells that have an abstract link with the mapped information).
  • Odors can stimulate emotional responses in learners. They can work as a “mood enhancer” and increase alertness.
  • Olfaction is a powerful memory recall stimulant.
  • Odors in computer interfaces may complement other senses, which can be exploited in educational multimodal interfaces.
  • Disabled students may use olfactory displays for supplementing other senses.
  • Simple odor diffusion techniques, such as scratch-and-sniff stickers and spraying smells into the air, can work effectively in educational settings.
With respect to virtual environments, including realistic olfactory delivery along with audio-visual stimuli is vitally important if the virtual environments are to be used as genuine representations of real-life scenarios [43]. Furthermore, the introduction of smell in a virtual environment increases the sense of presence and enhances the level of realism [92].

8. Challenges of Olfactory Human–Computer Interfaces

Despite the advantages and advancements of olfactory display technology and applications in education and training, computer-based olfactory displays are not widely used in educational and training settings. Based on the reviewed literature, we believe there are a number of important challenges, including the following:
  • Educational olfactory displays are in their infancy. Despite the efforts made by researchers and practitioners to deliver multisensory digital experiences, there is still a long way to go before this goal is accomplished [87]. Greater research and development is still required to improve the generation and delivery of odors in educational settings. Greater work is also required regarding the short, medium, and long-term pedagogical effects of olfactory displays.
  • Most olfactory display hardware developed in past research projects and commercial devices seems cumbersome and expensive to implement and use. However, recent low-cost single-board microcontrollers such as the Arduino™ series can easily be used to create olfactory displays with a minimum of programming and setup [21,93], which may suit schools with limited budgets.
  • As Ghinea and Ademoye (2011) [16] pointed out, it can be difficult to effectively measure the user experience (UX) on the quality of olfactory data presented through olfactory displays. This is important because UX will, in turn, affect students’ motivation and engagement when using olfactory displays.
  • There is very little research on how to efficiently and promptly remove unused artificially generated odors from an educational setting (e.g., a classroom, a computing laboratory, etc.).
  • Basdogan and Loftin (2009) [25] warn that it can be challenging to generate a specific smell that suits the context of a training application. Similarly, it can be difficult to find a suitable odor that can be used effectively as a smicon that conveys abstract information.
  • To our knowledge, there are no standard or commonly used programming libraries for developing olfactory human–computer interfaces.
  • Strong and proven olfactory display design guidelines are needed to develop usable (effective, efficient and pleasant) educational smell interfaces.
  • Some researchers argue that Western culture is predominantly visual (e.g., Mirzoeff, 2009) [94], which may have slowed down the widespread use of olfactory displays. However, although Western learning styles are generally visual, this does not exclude using other learning styles or incorporating other senses in learning and training.
  • There is no consensus within olfaction research regarding how to classify odors effectively [18]. This may affect the design and development of olfactory displays with multiple odors.
  • Some users may have medical conditions that affect smell perception in computer interfaces, such as anosmia (the inability to perceive any odor) or hyposmia (a decreased ability to smell). In addition, the common cold is a usual cause of temporary or partial loss of smell.
  • The usability testing of olfactory icons and smicons can be rather challenging. Past usability testing of olfactory displays has been conducted with experienced HCI specialists as participants [42], which may affect the objectivity of the results. To evaluate olfactory displays based on smicons, it is usually necessary to train testers on the mappings between the smicons and their meanings, which may produce a smell habituation effect. In addition, results of usability testing may be affected by test environment conditions, for example, a lingering smell of chemical carpet cleaners.
  • Information overload, that is, too much sensory stimulation, at the computer interface can affect learners’ task performance. Additional modalities (including smell) with incongruent information can increase cognitive load when not carefully balanced with the main means of communication [95].
Greater work remains to be done with respect to the interaction of olfaction and the other senses in order to develop a greater understanding regarding the effect of redundant cues on cognitive load and ultimately learning.

9. Conclusions

This paper presented an overview of olfactory displays that have been proposed and researched for supporting education and training. Olfaction in educational human–computer interfaces has been largely underused, despite past research showing that it supports information memorization and recall, promotes immersion, and can complement or supplement other human sensory channels for learning, among other advantages. The paper introduced olfactory displays in human–computer interaction, discussed educational theories that support the use of olfaction in learning and training applications, and reviewed past research on olfactory displays applied to education and training. Finally, the paper summarized many benefits and challenges regarding the efficient and effective design and use of olfactory computer interfaces for learning and training. Greater research is required in order to determine the appropriate odor generation and diffusion techniques to be used in educational human–computer applications. From the literature review, we determined that more experimental research is needed to confirm the effectiveness of olfactory displays in education and training. The papers reviewed here regarding olfactory applications for learning support knowledge acquisition in early literacy, art, the sciences, the social sciences, English language learning, and history, as well as the provision of a positive environment for studying. The reviewed papers that focused on olfactory displays for training discussed their use in military, firefighting, medical, and computer programming applications. This indicates a wide variety of educational and training areas where olfactory applications have been used. The technologies and olfactory materials used in the reviewed papers are also varied, including mist diffusers, spray dispensers, fans for diffusing the smells, 3D-printed scented objects, inkjet technology, fresh mint leaves, real spices stored in jars, and devices for producing vapors of essential oils. This indicates that there is no common or unique type of olfactory technology and materials used in education and training; rather, the technology applied depended on the number of users (individual vs. group learning/training) and the context of use (such as diffusing the smell in a small or large venue).
According to the literature reviewed in this paper, the use of olfaction in educational computer applications works more effectively in conjunction with other human sensory modalities. In addition, scratch-and-sniff stickers, ultrasonic diffusers, and spray techniques for generating smells seem useful and practical for educational and training olfactory displays. Recent electronic interfaces employing microcontrollers appear to be an efficient and low-cost technique for controlling smell interfaces in educational and training applications [21,46,96]. Further research is required to determine the short-, medium-, and long-term effects of olfactory displays in education and training, and to develop solid methodologies for developing and testing olfactory displays. There is also a limited understanding of olfaction in multimodal human–computer interfaces, and more research is needed in this domain [97].

Author Contributions

Conceptualization, M.A.G.-R., B.K. and G.R.-M.; methodology, M.A.G.-R., B.K. and G.R.-M.; formal analysis, M.A.G.-R., B.K. and G.R.-M.; investigation, M.A.G.-R., B.K. and G.R.-M.; writing—original draft preparation, M.A.G.-R.; writing—review and editing, M.A.G.-R., B.K. and G.R.-M. All authors have read and agreed to the published version of the manuscript.

Funding

The financial support of the Natural Sciences and Engineering Research Council (NSERC) of Canada, in the form of individual Discovery grants to B. Kapralos, is gratefully acknowledged.

Acknowledgments

All trademarks, trade names, service marks, and logos referenced in this paper belong to their respective companies. The first author acknowledges the technical support of the Interactive Gaming Technology Lab, Algoma University. The third author acknowledges the technical support of the Writing Lab, Tecnologico de Monterrey.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kapralos, B.; Collins, K.; Uribe-Quevedo, A. The senses and virtual environments. Senses Soc. 2017, 12, 69–75. [Google Scholar] [CrossRef]
  2. Albrecht, J.; Wiesmann, M. The human olfactory system. Anatomy and physiology. Nervenarzt 2006, 77, 931–939. [Google Scholar] [CrossRef] [PubMed]
  3. Strugnell, C.; Jones, L. Consumer perceptions and opinions of fragrances in household products. Nutr. Food Sci. 1999, 99. [Google Scholar] [CrossRef]
  4. Gulas, C.S.; Bloch, P.H. Right under our noses: Ambient scent and consumer responses. J. Bus. Psychol. 1995, 10, 87–98. [Google Scholar] [CrossRef]
  5. Doty, R.L. The olfactory system and its disorders. Semin. Neurol. 2009, 29, 74–81. [Google Scholar] [CrossRef] [Green Version]
  6. Dozio, N.; Maggioni, E.; Pittera, D.; Gallace, A.; Obrist, M. May I smell your attention: Exploration of smell and sound for visuospatial attention in virtual reality. Front. Psychol. 2021, 12, 671470. [Google Scholar] [CrossRef]
  7. Aggleton, J.P.; Waskett, L. The ability of odours to serve as state-dependent cues for real-world memories: Can Viking smells aid the recall of Viking experiences? Br. J. Psychol. 1999, 90, 1–7. [Google Scholar] [CrossRef]
  8. Obrist, M.; Tuch, A.N.; Hornbæk, K. Opportunities for odor: Experiences with smell and implications for technology. In Proceedings of the CHI ‘14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; Association for Computing Machinery: New York, NY, USA, 2014. [Google Scholar] [CrossRef]
  9. Rinaldi, L.; Maggioni, E.; Olivero, N.; Maravita, A.; Girelli, L. Smelling the space around us: Odor pleasantness shifts visuospatial attention in humans. Emotion 2018, 18, 971–979. [Google Scholar] [CrossRef]
  10. Ward, P.; Davies, B.; Kooijman, D. Olfaction and the retail environment: Examining the influence of ambient scent. Serv. Bus. 1999, 1, 295–316. [Google Scholar] [CrossRef]
  11. Herz, R.S. Are odors the best cues to memory? A cross-modal comparison of associative memory stimuli. Ann. N. Y. Acad. Sci. 1998, 855, 670–674. [Google Scholar] [CrossRef]
  12. Kaye, J.N. Symbolic Olfactory Display. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2001. [Google Scholar]
  13. Yanagida, Y.; Tomono, A. Basics for olfactory display. In Human Olfactory Displays and Interfaces: Odor Sensing and Presentation; Information Science Reference: Hershey, PA, USA, 2012. [Google Scholar] [CrossRef]
  14. Chu, S.; Downes, J.J. Odour-evoked autobiographical memories: Psychological investigations of proustian phenomena. Chem. Senses 2000, 25, 111–116. [Google Scholar] [CrossRef] [Green Version]
  15. Smith, D.G.; Standing, L.; de Man, A. Verbal memory elicited by ambient odor. Percept. Mot. Ski. 1992, 74, 339–343. [Google Scholar] [CrossRef]
  16. Ghinea, G.; Ademoye, O.A. Olfaction-enhanced multimedia: Perspectives and challenges. Multimed. Tools Appl. 2011, 55, 601–626. [Google Scholar] [CrossRef] [Green Version]
  17. Kim, D.W.; Ando, H. Development of directional olfactory display. In Proceedings of the VRCAI ‘10: Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry, Seoul, Korea, 12–13 December 2010; Association for Computing Machinery: New York, NY, USA, 2010. [Google Scholar] [CrossRef]
  18. Kaye, J. Making scents: Aromatic output for HCI. Interact. Stud. Commun. Cult. 2004, 11, 48–61. [Google Scholar] [CrossRef]
  19. Emsenhuber, B. The olfactory medium—Smell in human-computer interaction. Sci. Technol. Innov. Stud. 2011, 7, 47–64. [Google Scholar]
  20. Preece, J.; Sharp, H.; Rogers, Y. Interaction Design—Beyond Human-Computer Interaction, 4th ed.; Wiley and Sons: Oxford, UK, 2015. [Google Scholar]
  21. Tsai, Y.-T.; Hsieh, D. Smellware: Olfactory Feedback for Code Smell in Software Development. Available online: https://www.semanticscholar.org/paper/Smellware%3A-Olfactory-Feedback-for-Code-Smell-in-Tsai-Hsieh/e9618d0c44798bfc272200e8338ee4e20940c09e (accessed on 13 October 2021).
  22. Drobnick, J. The Smell Culture Reader; Routledge: Abingdon-on-Thames, UK, 2006; Available online: https://www.routledge.com/The-Smell-Culture-Reader/Drobnick/p/book/9781845202132 (accessed on 13 October 2021).
  23. Ijsselsteijn, W. Presence in the past: What can we learn from media history? In Being There: Concepts, Effects and Measurement of User Presence in Synthetic Environments; IOS Press: Amsterdam, The Netherlands, 2003. [Google Scholar]
  24. Olofsson, J.K.; Niedenthal, S.; Ehrndal, M.; Zakrzewska, M.; Wartel, A.; Larsson, M. Beyond smell-O-vision: Possibilities for smell-based digital media. Simul. Gaming 2017, 48, 455–479. [Google Scholar] [CrossRef]
  25. Basdogan, C.; Loftin, R.B. Multimodal Display Systems: Haptic, Olfactory, Gustatory, and Vestibular. In The PSI Handbook of Virtual Environments for Training and Education; Praeger Security International: London, UK, 2009; pp. 116–134. [Google Scholar]
  26. Waters, J. Polyester; USA, 1981. Available online: https://www.imdb.com/title/tt0082926/ (accessed on 13 October 2021).
  27. Anderton, E. “Spy Kids 4” Hitting Theaters with an All-New Form of Smell-O-Vision. Available online: https://www.firstshowing.net/2011/spy-kids-4-hitting-theaters-with-an-all-new-form-of-smell-o-vision/ (accessed on 12 October 2021).
  28. Brumfield, C.R.; Goldney, J.; Gunning, S. Whiff!: The Revolution of Scent Communication in the Information Age; Quimby Press: Chicago, IL, USA, 2008. [Google Scholar]
  29. Bradford, K.D.; Desrochers, D.M. The use of scents to influence consumers: The sense of using scents to make cents. J. Bus. Ethics 2009, 90 (Suppl. 2), 141–153. [Google Scholar] [CrossRef] [Green Version]
  30. McCandless, K. Gender-Specific Leather and Lace: Leather Goddesses of Phobos Release 59 Review. Macworld. August 1987, pp. 159–160. Available online: https://en.wikipedia.org/wiki/Leather_Goddesses_of_Phobos (accessed on 13 October 2021).
  31. Rheingold, H. Virtual Reality; Touchstone Publishing: New York, NY, USA, 1991. [Google Scholar]
  32. Cater, J.P. Approximating the senses smell/taste: Odors in virtual reality. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, San Antonio, TX, USA, 2–5 October 1994; Volume 2. [Google Scholar]
  33. Barfield, W.; Danas, E. Comments on the use of olfactory displays for virtual environments. Presence Teleoperators Virtual Environ. 1996, 5, 109–121. [Google Scholar] [CrossRef]
  34. Tortell, R.; Luigi, D.P.; Dozois, A.; Bouchard, S.; Morie, J.F.; Ilan, D. The effects of scent and game play experience on memory of a virtual environment. Virtual Real. 2007, 11, 61–68. [Google Scholar] [CrossRef]
  35. Brewster, S.A.; McGookin, D.K.; Miller, C.A. Olfoto: Designing a smell-based interaction. In Proceedings of the CHI ‘06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 22–27 April 2006; Association for Computing Machinery: New York, NY, USA, 2006; Volume 2. [Google Scholar]
  36. Braga, N. Bionics for the Evil Genius: 25 Build-It-Yourself Projects, 1st ed.; McGraw-Hill Education Tab, Ed.; McGraw-Hill: New York, NY, USA, 2006. [Google Scholar]
  37. Nakamoto, T.; Kinoshita, M.; Murakami, K.; Yossiri, A. Demonstration of improved olfactory display using rapidly-switching solenoid valves. In Proceedings of the 2009 IEEE Virtual Reality Conference, Lafayette, LA, USA, 14–18 March 2009. [Google Scholar] [CrossRef]
  38. Sugimoto, S.; Noguchi, D.; Bannnai, Y.; Okada, K. Ink jet olfactory display enabling instantaneous switches of scents. In Proceedings of the MM ‘10: Proceedings of the 18th ACM International Conference on Multimedia, Firenze, Italy, 25–29 October 2010; Association for Computing Machinery: New York, NY, USA, 2010. [Google Scholar] [CrossRef]
  39. Czyzewski, A.; Odya, P.; Smulko, J.; Lentka, G.; Kostek, B.; Kotarski, M. Scent emitting multimodal computer interface for learning enhancement. In Proceedings of the 2010 Workshops on Database and Expert Systems Applications, Bilbao, Spain, 30 August–3 September 2010. [Google Scholar] [CrossRef]
  40. Schultz, E. Chemistry for kids: Pop-and-sniff experimentation: A high-sensory-impact teaching device. J. Chem. Educ. 1987, 64, 797. [Google Scholar] [CrossRef]
  41. McGee, L.M.; Tompkins, G.E. Concepts about print for the young blind child. Lang. Arts 1982, 59, 40–45. [Google Scholar]
  42. Yanagida, Y. Olfactory interfaces. In HCI Beyond the GUI; Kortum, P., Ed.; Morgan Kaufmann: Burlington, MA, USA, 2008; pp. 267–290. [Google Scholar]
  43. Nakamoto, T.; Yoshikawa, K. Movie with scents generated by olfactory display using solenoid valves. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. 2006, E89-A, 3327–3332. [Google Scholar] [CrossRef]
  44. Köster, E.P.; de Wijk, R.A. Olfactory adaptation. In The Human Sense of Smell; Laing, D.G., Doty, R.L., Breipohl, W., Eds.; Springer: Berlin/Heidelberg, Germany, 1991. [Google Scholar] [CrossRef]
  45. Kadowaki, A.; Sato, J.; Bannai, Y.; Okada, K.I. Presentation technique of scent to avoid olfactory adaptation. In Proceedings of the 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), Esbjerg, Denmark, 28–30 November 2007. [Google Scholar] [CrossRef]
  46. Garcia-Ruiz, M.A.; Kapralos, B.; Rebolledo-Mendez, G. Towards effective odor diffusion with fuzzy logic in an olfactory interface for a serious game. In Proceedings of the HCI International—23rd International Conference on Human-Computer Interaction, Washington, DC, USA, 24–29 July 2021. [Google Scholar]
  47. Dingler, T.; Lindsay, J.; Walker, B.N. Learnability of Sound Cues for Environmental Features: Auditory Icons, Earcons, Spearcons, and Speech. In 14th International Conference on Auditory Display. Available online: https://www.researchgate.net/publication/238494483_learnabiltiy_of_sound_cues_for_environmental_features_auditory_icons_earcons_spearcons_and_speech (accessed on 13 October 2021).
  48. Buerk, R. How the Stimulant Smell of Wasabi Can Save Lives. Available online: http://news.bbc.co.uk/2/hi/asia-pacific/8592180.stm (accessed on 12 October 2021).
  49. Schiffman, S.S. Use of olfaction as an alarm mechanism to arouse and alert sleeping individuals. Aroma-Chology Rev. 1995, 4, 2–5. [Google Scholar]
  50. Yamada, T.; Yokoyama, S.; Tanikawa, T.; Hirota, K.; Hirose, M. Wearable olfactory display: Using odor in outdoor environment. In Proceedings of the IEEE Virtual Reality Conference (VR 2006), Alexandria, VA, USA, 25–29 March 2006; Volume 2006. [Google Scholar] [CrossRef] [Green Version]
  51. Bodnar, A.; Corbett, R.; Nekrasovski, D. AROMA: Ambient awareness through olfaction in a messaging application. In Proceedings of the ICMI ‘04: Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, USA, 13–15 October 2004; Association for Computing Machinery: New York, NY, USA, 2004. [Google Scholar]
  52. Baines, L. A Teacher’s Guide to Multisensory Learning: Improving Literacy by Engaging the Senses; ASCD: Alexandria, VA, USA, 2008. [Google Scholar]
  53. Fosnot, C.T. Constructivism: Theory, Perspectives, and Practice; Teachers College Press: New York, NY, USA, 2005. [Google Scholar]
  54. Kolb, A.Y.; Kolb, D.A. Experiential learning spiral. In Encyclopedia of the Sciences of Learning; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar] [CrossRef]
  55. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development; Prentice Hall, Inc.: Upper Saddle River, NJ, USA, 1984; No. 1984. [Google Scholar] [CrossRef]
  56. Gülpinar, M.A. The principles of brain-based learning and constructivist models in education. Educ. Sci. Theory Pract. 2005, 5, 299–306. [Google Scholar]
  57. Caine, R.N.R.; Caine, G. Understanding a brain-based approach to learning and teaching. Educ. Leadersh. 1990, 48, 66–70. [Google Scholar]
  58. Holloway, M. Seeking “smart drugs”. Sci. Am. Present. 1998, 9, 39–43. [Google Scholar]
  59. Moreno, R.; Mayer, R. Interactive multimodal learning environments. Educ. Psychol. Rev. 2007, 19. [Google Scholar] [CrossRef]
  60. Sylwester, R. How emotions affect learning. Educ. Leadersh. 1994, 52, 60–65. [Google Scholar]
  61. Washburn, D.A.; Jones, L.M.; Satya, R.V.; Bowers, D.C.A.; Cortes, A. Olfactory use in virtual environment training. Model. Simul. Mag. 2003, 2, 19–25. [Google Scholar]
  62. Youngblut, C.; Johnson, R.E.; Nash, S.H.; Wienclaw, R.A.; Will, C.A. Review of Virtual Environment Interface Technology: References; Institute for Defense Analyses: Alexandria, VA, USA, 1996; Volume 35. [Google Scholar]
  63. Lai, M.K. Universal scent blackbox—Engaging visitors communication through creating olfactory experience at art museum. In Proceedings of the SIGDOC ‘15: Proceedings of the 33rd Annual International Conference on the Design of Communication, Limerick, Ireland, 16–17 July 2015; Association for Computing Machinery: New York, NY, USA, 2015. [Google Scholar] [CrossRef]
  64. Vi, C.T.; Ablart, D.; Gatti, E.; Velasco, C.; Obrist, M. Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition. Int. J. Hum. Comput. Stud. 2017, 108, 1–14. [Google Scholar] [CrossRef]
  65. Tijou, A.; Richard, E.; Richard, P. Using olfactive virtual environments for learning organic molecules. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2006; Volume 3942. [Google Scholar] [CrossRef]
  66. Richard, E.; Tijou, A.; Richard, P.; Ferrier, J.L. Multi-modal virtual environments for education with haptic and olfactory feedback. Virtual Real. 2006, 10, 207–225. [Google Scholar] [CrossRef]
  67. Miyaura, M.; Narumi, T.; Nishimura, K.; Tanikawa, T.; Hirose, M. Olfactory feedback system to improve the concentration level based on biological information. In Proceedings of the 2011 IEEE Virtual Reality Conference, Singapore, 19–23 March 2011. [Google Scholar] [CrossRef]
  68. Kwok, R.C.W.; Cheng, S.H.; Ho-Shing Ip, H.; Kong, J.S.L. Design of affectively evocative smart ambient media for learning. Comput. Educ. 2011, 56, 101–111. [Google Scholar] [CrossRef]
  69. Garcia-Ruiz, M.A.; Edwards, A.; Aquino-Santos, R.; Alvarez-Cardenas, O.; Mayoral-Baldivia, M.G. Integrating the sense of smell in virtual reality for second language learning. In Proceedings of the World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Las Vegas, NV, USA, 17 November 2008. [Google Scholar]
  70. Sauro, J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices; CreateSpace Independent Publishing Platform: Scotts Valley, CA, USA, 2011. [Google Scholar]
  71. Herz, R.S.; Beland, S.L.; Hellerstein, M. Changing odor hedonic perception through emotional associations in humans. Int. J. Comp. Psychol. 2004, 17, 315–338. [Google Scholar]
  72. Ellen, P.S.; Bone, P.F. Does it matter if it smells? Olfactory stimuli as advertising executional cues. J. Advert. 1998, 27, 29–39. [Google Scholar] [CrossRef]
  73. Ho, C.; Spence, C. Olfactory facilitation of dual-task performance. Neurosci. Lett. 2005, 389, 35–40. [Google Scholar] [CrossRef]
  74. Oliver, S. Stop and Smell the Roses: Incorporating Smell as a Multisensory Learning Tool in the University English Classroom. Teach. Innov. Proj. 2012, 2, 1–23. [Google Scholar]
  75. Covaci, A.; Ghinea, G.; Lin, C.H.; Huang, S.H.; Shih, J.L. Multisensory games-based learning—Lessons learnt from olfactory enhancement of a digital board game. Multimed. Tools Appl. 2018, 77. [Google Scholar] [CrossRef] [Green Version]
  76. Seitz, A.R.; Kim, R.; Van Wassenhove, V.; Shams, L. Simultaneous and independent acquisition of multisensory and unisensory associations. Perception 2007, 36. [Google Scholar] [CrossRef] [Green Version]
  77. Wesson, D.W.; Wilson, D.A. Smelling sounds: Olfactory-auditory sensory convergence in the olfactory tubercle. J. Neurosci. 2010, 30, 3013–3021. [Google Scholar] [CrossRef] [Green Version]
  78. Gotow, N.; Kobayakawa, T. Simultaneity judgment using olfactory-visual, visual-gustatory, and olfactory-gustatory combinations. PLoS ONE 2017, 12, e0174958. [Google Scholar] [CrossRef]
  79. Woods, A.T.; Poliakoff, E.; Lloyd, D.M.; Kuenzel, J.; Hodson, R.; Gonda, H.; Batchelor, J.; Dijksterhuis, G.B.; Thomas, A. Effect of background noise on food perception. Food Qual. Prefer. 2011, 22, 42–47. [Google Scholar] [CrossRef]
  80. Adelstein, B.D.; Begault, D.R.; Anderson, M.R.; Wenzel, E.M. Sensitivity to haptic-audio asynchrony. In Proceedings of the ICMI ‘03: Proceedings of the 5th International Conference on Multimodal Interfaces, Vancouver, BC, Canada, 5–7 November 2003; Association for Computing Machinery: New York, NY, USA, 2003. [Google Scholar] [CrossRef]
  81. Driver, J.; Spence, C. Crossmodal attention. Curr. Opin. Neurobiol. 1998, 8, 245–253. [Google Scholar] [CrossRef]
  82. Rowe, C. Receiver psychology and the evolution of multicomponent signals. Anim. Behav. 1999, 58, 921–931. [Google Scholar] [CrossRef] [Green Version]
  83. Flavián, C.; Ibáñez-Sánchez, S.; Orús, C. The influence of scent on virtual reality experiences: The role of aroma-content congruence. J. Bus. Res. 2021, 123, 289–301. [Google Scholar] [CrossRef]
  84. Klašnja-Milićević, A.; Marošan, Z.; Ivanović, M.; Savić, N.; Vesin, B. The future of learning multisensory experiences: Visual, audio, smell and taste senses. In Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2019; Volume 804. [Google Scholar] [CrossRef]
  85. Alkasasbeh, A.A.; Ghinea, G. Using olfactory media cues in E-learning—Perspectives from an empirical investigation. Multimed. Tools Appl. 2020, 79, 19265–19287. [Google Scholar] [CrossRef] [Green Version]
  86. Persky, S.; Dolwick, A.P. Olfactory perception and presence in a virtual reality food environment. Front. Virtual Real. 2020, 1, 1. [Google Scholar] [CrossRef]
  87. Petit, O.; Velasco, C.; Spence, C. Digital sensory marketing: Integrating new technologies into multisensory online experience. J. Interact. Mark. 2019, 45, 42–61. [Google Scholar] [CrossRef] [Green Version]
  88. Tomasi, D.; Ferris, H.; Booraem, P.; Enman, L.; Gates, S.; Reyns, E. Olfactory virtual reality (OVR) for wellbeing and reduction of stress, anxiety and pain. J. Med. Res. Health Sci. 2021, 4, 1212–1221. [Google Scholar]
  89. Spencer, B.S. Incorporating the sense of smell into patient and haptic surgical simulators. IEEE Trans. Inf. Technol. Biomed. 2006, 10. [Google Scholar] [CrossRef]
  90. Kent, S.J.W.; Kent, F.H.; Brown, C.W.; Morrison, I.G.; Morse, J.C. Should we add smells in simulation training? A systematic review of smells in healthcare-related simulation training. BMJ Simul. Technol. Enhanc. Learn. 2016. [Google Scholar] [CrossRef]
  91. Vlahos, J. The smell of war. Pop. Sci. 2006, 269. [Google Scholar]
  92. Narciso, D.; Bessa, M.; Melo, M.; Vasconcelos-Raposo, J. Virtual reality for training-the impact of smell on presence, cybersickness, fatigue, stress and knowledge transfer. In Proceedings of the 2019 International Conference on Graphics and Interaction (ICGI), Faro, Portugal, 21–22 November 2019; pp. 115–121. [Google Scholar]
  93. Chen, Y. Olfactory display: Development and application in virtual reality therapy. In Proceedings of the 16th International Conference on Artificial Reality and Telexistence—Workshops (ICAT’06), Hangzhou, China, 29 November–1 December 2006. [Google Scholar] [CrossRef]
  94. Mirzoeff, N. An Introduction to Visual Culture; Routledge: Abingdon-on-Thames, UK, 1999. [Google Scholar]
  95. Collins, K.; Kapralos, B. Pseudo-Haptics: Leveraging Cross-Modal Perception in Virtual Environments. Senses Soc. 2019, 14, 313–329. [Google Scholar] [CrossRef]
  96. Rebolledo-Mendez, G.; Huerta-Pacheco, N.S.; Baker, R.S.; du Boulay, B. Meta-affective behaviour within an intelligent tutoring system for mathematics. Int. J. Artif. Intell. Educ. 2021, in press. [Google Scholar] [CrossRef]
  97. Murray, N.; Lee, B.; Qiao, Y.; Muntean, G.M. Olfaction-enhanced multimedia: A survey of application domains, displays, and research challenges. ACM Comput. Surv. 2016, 48, 1–34. [Google Scholar] [CrossRef]
Figure 1. An example of a simple olfactory display (Garcia-Ruiz et al., 2021) [46].
Table 1. A summary of research conducted on educational olfactory interfaces.
Reviewed Research | Olfactory Technology Used in the Research | Experimental/Testing Main Findings
Lai (2015) [63] developed an interactive art exhibition where patrons perceived five odors | Mist diffusers | Odors worked as a powerful communication medium and complemented other senses in the artistic perception
Vi et al. (2017) [64]: visitors of London’s Tate Gallery held and smelled 3D-printed scented objects that were related to some paintings | 3D-printed scented objects | Smells were useful in supporting understanding of the paintings’ meaning and artistic renderings
Tijou et al. (2006) [65] developed a fully immersive desktop virtual reality (VR) system to investigate the effects of olfaction on learning, recall and retention of 3D structures of organic molecules | Scented gels stored in cartridges and fans | The paper demonstrated the feasibility of a multimodal VR system (including smell) used for learning in the sciences
Richard et al. (2006) [66] introduced the ‘‘Nice-smelling Interactive Multimedia Alphabet’’ project, a multimodal computer application that included olfactory, visual and auditory information | Scented gels stored in cartridges and fans | No reported research results
Miyaura et al. (2011) [67] developed an olfactory display to help learners re-engage in math tasks | Ink-jet technology with scented droplets | The researchers reported that the odors helped to decrease errors in the addition tasks
Kwok et al. (2011) [68] developed and tested a multimodal ambient room for learning with visual, auditory and olfactory stimuli | Olfactory display system with spray dispensers | Preliminary findings of post-tests applied to learners showed that the multimodal ambient room influenced students’ affective experiences, improving their learning effectiveness
Garcia-Ruiz et al. (2008) [69] developed a 3D virtual environment for learning the English language | Fresh leaves of mint (Mentha spicata) | The students perceived the usability of the multimodal virtual environment as very good. In addition, students reported that the mint odor helped them lower their anxiety when listening to the oral instructions in English
Czyzewski et al. (2010) [39] developed multimodal educational software showing animated cartoons of animals | A device that generated small drops of scented oil, previously stored in a glass pipe and rapidly released to the environment using compressed air | Initial tests helped to fix technical problems with the device and to analyze its effectiveness, although the testing results were inconclusive
Covaci et al. (2018) [75] developed a multiplayer serious game intended to teach high-school students about the seventeenth century’s Age of Discovery | Small jars containing odors of real spices and beans | Multisensory stimulation in the serious game engaged the users, potentially improving the learning process. However, pre-test and post-test knowledge questionnaire results showed that the olfactory feedback did not yield an improvement in students’ performance
Klašnja-Milićević et al. (2019) [84] investigated olfaction in a multimodal VR application for learning about the solar system | Essential oil vapors | A within-groups test showed that participants who consumed the chocolate, drank the coffee and smelled the citrus oil vapor while using the VR learning application scored higher in a knowledge pre-/post-test
Alkasasbeh and Ghinea (2020) [85] developed a multimodal website for learning about geography | Dry-air scent diffuser and fans | Post-test scores on questions for which the olfactory media were synchronized with the audiovisual media were significantly higher than those of participants who received no olfactory stimuli. However, the odor-only related questions yielded no significant difference
Table 2. A summary of research conducted on olfactory interfaces that support training.
Reviewed Research | Olfactory Technology Used in the Research | Experimental/Testing Main Findings
Cater (1994) [32] developed a virtual reality system for training firefighters | Wearable olfactory display that generated different types of smoke | Strong smoke caused extreme discomfort in trainees
Spencer (2006) [89] and Kent et al. (2016) [90] conducted literature reviews of research projects that developed and tested medical simulators incorporating olfactory displays | Various virtual reality technologies | The literature reviews highlighted the feasibility of virtual reality and olfactory displays in medical training, where this technology may support medical diagnoses and benefit the training of medical students
Vlahos (2006) [91] reported that theme park designers and the University of Southern California developed a virtual reality system with an olfactory display for training soldiers | Soldiers don a collar with wirelessly activated cartridges; the collar has four smell-soaked wicks that deliver the smells to the trainee’s nose with micro fans | Smell improved soldiers’ mental immersion in the simulated war scenario, positively supporting training
Tsai and Hsieh (2012) [21] used odors to support the training of computer programmers in improving coding style and identifying coding errors | Arduino™ microcontroller board connected to a pair of off-the-shelf household aromatizers | More than 80% of participants in a test declared that smells were useful for identifying coding errors
Narciso et al. (2019) [92] developed a virtual reality system with an olfactory display for supporting the training of firefighters | A commercial olfactory system that diffused the smell of burnt wood with compressed air | Between-groups experimental results showed that, overall, the multimodal virtual reality system supported knowledge transfer, but the smell condition did not significantly improve participants’ presence, cybersickness, fatigue, stress, or knowledge transfer
