Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

A special issue of Brain Sciences (ISSN 2076-3425). This special issue belongs to the section "Computational Neuroscience and Neuroinformatics".

Deadline for manuscript submissions: closed (20 August 2020) | Viewed by 38793

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editor


Dr. Claudio Lucchiari
Guest Editor
Department of Philosophy, University of Milan, 20122 Milan, Italy
Interests: creativity; learning; cognitive flexibility; decision-making; EEG; tDCS; neuromodulation; neurofeedback; emotions; applied cognitive science; multi-brain neuroscience

Special Issue Information

Dear Colleagues,

Brain–computer interfaces (BCI) are technological systems that enable individuals to interact with a computer using their brain activity. A BCI system can translate brain signals (e.g., electric or hemodynamic brain activity indicators) into a command to execute an action on the BCI application (e.g., a wheelchair, the cursor on the screen, a spelling device or a game).

Applications are now available in various research and clinical fields, such as communication and control tools for severely disabled users, rehabilitation, and human–computer interaction. These tools have the advantage of real-time access to the user's ongoing brain activity, which can provide insight into the user's emotional state by training a classification algorithm to recognize affective states. This information can be used to analyze cortical activity in specific areas during complex tasks, as well as to allow the BCI to adapt its recognition algorithms. Furthermore, information about emotional involvement can be used to improve the user's experience while controlling the BCI through affective modulation.
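
To make the last point concrete, the following minimal sketch shows how a classifier might be trained to recognize affective states from EEG band-power features. It is an illustrative assumption rather than a description of any particular BCI system: the data are synthetic, and the band limits, labels, and classifier choice are placeholders.

# Minimal, hypothetical sketch: classifying affective states from EEG band power.
# Synthetic data stands in for real recordings; bands and labels are illustrative.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 128                                           # sampling rate (Hz)
n_trials, n_channels, n_samples = 80, 8, fs * 4    # 4-second epochs

eeg = rng.standard_normal((n_trials, n_channels, n_samples))  # placeholder EEG
labels = rng.integers(0, 2, n_trials)                          # dummy low/high arousal labels

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch):
    """Average band power per channel, concatenated into one feature vector."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

X = np.array([band_power_features(trial) for trial in eeg])
clf = SVC(kernel="rbf")
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())

In a real system, the trained classifier would run on incoming epochs so that the BCI can adapt its behavior to the detected state.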

This Special Issue is dedicated to the study of brain activity related to emotional involvement as measured by BCI systems designed for different purposes. Papers reporting theoretical models, targeted reviews, research, and clinical applications are welcome.

Dr. Claudio Lucchiari
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Brain Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • brain computer interface
  • emotional involvement
  • brain activity
  • human–machine interaction
  • emotion detection methods
  • artificial intelligence
  • rehabilitation
  • emotional BCI control
  • psychological states

Published Papers (9 papers)


Editorial


4 pages, 635 KiB  
Editorial
Brain–Computer Interfaces: Toward a Daily Life Employment
by Pietro Aricò, Nicolina Sciaraffa and Fabio Babiloni
Brain Sci. 2020, 10(3), 157; https://doi.org/10.3390/brainsci10030157 - 09 Mar 2020
Cited by 11 | Viewed by 4686
Abstract
Recent publications in the electroencephalogram (EEG)-based brain–computer interface field suggest that this technology could be ready to move outside research labs and enter the market as a new consumer product. This view is supported by recent advances in front-end graphical user interfaces, back-end classification algorithms, and hardware, including wearable devices and dry EEG sensors. This editorial highlights these aspects, starting from the review paper “Brain–Computer Interface Spellers: A Review” (Rezeika et al., 2018), published in Brain Sciences, and citing other relevant review papers that discuss these points.

Research


17 pages, 8090 KiB  
Article
Lightweight Building of an Electroencephalogram-Based Emotion Detection System
by Abeer Al-Nafjan, Khulud Alharthi and Heba Kurdi
Brain Sci. 2020, 10(11), 781; https://doi.org/10.3390/brainsci10110781 - 26 Oct 2020
Cited by 16 | Viewed by 3418
Abstract
Brain–computer interface (BCI) technology provides a direct interface between the brain and an external device. BCIs have facilitated the monitoring of conscious brain electrical activity via electroencephalogram (EEG) signals and the detection of human emotion. Recently, great progress has been made in developing novel paradigms for EEG-based emotion detection, and these studies have also attempted to apply BCI research findings in varied contexts. Advances in BCI technologies have attracted growing scientific interest because their practical applications in human–machine interaction seem promising. This underscores the need for a lightweight process for building EEG-based emotion detection systems, one that uses a smaller EEG dataset and does not rely on feature extraction methods. In this study, we investigated the feasibility of using a spiking neural network to build an emotion detection system from a smaller version of the DEAP dataset, without feature extraction, while maintaining reasonable accuracy. The results showed that, using a NeuCube-based spiking neural network, we could detect the valence emotion level from only 60 EEG samples with 84.62% accuracy, comparable to previous studies.
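
NeuCube itself is a separate software framework, but a typical first step in spiking-network pipelines of this kind is converting continuous EEG into spike trains. The sketch below shows one common scheme, threshold-based (temporal-contrast) spike encoding, on synthetic data; it is a generic illustration and an assumption on our part, not the encoding used by the authors.

# Illustrative threshold-based (temporal-contrast) spike encoding of one EEG channel,
# a common preprocessing step before feeding data to a spiking neural network.
# Generic sketch on synthetic data; not the NeuCube implementation from the paper.
import numpy as np

def tbr_encode(signal, factor=0.5):
    """Emit +1/-1 spikes where the sample-to-sample change exceeds a threshold."""
    diff = np.diff(signal)
    threshold = np.abs(diff).mean() + factor * diff.std()
    spikes = np.zeros_like(diff, dtype=int)
    spikes[diff > threshold] = 1     # positive (onset) spikes
    spikes[diff < -threshold] = -1   # negative (offset) spikes
    return spikes

rng = np.random.default_rng(1)
eeg_channel = rng.standard_normal(512)   # placeholder single-channel EEG
spike_train = tbr_encode(eeg_channel)
print("spike rate:", np.abs(spike_train).mean())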

22 pages, 4020 KiB  
Article
An Emotion Assessment of Stroke Patients by Using Bispectrum Features of EEG Signals
by Choong Wen Yean, Wan Khairunizam Wan Ahmad, Wan Azani Mustafa, Murugappan Murugappan, Yuvaraj Rajamanickam, Abdul Hamid Adom, Mohammad Iqbal Omar, Bong Siao Zheng, Ahmad Kadri Junoh, Zuradzman Mohamad Razlan and Shahriman Abu Bakar
Brain Sci. 2020, 10(10), 672; https://doi.org/10.3390/brainsci10100672 - 25 Sep 2020
Cited by 17 | Viewed by 2631
Abstract
Emotion assessment in stroke patients gives physiotherapists meaningful information for identifying the appropriate method of treatment. This study aimed to classify the emotions of stroke patients by applying bispectrum features to electroencephalogram (EEG) signals. EEG signals from three groups of subjects, namely stroke patients with left brain damage (LBD), stroke patients with right brain damage (RBD), and normal controls (NC), were analyzed for six different emotional states. The estimated bispectra, mapped as contour plots, show how the nonlinearity of the EEG signals differs across emotional states. Bispectrum features were extracted from the alpha (8–13 Hz), beta (13–30 Hz), and gamma (30–49 Hz) bands. The k-nearest neighbor (KNN) and probabilistic neural network (PNN) classifiers were used to classify the six emotions in the LBD, RBD, and NC groups. The bispectrum features were statistically significant for all three groups. The beta band was the best-performing EEG sub-band for emotion classification, and the combination of the alpha to gamma bands provided the highest classification accuracy with both the KNN and PNN classifiers. Sadness yielded the highest classification accuracy: 65.37% in the LBD, 71.48% in the RBD, and 75.56% in the NC groups.
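
For readers unfamiliar with bispectral analysis, the direct (FFT-based) estimate averages triple products of Fourier coefficients over segments, B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)]. The sketch below computes such an estimate for a synthetic EEG segment and feeds crude summary features to a KNN classifier; the segment length, features, and classifier settings are illustrative assumptions, not the authors' exact pipeline.

# Direct (FFT-based) bispectrum estimate and a KNN classifier on simple bispectral
# summary features. Synthetic data; parameters are illustrative, not the paper's.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def bispectrum(x, seg_len=128):
    """Average B(f1, f2) = X(f1) X(f2) conj(X(f1+f2)) over non-overlapping segments."""
    n_segs = len(x) // seg_len
    nf = seg_len // 2
    B = np.zeros((nf, nf), dtype=complex)
    for s in range(n_segs):
        X = np.fft.fft(x[s * seg_len:(s + 1) * seg_len])
        for f1 in range(nf):
            for f2 in range(nf):
                B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return B / max(n_segs, 1)

def bispectral_features(x):
    B = np.abs(bispectrum(x))
    return np.array([B.mean(), B.max(), B.sum()])   # crude summary features

rng = np.random.default_rng(2)
signals = rng.standard_normal((40, 512))   # placeholder EEG segments
labels = rng.integers(0, 2, 40)             # dummy emotion labels
X = np.array([bispectral_features(s) for s in signals])
knn = KNeighborsClassifier(n_neighbors=5).fit(X[:30], labels[:30])
print("held-out accuracy:", knn.score(X[30:], labels[30:]))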

16 pages, 2131 KiB  
Article
Independent Components of EEG Activity Correlating with Emotional State
by Yasuhisa Maruyama, Yousuke Ogata, Laura A. Martínez-Tejada, Yasuharu Koike and Natsue Yoshimura
Brain Sci. 2020, 10(10), 669; https://doi.org/10.3390/brainsci10100669 - 25 Sep 2020
Cited by 5 | Viewed by 2508
Abstract
Among brain–computer interface studies, electroencephalography (EEG)-based emotion recognition is receiving attention, and some studies have performed regression analyses to recognize small-scale emotional changes; however, the brain regions that are effective in emotion regression analyses have not yet been identified. Accordingly, this study sought to identify neural activities correlating with emotional states in the source space. We employed independent component analysis, followed by a source localization method, to obtain distinct neural activities from EEG signals. After seven independent component (IC) clusters were identified by k-means clustering, group-level regression analyses using the frequency band power of the ICs were performed based on Russell’s valence–arousal model. In the regression of the valence level, an IC cluster located in the cuneus predicted both high- and low-valence states, and two other IC clusters, located in the left precentral gyrus and the precuneus, predicted the low-valence state. In the regression of the arousal level, the IC cluster located in the cuneus predicted both high- and low-arousal states, and two posterior IC clusters, located in the cingulate gyrus and the precuneus, predicted the high-arousal state. In this proof-of-concept study, we revealed neural activities correlating with specific emotional states across participants, despite individual differences in emotional processing.
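
A much-simplified version of this pipeline is sketched below: FastICA unmixes the EEG channels, alpha-band power is computed per component, and continuous valence ratings are regressed on those powers. The data are synthetic and the analysis stays at the sensor level; the source localization, clustering, and group-level statistics of the actual study are not reproduced here.

# Simplified sketch of the ICA -> band power -> regression idea on synthetic data.
# Not the authors' source-localized, cluster-level analysis.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
fs = 128
n_trials, n_channels, n_samples = 60, 16, fs * 3
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
valence = rng.uniform(1, 9, n_trials)               # dummy self-report ratings

ica = FastICA(n_components=8, random_state=0, max_iter=1000)
ica.fit(np.hstack(eeg).T)                            # fit unmixing on concatenated trials

def alpha_power_per_component(trial):
    sources = ica.transform(trial.T).T               # (components, samples)
    freqs, psd = welch(sources, fs=fs, nperseg=fs)
    mask = (freqs >= 8) & (freqs <= 13)
    return psd[:, mask].mean(axis=1)

X = np.array([alpha_power_per_component(t) for t in eeg])
reg = LinearRegression().fit(X, valence)
print("R^2 on training data:", reg.score(X, valence))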

21 pages, 4696 KiB  
Article
In-Ear EEG Based Attention State Classification Using Echo State Network
by Dong-Hwa Jeong and Jaeseung Jeong
Brain Sci. 2020, 10(6), 321; https://doi.org/10.3390/brainsci10060321 - 26 May 2020
Cited by 23 | Viewed by 5094
Abstract
It is important to maintain attention when carrying out significant daily-life tasks that require high levels of safety and efficiency. Since degradation of attention can sometimes have dire consequences, various brain activity measurement devices, such as electroencephalography (EEG) systems, have been used to monitor attention states in individuals. However, conventional EEG instruments have limited utility in daily life because they are uncomfortable to wear. This study was therefore designed to investigate the possibility of discriminating between attentive and resting states using in-ear EEG signals, for potential application in portable, convenient earphone-shaped EEG instruments. We recorded both on-scalp and in-ear EEG signals from six subjects during a visual vigilance task, using custom in-ear EEG electrodes modelled on each subject's left and right ear canals. We used an echo state network (ESN), a powerful type of machine learning algorithm, to discriminate attention states on the basis of the in-ear EEGs, and found that the maximum average accuracy of the ESN in discriminating between attentive and resting states was approximately 81.16% with optimal network parameters. This study suggests that portable in-ear EEG devices and an ESN can be used to monitor attention states during significant tasks to enhance safety and efficiency.
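
For readers unfamiliar with echo state networks, the core idea is a fixed random recurrent reservoir whose states are decoded by a simple trained linear readout. The sketch below implements that idea on synthetic feature sequences; the reservoir size, spectral radius, leak rate, and ridge readout are illustrative assumptions, not the authors' configuration.

# Minimal echo state network (ESN): a fixed random reservoir driven by an input
# sequence, with only a ridge-regression readout trained on the final state.
# Synthetic sequences; hyperparameters are illustrative, not the paper's.
import numpy as np
from sklearn.linear_model import RidgeClassifier

rng = np.random.default_rng(4)
n_inputs, n_reservoir = 4, 200
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.standard_normal((n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # scale spectral radius to 0.9

def reservoir_state(sequence, leak=0.3):
    """Run the sequence through the reservoir and return the final state."""
    x = np.zeros(n_reservoir)
    for u in sequence:                                # u: one time step, shape (n_inputs,)
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
    return x

# Placeholder "EEG feature" sequences (e.g., band powers over time) and dummy labels.
sequences = rng.standard_normal((60, 100, n_inputs))
labels = rng.integers(0, 2, 60)                       # 0 = resting, 1 = attentive (dummy)
states = np.array([reservoir_state(s) for s in sequences])
readout = RidgeClassifier(alpha=1.0).fit(states[:45], labels[:45])
print("held-out accuracy:", readout.score(states[45:], labels[45:]))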

14 pages, 1512 KiB  
Article
Abnormal Emotional Processing and Emotional Experience in Patients with Peripheral Facial Nerve Paralysis: An MEG Study
by Mina Kheirkhah, Stefan Brodoehl, Lutz Leistritz, Theresa Götz, Philipp Baumbach, Ralph Huonker, Otto W. Witte, Gerd Fabian Volk, Orlando Guntinas-Lichius and Carsten M. Klingner
Brain Sci. 2020, 10(3), 147; https://doi.org/10.3390/brainsci10030147 - 04 Mar 2020
Cited by 5 | Viewed by 3029
Abstract
Abnormal emotional reactions of the brain in patients with facial nerve paralysis have not yet been reported. This study investigates this issue by applying a machine-learning algorithm that discriminates brain emotional activities belonging either to patients with facial nerve paralysis or to healthy controls. Beyond this, we assessed an emotion rating task to determine whether the two groups differ in their experience of emotions. MEG signals of 17 healthy controls and 16 patients with facial nerve paralysis were recorded in response to picture stimuli in three different emotional categories (pleasant, unpleasant, and neutral). The machine learning technique selected in this study was logistic regression with LASSO regularization. We demonstrated significant classification performance in all three emotional categories. The best classification performance was achieved with features based on event-related fields in response to the pleasant category, with an accuracy of 0.79 (95% CI: 0.70, 0.82). We also found that patients with facial nerve paralysis rated pleasant stimuli significantly more positively than healthy controls. Our results indicate that the inability to produce facial expressions due to peripheral motor paralysis of the face might cause abnormal brain emotional processing and an altered experience of particular emotions.
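
The classifier family named here, logistic regression with LASSO (L1) regularization, is available off the shelf. The sketch below shows how such a model could be cross-validated on feature vectors standing in for event-related-field features; the data, preprocessing, and regularization strength are placeholders, not the study's pipeline.

# Sketch of L1-regularized (LASSO) logistic regression with cross-validation.
# Features and labels are synthetic placeholders for ERF features and group labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.standard_normal((33, 50))         # 33 participants x 50 placeholder features
y = np.r_[np.zeros(17), np.ones(16)]       # 17 controls, 16 patients (group sizes as reported)

clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5),
)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("mean cross-validated accuracy:", scores.mean())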

21 pages, 961 KiB  
Article
Peak Detection with Online Electroencephalography (EEG) Artifact Removal for Brain–Computer Interface (BCI) Purposes
by Mihaly Benda and Ivan Volosyak
Brain Sci. 2019, 9(12), 347; https://doi.org/10.3390/brainsci9120347 - 29 Nov 2019
Cited by 13 | Viewed by 4398
Abstract
Brain–computer interfaces (BCIs) measure brain activity and translate it to control computer programs or external devices. However, the activity generated by the BCI makes measurements for objective fatigue evaluation very difficult, and the situation is further complicated by various movement artefacts. BCI performance could be increased if an online method existed to measure fatigue objectively and accurately. In this paper, a novel automatic online artefact removal technique is used to filter out movement artefacts while BCI users are moving, and the effects of this filter on BCI performance, and in particular on peak frequency detection during BCI use, are investigated. A successful peak alpha frequency measurement can help determine objective user fatigue more accurately. Fifteen subjects performed various imagined and actual movements in separate tasks while fourteen electroencephalography (EEG) electrodes were recorded. Afterwards, a steady-state visual evoked potential (SSVEP)-based BCI speller was used, and the users were instructed to perform various movements. An offline curve-fitting method was used for alpha peak detection to assess the effect of the artefact filtering. The filter improved peak detection, finding 10.91% and 9.68% more alpha peaks during plain EEG recordings and BCI use, respectively. As expected, BCI performance deteriorated with movement, and also with artefact removal. Average information transfer rates (ITRs) were 20.27 bit/min, 16.96 bit/min, and 14.14 bit/min for (1) the movement-free, (2) the moving and unfiltered, and (3) the moving and filtered scenarios, respectively.
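
ITR figures of this kind are conventionally computed with the Wolpaw formula, ITR = (log2 N + P log2 P + (1 - P) log2((1 - P)/(N - 1))) * 60/T bits per minute, where N is the number of selectable targets, P the classification accuracy, and T the time per selection in seconds. The helper below evaluates that formula; whether the authors used exactly this variant is not stated here, and the example numbers are arbitrary rather than taken from the paper.

# Standard Wolpaw information transfer rate (ITR) in bits per minute.
import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw ITR; accuracy must lie in (0, 1]."""
    if accuracy >= 1.0:
        bits = math.log2(n_targets)
    else:
        bits = (math.log2(n_targets)
                + accuracy * math.log2(accuracy)
                + (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1)))
    return bits * 60.0 / seconds_per_selection

# Arbitrary example: a 28-target speller at 85% accuracy and 7 s per selection.
print(round(itr_bits_per_min(28, 0.85, 7.0), 2), "bit/min")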

18 pages, 1012 KiB  
Article
Exploring Shopper’s Browsing Behavior and Attention Level with an EEG Biosensor Cap
by Dong-Her Shih, Kuan-Chu Lu and Po-Yuan Shih
Brain Sci. 2019, 9(11), 301; https://doi.org/10.3390/brainsci9110301 - 31 Oct 2019
Cited by 5 | Viewed by 3642
Abstract
The online shopping market is developing rapidly, so it is important for retailers and manufacturers to understand how consumers behave online compared to in brick-and-mortar stores. Retailers want consumers to spend time shopping, browsing, and searching for products in the hope that a purchase is made. On the other hand, consumers may want to restrict the duration of their stay on websites because of the perceived risk of losing time or convenience. This phenomenon creates pressure to reduce the duration of a consumer's stay on a website (i.e., time pressure). In this paper, the browsing behavior and attention span of shoppers engaging in online shopping under time pressure were investigated. Attention and meditation levels were measured with an electroencephalogram (EEG) biosensor cap. The results indicated that shoppers engaging in online shopping under time pressure are less attentive, so marketers may need to find strategies to increase shoppers' attention. Shoppers unfamiliar with the product catalogs on shopping websites are also less attentive; marketers should therefore adopt an interesting style for product catalogs to hold a shopper's attention. We discuss our findings and outline their business implications.

Review


29 pages, 2667 KiB  
Review
Advances in Multimodal Emotion Recognition Based on Brain–Computer Interfaces
by Zhipeng He, Zina Li, Fuzhou Yang, Lei Wang, Jingcong Li, Chengju Zhou and Jiahui Pan
Brain Sci. 2020, 10(10), 687; https://doi.org/10.3390/brainsci10100687 - 29 Sep 2020
Cited by 59 | Viewed by 8437
Abstract
With the continuous development of portable noninvasive human sensor technologies such as brain–computer interfaces (BCIs), multimodal emotion recognition has attracted increasing attention in the area of affective computing. This paper discusses the progress of research into multimodal emotion recognition based on BCI and reviews three types of multimodal affective BCI (aBCI): aBCI based on a combination of behavior and brain signals, aBCI based on various hybrid neurophysiological modalities, and aBCI based on heterogeneous sensory stimuli. For each type of aBCI, we review several representative multimodal aBCI systems, including their design principles, paradigms, algorithms, experimental results, and corresponding advantages. Finally, we identify several important issues and research directions for multimodal emotion recognition based on BCI.
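
One recurring pattern in such multimodal systems is feature-level fusion, where features from brain signals and a second modality are concatenated before a single classifier is trained. The sketch below shows that pattern with synthetic EEG and hypothetical facial-expression feature vectors; it is a generic illustration, not any specific system from the review.

# Generic feature-level fusion for a multimodal affective BCI: concatenate EEG
# features with features from a second modality and train one classifier on the
# joint vector. Synthetic data only; feature names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_trials = 120
eeg_features = rng.standard_normal((n_trials, 32))     # e.g., band powers
face_features = rng.standard_normal((n_trials, 10))    # e.g., action-unit intensities
labels = rng.integers(0, 3, n_trials)                   # dummy emotion classes

fused = np.hstack([eeg_features, face_features])         # feature-level fusion
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("fused cross-validated accuracy:", cross_val_score(clf, fused, labels, cv=5).mean())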
