
Investigation of Visual Stimulus Signals Using Hue Change for SSVEP

1. Division of Informatics, Tokyo City University, 1-28-1 Tamazutsumi, Setagaya-ku, Tokyo 158-8557, Japan
2. Department of Computer Science, Tokyo City University, 1-28-1 Tamazutsumi, Setagaya-ku, Tokyo 158-8557, Japan
3. Department of Medical Engineering, Tokyo City University, 1-28-1 Tamazutsumi, Setagaya-ku, Tokyo 158-8557, Japan
* Author to whom correspondence should be addressed.
Academic Editor: Mufti Mahmud
Received: 23 November 2020 / Revised: 17 January 2021 / Accepted: 20 January 2021 / Published: 25 January 2021

Abstract

This study focuses on the problem of eye irritation when measuring steady-state visual evoked potentials (SSVEPs) using a brain–computer interface and aims to experimentally clarify which visual stimulus signals do not cause discomfort to users. To this end, a method is proposed that introduces a flash stimulus whose color is changed by shifting its hue. This reduces the change in brightness while still providing a color change, thereby facilitating visual stimulation with less discomfort. In the experiments conducted, flash stimuli of the primary colors red, green, and blue, and of colors with hues 5–45° from these primary colors, were generated to investigate the algorithm accuracy of SSVEP and the resulting discomfort. Subjective questionnaire responses and CFF values, an ophthalmic parameter, were obtained from the subjects and compared with the discrimination rate. The comparison confirmed that the fatigue level of the visual stimulus generated by the proposed hue change was lower than that of the conventional black-and-white stimulus. It was also confirmed that appropriate combinations of hue difference and frequency could achieve the same discrimination rate as the conventional method.
Keywords: brain–computer interface; electroencephalogram; analogous colors; steady-state visual evoked potential; canonical correlation analysis

1. Introduction

The development of interfaces that can be intuitively operated in various electronic devices is advancing, and various devices have been developed that make it easy for the general public to understand how to operate them through touch panels and other means. However, because most human–machine interfaces, including those that allow such intuitive manipulation, are often difficult to operate for severely disabled persons, the use of electroencephalograms (EEGs), which are also available for physical disabilities, has been attracting increasing attention [1].
There are two ways to measure EEG: (1) invasive brain–machine interface/brain–computer interface (BMI/BCI) in which electrodes are implanted directly into the brain; (2) non-invasive BMI/BCI in which electrodes are placed on the scalp and EEG is extracted. In recent years, noninvasive BMI/BCI has been attracting more attention because it is less burdensome on the subject [2].
One commonly employed BCI method uses visual stimuli, such as fast-blinking optical signals [3]. EEG can register responses to visual stimuli, and the potential generated in the visual cortex of the cerebral cortex by a visual stimulus is called a visual evoked potential (VEP). Visual stimuli are frequently presented to elicit visual evoked potentials, and the potential generated by a periodic visual stimulus is called a steady-state visual evoked potential (SSVEP); it is tuned to the same frequency and harmonic components as the presented periodic stimulus [4]. Thus, SSVEP can be used to identify the visual stimulus being observed by the subject, and it can serve in an input interface by providing multiple visual stimuli, each with its own meaning.
In general, a flash stimulus is used as the visual stimulus for generating SSVEP [3]. Flash stimuli include red light-emitting diodes and periodically inverted black-and-white images shown on a display. In recent years, exploiting the fact that flash stimuli can be readily prepared, liquid crystal displays (LCDs) have been used to show black-and-white flash stimuli, and research on game manipulation by BCI has been conducted [5]. However, the brightness of such flash stimulus signals is intense and causes eye irritation. It thus causes fatigue and discomfort, is difficult to use for extended periods, and carries a risk of inducing photosensitive epilepsy [4].
Therefore, methods causing less irritation and less discomfort are actively being studied. For example, an attempt has been made to reduce fatigue and discomfort by changing the luminance value of the flash stimulus as a sinusoidal or triangular wave and decreasing the abrupt change in brightness [6].
In this article, we focus on the report [7] that asthenopia and mental fatigue during visual display terminal work are related to display colors, and we propose a visual stimulation method that does not cause eye fatigue or discomfort by generating visual stimulation signals using the three primary colors of light in the flash stimulus. The proposed method produces a flash stimulus by changing the hue of the stimulus color, which was conventionally changed by varying the brightness.
The aim of this paper is to experimentally clarify the visual stimulus signals that cause the least discomfort to users by presenting the proposed visual stimulus signals for each of the three primary colors and administering a subjective questionnaire about the discomfort caused by each color.

2. Materials and Methods

2.1. Visual Stimuli Using Color

Color-based visual stimuli produce flash stimuli by altering the hue along the hue ring of the hue, saturation, value (HSV) color space. The hue ring has a value of 0–360°, with red at 0°, green at 120°, and blue at 240°. In a hue ring, colors within a 30° angle difference from the hue of one particular color (the reference color) are called analogous colors. An analogous color can be recognized as a different color by the human eye and thus can serve as a flash stimulus. In addition, a flash stimulus that changes the hue can switch to another color without altering the luminance value. Therefore, it is possible to reduce visual discomfort by determining the target color of blinking, which is paired with the reference color, from the hue change. A time sequence of flash stimuli when red (0°) is used as the reference color is shown in Figures 1 and 2.
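As a minimal sketch of this idea (using Python's standard colorsys module; the helper name hue_pair is ours, not from the paper), a reference color and its hue-shifted partner can be generated while holding saturation and value, and hence brightness, fixed:

```python
import colorsys

def hue_pair(reference_hue_deg, hue_diff_deg, s=1.0, v=1.0):
    """Return RGB triples (components in 0-1) for a reference color and
    its hue-shifted pair; saturation and value stay unchanged, so only
    the hue differs between the two flash colors."""
    ref = colorsys.hsv_to_rgb(reference_hue_deg / 360.0, s, v)
    tgt = colorsys.hsv_to_rgb(((reference_hue_deg + hue_diff_deg) % 360) / 360.0, s, v)
    return ref, tgt

# Red (0 deg) paired with its 30 deg analogous color (an orange-ish hue)
ref, tgt = hue_pair(0, 30)
```

Because only the hue coordinate changes, both colors of the blinking pair keep the same HSV value, which is what reduces the luminance swing of the flash.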

2.2. Visual Stimuli on Display

An LCD (LG W2363D-PF) with a refresh rate of 120 Hz was used for the stimulation display. The LCD was placed 50 cm in front of the participant’s face, and nine square visual stimuli, 15 mm on one side, were placed in a 3 × 3 matrix (15 mm spacing). An example of such a visual stimulation is shown in Figure 3.
Visual stimuli were displayed with vertical synchronization using OpenGL and were generated as a set of nine signals: 30, 20, 15, 12, 10, 7.5, 6, 5, and 4 Hz. On an LCD with a refresh rate of 120 Hz, vertical synchronization allows exactly 120 frames to be displayed per second. For a 12 Hz signal, 12 waves are generated per second, and each wave comprises a total of 10 frames—five white and five black frames. In the case of flash stimulation with hue change, a total of 10 frames—five red (0°) frames and five frames of the analogous color (30°) of red—are regarded as one wave.
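The frame arithmetic above generalizes to all nine frequencies. A sketch (our own helper, assuming the 120 Hz refresh rate stated above):

```python
def frame_schedule(stim_freq_hz, refresh_hz=120):
    """Frames of the reference color and of the target color that make up
    one period of a flash stimulus on a vertically synchronized display."""
    frames_per_period = refresh_hz / stim_freq_hz
    if frames_per_period != int(frames_per_period) or int(frames_per_period) % 2 != 0:
        raise ValueError("stimulus frequency must divide the refresh rate into an even frame count")
    half = int(frames_per_period) // 2
    return half, half

# All nine signal frequencies fit a 120 Hz refresh evenly:
for f in (30, 20, 15, 12, 10, 7.5, 6, 5, 4):
    print(f, frame_schedule(f))   # e.g. 12 Hz -> (5, 5): five frames per color
```

Note that every frequency in the set divides 120 into an even number of frames, so each period splits exactly in half between the two colors.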

2.3. Obtaining and Analyzing SSVEP

Emotiv Epoc+, a noninvasive electroencephalogram measuring device, was used to measure the EEG. Its electrodes are dedicated saline-based wet electrodes that contact the scalp through electrolyte-impregnated sponges at 14 sites: AF3, AF4, F7, F8, F3, F4, FC5, FC6, T7, T8, P7, P8, O1, O2. SSVEP can be measured using the electrodes close to the occipital region, where the visual cortex is located. A previous report [8] described the measurement of SSVEP using the Emotiv Epoc+ electrodes at the occipital region (O1, O2). The occipital electrodes (O1, O2) were also used in this study.
In the analysis of the acquired EEG data, frequency information is extracted from the multi-channel EEG using canonical correlation analysis (CCA) [9]. In CCA, the data of each signal can be decomposed into a component unique to that signal, a component common across signals, and noise, and only the component common to the multiple signal channels is extracted. Accurate SSVEP can thus be obtained by extracting the common EEG component from the two channels recorded at the two sites (O1, O2).
For CCA, a reference signal was generated for each of the frequencies 30, 20, 15, 12, 10, 7.5, 6, 5, and 4 Hz. CCA was used to calculate the canonical correlation coefficient between the EEG data and each of the nine artificial reference signals. The extracted frequency is that of the reference signal with the highest correlation coefficient, obtained from the EEG data over multiple trials. The accuracy is then calculated from Equation (1):
accuracy = corr / N, (1)
where corr is the number of trials in which the correct frequency was predicted, and N is the total number of trials.
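The CCA step and the accuracy calculation can be sketched as follows. This is a NumPy sketch under our own assumptions (the paper used MATLAB, and details such as the number of harmonics in the reference signals are not specified), not the authors' implementation:

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between two sample-by-channel
    matrices, computed via QR decomposition and SVD."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def reference_signals(freq, n_samples, fs=256, n_harmonics=2):
    """Sine/cosine reference set at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def detect_frequency(eeg, candidates=(30, 20, 15, 12, 10, 7.5, 6, 5, 4), fs=256):
    """Pick the candidate whose reference signals correlate best with the EEG."""
    n = eeg.shape[0]
    return max(candidates, key=lambda f: max_canonical_corr(eeg, reference_signals(f, n, fs)))

def accuracy(corr, N):
    """Equation (1): fraction of trials with the correct prediction."""
    return corr / N
```

For example, two noisy O1/O2-like channels sharing a 7.5 Hz component are identified as 7.5 Hz, and ten trials with eight correct predictions give accuracy(8, 10) = 0.8.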

2.4. Measurement of Visual Fatigue

Visual fatigue was assessed by measuring and evaluating subjective responses (12 symptoms covering general discomfort and eye fatigue), the perceived flicker of the signal after each set of colors, and critical fusion frequency (CFF) values, an ophthalmic parameter, before and after the experiment. Twelve symptoms [10] of common discomfort and eye fatigue were scored on five-point scales in a subjective questionnaire (1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe). The subjective evaluation of the signal flicker when observing the flashing signal, called the flicker rating, was rated on three levels (1 = none, 2 = slight, 3 = clear). The frequency threshold at which a blinking sensation is perceived is called the CFF value [11,12]. The CFF value has been reported to decrease with the accumulation of fatigue [13]; therefore, it is widely used in ergonomics and occupational health as a test for mental fatigue. In our experiment, the CFF value was measured with a red LED by decreasing the frequency by 1 Hz every 5 s from 45 Hz and recording the frequency at which a blinking sensation was first perceived. Measurements were performed on both eyes, and the measured CFF values were evaluated by comparing the changes before and after the experiment.
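The descending measurement procedure can be sketched as a simple schedule generator (our own sketch; stop_hz is an assumed floor, since the paper does not state a lower bound):

```python
def cff_descending_schedule(start_hz=45, step_hz=1, dwell_s=5, stop_hz=20):
    """Yield (elapsed seconds, LED frequency) steps of a descending CFF
    measurement: the frequency drops by 1 Hz every 5 s from 45 Hz until
    the subject first perceives flicker (stop_hz is an assumed floor)."""
    t, f = 0, start_hz
    while f >= stop_hz:
        yield t, f
        t += dwell_s
        f -= step_hz
```

The recorded CFF value is the frequency at the step where flicker is first reported.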

2.5. Preparation of Experiment

The flash stimulus was altered using pairs of colors with hue differences of 5° to 45°, based on the three primary colors red (0°), green (120°), and blue (240°), to determine the signal with the highest SSVEP recognition accuracy and no discomfort. As described above, the visual stimulus consisted of nine signals at 30, 20, 15, 12, 10, 7.5, 6, 5, and 4 Hz displayed as a set.
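The resulting experimental conditions can be enumerated directly (the color names and variable names are ours, for illustration):

```python
BASE_HUES = {"red": 0, "green": 120, "blue": 240}   # degrees on the hue ring
HUE_DIFFS = range(5, 50, 5)                         # 5 to 45 degrees in 5-degree steps
FREQUENCIES = (30, 20, 15, 12, 10, 7.5, 6, 5, 4)    # Hz, displayed as one set

# Each condition pairs a reference hue with a hue-shifted target color.
conditions = [(name, base, (base + d) % 360)
              for name, base in BASE_HUES.items()
              for d in HUE_DIFFS]
print(len(conditions))  # 27 hue pairs, each presented at all nine frequencies
```

This yields the 27 reference/target pairs (R5–R45, G5–G45, B5–B45) examined in Sections 3.2–3.4.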
Two parts of the occipital region (O1 and O2) were used for the EEG measurements. The application supplied with Emotiv Epoc+ was used to adjust the contact of all electrodes to the best (green) condition so that they were optimally seated on the scalp, allowing optimal EEG data to be obtained. The Emotiv Epoc+ EEG data were transmitted to the PC via Bluetooth at a sampling rate of 256 Hz.
To acquire the EEG data, the subject sat in front of the display showing the visual stimuli (Figure 3) and moved the gaze point from the 30 Hz stimulus down to the 4 Hz stimulus based on the examiner’s cue. The EEG data were collected over 10 trials.
The CCA-based analysis program, developed using MATLAB, extracted frequencies from the EEG data and calculated the percentage of correct answers.
From the results identified from the EEG data, we compared the rate of correct answers and the change in fatigue before and after the experiment among the participants. In accordance with the Declaration of Helsinki, the subjects were informed about the experiment in writing and orally in advance, and their signed consent was obtained. Six subjects (five males and one female, all in their twenties) participated in this experimental study, and EEG data were obtained from them. The subjects were all healthy, with either normal or corrected eyesight and normal color discrimination.

3. Results

This section presents the results of the conventional black-and-white flash stimulus signals and the colors red (0°), green (120°), and blue (240°), separately.

3.1. Black–White

The following results were obtained from the conventional method [3]. The algorithm accuracy rate of SSVEP when the black-and-white flash stimuli were changed is shown in Figure 4. The K in the legend indicates the black-and-white flash signals. The vertical axis represents the accuracy rate, and the horizontal axis represents the frequency of the flash stimuli.
The results of the subjective questionnaire conducted after the experiment are shown in Figure 5. All subjects reported that the flicker rating was clear.
Figure 6 shows the change in CFF levels before and after the experiments with black-and-white flash stimulation.

3.2. Red

The accuracy rate of SSVEP when the paired flash stimuli are changed in increments of 5° between red (0°) and 5° to 45° is shown in Figure A1. R5 in the legend shows a flash signal with a color 5° from reference red (0°), R10 shows a flash signal with a color 10° from reference red (0°), and the other legends are created similarly. The results of the subjective questionnaire conducted after the experiment are shown in Figure A2. Figure 7 shows a graph of the algorithm accuracy rate for each frequency with the average value calculated for each hue difference, compared to the conventional method. Figure 7 also shows the flicker rating of the questionnaire results of the conventional method and the average score of the items with a score of 2 or higher: blurred vision, eyestrain, and irritated eyes. Figure 8 shows the change in CFF values before and after the experiment, based on red (0°).

3.3. Green

The accuracy rate of SSVEP for flash stimuli between green (120°) and 125° to 165° with color changes in increments of 5° is shown in Figure A3. G5 in the legend shows a flash signal with a color 125° (5° from reference green (120°)), G10 in the legend shows a color 130° (10° from reference green (120°)), and other legends are constructed similarly. The results of the subjective questionnaire conducted after the experiment are shown in Figure A4. Figure 9 shows a graph of the average algorithm accuracy rate of each frequency for each hue difference, compared to the conventional method. The average values of the questionnaire results for flicker rating, blurred vision, and eye strain are also shown in Figure 9. Figure 10 shows the change in CFF values before and after the experiment based on green (120°).

3.4. Blue

The accuracy rate of SSVEP when the paired flash stimuli were changed in increments of 5° between blue (240°) and 245° to 285° is shown in Figure A5. B5 in the legend shows a flash signal with a color 5° from the reference blue (240°), B10 shows a flash signal with a color 10° from the reference blue (240°), and the other legends are similarly constructed. The results of the subjective questionnaire conducted after the experiment are shown in Figure A6. Figure 11 shows a graph of the average algorithm accuracy rate of each frequency for each hue difference, compared to the conventional method. The average values of the questionnaire results for flicker rating, blurred vision, and eye strain are also shown in Figure 11. Figure 12 shows the change in CFF values before and after the experiment based on blue (240°).

4. Discussion

The black-and-white flash stimuli of the conventional method achieved an algorithm accuracy rate of more than 75% for all frequencies in Figure 4. In the results of the experiment with the proposed color stimuli, the red stimulus signal achieved an accuracy rate of more than 75% for all hue differences in Figure 7. For the green stimulus signal, the results showed that the accuracy rate was approximately 40% when the hue difference was less than 15°. When the hue difference was more than 20°, the accuracy rate was approximately 70%. For the blue stimulus signal, the discrimination rate was 65% or higher for all hue differences.
In the subjective questionnaire for the conventional method (black-and-white flash stimulation), shown in Figure 5, many subjects reported blurred vision, eye strain, and irritated eyes at levels from slight (2) to moderate (3), with means of two or more. Some subjects experienced eye pain, slowed adjustment, or double vision, with a mean of 1.5, and some subjects reported that blurred vision and irritated eyes were clearly present. The subjective questionnaire results thus show that many participants experienced eye strain. In addition, the CFF results show an average change of approximately +2 for both eyes with the black-and-white flash stimulation of the conventional method (Figure 6). Therefore, it is clear that the visual stimuli of the conventional method cause fatigue.
For all the hues and hue differences of the proposed method, there was some variation in the CFF change, but the change in fatigue was less than that of the conventional method. The mean change in fatigue was almost always between −1 and +1 (Figure 8, Figure 10, and Figure 12). The only visual stimuli above this range were R45 and G5, and even for these two signals the results were closer to 1 than to 2. From these results, we can confirm that the proposed method reduces the accumulation of fatigue compared to the conventional method. The subjective questionnaire results for each hue of the proposed method show that, for red visual stimuli, eye strain was lower than with the conventional method for hue differences of 20° or less (R5 to R20), and blurred vision and irritated eyes were lower for all hue differences from R5 to R45. The average flicker rating for the lowest hue difference was approximately 2; nevertheless, the red visual stimuli were slightly more accurate than the conventional method. For green visual stimuli, eye strain, blurred vision, and irritated eyes were lower than with the conventional method for all hue differences from G5 to G45. From G35 to G45, the flicker rating was the same as that of the conventional method; from G5 to G15, the flicker rating was less than 2, and the discrimination rate was significantly reduced. For blue visual stimuli, eye strain was similar to the conventional method for B30 and B40 and reduced for all other hue differences, while blurred vision and irritated eyes were lower for all hue differences from B5 to B45. The flicker rating was the same as the conventional method for B15 to B45, and the discrimination rate was slightly lower than the conventional method for all hue differences.
These results indicate that the generation of visual stimuli by hue change can reduce the degree of fatigue caused by visual stimuli compared to the conventional method and can obtain the same level of discrimination rate.
Next, we discuss the detailed discrimination rate for each frequency of each hue. The results for each hue are shown in Appendix A.
Based on the results for red (0°) in Figure A1, the algorithm accuracy rate exceeded an average of 60% for all signals from R5 to R45, especially from 30 to 6 Hz, with small standard deviations and high mean values. An EEG analysis of, for example, a 15 Hz flash stimulus under gaze has been reported [4] to reveal a large 15 Hz component as well as twofold and threefold harmonics, called double frequencies. Similarly, for 4 and 5 Hz, the algorithm accuracy rate may be reduced because of the misidentification of the double frequencies of 10, 12, and 15 Hz. As the subjective questionnaire score for eye strain with the conventional method was 2.333 on average, the R20–R45 signals achieved the same algorithm accuracy rate as the conventional flash stimulus signals while reducing fatigue by up to 15%. In addition, the eye fatigue score for R5–R15 was approximately 1.5; hence, the R5–R15 signals reduced eye fatigue by approximately 35.7% compared to the conventional method while achieving the same algorithm accuracy rate.
Based on the results for green (120°) in Figure A3, the algorithm accuracy rate was low for the G5 to G15 signals. This is because, as the flicker rating in Figure 9 shows, many subjects did not recognize the flash stimulation. Previous research [14] reported that the SSVEP-induced EEG is not a physiological response to a stimulus itself but to a change in the perception of that stimulus. Therefore, if the flash stimulus cannot be recognized, the SSVEP may not be obtained accurately. For the G20–G45 signals, the identification rate varied significantly by frequency: 70% on average at 30 Hz, 60% at 20 Hz, 40% at 15 Hz, 60% at 12 Hz, 70% at 10 Hz, 60% at 7.5 Hz, 65% at 6 Hz, 40% at 5 Hz, and 60% at 4 Hz. These large differences mean that a stable algorithm accuracy rate cannot be obtained. At 5 Hz, the effect of the double frequency may play a role, but the overall decrease in the algorithm accuracy rate is considered to be mainly due to the difficulty of recognizing the color changes of green flash stimuli.
Based on the blue (240°) results in Figure A5, the algorithm accuracy rate was 60% or more for 6–15 Hz but approximately 50% for signals above 20 Hz, which are fast flash stimuli. In a previous report [15], blue and purple colors were reported to be indistinguishable. As a visual stimulus, the boundary of the color change in an instantaneous flicker takes a long time to recognize; thus, a fast flash stimulus reduces the algorithm accuracy rate. Therefore, the identification rate was low at high frequencies and high at low frequencies. In addition, many subjects answered that they clearly recognized flickering for B10–B45 in the flicker rating (Figure 11). Recognition of the color change was possible for any hue difference, and the difference in algorithm accuracy rates due to hue change was small. From these results, it is believed that the accuracy of blue varies depending on the frequency. The subjective questionnaire results for eye strain in Figure 11 make it clear that many subjects experienced asthenopia with the strongly color-changing signals between B25 and B45. In summary, the blue signal cannot be identified with high accuracy at high frequencies; therefore, it should be used with low-frequency signals as a stimulus to induce SSVEP.
In visual display terminal (VDT) work, the degree of fatigue felt by the user has been reported to change depending on the display color, and this study confirmed that the display color of the SSVEP visual stimulus likewise affects the degree of fatigue. Further, it was confirmed that the fatigue level of the flash stimulus generated by the hue change was lower than that of the conventional visual stimulus generated by the brightness change. However, some color combinations could not be perceived as flashing and reduced the algorithm accuracy rate of SSVEP, so care must be taken when selecting them.
The present study revealed that it is possible to acquire SSVEPs with color signals. Therefore, it is possible to add color to the interface by using color stimuli as well as the conventional black-and-white flash stimuli. By doing so, we believe that it is possible to make BCI/BMI operate intuitively. In the future, it will be necessary to investigate the algorithm accuracy rate and intuitive operability by mixing multiple color signals.

5. Conclusions

In this paper, we highlighted the risk of visual discomfort and photosensitive epilepsy posed by the intense flash stimuli used as visual stimuli in noninvasive SSVEP-type BCI. We proposed reducing visual discomfort by generating the flash stimulus signal through a change in hue rather than a change in brightness. A measurement experiment of SSVEP with different hue changes and frequencies was performed to determine a flash stimulus signal that reduces visual discomfort while maintaining a degree of accuracy similar to conventional methods. The results are as follows.
  • For almost all hues and hue differences of the proposed visual stimuli, there was some variation in the CFF changes, but all fatigue changes were between −1 and +1. The only visual stimuli that were above this range were R45 and G5. For these two signals, the results were also closer to 1 than to 2. Therefore, the proposed method causes less fatigue than the conventional method, which has a CFF mean of +2.
  • As for the discrimination rate, we confirmed that the red visual stimuli were slightly more accurate than the conventional method. For green and blue visual stimuli, the accuracy depended on the hue difference, and the overall accuracy was slightly lower in the G15–G45 and B25–B45 ranges than with the conventional method. However, for green and blue visual stimuli, the larger the hue difference, the lower the eye strain was compared with red.
  • For the three items that were above 2 from the subjective questionnaire results of the conventional method, blurred vision and eye irritation were below 2 for all colors in the proposed method. Only eye strain varied in relation to the hue difference.
  • From the results of the discrimination rates of the nine frequencies, for all hue differences in red visual stimuli, a discrimination rate of more than 60% was obtained for frequencies of 6 Hz and above, and a discrimination rate of more than 50% was obtained for frequencies of 5 Hz and below. For green visual stimuli, the discrimination rate was more than 50% for the frequencies from G20 to G45 that were of 6 Hz or higher. For blue visual stimuli, the discrimination rate of more than 50% was obtained when the frequencies from B10 to B45 were 5 Hz or higher.

Author Contributions

Conceptualization, Y.S., Y.K. and Y.B.; methodology, Y.S. and Y.K.; software, Y.S. and Y.K.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S. and Y.B.; supervision, Y.B. and T.H.; project administration, Y.B. and T.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki. Ethical review and approval were waived for this study because the research used only anonymized, processed information; the data published in the paper do not identify specific individuals, and no correspondence table was created.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

We would like to express our sincere gratitude to the participants who cooperated with the experiment in the midst of the COVID-19 pandemic.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Figure A1. Accuracy rate of the algorithm with red flash stimuli. The bar chart shows the mean algorithm accuracy, and the error bar shows the standard deviation.
Figure A2. Subjective questionnaire results with red flash stimuli: (a) Flicker rating; (b) Nausea; (c) Neck/back pain; (d) Blurred vision; (e) Headache; (f) Dizziness; (g) Sore eyes; (h) Eye strain; (i) Irritated eyes; (j) Eye pain; (k) Slowed adjustment; (l) Double vision; (m) Hot eyes. In all the graphs, the vertical axis represents the results of the questionnaire and the horizontal axis the type of signal based on the hue. The shaded bar chart shows the upper and lower limits of the score. Black bars indicate the mean score (flicker rating: 1 = none, 2 = slight, 3 = clear. measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.).
Figure A3. Accuracy rate of the algorithm with green flash stimuli. The bar chart shows the mean algorithm accuracy, and the error bar shows the standard deviation.
Figure A4. Subjective questionnaire results with green flash stimuli: (a) Flicker; (b) Nausea; (c) Neck/back pain; (d) Blurred vision; (e) Headache; (f) Dizziness; (g) Sore eyes; (h) Eye strain; (i) Irritated eyes; (j) Eye pain; (k) Slowed adjustment; (l) Double vision; (m) Hot eyes. In all the graphs, the vertical axis represents the results of the questionnaire and the horizontal axis the type of signal based on the hue. The shaded bar chart shows the upper and lower limits of the score. Black bars indicate the mean score (flicker rating: 1 = none, 2 = slight, 3 = clear. measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.).
Figure A5. Accuracy rate of the algorithm with blue flash stimuli. The bar chart shows the mean algorithm accuracy, and the error bar shows the standard deviation.
Figure A6. Subjective questionnaire results with blue flash stimuli: (a) Flicker; (b) Nausea; (c) Neck/back pain; (d) Blurred vision; (e) Headache; (f) Dizziness; (g) Sore eyes; (h) Eye strain; (i) Irritated eyes; (j) Eye pain; (k) Slowed adjustment; (l) Double vision; (m) Hot eyes. In all the graphs, the vertical axis represents the results of the questionnaire and the horizontal axis the type of signal based on the hue. The shaded bar chart shows the upper and lower limits of the score. Black bars indicate the mean score. (Flicker rating: 1 = none, 2 = slight, 3 = clear. Measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.)

Figure 1. Example of a time sequence of a flash stimulation of hue changes.
Figure 2. Example of a time sequence of a color phase change graph.
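As Figures 1 and 2 illustrate, the proposed stimulus alternates between a base color and a hue-shifted version of it, so the user sees a color change while the brightness change stays small. A minimal sketch of how such a per-frame color sequence could be generated (the function name, its parameters, and the square-wave alternation between the two hues are illustrative assumptions, not the authors' implementation):

```python
import colorsys

def hue_flash_sequence(base_hue_deg, hue_diff_deg, freq_hz, refresh_hz, duration_s):
    """Per-frame RGB colors for a flash stimulus that alternates between a
    base hue and a hue shifted by hue_diff_deg at freq_hz, keeping
    saturation and value constant so brightness barely changes."""
    frames = []
    n = int(refresh_hz * duration_s)
    for i in range(n):
        t = i / refresh_hz
        # Square-wave alternation between the two hues at the flicker frequency.
        phase_on = (t * freq_hz) % 1.0 < 0.5
        hue = base_hue_deg + (hue_diff_deg if phase_on else 0.0)
        r, g, b = colorsys.hsv_to_rgb((hue % 360.0) / 360.0, 1.0, 1.0)
        frames.append((r, g, b))
    return frames

# Example: red base (0 deg) with a 15 deg hue shift at 8 Hz on a 60 Hz display.
frames = hue_flash_sequence(0.0, 15.0, 8.0, 60.0, 1.0)
```

Because only the hue rotates (saturation and value stay at 1.0), only two distinct colors ever appear in the sequence, which is the property the paper exploits to reduce luminance flicker.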
Figure 3. Example of nine visual stimuli.
Figure 4. Accuracy rate of steady-state visual evoked potential with black-and-white flash stimuli. The bar chart shows the mean algorithm accuracy, and the error bar shows the standard deviation.
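In Figure 4 (and likewise Figures A3 and A5), each bar is the mean algorithm accuracy across subjects and each error bar is one standard deviation. A small sketch of what is plotted, using made-up per-subject accuracy values for illustration:

```python
from statistics import mean, stdev

# Hypothetical per-subject accuracy rates (%) for one stimulus condition.
accuracies = [85.0, 90.0, 78.0, 88.0, 95.0]

bar_height = mean(accuracies)   # height of the bar (mean accuracy)
error_bar = stdev(accuracies)   # error bar length (sample standard deviation)

print(round(bar_height, 1), round(error_bar, 1))  # prints: 87.2 6.3
```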
Figure 5. Results of subjective questionnaire on black-and-white flash stimuli. The shaded bar chart shows the upper and lower limits of the score. Black bars indicate the mean score. (Measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.)
Figure 6. Changes in critical fusion frequency (CFF) values with black-and-white flash stimuli. The shaded bar chart shows the upper and lower limits of the score. Black bars indicate the mean score (0 = no change in fatigue, + = more fatigued, − = less fatigued).
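Figures 6, 8, 10, and 12 plot changes in critical fusion frequency (CFF), with positive values read as more fatigued. Since CFF typically drops as the central nervous system fatigues, a plausible scoring is pre minus post; this sign convention is an assumption inferred from the captions, not spelled out in the text:

```python
def cff_fatigue_change(pre_hz, post_hz):
    """Fatigue change from CFF measured before and after the task.

    Assumption for illustration: the change is taken as pre - post, so a
    drop in CFF after the task (the usual sign of central fatigue) shows
    up as a positive value, matching the figures' "+ = more fatigued".
    """
    return pre_hz - post_hz

# A CFF drop from 40.0 Hz to 37.5 Hz gives +2.5 (more fatigued);
# a rise from 40.0 Hz to 41.0 Hz gives -1.0 (less fatigued).
```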
Figure 7. Comparison of the average accuracy rate of the algorithm for black-and-white flash stimuli and red signals, and the average flicker and eye strain in the questionnaire results. (Flicker rating: 1 = none, 2 = slight, 3 = clear. Measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.)
Figure 8. Changes in CFF values with red flash stimuli: (a) Right eye; (b) Left eye. Shaded bar graphs show the upper and lower limits of the score. Black bars indicate the mean score. (0 = no change in fatigue, + = more fatigued, − = less fatigued).
Figure 9. Comparison of the average accuracy rate of the algorithm for black-and-white flash stimuli and green signals, and the average flicker and eye strain in the questionnaire results. (Flicker rating: 1 = none, 2 = slight, 3 = clear. Measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.)
Figure 10. Changes in CFF values with green flash stimuli: (a) Right eye; (b) Left eye. Shaded bar graphs show the upper and lower limits of the score. Black bars indicate the mean score (0 = no change in fatigue, + = more fatigued, − = less fatigued).
Figure 11. Comparison of the average accuracy rate of the algorithm for black-and-white flash stimuli and blue signals, and the average flicker and eye strain in the questionnaire results. (Flicker rating: 1 = none, 2 = slight, 3 = clear. Measures of discomfort: 1 = none, 2 = slight, 3 = moderate, 4 = clear, 5 = severe.)
Figure 12. Changes in CFF values with blue flash stimuli: (a) Right eye; (b) Left eye. Shaded bar graphs show the upper and lower limits of the score. Black bars indicate the mean score (0 = no change in fatigue, + = more fatigued, − = less fatigued).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.