Article

Band-Selection of a Portable LED-Induced Autofluorescence Multispectral Imager to Improve Oral Cancer Detection

1. Institute of Electrical and Control Engineering, National Yang Ming Chiao Tung University, 1001 University Road, Hsinchu City 30010, Taiwan
2. Institute of Biomedical Engineering, National Yang Ming Chiao Tung University, 1001 University Road, Hsinchu City 30010, Taiwan
3. Department of Pathology, China Medical University, 91 Hsueh-Shih Road, Taichung City 40402, Taiwan
4. Department of Otolaryngology, China Medical University Hospital, 2 Yuh-Der Road, Taichung City 40447, Taiwan
* Author to whom correspondence should be addressed.
Submission received: 21 April 2021 / Revised: 1 May 2021 / Accepted: 1 May 2021 / Published: 6 May 2021
(This article belongs to the Collection Advances in Spectroscopy and Spectral Imaging)

Abstract:
The aim of this study was to find effective spectral bands for the early detection of oral cancer. Spectral images in different bands were acquired using a self-made portable light-emitting diode (LED)-induced autofluorescence multispectral imager equipped with 365 and 405 nm excitation LEDs, emission filters with center wavelengths of 470, 505, 525, 532, 550, 595, 632, 635, and 695 nm, and a color image sensor. The spectral images of 218 healthy points in 62 healthy participants and 218 tumor points in 62 patients were collected in ex vivo trials at China Medical University Hospital. These ex vivo trials were close to in vivo conditions because the spectral images of the anatomical specimens were acquired immediately after on-site tumor resection. The spectral images associated with the red, green, and blue filters, correlated with and without the nine emission filters, were quantized by four computing methods: summed intensity, the highest count of the intensity histogram, entropy, and fractal dimension. The combination of four computing methods, two excitation light sources with two intensities, and 30 spectral bands in three experiments formed 264 classifiers. The quantized data of each classifier were divided into two groups: one was the training group used to optimize the threshold of the quantized data, and the other was the validation group tested under this optimized threshold. The sensitivity, specificity, and accuracy of each classifier were derived from these tests. A single-layer network learning process was used to identify the influential spectral bands based on the area under the receiver operating characteristic curve and the testing results; it was compared to a conventional rules-based approach and showed superior and faster performance. Consequently, four emission filters with center wavelengths of 470, 505, 532, and 550 nm were selected by the AI-based method and verified using the rules-based approach.
The sensitivities of the six classifiers using these emission filters were greater than 90%. Their average sensitivity was about 96.15%, their average specificity was approximately 69.55%, and their average accuracy was about 82.85%.

1. Introduction

Oral cancer has become a severe health problem in many developing and developed countries. In addition to the economic burden on patients and their families, related medical care has become a central issue of national health. According to the World Health Organization (WHO), 657,000 new cases of oral cancer are diagnosed each year, and more than 330,000 deaths occur due to oral cancer [1]. In Taiwan, oral cancer is ranked the fifth leading cause of death among common cancers, with about 7000 new cases and 3000 deaths each year [2]. The incidence rate and mortality rate in Taiwan rank first and second, respectively, compared with 35 other OECD countries. In Taiwan and Southeast Asia, patients suffering from oral cancer commonly have habits of smoking and/or betel-nut chewing [3,4,5].
A fluorophore generates fluorescence after it absorbs light of a specific excitation wavelength; when fluorophores naturally present in tissue do so, the phenomenon is called autofluorescence. The fluorophores in human tissue include flavin adenine dinucleotide (FAD), nicotinamide adenine dinucleotide (NADH), the structural proteins collagen, elastin, and keratin, and protoporphyrin, an intermediate product of heme biosynthesis. FAD emits autofluorescence in the green spectrum between 510 and 570 nm; NADH emits autofluorescence in the blue spectrum between 450 and 490 nm; and protoporphyrin emits autofluorescence in the red spectrum between 620 and 640 nm [6]. The concentrations of NADH and FAD in tumor tissue are smaller than those in normal tissue because tumor tissue might have higher aerobic metabolic activity than normal tissue [7]. This might result in a lower intensity of blue and green autofluorescence in tumor tissue compared with normal tissue. Structural proteins are considered to make a smaller contribution to fluorescent emission in tumor tissue than in normal tissue because the epithelial layer in premalignant and malignant tissues can be thicker than in normal tissue [8,9]. Protoporphyrin may vary with carcinoma progression; however, it also varies with the number of bacteria on the mucosa [10].
Optical spectroscopy has customarily been used to determine the specific bands for identifying abnormal tissue. Gillenwater et al. [11] found that the fluorescence intensity of abnormal oral tissue significantly decreased in the blue spectrum between 455 and 490 nm compared to that of normal oral tissue. Betz et al. [12] indicated that the fluorescence intensity of malignant mucosa significantly decreased in the green region at 540 and 575 nm compared to that of normal mucosa. Müller et al. [13] demonstrated that the autofluorescent changes of NADH and collagen have potential for cancer diagnosis. Majumder et al. [14] measured the spectrum of autofluorescence in squamous cell carcinoma excited by a nitrogen laser; the accuracy of distinguishing the carcinoma from normal squamous cells was over 85% based on principal component regression. Schwarz et al. reported that the fluorescent spectra at different depths in oral tissue could enhance the detection of optical changes associated with premalignant lesions because the collagen in the underlying stroma might change with the progression of malignant tumor tissue [15]. Mallia et al. [16] indicated that the ratio of the fluorescent intensity at 500 nm to that at 645, 705, and 685 nm could discriminate hyperplasia from dysplastic and normal tissues. In addition to optical spectroscopy, hyperspectral imaging systems (HIS) can measure not only the spatial distribution of the reflectance or transmittance but also the spectral distribution in a specified spectral range; such systems can measure over 100 spectral bands. In recent decades, HIS have been used in many types of research, including remote sensing, food production, medical detection, and agriculture. In a previous study, we developed the embedded relay lens microscopic hyperspectral imaging system (ERL-MHSI) and employed the system in cancer-related research [17].
Owing to the spatial and spectral resolution of the ERL-MHSI system, the data can be analyzed as hyperspectral morphological images. This is a useful tool for research but may not be appropriate for quick-screening oral cancer detection devices, because an HIS usually requires minutes to scan a target to capture the spatial and spectral data, and HIS devices are usually bulky. An HIS or spectrometer can be used to find the characteristic spectrum for identifying the target, but such a spectrum is difficult to implement directly in a multispectral system; the number of spectral bands is too large to allow practical implementation in a portable multispectral imager. Thus, a multispectral imager composed of a few well-chosen band-pass filters is fundamentally important for the quick screening of oral cancer.
Traditionally, an oral examination for oral cancer involves a visual inspection and palpation of oral lesions under illumination. If the clinician suspects there is a risk of abnormal tissue progressing to cancer, an oral biopsy for histopathological analysis is necessary. Even for experienced clinicians, it is a challenge to observe the superficial characteristics of oral cancer due to the subtle changes of the epithelia during the cancer's malignant progression. To enhance visualization for quick screening of oral cancer, several handheld assistant tools have been developed, such as VELscope [18], Identafi 3000 (DentalEZ Inc., Malvern, PA, USA) [19], and EVINCE (MMOptics, São Carlos, Brazil) [20]. These devices provide a light source from ultraviolet light-emitting diodes (LEDs) to excite the oral tissue and help an examiner observe the autofluorescence of the oral tissue through a long-pass or band-pass optical filter. By observing the fluorescence loss, the development and progression of oral neoplasia can be visually identified and scored using these devices. These devices have been used in clinical examinations and trials, showing their ability and efficacy for the screening of oral cancer [21,22,23,24,25,26,27,28,29,30,31]. Recently, Jeng et al. [32] developed a principal component analysis-based method that combined a VELscope and Raman spectroscopy to improve the detection of oral cancer. Jeng et al. further used linear discriminant analysis and quadratic discriminant analysis to increase the differentiation between normal, premalignant, and malignant lesions based on the autofluorescence images acquired from the VELscope; the accuracy of the classifications was increased by 2% to 14% [33]. Huang et al. [34] created a two-channel autofluorescence detection system that used 375 and 460 nm excitation light sources and 479 and 525 nm band-pass emission filters to detect oral cancer and precancerous lesions.
The results revealed that autofluorescence had high sensitivity for detecting oral cancer. Cherry et al. [35] examined oral potentially malignant disorders (OPMDs) based on autofluorescence imaging and suggested that autofluorescence imaging had the potential to track OPMDs.
These beneficial tools mainly aim to enhance the visualization of autofluorescence loss in abnormal tissue. In this study, we developed a portable handheld multispectral imager to acquire spectral images of tumor and normal oral tissue in several bands and attempted to determine the effective spectral bands for identifying oral cancer based on several quantitative computing methods. In a previous study, our team proposed a self-made portable LED-induced autofluorescence (LIAF) multispectral imager for the screening of oral cancer [36]. This earlier device is mainly composed of LEDs, emission filters, and a CMOS imaging sensor, and was used to collect spectral images of autofluorescence in normal and tumor tissues. The results of that study illustrated that the autofluorescence of healthy and tumor tissues differed significantly in blue intensity. However, how to determine appropriate band filters is a vital issue in portable devices, especially for quick screening of oral cancer. In the current study, a methodology for band selection of the LIAF multispectral imager was proposed and demonstrated using 436 sample points from 62 patients in ex vivo trials undertaken at China Medical University Hospital. Two light intensities and two LED wavelengths, 365 and 405 nm, were used as the excitation light sources, and nine spectral bands, with center wavelengths of 470, 505, 525, 532, 550, 595, 632, 635, and 695 nm, were acquired. The spectral images were pre-processed by four image processing methods: intensity, histogram, entropy, and fractal dimension. The threshold of the quantized value for screening the tumor points was optimized by calculating the area under the receiver operating characteristic (ROC) curve. To find the effective spectral bands, a single-layer network learning process was used, and the results were compared to those of a conventional rules-based process.

2. Materials and Methods

2.1. Instrument Composition and Spectral Characteristics

The autofluorescence multispectral imager was equipped with blue or purple excitation LED sources, a long-pass filter that suppresses the excitation light, several band-pass filters, and a CMOS image sensor to capture the autofluorescence multispectral images reflected from the specimens. The LIAF multispectral imager was miniaturized and implemented as a self-made, portable, handheld device, as shown in Figure 1. The device uses LEDs to induce the autofluorescence of target tissue and acquires spectral images of the autofluorescence. The excitation light source module was equipped with six excitation LEDs; the emission filters on the rotary filter array passed the autofluorescence spectrum within the wavelength range of interest and rejected the spectrum outside it; and the imaging system was composed of a color CMOS imaging sensor and a lens capturing the fluorescent image induced from the tissues. The LEDs are placed on a metal-core printed circuit board (MCPCB), which dissipates the heat produced by the LEDs. The LED current is controlled using pulse-width-modulation signals generated by the imaging module. The probe in front of the holder blocks out ambient light and fixes the object distance of the system. The filter ring contains band-pass filters arranged in a circle and can be rotated with the user's fingers to change filters. The target tissues are excited by the LEDs and generate autofluorescence that is transmitted through the band-pass filter, the long-pass filter, and the lens; the autofluorescence is then captured by the imaging sensor.
The LIAF multispectral imager was divided into four-channel (4CH) and eight-channel (8CH) versions based on the number of band-pass filters on the filter ring (Figure 2). The total current intensities of the LEDs were 350 mA in the 4CH version and 1000 mA in the 8CH version (Figure 2); the 4CH and 8CH LIAF used the same excitation LEDs. The four channels in the rotary filter ring of the 4CH LIAF multispectral imager comprised three band-pass filters, with center wavelengths of 525, 635, and 695 nm, and one channel without a filter. The seven channels in the rotary filter ring of the 8CH LIAF multispectral imager comprised six band-pass filters, with center wavelengths of 470, 505, 532, 550, 595, and 632 nm, and one channel without a filter. To block the reflection of the excitation light entering the imaging system, a long-pass filter (LP455) was adopted in the 8CH LIAF multispectral imager (Figure 2); this filter blocks wavelengths of light shorter than 455 nm and passes wavelengths longer than 455 nm. Because the 4CH LIAF multispectral imager was the earlier version, the LP455 was not adopted in it.
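The two hardware configurations described above can be summarized in a small lookup structure; the field names below are illustrative, with values taken from the description in the text:

```python
# Channel configuration of the two LIAF imager versions, as described above.
# Center wavelengths in nm; None marks the channel without a band-pass filter.
LIAF_VERSIONS = {
    "4CH": {
        "led_current_mA": 350,
        "long_pass_filter_nm": None,   # LP455 was not adopted in this version
        "emission_filters_nm": [525, 635, 695, None],
    },
    "8CH": {
        "led_current_mA": 1000,
        "long_pass_filter_nm": 455,    # LP455 blocks light below 455 nm
        "emission_filters_nm": [470, 505, 532, 550, 595, 632, None],
    },
}

# Both versions use the same two excitation LEDs.
EXCITATION_LEDS_NM = [365, 405]
```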

2.2. Patient History and Experimental Design

Patients who were referred to the Department of Otolaryngology-Head and Neck Surgery at China Medical University Hospital because of suspicious oral lesions or were waiting for head and neck surgery in the hospital ward were recruited to participate in the study. This study was reviewed and approved by the Institutional Review Board of China Medical University Hospital (CMUH102-REC1-069) [37], and the analysis was checked and approved by the Department of Biomedical Engineering, National Yang Ming Chiao Tung University, Taiwan. Written informed consent was obtained from each subject enrolled in the study. Patients between 20 and 100 years of age were eligible to participate.
The patients involved in this study were divided into three experiments, as shown in Table 1. The first experiment (Exp_1) involved 17 patients using the 4CH LIAF multi-spectral imager without the correcting procedure. The second experiment (Exp_2) involved 19 patients using the 4CH LIAF multi-spectral imager with the correcting procedure. The third experiment (Exp_3) involved 26 patients using the 8CH LIAF multi-spectral imager with the correcting procedure. The anatomical specimens were collected from the surgical resections of different patients, except for two anatomical specimens collected from one patient in Exp_3. Thus, a total of 28 specimens were involved in Exp_3.
The experimental procedure is shown in Figure 3. After the instruments were set up and the surgeon had completed the surgical operation, the surgeon immediately selected several points in the tumor region and the healthy region on the anatomical specimen (Figure 4a). Next, a dark image and a white image of the white-balance and diffuse-reflectance targets were captured; these were used to perform a dark calibration and a white calibration, which reduce the intensity offset produced by the imaging sensor and the impact of flux variation in the light source. The surgeon then aimed the LIAF multi-spectral imager at each selected point and pressed the trigger to start the capturing procedure (Figure 4b). In the capturing procedure, a dark image was first captured without emitting light; the point was then excited, and the autofluorescence image of the point was captured (Figure 4c). These steps were performed again after the light source was changed automatically. After the capturing procedure was completed, the indicator of the LIAF multi-spectral imager was illuminated. The surgeon changed the band-pass filter by rotating the rotary filter ring and started the next capturing procedure, repeating this operation until all active channels of the rotary filter ring were used; the whole capturing sequence was then repeated for all points. As shown in Table 2, a total of 53 healthy points and 53 tumor points were collected in Exp_1; 54 healthy points and 54 tumor points were collected in Exp_2; and 111 normal points and 111 tumor points were collected in Exp_3.
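The per-point capture sequence can be sketched as a control loop. The device interface below (`MockImager` and its methods) is a hypothetical stand-in for the actual firmware, used only to make the order of operations concrete:

```python
class MockImager:
    """Stand-in for the LIAF device; method names are illustrative assumptions."""
    def __init__(self, filters):
        self.filters = filters         # center wavelengths on the rotary ring
        self.filter_idx = 0
        self.led_on = None

    def rotate_filter(self):
        """Advance the rotary filter ring by one channel."""
        self.filter_idx = (self.filter_idx + 1) % len(self.filters)

    def current_filter(self):
        return self.filters[self.filter_idx]

    def excite(self, led_nm):
        self.led_on = led_nm

    def led_off(self):
        self.led_on = None

    def capture(self):
        # Returns a (filter, led) tag in place of real pixel data.
        return (self.current_filter(), self.led_on)


def capture_point(imager, leds=(365, 405)):
    """Per-point capture: for each LED, take a dark frame, excite, then capture."""
    frames = []
    for led_nm in leds:                # the light source is changed automatically
        imager.led_off()
        dark = imager.capture()        # dark image without emitting light
        imager.excite(led_nm)
        frames.append((dark, imager.capture()))
    imager.led_off()
    return frames
```

After each call to `capture_point`, the surgeon would rotate the filter ring (`rotate_filter`) and repeat until all active channels are used.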

2.3. Data Collection and Analysis

The spectral images of the specimens captured at the selected points of tumor tissue and normal tissue were used in the analysis. Only the pixels within the probe region of each image were analyzed (Figure 4c). Because the SOI-268 is a color CMOS imaging sensor, one image contains red (R), green (G), and blue (B) gray-level images; the three gray-level images were used as independent data sets. The first step was to analyze the images of the points using four methods: intensity, histogram, entropy, and fractal dimension. The first method calculated the summation of the gray levels of the pixels in the probe region (ROI) of one spectral image. The summation, I, is expressed as:
I = \sum_{(x,y) \in ROI} f(x,y),
where f(x,y) is the gray level of the pixel at coordinate (x,y) within the ROI. The second method was to find the highest count S among the intensity levels of a spectral image:
S = \max_{k} n_k,
where n_k is the number of pixels in f with intensity r_k, with r_k denoting the intensities of an L-level spectral image and k ranging from 0 to L-1. In information theory, entropy measures the uncertainty in a set of random variables. The third method was to calculate the entropy H, expressed as:
H = -\sum_{k=0}^{L-1} P(r_k) \log_2 P(r_k),
where P(r_k) is the proportion of the gray level r_k in a spectral image. The fourth method uses the concept of fractal dimension. The fractal dimension is an index that characterizes the complexity of a pattern. The morphological shape of tumor tissue can be chaotic because of the random proliferation of the tissue; thus, the fractal dimension may be useful in oral cancer detection. The first step of the fourth method was to binarize the images using Otsu's method, which finds the optimum threshold t by maximizing the inter-class variance σ²(t), expressed as:
\sigma^2(t) = \left[ \sum_{k=0}^{t-1} P(r_k) \right] \left[ \sum_{k=t}^{L-1} P(r_k) \right] \left[ \frac{\sum_{k=0}^{t-1} r_k P(r_k)}{\sum_{k=0}^{t-1} P(r_k)} - \frac{\sum_{k=t}^{L-1} r_k P(r_k)}{\sum_{k=t}^{L-1} P(r_k)} \right]^2,
where the candidate threshold t ranges from 1 to L-1 and k is the summation index. The optimum threshold t was used to transform a gray-level image f into a binary image f_b. The transfer is expressed as:
f_b(x,y) = \begin{cases} 1, & f(x,y) \geq t \\ 0, & f(x,y) < t \end{cases}.
A kernel of size 2 × 2 pixels was moved over the binary image f_b, and the sum of the pixels under the kernel was computed at each location. The result of this spatial correlation, f_c, is expressed as:
f_c(i,j) = \begin{cases} 1, & \sum_{x=i}^{i+1} \sum_{y=j}^{j+1} f_b(x,y) \geq 1 \\ 0, & \sum_{x=i}^{i+1} \sum_{y=j}^{j+1} f_b(x,y) < 1 \end{cases}.
After the spatial correlation, the summation of fc was calculated as follows:
D = \sum_{i=0}^{M-2} \sum_{j=0}^{N-2} f_c(i,j),
where M and N are the dimensions of the binary image f_b.
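The four quantization methods above can be sketched with NumPy as follows. This is a minimal reconstruction from the equations, not the authors' code; the 256-level assumption corresponds to an 8-bit sensor image:

```python
import numpy as np

def intensity_sum(img):
    """Method 1: summed gray level I over the ROI."""
    return int(img.sum())

def histogram_peak(img, levels=256):
    """Method 2: highest bin count S of the intensity histogram."""
    counts = np.bincount(img.ravel(), minlength=levels)
    return int(counts.max())

def entropy(img, levels=256):
    """Method 3: Shannon entropy H of the gray-level distribution."""
    p = np.bincount(img.ravel(), minlength=levels) / img.size
    p = p[p > 0]                       # 0 * log2(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

def fractal_measure(img, levels=256):
    """Method 4: Otsu binarization followed by a 2x2 occupancy count D."""
    # Otsu's threshold: maximize inter-class variance over candidate levels t.
    p = np.bincount(img.ravel(), minlength=levels).astype(float) / img.size
    best_t, best_var = 1, -1.0
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, levels) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    fb = (img >= best_t).astype(int)
    # 2x2 window: mark locations where at least one foreground pixel is present.
    occ = fb[:-1, :-1] | fb[:-1, 1:] | fb[1:, :-1] | fb[1:, 1:]
    return int(occ.sum())
```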
The red, green, and blue filters were regarded as three spectral bands and correlated with the emission filters to form new spectral bands; the resulting 30 spectral bands are listed in Table 3. In addition, two excitation light sources and four image processing methods were used; thus, a total of 96 and 168 combinations were regarded as different classifiers in Exp_2 and Exp_3, respectively (Figure 2). The data of each combination were divided into two groups, A and B, for cross-validation, as shown in Table 2. One group was the training set, and the other was the validation set. The training set was used to optimize the threshold for distinguishing the normal points from the tumor points. The optimized threshold was tested on the validation set to derive the sensitivity, specificity, and accuracy of the classifier.
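Given an optimized threshold, the validation-set metrics can be computed as sketched below. This is an illustrative reconstruction; the assumption that tumor points fall below the threshold reflects the expected autofluorescence loss and is not a detail stated in the text:

```python
def evaluate(values, labels, threshold, tumor_below=True):
    """Sensitivity, specificity, and accuracy of one classifier at one threshold.

    labels: 1 = tumor, 0 = normal.  tumor_below: tumor autofluorescence is
    assumed weaker, so values below the threshold are called positive.
    """
    tp = fn = tn = fp = 0
    for v, y in zip(values, labels):
        positive = (v < threshold) if tumor_below else (v >= threshold)
        if y == 1 and positive:
            tp += 1          # tumor point correctly flagged
        elif y == 1:
            fn += 1          # tumor point missed
        elif positive:
            fp += 1          # normal point wrongly flagged
        else:
            tn += 1          # normal point correctly passed
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(labels)
    return sensitivity, specificity, accuracy
```

In a two-fold scheme like the one in the text, the threshold optimized on group A would be passed to `evaluate` on group B, and vice versa.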
The threshold was the crossing point of the two Gaussian distributions of the normal and tumor data. The normal and tumor data for which the p-value of the Kolmogorov–Smirnov (KS) test [32] exceeded 0.05 were assumed to follow a Gaussian distribution. The Gaussian distributions of the normal and tumor data were determined by their means and standard deviations. These distributions were used to find the optimized threshold and to calculate the receiver operating characteristic (ROC) curve. Each threshold determines one sensitivity, the fraction of tumor points identified as true positives, and one specificity, the fraction of normal points identified as true negatives; the accuracy, the fraction of all points identified as true positives or true negatives, was also determined (Table 4). The ROC curve is plotted from the sensitivity and one minus the specificity at various thresholds (Figure 5). The optimal threshold, marked as a filled circle, was determined by the highest accuracy, sensitivity, and specificity on the ROC (Figure 5). The area under the ROC curve (AUC) was used to evaluate the performance of the classifiers [38,39,40,41].
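The crossing point of two fitted Gaussians, and the AUC of a two-Gaussian model, both have closed forms that can be sketched as follows (an illustrative reconstruction, not the authors' code; the tumor class is assumed to be the lower-intensity one):

```python
import math

def gaussian_cross_threshold(mu1, s1, mu2, s2):
    """Intersection of two Gaussian pdfs, found by solving a quadratic in x."""
    # N(x; mu1, s1) = N(x; mu2, s2)  =>  a*x^2 + b*x + c = 0
    a = 1 / (2 * s2**2) - 1 / (2 * s1**2)
    b = mu1 / s1**2 - mu2 / s2**2
    c = mu2**2 / (2 * s2**2) - mu1**2 / (2 * s1**2) + math.log(s2 / s1)
    if abs(a) < 1e-12:                 # equal variances: a single crossing
        return -c / b
    d = math.sqrt(b**2 - 4 * a * c)
    roots = [(-b + d) / (2 * a), (-b - d) / (2 * a)]
    # Keep the root lying between the two means (the decision boundary).
    lo, hi = sorted((mu1, mu2))
    return next(r for r in roots if lo <= r <= hi)

def gaussian_auc(mu_tumor, s_tumor, mu_normal, s_normal):
    """AUC of the two-Gaussian model via the normal CDF of the class separation."""
    z = (mu_normal - mu_tumor) / math.sqrt(s_tumor**2 + s_normal**2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))
```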

3. Results

3.1. LIAF Spectral Imaging with and without Filters

The spectrum, radiant flux, and luminous flux of the excitation LEDs were measured using an SMS-500 spectrometer in conjunction with an integrating sphere, in compliance with CIE 127:2007 [42]. First, the spectral calibration and absolute luminous flux calibration of the measurement system were implemented. Then, the excitation LEDs were installed in the integrating sphere. The spectrum, radiant flux, and luminous flux of the excitation LEDs driven at forward currents of 500 and 1000 mA were each measured ten times, and the ten records of each condition were averaged. The spectral radiant flux of the excitation light sources is shown in Figure 6. The total radiant flux of the 365 nm LEDs driven at forward currents of 500 and 1000 mA was 76,613 and 141,663 µW, respectively; that of the 405 nm LEDs was 782,890 and 1,267,606 µW, respectively. The peak wavelengths of the 365 nm LEDs and 405 nm LEDs were 365 and 401 nm, respectively. The full width at half maximum (FWHM) band of the 365 nm LEDs spanned 364.26 to 375.64 nm, and that of the 405 nm LEDs spanned 395.55 to 411.95 nm. The designed peak wavelength of the 405 nm LEDs differed from the measured peak wavelength but was still within the measured FWHM band. The dominant wavelength of the 365 nm LED was 462 nm, and its CIE x and y coordinates were 0.2182 and 0.1607, respectively, approaching the purple–blue color. The dominant wavelength of the 405 nm LED was 431 nm, and its CIE x and y coordinates were 0.1742 and 0.0188, respectively, approaching the blue color. The relative spectral intensity of the excitation light sources is shown in Figure 6. Both the 8CH and the 4CH LIAF multi-spectral imagers used the SOI-268 CMOS sensor.
The transmittances of the red, green, and blue filters, correlated with the spectral response of the sensor, are marked as dotted lines, and the spectral transmittances of the emission filters used in the 4CH version are marked as solid lines in Figure 7. The spectral transmittances of the emission filters used in the 8CH version are marked as solid lines in Figure 8; the spectral transmittance of the long-pass filter LP455 is plotted in Figure 8 as a dotted line with cross marks. The peak wavelength and the corresponding transmittance of the red, green, and blue filters, correlated with and without each emission filter, are shown in Table 3.

3.2. Band Selection for Oral-Cancer Diagnosis

In this study, we attempted to identify effective spectral bands for the screening of oral cancer. The selection of the spectral bands was based on the AUC of the classifiers tested on a training set and the sensitivity, specificity, and accuracy of the classifiers tested on a validation set. A total of thirty spectral bands were collected from the R, G, and B filters, correlated with and without the nine emission filters. Each spectral band was included in eight classifiers associated with two excitations and four image processing methods. The data of each classifier were divided into two groups, A and B; one group was the training set, and the other was the validation set. To evaluate the performance of the classifiers, the AUC and the optimized threshold of the training set were calculated, and the threshold was applied to the validation set to calculate the sensitivity, specificity, and accuracy. The minimum, first quartile, second quartile, third quartile, average, and maximum AUC of groups A, B, and A + B for the eight classifiers of the 30 spectral bands are depicted in Figure 9, Figure 10 and Figure 11, respectively. The average AUCs of 3C10 (470 nm_B), 3C14 (505 nm_G), 3C17 (532 nm_G), 3C20 (550 nm_G), 3C28 (B), and 3C29 (G) were higher than the others.

3.2.1. Rules-Based Band Selection

For further band selection, only the top four AUCs of each spectral band were considered and averaged for ranking; that is, the four worst AUCs of each spectral band were not taken into account. The top six average AUC values of the spectral bands are shown in Table 5. The spectral bands 3C14, 3C10, 3C17, 3C29, 3C20, and 3C28 ranked in the top six AUCs of groups A, B, and A + B. The results illustrate that the blue and green filters, correlated with and without the four emission filters of center wavelengths 470, 505, 532, and 550 nm, exhibited the best performance for the screening of oral cancer. After selecting the spectral bands, the effective excitation light source and image processing methods were further selected. Because the aim of rapid screening is to efficiently identify suspicious oral mucosa tissues in the early stage of oral cancer, high sensitivity of the screening result is preferable to high specificity; thus, the criteria for selecting the effective excitation light source and image processing methods were based on the sensitivity of the testing results. The sensitivity, specificity, and accuracy of the validation results for 3C14, 3C10, 3C17, 3C29, 3C20, and 3C28 are shown in Figure 12. The classifiers with the highest sensitivity (over 94%) in the validation results were selected and are marked with red dotted circles in Figure 12.
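The ranking rule described above, which averages only the top four AUCs of each band, can be sketched as follows (an illustrative reconstruction; the AUC table in the usage example is hypothetical):

```python
def rank_bands(auc_table, keep_top=4, select=6):
    """Rules-based selection sketch: for each band, average its best `keep_top`
    classifier AUCs, then return the `select` best bands by that average."""
    scores = {
        band: sum(sorted(aucs, reverse=True)[:keep_top]) / keep_top
        for band, aucs in auc_table.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:select]
```

In the study, `auc_table` would map each of the 30 spectral bands to the eight AUCs of its classifiers (two excitation sources times four computing methods), and the procedure would be run separately on group A, group B, and A + B.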

3.2.2. AI-Based Band Selection

An artificial intelligence (AI) method was used to select the spectral bands, as illustrated in Figure 13. In iteration I, the spectral bands in which the average AUC was over 85% were selected. In iteration II, the accuracy, sensitivity, and specificity of each classifier in these selected spectral bands were used to calculate the weighting score. The method optimized the weighting W to find the maximum weighting score of the classifiers. The classifier with the highest weighting score in one of the selected spectral bands determined the effective excitation light sources and image processing methods for that spectral band. For the screening of oral cancer, high sensitivity is more important than high specificity; thus, sensitivity has the highest weighting value (0.8) compared to specificity and accuracy. The spectral bands selected by the AI-based method in the data without grouping (A + B), group A, and group B are shown in Table 6. These four spectral bands, namely 3C14, 3C17, 3C20, and 3C28, were also selected by the rules-based method.
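The weighting-score selection can be sketched as follows. The 0.8 sensitivity weight is stated in the text; the even split of the remaining weight between specificity and accuracy is an assumption for illustration:

```python
def weighted_score(sensitivity, specificity, accuracy, w=(0.8, 0.1, 0.1)):
    """Weighting score of one classifier: sensitivity dominates (weight 0.8);
    the 0.1/0.1 split for specificity and accuracy is an assumed example."""
    return w[0] * sensitivity + w[1] * specificity + w[2] * accuracy

def select_classifier(classifiers):
    """Pick the classifier with the highest weighting score within one band.

    classifiers: list of (name, (sensitivity, specificity, accuracy)) pairs.
    """
    return max(classifiers, key=lambda c: weighted_score(*c[1]))
```

Running `select_classifier` once per selected spectral band determines, for that band, the effective excitation LED and computing method.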

4. Discussion and Conclusions

In this study, a method for the screening of oral cancer was used to observe the autofluorescence of tissue after the tissue was excited by LEDs. The autofluorescence was filtered by the different RGB filters, with or without emission filters, and recorded by the imaging sensor. In the experiments, we adopted the red, green, and blue filters of the SOI268 with or without the emission filters, which have center wavelengths of 525, 635, 695, 470, 505, 532, 550, 595, and 632 nm. The most important purpose of this paper was to select the effective filters for the rapid screening of oral cancer. In the rules-based band selection, the spectral bands that identically ranked among the top six average AUCs of the classifiers tested on group A, group B, and the data without grouping (A + B) were C14, C10, C17, C29, C20, and C28; the average of the top four AUCs over the eight classifiers in these spectral bands was greater than 85% (Table 5). The AI-based method selected four spectral bands, which were the same as those selected by the rules-based method. The blue and green filters of the SOI268 CMOS imaging sensor showed good performance with or without the emission filters. Furthermore, the emission filters with center wavelengths of 550, 532, 470, and 505 nm showed better performance than the other filters.
The goal of the LIAF multi-spectral imager is to allow rapid screening at the early stage of oral cancer. For quick screening at an earlier stage, high sensitivity of the testing results is more important than high specificity. Therefore, the selection of the excitation LED wavelength and the computing methods aimed to find the classifiers with the highest sensitivity. Each spectral band was involved in eight classifiers using two excitation light sources and four quantitative computing methods. In the rules-based method, a classifier was adopted if its sensitivity was higher than 90%. According to the results (Figure 12), a total of six classifiers were selected, including a 365 nm excitation LED with the intensity method in C14, a 405 nm excitation LED with the intensity method in C10, a 405 nm excitation LED with the intensity method in C17, a 365 nm excitation LED with the intensity method in C29, a 405 nm excitation LED with the intensity method in C20, and a 405 nm excitation LED with the intensity method in C28. In the AI-based method, the final six combinations selected were a 405 nm LED with the intensity method for C14, a 405 nm LED with the intensity method for C10, a 405 nm LED with the intensity method for C17, a 365 nm LED with the intensity method for C29, a 405 nm LED with the intensity method for C20, and a 405 nm LED with the intensity method for C28. The classifiers adopted by the rules-based and AI-based methods were identical. The sensitivity of these six classifiers was larger than 90%: the average of their sensitivities was 96.15%, the average of their specificities was 69.55%, and the average of their accuracies was 82.85%. For the investigated application, these six classifiers could be implemented in a LIAF multispectral imager for the quick screening of oral cancer.
The AI-based band selection methodology used the area under the ROC curve (AUC) to evaluate the performance of all classifiers. The advantage of the ROC and AUC is that the measured performance of the combinations does not depend on the threshold. Furthermore, the AI-based method was successfully verified by the rules-based method. The advantage of the AI-based method is that it can adjust the weightings to rapidly identify the best four combinations consisting of four bands; the weightings can decide the means of detection based on the four selected classifiers with four spectral bands. The rules-based method uses the AUC ranks of the spectral bands in the ungrouped data (A + B), group A, and group B to select the final four spectral bands. However, the rules-based method cannot determine the means of detection based on the six selected classifiers with four spectral bands and cannot adjust itself to improve the testing results. Constructing the rules-based method also requires considerable background knowledge and is time-consuming. Therefore, the rules-based method can be replaced by the AI-based method. In the future, the AI-based methodology of band selection and combination selection can be applied to the quick screening of other cancers.
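The adjustable-weighting step can be sketched as a small search: each candidate classifier gets a weight W (Table 6 reports the pattern 0.7/0.1/0.1/0.1), the weighted accuracy, sensitivity, and specificity are combined with the 0.1/0.8/0.1 factors described in Figure 13, and the assignment with the maximum score wins. The candidate metric values below are illustrative assumptions, not the paper's results.

```python
# Minimal sketch of the weighting-score search from the AI-based method.
# Candidate (accuracy, sensitivity, specificity) tuples are illustrative;
# the 0.7/0.1/0.1/0.1 weight pattern follows Table 6, and the 0.1/0.8/0.1
# metric factors follow the Figure 13 description.
from itertools import permutations

def weighting_score(classifiers, w):
    """classifiers: list of (accuracy, sensitivity, specificity) tuples."""
    acc = sum(wi * c[0] for wi, c in zip(w, classifiers))
    sen = sum(wi * c[1] for wi, c in zip(w, classifiers))
    spe = sum(wi * c[2] for wi, c in zip(w, classifiers))
    return 0.1 * acc + 0.8 * sen + 0.1 * spe

candidates = [(0.83, 0.96, 0.70), (0.80, 0.92, 0.68),
              (0.78, 0.90, 0.66), (0.76, 0.88, 0.64)]

# Search every assignment of the fixed weight pattern to the four
# classifiers, keeping the one that maximizes the weighting score.
best = max(permutations([0.7, 0.1, 0.1, 0.1]),
           key=lambda w: weighting_score(candidates, w))
# The 0.7 weight lands on the strongest classifier (the first one).
```

Because the score is linear in the weights, the dominant weight always attaches to the classifier with the best combined metrics, which is what lets the method "decide the means of detection".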
Pioneering studies used optical spectroscopes to investigate the autofluorescence spectra of normal and abnormal oral tissue, and found that the autofluorescence intensity of abnormal tissue decreased in the blue and green spectral regions compared to that of normal oral tissue [6,7,8,9,10,11,12,13,14,15,16]. The spectral bands selected in this study are likewise in the blue and green regions. However, the specific spectral bands did not encompass, and were not equal to, the peak autofluorescence wavelengths used to identify oral cancer in previous studies. The peak wavelength of the excitation light sources was not consistent across studies because the excitation bands of the fluorophores are broad [6,53]. A spectroscope might not be suitable for quickly screening and demarcating oral cancer in the whole oral cavity because most spectroscopes have a narrow view and measure a single point [53]. Several handheld assistant tools, such as VELscope, Identafi, and EVINCE, have been developed to enhance the visualization of oral lesions by observing the loss of green or blue autofluorescence in abnormal tissue [18,19,20]. In most studies in which these tools were successfully used in clinical trials, examiners or surgeons identified the oral lesion using the tool or the image captured with it [21,22,23,24,25,26,27,28,29,30,31,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70]. In this study, we developed a portable handheld LED-induced autofluorescence multispectral imager that might be more suitable for the quick screening of oral cancer than optical spectroscopy or hyperspectral imaging (HSI). For this reason, the 550, 532, 470, and 505 nm emission filters, used in conjunction with the 365 and 405 nm excitation LEDs, were selected to enhance the differentiation between tumor tissue and normal tissue.
These filters were implemented in a multispectral imager, and the computing methods provided a quantitative value for identifying oral cancer without requiring expert interpretation.
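Two of the quantitative computing methods named in this work, summed intensity and histogram entropy, can be sketched directly on a spectral-band image region; the 4×4 "image" below and its gray levels are illustrative, and the fractal-dimension and highest-histogram-level methods are omitted for brevity.

```python
# Minimal sketch of two quantization methods applied to an image region:
# summed intensity and Shannon entropy of the gray-level histogram.
# The 4x4 region and its values are illustrative only.
import math

def summed_intensity(pixels):
    """Sum of all pixel intensities in the region."""
    return sum(sum(row) for row in pixels)

def histogram_entropy(pixels, levels=256):
    """Shannon entropy (bits) of the region's gray-level histogram."""
    counts = [0] * levels
    n = 0
    for row in pixels:
        for p in row:
            counts[p] += 1
            n += 1
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

region = [[10, 10, 200, 200],
          [10, 10, 200, 200],
          [10, 10, 200, 200],
          [10, 10, 200, 200]]

total = summed_intensity(region)   # 8 * 10 + 8 * 200 = 1680
h = histogram_entropy(region)      # two equally likely levels -> 1.0 bit
```

Either scalar can then be fed to a threshold classifier of the kind scored above.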

Author Contributions

Conceptualization, M.-H.T. and M.O.-Y.; methodology, Y.-J.Y., M.O.-Y.; software, N.-L.C.; analysis, N.-L.C., C.-I.J., and Y.-J.Y.; investigation, N.-L.C., C.-I.J.; resources, C.-I.J., and M.-H.T.; data curation, N.-L.C., Y.-J.Y.; writing—original draft preparation, C.-I.J., Y.-J.Y.; writing—review and editing, Y.-J.Y., and M.O.-Y.; visualization, Y.-J.Y.; supervision, M.-H.T., J.-C.C. and M.O.-Y.; project administration, M.-H.T., J.-C.C. and M.O.-Y.; funding acquisition, C.-I.J., M.-H.T., J.-C.C. and M.O.-Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the Ministry of Science and Technology of Taiwan (MOST) (MOST 104-2622-E-009-015-CC2, MOST 105-2623-E-009-008-D, MOST 105-2221-E-009-082, and MOST 109-2321-B-009-008), National Yang Ming Chiao Tung University, and China Medical University, Taiwan.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of China Medical University Hospital, Taiwan (CMUH102-REC1-069).

Informed Consent Statement

All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Institutional Review Board of China Medical University Hospital, Taiwan (CMUH102-REC1-069).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to their containing information that could compromise the privacy of research participants and the IRB statement (CMUH102-REC1-069).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Health Organization. Oral Cancer. Available online: http://www.who.int/cancer/prevention/diagnosis-screening/oral-cancer/en/ (accessed on 11 November 2018).
  2. Health Promotion Administration, Ministry of Health and Welfare. Statistics of Oral Cancer Incidence. Available online: https://www.hpa.gov.tw/Pages/Detail.aspx?nodeid=1137&pid=7564 (accessed on 27 July 2017).
  3. Tadaaki, K.; Omura, K. Oral Cancer Diagnosis and Therapy; Springer: Tokyo, Japan, 2015; pp. 1–52.
  4. Ferlay, J.; Shin, H.-R.; Bray, F.; Forman, D.; Mathers, C.; Parkin, D.M. Estimates of worldwide burden of cancer in 2008: GLOBOCAN 2008. Int. J. Cancer 2010, 127, 2893–2917.
  5. Ministry of Health and Welfare of Taiwan. Oral Cancer and Leading Cause of Death. Available online: https://www.mohw.gov.tw/cp-115-33347-2.html (accessed on 14 April 2021).
  6. Betz, C.S.; Arens, C.; Leunig, A. Autofluorescence Diagnosis of Cancers of the Upper Aerodigestive Tract; Endo-Press GmbH: Tuttlingen, Germany, 2007; pp. 9–12. ISBN 978-3-89756-131-1.
  7. Uppal, A.; Gupta, P.K. Measurement of NADH concentration in normal and malignant human tissues from breast and oral cavity. Biotechnol. Appl. Biochem. 2003, 37, 45–50.
  8. Koenig, F.; McGovern, F.J.; Althausen, A.F.; Deutsch, T.F.; Schomacker, K.T. Laser induced autofluorescence diagnosis of bladder cancer. J. Urol. 1996, 156, 1597–1601.
  9. Schomacker, K.T.; Frisoli, J.K.; Compton, C.C.; Flotte, T.J.; Richter, J.M.; Nishioka, N.S.; Deutsch, T.F. Ultraviolet laser-induced fluorescence of colonic tissue: Basic biology and diagnostic potential. Lasers Surg. Med. 1992, 12, 63–78.
  10. Betz, C.S.; Stepp, H.; Janda, P.; Arbogast, S.; Grevers, G.; Baumgartner, R.; Leunig, A. A comparative study of normal inspection, autofluorescence and 5-ALA-induced PPIX fluorescence for oral cancer diagnosis. Int. J. Cancer 2002, 97, 245–252.
  11. Gillenwater, A.; Jacob, R.; Ganeshappa, R.; Kemp, B.; El-Naggar, A.K.; Palmer, J.L.; Clayman, G.; Mitchell, M.F.; Richards-Kortum, R. Noninvasive Diagnosis of Oral Neoplasia Based on Fluorescence Spectroscopy and Native Tissue Autofluorescence. Arch. Otolaryngol. Head Neck Surg. 1998, 124, 1251–1258.
  12. Betz, C.S.; Mehlmann, M.; Rick, K.; Stepp, H.; Grevers, G.; Baumgartner, R.; Leunig, A. Autofluorescence imaging and spectroscopy of normal and malignant mucosa in patients with head and neck cancer. Lasers Surg. Med. 1999, 25, 323–334.
  13. Müller, M.G.; Valdez, T.A.; Georgakoudi, I.; Backman, V.; Fuentes, C.; Kabani, S.; Laver, N.; Wang, Z.; Boone, C.W.; Dasari, R.R.; et al. Spectroscopic detection and evaluation of morphologic and biochemical changes in early human oral carcinoma. Cancer 2003, 97, 1681–1692.
  14. Majumder, S.K.; Gupta, A.; Gupta, S.; Ghosh, N.; Gupta, P.K. Multi-class classification algorithm for optical diagnosis of oral cancer. J. Photochem. Photobiol. B 2006, 85, 109–117.
  15. Schwarz, R.A.; Gao, W.; Daye, D.; Williams, M.D.; Richards-Kortum, R.; Gillenwater, A.M. Autofluorescence and diffuse reflectance spectroscopy of oral epithelial tissue using a depth-sensitive fiber-optic probe. Appl. Opt. 2008, 47, 825–834.
  16. Mallia, R.J.; Thomas, S.S.; Mathews, A.; Kumar, R.; Sebastian, P.; Madhavan, J.; Subhash, N. Laser-induced autofluorescence spectral ratio reference standard for early discrimination of oral cancer. Cancer 2008, 112, 1503–1512.
  17. Hsieh, Y.F.; Ou-Yang, M.; Duann, J.R.; Chiou, J.C.; Chang, N.W.; Jan, C.I.; Tsai, M.H.; Wu, S.D.; Lin, Y.J.; Lee, C.C. Development of a novel embedded relay lens microscopic hyperspectral imaging system for cancer diagnosis: Use of the mice with oral cancer to be the example. Int. J. Spectrosc. 2012, 2012, 710803.
  18. LED Dental. VELscope® Vx System. Available online: https://velscope.com/product/velscope-vx-system/ (accessed on 13 April 2021).
  19. DENTALEZ. IDENTAFI®. Available online: https://www.dentalez.com/product/identafi/ (accessed on 13 April 2021).
  20. MMOptics. EVINCE. Available online: https://mmo.com.br/evince/!# (accessed on 13 April 2021).
  21. Onizawa, K.; Saginoya, H.; Furuya, Y.; Yoshida, H.; Fukuda, H. Usefulness of fluorescence photography for diagnosis of oral cancer. Int. J. Oral Maxillofac. Surg. 1999, 28, 206–210.
  22. Farah, C.S.; McIntosh, L.; Georgiou, A.; McCullough, M. Efficacy of tissue autofluorescence imaging in the visualization of oral mucosal lesions. Head Neck 2012, 34, 856–862.
  23. Petruzzi, M.; Lucchese, A.; Nardi, G.M.; Lauritano, D.; Favia, G.; Serpico, R.; Grassi, F.R. Evaluation of autofluorescence and toluidine blue in the differentiation of oral dysplastic and neoplastic lesions from non dysplastic and neoplastic lesions: A cross-sectional study. J. Biomed. Opt. 2014, 19, 076003.
  24. Scheer, M.; Fuss, J.; Derman, M.A.; Kreppel, M.; Neugebauer, J.; Rothamel, D.; Drebber, U.; Zoeller, J.E. Autofluorescence imaging in recurrent oral squamous cell carcinoma. Oral Maxillofac. Surg. 2015, 20, 27–33.
  25. Jané-Salas, E.; Blanco-Carrión, A.; Jover-Armengol, L.; López-López, J. Autofluorescence and Diagnostic Accuracy of Lesions of Oral Mucosa: A Pilot Study. Braz. Dent. J. 2015, 26, 580–586.
  26. Ramanathan, A.; Rosedee, N.A.; Edwer, S.A.; John, E.P.; Palaniswany, K.; Bakar, Z.A. Utility of autofluorescence imaging in the detection of oral mucosal lesions in elderly institutionalised subjects. Ann. Dent. 2015, 21, 2–16.
  27. Messadi, D.V.; Younai, F.S.; Liu, H.H.; Guo, G.; Wang, C.Y. The clinical effectiveness of reflectance optical spectroscopy for the in vivo diagnosis of oral lesions. Int. J. Oral Sci. 2014, 6, 162–167.
  28. Lalla, Y.; Matias, M.A.T.; Farah, C.S. Assessment of oral mucosal lesions with autofluorescence imaging and reflectance spectroscopy. J. Am. Dent. Assoc. 2016, 147, 650–660.
  29. Simonato, L.E.; Tomo, S.; Miyahara, G.I.; Navarro, R.S.; Villaverde, A.G.J.B. Fluorescence visualization efficacy for detecting oral lesions more prone to be dysplastic and potentially malignant disorders: A pilot study. Photodiagnosis Photodyn. Ther. 2017, 17, 1–4.
  30. Lima, I.F.P.; Brand, L.M.; de Figueiredo, J.A.P.; Steier, L.; Lamers, M.L. Use of autofluorescence and fluorescent probes as a potential diagnostic tool for oral cancer: A systematic review. Photodiagnosis Photodyn. Ther. 2021, 33, 102073.
  31. Shi, L.; Li, C.; Shen, X.; Zhou, Z.; Liu, W.; Tang, G. Potential role of autofluorescence imaging in determining biopsy of oral potentially malignant disorders: A large prospective diagnostic study. Oral Oncol. 2019, 98, 176–179.
  32. Jeng, M.J.; Sharma, M.; Sharma, L.; Huang, S.F.; Chang, L.B.; Wu, S.L.; Chow, L. Novel Quantitative Analysis Using Optical Imaging (VELscope) and Spectroscopy (Raman) Techniques for Oral Cancer Detection. Cancers 2020, 12, 3364.
  33. Jeng, M.J.; Sharma, M.; Chao, T.Y.; Li, Y.C.; Huang, S.F.; Chang, L.B.; Chow, L. Multiclass classification of autofluorescence images of oral cavity lesions based on quantitative analysis. PLoS ONE 2020, 15, e0228132.
  34. Huang, T.-T.; Chen, K.-C.; Wong, T.-Y.; Chen, C.-Y.; Chen, W.-C.; Chen, Y.-C.; Chang, M.-H.; Wu, D.-Y.; Huang, T.-Y.; Nioka, S.; et al. Two-channel autofluorescence analysis for oral cancer. J. Biomed. Opt. 2018, 24, 051402.
  35. Cherry, K.D.; Schwarz, R.A.; Yang, E.C.; Vohra, I.S.; Badaoui, H.; Williams, M.D.; Vigneswaran, N.; Gillenwater, A.M.; Richards-Kortum, R. Autofluorescence Imaging to Monitor the Progression of Oral Potentially Malignant Disorders. Cancer Prev. Res. 2019, 12, 791–800.
  36. Yan, Y.J.; Huang, T.W.; Cheng, N.L.; Hsieh, Y.F.; Tsai, M.H.; Chiou, J.C.; Duann, J.R.; Lin, Y.J.; Yang, C.S.; Ou-Yang, M. Portable LED-induced autofluorescence spectroscopy for oral cancer diagnosis. J. Biomed. Opt. 2017, 22, 45007.
  37. Tsai, M.H.; (China Medical University Hospital, 2 Yuh-Der Road, Taichung City 40447, Taiwan). Approval of Hyperspectrum for oral pre-cancer and early cancer detection and diagnosis (CMUH102-REC1-069). 2013.
  38. Daniel, W.W. Kolmogorov–Smirnov one-sample test. In Applied Nonparametric Statistics; PWS-Kent: Boston, MA, USA, 1990; pp. 319–330.
  39. Mandrekar, J.N. Receiver Operating Characteristic Curve in Diagnostic Test Assessment. J. Thorac. Oncol. 2010, 5, 1315–1316.
  40. Bewick, V.; Cheek, L.; Ball, J. Statistics review 13: Receiver operating characteristic curves. Crit. Care 2004, 8, 508–512.
  41. Søreide, K. Receiver-operating characteristic curve analysis in diagnostic, prognostic and predictive biomarker research. J. Clin. Pathol. 2008, 62, 1–5.
  42. Goodman, T.; Heidel, G.; Muray, K.; Ohno, Y.; Sauter, G.; Schanda, J.; Steudtner, W.; Young, R. Measurement of LEDs; CIE 127-2007; Commission internationale de l’éclairage: Vienna, Austria, 2007; ISBN 9783901906589.
  43. Silicon Optronics. SOI268 Color CMOS UXGA (2.0 MPixel) Sensor. Available online: http://www.zhopper.narod.ru/mobile/soi268_full.pdf (accessed on 30 April 2021).
  44. Midwest Optical Systems. BP525 Light Green Bandpass Filter. Available online: https://midopt.com/filters/bp525/ (accessed on 30 April 2021).
  45. Midwest Optical Systems. BP635 Light Red Bandpass Filter. Available online: https://midopt.com/filters/bp635/ (accessed on 30 April 2021).
  46. Midwest Optical Systems. BP695 Near-IR Bandpass Filter. Available online: https://midopt.com/filters/bp695/ (accessed on 30 April 2021).
  47. Midwest Optical Systems. BP470 Blue Bandpass Filter. Available online: https://midopt.com/filters/bp470/ (accessed on 30 April 2021).
  48. Midwest Optical Systems. BP505 Cyan Bandpass Filter. Available online: https://midopt.com/filters/bp505/ (accessed on 30 April 2021).
  49. Midwest Optical Systems. BN532 Narrow Green Bandpass Filter. Available online: https://midopt.com/filters/bn532/ (accessed on 30 April 2021).
  50. Midwest Optical Systems. Bi550 Green Interference Bandpass Filter. Available online: https://midopt.com/filters/bi550/ (accessed on 30 April 2021).
  51. Midwest Optical Systems. BN595 Narrow Orange Bandpass Filter. Available online: https://midopt.com/filters/bn595/ (accessed on 30 April 2021).
  52. Edmund Optics. SCHOTT GG-455, 12.5mm Dia., 3mm Thick, Colored Glass Longpass Filter. Available online: https://www.edmundoptics.com.tw/p/gg-455-12.5mm-dia.-longpass-filter/11319/ (accessed on 30 April 2021).
  53. De Veld, D.; Witjes, M.; Sterenborg, H.; Roodenburg, J. The status of in vivo autofluorescence spectroscopy and imaging for oral oncology. Oral Oncol. 2005, 41, 117–131.
  54. Farah, C.S.; McCullough, M.J. A pilot case control study on the efficacy of acetic acid wash and chemiluminescent illumination (ViziLite™) in the visualisation of oral mucosal white lesions. Oral Oncol. 2007, 43, 820–824.
  55. Huber, M.A.; Bsoul, S.A.; Terezhalmy, G.T. Acetic acid wash and chemiluminescent illumination as an adjunct to conventional oral soft tissue examination for the detection of dysplasia: A pilot study. Quintessence Int. 2004, 35, 378–384.
  56. Kerr, A.R.; Sirois, D.A.; Epstein, J.B. Clinical evaluation of chemiluminescent lighting: An adjunct for oral mucosal examinations. J. Clin. Dent. 2006, 17, 59–63.
  57. Lane, P.M.; Gilhuly, T.; Whitehead, P.; Zeng, H.; Poh, C.F.; Ng, S.; Williams, P.M.; Zhang, L.; Rosin, M.P.; MacAulay, C.E. Simple device for the direct visualization of oral-cavity tissue fluorescence. J. Biomed. Opt. 2006, 11, 024006.
  58. Svistun, E.; Alizadeh-Naderi, R.; El-Naggar, A.; Jacob, R.; Gillenwater, A.; Richards-Kortum, R. Vision enhancement system for detection of oral cavity neoplasia based on autofluorescence. Head Neck 2004, 26, 205–215.
  59. Rahman, M.; Chaturvedi, P.; Gillenwater, A.M.; Richards-Kortum, R. Low-cost, multimodal, portable screening system for early detection of oral cancer. J. Biomed. Opt. 2008, 13, 030502.
  60. Farah, C.; McCullough, M.; McCullough, M. Oral cancer awareness for the general practitioner: New approaches to patient care. Aust. Dent. J. 2008, 53, 2–10.
  61. Roblyer, D.; Richards-Kortum, R.; Sokolov, K.; El-Naggar, A.K.; Williams, M.D.; Kurachi, C.; Gillenwater, A.M. Multispectral optical imaging device for in vivo detection of oral neoplasia. J. Biomed. Opt. 2008, 13, 024019.
  62. Rahman, M.S.; Ingole, N.; Roblyer, D.; Stepanek, V.; Richards-Kortum, R.; Gillenwater, A.; Shastri, S.; Chaturvedi, P. Evaluation of a low-cost, portable imaging system for early detection of oral cancer. Head Neck Oncol. 2010, 2, 10–18.
  63. Patton, L.L.; Epstein, J.B.; Kerr, A.R. Adjunctive techniques for oral cancer examination and lesion diagnosis: A systematic review of the literature. J. Am. Dent. Assoc. 2008, 139, 896–905.
  64. Lingen, M.W.; Kalmar, J.R.; Karrison, T.; Speight, P.M. Critical evaluation of diagnostic aids for the detection of oral cancer. Oral Oncol. 2008, 44, 10–22.
  65. Messadi, D.V. Diagnostic aids for detection of oral precancerous conditions. Int. J. Oral Sci. 2013, 5, 59–65.
  66. Poh, C.F.; Ng, S.P.; Williams, P.M.; Zhang, L.; Laronde, D.M.; Lane, P.; Macaulay, C.; Rosin, M.P. Direct fluorescence visualization of clinically occult high-risk oral premalignant disease using a simple hand-held device. Head Neck 2006, 29, 71–76.
  67. Nagi, R.; Reddy-Kantharaj, Y.; Rakesh, N.; Janardhan-Reddy, S.; Sahu, S. Efficacy of light based detection systems for early detection of oral cancer and oral potentially malignant disorders: Systematic review. Medicina Oral Patología Oral y Cirugia Bucal 2016, 21, e447–e455.
  68. Awan, K.H.; Morgan, P.R.; Warnakulasuriya, S. Assessing the accuracy of autofluorescence, chemiluminescence and toluidine blue as diagnostic tools for oral potentially malignant disorders—A clinicopathological evaluation. Clin. Oral Investig. 2015, 19, 2267–2272.
  69. De Veld, D.C.; Skurichina, M.; Witjes, M.J.; Duin, R.P.; Sterenborg, D.J.; Star, W.M.; Roodenburg, J.L. Autofluorescence characteristics of healthy oral mucosa at different anatomical sites. Lasers Surg. Med. 2003, 32, 367–376.
  70. Gillenwater, A.; Papadimitrakopoulou, V.; Richards-Kortum, R. Oral premalignancy: New methods of detection and treatment. Curr. Oncol. Rep. 2006, 8, 146–154.
Figure 1. Mechanical drawing and explosion drawing of the LIAF multispectral imager.
Figure 2. Composition of the 4CH LIAF multi-spectral imager and 8CH LIAF multi-spectral imager, including excitation lights, LED current, band-pass filters, a long-pass filter, and a CMOS sensor.
Figure 3. Flowchart of the trials.
Figure 4. (a) Tumor regions and normal regions that were captured are marked with white circles. (b) The LIAF imager captures the tumor or normal points. (c) Captured image with a circle indicating the opening scope of the probe.
Figure 5. (a) Gaussian distribution of normal and tumor data. The Gaussian distributions of normal and tumor data are drawn as a solid line and dotted line, respectively. The crossing point marked as “◆” represents the optimized threshold. (b) The ROC curve of normal and tumor data in one classifier. The crossing point marked as “•” represents the optimized threshold.
Figure 6. Spectral radiant flux of the excitation LEDs.
Figure 7. Spectral transmittance of the emission filters and RGB color filters of the 4CH LIAF multi-spectral imager [43,44,45,46].
Figure 8. Spectral transmittance of the emission filters, long-pass filter, and RGB color filters of the 8CH LIAF multi-spectral imager [43,47,48,49,50,51,52,53].
Figure 9. AUC statistics of the classifiers with 30 spectral bands in Exp_1, Exp_2, and Exp_3. The data in each classifier were used to calculate the AUC (A + B). The solid red points are the average AUCs. The highest and lowest black bold dashed lines are the maximum and minimum AUCs. The highest and lowest blue dashed lines are the third and first quartile AUCs. The middle blue dashed lines are the median AUCs.
Figure 10. AUC statistics of the classifiers with 30 spectral bands in Exp_1, Exp_2, and Exp_3. The data in group A were used to calculate the AUC. The solid red points are the average AUCs. The highest and lowest black bold dashed lines are the maximum and minimum AUCs. The highest and lowest blue dashed lines are the third and first quartile AUCs. The middle blue dashed lines are the median AUCs.
Figure 11. AUC statistics of the classifiers with 30 spectral bands in Exp_1, Exp_2, and Exp_3. The data in group B were used to calculate the AUC. The solid red points are the average AUCs. The highest and lowest black bold dashed lines are the maximum and minimum AUCs. The highest and lowest blue dashed lines are the third and first quartile AUCs. The middle blue dashed lines are the median AUCs.
Figure 12. (a) Sensitivity, specificity, and accuracy of the classifiers using the 505 nm emission filter correlated with the green filter. (b) Sensitivity, specificity, and accuracy of the classifiers using the green filter. (c) Sensitivity, specificity, and accuracy of the classifiers using the 470 nm emission filter correlated with the blue filter. (d) Sensitivity, specificity, and accuracy of the classifiers using the 550 nm emission filter correlated with the green filter. (e) Sensitivity, specificity, and accuracy of the classifiers using the 532 nm emission filter correlated with the green filter. (f) Sensitivity, specificity, and accuracy of the classifiers using the blue filter. Each spectral band (an emission filter correlated with the red, green, or blue color filter) has eight classifiers, according to the excitation LEDs and the computing methods. The red dotted circles indicate the classifiers with the highest sensitivity (over 94%).
Figure 13. Flow chart of the AI-based band selection method. Four spectral bands are selected from C01 to C30 in the first stage. From each selected spectral band, the classifier whose AUC is greater than 85% is selected; a total of four classifiers are selected in the second stage. The accuracy, sensitivity, and specificity of the four selected classifiers are each multiplied by the corresponding weighting W and summed. The summed accuracy is multiplied by 0.1, the summed sensitivity by 0.8, and the summed specificity by 0.1; the weighting score is the sum of these three values. The method adjusts the weightings to find the maximum weighting score for each selection, and the weighting scores of the selections are compared to find the best filter group.
Table 1. Characteristics of the subjects in three experiments.
Experiment                    | Exp_1    | Exp_2    | Exp_3
Version of the imaging system | 4CH      | 4CH      | 4CH
Dark image correction         | no       | yes      | yes
Patients                      | 17       | 19       | 26
Male/Female                   | 17/00    | 16/03    | 23/03
Specimens (sample points)     | 17 (106) | 19 (108) | 28 (222)
Age                           | 54 ± 14  | 58 ± 12  | 58 ± 11
Table 2. Information of the specimens and the sample points in three experiments.
Each cell lists specimens / normal points / tumor points.

Experiment | Total sample points | Testing group A | Testing group B
Exp_1      | 17 / 53 / 53        | 9 / 27 / 27     | 8 / 26 / 26
Exp_2      | 19 / 54 / 54        | 10 / 28 / 28    | 9 / 26 / 26
Exp_3      | 30 / 111 / 111      | 15 / 51 / 51    | 15 / 60 / 60
Table 3. Spectral transmittances of the R, G, and B filters correlated with and without the emission filters.
No. | Spectral band | Band wavelength (nm) | Transmittance at center wavelength
C01 | 525 nm_B | 510 ± 30.0 | 0.24
C02 | 525 nm_G | 540 ± 30.0 | 0.64
C03 | 525 nm_R | 570 ± 15.0 | 0.13
C04 | 635 nm_B | 660 ± 40.0 | 0.10
C05 | 635 nm_G | 600 ± 42.5 | 0.26
C06 | 635 nm_R | 620 ± 42.5 | 0.94
C07 | 695 nm_B | 700 ± 25.0 | 0.22
C08 | 695 nm_G | 700 ± 22.5 | 0.36
C09 | 695 nm_R | 700 ± 27.5 | 0.84
C10 | 470 nm_B | 490 ± 25.0 | 0.26
C11 | 470 nm_G | 490 ± 15.0 | 0.17
C12 | 470 nm_R | 750 ± 10.0 | 0.03
C13 | 505 nm_B | 500 ± 30.0 | 0.28
C14 | 505 nm_G | 540 ± 35.0 | 0.63
C15 | 505 nm_R | 580 ± 15.0 | 0.10
C16 | 532 nm_B | 520 ± 20.0 | 0.16
C17 | 532 nm_G | 540 ± 25.0 | 0.64
C18 | 532 nm_R | 570 ± 10.0 | 0.09
C19 | 550 nm_B | 540 ± 15.0 | 0.13
C20 | 550 nm_G | 540 ± 15.0 | 0.63
C21 | 550 nm_R | 550 ± 15.0 | 0.05
C22 | 595 nm_B | 600 ± 20.0 | 0.09
C23 | 595 nm_G | 580 ± 25.0 | 0.42
C24 | 595 nm_R | 600 ± 17.5 | 0.95
C25 | 632 nm_B | 640 ± 15.0 | 0.09
C26 | 632 nm_G | 640 ± 17.5 | 0.16
C27 | 632 nm_R | 630 ± 17.5 | 0.82
C28 | Blue  | 500 ± 50.0 | 0.30
C29 | Green | 540 ± 45.0 | 0.68
C30 | Red   | 620 ± 40.0 | 1
Table 4. Definition of the sensitivity, specificity, and accuracy.
                    | Condition positive  | Condition negative
Predicted positive  | True-positive (TP)  | False-positive (FP)
Predicted negative  | False-negative (FN) | True-negative (TN)

Sensitivity = TP/(TP + FN)
Specificity = TN/(TN + FP)
Accuracy = (TP + TN)/(TP + TN + FP + FN)
Table 5. Average AUCs (%) of group A + B, A, and B.
A + B         |  A            |  B
No.   AUC     |  No.   AUC    |  No.   AUC
3C14  91.52   |  3C29  90.93  |  3C14  92.76
3C10  90.74   |  3C28  90.39  |  3C10  90.92
3C17  90.62   |  3C14  90.36  |  3C17  90.19
3C29  89.08   |  3C10  90.27  |  3C20  90.04
3C20  88.49   |  3C17  89.58  |  3C29  87.93
3C28  86.63   |  3C20  87.00  |  3C28  85.34
Table 6. Results of AI-based band selection in three data groups.
Group | No.  | Excitation | Method            | Weighting (W)
A + B | 3C20 | 405 nm     | Intensity         | 0.7
A + B | 3C14 | 405 nm     | Intensity         | 0.1
A + B | 3C17 | 405 nm     | Intensity         | 0.1
A + B | 3C28 | 405 nm     | Intensity         | 0.1
A     | 3C20 | 405 nm     | Intensity         | 0.7
A     | 3C10 | 405 nm     | Intensity         | 0.1
A     | 3C17 | 405 nm     | Intensity         | 0.1
A     | 3C29 | 365 nm     | Intensity         | 0.1
B     | 3C14 | 405 nm     | Fractal dimension | 0.7
B     | 3C17 | 405 nm     | Intensity         | 0.1
B     | 3C20 | 405 nm     | Intensity         | 0.1
B     | 3C29 | 365 nm     | Histogram         | 0.1
Citation:
Yan, Y.-J.; Cheng, N.-L.; Jan, C.-I.; Tsai, M.-H.; Chiou, J.-C.; Ou-Yang, M. Band-Selection of a Portal LED-Induced Autofluorescence Multispectral Imager to Improve Oral Cancer Detection. Sensors 2021, 21, 3219. https://0-doi-org.brum.beds.ac.uk/10.3390/s21093219
