Review

Dental Images Recognition Technology and Applications: A Literature Review

by María Prados-Privado 1,2,*, Javier García Villalón 1, Carlos Hugo Martínez-Martínez 1 and Carlos Ivorra 1

1 Asisa Dental, Research Department, C/José Abascal, 32, 28003 Madrid, Spain
2 Department of Signal Theory and Communications, Higher Polytechnic School, Universidad de Alcalá de Henares, Ctra. Madrid-Barcelona, Km. 33,600, 28805 Alcalá de Henares, Spain
* Author to whom correspondence should be addressed.
Submission received: 25 March 2020 / Revised: 14 April 2020 / Accepted: 17 April 2020 / Published: 20 April 2020
(This article belongs to the Special Issue Advances in Image Processing, Analysis and Recognition Technology)

Abstract

Neural networks are increasingly being used in the field of dentistry. The aim of this literature review was to survey the state of the art of artificial intelligence in dental applications, such as the detection of teeth, caries, filled teeth, crowns, prostheses, dental implants and endodontic treatment. A search was conducted in PubMed, the Institute of Electrical and Electronics Engineers (IEEE) Xplore and arXiv.org. Data extraction was performed independently by two reviewers. Eighteen studies were included. Teeth were the most frequently analyzed variable (n = 9), followed by caries (n = 7). No studies detecting dental implants or filled teeth were found, and only two studies investigated endodontic applications. Panoramic radiographs were the most common image type employed (n = 5), followed by periapical images (n = 3); near-infrared light transillumination images were employed in two studies, and bitewing and computed tomography (CT) images were each employed in one study. The included articles used a wide variety of neural networks to detect the described variables, and the databases used were highly heterogeneous in their number of images. Given this heterogeneity in image databases, image types, neural architectures and results, a standardized methodology is needed to increase the comparability and robustness of studies.

1. Introduction

Medical imaging techniques, such as computed tomography (CT) or X-ray among others, have been used in recent decades for the detection, diagnosis and treatment of different diseases [1].
A new and emerging field in dentistry is dental informatics, because of the possibility it offers to improve treatment and diagnosis [2], in addition to saving time and reducing stress and fatigue during daily practice [3]. Medical practice in general, and dentistry in particular, generates massive data from sources such as high-resolution medical imaging, biosensors with continuous output and electronic medical records [4]. The use of computer programs can help dental professionals in making decisions related to prevention, diagnosis or treatment planning, among others [5].
At present, one of the artificial intelligence methods employed in clinical fields is deep learning [6]. Artificial intelligence is the term used to describe algorithms designed for problem solving and reasoning [7]. The success of deep learning is mainly due to progress in computing capacity, the huge amount of data available and the development of new algorithms [1]. The method has proven effective in image-based diagnosis in several fields [8]. Convolutional neural networks (CNNs), which have developed extremely quickly during the last decade [9], are commonly used in applications relying on deep learning, mainly as a choice for analyzing medical images. CNNs have been successfully employed in medicine, primarily in oncology, for the automated assessment of breast cancer in mammograms, skin cancer in clinical skin screenings, and diabetic retinopathy in eye examinations [10].
CNNs have been recently applied in dentistry to detect periodontal bone loss [11,12], caries on bitewing radiographs [13], apical lesions [14], or for medical image classification [12]. These kinds of neural networks can be used to detect structures, such as teeth or caries, to classify them and to segment them [15]. Neural networks need to be trained and optimized, and for that an image database is necessary.
There are several imaging techniques in the dentistry field, depending on their use. Periapical images are employed to capture intact teeth, both anterior and posterior, as well as their surrounding bone; they are therefore very helpful for visualizing potential caries, periodontal bone loss and periapical diseases [16]. Bitewing images can only visualize the crowns of posterior teeth, with simple layouts and considerably less overlap [17]. Panoramic radiographs are very common in dentistry because they allow the screening of a broad anatomical region while requiring a relatively low radiation dose [18].
The objective of this review of the literature was to survey the state of the art of artificial intelligence in various dental applications, such as the detection of teeth, caries, filled teeth, or endodontic treatment, among others.

2. Materials and Methods

2.1. Review Questions

(1) What are the neural networks used to detect teeth, filled teeth, caries, dental implants and endodontic teeth?
(2) How is the database used in the construction of these networks?
(3) What are the outcome metrics, and their values, obtained by those neural networks?

2.2. Search Strategy

An electronic search was performed in the MEDLINE/PubMed, Institute of Electrical and Electronics Engineers (IEEE) Xplore and arXiv.org databases, up until 17 March 2020. MEDLINE/PubMed indexes most journal manuscripts in the medical field. IEEE Xplore covers articles related to computer science, electrical engineering and electronics (https://ieeexplore.ieee.org/Xplore/home.jsp). arXiv.org is an electronic archive for scientific manuscripts in fields including physics, computer science and mathematics.
The search strategy used is detailed in Table 1.

2.3. Study Selection

M.P.-P. and J.G.-V. performed the bibliographic search and selected the articles that fulfilled the inclusion criteria. Both authors extracted the results independently. The references of the articles included in this study were also reviewed manually.

2.4. Inclusion and Exclusion Criteria

The inclusion criteria were full manuscripts, including conference proceedings, that reported the use of neural networks for detecting teeth, caries, filled teeth, dental implants or endodontic treatments. There were no restrictions on language or date of publication. The exclusion criteria were reviews, studies without a dental application and studies that did not employ a neural network.

3. Results

3.1. Study Selection

Figure 1 details a flowchart of the study selection. The electronic search strategies resulted in 387 potential manuscripts, of which 378 were excluded because they did not meet the inclusion criteria. Additionally, a manual search of the references cited in the nine remaining articles was carried out, from which nine more articles were incorporated. In total, eighteen studies were analyzed.

3.2. Relevant Data of Included Studies

All of the included manuscripts were published between 2013 and 2020. Table 2 details the main characteristics of those included in the manuscript.
According to Table 2, the number of studies published increased each year and most of them were published in 2019. Selected works were published across seven countries, most of them in the United States (n = 5) and England (n = 5).
Regarding the variables detected by the included studies, teeth were the most frequently analyzed (n = 9), followed by caries (n = 7). No studies detecting dental implants or filled teeth were found, and only two studies investigated endodontic applications.
The total image database varied from 52 to 9812 images, with a mean of 1379 images. Panoramic radiographs were the most common image type employed (n = 7), followed by periapical images (n = 3). Near-infrared light transillumination images were employed in two studies, and bitewing, CT and radiovisiography images were each employed in one study. No image type was detailed in two of the studies.

3.3. Tooth Detection

A deep convolutional neural network (DCNN) with an AlexNet architecture was employed by Miki et al. for classifying tooth types on dental cone-beam computed tomography (CT) images. In that study, the authors employed forty-two images to train the network and ten images to test it and obtained a relatively high accuracy (above 80%) [24].
A mask region-based convolutional neural network (Mask R-CNN) was employed by Jader et al. to obtain the profile of each tooth, employing 1500 panoramic X-ray radiographs. The outcome metrics employed in this study were accuracy, F1-score, precision, recall and specificity, with values of 0.98, 0.88, 0.94, 0.84 and 0.99, respectively [23].
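The five outcome metrics reported by Jader et al. are all derived from the same confusion-matrix counts (true/false positives and negatives). A minimal sketch of those standard definitions, using illustrative counts rather than data from the study:

```python
def detection_metrics(tp, fp, fn, tn):
    """Compute common detection metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)            # also called sensitivity
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "specificity": specificity, "f1": f1}

# Illustrative counts, not taken from any included study
m = detection_metrics(tp=84, fp=6, fn=16, tn=94)
print({k: round(v, 3) for k, v in m.items()})
```

Note that a detector can score well on one metric and poorly on another, which is why the included studies report several of them together.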
Faster regions with convolutional neural network features (faster R-CNN) in the TensorFlow tool package were used by Chen et al. to detect and number the teeth in dental periapical films [16]. Here, 800 images were employed as the training dataset, 200 as the test dataset and 250 as the validation dataset. The outcome metrics were recall and precision, which obtained 0.728 and 0.771, respectively. Chen et al. also employed a neural network to predict the missing teeth number.
Periapical images with faster R-CNN and region-based fully convolutional networks (R-FCN) were employed by Zhang et al. Here, 700 images were employed to train the network, 200 to test it and 100 to validate it. The method proposed by Zhang et al. achieved a high precision of 0.958 and a recall of 0.961 [20].
The efficiencies of a radial basis function neural network (RBFNN) and of a GAME neural network in predicting dental age in the Czech population were compared by Velemínská et al., employing panoramic X-rays of 1393 individuals aged from three to 17 years. In this case, the standard deviation was measured [25].
A total of 1352 panoramic images were employed by Tuzoff et al. to detect teeth using a Faster R-CNN architecture [3]. This study obtained a sensitivity of 0.9941 and a precision of 0.9945.
By employing a CNN architecture and the PyBrain library, Raith et al. classified teeth and obtained a performance of 0.93 [21].
One hundred dental panoramic radiographs were employed by Muramatsu et al. for an object detection network using a four-fold cross-validation method. The tooth detection sensitivity was 96.4% and the accuracy was 93.2% [28].
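The four-fold cross-validation used by Muramatsu et al. partitions the image set so that every image serves in the test set exactly once. A minimal, index-based sketch of that splitting idea (an illustration of the general technique, not their implementation):

```python
def k_fold_indices(n_items, k):
    """Split item indices into k roughly equal folds; each fold serves
    once as the test set while the remaining folds form the training set."""
    folds = [list(range(i, n_items, k)) for i in range(k)]
    splits = []
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        splits.append((sorted(train), sorted(test)))
    return splits

# Four-fold split of 100 radiograph indices, matching the study's setup
splits = k_fold_indices(100, 4)
```

Averaging the metric over the four test folds gives a less optimistic estimate than a single train/test split, which matters for the small databases common in these studies.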
A database of 100 panoramic radiographs with an AlexNet architecture was employed by Oktay to detect teeth with an accuracy of over 0.92, depending on the type of tooth (molar, incisor, and premolars) [30].

3.4. Caries Detection

Two deep convolutional neural networks (CNNs), ResNet18 and ResNeXt50, were applied by Schwendicke et al. to detect caries lesions in near-infrared light transillumination (NILT) images [10]. In this study, 226 extracted permanent human teeth (113 premolars and 113 molars) were employed. According to their results, the two models performed similarly in predicting caries on tooth segments of the NILT images. The area under the curve (AUC), sensitivity and specificity were evaluated, with results of 0.74, 0.59 and 0.76, respectively.
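The AUC reported in these caries studies summarizes ranking performance over all decision thresholds; it can be computed from per-segment scores and labels via the rank (Mann–Whitney) formulation. A minimal sketch with purely illustrative data:

```python
def auc(scores, labels):
    """AUC as the probability that a random positive case is scored
    higher than a random negative case (ties count as half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative scores for six tooth segments (label 1 = carious)
print(auc([0.9, 0.8, 0.3, 0.7, 0.2, 0.1], [1, 1, 0, 1, 0, 0]))  # → 1.0
```

An AUC of 0.5 corresponds to chance-level ranking and 1.0 to perfect separation, which is why values such as 0.74 indicate moderate discrimination.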
A deep learning model was employed by Casalegno et al. for the automated detection and localization of dental lesions in 217 near-infrared transillumination images of upper and lower molars and premolars. Here, 185 images were used to train the network and 32 images were used to validate it. The results showed an AUC of 85.6% for proximal lesions and an AUC of 83.6% for occlusal lesions [26].
A total of 3000 periapical radiographs were employed by Lee et al. to detect dental caries [13]. From the total dataset, 25.9% of the images were maxillary premolars, 25.6% maxillary molars, 24.1% mandibular premolars and 24.4% mandibular molars. A pre-trained GoogLeNet Inception v3 CNN was used for preprocessing, and the datasets were trained using transfer learning of the deep CNN weight factors. For detecting caries, this study obtained accuracies of 89%, 88% and 82% for premolars, molars and combined premolars–molars, respectively, with corresponding AUC values of 0.917, 0.890 and 0.845.
Caries was analyzed from socioeconomic and dietary factors by Zanella-Calzada et al., who employed an ANN to determine the state of health [27]. The ANN was designed with seven layers: four dense layers and three dropout layers. Data from a total of 9812 subjects were employed, 70% for training and the remaining 30% for testing. The results showed an accuracy of approximately 0.69 and an AUC of 0.75.
A total of 3000 bitewing images were employed by Srivastava et al. to detect dental caries with a deep fully convolutional neural network. The results of this study were a recall of 0.805, a precision of 0.615 and an F1-score of 0.7 [22].
A total of 251 radiovisiography images were employed by Prajapati et al. to detect caries with a convolutional neural network, which achieved an accuracy of 0.875 [29].
A back-propagation neural network with a database of 105 intra-oral images was employed by Geetha et al. to detect caries. This architecture achieved an accuracy of 0.971 and a precision recall curve (PRC) area of 0.987 [31].

3.5. Dental Implant and Filled Teeth Detection

Implant treatment is a common practice in different clinical situations for replacing teeth. However, no studies were found that used artificial intelligence and neural networks to detect implants on radiographs. The same is true for filled teeth detection.

3.6. Endodontic Treatment Detection

A convolutional neural network (CNN) system was employed by Fukuda et al. for detecting vertical root fractures (VRFs) in panoramic radiographs [19]. Three hundred images were used as the image dataset, of which 240 were assigned to a training set and 60 to a test set. This study constructed a CNN-based deep learning model using DetectNet with DIGITS version 5.0 and obtained a recall of 0.75, a mean precision of 0.93 and an F-measure of 0.83.
Deep convolutional neural networks (CNNs) based on Keras were applied by Ekert et al. to detect apical lesions on panoramic dental radiographs [14]. A total of 85 images were employed, obtaining an AUC of 0.89 for molars and 0.85 for other teeth, with sensitivities of 0.74 and 0.65, respectively.

4. Discussion

The goal of this literature review was to survey the state of the art of artificial intelligence in detecting different dental situations, such as teeth, caries, filled teeth, endodontic treatments and dental implants.
Neural networks can have single or multiple layers of interconnected nodes, or neurons, that allow signals to travel through the network. ANNs are typically divided into three layers of neurons: the input layer (receives the information), the hidden layers (extract patterns and perform the internal processing) and the output layer (presents the final network output) [32,33]. Training is the process of optimizing the network parameters [34]. Figure 2 details the architecture for teeth detection.
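The input–hidden–output structure described above can be sketched as a minimal feed-forward pass in Python with NumPy. The dimensions and random weights below are purely illustrative and do not correspond to any included study's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Input layer: 64 pixel features; hidden layer: 16 neurons; output: 1 score
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)   # input -> hidden weights
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)     # hidden -> output weights

def forward(x):
    hidden = relu(x @ W1 + b1)         # hidden layer extracts patterns
    return sigmoid(hidden @ W2 + b2)   # output layer presents the final score

x = rng.normal(size=(1, 64))           # one flattened image patch
y = forward(x)                         # detection score in (0, 1)
```

Training would adjust W1, b1, W2 and b2 to minimize a loss over labeled images; convolutional networks replace the dense input–hidden multiplication with learned filters.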
More and more industries are using artificial intelligence to make increasingly complex decisions, and many alternatives are available to them [32]. However, in view of our results, there is a paucity of guidance on selecting the appropriate methods tailored to the health-care industry.
The benefit of neural networks in medicine and dentistry is related to their ability to process large amounts of data for analyzing, diagnosing and disease monitoring. Deep learning has become a great ally in the field of medicine in general and is beginning to be one in dentistry. According to the year of publication of the studies included in this review, 2019 was the year in which the most articles were published.
The results provided by artificial intelligence depend greatly on the data with which the models learn and are trained, that is, on the input data and the images employed to detect each variable. All of the studies included in this review employed radiographs, mainly panoramic radiographs. In this sense, it would be interesting to apply neural networks and artificial intelligence to other types of radiological studies, such as cone beam computed tomography (CBCT) or cephalometry, which allow clinicians to make a complete anatomical examination. Lee et al. evaluated the detection and diagnosis of different lesions employing CBCT and a deep convolutional neural network [35]. Before the variables analyzed in this review can be detected, the teeth themselves must be detected. Panoramic radiography is the most common technique in general dentistry, capturing the entire mouth in a single 2D image [36,37], and it is common to use artificial intelligence to detect the presence or absence of a tooth. The main advantages of this type of image are patient comfort compared with other techniques, such as intraoral images (bitewing and periapical); low radiation exposure; and the ability to evaluate a larger area of the maxilla and mandible [37].
Panoramic radiographs are useful for evaluating endodontic treatments, periapical lesions and bone disorders, among others [38]. This type of image obtained the best results in tooth detection when compared with the study that used periapical images to detect this variable. In addition, the results obtained by the studies that detected teeth were superior to those for the rest of the variables analyzed, regardless of the network or image type used.
Caries is one of the most common chronic diseases in the oral field, with a great impact on a patient’s health [39]. Clinical examination is the main method for caries detection, with radiographic examination being a complementary diagnostic tool [40]. According to experience and scientific literature, intraoral bitewing images are the most effective in detecting caries lesions [41]. However, only one study included in this review employed bitewings to detect caries. Two studies used near-infrared transillumination images and one employed periapical images. The best results were obtained in the study where periapical images were used to detect caries.
A variety of CNN architectures were found in the studies included in this literature review. Convolutional networks are designed to process data that come in the form of multiple arrays and that are structured in a series of stages [42]. In recent decades, CNNs have been applied with success for the detection, segmentation and recognition of objects in images. In this review, convolutional networks applied to the detection of dental variables were used.
Faster regions with convolutional neural network features (Faster R-CNN) is composed of two modules: the first is a deep fully convolutional network that proposes regions, and the second is the Fast R-CNN detector [43]. ResNets are residual networks, CNNs designed to allow for thousands of convolutional layers. The Deep Neural Network for Object Detection (DetectNet) outputs the XY coordinates of detected objects; this kind of neural network has been applied in different medical fields [19,44]. Keras is an open-source neural network library written in Python. PyBrain is a machine-learning library for Python whose objective is to provide flexible, easy-to-use and powerful algorithms for machine-learning tasks [45]. Mask R-CNN extends Faster R-CNN by adding a branch that predicts a segmentation mask for each region of interest (ROI) [46]. AlexNet, introduced in 2012, is an eight-layer convolutional neural network: five convolutional layers, two fully connected hidden layers and one fully connected output layer [47].
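Detection networks such as Faster R-CNN and DetectNet output bounding-box coordinates, and in object detection generally a predicted box is matched to ground truth by its intersection over union (IoU). A minimal sketch of that standard overlap measure, with hypothetical coordinates (IoU matching is common practice, not a detail reported by the included studies):

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)   # overlap area, 0 if disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted tooth box vs. a ground-truth box (hypothetical coordinates)
print(iou((10, 10, 50, 50), (30, 10, 70, 50)))  # → 0.3333...
```

A detection is usually counted as a true positive only when its IoU with a ground-truth box exceeds a threshold (0.5 is typical), which feeds directly into the precision and recall values these studies report.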
In addition to the wide variety of neural network architectures, the studies included in this work also presented a great variety in the number of images used. The manuscripts in this review that were published in 2017 and 2018 show larger databases than the articles published in 2019 and 2020. However, there is no relationship between the database used and the results obtained, nor between the database and the variables detected.
A possible future clinical application of artificial intelligence and neural networks is the prediction of a phenomenon. Probabilistic neural networks can be used in dentistry to predict fractures, as Johari et al. indicated, having designed a probabilistic neural network to diagnose fractures in endodontically treated teeth [48].
In view of the results shown in this review and the included studies, the authors suggest the use of neural networks that are capable of predicting possible diseases or possible treatment failures for future clinical applications in the field of dentistry.

5. Conclusions

Because of the great heterogeneity in image databases and in the types, results and architectures of the neural networks, a standardized methodology is needed in order to increase the compatibility and robustness between studies.

Author Contributions

Conceptualization, M.P.-P.; methodology, M.P.-P. and J.G.V.; data curation, M.P.-P. and J.G.V.; writing—original draft preparation, M.P.-P.; writing—review and editing, C.H.M.-M.; visualization, M.P.-P., J.G.V., C.H.M.-M. and C.I.; supervision, C.I.; funding acquisition, C.H.M.-M. and C.I. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Asisa Dental S.A.U.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shen, D.; Wu, G.; Suk, H.-I. Deep Learning in Medical Image Analysis. Annu. Rev. Biomed. Eng. 2017, 19, 221–248. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Ehtesham, H.; Safdari, R.; Mansourian, A.; Tahmasebian, S.; Mohammadzadeh, N.; Pourshahidi, S. Developing a new intelligent system for the diagnosis of oral medicine with case-based reasoning approach. Oral Dis. 2019, 25, 1555–1563. [Google Scholar] [CrossRef] [PubMed]
  3. Tuzoff, D.V.; Tuzova, L.N.; Bornstein, M.M.; Krasnov, A.S.; Kharchenko, M.A.; Nikolenko, S.I.; Sveshnikov, M.M.; Bednenko, G.B. Tooth detection and numbering in panoramic radiographs using convolutional neural networks. Dentomaxillofacial Radiol. 2019, 48, 20180051. [Google Scholar] [CrossRef] [PubMed]
  4. Topol, E.J. High-performance medicine: The convergence of human and artificial intelligence. Nat. Med. 2019, 25, 44–56. [Google Scholar] [CrossRef]
  5. Mendonça, E.A. Clinical decision support systems: Perspectives in dentistry. J. Dent. Educ. 2004, 68, 589–597. [Google Scholar]
  6. Hiraiwa, T.; Ariji, Y.; Fukuda, M.; Kise, Y.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. A deep-learning artificial intelligence system for assessment of root morphology of the mandibular first molar on panoramic radiography. Dentomaxillofacial Radiol. 2019, 48, 20180218. [Google Scholar] [CrossRef]
  7. Currie, G. Intelligent Imaging: Anatomy of Machine Learning and Deep Learning. J. Nucl. Med. Technol. 2019, 47, 273–281. [Google Scholar] [CrossRef]
  8. Xue, Y.; Zhang, R.; Deng, Y.; Chen, K.; Jiang, T. A preliminary examination of the diagnostic value of deep learning in hip osteoarthritis. PLoS ONE 2017, 12, e0178992. [Google Scholar] [CrossRef] [Green Version]
  9. Sklan, J.E.S.; Plassard, A.J.; Fabbri, D.; Landman, B.A. Toward content-based image retrieval with deep convolutional neural networks. In Medical Imaging 2015: Biomedical Applications in Molecular, Structural, and Functional Imaging; International Society for Optics and Photonics: Bellingham, WA, USA, 2015; Volume 9417. [Google Scholar]
  10. Schwendicke, F.; Elhennawy, K.; Paris, S.; Friebertshäuser, P.; Krois, J. Deep Learning for Caries Lesion Detection in Near-Infrared Light Transillumination Images: A Pilot Study. J. Dent. 2019, 103260. [Google Scholar] [CrossRef]
  11. Krois, J.; Ekert, T.; Meinhold, L.; Golla, T.; Kharbot, B.; Wittemeier, A.; Dörfer, C.; Schwendicke, F. Deep Learning for the Radiographic Detection of Periodontal Bone Loss. Sci. Rep. 2019, 9, 8495. [Google Scholar] [CrossRef]
  12. Lee, J.-H.; Kim, D.; Jeong, S.-N.; Choi, S.-H. Diagnosis and prediction of periodontally compromised teeth using a deep learning-based convolutional neural network algorithm. J. Periodontal Implant Sci. 2018, 48, 114. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Lee, J.-H.; Kim, D.-H.; Jeong, S.-N.; Choi, S.-H. Detection and diagnosis of dental caries using a deep learning-based convolutional neural network algorithm. J. Dent. 2018, 77, 106–111. [Google Scholar] [CrossRef] [PubMed]
  14. Ekert, T.; Krois, J.; Meinhold, L.; Elhennawy, K.; Emara, R.; Golla, T.; Schwendicke, F. Deep Learning for the Radiographic Detection of Apical Lesions. J. Endod. 2019, 45, 917–922.e5. [Google Scholar] [CrossRef] [PubMed]
  15. Schwendicke, F.; Golla, T.; Dreher, M.; Krois, J. Convolutional neural networks for dental image diagnostics: A scoping review. J. Dent. 2019, 91, 103226. [Google Scholar] [CrossRef] [PubMed]
  16. Chen, H.; Zhang, K.; Lyu, P.; Li, H.; Zhang, L.; Wu, J.; Lee, C.-H. A deep learning approach to automatic teeth detection and numbering based on object detection in dental periapical films. Sci. Rep. 2019, 9, 3840. [Google Scholar] [CrossRef] [Green Version]
  17. Mahoor, M.H.; Abdel-Mottaleb, M. Classification and numbering of teeth in dental bitewing images. Pattern Recognit. 2005, 38, 577–586. [Google Scholar] [CrossRef]
  18. Nardi, C.; Calistri, L.; Grazzini, G.; Desideri, I.; Lorini, C.; Occhipinti, M.; Mungai, F.; Colagrande, S. Is Panoramic Radiography an Accurate Imaging Technique for the Detection of Endodontically Treated Asymptomatic Apical Periodontitis? J. Endod. 2018, 44, 1500–1508. [Google Scholar] [CrossRef]
  19. Fukuda, M.; Inamoto, K.; Shibata, N.; Ariji, Y.; Yanashita, Y.; Kutsuna, S.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. Evaluation of an artificial intelligence system for detecting vertical root fracture on panoramic radiography. Oral Radiol. 2019. [Google Scholar] [CrossRef]
  20. Zhang, K.; Wu, J.; Chen, H.; Lyu, P. An effective teeth recognition method using label tree with cascade network structure. Comput. Med. Imaging Graph. 2018, 68, 61–70. [Google Scholar] [CrossRef]
  21. Raith, S.; Vogel, E.P.; Anees, N.; Keul, C.; Güth, J.-F.; Edelhoff, D.; Fischer, H. Artificial Neural Networks as a powerful numerical tool to classify specific features of a tooth based on 3D scan data. Comput. Biol. Med. 2017, 80, 65–76. [Google Scholar] [CrossRef]
  22. Srivastava, M.M.; Kumar, P.; Pradhan, L.; Varadarajan, S. Detection of Tooth caries in Bitewing Radiographs using Deep Learning. In Proceedings of the Thirty-first Annual Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; p. 4. [Google Scholar]
  23. Jader, G.; Fontineli, J.; Ruiz, M.; Abdalla, K.; Pithon, M.; Oliveira, L. Deep Instance Segmentation of Teeth in Panoramic X-Ray Images. In Proceedings of the 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Parana, Brazil, 29 October–1 November 2018; pp. 400–407. [Google Scholar]
  24. Miki, Y.; Muramatsu, C.; Hayashi, T.; Zhou, X.; Hara, T.; Katsumata, A.; Fujita, H. Classification of teeth in cone-beam CT using deep convolutional neural network. Comput. Biol. Med. 2017, 80, 24–29. [Google Scholar] [CrossRef] [PubMed]
  25. Velemínská, J.; Pílný, A.; Čepek, M.; Kot’ová, M.; Kubelková, R. Dental age estimation and different predictive ability of various tooth types in the Czech population: Data mining methods. Anthropol. Anzeiger 2013, 70, 331–345. [Google Scholar] [CrossRef]
  26. Casalegno, F.; Newton, T.; Daher, R.; Abdelaziz, M.; Lodi-Rizzini, A.; Schürmann, F.; Krejci, I.; Markram, H. Caries Detection with Near-Infrared Transillumination Using Deep Learning. J. Dent. Res. 2019, 98, 1227–1233. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Zanella-Calzada, L.; Galván-Tejada, C.; Chávez-Lamas, N.; Rivas-Gutierrez, J.; Magallanes-Quintanar, R.; Celaya-Padilla, J.; Galván-Tejada, J.; Gamboa-Rosales, H. Deep Artificial Neural Networks for the Diagnostic of Caries Using Socioeconomic and Nutritional Features as Determinants: Data from NHANES 2013–2014. Bioengineering 2018, 5, 47. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Muramatsu, C.; Morishita, T.; Takahashi, R.; Hayashi, T.; Nishiyama, W.; Ariji, Y.; Zhou, X.; Hara, T.; Katsumata, A.; Ariji, E.; et al. Tooth detection and classification on panoramic radiographs for automatic dental chart filing: Improved classification by multi-sized input data. Oral Radiol. 2020. [Google Scholar] [CrossRef] [PubMed]
  29. Prajapati, S.A.; Nagaraj, R.; Mitra, S. Classification of dental diseases using CNN and transfer learning. In Proceedings of the 2017 5th International Symposium on Computational and Business Intelligence (ISCBI), Dubai, United Arab Emirates, 11–14 August 2017; pp. 70–74. [Google Scholar]
  30. Betul Oktay, A. Tooth detection with Convolutional Neural Networks. In Proceedings of the 2017 Medical Technologies National Congress (TIPTEKNO), Trabzon, Turkey, 12–14 October 2017; pp. 1–4. [Google Scholar]
  31. Geetha, V.; Aprameya, K.S.; Hinduja, D.M. Dental caries diagnosis in digital radiographs using back-propagation neural network. Heal. Inf. Sci. Syst. 2020, 8, 8. [Google Scholar] [CrossRef] [PubMed]
  32. Shahid, N.; Rappon, T.; Berta, W. Applications of artificial neural networks in health care organizational decision-making: A scoping review. PLoS ONE 2019, 14, e0212356. [Google Scholar] [CrossRef]
  33. Da Silva, I.; Hernane Spatti, S.; Andrade Flauzino, R. Artificial Neural Network Architectures and Training Processes. In Artificial Neural Networks: A Practical Course; Springer International Publishing: Berlin, Germany, 2017; pp. 21–28. [Google Scholar]
  34. Yamashita, R.; Nishio, M.; Do, R.K.G.; Togashi, K. Convolutional neural networks: An overview and application in radiology. Insights Imaging 2018, 9, 611–629. [Google Scholar] [CrossRef] [Green Version]
  35. Lee, J.-H.; Kim, D.-H.; Jeong, S.-N. Diagnosis of Cystic Lesions Using Panoramic and Cone Beam Computed Tomographic Images Based on Deep Learning Neural Network. Oral Dis. 2020, 26, 152–158. [Google Scholar] [CrossRef]
  36. Farman, A.G. There are good reasons for selecting panoramic radiography to replace the intraoral full-mouth series. Oral Surgery, Oral Med. Oral Pathol. Oral Radiol. Endodontology 2002, 94, 653–654. [Google Scholar] [CrossRef]
  37. Kim, J.; Lee, H.-S.; Song, I.-S.; Jung, K.-H. DeNTNet: Deep Neural Transfer Network for the detection of periodontal bone loss using panoramic dental radiographs. Sci. Rep. 2019, 9, 17615. [Google Scholar] [CrossRef]
  38. Moll, M.A.; Seuthe, M.; von See, C.; Zapf, A.; Hornecker, E.; Mausberg, R.F.; Ziebolz, D. Comparison of clinical and dental panoramic findings: A practice-based crossover study. BMC Oral Health 2013, 13, 48. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. Chen, K.J.; Gao, S.S.; Duangthip, D.; Lo, E.C.M.; Chu, C.H. Prevalence of early childhood caries among 5-year-old children: A systematic review. J. Investig. Clin. Dent. 2019, 10, e12376. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Wenzel, A. Dental caries. In Oral Radiology: Principles and Interpretation; Elsevier Mosby: St. Louis, MO, USA, 2014; pp. 285–298. [Google Scholar]
  41. Pakbaznejad Esmaeili, E.; Pakkala, T.; Haukka, J.; Siukosaari, P. Low reproducibility between oral radiologists and general dentists with regards to radiographic diagnosis of caries. Acta Odontol. Scand. 2018, 76, 346–350. [Google Scholar] [CrossRef] [PubMed]
  42. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
  43. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. Comput. Vis. Pattern Recognit. 2015, 39, 91–99. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  44. Zhao, Z.-Q.; Zheng, P.; Xu, S.-T.; Wu, X. Object Detection With Deep Learning: A Review. IEEE Trans. Neural Networks Learn. Syst. 2019, 30, 3212–3232. [Google Scholar] [CrossRef] [Green Version]
  45. Schaul, T.; Bayer, J.; Wierstra, D.; Sun, Y.; Felder, M.; Sehnke, F.; Rückstieß, T.; Schmidhuber, J. PyBrain. J. Mach. Learn. Res. 2010, 11, 743–746. [Google Scholar]
  46. He, K.; Gkioxari, G.; Dollar, P.; Girshick, R. Mask R-CNN. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 386–397. [Google Scholar] [CrossRef]
  47. Krizhevsky, A.; Sutskever, I.; Hinton, G. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 1097–1105. [Google Scholar] [CrossRef]
  48. Johari, M.; Esmaeili, F.; Andalib, A.; Garjani, S.; Saberkari, H. Detection of vertical root fractures in intact and endodontically treated premolar teeth by designing a probabilistic neural network: An ex vivo study. Dentomaxillofacial Radiol. 2017, 46, 20160107. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Flowchart.
Figure 2. System architecture for teeth detection.
Table 1. Search strategy.
| Database | Search Strategy | Search Date |
|---|---|---|
| MEDLINE/PubMed | (deep learning OR artificial intelligence OR neural network *) AND (dentistry OR dental) AND (teeth OR tooth OR caries OR filling OR dental implant OR endodontic OR root treatment) AND detect NOT (review) | 17 March 2020 |
| IEEE Xplore | (((((((((“Full Text Only”: deep learning) OR “Full Text Only”: artificial intelligence) OR “Full Text Only”: neural network) AND “Full Text Only”: teeth) OR “Full Text Only”: endodontic) OR “Full Text Only”: caries) OR “Full Text Only”: filling) OR “Full Text Only”: dental implant) AND “Document Title”: detect) | 17 March 2020 |
| arXiv.org | (deep learning OR artificial intelligence OR neural network *) AND (dentistry OR dental) AND (teeth OR tooth OR caries OR filling OR dental implant OR endodontic OR root treatment) AND detect | 17 March 2020 |

* The asterisk is a truncation (wildcard) operator in the PubMed search syntax.
Table 2. Main characteristics of included studies.
| Authors | Journal | Country, Year | Variable Detected | Image | Total Image Database | Neural Network | Outcome Metrics | Outcome Metric Values |
|---|---|---|---|---|---|---|---|---|
| Schwendicke et al. [10] | Journal of Dentistry | England, 2019 | Caries | Near-infrared light transillumination | 226 | Resnet18, Resnext50 | AUC/Sensitivity/Specificity | 0.74/0.59/0.76 |
| Fukuda et al. [19] | Oral Radiology | Japan, 2019 | Vertical root fracture (endodontic) | Panoramic radiography | 300 | DetectNet | Recall/Precision/F-measure | 0.75/0.93/0.83 |
| Ekert et al. [14] | Journal of Endodontics | USA, 2019 | Endodontic | Panoramic radiography | 85 | CNN | AUC/Sensitivity | Molar: 0.89/0.74; other teeth: 0.85/0.65 |
| Chen et al. [16] | Scientific Reports | England, 2019 | Teeth | Periapical images | 1250 | Faster R-CNN | Recall/Precision | 0.728/0.771 |
| Tuzoff [3] | Dentomaxillofacial Radiology | England, 2019 | Teeth | Panoramic radiography | 1352 | Faster R-CNN | Sensitivity/Precision | 0.9941/0.9945 |
| Zhang et al. [20] | Computerized Medical Imaging and Graphics | USA, 2018 | Teeth | Periapical images | 700 | Faster R-CNN / region-based fully convolutional network (R-FCN) | Precision/Recall | 0.958/0.961 |
| Raith et al. [21] | Computers in Biology and Medicine | England, 2017 | Teeth | - | - | ANN | Performance | 0.93 |
| Srivastava et al. [22] | NIPS 2017 Workshop on Machine Learning for Health (ML4H) | USA, 2017 | Caries | Bitewing | 3000 | FCNN (deep fully convolutional neural network) | Recall/Precision/F1-score | 0.805/0.615/0.7 |
| Jader et al. [23] | IEEE | Brazil, 2018 | Teeth | Panoramic radiography | 1500 | Mask R-CNN | Accuracy/F1-score/Precision/Recall/Specificity | 0.98/0.88/0.94/0.84/0.99 |
| Miki et al. [24] | Computers in Biology and Medicine | USA, 2017 | Teeth | Cone-beam computed tomography (CT) | 52 | AlexNet | Accuracy | 0.88 |
| Velemínská et al. [25] | Anthropologischer Anzeiger | Germany, 2013 | Teeth | Panoramic radiography | 1393 | RBFNN, GAME | Accuracy | - |
| Casalegno et al. [26] | Journal of Dental Research | USA, 2019 | Caries | Near-infrared transillumination | 217 | CNN for semantic segmentation | AUC | 0.836 (occlusal lesions) and 0.856 (proximal lesions) |
| Lee et al. [13] | Journal of Dentistry | England, 2018 | Caries | Periapical | 3000 | Deep CNN algorithm weight factors | Accuracy/AUC | Premolar, molar, and both: 0.89, 0.88, 0.82 / 0.917, 0.89, 0.845 |
| Zanella-Calzada et al. [27] | Bioengineering | Switzerland, 2018 | Caries | - | 9812 | ANN | Accuracy/AUC | 0.69/0.75 |
| Muramatsu et al. [28] | Oral Radiology | Japan, 2020 | Teeth | Panoramic radiographs | 100 | Object detection network (fourfold cross-validation) | Sensitivity/Accuracy | 0.964/0.932 |
| Prajapati et al. [29] | 5th International Symposium on Computational and Business Intelligence | United Arab Emirates, 2017 | Caries | Radiovisiography images | 251 | CNN | Accuracy | 0.875 |
| Oktay [30] | IEEE | Turkey, 2017 | Teeth | Panoramic radiographs | 100 | AlexNet | Accuracy | >0.92 |
| Geetha et al. [31] | Health Information Science and Systems | Switzerland, 2020 | Caries | Intraoral radiographs | 105 | Back-propagation neural network | Accuracy/Precision recall | 0.971/0.987 |
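The recall/precision/F-measure triples reported above are not independent: the F-measure (F1 score) is the harmonic mean of precision and recall. As a minimal illustration (not code from any of the included studies), the following sketch checks that the values reported by Fukuda et al. [19] — recall 0.75, precision 0.93, F-measure 0.83 — are internally consistent:

```python
def f_measure(precision: float, recall: float) -> float:
    """F1 score: harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Values reported by Fukuda et al. [19] for vertical root fracture detection
f1 = f_measure(precision=0.93, recall=0.75)
print(round(f1, 2))  # 0.83, matching the reported F-measure
```

Because the harmonic mean is dominated by the smaller of the two values, a high precision cannot compensate for a low recall (or vice versa), which is why the F-measure is a common single-number summary for detection tasks like these.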

Share and Cite

MDPI and ACS Style

Prados-Privado, M.; Villalón, J.G.; Martínez-Martínez, C.H.; Ivorra, C. Dental Images Recognition Technology and Applications: A Literature Review. Appl. Sci. 2020, 10, 2856. https://0-doi-org.brum.beds.ac.uk/10.3390/app10082856
