Artificial Intelligence in the Diagnosis of Maxillofacial Disorders
Recent studies have increasingly focused on the use of artificial intelligence (AI) in medical science [1,2]. The healthcare industry, and radiology in particular, is arguably a step or two ahead of the curve in applying convolutional neural networks to clinical practice. The number of investigations into everyday radiographic applications continues to grow, as does the number of available methods that have already influenced patient care. A dedicated domain has also emerged around the synergy between AI and medical imaging, particularly in the context of diagnosing maxillofacial disorders.
Diagnosis is made on the basis of the patient's medical history, associated tests, and other relevant risk factors, a process that strains the limits of human memory. AI systems can substantially outperform human specialists when working from primary health data. One study indicated that combining AI with clinical diagnostics might significantly improve the accuracy and efficiency of diagnosis.
Machine learning techniques have recently been used to diagnose several illnesses, including tumors, cancers, and metastases. These algorithms demonstrated excellent reliability in distinguishing periapical cysts from keratocystic odontogenic tumors when manually crafted parameters were used in their development. However, when such approaches, including convolutional neural networks (CNNs), were employed to examine cytological images, they performed inadequately in identifying malignant lesions of the mouth. Although these results are promising, existing AI algorithms for diagnosing oral and maxillofacial lesions rely predominantly on a single type of data, cytopathological reports. Models that incorporate the patient's medical history are critical for a more accurate analysis.
Deep learning (DL) and CNNs have made significant contributions to AI in cariology and endodontics because of their capacity to automate segmentation and classification. To classify radiographs or photographs, images are partitioned into many discrete regions according to shared characteristics; this process generates structured, predictable data from unstructured input. Using the U-Net architecture, DL models classify cone beam computed tomography (CBCT) voxels into “lesion,” “tooth structure,” “bone,” “restorative material,” and “background,” with findings comparable to clinicians' diagnoses of apical lesions. Dental caries lesions may also be detected by DL from imaging data.
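The voxel-labelling step described above can be sketched in a few lines. The function and the toy scores below are hypothetical and stand in for the per-class probability maps a trained U-Net would emit; no part of this sketch comes from the cited studies.

```python
# Minimal sketch of the final labelling step in a U-Net-style CBCT
# segmentation pipeline: the network emits one score map per class,
# and each voxel receives the highest-scoring label (argmax).
# The score values below are invented for illustration.

LABELS = ["lesion", "tooth structure", "bone",
          "restorative material", "background"]

def label_voxels(score_maps):
    """score_maps: one list of per-voxel scores for each class in LABELS.
    Returns the winning label for every voxel."""
    n_voxels = len(score_maps[0])
    labels = []
    for v in range(n_voxels):
        scores = [score_maps[c][v] for c in range(len(LABELS))]
        labels.append(LABELS[scores.index(max(scores))])
    return labels

# Toy example: three voxels scored against the five classes.
scores = [
    [0.9, 0.1, 0.2],  # lesion
    [0.1, 0.8, 0.1],  # tooth structure
    [0.2, 0.3, 0.1],  # bone
    [0.0, 0.1, 0.0],  # restorative material
    [0.3, 0.2, 0.7],  # background
]
print(label_voxels(scores))  # → ['lesion', 'tooth structure', 'background']
```

In a real pipeline the score maps would come from a trained segmentation network and cover millions of voxels; the argmax step itself is unchanged.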
The clinical signs and symptoms that the patient exhibits are crucial in diagnosing temporomandibular disorders (TMD). Speech processing is a method for converting spoken language into structured, computer-readable data. A software model built from the types of words used in the patient's speech and the extent of mouth opening was found to be more effective than conventional assessment. AI has also shown full agreement with physicians in identifying condyle morphology.
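As a toy illustration of the speech-processing idea, the sketch below turns a transcribed patient complaint into simple word-based features. The keyword lists, feature names, and function are invented for illustration and are not taken from the cited study.

```python
# Toy sketch: convert a patient's transcribed complaint into simple
# structured features, in the spirit of the speech-processing approach
# described above. Keyword lists are hypothetical examples.

PAIN_WORDS = {"pain", "ache", "sore", "tender"}
JOINT_WORDS = {"click", "clicking", "pop", "lock", "locking"}

def speech_features(transcript, mouth_opening_mm):
    """Return a small feature dict from a transcript and a measured
    maximal mouth opening (both inputs are hypothetical examples)."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    return {
        "pain_terms": sum(w in PAIN_WORDS for w in words),
        "joint_terms": sum(w in JOINT_WORDS for w in words),
        "limited_opening": mouth_opening_mm < 35,  # commonly used cutoff
    }

print(speech_features("I feel pain and clicking when I chew.", 30))
# → {'pain_terms': 1, 'joint_terms': 1, 'limited_opening': True}
```

A real model would feed such features, alongside clinical measurements, into a trained classifier rather than reading them directly.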
Reviewing these articles was instructive, as it provided an opportunity to observe the wide range of approaches that have been developed and assessed across varied images and settings. However, no one has yet determined how these approaches will be integrated into a clinical workflow or, more importantly, whether and how they will affect radiologists' diagnostic accuracy and efficiency, and consequently patient outcomes. It is therefore difficult to predict which ones will be implemented in a clinical environment. As the study findings underscore, continued research is imperative to harness the full potential of AI in transforming the diagnosis of maxillofacial disorders.
Balat A, Bahşi İ (2023) May Artificial Intelligence Be a Co-Author on an Academic Paper? Eur J Ther. 29(3):e12-e13. https://doi.org/10.58600/eurjther1688
Balat A, Bahşi İ (2023) We Asked ChatGPT About the Co-Authorship of Artificial Intelligence in Scientific Papers. Eur J Ther. 29(3):e16-e19. https://doi.org/10.58600/eurjther1719
Bouletreau P, Makaremi M, Ibrahim B, Louvrier A, Sigaux N (2019) Artificial Intelligence: Application in Orthognathic Surgery. J Stomatol Oral Maxillofac Surg. 120(4):347-354. https://doi.org/10.1016/j.jormas.2019.06.001
Yilmaz E, Kayikcioglu T, Kayipmaz S (2017) Computer-aided diagnosis of periapical cyst and keratocystic odontogenic tumor on cone beam computed tomography. Comput Methods Programs Biomed. 146:91-100. https://doi.org/10.1016/j.cmpb.2017.05.012
Sunny S, Baby A, James BL, Balaji D, NV A, Rana MH, Gurpur P, Skandarajah A, D’Ambrosio M, Ramanjinappa RD, Mohan SP (2019) A smart tele-cytology point-of-care platform for oral cancer screening. PLoS One. 14(11):e0224885. https://doi.org/10.1371/journal.pone.0224885
Anwar SM, Majid M, Qayyum A, Awais M, Alnowami M, Khan MK (2018) Medical image analysis using convolutional neural networks: a review. J Med Syst. 42:1-3. https://doi.org/10.1007/s10916-018-1088-1
Setzer FC, Shi KJ, Zhang Z, Yan H, Yoon H, Mupparapu M, Li J (2020) Artificial intelligence for the computer-aided detection of periapical lesions in cone-beam computed tomographic images. J Endod. 46(7):987-93. https://doi.org/10.1016/j.joen.2020.03.025
Casalegno F, Newton T, Daher R, Abdelaziz M, Lodi-Rizzini A, Schürmann F, Krejci I, Markram H (2019) Caries detection with near-infrared transillumination using deep learning. J Dent Res. 98(11):1227-33. https://doi.org/10.1177/0022034519871884
Nam Y, Kim HG, Kho HS (2018) Differential diagnosis of jaw pain using informatics technology. J Oral Rehabil. 45(8):581-8. https://doi.org/10.1111/joor.12655
Copyright (c) 2023 European Journal of Therapeutics
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.