Enhancing Cognitive Frailty Prediction Accuracy Using Conditional Generative Adversarial Networks (CGAN)


Detailed description

Bibliographic details
Published in: ACM International Conference Proceeding Series
Authors: Ibrahim F.N.A.; Badruddin N.; Ramasamy K.
Format: Conference paper
Language: English
Publisher: Association for Computing Machinery, 2024
Online access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85215947639&doi=10.1145%2f3702138.3702151&partnerID=40&md5=7a185588287fc31fdccf1004508663f7
Other bibliographic details
Abstract: Class imbalance is a prevalent issue in real-life scenarios, especially in medical datasets where instances of normal health conditions far outnumber those of pathological conditions such as Cognitive Frailty. This imbalance can bias predictive models towards the majority class, diminishing their accuracy in identifying cases with the health condition. This paper explores an innovative Conditional Generative Adversarial Network (CGAN), alongside other methods, to mitigate class imbalance in medical datasets, highlighting their potential to refine prediction models. By focusing on the generation and integration of synthetic data to counteract class imbalance, the paper contributes to the field of medical diagnostics and provides insights into optimizing machine learning algorithms for the early detection of Cognitive Frailty. The aim is to create balanced datasets that better represent the spectrum of Cognitive Frailty conditions, thereby enhancing the performance of machine learning models in early detection and, ultimately, tangibly improving patient outcomes. © 2024 Copyright held by the owner/author(s).
DOI:10.1145/3702138.3702151
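The abstract describes using a CGAN to synthesize minority-class samples so that the training set is balanced before a predictor is fit. The following is a minimal numpy sketch of that idea, not the paper's implementation: the toy "medical" dataset, the linear generator and discriminator, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hot(y, n_classes=2):
    out = np.zeros((len(y), n_classes))
    out[np.arange(len(y)), y] = 1.0
    return out

# Imbalanced toy dataset: 200 "healthy" (class 0) vs 20 "frail" (class 1).
d = 4
X = np.vstack([rng.normal(0.0, 1.0, (200, d)),
               rng.normal(2.0, 1.0, (20, d))])
y = np.array([0] * 200 + [1] * 20)

# Linear generator G(z, y) and logistic discriminator D(x, y),
# both conditioned on the class label (the "C" in CGAN).
z_dim = 8
Wg = rng.normal(0, 0.1, (d, z_dim + 2)); bg = np.zeros(d)
Wd = rng.normal(0, 0.1, (d + 2,));       bd = 0.0
lr, batch = 0.01, 32

def gen(z, y_oh):
    return np.hstack([z, y_oh]) @ Wg.T + bg

def disc(x, y_oh):
    a = np.clip(np.hstack([x, y_oh]) @ Wd + bd, -30, 30)
    return 1.0 / (1.0 + np.exp(-a))

for _ in range(500):
    idx = rng.integers(0, len(X), batch)
    xr, yr = X[idx], one_hot(y[idx])
    z = rng.normal(size=(batch, z_dim))
    yf = one_hot(rng.integers(0, 2, batch))
    xf = gen(z, yf)

    # Discriminator step: maximize log D(real) + log(1 - D(fake)).
    pr, pf = disc(xr, yr), disc(xf, yf)
    ur, uf = np.hstack([xr, yr]), np.hstack([xf, yf])
    Wd -= lr * ((pr - 1) @ ur + pf @ uf) / batch
    bd -= lr * ((pr - 1).sum() + pf.sum()) / batch

    # Generator step: minimize -log D(fake).
    pf = disc(xf, yf)
    dxf = (pf - 1)[:, None] * Wd[:d][None, :]   # gradient w.r.t. fake samples
    Wg -= lr * (dxf.T @ np.hstack([z, yf])) / batch
    bg -= lr * dxf.mean(axis=0)

# Oversample the minority class with synthetic samples until balanced.
n_need = 200 - 20
z = rng.normal(size=(n_need, z_dim))
X_syn = gen(z, one_hot(np.ones(n_need, dtype=int)))
X_bal = np.vstack([X, X_syn])
y_bal = np.concatenate([y, np.ones(n_need, dtype=int)])
```

The balanced `(X_bal, y_bal)` would then be fed to an ordinary classifier; a real pipeline would use deep nonlinear networks and validate that the synthetic minority samples are clinically plausible before training on them.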