An evaluation of nature-inspired optimization algorithms and machine learning classifiers for electricity fraud prediction
This study evaluated nature-inspired optimization algorithms for improving classification on imbalanced class problems. Particle swarm optimization (PSO) and the grey wolf optimizer (GWO) were used to adaptively balance the class distribution, and four supervised machine learning classifiers, namely artificial neural network (ANN), support vector machine (SVM), extreme gradient-boosted tree (XGBoost), and random forest (RF), were then applied to maximize classification performance for electricity fraud prediction. The imbalanced data were balanced using random undersampling (RUS) and the two nature-inspired techniques (PSO and GWO). For the data balanced with random undersampling, ANN (Sen_test = 50.31%) and XGBoost (Sen_test = 66.32%) had better sensitivity than SVM (Sen_test = 23.61%), while RF exhibited overfitting (Sen_train = 100%, Sen_test = 71.25%). The RF model hybridized with PSO improved substantially (Acc_test = 96.98%, Sen_test = 94.87%, Spec_test = 99.16%, Pre_test = 99.14%, F1 score = 96.96%, area under the curve (AUC) = 0.989), closely followed by the hybrid of XGBoost with PSO. RF and XGBoost hybridized with GWO also showed improved and promising results. The study showed that nature-inspired optimization algorithms (PSO and GWO) are effective methods for addressing imbalanced datasets.
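The abstract describes a two-stage pipeline: first balance the imbalanced fraud data (with RUS, or adaptively with PSO/GWO), then fit a supervised classifier and judge it by sensitivity, specificity, and AUC rather than accuracy alone. The sketch below is not the authors' code; it only illustrates the RUS baseline with a random forest, using a synthetic stand-in dataset, an assumed 9:1 class ratio, and assumed hyperparameters.

```python
# Hedged sketch of the RUS + random forest baseline described in the abstract.
# Synthetic data, the 9:1 imbalance ratio, and all hyperparameters are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for electricity-consumption features with rare fraud cases (class 1).
X, y = make_classification(n_samples=10_000, n_features=20, weights=[0.9, 0.1],
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# Random undersampling (RUS): keep every minority (fraud) sample and an
# equal-sized random subset of the majority class.
minority_idx = np.flatnonzero(y_train == 1)
majority_idx = np.flatnonzero(y_train == 0)
keep_majority = rng.choice(majority_idx, size=minority_idx.size, replace=False)
balanced_idx = np.concatenate([minority_idx, keep_majority])

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train[balanced_idx], y_train[balanced_idx])

# Evaluate with the metrics the abstract reports: sensitivity, specificity, AUC.
y_pred = clf.predict(X_test)
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
sensitivity = tp / (tp + fn)   # recall on the fraud class (Sen_test)
specificity = tn / (tn + fp)   # recall on the non-fraud class (Spec_test)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"Sensitivity={sensitivity:.2%}  Specificity={specificity:.2%}  AUC={auc:.3f}")
```

In the paper, PSO and GWO are used to balance the distribution adaptively instead of this fixed random subsampling; that optimization loop is not reproduced in the sketch.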
| Published in: | Indonesian Journal of Electrical Engineering and Computer Science |
| --- | --- |
| Main Author: | Kamaruddin A.S. |
| Format: | Article |
| Language: | English |
| Published: | Institute of Advanced Engineering and Science, 2023 |
| Online Access: | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85174209769&doi=10.11591%2fijeecs.v32.i1.pp468-477&partnerID=40&md5=8c3498c8638eeb2dbf8f8be3c2faa95d |
| id | 2-s2.0-85174209769 |
| --- | --- |
| author | Kamaruddin A.S.; Hadrawi M.F.; Wah Y.B.; Aliman S. |
| title | An evaluation of nature-inspired optimization algorithms and machine learning classifiers for electricity fraud prediction |
| publishDate | 2023 |
| container_title | Indonesian Journal of Electrical Engineering and Computer Science |
| container_volume | 32 |
| container_issue | 1 |
| doi_str_mv | 10.11591/ijeecs.v32.i1.pp468-477 |
| url | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85174209769&doi=10.11591%2fijeecs.v32.i1.pp468-477&partnerID=40&md5=8c3498c8638eeb2dbf8f8be3c2faa95d |
| description | This study evaluated nature-inspired optimization algorithms for improving classification on imbalanced class problems. Particle swarm optimization (PSO) and the grey wolf optimizer (GWO) were used to adaptively balance the class distribution, and four supervised machine learning classifiers, namely artificial neural network (ANN), support vector machine (SVM), extreme gradient-boosted tree (XGBoost), and random forest (RF), were then applied to maximize classification performance for electricity fraud prediction. The imbalanced data were balanced using random undersampling (RUS) and the two nature-inspired techniques (PSO and GWO). For the data balanced with random undersampling, ANN (Sen_test = 50.31%) and XGBoost (Sen_test = 66.32%) had better sensitivity than SVM (Sen_test = 23.61%), while RF exhibited overfitting (Sen_train = 100%, Sen_test = 71.25%). The RF model hybridized with PSO improved substantially (Acc_test = 96.98%, Sen_test = 94.87%, Spec_test = 99.16%, Pre_test = 99.14%, F1 score = 96.96%, area under the curve (AUC) = 0.989), closely followed by the hybrid of XGBoost with PSO. RF and XGBoost hybridized with GWO also showed improved and promising results. The study showed that nature-inspired optimization algorithms (PSO and GWO) are effective methods for addressing imbalanced datasets. © 2023 Institute of Advanced Engineering and Science. All rights reserved. |
| publisher | Institute of Advanced Engineering and Science |
| issn | 25024752 |
| language | English |
| format | Article |
| accesstype | All Open Access; Gold Open Access |
| record_format | scopus |
| collection | Scopus |
| _version_ | 1809677777751769088 |