Journal of Intelligent Systems and Internet of Things

Journal DOI: https://doi.org/10.54216/JISIoT


ISSN (Online): 2690-6791, ISSN (Print): 2769-786X

Outlier Management and its Impact on Diabetes Prediction: A Voting Ensemble Study

S. Phani Praveen, Kotte Sandeep, N. Raghavendra Sai, Aditi Sharma, Jitendra Pandey, Vikas Chouhan

Diabetes mellitus, a chronic metabolic disorder characterized by hyperglycemia, poses a significant threat to health worldwide. It falls into two primary categories, Type 1 and Type 2, each with distinct causes and treatment approaches. Prompt detection and accurate prediction of outcomes are essential for effective disease management, and machine learning and data mining are becoming increasingly important tools in this setting. This study analyzes the accuracy of machine learning models, specifically Voting Ensembles, for diabetes prediction. A Voting Ensemble consisting of LightGBM, XGBoost, and AdaBoost is fine-tuned with GridSearchCV, with and without Interquartile Range (IQR) pre-processing for outlier management. A comparative performance analysis illustrates the benefits of outlier management: the Voting Ensemble paired with IQR pre-processing achieves higher accuracy, precision, and AUC, making it better suited for diabetes prediction, although the variant without IQR remains a workable alternative. The study highlights the significance of outlier management in healthcare analytics and the effect of data-preparation procedures on the accuracy of prediction models.
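
A minimal sketch of the pipeline this abstract describes, assuming scikit-learn, LightGBM, and XGBoost and a Pima-style dataset with an Outcome column; the paper's exact dataset, feature set, and search grids are not given here, so the file name and grid below are illustrative assumptions only.

```python
# Sketch: IQR outlier filtering + soft Voting Ensemble tuned with GridSearchCV.
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from lightgbm import LGBMClassifier
from xgboost import XGBClassifier

def iqr_filter(df, cols):
    """Drop rows outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] for each column."""
    for c in cols:
        q1, q3 = df[c].quantile([0.25, 0.75])
        iqr = q3 - q1
        df = df[df[c].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]
    return df

df = pd.read_csv("diabetes.csv")               # hypothetical file name
features = [c for c in df.columns if c != "Outcome"]
df = iqr_filter(df, features)                   # the "with IQR" variant

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["Outcome"], test_size=0.2, random_state=42)

ensemble = VotingClassifier(
    estimators=[("lgbm", LGBMClassifier()),
                ("xgb", XGBClassifier(eval_metric="logloss")),
                ("ada", AdaBoostClassifier())],
    voting="soft")                              # average predicted probabilities

# Illustrative grid only; the paper's search space is not specified.
grid = GridSearchCV(ensemble,
                    param_grid={"lgbm__n_estimators": [100, 200],
                                "xgb__max_depth": [3, 5]},
                    scoring="roc_auc", cv=5)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```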


DOI: https://doi.org/10.54216/JISIoT.120101

Vol. 12, Issue 1, pp. 08-19 (2024)

Grey Wolf Optimizer Algorithm for Multi-Objective Optimal Power Flow

Y. V. Krishna Reddy, R. Sireesha, BP Mishra, Pavithra G., Soban Badonia

This article introduces the Grey Wolf Optimizer (GWO) algorithm, a novel method aimed at tackling the challenges posed by the multi-objective Optimal Power Flow (OPF) problem. Inspired by the social hierarchy and hunting behavior of grey wolves, GWO stands apart from traditional approaches by iteratively improving initial solutions without requiring gradient information from the objective function. The OPF problem is widely studied in power system optimization and involves constraints on generator parameters, valve-point loading, reactive power, and active power. The proposed GWO technique is applied to the IEEE 14-bus and 30-bus power systems, targeting four case objectives: minimizing cost with a quadratic cost function, minimizing cost with valve-point effects included, minimizing power loss, and minimizing cost and losses simultaneously. For the IEEE 14-bus system, which must meet a power demand of 259 MW, GWO yields optimal costs of 827.0056 $/hr, 833.4691 $/hr, 1083.2410 $/hr, and 852.2255 $/hr across the four cases. Similarly, for the IEEE 30-bus system, which must satisfy a demand of 283.4 MW, GWO achieves optimal costs of 801.8623 $/hr, 825.9321 $/hr, 1028.6309 $/hr, and 850.4794 $/hr for the respective cases. These optimal results are compared with outcomes reported in existing research, highlighting the efficiency and cost-effectiveness of the GWO algorithm relative to alternative methods for solving the OPF problem.
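
For illustration, a minimal sketch of the core GWO update on a toy box-constrained objective; the paper's actual OPF cost functions, valve-point terms, and network constraints are not reproduced here, so the quadratic objective below is a stand-in.

```python
# Sketch: standard GWO loop (alpha/beta/delta leaders, shrinking coefficient a).
import numpy as np

def gwo(cost, dim, lb, ub, n_wolves=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))      # wolf positions
    for t in range(iters):
        fit = np.apply_along_axis(cost, 1, X)
        alpha, beta, delta = X[np.argsort(fit)[:3]]    # three best wolves lead
        a = 2 - 2 * t / iters                          # decreases linearly 2 -> 0
        X_new = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A, C = 2 * a * r1 - a, 2 * r2
            D = np.abs(C * leader - X)                 # distance to the leader
            X_new += leader - A * D                    # pull toward the leader
        X = np.clip(X_new / 3, lb, ub)                 # average of the three pulls
    fit = np.apply_along_axis(cost, 1, X)
    return X[np.argmin(fit)], fit.min()

# Toy quadratic "generation cost" in place of the real OPF objective.
best, val = gwo(lambda x: np.sum((x - 1.5) ** 2), dim=5, lb=0.0, ub=3.0)
```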


DOI: https://doi.org/10.54216/JISIoT.120102

Vol. 12, Issue 1, pp. 20-32 (2024)

Enhancing Brain Tumor Detection and Classification using Osprey Optimization Algorithm with Deep Learning on MRI Images

S. Stephe, V. Nivedita, B. Karthikeyan, K. Nithya, Mohamed Yacin Sikkandar

Brain tumors (BTs) are abnormal cell growths arising from the brain or its surrounding tissue. They fall into two major types: malignant (cancerous) and benign (non-cancerous). Detecting and classifying BTs is critical for understanding their mechanisms. Magnetic Resonance Imaging (MRI) is a helpful but time-consuming modality that requires expertise for manual examination. Recent developments in Computer-Assisted Diagnosis (CAD) and deep learning (DL) allow more reliable BT detection. Typical machine learning (ML) depends on handcrafted features, whereas DL achieves accurate outcomes without such manual extraction. DL methods, particularly convolutional neural networks (CNNs) and recurrent neural networks (RNNs), have delivered excellent results in medical image analysis, including the classification and recognition of BTs in MRI and CT scans. This study therefore designs an automated BT Detection and Classification method using the Osprey Optimization Algorithm with Deep Learning (BTDC-OOADL) on MRI images. The BTDC-OOADL technique examines MRI scans in depth to identify BTs. In the proposed algorithm, Wiener filtering (WF) is applied to eliminate noise, and MobileNetV2 serves as the feature extractor, with the OOA used to select optimal hyperparameters for the MobileNetV2 model. Finally, a graph convolutional network (GCN) is deployed for the classification and recognition of BTs. The BTDC-OOADL methodology is evaluated on a benchmark dataset, and the simulation values confirm its improvement over recent approaches.
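
A minimal sketch of the denoising and feature-extraction stages, assuming scipy and torchvision; the OOA hyperparameter search and the GCN head are not reproduced, and the linear classifier below is a hypothetical placeholder for the GCN.

```python
# Sketch: Wiener-filter denoising + MobileNetV2 feature extraction for MRI slices.
import numpy as np
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2, MobileNet_V2_Weights
from scipy.signal import wiener

def denoise(mri_slice):
    """Wiener filtering (WF) of one grayscale MRI slice (2-D array)."""
    return wiener(mri_slice, mysize=5)

backbone = mobilenet_v2(weights=MobileNet_V2_Weights.DEFAULT)
backbone.classifier = nn.Identity()          # keep the 1280-d pooled features
backbone.eval()

head = nn.Linear(1280, 4)                    # assumed 4 tumor classes; GCN stand-in

def classify(mri_slice):
    x = denoise(mri_slice.astype(np.float32))
    x = torch.from_numpy(x).float()[None, None]          # 1 x 1 x H x W
    x = x.repeat(1, 3, 1, 1)                             # grayscale -> 3 channels
    x = torch.nn.functional.interpolate(x, size=(224, 224))
    with torch.no_grad():
        feats = backbone(x)                              # 1 x 1280 feature vector
    return head(feats).softmax(dim=1)

probs = classify(np.random.rand(256, 256))   # toy slice stand-in
```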


DOI: https://doi.org/10.54216/JISIoT.120103

Vol. 12, Issue 1, pp. 33-44 (2024)

Energy Efficient Task Scheduling Strategy using Modified Coot Optimization Algorithm for Cloud Computing

Kandan M., M. Mutharasu, Siva Satya Sreedhar P., S. Thenappan, G. Nagarajan

Cloud computing (CC) is a modern computing paradigm that provides virtualized computing services as a utility to cloud service users. Problems stemming from ineffective mapping of tasks to cloud resources frequently occur in cloud environments. Task scheduling (TS) therefore denotes the rational allocation of computing resources to computational tasks under given constraints in an IaaS cloud network; job scheduling allocates tasks to the most appropriate resources to achieve one or more objectives. Choosing a suitable scheduling technique that raises CC resource efficiency while maintaining high quality-of-service (QoS) guarantees thus remains a significant problem that continues to attract research interest. Metaheuristic techniques have shown remarkable efficacy in supplying near-optimal scheduling solutions for complicated, large-scale problems, and a rising number of independent scholars have examined the QoS rendered by TS approaches. This study therefore develops an Energy Efficient Task Scheduling Strategy using a Modified Coot Optimization Algorithm (EETSS-MCOA) for the CC environment. The EETSS-MCOA method derives task features and applies the MCOA to schedule tasks. The MCOA is obtained by combining an adaptive β-hill-climbing concept with the COA for enhanced task scheduling. The conventional COA is inspired by the swarming behavior of birds known as coots and follows two distinct stages of bird movement on the water surface. The experimental results of the EETSS-MCOA model are validated with the CloudSim toolkit, and the solutions it attains are found to be better than those of existing algorithms.
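
A minimal sketch of the β-hill-climbing ingredient that the MCOA adds to the COA, applied to a toy task-to-VM assignment problem with a makespan objective; the coot movement stages, the paper's adaptive β schedule, and the CloudSim energy model are not reproduced here.

```python
# Sketch: beta-hill climbing over discrete task->VM assignments.
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_vms = 50, 8
length = rng.uniform(1e3, 1e4, n_tasks)     # task lengths in MI (assumed)
speed = rng.uniform(500, 2000, n_vms)       # VM speeds in MIPS (assumed)

def makespan(assign):
    """Finish time of the busiest VM under a task->VM assignment vector."""
    loads = np.zeros(n_vms)
    np.add.at(loads, assign, length / speed[assign])
    return loads.max()

def beta_hill_climb(iters=2000, beta=0.05):
    # beta is fixed here for simplicity; the paper adapts it during the search.
    sol = rng.integers(0, n_vms, n_tasks)
    best = makespan(sol)
    for _ in range(iters):
        cand = sol.copy()
        cand[rng.integers(n_tasks)] = rng.integers(n_vms)   # N-operator: local move
        mask = rng.random(n_tasks) < beta                   # beta-operator: random resets
        cand[mask] = rng.integers(0, n_vms, mask.sum())
        if makespan(cand) < best:                           # greedy acceptance
            sol, best = cand, makespan(cand)
    return sol, best

schedule, span = beta_hill_climb()
```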


DOI: https://doi.org/10.54216/JISIoT.120104

Vol. 12, Issue 1, pp. 45-56 (2024)

Enhancing Real-Time Malware Analysis with Quantum Neural Networks

Thulasi Bikku, Suresh Babu Chandolu, S. Phani Praveen, Narasimha Rao Tirumalasetti, K. Swathi, U. Sirisha

The escalating complexity of malware poses a significant challenge to cybersecurity, necessitating innovative approaches to keep pace with its rapid evolution. Contemporary malware-analysis techniques underscore the urgent need for solutions that can adapt to the dynamic functionality of evolving malware. In this context, Quantum Neural Networks (QNNs) emerge as a cutting-edge approach to malware analysis that promises to overcome the limitations of conventional methods. Our exploration of QNNs focuses on their applications in real-time malware analysis, and we examine their advantages over the conventional machine-learning methods employed in malware detection and classification. The proposed QNN showcases a capability to handle complex patterns, emphasizing its potential to achieve heightened accuracy. We further introduce a dedicated framework for QNN-based malware analysis that harnesses the computational capabilities of quantum computing for real-time use. The framework is structured around three pivotal components: Malware Feature Extraction, which utilizes quantum feature-extraction techniques to identify relevant features from malware samples; Malware Classification, which employs a QNN classifier to categorize samples as benign or malicious; and Real-Time Analysis, which enables instantaneous examination of samples by integrating feature extraction and classification within a streaming data pipeline. The methodology undergoes comprehensive evaluation on a benchmark dataset of malware samples: on the Malware DB dataset, the proposed QNN achieved a high accuracy of 0.95, outperforming other quantum models such as Quantum Support Vector Machines (QSVM) and Quantum Decision Trees (QDT), as well as classical models including Random Forest (RF), Support Vector Machines (SVM), and Decision Trees (DT). The results affirm the framework's high accuracy and low latency, establishing its suitability for real-time malware analysis. These findings underscore the potential of QNNs to revolutionize malware evaluation and strengthen real-time defenses against cyberattacks. While our research demonstrates promising outcomes, further exploration and development are needed to fully exploit what QNNs offer for cybersecurity applications.
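
A minimal sketch of a variational QNN binary classifier of the kind described, assuming the PennyLane library (the paper does not name its quantum framework); the encoding, ansatz, toy data, and training loop below are illustrative assumptions, not the authors' implementation.

```python
# Sketch: angle-encoded variational classifier, benign (-1) vs. malicious (+1).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4                                  # one qubit per reduced feature
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    qml.AngleEmbedding(x, wires=range(n_qubits))           # encode features as rotations
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))                       # score in [-1, 1]

def loss(weights, X, y):
    # Simple square loss against labels in {-1, +1}.
    return sum((circuit(weights, x) - t) ** 2 for x, t in zip(X, y)) / len(y)

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.array(np.random.uniform(0, np.pi, size=shape), requires_grad=True)

# Toy stand-in for extracted malware feature vectors, scaled to [0, pi].
X = np.random.uniform(0, np.pi, size=(20, n_qubits))
y = np.where(X.sum(axis=1) > 2 * np.pi, 1.0, -1.0)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(30):
    weights = opt.step(lambda w: loss(w, X, y), weights)

predict = lambda x: "malicious" if circuit(weights, x) > 0 else "benign"
```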


DOI: https://doi.org/10.54216/JISIoT.120105

Vol. 12, Issue 1, pp. 57-69 (2024)

Automated EEG based Emotion Detection using Bonobo Optimizer with Deep Learning on Human Computer Interaction

Siva Satya Sreedhar P., M. S. Minu, P. Vidyasri, Habeeb Omotunde, A. Tamizharasi, R. Logarasu, Rama Prabha K. P., V. Subashree