Accelerating Depression Intervention: Identifying Critical Factors using MOORA
Background A thorough psychosocial assessment is time-consuming, often requiring multiple sessions to uncover the psychological factors contributing to mental illness, such as depression. The duration varies with the severity of the patient's condition and how effectively the psychotherapist can establish rapport. However, prolonged assessment periods pose a significant risk of patient deterioration. Methods A comprehensive psychosocial intervention, led by the Multi-Criteria Decision-Making (MCDM) approach using the Multi-Objective Optimization by Ratio Analysis (MOORA) method, played a pivotal role in identifying the key psychological factors contributing to the client's depression among the 21 factors specified by BDI-II analysis. Results The integration of the MCDM-MOORA strategy, compared to traditional psychotherapy, demonstrates a Jaccard similarity coefficient of 0.8 with a minimum error margin of 7% (vulnerability index = 0.57), indicating significant agreement between the two approaches, both converging towards a similar solution. Conclusion The implementation of MOORA facilitated the identification and prioritization of key psychosocial intervention strategies, making the process 45.5 times faster than traditional methods. This acceleration significantly contributed to the precision and efficacy of the work. Additionally, critical vulnerable factors were identified through ordered statistics and correlation analysis (Pearson r = 0.8929 and Spearman's rank ρ = 0.7551) on the Beck Depression Inventory-II model. These findings were supported by other MCDM schemes such as EDAS and TOPSIS.
Moreover, the proposed method demonstrated high stability and robustness in dynamic decision-making environments, maintaining consistency across scenarios adapted by different psychotherapists. Overall, the combined application of MCDM (MOORA) and targeted psychological interventions yielded substantial positive outcomes in enhancing the well-being of individuals with psychological illnesses such as depression with its cognitive, affective, and somatic syndromes.
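The MOORA ratio system used above is simple enough to sketch directly: vector-normalize each criterion column, then score each alternative as the weighted sum of its benefit ratios minus its cost ratios, and rank by score. The following is a minimal generic sketch, not the paper's implementation; the function name `moora_rank` and the toy decision matrix are invented for illustration.

```python
import math

def moora_rank(matrix, weights, benefit):
    """MOORA ratio system.

    matrix  : rows = alternatives, columns = criteria
    weights : one weight per criterion
    benefit : True if the criterion is maximized, False if minimized
    """
    n_crit = len(weights)
    # Vector normalization: divide each column by its Euclidean norm.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(n_crit)]
    scores = []
    for row in matrix:
        s = 0.0
        for j in range(n_crit):
            r = weights[j] * row[j] / norms[j]
            s += r if benefit[j] else -r   # add benefits, subtract costs
        scores.append(s)
    # Higher score = better; return alternative indices, best first.
    order = sorted(range(len(matrix)), key=lambda i: -scores[i])
    return order, scores
```

Ranking BDI-II factors then amounts to treating each factor as an alternative and each assessment criterion as a column of the matrix.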
Breast cancer (BC) is the most frequently diagnosed cancer among women, surpassing all other types of cancer in terms of prevalence. It affects both males and females, but women are at a greater risk of developing it. The lifetime probability of developing breast cancer for women is approximately 1 in 38. The focus of this study is to differentiate between benign and malignant breast cancer tumors using the fine needle aspiration (FNA) signal as the primary source of information. Four deep learning (DL) models, namely long short-term memory (LSTM), gated recurrent unit (GRU), deep belief network (DBN), and autoencoder (AE), have been utilized to achieve this goal. The proposed models have been trained and validated using two public breast cancer datasets: the Wisconsin Original Breast Cancer dataset (WBC) and the Wisconsin Diagnostic Breast Cancer dataset (WDBC). To establish a reliable model, three different training techniques have been utilized: the 80:20 split, the 70:30 split, and the k-fold method. The experimental investigation incorporated three different data characteristics, namely balanced, less imbalanced, and extremely imbalanced data. The simulation-based experimental findings indicate that the LSTM model achieves high levels of accuracy, F1-score, and area under the curve (AUC) when applied to the two commonly used datasets. The WDBC dataset yields accuracy, F1-score, and AUC values of 0.98, 0.98, and 0.99, respectively, while the WBC dataset yields values of 0.99, 0.99, and 1, respectively. These results were obtained using a 3-fold training scheme and balanced data. The LSTM model consistently outperforms the other three models, regardless of variations in datasets, training methods, and changes in data properties.
The models' efficacy can be further evaluated by subjecting the deep learning models to larger data samples with varying degrees of class imbalance, from balanced to highly skewed. To extend this study, we aim to explore the effectiveness of DL models in conjunction with an IoT system to improve breast cancer detection accuracy in online mode for patients residing in remote areas.
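The 80:20, 70:30, and k-fold protocols above differ only in how sample indices are partitioned; the k-fold case can be sketched with a plain index splitter. This is a generic illustration, and `kfold_indices` is an invented helper, not the authors' code.

```python
import random

def kfold_indices(n, k, seed=0):
    """Yield (train, test) index lists for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)          # reproducible shuffle
    folds = [idx[i::k] for i in range(k)]     # k near-equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

Each sample appears in exactly one test fold, so the k accuracy scores can be averaged without overlap bias.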
Alzheimer's disease (AD) is a neurological brain disorder that progresses over time. It can take years to identify, comprehend, and manifest, even in cases where signs are obvious. On the other hand, technological developments like imaging methods aid in early detection, but the results are frequently unreliable, which delays the course of treatment. By dividing resting-state electroencephalography (EEG) signals into three groups, namely AD, healthy controls (HC), and mild cognitive impairment (MCI), this work offers a novel perspective on the diagnosis of AD. To overcome data limitations and the over-fitting issue with deep learning models, we investigated augmenting the one-dimensional EEG data of 100 patients (49 AD participants, 37 MCI subjects, and 14 HC subjects) with overlapping sliding windows. Better results and earlier intervention could arise from this for persons afflicted with the illness. This research has the potential to significantly advance the early diagnosis of Alzheimer's disease and lay the groundwork for the creation of more precise and trustworthy diagnostic instruments for this debilitating condition. This study presents a Modified Deep Belief Network (MDBN) with a metaheuristic optimization method for detecting facial expressions and Alzheimer's disease using EEG inputs. The recommended method extracts significant features from EEG data in a novel way by applying the Improved Binary Salp Swarm Algorithm (IBSSA), which combines the MDBN and the metaheuristic optimization algorithm. The performance of the suggested technique, MDBN-IBSSA, for Alzheimer's disease diagnosis is evaluated using two publicly available datasets.
The proposed technique's capacity to discriminate between healthy and ill patients is demonstrated by the MDBN-IBSSA accuracy of 98.13%, F-score of 96.23%, sensitivity of 95.89%, precision of 95.671%, and specificity of 97.13%. The experimental results of this study show that the MDBN-IBSSA algorithm proposed for AD diagnosis is effective, superior, and applicable.
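The overlapping sliding-window augmentation described above segments each 1-D EEG recording into fixed-width windows in which consecutive windows share width − step samples, multiplying the number of training examples per recording. A minimal generic sketch (names invented for illustration):

```python
def sliding_windows(signal, width, step):
    """Segment a 1-D signal into overlapping windows.

    Consecutive windows overlap by (width - step) samples, which
    multiplies the number of training examples per recording.
    """
    return [signal[i:i + width]
            for i in range(0, len(signal) - width + 1, step)]
```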
In software development and testing, detecting and mitigating faults is paramount to prevent potential issues from escalating and disrupting the development and testing processes. The proposed method also addresses issues such as increased model complexity, longer execution times, and higher error rates, while enhancing fault detection capability. Addressing this concern, this paper introduces a three-stage model encompassing data pre-processing, feature dimensionality reduction, and fault prediction, which are essential steps in effective software testing. Our research leverages the publicly available NASA dataset and employs Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) to reduce feature vector dimensions, a common practice in software testing. We propose an improved version of the Grey Wolf Optimization algorithm (IMGWO), complemented by Extreme Learning Machines (ELM), to discern the presence of defects within software modules. This approach is highly relevant in software testing, as it aids in identifying problematic areas early in the development cycle. Utilizing the PCA-LDA+IMGWO-ELM approach, our model achieves an average accuracy rate of 0.9811 when applied to the KC2 dataset, a significant milestone in software testing. These results are substantiated through experimental validation, reinforcing the credibility of our approach in predicting potential software defects during the software testing phase.
Background While long-lasting marriages are lauded for their positive impact on well-being, the reality is that 44% of couples globally face the challenge of divorce. This study addresses the crucial role of psycho-social interventions in alleviating severe depression arising from marital conflict. Methods Our comprehensive approach, combining subjective analysis, the BDI-II, and a Mental Status Examination, guides a structured intervention with rapport building, psycho-education, and tailored strategies. Results This led to significant improvements for our client: previously severely depressed, they now actively engage in professional activities, with depression levels and cognitive, somatic, and dominating factors improving by 85.7%, 85.7%, 88.8%, and 80%, respectively. To further refine our approach, we employ ordered statistics and Pearson correlation to identify critical vulnerable factors, and Multi-Objective Optimization by Ratio Analysis (MOORA) to objectively prioritize interventions. Moreover, incorporating the MCDM-MOORA strategy alongside traditional psychotherapy to identify the dominating psychological criteria from the BDI-II shows a Jaccard similarity coefficient of 0.8 with a minimum error margin of 7% (vulnerability index = 0.57) and an approximately 45.5-times faster convergence rate than the traditional process. Conclusion This data-driven approach ensures the most effective interventions are chosen, maximizing the precision and efficacy of our study and paving the way for improved support for individuals struggling with marital conflict and depression.
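The Jaccard similarity coefficient of 0.8 reported above measures the overlap between the factor sets selected by the two approaches; its definition, |A ∩ B| / |A ∪ B|, is a one-liner. A generic sketch with illustrative sets, not the study's data:

```python
def jaccard(a, b):
    """Jaccard similarity coefficient: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)
```

For example, two nine-factor selections sharing eight factors give 8/10 = 0.8.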
It has been demonstrated that periodic leg movements during sleep (PLMS) are associated with alterations in EEG signal features. Data mining evaluates hemodynamic changes related to hemispheric/cortical activity. We used data mining and machine learning to examine whether there are changes in brain hemodynamics associated with PLMS. Nighttime EEG recordings were made while brain activity was monitored in PLMS patients. Scores from EEG feature data were examined to find relevant differences. PLMS were consistently accompanied by
This paper presents a novel approach to predictive ischemic brain stroke analysis using game theory and machine learning techniques. The study investigates the use of the Shapley value in predictive ischemic brain stroke analysis. Initially, preference algorithms identify the most important features in various machine learning models, including logistic regression, K-nearest neighbor, decision tree, support vector machine (linear kernel), support vector machine (RBF kernel), and neural networks. For each sample, the top 3, 4, and 5 features are selected and their performance evaluated. The Shapley value method has been used to rank the models, using their best four features, based on their predictive capabilities. As a result, better-performing models have been found. Afterward, ensemble machine learning methods were used to find the most accurate predictions using the top 5 models ranked by Shapley value. The research demonstrates an impressive accuracy of 92.39%, surpassing the performance of other proposed models. This study highlights the utility of combining game theory and machine learning in ischemic brain stroke prediction and the potential of ensemble learning methods to increase predictive accuracy in ischemic stroke analysis.
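The Shapley value used above to rank models is a player's marginal contribution averaged over all orderings of the players. For a handful of players it can be computed exactly; the sketch below is a generic illustration (the additive toy game in the test is invented, not the paper's characteristic function):

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values for a characteristic-function game.

    value: maps a frozenset of players to a payoff (value of the
    empty coalition must be 0).
    """
    totals = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            totals[p] += value(with_p) - value(coalition)  # marginal gain
            coalition = with_p
    # Average the marginal contribution over all n! orderings.
    return {p: totals[p] / len(perms) for p in players}
```

Exact enumeration is exponential in the number of players, which is why sampling or kernel approximations are used when many models or features are involved.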
Tunicate Swarm Algorithm (TSA) is a novel swarm intelligence algorithm developed in 2020. Though it has shown superior performance over its competitor algorithms in numerical benchmark function optimization and six engineering design problems, it still needs further improvement. This article proposes two improved TSA algorithms using chaos theory, opposition-based learning (OBL), and Cauchy mutation. The proposed algorithms are termed OCTSA and COCTSA. Static and dynamic OBL are used, respectively, in the initialization and generation-jumping phases of OCTSA, whereas centroid opposition-based computing is used in the same phases in COCTSA. The proposed algorithms are tested on 30 IEEE CEC2017 benchmark optimization problems consisting of unimodal, multimodal, hybrid, and composite functions with 30, 50, and 100 dimensions. The experimental results are compared with the classical TSA, TSA with the local escaping operator (TSA-LEO), Sine Cosine Algorithm (SCA), Giza Pyramids Construction Algorithm (GPC), Covariance Matrix Adaptation Evolution Strategy (CMAES), Archimedes Optimization Algorithm (AOA), Opposition-Based Arithmetic Optimization Algorithm (OBLAOA), and Opposition-Based Chimp Optimization Algorithm (ChOAOBL). Statistical analysis of the experimental results using the Wilcoxon signed-rank test establishes that the proposed algorithms outperform TSA and the other algorithms on most of the problems. Moreover, high dimensions are used to validate the scalability of OCTSA and COCTSA, and the results show that the modified TSA algorithms are least impacted by larger dimensions. The experimental results with statistical analysis demonstrate the effectiveness of the proposed algorithms in solving global optimization problems.
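The static OBL used in the initialization phase evaluates, alongside each candidate x, its opposite x̄ = a + b − x within the search bounds [a, b], keeping whichever is fitter. A minimal sketch of the opposite-point computation (function name invented for illustration):

```python
def opposite(x, lo, hi):
    """Static opposition-based learning: reflect each coordinate of x
    within its bounds, x_opp[i] = lo[i] + hi[i] - x[i]."""
    return [l + h - xi for xi, l, h in zip(x, lo, hi)]
```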
The discernibility matrix is an efficient tool for decision-making problems. It enhances both the logical and theoretical knowledge of an information system and evaluates Boolean functions to enhance their relevant properties. In this paper, a granular lattice is discussed to define a consistent set based on object-oriented concepts. An ordered pair of attributes and objects is constructed to determine the granular lattice using the indiscernibility relation. We introduce two sets, namely the maximum and minimum sets, to analyze the indiscernibility object. The indiscernibility object investigates the upper approximation space and extends its effectiveness towards the maximum and minimum sets. Furthermore, the properties of the maximum and minimum sets are classified using the granularity concept. An equivalency condition is framed by the maximum set, and its results are examined with the help of some theorems and lemmas.
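In classical rough-set terms, a discernibility matrix stores, for each pair of objects whose decision values differ, the set of condition attributes on which they disagree. The sketch below follows that standard definition, not this paper's granular-lattice construction; all names are invented for illustration.

```python
def discernibility_matrix(objects, decisions, attrs):
    """Classical discernibility matrix: entry (i, j) holds the
    attributes that distinguish objects i and j when their decision
    values differ; otherwise it stays empty."""
    n = len(objects)
    m = [[set() for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(i):                     # lower triangle suffices
            if decisions[i] != decisions[j]:
                m[i][j] = {a for a in attrs
                           if objects[i][a] != objects[j][a]}
    return m
```

Reducts are then obtained by turning each nonempty entry into a disjunction of its attributes and simplifying the resulting Boolean (discernibility) function.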
A novel method, RMSxAI, is proposed to predict arginine methylation sites from protein sequences using machine learning algorithms. • It explores sequence-based features, such as physicochemical properties, dipeptide composition, amino acid composition, and distribution information, to extract discriminative and informative features from the sequences. • It explains feature importance and prediction results using explainable artificial intelligence.
Thyroid cancer is a life-threatening condition that arises from the cells of the thyroid gland, located in the neck's frontal region just below the Adam's apple. While it is not as prevalent as other types of cancer, it ranks prominently among the commonly observed cancers affecting the endocrine system. Machine learning has emerged as a valuable medical diagnostics tool, specifically for detecting thyroid abnormalities. Feature selection is of vital importance in machine learning, as it decreases data dimensionality and concentrates on the most pertinent features. This process improves model performance, reduces training time, and enhances interpretability. This study examined binary variants of FOX optimization algorithms for feature selection, employing eight transfer functions (S- and V-shaped) to convert the FOX optimization algorithms into their binary versions. The vision transformer-based pre-trained models (DeiT and Swin Transformer) are used fo...
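An S-shaped transfer function turns a continuous optimizer into a binary feature selector by mapping each coordinate to a bit-flip probability through a sigmoid. The sketch below shows the common S1 form; the function name and seed handling are invented for illustration, and this is not the paper's exact variant.

```python
import math
import random

def s_shape_binarize(position, seed=0):
    """Binarize a continuous position with the S1 (sigmoid) transfer
    function: each bit is 1 with probability 1 / (1 + e^(-x))."""
    rng = random.Random(seed)
    return [1 if rng.random() < 1.0 / (1.0 + math.exp(-x)) else 0
            for x in position]
```

Each resulting bit then marks whether the corresponding feature is kept or discarded.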
Cancer prediction at an early stage is a topic of major interest in medicine, since it allows accurate and efficient actions for successful medical treatment of cancer. Most cancer datasets contain various gene expression levels as features with few samples, so similar features must first be eliminated to permit a faster convergence rate of prediction algorithms. These features (genes) enable us to identify cancer disease, choose the best prescription to prevent cancer, and discover deviations amid different techniques. To resolve this problem, we propose a novel hybrid technique, CSSMO-based gene selection, for cancer prediction. First, we altered the fitness of Spider Monkey Optimization (SMO) with the Cuckoo Search (CS) algorithm, viz., CSSMO, for feature selection, which combines the benefits of both metaheuristic algorithms to discover a subset of genes that helps predict cancer at an early stage. Further, to enhance the accuracy of the CSSMO ...
Biological data at the omics level are highly complex, requiring powerful computational approaches to identify significant intrinsic characteristics and further search for informative markers involved in the studied phenotype. In this paper, we propose a novel dimension reduction technique, protein–protein interaction-based gene correlation filtration (PPIGCF), which builds on gene ontology (GO) and protein–protein interaction (PPI) structures to analyze microarray gene expression data. PPIGCF first extracts the gene symbols with their expression from the experimental dataset and then classifies them based on GO biological process (BP) and cellular component (CC) annotations. Every classification group inherits all the information on its CCs, corresponding to the BPs, to establish a PPI network. Then, the gene correlation filter (regarding gene rank and the proposed correlation coefficient) is computed on every network and eradicates a few weakly correlated genes connected with ...
Meta-heuristics are commonly applied to solve various global optimization problems. For meta-heuristics to perform a global search, balancing their exploration and exploitation abilities remains an open avenue. This manuscript proposes a novel opposition-based learning scheme, called PCOBL (Partial Centroid Opposition-Based Learning), inspired by the partial centroid. PCOBL aims to improve meta-heuristic performance by maintaining an effective balance between exploration and exploitation. PCOBL was incorporated into three different meta-heuristics, and a comparative study was conducted on 28 CEC2013 benchmark problems with 30, 50, and 100 dimensions. In addition, we assessed PCOBL on the IEEE CEC2011 real-world problems. The empirical results demonstrate that PCOBL balances the exploration and exploitation abilities of the meta-heuristics, positively impacting their performance and making them outperform state-of-the-art algorithms in terms of best-error runs and convergence in most of the optimization problems. Moreover, the computational cost analysis illustrated that including PCOBL in a meta-heuristic algorithm has a low impact on its efficiency.
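A centroid-based opposition scheme in the spirit of PCOBL reflects a candidate through the population centroid rather than through the fixed midpoint of the search bounds, so the opposition adapts as the population moves. The following is a rough sketch under that assumption only, not the paper's exact partial-centroid formula; all names are invented.

```python
def centroid_opposite(population, i):
    """Reflect member i through the centroid of the whole population:
    x_opp = 2 * centroid - x."""
    dim = len(population[0])
    centroid = [sum(ind[j] for ind in population) / len(population)
                for j in range(dim)]
    return [2.0 * centroid[j] - population[i][j] for j in range(dim)]
```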
This article proposes an adaptive discriminator-based GAN (generative adversarial network) model architecture with different scaling and augmentation policies to investigate and identify cases of lost children even after several years (as human facial morphology changes after a certain number of years). A uniform probability distribution with combined random and auto-augmentation techniques to generate the future appearance of lost children's faces is analyzed. X-flip and rotation are applied periodically during pixel blitting to improve pixel-level accuracy. The images were generated by the generator with anisotropic scaling. Bilinear interpolation was carried out during up-sampling by setting the padding reflection during geometric transformation; the four nearest data points were used to estimate such interpolation at a new point. The color transformation was applied with the luma flip on the rotation matrices, spread log-normally for saturation. The luma-flip...
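Bilinear interpolation, used above during up-sampling, estimates a value at a fractional coordinate from its four nearest grid points: two linear interpolations along one axis, then one along the other. A minimal sketch that assumes 0 ≤ x, y within the grid (names invented for illustration):

```python
def bilerp(grid, x, y):
    """Bilinearly interpolate grid values at fractional point (x, y)."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid) - 1)      # clamp at the grid edge
    y1 = min(y0 + 1, len(grid[0]) - 1)
    fx, fy = x - x0, y - y0
    top = grid[x0][y0] * (1 - fy) + grid[x0][y1] * fy   # along y, row x0
    bot = grid[x1][y0] * (1 - fy) + grid[x1][y1] * fy   # along y, row x1
    return top * (1 - fx) + bot * fx                    # along x
```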
The metaverse is an upcoming computing paradigm aiming towards blending reality seamlessly with the artificially generated 3D worlds of deep cyberspace. This giant interactive mesh of three-dimensional reconstructed realms has recently received tremendous attention from both an academic and commercial point of view owing to the curiosity instilled by its vast possible use cases. Every virtual world in the metaverse is controlled and maintained by a virtual service provider (VSP). Interconnected clusters of LiDAR sensors act as a feeder network to these VSPs which then process the data and reconstruct the best quality immersive environment possible. These data can then be leveraged to provide users with highly targeted virtual services by building upon the concept of digital twins (DTs) representing digital analogs of real-world items owned by parties that create and establish the communication channels connecting the DTs to their real-world counterparts. Logically, DTs represent dat...
Introduction: Of all the cancers that afflict women, breast cancer (BC) has the second-highest mortality rate, and it is also believed to be the primary cause of the high death rate. Breast cancer is the most common cancer affecting women globally. There are two types of breast tumors: benign (less harmful and unlikely to become breast cancer) and malignant (very dangerous, potentially producing aberrant cells that could lead to cancer). Methods: To find breast abnormalities like masses and micro-calcifications, competent and educated radiologists often examine mammographic images. This study focuses on computer-aided diagnosis to help radiologists make more precise diagnoses of breast cancer. It aims to compare and examine the performance of the proposed shallow convolutional neural network architecture, with different specifications, against pre-trained deep convolutional neural network architectures trained on mammography images. Mammogram images are pre-processed in this study's initial attempt to carry out the automatic identification of BC. Thereafter, three different types of shallow convolutional neural networks with representational differences are fed with the resulting data. In the second method, transfer learning via fine-tuning is used to feed the same collection of images into the pre-trained convolutional neural networks VGG19, ResNet50, MobileNet-v2, Inception-v3, Xception, and Inception-ResNet-v2. Results: In our experiments with two datasets, the accuracies for the CBIS-DDSM and INbreast datasets are 80.4% and 89.2%, and 87.8% and 95.1%, respectively. Discussion: It can be concluded from the experimental findings that the deep network-based approach with precise tuning outperforms all other state-of-the-art techniques in experiments on both datasets.
Deepfake technology uses auto-encoders and generative adversarial networks to replace or artificially construct fine-tuned faces, emotions, and sounds. Although there have been significant advancements in the identification of particular fake images, a reliable counterfeit face detector is still lacking, making it difficult to identify fake photos in situations involving further compression, blurring, scaling, etc. Deep learning models resolve the research gap to correctly recognize phony images, whose objectionable content might encourage fraudulent activity and cause major problems. To reduce the gap and enlarge the fields of view of the network, we propose a dual-input convolutional neural network (DICNN) model with ten-fold cross-validation, achieving an average training accuracy of 99.36 ± 0.62, a test accuracy of 99.08 ± 0.64, and a validation accuracy of 99.30 ± 0.94. Additionally, we used SHapley Additive exPlanations (SHAP) as explainable AI (XAI) Shapley values to explain the resu...
Pattern detection and classification of cervical cell dysplasia can assist with diagnosis and treatment. This study aims to develop a computational model for real-world applications for cervical dysplasia that has the highest degree of accuracy and the lowest computation time. Initially, an ML framework is created, which has been trained and evaluated to classify dysplasia. Three different color models, three multi-resolution transform-based techniques for feature extraction (each with different filters), two feature representation schemes, and two well-known classification approaches are developed in conjunction to determine the optimal combination of “transform (filter) ⇒ color model ⇒ feature representation ⇒ classifier”. Extensive evaluations of two datasets, one indigenous (our own generated database) and the other publicly available, demonstrated that Non-subsampled Contourlet Transform (NSCT) feature-based classification performs well, revealing that the combination “N...
COVID-19 has caused over 528 million infected cases and over 6.25 million deaths since its outbreak in 2019. The uncontrolled transmission of the SARS-CoV-2 virus has caused human suffering and the death of uncountable people. Despite the continuous efforts of researchers and laboratories, it has been difficult to develop reliable, efficient, and stable vaccines to fight against the rapidly evolving virus strains. Therefore, effectively preventing transmission in the community and globally has remained an urgent task since the outbreak. To avoid the rapid spread of infection, we first need to identify infected individuals and isolate them. Screening by computed tomography (CT scan) and X-ray can better separate COVID-19-infected patients from others. However, one of the main challenges is to accurately identify infection from a medical image; even experienced radiologists have often failed to do so accurately. On the other hand, deep learning algorithms can tack...
Papers by Saurav Mallik