Papers by Saman M. Almufti

FMDB Transactions on Sustainable Intelligent Networks, 2025
Metaheuristic algorithms have emerged as indispensable tools for solving complex structural design problems characterized by nonlinearity, high dimensionality, and conflicting objectives. Traditional optimisation techniques often fall short in navigating such multifaceted design landscapes, necessitating more adaptive and robust approaches. This paper presents a comparative analysis of five metaheuristic algorithms applied to structural engineering problems: Genetic Algorithms (GA), Osprey Optimisation Algorithm (OOA), Quantum Annealing-based Structural Optimisation (QASO), Ensemble Laplacian Biogeography-Based Sine Cosine Algorithm (ELBBSC), and the Fitness Distance Balance Modified Metaheuristic (FDB-Meta). Recent developments since 2020 are emphasized to highlight innovations in convergence dynamics, robustness, and computational efficiency. Benchmark structural problems, such as steel moment frames, cable-stayed bridges, and RC slab bridges, are used to evaluate algorithm performance across various criteria, including convergence speed, solution quality, robustness, scalability, and parameter sensitivity. The results indicate that while GA remains a foundational method, newer algorithms, such as FDB-Meta and OOA, demonstrate significant improvements in both cost efficiency and reliability. This study contributes a systematic guideline for algorithm selection in structural design optimisation and outlines avenues for future research in hybrid metaheuristic development and adaptive parameter tuning.

International Journal of Scientific World, 2025
Optimization remains a cornerstone of modern engineering and computational intelligence, playing a vital role in the design, control, and allocation of limited resources across industries ranging from logistics to structural engineering. Traditional optimization methods, such as gradient-based and exact algorithms, often struggle with the nonlinear, multimodal, and constrained nature of real-world problems, necessitating the adoption of metaheuristic approaches. These biologically and physically inspired algorithms offer flexibility, scalability, and robustness in navigating complex search spaces. This study presents a systematic categorization of optimization problems, including combinatorial, continuous, constrained, and multiobjective classes, followed by a rigorous comparative analysis of nine prominent metaheuristics: Ant Colony Optimization (ACO), Lion Algorithm (LA), Cuckoo Search (CS), Grey Wolf Optimizer (GWO), Vibrating Particles System (VPS), Social Spider Optimization (SSO), Cat Swarm Optimization (CSO), Bat Algorithm (BA), and Artificial Bee Colony (ABC). The algorithms are evaluated across five representative benchmark problems: the Traveling Salesman Problem (TSP), Welded Beam Design (WBD), Pressure Vessel Design (PVD), Tension/Compression Spring Design (TSD), and the Knapsack Problem (KP). Key contributions include: 1) Domain-specific suitability analysis, revealing how algorithmic mechanisms align with problem structures. 2) Performance benchmarking under standardized conditions, highlighting convergence speed, solution quality, and constraint-handling efficacy. 3) Practical insights for practitioners on algorithm selection, hybridization potential, and adaptation challenges. Results demonstrate that no single algorithm dominates universally; instead, problem characteristics dictate optimal choices.
For instance, ACO excels in discrete problems (TSP, KP), while GWO and BA outperform in continuous engineering designs (WBD, PVD). The study concludes with recommendations for future research, including dynamic parameter tuning, hybrid models, and real-world scalability assessments.

Academic Journal of Nawroz University, 2017

International Journal of Scientific World, 2025
This study presents a comprehensive comparative analysis of nine state-of-the-art metaheuristic optimization algorithms applied to the classical Traveling Salesman Problem (TSP), a fundamental benchmark in combinatorial optimization. The selected algorithms, namely Ant Colony Optimization (ACO), Lion Algorithm (LA), Cuckoo Search (CS), Grey Wolf Optimizer (GWO), Vibrating Particles System (VPS), Social Spider Optimization (SSO), Cat Swarm Optimization (CSO), Bat Algorithm (BA), and Artificial Bee Colony (ABC), are evaluated on three standardized TSPLIB benchmark instances: berlin52, eil76, and pr1002. The evaluation framework encompasses multiple performance metrics, including best-found cost, mean solution quality, standard deviation, and convergence behavior, over 30 independent runs per instance. The results offer empirical insights into each algorithm's strengths, limitations, and scalability across problem sizes. Notably, ACO, GWO, and CSO demonstrate a superior balance between solution accuracy and robustness, making them promising candidates for large-scale combinatorial problems. This work not only provides an up-to-date performance landscape of leading swarm-based and evolutionary metaheuristics but also guides algorithm selection for real-world optimization applications requiring adaptability and computational efficiency.
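
To make the reported metrics concrete, here is a minimal sketch of how best-found cost, mean solution quality, and standard deviation are computed over repeated runs. The 20-city instance and the random tours standing in for metaheuristic runs are invented for illustration; they are not TSPLIB instances or the study's code.

```python
import math
import random
import statistics

def tour_length(cities, tour):
    """Total length of a closed tour over a permutation of city indices."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

rng = random.Random(42)
cities = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(20)]

# 30 independent random tours stand in for 30 independent metaheuristic runs.
runs = []
for _ in range(30):
    t = list(range(20))
    rng.shuffle(t)
    runs.append(tour_length(cities, t))

best = min(runs)                 # best-found cost
mean = statistics.mean(runs)     # mean solution quality
sd = statistics.stdev(runs)      # robustness across runs
```

A real benchmark would replace the random-tour line with one full run of the algorithm under test, keeping the same three summary statistics.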

FMDB Transactions on Sustainable Computer Letters, 2025
Swarm intelligence (SI) algorithms have emerged as powerful tools for solving complex structural optimisation problems that are characterised by nonlinearity, multiple constraints, and multimodal objective functions. This paper presents a comprehensive comparative study of five prominent swarm-based metaheuristic algorithms, namely Particle Swarm Optimisation (PSO), Ant Colony Optimisation (ACO), Artificial Bee Colony (ABC), Grey Wolf Optimiser (GWO), and Harris Hawks Optimisation (HHO), applied to the classical welded beam design problem. The design objective is to minimise fabrication cost while satisfying structural and geometric constraints. Each algorithm is implemented in a unified benchmarking environment, and their performances are evaluated in terms of solution quality, convergence speed, robustness, and computational cost. The results reveal nuanced performance trade-offs among the algorithms, highlighting the importance of balancing exploration and exploitation, as well as parameter sensitivity, in engineering applications. The study contributes to the growing body of research in computational structural engineering, offering insights into the practical application of swarm intelligence methods for real-world design challenges.
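
The objective being minimised here is, in its standard textbook form, the fabrication cost of the welded beam (variables: weld thickness h, weld length l, bar height t, bar thickness b). The sketch below evaluates only the cost; the full problem also imposes shear-stress, bending-stress, buckling, and deflection constraints that are omitted here.

```python
def welded_beam_cost(h, l, t, b):
    """Fabrication cost of the classical welded beam design problem:
    weld material cost plus bar material cost, with the standard coefficients."""
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# Cost at a near-optimal design point frequently reported in the literature
# (approximately 1.7249 at h = b = 0.2057, l = 3.4705, t = 9.0366).
c = welded_beam_cost(0.2057, 3.4705, 9.0366, 0.2057)
```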

International Journal of Scientific World, 2025
Optimization plays a vital role in tackling complex challenges across diverse fields such as engineering, computer science, data mining, and machine learning. Conventional optimization techniques often face difficulties when dealing with high-dimensional and nonlinear problems, which has led to the rise of metaheuristic algorithms as effective alternatives. The Artificial Bee Colony (ABC) algorithm, developed by Karaboga in 2005, is a nature-inspired optimization method modeled after the foraging behavior of honeybees. ABC has proven highly effective in solving nonlinear, multidimensional, and NP-hard optimization problems. This paper reviews the ABC algorithm, explores its various enhancements designed to improve convergence speed and the balance between exploration and exploitation, and examines its broad applications in areas like engineering, data mining, and medical diagnostics. The ongoing advancements in ABC, including its integration with other algorithms and adaptive parameter control, highlight its importance in contemporary optimization tasks.
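
A compact sketch of the ABC loop the review describes, with its three phases (employed bees, onlooker bees, scouts). Function names and parameter defaults are illustrative, not taken from the paper; the example minimises a simple sphere function.

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, max_iter=200, seed=1):
    """Minimal Artificial Bee Colony sketch: employed, onlooker, scout phases."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    costs = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbor(i):
        # Perturb one dimension toward a random partner (the ABC search equation).
        k = rng.randrange(n_food)
        while k == i:
            k = rng.randrange(n_food)
        j = rng.randrange(dim)
        cand = foods[i][:]
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        cand[j] = min(max(cand[j], lo), hi)
        return cand

    def greedy(i):
        # Keep the candidate only if it improves the food source.
        cand = neighbor(i)
        c = f(cand)
        if c < costs[i]:
            foods[i], costs[i], trials[i] = cand, c, 0
        else:
            trials[i] += 1

    for _ in range(max_iter):
        for i in range(n_food):                 # employed bee phase
            greedy(i)
        fit = [1.0 / (1.0 + c) for c in costs]  # probabilities set once per cycle
        total = sum(fit)
        for _ in range(n_food):                 # onlooker phase: roulette choice
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, ft in enumerate(fit):
                acc += ft
                if acc >= r:
                    break
            greedy(i)
        for i in range(n_food):                 # scout phase: abandon exhausted sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                costs[i], trials[i] = f(foods[i]), 0

    best = min(range(n_food), key=lambda i: costs[i])
    return foods[best], costs[best]

best_x, best_c = abc_minimize(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```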

International Journal of Scientific World, 2025
Explainability in artificial intelligence (AI) is an essential factor for building transparent, trustworthy, and ethical systems, particularly in high-stakes domains such as healthcare, finance, justice, and autonomous systems. This study examines the foundations of AI explainability, its critical role in fostering trust, and the current methodologies used to interpret AI models, such as post-hoc techniques, intrinsically interpretable models, and hybrid approaches. Despite these advancements, challenges persist, including trade-offs between accuracy and interpretability, scalability, ethical risks, and transparency gaps. The paper explores emerging trends like causality-based explanations, neurosymbolic AI, and personalized frameworks, while emphasizing the integration of ethics and the need for automation in explainability. Future directions stress the importance of collaboration among researchers, practitioners, and policymakers to establish industry standards and regulations, ensuring that AI systems align with societal values and expectations.

SEEIPH, 2024
In this work, we propose a novel approach for blast identification and classification in Acute Lymphoblastic Leukemia (ALL), a common form of childhood cancer. The proposed method combines the Pivot-Growing Segmentation (PGS) algorithm with a U-Net architecture enhanced with Parametric Leaky ReLU (PLR) activations. Pivot-Growing Segmentation is a clustering method that uses K-medoids with squared Euclidean distance as its similarity measure; here it is used to delineate blast regions in microscopic images, providing precise localization. This step improves the accuracy of blast identification, which is vital for accurate diagnosis and treatment of the cancer. The U-Net PLR model, a fully connected Convolutional Neural Network (CNN) with Parametric Leaky ReLU activations, is then used for blast classification. It is designed to extract intricate features from the segmented regions, improving classification performance. The U-Net PLR model consists of an encoder-decoder structure with skip connections between corresponding layers: the encoder is responsible for extracting features from the input image, while the decoder reconstructs the image and outputs the segmentation mask. The proposed method achieves strong performance in blast identification and classification on the given dataset, offering a promising path toward boosting diagnostic accuracy and assisting personalized treatment for pediatric patients with ALL.
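
The Parametric Leaky ReLU activation the method builds on is, in scalar form, the following (alpha is the parameterised negative-side slope; the value 0.1 below is illustrative, not the paper's setting):

```python
def prelu(x, alpha=0.1):
    """Parametric Leaky ReLU: identity for x >= 0, slope alpha for x < 0,
    so negative activations are damped rather than zeroed as plain ReLU would."""
    return x if x >= 0 else alpha * x
```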

International Journal of Scientific World, 2025
The Cuckoo Search Algorithm (CSA), introduced by Xin-She Yang and Suash Deb in 2009, is a nature-inspired metaheuristic optimization
technique modeled on the brood parasitism behavior of certain cuckoo bird species. Utilizing a Levy flight mechanism, CSA effectively
balances global exploration and local exploitation, making it a versatile tool for addressing non-linear, multi-modal, and high-dimensional
optimization problems.
This paper presents a comprehensive exploration of CSA, detailing its biological foundation, mathematical framework, and algorithmic
processes. Key modifications, including hybrid approaches, adaptive mechanisms, and domain-specific enhancements, are reviewed to
illustrate how CSA has been refined to tackle increasingly complex optimization challenges. Applications spanning engineering, machine
learning, energy systems, robotics, and telecommunications highlight CSA’s versatility and efficiency in solving real-world problems.
Despite its strengths, challenges such as parameter sensitivity and computational demands in large-scale scenarios persist. To address these,
avenues for future research are proposed, including the integration of CSA with emerging technologies like quantum computing and advanced machine learning techniques. This study underscores CSA’s role as a cornerstone of modern metaheuristic optimization, offering
a robust framework for solving diverse and challenging problems.
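
The Levy flight mechanism mentioned above can be sketched with Mantegna's algorithm, the step-length generator commonly used in CSA implementations. This is an illustrative sketch, not the paper's code; beta = 1.5 is the customary stability index.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """One Levy-distributed step length via Mantegna's algorithm:
    draw u ~ N(0, sigma^2) and v ~ N(0, 1), return u / |v|^(1/beta)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

rng = random.Random(0)
steps = [levy_step(rng=rng) for _ in range(10000)]
# Heavy-tailed: most steps are small (local exploitation), with occasional
# very large jumps (global exploration) -- the balance the abstract describes.
```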
Bone feature quantization and systematized attention gate UNet-based deep learning framework for bone fracture classification
Intelligent Data Analysis, Jun 13, 2024

Academic Journal of Nawroz University, May 10, 2024
Ensuring system availability and reliability is crucial in the quickly developing field of cloud computing. The importance of fault tolerance in cloud infrastructure grows as organizations become more reliant on it to support their critical operations. The purpose of this article is to investigate the intricate realm of cloud computing and distributed systems. Specifically, the paper examines the various forms of cloud computing, fault tolerance methods, and frameworks that enable cloud services to be robust and durable. Cloud computing has transformed the way organizations and individuals access and administer computing resources. The paper discusses several deployment options, including public, private, hybrid, and multi-cloud environments, which provide organizations with the advantages of flexibility, scalability, and cost-effectiveness. The inherent flexibility of cloud computing renders it well-suited for a diverse range of applications, spanning from the hosting of websites to the execution of intricate data analytics processes. Despite its tremendous benefits, cloud computing encounters substantial obstacles, including the need to maintain uninterrupted service in the face of hardware failures, network outages, or software errors. The critical importance of fault tolerance in this situation cannot be overstated, as it plays a pivotal role in maintaining the dependability and availability of the system. The primary objective of this study is to examine the utilization of distributed systems as a means to augment fault tolerance within the realm of cloud computing. Distributed systems offer an optimal approach for addressing difficulties related to fault tolerance, owing to their intrinsic capability to divide workloads and data over several nodes.
This approach utilizes redundancy, replication, and the ability to recover seamlessly from disturbances, hence enhancing the resilience and resource efficiency of cloud services. This research reviews novel techniques and frameworks that utilize distributed systems to create fault-tolerant cloud computing architectures, emphasizing their substantial influence on the cloud computing domain. In conclusion, this research report includes a comparative analysis table that encompasses twenty preceding works.
FEM-supported machine learning for residual stress and cutting force analysis in micro end milling of aluminum alloys
International Journal of Mechanics and Materials in Design, Mar 30, 2024
An Integrated Gesture Framework of Smart Entry Based on Arduino and Random Forest Classifier
Indonesian Journal of Computer Science, Feb 16, 2024

Application of Fuzzy Logic for Evaluating Student Learning Outcomes in E-Learning
Lecture notes in networks and systems, 2024
Electronic education significantly expands the possibilities of traditional education, both in terms of electronic educational environments and new educational technologies. An electronic educational environment allows students to access the materials of the course they are studying, and it also offers an opportunity to evaluate the results of learning. This article considers the application of fuzzy logic to the evaluation of students' results when taking a course. Fuzzy logic makes it possible to account for the inaccuracies and uncertainties that are inherent in the educational process. Unlike classical assessment methods, which often operate with rigid rules and clear boundaries, fuzzy logic can take different levels of knowledge, skills, and other criteria into account when assessing learning outcomes. This is particularly important in an educational context where students have different abilities, interests, and learning needs. The application of fuzzy logic allows for a more objective evaluation of student learning outcomes and contributes to improving the quality of education.
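
As an illustration of the kind of fuzzy evaluation described, here is a minimal sketch using triangular membership functions over a 0-100 score. The grade sets and breakpoints are invented for illustration and are not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x = b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grade_memberships(score):
    """Degree (in [0, 1]) to which a score belongs to each fuzzy grade set.
    Overlapping sets let a score be partly 'average' and partly 'good' at once,
    instead of falling on one side of a rigid boundary."""
    return {
        "poor":      tri(score, -1, 0, 50),
        "average":   tri(score, 30, 55, 75),
        "good":      tri(score, 60, 78, 90),
        "excellent": tri(score, 80, 100, 101),
    }

m = grade_memberships(70)  # a 70 is mostly "good", partly "average"
```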

Academic journal of Nawroz University, Sep 7, 2023
Metaheuristic optimization algorithms are widely studied and employed in the scientific literature. Typically, metaheuristic algorithms utilize stochastic operators that make each iteration unique, and they frequently contain control parameters that affect the convergence process; because these effects are mostly neglected in the optimization literature, it is difficult to draw conclusions about them. This paper uses the Big Bang-Big Crunch (BB-BC) metaheuristic algorithm to evaluate the performance of a metaheuristic in relation to its control parameters, demonstrating the effects of varying BB-BC parameter values when solving the Welded Beam Design problem, a well-known engineering optimization problem classified as a single-objective constrained optimization problem. Multiple starting parameter values for BB-BC are evaluated in the experiments in an attempt to find the algorithm's optimal initial settings. The minimum, maximum, and mean values of the penalized objective functions are then computed. Finally, the BB-BC results are compared with various metaheuristic algorithms.
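
A minimal sketch of the BB-BC cycle under discussion: the Big Crunch contracts the population to a fitness-weighted centre of mass, and the Big Bang scatters new candidates around it with a spread that shrinks each iteration. All names and parameter values are illustrative, shown here on a simple sphere function rather than the welded beam problem.

```python
import random

def bbbc_minimize(f, dim, bounds, pop=30, iters=100, seed=0):
    """Minimal Big Bang-Big Crunch sketch for unconstrained minimisation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pts = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best, best_c = None, float("inf")
    for k in range(1, iters + 1):
        costs = [f(p) for p in pts]
        for p, c in zip(pts, costs):
            if c < best_c:
                best, best_c = p[:], c
        # Big Crunch: centre of mass weighted by inverse cost (lower cost = heavier).
        w = [1.0 / (1e-12 + c) for c in costs]
        centre = [sum(wi * p[j] for wi, p in zip(w, pts)) / sum(w)
                  for j in range(dim)]
        # Big Bang: scatter around the centre with spread shrinking as 1/k.
        spread = (hi - lo) / k
        pts = [[min(max(centre[j] + rng.gauss(0, 1) * spread, lo), hi)
                for j in range(dim)] for _ in range(pop)]
    return best, best_c

best, best_c = bbbc_minimize(lambda x: sum(v * v for v in x), dim=2, bounds=(-10, 10))
```

The shrinking spread is one of the control parameters whose influence the paper studies: a slower decay favours exploration, a faster one favours early convergence.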
Retracing-efficient IoT model for identifying the skin-related tags using automatic lumen detection
Intelligent Data Analysis, Nov 2, 2023

1. INTRODUCTION Cryptography is the science concerned with privacy and security. It comprises several cryptosystems, which are essentially collections of algorithms aimed at securing information and data. Cryptosystems are now widely used across digital technology, electronic mail, and internet banking. This paper briefly discusses the most notable cryptosystems and investigates the most common private-key cipher. On January 2, 1997, the National Institute of Standards and Technology (NIST) announced a competition for a new encryption standard. The previous standard, the Data Encryption Standard (DES), in use since November 23, 1976, was no longer capable of providing sufficient security: computers had grown far more powerful, rendering the algorithm unsafe. In 1998 the Electronic Frontier Foundation built a special-purpose machine, the DES cracker, for approximately $250,000, winning the RSA DES Challenge II-2 (Kaufman et al., 2002). The candidates for a new encryption standard included Triple DES and the International Data Encryption Algorithm; however, these alternatives were slow or not free to implement due to patent rights. NIST required an algorithm providing high security that was efficient, flexible, easy to implement, and free to use (Dar et al., 2014). About three years into the contest, NIST chose the Rijndael algorithm (Dar et al., 2014), pronounced "Rhine Dahl" in English (National Institute of Standards and Technology, 2001).
Journal of advanced computer science & technology, Oct 19, 2019
This paper provides an introduction to and a comparison of two widely used evolutionary computation algorithms, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), based on previous studies and research. It describes the basic functionality of the Genetic Algorithm, including steps such as selection, crossover, and mutation.
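
The GA steps the paper walks through (selection, crossover, mutation) can be shown in a runnable toy form on the one-max problem (maximise the number of 1-bits). All parameter choices below are illustrative.

```python
import random

def ga_max_ones(length=30, pop_size=40, gens=60, pmut=0.02, seed=3):
    """Toy Genetic Algorithm on one-max, showing the three classic operators."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    fit = lambda ind: sum(ind)  # fitness = number of 1-bits
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            # Selection: tournament of 3 picks each parent.
            p1 = max(rng.sample(pop, 3), key=fit)
            p2 = max(rng.sample(pop, 3), key=fit)
            # Crossover: one-point recombination of the parents.
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]
            # Mutation: independent bit flips with probability pmut.
            child = [b ^ 1 if rng.random() < pmut else b for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fit)

best = ga_max_ones()
```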

Fusion of Water Evaporation Optimization and Great Deluge: A Dynamic Approach for Benchmark Function Solving
Fusion: Practice and Applications
The Water Evaporation Optimization - Great Deluge model explores the synergy between the Water Evaporation Optimization Algorithm (WEOA) and the Great Deluge Algorithm (GDA) to create a novel fusion model. This research investigates the efficacy of combining these two powerful optimization techniques in addressing benchmark problems. The fusion model incorporates WEOA's dynamic exploration-exploitation dynamics and GDA's global search capabilities. By merging their strengths, the fusion model seeks to enhance convergence efficiency and solution quality. The study presents an experimental analysis of the fusion model's performance across a range of benchmark functions, evaluating its ability to escape local optima and converge towards global optima. The results provide insights into the effectiveness of the fusion model and its potential for addressing complex optimization challenges, along with a comprehensive performance analysis of the application of the proposed fusion model to a cur...
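
The GDA half of the fusion rests on a simple acceptance rule that a sketch makes clear: accept any candidate whose cost is below a "water level" that falls steadily toward zero, which permits uphill moves early on and tightens over time. The objective and step function below are invented for demonstration; this is not the fusion model itself.

```python
import random

def great_deluge_minimize(f, x0, neighbor, iters=5000, seed=0):
    """Great Deluge acceptance sketch: accept any move under the falling level."""
    rng = random.Random(seed)
    x, cost = x0, f(x0)
    level = cost                  # water level starts at the initial cost
    decay = level / iters         # linear 'rain' schedule down to ~0
    best, best_c = x, cost
    for _ in range(iters):
        cand = neighbor(x, rng)
        c = f(cand)
        if c <= level:            # accept anything under the water level,
            x, cost = cand, c     # including uphill moves early in the run
            if c < best_c:
                best, best_c = cand, c
        level -= decay            # the deluge: the level falls each iteration
    return best, best_c

# 1-D quartic with two basins, so early uphill acceptance matters.
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
step = lambda x, rng: x + rng.uniform(-0.2, 0.2)
best, best_c = great_deluge_minimize(f, 3.0, step)
```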

Gesture-based systems have emerged as a prominent breakthrough in the field of smart access control, effectively integrating security measures with user comfort. This study presents a novel gesture detection framework for smart entry systems that combines the computational capabilities of a Random Forest Classifier with the practicality of Arduino-based hardware. Central to the methodology is the use of MediaPipe, an advanced computer vision library, to extract hand-motion landmarks from live video streams. The selected landmarks serve as a comprehensive dataset for training a Random Forest Classifier, chosen for its high accuracy and efficiency on intricate classification tasks. The model exhibits outstanding competence in real-time gesture categorization, attaining the high accuracy that is crucial for dependable entrance control. The Arduino microcontroller plays a vital role in executing the entry mechanism, serving as the intermediary between the gesture detection software and the physical entry-control hardware. The incorporation of gesture recognition technology facilitates a cohesive and prompt user experience, in which identified motions are directly converted into input commands. The system's practical use is demonstrated through a series of detailed tests, which highlight its dependability and efficiency across diverse environmental conditions. The findings underscore the system's potential as a flexible and safe solution for contactless access in many environments, including both private homes and highly protected establishments. Furthermore, the study makes a substantial contribution to the larger domain of human-computer interaction by showcasing the practicality of advanced gesture detection systems in everyday contexts. The suggested framework presents a novel approach to smart entry systems and paves the way for further investigation in smart home automation and interactive systems, where gesture-based interfaces have the potential to deliver user experiences that are both intuitive and efficient.
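
The classification stage can be sketched with scikit-learn's RandomForestClassifier (assumed available). The landmark vectors below are synthetic stand-ins for MediaPipe's 21 hand landmarks, and the two gesture labels are invented; a real pipeline would feed in the coordinates MediaPipe extracts from each video frame.

```python
import random
from sklearn.ensemble import RandomForestClassifier

rng = random.Random(0)

def fake_landmarks(gesture):
    """Synthetic stand-in for a flattened MediaPipe hand: 21 landmarks x (x, y).
    Two invented gestures are separated by a shift in the landmark cloud."""
    base = 0.2 if gesture == "open" else 0.7
    return [base + rng.uniform(-0.05, 0.05) for _ in range(42)]

# Build a labelled training set of landmark vectors.
X = [fake_landmarks(g) for g in ["open", "fist"] * 100]
y = ["open", "fist"] * 100

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Classify one new frame; the predicted label would drive the Arduino relay.
pred = clf.predict([fake_landmarks("open")])[0]
```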