Your search
Results: 58 resources
-
There is still an urgent need for a mathematical model that provides an accurate relationship between software project effort/cost and the cost drivers, and for a powerful algorithm that can optimize such a relationship by developing a mathematical relationship between the model variables. In this paper, we explore the use of Genetic Programming (GP) to develop a software cost estimation model that utilizes both the developed lines of code and the methodology used during development. An application to estimating the effort of several NASA software projects is introduced. The performance of the developed GP-based model was tested and compared to known models in the literature. The developed GP model was able to provide good estimation capabilities compared to other models.
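The kind of model family GP searches over can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's method: it assumes a hypothetical power-law-plus-methodology form `effort = a * KLOC^b + c * ME` and fits the coefficients by seeded random search (standing in for GP's symbolic search); the project records are invented, not the NASA dataset.

```python
import random

# Hypothetical records: (KLOC, methodology score, actual effort in person-months).
# Illustrative numbers only, not the real NASA data.
projects = [(2.1, 28, 5.0), (3.1, 26, 7.0), (4.2, 30, 9.0), (12.5, 35, 23.9)]

def effort(kloc, me, a, b, c):
    # One plausible model family: power law in KLOC plus a methodology term.
    return a * kloc ** b + c * me

def mmre(params):
    # Mean magnitude of relative error, a common cost-estimation metric.
    a, b, c = params
    return sum(abs(act - effort(k, m, a, b, c)) / act
               for k, m, act in projects) / len(projects)

random.seed(0)
best = min(((random.uniform(0, 3), random.uniform(0.5, 1.5),
             random.uniform(-0.2, 0.2)) for _ in range(20000)), key=mmre)
print(round(mmre(best), 3))
```

A real GP run would also evolve the *structure* of the expression, not just the three coefficients fixed here.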
-
Modeling lipase activity aids researchers in optimizing features such as temperature, pH, and substrate concentration to boost enzyme performance. This is essential in biotechnology for improving the productivity and yield of processes such as fermentation, biodiesel production, and bioremediation. Fermentation is a highly complex, multivariable, and non-linear biotechnological process that produces bioactive materials. This study leverages artificial neural networks (ANN) to predict lipase activity in batch fermentation processes, addressing the inherent challenges in weight learning optimization often encountered with traditional algorithms like Backpropagation (BP). Several metaheuristic algorithms were employed to optimize the Multilayered Perceptron (MLP) structure and weights, including Moth-Flame Optimization (MFO), Particle Swarm Optimization (PSO), the Dandelion Optimizer (DO), the Crow Search Algorithm (CSA), and the Salp Swarm Algorithm (SSA), to overcome these limitations. Among the tested algorithms, MFO emerged as the most effective approach, achieving superior performance in weight learning with the best fitness value (i.e., mean square error (MSE)) of 0.6006. MFO-optimized ANN models deliver the most accurate predictions for lipase activity, highlighting their potential as a powerful tool for advancing industrial fermentation process optimization. © 2025 IEEE.
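The core idea of metaheuristic weight learning is that a flat parameter vector encodes the whole MLP, and the network's MSE on the data is the fitness function the optimizer minimizes. A minimal sketch, assuming a tiny 1-input, 3-hidden-unit MLP and invented (time, activity) pairs; seeded random search stands in for MFO and the other optimizers:

```python
import math, random

def mlp_predict(weights, x, n_hidden=3):
    # Decode a flat weight vector into a 1-input, n_hidden, 1-output MLP.
    h = []
    i = 0
    for _ in range(n_hidden):
        w, b = weights[i], weights[i + 1]
        h.append(math.tanh(w * x + b))
        i += 2
    out = sum(w * hj for w, hj in zip(weights[i:i + n_hidden], h))
    return out + weights[i + n_hidden]  # output-layer bias

def fitness(weights, data):
    # MSE between predicted and measured activity: the value the
    # metaheuristic tries to minimize.
    return sum((y - mlp_predict(weights, x)) ** 2 for x, y in data) / len(data)

# Illustrative (normalized time, activity) pairs; the real fermentation
# dataset is not reproduced here.
data = [(0.0, 0.1), (0.25, 0.4), (0.5, 0.9), (0.75, 0.6), (1.0, 0.2)]
dim = 3 * 2 + 3 + 1  # hidden weights+biases, output weights, output bias

random.seed(1)
best = min(([random.uniform(-3, 3) for _ in range(dim)] for _ in range(5000)),
           key=lambda w: fitness(w, data))
print(round(fitness(best, data), 4))
```

Any of the listed metaheuristics (MFO, PSO, DO, CSA, SSA) can be dropped in wherever the random search appears, since they all consume the same `fitness` function.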
-
Economic load dispatch (ELD) is a challenging optimization problem of minimizing the total cost of thermally generated power while satisfying a set of equality and inequality constraints. To solve this problem, we need to maximize the power network load under several operational constraints while minimizing both the cost of power generation and the losses in network transmission. Traditional optimization methods, such as linear programming, were used to solve such problems. Meta-heuristic search algorithms have shown encouraging performance in solving various real-life engineering problems. This paper attempts to provide a comprehensive comparison between nine meta-heuristic search algorithms, including Genetic Algorithms (GAs), Particle Swarm Optimization (PSO), Crow Search Algorithm (CSA), Differential Evolution (DE), Salp Swarm Algorithm (SSA), Harmony Search (HS), Sine Cosine Algorithm (SCA), Multi-Verse Optimizer (MVO), and Moth-Flame Optimization (MFO), for solving the economic load dispatch problem. Our results demonstrate that meta-heuristic search algorithms (i.e., CSA and DE) offer the optimal power set for each power station. These computed power levels fulfill the supply needs while maintaining both minimum power generation costs and minimum power losses in transmission.
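The ELD objective that all nine metaheuristics would share can be written as a penalized cost function: quadratic fuel costs per unit plus penalties for violating the power-balance equality and the generator limits. A minimal sketch with invented coefficients and limits (not a standard test system), and with transmission losses ignored:

```python
# Quadratic fuel-cost model: cost_i(P) = a_i + b_i*P + c_i*P^2.
# Coefficients and limits below are illustrative only.
units = [  # (a, b, c, Pmin, Pmax)
    (500, 5.3, 0.004, 100, 450),
    (400, 5.5, 0.006, 60, 350),
    (200, 5.8, 0.009, 50, 225),
]
DEMAND = 800  # MW; transmission losses ignored in this sketch

def dispatch_cost(P, penalty=1e4):
    # Total fuel cost plus penalties for violating the power-balance
    # equality constraint sum(P) == DEMAND and the generator limits.
    cost = sum(a + b * p + c * p * p for (a, b, c, lo, hi), p in zip(units, P))
    cost += penalty * abs(sum(P) - DEMAND)
    for (_, _, _, lo, hi), p in zip(units, P):
        cost += penalty * (max(0, lo - p) + max(0, p - hi))
    return cost

# A feasible (not necessarily optimal) dispatch:
print(dispatch_cost([400, 250, 150]))  # → 6682.5
```

Each metaheuristic in the comparison then just minimizes `dispatch_cost` over the three-dimensional vector of generator outputs.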
-
In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality and business software product for value fulfillment is reliability. Estimating software reliability early during the software development life cycle saves time and money, as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of failures that may be encountered during the software testing process. In this paper, we explore the advantages of the Grey Wolf Optimization (GWO) algorithm in estimating the SRGM’s parameters, with the objective of minimizing the difference between the estimated and the actual number of failures of the software system. We evaluated three different software reliability growth models: the Exponential Model (EXPM), the Power Model (POWM), and the Delayed S-Shaped Model (DSSM). In addition, we used three different datasets to conduct an experimental study in order to show the effectiveness of our approach.
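The parameter-estimation task can be sketched concretely for the exponential model, where the expected cumulative failures by time t are m(t) = a(1 - e^(-bt)) and the objective is the MSE against observed failure counts. The failure data below is invented for illustration, and a seeded random search stands in for GWO:

```python
import math, random

def expm(t, a, b):
    # Exponential SRGM: expected cumulative failures by time t.
    return a * (1.0 - math.exp(-b * t))

def mse(params, data):
    a, b = params
    return sum((m - expm(t, a, b)) ** 2 for t, m in data) / len(data)

# Illustrative (test week, cumulative failures) pairs, not a published dataset.
data = [(1, 12), (2, 21), (3, 28), (4, 33), (5, 37), (6, 39)]

# The paper uses GWO; a seeded random search stands in for it here.
random.seed(2)
best = min(((random.uniform(30, 60), random.uniform(0.05, 1.0))
            for _ in range(10000)), key=lambda p: mse(p, data))
print(round(mse(best, data), 3))
```

The POWM and DSSM cases differ only in the `expm` formula; the MSE objective and the optimizer stay the same.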
-
Roads should always be in a reliable condition and maintained regularly. One of the problems that must be handled well is pavement cracking. This is a challenging problem facing road engineers, since maintaining roads in a stable condition is needed for both drivers and pedestrians. Many methods have been proposed to handle this problem in order to save time and cost. In this paper, we propose a two-stage method to detect pavement cracks based on Principal Component Analysis (PCA) and a Convolutional Neural Network (CNN) to solve this classification problem. We employed PCA to extract the most significant features using different numbers of PCA components. The proposed approach was trained on the Mendeley Asphalt Crack dataset, which contains 400 images of road cracks at 480×480 resolution. The obtained results show how PCA helped speed up the learning process of the CNN.
-
Image reconstruction based on Electrical Capacitance Tomography (ECT) has been broadly applied in industrial applications. The goal of ECT-based image reconstruction is to locate the distribution of permittivity of the dielectric substances along the cross-section based on the collected capacitance data. In the ECT-based image reconstruction process: (1) the relationship between the capacitance measurements and the permittivity distribution is nonlinear, (2) the capacitance measurements collected during image reconstruction are inadequate due to the limited number of electrodes, and (3) the reconstruction process is subject to noise, leading to an ill-posed problem. Hence, constructing an accurate algorithm for real images is critical to overcoming such restrictions. This paper presents novel image reconstruction methods using Deep Learning for solving the forward and inverse problems of the ECT system to generate high-quality images of conductive materials in the Lost Foam Casting (LFC) process. Here, Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) models were implemented to predict the distribution of metal filling for the ECT-based LFC process. The recurrent connections and the gating mechanism of the LSTM are capable of extracting the contextual information that repeatedly passes through the neural network while filtering out the noise caused by adverse factors. Experimental results showed that the presented ECT-LSTM-RNN model is highly reliable for industrial applications and can be utilized for other manufacturing processes. © 2013 IEEE.
-
In this study, we conducted experiments to model the temperature of two manufacturing processes using various metaheuristic search algorithms. The two processes adopted were the P05 horny steel tool and the AISI304 stainless steel castings machines. Our approach involves building a data-driven model, as traditional search methods for modeling manufacturing problems often struggle to find the global optimum when faced with a complex objective function and numerous decision variables. Bio-inspired metaheuristic search algorithms have shown promising performance in handling multi-modal optimization functions and in efficiently exploring the search space to attain more global results. We applied several metaheuristic search algorithms to find the optimal tuning parameters of a temperature-based model. The results from the case studies demonstrate that Particle Swarm Optimization (PSO) provided the best performance in tuning the model parameters, resulting in minimum modeling error.
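PSO-based parameter tuning of a temperature model can be sketched end to end in a few dozen lines. This is an illustrative stand-in, not the study's actual process model: it assumes a Newton-style cooling curve with two unknown parameters and fits them with a minimal PSO on synthetic observations generated from known values, so the recovered optimum can be checked:

```python
import math, random

T_ENV = 25.0  # ambient temperature (assumed)

def model(t, t0, k):
    # Newton-style cooling: T(t) = T_env + (T0 - T_env) * exp(-k*t).
    return T_ENV + (t0 - T_ENV) * math.exp(-k * t)

# Synthetic observations generated from t0=80, k=0.3, so the optimum is known.
data = [(t, model(t, 80.0, 0.3)) for t in range(0, 10)]

def sse(p):
    # Sum of squared modeling errors: the quantity PSO minimizes.
    t0, k = p
    return sum((y - model(t, t0, k)) ** 2 for t, y in data)

def pso(f, bounds, n=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    random.seed(3)
    dim = len(bounds)
    xs = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]          # personal bests
    gbest = min(pbest, key=f)           # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * random.random() * (pbest[i][d] - xs[i][d])
                            + c2 * random.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
        gbest = min(pbest + [gbest], key=f)
    return gbest

t0, k = pso(sse, [(30.0, 120.0), (0.01, 1.0)])
print(round(t0, 1), round(k, 2))
```

The inertia weight `w` and the cognitive/social coefficients `c1`, `c2` are the usual PSO tuning knobs; the values above are common defaults, not the ones used in the study.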
-
SESSION TITLE: Clinical Prediction and Diagnosis of OSA
-
Establishing an optimal datacenter selection policy within the cloud environment is paramount to maximizing the performance of cloud services. The service broker policy governs the selection of datacenters for user requests. In our research, we introduce an innovative approach that incorporates a genetic algorithm into the service broker policy to help cloud services identify the most suitable datacenters for specific userbases. The effectiveness of our proposed genetic algorithm was rigorously evaluated through experiments conducted on the CloudAnalyst platform. The results clearly indicate that our proposed algorithm surpasses existing service broker policies and previous research in this field in terms of reducing response time and data processing time. The results analysis validates its efficacy and potential for enhancing cloud service performance and reducing the overall cost of the cloud infrastructure.
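A GA-driven broker policy can be sketched as evolving an assignment vector (one datacenter index per userbase) that minimizes total response time. The latency matrix, population size, and operators below are illustrative assumptions, not the paper's CloudAnalyst configuration:

```python
import random

# Hypothetical latency matrix (ms): latency[u][d] = userbase u -> datacenter d.
latency = [
    [50, 200, 300],
    [220, 60, 250],
    [310, 240, 70],
    [120, 90, 260],
]
N_DC = 3

def response_time(assign):
    # Total latency of routing each userbase to its assigned datacenter.
    return sum(latency[u][d] for u, d in enumerate(assign))

def ga(pop=30, gens=40, pm=0.2):
    random.seed(4)
    n = len(latency)
    popn = [[random.randrange(N_DC) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=response_time)
        survivors = popn[:pop // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < pm:       # mutation: reassign one userbase
                child[random.randrange(n)] = random.randrange(N_DC)
            children.append(child)
        popn = survivors + children
    return min(popn, key=response_time)

best = ga()
print(best, response_time(best))
```

In a full broker policy the fitness would also fold in data processing time and cost, exactly as the evaluation in the paper measures.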
-
Meta-heuristic search algorithms were successfully used to solve a variety of problems in engineering, science, business, and finance. Meta-heuristic algorithms share common features since they are population-based approaches that use a set of tuning parameters to evolve new solutions based on the natural behavior of creatures. In this paper, we present a novel nature-inspired search optimization algorithm called the capuchin search algorithm (CapSA) for solving constrained and global optimization problems. The key inspiration of CapSA is the dynamic behavior of capuchin monkeys. The basic optimization characteristics of this new algorithm are designed by modeling the social actions of capuchins during wandering and foraging over trees and riverbanks in forests while searching for food sources. Some of the common behaviors of capuchins during foraging that are implemented in this algorithm are leaping, swinging, and climbing. Leaping is an effective mechanism used by capuchins to jump from tree to tree. The other foraging mechanisms exercised by capuchins, known as swinging and climbing, allow the capuchins to move small distances over trees, tree branches, and the extremities of the tree branches. These locomotion mechanisms eventually lead to feasible solutions of global optimization problems. The proposed algorithm is benchmarked on 23 well-known benchmark functions, as well as solving several challenging and computationally costly engineering problems. A broad comparative study is conducted to demonstrate the efficacy of CapSA over several prominent meta-heuristic algorithms in terms of optimization precision and statistical test analysis. Overall results show that CapSA renders more precise solutions with a high convergence rate compared to competitive meta-heuristic methods. © 2020, Springer-Verlag London Ltd., part of Springer Nature.
-
Ozone is a toxic gas whose chemical composition is distinct from that of oxygen. Breathing ozone in the air can cause severe effects on human health, especially for people who have asthma. It can cause long-lasting damage to the lungs, trigger heart attacks, and might lead to death. Forecasting ozone concentration levels and related pollutant attributes is critical for developing sophisticated environmental safety policies. In this paper, we present three artificial neural network (ANN) models to forecast the daily ozone (O3), coarse particulate matter (PM10), and fine particulate matter (PM2.5) concentrations in a highly polluted city in the Republic of China. The proposed models are (1) a recurrent multilayer perceptron (RMLP), (2) a recurrent fuzzy neural network (RFNN), and (3) a hybridization of the RFNN and the grey wolf optimizer (GWO), referred to as the RMLP-ANN, RFNN, and RFNN-GWO models, respectively. The performance of the proposed models is compared with other conventional models previously reported in the literature. The comparative results showed that the proposed models presented outstanding performance. The RFNN-GWO model revealed superior results in the modeling of O3, PM10, and PM2.5 compared with the RMLP-ANN and RFNN models. © 2020, Springer Nature B.V.
-
In the last decade, a wide range of machine learning approaches were proposed and experimented with to model highly nonlinear manufacturing processes. However, improving the performance of such models is challenging due to the complexity and high dimensionality of manufacturing processes in general. In this paper, we propose bidirectional echo state reservoir networks (Bi-ESNs) trained using the support vector machine plus privileged information method (SVM+) to model a winding machine process. The proposed model is applied, tested, and compared to models reported in the literature, such as the classical ESN with linear regression, an ESN with a linear SVM readout, genetic programming, a feedforward neural network with backpropagation, a radial basis function network, an adaptive neural fuzzy inference system, and a local linear wavelet neural network. The developed results show that Bi-ESNs trained with SVM+ are promising: they were able to provide better generalization performance compared to the other models.
-
With the increasing interest in natural language processing, text summarization has become essential for condensing large volumes of data into concise and meaningful summaries. Extractive summarization, which involves selecting key sentences based on textual features, has gained attention due to its efficiency and effectiveness. This research explores extractive summarization using multiple machine learning classifiers, including Support Vector Machines (SVM), Logistic Regression (LR), Decision Trees (DT), K-Nearest Neighbors (KNN), and Random Forest (RF). Our findings indicate that the Random Forest model achieved the highest accuracy, reaching 80% in classifying sentences for summary generation. Additionally, we evaluated text classification on the BBC dataset using ChatGPT, which attained an accuracy of 62%. Furthermore, comparisons with results from prior research confirm the competitive performance of our approach, reinforcing the potential of machine learning models in extractive summarization. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2026.
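The sentence-selection step at the heart of extractive summarization can be sketched without any trained classifier: score each sentence by the frequency of its non-stopword terms and keep the top-k in original order. This is a simple frequency-based stand-in for the classifier-based selection used in the study, with an invented toy document:

```python
import re
from collections import Counter

def summarize(text, k=2):
    # Split into sentences, build a term-frequency table, score each
    # sentence by the average frequency of its content words, keep top k.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    words = re.findall(r'[a-z]+', text.lower())
    stop = {'the', 'a', 'an', 'is', 'are', 'of', 'to', 'and', 'in', 'it'}
    freq = Counter(w for w in words if w not in stop)

    def score(s):
        toks = re.findall(r'[a-z]+', s.lower())
        return sum(freq[t] for t in toks if t not in stop) / max(len(toks), 1)

    ranked = sorted(sentences, key=score, reverse=True)[:k]
    # Preserve the original sentence order in the summary.
    return ' '.join(s for s in sentences if s in ranked)

doc = ("The match ended in a draw. Fans of football filled the stadium. "
       "Football analysts praised the football tactics on display. "
       "Weather delayed kickoff briefly.")
print(summarize(doc))
```

A classifier-based pipeline would replace `score` with per-sentence features (position, length, title overlap, term frequency) fed to SVM, LR, DT, KNN, or RF.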
-
The rapid growth of technology has brought about many advantages, but has also made networks more susceptible to security threats. Intrusion Detection Systems (IDS) play a vital role in protecting computer networks against malicious activities. Given the dynamic and constantly evolving nature of cyber threats, these systems must continuously adapt to maintain their effectiveness. Machine Learning (ML) methods have gained prominence as effective tools for constructing IDS that offer both high accuracy and efficiency. This study conducts a performance assessment of several machine learning classifiers, including Random Forests (RF), Decision Trees (DT), and Support Vector Machines (SVM), in addressing multiclass intrusion detection as a means to counter cybersecurity threats. The NSL-KDD dataset, which includes various network attacks, served as the basis for our experimental evaluation. The research explores two classification scenarios: a five-class and a three-class model, analyzing their impact on detection performance. The results demonstrate that RF consistently achieves the highest accuracy (85.42%) on the three-class scenario testing set, highlighting its effectiveness in handling patterns and non-linear relationships within the intrusion data. Furthermore, reducing the classification complexity (three classes vs. five classes) significantly improves model generalization, as evidenced by the reduced performance gap between training and testing data. Friedman’s rank test and Holm’s post-hoc analysis were applied to ensure statistical rigor, confirming that RF outperforms DT and SVM in all evaluation metrics. These findings establish RF as the most robust classifier for intrusion detection and underscore the importance of simplifying classification tasks for improved IDS performance. © (2025), (Science Publications). All rights reserved.
-
Tomato plant diseases pose a significant threat to agricultural productivity, resulting in substantial economic losses. Early and accurate diagnosis is crucial for effective disease management. This paper describes the design and implementation of expert systems for tomato disease detection using the CLIPS (C Language Integrated Production System) platform. The tool is designed to help farmers and agronomists accurately identify diseases affecting tomato crops by simulating knowledge from professional experts. We carefully developed a set of rules to distinguish leaf blight symptoms from those of other tomato diseases and provided recommendations to minimize crop losses and maximize yields. The expert system was developed using a forward-chaining inference engine, and its performance was evaluated through a set of real-world test cases, demonstrating a high level of accuracy and consistency in decision-making. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2026.
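The forward-chaining inference described here can be sketched in plain code: rules fire whenever all of their antecedent facts hold, adding new facts until the knowledge base stabilizes. The symptoms, diseases, and recommendations below are illustrative placeholders, not the paper's actual CLIPS rule base:

```python
# Minimal forward-chaining inference, mirroring how a CLIPS rule base fires:
# each rule is (antecedent facts, consequent fact). Illustrative rules only.
RULES = [
    ({'brown-spots', 'yellow-halo'}, 'early-blight'),
    ({'dark-lesions', 'white-mold'}, 'late-blight'),
    ({'early-blight'}, 'recommend-copper-fungicide'),
    ({'late-blight'}, 'recommend-destroy-infected-plants'),
]

def forward_chain(facts, rules=RULES):
    facts = set(facts)
    changed = True
    while changed:  # keep firing rules until no rule adds a new fact
        changed = False
        for antecedent, consequent in rules:
            if antecedent <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

result = forward_chain({'brown-spots', 'yellow-halo'})
print(sorted(result))
```

Note how the diagnosis rule and the recommendation rule chain: observing the two symptoms derives `early-blight`, which in turn triggers the recommendation, just as a CLIPS agenda would fire rules in sequence.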