Your search
Results: 10 resources
-
Maintaining roads in good condition is critical to safe driving and is an obligation of both transportation and regulatory maintenance authorities. For a safe driving environment, it is essential to inspect road surfaces frequently for defects and degradation. This process is labor-intensive and requires specialized expertise, so visual examination of road cracks is challenging; computer vision and robotics tools should therefore be employed to support this mission. This research presents our initial idea of simulating an Autonomous Robot System (ARS) to perform pavement assessments. The ARS for crack inspection is a camera-equipped mobile robot (here, an Android phone) that collects images of the road. The proposed system is simulated using an mBot robot fitted with an Android phone that gathers video streams to be processed on a server running a pre-trained Convolutional Neural Network (CNN) that can recognize the presence of cracks. The proposed CNN model attained 99.0% accuracy in the training case and 97.5% in the testing case. The results of this research are suitable for application with a commercial mobile robot as an autonomous platform for pavement inspections. © 2022 Little Lion Scientific.
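The crack-recognition pipeline above can be illustrated with a minimal sketch. The 3×3 edge kernel, the conv → ReLU → global-max-pool stages, and the decision threshold are illustrative assumptions, not the paper's actual CNN architecture:

```python
def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with a kernel."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    return [[max(0.0, v) for v in row] for row in feature_map]

def global_max_pool(feature_map):
    return max(max(row) for row in feature_map)

# A vertical-edge kernel responds strongly to crack-like intensity jumps.
EDGE_KERNEL = [[1, 0, -1],
               [1, 0, -1],
               [1, 0, -1]]

def crack_score(image):
    """One conv layer followed by ReLU and global max pooling."""
    return global_max_pool(relu(conv2d(image, EDGE_KERNEL)))

def has_crack(image, threshold=1.5):
    """Binary decision; the threshold here is a made-up stand-in for
    the trained classifier head."""
    return crack_score(image) > threshold
```

A trained CNN learns many such kernels from labeled road images instead of using a single hand-crafted one; the server-side model in the paper plays that role.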
-
Techniques for processing medical data advance significantly every day. An accurate data classification model can help determine a patient's disease and diagnose its severity in the medical domain, thus easing doctors' treatment burdens. Nonetheless, medical data analysis presents challenges due to uncertainty, correlations between various measurements, and the high dimensionality of the data. These challenges burden statistical classification models. Machine Learning (ML) and data mining approaches have proven effective in recent years in gaining a deeper understanding of the importance of these aspects. This research adopts a well-known supervised learning classification model, the Decision Tree (DT). A DT is a tree structure consisting of a root node, connected branches, and internal and terminal nodes. At each node a decision is made, as in a rule-based system. This type of model helps researchers and physicians better diagnose a disease. To reduce the complexity of the proposed DT, we explored using a Feature Selection (FS) method to design a simpler diagnosis model with fewer factors, which also reduces the cost of the data collection stage. A comparative analysis was conducted between the developed DT and other ML models, such as Logistic Regression (LR), Support Vector Machine (SVM), and Gaussian Naive Bayes (GNB), to demonstrate the effectiveness of the developed model. The DT model achieves a notable accuracy of 93.78% and an ROC value of 0.94, outperforming the compared algorithms. The developed DT model provided promising results and can help diagnose heart disease. © 2024, Zarka Private University. All rights reserved.
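The FS idea described above, ranking factors so that the DT can be built from fewer of them, can be sketched with an information-gain filter. The toy dataset in the test and the choice of information gain as the ranking criterion are assumptions for illustration, not the paper's actual FS method:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting on one categorical feature."""
    base = entropy(labels)
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[feature], []).append(y)
    return base - sum(len(g) / n * entropy(g) for g in groups.values())

def select_features(rows, labels, k):
    """FS stage: keep the k features with the highest information gain."""
    n_features = len(rows[0])
    ranked = sorted(range(n_features),
                    key=lambda f: information_gain(rows, labels, f),
                    reverse=True)
    return ranked[:k]
```

The same criterion drives split selection inside a DT, which is why an information-gain filter pairs naturally with a tree model.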
-
Artificial intelligence (AI) is a distinct area of computer science that enables machines to handle and interpret complex data effectively. In recent years, there has been a dramatic uptick in studies devoted to AI, with many focusing on healthcare and medical research. This article delves deep into the potential of AI in several areas of healthcare, including the diagnosis and treatment of diseases. In recent years, machine learning (ML) and deep learning (DL) have emerged as the most widely used AI technologies in the healthcare industry. Moreover, this research demonstrates the crucial significance of advancing AI technologies, namely generative AI and large language models (LLMs), highlighting their revolutionary influence on healthcare. Finally, we highlight upcoming innovations and offer insights into the significant ethical, medical, and technological challenges associated with AI in healthcare. © 2025 Nova Science Publishers, Inc. All rights reserved.
-
The Earth’s population is growing at a rapid rate, while the availability of water resources remains limited. Water is required for various purposes, including drinking, agriculture, industry, recreation, and development. Accurate forecasting of river flows can have a significant economic impact, particularly in agricultural water management and planning during water resource scarcity. Developing precise river flow forecasting models can greatly improve the management of water resources in many countries. In this study, we propose a two-phase model for predicting the flow of the Blackwater River, located in the South Central United States. In the first phase, we use Multigene Symbolic Regression Genetic Programming (MG-GP) to develop a mathematical model. In the second phase, Particle Swarm Optimization (PSO) is employed to fine-tune the model parameters, which improves the prediction accuracy of the model. The fine-tuned model exhibits 96% and 94% accuracy in the training and testing cases, respectively. © 2023, International Journal of Advanced Computer Science and Applications. All Rights Reserved.
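The second-phase fine-tuning can be sketched with a minimal PSO. The swarm size, inertia and acceleration coefficients, and the toy least-squares objective below are assumptions for illustration, not the paper's actual settings or the MG-GP model:

```python
import random

def pso(loss, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=0):
    """Minimal PSO: minimizes `loss` over a `dim`-dimensional parameter vector."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = loss(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the paper the loss would measure the mismatch between the MG-GP model's predicted and observed river flows; here a simple linear fit stands in for it.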
-
In modern-day computing, cloud services are widely used in every aspect of life, so user satisfaction depends on the effectiveness and efficiency of those services. The cloud's service broker policy maintains this effectiveness and efficiency by providing the rules and norms by which a data center is selected for a userbase request. This paper proposes a genetic algorithm-based service broker policy that provides the optimal sequence of data centers for different userbases based on their requirements. This research aims to find an optimal data center for each userbase, achieving user satisfaction by minimizing the cloud service's response time and data processing time. We evaluated our proposed genetic algorithm-based service broker policy in the CloudAnalyst platform on different real-world scenarios. Simulation results indicate that our proposed genetic algorithm outperforms existing traditional algorithms. © 2023 IEEE.
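A minimal sketch of the GA-based broker idea follows, assuming a hypothetical latency matrix and simple truncation selection, one-point crossover, and point mutation; the paper's actual operators, objective, and CloudAnalyst integration are not reproduced here:

```python
import random

# Hypothetical cost matrix: LATENCY[u][d] = response time (ms) of
# userbase u when served by data center d.
LATENCY = [
    [120, 40, 200],
    [60, 150, 90],
    [210, 80, 30],
]

def fitness(assignment):
    """Total response time of mapping each userbase to one data center."""
    return sum(LATENCY[u][d] for u, d in enumerate(assignment))

def ga_broker(latency, pop_size=30, gens=50, mut_rate=0.2, seed=1):
    """Evolve data-center assignments that minimize total response time."""
    rng = random.Random(seed)
    n_users, n_dcs = len(latency), len(latency[0])
    cost = lambda a: sum(latency[u][d] for u, d in enumerate(a))
    pop = [[rng.randrange(n_dcs) for _ in range(n_users)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_users)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mut_rate:            # point mutation
                child[rng.randrange(n_users)] = rng.randrange(n_dcs)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)
```

A real broker objective would combine response time with data processing time and load; the encoding (one gene per userbase) carries over unchanged.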
-
Photovoltaic systems have proven to be one of the most widely used renewable energy sources and the best replacement for conventional energy. Yet their nonlinear nature remains a challenge when it comes to extracting maximum power from photovoltaic modules. Therefore, in this work, a nonlinear PID (NPID) controller has been used to meet the requirements of the photovoltaic system. In addition, to improve system performance and response, metaheuristic search algorithms were introduced into the tuning process of both the NPID controller and the conventional PID controller parameters in order to compare them. Using artificial intelligence to fine-tune the controller parameters enables the optimum values of the proportional, integral, derivative, and nonlinear gains to be determined as system conditions change. Finally, a comparison between the applied algorithms is conducted in terms of efficiency, rise time, settling time, and overshoot, as well as overall system stability.
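The NPID idea can be sketched as follows. The error-dependent proportional gain kp + kn·|e| is one common NPID form, and the first-order test plant and the gain values are assumptions for illustration; in the paper, the four gains are the quantities the metaheuristic search tunes:

```python
class NonlinearPID:
    """PID with an error-dependent proportional gain: large errors are
    pushed harder (kp + kn*|e|), small errors are handled gently."""
    def __init__(self, kp, ki, kd, kn, dt):
        self.kp, self.ki, self.kd, self.kn, self.dt = kp, ki, kd, kn, dt
        self.integral = 0.0
        self.prev_e = 0.0

    def update(self, setpoint, measurement):
        e = setpoint - measurement
        self.integral += e * self.dt
        deriv = (e - self.prev_e) / self.dt
        self.prev_e = e
        return ((self.kp + self.kn * abs(e)) * e     # nonlinear P term
                + self.ki * self.integral            # I term
                + self.kd * deriv)                   # D term

def simulate_step(controller, steps=2000, dt=0.01):
    """Drive a toy first-order plant x' = -x + u toward setpoint 1.0
    (a stand-in for the PV converter dynamics)."""
    x = 0.0
    for _ in range(steps):
        u = controller.update(1.0, x)
        x += (-x + u) * dt
    return x
```

A metaheuristic tuner would wrap `simulate_step` in a cost function (e.g. integrated absolute error plus overshoot penalty) and search over (kp, ki, kd, kn).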
-
Modeling of nonlinear industrial systems comprises two key stages: selecting a model structure with a compact parameter list, and selecting an algorithm to estimate the values of those parameters. Thus, there is a need to develop a sufficiently adequate model that characterizes the behavior of industrial systems and represents experimental data sets. The data collected from many industrial systems may exhibit high nonlinearity and multiple constraints. Meanwhile, a thorough model of an industrial process is essential for model-based control systems. In this work, we explore the use of a proposed Enhanced Cuckoo Search (ECS) algorithm to address a parameter estimation problem for both linear and nonlinear model structures of a real winding process. The performance of the developed models was compared with that of other mainstream meta-heuristics applied to model the same process, as well as with models developed using conventional modeling methods. Several evaluation tests were performed to judge the efficiency of the ECS-based models, which showed performance superior to that achieved by the other modeling methods in both training and testing cases. © 2022, This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply.
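The parameter-estimation setup can be sketched with a basic (non-enhanced) Cuckoo Search. The Lévy-flight step, nest count, bounds, and quadratic toy objective are assumptions for illustration; the paper's specific enhancements are not reproduced:

```python
import math
import random

def levy_step(rng, beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(loss, dim, n_nests=15, iters=200, pa=0.25,
                  lo=-5.0, hi=5.0, seed=3):
    """Basic CS: Levy-flight proposals plus abandonment of worst nests."""
    rng = random.Random(seed)
    clip = lambda x: max(lo, min(hi, x))
    nests = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    vals = [loss(n) for n in nests]
    best = min(range(n_nests), key=lambda i: vals[i])
    for _ in range(iters):
        for i in range(n_nests):
            # Levy flight scaled by distance to the current best nest
            cand = [clip(nests[i][d]
                         + 0.01 * levy_step(rng) * (nests[i][d] - nests[best][d]))
                    for d in range(dim)]
            v = loss(cand)
            j = rng.randrange(n_nests)          # replace a random nest if better
            if v < vals[j]:
                nests[j], vals[j] = cand, v
        # abandon a fraction pa of the worst nests (fresh random eggs)
        order = sorted(range(n_nests), key=lambda i: vals[i], reverse=True)
        for i in order[:int(pa * n_nests)]:
            nests[i] = [rng.uniform(lo, hi) for _ in range(dim)]
            vals[i] = loss(nests[i])
        best = min(range(n_nests), key=lambda i: vals[i])
    return nests[best], vals[best]
```

For the winding process, `loss` would be the prediction error of the candidate model parameters against the experimental data; a simple quadratic stands in for it in the test.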
-
The Crow Search Algorithm (CSA) is a promising meta-heuristic method based on the intelligent behavior of crows in nature. The algorithm lacks a good representation of its individuals’ memory and, as with many other meta-heuristics, struggles to balance exploration and exploitation efficiently. These defects may lead to early convergence to local optima. To cope with such issues, we propose a Memory-based Hybrid CSA (MHCSA) that incorporates the Particle Swarm Optimization (PSO) algorithm. This hybridization reinforces the diversity of CSA and balances its search for promising solutions to achieve robust performance. The memory element of MHCSA is initialized with the best solutions (pbest) of PSO to exploit the most promising search areas, and the best positions of the CSA’s individuals are improved using PSO’s best solution found so far (gbest) together with pbest. Another flaw of CSA is its use of a fixed flight length and a fixed awareness probability to control exploration and exploitation, respectively. This issue is circumvented here by replacing these constants with adaptive functions that provide a better balance between exploration and exploitation over the course of the iterations. The competence of MHCSA was demonstrated by testing it on seventy-three standard and computationally complex benchmark functions, and its applicability was substantiated by solving seven engineering design problems. The results show that MHCSA eliminates the problem of early convergence and further improves the balance of exploration and exploitation. Moreover, MHCSA ranked first among CSA, PSO, robust variants of CSA, and other strong competing methods in terms of accuracy and stability. © 2022, The Author(s), under exclusive licence to Springer Nature B.V.
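The adaptive replacements for CSA's fixed flight length and awareness probability can be sketched with simple schedules. The linear form and the numeric bounds are assumptions for illustration, not the paper's actual adaptive functions:

```python
def adaptive_flight_length(t, t_max, fl_max=2.5, fl_min=0.5):
    """Flight length: large early steps (exploration) shrinking linearly
    to small late steps (exploitation)."""
    return fl_max - (fl_max - fl_min) * t / t_max

def adaptive_awareness_prob(t, t_max, ap_max=0.3, ap_min=0.05):
    """Awareness probability: frequent random relocation early
    (exploration), decaying as the search converges."""
    return ap_max - (ap_max - ap_min) * t / t_max
```

Inside the CSA loop, iteration `t` would look up both values each step instead of using the fixed constants of the original algorithm.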
-
This research introduces the application of an innovative bio-inspired metaheuristic technique, the Crow Search Algorithm (CSA), to model a crucial industrial process: hot rolling manufacturing. Inspired by the foraging patterns of crows, the CSA has demonstrated its prowess in solving diverse optimization challenges. In this study, the CSA is harnessed to fine-tune the parameters of a simulation model that predicts the force exerted during a hot rolling procedure. The proposed model takes into consideration a range of influential factors, including the initial temperature (Ti), width (Ws), carbon equivalent (Ce), gauge (hi), draft (i), and roll diameter (R). The findings underscore the CSA's capability to deliver exceptional modeling performance characterized by swift convergence and high solution quality. The close agreement between the proposed model and the CSA provides a robust and efficient avenue for optimizing the hot rolling process, with the potential for expansion into other manufacturing domains. The computational and simulation results demonstrate that the proposed CSA-based approach outperformed other meta-heuristic search algorithms, such as the Salp Swarm Algorithm (SSA), Dandelion Optimizer (DO), Particle Swarm Optimization (PSO), Gray Wolf Optimizer (GWO), and Moth-Flame Optimization (MFO), in all test cases. The CSA achieved the highest coefficient of determination (R²), equal to 0.97244, and the lowest mean squared error (MSE), equal to 1904.97, compared to its competitor algorithms. © 2024 IEEE.
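The two metrics reported above can be computed as follows; these are the standard definitions of MSE and the coefficient of determination, with a small made-up example in the test:

```python
def mse(y_true, y_pred):
    """Mean squared error between measured and predicted values."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

During CSA-based fitting, one of these (typically MSE) serves as the objective being minimized, while R² is reported to judge how much of the force variance the model explains.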
-
Urban air pollution, a mix of industrial, traffic, forest-burning, and agricultural pollutants, significantly impacts human health, plants, and economic growth. Ozone exposure can lead to mortality, heart attacks, and lung damage, necessitating complex environmental safety regulations informed by forecasts of ozone concentrations and associated pollutants. This study proposes a hybrid method, RFNN-GOA, combining a recurrent fuzzy neural network (RFNN) and the grasshopper optimization algorithm (GOA) to estimate and forecast daily ozone (O3) in specific urban areas, namely Kopački Rit and the city of Osijek in Croatia, with the aim of improving air quality, human health, and ecosystems. Due to the intricate structure of atmospheric particles, modeling of O3 likely poses the biggest challenge in air pollution today. The dataset used by the proposed RFNN-GOA model for predicting O3 concentrations in each explored area consists of the following air pollutants: NO, NO2, CO, SO2, O3, PM10, and PM2.5; and five meteorological elements: temperature, relative humidity, wind direction, wind speed, and pressure. The RFNN-GOA method optimizes the parameters of the membership functions and the rule premises, demonstrating robustness and reliability compared to other identifiers and indicating its superiority over competing methods. The RFNN-GOA method demonstrated superior accuracy in the city of Osijek and the Kopački Rit area, with variance-accounted-for (VAF) values of 91.135%, 83.676%, 87.807%, and 79.673%, compared to the RFNN method's corresponding values of 85.682%, 80.687%, 80.808%, and 74.202% in the training and testing phases, respectively. This reveals that RFNN-GOA increased the average VAF in the city of Osijek and the Kopački Rit area by over 5% and 8%, respectively. © The Author(s), under exclusive licence to Springer Nature Switzerland AG 2024.
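The VAF criterion used above can be computed as follows; this is the standard definition (one minus the ratio of error variance to measurement variance, in percent), and the sample values in the test are made up:

```python
def variance(xs):
    """Population variance of a sequence."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def vaf(y_true, y_pred):
    """Variance-accounted-for, in percent. Assumes the measured series
    is not constant (variance of y_true must be nonzero)."""
    err = [t - p for t, p in zip(y_true, y_pred)]
    return (1 - variance(err) / variance(y_true)) * 100.0
```

Note that VAF ignores any constant bias in the predictions (a constant error has zero variance), which is why it is usually reported alongside an error magnitude metric.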
Resource type
- Book Section (1)
- Conference Paper (3)
- Journal Article (6)
Resource language
- English (7)