Your search
Results: 291 resources
-
Meta-heuristic search algorithms have been successfully used to solve a variety of problems in engineering, science, business, and finance. Meta-heuristic algorithms share common features: they are population-based approaches that use a set of tuning parameters to evolve new solutions based on the natural behavior of creatures. In this paper, we present a novel nature-inspired search optimization algorithm called the capuchin search algorithm (CapSA) for solving constrained and global optimization problems. The key inspiration of CapSA is the dynamic behavior of capuchin monkeys. The basic optimization characteristics of this new algorithm are designed by modeling the social actions of capuchins during wandering and foraging over trees and riverbanks in forests while searching for food sources. Some of the common foraging behaviors of capuchins implemented in this algorithm are leaping, swinging, and climbing. Leaping is an effective mechanism that capuchins use to jump from tree to tree. The other foraging mechanisms, swinging and climbing, allow the capuchins to move small distances over trees, tree branches, and the extremities of the tree branches. These locomotion mechanisms eventually lead to feasible solutions of global optimization problems. The proposed algorithm is benchmarked on 23 well-known benchmark functions, as well as on several challenging and computationally costly engineering problems. A broad comparative study is conducted to demonstrate the efficacy of CapSA over several prominent meta-heuristic algorithms in terms of optimization precision and statistical test analysis. Overall, the results show that CapSA renders more precise solutions with a higher convergence rate than competitive meta-heuristic methods. © 2020, Springer-Verlag London Ltd., part of Springer Nature.
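The abstract does not give CapSA's update equations, but the general shape of such a population-based metaheuristic can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the move rule (pull toward the best-so-far solution plus Gaussian noise, standing in for the leaping/swinging/climbing rules), the test objective, and all parameters are assumptions.

```python
import random

def population_search(objective, dim, bounds, pop_size=20, iterations=100, seed=0):
    """Generic population-based metaheuristic loop (illustrative only;
    CapSA's actual capuchin-inspired update rules are not reproduced here)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize a random population of candidate solutions within bounds.
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=objective)
    for _ in range(iterations):
        for i, sol in enumerate(pop):
            # Move each candidate partway toward the best-so-far solution,
            # plus a small random perturbation; accept only improvements.
            trial = [x + rng.uniform(0, 1) * (b - x) + rng.gauss(0, 0.1)
                     for x, b in zip(sol, best)]
            trial = [min(max(t, lo), hi) for t in trial]  # clamp to bounds
            if objective(trial) < objective(sol):
                pop[i] = trial
        best = min(pop + [best], key=objective)  # best is monotone non-increasing
    return best

sphere = lambda v: sum(x * x for x in v)  # a standard benchmark function
best = population_search(sphere, dim=3, bounds=(-5.0, 5.0))
```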
-
A multistage biometric verification system uses multiple biometrics and/or multiple biometric verifiers to generate a verification decision. The core of a multistage biometric verification system is the reject option, which allows a stage to withhold a genuine/impostor decision when it is not confident enough. This paper studies the effectiveness of symmetric rejection for multistage biometric verification systems. The symmetric rejection method determines the reject region by symmetrically rejecting equal proportions of genuine and impostor scores. The applicability of a multistage biometric verification system depends on how secure and user-convenient it is, which is measured by the performance–cost trade-off. This paper analyzes the performance–cost trade-off of the symmetric rejection method through extensive experiments. Experiments are performed on two biometric databases: (1) the publicly available NIST database and (2) a keystroke database. In addition, the symmetric rejection method is empirically compared with two existing rejection methods: (1) the sequential probability ratio test-based method, which uses score fusion, and (2) Marcialis et al.’s method, which does not. Results demonstrate a strong effect of the symmetric rejection method on creating a secure and user-convenient multistage biometric verification system.
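One plausible reading of symmetric rejection can be sketched as follows: defer an equal fraction r of genuine and impostor scores near the decision boundary, assuming genuine scores generally run higher than impostor scores. The threshold rule, the nearest-rank quantile, and the score data below are illustrative assumptions, not the paper's exact procedure.

```python
def quantile(sorted_vals, q):
    """Nearest-rank quantile of a pre-sorted list (0 <= q <= 1)."""
    idx = min(int(q * len(sorted_vals)), len(sorted_vals) - 1)
    return sorted_vals[idx]

def symmetric_reject_region(genuine, impostor, r=0.2):
    """Defer the lowest fraction r of genuine scores and the highest
    fraction r of impostor scores (assumes genuine scores run higher)."""
    lower = quantile(sorted(genuine), r)        # genuine scores below: deferred
    upper = quantile(sorted(impostor), 1.0 - r) # impostor scores above: deferred
    return lower, upper

def stage_decision(score, lower, upper):
    """Decide when confident; defer to the next verification stage otherwise."""
    if score >= upper:
        return "genuine"
    if score < lower:
        return "impostor"
    return "defer"

genuine = [0.4, 0.5, 0.6, 0.7, 0.8, 0.9]       # illustrative matcher scores
impostor = [0.1, 0.2, 0.3, 0.45, 0.55, 0.65]
lower, upper = symmetric_reject_region(genuine, impostor)
```

Scores falling between the two thresholds are the reject region that gets passed to the next stage.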
-
Image creation and retention are growing at an exponential rate. Individuals produce more images today than ever in history, and these images often contain family members. In this paper, we develop a framework to detect or identify family members in a face image dataset. The ability to identify family in a dataset of images could have a critical impact on finding lost and vulnerable children, identifying terror suspects, analyzing social media interactions, and other practical applications. We evaluated our framework by performing experiments on two facial image datasets, Y-Face and KinFaceW, comprising 37 and 920 images, respectively. We tested two feature extraction techniques, namely PCA and HOG, and three machine learning algorithms, namely K-Means, agglomerative hierarchical clustering, and k-nearest neighbors. We achieved promising results, with a maximum detection rate of 94.59% using K-Means, 89.18% with agglomerative clustering, and 77.42% using k-nearest neighbors. © 2020 World Scientific Publishing Company.
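The clustering step can be illustrated with plain K-Means (Lloyd's algorithm). In the paper's pipeline the input vectors would be PCA or HOG face descriptors; the well-separated toy 2-D points below are stand-ins, and the deterministic farthest-first seeding is an implementation choice, not the paper's.

```python
def kmeans(points, k, iterations=20):
    """Plain K-Means (Lloyd's algorithm) with deterministic
    farthest-first initialization."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    # Farthest-first seeding: start from the first point, then repeatedly
    # add the point farthest from all chosen centers.
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))
    for _ in range(iterations):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: d2(p, centers[j]))].append(p)
        # Recompute centers as cluster means (keep old center if cluster empty).
        centers = [tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated toy feature groups (stand-ins for face descriptors).
pts = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25), (0.9, 0.8), (0.8, 0.9), (0.85, 0.95)]
centers, clusters = kmeans(pts, k=2)
```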
-
Ozone is a toxic gas that is chemically distinct from oxygen. Breathing ozone can cause severe effects on human health, especially for people who have asthma; it can cause long-lasting damage to the lungs, trigger heart attacks, and may lead to death. Forecasting ozone concentration levels and related pollutant attributes is critical for developing sophisticated environmental safety policies. In this paper, we present three artificial neural network (ANN) models to forecast the daily ozone (O3), coarse particulate matter (PM10), and fine particulate matter (PM2.5) concentrations in a highly polluted city in the Republic of China. The proposed models are (1) a recurrent multilayer perceptron (RMLP), (2) a recurrent fuzzy neural network (RFNN), and (3) a hybridization of the RFNN and the grey wolf optimizer (GWO), referred to as the RMLP-ANN, RFNN, and RFNN-GWO models, respectively. The performance of the proposed models is compared with that of other conventional models previously reported in the literature. The comparative results showed that the proposed models deliver outstanding performance, and the RFNN-GWO model revealed superior results in the modeling of O3, PM10, and PM2.5 compared with the RMLP-ANN and RFNN models. © 2020, Springer Nature B.V.
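The GWO component follows the well-known grey wolf update scheme (coefficients A and C, a linearly decreasing parameter a, and positions averaged over the three best wolves). The sketch below applies it to a simple test function rather than to RFNN training; the population size, iteration count, and objective are assumptions.

```python
import random

def gwo(objective, dim, bounds, n_wolves=15, iterations=100, seed=1):
    """Standard grey wolf optimizer loop (here it just minimizes a
    test function; the paper couples it with an RFNN)."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iterations):
        wolves.sort(key=objective)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]  # three best wolves
        a = 2.0 - 2.0 * t / iterations  # decreases linearly from 2 to 0
        for i in range(n_wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = 2.0 * a * rng.random() - a   # encircling coefficient
                    C = 2.0 * rng.random()
                    x += leader[d] - A * abs(C * leader[d] - wolves[i][d])
                new.append(min(max(x / 3.0, lo), hi))  # average, clamp to bounds
            wolves[i] = new
    return min(wolves, key=objective)

sphere = lambda v: sum(x * x for x in v)
best = gwo(sphere, dim=2, bounds=(-10.0, 10.0))
```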
-
Retweeting is an important way of information propagation on Twitter. In this paper, we investigate the sentiment correlation between regular tweets and retweets. We anticipate that our investigation sheds light on how the sentiment of regular tweets impacts the retweets of different sentiments. We propose a method for measuring the sentiment of tweets. We categorize Twitter users into different groups by several criteria: follower count, betweenness centrality, a combination of follower count and betweenness centrality, and the number of tweets. Then, we calculate the sentiment correlation for different groups to examine the influential factors for retweeting a message with a certain sentiment. We find that users with higher betweenness centrality and higher tweet counts tend to exhibit a higher sentiment correlation. Users with a medium-level followers_count show the highest sentiment correlation compared to those with low-level and high-level followers_count. After combining the two factors of followers_count and betweenness centrality, we discover that, specifically at low-level betweenness centrality, users with a medium-level followers_count have the highest sentiment correlation. Our last observation is that the correlation coefficients differ between different types of users. Our study on sentiment correlation provides instructional information for modeling information propagation in human society. © 2020, Springer-Verlag GmbH Austria, part of Springer Nature.
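The group-then-correlate step can be sketched as follows: bucket users by a criterion such as followers_count, then compute a per-group Pearson correlation between tweet sentiment and retweet sentiment. The bucket boundaries and the toy records are illustrative assumptions.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlation_by_group(records, group_key):
    """records: (followers_count, tweet_sentiment, retweet_sentiment) tuples;
    group_key buckets each record, e.g. into low/medium/high follower counts."""
    groups = {}
    for rec in records:
        groups.setdefault(group_key(rec), []).append(rec)
    return {g: pearson([r[1] for r in recs], [r[2] for r in recs])
            for g, recs in groups.items()}

def bucket(rec):
    f = rec[0]  # followers_count; boundaries below are arbitrary examples
    return "low" if f < 100 else ("medium" if f < 10000 else "high")

records = [  # toy data: low group anti-correlated, medium group correlated
    (10, 0.1, 0.9), (20, 0.9, 0.1), (30, 0.5, 0.5),
    (500, 0.1, 0.1), (600, 0.5, 0.5), (700, 0.9, 0.9),
]
corr = correlation_by_group(records, bucket)
```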
-
Alzheimer's Disease (AD) is a neurodegenerative disease that causes complications with thinking capability, memory, and behavior. AD is a major public health problem among the elderly in developed and developing countries. With the growth of AD around the world, there is a need to further expand our understanding of the roles different clinical measurements can have in the diagnosis of AD. In this work, we propose a machine learning-based technique to distinguish control subjects with no cognitive impairment, AD subjects, and subjects with mild cognitive impairment (MCI), often seen as a precursor of AD. We utilized several machine learning (ML) techniques and found that gradient boosting decision trees achieved the highest performance, with above 84% classification accuracy. We also determined the importance of the features (clinical biomarkers) contributing to the proposed multi-class classification system. Further investigation of these biomarkers will pave the way to better treatment plans for AD patients. © 2020 The authors and IOS Press.
-
The aptitudes and abilities required for the position of programmer, within the computer industry, have yet to be fully studied and their inter-relationships known. Although the industry is relatively new, a substantial amount of research in the areas of personnel selection, evaluation and job requirements has been undertaken. Yet these studies have confined themselves primarily to the use of interest scales, aptitude and achievement tests as overall predictors for on-the-job success rather than in the study of the cognitive factors pertinent to the tasks of which programming is composed. In a study by Deutsch and Shea, Inc. (1963), the relationship between the programmer and the computer is seen as analogous to that of the mahout and his elephant. As with the mahout, the programmer uses his intelligence, skills and abilities in the control and guidance of a powerful and flexible, yet non-intelligent, tool in the performance of specific finite operations which contribute to the completion of more complex tasks. It is the programmer who, when presented with a problem from science, engineering or business, must work out a solution. John and Miller (1957) state that all problems have two general parts: the specific components involved (i.e., data, etc.) and the relationships which are the orderings of or changes to the specific components.
-
The paper presents methods of space allocation applicable to architectural design. These techniques have been developed over the past twenty years and are presented in this paper in such a way that they may also be applied to other disciplines. Four categories are presented that identify the variations in the dimensioning of the elements (either unit dimension or variable dimension) and the variation in the shape of the boundary (either a simple rectangle or a multi-faceted boundary).
-
The use of gradient operators for image enhancement has been widely reported in the literature, but they have not been used routinely in the medical arena, particularly in the most common radiographic plain-film procedure, chest radiographs. Gradient operators such as the Sobel and Roberts operators not only enhance image edges but also tend to enhance noise. Overall, the Sobel operator was found to be superior to the Roberts operator in edge enhancement. A theoretical explanation for the superior performance of the Sobel operator was developed based on the concept of analyzing the x and y Sobel masks as linear filters. By applying pill-box, Gaussian, or median filtering prior to applying a gradient operator, noise was reduced; the pill-box and Gaussian filters were much more computationally efficient than the median filter, with approximately equal effectiveness in noise reduction. © 1988 IEEE
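The Sobel masks mentioned above are small linear filters. A minimal sketch of applying them and combining the x and y responses into an edge magnitude (using the common |gx| + |gy| approximation; the step-edge test image is an illustration):

```python
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # responds to vertical edges
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # responds to horizontal edges
# (For comparison, the 2x2 Roberts masks are [[1,0],[0,-1]] and [[0,1],[-1,0]].)

def convolve3(img, mask):
    """Valid-region 3x3 correlation on a 2-D list-of-lists image."""
    h, w = len(img), len(img[0])
    return [[sum(mask[u][v] * img[i + u][j + v]
                 for u in range(3) for v in range(3))
             for j in range(w - 2)] for i in range(h - 2)]

def sobel_magnitude(img):
    """Combine x and y responses with the |gx| + |gy| approximation."""
    gx, gy = convolve3(img, SOBEL_X), convolve3(img, SOBEL_Y)
    return [[abs(a) + abs(b) for a, b in zip(rx, ry)] for rx, ry in zip(gx, gy)]

# A vertical step edge: flat regions give 0, the edge gives a strong response.
img = [[0, 0, 0, 9, 9, 9] for _ in range(4)]
mag = sobel_magnitude(img)
```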
-
Temporal analysis has been applied to a sequence of cloud top pressure (CTP) images and cloud optical thickness (TAU) images stored in the International Satellite Cloud Climatology Project (ISCCP) D1 database located at the NASA Goddard Institute for Space Studies (GISS). Each pixel in the D1 data set has a resolution of 2.5 degrees, or 280 kilometers. These images were collected at consecutive three-hour intervals for the entire month of April 1989. The primary objective of this project was to develop a sequence of storm tracks from the satellite images to follow the formation, progression, and dissipation of storm systems over time. Composite images were created by projecting ahead in time and substituting the first available valid pixel for missing data, and a variety of CTP and TAU cut-off values were used to identify regions of interest. Region correspondences were determined from one time frame to another, yielding the coordinates of storm centers. These tracks were compared to storm tracks computed from sea level pressure data obtained from the National Meteorological Center (NMC) for the same time period. The locations of sea level storm centers provide insight as to whether storms have occurred anywhere in a region and can be helpful in determining the presence or absence of storms in a general geographic region.
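The composite-image step, substituting the first available valid pixel from subsequent frames for missing data, can be sketched as follows (the frame layout and the use of None as the missing-data marker are assumptions):

```python
MISSING = None  # stand-in marker for an undefined pixel

def composite(frames):
    """Build a composite frame: for each pixel take the current value, or
    the first valid value found when projecting ahead through later frames."""
    h, w = len(frames[0]), len(frames[0][0])
    out = [[MISSING] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            for frame in frames:          # frames in time order
                if frame[i][j] is not MISSING:
                    out[i][j] = frame[i][j]
                    break                 # first valid pixel wins
    return out

# Two 1x2 toy frames: the missing pixel is filled from the later frame.
frames = [[[None, 5]], [[3, 7]]]
merged = composite(frames)
```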
-
The primary objective of this project is to define a methodology to depict the motion of deep convective cloud systems as observed from satellite imagery. These clouds are defined as clusters of pixels with cloud top pressure (CTP) <= 440 millibars and cloud optical thickness (TAU) >= 23, which are high in the atmosphere and sufficiently thick to produce significant rainfall. Clouds are one of the major factors in understanding the earth's climate. Evaluating cloud motion is important in understanding atmospheric dynamics, and visualizations are vital because they provide a good way to observe change. CTP and TAU values have been collected for April of 1989 from the International Satellite Cloud Climatology Project low-resolution database for the northern latitudes between 30 and 60 degrees. Each of the 240 CTP and 240 TAU images consisted of 12 rows and 144 columns, with each pixel representing a 280 km square on the globe, collected at three-hour intervals. Individual images were color-coded according to land, sea, and clouds before being put into motion. Six animations have been produced which start with the original images, progress to include daily composite images, and culminate with a collage. Animations of the original images have the advantage of relatively short intervals between still frames but have many undefined pixels, which are eliminated in the composites. The results of this project can serve as an example of how to improve the visualization of time-varying image sequences.
-
The objective of this study is to compare statistical and unsupervised neural network techniques for determination of correspondences between storm system regions extracted from sequences of satellite images. Analysis was applied to the International Satellite Cloud Climatology Project (ISCCP) low resolution D1 database for selected storm systems during the period April 5-9, 1989. Cloud top pressure was used to delineate regions of interest, and cloud optical thickness combined with spatial location was used to track regions throughout a given time sequence. The ability of the k-nearest neighbor classifier and of self-organizing maps to determine correspondences between storm regions was assessed. The two techniques generally yielded similar associations between regions of interest throughout the time sequence. Differences in final tracking results between the two techniques occurred primarily as a result of differences in the collections of points from a region in a time step t₂ that corresponded to a region in an earlier time step t₁. The tracking results were also compared to the results obtained at the NASA Goddard Institute for Space Studies using sea level pressure data from the National Meteorological Center (NMC). For the storm systems investigated in this study, the storm tracks exhibited the same general tracking behavior with expected variations between cloud system storm centers and low sea level pressure centers.
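The k-nearest-neighbor correspondence idea, matching each region in one frame to its nearest region(s) in the earlier frame by optical thickness and spatial location, can be sketched as follows (the feature layout and region names are illustrative assumptions):

```python
def knn_correspond(regions_t1, regions_t2, k=1):
    """Match each region in frame t2 to its k nearest region(s) in frame t1
    using squared distance over (optical thickness, row, col) features."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    matches = {}
    for name2, feat2 in regions_t2.items():
        # Rank all earlier-frame regions by distance to this region.
        ranked = sorted(regions_t1, key=lambda n1: dist2(regions_t1[n1], feat2))
        matches[name2] = ranked[:k]
    return matches

# Toy features: (mean optical thickness, row, col) per storm region.
frame1 = {"A": (10.0, 0.0, 0.0), "B": (30.0, 5.0, 5.0)}
frame2 = {"A2": (11.0, 1.0, 0.0), "B2": (29.0, 5.0, 6.0)}
matches = knn_correspond(frame1, frame2)
```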
-
A view of interactions in the undergraduate classroom is presented from several perspectives. Topics discussed include class perceptions of teacher as facilitator/authority/leader, grades versus performance appraisals, mixed-gender interactions, and subtle forms of cultural variations.
-
Accurate identification and tracking of synoptic-scale storm systems in the northern midlatitudes is important for understanding the structure and movement of the midlatitude cloud field which plays a major role in climate change. In this paper, a hybrid neural network/genetic algorithm (NN/GA) approach is presented that analyzes the behavior of storm systems from one time frame to the next. The goal of the hybrid neural network algorithm is to improve classifier output by reducing the number of infeasible solutions using constraint optimization techniques. The input to the hybrid neural network algorithm is the output from a traditional backpropagation neural network. The hybrid NN/GA analyzes the backpropagation neural network output for logical consistencies and makes changes to the classification results based on strength of neural network classifications and satisfaction of logical constraints. The results are compared with classification results obtained using linear discriminant analysis, k-nearest neighbor rule, and backpropagation neural network techniques.
-
An evolutionary system was developed for generation of complete tracks of northern midlatitude synoptic-scale storm systems based on optical flow and cloud motion analyses of global satellite-based datasets produced by the International Satellite Cloud Climatology Project (ISCCP). The tracking results were compared with low sea level pressure anomaly (SLPA) tracks obtained from the NASA Goddard Institute for Space Studies (GISS). The SLPA tracks were produced at GISS by analysis of meteorological, ground-based National Center for Environmental Prediction (NCEP) datasets. Results from the evolutionary system were also compared with results from using (a) the k-nearest neighbor rule (k-NN) and (b) self-organizing maps (SOM) to determine correspondences between consecutive locations within a track. The consistency of our evolutionary storm tracking results with the behavior of the low sea level pressure anomaly tracks, the ability of our evolutionary system to generate and evaluate complete tracks, and the close comparison between the results obtained by the evolutionary, k-NN, and SOM analyses of the ISCCP-derived datasets at tracking steps in which proximity or optical flow information sufficed to determine movement, demonstrate the applicability and the potential of evolutionary systems for tracking midlatitude storm systems through low-resolution ISCCP cloud product datasets.
-
This preliminary investigation considers undergraduate student perceptions with respect to their professional future. "No one warned me it would be like this," and "These are the things that college never taught me," are typical comments that are heard from the young workforce. This paper addresses future plans and predictions of students from two New England institutions of higher learning by utilizing a variety of strategies. Methods to elicit data include in-class activities and carefully designed questionnaires. These exercises have been designed to uncover images and themes concerning transition from college to the workplace. Issues include technical and communication skills, leadership roles, corporate politics, group dynamics, and gender diversity in the workplace.
-
The objective of this research is to automate the classification of the temporal behavior of storm cloud systems based on measurements derived from consecutive satellite images. The motivation behind this study is to develop improved descriptions of cloud dynamics which can be used in general circulation models for prediction of global climate change. Analysis was applied to the International Satellite Cloud Climatology Project (ISCCP) low resolution cloud top pressure database for the first six days in April 1989. A total of 296 midlatitude storm cloud components were tracked between consecutive 3-hour time frames. For each pair of components, temporal correspondence events were classified as (1) direct, (2) merge, (3) split, or (4) reject. The reject class, which was used primarily to categorize pairs of unrelated systems, also included storm cloud system dissipation and creation. Statistical, neural network, and evolutionary techniques were developed for finding solutions to the storm cloud correspondence problem. The evolutionary techniques applied to the problem consisted of (1) a constraint-handling hybrid evolutionary technique and (2) a genetic local search algorithm. The results demonstrate the potential of evolutionary techniques to yield meteorologically feasible solutions, given appropriate constraints, to the two-frame storm tracking problem. © 1998 SPIE. All rights reserved.
Explore
Department
- Computer Science
- Chemistry (1)
- History (1)
- Mathematics (1)
- Physics (6)
- Psychology (2)
- Public Health (1)
Resource type
- Book (12)
- Book Section (11)
- Conference Paper (123)
- Journal Article (132)
- Report (13)
Publication year
- Between 1900 and 1999 (53)
- Between 2000 and 2026 (238)
- Between 2000 and 2009 (35)
- Between 2010 and 2019 (87)
- Between 2020 and 2026 (116)