Your search
Results: 291 resources
-
Pattern recognition techniques for cloud type and cloud amount classification were applied to digital infrared SMS-1 data. The cloud classification results were used in a numerical radiation model to determine solar radiation during Phase III of the GARP Atlantic Tropical Experiment. In order to assess the effects on radiation computations of cloud information derived from both satellite and ship data, cloud analyses based on both data sources were prepared for input into the numerical radiation model. -from Authors
-
The backpropagation method is modified by replacing the sigmoid activation function with a sinusoidal function. The learning law is also modified. The modified procedure shows great improvement over the original BP in terms of the number of neurons and the learning time. © 1992 IEEE.
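The abstract gives no equations, so as a rough illustrative sketch (not the authors' procedure), here is a minimal backprop network in which the usual sigmoid is swapped for sin(·), trained with plain SGD on XOR; network size, learning rate, and training task are all assumptions:

```python
import math
import random

random.seed(0)

def sin_act(z):
    """Sinusoidal activation standing in for the usual sigmoid."""
    return math.sin(z)

def sin_act_deriv(z):
    return math.cos(z)

# Tiny 2-2-1 network, no biases, plain SGD.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(2)]
W2 = [random.uniform(-0.5, 0.5) for _ in range(2)]

def forward(x):
    z = [sum(W1[j][i] * x[i] for i in range(2)) for j in range(2)]
    h = [sin_act(zj) for zj in z]
    y = sum(W2[j] * h[j] for j in range(2))
    return z, h, y

def train_step(x, t, lr=0.05):
    z, h, y = forward(x)
    err = y - t                      # dLoss/dy for loss = 0.5*(y-t)^2
    for j in range(2):
        grad_h = err * W2[j]         # backprop through the output weight
        for i in range(2):
            W1[j][i] -= lr * grad_h * sin_act_deriv(z[j]) * x[i]
        W2[j] -= lr * err * h[j]
    return 0.5 * err * err

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
losses = [sum(train_step(x, t) for x, t in data) for _ in range(2000)]
```

One hint at why fewer neurons may suffice: a single sine unit with weights (π/2, π/2) and output weight 1 solves XOR exactly, which no single sigmoid unit can.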
-
New systolic architectures are proposed for computing the Fourier transform, based on generating the coefficients of the transform during the computation. These architectures require fewer input/output pins on the chip. The new architectures are also extremely modular and cascadable, and thus amenable to efficient VLSI implementation. The VLSI complexity of the architectures is compared with that of existing parallel architectures. © 1992 IEEE.
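The coefficient-generation idea can be illustrated in scalar software: instead of reading each twiddle factor from a stored table, the next coefficient is produced from the previous one by a single complex multiply. The paper's contribution is the systolic hardware, not this code; this is only a functional sketch of the recurrence:

```python
import cmath

def dft_streaming_coeffs(x):
    """DFT in which each twiddle factor is generated on the fly by one
    complex multiply (coeff *= step) rather than read from a
    precomputed table -- the coefficient-generation idea in software."""
    N = len(x)
    w = cmath.exp(-2j * cmath.pi / N)   # primitive N-th root of unity
    X = []
    for k in range(N):
        coeff = 1 + 0j                  # w**(k*0)
        step = w ** k                   # generated once per output row
        acc = 0j
        for n in range(N):
            acc += x[n] * coeff
            coeff *= step               # next coefficient: one multiply
        X.append(acc)
    return X
```

In hardware the same recurrence means a coefficient register and one complex multiplier per cell replace a ROM and its address pins, which is where the pin savings come from.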
-
The increasing prevalence of multiprocessor and distributed systems in modern society is making it imperative to introduce the underlying principles of parallel/distributed computing to students at the undergraduate level. In order to meet the needs of our students for training in this critical area, the Computer Science Department at Southern Connecticut State University (SCSU) is currently in the process of implementing a curricular and laboratory development project that integrates key concepts and practical experiences in parallel computing throughout the undergraduate curriculum. The goal of this project is to build a strong foundation in parallel computing which would optionally culminate in advanced, senior-level specialized courses in parallel computing and/or senior research projects. This paper describes the laboratory facility we developed to support instruction in parallel and distributed computing and the parallel computing modules which were incorporated into three of our core undergraduate courses: data structures, operating systems, and programming languages. The laboratory facility enables us to provide our students with "hands-on" experiences in shared memory, distributed memory, and network parallelism. The modules and laboratory exercises give students the opportunity to experiment with a wide array of software and hardware environments and to gain a systematic exposure to the principles and techniques of parallel programming.
-
For the TREC 2007 conference, the CRM114 team considered three non-Bayesian methods of spam filtration in the CRM114 framework - an SVM based on the "hyperspace" feature = document paradigm, a bit-entropy matcher, and substring compression based on LZ77. As a calibration yardstick, we used the well-tested and widely used CRM114 OSB Markov random field system (basically unchanged since 2003). The results show that the SVM's spam-filtering accuracy is about a factor of two to three better than the OSB system's, that substring compression is somewhat more accurate than OSB, and that bit entropy is somewhat less accurate on the TREC 2007 test sets.
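The substring-compression approach can be sketched with zlib's DEFLATE (which is LZ77-based) as a stand-in for CRM114's actual compressor: a message is assigned to the class whose training corpus it compresses best against, i.e. the class for which appending it costs the fewest extra bytes. The corpora and two-class setup here are illustrative, not the team's pipeline:

```python
import zlib

def compressed_size(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), 9))

def classify(message: str, spam_corpus: str, ham_corpus: str) -> str:
    """Compression-based classification: a message that shares many
    substrings with a corpus adds few extra bytes when appended to it,
    so the smaller marginal cost picks the class."""
    spam_cost = compressed_size(spam_corpus + message) - compressed_size(spam_corpus)
    ham_cost = compressed_size(ham_corpus + message) - compressed_size(ham_corpus)
    return "spam" if spam_cost < ham_cost else "ham"
```

This "compress-and-compare" trick needs no feature extraction at all, which is what makes it an interesting non-Bayesian baseline.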
-
There have been a large number of projects based on the Distributed Object-Oriented (DOO) approach for solving complex problems in various scientific fields. The mismatch problem is one of the most important problems facing DOO systems: the initial design of a DOO application does not always give the best class distribution. In such a case, the DOO software may need to be restructured. In this paper, we propose a methodology for efficiently restructuring DOO software classes to be mapped onto a distributed system consisting of a set of nodes. The proposed methodology consists of two phases. The first phase introduces a recursive graph clustering technique to partition the OO system into subsystems with low coupling. The second phase is concerned with mapping the generated partitions to the set of available machines in the target distributed architecture. A simulation evaluation was carried out for a set of randomly generated DOO software designs, and the results were compared with those of the K-Partitioning algorithm in terms of the overall inter-class communication cost. © 2008 IEEE.
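The objective of the clustering phase can be made concrete with a toy version: classes are graph nodes, edge weights are inter-class communication costs, and the goal is a split with a small cut. The greedy two-way split below is a simplified stand-in for the paper's recursive graph clustering, and the seed choice (first two classes) is an assumption:

```python
def coupling(group_a, group_b, comm):
    """Total communication cost crossing the cut between two groups."""
    return sum(comm.get((a, b), 0) + comm.get((b, a), 0)
               for a in group_a for b in group_b)

def greedy_bipartition(classes, comm):
    """Seed two groups with the first two classes, then attach each
    remaining class to the group it communicates with most, so that
    heavy edges stay inside a group and the cut (coupling) stays low."""
    a, b = [classes[0]], [classes[1]]
    for c in classes[2:]:
        to_a = sum(comm.get((c, x), 0) + comm.get((x, c), 0) for x in a)
        to_b = sum(comm.get((c, x), 0) + comm.get((x, c), 0) for x in b)
        (a if to_a >= to_b else b).append(c)
    return a, b
```

Applying the split recursively to each group until the number of groups matches the number of nodes mirrors the overall shape of the two-phase methodology.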
-
In this paper we describe an information system that we have designed for students and researchers to conduct atmospheric studies using data that they have collected from multiple atmospheric instruments including two laser radar (lidar) systems. The lidar systems available for research include a monostatic Micro Pulse Lidar System and a bistatic imaging CLidar system. Complementary instruments for data analysis and ground truth specification include a nephelometer, sunphotometers and a weather station. Information structures within the system allow users to 1) label, describe and archive raw and derived datasets from multiple atmospheric instruments with associated metadata using NetCDF format, 2) link together coincident and co-located datasets from different instruments and 3) identify owner and verify user access rights of raw and derived datasets. Data analysis software tools have been developed in MATLAB to characterize and remove instrument artifacts based on experimental lidar studies, to analyze clear sky data to determine variability in atmospheric aerosol content over time and altitude, and to investigate cloud and aerosol patterns.
-
In real-time software systems, meeting deadlines is crucial. Software engineers face many challenges in modeling object-oriented software systems to handle complex real-time constraints. Accurate estimation of execution time is a key criterion for a precise scheduling decision. This paper presents an object-oriented performance model that analyzes the behavior of real-time objects' tasks whose executions are controlled by a scheduler. Each task is subject to a time/utility function (TUF) that determines the utility the task accrues according to its completion time. The scheduling scheme uses both the estimated time generated by the object-oriented performance model and the TUF of each task in the object-oriented system in order to maximize the total accrued utility. In addition, we implemented a software tool to conduct an experimental study showing the effectiveness of our approach. © 2011 IEEE.
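The TUF objective is easy to state in code: each task contributes utility as a function of its completion time, and the scheduler wants the order maximizing the total. The exhaustive search below is only a specification of the objective (fine for a handful of tasks), not the paper's scheduling algorithm, and the example tasks are invented:

```python
from itertools import permutations

def accrued_utility(order, tasks):
    """Total utility when tasks run back-to-back in the given order.
    tasks maps name -> (estimated execution time, TUF), where the TUF
    maps the task's completion time to the utility it accrues."""
    t, total = 0.0, 0.0
    for name in order:
        exec_time, tuf = tasks[name]
        t += exec_time               # completion time of this task
        total += tuf(t)
    return total

def best_order(tasks):
    """Brute-force reference: try every ordering."""
    return max(permutations(tasks), key=lambda o: accrued_utility(o, tasks))
```

A classic hard deadline is just a step-shaped TUF (full utility before the deadline, zero after), so deadline scheduling falls out as a special case of utility accrual.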
-
This book provides a hands-on approach to learning ARM assembly language with the use of a TI microcontroller. The book starts with an introduction to computer architecture and then discusses number systems and digital logic. The text covers ARM Assembly Language, ARM Cortex Architecture and its components, and hardware experiments using the TI LM3S1968. Written for those interested in learning embedded programming using an ARM microcontroller. © Springer International Publishing Switzerland 2015.
-
Exfiltrating sensitive information from smartphones has become one of the most significant security threats. We have built a system to identify HTTP-based information exfiltration by malicious Android applications. In this paper, we discuss a method for tracking the propagation of sensitive information in Android applications using static taint analysis. We have studied the leaked information, the destinations to which it is exfiltrated, and their correlations with types of sensitive information. Analysis of 578 malicious Android applications reveals that a significant portion of these applications are interested in identity-related sensitive information, and that the vast majority leak multiple types of sensitive information. We have also identified that servers associated with three country codes (CN, US, and SG) are the most active in collecting sensitive information. The results further demonstrate that a wide range of non-default ports are used by suspicious URLs. © 2018 IEEE.
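The flavor of static taint analysis can be shown in a few lines: mark source values as tainted, propagate taint through assignments, and flag sinks that receive tainted data. Real Android taint analysis works on Dalvik bytecode with far richer modeling (aliasing, inter-procedural flows, implicit flows); the statement format and the source/sink names below are hypothetical stand-ins:

```python
def propagate_taint(statements, sources, sinks):
    """Forward taint propagation over simple `dst = f(inputs)` triples.
    A variable becomes tainted if any input is a taint source or is
    already tainted; a sink (modeled here as a pseudo-assignment whose
    dst is the sink name) that receives taint is reported as a leak."""
    tainted = set(sources)
    leaks = []
    for dst, inputs in statements:
        if dst in sinks:
            if any(i in tainted for i in inputs):
                leaks.append((dst, sorted(set(inputs) & tainted)))
        elif any(i in tainted for i in inputs):
            tainted.add(dst)
    return leaks
```

The illustrative trace below shows how a device identifier flowing through a string concatenation still reaches an HTTP sink, which is exactly the kind of multi-step flow static propagation catches and runtime string inspection can miss.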
-
We demonstrate that a nonzero strangeness contribution to the spacelike electromagnetic form factor of the nucleon is evidence for a strange-antistrange asymmetry in the nucleon's light-front wave function, thus implying different nonperturbative contributions to the strange and antistrange quark distribution functions. A recent lattice QCD calculation of the nucleon strange quark form factor predicts that the strange quark distribution is more centralized in coordinate space than the antistrange quark distribution, and thus the strange quark distribution is more spread out in light-front momentum space. We show that the lattice prediction implies that the difference between the strange and antistrange parton distribution functions, s(x)-s(x), is negative at small-x and positive at large-x. We also evaluate the strange quark form factor and s(x)-s(x) using a baryon-meson fluctuation model and a novel nonperturbative model based on light-front holographic QCD. This procedure leads to a Veneziano-like expression of the form factor, which depends exclusively on the twist of the hadron and the properties of the Regge trajectory of the vector meson which couples to the quark current in the hadron. The holographic structure of the model allows us to introduce unambiguously quark masses in the form factors and quark distributions preserving the hard scattering counting rule at large-Q2 and the inclusive counting rule at large-x. Quark masses modify the Regge intercept which governs the small-x behavior of quark distributions, therefore modifying their small-x singular behavior. Both nonperturbative approaches provide descriptions of the strange-antistrange asymmetry and intrinsic strangeness in the nucleon consistent with the lattice QCD result. © 2018 authors. Published by the American Physical Society.
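The sign pattern quoted in the abstract (negative at small x, positive at large x) is consistent with a standard constraint: the nucleon carries no net strangeness, so the first moment of the asymmetry must vanish,

```latex
\int_0^1 \left[ s(x) - \bar{s}(x) \right] dx = 0 ,
```

meaning any negative region of s(x) − s̄(x) at small x must be exactly compensated by a positive region at larger x (or vice versa); the lattice input fixes which sign goes where.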
-
The internet has changed the way that many people access written works. Books and articles of various lengths and in several formats can be bought and accessed online, both legally and illegally. Texts in even shorter form originate through forums, SMS, blogs, emails, and social media. Automating the process of determining the authorship of posted texts would help combat online piracy of copyrighted text and plagiarism. In addition, authorship identification could help detect fraudulent email messages from dangerous sources and combat cyberattacks by identifying authentic sources. We experiment with several machine learning algorithms on a limited set of public domain literature to identify the most efficient method of authorship identification using the least amount of samples. Different-sized data sets are created by 5 predefined rounds of random sampling of 1500-word blocks from a total of 28 books by a corpus of 7 authors. Traditional methods of authorship identification, such as Naive Bayes, Artificial Neural Network, and Support Vector Machine, are implemented in addition to a modern deep learning neural network for classification. Thirteen stylometric features are extracted, ranging from character-based and word-based to syntactic features. Our model consistently showed that Support Vector Machine outperforms the other classification methods. © 2020
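Stylometric feature extraction of the character- and word-based kind can be sketched in pure Python. The four features below are an illustrative subset chosen for this sketch; the abstract does not enumerate the paper's actual thirteen, and the crude sentence splitter is an assumption:

```python
def stylometric_features(text):
    """A few character- and word-based stylometric features of the
    kind used in authorship studies (illustrative subset)."""
    words = text.split()
    # crude sentence split: treat !, ? as periods, drop empty pieces
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    return {
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "avg_sentence_len": len(words) / len(sentences),   # words/sentence
        "type_token_ratio": len({w.lower() for w in words}) / len(words),
        "punct_density": sum(text.count(c) for c in ",;:") / len(text),
    }
```

Each 1500-word block would be mapped to such a fixed-length vector, and the vectors (with author labels) fed to the Naive Bayes, ANN, or SVM classifier.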
-
Biologists and bioinformaticians heavily rely on data portals and repositories accessible through web applications. While they mostly agree that the data is valuable, they find the interfaces hard to use and non-intuitive. In this paper we present a user-centered design of a database for the classification and annotation of major and minor introns in various species. Our design is based on surveying and interviewing minor-intron researchers and comparing the features of existing databases. In addition to its ease of use, the proposed database, the Major and Minor Intron Annotation Database (MMIAD), offers high flexibility in querying and downloading the subsets of information that interest the user, in multiple commonly used file formats. © Proceedings of the 14th IADIS International Conference Interfaces and Human Computer Interaction 2020, IHCI 2020 and Proceedings of the 13th IADIS International Conference Game and Entertainment Technologies 2020, GET 2020 - Part of the 14th Multi Conference on Computer Science and Information Systems, MCCSIS 2020. All rights reserved.
-
Adult content on the Internet may be accessed by children with only a few keystrokes. While separate child-safe accounts may be established, a better approach could be incorporating automatic age estimation capability into the browser. We envision a safer browsing experience by implementing child-safe browsers combined with Internet content rating similar to that of the film industry. Before such a browser can be created, it is necessary to test the age estimation module to see whether acceptable error rates are achievable. We created an Android application for collecting biometric touch data, specifically tapping data. We arranged with an elementary school, a middle school, a high school, and a university, and collected samples from 262 user sessions (ages 5 to 61). From the tapping data, feature vectors were constructed and used to train and test 14 regressors and classifiers. Results for regression show best mean absolute errors of 3.451 and 3.027 years for phones and tablets, respectively. Results for classification show best accuracies of 73.63% and 82.28% for phones and tablets, respectively. These results demonstrate that age estimation, and hence a child-safe browser, is feasible and a worthwhile objective. © 2020 IEEE.
-
Traditional keyboards remain the input device of choice for typing-heavy environments. When such environments handle sensitive data, security is a major concern, and keystroke dynamics can be a preferred choice for continuously authenticating users. An integral part of user enrollment in a keystroke-based continuous authentication system is the writing instruction (prompt) given to the users as a basis for their improvised writing. Many prompts are possible, and they directly impact the performance of authentication systems; hence, prompts should be designed carefully and with purpose. In this paper, we bridge the gap between cognitive psychology and computer science and attempt to influence the mental state of the users to obtain better authentication performance. We compare two kinds of writing prompts, creative and factual, for generating reference samples. In addition, we perform two robustness tests: robustness to dissimilar writing style (e.g., creative reference and factual test) and robustness to surface (e.g., hard-surface reference and soft-surface test). We collect data from thirty participants in four weekly sessions and experiment with three features: key interval, key press, and key hold latencies. We use the Relative (R) measure to generate the match score between the reference and test samples. Results show that creative writing consistently performs better than factual writing. Both writing prompts perform well with a dissimilar style in testing, i.e., continuous authentication is found to be robust to writing style. We also find that the surface (hard or soft) used in testing need not match that used for the reference; thus, continuous authentication is also surface robust. © 2020 IEEE.
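The abstract does not define the Relative (R) measure; the sketch below assumes the Gunetti-Picardi formulation commonly used in keystroke work: order the digraphs shared by the reference and test samples by their durations in each sample, and measure the disorder between the two orderings (0 = same typing rhythm, 1 = maximally different):

```python
def r_measure(ref_times, test_times):
    """Relative distance between two typing samples, in the spirit of
    Gunetti & Picardi: compare the *ordering* of shared digraph
    durations, not their absolute values, so it is robust to overall
    typing-speed changes."""
    shared = sorted(set(ref_times) & set(test_times))
    if len(shared) < 2:
        return 1.0                      # too little overlap to compare
    ref_order = sorted(shared, key=lambda g: ref_times[g])
    test_order = sorted(shared, key=lambda g: test_times[g])
    pos = {g: i for i, g in enumerate(test_order)}
    disorder = sum(abs(i - pos[g]) for i, g in enumerate(ref_order))
    n = len(shared)
    max_disorder = (n * n) // 2 if n % 2 == 0 else (n * n - 1) // 2
    return disorder / max_disorder
```

Because only the ranking of durations matters, a user who types everything 20% slower on a soft surface still produces a near-zero distance, which is one plausible reason surface robustness is observed.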
-
The cyber-behavioral biometric modalities such as keystroke dynamics, mouse dynamics, and touch screen dynamics have come under attacks of different forms in recent years. To address these attacks and other security issues, we present a novel concept of using smartwatch sensor data to continuously verify users in cyberspace and show its potential as a new standalone cyber-behavioral biometric modality. For our experiments, smartwatch gyroscope and accelerometer data collected from 49 subjects while typing on a desktop computer were considered. We implemented six pattern-matching classifiers to compare each verification attempt against the user profile. Experimental results comprising 282,240 classification attempts show significantly high True Positive (TP) rates and extremely low False Positive (FP) rates, with the highest achieved TP rate of 87.2% and the lowest FP rate of 0.2%. Given this level of accuracy, and the natural resilience to attacks that comes with a physical biometric property such as hand movement, we opine that smartwatch movement dynamics, besides being a new biometric trait, can be a solution to the security loopholes in existing cyber-behavioral biometric modalities for continuous verification. © 2020 IEEE.
-
Promoter regions of long non-coding RNA (lncRNA) genes are crucial for understanding their transcriptional regulatory patterns. LncRNA genes, being more cryptic than protein-coding genes in terms of their functionality and biogenesis divergence, have far fewer existing studies elucidating the roles of their promoters than their counterparts. Based on the overlap between epigenetic marks and transcription start sites, human lncRNAs have been categorized into two broad categories, enhancer-originated lncRNAs (e-lncRNAs) and promoter-originated lncRNAs (p-lncRNAs), and hence these two groups are subject to distinct transcriptional regulatory programs. To understand the difference in the transcriptional regulatory mechanisms that govern p- and e-lncRNAs, we studied the promoter sequences of these two groups of lncRNAs, including the distinct transcription factor (TF) proteins that favor p- over e-lncRNAs (and vice versa). In addition, we developed a convolutional neural network (CNN) based deep learning (DL) framework, DeePEL (deep p-, e-lncRNA promoter recognizer), to classify the promoters of p- and e-lncRNAs. To the best of our knowledge, this is the first attempt to classify these two groups of lncRNA promoters, using sequence and TF information, based on a DL framework. We report several sequence-specific signatures in the promoter regions as well as several distinct TFs specific to each group of lncRNAs that will help in understanding the promoter-proximal transcriptional regulation of p-lncRNAs and e-lncRNAs. © 2019 IEEE.
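The abstract does not specify DeePEL's input encoding, but the standard way to feed a promoter sequence to a CNN is one-hot encoding, with each base becoming a 4-channel column. A minimal sketch of that assumed preprocessing step:

```python
def one_hot_dna(seq):
    """One-hot encode a DNA sequence (A, C, G, T -> 4 channels), the
    usual input representation for a sequence CNN. Ambiguous bases
    such as N map to an all-zero column."""
    mapping = {"A": 0, "C": 1, "G": 2, "T": 3}
    encoded = []
    for base in seq.upper():
        column = [0.0, 0.0, 0.0, 0.0]
        if base in mapping:
            column[mapping[base]] = 1.0
        encoded.append(column)
    return encoded            # length-L list of 4-element columns
```

On this L x 4 matrix, the first convolution layer's filters act as learned position weight matrices, which is why CNNs are a natural fit for discovering the sequence-specific promoter signatures the paper reports.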
-
Deep learning is a promising approach for fine-grained disease severity classification in smart agriculture, as it avoids labor-intensive feature engineering and segmentation-based thresholding. In this work, we first propose a Densely Connected Convolutional Networks (DenseNet) based transfer learning method to detect plant diseases, which is expected to run on edge servers with augmented computing resources. Then, we propose a lightweight Deep Neural Network (DNN) approach that can run on Internet of Things (IoT) devices with constrained resources. To reduce the size and computation cost of the model, we further simplify the DNN model and reduce the size of the input images. The proposed models are trained with different image sizes to find the appropriate input size. Experimental results based on a real-world dataset demonstrate that the proposed models can accurately detect plant disease using low computational resources. © 2019 IEEE.
-
This paper presents highly robust, novel approaches to solving the forward and inverse problems of an Electrical Capacitance Tomography (ECT) system for imaging conductive materials. ECT is one of the standard tomography techniques for industrial imaging: it is nonintrusive and rapid and carries a low cost burden. However, the ECT system still suffers from a soft-field problem which adversely affects the quality of the reconstructed images; although many image reconstruction algorithms have been developed, the generated images remain inaccurate and poor. In this work, the Capacitance Artificial Neural Network (CANN) system is presented as a solver for the forward problem, to calculate the estimated capacitance measurements. Moreover, the Metal Filled Fuzzy System (MFFS) is proposed as a solver for the inverse problem, to construct the metal images. To assess the proposed approaches, we conducted extensive experiments on imaging metal distributions in the lost foam casting (LFC) process to highlight the reliability and efficiency of the system. The experimental results showed that the system is effective and superior. © 2019 Wael Deabes et al.