Your search
Results: 5,127 resources
-
This book draws on biological, cognitive, educational, sociological, and interactive perspectives to discuss the nature of learning disabilities, their origins, their diagnosis, and effective remediation. It emphasizes the development of ideas as the motor force behind economic policies. © 1999 Taylor & Francis.
-
We demonstrate that a nonzero strangeness contribution to the spacelike electromagnetic form factor of the nucleon is evidence for a strange-antistrange asymmetry in the nucleon's light-front wave function, thus implying different nonperturbative contributions to the strange and antistrange quark distribution functions. A recent lattice QCD calculation of the nucleon strange quark form factor predicts that the strange quark distribution is more centralized in coordinate space than the antistrange quark distribution, and thus more spread out in light-front momentum space. We show that the lattice prediction implies that the difference between the strange and antistrange parton distribution functions, s(x) − s̄(x), is negative at small x and positive at large x. We also evaluate the strange quark form factor and s(x) − s̄(x) using a baryon-meson fluctuation model and a novel nonperturbative model based on light-front holographic QCD. This procedure leads to a Veneziano-like expression for the form factor, which depends exclusively on the twist of the hadron and the properties of the Regge trajectory of the vector meson that couples to the quark current in the hadron. The holographic structure of the model allows us to unambiguously introduce quark masses in the form factors and quark distributions while preserving the hard-scattering counting rule at large Q² and the inclusive counting rule at large x. Quark masses modify the Regge intercept that governs the small-x behavior of the quark distributions, thereby modifying their small-x singular behavior. Both nonperturbative approaches provide descriptions of the strange-antistrange asymmetry and intrinsic strangeness in the nucleon consistent with the lattice QCD result. © 2018 authors. Published by the American Physical Society.
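Because the nucleon carries no net strangeness, the first moment of this asymmetry must vanish, which is why a difference that is negative at small x must turn positive at larger x:

```latex
\int_0^1 \left[ s(x) - \bar{s}(x) \right] \, dx = 0
```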
-
The excellent O-regioselectivity of the glycosidation of the ambident 2-O-substituted 5-fluorouracil (5-FU) via the silver salt method is computationally investigated at the MP2/6-311++G(2d,p):DZP//B3LYP/6-31+G(d):DZP level of theory. The reactions studied are those between 1-bromo-1-deoxy-2,3,4,6-tetra-O-acetyl-α-d-glucopyranose and the silver salts of 5-FU, 2-O-butyl-5-FU, and 2-O-benzyl-5-FU. Two pathways are considered as follows: (A) one where the silver and bromide ion do not interact, and (B) another where the silver and bromide ion interact in the transition states. Because the O-reaction barriers are much lower (by 13.3-22.2 kcal/mol) than N-reaction barriers in both pathways, the O-regioselectivity of the silver salt method can be satisfactorily explained by either path A or path B. Furthermore, path B, where Ag and Br interact consistently, has lower activation barriers than the corresponding path A (by 6.8-17.4 kcal/mol) in both N- and O-reactions. This computational result can be attributed to the following reasons: (1) the speeding-up effect in Koenigs-Knorr reactions due to the addition of silver carbonate into the reaction mixture; (2) the halogens being pulled away by silver ions from halides, as proposed by Kornblum and co-workers; and (3) the oxocarbenium ion involvement in the glycosidation reactions. The large energy difference between N- and O-transition states originates from the association between Ag and N-(O-) of the ambident unit (-N3-C4=O4) that shows significant covalent character so that the O-reaction transition states of the silver salt method benefit from favorable ionic interaction (C+···O-) and favorable covalent interaction (Ag···N). These two favorable interactions are in agreement with the hard and soft acids and bases principle; the former is a hard-hard interaction and the latter is a soft-soft interaction. © 2018 American Chemical Society.
-
Native fluorescence spectra play important roles in cancer detection. It is widely acknowledged that the emission spectrum of a tissue is a superposition of the spectra of its salient fluorophores; however, quantifying the components is essentially an ill-posed problem. To address this problem, the native fluorescence spectra of human prostate cancer cell lines of very low (LNCaP), moderate (DU-145), and advanced (PC-3) metastatic ability were studied at a selected excitation wavelength of 300 nm to investigate key fluorescent molecules such as tryptophan, collagen, and NADH. The native fluorescence spectra of the cancer cell lines at different risk levels were analyzed using various machine learning algorithms to detect features and develop criteria for separating the three types of cells. Principal component analysis (PCA), nonnegative matrix factorization (NMF), and partial least squares fitting were used separately to reduce dimensionality, extract features, and detect biomolecular alterations reflected in the spectra. The scores corresponding to the basis spectra were used for classification. A linear support vector machine (SVM) was used to classify the spectra of cells with different metastatic ability. For detecting signals from tryptophan and NADH in observed data corrupted by noise and interference, a sufficient statistic can be obtained from the basis spectra retrieved using nonnegative matrix factorization. This work shows that changes in the relative contents of tryptophan and NADH obtained from native fluorescence spectroscopy may provide criteria for detecting cancer cell lines of different metastatic ability. © 2018 SPIE.
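A minimal sketch of the component-quantification step described above: a measured spectrum is decomposed as a nonnegative combination of known basis spectra. The two basis spectra below are hypothetical stand-ins, not real tryptophan or NADH emission curves, and the closed-form two-component fit stands in for the paper's PCA/NMF/partial-least-squares machinery.

```python
# Recover relative fluorophore weights: y ≈ a*b1 + c*b2, solved via the
# 2x2 normal equations, with weights clipped to enforce nonnegativity.

def fit_two_components(y, b1, b2):
    """Least-squares weights (a, c) for y ≈ a*b1 + c*b2."""
    s11 = sum(u * u for u in b1)
    s22 = sum(v * v for v in b2)
    s12 = sum(u * v for u, v in zip(b1, b2))
    r1 = sum(u * w for u, w in zip(b1, y))
    r2 = sum(v * w for v, w in zip(b2, y))
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    c = (r2 * s11 - r1 * s12) / det
    return max(a, 0.0), max(c, 0.0)  # clip: contributions must be nonnegative

tryptophan = [0.1, 0.8, 1.0, 0.5, 0.2]  # hypothetical basis spectrum
nadh       = [0.0, 0.1, 0.4, 0.9, 0.6]  # hypothetical basis spectrum
measured = [0.7 * t + 0.3 * n for t, n in zip(tryptophan, nadh)]
a, c = fit_two_components(measured, tryptophan, nadh)
print(a, c)  # recovers the 0.7 / 0.3 mixture
```

The recovered weight ratio is the kind of "relative content" score the abstract proposes as a separation criterion.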
-
Postmodernism and the Politics of ‘Culture’ is a comparative critical analysis of the political and intellectual ambitions of postmodernist critical theory and the academic discipline of cultural studies. Katz’s polemical aim is to show that cultural studies comes up short in both areas, because its practitioners focus on too-narrow issues-primarily, celebrating the folkways of micro-communities-while denying the very possibility of studying, understanding, and changing society in any comprehensive way and to any universally beneficial purpose. He argues that scholars and activists alike would do well to make use of the analytical tools of postmodernist critical theory, whose practitioners acknowledge the political significance of the differences between social groups, but do not consider them to be unbridgeable, and so seek to develop a set of practices for creating a truly inclusive, truly democratic public sphere. © 2000 Taylor & Francis. All rights reserved.
-
The internet has changed the way that many people access written works. Books and articles of various lengths and formats can be bought and accessed online, both legally and illegally, and even shorter texts originate in forums, SMS, blogs, emails, and social media. Automating the determination of authorship of posted texts would help combat online piracy of copyrighted text and plagiarism. In addition, authorship identification could help detect fraudulent email messages from dangerous sources and combat cyberattacks by identifying authentic sources. We experiment with several machine learning algorithms on a limited set of public domain literature to identify the most efficient method of authorship identification using the fewest samples. Data sets of different sizes are created by 5 predefined rounds of random sampling of 1500-word blocks from a total of 28 books by 7 authors. Traditional methods of authorship identification, such as Naive Bayes, Artificial Neural Network, and Support Vector Machine, are implemented in addition to a modern Deep Learning Neural Network for classification. Thirteen stylometric features are extracted, spanning character-based, word-based, and syntactic features. Our model consistently showed that the Support Vector Machine outperforms the other classification methods. © 2020
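The attribution pipeline above can be sketched under heavy simplification: three toy stylometric features and a nearest-centroid rule stand in for the paper's thirteen features and SVM, and the author "profiles" are invented one-sentence stand-ins.

```python
# Attribute a text block to the author whose stylometric profile it is
# closest to (squared Euclidean distance in feature space).

def features(text):
    words = text.lower().split()
    return [
        sum(len(w) for w in words) / len(words),  # mean word length
        len(set(words)) / len(words),             # type-token ratio
        text.count(",") / len(words),             # commas per word
    ]

def nearest_author(sample, profiles):
    f = features(sample)
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(f, p))
    return min(profiles, key=lambda author: dist(profiles[author]))

profiles = {  # invented single-sentence stand-ins for per-author centroids
    "austen": features("she was, in fact, perfectly agreeable, and she smiled"),
    "hemingway": features("he sat down the sun was hot he drank the wine"),
}
print(nearest_author("he stood up the wind was cold he drank", profiles))
```

In the paper's setup the centroids would instead be learned from 1500-word training blocks, and the decision rule would be an SVM margin rather than a raw distance.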
-
The Quad-camera Wavefront-sensing Six-channel Speckle Interferometer (QWSSI) is a new speckle imaging instrument available on the 4.3-m Lowell Discovery Telescope (LDT). QWSSI is built to efficiently make use of collected photons and available detector area. The instrument images on a single Electron Multiplying CCD (EMCCD) at four wavelengths in the optical (577, 658, 808, and 880nm) with 40nm bandpasses. Longward of 1μm, two imaging wavelengths in the NIR are collected at 1150 and 1570nm on two InGaAs cameras with 50nm bandpasses. All remaining non-imaging visible light is then sent into a wavefront EMCCD. All cameras are operated synchronously via concurrent triggering from a timing module. With the simultaneous wavefront sensing, QWSSI characterizes atmospheric aberrations in the wavefront for each speckle frame. This results in additional data that can be utilized during post-processing, enabling advanced techniques such as Multi-Frame Blind Deconvolution. The design philosophy was optimized for an inexpensive, rapid build; virtually all parts were commercial-off-the-shelf (COTS), and custom parts were fabricated or 3D printed on-site. QWSSI's unique build and capabilities represent a new frontier in civilian high-resolution speckle imaging. © COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.
-
Biologists and bioinformaticians rely heavily on data portals and repositories accessible through web applications. While they mostly agree that the data are valuable, they find the interfaces hard to use and unintuitive. In this paper we present a user-centered design of a database for the classification and annotation of major and minor introns in various species. Our design is based on surveying and interviewing minor intron researchers and comparing the features of existing databases. In addition to its ease of use, the proposed database, the Major and Minor Intron Annotation Database (MMIAD), offers high flexibility in querying and downloading the subsets of information that interest the user, in multiple commonly used file formats. © Proceedings of the 14th IADIS International Conference Interfaces and Human Computer Interaction 2020, IHCI 2020 and Proceedings of the 13th IADIS International Conference Game and Entertainment Technologies 2020, GET 2020 - Part of the 14th Multi Conference on Computer Science and Information Systems, MCCSIS 2020. All rights reserved.
-
Drawing from analyses of teaching and learning, we posit a theoretical framework of two axes, practical-to-epistemic and explicit-to-implicit, which frame four quadrants of support needed to know how and why to use the crosscutting concepts in sensemaking. This work has implications for the design of learning environments that use the crosscutting concepts in scientific sensemaking. © ISLS.
-
This paper examines the impact of higher education on youth unemployment. Following the 2008 financial crisis, youth unemployment returned to the fore as a serious concern among policy makers in Europe. A crucial difference from previous recessions is that this time around supply of higher education opportunities was much higher than in the 1980s, and indeed higher education participation rates grew rapidly in many regions during this period. Drawing on previous work on youth unemployment and the economic impacts of education we identify a variety of channels through which higher education is likely to influence youth unemployment. We examine this issue using a macro-panel of European regions for the period 2002-2012. This decade was characterized by variation in economic activity and higher education rates. Our results suggest that expansion of higher education during this period had a mitigating effect on youth unemployment and not recognizing this external benefit of education risks underestimating the effects of macroeconomic shocks on young people. © 2020, University of Illinois Press. All rights reserved.
-
Adult content on the Internet may be accessed by children with only a few keystrokes. While separate child-safe accounts may be established, a better approach could be to incorporate automatic age estimation into the browser. We envision a safer browsing experience achieved by child-safe browsers combined with Internet content ratings similar to those of the film industry. Before such a browser can be created, it is necessary to test the age estimation module to see whether acceptable error rates are achievable. We created an Android application for collecting biometric touch data, specifically tapping data. We arranged data collection at an elementary school, a middle school, a high school, and a university, gathering samples from 262 user sessions (ages 5 to 61). From the tapping data, feature vectors were constructed and used to train and test 14 regressors and classifiers. Results for regression show best mean absolute errors of 3.451 and 3.027 years, respectively, for phones and tablets. Results for classification show best accuracies of 73.63% and 82.28%, respectively, for phones and tablets. These results demonstrate that age estimation, and hence a child-safe browser, is feasible and a worthwhile objective. © 2020 IEEE.
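A hedged sketch of the regression step, on invented data: a k-nearest-neighbour regressor (one plausible member of a pool of 14 regressors/classifiers, not necessarily one the paper used) maps a synthetic tap-feature vector to an age estimate.

```python
# Estimate a user's age as the mean age of the k enrolled sessions whose
# tap features are nearest to the query (squared Euclidean distance).

def knn_age(query, samples, k=3):
    ranked = sorted(samples,
                    key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], query)))
    return sum(age for _, age in ranked[:k]) / k

# (features, age): synthetic (tap_duration_ms, pressure, inter_tap_gap_ms)
train = [((120, 0.90, 400), 7), ((115, 0.80, 380), 9), ((90, 0.50, 200), 20),
         ((85, 0.40, 190), 25), ((80, 0.45, 210), 30), ((130, 0.85, 420), 6)]
est = knn_age((118, 0.85, 390), train)
print(est)  # lands near the child cluster
```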
-
Traditional keyboards remain the input device of choice for typing-heavy environments. When sensitive data is involved, security is a major concern, and keystroke dynamics is a natural choice for continuously authenticating users in these environments. An integral part of user enrollment in a keystroke-based continuous authentication system is the writing instruction (prompt) given to users as a basis for their improvised writing. Many prompts are possible, and they directly impact the performance of authentication systems; hence, prompts should be designed carefully and with purpose. In this paper, we bridge the gap between cognitive psychology and computer science and attempt to influence the mental state of users to achieve better authentication performance. We compare two kinds of writing prompts, creative and factual, for generating reference samples. In addition, we perform two robustness tests: robustness to dissimilar writing style (e.g., creative reference and factual test) and robustness to surface (e.g., hard-surface reference and soft-surface test). We collect data from thirty participants in four weekly sessions. We experiment with three features: key interval, key press, and key hold latencies. We use the Relative (R) measure to generate the match score between the reference and test samples. Results show that creative writing consistently performs better than factual writing. Both writing prompts perform well with a dissimilar style in testing, i.e., continuous authentication is found to be robust to writing style. We also find that the surface (hard or soft) used in testing need not match that used for the reference; thus, continuous authentication is also surface robust. © 2020 IEEE.
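The three latency features named above can be computed from a stream of key events as follows; the (key, press, release) timestamps are invented, and the Relative (R) matching step is out of scope here.

```python
# Extract key hold, key interval, and key press latencies from a list of
# (key, press_time_ms, release_time_ms) events.

def latencies(events):
    holds = [r - p for _, p, r in events]            # press -> release of same key
    intervals = [events[i + 1][1] - events[i][2]     # release -> next press
                 for i in range(len(events) - 1)]
    presses = [events[i + 1][1] - events[i][1]       # press -> next press
               for i in range(len(events) - 1)]
    return holds, intervals, presses

events = [("t", 0, 80), ("h", 150, 240), ("e", 310, 395)]  # invented timings
holds, intervals, presses = latencies(events)
print(holds, intervals, presses)
```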
-
The Southern Connecticut Stellar Interferometer (SCSI) is an intensity interferometer that is designed to use correlated photon arrival times to determine the geometry of stars. Originally a low-cost, two-telescope instrument that used a 1-pixel single-photon avalanche diode (SPAD) detector at the focal plane of each telescope to record photon events, it is now being upgraded to include a third telescope. This will allow for the simultaneous detection of the photon correlation at three baselines, and thus the ability to map out the two-dimensional geometry of the source much more efficiently than with the two-telescope arrangement. Recent papers in the literature suggest that it may be possible to derive phase information in the Fourier domain from such triple correlations for the brightest stars, potentially giving SCSI an imaging capability. Prior to investigating this possibility, steps must be taken to maximize the observing efficiency of the SCSI. We present here our latest efforts in achieving better pointing, tracking, and collimation with our telescopes, and we discuss our first modeling results of the three-telescope situation in order to understand under what conditions the upgraded SCSI could retrieve imaging information. © COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.
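A minimal sketch of the quantity an intensity interferometer estimates at one baseline: the normalized second-order correlation of photon counts from two telescopes, g2 = ⟨n1·n2⟩ / (⟨n1⟩⟨n2⟩). The count streams below are toy data, not SPAD measurements, and the three-baseline triple correlation is omitted.

```python
# Normalized photon-count correlation between two detectors; g2 > 1
# indicates correlated (bunched) arrivals.

def g2(n1, n2):
    mean = lambda xs: sum(xs) / len(xs)
    return mean([a * b for a, b in zip(n1, n2)]) / (mean(n1) * mean(n2))

n1 = [1, 0, 2, 1, 0, 3, 1, 0]
n2 = [1, 0, 2, 1, 0, 3, 1, 0]  # perfectly correlated toy streams
print(g2(n1, n2))
```

In practice g2 is measured as a function of baseline, and its falloff constrains the angular size of the source.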
-
Based on a pragmatist inspired conception of the social self, the concept of reparations for the harms of genocide is reexamined. Both Raphael Lemkin, the person who invented the term “genocide,” and Claudia Card, a philosopher who examined the evil of genocide, hold similarly expansive notions of the harms inflicted by genocidal violence. Both argued that biological death is not necessarily central to genocide. For Lemkin, cultural destruction of the targeted group is just as essential as the actual killing itself. Genocide is a group crime that aims to destroy the group and all the social aspects of group identity. Card similarly sees the target of genocidal violence as the social vitality of the self. This vitality is sustained by group relations. Reparations thus need to be reconceptualized in terms of the restoration of the social life of the victim group and not solely on the basis of economic losses. Examples are given for the reparation of the social vitality of communities that have suffered genocide. © 2020 Central European Pragmatist Forum. All rights reserved.
-
Cyber-behavioral biometric modalities such as keystroke dynamics, mouse dynamics, and touch screen dynamics have recently come under attacks of various forms. To address these attacks and other security issues, we present a novel concept of using smartwatch sensor data to continuously verify users in cyberspace and show its potential as a new standalone cyber-behavioral biometric modality. For our experiments, smartwatch gyroscope and accelerometer data collected from 49 subjects while typing on a desktop computer were considered. We implemented six pattern matching classifiers to compare each verification attempt against the user profile. Experimental results comprising 282,240 classification attempts show significantly high True Positive (TP) rates and extremely low False Positive (FP) rates, with a highest achieved TP rate of 87.2% and a lowest FP rate of 0.2%. Given this level of accuracy, and the natural resilience to attacks that comes with a physical biometric property such as hand movement, we opine that smartwatch movement dynamics, besides being a new biometric trait, can be a solution to the security loopholes in existing cyber-behavioral biometric modalities for continuous verification. © 2020 IEEE.
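A hedged sketch of one possible pattern-matching verifier: a Manhattan-distance threshold over summary features of sensor windows. The abstract does not specify its six classifiers, and the feature values and threshold below are invented.

```python
# Accept a verification attempt if its feature vector lies within a
# distance threshold of the enrolled user's mean profile.

def verify(profile_mean, test_vec, threshold):
    score = sum(abs(a - b) for a, b in zip(profile_mean, test_vec))
    return score <= threshold

profile = [0.12, 0.55, 0.33, 0.90]   # enrolled mean feature vector (invented)
genuine = [0.10, 0.57, 0.30, 0.88]   # same user, new session
impostor = [0.40, 0.20, 0.70, 0.50]  # different user
print(verify(profile, genuine, 0.15), verify(profile, impostor, 0.15))
```

Sweeping the threshold trades TP rate against FP rate, which is how operating points like 87.2% TP / 0.2% FP arise.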
-
Promoter regions of long non-coding RNA (lncRNA) genes are crucial for understanding their transcriptional regulatory patterns. LncRNA genes, being more cryptic than protein-coding genes in terms of functionality and biogenesis divergence, have far fewer existing studies elucidating the roles of their promoters than their protein-coding counterparts. Based on the overlap between epigenetic marks and transcription start sites, human lncRNAs have been categorized into two broad classes, enhancer-originated lncRNAs (e-lncRNAs) and promoter-originated lncRNAs (p-lncRNAs), and these two groups are hence subject to distinct transcriptional regulatory programs. To understand the differences in the transcriptional regulatory mechanisms that govern p- and e-lncRNAs, we studied the promoter sequences of these two groups of lncRNAs, including the distinct transcription factor (TF) proteins that favor p- over e-lncRNAs (and vice versa). In addition, we developed a convolutional neural network (CNN) based deep learning (DL) framework, DeePEL (deep p-, e-lncRNA promoter recognizer), to classify the promoters of p- and e-lncRNAs. To the best of our knowledge, this is the first attempt to classify these two groups of lncRNA promoters, using sequence and TF information, within a DL framework. We report several sequence-specific signatures in the promoter regions, as well as several distinct TFs specific to each group of lncRNAs, that will help in understanding the promoter-proximal transcriptional regulation of p-lncRNAs and e-lncRNAs. © 2019 IEEE.
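The input encoding such a sequence CNN would consume can be sketched as follows, under the common assumption that promoter sequences are one-hot encoded over {A, C, G, T} into a 4-channel matrix per sequence (DeePEL's actual preprocessing may differ); ambiguous bases map to all zeros here.

```python
# One-hot encode a DNA sequence: each base becomes a 4-element indicator
# vector, yielding a (length x 4) matrix suitable as CNN input.

def one_hot(seq):
    table = {"A": [1, 0, 0, 0], "C": [0, 1, 0, 0],
             "G": [0, 0, 1, 0], "T": [0, 0, 0, 1]}
    return [table.get(base, [0, 0, 0, 0]) for base in seq.upper()]  # N -> zeros

encoded = one_hot("ACGTN")
print(encoded)
```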
Explore
Resource type
- Audio Recording (1)
- Blog Post (4)
- Book (526)
- Book Section (627)
- Conference Paper (247)
- Dataset (1)
- Document (2)
- Encyclopedia Article (1)
- Journal Article (3,519)
- Magazine Article (24)
- Patent (1)
- Preprint (5)
- Presentation (23)
- Report (144)
- Thesis (2)
Publication year
-
Between 2000 and 2026
- Between 2000 and 2009 (1,022)
- Between 2010 and 2019 (2,500)
- Between 2020 and 2026 (1,605)
Resource language
- 206-207 (1)
- Chinese (10)
- Chinese, Traditional (1)
- German [Deutsch] (1)
- English (3,531)
- English (1)
- French (3)
- German (6)
- Czech and English (contributions) (1)
- Czech or English (summaries) (1)