  • Exfiltrating sensitive information from smartphones has become one of the most significant security threats. We have built a system to identify HTTP-based information exfiltration by malicious Android applications. In this paper, we discuss a method to track the propagation of sensitive information in Android applications using static taint analysis. We have studied the leaked information, the destinations to which it is exfiltrated, and their correlations with the types of sensitive information. The analysis of 578 malicious Android applications reveals that a significant portion of these applications are interested in identity-related sensitive information, and that the vast majority of malicious applications leak multiple types of sensitive information. We have also identified that servers associated with three country codes, CN, US, and SG, are the most active in collecting sensitive information. The analysis results also demonstrate that suspicious URLs use a wide range of non-default ports. © 2018 IEEE.
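
As a rough illustration of the static-taint-analysis idea the abstract describes, the sketch below propagates taint through a toy straight-line intermediate representation. The source/sink API names and the statement format are assumptions made for illustration, not the paper's actual analysis.

```python
# Toy forward taint propagation: taint enters at assumed sensitive sources,
# flows through assignments, and a leak is reported when a tainted value
# reaches an assumed network sink. Statements are (lhs, operation, args).
SOURCES = {"getDeviceId", "getLocation"}   # assumed sensitive APIs
SINKS = {"httpPost"}                       # assumed exfiltration sink

def find_leaks(stmts):
    tainted, leaks = set(), []
    for lhs, op, args in stmts:
        if op in SOURCES:
            tainted.add(lhs)                   # source introduces taint
        elif op in SINKS and tainted & set(args):
            leaks.append(op)                   # tainted data reaches a sink
        elif tainted & set(args):
            tainted.add(lhs)                   # assignment propagates taint
    return leaks
```

A real analysis such as the paper's would additionally handle branching control flow, aliasing, and inter-procedural calls; this sketch only shows the core propagation rule.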

  • We demonstrate that a nonzero strangeness contribution to the spacelike electromagnetic form factor of the nucleon is evidence for a strange-antistrange asymmetry in the nucleon's light-front wave function, thus implying different nonperturbative contributions to the strange and antistrange quark distribution functions. A recent lattice QCD calculation of the nucleon strange quark form factor predicts that the strange quark distribution is more centralized in coordinate space than the antistrange quark distribution, and thus the strange quark distribution is more spread out in light-front momentum space. We show that the lattice prediction implies that the difference between the strange and antistrange parton distribution functions, s(x)-s̄(x), is negative at small x and positive at large x. We also evaluate the strange quark form factor and s(x)-s̄(x) using a baryon-meson fluctuation model and a novel nonperturbative model based on light-front holographic QCD. This procedure leads to a Veneziano-like expression for the form factor, which depends exclusively on the twist of the hadron and the properties of the Regge trajectory of the vector meson which couples to the quark current in the hadron. The holographic structure of the model allows us to unambiguously introduce quark masses in the form factors and quark distributions while preserving the hard-scattering counting rule at large Q² and the inclusive counting rule at large x. Quark masses modify the Regge intercept which governs the small-x behavior of the quark distributions, thereby modifying their singular behavior at small x. Both nonperturbative approaches provide descriptions of the strange-antistrange asymmetry and intrinsic strangeness in the nucleon consistent with the lattice QCD result. © 2018 authors. Published by the American Physical Society.
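
One step the abstract leaves implicit: the nucleon carries no net strangeness, so the asymmetry must integrate to zero, which is why a negative region at small x must be compensated by a positive region at large x:

```latex
% Zero net strangeness of the nucleon (valence sum rule for strange quarks):
\int_0^1 \left[ s(x) - \bar{s}(x) \right] \, dx = 0
```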

  • To accommodate execution mode changes and hardware malfunctions, dynamic system reconfiguration, which invokes application migration across different processing cores, needs to be supported on multi-core embedded systems. Because different application migration strategies impact the system's timing behavior in different ways, it is important to select one such that the system's timing performance after the migration process is still acceptable. The focus of our research is to predict the timing change under each possible migration strategy and, based on that prediction, to choose the optimal one. Extensive experiments running multiple benchmarks validate the effectiveness of our proposed approach.

  • Reliability, longevity, availability, and deadline guarantees are the four most important metrics to measure the QoS of long-running safety-critical real-time applications. Software aging is one of the major factors that impact the safety of long-running real-time applications, as the degraded performance and increased failure rate caused by software aging can lead to deadline misses and catastrophic consequences. Software rejuvenation is one of the most commonly used approaches to handle issues caused by software aging. In this paper, we study the optimal time at which software rejuvenation shall take place so that the system's reliability, longevity, and availability are maximized, and the application delays caused by software rejuvenation are minimized. In particular, we formally analyze the relationships between software rejuvenation frequency and system reliability, longevity, and availability. Based on the theoretical analysis, we develop approaches to maximizing system reliability, longevity, and availability, and use simulation to evaluate the developed approaches. In addition, we design the MIN-DELAY semi-priority-driven scheduling algorithm to minimize application delays caused by rejuvenation processes. The simulation experiments show that the developed semi-priority-driven scheduling algorithm reduces application delays by 9.01% and 14.24% over the earliest deadline first (EDF) and least release time (LRT) scheduling algorithms, respectively.
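
The trade-off the abstract analyzes can be illustrated with a toy steady-state availability model. This is a sketch under assumed parameters, not the paper's formal analysis: we assume the failure rate grows linearly with software age since the last rejuvenation, each failure costs a fixed repair time, and each rejuvenation costs a fixed downtime.

```python
def availability(interval, rejuv_cost, base_fail, aging_rate, repair_cost):
    """Toy steady-state availability under periodic rejuvenation.

    Assumed model (illustrative only): hazard rate base_fail + aging_rate * t
    since the last rejuvenation; each expected failure adds repair_cost of
    downtime, and each rejuvenation adds rejuv_cost of downtime.
    """
    # Expected failures per cycle = integral of the linearly aging hazard.
    expected_failures = base_fail * interval + aging_rate * interval ** 2 / 2
    downtime = rejuv_cost + expected_failures * repair_cost
    return interval / (interval + downtime)
```

Rejuvenating too often pays the fixed rejuvenation cost too frequently; rejuvenating too rarely lets the aging term dominate, so availability is maximized at an interior interval.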

  • We outline a novel method of user authentication for smart mobile devices, such as smartphones or tablets, and propose movement-pattern-based authentication as an alternative to current methods that rely on a PIN or drawn pattern. While the current methods are vulnerable to common attacks (e.g., smudge attacks, shoulder surfing), our method is more resilient against attacks of these kinds because it uses the sensor data produced by the device during a preset movement for authentication. In our experiment, we recorded the readings of four physical sensors, each with three axes: (1) accelerometer, (2) linear accelerometer, (3) gyroscope, and (4) tilt sensor, over a set of movements. We experimented with 10 arbitrary movement patterns and gathered 12 samples of each (120 samples in total). We developed our own method of authentication, with which we performed 35,650 authentication attempts and found a 20.36% Equal Error Rate.
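
The reported 20.36% Equal Error Rate is the operating point where the false accept rate and false reject rate coincide. As a minimal sketch, the function below estimates the EER from genuine and impostor score samples by sweeping a decision threshold; this threshold-sweep convention is a common estimator, not necessarily the authors' exact procedure.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the EER from similarity scores (higher = more genuine-like).

    Sweeps every observed score as a candidate threshold and returns the
    midpoint of FAR and FRR at the threshold where they are closest.
    """
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor >= t)   # impostors wrongly accepted
        frr = np.mean(genuine < t)     # genuine users wrongly rejected
        if abs(far - frr) < gap:
            gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer
```

With perfectly separable score distributions the estimate is 0; overlapping distributions, as with noisy motion-sensor features, push it toward values like the 20.36% reported here.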

  • Designing a serial fusion based multi-biometric verification system requires fixing several parameters, such as the reject thresholds at each stage of the architecture and the order in which the individual verifiers are placed within the multi-stage system. The order of the verifiers is a crucial parameter because of its high impact on verification errors. A wrong choice of verifier order might cause tremendous user inconvenience by denying a large number of genuine users, and might cause severe security breaches by frequently accepting impostors. Unfortunately, this design issue has been poorly investigated in the multi-biometric literature. In this paper, we address it by performing experiments using three different serial fusion based multi-biometric verification schemes on the publicly available NIST multi-modal dataset. We tested all 24 possible orders of the four individual verifiers on a four-stage biometric verification system. Our experimental results show that the “best-to-worst” order, in which the best performing individual verifier is placed in the first stage, the next best in the second stage, and so on, is the top performing order. In addition, we have proposed a modification to the traditional architecture of serial fusion based multi-biometric verification systems. With rigorous experiments on the NIST multi-modal dataset and using three serial fusion based multi-biometric verification schemes, we demonstrated that our proposed architecture significantly improves the performance of serial fusion based multi-biometric verification systems.
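
The multi-stage decision logic described here can be sketched as a cascade in which each early stage may confidently accept, confidently reject, or defer the sample to the next verifier, while the final stage decides with a single threshold. The parameter names and threshold semantics below are illustrative assumptions, not the schemes evaluated in the paper.

```python
def serial_fusion_verify(scores, accept_ts, reject_ts, final_t):
    """Serial fusion cascade sketch for one claimant.

    scores:    per-stage match scores, in the chosen verifier order.
    accept_ts: accept thresholds for the non-final stages.
    reject_ts: reject thresholds for the non-final stages.
    final_t:   single decision threshold of the final stage.
    """
    for s, hi, lo in zip(scores[:-1], accept_ts, reject_ts):
        if s >= hi:
            return True    # confident accept at an early stage
        if s <= lo:
            return False   # confident reject at an early stage
        # Otherwise defer: the confusing sample moves to the next stage.
    return scores[-1] >= final_t
```

This structure makes the ordering issue concrete: the final stage only ever sees samples every earlier stage found confusing, which is why a weak verifier placed there hurts the whole system.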

  • Measurements of parameters in electricity grids are frequently average values over some time interval. In scenarios of distributed measurements such as in distribution grids, offsets of local clocks can result in the averaging interval being misaligned. This paper investigates the properties of the so-called time alignment error of such measurands that is caused by shifts of the averaging interval. A Markov model is derived that allows for numerically calculating the expected value and other distribution properties of this error. Actual consumption measurements of an office building are used to study the behavior of this time alignment error, and to compare the results from the trace with numerical results and simulations from a fitted Markov model. For increasing averaging interval offset, the time alignment error approaches a normal distribution, whose parameters can be calculated or approximated from the Markov model.
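
The misalignment effect studied here can be reproduced directly on a trace, without the Markov-model machinery: compare each correctly aligned window average with the same window shifted by the clock offset. The sketch below assumes a unit-resolution consumption trace; it is an illustration of the error definition, not the paper's numerical method.

```python
import numpy as np

def time_alignment_errors(trace, window, offset):
    """Per-interval time alignment errors of window averages.

    For each averaging interval, the error is the difference between the
    average over the interval shifted by `offset` samples (misaligned clock)
    and the average over the correctly aligned interval.
    """
    errs = []
    for start in range(0, len(trace) - window - offset, window):
        aligned = trace[start:start + window].mean()
        shifted = trace[start + offset:start + offset + window].mean()
        errs.append(shifted - aligned)
    return np.array(errs)
```

For a constant trace the error vanishes for any offset; for a varying trace, collecting these per-interval errors gives the empirical distribution whose normal limit the paper derives for increasing offsets.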

  • The traditional architecture of serial fusion based multi-biometric verification systems places an average performing or the worst performing individual verifier in the final stage. Because the final stage gives the verification decision using a single threshold and takes on the most confusing samples, which are rejected by all previous stages, an average or the worst performing individual verifier may incur high verification errors in the final stage, which may negatively impact the performance of the whole system. Unfortunately, it is not possible to place a strong individual verifier in the final stage of a traditional architecture, because doing so forces a weak individual verifier into an earlier stage. Studies show that placing a weak individual verifier in an earlier stage worsens the performance of the whole system by giving more wrong decisions earlier. Hence, the challenge is: how can we place the best performing individual verifier in the first stage and at the same time not place an average or the worst performing individual verifier in the final stage? In this paper, we address this challenge. We have come up with a very simple but effective solution: a modification to the traditional architecture of serial fusion based multi-biometric verification systems. With rigorous experiments on the NIST multi-modal dataset and using three serial fusion based multi-biometric verification schemes, we demonstrated that our proposed architecture significantly improves the performance of serial fusion based multi-biometric verification systems. © 2018 IEEE.

  • Twitter users often crave more followers to increase their social popularity. While a variety of factors have been shown to attract followers, very little work has been done to analyze the mechanism by which Twitter users follow or unfollow each other. In this paper, we apply game theory to model the follow-unfollow mechanism on Twitter. We first present a two-player game based on the Prisoner’s Dilemma, and subsequently evaluate the payoffs when the two players adopt different strategies. To allow the two players to play multiple rounds of the game, we propose a multi-stage game model. We design a Twitter bot analyzer which follows or unfollows other Twitter users by adopting the strategies from the multi-stage game, and develop an algorithm that enables the bot analyzer to automatically collect and analyze the data. The results from the data collected in our experiment show that the follow-back ratios for both of the Twitter bots are very low, 0.76% and 0.86%. This means that most Twitter users do not cooperate and only want to be followed instead of following others. Our results also exhibit the effect of different strategies on the follow-back followers as well as on the non-following followers.
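
The Prisoner's-Dilemma framing treats "follow" as cooperation and "not follow" as defection. The payoff values below are illustrative assumptions, not the paper's; the follow-back ratio, however, is computed exactly as the abstract reports it.

```python
# Illustrative PD-style payoffs for the follow/unfollow game; the numbers
# are assumptions chosen to satisfy the usual PD ordering (T > R > P > S),
# not values from the paper. Each entry maps (A's move, B's move) to payoffs.
PAYOFFS = {
    ("follow", "follow"):         (3, 3),  # mutual follow: both gain audience
    ("follow", "not_follow"):     (0, 5),  # one-sided follow: only B gains
    ("not_follow", "follow"):     (5, 0),
    ("not_follow", "not_follow"): (1, 1),  # neither gains much
}

def follow_back_ratio(followed, followed_back):
    """Fraction of accounts a bot followed that followed it back."""
    return followed_back / followed if followed else 0.0
```

With this ordering, "not_follow" is each player's dominant single-round strategy, which matches the observed outcome: follow-back ratios under 1% indicate widespread defection.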

  • This textbook covers digital design, fundamentals of computer architecture, and assembly language. The book starts by introducing basic number systems, character coding, basic knowledge in digital design, and the components of a computer. It goes on to discuss information representation in computing; Boolean algebra and logic gates; sequential logic; input/output; and CPU performance. The author also covers the ARM architecture, ARM instructions, and ARM assembly language, which is used in a variety of devices such as cell phones, digital TVs, automobiles, routers, and switches. The book contains a set of laboratory experiments related to digital design using the Logisim software; in addition, each chapter features objectives, summaries, key terms, review questions, and problems. The book is targeted at students majoring in Computer Science, Information Systems, and IT, and follows the ACM/IEEE 2013 guidelines.
    • Comprehensive textbook covering digital design, computer architecture, and ARM architecture and assembly
    • Covers basic number systems and coding, basic knowledge in digital design, and components of a computer
    • Features laboratory exercises in addition to objectives, summaries, key terms, review questions, and problems in each chapter

Last update from database: 3/25/26, 6:13 PM (UTC)
