Search results: 2,500 resources
-
Creative approaches to counseling help counselors to meet the needs of diverse populations. The utility of photography in counseling has been demonstrated through several case studies; however, clear implications of how photography relates to the counseling process have not been well delineated. The existing literature on phototherapy is reviewed and connected to specific photo directives within the counseling process and common psychotherapeutic techniques. © 2012 Copyright Taylor & Francis Group, LLC.
-
In a recent article in The New York Times, therapy using Sexual Orientation Management was highlighted, with a focus on therapists assisting clients to live a heterosexual life because of their religious beliefs. The primary reasons for such an approach are to allow for client choice and to respect an equally important area of diversity: clients' religious affiliation and values. Although research has been performed on the intersection of religion and sexual orientation, there has not been an extensive analysis or criticism of this management technique. In this article, the authors explore the experience of religious clients struggling with their sexual orientation, discuss potential counselor responses to religious issues surrounding sexual orientation, and examine the impact of each response on the counseling field using ethical principles, existing research, and logical outcomes. © 2011 Copyright Taylor and Francis Group, LLC.
-
We defined a set of quantifiable features for authorship categorization. We performed our experiments on public domain literature: all books analyzed were obtained in plain text format through Project Gutenberg's online repository of classic books. We tested three machine learning algorithms with our features: Artificial Neural Network, Naïve Bayes Classifier, and Support Vector Machine. We found that certain features, such as punctuation and various suffixes, result in higher accuracy. In addition, the Support Vector Machine classifier repeatedly produces higher accuracies than the other classifiers and appears to be a far superior method of classification for authorship categorization. © 2016 IEEE.
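The abstract does not list its exact feature set, but the kind of punctuation- and suffix-based stylometric features it describes can be sketched as follows. This is a minimal illustration with a hypothetical feature list (`stylometric_features`, the chosen punctuation marks, and the suffix set are assumptions, not the paper's definitions); the resulting feature vectors would then be fed to a classifier such as an SVM.

```python
from collections import Counter

def stylometric_features(text):
    """Extract simple authorship features: punctuation rates and
    common-suffix rates, normalized by token count."""
    tokens = text.lower().split()
    n = max(len(tokens), 1)
    punct = Counter(ch for ch in text if ch in ",;:!?.")
    # rate of each punctuation mark per token
    feats = {f"punct_{ch}": punct[ch] / n for ch in ",;:!?."}
    # rate of tokens ending in common English suffixes
    for suffix in ("ing", "ed", "ly", "tion", "ness"):
        feats[f"suffix_{suffix}"] = sum(
            t.strip(".,;:!?").endswith(suffix) for t in tokens) / n
    return feats

sample = "He walked slowly, thinking carefully; the decision, clearly, was his."
print(stylometric_features(sample))
```

Each book would be converted to such a vector, and the per-author vectors used as training data.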
-
We give the theoretical foundation for finding a reject region which gives the minimum equal error rate in serial fusion based biometric verification. Given a user-specified tolerance of x percent genuine score reject rate, we prove that there exists a unique reject region inside which the false alarm rate and impostor pass rate curves overlap, and this reject region gives the minimum equal error rate. Our theory leads to new algorithms for finding reject regions, which have two key advantages over the state-of-the-art: (1) the algorithms allow the system administrator to control the proportion of genuine scores that a reject region can erroneously reject and (2) the algorithms determine reject regions directly from the scores, without the need to estimate score distributions. Our proofs do not rely on data belonging to any particular distribution, which makes them applicable to a wide range of biometric modalities including face, finger, iris, speech, gait, and keystrokes. © 2016 IEEE.
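The key practical claim, determining a reject region directly from scores without estimating score distributions, can be illustrated with a deliberately crude sketch. This is not the paper's algorithm: it simply takes the interval where the genuine and impostor score ranges overlap as a candidate reject region, and caps the genuine-score reject rate at a user-specified tolerance (the `tol` parameter stands in for the "x percent" tolerance mentioned in the abstract).

```python
def reject_region(genuine, impostor, tol=0.05):
    """Crude reject-region sketch: the interval where genuine and
    impostor score ranges overlap, with its upper edge lowered so
    that at most a `tol` fraction of genuine scores fall inside."""
    g = sorted(genuine)
    lo, hi = g[0], max(impostor)
    if lo >= hi:
        return None  # score ranges do not overlap; nothing to reject
    # cap the genuine reject rate: at most floor(tol * len(g))
    # genuine scores may lie below the region's upper edge
    k = min(int(tol * len(g)), len(g) - 1)
    hi = min(hi, g[k])
    return (lo, hi)

print(reject_region([0.55, 0.6, 0.7, 0.8, 0.9, 0.95],
                    [0.1, 0.2, 0.3, 0.4, 0.5, 0.65], tol=0.2))
```

Scores falling inside the returned interval would be deferred to the next matcher in the serial fusion chain rather than accepted or rejected outright.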
-
Smartphones, while providing users ease of access to sensitive information on the go, also present severe security risks if an attacker is able to gain access to them. To strengthen user authentication and identification on a smartphone, we develop a biometric authentication and identification system which uses the capacitive touchscreen featured in all current smartphones. Our methodology focuses on using the touchscreen as a sensor to capture the image of a user's ear, thumb, or four fingers. We extract the capacitive raw data from the touched body part to obtain a capacitive image, and then use it to capture geometric features (e.g., length and width of a finger) and principal components. After that, we experiment with Support Vector Machine (SVM) and Random Forest (RF) classifiers to verify and also identify each user. We achieved a maximum authentication accuracy of 98.84% with four fingers using SVM, and a maximum identification accuracy of 97.61% with four fingers using RF. © 2016 IEEE.
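The geometric-feature step the abstract mentions (e.g., length and width of a finger) can be sketched on a toy binary capacitive image. This is a hypothetical illustration, not the paper's pipeline: the `touch_geometry` helper and the grid representation are assumptions, and a real system would work on raw capacitance values rather than 0/1 cells.

```python
def touch_geometry(img):
    """Bounding-box length and width (in cells) of the touched region
    in a binary capacitive image, given as a list of rows of 0/1."""
    rows = [r for r, row in enumerate(img) if any(row)]
    cols = [c for c in range(len(img[0])) if any(row[c] for row in img)]
    if not rows:
        return (0, 0)
    return (rows[-1] - rows[0] + 1, cols[-1] - cols[0] + 1)

# a toy 4x4 capacitive image with a finger-shaped touch blob
finger = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
]
print(touch_geometry(finger))  # (length, width) in cells
```

Features like these, together with principal components of the image, would form the input vectors for the SVM and RF classifiers.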
-
Search task difficulty has been attracting much research attention in recent years, mostly regarding its relationship with searchers' behaviors and the prediction of task difficulty from search behaviors. However, it remains unknown what makes searchers feel the difficulty. A study with 48 undergraduate students was conducted to explore this question. Each participant was given 4 search tasks that were carefully designed following a task classification scheme. Questionnaires were used to elicit participants' ratings on task difficulty and why they gave those ratings. Based on the collected difficulty reasons, a coding scheme was developed, which covered various aspects of task, user, and user-task interaction. Difficulty reasons were then categorized following this scheme. Results showed that searchers reported some common reasons leading to task difficulty in different tasks, but most of the difficulty reasons varied across tasks. In addition, task difficulty had some common reasons between searchers with low and high levels of topic knowledge, although there were also differences in top task difficulty reasons between high and low knowledge users. These findings further our understanding of search task difficulty, the relationship between task difficulty and task type, and that between task difficulty and knowledge level. The findings can also be helpful in designing tasks for information search experiments, and have implications for search system design both in general and for personalization based on task type and searchers' knowledge. (C) 2014 Elsevier Ltd. All rights reserved.
-
Data dissemination protocols govern interaction and exchange of data among nodes in a distributed system. An understanding of data transfer protocols provides insight into efficient middleware management. Due to their simplicity, scalability, and fault-tolerance, gossip-based protocols are researched widely as an effective communication strategy. The Shuffle protocol presented in [1] is an example of a decentralized, gossip-based data transfer protocol used to spread information in a wireless network via probabilistic exchange of data. This paper presents an asynchronous variant of the Shuffle protocol and a system model that captures variability in data transmission times. This transmission time variability is inherent in dynamic networks, where such algorithms are typically deployed. A simulation-based analysis of the protocol's performance behavior is presented. Results show the effects of transmission variability on data replication and its coverage. Also examined is the relationship between available storage and the performance of the protocol, expressed using measures such as propagation time and work. © 2015 IEEE.
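The probabilistic exchange at the heart of such gossip protocols can be sketched generically. This is not the Shuffle protocol from [1], only a minimal illustration of the idea: two nodes each offer a random subset of their bounded cache, merge what they receive, and evict randomly when full (the `exchange` and `capacity` parameters are assumptions for the sketch).

```python
import random

def shuffle_step(cache_a, cache_b, exchange=2, capacity=6, rng=random):
    """One gossip exchange: each node offers a random subset of its
    cache; both merge what they receive, evicting randomly if full."""
    offer_a = rng.sample(sorted(cache_a), min(exchange, len(cache_a)))
    offer_b = rng.sample(sorted(cache_b), min(exchange, len(cache_b)))
    for cache, incoming in ((cache_a, offer_b), (cache_b, offer_a)):
        for item in incoming:
            if item in cache:
                continue
            if len(cache) >= capacity:
                cache.remove(rng.choice(sorted(cache)))  # random eviction
            cache.add(item)
    return cache_a, cache_b

random.seed(1)
a, b = {1, 2, 3}, {4, 5, 6}
shuffle_step(a, b)
print(a, b)
```

Repeating this step between randomly chosen pairs of nodes is what spreads (replicates) data items across the network; the paper's asynchronous variant additionally models variable transmission times for each exchange.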
-
Manufacturers of CPUs publish a document that contains information about the processor, including the list of registers, the function of each register, the size of the data bus, the size of the address bus, and the list of instructions that can be executed by the CPU. Each CPU has a known instruction set that a programmer can use to write assembly language programs. Instruction sets are specific to each type of processor; for example, Pentium processors use a different instruction set than ARM processors. Using the instructions of a processor to write a program is called assembly language programming, and the function of an assembler is to convert assembly language into machine code (binary) that the CPU can understand.
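The assembler's job of translating mnemonics into machine code can be demonstrated with a toy example. The instruction set below is entirely made up for illustration (a 4-bit opcode plus a 4-bit operand packed into one byte); real ARM or Pentium encodings are far more complex, but the translation principle is the same.

```python
# Toy assembler for a made-up instruction set: each mnemonic maps to
# a 4-bit opcode, and the operand fills the low 4 bits, producing one
# byte of "machine code" per instruction.
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "HALT": 0xF}

def assemble(program):
    code = []
    for line in program:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        code.append((OPCODES[mnemonic] << 4) | (operand & 0xF))
    return code

machine_code = assemble(["LOAD 5", "ADD 3", "STORE 9", "HALT"])
print([f"0x{b:02X}" for b in machine_code])
```

Each source line becomes exactly one byte here; a real assembler also resolves labels, handles addressing modes, and emits variable-length or fixed 32-bit encodings depending on the architecture.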
-
ARM offers a variety of processor cores based on their target applications. The Cortex-A series is a family of high-performance processors for open operating systems; the Cortex-A50, for example, is a 64-bit processor. Applications of the Cortex-A series include smartphones, netbooks, digital TVs, and eBook readers.
-
The data transfer instructions, Load (LDR) and Store (STR), are used to transfer data from memory to registers and from registers to memory.
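The load/store idea can be illustrated with a tiny register/memory model. This is a conceptual sketch only, not real ARM semantics: `ldr` copies a value from a memory address into a register, and `strr` (named to avoid shadowing Python's built-in `str`) copies a register's value back out to memory.

```python
# Minimal register/memory model illustrating load (LDR) and store (STR).
memory = {0x100: 42, 0x104: 7}
registers = {"R0": 0, "R1": 0}

def ldr(reg, addr):
    """LDR reg, [addr] -- load: memory -> register."""
    registers[reg] = memory.get(addr, 0)

def strr(reg, addr):
    """STR reg, [addr] -- store: register -> memory."""
    memory[addr] = registers[reg]

ldr("R0", 0x100)   # R0 <- mem[0x100]
strr("R0", 0x108)  # mem[0x108] <- R0
print(registers["R0"], memory[0x108])
```

In a real load/store architecture like ARM, arithmetic instructions operate only on registers, so every value must pass through an LDR before use and an STR to be written back.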
-
Advanced RISC Machine (ARM) was developed by the Acorn Company. ARM is a leading supplier of microprocessor designs worldwide; ARM develops the core CPU, and thousands of suppliers add more functional units to the core. ARM uses two instruction sets called Thumb and Thumb-2. Thumb instructions are 16 bits and Thumb-2 instructions are 32 bits; currently, most ARM processors use 32-bit instructions.
-
The basic components of an Integrated Circuit (IC) are logic gates, which are made of transistors. In a digital system there are three basic logic operations, called AND, OR, and NOT.
-
In order to understand network technology it is important to know how information is represented for transmission from one computer to another. Information can be transferred between computers in one of two ways: an analog signal or a digital signal.
-
Dynamic voltage and frequency scaling (DVFS) is a well-known technique to optimize the power dissipation of electronic systems without significantly compromising overall system performance. DVFS exploits the periods of inter-core data exchange (memory-bound operations) to reduce the voltage and frequency (V/F) of the cores in order to reduce the power dissipation during the execution flow of an application running on the CMP. As the lengths of the idle and busy periods of the cores vary depending on the benchmarks, it is crucial for any DVFS technique to maximize the power saving without losing significant performance. In this work we present two power optimization methodologies that are integrated into a full-system simulator to make online predictions about the voltage and frequency of the cores during the execution time of the benchmarks. We evaluate these methodologies in terms of the V/F predictions vs. the actual utilization of each core periodically. We also compare the overall execution time, energy dissipation, and energy-delay product (EDP) of the power optimization methodologies for various benchmarks. © 2015 IEEE.
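The basic DVFS decision, lowering V/F when a core's recent utilization is low, can be sketched with a simple threshold policy. This is a generic illustration, not the paper's online predictors: the voltage/frequency levels and the `headroom` parameter are invented for the sketch.

```python
# Hypothetical V/F operating points, ascending: (volts, GHz).
LEVELS = [(0.8, 1.0), (0.9, 1.5), (1.0, 2.0)]

def pick_level(utilization, headroom=0.1, top_freq=2.0):
    """Pick the lowest V/F level whose frequency covers the core's
    recent utilization (fraction of top frequency) plus headroom."""
    demand = min(1.0, utilization + headroom) * top_freq  # GHz needed
    for volts, ghz in LEVELS:
        if ghz >= demand:
            return volts, ghz
    return LEVELS[-1]

print(pick_level(0.30))  # memory-bound phase: scale V/F down
print(pick_level(0.95))  # compute-bound phase: stay at top V/F
```

Since dynamic power scales roughly with V²·f, running memory-bound phases at the lower operating point saves energy with little impact on execution time; the paper's methodologies replace this static threshold with online predictions.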
-
We find that the relative preference of hedonic products is disproportionately enhanced when they are offered at a free price. This “free price bounce” is more subdued for utilitarian products. This is surprising because rational choice theory posits that relative preference amidst two options - say a hedonic and a utilitarian product - remains intact as long as the price difference between them is constant. We propose and demonstrate that this axiom is violated when a hedonic product is offered for free. (C) 2015 Elsevier B.V. All rights reserved.
-
Given the popularity of PHP frameworks used in developing web-based applications, a comparative study is conducted to determine which framework is best suited for incorporation into the curriculum of an undergraduate software engineering course that uses project-based learning. The top six PHP frameworks (Zend, Yii, CakePHP, CodeIgniter, PRADO, and Symfony) were initially considered and then narrowed down to two (CakePHP and CodeIgniter) based on their alignment with common functionality in previous class projects, framework complexity for those new to frameworks (learning curve), and developer friendliness (availability of documentation and online resources). An in-depth comparative study is conducted by developing a functionally-equivalent web application using each of the two frameworks as well as plain PHP (no framework). This work was motivated by the difficulties that were encountered in an evolving, content-rich software engineering course and discusses the educational changes that were made to align student learning with sound software engineering principles and current software development practices used in the computing industry. Copyright © 2013 ACM.
-
In this paper, we examined why information searchers perceive search tasks as difficult, and what factors/reasons make them perceive tasks as difficult. We also examined whether task difficulty reasons vary across different tasks (task types). Data was collected through a controlled laboratory experiment in which tasks were designed following a classification scheme. A total of 32 undergraduate students participated; each was given 4 search tasks, and they were asked in questionnaires both before and after the tasks for task difficulty ratings and why they gave those ratings. We developed a coding scheme based on the difficulty reasons users gave, which covered various aspects of task, user, and user-task interaction. Difficulty reasons were categorized following this scheme. Results showed that searchers had some common reasons for task difficulty in different tasks, but most of the difficulty reasons varied across tasks. For each task, there were also common reasons for task difficulty, although there was some variation here as well. Task difficulty was also found to be negatively correlated with users' topic knowledge, previous experience, and topic interest. Our findings further the understanding of search task difficulty, as well as the relationships between task difficulty and task type, knowledge background, etc. These can also be helpful with experiment task design. © 2013 ACM.
Resource type
- Book (243)
- Book Section (384)
- Conference Paper (115)
- Journal Article (1,679)
- Magazine Article (10)
- Presentation (14)
- Report (55)