  • This multi-phase study investigates the learning outcomes of courses taught in the K-14 classroom. Of specific interest are the methods and practices teachers use to develop and encourage 21st Century Skills, including critical thinking and technological fluency, in all subject areas, both STEM and non-STEM. These skills are currently in high demand in fields that develop advanced materials, and they form the backbone of the National Academies-developed Framework for K-12 Science Education. Phase I participants in this study included high school and college educators, while Phase II will involve K-14 students. In this study, educators were asked to rate their teaching self-efficacy in two primary areas: critical thinking skills and technological fluency. This included questions about components of their current curriculum as well as methods of assessment (e.g., rubrics). The instrument created to measure self-efficacy was based on a modified Science Teaching Efficacy Belief Instrument (STEBI). All participants were from Connecticut. Results indicate that both STEM and non-STEM subject areas offer an equally rich array of opportunities to effectively teach critical thinking and technological fluency at a variety of educational levels. The impact of professional development on teacher self-efficacy was of particular importance, especially in K-12 education. © 2013 Materials Research Society.

  • Nanoparticles, particles with a diameter of 1-100 nanometers (nm), are of interest in many applications, including device fabrication, quantum computing, and sensing, because their size may give them properties that differ greatly from those of bulk materials. Further advancement of nanotechnology cannot be achieved without an improved understanding of nanoparticle properties such as size (diameter) and size distribution, which are frequently evaluated using transmission electron microscopy (TEM). In the past, these parameters have been obtained from digitized TEM images by manually measuring and counting many nanoparticles, a task that is highly subjective and labor intensive. More recently, computer imaging particle analysis has emerged as an objective alternative that counts and measures objects in a binary image. This paper will describe the procedures used to preprocess a set of grayscale TEM images so that they could be correctly thresholded into binary images, allowing a more accurate assessment of the size and frequency (size distribution) of nanoparticles. Several preprocessing methods, including pseudo flat field correction and rolling ball background correction, were investigated, with the rolling ball algorithm yielding the best results. Examples of particle analysis will be presented for different types of materials and different magnifications. In addition, a method based on the results of particle analysis for identifying and removing small noise particles will be discussed. This filtering technique identifies the locations of small particles in the binary image and removes them without affecting the size of other, larger particles.
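
The small-particle filtering step described in this abstract, labeling connected foreground regions in the binary image and discarding those below an area cutoff, can be sketched in plain Python. This is a minimal illustration only, not the authors' implementation; the function names, the 4-connectivity choice, and the list-of-lists image representation are all assumptions:

```python
from collections import deque

def label_components(binary):
    """4-connected component labeling of a binary image via BFS flood fill.

    `binary` is a list of rows of 0/1 values; returns (labels, sizes), where
    labels[r][c] is 0 for background or a component id, and sizes maps
    each component id to its pixel count (particle area).
    """
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    sizes = {}
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                next_id += 1
                labels[r][c] = next_id
                queue = deque([(r, c)])
                count = 0
                while queue:
                    y, x = queue.popleft()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = next_id
                            queue.append((ny, nx))
                sizes[next_id] = count
    return labels, sizes

def remove_small_particles(binary, min_area):
    """Zero out connected components smaller than min_area pixels,
    leaving the pixels of larger particles untouched."""
    labels, sizes = label_components(binary)
    return [[1 if labels[r][c] and sizes[labels[r][c]] >= min_area else 0
             for c in range(len(binary[0]))]
            for r in range(len(binary))]
```

Because removal works on whole labeled components, an isolated noise pixel is deleted while an adjacent larger particle keeps exactly its original area, which is the property the abstract emphasizes.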

  • Thresholding is an image processing procedure that converts an image of gray-level pixels into a black and white binary image. One application of thresholding is particle analysis: once foreground objects are separated from the background, a quantitative analysis characterizing the number, size, and shape of particles can be obtained and used to evaluate a series of nanoparticle samples. Numerous thresholding techniques exist, differing primarily in how they handle variations in noise, illumination, and contrast. In this paper, several popular thresholding algorithms are qualitatively and quantitatively evaluated on transmission electron microscopy (TEM) and atomic force microscopy (AFM) images. Initially, six thresholding algorithms were investigated: Otsu, Riddler-Calvard, Kittler, Entropy, Tsai, and Maximum Likelihood. The Riddler-Calvard algorithm was excluded from the quantitative analysis because it did not produce acceptable qualitative results for the images in the series. Two quantitative measures were used to evaluate the remaining algorithms: one compares object area, the other object diameter, before and after thresholding. For AFM images the Kittler algorithm yielded the best results, followed by the Entropy and Maximum Likelihood techniques. The Tsai algorithm yielded the top results for TEM images, followed by the Entropy and Kittler methods.
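
All six algorithms named in this abstract select a threshold from the image's gray-level histogram. As a minimal illustration (not the paper's code), Otsu's method, one of the six, picks the gray level that maximizes the between-class variance of the background/foreground split; the area-based quantitative measure then reduces to counting pixels on the foreground side of the threshold. The function names and the "brighter than threshold is foreground" convention are assumptions:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the gray level t that maximizes the
    between-class variance of the split into values <= t and > t."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]           # background pixel count
        if w_bg == 0:
            continue
        w_fg = total - w_bg       # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def foreground_area(pixels, t):
    """Area measure: count of pixels classified as foreground (value > t)."""
    return sum(1 for p in pixels if p > t)
```

On a strongly bimodal gray-level distribution the returned threshold falls between the two modes, so the foreground area equals the size of the brighter population; comparing this count before and after a candidate thresholding step mirrors the area-based evaluation described above.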
