Results: 9 resources
-
This paper describes a collaborative project conducted by the Computer Science Department at Southern Connecticut State University and NASA's Goddard Institute for Space Studies (GISS). Animations of output from a mathematical climate-simulation model used at GISS to predict rainfall and circulation have been produced for West Africa from June to September 2002. These early results have assisted scientists at GISS in evaluating the accuracy of the RM3 climate model against similar results obtained from satellite imagery. The results presented here will be refined to better meet the needs of GISS scientists and will be expanded to cover other geographic regions over a variety of time frames.
-
Nanoparticles are of interest in many applications because their decreased size may give them properties that are very different from those of the bulk material. Nanoparticle properties such as size (diameter) and size distribution are often evaluated using transmission electron microscopy (TEM). These parameters can be obtained more easily from digitized TEM images by mapping particle signal to black and background pixels to white, a process known as thresholding, and then performing an algorithm known as particle analysis. The goal of this study was to compare the ability of several popular thresholding algorithms to segment TEM images. Performance of the thresholding algorithms was evaluated through qualitative and quantitative measures. Results show that the choice of thresholding algorithm strongly affects the results obtained from particle analysis. © 2007 Materials Research Society.
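As an illustration of the thresholding step this abstract describes, the sketch below is a minimal pure-Python version of Otsu's method, one of the popular algorithms typically evaluated in this context: it picks the gray level that maximizes between-class variance, then maps particle signal to black and background to white. The sample pixel values are invented, not data from the study.

```python
def otsu_threshold(pixels, levels=256):
    """Return the Otsu threshold: the gray level that maximizes the
    between-class variance of foreground vs. background pixels."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg, w_bg, best_t, best_var = 0.0, 0, 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        m_bg = sum_bg / w_bg
        m_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (m_bg - m_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Map dark particle signal to black (0) and bright background to white (255).
image = [30, 32, 35, 200, 210, 220, 31, 215]   # invented sample gray values
t = otsu_threshold(image)
binary = [0 if p <= t else 255 for p in image]
```

With a clearly bimodal histogram like this one, the threshold lands between the two clusters, so the three bright background pixels and the five dark particle pixels separate cleanly.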
-
The purpose of this study was to compare the ability of several texture analysis parameters to differentiate textured samples from a smooth control on images obtained with an Atomic Force Microscope (AFM). Surface roughness plays a major role in materials science, especially in integrated electronic devices. As these devices become smaller, new materials with better electrical properties are needed. New materials with smoother surface morphology have been found to have superior electrical properties to their rougher counterparts; therefore, in many cases surface texture is indicative of the electrical properties a material will have. Physical vapor deposition techniques such as Jet Vapor Deposition and Molecular Beam Epitaxy are being utilized to synthesize these materials, as they have been found to create pure and uniform thin layers. For the current study, growth parameters were varied to produce a spectrum of textured samples. The focus of this study was the image processing techniques associated with quantifying surface texture; as a result of the limited sample size, there was no attempt to draw conclusions about specimen processing methods. The samples were imaged using an AFM in tapping mode. In the process of collecting images, it was discovered that roughness data was much better depicted in the microscope's "height" mode than in its "equal area" mode. The AFM quantified the surface texture of each image by returning RMS roughness and the first-order histogram statistics of mean roughness, standard deviation, skewness, and kurtosis. Color images from the AFM were then processed on an offline computer running NIH ImageJ with an image texture plug-in. This plug-in produced another set of first-order statistics computed from each image's histogram as well as second-order statistics computed from each image's co-occurrence matrix.
The second-order statistics, originally proposed by Haralick, include contrast, angular second moment, correlation, inverse difference moment, and entropy. These features were computed in the 0°, 45°, 90°, and 135° directions. The findings of this study suggest that the best combination of quantitative texture parameters is standard deviation, 0° inverse difference moment, and 0° entropy, all of which are obtained from the NIH ImageJ texture plug-in. © 2010 Copyright SPIE - The International Society for Optical Engineering.
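The second-order Haralick features named above can be sketched from a normalized 0° (horizontal-neighbor) co-occurrence matrix as follows. This is an illustrative pure-Python version, not the ImageJ plug-in's code, and the tiny `smooth` and `rough` arrays are invented toy data standing in for AFM height images.

```python
import math

def glcm_0deg(img, levels):
    """Normalized gray-level co-occurrence matrix for 0-degree neighbors:
    counts how often gray level a sits immediately left of gray level b."""
    glcm = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            glcm[a][b] += 1
            pairs += 1
    return [[v / pairs for v in row] for row in glcm]

def haralick_features(glcm):
    """Four of Haralick's second-order statistics from a normalized GLCM."""
    contrast = asm = idm = entropy = 0.0
    for i, row in enumerate(glcm):
        for j, p in enumerate(row):
            contrast += (i - j) ** 2 * p          # weights gray-level jumps
            asm += p * p                          # angular second moment
            idm += p / (1 + (i - j) ** 2)         # inverse difference moment
            if p > 0:
                entropy -= p * math.log(p)        # randomness of pairings
    return {"contrast": contrast, "asm": asm, "idm": idm, "entropy": entropy}

smooth = [[1, 1, 1, 1]] * 4                 # uniform surface (invented)
rough  = [[0, 3, 0, 3], [3, 0, 3, 0]] * 2   # alternating heights (invented)
f_smooth = haralick_features(glcm_0deg(smooth, 4))
f_rough  = haralick_features(glcm_0deg(rough, 4))
```

The rough sample shows high contrast and entropy and a low inverse difference moment, while the smooth control shows the reverse, mirroring how such parameters separate textured samples from a smooth control.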
-
There is an acute and well-documented need for image processing of microscopy data in materials science regarding, for example, the characterization of the structure/property relationship of a given materials system. In our work, image processing has been used as a framework for conducting interdisciplinary team-based research that effectively integrates programs within the Center for Research on Interface Structures and Phenomena (CRISP) Materials Research Science and Engineering Center (MRSEC), e.g., research experiences for undergraduates (REU) and teachers (RET) and high school fellowships. This research resulted from a five-year collaboration between CRISP and the Physics and Computer Science Departments at Southern Connecticut State University (SCSU). This paper focuses on the implementation of team-based research experiences as a vehicle for interdisciplinary science and education. Representative results of several of the studies are presented and discussed. © 2011 Materials Research Society.
-
The primary goal of this research was to investigate the ability of quantitative variables to confirm qualitative improvements from the deconvolution algorithm as a preprocessing step in evaluating micro-CT bone density images. The analysis of these types of images is important because they are necessary to evaluate various countermeasures used to reduce or potentially reverse the bone loss experienced by some astronauts during extended weightlessness in space travel. Nine low-resolution (17.5 micron) CT bone density image sequences, ranging from 85 to 88 images per sequence, were processed with three preprocessing treatment groups: no preprocessing, preprocessing with a deconvolution algorithm, and preprocessing with a Gaussian filter. The quantitative parameters investigated consisted of Bone Volume to Total Volume Ratio, Structure Model Index, Fractal Dimension, Bone Area Ratio, Bone Thickness Ratio, Euler's Number, and Measure of Enhancement. Trends found in these quantitative variables appear to corroborate the visual improvements observed in the past and suggest which quantitative parameters may be capable of distinguishing between groups that experience bone loss and those that do not.
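Of the parameters listed above, the Bone Volume to Total Volume Ratio (BV/TV) is the simplest to illustrate: on a binarized micro-CT image stack it is just the fraction of voxels classified as bone. The sketch below is a minimal illustration using a tiny invented stack, not study data or the study's actual code.

```python
def bone_volume_fraction(stack):
    """BV/TV: fraction of foreground (bone) voxels in a binary image stack.

    `stack` is a list of 2-D slices; 1 marks bone, 0 marks background.
    """
    bone = total = 0
    for slice_ in stack:
        for row in slice_:
            bone += sum(row)
            total += len(row)
    return bone / total

# Two invented 2x4 binary slices standing in for a thresholded CT sequence.
stack = [
    [[1, 1, 0, 0], [1, 0, 0, 0]],
    [[1, 1, 1, 0], [0, 0, 0, 0]],
]
bvtv = bone_volume_fraction(stack)   # 6 bone voxels of 16 total
```

A preprocessing step such as deconvolution changes which voxels survive thresholding, which is why BV/TV and the other listed parameters can register the qualitative improvement numerically.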
-
Nanoparticles, particles with a diameter of 1-100 nanometers (nm), are of interest in many applications, including device fabrication, quantum computing, and sensing, because their size may give them properties that are very different from those of bulk materials. Further advancement of nanotechnology cannot be achieved without an increased understanding of nanoparticle properties such as size (diameter) and size distribution, frequently evaluated using transmission electron microscopy (TEM). In the past, these parameters have been obtained from digitized TEM images by manually measuring and counting many of these nanoparticles, a task that is highly subjective and labor intensive. More recently, computer imaging particle analysis has emerged as an objective alternative by counting and measuring objects in a binary image. This paper describes the procedures used to preprocess a set of grayscale TEM images so that they could be correctly thresholded into binary images, allowing a more accurate assessment of the size and frequency (size distribution) of nanoparticles. Several preprocessing methods, including pseudo-flat-field correction and rolling-ball background correction, were investigated, with the rolling-ball algorithm yielding the best results. Examples of particle analysis are presented for different types of materials and different magnifications. In addition, a method based on the results of particle analysis for identifying and removing small noise particles is discussed. This filtering technique is based on identifying the location of small particles in the binary image and removing them without affecting the size of other, larger particles.
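The small-particle filtering step described at the end of this abstract can be sketched as a connected-component pass: locate foreground components below an area cutoff and erase them, leaving larger particles untouched. The sketch below is an illustrative pure-Python version using 4-connectivity and an invented sample image; the study's actual implementation is not reproduced here.

```python
def remove_small_particles(binary, min_area):
    """Erase foreground (1) components smaller than `min_area` pixels,
    without altering the size or shape of larger particles."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    out = [row[:] for row in binary]
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                # flood-fill one connected component (4-connectivity)
                stack, comp = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) < min_area:      # noise particle: erase it
                    for cy, cx in comp:
                        out[cy][cx] = 0
    return out

# Invented binary image: one 2x2 particle plus a single-pixel noise speck.
img = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
cleaned = remove_small_particles(img, min_area=2)
```

The isolated speck is removed while the 2x2 particle keeps every pixel, so subsequent size measurements of real particles are unaffected.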
-
Thresholding is an image processing procedure used to convert an image consisting of gray-level pixels into a black-and-white binary image. One application of thresholding is particle analysis. Once foreground objects are separated from the background, a quantitative analysis that characterizes the number, size, and shape of particles is obtained, which can then be used to evaluate a series of nanoparticle samples. Numerous thresholding techniques exist, differing primarily in how they deal with variations in noise, illumination, and contrast. In this paper, several popular thresholding algorithms are qualitatively and quantitatively evaluated on transmission electron microscopy (TEM) and atomic force microscopy (AFM) images. Initially, six thresholding algorithms were investigated: Otsu, Riddler-Calvard, Kittler, Entropy, Tsai, and Maximum Likelihood. The Riddler-Calvard algorithm was not included in the quantitative analysis because it did not produce acceptable qualitative results for the images in the series. Two quantitative measures were used to evaluate these algorithms: one based on comparing object area, the other object diameter, before and after thresholding. For AFM images the Kittler algorithm yielded the best results, followed by the Entropy and Maximum Likelihood techniques. The Tsai algorithm yielded the top results for TEM images, followed by the Entropy and Kittler methods.
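The abstract does not reproduce the exact scoring formula, but an area-based measure of this kind can be sketched as a mean relative error between a known object area and the areas measured after each algorithm's thresholding, with algorithms ranked by that error. The algorithm names below reuse those from the study; the area values and the error formulation are invented for illustration.

```python
def area_error(true_area, measured_area):
    """Relative area error: one plausible way to score a thresholding
    algorithm against an object of known size."""
    return abs(measured_area - true_area) / true_area

def rank_algorithms(true_area, results):
    """Rank algorithms by mean relative area error over an image series
    (smaller error = better segmentation)."""
    scored = {
        name: sum(area_error(true_area, a) for a in areas) / len(areas)
        for name, areas in results.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1])

# Invented measured areas (in pixels) for objects of true area 100.
results = {
    "Otsu":    [98, 104, 95],
    "Kittler": [100, 101, 99],
    "Tsai":    [120, 80, 110],
}
ranking = rank_algorithms(100, results)
```

With these made-up numbers the Kittler algorithm would rank first, illustrating how a per-image area comparison aggregates into an overall ordering of the six candidates.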
-
The use of Transmission Electron Microscopy (TEM) to characterize the microstructure of a material continues to grow in importance as technological advancements become increasingly dependent on nanotechnology [1]. Since nanoparticle properties such as size (diameter) and size distribution are often important in determining potential applications, a particle analysis is often performed on TEM images. Traditionally done manually, this has the potential to be labor intensive, time consuming, and subjective [2]. To resolve these issues, automated particle analysis routines are becoming more widely accepted within the community [3]. When using such programs, it is important to compare their performance in terms of functionality and cost. The primary goal of this study was to apply one such software package, ImageJ, to grayscale TEM images of nanoparticles with known size. A secondary goal was to compare this popular open-source general-purpose image processing program to two commercial software packages. After a brief investigation of performance and price, ImageJ was identified as the software best suited for the particle analysis conducted in the study. While many ImageJ functions were used, the ability to break agglomerations that occur during specimen preparation into separate particles using a watershed algorithm was particularly helpful [4]. © 2009 SPIE-IS&T.
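ImageJ's binary watershed separates touching particles by flooding the distance transform of the binary image. The sketch below is a much-simplified marker-based version of that idea (city-block distance, flooding outward from the deepest interior pixels), not ImageJ's actual implementation; the dumbbell-shaped test image, two particles joined by a one-pixel bridge, is invented. The image is assumed to have a background border.

```python
from collections import deque

def _neighbors(y, x, h, w):
    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
        if 0 <= ny < h and 0 <= nx < w:
            yield ny, nx

def distance_transform(img):
    """City-block distance of each foreground pixel to the background."""
    h, w = len(img), len(img[0])
    dist = [[0 if not img[y][x] else -1 for x in range(w)] for y in range(h)]
    q = deque((y, x) for y in range(h) for x in range(w) if dist[y][x] == 0)
    while q:
        y, x = q.popleft()
        for ny, nx in _neighbors(y, x, h, w):
            if dist[ny][nx] == -1:
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist

def split_agglomerates(img):
    """Split touching particles by flooding outward from the interior
    maxima of the distance transform (the core idea of a watershed).
    Returns a label image (-1 marks the cut line) and a particle count."""
    h, w = len(img), len(img[0])
    dist = distance_transform(img)
    peak = max(d for row in dist for d in row)
    labels = [[0] * w for _ in range(h)]
    n_labels = 0
    # seeds: connected components of the deepest interior pixels
    for y in range(h):
        for x in range(w):
            if dist[y][x] == peak and labels[y][x] == 0:
                n_labels += 1
                stack = [(y, x)]
                labels[y][x] = n_labels
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in _neighbors(cy, cx, h, w):
                        if dist[ny][nx] == peak and labels[ny][nx] == 0:
                            labels[ny][nx] = n_labels
                            stack.append((ny, nx))
    # multi-source BFS: grow each seed; contested pixels become the cut (-1)
    q = deque((y, x) for y in range(h) for x in range(w) if labels[y][x])
    while q:
        y, x = q.popleft()
        for ny, nx in _neighbors(y, x, h, w):
            if img[ny][nx] and labels[ny][nx] == 0:
                claims = {labels[my][mx]
                          for my, mx in _neighbors(ny, nx, h, w)
                          if labels[my][mx] > 0}
                labels[ny][nx] = claims.pop() if len(claims) == 1 else -1
                q.append((ny, nx))
    return labels, n_labels

# Invented agglomeration: two 3x3 particles joined by a one-pixel bridge.
img = [
    [0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 0],
]
labels, n = split_agglomerates(img)
```

A plain connected-component count would report one blob here; the cut at the bridge recovers the two constituent particles, which is exactly why such a step matters before measuring particle sizes.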
Resource type: Conference Paper (8), Journal Article (1)
Resource language: English (9)