  • The primary purpose of this study was to investigate the feasibility of using simulated data from the United Kingdom Meteorological Office (UKMO) global climate mathematical model as boundary values for the regional model RM3, which has been used by NASA to make predictions about climate dynamics in West Africa. Historical data have been used successfully as boundary data in the past, but that approach limits outcomes to past time periods. The advantage of using the UKMO data is its potential to provide boundary data for future time periods, resulting in regional predictions of the future. This study has provided NASA scientists with graphical and statistical summaries, including visual animations, that supply the qualitative and quantitative information needed to evaluate whether the UKMO data can be used to drive the RM3 model. One definite conclusion of this investigation is that both spatial and temporal interpolation of the UKMO results will be necessary to make them compatible with the RM3 model.
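
A minimal sketch of the kind of regridding the abstract's conclusion implies: linear interpolation of a coarse global field onto a finer regional grid and a shorter time step. The grid spacings, time steps, and use of SciPy here are illustrative assumptions, not details taken from the study.

```python
# Hypothetical illustration: interpolate a coarse global field (stand-in for UKMO
# output) onto a finer regional grid and time step before it can drive a regional
# model such as RM3. All grid/time values below are made up for the example.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

times  = np.arange(0, 24, 6.0)            # 6-hourly snapshots (hours)
lats   = np.linspace(0, 20, 11)           # 2-degree spacing
lons   = np.linspace(-20, 10, 16)
coarse = np.random.rand(times.size, lats.size, lons.size)   # (time, lat, lon)

interp = RegularGridInterpolator((times, lats, lons), coarse, method="linear")

# Target: hourly data on a 0.5-degree regional grid (illustrative RM3-like grid)
t_new   = np.arange(0, 19, 1.0)
lat_new = np.linspace(5, 15, 21)
lon_new = np.linspace(-15, 5, 41)
T, LA, LO = np.meshgrid(t_new, lat_new, lon_new, indexing="ij")
fine = interp(np.stack([T.ravel(), LA.ravel(), LO.ravel()], axis=-1)).reshape(T.shape)
print(fine.shape)   # (19, 21, 41)
```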

  • This paper describes a collaborative project conducted by the Computer Science Department at Southern Connecticut State University and NASA's Goddard Institute for Space Studies (GISS). Animations of output from a climate simulation mathematical model used at GISS to predict rainfall and circulation have been produced for West Africa from June to September 2002. These early results have assisted scientists at GISS in evaluating the accuracy of the RM3 climate model against similar results obtained from satellite imagery. The results presented below will be refined to better meet the needs of GISS scientists and will be expanded to cover other geographic regions over a variety of time frames.

  • Nanoparticles are of interest in many applications since their decreased size may give them properties that are very different from those of bulk material. Nanoparticle properties such as size (diameter) and size distribution are often evaluated using transmission electron microscopy (TEM). These parameters can be obtained more easily from digitized TEM images by mapping particle signal to black and background pixels to white, a process known as thresholding, and then performing an algorithm known as particle analysis. The goal of this study was to compare the ability of several popular thresholding algorithms to segment TEM images. Performance of the thresholding algorithms was evaluated through qualitative and quantitative measures. Results show that the choice of thresholding algorithm strongly affects the results obtained from particle analysis. © 2007 Materials Research Society.
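
The threshold-then-particle-analysis workflow described above can be sketched with scikit-image. The Otsu method stands in for the several algorithms the study compared, the file name is a placeholder, and the assumption that particles are darker than the background is illustrative only.

```python
# Illustrative sketch (not the study's code): threshold a grayscale TEM image
# and run a simple particle analysis to estimate particle sizes.
import numpy as np
from skimage import io, filters, measure

image = io.imread("tem_nanoparticles.tif", as_gray=True)   # placeholder file name

# Thresholding: map particle signal to foreground, background to 0
t = filters.threshold_otsu(image)        # one of many possible algorithms
binary = image < t                       # assumes particles are darker than background

# Particle analysis: label connected regions and measure size
labels = measure.label(binary)
diams = [2.0 * np.sqrt(r.area / np.pi) for r in measure.regionprops(labels)]
print(f"{len(diams)} particles, mean equivalent diameter = {np.mean(diams):.1f} px")
```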

  • In scientific imaging, it is crucial to obtain precise images to facilitate accurate observations for the given application. However, the imaging equipment used to acquire such images often introduces error into the observed image, so there is a fundamental need to remove this error in order to facilitate accurate observations. This study investigates the effectiveness of an image processing technique utilizing an iterative deconvolution algorithm to remove error from micro-CT images. The technique is applied to several sets of in-vivo micro-CT scans of mice, and its effectiveness is evaluated by qualitative comparison of the resultant thresholded binary images to those produced by more conventional image processing techniques, namely Gaussian filtering and straight thresholding. Results of this study suggest that iterative deconvolution as a pre-processing step produces superior qualitative results compared with the more conventional methods tested, and they lay the groundwork for future quantitative verification. ©2005 IEEE.
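
The abstract does not name the specific deconvolution algorithm; the sketch below uses a Richardson-Lucy style update as one common example of iterative deconvolution, with an assumed Gaussian point-spread function. It is a stand-in for the idea, not the study's implementation.

```python
# Hedged sketch of an iterative (Richardson-Lucy style) deconvolution step.
# The actual algorithm and PSF used in the study are not specified here.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=25):
    """Simple iterative deconvolution; `observed` and `psf` are 2-D arrays."""
    estimate = np.full(observed.shape, float(observed.mean()))
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Assumed Gaussian PSF (illustrative only)
x = np.arange(-7, 8)
g = np.exp(-x**2 / (2 * 2.0**2))
psf = np.outer(g, g)
psf /= psf.sum()
```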

  • The purpose of this study was to compare the ability of several texture analysis parameters to differentiate textured samples from a smooth control on images obtained with an Atomic Force Microscope (AFM). Surface roughness plays a major role in materials science, especially in integrated electronic devices. As these devices become smaller and smaller, new materials with better electrical properties are needed. New materials with smoother surface morphology have been found to have better electrical properties than their rougher counterparts; therefore, in many cases surface texture is indicative of the electrical properties that the material will have. Physical vapor deposition techniques such as Jet Vapor Deposition and Molecular Beam Epitaxy are being utilized to synthesize these materials, as they have been found to create pure and uniform thin layers. For the current study, growth parameters were varied to produce a spectrum of textured samples. The focus of this study was the image processing techniques associated with quantifying surface texture. Because of the limited sample size, there was no attempt to draw conclusions about specimen processing methods. The samples were imaged using an AFM in tapping mode. In the process of collecting images, it was discovered that roughness data was depicted much better in the microscope's "height" mode than in "equal area" mode. The AFM quantified the surface texture of each image by returning RMS roughness and the first-order histogram statistics of mean roughness, standard deviation, skewness, and kurtosis. Color images from the AFM were then processed on an offline computer running NIH ImageJ with an image texture plug-in. This plug-in produced another set of first-order statistics computed from each image's histogram as well as second-order statistics computed from each image's co-occurrence matrix. The second-order statistics, originally proposed by Haralick, include contrast, angular second moment, correlation, inverse difference moment, and entropy; these features were computed in the 0°, 45°, 90°, and 135° directions. The findings of this study suggest that the best combination of quantitative texture parameters is standard deviation, 0° inverse difference moment, and 0° entropy, all of which are obtained from the NIH ImageJ texture plug-in. © 2010 Copyright SPIE - The International Society for Optical Engineering.
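
A minimal sketch of the second-order (co-occurrence) statistics named above, computed in the four directions the abstract mentions. It uses scikit-image rather than the ImageJ plug-in from the study; the pixel distance and gray-level count are assumptions, and entropy is computed directly from the normalized matrix because scikit-image's `graycoprops` does not report it.

```python
# Illustrative Haralick-style co-occurrence features at 0, 45, 90 and 135 degrees.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def cooccurrence_features(image_8bit):
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]     # 0, 45, 90, 135 degrees
    glcm = graycomatrix(image_8bit, distances=[1], angles=angles,
                        levels=256, symmetric=True, normed=True)
    feats = {name: graycoprops(glcm, name)[0]             # one value per direction
             for name in ("contrast", "ASM", "correlation", "homogeneity")}
    p = glcm[:, :, 0, :]                                   # (levels, levels, n_angles)
    feats["entropy"] = -np.sum(p * np.log2(p + 1e-12), axis=(0, 1))
    return feats
```

Here "homogeneity" plays the role of the inverse difference moment listed in the abstract; the naming differs between tools.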

  • There is an acute and well-documented need for image processing of microscopy data in materials science regarding, for example, the characterization of the structure/property relationship of a given materials system. In our work, image processing has been used as a framework for conducting interdisciplinary team-based research that effectively integrates programs within the Center for Research on Interface Structures and Phenomena (CRISP) Materials Research Science and Engineering Center (MRSEC), e.g., research experiences for undergraduates (REU), teachers (RET), and high school fellowships. This research resulted from a five-year collaboration between CRISP and the Physics and Computer Science Departments at Southern Connecticut State University (SCSU). This paper will focus on the implementation of team-based research experiences as a vehicle for interdisciplinary science and education. Representative results of several of the studies are presented and discussed. © 2011 Materials Research Society.

  • Temporal and spatial analysis was applied to a sequence of cloud top pressure (CTP) images and cloud optical thickness (TAU) images, and a storm tracking algorithm was proposed. A sequence of storm tracks was developed from the satellite images. Composite images were created by projecting ahead in time and substituting the first valid pixel for missing data, and a variety of CTP and TAU cut-off values were used to identify regions of interest. Region correspondences were determined from one time frame to the next, yielding the storm center coordinates. The resulting tracks were compared to storm tracks computed from sea level pressure data by matching the results first in time and then in spatial distance.
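
A simplified sketch of the region-correspondence step described above: apply a cut-off to each frame, label the resulting regions, and link each region to the nearest region centroid in the next frame. The cut-off value, the assumption that low pixel values mark storm regions, and the nearest-centroid matching rule are all illustrative assumptions rather than the paper's exact algorithm.

```python
# Illustrative region matching between two consecutive image frames.
import numpy as np
from scipy import ndimage

def track_step(frame_t, frame_t1, cutoff):
    """Link each region at time t to the nearest region centroid at time t+1."""
    mask0, mask1 = frame_t < cutoff, frame_t1 < cutoff     # e.g. low cloud-top pressure
    lab0, n0 = ndimage.label(mask0)
    lab1, n1 = ndimage.label(mask1)
    c0 = ndimage.center_of_mass(mask0, lab0, range(1, n0 + 1))
    c1 = ndimage.center_of_mass(mask1, lab1, range(1, n1 + 1))
    links = []
    for p in c0:
        if c1:
            q = min(c1, key=lambda c: (c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2)
            links.append((p, q))                           # storm center at t -> t+1
    return links
```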

  • The objective of this study is to compare geometric-based and evolutionary techniques for tracking storm systems from sequences of satellite images. Analysis was applied to the International Satellite Cloud Climatology Project low resolution D1 database for selected storm systems during September 1988. During this time period there were two exceptionally long tracks of major hurricane systems, Hurricanes Gilbert and Helene. Cloud top pressure and cloud optical thickness were used to identify storm systems. The ability of the geometric-based and evolutionary techniques to generate tracks through storm regions was assessed. Differences in final tracking results between the two techniques resulted not only from the differences in methodology but also from differences in the type of preprocessed input used by each technique. Tracking results were compared to results disseminated by the Colorado State/Tropical Prediction Center and maintained by the National Hurricane Center in Miami, Florida. For the hurricanes investigated in this study, both techniques were able to generate tracks that followed most or some portions of the hurricanes. The evolutionary algorithm was in general able to maintain good continuity along the tracks but, with no knowledge of overall region movement, was unable to discern which of two possible directions would be best to pursue when there were two or more equally close storm system components. The geometric method was able to maintain a smooth track close to the course of the hurricane except for confusion primarily at the beginning and/or end of tracks.

  • The primary goal of this research was to investigate the ability of quantitative variables to confirm qualitative improvements of the deconvolution algorithm as a preprocessing step in evaluating micro-CT bone density images. The analysis of these types of images is important because they are necessary to evaluate various countermeasures used to reduce or potentially reverse the bone loss experienced by some astronauts when exposed to extended weightlessness during space travel. Nine low resolution (17.5 microns) CT bone density image sequences, ranging from 85 to 88 images per sequence, were processed with three preprocessing treatment groups: no preprocessing, preprocessing with a deconvolution algorithm, and preprocessing with a Gaussian filter. The quantitative parameters investigated consisted of the Bone Volume to Total Volume Ratio, the Structure Model Index, Fractal Dimension, Bone Area Ratio, Bone Thickness Ratio, Euler's Number, and the Measure of Enhancement. Trends found in these quantitative variables appear to corroborate the visual improvements observed in the past and suggest which quantitative parameters may be capable of distinguishing between groups that experience bone loss and others that do not.
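
As a small illustration of the first parameter named above, the Bone Volume to Total Volume ratio can be computed directly from a segmented image stack. The sketch below assumes the stack has already been thresholded to a boolean bone mask and that an optional region-of-interest mask is available; neither detail is taken from the study.

```python
# Illustrative BV/TV computation from a binary (already segmented) CT stack.
import numpy as np

def bone_volume_fraction(binary_stack, roi_mask=None):
    """binary_stack: 3-D boolean array (slices, rows, cols) with bone voxels True."""
    if roi_mask is None:
        roi_mask = np.ones_like(binary_stack, dtype=bool)
    bone_voxels  = np.count_nonzero(binary_stack & roi_mask)
    total_voxels = np.count_nonzero(roi_mask)
    return bone_voxels / total_voxels        # BV/TV
```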

  • The primary goal of this research was to provide image processing support to aid in identifying the subjects most affected by bone loss when exposed to weightlessness and to provide insight into the causes of the large variability. Past research has demonstrated that genetically distinct strains of mice exhibit different degrees of bone loss when subjected to simulated weightlessness. Bone loss is quantified by in vivo computed tomography (CT) imaging. The first step in evaluating bone density is to segment gray scale images into separate regions of bone and background. Two of the most common methods for implementing image segmentation are thresholding and edge detection. Thresholding is generally considered the simplest segmentation process; it can be carried out by having a user visually select a threshold using a sliding scale, a highly subjective process with great potential for variation from one observer to another. One way to reduce inter-observer variability is to have several users independently set the threshold and average their results, but this is very time consuming. A better approach is to apply an objective adaptive technique such as the Ridler-Calvard method. In our study we concluded that thresholding was better than edge detection and that pre-processing these images with an iterative deconvolution algorithm prior to adaptive thresholding yields superior visualization compared with images that have not been pre-processed or that have been pre-processed with a filter.
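
The Ridler-Calvard adaptive technique mentioned above is an iterative (ISODATA-style) threshold: the threshold moves to the midpoint of the mean gray levels above and below it until it stops changing. The short explicit loop below is a generic sketch of that idea (scikit-image's `threshold_isodata` implements the same scheme); the stopping tolerance is an assumption.

```python
# Minimal sketch of the Ridler-Calvard (ISODATA) adaptive threshold.
import numpy as np

def ridler_calvard(image, tol=0.5):
    t = float(image.mean())                        # initial guess
    while True:
        below, above = image[image <= t], image[image > t]
        if below.size == 0 or above.size == 0:
            return t
        t_new = 0.5 * (below.mean() + above.mean())
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
```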

  • Nanoparticles, particles with a diameter of 1-100 nanometers (nm), are of interest in many applications including device fabrication, quantum computing, and sensing because their size may give them properties that are very different from those of bulk materials. Further advancement of nanotechnology cannot be obtained without an increased understanding of nanoparticle properties such as size (diameter) and size distribution, which are frequently evaluated using transmission electron microscopy (TEM). In the past, these parameters have been obtained from digitized TEM images by manually measuring and counting many of these nanoparticles, a task that is highly subjective and labor intensive. More recently, computer imaging particle analysis has emerged as an objective alternative that counts and measures objects in a binary image. This paper will describe the procedures used to preprocess a set of grayscale TEM images so that they could be correctly thresholded into binary images, allowing a more accurate assessment of the size and frequency (size distribution) of nanoparticles. Several preprocessing methods, including pseudo flat field correction and rolling ball background correction, were investigated, with the rolling ball algorithm yielding the best results. Examples of particle analysis will be presented for different types of materials and different magnifications. In addition, a method based on the results of particle analysis for identifying and removing small noise particles will be discussed. This filtering technique is based on identifying the location of small particles in the binary image and removing them without affecting the size of other, larger particles.
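
A hedged sketch of two of the steps the abstract names: rolling-ball background correction followed by thresholding and removal of small noise particles. The ball radius, minimum particle size, file name, and the assumption of bright particles on a dark background are illustrative choices, and `rolling_ball` is available only in recent scikit-image versions.

```python
# Illustrative preprocessing pipeline (not the paper's exact procedure).
from skimage import io, filters, morphology, restoration

image = io.imread("tem_particles.tif", as_gray=True)        # placeholder file name

background = restoration.rolling_ball(image, radius=50)     # estimate slow background
corrected  = image - background

# Assumes bright particles; invert the image first if particles are dark.
binary  = corrected > filters.threshold_otsu(corrected)
cleaned = morphology.remove_small_objects(binary, min_size=20)  # drop tiny noise blobs
```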

  • Thresholding is an image processing procedure used to convert an image consisting of gray level pixels into a black and white binary image. One application of thresholding is particle analysis. Once foreground objects are separated from the background, a quantitative analysis characterizing the number, size and shape of particles is obtained, which can then be used to evaluate a series of nanoparticle samples. Numerous thresholding techniques exist, differing primarily in how they deal with variations in noise, illumination and contrast. In this paper, several popular thresholding algorithms are qualitatively and quantitatively evaluated on transmission electron microscopy (TEM) and atomic force microscopy (AFM) images. Initially, six thresholding algorithms were investigated: Otsu, Ridler-Calvard, Kittler, Entropy, Tsai and Maximum Likelihood. The Ridler-Calvard algorithm was not included in the quantitative analysis because it did not produce acceptable qualitative results for the images in the series. Two quantitative measures were used to evaluate these algorithms: one based on comparing object area and the other on object diameter before and after thresholding. For AFM images the Kittler algorithm yielded the best results, followed by the Entropy and Maximum Likelihood techniques. The Tsai algorithm yielded the top results for TEM images, followed by the Entropy and Kittler methods.
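
The comparison described above can be sketched by running several threshold algorithms on the same image and tabulating the resulting object counts, areas, and equivalent diameters. Only Otsu and ISODATA (Ridler-Calvard) map directly onto scikit-image functions; Yen and Li are used here merely as stand-ins for the entropy-style methods in the paper, and the file name is a placeholder.

```python
# Illustrative quantitative comparison of thresholding algorithms.
import numpy as np
from skimage import io, filters, measure

image = io.imread("afm_sample.tif", as_gray=True)            # placeholder file name
methods = {"otsu": filters.threshold_otsu, "isodata": filters.threshold_isodata,
           "yen": filters.threshold_yen, "li": filters.threshold_li}

for name, fn in methods.items():
    binary = image > fn(image)                               # assumes bright objects
    labels = measure.label(binary)
    areas = np.array([r.area for r in measure.regionprops(labels)])
    diams = 2.0 * np.sqrt(areas / np.pi) if areas.size else np.array([0.0])
    print(f"{name:8s} objects={areas.size:4d} "
          f"mean_area={areas.mean() if areas.size else 0:.1f} "
          f"mean_diam={diams.mean():.1f}")
```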

  • The use of Transmission Electron Microscopy (TEM) to characterize the microstructure of a material continues to grow in importance as technological advancements become increasingly dependent on nanotechnology [1]. Since nanoparticle properties such as size (diameter) and size distribution are often important in determining potential applications, a particle analysis is often performed on TEM images. Traditionally done manually, this has the potential to be labor intensive, time consuming, and subjective [2]. To resolve these issues, automated particle analysis routines are becoming more widely accepted within the community [3]. When using such programs, it is important to compare their performance in terms of functionality and cost. The primary goal of this study was to apply one such software package, ImageJ, to grayscale TEM images of nanoparticles of known size. A secondary goal was to compare this popular open-source general purpose image processing program to two commercial software packages. After a brief investigation of performance and price, ImageJ was identified as the software best suited for the particle analysis conducted in the study. While many ImageJ functions were used, the ability to break agglomerations that occur during specimen preparation into separate particles using a watershed algorithm was particularly helpful [4]. © 2009 SPIE-IS&T.
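
The watershed step mentioned above (ImageJ's binary watershed) splits touching particles by flooding the distance transform of the binary image from its local maxima. The sketch below is a rough scikit-image analogue of that idea, not the ImageJ implementation; the peak footprint size is an illustrative assumption.

```python
# Rough analogue of a binary watershed for separating touching particles.
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_agglomerates(binary):
    distance = ndimage.distance_transform_edt(binary)       # distance to background
    peaks = peak_local_max(distance, footprint=np.ones((7, 7)), labels=binary)
    markers = np.zeros(binary.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)  # one seed per particle core
    return watershed(-distance, markers, mask=binary)       # one label per particle
```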
