  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
301

Automatic isochoric apparatus for PVT and phase equilibrium studies of natural gas mixtures

Zhou, Jingjun 15 May 2009 (has links)
We have developed a new automatic apparatus in our laboratory for measuring the phase equilibrium and pVT properties of natural gas mixtures. Based on the isochoric method, the apparatus operates at temperatures from 200 K to 500 K and pressures up to 35 MPa, and yields absolute results in fully automated operation. Temperature measurements are accurate to 10 mK and pressure measurements to 0.002 MPa. The isochoric method records pressure versus temperature along an isomole and detects phase boundaries by locating the change in slope of the isochores. Experimental data from four gas samples show that cubic equations of state, such as Peng-Robinson and Soave-Redlich-Kwong, have 1-20% errors in predicting hydrocarbon mixture dew points. The data also show that the AGA 8-DC92 equation of state has errors as large as 0.6% in predicting hydrocarbon mixture densities when extrapolated beyond its normal composition range.
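The slope-change criterion at the heart of the isochoric method can be sketched numerically: fit straight lines to the two branches of a p-T isochore and take their intersection as the phase-boundary estimate. The data, slopes, and split point below are synthetic illustrations, not values from the apparatus described above.

```python
import numpy as np

def phase_boundary_from_isochore(T, p, split):
    """Fit lines to the two branches of an isochore and return the
    temperature where they intersect (the phase-boundary estimate)."""
    m1, b1 = np.polyfit(T[:split], p[:split], 1)   # two-phase branch
    m2, b2 = np.polyfit(T[split:], p[split:], 1)   # single-phase branch
    return (b2 - b1) / (m1 - m2)                   # intersection temperature

# Synthetic isochore with a slope change (kink) at T = 260 K
T = np.linspace(240.0, 300.0, 61)
p = np.where(T < 260.0, 0.08 * (T - 240.0) + 5.0, 0.05 * (T - 260.0) + 6.6)
T_dew = phase_boundary_from_isochore(T, p, split=int(np.searchsorted(T, 260.0)))
```

In practice the split point is not known in advance; it can be chosen by scanning candidate break points and minimizing the combined residual of the two fits.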
302

Characterization of livestock herds in extensive agricultural settings in southwest Texas

Dominguez, Brandon James 15 May 2009 (has links)
Because of an ever-increasing threat of foreign animal disease outbreaks in the United States, there is a desire to develop strategies to prevent the occurrence of a foreign animal disease and to control an outbreak if one does occur. Infectious disease models have been developed and are being used to determine reasonable mitigation strategies. However, little information is available concerning premises characteristics and animal movement in extensively managed livestock areas; hence, adapting these models to areas of low livestock density is difficult. We collected empirical data, via mail-out surveys, from an extensively managed livestock area to help improve the results of infectious disease models in these areas. In contrast to the intensively managed livestock that have previously been modeled, this study has shown that in areas of low livestock density, multiple livestock types are often managed on the same premises. Direct contacts, facilitated through the planned movement of animals, appear to have greater seasonality in extensively managed areas than in intensively managed areas. Furthermore, wildlife contacts are likely and of potential importance. The results of this study add to the knowledge base used to model the spread of infectious disease in extensively managed livestock populations. Seasonal changes in animal densities and contact rates may affect model results. Additionally, the presence of multiple livestock types on a premises should be considered when modeling the expected spread of disease in extensive livestock areas.
303

Theory and simulation of colloids near interfaces: quantitative mapping of interaction potentials

Lu, Mingqing 15 May 2009 (has links)
The behavior of dense colloidal fluids near surfaces can now be probed in great detail with experimental techniques such as video and confocal microscopy. In fact, we are approaching a point where quantitative comparisons of experiments with particle-level theory, such as classical density functional theory (DFT), are appropriate. In a forward sense, we may use a known surface potential to predict a particle density distribution function from DFT; in an inverse sense, we may use an experimentally measured particle density distribution function to predict the underlying surface potential from DFT. In this dissertation, we tested the ability of closure-based DFT to perform forward and inverse calculations on potential models commonly employed for colloidal particles and surfaces with different surface topographies. To reduce sources of uncertainty in this initial study, Monte Carlo simulation results played the role of experimental data. The accuracy of the predictions depended on the bulk particle density, the potential well depth, and the choice of DFT closure relationships. For a reasonable range of choices of the density, temperature, potential parameters, and surface features, the inversion procedure yielded particle-surface potentials to an accuracy on the order of 0.1 kBT. Our results demonstrate that DFT is a valuable numerical tool for microscopy experiments to image three-dimensional surface energetic landscapes accurately and rapidly.
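In the dilute limit, the inverse problem described above reduces to Boltzmann inversion of the measured density profile, V(z) = -kBT ln(rho(z)/rho_bulk); the sketch below illustrates that limiting case only, not the closure-based DFT inversion of the dissertation. All profiles and parameters are synthetic.

```python
import numpy as np

K_B_T = 1.0  # work in units of kBT

def boltzmann_invert(rho, rho_bulk):
    """Dilute-limit inversion: recover the surface potential (in kBT)
    from a particle density profile."""
    return -K_B_T * np.log(rho / rho_bulk)

# Synthetic density profile generated from a known test potential
z = np.linspace(0.1, 5.0, 50)
v_true = 4.0 * np.exp(-z)               # hypothetical repulsive wall
rho = 0.8 * np.exp(-v_true / K_B_T)     # Boltzmann-distributed dilute gas
v_est = boltzmann_invert(rho, rho_bulk=0.8)
```

At the dense conditions studied in the dissertation, particle-particle correlations make this naive inversion inaccurate, which is precisely why a closure-based DFT treatment is needed.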
304

Estimating Rio Grande wild turkey densities in Texas

Locke, Shawn Lee 02 June 2009 (has links)
Rio Grande wild turkeys (Meleagris gallopavo intermedia) are a highly mobile, wide-ranging, and secretive species found throughout the arid regions of Texas. As a result of declines in turkey abundance within the Edwards Plateau and other areas, the Texas Parks and Wildlife Department initiated a study to evaluate methods for estimating Rio Grande wild turkey abundance. Unbiased methods for determining wild turkey abundance have long been desired, and although several different methods have been examined, few have been successful. The study objectives were to: (1) review current and past methods for estimating turkey abundance, (2) evaluate the use of portable thermal imagers to count roosting wild turkeys in three ecoregions, and (3) determine the effectiveness of distance sampling from the air and ground for estimating wild turkey densities in the Edwards Plateau ecoregion of Texas. Based on the literature review and a decision matrix, I selected two methods for field evaluation (i.e., an infrared camera for detecting roosting turkeys, and distance sampling from the air and ground). I conducted eight ground and aerial forward-looking infrared (FLIR) surveys (4 Edwards Plateau, 3 Rolling Plains, and 1 Gulf Prairies and Marshes) of roost sites during the study. In the three regions evaluated, I was unable to detect roosting turkeys aerially with the portable infrared camera because of the altitude required for safe helicopter flight and a lack of thermal contrast. A total of 560 km of aerial transects and 10 road-based transects (800 km) also were conducted in the Edwards Plateau, but neither method yielded a sufficient sample size to generate an unbiased estimate of turkey abundance. Aerial and ground distance sampling were limited by terrain and dense vegetation, and aerial FLIR surveys by a lack of thermal contrast. Study results suggest aerial FLIR and ground applications for estimating Rio Grande wild turkey density are of limited value in Texas.
In my opinion, a method for estimating Rio Grande wild turkey densities on a regional scale does not currently exist. Therefore, the Texas Parks and Wildlife Department should reconsider estimating trends or using indices to monitor turkey numbers on a regional scale.
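Distance sampling, as attempted above, converts detection counts along transects into a density estimate through a detection function; a minimal line-transect sketch with a half-normal detection function follows. The detection distances, transect length, and scale parameter are hypothetical, and in practice sigma is fitted to the observed perpendicular distances rather than assumed.

```python
import math

def halfnormal_density(distances, transect_length, sigma):
    """Line-transect density estimate with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)).
    Effective strip half-width: mu = sigma * sqrt(pi / 2).
    Density: D = n / (2 * mu * L)."""
    n = len(distances)
    mu = sigma * math.sqrt(math.pi / 2.0)
    return n / (2.0 * mu * transect_length)

# Hypothetical survey: 40 detections along 560 km of transect,
# with an assumed detection scale of 0.1 km
d_hat = halfnormal_density([0.05] * 40, transect_length=560.0, sigma=0.1)
```

The small sample sizes reported in the study are exactly the failure mode of this estimator: with too few detections, neither sigma nor D can be estimated reliably.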
305

Effect of Density Gradient Centrifugation on Quality and Recovery Rate of Equine Sperm

Edmond, Ann J. May 2009 (has links)
Density gradient centrifugation of sperm is a common assisted-reproduction procedure in humans used to improve semen quality. The technique separates sperm based on their isopycnic points. Sperm with morphologic abnormalities are often more buoyant, leading to their retention above centrifuged density gradients, while structurally normal sperm pass through the gradient. Three experiments were conducted to evaluate the effects of tube size, sperm number, and density gradient volume (height) on stallion sperm quality and recovery rate in sperm pellets following centrifugation. In all three experiments, equine semen was first centrifuged to increase sperm concentration. In Experiment 1, one-mL aliquots were layered over EquiPure™ Bottom Layer (1-Layer) or over tiered EquiPure™ Top and Bottom Layers (2-Layer). In Experiment 2, one-mL aliquots were layered over three different heights of EquiPure™ Bottom Layer in 15-mL or 50-mL conical-bottom tubes. In Experiment 3, four different aliquots containing sperm loads of 1-4x were layered over a constant volume of EquiPure™ Bottom Layer in 15-mL or 50-mL conical-bottom tubes. The tubes were then centrifuged. The resulting sperm pellets were evaluated for morphologic quality, DNA integrity, motility and recovery rate. Sperm-EquiPure™ centrifugation yielded improvements in motility, morphology and DNA integrity parameters (P<0.05) compared to controls. The 1-Layer method resulted in a higher recovery rate than the 2-Layer method (P<0.05). Sperm processed in the 15-mL tubes yielded higher velocity and higher recovery rates than sperm processed in the 50-mL tubes (P<0.05). Within tube type, gradient volume did not affect semen quality or recovery rate. Increasing the sperm number loaded for density gradient centrifugation decreased the recovery rate (P<0.05) when 15-mL tubes were used.
306

Experimental Characterization and Molecular Study of Natural Gas Mixtures

Cristancho Blanco, Diego Edison May 2010 (has links)
Natural gas (NG) plays an important role in meeting energy demand in the United States and throughout the world. Its characteristics as a clean, versatile and sustainable source of energy make it an important alternative within the spectrum of energy resources. Addressing industrial and academic needs in natural gas research requires an integrated plan combining experimentation, modeling and simulation. In this work, high-accuracy PρT data have been measured with a high-pressure single-sinker magnetic suspension densimeter. A complete uncertainty analysis of this apparatus reveals that the uncertainty of the density data is less than 0.05% across the entire ranges of temperature (200 to 500) K and pressure (up to 200 MPa). These characteristics make the PρT data measured in this study unique in the world. Additionally, both a low-pressure (up to 35 MPa) and a high-pressure (up to 200 MPa) isochoric apparatus have been developed during this project. These apparatuses, in conjunction with a recently improved isochoric technique, allow determination of the phase envelope for NG mixtures with an uncertainty of 0.45% in temperature, 0.05% in pressure and 0.12% in density. Additionally, an innovative technique based upon Coherent Anti-Stokes Raman Scattering (CARS) and Gas Chromatography (GC) was proposed in this research to minimize the high uncertainty introduced by the composition analyses of NG mixtures. The collected set of PρT and saturation data is fundamental for thermodynamic formulations of these mixtures. A study at the molecular level has provided molecular data for a selected set of the main constituents of natural gas. A 50-50% methane-ethane mixture was studied by molecular dynamics simulation. The results showed that simulation times longer than 2 ns were necessary to obtain reasonable deviations in the density determinations when compared to accurate standards.
Finally, this work proposed a new mixing rule to incorporate isomeric effects into cubic equations of state.
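The cubic equations of state discussed above can be solved for density at a given temperature and pressure; a minimal Peng-Robinson sketch for pure methane follows, using standard textbook critical constants rather than the mixing rule proposed in the dissertation.

```python
import numpy as np

R = 8.314462  # universal gas constant, J/(mol K)

def pr_gas_density(T, p, Tc, pc, omega, M):
    """Peng-Robinson gas-phase density in kg/m^3 for a pure component.
    Solves the cubic in the compressibility factor Z and takes the
    largest real root (the gas root)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / pc * alpha
    b = 0.07780 * R * Tc / pc
    A = a * p / (R * T)**2
    B = b * p / (R * T)
    coeffs = [1.0, -(1.0 - B), A - 3*B**2 - 2*B, -(A*B - B**2 - B**3)]
    Z = max(z.real for z in np.roots(coeffs) if abs(z.imag) < 1e-10)
    return p * M / (Z * R * T)

# Methane at 300 K and 10 MPa (critical constants from standard tables)
rho = pr_gas_density(300.0, 10.0e6, Tc=190.56, pc=4.599e6,
                     omega=0.011, M=0.016043)
```

Deviations of such cubic-EOS densities from high-accuracy reference data are on the order of a few percent at high pressure, which is what motivates reference formulations and refined mixing rules.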
307

Effect of different density, inbuilt objects and feeds on white shrimp (Litopenaeus vannamei) cultured in zero-exchange system

Kang, Hao-ling 07 February 2006 (has links)
A zero-exchange system does not discharge nutrient-rich pond water to the environment, so it does not cause pollution; in addition, nutrients in the water can be re-used and the amount of water consumed is reduced. In shrimp ponds, ammonia and nitrite accumulate during cultivation because large amounts of protein-rich feed are given. In traditional culture systems, a great deal of water is used to solve this problem. In a zero-exchange system, nitrifying bacteria convert ammonia and nitrite into nitrate, so the need for water exchange is much reduced. In this study, shrimp (Litopenaeus vannamei) were cultured in FRP tanks, and organic particles were kept in suspension by strong aeration. Four experiments were conducted: (1) substrate: net-liners, net-pockets, mats, coral granules, and a blank control; (2) density: 100, 200, 300 and 400 shrimp/m2; (3) feeds: white shrimp feed (39% protein), tiger shrimp feed (39% protein) and Kuruma prawn feed (54% protein); (4) zero-exchange versus water exchange (35% of pond water per week). Net-pockets gave the highest yield and final weight and the lowest FCR, and net-liners isolated the largest amount of suspended particles. Final weight decreased as stocking density increased, and white shrimp grew more slowly when density exceeded 200/m2. Feed containing more protein improved the efficiency of feed use by the shrimp, but at a higher feed cost; tiger shrimp feed (39% protein) alone could also achieve high yields. Except for orthophosphate, the ammonia, nitrite and nitrate levels did not differ from those of the water-exchange system. Survival, final weight and FCR did not differ between the two treatments, but yield was higher in the zero-exchange system. In conclusion, shrimp cultured in a zero-exchange system at a density of 200/m2, with net-pocket substrate and feed containing 39% protein, achieved optimum yield.
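The yield and feed conversion ratio (FCR) comparisons above rest on two simple calculations, sketched here with hypothetical numbers; the initial stocking weight is assumed negligible relative to the final biomass.

```python
def fcr(feed_kg, biomass_gain_kg):
    """Feed conversion ratio: kg of feed offered per kg of biomass gained
    (lower is better)."""
    return feed_kg / biomass_gain_kg

def yield_per_m2(final_count, mean_weight_g, area_m2):
    """Harvest yield in g/m^2 from survivor count and mean final weight."""
    return final_count * mean_weight_g / area_m2

# Hypothetical tank: 200 shrimp/m^2 stocked in 1 m^2, 85% survival,
# mean final weight 12 g, 3.5 kg of feed offered over the trial
survivors = int(200 * 0.85)
y = yield_per_m2(survivors, 12.0, 1.0)        # yield in g/m^2
ratio = fcr(3.5, survivors * 12.0 / 1000.0)   # kg feed per kg gained
```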
308

A QUANTITATIVE ANALYSIS OF THE VOCABULARY IN THE FIRST VOLUME OF TAIWANESE SENIOR HIGH SCHOOL ENGLISH TEXTBOOKS

Lin, Chia-hsin 14 September 2006 (has links)
This study probes the quantitative aspects of vocabulary in the first volumes of the three major senior high (SH) school English textbooks and the three major vocational high (VH) school English textbooks. Not only the vocabulary lists but also the unlisted new words in the related sections, categorized into 22 corpora, are explored and compared in terms of the number of new words, the consistency between junior high school (JH) vocabulary lists and SH/VH textbooks, the new-word density, and the frequency of word exposures. In addition to the six commercial SH/VH English textbooks from three major publishers (Far East, Lungteng, and Sanmin), four JH word lists are included: two word lists of the old centralized Junior High School Required (Word-JHA) and Elective (Word-JHB) English Course by the National Institute for Compilation and Translation, and two new word lists of 1,000 productive vocabulary words (Word-JH1000) and 1,000 receptive vocabulary words (Word-JH2000) by the MOE. The major findings of this study are as follows: 1. The number of new words, particularly of unlisted new words, is large. SH students who learn both Word-JH1000 and Word-JH2000 face 9.05%~13.36% unlisted new words in reading sections and 17.92%~22.55% in the whole textbooks; VH students who learn both lists face 5.77%~12.62% in reading sections and 16.05%~21.90% in the whole textbooks. 2. Even though the new JH vocabulary lists (Word-JH1000 & Word-JH2000) overlap the 22 SH/VH corpora more than the old JH vocabulary lists (Word-JHA & Word-JHB), the consistency of vocabulary between JH and SH/VH is not adequate to reach the "all-or-nothing threshold" (80% known words in a given text). 3.
Both SH and VH textbooks are too dense with new words to reach the "probabilistic threshold" (95% known words in a given text), the density index of an efficient first-year textbook. 4. The frequency of word exposures is too low for words to be well learned (more than 80% fall beneath the six-exposure threshold; more than 40% are one-timers). The findings have pedagogical implications for policy-makers, publishers, and JH/SH/VH teachers and students.
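The coverage thresholds above (80% and 95% known running words) can be computed by checking each running word of a text against a word list; a toy sketch with a hypothetical mini word list and sentence:

```python
def coverage(tokens, known_words):
    """Fraction of running words in a text covered by a known-word list."""
    known = sum(1 for t in tokens if t.lower() in known_words)
    return known / len(tokens)

# Hypothetical known-word list and sample text (not from the textbooks studied)
word_list = {"the", "students", "read", "a", "new", "book", "in", "class"}
text = "The students read a new novel in class".split()

c = coverage(text, word_list)   # "novel" is the only unknown word
dense_enough = c >= 0.95        # "probabilistic threshold" check
```

The new-word density reported in the study is simply 1 - coverage, computed over each whole textbook corpus instead of one sentence.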
309

Bootstrapping in a high dimensional but very low sample size problem

Song, Juhee 16 August 2006 (has links)
High Dimension, Low Sample Size (HDLSS) problems have received much attention recently in many areas of science. Analysis of microarray experiments is one such area. Numerous studies are ongoing to investigate the behavior of genes by measuring mRNA (messenger ribonucleic acid) abundance, i.e., gene expression. The HDLSS data investigated in this dissertation consist of a large number of data sets, each of which has only a few observations. We assume a statistical model in which measurements from the same subject have the same expected value and variance, and all subjects have the same distribution up to location and scale. Information from all subjects is shared in estimating this common distribution. Our interest is in testing the hypothesis that the mean of measurements from a given subject is 0. Commonly used tests of this hypothesis, the t-test, the sign test and traditional bootstrapping, do not necessarily provide reliable results, since there are only a few observations in each data set. We propose a mixture model having C clusters and 3C parameters to overcome the small-sample-size problem. Standardized data are pooled after assigning each data set to one of the mixture components. To obtain reasonable initial parameter estimates for the density estimation methods, we apply clustering methods, including agglomerative clustering and K-means. The Bayesian Information Criterion (BIC) and a new criterion, WMCV (Weighted Mean of within-Cluster Variance estimates), are used to choose an optimal number of clusters. Density estimation methods, including a maximum likelihood unimodal density estimator and kernel density estimation, are used to estimate the unknown density. Once the density is estimated, a bootstrapping algorithm that draws samples from the estimated density is used to approximate the distribution of the test statistics.
The t-statistic and an empirical likelihood ratio statistic are used, since their distributions are completely determined by the distribution common to all subjects. A method to control the false discovery rate is used to perform simultaneous tests on all small data sets. Simulated data sets and a set of cDNA (complementary deoxyribonucleic acid) microarray experiment data are analyzed by the proposed methods.
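The pooling-and-bootstrap idea can be sketched in simplified form: resample from pooled standardized observations to approximate the null distribution of the t-statistic for a tiny sample. The pool below is synthetic standard-normal data rather than the mixture-model density estimate of the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def t_stat(x):
    """One-sample t-statistic for H0: mean = 0."""
    return np.mean(x) / (np.std(x, ddof=1) / np.sqrt(len(x)))

def bootstrap_pvalue(sample, pooled, n_boot=2000):
    """Approximate the two-sided p-value of the t-statistic by resampling
    size-n sets from the pooled standardized data (mean 0 under H0)."""
    n = len(sample)
    t_obs = t_stat(sample)
    t_null = np.array([t_stat(rng.choice(pooled, size=n, replace=True))
                       for _ in range(n_boot)])
    return float(np.mean(np.abs(t_null) >= abs(t_obs)))

# Pool standing in for standardized residuals from many small data sets
pooled = rng.standard_normal(5000)
p = bootstrap_pvalue(np.array([0.1, -0.2, 0.15]), pooled)
```

With only 3 observations per data set, this pooled null distribution is far more stable than anything estimable from a single data set, which is the point of sharing information across subjects.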
310

Low Density Nuclear Matter in Heavy Ion Collisions

Qin, Lijun 14 January 2010 (has links)
The symmetry energy is the energy difference between symmetric nuclear matter and pure neutron matter at a given density. Around normal nuclear density, i.e. ρ/ρ0 = 1, and temperature T = 0, the symmetry energy is approximately 23.5 MeV/nucleon for finite nuclear matter and 30 MeV/nucleon for infinite nuclear matter, but at other densities the symmetry energy is very poorly understood. Since the symmetry energy is very important in understanding many aspects of heavy ion reactions, nuclear structure, and nuclear astrophysics, many different models have been developed and some predictions of the density dependence of the symmetry energy have been made. Intermediate-energy heavy ion collisions provide a unique tool to probe the nuclear equation of state. The initial compression and the thermal shock in Fermi-energy heavy ion collisions lead naturally to the production of nucleonic matter at varying temperatures and densities, which are interesting in this context. Since light particle emission witnesses each stage of the reaction, it carries essential information on the early dynamics and on the degree of equilibration at each stage. The kinematic features and yields of emitted light particles and clusters in the invariant velocity frame have been exploited to probe the nature of the intermediate system and to obtain information on the equation of state (EOS), with emphasis on the properties of the low-density participant matter produced in such collisions. In order to pursue this effort and broaden the density range over which the symmetry energies are experimentally determined, we have carried out a series of experiments on the reactions of 112Sn and 124Sn with projectiles ranging from 4He, 10B, 20Ne and 40Ar to 64Zn, all at the same energy per nucleon, 47 MeV/u. In this series of experiments, different collision systems should lead to different average densities.
By careful comparison of the yields, spectra and angular distributions observed for particle emission from these different systems, we attempted to cleanly separate early emission resulting from nucleon-nucleon collisions from that resulting from evaporation from the thermalized system, and to obtain a much cleaner picture of the dynamic evolution of the hotter systems. The Albergo model has been used to calculate the density and temperature, and symmetry free energies have been extracted with the isoscaling technique for systems with different N/Z ratios. These are compared with results of the Roepke model. Other models, such as VEOS, Lattimer, and Shen-Toki, have also been used to calculate the alpha mass fraction in order to further understand the properties of low-density matter.
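The isoscaling technique mentioned above fits fragment yield ratios between two systems of different N/Z to R21(N,Z) = C exp(alpha*N + beta*Z); a minimal least-squares sketch on synthetic ratios follows. The fragment list and parameter values are illustrative, not from the Sn experiments.

```python
import numpy as np

def fit_isoscaling(N, Z, R21):
    """Least-squares fit of ln R21 = ln C + alpha*N + beta*Z,
    returning the isoscaling parameters (alpha, beta)."""
    X = np.column_stack([np.ones_like(N), N, Z])
    coef, *_ = np.linalg.lstsq(X, np.log(R21), rcond=None)
    _lnC, alpha, beta = coef
    return alpha, beta

# Synthetic yield ratios built with alpha = 0.5, beta = -0.4 (hypothetical)
N = np.array([1.0, 2.0, 1.0, 2.0, 3.0])   # fragment neutron numbers
Z = np.array([1.0, 1.0, 2.0, 2.0, 2.0])   # fragment proton numbers
R21 = 1.2 * np.exp(0.5 * N - 0.4 * Z)
alpha, beta = fit_isoscaling(N, Z, R21)
```

The fitted alpha is the quantity that, combined with the difference in (Z/A)^2 between the two systems, yields the symmetry free energy coefficient.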
