781

Insight Driven Sampling for Interactive Data Intensive Computing

Masiane, Moeti Moeklesia 24 June 2020 (has links)
Data Visualization is used to help humans perceive high dimensional data, but it cannot be applied in real time to data intensive computing applications. Attempts to process and apply traditional information visualization techniques to such applications result in slow or non-responsive applications. For such applications, sampling is often used to reduce big data to smaller data so that the benefits of data visualization can be brought to data intensive applications. Sampling allows data visualization to be used as an interface between humans and insights contained in the big data of data intensive computing. However, sampling introduces error. The objective of sampling is to reduce the amount of data being processed without introducing too much error into the results of the data intensive application. To determine an adequate level of sampling, one can use statistical measures like standard error. However, such measures do not translate well for cases involving data visualization. Knowing the standard error of a sample can tell you very little about the visualization of that data. What is needed is a measure that allows system users to make an informed decision on the level of sampling needed to speed up a data intensive application. In this work we introduce an insight based measure for the impact of sampling on the results of visualized data. We develop a framework for the quantification of the level of insight, model the relationship between the level of insight and the amount of sampling, use this model to provide data intensive computing users with the ability to control the amount of sampling as a function of user provided insight requirements, and we develop a prototype that utilizes our framework. This work allows users to speed up data intensive applications with a clear understanding of how the speedup will impact the insights gained from the visualization of this data. Starting with a simple one dimensional data intensive application, we apply our framework and work our way to a more complicated computational fluid dynamics case as a proof of concept of the application of our framework and insight error feedback measure for those using sampling to speed up data intensive computing. / Doctor of Philosophy / Data Visualization is used to help humans perceive high dimensional data, but it cannot be applied in real time to computing applications that generate or process vast amounts of data, also known as data intensive computing applications. Attempts to process and apply traditional information visualization techniques to such data result in slow or non-responsive data intensive applications. For such applications, sampling is often used to reduce big data to smaller data so that the benefits of data visualization can be brought to data intensive applications. Sampling allows data visualization to be used as an interface between humans and insights contained in the big data of data intensive computing. However, sampling introduces error. The objective of sampling is to reduce the amount of data being processed without introducing too much error into the results of the data intensive application. This error results from the possibility that a data sample could exclude valuable information that was included in the original data set. To determine an adequate level of sampling, one can use statistical measures like standard error. However, such measures do not translate well for cases involving data visualization. 
Knowing the standard error of a sample can tell you very little about the visualization of that data. What is needed is a measure that allows one to make an informed decision about how much sampling to use in a data intensive application, based on how sampling affects the insights people gain from a visualization of the sampled data. In this work we introduce an insight based measure for the impact of sampling on the results of visualized data. We develop a framework for the quantification of the level of insight, model the relationship between the level of insight and the amount of sampling, use this model to provide data intensive computing users with an insight based feedback measure for each arbitrary sample size they choose for speeding up data intensive computing, and we develop a prototype that utilizes our framework. Our prototype applies our framework and insight based feedback measure to a computational fluid dynamics (CFD) case, but our work starts off with a simple one dimensional data application and works its way up to the more complicated CFD case. This work allows users to speed up data intensive applications with a clear understanding of how the speedup will impact the insights gained from the visualization of this data.
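For context on the statistical baseline mentioned in this abstract, here is a minimal sketch (hypothetical data, not the dissertation's framework) of how the standard error of a sample mean shrinks as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for one numeric column of a large data set (hypothetical values).
population = rng.normal(loc=50.0, scale=10.0, size=1_000_000)

for n in (100, 1_000, 10_000, 100_000):
    sample = rng.choice(population, size=n, replace=False)
    se = sample.std(ddof=1) / np.sqrt(n)  # standard error of the sample mean
    print(f"n={n:>7}  mean={sample.mean():6.2f}  standard error={se:.4f}")
```

The standard error falls roughly as 1/sqrt(n), but it says nothing about whether a visualization built from the sample still supports the same insights as the full data, which is the gap the insight-based measure is intended to fill.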
782

Adaptive Sampling Line Search for Simulation Optimization

Ragavan, Prasanna Kumar 08 March 2017 (has links)
This thesis is concerned with the development of algorithms for simulation optimization (SO), a special case of stochastic optimization where the objective function can only be evaluated through noisy observations from a simulation. Deterministic techniques, when directly applied to simulation optimization problems, fail to converge because they cannot handle randomness, so more sophisticated algorithms are required. However, many existing algorithms dedicated to simulation optimization perform poorly in practice because they require extensive parameter tuning. To overcome these shortfalls of existing SO algorithms, we develop ADALINE, a line search based algorithm that eliminates the need for any user defined parameters. ADALINE is designed to identify a local minimum on continuous and integer ordered feasible sets. On a continuous feasible set, ADALINE mimics deterministic line search algorithms, while on integer ordered feasible sets it iterates between a line search and an enumeration procedure in its quest to identify a local minimum. ADALINE improves upon many of the existing SO algorithms by determining the sample size adaptively as a trade-off between the error due to estimation and the optimization error; that is, the algorithm expends simulation effort proportional to the quality of the incumbent solution. We also show that ADALINE converges "almost surely" to the set of local minima. Finally, our numerical results suggest that ADALINE converges to a local minimum faster, outperforming other advanced SO algorithms that utilize variable sampling strategies. To demonstrate the performance of our algorithm on a practical problem, we apply ADALINE to a surgery rescheduling problem. In the rescheduling problem, the objective is to minimize the cost of disruptions to an existing schedule shared between multiple surgical specialties while accommodating semi-urgent surgeries that require expedited intervention. The disruptions to the schedule are determined using a threshold based heuristic, and ADALINE identifies the threshold levels for the various surgical specialties that minimize the expected total cost of disruption. A comparison of the solutions obtained using a Sample Average Approximation (SAA) approach and ADALINE is provided. We find that the adaptive sampling strategy in ADALINE identifies a better solution more quickly than SAA. / Ph. D.
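As a rough illustration of the adaptive-sampling idea described in this abstract (this is not the ADALINE algorithm itself; the test objective, replication rule, and constants are hypothetical), the sketch below adds simulation replications at a point until the estimation error is small relative to the scale of the estimate, then uses those estimates inside a simple line search:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(x: float) -> float:
    """One noisy observation of a smooth objective: (x - 2)^2 plus Gaussian noise."""
    return (x - 2.0) ** 2 + rng.normal(scale=0.5)

def estimate(x: float, min_reps: int = 10, max_reps: int = 500) -> float:
    """Adaptively add replications until the standard error is small relative to the estimate."""
    obs = [simulate(x) for _ in range(min_reps)]
    while len(obs) < max_reps:
        se = np.std(obs, ddof=1) / np.sqrt(len(obs))
        if se <= 0.1 * max(abs(np.mean(obs)), 1.0):  # tie simulation effort to solution quality
            break
        obs.append(simulate(x))
    return float(np.mean(obs))

# A crude line search along a fixed descent direction, mimicking the deterministic case.
x, step = 0.0, 1.0
best = estimate(x)
while step > 1e-2:
    trial = estimate(x + step)
    if trial < best:          # accept the step
        x, best = x + step, trial
    else:                     # otherwise contract the step length
        step *= 0.5
print(f"approximate minimizer: x = {x:.3f}, estimated objective = {best:.3f}")
```

The key design choice mirrored here is that points with a poor (large) objective estimate need only a coarse estimate, while promising points near the incumbent receive more replications before a step is accepted or rejected.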
783

Comparison of Quantitative and Semi-Quantitative Assessments of Benthic Macroinvertebrate Community Response to Elevated Salinity in central Appalachian Coalfield Streams

Pence, Rachel A. 18 January 2019 (has links)
Anthropogenic salinization of freshwater is a global concern. In freshwater environments, elevated levels of major ions, measured as total dissolved solids (TDS) or specific conductance (SC), can cause adverse effects on aquatic ecosystem structure and function. In central Appalachia, eastern USA, studies largely rely on Rapid Bioassessment Protocols with semi-quantitative sampling to characterize benthic macroinvertebrate community response to increased salinity caused by surface coal mining. These protocols require subsampling procedures and identification of fixed numbers of individuals regardless of organism density, limiting measures of community structure. Quantitative sampling involves enumeration of all individuals collected within a defined area and typically includes larger sample sizes relative to semi-quantitative sampling, allowing expanded characterization of the benthic community. Working in central Appalachia, I evaluated quantitative and semi-quantitative methods for bioassessments in headwater streams salinized by coal mining during two time periods. I compared the two sampling methods for capability to detect SC-induced changes in the macroinvertebrate community. Quantitative sampling consistently produced higher estimates of taxonomic richness than corresponding semi-quantitative samples, and differences between sampling methods were found for community composition, functional feeding group, dominance, tolerance, and habit metrics. Quantitative methods were generally stronger predictors of benthic community-metric responses to SC and were more sensitive for detecting SC-induced changes in the macroinvertebrate community. Quantitative methods are advantageous compared to semi-quantitative sampling methods when characterizing benthic macroinvertebrate community structure because they provide more complete estimates of taxonomic richness and diversity and produce metrics that are stronger predictors of community response to elevated SC. / Master of Science / Surface coal mining in central Appalachia, eastern USA, contributes to increased salinity of surface waters, causing adverse effects on water quality and aquatic life. Stream condition is often evaluated through sampling of benthic macroinvertebrates because they are ubiquitous in aquatic environments and differ in sensitivity to various types of pollution and environmental stressors. In central Appalachia, studies have largely relied on semi-quantitative sampling methods to characterize effects of elevated salinity on benthic macroinvertebrate communities in headwater streams. These methods are ‘semi-quantitative’ because processing of samples requires subsampling procedures and identification of a fixed number of individuals, regardless of the number of organisms that were originally collected. In contrast, quantitative sampling involves identification and counting of all collected individuals, often resulting in organism counts that are much higher than those of semi-quantitative samples. Quantitative samples are typically more time-consuming and expensive to process but allow for expanded description of the benthic macroinvertebrate community and characterization of community-wide response to an environmental stressor such as elevated salinity. 
Working in central Appalachian streams, I compared 1) depictions of benthic macroinvertebrate community structure; 2) benthic community response to elevated salinity; and 3) the minimum levels of salinity associated with community change between quantitative and semi-quantitative methods. Quantitative sampling methods provide many advantages over semi-quantitative methods by providing more complete enumerations of the taxa present, thus enhancing the ability to evaluate aquatic-life condition and to characterize overall benthic macroinvertebrate community response to elevated salinity caused by surface coal mining.
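To make the subsampling point above concrete, here is a small sketch (hypothetical taxa and counts, for illustration only) showing how a fixed-count subsample tends to miss rare taxa that a full enumeration would record:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical benthic sample: 30 taxa with skewed abundances (a few dominants, many rare taxa).
n_taxa = 30
abundances = rng.geometric(p=0.15, size=n_taxa) * 10
individuals = np.repeat(np.arange(n_taxa), abundances)  # one entry per collected individual

full_richness = len(np.unique(individuals))                    # quantitative: identify everything
subsample = rng.choice(individuals, size=200, replace=False)   # semi-quantitative: fixed 200-count
subsample_richness = len(np.unique(subsample))

print(f"taxonomic richness, full count: {full_richness}; 200-count subsample: {subsample_richness}")
```

Because rare taxa make up a small fraction of the individuals, a fixed-count subsample frequently excludes them, which is consistent with the lower richness estimates reported for the semi-quantitative method.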
784

Analyses of Two Aspects of Study Design for Bioassessment With Benthic Macroinvertebrates: Single Versus Multiple Habitat Sampling and Taxonomic Identification Level

Hiner, Stephen W. 03 February 2003 (has links)
Bioassessment is the concept of evaluating the ecological condition of habitats by surveying the resident assemblages of living organisms. Bioassessment with benthic macroinvertebrates is still evolving and continues to be refined. There are strongly divided opinions about study design, sampling methods, laboratory analyses, and data analysis. Two issues currently being debated about study design for bioassessment in streams were examined here: 1) what habitats within streams should be sampled, and 2) is it necessary to identify organisms to the species level? The influence of habitat sampling design and level of taxonomic identification on the interpretation of ecological conditions of ten small streams in western Virginia was examined. Cattle watering and grazing heavily affected five of these streams (impaired sites). The other five streams, with no recent cattle activity or other human impact, were considered to be reference sites because they were minimally impaired and represented best attainable conditions. Inferential and non-inferential statistical analyses concluded that a multiple habitat sampling design was more effective than a single habitat design (riffle only) at distinguishing impaired conditions, regardless of taxonomic level. It appeared that sampling design (riffle habitat versus multiple habitats) is more important than taxonomic identification level for distinguishing reference and impaired ecological conditions in this bioassessment study. All levels of taxonomic resolution studied showed that the macroinvertebrate assemblages at the reference and impaired sites were very different and that the assemblages at the impaired sites were adversely affected by perturbation. This study supported the sampling of multiple habitats and identification to the family level as a design for best determining the ecological condition of streams in bioassessment. / Master of Science
785

Airborne Campylobacter in a Poultry Processing Plant

Johnson, Anjeanette Christina 25 May 2010 (has links)
Campylobacter is a foodborne pathogen commonly found in live poultry and raw poultry products. Identifying areas of contamination or modes of transmission during commercial processing can lead to strategies to reduce the level of Campylobacter on finished products. Monitoring levels of airborne Campylobacter may be useful for identifying the presence or relative concentration of the pathogen in a processing plant environment. In this study, air sampling was used to detect and quantify Campylobacter in a commercial chicken processing plant by location within the plant and collection time during the day. Air was sampled from the evisceration and post-chill areas of a poultry processing plant on four days, at 4-hour intervals, onto Campy-Cefex agar plates or gelatin filters that were subsequently transferred to Campy-Cefex agar plates. Additionally, pre-evisceration and post-chill carcass rinses were analyzed quantitatively for Campylobacter. The mean level of airborne Campylobacter was 5 CFU/1000 L of air sampled (10% of samples positive) compared with 413 CFU/mL from carcass rinses (70% of samples positive). Higher concentrations were found in carcass rinse samples from pre-evisceration. Airborne Campylobacter was detected in the evisceration area more frequently than in the post-chill carcass area of the plant (P < 0.05). This study shows that airborne Campylobacter can be quantified with a selective agar and with gelatin filter collection. Further research is needed to prove the utility of airborne detection of Campylobacter for estimating the relative contamination level of live poultry flocks and the processing plant environment and the potential for cross-contamination. / Master of Science in Life Sciences
786

Quantitative Recovery of Listeria monocytogenes and Salmonella enterica from Environmental Sampling Media

Bazaco, Michael Constantine 27 January 2005 (has links)
Environmental sampling is a pathogen monitoring technique that has become important in the food industry. Many food processing companies have adopted environmental sampling as a way to verify good manufacturing practices and sanitation plans in their facilities. Environmental sampling is helpful because it gives better information on the source of product contamination than end product sampling. Two specific pathogens of concern to the food industry are Listeria monocytogenes and Salmonella enterica. Environmental samples are rarely analyzed immediately, but instead may be batched for later analysis or shipped to an off-site testing facility. Multiple media on the market today are used for storage and transport of environmental samples. These various media types, together with differences in holding temperature and time, create variability in test sample conditions. Select time, temperature, and media combinations were tested to determine their effect on Listeria monocytogenes and Salmonella enterica populations during transport and storage of samples. Cocktails of Listeria monocytogenes and Salmonella enterica were added separately to sample tubes containing D/E Neutralizing Broth, Neutralizing Buffer, or Copan SRK Solution. Bacterial counts at 0, 12, 24, and 48 hours post-inoculation were compared. Neutralizing Buffer and Copan SRK Solution maintained consistent bacterial populations at all temperatures. At 10°C and 15°C, D/E Broth supported bacterial growth. This study helps validate the use of D/E Neutralizing Broth, Neutralizing Buffer, and Copan SRK Solution for environmental sample transport and storage at proper holding temperatures. At temperatures above 10°C, Neutralizing Buffer or Copan SRK Solution should be used if quantifying microbial recovery. / Master of Science
787

Decision Making Tools for Optimizing Environmental Sampling Plans for Listeria in Poultry Processing Plants

Al Wahaimed, Abdullah Saud 08 July 2022 (has links)
Meat and poultry slaughtering and processing practices have been associated with microbial contamination by Listeria spp. Ready-to-eat poultry products have been considered a primary vehicle in Listeria monocytogenes illness outbreaks. Developing environmental monitoring programs (EMPs) that are based on product and/or process risk level analysis is a useful approach to reduce contamination in poultry processing plants and enhance food safety. Sampling criteria based on product risk levels and process control in ready-to-eat poultry processing facilities were developed to allow users to design and conduct appropriate sampling plans targeting Listeria spp. After developing the criteria, an internet-based environmental monitoring program ("EZSafety") was developed to allow poultry producers to enhance their sample collection and analysis of test results over time and conduct appropriate sampling plans for Listeria spp. and other microbiological indicators. The frontend of the program website was built using React Native (an open-source JavaScript library for building user interfaces). The backend of the program website was built using Node.js, which executes JavaScript code outside a web browser. MongoDB was used as a document-oriented database for the website. The program was evaluated by 20 food safety professionals to assess its ability to develop appropriate sampling plans to target Listeria spp. The majority of these participants believed that EZSafety has several tools that are effective for targeting Listeria spp. and other indicators and for enhancing environmental monitoring. Additionally, most participants agreed that EZSafety is organized and user-friendly. EMPs can play a significant role in improving the detection rate and the prevention of Listeria spp. and other indicators in poultry processing plants. / Master of Science in Life Sciences / Meat and poultry slaughtering and processing practices have been associated with microbial contamination by a bacterium known as Listeria. Cooked poultry products have been considered a primary vehicle in Listeria monocytogenes (a disease-causing type of bacteria) illness outbreaks. Developing environmental monitoring plans to detect and prevent this bacterium in poultry processing establishments is a useful approach to reduce contamination and enhance food safety. Several guidelines and baselines were developed to allow users to design and conduct appropriate environmental monitoring plans to target this bacterium. After developing these guidelines and baselines, an internet-based environmental monitoring program ("EZSafety") was developed to allow poultry processors to enhance their sample collection and analysis of test results over time. The program was developed using several open-source platforms (JavaScript, React Native, and MongoDB), which were used to design, develop, and host the program on the internet. In order to validate its usefulness, the program was evaluated by 20 users who specialize in food safety and are familiar with poultry processing plant hygiene to assess its ability to suggest appropriate monitoring plans. Most of the participants believed that EZSafety has several tools that are effective for targeting Listeria and other kinds of bacteria and for enhancing environmental monitoring plans. Additionally, most participants agreed that EZSafety is organized and user-friendly. 
Such automated monitoring programs can play a significant role in enhancing the detection rate and the prevention of Listeria and other organisms in poultry processing facilities.
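As a sketch of the risk-based sampling logic described in this abstract (the risk levels, site counts, and frequencies below are hypothetical illustrations, not EZSafety's actual criteria), a plan can be looked up from a product's risk level and whether the process is currently in control:

```python
# Hypothetical mapping from (product risk level, process in control?) to a Listeria spp. swabbing plan.
SAMPLING_PLANS = {
    ("high",   False): {"sites": 20, "swabs_per_site": 2, "frequency": "daily"},
    ("high",   True):  {"sites": 12, "swabs_per_site": 1, "frequency": "daily"},
    ("medium", False): {"sites": 10, "swabs_per_site": 1, "frequency": "daily"},
    ("medium", True):  {"sites": 6,  "swabs_per_site": 1, "frequency": "weekly"},
    ("low",    False): {"sites": 6,  "swabs_per_site": 1, "frequency": "weekly"},
    ("low",    True):  {"sites": 3,  "swabs_per_site": 1, "frequency": "monthly"},
}

def sampling_plan(risk_level: str, in_control: bool) -> dict:
    """Return an environmental sampling plan targeting Listeria spp. for a processing area."""
    plan = dict(SAMPLING_PLANS[(risk_level, in_control)])
    plan["target"] = "Listeria spp."
    return plan

print(sampling_plan("high", in_control=False))
```

In a monitoring program of the kind described, each returned plan would be stored as a document (for example in MongoDB) alongside swab results, so that trends can be reviewed over time and the plan escalated when a process drifts out of control.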
788

NOISE AWARE BAYESIAN PARAMETER ESTIMATION IN BIOPROCESSES: USING NEURAL NETWORK SURROGATE MODELS WITH NON-UNIFORM DATA SAMPLING

Weir, Lauren January 2024 (has links)
This thesis demonstrates a parameter estimation technique for bioprocesses that utilizes measurement noise in experimental data to determine credible intervals on parameter estimates, information that is potentially useful for prediction, robust control, and optimization. To determine these estimates, the work implements Bayesian inference using nested sampling and presents an approach to developing neural network (NN) based surrogate models. To address challenges associated with non-uniform sampling of experimental measurements, an NN structure is proposed. The resultant surrogate model is utilized within a nested sampling algorithm that samples candidate parameter values from the parameter space and uses the NN to calculate model outputs for the likelihood function, which is based on the joint probability distribution of the noise in the output variables. This method is illustrated first on simulated data and then on experimental data from a Sartorius fed-batch bioprocess. The results demonstrate the feasibility of the proposed technique for enabling rapid parameter estimation for bioprocesses. / Thesis / Master of Applied Science (MASc) / Bioprocesses require models that can be developed quickly for rapid production of desired pharmaceuticals. Parameter estimation is necessary for these models, especially first-principles models. Generating parameter estimates with confidence intervals is important for model-based control. Challenges with parameter estimation that must be addressed are the presence of non-uniform sampling and measurement noise in experimental data. This thesis demonstrates a method of parameter estimation that generates parameter estimates with credible intervals by incorporating measurement noise in experimental data, while also employing a dynamic neural network surrogate model that can process non-uniformly sampled data. The proposed technique implements Bayesian inference using nested sampling and was tested against both simulated and real experimental fed-batch data.
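A minimal sketch of the likelihood construction described in this abstract (the stand-in surrogate, noise levels, measurement times, and parameter bounds are all hypothetical, not the thesis's trained network or data): the surrogate maps parameters to predicted outputs at the non-uniform measurement times, and a Gaussian log-likelihood weights each residual by that output's noise variance. A nested-sampling routine would then call log_likelihood and prior_transform repeatedly.

```python
import numpy as np

# Hypothetical measurements of two output variables at non-uniformly spaced times (hours).
t_obs = np.array([0.0, 0.5, 1.5, 4.0, 9.0])
y_obs = np.array([[1.0, 1.3, 2.1, 4.4, 8.9],    # e.g. biomass
                  [5.0, 4.8, 4.1, 2.6, 0.4]])   # e.g. substrate
sigma = np.array([0.2, 0.3])                    # assumed measurement noise per output variable

def surrogate(theta: np.ndarray) -> np.ndarray:
    """Stand-in for the neural-network surrogate: parameters -> predicted outputs at t_obs."""
    mu, k = theta
    biomass = np.exp(mu * t_obs)
    substrate = 5.0 - k * (biomass - 1.0)
    return np.vstack([biomass, substrate])

def log_likelihood(theta: np.ndarray) -> float:
    """Gaussian log-likelihood with independent noise on each output variable."""
    resid = y_obs - surrogate(theta)
    var = sigma[:, None] ** 2
    return float(-0.5 * np.sum(resid**2 / var + np.log(2.0 * np.pi * var)))

def prior_transform(u: np.ndarray) -> np.ndarray:
    """Map unit-cube samples to uniform priors: mu in [0, 0.5], k in [0, 1]."""
    return np.array([0.5 * u[0], u[1]])

print(log_likelihood(np.array([0.24, 0.55])))
```

Because the likelihood is weighted by each output's noise variance, noisier measurements pull less on the posterior, which is how the measurement noise ends up reflected in the width of the credible intervals.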
789

Nature versus design: the conformational propensities of D-amino acids and the importance of side chain chirality

Towse, Clare-Louise, Hopping, G.G., Vulovic, I.M., Daggett, V. 2014 (has links)
D-amino acids are useful building blocks for de novo peptide design, and they play a role in aging-related diseases associated with gradual protein racemization. For amino acids with achiral side chains, one should be able to presume that the conformational propensities of L- and D-amino acids are a reflection of one another due to the straightforward geometric inversion at the Cα atom. However, this presumption does not account for the directionality of the backbone dipole, and the inverted propensities have never been definitively confirmed in this context. Furthermore, little is known about how alternative side chain chirality affects the backbone conformations of isoleucine and threonine. Using a GGXGG host-guest pentapeptide system, we have completed exhaustive sampling of the conformational propensities of the D-amino acids, including D-allo-isoleucine and D-allo-threonine, using atomistic molecular dynamics simulations. Comparison of these simulations with the same systems hosting the cognate L-amino acids verifies that the intrinsic backbone conformational propensities of the D-amino acids are the inverse of those of their cognate L-enantiomers. Where amino acids have a chiral center in their side chain (Thr, Ile), the β-configuration affects the backbone sampling, which in turn can confer different biological properties. / NIH
790

An investigation of the origins of cattle and aurochs deposited in the Early Bronze Age barrows at Gayhurst and Irthlingborough

Towers, Jacqueline R., Montgomery, Janet, Evans, J., Jay, Mandy, Parker Pearson, M. 2009 (has links)
The Early Bronze Age round barrows at Irthlingborough, Northamptonshire and Gayhurst, Buckinghamshire contained remarkably large quantities of cattle (Bos taurus) remains. At Irthlingborough, at least 185 skulls with smaller numbers of mandibles, shoulder blades and pelves were found together with a small number of skeletal elements from aurochs (Bos primigenius). In contrast, the remains from Gayhurst are dominated by the limb bones from more than 300 animals. This study employed strontium isotope ratio analysis of cattle tooth enamel from 15 cattle and one aurochs to investigate the diversity of the animals’ origins at both sites and provide insights into Early Bronze Age funerary practices. Although strontium results show that most of the cattle and the aurochs included in this study were consistent with local origins, one animal from each barrow was born remotely, most likely in western Britain. In addition, a second Gayhurst animal was consistent with origins in a region of chalk rather than the local Jurassic sediments.
