101

Sequential optimal design of neurophysiology experiments

Lewi, Jeremy 31 March 2009 (has links)
For well over 200 years, scientists and doctors have been poking and prodding brains in every which way in an effort to understand how they work. The earliest pokes were quite crude, often involving permanent forms of brain damage. Though neural injury continues to be an active area of research within neuroscience, technology has given neuroscientists a number of tools for stimulating and observing the brain in very subtle ways. Nonetheless, the basic experimental paradigm remains the same: poke the brain and see what happens. For example, neuroscientists studying the visual or auditory system can easily generate any image or sound they can imagine to see how an organism or neuron will respond. Since neuroscientists can now easily design more pokes than they could ever deliver, a fundamental question is "What pokes should they actually use?" The complexity of the brain means that only a small number of the pokes scientists can deliver will produce any information about the brain. One of the fundamental challenges of experimental neuroscience is finding the right stimulus parameters to produce an informative response in the system being studied. This thesis addresses this problem by developing algorithms to sequentially optimize neurophysiology experiments. Every experiment we conduct contains information about how the brain works. Before conducting the next experiment, we should use what we have already learned to decide which experiment to perform next. In particular, we should design the experiment that will reveal the most information about the brain. At a high level, neuroscientists already perform this type of sequential, optimal experimental design; for example, crude experiments which knock out entire regions of the brain have given rise to modern experimental techniques which probe the responses of individual neurons using finely tuned stimuli. The goal of this thesis is to develop automated and rigorous methods for optimizing neurophysiology experiments efficiently and at a much finer time scale. In particular, we present methods for near-instantaneous optimization of the stimulus being used to drive a neuron.
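To make the design-observe-update loop concrete, here is a minimal sketch of greedy, information-maximizing stimulus selection. It assumes a linear-Gaussian response model so that the posterior update and the information gain are closed-form; the thesis itself targets more realistic spiking-neuron models, and the candidate pool, noise level, and dimensionality below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, noise_var = 5, 0.5
w_true = rng.normal(size=dim)              # unknown "receptive field" (hypothetical)
mu, Sigma = np.zeros(dim), np.eye(dim)     # Gaussian prior over w

candidates = rng.normal(size=(200, dim))   # pool of candidate stimuli
candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)

for trial in range(30):
    # Expected information gain of stimulus x is 0.5*log(1 + x' Sigma x / noise_var),
    # so greedily pick the stimulus the posterior is currently most uncertain about.
    quad = np.einsum('ij,jk,ik->i', candidates, Sigma, candidates)
    x = candidates[np.argmax(quad)]
    y = x @ w_true + rng.normal(scale=np.sqrt(noise_var))   # "poke" the neuron
    # Conjugate (rank-one Kalman) posterior update:
    Sx = Sigma @ x
    gain = Sx / (noise_var + x @ Sx)
    mu = mu + gain * (y - x @ mu)
    Sigma = Sigma - np.outer(gain, Sx)

print("posterior mean error:", float(np.linalg.norm(mu - w_true)))
```

Each iteration scores every candidate, delivers the winner, and conditions the posterior on the response before choosing the next stimulus, which is exactly the sequential structure the abstract describes.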
102

Vitamin supplementation of sows

Shelton, Nicholas William January 1900 (has links)
Doctor of Philosophy / Department of Animal Sciences and Industry / Jim Nelssen / A total of 701 pigs were used to evaluate the effects of natural versus synthetic vitamin E in sow diets, late gestation feeding level on sow reproductive performance, dietary L-carnitine and chromium on sow reproductive performance, and experimental design on nursery pig trial interpretation. As D-α-tocopheryl acetate increased in the sow's diet, concentrations of α-tocopherol increased (P < 0.03) in sow plasma, colostrum, milk, pig plasma, and pig heart. Regression analysis indicated that the bioavailability coefficients for D-α-tocopheryl acetate relative to DL-α-tocopheryl acetate ranged from 2.1 to 4.2 for sow and pig plasma α-tocopherol, 2.9 to 3.0 for colostrum α-tocopherol, 1.6 for milk α-tocopherol, 1.8 for heart α-tocopherol, and 2.0 for liver α-tocopherol. Overall, this study indicates that the relative bioavailability of D-α-tocopheryl acetate to DL-α-tocopheryl acetate varies with the response criterion but is greater than the standard potency value of 1.36. Increasing sow gestation feeding level by 0.9 kg from d 90 of gestation through farrowing reduced (P = 0.001) daily lactation feed intake in gilts but improved their conception rate, whereas increasing late gestation feeding level decreased conception rate in sows (interaction; P = 0.03). Increasing late gestation feed intake in gilts also increased (P < 0.02) pig weaning weights during the second parity. Increasing late gestation feeding levels did not improve performance of older sows. Adding L-carnitine and chromium from chromium picolinate to sow gestation and lactation diets reduced (P = 0.01) sow weight loss during lactation; however, it did not improve (P > 0.05) litter size, pig birth weight, or the variation in pig birth weight. Blocking pens of nursery pigs by BW in a randomized complete block design (RCBD) did not improve the estimate of σ²_error compared to a completely randomized design (CRD) in which all pens were allotted to have similar means and variations of body weight. The added degrees of freedom for the error term therefore gave the CRD more power than the RCBD to detect treatment differences.
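The design comparison at the end of this abstract turns on degrees of freedom: with the same pens, a CRD leaves more df in the error term than an RCBD, so its critical F value is smaller and power is higher whenever blocking fails to remove real variance. A short sketch, with hypothetical treatment and pen counts (the abstract does not report them):

```python
from scipy.stats import f

t, n = 6, 8                       # hypothetical: 6 diets, 8 pens per diet
df_trt = t - 1
df_err_crd = t * (n - 1)          # CRD error df: N - t
df_err_rcbd = (t - 1) * (n - 1)   # RCBD error df: (t-1)(b-1), with b = n blocks

for name, df_err in [("CRD", df_err_crd), ("RCBD", df_err_rcbd)]:
    # Larger error df means a smaller critical F, hence more power to
    # detect treatment differences at the same alpha.
    print(f"{name}: error df = {df_err}, F crit (alpha=0.05) = {f.ppf(0.95, df_trt, df_err):.3f}")
```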
103

Data-driven methods for exploratory analysis in chemometrics and scientific experimentation

Emerton, Guy 04 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2014. / ENGLISH ABSTRACT: Background: New methods to facilitate exploratory analysis of scientific data are in high demand. There is an abundance of available data used only for confirmatory analysis, from which new hypotheses could be drawn. To this end, two new exploratory techniques are developed: one for chemometrics and another for visualisation of fundamental scientific experiments. The former transforms large-scale multiple raw HPLC/UV-vis data into a conserved set of putative features, something not often attempted outside of mass spectrometry. The latter method ('StatNet') applies network techniques to the results of designed experiments to gain new perspective on variable relations. Results: The data format resulting from un-targeted chemometric processing was amenable to both chemical and statistical analysis, and proved to have integrity when machine-learning techniques were applied to infer attributes of the experimental set-up. The visualisation techniques were equally successful in generating hypotheses and were easily extendible to three different types of experimental results. Conclusion: The overall aim was to create useful tools for hypothesis generation across a variety of data. This has been largely reached through a combination of novel and existing techniques. It is hoped that the methods presented here will be further applied and developed.
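The abstract gives no implementation detail for 'StatNet'; purely to illustrate the general idea of turning experimental results into a network over variables, here is a hedged sketch in which nodes are measured variables and edges are strong pairwise correlations. The variable names, data, and threshold are all invented for the example and are not taken from the thesis.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
names = ["temp", "pH", "time", "yield", "purity"]   # hypothetical experiment variables
data = rng.normal(size=(40, len(names)))            # stand-in for designed-experiment results
data[:, 3] += 0.8 * data[:, 0]                      # inject a temp/yield relation to recover

corr = np.corrcoef(data, rowvar=False)
G = nx.Graph()
G.add_nodes_from(names)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.3:                   # keep only strong relations as edges
            G.add_edge(names[i], names[j], weight=round(float(corr[i, j]), 2))

print(list(G.edges(data=True)))                     # each edge is a candidate hypothesis
```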
104

Nonlinear design of geophysical surveys and processing strategies

Guest, Thomas January 2010 (has links)
The principal aim of all scientific experiments is to infer knowledge about a set of parameters of interest through the process of data collection and analysis. In the geosciences, large sums of money are spent on the data analysis stage, but much less attention is focussed on the data collection stage. Statistical experimental design (SED), a mature field of statistics, uses mathematically rigorous methods to optimise the data collection stage so as to maximise the amount of information recorded about the parameters of interest. The uptake of SED methods in geophysics has been limited because the majority of SED research is based on linear and linearised theories, whereas most geophysical methods are highly nonlinear, so the developed methods are not robust. Nonlinear SED methods are computationally demanding; the methods that exist to date therefore restrict designs to be either very simplistic or computationally infeasible, and so cannot be used in an industrial setting. In this thesis, I first show that it is possible to design industry-scale experiments for highly nonlinear problems within a computationally tractable time frame. Using an entropy-based method constructed on a Bayesian framework, I introduce an iteratively-constructive method that reduces the computational demand by introducing one new datum at a time for the design. The method reduces the multidimensional design space to a single-dimensional space at each iteration by fixing the experimental setup of the previous iteration. Both a synthetic experiment using a highly nonlinear parameter-data relationship and a seismic amplitude versus offset (AVO) experiment are used to illustrate that the results produced by the iteratively-constructive method closely match the results of a global design method at a fraction of the computational cost. This new method thus extends the class of iterative design methods to nonlinear problems, and makes fully nonlinear design methods applicable to higher-dimensional industrial-scale problems. Using the new iteratively-constructive method, I show how optimal trace profiles for processing amplitude versus angle (AVA) surveys that account for all prior petrophysical information about the target reservoir can be generated using fully nonlinear methods. I examine how the optimal selections change as our prior knowledge of the rock parameters and reservoir fluid content changes, and assess which of the prior parameters has the largest effect on the selected traces. The results show that optimal profiles are far more sensitive to prior information about reservoir porosity than to information about saturating fluid properties. By applying ray tracing methods, the AVA results can be used to design optimal processing profiles from seismic datasets, for multiple targets each with different prior model uncertainties. Although the iteratively-constructive method can be used to design the data collection stage, it is used here to select optimal data subsets post-survey. Using a nonlinear Bayesian SED method, I show how industrial-scale amplitude versus offset (AVO) data collection surveys can be constructed to maximise the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys.
The results show that the optimal design is highly dependent on the model parameters when a small number of receivers is used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered, I find that, in the case of AVO experiments, a design with constant spatial receiver separation is close to optimal. This explains why regularly-spaced 2D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging, in which homogeneous data coverage confers distinct advantages, but also in providing data to constrain subsurface petrophysical information. Finally, I discuss the implications of the new methods developed and assess which areas of geophysics would benefit from applying SED methods during the design stage.
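A toy sketch of the iteratively-constructive loop described above: earlier design points stay fixed, so each iteration searches only a one-dimensional space for the next datum. The forward function, the prior, the crude rejection update, and the prediction-variance proxy for information gain are all simplifying assumptions made here; the thesis uses an entropy-based criterion in a Bayesian framework.

```python
import numpy as np

def g(m, x):
    # Hypothetical nonlinear forward function: datum at design point x for model parameter m.
    return np.sin(m * x)

rng = np.random.default_rng(2)
m_samples = rng.uniform(0.5, 3.0, size=5000)   # samples from the prior over m
x_grid = np.linspace(0.1, 5.0, 200)            # the 1-D space searched at each iteration
noise, design = 0.1, []

for step in range(4):
    # Previous design points are fixed; only the newest datum is optimised.
    # Prediction variance across the surviving prior samples is a cheap proxy
    # for expected information gain: informative designs separate parameter values.
    scores = [np.var(g(m_samples, x)) for x in x_grid]
    best = x_grid[int(np.argmax(scores))]
    design.append(round(float(best), 2))
    # Simulate the measurement (true m = 2.0) and crudely condition the samples on it:
    y_obs = g(2.0, best) + rng.normal(scale=noise)
    keep = np.abs(g(m_samples, best) - y_obs) < 2 * noise
    if keep.sum() < 100:
        break
    m_samples = m_samples[keep]

print("design points chosen:", design)
```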
105

Bootstrap method to replicability: a nonparametric approach to Killeen's (2005) P_rep / CUHK electronic theses & dissertations collection

January 2014 (has links)
Killeen's (2005) P_rep is an estimator of the replicability of an experiment, defined as the probability of obtaining an effect of the same sign as that found in the original experiment. Since its introduction, however, the validity and reliability of P_rep have been challenged by a number of researchers. The present study aims at improving the performance of P_rep by applying the nonparametric bootstrap method in its computation; this bootstrap replication estimator is denoted P_Brep. A simulation study was carried out to compare the performance of Killeen's P_rep and the proposed P_Brep under different conditions. As expected, P_Brep gives a more accurate estimate than P_rep. However, P_Brep occasionally fails to work properly when the population effect size is zero, so there is still room for improvement. / Chan, Man Lok. / Thesis M.Phil. Chinese University of Hong Kong 2014. / Includes bibliographical references (leaves 37-40). / Title from PDF title page (viewed on 14 September 2016).
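A minimal sketch of the bootstrap idea: resample both groups with replacement and count how often the resampled mean difference keeps the observed sign. The thesis's exact resampling scheme may differ, and the data here are simulated.

```python
import numpy as np

def p_brep(x, y, n_boot=10_000, seed=0):
    """Bootstrap replicability: probability that a resampled replicate shows a
    mean difference of the same sign as the one observed."""
    rng = np.random.default_rng(seed)
    obs_sign = np.sign(x.mean() - y.mean())
    same = 0
    for _ in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)
        yb = rng.choice(y, size=y.size, replace=True)
        same += np.sign(xb.mean() - yb.mean()) == obs_sign
    return same / n_boot

rng = np.random.default_rng(1)
treatment = rng.normal(0.4, 1.0, 30)    # simulated effect, d of about 0.4
control   = rng.normal(0.0, 1.0, 30)
print(f"P_Brep estimate: {p_brep(treatment, control):.3f}")
```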
106

Influências das variáveis de processo de congelamento na qualidade final de pão tipo francês pré-assado / Influence of freezing process variables on the final quality of pre-baked French bread

Ota, Eliza Mami 24 February 2006 (has links)
Advisor: Vivaldo Silveira Junior / Dissertation (Master's) - Universidade Estadual de Campinas, Faculdade de Engenharia de Alimentos / Abstract: Most Brazilian bakery industries use traditional production processes, which take a total of roughly 4 to 6 hours. A new tendency is the application of refrigeration and freezing to bakery products, with the aim of reducing costs and production area while increasing the commercialization area. This work studies the influence of forced-convection freezing process conditions (tunnel air temperature and air velocity) on pre-baked breads formulated with and without additives, evaluating the physical (moisture and specific volume) and structural (texture) characteristics of the final products. Experimental tests were carried out according to a complete factorial experimental design. During the freezing process, the heat transfer rate was observed to decrease with time. The air temperature in the freezing tunnel was the factor that most influenced final product quality, with the lowest temperatures being the most deleterious. Tests in which the products were physically and structurally similar to traditionally processed breads also showed no significant differences in sensory attributes / Master's / Food Engineering / Master in Food Engineering
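For readers unfamiliar with complete factorial designs, the design matrix simply crosses every level of every factor. A sketch with assumed factor levels (the abstract does not report the actual settings used):

```python
from itertools import product

# Assumed levels -- invented for illustration, not taken from the study.
air_temps  = [-20, -30, -40]                 # tunnel air temperature (deg C)
velocities = [2.0, 4.0, 6.0]                 # tunnel air velocity (m/s)
formulas   = ["with additives", "without additives"]

runs = list(product(air_temps, velocities, formulas))
for i, (t, v, form) in enumerate(runs, 1):
    print(f"run {i:2d}: T = {t:>3} C, v = {v} m/s, {form}")
print("total runs:", len(runs))              # 3 x 3 x 2 = 18 treatment combinations
```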
107

Evaluation of Experimental Design Options in Environmental Nano-Science Research

Pokhrel, Lok R., Scheuerman, Phillip R., Dubey, Brajesh 26 October 2013 (has links)
As the experimental research design plays a pivotal role in executing a research problem, it is imperative for a researcher to develop a suitable and sound research design. Utilizing robust statistical methods can further enhance the study's power and thus allow drawing logical conclusions. The same holds true for basic environmental science research, including research related to the effects of engineered nanomaterials in the environment.
108

Comparison of response surface model and Taguchi methodology for robust design

Sudasna-na-Ayudthya, Prapaisri 01 December 1992 (has links)
The principal objective of this study was to compare the results of a proposed method based upon the response surface model to the Taguchi method. To modify the Taguchi method, the proposed model was developed to encompass the following objectives. The first, with the exception of the Taguchi inner array, was to obtain optimal design-variable settings with minimum variation while achieving the target value of nominal-the-best performance quality characteristics. The second was to eliminate the need for a noise matrix (that is, the Taguchi outer array), resulting in a significant reduction of the number of experimental runs required to implement the model. The final objective was to provide a method whereby signal-to-noise ratios could be eliminated as performance statistics. To implement the proposed method, a central composite design (CCD) experiment was selected as a second-order response surface design for the estimation of mean response functions. A Taylor series expansion was applied to obtain estimated variance expressions for a fitted second-order model. Performance measures, including mean squared error, bias, and variance, were obtained by simulation at the optimal settings. Nine test problems were developed to test the accuracy of the proposed CCD method. Statistical comparisons of the proposed method to the Taguchi method were performed. Experimental results indicated that the proposed response surface model can be used to provide significant improvement in product quality. Moreover, by reducing the number of experimental runs required relative to the Taguchi method, lower-cost process design can be achieved with the CCD method. / Graduation date: 1993
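A central composite design in coded units is straightforward to construct: 2^k factorial corners, 2k axial (star) points at distance alpha, and replicated center runs. A sketch for two factors, using the standard rotatable choice alpha = (2^k)^(1/4); the number of center points is illustrative:

```python
import numpy as np
from itertools import product

def central_composite(k, n_center=4):
    """Coded-unit CCD: 2^k factorial corners, 2k axial (star) points, center replicates."""
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    alpha = (2 ** k) ** 0.25                  # rotatable CCD: alpha = (number of corners)^(1/4)
    axial = np.vstack([sign * alpha * np.eye(k)[i] for i in range(k) for sign in (-1, 1)])
    center = np.zeros((n_center, k))
    return np.vstack([corners, axial, center])

X = central_composite(2)
print(X.shape)    # (12, 2): 4 corners + 4 star points (alpha of about 1.414) + 4 center runs
print(X)
```

A second-order model is then fitted to the responses at these runs by ordinary least squares, which is what makes the CCD suitable for estimating the mean response functions the abstract refers to.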
109

Analysis of dynamic robust design experiment and modeling approach for degradation testing

Bae, Suk Joo 01 December 2003 (has links)
No description available.
110

A Study of Crossflow Electro-microfiltration on the Treatment of Chemical Mechanical Polishing Wastewater

Tsai, Shiou-Hui 14 September 2001 (has links)
ABSTRACT: In this study, two chemical mechanical polishing (CMP) wastewaters were treated by crossflow electro-microfiltration, and the effects of operating parameters on treatment efficiency were examined. In the semiconductor industry, CMP has become the key technique for global planarization of interlevel dielectrics (ILD) and metal layers of wafers. The post-CMP cleaning process produces a great quantity of CMP wastewater, which typically consists of highly concentrated and stable abrasives, chemicals (e.g., oxidants and surfactants), and a tremendous mass of de-ionized water. Because the suspended solids in CMP wastewater are negatively charged, crossflow electro-microfiltration was used to treat this type of wastewater. By applying an electric field to the system, the negatively charged suspended solids are expelled from the membrane surface and move toward the anode. Applying an external electric field not only reduces cake formation on the membrane but also enhances the filtration rate and permeate flux. In this investigation, CMP wastewaters obtained from wafer fabs A and B were first characterized by various standard methods. In CMP wastewater A, the suspended solids were found to have a high negative zeta potential of about −78 mV, and the electrical conductivity was determined to be 127.2 μS/cm. Before testing, each CMP wastewater was pre-filtered using a filter paper with a pore size of 1.2 μm. An experimental design based on the Taguchi method was employed: L9 orthogonal arrays were used to investigate the effects of four experimental factors (electric field strength, crossflow velocity, transmembrane pressure, and membrane pore size) on the filtration rate and permeate quality in the crossflow electro-microfiltration system. When the applied electric field strength was lower than the critical electric field strength, increases in electric field strength, transmembrane pressure, and membrane pore size were found to benefit the filtration rate. The experimental results were further subjected to analysis of variance and regular analysis. For both CMP wastewaters A and B, the electric field strength and membrane pore size were determined to be very significant parameters. In this filtration system, the optimal treatment efficiency could be achieved by using a higher electric field strength, lower crossflow velocity, higher transmembrane pressure, and larger membrane pore size. The quality of the permeate thus obtained was even better than tap water quality standards; therefore, the permeate may be worth recycling for various purposes.
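For reference, the L9(3^4) orthogonal array mentioned above accommodates four three-level factors in nine runs, and main effects are read off as level means. The flux responses below are invented purely to show the mechanics, not taken from the study:

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors, 3 levels each.
L9 = np.array([
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
])
factors = ["E-field", "crossflow velocity", "TMP", "pore size"]
flux = np.array([12.0, 15.0, 18.0, 14.0, 19.0, 13.0, 20.0, 16.0, 17.0])  # hypothetical responses

# Main-effects analysis: mean response at each level of each factor.
for j, name in enumerate(factors):
    means = [flux[L9[:, j] == level].mean() for level in (1, 2, 3)]
    print(f"{name:18s} level means: {np.round(means, 2)}")
```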
