371

Liquid Chromatography Coupled to Mass Spectrometry : Implementation of Chemometric Optimization and Selected Applications

Moberg, My January 2006 (has links)
Liquid chromatography (LC) coupled to mass spectrometry (MS) offers highly selective and sensitive analysis of a wide variety of compounds. However, the use of hyphenated experimental set-ups means that many parameters may affect the studied response. Therefore, in order to determine optimized experimental conditions, it is of vital importance to incorporate systematic procedures during method development. In this thesis, a generic stepwise optimization strategy is proposed that aims at high chromatographic quality as well as high mass spectrometric response. The procedure comprises (i) screening experiments to identify the most important parameters, (ii) LC studies to ensure sufficient chromatographic separation, (iii) extended infusion experiments to maximize precursor signal(s), and, in the case of tandem MS, (iv) extended infusion experiments to determine optimal conditions for collision-induced dissociation and, when applicable, also ion trap settings. Experimental design and response surface methodology are used throughout the procedure. Further, the general applicability of LC-MS is demonstrated in this thesis. Specifically, a novel quantitative column-switched LC-MS method for ferrichrome, ferrichrysin and ferricrocin determination is presented. Using this method, it was shown how the siderophore content varies with depth in podzolic soil profiles in the north and south of Sweden. The parallel approach using LC coupled to both inductively coupled plasma (ICP) mass spectrometry and electrospray ionization (ESI) tandem MS is also evaluated as a tool to identify unknown siderophores in a sample. Additionally, different trypsin digestion schemes used for LC-ESI-MS peptide mapping were compared. By multivariate data analysis, it was clearly shown that the procedures tested induce differences that are detectable using LC-ESI-MS. Finally, the glutathione S-transferase catalyzed bioactivation of the prodrug azathioprine was verified using LC-MS.
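The screening step (i) of such a stepwise strategy can be sketched in a few lines: run a two-level factorial design and estimate each factor's main effect. The sketch below is a minimal illustration, not the thesis's actual procedure; the three factors and the simulated response are hypothetical.

```python
from itertools import product

def two_level_design(k):
    """Full 2^k factorial design with coded factor levels -1/+1."""
    return [list(run) for run in product([-1, 1], repeat=k)]

def main_effects(design, response):
    """Main effect of each factor: mean(y at +1) minus mean(y at -1)."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for run, y in zip(design, response) if run[j] == 1]
        lo = [y for run, y in zip(design, response) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Three hypothetical factors (say flow rate, modifier content, spray voltage)
design = two_level_design(3)
# Simulated response in which the first factor dominates
y = [10 + 4 * a + 1 * b + 0.5 * c for a, b, c in design]
print(main_effects(design, y))  # -> [8.0, 2.0, 1.0]
```

Factors with small estimated effects would be fixed, and the important ones carried forward to the response-surface steps.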
372

Errores en la búsqueda de condiciones robustas. Metodologías para evitarlos.

Pozueta Fernández, Maria Lourdes 10 December 2001 (has links)
The problem of finding operating conditions that are robust to the effect of uncontrolled factors is of great interest to companies, since robustness is a characteristic the market demands. There are essentially two methods for studying the problem: the one based on the approach proposed by G. Taguchi in the early 1980s, in which variability is estimated from product arrays and robust conditions are selected by minimizing the response metric; and the one that starts from a more economical design matrix, fits a model for the response Y as a function of the control and noise factors, and identifies robust conditions from the interactions between noise factors and control factors. Although one might expect very similar results when the same problem is analyzed by both routes, we have found examples where the conclusions are very different, and this motivated the present research into the causes of those differences. The investigation began by studying the nature of the surfaces associated with the variability induced by noise factors, proceeding sequentially with an increasing number of noise factors. We show that, regardless of whether the chosen metric is s2(Y), s(Y) or log(s(Y)), these surfaces can rarely be approximated by first-order polynomials in the control factors, leading to the conclusion that some of the strategies experimenters commonly use in practice are unlikely to yield a good understanding of this surface.
For example, it is not appropriate to place a Resolution III 2k-p design in the control factors of a product array; Resolution IV designs with center points are recommended. Next, two noise-driven sources of variation in the response, unknown to the experimenter, were assumed, and the sensitivity of the two methods for detecting these opportunities to reduce variability was studied. It was shown that the model based on summary metrics is better equipped to capture all sources of variation than the model based on non-summary metrics, which is very sensitive to how the model for Y is estimated. Finally, the most common errors made when selecting robust conditions from graphs were investigated.
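The summary-metric route can be illustrated concretely: for each control-factor setting of a product array, compute s2(Y) and log s(Y) over the noise array and pick the setting that minimizes them. The data below are invented for illustration, not taken from the thesis.

```python
import math
from statistics import pvariance

# Hypothetical product (inner x outer) array: for each control setting,
# the response Y is measured over the full noise array.
runs = {
    (-1, -1): [12.1, 11.9, 12.0, 12.2],
    (-1,  1): [10.0, 14.1,  9.8, 13.9],   # strongly affected by noise
    ( 1, -1): [15.0, 15.1, 14.9, 15.2],
    ( 1,  1): [13.0, 17.2, 12.8, 17.0],   # strongly affected by noise
}

# Summary metrics per control setting; robust conditions minimize them.
for control, ys in sorted(runs.items()):
    s2 = pvariance(ys)
    print(control, round(s2, 4), round(math.log10(math.sqrt(s2)), 3))
```

The alternative route would instead fit one model for Y in the control and noise factors and inspect the control-by-noise interaction terms; as the thesis argues, the two routes need not agree.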
373

Computer experiments: design, modeling and integration

Qian, Zhiguang 19 May 2006 (has links)
The use of computer modeling is increasing rapidly in almost every scientific, engineering and business arena. This dissertation investigates several challenging issues in the design, modeling and analysis of computer experiments, and consists of four major parts. In the first part, a new approach is developed to combine data from approximate and detailed simulations to build a surrogate model based on stochastic models. In the second part, we propose Bayesian hierarchical Gaussian process models to integrate data from different types of experiments. The third part concerns the development of latent variable models for computer experiments with multivariate response, with application to data center temperature modeling. The last part is devoted to the development of nested space-filling designs for multiple experiments with different levels of accuracy.
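The first part's idea of correcting a cheap, approximate simulation with scarce detailed data can be illustrated in miniature. The thesis uses stochastic (Gaussian process) models; the sketch below is a far simpler stand-in, fitting a scale-and-shift discrepancy model fine = rho * coarse + delta by least squares, with invented data.

```python
def fit_scale_shift(coarse, fine):
    """Least-squares fit of fine ~ rho * coarse + delta,
    a toy one-dimensional discrepancy model between two fidelities."""
    n = len(coarse)
    mx = sum(coarse) / n
    my = sum(fine) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(coarse, fine))
    sxx = sum((x - mx) ** 2 for x in coarse)
    rho = sxy / sxx
    delta = my - rho * mx
    return rho, delta

# Invented paired outputs at shared design points
coarse = [1.0, 2.0, 3.0, 4.0]
fine   = [2.5, 4.5, 6.5, 8.5]   # here exactly fine = 2 * coarse + 0.5
print(fit_scale_shift(coarse, fine))  # -> (2.0, 0.5)
```

Once rho and delta are estimated, the cheap simulator can predict the detailed one wherever the latter was never run; the Gaussian process versions in the thesis additionally model the residual discrepancy as a random function.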
374

Ein neues Verfahren zur modellbasierten Prozessoptimierung auf der Grundlage der statistischen Versuchsplanung am Beispiel eines Ottomotors mit elektromagnetischer Ventilsteuerung (EMVS)

Haase, Dirk 01 October 2005 (has links) (PDF)
In recent years gasoline engines have become increasingly complex, for example through the introduction of electronic control and monitoring systems for ignition, fuel injection and exhaust aftertreatment. In parallel, the requirements placed upon engines have also increased, hence the need to develop new engine technologies. This demand is due in part to the automobile industry's voluntary commitment to reduce CO2 emissions by 25% by 2005, and also to increasingly stringent future exhaust limits. Some promising solutions are currently in development, e.g. the direct-injection gasoline engine and variable valve trains. All these new technologies are characterised by increasing complexity and significantly more degrees of freedom. The associated calibration effort rises drastically with the number of free parameters and with rising quality standards. Possible solutions to meet the future requirements of the development process are based on model-based parameter optimisation and the use of test methods such as design of experiments (DoE). The idea behind this approach is to build models describing the dependence of the responses of interest (e.g. fuel consumption) on the adjusted engine parameters. With these models, offline optimisation of the engine can be carried out independently of test-bench resources. The measurement data for the models are produced with the help of statistically designed experiments, so the testing and analysis processes are structured and the effort is limited. In this work, the DoE methodology is demonstrated on a gasoline engine with an electromechanical valve train (EMVS).
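The offline-optimisation step described here, searching a fitted response model instead of the test bench, can be sketched as follows. The second-order model, its coefficients and the parameter names (spark, valve) are invented for illustration, not taken from the thesis.

```python
# Hypothetical second-order response model for fuel consumption, of the kind
# a DoE regression would deliver after testbench measurements.
def fuel_model(spark, valve):
    return (250 - 8 * spark - 5 * valve
            + 2 * spark**2 + 1.5 * valve**2 + 0.5 * spark * valve)

# Offline optimisation: grid-search the model over coded settings in [-2, 2],
# with no testbench time consumed.
best = min(
    ((s / 10, v / 10) for s in range(-20, 21) for v in range(-20, 21)),
    key=lambda p: fuel_model(*p),
)
print(best)  # -> (1.8, 1.4)
```

In practice the optimum would be validated with a confirmation run on the engine, since the model only approximates the true response.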
375

Metabolomics studies of ALS : a multivariate search for clues about a devastating disease

Wuolikainen, Anna, January 2009 (has links)
Diss. (summary) Umeå : Umeå universitet, 2009. / With 5 papers appended. Also available in printed form.
376

Multivariate design of molecular docking experiments : An investigation of protein-ligand interactions

Andersson, David January 2010 (has links)
To be able to make informed decisions regarding the research of new drug molecules (ligands), it is crucial to have access to information regarding the chemical interaction between the drug and its biological target (protein). Computer-based methods have an established role in drug research today and, by using methods such as molecular docking, it is possible to investigate the way in which ligands and proteins interact. Despite the acceleration in computer power over recent decades, many problems persist in modelling these complicated interactions. The main objective of this thesis was to investigate and improve molecular modelling methods aimed at estimating protein-ligand binding. To do so, we have utilised chemometric tools, e.g. design of experiments (DoE) and principal component analysis (PCA), in the field of molecular modelling. More specifically, molecular docking was investigated as a tool for reproduction of ligand poses in protein 3D structures and for virtual screening. Adjustable parameters in two docking software packages were varied using DoE, and parameter settings were identified that led to improved results. In an additional study, we explored the nature of ligand-binding cavities in proteins, since they are important factors in protein-ligand interactions, especially in the prediction of the function of newly found proteins. We developed a strategy, comprising a new set of descriptors and PCA, to map proteins based on the physicochemical properties of their cavities. Finally, we applied our developed strategies to design a set of glycopeptides which were used to study autoimmune arthritis. A combination of docking and statistical molecular design, synthesis and biological evaluation led to new binders for two different class II MHC proteins and recognition by a panel of T-cell hybridomas. New and interesting SAR conclusions could be drawn, and the results will serve as a basis for selection of peptides to include in in vivo studies.
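The cavity-mapping strategy (descriptors plus PCA) can be sketched generically: autoscale a descriptor matrix and project it onto its principal components. The descriptor matrix below is hypothetical, with rows as cavities and columns as invented physicochemical descriptors; it is not the descriptor set developed in the thesis.

```python
import numpy as np

# Hypothetical cavity descriptor matrix: rows are ligand-binding cavities,
# columns are physicochemical descriptors (say volume, polarity, net charge).
X = np.array([
    [120.0, 0.30, -1.0],
    [150.0, 0.35, -1.2],
    [ 80.0, 0.70,  0.5],
    [ 90.0, 0.65,  0.4],
])

Xc = (X - X.mean(axis=0)) / X.std(axis=0)  # autoscaling, standard in chemometrics
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                             # PCA scores used to map the cavities
explained = S**2 / np.sum(S**2)            # fraction of variance per component
print(explained.round(3))
```

Cavities that lie close together in the score plot have similar physicochemical profiles, which is the basis for inferring function of newly found proteins from mapped neighbours.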
377

Multivariate Synergies in Pharmaceutical Roll Compaction : The quality influence of raw materials and process parameters by design of experiments

Souihi, Nabil January 2014 (has links)
Roll compaction is a continuous process commonly used in the pharmaceutical industry for dry granulation of moisture and heat sensitive powder blends. It is intended to increase bulk density and improve flowability. Roll compaction is a complex process that depends on many factors, such as feed powder properties, processing conditions and system layout. Some of the variability in the process remains unexplained. Accordingly, modeling tools are needed to understand the properties and the interrelations between raw materials, process parameters and the quality of the product. It is important to look at the whole manufacturing chain from raw materials to tablet properties. The main objective of this thesis was to investigate the impact of raw materials, process parameters and system design variations on the quality of intermediate and final roll compaction products, as well as their interrelations. In order to do so, we have conducted a series of systematic experimental studies and utilized chemometric tools, such as design of experiments, latent variable models (i.e. PCA, OPLS and O2PLS) as well as mechanistic models based on the rolling theory of granular solids developed by Johanson (1965). More specifically, we have developed a modeling approach to elucidate the influence of different brittle filler qualities of mannitol and dicalcium phosphate and their physical properties (i.e. flowability, particle size and compactability) on intermediate and final product quality. This approach allows the possibility of introducing new fillers without additional experiments, provided that they are within the previously mapped design space. Additionally, this approach is generic and could be extended beyond fillers. Furthermore, in contrast to many other materials, the results revealed that some qualities of the investigated fillers demonstrated improved compactability following roll compaction. 
In one study, we identified the design space for a roll compaction process using a risk-based approach. The influence of process parameters (i.e. roll force, roll speed, roll gap and milling screen size) on different ribbon, granule and tablet properties was evaluated. In another study, we demonstrated the significant added value of the combination of near-infrared chemical imaging, texture analysis and multivariate methods in the quality assessment of the intermediate and final roll compaction products. Finally, we have also studied the roll compaction of an intermediate drug load formulation at different scales and using roll compactors with different feed screw mechanisms (i.e. horizontal and vertical). The horizontal feed screw roll compactor was also equipped with an instrumented roll technology allowing the measurement of normal stress on ribbon. Ribbon porosity was primarily found to be a function of normal stress, exhibiting a quadratic relationship. A similar quadratic relationship was also observed between roll force and ribbon porosity of the vertically fed roll compactor. A combination of design of experiments, latent variable and mechanistic models led to a better understanding of the critical process parameters and showed that scale up/transfer between equipment is feasible.
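The quadratic relationship reported between normal stress (or roll force) and ribbon porosity is straightforward to fit once measurements exist. The data below are synthetic, generated from an assumed quadratic law purely to illustrate the fit; they are not measurements from the thesis.

```python
import numpy as np

# Hypothetical roll-compaction data: normal stress on the ribbon (MPa)
# versus ribbon porosity (%), generated from porosity = 35 - 0.3x + 0.0012x^2.
stress   = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
porosity = np.array([29.48, 24.92, 21.32, 18.68, 17.00])

# Second-order polynomial fit: porosity = c2*x^2 + c1*x + c0
coef = np.polyfit(stress, porosity, 2)
print(coef.round(4))
```

With the fitted curve, a target porosity can be translated back into a normal-stress set point, which is one practical use of such a relationship in scale-up and transfer between compactors.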
378

Beitrag zur Optimierung der Verfahrensparameter von Vliesstoffausrüstungsprozessen bei hohen Warengeschwindigkeiten / Contribution to optimisation of process parameters of nonwoven finishing processes at high speeds

Grönke, Kerstin 15 December 2014 (has links) (PDF)
The subject of the present work is the study of the padding process for the chemical wet finishing of nonwovens at web speeds of up to 250 m/min. The background is the repellent finishing of polypropylene spunbond nonwovens for use in surgical gowns. Whereas, according to the current state of practice, finishing has been outsourced to commission finishers operating at low web speeds, the trend in the nonwovens industry is towards in-house process mastery. An essential prerequisite for making the padding process usable for this application is knowledge of the process behaviour at the required high web speeds.
For the experimental space to be covered, comprising six influencing factors at three levels each, a D-optimal experimental plan was constructed using design of experiments (DoE) methodology. The trials were carried out on a padder with a horizontal roll arrangement integrated into a pilot line. For each of the seven responses, a linear regression analysis was compiled and evaluated on the basis of the measured data. A detailed analysis and discussion of the regression models provides information on the direction and intensity of each influencing factor as well as on factor-factor interactions.
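D-optimal plans like the one used here are normally built by dedicated software (exchange algorithms over a candidate set). A deliberately simplified greedy sketch conveys the idea of maximizing det(X'X); the two-factor candidate set and first-order model are invented for illustration and much smaller than the six-factor, three-level space of the thesis.

```python
import numpy as np
from itertools import product

# Candidate set: full 3-level factorial in two coded factors; model columns
# are (intercept, x1, x2). Factors and model are illustrative only.
candidates = np.array([(1.0, a, b) for a, b in product([-1, 0, 1], repeat=2)])

def greedy_d_optimal(cand, n_runs):
    """Toy D-optimality heuristic: at each step add the candidate run
    that maximizes det(X'X). Real software uses exchange algorithms."""
    p = cand.shape[1]
    chosen = []
    for _ in range(n_runs):
        best_i, best_det = 0, -1.0
        for i in range(len(cand)):
            X = np.array(chosen + [cand[i]])
            # tiny ridge keeps the determinant well-defined for early, rank-deficient X
            d = np.linalg.det(X.T @ X + 1e-9 * np.eye(p))
            if d > best_det:
                best_i, best_det = i, d
        chosen.append(cand[best_i])
    return np.array(chosen)

design = greedy_d_optimal(candidates, 4)
print(design.shape)  # (4, 3)
```

The resulting runs support the assumed model with near-minimal parameter variance, which is why D-optimal plans are economical when a full factorial (3^6 runs here) is infeasible.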
379

Contributions to computer experiments and binary time series

Hung, Ying 19 May 2008 (has links)
This thesis consists of two parts. The first part focuses on design and analysis for computer experiments and the second part deals with binary time series and its application to kinetic studies in micropipette experiments. The first part of the thesis addresses three problems. The first problem is concerned with optimal design of computer experiments. Latin hypercube designs (LHDs) have been used extensively for computer experiments. A multi-objective optimization approach is proposed to find good LHDs by combining correlation and distance performance measures. Several examples are presented to show that the obtained designs are good in terms of both criteria. The second problem is related to the analysis of computer experiments. Kriging is the most popular method for approximating complex computer models. Here a modified kriging method is proposed, which has an unknown mean model. Therefore it is called blind kriging. The unknown mean model is identified from experimental data using a Bayesian variable selection technique. Many examples are presented which show remarkable improvement in prediction using blind kriging over ordinary kriging. The third problem is related to computer experiments with nested and branching factors. Design and analysis of experiments with branching and nested factors are challenging and have not received much attention in the literature. Motivated by a computer experiment in a machining process, we develop optimal LHDs and kriging methods that can accommodate branching and nested factors. Through the application of the proposed methods, optimal machining conditions and tool edge geometry are attained, which resulted in a remarkable improvement in the machining process. The second part of the thesis deals with binary time series analysis with application to cell adhesion frequency experiments. Motivated by the analysis of repeated adhesion tests, a binary time series model incorporating random effects is developed in this chapter. 
A goodness-of-fit statistic is introduced to assess the adequacy of distribution assumptions on the dependent binary data with random effects. Application of the proposed methodology to real data from a T-cell experiment reveals some interesting information. These results provide quantitative evidence for the speculation that cells can have "memory" in their adhesion behavior.
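The Latin hypercube designs optimized in the first part are simple to generate in their plain, unoptimized form: each factor's levels are a random permutation of the strata, so every dimension is covered evenly. This sketch is a random LHD only; the multi-objective correlation-and-distance optimization of the thesis is a further search on top of designs like this. Dimensions are hypothetical.

```python
import random

def latin_hypercube(n, k, seed=0):
    """Random n-run, k-factor Latin hypercube design on [0, 1)^k.
    Each factor's values fall in n distinct strata, one point per stratum."""
    rng = random.Random(seed)
    cols = []
    for _ in range(k):
        perm = list(range(n))
        rng.shuffle(perm)  # which stratum each run occupies in this dimension
        cols.append([(p + rng.random()) / n for p in perm])
    return [list(row) for row in zip(*cols)]

lhd = latin_hypercube(5, 2)
for col in zip(*lhd):
    # exactly one point per stratum in every dimension
    print(sorted(int(v * 5) for v in col))  # -> [0, 1, 2, 3, 4]
```

A multi-objective search would then score many such designs by column correlation and by minimum inter-point distance, keeping designs good on both criteria.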
380

Hit Identification and Hit Expansion in Antituberculosis Drug Discovery : Design and Synthesis of Glutamine Synthetase and 1-Deoxy-D-Xylulose-5-Phosphate Reductoisomerase Inhibitors

Nordqvist, Anneli January 2011 (has links)
Since the discovery of Mycobacterium tuberculosis (Mtb) as the bacterial agent causing tuberculosis, the permanent eradication of this disease has proven challenging. Although a number of drugs exist for the treatment of tuberculosis, 1.7 million people still die every year from this infection. The current treatment regimen involves lengthy combination therapy with four different drugs in an effort to combat the development of resistance. However, multidrug-resistant and extensively drug-resistant strains are emerging in all parts of the world. Therefore, new drugs effective in the treatment of tuberculosis are much-needed. The work presented in this thesis was focused on the early stages of drug discovery by applying different hit identification and hit expansion strategies in the exploration of two new potential drug targets, glutamine synthetase (GS) and 1-deoxy-D-xylulose-5-phosphate reductoisomerase (DXR). A literature survey was first carried out to identify new Mtb GS inhibitors from compounds known to inhibit GS in other species. Three compounds, structurally unrelated to the typical amino acid derivatives of previously known GS inhibitors, were then discovered by virtual screening and found to be Mtb GS inhibitors, exhibiting activities in the millimolar range. Imidazo[1,2-a]pyridine analogues were also investigated as Mtb GS inhibitors. The chemical functionality, size requirements and position of the substituents in the imidazo[1,2-a]pyridine hit were investigated, and a chemical library was designed based on a focused hierarchical design of experiments approach. The X-ray structure of one of the inhibitors in complex with Mtb GS provided additional insight into the structure–activity relationships of this class of compounds. Finally, new α-arylated fosmidomycin analogues were synthesized as inhibitors of Mtb DXR, exhibiting IC50 values down to 0.8 µM. This work shows that a wide variety of aryl groups are tolerated by the enzyme. 
Cinnamaldehydes are important synthetic intermediates in the synthesis of fosmidomycin analogues. These were prepared by an oxidative Heck reaction from acrolein and various arylboronic acids. Electron-rich, electron-poor, heterocyclic and sterically hindered boronic acids could be employed, furnishing cinnamaldehydes in 43–92% yield.
