371

An Approach for the Adaptive Solution of Optimization Problems Governed by Partial Differential Equations with Uncertain Coefficients

Kouri, Drew 05 September 2012 (has links)
Using derivative-based numerical optimization routines to solve optimization problems governed by partial differential equations (PDEs) with uncertain coefficients is computationally expensive due to the large number of PDE solves required at each iteration. In this thesis, I present an adaptive stochastic collocation framework for the discretization and numerical solution of these PDE-constrained optimization problems. This adaptive approach is based on dimension-adaptive sparse grid interpolation and employs trust regions to manage the adapted stochastic collocation models. Furthermore, I prove the convergence of sparse grid collocation methods applied to these optimization problems, as well as the global convergence of the retrospective trust region algorithm under weakened assumptions on gradient inexactness. In fact, if one can bound the error between actual and modeled gradients using reliable and efficient a posteriori error estimators, then the global convergence of the proposed algorithm follows. Moreover, I describe a high-performance implementation of my adaptive collocation and trust region framework using the C++ programming language with the Message Passing Interface (MPI). Many PDE solves are required to accurately quantify the uncertainty in such optimization problems; it is therefore essential to choose appropriate inexpensive approximate models and large-scale nonlinear programming techniques throughout the optimization routine. Numerical results for the adaptive solution of these optimization problems are presented.
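The interplay between a cheap adapted model and a trust region can be sketched in a few lines. The toy Python loop below is a minimal illustration, not the thesis's C++/MPI implementation; the function names and the quadratic test problem are invented. It shows the core mechanic: a surrogate model proposes a step, and the ratio of actual to predicted reduction decides whether to accept it and how to resize the region.

```python
# Minimal trust-region sketch: an inexpensive surrogate model stands in for
# the full objective (which would involve many PDE solves), and steps are
# accepted or rejected from the ratio of actual to predicted reduction.
import numpy as np

def trust_region_step(f, model, grad_model, x, delta,
                      eta=0.1, shrink=0.5, grow=2.0):
    """One trust-region iteration managed by a surrogate model."""
    g = grad_model(x)
    # Cauchy-point step: steepest descent clipped to the trust region.
    s = -delta * g / (np.linalg.norm(g) + 1e-14)
    pred = model(x) - model(x + s)          # predicted reduction
    ared = f(x) - f(x + s)                  # actual reduction
    rho = ared / pred if pred > 0 else -1.0
    if rho >= eta:                          # sufficient agreement: accept
        return x + s, min(grow * delta, 10.0)
    return x, shrink * delta                # reject and shrink the region

# Toy usage: quadratic objective and a deliberately inexact (5% off) model.
f = lambda x: float(np.dot(x, x))
m = lambda x: 1.05 * float(np.dot(x, x))
gm = lambda x: 2.1 * x
x, delta = np.array([3.0, -4.0]), 1.0
for _ in range(50):
    x, delta = trust_region_step(f, m, gm, x, delta)
print(x)  # approaches the minimizer at the origin
```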
372

Algorithms for Characterizing Peptides and Glycopeptides with Mass Spectrometry

He, Lin January 2013 (has links)
The emergence of tandem mass spectrometry (MS/MS) technology has significantly accelerated protein identification and quantification in proteomics. It enables high-throughput analysis of proteins and their quantities in a complex protein mixture. A mass spectrometer can easily and rapidly generate large volumes of mass spectral data for a biological sample. This bulk of data makes manual interpretation impossible and has also brought numerous challenges in automated data analysis. Algorithmic solutions have been proposed and provide indispensable analytical support in current proteomic experiments. However, new algorithms are still needed to either improve result accuracy or provide additional data analysis capabilities for both protein identification and quantification. Accurate identification of the proteins in a sample is the first requirement of a proteomic study. In many cases, a mass spectrum cannot provide complete information to identify the peptide without ambiguity, because of the inefficiency of the peptide fragmentation technique and the prevalence of noise. We propose ADEPTS to address this problem using the complementary information provided by different types of mass spectra. Meanwhile, the occurrence of post-translational modifications (PTMs) on proteins is another major issue that prevents the interpretation of a large portion of spectra. Using current software tools, users have to specify possible PTMs in advance, and the number of possible PTMs has to be limited, since specifying more PTMs leads to a longer running time and lower result accuracy. Thus, we develop DeNovoPTM and PeaksPTM to provide efficient and accurate solutions. Glycosylation is one of the most frequently observed PTMs in proteomics. It plays important roles in many disease processes and has thus attracted growing research interest. However, the lack of algorithms that can identify intact glycopeptides has become the major obstacle hindering glycoprotein studies. We propose a novel algorithm, GlycoMaster DB, to fulfil this urgent requirement. Additional research is presented on protein quantification, which studies changes in protein quantity by comparing two or more mass spectral datasets. A crucial problem in quantification is correcting the retention time distortions between different datasets. Heuristic solutions from previous research have been used in practice, but none of them is formulated around a clear optimization goal. To address this issue, we propose a combinatorial model and practical algorithms for this problem.
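To make the retention-time alignment formulation concrete, here is a hedged Python illustration (not the thesis's actual model or algorithm): alignment cast as an order-preserving matching between the feature lists of two runs, solved with a longest-common-subsequence style dynamic program.

```python
# Retention-time alignment as a combinatorial problem: find a monotone
# matching between the features of two LC-MS runs that maximizes the number
# of mass-consistent pairs -- structurally an LCS dynamic program.
def align_runs(run_a, run_b, mass_tol=0.01):
    """run_a, run_b: lists of (retention_time, mass), sorted by retention
    time. Returns the maximum number of matchable feature pairs under an
    order-preserving alignment."""
    n, m = len(run_a), len(run_b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = abs(run_a[i-1][1] - run_b[j-1][1]) <= mass_tol
            dp[i][j] = max(dp[i-1][j], dp[i][j-1],
                           dp[i-1][j-1] + (1 if match else 0))
    return dp[n][m]

# Toy example: the second run is shifted in retention time, but the masses
# still line up, so all three features match monotonically.
a = [(10.0, 500.25), (12.5, 612.30), (15.1, 720.40)]
b = [(11.2, 500.25), (13.9, 612.31), (16.8, 720.40)]
print(align_runs(a, b))  # -> 3
```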
373

A Study on the Service Quality of Blockbuster Exhibitions Jointly Held by Taiwan Print Media and Museums: A Case Study of the Da Vinci Travelling Exhibition

Chang, Kai-yao 16 August 2010 (has links)
Taiwan's public museums and print media have held one "blockbuster" exhibition after another since 1990. The trend started in the US and then spread around the world. The considerable number of visitors not only solved museums' financial problems but also gave print media a field for pursuing cultural capital and developing new business models. The trend can be traced to three concepts: first, the administrative corporatization of national museums; second, museum marketing; third, new museology. These concepts break the ossified museum organizational system and its icy public image. However, the streams of people and money raise doubts about declining exhibition quality, over-commercialization, and loss of professionalism. Most museum exhibition assessments are overall assessments covering a wide range, which is not applicable to a blockbuster that happens only once, tours from venue to venue, and requires inter-organizational cooperation. This study is a case study of the "Da Vinci Travelling Exhibition", held by the United Daily News Group, the National Chiang Kai-shek Memorial Hall, and the National Science and Technology Museum. The research design is based on the "Conceptual Model of Service Quality" and the "Extended Model of Service Quality" proposed by Parasuraman, Zeithaml, and Berry. The study estimates the coefficients of the service quality gap model and builds a regression from them; with the regression, we can judge which gap factors affect customers' satisfaction with service quality, point out how to improve a blockbuster's service quality, and build a customer-driven exhibition assessment. The study also investigated the relationship between demographic variables and customers' satisfaction with exhibition service quality. Three findings are revealed. First, demographic variables have a significant effect on customers' attitudes toward blockbuster service quality. Second, except for gap 1 (managerial perception), the functional relationships of gap 2 (managerial standards), gap 3 (service performance), and gap 4 (external communication) to gap 5 were confirmed. Third, using service gap theory to establish an exhibition service quality assessment is feasible.
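The gap-regression step can be illustrated with a short sketch. The Python snippet below uses invented survey numbers, not the study's data: it fits an ordinary-least-squares regression of the overall gap (gap 5) on gaps 2-4, the kind of model the study uses to judge which gap drives satisfaction.

```python
# Regress the overall service-quality gap (gap 5, expectation minus
# perception) on the internal gap scores; data below are illustrative only.
import numpy as np

# Columns: gap2 (managerial standards), gap3 (service performance),
# gap4 (external communication); one row per surveyed visitor.
gaps = np.array([
    [0.8, 1.2, 0.5],
    [0.3, 0.9, 0.2],
    [1.1, 1.5, 0.9],
    [0.2, 0.4, 0.1],
    [0.9, 1.0, 0.7],
])
gap5 = np.array([1.4, 0.8, 1.9, 0.3, 1.3])       # overall perceived gap

X = np.column_stack([np.ones(len(gaps)), gaps])  # add an intercept
coef, *_ = np.linalg.lstsq(X, gap5, rcond=None)
print(dict(zip(["intercept", "gap2", "gap3", "gap4"], coef.round(3))))
# The fitted coefficients indicate which internal gap most strongly predicts
# the overall gap -- the judgment the study draws from its own survey data.
```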
374

Structural Comparative Analysis of MPA Programs between Taiwan and Mainland Chinese Universities

Huang, Tsung-Cheng 09 August 2006 (has links)
In recent years, implementing Public Affairs Management (PAM) educational programs and improving graduates' abilities and management quality have become important issues for establishing postmodern, participatory citizenship. In other words, instructing modern citizens to participate in civil society relies mostly on education. Taiwan, China, and the U.S.A. have had different experiences in promoting PAM education; owing to each country's social context and history, each has its own strengths and focus. Hence, the researcher's interest was to discern and compare PAM course structures cross-nationally and to spotlight the direction and development of future PAM education programs. This study took as research subjects 16 universities offering executive professional master's programs of MPA (Master of Public Administration) in Taiwan and 24 universities in China; six universities in the U.S.A. were selected for comparison. The "Integrated Reference Framework for Public Affairs Management" was applied as the model for analyzing and comparing these courses. The aims of the study were: (1) to review the explicitness and reflectivity of this framework; (2) to present the main characteristics of the structural dimensions of PAM education courses; (3) to discover the differences between universities in Taiwan and China; and (4) to contribute to future PAM course design. The findings are summarized as follows. 1. Quantitative analysis results. After analyzing the MPA course structures of universities in Taiwan and China with Hayashi's quantification theory, together with representative universities in the U.S.A., this study extracted three characteristics: (1) The sixteen universities in Taiwan reflected the "phenomenal side" and "essential side" and focused on postmodern issues; their main courses could be categorized into institutional analysis and design, Taiwan-China politics and policy issues, and product-quality management and innovation. (2) The twenty-four universities in China reflected the "phenomenal side", "essential side", and "conditional side" and focused on modern issues; their main courses could be categorized into institutional analysis and design, science and technology policies, and information technology management. (3) The six universities in the U.S.A. reflected the "phenomenal side", "essential side", and "conditional side" and focused on postmodern issues; furthermore, they influenced the universities in both Taiwan and China. Their main courses could be categorized into institutional analysis and design, science and technology policies, information technology management, and rural-urban housing development and transportation. 2. Qualitative interview analysis results. Qualitative data were collected through in-depth interviews with professors lecturing at PAM education institutions. The findings suggest: (1) When designing courses, a department should take both technical rationality and the philosophical foundations of administration into consideration. (2) PAM courses should aim to create localized theories instead of replicating foreign ones; at minimum, critical thinking and the ability to modify those theories are essential. (3) PAM policies could be practiced from a sustainability perspective and should take whole-person education as one of their purposes. (4) PAM courses could integrate and strengthen theory and practice in order to broaden the scope of academics and research. (5) When universities and departments set a goal of transformation and development, the attendant identity crisis should be addressed. (6) The interviewees suggested that universities could set up schools of public affairs to emphasize PAM's status in education. Keywords: public administration, public management, public affairs management, education system, Hayashi's quantification theory.
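As a rough illustration of what Hayashi's quantification theory (type III) computes, the sketch below applies the closely related correspondence-analysis decomposition to an invented university-by-course incidence matrix; it is not the study's data or its exact procedure.

```python
# Score universities (rows) and course categories (columns) from a binary
# incidence matrix so that related rows and columns receive nearby scores --
# essentially correspondence analysis via SVD.
import numpy as np

# Rows: universities; columns: course categories (e.g., institutional
# analysis, policy issues, IT management, housing/transportation).
F = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
], dtype=float)

P = F / F.sum()                       # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)   # row / column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

row_scores = U[:, 0] / np.sqrt(r)     # first-axis scores for universities
col_scores = Vt[0] / np.sqrt(c)       # first-axis scores for categories
print(row_scores.round(2), col_scores.round(2))
```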
375

Development of PCR Methods for Detection and Quantification of Genetically Modified Maize

Jabbarifarhoud, Houman 01 June 2010 (has links) (PDF)
This study describes the development of methods for screening, identification, and quantification of genetic modifications in maize samples. In total, 88 maize samples, collected randomly throughout Turkey over three years (2006-2008), were analyzed. Two maize samples detected as GM positive in previous studies were selected as positive controls. Following DNA extraction by the manual CTAB method, conventional PCR methods were employed to screen the samples for genetic modifications by detecting P-35S and T-NOS. Qualitative PCR methods were conducted for target-specific detection of the cry and pat genes. Construct-specific and event-specific PCR assays were designed for detection of the Bt11, Bt10, and Mon810 maize events. Specific primers and corresponding probes labeled with reporter and quencher dyes were designed for both absolute and relative quantification of Bt11 and Mon810 in the samples using the TaqMan probe method. Comparing the absolute and relative quantification results indicates a correlation between them. To verify the accuracy of the quantification methods, three parallel applications were conducted according to the CRL validated protocol. Statistical analyses were performed to check the precision and repeatability of the quantification experiments by in-house validation methods. With regard to the repeatability relative standard deviation (RSDr) values of the absolute and relative quantifications for the Bt11 and Mon810 systems, the majority of the validation results meet the ENGL requirements for quantification of GMOs. According to the screening assays, five samples (H3, H48, H73, 4M, 4G) were detected as GM positive. While samples H3 and H48 were identified as Bt11, samples 4M and 4G were shown to contain both the Bt11 and Mon810 maize events. Bt11 quantification shows that samples H3 and 4G, at 1.06% and 5.36% respectively, exceed the 0.9% threshold level. The amount of Mon810 was determined as 1.33% in 4M and 17.32% in 4G, both above the 0.9% threshold. Sample H73, though detected as GM positive, did not contain the Bt11 or Mon810 maize events. Since the methods developed in this study reduce dependence on commercial kits, they could contribute to the expansion of GMO testing in Turkey at lower cost. However, the methods should be extended to other maize events, and their validation should be completed.
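The relative quantification step in TaqMan real-time PCR reduces to a simple calculation: copy numbers for the event-specific target and the maize reference gene are read off standard curves, and the GM percentage is their ratio. The Python sketch below is a simplified illustration; the curve parameters and Ct values are invented, not the thesis's measured values.

```python
# GM percentage from TaqMan Ct values via standard curves of the form
# Ct = slope * log10(copies) + intercept.
def copies_from_ct(ct, slope, intercept):
    """Invert a standard curve Ct = slope*log10(N) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def gm_percentage(ct_event, ct_reference, curve_event, curve_ref):
    n_event = copies_from_ct(ct_event, *curve_event)
    n_ref = copies_from_ct(ct_reference, *curve_ref)
    return 100.0 * n_event / n_ref

# Hypothetical curves near the theoretical slope of -3.32 (100% efficiency).
curve_event = (-3.32, 40.0)   # (slope, intercept) for an event assay
curve_ref = (-3.35, 39.5)     # maize endogenous reference gene

pct = gm_percentage(ct_event=33.5, ct_reference=27.2,
                    curve_event=curve_event, curve_ref=curve_ref)
print(f"{pct:.2f}% GM")       # compare against the 0.9% labeling threshold
```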
376

Investigation of Absolute Quantification of In Vivo Proton MR Spectroscopy with Phased-Array Coils

Hsu, Cheng-yun 16 July 2008 (has links)
LCModel has been widely used for MR spectroscopy analysis. LCMgui, the built-in user interface of LCModel on Linux, provides the functionality to convert MRS data of various formats into the LCModel raw-file format, except for GE MRSI data, which can be analyzed by LCModel only with GE's Sage/IDL software. Hence, the first part of this work was to develop a multi-platform tool for LCModel that supports all GE data, including GE MRSI data and phased-array data. With this tool, users can analyze MRS data with LCModel in a familiar environment such as Windows or Linux. MR spectroscopy experiments with phased-array coils provide improved SNR, which leads to more accurate absolute quantification when sophisticated coil-combination algorithms are used. Thus, the second part of this work was to propose an algorithm for combining data from phased-array coils based on phase correction and the calculation of weighting factors. In addition, the accuracy achieved with a quadrature coil was compared against that of phased-array coils under different combination algorithms, to demonstrate the benefit of using phased-array coils and the combination program.
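One common combination scheme of the kind investigated here can be sketched briefly: phase each coil's FID so that its first point is real, then sum with SNR-proportional weights. The Python sketch below is a hedged illustration; the particular choices (weighting by first-point magnitude over noise variance, noise estimated from the FID tail) are illustrative, not necessarily the thesis's algorithm.

```python
# Phase-correct and SNR-weight phased-array FIDs before summation.
import numpy as np

def combine_coils(fids):
    """fids: complex array, shape (n_coils, n_points) -- one FID per coil."""
    combined = np.zeros(fids.shape[1], dtype=complex)
    for fid in fids:
        phase = np.angle(fid[0])          # zero-order phase from first point
        aligned = fid * np.exp(-1j * phase)
        noise = np.std(aligned[-256:])    # noise estimate from the FID tail
        weight = np.abs(fid[0]) / (noise**2 + 1e-20)  # ~ SNR weighting
        combined += weight * aligned
    return combined

# Synthetic two-coil example: the same decaying signal with different
# sensitivity and receiver phase per coil, plus complex noise.
t = np.arange(2048) / 2000.0
sig = np.exp(2j * np.pi * 140 * t) * np.exp(-t / 0.08)
rng = np.random.default_rng(0)
noise = lambda: 0.05 * (rng.standard_normal(2048) + 1j * rng.standard_normal(2048))
fids = np.stack([1.0 * sig * np.exp(1j * 0.7) + noise(),
                 0.6 * sig * np.exp(-1j * 1.9) + noise()])
spectrum = np.fft.fftshift(np.fft.fft(combine_coils(fids)))
print(np.abs(spectrum).max())             # peak adds coherently after phasing
```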
377

Molecular characterization of the lipidome by mass spectrometry

Stenby Ejsing, Christer 01 March 2007 (has links) (PDF)
Cells, whether bacterial, fungal or mammalian, are all equipped with metabolic pathways capable of producing an assortment of structurally and functionally distinct lipid species. Although the structural diversity of lipids is recognized and has been correlated with specific cellular phenomena and disease states, the molecular mechanisms that underpin this structural diversity remain poorly understood. In part, this is due to the lack of adequate analytical techniques capable of measuring the structural details of lipid species in a direct, comprehensive and quantitative manner. The aim of my thesis study was to establish methodology for automated and quantitative analysis of molecular lipid species based on mass spectrometry. From this work a novel high-throughput methodology for lipidome analysis emerged. The main assets of the methodology were the structure-specific mass analysis by powerful hybrid mass spectrometers with high mass resolution, automated and sensitive infusion of total lipid extracts by a nanoelectrospray robot, and automated spectral deconvolution by dedicated Lipid Profiler software. The comprehensive characterization and quantification of molecular lipid species was achieved by spiking total lipid extracts with unique lipid standards, utilizing selective ionization conditions for sample infusion, and performing structure-specific mass analysis by hybrid quadrupole time-of-flight and ion trap mass spectrometry. The analytical routine allowed the comprehensive characterization and quantification of molecular glycerophospholipid species, molecular diacylglycerol species, molecular sphingolipid species including ceramides, glycosphingolipids and inositol-containing sphingolipids, and sterol lipids including cholesterol. The performance of the methodology was validated by comparing its dynamic quantification range to that of established methodology based on triple quadrupole mass spectrometry. Furthermore, its efficacy for lipidomics projects was demonstrated by the successful quantitative deciphering of the lipid composition of T cell receptor signaling domains, mammalian tissues including heart, brain and red blood cells, and the yeast Saccharomyces cerevisiae.
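The internal-standard quantification at the heart of this kind of shotgun lipidomics workflow is a simple ratio calculation, sketched below in Python. The values are invented for illustration; the sketch assumes comparable instrument response factors between a species and its class-matched standard.

```python
# Quantify an endogenous lipid species against a spiked internal standard
# of the same lipid class by intensity ratio.
def quantify(species_intensity, standard_intensity, standard_pmol):
    """Amount of an endogenous species from its intensity ratio to a
    class-matched internal standard (assumes similar response factors)."""
    return standard_pmol * species_intensity / standard_intensity

# Suppose 25 pmol of a non-natural PC standard was spiked into the extract;
# two endogenous PC species are then quantified from their peak intensities.
for name, inten in [("PC 34:1", 8.2e5), ("PC 36:2", 4.1e5)]:
    print(name, round(quantify(inten, standard_intensity=5.0e5,
                               standard_pmol=25.0), 1), "pmol")
```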
378

Inversion of probabilistic structural models from experimental transfer functions

Arnst, Maarten 03 April 2007 (has links) (PDF)
The objective of this thesis is to develop a methodology for the experimental identification of probabilistic models that predict the dynamical behaviour of structures. We focus in particular on the inversion of probabilistic models with minimal parameterization, introduced by Soize, from experimental transfer functions. We first show that the classical estimation methods of mathematical statistics, such as the maximum likelihood method, are not well suited to this problem. In particular, we show that numerical difficulties, as well as conceptual problems due to the risk of model misspecification, can hinder the application of the classical methods. These difficulties motivate us to formulate the inversion of probabilistic models instead as the minimization, with respect to the sought parameters, of an objective function measuring a distance between the experimental data and the probabilistic model. We propose two construction principles for defining such distances, based either on the log-likelihood function or on relative entropy. We show how restricting these distances to low-order marginal distributions overcomes the difficulties mentioned above. The methodology is applied to examples with simulated data and to a civil and environmental engineering problem with real measurements.
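The objective-function formulation can be sketched compactly. The Python example below is schematic: it uses a toy one-degree-of-freedom deterministic model and a simple log-magnitude misfit, whereas the thesis works with probabilistic structural models and distances built from the log-likelihood or relative entropy. It shows only the general shape of the inversion: minimize a distance between simulated and measured transfer functions over the model parameters.

```python
# Fit model parameters by minimizing a misfit between simulated and
# "experimental" transfer functions.
import numpy as np
from scipy.optimize import least_squares

w = 2 * np.pi * np.linspace(10.0, 200.0, 64)   # angular frequencies, rad/s

def frf(params):
    """|1/(k - m w^2 + i c w)| for a one-DOF oscillator; k is expressed in
    units of 1e5 N/m so that all parameters are O(1) for the optimizer."""
    m, c, k5 = params
    return np.abs(1.0 / (1e5 * k5 - m * w**2 + 1j * c * w))

true = np.array([1.0, 3.0, 4.0])               # "experimental" system
rng = np.random.default_rng(1)
measured = frf(true) * np.exp(0.02 * rng.standard_normal(w.size))  # noisy data

# Distance between model and experiment, here in log magnitude.
residual = lambda p: np.log(frf(p)) - np.log(measured)
fit = least_squares(residual, x0=np.array([0.5, 1.0, 2.0]))
print(fit.x)   # recovered (m, c, k/1e5) -- should land near (1.0, 3.0, 4.0)
```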
379

Algebraic vector quantization: an efficient tool for still-image compression and watermarking

Moureaux, Jean-Marie 10 December 2007 (has links) (PDF)
This manuscript describes twelve years of research activity at the Centre de Recherche en Automatique de Nancy in the field of image compression (consumer as well as medical images), and in image watermarking in a compression context. We focused on the quantization stage of the compression chain, for which we proposed a method called "algebraic vector quantization with dead zone" (QVAZM), combined with a wavelet multiresolution analysis, which significantly improves rate-distortion performance, and thus the visual quality of the reconstructed image, compared with the JPEG2000 standard and the reference algorithm SPIHT. We worked on three essential points, proposing solutions for each to make the use of QVAZM practical in a compression chain: indexing the codebook vectors, tuning the codebook parameters (scale factor and dead zone), and bit allocation. The major contribution of our work in 3D medical imaging was to attempt to open a path toward lossy compression, which until a few years ago was inconceivable for obvious diagnostic reasons. To this end, we successfully extended the QVAZM algorithm to volumetric medical images. In parallel, we studied the impact of lossy compression on several medical image processing applications, in particular a computer-aided detection tool for pulmonary nodules, for which we demonstrated robustness to lossy compression even at high compression ratios (up to 96:1). Finally, the main contribution of our work in watermarking concerns the development of combined compression/watermarking approaches, which led to two watermarking methods based on QVAZM combined with a wavelet multiresolution analysis. They are particularly attractive for applications in which compression constitutes the main attack (or the main processing).
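The core idea of dead-zone vector quantization can be sketched in a few lines of Python. This is a schematic illustration on the cubic lattice Z^n for simplicity; the QVAZM method's actual lattice, scale-factor tuning, dead-zone shape, and indexing are more elaborate.

```python
# Dead-zone vector quantization on a lattice: small vectors are mapped to
# zero, the rest snap to the nearest point of the scaled lattice.
import numpy as np

def dead_zone_lattice_quantize(vectors, scale, dead_zone):
    """Quantize each row vector: vectors whose norm falls inside the dead
    zone map to zero, the rest snap to the nearest point of scale * Z^n."""
    out = np.zeros_like(vectors)
    norms = np.linalg.norm(vectors, axis=1)
    keep = norms > dead_zone                   # outside the dead zone
    out[keep] = scale * np.round(vectors[keep] / scale)
    return out

# Toy wavelet-coefficient blocks: many near-zero vectors (zeroed by the dead
# zone, which is where much of the bit-rate saving comes from) plus a few
# significant ones that survive quantization.
rng = np.random.default_rng(0)
blocks = np.vstack([0.1 * rng.standard_normal((6, 4)),
                    5.0 * rng.standard_normal((2, 4))])
q = dead_zone_lattice_quantize(blocks, scale=1.0, dead_zone=1.0)
print((np.linalg.norm(q, axis=1) == 0).sum(), "of", len(q), "blocks zeroed")
```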
380

Application of communication theory to health assessment, degradation quantification, and root cause diagnosis

Costuros, Theodossios Vlasios 15 October 2013 (has links)
A review of diagnostic methods shows that new techniques are required to quantify system degradation from measured response. Information theory, developed by Claude E. Shannon, involves the quantification of information, defining limits in signal processing for reliable data communication. One such technique draws on information theory fundamentals, forming an analogy between a machine and a communication channel to modify Shannon's channel capacity concept and apply it to measured machine system response. The technique considers the residual signal (the difference between a measured signal induced by faults and a baseline signal) to quantify degradation, perform system health assessment, and diagnose faults. Just as noise hampers data transmission, mechanical faults hinder power transmission through the system. This residual signal can be viewed as noise within the context of information theory, permitting the application of information theory to machines to construct a health measure for assessing machine health. The goal of this dissertation is to create and study metrics for the assessment of machine health. The dissertation explores channel capacity, which is grounded in and supported by proven theorems of information theory, studies different ways to apply and calculate channel capacity in practical industry settings, and creates methods to assess and pinpoint degradation by applying the capacity-based measures to signals. Channel capacity is the maximum rate of information that can be sent and received over a channel having a known level of noise. A measured signal from a machine consists of a baseline signal exemplary of health, intrinsic noise that contaminates all measurements, and signals generated by faults. The noise, i.e., the difference between the measured signal and the baseline signal, consists of intrinsic noise and "fault noise". Separating fault noise from intrinsic noise (embedded in every measurement) shows that channel capacity calculations for the machine require minimal computational effort, and that the calculations are consistent in the presence of intrinsic white noise. Considering the response average, or DC component, of a signal in the channel capacity calculations adds robustness to the diagnostic results. The method successfully predicted robot failures. Important to system health assessment is having a good baseline response as reference. The technique is favorable for industry because it applies directly to measurement data and the calculations are done in the time domain. It can be used in the semiconductor industry as a tool for monitoring system performance and lowering fab operating costs by extending component use and scheduling maintenance as needed. With a running-window average of channel capacity, the technique is able to locate the fault in time.
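The central analogy reduces to Shannon's capacity formula, C = log2(1 + S/N), with the residual (measured minus baseline) playing the role of channel noise. The Python sketch below is a hedged illustration of the idea with synthetic signals, not the dissertation's actual metric or data: the capacity-style health index drops as fault-induced "noise" grows.

```python
# Capacity-style health index: baseline power over residual power.
import numpy as np

def capacity_health(measured, baseline):
    residual = measured - baseline            # intrinsic + fault "noise"
    s = np.mean(baseline**2)                  # baseline signal power
    n = np.mean(residual**2)                  # residual (noise) power
    return np.log2(1.0 + s / n)               # bits per sample

t = np.linspace(0.0, 1.0, 4000)
baseline = np.sin(2 * np.pi * 50 * t)         # healthy response
rng = np.random.default_rng(0)
intrinsic = 0.05 * rng.standard_normal(t.size)

healthy = baseline + intrinsic
faulty = baseline + intrinsic + 0.3 * np.sin(2 * np.pi * 180 * t)  # fault term

print(f"healthy: {capacity_health(healthy, baseline):.2f} bits/sample")
print(f"faulty:  {capacity_health(faulty, baseline):.2f} bits/sample")
# A windowed version of the same calculation locates when the fault appears.
```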
