41

Estimation de fréquences fondamentales multiples. Application à la séparation de signaux de parole et de musique / Multiple fundamental frequency estimation: application to the separation of speech and music signals

Rosier, Julie 12 1900 (has links) (PDF)
The subject of this thesis is the problem of multiple fundamental frequency estimation for mixtures of speech or music in which the number of sources is unknown. For speech, we propose an iterative method that estimates the fundamental frequencies successively. The voiced/unvoiced nature of the mixtures is captured by a model of the form "sum of harmonic sinusoids + autoregressive noise". Estimation then consists in maximizing a penalized likelihood criterion, which also provides an estimate of the number of sources. For music, we propose three new methods that estimate the fundamental frequencies simultaneously. All three are based on a classification of the spectral peaks of the mixture and differ in their classification technique. All of them estimate the number of sources, take spectral overlap between notes into account, and are therefore applicable to the processing of musical chords.
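For orientation, the following is a minimal sketch of the "harmonic sinusoids + autoregressive noise" model and penalized-likelihood criterion the abstract names; the symbols are illustrative paraphrases, not the thesis's own notation.

```latex
% Illustrative notation, not taken from the thesis.
% Observed mixture with an unknown number K of voiced sources:
x(t) = \sum_{k=1}^{K} \sum_{h=1}^{H_k}
         a_{k,h}\cos\bigl(2\pi h f_k t + \phi_{k,h}\bigr) + e(t),
% where f_k is the fundamental frequency of source k, the inner sum
% runs over its harmonics, and e(t) is autoregressive noise.
% The estimates maximize a penalized likelihood; the penalty term is
% what makes the number of sources K itself estimable:
(\hat K, \hat f_1, \dots, \hat f_{\hat K})
  = \arg\max_{K,\, f_1, \dots, f_K}
    \bigl[\, \log p(x \mid K, f_1, \dots, f_K) - \mathrm{pen}(K) \,\bigr].
```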
42

Spatial prediction of soil properties: the Bayesian Maximum Entropy approach / Prédiction spatiale de propriétés pédologiques : l'approche du Maximum d'Entropie Bayésien

D'Or, Dimitri 13 May 2003 (has links)
Soil properties play an important role in many environmental issues such as diffuse pollution, erosion hazards and precision agriculture. With the development of soil process models and geographical information systems, the need for accurate knowledge about soil properties becomes more acute. However, while the sources of information become more numerous and diversified each year, they rarely provide data that simultaneously have the required level of spatial and attribute accuracy. An important challenge thus consists in combining those data sources as well as possible so as to meet the high accuracy requirements. The Bayesian Maximum Entropy (BME) approach appears to be a strong candidate for this task: it is especially designed for managing simultaneously data of various nature and quality ("hard" and "soft" data, continuous or categorical). It relies on a two-step procedure involving an objective way of obtaining a prior distribution in accordance with the general knowledge at hand (the ME part), and a Bayesian conditionalization step for updating this prior probability density function (pdf) with respect to the specific data collected on the study site. At each prediction location, an entire pdf is obtained, which subsequently allows the easy computation of elaborate statistics chosen for their adequacy with the objectives of the study. In this thesis, the theory of BME is explained in a simplified way using standard probabilistic notation. The recent developments towards categorical variables are incorporated, and an attempt is made to formulate a unified framework for both categorical and continuous variables, thus emphasizing the generality and flexibility of the BME approach. The potential of the method for predicting continuous variables is then illustrated by a series of studies dealing with the soil texture fractions (sand, silt and clay). For the categorical variables, a case study focusing on the prediction of the status of the water table is presented. The use of multiple and sometimes contradictory data sources is also analyzed. Throughout the document, BME is compared to classic geostatistical techniques such as simple, ordinary or indicator kriging. Thorough discussions point out the inconsistencies of those methods and explain how BME solves these problems. Rather than being just another geostatistical technique, BME should be considered a knowledge-processing approach. With BME, practitioners will find a valuable tool for analyzing their spatio-temporal data sets and for providing stakeholders with accurate information about the environmental issues with which they are confronted. Read one of the articles extracted from Chapter V at: D'Or, D., Bogaert, P. and Christakos, G. (2001). Application of the BME Approach to Soil Texture Mapping. Stochastic Environmental Research and Risk Assessment 15(1): 87-100. ©Springer-2001. http://springerlink.metapress.com/app/home/contribution.asp?wasp=cbttlcpaeg1rqmdb4xv2&referrer=parent&backto=issue,6,6;journal,13,29;linkingpublicationresults,1,1
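The two-step procedure described above can be sketched with the standard BME equations; the notation below follows the usual formulation of the method in the literature and is not necessarily the thesis's own.

```latex
% Step 1 (ME): the prior pdf f_G maximizes Shannon entropy subject to
% moment constraints g_j encoding the general knowledge G
% (e.g., mean and covariance):
f_G = \arg\max_f \Bigl[ -\int f(\mathbf{x}) \log f(\mathbf{x})\, d\mathbf{x} \Bigr]
\quad \text{s.t.} \quad
\int g_j(\mathbf{x})\, f(\mathbf{x})\, d\mathbf{x} = \overline{g}_j .
% Step 2 (Bayesian conditionalization): the prior is updated with the
% site-specific hard data and soft data (with support S) to give the
% posterior pdf at the prediction point x_k:
f_K(x_k) = \frac{\int_S f_G(x_k, \mathbf{x}_{\mathrm{hard}}, \mathbf{x}_{\mathrm{soft}})\, d\mathbf{x}_{\mathrm{soft}}}
                {\int_S f_G(\mathbf{x}_{\mathrm{hard}}, \mathbf{x}_{\mathrm{soft}})\, d\mathbf{x}_{\mathrm{soft}}} .
```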
43

Linear demultiple solution based on bottom-multiple generator (BMG) approximation: subsalt example

Oladeinde, Abiola Omobolaji 30 October 2006 (has links)
Significant quantities of hydrocarbons are found in complex salt environments. One of the modern challenges of exploration and production activities is to image below salt. This challenge arises from the complexity of salt structures, weak primaries from the subsalt, and the interference of free-surface multiples with these weak primaries. To effectively process subsalt data, we need to develop a method of attenuating free-surface multiples that preserves the amplitude and phase of primaries and does not introduce artifacts at either near or far offsets. In this thesis, we demonstrate that the weak primaries of the subsalt can be preserved while attenuating free-surface multiples. The method used for the demonstration is the bottom-multiple generator (BMG) reflector approximation. This technique requires that a portion of the data containing only primaries be defined. A multidimensional convolution of this primaries-only portion with the actual data predicts the free-surface multiples, and the prediction is then used to attenuate the free-surface multiples in the actual data. This method is one of the most effective methods for attenuating free-surface multiples; however, it requires muting the data at the BMG location. One of the issues investigated in this thesis is the sensitivity of the BMG demultiple technique when the mute at the BMG location ends up cutting some seismic reflections, which can be the case in complex environments such as the Gulf of Mexico and the Gulf of Guinea, where free-surface multiples interfere with primaries. For this investigation, we generated synthetic data with the 2D elastic finite-difference modeling technique. The synthetic seismic data contain primaries, free-surface multiples, internal multiples, and direct waves, acquired over a 2D geological model that depicts a shallow-water geology. In this thesis, we also investigate whether the first step of the BMG demultiple technique can sufficiently attenuate free-surface multiples. For this investigation, we designed a 2D geological model that depicts the deep offshore environment and again generated synthetic data with 2D elastic finite-difference modeling. From these investigations we reached the following conclusions: the demultiple result is not affected when the mute at the BMG location ends up cutting some primaries; the first step of the BMG demultiple technique is not sufficient on its own; and the weak subsalt primaries are preserved during the demultiple process. We compared shot gathers and zero-offset data before and after the demultiple.
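The prediction step described above, a multidimensional convolution of the primaries-only data with the recorded data, can be sketched as follows. This is an illustrative frequency-domain implementation under simplifying assumptions (sources and receivers on a common surface grid, source wavelet and obliquity factors ignored), not the thesis's code.

```python
import numpy as np

def predict_free_surface_multiples(primaries, data):
    """Illustrative multiple prediction by multidimensional convolution.

    `primaries` and `data` have shape (n_src, n_rcv, n_t): the
    primaries-only portion of the data and the full recorded data.
    Assumes sources and receivers share the same surface positions
    (n_src == n_rcv), so the spatial convolution over the surface
    becomes, per frequency, a matrix product.
    """
    n_t = data.shape[-1]
    P0 = np.fft.rfft(primaries, axis=-1)  # to the frequency domain
    D = np.fft.rfft(data, axis=-1)
    M = np.empty_like(D)
    for iw in range(D.shape[-1]):
        # m(x_s, x_r, w) = sum_x p0(x_s, x, w) * d(x, x_r, w)
        M[:, :, iw] = P0[:, :, iw] @ D[:, :, iw]
    return np.fft.irfft(M, n=n_t, axis=-1)

# The predicted multiples are then adaptively subtracted from `data`.
```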
44

Curvelet-domain multiple elimination with sparseness constraints.

Herrmann, Felix J., Verschuur, Eric January 2004 (has links)
Predictive multiple suppression methods consist of two main steps: a prediction step, in which multiples are predicted from the seismic data, and a subtraction step, in which the predicted multiples are matched with the true multiples in the data. The last step proves crucial in practice: an incorrect adaptive subtraction method will cause multiples to be sub-optimally subtracted, primaries to be distorted, or both. Therefore, we propose a new domain for the separation of primaries and multiples via the Curvelet transform. This transform maps the data into almost orthogonal localized events, each with a directional and spatial-temporal component. The multiples are suppressed by thresholding the input data at those Curvelet components where the predicted multiples have large amplitudes. In this way, the more traditional filtering of predicted multiples to fit the input data is avoided. An initial field-data example shows a considerable improvement in multiple suppression.
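A minimal sketch of this thresholding idea follows, assuming a curvelet transform pair is available from some library: `curvelet_fwd` and `curvelet_inv` are hypothetical placeholders (assumed to return and accept a single coefficient array), and the soft-thresholding rule is an illustrative variant, not the authors' exact scheme.

```python
import numpy as np

def threshold_multiples(data, predicted_multiples,
                        curvelet_fwd, curvelet_inv, scale=1.5):
    """Sketch of curvelet-domain primary/multiple separation.

    The input data are thresholded toward zero at curvelet
    coefficients where the predicted multiples are strong, instead of
    adaptively filtering the prediction to fit the data.
    """
    c_data = curvelet_fwd(data)                 # coefficients of input
    c_mult = curvelet_fwd(predicted_multiples)  # coefficients of prediction
    # Soft-threshold each coefficient, with the threshold set by the
    # local amplitude of the predicted multiples.
    thresh = scale * np.abs(c_mult)
    c_prim = np.sign(c_data) * np.maximum(np.abs(c_data) - thresh, 0.0)
    return curvelet_inv(c_prim)                 # estimated primaries
```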
45

Analyse de descendances : une approche bio-informatique pour estimer le risque d'hypertension et d'obésité / Pedigree analysis: a bioinformatics approach to estimating the risk of hypertension and obesity

Gauthier, François January 2007 (has links)
Thesis digitized by the Division de la gestion de documents et des archives de l'Université de Montréal.
46

Intégration et évaluation de capacités interactives d'un robot humanoïde / Integration and evaluation of the interactive capabilities of a humanoid robot

Rousseau, Vincent January 2011 (has links)
The field of Human-Robot Interaction (HRI) is expanding rapidly. More and more robotic platforms are being deployed to advance the field, and these platforms integrate an ever-growing set of interaction modalities, such as body movements, gesture or object recognition, speech recognition and synthesis, and mobility, in order to make interaction as complete and natural as possible for the human. This, however, also brings growing complexity in integrating these modalities on a single platform. Moreover, since HRI is still a young field, the experimental methodology of most work is limited to proofs of concept tested in the laboratory or in uncontrolled open settings. Few researchers present a structured and rigorous approach to the experimental evaluation of human-robot interaction in open settings, and the result is exploratory research that mainly examines the technological complexity of the interactive modalities to be implemented, rather than the impact of these modalities on the quality of the interactions. The goal of the study presented in this document is to examine the integration of several interactive modalities, such as speech, gestures and mobility, on a humanoid mobile robot, and their effect on the quality of human-robot interactions. More specifically, the study examines the impact of interactive modalities on the robot's ability to attract a person's attention and to engage in an interaction with them. In the experimental scenario, the robot uses speech, facial expressions, head movements, arm gestures and mobility to ask a nearby person for assistance in handing it an object lying on the floor. The underlying hypothesis is that integrating all of these modalities should improve the robot's ability to engage people to interact with it. Experiments were conducted in controlled and uncontrolled settings following two experimental protocols: a study of the modalities within a population, and a study of variation between individuals. Overall, the results show that adding modalities improves the quality of the engagement initiated by the robot, but that particular attention must be paid to how the robot approaches a person, especially for people unfamiliar with it. The observations also indicate that it is easier to obtain significant results in a controlled environment, and they identify avenues of improvement for eventually obtaining such results in uncontrolled settings. Finally, this first project on integrating and evaluating the interactive capabilities of a mobile robot will feed into a next iteration with a more sophisticated robot currently under design.
47

Ocenenie spoločnosti Infineon Technologies AG / Valuation of Infineon Technologies AG

Drotár, Martin January 2009 (has links)
The aim of this thesis is the valuation of the public company as a whole as well as on a per-share basis (intrinsic value), and the comparison of these values with values based on the market price. Cash-flow methods and market multiples were used in the valuation. Apart from the valuation itself, the author also analyses the impact of the global economic crisis on the value of the company.
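As an illustration of the cash-flow side of such a valuation, here is a minimal discounted-cash-flow sketch with a Gordon-growth terminal value; all figures and parameters are placeholder inputs, not numbers from the thesis.

```python
def dcf_value_per_share(fcf_forecast, wacc, terminal_growth,
                        net_debt, shares_outstanding):
    """Enterprise value from forecast free cash flows, then equity
    value per share (intrinsic value)."""
    # Present value of the explicit forecast period.
    pv_explicit = sum(
        fcf / (1 + wacc) ** t
        for t, fcf in enumerate(fcf_forecast, start=1)
    )
    # Gordon-growth terminal value, discounted to today.
    tv = fcf_forecast[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    pv_terminal = tv / (1 + wacc) ** len(fcf_forecast)
    equity_value = pv_explicit + pv_terminal - net_debt
    return equity_value / shares_outstanding

# Placeholder inputs (currency in millions, shares in millions):
print(dcf_value_per_share(
    fcf_forecast=[300, 350, 400, 430, 450],
    wacc=0.09, terminal_growth=0.02,
    net_debt=1200, shares_outstanding=1086))
```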
48

Stellenwert des Heavy Light Chain Assays in Diagnostik und Therapiemonitoring des Multiplen Myeloms – Vergleich mit konventionellen Analysen und minimaler Resterkrankung / Benefit of using the Heavy Light Chain Assay in diagnosis and therapy monitoring of multiple myeloma patients - comparison with conventional methods and minimal residual disease assessment

Lüthen, Julia January 2021 (has links) (PDF)
Das Multiple Myelom ist eine komplexe Erkrankung, deren Tumorbiologie noch immer nicht in Gänze verstanden ist. Mit dem Heavy Light Chain Assay (Hevylite®) war es erstmals möglich, mit spezifischen Antikörpern nicht nur zwischen den Klassen intakter Immunglobuline, sondern auch zwischen kappa- und lambda-Isotyp zu differenzieren. Dies ist in der Behandlung von Patient*innen mit Multiplem Myelom sehr nützlich, um das vom Tumor produzierte klonale Immunglobulin von den funktionalen Immunglobulinen getrennt zu quantifizieren. Dadurch sollen die Tumorlast und die einhergehende Immunsuppression genauer erfasst werden. Den zusätzlichen Nutzen für Diagnostik und Therapiemonitoring des Multiplen Myeloms untersuchen wir in dieser Arbeit anhand von Daten einer multizentrischen, randomisierten Phase 3-Medikamentenstudie (DSMM XIV) mit dem Vorteil, hierdurch eine große und weitgehend einheitlich behandelte Kohorte und Zugang zu modernen Messmethoden zu haben. Wir bestätigen, dass das Heavy Light Chain Assay insbesondere zur Erkennung von IgA-Myelomen eine hohe Sensitivität bei negativer Serumproteinelektrophorese hat. Weiterhin zeigen wir, dass je nach Zeitpunkt in der Therapie das Heavy Light Chain Assay ein höheres Risiko für einen Progress vorhersagt als bisher verwendete Methoden. Signifikante Unterschiede im progressionsfreien Überleben finden wir nicht nur je nach Höhe der kappa/lambda Heavy Light Chain-Ratio des involvierten Immunglobulins, sondern auch bei Suppression der nicht involvierten Heavy Light Chain. Zudem beschreiben wir eine hohe Korrelation zwischen hoch abnormaler kappa/lambda Heavy Light Chain-Ratio des involvierten Immunglobulins und positivem Minimal Residual Disease Status in der Durchflusszytometrie. Wir empfehlen daher anhand unserer Ergebnisse, dass das Heavy Light Chain Assay einen Platz in der diagnostischen Routine erhält und als prognostischer Faktor zusätzlich in die Response-Kriterien integriert wird. / Multiple myeloma is a complex disease whose tumor biology is not yet fully understood. Using the specific antibodies of the Heavy Light Chain Assay (Hevylite®), it became possible for the first time to differentiate not only between the classes of intact immunoglobulins but also between the kappa and lambda isotypes. This is very useful in the treatment of patients with multiple myeloma, in order to quantify the tumor-derived clonal immunoglobulin and the functional immunoglobulins separately, allowing the tumor burden and the associated immunosuppression to be assessed more precisely. In this work, we investigate the additional benefit for the diagnosis and therapy monitoring of multiple myeloma patients based on data from a multicentric, randomized phase 3 drug trial (DSMM XIV), which gives us the advantage of a large and largely uniformly treated cohort and access to modern measurement methods. We confirm that the Heavy Light Chain Assay has a high sensitivity, especially for the detection of IgA myeloma with negative serum protein electrophoresis. Furthermore, we show that, depending on the point in therapy, the Heavy Light Chain Assay predicts a higher risk of progression than previously used methods. We find significant differences in progression-free survival depending not only on the level of the kappa/lambda Heavy Light Chain ratio of the involved immunoglobulin, but also on the suppression of the non-involved Heavy Light Chain. We also describe a high correlation between a highly abnormal kappa/lambda Heavy Light Chain ratio of the involved immunoglobulin and a positive minimal residual disease status in flow cytometry. Based on our results, we recommend that the Heavy Light Chain Assay be given a place in the diagnostic routine and be integrated into the response criteria as an additional prognostic factor.
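For orientation, the HLC ratio at the center of the study is a simple quotient of the involved and uninvolved isotype concentrations. The sketch below uses a placeholder reference interval for the demo; it is not a clinical cutoff from the thesis or the assay.

```python
def hlc_ratio(involved_g_per_l, uninvolved_g_per_l):
    """Kappa/lambda heavy-light-chain ratio, e.g. IgA-kappa / IgA-lambda
    for an IgA-kappa myeloma."""
    return involved_g_per_l / uninvolved_g_per_l

PLACEHOLDER_NORMAL_RANGE = (0.8, 2.0)  # assumed interval, illustration only

def is_abnormal(ratio, normal_range=PLACEHOLDER_NORMAL_RANGE):
    lo, hi = normal_range
    return ratio < lo or ratio > hi

# A highly abnormal ratio of the involved isotype pair, or suppression
# of the uninvolved HLC, are the patterns the study links to higher
# progression risk.
print(is_abnormal(hlc_ratio(32.0, 0.4)))  # -> True
```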
49

Linearized inversion frameworks toward high-resolution seismic imaging

Aldawood, Ali 09 1900 (has links)
Seismic exploration utilizes controlled sources, which emit seismic waves that propagate through the earth's subsurface and get reflected off subsurface interfaces and scatterers. The reflected and scattered waves are recorded by recording stations installed along the earth's surface or down boreholes. Seismic imaging is a powerful tool to map this reflected and scattered energy back to its subsurface scattering or reflection points. Seismic imaging is conventionally based on the single-scattering assumption, where only energy that bounces once off a subsurface scatterer and is recorded by a receiver is projected back to its subsurface position. The internally multiply scattered seismic energy is considered unwanted noise and is usually suppressed or removed from the recorded data. Conventional seismic imaging techniques yield subsurface images that suffer from low spatial resolution, migration artifacts, and acquisition fingerprint due to the limited acquisition aperture, number of sources and receivers, and bandwidth of the source wavelet. Hydrocarbon traps are becoming more challenging, and considerable reserves are trapped in stratigraphic and pinch-out traps, which require highly resolved seismic images to delineate them. This thesis focuses on developing and implementing new, advanced, cost-effective seismic imaging techniques that aim to enhance the resolution of the migrated images by exploiting the sparseness of the subsurface reflectivity distribution and utilizing the multiples that are usually neglected when imaging seismic data. I first formulate the seismic imaging problem as a basis pursuit denoising problem, which I solve using an L1-minimization algorithm to obtain the sparsest migrated image corresponding to the recorded data. Imaging multiples may illuminate subsurface zones that are not easily illuminated by conventional seismic imaging using primary reflections only. I then develop an L2-norm (i.e., least-squares) inversion technique to image internally multiply scattered seismic waves and obtain highly resolved images delineating vertical faults that are otherwise not easily imaged by primaries. Seismic interferometry is conventionally based on the cross-correlation and convolution of seismic traces to transform seismic data from one acquisition geometry to another. The conventional interferometric transformation yields virtual data that suffer from low temporal resolution, wavelet distortion, and correlation/convolution artifacts. I therefore incorporate a least-squares datuming technique to interferometrically transform vertical-seismic-profile surface-related multiples to surface-seismic-profile primaries. This yields redatumed data with high temporal resolution and fewer artifacts, which are subsequently imaged to obtain highly resolved subsurface images. Tests on synthetic examples demonstrate the efficiency of the proposed techniques, yielding highly resolved migrated sections compared with images obtained by imaging conventionally redatumed data. I further advance the recently developed cost-effective Generalized Interferometric Multiple Imaging procedure, which aims to image not only first-order but also higher-order multiples. I formulate this procedure as a linearized inversion framework and solve it as a least-squares problem. Tests of the least-squares Generalized Interferometric Multiple Imaging framework on synthetic datasets demonstrate that it can provide highly resolved migrated images and delineate vertical fault planes better than the standard procedure. The results support the assertion that this linearized inversion framework can illuminate subsurface zones that are mainly illuminated by internally scattered energy.
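As a sketch of the basis pursuit denoising formulation mentioned above: the abstract does not name the specific L1 solver, so the following uses iterative soft-thresholding (ISTA) on a toy linearized modeling operator as one standard choice; it is an illustration, not the thesis's implementation.

```python
import numpy as np

def ista_migrate(L, d, lam=0.1, step=None, n_iter=200):
    """Sparsity-promoting migration via iterative soft-thresholding
    (ISTA), solving the Lagrangian form of basis pursuit denoising,
    min_m 0.5*||L m - d||_2^2 + lam*||m||_1, for a linearized modeling
    (demigration) matrix L and recorded data d."""
    if step is None:
        step = 1.0 / np.linalg.norm(L, 2) ** 2   # 1 / Lipschitz constant
    m = np.zeros(L.shape[1])
    for _ in range(n_iter):
        g = L.T @ (L @ m - d)                    # gradient of data misfit
        z = m - step * g
        # Soft-threshold to promote a sparse reflectivity image.
        m = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return m

# Toy usage: a random operator and a 3-spike reflectivity model.
rng = np.random.default_rng(0)
L = rng.standard_normal((80, 200))
m_true = np.zeros(200)
m_true[[30, 90, 150]] = [1.0, -0.7, 0.5]
d = L @ m_true + 0.01 * rng.standard_normal(80)
m_est = ista_migrate(L, d, lam=0.05)
```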
50

This is Not a Pipe

Sparks, Brittany 01 May 2020 (has links)
The artist discusses her Master of Fine Arts exhibition, This is Not a Pipe, held at the Tipton Gallery in downtown Johnson City. Exhibition dates are from February 20 through February 28, 2020. The author provides insight into concepts and influences relating to the creation of the exhibition while offering a succinct perspective on her intimate connection with process.
