61

Seismic noise : the good, the bad and the ugly

Herrmann, Felix J., Wilkinson, Dave January 2007 (has links)
In this paper, we present a nonlinear curvelet-based sparsity-promoting formulation for three problems related to seismic noise, namely the ‘good’, corresponding to noise generated by random sampling; the ‘bad’, corresponding to coherent noise for which (inaccurate) predictions exist; and the ‘ugly’, for which no predictions exist. We will show that the compressive capabilities of curvelets on seismic data and images can be used to tackle these three categories of noise-related problems.
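The abstract gives no implementation details, but the core idea, promoting sparsity of the data in a transform domain and shrinking small coefficients, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' method: it uses an FFT as a stand-in for the curvelet frame, a toy synthetic trace, and an arbitrary threshold `lam`; all three are assumptions.

```python
import numpy as np

def soft_threshold(c, lam):
    # Proximal operator of the l1 norm, valid for complex coefficients:
    # shrink each coefficient's magnitude by lam, zeroing the small ones.
    mag = np.abs(c)
    return c * np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)

def sparsity_denoise(d, lam):
    # Analyse the noisy trace in an orthonormal transform (FFT standing in
    # for the curvelet frame), shrink small coefficients, synthesise back.
    c = np.fft.rfft(d, norm="ortho")
    return np.fft.irfft(soft_threshold(c, lam), n=d.size, norm="ortho")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 512)
    clean = np.sin(2 * np.pi * 25 * t) * np.exp(-40 * (t - 0.5) ** 2)  # toy wavelet
    noisy = clean + 0.3 * rng.standard_normal(t.size)                  # incoherent noise
    denoised = sparsity_denoise(noisy, lam=0.5)
    print("residual noise, before vs after:",
          float(np.linalg.norm(noisy - clean)), float(np.linalg.norm(denoised - clean)))
```

Plain thresholding only addresses the incoherent ('good') case; the coherent ('bad' and 'ugly') cases require the prediction-driven separation described in the abstract.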
62

Seismic imaging and processing with curvelets

Herrmann, Felix J. January 2007 (has links)
No description available.
63

Surface-related multiple prediction from incomplete data

Herrmann, Felix J. January 2007 (has links)
No description available.
64

Multiple prediction from incomplete data with the focused curvelet transform

Herrmann, Felix J. January 2007 (has links)
Incomplete data represents a major challenge for the successful prediction and subsequent removal of multiples. In this paper, a new method is presented that tackles this challenge in a two-step approach. During the first step, the recently developed curvelet-based recovery by sparsity-promoting inversion (CRSI) is applied to the data, followed by a prediction of the primaries. During the second, high-resolution step, the estimated primaries are used to improve the frequency content of the recovered data by combining the focal transform, defined in terms of the estimated primaries, with the curvelet transform. This focused curvelet transform leads to an improved recovery, which can subsequently be used as input for a second stage of multiple prediction and primary-multiple separation.
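A rough sketch of the first step only (sparsity-promoting recovery of randomly subsampled data) is given below; the focal-transform stage is not shown. The FFT stands in for the curvelet transform, the solver is a plain iterative soft-thresholding (ISTA) loop, and the sampling rate, threshold `lam`, and iteration count are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def ista_recover(y, mask, lam=0.02, n_iter=200):
    # Sparsity-promoting recovery of a trace from randomly subsampled data:
    #   minimise lam*||c||_1 + 0.5*||mask * F^H c - y||_2^2,
    # with an orthonormal FFT standing in for the curvelet frame.
    c = np.zeros(mask.size, dtype=complex)          # transform coefficients
    for _ in range(n_iter):
        x = np.fft.ifft(c, norm="ortho")            # synthesise
        r = mask * x - y                            # residual on observed samples only
        g = np.fft.fft(mask * r, norm="ortho")      # gradient in the coefficient domain
        c = c - g                                   # unit step is safe: operator norm <= 1
        mag = np.abs(c)
        c = c * np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)  # soft threshold
    return np.fft.ifft(c, norm="ortho").real

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 256
    t = np.arange(n) / n
    full = np.cos(2 * np.pi * 12 * t) + 0.5 * np.cos(2 * np.pi * 30 * t)
    mask = (rng.random(n) < 0.5).astype(float)      # keep ~50% of samples at random
    recovered = ista_recover(mask * full, mask)
    print("relative recovery error:",
          np.linalg.norm(recovered - full) / np.linalg.norm(full))
```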
65

Seismology meets compressive sampling

Herrmann, Felix J. January 2007 (has links)
Presented as a success-story lecture at Cyber-Enabled Discovery and Innovation: Knowledge Extraction. For more detail, see https://www.ipam.ucla.edu/programs/cdi2007/
66

Phase transitions in exploration seismology : statistical mechanics meets information theory

Herrmann, Felix J. January 2007 (has links)
In this paper, two different applications of phase transitions to exploration seismology will be discussed. The first application concerns a phase diagram ruling the recovery conditions for seismic data volumes from incomplete and noisy data, while the second phase transition describes the behavior of bi-compositional mixtures as a function of the volume fraction. In both cases, the phase transitions are the result of randomness in large systems of equations in combination with nonlinearity. The seismic recovery problem from incomplete data involves the inversion of a rectangular matrix. Recent results from the field of "compressive sensing" provide the conditions for a successful recovery of functions that are sparse in some basis (wavelet) or frame (curvelet) representation, by means of a sparsity ($\ell_1$-norm) promoting nonlinear program. The conditions for a successful recovery depend on a certain randomness of the matrix and on two parameters that express the matrix's aspect ratio and the ratio of the number of nonzero entries in the coefficient vector for the sparse signal representation to the number of measurements. It appears that the ensemble-averaged success rate for the recovery of the sparse transformed data vector by a nonlinear sparsity-promoting program can be described by a phase transition, demarcating the regions for the two ratios in which recovery of the sparse entries is likely to succeed or likely to fail. Consistent with other phase-transition phenomena, the larger the system, the sharper the transition. The randomness in this example is related to the construction of the matrix, which for the recovery of spike trains corresponds to the randomly restricted Fourier matrix. It is shown that these ideas can be extended to curvelet recovery by sparsity-promoting inversion (CRSI). The second application of phase transitions in exploration seismology concerns the upscaling problem. To counter the intrinsic smoothing of singularities by conventional equivalent-medium upscaling theory, a percolation-based nonlinear switch model is proposed. In this model, the transport properties of bi-compositional mixture models for rocks undergo a sudden macroscopic change as soon as the volume fraction of the stronger material reaches a critical point. At this critical point, the stronger material forms a connected cluster, which leads to the creation of a cusp-like singularity in the elastic moduli, which in turn gives rise to specular reflections. In this model, the reflectivity is no longer explicitly due to singularities in the rock's composition. Instead, singularities are created whenever the volume fraction exceeds the critical point. We will show that this concept can be used for a singularity-preserved lithological upscaling.
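The recovery phase transition mentioned in the abstract can be probed empirically: draw a random matrix, plant a sparse vector, solve the $\ell_1$ program, and record the success rate as the sparsity ratio varies. The sketch below does this with a Gaussian matrix and a linear-programming formulation of basis pursuit; the problem size, trial count, and tolerance are assumptions chosen only to keep the demonstration fast, and the Gaussian matrix replaces the restricted Fourier/curvelet operators discussed in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, b):
    # Basis pursuit  min ||x||_1  s.t.  Ax = b,  posed as a linear program
    # in the split variables x = u - v with u, v >= 0.
    n, N = A.shape
    c = np.ones(2 * N)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
                  bounds=(0, None), method="highs")
    uv = res.x
    return uv[:N] - uv[N:]

def success_rate(N=80, delta=0.5, rho=0.2, trials=20, tol=1e-4, seed=0):
    # Empirical recovery probability at aspect ratio delta = n/N and
    # sparsity ratio rho = k/n: one point of an empirical phase diagram.
    rng = np.random.default_rng(seed)
    n = int(delta * N)
    k = max(1, int(rho * n))
    wins = 0
    for _ in range(trials):
        A = rng.standard_normal((n, N)) / np.sqrt(n)
        x0 = np.zeros(N)
        x0[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
        x_hat = l1_recover(A, A @ x0)
        wins += np.linalg.norm(x_hat - x0) < tol * max(1.0, np.linalg.norm(x0))
    return wins / trials

if __name__ == "__main__":
    # Sweeping rho at fixed delta shows the sharp drop from success to failure.
    for rho in (0.1, 0.3, 0.5, 0.7):
        print(f"rho={rho:.1f}  empirical success={success_rate(rho=rho):.2f}")
```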
67

Spatiotemporal Gene Networks from ISH Images

Puniyani, Kriti 01 September 2013 (has links)
As large-scale techniques for studying and measuring gene expression have been developed, automatically inferring gene interaction networks from expression data has emerged as a popular technique to advance our understanding of cellular systems. Accurate prediction of gene interactions, especially in multicellular organisms such as Drosophila or humans, requires temporal and spatial analysis of gene expression, which is not easily obtainable from microarray data. New image-based techniques using in situ hybridization (ISH) have recently been developed to allow large-scale spatial-temporal profiling of whole-body mRNA expression. However, analysis of such data for discovering new gene interactions still remains an open challenge. This thesis studies the question of predicting gene interaction networks from ISH data in three parts. First, we present SPEX2, a computer vision pipeline to extract informative features from ISH data. Next, we present an algorithm, GINI, for learning spatial gene interaction networks from embryonic ISH images at a single time step. GINI combines multi-instance kernels with recent work in learning sparse undirected graphical models to predict interactions between genes. Finally, we propose NP-MuScL (nonparanormal multi-source learning) to estimate a gene interaction network that is consistent with multiple sources of data having the same underlying relationships between the nodes. NP-MuScL casts the network estimation problem as estimating the structure of a sparse undirected graphical model. We use the semiparametric Gaussian copula to model the distribution of the different data sources, with the different copulas sharing the same covariance matrix, and show how to estimate such a model in the high-dimensional scenario. We apply our algorithms to more than 100,000 Drosophila embryonic ISH images from the Berkeley Drosophila Genome Project. Each of the 6 time steps in Drosophila embryonic development is treated as a separate data source. With spatial gene interactions predicted via GINI, and temporal predictions combined via NP-MuScL, we are finally able to predict spatiotemporal gene networks from these images.
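As a loose sketch of the copula-plus-sparse-graph idea (not the NP-MuScL estimator itself, which shares one covariance across source-specific copulas rather than naively pooling), the snippet below rank-transforms each source to Gaussian scores, stacks the sources, and fits a graphical lasso; the toy data, the regularization weight `alpha`, and the edge threshold are assumptions.

```python
import numpy as np
from scipy.stats import rankdata, norm
from sklearn.covariance import GraphicalLasso

def nonparanormal_scores(X):
    # Map each variable's values to Gaussian scores via ranks
    # (a simple semiparametric Gaussian-copula transform).
    n, p = X.shape
    U = np.column_stack([rankdata(X[:, j]) / (n + 1.0) for j in range(p)])
    return norm.ppf(U)

def sparse_network(X_sources, alpha=0.1):
    # Pool Gaussianised data from several sources (here: time steps) that are
    # assumed to share one graph, then fit a sparse Gaussian graphical model;
    # nonzero off-diagonal precision entries are the predicted edges.
    Z = np.vstack([nonparanormal_scores(X) for X in X_sources])
    return GraphicalLasso(alpha=alpha).fit(Z).precision_

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    p, n = 10, 200
    def one_source():
        B = rng.standard_normal((n, p))
        B[:, 1] = 0.8 * B[:, 0] + 0.6 * rng.standard_normal(n)  # genes 0 and 1 interact
        return np.exp(B)        # monotone distortion handled by the copula step
    sources = [one_source() for _ in range(3)]   # e.g. three developmental time steps
    P = sparse_network(sources)
    edges = np.argwhere(np.triu(np.abs(P) > 1e-3, k=1))
    print("predicted edges:", edges.tolist())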
68

Régression linéaire bayésienne sur données fonctionnelles / Functional Bayesian linear regression

Grollemund, Paul-Marie 22 November 2017 (has links)
The linear regression model is a common tool for a statistician. If a covariate is a curve, we tackle a high-dimensional issue. In this case, sparse models lead to successful inference, for instance by expanding the functional covariate on a smaller-dimensional space. In this thesis, we propose a Bayesian approach, named Bliss, to fit the functional linear regression model. The Bliss model supposes, through the prior, that the coefficient function is a step function. From the posterior, we propose several estimators to be used depending on the context: an estimator of the support and two estimators of the coefficient function, a smooth one and a stepwise one. To illustrate this, we explain the black Périgord truffle yield with the rainfall during the truffle life cycle. The Bliss method succeeds in selecting two relevant periods for truffle development. As another feature of the Bayesian paradigm, the prior distribution enables the integration of preliminary judgments into the statistical inference. For instance, the biologists' knowledge about truffle growth is relevant to inform the Bliss model. To this end, we propose two modifications of the Bliss model to take preliminary judgments into account. First, we indirectly collect preliminary judgments using pseudo-data provided by experts. The proposed prior distribution corresponds to the posterior distribution given the experts' pseudo-data. Furthermore, the effect of each expert and their correlations are controlled with weighting. Secondly, we collect experts' judgments about the most influential periods affecting the truffle yield and whether the effect is positive or negative. The proposed prior distribution relies on a penalization of coefficient functions which do not conform to these judgments. Lastly, the asymptotic behavior of the Bliss method is studied. We validate the proposed approach by showing the posterior consistency of the Bliss model. Using model-specific assumptions, an efficient proof of a Wald theorem is given. The main difficulty is the misspecification of the model, since the true coefficient function is surely not a step function. We show that the posterior distribution contracts on a step function which is the Kullback-Leibler projection of the true coefficient function onto a set of step functions. This step function is derived from the true parameter and the design.
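A minimal sketch of the step-function idea, under assumptions that depart from the Bliss prior: the coefficient function is restricted to a fixed grid of K equal intervals and given a conjugate Gaussian prior, so the posterior is available in closed form. The Bliss model itself places a prior on the number and location of the steps; the grid, prior variances, and toy rainfall-like curves below are illustrative only.

```python
import numpy as np

def step_design(X, K):
    # X: (n, T) functional covariate on a regular grid over [0, 1].
    # Integrating x_i(t) over K equal intervals reduces the functional model
    # with a step-function coefficient to an ordinary linear regression.
    n, T = X.shape
    edges = np.linspace(0, T, K + 1).astype(int)
    return np.column_stack([X[:, a:b].sum(axis=1) / T
                            for a, b in zip(edges[:-1], edges[1:])])

def bayes_linear(Z, y, tau2=10.0, sigma2=1.0):
    # Conjugate Gaussian posterior for the step heights (ridge-like prior),
    # standing in for the Bliss prior on step functions.
    K = Z.shape[1]
    prec = Z.T @ Z / sigma2 + np.eye(K) / tau2
    cov = np.linalg.inv(prec)
    mean = cov @ (Z.T @ y) / sigma2
    return mean, cov

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n, T, K = 100, 60, 6
    X = rng.standard_normal((n, T)).cumsum(axis=1)            # toy rainfall-like curves
    beta_true = np.repeat([0.0, 2.0, 0.0, 0.0, -1.5, 0.0], T // K)  # true step coefficient
    y = X @ beta_true / T + 0.1 * rng.standard_normal(n)
    mean, _ = bayes_linear(step_design(X, K), y)
    print("posterior mean step heights:", np.round(mean, 2))   # should be near the true heights
```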
69

Assessing the reliability, resilience and sustainability of water resources systems in data-rich and data-sparse regions

Headley, Miguel Learie January 2018 (has links)
Uncertainty associated with the potential impact of climate change on supply availability, varied success with demand-side interventions such as water efficiency, and changes in priority relating to hydrometric data collection and ownership have resulted in challenges for water resources system management, particularly in data-sparse regions. Consequently, the aim of this thesis is to assess the reliability, resilience and sustainability of water resources systems in both data-rich and data-sparse regions, with an emphasis on robust decision-making in data-sparse regions. To achieve this aim, new resilience indicators were developed that capture water resources system failure duration and extent of failure (i.e. failure magnitude) from a social and environmental perspective. These performance indicators enabled a comprehensive assessment of a number of performance-enhancing interventions, which resulted in the identification of a set of intervention strategies that showed potential to improve reliability, resilience and sustainability in the case studies examined. Finally, a multi-criteria decision analysis supported trade-off decision making when the reliability, resilience and sustainability indicators were considered in combination. Two case studies were considered in this research: Kingston and St. Andrew in Jamaica and Anyplace in the UK. The Kingston and St. Andrew case study represents the main data-sparse case study, where many assumptions were introduced to fill data gaps. The intervention strategy identified from the Kingston and St. Andrew water resources assessment that showed the greatest potential to improve reliability, resilience and sustainability was the ‘Site A-east’ desalination scheme. To ameliorate the uncertainty and lack of confidence associated with the results, a methodology was developed that transformed a key proportion of the Anyplace water resources system from a data-rich environment to a data-sparse environment. The Anyplace water resources system was then assessed in a data-sparse environment, and the performance trade-offs of the intervention strategies were analysed using four multi-criteria decision analysis (MCDA) weighting combinations. The MCDA facilitated a robust comparison of the interventions' performances in the data-rich and data-sparse case studies. Comparisons showed consistency in the performance of the interventions across data-rich and data-sparse hydrological conditions and serve to demonstrate to decision makers a novel approach to addressing uncertainty when many assumptions have been introduced in the water resources management process due to data sparsity.
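For orientation, the snippet below computes classical time-based reliability and resilience indicators (Hashimoto-style: frequency of non-failure, and the reciprocal of mean failure duration) from a simulated supply/demand series; the thesis' new indicators additionally capture failure magnitude from social and environmental perspectives, and the toy series below is an assumption.

```python
import numpy as np

def reliability_resilience(supply, demand):
    # Classical indicators for a simulated supply/demand time series:
    #   reliability = fraction of time steps without a deficit,
    #   resilience  = 1 / mean duration of failure events.
    failure = supply < demand
    reliability = 1.0 - failure.mean()
    # Split the boolean series into consecutive runs and keep the failure runs.
    changes = np.flatnonzero(np.diff(failure.astype(int)))
    runs = np.split(failure, changes + 1)
    durations = [len(r) for r in runs if r[0]]
    resilience = 1.0 / np.mean(durations) if durations else 1.0
    return reliability, resilience

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    weeks = 520
    demand = np.full(weeks, 100.0)
    supply = 105.0 + 15.0 * rng.standard_normal(weeks)   # toy stochastic yield
    rel, res = reliability_resilience(supply, demand)
    print(f"reliability={rel:.3f}  resilience={res:.3f}")
```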
70

High-Order Sparsity Exploiting Methods with Applications in Imaging and PDEs

January 2016 (has links)
High-order methods are known for their accuracy and computational performance when applied to solving partial differential equations and have widespread use in representing images compactly. Nonetheless, high-order methods have difficulty representing functions containing discontinuities or functions having slow spectral decay in the chosen basis. Certain sensing techniques such as MRI and SAR provide data in terms of Fourier coefficients, and thus prescribe a natural high-order basis. The field of compressed sensing has introduced a set of techniques based on $\ell^1$ regularization that promote sparsity and facilitate working with functions having discontinuities. In this dissertation, high-order methods and $\ell^1$ regularization are used to address three problems: reconstructing piecewise smooth functions from sparse and noisy Fourier data, recovering edge locations in piecewise smooth functions from sparse and noisy Fourier data, and reducing time-stepping constraints when numerically solving certain time-dependent hyperbolic partial differential equations. / Dissertation/Thesis / Doctoral Dissertation Applied Mathematics 2016
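As a generic illustration of the first problem (reconstruction from sparse, noisy Fourier data via $\ell^1$ regularization), and not the dissertation's specific high-order method, the sketch below recovers a piecewise-constant signal by minimizing the $\ell^1$ norm of its finite differences subject to matching a few low-order Fourier coefficients, posed as a linear program; the grid size, band limit, and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def tv_from_fourier(f_hat, freqs, n):
    # Reconstruct a length-n piecewise-constant signal from a few low-order
    # Fourier coefficients by minimising the l1 norm of its finite differences:
    #     min ||D x||_1   subject to   F_obs x = f_hat.
    F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(n)) / n) / np.sqrt(n)
    D = np.eye(n, k=1)[: n - 1] - np.eye(n)[: n - 1]       # forward differences
    m, p = F.shape[0], n - 1
    # LP variables: [x (free), t >= |D x|]; minimise sum(t).
    c = np.concatenate([np.zeros(n), np.ones(p)])
    A_ub = np.block([[D, -np.eye(p)], [-D, -np.eye(p)]])
    b_ub = np.zeros(2 * p)
    A_eq = np.block([[F.real, np.zeros((m, p))], [F.imag, np.zeros((m, p))]])
    b_eq = np.concatenate([f_hat.real, f_hat.imag])
    bounds = [(None, None)] * n + [(0, None)] * p
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:n]

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    n = 128
    x_true = np.where(np.arange(n) < 40, 1.0, 0.0) - 0.5 * (np.arange(n) >= 90)
    freqs = np.arange(16)            # 16 lowest nonnegative frequencies only
    F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(n)) / n) / np.sqrt(n)
    f_hat = F @ (x_true + 0.01 * rng.standard_normal(n))   # noisy band-limited data
    x_rec = tv_from_fourier(f_hat, freqs, n)
    print("max reconstruction error:", np.max(np.abs(x_rec - x_true)))
```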
