21

Multiple prediction from incomplete data with the focused curvelet transform

Herrmann, Felix J., Wang, Deli, Hennenfent, Gilles January 2007 (has links)
Incomplete data represents a major challenge for a successful prediction and subsequent removal of multiples. In this paper, a new method is presented that tackles this challenge in a two-step approach. During the first step, the recently developed curvelet-based recovery by sparsity-promoting inversion (CRSI) is applied to the data, followed by a prediction of the primaries. During the second, high-resolution step, the estimated primaries are used to improve the frequency content of the recovered data by combining the focal transform, defined in terms of the estimated primaries, with the curvelet transform. This focused curvelet transform leads to an improved recovery, which can subsequently be used as input for a second stage of multiple prediction and primary-multiple separation.
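For orientation only (this is the standard way CRSI-type recovery is usually written, not a formulation taken from the abstract): with y the acquired data, R the acquisition restriction operator, C the curvelet transform, and ε a noise tolerance, the missing-data recovery is a sparsity-promoting program,

```latex
\tilde{\mathbf{x}} \;=\; \arg\min_{\mathbf{x}} \;\|\mathbf{x}\|_{1}
\quad\text{subject to}\quad
\|\mathbf{y}-\mathbf{R}\,\mathbf{C}^{H}\mathbf{x}\|_{2}\;\le\;\epsilon,
\qquad
\tilde{\mathbf{f}} \;=\; \mathbf{C}^{H}\tilde{\mathbf{x}}.
```

In the focused variant described above, the curvelet synthesis C^H is composed with the focal operator defined by the estimated primaries, so that focusing further compresses the representation before recovery.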
22

Curvelet imaging and processing : adaptive multiple elimination

Herrmann, Felix J., Verschuur, Eric January 2004 (has links)
Predictive multiple suppression methods consist of two main steps: a prediction step, in which multiples are predicted from the seismic data, and a subtraction step, in which the predicted multiples are matched with the true multiples in the data. The last step appears crucial in practice: an incorrect adaptive subtraction method will cause multiples to be sub-optimally subtracted or primaries to be distorted, or both. Therefore, we propose a new domain for the separation of primaries and multiples via the Curvelet transform. This transform maps the data into almost orthogonal localized events with a directional and spatial-temporal component. The multiples are suppressed by thresholding the input data at those Curvelet components where the predicted multiples have large amplitudes. In this way, the more traditional filtering of predicted multiples to fit the input data is avoided. An initial field data example shows a considerable improvement in multiple suppression.
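A minimal sketch of the thresholding idea just described, not the authors' implementation: the curvelet forward/inverse pair is passed in as placeholder callables (any curvelet library could supply them), and the scale factor k that turns predicted-multiple magnitudes into a threshold is an illustrative assumption.

```python
import numpy as np

def threshold_separation(data, predicted_multiples, forward, inverse, k=2.0):
    """Estimate primaries by muting coefficients where predicted multiples are strong.

    `forward`/`inverse` are placeholders for a curvelet analysis/synthesis pair
    that return/accept a flat coefficient array; `k` scales the predicted-multiple
    magnitudes into a per-coefficient threshold.
    """
    c_data = forward(data)                 # curvelet coefficients of the input data
    c_mult = forward(predicted_multiples)  # coefficients of the predicted multiples
    # Keep a data coefficient only where it dominates the predicted multiple
    # at the same (scale, angle, position) index; zero it otherwise.
    keep = np.abs(c_data) > k * np.abs(c_mult)
    return inverse(c_data * keep)          # estimated primaries
```

This mirrors the abstract's point: no matching filter is estimated to fit the predicted multiples to the data; the prediction only steers where the coefficients are muted.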
23

Curvelets And The Radon Transform

Dickerson, Jill 01 January 2013 (has links)
Computed Tomography (CT) is the standard in the medical imaging field. In this study, we look at the curvelet transform in an attempt to use it as a basis for representing a function. In doing so, we seek a way to reconstruct a function from Radon data that may produce clearer results. Using curvelet decomposition, any known function can be represented as a sum of curvelets with corresponding coefficients. It can be shown that these coefficients can be found from the Radon data, even if the function itself is unknown. The use of curvelets has the potential to address partial or truncated Radon data problems. As a result, using a curvelet representation to invert Radon data offers the chance of producing higher-quality images. This paper examines this method of reconstruction for CT. A brief history of CT, an introduction to the theory behind the method, and implementation details are provided.
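As background (this tight-frame expansion is standard curvelet theory, not specific to the thesis): representing a function as a sum of curvelets with corresponding coefficients means

```latex
f \;=\; \sum_{\mu} c_{\mu}\,\varphi_{\mu},
\qquad
c_{\mu} \;=\; \langle f,\varphi_{\mu}\rangle ,
```

where {φ_μ} is the curvelet frame indexed by scale, orientation, and position. The observation stated above is that the coefficients c_μ can be evaluated from the Radon data of f without access to f itself, which is what makes curvelet-based inversion of partial or truncated Radon data possible.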
24

Surface related multiple prediction from incomplete data

Herrmann, Felix J. January 2007 (has links)
Incomplete data, unknown source-receiver signatures and free-surface reflectivity represent challenges for a successful prediction and subsequent removal of multiples. In this paper, a new method is presented that tackles these challenges by combining what we know about wavefield (de-)focussing, expressed through weighted convolutions/correlations, with the recently developed curvelet-based recovery by sparsity-promoting inversion (CRSI). With this combination, we are able to leverage recent insights from wave physics towards a nonlinear formulation of the multiple-prediction problem that works for incomplete data and without detailed knowledge of the surface effects.
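For context (this is the standard surface-related multiple feedback model of Berkhout and Verschuur written per frequency; the notation is assumed here, not taken from the abstract): the weighted convolutions/correlations correspond to frequency-slice matrix products between data and primaries,

```latex
\mathbf{P} \;=\; \mathbf{P}_0 \;+\; \mathbf{P}_0\,\mathbf{A}\,\mathbf{P}
\qquad\Longrightarrow\qquad
\mathbf{M} \;=\; \mathbf{P}-\mathbf{P}_0 \;=\; \mathbf{P}_0\,\mathbf{A}\,\mathbf{P},
```

with P the recorded data matrix, P0 the primaries, M the surface-related multiples, and A a surface operator absorbing the unknown source-receiver signatures and free-surface reflectivity. The role of CRSI in the abstract is to supply the densely sampled, sparsely represented data that this kind of prediction needs when the acquisition is incomplete.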
25

Recent developments in curvelet-based seismic processing

Herrmann, Felix J. January 2007 (has links)
No description available.
26

Seismic data processing with curvelets: a multiscale and nonlinear approach

Herrmann, Felix J. January 2007 (has links)
In this abstract, we present a nonlinear curvelet-based sparsity-promoting formulation of a seismic processing flow, consisting of the following steps: seismic data regularization and the restoration of migration amplitudes. We show that the curvelet's wavefront detection capability and invariance under the migration-demigration operator lead to a formulation that is stable under noise and missing data.
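A minimal iterative soft-thresholding sketch of the kind of sparsity-promoting solver such a curvelet-based processing flow relies on; the forward/adjoint operator pair (for regularization, think sampling composed with curvelet synthesis), the step size, and the threshold are placeholders, not the paper's actual algorithm or parameters.

```python
import numpy as np

def soft(x, t):
    """Soft thresholding that also handles complex-valued coefficients."""
    mag = np.abs(x)
    return x * np.maximum(mag - t, 0.0) / np.maximum(mag, 1e-30)

def ista(y, A, At, lam, n_iter=100, step=1.0):
    """Minimize 0.5 * ||y - A x||_2^2 + lam * ||x||_1 by iterative soft thresholding.

    `A` / `At` are placeholder forward/adjoint callables acting on curvelet
    coefficient vectors; `step` should not exceed 1 / ||A||^2 for convergence.
    """
    x = soft(At(y), step * lam)          # start from a thresholded adjoint solution
    for _ in range(n_iter):
        r = y - A(x)                     # data residual
        x = soft(x + step * At(r), step * lam)
    return x
```

The recovered physical signal is then obtained by applying curvelet synthesis to the estimated coefficient vector x.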
27

Algoritmos computacionais baseados em coeficientes curvelet aplicados na descrição de textura em mamogramas (Computational algorithms based on curvelet coefficients applied to texture description in mammograms)

Bruno, Daniel Otávio Tambasco January 2013 (has links)
Advisor: Marcelo Zanchetta do Nascimento / Master's dissertation - Universidade Federal do ABC, Graduate Program in Information Engineering (Programa de Pós-Graduação em Engenharia da Informação), 2013
28

Curvelet-domain preconditioned "wave-equation" depth-migration with sparseness and illumination constraints

Herrmann, Felix J., Moghaddam, Peyman P. January 2004 (has links)
A non-linear edge-preserving solution to the least-squares migration problem with sparseness and illumination constraints is proposed. The applied formalism explores Curvelets as basis functions. By virtue of their sparseness and locality, Curvelets not only reduce the dimensionality of the imaging problem but they also naturally lead to a dense preconditioning that almost diagonalizes the normal/Hessian operator. This near-diagonalization allows us to recast the imaging problem into a 'simple' denoising problem. As such, we are in a position to use non-linear estimators based on thresholding. These estimators exploit the sparseness and locality of Curvelets and allow us to compute a first estimate for the reflectivity, which approximates the least-squares solution of the seismic inverse scattering problem. Given this estimate, we impose sparseness and additional amplitude corrections by solving a constrained optimization problem. This optimization problem is initialized and constrained by the thresholded image and is designed to remove remaining imaging artifacts and imperfections in the estimation and reconstruction.
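One compact way to write the idea (the notation is assumed for illustration, not taken from the abstract): with d the data, K the demigration operator, C the curvelet transform, and D a diagonal curvelet-domain scaling, the near-diagonalization and the thresholded first estimate read

```latex
\mathbf{K}^{T}\mathbf{K} \;\approx\; \mathbf{C}^{T}\mathbf{D}\,\mathbf{C}
\qquad\Longrightarrow\qquad
\tilde{\mathbf{m}} \;\approx\; \mathbf{C}^{T}\,T_{\lambda}\!\bigl(\mathbf{D}^{-1}\mathbf{C}\,\mathbf{K}^{T}\mathbf{d}\bigr),
```

where T_λ denotes thresholding of the curvelet coefficients of the migrated image. The constrained optimization stage described above then refines this first estimate by imposing sparseness and additional amplitude corrections.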
29

Curvelet denoising of 4d seismic

Bayreuther, Moritz, Cristall, Jamin, Herrmann, Felix J. January 2004 (has links)
With burgeoning world demand and a limited rate of discovery of new reserves, there is increasing impetus on the industry to optimize recovery from already existing fields. 4D, or time-lapse, seismic imaging is an emerging technology that holds great promise to better monitor and optimize reservoir production. The basic idea behind 4D seismic is that when multiple 3D surveys are acquired at separate calendar times over a producing field, the reservoir geology will not change from survey to survey but the state of the reservoir fluids will change. Thus, taking the difference between two 3D surveys should remove the static geologic contribution to the data and isolate the time-varying fluid flow component. However, a major challenge in 4D seismic is that acquisition and processing differences between 3D surveys often overshadow the changes caused by fluid flow. This problem is compounded when 4D effects must be derived from vintage 3D data sets that were not originally acquired with 4D in mind. The goal of this study is to remove the acquisition and imaging artefacts from a 4D seismic difference cube using Curvelet processing techniques.
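Schematically (symbols assumed for illustration, not from the abstract): with baseline and monitor surveys d_base and d_mon, the time-lapse signal is the difference cube, and the Curvelet processing referred to above amounts to shrinking its curvelet coefficients,

```latex
\Delta\mathbf{d} \;=\; \mathbf{d}_{\mathrm{mon}} - \mathbf{d}_{\mathrm{base}},
\qquad
\widetilde{\Delta\mathbf{d}} \;=\; \mathbf{C}^{T}\,T_{\lambda}\!\left(\mathbf{C}\,\Delta\mathbf{d}\right),
```

with C the curvelet transform and T_λ a thresholding operator, the premise being that coherent, wavefront-like 4D changes are sparse in the curvelet domain while non-repeatable acquisition and imaging artefacts are not.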
30

A parallel windowed fast discrete curvelet transform applied to seismic processing

Thomson, Darren, Hennenfent, Gilles, Modzelewski, Henryk, Herrmann, Felix J. January 2006 (has links)
We propose using overlapping, tapered windows to process seismic data in parallel. The method consists of numerically tight linear operators and adjoints that are suitable for use in iterative algorithms; it is also highly scalable and makes parallel processing of large seismic data sets feasible. We use this scheme to define the Parallel Windowed Fast Discrete Curvelet Transform (PWFDCT), which we apply to a seismic data interpolation algorithm. The successful performance of our parallel processing scheme and algorithm on a two-dimensional synthetic data set is shown.
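A minimal 1-D sketch of overlapping, tapered windowing with a numerically tight forward/adjoint pair, in the spirit of the scheme described above; the window length, hop, and cosine-type taper are illustrative choices, not the PWFDCT's actual parameters, and the curvelet transform that would be applied inside each window is omitted.

```python
import numpy as np

def windowed_ops(n, win, hop):
    """Overlapping, tapered 1-D windowing with exact adjoint reconstruction.

    Returns (forward, adjoint): `forward` cuts a length-n signal into tapered,
    overlapping windows; `adjoint` re-applies the tapers and overlap-adds.
    The tapers are normalized so that adjoint(forward(x)) == x, i.e. the pair
    is numerically tight.
    """
    starts = list(range(0, n - win + 1, hop))
    if starts[-1] + win < n:                 # make sure the last window reaches the end
        starts.append(n - win)
    base = np.sin(np.pi * (np.arange(win) + 0.5) / win)   # smooth cosine-type taper
    # Accumulate the pointwise sum of squared tapers, then normalize so it equals one.
    ssq = np.zeros(n)
    for s in starts:
        ssq[s:s + win] += base ** 2
    tapers = [base / np.sqrt(ssq[s:s + win]) for s in starts]

    def forward(x):
        return [t * x[s:s + win] for s, t in zip(starts, tapers)]

    def adjoint(patches):
        y = np.zeros(n)
        for s, t, p in zip(starts, tapers, patches):
            y[s:s + win] += t * p
        return y

    return forward, adjoint

# Sanity check: the adjoint of the forward windowing recovers the input exactly.
fwd, adj = windowed_ops(n=1000, win=128, hop=64)
x = np.random.randn(1000)
assert np.allclose(adj(fwd(x)), x)
```

Because each windowed patch can be transformed and processed independently before the overlap-add, the patches map naturally onto separate processes or nodes, which is the scalability argument made in the abstract.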
