2

Imaging bone fractures using ultrasonic scattered wavefields: numerical and in-vitro studies

Li, Hongjiang 11 1900 (has links)
Ultrasound is widely used in medical diagnostic imaging to image soft tissues. Compared with other modalities, it offers no ionizing radiation, easy portability, low cost, and the ability to provide elasticity information. However, conventional ultrasound images become distorted when the ultrasound beam is not normal to the bone structures. In this thesis, we present two imaging algorithms, reverse time migration (RTM) and split-step Fourier migration (SSFM), to image long bones using ultrasound. The methods are tested on simulated data sets. The reconstructed images show accurate cortical thickness measurements and recover the correct fracture dip. The images also clearly illustrate the healing process of a 1-mm-wide crack, with different in-filled tissue velocities simulating stages of fracture healing. Two in-vitro examples using fractured bones are also presented. The study shows that the migration methods have great potential to quantify bone fractures and monitor the fracture healing process.
3

Multisource Least-squares Reverse Time Migration

Dai, Wei 12 1900 (has links)
Least-squares migration has been shown to produce high-quality migration images, but its computational cost is considered too high for practical imaging. In this dissertation, a multisource least-squares reverse time migration (LSRTM) algorithm is proposed that increases computational efficiency by up to a factor of ten by using the blended-source processing technique. There are three main chapters in this dissertation. In Chapter 2, the multisource LSRTM algorithm is implemented with random time-shift and random source-polarity encoding functions. Numerical tests on the 2D HESS VTI data show that the multisource LSRTM algorithm suppresses migration artifacts, balances the amplitudes, improves image resolution, and reduces crosstalk noise associated with the blended shot gathers. For this example, multisource LSRTM is about three times faster than the conventional RTM method. For the 3D example of the SEG/EAGE salt model, at comparable computational cost, multisource LSRTM produces images with more accurate amplitudes, better spatial resolution, and fewer migration artifacts than conventional RTM. The empirical results suggest that multisource LSRTM can produce more accurate reflectivity images than conventional RTM at similar or lower computational cost. The caveat is that the LSRTM image is sensitive to large errors in the migration velocity model. In Chapter 3, the multisource LSRTM algorithm is implemented with a frequency-selection encoding strategy and applied to marine streamer data, for which traditional random encoding functions are not applicable. The frequency-selection encoding functions are delta functions in the frequency domain, so that all the encoded shots have unique, non-overlapping frequency content. Therefore, the receivers can distinguish the wavefield from each shot according to its frequencies.
With the frequency-selection encoding method, the computational efficiency of LSRTM is increased so that its cost is comparable to that of conventional RTM in the examples of the Marmousi2 model and a field data set from the Gulf of Mexico. With more iterations, the LSRTM image quality improves further. The numerical results suggest that LSRTM with frequency selection is an efficient method for producing better reflectivity images than conventional RTM. In Chapter 4, I present an interferometric method for extracting the diffraction signals that emanate from diffractors, also denoted seismic guide stars. The signal-to-noise ratio of these interferometric diffractions is enhanced by a factor of √N, where N is the number of source points coincident with the receiver points. Diffractions from subsalt guide stars can thus be rendered visible and used for velocity analysis, migration, and focusing of subsalt reflections. Both synthetic and field data records are used to demonstrate the benefits and limitations of this method.
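The frequency-selection idea described in this abstract can be sketched in a few lines: each encoded shot keeps only its own disjoint set of DFT bins (mirrored to remain real-valued), so a blended record can be separated exactly by frequency. This is an illustrative toy with random traces standing in for wavefields, not code from the dissertation:

```python
import numpy as np

def shot_mask(k, nt, nshots):
    """DFT bins assigned to shot k: disjoint across shots, and mirrored
    (Hermitian-symmetric) so the encoded time signals remain real-valued."""
    mask = np.zeros(nt, dtype=bool)
    for f in range(1, nt // 2):
        if (f - 1) % nshots == k:
            mask[f] = mask[nt - f] = True
    return mask

rng = np.random.default_rng(0)
nt, nshots = 256, 4
shots = rng.standard_normal((nshots, nt))   # stand-ins for per-shot records

# Encode: each shot keeps only its own, non-overlapping frequency bins.
encoded = np.array(
    [np.fft.ifft(np.fft.fft(shots[k]) * shot_mask(k, nt, nshots)).real
     for k in range(nshots)])

blended = encoded.sum(axis=0)               # one blended record at the receiver

# Decode shot 2 from the blended record by selecting only its bins.
decoded = np.fft.ifft(np.fft.fft(blended) * shot_mask(2, nt, nshots)).real
assert np.allclose(decoded, encoded[2])     # exact separation by frequency
```

Because the bin sets do not overlap, the decoding step recovers each encoded shot exactly, which is what makes the strategy applicable to marine streamer data where random encoding fails.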
4

Imaging of complex media with elastic wave equations

Luquel, Jérôme 16 April 2015 (has links)
Since a large number of sedimentary basins have already been explored, oil exploration is now investigating regions of the Earth that are harder to access, and techniques are needed to guarantee the efficiency of a drilling campaign. Among existing methods for seismic imaging, Reverse Time Migration (RTM) is known by industry for its accuracy. RTM uses reflected waves to construct a map of the subsurface depicting the interfaces between geophysical layers. The algorithm can be described as a three-step procedure: (i) compute the wavefields emitted by the sources used during the seismic acquisition campaign; (ii) for each source, compute the backpropagated wavefield, obtained by using as sources the signals recorded at the receivers and reversing time; (iii) form an image of the subsurface by applying an imaging condition that combines the propagated and backpropagated wavefields at each time step of the numerical scheme and for each source. This technique is computationally intensive, and it is still difficult to image realistic 3D elastic media, even with the help of high-performance computing. We therefore chose high-order Discontinuous Galerkin methods for the propagation step, since they provide accurate solutions and are well suited to parallel computing. Because a large number of wavefields must be correlated, an algorithm is needed to reduce the CPU time and storage: Griewank's algorithm, known as "optimal checkpointing". With this cost issue addressed, the accuracy of the image can be improved by fully exploiting elastic waves with multiple wave modes. The traditional imaging condition proposed by J. Claerbout does not take wave conversions into account and is thus mainly useful in the acoustic case; since P-waves and S-waves interact with each other, it is relevant to use an imaging condition that includes these interactions. This has been done successfully by A. Tarantola and J. Tromp in the adjoint-state framework for seismology applications based on inversion of the global Earth, and this work adapts it to RTM. We propose a new imaging condition that uses the elastic parameters of the medium and attenuates numerical artifacts, and we illustrate its properties on industrial benchmarks such as the Marmousi model, comparing it against other imaging conditions with image quality as the criterion.
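For reference, the classical Claerbout imaging condition that this thesis improves upon is a zero-lag cross-correlation of the forward and backpropagated wavefields; the elastic condition proposed in the thesis goes beyond it by accounting for wave-mode conversions. A minimal sketch of the classical condition, with random arrays standing in for wavefields a real solver would produce:

```python
import numpy as np

# Zero-lag cross-correlation imaging condition: I(x) = sum_t s(x, t) * r(x, t),
# where s is the forward-propagated source wavefield and r is the
# backpropagated receiver wavefield (step (iii) of the RTM procedure).
nx, nz, nt = 64, 64, 100
rng = np.random.default_rng(1)
fwd = rng.standard_normal((nt, nz, nx))   # source-side wavefield snapshots
bwd = rng.standard_normal((nt, nz, nx))   # receiver-side wavefield snapshots

image = np.zeros((nz, nx))
for it in range(nt):                      # accumulate the correlation in time
    image += fwd[it] * bwd[it]

# Equivalent one-liner contracting over the time axis:
assert np.allclose(image, np.einsum('tij,tij->ij', fwd, bwd))
```

In practice the time loop is what makes checkpointing necessary: the forward snapshots are not all kept in memory but recomputed from stored checkpoints while the backpropagation runs.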
5

Implementation of the RTM algorithm for seismic processing on unconventional architectures

Lima, Igo Pedro de 16 June 2014 (has links)
With the growth of energy consumption worldwide, conventional reservoirs, those considered easy to explore and produce, are no longer meeting global energy demand. This has led many researchers to develop projects addressing these needs, and companies in the oil sector have invested in techniques that help locate and drill wells. One of the techniques employed in oil exploration is Reverse Time Migration (RTM), a seismic imaging method that produces excellent images of the subsurface. It is an algorithm based on solving the wave equation, and it is considered one of the most advanced seismic imaging techniques. The economic value of the oil reserves that require RTM to be located is very high, which makes the development of these algorithms a competitive differentiator for seismic processing companies. However, RTM requires great computational power, which still hampers its practical success. The objective of this work is to explore the implementation of this algorithm on unconventional architectures, specifically GPUs using the CUDA platform, analyzing the difficulties of the development as well as the performance of the algorithm in its sequential and parallel versions.
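The computational core that such a GPU port parallelizes is the finite-difference time stepping of the wave equation. A minimal NumPy sketch of a second-order 2D acoustic update follows; it is an illustrative stand-in for the CUDA kernel, with made-up grid parameters:

```python
import numpy as np

# Time stepping for the 2D acoustic wave equation with a second-order
# finite-difference stencil: u_next = 2u - u_prev + (v*dt/dx)^2 * laplacian(u).
# Grid size, spacing, and velocity below are illustrative, not from the thesis.
nz = nx = 101
dx = 10.0                              # grid spacing (m)
v = np.full((nz, nx), 2000.0)          # homogeneous velocity model (m/s)
dt = 0.4 * dx / v.max()                # within the 2D CFL stability limit
c = (v * dt / dx) ** 2

prev = np.zeros((nz, nx))
curr = np.zeros((nz, nx))
curr[nz // 2, nx // 2] = 1.0           # impulsive source at the grid centre

for _ in range(300):
    # 5-point Laplacian; np.roll gives periodic boundaries (fine for a sketch).
    lap = (np.roll(curr, 1, 0) + np.roll(curr, -1, 0) +
           np.roll(curr, 1, 1) + np.roll(curr, -1, 1) - 4.0 * curr)
    prev, curr = curr, 2.0 * curr - prev + c * lap

assert np.isfinite(curr).all()         # stable: the field stays bounded
```

In a CUDA version, each grid point's stencil update maps naturally onto one GPU thread; the `np.roll` expressions above correspond to the neighbour reads inside such a kernel.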
6

Nonparametric Inferences for the Hazard Function with Right Truncation

Akcin, Haci Mustafa 03 May 2013 (has links)
Incompleteness is a major feature of time-to-event data. As one type of incompleteness, truncation refers to the unobservability of the time-to-event variable because it is smaller (or greater) than the truncation variable. A truncated sample always involves left and right truncation. Left truncation has been studied extensively, while right truncation has not received the same level of attention. In one of the earliest studies on right truncation, Lagakos et al. (1988) proposed transforming a right-truncated variable into a left-truncated variable and then applying existing methods to the transformed variable. The reverse-time hazard function is introduced through this transformation; however, this quantity does not have a natural interpretation. Gaps remain in the inferences for the regular forward-time hazard function with right-truncated data. This dissertation discusses variance estimation of the cumulative hazard estimator, a one-sample log-rank test, and the comparison of hazard rate functions among finite independent samples in the context of right truncation. First, the relation between the reverse- and forward-time cumulative hazard functions is clarified. This relation leads to nonparametric inference for the cumulative hazard function. Jiang (2010) recently conducted research in this direction and proposed two variance estimators of the cumulative hazard estimator. Some revisions to these variance estimators are suggested in this dissertation and evaluated in a Monte Carlo study. Second, this dissertation studies hypothesis testing for right-truncated data. A series of tests is developed with the hazard rate function as the target quantity. A one-sample log-rank test is discussed first, followed by a family of weighted tests for comparison among finite $K$ samples. Particular weight functions lead to the log-rank, Gehan, and Tarone-Ware tests, and these three tests are evaluated in a Monte Carlo study.
Finally, this dissertation studies nonparametric inference for the hazard rate function with right-truncated data. The kernel smoothing technique is used to estimate the hazard rate function. A Monte Carlo study investigates the uniform-kernel smoothed estimator and its variance estimator. The uniform, Epanechnikov, and biweight kernel estimators are applied to an example of AIDS data arising from infection by blood transfusion.
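The estimation machinery behind these methods can be illustrated on a toy right-truncated sample: an event time X_i is observed only when X_i ≤ T_i, the risk set at time t is {i : X_i ≤ t ≤ T_i}, and the reverse-time cumulative hazard accumulates increments d(t)/R(t) backwards in time. The data below are made up for illustration, not from the dissertation:

```python
import numpy as np

# Reverse-time cumulative hazard for right-truncated data: X_i is observed
# only when X_i <= T_i; risk set at time t is {i : X_i <= t <= T_i}.
X = np.array([2.0, 3.0, 3.0, 5.0, 6.0])   # event times (toy sample)
T = np.array([4.0, 5.0, 6.0, 7.0, 6.0])   # right-truncation times (X <= T)

cum_hazard = {}
total = 0.0
for t in np.unique(X)[::-1]:              # accumulate backwards in time
    d = np.sum(X == t)                    # number of events at t
    R = np.sum((X <= t) & (t <= T))       # risk-set size at t
    total += d / R
    cum_hazard[t] = total
```

For this sample the increments are 1/3 at t=6, 1/3 at t=5, 2/3 at t=3, and 1 at t=2, so the accumulated reverse-time hazard reaches 7/3 at the smallest event time.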
8

Continental Arc Processes in British Columbia and Earthquake Processes in Virginia: Insights from Seismic Imaging

Wang, Kai 07 February 2014 (has links)
Travel times from a refraction and wide-angle reflection seismic survey across the Coast Plutonic Complex and Stikine terrane of British Columbia were inverted to derive two-dimensional P- and S-wave seismic velocity models of the crust and uppermost mantle. A felsic upper crust and a felsic-to-intermediate middle crust are observed in both the batholith complex and the accreted Stikine island-arc terrane. The P- and S-wave models show a high-velocity layer (P: 7.0 km/s, S: 3.8 km/s) in the lower crust beneath the youngest (Late Cretaceous to Eocene) portion of the continental arc complex. In contrast, the lower crust under the Stikine terrane has lower velocities, consistent with amphibolite or other hydrated mafic rocks. The Moho is at ~35 km depth under the Stikine terrane, deepens to ~38 km beneath the youngest portion of the arc, then shallows towards the coast. The high-velocity zone under the younger portion of the Coast Plutonic Complex has a Vp/Vs ratio of 1.81 and is interpreted to have a bulk composition of mafic garnet granulite. This garnet granulite and large volumes of granodiorite-dominated melt were created by arc dehydration melting of amphibolite (or hydrated gabbro) in the pre-existing lower crust. The reverse time migration method was applied to image aftershocks recorded by a dense array deployed after the 2011 Virginia earthquake. Events as small as magnitude -2 were successfully imaged as point sources. The propagation of energy release as a function of time and space was observed for events larger than magnitude 2.5. The spatial resolution of the images was ~200 m, which synthetic data tests show was primarily limited by the temporal sampling rate. Improved temporal and spatial sampling could produce images with sharper resolution.
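The reported Vp/Vs ratio maps directly to a Poisson's ratio through the standard isotropic-elastic relation ν = (r² − 2) / (2(r² − 1)) with r = Vp/Vs. This conversion is ours, added for context; it is not quoted in the abstract:

```python
def poisson_ratio(vp_vs):
    """Poisson's ratio of an isotropic elastic medium from its Vp/Vs ratio."""
    r2 = vp_vs ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# The Vp/Vs of 1.81 reported for the lower-crustal high-velocity zone:
nu = poisson_ratio(1.81)
assert abs(nu - 0.28) < 0.005   # roughly 0.28
```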
9

A two-way approach to adapt small-scale laboratory experiments and corresponding numerical simulations of offshore seismic surveys

Solymosi, Bence 20 November 2018 (has links)
Numerical methods are widely used in seismic exploration to simulate wave propagation and to post-process recorded seismic data before geologic/geophysical interpretation. The algorithms rely on various assumptions to reduce computational cost, at the expense of simplifying the models and/or the physical phenomena. Because of their essential role in exploration geophysics, the accuracy of numerical simulations is of particular interest, especially for realistic geologic setups. Directly comparing numerical results with each other in synthetic configurations has limitations, as it can be difficult to determine which one best approximates a physically unknown solution. Because the real subsurface is never accurately known, it is also difficult to compare synthetic results with any seismic data set from field measurements. There is therefore a strong interest in using laboratory measurements on physical models of known geometries to benchmark the numerical tools. Before comparing measurements and simulations with confidence at high accuracy, we first need to establish a comparative framework with an approach jointly adapted to both the laboratory experiments and the numerical modeling. This challenging task is the goal of this thesis. The framework first reproduces offshore seismic measurements in laboratory conditions with the help of small-scale models, and then the numerical tools are adapted to the accurate synthetic reconstruction of the experiments.
10

Reverse-time inference of biological dynamics

Lenner, Nicolas 13 November 2019 (has links)
No description available.
