81

Multi-dimensional direct-sequence spread spectrum multiple-access communication with adaptive channel coding

Malan, Estian 25 October 2007 (has links)
During the race towards 4th generation (4G) cellular-based digital communication systems, growing demand for high-capacity, multimedia-capable mobile communication systems with improved Quality-of-Service (QoS) has caused the developing mobile communications world to turn towards better Multiple Access (MA) techniques, such as Code Division Multiple Access (CDMA) [5]. The demand for higher throughput and better QoS in future 4G systems has also given rise to a scheme that is becoming ever more popular in these so-called ‘bandwidth-on-demand’ systems. This scheme, known as adaptive channel coding, gives a system the ability firstly to sense changes in channel conditions and secondly to adapt to those changes, exploiting the fact that under good channel conditions a very simple channel coding scheme, or even none at all, suffices for Forward Error Correction (FEC). This ultimately results in better system throughput utilization. One such scheme, known as incremental redundancy, is already implemented in the Enhanced Data Rates for GSM Evolution (EDGE) standard. This study presents an extensive simulation study of a Multi-User (MU), adaptive channel coded Direct Sequence Spread Spectrum Multiple Access (DS/SSMA) communication system. The study firstly presents and utilizes a complex Base Band (BB) DS/SSMA transmitter model, aimed at user data diversity [6], in order to realize the MU input data to the system. This transmitter employs sophisticated double-sideband (DSB) Constant-Envelope Linearly Interpolated Root-of-Unity (CE-LI-RU) filtered Generalized Chirp-Like (GCL) sequences [34, 37, 38] to band-limit and spread user data. It then utilizes a fully user-definable, complex Multipath Fading Channel Simulator (MFCS), first presented by Staphorst [3], which is capable of reproducing all of the physical attributes of realistic mobile fading channels. Next, this study presents a matching DS/SSMA receiver structure that aims to optimally recover user data from the channel, ensuring that data diversity is achieved. To provide the basic channel coding functionality needed by the system, three simple but well-known channel coding schemes are investigated and employed: the binary Hamming (7,4,3) block code, the binary (15,7,5) Bose-Chaudhuri-Hocquenghem (BCH) block code and a rate-1/3 Non-Systematic (NS) binary convolutional code [6]. The first step towards realizing any adaptive channel coded system is the ability to measure channel conditions as fast as possible, without loss of accuracy or the inclusion of known data. In 1965, Gooding presented a paper describing a technique that measures communication conditions at the receiving end of a system through a device called a Performance Monitoring Unit (PMU) [12, 13]. This device accelerates the system’s Bit Error Rate (BER) to a so-called Pseudo Error Rate (PER) through a process known as threshold modification, then uses a simple PER extrapolation algorithm to estimate the system’s true BER with moderate accuracy and without the need for known data. This study extends the work of Gooding by applying his technique to the DS/SSMA system, which utilizes a generic Soft-Output Viterbi Algorithm (SOVA) decoder [39] structure for the trellis decoding of the binary linear block codes [3, 41-50] and the binary convolutional code mentioned above, over realistic MU frequency-selective channel conditions.
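Gooding's threshold-modification idea can be illustrated compactly: count pseudo errors at several artificially tightened decision thresholds, then extrapolate the log-linear PER trend back to the unmodified threshold. The following Python sketch assumes antipodal (±1) soft receiver outputs and an approximately log-linear PER-versus-offset relationship; all names are illustrative, and this is a minimal sketch rather than the dissertation's actual PMU implementation.

```python
import numpy as np

def estimate_ber_gooding(rx_soft, offsets=(0.2, 0.3, 0.4, 0.5)):
    """Sketch of Gooding-style PER extrapolation: count pseudo errors at
    several modified thresholds, fit log10(PER) against the threshold
    offset, and extrapolate back to the unmodified threshold (offset 0)."""
    rx_soft = np.asarray(rx_soft, dtype=float)
    log_per = []
    for d in offsets:
        # Pseudo error: the soft sample lands inside a guard band of
        # half-width d around the nominal decision threshold at zero.
        per = np.mean(np.abs(rx_soft) < d)
        per = max(per, 1.0 / rx_soft.size)   # guard against log10(0)
        log_per.append(np.log10(per))
    slope, intercept = np.polyfit(offsets, log_per, 1)
    return 10.0 ** intercept                 # estimated true BER
```

Because only the receiver's own soft outputs are used, no training sequence is consumed, which is the property the study exploits for real-time QoS monitoring.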
Applying Gooding's technique in this way grants the system the ability to sense changes in communication conditions through real-time BER measurement and, ultimately, to adapt to those changes by switching between channel codes. Because no previous literature exists on this application, the work is considered novel. Extensive simulation results investigate the linearity of the PER versus modified-threshold relationship for the uncoded case as well as all coded cases, for both single-user and multi-user systems. The study also provides extensive simulation results on the accuracy and speed advantages that Gooding's technique holds over the classic Monte Carlo technique for BER estimation, again covering uncoded and coded cases with single and multiple users. Finally, the study investigates the experimental real-time performance of the fully functional MU, adaptive coded DS/SSMA communication system over varying channel conditions: the channel conditions are varied over time and the system's adaptation (channel code switching) performance is observed through real-time monitoring of its estimated BER, including cases with multiple system users. Since the adaptive coded system of this study requires no known data sequences (training sequences), incorporating Gooding's technique for real-time BER estimation through threshold modification and PER extrapolation into future 4G adaptive systems would enable better QoS management without sacrificing throughput. Furthermore, this study shows that when Gooding's technique is applied to a coded system with soft outputs, it can be an effective technique for QoS monitoring and should be considered for future 4G systems. / Dissertation (MEng (Computer Engineering))--University of Pretoria, 2007. / Electrical, Electronic and Computer Engineering / MEng / unrestricted
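The adaptation step itself reduces to a lookup: once the PMU delivers a BER estimate, the system switches to the weakest code that still covers the estimated conditions. A minimal sketch follows; the BER ceilings are hypothetical values for illustration, not thresholds from the study.

```python
# Hypothetical BER ceilings for illustration only (not thresholds from the
# study): the worse the estimated channel, the stronger the selected code.
CODE_LADDER = [
    (1e-4, "uncoded"),
    (1e-3, "Hamming (7,4,3) block code"),
    (1e-2, "BCH (15,7,5) block code"),
    (float("inf"), "rate-1/3 NS binary convolutional code"),
]

def select_channel_code(ber_estimate: float) -> str:
    """Return the weakest channel code whose BER ceiling covers the estimate."""
    for ceiling, code in CODE_LADDER:
        if ber_estimate <= ceiling:
            return code
    return CODE_LADDER[-1][1]
```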
82

Performance Evaluation Of Fan-beam And Cone-beam Reconstruction Algorithms With No Backprojection Weight On Truncated Data Problems

Sumith, K 07 1900 (has links) (PDF)
This work focuses on linear-prediction-based projection completion for fan-beam and cone-beam reconstruction algorithms with no backprojection weight. Truncated data problems are well studied in computed tomography research; however, perfect image reconstruction from truncated data has not yet been achieved, only approximately accurate solutions, so research in this area continues to strive for results close to the perfect reconstruction. Linear prediction techniques are adopted for truncation completion in this work because previous research on truncated data problems has shown that they perform well compared to alternatives such as polynomial fitting and iterative methods. Linear prediction is a model-based technique: the autoregressive (AR) and moving average (MA) models are the two important models, together with the combined autoregressive moving average (ARMA) model. The AR model is used in this work because of the simplicity it offers in calculating the prediction coefficients, and the model order is chosen from the partial autocorrelation function of the projection data, as established in previous research in this area. Truncated projection completion using linear prediction and windowed linear prediction shows that reasonably accurate reconstruction is achieved; windowed linear prediction provides a better estimate of the missing data, for reasons discussed in the literature and restated for the reader's convenience in this work. The advantages of fan-beam reconstruction algorithms with no backprojection weight over those with backprojection weight motivated us to use the former for reconstructing the truncation-completed projection data. The results are compared with previous work that used conventional fan-beam reconstruction algorithms with backprojection weight; the intensity plots and noise performance results show improvements from using the fan-beam reconstruction algorithm with no backprojection weight. The work is also extended to the Feldkamp-Davis-Kress (FDK) reconstruction algorithm with no backprojection weight for the helical scanning geometry, and the results are compared with the FDK reconstruction algorithm with backprojection weight for the same geometry.
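As an illustration of the completion step, the sketch below fits an AR model to a measured projection row via the Yule-Walker equations and extrapolates the laterally truncated samples. The fixed order and names are illustrative assumptions; the order selection via the partial autocorrelation function, the windowing variant, and the subsequent no-backprojection-weight reconstruction are not reproduced here.

```python
import numpy as np

def ar_extrapolate(projection, n_missing, order=10):
    """Extend a laterally truncated projection row by AR linear prediction.
    Coefficients come from the Yule-Walker equations; samples lost to
    truncation are predicted recursively from the measured interior."""
    x = np.asarray(projection, dtype=float)
    mean = x.mean()
    xc = x - mean
    n = len(xc)
    # Biased autocorrelation estimates r[0..order].
    r = np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(order + 1)])
    # Yule-Walker: solve the Toeplitz system R a = r[1:].
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])
    # Forward extrapolation: each new sample is a weighted sum of the
    # previous `order` samples (most recent first).
    ext = list(xc)
    for _ in range(n_missing):
        ext.append(float(np.dot(a, ext[-1:-order - 1:-1])))
    return np.array(ext) + mean
```

In a completion pipeline, each truncated detector row would be extended on both edges this way (reversing the row to extrapolate the leading side) before being passed to the reconstruction algorithm.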
83

Proton computed tomography / Tomographie proton informatisée

Quiñones, Catherine Thérèse 28 September 2016 (has links)
The use of protons in cancer treatment has been widely recognized thanks to the precise stopping range of protons in matter. In proton therapy treatment planning, the uncertainty in determining the range mainly stems from the inaccuracy in the conversion of the Hounsfield units obtained from x-ray computed tomography to proton stopping power. Proton CT (pCT) has been an attractive solution as this modality directly reconstructs the relative stopping power (RSP) map of the object. The conventional pCT technique is based on measurements of the energy loss of protons to reconstruct the RSP map of the object. In addition to energy loss, protons also undergo multiple Coulomb scattering and nuclear interactions which could reveal other interesting properties of the materials not visible with the RSP maps.
This PhD work investigates proton interactions through Monte Carlo simulations in GATE and uses this information to reconstruct a map of the object through filtered back-projection along the most likely proton paths. Aside from the conventional energy-loss pCT, two pCT modalities have been investigated and implemented. The first, attenuation pCT, uses the attenuation of protons to reconstruct the linear inelastic nuclear cross-section map of the object. The second, scattering pCT, measures the angular variance due to multiple Coulomb scattering to reconstruct the relative scattering power map, which is related to the radiation length of the material. The accuracy, precision and spatial resolution of the images reconstructed from the two pCT modalities were evaluated qualitatively and quantitatively and compared with conventional energy-loss pCT. While energy-loss pCT already provides the information needed to calculate the proton range for treatment planning, attenuation pCT and scattering pCT give complementary information about the object. First, scattering pCT and attenuation pCT images provide additional information intrinsic to the materials in the object. Second, in some of the studied cases, attenuation pCT images demonstrate better spatial resolution and show features that would supplement energy-loss pCT reconstructions.
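The two line integrals underlying these modalities can be written down directly: attenuation pCT uses a Beer-Lambert-type log-transmission of counted protons, and scattering pCT uses the variance of the exit angle, which accumulates with the integrated scattering power. A simplified sketch under these assumptions (ignoring the energy dependence of scattering power and the most-likely-path binning; names are illustrative):

```python
import numpy as np

def attenuation_projection(n_in, n_out):
    """Attenuation pCT measurement: minus the log of the proton transmission
    ratio, a Beer-Lambert-type line integral of the linear inelastic nuclear
    cross-section along the proton path."""
    return -np.log(np.asarray(n_out, dtype=float) / np.asarray(n_in, dtype=float))

def scattering_projection(exit_angles_rad):
    """Scattering pCT measurement: the variance of the proton exit angles in
    one projection bin, which grows with the scattering power (inversely
    related to the radiation length) of the traversed material."""
    return float(np.var(np.asarray(exit_angles_rad, dtype=float)))
```

Feeding these per-path measurements into the same filtered back-projection used for energy-loss pCT then yields the nuclear cross-section and scattering power maps, respectively.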
84

Hodnocení tepové frekvence a saturace krve kyslíkem pomocí chytrého telefonu / Heart rate and blood oxygen saturation estimation using smartphone

Jordánová, Ivana January 2018 (has links)
Heart rate, Oxygen saturation, HR, SpO2, MATLAB, smartphone, mobile phone, photoplethysmogram, PPG
