181 |
Studies on the convergence of a Monte Carlo criticality calculation: coupling with a deterministic code and automated transient detection. Jinaphanh, Alexis, 03 December 2012 (has links)
Monte Carlo criticality calculations estimate the effective multiplication factor (keff) as well as local quantities such as fluxes and reaction rates. Configurations with weak neutronic coupling (full-core models, high-burnup irradiation profiles, ...) may yield biased estimates of keff or of local fluxes. This thesis aims to make the iterative Monte Carlo algorithm more robust and to improve convergence detection. The proposed improvement uses, during the Monte Carlo calculation, an adjoint flux obtained from a preliminary deterministic calculation. The adjoint flux positions the first generation, biases the sampling of fission sites, and modifies the random walk of neutrons through splitting and Russian roulette strategies. An automated transient detection method has also been developed. It locates and removes the initial transient in an output series by modeling the stationary series as an order-1 autoregressive process and applying a statistical test whose decision variable is the mean of a Student bridge statistic. The method is applied here to keff and to the Shannon entropy, and it is general enough to be used on any series produced by an iterative Monte Carlo calculation. The methods developed in this thesis are tested on several simplified cases that exhibit neutronic convergence difficulties.
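The splitting and Russian roulette strategies mentioned above are standard variance-reduction tools. The sketch below is a hypothetical Python illustration (not code from the thesis) of the weight-window form, in which both operations preserve the expected total weight of the particle population; the thresholds `w_max`, `w_min` and `w_ref` are made-up constants, where an adjoint-flux-based scheme would instead derive them per region from the deterministic importance map:

```python
import math
import random

def split_or_roulette(weight, w_max=2.0, w_min=0.25, w_ref=1.0, rng=random):
    """Population control for one particle of given statistical weight.

    Splitting: a heavy particle (weight > w_max) becomes n copies of
    weight / n.  Russian roulette: a light particle (weight < w_min)
    survives with probability weight / w_ref and is re-weighted to w_ref.
    Both operations leave the expected total weight unchanged.
    """
    if weight > w_max:                                   # splitting
        n = math.ceil(weight / w_ref)
        return [weight / n] * n
    if weight < w_min:                                   # Russian roulette
        return [w_ref] if rng.random() < weight / w_ref else []
    return [weight]
```

A particle of weight 5 is split into five unit-weight copies, while a weight-0.1 particle survives roulette only 10% of the time, so the population stays unbiased but its size stays under control.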
|
182 |
Power control techniques for CDMA-based mobile systems. Nourizadeh, Sam, January 2003 (has links)
Code division multiple access (CDMA) is a well-known radio communication technique that allows multiple users to share the same spectrum simultaneously. It is an alternative to frequency division and time division multiple access schemes, and its numerous advantages have made it the main air-interface choice for third generation (3G) mobile communication systems. Nevertheless, because all users share the same frequency, the capacity of a CDMA air interface is interference limited. This problem is mitigated by power control, which reduces interference in the system by adjusting the transmitted power according to the received signal-to-interference ratio (SIR). The main difficulty is that mobile terminals experience different radio propagation channels. Power control therefore has two objectives: first, to ensure at the physical layer that the received signal matches the required SIR; and second, to adjust the required SIR of each user at the system layer to a value that the terminals in the system are capable of achieving. In this thesis both objectives are discussed and analysed through analytical and simulation methods. At the physical layer, two analytical methods based on non-linear control theory are proposed to combat fast fading in the propagation channel. The proposed methods are a fast way to assess the performance of closed loop power control compared to the usual lengthy simulation process. At the system level, a new distributed power control algorithm for the reverse link is proposed that adjusts the SIR target of the mobile terminal at the base station. This algorithm brings the performance of the distributed algorithm closer to the optimal solution provided by the centralised power control algorithm, which is not feasible with current technology.
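The distributed power control idea can be illustrated with the classic Foschini-Miljanic iteration, in which each link scales its power by the ratio of its SIR target to its measured SIR using only local measurements. This is a textbook sketch in Python, not the algorithm proposed in the thesis, and the two-user gain matrix is made up for illustration:

```python
def distributed_power_control(G, noise, target, iters=100):
    """Foschini-Miljanic iteration: p_i <- p_i * target_i / SIR_i.

    G[i][j] is the link gain from transmitter j to receiver i, noise[i]
    the receiver noise power, target[i] the SIR target.  When the targets
    are feasible, the iteration converges to the minimal power vector
    that meets every target.
    """
    n = len(G)
    p = [1.0] * n                         # arbitrary positive start
    for _ in range(iters):
        sir = [G[i][i] * p[i] /
               (noise[i] + sum(G[i][j] * p[j] for j in range(n) if j != i))
               for i in range(n)]
        p = [p[i] * target[i] / sir[i] for i in range(n)]
    return p

# Two users with light cross-coupling and unit SIR targets.
G = [[1.0, 0.1], [0.1, 1.0]]
p = distributed_power_control(G, noise=[0.1, 0.1], target=[1.0, 1.0])
```

Each terminal needs only its own measured SIR and target, which is what makes the scheme distributed; the centralised optimum would instead require the full gain matrix at one node.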
|
183 |
Multiuser detection for mobile CDMA systems. Mozaffaripour, M., January 2003 (has links)
The goal of the third generation (3G) mobile communications system is to seamlessly integrate a wide variety of communication services, such as high-speed data, video and multimedia traffic as well as voice, for transmission over a Wideband Code Division Multiple Access (WCDMA) air interface. CDMA suffers from interference, and this thesis considers multiuser detection for the mobile uplink. A thorough comparative study of different multiuser detection methods is presented. RAKE-IC, an architecture that combines the ideas of the RAKE receiver and parallel interference cancellation, is introduced. The basic concept is to maximise the signal-to-noise ratio of all users in the system using adaptive algorithms. The RAKE-IC structure has been extended to multiple stages and several adaptive algorithms are implemented. An iterative method for interference cancellation has been considered and its convergence studied analytically. An improvement in convergence based on the Rayleigh-Ritz theorem is proposed, which increases the convergence speed in synchronous scenarios. Using analytical methods, a further improvement based on the Gershgorin theorem has been proposed; it does not add great complexity to the system, yet works well even in asynchronous environments. A suboptimum search algorithm for correcting reliably detected information has been introduced, with the property that its structure combines well with the iterative detectors. This combination achieves better performance than the partial parallel interference cancellation method, even in regions of operation with rather low interference. The structure of the suboptimum search algorithm has been extended to multiple stages, and its bit error rate performance has been derived analytically in closed form, showing good agreement with simulation results. By considering the power profile of the users and sacrificing a little performance, the suboptimum search structure has been further simplified. Key words: Multiuser Detection, WCDMA, Interference Cancellation.
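The parallel interference cancellation idea can be sketched for a synchronous CDMA model: at each stage, every user's estimate is recomputed in parallel after subtracting the interference reconstructed from the other users' current estimates. The soft-decision version below is a Python illustration with a made-up correlation matrix, not the RAKE-IC receiver itself; it is a Jacobi iteration on the correlation matrix, whose convergence is governed by the eigenvalues of the off-diagonal part — exactly where bounds such as the Gershgorin theorem enter:

```python
def soft_pic(R, y, stages=20):
    """Soft parallel interference cancellation for y = R b + n.

    R is the normalized cross-correlation matrix (R[i][i] == 1) and y the
    vector of matched-filter outputs.  Each stage cancels, for every user
    in parallel, the other users' reconstructed contributions.
    """
    n = len(y)
    b = y[:]                                   # stage 0: matched filter
    for _ in range(stages):
        b = [y[i] - sum(R[i][j] * b[j] for j in range(n) if j != i)
             for i in range(n)]
    return b

# Two users, cross-correlation 0.2, transmitted bits (+1, -1), no noise.
R = [[1.0, 0.2], [0.2, 1.0]]
y = [1.0 - 0.2, -1.0 + 0.2]                   # y = R @ [+1, -1]
estimates = soft_pic(R, y)
```

With cross-correlation 0.2, the off-diagonal part is small and the stage-to-stage error shrinks geometrically, so a handful of stages already recovers the transmitted bits.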
|
184 |
Multi-carrier CDMA using convolutional coding and interference cancellation. Maxey, Joshua James, January 1997 (has links)
No description available.
|
185 |
Code acquisition techniques for CDMA-based mobile networks. Neda, Naaser, January 2003 (has links)
Initial code acquisition techniques for direct sequence code division multiple access (DS/CDMA) communication networks are investigated in this thesis. Conventional code acquisition methods, which rely on the autocorrelation and cross-correlation properties of the spreading codes, fail in the presence of multiple access interference (MAI) and the near-far effect. This motivates the study of interference-resistant acquisition algorithms for hostile channel environments. Training-based acquisition is investigated and the effect of the training sequence structure on acquisition performance is discussed. A new training sequence architecture is proposed which results in a shorter acquisition time. The demand for high bit rate services and the need for more efficient use of resources lead to the study of acquisition algorithms that require no preamble or training sequences. In this context, blind adaptive algorithms for code acquisition are investigated. The mismatch problem of blind algorithms is addressed and a novel way of handling it for the Constrained Minimum Output Energy (C-MOE) detector is proposed. The algorithm achieves good acquisition performance under different channel conditions and system loadings. The idea of joint acquisition and demodulation of data, where the outcome of the acquisition mode is an interference-suppressing filter, is also discussed. It is shown that in this class of receivers a one-step constrained acquisition process is not sufficient to handle both the mismatch problem and the exploitation of multipath diversity. Therefore, a novel receiver is proposed which handles the mismatch problem as well as the channel diversity. This receiver is based on a two-step constrained minimum output energy algorithm and provides comparatively good acquisition and demodulation performance.
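The conventional correlation-based acquisition that the thesis takes as its baseline can be sketched as a serial search: the receiver correlates the incoming chips against every cyclic shift of the local spreading code and picks the phase with the largest correlation magnitude. The Python sketch below is illustrative only (single user, no noise — the benign regime in which, as the abstract notes, such methods still work); it uses a length-7 m-sequence, whose cyclic autocorrelation is 7 at the true phase and -1 everywhere else:

```python
def acquire_code_phase(rx, code):
    """Serial-search code acquisition: return the cyclic shift of `code`
    whose correlation with the received chips `rx` is largest."""
    n = len(code)
    corr = [sum(rx[k] * code[(k - d) % n] for k in range(n))
            for d in range(n)]
    return max(range(n), key=lambda d: abs(corr[d]))

# Length-7 m-sequence (from the LFSR x^3 + x + 1) mapped to +/-1 chips.
code = [-1, 1, 1, -1, 1, -1, -1]
delay = 3
rx = [code[(k - delay) % 7] for k in range(7)]   # code delayed by 3 chips
```

Under MAI or a near-far imbalance the correlation peak is buried by stronger users' cross-correlations, which is precisely why the thesis turns to interference-resistant and blind acquisition schemes.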
|
186 |
Constructions and performance of quantum LDPC codes. Delfosse, Nicolas, 12 December 2012
This thesis is devoted to the study of quantum LDPC codes. The first part presents topological constructions of quantum LDPC codes: we introduce a family of color codes based on tilings of the hyperbolic plane, and we study the parameters of a family of codes based on Cayley graphs. In the second part, we analyze the performance of these codes. We obtain an upper bound on the performance of regular quantum LDPC codes over the quantum erasure channel, which implies that these codes do not achieve the capacity of the quantum erasure channel. For the depolarizing channel, we propose a new decoding algorithm for color codes based on three surface-code decodings; our numerical results show good performance for toric color codes. Finally, we focus on percolation theory. The central question in percolation theory is the determination of the critical probability, and computing it exactly is usually difficult. We relate the percolation probability in certain regular tilings of the hyperbolic plane to the probability of a decoding error for a family of hyperbolic codes on the quantum erasure channel. This yields an upper bound on the critical probability of these hyperbolic tilings based on results from quantum information theory — an application of quantum information to a purely combinatorial problem.
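The percolation question can be illustrated in its simplest setting: site percolation on a square grid, where the probability of a spanning open cluster jumps from near 0 to near 1 as the site-open probability crosses the critical threshold (about 0.593 for the square lattice; the thesis's hyperbolic tilings are hard precisely because such thresholds are not known exactly). The following Monte Carlo sketch is a hypothetical illustration, unrelated to the codes above:

```python
import random
from collections import deque

def percolates(n, p, rng):
    """Open each site of an n x n grid with probability p and report
    whether an open path (4-connectivity) joins the top row to the bottom."""
    grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    frontier = deque((0, j) for j in range(n) if grid[0][j])
    seen = set(frontier)
    while frontier:                            # BFS from the open top row
        i, j = frontier.popleft()
        if i == n - 1:
            return True
        for a, b in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= a < n and 0 <= b < n and grid[a][b] and (a, b) not in seen:
                seen.add((a, b))
                frontier.append((a, b))
    return False

rng = random.Random(0)
freq = {p: sum(percolates(20, p, rng) for _ in range(200)) / 200
        for p in (0.4, 0.8)}
```

Estimating the crossing frequency on either side of the threshold shows the sharp transition; the thesis's contribution is to bound such thresholds for hyperbolic tilings via decoding-error probabilities rather than by direct simulation.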
|
187 |
Using risk mitigation approaches to define the requirements for software escrow. Rode, Karl, January 2015 (has links)
Two or more parties entering into a contract for services or goods may place the funds for payment in escrow to establish trust in the contract. In such an arrangement the documents or financial instruments, the object(s) in escrow, are held in trust by a trusted third party (the escrow provider) until the specified conditions are fulfilled. In software escrow, the object of escrow is typically the source code, and the specified release conditions usually address scenarios in which the software provider becomes unable to continue providing services (for example, due to bankruptcy or a change in the services provided). Software escrow is not well documented in the academic body of work; the largest sources of information, active commentary and supporting papers are commercial software escrow providers, both in South Africa and abroad. This work maps the topic of software escrow onto the King III compliance framework in South Africa. This is of value because users of bespoke applications may require extended professional assistance to align with the King III guidelines. The risk assessment model developed in this work serves as a tool to evaluate and motivate for software escrow agreements. It also provides an overview of the various escrow agreement types and shifts the focus to the value proposition each one holds. Initial research indicated that awareness of software escrow in industry is still very low, evidenced by the significant number of approached specialists who declined to participate in the survey, citing their own inexperience in applying software escrow within their companies. Moreover, the participants who did contribute indicated that they required software escrow only for medium to highly critical applications. This demonstrated the value of assessing the risk factors that bespoke software development introduces, as well as the risk mitigation options available, through tools such as escrow, to reduce the actual and residual risk to a manageable level.
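A risk assessment model of the kind described can, at its simplest, be reduced to a likelihood-impact matrix: each identified risk factor is scored, and the aggregate exposure motivates (or not) an escrow agreement. The scoring rule, thresholds and risk factors below are hypothetical illustrations, not the model developed in the thesis:

```python
def risk_score(factors):
    """Score each risk as likelihood * impact (both on a 1-5 scale) and
    classify the overall exposure; a 'high' outcome would motivate escrow."""
    total = sum(likelihood * impact for likelihood, impact in factors.values())
    worst = max(l * i for l, i in factors.values())
    if worst >= 15 or total >= 40:      # any severe risk, or many moderate ones
        return "high"
    return "moderate" if worst >= 8 else "low"

# Hypothetical bespoke-development risks for a medium-critical application.
factors = {
    "vendor insolvency": (2, 5),        # unlikely, but severe impact
    "key-person loss": (3, 4),
    "source unavailable on exit": (4, 4),
}
level = risk_score(factors)
```

Escrow then acts as a mitigation: it lowers the impact of the source-availability risks, which would be re-scored to compute the residual exposure.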
|
188 |
Probabilistic Proof-carrying Code. Sharkey, Michael Ian, January 2012 (has links)
Proof-carrying code is an application of software verification techniques to the problem of ensuring the safety of mobile code. However, previous proof-carrying code systems have assumed that mobile code will faithfully execute the instructions of the program. Realistic implementations of computing systems are susceptible to probabilistic behaviours that can alter the execution of a program in ways that result in corruption or security breaches. We investigate the use of a probabilistic bytecode language to model deterministic programs that are executed on probabilistic computing systems. To model probabilistic safety properties, a probabilistic logic is adapted to our bytecode instruction language, and its soundness is proven. A sketch of a completeness proof of the logic is also given.
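The idea of reasoning about probabilistic execution can be made concrete with a toy probabilistic bytecode: an interpreter that propagates a distribution over machine states rather than a single state, so that a safety property becomes a probability bound. The instruction set and fault model below are hypothetical illustrations, not the thesis's language or logic:

```python
def run_distribution(program, start=0):
    """Interpret a toy probabilistic bytecode over integer states.

    Instructions: ('add', k) adds k deterministically; ('flip', p, k)
    adds k with probability p and 0 otherwise (modelling, e.g., a fault).
    Returns the exact distribution over final states.
    """
    dist = {start: 1.0}
    for op in program:
        new = {}
        for state, pr in dist.items():
            if op[0] == 'add':
                outcomes = [(state + op[1], 1.0)]
            else:                               # 'flip'
                _, p, k = op
                outcomes = [(state + k, p), (state, 1.0 - p)]
            for s, q in outcomes:
                new[s] = new.get(s, 0.0) + pr * q
        dist = new
    return dist

# Probability that two possible bit-flip faults leave the state unchanged.
dist = run_distribution([('flip', 0.01, 1), ('flip', 0.01, 1)])
p_safe = dist[0]
```

A probabilistic safety judgement then has the shape "the final state satisfies the property with probability at least 1 - epsilon", here P(state == 0) = 0.99^2.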
|
189 |
Aspects of the development of a repeater module for pulse code modulation systems (Aspekte van die ontwikkeling van 'n herhalermodule vir pulskodemodulasiestelsels). De Beer, Daniel Jacobus, 29 September 2014 (has links)
M.Ing. (Electrical & Electronic Engineering) / Please refer to full text to view abstract
|
190 |
The Girl Disappeared: The Prostitute of La Isla de Santa Flora. Winston, Michael, 05 1900 (has links)
The novella The Girl Disappeared focuses on the life of Emalia, a street kid from Mexico. She is taken from the streets of Veracruz and forced into a life of prostitution on the fictitious island of La Isla de Santa Flora. The primary conflict driving the story is her pending choice between escaping her life of slavery and saving another young woman who is on the verge of being forced into prostitution as well. As a literary piece, the novella dwells on the question of character agency and explores the multilayered nature of code switching. For these women, language becomes a tool in their struggle against their captors and a means of self-preservation, or sanctuary, as they use their growing bilingualism to foment a limited agency and to act in their own defense.
|