1

On some problems related to machine-generated noise

Stockis, Jean-Pierre January 1997 (has links)
No description available.
2

Higher-Dimensional Properties of Non-Uniform Pseudo-Random Variates

Leydold, Josef, Leeb, Hannes, Hörmann, Wolfgang January 1998 (has links) (PDF)
In this paper we present the results of a first empirical investigation of how the quality of non-uniform variates is influenced by the underlying uniform RNG and the transformation method used. We use well-known standard RNGs and transformation methods for the normal distribution as examples. We find that, except for transformed density rejection methods, which do not seem to introduce any additional defects, the quality of the underlying uniform RNG can be both increased and decreased by transformations to non-uniform distributions. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
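As a concrete illustration of such a transformation to the normal distribution (a minimal Python sketch, not code from the paper: it uses the Box-Muller method with Python's built-in Mersenne Twister as the underlying uniform RNG):

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent U(0,1) variates to two independent
    standard normal variates (Box-Muller transformation)."""
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

rng = random.Random(12345)      # underlying uniform RNG (here: Mersenne Twister)
normals = []
for _ in range(5000):
    # 1 - u maps [0,1) to (0,1] so that log() stays finite
    z1, z2 = box_muller(1.0 - rng.random(), rng.random())
    normals.append(z1)
    normals.append(z2)

mean = sum(normals) / len(normals)
var = sum((z - mean) ** 2 for z in normals) / len(normals)
print(f"sample mean = {mean:.3f}, sample variance = {var:.3f}")
```

Any defect in the uniform stream propagates through the transformation, which is exactly the kind of interaction the paper investigates empirically.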
3

Determinants of specificity in autobiographical memory

Healy, Helen G. January 1997 (has links)
Depressed and suicidal patients have difficulty in recollecting specific autobiographical events. In response to cue words they tend to generate summarised or general memories instead of specific events. The objectives of this thesis are to explore the mechanisms underlying the production of specific and general autobiographical memories in a non-clinical population. The roles of imagery and working memory in the generation of autobiographical memories were investigated. Four experiments examined how manipulating the imageability of the cue affected subsequent retrieval in autobiographical memory. The results show that cues high in imageability facilitated access to specific memories and that visual imageability was the most significant predictor of memory specificity compared to a range of other perceptual modalities. The effect of an experimental manipulation on retrieval style was examined by instructing participants to retrieve specific events or general events using high or low imageable words to cue memories. The results show that induction of a generic retrieval style reduced the specificity of images of future events. This models clinical findings with depressed and suicidal patients and suggests that associations between memory retrieval and future imaging share common intermediate pathways. A further experiment suggested that the imageability effects mediating the construction of specific memories may be in part due to the predicability of such retrieval cues. The hypothesis that retrieval of specific autobiographical memories is more effortful compared to the retrieval of general memories was also investigated using a dual task paradigm. Although central executive function has been implicated many times in the monitoring of autobiographical retrieval, no direct assessment of executive capacity during retrieval has been made.
The results showed no significant difference in the randomness of a keypressing task when specific or general autobiographical memories were retrieved in response to either high or low imageable cue words. A direct retrieval hypothesis was proposed whereby cues directly accessed specific events in autobiographical memory and the adoption of such a strategy enabled participants to maintain performance on the secondary task.
4

The Multivariate Ahrens Sampling Method

Karawatzki, Roman January 2006 (has links) (PDF)
The "Ahrens method" is a very simple method for sampling from univariate distributions. It is based on rejection from piecewise constant hat functions. It can be applied analogously to the multivariate case, where hat functions are used that are constant on rectangular domains. In this paper we investigate the case of distributions with so-called orthounimodal densities. Technical implementation details as well as their practical limitations are discussed. The application to more general distributions is considered. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
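A minimal one-dimensional sketch of rejection from a piecewise constant hat, in the spirit of the Ahrens method described above (a hypothetical Python illustration; the function names and breakpoint choice are my own, not the paper's):

```python
import math
import random

def ahrens_sample(f, cuts, mode, rng=random.Random(0)):
    """One draw by rejection from a piecewise constant hat.
    For a unimodal density f, the hat on each interval is the density
    value at the endpoint nearer the mode (an upper bound by unimodality)."""
    pieces = []
    for a, b in zip(cuts, cuts[1:]):
        h = f(b) if b <= mode else (f(a) if a >= mode else f(mode))
        pieces.append((a, b, h, h * (b - a)))
    total = sum(p[3] for p in pieces)
    while True:
        u = rng.random() * total        # pick an interval by hat area
        for a, b, h, area in pieces:
            if u < area:
                x = a + rng.random() * (b - a)
                if rng.random() * h <= f(x):   # accept/reject against the hat
                    return x
                break                          # rejected: start over
            u -= area

f = lambda x: math.exp(-0.5 * x * x)           # half-normal on [0, 4], unnormalised
cuts = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0]
sample = [ahrens_sample(f, cuts, mode=0.0) for _ in range(2000)]
print(f"sample mean = {sum(sample) / len(sample):.2f}")  # true mean is about 0.80
```

The multivariate version replaces the intervals by rectangular domains, which is where the orthounimodality assumption becomes important.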
5

The Automatic Generation of One- and Multi-dimensional Distributions with Transformed Density Rejection

Leydold, Josef, Hörmann, Wolfgang January 1997 (has links) (PDF)
A rejection algorithm, called "transformed density rejection", is presented. It uses a new method for constructing simple hat functions for a unimodal density $f$. It is based on the idea of transforming $f$ with a suitable transformation $T$ such that $T(f(x))$ is concave. The hat function is then constructed by taking the pointwise minimum of tangents which are transformed back to the original scale. The resulting algorithm works very well for a large class of distributions and is fast. The method is also extended to the two- and multidimensional case. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
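A toy Python sketch of the idea for the standard normal with $T = \log$ and tangent points $-1, 0, 1$ (my own illustration, not the paper's algorithm; the hat pieces below follow from the tangents of $\log f(x) = -x^2/2$):

```python
import math
import random

def tdr_normal(n, rng=random.Random(42)):
    """Transformed density rejection sketch for the (unnormalised)
    standard normal f(x) = exp(-x^2/2), with T = log and tangents
    at -1, 0, 1. Transforming the tangents back gives the hat
    exp(1/2 + x) for x < -1/2, the constant 1 on [-1/2, 1/2], and
    exp(1/2 - x) for x > 1/2; each piece has area exactly 1."""
    out = []
    while len(out) < n:
        u = rng.random() * 3.0        # total hat area is 3
        if u < 1.0:                   # left exponential piece, by inversion
            x = -0.5 + math.log(1.0 - rng.random())
            hat = math.exp(0.5 + x)
        elif u < 2.0:                 # flat centre piece
            x = -0.5 + rng.random()
            hat = 1.0
        else:                         # right exponential piece, by inversion
            x = 0.5 - math.log(1.0 - rng.random())
            hat = math.exp(0.5 - x)
        if rng.random() * hat <= math.exp(-0.5 * x * x):
            out.append(x)
    return out

zs = tdr_normal(4000)
m = sum(zs) / len(zs)
v = sum((z - m) ** 2 for z in zs) / len(zs)
print(f"mean = {m:.2f}, variance = {v:.2f}")
```

With these three tangent points the acceptance probability is $\sqrt{2\pi}/3 \approx 0.84$; adding more tangent points pushes it arbitrarily close to 1.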
6

Modelling Probability Distributions from Data and its Influence on Simulation

Hörmann, Wolfgang, Bayar, Onur January 2000 (has links) (PDF)
Generating random variates as generalisation of a given sample is an important task for stochastic simulations. The three main methods suggested in the literature are: fitting a standard distribution, constructing an empirical distribution that approximates the cumulative distribution function and generating variates from the kernel density estimate of the data. The last method is practically unknown in the simulation literature although it is as simple as the other two methods. The comparison of the theoretical performance of the methods and the results of three small simulation studies show that a variance corrected version of kernel density estimation performs best and should be used for generating variates directly from a sample. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
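The variance-corrected kernel method can be sketched as follows (a hypothetical Python illustration of the smoothed bootstrap with Silverman's rule-of-thumb bandwidth; not code from the paper):

```python
import math
import random

def kde_variates(data, n, rng=random.Random(7)):
    """Variance-corrected sampling from a Gaussian kernel density
    estimate (smoothed bootstrap): resample a data point, add kernel
    noise, then shrink toward the sample mean so the variates have
    the sample variance rather than an inflated one."""
    m = sum(data) / len(data)
    s2 = sum((x - m) ** 2 for x in data) / (len(data) - 1)
    h = 1.06 * math.sqrt(s2) * len(data) ** -0.2   # Silverman's rule of thumb
    c = 1.0 / math.sqrt(1.0 + h * h / s2)          # variance correction factor
    return [m + c * (rng.choice(data) - m + h * rng.gauss(0.0, 1.0))
            for _ in range(n)]

gen = random.Random(1)
data = [gen.gauss(10.0, 2.0) for _ in range(200)]  # stand-in observed sample
variates = kde_variates(data, 5000)
dm = sum(data) / len(data)
vm = sum(variates) / len(variates)
print(f"data mean = {dm:.2f}, variate mean = {vm:.2f}")
```

Without the factor `c`, plain kernel sampling would produce variates with variance $s^2 + h^2$, inflating the spread of the generated data; the correction is what the comparison in the paper favours.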
7

rstream: Streams of Random Numbers for Stochastic Simulation

L'Ecuyer, Pierre, Leydold, Josef January 2005 (has links) (PDF)
The package rstream provides a unified interface to streams of random numbers for the R statistical computing language. Its features are: independent streams of random numbers, substreams, easy handling of streams (initialize, reset), and antithetic random variates. The paper describes this package and demonstrates the usefulness of this approach with a simple example. / Series: Preprint Series / Department of Applied Statistics and Data Processing
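The stream-handling idea can be sketched in Python (a simplified, hypothetical stand-in for illustration only; rstream itself is an R package and guarantees non-overlapping streams by jumping ahead within one backbone generator, whereas this toy version just derives per-stream seeds):

```python
import random

class StreamFactory:
    """Minimal stand-in for rstream-style stream handling:
    create streams, reset them to their initial state, and form
    antithetic variates. Hypothetical illustration only."""
    def __init__(self, seed):
        self._root = random.Random(seed)
        self._seeds = []

    def new_stream(self):
        """Create a new stream with its own derived seed."""
        self._seeds.append(self._root.getrandbits(64))
        return random.Random(self._seeds[-1])

    def reset(self, i):
        """Recreate stream i in its initial state."""
        return random.Random(self._seeds[i])

factory = StreamFactory(2005)
s0, s1 = factory.new_stream(), factory.new_stream()
first = [s0.random() for _ in range(3)]

r0 = factory.reset(0)                     # reset: replay stream 0 from the start
replay = [r0.random() for _ in range(3)]
antithetic = [1.0 - u for u in first]     # antithetic variates of stream 0
print("reset reproduces the stream:", first == replay)
```

Resettable, independent streams are what make variance-reduction techniques such as common random numbers and antithetic variates easy to organise in a simulation.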
8

Self-Testing and Device-Independent Quantum Random Number Generation with Nonmaximally Entangled States

Bamps, Cédric 12 February 2018 (has links)
The generation of random number sequences, that is, of unpredictable sequences free from any structure, has found numerous applications in the field of information technologies. One of the most sensitive applications is cryptography, whose modern practice makes use of secret keys that must indeed be unpredictable for any potential adversary. This type of application demands highly secure randomness generators. This thesis contributes to the device-independent approach to quantum random number generation (DIRNG, for Device-Independent Random Number Generation). These methods of randomness generation exploit the fundamental unpredictability of the measurement of quantum systems. In particular, the security of device-independent methods does not appeal to a specific model of the device itself, which is treated as a black box. This approach therefore stands in contrast to more traditional methods whose security rests on a precise theoretical model of the device, which may lead to vulnerabilities caused by hardware malfunctions or tampering by an adversary. Our contributions are the following. We first introduce a family of robust self-testing criteria for a class of quantum systems that involve partially entangled qubit pairs. This powerful form of inference allows us to certify that the contents of a quantum black box conform to one of those systems, on the sole basis of macroscopically observable statistical properties of the black box. That result leads us to introduce and prove the security of a protocol for randomness generation based on such partially entangled black boxes. The advantage of this method resides in its low entanglement cost, which allows it to reduce the use of quantum resources (both entanglement and quantum communication) compared to existing DIRNG protocols. We also present a protocol for randomness generation based on an original estimation of the black-box correlations. Contrary to existing DIRNG methods, which summarize the accumulated measurement data into a single quantity (the violation of a single Bell inequality), our method exploits a complete, multidimensional description of the black-box correlations that allows it to certify more randomness from the same number of measurements. We illustrate our results on a numerical simulation of the protocol using partially entangled states. / Doctorat en Sciences / info:eu-repo/semantics/nonPublished
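The "single quantity" that traditional DIRNG protocols reduce the data to can be illustrated as follows (a hedged Python sketch, not the thesis's multidimensional method: it assumes ideal correlations $E(a,b)=\cos(a-b)$ of a maximally entangled pair and the known CHSH-based min-entropy bound of Pironio et al., 2010):

```python
import math

def chsh(E):
    """CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1)."""
    return E[0][0] + E[0][1] + E[1][0] - E[1][1]

def min_entropy_per_bit(S):
    """Known single-inequality bound 1 - log2(1 + sqrt(2 - S^2/4)) on
    the min-entropy per output bit, certified from the CHSH violation
    alone (Pironio et al., 2010)."""
    if S <= 2.0:                       # no violation: nothing is certified
        return 0.0
    return 1.0 - math.log2(1.0 + math.sqrt(max(0.0, 2.0 - S * S / 4.0)))

# Measurement angles that reach Tsirelson's bound 2*sqrt(2) for a
# maximally entangled qubit pair, with correlations E(a,b) = cos(a - b).
a_angles, b_angles = [0.0, math.pi / 2], [math.pi / 4, -math.pi / 4]
E = [[math.cos(a - b) for b in b_angles] for a in a_angles]
S = chsh(E)
print(f"S = {S:.3f}, certified min-entropy per bit >= {min_entropy_per_bit(S):.3f}")
```

The thesis's point is that collapsing all measurement statistics into the one number $S$ discards information; using the full correlation table certifies strictly more randomness from the same data.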
9

On-Chip True Random Number Generation in Nanometer CMOS

Suresh, Vikram Belur 01 January 2012 (has links) (PDF)
An on-chip True Random Number Generator (TRNG) forms an integral part of a number of cryptographic systems in multi-core processors, communication networks and RFID. The TRNG provides random keys, device IDs and seeds for Pseudo-Random Number Generators (PRNGs). These circuits, harnessing physical random variations like thermal noise or stray electromagnetic waves, are ideally expected to generate random bits with very high entropy and zero correlation. But the progression to advanced semiconductor manufacturing processes has brought about various challenges in the design of TRNGs. Increasing variations in the fabrication process and the sensitivity of transistors to operating conditions like temperature and supply voltage have a significant effect on the efficiency of TRNGs designed in sub-micron technologies. Poorly designed random number generators also provide an avenue for attackers to break the security of a cryptographic system. Process variation and operating conditions may be used as effective tools of attack against a TRNG. This work makes a comprehensive study of the effect of process variation on metastability-based TRNGs designed in deep sub-micron technology. Furthermore, the effect of operating temperature and supply voltage on the performance of the TRNG is also analyzed. To mitigate these issues we study entropy extraction mechanisms based both on an algorithmic approach and on circuit tuning, and compare these techniques based on their tolerance to process variation and the energy overhead for correction. We combine the two approaches to efficiently perform self-calibration, using a hybrid of algorithmic correction and circuit tuning to compensate for the effect of variations. The proposed technique provides a fair trade-off between the degree of entropy extraction and the overhead in terms of area and energy, introducing minimal correlation in the output of the TRNG.
Besides the study of the effect of process variation and operating conditions on the TRNG, we also propose to study the possible attack models on a TRNG. Finally, we propose a probabilistic approach to design and analysis of TRNG using a stochastic model of the circuit operation and incorporating the random source in thermal noise. All analysis is done for 45nm technology using the NCSU PDK transistor models. The simulation platform is developed using HSPICE and a Perl based automation flow.
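A toy sketch of the bias-versus-extraction trade-off discussed above (a hypothetical Python model of a skewed bit source with von Neumann debiasing as the algorithmic extraction step; the actual work concerns circuit-level HSPICE simulation, not this simplified model):

```python
import random

def metastable_trng(n, p_one, rng=random.Random(99)):
    """Toy model of a metastability-based TRNG whose resolution is
    skewed, e.g. by process variation or supply-voltage drift:
    each raw bit is 1 with probability p_one instead of the ideal 0.5."""
    return [1 if rng.random() < p_one else 0 for _ in range(n)]

def von_neumann(bits):
    """Algorithmic entropy extraction: map bit pairs 01 -> 0, 10 -> 1
    and discard 00 and 11. For independent bits this removes bias
    entirely, at the cost of throughput (only 2*p*(1-p) pairs survive)."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

raw = metastable_trng(20000, p_one=0.7)    # heavily skewed device
out = von_neumann(raw)
print(f"raw one-fraction:       {sum(raw) / len(raw):.3f}")
print(f"corrected one-fraction: {sum(out) / len(out):.3f} "
      f"({len(out)} of {len(raw)} bits kept)")
```

The throughput loss is why purely algorithmic correction is expensive for badly skewed devices, and why combining it with circuit tuning, as proposed above, pays off.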
10

A quantum entropy source based on Single Photon Entanglement

Leone, Nicolò 26 April 2022 (has links)
In this thesis, I report on how to use Single Photon Entanglement for generating certified quantum random numbers. Single Photon Entanglement is a particular type of entanglement which involves non-contextual correlations between two degrees of freedom of a single photon; in particular, here I consider momentum and polarization. The presence of the entanglement was validated using different attenuated coherent and incoherent sources of light by evaluating the Bell inequality, a well-known entanglement witness. Different non-idealities in the calculation of the inequality are discussed, addressing them both theoretically and experimentally. Then, I discuss how to use Single Photon Entanglement for generating certified quantum random numbers using a semi-device-independent protocol. The protocol is based on a partial characterization of the experimental setup and the violation of Bell's inequality. An analysis of the non-idealities of the devices employed in the experimental setup is also presented. In the last part of the thesis, the integrated photonic version of the previously introduced experiments is discussed: first, I present how to generate single photon entangled states exploiting different degrees of freedom with respect to the bulk experiment. Second, I discuss how to perform an integrated test of Bell's inequality.
