1

Low cost, short range free space quantum cryptography for consumer applications : pocket size for pocket change

Lowndes, D. L. D. January 2014 (has links)
A Quantum Key Distribution (QKD) system has been demonstrated with a focus on the applicability of the technology to a consumer use model. The optical devices were split into a large, expensive "Quantum ATM" (the Bob terminal) and a small, cheap handheld section (the Alice device). Work was initially done to integrate the devices into a realistic demonstration system, which was exhibited at several international conferences. Following this success, the devices' constituent parts were isolated and improved upon. Work on the Alice device focused on proposing a new scheme for light collimation and then investigating a method to enact it. In the Bob device, progress was directed towards a more effective single-photon detection system utilising active quenching. An asymptotic secure key rate of 20 kb/s was obtained at a bit error rate (BER) of 4%, corresponding to a daylight operation scenario. A docking scheme was enacted to simplify the alignment of the optical channel, which was shown to be repeatable over at least 50 cycles. The system was also shown to be stable over an extended period of time.
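The "asymptotic secure key rate" quoted above can be illustrated with the standard asymptotic BB84 lower bound, in which the sifted-key rate is scaled by 1 - 2h(QBER) to account for error correction and privacy amplification. A minimal sketch (the sifted-key rate below is a hypothetical figure, not taken from the thesis):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy h(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def asymptotic_key_fraction(qber: float) -> float:
    """Asymptotic BB84 lower bound on the secure-key fraction,
    r = 1 - 2*h(QBER): one h(QBER) for error correction,
    one for privacy amplification."""
    return max(0.0, 1.0 - 2.0 * binary_entropy(qber))

sifted_rate = 40e3  # hypothetical sifted-key rate in bit/s
qber = 0.04         # the 4% BER quoted in the abstract
print(f"secure fraction at 4% QBER: {asymptotic_key_fraction(qber):.3f}")
print(f"secure key rate: {sifted_rate * asymptotic_key_fraction(qber):.0f} bit/s")
```

At 4% QBER the secure fraction is roughly half the sifted rate, which is why daylight operation (higher background, higher QBER) is the demanding scenario.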
2

Reconciliation and estimation for a short range quantum cryptography system

Godfrey, Mark January 2010 (has links)
The University of Bristol has been developing a free-space, low-cost, test-bed quantum cryptography system for the past five years, and the work in this thesis is concerned with the algorithms and protocols used in this system. The use scenario is that a hand-held Alice device can be used in conjunction with a fixed terminal, Bob, to expand an existing store of shared secret key bits between the two parties. This requires that both devices are compact and low-cost, the hand-held Alice device especially so, as there will be a high ratio of Alice to Bob devices. To both satisfy the cost constraint and allow rapid prototyping, the physical setup uses off-the-shelf optical and electrical components. The first part of the work in this thesis analyses the security and performance implications of implementation imperfections. In order to produce a secret key string, novel data processing and reconciliation algorithms are described, optimised for the scenario where the Alice device has less computational power than Bob: specifically, clock-period recovery, elimination of background noise by temporal gating, and synchronisation, with low-density parity-check (LDPC) codes used for the error-correction stage. In light of recent results on the security of quantum cryptography systems, estimates are calculated which show that the hardware needs to be further improved in order to expand the key store in a convenient period of time. Several improved models are simulated that explore the modifications required to bring the key expansion times into an acceptable range.
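The temporal-gating step mentioned above, discarding detection events whose arrival phase within the clock period falls outside a window around the expected pulse time, can be sketched as follows (the gate parameters are illustrative, not those of the Bristol system, and wrap-around at the period boundary is ignored for simplicity):

```python
def temporal_gate(event_times_ns, period_ns, gate_center_ns, gate_width_ns):
    """Keep only detection events whose arrival phase within the
    clock period lies inside the gate; background counts arrive
    uniformly in time, so most of them are rejected."""
    half = gate_width_ns / 2
    kept = []
    for t in event_times_ns:
        phase = t % period_ns
        if abs(phase - gate_center_ns) <= half:
            kept.append(t)
    return kept

# events near multiples of the 10 ns period survive; the rest are noise
events = [0.1, 3.2, 10.05, 17.7, 20.2, 31.9]
print(temporal_gate(events, period_ns=10.0, gate_center_ns=0.0, gate_width_ns=1.0))
```

In practice the period and gate centre would come from the clock-period recovery and synchronisation stages described in the abstract.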
3

Power analysis attacks and countermeasures for block ciphers

Boey, Kean Hong January 2012 (has links)
In today's digital world, cryptographic algorithms are used in all aspects of our daily life. Whilst most modern cryptographic algorithms are secure against theoretical attacks, they can be compromised by monitoring the power consumption of the device executing them, a class of attack known as power analysis. The differential power analysis (DPA) attack is the most powerful of these. Although several countermeasures have been proposed to defend against power analysis attacks, these techniques are costly to develop or need to be designed for specific encryption algorithms. In this thesis, a number of techniques and practical experiments were undertaken to explore DPA attacks in more detail. DPA attacks were performed on two block ciphers, CAST-128 and SEED, both of which use two round keys in each round function. Existing power analysis attack strategies are not suitable for cryptographic algorithms of this kind, so two attack strategies targeting the S-Box component of each algorithm were proposed to reveal the round keys. Unlike previous research, which has mostly focused on simulation-based analysis of the S-Box component, this research involved a focused analysis of DPA attacks on hardware implementations of the S-Box component, to investigate which S-Boxes are more secure against such attacks. Based on this analysis, some recommendations for more power-analysis-resistant S-Box functionality were proposed. Two novel countermeasures, which misalign the power traces by randomly inserting idle or dummy cycles between two or more consecutive operations, are proposed to counteract power analysis attacks. The proposed countermeasures can increase the resistance of a cryptographic device by reducing the overall SNR by more than 94%.
Both proposed countermeasures are better in terms of area and power consumption than other countermeasure techniques.
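The misalignment countermeasure described above can be illustrated with a toy simulation: random idle samples are inserted before each operation's power sample, so that the same operation no longer lands at the same time offset in successive traces, which defeats the point-wise averaging DPA relies on. The trace values and parameters below are made up for illustration:

```python
import random

def misalign_trace(trace, max_dummies=3, idle_value=0.0, seed=None):
    """Insert 0..max_dummies idle samples before each operation's
    power sample, desynchronising the trace in time. A hardware
    implementation would insert idle or dummy *cycles*; here each
    list element stands in for one operation's power sample."""
    rng = random.Random(seed)
    out = []
    for sample in trace:
        out.extend([idle_value] * rng.randint(0, max_dummies))
        out.append(sample)
    return out

trace = [1.2, 3.4, 2.2, 5.1]  # toy power samples, one per operation
print(misalign_trace(trace, seed=7))
```

Because the inserted offsets are random per execution, two traces of the same computation no longer line up sample-for-sample, reducing the effective SNR seen by the attacker.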
4

Applications of search techniques to cryptanalysis and the construction of cipher components

McLaughlin, James January 2012 (has links)
In this dissertation, we investigate the ways in which search techniques, and in particular metaheuristic search techniques, can be used in cryptology. We address the design of simple cryptographic components (Boolean functions), before moving on to more complex entities (S-boxes). The emphasis then shifts from the construction of cryptographic artefacts to the related area of cryptanalysis, in which we first derive non-linear approximations to S-boxes more powerful than the existing linear approximations, and then exploit these in cryptanalytic attacks against the ciphers DES and Serpent.
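As a flavour of how search techniques can construct Boolean-function components, the sketch below hill-climbs on nonlinearity, computed from the Walsh-Hadamard spectrum. This is a generic toy search under standard definitions, not the specific algorithms of the dissertation:

```python
import random

def walsh_hadamard(tt):
    """Fast Walsh-Hadamard transform of a 0/1 truth table (converted
    to +1/-1 form); returns the Walsh spectrum."""
    f = [1 - 2 * b for b in tt]
    n, h = len(f), 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = f[j], f[j + h]
                f[j], f[j + h] = x + y, x - y
        h *= 2
    return f

def nonlinearity(tt):
    """NL(f) = 2^(n-1) - max|W_f|/2 for an n-variable function."""
    return len(tt) // 2 - max(abs(w) for w in walsh_hadamard(tt)) // 2

def hill_climb(n_vars=5, iters=2000, seed=1):
    """Flip single truth-table bits, keeping moves that do not
    decrease nonlinearity -- a toy local search for high-NL
    Boolean functions."""
    rng = random.Random(seed)
    tt = [rng.randint(0, 1) for _ in range(2 ** n_vars)]
    best = nonlinearity(tt)
    for _ in range(iters):
        i = rng.randrange(len(tt))
        tt[i] ^= 1
        nl = nonlinearity(tt)
        if nl >= best:
            best = nl
        else:
            tt[i] ^= 1  # undo the flip
    return best

print("best nonlinearity found (5 variables):", hill_climb())
```

Metaheuristics such as simulated annealing or memetic algorithms refine this basic scheme with acceptance rules that can escape local optima.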
5

Advancements in password-based cryptography

Kiefer, Franziskus January 2016 (has links)
Password-based authentication is the most popular authentication mechanism for humans today, on the internet and beyond. Despite increasing efforts to move to supposedly more secure alternatives, password-based authentication is likely to stay for the foreseeable future due to its user experience and convenience. However, although secure cryptographic protocols for password-based authentication and key-exchange exist, they are hardly used in practice. While previous work on password-based cryptography includes secure password-based key-exchange, authentication and secret-sharing protocols, this thesis sets out to bring cryptographic password-based protocols closer to real-world deployment and to improve their security guarantees. To this end we propose frameworks for password-based authentication and key-exchange in the verifier-based and two-server settings as a step towards deploying cryptographically secure password-based protocols. These frameworks include not only the authentication/key-exchange step, which has been researched before, but also the registration of prospective client passwords, which has not been considered before. In particular, the first step of each proposed framework is the secure registration of passwords with limited trust assumptions on server and client: the server enforces a password policy guaranteeing minimum security of client passwords, while the client computes the password verifier or password shares on the client side. While this essential first step has hardly been explored before, the second step, the actual authentication and key-exchange protocol, enjoys a large body of research in the plain single-server setting. In this thesis, however, we focus on the less well-studied verifier-based and two-server settings, where we propose new protocols for both settings and the first security model for two-server protocols in the UC framework.
The theoretical work is underpinned by implementations of the password registration phase that allow a comparison of not only the security but also the performance of the proposed protocols. To further facilitate adoption and demonstrate usability, we show real-world usage of the verifier-based framework by implementing a demo application and a Firefox extension that allow the proposed framework to be used for account registration and authentication.
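The core idea of verifier-based registration, that the client enforces a policy check and derives a verifier locally so the server never sees the plain password, can be illustrated with a non-interactive toy. Real verifier-based PAKE registration is an interactive protocol with blind policy checking; the function names, the toy length-only policy, and the scrypt parameters here are all illustrative assumptions:

```python
import hashlib
import hmac
import os

def register(password: str, policy_min_len: int = 8):
    """Client-side registration sketch: check a (toy) policy, then
    derive a salted verifier with a memory-hard KDF. Only (salt,
    verifier) would be sent to the server, never the password."""
    if len(password) < policy_min_len:
        raise ValueError("password violates policy")
    salt = os.urandom(16)
    verifier = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, verifier

def check(password: str, salt: bytes, verifier: bytes) -> bool:
    """Server-side check: recompute and compare in constant time."""
    cand = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(cand, verifier)

salt, v = register("correct horse battery")
print(check("correct horse battery", salt, v))
```

A verifier-based PAKE improves on this sketch by never revealing the password even during authentication, and the thesis's registration protocols additionally let the server verify policy compliance without learning the password at all.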
6

Analysis of some modern symmetric ciphers

Mirza, Fauzan ul-Haque January 2002 (has links)
No description available.
7

The viability of an ITS for word processing in education and commerce

Jupp, Dax January 2001 (has links)
No description available.
8

High performance, low cost and low consumption radio-over-fiber systems for diversified communications applications

Nanni, Jacopo 14 December 2018 (has links)
This dissertation analyses the possibility of improving, in terms of cost and consumption, future Radio-over-Fiber (RoF) systems in different telecommunication scenarios, such as current and next-generation cellular networks, as well as other applications such as radio astronomy. The RoF system studied is composed of a Vertical Cavity Surface Emitting Laser (VCSEL) operating at 850 nm, standard single-mode fiber (SSMF) and a SiGe Heterojunction Phototransistor (HPT), adopting the Intensity Modulation - Direct Detection (IM-DD) technique, which is nowadays the cheapest and simplest architecture for RoF. The dissertation describes in detail the multimode propagation within the SSMF (designed to operate only at 1310 nm and 1550 nm) which is present at 850 nm. Through a mathematical model developed for the purpose, the two-mode propagation is described and the main phenomena involved are analysed. In particular, the model focuses on intermodal dispersion and modal noise, which are considered the two main contributors to performance degradation.
The model is able to identify the main parameters which enhance the detrimental effects produced by intermodal dispersion and modal noise, in both the frequency and time domains. Starting from the model, possible techniques to improve performance are proposed. In particular, a pre-filtering technique is realised in order to avoid the excitation of the second-order mode, allowing quasi-single-mode propagation within the SSMF. The technique is theoretically and experimentally validated both for single radio-frequency sinusoidal transmission and for bandpass signal transmission centred in a radio-frequency band. In particular, the possibility of increasing the modulation bandwidth of the RoF system, while reducing the fluctuations of power and gain, is demonstrated experimentally. Furthermore, the technique is validated in a real LTE transmission system, making the proposed RoF technology able to transmit a 256-QAM LTE signal of 20 MHz bandwidth and confirming the possibility of using this technology to decrease the overall cost and consumption of the network. Further work has been done on the mathematical model: the two-mode propagation is exploited in reverse in order to characterise the chirp parameter of the VCSEL employed. Finally, the problem of coupling between the fiber and the opto-electronic devices is discussed and investigated in order to enhance performance while keeping the cost low. The possibility of utilising a collective, passive polymer-based structure for coupling the optical fiber with small-area photodetectors and VCSELs is presented, showing important improvements in coupling efficiency and tolerance to misalignment.
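The RF power fading caused by two fiber modes interfering with a differential delay at the photodiode can be sketched with a simple two-path transfer function. The mode power ratio and delay figures below are assumed for illustration, not measured values from the thesis:

```python
import cmath
import math

def rf_transfer(f_rf_hz: float, delta_tau_s: float, mode_ratio: float = 0.5) -> float:
    """Normalised magnitude of the RF transfer function when two
    modes with differential delay delta_tau beat on the detector:
    |H(f)| = |1 + a*exp(-j*2*pi*f*delta_tau)| / (1 + a)."""
    a = mode_ratio
    h = 1 + a * cmath.exp(-2j * math.pi * f_rf_hz * delta_tau_s)
    return abs(h) / (1 + a)

# assumed figures: 2 ns/km differential mode delay over 100 m of SSMF
delta_tau = 2e-9 * 0.1
for f in (0.0, 1e9, 2.5e9):
    print(f"{f/1e9:.1f} GHz -> |H| = {rf_transfer(f, delta_tau):.3f}")
```

The notch at f = 1/(2*delta_tau) is the intermodal-dispersion penalty the pre-filtering technique removes: with the second-order mode suppressed (mode_ratio near 0), |H| stays flat and the usable modulation bandwidth increases.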
9

Design of WDM-PON Network

Lásko, Jan January 2012 (has links)
The work is focused on the design of an optical network using WDM-PON technology. It describes passive technologies such as APON, GPON and EPON. The theoretical part covers the elements used in WDM-PON networks, such as transmitters and receivers; the distribution part focuses on optical fibers. The work also describes quality of service (QoS) and Triple Play. The practical part covers the attenuation balance, the choice of locality and a description of the individual parts of the optical network, followed by the microtubing technology and a financial analysis. The work concludes with a simulation of an optical network for 96 clients at a distance of 1 60 km.
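The attenuation balance mentioned in the practical part amounts to summing the per-element losses of the passive link and checking the total against the transceiver power budget. A sketch with typical, purely illustrative loss figures:

```python
def link_budget(length_km: float, n_splitter_stages: int = 0,
                splitter_loss_db: float = 3.5,
                fiber_att_db_per_km: float = 0.35,
                n_connectors: int = 2, connector_loss_db: float = 0.3,
                n_splices: int = 2, splice_loss_db: float = 0.1) -> float:
    """Total attenuation of a passive optical link in dB: fiber
    attenuation plus splitter, connector and splice losses.
    All default loss figures are illustrative, not from the thesis."""
    return (length_km * fiber_att_db_per_km
            + n_splitter_stages * splitter_loss_db
            + n_connectors * connector_loss_db
            + n_splices * splice_loss_db)

# e.g. a 20 km feeder with one wavelength-splitting stage at the ODN
total = link_budget(20, n_splitter_stages=1)
print(f"total attenuation: {total:.2f} dB")
```

The design then requires transmit power minus receiver sensitivity to exceed this total plus a safety margin; in a WDM-PON the splitting stage is typically an AWG rather than a power splitter, with its own insertion loss.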
10

Secure digital documents using Steganography and QR Code

Hassanein, Mohamed Sameh January 2014 (has links)
With the increasing use of the Internet, several problems have arisen regarding the processing of electronic documents, including content filtering and content retrieval/search. Moreover, document security has taken centre stage, including copyright protection, broadcast monitoring and so on. There is an acute need for an effective tool which can establish the identity, location and time of a document's creation, so that it can be determined whether or not the contents of the document were tampered with after creation. Owing to the sensitivity of the large amounts of data processed on a daily basis, verifying the authenticity and integrity of a document is more important now than it ever was. Unsurprisingly, document authenticity verification has become a centre of attention in the research world, and this research is concerned with creating a tool which addresses the above problem. It proposes the use of a Quick Response Code (QR code) as a message carrier for Text Key-print. Text Key-print is a novel method which employs the basic elements of the language (i.e. the characters of the alphabet) to achieve authenticity of electronic documents through the transformation of their physical structure into a logically structured relationship. The resultant dimensional matrix is converted into a binary stream and encapsulated with a serial number or URL inside a QR code to form a digital fingerprint mark. For hiding the QR code, two image steganography techniques were developed, based on the spatial and transform domains. In the spatial domain, three methods were proposed and implemented based on least-significant-bit insertion, using a pseudorandom number generator to scatter the message over a set of arbitrary pixels.
These methods utilise the three colour channels of the RGB model to embed one, two or three bits per eight-bit channel, giving three different hiding capacities. The second technique is an adaptive approach in the transform domain, where a threshold value is calculated at a predefined embedding location to determine the embedding strength. The quality of the generated stego images was evaluated using both objective (PSNR) and subjective (DSCQS) methods to ensure the reliability of the proposed methods. The experimental results revealed that PSNR is not a strong indicator of the perceived stego-image quality, though not a bad interpreter of the actual quality either. Since the visual difference between the cover and the stego image must be absolutely imperceptible to the human visual system, it was logical to ask human observers with different qualifications and experience in image processing to evaluate the perceived quality of the cover and stego images. The subjective responses were analysed using statistical measures describing the distribution of the scores given by the assessors. The proposed scheme thus presents an alternative to the traditional techniques of digital signature and watermarking for protecting digital documents.
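The LSB-insertion method with pseudorandom pixel scattering, together with the PSNR measure used for objective evaluation, can be sketched on a single greyscale channel at one bit per pixel (the PRNG seed plays the role of a shared stego key; the thesis's methods extend this to one, two or three bits across the RGB channels):

```python
import math
import random

def embed_lsb(pixels, message_bits, seed):
    """Overwrite the least significant bit of pseudorandomly
    chosen pixels with the message bits (seed = shared stego key)."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(pixels)), len(message_bits))
    stego = list(pixels)
    for pos, bit in zip(positions, message_bits):
        stego[pos] = (stego[pos] & ~1) | bit
    return stego

def extract_lsb(stego, n_bits, seed):
    """Regenerate the same positions from the seed and read the LSBs."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(stego)), n_bits)
    return [stego[pos] & 1 for pos in positions]

def psnr(cover, stego, peak=255):
    """Peak signal-to-noise ratio in dB between cover and stego."""
    mse = sum((a - b) ** 2 for a, b in zip(cover, stego)) / len(cover)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

rng = random.Random(0)
cover = [rng.randrange(256) for _ in range(1024)]  # toy greyscale image
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, bits, seed=42)
print(extract_lsb(stego, len(bits), seed=42) == bits)
print(f"PSNR: {psnr(cover, stego):.1f} dB")
```

Since each embedded bit changes a pixel by at most 1, the PSNR stays very high, which illustrates the abstract's point: a high PSNR guarantees little about *perceived* quality, hence the complementary subjective DSCQS evaluation.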
