  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
681

Architectures for fault-tolerant quantum computation

O'Gorman, Joe January 2017 (has links)
Quantum computing has enormous potential, but this can only be realised if quantum errors can be controlled sufficiently to allow quantum algorithms to be completed reliably. However, quantum-error-corrected logical quantum bits (qubits) that can be said to have achieved meaningful error suppression have not yet been demonstrated. This thesis reports research on several topics related to the challenge of designing fault-tolerant quantum computers. The first topic is a proposal for achieving large-scale error correction with the surface code in a silicon-donor-based quantum computing architecture. This proposal relaxes the donor-placement precision demanded by previous schemes from the single-atom level to the order of 10 nm in some regimes, as shown by numerical simulation of the surface code threshold. The second topic is the development of a method for benchmarking and assessing the performance of small error-correcting codes in few-qubit systems, introducing a metric called 'integrity', closely linked to the trace distance, and proposing experiments to demonstrate various stepping stones on the way to 'strictly superior' quantum error correction. Most quantum error correcting codes, including the surface code, do not allow fault-tolerant universal computation without the addition of extra gadgets. One method of achieving universality is to distil and then consume high-quality 'magic states', a process that adds overhead over and above that incurred by the base-level quantum error correction. The latter parts of this thesis investigate how many physical qubits are needed in a 'magic state factory' within a surface code quantum computer and introduce a number of techniques to reduce the overhead of leading magic state methods.
It is found that universal quantum computing is achievable with ~16 million qubits if error rates across a device are kept below 10^-4. In addition, the thesis introduces improved methods of magic state distillation for unconventional magic states that allow logical small-angle rotations, and shows that this can be more efficient than synthesising these operations from the gates provided by traditional magic states.
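The 'integrity' metric above is described as closely linked to the trace distance between quantum states. As an illustration only (not code from the thesis), here is a minimal NumPy sketch of the standard trace distance D(rho, sigma) = (1/2) Tr|rho - sigma|, evaluated for a pure state against a slightly depolarised copy of itself:

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance D(rho, sigma) = (1/2) * Tr |rho - sigma|.

    For Hermitian matrices the singular values of the difference equal
    the absolute values of its eigenvalues, so summing them gives Tr|.|.
    """
    svals = np.linalg.svd(rho - sigma, compute_uv=False)
    return 0.5 * svals.sum()

# Example: a pure |0> state vs. a depolarised version of it.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
p = 0.1  # depolarising probability (illustrative value)
sigma = (1 - p) * rho + p * np.eye(2) / 2

print(trace_distance(rho, sigma))  # -> 0.05
```

For this channel the trace distance works out to p/2, which is why experimental error-suppression claims are often phrased in terms of distance-like quantities.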
682

Synchronization of multi-carrier CDMA signals and security on internet.

January 1996 (has links)
by Yooh Ji Heng. Thesis (M.Phil.)--Chinese University of Hong Kong, 1996. Includes bibliographical references (leaves 119-128). Appendix in Chinese.

Contents:
Part I --- Synchronization of Multi-carrier CDMA Signals --- p.1
Chapter 1 --- Introduction --- p.2
  1.1 Spread Spectrum CDMA --- p.4
    1.1.1 Direct Sequence/SS-CDMA --- p.5
    1.1.2 Frequency Hopping/SS-CDMA --- p.5
    1.1.3 Pseudo-noise Sequence --- p.6
  1.2 Synchronization for CDMA Signals --- p.7
    1.2.1 Acquisition of PN Sequence --- p.7
    1.2.2 Phase Locked Loop --- p.8
Chapter 2 --- Multi-carrier CDMA --- p.10
  2.1 System Model --- p.11
  2.2 Crest Factor --- p.12
  2.3 Shapiro-Rudin Sequence --- p.14
Chapter 3 --- Synchronization and Detection by Line-Fitting --- p.16
  3.1 Unmodulated Signals --- p.16
  3.2 Estimating the Time Shift by Line-Fitting --- p.19
  3.3 Modulated Signals --- p.22
Chapter 4 --- Matched Filter --- p.23
Chapter 5 --- Performance and Conclusion --- p.27
  5.1 Line Fitting Algorithm --- p.27
  5.2 Matched Filter --- p.28
  5.3 Conclusion --- p.30
Part II --- Security on Internet --- p.31
Chapter 6 --- Introduction --- p.32
  6.1 Introduction to Cryptography --- p.32
    6.1.1 Classical Cryptography --- p.33
    6.1.2 Cryptanalysis --- p.35
  6.2 Introduction to Internet Security --- p.35
    6.2.1 The Origin of the Internet --- p.35
    6.2.2 Internet Security --- p.36
    6.2.3 Internet Commerce --- p.37
Chapter 7 --- Elementary Number Theory --- p.39
  7.1 Finite Field Theory --- p.39
    7.1.1 Euclidean Algorithm --- p.40
    7.1.2 Chinese Remainder Theorem --- p.40
    7.1.3 Modular Exponentiation --- p.41
  7.2 One-way Hashing Functions --- p.42
    7.2.1 MD2 --- p.43
    7.2.2 MD5 --- p.43
  7.3 Prime Numbers --- p.44
    7.3.1 Listing of Prime Numbers --- p.45
    7.3.2 Primality Testing --- p.45
  7.4 Random/Pseudo-Random Numbers --- p.47
    7.4.1 Examples of Random Number Generators --- p.49
Chapter 8 --- Private Key and Public Key Cryptography --- p.51
  8.1 Block Ciphers --- p.51
    8.1.1 Data Encryption Standard (DES) --- p.52
    8.1.2 International Data Encryption Algorithm (IDEA) --- p.54
    8.1.3 RC5 --- p.55
  8.2 Stream Ciphers --- p.56
    8.2.1 RC2 and RC4 --- p.57
  8.3 Public Key Cryptosystems --- p.58
    8.3.1 Diffie-Hellman --- p.60
    8.3.2 Knapsack Algorithm --- p.60
    8.3.3 RSA --- p.62
    8.3.4 Elliptic Curve Cryptosystems --- p.63
    8.3.5 Public Key vs. Private Key Cryptosystems --- p.64
  8.4 Digital Signatures --- p.65
    8.4.1 ElGamal Signature Scheme --- p.66
    8.4.2 Digital Signature Standard (DSS) --- p.67
  8.5 Cryptanalysis of Current Cryptosystems --- p.68
    8.5.1 Differential Cryptanalysis --- p.68
    8.5.2 An Attack on RC4 in Netscape 1.1 --- p.69
    8.5.3 A Timing Attack on Diffie-Hellman and RSA --- p.71
Chapter 9 --- Network Security and Electronic Commerce --- p.73
  9.1 Network Security --- p.73
    9.1.1 Passwords --- p.73
    9.1.2 Network Firewalls --- p.76
  9.2 Implementations for Network Security --- p.79
    9.2.1 Kerberos --- p.79
    9.2.2 Privacy-Enhanced Mail (PEM) --- p.80
    9.2.3 Pretty Good Privacy (PGP) --- p.82
  9.3 Internet Commerce --- p.83
    9.3.1 Electronic Cash --- p.85
  9.4 Internet Browsers --- p.87
    9.4.1 Secure NCSA Mosaic --- p.87
    9.4.2 Netscape Navigator --- p.89
    9.4.3 SunSoft HotJava --- p.91
Chapter 10 --- Examples of Electronic Commerce Systems --- p.94
  10.1 CyberCash --- p.95
  10.2 DigiCash --- p.97
  10.3 The Financial Services Technology Consortium --- p.98
    10.3.1 Electronic Check Project --- p.99
    10.3.2 Electronic Commerce Project --- p.101
  10.4 FirstVirtual --- p.103
  10.5 Mondex --- p.104
  10.6 NetBill --- p.106
  10.7 NetCash --- p.108
  10.8 NetCheque --- p.111
Chapter 11 --- Conclusion --- p.113
Appendix A --- An Essay on the Chinese Remainder Theorem and RSA --- p.115
Bibliography --- p.119
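Appendix A of this thesis pairs the Chinese Remainder Theorem with RSA; the classic connection between the two is RSA-CRT decryption, where one decrypts modulo p and q separately and recombines the halves. A minimal sketch of that standard technique follows, with toy parameters far too small for real use (this is illustrative code, not from the thesis):

```python
def egcd(a, b):
    # Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    g, x, _ = egcd(a, m)
    assert g == 1, "inverse exists only when gcd(a, m) == 1"
    return x % m

# Toy RSA parameters (far too small for real security).
p, q = 61, 53
n = p * q                          # 3233
e = 17
d = modinv(e, (p - 1) * (q - 1))   # private exponent, here 2753

def decrypt_crt(c):
    # Decrypt modulo p and q separately (smaller exponentiations),
    # then recombine with the Chinese Remainder Theorem (Garner's form).
    mp = pow(c, d % (p - 1), p)
    mq = pow(c, d % (q - 1), q)
    h = (modinv(q, p) * (mp - mq)) % p
    return mq + h * q

m = 65
c = pow(m, e, n)   # encrypt
assert decrypt_crt(c) == m
```

The two half-size exponentiations are why real RSA implementations store the CRT parameters alongside the private key.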
683

Perspective Identification in Informal Text

Elfardy, Hebatallah January 2017 (has links)
This dissertation studies the problem of identifying the ideological perspective of people as expressed in their written text. One's perspective is often revealed by one's stance towards polarizing topics. We are interested in how nuanced linguistic cues can be used to identify the perspective of a person in informal genres, and in exploring the problem multilingually, comparing and contrasting the linguistic devices used in English informal-genre datasets discussing American ideological issues and in Arabic discussion fora posts related to Egyptian politics. In doing so, we solve several challenges. Our foremost goal is building computational systems that can successfully identify the perspective from which a given informal text is written, while studying which linguistic cues work best for each language and drawing insights into the similarities and differences between the notion of perspective in the two languages. We build systems that successfully identify the stance of a person in English informal text on topics determined by one's perspective, such as the legalization of abortion, the feminist movement, and gay and gun rights; we also identify a more general notion of perspective, namely the choice of presidential candidate in 2012, and build systems for automatically identifying the different elements of a person's perspective given an Egyptian discussion forum comment. The systems utilize several lexical and semantic features for both languages.
Specifically, for English we explore the use of word sense disambiguation, opinion features, latent and frame semantics, as well as Linguistic Inquiry and Word Count features; for Arabic, in addition to sentiment and latent semantics, we study whether linguistic code-switching (LCS) between the standard and dialectal forms of the language can serve as a cue for uncovering the perspective from which a comment was written. This leads to the challenge of devising computational systems that can handle LCS in Arabic. Arabic is diglossic: the standard form of the language (MSA) coexists with regional dialects (DA) corresponding to the native mother tongues of Arabic speakers in different parts of the Arab world. DA is ubiquitous in written informal genres, and in most cases it is code-switched with MSA. The presence of code-switching degrades the performance of almost any MSA-only trained Natural Language Processing tool applied to DA or to code-switched MSA-DA content. To solve this challenge, we build a state-of-the-art system, AIDA, to computationally handle token-level and sentence-level code-switching. On a conceptual level, for handling and processing Egyptian ideological perspectives, we note the lack of a taxonomy of the most common perspectives among Egyptians and the lack of corresponding annotated corpora. To address this, we develop a taxonomy of the most common community perspectives among Egyptians and use an iterative feedback-loop process to devise guidelines for annotating a given online discussion forum post with the different elements of a person's perspective. Using the proposed taxonomy and annotation guidelines, we annotate a large set of Egyptian discussion fora posts, identifying a comment's perspective as conveyed in the priority it expresses as well as its stance on major political entities.
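Stance classifiers of the kind described above are typically built on lexical features. As a purely illustrative sketch (the training comments, labels, and Naive Bayes setup below are invented for illustration; they are not the dissertation's features or data), here is a minimal bag-of-words stance classifier:

```python
import math
from collections import Counter

# Toy multinomial Naive Bayes over bag-of-words stance labels.
# All training examples below are invented for illustration.
train = [
    ("gun ownership is a basic right", "pro"),
    ("we need stricter gun control now", "anti"),
    ("the right to bear arms matters", "pro"),
    ("too many guns cause violence", "anti"),
]

counts = {"pro": Counter(), "anti": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = {w for c in counts.values() for w in c}

def predict(text):
    best, best_lp = None, -math.inf
    for label in counts:
        # log prior + log likelihood with add-one (Laplace) smoothing
        lp = math.log(docs[label] / sum(docs.values()))
        total = sum(counts[label].values())
        for w in text.split():
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(predict("gun control saves lives"))   # -> "anti" on this toy data
```

Real perspective-identification systems replace the toy word counts with the kinds of semantic and psycholinguistic features the abstract lists, but the supervised-classification skeleton is the same.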
684

Code-switching from Cantonese to modern standard Chinese : a study of primary pupils in Hong Kong

Lau, Hui Yuen 01 January 1995 (has links)
No description available.
685

Code-mixing users in Hong Kong

Low, Wai Man Winnie 01 January 1999 (has links)
No description available.
686

Nový daňový řád / New Tax Regulations

Jančurová, Hana January 2010 (has links)
The basic aim of this diploma thesis is to present readers with Act No. 280/2009 Coll., the Tax Code, which came into force on 1 January 2011. Further aims are to describe the main benefits of the new Tax Code, to identify the main problems encountered in putting it into practice, and to evaluate its first year of operation. The thesis also explains how to proceed in accordance with the Tax Code, using specific examples of common situations that may arise in tax administration.
687

Testament a jeho právní úprava v historii, v současnosti a de lege ferenda / Testament and its legal regulation in the history, present and de lege ferenda

Hruboňová, Michaela January 2012 (has links)
This thesis is concerned with the historical development of the legislation governing the testament as a probate title in our country. It analyzes the different legal regulations, delivering a comprehensive look at this institute from the past to the present and, thanks to the new Civil Code, into the future. Its aim is to present the will to the general public as a significant probate title that respects the wishes of the testator and allows him to choose his heirs, or their inheritance shares, otherwise than as provided for by the rules of inheritance law. Since the new Civil Code returns some traditional institutions to our inheritance law, a familiarity with their historical form and development aids understanding. For each historical period, the thesis describes the form of the testament and its most relevant features, enabling easier comparison between the regulations. The first part presents the form of the testament under Roman law, which laid the foundations not only of this institute but of continental law as a whole; its conception of wills and inheritance law therefore cannot be ignored. The first chapter examines the Roman law requirements for the persons of the deceased and the heirs (testamentary capacity),...
688

Mapping beyond cartography : the experimental maps of artists working with locative media

Frodsham, Daniel James January 2015 (has links)
The experimental maps produced by artists working with locative media both bear witness to and participate in a radical reworking of the way in which space is conceived and encountered, one that destabilizes longstanding assumptions about the nature of representation, knowledge, and power. These mapmaking practices, it is argued, operate at the juncture of a cartographic tradition that entails distinctively modern ways of seeing, knowing, and acting in the world, and digital technologies and software operations that propose alternative ways of linking the world up. The thesis charts how these art maps engage in a critique of cartography, the extent to which they remain indebted to it, but also their use of coded operations to pioneer novel apprehensions of space that mark a decisive ‘break’ with a modern worldview. The map works of locative media are accordingly positioned in relation to what is seen as a paradigmatic shift from Cartographic Space to Code Space, and the analysis of case studies supplies a means of comprehending this ongoing transformation, demonstrating that mapping survives beyond cartography but entails a tearing apart of the cartographic surface and the representational epistemology that accompanies it. Gone are the compass, scale and fixed points by which, for centuries, a sense of place was anchored and the world made knowable, yet to be set adrift in this way is not to be left ‘all at sea’. Working with the novel intuitions, forms and geometries that arise from the operations of software code, post-cartographical mapping practices continue to supply a sense of orientation. However, they also pioneer novel forms of territory, and power over territory, that call for new strategies of counter-mapping and, with it, a ‘post-cartographical’ reframing of the study of locative media.
Now pictured as a site of contestation between antithetical spatial paradigms, locative media is rehabilitated as a vital force, operating at a pivotal moment, in a broadly epoch-defining reshaping of space and spatial representation.
689

ML4JIT - um arcabouço para pesquisa com aprendizado de máquina em compiladores JIT. / ML4JIT - a framework for research on machine learning in JIT compilers.

Alexandre dos Santos Mignon 27 June 2017 (has links)
Determining the best set of optimizations to apply to a program has been the focus of research on compiler optimization for decades. In general, the set of optimizations is defined manually by compiler developers and applied to all programs. Supervised machine learning techniques have been used to develop code-optimization heuristics, aiming to determine the best set of optimizations with minimal human intervention. This work presents ML4JIT, a framework for research on machine learning in JIT compilers for the Java language. The framework supports research into tuning the set of optimizations specifically for each method of a program. Experiments were performed to validate the framework, checking whether its use reduced both the compilation time of the methods and the execution time of the program.
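The core idea, selecting an optimization set per method rather than applying one global set, can be sketched very simply. The following is an invented illustration, not the ML4JIT API: the method features (bytecode size, loop count, call count) and the optimization names are hypothetical, and the 1-nearest-neighbour rule stands in for whatever learner a real framework would train:

```python
# Per-method optimisation selection via a 1-nearest-neighbour rule.
# Feature vectors and optimisation sets below are invented examples:
# (bytecode_size, loop_count, call_count) -> best-known optimisation set.
training = [
    ((10, 0, 1), {"inline"}),
    ((200, 4, 8), {"inline", "unroll", "licm"}),
    ((50, 1, 2), {"inline", "licm"}),
    ((500, 8, 20), {"unroll", "licm", "vectorize"}),
]

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def choose_optimizations(features):
    # Return the optimisation set of the most similar training method.
    _, best_set = min(training, key=lambda t: distance(t[0], features))
    return best_set

# A small, loop-free method lands near the lightweight training example.
print(choose_optimizations((15, 0, 1)))  # -> {'inline'}
```

A JIT would call such a predictor at compile time for each method, trading a cheap feature computation for (hopefully) faster compilation and execution.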
690

Projeto e confecção de simuladores oftálmicos para aplicações clínicas / Design and construction of ophthalmic simulators for clinical applications

Andrea Sanchez 09 June 2006 (has links)
This work presents a calculation methodology for dose determination in structures of the human eye, such as the sclera, choroid, retina, lens, vitreous body, optic nerve and disc, and cornea, as well as in the tumor, due to treatment with ophthalmic plaques. A model of the human eye was constructed, faithful to its main structures and dimensions, together with mathematical models of a Co-60 plaque and an I-125 seed plaque that account for the size and geometric arrangement of the real sources, using the Monte Carlo code MCNP-4C. This model can calculate the axial and radial dose distributions at any point of the eye and for each of its structures. An acrylic eye simulator was also built to obtain experimental results for validating both models: it consists of an acrylic sphere sliced into foils 1 mm thick, allowing radiographic film to be inserted to measure the axial and radial doses under the same conditions as the MCNP-4C simulations. The experimental data were used to validate the MCNP-4C results. The data from the mathematical model will form the basis of a data bank of doses for all eye structures, tumor positions and sizes, and any ophthalmic plaque used for treatment. This data bank will be the core of a national software package for dose calculation, which could become part of a reliable treatment planning system for radiotherapy/brachytherapy.
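Monte Carlo dose codes of the kind used here tally energy deposits in spatial bins around the source. As a toy sketch in the same spirit (this is not MCNP; the attenuation coefficient, history count, and single-interaction physics are all invented simplifications), the following estimates a relative radial dose profile around a point source:

```python
import math
import random

random.seed(42)  # reproducible toy run

# Toy Monte Carlo: photons leave a point source isotropically, travel an
# exponentially distributed distance to their first interaction, and
# deposit all their energy there. Deposits are tallied in 1 mm thick
# spherical shells. MU and N are invented, illustrative values.
MU = 0.1            # attenuation coefficient, interactions per mm
N = 100_000         # photon histories
shells = [0.0] * 30

for _ in range(N):
    r = random.expovariate(MU)   # distance to first interaction (mm)
    shell = int(r)
    if shell < len(shells):
        shells[shell] += 1.0

# Dose per unit volume: divide each tally by its shell volume,
# (4/3) * pi * ((i+1)^3 - i^3), so the geometric falloff appears.
dose = [
    count / ((4 / 3) * math.pi * ((i + 1) ** 3 - i ** 3))
    for i, count in enumerate(shells)
]
assert dose[0] > dose[5] > dose[20]  # dose decreases with radius
```

A real code like MCNP-4C adds scattering, energy-dependent cross sections, and detailed source and eye geometry, but the tally-in-shells structure of the output is the same idea.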
