  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Contributions à la cryptographie post-quantique / Contributions to post-quantum cryptography

Deneuville, Jean-Christophe 01 December 2016 (has links)
Should a quantum computer ever be built, cryptographic primitives based on number theory would become obsolete. It is therefore important to design schemes that resist this new kind of threat. Euclidean lattices and error-correcting codes are two mathematical tools for constructing linear-algebra problems for which no quantum algorithm is currently known to offer a significant speedup. In this thesis, we propose four cryptographic primitives of this kind: two lattice-based signature schemes (including a traceable signature), a signature delegation protocol using fully homomorphic encryption, and a new framework for building very efficient and practical code-based cryptosystems. These contributions come with concrete parameters for gauging the computational cost of cryptographic primitives in a post-quantum world.
52

Vicefaktorová autentizace elektronických dokumentů / Multifactoral Authentication of Electronic Documents

Gancarčík, Lukáš January 2013 (has links)
The aim of the thesis is to provide complete information about electronic documents and the possibilities of their use. The focus is on authentication: how authentication information can be obtained and how the authentication process itself works. The thesis also proposes a multifactor authentication scheme for electronic documents for the selected company.
53

Identification of candidate signature genes and key regulators associated with trypanotolerance in the Sheko breed

Mekonnen, Yonatan Ayalew 31 January 2020 (has links)
No description available.
54

Information Retrieval by Identification of Signature Terms in Clusters

Muppalla, Sesha Sai Krishna Koundinya 24 May 2022 (has links)
No description available.
55

Consensualisme et formalisme à l'épreuve de l'électronique, étude de droit civil / Consensualism and formalism challenged by electronic, civil law study

Binois, Fanny 02 July 2019 (has links)
Although the study of formalism and consensualism is a classical theme of French law, confronting these major notions of private law with electronic means is less common. Yet the electronic transmission of wills challenges the classical principles of contract formation. The legislator has met this challenge only partially: with the abolition of the paper medium, these notions had to be harmonized so that wills, expressed in free or imposed forms, could circulate online. Without consensualism being explicitly targeted, the meeting of offer and acceptance was adapted to electronic constraints. The result, however, is a regulation whose terminology and regime remain imprecise. For the offer, the profusion of informative formalism without sanction, and the denial of its written form, are particularly troublesome. For acceptance, the scope of the double click and of the acknowledgment of receipt remains uncertain.
Concerning formalism, transposing the private-signature deed and the authentic deed online inevitably denatured these forms. For the former, the requirements grow heavier: how could handwritten mentions, which presuppose an irreducible physical link between the author's hand and the writing, be produced online without technical consolidation? For the latter, paradoxically, its future seems to depend on relaxing the mystical character of authenticity to make room for a new, electronic authenticity. Hence the need to refine the adaptation of these notions. That adaptation in fact reveals a deeper movement: once transposed to the electronic medium, consensualism and formalism appear unfaithful to their traditional conceptions, and both notions are exposed to a genuine transformation, which cannot be neutral for the general law of contracts. Concerning consensualism, the hierarchy created among forms and the discrimination against non-electronic forms suggest a renewal of the classical approach; the elaboration of a neo-consensualism is envisaged, bringing out a new duality of consensualism in the general law. As for formalism ad validitatem and ad probationem, the impact of electronic means bears on their mutual relations: online, their dissociation becomes impossible. Beyond textual assimilation, there would be a substantial fusion of writings, whatever their legal role, and such a union revives a unifying trend already known in the general law of contracts; only one step remains toward the unity of formalism. Far from abolishing these notions, their confrontation with electronic means ultimately invites us to rethink the principles of contract formation.
56

Fault Tolerance in Cryptographic Applications Using Cover-Free Families

Bardini Idalino, Thais 27 September 2019 (has links)
Cryptography is of fundamental importance to guarantee the security of communications, from the point-to-point transmission of messages and documents to the storage of big sets of data on cloud providers. As an example, we can use encryption, message authentication codes, and digital signatures to help us guarantee the privacy, integrity, and authenticity of data that is stored and/or transmitted. Many cryptographic solutions have a Boolean outcome; for example, the message is either authentic and accepted as it is, or it is not and so it needs to be rejected/re-transmitted. This outcome may be acceptable in scenarios where we can easily re-transmit the message, but it can pose a challenge when we deal with a large amount of data or a more sensitive content in which changes need to be further explored. In this context, this thesis proposes solutions to provide fault tolerance to certain cryptographic problems that traditionally have an all-or-nothing outcome. Fault tolerance is application dependent. In the case of a digital signature of a document that has been later modified, a fault-tolerant scheme can ensure authenticity and further identify which parts of the document were modified. This approach can be used in data forensics to investigate cybercrime, or to create redactable signatures for the purpose of privacy. In the case of aggregation of signatures, we consider an aggregation of a set of signatures containing a few invalid signatures (in the traditional sense). A fault-tolerant scheme is able to identify which signatures are valid and which ones are invalid, instead of rejecting the whole set. Digital signatures and aggregation of digital signatures require fault tolerance to be ensured at the origin (signer algorithm and aggregation algorithm, respectively), rather than just at the destination (verification algorithm). For this reason, we focus on techniques from combinatorial group testing that are nonadaptive rather than adaptive.
This is in contrast with other applications of group testing, such as batch verification of signatures, employed at the verifier's end, which allow both adaptive and nonadaptive solutions. In this work, we explore solutions for fault tolerance using techniques for the identification of defective elements used in nonadaptive combinatorial group testing. More specifically, we use the well-studied cover-free families (CFFs). A d-cover-free family d-CFF(t, n) is a set system with n subsets of a t-set, where the union of any d subsets does not contain any other. A d-CFF(t, n) allows for the identification of up to d defective elements in a set of n elements by performing only t tests (typically t ≪ n). In the literature, CFFs are used to solve many problems in cryptography. In this work, we explore different aspects of cover-free families in order to better approach fault tolerance problems. The problems we investigate can be divided into two categories: static problems (fixed size) and dynamic problems (increasing size). In the context of static problems, we consider modification-tolerant digital signature schemes, which allow the identification of modifications in signed data using a d-CFF, and in some cases the correction of such modifications in order to retrieve the originally signed data. We also propose a generalization of the classical definition of a d-CFF to support a variable cover-free property, followed by some constructions and potential applications in cryptography. For dynamic problems, we consider the application of fault-tolerant aggregation of signatures. This problem requires an infinite sequence of CFFs with increasing n, consequently increasing t, and potentially increasing d. In this context, we investigate monotone, nested, and embedding sequences of CFFs, and propose constructions using techniques from combinatorial design theory and finite fields. In constructing these families, we are concerned with their compression ratio.
The compression ratio of a sequence of CFFs measures how slowly the number of tests grows with respect to the number of elements to be tested, which affects the overall efficiency of the method. We provide constructions of CFF sequences with different compression ratios, which depend on relations among the CFF parameters in the sequence and the type of sequence. Some of these sequences achieve the best possible compression ratio, which meets the best known upper bound. Monotone, nested, and embedding sequences of CFFs can be used in any group testing problem that is dynamic in nature. We discuss various cryptographic applications that can benefit from these infinite sequences of CFFs.
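The d-CFF identification step described in this abstract can be sketched in a few lines. The construction below is a standard 1-CFF built from a Sperner family (all 2-subsets of a 4-set), standing in for the dissertation's own constructions; the names `sperner_cff` and `decode` are illustrative, not from the thesis.

```python
from itertools import combinations

def sperner_cff(t):
    """Columns are all floor(t/2)-subsets of a t-set: a 1-CFF with C(t, t//2)
    elements, since no such subset contains another (Sperner family)."""
    return [frozenset(c) for c in combinations(range(t), t // 2)]

def decode(columns, positive_tests):
    """Flag an item as defective iff every test containing it is positive.
    With a d-CFF and at most d defectives, this recovers them exactly."""
    return [j for j, col in enumerate(columns) if col <= positive_tests]

# Simulate fault-tolerant identification: 6 signatures, signature 4 invalid.
# Each of the 4 tests verifies the aggregate of the signatures whose column
# contains that test; a test is "positive" if it touches an invalid signature.
cols = sperner_cff(4)              # 1-CFF: 4 tests for 6 items
defectives = {4}
positives = frozenset(t for t in range(4)
                      if any(t in cols[j] for j in defectives))
print(decode(cols, positives))     # -> [4]
```

The decoding rule is exactly the cover-free property at work: the union of the positive tests caused by up to d defectives cannot cover the column of any non-defective item.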
57

免憑證代理簽名及其代理盲簽名擴張 / Certificateless proxy signature and its extension to blind signature

陳力瑋 Unknown Date (has links)
In a traditional public-key cryptosystem (PKC), a trusted third party (TTP) must issue certificates to guarantee the authenticity of users' public keys. Shamir later introduced identity-based signatures, which dispense with certificates, but the TTP still plays a powerful role in such systems, and they suffer from the key escrow problem. The certificateless signature scheme (CL-S) introduced in 2003 requires no certificates and also solves the key escrow problem. Building on the certificateless concept, this thesis develops a certificateless proxy signature scheme. A remarkable feature of the scheme is that it can be extended, in a simple way, into a certificateless proxy blind signature scheme.
58

Elektroninio parašo panaudojimo verslo įmonėse modelio sukūrimas / The model of using digital signature in business

Daugėlienė, Giedrė 03 July 2012 (has links)
The Master's thesis "The Model of Using Digital Signature in Business" analyses the problems of digital signature usage and the classification systems of digital signatures and digital records, and develops a model for applying digital signatures in business, validated by empirical research. The first part analyses the evolution of digital records and their classification systems. The second part presents a systematic analysis of digital signature usage in business enterprises: the concept and kinds of digital signature, and its application in business from the perspective of the communicating parties. The third part reports an empirical study in which the expert opinions of digital records professionals were examined to identify the problems of digital signature usage and ways of solving them. The final part presents the resulting model of digital signature usage in business, together with its expert evaluation.
59

Identifying an Ohio community signature fundraising quilt

Struck, Julie A. 01 July 2020 (has links)
No description available.
60

Methods for optimization of the signature-based radiation scanning approach for detection of nitrogen-rich explosives

Callender, Kennard January 1900 (has links)
Doctor of Philosophy / Department of Mechanical and Nuclear Engineering / William L. Dunn / The signature-based radiation scanning (SBRS) technique can be used to rapidly detect nitrogen-rich explosives at standoff distances. This technique uses a template-matching procedure that produces a figure-of-merit (FOM) whose value is used to distinguish between inert and explosive materials. The present study develops a tiered-filter implementation of the signature-based radiation scanning technique, which reduces the number of templates needed. This approach starts by calculating a normalized FOM between signatures from an unknown target and an explosive template through stages or tiers (nitrogen first, then oxygen, then carbon, and finally hydrogen). If the normalized FOM is greater than a specified cut-off value for any of the tiers, the target signatures are considered not to match that specific template and the process is repeated for the next explosive template until all of the relevant templates have been considered. If a target’s signatures match all the tiers of a single template, then the target is assumed to contain an explosive. The tiered filter approach uses eight elements to construct artificial explosive-templates that have the function of representing explosives cluttered with real materials. The feasibility of the artificial template approach to systematically build a library of templates that successfully differentiates explosive targets from inert ones in the presence of clutter and under different geometric configurations was explored. In total, 10 different geometric configurations were simulated and analyzed using the MCNP5 code. For each configuration, 51 different inert materials were used as inert samples and as clutter in front of the explosive cyclonite (RDX). The geometric configurations consisted of different explosive volumes, clutter thicknesses, and distances of the clutter from the neutron source. 
Additionally, an objective function was developed to optimize the parameters that maximize the sensitivity and specificity of the method.
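The tier-by-tier template matching described in this abstract can be sketched as follows. The signature vectors, the FOM formula (a mean squared relative deviation), and the cut-off value are placeholders for illustration, not the parameters or figure of merit actually used in the dissertation.

```python
TIERS = ("N", "O", "C", "H")   # nitrogen first, then oxygen, carbon, hydrogen

def normalized_fom(target_sig, template_sig):
    # Placeholder figure of merit: mean squared relative deviation
    # between a target signature and a template signature.
    return sum((a - b) ** 2 / max(b * b, 1e-12)
               for a, b in zip(target_sig, template_sig)) / len(template_sig)

def matches_template(target, template, cutoff=1.0):
    """A target fails a template as soon as any tier's FOM exceeds the cutoff
    (all() short-circuits, mirroring the tiered filter)."""
    return all(normalized_fom(target[t], template[t]) <= cutoff for t in TIERS)

def classify(target, template_library, cutoff=1.0):
    """Explosive iff the target matches every tier of at least one template."""
    return any(matches_template(target, tpl, cutoff)
               for tpl in template_library)
```

The early exit on the first failing tier is what saves work relative to always computing all four tier comparisons against every template.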
