  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Study and Implementation of Elliptic Curve Cryptosystem

Jen, Li-hsiang 24 August 2005 (has links)
Elliptic curve cryptosystems were proposed in 1985 by Victor Miller and Neal Koblitz independently. Since the elliptic curve discrete logarithm problem is harder to solve than the discrete logarithm problem in finite fields, it is believed that the key length of elliptic curve cryptosystems can be shorter than that of RSA with the same security strength. The most important task in using an elliptic curve cryptosystem is constructing a group from a proper elliptic curve, and the major work in constructing such a curve is counting the points on elliptic curves over finite fields. In 1985, Schoof published a deterministic polynomial-time algorithm for computing the number of points on elliptic curves over finite fields. We follow IEEE P1363 to implement pseudo-random elliptic curves.
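
A minimal brute-force sketch in Python, assuming a small prime p and arbitrary curve coefficients a and b (not the thesis implementation and not an IEEE P1363 curve), makes the quantity concrete: it counts the points of y^2 = x^3 + a*x + b over F_p in O(p) time, the same group order that Schoof's algorithm computes in time polynomial in log p.

    def count_points(a: int, b: int, p: int) -> int:
        # Return #E(F_p) for y^2 = x^3 + a*x + b, including the point at infinity.
        # Tabulate how many square roots each residue has modulo p.
        root_counts = {}
        for y in range(p):
            sq = y * y % p
            root_counts[sq] = root_counts.get(sq, 0) + 1
        count = 1  # the point at infinity
        for x in range(p):
            rhs = (x * x * x + a * x + b) % p
            count += root_counts.get(rhs, 0)  # number of y with y^2 = rhs (mod p)
        return count

    # Example with arbitrary small parameters (illustration only).
    print(count_points(a=2, b=3, p=97))

By Hasse's theorem the printed count lies within 2*sqrt(p) of p + 1; cryptographic curves use primes of hundreds of bits, which is why a polynomial-time point-counting algorithm such as Schoof's is needed.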
12

Regression methods for areas and partial areas under the receiver-operating characteristic curve /

Dodd, Lori Elizabeth, January 2001 (has links)
Thesis (Ph. D.)--University of Washington, 2001. / Vita. Includes bibliographical references (p. 231-238).
13

Point-based mathematics and graphics for CADCAM

Cook, Peter Robert January 2000 (has links)
No description available.
14

Learning curves and their applicability to unit training levels in operational testing

Brokenburr, Jesse Lee 08 1900 (has links)
No description available.
15

The Beveridge curve and institutional arrangements

Adema, Willem January 1993 (has links)
The main objective of our analysis is to investigate the causes of shifts of the Beveridge curve in Great Britain, the Netherlands and Sweden. In chapter 2, we will outline the model which is the basis for our analysis. The cornerstone of our theoretical framework regarding the long-run relationship between unemployment and vacancies, otherwise known as the Beveridge curve, is the matching process. We will describe how certain features, such as structural mismatch, the relative attractiveness of benefit provisions, and changes in the search intensity and search effectiveness of the unemployed, could theoretically affect the Beveridge curve. In order to analyse a possible shift of the Beveridge curve, time series analysis will be used. In chapter 3, we describe the patterns of the relevant data series and the significance of the long-term unemployment problem in Great Britain and the Netherlands. In the following chapter we describe the characteristics of the disability arrangements in the three countries, in order to explain how these arrangements have affected the unemployment patterns in one of our sample countries; the focal point is the existence of a hidden unemployment component in the disability stock. In order to estimate the Beveridge curve for each country, we will use the instrumental variables technique. In chapter 5, after first having tested the suitability of our econometric approach to the data series in the context of the theory of cointegration, we will present and discuss several model specifications of the Beveridge curve. We will also test the sensitivity of our main results to variations in data and estimation method, and present models of the British and Dutch long-term unemployment patterns. In chapter 6, we will discuss the most relevant results and compare the British, Dutch and Swedish labour market experiences. Conclusions are presented in the final chapter.
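
As a rough illustration of the estimation step, the Python sketch below runs a two-stage least-squares (instrumental variables) regression of a log-linear Beveridge curve with a shift dummy on synthetic data; the series, the lagged-vacancy instrument and the break date are placeholders rather than the thesis's data or specification, and the naive second-stage standard errors would not be valid.

    import numpy as np

    # Illustrative 2SLS estimate of log(u_t) = c + b*log(v_t) + d*shift_t + e_t,
    # treating log(v_t) as endogenous and instrumenting it with its own lag.
    rng = np.random.default_rng(0)
    T = 120
    log_v = np.cumsum(rng.normal(0.0, 0.05, T)) + 1.0        # log vacancy rate (synthetic)
    shift = (np.arange(T) >= 60).astype(float)               # outward-shift dummy (placeholder break date)
    log_u = 2.0 - 0.8 * log_v + 0.3 * shift + rng.normal(0.0, 0.05, T)

    y = log_u[1:]
    X = np.column_stack([np.ones(T - 1), log_v[1:], shift[1:]])   # regressors
    Z = np.column_stack([np.ones(T - 1), log_v[:-1], shift[1:]])  # instruments (lagged vacancies)

    # Stage 1: project the regressors onto the instruments.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Stage 2: regress unemployment on the fitted regressors.
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    print(dict(zip(["const", "log_v", "shift"], np.round(beta, 3))))

A significant coefficient on the shift dummy is what would indicate an outward shift of the curve at the assumed break date.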
16

Cost and accuracy comparisons in medical testing using sequential testing strategies

Ahmed, Anwar. January 1900 (has links)
Thesis (Ph.D.)--Virginia Commonwealth University, 2010. / Prepared for: Dept. of Biostatistics. Title from resource description page. Includes bibliographical references.
17

Dynamic problems in computational geometry

Gowda, Ihor George January 1981 (has links)
Computational geometry is the study of algorithms for manipulating sets of points, lines, polygons, planes and other geometric objects. For many problems in this realm, the sets considered are static and the data structures representing them do not permit efficient insertion and deletion of objects (e.g. points). Dynamic problems, in which the set and the geometric data structure change over time, are also of interest. The topic of this thesis is the presentation of fast algorithms for fully dynamic maintenance of some common geometric structures. The following configurations are examined: planar nearest-point and farthest-point Voronoi diagrams, convex hulls (in two and three dimensions), common intersection of halfspaces (2-D and 3-D), and contour of maximal vectors (2-D and 3-D). The principal techniques exploited are fast merging of substructures, and the use of extra storage. Dynamic geometric search structures based upon the configurations are also presented. / Science, Faculty of / Computer Science, Department of / Graduate
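
For contrast with the fast dynamic algorithms the thesis presents, a naive baseline, sketched below in Python, handles insertions and deletions simply by recomputing the 2-D convex hull from scratch with Andrew's monotone chain, at O(n log n) per update; the thesis's structures based on fast merging of substructures and extra storage are designed precisely to beat this kind of full recomputation.

    def cross(o, a, b):
        # Cross product of vectors o->a and o->b; positive for a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def convex_hull(points):
        # Andrew's monotone chain: returns hull vertices in counter-clockwise order.
        pts = sorted(set(points))
        if len(pts) <= 2:
            return pts
        lower, upper = [], []
        for p in pts:
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]

    class NaiveDynamicHull:
        # "Dynamic" only in the trivial sense: every update rebuilds the hull.
        def __init__(self):
            self.points = set()
            self.hull = []

        def insert(self, p):
            self.points.add(p)
            self.hull = convex_hull(self.points)

        def delete(self, p):
            self.points.discard(p)
            self.hull = convex_hull(self.points)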
18

IS THE PHILLIPS CURVE A UNICORN?

Unknown Date (has links)
The new Keynesian wage Phillips curve (NKWPC) is derived from the standard new Keynesian Phillips curve (NKPC), which has been examined and verified by many economists. The NKWPC model uses the structural wage equation to capture the significant inverse relationship between wage inflation and the unemployment rate in the US economy, under the key assumption of a constant natural rate of unemployment. This study examines the NKWPC model using the generalized method of moments (GMM) and the generalized autoregressive conditional heteroskedasticity-in-mean (GARCH-M) model to confirm the critical inverse relationship of the Phillips curve. In particular, it tests the NKWPC separately with the official unemployment rate and with Komlos's (2019) real unemployment rate. The estimated results support the NKWPC, re-confirming a significant negative relationship between wage inflation and unemployment under both econometric techniques. Moreover, the results do not meaningfully differ between the official unemployment rate and the real unemployment rate. The Phillips curve is not just a unicorn, or rarity, in the economic world; it is a substantial indicator and still holds merit. This study lends further support to the importance of the Phillips curve. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2021. / FAU Electronic Theses and Dissertations Collection
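
As a simplified, reduced-form illustration of the relationship the study estimates (not its GMM or GARCH-M procedure, and on synthetic rather than US data with an assumed true slope of -0.5), the Python sketch below regresses wage inflation on the unemployment rate by ordinary least squares.

    import numpy as np

    # Reduced-form wage Phillips curve on synthetic data: a negative slope
    # estimate reflects the inverse wage-inflation/unemployment relationship.
    rng = np.random.default_rng(1)
    n = 200
    unemployment = rng.uniform(3.0, 10.0, n)                        # percent (synthetic)
    wage_inflation = 4.0 - 0.5 * unemployment + rng.normal(0.0, 0.4, n)

    X = np.column_stack([np.ones(n), unemployment])
    beta, *_ = np.linalg.lstsq(X, wage_inflation, rcond=None)
    print(f"intercept = {beta[0]:.2f}, slope = {beta[1]:.2f}")      # slope close to -0.5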
19

Creating a virtual standard curve for quantitative PCR

Lively, Brianna 10 March 2022 (has links)
The use of standards in quantitative PCR (qPCR) is essential, especially in forensic cases, to determine the concentration of DNA in an unknown sample. The standard curve for Quantifiler Trio is commonly made up of a ten-fold serial dilution of the known DNA standard in the qPCR kit. Because the concentration of the standard and the dilution series are known, the cycle threshold values of the standards can be compared to the cycle threshold values of the unknown samples to determine their concentrations. The Quantifiler Trio kit provides the concentration of total human DNA, including both small autosomal and large autosomal amounts, which are used to calculate a degradation ratio, and also determines the concentration of male DNA in the sample. Each target has its own standard curve for determining DNA concentration. However, serial dilutions cause variance between runs, even when using the same samples, due to small pipetting differences or errors, which can make sample reruns and data verification difficult. Creating a virtual curve from data across multiple serial dilutions would minimize or eliminate the variance between runs. This is accomplished by taking the averages of the slopes and y-intercepts of the standards to create the virtual curve.
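
A minimal numpy sketch of that averaging step, using made-up Ct values and concentrations rather than Quantifiler Trio data, fits a Ct versus log10(concentration) line for each run, averages the slopes and y-intercepts, and then quantifies an unknown from the averaged (virtual) curve.

    import numpy as np

    # Per-run standard curves: Ct = slope * log10(concentration) + intercept.
    standard_conc = np.array([50.0, 5.0, 0.5, 0.05, 0.005])     # ng/uL, ten-fold dilution (placeholder values)
    runs_ct = [
        np.array([23.1, 26.5, 29.9, 33.2, 36.6]),               # run 1 Ct values (made up)
        np.array([23.3, 26.6, 30.1, 33.4, 36.8]),               # run 2
        np.array([22.9, 26.4, 29.8, 33.1, 36.5]),               # run 3
    ]

    log_conc = np.log10(standard_conc)
    fits = [np.polyfit(log_conc, ct, 1) for ct in runs_ct]      # (slope, intercept) per run
    slope = np.mean([f[0] for f in fits])
    intercept = np.mean([f[1] for f in fits])

    def quantify(ct_unknown: float) -> float:
        # Invert the virtual curve to estimate concentration (ng/uL) from a Ct value.
        return 10 ** ((ct_unknown - intercept) / slope)

    print(f"virtual curve: Ct = {slope:.2f} * log10(conc) + {intercept:.2f}")
    print(f"unknown at Ct 31.0 -> {quantify(31.0):.3f} ng/uL")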
20

The Elliptic Curve Group Over Finite Fields: Applications in Cryptography

Lester, Jeremy W. 28 September 2012 (has links)
No description available.
