321

An empirical comparison of extreme value modelling procedures for the estimation of high quantiles

Engberg, Alexander January 2016 (has links)
The peaks over threshold (POT) method provides an attractive framework for estimating the risk of extreme events such as severe storms or large insurance claims. However, the conventional POT procedure, where the threshold excesses are modelled by a generalized Pareto distribution, suffers from small tail samples and subjective threshold selection. In recent years, two alternative approaches have been proposed: mixture models that estimate the threshold, and a folding procedure that generates larger tail samples. In this paper the empirical performances of the conventional POT procedure, the folding procedure and a mixture model are compared by modelling data sets on fire insurance claims and hurricane damage costs. The results show that the folding procedure gives smaller standard errors of the parameter estimates and, in some cases, more stable quantile estimates than the conventional POT procedure. The mixture model estimates depend on the starting values in the numerical maximum likelihood estimation, and are therefore difficult to compare with those from the other procedures. The conclusion is that none of the procedures is uniformly better than the others, but there are situations where one method may be preferred.
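A minimal sketch of the conventional POT procedure the thesis takes as its baseline: fit a generalized Pareto distribution (GPD) to the excesses over a chosen threshold and invert the fit to estimate a high quantile. The synthetic claims data, the 95% threshold choice, and the 0.999 quantile level are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
claims = rng.pareto(2.5, size=5000) * 100.0    # synthetic "insurance claims"

u = np.quantile(claims, 0.95)                  # the subjective threshold choice
excesses = claims[claims > u] - u
n, n_u = len(claims), len(excesses)

# Fit the GPD to the excesses (location fixed at 0).
xi, _, sigma = genpareto.fit(excesses, floc=0)

# Standard POT quantile estimator (valid for xi != 0):
#   x_p = u + (sigma/xi) * (((n/n_u) * (1 - p)) ** (-xi) - 1)
p = 0.999
x_p = u + (sigma / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
print(f"threshold u = {u:.1f}, GPD shape = {xi:.3f}, {p:.1%} quantile = {x_p:.1f}")
```

The small-sample issue the abstract mentions is visible here: only the 5% of observations above u enter the GPD fit, which is exactly what the folding procedure tries to remedy by generating larger tail samples.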
322

Non-linear prediction in the presence of macroeconomic regimes

Okumu, Emmanuel Latim January 2016 (has links)
This paper studies the predictive performance and in-sample dynamics of three regime-switching models for Swedish macroeconomic time series: threshold autoregressive (TAR), Markov-switching autoregressive (MSM-AR), and smooth-transition autoregressive (STAR) models. We perform recursive out-of-sample forecasting to study the predictive performance of the models. We also assess whether the in-sample dynamics correspond to the forecast performance and find that they do not always do so. Furthermore, we explore whether these unrestricted models yield results about the regimes that are interpretable from a macroeconomic standpoint. We assess GDP growth, the unemployment rate, and government bond yields, and find evidence for Teräsvirta's claim that even when the data have non-linear dynamics, non-linear models might not improve on the forecast performance of linear models when the forecast window is linear.
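As a concrete illustration of the simplest of the three model classes, here is a minimal sketch of a two-regime TAR(1) model estimated by least squares with a grid search over the threshold. The simulated series, regime coefficients, and grid are illustrative assumptions, not the Swedish data or the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-regime TAR(1): the AR coefficient depends on y[t-1].
n = 500
y = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if y[t - 1] <= 0.0 else -0.4
    y[t] = phi * y[t - 1] + rng.normal()

def tar_ssr(y, c):
    """Sum of squared residuals of an AR(1) fit in each regime split at c."""
    y_lag, y_cur = y[:-1], y[1:]
    ssr = 0.0
    for mask in (y_lag <= c, y_lag > c):
        if mask.sum() < 10:                 # require a minimum regime size
            return np.inf
        phi_hat = (y_lag[mask] @ y_cur[mask]) / (y_lag[mask] @ y_lag[mask])
        ssr += ((y_cur[mask] - phi_hat * y_lag[mask]) ** 2).sum()
    return ssr

# Grid search over candidate thresholds (interior quantiles of y).
grid = np.quantile(y, np.linspace(0.15, 0.85, 71))
c_hat = min(grid, key=lambda c: tar_ssr(y, c))
print(f"estimated threshold: {c_hat:.3f}")   # should be near the true value 0
```

MSM-AR and STAR models replace the hard indicator split with a latent Markov state and a smooth logistic transition function, respectively, but the same fit-per-regime logic carries over.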
323

0-Hz-IF FSK/AM Sub-Carrier Demodulator on a 6U-VME-Card

Weitzman, Jonathan M. 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / Aerospace Report No. TOR-0059(6110-01)-3, section 1.3.3 outlines the design and performance requirements of SGLS (Space Ground Link Subsystem) services. GDP Space Systems has developed a single-card-slot FSK (Frequency Shift Keying)/AM (Amplitude Modulation) demodulator. An application of this service is the US Air Force Satellite Command and Ranging System. The SGLS signal is tri-tone FSK, amplitude modulated by a modified triangle wave at half the data rate. First-generation FSK/AM demodulators had poor noise performance because the signal tones were filtered and processed at IF frequencies (65, 76 and 95 kHz). Second-generation demodulators suffer from "threshold" due to non-linear devices in the signal path before the primary noise filtering. The GDP Space Systems demodulator uses a 0-Hz-IF topology and avoids both of these shortcomings. In this approach, the signal is first noncoherently down-converted to baseband by linear devices, then it is filtered and processed. This paper will discuss the GDP 0-Hz-IF FSK/AM (SGLS) demodulator.
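A minimal sketch of the zero-IF idea described above: mix the received tone with a complex local oscillator at the tone frequency so that filtering happens at baseband rather than at IF. Only the 65/76/95 kHz tone frequencies come from the abstract; the sample rate, filter order, and cutoff are illustrative assumptions, and the sketch uses a coherent mixer for simplicity rather than reproducing GDP's noncoherent design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1_000_000                     # sample rate, Hz (illustrative)
t = np.arange(0, 0.01, 1 / fs)
f_tone = 76_000                    # one of the three SGLS FSK tones

rx = np.cos(2 * np.pi * f_tone * t)          # received tone (noise omitted)
lo = np.exp(-2j * np.pi * f_tone * t)        # complex local oscillator

mixed = rx * lo                              # tone lands at 0 Hz, image at 2*f_tone
b, a = butter(4, 20_000 / (fs / 2))          # baseband low-pass removes the image
baseband = filtfilt(b, a, mixed.real) + 1j * filtfilt(b, a, mixed.imag)

# The baseband envelope carries the AM; a steady tone gives ~constant magnitude.
print(f"mean baseband magnitude: {np.abs(baseband).mean():.3f}")   # ~0.5
```

Because the mixer and the path before the low-pass filter are linear, the noise filtering happens after downconversion, which is the property that avoids the IF-processing and threshold problems of the first two demodulator generations.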
324

Efficient Variations of the Quality Threshold Clustering Algorithm

Loforte, Frank, Jr. 01 May 2015 (has links)
Clustering gene expression data such that the diameters of the clusters formed are no greater than a specified threshold prompted the development of the Quality Threshold Clustering (QTC) algorithm. It iteratively forms clusters of non-increasing size until all points are clustered; the largest cluster is always selected first. The QTC algorithm applies in many other domains that require a similar quality guarantee based on cluster diameter. The worst-case complexity of the original QTC algorithm is O(n⁵). Since practical applications often involve large datasets, researchers called for more efficient versions of the QTC algorithm. This dissertation aimed to develop and evaluate efficient variations of the QTC algorithm that guarantee a maximum cluster diameter while producing partitions that are similar to those produced by the original QTC algorithm. The QTC algorithm is expensive because it considers forming clusters around every item in the dataset. This dissertation addressed this issue by developing methods for selecting a small subset of promising items around which to form clusters. A second factor that adversely affects the efficiency of the QTC algorithm is the computational cost of updating cluster diameters as new items are added to clusters. This dissertation proposed and evaluated alternate methods to meet the cluster diameter constraint while not having to repeatedly update the cluster diameters. The variations of the QTC algorithm developed in this dissertation were evaluated on benchmark datasets using two measures: execution time and quality of solutions produced. Execution times were compared to the time taken to execute the most efficient published implementation of the QTC algorithm. Since the partitions produced by the proposed variations are not guaranteed to be identical to those produced by the original algorithm, the Jaccard measure of partition similarity was used to measure the quality of the solutions. The findings of this research were threefold. First, the Stochastic QTC alone was not computationally helpful, since in order to produce partitions that were acceptably similar to those found by the deterministic QTCs, the algorithm had to be seeded with a large number of centers (ntry ≈ n). Second, the preprocessed-data methods are desirable since they reduce the complexity of the search for candidate cluster points. Third, radius-based methods are promising since they produce partitions that are acceptably similar to those found by the deterministic QTCs in significantly less time.
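A minimal sketch of the original QTC procedure that the dissertation sets out to accelerate: grow a candidate cluster around every unclustered point, greedily adding the point that least increases the diameter, then commit the largest candidate and repeat. The data and diameter threshold are illustrative assumptions; the sketch deliberately keeps the expensive all-candidates loop and repeated diameter updates that the dissertation's variations avoid.

```python
import numpy as np

def qt_cluster(points, max_diameter):
    """Original-style QTC: quadratic distance table, candidate per seed."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    unclustered = set(range(len(points)))
    clusters = []
    while unclustered:
        best = []
        for seed in unclustered:              # candidate around every point
            cand = [seed]
            pool = unclustered - {seed}
            while pool:
                # point whose addition yields the smallest new diameter
                nxt = min(pool, key=lambda j: d[j, cand].max())
                if d[nxt, cand].max() > max_diameter:
                    break                     # diameter guarantee enforced here
                cand.append(nxt)
                pool.remove(nxt)
            if len(cand) > len(best):
                best = cand
        clusters.append(best)                 # commit the largest candidate
        unclustered -= set(best)
    return clusters

rng = np.random.default_rng(1)
pts = rng.normal(size=(60, 2))
print([len(c) for c in qt_cluster(pts, max_diameter=1.0)])
```

The two cost drivers named in the abstract are visible: the `for seed in unclustered` loop (clusters considered around every item) and the `d[j, cand].max()` evaluations (diameters recomputed as items are added).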
325

Measurement of Threshold Friction Velocities at Potential Dust Sources in Semi-arid Regions

King, Matthew A. January 2015 (has links)
The threshold friction velocities of potential dust sources in the US Southwest were measured in the field using a Portable Wind Tunnel, which is based on the Desert Research Institute's Portable In-Situ Wind Erosion Laboratory (PI-SWERL). A mix of disturbed and undisturbed surfaces was included in this study. It was found that disturbed surfaces, such as those at the Iron King Mine tailings site, which is part of the EPA's Superfund program and contains surface concentrations of arsenic and lead reaching as high as 0.5% (w/w), had lower threshold friction velocities (0.32 m s⁻¹ to 0.40 m s⁻¹) than undisturbed surfaces (0.48 m s⁻¹ to 0.61 m s⁻¹). Surface characteristics such as particle size distribution affected the threshold friction velocity (finer particle size distributions resulted in lower threshold friction velocities). Overall, the threshold friction velocities of disturbed surfaces were within the range of natural wind conditions, indicating that surfaces disturbed by human activity are more prone to causing windblown dust.
326

Applications of photolithographic techniques: materials modeling for double-exposure lithography and development of shape-encoded biosensor arrays

Lee, Shao-Chien 19 October 2009 (has links)
Double-exposure lithography has shown promise as a potential resolution enhancement technique that is attractive because it is much cheaper than double-patterning lithography and it can be deployed on existing imaging tools. However, this technology is not possible without the development of new materials with a nonlinear response to exposure dose. Several materials have been proposed to implement a nonlinear response to exposure, including reversible contrast enhancement layers (rCELs), two-photon materials, intermediate state two-photon (ISTP) materials, and optical threshold layers (OTLs). The performance of these materials in double-exposure applications was investigated through computer simulation using a custom simulator. The results from the feasibility studies revealed that the ISTP and OTL types of materials showed much more promise than the rCEL and two-photon types of materials. Calculations show that two-photon materials will not be feasible unless the achievable laser peak power in exposure tools can be significantly increased. Although rCEL materials demonstrated nonlinear behavior in double-exposure mode, only marginal image quality and process window improvements were observed. Using the results from the simulation work described herein, materials development work is currently ongoing to enable potential ISTP and OTL materials for manufacturing.

A new biochip platform named "Mesoscale Unaddressed Functionalized Features INdexed by Shape" (MUFFINS) was developed in the Willson Research Group at the University of Texas at Austin as a potential method to achieve a new low-cost biosensor system. The platform uses poly(ethylene glycol) hydrogels with bioprobes covalently cross-linked into the matrix for detection. Each sensor is shape-encoded with a unique pattern such that the information of the sensor is associated with the pattern and not its position. Large quantities of individual sensors can be produced separately and then self-assembled to form random arrays. Detection occurs through hybridization of the probes with fluorescently labeled targets. The key designs of the system include parallel batch fabrication using photolithography and self-assembly, increased information density using multiplexing, and enhanced shape-encoding with automated pattern recognition. The development of two aspects of the platform (self-assembly mechanics and the pattern recognition algorithm) and a demonstration of all the key design elements using a single array are described herein.
327

Inflation and economic growth in China: an empirical analysis

吳銘家, Wu, Ming Jia Unknown Date (has links)
This paper investigates a crucial but still open issue: the nonlinear effect of inflation on economic growth in China. We adopt an official provincial data set of gross provincial product, the consumer price index, and other explanatory variables from 1986 to 2006, and use panel-data regression for the analysis. The main finding is that the inflation threshold effect is highly significant and robust in China. Above the 2.5% inflation threshold, every 1% increase in the inflation rate impedes economic growth by 0.61%; below the threshold, the same increase stimulates growth by 0.53%. We suggest that China should keep inflation moderate to support long-run growth.
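A minimal sketch of the kind of threshold regression behind such a finding: growth responds to inflation with one slope below a threshold and another above it, and the threshold is chosen to minimize the sum of squared residuals. The simulated data, variable names, and grid-search estimator are illustrative assumptions, not the paper's provincial panel specification.

```python
import numpy as np

rng = np.random.default_rng(7)
inflation = rng.uniform(0.0, 10.0, size=400)
true_tau = 2.5
growth = (0.53 * np.minimum(inflation, true_tau)          # positive slope below tau
          - 0.61 * np.maximum(inflation - true_tau, 0.0)  # negative slope above tau
          + rng.normal(scale=0.5, size=400) + 5.0)

def ssr(tau):
    """SSR of a continuous piecewise-linear fit with a kink at tau."""
    X = np.column_stack([np.ones_like(inflation),
                         np.minimum(inflation, tau),
                         np.maximum(inflation - tau, 0.0)])
    beta = np.linalg.lstsq(X, growth, rcond=None)[0]
    return ((growth - X @ beta) ** 2).sum()

grid = np.linspace(0.5, 9.5, 181)
tau_hat = grid[np.argmin([ssr(t) for t in grid])]
print(f"estimated inflation threshold: {tau_hat:.2f}%")   # close to 2.5
```

A panel version, as in the paper, adds province fixed effects and the other explanatory variables to the design matrix, but the threshold is still found by the same grid search over candidate values.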
328

Efficient encryption and digital signature schemes

Valkaitis, Mindaugas 04 July 2014 (has links)
This submission, "Efficient encryption and digital signature schemes", consists of three parts. I. Part I gives a theoretical analysis of the popular public-key cryptosystems RSA (Rivest, Shamir, Adleman), whose security rests on the large-integer factorization problem, and ElGamal, whose security rests on the discrete logarithm problem, together with the new cryptographic primitive termed "signcryption", proposed by Y. Zheng, which simultaneously fulfills both the functions of digital signature and public-key encryption in a logically single step, at a cost significantly smaller than that required by "signature followed by encryption" using a composition of the popular public-key cryptosystems. For completeness, the analysis describes supplemental algorithms and functions such as the AES block cipher, the SHA hash function family, and the HMAC keyed hash function. II. Part II analyzes the results of a practical implementation written in the Python programming language. Effectiveness is described by two factors: 1. Total computation time of the signing, encryption, decryption and verification operations; 2. Communication overhead, i.e. the increase in signed and encrypted message length compared to the original plaintext. III. Part III proposes two effective signcryption implementation algorithms: secret sharing without threshold, and (k, n) threshold schemes. The results of the analysis show signcryption to be a secure and extremely effective signature and encryption cryptosystem. It has very low... [to full text]
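As a concrete illustration of one of the classical schemes reviewed above, here is a toy ElGamal encryption round-trip. The 64-bit prime and the base g are illustrative assumptions, and the encryption is textbook (unpadded) and far too small for real use; the point is the mechanics, and the ciphertext expansion that the thesis measures as communication overhead.

```python
import secrets

p = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a prime (illustrative; real keys are far larger)
g = 5                    # assumed base for the sketch, not a vetted generator

x = secrets.randbelow(p - 2) + 1           # private key
h = pow(g, x, p)                           # public key: h = g^x mod p

def encrypt(m, h):
    """Textbook ElGamal: ciphertext is the pair (g^k, m * h^k)."""
    k = secrets.randbelow(p - 2) + 1       # fresh ephemeral key per message
    return pow(g, k, p), (m * pow(h, k, p)) % p

def decrypt(c1, c2, x):
    """Recover m = c2 / c1^x, using Fermat's little theorem for the inverse."""
    return (c2 * pow(c1, p - 1 - x, p)) % p

m = 123456789
c1, c2 = encrypt(m, h)
assert decrypt(c1, c2, x) == m
print("round-trip OK; ciphertext is two group elements, i.e. ~2x the plaintext size")
```

Signcryption's appeal, as the thesis describes, is that it achieves the effect of signing plus encrypting in one pass, cutting both the computation time and this per-message overhead relative to composing two such schemes.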
329

Multidisciplined individuals : defining the genre

Rogers, Jacqueline Rhoda January 2010 (has links)
Much of the literature is predicated upon the assumption that learning occurring inside the workplace is related to developing expertise associated with the tasks for which the individual is employed and in which they have a background. This research investigates those individuals who acquire expertise in other disciplines, and how the application of that additional expertise changes and enhances the individual and the organisation. By combining perspectives across disciplinary boundaries and developing multidisciplinary expertise, these individuals demonstrate better methods of achieving business objectives, leading to faster, more imaginative solutions, more frequently, and with significantly less effort. The literature review commenced with defining "multidisciplinary" before addressing communities that cluster around disciplines, such as professional societies and Communities of Practice. Aspects of the organisational, team and "learning by participation" (Ashton, 2004) literature were also considered. The study took an inductive approach, using an ethnographic perspective on data collection and analysis, to achieve its aim of determining the existence of multidisciplined individuals and how they acquire additional disciplines. The study used interviewing as its primary method, yielding both qualitative and quantitative data from a cross-sectional sample inside a medium-sized oil and gas consultancy offering technical and management advice. The disciplines inside the case organisation were mapped to ascertain the boundaries where the richest learning opportunities lie. Measuring learning across the disciplines confirmed the existence of multidisciplined individuals, with evidence pointing towards the integrated multidisciplined team being the ideal learning environment. The study was able to use Threshold Concepts (Meyer and Land, 2003) to demonstrate the multidisciplined individual development process. Moreover, having examined the social-interaction learning processes, the potential negative impacts of Communities of Practice on encouraging this type of multidisciplinary approach were highlighted. The study concluded that developing multidisciplined individuals was worthwhile, but required organisations to be willing to provide the appropriate platform for such learning by more adventurous individuals who held the underlying abilities required by the additional discipline(s).
330

Financial Stress, Sovereign Debt and Economic Activity in Industrialized Countries: Evidence from Dynamic Threshold Regressions

Proaño, Christian R., Schoder, Christian, Semmler, Willi 02 1900 (has links) (PDF)
We analyze how the impact of a change in the sovereign debt-to-GDP ratio on economic growth depends on the level of debt, the stress level on the financial market and the membership in a monetary union. A dynamic growth model is put forward demonstrating that debt affects macroeconomic activity in a non-linear manner due to amplifications from the financial sector. Employing dynamic country-specific and dynamic panel threshold regression methods, we study the non-linear relation between the growth rate and the debt-to-GDP ratio using quarterly data for sixteen industrialized countries for the period 1981Q1-2013Q2. We find that the debt-to-GDP ratio has impaired economic growth primarily during times of high financial stress and only for countries of the European Monetary Union and not for the stand-alone countries in our sample. A high debt-to-GDP ratio by itself does not seem to necessarily negatively affect growth if financial markets are calm. (authors' abstract) / Series: Department of Economics Working Paper Series
