  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Runtime Adaptive Scrubbing in Fault-Tolerant Network-on-Chips (NoC) Architectures

Boraten, Travis H. 09 June 2014 (has links)
No description available.
92

Essays in applied demand and production analysis

Zereyesus, Yacob Abrehe January 1900 (has links)
Doctor of Philosophy / Department of Agricultural Economics / Vincent R. Amanor-Boadu / This dissertation is composed of two essays in applied microeconomics. Using farm-level data, the first essay applied nonparametric methods to test whether individual farms' production choices adhere to the profit maximization objective. Results indicate that none of the farms consistently satisfies the joint hypothesis of profit maximization. The study accounted for the uncertainty prevalent in agricultural production by systematically modeling the optimization behavior of farms. Departures of individual farms' observed data from the profit maximization objective were attributed more to stochastic influences arising from output production decisions than to input use decisions. Results also support the existence of technological progress for Kansas farms during the study period. At an alpha level of 5%, when both input and output quantities were assumed stochastic, only 5.3% of the farms violated the joint hypothesis of profit maximization with a standard error exceeding 10%. When only input quantities were considered stochastic, a total of 71.73% and 2.09% of the farms required minimum standard errors greater than 10% and 20%, respectively, for the joint profit maximization hypothesis to hold. When only output quantity measurements were assumed stochastic, a total of 80.10% and 18.84% of the farms required minimum standard errors greater than 10% and 20%, respectively, for the profit maximization hypothesis to hold. The second essay examines the demand for alcoholic beverages (beer, wine, and distilled spirits) in the U.S. using time series data from 1979 to 2006. The estimation is done using an error correction form of the Almost Ideal Demand System. Results indicate a significant difference between short-run and long-run elasticity estimates. The paper addresses the exogeneity of the log of prices and the log of real expenditures. For the beer and wine equations, the hypothesis of joint exogeneity of the price index and real expenditure cannot be rejected at any of the conventional levels of significance. For the spirits equation, the tests strongly reject the simultaneous exogeneity of the price index and real expenditure. When tested independently, the price index appears to be endogenous whereas real expenditure appears to be exogenous. Based on these results, real expenditure was treated as an exogenous variable and the price index for spirits as an endogenous variable.
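As a rough illustration of the error-correction Almost Ideal Demand System approach mentioned in the second essay, the sketch below estimates a single share equation in two Engle-Granger steps. The data file, column names, and Stone price index construction are assumptions for illustration, not the dissertation's actual data or specification.

```python
# Hedged sketch: an Engle-Granger style error-correction estimate of one
# Almost Ideal Demand System (AIDS) share equation, e.g. for beer.
# The file, column names, and Stone price index are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("alcohol_demand.csv")          # hypothetical annual data, 1979-2006
shares = df[["w_beer", "w_wine", "w_spirits"]]  # budget shares
logp = np.log(df[["p_beer", "p_wine", "p_spirits"]])
stone_p = (shares.values * logp.values).sum(axis=1)   # Stone price index ln P*
log_real_x = np.log(df["expenditure"]) - stone_p      # ln(x / P*)

# 1) Long-run (static) share equation: w_beer on log prices and real expenditure
X_lr = sm.add_constant(np.column_stack([logp.values, log_real_x]))
lr = sm.OLS(df["w_beer"], X_lr).fit()
ecm_term = pd.Series(np.asarray(lr.resid), index=df.index).shift(1)  # lagged equilibrium error

# 2) Short-run dynamics with the error-correction term
dw = df["w_beer"].diff()
dX = np.column_stack([logp.diff().values, log_real_x.diff()])
X_sr = sm.add_constant(np.column_stack([dX, ecm_term]))
sr = sm.OLS(dw, X_sr, missing="drop").fit()
print(sr.summary())   # the coefficient on the last column is the speed of adjustment
```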
93

625 MBIT/SEC BIT ERROR LOCATION ANALYSIS FOR INSTRUMENTATION RECORDING APPLICATIONS

Waschura, Thomas E. 10 1900 (has links)
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / This paper describes techniques for error location analysis used in the design and testing of high-speed instrumentation data recording and communications applications. It focuses on the differences between common bit error rate testing and new error location analysis. Examples of techniques presented include separating bit and burst error components, studying probability of burst occurrences, looking at error free interval occurrence rates as well as auto-correlating error position. Each technique contributes to a better understanding of the underlying error phenomenon and enables higher-quality digital recording and communication. Specific applications in error correction coding emulation, magnetic media error mapping and systematic error interference are discussed.
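The error-location statistics the paper describes (error-free intervals, burst grouping, and error-position autocorrelation) can be sketched as follows; the synthetic error stream and the burst-gap threshold are illustrative assumptions rather than the paper's measurement setup.

```python
# Hedged sketch of common error-location statistics: error-free interval
# lengths, simple burst grouping, and autocorrelation of error positions.
import numpy as np

rng = np.random.default_rng(0)
errors = rng.random(1_000_000) < 1e-4          # synthetic error indicator per bit
positions = np.flatnonzero(errors)             # bit indices where errors occurred

# Error-free interval (EFI) statistics: gaps between consecutive error bits
efi = np.diff(positions) - 1
print("mean EFI:", efi.mean(), "min EFI:", efi.min())

# Separate isolated bit errors from bursts: errors closer than `burst_gap`
# bits are grouped into the same burst (illustrative definition).
burst_gap = 10
breaks = np.flatnonzero(np.diff(positions) > burst_gap)
bursts = np.split(positions, breaks + 1)
burst_lengths = np.array([b[-1] - b[0] + 1 for b in bursts])
print("bursts:", len(bursts), "mean burst length:", burst_lengths.mean())

# Autocorrelation of the error indicator over short lags, to expose
# periodic or systematic error interference.
lags = np.arange(1, 64)
x = errors.astype(float) - errors.mean()
acf = np.array([np.dot(x[:-k], x[k:]) / np.dot(x, x) for k in lags])
print("largest autocorrelation peak at lag", lags[np.argmax(acf)])
```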
94

AN INEXPENSIVE DATA ACQUISITION SYSTEM FOR MEASURING TELEMETRY SIGNALS ON TEST RANGES TO ESTIMATE CHANNEL CHARACTERISTICS

Horne, Lyman D., Dye, Ricky G. 11 1900 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / In an effort to determine a more accurate characterization of the multipath fading effects on telemetry signals, the BYU telemetering group is implementing an inexpensive data acquisition system to measure these effects. It is designed to measure important signals in a diversity combining system. The received RF envelope, AGC signal, and the weighting signal for each beam, as well as the IRIG B time stamp will be sampled and stored. This system is based on an 80x86 platform for simplicity, compactness, and ease of use. The design is robust and portable to accommodate measurements in a variety of locations including aircraft, ground, and mobile environments.
95

Single manager hedge funds - aspects of classification and diversification

Bohlandt, Florian Martin 12 1900 (has links)
Thesis (PhD)--Stellenbosch University, 2013. / Hedge fund researchers face a persistent problem in the form of inconsistent and diverse style classifications within and across database providers. For this paper, single-manager hedge funds from the Hedge Fund Research (HFR) and Hedgefund.Net (HFN) databases were classified on the basis of a common factor, extracted using the factor axis methodology. It was assumed that the returns of all sample hedge funds are attributable to a common factor that is shared across hedge funds within one classification, and a specific factor that is unique to a particular hedge fund. In contrast to earlier research based on principal component analysis, the factor axis approach seeks to determine how much of the covariance in the dataset is due to common factors (communality). Factor axis largely ignores the diagonal elements of the covariance matrix, and orthogonal factor rotation maximises the covariance between hedge fund return series. In an iterative framework, common factors were extracted until all return series were described by one common and one specific factor. Prior to factor extraction, the series were tested for autoregressive moving-average processes, and the residuals of such models were used in further analysis to improve upon squared correlations as initial factor estimates. The methodology was applied to 120 ten-year rolling estimation windows in the July 1990 to June 2010 timeframe. The results indicate that the number of distinct style classifications is reduced in comparison to the arbitrary self-selected classifications of the databases. Single-manager hedge funds were grouped into portfolios on the basis of the common factor they share. In contrast to other classification methodologies, these common factor portfolios (CFPs) assume that some unspecified individual component of the hedge fund constituents' returns is diversified away and that single-manager hedge funds should be classified according to their common return components. From the CFPs of single-manager hedge funds, pure style indices were created to be entered into a multivariate autoregressive framework. For each style index, a Vector Error Correction Model (VECM) was estimated to determine the short-run as well as the co-integrating relationship of the hedge fund series with the index level series of stock, bond and commodity proxies. It was postulated that a) in a well-diversified portfolio, the current level of the hedge fund index is independent of the lagged observations from the other asset indices; and b) if the assumptions of the Efficient Market Hypothesis (EMH) hold, the predictive power of the model is expected to be low. The analysis was conducted for the July 2000 to June 2010 period. Impulse response tests and variance decomposition revealed that changes in hedge fund index levels are partially induced by changes in the stock, bond and currency markets. Investors are therefore cautioned not to overemphasise the diversification benefits of hedge fund investments. Commodity trading advisors (CTAs) / managed futures, on the other hand, deliver diversification benefits when integrated with an existing portfolio. The results indicated that single-manager hedge funds can be reliably classified using the principal factor axis methodology. Continuously re-balanced pure style index representations of these classifications could be used in further analysis. Extensive multivariate analysis revealed that CTAs and macro hedge funds offer superior diversification benefits in the context of existing portfolios. The empirical results are of interest not only to academic researchers, but also to practitioners seeking to replicate the methodologies presented.
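The iterated principal-axis (common factor) extraction underlying the classification can be sketched as below; the synthetic return data, single-factor choice, and convergence settings are assumptions for illustration only.

```python
# Hedged sketch of iterated principal-axis factoring (common factor extraction)
# on a matrix of hedge fund return series. Data and settings are illustrative.
import numpy as np

def principal_axis_factors(returns, n_factors=1, iters=100, tol=1e-6):
    """returns: (T, N) array of fund return series; extracts common factor loadings."""
    R = np.corrcoef(returns, rowvar=False)          # (N, N) correlation matrix
    inv_R = np.linalg.inv(R)
    h2 = 1.0 - 1.0 / np.diag(inv_R)                 # initial communalities (SMCs)
    for _ in range(iters):
        Rr = R.copy()
        np.fill_diagonal(Rr, h2)                    # reduced correlation matrix
        vals, vecs = np.linalg.eigh(Rr)
        idx = np.argsort(vals)[::-1][:n_factors]    # largest eigenvalues
        loadings = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))
        h2_new = (loadings ** 2).sum(axis=1)        # updated communalities
        if np.max(np.abs(h2_new - h2)) < tol:
            h2 = h2_new
            break
        h2 = h2_new
    return loadings, h2

# Usage with synthetic data standing in for fund returns (120 months, 8 funds)
rng = np.random.default_rng(1)
common = rng.normal(size=(120, 1))                  # one shared driver
returns = common @ rng.normal(size=(1, 8)) + 0.5 * rng.normal(size=(120, 8))
loadings, communalities = principal_axis_factors(returns)
print(loadings.ravel(), communalities)
```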
96

Turbo Equalization for OFDM over the Doubly-Spread Channel using Nonlinear Programming

Iltis, Ronald A. 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / OFDM has become the preferred modulation format for a wide range of wireless networks including 802.11g, 802.16e (WiMAX) and 4G LTE. For multipath channels which are time-invariant during an OFDM symbol duration, near-optimal demodulation is achieved using the FFT followed by scalar equalization. However, demodulating OFDM on the doubly-spread channel remains a challenging problem, as time-variations within a symbol generate intercarrier interference. Furthermore, demodulation and channel estimation must be effectively combined with decoding of the LDPC code in the 4G-type system considered here. This paper presents a new Turbo Equalization (TEQ) decoder, detector and channel estimator for OFDM on the doubly-spread channel based on nonlinear programming. We combine the Penalty Gradient Projection TEQ with a MMSE-type channel estimator (PGP-TEQ) that is shown to yield a convergent algorithm. Simulation results are presented comparing conventional MMSE TEQ using the Sum Product Algorithm (MMSE-SPA-TEQ) with the new PGP-TEQ for doubly-spread channels.
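For context, the baseline the abstract refers to, an FFT followed by per-subcarrier scalar equalization on a channel that is static within one OFDM symbol, can be sketched as follows; the parameters and channel taps are toy assumptions, and a doubly-spread channel would introduce intercarrier interference that this one-tap equalizer cannot remove.

```python
# Hedged sketch of baseline OFDM demodulation on a time-invariant channel:
# IFFT + cyclic prefix at the transmitter, FFT + one-tap (zero-forcing)
# equalization at the receiver. Parameters and the channel are toy assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, cp = 64, 16                                   # subcarriers, cyclic prefix length
bits = rng.integers(0, 2, size=(2, N))
symbols = (2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)   # QPSK on each subcarrier

tx_time = np.fft.ifft(symbols) * np.sqrt(N)
tx = np.concatenate([tx_time[-cp:], tx_time])    # add cyclic prefix

h = np.array([0.8, 0.5, 0.3j])                   # static multipath channel taps
rx = np.convolve(tx, h)[: N + cp]
rx += 0.05 * (rng.normal(size=N + cp) + 1j * rng.normal(size=N + cp))

rx_freq = np.fft.fft(rx[cp : cp + N]) / np.sqrt(N)
H = np.fft.fft(h, N)                             # channel frequency response
eq = rx_freq / H                                 # per-subcarrier scalar equalizer
detected = (eq.real > 0).astype(int), (eq.imag > 0).astype(int)
print("bit errors:", int(np.sum(detected[0] != bits[0]) + np.sum(detected[1] != bits[1])))
```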
97

Efficient Disk-Based Techniques for Manipulating Very Large String Databases

Allam, Amin 18 May 2017 (has links)
Indexing and processing strings are very important topics in database management. Strings can be database records, DNA sequences, protein sequences, or plain text. Various string operations are required for several application categories, such as bioinformatics and entity resolution. When the string count or sizes become very large, several state-of-the-art techniques for indexing and processing such strings may fail or behave very inefficiently. Modifying an existing technique to overcome these issues is not usually straightforward or even possible. One category of string operations can be facilitated by the suffix tree data structure, which indexes a long string to enable efficient retrieval of any substring of the indexed string and can also be used in other operations, such as approximate string matching. In this document, we introduce a novel efficient method to construct the suffix tree index for very long strings using parallel architectures, which is a major challenge in this category. Another category of string operations requires clustering similar strings in order to perform application-specific processing on the resulting, possibly overlapping clusters. Based on clustering similar strings, we introduce a novel efficient technique for record linkage and entity resolution, and a novel method for correcting errors in a large number of small strings (read sequences) generated by DNA sequencing machines.
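As a compact stand-in for the suffix tree indexing idea described above, the sketch below uses a naively built suffix array to locate all occurrences of a substring; the text, pattern, and construction method are illustrative assumptions, not the techniques proposed in the thesis.

```python
# Hedged illustration of the indexing idea behind the suffix tree: a naive
# suffix array (sorted suffix start positions) supports substring search by
# binary search. A real suffix tree or suffix array construction would be far
# more efficient; the text and query here are toy assumptions.
from bisect import bisect_left, bisect_right

def build_suffix_array(text):
    # O(n^2 log n) naive construction, fine for illustration only
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_substring(text, sa, pattern):
    """Return the start positions of all occurrences of `pattern` in `text`."""
    keys = [text[i:i + len(pattern)] for i in sa]    # length-m prefixes, in suffix order
    lo = bisect_left(keys, pattern)
    hi = bisect_right(keys, pattern)
    return sorted(sa[lo:hi])

text = "GATTACAGATTTACA"
sa = build_suffix_array(text)
print(find_substring(text, sa, "TTA"))   # -> [2, 10]
```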
98

Long haul communications in the HF spectrum utilizing high speed modems

Ellis, Robert H. 03 1900 (has links)
Approved for public release; distribution is unlimited / In the past ten years, reliable high-speed satellite systems have pushed slower, less reliable communication systems to the bottom of the list for development programs. Concerns over reduced budgets, the vulnerability of expensive satellite systems, and recent advances in HF technology are creating new interest in upgrading existing HF communication systems. Nondevelopment Items (NDI) are off-the-shelf commercial items used instead of costly, time-consuming conventional research and development programs. The Navy Department's current policies are designed to ensure the maximum use of NDI to fulfill Navy requirements. The speed of HF systems can be improved using current signaling and modulation techniques, and reliability can be increased by error-correcting codes or by error detection used in conjunction with automatic repeat request (ARQ) schemes. Improved HF systems not only provide a survivable back-up capability but also increased capacity for present communication needs. / http://archive.org/details/longhaulcommunic00elli / Lieutenant, United States Navy
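The error-detection-plus-ARQ idea mentioned in the abstract can be sketched as a toy stop-and-wait exchange with a CRC-32 check; the frame format and channel error model are illustrative assumptions, not the HF modem schemes evaluated in the thesis.

```python
# Hedged sketch of error detection with automatic repeat request (ARQ):
# a toy stop-and-wait exchange where frames carry a CRC-32 and corrupted
# frames are retransmitted. Frame format and error model are assumptions.
import random
import zlib

def make_frame(seq, payload):
    body = bytes([seq]) + payload
    return body + zlib.crc32(body).to_bytes(4, "big")

def frame_ok(frame):
    body, crc = frame[:-4], frame[-4:]
    return zlib.crc32(body) == int.from_bytes(crc, "big")

def noisy_channel(frame, bit_error_rate=0.002, rng=random.Random(3)):
    out = bytearray(frame)
    for i in range(len(out) * 8):
        if rng.random() < bit_error_rate:
            out[i // 8] ^= 1 << (i % 8)          # flip a bit
    return bytes(out)

messages = [b"WEATHER REPORT", b"POSITION FIX", b"STATUS OK"]
for seq, msg in enumerate(messages):
    attempts = 0
    while True:
        attempts += 1
        received = noisy_channel(make_frame(seq, msg))
        if frame_ok(received):                   # receiver ACKs a clean frame
            break                                # sender moves on to the next frame
    print(f"frame {seq} delivered after {attempts} transmission(s)")
```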
99

Pollution, Electricity Consumption, and Income in the Context of Trade Openness in Zambia

Lackson Daniel, Mudenda January 2016 (has links)
This paper examines the Environmental Kuznets Curve (EKC) hypothesis and tests for causality using Dynamic Ordinary Least Squares (DOLS) and a Vector Error Correction Model (VECM). There is evidence of long-run relationships in the three models under consideration. The Dynamic Ordinary Least Squares (DOLS) estimates find no evidence to support the Environmental Kuznets Curve (EKC) hypothesis for Zambia in the long run. The long-run evidence suggests the opposite of the Environmental Kuznets Curve (EKC), in that the results indicate a U-shaped relationship between income and carbon emissions. The conclusion on causality based on the VECM is that there is evidence of the neutrality hypothesis both between total electricity and income and between industrial electricity and income in the short run. Additionally, there is evidence of the conservation hypothesis in the context of residential and agricultural electricity consumption.
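A minimal sketch of the quadratic, EKC-style long-run specification (here with simple DOLS-style leads and lags of the differenced regressor) is shown below; the file, column names, and lag choices are assumptions for illustration, not the paper's exact model.

```python
# Hedged sketch: regress log emissions on log income and its square, adding
# DOLS-style leads and lags of the differenced regressor. Data and lag/lead
# counts are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("zambia_annual.csv")        # hypothetical series: co2, gdp_pc
y = np.log(df["co2"])
x = np.log(df["gdp_pc"])

data = pd.DataFrame({"y": y, "x": x, "x2": x ** 2})
dx = x.diff()
for k in (-1, 0, 1):                         # one lead, current, one lag of the change in x
    data[f"dx_{k}"] = dx.shift(k)
data = data.dropna()

X = sm.add_constant(data.drop(columns="y"))
res = sm.OLS(data["y"], X).fit()
print(res.params[["x", "x2"]])
# A positive coefficient on x2 is consistent with the U-shape the paper reports;
# a negative one would support the inverted-U of the EKC hypothesis.
```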
100

Microfabricated Surface Trap and Cavity Integration for Trapped Ion Quantum Computing

Van Rynbach, Andre Jan Simoes January 2016 (has links)
Atomic ions trapped in microfabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable characteristics of such a device, including high fidelity state preparation and readout, universal logic gates, and long coherence times, and can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped ion qubits as a photonic interface presents the possibility for order of magnitude improvements in performance in several key areas for their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer scale precision. The second part of this thesis demonstrates the suitability of small microcavities formed from laser ablated, fused silica substrates with radii of curvature in the 300-500 micron range for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photon entanglement rate of up to 10 kHz, the qubit measurement time down to 1 microsecond, and the qubit measurement error rate down to the 1e-5 range. The final part of this thesis describes a performance simulator for exploring the physical resource requirements and performance demands to scale a quantum computer to sizes capable of implementing quantum algorithms beyond the limits of classical computation. / Dissertation
