491

SPECTRAL EFFICIENCY OF 8-ARY PSK MODULATION UTILIZING SQUARE ROOT RAISED COSINE FILTERING

Scheidt, Kelly J. 10 1900 (has links)
International Telemetering Conference Proceedings / October 21, 2002 / Town & Country Hotel and Conference Center, San Diego, California / As frequency allocation restrictions are tightening, and data rates are increasing, it is becoming necessary to incorporate higher order modulation techniques to make more efficient use of available spectrum. When used with Square Root Raised Cosine filtering, 8-ary Phase Shift Keyed modulation is a spectrally efficient technique that makes better use of today’s RF spectrum in comparison to standard formats. This paper will discuss 8-ary PSK modulation and its spectral efficiency with a SRRC filter, along with comparisons to BPSK, QPSK, and FSK.
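The gain is easy to quantify under a standard idealisation. The sketch below is not from the paper: it assumes the occupied bandwidth of an SRRC-shaped carrier is symbol rate × (1 + α) and uses an illustrative roll-off of α = 0.35 to compare the bandwidth efficiency of BPSK, QPSK, and 8-PSK.

```python
# Rough spectral-efficiency comparison under ideal square-root raised cosine
# (SRRC) pulse shaping. Assumes occupied bandwidth = symbol_rate * (1 + alpha),
# the usual Nyquist result; real telemetry spectral masks will differ.

def spectral_efficiency(bits_per_symbol: float, alpha: float) -> float:
    """Bits per second per hertz for an M-ary linear modulation."""
    return bits_per_symbol / (1.0 + alpha)

alpha = 0.35  # example SRRC roll-off factor (assumed, not from the paper)
for name, bits in [("BPSK", 1), ("QPSK", 2), ("8-PSK", 3)]:
    print(f"{name}: {spectral_efficiency(bits, alpha):.2f} bit/s/Hz")
```

Under these assumptions 8-PSK carries three times the bits per hertz of BPSK for the same roll-off, which is the core of the spectral-efficiency argument.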
492

A ROADMAP TO STANDARDIZING THE IRIG 106 CHAPTER 10 COMPLIANT DATA FILTERING AND OVERWRITING SOFTWARE PROCESS

Berard, Alfredo, Manning, Dennis, Kim, Jeong Min 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / High-speed digital recorders have revolutionized the way Major Range and Test Facility Bases collect instrumentation data. One challenge facing these organizations is the need for a validated process for the separation of specific data channels and/or data from multiplexed recordings. Several organizations within Eglin Air Force Base have joined forces to establish the requirements and validate a software process compliant with the IRIG-106 Chapter 10 Digital Recording Standard (which defines allowable media access, data packetization, and error control mechanisms). This paper describes a roadmap for standardizing the process used to produce this software, the Data Overwriting and Filtering Application (DOFA).
493

Design of a pump-as-turbine microhydro system for an abalone farm

Teuteberg, B. H. 03 1900 (has links)
ENGLISH ABSTRACT: This document details the design process of a 97 kW microhydro system for Roman Bay Sea Farm in Gansbaai in the Western Cape Province of South Africa. It contains a literature study of microhydro power, with a focus on the use of Pump-as-Turbine technology and direct-drive systems. The literature study leads to several possible concepts for the project, which are then evaluated; the most suitable design is found to be a reverse-running pump that powers another pump through a direct-drive system. Experimental data from KSB is used to test the accuracy of various correlations that can be used to generate turbine-mode operating curves from pump curves. The final design parameters for the complete system are then determined and presented along with a cost-benefit analysis. / AFRIKAANSE OPSOMMING: This report documents the design process of a 97 kW microhydro system for Roman Bay Sea Farm in Gansbaai in the Western Cape of South Africa. It contains a literature study of microhydro power, with a focus on Pump-as-Turbine and direct-drive systems. The literature study leads to a number of possible concepts for the project, which are then evaluated so that the most suitable design can be chosen. A pump running in reverse and directly driving another pump is found to be the most suitable design. Experimental data from KSB is used to test the accuracy of various correlations that can be used to determine turbine-mode behaviour from pump curves. The final parameters of the complete system are then determined and presented together with a cost analysis. / Centre for Renewable and Sustainable Energy Studies
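As a rough illustration of the kind of pump-to-turbine correlation the thesis evaluates, the sketch below applies one widely cited formulation, usually attributed to Sharma, which scales the pump best-efficiency flow and head by powers of the pump's peak efficiency. The choice of correlation and all input numbers are illustrative assumptions, not results from the thesis or the KSB data.

```python
# One commonly cited pump-as-turbine correlation (attributed to Sharma):
# it predicts the turbine-mode best-efficiency point from pump-mode BEP data.
# Illustrative only; the thesis compares several such correlations against
# KSB test data, and the numbers below are made up.

def sharma_pat_bep(q_pump: float, h_pump: float, eta_max: float):
    """Estimate turbine-mode BEP flow (m^3/s) and head (m) from pump-mode BEP."""
    q_turbine = q_pump / eta_max ** 0.8
    h_turbine = h_pump / eta_max ** 1.2
    return q_turbine, h_turbine

q_t, h_t = sharma_pat_bep(q_pump=0.05, h_pump=30.0, eta_max=0.78)
print(f"Predicted turbine BEP: Q = {q_t:.3f} m^3/s, H = {h_t:.1f} m")
```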
494

Breaking digital firewalls : analyzing internet censorship and circumvention in the Arab world

Al-saqaf, Walid January 2014 (has links)
This dissertation explores the role of Internet censorship and circumvention in the Arab world as well as Arabs’ views on the limits to free speech on the Internet. The project involves the creation of an Internet censorship circumvention tool named Alkasir that allows users to report and access certain types of censored websites. The study covers the Arab world at large with a special focus on Egypt, Syria, Tunisia, and Yemen. This work is interdisciplinary in nature and draws on the disciplines of media and communication studies and computer science. It uses a pioneering experimental approach by placing Alkasir in the hands of willing users who automatically feed a server with data about usage patterns without storing any of their personal information. In addition to the analysis of Alkasir usage data, Web surveys were used to learn about any technical and nontechnical Internet censorship practices that Arab users and content producers may have been exposed to. The study also aims at learning about users’ experiences with circumvention tools and how such tools could be improved. The study found that users have successfully reported and accessed hundreds of censored social networking, news, dissident, multimedia and other websites. The survey results show that while most Arab informants disapprove of censoring online anti-government political content, the majority support the censoring of other types of content such as pornography, hate speech, and anti-religion material. Most informants indicated that circumvention tools should be free of charge, fast, and reliable. An increase in awareness among survey respondents of the need for privacy and anonymity features in circumvention solutions was observed.
495

Monocular vision-aided inertial navigation for unmanned aerial vehicles

Magree, Daniel Paul 21 September 2015 (has links)
The reliance of unmanned aerial vehicles (UAVs) on GPS and other external navigation aids has become a limiting factor for many missions. UAVs are now physically able to fly in many enclosed or obstructed environments, due to the shrinking size and weight of electronics and other systems. These environments, such as urban canyons or enclosed areas, often degrade or deny external signals. Furthermore, many of the most valuable potential missions for UAVs are in hostile or disaster areas, where navigation infrastructure could be damaged, denied, or actively used against the vehicle. It is clear that developing alternative, independent navigation techniques will increase the operating envelope of UAVs and make them more useful. This thesis presents work in the development of reliable monocular vision-aided inertial navigation for UAVs. The work focuses on developing a stable and accurate navigation solution in a variety of realistic conditions. First, a vision-aided inertial navigation algorithm is developed which assumes uncorrelated feature and vehicle states. Flight test results on an 80 kg UAV are presented, which demonstrate that it is possible to bound the horizontal drift with vision aiding. Additionally, a novel implementation method is developed for integration with a variety of navigation systems. Finally, a vision-aided navigation algorithm is derived within a Bierman-Thornton factored extended Kalman Filter (BTEKF) framework, using fully correlated vehicle and feature states. This algorithm improves consistency and accuracy by 2 to 3 orders of magnitude over the previous implementation, both in simulation and flight testing. Flight test results of the BTEKF on large (80 kg) and small (600 g) vehicles show accurate navigation over numerous tests.
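To make the "fully correlated vehicle and feature states" point concrete, here is a minimal covariance-form EKF measurement update over an augmented state containing both blocks. It is only a sketch of the general idea, not the Bierman-Thornton UD-factored filter used in the thesis; the Jacobian, dimensions, and noise values are arbitrary.

```python
# Minimal sketch of an EKF measurement update over an augmented state
# [vehicle; feature positions], illustrating the fully correlated formulation.
# This is a plain covariance-form EKF, not the UD-factored Bierman-Thornton
# filter from the thesis; dimensions and noise values are arbitrary.
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Standard EKF update: x state, P covariance, z measurement,
    h predicted measurement, H measurement Jacobian, R measurement noise."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - h)
    P = (np.eye(len(x)) - K @ H) @ P    # vehicle and feature blocks are
    return x, P                         # updated together, with cross-terms

# Toy example: 6-state vehicle block plus one 3D feature, 2D pixel measurement.
x = np.zeros(9)
P = np.eye(9) * 0.1
H = np.zeros((2, 9)); H[:, 0:2] = np.eye(2); H[:, 6:8] = -np.eye(2)  # assumed Jacobian
z = np.array([0.01, -0.02]); h = np.zeros(2); R = np.eye(2) * 1e-4
x, P = ekf_update(x, P, z, h, H, R)
print(np.round(P[0:2, 6:8], 4))  # vehicle-feature cross-covariance is now nonzero
```

Even starting from a block-diagonal covariance, a single feature measurement populates the vehicle-feature cross-covariance block, which is exactly the coupling a consistent filter has to carry.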
496

DIGITAL FILTERING OF MULTIPLE ANALOG CHANNELS

Hicks, William T. 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / The traditional use of active RC-type filters to provide anti-aliasing filters in Pulse Code Modulation (PCM) systems is being replaced by the use of Digital Signal Processing (DSP). This is especially true when performance requirements are stringent and require operation over a wide environmental temperature range. This paper describes the design of a multi-channel digital filtering card that incorporates up to 100 unique digitally implemented cutoff frequencies. Any combination of these frequencies can be independently assigned to any of the input channels.
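A software analogue of the scheme described above might look like the sketch below: a bank of pre-designed FIR low-pass filters, one per selectable cutoff frequency, with any cutoff independently assignable to any input channel. This is only an illustration of the idea; the card itself is a hardware/DSP design, and the sample rate, tap count, and cutoffs here are invented.

```python
# Software sketch of the idea only: a table of pre-designed low-pass filters
# (one per allowed cutoff frequency) with an arbitrary cutoff assigned to each
# input channel. Sample rate, tap count, and cutoffs are made-up values.
import numpy as np
from scipy.signal import firwin, lfilter

FS = 50_000.0                                   # sample rate, Hz (assumed)
CUTOFFS = np.linspace(100.0, 10_000.0, 100)     # 100 selectable cutoffs, Hz

# Pre-compute one FIR anti-aliasing filter per cutoff frequency.
FILTER_BANK = {fc: firwin(numtaps=129, cutoff=fc, fs=FS) for fc in CUTOFFS}

def filter_channels(samples: np.ndarray, assignment: dict) -> np.ndarray:
    """samples: (n_channels, n_samples); assignment maps channel index -> cutoff."""
    out = np.empty_like(samples, dtype=float)
    for ch in range(samples.shape[0]):
        taps = FILTER_BANK[assignment[ch]]
        out[ch] = lfilter(taps, 1.0, samples[ch])
    return out

# Example: three channels, each independently assigned a cutoff from the bank.
data = np.random.randn(3, 2048)
assigned = {0: CUTOFFS[5], 1: CUTOFFS[5], 2: CUTOFFS[80]}
print(filter_channels(data, assigned).shape)
```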
497

Modelling spatial autocorrelation in spatial interaction data

Fischer, Manfred M., Griffith, Daniel A. 12 1900 (has links) (PDF)
Spatial interaction models of the gravity type are widely used to model origin-destination flows. They draw attention to three types of variables to explain variation in spatial interactions across geographic space: variables that characterise an origin region of a flow, variables that characterise a destination region of a flow, and finally variables that measure the separation between origin and destination regions. This paper outlines and compares two approaches, the spatial econometric and the eigenfunction-based spatial filtering approach, to deal with the issue of spatial autocorrelation among flow residuals. An example using patent citation data that capture knowledge flows across 112 European regions serves to illustrate the application and the comparison of the two approaches. (authors' abstract)
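A minimal sketch of the eigenfunction-based side of the comparison: compute the eigenvectors of the doubly centred spatial weight matrix MWM, with M = I − 11′/n, and use the leading ones as extra regressors to absorb residual spatial autocorrelation. The weight matrix below is a random stand-in, not the authors' 112-region specification.

```python
# Minimal sketch of eigenvector spatial filtering: take the eigenvectors of the
# doubly centred spatial weight matrix M @ W @ M and use the leading ones as
# extra regressors. W below is a random symmetric toy matrix, not real data.
import numpy as np

rng = np.random.default_rng(0)
n = 112                                    # e.g. number of regions
W = rng.random((n, n)); W = (W + W.T) / 2  # toy symmetric spatial weights
np.fill_diagonal(W, 0.0)

M = np.eye(n) - np.ones((n, n)) / n        # centring projector I - 11'/n
vals, vecs = np.linalg.eigh(M @ W @ M)     # Moran eigenvalues / eigenvectors

order = np.argsort(vals)[::-1]             # largest eigenvalues first
E = vecs[:, order[:10]]                    # candidate spatial filter columns

# These columns would then enter the flow regression alongside the origin,
# destination, and separation variables.
print(E.shape)
```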
498

NOVEL DENSE STEREO ALGORITHMS FOR HIGH-QUALITY DEPTH ESTIMATION FROM IMAGES

Wang, Liang 01 January 2012 (has links)
This dissertation addresses the problem of inferring scene depth information from a collection of calibrated images taken from different viewpoints via stereo matching. Although it has been heavily investigated for decades, depth from stereo remains a long-standing challenge and popular research topic for several reasons. First of all, in order to be of practical use for many real-time applications such as autonomous driving, accurate depth estimation in real-time is of great importance and one of the core challenges in stereo. Second, for applications such as 3D reconstruction and view synthesis, high-quality depth estimation is crucial to achieve photorealistic results. However, due to the matching ambiguities, accurate dense depth estimates are difficult to achieve. Last but not least, most stereo algorithms rely on identification of corresponding points among images and only work effectively when scenes are Lambertian. For non-Lambertian surfaces, the "brightness constancy" assumption is no longer valid. This dissertation contributes three novel stereo algorithms that are motivated by the specific requirements and limitations imposed by different applications. In addressing high speed depth estimation from images, we present a stereo algorithm that achieves high quality results while maintaining real-time performance. We introduce an adaptive aggregation step in a dynamic-programming framework. Matching costs are aggregated in the vertical direction using a computationally expensive weighting scheme based on color and distance proximity. We utilize the vector processing capability and parallelism in commodity graphics hardware to speed up this process by over two orders of magnitude. In addressing high accuracy depth estimation, we present a stereo model that makes use of constraints from points with known depths - the Ground Control Points (GCPs) as referred to in stereo literature. Our formulation explicitly models the influences of GCPs in a Markov Random Field. A novel regularization prior is naturally integrated into a global inference framework in a principled way using the Bayes rule. Our probabilistic framework allows GCPs to be obtained from various modalities and provides a natural way to integrate information from various sensors. In addressing non-Lambertian reflectance, we introduce a new invariant for stereo correspondence which allows completely arbitrary scene reflectance (bidirectional reflectance distribution functions - BRDFs). This invariant can be used to formulate a rank constraint on stereo matching when the scene is observed by several lighting configurations in which only the lighting intensity varies.
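The adaptive aggregation step can be sketched in a few lines: matching costs are summed along the vertical direction with weights that fall off with colour difference and spatial distance. The snippet below is a plain numpy illustration of that weighting, not the GPU dynamic-programming implementation described in the dissertation, and its parameters are made up.

```python
# Compact sketch of vertical cost aggregation with weights based on colour
# similarity and spatial distance (adaptive support weights). A plain numpy
# illustration with invented parameters; edges wrap around via np.roll.
import numpy as np

def aggregate_vertical(cost, image, radius=3, gamma_c=10.0, gamma_d=5.0):
    """cost: (H, W, D) matching cost volume; image: (H, W, 3) reference image."""
    H, W, D = cost.shape
    out = np.zeros_like(cost)
    weight_sum = np.zeros((H, W, 1))
    for dy in range(-radius, radius + 1):
        src = np.roll(cost, dy, axis=0)
        shifted = np.roll(image, dy, axis=0)
        color_dist = np.linalg.norm(image - shifted, axis=2)    # (H, W)
        w = np.exp(-color_dist / gamma_c - abs(dy) / gamma_d)   # support weight
        out += w[:, :, None] * src
        weight_sum += w[:, :, None]
    return out / weight_sum

costs = np.random.rand(48, 64, 16).astype(np.float32)
img = np.random.rand(48, 64, 3).astype(np.float32)
print(aggregate_vertical(costs, img).shape)   # (48, 64, 16)
```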
499

Aspects of probabilistic modelling for data analysis

Delannay, Nicolas 23 October 2007 (has links)
Computer technologies have revolutionised the processing of information and the search for knowledge. With the ever increasing computational power, it is becoming possible to tackle new data analysis applications as diverse as mining the Internet resources, analysing drug effects on the organism or assisting wardens with autonomous video detection techniques. Fundamentally, the principle of any data analysis task is to fit a model which encodes well the dependencies (or patterns) present in the data. However, the difficulty is precisely to define such proper model when data are noisy, dependencies are highly stochastic and there is no simple physical rule to represent them. The aim of this work is to discuss the principles, the advantages and weaknesses of the probabilistic modelling framework for data analysis. The main idea of the framework is to model dispersion of data as well as uncertainty about the model itself by probability distributions. Three data analysis tasks are presented and for each of them the discussion is based on experimental results from real datasets. The first task considers the problem of linear subspaces identification. We show how one can replace a Gaussian noise model by a Student-t noise to make the identification more robust to atypical samples and still keep the learning procedure simple. The second task is about regression applied more specifically to near-infrared spectroscopy datasets. We show how spectra should be pre-processed before entering the regression model. We then analyse the validity of the Bayesian model selection principle for this application (and in particular within the Gaussian Process formulation) and compare this principle to the resampling selection scheme. The final task considered is Collaborative Filtering, which is related to applications such as recommendation for e-commerce and text mining. This task is illustrative of how intuitive considerations can guide the design of the model and the choice of the probability distributions appearing in it. We compare the intuitive approach with a simpler matrix factorisation approach.
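For the collaborative-filtering task, the "simpler matrix factorisation approach" mentioned above can be sketched as alternating ridge regressions that factor a ratings matrix into user and item factors. The snippet is a toy illustration under that interpretation, not the probabilistic model developed in the thesis.

```python
# Minimal sketch of a matrix-factorisation baseline: approximate a ratings
# matrix R by U @ V.T via alternating regularised least squares.
# Toy dense data and arbitrary settings; not the thesis's probabilistic model.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, k, lam = 30, 20, 4, 0.1
R = rng.integers(1, 6, size=(n_users, n_items)).astype(float)  # toy ratings

U = rng.standard_normal((n_users, k))
V = rng.standard_normal((n_items, k))
for _ in range(20):                       # alternating least squares
    U = R @ V @ np.linalg.inv(V.T @ V + lam * np.eye(k))
    V = R.T @ U @ np.linalg.inv(U.T @ U + lam * np.eye(k))

rmse = np.sqrt(np.mean((R - U @ V.T) ** 2))
print(f"training RMSE: {rmse:.3f}")
```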
500

Filtering service recovery feedback : A Case study research at Handelsbanken, Uppsala city

Nolan, Neil, Rudström, David January 2008 (has links)
Research has shown that companies encourage customers to complain and gather huge amounts of service recovery information, although most of this information isn't used by the companies. Our purpose with this thesis is to explore what determines the filtering of service recovery feedback and, if possible, to identify its underlying reasons. This was accomplished through a qualitative case study at Handelsbanken Uppsala City. Empirical material was mainly collected through interviews with the office manager, frontline employees, and the regional complaints manager. When analyzing the empirical material, Tax and Brown's model of service recovery was used as an analytical framework.

The analysis shows that the employees at Handelsbanken Uppsala City aren't controlled by many guidelines and policies; instead, emphasis is put on the independence, trust, and responsibility of each individual employee. This is probably due to the decentralized organization of Handelsbanken and the belief in the employees' capability to better understand what is important to filter, given their close interaction with customers.
