111

USING THE FFT FOR DSP SPECTRUM ANALYSIS: A TELEMETRY ENGINEERING APPROACH

Rosenthal, Glenn, Salley, Thomas 11 1900 (has links)
International Telemetering Conference Proceedings / October 29-November 02, 1990 / Riviera Hotel and Convention Center, Las Vegas, Nevada / The Fast Fourier Transform (FFT) converts digitally sampled time domain data into the frequency domain. This paper will provide an advanced introduction for the telemetry engineer to basic FFT theory and then present and explain the different user preprocessing options that are available when using the FFT. These options include: using windowing functions, “zero filling” for frequency data interpolation, and setting the frequency resolution of the FFT resultant spectrum.
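The three preprocessing options named in the abstract can be sketched with NumPy. The sample rate, tone frequency, window choice, and padding factor below are illustrative assumptions, not values from the paper:

```python
import numpy as np

fs = 1000.0                        # sample rate, Hz (assumed)
n = 256                            # time-domain record length
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 210.5 * t)  # tone deliberately between FFT bins

# Windowing tapers the record ends to reduce spectral leakage.
xw = x * np.hanning(n)

# "Zero filling" pads the record, interpolating the displayed spectrum.
nfft = 4 * n                       # 4x zero fill
spectrum = np.abs(np.fft.rfft(xw, nfft))
freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)

# Display bin spacing is fs/nfft, but the true resolution remains fs/n:
# zero filling interpolates the spectrum without adding information.
print(freqs[1])    # bin spacing of the padded spectrum (fs/nfft)
print(fs / n)      # actual frequency resolution of the record
```

With the padding, the spectral peak lands within one display bin of the true 210.5 Hz tone even though that frequency falls between the unpadded FFT bins.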
112

The New Generation Spacecraft Data Simulator to Test Level Zero Processing Systems

Michandani, Chandru, Kozlowski, Chuck, Bennett, Toby 11 1900 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / Over the last several years, the Data Systems Technology Division (DSTD) at Goddard Space Flight Center (GSFC) has developed software tools to generate simulated spacecraft data to support the development, test, and verification of its prototype and production Very Large Scale Integration (VLSI) telemetry data systems. Recently, these data simulation tools have demonstrated their versatility and flexibility in the testing and deployment of several very high performance Level Zero Processing (LZP) systems. Because LZP involves the wide-scale reordering of transmitted telemetry data, the data simulation tools were required to create a number of very large and complex simulated data sets to effectively test these high rate systems. These data sets simulated spacecraft with numerous instrument data sources downlinking out-of-sequence and errored data streams. Simulated data streams were encapsulated in Consultative Committee for Space Data Systems (CCSDS) packet and NASCOM data formats. The knowledge and expertise gained in developing the current simulation tools have been used to develop a new generation data simulation tool, known as the Simulated Telemetry Generation (STGEN) package. STGEN is a menu-driven software package running on UNIX platforms that can implement dynamic test scenarios with very fast turnaround from data set design to data set generation. Error options and error locations in the telemetry data stream are specified through simple, script-driven programs; scripts are used to manipulate packets and frames and permit quicker, easier error insertion. This paper first describes the STGEN software package and its test data design strategies. It then provides an example of STGEN's first use in the testing of systems to support the EOS-AM spacecraft. Finally, a description of future planned improvements and uses of STGEN is provided.
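As a rough illustration of the kind of test data such a simulator produces, the sketch below packs CCSDS space-packet primary headers with deliberately out-of-order sequence counts. This is not the STGEN implementation; the APID and payload size are made up:

```python
import struct

def ccsds_primary_header(apid, seq_count, data_len):
    """Pack a 6-byte CCSDS space packet primary header: version 0,
    telemetry type, no secondary header, unsegmented sequence flags."""
    word1 = apid & 0x7FF                      # version/type/flag bits all 0
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)
    word3 = (data_len - 1) & 0xFFFF           # CCSDS length field is len-1
    return struct.pack(">HHH", word1, word2, word3)

# Out-of-sequence downlink: sequence counts 2 and 3 swapped.
counts = [0, 1, 3, 2, 4]
packets = [ccsds_primary_header(0x20, c, 8) + bytes(8) for c in counts]
print([struct.unpack(">HHH", p[:6])[1] & 0x3FFF for p in packets])
# → [0, 1, 3, 2, 4]
```

An LZP test would then check that the processing system restores the swapped packets to sequence order.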
113

PTP EX: HIGH-RATE FRONT-END TELEMETRY AND COMMAND PROCESSING SYSTEM

Ozkan, Siragan 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the PTP EX, a 160 Mbps Telemetry and Command front-end system, which takes advantage of the state-of-the-art in networking and software technology, and the rapid development in PC components and FPGA design. Applications for the PTP EX include High-rate Remote Sensing Ground Stations, Satellite/Payload Integration and Testing, High-rate Bit Error Rate Test (BERT) Systems, and High-rate Digital Recorder/Playback Systems. The PTP EX Interface Board, the MONARCH-EX PCI High Speed Frame Synchronizer/Telemetry Simulator with Reed-Solomon Encoder/Decoder, is designed with the following key capabilities:
• 160 Mbps serial input for CCSDS Frame Processing (Frame Synchronization, Derandomization, CRC, Reed-Solomon decoding, time stamping, quality annotation, filtering, routing, and stripping);
• 160 Mbps disk logging of Reed-Solomon corrected CCSDS frames with simultaneous real-time processing of spacecraft engineering data and ancillary payload data;
• Onboard CCSDS Telemetry Simulation with 160 Mbps serial output (Sync Pattern, background pattern, ID counter, time stamp, CRC, Reed-Solomon encoding, Randomization, and Convolutional encoding);
• Bit Error Rate Testing up to 160 Mbps (Pseudo-random transmitter and receiver with bit error counter).
The innovative architecture of the MONARCH-EX allows for simultaneous logging of a high-rate data stream and real-time telemetry processing. The MONARCH-EX is also designed with the latest in field-programmable gate array (FPGA) technology. FPGAs allow the board to be reprogrammed quickly and easily to perform different functions; thus, the same hardware can be used for telemetry processing, simulation, and BERT applications.
The PTP EX also takes advantage of the latest advances in off-the-shelf PC computing and technology, including Windows NT, Pentium II, PCI, Gigabit Ethernet, and RAID subsystems. Avtec Systems, Inc. is leveraging the PTP EX to take advantage of the continuous improvement in high-end PC server components.
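The frame-synchronization step listed first above can be illustrated in a few lines: scan a bit stream for the standard 32-bit CCSDS attached sync marker (0x1ACFFC1D). The stream below is synthetic and the frame length is an arbitrary assumption:

```python
import numpy as np

ASM = 0x1ACFFC1D  # standard CCSDS attached sync marker
ASM_BITS = np.array([(ASM >> (31 - i)) & 1 for i in range(32)], dtype=np.uint8)

def find_sync(bits):
    """Return every bit offset at which the 32-bit ASM matches exactly."""
    hits = []
    for i in range(len(bits) - 31):
        if np.array_equal(bits[i:i + 32], ASM_BITS):
            hits.append(i)
    return hits

# Two synthetic 64-bit "frames", each preceded by the ASM,
# with 17 random fill bits before the first marker.
rng = np.random.default_rng(42)
frame = rng.integers(0, 2, 64, dtype=np.uint8)
stream = np.concatenate([rng.integers(0, 2, 17, dtype=np.uint8),
                         ASM_BITS, frame, ASM_BITS, frame])
print(find_sync(stream))
```

Hardware frame synchronizers do the same search in parallel with tolerance for bit slips and inverted data; this exhaustive bit-serial scan is only a sketch of the principle.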
114

Rapid screening of novel nanoporous materials for carbon capture separations

Mangano, Enzo January 2013 (has links)
In this work the experimental results from the rapid screening and ranking of a wide range of novel adsorbents for carbon capture are presented. The samples were tested using the Zero Length Column (ZLC) method, which has proved to be an essential tool for the rapid investigation of the equilibrium and kinetic properties of prototype adsorbents. The study was performed on different classes of nanoporous materials developed as part of the EPSRC-funded “Innovative Gas Separations for Carbon Capture” (IGSCC) project. More than 120 novel adsorbents with different key features for post-combustion carbon capture were tested. The classes of materials investigated were:
• PIMs (Polymers of Intrinsic Microporosity)
• MOFs (Metal-Organic Frameworks)
• Mesoporous silicas
• Zeolites
• Carbons
All the samples were tested at experimental conditions close to those of a typical flue gas of a fossil fuel power plant: 35 ºC and 0.1 bar partial pressure of CO2. The ranking of the CO2 capacities of the materials at these conditions indicates the Mg- and Ni-based MOF samples as the adsorbents with the highest uptake among all the candidates; the best sample shows a CO2 capacity almost double that of the benchmark adsorbent, zeolite 13X (provided by UOP). The ranking also identifies some of the synthesised zeolite adsorbents as promising materials for carbon capture: uptakes comparable to or slightly higher than 13X were obtained for several samples of Rho and chabazite zeolites. Water stability tests performed on the best MOFs showed a considerably faster deactivation rate for the Mg-based MOFs, confirming the expected higher resistance to degradation of the Ni-based materials. A focused investigation was also carried out on the diffusion of CO2 in different ion-exchanged zeolite Rho samples.
The study of these samples, characterised by extremely slow kinetics, extended the use of the ZLC method to very slow diffusional time constants, which are very difficult to extract from the traditional long-time asymptotic analysis. The results show how the combination of the full saturation and partial loading experiments can provide unambiguous diffusional time constants. The diffusivity of CO2 in zeolite Rho is shown to be strongly influenced by the framework structure as well as by the nature and position of the different cations in the framework. The kinetics of the Na-Cs Rho sample was also measured using a Quantachrome Autosorb-iQ™ volumetric system. To correctly interpret the dynamic response of the instrument, modifications were applied to the theoretical model developed by Brandani in 1998 for the analysis of the piezometric method. The analytical solution of the model introduces parameters that account for the real experimental conditions, and the results confirm the validity of the methodology for the analysis of slow diffusion processes. In conclusion, the advantages offered by the small size of the column and the small amount of sample required proved the ZLC method to be a very useful tool for the rapid ranking of the CO2 capacity of prototype adsorbents. Equilibrium and kinetic measurements were performed on a very wide range of novel nanoporous materials, and the most promising samples were further investigated through the water stability test, the partial loading experiment, and the volumetric system. The ZLC technique was also extended to measurements on systems with very slow kinetics, for which it is very difficult to extract reliable diffusional time constants, and an improved model for the interpretation of dynamic response curves from a non-ideal piezometric system was developed.
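The long-time asymptotic analysis mentioned in the abstract can be sketched on synthetic data: at long times a ZLC desorption curve decays as a single exponential, so the slope of ln(c/c0) versus t yields the diffusional time constant D/R². The parameter values and the pure single-exponential form below are illustrative assumptions, not results from the thesis:

```python
import numpy as np

# Synthetic ZLC desorption curve (real curves are a sum of exponentials;
# only the long-time tail is a single exponential).
D_over_R2 = 2e-3            # diffusional time constant, 1/s (assumed)
beta1 = np.pi               # first eigenvalue of this illustrative model
t = np.linspace(0.0, 2000.0, 200)
c = 0.5 * np.exp(-beta1**2 * D_over_R2 * t)

# Long-time asymptote: slope of ln(c) vs t equals -beta1^2 * D/R^2.
slope = np.polyfit(t[100:], np.log(c[100:]), 1)[0]
print(-slope / beta1**2)    # recovers the assumed D/R^2
```

For genuinely slow kinetics this tail is buried in baseline noise, which is why the thesis turns to partial-loading experiments to pin down the time constant.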
115

Anaerobic Bioremediation of Hexavalent Uranium in Groundwater

Tapia-Rodriguez, Aida Cecilia January 2011 (has links)
Uranium contamination of groundwater from mining and milling operations is an environmental concern. Reductive precipitation of soluble and mobile hexavalent uranium (U(VI)) to insoluble and immobile tetravalent uranium (U(IV)) constitutes the most promising remediation approach for uranium in groundwater. Previous research has shown that many microorganisms are able to catalyze this reaction in the presence of suitable electron donors. The purpose of this work is to explore low-cost, effective alternatives for biologically catalyzed reductive precipitation of U(VI). Methanogenic granular sludge from anaerobic reactors treating industrial wastewaters was tested for its ability to support U(VI) reduction. Due to their high microbial diversity, methanogenic granules displayed intrinsic activity towards U(VI) reduction. Endogenous substrates from the slow decomposition of sludge biomass provided electron equivalents to support efficient U(VI) reduction without external electron donors. Continuous columns with methanogenic granules also demonstrated sustained reduction for one year at high uranium loading rates. One column fed with ethanol enabled only a short-term enhancement in uranium removal efficiency, with no long-term enhancement compared to the endogenous column. Nitrate, a common co-contaminant of uranium, remobilized previously deposited biogenic U(IV), and U(VI) in turn inhibited denitrification. An enrichment culture (EC) was developed from a zero-valent iron (Fe⁰)/sand packed-bed bioreactor. Over 28 months, the EC enhanced U(VI) reduction rates by Fe⁰ compared with abiotic Fe⁰ controls. Additional experiments indicated that the EC prevented the passivation of Fe⁰ surfaces through the use of cathodic H₂ for the reduction of Fe(III) in passivating corrosion mineral phases (e.g. magnetite) to Fe²⁺.
This contributed to the formation of secondary minerals more enriched with Fe(II), which are known to be chemically reactive with U(VI). To determine the toxicity of U(VI) to different populations present in uranium-contaminated sites, including methanogens, denitrifiers, and uranium reducers, experiments were carried out with anaerobic mixed cultures at increasing U(VI) concentrations. Significant inhibition by U(VI) was observed for methanogens and denitrifiers; uranium-reducing microorganisms, on the other hand, were tolerant to high U(VI) concentrations. The results of this dissertation indicate that direct microbial reduction of U(VI) and microbially enhanced reduction of U(VI) by Fe⁰ are promising approaches for uranium bioremediation.
116

A Model Selection Paradigm for Modeling Recurrent Adenoma Data in Polyp Prevention Trials

Davidson, Christopher L. January 2012 (has links)
Colorectal polyp prevention trials (PPTs) are randomized, placebo-controlled clinical trials that evaluate a chemopreventive agent and include participants who are followed for at least 3 years to compare recurrence rates (counts) of adenomas. A large proportion of zero counts will likely be observed in both groups at the end of the observation period. Poisson generalized linear models (GLMs) are usually employed for estimation of recurrence in PPTs. Other models, including the negative binomial (NB2), zero-inflated Poisson (ZIP), and zero-inflated negative binomial (ZINB), may be better suited to handle zero inflation or other forms of overdispersion that are common in count data. A model selection paradigm that determines a statistical approach for choosing the best-fitting model for recurrence data is described. An example using a subset from a large Phase III clinical trial indicated that the ZINB model was the best-fitting model for the data.
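The model comparison the abstract describes can be sketched with an intercept-only Poisson versus ZIP fit on simulated counts, selecting by AIC. The data below are simulated with made-up parameters, not the trial data, and the ZIP likelihood is coded directly rather than taken from the dissertation:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Simulated recurrence counts with excess (structural) zeros.
n = 500
structural_zero = rng.random(n) < 0.4
y = np.where(structural_zero, 0, rng.poisson(2.0, n))

def poisson_nll(lam, y):
    """Negative log-likelihood of an intercept-only Poisson model."""
    return -np.sum(y * np.log(lam) - lam - gammaln(y + 1))

def zip_nll(params, y):
    """Negative log-likelihood of an intercept-only ZIP model."""
    lam, logit_pi = params
    if lam <= 0:
        return np.inf
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    n0 = np.sum(y == 0)
    y1 = y[y > 0]
    ll = n0 * np.log(pi + (1.0 - pi) * np.exp(-lam))
    ll += np.sum(np.log1p(-pi) + y1 * np.log(lam) - lam - gammaln(y1 + 1))
    return -ll

aic_pois = 2 * 1 + 2 * poisson_nll(y.mean(), y)       # Poisson MLE is the mean
fit = minimize(zip_nll, x0=[1.0, 0.0], args=(y,), method="Nelder-Mead")
aic_zip = 2 * 2 + 2 * fit.fun
print(aic_pois, aic_zip)   # the ZIP model should win by a wide margin here
```

In practice the paradigm would extend this comparison to NB2 and ZINB and to models with covariates, as statsmodels or R's pscl package do.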
117

Lattice models of pattern formation in bacterial dynamics

Thompson, Alasdair Graham January 2012 (has links)
In this thesis I study a model of self-propelled particles exhibiting run-and-tumble dynamics on a lattice. This non-Brownian diffusion is characterised by a random walk with a finite persistence length between changes of direction, and is inspired by the motion of bacteria such as Escherichia coli. By defining a class of models with multiple species of particle and transmutation between species we can recreate such dynamics. These models admit exact analytical results whilst also forming a counterpart to previous continuum models of run-and-tumble dynamics. I solve the externally driven non-interacting and zero-range versions of the model exactly, and utilise a field-theoretic approach to derive the continuum fluctuating hydrodynamics for more general interactions. I make contact with prior approaches to run-and-tumble dynamics on the lattice and determine the steady state and linear stability for a class of crowding interactions, where the jump rate decreases as density increases. In addition to its interest from the perspective of nonequilibrium statistical mechanics, this lattice model constitutes an efficient tool to simulate a class of interacting run-and-tumble models relevant to bacterial motion. Pattern formation in bacterial colonies is confirmed to be able to stem solely from the interplay between a diffusivity that depends on the local bacterial density and regulated division of the cells, in particular without the need for any explicit chemotaxis. This simple and generic mechanism thus provides a null hypothesis for pattern formation in bacterial colonies which has to be falsified before appealing to more elaborate alternatives. Most of the literature on bacterial motility relies on models with instantaneous tumbles; as I show, however, a finite tumble duration can play a major role in the patterning process.
Finally, a connection is made to some real experimental results, and the population ecology of multiple species of bacteria competing for the same resources is considered.
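A minimal one-dimensional sketch of on-lattice run-and-tumble motion, showing the finite persistence length the abstract refers to. The lattice size and tumble probability are arbitrary illustrative choices, and this single-particle walk is far simpler than the multi-species interacting model of the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

L = 200            # periodic lattice of L sites
alpha = 0.01       # tumble probability per step (illustrative)
steps = 20000

pos, vel = 0, 1
run_lengths, current = [], 0
for _ in range(steps):
    if rng.random() < alpha:      # tumble: reverse direction
        vel = -vel
        run_lengths.append(current)
        current = 0
    else:                         # run: hop one site in the current direction
        pos = (pos + vel) % L
        current += 1

# Runs between tumbles are geometrically distributed, so the mean
# persistence length is (1 - alpha) / alpha, i.e. about 99 sites here.
print(np.mean(run_lengths))
```

Adding more particles with a density-dependent hop rate turns this walk into the crowding models whose steady state and stability the thesis analyses.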
118

A Logistic Normal Mixture Model for Compositions with Essential Zeros

Bear, John Stanley, Bear, John Stanley January 2016 (has links)
Compositions are vectors of nonnegative numbers that sum to a constant, usually one or 100%. They arise in a wide array of fields: geological sampling, budgets, fat/protein/carbohydrate in foods, percentage of the vote acquired by each political party, and more. The usual candidate distributions for modeling compositions -- the Dirichlet and the logistic normal distribution -- have density zero if any component is zero. While statistical methods have been developed for "rounded" zeros, zeros stemming from values below a detection level, and zeros arising from count data, there remain problems with essential zeros, i.e. cases in continuous compositions where a component is truly absent. We develop a model for compositions with essential zeros based on an approach by Aitchison and Kay (2003). It uses a mixture of additive logistic normal distributions of different dimension, related by common parameters. With the requirement of an additional constraint, we develop a likelihood and methods for estimating location and dispersion parameters. We also develop a permutation test for a two-group comparison, and demonstrate the model and test using data from a diabetes study. These results provide the first use of the additive logistic normal distribution for modeling and testing compositional data in the presence of essential zeros.
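The zero problem the abstract describes is easy to see with the additive log-ratio (alr) transform underlying the logistic normal distribution. The compositions below are made up for illustration:

```python
import numpy as np

def alr(x):
    """Additive log-ratio transform, using the last part as reference."""
    x = np.asarray(x, dtype=float)
    return np.log(x[:-1] / x[-1])

def alr_inv(z):
    """Map alr coordinates back onto the simplex."""
    y = np.exp(np.append(z, 0.0))
    return y / y.sum()

comp = np.array([0.2, 0.3, 0.5])
print(alr_inv(alr(comp)))          # round-trips back to the composition

# An essential zero breaks the transform: log(0) is -inf, so the
# logistic normal assigns such compositions density zero.
with np.errstate(divide="ignore"):
    print(alr(np.array([0.0, 0.4, 0.6])))
```

The mixture approach sidesteps this by modeling each pattern of present parts with a logistic normal of the appropriate (lower) dimension, tied together by shared parameters.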
119

Developing sustainable household waste management : a Local Authority approach to zero waste

Cole, Christine January 2014 (has links)
This project was a case study with a Local Authority (Charnwood Borough Council, Leicestershire) to research options in response to the challenges of managing household waste. The research focused on establishing and analysing methods of improving the sustainability of household waste management within a Waste Collection Authority, where interaction with a variety of external and internal stakeholders meant a holistic approach was needed. Waste management practices and performance in Charnwood were evaluated and benchmarked against national standards and the demography of a semi-rural Borough, and waste management practices nationally were also reviewed. The performance of the Local Authority was quantitatively compared with other UK Local Authorities that achieve higher recycling performance. Key differences were separate food waste collection and treatment, a larger proportion of urban housing, and the university with its transient population; other differences included strategy and operational practices for garden waste and the storage, collection, transportation, and treatment of waste. A time series statistical model was modified and applied to investigate long-term waste generation trends from the Borough's official waste data returns to Defra, and was used to assess the success of interventions undertaken. This model was able to differentiate interventions that achieved lasting improvements in either waste minimisation or recycling. A Zero Waste Strategy was declared to capture the public imagination, and a series of focus groups and public consultations was held to judge public reaction and to develop and refine the strategy, adapting the Zero Waste idea to suit local conditions. A major conclusion was that householder involvement would be crucial for successful implementation of the further separation of waste that would be required.
120

Minimální protipříklady na hypotézy o tocích / Minimal counterexamples to flow conjectures

Korcsok, Peter January 2015 (has links)
We say that a graph admits a nowhere-zero k-flow if we can assign a direction and a positive integer flow value (less than k) to each edge so that the total in-flow into v and the total out-flow from v are equal for each vertex v. In 1954, Tutte conjectured that every bridgeless graph admits a nowhere-zero 5-flow, and the conjecture is still open. Kochol in his recent papers introduces a computational method for proving that a minimal counterexample cannot contain short circuits (up to length 10). In this thesis, we provide a comprehensive view of this method. Moreover, since Kochol does not share his implementation, and in order to independently verify the method, we provide our source code, which validates Kochol's results and extends them: we prove that any minimal counterexample to the conjecture does not contain any circuit of length less than 12.
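The defining condition of a nowhere-zero k-flow is easy to verify programmatically. The checker below is a minimal sketch of that definition, unrelated to Kochol's (unshared) implementation:

```python
def is_nowhere_zero_k_flow(n_vertices, edges, flow, k):
    """Check that assigning flow[i] units (in 1..k-1) along each
    directed edge (u, v) conserves flow at every vertex."""
    net = [0] * n_vertices
    for (u, v), f in zip(edges, flow):
        if not 1 <= f <= k - 1:
            return False               # flow value is zero or too large
        net[u] -= f                    # f units leave u
        net[v] += f                    # f units enter v
    return all(x == 0 for x in net)    # conservation at each vertex

# A directed triangle carries a nowhere-zero 2-flow: value 1 on each edge.
triangle = [(0, 1), (1, 2), (2, 0)]
print(is_nowhere_zero_k_flow(3, triangle, [1, 1, 1], 2))   # True
print(is_nowhere_zero_k_flow(3, triangle, [1, 1, 2], 2))   # False: 2 > k-1
```

Proving that no valid assignment exists for any bridgeless graph with a short circuit is the hard combinatorial search that the thesis's computational method addresses.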
