281

The Effect of Peak Detection Algorithms on the Quality of Underwater Laser Ranging

Hung, Chia-Chun 29 July 2004 (has links)
Laser-based underwater triangulation ranging is sensitive to environmental conditions and to the laser beam profile, and its ranging quality depends strongly on the algorithms chosen for peak detection and image processing. Applying the merging least-squares approximation to the laser image has been shown to improve the quality of triangulation ranging in water; however, that result was obtained with a laser beam of nearly circular cross-section. We are therefore interested in how different peak detection algorithms perform when an ellipse-like beam cross-section is used for range finding. Moreover, since the orientation of the elliptical laser spot projected onto the image plane can vary, we also examine the relationship between ellipse orientation and ranging quality. In this study, peak detection algorithms are investigated for four laser beam cross-sections: circle, horizontal ellipse, oblique ellipse, and vertical ellipse. First, we apply polynomial regression to the laser image to study the effect of polynomial degree on the quality of triangulation ranging, and find that linear regression achieves the best ranging quality. Based on this result, we then evaluate the ranging quality of three peak detection algorithms: the illumination center, the twice illumination center, and the illumination center with principal component analysis. The illumination center with principal component analysis performs best, followed by the twice illumination center and then the plain illumination center, indicating that the orientation of the elliptical laser beam has a strong effect on the quality of range finding. In addition, the differences in ranging quality among the peak detection algorithms are significantly reduced when the merging least-squares approximation is applied to the laser image, demonstrating that the merging least-squares approximation does reduce the effect of the peak detection algorithm on the quality of range finding.
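As a minimal sketch of the two simplest ideas named above, the following computes the illumination center (intensity-weighted centroid) of a laser spot and a PCA-based estimate of the spot's major axis. The synthetic spot and all names are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def illumination_center(img):
    """Intensity-weighted centroid (illumination center) of a spot image."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

def principal_axis(img):
    """Spot orientation via PCA of the intensity-weighted covariance."""
    cx, cy = illumination_center(img)
    ys, xs = np.indices(img.shape)
    w = img / img.sum()
    cov = np.array([
        [np.sum(w * (xs - cx) ** 2),        np.sum(w * (xs - cx) * (ys - cy))],
        [np.sum(w * (xs - cx) * (ys - cy)), np.sum(w * (ys - cy) ** 2)],
    ])
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, np.argmax(eigvals)]   # major axis of the elliptical spot

# Synthetic oblique elliptical spot for illustration
y, x = np.mgrid[0:64, 0:64]
u, v = (x - 30) + (y - 34), (x - 30) - (y - 34)   # axes rotated 45 degrees
spot = np.exp(-(u / 18.0) ** 2 - (v / 6.0) ** 2)
print(illumination_center(spot), principal_axis(spot))
```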
282

Study on Peak-to-Average Power Ratio of OFDM Systems

Hung, Kuen-Ming 05 September 2004 (has links)
In recent years, the development of OFDM systems has received a lot of attention. Examples of existing systems that use OFDM include digital audio broadcasting, high-definition television terrestrial broadcasting, asymmetric digital subscriber lines, and so on. There are several reasons for using OFDM. First, it is an efficient way to deal with multipath: for a fixed amount of delay spread, the implementation complexity of an OFDM system is much lower than that of a single-carrier system, because OFDM can absorb the delay spread with a simple guard time instead of a complex equalizer. Second, OFDM can achieve high data rates by using a large number of subcarriers. Third, OFDM can efficiently combat narrowband interference. On the other hand, OFDM has two main drawbacks: it is more sensitive to frequency offset, and it exhibits a high peak-to-average power ratio (PAPR). This thesis focuses on the PAPR problem. Pulse shaping is an effective way to address it. It can be used with any number of subcarriers, so it is very flexible; it requires no additional IFFTs, unlike the selected mapping or partial transmit sequence methods, so its implementation is simpler; and because it does not distort the OFDM symbols, its bit error performance should be better than that of the clipping method. Using the pulse shaping method, we obtain a waveform that keeps the PAPR of the OFDM symbols below about 2.
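For reference, PAPR is simply the ratio of the peak instantaneous power of the transmitted time-domain signal to its average power. A small sketch, under illustrative assumptions (BPSK, 64 subcarriers, no pulse shaping), of how the PAPR of unshaped OFDM symbols is measured:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
N = 64                                            # subcarriers (illustrative)
symbols = 2 * rng.integers(0, 2, (1000, N)) - 1   # BPSK on each subcarrier
time_signals = np.fft.ifft(symbols, axis=1)       # OFDM modulation
paprs = [papr_db(s) for s in time_signals]
print("median %.2f dB, worst %.2f dB over 1000 OFDM symbols"
      % (np.median(paprs), max(paprs)))
```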
283

Attenuation Relationship For Peak Ground Velocity Based On Strong Ground Motion Data Recorded In Turkey

Altintas, Suleyman Serkan 01 December 2006 (has links) (PDF)
Estimation of ground motion parameters is extremely important for engineers to make structures safer and more economical, so it is one of the main issues of earthquake engineering. Peak values of ground motions, obtained either from existing records or from attenuation relationships, are useful parameters for estimating the effect of an earthquake at a specific location. Peak Ground Velocity (PGV) has been used extensively in recent years as a measure of intensity and as the primary input of energy-related analyses of structures. Consequently, PGV values are used to construct emergency response systems such as Shake Maps and to determine the deformation demands of structures. Despite the importance of earthquakes for Turkey, there is a lack of suitable velocity attenuation relationships developed specifically for the country. The aim of this study is to address this deficiency by developing an attenuation relationship for Peak Ground Velocity based on the strong ground motion records of Turkey. A database was processed with established techniques, and a corrected database of the chosen ground motions was formed. Five forms of equations used in previous studies were selected as models, and the best-fitting mathematical attenuation relation was obtained by nonlinear regression analysis. The result of this study can be used as an effective tool in seismic hazard assessment studies for Turkey. In addition, as a by-product of this study, the corrected database of Turkish strong ground motion recordings may prove to be a valuable source for future researchers.
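A hedged sketch of the kind of nonlinear regression described above. The functional form, coefficients, and data below are illustrative assumptions, not the thesis's model or database; a common generic attenuation form, ln PGV = c1 + c2·M + c3·ln(R + c4), stands in for the five candidate equations.

```python
import numpy as np
from scipy.optimize import curve_fit

def attenuation(X, c1, c2, c3, c4):
    """Generic form ln(PGV) = c1 + c2*M + c3*ln(R + c4) (illustrative)."""
    M, R = X
    return c1 + c2 * M + c3 * np.log(R + c4)

# Hypothetical records: magnitude, distance (km), observed PGV (cm/s)
M   = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.4, 6.2, 5.8])
R   = np.array([20., 35., 10., 50., 15., 80., 25., 60.])
pgv = np.array([2.1, 2.5, 14.0, 3.2, 22.0, 4.0, 7.5, 1.8])

coeffs, _ = curve_fit(attenuation, (M, R), np.log(pgv),
                      p0=[-2.0, 1.0, -1.0, 5.0],
                      bounds=([-10, 0, -5, 0.1], [10, 5, 0, 50]))
print("fitted coefficients:", coeffs)
```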
284

Design And Realization Of A High Voltage Radio Interference Voltage (RIV) Measurement System

Ozer, Mutlu 01 March 2010 (has links) (PDF)
This thesis addresses the design and realization of a radio noise meter that can be used to measure the radio interference of a high-voltage transmission line due to partial discharges such as conductor corona. The radio noise meter is the common equipment for radio noise and radio interference voltage measurements. Transmission-line corona, its characteristics, its effects on radio interference, and the measurement of corona-caused radio noise within the scope of the relevant international standards are investigated. A radio noise meter fed by a monopole antenna, centered at 1 MHz with a bandwidth of 4.5 kHz and using a quasi-peak detector with a 1 ms charge time and a 600 ms discharge time, is realized. Conductor corona is observed, measured, and analyzed from the radio interference point of view with the help of the realized radio noise meter.
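A quasi-peak detector weights impulsive disturbances by charging quickly and discharging slowly, so repetitive corona bursts read higher than isolated ones. A discrete-time sketch of this behavior, using the charge and discharge constants stated above (the test signal and sample rate are illustrative assumptions):

```python
import numpy as np

def quasi_peak(envelope, fs, tau_charge=1e-3, tau_discharge=600e-3):
    """Discrete-time quasi-peak detector: fast charge, slow discharge.
    envelope: rectified detector input; fs: sample rate in Hz."""
    a_c = np.exp(-1.0 / (fs * tau_charge))     # per-sample charge decay
    a_d = np.exp(-1.0 / (fs * tau_discharge))  # per-sample discharge decay
    out = np.zeros_like(envelope)
    y = 0.0
    for i, x in enumerate(envelope):
        if x > y:                # input above capacitor voltage: charge
            y = a_c * y + (1 - a_c) * x
        else:                    # otherwise discharge slowly
            y = a_d * y
        out[i] = y
    return out

# Impulsive test signal: periodic corona-like bursts on a noise floor
fs = 100_000
t = np.arange(0, 2.0, 1 / fs)
env = 0.05 * np.abs(np.random.default_rng(0).standard_normal(t.size))
env[::fs // 10] += 1.0           # 10 impulses per second
print("quasi-peak reading:", quasi_peak(env, fs).max())
```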
285

Geant4 Based Monte Carlo Simulation For Carbon Fragmentation In Nuclear Emulsion

Hosseini, Navid 01 July 2012 (has links) (PDF)
This study focuses on Monte Carlo simulation of carbon fragmentation in nuclear emulsion. The carbon ion is a remarkable candidate for cancer therapy because of its high efficiency in depositing the majority of its energy in a narrow region called the Bragg peak. On the other hand, the main side effect of heavy-ion therapy is the radiation dose beyond the Bragg peak, which damages healthy tissue. The use of heavy ions in cancer therapy therefore requires an accurate understanding of the ion-matter interactions that produce secondary particles. A Geant4-based simulation of carbon fragmentation was performed for a 400 MeV/n carbon beam directed at a detector made of nuclear emulsion films interleaved with Lexan layers. Four different Geant4 models are compared with recent experimental data; among them, the Binary Cascade model (BIC) shows the best agreement.
286

Engineering analysis of fugitive particulate matter emissions from cattle feedyards

Hamm, Lee Bradford 12 April 2006 (has links)
An engineering analysis of the fugitive particulate matter emissions from a feedyard is not simple. The presence of an evening dust peak in concentration measurements downwind of a feedyard complicates the calculation of an average 24-h emission flux for the feedyard. The evening dust peak is a recurring event in which particulate matter concentrations increase and decrease dramatically over a short period during the evening hours; concentrations measured during the evening can be up to 8 times those measured throughout the rest of the day. There is a perception that these concentration increases are due to increased cattle activity as the temperature decreases in the evening. The purpose of Objective 1 of this research was to quantify the changes in concentrations based on changes in meteorological conditions and/or cattle activity. Using ISCST3, a Gaussian-based, EPA-approved dispersion model used to predict concentrations downwind of the feedyard, the results of this work indicate that up to 80% of the increase in concentrations can be attributed to changes in meteorological conditions (wind speed, stability class, and mixing height). The total fugitive particulate matter emissions from a cattle feedyard come from two sources: unpaved roads (vehicle traffic) and pen surfaces (cattle activity). Objective 2 of this research was to quantify the mass fraction of the concentration measurements due to unpaved road emissions (vehicle traffic). Wanjura et al. (2004) reported that as much as 80% of the concentrations measured after a rain event were due to unpaved road emissions. An engineering analysis of the potential unpaved road emissions versus the total feedyard emissions using ISCST3 suggests that it is possible for 70 to 80% of the concentration measurements to be attributed to unpaved road emissions. The purpose of Objective 3 was to demonstrate the science used by ISCST3 to predict concentrations downwind of an area source. Results from this study indicate that the ISCST3 model utilizes a form of the Gaussian line source algorithm to predict concentrations downwind of an area source.
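To illustrate the meteorological mechanism behind the evening peak, here is the textbook Gaussian plume point-source kernel that underlies models like ISCST3 (the actual ISCST3 area-source algorithm integrates line sources; the numbers below are illustrative assumptions). Lower wind speed and a more stable atmosphere, i.e. smaller dispersion coefficients, raise the predicted concentration even at constant emission rate.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Textbook Gaussian plume concentration at a receptor offset y (m)
    crosswind and z (m) above ground, for a point source of strength
    Q (g/s), wind speed u (m/s), and effective release height H (m).
    sigma_y, sigma_z are dispersion coefficients, which grow with downwind
    distance according to the stability class. Ground reflection is
    included via the image-source term."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q * lateral * vertical / (2 * np.pi * u * sigma_y * sigma_z)

# Midday: strong wind, unstable atmosphere (large sigmas) -> low concentration
print(gaussian_plume(Q=10, u=5.0, y=0, z=1.5, H=3, sigma_y=40, sigma_z=25))
# Evening: light wind, stable atmosphere (small sigmas) -> concentration
# rises sharply even though the emission rate Q is unchanged
print(gaussian_plume(Q=10, u=1.5, y=0, z=1.5, H=3, sigma_y=10, sigma_z=5))
```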
287

Essays on pricing under uncertainty

Escobari Urday, Diego Alfonso 10 October 2008 (has links)
This dissertation analyzes pricing under uncertainty, focusing on the U.S. airline industry. It tests theories of price dispersion driven by demand uncertainty, taking advantage of very detailed information about the dynamics of airline prices and inventory levels as the flight date approaches. Such detailed ticket-level inventory information has been used previously by the author to show the importance of capacity constraints in airline pricing. This dissertation proposes and implements several new ideas for analyzing airline pricing. Among the most important: (1) it uses ticket-level inventory information; (2) it is the first to note that fare changes can be explained by adding dummy variables representing ticket characteristics, so the ticket-level load factor loses its explanatory power on fares once all ticket characteristics are included in a pricing equation; (3) it is the first to propose and implement a measure of Expected Load Factor as a tool to identify which flights are peak and which are not; (4) it introduces the novel idea of comparing actual sales with average sales at various points prior to departure, and using these deviations of actual sales from sales under average conditions, it is the first study to show empirical evidence of peak-load pricing in airlines; (5) it controls for the potential endogeneity of sales using dynamic panels. The first essay tests the empirical importance of theories that explain price dispersion under costly capacity and demand uncertainty. It calculates a measure of Expected Load Factor that is used to calibrate the distribution of demand uncertainty and to identify which flights are peak and which are off-peak, and it shows that different prices can be explained by different selling probabilities. The second essay is the first study to provide formal evidence of stochastic peak-load pricing in airlines: airlines learn about demand and respond to early sales, setting higher prices when expected demand is high and more likely to exceed capacity.
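A toy sketch of the comparison described in idea (4): actual cumulative sales versus the average sales curve at fixed days before departure. All numbers and the peak/off-peak rule below are hypothetical illustrations, not the dissertation's estimator.

```python
import numpy as np

# Cumulative seats sold per flight at fixed days before departure
# (hypothetical; the dissertation's data are at the ticket level)
days_before = [60, 30, 14, 7, 3, 1]
sales = np.array([
    [10, 35, 60, 80, 95, 110],   # flight 0: sells well ahead of average
    [ 5, 15, 30, 45, 60,  75],   # flight 1
    [ 8, 20, 40, 55, 70,  90],   # flight 2
])
capacity = 120.0

avg_curve = sales.mean(axis=0)          # sales under "average conditions"
deviations = sales - avg_curve          # early-sales signal per flight
expected_lf = avg_curve[-1] / capacity  # Expected-Load-Factor-style benchmark

for i in range(sales.shape[0]):
    final_lf = sales[i, -1] / capacity
    label = "peak" if final_lf > expected_lf else "off-peak"
    print(f"flight {i}: load factor {final_lf:.2f} ({label}), "
          f"deviation 14 days out: {deviations[i, 2]:+.1f} seats")
```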
288

Novel Low-Complexity SLM Schemes for PAPR Reduction in OFDM Systems

Lee, Kun-Sheng 10 August 2008 (has links)
Selected mapping (SLM) schemes are commonly employed to reduce the peak-to-average power ratio (PAPR) in orthogonal frequency division multiplexing (OFDM) systems. It has been shown that the computational complexity of the traditional SLM scheme can be substantially reduced by adopting conversion vectors, obtained as the inverse fast Fourier transform (IFFT) of the phase rotation vectors, in place of the conventional IFFT operations [21]. Unfortunately, the elements of the phase rotation vectors corresponding to the conversion vectors in [21] do not in general have equal magnitude, and a significant degradation in bit error rate (BER) performance is therefore incurred. This problem can be remedied by utilizing conversion vectors having the form of a perfect sequence. This paper presents three novel classes of perfect sequence, each of which comprises certain base vectors and their cyclic-shifted versions. Three novel low-complexity SLM schemes are then proposed based upon the unique structures of these perfect sequences. It is shown that while the PAPR performance of the proposed schemes is marginally poorer than that of the traditional SLM scheme, the three schemes achieve an identical BER performance with substantially lower computational complexity.
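For context, here is a sketch of the traditional SLM baseline whose complexity the conversion-vector schemes reduce: generate U candidate phase rotations, run one IFFT per candidate, and transmit whichever candidate has the lowest PAPR. The QPSK mapping, U = 8, and N = 256 are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def traditional_slm(X, U=8):
    """Traditional SLM: U candidate phase rotations, U IFFTs, keep the best."""
    N = X.size
    best, best_papr = None, np.inf
    for _ in range(U):
        phases = np.exp(1j * rng.choice([0, np.pi / 2, np.pi, 3 * np.pi / 2], N))
        x = np.fft.ifft(X * phases)        # one IFFT per candidate
        p = papr_db(x)
        if p < best_papr:
            best, best_papr = x, p
    return best, best_papr

# QPSK symbol on 256 subcarriers (illustrative)
bits = rng.integers(0, 2, (2, 256))
X = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)
_, p = traditional_slm(X)
print("PAPR: %.2f dB original, %.2f dB after SLM"
      % (papr_db(np.fft.ifft(X)), p))
```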
289

Novel Low-Complexity SLM Schemes for PAPR Reduction in OFDMA Uplink Systems

Xie, Jia-Cheng 10 August 2008 (has links)
One of the major drawbacks of multi-carrier systems is the high peak-to-average power ratio (PAPR) of the transmitted signals. This paper proposes novel low-complexity selective mapping (SLM) schemes for PAPR reduction that are applicable to interleaved-4 orthogonal frequency division multiple access (OFDMA) uplink systems. The novel scheme needs only one inverse fast Fourier transform (IFFT) block, because the phases of the transmitted signals in the frequency domain are rotated by circular convolution with conversion vectors in the time domain. Moreover, a special set of conversion vectors is proposed that can be computed with low complexity while reducing the PAPR effectively. In the proposed scheme, different conversion vectors and appropriate subcarrier mappings are selected for different users. The scheme thus provides a practical low-complexity method for PAPR reduction in interleaved-4 OFDMA uplink systems, while its bit error rate (BER) performance is as good as that of the SLM scheme.
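The single-IFFT trick rests on the convolution theorem: multiplying by a phase rotation vector in the frequency domain is equivalent to circularly convolving the time-domain signal with that vector's IFFT, so new candidates need no further IFFTs. A numerical check of the identity, under illustrative assumptions (random symbol and phases, N = 16):

```python
import numpy as np

N = 16
rng = np.random.default_rng(1)
X = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # frequency-domain symbol
P = np.exp(1j * rng.uniform(0, 2 * np.pi, N))             # phase rotation vector

# Direct route: rotate in the frequency domain, then one IFFT per candidate
direct = np.fft.ifft(X * P)

# Single-IFFT route: circularly convolve ifft(X) with the conversion vector ifft(P)
x, p = np.fft.ifft(X), np.fft.ifft(P)
circ = np.array([np.sum(x * p[(k - np.arange(N)) % N]) for k in range(N)])

print(np.allclose(direct, circ))   # True: ifft(X*P) equals the circular convolution
```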
290

Semiparametric estimation of unimodal distributions

Looper, Jason K. January 2003 (has links)
One often wishes to understand the probability distribution of stochastic data from experiments or computer simulations. Where no model is given, practitioners must resort to parametric or non-parametric methods to gain information about the underlying distribution. Others have initially used a nonparametric estimator to understand the underlying shape of a data set, and then later returned with a parametric method to locate the peaks; however, they were interested in estimating spectra, which may have multiple peaks, whereas this work is concerned with approximating the peak position of a single-peak probability distribution. One method of analyzing a distribution of data is to fit a curve to, or smooth, the data; polynomial regression and least-squares fitting are examples of smoothing methods. Understanding of the underlying distribution can be obscured depending on the degree of smoothing: problems such as under- and over-smoothing must be addressed in order to determine the shape of the underlying distribution, and smoothing of skewed data can give a biased estimate of the peak position. We propose two new approaches for statistical mode estimation based on the assumption that the underlying distribution has only one peak. The first method imposes the global constraint of unimodality locally, by requiring negative curvature over some domain. The second method performs a search that assumes a position of the distribution's peak and requires positive slope to the left and negative slope to the right. Each approach entails a constrained least-squares fit to the raw cumulative probability distribution. We compare the relative efficiencies [12] of these two estimators in finding the peak location for artificially generated data from known families of distributions (Weibull, beta, and gamma); within each family a parameter controls the skewness or kurtosis, quantifying the shapes of the distributions for comparison. We also compare our methods with other estimators such as the kernel-density estimator, the adaptive histogram, and polynomial regression. We find that our estimators do not perform better than the other known estimators, and that they are biased; overall, an adaptation of kernel estimation proved to be the most efficient. The results of the work done in this thesis will be submitted, in a different form, for publication by D.A. Rabson and J.K. Looper.
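As a point of comparison for the constrained least-squares estimators, a minimal sketch of the kernel-density mode estimator, the family the thesis found most efficient. The gamma sample and grid resolution are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
data = rng.gamma(shape=3.0, scale=2.0, size=500)   # skewed, single-peak sample
true_mode = (3.0 - 1) * 2.0                        # gamma mode = (k - 1) * theta

kde = gaussian_kde(data)                           # smooth the sample
grid = np.linspace(data.min(), data.max(), 2000)
mode_estimate = grid[np.argmax(kde(grid))]         # peak of the smoothed density
print(f"KDE mode estimate: {mode_estimate:.2f} (true mode {true_mode:.2f})")
```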
