1 |
Gaussian Integer Sequences of Length 4n with Ideal Periodic Auto-Correlation Function
Chen, I-sheng, 27 July 2009 (has links)
Many researchers have developed polyphase sequences with an ideal periodic auto-correlation function, so-called "perfect sequences" or "ideal sequences". Many communication-system applications depend on sequences with good auto-correlation properties, e.g., synchronization, channel estimation, and multiple access. Such sequences cannot maintain the ideal property in implementation because of quantization error in the transmitter's digital signal processing. In contrast, we develop a novel set of perfect sequences, Gaussian Integer Perfect Sequences (GIPS), which contain only Gaussian integers. In this paper, we construct them by linear combination and cyclic shift of eight base sequences. We present the design and basic properties of the sequences, as well as a design method for the sequences with the smallest dynamic range.
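To make the ideal periodic auto-correlation property concrete, here is a short check on a classic length-4 perfect sequence (a standard textbook example, not one of the paper's eight base sequences); scaling by a Gaussian integer such as 1+j keeps every entry in Z[i] and preserves perfection:

```python
import numpy as np

def periodic_autocorr(s):
    """Periodic auto-correlation R[tau] = sum_n s[n] * conj(s[(n+tau) mod N])."""
    s = np.asarray(s, dtype=complex)
    return np.array([np.sum(s * np.conj(np.roll(s, -tau))) for tau in range(len(s))])

# A classic length-4 perfect sequence; multiplying by a Gaussian integer
# such as (1 + 1j) keeps all entries in Z[i] and preserves perfection.
base = np.array([1, 1, 1, -1])
gaussian = (1 + 1j) * base

for seq in (base, gaussian):
    r = periodic_autocorr(seq)
    # Ideal periodic auto-correlation: all out-of-phase lags are exactly zero.
    assert np.allclose(r[1:], 0)
    print(r[0])  # in-phase value equals the sequence energy
```

The out-of-phase lags vanish for both sequences, which is precisely the "ideal" property the abstract refers to.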
|
2 |
AVALIAÇÃO DE UM PROCESSO DE ELETROGALVANIZAÇÃO POR MEIO DE MODELAGEM ESTATÍSTICA E CARTAS DE CONTROLE / ASSESSMENT OF AN ELECTROLYTIC GALVANIZING PROCESS THROUGH STATISTICAL MODELING AND CONTROL CHARTS
Andara, Flávio Roberto, 07 July 2015 (has links)
Quality tools, and control charts in particular, are important statistical resources for understanding and monitoring production processes. Their goal is to identify the common and assignable causes acting on a process so that, through monitoring, its stability can be increased and it can be judged to be in control. The dynamics of today's industrial activities have created new requirements for effective monitoring, and new control tools have been developed that can capture new causal relationships among variables. This research applies three modeling methodologies for treating autocorrelated data in order to monitor an electroplating production process. A descriptive analysis was first carried out to check the assumptions of normality and independence; Box-Jenkins ARIMA models, ARMAX models, and multiple linear regression (MLR) models were then fitted, followed by the construction of residual control charts. Besides its academic contribution, the work presents more than one application of control charts in an industrial setting and shows the company where the research was developed which of the methods is most effective for controlling production. When confronted with the conventional control method, i.e., monitoring without treating the autocorrelation, the best result among the three statistical methodologies was obtained with the ARIMA model followed by control charts applied to its residuals. The most effective modeling methodology for the electroplating process was chosen by the number of points found outside the conventional control limits; the one that best captured the fluctuations of the process was the ARIMA residual approach.
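A minimal numpy sketch of the residual-chart idea described above (a hand-rolled AR(1) fit on simulated data standing in for the electroplating measurements, which are not public, rather than a full Box-Jenkins ARIMA fit):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated autocorrelated process measurement (AR(1) with phi = 0.8),
# standing in for the electroplating data.
n, phi = 500, 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Fit AR(1) by least squares on the lag-1 regression x[t] ~ x[t-1].
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
resid = x[1:] - phi_hat * x[:-1]

# Shewhart-style individuals chart on the residuals: mean +/- 3 sigma.
center, sigma = resid.mean(), resid.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out = np.sum((resid > ucl) | (resid < lcl))
print(f"phi_hat={phi_hat:.2f}, out-of-control points: {out}")
```

Charting the residuals rather than the raw series is what removes the autocorrelation that would otherwise inflate the false-alarm rate of a conventional chart.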
|
3 |
Low complexity UWB receivers with ranging capabilities
Rabbachin, A. (Alberto), 16 May 2008 (has links)
Abstract
This Thesis examines low-complexity receiver structures for impulse-radio (IR) ultra-wideband (UWB) systems to be used in wireless sensor network applications. Such applications require radio communication solutions characterized by low cost, low-complexity hardware, and low power consumption to provide very long battery life.
Analysis of several auto-correlation receiver (AcR) structures is performed in the presence of additive white Gaussian noise to identify receiver structures that offer a good compromise between implementation complexity and data communication performance.
The classes of receiver that demonstrate the best complexity/performance trade-off are shown to be the AcR utilising transmitted-reference with binary pulse amplitude modulation signaling, and the energy detector (ED) utilising binary pulse position modulation. The analysis of these two schemes is extended to consider multipath fading channels. Numerically integrable bit error probability (BEP) expressions are derived in order to evaluate the receivers' performance in the presence of fading distributions characterized by closed-form characteristic functions. Simulations utilising widely accepted UWB channel models are then used to evaluate the BEP in different indoor environments.
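A Monte Carlo estimate of the kind of BEP curve discussed here can be sketched for the ED with binary PPM in AWGN (a toy discrete baseband model, not the Thesis's exact signal model or derived expressions):

```python
import numpy as np

rng = np.random.default_rng(1)

def ed_2ppm_ber(snr_db, n_bits=20000, slot_len=8):
    """Monte Carlo BER of an energy detector with binary PPM in AWGN.
    Each bit occupies one of two time slots; the receiver decides by
    comparing the energies collected in the two slots."""
    eb = 10 ** (snr_db / 10)       # energy per bit, noise variance 1 per sample
    amp = np.sqrt(eb / slot_len)   # spread the bit energy across the slot
    bits = rng.integers(0, 2, n_bits)
    noise = rng.normal(size=(n_bits, 2, slot_len))
    sig = np.zeros((n_bits, 2, slot_len))
    sig[np.arange(n_bits), bits, :] = amp
    energy = np.sum((sig + noise) ** 2, axis=2)   # slot energies
    decisions = np.argmax(energy, axis=1)
    return np.mean(decisions != bits)

print(ed_2ppm_ber(6), ed_2ppm_ber(14))  # BER falls as SNR rises
```

The non-coherent energy comparison needs no channel estimation, which is exactly the complexity advantage the abstract attributes to the ED.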
Since UWB systems share frequency spectrum with many narrowband (NB) systems, and need to coexist with other UWB systems, the performance of low complexity receivers can be seriously affected by interference. In the presence of NB interference, two cases have been considered: 1) single NB interference, where the interfering node is located at a fixed distance from the receiver, and 2) multiple NB interference, where the interfering nodes are scattered according to a spatial Poisson process. When considering UWB interference, the case of multiple sources of interference has been considered. For both the multiple NB and the multiple UWB interference cases, the model derived considers several interference parameters, which can be integrated into BEP formulations for quick performance evaluations. The framework is sufficiently simple to allow tractable analysis and can serve as a guideline for the design of heterogeneous networks where coexistence between UWB systems and NB systems is of importance.
The very large bandwidth of UWB signals offers an unprecedented possibility for accurate ranging operations. Signal leading-edge estimation algorithms based on average maximum likelihood estimators are derived considering different multipath channel fading distributions. Suboptimal solutions are proposed and investigated in order to support ranging capabilities in low complexity receiver structures. The ability to identify line-of-sight and non-line-of-sight conditions with the ED-based receiver is also addressed.
An example of an IR-UWB low complexity transceiver based on ED for sensor network applications is proposed in this Thesis. Ad-hoc solutions for pulse transmission, synchronization and data detection are developed.
|
4 |
Thermo-Hydrological-Mechanical Analysis of a Clay Barrier for Radioactive Waste Isolation: Probabilistic Calibration and Advanced Modeling
Dontha, Lakshman, May 2012 (has links)
The engineered barrier system is a basic element in the design of a repository to isolate high-level radioactive waste (HLW). In this system, the clay barrier plays a prominent role in dispersing the heat generated by the waste, reducing the flow of pore water from the host rock, and maintaining the structural stability of the waste canister. The compacted expansive clay (generally bentonite blocks) is initially in an unsaturated state. During the lifetime of the repository, the barrier will undergo coupled thermal, hydrological, and mechanical (THM) phenomena due to heating (from the heat-emitting nuclear waste) and hydration (from the saturated host rock). The design of nuclear waste disposal requires predicting the long-term behavior of the barrier (i.e., over hundreds or thousands of years), so numerical modeling is a basic component of repository design. The numerical analyses are performed using a mathematical THM formulation and an associated numerical code. Constitutive models are an essential part of the numerical simulations; they represent the intrinsic behavior of the material for each individual physical phenomenon (thermal, hydraulic, and mechanical). Deterministic analyses have shown the potential of such mathematical formulations to describe the physical behavior of the engineered barrier system. However, the effect of the inherent uncertainties associated with the different constitutive models on the global behavior of the isolation system has not yet been explored.
The first part of this thesis applies recent probabilistic methods to understand and assess the impact of these uncertainties on the global THM model response. Experimental data from the FEBEX project are adopted for the case study presented in this thesis. CODE_BRIGHT, a fully coupled THM finite element program, is used to perform the numerical THM analysis.
The second part of this thesis focuses on the complex mechanical behavior observed in a barrier material subjected (during 5 years) to heating and hydration under actual repository conditions. The studied experiment is the (ongoing) full-scale in-situ FEBEX test at the Grimsel test site, Switzerland. A partial dismantling of this experiment has allowed the inspection of the barrier material subjected to varying stresses due to hydration and heating. The clay underwent both elastic and plastic volumetric deformations at different suction and temperature levels, with changes in the pre-consolidation pressure and void ratio that are difficult to explain with conventional models. In this thesis a double-structure elasto-plastic model is proposed to study the mechanical behavior of this barrier material. The numerical modeling was performed with CODE_BRIGHT. The study shows that the double-structure model satisfactorily explains the observed changes in the mechanical behavior of the clay material.
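CODE_BRIGHT solves the fully coupled THM problem; as a far simpler illustration of just the thermal leg of such an analysis, an explicit finite-difference solution of 1-D transient heat conduction across a slab (illustrative properties, not FEBEX bentonite values) looks like this:

```python
import numpy as np

# Explicit 1-D transient heat conduction through a barrier slab:
#   dT/dt = alpha * d2T/dx2
# Material properties and geometry are illustrative only.
alpha = 1e-6                 # thermal diffusivity, m^2/s (assumed)
L, nx = 0.7, 71              # slab thickness (m) and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx ** 2 / alpha   # stable explicit time step (r = 0.4 < 0.5)

T = np.full(nx, 25.0)        # initial temperature, deg C
T_hot, T_cold = 100.0, 25.0  # canister side / host-rock side boundaries

for _ in range(20000):
    T[0], T[-1] = T_hot, T_cold
    T[1:-1] += (alpha * dt / dx ** 2) * (T[2:] - 2 * T[1:-1] + T[:-2])

# After enough steps the profile approaches the linear steady state.
print(T[nx // 2])
```

A real barrier analysis couples this thermal field to pore-water flow and mechanical deformation, which is what makes the THM formulation (and its constitutive uncertainty) the subject of the thesis.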
|
5 |
A Design of Karaoke Music Retrieval System by Acoustic Input
Tsai, Shiu-Iau, 11 August 2003 (has links)
The objective of this thesis is to design a system that retrieves songs from acoustic input. The system listens to a melody or partial song sung by the karaoke user, and then presents the user with the complete song passage. Note segmentation is performed using both the magnitude of the signal and the k-nearest-neighbor technique. To speed up the system, the pitch-period estimation algorithm is reformulated using a result from communication theory. In addition, a large database of popular music was built to make the system more practical.
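The auto-correlation method is the standard baseline for the pitch-period estimation mentioned above (the thesis's communications-theory speedup is not reproduced here). A minimal sketch on a synthetic tone:

```python
import numpy as np

def pitch_autocorr(x, fs, f_lo=80.0, f_hi=500.0):
    """Estimate pitch (F0) from the peak of the auto-correlation,
    searched within a lag range plausible for singing voice."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]   # lags 0..N-1
    lo, hi = int(fs / f_hi), int(fs / f_lo)
    lag = lo + np.argmax(r[lo:hi])
    return fs / lag

fs = 8000
t = np.arange(int(0.05 * fs)) / fs
tone = np.sin(2 * np.pi * 220.0 * t)        # synthetic 220 Hz "sung" note
print(pitch_autocorr(tone, fs))             # close to 220 Hz
```

The pitch contour extracted frame by frame this way is what a query-by-humming system matches against its melody database.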
|
6 |
Experimental Study of Roughness Effect on Turbulent Shear Flow Downstream of a Backward Facing Step
Essel, Ebenezer Ekow, 16 January 2014 (has links)
An experimental study was undertaken to investigate the effect of roughness on the characteristics of separated and reattached turbulent shear flow downstream of a backward-facing step. The particle image velocimetry technique was used to conduct refined velocity measurements over a reference smooth acrylic wall and over rough walls produced from 36-grit and 24-grit sandpaper positioned downstream of a backward-facing step, one after another. Each experiment was conducted at a Reynolds number of 7050, based on the step height and centerline mean velocity. The results showed that the 36-grit and 24-grit sandpaper increased the reattachment length by 5% and 7%, respectively, compared with the value obtained over the smooth wall. The distributions of the mean velocities, Reynolds stresses, triple velocity correlations, and turbulence production are used to examine roughness effects on the flow field downstream of the backward-facing step. The two-point auto-correlation function and proper orthogonal decomposition (POD) are also used to investigate the impact of wall roughness on the large-scale structures.
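As a sketch of the snapshot-POD step mentioned above (on synthetic data, since the actual PIV fields are not public), the modal energy fractions follow directly from an SVD of the mean-subtracted snapshot matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Snapshot POD via SVD: rows = time snapshots, columns = spatial points.
# Synthetic "velocity fluctuation" field: two coherent modes plus noise.
x = np.linspace(0, 2 * np.pi, 64)
t = np.linspace(0, 10, 200)[:, None]
snapshots = (np.sin(x) * np.cos(2 * t)
             + 0.5 * np.sin(2 * x) * np.sin(3 * t)
             + 0.05 * rng.normal(size=(200, 64)))
snapshots -= snapshots.mean(axis=0)          # subtract the temporal mean

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = s ** 2 / np.sum(s ** 2)             # modal energy fractions
print(energy[:4])  # the two planted modes carry most of the energy
```

Ranking modes by energy fraction is how POD isolates the large-scale structures that roughness is expected to modify.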
|
7 |
Metody pro vylepšení kvality digitálního obrazu / Methods for enhancing quality of digital images
Svoboda, Radovan, January 2010 (has links)
With the arrival of affordable digital technology, we increasingly come into contact with digital images. Cameras are no longer dedicated devices but part of almost every mobile phone, PDA, and laptop. This thesis discusses methods for enhancing the quality of digital images, focusing on noise removal, creation of high-dynamic-range (HDR) images, and extension of depth of field (DOF). It introduces the technical means of acquiring digital images and explains the origin of image noise. Attention then turns to HDR: the term itself, its physical basis, the difference between HDR sensing and HDR display, and a survey and historical development of methods for creating HDR images. The next part explains DOF in display, the physical basis of the phenomenon, and reviews methods used for DOF extension. The thesis addresses the problem of acquiring the images needed for the given tasks and proposes an acquisition method, which was used to create a database of test images for each task. Part of the thesis also covers the design of a program that implements the discussed methods. With the help of the proposed imgmap class, the quality of output images is improved by modifying maps of the input images. The thesis describes the methods, improvements, means of setting parameters and their effects on the algorithms, and control of the program through the proposed GUI. Finally, a comparison with free software for extending DOF is presented: the proposed software provides at least comparable results, and correct parameter settings for specific cases can yield better properties in the resulting image. Processing times are worse because the software was not optimized.
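As a toy illustration of the noise-removal theme (not the method implemented in the thesis's program), averaging N aligned exposures of a static scene attenuates zero-mean sensor noise by roughly a factor of sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(3)

# Temporal frame averaging: average N aligned noisy exposures of a
# static scene; zero-mean noise std shrinks by about sqrt(N).
clean = rng.uniform(0, 255, size=(64, 64))          # stand-in "scene"
frames = clean + rng.normal(0, 20, size=(16, 64, 64))

single_err = np.std(frames[0] - clean)              # noise of one exposure
stacked = frames.mean(axis=0)
stacked_err = np.std(stacked - clean)               # noise after stacking
print(single_err, stacked_err)   # roughly 20 vs 20 / sqrt(16) = 5
```

The same multi-exposure acquisition underlies HDR merging and focus stacking, which is why the thesis proposes a dedicated image-acquisition method for its test database.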
|
8 |
Modelos de volatilidade estatística / Statistical volatility models
Ishizawa, Danilo Kenji, 22 August 2008 (links)
Previous issue date: 2008-08-22 / In the financial market usually notices are taken of the shares
sequentially over the time in order to characterize them a time
series. However, the major interest is to forecast the behavior of these shares. Motivated by this fact, a lot of models were created based on the past information considering constant averages and variance over time. Although, in financial series a feature often presented is called volatility, which can be noticed by the variance
to vary in time. In order to catch this characteristic were developed the models of the family GARCH, that model the conditional variance through known information. These models were well used and have passed by many formulation modifications to be able to catch different effects, such as the effect leverage EGARCH. Thus, the goal is to estimate volatility patterns obeying the specifications of the family GARCH verifying which ones of them describe better the data inside and outside the sample. / No mercado financeiro costuma-se fazer observações sobre as
carteiras sequencialmente ao longo do tempo, caracterizando uma série temporal. Contudo, o maior interesse está em prever o comportamento destas carteiras. Motivado por este fato, foram criados muitos modelos de previsão baseando-se em observações passadas considerando a média e variância constantes no tempo. Porém, nas séries financeiras uma característica muito presente é a chamada volatilidade, que pode ser observada pela variância não constante no tempo. A fim de captar esta característica, desenvolveram-se os modelos da família GARCH, que modelam a variância condicional através de informações passadas. Estes
modelos foram muito utilizados e sofreram muitas modificações nas formulações para poderem captar diferentes efeitos, como o efeito de leverage (EGARCH). Assim, deseja-se estimar modelos de volatilidade obedecendo às especificações da família GARCH, verificando quais deles descrevem melhor os dados dentro e fora da amostra.
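To make the conditional-variance recursion concrete, here is a minimal simulation of a GARCH(1,1) process with illustrative (not estimated) parameters; it also shows that, given the parameters, the conditional variance can be filtered back from the returns alone, since the recursion is deterministic:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a GARCH(1,1) return series:
#   sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1]
omega, alpha, beta = 0.05, 0.10, 0.85     # illustrative parameters
n = 5000
r = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = omega / (1 - alpha - beta)    # start at the unconditional variance
for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()

# Filter the conditional variance back from the returns with the
# same recursion -- it reproduces sigma2 exactly.
s2 = np.empty(n)
s2[0] = sigma2[0]
for t in range(1, n):
    s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]

assert np.allclose(s2, sigma2)
print(r.var(), omega / (1 - alpha - beta))  # sample vs unconditional variance
```

In practice omega, alpha, and beta are estimated by maximum likelihood, and variants such as EGARCH alter the recursion to let negative returns raise volatility more than positive ones (the leverage effect).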
|
9 |
模糊資料之相關係數研究及其應用 / Evaluating Correlation Coefficient with Fuzzy Data and Its Applications
楊志清, Yang, Chih Ching, Unknown Date (has links)
The classical Pearson correlation coefficient has been widely adopted across many fields to measure the strength of the linear relationship between two variables, with the sign of the coefficient indicating the direction of the relationship. In recent years, however, the interpretation of natural, social, and economic phenomena has become increasingly pluralistic, and data analysis has moved from single values or averages toward multi-valued representations. Real-world data, whether environmental or socio-economic, are often collected in fuzzy form, and when the data consist of fuzzy interval values, the traditional Pearson approach cannot be applied. This study therefore proposes a simple, easily understood calculation of a correlation coefficient for fuzzy interval data, used here to measure the mutual influence of two sets of fuzzy intervals, with an application to stocks. In addition, in time series analysis the auto-correlation function (ACF) is used to assess stationarity, but the classical ACF likewise cannot be applied when the observations are fuzzy intervals. We propose two approaches, the ACF with the fuzzy data of center and length (CLACF) and the ACF with fuzzy interval data (FIACF), to compute auto-correlation coefficients for fuzzy interval data, and we use Monte Carlo simulation to illustrate the evaluation methods. Finally, an empirical study of the daily (low, high) prices on the Centralized Securities Trading Market examines the performance of CLACF and FIACF in measuring lagged effects in fuzzy interval data; the results show that CLACF and FIACF reveal the lag structure more readily than the classical ACF applied to closing prices.
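The exact CLACF and FIACF constructions are defined in the full text; as a loose illustration only (the thesis's formulas may differ), one can apply the classical sample ACF separately to the centers and the lengths of interval-valued data:

```python
import numpy as np

def sample_acf(x, lag):
    """Classical sample auto-correlation at a given positive lag."""
    x = np.asarray(x, float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(5)

# Fuzzy interval data as (low, high) pairs, in the spirit of daily
# (low, high) prices.  This center/length treatment is only one
# plausible reading of CLACF, not the thesis's exact definition.
low = np.cumsum(rng.normal(size=300)) + 100   # persistent "price" level
high = low + rng.uniform(0.5, 2.0, size=300)  # independent interval widths
center = (low + high) / 2
length = high - low

for lag in (1, 2, 5):
    print(lag, sample_acf(center, lag), sample_acf(length, lag))
```

On this synthetic example the centers inherit the strong lag structure of the price level, while the independently drawn lengths show essentially no auto-correlation, the kind of separation a center/length decomposition is meant to expose.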
|
10 |
Origins of genetic variation and population structure of foxsnakes across spatial and temporal scales
ROW, JEFFREY, 11 January 2011 (has links)
Understanding the events and processes responsible for patterns of within-species diversity provides insight into major evolutionary themes such as adaptation, species distributions, and ultimately speciation itself. Here, I combine ecological, genetic, and spatial perspectives to evaluate the roles that both historical and contemporary factors have played in shaping the population structure and genetic variation of foxsnakes (Pantherophis gloydi).
First, I determine the likely impact of habitat loss on population distribution through radio-telemetry (32 individuals) at two locations that vary in habitat patch size. As predicted, individuals showed similar habitat use at both sites but restricted their movements to patches of suitable habitat at the more disturbed site. Occurrence records across a fragmented region were also non-randomly distributed and located close to patches of usable habitat, suggesting that habitat distribution limits population distribution.
Next, I combined habitat suitability modeling with population genetics (589 individuals, 12 microsatellite loci) to infer how foxsnakes disperse through a mosaic of natural and altered landscape features. Boundary regions between genetic clusters consisted of low-suitability habitat (e.g., agricultural fields). Island populations were grouped into a single genetic cluster, suggesting open water presents less of a barrier than unsuitable terrestrial habitat. Isolation-by-distance models correlated more strongly with the genetic data when they incorporated resistance values derived from habitat suitability maps, suggesting that habitat degradation limits dispersal for foxsnakes.
At larger temporal and spatial scales, I quantified patterns of genetic diversity and population structure using mitochondrial (101 cytochrome b sequences) and microsatellite (816 individuals, 12 loci) DNA, and used Approximate Bayesian Computation to test competing models of demographic history. Supporting my predictions, models in which populations underwent size drops and splitting events consistently received more support than models in which small founding populations expanded to stable sizes. Based on the timing, the most likely cause was the cooling of temperatures and the infilling of deciduous forest since the Hypsithermal. On a smaller scale, evidence suggested that anthropogenic habitat loss has caused further decline and fragmentation. Mitochondrial DNA structure did not correspond to the fragmented populations, and the majority of foxsnakes shared an identical haplotype, suggesting a past bottleneck or selective sweep. / Thesis (Ph.D, Biology) -- Queen's University, 2011-01-11 10:40:52.476
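Isolation-by-distance comparisons like the one described above are often scored with a Mantel test. The following toy sketch (synthetic coordinates and a hypothetical noise level, standing in for genetic and resistance distances) correlates two distance matrices and attaches a permutation p-value:

```python
import numpy as np

rng = np.random.default_rng(6)

def mantel(d1, d2, n_perm=999):
    """Simple Mantel test: Pearson correlation between the upper
    triangles of two distance matrices, with a permutation p-value
    obtained by jointly permuting the rows and columns of d1."""
    iu = np.triu_indices_from(d1, k=1)
    a, b = d1[iu], d2[iu]
    r_obs = np.corrcoef(a, b)[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(d1.shape[0])
        r_p = np.corrcoef(d1[np.ix_(p, p)][iu], b)[0, 1]
        count += r_p >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# Toy example: "genetic" distance = "geographic" distance plus noise,
# standing in for resistance distances from a habitat suitability map.
pts = rng.uniform(0, 100, size=(20, 2))
geo = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
gen = geo + rng.normal(0, 5, size=geo.shape)
gen = (gen + gen.T) / 2                      # keep the matrix symmetric
np.fill_diagonal(gen, 0)
r, p = mantel(geo, gen)
print(r, p)   # strong correlation, small p-value
```

Replacing straight-line geographic distance with resistance distance through a habitat suitability surface is the comparison that supported habitat degradation as a limit on foxsnake dispersal.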
|