21 |
Scaling conditional random fields for natural language processing / Cohn, Trevor A. January 2007 (has links)
Thesis (Ph.D.)--University of Melbourne, Dept. of Computer Science and Software Engineering, Faculty of Engineering, 2007. / Typescript. Includes bibliographical references (leaves 171-179).
|
22 |
Techniques d'optimisation déterministe et stochastique pour la résolution de problèmes difficiles en cryptologie / Deterministic and stochastic optimization techniques for hard problems in cryptology / Bouallagui, Sarra 05 July 2010 (has links)
Cette thèse s'articule autour des fonctions booléennes liées à la cryptographie et la cryptanalyse de certains schémas d'identification. Les fonctions booléennes possèdent des propriétés algébriques fréquemment utilisées en cryptographie pour constituer des S-Boxes (tables de substitution). Nous nous intéressons, en particulier, à la construction de deux types de fonctions : les fonctions courbes et les fonctions équilibrées de haut degré de non-linéarité. Concernant la cryptanalyse, nous nous focalisons sur les techniques d'identification basées sur les problèmes de perceptron et de perceptron permuté. Nous réalisons une nouvelle attaque sur le schéma afin de décider de sa faisabilité. Nous développons ici des nouvelles méthodes combinant l'approche déterministe DCA (Difference of Convex functions Algorithm) et heuristique (recuit simulé, entropie croisée, algorithmes génétiques...). Cette approche hybride, utilisée dans toute cette thèse, est motivée par les résultats intéressants de la programmation DC. / In cryptography, especially in block cipher design, Boolean functions are the basic elements. A cryptographic function should have high nonlinearity, since functions of low nonlinearity can be attacked by linear methods. There are three goals for the research presented in this thesis: finding a new construction algorithm, based on a deterministic model, for Boolean functions of the highest possible nonlinearity in even dimension, that is, bent functions; finding highly nonlinear balanced Boolean functions; and cryptanalysing an identification scheme based on the perceptron problem. Heuristic optimisation algorithms (genetic algorithms and simulated annealing) and a deterministic one based on DC programming (DCA) were used together.
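Both constructions in the English summary are driven by the same figure of merit: the nonlinearity of a Boolean function, read off its Walsh-Hadamard spectrum. A minimal sketch of that computation, with an illustrative example function rather than one of the thesis's constructions:

```python
def walsh_hadamard(signs):
    """Fast Walsh-Hadamard transform of a truth table given in +/-1 form."""
    w = list(signs)
    n = len(w)
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                a, b = w[j], w[j + step]
                w[j], w[j + step] = a + b, a - b
        step *= 2
    return w

def nonlinearity(truth_table):
    """Nonlinearity of an n-variable Boolean function: 2^(n-1) - max|W_f| / 2."""
    signs = [(-1) ** bit for bit in truth_table]
    spectrum = walsh_hadamard(signs)
    return len(truth_table) // 2 - max(abs(v) for v in spectrum) // 2

# Example: f(x1,...,x4) = x1*x2 XOR x3*x4 is bent on 4 variables and reaches
# the even-dimension maximum 2^(n-1) - 2^(n/2 - 1) = 6.
tt = [(x >> 3 & 1) * (x >> 2 & 1) ^ (x >> 1 & 1) * (x & 1) for x in range(16)]
print(nonlinearity(tt))  # -> 6
```

Heuristic searches such as simulated annealing or genetic algorithms typically use this value (or the full spectrum) as the fitness to maximise.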
|
23 |
Métodos de estatística bayesiana e máxima entropia aplicados na análise de dados em eventos de raios cósmicos / Bayesian statistics and maximum entropy methods applied in cosmic ray events data analysis / Perassa, Eder Arnedo, 1982- 13 December 2017 (has links)
Orientador: José Augusto Chinellato / Tese (doutorado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Previous issue date: 2017 / Resumo: Neste trabalho, estudamos os métodos de estatística bayesiana e máxima entropia na análise de dados em eventos de raios cósmicos. Inicialmente, fizemos um resumo sobre o desenvolvimento da física de raios cósmicos em que descrevemos alguns resultados teóricos e experimentais recentes. A seguir, apresentamos uma breve revisão do método bayesiano e o aplicamos na determinação da composição em massa dos primários em eventos de raios cósmicos. Além disso, introduzimos o método de máxima entropia e propomos um método de parametrização do perfil longitudinal de chuveiros atmosféricos extensos. Em todas as aplicações, foram mostrados os algoritmos desenvolvidos e os resultados obtidos a partir de dados de eventos simulados. Os resultados indicaram que tais métodos podem ser utilizados satisfatoriamente como ferramentas na análise de dados em eventos de raios cósmicos / Abstract: In this work, we study Bayesian statistics and maximum entropy methods for the analysis of cosmic ray event data. First, we summarize developments in cosmic ray physics, describing some recent theoretical and experimental results. We then present a brief review of the Bayesian method and apply it to the problem of determining the mass composition of primary cosmic rays. Moreover, we introduce the maximum entropy method and propose a method for parametrizing the longitudinal profile of extensive air showers. In all applications, we show the algorithms developed and the results obtained from simulated event data. The results indicate that such methods can be used satisfactorily as tools in the analysis of cosmic ray event data / Doutorado / Física / Doutor em Ciências / 277612/2007 / CAPES
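The Bayesian step described above amounts, in its simplest form, to updating a prior over candidate primaries with per-event likelihoods. A toy sketch with invented Gaussian models for the depth of shower maximum (the numbers and model names are illustrative, not from the thesis):

```python
import math

def gaussian(mu, sigma):
    """Return a normal density function with mean mu and width sigma."""
    return lambda x: math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(prior, likelihood, data):
    """Sequentially update a prior over hypotheses with independent observations."""
    post = dict(prior)
    for x in data:
        post = {h: p * likelihood[h](x) for h, p in post.items()}
        norm = sum(post.values())
        post = {h: p / norm for h, p in post.items()}
    return post

# Illustrative (invented) models for the shower maximum X_max, in g/cm^2:
prior = {"proton": 0.5, "iron": 0.5}
likelihood = {"proton": gaussian(750, 60), "iron": gaussian(650, 40)}
data = [760, 745, 770]                     # simulated shower maxima
print(posterior(prior, likelihood, data))  # mass is overwhelmingly assigned to "proton"
```

With deeper (larger) observed maxima, the posterior concentrates on the lighter primary, which is the qualitative behaviour the composition analysis relies on.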
|
24 |
Refinamento da estrutura cristalina de pigmentos baseados no sistema cerâmico / Ribeiro, Mauricio Aparecido 22 February 2010 (has links)
Previous issue date: 2010-02-22 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
|
25 |
Advanced Monte Carlo Methods with Applications in Finance / Joshua Chi Chun Chan Unknown Date (has links)
The main objective of this thesis is to develop novel Monte Carlo techniques with emphasis on various applications in finance and economics, particularly in the fields of risk management and asset returns modeling. New stochastic algorithms are developed for rare-event probability estimation, combinatorial optimization, parameter estimation and model selection. The contributions of this thesis are fourfold. Firstly, we study an NP-hard combinatorial optimization problem, the Winner Determination Problem (WDP) in combinatorial auctions, where buyers can bid on bundles of items rather than bidding on them sequentially. We present two randomized algorithms, namely, the cross-entropy (CE) method and the ADAptive Multilevel splitting (ADAM) algorithm, to solve two versions of the WDP. Although an efficient deterministic algorithm has been developed for one version of the WDP, it is not applicable to the other version considered. In addition, the proposed algorithms are straightforward and easy to program, and do not require specialized software. Secondly, two major applications of conditional Monte Carlo for estimating rare-event probabilities are presented: a complex bridge network reliability model and several generalizations of the widely used normal copula model for managing portfolio credit risk. We show how certain efficient conditional Monte Carlo estimators developed for simple settings can be extended to handle complex models involving hundreds or thousands of random variables. In particular, by utilizing an asymptotic description of how the rare event occurs, we derive algorithms that are not only easy to implement, but also compare favorably to existing estimators. Thirdly, we make a contribution on the methodological front by proposing an improvement of the standard CE method for estimation.
The improved method is relevant, as recent research has shown that in some high-dimensional settings the likelihood ratio degeneracy problem becomes severe and the importance sampling estimator obtained from the CE algorithm becomes unreliable. In contrast, the performance of the improved variant does not deteriorate as the dimension of the problem increases. Its utility is demonstrated via a high-dimensional estimation problem in risk management, namely, a recently proposed t-copula model for credit risk. We show that even in this high-dimensional model, which involves hundreds of random variables, the proposed method performs remarkably well, and compares favorably to existing importance sampling estimators. Furthermore, the improved CE algorithm is then applied to estimating the marginal likelihood, a quantity that is fundamental in Bayesian model comparison and Bayesian model averaging. We present two empirical examples to demonstrate the proposed approach. The first example involves women's labor market participation, and we compare three different binary response models in order to find the one that best fits the data. The second example utilizes two vector autoregressive (VAR) models to analyze the interdependence and structural stability of four U.S. macroeconomic time series: GDP growth, unemployment rate, interest rate, and inflation. Lastly, we contribute to the growing literature of asset returns modeling by proposing several novel models that explicitly take into account various recent findings in the empirical finance literature. Specifically, two classes of stylized facts are particularly important. The first set is concerned with the marginal distributions of asset returns. One prominent feature of asset returns is that the tails of their distributions are heavier than those of the normal: large returns (in absolute value) occur much more frequently than one might expect from a normally distributed random variable.
Another robust empirical feature of asset returns is skewness, where the tails of the distributions are not symmetric: large losses are observed more frequently than large gains. The second set of stylized facts is concerned with the dependence structure among asset returns. Recent empirical studies have cast doubt on the adequacy of the linear dependence structure implied by the multivariate normal specification. For example, data from various asset markets, including equities, currencies and commodities markets, indicate the presence of extreme co-movement in asset returns, and this observation is again incompatible with the usual assumption that asset returns are jointly normally distributed. In light of the aforementioned empirical findings, we consider various novel models that generalize the usual normal specification. We develop efficient Markov chain Monte Carlo (MCMC) algorithms to estimate the proposed models. Moreover, since the number of plausible models is large, we perform a formal Bayesian model comparison to determine the model that best fits the data. In this way, we can directly compare the two approaches of modeling asset returns: copula models and the joint modeling of returns.
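The CE method that recurs throughout this abstract can be sketched in a few lines: sample candidates from a parametrised distribution, keep an elite fraction, refit the parameters to the elite, repeat. The objective below (one-max over bit vectors) is a stand-in for the WDP, not the auction model itself, and the parameter values are illustrative:

```python
import random

def cross_entropy_binary(score, n, samples=200, elite_frac=0.1, iters=30, alpha=0.7, seed=0):
    """Cross-entropy maximisation over {0,1}^n with independent Bernoulli parameters."""
    rng = random.Random(seed)
    p = [0.5] * n                                   # initial sampling distribution
    n_elite = max(1, int(samples * elite_frac))
    best = None
    for _ in range(iters):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)] for _ in range(samples)]
        pop.sort(key=score, reverse=True)
        if best is None or score(pop[0]) > score(best):
            best = pop[0]
        elite = pop[:n_elite]
        target = [sum(x[i] for x in elite) / n_elite for i in range(n)]
        p = [alpha * t + (1 - alpha) * q for t, q in zip(target, p)]  # smoothed CE update
    return best

# Toy objective: count of ones; the sampler should converge to the all-ones vector.
best = cross_entropy_binary(sum, 20)
print(sum(best))  # -> 20
```

The smoothing factor `alpha` keeps the Bernoulli parameters away from 0 and 1, a standard guard against the premature degeneration the abstract alludes to in high dimensions.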
|
26 |
Multi-objective optimisation using the cross-entropy method in CO gas management at a South African ilmenite smelter / Stadler, Johan George 12 1900 (has links)
Thesis (MScEng)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: In a minerals processing environment, stable production processes, cost minimisation and energy efficiency are key to operational excellence, safety and profitability. At an ilmenite smelter, typically found in the heavy minerals industry, it is no different. Management of an ilmenite smelting process is a complex, multi-variable challenge with high costs and safety risks at stake. A by-product of ilmenite smelting is superheated carbon monoxide (CO) gas, or furnace off-gas. This gas is flammable and extremely poisonous to humans. At the same time, the gas is a potential energy source for various on-site heating applications. Re-using furnace off-gas can increase the energy efficiency of the energy-intensive smelting process and can save on the cost of procuring other gas for heating purposes.
In this research project, the management of CO gas from the Tronox KZN Sands ilmenite smelter in South Africa was studied with the aim of optimising the current utilisation of the gas. In the absence of any buffer capacity in the form of a pressure vessel, the stability of the available CO gas is directly dependent on the stability of the furnaces. The CO gas has been identified as a partial replacement for methane gas which is currently purchased for drying and heating of feed material and pre-heating of certain smelter equipment. With no buffer capacity between the furnaces and the gas consuming plants, a dynamic prioritisation approach had to be found if the CO was to replace the methane. The dynamics of this supply-demand problem, which has been termed the “CO gas problem”, needed to be studied.
A discrete-event simulation model was developed to match the variable supply of CO gas to the variable demand for gas over time – the demand being a function of the availability of the plants requesting the gas, and of the feed rates and types of feed material processed at those plants. The problem was formulated as a multi-objective optimisation problem with two main, conflicting objectives: 1) the average production time lost per plant per day due to CO-methane switchovers; and 2) the average monthly saving on methane gas costs due to lower consumption thereof. A metaheuristic, multi-objective optimisation using the cross-entropy method (MOO CEM), was applied as the optimisation algorithm to solve the CO gas problem. The performance of the MOO CEM algorithm was compared with that of a recognised benchmark algorithm for multi-objective optimisation, the NSGA-II, when both were applied to the CO gas problem.
The background of multi-objective optimisation, metaheuristics and the usage of furnace off-gas, particularly CO gas, was investigated in the literature review. The simulation model was then developed and the optimisation algorithm applied.
The research aimed to comment on the merit of the MOO CEM algorithm for solving the dynamic, stochastic CO gas problem and on the algorithm’s performance compared to the benchmark algorithm. The results served as a basis for recommendations to Tronox KZN Sands in order to implement a project to optimise usage and management of the CO gas. / AFRIKAANSE OPSOMMING: In mineraalprosessering is stabiele produksieprosesse, kostebeperking en energie-effektiwiteit sleuteldrywers tot bedryfsprestasie, veiligheid en wins. ‘n Ilmenietsmelter, tipies aangetref in swaarmineraleprosessering, is geen uitsondering nie. Die bestuur van ‘n ilmenietsmelter is ‘n komplekse, multi-doelwit uitdaging waar hoë kostes en veiligheidsrisiko’s ter sprake is. ‘n Neweproduk van die ilmenietsmeltproses is superverhitte koolstofmonoksiedgas (CO gas). Hierdie gas is ontvlambaar en uiters giftig vir die mens. Terselfdertyd kan hierdie gas benut word as energiebron vir allerlei verhittingstoepassings. Die herbenutting van CO gas vanaf die smelter kan die energie-effektiwiteit van die energie-intensiewe smeltproses verhoog en kan verder kostes bespaar op die aankoop van ‘n ander gas vir verhittingsdoeleindes.
In hierdie navorsingsprojek is die bestuur van die CO gasstroom wat deur die ilmenietsmelter van Tronox KZN Sands in Suid-Afrika geproduseer word, ondersoek met die doel om die huidige benuttingsvlak daarvan te verbeter. Weens die afwesigheid van enige bufferkapasiteit in die vorm van ‘n drukbestande tenk, is die stabiliteit van CO gas beskikbaar vir hergebruik direk afhanklik van die stabiliteit van die twee hoogoonde wat die gas produseer. Die CO gas kan gedeeltelik metaangas, wat tans aangekoop word vir die droog en verhitting van voermateriaal en vir die voorverhitting van sekere smeltertoerusting, vervang. Met geen bufferkapasiteit tussen die hoogoonde en die aanlegte waar die gas verbruik word nie, was die ondersoek van ‘n dinamiese prioritiseringsbenadering nodig om te kon vasstel of die CO die metaangas kon vervang. Die dinamika van hierdie vraag-aanbod probleem, getiteld die “CO gasprobleem”, moes bestudeer word.
‘n Diskrete-element simulasiemodel is ontwikkel as probleemoplossingshulpmiddel om die vraag-aanbodproses te modelleer en die prioritiseringsbenadering te ondersoek. Die doel van die model was om oor tyd die veranderlike hoeveelhede van geproduseerde CO teenoor die veranderlike gasaanvraag te vergelyk. Die vlak van gasaanvraag is afhanklik van die beskikbaarheidsvlak van die aanlegte waar die gas verbruik word, sowel as die voertempo’s en tipes voermateriaal in laasgenoemde aanlegte. Die probleem is geformuleer as ‘n multi-doelwit optimeringsprobleem met twee hoof, teenstrydige doelwitte: 1) die gemiddelde verlies aan produksietyd per aanleg per dag weens oorgeskakelings tussen CO en metaangas; 2) die gemiddelde maandelikse besparing op metaangaskoste weens laer verbruik van dié gas. ‘n Metaheuristiek, genaamd MOO CEM (multi-objective optimisation using the cross-entropy method), is ingespan as optimeringsalgoritme om die CO gasprobleem op te los. Die prestasie van die MOO CEM algoritme is vergelyk met dié van ‘n algemeen aanvaarde riglynalgoritme, die NSGA-II, waar beide op die CO gasprobleem toegepas is.
Die agtergrond van multi-doelwit optimering, metaheuristieke en die benutting van hoogoond-afgas, spesifiek CO gas, is ondersoek in die literatuurstudie. Die simulasiemodel is daarna ontwikkel en die optimeringsalgoritme is toegepas.
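The two conflicting objectives in the English abstract (production time lost, to be minimised, versus monthly methane saving, to be maximised) make candidate policies comparable only through Pareto dominance. A minimal non-dominated filter, with invented objective values:

```python
def dominates(a, b):
    """True if a dominates b: no worse in both objectives, strictly better in one.
    Convention: objective 0 (hours lost/day) is minimised, objective 1 (saving/month) maximised."""
    return (a[0] <= b[0] and a[1] >= b[1]) and (a[0] < b[0] or a[1] > b[1])

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other candidate."""
    return [s for i, s in enumerate(solutions)
            if not any(dominates(o, s) for j, o in enumerate(solutions) if j != i)]

# (hours lost per day, methane saving per month) -- illustrative values only
candidates = [(2.0, 900.0), (1.0, 400.0), (2.5, 850.0), (0.5, 100.0), (1.0, 600.0)]
print(pareto_front(candidates))  # -> [(2.0, 900.0), (0.5, 100.0), (1.0, 600.0)]
```

Both MOO CEM and NSGA-II return an approximation of exactly such a front, leaving the final trade-off between lost production and gas savings to the decision maker.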
|
27 |
The application of the cross-entropy method for multi-objective optimisation to combinatorial problems / Hauman, Charlotte 12 1900 (has links)
Thesis (MScEng)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: Society is continually in search of ways to optimise various objectives. When faced with multiple and conflicting objectives, humans are in need of solution techniques to enable optimisation. This research is based on a recent venture in the field of multi-objective optimisation, the use of the cross-entropy method to solve multi-objective problems. The document provides a brief overview of the two fields, multi-objective optimisation and the cross-entropy method, touching on literature, basic concepts and applications or techniques. The application of the method to two problems is then investigated. The first application is to the multi-objective vehicle routing problem with soft time windows, a widely studied problem with many real-world applications. The problem is modelled mathematically with a transition probability matrix that is updated according to cross-entropy principles before converging to an approximation solution set. The highly constrained problem is successfully modelled and the optimisation algorithm is applied to a set of benchmark problems. It was found that the cross-entropy method for multi-objective optimisation is a valid technique in providing feasible and non-dominated solutions. The second application is to a real-world case study in blood management done at the Western Province Blood Transfusion Service. The conceptual model is derived from interviews with relevant stakeholders before discrete-event simulation is used to model the system. The cross-entropy method is used to optimise the inventory policy of the system by simultaneously maximising the combined service level of the system and minimising the total distance travelled. By integrating the optimisation and simulation models, the study shows that the inventory policy of the service can improve significantly, and that the cross-entropy algorithm adequately progresses to a front of solutions. The research proves the remarkable width and simplicity of possible applications of the cross-entropy algorithm for multi-objective optimisation, whilst contributing to literature on the vehicle routing problem and blood management. Results on benchmark problems for the vehicle routing problem with soft time windows are provided and an improved inventory policy is suggested to the Western Province Blood Transfusion Service. / AFRIKAANSE OPSOMMING: Die mensdom is voortdurend op soek na maniere om verskeie doelwitte te optimeer. Wanneer die mens gekonfronteer word met meervoudige en botsende doelwitte, is oplossingsmetodes nodig om optimering te bewerkstellig. Hierdie navorsing is gebaseer op 'n nuwe wending in die veld van multi-doelwit optimering, naamlik die gebruik van die kruis-entropie metode om multi-doelwit probleme op te los. Die dokument verskaf 'n breë oorsig oor die twee velde, multi-doelwit optimering en die kruis-entropie metode, deur kortliks te kyk na die beskikbare literatuur, basiese beginsels, toepassingsareas en metodes. Die toepassing van die metode op twee onafhanklike probleme word dan ondersoek. Die eerste toepassing is dié van die multi-doelwit voertuigroeteringsprobleem met plooibare tydvensters. Die probleem word eers wiskundig gemodelleer met 'n oorgangswaarskynlikheidsmatriks. Die matriks word dan deur kruis-entropie beginsels opdateer voor dit konvergeer na 'n benaderingsfront van oplossings. Die oplossingsruimte is onderwerp aan heelwat beperkings, maar die probleem is suksesvol gemodelleer en die optimeringsalgoritme is gevolglik toegepas op 'n stel verwysingsprobleme. Die navorsing het gevind dat die kruis-entropie metode vir multi-doelwit optimering 'n geldige metode is om 'n uitvoerbare front van oplossings te beraam. Die tweede toepassing is op 'n gevallestudie van die bestuur van bloed binne die konteks van die Westelike Provinsie Bloedoortappingsdiens. Na aanleiding van onderhoude met die relevante belanghebbendes is 'n konsepmodel geskep voor 'n simulasiemodel van die stelsel gebou is. Die kruis-entropie metode is gebruik om die voorraadbeleid van die stelsel te optimeer deur 'n gesamentlike diensvlak van die stelsel te maksimeer en terselfdertyd die totale reisafstand te minimeer. Deur die optimerings- en simulasiemodel te integreer, wys die studie dat die voorraadbeleid van die diens aansienlik kan verbeter, en dat die kruis-entropie algoritme in staat is om na 'n front van oplossings te beweeg. Die navorsing bewys die merkwaardige wydte en eenvoud van moontlike toepassings van die kruis-entropie algoritme vir multi-doelwit optimering, terwyl dit 'n bydrae lewer tot die afsonderlike velde van voertuigroetering en die bestuur van bloed. Uitslae vir die verwysingsprobleme van die voertuigroeteringsprobleem met plooibare tydvensters word verskaf en 'n verbeterde voorraadbeleid word aan die Westelike Provinsie Bloedoortappingsdiens voorgestel.
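The transition-probability-matrix mechanism described in the English abstract can be sketched on a toy routing instance: tours are sampled from a stochastic node-to-node matrix, and the matrix is refit to the elite tours each iteration. The instance (five points on a line) and all parameters are illustrative, not the thesis's constrained VRP model:

```python
import random

def ce_tour(dist, samples=300, elite_frac=0.1, iters=40, alpha=0.7, seed=1):
    """Cross-entropy search for a short closed tour via an updated transition matrix P."""
    n = len(dist)
    rng = random.Random(seed)
    P = [[1.0 / n] * n for _ in range(n)]           # uniform initial transition matrix
    n_elite = max(1, int(samples * elite_frac))

    def sample_tour():
        tour, left = [0], set(range(1, n))
        while left:
            cur = tour[-1]
            weights = [P[cur][j] if j in left else 0.0 for j in range(n)]
            tour.append(rng.choices(range(n), weights)[0])
            left.discard(tour[-1])
        return tour

    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    best = None
    for _ in range(iters):
        pop = sorted((sample_tour() for _ in range(samples)), key=length)
        if best is None or length(pop[0]) < length(best):
            best = pop[0]
        counts = [[0] * n for _ in range(n)]
        for t in pop[:n_elite]:                     # refit P on elite transitions
            for i in range(n):
                counts[t[i]][t[(i + 1) % n]] += 1
        for i in range(n):
            total = sum(counts[i])
            P[i] = [alpha * c / total + (1 - alpha) * p for c, p in zip(counts[i], P[i])]
    return best, length(best)

dist = [[abs(i - j) for j in range(5)] for i in range(5)]  # 5 cities on a line
best_tour, best_len = ce_tour(dist)
print(best_tour, best_len)  # the optimal closed tour has length 2 * (4 - 0) = 8
```

The real problem adds time-window penalties and a second objective, so the elite set is chosen by Pareto rank rather than a single tour length, but the matrix update is the same idea.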
|
28 |
Estudo sobre a aplicação de estatística bayesiana e método de máxima entropia em análise de dados / Study on application of bayesian statistics and method of maximum entropy in data analysis / Perassa, Eder Arnedo, 1982- 19 April 2007 (has links)
Orientador: José Augusto Chinellato / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Previous issue date: 2007 / Resumo: Neste trabalho são estudados os métodos de estatística bayesiana e máxima entropia na análise de dados. É feita uma revisão dos conceitos básicos e procedimentos que podem ser usados para inferência de distribuições de probabilidade. Os métodos são aplicados em algumas áreas de interesse, com especial atenção para os casos em que há pouca informação sobre o conjunto de dados. São apresentados algoritmos para a aplicação de tais métodos, bem como alguns exemplos detalhados que se espera servirem de auxílio aos interessados em aplicações em casos mais comuns de análise de dados / Abstract: In this work, we study the methods of Bayesian statistics and maximum entropy in data analysis. We present a review of basic concepts and procedures that can be used for the inference of probability distributions. The methods are applied in some fields of interest, with special attention to cases where there is little information about the data set, as found in physics experiments such as high-energy physics and astrophysics, among others. Algorithms are presented for the implementation of such methods, as well as some detailed examples that are expected to help those interested in applications to the most common cases of data analysis / Mestrado / Física das Partículas Elementares e Campos / Mestre em Física
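The maximum entropy method reviewed above selects, among all distributions consistent with the known constraints, the one of highest entropy. A classic sketch (Jaynes' die with a fixed mean, not an example taken from the dissertation): the solution has the exponential form p_k proportional to exp(lam * k), and the Lagrange multiplier lam can be found by simple bisection.

```python
import math

def maxent_die(mean, faces=6, tol=1e-10):
    """Maximum-entropy distribution on {1..faces} subject to a fixed mean.
    The solution is p_k ~ exp(lam*k); bisect on lam until the mean matches."""
    def mean_for(lam):
        w = [math.exp(lam * k) for k in range(1, faces + 1)]
        z = sum(w)
        return sum(k * wk for k, wk in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0                 # mean_for is monotone increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * k) for k in range(1, faces + 1)]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(4.5)   # the classic "average roll is 4.5" example
print([round(x, 3) for x in p])
```

With only the mean constrained, the least-committal distribution is not uniform but geometrically tilted toward the high faces, which is exactly the kind of inference-with-little-information the abstract refers to.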
|
29 |
Estratégias numéricas e de otimização para inferência da dinâmica de redes bioquímicas / Ladeira, Carlos Roberto Lima 28 February 2014 (has links)
Previous issue date: 2014-02-28 / Estimar parâmetros de modelos dinâmicos de sistemas biológicos usando séries temporais é
cada vez mais importante, pois uma quantidade imensa de dados experimentais está sendo
mensurados pela biologia molecular moderna. Uma abordagem de resolução baseada em
problemas inversos pode ser utilizada na solução deste tipo de problema. A escolha do
modelo matemático é uma tarefa importante, pois vários modelos podem ser utilizados,
apresentando níveis diversos de precisão em suas representações.
A Teoria dos Sistemas Bioquímicos (TSB) faz uso de equações diferenciais ordinárias
e expansões de séries de potências para representar processos bioquímicos. O Sistema
S é um dos modelos usados pela TSB que permite a transformação do sistema original
de equações diferenciais em um sistema algébrico desacoplado, facilitando a solução do
problema inverso. Essa transformação pode comprometer a qualidade da resposta se o
valor das derivadas nos pontos das séries temporais não for obtidos com precisão. Para
estimar as derivadas pretende-se explorar o método do passo complexo, que apresenta
vantagens em relação ao método das diferenças finitas, mais conhecido e utilizado.
A partir daí pode então ser realizada a busca pelas variáveis que definirão as equações
do sistema. O método da Regressão Alternada é um dos mais rápidos para esse tipo de
problema, mas a escolha inicial dos parâmetros possui influência em seu resultado, que
pode até mesmo não ser encontrado. Pretende-se avaliar o método da Entropia Cruzada,
que possui a vantagem de realizar buscas globais e talvez por esse motivo a escolha dos
parâmetros inicias não cause tanta influência nos resultados. Além disso, será avaliado um
método híbrido que fará uso das principais vantagens do método da Regressão Alternada
e do Entropia Cruzada para resolver o problema.
Experimentos numéricos sistematizados serão realizados tanto para a etapa de estimativa
das derivadas quanto para a etapa de otimização para obtenção dos parâmetros
das equações do sistema. / Estimating parameters of dynamic models of biological systems using time series is becoming
very important because a huge amount of experimental data is being measured
by modern molecular biology. A resolution-based approach on inverse problems can be
used in solving this type of problem. The choice of the mathematical model is an important
task, since many models can be used, with varying levels of accuracy in their
representations.
The Biochemical Systems Theory (BST) makes use of ordinary differential equations
and power series expansions to represent biochemical processes. The S-system is one of the
models used by BST that allows the transformation of the original system of differential
equations in a decoupled system of algebric equations, favouring the solution of the inverse
problem. This transformation can compromise the quality of the response if the value
of the derivatives at points of time series are not obtained accurately. To estimate the
derivatives we intend to explore the complex-step method, which has advantages over the
finite difference method, best known and used .
So the search for the variables that define the equations of the system can be performed.
The Alternating Regression method is one of the fastest for this type of problem, but the
initial choice of parameters has influence on its performance, which may not even be
found. We intend to evaluate the Cross-entropy method, which has the advantage of
performing global searches and for this reason the choice of the initial search parameters
does not cause as much influence on the results. Also, will be assessed a hybrid method
that makes use of the main advantages of Alternating Regression and Cross-entropy to
solve the problem.
Systematic numerical experiments will be conducted for both the step of estimating
derivatives as for the optimization step to estimate the variables of the equations of the
system.
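The complex-step method mentioned above estimates a derivative as f'(x) ≈ Im f(x + ih)/h, which involves no subtraction and therefore none of the cancellation error that limits finite differences. A minimal sketch (the test function is illustrative):

```python
import cmath
import math

def complex_step_derivative(f, x, h=1e-20):
    """f'(x) ~= Im(f(x + i*h)) / h -- no subtraction, so h can be tiny with no cancellation."""
    return f(x + 1j * h).imag / h

def forward_difference(f, x, h=1e-8):
    """Classic first-order finite difference, limited by subtractive cancellation."""
    return (f(x + h) - f(x)) / h

f = lambda t: cmath.exp(t) * cmath.sin(t)               # analytic test function
exact = math.exp(1.5) * (math.sin(1.5) + math.cos(1.5))  # d/dt e^t sin t = e^t(sin t + cos t)

print(abs(complex_step_derivative(f, 1.5) - exact))  # near machine precision
print(abs(forward_difference(f, 1.5) - exact))       # several orders of magnitude worse
```

The only requirement is that f be implemented with complex-capable operations (here via `cmath`), which is why it suits the smooth power-law terms of an S-system.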
|
30 |
Digital Signal Processing of SARSAT Signals Using the MEM and FFT / Chung, Kwai-Sum Thomas 07 1900 (has links)
This thesis investigates the processing of emergency locator transmitter (ELT) signals, which are used in search and rescue satellite-aided tracking (SARSAT) systems. Essentially, the system relies on the transmission of ELT signals from a distressed platform being relayed through an orbiting satellite to an earth station, where signal processing can be performed.

The methods of signal processing investigated here include both linear and nonlinear. The linear methods include the window function, the autocorrelation function, digital filtering and the Fast Fourier Transform (FFT). The nonlinear processing is based on the Maximum Entropy Method (MEM). In addition, additive white Gaussian noise has been added to simulate performance under different carrier-to-noise density ratio conditions.

For a single ELT signal, it is shown in the thesis that the MEM processor gives good spectral performance as compared to the FFT when applied to all types of modulation. When multiple ELT signals are present, the MEM also provides certain benefits in improving the spectral performance as compared to the FFT. / Thesis / Master of Engineering (ME)
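The FFT side of the comparison amounts to locating the dominant line in the periodogram of the received signal. A toy sketch on a simulated noisy tone (the sampling rate, tone frequency and noise level are illustrative, and a direct DFT stands in for the FFT; MEM would instead fit an all-pole model to the autocorrelation):

```python
import cmath
import math
import random

def periodogram_peak(samples, fs):
    """Return the frequency of the strongest positive-frequency DFT bin.
    This is what an FFT-based processor estimates; O(n^2) direct DFT for clarity."""
    n = len(samples)
    best_k, best_power = 0, -1.0
    for k in range(1, n // 2):                       # skip DC, positive frequencies only
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power = abs(coeff) ** 2
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fs / n

rng = random.Random(0)
fs, f0, n = 1000.0, 125.0, 128                       # 125 Hz tone sampled at 1 kHz
sig = [math.sin(2 * math.pi * f0 * t / fs) + 0.3 * rng.gauss(0, 1) for t in range(n)]
print(periodogram_peak(sig, fs))  # -> 125.0
```

With short records like this, the FFT's resolution is limited to fs/n; the appeal of MEM reported in the thesis is precisely its sharper spectral peaks from the same amount of data.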
|