231

Estimation de mesures de risque pour des distributions elliptiques conditionnées / Estimation of risk measures for conditioned elliptical distributions

Usseglio-Carleve, Antoine 26 June 2018
This PhD thesis focuses on the estimation of some risk measures for a real random variable Y in the presence of a covariate X. For that purpose, we assume that the random vector (X,Y) is elliptically distributed. We first deal with the quantiles of Y given X=x, investigating a quantile regression model that is widespread in the literature and for which we obtain and discuss theoretical results. Such a model has limitations, however, especially when the quantile level is extreme, so we propose a more suitable approach. Asymptotic results are given, illustrated by a simulation study and a real-data example. The second chapter focuses on another risk measure, the expectile. Its structure mirrors that of the first: we show that a standard regression model is not adapted to extreme expectiles, for which we then propose a methodological and statistical approach. Furthermore, by highlighting the link between extreme quantiles and extreme expectiles, we observe that other extreme risk measures are closely related to extreme quantiles. We focus on two families, the Lp-quantiles and the Haezendonck-Goovaerts risk measures, for which we propose extreme-value estimators; a simulation study is also provided. Finally, the last chapter addresses the case where the dimension of the covariate X is large. Noting that our previous estimators perform poorly in this setting, we draw on high-dimensional estimation methods to propose alternative estimators. A simulation study gives an overview of their performance.
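For readers unfamiliar with the expectile risk measure central to this abstract: the tau-expectile is the minimizer of an asymmetrically weighted squared loss, e_tau = argmin_e E[|tau − 1{Y ≤ e}| (Y − e)²]. The sketch below computes a sample expectile by the standard fixed-point (iteratively reweighted mean) scheme; it is a generic illustration, not the thesis's elliptical-model estimator, and all names are ours.

```python
import numpy as np

def expectile(y, tau=0.95, tol=1e-10, max_iter=1000):
    """Sample tau-expectile: solves tau*E[(Y-e)+] = (1-tau)*E[(e-Y)+]
    by iterating the weighted-mean fixed point."""
    e = np.mean(y)  # the 0.5-expectile is the mean; a natural start
    for _ in range(max_iter):
        w = np.where(y > e, tau, 1.0 - tau)   # asymmetric weights
        e_new = np.sum(w * y) / np.sum(w)     # weighted-mean update
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e

rng = np.random.default_rng(0)
sample = rng.standard_t(df=4, size=10_000)    # heavy-tailed sample
print(expectile(sample, tau=0.99))            # high-level expectile
```

At extreme levels tau → 1 the empirical version above becomes unreliable, which is precisely the regime the thesis's dedicated extreme-expectile estimators target.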
232

Estimation des limites d'extrapolation par les lois de valeurs extrêmes. Application à des données environnementales / Estimation of extrapolation limits based on extreme-value distributions. Application to environmental data.

Albert, Clément 17 December 2018
This thesis is set in the framework of extreme value statistics and makes three main contributions. Extreme quantile estimation is a two-step approach: first, an extreme-value-based quantile approximation is derived; then, estimators of the unknown quantities are plugged into this approximation, leading to an extreme quantile estimator. These two steps produce errors of different natures: a systematic model error, called the approximation or extrapolation error, and a random estimation error. The first contribution of this thesis is a theoretical study of the little-known extrapolation error. This study is carried out for two different kinds of estimators, both based on the well-known generalized Pareto approximation: the Exponential Tail estimator, dedicated to the Gumbel maximum domain of attraction, and the Weissman estimator, dedicated to the Fréchet one. It is shown that the extrapolation error can be interpreted as the remainder of a first-order Taylor expansion. Necessary and sufficient conditions are then provided for this error to tend to zero as the sample size increases. Interestingly, in the case of the Exponential Tail estimator, these conditions lead to a subdivision of the Gumbel maximum domain of attraction into three distinct subsets. In contrast, the extrapolation error associated with the Weissman estimator behaves uniformly over the whole Fréchet maximum domain of attraction. First-order equivalents of the extrapolation error are then derived and their accuracy is illustrated numerically. The second contribution is a new extreme quantile estimator. The problem is addressed in the framework of the so-called "log-generalized Weibull tail limit", where the logarithm of the inverse cumulative hazard rate function is assumed to be of extended regular variation. After a discussion of the consequences of this assumption, a new estimator of extreme quantiles is proposed; its asymptotic normality is established and its practical behavior is evaluated on both real and simulated data. The third contribution is a set of tools for quantifying, in practice, the extrapolation limits of a dataset. To this end, we propose estimators of the extrapolation errors associated with the Exponential Tail and Weissman approximations. After evaluating how these estimators perform on simulated data, we use them on two real datasets of daily measurements of environmental variables, and show that, depending on the climatic hazard considered, the extrapolation limits are more or less stringent.
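The Weissman construction referred to above is classical; as a point of reference, a minimal sketch follows (Hill estimator for the tail index, then extrapolation beyond the sample range). The choices of k and p are illustrative, and this is the textbook estimator, not the thesis's error-quantified version.

```python
import numpy as np

def weissman_quantile(x, p, k):
    """Classical Weissman extreme-quantile estimator (Frechet domain):
    x_hat(p) = X_{n-k:n} * (k/(n*p))**gamma_hat, with gamma_hat the Hill
    estimator built from the k largest order statistics."""
    x = np.sort(x)
    n = len(x)
    threshold = x[n - k - 1]                  # X_{n-k:n}, the (k+1)-th largest
    top = x[n - k:]                           # the k largest observations
    gamma_hat = np.mean(np.log(top) - np.log(threshold))  # Hill estimator
    return threshold * (k / (n * p)) ** gamma_hat

rng = np.random.default_rng(1)
sample = rng.pareto(a=2.0, size=5_000) + 1.0  # Pareto(2): true gamma = 0.5
# Extrapolate far beyond the sample: p = 1e-4 while n = 5000.
print(weissman_quantile(sample, p=1e-4, k=200))  # true value is 100
```

The gap between p and 1/n in this call is exactly the extrapolation regime whose error the first contribution analyzes.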
233

Microstructure-sensitive extreme value probabilities of fatigue in advanced engineering alloys

Przybyla, Craig Paul 07 July 2010
A novel microstructure-sensitive extreme value probabilistic framework is introduced to evaluate material performance/variability for damage evolution processes (e.g., fatigue, fracture, creep). This framework employs newly developed extreme value marked correlation functions (EVMCF) to identify the coupled microstructure attributes (e.g., phase/grain size, grain orientation, grain misorientation) that have the greatest statistical relevance to the extreme value response variables (e.g., stress, elastic/plastic strain) that describe the damage evolution processes of interest. This improves on previous approaches, which characterized the distributed extreme value response variables using the extreme value distributions of a single microstructure attribute only, with no consideration of how coupled microstructure attributes affect those distributions. The framework also uses computational modeling to identify correlations between microstructure attributes that significantly raise or lower the magnitudes of the damage response variables of interest, through simulation of multiple statistical volume elements (SVE). Each SVE for a given response is constructed to be a statistical sample of the entire microstructure ensemble (i.e., the bulk material); the response of interest therefore differs from one SVE to another. This contrasts with computational simulation of a single representative volume element (RVE), which is often untenably large for response variables that depend on extreme value microstructure attributes. The framework is demonstrated by characterizing microstructure-sensitive high cycle fatigue (HCF) variability due to fatigue crack formation (nucleation and microstructurally small crack growth) in polycrystalline metallic alloys. Specifically, it is exercised to estimate the local driving forces for fatigue crack formation, to validate these against limited existing experiments, and to explore how the extreme value probabilities of certain fatigue indicator parameters (FIPs) affect overall variability in fatigue life in the HCF regime. Various FIPs have been introduced previously to quantify the potential for fatigue crack formation based on experimentally observed mechanisms. Distributions of the extreme value FIPs are calculated for multiple SVEs simulated via the finite element method with crystal plasticity constitutive relations, which allow the FIPs to be computed from the cyclic plastic strain at the scale of individual grains. The simulated SVEs are instantiated to be statistically similar to real microstructures in terms of the crystallographic attributes hypothesized to most influence the extreme value HCF response. The polycrystalline alloys considered are the Ni-base superalloy IN100 and the Ti alloy Ti-6Al-4V. In applying the framework to the microstructure-dependent variability of HCF in these alloys, the extreme value distributions of the FIPs and the associated extreme value marked correlations of crystallographic microstructure attributes are characterized. This information can be used to rank multiple microstructure variants of a specific material system for relative HCF performance, or to design new microstructures hypothesized to exhibit improved performance. The framework thereby limits the (presently) large number of experiments required to characterize scatter in HCF, and lends quantitative support to designing improved, fatigue-resistant materials and accelerating the insertion of modified and new materials into service.
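The extreme-value core of this framework can be illustrated generically: each SVE contributes one block maximum (its largest FIP), and a generalized extreme value distribution fitted to those maxima yields exceedance probabilities. The sketch below uses synthetic numbers and is in no way the thesis's crystal-plasticity pipeline; all quantities are placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# One extreme response per SVE: the maximum fatigue indicator parameter
# (FIP) over that SVE's grains (synthetic per-grain FIPs here).
sve_max_fips = np.array([
    rng.lognormal(mean=-6.0, sigma=0.5, size=2_000).max()
    for _ in range(100)                     # 100 simulated SVEs
])

# Fit a generalized extreme value (GEV) distribution to the block maxima.
shape, loc, scale = stats.genextreme.fit(sve_max_fips)

# Probability that a new SVE's worst grain exceeds a critical FIP level.
fip_crit = np.quantile(sve_max_fips, 0.99)
print(stats.genextreme.sf(fip_crit, shape, loc=loc, scale=scale))
```

The marked-correlation step of the framework then asks which coupled microstructure attributes co-occur with these extreme FIPs, which a plain GEV fit like this cannot reveal.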
234

Extreme Production - Administration Makeover : ur ett informationslogistiskt perspektiv / from an information logistics perspective

Granlund, Mirva, Hjerling, Sandra January 2007
Extreme Production Makeover™ caught our interest shortly before we began writing our bachelor's essay. The method is about transforming a production line to make it more efficient and raise productivity within a very short timeframe. Since we are about to graduate in information logistics, we were curious to find out whether Extreme Production Makeover™ is transferable to administrative and service flows from an information logistics perspective. Within production there is a long tradition of working with efficiency and improvement. Our studies of efficiency methods within administrative flows indicate that these methods are not as widespread within the service trades. Why is that? In general, changing administrative and service flows is a long process; however, if an Extreme Production Makeover™ can be accomplished within a short timeframe, then an Extreme Administration Makeover should be feasible as well.

Aim: The aim of our study is to examine and compare production flow efficiency with efficiency in administrative and service flows. We want to illustrate the benefit of using information logistics in structural changes, for both companies and personnel working with process efficiency. The result of our study is an action plan for Extreme Administration Makeover.

Method: Our essay was carried out as a desk study; we did not test or evaluate Extreme Administration Makeover in practice. We studied a variety of literature on information logistics and the efficiency of different flows, summarized under the heading Theory. To further support the theory, we spoke with key personnel within Extreme Production Makeover™ and Lean Forum to determine how the methods could be compared and applied; the results of these discussions are presented in the empirical research. We then added our knowledge of information logistics and created an action plan, with an accompanying model, for Extreme Administration Makeover. The analysis of our results is followed by a concluding discussion.

Delimitation: We have chosen to limit our essay to the methods of traditional production efficiency and Lean Production, although we are well aware that there are more methods for change work than these.

Results: Parts of a transformation can be accomplished within a short, or even extremely short, timeframe; however, establishing a deeply rooted way of working, and above all getting the thinking behind it to permeate the enterprise, almost certainly requires considerably more time. The outcome depends on the individuals affected by and involved in the process. By anchoring thoughts, goals and visions throughout the organisation, a foundation for stable change is created, which is a precondition for a good result.
235

Operation of silicon-germanium heterojunction bipolar transistors on silicon-on-insulator in extreme environments

Bellini, Marco 02 March 2009
Recently, several SiGe HBT devices fabricated on CMOS-compatible silicon-on-insulator (SOI) substrates (SiGe HBTs-on-SOI) have been demonstrated, combining the well-known SiGe HBT performance with the advantages of SOI substrates. These new devices are especially interesting in the context of extreme environments: highly challenging surroundings that lie outside commercial and even military electronics specifications. However, fabricating HBTs on SOI substrates instead of traditional bulk silicon requires extensive modifications to the transistor structure and entails significant trade-offs. The present work investigates, with measurements and TCAD simulations, the performance and reliability of SiGe heterojunction bipolar transistors fabricated on silicon-on-insulator substrates with respect to operation in extreme environments, such as extremely low or extremely high temperatures, or in the presence of radiation (both total ionizing dose and single-event upset).
236

Life & lifestyle makeovers the promotion of materialism in Extreme Makeover: Home Edition /

Ratliff, Kari. January 2007
Thesis (M.A.)--Miami University, Dept. of Communication, 2007. Title from first page of PDF document. Includes bibliographical references (p. 69-71).
237

Développement d'un modèle statistique non stationnaire et régional pour les précipitations extrêmes simulées par un modèle numérique de climat / A non-stationary and regional statistical model for the precipitation extremes simulated by a climate model

Jalbert, Jonathan 30 October 2015
Floods are the predominant natural hazard in the world, and extreme precipitation is one of the main factors behind them. Because of climate change, the occurrence and intensity of precipitation extremes will most likely increase, and the flood risk with them; the evolution of extreme precipitation is therefore a major issue for public safety and infrastructure sustainability. Flood risk management strategies for the future climate are essentially based on simulations from numerical climate models, which provide a precipitation time series (daily or sub-daily, typically spanning 1961 to 2100) for each grid point of the simulation domain. The spatial continuity of the simulated physical processes induces spatial coherence among these series: series from neighboring grid points often share similar characteristics. Extreme value theory is generally applied to the simulated series to estimate the quantiles corresponding to a given risk level, such as the T-year return levels. Most of the time, the estimation variance is considerable, notably because few extreme precipitation values are available, and this variance can play a decisive role in the design of risk management strategies. This thesis therefore develops a statistical model for precisely estimating the quantiles of extreme precipitation simulated by a numerical climate model. The model is specially adapted to climate-model output: it exploits the information contained in the continuous daily series to improve the estimation of non-stationary quantiles, without restrictive assumptions on the nature of the non-stationarity, and it exploits the spatial coherence of precipitation extremes, modeled by a Bayesian hierarchical model in which the parameter priors are spatial processes, namely intrinsic Gaussian Markov random fields. Applying the model to a simulation generated by the Canadian Regional Climate Model considerably reduced the estimation variance of the 100-year return levels over North America.
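For context on the return levels at stake here: the T-year return level of a series of annual maxima is the (1 − 1/T) quantile of a fitted generalized extreme value distribution. The sketch below shows the pointwise (single-grid-point, stationary) version whose large estimation variance motivates the thesis; it is a generic illustration, not the hierarchical model itself.

```python
import numpy as np
from scipy import stats

def gev_return_level(annual_maxima, T):
    """T-year return level: the (1 - 1/T) quantile of a GEV
    distribution fitted to a series of annual maxima."""
    shape, loc, scale = stats.genextreme.fit(annual_maxima)
    return stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

rng = np.random.default_rng(3)
# Synthetic 40-year series of annual maximum daily precipitation (mm).
maxima = stats.genextreme.rvs(-0.1, loc=60, scale=15, size=40,
                              random_state=rng)
print(gev_return_level(maxima, T=100))
```

With only a few dozen maxima per grid point, such pointwise estimates fluctuate heavily; pooling information across neighboring grid points, as the thesis's Markov-field priors do, is what brings the variance down.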
238

[en] CONTAGION AND EXTREMAL INTERDEPENDENCE IN EMERGING MARKETS / [pt] INTERDEPENDÊNCIA EXTREMA E CONTÁGIO EM MERCADOS EMERGENTES

RODRIGO GELLI CAVALCANTI 01 June 2007
In this dissertation we evaluate the degree of association between pairs of excess returns, simultaneous and lagged in time, using the concept of copulas. Asymmetric copulas are fitted to 10 pairs of return distributions of world market indices, and from these copulas tail dependence coefficients, the measures of interdependence and contagion, are computed for the right and left tails. Using those coefficients as measures of cross dependence and contagion between markets, one can pick the pair of assets with the best performance in periods of stress. If lagged excess returns are included, these coefficients also indicate the direction and intensity of crisis propagation (contagion). Our results show that this technique is effective for constructing portfolios intended to capture joint extreme gains of pairs of assets while avoiding joint extreme losses. The use of lagged returns in this context, however, showed no extra gain, possibly reflecting the near-instantaneous spread of contagion between world financial markets nowadays.
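The tail dependence coefficient referred to above, e.g. the upper-tail version λ_U = lim_{u→1} P(Y > F_Y⁻¹(u) | X > F_X⁻¹(u)), can be approximated crudely without fitting a copula. The naive empirical version below, on synthetic returns, is only a point of reference; the dissertation's estimates come from fitted asymmetric copulas, which behave far better near the limit.

```python
import numpy as np

def upper_tail_dependence(x, y, u=0.95):
    """Naive empirical upper-tail dependence at level u:
    P(Y > q_y | X > q_x), with q the marginal u-quantiles."""
    qx, qy = np.quantile(x, u), np.quantile(y, u)
    exceed_x = x > qx
    return np.mean(y[exceed_x] > qy)      # conditional exceedance frequency

rng = np.random.default_rng(4)
# Correlated heavy-tailed returns as stand-ins for two market indices.
z = rng.standard_t(df=3, size=(50_000, 2))
x = z[:, 0]
y = 0.7 * z[:, 0] + np.sqrt(1 - 0.7**2) * z[:, 1]
print(upper_tail_dependence(x, y, u=0.99))
```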
239

[en] AN EMPIRICAL EVALUATION OF AN ENVIRONMENT DESIGNED FOR TEST DRIVEN DEVELOPMENT / [pt] UMA AVALIAÇÃO EMPÍRICA DE UM AMBIENTE FAVORÁVEL PARA O DESENVOLVIMENTO DIRIGIDO POR TESTES

HENRIQUE FELICIANO PRANGE 28 September 2007
Test Driven Development (TDD) is one of the eXtreme Programming (XP) practices that is easiest to understand yet, at the same time, difficult to implement. To achieve good results it is necessary to use complementary practices and appropriate tools, and to follow some rules carefully. A real experiment creating an adequate environment for TDD was conducted in a small company, and this study presents the results obtained. What are the advantages and disadvantages of each of the practices? How can these practices be established in the day-to-day operations of a small company? What type of environment has to be built? Which tools? How much time, and what kind of investment, would implementing these improvements require? These and other questions are answered in the course of this work.
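Since the study revolves around the TDD cycle (write a failing test first, then the minimal code that makes it pass, then refactor), a tiny generic example may help readers unfamiliar with the practice; it is ours, not taken from the thesis.

```python
import unittest

# Green step: the minimal implementation, written only after the tests
# below had been seen to fail.
def parse_price(text):
    """Parse a Brazilian-format price string such as 'R$ 12,50'."""
    return float(text.replace("R$", "").strip().replace(",", "."))

# Red step: these tests are written first and drive the design.
class ParsePriceTest(unittest.TestCase):
    def test_parses_currency_prefix(self):
        self.assertEqual(parse_price("R$ 12,50"), 12.50)

    def test_parses_plain_number(self):
        self.assertEqual(parse_price("7,00"), 7.0)

if __name__ == "__main__":
    unittest.main()
```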
240

Um método de aprendizagem seqüencial com filtro de Kalman e Extreme Learning Machine para problemas de regressão e previsão de séries temporais / A sequential learning method with Kalman filter and Extreme Learning Machine for regression and time series forecasting problems

NÓBREGA, Jarley Palmeira 24 August 2015
In machine learning applications, there are situations where the input dataset is not fully available at the beginning of the training phase. A well-known solution for this class of problem is to perform the learning process through a sequential feed of training instances. Among the most recent approaches to sequential learning are methods based on single layer feedforward networks (SLFN), in particular the extensions of the Extreme Learning Machine (ELM) to sequential learning. The sequential version of the ELM algorithm, named Online Sequential Extreme Learning Machine (OS-ELM), uses a recursive least squares solution to update the network's output weights through a covariance matrix. However, implementations of OS-ELM and its extensions suffer from multicollinearity among the elements of that covariance matrix. This thesis introduces a new sequential learning method able to handle the effects of multicollinearity. The proposed Kalman Learning Machine (KLM) uses the Kalman filter to sequentially update the output weights of an OS-ELM-based SLFN. This work also proposes an approach for estimating the filter parameters, aimed at reducing the computational complexity of training. In addition, an extension of the method named Extended Kalman Learning Machine (EKLM) is presented for problems where the underlying system is nonlinear. The proposed method was compared with some of the most recent and effective methods for handling multicollinearity in sequential learning problems. The experiments show that it outperforms most state-of-the-art methods in terms of forecast error and training time. A case study applied the proposed method to a financial time series forecasting problem; the results confirm that the KLM simultaneously reduces the forecast error and the training time, compared with the other sequential learning methods investigated in this thesis.
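To make the core idea tangible, here is a minimal sketch of updating the output weights of an ELM-style network with a Kalman filter, treating the weights as a static state observed through the hidden-layer activations. The random-feature mapping, noise variance, and all names are our illustrative assumptions, not the thesis's exact KLM.

```python
import numpy as np

rng = np.random.default_rng(5)

# ELM-style hidden layer: fixed random input weights and biases.
n_hidden, n_in = 20, 3
W = rng.normal(size=(n_hidden, n_in))
b = rng.normal(size=n_hidden)
hidden = lambda x: np.tanh(W @ x + b)          # h(x), shape (n_hidden,)

def kalman_step(beta, P, h, y, r=0.01):
    """One Kalman update for the static state beta under the scalar
    observation model y = h @ beta + noise, noise variance r."""
    s = h @ P @ h + r                  # innovation variance
    k = (P @ h) / s                    # Kalman gain
    beta = beta + k * (y - h @ beta)   # output-weight update
    P = P - np.outer(k, h) @ P         # covariance update
    return beta, P

beta = np.zeros(n_hidden)              # initial output weights
P = 1e3 * np.eye(n_hidden)             # diffuse prior covariance

# Sequentially feed a toy regression stream, one instance at a time.
for _ in range(2_000):
    x = rng.uniform(-1.0, 1.0, size=n_in)
    y = np.sin(x).sum() + rng.normal(scale=0.1)
    beta, P = kalman_step(beta, P, hidden(x), y)

x_test = np.array([0.2, -0.4, 0.7])
print(hidden(x_test) @ beta, np.sin(x_test).sum())  # prediction vs truth
```

For a static state with constant r, this recursion is algebraically the recursive least squares update that OS-ELM uses; the explicit noise and covariance parameters are the extra levers a Kalman formulation offers when those assumptions are relaxed.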
