151

Three Dimensional Hyperbolic Grid Generation

Dincgez, Umut Can 01 April 2006 (has links) (PDF)
This thesis analyzes a procedure for generating hyperbolic grids formulated from two constraints, which specify grid orthogonality and cell volume. The procedure was applied to a wide range of geometries, and high-quality two- and three-dimensional hyperbolic grids were generated using grid control and smoothing procedures, which provide grid clustering in all directions and prevent grid deformation (grid shock), respectively.
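To make the two constraints concrete, the sketch below marches a simple 2D grid outward from an initial curve: each layer advances along the local normal (orthogonality) with a step length chosen so that step times local arc length equals a prescribed cell area (the 2D analogue of the cell-volume constraint). This is an illustrative Python sketch, not the thesis's actual three-dimensional formulation; all names and parameter values are invented.

```python
import numpy as np

def hyperbolic_march_2d(x0, y0, n_layers, cell_area):
    """Grow a structured grid outward from an initial curve (x0, y0).

    Simplified 2D analogue of hyperbolic grid generation: each layer is
    advanced along the local unit normal (orthogonality constraint) with a
    step length chosen so that step * local arc length = cell_area
    (cell-volume constraint). No smoothing / grid-shock control is applied.
    """
    X = [np.asarray(x0, dtype=float)]
    Y = [np.asarray(y0, dtype=float)]
    for _ in range(n_layers):
        xc, yc = X[-1], Y[-1]
        tx, ty = np.gradient(xc), np.gradient(yc)   # tangent along the curve index
        ds = np.hypot(tx, ty)                       # local arc-length element
        nx, ny = ty / ds, -tx / ds                  # unit normal (tangent rotated -90 deg)
        step = cell_area / ds                       # area constraint: step * arc = cell_area
        X.append(xc + step * nx)
        Y.append(yc + step * ny)
    return np.array(X), np.array(Y)

# Example: march 20 layers outward from a unit circle.
theta = np.linspace(0.0, 2.0 * np.pi, 101)
X, Y = hyperbolic_march_2d(np.cos(theta), np.sin(theta), n_layers=20, cell_area=0.05)
```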
152

Prediction Of Prices Of Risky Assets Using Smoothing Algorithm

Capanoglu, Gulsum Elcin 01 May 2006 (has links) (PDF)
This thesis presents a prediction algorithm for the price of a share of a risky asset. The share price is represented by a dynamic model and the observations by a measurement model. The dynamic model is derived using stochastic calculus, and the algorithm is simulated in Matlab.
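As a hedged illustration of the dynamic-model/measurement-model pairing described above (the thesis derives its model via stochastic calculus and simulates it in Matlab; the sketch below instead uses a scalar Kalman filter on a log-price random walk with drift, with invented parameter values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dynamic model: log-price random walk with drift,
#   s_t = s_{t-1} + mu + w_t,  w_t ~ N(0, q)
# Assumed measurement model: y_t = s_t + v_t,  v_t ~ N(0, r)
mu, q, r, n = 0.001, 1e-4, 4e-4, 250

s_true = np.cumsum(mu + np.sqrt(q) * rng.standard_normal(n)) + np.log(100.0)
y = s_true + np.sqrt(r) * rng.standard_normal(n)

# Scalar Kalman filter over the observed (noisy) log-prices
s_hat, p = y[0], 1.0
filtered = []
for t in range(n):
    # predict one step ahead
    s_pred, p_pred = s_hat + mu, p + q
    # update with the new observation
    k = p_pred / (p_pred + r)
    s_hat = s_pred + k * (y[t] - s_pred)
    p = (1.0 - k) * p_pred
    filtered.append(s_hat)

predicted_next_price = np.exp(filtered[-1] + mu)   # one-step-ahead price forecast
```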
153

Essays on international investment holdings and risk sharing

Wu, Yi-Tsung. January 2007 (has links)
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Economics, 2007. / Includes bibliographical references.
154

Trends in Forest Soil Acidity : A GAM Based Approach with Application on Swedish Forest Soil Inventory Data

Betnér, Staffan January 2018 (has links)
The acidification of soils has been a continuous process since at least the beginning of the 20th century. Therefore, an inquiry into how and when soil pH levels have changed is relevant to gaining a better understanding of this process. The aim of this thesis is to study the average national soil pH level over time in Sweden and the local spatial differences within Sweden over time. With data from the Swedish National Forest Inventory, soil pH surfaces are estimated for each surveyed year, together with the national average soil pH, using a generalized additive modeling approach with one model for each pair of consecutive years. A decreasing trend in the national average soil pH level was found, together with some very weak evidence of year-to-year differences in the spatial structure of soil pH.
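The modelling idea (a smooth temporal trend plus a spatial surface, fitted as a generalized additive model) can be sketched as below. This is an illustrative example using the pygam package on synthetic data with assumed columns [year, easting, northing]; it is not the thesis's actual model or the Inventory's data.

```python
import numpy as np
from pygam import LinearGAM, s, te

rng = np.random.default_rng(1)

# Toy stand-in for inventory plots: [year, easting, northing] and a soil pH response
n = 2000
year = rng.integers(1990, 2019, n)
east, north = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
ph = (4.8 - 0.01 * (year - 1990)
      + 0.5 * np.sin(3 * east) * np.cos(3 * north)
      + 0.2 * rng.standard_normal(n))

X = np.column_stack([year, east, north])

# Smooth temporal trend s(year) plus a spatial tensor-product surface te(east, north)
gam = LinearGAM(s(0) + te(1, 2)).fit(X, ph)

# Average trend over time: predict across years at a fixed reference location
years = np.arange(1990, 2019)
grid = np.column_stack([years,
                        np.full(years.size, 0.5),
                        np.full(years.size, 0.5)])
trend = gam.predict(grid)
```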
155

Empirical likelihood with applications in time series

Li, Yuyi January 2011 (has links)
This thesis investigates the statistical properties of the Kernel Smoothed Empirical Likelihood (KSEL; e.g. Smith, 1997, 2004) estimator and various associated inference procedures for weakly dependent data. New tests for structural stability are proposed and analysed. Asymptotic analyses and Monte Carlo experiments are applied to assess these new tests, theoretically and empirically. Chapter 1 reviews and discusses some estimation and inferential properties of Empirical Likelihood (EL; Owen, 1988) for identically and independently distributed data and compares it with Generalised EL (GEL), GMM and other estimators. KSEL is extensively treated, by specialising kernel-smoothed GEL in the working paper of Smith (2004), some of whose results and proofs are extended and refined in Chapter 2. Asymptotic properties of some tests in Smith (2004) are also analysed under local alternatives. These special treatments of KSEL lay the foundation for the analyses in Chapters 3 and 4, which would not otherwise follow straightforwardly. In Chapters 3 and 4, subsample KSEL estimators are proposed to assist the development of KSEL structural stability tests, to diagnose a given breakpoint and an unknown breakpoint respectively, based on relevant work using GMM (e.g. Hall and Sen, 1999; Andrews and Fair, 1988; Andrews and Ploberger, 1994). It is also original in these two chapters that moment functions are allowed to be kernel-smoothed after or before the sample split, and it is rigorously proved that these two smoothing orders are asymptotically equivalent. The overall null hypothesis of structural stability is decomposed according to the identifying and overidentifying restrictions, as Hall and Sen (1999) advocate in GMM, leading to a more practical and precise structural stability diagnosis procedure. In this framework, these KSEL structural stability tests are also proved, via asymptotic analysis, to be capable of identifying different sources of instability, arising from parameter value change or violation of overidentifying restrictions. The analyses show that these KSEL tests follow the same limit distributions as their GMM counterparts. To examine the finite-sample performance of the KSEL structural stability tests in comparison with GMM's, Monte Carlo simulations are conducted in Chapter 5 using a simple linear model considered by Hall and Sen (1999). This chapter details some relevant computational algorithms and permits different smoothing order, kernel type and prewhitening options. In general, the simulation evidence suggests that the newly proposed KSEL tests often perform comparably to GMM's tests. However, in some cases their sizes can be slightly larger, and false null hypotheses are rejected with much higher frequencies. Thus, these KSEL-based tests are valid theoretical and practical alternatives to their GMM counterparts.
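For orientation, the sketch below computes the classical empirical likelihood ratio statistic for a hypothesised mean (Owen, 1988), the i.i.d. building block that the kernel-smoothed, weakly dependent procedures in the thesis generalise; it is not the KSEL estimator itself.

```python
import numpy as np
from scipy.optimize import brentq

def el_ratio_stat(x, mu0):
    """-2 log empirical likelihood ratio for H0: E[X] = mu0 (Owen, 1988)."""
    z = np.asarray(x, dtype=float) - mu0
    if z.max() <= 0 or z.min() >= 0:
        return np.inf                      # mu0 lies outside the convex hull of the data
    # EL weights are p_i = 1 / (n * (1 + lam * z_i));
    # lam solves sum_i z_i / (1 + lam * z_i) = 0 on the interval below
    lo = -1.0 / z.max() + 1e-10
    hi = -1.0 / z.min() - 1e-10
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    lam = brentq(g, lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))  # equals -2 * sum(log(n * p_i))

rng = np.random.default_rng(2)
x = rng.normal(loc=0.3, scale=1.0, size=200)
stat = el_ratio_stat(x, mu0=0.0)            # compare to the chi2(1) critical value 3.84
```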
156

DEFINITION OF A QUALITY INDEX FOR ELECTRIC POWER DISTRIBUTION COMPANIES USING MULTIPLE CRITERIA DECISION SUPPORT AND TIME SERIES ANALYSIS

ADERSON CAMPOS PASSOS 06 June 2011 (has links)
This work develops a hybrid method to create a quality index for electric power distribution companies. The method combines the Analytic Hierarchy Process (AHP) with exponential smoothing techniques, making it possible to evaluate a distribution company while taking into account multiple criteria and its past index values.
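A hedged sketch of how such a hybrid index could be assembled: AHP weights derived from a pairwise-comparison matrix, combined with exponentially smoothed criterion scores. The comparison matrix, smoothing constant and scores below are invented for illustration and do not come from the dissertation.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector = normalised principal eigenvector of the AHP pairwise matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing; returns the final smoothed level."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1.0 - alpha) * level
    return level

# Illustrative pairwise comparisons for three quality criteria
# (e.g. continuity, voltage conformity, commercial service)
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w = ahp_weights(A)

# Illustrative yearly scores (0-100) of one distribution company per criterion
history = np.array([[82, 75, 68],
                    [85, 74, 70],
                    [88, 78, 73],
                    [90, 80, 71]], dtype=float)

smoothed = np.array([exp_smooth(history[:, j]) for j in range(history.shape[1])])
quality_index = float(w @ smoothed)   # weighted combination of smoothed criterion levels
```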
157

Variational based analysis and modelling using B-splines

Sherar, P. A. January 2004 (has links)
The use of energy methods and variational principles is widespread in many fields of engineering of which structural mechanics and curve and surface design are two prominent examples. In principle many different types of function can be used as possible trial solutions to a given variational problem but where piecewise polynomial behaviour and user controlled cross segment continuity is either required or desirable, B-splines serve as a natural choice. Although there are many examples of the use of B-splines in such situations there is no common thread running through existing formulations that generalises from the one dimensional case through to two and three dimensions. We develop a unified approach to the representation of the minimisation equations for B-spline based functionals in tensor product form and apply these results to solving specific problems in geometric smoothing and finite element analysis using the Rayleigh-Ritz method. We focus on the development of algorithms for the exact computation of the minimisation matrices generated by finding stationary values of functionals involving integrals of squares and products of derivatives, and then use these to seek new variational based solutions to problems in the above fields. By using tensor notation we are able to generalise the methods and the algorithms from curves through to surfaces and volumes. The algorithms developed can be applied to other fields where a variational form of the problem exists and where such tensor product B-spline functions can be specified as potential solutions.
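As an accessible illustration of the penalised, variational use of B-splines discussed above, the sketch below fits a discrete P-spline smoother (second-order difference penalty on the coefficients, in the style of Eilers and Marx); it is not the thesis's exact tensor-product minimisation matrices, and it assumes SciPy 1.8+ for BSpline.design_matrix.

```python
import numpy as np
from scipy.interpolate import BSpline

def pspline_smooth(x, y, n_basis=30, degree=3, lam=1.0):
    """Penalised least-squares fit with a cubic B-spline basis.

    Minimises ||y - B c||^2 + lam * ||D2 c||^2, where B is the B-spline design
    matrix and D2 is a second-order difference penalty on the coefficients
    (a discrete stand-in for an integrated-curvature functional).
    """
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    inner = np.linspace(lo, hi, n_basis - degree + 1)
    t = np.r_[[lo] * degree, inner, [hi] * degree]            # clamped knot vector
    B = BSpline.design_matrix(x, t, degree).toarray()          # (n_points, n_basis)
    D2 = np.diff(np.eye(B.shape[1]), n=2, axis=0)              # second-difference matrix
    c = np.linalg.solve(B.T @ B + lam * D2.T @ D2, B.T @ y)    # normal equations
    return BSpline(t, c, degree)

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
spline = pspline_smooth(x, y, lam=0.5)
y_smooth = spline(x)
```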
158

Run-to-run modelling and control of batch processes

Duran Villalobos, Carlos Alberto January 2016 (has links)
The University of Manchester, Carlos Alberto Duran Villalobos, Doctor of Philosophy in the Faculty of Engineering and Physical Sciences, December 2015. This thesis presents an innovative batch-to-batch optimisation technique that was able to improve the productivity of two benchmark fed-batch fermentation simulators: Saccharomyces cerevisiae and Penicillin production. In developing the proposed technique, several important challenges needed to be addressed. For example, the technique relied on a linear Multiway Partial Least Squares (MPLS) model that had to adapt from one operating region to another as productivity increased in order to estimate the end-point quality of each batch accurately. The proposed optimisation technique utilises a Quadratic Programming (QP) formulation to calculate the Manipulated Variable Trajectory (MVT) from one batch to the next. The main advantage of the proposed technique compared with other published approaches was the increase in yield and the improved convergence towards an optimal MVT. Validity constraints were also included in the batch-to-batch optimisation to restrict the QP calculations to the space described only by useful predictions of the MPLS model. The results from experiments on the two simulators showed that the validity constraints slowed the rate of convergence of the optimisation technique and in some cases resulted in a slight reduction in final yield. However, the introduction of the validity constraints did improve the consistency of the batch optimisation. Another important contribution of this thesis was a series of experiments implementing a variety of smoothing techniques used in MPLS modelling combined with the proposed batch-to-batch optimisation technique. From the results of these experiments, it was clear that the MPLS model prediction accuracy did not significantly improve with these smoothing techniques. However, the batch-to-batch optimisation technique did show improvements when filtering was implemented.
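A heavily simplified, hedged sketch of the batch-to-batch idea: a PLS model maps an unfolded manipulated-variable trajectory (MVT) to end-point yield, and the next MVT is found by maximising the predicted yield inside a trust region around the last batch. It uses scikit-learn's PLSRegression with a generic SLSQP solver rather than the thesis's MPLS/QP formulation with validity constraints; all dimensions and bounds are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Toy history: 40 past batches, each MVT unfolded into 60 values, plus end-point yield
n_batches, n_mvt = 40, 60
U = rng.uniform(0.0, 1.0, size=(n_batches, n_mvt))
true_w = np.linspace(0.2, 1.0, n_mvt)
yield_ = U @ true_w + 0.5 * rng.standard_normal(n_batches)

pls = PLSRegression(n_components=3).fit(U, yield_)

u_last = U[-1]                 # MVT of the most recent batch
trust_radius = 0.05            # limit batch-to-batch change (crude validity surrogate)

def neg_predicted_yield(u):
    return -float(pls.predict(u.reshape(1, -1)).ravel()[0])

res = minimize(
    neg_predicted_yield,
    x0=u_last,
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n_mvt,                             # actuator limits
    constraints=[{"type": "ineq",                            # ||u - u_last||^2 <= r^2
                  "fun": lambda u: trust_radius**2 - np.sum((u - u_last) ** 2)}],
)
u_next = res.x                 # proposed MVT for the next batch
```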
159

Multi-scale Feature-Preserving Smoothing of Images and Volumes on GPU

Jibai, Nassim 24 May 2012 (has links)
Two-dimensional images and three-dimensional volumes have become a staple ingredient of our artistic, cultural, and scientific appetite. Images capture and immortalize an instance, such as a natural scene, through a photographic camera. Moreover, they can capture details inside biological subjects through the use of CT (computed tomography) scans, X-rays, ultrasound, etc. Three-dimensional volumes of objects are also of high interest in medical imaging, engineering, and the analysis of cultural heritage. They are produced using tomographic reconstruction, a technique that combines a large series of 2D scans captured from multiple views. Typically, penetrative radiation is used to obtain each 2D scan: X-rays for CT scans, radio-frequency waves for MRI (magnetic resonance imaging), electron-positron annihilation for PET scans, etc. Unfortunately, their acquisition is influenced by noise caused by different factors. Noise in two-dimensional images can be caused by low-light illumination, electronic defects, a low dose of radiation, and mispositioning of the tool or object. Noise in three-dimensional volumes also comes from a variety of sources: the limited number of views, lack of sensor sensitivity, high contrasts, the reconstruction algorithms, etc. The constraint that data acquisition be noiseless is unrealistic. It is desirable to reduce, or eliminate, noise at the earliest stage in the application. However, removing noise while preserving the sharp features of an image or volume object remains a challenging task. We propose a multi-scale method to smooth 2D images and 3D tomographic data while preserving features at a specified scale. Our algorithm is controlled using a single user parameter: the minimum scale of features to be preserved. Any variation that is smaller than the specified scale is treated as noise and smoothed, while discontinuities such as corners, edges and detail at a larger scale are preserved. We demonstrate that our smoothed data produce clean images and clean contour surfaces of volumes using standard surface-extraction algorithms. In addition, we compare our results with those of previous approaches. Our method is inspired by anisotropic diffusion. We compute our diffusion tensors from the local continuous histograms of gradients around each pixel in images and around each voxel in volumes. Since our smoothing method runs entirely on the GPU, it is extremely fast.
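For orientation, the sketch below implements plain scalar Perona-Malik anisotropic diffusion on a 2D image in NumPy. It conveys the underlying idea (smooth within regions, diffuse weakly across strong gradients) but is not the authors' tensor-valued, gradient-histogram, GPU method.

```python
import numpy as np

def perona_malik(img, n_iter=50, kappa=0.1, dt=0.2):
    """Scalar Perona-Malik diffusion: smooth flat regions, preserve strong edges."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping conductance
    for _ in range(n_iter):
        # Neighbour differences with replicated borders
        p = np.pad(u, 1, mode="edge")
        dn = p[:-2, 1:-1] - u    # north
        ds = p[2:, 1:-1] - u     # south
        de = p[1:-1, 2:] - u     # east
        dw = p[1:-1, :-2] - u    # west
        # Conductance is small where the gradient is large, so edges diffuse little
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(5)
img = np.zeros((128, 128))
img[:, 64:] = 1.0                                  # a sharp vertical edge
noisy = img + 0.1 * rng.standard_normal(img.shape)
smoothed = perona_malik(noisy)                     # noise reduced, edge preserved
```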
160

Kernel smoothing of rainfall data over Northeast Brazil

BARBOSA, Nyedja Fialho Morais 22 March 2013 (has links)
Northeastern Brazil has great climatic diversity and is considered a very complex region, attracting the interest of researchers from around the world. Rainfall over this region is regarded as seasonal, behaving most intensely over three internal zones of the region during different periods of the year, each lasting about three months, and it is strongly influenced by the incidence of El Niño, La Niña and other phenomena acting over the tropical Pacific and Atlantic ocean basins. In this work the mathematical-computational kernel smoothing interpolation technique was applied to rainfall data over Northeastern Brazil collected between 1904 and 1998 at 2,283 conventional weather stations located in all states of the Northeast. The computations were performed on the GPU cluster "Cluster Neumann" of the Graduate Program in Biometry and Applied Statistics of the Department of Statistics and Informatics at UFRPE, using the software "Kernel" written in C and CUDA. This tool made it possible to interpolate more than 26 million rainfall measurements over the entire Northeast, to generate maps of rainfall intensity over the whole region, to make estimates in areas with missing data, and to compute overall and seasonal precipitation statistics for the Northeast. From the interpolations it was possible to identify, within the period studied, the driest and wettest years and the spatial distribution of rainfall in each month, as well as the characteristics of rainfall during El Niño and La Niña episodes.
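A minimal sketch of the kind of spatial kernel smoothing described above: scattered station measurements are interpolated onto a regular grid with a Gaussian (Nadaraya-Watson) kernel. The station coordinates, rainfall values and bandwidth are invented, and no GPU/CUDA acceleration is involved.

```python
import numpy as np

def kernel_smooth_grid(lon, lat, value, grid_lon, grid_lat, bandwidth=0.5):
    """Nadaraya-Watson smoothing of scattered (lon, lat, value) points onto a grid."""
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    # Squared distances between every grid node and every station
    d2 = (glon[..., None] - lon) ** 2 + (glat[..., None] - lat) ** 2
    w = np.exp(-0.5 * d2 / bandwidth**2)          # Gaussian kernel weights
    return (w * value).sum(axis=-1) / w.sum(axis=-1)

rng = np.random.default_rng(6)
# Illustrative "stations": longitude, latitude and monthly rainfall (mm)
lon = rng.uniform(-48.0, -35.0, 300)
lat = rng.uniform(-18.0, -1.0, 300)
rain = 100.0 + 50.0 * np.sin(lon / 3.0) * np.cos(lat / 4.0) + 10.0 * rng.standard_normal(300)

grid_lon = np.linspace(-48.0, -35.0, 130)
grid_lat = np.linspace(-18.0, -1.0, 170)
rain_map = kernel_smooth_grid(lon, lat, rain, grid_lon, grid_lat, bandwidth=1.0)
# rain_map has shape (len(grid_lat), len(grid_lon)) and can be plotted as a rainfall map
```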
