  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

A Framework for Participatory Sensing Systems

Mendez Chaves, Diego 01 January 2012
Participatory sensing (PS) systems are an emerging sensing paradigm in which cellular users cooperate to collect measurements. Because of the spatio-temporal granularity a PS system can provide, it is now possible to detect and analyze events that occur at different scales, at low cost. While PS systems have attractive characteristics, they also create new problems. Since the measuring devices are cheap and in the hands of users, PS systems face several design challenges: the poor accuracy and high failure rate of the sensors, the possibility of malicious users tampering with the data, violations of user privacy, the need for methods that encourage user participation, and the effective visualization of the data. This dissertation makes four main contributions toward solving these challenges. First, it presents a framework to guide the design and implementation of PS applications that considers all of these aspects. The framework consists of five modules: sample size determination, data collection, data verification, data visualization, and density map generation. The remaining contributions map one-to-one to three of these modules: data verification, data visualization, and density maps. Data verification, in the context of PS, is the process of detecting and removing spatial outliers so that the variables of interest can be properly reconstructed. A new algorithm for spatial outlier detection and removal is proposed, implemented, and tested. This hybrid neighborhood-aware algorithm accounts for the uneven spatial density of the users, the number of malicious users, the level of conspiracy among them, and inaccurate or malfunctioning sensors. The experimental results show that the proposed algorithm performs as well as the best estimator while considerably reducing the execution time.
The problem of data visualization in PS applications is also of special interest. A typical PS application generates multivariate time-space series with many gaps in time and space. Considering this, a new method is presented based on the kriging technique combined with Principal Component Analysis and Independent Component Analysis. Additionally, a new technique to interpolate data in time and space, better suited to PS systems, is proposed. The results indicate that the accuracy of the estimates improves with the amount of data used, i.e., one variable, multiple variables, and space and time data. The results also clearly show the advantage of a PS system over a traditional measuring system in terms of the precision and spatial resolution of the information provided to the users. One key challenge in PS systems is determining the locations and number of users from which to obtain samples, so that the variables of interest can be accurately represented with few participants. To address this challenge, the use of density maps, a technique based on the current estimates of the variable, is proposed. The density maps are then used by the incentive mechanism to encourage the participation of the users indicated in the map. The experimental results show that the density maps greatly improve the quality of the estimates while keeping the total number of users in the system low and stable. P-Sense, a PS system that monitors pollution levels by integrating gas and environmental sensors with a cell phone, has been implemented and tested, and serves as a validation example for all the contributions presented here.
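The dissertation's hybrid neighborhood-aware algorithm is not reproduced in this record, but the core idea of data verification by neighborhood-based spatial outlier detection can be sketched as follows. All names, the median/MAD rule, and the thresholds below are illustrative assumptions, not the author's method:

```python
import math

def spatial_outliers(points, k=3, threshold=3.0, min_dev=0.5):
    """Flag points whose value deviates strongly from the median of their
    k nearest spatial neighbours (median/MAD rule with an absolute floor).
    points: list of (x, y, value). Returns indices of flagged points."""
    flagged = []
    for i, (xi, yi, vi) in enumerate(points):
        # neighbours of point i, ordered by distance
        by_dist = sorted(
            (math.hypot(xi - xj, yi - yj), vj)
            for j, (xj, yj, vj) in enumerate(points) if j != i
        )
        vals = [v for _, v in by_dist[:k]]
        med = sorted(vals)[len(vals) // 2]
        mad = sorted(abs(v - med) for v in vals)[len(vals) // 2]
        if abs(vi - med) > max(threshold * mad, min_dev):
            flagged.append(i)
    return flagged
```

A reading of 100 surrounded by readings near 1 would be flagged, while readings consistent with their neighbourhood pass; a real PS deployment would additionally weight by sensor trust and user density.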
152

試錐調査データの地球統計学的解析による堆積岩域での高塩分地下水系の解明 / Clarification of saline groundwater system in sedimentary rock area by geostatistical analyses of drilling investigation data

呂, 磊 23 March 2015
Kyoto University (京都大学) / 0048 / New system, doctoral program / Doctor of Engineering / 甲第18966号 / 工博第4008号 / 新制||工||1617 / 31917 / Department of Urban Management, Graduate School of Engineering, Kyoto University / (Chief examiner) Prof. Katsuaki Koike; Prof. Tsuyoshi Ishida; Assoc. Prof. Yoshitada Mito / Qualifies under Article 4, Paragraph 1 of the Degree Regulations
153

Erdvės - laiko duomenų statistinis modeliavimas, pagrįstas laiko eilučių parametrų erdviniu interpoliavimu / Statistical modelling of spatio-temporal data based on spatial interpolation of time series parameters

Paulionienė, Laura 17 January 2014
The problem of modeling space-time data is analysed. Spatial data sets are often relatively small, and the points at which observations are taken are located irregularly. When solving a spatial task, the goal is usually to interpolate or to estimate the spatial mean; time-series data, in turn, are usually used to predict future values. Space-time tasks combine both types of problem. Several original methods for modeling spatial time series are proposed. The proposed methods first analyse the univariate time series at each location and, after removing the temporal dependence, measure the spatial dependence in the time-series residuals. The aim of this dissertation is to build a model that predicts the value of the attribute at a new, unobserved location at a new time moment; such a model is based on spatial interpolation of the estimated time-series parameters.
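The two-stage approach described above — fit a time-series model at each observed site, then spatially interpolate the fitted parameters to an unobserved site — can be sketched minimally as follows. The AR(1) model and inverse-distance weighting are simplifying assumptions for illustration; the dissertation's own interpolator may differ (e.g. kriging of the parameters):

```python
import math

def fit_ar1(series):
    """Least-squares AR(1) fit: x_t = c + phi * x_{t-1} + e_t.
    Returns (c, phi)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    return my - phi * mx, phi

def idw(params_at_sites, target, power=2.0):
    """Inverse-distance-weighted interpolation of a fitted parameter
    (e.g. phi) from observed sites to an unobserved target location."""
    num = den = 0.0
    for (sx, sy), p in params_at_sites:
        d = math.hypot(target[0] - sx, target[1] - sy)
        if d == 0.0:
            return p
        w = d ** -power
        num += w * p
        den += w
    return num / den
```

With `phi` interpolated to the new location, a forecast at a new time moment is just the AR(1) recursion started from the interpolated state.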
155

Data Driven Selective Sensing for 3D Image Acquisition

Curtis, Phillip 26 November 2013
It is well established that acquiring large amounts of range data with vision sensors quickly leads to serious data-management challenges: processing capabilities become saturated, preventing autonomous systems from making full use of the available information when making decisions. While sub-sampling offers a naïve way to reduce dataset size after acquisition, it does not exploit the knowledge in already-acquired data to selectively and dynamically drive the acquisition process toward the most significant regions of a scene, which in 3D imaging are generally characterized by variations in depth and surface shape. This thesis develops two formal improvement measures: the first, based on surface meshes and Ordinary Kriging, focuses on improving scene accuracy; the second, based on probabilistic occupancy grids, focuses on improving scene coverage. Three selection processes that automatically choose which locations within a range sensor's field of view to acquire next are then proposed on top of these measures. The first two selection processes each use one of the improvement measures. The third combines both measures to balance accuracy of knowledge about the scene against coverage of the scene. The proposed algorithms mainly target applications using random-access range sensors, defined as sensors that can acquire depth measurements at a specified location within their field of view. The algorithms apply to estimating the improvement and selecting points from a single point of view, with the purpose of guiding the random-access sensor toward locations it can acquire. 
The framework is developed to be independent of the range-sensing technology used, and is validated with range data from several scenes acquired by many different sensors employing various sensing technologies and configurations. The experimental results of the proposed selection processes are compared against those of a random sampling process and of a neural-gas selective-sensing algorithm.
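The third selection process, which counterbalances accuracy against coverage, can be sketched as a greedy pick over a grid of candidate acquisition cells. The convex-combination weight `alpha` and the function name are illustrative assumptions; the thesis's actual combination rule is not given in this record:

```python
def select_next(accuracy_gain, coverage_gain, alpha=0.5):
    """Pick the (row, col) cell maximizing a convex combination of the two
    improvement measures: alpha weights accuracy gain, (1 - alpha)
    weights coverage gain. Both inputs are equal-sized 2D grids."""
    best, best_score = None, float("-inf")
    for r in range(len(accuracy_gain)):
        for c in range(len(accuracy_gain[0])):
            s = alpha * accuracy_gain[r][c] + (1 - alpha) * coverage_gain[r][c]
            if s > best_score:
                best, best_score = (r, c), s
    return best
```

Sliding `alpha` toward 1 reproduces the accuracy-only selection process, and toward 0 the coverage-only one, which is the counterbalancing behaviour the abstract describes.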
156

Bayesian Analysis for Large Spatial Data

Park, Jincheol August 2012
The Gaussian geostatistical model has been widely used in Bayesian modeling of spatial data. A core difficulty of this model is inverting the n x n covariance matrix, where n is the sample size: the computational complexity of matrix inversion grows as O(n^3). This difficulty affects almost all statistical inference approaches for the model, such as kriging and Bayesian modeling. In Bayesian inference, the inverse of the covariance matrix must be evaluated at each iteration of the posterior simulation, so the Bayesian approach becomes infeasible for large n given current computational power. In this dissertation, we propose two approaches to address this computational issue: the auxiliary lattice model (ALM) approach and the Bayesian site selection (BSS) approach. The key feature of ALM is a latent regular lattice that links a Gaussian Markov random field (GMRF) with the Gaussian field (GF) of the observations; the GMRF on the auxiliary lattice approximates the Gaussian process. What distinguishes ALM from other approximations is that it avoids the matrix inversion entirely by using the analytical likelihood of the GMRF. Its computational complexity is attractive, increasing linearly with the sample size. The second approach, BSS, reduces the dimension of the data through a smart selection of a representative subset of the observations. BSS first splits the observations into two parts: those near the target prediction sites (part I) and the remainder (part II). Then, treating the part I observations as the response variable and those in part II as explanatory variables, BSS forms a regression model that relates all observations through a conditional likelihood derived from the original model. 
The dimension of the data can then be reduced by applying a stochastic variable selection procedure to the regression model, which retains only a subset of the part II data as explanatory data. BSS also provides more insight into the underlying true Gaussian process, as it works directly on the original process without any approximation. The practical performance of ALM and BSS is illustrated with simulated data and real data sets.
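The O(n^3) bottleneck the abstract refers to can be made concrete with a small sketch: building an exponential covariance matrix over sample locations and solving a dense linear system in it by Gaussian elimination — the operation that would have to run at every MCMC iteration. This is purely illustrative plain Python, unrelated to the dissertation's code:

```python
import math

def exp_cov(locs, sigma2=1.0, rng=1.0):
    """Exponential covariance matrix C_ij = sigma2 * exp(-d_ij / rng)
    for 1D sample locations."""
    n = len(locs)
    return [[sigma2 * math.exp(-abs(locs[i] - locs[j]) / rng)
             for j in range(n)] for i in range(n)]

def solve(A, b):
    """Gaussian elimination with partial pivoting: O(n^3) work,
    the cost that dominates posterior simulation for large n."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x
```

Doubling n multiplies this solve's cost by roughly eight, which is why ALM's linear-in-n likelihood and BSS's subset selection matter.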
157

Contributions to computer experiments and binary time series

Hung, Ying January 2008
Thesis (Ph.D.)--Industrial and Systems Engineering, Georgia Institute of Technology, 2008. / Committee Chair: C. F. Jeff Wu; Committee Co-Chair: Roshan Joseph Vengazhiyil; Committee Member: Kwok L. Tsui; Committee Member: Ming Yuan; Committee Member: Shreyes N. Melkote
158

Metodologia geoestatística para dados com tendência regionalizada / Geostatistical methodology for data with a regionalized trend

Nardi, Luiz Alberto Amaral. January 2015
Advisor: Paulo Milton Barbosa Landim / Committee: Vilma Mayumi Tachibana / Committee: Alessandra Fagioli da Silva / Abstract (translated from the Portuguese; no English abstract in the original record): Georges Matheron, building on his own theory of regionalized variables, developed Geostatistics, initially applied to mining but now widely used in many fields whose problems have a strong spatial component. In this methodology, the most common estimator is Ordinary Kriging, which takes into account the variance-covariance structure among the samples, which in this case depends on the geographic location of the sampled sites. Using this technique presupposes that the random function Z(x) describing the phenomenon under study exhibits second-order stationarity or satisfies the intrinsic hypothesis. In both cases this implies that E[Z(x)] = m, with m unknown but constant. If this condition is not satisfied and the random function Z(x) can be written as the sum Z(x) = m(x) + Y(x), where m(x) is the expected value of Z(x), expressible as a low-degree polynomial, and Y(x) is the random part, then Universal Kriging, of which Ordinary Kriging is a particular case, must be used. Universal Kriging, however, has a methodological difficulty: the semivariogram of the residuals is assumed to be known. One way around this problem is Residual Kriging. The objective of this dissertation is to present, though not rigorously, the mathematical formalism underlying kriging theory, the issues surrounding Residual Kriging, and the criticisms made of this method. 
Finally, two examples taken from the literature are presented in which the phenomenon under study does not meet the conditions required for Ordinary Kriging and in which, therefore, Universal Kriging must be applied, carried out in these cases through Residual Kriging / Master's
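The Ordinary Kriging estimator the abstract builds on can be sketched compactly: semivariances between samples form a bordered linear system whose Lagrange multiplier enforces unit-sum weights, which is exactly the E[Z(x)] = m, m unknown but constant, assumption. The exponential semivariogram with zero nugget is an assumed model for illustration, not taken from the dissertation:

```python
import math

def gamma(h, sill=1.0, rng=1.0):
    """Exponential semivariogram, no nugget (assumed model)."""
    return sill * (1.0 - math.exp(-h / rng))

def ordinary_kriging(samples, target):
    """Ordinary Kriging prediction at `target` from (x, y, value) samples.
    Solves the bordered kriging system; the last unknown is the
    Lagrange multiplier enforcing sum(weights) == 1."""
    n = len(samples)
    A = [[gamma(math.hypot(samples[i][0] - samples[j][0],
                           samples[i][1] - samples[j][1]))
          for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [gamma(math.hypot(target[0] - s[0], target[1] - s[1]))
         for s in samples] + [1.0]
    w = _solve(A, b)
    return sum(w[i] * samples[i][2] for i in range(n))

def _solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x
```

With zero nugget the predictor is exact at the data points, which is the behaviour Universal/Residual Kriging must preserve while also modeling the drift m(x).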
159

Desenvolvimento de Metamodelos Kriging e otimização de uma planta de tratamento de efluentes (BSM2). / Development of Metamodels Kriging and optimization of an effluent treatment plant (BSM2).

COSTA, Adriana Barbosa da. 09 March 2018
A continuous study of wastewater treatment and effluent disposal is necessary in order to cope with increasingly stringent environmental regulations. Wastewater treatment plants can be considered highly non-linear systems, owing to disturbances and to the interaction of a considerable number of process variables. In this context, the study, optimization, and control of these plants are essential for proper operation of the process within the requirements. Several optimization methods are proposed in the literature, and their implementation in engineering applications can be significantly improved by using metamodels that represent the rigorous process model, built from computational data. The present work develops Kriging metamodels for a wastewater treatment process. To this end, the steps of point sampling (via Latin Hypercube Sampling), parameter estimation, and validation are performed. The proposed methodology is based on generating computational data from the rigorous model of Benchmark Simulation Model No. 2, implemented in Simulink®, and on optimizing the process using the Kriging metamodels. The models obtained from rigorous process data show high accuracy and minimize the computational effort of the optimization. Sequential Quadratic Programming and a Genetic Algorithm are used for the optimization task, as well as for building the Real-Time Optimization model. 
The results achieved on the benchmark model demonstrate the potential of the proposed methodology to minimize the process cost while satisfying the effluent constraints for the treated wastewater.
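The Latin Hypercube Sampling step used to choose metamodel training points can be sketched as follows: each dimension's range is cut into n equal strata, each stratum is sampled exactly once, and the strata are shuffled independently per dimension. This is a generic LHS sketch, not the dissertation's implementation:

```python
import random

def latin_hypercube(n, bounds, seed=0):
    """n samples in len(bounds) dimensions. bounds is a list of
    (lo, hi) pairs; each dimension is stratified into n equal bins,
    each bin hit exactly once, bin order shuffled per dimension."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        strata = list(range(n))
        rng.shuffle(strata)
        w = (hi - lo) / n
        cols.append([lo + (s + rng.random()) * w for s in strata])
    return list(zip(*cols))
```

Compared with plain random sampling, this guarantees every marginal range is covered even for small n, which is why it is the standard design for fitting Kriging metamodels to expensive simulators.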
160

Variabilidade espacial da condutividade hidráulica e da permeabilidade ao ar em função dos conteúdos de água e ar no solo / Spatial variability of hydraulic conductivity and air permeability as a function of soil water and air contents

Alexsandro dos Santos Brito 16 July 2010
Knowledge of the soil properties directly related to crop yield is an incessant pursuit. Properties strongly correlated with the soil pore space are especially important because they act directly on plant development: it is through the pore space that water and air move toward the plant rhizosphere. Each soil type, and even each pedological horizon, has a characteristic pore geometry that makes it easier or harder to transport water and, consequently, air. Since the soil properties related to water and air transport are highly variable in space, the objective of this work was to study the spatial variability of (a) the parameters of the equation relating hydraulic conductivity to soil water content and (b) the soil air permeability. To this end, a field experiment was set up on a medium-texture Yellow Red Latosol, with 60 access tubes for a neutron probe installed in a regular 5 x 5 m grid, in order to measure the soil water content over time at depths of 0.20, 0.40, 0.60, and 0.80 m and then determine the hydraulic conductivity by the instantaneous profile method. The soil air permeability (ka) was determined in the laboratory by the decreasing-pressure method, using undisturbed soil samples equilibrated at tensions of 6 and 10 kPa. With these soil attributes in hand, the spatial variability was studied. The highest values of saturated hydraulic conductivity and intrinsic air permeability were found in the portions of the experimental area with the lowest elevations, indicating an influence of terrain morphology on soil structure, mainly because those portions contain more clay. 
The prediction maps show that the saturated hydraulic conductivity is directly correlated with the saturated soil water content and inversely correlated with the slope of the line of the logarithm of hydraulic conductivity as a function of soil water content. The prediction maps of air permeability were similar in the location of the highest and lowest values, but the area with higher values of ka(m = -10 kPa) increased relative to ka(m = -6 kPa), because the draining of smaller pores increases pore connectivity and, consequently, air flow.
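Spatial-variability studies of this kind start from the empirical semivariogram of the gridded measurements, gamma(h) = mean of 0.5*(v_i - v_j)^2 over point pairs separated by roughly h. A minimal sketch (the lag binning by tolerance is an assumed convention, not taken from the thesis):

```python
import math

def empirical_semivariogram(points, lags, tol):
    """Empirical semivariogram of (x, y, value) points: for each lag h,
    average 0.5*(v_i - v_j)^2 over pairs whose separation distance is
    within tol of h. Returns one gamma estimate per lag (NaN if empty)."""
    out = []
    for h in lags:
        acc, cnt = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                (xi, yi, vi), (xj, yj, vj) = points[i], points[j]
                d = math.hypot(xi - xj, yi - yj)
                if abs(d - h) <= tol:
                    acc += 0.5 * (vi - vj) ** 2
                    cnt += 1
        out.append(acc / cnt if cnt else float("nan"))
    return out
```

Fitting a variogram model to these estimates is what feeds the kriging prediction maps the abstract describes.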
