11

DESENVOLVIMENTO DE UM SISTEMA WEB PARA ESTIMATIVA NUMÉRICA DE DADOS METEOROLÓGICOS DO RIO GRANDE DO SUL / DEVELOPMENT OF A WEB SYSTEM FOR NUMERICAL ESTIMATE OF METEOROLOGICAL DATA FROM RIO GRANDE DO SUL

Ferraz, Rafael Camargo 19 January 2010 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Climate influences a wide variety of human activities, and real-time access to climatic data provides information that is fundamental to many of them, especially agriculture. Few weather stations are currently in operation, which leaves many regions worldwide short of information. Given that the state of Rio Grande do Sul bases a large part of its economy on agriculture, and given the relevance of climatological information, this study developed a web system for numerical estimation of meteorological data for the state, based on the automatic surface weather stations of the National Institute of Meteorology (INMET), in order to make data available for regions that have no stations. Inverse distance weighting was used as the interpolation model, applying exponents from 0 to 5 and then comparing them through correlation and regression coefficients and a performance index. The web system was developed with the PHP, HTML, and JavaScript programming languages, backed by a MySQL database. Macromedia Dreamweaver 8.0 was used for web programming and HeidiSQL to manage the database. Of the nine variables analysed, only four showed very good performance: temperature, relative humidity, atmospheric pressure, and dew point. The interpolation model with exponent 5 performed best for all four variables. After the best method was defined, a MySQL database called SWIM (Meteorological Interpolation Web System) was created and the web system was developed on top of it, giving the application speed, security, and reliability.
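The inverse distance weighting model the study tunes (exponents 0 through 5) is simple enough to sketch directly. This is a generic IDW sketch in Python, not the thesis's PHP implementation, and the station layout below is invented for illustration.

```python
import numpy as np

def idw(x, y, z, xi, yi, p=2):
    """Inverse distance weighting: estimate z at (xi, yi) from scattered
    samples (x, y, z) with weights 1/d**p. p = 0 reduces to a plain mean;
    larger p gives more weight to the nearest stations."""
    d = np.hypot(np.asarray(x) - xi, np.asarray(y) - yi)
    if np.any(d == 0):                     # query point sits on a station
        return np.asarray(z)[np.argmin(d)]
    w = 1.0 / d ** p
    return float(np.sum(w * z) / np.sum(w))

# Toy exponent search in the spirit of the study (which found p = 5 best
# for its four well-performing variables): invented stations, field z = x + y.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 30), rng.uniform(0, 10, 30)
z = x + y
for p in range(6):
    print(p, round(idw(x, y, z, 5.0, 5.0, p=p), 3))
```

As p grows, the estimate approaches the value at the nearest station; the study's performance indices arbitrate that trade-off per variable.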
12

INTERPOLATING HYDROLOGIC DATA USING LAPLACE FORMULATION

Tianle Xu (10802667) 14 May 2021 (has links)
Spatial interpolation techniques play an important role in hydrology, as many point observations need to be interpolated to create continuous surfaces. Despite the availability of several tools and methods for interpolating data, not all of them work consistently for hydrologic applications. One technique, the Laplace equation, which is used in hydrology for creating flownets, has rarely been used for interpolating hydrologic data. The objective of this study is to examine the efficiency of the Laplace formulation (LF) in interpolating hydrologic data and to compare it with other widely used methods such as inverse distance weighting (IDW), natural neighbor, and kriging. The comparison is performed quantitatively using root mean square error (RMSE), visually in terms of producing reasonable surfaces, and computationally in terms of ease of operation and speed. Data on surface elevation, river bathymetry, precipitation, temperature, and soil moisture are used for different areas in the United States. The RMSE results show that LF performs better than IDW and is comparable to the other methods in accuracy. LF is easy to use, as it requires fewer input parameters than IDW and kriging. Computationally, LF is comparable to the other methods in speed when the datasets are not large. Overall, LF offers a robust alternative to existing methods for interpolating various hydrologic data. Further work is required to improve its computational efficiency for large data sizes and to determine the effects of different cell sizes.
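The core idea of the Laplace formulation can be sketched in a few lines: hold observed cells fixed and relax every other cell toward the mean of its neighbors until the grid satisfies Laplace's equation. This is a minimal Jacobi-iteration sketch on a small grid, not the thesis implementation, and the boundary treatment (mirror padding, i.e. zero-flux edges) is an assumption.

```python
import numpy as np

def laplace_fill(grid, known, iters=5000):
    """Interpolate unknown cells by relaxing Laplace's equation: each
    unknown cell is repeatedly replaced by the mean of its four neighbors
    while observed cells stay fixed (Dirichlet conditions)."""
    grid = np.asarray(grid, dtype=float)
    u = np.where(known, grid, grid[known].mean())   # seed unknowns with the data mean
    for _ in range(iters):
        p = np.pad(u, 1, mode="edge")               # mirror (zero-flux) boundary
        avg = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
        u = np.where(known, grid, avg)
    return u

# Two observed columns (values 0 and 4); LF fills the gap with a smooth
# surface, here converging to a linear ramp between the observations.
g = np.zeros((3, 5)); g[:, 4] = 4.0
mask = np.zeros((3, 5), dtype=bool); mask[:, 0] = mask[:, 4] = True
print(np.round(laplace_fill(g, mask), 2))
```

The single knob here is the iteration count, which illustrates the abstract's point that LF needs fewer input parameters than IDW's exponent or kriging's variogram; it also shows why large grids get expensive, since cost scales with grid size times iterations.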
13

Methods for Characterizing Groundwater Resources with Sparse In-Situ Data

Nishimura, Ren 14 June 2022 (has links)
Groundwater resources must be accurately characterized in order to be managed sustainably. Because of the cost of installing monitoring wells and the challenges of collecting and managing in-situ data, groundwater data is sparse in space and time, especially in developing countries. In this study we analyzed long-term groundwater storage changes with limited time-series data in which each well had only one groundwater measurement in time. We developed methods to synthetically create time series of water table elevation (WTE) by clustering wells, using either a uniform grid or constrained k-means clustering, to form pseudo wells. The pseudo wells, carrying the WTE values of their member wells, were temporally and spatially interpolated to analyze groundwater changes. We applied the methods to the Beryl-Enterprise aquifer in Utah, where other researchers had previously quantified the groundwater storage depletion rate, and the methods yielded a similar depletion rate. The methods were then applied to the southern region of Niger, and the result showed a groundwater storage change that partially matched the trend calculated from GRACE data. On a data set so limited that regression or machine learning approaches did not work, our method captured the groundwater storage trend correctly, and it can be used in areas where in-situ data is highly limited in time and space.
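The pseudo-well construction can be illustrated for the uniform-grid variant: bin the wells into cells, place one pseudo well at the mean position of each occupied cell, and merge the members' single measurements into one synthetic time series. A simplified sketch under those assumptions; the thesis also uses constrained k-means clustering, which is not shown here.

```python
import numpy as np

def pseudo_wells(x, y, wte, year, cell=1.0):
    """Group single-measurement wells into pseudo wells on a uniform grid:
    each occupied cell becomes one pseudo well at the mean position of its
    member wells, whose synthetic series collects the members' (year, WTE)
    pairs in chronological order."""
    keys = [(int(np.floor(xi / cell)), int(np.floor(yi / cell)))
            for xi, yi in zip(x, y)]
    cells = {}
    for k, xi, yi, w, t in zip(keys, x, y, wte, year):
        c = cells.setdefault(k, {"xs": [], "ys": [], "series": []})
        c["xs"].append(xi); c["ys"].append(yi); c["series"].append((t, w))
    return [(float(np.mean(c["xs"])), float(np.mean(c["ys"])),
             sorted(c["series"])) for c in cells.values()]

# Three wells, one measurement each; the first two share a grid cell and
# merge into a pseudo well with a two-point WTE series.
print(pseudo_wells([0.1, 0.2, 3.5], [0.1, 0.2, 3.5],
                   [10.0, 8.0, 5.0], [2000, 2010, 2005], cell=1.0))
```

The cell size trades spatial resolution against series length: larger cells give longer, more trend-worthy synthetic series at coarser positions.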
14

Bayesian Hierarchical Space-Time Clustering Methods

Thomas, Zachary Micah 08 October 2015 (has links)
No description available.
15

Spatial Interpolation Enables Normative Data Comparison in Gaze-Contingent Microperimetry

Denniss, Jonathan, Astle, A.T. 09 September 2016 (has links)
Purpose: To demonstrate methods that enable visual field sensitivities to be compared with normative data without restriction to a fixed test pattern. Methods: Healthy participants (n = 60, aged 19–50) undertook microperimetry (MAIA-2) using 237 spatially dense locations up to 13° eccentricity. Surfaces were fit to the mean, variance, and 5th percentile sensitivities. Goodness-of-fit was assessed by refitting the surfaces 1000 times to the dataset and comparing estimated and measured sensitivities at 50 randomly excluded locations. A leave-one-out method was used to compare individual data with the 5th percentile surface. We also considered cases with unknown fovea location by adding error, sampled from the distribution of relative fovea–optic disc positions, to the test locations and comparing the shifted data to the fixed surface. Results: The root mean square (RMS) difference between estimated and measured sensitivities was less than 0.5 dB for the mean surface and less than 1.0 dB for the 5th percentile surface. RMS differences were greater for the variance surface: median 1.4 dB, range 0.8 to 2.7 dB. Across all participants, 3.9% (interquartile range, 1.8–8.9%) of sensitivities fell beneath the 5th percentile surface, close to the expected 5%. Positional error added to the test grid altered the number of locations falling beneath the 5th percentile surface by less than 1.3% in 95% of participants. Conclusions: Spatial interpolation of normative data enables comparison of sensitivity measurements from varied visual field locations. Conventional indices and probability maps familiar from standard automated perimetry can be produced. These methods may enhance the clinical use of microperimetry, especially in cases of nonfoveal fixation.
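The goodness-of-fit procedure (fit a surface, hold out locations, measure RMS difference at the excluded points) can be mimicked on synthetic data. The abstract does not specify the surface model, so a least-squares quadratic surface is used here purely as a stand-in, and the "hill of vision" field and noise level below are invented for illustration.

```python
import numpy as np

def fit_surface(x, y, z):
    """Least-squares quadratic surface z ≈ c0 + c1·x + c2·y + c3·x² + c4·xy + c5·y²
    (an illustrative stand-in for the paper's smooth normative surfaces)."""
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    c, *_ = np.linalg.lstsq(A, z, rcond=None)
    return c

def eval_surface(c, x, y):
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    return A @ c

# Refit with 50 of 237 locations held out and measure RMS difference at
# the excluded points, as in the paper's goodness-of-fit assessment.
rng = np.random.default_rng(1)
x, y = rng.uniform(-13, 13, 237), rng.uniform(-13, 13, 237)
z = 30 - 0.02 * (x**2 + y**2) + rng.normal(0, 0.3, x.size)  # synthetic sensitivities (dB)
hold = rng.choice(x.size, 50, replace=False)
keep = np.setdiff1d(np.arange(x.size), hold)
c = fit_surface(x[keep], y[keep], z[keep])
rmse = np.sqrt(np.mean((eval_surface(c, x[hold], y[hold]) - z[hold]) ** 2))
print(round(float(rmse), 2))
```

With the noise level chosen here the held-out RMS lands well under the sub-1.0 dB figures the paper reports, which is the sense in which interpolated surfaces can stand in for dense normative testing.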
16

A Comparison of Spatial Interpolation Techniques for Determining Shoaling Rates of the Atlantic Ocean Channel

Sterling, David L. 06 October 2004 (has links)
The United States Army Corps of Engineers (USACE) closely monitors the changing depths of navigation channels throughout the U.S. and Western Europe. The main issue with its surveying methodology is that the USACE surveys in linear cross sections, perpendicular to the channel direction. Depending on channel length and width, these cross sections are spaced 100–400 feet apart, which leaves large unmapped areas between the cross sections of a survey. Using a variety of spatial interpolation methods, depths for these unmapped areas were produced. The choice of spatial interpolator turned on which method adequately produced surfaces from large hydrographic survey data sets with the lowest prediction error. The data used for this research consisted of multibeam and singlebeam surveys. These surveys were taken as systematic linear cross sections that produced tens of thousands of data points. Nine interpolation techniques (inverse distance weighting, completely regularized spline, spline with tension, thin plate spline, multiquadratic spline, inverse multiquadratic spline, ordinary kriging, simple kriging, and universal kriging) were compared for their ability to accurately produce bathymetric surfaces of navigation channels. Each interpolation method was tested for its effectiveness in determining depths in "unknown" areas. Accuracy was tested through validation and cross validation of training and test data sets for a particular hydrographic survey. Using interpolation, grid surfaces were created at 15, 30, 60, and 90-meter resolution for each survey of the study site, the Atlantic Ocean Channel. These surfaces were used to produce shoaling amounts, expressed as volumes (yd³).
Because the Atlantic Ocean Channel is a large channel with small, gradual changes in depth, a comparison across grid resolutions was conducted to determine what difference, if any, exists between volumes calculated at the various resolutions. A comparison was also made between TIN model volume calculations and grid volume estimates. Volumes are used to determine how much shoaling has occurred and at what rate it is occurring in a navigation channel. Shoaling was calculated over the entire channel length. Volumes at varying grid resolutions were produced for the Atlantic Ocean Channel over a seven-year period from 1994 to 2001. Using randomly arranged test and training datasets, spline with tension and thin plate spline produced the lowest mean total error when interpolating singlebeam and multibeam hydrographic data, respectively. Thin plate spline and simple kriging produced the lowest mean total error in full cross validation testing of the entire singlebeam and multibeam hydrographic datasets, respectively. Volume analysis across grid resolutions indicates that finer resolutions provide volume estimates comparable to TIN modeling, the USACE's technique for determining sediment volume estimates; the coarser the resolution, the less similar the volume estimates are to those from TIN modeling. All grid resolutions indicate that the Atlantic Ocean Channel is shoaling. Using a plan depth of 53 feet, TIN modeling showed an average annual increase of 928,985 cubic yards of sediment from 1994 to 2001. / Master of Science
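Turning an interpolated depth grid into a shoaling volume above a plan depth is the standard cell-summation idea, sketched below. This is a generic sketch, not the USACE or thesis procedure, and the unit convention (depths in feet, cell size in yards, result in cubic yards) is an assumption chosen to match the yd³ volumes quoted above.

```python
import numpy as np

def shoal_volume(depth_grid, plan_depth, cell_size_yd):
    """Sediment volume above plan depth: for each grid cell shallower than
    the plan depth, accumulate (plan_depth - depth) times the cell area.
    Depths in feet, cell size in yards, so the result is in cubic yards
    (3 ft per yd converts the height term)."""
    shoal_ft = np.clip(plan_depth - np.asarray(depth_grid, float), 0, None)
    return float(shoal_ft.sum() * cell_size_yd ** 2 / 3.0)

# 2x2 grid, plan depth 53 ft, 3-yd cells: only cells shallower than 53 ft
# (50 ft and 52 ft) contribute shoaling.
print(shoal_volume([[50.0, 53.0], [54.0, 52.0]], 53.0, 3.0))
```

Because the grid depths come from interpolation, the choice of interpolator and of grid resolution propagates directly into these volumes, which is exactly the comparison the study makes against TIN-based estimates.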
17

Analyzing Residential Land Use Impacts along the Sheppard Subway Corridor

Lee, Matthew 04 1900 (has links)
Urban economic theory states that transit improvements produce travel time savings and consequently command higher rents, particularly in proximity to stations. This research uses the Sheppard subway corridor as a case study to test the established theories by measuring changes in residential intensification and property values (1) as a function of time, before and after construction, and (2) as a function of distance to subway stations. Two metrics are established to observe residential intensification and property value: Dwelling Density and Value Density, respectively. Dwelling Density is the number of dwellings in a property parcel divided by the parcel's area; Value Density is the total property value of a parcel divided by its area. Using property sales data for four analysis years (1991, 1996, 2001, and 2006) and ArcGIS, spatial interpolation surfaces are generated to visualize the changes on a geographical plane through time. Dwelling and Value Density scatterplots are generated by extracting values from the interpolated surfaces and computing their distances to the nearest subway station and to major development nodes. The interpolated surfaces show a strong increase in Dwelling and Value Density in North York Centre, which suggests that (1) planning policies succeeded in guiding residential growth, (2) there is a time lag before the full benefits of rapid transit construction are realized, and (3) there may be positive network effects associated with the completion of the Sheppard subway. The scatterplot results showed moderate change in Dwelling and Value Density around Bayview station and little change at the remaining stations (Bessarion, Don Mills, and Leslie), based on observations up to December 2006.
The results warrant a degree of optimism about Sheppard subway’s ability to attract residential intensification and raise property values, especially given that data was analyzed only up to four years after the subway corridor began revenue service. It is recommended that a similar methodology be performed at a later date when the corridor’s ridership and surrounding development reaches maturity. A preliminary forecasting exercise determined that Dwelling and Value Density will rise, particularly surrounding stations that have since demonstrated little change in residential land use.
19

Avaliação da disponibilidade de água em aquíferos por meio de análises espaço-temporais / Assessment of water availability in aquifers through spatio-temporal analyses

Brito Neto, Romildo Toscano de 21 December 2012 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / Increasing groundwater extraction for domestic, agricultural, and industrial supply frequently pushes exploitation beyond the natural recharge rate, resulting in decreasing volumes, deteriorating water quality, and soil degradation. The goal of this study is to assess water availability in the Texas (USA) portion of the Ogallala aquifer, where extraction has exceeded recharge. Fifty-three years of water level data from wells, from 1960 to 2012, were used. Fifty-three surfaces were created with each of two spatial interpolation techniques (kriging and spline), and cross-validation methods (leave-one-out and holdout) were used to compare the predicted water level surfaces with the well data. The predictions were similar and the results satisfactory; the spline technique proved simpler and easier for automated processing of multiple surfaces, and is recommended, although kriging gives higher accuracy. From the water level surfaces, the saturated thickness and volume were estimated pixel by pixel for the whole time series, which allowed temporal analyses to be performed on the results. A set of 492 water level variation time series was selected to analyze trends using the Mann-Kendall test, and the series were grouped by cluster analysis (using both hierarchical and k-means methods). From the temporal analyses, the spatial distribution of the results was verified, and it was observed that the clusters most likely to decrease coincide with the critical areas identified. Furthermore, the influence of agricultural activities on water level variation was established, showing a strong relationship between the two.
Finally, the results showed that over 53 years the total water volume was reduced by 33.9%. Fully 74.39% of the time series analyzed showed a decreasing trend in water levels, and the saturated thickness was observed to be gradually shrinking as well.
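The trend detection behind the "74.39% decreasing" figure is the Mann-Kendall test, which can be sketched compactly. This is a minimal version without the tie correction that a full implementation would include, applied here to an invented water-level series.

```python
import numpy as np

def mann_kendall(series):
    """Mann-Kendall trend test: S statistic (count of concordant minus
    discordant pairs) and its normal approximation Z, with continuity
    correction but no tie correction (a minimal sketch of the test)."""
    x = np.asarray(series, dtype=float)
    n = x.size
    s = 0
    for i in range(n - 1):
        s += int(np.sign(x[i + 1:] - x[i]).sum())   # pairwise sign comparisons
    var = n * (n - 1) * (2 * n + 5) / 18.0
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var)
    return s, z

# A steadily falling water table gives strongly negative S and Z, the
# signature of the decreasing trends found in most of the study's series.
print(mann_kendall([10, 9, 9.5, 8, 7.5, 7, 6, 5.5, 5, 4]))
```

Being rank-based, the test needs no distributional assumptions about the water levels, which is why it suits noisy well records; a |Z| above about 1.96 flags a significant trend at the 5% level.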
20

High-resolution climate variable generation for the Western Cape

Joubert, Sarah Joan 03 1900 (has links)
Thesis (MSc (Geography and Environmental Studies))--University of Stellenbosch, 2007. / Due to the relative scarcity of weather stations, the climate conditions of large areas are not adequately represented by any station. This is especially true for regions with complex topography or low population density. Various interpolation techniques and software packages are available with which the climate of such areas can be estimated from surrounding weather stations' data. This study investigates the possibility of using the software package ANUSPLIN to create accurate climate maps for the Western Cape, South Africa. ANUSPLIN uses thin plate smoothing splines and a digital elevation model to convert point data into grid format representing an area's climatic conditions. The software has been used successfully throughout the world, so a large body of literature is available on the topic, highlighting the limitations and successes of this interpolation method. Various factors affect a region's climate, the most influential being location (distance from the poles or equator), topography (height above sea level), distance from large water bodies, and other topographical factors such as slope and aspect. Until now, latitude, longitude, and the elevation of a weather station have most often been used as input variables to create climate grids, but the new version of ANUSPLIN (4.3) makes provision for additional variables. This study investigates incorporating the effects of the surrounding oceans and of topography (slope and aspect) in the interpolation process in order to create climate grids with a resolution of 90 m x 90 m. This is done for monthly mean daily maximum and minimum temperature and mean monthly rainfall, for each month of the year.
Not many projects that incorporate additional variables in the interpolation process using ANUSPLIN are to be found in the literature, so further investigation into the correct transformations and units of these variables had to be done before they could be successfully incorporated. It was found that distance to the ocean influences a region's maximum and minimum temperatures, and to a lesser extent its rainfall, while aspect and slope influence a region's rainfall. To assess the accuracy of the interpolation, two methods were employed: the statistical values produced by ANUSPLIN during the spline function calculations, and removal of a selected number of stations so that interpolated values could be compared with actual measured values. The analysis showed that more accurate maps were obtained when the additional variables were incorporated in the interpolation. Once the best transformations and units had been identified for the additional variables, climate maps were produced and compared with existing climate grids available for the study area. In general, the temperatures were higher than those of the existing grids. For rainfall, ANUSPLIN produced higher values throughout the study region than the existing grids, except in the Southwestern Cape, where rainfall values were lower on north-facing slopes and in high-lying areas.
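The thin plate spline at the heart of ANUSPLIN can be written down directly for the 2-D case: solve a bordered linear system whose radial part uses φ(r) = r² log r and whose polynomial part absorbs affine trends. This is a bare interpolating spline without the covariates (ocean distance, slope, aspect) or the GCV-chosen smoothing that ANUSPLIN's partial thin plate splines add; the planar "temperature" field below is invented for illustration.

```python
import numpy as np

def tps_phi(r):
    """Thin plate spline radial basis φ(r) = r² log r, with φ(0) = 0."""
    out = np.zeros_like(r)
    nz = r > 0
    out[nz] = r[nz] ** 2 * np.log(r[nz])
    return out

def tps_fit(xy, z, smooth=0.0):
    """Solve the bordered system [[K + λI, P], [Pᵀ, 0]] [w; a] = [z; 0],
    where K holds pairwise φ values and P = [1, x, y] is the affine part."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    K = tps_phi(d) + smooth * np.eye(n)
    P = np.column_stack([np.ones(n), xy])
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    sol = np.linalg.solve(A, np.concatenate([z, np.zeros(3)]))
    return xy, sol[:n], sol[n:]            # nodes, radial weights w, affine coeffs a

def tps_eval(model, q):
    xy, w, a = model
    q = np.asarray(q, float)
    d = np.linalg.norm(q[:, None, :] - xy[None, :, :], axis=-1)
    return tps_phi(d) @ w + a[0] + q @ a[1:]

# A planar field is absorbed entirely by the affine part (w = 0), so the
# spline reproduces it everywhere, not just at the stations.
pts = [[0, 0], [1, 0], [0, 1], [1, 1], [0.3, 0.7]]
temp = [2 + 3 * x - y for x, y in pts]
model = tps_fit(pts, temp)
print(float(tps_eval(model, [[0.5, 0.5]])[0]))   # ≈ 2 + 1.5 - 0.5 = 3.0
```

Setting `smooth` above zero turns exact interpolation into smoothing, the knob ANUSPLIN tunes automatically by generalized cross validation; adding covariate columns to P is the step that turns this into a partial spline like those used in the study.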
