11

Sensor web geoprocessing on the grid

McCullough, Aengus January 2011 (has links)
Recent standardisation initiatives in the fields of grid computing and geospatial sensor middleware provide an exciting opportunity for the composition of large scale geospatial monitoring and prediction systems from existing components. Sensor middleware standards are paving the way for the emerging sensor web, which is envisioned to make millions of geospatial sensors and their data publicly accessible by providing discovery, tasking and query functionality over the internet. In a similar fashion, concurrent development is taking place in the field of grid computing, whereby the virtualisation of computational and data storage resources using middleware abstraction provides a framework to share computing resources. Sensor web and grid computing share a common vision of world-wide connectivity, and in their current form both are realised using web services as the underlying technological framework. The integration of sensor web and grid computing middleware using open standards is expected to facilitate interoperability and scalability in near real-time geoprocessing systems. The aim of this thesis is to develop an appropriate conceptual and practical framework in which open standards in grid computing, sensor web and geospatial web services can be combined as a technological basis for the monitoring and prediction of geospatial phenomena in the earth systems domain, to facilitate real-time decision support. The primary topic of interest is how real-time sensor data can be processed on a grid computing architecture. This is addressed by creating a simple typology of real-time geoprocessing operations with respect to grid computing architectures. A geoprocessing system exemplar of each geoprocessing operation in the typology is implemented using contemporary tools and techniques, which provides a basis from which to validate the standards frameworks and highlight issues of scalability and interoperability. It was found that it is possible to combine standardised web services from each of these aforementioned domains despite issues of interoperability resulting from differences in web service style and security between specifications. A novel integration method for the continuous processing of a sensor observation stream is suggested, in which a perpetual processing job is submitted as a single continuous compute job. Although this method was found to be successful, two key challenges remain: a mechanism for consistently scheduling real-time jobs within an acceptable time-frame must be devised, and the trade-off between efficient grid resource utilisation and processing latency must be balanced. The lack of actual implementations of distributed geoprocessing systems built using sensor web and grid computing has hindered the development of standards, tools and frameworks in this area. This work provides a contribution to the small number of existing implementations in this field by identifying potential workflow bottlenecks in such systems and gaps in the existing specifications. Furthermore, it sets out a typology of real-time geoprocessing operations that is anticipated to facilitate the development of real-time geoprocessing software.
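As a minimal illustration of the suggested integration method, the sketch below runs a single perpetual job over a simulated observation stream; the stream source and the processing step are hypothetical stand-ins, not the sensor web or grid middleware used in the thesis.

```python
import itertools
import time
from collections import deque

def observation_stream():
    """Stand-in for a real-time sensor observation feed. In the architecture the
    abstract describes, observations would arrive via sensor web middleware;
    the synthetic values here are purely illustrative."""
    t = 0
    while True:
        yield {"time": t, "value": 20.0 + 0.1 * t}   # hypothetical observation
        t += 1
        time.sleep(0.01)

def perpetual_processing_job(stream, window=10):
    """A single long-running compute job that consumes the stream continuously,
    rather than submitting a new grid job for every batch of observations."""
    recent = deque(maxlen=window)
    for obs in stream:
        recent.append(obs["value"])
        rolling_mean = sum(recent) / len(recent)     # trivial geoprocessing step
        print(f"t={obs['time']:>3}s  rolling mean = {rolling_mean:.2f}")

if __name__ == "__main__":
    # Bound the demonstration to 25 observations; a real job would run until cancelled.
    perpetual_processing_job(itertools.islice(observation_stream(), 25))
```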
12

Suitability of DEMs derived from SAR interferometry and ASTER stereoscopy for hydrological applications using GIS : a case study of Al-Jafer basin, Jordan

Al Harbi, Sultan Duham January 2009 (has links)
Digital elevation models (DEMs) provide an essential quantitative environmental variable for a vast amount of the research published in remote sensing, GIS and physical geography. Traditionally, DEMs have been derived from ground surveys and photogrammetric techniques, and from topographic maps using contour data and spot heights. Satellite remote sensing now provides the most accurate digital elevation datasets with worldwide coverage by means of optical, radar or laser sensors. The aim of this study is to evaluate the accuracy and reliability of DEMs generated from InSAR (ERS-1/2) and ASTER data over a sparsely vegetated drainage system in central Jordan. DEMs of the study area were generated from each of these systems, and an accuracy assessment was carried out by verifying each DEM, with reference to the characteristics of the terrain, against a number of independent check points collected using differential GPS and against a reference DEM generated from a topographic map (scale 1:50,000). The accuracy of the independent check points used in this study is better than 1 m, considerably higher than that of the remote sensing techniques under evaluation. The accuracy of the DEMs derived from InSAR and ASTER is represented by their RMSE values, which were ±6.95 m and ±13.28 m respectively. The increase in error in high-elevation areas was greater for the ASTER DEM than for the InSAR DEM. The effect of these errors is investigated using stochastic conditional simulation to generate multiple, equally likely representations of the actual terrain surface. The propagation of data uncertainty to the elevation derivatives, and its impact on the extracted surface flow, are assessed. The results suggest that elevation error propagates to flow accumulation, especially in low-slope areas. The effects of DEM resolution on a set of topographic parameters, including slope, accumulated flow area, flow length and catchment area, are also examined.
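The RMSE figures quoted above follow the standard definition; the sketch below, using hypothetical check-point heights rather than the study's differential-GPS data, shows how such a value would be computed.

```python
import numpy as np

# Hypothetical check-point data: GPS heights from differential GPS (accuracy
# better than 1 m) and the corresponding heights sampled from a DEM surface.
gps_height = np.array([712.4, 698.1, 705.9, 721.3, 690.0])   # metres
dem_height = np.array([718.0, 693.5, 710.2, 714.8, 696.1])   # metres

def rmse(observed, reference):
    """Root-mean-square error of DEM heights against check-point heights."""
    residuals = observed - reference
    return np.sqrt(np.mean(residuals ** 2))

print(f"RMSE = {rmse(dem_height, gps_height):.2f} m")
```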
13

Visualisation of quality information for geospatial and remote sensing data : providing the GIS community with the decision support tools for geospatial dataset quality evaluation

Lush, Victoria January 2015 (has links)
The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that were considered important by users when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise availability and allow interrogation of these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool. The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates ‘at a glance’ dataset intercomparison and fitness-for-purpose based dataset selection.
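As a rough illustration of the idea (not the actual GEO label Web service API, which is not specified in the abstract), the sketch below summarises which of the eight informational aspects a metadata record provides.

```python
# The eight informational aspects identified in the studies described above.
ASPECTS = [
    "producer information", "producer comments", "lineage information",
    "compliance with standards", "quantitative quality information",
    "user feedback", "expert reviews", "citations information",
]

def summarise_geo_label(metadata: dict) -> dict:
    """Report, for each aspect, whether the dataset's metadata record provides
    it -- a stand-in for the 'available / not available' facets that the
    GEO label visualises."""
    return {aspect: bool(metadata.get(aspect)) for aspect in ASPECTS}

# Hypothetical metadata record with only two aspects populated.
record = {"producer information": "Example EO data centre",
          "lineage information": "Derived from a Level-2 processing chain"}
print(summarise_geo_label(record))
```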
14

An investigation into the use of VRS data in Malaysia

Yusoff, Mohd Yunus Mohd January 2007 (has links)
The VRS (Virtual Reference Station) technique used in network RTK (Real Time Kinematic) GPS seems to be the answer to every surveyor's prayer for an accurate, single-receiver, real-time positioning technique. The VRS technique, operating either in real time or post-processed, has been shown by several researchers to provide centimetre-level positioning accuracy suitable for many surveying and positioning applications.
15

User generated spatial content : an analysis of the phenomenon and its challenges for mapping agencies

Antoniou, V. January 2011 (has links)
Since the World Wide Web (Web) became a medium to serve information, its impact on geographic information has been constantly growing. Today the evolution of the bi-directional Web 2.0 has created the phenomenon of User Generated Spatial Content. In this thesis the focus is on analysing different aspects of this phenomenon from the perspective of a mapping agency, and on developing methodologies for meeting the challenges revealed. In this context two empirical studies are conducted. The first examines the spatial dimension of popular Web 2.0 photo-sharing websites such as Flickr, Panoramio, Picasa Web and Geograph, mainly investigating whether such Web applications can serve as sources of spatial content. The findings show that only Web applications that urge users to interact directly with spatial entities can serve as universal sources of spatial content. The second study looks into data quality issues of OpenStreetMap, a popular wiki-based Web mapping application. Here the focus is on the positional accuracy and attribution quality of the user-generated spatial entities. The research reveals that the positional accuracy is fit for a number of purposes. On the other hand, the user-contributed attributes suffer from inconsistencies, mainly due to the lack of a methodology that could help formalise the contribution process and thus enhance the overall quality of the dataset. The thesis explores a formalisation process through an XML Schema for remedying this problem. Finally, the advantages of using vector data to enhance interactivity, and thus to create more efficient, bi-directional Web 2.0 mapping applications, are analysed, and a new method for vector data transmission over the Web is presented.
16

A framework for quality evaluation of VGI linear datasets

Koukoletsos, T. January 2012 (has links)
Spatial data collection, processing, distribution and understanding have traditionally been handled by professionals. However, as technology advances, non-experts can now collect Geographic Information (GI), create spatial databases and distribute GI through web applications. This Volunteered Geographic Information (VGI), as it is called, seems to be a promising spatial data source. The most pressing concern, however, is its unknown and heterogeneous quality, which cannot be handled by traditional quality measurement methods: the quality elements these methods measure were standardised long before the appearance of VGI, and they assume uniform quality behaviour. The lack of a suitable quality evaluation framework with an appropriate level of automation, which would enable the quality assessment to be repeated when VGI is updated, makes the choice to use VGI difficult or risky for potential users. This thesis proposes a framework for the quality evaluation of linear VGI datasets used to represent networks. The suggested automated methodology is based on a comparison of a VGI dataset with a dataset of known quality. The heterogeneity issue is handled by producing individual results for small areal units, using a tessellation grid. The quality elements measured are data completeness, attribute accuracy and positional accuracy, considered the most important for VGI. Compared to previous research, this thesis includes an automated data matching procedure specifically designed for VGI. It combines geometric and thematic constraints, shifting the scale of importance from geometry to non-spatial attributes depending on their existence in the VGI dataset. Based on the data matching results, all quality elements are then measured for corresponding objects, providing a more accurate quality assessment. The method is tested on three case studies. Data matching proves to be quite efficient, leading to more accurate quality results. The data completeness approach also tackles VGI over-completeness, which broadens the method's applicability to data fusion purposes.
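The per-cell evaluation can be pictured with the simplified sketch below, which measures completeness as a ratio of VGI line length to reference line length in each tessellation cell (using the shapely library and toy geometries); the thesis's own completeness measure is based on its data matching procedure rather than raw length ratios.

```python
from shapely.geometry import LineString, box

# Toy datasets: VGI lines (e.g. from OpenStreetMap) and a reference dataset of known quality.
vgi = [LineString([(0, 0), (10, 0)]), LineString([(2, 3), (2, 9)])]
ref = [LineString([(0, 0.5), (10, 0.5)]), LineString([(2, 3), (2, 12)]),
       LineString([(7, 7), (12, 7)])]

def completeness_per_cell(vgi_lines, ref_lines, extent=(0, 0, 12, 12), cell=6.0):
    """Ratio of VGI line length to reference line length in each grid cell.
    Values near 1 suggest comparable completeness, << 1 suggests missing data,
    and > 1 suggests over-completeness (features absent from the reference)."""
    xmin, ymin, xmax, ymax = extent
    results = {}
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            cell_geom = box(x, y, x + cell, y + cell)
            vgi_len = sum(line.intersection(cell_geom).length for line in vgi_lines)
            ref_len = sum(line.intersection(cell_geom).length for line in ref_lines)
            if ref_len > 0:
                results[(x, y)] = vgi_len / ref_len
            x += cell
        y += cell
    return results

print(completeness_per_cell(vgi, ref))
```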
17

Winding Dali’s clock : the construction of a fuzzy temporal-GIS for archaeology

Green, Christopher Thomas January 2011 (has links)
Archaeology is fundamentally concerned with both space and time: dates, chronologies, stratigraphy, plans and maps are all routinely used by archaeologists in their work. To aid in their analysis of this material, the use of Geographic Information Systems (GIS) by archaeologists has become widespread. However, GIS are conventionally ignorant of time. Thus, if archaeologists are to achieve the fullest potential in the application of GIS to their studies, GIS are needed that properly take into account time as well as space. A GIS capable of dealing with temporal data is referred to as a temporal-GIS (TGIS), and commercial TGIS systems currently exist. However, these are locked into a model of modern clock time. Archaeological time does not sit well within that model, being altogether fuzzier and less precise. Nor are commercial TGIS able to address the questions that archaeologists ask of their spatio-temporal data. Thus, a TGIS is needed that deals with the types of time that we encounter as archaeologists, lest we end up shaping our data and questions to the inherent capabilities of non-archaeological TGIS. The creation of that new TGIS is the subject of this thesis: a fuzzy TGIS built specifically for the study of archaeological data that also takes into account recent developments in the theory of temporality within the discipline. The new TGIS needs to be flexible and powerful, yet to ensure that it is actually used it must remain within the software horizons of GIS-literate archaeologists. The new TGIS has been applied to two case studies, one in prehistoric Derbyshire and one in Roman Northamptonshire, producing informative and interesting new results. It is hoped that others will fruitfully use the TGIS and that, as a result, new forms of spatio-temporal analysis might come to be applied to archaeological studies.
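One common way to encode the "fuzzier" archaeological time the abstract refers to is a trapezoidal membership function over calendar years; the sketch below illustrates the idea with a hypothetical occupation phase and is not necessarily the representation adopted in the thesis.

```python
def trapezoidal_membership(year, a, b, c, d):
    """Degree (0..1) to which a calendar year belongs to a fuzzy period whose
    boundaries are uncertain: impossible before a and after d, certain
    between b and c, and graded in the intervals a..b and c..d."""
    if year <= a or year >= d:
        return 0.0
    if b <= year <= c:
        return 1.0
    if a < year < b:
        return (year - a) / (b - a)
    return (d - year) / (d - c)

# Hypothetical occupation phase: possibly 150 BC - AD 80, certainly 100 BC - AD 43
# (years BC expressed as negative numbers).
for y in (-160, -120, -50, 30, 60, 100):
    print(y, round(trapezoidal_membership(y, -150, -100, 43, 80), 2))
```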
18

Capturing navigation landmarks with in-car mobile games

Oliver, Keith John January 2012 (has links)
The augmentation of navigation systems with landmarks has been proposed as a method of improving the effectiveness of the technology and facilitating drivers' engagement with the environment. As a consequence, benefits are predicted both in increased spatial learning and in reduced driver distraction. Good navigational landmarks are, however, both laborious to collect and difficult to define. Collecting useful data as a by-product of mobile applications or games has received growing research interest, and has been suggested as a way of collecting location-based data that may be reused in navigation systems or other applications. Previous research games aimed at collecting landmarks have encountered problems in producing a game concept that encourages the collection of good navigational landmarks. Past research has also concentrated on pedestrian applications. Since the majority of navigation applications are used in cars, exploring the potential of capturing data in the context of a car journey makes a novel and valuable contribution. This research aimed to devise a game concept which could be played by passengers in cars and would collect useful landmark data as a by-product. The thesis describes how a virtual graffiti concept was generated, and how it was evaluated in both simulated and real-world journeys. Design studies, using video journeys, were performed to gather information and to inform the development of a high-level concept. An in-car trial of a prototype embodying a virtual graffiti game mechanic was carried out with 38 participants. The data collected in this trial was then evaluated by means of a survey, in which 100 respondents assessed the quality of the landmarks collected and their potential for reuse in navigation applications. Players of the virtual graffiti prototype displayed a consensus on where to place their graffiti tags: in ten out of twelve locations, over 30% of the players chose the same object to tag. Furthermore, a significant correlation was found between how highly landmarks were rated in the survey and how frequently they were tagged in the car trial. This work demonstrates a method for evaluating a crowdsourcing application for car journeys, and proposes an effective strategy for designing games with by-products, namely relating the role of the objects used in the game to the purpose of the by-products. In the case of a virtual graffiti game the two are related by environmental salience: objects were chosen by the graffiti taggers because of features such as size, visibility, ease of description and semantic salience, and these factors are also defining characteristics of good navigational landmarks. The studies described in this thesis provide evidence that crowdsourcing to collect landmarks will not require large numbers of people, or extensive coverage of an area, to produce candidate landmarks for navigation. The thesis also presents some practical ideas for the use of the by-products of landmark-capturing games in navigation systems, and some implications for the design of car-based virtual graffiti games and car-based crowdsourcing in general.
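The reported relationship between survey ratings and tagging frequency could be checked with a rank correlation such as Spearman's; the sketch below uses invented per-landmark figures, not the trial's actual data.

```python
from scipy.stats import spearmanr

# Hypothetical per-landmark figures: how often each object was tagged in the
# in-car trial and its mean usefulness rating from the survey.
tag_counts   = [21, 18, 15, 12, 9, 7, 5, 4, 3, 2, 1, 1]
mean_ratings = [4.6, 4.4, 4.1, 3.9, 3.8, 3.2, 3.0, 2.9, 2.5, 2.7, 2.1, 1.9]

# Spearman's rank correlation between tagging frequency and survey rating.
rho, p_value = spearmanr(tag_counts, mean_ratings)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```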
19

Precise positioning in urban canyons: applied to the localisation of buried assets

Montillet, Jean-Philippe January 2008 (has links)
Recent decades have seen applications of the Global Positioning System (GPS) flourish in varied and often unrelated ways, such as car navigation, electronic advertising and military defence (e.g. missile tracking). The increasing consumer interest in the positioning market is also due to the recent coupling of mobile phone and GPS technologies, which is expected to kick-start the location-based services (LBS) market. LBS are electronic services that provide information which has been created, compiled, selected or filtered taking into consideration the current locations of the users or those of other persons or mobile objects. Analysts foresee that these services will open a new era of mass-market consumption in which all positioning technologies will play a key role. The level of positioning accuracy (e.g. 1 cm or 1 m) will be a service that users may order from their mobile phone, laptop or other device.
20

Mitigation of scintillation effects on GPS satellite positioning

Mokhtar, Mohd Hezri January 2012 (has links)
Ionospheric scintillation causes significant effects on GPS signals, such as amplitude fading and fast phase changes, that can lead to cycle slips and hence errors in the position determination. In the worst case it can lead to loss of lock of the phase-locked loop (PLL) in the receiver. The research presented in this thesis studies amplitude and phase scintillation on GPS signals, especially in the high- and low-latitude regions where scintillation can be severe, together with the prediction and mitigation of these effects on GPS positioning. The tracking error at the output of the PLL limits the accuracy of the range measurements which the receiver uses to compute the position. Determination of the tracking error variance is accomplished using Conker's model (Conker et al., 2003), which requires the spectral parameters p and T. Three methods to accurately determine the tracking error variance by making use of scintillation indices from high sample rate scintillation data are investigated; all of them find the spectral parameters (p and T) required by the Conker model. The first method determines p and T from the power spectral density (PSD) of phase, obtained from the detrended high sample rate phase data by performing a Fast Fourier Transform (FFT). The second method (Strangeways, 2009) finds these parameters from the scintillation indices using an approximation model of the phase and amplitude spectra together with an estimated Fresnel frequency, thus avoiding the need to perform FFTs; this is also useful when high sample rate data are not available. The third method filters the detrended high sample rate phase scintillation data using five band-pass filters to obtain an approximate phase spectrum. The determination of the spectral parameters is carried out after detrending the time series of the phase and intensity of the received signal, which removes trends due to the moving satellite. Different methods for performing this detrending (three for amplitude and two for phase) are compared. A method to estimate the Fresnel frequency by performing a cubic fit to the amplitude spectrum (required for checking the accuracy of the method of finding the spectral parameters from the scintillation indices) is also presented. Finally, the three methods of determining the spectral parameters are compared in order to establish their relative accuracies and their respective areas of usefulness and validity. Mitigation of the scintillation effects is accomplished by observing the scintillation level on the paths of the signals from every satellite received simultaneously. The accuracy of the GPS position estimate was shown to improve when the measurements were weighted according to the tracking jitter variance of each satellite used to calculate the position.
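The final mitigation step, weighting each satellite's measurement by its tracking jitter variance, amounts to a weighted least-squares position solution; the sketch below illustrates this with a hypothetical geometry matrix and variances and is not the thesis's implementation.

```python
import numpy as np

def weighted_position_update(G, residuals, jitter_var):
    """One weighted least-squares update of the linearised GPS position/clock
    solution, down-weighting satellites whose signals show large tracking
    jitter (e.g. under strong scintillation)."""
    W = np.diag(1.0 / np.asarray(jitter_var))        # weight = 1 / variance
    N = G.T @ W @ G                                  # weighted normal matrix
    return np.linalg.solve(N, G.T @ W @ residuals)   # [dx, dy, dz, dt]

# Hypothetical geometry matrix: unit line-of-sight components plus a clock column.
G = np.array([[ 0.3,  0.5, 0.81, 1.0],
              [-0.6,  0.2, 0.77, 1.0],
              [ 0.1, -0.7, 0.70, 1.0],
              [ 0.8, -0.1, 0.59, 1.0],
              [-0.2, -0.4, 0.89, 1.0]])
residuals  = np.array([ 2.1, -1.4, 0.6, 3.8, -0.9])    # pseudorange residuals (m)
jitter_var = np.array([0.05, 0.04, 0.06, 0.90, 0.05])  # satellite 4 heavily scintillating

print(weighted_position_update(G, residuals, jitter_var))
```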
