1. The design and implementation of a two and three-dimensional triangular irregular network based GIS. Abdul-Rahman, Alias, January 2000.
It has been realised in the GIS community that most 2D GISs are capable of handling 2D spatial data efficiently, but systems have had less success with 3D spatial data. This is reflected in the current GIS market place, where systems that can handle 3D data are hardly available, due to several impediments in implementing such systems. This thesis attempts to address some of these impediments. The impediments related to spatial data, especially data representation, data structuring and data modelling using object-oriented (OO) techniques, are the foci of this thesis. OO techniques are utilized because they offer several advantages over the traditional (i.e. structural) techniques in software development. With respect to spatial representation, several major representations are investigated, leading to the identification of an appropriate representation for both 2D and 3D data: triangular irregular network (TIN) data structures. 2D data is represented by a 2D TIN, and 3D data is represented by a 3D TIN (also called a tetrahedral network, or TEN). Several algorithms were developed for the construction of the data structures, utilizing procedures such as distance transformation (DT) and Voronoi tessellation. Besides standard Delaunay triangulations, constrained triangulations were also developed, so that the inclusion of real-world objects in the spatial data modelling can be facilitated. Four classes of real-world objects are identified (i.e. point, line, surface, and solid objects). For the purpose of spatial data modelling of the four types of objects, a formal data structure (FDS) is utilized.
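The abstract gives no code, but the core TIN idea is easy to illustrate. The sketch below is not the thesis's own construction algorithm: it uses SciPy's standard Delaunay triangulation as a stand-in, and the sample data and names are purely illustrative. It builds a 2D TIN from scattered elevation samples and interpolates a height inside the containing triangle.

```python
# Minimal sketch (not from the thesis): a 2D TIN via a standard Delaunay
# triangulation, queried by planar (barycentric) interpolation.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(200, 2))            # scattered (x, y) sample points
z = np.sin(xy[:, 0] / 20) * np.cos(xy[:, 1] / 20)  # illustrative elevations

tin = Delaunay(xy)                                  # the 2D TIN over the samples

def tin_height(p, tin, z):
    """Interpolate elevation at point p from the TIN, or None if outside it."""
    simplex = tin.find_simplex(p)
    if simplex == -1:
        return None
    verts = tin.simplices[simplex]
    # Barycentric coordinates of p within the containing triangle.
    T = tin.transform[simplex]
    b = T[:2].dot(np.asarray(p) - T[2])
    bary = np.append(b, 1.0 - b.sum())
    return float(bary @ z[verts])

print(tin_height((50.0, 50.0), tin, z))
```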
2. Photogrammetric evaluation of space linear array imagery for medium scale topographic mapping. Zoej, Mohammad Javad Valadan, January 1997.
This thesis is concerned with the 2D and 3D mathematical modelling of satellite-based linear array stereo images and the implementation of this modelling in a general adjustment program for use in sophisticated analytically-based photogrammetric systems. The programs have also been used to evaluate the geometric potential of linear array images in different configurations for medium scale topographic mapping. In addition, an analysis of the information content that can be extracted for topographic mapping purposes has been undertaken. The main aspects covered within this thesis are:
- 2D mathematical modelling of space linear array images;
- 3D mathematical modelling of the geometry of cross-track and along-track stereo linear array images taken from spaceborne platforms;
- the algorithms developed for use in the general adjustment program which implements the 2D and 3D modelling;
- geometric accuracy tests of space linear array images conducted over high-accuracy test fields in different environments;
- evaluation of the geometric capability and information content of space linear array images for medium scale topographic mapping.
This thesis concludes that the mathematical modelling of the geometry and the adjustment program developed during the research have the capability to handle the images acquired from all available types of space linear array imaging systems. Furthermore, the adjustment program has been developed to handle the image data from the forthcoming very high-resolution space imaging systems utilizing flexible pointing of their linear array sensors. It also concludes that cross-track and along-track stereo images, such as those acquired by the SPOT and MOMS-02 linear array sensors, have the capability for map compilation at 1:50,000 scale and smaller, but only in conjunction with a comprehensive field completion survey to supplement the data acquired from the satellite imagery.
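The abstract does not spell out the sensor model itself. For orientation only, one commonly quoted form of the collinearity condition for a linear array (pushbroom) sensor is sketched below; it is not taken from the thesis, and axis conventions vary between systems. Each scan line i has its own perspective centre (X_S, Y_S, Z_S) and rotation matrix elements r_jk, both functions of that line's exposure time, and f is the focal length.

```latex
% One common pushbroom collinearity form (illustrative, not from the thesis):
% the along-track image coordinate of an imaged ground point (X, Y, Z) is zero,
% and y_i is its across-track coordinate in scan line i.
\begin{align*}
0   &= -f\,\frac{r_{11}(X - X_S) + r_{12}(Y - Y_S) + r_{13}(Z - Z_S)}
                {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)} \\[4pt]
y_i &= -f\,\frac{r_{21}(X - X_S) + r_{22}(Y - Y_S) + r_{23}(Z - Z_S)}
                {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)}
\end{align*}
```

The first equation expresses that the ground point lies in the plane swept by scan line i; the second gives its across-track position, which is what an adjustment program of the kind described would linearise and solve for the exterior orientation parameters.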
3. Terrain synthesis: the creation, management, presentation and validation of artificial landscapes. Griffin, Mark William, January 2001.
'Synthetic Terrain' is the term used for artificially-composed, computer-based Digital Terrain Models (DTMs) created by a combination of techniques and heavily influenced by Earth Sciences applications. The synthetic landscape is created to produce 'geographically acceptable', 'realistic' or 'valid' computer-rendered landscapes, maps and 3D images, which are themselves based on synthetic terrain Digital Elevation Models (DEMs). This thesis examines the way in which mainly physical landscapes can be synthesised, and presents the techniques by which terrain data sets can be managed (created, manipulated, displayed and validated), both for academic reasons and to provide a convenient and cost-effective alternative to expensive 'real world' data sets. Indeed, the latter are collected by ground-based or aerial surveying techniques (e.g. photogrammetry), normally at considerable expense, depending on the scale, resolution and type required. The digital information for a real map could take months to collect, process and reproduce, possibly involving demanding Information Technology (IT) resources and sometimes complicated by differing (or contradictory) formats. Such techniques are also unavailable if the region lies within an 'unfriendly' or inaccessible part of the globe where, for example, overflying or ground surveys are forbidden. Previous attempts at synthesising terrain have not necessarily aimed at realism. Digital terrain sets have been created using fractal mathematical models, as 'special effects' for the entertainment industry (e.g. science fiction 'alien' landscapes for motion pictures and arcade games) or for artistic reasons. There are no known examples of synthesised DTMs being created with such a wide range of requirements and functionality, and with such a regard to validation and realism. This thesis addresses the whole concept of producing 'alternative' landscapes in artificial form - nearly 22 years of research aimed at creating 'geographically-sensible' synthetic terrain is described, with the emphasis on the last 5 years, when this PhD thesis was conceived. These concepts are based on radical, inexpensive and rapid techniques for synthesising terrain, yet value is also placed on the 'validity', realism and 'fitness for purpose' of such models. The philosophy - or the 'thought processes' - necessary to achieve the development of the algorithms leading to synthesised DTMs is one of the primary achievements of the research. This in turn led to the creation of an interactive software package called GEOFORMA, which requires some manual intervention in the form of preliminary terrain classification. The sequence is thus: the user can choose to create terrain or landform assemblages without reference to any real world area. Alternatively, the user can select a real world region or a 'typical' terrain type on a 'dial up' basis, which requires a short period of intensive parametric analysis based on research into established terrain classification techniques (such as fractals and other mathematical routines, process-response models etc.). This creates a composite synthesised terrain model of high quality and realism, a factor examined both qualitatively and quantitatively.
Although the physical terrain is the primary concern, similar techniques are applied to the human landscape, noting such attributes as the density, type, nature and distribution of settlements, transport systems etc., and although this thread of the research is limited in scope compared with the physical landscape synthesis, some spectacular results are presented. The system also creates place names based on a simple algorithm. Fluvial landscapes, upland regions and coastlines have been selected from the many possible terrain types for 'treatment', and the thesis gives each of these sample landscapes a separate chapter with appropriate illustrations from this original and extensive research. Finally, the work also poses questions in attempting to provide answers; this is perhaps inevitable in a relatively new genre, encompassing so many disciplines, and with relatively sparse literature on the subject.
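The fractal techniques mentioned above are well documented in the literature. As a point of reference only (this is not the GEOFORMA package, and all names and parameter values are illustrative), a minimal sketch of one classic approach, spectral synthesis of fractional-Brownian-motion-like terrain, is:

```python
# Minimal sketch (not the thesis software): synthetic terrain by spectral
# synthesis. Spectral amplitudes fall off with spatial frequency as
# f**(-beta/2); larger beta gives smoother terrain. Values are illustrative.
import numpy as np

def fbm_terrain(n=257, beta=2.4, seed=0):
    """Return an n x n synthetic DEM (heights in arbitrary units)."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f = np.sqrt(fx**2 + fy**2)
    f[0, 0] = np.inf                       # suppress the zero-frequency (mean) term
    amplitude = f ** (-beta / 2.0)
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
    spectrum = amplitude * np.exp(1j * phase)
    dem = np.real(np.fft.ifft2(spectrum))  # real part only; a common shortcut
    dem -= dem.min()                       # shift so the minimum elevation is zero
    return dem

dem = fbm_terrain()
print(dem.shape, dem.max())
```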
4. Credibility assessment and labelling of map mashups. Idris, Nurul Hawani, January 2014.
The Web 2.0 revolution has changed the culture of mapping by opening it up to a wider range of users and creators. Map mashups, in particular, are being widely used to map a variety of information. There is, however, no gatekeeper to validate the correctness of the information presented. The purpose of this research was to understand better what it is that influences users’ perceived credibility and trust in a map mashup presentation, and to support the future implementation of automated credibility assessment and labelling of map mashup applications. This research was conducted in three stages using mixed method approaches. The objective of the first stage was to examine the influence of metadata related to sources, specifically the map producer and map supplier, on respondents’ assessment of the credibility of map mashup information. The findings indicate a low influence of the tested metadata and a high influence of visual cue elements on users’ credibility assessment. Only half of the respondents used the metadata, whilst the other half did not include it in their assessment. These findings became the basis of stage two, which was to examine the influence of colour coded traffic light (CCTL) labelling on respondents’ assessment of credibility. From the findings, the probability of respondents making informed judgements by choosing a high-credibility map based on this rating label (CCTL) was three times higher than when only the metadata was presented. The third stage was to propose a conceptual framework to support the implementation of automated credibility labelling for map mashup applications. The framework was proposed on the basis of a thorough review of the literature. The suggested parameters and approaches are not limited to assessing the credibility of information in the map mashup context, but could also be applied to other Web GIS applications.
5. Disruptive cartographies: manoeuvres, risk and navigation. Hind, Sam, January 2016.
There have been many opportunities to study protest events over the last six years, from Occupy to the Arab Spring and 15M. After the financial crash, citizens of the world crafted their own original responses. What they shared – from New York to Cairo and Madrid – was a desire to take to the streets in political protest. In the UK the enemy was ‘austerity’. One of the first policies of this new era proposed a rise in Higher Education tuition fees. Students took to the streets in dissent. A host of political, institutional, technological and social transformations occurred. More specifically, this period saw the birth of a digital platform designed to help protesters navigate during protests. It was called Sukey. This thesis interrogates the impact and legacy of the Sukey platform over, and beyond, these tumultuous years. It does so through the lens of ‘disruptive cartography’, arguing that the platform was deployed to disrupt the smooth running of both so-called ‘A-to-B’ demonstrations and police containment tactics colloquially referred to as ‘kettles’. I contend that the platform did so by providing up-to-date navigational information regarding active phenomena, such as police movements. In this thesis I undertake an aesthetic, interactive and mobile analysis to investigate the navigational dimensions of the project. I do so through an automobile metaphor in which I look ‘under the bonnet’, ‘through the windscreen’, and ‘on(to) the road’. In the platform’s absence, I argue, protesters have lacked the requisite navigational knowledges to perform unpredictable manoeuvres during protest events. As a result, they have returned to using institutional forms that limit the navigational possibilities brought into being by the Sukey platform. I conclude by speculating on three possible ‘failures’ of the platform regarding its ability to faithfully ‘capture’ live events, to provide a navigational ‘correspondence’ between cartographic ‘signposts’, and to protect participants from data-driven policing.
6. Expertise in map comprehension: processing of geographic features according to spatial configuration and abstract roles. Kent, Robin S. G., January 2011.
Expertise in topographic map reading is dependent on efficient processing of geographical information presented in a standardised map format. Studies have supported the proposition that expert map readers employ cognitive schemas in which prototypical configurations held in long-term memory are employed during the surface search of map features to facilitate map comprehension. Within the experts' cognitive schemas, it is assumed that features are grouped according to spatial configurations that have been frequently encountered, and these patterns facilitate efficient chunking of features during information processing. This thesis investigates the nature of information held in experts' cognitive schemas. It also proposes that features are grouped in the experts' schemas not only by their spatial configurations but according to the abstract and functional roles they perform. Three experiments investigated the information processing strategies employed by, firstly, skilled map readers engaged in a map reproduction task and, secondly, expert map readers engaged in a location comparison exercise. In the first and second experiments, skilled and novice map readers studied and reproduced a town map and a topographic map. Drawing protocols and verbal protocols provided insights into their information processing strategies. The skilled map readers demonstrated superior performance in reproducing contour-related data, with evidence of the use of cognitive schemas. For the third experiment, expert and novice map readers compared locations within map excerpts for similarities of boundary extents. Eye-gaze data and verbal protocols provided information on the features attended to and the participants' search patterns. The expert group integrated features into their cognitive schemas according to the abstract roles they performed significantly more frequently than the novices. Both groups employed pattern recognition to integrate features for some of the locations. Within a similar experimental design, the second part of the third experiment examined whether experts also integrated the abstract roles of remote features and village grouping concepts within their cognitive schemas. The experts again integrated the abstract roles of physical features into their schemas more often than novices, but this strategy was not employed for either the remote feature or grouping categories. Implications for map design and future Geographic Information Systems are discussed.
7. Parallel numerical modelling of ice flow in Antarctica. Takeda, A. L., January 2000.
This thesis describes the parallel implementation of a three-dimensional numerical ice flow model of the whole of the grounded part of the Antarctic Ice Sheet at a grid resolution of 20 km. Numerical modelling of ice flow is computationally intensive, as it requires the solution of non-linear equations over long time scales. A parallel model was developed to overcome these restrictions, and it is demonstrated that the model runs more quickly on multiple processors than on a single processor (70% efficiency on four processors). The model was successfully validated against published benchmarks and compared against other models and remote sensing work. The main ice flow features are well reproduced, including some newly observed fast flow features in East Antarctica. The optimal trade-off between run time and efficiency was exploited to run a series of detailed sensitivity tests on parameters that may affect the resulting ice sheet volume and basal thermal regime. Compared with the effects of surface air temperature, the accumulation rate and the tuning parameter m in the flow parameter A, geothermal heat flux was found to have the strongest effect on basal melting. It is shown that use of different geothermal heat flux values can affect the inclusion of sub-glacial lakes in the zone of basal melting. Topographic smoothing may reduce the model’s ability to locate subglacial lakes. Fast flow features appear in the modelled ice sheet despite the lack of basal slip conditions in the model. Use of a new topography data set improved the model’s ability to locate subglacial lakes in zones of basal melting, and revealed additional fast flow features in East Antarctica.
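As a quick aside on the figure quoted above: parallel efficiency is speedup divided by processor count, so 70% efficiency on four processors corresponds to a speedup of about 2.8x over one processor. The timings in the snippet below are invented purely to reproduce that figure; they are not measurements from the thesis.

```python
# Back-of-envelope check of the reported parallel efficiency (illustrative only).
def parallel_efficiency(t_serial, t_parallel, n_procs):
    speedup = t_serial / t_parallel
    return speedup, speedup / n_procs

# Illustrative timings chosen to reproduce the reported 70% on four processors.
speedup, eff = parallel_efficiency(t_serial=100.0, t_parallel=35.7, n_procs=4)
print(f"speedup = {speedup:.2f}x, efficiency = {eff:.0%}")
```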
8. An investigation of the oceanic skin temperature deviation. Donlon, C. J., January 1994.
Satellite and in-situ radiometric measurements of sea surface temperature (SST), together with conventional SST and meteorological parameters, are used to provide a description of the ocean surface skin temperature deviation (skin temperature minus bulk temperature, ΔT) for a transect made across the Atlantic ocean from 50°N 00°W to 23°S 35°W during September and October 1992. Methods of in-situ SST measurement are discussed and the errors associated with each technique are given. The principles of infra-red radiometry are explained. The differences between the calibration strategies used to determine SST using infra-red radiometers from both in-situ and satellite platforms are reviewed and the errors associated with each technique are given. Differences between published in-situ infra-red SST data indicate that there may be a bias in these data as a consequence of the calibration strategy adopted. The need for an inter-calibration of in-situ infra-red radiometer systems used for the validation of satellite SST is highlighted. Satellite SST algorithms are discussed and the principles of atmospheric correction are explained. The difference between the radiometric 'skin' temperature of the ocean and the conventional 'bulk' temperature at depth is defined. A review of current observations of ΔT is given, and several theoretical treatments of ΔT are reviewed. The definitions of the surface fluxes of heat and momentum are given. A description of the collection of data and an analysis of the calibration of the infra-red radiometer used to measure the skin temperature is presented. Data have been processed to obtain ΔT, and the surface fluxes of heat and momentum have been evaluated according to the bulk aerodynamic formulae. The relationships between ΔT and the measurements made are presented for the entire data set and for day and night time observations separately. Four time series of observed data are presented and the local conditions during the time of measurement are used to discuss ΔT. ΔT has a mean value of 0.39°C ± 0.3°C and is shown to be a persistent feature of the Atlantic ocean. Correlation analyses reveal the skin and bulk temperature fields to be correlated at length scales > 155 km. Night-time correlations are consistently higher than day-time correlations at all length scales. For this reason it is recommended that satellite validation data are only collected during the night. High sea states are shown to affect both in-situ and satellite observations of SST, biasing these data warm. The regional nature of ΔT is presented and related to the dominant atmosphere-ocean conditions for each region. ΔT is shown to be greatest at the higher latitudes and weak in the tropical regions. Several parameterisations of ΔT are used to obtain estimates of ΔT using the data collected. These are found to be inadequate to predict ΔT at small temporal scales. A regional dependence of ΔT is found in these parameterisations. The coefficient λ of the Saunders (1969) parameterisation has been evaluated and is shown to have a regional dependence on the local atmosphere-ocean conditions. The coefficients C1 and C2 of the Hasse (1971) parameterisation have been evaluated using the data collected; these are C1 = 4.74 and C2 = 1.22. A comparison between the Along Track Scanning Radiometer (ATSR) Average SST (ASST) and the in-situ data is presented. The satellite minus in-situ bulk ΔT has been obtained and shown to be comparable to that observed in-situ. This comparison highlights the need to make skin SST validation measurements rather than bulk SST measurements. The ATSR ASST data are shown to return an SST accurate to better than 0.3°C.
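The Saunders (1969) parameterisation referred to above is commonly quoted as ΔT = λνQ/(ku*), with Q the net upward heat flux at the surface, ν and k the kinematic viscosity and thermal conductivity of sea water, and u* the friction velocity in the water. The sketch below uses that common form; the coefficient and input values are purely illustrative and are not those evaluated in the thesis.

```python
# Minimal sketch (not the thesis code) of the commonly quoted Saunders (1969)
# form for the skin-bulk temperature difference. All values are illustrative.
def saunders_delta_t(q_net, u_star, lam=6.0, nu=1.0e-6, k=0.6):
    """Temperature drop (K) across the thermal skin: lam * nu * Q / (k * u_star).

    q_net : net upward heat flux at the surface (W m^-2)
    u_star: waterside friction velocity (m s^-1)
    lam   : dimensionless Saunders coefficient (illustrative value)
    nu    : kinematic viscosity of sea water (m^2 s^-1)
    k     : thermal conductivity of sea water (W m^-1 K^-1)
    """
    return lam * nu * q_net / (k * u_star)

# Example: ~150 W m^-2 net heat loss and a waterside friction velocity of 0.01 m s^-1.
print(f"delta_T ~= {saunders_delta_t(q_net=150.0, u_star=0.01):.2f} K")
```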
9. Numerical modelling of Langjökull Ice Cap, Iceland. Gooday, Richard David, January 2003.
This thesis describes the development and application of a mass balance model for Langjökull Ice Cap to enable an investigation into its state of balance. This model is then coupled to a numerical model of ice flow, also developed as part of this thesis, to allow an assessment of the sensitivity of the ice cap to future climate change. Using data collected at a field site on one of the ice cap’s outlet glaciers in the summer of 2000, an energy balance model was optimised in order to obtain the best fit between the predicted and observed ablation. Although the optimisation enabled a reasonable fit to the observed data, it was worse than could be obtained using a simpler degree-day model. This degree-day model was developed to calculate the mass balance of the whole ice cap; using 30 years of meteorological data, the results of this model suggested that the ice cap is currently in a state of expansion – the average net mass balance across the ice cap being 0.5 m w.e. yr⁻¹. To aid the development of a numerical model of ice flow, the flow regime of Langjökull was investigated by looking at several different methods of calculating the velocities occurring there. The effects of sediment deformation were found to be important in accurately modelling the ice flow occurring at Langjökull Ice Cap, with a till viscosity of 6 × 10⁹ Pa s found suitable for modelling this process. When the degree-day and ice flow models were coupled together, the modelled ice cap was found to be strongly dependent on air temperature and, under climate change scenarios with warming rates of 0.02 and 0.04 yr⁻¹, Langjökull is predicted to disappear within 200 years. The inclusion of sediment deformation was found to have little effect on the response of the ice cap to climate change.
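A degree-day model of the kind referred to above is simple enough to sketch. The version below is generic and purely illustrative: the degree-day factor and the temperature series are not values from the thesis.

```python
# Minimal sketch (not the thesis model): a positive degree-day melt model.
# Daily melt is proportional to air temperature above 0 degC.
def degree_day_melt(daily_temps_c, ddf_m_we_per_degc_day=0.006):
    """Total melt (m water equivalent) over a series of daily mean temperatures."""
    return sum(max(t, 0.0) * ddf_m_we_per_degc_day for t in daily_temps_c)

# A week of illustrative summer temperatures on an outlet glacier (degC).
temps = [3.2, 5.1, 4.0, -0.5, 2.7, 6.3, 1.1]
print(f"melt over the week: {degree_day_melt(temps):.3f} m w.e.")
```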
10. Evaluating human-centered approaches for geovisualization. Lloyd, David, January 2009.
Working with two small groups of domain experts, I evaluate human-centered approaches to application development which are applicable to geovisualization, following an ISO 13407 taxonomy that covers context of use, eliciting requirements, and design. These approaches include field studies and contextual analysis of subjects' context; establishing requirements using a template, via a lecture to communicate geovisualization to subjects, and by communicating subjects' context to geovisualization experts with a scenario; autoethnography to understand the geovisualization design process; wireframe, paper and digital interactive prototyping with alternative protocols; and a decision-making process for prioritising application improvement. I find that the acquisition and use of real user data is key, and that a template approach and teaching subjects about visualization tools and interactions both fail to elicit useful requirements for a visualization application. Consulting geovisualization experts with a scenario of user context and samples of user data does yield suggestions for tools and interactions of use to a visualization designer. The complex and composite natures of both the visualization and human-centered domains, incorporating learning from both domains along with user context, make design challenging. Wireframe, paper and digital interactive prototypes mediate between the user and visualization domains successfully, eliciting exploratory behaviour and suggestions to improve the prototypes. Paper prototypes are particularly successful at eliciting suggestions, and especially novel visualization improvements. Decision-making techniques prove useful for prioritising different possible improvements, although domain subjects select data-related features over more novel alternatives and rank these more inconsistently. The research concludes that understanding subject context of use and data is important and occurs throughout the process of engagement with domain experts, and that standard requirements elicitation techniques are unsuccessful for geovisualization. Engagement with subjects at an early stage with simple prototypes incorporating real subject data, moving to successively more complex prototypes, holds the best promise for creating successful geovisualization applications.