1

A two-step algorithm for the machine rendering of three-dimensional objects with hidden line removed

游曼美, Yau, Mann-may, Judy. January 1979
Published or final version / Computer Science / Master of Philosophy
2

A computational approach to the cartographic dot distribution problem

Hickey, Mutahar January 1993
In the field of cartography, there is occasionally a need to create a distribution of dots on a map. These dots should give an impression of the density of some countable object set. This type of map is called a "Dot Distribution Map". Up to the current time, if the dots are to represent reality at all, they have had to be placed by hand by a cartographer using a digitizing tablet or other input device. This is because a census of a region gives only a total, yet it is known that the densities vary within that region. A cartographer can look at all the data available about a region and make judgements about how the densities change within it; he can then place dots which represent his interpretation of reality. This thesis states that there exists an algorithm which would assign dots to a map based upon the common belief that the density gradates smoothly from one region with one census value to another region with a different census value. The approach taken was to relate the map regions to polygons and then to subdivide the polygons into triangles. These triangles are then subdivided recursively into six children each, with the data stored in a hex-tree. This is the current level of development. The next steps will be: (1) generate a surface above the 2-D map based upon the known input data of counts for the various regions; (2) from the centroid of each existing leaf of the hex-tree, find the corresponding Z_i value from the surface information; (3) from each of these leaves, recursively subdivide the triangle further until the number of dots indicated by the Z_i value can be placed on the map. / Department of Computer Science
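The abstract stops at the hex-tree stage, so the following Python sketch only illustrates the general shape of the proposed next steps. The six-way split through the centroid and edge midpoints, the stand-in `density(x, y)` surface, the `dot_value`, and the stopping threshold are assumptions for illustration, not the thesis's actual algorithm.

```python
import numpy as np

def six_children(tri):
    """Split a triangle (3x2 array) into six children through its centroid
    and the three edge midpoints (one possible hex-tree scheme)."""
    c = tri.mean(axis=0)
    kids = []
    for i in range(3):
        v0, v1 = tri[i], tri[(i + 1) % 3]
        m = (v0 + v1) / 2.0
        kids += [np.array([v0, m, c]), np.array([m, v1, c])]
    return kids

def tri_area(tri):
    (x0, y0), (x1, y1), (x2, y2) = tri
    return 0.5 * abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))

def random_point(tri, rng):
    """Uniform random point inside a triangle (dot placement within a leaf)."""
    r1, r2 = rng.random(2)
    s = np.sqrt(r1)
    return (1 - s) * tri[0] + s * (1 - r2) * tri[1] + s * r2 * tri[2]

def place_dots(tri, density, dot_value, rng, max_depth=8):
    """Recursively subdivide until the count indicated by the surface value Z_i
    at the leaf centroid is small, then place that many dots inside the leaf.
    density(x, y) stands in for the interpolated surface; dot_value is the
    count each dot represents."""
    c = tri.mean(axis=0)
    n = density(*c) * tri_area(tri) / dot_value
    if n < 2.0 or max_depth == 0:
        return [random_point(tri, rng) for _ in range(int(round(n)))]
    dots = []
    for child in six_children(tri):
        dots += place_dots(child, density, dot_value, rng, max_depth - 1)
    return dots

# Example: one triangular region with a linear west-to-east density gradient.
rng = np.random.default_rng(0)
region = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
dots = place_dots(region, density=lambda x, y: 2.0 + 0.5 * x, dot_value=20.0, rng=rng)
print(f"{len(dots)} dots placed")   # roughly (total count) / dot_value
```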
3

Uncertainty analysis of runoff estimates from runoff-depth contour maps produced by five automated procedures for the northeastern United States

Bishop, Gary D. 01 January 1991
Maps of runoff-depth have been found to be useful tools in a variety of water resource applications. Producing such maps can be a challenging and expensive task. One of the standard methods of producing these maps is to use a manual procedure based on gaged runoff data, topographic and past runoff-depth maps, and the expert opinion of hydrologists. This thesis examined five new automated procedures for producing runoff-depth contour maps to see if the maps produced by these procedures had similar accuracy and characteristics when compared to the manual procedure. An uncertainty analysis was used to determine the accuracy of the automated procedure maps by withholding gaged runoff data from the creation of the contour maps and then interpolating estimated runoff back to these sites from the maps produced. Subtracting gaged runoff from estimated runoff produced interpolation error values. The mean interpolation error was used to define the accuracy of each map and was then compared to a similar study by Rochelle et al. (1989) conducted on a manual procedure map.
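The thesis interpolates withheld runoff back from the produced contour maps themselves; as a stand-in for "reading a value off the map", the minimal Python sketch below substitutes a simple inverse-distance-weighted surface, just to show the structure of the withhold / interpolate / compare loop. The interpolation method, site layout, and runoff values are all illustrative assumptions.

```python
import numpy as np

def idw_estimate(x, y, sites, values, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from known sites
    (stand-in for reading runoff depth off a contour map)."""
    d = np.hypot(sites[:, 0] - x, sites[:, 1] - y)
    if np.any(d < 1e-9):
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

def interpolation_errors(sites, runoff):
    """Withhold each gaged site in turn, rebuild the surface from the rest,
    interpolate runoff back to the withheld site, and record the error."""
    errors = []
    for i in range(len(sites)):
        keep = np.arange(len(sites)) != i
        est = idw_estimate(*sites[i], sites[keep], runoff[keep])
        errors.append(est - runoff[i])          # estimated minus gaged
    return np.array(errors)

# Toy example with synthetic gage locations and runoff depths (mm).
rng = np.random.default_rng(0)
sites = rng.uniform(0, 100, size=(30, 2))
runoff = 500 + 3.0 * sites[:, 0] + rng.normal(0, 25, size=30)
err = interpolation_errors(sites, runoff)
print(f"mean interpolation error: {err.mean():.1f} mm, "
      f"std: {err.std(ddof=1):.1f} mm")
```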
4

Effect of manual digitizing error on the accuracy and precision of polygon area and line length

Keefer, Brenton Jan 20 November 2012
Manual digitizing has been recognized by investigators as a significant source of map error in GIS, but the error characteristics have not been well defined. This thesis presents a methodology for simulating manual digitizing error. Stream mode digitizing error was modeled using autoregressive moving average (ARMA) procedures, and point mode digitizing was stochastically simulated using a uniform random model. These models were developed based on quantification of digitizing error collected from several operators. The resulting models were used to evaluate the effect digitizing error had upon polygon size and total line length at varying map accuracy standards. Digitizing error produced no bias in polygon area. The standard deviation of polygon area doubled as the accuracy standard bandwidth doubled, but the standard deviation was always less than 1.6 percent of total area for stream mode digitizing. Smaller polygons (less than 10 square map inches) had more bias and more variance relative to their size than larger polygons. A doubling of the accuracy standard bandwidth caused a quadrupling of line length bias and a doubling to tripling of the line length standard deviation. For stream mode digitizing, reasonable digitizing standards produced line length biases of less than 2 percent of total length and standard deviations of less than 1 percent of total length. Bias and standard deviation both increased with increasing line length (or number of points), but the bias and standard deviation as a percent of total line length remained constant as feature size changed. / Master of Science
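The abstract does not give the fitted ARMA orders or coefficients, so the Python sketch below uses an illustrative ARMA(1,1) process for stream-mode error perpendicular to the boundary and measures its effect on polygon area (shoelace formula) and line length. All parameter values, and the choice of a circle as the "true" feature, are assumptions.

```python
import numpy as np

def arma_errors(n, phi=0.7, theta=0.3, sigma=0.002, rng=None):
    """Simulate ARMA(1,1) displacement errors (map units) along a digitized line.
    Orders and coefficients are illustrative, not the thesis's fitted values."""
    rng = rng or np.random.default_rng()
    eps = rng.normal(0.0, sigma, n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + eps[t] + theta * eps[t - 1]
    return e

def shoelace_area(pts):
    """Polygon area from the shoelace formula."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def line_length(pts):
    return float(np.sum(np.hypot(np.diff(pts[:, 0]), np.diff(pts[:, 1]))))

# "True" polygon boundary: a unit circle sampled as a stream-mode vertex string.
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 500, endpoint=False)
true_pts = np.c_[np.cos(t), np.sin(t)]

# Add correlated error perpendicular to the boundary (outward normals of a unit
# circle equal the position vectors) to mimic stream-mode digitizing.
normals = np.c_[np.cos(t), np.sin(t)]
digitized = true_pts + normals * arma_errors(len(t), rng=rng)[:, None]

print("area bias (%):  ", 100 * (shoelace_area(digitized) - np.pi) / np.pi)
print("length bias (%):", 100 * (line_length(digitized) - line_length(true_pts))
      / line_length(true_pts))
```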
5

Modeling the structure of cartographic information for query processing

Nyerges, Timothy Lee January 1980
No description available.
6

Centralized and decentralized map updating and terrain masking analysis

Bello, Martin Glen January 1981
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1981. Includes bibliographical references. / Ph.D.
7

An appraisal of the production and utility of digital atlases in Africa: a case study of Swaziland

Pettit, Louise Manda January 2000
This research will appraise the production and utility of digital atlases in an African context, using personal experience gained through the production of a digital atlas for Swaziland, the opinions obtained from other producers of African digital atlases, and relevant literature. Since the 1950s, decision makers and researchers have viewed information as a powerful contributor to national development. Over the past thirty years, developed countries have increased the efficient collation and dissemination of geographic information through the utilization of the digital environment. The development of Geographic Information Systems (GIS) has increased the range of applications attainable in digital mapping projects, from resource inventories to the monitoring of environmental degradation, crime patterns and service provision.

The patchy history of data collection, analysis and mapping in Africa has had a limiting effect on the ability of countries to identify, plan and control their resources efficiently. Despite the desire to automate the mapping process and reap some of the planning benefits evident in the developed world, Africa has not succeeded in mobilising its full technological potential. Political instability, poor infrastructure, the absence of national policy guidelines, and a lack of skilled manpower are some of the issues which have limited utilization. Despite many of the hurdles faced by African countries, automated mapping and analysis technologies are still being pursued. The role of computers in the utilisation of data has become apparent through several means, one of which is the digital atlas. Analytical functions in many of these products allow situation modelling and provide superior graphic displays in comparison to their paper counterparts. Several African countries have embarked on the development of national digital databases and in some cases have produced digital national atlases. The potential to improve resource utilization, service provision and land use planning using these atlases does exist. The production and utility of these atlases in an African context, however, needs closer assessment. "Computers don't clothe, don't cure, don't feed. Their power begins and ends with information. Their usefulness is therefore strictly linked to the effectiveness of the information" (Gardner, 1993:16).
8

Spatial model development for resource management decision making and strategy formulation: application of neural network (Mounds State Park, Anderson, Indiana)

Guisse, Amadou Wane January 1993
An important requirement of a rational policy for the provision of outdoor recreation opportunities is some understanding of natural processes and public concern and/or preferences. Computerized land use suitability mapping is a technique which can help find the best location for a variety of developmental actions given a set of goals and other criteria. Over the past two decades, the methods and techniques of land use planning have undergone a revolution on at least two fronts, shifting the basic theories and attitudes on which land use decisions are based. The first of these fronts is the inclusion of environmental concerns, and the second is the application of more systematic methods or models. While these automated capabilities have shed new light on environmental issues, they have unfortunately failed to develop sufficient intelligence and adaptation to accurately model the dynamics of ecosystems.

The work reported proceeds on the belief that neural network models can be used to assess and develop resource management strategies for Mounds State Park, Anderson, Indiana. The study combines a photographic survey technique with a geographic information system (GIS) and artificial neural networks (NN) to investigate the perceived impact of park management activities on recreation opportunities and experiences. It is unique in that it combines survey data with spatial data and an optimizing technique to develop a model for predicting perceived management values for short and long term recreation management.

According to Jeannette Stanley and Evan Bak (1988), a neural network is a massively parallel, dynamic system of highly interconnected, interacting parts based on neurobiological models. The behavior of the network depends heavily on the connection details, and the state of the network evolves continually with time. Networks are considered clever and intuitive because they learn by example rather than following simple programming rules. They are defined by a set of rules or patterns based on expertise or perception for better decision making. With experience, networks become sensitive to subtle relationships in the environment which are not obvious to humans.

The model was developed as a counter-propagation network with a four-layer learning architecture consisting of an input layer, a normalized layer, a Kohonen layer, and an output layer. The counter-propagation network is a feed-forward network which combines Kohonen and Widrow-Hoff learning rules into a new type of mapping neural network. The network was trained with patterns derived by mapping five variables (slope, aspect, vegetation, soil, site features) and survey responses from three groups. The responses included, for each viewshed, the preference and management values and the three recreational activities each group associated with a given landscape. Overall, the model behaves properly in learning the different rules and generalizing in cases where inputs had not been shown to the network a priori. Maps are provided to illustrate the different responses obtained from each group and simulated by the model. The study is not conclusive as to the capabilities of the combination of GIS techniques and neural networks, but it gives a good flavor of what can be achieved when accurate mapping information is used by an intelligent system for decision making. / Department of Landscape Architecture
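The abstract names the general counter-propagation structure (normalized input, winner-take-all Kohonen layer, Widrow-Hoff/outstar output layer) but not layer sizes, learning rates, epochs, or the exact variable encodings, so everything beyond that structure in the minimal Python sketch below is an assumption.

```python
import numpy as np

class CounterPropagation:
    """Minimal forward-only counter-propagation sketch:
    normalized input -> Kohonen (winner-take-all) layer -> outstar output layer.
    Layer sizes, learning rates, and epochs are illustrative assumptions."""

    def __init__(self, n_in, n_kohonen, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_kohonen, n_in))   # Kohonen weights
        self.V = np.zeros((n_kohonen, n_out))         # outstar (output) weights

    @staticmethod
    def _normalize(x):
        n = np.linalg.norm(x)
        return x / n if n > 0 else x

    def _winner(self, x):
        # Winner-take-all: the Kohonen unit whose weight vector is closest to x.
        return int(np.argmin(np.sum((self.W - x) ** 2, axis=1)))

    def fit(self, X, Y, epochs=200, alpha=0.1, beta=0.1):
        for _ in range(epochs):
            for x, y in zip(X, Y):
                x = self._normalize(x)
                k = self._winner(x)
                # Kohonen rule: move the winner's weights toward the input.
                self.W[k] += alpha * (x - self.W[k])
                # Widrow-Hoff / outstar rule: move the winner's output toward the target.
                self.V[k] += beta * (y - self.V[k])
        return self

    def predict(self, X):
        return np.array([self.V[self._winner(self._normalize(x))] for x in X])

# Toy example: 5 site variables (e.g. slope, aspect, vegetation, soil, site
# features, numerically encoded) mapped to 2 responses (e.g. preference and
# management values) -- all synthetic.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(40, 5))
Y = np.c_[X[:, 0] + X[:, 1], 1.0 - X[:, 2]]
net = CounterPropagation(n_in=5, n_kohonen=12, n_out=2).fit(X, Y)
print(net.predict(X[:3]))
```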
