41

A Similarity-based Data Reduction Approach

Ouyang, Jeng 07 September 2009 (has links)
Finding an efficient data reduction method for large-scale problems is an imperative task. In this paper, we propose a similarity-based self-constructing fuzzy clustering algorithm to sample instances for the classification task. Instances that are similar to each other are grouped into the same cluster. Once all the instances have been fed in, a number of clusters have formed automatically, and the statistical mean of each cluster is taken to represent all the instances it covers. This approach has two advantages: it is faster and uses less storage, and the number of representative instances need not be specified in advance by the user. Experiments on real-world datasets show that our method runs faster and obtains a better reduction rate than other methods.
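A minimal sketch of this style of one-pass, similarity-based reduction follows. The Euclidean distance, the fixed threshold, and all names are illustrative assumptions; the paper's own algorithm uses a fuzzy similarity measure that is not reproduced here.

```python
import numpy as np

def reduce_by_clustering(X, threshold=0.5):
    """One-pass similarity clustering: each instance joins the first
    cluster whose centroid lies within `threshold`, else it starts a
    new cluster. The cluster means become the reduced dataset."""
    centroids = []   # running means, one per cluster
    counts = []      # instances absorbed by each cluster
    for x in X:
        best, best_d = None, threshold
        for i, c in enumerate(centroids):
            d = np.linalg.norm(x - c)
            if d < best_d:
                best, best_d = i, d
        if best is None:
            centroids.append(x.astype(float))
            counts.append(1)
        else:
            counts[best] += 1
            # incremental mean update keeps memory use constant
            centroids[best] += (x - centroids[best]) / counts[best]
    return np.array(centroids)

X = np.random.default_rng(0).random((1000, 4))
print(reduce_by_clustering(X, threshold=0.4).shape)  # far fewer rows than 1000
```

Note how the number of clusters, and hence of representatives, emerges from the threshold rather than being fixed in advance, which is the property the abstract highlights.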
42

The quantification and visualisation of human flourishing.

Henley, Lisa January 2015 (has links)
Economic indicators such as GDP have been the main measures of human progress since the first half of the last century. There is concern that continuing to measure our progress and/or wellbeing with indicators that encourage consumption on a planet of limited resources may not be ideal. Alternative measures of human progress tend to take a top-down approach, in which their creators decide what the measure will contain. This work defines a 'bottom-up' methodology, and an example of measuring human progress, that does not require manual data reduction. The technique allows visual overlay of other 'factors' that users may feel are particularly important. I designed and wrote a genetic algorithm which, in conjunction with regression analysis, was used to select the 'most important' variables from a large range of variables loosely associated with the topic. This approach could be applied in many areas where an analyst must choose among a large number of candidate variables. Next, I designed and wrote a genetic algorithm to explore the evolution of a spectral clustering solution over time. Additionally, I designed and wrote a genetic algorithm with a multi-faceted fitness function, which I used to select the most appropriate clustering procedure from a range of hierarchical agglomerative methods. Evolving the algorithm over time was not successful in this instance, but the approach holds much promise, both as an alternative to 'scoring' new data based on an original solution and as a method for exploring procedural options beyond those an analyst might normally select. The final solution allowed the number of clusters to evolve over time with a fixed clustering method and variable selection. Profiling against various external data sources gave consistent and interesting interpretations of the clusters.
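The thesis's code is not reproduced in this listing; the sketch below only illustrates the general idea of genetic-algorithm variable selection with a regression-based fitness. The adjusted R-squared fitness, truncation selection, one-point crossover, and all parameter values are assumptions, not necessarily the author's choices.

```python
import numpy as np
from numpy.linalg import lstsq

def fitness(X, y, mask):
    """Adjusted R^2 of an OLS fit on the selected columns."""
    if mask.sum() == 0:
        return -np.inf
    Xs = np.column_stack([np.ones(len(y)), X[:, mask]])
    beta, *_ = lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    ss_res, ss_tot = resid @ resid, ((y - y.mean()) ** 2).sum()
    n, k = len(y), int(mask.sum())
    return 1 - (ss_res / ss_tot) * (n - 1) / (n - k - 1)

def ga_select(X, y, pop=40, gens=60, p_mut=0.02, seed=0):
    rng = np.random.default_rng(seed)
    n_var = X.shape[1]
    popn = rng.random((pop, n_var)) < 0.2          # sparse initial masks
    for _ in range(gens):
        scores = np.array([fitness(X, y, m) for m in popn])
        elite = popn[np.argsort(scores)[::-1][: pop // 2]]
        children = []
        for _ in range(pop - len(elite)):
            a, b = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, n_var)           # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_var) < p_mut     # bit-flip mutation
            children.append(child)
        popn = np.vstack([elite, children])
    scores = np.array([fitness(X, y, m) for m in popn])
    return popn[scores.argmax()]

# Synthetic check: y depends on columns 3 and 17 only.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 30))
y = X[:, 3] - 2 * X[:, 17] + 0.1 * rng.normal(size=200)
print(np.flatnonzero(ga_select(X, y)))  # ideally recovers columns 3 and 17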
43

Monitoring thermic patterns in beehives via wireless sensor networks / Monitoramento de padrões térmicos em colmeias de abelhas via redes de sensores sem fio

Douglas Santiago Kridi 28 August 2014 (has links)
Swarming is the mass exodus of bees from a hive; its most common causes are lack of food, stress, variations in humidity, and especially high temperatures. Among the types of swarming, the one in which the hive is completely abandoned has caused great losses to Brazilian beekeepers, particularly in the Northeast. In the Northeast region, which is of great importance to Brazilian beekeeping and where high temperatures prevail for most of the year, a large number of hives is lost to swarming by abandonment. In an attempt to mitigate this problem, we propose proactive monitoring of hives via a wireless sensor network capable of identifying the atypical heating that indicates a pre-swarming condition. From a sampling pattern obtained from the cyclical behaviour of daily temperatures, we developed a predictive algorithm, based on pattern recognition techniques, capable of detecting the rise in temperature inside the hive (its microclimate) responsible for the stress in bees that culminates in swarming. The mechanism also recognises and avoids sending redundant information over the network, reducing radio communication and thereby the costs of data transmission and energy.
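The thesis's actual predictive algorithm is not given in this listing. As an illustration only, the underlying idea of comparing each reading against an hourly baseline profile and suppressing redundant transmissions might look like the following; the baseline values, the `k_alert` and `min_delta` parameters, and all names are placeholder assumptions.

```python
import numpy as np

# Hypothetical baseline: mean and std of hive temperature for each hour
# of the day, learned from past samples (the values here are placeholders).
baseline_mean = 34.0 + 1.5 * np.sin(np.linspace(0, 2 * np.pi, 24))
baseline_std = np.full(24, 0.6)

last_sent = None

def on_sample(hour, temp_c, k_alert=3.0, min_delta=0.2):
    """Decide whether a reading signals pre-swarming heat and whether
    it is worth transmitting at all (redundancy suppression)."""
    global last_sent
    z = (temp_c - baseline_mean[hour]) / baseline_std[hour]
    alert = z > k_alert                      # atypical heating for this hour
    redundant = (last_sent is not None
                 and abs(temp_c - last_sent) < min_delta
                 and not alert)
    if not redundant:
        last_sent = temp_c                   # radio.send(...) would go here
    return alert, not redundant

print(on_sample(hour=14, temp_c=38.2))      # (True, True): alert, transmit
```

Suppressing near-duplicate readings is what keeps radio usage, and therefore energy cost, down, since a quiet hive produces long runs of nearly identical temperatures.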
44

From gas and dust to protostars: addressing the initial stages of star formation using observations of nearby molecular clouds

Mairs, Steve 11 December 2017 (has links)
Though there has been a considerable amount of work investigating the early stages of low-mass star formation in recent years, the general theory is only broadly understood and several open questions remain. Specifically, the dominant physical mechanisms which connect large-scale molecular cloud structures, intermediate-scale filamentary gas flows, and small-scale collapsing prestellar envelopes in the interstellar medium are poorly constrained. Even for an individual forming protostar, the evolution of the mass accretion rate from the envelope onto the central object is debated, with little observational evidence to help guide the theoretical framework. In addition, with the development of new technology such as the continuum imaging instrument in operation at the James Clerk Maxwell Telescope (JCMT), the Submillimetre Common User Bolometer Array 2 (SCUBA-2), the best practices for data reduction and image calibration for ground-based, submillimetre-wavelength observations are still being investigated. In this dissertation, I address facets of these open questions in five main projects with an overarching focus on the flow of material from the largest to the smallest scales in a molecular cloud. By performing synthetic observations of a numerical simulation of a turbulent molecular cloud, I investigate the nature of prestellar envelopes and find evidence of larger mass reservoirs that form filamentary structures and feed cluster formation. Then, after robustly investigating and suggesting improvements for ground-based, submillimetre data reduction techniques, I continue to probe the connection between larger and smaller scales by characterising structure fragmentation in the Southern Orion A Molecular Cloud from the perspective of 850 μm continuum data. Finally, I follow star-forming material to even smaller scales by exploring the evolution of the mass accretion rate onto individual protostars. This examination has required designing and implementing unprecedented spatial alignment and flux calibration techniques at 850 μm. Using these newly calibrated images, I am able to identify several candidate sources that show evidence of submillimetre variability, suggesting changes in protostellar accretion rates over several-year timescales.
45

The effect of data reduction on LiDAR-based DEMs

Immelman, Jaco 02 November 2012 (has links)
Light Detection and Ranging (LiDAR) provides highly accurate, high-density datasets in a very short time-span. However, the high data volumes associated with LiDAR often require some form of data reduction to increase data handling efficiency, which in turn can affect the quality of the derived Digital Elevation Models (DEMs). Critically, when DEM processing times are reduced, the resultant DEM should still represent the terrain adequately. This study investigated three data reduction techniques, (1) random point reduction, (2) grid resolution reduction, and (3) combined data reduction, to assess their effects on the accuracy and the data handling efficiency of derived DEMs. A series of point densities (1 %, 10 %, 25 %, 50 % and 75 %) was interpolated at a range of horizontal grid resolutions (1, 2, 3, 4, 5, 10 and 30 m). Results show that, irrespective of terrain complexity, a dataset can be randomly reduced to 25 % of its original points with minimal effect on the remaining dataset. When these datasets are interpolated, however, the points can only be reduced to 50 % of the original before the derived DEM deviates strongly from the original. Reducing the grid resolution of DEMs showed that the resolution could be lowered to 4 metres before significant deviations appeared. When combining point-density reduction with grid-resolution reduction, DEMs could be derived from 75 % of the data points at a grid resolution of 3 metres without sacrificing more than 15 % of the accuracy of the original DEM. Ultimately, data reduction should yield accurate DEMs while cutting processing time. Considering both accuracy and processing time, the results indicate that resolution reduction is the most effective data reduction technique: lowering the grid resolution to 4 metres improved data handling efficiency by 94 % while sacrificing only 10 % of the accuracy. Furthermore, this study examined a variety of terrain complexities and found that the reduction thresholds established here apply to both complex and non-complex terrain.
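The two basic operations the study combines, random point thinning and gridding at a coarser resolution, are simple enough to sketch. The following is a toy illustration on synthetic points, not the study's workflow; the fixed 100 m extent, cell sizes, and RMSE comparison are assumptions for the example.

```python
import numpy as np

def random_reduce(points, keep_fraction, rng=np.random.default_rng(1)):
    """Randomly keep a fraction of the LiDAR points (rows of x, y, z)."""
    idx = rng.choice(len(points), size=int(len(points) * keep_fraction),
                     replace=False)
    return points[idx]

def grid_dem(points, cell_size, extent=100.0):
    """Bin points onto a square grid covering [0, extent) and average
    the elevations per cell, yielding a crude gridded DEM."""
    x, y, z = points.T
    n = int(np.ceil(extent / cell_size))
    ix = np.clip((x // cell_size).astype(int), 0, n - 1)
    iy = np.clip((y // cell_size).astype(int), 0, n - 1)
    sums, counts = np.zeros((n, n)), np.zeros((n, n))
    np.add.at(sums, (ix, iy), z)
    np.add.at(counts, (ix, iy), 1)
    with np.errstate(invalid="ignore"):
        return sums / counts          # NaN where a cell holds no points

# Compare a DEM built from reduced data against the full-data DEM.
pts = np.random.default_rng(0).random((50_000, 3)) * [100, 100, 10]
full = grid_dem(pts, cell_size=4)
half = grid_dem(random_reduce(pts, 0.5), cell_size=4)
rmse = np.sqrt(np.nanmean((full - half) ** 2))
print(f"RMSE of 50 % points at 4 m cells: {rmse:.3f} m")
```

Sweeping `keep_fraction` and `cell_size` over a grid of values and recording RMSE against the full-resolution DEM is the shape of the experiment the abstract describes, though the study used proper interpolation rather than simple cell averaging.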
46

An ICT architecture for the neighbourhood area network in the Smart Grid

Pourmirza, Zoya January 2015 (has links)
In planning for future electricity supplies, certain issues will need to be considered, such as increased energy usage, urbanisation, reduction in personnel, global warming and the conservation of natural resources. As a result, some countries have investigated transforming their existing power grid into the so-called Smart Grid. The Smart Grid has three main characteristics which are, to some degree, antagonistic: the provision of good power quality, energy cost reduction, and improvement in the reliability of the grid. The need to ensure that they can be accomplished together demands much richer Information and Communications Technology (ICT) networks than the systems currently available. In this research we have identified the gap in current proposals for the ICT of the power grid. We have designed and developed an ICT architecture for the neighbourhood sub-grid level of the electrical network, where monitoring is very underdeveloped because most current grids are controlled centrally and the response of the neighbourhood area is not generally monitored or actively controlled. Our ICT architecture, which is based on established architectural principles, can incorporate data from heterogeneous sources. This layered architecture provides both sensors that directly measure the electrical activity of the network (e.g. voltage) and sensors that measure the environment (e.g. temperature), since the latter provide information that can be used to anticipate demand and improve control actions. Additionally, we have designed a visualisation tool as an interface for grid operators, to facilitate a better comprehension of the behaviour of the neighbourhood level of the Smart Grid. Since energy-aware ICT is a prerequisite for an efficient Smart Grid, we have taken two approaches to this issue. The first was to use a cluster-based communication technique for the second layer of the architecture, which comprises Wireless Sensor Networks, where energy limitation is the major problem. Accordingly, we have analysed the energy-aware topology for the wireless sensor networks embedded in that layer, and we provide evidence that the proposed topology brings energy efficiency to the communication network of the Smart Grid. The second approach was to develop a data reduction algorithm that reduces the volume of data prior to transmission. We demonstrated that this data reduction is suitable for Smart Grid applications and preserves the integrity and quality of the data. Finally, the work presented in this thesis is based on a real project being implemented in the medium-voltage power network of the University of Manchester, where power grid instrumentation, real data and professionals in the field are available. Since the project is long-term, and the environmental sensor networks in particular are not yet installed, we have evaluated some of our predictions via simulation. However, where the instrumentation was available, we were able to compare our predictions and simulations with actual experimental results.
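The abstract does not specify the data reduction algorithm itself. As a generic stand-in, a dead-band scheme (transmit only readings that differ meaningfully from the last transmitted value) conveys the idea of shrinking the data volume before transmission while bounding the loss of fidelity; the tolerance value and names below are assumptions.

```python
def deadband_compress(samples, tolerance):
    """Generic dead-band reduction: transmit a sample only when it
    differs from the last transmitted value by more than `tolerance`.
    The returned (index, value) pairs reconstruct the signal to
    within +/- tolerance."""
    sent = []
    last = None
    for i, v in enumerate(samples):
        if last is None or abs(v - last) > tolerance:
            sent.append((i, v))
            last = v
    return sent

voltages = [230.0, 230.1, 230.05, 231.2, 231.3, 229.8, 229.9]
print(deadband_compress(voltages, tolerance=0.5))
# [(0, 230.0), (3, 231.2), (5, 229.8)]: 3 of 7 samples transmitted
```

Schemes of this kind suit monitoring applications because the reconstruction error is bounded by the tolerance, which is one way of reading the abstract's claim that integrity and quality of the data are preserved.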
47

Partitioned Persistent Homology

Malott, Nicholas O. January 2020 (has links)
No description available.
48

The development of software for the assessment of the microwave landing system's capability to support guided missed-approach and departure procedures

Snyder, Christopher Allen January 1997 (has links)
No description available.
49

High-dimensional Data Clustering and Statistical Analysis of Clustering-based Data Summarization Products

Zhou, Dunke 27 June 2012 (has links)
No description available.
50

A Cellular Algorithm for Data Reduction of Polygon Based Images

Caesar, Robert James 01 January 1988 (has links) (PDF)
The amount of information contained in an image is often much more than is necessary. Computer-generated images will always be constrained by the computer's resources or the time allowed for generation. Reducing the quantity of data in a picture while preserving its apparent quality can require complex filtering of the image data. This paper presents an algorithm for reducing data in polygon-based images, using different filtering techniques that take advantage of a priori knowledge of the images' content. One technique uses a novel implementation of vertex elimination. By passing the image through a sequence of controllable filtering stages, the image is segmented into homogeneous regions, simplified, then reassembled. The amount of data representing the picture is reduced considerably while a high degree of image quality is maintained. The effects of the different filtering stages are analysed with regard to data reduction and picture quality as they relate to flight simulation imagery. Numerical results are included in the analysis. Further applications of the algorithm are discussed as well.
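A simple form of vertex elimination repeatedly discards the vertex whose removal distorts the polygon least. The sketch below uses the triangle-area criterion as that measure; it is a generic stand-in for the paper's vertex-elimination stage, not its cellular implementation, and the `min_area` threshold is an assumed parameter.

```python
def vertex_eliminate(polygon, min_area):
    """Remove vertices of a closed polygon, smallest-impact first,
    until every remaining vertex forms a triangle with its neighbours
    of at least `min_area` (i.e. it genuinely shapes the outline)."""
    pts = list(polygon)

    def area(i):
        # Area of the triangle (prev, this, next); small area means
        # the vertex is nearly collinear with its neighbours.
        (x1, y1), (x2, y2), (x3, y3) = (pts[i - 1], pts[i],
                                        pts[(i + 1) % len(pts)])
        return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

    while len(pts) > 3:
        i = min(range(len(pts)), key=area)
        if area(i) >= min_area:
            break
        del pts[i]
    return pts

square_ish = [(0, 0), (5, 0.1), (10, 0), (10, 10), (0, 10)]
print(vertex_eliminate(square_ish, min_area=1.0))
# [(0, 0), (10, 0), (10, 10), (0, 10)]: the near-collinear vertex is gone
```

Raising `min_area` trades polygon fidelity for fewer vertices, the same data-reduction-versus-quality trade-off the paper evaluates for flight simulation imagery.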
