1 |
Heterogeneous Processing in Software Defined Radio: Flexible Implementation and Optimal Resource Mapping. Bieberly, Frank, 05 April 2012.
The advantages provided by Software Defined Radios (SDRs) have made them useful tools for communication engineers and academics alike. The ability to support a wide range of communication waveforms with varying modulation, encoding, or frequencies on a single hardware platform can decrease production costs while accelerating waveform development. SDR applications are expanding in military and commercial environments as advances in transistor technology allow greater computational density with decreased power consumption, size, and weight. As the demand for greater performance continues to increase, some SDR manufacturers are experimenting with heterogeneous processing platforms to meet these requirements.
Heterogeneous processing, a method of dividing computational tasks among dissimilar processors, is well-suited to the data flow programming paradigm used in many common SDR software frameworks. Particularly on embedded platforms, heterogeneous processing can offer significant gains in computational power while maintaining low power consumption, opening the door to affordable and useful mobile SDR platforms.
Many past SDR hardware implementations utilize a partially heterogeneous processing approach. A field programmable gate array (FPGA) is often used to perform high-speed processing (DDC, decimation) near the radio front-end while another processor (GPP, DSP or FPGA) performs the rest of the SDR application signal processing (gain control, filtering, demodulation). A few recent SDR hardware platforms are designed to allow the use of multiple processor types throughout the SDR application's processing chain. This can yield significant benefits for SDR software that can take advantage of the greater heterogeneous processing capability now available.
This thesis will present a new method of heterogeneous processing in the framework of GNU Radio. In this implementation a software wrapper allows a DSP to participate seamlessly in GNU Radio applications. The DSP can be directly substituted for existing GNU Radio signal processing blocks—significantly expanding the platform's capabilities while maintaining the benefits of the component-based design methodology. A similar approach could be applied to additional processing elements (e.g. FPGAs and co-processors) and to other SDR software frameworks.
As the capabilities of this heterogeneous framework increase, users will be required to assign hardware resources to signal processing tasks to maximize performance. To remove this burden, a method of predicting GNU Radio application performance and a heuristic resource mapping algorithm, which seems to perform well in practice, are presented. / Master of Science
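The abstract does not detail the heuristic itself. As a hedged sketch of the kind of greedy mapping such an algorithm might use, the following assigns each signal-processing block to the processor that would finish it earliest, scheduling the most expensive blocks first; all block names and cost figures below are invented for illustration:

```python
# Illustrative sketch only: the thesis's actual heuristic is not published in
# this abstract. This shows one common greedy approach to mapping blocks onto
# heterogeneous processors so as to balance the per-processor load.

def greedy_map(blocks, processors, cost):
    """blocks: list of block names; processors: list of processor names;
    cost[block][proc]: estimated execution time of block on proc.
    Returns (block -> processor mapping, per-processor load)."""
    load = {p: 0.0 for p in processors}
    mapping = {}
    # Schedule the most expensive blocks first (classic LPT ordering).
    for b in sorted(blocks, key=lambda b: -min(cost[b].values())):
        best = min(processors, key=lambda p: load[p] + cost[b][p])
        mapping[b] = best
        load[best] += cost[b][best]
    return mapping, load

# Hypothetical three-block chain mapped across a GPP and a DSP:
cost = {
    "fir_filter":  {"gpp": 4.0, "dsp": 1.5},
    "agc":         {"gpp": 1.0, "dsp": 0.8},
    "demodulator": {"gpp": 3.0, "dsp": 2.5},
}
mapping, load = greedy_map(list(cost), ["gpp", "dsp"], cost)
```

A real mapper would feed measured per-processor execution times (the performance-prediction method mentioned above) into `cost` instead of hand-picked numbers.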
|
2 |
Mapping HW resource usage towards SW performance. Suljevic, Benjamin, January 2019.
With software applications increasing in complexity, the description of hardware is becoming increasingly relevant. To ensure quality of service for specific applications, it is imperative to have insight into hardware resources. Cache memory stores data close to the processor for quick access and improves the quality of service of applications. The description of cache memory usually consists of the size of the different cache levels, set associativity, or line size; software applications would benefit from a more detailed model of cache memory.

In this thesis, we offer a way of describing the behavior of cache memory that benefits software performance. Several performance events are tested, including L1, L2, and L3 cache misses. With the collected information, we develop performance models of cache memory behavior, test their goodness of fit, and use them to predict the behavior of the cache memory during future runs of the same application.

Our experiments show that L1 cache misses can be modeled to predict future runs. The L2 cache miss model is less accurate but still usable for predictions, and the L3 cache miss model is the least accurate and is not feasible for predicting the behavior of future runs.
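The abstract does not give the model form or the fit statistic used. A minimal sketch, assuming a linear model of cache misses versus working-set size fit by least squares, with R² as the goodness-of-fit measure; all profiling numbers below are synthetic:

```python
# Illustrative sketch only: the thesis's model form is not given in this
# abstract. This fits a straight line to (working-set size, miss count)
# pairs and scores it with R^2, then uses it to predict a future run.

def fit_linear(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """Goodness of fit: 1 - residual sum of squares / total sum of squares."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Synthetic profiling runs: working-set size (KiB) vs. observed L1 misses.
sizes  = [32, 64, 128, 256, 512]
misses = [1100, 2050, 4150, 8000, 16100]

slope, intercept = fit_linear(sizes, misses)
r2 = r_squared(sizes, misses, slope, intercept)
# An r2 close to 1 suggests the model can predict misses on future runs:
predicted_1024 = slope * 1024 + intercept
```

In practice the miss counts would come from hardware performance counters, and a near-1 R² for L1 (but not L3) would mirror the thesis's finding about which levels are predictable.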
|
3 |
A review of the environmental resource mapping system and a proof that it is impossible to write a general algorithm for analysing interactions between organisms distributed at locations described by a locationally linked database and physical properties recorded within the database. Hall, Bryan, University of Western Sydney, Faculty of Science and Technology, School of Science, January 1994.
The Environmental Resource Mapping System (E-RMS) is a geographic information system (GIS) used by the National Parks and Wildlife Service to assist in the management of national parks. The package is available commercially from the Service and is used by other government departments for environmental management. E-RMS has also been present in Australian universities and used for academic work for a number of years. This thesis demonstrates that existing procedures for product quality and performance were not followed in the production of the package, and that the package, and therefore much of the work undertaken with it, is fundamentally flawed. The E-RMS software contains and produces a number of serious errors. Several problems are identified and discussed in this thesis. As a result of these shortcomings, the author recommends that an enquiry be conducted to investigate: 1) the technical feasibility of each project for which the E-RMS package has been used; 2) the full extent and consequences of the failings inherent in the package; and 3) the suitability of the E-RMS GIS package for the purposes for which it is sold. Australian Standard 3898 requires that the purpose, functions and limitations of consumer software be described. To comply with this standard, users of the E-RMS package would have to be informed of several factors related to it; these are discussed in the research. Failure to consider the usefulness and extractable nature of the information in any GIS database will inevitably lead to problems that may endanger the phenomena that the GIS is designed to protect. / Master of Applied Science (Environmental Science)
|
4 |
Assessing the potential contribution of renewable energy to electricity supply in Australia: A study of renewable energy with a particular focus upon domestic rooftop photovoltaics, domestic solar hot water and commercial wind energy. Mills, David, date unknown.
Renewable energy has become the world's fastest growing energy source as a direct result of increasing concerns about the environmental damage that is being caused by fossil fuel and nuclear energy use. With the exception of large-scale hydro, however, very little of Australia's electricity is supplied from renewable energy. Due to our lack of experience with the use of most renewable energy technologies and the associated lack of knowledge regarding their true potential, doubts remain as to how much electricity could be generated or displaced by renewable energy. Although renewable energy industries in Australia have recently begun to experience strong growth, this growth could be curtailed if there is a lack of faith in the potential for renewable energy. The aim of this study is to further our understanding of the potential for renewable energy to contribute to electricity supply in Australia. This aim is achieved through the development and demonstration of methodologies for estimating potential electricity production from key renewable energy resources. The study demonstrates how methodologies for assessing the potential contribution of key renewable energy resources to electricity supply in Australia can be developed utilising a spatial assessment of important resource variables within the context of plausible utilisation of renewable energy resources. A literature review provides the basis for an assessment of the current state of knowledge regarding the use of renewable energy for electricity supply in Australia. The range of different renewable energy technologies is canvassed, brief descriptions of the technologies are presented and an appraisal is made of their commercial development status. The extent to which different renewable energy technologies have been utilised for electricity supply in Australia and prospects for near-future developments are described. 
Scenario analysis is used to provide insights into future development paths for renewable energy. This assists in the identification of key renewable energy technologies that will be examined in more detail and it helps in the setting of parameters for assessments of these technologies. Three scenarios are presented and these provide a framework for an analysis of possible contributions by renewable energy to electricity supply in Australia. Of those technologies that could potentially make significant contributions to electricity supply in the near term, utility scale wind energy, domestic rooftop photovoltaics (rooftop BIPV) and domestic solar hot water (SHW) stand out as being key technologies where further research in relation to resource assessment would be beneficial. The dispersed nature of the resource bases utilised by these technologies has made it difficult to assess how much electricity they could generate or displace. Conventional methods of assessing electricity generation or displacement, based upon project or site-specific analyses, have not proven amenable to analyses of the total amount of electricity that could be generated or displaced by these technologies throughout Australia. Therefore, alternative methods for assessing the potential of these technologies are needed. New models for analysing wind, BIPV and SHW performance are developed in this study. These models demonstrate the application of Geographical Information Systems (GIS) for wind, BIPV and SHW resource mapping. Wind energy maps for Australia are created showing actual wind speeds suitable for use at elevations appropriate for wind turbines. These maps represent significant advances over traditional wind atlases used in other nations due to their presentation of estimated actual wind speeds, rather than isovent lines for idealised wind speed gradients. 
The use of GIS for analysing BIPV and SHW resources also represents a significant departure from traditional modelling processes and demonstrates a means of overcoming important limitations of existing BIPV and SHW evaluation tools. The wind, BIPV and SHW resource mapping processes that have been developed and applied in this study show how broad-area assessments of electricity supply or displacement can be produced for technologies where spatial variations in key performance attributes constrain the use of traditional modelling processes.
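The thesis's GIS models themselves are not reproduced in this abstract. As a rough sketch of the kind of per-cell computation such wind resource mapping relies on, the standard power-law shear profile can lift near-surface wind speeds to turbine hub height across a grid; the grid values and shear exponent below are invented:

```python
# Illustrative sketch only: this is not the thesis's model. It applies the
# common power-law wind-shear extrapolation, cell by cell, to a small grid
# of hypothetical 10 m wind speeds, producing estimated hub-height speeds
# of the sort a wind energy map would display.

def extrapolate(v_ref, z_ref=10.0, z_hub=80.0, alpha=0.143):
    """Power law: v(z_hub) = v_ref * (z_hub / z_ref) ** alpha.
    alpha = 1/7 is a textbook value for neutral conditions over open terrain."""
    return v_ref * (z_hub / z_ref) ** alpha

# Hypothetical 2x3 grid of measured 10 m wind speeds (m/s):
grid_10m = [[4.8, 5.2, 6.1],
            [5.5, 6.0, 6.7]]

# Estimated 80 m hub-height speeds, one value per grid cell:
grid_hub = [[round(extrapolate(v), 2) for v in row] for row in grid_10m]
```

A GIS-based assessment would run this (or a more sophisticated boundary-layer model) over every raster cell of a national wind atlas, with terrain-dependent shear exponents rather than a single constant.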
|
5 |
Organisational awareness: mapping human capital for enhancing collaboration in organisations. Garbash, Dor Avraham, 13 October 2016.
How can we become more aware of the sources of insight within human organisations? Rapid economic and technological changes force organisations to become more adaptive, agile and interdisciplinary. 
In light of this, organisations are seeking alternatives to traditional hierarchical communication structures that hinder bottom-up collaboration practices. Effective bottom-up methods require empowering members with access to the information and expertise they need to take qualified decisions. This is a complex challenge that involves organisational culture, IT and work practices. Failing to address it creates bottlenecks that can slow down business processes, hinder innovation and often lead to suboptimal and redundant work. For example, a 2014 survey of 152 campus IT leaders in the US estimated that 19% of campus IT systems are redundant, costing US universities $3.8B per year. In aggregate, knowledge workers find the information they need only 56% of the time, and a quarter of their total work time is spent finding and analyzing information; this time waste alone costs $7K per employee annually. Another example of the waste created is that newcomers and remote employees may take up to 2 years to integrate fully within their department. Furthermore, according to extended surveys, only 28% of millennials feel that their current organisations make 'full use' of the skills they have to offer, and 66% expect to leave their organisation by 2020. Successfully resolving this challenge holds the potential to motivate organisation members, as well as enhance innovation and learning. The focus of this thesis is, first, to better understand this problem by exploring the challenges faced by a university IT department and an interdisciplinary research center; second, to co-develop and implement a solution with these institutions (I describe their usage of the software tool we developed and the outcomes and value obtained in these pilots); and third, to test the effectiveness of the solution and explore further applications and the potential for a similar system to be used at a wider scale. 
To better understand the problem I engaged in discussion with members and leaders of both organisations. An important conclusion from these discussions is that members of both organisations often suffer from a lack of awareness of their organisation's knowledge capital: the competencies, knowledge of processes and social connections of their colleagues. Because of this, exposure to the innovative ideas, opportunities and common interests of peers is severely limited, causing unnecessary delays in inter-team projects, bottlenecks, and a lack of awareness of internship opportunities. I further broke the problem down and defined it as one of information fragmentation: different pieces of information are stored in disparate databases or inside people's heads, and obtaining them requires effort and know-how. Following the conclusions of this analysis and a state-of-the-art review, we set the goal of creating a collaborative visual database to map the people, projects, skills and institutions of the IT department of Descartes University and, in addition, the people, interests and internship opportunities within the CRI, an interdisciplinary research and education center. We also conducted interviews, surveys and quizzes, which confirmed that people had difficulty identifying experts outside their core teams. During the course of this thesis, I progressively addressed this challenge by developing two collaborative web applications, Rhizi and Knownodes. Knownodes is a collaborative knowledge graph that uses information-rich edges to describe relationships between resources. Rhizi is a real-time, collaborative knowledge-capital mapping interface. A distinctive feature of Rhizi is a UI that turns text-based assertions made by users into a visual knowledge graph. (...)
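Rhizi's actual assertion syntax is not given in this abstract, but the core idea of turning text-based assertions into a labelled knowledge graph can be sketched as follows; the "A -[rel]-> B" format and all names below are invented:

```python
# Illustrative sketch only: this is not Rhizi's implementation. It parses a
# hypothetical assertion format into a graph whose edges carry relationship
# labels ("information-rich edges"), the structure a mapping UI would render.

import re

ASSERTION = re.compile(r"^\s*(.+?)\s*-\[(.+?)\]->\s*(.+?)\s*$")

def build_graph(assertions):
    """Turn 'source -[relation]-> target' lines into (nodes, edges)."""
    nodes, edges = set(), []
    for line in assertions:
        m = ASSERTION.match(line)
        if not m:
            continue  # ignore lines that are not well-formed assertions
        src, rel, dst = m.groups()
        nodes.update([src, dst])
        edges.append((src, rel, dst))  # labelled, information-rich edge
    return nodes, edges

nodes, edges = build_graph([
    "Alice -[works_on]-> Project X",
    "Project X -[requires]-> GIS skills",
    "Bob -[has_skill]-> GIS skills",
])
# Traversing the shared "GIS skills" node links Alice's project to Bob's
# expertise -- the kind of cross-team awareness the thesis aims to create.
```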
|
6 |
The Need for Accurate Pre-processing and Data Integration for the Application of Hyperspectral Imaging in Mineral Exploration. Lorenz, Sandra, 06 November 2019.
Hyperspectral imaging is a key technology in non-invasive mineral analysis, whether at laboratory scale or as a remote sensing method. Rapid developments in sensor design and computing with respect to miniaturization, image resolution and data quality are opening up new fields of application in the exploration of mineral resources, such as drone-based data acquisition or digital outcrop and drill-core mapping. However, generally applicable data processing routines are largely lacking, which hampers the establishment of these promising approaches. Particular challenges concern the necessary radiometric and geometric data corrections, spatial georeferencing, and integration with other data sources. This work describes innovative workflows for solving these problems, demonstrates the importance of the individual steps, and shows the potential of appropriately processed spectral image data for complex tasks in mineral exploration and the geosciences. / Hyperspectral imaging (HSI) is one of the key technologies in current non-invasive material analysis. Recent developments in sensor design and computer technology allow the acquisition and processing of high spectral and spatial resolution datasets. In contrast to active spectroscopic approaches such as X-ray fluorescence or laser-induced breakdown spectroscopy, passive hyperspectral reflectance measurements in the visible and infrared parts of the electromagnetic spectrum are considered rapid, non-destructive, and safe. Compared to true color or multi-spectral imagery, a much larger range and even small compositional changes of substances can be differentiated and analyzed. Applications of hyperspectral reflectance imaging can be found in a wide range of scientific and industrial fields, especially when physically inaccessible or sensitive samples and processes need to be analyzed. 
In geosciences, this method offers a possibility to obtain spatially continuous compositional information about samples, outcrops, or regions that might otherwise be inaccessible, or too large, dangerous, or environmentally valuable for traditional exploration at reasonable expenditure. Depending on the spectral range and resolution of the deployed sensor, HSI can provide information about the distribution of rock-forming and alteration minerals, specific chemical compounds, and ions. Traditional operational applications comprise space-, airborne, and lab-scale measurements with a usually (near-)nadir viewing angle. The diversity of available sensors, in particular the ongoing miniaturization, enables their usage from a wide range of distances and viewing angles on a large variety of platforms. Many recent approaches focus on the application of hyperspectral sensors at an intermediate to close sensor-target distance (one to several hundred meters) between airborne and lab-scale, usually implying exceptional acquisition parameters. These comprise unusual viewing angles, as for the imaging of vertical targets; specific geometric and radiometric distortions associated with the deployment of small moving platforms such as unmanned aerial systems (UAS); or the extreme size and complexity of data created by large imaging campaigns. Accurate geometric and radiometric data correction using established methods is often not possible. Another important challenge results from the overall variety of spatial scales, sensors, and viewing angles, which often impedes a combined interpretation of datasets, such as in a 2D geographic information system (GIS). Recent studies have mostly had to work with at least partly uncorrected data, which cannot set the results in a meaningful spatial context.
These major unsolved challenges of hyperspectral imaging in mineral exploration motivated this work. The core aim is the development of tools that bridge data acquisition and interpretation by providing full image-processing workflows, from the acquisition of raw data in the field or lab to fully corrected, validated and spatially registered at-target reflectance datasets, which are valuable for subsequent spectral analysis, image classification, or fusion in different operational environments at multiple scales. I focus on promising emerging HSI approaches: (1) the use of lightweight UAS platforms; (2) mapping of inaccessible vertical outcrops, sometimes at up to several kilometers' distance; (3) multi-sensor integration for versatile sample analysis in the near-field or at lab scale; and (4) the combination of reflectance HSI with other spectroscopic methods such as photoluminescence (PL) spectroscopy for the characterization of valuable elements in low-grade ores. For each topic, the state of the art is analyzed, tailored workflows are developed to meet key challenges, and the potential of the resulting datasets is showcased on prominent examples from mineral exploration. Combined in a Python toolbox, the developed workflows aim to be versatile with regard to the sensors used and the desired applications.
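The thesis's actual Python toolbox is not reproduced in this abstract. As a hedged illustration of one radiometric step such a workflow must perform, raw sensor counts can be converted to at-target reflectance using white- and dark-reference measurements; all pixel values below are invented:

```python
# Illustrative sketch only: this is the standard white/dark-reference
# correction, one common radiometric step in hyperspectral pre-processing,
# not the thesis's specific workflow. It converts raw digital counts of a
# single pixel spectrum to reflectance, band by band.

def to_reflectance(raw, white, dark):
    """Per-band reflectance: (raw - dark) / (white - dark)."""
    return [
        (r - d) / (w - d) if w != d else 0.0
        for r, w, d in zip(raw, white, dark)
    ]

# Hypothetical 4-band pixel spectrum (digital counts):
raw   = [520, 880, 1400, 1210]
white = [2000, 2100, 2200, 2150]   # calibrated white reference panel
dark  = [100, 110, 120, 115]       # dark-current measurement

refl = to_reflectance(raw, white, dark)
# For well-exposed pixels, reflectance values fall in [0, 1]; a full
# workflow would apply this per pixel, then geometric correction and
# georeferencing before any spectral analysis or data fusion.
```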
|