161

Quality assessment for hyperspectral airborne systems

Kerr, Grégoire Henry Gérard 06 August 2015 (has links)
This work proposes a methodology for performing a quality assessment of the complete airborne hyperspectral system, ranging from data acquisition up to land-product generation. It is compliant with other quality assessment initiatives, such as the European Facility for Airborne Research (EUFAR), the Quality Assessment for Earth Observation working group (QA4EO) and the Joint Committee for Guides in Metrology (JCGM), and extends them into a generic framework allowing for a flexible but reliable quality assessment strategy. Since airborne hyperspectral imagery is usually acquired in several partially overlapping flight-lines, it is proposed to treat the image information in these overlap regions as redundant and use it to retrieve the imagery's internal variability. Because of the differing viewing directions, the variability between two flight-lines can be regarded as a worst case, complementing existing approaches that concentrate on the evaluation of homogeneous areas. The underlying method is generic and can easily be introduced into the German Aerospace Center (DLR) hyperspectral processing chain. Comparing two overlapping flight-lines is not straightforward, if only because of the geo-location errors present in the data. A first step therefore retrieves the relative variability of the pixels' geo-locations, providing pairs of pixels imaging the same areas; these pairs are then checked for plausibility using reliability metrics computed from higher-level data products. Subsequently, the pairs of pixels are used to obtain quality indicators accounting for the reproducibility of mapping products, extending EUFAR's quality layers up to land products. The third stage of the analysis uses these reliability results to improve the mapping products: it is proposed to maximise the reliability over the mapping methods' parameters. Finally, the repeatability assessment is back-propagated to the hyperspectral data itself, and an estimator of the reflectance variability (including model- and scene-induced uncertainties) is derived by means of a blind-deconvolution approach.
Altogether, this complements and extends the EUFAR quality layers with estimates of the data and product repeatability, while providing confidence intervals as recommended by the JCGM and QA4EO.
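The pairing of pixels across overlapping flight-lines can be illustrated with a small sketch. Everything below is hypothetical (synthetic coordinates and spectra, a simple nearest-neighbour match with a distance tolerance); the actual DLR processing chain is more involved, but the idea of using matched pairs to estimate per-band variability is the same.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: two overlapping flight lines, each pixel with a
# geolocation (x, y) and a reflectance spectrum (n_bands values).
n_bands = 5
coords_a = rng.uniform(0, 100, size=(200, 2))
spectra_a = rng.uniform(0.1, 0.5, size=(200, n_bands))
# Line B images part of the same area with small geolocation offsets
# and slightly different reflectance (view-angle effects, noise).
coords_b = coords_a[:120] + rng.normal(0, 0.2, size=(120, 2))
spectra_b = spectra_a[:120] + rng.normal(0, 0.02, size=(120, n_bands))

def pair_pixels(coords_a, coords_b, tolerance):
    """Match each pixel of line B to the nearest pixel of line A,
    keeping only pairs closer than `tolerance` (ground units)."""
    pairs = []
    for j, cb in enumerate(coords_b):
        dists = np.linalg.norm(coords_a - cb, axis=1)
        i = int(np.argmin(dists))
        if dists[i] < tolerance:
            pairs.append((i, j))
    return pairs

pairs = pair_pixels(coords_a, coords_b, tolerance=1.0)
ia = [i for i, _ in pairs]
ib = [j for _, j in pairs]

# Worst-case internal variability per band: the spread of the
# differences between corresponding pixels of the two lines.
diff = spectra_a[ia] - spectra_b[ib]
per_band_std = diff.std(axis=0)
print(len(pairs), per_band_std.round(3))
```

In a real processing chain the matching would account for systematic geo-location error models rather than a fixed tolerance, but the per-band spread of matched-pair differences is the kind of internal-variability estimate the abstract describes.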
162

Übersicht über die Promotionen an der Wirtschaftswissenschaftlichen Fakultät der Universität Leipzig von 1993 bis 1997

Universität Leipzig 28 November 2004 (has links)
No description available.
163

Übersicht über die Promotionen an der Wirtschaftswissenschaftlichen Fakultät der Universität Leipzig von 1998 bis 2000

Universität Leipzig 28 November 2004 (has links)
No description available.
164

Quality Assurance of Exposure Models for Environmental Risk Assessment of Substances / Qualitätssicherung von Expositionsmodellen zur Umweltrisikoabschätzung von Substanzen

Schwartz, Stefan 04 September 2000 (has links)
Environmental risk assessment of chemical substances in the European Union is based on a harmonised scheme. The required models and parameters are laid down in the Technical Guidance Document (TGD) and are implemented in the EUSES software. An evaluation study of the TGD exposure models was carried out; in particular, the models for estimating chemical intake by humans were investigated. The objective of this study was two-fold: firstly, to develop an evaluation methodology, since no appropriate approach is available in the scientific literature; secondly, to elaborate the applicability and limitations of the models and to provide proposals for their improvement. The principles of model evaluation in terms of quality assurance, model validation and software evaluation were elaborated, and a suitable evaluation protocol for chemical risk assessment models was developed. Quality assurance of a model includes internal validation (e.g. an investigation of the underlying theory) and external validation (e.g. a comparison of the results with experimental data), and addresses the evaluation of the respective software. It should focus not only on the predictive capability of a model, but also on the strength of the theoretical underpinnings, the evidence supporting the model's conceptualisation, the database and the software. The external validation was performed using a set of reference substances with different physico-chemical properties and use patterns. Additionally, sensitivity and uncertainty analyses were carried out, and alternative models were discussed. Recommendations for improvements and maintenance of the risk assessment methodology were presented. To perform the software evaluation, quality criteria for risk assessment software were developed.
From a theoretical point of view, it was shown that the models strongly depend on the lipophilicity of the substance, that the underlying assumptions drastically limit their applicability, and that realistic concentrations may seldom be expected. If the models are applied without adjustment, high uncertainties must inevitably be expected. However, many cases were found in which the models deliver highly valuable results. The overall system was classified as a good compromise between complexity and practicability, but several restrictions were revealed: the investigated models for assessing indirect exposure of humans are currently not applicable, or only partly applicable, to dissociating compounds, very polar compounds, very lipophilic compounds, ions, some surfactants, compounds whose metabolites cause the problems, and mixtures. In a strict sense, the method is only applicable to persistent, non-dissociating chemicals of intermediate lipophilicity; further limitations may exist. Regarding the software, it was found that EUSES basically fulfils the postulated criteria but is highly complex and non-transparent. To overcome these inadequacies, a more modular design is proposed.
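The sensitivity and uncertainty analyses mentioned in the abstract can be illustrated with a minimal Monte Carlo sketch. The model below is not the TGD/EUSES model; it is a toy lipophilicity-driven bioconcentration regression (log BCF = a·log Kow + b, with assumed coefficients) used only to show how uncertainty in an input such as log Kow propagates to the model output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: a one-parameter bioconcentration regression of the
# classical log-linear type; coefficients a, b are assumed, not taken
# from the TGD. The input log Kow is treated as an uncertain measurement
# and sampled many times (Monte Carlo uncertainty propagation).
a, b = 0.85, -0.70                            # assumed coefficients
log_kow = rng.normal(4.0, 0.3, size=10_000)   # uncertain input

log_bcf = a * log_kow + b
low, med, high = np.percentile(log_bcf, [5, 50, 95])
print(round(med, 2), round(high - low, 2))
```

The width of the 5th-95th percentile band is one simple way to report the output uncertainty; for a real TGD model the same loop would be run over all uncertain inputs jointly.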
165

Spatial Aspects of Chemical Exposure Assessment: A Tool for River Networks

Wagner, Jan-Oliver 31 October 2001 (has links)
Chemical exposure assessment has gained increasing attention in recent years. Its methodologies have enabled scientists and policy-makers to understand exposure paths and to identify environmental compartments of concern. Mathematical models are used to predict a chemical's concentration in a certain compartment and, in some cases, also the duration or time of highest load. With the Geo-referenced Regional Exposure Assessment Tool for European Rivers (GREAT-ER), spatial aspects of regional exposure assessment are addressed for the "down-the-drain" path of consumer chemicals such as detergents. On the basis of a carefully developed simulation model (Boeije, 1999), this thesis describes the concept and realization of the software tool GREAT-ER. With data composition and processing on the one hand and application and analysis on the other, two crucial aspects of spatial exposure assessment are identified and discussed. Geo-referenced real-world data are not readily available in a usable form. An intermediate format is defined to separate the initial preparation of raw data from the final aggregation into a directly usable data set. It is shown that the latter step can be fully automated and thus efficiently supports an iterative procedure of data quality improvement. The application of GREAT-ER to the substances LAS (readily degradable) and boron (inert) in four Yorkshire catchments demonstrates the ability to predict mean final-effluent and in-stream concentrations with an average error of less than a factor of 2. Furthermore, regional summaries and risk characterization add useful information for judging a region's response to the (potential) release of a substance. In conclusion, the development and application of GREAT-ER have shown that geo-referenced exposure assessment is feasible with regard to both quality and practicability.
Future activities should focus on gaining further experience in performing simulations, improving the tool itself and extending its abilities. Finally, the integration of further models should be evaluated.
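The "average error of less than a factor of 2" criterion can be made concrete with a short sketch: for each site, take the ratio of the larger to the smaller of the predicted and measured concentrations, then average these factors geometrically. The concentration values below are invented for illustration, not taken from the Yorkshire study.

```python
import math

# Hypothetical predicted vs. measured in-stream concentrations (ug/L).
predicted = [12.0, 3.5, 0.8, 45.0, 6.1]
measured  = [10.0, 5.0, 1.1, 30.0, 7.0]

# Per-site error factor: ratio of the larger to the smaller value, so
# over- and under-prediction are penalised symmetrically.
factors = [max(p, m) / min(p, m) for p, m in zip(predicted, measured)]

# Geometric mean of the factors: the natural average for multiplicative
# errors ("within a factor of X on average").
mean_factor = math.exp(sum(math.log(f) for f in factors) / len(factors))
print(round(mean_factor, 2))
```

A geometric mean below 2 would satisfy the criterion quoted in the abstract; an arithmetic mean of the ratios would overweight the worst sites.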
166

Dynamics and sustainable use of species-rich moist forests: A process-based modelling approach

Rüger, Nadja 24 January 2007 (has links)
Sustainable use of species-rich moist forests is hampered by an insufficient understanding of their dynamics and long-term response to different wood-harvesting strategies. This thesis contributes to a better understanding of natural forest dynamics, explores the productivity of native forests subjected to different management strategies, and quantifies the ecological impacts of these strategies. It focuses on two study regions: tropical montane cloud forest (TMCF) in central Veracruz, Mexico, and Valdivian temperate rain forest (VTRF) on northern Chiloé Island, Chile. The process-based forest growth model FORMIND is applied to study natural forest succession, to assess the long-term ecological implications of fuelwood extraction in TMCF, to explore the potential of secondary TMCF for the provision of ecosystem services and fuelwood, and to compare potential harvesting strategies for VTRF with regard to forest productivity and ecological consequences. Simulation results show that both forest types have a high potential for wood production. As wood extraction increases, the forest structure becomes simplified because large old trees disappear from the forest, and the species composition shifts towards tree species favoured by the respective harvesting strategy. The overall ecological impact increases linearly with the amount of extracted wood. The simulation results make it possible to define management strategies that balance conservation and production objectives, promote the regeneration of desired tree species, or minimise shifts in the species composition of the forest. Process-based forest models enhance our understanding of the dynamics of species-rich moist forests and are indispensable tools for assessing the long-term implications of anthropogenic disturbances on forest ecosystems. Thereby they contribute to the conservation and sustainable use of native forests outside protected areas.
167

Sustainable grazing management in semi-arid rangelands. An ecological-economic modelling approach

Müller, Birgit 28 March 2006 (has links)
The loss of utilisable rangeland in semi-arid areas results in huge economic and social costs worldwide. Only adaptive management strategies are able to cope with these systems, which are largely driven by unpredictable and stochastic rainfall and are additionally characterised by strong feedback mechanisms between economic and ecological factors. This study aims to contribute to the identification of basic principles for sustainable grazing management. The approach emphasises learning from existing management systems through the use of ecological-economic modelling. Two apparently successful management systems in Namibia are used as a starting point for a broader analysis: the Gamis Karakul sheep farm and the land-use system of the semi-nomadic Ova-Himba. Although the economic systems differ strongly, their management appears to share two features: the importance of pasture resting and of adapting livestock numbers to the available forage. This PhD thesis contributes substantial insights into the relevance and functioning of pasture resting for sustainable grazing management in semi-arid regions. Assessment of the two case studies leads to the hypothesis that resting in the rainy season, particularly during wet years, is fundamental for ensuring pasture productivity when the regeneration potential of the vegetation is low. The thesis highlights that resting during wet years acts as a risk-reducing strategy. Additionally, the study reveals that access to economic risk management strategies, such as rain-index insurance, may change farmers' behaviour towards less conservative strategies. The approach used here, learning from existing, apparently successful grazing strategies by means of ecological-economic modelling, offers a powerful tool for tackling new questions related to global change. The scope of, and the limits to, generalising the key factors identified for sustainable grazing management can easily be examined under changing ecological, climatic and economic conditions.
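The role of wet-year resting can be illustrated with a deliberately minimal sketch; every equation and parameter below is hypothetical and not taken from the thesis. Rainfall is drawn stochastically, vegetation grows logistically with rainfall, and a "rest in wet years" rule suspends grazing whenever rainfall exceeds its long-term mean.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy rangeland model (all dynamics and parameters hypothetical):
# yearly rainfall drives logistic vegetation growth; grazing removes a
# fixed fraction of the biomass unless the pasture is rested that year.
years = 200
rain = rng.gamma(shape=2.0, scale=0.5, size=years)  # stochastic rainfall
mean_rain = 1.0                                     # gamma(2, 0.5) mean

def simulate(rest_in_wet_years):
    biomass = 1.0
    history = np.empty(years)
    for t in range(years):
        growth = 0.8 * rain[t] * biomass * (1 - biomass / 2.0)
        rest = rest_in_wet_years and rain[t] > mean_rain
        grazing = 0.0 if rest else 0.4 * biomass
        biomass = max(biomass + growth - grazing, 0.05)  # small refuge
        history[t] = biomass
    return history

always_graze = simulate(rest_in_wet_years=False)
wet_year_rest = simulate(rest_in_wet_years=True)
print(round(always_graze.mean(), 2), round(wet_year_rest.mean(), 2))
```

A real ecological-economic model would couple this to herd dynamics, forage demand and income, but even the toy version shows how a resting rule conditioned on rainfall changes the biomass trajectory under stochastic forcing.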
168

Object Warehouse - Konzeption der Basis objektorientierter Management Support Systems am Beispiel von Smalltalk und dem ERP Baan

Maur, Eitel von 18 May 2001 (has links)
This thesis investigates how the potentials of the object-oriented paradigm can be realised in the field of Management Support Systems (MSS). The investigation concentrates on the fundamental data-management component, the Data Warehouse (DWH), or Object Warehouse respectively. In the first part, after a detailed analysis of the terminology in the field of Management Support Systems and a suitable redefinition and classification of the Data Warehouse, the development path of the Data Warehouse is traced, working out in particular the requirements and objectives of the Data Warehouse concept. Building on this, a general, system-independent Data Warehouse concept is developed and evaluated against the conventional concepts prevailing in the literature and in practice. Based on a comprehensive classification and analysis of the object-oriented paradigm and its MSS- and DWH-specific improvement potentials, a concept for an Object Warehouse is then designed. Given the considerable number of object paradigms that differ greatly in their concepts and objectives, the necessary paradigm discussion and integration has to be partly restricted to the sub-aspect of Objectbase Management Systems (OBMS), specifically to those that support the semantic modelling capabilities of the object paradigm required for the Reconciled Object Layer of the Data Warehouse. In the second part of the thesis, the insights gained in concrete projects regarding the practical implementation of the Object Warehouse concept, using Smalltalk and the ERP system Baan, are presented and critically appraised. Considerable discrepancies between methodological concepts and their realisation in real software tools are made explicit, which at present still severely limits a comprehensive implementation of the Object Warehouse concept.
169

Übersicht über die Habilitationen an der Wirtschaftswissenschaftlichen Fakultät der Universität Leipzig von 1993 bis 1997

Universität Leipzig 11 March 1999 (has links)
No description available.
170

3D real time object recognition

Amplianitis, Konstantinos 01 March 2017 (has links)
Object recognition is a natural process of the human brain. It is performed in the visual cortex and relies on a binocular depth-perception system that renders a three-dimensional representation of the objects in a scene. Computer and software systems are being used to simulate the perception of three-dimensional environments with the aid of sensors that capture real-time images. Such images serve as input data for further analysis and for the development of algorithms, an essential ingredient for simulating the complexity of human vision, so as to achieve scene interpretation for object recognition similar to the way the human brain perceives it. The rapid pace of technological advancement in hardware and software is continuously bringing machine-based object recognition nearer to the human-vision prototype. The key in this field is the development of algorithms that achieve robust scene interpretation. A great deal of significant effort has been carried out successfully over the years in 2D object recognition, as opposed to 3D. It is therefore within the scope of this dissertation to contribute to the enhancement of 3D object recognition: a better interpretation and understanding of reality and of the relationships between objects in a scene. Through the use of low-cost commodity sensors, such as the Microsoft Kinect, RGB and depth data of a scene are retrieved and manipulated in order to generate human-like visual-perception data. The goal is to show how RGB and depth information can be utilised to develop a new class of 3D object recognition algorithms, analogous to the perception processed by the human brain.
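A typical first step in turning Kinect-style RGB-D data into 3D input for recognition algorithms is to back-project the depth image into a point cloud with the pinhole camera model. The sketch below uses illustrative, uncalibrated intrinsics and a synthetic depth image; a real pipeline would use the sensor's calibration.

```python
import numpy as np

# Back-project a depth image into a 3D point cloud using the pinhole
# camera model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
# Intrinsics below are illustrative values, not calibrated ones.
fx, fy = 525.0, 525.0       # focal lengths in pixels (assumed)
cx, cy = 319.5, 239.5       # principal point (assumed)

def depth_to_points(depth):
    """depth: (H, W) array of metric depths; returns (H*W, 3) points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

depth = np.full((480, 640), 2.0)   # synthetic flat wall at 2 m
points = depth_to_points(depth)
print(points.shape, points[:, 2].max())
```

The resulting point cloud (optionally coloured by the registered RGB image) is the common input representation for the 3D segmentation and recognition stages the abstract refers to.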
