61

Investigating Sand Dune Location, Orientation and Geomorphometry Through GEOBIA-Based Mapping: A Case Study in Northern Sweden

Stammler, Melanie January 2020
Climate change has repeatedly been framed as the defining issue of the Anthropocene, and with the Arctic changing at unprecedented speed there is a pressing need for a profound understanding of the Northern Swedish landscape. Northern Swedish aeolian sand dunes have been impacted by climatic changes throughout time. Their location, orientation and geomorphometry can therefore be used to explore past wind patterns and dune activity. By systematically and spatially mapping the dunes, patterns in location can be illustrated, dune orientations investigated, the dunes' geomorphometry characterised and sediment sources determined. Based on this knowledge, insight into landscape development, along with a better understanding of long-term landscape (in)stability in Northern Sweden, can be gained. This M.Sc. thesis sets out to summarize useful concepts for understanding the formation of Northern Swedish aeolian sand dunes and to derive their implications for understanding landscape development. Based thereon, it deduces the strong need to systematically and spatially analyse aeolian sand dunes in Northern Sweden. The use of geographic object-based image analysis (GEOBIA) allows for the detection of potential dune locations over a large area and provides defined and reproducible mapping boundaries. Polygons are created by segmenting a residual-relief separated digital elevation model (DEM) as well as slope and curvature data. The multi-resolution segmentation provides the best results with a scale parameter of 15 and a homogeneity criterion of 0.1 for the shape criterion and 0.5 for the compactness criterion. A rule-based classification with empirically derived parameters accepts on average 2.5 % of the segmented image objects as potential dune sites. Subsequent expert decision confirms on average 25 % of the classified image objects as identified dune locations. The rule-based classification provides the best results when targeting a smaller area, as this allows for less variability within the dune characteristics. The investigation of expert-accepted dune locations confirms a prevalence of parabolic dune forms, reveals the coexistence of simple dunes with large coalesced systems, exemplifies variation in dune orientation and highlights that the majority of dunes are supplied by glaciofluvial deposits. By mapping Northern Swedish aeolian sand dunes and investigating their meaning for landscape development, this thesis furthermore contributes to closing the gap identified in research on Northern Swedish aeolian sand dunes. / The first association with sand dunes is probably the Sahara rather than Northern Sweden. Yet these fossil sand dunes are also highly relevant and interesting to study. They can be analysed in relation to the surrounding landscape and their orientation, and these characteristics help identify patterns in landscape development. From this, and because of the dunes' relatively old age, conclusions can be drawn about landscape (in)stability on geological timescales. This is very useful because it can give insights into what the climate was like when the sand dunes formed, during periods no human has witnessed. Knowledge of, for example, the climate that prevailed long ago can be used, among other things, to estimate how the landscape will change in the future as a result of climate change. Despite these useful properties of the dunes, little research has been done so far.
This degree project seeks to close that knowledge gap by mapping sand dunes in Northern Sweden using geographic object-based image analysis (GEOBIA). This means that imagery and vegetation-stripped digital elevation models are analysed automatically with algorithms. The focus is not on analysing individual pixels; rather, pixels with similar properties, such as slope, curvature and spectral characteristics, are grouped, and these groups then form the basis of the analysis. Potential sand dunes are detected semi-automatically so that their position and orientation can subsequently be analysed. The knowledge gained in this way forms the basis for further research. Another goal is to contribute to a deeper understanding of landscape development in Northern Sweden. It is important to remember that this is an area particularly affected by climate change, so greater knowledge of the landscape's past response to climate can help predict the future of this region. Besides increasing knowledge about sand dunes in Northern Sweden, this Master's thesis also helps extend the use of GEOBIA within geomorphological studies.
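The residual-relief separation and rule-based filtering described above can be sketched in Python. Note that the multi-resolution segmentation itself (scale 15, shape 0.1, compactness 0.5) is an eCognition-style step not reproduced here, and the object attributes and thresholds in the filter below are hypothetical stand-ins for the thesis's empirically derived rules.

```python
import numpy as np
from scipy import ndimage

def residual_relief(dem, window=101):
    """Residual-relief separation: subtract a smoothed trend surface
    from the DEM so that metre-scale dune forms stand out."""
    trend = ndimage.uniform_filter(dem, size=window)
    return dem - trend

def rule_based_filter(objects):
    """Accept segmented image objects as potential dune sites.
    Attribute names and thresholds are illustrative only."""
    accepted = []
    for obj in objects:
        if (obj["mean_relief"] > 0.5              # positive residual relief (m)
                and 2 <= obj["mean_slope"] <= 20  # gentle dune flanks (degrees)
                and obj["curvature_std"] < 1.0):  # smooth, coherent form
            accepted.append(obj)
    return accepted

# Example: a flat synthetic DEM with one dune-like bump
dem = np.zeros((200, 200))
dem[90:110, 90:110] += 2.0
print(residual_relief(dem).max())  # the bump survives trend removal
```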
62

Estimating Pinyon and Juniper Cover Across Utah Using NAIP Imagery

Roundy, Darrell B 01 June 2015
Expansion of Pinus L. (pinyon) and Juniperus L. (juniper) (P-J) trees into sagebrush (Artemisia L.) steppe communities can lead to negative effects on hydrology, loss of wildlife habitat, and a decrease in desirable understory vegetation. Tree reduction treatments are often implemented to mitigate these negative effects. In order to prioritize and effectively plan these treatments, rapid, accurate, and inexpensive methods are needed to estimate tree canopy cover at the landscape scale. We used object-based image analysis (OBIA) software (Feature Analyst™ for ArcMap 10.1®, ENVI Feature Extraction®, and Trimble eCognition Developer 8.2®) to extract tree canopy cover from NAIP (National Agriculture Imagery Program) imagery. We then compared our extractions with ground-measured tree canopy cover (crown diameter and line point) on 309 subplots across 44 sites in Utah. Extraction methods did not consistently over- or under-estimate ground-measured P-J canopy cover except where tree cover was > 45%. Estimates of tree canopy cover using OBIA techniques were strongly correlated with estimates using the crown diameter method (r = 0.93 for ENVI, 0.91 for Feature Analyst, and 0.92 for eCognition). Tree cover estimates using OBIA techniques had lower correlations with tree cover measurements using the line-point method (r = 0.85 for ENVI, 0.83 for Feature Analyst, and 0.83 for eCognition). Results from this study suggest that OBIA techniques may be used to extract P-J tree canopy cover accurately and inexpensively. All software packages evaluated accurately extracted P-J canopy cover from NAIP imagery when the imagery was not blurred and when P-J cover was not mixed with Amelanchier alnifolia (Utah serviceberry) and Quercus gambelii (Gambel's oak), shrubs with spectral values similar to those of P-J.
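The comparison behind the reported correlations reduces to Pearson's r between OBIA-extracted and ground-measured cover per subplot. A minimal sketch with made-up values (not the study's data):

```python
import numpy as np

# Hypothetical per-subplot canopy cover (%): ground measurements
# (crown diameter method) vs. OBIA extraction from NAIP imagery.
ground_cover = np.array([12.0, 30.5, 44.0, 8.2, 51.0, 22.3])
obia_cover = np.array([10.8, 32.1, 41.5, 9.0, 46.2, 24.0])

# Pearson correlation, the statistic reported per software package
r = np.corrcoef(ground_cover, obia_cover)[0, 1]

# Mean signed error reveals any over- or under-estimation tendency
bias = np.mean(obia_cover - ground_cover)
print(f"r = {r:.2f}, bias = {bias:+.1f} % cover")
```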
63

Feature Extraction and Feature Selection for Object-based Land Cover Classification : Optimisation of Support Vector Machines in a Cloud Computing Environment

Stromann, Oliver January 2018
Mapping the Earth's surface and its rapid changes with remotely sensed data is a crucial tool to understand the impact of an increasingly urban world population on the environment. However, the impressive amount of freely available Copernicus data is only marginally exploited in common classifications. One of the reasons is that measuring the properties of training samples, the so-called 'features', is costly and tedious. Furthermore, handling large feature sets is not easy in most image classification software. This often leads to the manual choice of a few, allegedly promising features. In this Master's thesis degree project, I use the computational power of Google Earth Engine and Google Cloud Platform to generate an oversized feature set in which I explore feature importance and analyse the influence of dimensionality reduction methods. I use Support Vector Machines (SVMs) for object-based classification of satellite images, a commonly used method. A large feature set is evaluated to find the features most relevant to discriminating the classes, which thereby contribute most to high classification accuracy. In doing so, one can bypass the sensitive knowledge-based but sometimes arbitrary selection of input features.

Two kinds of dimensionality reduction methods are investigated: the feature extraction methods Linear Discriminant Analysis (LDA) and Independent Component Analysis (ICA), which transform the original feature space into a projected space of lower dimensionality, and the filter-based feature selection methods (the chi-squared test, mutual information and the Fisher criterion), which rank and filter the features according to a chosen statistic. I compare these methods against the default SVM in terms of classification accuracy and computational performance. Classification accuracy is measured by overall accuracy, prediction stability, inter-rater agreement and sensitivity to training set sizes. Computational performance is measured by the decrease in training and prediction times and the compression factor of the input data. Based on this analysis, I identify the best-performing classifier with the most effective feature set.

In a case study of mapping urban land cover in Stockholm, Sweden, based on multitemporal stacks of Sentinel-1 and Sentinel-2 imagery, I demonstrate the integration of Google Earth Engine and Google Cloud Platform for an optimised supervised land cover classification. I use dimensionality reduction methods provided in the open source scikit-learn library and show how they can improve classification accuracy and reduce the data load. At the same time, this project gives an indication of how the exploitation of big earth observation data can be approached in a cloud computing environment.

The preliminary results highlighted the effectiveness and necessity of dimensionality reduction methods, but also strengthened the need for inter-comparable object-based land cover classification benchmarks to fully assess the quality of the derived products. To facilitate this need and encourage further research, I plan to publish the datasets (i.e. imagery, training and test data) and provide access to the developed Google Earth Engine and Python scripts as Free and Open Source Software (FOSS).
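Since the abstract names scikit-learn, the dimensionality reduction comparison can be sketched as pipelines evaluated by cross-validation. The sketch below uses synthetic data and a subset of the named methods (LDA and mutual information against a baseline SVM); it illustrates the approach, not the thesis's actual workflow or parameterisation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical samples: rows = image objects, columns = an oversized
# feature set of spectral/textural statistics; five land-cover classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 120))
y = rng.integers(0, 5, size=500)

candidates = {
    "baseline_svm": Pipeline([("scale", StandardScaler()),
                              ("svm", SVC(kernel="rbf"))]),
    "lda_svm": Pipeline([("scale", StandardScaler()),
                         ("lda", LinearDiscriminantAnalysis()),
                         ("svm", SVC(kernel="rbf"))]),
    "mi_svm": Pipeline([("scale", StandardScaler()),
                        ("select", SelectKBest(mutual_info_classif, k=20)),
                        ("svm", SVC(kernel="rbf"))]),
}

for name, pipe in candidates.items():
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean accuracy = {acc:.3f}")
```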
64

"Eftersom allt är byggt på ideella krafter så gör vi så gott vi kan" : Hembygdsföreningars ideella textilförvaltning / ”We do what we can with what we have” : Local history societies textile management as a voluntary society

Åsblom, Anna-Linnéa January 2023
The main purpose of this study is to examine the interaction between practical textile management and the values that make up its theoretical framework. The practical side is examined through a survey of 50 local history societies in Sweden, designed to map the presence and management of textiles and the degree of advice the societies receive. For the comparison, the concepts of ethical responsibility and symbolic worth are used to pinpoint where ideal and experience meet. This also raises questions about maintaining sustainable management and about how the societies need and accept guidance. In general, active textile management is carried out consistently with object-based research, but the societies' knowledge of their ethical responsibility varies. The practical handling of textiles is also often affected by a lack of resources and knowledge; both factors limit how broadly the societies can work actively with the textiles. In short, this brings many challenges, but also many possibilities for further development if the existing resources are strengthened. The symbolic worth of textiles, although mostly connected to their usage, is generally rated highly by the societies. An interest in textiles is also linked to the presence of active management and comes with a willingness to apply practical management in line with the theoretical framework. In conclusion, willingness and interest are fairly high, and when the museum sector offers support in the form of knowledge and awareness of responsibility, there is high potential to develop the textile management further.
65

Cyber Attack Modelling using Threat Intelligence. An investigation into the use of threat intelligence to model cyber-attacks based on Elasticsearch and honeypot data analysis

Al-Mohannadi, Hamad January 2019
Cyber-attacks have become an increasing threat to organisations as well as the wider public. This has led to severely negative impacts on the economy at large and on the everyday lives of people. Every successful cyber attack on targeted devices and networks highlights the weaknesses within the defense mechanisms responsible for securing them. Gaining a thorough understanding of cyber threats beforehand is therefore essential to prevent potential attacks in the future. Numerous efforts have been made to avoid cyber-attacks and protect the valuable assets of an organisation. However, the most recent cyber-attacks have exhibited the profound levels of sophistication and intelligence of the attacker, and have shown conventional attack detection mechanisms to fail in several attack situations. Several researchers have highlighted this issue previously, along with the challenges faced by alternative solutions. There is clearly an unprecedented need for a solution that takes a proactive approach to understanding potential cyber threats in real-time situations. This thesis proposes a progressive and multi-aspect solution comprising cyber-attack modeling for the purpose of cyber threat intelligence. The proposed model emphasises approaches for organisations to understand and predict future cyber-attacks by collecting and analysing network events to identify attacker activity. This could then be used to understand the nature of an attack and to build a threat intelligence framework. However, collecting and analysing live data from a production system can be challenging and even dangerous, as it may leave the system more vulnerable. The solution detailed in this thesis deployed cloud-based honeypot technology, which is well known for mimicking the real system while collecting actual data, to observe network activity and help avoid potential attacks in near real-time. In this thesis, we suggest a new threat intelligence technique that analyses attack data collected using cloud-based web services in order to identify attack artefacts and support active threat intelligence. This model was evaluated through experiments specifically designed using Elastic Stack technologies. The experiments were designed to assess the identification and prediction capability of the threat intelligence system for several different attack cases. The proposed cyber threat intelligence and modeling systems showed significant potential to detect future cyber-attacks in real-time. / Government of Qatar
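To illustrate the kind of honeypot-log analysis the Elastic Stack enables, the sketch below aggregates the most active attacker source IPs and targeted ports via Elasticsearch's REST search API. The index pattern and field names are hypothetical and depend on the honeypot's log mapping.

```python
import requests

# Hypothetical index pattern and fields; adjust to the honeypot's
# mapping (src_ip and dst_port must be keyword/numeric fields).
ES_URL = "http://localhost:9200/honeypot-*/_search"

query = {
    "size": 0,  # aggregations only, no raw hits
    "query": {"range": {"@timestamp": {"gte": "now-24h"}}},
    "aggs": {
        "top_sources": {"terms": {"field": "src_ip", "size": 10}},
        "top_ports": {"terms": {"field": "dst_port", "size": 10}},
    },
}

resp = requests.post(ES_URL, json=query, timeout=30).json()
for bucket in resp["aggregations"]["top_sources"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```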
66

Image/video compression and quality assessment based on wavelet transform

Gao, Zhigang 14 September 2007
No description available.
67

An Application-Attuned Framework for Optimizing HPC Storage Systems

Paul, Arnab Kumar 19 August 2020
High performance computing (HPC) is routinely employed in diverse domains, such as the life sciences and geology, to simulate and understand the behavior of complex phenomena. Big-data-driven scientific simulations are resource intensive and require both computing and I/O capabilities at scale. There is a crucial need to revisit the HPC I/O subsystem to better optimize for, and manage, the increased pressure on the underlying storage systems from big data processing. Extant HPC storage systems are designed and tuned for a specific set of applications targeting a range of workload characteristics, but they lack the flexibility to adapt to ever-changing application behaviors. The complex nature of modern HPC storage systems, along with ever-changing application behaviors, presents unique opportunities and engineering challenges. In this dissertation, we design and develop a framework for optimizing HPC storage systems by making them application-attuned. We select three different kinds of HPC storage systems: in-memory data analytics frameworks, parallel file systems and object storage. We first analyze HPC application I/O behavior by studying real-world I/O traces. Next we optimize parallelism for applications running in-memory, then we design data management techniques for HPC storage systems, and finally we focus on low-level I/O load balance for improving the efficiency of modern HPC storage systems. / Doctor of Philosophy / Clusters of multiple computers connected through the internet are often deployed in industry and laboratories for large-scale data processing or computation that cannot be handled by standalone computers. In such a cluster, resources such as CPUs, memory and disks are integrated to work together. With the increase in popularity of applications that read and write tremendous amounts of data, we need a large number of disks that can interact effectively in such clusters. These disks form part of high performance computing (HPC) storage systems. Such HPC storage systems are used by a diverse set of applications from organizations in a vast range of domains, from earth sciences, financial services and telecommunications to the life sciences. The HPC storage system should therefore perform well under the different read and write (I/O) requirements of all these different sets of applications. But current HPC storage systems do not cater to such varied I/O requirements. To this end, this dissertation designs and develops a framework for HPC storage systems that is application-attuned and thus provides much better performance than other state-of-the-art HPC storage systems without such optimizations.
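One way to picture the low-level I/O load balancing mentioned above is a greedy least-loaded placement of files onto storage targets. This is a toy sketch of the general idea, not the dissertation's algorithm; real systems also weigh bandwidth, queue depth and capacity.

```python
import heapq

def balanced_placement(file_sizes, num_targets):
    """Assign each file to the currently least-loaded storage target."""
    heap = [(0, t) for t in range(num_targets)]  # (load, target id)
    heapq.heapify(heap)
    placement = {}
    for fid, size in enumerate(file_sizes):
        load, target = heapq.heappop(heap)       # least-loaded target
        placement[fid] = target
        heapq.heappush(heap, (load + size, target))
    return placement

# Example: spread ten files of varying size (MB) across four targets
sizes = [64, 8, 512, 32, 128, 256, 16, 1024, 4, 64]
print(balanced_placement(sizes, num_targets=4))
```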
68

Remote sensing analysis of wetland dynamics and NDVI : A case study of Kristianstad's Vattenrike

Herstedt, Evelina January 2024
Wetlands are vital ecosystems providing essential services to both humans and the environment, yet they face threats from human activities leading to loss and disturbance. This study utilizes remote sensing (RS) methods, including object-based image analysis (OBIA), to map and assess wetland health in Kristianstad's Vattenrike in the southernmost part of Sweden between 2015 and 2023. The objectives include exploring RS capabilities in detecting wetlands and their changes, deriving wetland health indicators, and assessing classification accuracy. The study uses Sentinel-2 imagery, elevation data, and high-resolution aerial images to focus on wetlands along the river Helge å. Detection and classification were based on Sentinel-2 imagery and elevation data, using the eCognition software. The health assessment was based on the spectral indices Normalized Difference Vegetation Index (NDVI) and Modified Normalized Difference Water Index (mNDWI). Validation was conducted through aerial photo interpretation. The derived classifications demonstrate acceptable accuracy levels, and the analysis reveals relatively stable wetland conditions, with an increase in wetland area attributed to the construction of new wetlands. Changes in wetland composition, such as an increase in open meadows and swamp forests, were observed. However, an overall decline in NDVI values across the study area indicates potential degradation, attributed to factors like bare soil exposure and water presence. These findings provide insights into the local changes in wetland extent, composition, and health between the study years.
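The two indices behind the health assessment are simple band ratios. A minimal sketch, assuming Sentinel-2 surface-reflectance bands (B8/B4 for NDVI, B3/B11 for mNDWI):

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); Sentinel-2 bands B8 and B4."""
    return (nir - red) / (nir + red + 1e-10)

def mndwi(green, swir):
    """mNDWI = (Green - SWIR) / (Green + SWIR); Sentinel-2 B3 and B11.
    B11 is 20 m resolution, so resample it to B3's 10 m grid first."""
    return (green - swir) / (green + swir + 1e-10)

# Hypothetical surface-reflectance values for two pixels
print(ndvi(0.42, 0.08))   # dense vegetation -> ~0.68
print(mndwi(0.10, 0.03))  # open water -> positive
```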
69

Forest regeneration after deforestation: a study of the Santarém region, Pará, Brazil

Menezes, Diego Pinheiro de 15 March 2017
The Earth's surface has been modified more in the last 50 years than in any other period of history, most intensely and rapidly in the tropics through the expansion of human occupation frontiers into mature forest. The Brazilian Amazon, characterized by alternating extractive economic cycles, exemplifies this process. Between the abandonment of degraded areas and the opening of new occupation fronts, forest regeneration takes place. Secondary forest has a recognized importance for restoring ecosystem functions and the nutrient stocks lost from mature forest, but it was ignored for many years in official deforestation rates for the Brazilian Amazon. This study presents an approach using Geographic Object-Based Image Analysis (GEOBIA) to classify the stages of secondary succession in an area of about 11,124 km² in the Santarém region (Pará State, Brazil). Among the results, 19 different classifications were produced covering the period 1984 to 2016, which made it possible to identify the reduction of mature and secondary forest due to the expansion of the agricultural frontier. Another relevant result was the modeling of a decision tree applicable to surface reflectance images collected by the Landsat satellites, processing these classification attributes in data mining software.
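The decision-tree step can be sketched with scikit-learn on per-object Landsat surface-reflectance attributes. The bands, labels and samples below are hypothetical, and the thesis built its tree in a data mining application, so this only illustrates the technique.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical training samples: per-object means of six Landsat
# surface-reflectance bands plus NDVI, labelled by succession stage
# (0 = mature forest, 1 = advanced regrowth, 2 = initial regrowth,
#  3 = non-forest).
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(400, 7))
y = rng.integers(0, 4, size=400)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(X, y)

# Exported threshold rules are what make a tree reusable across
# many classifications spanning a long study period.
print(export_text(tree, feature_names=[
    "blue", "green", "red", "nir", "swir1", "swir2", "ndvi"]))
```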
70

Land use characterization in urban peripheries using geo-technologies: Guarapiranga Reservoir basin

Salim, Aline 02 September 2013
The study of cities requires a broad view, capable of identifying and relating the many processes at work in the production of urban space. Geo-technologies are commonly used to acquire detailed land cover information for urban areas. In this context, the objective of this research is to propose a methodology for generating information on urban occupation in the peripheries, defining procedures for urban analysis that yield information on the characteristics of that occupation from high-resolution satellite images. The area under study was the district of Jardim São Luís, located in the Guarapiranga Reservoir basin, an important water supplier for the São Paulo Metropolitan Region (RMSP) and an area of watershed protection and recovery under state legislation. The study discusses how urban space is organized and the processes reflected in the urban occupation of the RMSP periphery. The methodology developed in this research combined remote sensing and Geographic Information System (GIS) techniques with socio-economic data from the demographic census. The results were presented and discussed, and the proposed methodology proved promising for updating information on urban space to support urban planning and land management and, consequently, to improve the population's quality of life.
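The core methodological move, joining classified land cover with census socio-economic data, can be sketched with geopandas. File, column and class names are hypothetical, and the predicate keyword assumes geopandas 0.10 or newer.

```python
import geopandas as gpd

# Hypothetical inputs: land-cover polygons classified from
# high-resolution imagery, and census tracts with socio-economic
# attributes from the demographic census.
cover = gpd.read_file("land_cover.shp")      # columns: class, geometry
tracts = gpd.read_file("census_tracts.shp")  # columns: tract_id, income, geometry

# Attach each land-cover polygon to the census tract containing it
joined = gpd.sjoin(cover, tracts, how="inner", predicate="within")

# Summarise occupation characteristics per tract, e.g. counts of
# each land-cover class that can then be related to income
summary = (joined.groupby(["tract_id", "class"])
                 .size()
                 .unstack(fill_value=0))
print(summary.head())
```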
