81

Dinâmica populacional e avaliação do estoque do camarão rosa (Farfantepenaeus subtilis Pérez-Farfante 1967) na plataforma continental amazônica brasileira / Population dynamics and stock assessment of the brown shrimp, Farfantepenaeus subtilis, (Pérez-Farfante 1967) in the Amazon continental shelf

José Augusto Negreiros Aragão 12 September 2012 (has links)
The brown shrimp (Farfantepenaeus subtilis) exploited by the industrial fishery on the continental shelf of the Brazilian Amazon has a short but complex life cycle, inhabiting oceanic areas, at the north of its area of occurrence, during the adult and larval stages, and estuarine areas and lagoons during the post-larval and juvenile stages. The period of highest reproductive intensity extends from May to September; soon after hatching, the larvae begin their migration to coastal areas, passing through several stages, and settle and remain resident mainly between June and October. From September to January of the following year the intensity of recruitment to oceanic areas is highest; there the shrimp mature and, from December on, are caught by the industrial fishery. The highest abundance of the adult population in terms of biomass is observed from March to August, when the largest catches also occur. Females grow larger than males and are always present in greater proportion in the catches (61%). The asymptotic lengths were estimated at 231 mm (k = 1.6 year⁻¹) and 205 mm (k = 0.94 year⁻¹) for females and males respectively.
The population has a relatively high natural mortality rate, 2.53 year⁻¹ for females and 1.83 year⁻¹ for males, and pronounced fluctuations in recruitment and abundance are observed, with evidence that they are strongly governed by environmental conditions. The stock has been exploited at moderate levels in recent years (E = 0.45), although it suffered high exploitation rates in the 1980s, which led to a reduction in population size. The maximum sustainable yield, considered as a long-term average, was estimated at 4,032 tonnes of tails per year for a fishing effort of 19,370 days at sea. In recent years a trend of recovery of the population biomass has been observed, with the annual fluctuations characteristic of the species. The flow of the Amazon River is the main environmental factor governing the conditions of the coastal environment in the region, and it was found to be correlated with fluctuations in the abundance of the brown shrimp population. It is postulated that the uptake and survival of larvae and post-larvae in the coastal environment are influenced by the intensity of the river flow. The period during which they settle in the coastal nurseries coincides with the dry season, and their survival is favoured when the river flow is below average, and vice versa. Therefore, management measures aimed at sustainable use of the resource must be associated with knowledge of the environmental conditions during this phase, as well as with studies on the abundance of post-larvae and juveniles in the coastal zone.
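The growth and exploitation figures quoted above can be checked with a short sketch. This is an illustrative calculation only: the von Bertalanffy growth form, the age parameter t0 = 0, and the relation E = F / (F + M) are standard stock-assessment assumptions, not details taken from the thesis.

```python
import math

def vb_length(t, L_inf, k, t0=0.0):
    """Von Bertalanffy growth: expected length at age t (years)."""
    return L_inf * (1.0 - math.exp(-k * (t - t0)))

# Parameters reported for F. subtilis females (t0 = 0 is an assumption)
L_inf, k = 231.0, 1.6   # asymptotic length (mm), growth rate (year^-1)
M = 2.53                # natural mortality (year^-1)
E = 0.45                # exploitation rate, defined as E = F / (F + M)

# Fishing mortality implied by the reported E and M
F = E * M / (1.0 - E)

# Expected female length at one year of age
length_at_1yr = vb_length(1.0, L_inf, k)
```

With these numbers, F works out to about 2.07 year⁻¹ and a one-year-old female to roughly 184 mm, i.e. already close to the asymptotic length, consistent with the short life cycle described above.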
82

Analýza vrstvy nervových vláken pro účely diagnostiky glaukomu / Analysis of retinal nerve fiber layer for diagnosis of glaucoma

Vodáková, Martina January 2013 (has links)
This master's thesis focuses on creating a methodology for quantifying the nerve fiber layer in photographs of the retina. The introductory part of the text presents the medical motivation of the thesis and reviews several studies dealing with this issue. Furthermore, the work describes available textural features and compares their ability to quantify the thickness of the nerve fiber layer. Based on the described knowledge, a methodology for building different regression models that predict retinal nerve fiber layer thickness was developed. The methodology was then tested on the available image dataset. The results showed that the outputs of the regression models achieve a high correlation between the predicted output and the retinal nerve fiber layer thickness measured by optical coherence tomography. The conclusion discusses the usability of the applied solution.
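The regression step described above, predicting OCT-measured thickness from textural features and reporting the correlation of predicted against measured values, can be sketched as follows. The data are synthetic stand-ins; the feature count, sample size, and plain least-squares model are illustrative assumptions, not the thesis's actual features or models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (assumption): 200 retinal patches described by
# 5 texture features whose linear combination approximates the
# OCT-measured nerve fiber layer thickness.
X = rng.normal(size=(200, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
thickness = X @ true_w + 100 + rng.normal(scale=0.5, size=200)  # micrometres

# Fit a linear regression model (with intercept) by least squares
A = np.hstack([X, np.ones((200, 1))])
w, *_ = np.linalg.lstsq(A, thickness, rcond=None)

# Correlation between predicted and measured thickness, as in the evaluation
predicted = A @ w
r = np.corrcoef(predicted, thickness)[0, 1]
```

On this synthetic data the Pearson correlation r comes out close to 1, mirroring the "high correlation" finding reported in the abstract.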
83

Variabilita vývoje počáteční gramotnosti u dětí s rizikem dyslexie: Predikční modely gramotnostních deficitů. / The early literacy development and its variability in children at risk of dyslexia: The prediction models of literacy deficits.

Medřická, Tereza January 2019 (has links)
In the context of the projects Enhancing literacy development in European languages (work package 2) and The early literacy development and its variability in children at risk of specific learning disabilities, we monitored children's literacy development at preschool age and during the first years of school attendance in a four-stage process. The research group (n = 76) consisted of typically developing children (BV = 37), children with a family risk of dyslexia (RR = 22) and children with specific language impairment (NVŘ = 17). We evaluated the development of phonemic/phonological, lexical/semantic and morphological/syntactic skills, preliteracy skills and early literacy skills. The last, fifth, test stage included the assessment of literacy development in 3rd graders. First, a group of children with literacy deficits (n = 9) was identified via the latent profile analysis method. Subsequently, four predictive models of literacy deficits, one for each stage, were created by means of lasso (L1-penalized) regression. The predictive models follow the trend that until literacy skills are fully automatized (preschool age and the 1st grade), phonemic and phonological skills predominate, but later, after formal instruction in reading and writing proceeds, early literacy skills are becoming more and more...
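The lasso (L1-penalized regression) modelling step mentioned above can be sketched in a few lines. The data here are synthetic stand-ins with the study's sample size; the predictor layout, the `alpha` value, and the use of scikit-learn's `Lasso` are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)

# Synthetic stand-in (assumption): 76 children, 10 preschool predictors;
# only the first two (say, phonemic skills) truly drive a later literacy score.
X = rng.normal(size=(76, 10))
y = 1.5 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.3, size=76)

# L1-penalised regression shrinks irrelevant coefficients to exactly zero,
# yielding a sparse predictive model of the kind built for each stage.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)  # indices of retained predictors
```

The point of the L1 penalty is variable selection: with a small predictor set and n = 76, the fitted model keeps the genuinely predictive skills and drops the rest, which is why the study can report which skill domains predominate at each stage.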
84

Modelling of a System for the Detection of Weak Signals Through Text Mining and NLP. Proposal of Improvement by a Quantum Variational Circuit

Griol Barres, Israel 30 May 2022 (has links)
Thesis by compendium / In this doctoral thesis, a system to detect weak signals related to transformative future changes is proposed and tested. While most known solutions are based on the use of structured data, the proposed system quantitatively detects these signals using heterogeneous and unstructured information from scientific, journalistic, and social sources. Predicting new trends in an environment has many applications. For instance, companies and startups face constant changes in their markets that are very difficult to predict; developing systems that automatically detect significant future changes at an early stage is therefore relevant for any organization that must make the right decisions in time. This work has been designed to obtain weak signals of the future in any field, depending only on the input dataset of documents. Text mining and natural language processing techniques are applied to process all these documents. As a result, a map of ranked terms, a list of automatically classified keywords and a list of multi-word expressions are obtained. The overall system has been tested in four different sectors: solar panels, artificial intelligence, remote sensing, and medical imaging. The work has obtained promising results, evaluated with two different methodologies: the system was able to detect, at a very early stage, new trends that have since become increasingly important.
Quantum computing is a new paradigm for a multitude of computing applications. This doctoral thesis also presents a study of the technologies currently available for the physical implementation of qubits and quantum gates, establishing their main advantages and disadvantages, and of the available frameworks for programming and implementing quantum circuits. In order to improve the effectiveness of the system, a design of a quantum circuit based on support vector machines (SVMs) is described for solving classification problems. This circuit is specially designed for the noisy intermediate-scale quantum (NISQ) computers that are currently available. As an experiment, the circuit has been tested on a real quantum computer by IBM, based on superconducting qubits, as an improvement to the text mining subsystem in the detection of weak signals. The results of the quantum experiment show interesting outcomes, with a performance improvement of close to 20% over conventional systems, but also confirm that continued technological development is still required to take full advantage of quantum computing. / Griol Barres, I. (2022). Modelling of a System for the Detection of Weak Signals Through Text Mining and NLP. Proposal of Improvement by a Quantum Variational Circuit [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/183029 / TESIS / Compendio
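For context, the classical baseline that the quantum variational circuit aims to improve on is an SVM classifier. A minimal sketch with scikit-learn follows; the two-class synthetic data stand in for the real term vectors, and the RBF kernel and split are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic stand-in (assumption): 4-dimensional term vectors labelled
# as weak signal (1) or background noise (0).
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 4)),
               rng.normal(2.5, 1.0, size=(100, 4))])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Classical kernel SVM; in the thesis, a quantum variational circuit on
# NISQ hardware plays the role of this kernel-based classifier.
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The quantum version replaces the classical kernel evaluation with a parameterised circuit; the abstract's reported ~20% improvement refers to that substitution, not to this toy setup.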
85

Improving the Performance of Clinical Prediction Tasks by Using Structured and Unstructured Data Combined with a Patient Network

Nouri Golmaei, Sara 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / With the increasing availability of Electronic Health Records (EHRs) and advances in deep learning techniques, developing deep predictive models that use EHR data to solve healthcare problems has gained momentum in recent years. The majority of clinical predictive models benefit from structured data in EHRs (e.g., lab measurements and medications). Still, learning clinical outcomes from all possible information sources is one of the main challenges when building predictive models. This work focuses mainly on two sources of information that have been underused by researchers: unstructured data (e.g., clinical notes) and a patient network. We propose a novel hybrid deep learning model, DeepNote-GNN, that integrates clinical notes and the topological structure of a patient network to improve 30-day hospital readmission prediction. DeepNote-GNN is a robust deep learning framework consisting of two modules: DeepNote and the patient network. DeepNote extracts deep representations of clinical notes using a feature aggregation unit on top of a state-of-the-art Natural Language Processing (NLP) technique, BERT. By exploiting these deep representations, a patient network is built, and a Graph Neural Network (GNN) is trained on it for hospital readmission prediction. Performance evaluation on the MIMIC-III dataset demonstrates that DeepNote-GNN achieves superior results compared to state-of-the-art baselines on the 30-day hospital readmission task. We extensively analyze the DeepNote-GNN model to illustrate the effectiveness and contribution of each of its components. The model analysis shows that the patient network contributes significantly to the overall performance, and that DeepNote-GNN is robust and performs consistently well on the 30-day readmission prediction task.
To evaluate how the DeepNote and patient network modules generalize to new prediction tasks, we create a multimodal model and train it on the structured and unstructured data of the MIMIC-III dataset to predict patient mortality and Length of Stay (LOS). Our proposed multimodal model consists of four components: DeepNote, the patient network, DeepTemporal, and score aggregation. While DeepNote keeps its functionality and extracts representations of clinical notes, we build a DeepTemporal module using a fully connected layer stacked on top of a one-layer Gated Recurrent Unit (GRU) to extract deep representations of temporal signals. Independently of DeepTemporal, we extract feature vectors of the temporal signals and use them to build a patient network. Finally, the DeepNote, DeepTemporal, and patient network scores are linearly aggregated to fit the multimodal model on downstream prediction tasks. Our results are very competitive with the baseline model. The multimodal model analysis reveals that unstructured text data help prediction more than temporal signals do. Moreover, nothing prevents applying a patient network to structured data; in comparison to the other modules, the patient network makes the more significant contribution to the prediction tasks. We believe that our efforts in this work have opened up a new study area that can be used to enhance the performance of clinical predictive models.
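The score-aggregation step described above can be sketched in a few lines. The module names match the abstract, but the weights, the patient count, and the score ranges are arbitrary illustrations, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)
n_patients = 8

# Toy per-patient scores (assumption: each module outputs a probability-like
# score in [0, 1] for the downstream task, e.g. mortality or LOS risk)
scores = {
    "deepnote": rng.random(n_patients),         # clinical-note module
    "deeptemporal": rng.random(n_patients),     # temporal-signal module
    "patient_network": rng.random(n_patients),  # graph module
}

# Linear aggregation of the three module scores (weights are illustrative;
# in the thesis they are fitted on the downstream prediction task)
weights = {"deepnote": 0.5, "deeptemporal": 0.2, "patient_network": 0.3}
aggregated = sum(w * scores[m] for m, w in weights.items())
```

Because the weights sum to one, the aggregated score stays in the same [0, 1] range as the module scores, which keeps it directly usable as a downstream prediction score.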
86

3D Dose Prediction from Partial Dose Calculations using Convolutional Deep Learning models / 3D-dosförutsägelser från partiella dosberäkningar med hjälp av konvolutionella Deep Learning-modeller

Liberman Bronfman, Sergio Felipe January 2021 (has links)
In this thesis, the problem of predicting the full dose distribution from a partially modeled dose calculation is addressed. Two solutions were studied: a vanilla Hierarchically Densely Connected U-net (HDUnet) and a Conditional Generative Adversarial Network (CGAN) with an HDUnet as generator. The CGAN approach is a 3D version of Pix2Pix [1] for image-to-image translation, which we name Dose2Dose. The research question tackled in this project is whether Dose2Dose can learn more effective dose transformations than the vanilla HDUnet. To answer this, the models were trained on dose calculations of phantom slabs generated for the problem as pairs of inputs (doses without magnetic field) and targets (doses with magnetic field). Once trained, the models were evaluated and compared in various aspects. The evidence gathered suggests that the vanilla HDUnet model learns to generate better dose predictions than the generative model. However, in terms of the resulting dose distributions, the samples generated by Dose2Dose are as likely to belong to the target dose calculation distribution as those of the vanilla HDUnet. The results contain errors of considerable magnitude and do not pass clinical suitability tests.
87

Forecasting the data cube

Lehner, Wolfgang, Fischer, Ulrike, Schildt, Christopher, Hartmann, Claudio 12 January 2023 (has links)
Forecasting time series data is crucial in a number of domains such as supply chain management and display advertising. In these areas, the time series data to forecast is typically organized along multiple dimensions, leading to a high number of time series that need to be forecasted. Most current approaches focus only on selecting and optimizing a forecast model for a single time series. In this paper, we explore how time series at different dimensions can be utilized to increase forecast accuracy and, optionally, reduce model maintenance overhead. Solving this problem is challenging due to the large space of possibilities and the potentially high model creation costs. We propose a model configuration advisor that automatically determines the best set of models, a model configuration, for a given multi-dimensional data set. Our approach is based on a general process that iteratively examines more and more models and simultaneously controls the search space depending on the data set, model type and available hardware. The final model configuration is integrated into F2DB, an extension of PostgreSQL, that processes forecast queries and maintains the configuration as new data arrives. We comprehensively evaluated our approach on real and synthetic data sets. The evaluation shows that our approach significantly increases forecast query accuracy while ensuring low model costs.
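The idea of iteratively examining candidate models and picking a configuration can be illustrated with a toy advisor. The three candidate models, the holdout split, and mean absolute error as the criterion are assumptions made for the sketch; they are not the paper's actual advisor algorithm or search-space control.

```python
import numpy as np

def naive(train, h):
    """Carry the last observed value forward for h steps."""
    return np.repeat(train[-1], h)

def seasonal_naive(train, h, m=12):
    """Repeat the last full season of length m."""
    return np.array([train[-m + (i % m)] for i in range(h)])

def mean_model(train, h):
    """Forecast the historical mean for every step."""
    return np.repeat(train.mean(), h)

def advise(series, h=6, candidates=(naive, seasonal_naive, mean_model)):
    """Toy model advisor: examine each candidate on a holdout window
    and return the name of the one with the lowest MAE."""
    train, test = series[:-h], series[-h:]
    errors = {f.__name__: np.abs(f(train, h) - test).mean()
              for f in candidates}
    return min(errors, key=errors.get)

# Strongly seasonal toy series with a mild trend: the advisor should
# prefer the seasonal model over the naive and mean models.
t = np.arange(60)
series = 10 * np.sin(2 * np.pi * t / 12) + 0.1 * t
best = advise(series)
```

The real advisor additionally exploits the dimensional hierarchy of the data cube and prunes the search space as models are examined; this sketch only shows the examine-and-select core.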
88

Exploiting big data in time series forecasting: A cross-sectional approach

Lehner, Wolfgang, Hartmann, Claudio, Hahmann, Martin, Rosenthal, Frank 12 January 2023 (has links)
Forecasting time series data is an integral component of management, planning and decision making. Following the Big Data trend, large amounts of time series data are available from many heterogeneous data sources in more and more application domains. The highly dynamic and often fluctuating character of these domains, combined with the logistic problems of collecting such data from a variety of sources, imposes new challenges on forecasting. Traditional approaches rely heavily on extensive and complete historical data to build time series models and are thus no longer applicable if time series are short or, even more importantly, intermittent. In addition, large numbers of time series have to be forecasted on different aggregation levels with preferably low latency, while forecast accuracy should remain high. This is almost impossible when keeping the traditional focus of creating one forecast model for each individual time series. In this paper we tackle these challenges by presenting a novel forecasting approach called cross-sectional forecasting. This method is especially designed for Big Data sets with a multitude of time series. Our approach breaks with existing concepts by creating only one model for a whole set of time series and requiring only a fraction of the available data to provide accurate forecasts. By utilizing the available data from all time series of a data set, missing values can be compensated and accurate forecasting results can be calculated quickly on arbitrary aggregation levels.
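The cross-sectional idea, one shared model for a whole set of short, intermittent series, fitted on pooled observations so that missing values in any single series are compensated by the others, can be sketched as follows. The AR(1) model form and the synthetic data are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data set (assumption): 50 short series sharing one common dynamic
n_series, n_t = 50, 20
phi_true = 0.8
Y = np.zeros((n_series, n_t))
for t in range(1, n_t):
    Y[:, t] = phi_true * Y[:, t - 1] + rng.normal(scale=0.5, size=n_series)
Y[rng.random(Y.shape) < 0.3] = np.nan   # make the series intermittent

# Cross-sectional model: ONE AR(1) coefficient fitted on the pooled
# (y_{t-1}, y_t) pairs of ALL series, using only observed pairs.
prev, curr = Y[:, :-1].ravel(), Y[:, 1:].ravel()
mask = ~np.isnan(prev) & ~np.isnan(curr)
phi = (prev[mask] @ curr[mask]) / (prev[mask] @ prev[mask])

# One shared model forecasts the next value of every series at once
last = np.nan_to_num(Y[:, -1])
forecast = phi * last
```

No single series here has enough clean history for a per-series model, yet the pooled estimate recovers the shared coefficient well, which is the central claim of the cross-sectional approach.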
89

GUIDELINES FOR COMPARING INTERVENTIONS, PREDICTING HIGH-RISK PATIENTS, AND CONDUCTING OPTIMIZATION FOR EARLY HF READMISSION

Khasawneh, Ahmad Ali 05 October 2017 (has links)
No description available.
90

Archeologia d'alta quota alle sorgenti del Brembo / High-altitude archaeology at the sources of the river Brembo

Croce, Enrico 18 July 2022 (has links)
The focus of this research is the area known as Sorgenti del Brembo di Carona (the sources of the river Brembo of Carona), located in the Orobie Alps (province of Bergamo, Italy). The current archaeological activities in the area, carried out by the Civico Museo Archeologico di Bergamo, are site-specific and mainly focused on Iron Age rock engravings and on the excavation of a medieval dwelling. The present study aims at a wider approach to upland archaeology, focused on landscape evolution rather than on single pieces of evidence. The starting point is the methodology developed in other alpine contexts, such as the ALPES (Alpine Landscapes: Pastoralism and Environment of Val di Sole) project. The data, gathered through extensive field survey activities, attest to the presence of a complex landscape, with pastoral evidence, iron mining facilities and charcoal production sites dating from the Early Middle Ages to the present. All the collected data are managed through a GIS in order to maintain their spatial reference, which made it possible to easily cross-reference them with several historical documents (cartography, cadastres, archives) and to perform quantitative and spatial analysis. This method allowed us to reconstruct, diachronically, the impact of human activities on landscape formation. An inductive predictive model, integrated with ethnoarchaeology, was also implemented using modern pastoral sites. The results shed light on the complex dynamics of the human approach to high-altitude regions and on the constraints the alpine environment places on human activities. It was also possible to assess both the strengths and the biases of current applications of predictive models to Alpine cultural heritage. The methodology developed during this research, following and extending previously developed methods, can be a step forward in the definition of a common archaeological approach to upland contexts.
