21

Building digital literary geographies: modelling and prototyping as modes of inquiry

El Khatib, Randa 14 October 2021 (has links)
The mode of carrying out literary spatial studies—or literary geography—has largely shifted to embrace digital methods and tools, culminating in the field of geospatial humanities. This shift has affected the scope of research questions that scholars can ask and answer using digital methods. Although there are many continuities between non-digital and digital spatial studies, there are some fundamental points of departure in the critical processes involved in carrying out geospatial humanities research, including data modelling, prototyping, and multidisciplinary collaboration, that demand revisiting the ways knowledge production and analysis are carried out in the humanities. First, there is the question of how data models, prototypes, and digital projects embed the spatial methodologies and spatial theory that form the foundation of humanities-oriented spatial inquiry. In addition, collaborating across multidisciplinary groups involves working toward shared project goals while, ideally, ensuring that individual team members draw benefit from the collaborative research experience. A further factor is the creation of rich and accurate data models that can capture the complexity of their subject of inquiry for meaningful humanities research. This dissertation addresses each of these challenges through practical applications, focusing not only on the literary contributions of geospatial humanities but also engaging the critical processes involved in this form of digital research. By designing and co-creating three geospatial prototypes, TopoText, TopoText 2.0, and A Map of Paradise Lost, my goal is to demonstrate how digital objects can embody spatial theory and methodologies, and to show how traditional literary studies approaches such as close reading and literary interpretation can be combined with digital methods that enable interactivity and mixed-media visualizations for an immersive literary geography analysis. The first two chapters translate a literary theory and method of analysis, geocriticism, into a digital prototype and iteratively improve on it to demonstrate the type of research made possible through a digital geocritical interpretation. In that part of the dissertation, I also address the challenges involved in translating a literary framework into a digital environment, such as designing under constraint, and discuss what is lost in translation alongside what is gained (McCarty 2008). Chapter three demonstrates how technological advances enable scholars to build community-university partnerships that can contribute to humanities scholarship while also making research findings publicly available. In particular, the chapter argues that scholars can draw on Volunteered Geographic Information to create rich cultural gazetteers that can inform spatial humanities research. The final two chapters demonstrate how a geospatial prototype that is fueled by rich data and embeds other types of media can inform literary interpretation and help make arguments. By focusing on the process of building A Map of Paradise Lost—a geospatial humanities text-to-map project that visualizes the locatable places in John Milton's Paradise Lost—the closing chapter addresses the question "why map literature?" and demonstrates how the process of research prototyping is in itself a form of knowledge production.
Since the methods and technologies that inform geospatial humanities research are rapidly evolving, this dissertation adopts a portfolio model and consists of five published and one forthcoming publication, as well as three published prototypes. Together, they form a digital dissertation, meaning that the digital component comprises a significant part of the intellectual work. Reflecting the collaborative nature of digital humanities research, some articles were co-authored and all three prototypes were co-developed. In all components of this dissertation, I took the leading role in the publication and prototype development, as detailed at the beginning of every chapter. / Graduate
22

Assessing the Impact of Centralization on Safety Stock: A Scenario-Based Case Study to Support Companies Exploring the Benefits of Utilizing a Centralized Warehouse Strategy in the Absence of Historical Data

Koundinya, Shashank, Ekendahl, Emil January 2023 (has links)
This master's thesis compares centralized and decentralized warehousing systems, focusing on the variation in safety stock levels and the associated safety stock holding value. The research addresses an existing knowledge gap by introducing a methodology that applies scenario analysis across different potential warehouse locations, incorporating a range of plausible future circumstances into a mathematical model. The proposed approach estimates the potential inventory savings and corresponding cost reductions achievable through the implementation of a centralized warehouse. Additionally, it examines the influence of different parameters on the anticipated impact of centralization, providing a comprehensive understanding of the potential outcomes. The findings of this case study contribute to the academic discourse on warehouse optimization strategies and offer practical guidance to companies in their pursuit of operational excellence and cost savings within their supply chains. The thesis aims to bridge the gap between theoretical knowledge and practical implementation, facilitating informed decision-making by supply chain practitioners and enhancing overall supply chain performance.
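The thesis's own scenario model is not reproduced in this abstract, so the sketch below only illustrates the textbook safety-stock calculation and the risk-pooling (square-root law) effect that makes centralization attractive; the service-level factor, demand variabilities, and lead time are assumed values, not figures from the case study.

```python
import math

Z = 1.645  # service-level factor for roughly a 95% cycle service level (assumed)

def safety_stock(sigma_d: float, lead_time: float) -> float:
    """Safety stock for one site: z * demand std deviation * sqrt(lead time)."""
    return Z * sigma_d * math.sqrt(lead_time)

# Assumed weekly demand standard deviations for three regional warehouses,
# replenished with a common two-week lead time.
regional_sigmas = [120.0, 90.0, 150.0]
lead_time_weeks = 2.0

decentralized = sum(safety_stock(s, lead_time_weeks) for s in regional_sigmas)

# Centralized case: if regional demands are independent, their variances add,
# so the pooled standard deviation is the root of the sum of squares.
pooled_sigma = math.sqrt(sum(s ** 2 for s in regional_sigmas))
centralized = safety_stock(pooled_sigma, lead_time_weeks)

print(f"Decentralized safety stock: {decentralized:.0f} units")
print(f"Centralized safety stock:   {centralized:.0f} units")
print(f"Reduction from pooling:     {100 * (1 - centralized / decentralized):.1f}%")
```

Positive correlation between regional demands shrinks this pooling benefit, which is one reason a scenario-based analysis, rather than a single closed-form estimate, is useful when historical data are missing.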
23

Propagation of online consumer-perceived negativity: Quantifying the effect of supply chain underperformance on passenger car sales

Singh, A., Jenamani, M., Thakker, J.J., Rana, Nripendra P. 10 April 2021 (has links)
The paper presents a text analytics framework that analyses online reviews to explore how consumer-perceived negativity corresponding to the supply chain propagates over time and how it affects car sales. In particular, the framework integrates aspect-level sentiment analysis using SentiWordNet, time-series decomposition, and the bias-corrected least squares dummy variable estimator (LSDVc), a panel data estimator. The framework serves the business community by providing a list of consumers' current interests in the form of frequently discussed product attributes; by quantifying the consumer-perceived performance of supply chain (SC) partners and comparing competitors; and by offering a model for assessing firms' sales performance. The proposed framework is demonstrated on the automobile supply chain using a review dataset obtained from a well-known car portal in India. Our findings suggest that consumer-voiced negativity is highest for dealers and lowest for manufacturing- and assembly-related features. Firm age, GDP, and review volume significantly influence car sales, whereas the sentiments corresponding to SC partners do not. The proposed research framework can help manufacturers monitor their SC partners, identify the consumer-cited factors that most influence car sales, and predict sales more accurately, which in turn supports better production planning, supply chain management, marketing, and consumer relationships.
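The full pipeline described above (aspect extraction, SentiWordNet scoring, time-series decomposition, and LSDVc estimation) is not detailed in the abstract; the hedged sketch below illustrates only the SentiWordNet step, scoring the words around an aspect keyword with NLTK's SentiWordNet interface. The window size, aspect keywords, and example sentence are assumptions and will differ from the paper's implementation.

```python
import nltk
from nltk.corpus import sentiwordnet as swn

# One-time downloads of the corpora SentiWordNet relies on.
nltk.download("wordnet", quiet=True)
nltk.download("sentiwordnet", quiet=True)
nltk.download("omw-1.4", quiet=True)

def word_polarity(word: str) -> float:
    """Average (positive - negative) score over a word's SentiWordNet synsets."""
    synsets = list(swn.senti_synsets(word))
    if not synsets:
        return 0.0
    return sum(s.pos_score() - s.neg_score() for s in synsets) / len(synsets)

def aspect_sentiment(tokens: list, aspect: str, window: int = 3) -> float:
    """Aspect-level score: mean polarity of words within +/- `window` tokens
    of each mention of the aspect keyword."""
    scores = []
    for i, tok in enumerate(tokens):
        if tok.lower() == aspect:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            scores.extend(word_polarity(t) for t in tokens[lo:hi] if t.lower() != aspect)
    return sum(scores) / len(scores) if scores else 0.0

# Hypothetical review sentence; "dealer" and "mileage" stand in for a supply
# chain related aspect and a product-related aspect respectively.
review = "the dealer service was terrible but the mileage is excellent".split()
for aspect in ("dealer", "mileage"):
    print(aspect, round(aspect_sentiment(review, aspect), 3))
```

In the paper's framework these aspect-level scores would then be aggregated over time, decomposed into trend and seasonal components, and fed into the panel-data model alongside covariates such as firm age, GDP, and review volume.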
24

Estimação de contrastes de médias de tratamentos, de um experimento em blocos ao acaso, utilizando as análises clássica e espacial / Estimation of treatments means contrasts, in a random blocks model, using the classical and spatial analysis

Maestre, Marina Rodrigues 08 October 2008 (has links)
Uncontrollable factors commonly occur in experiments and are responsible for heterogeneity among plots. Even when the three basic principles of experimental design (replication, randomization, and local control) are applied, the errors may still be correlated, implying spatial dependence across the experimental area. If this autocorrelation structure is detected and incorporated into the statistical analysis, more efficient estimates of the contrasts between treatment means are obtained; if it is ignored, real differences may go undetected.
In this work, the coordinates of the plot centres of a randomized block design were recorded. The response variable was the concentration of soil organic carbon, evaluated at the start of the experiment (before the treatments were applied, i.e., a blank trial), one year after treatment application, and again after a further year. Classical and spatial analyses were compared as methods for estimating contrasts of treatment means. The classical analysis, which assumes uncorrelated errors, used ordinary least squares. The spatial analyses used a geostatistical model, which adds a spatially correlated random effect, and the Papadakis model, which adds a covariate built from the residuals of neighbouring plots. In the geostatistical model, the presence of spatial dependence was assessed with the Akaike and the Bayesian (Schwarz) information criteria, and the methods tested were the variogram followed by generalized least squares, and maximum likelihood. For the Papadakis model, the significance of the covariate based on the mean residuals of the neighbouring plots was tested under both the randomized block and the completely randomized models, and it was not significant in either case; the calculations were nevertheless carried out for this method, showing that it is not suitable for this data set. Using several comparison measures for these data, the estimation method with the most dispersed results was the Papadakis model and the least dispersed was maximum likelihood. Moreover, the confidence intervals showed that in the spatial analysis additional contrasts differed significantly from zero beyond those detected in the classical analysis, leading to the conclusion that when the autocorrelation of the errors is taken into account, the contrasts are estimated more efficiently.
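The geostatistical analysis described above is not specified in enough detail here to reproduce; as a minimal, hedged sketch of the underlying idea, the code below contrasts an ordinary least squares estimate of a treatment contrast with a generalized least squares estimate that uses an exponential spatial covariance built from plot coordinates. The layout, variance parameters, and range are invented for illustration and are not the thesis's values; in practice the covariance would be estimated from a fitted variogram or by maximum likelihood rather than assumed known.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 4-block x 3-treatment layout with plot centres on a 10 m grid.
n_blocks, n_trt = 4, 3
blocks = np.repeat(np.arange(n_blocks), n_trt)
treatments = np.tile(np.arange(n_trt), n_blocks)
coords = np.column_stack([blocks * 10.0, treatments * 10.0])
n = len(blocks)

# Design matrix: intercept + block dummies + treatment dummies (first level dropped).
X = np.column_stack(
    [np.ones(n)]
    + [(blocks == b).astype(float) for b in range(1, n_blocks)]
    + [(treatments == t).astype(float) for t in range(1, n_trt)]
)

# Simulate a spatially correlated response (assumed exponential covariance + nugget).
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
V = 1.0 * np.exp(-dist / 15.0) + 0.2 * np.eye(n)
true_beta = np.array([10.0, 0.5, -0.3, 0.2, 1.0, 2.0])
y = X @ true_beta + rng.multivariate_normal(np.zeros(n), V)

# Classical analysis: ordinary least squares (independent errors).
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Spatial analysis: generalized least squares with the spatial covariance.
Vinv = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# Contrast between treatment 2 and treatment 1 (coefficients are relative to treatment 0).
c = np.zeros(X.shape[1])
c[-1], c[-2] = 1.0, -1.0
print("OLS estimate of (trt 2 - trt 1):", round(c @ beta_ols, 3))
print("GLS estimate of (trt 2 - trt 1):", round(c @ beta_gls, 3))
print("GLS contrast variance:", round(c @ np.linalg.inv(X.T @ Vinv @ X) @ c, 3))
```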
26

Wildfire Management in the Southside Region of Canada’s Montane Cordillera - A Systems Modelling Application on Firebreak Strategies

Kessels, Henricus January 2016 (has links)
There is growing recognition of the importance of preserving Canada's forests. Canada's 348 million hectares of forest land cover 35% of its land area, representing 9% of the world's forests and 24% of the world's boreal forests. As a renewable resource, forests offer significant environmental, economic and recreational benefits and innumerable services contributing to the quality of life. Canada has recently entered an era of more frequent and severe natural disasters, and ecosystems and communities, especially in western Canada, are under increasing pressure from natural disturbances. These disturbances include wildfires associated with elevated fuel loads from past fire-suppression regimes, a widespread mountain pine beetle infestation, and changing weather patterns. Wildfire activity has reached extreme levels in several recent years. This thesis profiles an area of western Canada within the Montane Cordillera covering the Nechako Lakes Electoral District in central British Columbia and assesses its vulnerability to the specific hazard of wildfires caused by natural and man-made sources. The objectives of this research are to review, simulate and assess the impact of various fuel management strategies in a sub-section of the Nechako Lakes Electoral District called the Southside. Values at risk include private property and old-growth forest within timber supply areas, provincial parks, woodlots and community forests. Simulation results show that firebreaks can significantly reduce the area burned in different parts of the landscape. The performance of different strategies varies widely; although this was not investigated further, the variation was likely caused by topography and by the positioning of firebreaks in the landscape relative to climatic parameters. These results can therefore not be extrapolated beyond the simulated area, but they do give an indication of the performance variation that may be expected when similar firebreaks are applied elsewhere. The results also show that the modelled performance of all firebreak strategies is strongly and fairly consistently influenced by the weather stream parameters. Sensitivity analyses of the weather stream parameters show that although the reduction in total area burned varies, the ranking of the strategies by overall performance is consistent regardless of the weather pattern. Combined dry, warm and windy weather conditions lead to a 3.44-fold increase in total area burned compared to the scenario with average weather conditions. Under favourable weather conditions, represented by wet, cold and nearly windless conditions, the model shows an 85% reduction in total burned area compared to the average scenario. These results illustrate the significant impact of uncontrollable variables on the overall result.
27

Kompletný dátový model ITILu / Complete data model of ITIL

Gažmerčík, Jakub January 2011 (has links)
The objective of this thesis is to design a complete ITIL (IT Infrastructure Library) data model using an Entity-Relationship diagram, and to justify this model in relation to the federated approach of ITIL Configuration and Asset Management. The thesis consists of two major parts. The first focuses on the theory of recommendations and best practices in IT Governance processes, their relationship to ITIL, and a high-level overview of ITIL's history; it ends with a brief description of the tools and methods used in data modeling. It is followed by the practical part, which analyzes the ITIL publications in the order of the IT Service Management lifecycle phases and designs the logical data model. Each chapter closes by discussing the ability of the designed model to cover the data requirements of the particular ITSM lifecycle phase. In the final chapter, all partial models are joined into one complete data model and the relations shown are justified.
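ITIL does not prescribe a single physical data model, and the thesis's complete ER diagram cannot be reproduced from this abstract; the fragment below is only a hedged illustration of how a few core Configuration and Asset Management entities and their relationships might be expressed, here as Python dataclasses instead of ER notation. All entity and attribute names are assumptions, not the thesis's model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ConfigurationItem:
    """A configuration item (CI) record as it might appear in a CMDB."""
    ci_id: str
    name: str
    ci_type: str                    # e.g. "Server", "Application", "Service"
    status: str = "In Use"
    owner: Optional[str] = None

@dataclass
class CIRelationship:
    """Directed relationship between two CIs, e.g. 'runs on' or 'depends on'."""
    source_ci: str                  # ci_id of the dependent CI
    target_ci: str                  # ci_id of the CI it depends on
    relationship_type: str

@dataclass
class Incident:
    """An incident record linked to the CIs it affects."""
    incident_id: str
    description: str
    affected_cis: List[str] = field(default_factory=list)

# In a federated approach, CIs may originate from different source repositories;
# relationship records stitch them into one logical model.
cis = [
    ConfigurationItem("CI-001", "web-frontend", "Application"),
    ConfigurationItem("CI-002", "app-server-01", "Server"),
]
relations = [CIRelationship("CI-001", "CI-002", "runs on")]
incident = Incident("INC-1001", "Web frontend unreachable", affected_cis=["CI-001"])
print(incident)
print(relations[0])
```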
28

Applying Human-scale Understanding to Sensor-based Data: Generating Passive Feedback to Understand Urban Space Use

Eriksson, Adam, Uppling, Hugo January 2021 (has links)
The aim of this thesis is to investigate how parametrization of large-scale person movement data can contribute to describing the use of urban space. Given anonymous coordinate and timestamp data from a sensor observing an open-air mall, movement-based parameters are selected according to public life studies, behavioral mapping, and space syntax tools. The thesis aim is operationalized by answering how well the parametrizations perform in capturing urban space use and by investigating how the use is described when the parameterized data are applied in selected urban space use tools. The parameterized data are also evaluated as time series to investigate what further understanding of urban space use this can offer. Parametrization performance is evaluated by accuracy and F1-score, and time-series forecasts are evaluated by root mean square error (RMSE) and mean absolute error (MAE). The results indicate a parametrization accuracy of 93% or higher, while a high yet fluctuating F1-score indicates that the parametrizations might be sensitive to imbalanced data and that accuracy alone might not be sufficient when evaluating urban data. The parameterized data applied in the selected urban space use tools highlight the granularity achievable from sensor-based data. In the time series analysis, a Facebook Prophet forecast model is implemented, with an MAE of 8.6% and RMSE of 11.7%, outperforming a seasonal naïve forecast implementation with an MAE of 14.1% and RMSE of 18.8%. The thesis finds that time series modelling adds to understanding patterns and changes of use over time and that the approach could be developed further in future studies. In answering how the urban space is used, the thesis develops a new methodology that combines human-scale understanding of urban space use with large-scale data, generating citizen passive feedback. / The importance of understanding how a place, or an urban space, is actually used stems from the fact that its use often deviates from what was planned. With a deeper understanding of how a place is used, its design can, for example, be adapted to its actual use. There are several ways to reach this deeper understanding. One is to use the analogue theories and tools that architects and urban planners have long developed to understand people's behaviour in different urban spaces; these urban analysis tools include, for example, frameworks for mapping people's activity. Another is to analyse large datasets to extract general movement patterns or detailed trends. This thesis presents a method that combines the two approaches, weaving the human starting point of the analogue theories together with the possibilities that arise when analysing large datasets. By developing algorithms, movement-based information can be extracted, or parameterized, from data on people's movement. In the context of this study, the method therefore involves parameterizing movement data from a sensor installed at the Kompassen shopping street in Gothenburg, with the selection of parameterizations based on the urban analysis tools. This is summarized in the study's overall aim: to investigate how parameterization of large-scale movement data can contribute to explaining the use of urban space. To achieve this aim, three research questions are answered. First, how well the parameterized movement data capture the use of the urban space is evaluated.
Second, how the use is portrayed when the parameterized data are applied in selected urban analysis tools is examined. Finally, the data are analysed as time series to investigate how an understanding over time can add to the understanding of urban space use. Starting from the movement data, each person's speed, starting point, and destination were extracted. The classes shop interaction, group affiliation, and standing still were then parameterized in accordance with the urban analysis tools. When these three classes were evaluated, the results show that the use of the urban space is captured to a high degree, reaching at least 93% accuracy. However, the results also show that accuracy decreases the more imbalanced the data are: the less frequent a class is in the data, the harder it is to capture. When the parameterized data are used in the urban analysis tools, the results show that the extracted data provide a higher resolution that can pave the way for new understanding of how urban spaces are used. The higher resolution also enables time-series analysis of the parameterized data, pointing to a more detailed understanding of trends and of the use of the urban space over time. For example, the Facebook Prophet tool is implemented, in this case to forecast the share of visitors with group affiliation; for a two-week forecast, a mean absolute error of 8.6% is achieved, which is considered an accurate result. The ability to forecast use and identify deviations from trends thus contributes further to the understanding of how the place is used. The time-series analysis shows great potential, and interpretations of both the time series and the forecast models leave room for further development. Future studies should also develop algorithms for more activity-based parameters, such as sitting or conversing. The thesis focuses on creating an understanding of how an urban space is used and thus leaves the question of why to future studies, for which the results of this study can serve as important groundwork. The method of the study adds a human perspective to large datasets and thereby contributes to a broader basis for understanding how urban spaces are used. Starting from urban analysis tools, the collected sensor data have been parameterized into key movement-based classes; this material amounts to passive feedback from the users of the urban space, explaining how a place is actually used.
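The forecast comparison reported above can be sketched in code; the example below is hedged, running Facebook Prophet against a seasonal naive baseline on a synthetic daily series (the real sensor data are not included in this listing), with the series shape, seasonality settings, and two-week horizon chosen only for illustration.

```python
import numpy as np
import pandas as pd
from prophet import Prophet  # pip install prophet (older releases ship as fbprophet)

# Synthetic daily series standing in for, e.g., the daily share of visitors
# detected as belonging to a group.
rng = np.random.default_rng(0)
dates = pd.date_range("2021-01-01", periods=120, freq="D")
weekly = 0.05 * np.sin(2 * np.pi * dates.dayofweek.to_numpy() / 7)
df = pd.DataFrame({"ds": dates, "y": 0.4 + weekly + rng.normal(0, 0.02, len(dates))})

train, test = df.iloc[:-14], df.iloc[-14:]

# Prophet forecast for the final two weeks.
m = Prophet(weekly_seasonality=True, yearly_seasonality=False, daily_seasonality=False)
m.fit(train)
future = m.make_future_dataframe(periods=14)
pred = m.predict(future)["yhat"].iloc[-14:].to_numpy()

# Seasonal naive baseline: repeat the last observed training week over the horizon.
naive = np.tile(train["y"].iloc[-7:].to_numpy(), 2)

def mae(a, b):
    return float(np.mean(np.abs(a - b)))

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

actual = test["y"].to_numpy()
print(f"Prophet        MAE {mae(actual, pred):.4f}  RMSE {rmse(actual, pred):.4f}")
print(f"Seasonal naive MAE {mae(actual, naive):.4f}  RMSE {rmse(actual, naive):.4f}")
```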
29

Fourier transform ion cyclotron resonance mass spectrometry for petroleomics

Hauschild, Jennifer M. January 2012 (has links)
The past two decades have witnessed tremendous advances in the field of high accuracy, high mass resolution data acquisition of complex samples such as crude oils and the human proteome. With the development of Fourier transform ion cyclotron resonance mass spectrometry, the rapidly growing field of petroleomics has emerged, whose goal is to process and analyse the large volumes of complex and often poorly understood data on crude oils generated by mass spectrometry. As global oil resources deplete, oil companies are increasingly moving towards the extraction and refining of the still plentiful reserves of heavy, carbon-rich and highly contaminated crude oil. It is essential that the oil industry gather the maximum possible amount of information about the crude oil prior to setting up the drilling infrastructure, in order to reduce processing costs. This project describes how machine learning can be used as a novel way to extract critical information from complex mass spectra, which will aid in the processing of crude oils. The thesis discusses the experimental methods involved in acquiring high accuracy mass spectral data for a large and key industry-standard set of crude oil samples. These data are subsequently analysed to identify possible links between the raw mass spectra and certain physical properties of the oils, such as pour point and sulphur content. Methods including artificial neural networks and self-organising maps are described, and the use of spectral clustering and pattern recognition to classify crude oils is investigated. The main focus of the research, the creation of an original simulated annealing genetic algorithm hybrid technique (SAGA), is discussed in detail, and the successes of modelling a number of different datasets using all described methods are outlined. Despite the complexity of the underlying mass spectrometry data, which reflects the considerable chemical diversity of the samples themselves, the results show that physical properties can be modelled with varying degrees of success. When modelling pour point temperatures, the artificial neural network achieved an average prediction error of less than 10%, while SAGA predicted the same values with an average accuracy of more than 85%. It did not prove possible to model any of the other properties with such statistical significance; however, improvements to feature extraction and pre-processing of the spectral data, as well as enhancement of the modelling techniques, should yield more consistent and statistically reliable results. These should in due course lead to a comprehensive model which the oil industry can use to process crude oil data using rapid and cost-effective analytical methods.
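The abstract does not describe SAGA's internals, so the following is only a generic, hedged sketch of a simulated-annealing / genetic-algorithm hybrid: a small genetic algorithm whose offspring are accepted against their parents through a Metropolis criterion with a cooling temperature. The toy objective, real-valued encoding, and all hyperparameters are assumptions and are not taken from the thesis.

```python
import math
import random

random.seed(1)

def objective(x):
    """Toy function to minimise; a real application would instead score how well
    a model built from the candidate (e.g. selected spectral features) predicts a
    property such as pour point."""
    return sum((xi - 0.3) ** 2 for xi in x)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(x, rate=0.2, step=0.1):
    return [xi + random.gauss(0, step) if random.random() < rate else xi for xi in x]

def saga(dim=8, pop_size=20, generations=200, t0=1.0, cooling=0.97):
    population = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    temperature = t0
    for _ in range(generations):
        next_population = []
        for parent in population:
            mate = random.choice(population)
            child = mutate(crossover(parent, mate))
            delta = objective(child) - objective(parent)
            # Metropolis acceptance: always keep improvements, occasionally keep
            # worse offspring while the temperature is still high.
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                next_population.append(child)
            else:
                next_population.append(parent)
        population = next_population
        temperature *= cooling  # geometric cooling schedule
    return min(population, key=objective)

best = saga()
print("best objective value:", round(objective(best), 4))
```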
30

Medições do saldo de radiação em copas de cafeeiros e limeiras ácidas por sistemas de integração espaço-temporal e estimativas por técnicas de modelagem / Measurements of net radiation of the canopy of coffee and acid lime trees in hedgerows by spatiotemporal integration systems and estimates by modelling techniques

Simon, Jones 16 June 2010 (has links)
The radiant energy absorbed by tree canopies is of great use in studies of photosynthesis and transpiration, but it is not simple to determine. Over the last two decades, studies of radiation interception by isolated trees or by trees in hedgerows have covered both shortwave and longwave radiation (net radiation, Rn). One measurement technique uses radiometers moving around the canopy of a single tree (spherical measurement geometry) or along the canopies of a hedgerow (cylindrical geometry), allowing spatiotemporal integration of the values. An alternative is physical-mathematical modelling of Rn, which also requires measurements to test the models. Given the few studies in this line of research, the present study aimed to: a) evaluate the performance of mobile systems for spatiotemporal integration of Rn measurements in hedgerows of a coffee crop and an acid lime orchard at the Luiz de Queiroz Campus of the University of São Paulo, Piracicaba, SP, Brazil; b) establish relations of the net radiation of the coffee (Rnc) and acid lime (Rnl) canopies with the net radiation over grass (Rng) and the global solar irradiance (Rg); c) evaluate the performance of three physical-mathematical models for estimating Rn by comparison with the measurements taken by the net radiometers moving along the hedgerows (cylindrical measurement geometry).
The coffee study covered the four seasons of the year, starting in the autumn of 2008, while the orchard was studied over three seasons, starting in the summer of 2008. The integrating systems provided daily values consistent in magnitude with the canopy latent heat of vaporization determined from transpiration measurements of a single tree; the performance of the larger system used in the orchard requires further testing. The daily course of Rn at each radiometer position around the canopy showed different patterns according to row orientation and season. For the coffee crop, very good relations of Rnc with Rg and Rng were found at the 15-minute, hourly, and diurnal scales, except in the summer at the 15-minute and hourly scales and, for Rng, in the winter at the diurnal scale. For the orchard, the relations of Rnl with Rg and Rng were good at the three timescales, except in the summer at the 15-minute and hourly scales and in the autumn at the diurnal scale. For the coffee crop, Beer's model showed high reliability across the seasons at the hourly and diurnal scales; for the orchard it was reliable in winter but not in summer at the diurnal scale, and less reliable at the hourly scale. The Pilau model showed good reliability for the coffee crop in the autumn and lower reliability in the other seasons at the diurnal scale, while at the hourly scale it was reliable in autumn and winter; for the orchard its performance was satisfactory in winter at both timescales and unsatisfactory in summer at the diurnal scale. The Oyarzun model was reliable for the coffee crop at the diurnal scale and less reliable at the hourly scale in summer; for the orchard it showed good reliability at both timescales in winter and low reliability in summer.
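The Beer model referred to above treats the canopy as attenuating the incident flux exponentially with leaf area, so the intercepted fraction is 1 - exp(-k * LAI); the short sketch below works through that relation with an extinction coefficient, leaf area index, and above-canopy net radiation that are assumed purely for illustration and are not values from the thesis.

```python
import math

def intercepted_fraction(k: float, lai: float) -> float:
    """Beer-Lambert style attenuation: fraction of incoming radiation intercepted
    by a canopy with extinction coefficient k and leaf area index lai."""
    return 1.0 - math.exp(-k * lai)

# Assumed values for illustration only.
k, lai = 0.5, 3.0          # extinction coefficient and leaf area index
incoming_rn = 450.0        # W m^-2, hypothetical net radiation above the canopy

frac = intercepted_fraction(k, lai)
print(f"Intercepted fraction:  {frac:.2f}")
print(f"Canopy net radiation:  {frac * incoming_rn:.0f} W m^-2")
```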
