  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Non-Destructive VIS/NIR Reflectance Spectrometry for Red Wine Grape Analysis

Fadock, Michael 04 August 2011 (has links)
A novel non-destructive method of grape berry analysis is presented that uses reflected light to predict berry composition. Reflectance spectra were collected with a diode array spectrometer (350 to 850 nm) over the 2009 and 2010 growing seasons. Partial least squares regression (PLS) and support vector machine regression (SVMR) generated calibrations between reflected light and composition for five berry components: total soluble solids (°Brix), titratable acidity (TA), pH, total phenols, and anthocyanins. Standard methods of analysis for the components were employed and characterized for error. The reflectance data were decomposed by principal component analysis (PCA) and independent component analysis (ICA). Regression models were constructed using 10×10-fold cross-validated PLS and SVM models subject to smoothing, differentiation, and normalization pretreatments. All generated models were validated on the alternate season using two model selection strategies: minimum root mean squared error of prediction (RMSEP) and the "oneSE" heuristic. PCA/ICA decomposition revealed features in the long VIS wavelengths and the NIR region that were consistent across seasons. The 2009 season was generally more variable, possibly due to cold weather effects. RMSEP and R2 statistics indicate that the PLS °Brix, pH, and TA models predict well for 2009 and 2010; SVMR was marginally better. The R2 values of the PLS °Brix, pH, and TA models were 0.84, 0.58, and 0.56 for 2009, and 0.89, 0.81, and 0.58 for 2010. The 2010 °Brix models were suitable for rough screening. Optimal pretreatments were SG smoothing and relative normalization. Anthocyanins were well predicted in 2009 (R2 0.65) but not in 2010 (R2 0.15). Phenols were not well predicted in either year (R2 0.15-0.25). Validation demonstrated that the °Brix, pH, and TA models from 2009 transferred to 2010 with fair results (R2 0.70, 0.72, 0.31).
Models generated from 2010 reflectance data could not predict 2009 data. It is hypothesized that weather events present in 2009 but not in 2010 allowed a forward calibration transfer while preventing the reverse. Heuristic selection was superior to minimum RMSEP for transfer, indicating some overfitting in the minimum-RMSEP models. The results demonstrate a reflectance-composition relationship in the VIS-NIR region for °Brix, pH, and TA that warrants additional study and the development of further calibrations.
242

Assessment of Strategic Management Practices in Small Agribusiness Firms in Tanzania

Dominic, Theresia 11 May 2015 (has links)
No description available.
243

Supply chain maturity of firms: design and implementation of an assessment model

Zouaghi, Iskander 19 February 2013 (has links) (PDF)
The development of supply chain maturity is a growing concern for firms operating in an environment undergoing deep, simultaneous transformation in several domains. Leaning on the various initiatives proposed in the form of reference frameworks, firms progress somewhat laboriously. Within a post-positivist paradigm, this research focuses on the design of a model for assessing supply chain maturity along several dimensions. These objectives were achieved through an in-depth literature review together with an empirical study. On one hand, the concepts of supply chain, supply chain management, and supply chain maturity, as well as the related capabilities, were reviewed in depth; on the other, assessment frameworks from logistics, operations management, and supply chain management were examined and confronted with the established supply chain maturity capabilities. This led to the design of a model that builds on earlier models and proposes criteria that complement them. The model was validated on a sample of 115 respondents (logistics directors, supply chain managers, etc.). Analysis of the collected data relied on structural equation modeling, specifically PLS-PM (partial least squares path modeling). This analysis, preceded by univariate analysis and principal component analysis, established the convergent and discriminant validity of the measurement scales, as well as the existence of impact links of varying strength between the different dimensions and supply chain maturity.
The results of this research show that maturity is chiefly constituted by the operational and strategic dimensions, followed by the informational, structural, organizational, and human dimensions. The thesis finds that the risk-and-resilience dimension, the relational dimension, and the sustainability-and-social-responsibility dimension weigh least in the structure of supply chain maturity. Once the model was validated, the actual maturity of the sampled firms was assessed using an Importance/Maturity analysis. The results of this analysis show strong supply chain maturity on the strategic, structural, organizational, and risk-and-resilience dimensions, but only medium to low maturity on the other dimensions. This suggests firms should concentrate their efforts on the operational, relational, and informational aspects and on those relating to sustainability and social responsibility.
244

Explaining temporal variations in soil respiration rates and δ13C in coniferous forest ecosystems

Comstedt, Daniel January 2008 (has links)
Soils of Northern Hemisphere forests contain a large part of the global terrestrial carbon (C) pool. Even small changes in this pool can have a large impact on atmospheric [CO2] and the global climate. Soil respiration is the largest terrestrial C flux to the atmosphere and can be divided into autotrophic respiration (from roots, mycorrhizal hyphae, and associated microbes) and heterotrophic respiration (from decomposers of organic material). It is therefore crucial to establish how the two components will respond to changing environmental factors. In this thesis I studied the effect of elevated atmospheric [CO2] (+340 ppm, 13C-depleted) and elevated air temperature (2.8-3.5 °C) on soil respiration in a whole-tree chamber (WTC) experiment conducted in a boreal Norway spruce forest. In another spruce forest I used multivariate modelling to establish the link between day-to-day variations in soil respiration rates and its δ13C, and above- and below-ground abiotic conditions. In both forests, variation in δ13C was used as a marker for autotrophic respiration. A trenching experiment was conducted in the latter forest to separate the two components of soil respiration. The potential problems associated with trenching, increased root decomposition and changed soil moisture conditions, were handled by empirical modelling. The WTC experiment showed that elevated [CO2], but not temperature, increased soil respiration rates by 48 to 62%. In absolute terms the CO2-induced increase was relatively insensitive to seasonal changes in soil temperature, and the δ13C data suggest it mostly resulted from increased autotrophic respiration. From the multivariate modelling we observed a strong link between weather (air temperature and vapour pressure deficit) and the day-to-day variation of soil respiration rate and its δ13C. However, the tightness of this link depended on good weather for up to a week before the respiration sampling.
Changes in soil respiration rates lagged weather conditions by 2-4 days, which was 1-3 days shorter than the lag of the δ13C signal; we hypothesised this difference to be due to pressure concentration waves moving through the phloem faster than the solute itself (i.e., the δ13C label). Results from the empirical modelling in the trenching experiment show that autotrophic respiration contributed about 50% of total soil respiration, showed great day-to-day variation, and was correlated with total soil respiration but not with soil temperature or soil moisture. Over the first five months after trenching, an estimated 45% of respiration from the trenched plots was an artefact of the treatment; of this, 29% was a water-difference effect and 16% resulted from root decomposition. In conclusion, elevated [CO2] caused an increased C flux to the roots, but this C was rapidly respired and has probably not changed the C stored in root biomass or in soil organic matter in this N-limited forest. Autotrophic respiration seems to be strongly influenced by the availability of newly produced substrates and rather insensitive to changes in soil temperature. Root trenching artefacts can be compensated for by empirical modelling, an alternative to the sequential root harvesting technique.
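Using δ13C as a marker for autotrophic respiration, as described above, is commonly done with a two-source isotopic mixing model. A minimal sketch follows, with purely illustrative end-member values (not taken from the thesis):

```python
def autotrophic_fraction(d13c_total, d13c_auto, d13c_het):
    """Two-source isotopic mixing: the fraction of total soil respiration
    attributable to the autotrophic source, given the delta-13C of the
    total flux and of the two end-members."""
    return (d13c_total - d13c_het) / (d13c_auto - d13c_het)

# Illustrative end-member values (hypothetical, per-mil vs VPDB)
f = autotrophic_fraction(d13c_total=-24.0, d13c_auto=-26.0, d13c_het=-22.0)
print(f"autotrophic share: {f:.2f}")  # 0.50
```

The partitioning is only as good as the end-member estimates, which is why the thesis pairs the isotopic approach with a trenching experiment and empirical modelling of the trenching artefacts.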
245

The use of blogs for teaching and learning in UK and US Higher Education

Garcia, Elaine January 2017 (has links)
Within the last decade there has been a significant increase in the range of Social Media tools available, and with it a significant rise in the use and popularity of Social Media in many aspects of everyday life, particularly in the UK and US. One of the areas in which usage has risen is Higher Education (HE), where Social Media, and blogs in particular, have reportedly been used successfully for teaching and learning. Despite this reported success, there have to date been relatively few empirical studies exploring whether the use of blogs in teaching and learning increases students' perceived learning. This research study therefore provides an empirical study of students' perceived learning when blogs are used for teaching and learning in UK and US HE. The study adopts a post-positivist research approach and a quantitative research design, using questionnaires to explore student views of perceived learning when blogs are used as a tool for HE teaching and learning in the UK and US. It provides a framework for student use of blogs within HE teaching and learning and explores whether such use leads to greater perceived learning amongst students. The results, analysed using PLS-SEM, show that the successful use of blogs for teaching and learning is complex: students do report higher degrees of learning from using blogs, but this is influenced by the perceptions students hold of digital technology and of teaching and learning, their previous experience, and their expectations of blogging.
The results of this study have implications for both HE teachers and HE students and provide a framework which can be used to help ensure the successful use of blogs for HE teaching and learning within the UK and US in the future.
246

Multivariate analysis techniques applied to the development of virtual analyzers

Facchin, Samuel January 2005 (has links)
The construction of a virtual analyzer rests on three pillars: the model, the variables that enter the model, and the model correction/updating strategy. Mathematical models are classified by the level of process knowledge they embody, ranging from complex models based on fundamental relationships and physico-chemical laws, called white-box models, to models obtained through multivariate analysis techniques such as multivariate regression and neural networks, referred to as black-box models. The present work analyzes two of the pillars: the models, focusing on those obtained by dimensionality reduction techniques of the PLS family, and the variable selection methodologies used to build this class of models. First, the main linear and nonlinear variants of the PLS methodology are reviewed, from its original development to its combination with neural networks. Next, some of the techniques popularly used for variable selection in black-box models are presented, along with cross-validation techniques and techniques for selecting data for model calibration and validation. New approaches to variable selection are proposed, combining the data selection techniques with two variable selection methodologies. The results produced by these new approaches are compared with the classical method on linear and nonlinear case studies. The viability of the analyzed and developed techniques is verified by applying them to the development of a virtual analyzer for a distillation column simulated in the dynamic simulator Aspen Dynamics®. Finally, the steps and challenges of implementing a PLS-based virtual analyzer on a depropanizer tower of a petrochemical complex's raw materials plant are presented.
247

A framework for variable selection and determination of the optimal operating condition in multivariate, multi-stage continuous processes

Loreto, Éverton Miguel da Silva January 2014 (has links)
This dissertation proposes a novel approach for process variable selection and determination of the optimal operating condition in multi-stage, multivariate continuous processes. The proposed framework consists of six steps. First, the process variables are identified and the production stages established; the data are then pre-treated, discarding outlying observations and standardizing the remainder. Next, each stage is modeled by a partial least squares (PLS) regression that associates the dependent variable of that stage with the independent variables of all preceding stages. Independent variables are then iteratively selected based on the PLS regression coefficients: at each iteration the variable with the smallest regression coefficient is removed and a new PLS model is generated. The prediction error is evaluated and a new elimination is performed until the number of remaining variables equals the number of latent variables (the boundary condition for generating new PLS models). The subset yielding the lowest prediction error determines the most relevant process variables for each model. The set of PLS models built on the selected variables is then embedded in a quadratic program that defines the operating conditions minimizing the deviation between the predicted and nominal values of the response variables. The proposed approach was validated through two numerical examples: the first used data from a poultry company, while the second relied on simulated data.
248

A new methodology for the development of data-driven soft-sensors

Fleck, Thiago Dantas January 2012 (has links)
Soft-sensors have many applications in the chemical process industry and are essential to the success of advanced control projects. The controller's performance is always tied to the performance of the soft-sensor, so maintaining its quality over time is important. In this work, a new methodology is proposed for developing data-driven soft-sensors following a segmented approach intended to ease their maintenance. The proposal is to model the stationary part separately from the dynamics, unlike the traditional methodology in which the dynamic model is generated directly from process data. The stationary model is obtained by partial least squares (PLS) regression, while the dynamics are inserted afterwards using an optimization algorithm. The technique is applied to a distillation column, and its performance is similar to that of dynamic and static soft-sensors developed with traditional methods. Other steps in soft-sensor development are also investigated. For variable selection, statistical methods are compared with exhaustive search; the latter should be used as the default, since computational cost is no longer an obstacle. Best practices in data pre-processing, removal of the chromatograph dead time from the modeled signal, and steady-state detection are also presented.
249

An analysis of the intention to adopt cloud computing among IT professionals

Cogo, Gabriel Silva January 2013 (has links)
Cloud computing emerges from IT developers' need to continually expand or add new capabilities as quickly as possible with the smallest possible investment. It has been singled out as one of the major IT innovations of recent years and has therefore drawn the attention of both the academic and the business communities. Despite this growing interest in the academic literature, most research focuses on technical aspects such as computational potential and costs. Research on IT professionals' preferences regarding cloud computing as a business tool is largely limited to consulting and private-sector studies. This research presents a study of the impact of different dimensions on the intention of IT professionals to adopt cloud computing. To this end, it uses a variation of the TAM/UTAUT model for assessing the intention to adopt new technologies. The chosen method was a survey, based on a previously proposed and adapted instrument, conducted in two stages: a pre-test study and a final study. Different statistical techniques were employed to refine the instrument, including reliability analysis, exploratory factor analysis, and confirmatory factor analysis, the latter using PLS (partial least squares) path modeling for structural equation modeling. This refinement yielded a final theoretical research model containing 8 dimensions and 36 items. As a contribution to the IS field, the final theoretical research model proved adequate for assessing IT professionals' intention to adopt cloud computing. 
The research's main contribution to managerial practice is the cloud computing adoption-intention model itself, which can help cloud providers by measuring the main drivers of adoption, namely Perceived Usefulness and Attitude Toward Technological Innovation. It also shows that there is no positive relation between Security and Trust and Behavioral Intention. Twelve hypotheses were supported and six of the proposed hypotheses were rejected by the data. This information is intended to inspire efforts to develop the technology as a business tool.
