51

A quantum chemical study on a set of H3 antihistamine molecules

Edson Barbosa da Costa 10 February 2010 (has links)
In this thesis, molecular orbital calculations were carried out on a set of 28 non-imidazole H3 antihistamine compounds using the Austin Model 1, Hartree-Fock-Roothaan, and Density Functional Theory methods of quantum chemistry, in order to investigate possible relationships between theoretical electronic descriptors and the experimental binding affinities of these compounds for the H3 receptor (pKi). The frontier effective-for-reaction molecular orbital (FERMO) energies correlated better with the pKi values than the energies of the frontier orbitals HOMO (Highest Occupied Molecular Orbital) and LUMO (Lowest Unoccupied Molecular Orbital). Exploratory data analysis by hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that four descriptors separate the compounds into two distinct sets, one grouping the molecules with higher binding affinities and the other gathering the compounds with lower pKi values. This separation was obtained with the following theoretical descriptors: the FERMO energy (εFERMO), the charge derived from the electrostatic potential on the nitrogen atom N1, the electronic density index for the FERMO on the N1 atom (Σ(FERMO) ci²), and the electrophilicity (ω'). These descriptors were subsequently used to build three quantitative structure-activity relationship (QSAR) regression models by the Partial Least Squares (PLS) method.
The best regression model gave Q2 = 0.88 and R2 = 0.927, obtained with a training set of 23 molecules and an external validation set of 5 molecules. Analysis of the PLS regression equation, together with the values of the selected and non-selected descriptors, suggests that high FERMO energies and high Σ(FERMO) ci² values, combined with low electrophilicities and pronounced negative charges on the N1 atom, are desirable properties for designing new compounds, with chemical structures similar to the molecules studied here, that might have high binding affinity. Moreover, these compounds can be classified as electron donors and are therefore very likely to interact with the histamine H3 receptor through a charge-transfer process.
52

Development of software, algorithms and different chemometric approaches in QSAR studies

Martins, João Paulo Ataíde, 1980- 25 August 2018 (has links)
Advisor: Márcia Miguel Castro Ferreira / Doctoral thesis - Universidade Estadual de Campinas, Instituto de Química / Made available in DSpace on 2018-08-25; previous issue date: 2013 / Computer-aided drug design is an important research field in chemistry and related areas. The tools available for such studies include programs to generate molecular descriptors and to build and validate mathematical models in QSAR (Quantitative Structure-Activity Relationship). To make these studies more accessible to the scientific community, new methodologies and programs for generating molecular descriptors and for building and validating QSAR models were developed in this thesis. A new 4D QSAR methodology, known as LQTA-QSAR, was developed to generate spatial descriptors that take into account the conformational ensemble profiles of the molecules under study, obtained from molecular dynamics simulations. These profiles are generated with the free software GROMACS, and the descriptors are computed by new software developed in this work, called LQTAgrid. The results obtained with this methodology were validated by comparing them with results for data sets available in the literature. Another, user-friendly, program gathering the main tools for building and validating QSAR models was developed and named QSAR modeling. It implements the OPS variable selection method, developed in our laboratory, and uses Partial Least Squares (PLS) as the regression method.
The PLS algorithm implemented in the program was chosen on the basis of a study of the performance and validation precision of the main PLS algorithms available in the literature. In addition, QSAR modeling was used in a 2D QSAR study of a set of 20 flavonoids with antimutagenic activity against 3-nitrofluoranthene (3-NFA). / Doctorate / Physical Chemistry / Doctor of Sciences
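The comparison of PLS algorithms discussed in this entry can be anchored by the classical NIPALS formulation of PLS1 (single response). The sketch below is didactic, not the thesis's or the QSAR modeling program's implementation:

```python
import numpy as np

def pls1_nipals(X, y, n_components):
    """Classical NIPALS PLS1: regress a single response y on X."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xd, yd = X - x_mean, y - y_mean          # centered working copies
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xd.T @ yd                         # weight: direction of max covariance
        w /= np.linalg.norm(w)
        t = Xd @ w                            # scores
        tt = t @ t
        p = Xd.T @ t / tt                     # X loadings
        q = (yd @ t) / tt                     # y loading
        Xd = Xd - np.outer(t, p)              # deflate X
        yd = yd - q * t                       # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))  # regression vector
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (np.asarray(X, float) - x_mean) @ B + y_mean

# With as many components as predictors, PLS1 reproduces the least-squares fit,
# so a noiseless linear response is recovered essentially exactly.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0])
B, xm, ym = pls1_nipals(X, y, n_components=5)
residual = np.max(np.abs(pls1_predict(X, B, xm, ym) - y))
print(residual)  # ~0
```

Production PLS variants (SIMPLS, kernel PLS, etc.) differ mainly in speed and numerical behavior, which is what the algorithm study in this thesis compared.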
53

Status of bankruptcy in financial theory: theoretical approaches and empirical validations in the French context

Ben Jabeur, Sami 27 May 2011 (has links)
In the current economic climate, a growing number of firms face economic and financial difficulties that can, in some cases, lead to bankruptcy. In principle, these difficulties do not arise suddenly: before a firm is declared bankrupt, it is confronted with financial problems of increasing severity, such as default on a debt, temporary insolvency, or a liquidity shortage. Identifying the causes of failure is not straightforward, since the factors that provoke it cannot be enumerated exhaustively. The causes are multiple, and their accumulation further compromises the firm's survival.
The importance of this phenomenon and its impact on the economy as a whole justify the need to understand and explain it by analyzing its causes and origins. The aim of our study is to classify firms in difficulty according to their degree of viability and to understand the causes of the deterioration of their situation. We compare three models (linear discriminant analysis, the logit model, and PLS regression), which allows us, on the basis of the correct classification rates obtained, to choose the best model while identifying the origin and causes of these failures.
54

Beef quality evaluation using near-infrared hyperspectral imaging

Balage, Juliana Monteiro 14 November 2017 (has links)
Increasingly, industry requires real-time methods for quality control of fresh meat in order to improve production efficiency, ensure product homogeneity, and meet consumer expectations. In the present work, hyperspectral imaging was used to evaluate the quality of Nellore beef, with emphasis on tenderness and related characteristics, and to build distribution maps for observing the variability of these characteristics between and within samples. To investigate whether using different muscle groups increases the variability of the reference values, thereby improving tenderness prediction and classification models, samples from the Longissimus (94) and B. femoris (94) muscles of Nellore cattle were used. To investigate whether selecting the region of interest (ROI) in the image at the exact location where the shear-force cores were collected improves the prediction and classification models, samples from the Longissimus muscle (50) were used. After image acquisition (1,000-2,500 nm), each sample was evaluated following traditional methodology for shear force, dry matter, crude protein, lipids, and sarcomere length. The spectral and spatial data were analyzed by chemometric techniques, and PLSR and PLS-DA models were constructed. In the approach with different muscles, the data were modeled separately, to avoid attributing phenomena caused by muscle differences to the characteristics under investigation.
Even so, Longissimus samples with unacceptable tenderness were classified with sensitivity = 87% and tender B. femoris samples with sensitivity = 90%, both in external validation. Regarding ROI selection, the classification models using the local ROI performed better than the models using the whole-sample ROI (external validation sensitivity for the tough class = 33% and 70%, respectively). However, the more general model performed better in building tenderness distribution maps, with 72% of the predicted images correctly classified.
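The per-class sensitivities reported above are simply the recall of each class in external validation. A minimal helper, with made-up labels for illustration (not the thesis data):

```python
import numpy as np

def sensitivity(y_true, y_pred, cls):
    """Fraction of samples of class `cls` that the model recovered (recall)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    mask = y_true == cls
    return float((y_pred[mask] == cls).mean())

# Illustrative external-validation labels (hypothetical).
y_true = np.array(["tough", "tough", "tough", "tender", "tender", "tender"])
y_pred = np.array(["tough", "tender", "tough", "tender", "tender", "tender"])
print(sensitivity(y_true, y_pred, "tough"))   # 2 of 3 tough samples found
print(sensitivity(y_true, y_pred, "tender"))  # all tender samples found
```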
56

Understanding the appropriation of consumer smart connected objects: a hierarchical component modelling approach

Zhong, Zeling 12 November 2019 (has links)
According to Hoffman & Novak (2018), smart connected objects, which open the way to new usage experiences, have the potential to revolutionize consumers' lives in the years to come. The main challenge for smart connected objects is to integrate their use into consumers' daily practices so that they actively produce usage data over the long term, namely appropriation.
This research validated an explanatory model of the appropriation of consumer smart connected objects through the psychological needs of French consumers with respect to the connected objects they own. Our results show that appropriation is strongly correlated with the need for self-identity, the need for having a place, and the needs for efficacy and effectance. Appropriation also has a positive impact on the overall value of the connected object as perceived by consumers, on consumers' extra-role behaviors, and on their satisfaction with daily life. Moreover, the mediating role of extra-role behaviors in the relationship between appropriation and perceived value refines our understanding of value co-creation mechanisms from the consumer's point of view, showing how the appropriation of connected objects contributes to value creation by consumers.
57

Multivariate Data Analysis of a (Ti,Al)N Arc-PVD coating process: MVDA of the growth parameters thickness, stress, composition, and cutting performance

Öqvist, Per-Olof January 2021 (has links)
This diploma work was carried out at Seco Tools AB (SECO) in Fagersta and aimed to evaluate the possibility of modeling the relationship between deposition data, coating properties, and cutting performance of a (Ti,Al)N coating on cutting inserts by applying the Multivariate Data Analysis (MVDA) technique Partial Least Squares Projection to Latent Structures (PLS). The deposition technique in focus was Cathodic Arc Deposition (Arc-PVD). For this purpose, two series of Arc-PVD coatings were manufactured: the first to generate a supervised explorative model of the deposition process, the second to generate a batch-to-batch variation model. In the supervised explorative model, the deposition parameters were set by a Design of Experiments (DOE) setup using a quarter-fraction factorial design with resolution III. In the batch-to-batch model, the non-fixed deposition parameters and the cathode wear were monitored, and all other parameters were kept the same for every run. The results demonstrate good possibilities for modeling Arc-PVD coating properties, and their performance in metal cutting, with respect to the applied deposition parameters. The supervised explorative model confirmed previously established relationships, while the batch-to-batch model showed that variations between batches could be related to the wear of the cathode, which was shown to have a negative influence on the properties of the deposited coating.
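A quarter-fraction resolution III design of the kind mentioned above is built by aliasing extra factors onto interaction columns of a full factorial base. A generic 2^(5-2) sketch with coded factor columns (hypothetical factors, not SECO's actual deposition parameters):

```python
import numpy as np
from itertools import product

# Full 2^3 factorial in three base factors, coded -1/+1 (8 runs).
base = np.array(list(product((-1, 1), repeat=3)))
A, B, C = base.T

# Quarter fraction: alias two extra factors onto interactions (generators D=AB, E=AC).
# Main effects are then confounded with two-factor interactions -> resolution III.
D, E = A * B, A * C
design = np.column_stack([A, B, C, D, E])

print(design.shape)       # 8 runs cover 5 factors instead of the full 32
print(design.T @ design)  # coded columns stay orthogonal: 8 * identity
```

The price of the reduced run count is the confounding pattern encoded by the generators, which is why the explorative model could only confirm main-effect relationships.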
58

A Statistical Methodology for Classifying Time Series in the Context of Climatic Data

Ramírez Buelvas, Sandra Milena 24 February 2022 (has links)
[ES] De acuerdo con las regulaciones europeas y muchos estudios científicos, es necesario monitorear y analizar las condiciones microclimáticas en museos o edificios, para preservar las obras de arte en ellos. Con el objetivo de ofrecer herramientas para el monitoreo de las condiciones climáticas en este tipo de edificios, en esta tesis doctoral se propone una nueva metodología estadística para clasificar series temporales de parámetros climáticos como la temperatura y humedad relativa. La metodología consiste en aplicar un método de clasificación usando variables que se computan a partir de las series de tiempos. Los dos primeros métodos de clasificación son versiones conocidas de métodos sparse PLS que no se habían aplicado a datos correlacionados en el tiempo. El tercer método es una nueva propuesta que usa dos algoritmos conocidos. Los métodos de clasificación se basan en diferentes versiones de un método sparse de análisis discriminante de mínimos cuadra- dos parciales PLS (sPLS-DA, SPLSDA y sPLS) y análisis discriminante lineal (LDA). Las variables que los métodos de clasificación usan como input, corresponden a parámetros estimados a partir de distintos modelos, métodos y funciones del área de las series de tiempo, por ejemplo, modelo ARIMA estacional, modelo ARIMA- TGARCH estacional, método estacional Holt-Winters, función de densidad espectral, función de autocorrelación (ACF), función de autocorrelación parcial (PACF), rango móvil (MR), entre otras funciones. También fueron utilizadas algunas variables que se utilizan en el campo de la astronomía para clasificar estrellas. En los casos que a priori no hubo información de los clusters de las series de tiempos, las dos primeras componentes de un análisis de componentes principales (PCA) fueron utilizadas por el algoritmo k- means para identificar posibles clusters de las series de tiempo. Adicionalmente, los resultados del método sPLS-DA fueron comparados con los del algoritmo random forest. 
Tres bases de datos de series de tiempos de humedad relativa o de temperatura fueron analizadas. Los clusters de las series de tiempos se analizaron de acuerdo a diferentes zonas o diferentes niveles de alturas donde fueron instalados sensores para el monitoreo de las condiciones climáticas en los 3 edificios.El algoritmo random forest y las diferentes versiones del método sparse PLS fueron útiles para identificar las variables más importantes en la clasificación de las series de tiempos. Los resultados de sPLS-DA y random forest fueron muy similares cuando se usaron como variables de entrada las calculadas a partir del método Holt-Winters o a partir de funciones aplicadas a las series de tiempo. Aunque los resultados del método random forest fueron levemente mejores que los encontrados por sPLS-DA en cuanto a las tasas de error de clasificación, los resultados de sPLS- DA fueron más fáciles de interpretar. Cuando las diferentes versiones del método sparse PLS utilizaron variables resultantes del método Holt-Winters, los clusters de las series de tiempo fueron mejor discriminados. Entre las diferentes versiones del método sparse PLS, la versión sPLS con LDA obtuvo la mejor discriminación de las series de tiempo, con un menor valor de la tasa de error de clasificación, y utilizando el menor o segundo menor número de variables.En esta tesis doctoral se propone usar una versión sparse de PLS (sPLS-DA, o sPLS con LDA) con variables calculadas a partir de series de tiempo para la clasificación de éstas. Al aplicar la metodología a las distintas bases de datos estudiadas, se encontraron modelos parsimoniosos, con pocas variables, y se obtuvo una discriminación satisfactoria de los diferentes clusters de las series de tiempo con fácil interpretación. 
La metodología propuesta puede ser útil para caracterizar las distintas zonas o alturas en museos o edificios históricos de acuerdo con sus condiciones climáticas, con el objetivo de prevenir problemas de conservación con las obras de arte. / [CA] D'acord amb les regulacions europees i molts estudis científics, és necessari monitorar i analitzar les condiciones microclimàtiques en museus i en edificis similars, per a preservar les obres d'art que s'exposen en ells. Amb l'objectiu d'oferir eines per al monitoratge de les condicions climàtiques en aquesta mena d'edificis, en aquesta tesi es proposa una nova metodologia estadística per a classificar series temporals de paràmetres climàtics com la temperatura i humitat relativa.La metodologia consisteix a aplicar un mètode de classificació usant variables que es computen a partir de les sèries de temps. Els dos primers mètodes de classificació són versions conegudes de mètodes sparse PLS que no s'havien aplicat adades correlacionades en el temps. El tercer mètode és una nova proposta que usados algorismes coneguts. Els mètodes de classificació es basen en diferents versions d'un mètode sparse d'anàlisi discriminant de mínims quadrats parcials PLS (sPLS-DA, SPLSDA i sPLS) i anàlisi discriminant lineal (LDA). Les variables queels mètodes de classificació usen com a input, corresponen a paràmetres estimats a partir de diferents models, mètodes i funcions de l'àrea de les sèries de temps, per exemple, model ARIMA estacional, model ARIMA-TGARCH estacional, mètode estacional Holt-Winters, funció de densitat espectral, funció d'autocorrelació (ACF), funció d'autocorrelació parcial (PACF), rang mòbil (MR), entre altres funcions. També van ser utilitzades algunes variables que s'utilitzen en el camp de l'astronomia per a classificar estreles. 
En els casos que a priori no va haver-hi información dels clústers de les sèries de temps, les dues primeres components d'una anàlisi de components principals (PCA) van ser utilitzades per l'algorisme k-means per a identificar possibles clústers de les sèries de temps. Addicionalment, els resultats del mètode sPLS-DA van ser comparats amb els de l'algorisme random forest.Tres bases de dades de sèries de temps d'humitat relativa o de temperatura varen ser analitzades. Els clústers de les sèries de temps es van analitzar d'acord a diferents zones o diferents nivells d'altures on van ser instal·lats sensors per al monitoratge de les condicions climàtiques en els edificis.L'algorisme random forest i les diferents versions del mètode sparse PLS van ser útils per a identificar les variables més importants en la classificació de les series de temps. Els resultats de sPLS-DA i random forest van ser molt similars quan es van usar com a variables d'entrada les calculades a partir del mètode Holt-winters o a partir de funcions aplicades a les sèries de temps. Encara que els resultats del mètode random forest van ser lleument millors que els trobats per sPLS-DA quant a les taxes d'error de classificació, els resultats de sPLS-DA van ser més fàcils d'interpretar.Quan les diferents versions del mètode sparse PLS van utilitzar variables resultants del mètode Holt-Winters, els clústers de les sèries de temps van ser més ben discriminats. Entre les diferents versions del mètode sparse PLS, la versió sPLS amb LDA va obtindre la millor discriminació de les sèries de temps, amb un menor valor de la taxa d'error de classificació, i utilitzant el menor o segon menor nombre de variables.En aquesta tesi proposem usar una versió sparse de PLS (sPLS-DA, o sPLS amb LDA) amb variables calculades a partir de sèries de temps per a classificar series de temps. 
[EN] According to different European Standards and several studies, it is necessary to monitor and analyze the microclimatic conditions in museums and similar buildings in order to preserve the artworks they display. With the aim of offering tools to monitor climatic conditions, this dissertation proposes a new statistical methodology for classifying time series of climatic parameters such as relative humidity and temperature. The methodology consists of applying a classification method to variables computed from the time series. The first two classification methods are versions of known sparse methods that had not previously been applied to time-dependent data; the third is a new proposal that combines two known algorithms. These classification methods are based on different versions of sparse partial least squares discriminant analysis (sPLS-DA, SPLSDA, and sPLS) and linear discriminant analysis (LDA). The variables computed from the time series correspond to parameter estimates from functions, methods, or models commonly used in time series analysis, e.g., the seasonal ARIMA model, the seasonal ARIMA-TGARCH model, the seasonal Holt-Winters method, the spectral density function, the autocorrelation function (ACF), the partial autocorrelation function (PACF), and the moving range (MR), among other functions. Some variables employed in the field of astronomy for classifying stars were also used. The methodology proposed consists of two parts.
Firstly, different variables are computed by applying the methods, models, or functions mentioned above to the time series. Once the variables are calculated, they are used as input for a classification method such as sPLS-DA, SPLSDA, or sPLS with LDA (the new proposal). When there was no information about the clusters of the different time series, the first two components from principal component analysis (PCA) were used as input to the k-means method to identify possible clusters of time series. In addition, results from the random forest algorithm were compared with results from sPLS-DA. This study analyzed three sets of time series of relative humidity or temperature, recorded in different buildings (Valencia's Cathedral, the archaeological site of L'Almoina, and the baroque church of Saint Thomas and Saint Philip Neri) in Valencia, Spain. The clusters of the time series were analyzed according to the different zones, or the different heights, at which sensors were installed for monitoring the climatic conditions in these buildings. The random forest algorithm and the different versions of sparse PLS helped identify the main variables for classifying the time series. The results from sPLS-DA and random forest were very similar when the input variables came from the seasonal Holt-Winters method or from functions applied to the time series, and the results from sPLS-DA were easier to interpret than those from random forest. When the different versions of sparse PLS used variables from the seasonal Holt-Winters method as input, the clusters of the time series were identified effectively; these variables yielded the best, or second best, results according to the classification error rate.
Among the different versions of sparse PLS proposed, sPLS with LDA classified the time series using the fewest variables and with the lowest classification error rate. We propose using a sparse version of PLS (sPLS-DA, or sPLS with LDA) with variables computed from time series in order to classify the time series. For the different data sets studied, the methodology produced parsimonious models with few variables and achieved a satisfactory, easily interpretable discrimination of the different clusters of the time series. This methodology can be useful for characterizing and monitoring microclimatic conditions in museums, or similar buildings, in order to prevent problems with artwork. / I gratefully acknowledge the financial support of Pontificia Universidad Javeriana Cali – PUJ and Instituto Colombiano de Crédito Educativo y Estudios Técnicos en el Exterior – ICETEX, which awarded me the scholarships ’Convenio de Capacitación para Docentes O. J. 086/17’ and ’Programa Crédito Pasaporte a la Ciencia ID 3595089 foco-reto salud’, respectively. These scholarships were essential for obtaining the Ph.D. I also gratefully acknowledge the financial support of the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 814624. / Ramírez Buelvas, SM. (2022). A Statistical Methodology for Classifying Time Series in the Context of Climatic Data [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/181123
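The two-part pipeline described in the abstract (compute summary variables from each series, then classify or cluster on those variables) can be sketched as follows. This is a minimal illustration on synthetic data with a reduced feature set (a few ACF lags, the standard deviation, and the mean moving range), not code from the thesis, and it uses the unsupervised PCA + k-means path mentioned for unlabeled data rather than the sparse PLS variants:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def acf(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

def features(x):
    """A small stand-in for the thesis's richer feature set
    (Holt-Winters parameters, ARIMA estimates, PACF, etc.)."""
    mr = np.abs(np.diff(x))  # moving range
    return [acf(x, 1), acf(x, 2), acf(x, 3), x.std(), mr.mean()]

rng = np.random.default_rng(0)
t = np.arange(240)
# Two hypothetical sensor "zones": a smooth daily cycle vs. noisy conditions.
smooth = [np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size) for _ in range(10)]
noisy = [rng.normal(size=t.size) for _ in range(10)]

X = np.array([features(s) for s in smooth + noisy])
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize features
scores = PCA(n_components=2).fit_transform(X)     # first two components
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
```

With well-separated feature values (high ACF for the cyclic series, near-zero for noise), k-means on the two PCA scores recovers the two groups.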
59

Recovery of Yttrium and Neodymium from Copper Pregnant Leach Solutions by Solvent Extraction

Copp, Rebecca January 2016 (has links)
The solvent extraction of yttrium and neodymium from copper pregnant leach solutions (PLS) using Primene JM-T, a primary aliphatic amine, has been studied. The effects of contact time, pH, sulfate concentration, and extractant concentration were investigated using synthetic and actual PLS systems. Standard experimental conditions were a 5-minute contact time, pH ~2.5, 10% v/v Primene JM-T concentration, and a 1:1 O:A phase ratio. Distribution isotherms were constructed for the pure systems and for actual copper leach solutions. Synthetic solutions contained 100 ppm Y and ~75 ppm Nd; the copper PLS contained 2.1 ppm Nd and 14.9 ppm Y. Results showed that complete extraction of both yttrium and neodymium occurred within five minutes at pH values greater than 1. Sulfate was also found not to inhibit extraction at any concentration. Additionally, the distribution isotherms show that extraction of these metals can operationally take place in one stage from both synthetic solutions and copper leach solutions.
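The distribution isotherms mentioned above rest on the distribution ratio D = [M]org/[M]aq measured at equilibrium. A short sketch of how D and single-stage percent extraction relate at a given O:A phase ratio; the feed and raffinate concentrations below are illustrative numbers, not data from the thesis:

```python
def distribution_ratio(aq_initial_ppm, aq_final_ppm, oa_ratio=1.0):
    """D = [M]org / [M]aq at equilibrium.
    By mass balance, [M]org = ([M]aq,initial - [M]aq,final) / (O:A)."""
    org_ppm = (aq_initial_ppm - aq_final_ppm) / oa_ratio
    return org_ppm / aq_final_ppm

def percent_extraction(d, oa_ratio=1.0):
    """%E = 100 * D / (D + 1/(O:A)) for a single contact stage."""
    return 100.0 * d / (d + 1.0 / oa_ratio)

# Illustrative: 100 ppm Y feed, 2 ppm left in the aqueous phase at 1:1 O:A.
d = distribution_ratio(100.0, 2.0)  # -> 49.0
e = percent_extraction(d)           # -> 98.0 (%)
```

A large D at the operating pH is what makes single-stage extraction, as reported in the abstract, operationally feasible.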
60

Measurement of Reduction Efficiency in Green Liquor Using a NIR Spectrometer / Mätning av reduktionsgrad i grönlut med hjälp av en NIR spektrometer

Persson, Josef January 2016 (has links)
Domsjö Fabriker has previously installed a near-infrared (NIR) spectrometer after one of its recovery boilers, with the purpose of monitoring the reduction efficiency of the boiler and, later, optimizing the process. In this work, calibration models for the instrument were created. 108 green liquor samples were extracted; a NIR spectrum was recorded for each sample, and the samples were subsequently analyzed in the laboratory for total alkali, sulfide, and total sulfur. Several calibration models were created with multivariate data analysis, and their performance and robustness were compared. The best model predicted reduction efficiency with an RMSEP of 2.75 percentage points. Moreover, models were created for predicting total alkali with an RMSEP of 0.108 mol/l, sulfide with an RMSEP of 1.95 g/l, total sulfur with an RMSEP of 2.83 g/l, and the S/Na2 ratio with an RMSEP of 0.022. The results are good enough that the instrument could be used to optimize the process and monitor process disturbances.
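RMSEP (root mean square error of prediction), the figure of merit quoted throughout the abstract above, is the root mean square of the residuals between laboratory reference values and model predictions on an external validation set. A minimal sketch with made-up numbers:

```python
import math

def rmsep(y_true, y_pred):
    """Root mean square error of prediction over a validation set."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Hypothetical reduction-efficiency values (%): lab reference vs. NIR model.
lab = [92.0, 95.5, 90.0, 94.0]
model = [93.0, 94.5, 91.0, 93.0]
print(rmsep(lab, model))  # -> 1.0
```

Because RMSEP is expressed in the units of the predicted quantity, the 2.75 figure above means the model's typical reduction-efficiency error is about 2.75 percentage points.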
