1

Improvements to PLS methodology

Bissett, Alastair Campbell January 2015 (has links)
Partial Least Squares (PLS) is an important statistical technique with multiple and diverse applications, used as an effective regression method for correlated or collinear datasets, or for datasets that are not of full rank for other reasons. A short history of PLS is followed by a review of the publications in which issues with the application of PLS have been discussed. The theoretical basis of PLS is developed from the singular value decomposition of the covariance, so that the strong links between principal components analysis and the various PLS algorithms appear as a natural consequence. Latent variable selection by cross-validation, permutation and information criteria is examined. A method for plotting cross-validation results is proposed that makes latent variable selection less ambiguous than conventional plots. Novel and practical methods are proposed to extend published methods for latent variable selection by both permutation and information criteria from univariate PLS1 models to multivariate PLS2 cases. The numerical method proposed for information criteria is also more general than the algebraic methods for PLS1 that have recently been published, as it does not assume any particular form for the PLS regression coefficients. All of these methods have been critically assessed using a number of datasets, selected specifically to represent a diverse set of dimensions and covariance structures. Methods for simulating multivariate datasets were developed that allow control of correlation and collinearity in both regressors and responses independently. This development also allows control over the variate distributions. Statistical design of experiments was used to generate plans for the simulation that allowed the factors influencing PLS model fit and latent variable selection to be identified. It was found that all the latent variable selection methods in the simulation tend to overfit, and the feature in the simulation that causes overfitting has been identified.
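The link the abstract draws between PLS and the singular value decomposition of the covariance can be illustrated with a short sketch. This is not the thesis's code and the data are hypothetical; it shows the standard construction in which each PLS component is taken from the dominant singular pair of the deflated cross-covariance X'Y:

```python
import numpy as np

def pls_svd_scores(X, Y, n_components):
    """Extract PLS score vectors from the SVD of the cross-covariance X'Y.

    Illustrative sketch: the dominant left singular vector of X'Y gives the
    X-weight vector for each component, and deflation of both blocks repeats
    the step for later components."""
    X, Y = X.astype(float).copy(), Y.astype(float).copy()
    T = np.empty((X.shape[0], n_components))
    for a in range(n_components):
        U, s, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
        w = U[:, 0]                     # X-weights: dominant singular vector
        t = X @ w                       # X-scores for this component
        p = X.T @ t / (t @ t)           # X-loadings
        c = Y.T @ t / (t @ t)           # Y-loadings
        X -= np.outer(t, p)             # deflate both blocks
        Y -= np.outer(t, c)
        T[:, a] = t
    return T

# hypothetical collinear demo data driven by 3 latent factors
rng = np.random.default_rng(1)
L = rng.normal(size=(100, 3))
X = L @ rng.normal(size=(3, 10)) + 0.05 * rng.normal(size=(100, 10))
Y = L @ rng.normal(size=(3, 2)) + 0.05 * rng.normal(size=(100, 2))
X -= X.mean(0)
Y -= Y.mean(0)
T = pls_svd_scores(X, Y, 3)
```

Successive score vectors extracted this way are mutually orthogonal, which is the property that ties the PLS algorithms so closely to principal components analysis of the covariance structure.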
2

Improving collaborative forecasting performance in the food supply chain

Eksoz, Can January 2014 (has links)
The dynamic structure of the Food Supply Chain (FSC) distinguishes itself from other supply chains. Providing food to customers in a healthy and fresh manner necessitates a significant effort on the part of manufacturers and retailers. In practice, while these partners collaboratively forecast time-sensitive and / or short-life product-groups (e.g. perishable, seasonal, promotional and newly launched products), they confront significant challenges which prevent them from generating accurate forecasts and conducting long-term collaborations. Partners’ challenges are not limited only to the fluctuating demand of time-sensitive product-groups and continuously evolving consumer choices, but are also largely related to their conflicting expectations. Partners’ contradictory expectations mainly occur during the practices of integration, forecasting and information exchange in the FSC. This research specifically focuses on the Collaborative Forecasting (CF) practices in the FSC. However, CF is addressed from the manufacturers’ point of view, when they collaboratively forecast perishable, seasonal, promotional and newly launched products with retailers in the FSC. The underlying reasons are that while there is a paucity of research studying CF from the manufacturers’ standpoint, associated product-groups decay at short notice and their demand is influenced by uncertain consumer behaviour and the dynamic environment of FSC. The aim of the research is to identify factors that have a significant influence on the CF performance. Generating accurate forecasts over the aforementioned product-groups and sustaining long-term collaborations (one year or more) between partners are the two major performance criteria of CF in this research. 
This research systematically reviews the literature on Collaborative Planning, Forecasting and Replenishment (CPFR), which combines the supply chain practices of upstream and downstream members by linking their planning, forecasting and replenishment operations. The review also involves the research themes of supply chain integration, forecasting process and information sharing. The reason behind reviewing these themes is that partners' CF is not limited to forecasting practices; it also encapsulates the integration of chains and bilateral information sharing for accurate forecasts. A single semi-structured interview with a UK-based food manufacturer and three online group discussions on the business-oriented social networking service LinkedIn enrich the research with pragmatic and qualitative data, which are coded and analysed via the software package QSR NVivo 9. Refining the results of the literature review with the qualitative data makes it possible to develop a rigorous conceptual model and associated hypotheses. Then, a comprehensive online survey questionnaire is developed to be delivered to food manufacturers located in the UK & Ireland, North America and Europe. An exploratory data analysis technique using Partial Least Squares (PLS) is used to analyse the survey responses empirically. The most significant contributions of this research are (i) to extend the body of literature by offering a new CF practice, aiming to improve forecast accuracy and long-term collaborations, and (ii) to provide managerial implications by offering a rigorous conceptual model guiding practitioners to implement the CF practice, for the achievement of accurate forecasts and long-term collaborations. In detail, the research findings primarily emphasise that manufacturers' interdepartmental integration plays a vital role for successful CF and integration with retailers. 
Effective integration with retailers encourages manufacturers to conduct stronger CF in the FSC. Partners' forecasting meetings are another significant factor for CF, while the role of forecasters in these meetings is crucial too, implying forecasters' indirect influence on CF. Complementary to past studies, this research further explores the manufacturers' various information sources that are significant for CF and which should be shared with retailers. It is also important to maintain the quality of information while it is shared with retailers; this result accordingly suggests that information quality is indirectly important for CF. There are two major elements that contribute to the literature. Firstly, relying on the particular product-groups in the FSC and examining CF from the manufacturers' point of view not only closes a pragmatic gap in the literature, but also identifies new areas for future studies in the FSC. Secondly, the CF practice of this research demonstrates the increasing forecast satisfaction of manufacturers over the associated product-groups. Given the subjective forecast expectations of manufacturers, due to organisational objectives and market dynamics, demonstrating the significant impact of the CF practice on forecast satisfaction leads to generalising its application to the FSC. Practitioners can draw on this research when they aim to collaboratively generate accurate forecasts and to conduct long-term collaborations over the associated product-groups. The benefits of this research are not limited to the FSC. Manufacturers in other industries can benefit from the research when they collaborate with retailers over similar product-groups that have a short shelf life and / or necessitate timely and reliable forecasts. 
In addition, this research opens new research avenues for academia in the areas of the supply chain, forecasting and information exchange, whilst drawing academics' attention to particular product-groups in the FSC for future research. Nevertheless, this research is limited to dyad manufacturer-retailer forecast collaborations over a limited range of product-groups. This is another opportunity for academics to extend this research to different types of collaborations and products.
3

Trialability, perceived risk and complexity of understanding as determinants of cloud computing services adoption

Etsebeth, Eugene Everard 16 February 2013 (has links)
In 2011 one-third of South African organisations did not intend to adopt cloud computing services because IT decision-makers lacked understanding of the related concepts and benefits (Goldstuck, 2011). This research develops a media-oriented model to examine the adoption of these services in South Africa. The model uses the technology acceptance model (TAM) and innovation diffusion theory (IDT) to develop variables that are considered determinants of adoption, including trialability, complexity of understanding, perceived risk, perceived ease of use and perceived usefulness. An electronic survey was sent to 107 IT decision-makers. Over 80% of the respondents were C-suite executives. The Partial Least Squares (PLS) method, a second-generation technique with advantages over ordinary regression models for this type of model, was chosen to depict and test the proposed model. The data analysis included evaluating and modifying the model, assessing the new measurement model, testing the hypotheses of the model structure and presenting the structural model. The research found that media, experts and word of mouth mitigate perceived risks, including bandwidth, connectivity and power. Furthermore, trialability and perceived usefulness were affected by social influence, as well as influencing adoption. The results enable service providers and marketers to develop product roadmaps and pinpoint media messages. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
4

Faktoren für eine erfolgreiche Steuerung von Patentaktivitäten

Günther, Thomas, Moses, Heike 12 September 2006 (has links) (PDF)
Empirischen Studien zufolge können Patente sich positiv auf den Unternehmenserfolg auswirken. Allerdings wirkt dieser Effekt nicht automatisch, sondern Unternehmen müssen sich um den Aufbau und die gesteuerte Weiterentwicklung eines nachhaltigen und wertvollen Patentportfolios bemühen. Bisher ist jedoch nicht wissenschaftlich untersucht worden, welche Maßnahmen Unternehmen ergreifen können, um die unternehmensinternen Voraussetzungen für eine erfolgreiche Steuerung von Patentaktivitäten zu schaffen. Um diese betrieblichen Faktoren zu identifizieren und deren Relevanz zu quantifizieren, wurden 2005 in einer breiten empirischen Untersuchung die aktiven Patentanmelder im deutschsprachigen Raum (über 1.000 Unternehmen) mit Hilfe eines standardisierten Fragebogens befragt. Auf der Basis von 325 auswertbaren Fragebögen (Ausschöpfungsquote 36,8 %) konnten zum einen Ergebnisse zum aktuellen Aufgabenspektrum der Patentabteilungen sowie zu deren organisatorischen und personellen Strukturen gewonnen werden. Ebenfalls wurde in dieser Status quo-Analyse der Bekanntheits- und Implementierungsgrad von Methoden und Systemen (z. B. Patentbewertungsmethoden, Patent-IT-Systeme) beleuchtet. Zum anderen wurden die betrieblichen Faktoren herausgestellt, auf die technologieorientierte Unternehmen achten sollten, um das Fundament für eine erfolgreiche Patentsteuerung zu legen. / Empirical studies have shown that patents can have a positive effect on corporate success. However, this effect does not occur by itself. Companies have to make an effort to create and to develop a sustainable patent portfolio. So far, no academic studies have investigated which actions a company can take to establish the internal conditions for successful patent management. 
To identify and to quantify the relevance of these internal factors, a study was conducted using a standardized written questionnaire with more than 1,000 patent-oriented companies in the German-speaking countries (Germany, Austria, Switzerland, Liechtenstein). In total, 325 valid questionnaires were included in the analyses; this corresponds to an above-average response rate of 36.8 %. These analyses revealed insights into the current task profile of patent departments and their organizational and personnel structures. This status quo analysis also included an investigation into the awareness and implementation level of methods and systems in use (e. g. patent evaluation methods, patent IT systems). Furthermore, the study exposed the internal determinants on which technology-oriented companies should focus to ensure successful patent management.
5

A multivariate approach to QSAR

Hellberg, Sven January 1986 (has links)
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: 1. to quantify chemical structure in terms of latent variables expressing analogy, 2. to design test series of compounds, 3. to measure biological activity and 4. to construct a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to closely correspond to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers and (d) four different families of peptides. / Diss. (sammanfattning) Umeå : Umeå universitet, 1986, härtill 8 uppsatser / digitalisering@umu
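The step of quantifying chemical structure as latent variables can be sketched as a PCA of an autoscaled descriptor block. This is an illustrative sketch with simulated descriptors, not the thesis's data: the first few principal component scores play the role of latent "principal property" scales for the compounds.

```python
import numpy as np

def principal_properties(D, k):
    """Autoscale a descriptor matrix and return the first k PC scores.

    Sketch of deriving latent scales from many possibly relevant
    descriptors considered simultaneously: each row is a compound,
    each column a descriptor."""
    Z = (D - D.mean(0)) / D.std(0, ddof=1)    # autoscaling
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T                       # scores on first k PCs

# hypothetical descriptor block: 20 compounds x 12 correlated descriptors
rng = np.random.default_rng(3)
base = rng.normal(size=(20, 3))               # 3 underlying factors
D = base @ rng.normal(size=(3, 12)) + 0.1 * rng.normal(size=(20, 12))
T = principal_properties(D, 2)
```

Because the score columns are orthogonal and ordered by explained variance, a handful of them can replace dozens of intercorrelated descriptors when designing multivariately varied test series.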
6

The application of multivariate statistical analysis and optimization to batch processes

Yan, Lipeng January 2015 (has links)
Multivariate statistical process control (MSPC) techniques play an important role in industrial batch process monitoring and control. This research illustrates the capabilities and limitations of existing MSPC technologies, with a particular focus on partial least squares (PLS). In modern industry, batch processes often operate over relatively large operating spaces, with many chemical and physical systems displaying nonlinear behaviour. However, the linear PLS model cannot predict nonlinear systems, and hence nonlinear extensions to PLS may be required. Nonlinear PLS models can be divided into Type I and Type II. In the Type I nonlinear PLS method, the observed variables are appended with nonlinear transformations. In contrast, the Type II nonlinear PLS method assumes a nonlinear relationship within the latent variable structure of the model. Type I and Type II nonlinear multi-way PLS (MPLS) models were applied to predict the endpoint value of the product in a benchmark simulation of a penicillin batch fermentation process. By analysing and comparing linear MPLS and Type I and Type II nonlinear MPLS models, the advantages and limitations of these methods were identified and summarized. Due to the limitations of Type I and II nonlinear PLS models, in this study Neural Network PLS (NNPLS) was proposed and applied to predict the final product quality in the batch process. The application of the NNPLS method is presented with comparison to the linear PLS method and to the Type I and Type II nonlinear PLS methods. Multi-way NNPLS was found to produce the most accurate results, with the added advantage that no a priori information regarding the order of the dynamics was required. The NNPLS model was also able to identify nonlinear system dynamics in the batch process. Finally, NNPLS was used to build the controller, combined with the endpoint control algorithm. 
The proposed controller was able to keep the endpoint values of penicillin and biomass concentration at their set-points.
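The Type I approach described above, appending nonlinear transformations of the observed variables and then fitting an ordinary linear PLS, can be sketched with a minimal PLS1 implementation. The data and the quadratic input-output relationship below are hypothetical, for illustration only; this is not the benchmark fermentation model.

```python
import numpy as np

def pls1_fit(X, y, a):
    """Plain PLS1 regression via NIPALS (sketch). Returns centring terms
    and the regression coefficient vector B = W (P'W)^-1 q."""
    X0, y0 = X.mean(0), y.mean()
    Xc, yc = X - X0, y - y0
    W, P, q = [], [], []
    for _ in range(a):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        p = Xc.T @ t / (t @ t)
        W.append(w); P.append(p); q.append(yc @ t / (t @ t))
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - q[-1] * t             # deflate y
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(q))
    return X0, y0, B

def pls1_predict(model, X):
    X0, y0, B = model
    return y0 + (X - X0) @ B

# hypothetical nonlinear endpoint: response is the square of one input
rng = np.random.default_rng(2)
x = rng.uniform(0.0, 2.0, (60, 1))
y = x[:, 0] ** 2 + 0.01 * rng.normal(size=60)

lin = pls1_fit(x, y, 1)                       # linear PLS on raw input
aug = pls1_fit(np.hstack([x, x ** 2]), y, 2)  # Type I: append squared term
rmse_lin = np.sqrt(np.mean((pls1_predict(lin, x) - y) ** 2))
rmse_aug = np.sqrt(np.mean((pls1_predict(aug, np.hstack([x, x ** 2])) - y) ** 2))
```

On this data the augmented model recovers the curvature that the purely linear PLS model misses, which is the motivation for the Type I extension; the Type II and NNPLS variants instead move the nonlinearity into the latent-variable inner relation.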
7

Application of multivariate regression techniques to paint: for the quantitative FTIR spectroscopic analysis of polymeric components

Phala, Adeela Colyne January 2011 (has links)
Thesis submitted in fulfilment of the requirements for the degree Master of Technology Chemistry in the Faculty of (Science) Supervisor: Professor T.N. van der Walt Bellville campus Date submitted: October 2011 / It is important to quantify polymeric components in a coating because they greatly influence the performance of a coating. The difficulty associated with analysis of polymers by Fourier transform infrared (FTIR) spectroscopy is that collinearities arise from similar or overlapping spectral features. A quantitative FTIR method with attenuated total reflectance, coupled to multivariate/chemometric analysis, is presented. It allows simultaneous quantification of three polymeric components (a rheology modifier, an organic opacifier and a styrene acrylic binder) with no prior extraction or separation from the paint. The factor-based methods partial least squares (PLS) and principal component regression (PCR) accommodate collinearities by decomposing the spectral data into smaller matrices of principal scores and loading vectors. For model building, spectral information from calibrators and validation samples at different analysis regions was incorporated. PCR and PLS were used to inspect the variation within the sample set. The PLS algorithms were found to predict the polymeric components best. The concentrations of the polymeric components in a coating were predicted with the calibration model. Three PLS models, each with different analysis regions, yielded a coefficient of correlation R2 close to 1 for each of the components. The root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP) were less than 5%. The best output was obtained where spectral features of water were included (Trial 3). The prediction residual values for the three models ranged from 2 to -2 and 10 to -10. The method allows paint samples to be analysed in pure form and opens many opportunities for other coating components to be analysed in the same way.
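The factor-based decomposition that PCR uses can be sketched in a few lines: regress the response on the scores of the first few principal components of the spectra, then map the coefficients back to the original channels. The "spectra" below are simulated mixtures, for illustration only; this is not the study's calibration data.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression sketch: regress y on the first k
    PC scores of mean-centred X, then back-transform the coefficients."""
    X0, y0 = X.mean(0), y.mean()
    Xc = X - X0
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:k].T                          # scores on first k PCs
    g = np.linalg.lstsq(T, y - y0, rcond=None)[0]
    B = Vt[:k].T @ g                           # coefficients per channel
    return X0, y0, B

# collinear "spectra": two latent constituents mixed into 50 channels
rng = np.random.default_rng(0)
C = rng.uniform(0, 1, (40, 2))                 # concentrations of 2 components
S = rng.normal(size=(2, 50))                   # pure-component "spectra"
X = C @ S + 0.01 * rng.normal(size=(40, 50))
y = C[:, 0]                                    # quantify component 1
X0, y0, B = pcr_fit(X, y, k=2)
rmsec = np.sqrt(np.mean((y0 + (X - X0) @ B - y) ** 2))
```

Although the 50 channels are heavily collinear (rank ~2 plus noise), two component scores suffice for a low calibration error, which is exactly why factor methods like PCR and PLS suit overlapping spectral features.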
8

In silico tools in risk assessment : of industrial chemicals in general and non-dioxin-like PCBs in particular

Stenberg, Mia January 2012 (has links)
Industrial chemicals in the European Union that are produced or imported in volumes above 1 tonne annually necessitate registration within REACH. A common problem concerning these chemicals is deficient information and lack of data for assessing the hazards posed to human health and the environment. Animal studies for the type of toxicological information needed are both expensive and time consuming, and to that an ethical aspect is added. Alternative methods to animal testing are thereby requested. REACH has called for an increased use of in silico tools for non-testing data, such as structure-activity relationships (SARs), quantitative structure-activity relationships (QSARs), and read-across. The main objective of the studies underlying this thesis is to explore and refine the use of in silico tools in a risk assessment context for industrial chemicals, in particular to relate properties of the molecular structure to the toxic effect of the chemical substance, using principles and methods of computational chemistry. The initial study was a survey of all industrial chemicals, from which the Industrial chemical map was created. A part of this map including chemicals of potential concern was identified. Secondly, the environmental pollutants polychlorinated biphenyls (PCBs) were examined, in particular the non-dioxin-like PCBs (NDL-PCBs). A set of 20 NDL-PCBs was selected to represent the 178 PCB congeners with three to seven chlorine substituents. The selection procedure was a combined process including statistical molecular design for a representative selection and expert judgements to be able to include congeners of specific interest. The 20 selected congeners were tested in vitro in as many as 17 different assays. The data from the screening process were turned into interpretable toxicity profiles with multivariate methods, used for investigation of potential classes of NDL-PCBs. 
It was shown that NDL-PCBs cannot be treated as one group of substances with similar mechanisms of action. Two groups of congeners were identified. One group, in general comprising lower-chlorinated congeners with a higher degree of ortho substitution, showed a higher potency in more assays (including all neurotoxic assays). A second group included abundant congeners with a similar toxic profile that might contribute to a common toxic burden. To investigate the structure-activity pattern of the PCBs' effect on DAT in rat striatal synaptosomes, ten additional congeners were selected and tested in vitro. NDL-PCBs were shown to be potent inhibitors of DAT binding. The congeners with the highest DAT-inhibiting potency were tetra- and penta-chlorinated with 2-3 chlorine atoms in ortho-position. The model was not able to distinguish the congeners with activities in the lower μM range, which could be explained by a relatively unspecific response for the lower ortho-chlorinated PCBs. / [Translated from Swedish:] The European chemicals legislation REACH stipulates that chemicals produced or imported in quantities above 1 tonne per year must be registered and risk assessed. An estimated 30,000 chemicals are affected. The problem, however, is that data and information are often insufficient for a risk assessment. Effect data have largely come from animal studies, but animal testing is both costly and time consuming, and there is also the ethical aspect. REACH has therefore called for an investigation of the possibility of using in silico tools to contribute the requested data and information. In silico roughly means "in the computer", and refers to computational models and methods used to obtain information about the properties and toxicity of chemicals. The aim of the thesis is to explore and refine the use of in silico tools to generate information for risk assessment of industrial chemicals. 
The thesis describes quantitative models developed with chemometric methods to predict the toxic effect of specific chemicals. In the first study (I), 56,072 organic industrial chemicals were examined. Multivariate methods were used to create a map of the industrial chemicals describing their chemical and physical properties. The map was used for comparisons with known and potential environmentally hazardous chemicals. The best-known environmental pollutants turned out to have similar principal properties and clustered in the map. By studying that part of the map in detail, further potentially hazardous chemical substances could be identified. Studies two to four (II-IV) focused on the environmental pollutant PCB. Twenty PCBs were selected so that they structurally and physicochemically represented the 178 PCB congeners with three to seven chlorine substituents. The toxicological effect of these 20 PCBs was examined in 17 different in vitro assays. The toxicological profiles of the 20 tested congeners were established, i.e. which have similar harmful effects and which differ. The toxicological profiles were used for classification of PCBs. Quantitative models were developed for predictions, i.e. to predetermine the effects of as yet untested PCBs, and to gain further knowledge of the structural properties that give undesirable effects in humans and nature. This information can be used in a future risk assessment of non-dioxin-like PCBs. The last study (IV) is a structure-activity study investigating the non-dioxin-like PCBs' inhibitory effect on the neurotransmitter dopamine in the brain.
9

Faktoren für eine erfolgreiche Steuerung von Patentaktivitäten: Ergebnisse einer empirischen Studie

Günther, Thomas, Moses, Heike 12 September 2006 (has links)
Empirischen Studien zufolge können Patente sich positiv auf den Unternehmenserfolg auswirken. Allerdings wirkt dieser Effekt nicht automatisch, sondern Unternehmen müssen sich um den Aufbau und die gesteuerte Weiterentwicklung eines nachhaltigen und wertvollen Patentportfolios bemühen. Bisher ist jedoch nicht wissenschaftlich untersucht worden, welche Maßnahmen Unternehmen ergreifen können, um die unternehmensinternen Voraussetzungen für eine erfolgreiche Steuerung von Patentaktivitäten zu schaffen. Um diese betrieblichen Faktoren zu identifizieren und deren Relevanz zu quantifizieren, wurden 2005 in einer breiten empirischen Untersuchung die aktiven Patentanmelder im deutschsprachigen Raum (über 1.000 Unternehmen) mit Hilfe eines standardisierten Fragebogens befragt. Auf der Basis von 325 auswertbaren Fragebögen (Ausschöpfungsquote 36,8 %) konnten zum einen Ergebnisse zum aktuellen Aufgabenspektrum der Patentabteilungen sowie zu deren organisatorischen und personellen Strukturen gewonnen werden. Ebenfalls wurde in dieser Status quo-Analyse der Bekanntheits- und Implementierungsgrad von Methoden und Systemen (z. B. Patentbewertungsmethoden, Patent-IT-Systeme) beleuchtet. Zum anderen wurden die betrieblichen Faktoren herausgestellt, auf die technologieorientierte Unternehmen achten sollten, um das Fundament für eine erfolgreiche Patentsteuerung zu legen. / Empirical studies have shown that patents can have a positive effect on corporate success. However, this effect does not occur by itself. Companies have to make an effort to create and to develop a sustainable patent portfolio. So far, no academic studies have investigated which actions a company can take to establish the internal conditions for successful patent management. 
To identify and to quantify the relevance of these internal factors, a study was conducted using a standardized written questionnaire with more than 1,000 patent-oriented companies in the German-speaking countries (Germany, Austria, Switzerland, Liechtenstein). In total, 325 valid questionnaires were included in the analyses; this corresponds to an above-average response rate of 36.8 %. These analyses revealed insights into the current task profile of patent departments and their organizational and personnel structures. This status quo analysis also included an investigation into the awareness and implementation level of methods and systems in use (e. g. patent evaluation methods, patent IT systems). Furthermore, the study exposed the internal determinants on which technology-oriented companies should focus to ensure successful patent management.
10

Relações estrutura-retenção de flavonóides por cromatografia a líquido em membranas imobilizadas artificialmente / Structure retention relationships of flavonoids by liquid chromatography using immobilized artificial membranes

Santoro, Adriana Leandra 24 August 2007 (has links)
Para um composto químico exercer seu efeito bioativo é necessário que ele atravesse várias barreiras biológicas até alcançar seu sitio de ação. Propriedades farmacocinéticas insatisfatórias (como absorção, distribuição, metabolismo e excreção) são reconhecidamente as principais causas na descontinuidade de pesquisas na busca por novos fármacos. Neste trabalho, modelos biofísicos foram utilizados para o estudo de absorção de uma série de flavonóides naturais com atividade tripanossomicida. O coeficiente cromatográfico de partição, kw, foi determinado através da cromatografia líquida de alta eficiência em fase reversa, RP-HPLC, utilizando-se de colunas cromatográficas empacotadas com constituintes básicos da membrana biológica (fosfatidilcolina e colesterol). Os resultados obtidos demonstraram que nas colunas compostas por fosfatidilcolina a retenção de flavonóides hidroxilados é determinada por interações secundárias, além da partição, e no caso da coluna de colesterol, a partição é o principal mecanismo que rege a retenção. Uma série de descritores físico-químicos foi gerada pelos campos moleculares de interações (MIFs) entre os flavonóides naturais e algumas sondas químicas virtuais, utilizando o programa GRID. Os descritores físico-químicos gerados foram correlacionados com os log kw por análise por mínimos quadrados parciais (PLS), utilizando o programa VolSurf, com a finalidade de gerar um modelo quantitativo entre estrutura e propriedade (QSPR) para esta classe de compostos. O modelo produzido por este estudo, ao utilizar os dados de partição em colesterol, log kwCol, apresentou elevada consistência interna, com bom poder de correlação (R2 = 0,97) e predição (Q2 = 0,86) para a partição destas moléculas / In order for a chemical compound to exert its bioactive effect, it must cross several biological barriers before reaching its site of action. 
Unfavourable pharmacokinetic properties (absorption, distribution, metabolism and excretion) are admittedly among the main causes of discontinued research in the search for new drugs. In this work, biophysical models were used to study the absorption of a series of natural flavonoids with trypanocidal activity. The chromatographic retention indices (log kw) were determined on immobilized artificial membrane columns (IAM.PC.DD, IAM.PC.DD2, Cholesteryl 10-Undecetonoato) by the extrapolation method. The results demonstrated that in the columns composed of phosphatidylcholine the retention of hydroxylated flavonoids is determined by secondary interactions in addition to partition. In the case of retention on the cholesterol column, partition is the main mechanism that drives retention. A series of physico-chemical descriptors was generated from the molecular interaction fields (MIFs) between the flavonoids and some virtual chemical probes, using the program GRID. The descriptors were correlated with log kw by partial least squares (PLS) regression, using the VolSurf program, with the purpose of generating a quantitative structure-retention relationship (QSRR) model for this class of compounds. The model produced by this study, when using the cholesterol partition data, log kwCol, presented high internal consistency, with good correlation power (R2 = 0.97) and prediction (Q2 = 0.86) for the partition of these molecules.
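The Q2 statistic quoted above is conventionally computed by cross-validation as 1 - PRESS/TSS. Below is a minimal leave-one-out sketch using an ordinary least squares fit on hypothetical data, not the study's VolSurf/PLS pipeline; the fit and predict functions are placeholders that any regression model could replace.

```python
import numpy as np

def q2_loo(X, y, fit, predict):
    """Leave-one-out cross-validated Q2 = 1 - PRESS / TSS (sketch)."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(X[mask], y[mask])               # refit without sample i
        press += (predict(model, X[i:i + 1])[0] - y[i]) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

def fit_ols(X, y):
    Xa = np.hstack([X, np.ones((len(X), 1))])       # add intercept column
    return np.linalg.lstsq(Xa, y, rcond=None)[0]

def predict_ols(b, X):
    return np.hstack([X, np.ones((len(X), 1))]) @ b

# hypothetical descriptor/retention data with a strong linear relationship
rng = np.random.default_rng(4)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.normal(size=30)
q2 = q2_loo(X, y, fit_ols, predict_ols)
```

Unlike R2, which is computed on the training fit, Q2 penalises models that do not generalise, which is why both are reported together for QSPR/QSRR models.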
