  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Analysis of the quality and contribution of forensic toxicology reports in the process of criminal investigation and court decision in cases involving illegal substances

Yoshida, Ricardo Luís 04 March 2015
There is an implicit recognition in the current legal scenario that material evidence requires scientific support in order to achieve the authenticity that magistrates need for making decisions. The nature of certain examinations, such as the classification of prohibited substances, requires techniques and knowledge from the natural sciences and technology. Forensic work must rely on scientific methods and apply knowledge from several areas, including forensic statistics. The present work used statistical tools to evaluate the quality and contribution of forensic reports on illegal substances and to correlate the content of these documents with the court ruling. In the first part we analyzed the information from toxicology reports on drugs, aiming to quantify the importance they might bear on court proceedings. We parsed 1008 official documents from several jurisdictions, divided into 504 sets of preliminary and final reports from the same forensic case; the objective was to evaluate a heterogeneous document set to enable a better analysis. The quantification was performed with empirical equations developed for this purpose, the method was validated by multivariate data analysis, and the methodology proved very robust. The second part applied the results of the first part and correlated them with the court ruling. We thoroughly examined 167 first-instance rulings that contained the reports studied in the first part, using Bayesian inference. The results indicated that forensic reports were always essential in this type of court proceeding. The quality of the documents was rated good to excellent, as measured by the parameter "relevance of the report". Some aspects of the documents could be improved; for instance, photographs of the seized material and/or images of the laboratory analytical procedures could be included. These studies allowed us to establish a cut-off value for the quantification of report quality, above which there was 100% agreement between the report and the court decision in cases where the suspect was convicted as a trafficker. Finally, the proposed methodology showed good potential and could be applied to other kinds of forensic cases, such as homicides, suicides and other investigations.
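The abstract above reports a Bayesian analysis of agreement between forensic reports and rulings but does not state the model used. As a minimal hedged sketch of that kind of inference, the following Beta-Binomial update estimates the probability that a ruling agrees with the report; the prior, counts and model choice are all illustrative assumptions, not the thesis's data.

```python
# Hypothetical sketch: Beta-Binomial posterior for the probability that a
# court ruling agrees with the forensic report. Counts are invented.

def beta_binomial_posterior(agreements, trials, prior_a=1.0, prior_b=1.0):
    """Update a Beta(prior_a, prior_b) prior with `agreements` out of `trials`."""
    a = prior_a + agreements
    b = prior_b + (trials - agreements)
    mean = a / (a + b)                                   # posterior mean
    var = a * b / ((a + b) ** 2 * (a + b + 1))           # posterior variance
    return a, b, mean, var

# Illustrative numbers only: 160 of 167 rulings agreeing with the report.
a, b, mean, var = beta_binomial_posterior(160, 167)
```

With a flat prior the posterior mean is simply (agreements + 1) / (trials + 2), so the sketch needs nothing beyond plain Python arithmetic.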
32

Experimental Designs at the Crossroads of Drug Discovery

Olsson, Ing-Marie January 2006
New techniques and approaches for organic synthesis, purification and biological testing are enabling pharmaceutical industries to produce and test increasing numbers of compounds every year. Surprisingly, this has not led to more new drugs reaching the market, prompting two questions: why is there not a better correlation between their efforts and output, and can it be improved? One possible way to make the drug discovery process more efficient is to ensure, at an early stage, that the tested compounds are diverse, representative and of high quality. In addition, the biological evaluation systems have to be relevant and reliable. The diversity of the tested compounds could be ensured, and the reliability of the biological assays improved, by using Design of Experiments (DOE) more frequently and effectively. However, DOE currently offers insufficient options for these purposes, so there is a need for new, tailor-made DOE strategies. The aim of the work underlying this thesis was to develop and evaluate DOE approaches for diverse compound selection and efficient assay optimisation. This resulted in the publication of two new DOE strategies, D-optimal Onion Design (DOOD) and Rectangular Experimental Designs for Multi-Unit Platforms (RED-MUP), both of which are extensions to established experimental designs.

D-optimal Onion Design is an extension of D-optimal design: the set of candidate objects is divided into layers, and D-optimal selection is applied to each layer. DOOD enables model-based, but not model-dependent, selections in discrete spaces, since the selections are based not only on the D-optimality criterion but are also guided by the experimenter's prior knowledge and specific needs. Hence, DOOD selections provide controlled diversity.

Assay development and optimisation can be a major bottleneck restricting the progress of a project. Although DOE is a recognised tool for optimising experimental systems, there has been widespread unwillingness to use it for assay optimisation, mostly because of the difficulties involved in performing experiments according to designs in 96-, 384- and 1536-well formats. The RED-MUP framework maps classical experimental designs orthogonally onto rectangular experimental platforms, which facilitates the execution of DOE on these platforms and hence provides an efficient tool for assay optimisation.

In combination, these two strategies can help researchers identify the best routes to take at the crossroads linking the biological and chemical elements of drug discovery programs, and can lead to higher information content in the data obtained from biological evaluations, providing essential information for well-grounded decisions about the future of a project.
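The layered selection idea behind DOOD can be sketched in a few lines: split the candidate set into "onion" layers and run a D-optimal selection inside each layer. The sketch below uses a simple greedy determinant-maximising selection as a stand-in for a real D-optimal algorithm, with random candidates; it is an illustration of the principle, not the published method.

```python
# Sketch of layered D-optimal selection: candidates are split into layers by
# distance from the centre, then a greedy det-maximising pick runs per layer.
import numpy as np

def greedy_d_optimal(X, k):
    """Greedily pick k row indices of X maximising det(X_sel.T @ X_sel)."""
    chosen = []
    for _ in range(k):
        best, best_det = None, -1.0
        for i in range(len(X)):
            if i in chosen:
                continue
            S = X[chosen + [i]]
            # Small ridge keeps the determinant defined for few rows.
            det = np.linalg.det(S.T @ S + 1e-9 * np.eye(X.shape[1]))
            if det > best_det:
                best, best_det = i, det
        chosen.append(best)
    return chosen

rng = np.random.default_rng(0)
candidates = rng.standard_normal((60, 3))          # invented candidate set
radii = np.linalg.norm(candidates, axis=1)
layers = [candidates[radii < 1.5], candidates[radii >= 1.5]]
selection = [layer[greedy_d_optimal(layer, 4)] for layer in layers]  # 4 per layer
```

Selecting per layer, rather than over the whole pool, is what yields the "controlled diversity" described above: inner layers contribute typical compounds, outer layers extreme ones.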
33

Multivariate profiling of metabolites in human disease : Method evaluation and application to prostate cancer

Thysell, Elin January 2012 (has links)
There is an ever-increasing need for new technologies that can identify molecular markers for the early diagnosis of fatal diseases, allowing efficient treatment. In addition, there is great value in finding patterns of metabolites, proteins or genes altered in relation to specific disease conditions, to gain a deeper understanding of the underlying mechanisms of disease development. If successful, scientific achievements in this field could, apart from early diagnosis, lead to the development of new drugs, treatments or preventions for many serious diseases. Metabolites are low-molecular-weight compounds involved in the chemical reactions that take place in the cells of living organisms to uphold life, i.e. metabolism. The research field of metabolomics investigates the relationship between metabolite alterations and biochemical mechanisms, e.g. disease processes. To understand these associations, hundreds of metabolites present in a sample are quantified using sensitive bioanalytical techniques. In this way a unique chemical fingerprint is obtained for each sample, providing an instant picture of the current state of the studied system. This fingerprint can then be used for the discovery of biomarkers, or biomarker patterns, of biological and clinical relevance. This thesis focuses on the evaluation and application of strategies for studying metabolic alterations in human tissues associated with disease. A chemometric methodology for processing and modeling gas chromatography-mass spectrometry (GC-MS) based metabolomics data is designed for developing predictive systems covering the generation of representative data, validation and result verification, and the diagnosis and screening of large sample sets. The developed strategies were applied specifically to identify metabolite markers and metabolic pathways associated with prostate cancer disease progression. The long-term goal was to detect new, sensitive diagnostic and prognostic markers that could ultimately be used to differentiate between indolent and aggressive tumors at diagnosis and thus aid in the development of personalized treatments. Our main finding so far is the detection of high levels of cholesterol in prostate cancer bone metastases. Combined with previously presented results, this suggests cholesterol as a potentially interesting therapeutic target for advanced prostate cancer. Furthermore, we detected metabolic alterations in plasma associated with metastasis development. These results were further explored in prospective samples in an attempt to verify some of the identified metabolites as potential prognostic markers.
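The "chemical fingerprint" modelling described above typically starts with principal component analysis of a samples-by-metabolites matrix. The following sketch, on simulated data (not the thesis's measurements), shows that step: mean-centre the matrix and project it onto its leading components via SVD.

```python
# Sketch of the PCA step in chemometric fingerprinting, on simulated data:
# 30 "samples" by 100 "metabolites", with one group shifted to mimic disease.
import numpy as np

def pca_scores(X, n_components=2):
    """Return PCA scores and explained-variance ratios for data matrix X."""
    Xc = X - X.mean(axis=0)                        # mean-centre each metabolite
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    explained = (s ** 2) / (s ** 2).sum()          # variance per component
    return scores, explained[:n_components]

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 100))                 # simulated metabolite levels
X[:15] += 2.0                                      # simulated group shift
scores, explained = pca_scores(X)
```

Plotting the first two score columns would show the two sample groups separating along the first component, which is the visual "fingerprint" comparison such studies rely on.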
34

Nonionic surfactants : A multivariate study

Uppgård, Lise-Lott January 2002
In this thesis, technical nonionic surfactants are studied using multivariate techniques. The surfactants studied were alkyl ethoxylates (AEOs) and alkyl polyglucosides (APGs). The aquatic toxicity of the surfactants towards two organisms, a shrimp and a rotifer, was examined; the specified effect was lethality (LC50), as indicated by immobilisation. In a comparative study, the LC50 values obtained were used to develop two different types of model: in the log P model the toxicity was correlated to log P alone, while in the multivariate model several physicochemical variables, including log P, were correlated to the toxicity. The multivariate model gave smaller prediction errors than the log P model. Further, the change in reactivity when a surfactant mixture was added to dissolving pulp under alkaline conditions was studied, using the amount of residual cellulose as a measure of reactivity. Ten AEO/APG mixtures were tested, and the mixture with the greatest potential was studied in more detail. An optimum in the amount of added surfactant was found that, according to surface tension measurements, seems to coincide with the critical micelle concentration (CMC).
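The comparison above, a one-descriptor log P model versus a multivariate model, can be illustrated with leave-one-out prediction error on simulated data. Everything below (descriptor names, coefficients, noise level) is invented for the sketch; only the modelling pattern follows the abstract.

```python
# Sketch: compare a univariate log P toxicity model with a multivariate model
# (log P plus a second invented descriptor) by leave-one-out RMSE.
import numpy as np

def rmse_loo(X, y):
    """Leave-one-out RMSE of an ordinary least-squares model with intercept."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([np.ones(mask.sum()), X[mask]])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = np.concatenate([[1.0], X[i]]) @ coef
        errs.append(y[i] - pred)
    return float(np.sqrt(np.mean(np.square(errs))))

rng = np.random.default_rng(2)
logp = rng.uniform(1, 5, 40)
hlb = rng.uniform(8, 16, 40)                          # hypothetical 2nd descriptor
y = 1.5 * logp + 0.4 * hlb + rng.normal(0, 0.2, 40)   # simulated response
err_logp = rmse_loo(logp[:, None], y)
err_multi = rmse_loo(np.column_stack([logp, hlb]), y)
```

Because the simulated response depends on both descriptors, the multivariate model's cross-validated error comes out smaller, mirroring the thesis's qualitative finding.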
35

Multiresolutional partial least squares and principal component analysis of fluidized bed drying

Frey, Gerald M. 14 April 2005
Fluidized bed dryers are used in the pharmaceutical industry for the batch drying of pharmaceutical granulate. Maintaining optimal hydrodynamic conditions throughout the drying process is essential to product quality. Due to the complex interactions inherent in the fluidized bed drying process, mechanistic models capable of identifying these optimal modes of operation are either unavailable or limited in their capabilities. Therefore, empirical models based on experimentally generated data are relied upon to study these systems.

Principal Component Analysis (PCA) and Partial Least Squares (PLS) are multivariate statistical techniques that project data onto the linear subspaces most descriptive of the variance in a dataset. By modeling data in terms of these subspaces, a more parsimonious representation of the system is possible. In this study, PCA and PLS are applied to data collected from a fluidized bed dryer containing pharmaceutical granulate.

System hydrodynamics were quantified in the models using high-frequency pressure fluctuation measurements. These pressure fluctuations have previously been identified as a characteristic variable of hydrodynamics in fluidized bed systems; as such, contributions from the macroscale, mesoscale, and microscale of motion are encoded in the signals. A multiresolutional decomposition using a discrete wavelet transformation was used to resolve these signals into components more representative of the individual scales before modeling the data.

The combination of multiresolutional analysis with PCA and PLS was shown to be an effective approach for modeling the conditions in the fluidized bed dryer. Datasets from both steady state and transient operation of the dryer were analyzed. The steady state dataset contained measurements made on a bed of dry granulate, and the transient dataset consisted of measurements taken during the batch drying of granulate from approximately 33 wt.% moisture to 5 wt.%. Correlations involving several scales of motion were identified in both studies.

In the steady state study, deterministic behavior related to superficial velocity, pressure sensor position, and granulate particle size distribution was observed in the PCA model parameters. It was determined that these properties could be characterized solely from the high-frequency pressure fluctuation data. Macroscopic hydrodynamic characteristics such as bubbling frequency and fluidization regime were identified in the low-frequency components of the pressure signals, and the particle-scale interactions of the microscale were shown to be correlated to the highest-frequency signal components. PLS models were able to characterize the effects of superficial velocity, pressure sensor position, and granulate particle size distribution in terms of the pressure signal components. Additionally, it was determined that statistical process control charts capable of monitoring the fluid bed hydrodynamics could be constructed using PCA.

In the transient drying experiments, deterministic behaviors related to inlet air temperature, pressure sensor position, and initial bed mass were observed in the PCA and PLS model parameters. The lowest-frequency component of the pressure signal was found to be correlated to the overall temperature effects during the drying cycle. As in the steady state study, bubbling behavior was also observed in the low-frequency components of the pressure signal. PLS was used to construct an inferential model of granulate moisture content, which proved capable of predicting the moisture throughout the drying cycle. Preliminary statistical process control models were constructed to monitor the fluid bed hydrodynamics throughout the drying process. These models show promise but will require further investigation to better determine their sensitivity to process upsets.

In addition to the PCA and PLS analyses, Multiway Principal Component Analysis (MPCA) was used to model the drying process. Several key states related to the mass transfer of moisture and changes in temperature throughout the drying cycle were identified in the MPCA model parameters. It was determined that the mass transfer of moisture throughout the drying process affects all scales of motion and overshadows other hydrodynamic behaviors found in the pressure signals.
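The multiresolutional decomposition described above can be illustrated with the simplest discrete wavelet, the Haar transform: one level splits a pressure-fluctuation signal into a low-frequency approximation (bubbling-scale motion) and a high-frequency detail (particle-scale motion). The signal below is simulated, not the thesis's measurements, and the thesis's actual wavelet choice is not stated here.

```python
# One-level Haar DWT on a simulated pressure signal: low- and high-frequency
# content end up in the approximation and detail halves, respectively.
import numpy as np

def haar_dwt(signal):
    """One-level Haar DWT: return (approximation, detail) at half length."""
    x = np.asarray(signal, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)     # low-frequency content
    detail = (even - odd) / np.sqrt(2.0)     # high-frequency content
    return approx, detail

t = np.arange(1024) / 1024.0
slow = np.sin(2 * np.pi * 4 * t)             # bubbling-like low frequency
fast = 0.1 * np.sin(2 * np.pi * 200 * t)     # particle-scale high frequency
approx, detail = haar_dwt(slow + fast)
```

Because the Haar transform is orthonormal, the signal energy is conserved across the two bands; applying the transform recursively to `approx` yields the multi-level decomposition whose bands were fed to PCA and PLS in the study.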
38

A multivariate approach to QSAR

Hellberg, Sven January 1986
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: (1) quantifying chemical structure in terms of latent variables expressing analogy, (2) designing test series of compounds, (3) measuring biological activity, and (4) constructing a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously, in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to correspond closely to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers and (d) four different families of peptides. / Diss. (summary) Umeå: Umeå universitet, 1986, together with 8 papers.
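The PLS projection method named above can be sketched in its simplest one-component form: find the descriptor direction with maximal covariance to the activity, then regress the activity on the resulting latent scores. The descriptors and activities below are simulated, not the thesis's QSAR data.

```python
# One-component PLS regression (NIPALS-style) on simulated QSAR-like data:
# 50 "compounds" described by 8 descriptors, activity driven by two of them.
import numpy as np

def pls1_one_component(X, y):
    """Fit a single PLS component; return weights w, scores t and loading q."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector: max-covariance direction
    t = Xc @ w                      # latent scores for each compound
    q = (t @ yc) / (t @ t)          # regress activity on the scores
    return w, t, q

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 8))
y = X @ np.array([1.0, -0.5, 0, 0, 0, 0, 0, 0]) + rng.normal(0, 0.1, 50)
w, t, q = pls1_one_component(X, y)
pred = t * q + y.mean()
```

The latent score vector `t` is the kind of "latent variable expressing analogy" the abstract refers to: one derived axis summarising many raw descriptors.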
39

Process development for the production of a therapeutic Affibody® Molecule

Fridman, Belinda January 2014
Recently, HER3, a member of the epidermal growth factor receptor (EGFR) family, has been found to play a crucial role in the development of resistance towards inhibitors given to patients with HER1- and HER2-driven cancers. As HER3 is up-regulated or over-activated in several types of human cancer, it is of utmost importance that new, innovative drugs target its oncologic activity. The Affibody® Molecule Z08698 inhibits the heregulin-induced signalling of HER3 with high affinity (KD ~50 pM). Because the Affibody® Molecule is small, highly soluble and has outstanding folding kinetics, effective penetration of tumour tissue is expected, together with a rationalized manufacturing process. Further coupling to an albumin binding domain (ABD) extends the plasma half-life of the molecule, increasing its potential to serve as a therapeutic. A production process for Z08698-VDGS-ABD094 has been established, in which the molecule is efficiently produced in the E. coli host strain BL21(DE3) through a T7-based expression system. Cultivations were performed as fed-batch fermentations, and the conditions were further optimized to obtain the highest expression while avoiding undesirable modifications such as gluconoylation. By employing Design of Experiments in combination with multivariate data analysis, a production process yielding ~3.5 g product per litre of culture could be verified. Moreover, thermolysis was evaluated as a suitable method for cell disruption, enabling an easy and cost-effective manufacturing process for the ABD-fused Affibody® Molecule.
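The Design of Experiments step mentioned above can be sketched with a two-level full-factorial design: enumerate all high/low combinations of a few fermentation factors and estimate each factor's main effect on the response. The factor names and response numbers below are illustrative assumptions, not the thesis's conditions or results.

```python
# Sketch of a 2^3 full-factorial DOE for hypothetical fermentation factors,
# with main-effect estimation from simulated responses.
from itertools import product

factors = ["temperature", "pH", "feed_rate"]          # hypothetical factors
design = [dict(zip(factors, levels)) for levels in product((-1, +1), repeat=3)]

# Simulated responses (e.g. product titre, g/l) for the 8 runs, in design order.
response = [2.1, 2.4, 2.0, 2.6, 3.0, 3.4, 3.1, 3.6]

def main_effect(design, response, factor):
    """Average response at the +1 level minus average at the -1 level."""
    hi = [r for run, r in zip(design, response) if run[factor] == +1]
    lo = [r for run, r in zip(design, response) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {f: main_effect(design, response, f) for f in factors}
```

Ranking the main effects shows which factor most influences the titre; in a real optimisation this screening step would be followed by a response-surface design around the promising settings.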
40

Knowledge management: an analysis of the automotive sector based on organizational contextual factors

Gonzalez, Rodrigo Valio Dominguez 29 June 2011
The latest models of Knowledge Management (KM) recognize four stages in its practice: acquisition, storage, distribution and use of knowledge, which presents itself in tacit and explicit forms. KM is a multidisciplinary field, involving both technical and social aspects, and given the diversity of issues surrounding the topic, it is essential to delimit its study. The delimitation proposed for this research is social and coordinative, involving the identification of contextual factors, developed internally in organizations, that support the KM process in automotive companies. These contextual factors are defined on the basis of organizational aspects that enable the four phases of KM, notably the development of human resources, organizational culture, teamwork, organizational structure, and the way knowledge is developed and absorbed internally. This thesis also identifies groupings (clusters) of companies with similar characteristics regarding the development of these contextual factors, analyzing the implications of the clusters' features for KM and for the use of knowledge. The automotive companies that are the focus of this research can be characterized as mature in improvement and innovation processes, so knowledge management is fundamental to the sustainability of these activities. To achieve this goal, the research uses a quantitative method based on a survey, and the data are analyzed using the multivariate statistical techniques of factor analysis, cluster analysis and discriminant analysis. As for the results, eight contextual factors that support the KM process were identified, along with four clusters of companies according to the development of these factors: Innovative, Explorative, Exploitative and Laggard. The companies in the first two groups have a better capacity to absorb knowledge and to build a broader base of primary knowledge, making them more innovative; the other two groups take a more reactive posture within the industry, following the development proposed by the companies of the first two clusters. In the latter two groups, particularly the Exploitative one, the development of factors related to incremental improvement and teamwork stands out, indicating a tendency to use knowledge to improve production efficiency.
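The cluster-analysis step described above can be illustrated with a basic k-means loop over simulated factor scores; the study's actual clustering algorithm is not specified here, and the data below (two well-separated company groups scored on eight factors) are invented for the sketch.

```python
# Illustrative k-means clustering of simulated "company" factor scores,
# with farthest-point initialisation so the demo is deterministic.
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means; returns labels and centroids."""
    centroids = [X[0]]
    for _ in range(k - 1):            # farthest-point initialisation
        dists = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[dists.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):            # assign, then update centroids
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(4)
# Two simulated groups of 20 companies with high vs low factor scores.
X = np.vstack([rng.normal(2.0, 0.3, (20, 8)), rng.normal(-2.0, 0.3, (20, 8))])
labels, centroids = kmeans(X, k=2)
```

With real survey data the factor scores would first come from factor analysis, and a discriminant analysis would then check how well the resulting clusters separate, mirroring the three-technique pipeline the abstract describes.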
