231

Nya betalmedel : Hur accepterat är egentligen Bitcoin?

Winai, Elsa; Lundgren, Joar January 2023 (has links)
Bitcoin is a technology that has become increasingly established in Swedish society. Although many people know about the technology, very few actually use it. This study investigates what affects the Swedish public's acceptance of Bitcoin using the Technology Acceptance Model (TAM), extended with financial risk and with a focus on how demographic variables (external factors) influence acceptance. A quantitative survey was conducted using a web questionnaire, in which a total of 204 respondents participated. The results were analyzed using PLS-SEM, which showed that perceived usefulness had a significant positive impact on an individual's attitude towards Bitcoin. Furthermore, the study showed a significant positive link between perceived ease of use and perceived usefulness of Bitcoin. However, neither the tested external factors nor the added financial-risk construct had a significant impact on the results.
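As a rough illustration of the kind of analysis this abstract describes, here is a minimal Python sketch of TAM-style path estimation. It is a simplified stand-in for full PLS-SEM, not the study's actual procedure: composites are taken as first PLS components and paths estimated by standardized regression, with all item data simulated and all names hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 204  # same sample size as the study; all item values below are simulated
peou = rng.normal(size=(n, 3))                       # perceived ease-of-use items
pu = 0.5 * peou @ rng.normal(size=(3, 3)) + rng.normal(size=(n, 3))  # usefulness items
att = pu.mean(axis=1, keepdims=True) + rng.normal(scale=0.5, size=(n, 1))  # attitude

def standardize(a):
    return (a - a.mean(axis=0)) / a.std(axis=0)

# Composite scores: first PLS component of each indicator block against its target.
pu_score = PLSRegression(n_components=1).fit(pu, att).transform(pu)
peou_score = PLSRegression(n_components=1).fit(peou, pu_score).transform(peou)

# Structural paths as standardized regressions (PEOU -> PU, PU -> ATT).
peou_to_pu = LinearRegression().fit(standardize(peou_score), standardize(pu_score))
pu_to_att = LinearRegression().fit(standardize(pu_score), standardize(att))
print("PEOU -> PU path:", peou_to_pu.coef_.ravel())
print("PU -> ATT path:", pu_to_att.coef_.ravel())
```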
232

Monitoring Kraft Recovery Boiler Fouling by Multivariate Data Analysis

Edberg, Alexandra January 2018 (has links)
This work deals with fouling in the recovery boiler at Montes del Plata, Uruguay. Multivariate data analysis was used to analyze the large amount of available data in order to investigate how different parameters affect the fouling problems. Principal Component Analysis (PCA) and Partial Least Squares (PLS) projection were used: PCA to compare average values between time periods with high and low fouling problems, and PLS to study the correlation structures between the variables and thereby indicate which parameters might be changed to improve the availability of the boiler. The results show that this recovery boiler tends to have fouling problems that may depend on the distribution of air, the black liquor pressure, or the dry solids content of the black liquor. The results also show that multivariate data analysis is a powerful tool for analyzing these types of fouling problems.
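A minimal sketch of this two-step approach, assuming scikit-learn and a simulated process data matrix; the six variables and the fouling indicator are illustrative, not the thesis data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))  # e.g. air flows, liquor pressure, dry solids, ...
y = X @ np.array([0.8, 0.1, 0.0, 0.5, 0.0, 0.2]) + rng.normal(scale=0.5, size=500)

Xs = StandardScaler().fit_transform(X)

# PCA: do high- and low-fouling periods separate in the score plot?
scores = PCA(n_components=2).fit_transform(Xs)
high = scores[y > np.median(y)].mean(axis=0)
low = scores[y <= np.median(y)].mean(axis=0)
print("mean PCA scores, high vs low fouling:", high, low)

# PLS: regression coefficients indicate which variables track the fouling indicator.
pls = PLSRegression(n_components=2).fit(Xs, y)
ranking = np.argsort(-np.abs(pls.coef_.ravel()))
print("variables ranked by |PLS coefficient|:", ranking)
```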
233

An Evaluation of Technological, Organizational and Environmental Determinants of Emerging Technologies Adoption Driving SMEs’ Competitive Advantage

Dobre, Marius January 2022 (has links)
This research evaluates the technological, organizational, and environmental determinants of emerging technologies adoption, represented by Artificial Intelligence (AI) and the Internet of Things (IoT), driving SMEs' competitive advantage within a resource-based view (RBV) theoretical approach supported by the technological-organizational-environmental (TOE) framework. Current literature on SME competitive advantage as an outcome of emerging technologies in the technological, organizational, and environmental contexts presents models focused on these contexts' individual components. No models in the literature represent the TOE framework as an integrated structure with gradual levels of complexity that allows incremental evaluation of the business context in support of decision making on emerging technologies adoption for competitive advantage. This research gap is addressed by introducing a new concept, IT resource-based renewal, underpinned by the RBV and supported by the TOE framework, to provide a holistic understanding of SMEs' strategic renewal decisions through information technology. This is achieved through a complex measurement model with four-level constructs, leading to a parsimonious structural model that evaluates the relationships between IT resource-based renewal and emerging technologies adoption driving SMEs' competitive advantage. The model confirms the positive association between IT resource-based renewal and emerging technologies adoption, and between IT resource-based renewal and SME competitive advantage, in the SME managers' model; the SME owners' model outcomes do not support emerging technologies adoption driving SME competitive advantage. PLS-SEM is used as the methodology for its ability to assess complex paths among model variables. The analysis covers three models (one for the full sample and two for owners and managers, respectively, as SME decision makers), with data collected through a web-based survey in Canada, the UK, and the US that provided 510 usable answers. The theoretical contribution is the IT resource-based renewal concept, which integrates the RBV perspective and the TOE framework to support organizations' decisions on adopting emerging technologies for competitive advantage. As practical implications, this thesis provides SMEs with a reference framework for adopting emerging technologies, offering SME managers and owners a comprehensive model of the hierarchical factors contributing to competitive advantage gained from AI and IoT adoption. This research makes an original contribution to the enterprise management, information systems adoption, and SME competitive advantage literature, with an empirical approach that verifies a model of the determinants of emerging technologies adoption driving SMEs' competitive advantage.
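A small sketch of the bootstrap resampling that PLS-SEM tools commonly use to judge whether a structural path such as these is significant; the construct scores and names below are simulated, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 510  # usable answers reported in the thesis; values here are simulated
renewal = rng.normal(size=n)                              # IT resource-based renewal score
adoption = 0.4 * renewal + rng.normal(scale=0.9, size=n)  # technology adoption score

def path_coef(x, y):
    # Standardized slope of a simple regression equals the Pearson correlation.
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(x @ y) / len(x)

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)  # resample respondents with replacement
    boot.append(path_coef(renewal[idx], adoption[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"renewal -> adoption: {path_coef(renewal, adoption):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```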
234

Quality by Design through multivariate latent structures

Palací López, Daniel Gonzalo 14 January 2019 (has links)
The present Ph.D. thesis is motivated by the growing need in most companies, especially (but not solely) those in the pharmaceutical, chemical, food, and bioprocess fields, to increase the flexibility of their operating conditions in order to reduce production costs while maintaining or even improving the quality of their products. To this end, the thesis focuses on applying the concepts of Quality by Design to the exploitation and extension of existing methodologies and to the development of new algorithms for the proper implementation of tools for the design of experiments, multivariate data analysis, and process optimization, especially (but not only) in the context of mixture design.

Part I - Preface, where a summary of the research work, its main goals, and their justification are presented. The most relevant concepts for subsequent chapters are also introduced, such as design of experiments and latent variable-based multivariate data analysis techniques.

Part II - Mixture design optimization, which reviews existing mixture design tools for the design of experiments and data analysis via traditional approaches, as well as some latent variable-based techniques such as Partial Least Squares (PLS). A kernel-based extension of PLS for mixture design data analysis is also proposed, and the available methods are compared to each other. Finally, the software MiDAs is briefly presented; it was developed to let users easily approach mixture design problems, construct designs of experiments, analyse the data with different methods, and compare them.

Part III - Design space and optimization through the latent space, which addresses one of the fundamental issues within the Quality by Design philosophy: the definition of the so-called 'design space', i.e. the subspace comprising all possible combinations of process operating conditions, raw materials, etc. that guarantee a product meeting the required quality standard. The proper formulation of the optimization problem is also tackled, not only as a tool for quality improvement but also for exploring and flexibilising production processes, in order to establish an efficient and robust optimization method suited to the different problems that may require such optimization.

Part IV - Epilogue, where final conclusions are drawn, future perspectives are suggested, and annexes are included. / Palací López, DG. (2018). Quality by Design through multivariate latent structures [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/115489
235

Flexibility through Information Sharing : Evidences from the Automotive Industry in Sweden

Dwaikat, Nidal January 2016 (has links)
Research has validated the contribution of information sharing to performance improvement. It has also suggested that flexibility is a highly important competitive priority for companies facing volatile demand. Several studies argue that flexibility is a key enabler of supply chain responsiveness. However, the impact of information sharing on supplier flexibility is still unexplored, especially for companies operating in agile business environments such as the automotive industry, where flexibility is a strategic requirement for managing demand uncertainty. In agile supply chains, such as in the automotive industry, information sharing can play an important role in responding to demand variability. In such settings, demand volumes generally fluctuate and hence create production-scheduling problems for upstream suppliers such as first-tier suppliers. Interestingly, the impact of demand fluctuations is higher on suppliers than on Original Equipment Manufacturers (OEMs). The aim of this doctoral thesis is to investigate the role of information sharing between OEMs and first-tier suppliers in enhancing supplier flexibility. In particular, the research explores the relationship between sharing demand schedules and inventory data, and volume and delivery flexibility. The question of whether information sharing between OEMs and first-tier suppliers affects supplier flexibility remains unanswered, and the following research questions emerged: RQ1: How does information sharing between OEMs and first-tier suppliers affect the latter's responsiveness to fluctuating demand? RQ2: What is the relationship between information sharing of OEMs' demand forecasts and inventory data, and suppliers' volume and delivery flexibility? RQ3: What factors should OEMs consider to improve the sharing of demand forecasts with suppliers? The empirical part of this thesis comprises three individual studies that constitute the empirical foundations of the research problem. Each study analyzes one research question using its own methodological approach; applying different research methods is advantageous because it allows for methodological rigour. This thesis contributes to the body of knowledge along three dimensions: theory, method, and context. First, it contributes to the academic field of operations and supply chain management by developing a model that explains how information sharing can affect suppliers' delivery performance. The model provides a measurement scale for the level of information sharing between OEMs and suppliers and its impact on suppliers' delivery flexibility. Second, it contributes to methods by using a state-of-the-art technique, partial least squares structural equation modeling (PLS-SEM) including consistent PLS, and applying advanced concepts to empirically test the proposed model. Third, it has a managerial contribution in examining information sharing and flexibility at the supplier level; investigating the problem at this level may enable managers to improve short-term decisions, such as production scheduling, internal production, and inventory processes, and to evaluate collaboration practices with OEMs.
This doctoral thesis is organized as a monograph comprising five chapters: Introduction, Literature review, Methodology, Empirics, and Conclusion. Several scientific articles have emerged from this thesis and have been submitted for publication in peer-reviewed journals and international conferences in the field of operations and supply chain management; these articles are listed and appended at the end of the dissertation.
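Since the thesis emphasises its measurement scale, here is a small sketch of the reliability checks such survey scales typically undergo before structural modelling; the construct, item count, and data are simulated and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
truth = rng.normal(size=200)  # latent construct, e.g. a "volume flexibility" score
items = np.column_stack([truth + rng.normal(scale=0.6, size=200) for _ in range(4)])

def cronbach_alpha(x):
    # Classic internal-consistency estimate over k items.
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

# Composite reliability from loadings on the first principal component.
z = (items - items.mean(axis=0)) / items.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
loadings = np.abs(vt[0]) * s[0] / np.sqrt(len(z))  # item-component correlations
cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + (1 - loadings ** 2).sum())
print(f"Cronbach's alpha: {cronbach_alpha(items):.3f}")
print(f"composite reliability: {cr:.3f}")
```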
236

Optimering i organisk syntes : betingelser, system, syntesvägar

Hansson, Lars January 1990 (has links)
This thesis deals with different optimization problems encountered in organic synthesis. The use of response surface, sequential simplex, and PLS techniques for simultaneous optimization of yield and suppression of side reactions is investigated, illustrated by an example of enamine synthesis where a side reaction was a serious problem. The problem of efficient screening to find suitable catalysts and solvents for new reactions is also investigated; here, the use of principal properties as a selection criterion is demonstrated with a new process for the silylation of α,β-unsaturated ketones, and the extension of the new method to bis-silylation of 1,2- and 1,3-diketones is demonstrated. The total synthesis of (±)-geosmin is investigated with an approach aimed at reducing the number of steps involved. The suggested strategy is to find solvents compatible across several transformations in the sequence in order to accomplish one-pot multistep reactions. In this context an improved method for the preparation of 1,10-dimethyl-1(9)-octalone-2 was established, and a comparison with previously reported total syntheses of (±)-geosmin was made.
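For intuition about the sequential simplex idea, a sketch using scipy's Nelder-Mead simplex on a made-up yield surface in temperature and reagent equivalents; in the thesis this kind of search would be driven by real experiments, and the response function below is entirely hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def negative_yield(x):
    temp, equiv = x
    # Hypothetical response surface: optimum near 65 C and 1.5 equivalents,
    # with a logistic penalty mimicking a side reaction at high temperature.
    main = 90 * np.exp(-((temp - 65) / 20) ** 2 - ((equiv - 1.5) / 0.8) ** 2)
    side = 15 / (1 + np.exp(-(temp - 80) / 4))
    return -(main - side)  # minimize the negative yield

res = minimize(negative_yield, x0=[40.0, 1.0], method="Nelder-Mead")
print("best conditions (T, equiv):", res.x, "predicted yield:", -res.fun)
```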
237

Nova metodologia para o desenvolvimento de inferências baseadas em dados

Fleck, Thiago Dantas January 2012 (has links)
Soft-sensors have several applications in the chemical process industry and are essential for the success of advanced control projects. Control performance is always tied to the performance of the soft-sensor, so it is important to maintain its quality over time. In this work, a new methodology is suggested for developing data-based soft-sensors following a segmented approach in order to ease their maintenance. The proposal is to model the stationary part separately from the dynamic part, unlike the traditional methodology where the dynamic model is generated directly from process data. The stationary model is obtained by a PLS (Partial Least Squares) regression, while the dynamics are inserted afterwards using an optimization algorithm. The technique is applied to a distillation column, and its performance is similar to that of dynamic and static soft-sensors developed with traditional methods. Other steps in soft-sensor development are also investigated: for variable selection, statistical methods are compared with exhaustive search over all candidate sets, and the latter should be used as the default, since computational cost is no longer a problem. Best practices in data pre-processing, gas chromatograph dead-time removal, and steady-state detection are also presented.
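A minimal sketch of the segmented idea: a stationary PLS model fitted on (assumed) steady-state data, then a first-order dynamic lag fitted afterwards by least squares. The data, tags, and the true filter coefficient are all simulated, and the single-parameter filter is only one simple way to "insert the dynamics".

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
t = np.arange(400)
X = np.column_stack([np.sin(t / 30), np.cos(t / 45), rng.normal(size=400)])
y_static = X @ np.array([2.0, -1.0, 0.3])  # steady-state relationship

def first_order(y, alpha):
    # Discrete first-order lag: out[i] = alpha*out[i-1] + (1-alpha)*y[i].
    out = np.empty_like(y)
    out[0] = y[0]
    for i in range(1, len(y)):
        out[i] = alpha * out[i - 1] + (1 - alpha) * y[i]
    return out

y_meas = first_order(y_static, 0.8) + rng.normal(scale=0.05, size=400)

# Stage 1: stationary model (here trained on the steady-state targets directly).
pls = PLSRegression(n_components=2).fit(X, y_static)
y_hat_static = pls.predict(X).ravel()

# Stage 2: fit the dynamic parameter on top of the stationary prediction.
fit = least_squares(lambda a: first_order(y_hat_static, a[0]) - y_meas,
                    x0=[0.5], bounds=(0.0, 0.999))
print("estimated filter coefficient:", fit.x[0])  # should be near 0.8
```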
238

Explaining temporal variations in soil respiration rates and δ¹³C in coniferous forest ecosystems

Comstedt, Daniel January 2008 (has links)
Soils of Northern Hemisphere forests contain a large part of the global terrestrial carbon (C) pool. Even small changes in this pool can have a large impact on atmospheric [CO2] and the global climate. Soil respiration is the largest terrestrial C flux to the atmosphere and can be divided into autotrophic (from roots, mycorrhizal hyphae and associated microbes) and heterotrophic (from decomposers of organic material) respiration. It is therefore crucial to establish how the two components will respond to changing environmental factors. In this thesis I studied the effect of elevated atmospheric [CO2] (+340 ppm, ¹³C-depleted) and elevated air temperature (2.8-3.5 °C) on soil respiration in a whole-tree chamber (WTC) experiment conducted in a boreal Norway spruce forest. In another spruce forest I used multivariate modelling to establish the link between day-to-day variations in soil respiration rates and its δ¹³C, and above- and below-ground abiotic conditions. In both forests, variation in δ¹³C was used as a marker for autotrophic respiration. A trenching experiment was conducted in the latter forest to separate the two components of soil respiration; the potential problems associated with trenching, increased root decomposition and changed soil moisture conditions, were handled by empirical modelling. The WTC experiment showed that elevated [CO2], but not temperature, resulted in 48-62% higher soil respiration rates. The CO2-induced increase was, in absolute numbers, relatively insensitive to seasonal changes in soil temperature, and data on δ¹³C suggest it mostly resulted from increased autotrophic respiration. The multivariate modelling revealed a strong link between weather (air temperature and vapour pressure deficit) and the day-to-day variation of soil respiration rate and its δ¹³C; however, the tightness of the link depended on good weather for up to a week before the respiration sampling. Changes in soil respiration rates lagged weather conditions by 2-4 days, which was 1-3 days shorter than for the δ¹³C signal; we hypothesised this to be due to pressure concentration waves moving in the phloem at higher rates than the solute itself (i.e., the δ¹³C label). Results from the empirical modelling in the trenching experiment show that autotrophic respiration contributed about 50% of total soil respiration, showed great day-to-day variation, and was correlated with total soil respiration but not with soil temperature or soil moisture. Over the first five months after the trenching, an estimated 45% of respiration from the trenched plots was an artefact of the treatment; of this, 29% was a water-difference effect and 16% resulted from root decomposition. In conclusion, elevated [CO2] caused an increased C flux to the roots, but this C was rapidly respired and has probably not changed the C stored in root biomass or in soil organic matter in this N-limited forest. Autotrophic respiration seems to be strongly influenced by the availability of newly produced substrates and rather insensitive to changes in soil temperature. Root trenching artefacts can be compensated for by empirical modelling, an alternative to the sequential root harvesting technique.
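A sketch of how the lag between weather and respiration could be screened with lagged predictors and cross-validated PLS, in the spirit of the multivariate modelling described above; the weather series, coefficients, and the 3-day lag baked into the simulation are all hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
days = 120
air_temp = 10 + 5 * np.sin(np.arange(days) / 10) + rng.normal(scale=1.0, size=days)
vpd = 0.5 + 0.1 * air_temp + rng.normal(scale=0.2, size=days)
# Simulated respiration responding to weather with a 3-day delay.
respiration = np.roll(0.3 * air_temp + 2.0 * vpd, 3) + rng.normal(scale=0.3, size=days)

for lag in range(7):
    X = np.column_stack([np.roll(air_temp, lag), np.roll(vpd, lag)])[lag:]
    y = respiration[lag:]
    r2 = cross_val_score(PLSRegression(n_components=1), X, y, cv=5).mean()
    print(f"lag {lag} days: CV R2 = {r2:.2f}")  # peak expected near lag 3
```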
239

A multivariate approach to computational molecular biology

Pettersson, Fredrik January 2005 (has links)
This thesis describes the application of multivariate methods in analyses of genomic DNA sequences, gene expression and protein synthesis, which represent each of the steps in the central dogma of biology. The recent finalisation of large sequencing projects has given us a definable core of genetic data and large-scale methods for the dynamic quantification of gene expression and protein synthesis. However, in order to gain meaningful knowledge from such data, appropriate data analysis methods must be applied.

The multivariate projection methods principal component analysis (PCA) and partial least squares projection to latent structures (PLS) were used for clustering and multivariate calibration of data. By combining results from these and other statistical methods with interactive visualisation, valuable information was extracted and further interpreted.

We analysed genomic sequences by combining multivariate statistics with cytological observations and full genome annotations. All DNA di- (16), tri- (64), tetra- (256), penta- (1024) and hexamers (4096) were separately counted and normalised, and their distributions in the chromosomes of three Drosophila genomes were studied using PCA. Using this strategy, sequence signatures responsible for the differentiation of chromosomal elements were identified and related to previously defined biological features. We also developed a tool, which has been made publicly available, to interactively analyse single nucleotide polymorphism data and to visualise annotations and linkage disequilibrium.

PLS was used to investigate the relationships between weather factors and gene expression in field-grown aspen leaves. By interpreting PLS models it was possible to predict whether genes were mainly environmentally or developmentally regulated. Based on a PCA model calculated from seasonal gene expression profiles, different phases of the growing season were identified as different clusters. In addition, a publicly available dataset with gene expression values for 7070 genes was analysed by PLS to classify tumour types; all samples in a training set and an external test set were correctly classified. For the interpretation of these results, a method was applied to obtain a cut-off value for deciding which genes could be of interest for further studies.

Potential biomarkers for the efficacy of radiation treatment of brain tumours were identified by combining quantification of protein profiles by SELDI-TOF MS with multivariate analysis using PCA and PLS. We were also able to differentiate brain tumours from normal brain tissue based on protein profiles, and observed that radiation treatment slows down the development of tumours at a molecular level.

By applying a multivariate approach to the analysis of biological data, information was extracted that would be impossible or very difficult to acquire with traditional methods. The next step in a systems biology approach will be to perform a combined analysis in order to elucidate how the different levels of information are linked together to form a regulatory network.
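A minimal sketch of the oligomer-counting step followed by PCA, as in the genome-signature analysis above; the sequences here are randomly generated stand-ins for real chromosome windows, and the AT-rich split only mimics compositionally distinct regions.

```python
import numpy as np
from itertools import product
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
bases = "ACGT"
trimers = ["".join(p) for p in product(bases, repeat=3)]  # all 64 trinucleotides

def trimer_freqs(seq):
    counts = dict.fromkeys(trimers, 0)
    for i in range(len(seq) - 2):
        counts[seq[i:i + 3]] += 1
    v = np.array([counts[k] for k in trimers], dtype=float)
    return v / v.sum()  # normalise counts to frequencies

# 40 random 10 kb windows, half AT-rich, as stand-ins for chromosome regions.
compositions = ([0.25, 0.25, 0.25, 0.25],) * 20 + ([0.35, 0.15, 0.15, 0.35],) * 20
windows = ["".join(rng.choice(list(bases), p=p, size=10_000)) for p in compositions]
F = np.array([trimer_freqs(w) for w in windows])

scores = PCA(n_components=2).fit_transform(F)
print("PC1, first class:", scores[:3, 0])
print("PC1, second class:", scores[-3:, 0])  # the two classes separate along PC1
```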
240

Prédire l'âge de personnes à partir de photos du visage : une étude fondée sur la caractérisation et l'analyse de signes du vieillissement

Nkengne, Alex A. 13 June 2008 (has links) (PDF)
Age has always been an important identity attribute, and over the course of evolution we have developed an innate ability to classify individuals according to their age. This classification relies largely on the face and on the anatomical transformations it undergoes over time. More and more cosmetic and dermatological treatments and surgical procedures targeting a specific ageing sign, or group of signs, are being used to cancel, or at least partially mask, the effect of time on the face. One may therefore ask how each sign influences our ability to predict an individual's age from their face. In order to build an algorithm capable of determining people's age from their photos, we studied the signs of ageing and their impact on apparent age. First, we determined and analysed the anatomical transformations that alter the face from adulthood onward (beyond 20 years of age). We then studied the signs on which people rely to predict a person's age. Finally, we built and validated a predictive model of age based on these observations. Anatomical transformations of the face with age: the prevalence of a number of ageing signs (wrinkles, dark spots, face shape, etc.) was measured on a representative panel of female volunteers aged 20 to 74. These data allowed us to establish the kinetics of appearance of these signs. Subjective assessment of age: the aim was to determine the signs an observer relies on when estimating a subject's age. To do this, we asked a panel of 48 observers to assign an age to the volunteers on whom we had previously measured the ageing signs. We confirmed with this group of observers that the perception of age is related to the observer's sex and age. Moreover, using PLS regression (Partial Least Squares regression), we established relationships between ageing signs and perceived age, and showed that depending on whether one is young or old, a man or a woman, one does not use the same ageing signs to predict age. Prediction model: finally, we proposed a model based on PLS regression to automatically predict age from facial photos. A distinctive feature of this model is that it combines, in a unified approach, signs related to the colour, shape and texture of the face with the subjects' age. Like Active Appearance Models (AAM), the model aims to strongly reduce the information carried by the set of face pixels. However, it is supervised, which makes it well suited to our context, since a goal-driven learning procedure can be implemented. Its performance is in fact comparable to that of humans.
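A sketch of the unified idea of predicting age by PLS from concatenated colour, shape and texture descriptors; the feature extraction itself is out of scope here, so all three blocks, their sizes, and the age range are simulated placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(8)
n = 150
age = rng.uniform(20, 74, size=n)
# Simulated descriptor blocks, each weakly driven by age plus noise.
colour = np.outer(age, rng.normal(size=10)) / 50 + rng.normal(size=(n, 10))
shape = np.outer(age, rng.normal(size=8)) / 50 + rng.normal(size=(n, 8))
texture = np.outer(age, rng.normal(size=12)) / 50 + rng.normal(size=(n, 12))

X = np.hstack([colour, shape, texture])  # one unified descriptor per face
pred = cross_val_predict(PLSRegression(n_components=4), X, age, cv=5).ravel()
print("mean absolute error (years):", np.abs(pred - age).mean())
```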
