511
Considering the flexibility of human resources in planning and scheduling industrial activities. Atalla El-Awady Attia, El-Awady (05 April 2013)
The growing need for responsiveness in manufacturing companies facing volatile markets creates a strong demand for flexibility in their organization. This flexibility can be used to enhance the robustness of the baseline schedule for a given programme of activities. Since company personnel are increasingly seen as the core of organizational structures, they provide decision-makers with a renewable and viable source of flexibility. First, this work models the problem of multi-period workforce allocation on industrial activities with two dimensions of flexibility. The first is the annualization of working time, which covers policies for modulating schedules, individually or collectively. The second is the versatility of operators, which induces a dynamic view of their skills and the need to predict changes in individual performance as a result of successive assignments. The dynamic nature of workforce efficiency was modelled as a function of learning-by-doing and of skill loss during work interruption periods. We therefore place ourselves firmly in a context where the expected durations of activities are no longer deterministic, but result from the number of workers assigned to perform them and from their levels of experience. The research then turned to the question: "What kind, or what size, of problem does the project we have to schedule pose?" The different dimensions of the problem were accordingly inventoried and analysed so that they could be assessed and measured. For each dimension, the most relevant assessment method was proposed; the resulting correlated measures were then reduced to principal components through a factor analysis. As a result, the complexity (or simplicity) of solving a given scheduling problem, that is, of building a satisfactory schedule for it, can be evaluated. To this end, we developed a software platform, based on a genetic algorithm, that solves the problem and builds the project baseline schedule together with the associated resource allocation. The model was validated, and its parameters were tuned through designed experiments to guarantee the best performance. The robustness of this performance was further investigated by completely solving a sample of four hundred projects, ranked by their number of tasks. Because of the dynamic aspect of operator efficiency, this work also examines a set of factors that influence the development of operator versatility. The results logically conclude that a company seeking flexibility must accept extra costs to develop the versatility of its operators. To control these extra costs, several quantities should be optimised: the number of operators who attend a skill-development programme; for each of them, the degree of similarity between the newly developed skills and the initial ones, or the number of these additional skills; and finally the way operators' working hours are distributed over the skill-acquisition period. In the end, this work opens the door to considering human factors and workforce flexibility when building a baseline schedule.
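For readers who want to experiment with the modelling idea summarized above: the coupling of learning-by-doing with skill loss during interruptions is often written as a pair of exponential curves. The sketch below is a minimal illustration in that spirit, with hypothetical rate constants and workloads; it is not the thesis's actual formulation.

```python
import numpy as np

def efficiency_after_practice(eff, hours_worked, k_learn=0.05, eff_max=1.0):
    """Exponential learning-by-doing: efficiency rises toward eff_max
    with cumulative hours practiced on the skill."""
    return eff_max - (eff_max - eff) * np.exp(-k_learn * hours_worked)

def efficiency_after_break(eff, weeks_idle, k_forget=0.02, eff_min=0.4):
    """Forgetting during a work interruption: efficiency decays toward
    a residual floor eff_min."""
    return eff_min + (eff - eff_min) * np.exp(-k_forget * weeks_idle)

# One operator alternating assignments on a skill: three work periods,
# each followed by an idle period on that skill.
eff = 0.5  # initial efficiency (fraction of nominal productivity)
for period in range(3):
    eff = efficiency_after_practice(eff, hours_worked=80)
    eff = efficiency_after_break(eff, weeks_idle=4)
    print(f"period {period}: efficiency = {eff:.3f}")

# Activity duration then depends on who is assigned:
# duration = workload / summed efficiency of the assigned operators.
workload = 200.0           # person-hours at nominal efficiency
team_eff = [0.9, eff]      # two operators with different experience
print(f"expected duration: {workload / sum(team_eff):.1f} hours")
```

Because duration divides workload by the team's summed efficiency, schedule length becomes a function of who is assigned and how experienced they are, which is exactly the non-deterministic behaviour the abstract describes.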
512
Classification of Carpiodes Using Fourier Descriptors: A Content-Based Image Retrieval Approach. Trahan, Patrick (06 August 2009)
Taxonomic classification has always been important to the study of any biological system. At the current rate of classification, many biological species will go unclassified and be lost forever. The current state of computer technology makes image storage and retrieval possible on a global level, so computer-aided taxonomy is now feasible. Content-based image retrieval techniques use visual features of an image for classification. By combining image content with computer technology, the gap between taxonomic classification and species loss is shrinking. This content-based study uses the Fourier descriptors of fifteen known landmark features on three Carpiodes species: C. carpio, C. velifer, and C. cyprinus. The classification analysis involves both unsupervised and supervised machine-learning algorithms. Fourier descriptors of the fifteen known landmarks provide strong classification power on image data, and feature-reduction analysis indicates that feature reduction is possible, which proves useful for increasing the generalization power of the classifiers.
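To reproduce the flavour of this approach, Fourier descriptors of an ordered landmark outline can be computed from its complex representation. Below is a minimal sketch with a synthetic outline standing in for a fish shape; the study's fifteen Carpiodes landmarks and exact normalization are not reproduced.

```python
import numpy as np

def fourier_descriptors(points, n_desc=8):
    """Translation-, scale- and rotation-invariant Fourier descriptors
    from an ordered sequence of 2D landmark points."""
    z = points[:, 0] + 1j * points[:, 1]   # encode (x, y) as complex numbers
    coeffs = np.fft.fft(z)
    coeffs[0] = 0                          # drop DC term -> translation invariance
    mags = np.abs(coeffs)                  # drop phase -> rotation invariance
    mags /= mags[1]                        # normalize by first harmonic -> scale invariance
    return mags[1:n_desc + 1]

# Synthetic example: 15 landmarks on a noisy ellipse.
t = np.linspace(0, 2 * np.pi, 15, endpoint=False)
outline = np.column_stack([2.0 * np.cos(t), np.sin(t)])
outline += np.random.default_rng(0).normal(scale=0.02, size=outline.shape)
print(fourier_descriptors(outline))
```

The resulting descriptor vectors can then feed either unsupervised clustering or a supervised classifier, as in the study.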
513
Dynamics and conformational effects on the transport role of serum albumins. Paris, Guillaume (05 June 2014)
Human serum albumin (HSA) is a protein known for its exceptional transport properties and its high content of disulfide bridges. Studying its conformational dynamics is a major challenge in understanding its physiological functions. The aim of our work was to study this conformational dynamics and to understand the role of disulfide bonds in maintaining the stability of the native protein structure. Our analysis is based on molecular dynamics simulations coupled with principal component analysis. Beyond validating the simulation method, the results provide new insights into the main effects of disulfide-bond reduction in serum albumins. Protein unfolding/refolding processes were detailed, and special attention was paid to predicting the reduced structure at equilibrium. A detailed study of the global conformational dynamics of the protein, as well as of its two main binding sites, was performed, and possible allosteric effects between these two sites were investigated. The theoretical results were discussed against the available experimental data.
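The combination named above, molecular dynamics trajectories analysed by principal components, is conventionally run on the flattened, superposed atomic coordinates. Here is a minimal sketch with random coordinates standing in for a real HSA trajectory; frame and atom counts are arbitrary.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in trajectory: n_frames snapshots of n_atoms 3-D coordinates,
# assumed already superposed onto a reference structure.
rng = np.random.default_rng(1)
n_frames, n_atoms = 500, 100
traj = rng.normal(size=(n_frames, n_atoms, 3)).cumsum(axis=0) * 0.01

X = traj.reshape(n_frames, -1)        # one row per frame: (x1, y1, z1, x2, ...)
X -= X.mean(axis=0)                   # PCA on deviations from the mean structure

pca = PCA(n_components=5)
projections = pca.fit_transform(X)    # motion along the principal modes
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
# Large-amplitude collective motions (e.g. domain rearrangements after
# disulfide reduction) show up as the leading components.
```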
514
Contribution to modeling barley and malt quality for malting process control. Ajib, Budour (18 December 2013)
In a continuously growing market, and in order to meet brewers' needs for high-quality malt, control of the malting process is a great challenge. Malt quality depends strongly on the operating conditions of the process, especially the steeping conditions, but also on the quality of the raw material: barley. In this study, we established polynomial models that relate the operating conditions to malt quality. Coupled with our genetic algorithms, these models let us determine the optimal steeping conditions, either to reach a targeted malt quality (friability) or to allow malting at low water content while maintaining an acceptable malt quality (so as to reduce water consumption and control the environmental costs of malt production). However, the variability of the raw material is a limiting factor of our approach. The established models are very sensitive to the species (spring or winter barley) and to the barley variety, and above all they are highly dependent on the crop year. Variations in barley properties from one crop year to another are poorly characterized and are not incorporated into our models, which prevents us from capitalizing on experimental information over time. Some structural properties of barley (porosity, hardness) were considered as new factors to better characterize the raw material, but they did not explain the variations observed in the malting plant. To characterize the raw material, 394 barley samples from three different crop years (2009, 2010 and 2011) were analysed by MIR spectroscopy. PCA confirmed the significant effect of crop year, species, variety and sometimes place of harvest on barley properties. For some years and some species, a PLS regression made it possible to predict the protein and beta-glucan contents of barley from the MIR spectra. These promising results still face product variability; nevertheless, the new PLS models could be exploited to implement control strategies for the malting process based on MIR spectroscopic measurements.
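Predicting protein or beta-glucan content from MIR spectra, as described above, is a standard PLS regression setup. The minimal sketch below uses synthetic spectra; the channel count, component number and simulated absorption band are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 120, 400     # e.g. barley samples x MIR channels
X = rng.normal(size=(n_samples, n_wavenumbers)).cumsum(axis=1)  # smooth-ish spectra
true_coef = np.zeros(n_wavenumbers)
true_coef[150:160] = 0.5                # a hypothetical protein absorption band
y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)       # e.g. % protein

pls = PLSRegression(n_components=10)
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("cross-validated R2:", np.round(r2.mean(), 3))
# Crop-year variability would be probed by training on one harvest and
# testing on another, instead of random CV folds.
```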
515
Cerebral signal processing for the autonomy of the handicapped: an imagined-word recognition system. Abdallah, Nassib (20 December 2018)
Brain-machine interfaces offer a way to restore several functions such as movement and speech. Building a BCI consists of four main phases: data acquisition, signal preprocessing, feature extraction and selection, and classification. We present a new imagined-word recognition system based on a non-invasive (EEG), portable acquisition technique, intended to ease communication with the outside world for people with specific disabilities. This thesis includes a system named FEASR for building a relevant, optimized database. This database was tested with several classification methods, reaching a maximum recognition rate of 83.4% for five imagined Arabic words. In addition, we discuss the impact of optimization algorithms (selection of sensors over Wernicke's area, principal component analysis, and selection of sub-bands from the discrete wavelet decomposition) on the recognition rates, as a function of the size of our database and of its reduction.
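The processing chain summarized here (channel selection, wavelet sub-band features, PCA, classification) can be prototyped in a few lines. A minimal sketch assuming PyWavelets and scikit-learn, with random signals in place of real EEG recordings and an arbitrary channel count:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_trials, n_channels, n_samples = 200, 8, 512   # 8 channels near Wernicke's area (assumed)
X_raw = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 5, size=n_trials)           # five imagined-word classes

def wavelet_features(trial, wavelet="db4", level=4):
    """Log-energy of each wavelet sub-band, per channel."""
    feats = []
    for ch in trial:
        coeffs = pywt.wavedec(ch, wavelet, level=level)
        feats += [np.log(np.sum(c ** 2) + 1e-12) for c in coeffs]
    return feats

X = np.array([wavelet_features(t) for t in X_raw])
clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC())
clf.fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))  # ~chance (0.2) on random data
```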
516
Calibration of Lasso-type algorithms & statistical analysis of metallurgical data in aeronautics. Connault, Pierre (06 April 2011)
Our work contains a methodological part and an applied part. The methodological part studies the Lasso and a variant of this algorithm, the projected Lasso, and develops slope heuristics to calibrate them. Our approach uses the sparsity properties of the Lasso, casting its calibration as a model-selection problem; this allows penalized criteria that require the tuning of a constant. To determine the shape of the penalty and the value of the constant, we adapt the classical approaches of Birgé and Massart, which leads to the notion of canonical penalty. Slope heuristics and (tenfold) cross-validation are then compared through simulation studies; the closeness of the results suggests using both together in practice, with visual corrections on the slope side. Simplified penalties were then tried, without clear success, to reduce the computation time of the canonical penalties. The applied part analyses metallurgical questions in aeronautics. In reliability, the large number of variables relative to the limited number of data points makes classical linear models unstable and computation times excessive, so the Lasso is an attractive solution; our calibration method often retains the variables that match expert knowledge. The quality of the manufacturing process, on the other hand, cannot be handled with the Lasso. Four major axes are then considered: determining the factors of the process, identifying recipes, studying the stability of the process over time, and detecting out-of-specification parts. This yields a global statistical strategy for studying processes, in quality as well as in reliability.
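Casting Lasso calibration as model selection means scoring each model on the regularization path with a penalized criterion and tuning its constant. The sketch below illustrates one common variant, the dimension-jump rule, using support size as the penalty shape; it is a simplification of the canonical penalties developed in the thesis.

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(4)
n, p = 100, 200                          # few observations, many variables
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 2.0                           # sparse truth
y = X @ beta + rng.normal(size=n)

# Lasso path: candidate models indexed by their support size D_m.
alphas, _, coefs = lars_path(X, y, method="lasso")
rss = [np.sum((y - X @ c) ** 2) for c in coefs.T]
dims = [np.count_nonzero(c) for c in coefs.T]

# Dimension jump: the dimension selected by argmin_m rss_m + kappa * D_m
# jumps as kappa varies; take kappa at the largest jump, then select
# with 2 * kappa (slope-heuristic rule).
kappas = np.linspace(0.01, 50, 2000)
sel = np.array([dims[int(np.argmin([r + k * d for r, d in zip(rss, dims)]))]
                for k in kappas])
jump = np.argmax(np.abs(np.diff(sel)))
kappa_hat = kappas[jump + 1]
best = int(np.argmin([r + 2 * kappa_hat * d for r, d in zip(rss, dims)]))
print("selected support size:", dims[best])
```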
517
The Impact of Pollution Sources on Number and Mass Size Distribution of Atmospheric Particulate Matter in São Paulo. Santos, Luís Henrique Mendes dos (06 August 2018)
Several studies have aimed to determine and characterize the atmospheric aerosol in the city of São Paulo, with respect to its size and chemical composition, and to identify its emission sources and their mass contributions in the studied area. The atmospheric constituents were collected at the sampling station of the Laboratório de Análise dos Processos Atmosféricos (LAPAt) of the Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG) of the University of São Paulo (USP), located in the western zone of the city at 23°33'34" S, 46°44'00" W. The experiment was conducted from August 15 to September 16, 2016. Samples of particulate matter were collected to analyze the mass concentration and chemical composition of the inhalable fine fraction. The particulate mass size distribution was determined with a cascade impactor. The number size distribution was obtained from measurements with a Scanning Mobility Particle Sizer (SMPS), with the particle number concentration (PNC) calculated for diameters from 9 to 450 nm. To study the relationships between the gases present in the region, ultraviolet radiation and the PNC, we used the hourly gas concentrations (O3, NO, NO2 and NOx) and UV measured by the telemetric air-quality network of CETESB, the environmental agency of the State of São Paulo. The sampled filters were analyzed by energy-dispersive X-ray fluorescence (EDX) to determine the elemental composition, and Black Carbon (BC) concentrations were obtained by reflectance analysis. To determine the sources of fine particulate matter (PM2.5), two receptor models were used: Principal Component Analysis (PCA) and Positive Matrix Factorization (PMF). For the pollutant dispersion analysis, we used meteorological data from the IAG climatological station located in the southeast of the city (Parque do Estado). The mean PM2.5 concentration over the sampling period was 18.6 (±12.5) µg/m³ and the mean BC concentration was 1.9 (±1.5) µg/m³. The main sources found by both the PCA and PMF models were heavy-duty (diesel) vehicles, light-duty vehicles, biomass burning, resuspension of soil dust, pavement and construction dust, secondary processes, and mixed sources. The trace elements appeared in different size modes: Al, Ca, Si and Ti peaked in the accumulation mode, acting as tracers of pavement resuspension; Fe, Mn, P, K and Cr peaked in the coarser fraction of the accumulation mode, tracing vehicular emissions and biomass burning; and Cu, Zn, Br, Pb, S and BC peaked in the finer fraction of the accumulation mode, likewise tracing vehicular emissions and biomass burning.
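Receptor modeling of a samples-by-species concentration matrix can be approximated with non-negative matrix factorization; real PMF additionally weights each measurement by its uncertainty (as in the EPA PMF tool), which the minimal sketch below, on synthetic data, omits.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
species = ["Al", "Ca", "Si", "Fe", "K", "Cu", "Zn", "S", "BC"]
n_samples, n_sources = 150, 4

# Synthetic data: 4 hidden source profiles mixed with non-negative strengths.
profiles = rng.uniform(size=(n_sources, len(species)))
contributions = rng.gamma(2.0, 1.0, size=(n_samples, n_sources))
X = contributions @ profiles + rng.uniform(0, 0.05, size=(n_samples, len(species)))

model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000)
G = model.fit_transform(X)   # source contributions per sample
F = model.components_        # source profiles (species signatures)

for i, prof in enumerate(F):
    top = [species[j] for j in np.argsort(prof)[::-1][:3]]
    print(f"factor {i}: dominant species {top}")
# Factors are then interpreted against known tracers, e.g. BC with Cu/Zn
# for traffic, Al/Si/Ca for soil and pavement dust.
```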
518
Ultra-processed foods and the nutritional quality of US diets. Steele, Eurídice Martínez (31 May 2017)
Background: The introduction of agriculture and animal husbandry was too recent for the human genome to have adapted, and the advanced technology that followed the Industrial Revolution even more so. According to Cordain et al., the displacement of minimally processed foods by post-agricultural and post-industrial food items adversely affected the following dietary indicators: glycemic load, fatty acid and macronutrient composition, micronutrient density, acid-base balance, sodium-potassium ratio and fiber content; many current diseases of civilization may in turn be ascribable to these unbalanced indicators. Raubenheimer and Simpson have proposed the Protein Leverage Hypothesis (PLH) to explain how a drop in dietary protein content might lead to obesity and associated cardiometabolic disease. Objectives: This thesis studies the effect of ultra-processed food consumption on dietary indicators in the US population, including macronutrient composition, micronutrient and fiber densities, and urinary phytoestrogens. It also explores whether the dietary share of ultra-processed foods, expressed as a percentage of total energy intake, is a meaningful determinant of the overall nutritional quality of contemporary diets, and finally tests whether the association between ultra-processed food, protein and energy consumption fits the predictions of the PLH model. Methods: Participants in the cross-sectional 2009-2010 National Health and Nutrition Examination Survey with at least one 24-hour dietary recall were evaluated. Food items were classified according to the extent and purpose of industrial food processing as unprocessed or minimally processed foods, processed culinary ingredients, processed foods, or ultra-processed foods. Manuscript 1 examines the relationship between the dietary contribution of ultra-processed foods and the nutritional quality of the US diet, evaluating the dietary contents of critical nutrients both individually and overall, using Principal Component Analysis (PCA). Manuscript 2 studies the association between the dietary contribution of ultra-processed foods and energy intake from added sugars. Manuscript 3 examines how the consumption of ultra-processed food influences the relative dietary protein content and the absolute energy and protein intakes, and tests whether these relationships fit the PLH predictions. Manuscript 4 assesses the relationship between the dietary contribution of ultra-processed foods and urinary levels of phytoestrogens. Results: The average contents of protein, fiber, vitamins A, C, D and E, zinc, potassium, phosphorus, magnesium and calcium in the US diet decreased significantly across quintiles of the energy contribution of ultra-processed foods, while carbohydrate, added sugar and saturated fat contents increased. An inverse dose-response association was found between ultra-processed food consumption and overall dietary quality, measured through a PCA-derived nutrient-balanced-pattern factor score. Consistent with the PLH, the dietary contribution of ultra-processed foods was inversely associated with protein density and directly associated with total energy intake, while absolute protein intake remained relatively constant as ultra-processed food consumption increased. Average urinary enterolignan levels decreased across quintiles of ultra-processed food consumption, while isoflavone levels remained unchanged. Conclusions: This study suggests that decreasing the dietary share of ultra-processed foods is a rational and effective way to improve the nutritional quality of US diets.
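The PCA-derived pattern score used here can be illustrated on simulated data: build per-person nutrient densities, flip the sign of nutrients where less is better, take the first principal component as an overall quality score, and compare it across quintiles of the ultra-processed energy share. The sketch below follows that outline with invented effect sizes; the published score's exact construction differs.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n = 1000
upf_share = rng.uniform(0, 100, n)   # % of energy from ultra-processed foods

# Simulated nutrient densities that degrade with the UPF share (illustrative only).
nutrients = pd.DataFrame({
    "protein":     15 - 0.04 * upf_share + rng.normal(0, 1, n),
    "fiber":       10 - 0.05 * upf_share + rng.normal(0, 1, n),
    "added_sugar":  8 + 0.10 * upf_share + rng.normal(0, 2, n),
    "sat_fat":      9 + 0.03 * upf_share + rng.normal(0, 1, n),
})

# Sign-flip the unhealthy nutrients so higher always means better, then score by PC1.
signed = nutrients.copy()
signed[["added_sugar", "sat_fat"]] *= -1
score = PCA(n_components=1).fit_transform(
    StandardScaler().fit_transform(signed)).ravel()

quintile = pd.qcut(upf_share, 5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])
print(pd.Series(score).groupby(quintile).mean())   # monotonically decreasing
```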
519
Assessment of biotic baseline studies of EISs of São Paulo state. Lamonica, Laura de Castro (19 September 2016)
The Brazilian National Environmental Policy aims to reconcile socio-economic development with environmental quality, and Environmental Impact Assessment (EIA) is one of its instruments, applied to development projects through the Environmental Impact Statement (EIS). Drafting an EIS involves a baseline step for analysing the environmental quality of the area. The quality of EISs and of their baseline studies has been criticized and discredited by society, especially by the scientific community and environmental groups, and this quality directly influences the procedural effectiveness of the EIA and its role in decision making; an evaluation of the quality of this step therefore contributes to a more effective application of the instrument. This research aimed to evaluate the quality of the biotic baseline studies of EISs drawn up in the state of São Paulo between 2005 and 2014. We assessed 55 biotic baseline studies and 35 terms of reference (TRs) of EISs with a checklist built from recommendations in the literature and in regulations for biotic baseline studies. The results were analysed qualitatively and compared with the recommendations of the TRs. The baseline quality was then examined from three perspectives: approval of the studies, type of project, and year of EIS preparation. Finally, nonlinear principal component analysis (NLPCA) was applied to the baseline quality data in order to test this tool for identifying the criteria that determine baseline quality and for investigating how these criteria and the EISs relate to one another. The biotic baseline studies assessed were more satisfactory on descriptive than on analytical aspects. According to the NLPCA, criteria concerning quantitative data collection and surveys of rare species were determinant for study quality. Survey time and seasonality were unsatisfactory and were statistically related to the identification of the degree of vulnerability of the area. The results highlight the importance of systematizing biodiversity data in reliable, up-to-date sources for the preparation and analysis of baseline studies, and of more specific TRs: although the studies complied with them, the TRs are generic and contain more descriptive than analytical recommendations. There was no representative difference between the baseline quality of approved and non-approved EISs; the hydraulic works sector showed the most satisfactory evaluations, which was emphasized by the NLPCA and may be related to project size; and the temporal analysis showed a trend of improvement in both studies and TRs. Both the checklist and the NLPCA proved to be suitable tools for investigating the quality of biotic baseline studies of EISs.
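NLPCA (optimal-scaling PCA, as in SPSS CATPCA or R's Gifi/princals) handles ordinal checklist grades directly; common Python stacks have no drop-in equivalent, so the sketch below ordinally codes hypothetical review grades and applies linear PCA as a rough first approximation. It illustrates the workflow only and is not the method used in the thesis.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
grades = ["absent", "poor", "satisfactory", "good"]   # hypothetical review scale
criteria = ["quant_data", "rare_species", "seasonality",
            "survey_time", "vulnerability", "data_sources"]

# Simulated checklist: 55 baseline studies graded on each criterion,
# coded 0..3 for the ordinal grades above.
reviews = pd.DataFrame(
    rng.integers(0, 4, size=(55, len(criteria))), columns=criteria
)

X = (reviews - reviews.mean()) / reviews.std()
pca = PCA(n_components=2).fit(X)
loadings = pd.DataFrame(pca.components_.T, index=criteria, columns=["PC1", "PC2"])
print(loadings.round(2))
# Criteria loading strongly on PC1 are the main drivers of overall
# baseline quality; studies can be compared by their PC1 scores.
```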
520
Representation of individual finger movements in macaque areas AIP, F5 and M1. Sheng, Wei-An (21 June 2018)
No description available.