191

Articulation entre activités formelles et activités semi-formelles dans le développement de logiciels / Articulation between formal and semi-formal activities in software development

Sayar, Imen, 28 March 2019
The development of correct formal specifications for systems and software begins with the analysis and understanding of client requirements. Between these requirements, described in natural language, and their specification, defined in a precise formal language, a gap exists that makes the development task increasingly difficult to accomplish. We are facing two distinct worlds. This thesis aims to make explicit and establish interactions between these two worlds and to evolve them together. By interaction, we mean the links, exchanges and activities taking place between the different documents. Among these activities, we present validation as a rigorous process that starts with the requirements analysis and continues throughout the elaboration of their formal specification. As development progresses, choices are made, and feedback from verification and validation tools makes it possible to detect shortcomings in the requirements as well as in the specification. The evolution of the two worlds is described via the introduction of a new requirement into an existing system and through the application of development patterns. These patterns manage both the requirements and the associated formal specification; they are elaborated from the description of the form of the requirements in the client document. They facilitate the development task and help avoid the risk of oversights. Whatever the choice, the proposed approach is guided by questions that accompany the evolution of the whole system and make it possible to detect imperfections, omissions or ambiguities in the existing documents.
192

Desenvolvimento e validação de um método computadorizado para avaliação do consumo alimentar, preenchido por indivíduos adultos utilizando a Web / Development and validation of a self-administered computerized on-line method to assess food consumption in healthy adults

Galante, Andréa Polo, 05 December 2007
Objective: The main purpose of this study was to develop and validate a self-administered computerized method to assess food consumption in adults over the Web. Methods: A computerized system containing a food database drawn from the main food composition tables was developed. In total, 561 individuals expressed interest in participating in the study; after application of the exclusion criteria, the sample consisted of 93 individuals. For validation, the 24-hour recall (R24h) administered by telephone was used as the reference method. Participants were asked to complete 3 on-line food records on alternate days and, on the day after completing each record, the same individuals were interviewed by telephone. The following statistical procedures were used: the Kolmogorov-Smirnov test, adjustment of the dietary variables for within-person variance and for energy intake, Pearson and Spearman correlation coefficients, paired t-test and Wilcoxon test, cross-classification into quartiles, weighted kappa, and Bland-Altman plots. Results: The final sample consisted of 60 individuals, of whom 72% were female and 70% had completed higher education. Regarding nutritional status, 55% were eutrophic. Mean intakes of fat, calcium, iron, sodium, vitamin B6 and vitamin C were statistically equivalent between the two methods. After adjustment, all correlation coefficients were statistically significant and ranged from 0.31 (vitamin B6) to 0.87 (energy). The raw concordance between quartiles after adjustment ranged from 40% (carbohydrate) to 61.7% (energy), while classification into opposite quartiles ranged from 6.7% (vitamin C) to 21.7% (sodium). All nutrients, as well as energy, showed weighted kappa values above 0.40. Conclusions: The self-administered on-line NutriQuanti computerized system performed well in classifying individuals according to their energy and nutrient intake.
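As a hedged illustration of the agreement statistics this abstract lists (Pearson/Spearman correlation, quartile cross-classification, weighted kappa, Bland-Altman limits), the sketch below computes them for two hypothetical dietary assessment methods; the data, the sample of 60 and the variable names are invented and do not come from the NutriQuanti study.

```python
# Agreement statistics between a reference method (telephone 24h-recall) and
# a test method (on-line food record), on simulated energy intakes (kcal).
import numpy as np
from scipy import stats
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
r24h = rng.normal(2000, 450, size=60)             # reference method
fr_online = r24h + rng.normal(0, 250, size=60)    # test method with random error

# Correlation between the two methods
pearson_r, _ = stats.pearsonr(fr_online, r24h)
spearman_rho, _ = stats.spearmanr(fr_online, r24h)

# Cross-classification into quartiles: raw concordance and weighted kappa
q_fr = np.digitize(fr_online, np.quantile(fr_online, [0.25, 0.5, 0.75]))
q_r24h = np.digitize(r24h, np.quantile(r24h, [0.25, 0.5, 0.75]))
same_quartile = np.mean(q_fr == q_r24h)
kappa_w = cohen_kappa_score(q_fr, q_r24h, weights="linear")

# Bland-Altman bias and 95% limits of agreement
diff = fr_online - r24h
bias, half_width = diff.mean(), 1.96 * diff.std(ddof=1)

print(f"r={pearson_r:.2f}  rho={spearman_rho:.2f}  "
      f"same quartile={same_quartile:.0%}  weighted kappa={kappa_w:.2f}  "
      f"bias={bias:.0f} kcal  LoA=({bias - half_width:.0f}, {bias + half_width:.0f})")
```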
194

Modélisation des systèmes synchrones en BIP / Modeling Synchronous Systems in BIP

Sfyrla, Vasiliki, 21 June 2011
A central idea in systems engineering is that complex systems are built by assembling components. Components have different characteristics, from a large variety of viewpoints, each highlighting different dimensions of a system. A central problem is the meaningful composition of heterogeneous components to ensure their correct interoperation. A fundamental source of heterogeneity is the composition of subsystems with different execution and interaction semantics. At one extreme of the semantic spectrum are fully synchronized components, which proceed in lockstep with a global clock and interact through atomic transactions. At the other extreme are completely asynchronous components, which proceed at independent speeds and interact non-atomically. Between the two extremes, a variety of intermediate models can be defined (e.g. globally-asynchronous locally-synchronous models). In this work, we study the combination of synchronous and asynchronous systems. To achieve this, we rely on BIP (Behavior-Interaction-Priority), a general component-based framework for rigorous design. We define an extension of BIP, called Synchronous BIP, dedicated to modeling synchronous data-flow systems. Steps are described by acyclic Petri nets equipped with data and priorities. The Petri nets are used to model concurrent flows of computation, and priorities are instrumental for enforcing run-to-completion in the execution of a step. We study a class of well-triggered synchronous systems which are deadlock-free by construction and whose computation within a step is confluent. For this class, the behavior of components is modeled by modal flow graphs. These are acyclic graphs representing three different types of dependency between two events p and q: strong dependency (p must follow q), weak dependency (p may follow q), and conditional dependency (if both p and q occur, then p must follow q). We propose translations of LUSTRE and discrete-time MATLAB/Simulink into well-triggered synchronous systems. The translations are modular; they make explicit the data-flow connections between components and their synchronization using clocks. This allows the integration of synchronous models within heterogeneous BIP designs and enables the application of the validation and automatic implementation techniques already available for BIP. Both translations are currently implemented and experimental results are provided. For Synchronous BIP models we achieve efficient code generation. We provide two methods: a sequential implementation and a distributed implementation. The sequential implementation produces endless single-loop code. The distributed implementation transforms modal flow graphs into a particular class of Petri nets that can be mapped to Kahn Process Networks. Finally, we study the theory of latency-insensitive design (LID), which deals with the problem of interconnect latencies within synchronous systems. Based on LID design, synchronous systems can be "desynchronized" into networks of synchronous processes that may run at increased frequency. We propose a model for LID design in Synchronous BIP by representing the specific LID interconnect mechanisms as Synchronous BIP components.
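The modal flow graphs described above admit a compact encoding. The sketch below gives one possible reading of the three dependency kinds (strong, weak, conditional) as constraints on the events occurring in a single synchronous step; it is an illustrative model with invented event names, not the actual BIP or Synchronous BIP implementation.

```python
# Illustrative encoding of a modal flow graph: acyclic dependencies between
# events, checked against one synchronous step given as an ordered event list.
from enum import Enum

class Dep(Enum):
    STRONG = "strong"           # reading used here: p and q occur together, p after q
    WEAK = "weak"               # p may follow q: p needs a prior q, q alone is fine
    CONDITIONAL = "conditional" # if both occur, p must follow q

# Edges (q, p, kind): event p depends on event q.  Names are invented.
edges = [
    ("clock", "read_input", Dep.STRONG),
    ("read_input", "compute", Dep.STRONG),
    ("compute", "log", Dep.WEAK),
    ("compute", "emit_output", Dep.CONDITIONAL),
]

def step_is_valid(trace, edges):
    """Check that one step (ordered list of event occurrences) respects all edges."""
    pos = {e: i for i, e in enumerate(trace)}
    for q, p, kind in edges:
        if kind is Dep.STRONG:
            if (q in pos) != (p in pos) or (p in pos and pos[p] < pos[q]):
                return False
        elif kind is Dep.WEAK:
            if p in pos and (q not in pos or pos[p] < pos[q]):
                return False
        elif kind is Dep.CONDITIONAL:
            if p in pos and q in pos and pos[p] < pos[q]:
                return False
    return True

print(step_is_valid(["clock", "read_input", "compute", "emit_output"], edges))  # True
print(step_is_valid(["read_input", "compute"], edges))                          # False: no clock
```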
195

Caractérisation de la performance et validation des méthodes de dépistage des résidus d’antibiotiques dans les denrées alimentaires / Performance characterization and validation of screening methods for antibiotic residues in food of animal origin

Gaudin, Valérie, 08 June 2016
The use of antibiotics in veterinary medicine can lead to the presence of antibiotic residues in food of animal origin. These residues may pose risks (e.g. toxicological effects, allergies, antimicrobial resistance) to consumer health. Setting regulatory limits for these residues and an adapted control of food products are therefore essential to guarantee consumer safety. Screening methods are used as the first line of control and thus represent its critical stage. Validating a method guarantees that it is fit for purpose, meets regulatory expectations, and provides evidence of its performance. Validation is required by international standards (i.e. ISO 17025) and by regulation (i.e. European Decision 2002/657/EC). In a first part, the diversity of screening methods is presented. Conventional methods of the microbiological or immunological type, developed in the 1980s, are still widely used because of their moderate cost. Innovative methods, called biosensors, consist of a bioreceptor (e.g. antibody, aptamer) and a transducer (e.g. electrochemical, optical, mass-sensitive, calorimetric) for detection of the signal. These methods are in continuous development, and technological progress allows increasingly sensitive, portable and sometimes inexpensive methods to be developed. In a second part, various validation approaches, in the form of regulations, guidelines or standards, are discussed. The validation of a method comprises two stages: first the characterization of its performance, then the validation itself against pre-established criteria. The approaches can be absolute (a single method) or relative (comparison of methods), global (combining several characteristics into one) or criterion by criterion. The aim of this thesis is to compare these different validation approaches, to determine whether they can be applied to the different residue screening methods, and to establish whether their conclusions are equivalent or not. Various approaches were tested by applying them to the validation of screening methods of different types: conventional methods (microbiological and immunological) and innovative optical biosensors. The approach based on comparison of methods is not suited to screening methods for antibiotic residues. Indeed, the choice of the reference method is complicated because there are no standardized methods; furthermore, the chosen reference methods often rely on principles very different from the alternative method and are usually less sensitive. Global approaches, such as the probability of detection (POD) and the accuracy profile, are applicable to screening methods. These recent approaches are increasingly used in other fields and are worth developing for the screening of antibiotic residues. Finally, the criterion-by-criterion approach of European Decision 2002/657/EC and of the 2010 European guideline for the validation of screening methods, commonly applied to antibiotic residues, introduced a major characteristic and an advance in validation: the detection capability (CCβ). In conclusion, screening methods are constantly evolving thanks to the development of new biosensors. The improvement of their performance makes it possible to address the control of antibiotic residues in foodstuffs ever more effectively. Method validation is essential to guarantee effective control. Through the work of this thesis, we were able to observe the evolution of validation practice over the last 20 years. This evolution must continue, and perspectives for the evolution of validation guidelines, regulations and standards are presented in this thesis.
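As a hedged illustration of the probability-of-detection (POD) approach and the detection capability CCβ mentioned above, the sketch below estimates an empirical POD curve from hypothetical screening results on spiked samples and reads off the concentration at which the false-negative rate falls below 5%; the concentrations, replicate counts and detection counts are invented.

```python
# Empirical POD curve and a rough CCbeta estimate from hypothetical
# blank/spiked screening results (illustrative data only).
import numpy as np

levels     = np.array([0.0, 12.5, 25.0, 50.0, 100.0])  # spiking levels, e.g. µg/kg
n_samples  = np.array([20,  20,   20,   20,   20])     # replicates per level
n_detected = np.array([1,   6,    14,   19,   20])     # positive screening responses

pod = n_detected / n_samples            # empirical probability of detection
print(dict(zip(levels, pod)))

# CCbeta read as the level where POD reaches 95% (false-negative rate <= 5%),
# here by simple linear interpolation between tested levels.
ccbeta = np.interp(0.95, pod, levels)
print(f"estimated CCbeta ≈ {ccbeta:.0f} µg/kg")
```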
196

Definition and validation of requirements management measures

Loconsole, Annabella, January 2007
The quality of software systems depends on early activities in the software development process, of which the management of requirements is one. When requirements are not managed well, a project can fail or become more costly than intended, and the quality of the software developed can decrease. Among requirements management practices, it is particularly important to quantify and predict requirements volatility, i.e., how much the requirements are likely to change over time. Software measures can help in quantifying and predicting requirements attributes like volatility. However, few measures have yet been defined, because the early phases are hard to formalise. Furthermore, very few requirements measures have been validated, which would be needed to demonstrate that they are useful. The approach to requirements management in this thesis is quantitative, i.e. to monitor the requirements management activities and requirements volatility through software measurement. In this thesis, a set of 45 requirements management measures is presented. The measures were defined using the goal-question-metric framework for the two predefined goals of the requirements management key process area of the Capability Maturity Model for software. A subset of these measures was validated theoretically and empirically in four case studies. Furthermore, an analysis of validated measures in the literature was performed, showing that there is a lack of validated process, project, and requirements measures in software engineering. The studies presented in this thesis show that size measures are good estimators of requirements volatility. The important result is that size is relevant: increasing the size of a requirements document implies that the number of changes to requirements increases as well. Furthermore, subjective estimations of volatility were found to be inaccurate assessors of requirements volatility. These results suggest that practitioners should complement subjective estimations of volatility with objective ones. Requirements engineers and project managers will benefit from the research presented in this thesis because the measures defined, which proved to be predictors of volatility, can help in understanding how much requirements will change. By deploying the measures, practitioners would be prepared for possible changes in the schedule and cost of a project, giving them the possibility of creating alternative plans, new cost estimates, and new software development schedules.
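A minimal sketch of the kind of size-versus-volatility analysis reported above might look as follows; the document names, the size measure (number of requirements) and the change counts are invented, and the thesis' own 45 measures and case-study data are not reproduced here.

```python
# Correlating a simple requirements-document size measure with the number of
# later requirement changes (volatility).  All values are invented.
from scipy import stats

# (document, number_of_requirements, word_count, changes_logged_later)
documents = [
    ("proj-A", 42, 3100, 18),
    ("proj-B", 15, 1200, 5),
    ("proj-C", 88, 7900, 41),
    ("proj-D", 30, 2500, 11),
    ("proj-E", 60, 5200, 27),
]

n_reqs  = [d[1] for d in documents]
changes = [d[3] for d in documents]

rho, p = stats.spearmanr(n_reqs, changes)
print(f"Spearman rho between size and volatility: {rho:.2f} (p={p:.3f})")
```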
198

Statistical Stability and Biological Validity of Clustering Algorithms for Analyzing Microarray Data

Karmakar, Saurav, 08 August 2005
Simultaneous measurement of the expression levels of thousands to tens of thousands of genes across multiple tissue types is a result of advances in microarray technology. These expression levels provide clues about gene functions and have enabled better diagnosis and treatment of serious diseases such as cancer. To uncover unknown gene functions, a mapping between biological and statistical descriptions is needed for classifying genes. Here we introduce a novel approach that combines the statistical consistency and the biological relevance of the clusters produced by a clustering method. We employ two performance measures in combination, quantifying the statistical stability and the functional similarity of the cluster members, using a set of gene expressions with known biological functions. Through this analysis we construct a platform for predicting unknown gene functions with the best-performing clustering algorithm.
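The two evaluation axes described in this abstract can be illustrated with a hedged sketch: statistical stability as agreement between cluster labels obtained on perturbed resamples, and biological validity as agreement with known functional classes, both measured with the adjusted Rand index. The data are synthetic, and the thesis' actual measures and clustering algorithms may differ.

```python
# Two-part evaluation of a clustering: (1) stability under data perturbation,
# (2) agreement with known gene functions.  Synthetic data for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

# Synthetic "expression matrix": 300 genes x 20 tissue samples, 3 true classes
X, true_function = make_blobs(n_samples=300, n_features=20,
                              centers=3, random_state=0)

base = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# (1) Stability: re-cluster perturbed copies of the data and compare labels
rng = np.random.default_rng(0)
stability = []
for _ in range(10):
    Xp = X + rng.normal(0, 0.5, size=X.shape)          # perturbed resample
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(Xp)
    stability.append(adjusted_rand_score(base, labels))

# (2) Biological validity: agreement with the known functional classes
validity = adjusted_rand_score(true_function, base)

print(f"mean stability ARI = {np.mean(stability):.2f}, "
      f"functional agreement ARI = {validity:.2f}")
```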
200

Prédiction de la fracture osseuse du col du fémur : modélisation par éléments finis basée sur la mécanique d’endommagement et validation expérimentale / Prediction of proximal femur fracture: finite element modeling based on damage mechanics and experimental validation

Bettamer, Awad, 22 November 2013
Femoral fractures caused by osteoporosis have become a major public health problem, and this subject is therefore an increasingly important focus for both clinicians and biomedical researchers. The purpose of this study is to develop a new coupled approach to predict femoral neck fracture. The study proposes validated 2D and 3D finite element (FE) models based on continuum damage mechanics in order to simulate human proximal femur fracture, taking progressive crack initiation and propagation into account. These models are applied and validated under single-limb stance and sideways-fall configurations. Quasi-brittle behavior laws coupled with damage are implemented in FORTRAN and fed into ABAQUS/Standard to describe the constitutive behavior (UMAT subroutine). Bone mineral density (BMD) is measured for the region of interest using dual-energy X-ray absorptiometry (DXA). The models were developed in two variants (one isotropic, the other orthotropic) and validated against experimental results from tests performed on human femur samples under the single-limb stance configuration. During these tests, optical measurements based on digital image correlation (DIC) were conducted to acquire the displacement and strain fields. To calculate the fracture risk of the femoral head, the bone material properties must be assigned correctly. The 3D FE models were able to predict the overall force-displacement curve as well as the location and initiation of the femur fractures. However, despite its robustness, the 3D model is too computationally expensive to be used for diagnostic purposes within a clinically acceptable time. To overcome this, the model was simplified into a 2D model, which was preliminarily validated under identical boundary conditions; the results showed good correlation with the experiments. These studies have highlighted the potential of finite element modeling based on quasi-brittle damage to become a complementary tool for predicting the risk of bone fracture.
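As a hedged, one-dimensional illustration of the quasi-brittle damage laws mentioned above, the sketch below degrades the elastic stress by a scalar damage variable, sigma = (1 − d)·E·eps, with d growing irreversibly once a strain threshold is exceeded. The parameter values are illustrative and are not the calibrated bone properties of the thesis, whose model is implemented as an ABAQUS UMAT in FORTRAN rather than Python.

```python
# 1D quasi-brittle damage law: sigma = (1 - d) * E * eps, with linear softening
# between a damage-onset strain and a failure strain.  Illustrative parameters.
import numpy as np

E     = 12_000.0   # elastic modulus (MPa); illustrative, of cortical-bone order
eps_0 = 0.006      # strain at damage onset (illustrative)
eps_f = 0.02       # strain at complete failure, d -> 1 (illustrative)

def damage(eps):
    """Linear-softening damage law, one common quasi-brittle choice."""
    if eps <= eps_0:
        return 0.0
    if eps >= eps_f:
        return 1.0
    # chosen so that stress decreases linearly to zero between eps_0 and eps_f
    return (eps_f / eps) * (eps - eps_0) / (eps_f - eps_0)

d = 0.0
for eps in np.linspace(0.0, 0.025, 6):
    d = max(d, damage(eps))            # damage never heals (irreversibility)
    sigma = (1.0 - d) * E * eps        # degraded stress
    print(f"eps={eps:.3f}  d={d:.2f}  sigma={sigma:8.1f} MPa")
```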
