61

Découverte de règles de préférences contextuelles : application à la construction de profils utilisateurs / Contextual preference rules mining : an application for building user profiles

Diallo, Mouhamadou Saliou, 27 March 2015
The use of preferences has attracted growing interest as a means of personalizing query answers and making targeted recommendations, yet the manual construction of preference profiles remains both complex and time-consuming. In this context, this thesis presents a new automatic preference-elicitation method based on data mining techniques. The proposed approach consists of two phases: (1) extraction of all interesting contextual preference rules from a set of user preferences, and (2) construction of the user profile. At the end of the first phase many of the extracted rules are redundant or even superfluous; the second phase eliminates the superfluous rules so as to obtain a concise and sound profile. In this approach, a user profile is the set of contextual preference rules resulting from the second phase. Soundness guarantees that the preference rules specifying a profile agree with a large number of the user's preferences and contradict only a small number of them; conciseness means that a profile consists of a small number of preference rules. Four prediction methods that use the constructed profiles are also proposed. The approach was validated on a movie database built from MovieLens and IMDB, comprising 800,156 ratings given by 6,040 users to 3,881 movies. The results of these experiments show that the conciseness of the user profiles is controlled by the minimal agreement threshold and that, even with a strong reduction in the number of rules, the predictive quality of the profiles remains at an acceptable level. They also show that the predictive quality of some of the proposed ranking strategies rivals that of state-of-the-art methods, and outperforms SVMRank in several situations.
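To make the two-phase idea concrete, below is a minimal sketch of what a contextual preference rule and the agreement-based filtering step might look like. The rule representation, the min_agreement threshold name and the filtering criterion are illustrative assumptions; the abstract does not specify the thesis's actual data structures or mining algorithm, and a full conciseness pass would also prune redundant rules, which this sketch omits.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PreferenceRule:
    """If every (attribute, value) pair in `context` holds, `preferred` beats `dispreferred`."""
    context: frozenset        # e.g. frozenset({("genre", "comedy")})
    attribute: str            # attribute the preference bears on, e.g. "decade"
    preferred: str
    dispreferred: str

def matches(rule, item_a, item_b):
    """True if both items satisfy the context and take the rule's two competing values."""
    for attr, val in rule.context:
        if item_a.get(attr) != val or item_b.get(attr) != val:
            return False
    return {item_a.get(rule.attribute), item_b.get(rule.attribute)} == \
           {rule.preferred, rule.dispreferred}

def agreement(rule, observed_pairs):
    """Count observed (winner, loser) preference pairs the rule agrees with or contradicts."""
    agree = contradict = 0
    for winner, loser in observed_pairs:
        if matches(rule, winner, loser):
            if winner.get(rule.attribute) == rule.preferred:
                agree += 1
            else:
                contradict += 1
    return agree, contradict

def build_profile(candidate_rules, observed_pairs, min_agreement=0.8):
    """Soundness filter: keep rules whose agreement ratio meets the minimal threshold."""
    profile = []
    for rule in candidate_rules:
        ag, contra = agreement(rule, observed_pairs)
        if ag + contra > 0 and ag / (ag + contra) >= min_agreement:
            profile.append(rule)
    return profile

# Hypothetical usage: one rule, one observed preference pair that supports it.
rule = PreferenceRule(frozenset({("genre", "comedy")}), "decade", "2000s", "1980s")
pairs = [({"genre": "comedy", "decade": "2000s"}, {"genre": "comedy", "decade": "1980s"})]
print(build_profile([rule], pairs))
```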
62

Méthodes d'aide à la décision thérapeutique dans les cas des maladies rares : intérêt des méthodes bayésiennes et application à la maladie de Horton / Methods to support clinical decision making in rare diseases : interest of Bayesian methods and application to Horton's disease

Hajj, Paméla El, 29 September 2017
Rare diseases are those that affect only a small number of people; this rarity creates specific methodological problems, because the sample size needed to reach a conventional frequentist power is not adapted to the number of available patients. We therefore systematically searched the literature for publications describing the mathematical methods that have been used to study rare diseases, in order to identify innovative approaches that have been, or could be, used to overcome these methodological difficulties. Most of the proposed methods are deterministic and globally unsatisfactory, because the insufficient statistical power is difficult to correct. More attention has been given to Bayesian methods, which are recommended by several authors: a prior distribution, combined with the data of the current study, allows decisions to be drawn from a posterior distribution, but an informative prior on the unknown treatment effect must be introduced, and determining this prior is challenging. We worked on methods for determining the prior distribution, including the possibility of using information from historical studies and/or data from other, sufficiently similar ("neighboring") studies. On the one hand, we describe a Bayesian model designed to test the non-inferiority hypothesis of a trial resting on the assumption that methotrexate is more effective than corticosteroids alone. On the other hand, the thesis relies on the epsilon-contamination method, whose principle is to contaminate a prior that is not entirely satisfactory with a series of distributions drawn from studies sharing the same disease, the same treatment or the same population; contamination is a way of weighting the proximity of the information these studies provide. Finally, all the prior information can be summarized in a prior distribution elicited from expert opinion, collected at a meeting where the experts answered a questionnaire expressing their beliefs about the parameters of the Bayesian model.
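As a numerical illustration of the epsilon-contamination idea, the sketch below computes the posterior of a binomial response rate under a contaminated Beta prior; Beta-binomial conjugacy keeps everything in closed form, with the posterior again a Beta mixture whose weights are reweighted by each component's marginal likelihood. The value of eps, the prior parameters and the data are invented for illustration, and the thesis's actual model and contaminating class are not specified in the abstract.

```python
import numpy as np
from scipy.special import betaln

def contaminated_posterior(k, n, eps, base, contaminants):
    """Posterior of a binomial rate theta under the prior
    (1 - eps) * Beta(base) + eps * equal-weight mixture of Beta(contaminants)."""
    comps = [base] + list(contaminants)
    weights = [1.0 - eps] + [eps / len(contaminants)] * len(contaminants)
    post_params, log_w = [], []
    for (a, b), w in zip(comps, weights):
        post_params.append((a + k, b + n - k))
        # Component marginal likelihood (Beta-binomial, up to the common
        # binomial coefficient, which cancels in the normalization).
        log_w.append(np.log(w) + betaln(a + k, b + n - k) - betaln(a, b))
    log_w = np.array(log_w)
    post_weights = np.exp(log_w - log_w.max())
    post_weights /= post_weights.sum()
    return post_weights, post_params

# Hypothetical data: 4 responders out of 12 patients; a vague base prior
# contaminated (eps = 0.3) by two informative priors from "neighboring" studies.
w, params = contaminated_posterior(4, 12, 0.3, (1.0, 1.0), [(8.0, 4.0), (3.0, 9.0)])
post_mean = sum(wi * a / (a + b) for wi, (a, b) in zip(w, params))
print(w, params, post_mean)
```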
63

Etude et valorisation industrielle d'halophytes du littoral breton : biodiversité chimique et biologique / Study and industrial promotion of halophytes from the Brittany coast : chemical and biological biodiversity

Bréant, Lise, 17 January 2012
Four halophyte species, Silene maritima, Carpobrotus edulis, Senecio cineraria and Limonium latifolium, were studied phytochemically by bioassay-guided fractionation. This work led to the identification of some thirty bioactive metabolites able to limit the production of the reactive oxygen species involved in radical and/or inflammatory stress and/or to promote adipocyte lipolysis. Visualizing the impact of the biotope on the metabolome of Silene maritima allowed us to identify markers that play an important role in the adaptive capacity of this halophyte under environmental stress. The analysis of metabolic profiles clearly shows that the same individual, harvested at different times and/or in different biotopes, has a variable phytochemical composition. Plant biotechnology was used to test the effect of elicitors on the biosynthetic capacity of Silene maritima cell suspensions. We also developed in vitro culture conditions allowing the initiation of calluses from a protected halophyte, Crambe maritima; these calluses could ultimately serve to initiate cell suspensions suitable for industrial exploitation.
64

Issues and Challenges of Requirement Elicitation in Large Web Projects / Frågor och utmaningar av krav elicitation i stora webbprojekt

Sajjad, Umar; Hanif, Muhammad Qaisar, January 2010
Requirement elicitation is a critical activity in the requirements development process: it explores the requirements of stakeholders. The success or failure of the process rests on identifying the relevant stakeholders and discovering their needs, as well as on the quality of the requirements themselves, which is greatly influenced by the methods applied during elicitation. Only complete and well-structured requirements make these projects reliable. The common challenges analysts face during elicitation are ensuring effective communication between stakeholders and acquiring tacit knowledge. Most errors in systems are due to poor communication between user and analyst, and these errors require additional resources (time and money) to correct. Understandability problems during the elicitation process of large web projects can leave requirements ambiguous, inconsistent, incorrect and unusable. Different method families (conversational, observational, analytical and synthetic) are available to deal with these problems, and the challenge for analysts is to select an appropriate method, or set of methods, and apply it to gather clear, consistent and correct requirements. This study is based on the results of interviews conducted with professionals who have industrial experience in the development of web systems. The elicitation problems identified in the literature and in the interviews, along with the applicability of elicitation methods to requirements gathering in large web projects, are documented in this report. / Umar Sajjad: Charhoi, Kotli, Azad Kashmir, Pakistan. Muhammad Qaisar Hanif: Bhimber, Azad Kashmir, Pakistan.
65

Using storytelling to elicit tacit knowledge from subject matter experts in an organization

Classen, Selwyn Ivor, January 2010
Magister Commercii (Information Management) - MCom(IM)

Knowledge management has been the subject of mounting focus over the last several years. Research and literature in the area have grown, and organizations have come to realize that success is often determined by the ability to create, disseminate, and embody knowledge in products and services. This realization has led to increased interest in examining the ways in which knowledge can be effectively identified, elicited, codified, distributed and retained. When an employee leaves an organization, the knowledge they possess often goes with them; this loss can have a negative impact on the productivity and quality of the organization. Knowledge management seeks to minimize the loss of knowledge when an employee leaves, and one of the impediments it must overcome is the well-known tendency of people to hoard knowledge: people often withhold knowledge when they feel it provides them with a competitive advantage over others. This study was intended to provide the organization with an approach it can use to facilitate tacit knowledge elicitation by means of the storytelling method. In keeping with grounded theory principles, and using an interpretive approach, stories from subject matter experts were collected and coded into fitting knowledge management constructs. The coding of the stories into the various constructs was then refined by expert review, and Pearson's cross-correlation analysis was used as a supporting tool to determine and validate that the collected stories were classified correctly under the knowledge management constructs. The research findings demonstrate that storytelling is an effective means of eliciting tacit knowledge from experts. In addition, the research has, as a by-product, resulted in the construction of a knowledge management framework for storytelling.
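As a rough illustration of the supporting validation step, the snippet below computes the Pearson correlation between two raters' scores for a set of stories against one knowledge management construct; the scores are invented, since the abstract does not describe the actual coding scheme.

```python
import numpy as np

# Hypothetical scores (0-5) given to six stories against one KM construct,
# once by the researcher's coding and once by the expert review.
researcher = np.array([4, 2, 5, 1, 3, 4])
expert = np.array([5, 2, 4, 1, 3, 5])

r = np.corrcoef(researcher, expert)[0, 1]
print(f"Pearson r = {r:.2f}")  # values near 1 support the classification
```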
66

Apport et utilisation des méthodes d’inférence bayésienne dans le domaine des études cliniques diagnostiques / Contribution and use of Bayesian inference methods in the field of clinical diagnostic studies

Bastide, Sophie, 16 December 2016
Diagnostic studies are the clinical studies whose aim is the evaluation of a new diagnostic test. The central step of the evaluation process is the assessment of the new test's performance, i.e. the estimation of its sensitivity and specificity. Classically, the performance of a new test is assessed by comparison with a reference test assumed to be perfect, a "gold standard", which gives each subject's true status with respect to the disease of interest. In many clinical situations, however, several difficulties arise: no perfect gold standard exists; the gold standard cannot be applied to all subjects; the results of the tests are conditionally dependent; the sensitivity and/or specificity of a test varies with the conditions under which it is performed; or the test is repeated over time or interpreted by several readers or machines. A systematic methodological review was performed to inventory the Bayesian inference methods available for diagnostic studies and their use in practice. The focus on Bayesian methods was motivated by their theoretical advantages, which contrast with their relative underuse in the medical field. Many methods have been proposed to address these difficulties, with very complex developments when several difficulties are combined in the same situation; we mapped the combinations of methods that have, and have not yet, been developed. Their clinical use nevertheless remains limited, although it has increased in recent years.

In practice, we faced the problem of diagnosing pneumonia due to Pneumocystis jirovecii (PJ), a ubiquitous opportunistic fungus responsible for deep mycosis in immunocompromised patients. In this project, the results of four different PCR (polymerase chain reaction) assays were available, but no gold standard, with the additional difficulty of conditional dependence between the tests, since all four rest on the same principle. Two developments were carried out in parallel to address this problem: on the one hand, work on methods for eliciting prior information adapted specifically to diagnostic studies; on the other hand, the implementation of a statistical model adapted to four dependent tests in the absence of a gold standard. When no informative data are available in the literature, the elicitation of priors, a mandatory step of Bayesian inference, is carried out by interviewing experts in the field. Our work adapted existing elicitation methods, available in the clinical trials domain, specifically to diagnostic studies in order to obtain informative priors, and applied the resulting method to the four PJ PCR assays. Estimating diagnostic test performance in the absence of a gold standard relies efficiently on latent class models. Three such models had been developed for the case of two diagnostic tests: a model assuming conditional independence, a fixed-effects model and a random-effects model adjusting for conditional dependence between tests. We extend these three models to the situation of four diagnostic tests, with a parameterization that allows a clinical interpretation of the between-test covariances, in order to carry the methods from theory to practice. The models were then applied and compared in the estimation of the sensitivities and specificities of the four PJ PCR assays, using the informative priors obtained from the experts.
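To fix ideas, a minimal sketch of the conditional-independence latent class likelihood is shown below for T tests and no gold standard; the prevalence, sensitivities, specificities and counts are illustrative values, and the fixed-effects and random-effects extensions developed in the thesis, which add covariance terms for conditional dependence, are not reproduced here.

```python
import numpy as np

def pattern_prob(pattern, prev, se, sp):
    """P(result pattern) under a conditional-independence latent class model.
    pattern: tuple of 0/1 results for the T tests; se, sp: arrays of length T."""
    y = np.asarray(pattern)
    p_diseased = prev * np.prod(se**y * (1 - se)**(1 - y))
    p_healthy = (1 - prev) * np.prod((1 - sp)**y * sp**(1 - y))
    return p_diseased + p_healthy

def log_likelihood(counts, prev, se, sp):
    """Multinomial log-likelihood of the observed pattern counts."""
    return sum(n * np.log(pattern_prob(pat, prev, se, sp))
               for pat, n in counts.items())

# Hypothetical counts for four PCR assays (a few of the 16 possible patterns).
counts = {(1, 1, 1, 1): 30, (1, 1, 1, 0): 8, (0, 0, 0, 0): 55,
          (1, 0, 0, 0): 5, (0, 1, 1, 0): 2}
se = np.array([0.95, 0.90, 0.88, 0.92])  # assumed sensitivities
sp = np.array([0.93, 0.96, 0.97, 0.94])  # assumed specificities
print(log_likelihood(counts, prev=0.25, se=se, sp=sp))
```

In a Bayesian fit, this likelihood would be combined with the informative priors elicited from the experts to yield posterior distributions for prev, se and sp.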
67

Are Engineering Consultancies Really That Different? : A Customer Perspective

Uzdanavicius, Tomas, January 2013
This thesis is concerned with clients' perspectives on large engineering consultancies. The core aim of the work is to understand what clients evaluate when trying to define how engineering consultancies differ from one another. To explore this area, theories of marketing and branding in the professional services industry are analyzed; interviews with both clients and consultants were then held, and conclusions are drawn from the empirical findings and the theoretical perspective.
68

Deriving pilots’ knowledge structures for weather information: an evaluation of elicitation techniques

Raddatz, Kimberly R., January 1900
Doctor of Philosophy / Department of Psychology / Richard J. Harris

Systems that support or require human interaction are generally easier to learn, use, and remember when their organization is consistent with the user's knowledge and experiences (Norman, 1983; Roske-Hofstrand & Paap, 1986). Thus, in order for interface designers to truly design for the user, they must first have a way of deriving a representation of what the user knows about the domain of interest. The current study evaluated three techniques for eliciting knowledge structures that capture how General Aviation pilots think about weather information. Weather was chosen because of its varying implications for pilots of different levels of experience. Two elicitation techniques (Relationship Judgment and Card Sort) asked pilots to explicitly consider the relationships between 15 weather-related information concepts. The third technique, a Prime Recognition Task, used response times and priming to implicitly reflect the strength of relationships between concepts in semantic memory. The techniques were evaluated in terms of pilot performance, conceptual structure validity, and the resources required to employ them. Validity was assessed as the extent to which each technique identified differences in the organization of weather information among pilots of different experience levels. Multidimensional scaling was used to transform the proximity data collected by each technique into conceptual structures representing the relationships between concepts. Results indicated that Card Sort was the technique that most consistently tapped into knowledge structure affected by experience: only conceptual structures based on Card Sort data could be used both to discriminate between pilots of different experience levels and to accurately classify experienced pilots as "experienced". Card Sort was also the most efficient and effective technique to employ in terms of preparation time, time on task, flexibility, and face validity. The Card Sort provided opportunities for deliberation, revision, and visual feedback that allowed the pilots to engage in a deeper level of processing, at which experience may play a stronger role, whereas characteristics of the Relationship Judgment and Prime Recognition tasks (e.g., time pressure, independent judgments) may have pushed pilots toward a shallower, text-based level of processing (i.e., general semantic meaning) that is less affected by experience. Implications for menu structure design and assessment are discussed.
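As a sketch of the scaling step, the snippet below turns a dissimilarity matrix over weather concepts into a two-dimensional conceptual structure with metric multidimensional scaling; the concept names and distances are invented, and the study's actual proximity data and MDS settings are not given in the abstract.

```python
import numpy as np
from sklearn.manifold import MDS

concepts = ["ceiling", "visibility", "icing", "turbulence", "thunderstorm"]
# Hypothetical symmetric dissimilarities (0 = identical, 1 = unrelated),
# e.g. derived from card-sort co-occurrence or relatedness judgments.
D = np.array([[0.0, 0.2, 0.7, 0.6, 0.5],
              [0.2, 0.0, 0.6, 0.5, 0.4],
              [0.7, 0.6, 0.0, 0.3, 0.4],
              [0.6, 0.5, 0.3, 0.0, 0.3],
              [0.5, 0.4, 0.4, 0.3, 0.0]])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)  # 2-D layout of the conceptual structure
for name, (x, y) in zip(concepts, coords):
    print(f"{name:12s} {x:+.2f} {y:+.2f}")
```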
69

Developing Software Requirements for a Knowledge Management System that Coordinates Training Programs with Business Processes and Policies in Large Organizations

Kiper, James Richard, 01 January 2013
For large organizations, keeping instructional programs abreast of constantly changing business processes and policies presents a challenge. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to trace learning objects to processes and policies, training managers cannot conduct an effective training gap analysis in these areas; as a result, the corporate training picture is unclear and instructional needs cannot be accurately determined. The research addressed these problems by recognizing the need for linkages between an organization's business processes, its policies, and the learning objects that package the corresponding training content and deliver it to the workforce. The overall investigation was completed in three parts. In the first study, a thorough examination of the literature was conducted to determine the extent of the research problem and to provide a theoretical foundation for a solution. In the second study, an expert panel was used to elicit user needs for a knowledge management system that addresses training management shortcomings in a large law enforcement agency. Another expert panel from that agency validated and prioritized the user needs during the third study. Through a combination of research-based elicitation and validation techniques, an accurate list of natural-language software requirements emerged to represent the collective needs of the law enforcement training experts. The software requirements may now serve to analyze the capabilities of existing information technology systems or to form the basis for a request for proposal (RFP) to build the envisioned knowledge management system.
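As an illustration of the kind of linkage the envisioned system would maintain, the sketch below models a minimal traceability map between policies and learning objects and uses it for a training gap analysis; all names and records are hypothetical, since the abstract does not describe the system's actual data model.

```python
from dataclasses import dataclass

@dataclass
class LearningObject:
    lo_id: str
    title: str
    covers: set  # ids of the policies/processes this object trains on

def training_gaps(policy_ids, learning_objects):
    """Policy ids that no learning object currently covers."""
    covered = set()
    for lo in learning_objects:
        covered |= lo.covers
    return sorted(set(policy_ids) - covered)

def impacted_objects(policy_id, learning_objects):
    """Learning objects to review when a given policy changes."""
    return [lo for lo in learning_objects if policy_id in lo.covers]

# Hypothetical records
policies = {"POL-1", "POL-2", "POL-3"}
los = [LearningObject("LO-10", "Evidence handling basics", {"POL-1"}),
       LearningObject("LO-11", "Use-of-force refresher", {"POL-1", "POL-2"})]

print(training_gaps(policies, los))                          # ['POL-3']
print([lo.lo_id for lo in impacted_objects("POL-2", los)])   # ['LO-11']
```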
70

The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning, and Validation in Agile Processes

Domah, Darshan, 01 January 2013
Agile software development has become very popular around the world in recent years through methods such as Scrum and Extreme Programming (XP). The literature suggests that functionality is the primary focus in Agile processes, while non-functional requirements (NFR) are either ignored or ill-defined. For software to be of good quality, however, both functional requirements (FR) and NFR need to be taken into consideration; lack of attention to NFR has been documented as the cause of failure for many software projects, so special attention needs to be paid to NFR in Agile software development. By their very nature, Agile processes involve frequent changes, but these changes are often not well documented. This is especially true of NFR: while functional requirements are carefully identified, NFR are often not properly elicited. Once identified, NFR become the basis for reasoning and for facilitating design and development decisions, and they must be validated through proper testing to ensure that their quality attributes are met in the final software product. This dissertation aimed to develop a methodology for addressing NFR in Agile processes: the "NERV Methodology: Non-Functional Requirements Elicitation, Reasoning, and Validation in Agile Processes". Several artifacts were created as part of this methodology, including the NFR Elicitation Taxonomy, the NFR Reasoning Taxonomy, the NFR Quantification Taxonomy, and the Non-Functional Requirements User Story Companion (NFRusCOM) Card. Additionally, the NERV Agility Index (NAI) was developed using the Agile Manifesto and its twelve principles. The NERV Methodology was validated using the 26 requirements of the European Union (EU) eProcurement Online System, with the results obtained by the NORMAP Methodology in previous research used as the baseline. The NERV Methodology successfully identified NFR in 55 of the 57 requirements sentences containing implicit NFR, compared to 50 for the baseline: a success rate of 96.49% versus 87.71%, an improvement of 8.78 percentage points. It also elicited 82 of 88 NFR, compared to 75 for the baseline: an elicitation success rate of 93.18% versus 85.24%, an improvement of 7.94 percentage points. Agility was validated on the same data set in two experiments that measured project duration in the two-week sprint iterations commonly used in Scrum. The first experiment, using the "FR and NFR Simultaneous Scheme", completed the full FR and NFR scope in 24 sprints; the second, using the "FR First Then NFR Scheme", consumed 26 sprints. The first scheduling scheme thus delivered the full scope two sprints earlier, a saving of almost one month. These results suggest that the NERV Methodology and its artifacts can be beneficial to software development organizations for eliciting, reasoning about, and validating NFR in Agile processes.
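The reported rates follow directly from the counts; a quick reconstruction of the percentages is shown below (from the raw counts the baselines round to 87.72% and 85.23%, close to the 87.71% and 85.24% quoted above).

```python
for label, hits, total in [("NERV identification", 55, 57),
                           ("baseline identification", 50, 57),
                           ("NERV elicitation", 82, 88),
                           ("baseline elicitation", 75, 88)]:
    print(f"{label}: {100 * hits / total:.2f}%")
```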
