  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
521

Harnessing tractability in constraint satisfaction problems

Carbonnel, Clément 07 December 2016
The Constraint Satisfaction Problem (CSP) is a fundamental NP-complete problem with many applications in artificial intelligence. This problem has enjoyed considerable scientific attention in the past decades due to its practical usefulness and the deep theoretical questions it relates to. However, there is a wide gap between practitioners, who develop solving techniques that are efficient for industrial instances but exponential in the worst case, and theorists, who design sophisticated polynomial-time algorithms for restrictions of CSP defined by certain algebraic properties. In this thesis we attempt to bridge this gap by providing polynomial-time algorithms to test for membership in a selection of major tractable classes. Even if the instance does not belong to one of these classes, we investigate the possibility of efficiently decomposing a CSP instance into tractable subproblems through the lens of parameterized complexity. Finally, we propose a general framework to adapt the concept of kernelization, central to parameterized complexity but hitherto rarely used in practice, to the context of constraint reasoning. Preliminary experiments on this last contribution show promising results.
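The kernelization concept mentioned in this abstract — preprocessing an instance in polynomial time into an equivalent instance whose size is bounded by a function of a parameter alone — can be illustrated on a textbook example from parameterized complexity. The sketch below implements Buss's kernel for Vertex Cover; it is not the thesis's constraint-reasoning framework, and the function name is illustrative.

```python
def vertex_cover_kernel(edges, k):
    """Buss's kernelization for Vertex Cover (a textbook illustration of
    kernelization, not the thesis's CSP-specific construction).

    Returns (reduced_edges, forced_vertices, remaining_budget), or None
    if the instance is provably a no-instance for parameter k."""
    edges = set(frozenset(e) for e in edges)
    forced = set()
    while True:
        # Rule 1: a vertex of degree > remaining budget must be in
        # every vertex cover of size <= k, so take it and delete it.
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        high = [v for v, d in degree.items() if d > k - len(forced)]
        if not high:
            break
        v = high[0]
        forced.add(v)
        edges = {e for e in edges if v not in e}
        if len(forced) > k:
            return None  # budget exhausted: no-instance
    # Rule 2: a yes-instance with max degree <= k' has at most k'^2 edges.
    k_rem = k - len(forced)
    if len(edges) > k_rem * k_rem:
        return None
    return edges, forced, k_rem
```

The surviving instance has size bounded by a function of k only — the defining property of a kernel.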
522

Intermittent Complexity Fluctuations during Ventricular Fibrillation

Schlemmer, Alexander 16 March 2017
No description available.
523

Performance analysis of multithreaded sorting algorithms

Nordin, Henrik, Jouper, Kevin January 2015
Context. Almost all modern computers have a CPU with multiple cores, providing extra computational power. In the new age of big data, parallel execution is essential to improve performance to an acceptable level, but parallelisation brings new challenges that need to be considered. Objectives. In this work, parallel algorithms are compared and analysed in relation to their sequential counterparts, using the Java platform, in order to find the potential speedup of multithreading and the factors that affect performance. In addition, source code is provided for multithreaded algorithms with proven time complexities. Methods. A literature study was conducted to gain knowledge and a deeper understanding of sorting algorithms and the area of parallel computing. An experiment followed, implementing a set of algorithms from which data could be gathered through benchmarking and testing. The gathered data was studied and analysed alongside its corresponding source code to assess the validity of parallelisation. Results. Multithreading does improve performance, with two threads on average providing a speedup of up to 2x and four threads up to 3x. However, the potential speedup is bound by the number of available physical threads of the CPU and depends on balancing the workload. Conclusions. Workload balancing, and using the correct number of threads in relation to the problem to be solved, need to be carefully considered in order to utilise the extra resources to their full potential.
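The parallel decomposition studied above can be sketched in miniature. The following is a hedged illustration in Python rather than the thesis's Java code: each chunk of the input is sorted in its own thread, then the sorted chunks are merged. Note that CPython's global interpreter lock limits CPU-bound speedup, so this sketch shows the structure of the decomposition, not the measured speedups reported in the thesis.

```python
import threading

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def threaded_merge_sort(data, threads=2):
    """Split the input into `threads` chunks, sort each chunk in its own
    thread, then merge pairwise. Total work stays O(n log n)."""
    if len(data) < 2 or threads < 2:
        return sorted(data)
    chunk = (len(data) + threads - 1) // threads
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    results = [None] * len(parts)

    def work(idx):
        results[idx] = sorted(parts[idx])  # each thread sorts one chunk

    workers = [threading.Thread(target=work, args=(i,))
               for i in range(len(parts))]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    merged = results[0]
    for part in results[1:]:
        merged = merge(merged, part)
    return merged
```

As the abstract notes, the useful number of chunks is bounded by the physical threads of the machine, and uneven chunk sizes leave some workers idle.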
524

Dial-a-Ride problems which take into account the quality of service

Chassaing, Maxime 04 December 2015
In this thesis, we are interested in modeling and solving various vehicle routing problems, especially passenger transportation problems. These problems require, among other things, that the proposed solutions guarantee a required quality of service. Several metaheuristics are proposed to obtain high-quality solutions within reasonable time. Three problems are addressed in turn: the Dial-A-Ride Problem (DARP), the Time-Dependent Vehicle Routing Problem (TDVRP) and the Stochastic DARP (SDARP). The DARP is the best-known on-demand passenger transportation problem in the literature. We propose an Evolutionary Local Search (ELS) method that relies on a new randomized constructive heuristic and on adaptive probabilities for selecting neighborhood structures. Compared with the best published methods on classical instances, the ELS method proves competitive in both computation time and solution quality. The TDVRP is an extension of the Vehicle Routing Problem (VRP) in which travel times between vertices vary throughout the day. In this problem, a distinction is made between driving time and the driver's working time; the difference corresponds to resting time, which is used during the routes to avoid driving in highly congested periods. The proposed method strategically places these rest breaks in order to reduce driving time, yielding new solutions that avoid congestion as much as possible. The last problem addressed is a stochastic DARP, in which travel times between clients are no longer deterministic but modeled by a probability distribution. The objective is to compute solutions that are robust to fluctuations of the travel times on the arcs. Two approaches are proposed: the first produces robust solutions that have a high probability of remaining feasible; the second generates a set of compromise solutions balancing robustness and cost.
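As a rough illustration of the ELS scheme described in this abstract (applied here to a plain travelling-salesman tour, not to the DARP with its quality-of-service constraints): a single incumbent solution is perturbed into several children, each child is improved by local search (2-opt below), and the best result becomes the new incumbent. All names and parameter choices are illustrative, not the thesis's implementation.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour over a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def two_opt(tour, dist):
    """Local search: reverse segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour

def els(dist, iterations=20, children=5, seed=0):
    """Evolutionary Local Search skeleton: keep one incumbent, generate
    perturbed children, locally optimize each, keep the best."""
    rng = random.Random(seed)
    n = len(dist)
    best = two_opt(list(range(n)), dist)
    for _ in range(iterations):
        champion = best
        for _ in range(children):
            child = best[:]
            i, j = sorted(rng.sample(range(n), 2))  # perturbation: swap
            child[i], child[j] = child[j], child[i]
            child = two_opt(child, dist)
            if tour_length(child, dist) < tour_length(champion, dist):
                champion = child
        best = champion
    return best, tour_length(best, dist)
```

The thesis's method additionally uses a randomized constructive heuristic for the initial solution and adaptive probabilities over several neighborhoods, which this sketch omits.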
525

Influence factors for local comprehensibility of process models

Figl, Kathrin, Laue, Ralf January 2015
The main aim of this study is to investigate human understanding of process models and to develop an improved understanding of its relevant influence factors. Aided by assumptions from cognitive psychology, this article attempts to address specific deductive reasoning difficulties based on process models. The authors developed a research model to capture the influence of two effects on the cognitive difficulty of reasoning tasks: (i) the presence of different control-flow patterns (such as conditional or parallel execution) in a process model and (ii) the interactivity of model elements. Based on solutions to 61 different reasoning tasks by 155 modelers, the results from this study indicate that the presence of certain control-flow patterns influences the cognitive difficulty of reasoning tasks. In particular, sequence is relatively easy, while loops in a model proved difficult. Modelers with higher process modeling knowledge performed better and rated subjective difficulty of loops lower than modelers with lower process modeling knowledge. The findings additionally support the prediction that interactivity between model elements is positively related to the cognitive difficulty of reasoning. Our research contributes to both academic literature on the comprehension of process models and practitioner literature focusing on cognitive difficulties when using process models.
526

The role of computational thinking in introductory computer science

Gouws, Lindsey Ann January 2014
Computational thinking (CT) is gaining recognition as an important skill for students, both in computer science and other disciplines. Although there has been much focus on this field in recent years, it is rarely taught as a formal course, and there is little consensus on what exactly CT entails and how to teach and evaluate it. This research addresses the lack of resources for integrating CT into the introductory computer science curriculum. The question that we aim to answer is whether CT can be evaluated in a meaningful way. A CT framework that outlines the skills and techniques comprising CT and describes the nature of student engagement was developed; this is used as the basis for this research. An assessment (CT test) was then created to gauge the ability of incoming students, and a CT-specfic computer game was developed based on the analysis of an existing game. A set of problem solving strategies and practice activities were then recommended based on criteria defined in the framework. The results revealed that the CT abilities of first year university students are relatively poor, but that the students' scores for the CT test could be used as a predictor for their future success in computer science courses. The framework developed for this research proved successful when applied to the test, computer game evaluation, and classification of strategies and activities. Through this research, we established that CT is a skill that first year computer science students are lacking, and that using CT exercises alongside traditional programming instruction can improve students' learning experiences.
527

Complexity analysis of random convex hulls

Thomasse, Rémy 18 December 2015
In this thesis, we give new results on the average size of convex hulls of points chosen in a convex body. This size is known when the points are chosen uniformly (and independently) in a convex polytope or in a sufficiently "smooth" convex body, and also when the points are chosen independently according to a centered Gaussian distribution. In the first part of this thesis, we introduce a technique that yields new results when the points are chosen arbitrarily in a convex body and then noised by random perturbations. This kind of analysis, called smoothed analysis, was initially developed by Spielman and Teng in their study of the simplex algorithm. For an arbitrary set of points in a ball, we obtain lower and upper bounds on this smoothed complexity, in the case of uniform perturbations in a ball (in arbitrary dimension) and in the case of Gaussian perturbations in dimension 2. The expected size of the convex hull of points chosen uniformly in a convex body is polylogarithmic when the body is a polytope and polynomial when it is smooth. In the second part, we construct a convex body for which this expected size oscillates between these two behaviors as the number of points increases. In the last part, we present an algorithm to efficiently generate a random convex hull of points chosen uniformly and independently in a disk, without explicitly generating all the points, and we analyze its average time and space complexity. The algorithm has been implemented in C++ and integrated into the CGAL library.
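For context, the naive baseline that the thesis's generator improves upon can be sketched as follows: draw uniform points in the unit disk explicitly, then compute their convex hull with Andrew's monotone chain. This is an illustrative sketch, not the CGAL implementation from the thesis.

```python
import math
import random

def random_disk_points(n, rng):
    """Uniform points in the unit disk, via the sqrt-radius trick
    (radius = sqrt(U) makes the area density uniform)."""
    pts = []
    for _ in range(n):
        r = math.sqrt(rng.random())
        theta = rng.uniform(0.0, 2.0 * math.pi)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

def convex_hull(points):
    """Andrew's monotone chain, O(n log n); returns the hull vertices
    in counter-clockwise order, collinear points removed."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

The baseline spends most of its time generating and sorting points that end up strictly inside the hull; the thesis's contribution is to produce the hull directly without materializing them.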
528

Reachability games with counters: decidability and algorithms

Reichert, Julien 30 July 2015
This thesis is devoted to a general study of reachability games on systems with counters. In this kind of game, the objective of one of the two players is to reach a particular configuration: a pair composed of a vertex of the game arena and a tuple of values for the counters. The counter values are updated, usually by vector additions, when an edge is taken. The decision problem associated with a reachability game is whether a player has a winning strategy from a given configuration, in other words whether the configuration is winning. When this problem is decidable, we ask whether the set of winning configurations can also be described. In our study, we examine various features of counter reachability games, looking for similarities or, on the contrary, differences in the decidability or complexity of the decision problem when one of the features is modified. The main feature we consider is what happens when a counter would become negative; we focus primarily on three semantics. We also consider other features along which decidability and complexity can be compared. Finally, we introduce a model called "robot games", on which we obtain our main results: an algorithm with asymptotically optimal complexity in dimension one, and a proof of undecidability in dimension three.
529

Improving Novice Analyst Performance in Modeling the Sequence Diagram in Systems Analysis: A Cognitive Complexity Approach

Sin, Thant 18 February 2009
The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development and is widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address them. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e. objects and classes, and is one of the most commonly used UML diagrams in practice. Yet there has been little research on sequence-diagram modeling, and the current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would greatly benefit novice analysts who, unlike experienced systems analysts, lack the prior experience needed to easily learn how to develop a sequence diagram; there is a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of approaches provided in many textbooks as well as the practitioner literature. The results indicated that novice analysts performed better using the CHOP technique, an outcome that seems to have been enabled by the pattern-based heuristics the technique provides. Novice analysts also rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that CHOP is an effective sequence-diagram modeling technique for novice analysts.
530

Processing terror : an investigation into the immediate and short-term psychological effects of a terrorist attack

Jhangiani, Rajiv Sunil 05 1900
In the years since the 9/11 attacks the incidence of terrorism has been on the rise. At the same time, news media coverage of major terrorist attacks has reached epic proportions, greatly expanding the number of individuals psychologically affected by terrorism. The goal of this dissertation is to better understand how individuals cope with terrorism experienced at a distance. Specifically, this investigation focuses on the impact of stress on integrative complexity (IC; a measure of cognitive processing; Suedfeld, Tetlock, & Streufert, 1992) during and shortly after a major terrorist event. Taken together, the findings from the three studies reported in this dissertation provide several insights into this process. Study 1 replicates and extends results from an earlier study of television newscasters reporting live on 9/11 (Jhangiani & Suedfeld, 2005), in the context of the 2005 London bombings and the medium of radio. In doing so, it provides the first empirical evidence outside of the research laboratory for the curvilinear relationship between stress and IC. Specifically, during the early stages of reports concerning the London bombings, a positive relationship is found between negative emotion and IC. However, once the nature and extent of the event become clearer, increases in negative emotion are related to decreases in IC (the disruptive stress hypothesis). Study 2 replicates this curvilinear relationship in the short-term reactions of two prominent political leaders to 9/11 and the 2005 London bombings. For one of these political leaders, the magnitude of his psychological reaction is moderated by the psychological distance between him and the victims of the attacks. Finally, Study 3 finds that two key personality variables, neuroticism and empathy, play important roles in determining the magnitude of the short-term psychological reactions to 9/11 of more than 250 students from Canada and the United States. 
This finding is particularly true for those students who were psychologically closer to the victims of the attacks. Implications, strengths and limitations of this research, and possible future directions are discussed.
