951

Early screening diagnostic aid for heart disease using data mining : An evaluation using patient data that can be obtained without medical equipment

Olsson, Anna, Nordlöf, Denise January 2015 (has links)
Heart disease is the leading cause of death in the world. Being able to conduct an early screening diagnosis of heart disease at home could potentially become a tool for reducing the number of people who lose their lives to the disease in the future. This report investigates whether an early screening diagnostic aid can be created using only attributes that do not require advanced medical equipment to measure, while achieving the same level of accuracy as previous data sets and studies. A literature study of the medical background, patient data sets and attributes, and data mining was conducted. A unique home data set, consisting of attributes that can be obtained at home, was created, and data mining experiments were run in WEKA using the classification algorithms Naive Bayes and Decision Trees. The results are compared with the Cleveland data set with respect to accuracy. The study shows that the home data set does not deliver the same accuracy level as the Cleveland data set. However, the idea that similar accuracy can be obtained for the different sets has not been disproven, and more exhaustive research is encouraged.
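The experiments described above were run in WEKA; purely as a hedged illustration of the same kind of comparison, the sketch below uses scikit-learn instead, with a hypothetical `home_dataset.csv` whose columns stand in for attributes measurable without medical equipment.

```python
# Hedged sketch: the thesis ran its experiments in WEKA; this analogous example
# uses scikit-learn. File name and column layout are hypothetical stand-ins.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Hypothetical "home" data set: only attributes obtainable without equipment
# (categorical attributes would need numeric encoding first).
data = pd.read_csv("home_dataset.csv")
X = data.drop(columns=["diagnosis"])   # predictor attributes
y = data["diagnosis"]                  # 0 = no heart disease, 1 = heart disease

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Decision tree", DecisionTreeClassifier(random_state=0))]:
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean 10-fold accuracy = {scores.mean():.3f}")
```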
952

Constructing a Scheduling Algorithm For Multidirectional Elevators

Edlund, Joakim, Berntsson, Fredrik January 2015 (has links)
With this thesis we aim to create an efficient scheduling algorithm for elevators that can move in multiple directions, and to establish if and when this algorithm is efficient compared with algorithms constructed for traditional elevators. To measure efficiency, a simulator is constructed that simulates an elevator system implementing the different algorithms. Because constructing a simulator is a challenge in itself, and because we found neither a simulator nor any algorithms for multidirectional elevator systems publicly available, we decided to focus on these subjects. The results in this thesis lead us to the conclusion that a multidirectional elevator algorithm is efficient to use under certain circumstances: if the traffic is concentrated to one floor of the building, the multidirectional elevator system performs poorly, but when the traffic is spread out it outperforms traditional elevator algorithms. We hope that this work will inspire further research in the area of multidirectional elevator systems.
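The abstract does not reproduce the scheduling algorithm itself, so the following is only a hedged sketch of one plausible dispatching rule for a multidirectional system, not the authors' implementation: cars positioned on a hypothetical (shaft, floor) grid are assigned to a hall call by Manhattan-distance cost.

```python
# Hedged illustration of a greedy "nearest car" rule for multidirectional
# elevators. The (shaft, floor) grid model is an assumption for this sketch.
from dataclasses import dataclass

@dataclass
class Car:
    shaft: int   # horizontal position in the hypothetical grid
    floor: int   # vertical position

def multidirectional_cost(car: Car, call_shaft: int, call_floor: int) -> int:
    # A multidirectional car can move both vertically and horizontally,
    # so Manhattan distance is a natural travel-cost estimate.
    return abs(car.shaft - call_shaft) + abs(car.floor - call_floor)

def assign_car(cars: list[Car], call_shaft: int, call_floor: int) -> Car:
    # Pick the car with the lowest estimated travel cost to the call.
    return min(cars, key=lambda c: multidirectional_cost(c, call_shaft, call_floor))

cars = [Car(shaft=0, floor=10), Car(shaft=2, floor=3)]
print(assign_car(cars, call_shaft=1, call_floor=4))  # -> Car(shaft=2, floor=3)
```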
953

Benchmarking Beginner Algorithms for Rubik's Cube

Nilsson, Andreas, Spång, Anton January 2015 (has links)
Over the years, different algorithms have been developed to solve parts of the Rubik's cube step by step until finally reaching the unique solved state. This thesis explores two commonly known beginner algorithms for solving the Rubik's cube to find how they differ in solving speed and number of moves. The algorithms were implemented and run on a large number of scrambled cubes to collect data. The results showed that the Layer-by-layer with daisy algorithm had a lower average number of moves than the Dedmore algorithm. The main difference in number of moves lies in the steps that solve the last layer of the cube. The Layer-by-layer with daisy algorithm uses only one-seventh of the time-consuming operations that the Dedmore algorithm uses, which leads to the conclusion that it is more suitable for speedcubing.
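As a hedged sketch of the benchmarking setup described above (not the authors' code), the harness below records average move count and solve time per method; `scramble_cube`, `solve_layer_by_layer_daisy` and `solve_dedmore` are hypothetical placeholders for the actual implementations.

```python
# Hedged benchmarking harness: solve many scrambled cubes with each method and
# record the average number of moves and average wall-clock time.
import time
import statistics

def benchmark(solver, cubes):
    moves, seconds = [], []
    for cube in cubes:
        start = time.perf_counter()
        solution = solver(cube)               # assumed to return a list of moves
        seconds.append(time.perf_counter() - start)
        moves.append(len(solution))
    return statistics.mean(moves), statistics.mean(seconds)

# Hypothetical usage (solver and scramble functions not shown here):
# cubes = [scramble_cube() for _ in range(10_000)]
# for name, solver in [("Layer-by-layer with daisy", solve_layer_by_layer_daisy),
#                      ("Dedmore", solve_dedmore)]:
#     avg_moves, avg_time = benchmark(solver, cubes)
#     print(f"{name}: {avg_moves:.1f} moves, {avg_time * 1000:.2f} ms on average")
```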
954

A Comparison of Traditional Elevator Control Strategies

Effati, Shayan, Alipoor, Donia January 2015 (has links)
The purpose of this paper is to investigate which of a specific set of elevator control strategies is the most time-efficient from the passenger's point of view in a specific office building. The paper first goes through five different strategies and approaches, followed by results from a simulation of some of the strategies and their combinations. From the test results, it can be concluded that one strategy works best for all scenarios likely to occur during a regular working day, with regard to both the average waiting time and the average travel time of a passenger. The best-performing strategy could be optimized further for more precise results, but this would not change the outcome of the comparison.
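As a hedged illustration of how the two metrics the comparison rests on might be computed from a simulator's passenger log (the thesis's simulator and strategies are not reproduced here), consider the minimal sketch below; the record fields are assumptions.

```python
# Hedged sketch: derive average waiting time and average travel time from
# hypothetical per-passenger simulation records.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Passenger:
    call_time: float      # when the hall button was pressed
    board_time: float     # when the passenger entered the car
    arrival_time: float   # when the passenger reached the destination floor

def summarize(passengers: list[Passenger]) -> tuple[float, float]:
    avg_wait = mean(p.board_time - p.call_time for p in passengers)      # waiting at the floor
    avg_travel = mean(p.arrival_time - p.board_time for p in passengers) # riding in the car
    return avg_wait, avg_travel

log = [Passenger(0.0, 12.5, 40.0), Passenger(5.0, 9.0, 31.0)]
print(summarize(log))  # -> (8.25, 24.75)
```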
955

Detecting Epileptic Seizures : Optimal Feature Extraction from EEG for Support Vector Machines

Grippe, Edward, Lönnerberg, Mattias January 2015 (has links)
Epilepsy is a chronic neurological brain disorder that causes the affected to have seizures. By looking at EEG recordings, an expert is able to identify epileptic activity and diagnose patients with epilepsy. This process is time-consuming and calls for automation. The automation is done through feature extraction and classification: feature extraction derives features from the signal, and classification uses those features to classify the signal as epileptic or not. The accuracy of the classification varies depending both on which features are chosen to represent each signal and on which classification method is used. One popular method for classifying data is the SVM. This report tests and analyses six feature extraction methods with a linear SVM to see which method results in the best classification performance when classifying epileptic EEG data. The results showed that two methods gave classification accuracies significantly higher than the rest: the wavelet-based method reached a classification accuracy of 98.83%, and the Hjorth features method reached 97.42%. However, the results of these two methods were too similar to be considered significantly different, and therefore no conclusion could be drawn as to which was the best.
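As a hedged sketch of one of the feature extraction methods named above, the snippet below computes the three classical Hjorth parameters (activity, mobility, complexity) per EEG segment and feeds them to a linear SVM; scikit-learn and the data layout are assumptions, not necessarily the thesis's actual toolchain.

```python
# Hedged sketch: Hjorth parameters per EEG segment + linear SVM classification.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def hjorth_features(segment: np.ndarray) -> np.ndarray:
    d1 = np.diff(segment)                      # first discrete derivative
    d2 = np.diff(d1)                           # second discrete derivative
    activity = np.var(segment)                 # signal variance
    mobility = np.sqrt(np.var(d1) / activity)  # sqrt(var(y') / var(y))
    complexity = np.sqrt(np.var(d2) / np.var(d1)) / mobility  # mobility(y') / mobility(y)
    return np.array([activity, mobility, complexity])

# Hypothetical usage: `segments` is an (n_segments, n_samples) EEG array and
# `labels` marks each segment as epileptic (1) or not (0).
# X = np.array([hjorth_features(s) for s in segments])
# clf = LinearSVC()
# print(cross_val_score(clf, X, labels, cv=10, scoring="accuracy").mean())
```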
956

Comparison of Rubik’s Cube Solving Methods Made for Humans

Duberg, Daniel, Tideström, Jakob January 2015 (has links)
This study analyzed and compared four different methods of solving a Rubik's Cube: the method on Rubik's official website, the CFOP method, the Roux method and the ZZ method. The factors considered were the number of moves each method requires to solve a Rubik's Cube, how many algorithms each requires, and how concrete or intuitive they are. Our conclusion is that the CFOP, Roux and ZZ methods are fairly equivalent when it comes to move span, but CFOP has the lowest average number of moves used to solve a Rubik's Cube. CFOP has more concrete algorithms and cases, while both Roux and ZZ are more intuitive; ZZ, however, uses fewer types of moves than Roux. The solution on Rubik's official website does not compare favorably: at its best it uses as many moves as the others do at their worst. It is, however, concrete and uses few algorithms for each part.
957

How hard is Wings of Vi? : An analysis of the computational complexity of the game Wings of Vi

Lindblad, Max January 2015 (has links)
Computational complexity theory is the study of the inherent difficulty of different computational problems. By determining the complexity class of a problem, you can learn a lot about how hard the problem is to solve. For games, the complexity class sets a kind of upper limit on how hard they can be: any NP-complete game can be made extremely difficult both to play and to analyze. The purpose of this study is to analyze the computational complexity of the game Wings of Vi, which is shown to be both NP-hard and in NP, and thus NP-complete.
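For reference, the two obligations behind such a claim are the standard ones; the statement below is the general definition, not the thesis's specific reduction.

```latex
% Standard structure of an NP-completeness proof (reference definition only).
A decision problem $L$ is \textbf{NP-complete} iff
\begin{enumerate}
  \item $L \in \mathrm{NP}$: a proposed solution (e.g.\ a sequence of player
        inputs) can be verified in polynomial time, and
  \item $L$ is \textbf{NP-hard}: some known NP-complete problem $L'$ reduces
        to $L$ in polynomial time, i.e.\ $L' \leq_p L$.
\end{enumerate}
```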
958

NLI Games as Teaching Material : A study of the effectiveness of NLI games in education compared with ordinary reading for pupils aged 10-15 years

Lando, Emilio, Bank, Jakob January 2015 (has links)
Low motivation and poor results in school are a problem today. The purpose of this study was to investigate whether learning with the help of an NLI game is more effective in education than ordinary reading for children and adolescents between 10 and 15 years of age. This was examined by comparing learning via an NLI game with learning via reading, where both the game and the text contained exactly the same facts. The comparison was made using a theoretical test, and the results of the two groups were analyzed. The outcome was positive: the group that played the game had, on average, more correct answers on the test than those who read. Because of the small number of test subjects in the target group, the study was supplemented with adult test subjects to increase the reliability of the results. The overall result remained the same, and learning via the game was advantageous, although the difference was considerably smaller for the adult test subjects. One conclusion drawn is that this approach to education is best suited for children and adolescents who find reading and theoretical learning unmotivating and boring.
959

Compressing sparse graphs to speed up Dijkstra’s shortest path algorithm

Bergdorf, Johan, Norman, Jesper January 2015 (has links)
One of the problems arising from the continuously growing amount of data is that it slows down and limits the use of large graphs in real-world situations. Because of this, studies are being done to investigate the possibility of compressing the data in large graphs. This report presents an investigation into the usefulness of compressing sparse graphs and then applying Dijkstra's shortest path algorithm. A minimum spanning tree algorithm was used to compress a graph and compared with a self-implemented compression algorithm. The minimal distances, and how long it takes for Dijkstra's algorithm to find the shortest path between nodes, are investigated. The results show that it is not worth compressing the type of sparse graphs used in this study: it is hard to compress the graph without losing too many of the edges that preserve the shortest paths, and the time gained when running Dijkstra's algorithm on the compressed graphs is not enough to compensate for the loss in shortest-path quality.
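As a hedged sketch of the experimental setup (not the authors' implementation), the snippet below builds a "compressed" graph by keeping only minimum-spanning-tree edges (Kruskal's algorithm) and then runs Dijkstra's algorithm on it; graphs are plain adjacency dicts and the example data is hypothetical.

```python
# Hedged sketch: MST-based edge reduction followed by Dijkstra's algorithm.
import heapq

def dijkstra(graph: dict, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def mst_compress(edges):
    # Kruskal: keep only edges that join previously disconnected components.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x
    kept = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            kept.append((u, v, w))
    return kept

edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 4), ("c", "d", 1)]
graph = {}
for u, v, w in mst_compress(edges):
    graph.setdefault(u, []).append((v, w))
    graph.setdefault(v, []).append((u, w))
print(dijkstra(graph, "a"))  # distances in the compressed graph: {'a': 0, 'b': 1, 'c': 3, 'd': 4}
```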
960

Applying Design Patterns in the implementation of a simple packet filter

Fridström, Henrik, Sjöberg, Linus January 2015 (has links)
By reading this thesis the reader will get insight into if and how Design Patterns can help in the grand scheme of software development. Practice is combined with theory in an empirical study to see whether Design Patterns are a feasible method for implementing a simple packet filtering solution. The resulting application uses the Chain of Responsibility and Singleton Design Patterns together with the Security Pattern Packet Filter Firewall. The thesis presents how Design Patterns assisted in the design and successful implementation of the packet filter, together with a discussion of the benefits, limitations and drawbacks of the solution. Finally, the reader is shown how further studies can build upon the given results.
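As a hedged sketch of how Chain of Responsibility can structure a simple packet filter (an illustration in Python, not the thesis's design), each rule in the chain below either handles a packet or forwards it to its successor; the rule classes and default policy are assumptions.

```python
# Hedged illustration of the Chain of Responsibility pattern for packet filtering.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    src_ip: str
    dst_port: int

class FilterRule:
    def __init__(self, successor: Optional["FilterRule"] = None):
        self.successor = successor

    def handle(self, packet: Packet) -> str:
        if self.matches(packet):
            return self.verdict(packet)       # this rule decides
        if self.successor:
            return self.successor.handle(packet)  # pass along the chain
        return "ACCEPT"                       # assumed default policy at end of chain

    def matches(self, packet: Packet) -> bool:
        raise NotImplementedError

    def verdict(self, packet: Packet) -> str:
        raise NotImplementedError

class BlockPort(FilterRule):
    def __init__(self, port: int, successor=None):
        super().__init__(successor)
        self.port = port
    def matches(self, packet): return packet.dst_port == self.port
    def verdict(self, packet): return "DROP"

class BlockSource(FilterRule):
    def __init__(self, ip: str, successor=None):
        super().__init__(successor)
        self.ip = ip
    def matches(self, packet): return packet.src_ip == self.ip
    def verdict(self, packet): return "DROP"

chain = BlockSource("10.0.0.66", successor=BlockPort(23))
print(chain.handle(Packet("192.168.1.5", 23)))   # DROP (destination port 23 blocked)
print(chain.handle(Packet("192.168.1.5", 443)))  # ACCEPT (no rule matches)
```

New rules can be added or reordered without touching existing ones, which is the main appeal of this pattern for firewall-style rule chains.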
