1

A reverse counterfactual analysis of causation

Broadbent, Alex, January 2007
Lewis's counterfactual analysis of causation starts with the claim that c causes e if ~C > ~E, where c and e are events, C and E are the propositions that c and e respectively occur, ~ is negation and > is the counterfactual conditional. The purpose of my project is to provide a counterfactual analysis of causation which departs significantly from Lewis's starting point, and thus can hope to solve several stubborn problems for that approach. Whereas Lewis starts with a sufficiency claim, my analysis claims that a certain counterfactual is necessary for causation. I say that, if c causes e, then ~E > ~C; I call the latter the Reverse Counterfactual. This will often, perhaps always, be a backtracking counterfactual, so two chapters are devoted to defending a conception of counterfactuals which allows backtracking. Thus prepared, I argue that the Reverse Counterfactual is true of causes, but not of mere conditions for an effect. This provides a neat analysis of the principles governing causal selection, which is extended in a discussion of causal transitivity. Standard counterfactual accounts suffer counterexamples from preemption, but I argue that the Reverse Counterfactual has the resources to deal neatly with those too. Finally, I argue that the Reverse Counterfactual, as a necessary condition on causation, is the most we can hope for: in principle, there can be no counterfactual sufficient condition for causation.
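The two conditionals contrasted in this abstract can be set out side by side. The following is a minimal LaTeX sketch, not the author's own formalism; the box-arrow symbol is an assumed stand-in for the counterfactual conditional written '>' above, and requires the amssymb package for \Box.

% Lewis's sufficiency claim: if ~C > ~E, then c causes e.
\[ (\neg C \;\Box\!\!\to\; \neg E) \;\Rightarrow\; c \text{ causes } e \]
% The Reverse Counterfactual as a necessary condition: if c causes e, then ~E > ~C.
\[ c \text{ causes } e \;\Rightarrow\; (\neg E \;\Box\!\!\to\; \neg C) \]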
2

Evaluation of the Complexity of Procedurally Generated Maze Algorithms

Karlsson, Albin, January 2018
Background. Procedural Content Generation (PCG) in video games can be used as a tool for efficiently producing large varieties of new content using less manpower, making it ideal for smaller teams of developers who want to compete with games made by larger teams. One particular facet of PCG is the generation of mazes. Designers who want their game to feature mazes also need to know how to evaluate maze complexity, in order to know which maze best fits the difficulty curve. Objectives. This project aims to investigate the difference in complexity between the maze generation algorithms recursive backtracker (RecBack), Prim's algorithm (Prims), and recursive division (RecDiv), in terms of completion time, when the mazes are solved using a depth-first search (DFS) algorithm. In order to understand which parameters affect completion time/complexity, it also investigates possible connections between completion time and the distribution of branching paths, the distribution of corridors, and the length of the path traversed by DFS. Methods. The main methodology was an implementation in the form of a C# application, which randomly generated 100 mazes for each algorithm at five different maze grid resolutions (16x16, 32x32, 64x64, 128x128, 256x256). Each generated maze was solved using a DFS algorithm, whose traversed nodes, solving path, and completion time were recorded. Additionally, branch distribution and corridor distribution data were gathered for each generated maze. Results. The results showed that mazes generated by Prims had the lowest complexity (shortest completion time), the shortest solving path, the fewest traversed nodes, and the lowest proportion of 2-branches, but the highest proportion of all other branch types. Additionally, Prims had the highest proportion of 4-6 length paths, but the lowest proportion of 2- and 3-length paths. Mazes generated by RecDiv had intermediate complexity, an intermediate solving path, an intermediate number of traversed nodes, intermediate proportions of all branch types, and the highest proportion of 2-length paths, but the lowest proportion of 4-6 length paths. Finally, mazes generated by RecBack showed the opposite statistics from Prims: the highest complexity, the longest solving path, the most traversed nodes, and the highest proportion of 2-branches, but the lowest proportion of all other branch types, together with the highest proportion of 3-length paths and the lowest of 2-length paths. Conclusions. Prims had the lowest complexity, RecDiv intermediate complexity, and RecBack the highest complexity. Increased solving-path length, more traversed nodes, and higher proportions of 2-branches seem to correlate with increased complexity. However, the corridor distribution results are too small and diverse to discern a pattern affecting completion time by observing the data alone. A minimal sketch of the kind of measurement pipeline described here follows below.
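The sketch below is in Python rather than the thesis's C# application and is not the author's implementation; it illustrates the pipeline the abstract describes by generating one maze with the recursive-backtracker algorithm, solving it with DFS, and recording the solving path, traversed nodes, and completion time. The grid size and the passage representation are assumptions made for illustration only.

# Minimal sketch: recursive-backtracker maze generation + DFS solve with metrics.
import random
import time

def generate_recursive_backtracker(width, height):
    """Carve a maze with an iterative recursive-backtracker (randomized DFS).
    Returns a set of passages, each a frozenset of two adjacent connected cells."""
    passages = set()
    visited = {(0, 0)}
    stack = [(0, 0)]
    while stack:
        x, y = stack[-1]
        neighbours = [(x + dx, y + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= x + dx < width and 0 <= y + dy < height
                      and (x + dx, y + dy) not in visited]
        if neighbours:
            nxt = random.choice(neighbours)
            passages.add(frozenset(((x, y), nxt)))
            visited.add(nxt)
            stack.append(nxt)
        else:
            stack.pop()
    return passages

def solve_dfs(passages, start, goal):
    """Depth-first search from start to goal.
    Returns (solving path, number of traversed nodes, completion time in seconds)."""
    t0 = time.perf_counter()
    stack = [start]
    parent = {start: None}
    traversed = 0
    while stack:
        cell = stack.pop()
        traversed += 1
        if cell == goal:
            break
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if frozenset((cell, nxt)) in passages and nxt not in parent:
                parent[nxt] = cell
                stack.append(nxt)
    # Reconstruct the solving path from goal back to start.
    # The generator carves a spanning tree, so the goal is always reachable.
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = parent[cell]
    return path[::-1], traversed, time.perf_counter() - t0

if __name__ == "__main__":
    n = 64  # one of the grid resolutions used in the study
    maze = generate_recursive_backtracker(n, n)
    path, nodes, seconds = solve_dfs(maze, (0, 0), (n - 1, n - 1))
    print(f"path length={len(path)}, traversed nodes={nodes}, time={seconds:.4f}s")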
