About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD).

Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1. How to Teach an Old Dog New Tricks: Quantum Information, Quantum Computing, and the Philosophy of Physics

Duwell, Armond James, 31 January 2005
My dissertation consists of two independent parts. Part one examines concepts of quantum information. I clarify three very different concepts of information and assess their implications for understanding quantum mechanics. First I clarify the concept of information due to Shannon and its relation to physical theories. Using the Shannon concept, I examine two purportedly new concepts of quantum information. I argue that a fundamental philosophical mistake is made regarding these concepts: their advocates do not properly distinguish the properties information has in virtue of the physical medium in which it is stored from the properties of information per se. This distinction is crucial both for developing a new concept to help us understand quantum mechanics and for evaluating its merits. Part two examines explanations of the efficiency that quantum computers enjoy over classical computers for some computational tasks, and the relationship between explanations of efficiency and interpretations of quantum mechanics. I examine the so-called quantum parallelism thesis, the claim that quantum computers can perform many computations in a single step, a feat thought not to be possible on classical computers. The truth of this thesis is not obvious, and it is contested by some. I develop a set of general criteria for computation that any computing device must satisfy, and I use these criteria to demonstrate that the quantum parallelism thesis is true. As an application of these general criteria, I articulate three distinct concepts of parallelism and demonstrate that classical computers can compute in parallel as well. This shows that the truth of the quantum parallelism thesis alone does not provide a complete explanation of the efficiency of quantum computers. I supplement the quantum parallelism thesis to provide a complete explanation.
Finally, I address the claim that only the many-worlds interpretation of quantum mechanics can underwrite the truth of the quantum parallelism thesis. The general criteria for computation provide support for the quantum parallelism thesis independent of any interpretation of quantum mechanics.
2. Empiricism and the Epistemic Status of Imaging Technologies

Delehanty, Megan Catherine, 04 October 2005
The starting point for this project was the question of how to understand the epistemic status of mathematized imaging technologies such as positron emission tomography (PET) and confocal microscopy. These instruments play an increasingly important role in virtually all areas of biology and medicine. Some of these technologies have been widely celebrated as having revolutionized various fields of study, while others have been the target of substantial criticism. Thus, it is essential that we be able to assess these technologies as methods of producing evidence. They differ from one another in many respects, but one feature they all have in common is the use of multiple layers of statistical and mathematical processing that are essential to data production. This feature alone means that they do not fit neatly into any standard empiricist account of evidence. Yet this failure to be accommodated by philosophical accounts of good evidence does not indicate a general inadequacy on their part, since by many measures they very often produce very high-quality evidence. In order to understand how they can do so, we must look more closely at old philosophical questions concerning the role of experience and observation in acquiring knowledge about the external world. Doing so leads us to a new, grounded version of empiricism. After distinguishing between a weaker and a stronger, anthropomorphic version of empiricism, I argue that most contemporary accounts of observation are what I call benchmark strategies: they rely, implicitly or explicitly, on the stronger version, according to which human sense experience holds a place of unique privilege. They attempt to extend the bounds of observation, and the epistemic privilege accorded to it, by establishing some type of relevant similarity to the benchmark of human perception. These accounts fail because they are unable to establish an epistemically motivated account of what relevant similarity consists in.
The last best chance for any benchmark approach, and indeed for anthropomorphic empiricism, is to supplement a benchmark strategy with a grounding strategy. Toward this end, I examine the Grounded Benchmark Criterion, which defines relevant similarity to human perception in terms of the reliability-making features of human perception. This account, too, must fail, owing to our inability to specify those features given the current state of understanding of the human visual system. However, this failure reveals that it is reliability alone that is epistemically relevant, not any other sort of similarity to human perception. Current accounts of reliability suffer from a number of difficulties, so I develop a novel account of reliability based on the concept of granularity. My account of reliability in terms of a granularity match both provides the means to refine the weaker version of empiricism and allows us to establish when and why imaging technologies are reliable. Finally, I use this account of granularity in examining the importance of the fact that the output of imaging technologies usually consists of images.
3. Explicating Emotions

Scarantino, Andrea, 10 October 2005
In the course of their long intellectual history, emotions have been identified with items as diverse as perceptions of bodily changes (feeling tradition), judgments (cognitivist tradition), behavioral predispositions (behaviorist tradition), biologically based solutions to fundamental life tasks (evolutionary tradition), and culturally specific social artifacts (social constructionist tradition). The first objective of my work is to put some order in the mare magnum of theories of emotions. I taxonomize them into families and explore the historical origin and current credentials of the arguments and intuitions supporting them. I then evaluate the methodology of past and present emotion theory, defending a bleak conclusion: a great many emotion theorists ask "What is an emotion?" without a clear understanding of what counts as getting the answer right. I argue that there are two ways of getting the answer right. One is to capture the conditions of application of the folk term "emotion" in ordinary language (Folk Emotion Project); the other is to formulate a fruitful explication of it (Explicating Emotion Project). Once we get clear on the desiderata of these two projects, we realize that several long-running debates in emotion theory are motivated by methodological confusions. The constructive part of my work is devoted to formulating a new explication of emotion suitable for the theoretical purposes of scientific psychology. At the heart of the Urgency Management System (UMS) theory of emotions I propose is the idea that an umotion is a special type of superordinate system which instantiates and manages an urgent action tendency by coordinating the operation of a cluster of cognitive, perceptual, and motoric subsystems. Crucially, such a superordinate system has a proper function by virtue of which it acquires a special kind of intentionality I call pragmatic.
I argue that "umotion" is sufficiently similar in use to "emotion" to count as an explication of it, that it has precise rules of application, and that it accommodates a number of central and widely shared intuitions about the emotions. My hope is that future emotion research will demonstrate the heuristic fruitfulness of the umotion concept for the sciences of mind.
4. Subjective Measures of Well-Being: A Philosophical Examination

Angner, Erik, 30 September 2005
Over the last couple of decades, as part of the rise of positive psychology, psychologists have given increasing attention to so-called subjective measures of well-being. These measures, which are supposed to represent the well-being of individuals and groups, are often presented as alternatives to more traditional economic ones for purposes of the articulation, implementation, and evaluation of public policy. Unlike economic measures, which are typically based on data about income, market transactions, and the like, subjective measures are based on answers to questions like: "Taking things all together, how would you say things are these days? Would you say you're very happy, pretty happy, or not too happy these days?" The aim of this dissertation is to explore issues in the philosophical foundations of subjective measures of well-being, with special emphasis on the manner in which those foundations differ from the foundations of traditional economic measures. A further goal is to examine some arguments for and against these measures, in particular arguments that purport to demonstrate the superiority of economic measures for purposes of public policy. My main thesis is that subjective measures of well-being cannot be shown to be inferior to economic measures quite as easily as some have suggested, but that they are nevertheless associated with serious problems, and that questions about the relative advantages of subjective and economic measures for purposes of public policy will depend on some fundamentally philosophical judgments, e.g. about the nature of well-being and the legitimate goals of public policy.
5. Computations and Computers in the Sciences of Mind and Brain

Piccinini, Gualtiero, 18 November 2003
Computationalism says that brains are computing mechanisms, that is, mechanisms that perform computations. At present, there is no consensus on how to formulate computationalism precisely or adjudicate the dispute between computationalism and its foes, or between different versions of computationalism. An important reason for the current impasse is the lack of a satisfactory philosophical account of computing mechanisms. The main goal of this dissertation is to offer such an account. I also believe that the history of computationalism sheds light on the current debate. By tracing different versions of computationalism to their common historical origin, we can see how the current divisions originated and understand their motivation. Reconstructing debates over computationalism in the context of their own intellectual history can contribute to philosophical progress on the relation between brains and computing mechanisms, and can help determine how brains and computing mechanisms are alike and how they differ. Accordingly, my dissertation is divided into a historical part, which traces the early history of computationalism up to 1946, and a philosophical part, which offers an account of computing mechanisms. The two main ideas developed in this dissertation are that (1) computational states are to be identified functionally, not semantically, and (2) computing mechanisms are to be studied by functional analysis. The resulting account, which I call the functional account of computing mechanisms, can be used to identify computing mechanisms and the functions they compute. I use the functional account to taxonomize computing mechanisms according to their computing power, and I use this taxonomy to classify different versions of computationalism according to the functional properties they ascribe to brains.
By doing so, I begin to tease out empirically testable statements about the functional organization of the brain that different versions of computationalism are committed to. I submit that when computationalism is reformulated in the more explicit and precise way I propose, the disputes about computationalism can be adjudicated on the grounds of empirical evidence from neuroscience.
6. Representations of Space in Seventeenth Century Physics

Miller, David Marshall, 02 June 2006
The changing understanding of the universe that characterized the birth of modern science included a fundamental shift in the prevailing representation of space: the presupposed conceptual structure that allows one to intelligibly describe the spatial properties of physical phenomena. At the beginning of the seventeenth century, the prevailing representation of space was spherical. Natural philosophers first assumed a spatial center, then specified meanings with reference to that center. Directions, for example, were described in relation to the center, and locations were specified by distance from the center. Through a series of attempts to solve problems first raised by the work of Copernicus, this Aristotelian, spherical framework was replaced by a rectilinear representation of space. By the end of the seventeenth century, descriptions were understood by reference to linear orientations, as parallel or oblique to a presupposed line, and locations were identified without reference to a privileged central point. This move to rectilinear representations of space enabled Gilbert, Kepler, Galileo, Descartes, and Newton to describe and explain the behavior of the physical world in the novel ways for which these men are justly famous, including their theories of gravitational attraction and inertia. In other words, the shift towards a rectilinear representation of space was essential to the fundamental reconception of the universe that gave rise to both modern physical theory and, at the same time, the linear way of experiencing the world that characterizes modern science.
7. Explaining Evolutionary Innovation and Novelty: A Historical and Philosophical Study of Biological Concepts

Love, Alan Christopher, 05 July 2006
Explaining evolutionary novelties (such as feathers or neural crest cells) is a central item on the research agenda of evolutionary developmental biology (Evo-devo). Proponents of Evo-devo have claimed that the origin of innovation and novelty constitutes a distinct research problem, ignored by evolutionary theory during the latter half of the 20th century, and that Evo-devo as a synthesis of biological disciplines is in a unique position to address this problem. In order to answer historical and philosophical questions attending these claims, two philosophical tools were developed. The first, conceptual clusters, captures the joint deployment of concepts in the offering of scientific explanations and allows for a novel definition of conceptual change. The second, problem agendas, captures the multifaceted nature of explanatory domains in biological science and their diachronic stability. The value of problem agendas as an analytical unit is illustrated through the examples of avian feather and flight origination. Historical research shows that explanations of innovation and novelty were not ignored. They were situated in disciplines such as comparative embryology, morphology, and paleontology (exemplified in the research of N.J. Berrill, D.D. Davis, and W.K. Gregory), which were overlooked because of a historiography emphasizing the relations between genetics and experimental embryology. This identified the origin of Evo-devo tools (developmental genetics) but missed the source of its problem agenda. The structure of developmental genetic explanations of innovations and novelties is compared and contrasted with those of other disciplinary approaches, past and present. Applying the tool of conceptual clusters to these explanations reveals a unique form of conceptual change over the past five decades: a change in the causal and evidential concepts appealed to in explanations. 
Specification of the criteria of explanatory adequacy for the problem agenda of innovation and novelty indicates that Evo-devo qua disciplinary synthesis requires more attention to the construction of integrated explanations from its constituent disciplines besides developmental genetics. A model for explanations integrating multiple disciplinary contributions is provided. The phylogenetic approach to philosophy of science utilized in this study is relevant to philosophical studies of other sciences and meets numerous criteria of adequacy for analyses of conceptual change.
8. God Acts from the Laws of His Nature Alone: From the Nihil ex Nihilo Axiom to Causation as Expression in Spinoza's Metaphysics

di Poppa, Francesca, 20 September 2006
One of the most important concepts in Spinoza's metaphysics is that of causation. Much of the expansive scholarship on Spinoza, however, either takes causation for granted or ascribes to Spinoza a model of causation that, for one reason or another, fails to account for specific instances of causation, such as the concept of cause of itself (causa sui). This work offers a new interpretation of Spinoza's concept of causation. Starting from the "nothing comes from nothing" axiom and its consequences, the containment principle and the similarity principle (basically, the ideas that what is in the effect must have been contained in the cause, and that the cause and the effect must have something in common), I argue that Spinoza adopts what I call the expression-containment model of causation, a model that describes all causal interactions at the vertical and horizontal levels (including causa sui, or self-cause). The model adopts the core notion of Neoplatonic emanationism, i.e. the idea that the effect is a necessary outpouring of the cause; however, Spinoza famously rejects transcendence and the possibility of created substances. God, the First Cause, causes immanently: everything that is caused is caused in God, as a mode of God. Starting from a discussion of the problems that Spinoza found in Cartesian philosophy, and of the Scholastic and Jewish positions on horizontal and vertical causation, my dissertation follows the development of Spinoza's model of causation from his earliest work to his more mature Ethics. My work also examines the relationship between Spinoza's elaboration of monism, the development of his model of causation, and his novel concept of essence (which for Spinoza coincides with a thing's causal power).
9. Carnap, Tarski, and Quine's Year Together: Logic, Science, and Mathematics

Frost-Arnold, Gregory, 28 September 2006
During the academic year 1940-1941, several giants of analytic philosophy congregated at Harvard: Russell, Tarski, Carnap, Quine, Hempel, and Goodman were all in residence. This group held regular public meetings as well as private conversations. Carnap took detailed notes that give us an extensive record of the discussions at Harvard that year. Surprisingly, the most prominent question in these discussions is: if the number of physical items in the universe is finite (or possibly finite), what form should the logic and mathematics in science take? This question is closely connected to an abiding philosophical problem, one of central importance to the logical empiricists: what is the relationship between the logico-mathematical realm and the natural, material realm? This problem continues to be central to analytic philosophy of logic, mathematics, and science. My dissertation focuses on three issues connected with this problem that dominate the Harvard discussions: nominalism, the unity of science, and analyticity. I both reconstruct the lines of argument represented in the Harvard discussions and relate them to contemporary treatments of these issues.
10. Haag's Theorem and the Interpretation of Quantum Field Theories with Interactions

Fraser, Doreen Lynn, 28 September 2006
Quantum field theory (QFT) is the physical framework that integrates quantum mechanics and the special theory of relativity; it is the basis of many of our best physical theories. QFTs for interacting systems have yielded extraordinarily accurate predictions. Yet, in spite of this unquestionable empirical success, the treatment of interactions in QFT raises serious issues for the foundations and interpretation of the theory. This dissertation takes Haag's theorem as a starting point for investigating these issues. It begins with a detailed exposition and analysis of different versions of Haag's theorem. The theorem is cast as a reductio ad absurdum of canonical QFT prior to renormalization. It is possible to adopt different strategies in response to this reductio: (1) renormalizing the canonical framework; (2) introducing a volume (i.e., long-distance) cutoff into the canonical framework; or (3) abandoning another assumption common to the canonical framework and Haag's theorem, which is the approach adopted by axiomatic and constructive field theorists. Haag's theorem does not entail that it is impossible to formulate a mathematically well-defined Hilbert space model for an interacting system on infinite, continuous space. Furthermore, Haag's theorem does not undermine the predictions of renormalized canonical QFT; canonical QFT with cutoffs and the existing mathematically rigorous models for interactions are empirically equivalent to renormalized canonical QFT. The final two chapters explore the consequences of Haag's theorem for the interpretation of QFT with interactions. I argue that no mathematically rigorous model of QFT on infinite, continuous space admits an interpretation in terms of quanta (i.e., quantum particles). Furthermore, I contend that extant mathematically rigorous models for physically unrealistic interactions serve as a better guide to the ontology of QFT than either of the other two formulations of QFT.
Consequently, according to QFT, quanta do not belong in our ontology of fundamental entities.
