11.
La théorie coordinative de la connaissance et son lien avec les problèmes épistémologiques de la mesure dans les écrits empiristes-logiques de la première moitié du XXe siècle / The coordinative theory of knowledge and its relation to the epistemological problems of measurement in the logical-empiricist writings of the first half of the twentieth century. Giovannetti, Gabriel, 10 December 2018.
Ce travail fait l’analyse du concept de « principe de coordination » tel qu’il se développe au sein de la théorie coordinative de la connaissance, et plus particulièrement au sein du mouvement empiriste-logique, à partir de la deuxième décennie du XXème siècle. Ce concept est primordial lorsqu’il s’agit de comprendre la manière dont la définition des concepts de grandeur en physique se construit comme la mise en rapport, la coordination, des variables mathématiques de la théorie avec les opérations de mesure dans le laboratoire. L’enjeu est de montrer qu’un des concepts centraux de l’empirisme au XXème siècle est utilisé initialement, par Schlick et Reichenbach, pour analyser la théorie de la relativité, mais qu’il devient rapidement l’outil d’un programme plus spécifique, entrepris par Carnap et Hempel, de reconstruction logique des théories physiques. Pourtant ce concept, pris au sein de l’épistémologie coordinative, permet un empirisme qui laisse une place au développement historique des concepts de grandeur. Analysé et compris correctement, il peut permettre de poser les fondements d’un empirisme historique, au sein duquel les concepts théoriques ne seraient plus reconstruits seulement à partir des mesures empiriques, mais aussi à partir des concepts hérités de théories historiquement antérieures. / This work analyzes the concept of "principle of coordination" as it develops within the coordinative theory of knowledge, and more particularly within the logical-empiricist movement, from the second decade of the twentieth century. This concept is essential for understanding how the definition of the concepts of magnitude in physics is constructed as the linking, the coordination, of the mathematical variables of the theory with the measurement operations in the laboratory. The aim is to show that one of the central concepts of twentieth-century empiricism was initially used by Schlick and Reichenbach to analyze the theory of relativity, but quickly became the tool of a more specific program, undertaken by Carnap and Hempel, of logical reconstruction of physical theories. Yet this concept, taken within coordinative epistemology, allows an empiricism that leaves room for the historical development of the concepts of magnitude. Analyzed and understood correctly, it can lay the foundations of a historical empiricism, in which theoretical concepts would no longer be reconstructed only from empirical measurements, but also from concepts inherited from historically antecedent theories.
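To make the idea of coordination concrete, here is a standard textbook illustration, not one taken from the thesis itself: a coordinative definition ties the theoretical temperature variable to a laboratory operation, for instance the two-fixed-point mercury thermometer,
\[
t \;=\; 100\,\frac{\ell - \ell_{0}}{\ell_{100} - \ell_{0}}\ \text{(degrees Celsius)},
\]
where \(\ell\) is the measured length of the mercury column and \(\ell_{0}\), \(\ell_{100}\) are its lengths at the freezing and boiling points of water. The mathematical variable \(t\) acquires empirical content only through this stipulated link to a measurement procedure, which is the kind of coordination the thesis analyzes.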
12.
A crise da objetividade, a epistemologia popperiana e o “programa de Heisenberg” [The crisis of objectivity, Popperian epistemology and the "Heisenberg program"]. Silva, Luiz Ben Hassanal Machado da [UNIFESP], 15 May 2015.
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Nessa investigação nos concentraremos no período de consolidação da teoria quântica, sobretudo naquilo que toca o livro A Lógica da Pesquisa Científica, de 1934. O centro da investigação é a crítica de Popper ao pensamento indutivista e subjetivista de Heisenberg, que, por meio de considerações da filosofia da linguagem e com o apoio de defensores da filosofia positivista, construiu com outros partidários da chamada Interpretação de Copenhague a interpretação hegemônica da teoria quântica. O dedutivismo realista de Popper, apresentado no livro Lógica da Pesquisa Científica, visa combater essa visão, através de uma defesa da objetividade e do realismo que escapou dos limites da Epistemologia e ganhou ares éticos. Popper defendeu a Interpretação Estatística, que é um ramo da teoria corpuscular. Demonstraremos como a interpretação acerca do alcance da Epistemologia opõe esses pensadores. Para Heisenberg a objetividade devia ser deixada de lado, a partir da constatação empírica do Princípio de Incerteza. O método científico deve, segundo o físico alemão, limitar os conceitos da linguagem clássica e aplicá-los nas descrições dos fenômenos quânticos segundo as limitações operacionais dos conceitos. Para Popper, a metodologia dispensa questões linguísticas e apreende o método científico como sendo baseado na testabilidade, o que impõe que a análise epistemológica seja feita somente após a teoria ter sido conjecturada. Investigaremos a partir do pensamento de Popper e veremos como sua defesa do falseacionismo impõe uma interpretação da teoria quântica diferente daquela preconizada por Heisenberg. / In this investigation we will focus on the period of consolidation of quantum theory, especially as it concerns the book The Logic of Scientific Discovery, of 1934. The centre of the investigation is Popper's critique of the inductivist and subjectivist thought of Heisenberg, who, through considerations from the philosophy of language and with the support of advocates of positivist philosophy, built, together with other supporters of the so-called Copenhagen Interpretation, the hegemonic interpretation of quantum theory. Popper's realist deductivism, presented in The Logic of Scientific Discovery, aims to counter this view through a defense of objectivity and realism that went beyond the limits of epistemology and took on an ethical dimension. Popper defended the Statistical Interpretation, a branch of the corpuscular theory. We will show how their interpretations of the scope of epistemology set these thinkers in opposition. For Heisenberg, objectivity had to be set aside, in light of the empirical confirmation of the Uncertainty Principle. The scientific method must, according to the German physicist, limit the concepts of classical language and apply them to descriptions of quantum phenomena according to the operational limitations of those concepts. For Popper, methodology dispenses with linguistic questions and understands the scientific method as grounded in testability, which requires that epistemological analysis be carried out only after the theory has been conjectured. Starting from Popper's thought, we will see how his defense of falsificationism imposes an interpretation of quantum theory different from the one advocated by Heisenberg.
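For reference only, and not as part of the dissertation's text: the two formal results around which this opposition turns are the Heisenberg uncertainty relation and the Born rule of the statistical interpretation, usually written
\[
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
P(x) \;=\; |\psi(x)|^{2},
\]
the first read by Heisenberg as an operational limit on the applicability of classical concepts, the second read by Popper as a claim about statistical ensembles of experiments rather than about the knowledge of an individual observer.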
13.
Inter-theory relations in physics: case studies from quantum mechanics and quantum field theory. Rosaler, Joshua S., January 2013.
I defend three general claims concerning inter-theoretic reduction in physics. First, the popular notion that a superseded theory in physics is generally a simple limit of the theory that supersedes it paints an oversimplified picture of reductive relations in physics. Second, where reduction specifically between two dynamical systems models of a single system is concerned, reduction requires the existence of a particular sort of function from the state space of the low-level (purportedly more accurate and encompassing) model to that of the high-level (purportedly less accurate and encompassing) model that approximately commutes, in a specific sense, with the rules of dynamical evolution prescribed by the models. The third point addresses a tension between, on the one hand, the frequent need to take into account system-specific details in providing a full derivation of the high-level theory’s success in a particular context, and, on the other hand, a desire to understand the general mechanisms and results that underwrite reduction between two theories across a wide and disparate range of different systems; I suggest a reconciliation based on the use of partial proofs of reduction, designed to reveal these general mechanisms of reduction at work across a range of systems, while leaving certain gaps to be filled in on the basis of system-specific details. After discussing these points of general methodology, I go on to demonstrate their application to a number of particular inter-theory reductions in physics involving quantum theory. I consider three reductions: first, connecting classical mechanics and non-relativistic quantum mechanics; second, connecting classical electrodynamics and quantum electrodynamics; and third, connecting non-relativistic quantum mechanics and quantum electrodynamics. I approach these reductions from a realist perspective, and for this reason consider two realist interpretations of quantum theory - the Everett and Bohm theories - as potential bases for these reductions. Nevertheless, many of the technical results concerning these reductions pertain also more generally to the bare, uninterpreted formalism of quantum theory. Throughout my analysis, I make the application of the general methodological claims of the thesis explicit, so as to provide concrete illustration of their validity.
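As a gloss on the second claim (the notation below is ours, not necessarily Rosaler's): reduction between two dynamical systems models calls for a bridge function \(B\) from the low-level state space \(S_{l}\) to the high-level state space \(S_{h}\) such that, over the relevant states and timescales,
\[
B\!\left(D_{l}^{t}(x)\right) \;\approx\; D_{h}^{t}\!\left(B(x)\right),
\]
where \(D_{l}^{t}\) and \(D_{h}^{t}\) are the dynamical evolution maps prescribed by the two models; the diagram formed by \(B\) and the two dynamics is required to commute only approximately, in the specific sense the thesis spells out.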
14.
Les quantités dans la nature : les conditions ontologiques de l’applicabilité des mathématiques / Quantities in Nature: the Applicability of Mathematics and its Ontological Conditions. Tricard, Julien, 05 December 2019.
Si nos théories physiques peuvent décrire les traits les plus généraux de la réalité, on sait aussi que pour le faire, elles utilisent le langage des mathématiques. On peut alors légitimement se demander si notre capacité à décrire, sinon la nature intime des objets et phénomènes physiques, du moins les relations et structures qu’ils instancient, ne vient pas de cette application des mathématiques. Dans cette thèse, nous soutenons que les mathématiques sont si efficacement applicables en physique tout simplement parce que la réalité décrite par les physiciens est de nature quantitative. Pour cela, nous proposons d’abord une ontologie des quantités, puis des lois de la nature, qui s’inscrit dans les débats contemporains sur la nature des propriétés (théorie des universaux, théorie des tropes, ou nominalisme), et des lois (régularités, ou relations entre universaux). Ensuite, nous examinons deux sortes d’application des mathématiques : la mathématisation des phénomènes par la mesure, puis la formulation mathématique des équations reliant des grandeurs physiques. Nous montrons alors que les propriétés et les lois doivent être comme notre ontologie les décrit, pour que les mathématiques soient légitimement, et si efficacement, applicables. L’intérêt de ce travail est d’articuler des discussions purement ontologiques (et très anciennes, comme la querelle des universaux) avec des exigences épistémologiques rigoureuses qui émanent de la physique actuelle. Cette articulation est conçue de manière transcendantale, car la nature quantitative de la réalité (des propriétés et des lois) y est défendue comme condition d’applicabilité des mathématiques en physique. / Assuming that our best physical theories succeed in describing the most general features of reality, one can only be struck by the effectiveness of mathematics in physics, and wonder whether our ability to describe, if not the very nature of physical entities, at least their relations and the fundamental structures they enter, does not result from applying mathematics. In this dissertation, we claim that mathematical theories are so effectively applicable in physics merely because physical reality is of a quantitative nature. We begin by setting out and supporting an ontology of quantities and laws of nature, in the context of current philosophical debates on the nature of properties (universals, classes of tropes, or even nominalistic resemblance classes) and of laws (as mere regularities or as relations among universals). We then consider two main ways mathematics is applied: first, the way measurement mathematizes physical phenomena; second, the way mathematical concepts are used to formulate equations linking physical quantities. Our reasoning ultimately has a transcendental flavor: properties and laws of nature must be as described by the ontology we first support with purely a priori arguments, if mathematical theories are to be legitimately and so effectively applied in measurements and equations. What could make this work valuable is its attempt to link purely ontological (and often very ancient) discussions with the rigorous epistemological requirements of modern and contemporary physics. The quantitative nature of being (properties and laws) is thus supported on a transcendental basis: as a necessary condition for mathematics to be legitimately applicable in physics.
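One standard way to make "the mathematization of phenomena by measurement" precise, offered here only for orientation and not claimed to be the account defended in the thesis, is the representational sketch for an extensive quantity: a scale is a mapping \(\phi\) from objects to real numbers such that
\[
a \succsim b \iff \phi(a) \ge \phi(b),
\qquad
\phi(a \circ b) = \phi(a) + \phi(b),
\]
where \(\succsim\) is an empirical comparison (say, "at least as heavy as") and \(\circ\) an empirical concatenation of objects. The ontological question the thesis pursues is what properties and laws must be like for such structure-preserving mappings, and the equations among the resulting quantities, to exist.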
15.
La connaissance physique non empirique et le principe de la moindre action [Non-empirical physical knowledge and the principle of least action]. Massussi, Michaël.
Il n’est pas évident si et dans quelle mesure la connaissance non empirique peut donner de l’information sur des systèmes physiques réels. Hume croyait que toute connaissance à propos du monde qui nous entoure ne doit sa certitude à rien d’autre que l’expérience répétée de la conjonction des causes et des effets observables. Or, il y a quelques raisons de croire que le rôle de la raison en physique dépasse celui qui lui est attribué par Hume. Le principe de la moindre action est un bon candidat, pour quelques raisons : il a été découvert à partir d’un argument métaphysique, il rivalise avec les lois de Newton au titre de fondement de la mécanique classique, et il a fini par motiver le développement de nombreux formalismes qui lui sont propres jusqu’au sein des théories les plus récentes de la physique. Nous analyserons les idées ayant mené à sa découverte par Pierre-Louis de Maupertuis. / It is unclear whether and to what extent non-empirical knowledge can provide information about real physical systems. Hume believed that all knowledge about the world around us owes its certainty to nothing other than the repeated experience of the conjunction of observable causes and effects. However, there are some reasons to believe that the role of reason in physics goes beyond that attributed to it by Hume. The principle of least action is a good candidate for a number of reasons: it was discovered from a metaphysical argument, it rivals Newton’s laws as the foundation of classical mechanics, and it eventually motivated the development of several formalisms in a wide variety of the most recent theories of physics. We will analyze the ideas that led to its discovery by Pierre-Louis de Maupertuis.
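For orientation, these are the standard formulations rather than quotations from the thesis: Maupertuis's quantity of action for a body of mass \(m\) moving with speed \(v\) along a path, and the modern stationary-action principle from which the equations of motion follow, read
\[
S_{M} \;=\; \int m\,v\,\mathrm{d}s,
\qquad
\delta \int_{t_{1}}^{t_{2}} L(q,\dot{q},t)\,\mathrm{d}t \;=\; 0,
\]
with \(L = T - V\) in classical mechanics; stationarity yields the Euler-Lagrange equations and hence Newton's second law, which is why the principle can rival Newton's laws as a foundation.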
16.
Concepts and applications of quantum measurement. Knee, George C., January 2014.
In this thesis I discuss the nature of ‘measurement’ in quantum theory. ‘Measurement’ is associated with several different processes: the gradual imprinting of information about one system onto another, which is well understood; the collapse of the wavefunction, which is ill-defined and troublesome; and finally, the means by which inferences about unknown experimental parameters are made. I present a theoretical extension to an experimental proposal from Leggett and Garg, who suggested that the quantum-or-classical reality of a macroscopic system may be probed with successive measurements arrayed in time. The extension allows for a finite level of imperfection in the protocol, and makes use of Leggett’s ‘null result’ measurement scheme. I present the results of an experiment conducted in Oxford that, up to certain loopholes, defies a non-quantum interpretation of the dynamics of phosphorus nuclei embedded in silicon. I also present the theory of statistical parameter estimation, and discover that a recent trend to employ time-symmetric ‘postselected’ measurements offers no true advantage over standard methods. The technique, known as weak-value amplification, combines a weak transfer of quantum information from system to meter with conditional data rejection, to surprising effect. The Fisher information is a powerful tool for evaluating the performance of any parameter estimation model, and it reveals the technique to be worse than ordinary, preselected-only measurements. That this is true despite the presence of noise (including magnetic-field fluctuations causing decoherence, poor-resolution detection, and random displacements) casts serious doubt on the utility of the method.
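For reference, with notation of our choosing rather than the thesis's: the weak value of an observable \(\hat{A}\) for preselected state \(|\psi_{i}\rangle\) and postselected state \(|\phi_{f}\rangle\), and the Cramér-Rao bound that makes the Fisher information \(F(g)\) the figure of merit for an unbiased estimate \(\hat{g}\) of a parameter \(g\) from \(N\) trials, are
\[
A_{w} \;=\; \frac{\langle \phi_{f}|\hat{A}|\psi_{i}\rangle}{\langle \phi_{f}|\psi_{i}\rangle},
\qquad
\operatorname{Var}(\hat{g}) \;\ge\; \frac{1}{N\,F(g)}.
\]
\(A_{w}\) can lie far outside the eigenvalue range of \(\hat{A}\) when \(\langle \phi_{f}|\psi_{i}\rangle\) is small, which is the "amplification"; but because postselection also discards most trials, the Fisher information retained in the kept data does not exceed that of the preselected-only strategy, which is the comparison the thesis draws.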
17.
The Collapse of Decoherence: Can Decoherence Theory Solve The Problems of Measurement? Herlin, Karl, January 2023.
In this review study, we ask whether decoherence theory can solve the problems of measurement in quantum mechanics. After an introduction to decoherence theory, we present the problem of the preferred basis, the problem of the non-observability of interference, and the problem of definite outcomes. We present Zurek's theory of environment-induced superselection rules and find that the problem of the preferred basis and the problem of the non-observability of interference can be solved through decoherence theory, but not the problem of definite outcomes, if we accept the eigenstate-eigenvalue link and the Born statistical interpretation. We show that these two concepts are essential to the Copenhagen interpretations of quantum mechanics, and give an account of von Neumann's and Wigner's conscious collapse interpretation as well as a detailed description of Bohr's and Heisenberg's interpretation. We discuss how Bohr's and Heisenberg's interpretation relates to decoherence, with a special emphasis on the irreducibility of classical concepts as interpreted by Don Howard. In the discussion, we critique Wigner's use of the word "consciousness" as opposed to von Neumann's use, as well as Howard's decidedly ontological approach to Bohr, by way of an antithetical Kantian approach. We conclude that decoherence theory cannot decisively solve the problem of definite outcomes in quantum mechanics, even when it is considered in relation to the Copenhagen interpretation.
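To fix ideas, here is the standard textbook sketch of environment-induced decoherence (not a quotation from the study): once the system becomes entangled with its environment, tracing out the environment suppresses interference but still leaves a mixture,
\[
|\Psi\rangle = \sum_{i} c_{i}\,|s_{i}\rangle|E_{i}\rangle,
\qquad
\rho_{S} = \operatorname{Tr}_{E}\,|\Psi\rangle\langle\Psi|
= \sum_{i,j} c_{i}c_{j}^{*}\,\langle E_{j}|E_{i}\rangle\,|s_{i}\rangle\langle s_{j}|
\;\longrightarrow\;
\sum_{i} |c_{i}|^{2}\,|s_{i}\rangle\langle s_{i}|
\quad\text{as}\quad \langle E_{j}|E_{i}\rangle \to \delta_{ij}.
\]
Interference between the pointer states \(|s_{i}\rangle\) thus becomes unobservable at the level of the system alone, which is how decoherence addresses the preferred-basis and non-observability problems; but \(\rho_{S}\) is an improper mixture, not a single definite outcome, which is why the problem of definite outcomes survives, as the review concludes.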