91

An Examination of Quantum Foundations

Markes, Sonia January 2011 (has links)
Quantum foundations is a field of diverse goals and methods. In this thesis, I will present three different approaches to quantum foundations, each emphasizing a different goal or perspective. The causaloid framework aims to use insights from quantum foundations to study quantum gravity. Ontic models are a tool for studying realist theories of quantum mechanics from an operational quantum information perspective. Nelson's mechanics is a derivation of the Schrödinger equation using the machinery of stochastic mechanics. As each of these approaches has a different set of goals, they are suited to different purposes and have different limitations. From the causaloid, I construct the concept of causally unbiased entropy and, at the same time, find an emergent idea of causality in the form of a measure of causal connectedness, termed the Q factor. In the ontic models framework, I reproduce the generalization of the concept of contextuality. For Nelson's mechanics, I examine its relationship to Bohmian mechanics, a realist formulation of quantum mechanics. I will then examine the relationship of these different approaches to one another. From this examination, I will introduce the concept of physical contextuality in order to ask whether contextuality could be more than just a mathematical artifact. I also include a discussion of the property of deficiency in ontic models and its relation to contextuality given certain constraints.
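The abstract's reference to Nelson's stochastic derivation of the Schrödinger equation can be sketched in its standard textbook form (this is the generic construction, not taken from the thesis itself): a particle follows a diffusion whose coefficient is fixed by ħ, and Nelson's acceleration postulate for the current and osmotic velocities recovers quantum dynamics.

```latex
% Sketch of Nelson's stochastic mechanics (standard form, not from the thesis).
% The particle follows a diffusion with drift b and coefficient \nu = \hbar/2m:
\[
  dx(t) = b(x(t),t)\,dt + dW(t), \qquad
  \mathbb{E}\!\left[dW_i\,dW_j\right] = 2\nu\,\delta_{ij}\,dt, \qquad
  \nu = \frac{\hbar}{2m}.
\]
% Osmotic and current velocities, with \rho the position density:
\[
  u = \nu\,\nabla \ln \rho, \qquad v = \frac{\nabla S}{m}.
\]
% Nelson's acceleration postulate, combined with the continuity equation,
% implies that \psi = \sqrt{\rho}\, e^{iS/\hbar} obeys the Schrödinger equation:
\[
  \frac{\partial v}{\partial t} + (v\cdot\nabla)v - (u\cdot\nabla)u - \nu\,\nabla^{2}u
  = -\frac{1}{m}\,\nabla V
  \;\;\Longrightarrow\;\;
  i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + V\psi.
\]
```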
92

Towards Theoretical Foundations of Clustering

Ackerman, Margareta January 2012 (has links)
Clustering is a central unsupervised learning task with a wide variety of applications. Unlike in supervised learning, different clustering algorithms may yield dramatically different outputs for the same input sets. As such, the choice of algorithm is crucial. When selecting a clustering algorithm, users tend to focus on cost-related considerations, such as running times and software purchasing costs. Yet differences in the output of the algorithms are a more fundamental consideration. We propose an approach for selecting clustering algorithms based on differences in their input-output behaviour. This approach relies on identifying significant properties of clustering algorithms and classifying algorithms based on the properties that they satisfy. We begin with Kleinberg's impossibility result, which relies on concise abstract properties that are well-suited for our approach. Kleinberg showed that three specific properties cannot be satisfied by the same algorithm. We illustrate that the impossibility result is a consequence of the formalism used, proving that these properties can be formulated without leading to inconsistency in the context of clustering quality measures or algorithms that require the number of clusters as part of their input. Combining Kleinberg's properties with newly proposed ones, we provide an extensive property-based classification of common clustering paradigms. We use some of these properties to provide a novel characterization of the class of linkage-based algorithms. That is, we distil a small set of properties that uniquely identify this family of algorithms. Lastly, we investigate how the output of algorithms is affected by the addition of small, potentially adversarial, sets of points. We prove that given clusterable input, the output of $k$-means is robust to the addition of a small number of data points.
On the other hand, clusterings produced by many well-known methods, including linkage-based techniques, can be changed radically by adding a small number of elements.
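The contrast in the closing claim — $k$-means robust to a few added points, linkage-based methods changed radically — can be illustrated with a small one-dimensional toy example. The data, the deterministic initialisation, and the minimal implementations below are assumptions for illustration, not the thesis's own constructions:

```python
def single_linkage_2(points):
    """2-clustering by single linkage on 1-D data: cut the largest gap.

    In one dimension, cutting the single-linkage dendrogram into two
    clusters is equivalent to splitting the sorted data at its widest gap.
    """
    pts = sorted(points)
    cut = max(range(len(pts) - 1), key=lambda i: pts[i + 1] - pts[i])
    return pts[:cut + 1], pts[cut + 1:]

def kmeans_2(points, iters=50):
    """Lloyd's algorithm with k=2, deterministically initialised at the extremes."""
    c0, c1 = min(points), max(points)
    for _ in range(iters):
        a = [p for p in points if abs(p - c0) <= abs(p - c1)]
        b = [p for p in points if abs(p - c0) > abs(p - c1)]
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return a, b

# Clusterable input: two tight, well-separated blobs.
blob_a = [0.0, 0.3, 0.6]
blob_b = [10.0, 10.3, 10.6]
# Adversarial addition: a dense "bridge" whose gaps (0.2) are smaller
# than the gap between blob_a and the bridge (0.4).
bridge = [1.0 + 0.2 * i for i in range(45)]

left, right = single_linkage_2(blob_a + blob_b + bridge)
ka, kb = kmeans_2(blob_a + blob_b + bridge)

# Single linkage now cuts at the 0.6 -> 1.0 gap, lumping blob_b in with
# the bridge; k-means still places the two blobs in different clusters.
print(len(left), len(right))   # 3 48
```

Without the bridge, both methods recover the two blobs; the chaining effect of single linkage is what lets a handful of added points radically reshape its output, while the variance-based objective of $k$-means keeps the blobs apart.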
93

DEFORMATION OF ROCK FOUNDATIONS UNDER HEAVY LOADS

Erwin, James Walter, 1946- January 1975 (has links)
No description available.
94

Foundation optimisation and its application to pile reuse

Leung, Yat Fai January 2010 (has links)
No description available.
95

Model studies of the bearing capacity of pile groups in a saturated clay

Martin, Carl Bernard 08 1900 (has links)
No description available.
96

An investigation of the dynamic bearing capacity of footings on sand

Woodard, John Marvin 05 1900 (has links)
No description available.
97

The influence of depth on the bearing capacity of strip footings in sand

Duncan, J. M. (James Michael) 05 1900 (has links)
No description available.
98

A study of bearing capacity in sands under dynamic loading

Banks, D. C. (Don Charles) 05 1900 (has links)
No description available.
