About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD).

Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
211

An adaptive control system for precision cylindrical grinding

Thomas, David Andrew January 1996 (has links)
No description available.
212

The influence of microstructural rock properties on water jet assisted cutting

Goodfellow, Paul R. A. January 1990 (has links)
No description available.
213

From a Futurist text based robotic theatre to Theatre of the Fantastic : the historical and theoretical basis and dramaturgical characteristics of a text based robotic theatre and contemporary developments of the form

Ramsay, Gordon P. January 2002 (has links)
No description available.
214

Efficient Machine Learning with High Order and Combinatorial Structures

Tarlow, Daniel 13 August 2013 (has links)
The overarching goal in this thesis is to develop the representational frameworks, the inference algorithms, and the learning methods necessary for the accurate modeling of domains that exhibit complex and non-local dependency structures. There are three parts to this thesis. In the first part, we develop a toolbox of high order potentials (HOPs) that are useful for defining interactions and constraints that would be inefficient or otherwise difficult to use within the standard graphical modeling framework. For each potential, we develop associated algorithms so that the type of interaction can be used efficiently in a variety of settings. We further show that this HOP toolbox is useful not only for defining models, but also for defining loss functions. In the second part, we look at the similarities and differences between special-purpose and general-purpose inference algorithms, with the aim of learning from the special-purpose algorithms so that we can build better general-purpose algorithms. Specifically, we show how to cast a popular special-purpose algorithm (graph cuts) in terms of the degrees of freedom available to a popular general-purpose algorithm (max-product belief propagation). We then take the lessons learned and build a better general-purpose algorithm. Finally, we develop a class of models that allows the discrete optimization algorithms studied in the previous sections (as well as other discrete optimization algorithms) to be used as the centerpiece of probabilistic models. This allows us to build probabilistic models that have fast exact inference procedures in domains where the standard probabilistic formulation would lead to intractability.
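One widely cited member of the HOP family the abstract mentions is the cardinality potential, which scores the *number* of binary variables set to 1. As an illustrative sketch (not code from the thesis; all names here are hypothetical), MAP inference under unary scores plus a cardinality term can be done in O(n log n) by sorting the per-variable gains, instead of enumerating all 2^n states:

```python
import numpy as np

def cardinality_map(u0, u1, card):
    """MAP assignment for n binary variables with unary scores u0[i], u1[i]
    and a cardinality potential card[k] on the count of variables set to 1.
    Sort the per-variable gains and scan over k -- no 2^n enumeration."""
    gains = u1 - u0                       # score gained by setting each var to 1
    order = np.argsort(-gains)            # largest gains first
    prefix = np.concatenate(([0.0], np.cumsum(gains[order])))
    totals = u0.sum() + prefix + card     # best score for each cardinality k
    k = int(np.argmax(totals))
    x = np.zeros(len(u0), dtype=int)
    x[order[:k]] = 1                      # turn on the k most profitable vars
    return x, float(totals[k])
```

For example, with gains (2, -1, 3) and a potential that forbids setting all three variables, the optimum turns on exactly the two profitable variables. The same sorted-prefix structure also yields efficient max-marginals, which is what makes such potentials usable inside message-passing schemes.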
216

Coalgebraic automata and canonical models of Moore machines

Cordy, Brendan. January 2008 (has links)
We give a concise introduction to the coalgebraic theory of Moore machines, and building on [6], develop a method for constructing a final Moore machine based on a simple modal logic. Completeness for the logic follows easily from the finality construction, and we furthermore show how this logical framework can be used for machine learning.
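Concretely, a Moore machine is a coalgebra pairing each state with an output and a next-state map. A minimal sketch (an assumption of mine, not drawn from the thesis) that makes the observable behaviour of a state explicit:

```python
class Moore:
    """A Moore machine as a coalgebra: each state carries an output
    (out[s]) and a transition map (step[(s, a)] -> next state)."""
    def __init__(self, states, step, out, start):
        self.states, self.step, self.out, self.start = states, step, out, start

    def run(self, word):
        """Observable behaviour of the start state: the stream of outputs
        produced while consuming `word` (output emitted per state, not per edge)."""
        s, trace = self.start, [self.out[self.start]]
        for a in word:
            s = self.step[(s, a)]
            trace.append(self.out[s])
        return trace
```

For instance, a two-state parity machine over the alphabet {0, 1} emits 1 exactly when an odd number of 1s has been read; two states are behaviourally equivalent (identified in the final Moore machine) precisely when `run` agrees on every input word.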
217

Incorporating prior domain knowledge into inductive machine learning: its implementation in contemporary capital markets.

Yu, Ting January 2007 (has links)
An ideal inductive machine learning algorithm produces a model that best approximates an underlying target function at reasonable computational cost. This requires the resultant model to be consistent with the training data and to generalize well to unseen data. Regular inductive machine learning algorithms rely heavily on numerical data as well as general-purpose inductive bias. However, certain environments offer rich domain knowledge prior to the learning task, and it is not easy for regular inductive learning algorithms to utilize such prior domain knowledge. This thesis discusses and analyzes various methods of incorporating prior domain knowledge into inductive machine learning through three key issues: consistency, generalization and convergence. Additionally, three new methods are proposed and tested over data sets collected from capital markets. These methods utilize financial knowledge collected from various sources, such as experts and research papers, to facilitate the learning process of kernel methods (emerging inductive learning algorithms). The test results are encouraging and demonstrate that prior domain knowledge is valuable to inductive learning machines.
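One standard route for feeding prior knowledge into kernel methods, which the abstract's setting suggests, is to combine a data-driven kernel with a knowledge-derived one. The sketch below is an illustration of that general technique, not the thesis's specific methods; the prior Gram matrix is assumed to be supplied by domain experts (e.g. sector similarity between stocks):

```python
import numpy as np

def rbf(X, gamma=1.0):
    """Data-driven RBF Gram matrix over the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, K_prior, alpha=0.5):
    """Convex combination of a data-driven kernel and a prior-knowledge
    kernel. Non-negative sums of PSD matrices remain PSD, so the result
    is still a valid kernel and can be used in any kernel machine."""
    return alpha * rbf(X) + (1 - alpha) * K_prior
```

The weight `alpha` controls how strongly the learner trusts the data relative to the prior, and can itself be tuned by cross-validation.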
218

Molecular prediction of drug response using machine learning methods

Ding, Zhenyu, January 2008 (has links)
Thesis (M.S.)--West Virginia University, 2008. / Title from document title page. Document formatted into pages; contains viii, 65 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 63-65).
219

An examination of designs for the instruction pipeline of the G-machine

Hostmann, Bill. January 1987 (has links)
Thesis (M.S.)--Oregon Graduate Center, 1987.
220

Learning-based Web query understanding

Shen, Dou. January 2007 (has links)
Thesis (Ph.D.)--Hong Kong University of Science and Technology, 2007. / Includes bibliographical references (leaves 144-162). Also available in electronic version.
