31

An integrated modeling framework for concept formation : developing number-sense, a partial resolution of the learning paradox

Rendell, Gerard Vincent Alfred January 2012 (has links)
The development of mathematical ability is foundational and, for the most part, seldom insurmountable in early childhood. Various constructions exhibit conceptual change in the child, which is evidence of overcoming the learning paradox: if one tries to account for learning by means of mental actions carried out by the learner, then it is necessary to attribute to the learner a prior structure, one that is as advanced or as complex as the one to be acquired, unless there is emergence. This thesis reinterprets Piaget's theory using research from neurophysiology, biology and machine learning, and demonstrates a novel approach that partially resolves the learning paradox for a simulation that experiences a number-line world, exhibiting emergence of structure using a model of Drosophila. In doing so, the research evaluates other models of cognitive development against a real-world, worked example of number-sense from childhood mathematics. The purpose is to determine whether they assume a prior capacity to solve problems or introduce, as parallel assumptions within the learning process, additional capabilities not seen in children. Technically, the research uses an artificial neural network with reinforcement learning to confirm the emergence of permanent object invariants. It then evaluates an evolved dialectic system with hierarchical finite state automata within a reactive Argos framework to confirm the reevaluated Piagetian developmental model against the worked example. This thesis establishes that the emergence of new concepts is a critical need in the development of autonomous evolvable systems that can act, learn and plan in novel ways in noisy situations.
32

Symbolic algorithms for the local analysis of systems of pseudo-linear equations

Broughton, Gary John January 2013 (has links)
This thesis is concerned with the design and implementation of algorithms in Computer Algebra - a discipline which pursues a symbolic approach to solving mathematical equations and problems, in contrast to computing solutions numerically. More precisely, we study systems of pseudo-linear equations, which unify the classes of linear differential, difference and q-difference systems. Whilst the classical mathematical theory of asymptotic expansions and the notion of formal solutions are well established for each of these individual cases, no unifying theoretical framework for pseudo-linear systems was known prior to our work. From an algorithmic point of view, the computation of a complete fundamental system of formal solutions is implemented by the formal reduction process. The formal reduction of linear differential systems had been treated in the past, and linear difference systems were also investigated and partly solved. In the case of linear q-difference systems, the structure of the formal solutions is much simpler, which results in an easier formal reduction; however, no satisfying algorithm had been published that would be suitable to compute the formal solutions. We place ourselves in the generic setting and show that various algorithms that are known to be building blocks for the formal reduction in the differential case can be extended to the general pseudo-linear setting. In particular, the family of Moser- and super-reduction algorithms as well as the Classical Splitting Lemma and the Generalised Splitting Lemma are amongst the fundamental ingredients that we consider and which are essential for an effective formal reduction procedure. Whereas some of these techniques had been considered and adapted for systems of difference or q-difference equations, our novel contribution is to show that they can be extended and formulated in such a way that they are valid generically.
Based on these results, we then design our generic formal reduction method, again inspired by the differential case. Apart from the resulting unified approach, this also yields a novel approach to the formal reduction of difference and q-difference systems. Together with a generalisation of an efficient algorithm for computing regular formal solutions that was devised for linear differential systems, we finally obtain a complete and generic algorithm for computing formal solutions of systems of pseudo-linear equations. We show that we are able to compute a complete basis of formal solutions of large classes of linear functional systems, using our formal reduction method. The algorithms presented in this thesis have been implemented in the Computer Algebra System Maple as part of the Open Source project ISOLDE.
33

A computer vision approach to classification of circulating tumor cells

Hopkins, David 01 August 2013 (has links)
<p> Detection and characterization of circulating tumor cells (CTCs) in the bloodstream can be used to assess the threat to a potential cancer victim. We have determined specific goals to further the understanding of these cells: 1) full automation of an algorithm to overcome the current method's challenges of being labor-intensive and time-consuming, 2) detection of single CTCs amongst several million white blood cells given digital imagery of a panel of blood, and 3) objective classification of white blood cells, CTCs, and potential sub-types. </p><p> We demonstrate in this paper the theory, code and implementation developed to address these goals using mathematics and computer vision techniques. These include: 1) formation of a completely data-driven methodology, 2) use of the Bag of Features computer vision technique coupled with custom-built pixel-centric feature descriptors, and 3) use of clustering techniques such as <i>K</i>-means and hierarchical clustering as a robust classification method to glean insights into cell characteristics. </p><p> To objectively determine the adequacy of our approach, we test our algorithm against three benchmarks: sensitivity/specificity in classification, nontrivial event detection, and rotational invariance. The algorithm performed well on the first two, and we provide possible modifications to improve performance on the third. The results of the sensitivity and specificity benchmark are important. The unfiltered data we used to test our algorithm were images of blood panels containing 44,914 WBCs and 39 CTCs. The algorithm classified 67.5 percent of CTCs into an outlier cluster containing only 300 cells. A simple modification brought the classification rate up to 80 percent of total CTCs while bringing the cluster count to only 400 cells. This is a significant reduction in the number of cells a pathologist would sort through, as it is only 0.9 percent of the total data.</p>
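The outlier-cluster idea in this abstract can be illustrated with a small, self-contained sketch. The synthetic feature vectors below stand in for the thesis's pixel-centric descriptors, and the k-means variant (with farthest-point initialisation) and all numbers are illustrative assumptions, not the author's implementation: a tiny, well-separated "CTC-like" population should end up isolated in a small cluster that a reviewer could inspect instead of the full panel.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-ins for descriptor vectors: a large "WBC-like"
# population near the origin and a small offset "CTC-like" population.
wbc = rng.normal(0.0, 1.0, size=(2000, 8))
ctc = rng.normal(6.0, 1.0, size=(10, 8))
X = np.vstack([wbc, ctc])

def kmeans(X, k, iters=20):
    """Lloyd's algorithm with farthest-point initialisation, so the demo is
    deterministic and the small offset population can claim its own centre."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(X, k=2)
sizes = np.bincount(labels, minlength=2)
print("cluster sizes:", sizes)  # the rare population lands in a small cluster
```

On this synthetic data the ten "CTC-like" points share one small cluster, mirroring the abstract's point that a pathologist would only need to review a tiny fraction of the panel.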
34

Quantum Circuit Synthesis using Group Decomposition and Hilbert Spaces

Saraivanov, Michael S. 28 August 2013 (has links)
<p> The exponential nature of Moore's law has inadvertently created huge data storage complexes that are scattered around the world. Data elements are continuously being searched, processed, erased, combined and transferred to other storage units without much regard to power consumption. The need for faster searches and power efficient data processing is becoming a fundamental requirement. Quantum computing may offer an elegant solution to speed and power through the utilization of the natural laws of quantum mechanics. Therefore, minimal cost quantum circuit implementation methodologies are greatly desired. </p><p> This thesis explores the decomposition of group functions and the Walsh spectrum for implementing quantum canonical cascades with minimal cost. Three different methodologies, using group decomposition, are presented and generalized to take advantage of different quantum computing hardware, such as ion traps and quantum dots. Quantum square root of swap gates and fixed angle rotation gates comprise the first two methodologies. The third and final methodology provides further quantum cost reduction by more efficiently utilizing Hilbert spaces through variable angle rotation gates. The thesis then extends the methodology to realize a robust quantum circuit synthesis tool for single and multi-output quantum logic functions.</p>
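The square-root-of-swap gate named in this abstract can be checked numerically. The matrix below is the standard form of the gate (an assumption for illustration; the thesis's cascade construction itself is not reproduced here): two applications compose to a full SWAP, and the gate is unitary, which is what makes it a legal entangling primitive for exchange-coupled hardware such as quantum dots.

```python
import numpy as np

# The two-qubit SWAP gate and its standard square root.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

SQRT_SWAP = np.array([[1, 0, 0, 0],
                      [0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
                      [0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
                      [0, 0, 0, 1]], dtype=complex)

# Two applications of sqrt(SWAP) give a full SWAP, and the gate is unitary.
assert np.allclose(SQRT_SWAP @ SQRT_SWAP, SWAP)
assert np.allclose(SQRT_SWAP @ SQRT_SWAP.conj().T, np.eye(4))
print("sqrt(SWAP) checks passed")
```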
35

Statistical language modelling and novel parsing techniques for enhanced creation and editing of mathematical e-content using spoken input

Attanayake, Dilaksha Rajiv January 2014 (has links)
The work described in this thesis aims at facilitating the design and implementation of web-based editors, driven by speech or natural language input, with a focus on editing mathematics. First, a taxonomy for system architectures of speech-based applications is given. This classification is based on the location of the speech recognition, the speech and application logic, and the resulting flow of data between client and server components. This contribution extends existing system architecture approaches to take into account the characteristics of speech-based systems. We then show, using statistical language modelling techniques, that mathematics, either spoken or typed, is more predictable than everyday natural languages. We illustrate how these models, in combination with error correction algorithms, can be used to successfully assist the process of creating mathematical expressions within electronic documents using speech. We have successfully implemented systems to demonstrate our findings, which have also been evaluated using standard language modelling evaluation techniques. This work is novel as applying statistical language models to the recognition of spoken mathematics had not been evaluated to this extent prior to our work. We create a parsing framework for spoken mathematics, based on mixfix operators, operator precedences and non-deterministic parsing techniques. This framework can significantly improve the design and parsing of spoken command languages such as spoken mathematics. A novel robust error recovery method for an adaptation of the XGLR parsing approach to our operator precedence setting is presented. This greatly enhances the range of spoken or typed mathematics that can be parsed. The novel parsing framework, algorithms and error recovery that we have designed are suitable for more general structured spoken command languages as well.
The algorithms devised in this thesis have been implemented and integrated in a research prototype system called TalkMaths. We evaluate our contributions to the new version of this system by comparing the power of our parser with that contained in previous versions, and by conducting a field study where students engage with our system in a real classroom-based environment. We show that using TalkMaths, rather than a conventional mathematics editor, had a positive impact on the participants' learning and understanding of mathematical concepts.
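The claim that spoken mathematics is more predictable than everyday language can be illustrated with a toy bigram model. The two corpora below are invented for the sketch (they are not the thesis's data), and training and evaluating on the same tokens is only enough for a relative comparison; the formulaic structure of spoken formulas still yields a visibly lower perplexity.

```python
from collections import Counter
import math

# Tiny hypothetical corpora: transcribed spoken mathematics vs everyday prose.
math_corpus = ("x squared plus two x plus one equals zero "
               "x squared minus two x plus one equals zero "
               "x cubed plus two x squared plus x equals zero").split()
prose_corpus = ("the cat sat on the mat while a dog slept near the door "
                "a bird flew over the quiet garden in the early morning "
                "some people walked past the old shop before it opened").split()

def bigram_perplexity(tokens):
    """Perplexity of an add-one-smoothed bigram model, trained and evaluated
    on the same tokens (sufficient for a relative comparison only)."""
    vocab = set(tokens)
    uni = Counter(tokens)
    bi = Counter(zip(tokens, tokens[1:]))
    log_prob = 0.0
    for w1, w2 in zip(tokens, tokens[1:]):
        p = (bi[(w1, w2)] + 1) / (uni[w1] + len(vocab))
        log_prob += math.log(p)
    return math.exp(-log_prob / (len(tokens) - 1))

print("math:", bigram_perplexity(math_corpus))
print("prose:", bigram_perplexity(prose_corpus))
```

The spoken-mathematics corpus repeats a small vocabulary in rigid patterns, so its bigram perplexity comes out well below that of the prose sample, in the spirit of the thesis's predictability result.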
36

Sparse representations for recognition

Cheng, Lei. Liu, Xiuwen. January 2004 (has links)
Thesis (M.S.)--Florida State University, 2004. / Advisor: Dr. Xiuwen Liu, Florida State University, College of Arts and Sciences, Dept. of Computer Science. Title and description from dissertation home page (viewed June 17, 2004). Includes bibliographical references.
37

Approximate inference in graphical models

Hennig, Philipp January 2011 (has links)
Probability theory provides a mathematically rigorous yet conceptually flexible calculus of uncertainty, allowing the construction of complex hierarchical models for real-world inference tasks. Unfortunately, exact inference in probabilistic models is often computationally expensive or even intractable. A close inspection in such situations often reveals that computational bottlenecks are confined to certain aspects of the model, which can be circumvented by approximations without having to sacrifice the model's interesting aspects. The conceptual framework of graphical models provides an elegant means of representing probabilistic models and deriving both exact and approximate inference algorithms in terms of local computations. This makes graphical models an ideal aid in the development of generalizable approximations. This thesis contains a brief introduction to approximate inference in graphical models (Chapter 2), followed by three extensive case studies in which approximate inference algorithms are developed for challenging applied inference problems. Chapter 3 derives the first probabilistic game tree search algorithm. Chapter 4 provides a novel expressive model for inference in psychometric questionnaires. Chapter 5 develops a model for the topics of large corpora of text documents, conditional on document metadata, with a focus on computational speed. In each case, graphical models help in two important ways: They first provide important structural insight into the problem; and then suggest practical approximations to the exact probabilistic solution.
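The point that graphical models express inference "in terms of local computations" can be shown on the smallest non-trivial example. The chain and potentials below are invented for illustration: on a three-variable chain, the sum-product algorithm recovers the exact marginal of the middle variable by multiplying two one-dimensional messages, agreeing with brute-force enumeration.

```python
import numpy as np

# A three-variable chain x1 - x2 - x3 with random pairwise potentials
# (each variable binary; the potentials are arbitrary for the demo).
rng = np.random.default_rng(1)
phi12 = rng.random((2, 2))  # potential on (x1, x2)
phi23 = rng.random((2, 2))  # potential on (x2, x3)

# Brute force: p(x2) is proportional to sum_{x1,x3} phi12[x1,x2] * phi23[x2,x3].
joint = np.einsum('ij,jk->ijk', phi12, phi23)
p2_brute = joint.sum(axis=(0, 2))
p2_brute /= p2_brute.sum()

# Sum-product: multiply the messages arriving at x2 from each neighbour.
msg_from_x1 = phi12.sum(axis=0)  # marginalise out x1
msg_from_x3 = phi23.sum(axis=1)  # marginalise out x3
p2_bp = msg_from_x1 * msg_from_x3
p2_bp /= p2_bp.sum()

assert np.allclose(p2_brute, p2_bp)
print("marginal of x2:", p2_bp)
```

On tree-structured graphs this equivalence is exact; the thesis's case studies concern the harder setting where such local computations only approximate the true posterior.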
38

Towards a computational model of the colonic crypt with a realistic, deformable geometry

Dunn, Sara-Jane Nicole January 2011 (has links)
Colorectal cancer (CRC) is one of the most prevalent and deadly forms of cancer. Its high mortality rate is associated with difficulties in early detection, which is crucial to survival. The onset of CRC is marked by macroscopic changes in intestinal tissue, originating from a deviation in the healthy cell dynamics of glands known as the crypts of Lieberkuhn. It is believed that accumulated genetic alterations confer on mutated cells the ability to persist in the crypts, which can lead to the formation of a benign tumour through localised proliferation. Stress on the crypt walls can lead to buckling, or crypt fission, and the further spread of mutant cells. Elucidating the initial perturbations in crypt dynamics is not possible experimentally, but such investigations could be made using a predictive, computational model. This thesis proposes a new discrete crypt model, which focuses on the interaction between cell- and tissue-level behaviour, while incorporating key subcellular components. The model contains a novel description of the role of the surrounding tissue and musculature, which allows the shape of the crypt to evolve and deform. A two-dimensional (2D) cross-sectional geometry is considered. Simulation results reveal how the shape of the crypt base may contribute mechanically to the asymmetric division events typically associated with the stem cells in this region. The model predicts that epithelial cell migration may arise due to feedback between cell loss at the crypt collar and density-dependent cell division, an hypothesis which can be investigated in a wet lab. Further, in silico experiments illustrate how this framework can be used to investigate the spread of mutations, and conclude that a reduction in cell migration is key to confer persistence on mutant cell populations. 
A three-dimensional (3D) model is proposed to remove the spatial restrictions imposed on cell migration in 2D, and preliminary simulation results agree with the hypotheses generated in 2D. Computational limitations that currently restrict extension to a realistic 3D geometry are discussed. These models enable investigation of the role that mechanical forces play in regulating tissue homeostasis, and make a significant contribution to the theoretical study of the onset of crypt deformation under pre-cancerous conditions.
39

Extraction of clinical information from the non-invasive fetal electrocardiogram

Behar, Joachim January 2014 (has links)
Estimation of the fetal heart rate (FHR) has gained interest in the last century; low heart rate variability has been studied to identify intrauterine growth restricted fetuses (prepartum), and abnormal FHR patterns have been associated with fetal distress during delivery (intrapartum). Several monitoring techniques have been proposed for FHR estimation, including auscultation and Doppler ultrasound. This thesis focuses on the extraction of the non-invasive fetal electrocardiogram (NI-FECG) recorded from a limited set of abdominal sensors. The main challenge with NI-FECG extraction techniques is the low signal-to-noise ratio of the FECG signal in the abdominal mixture, which consists of a dominant maternal ECG component, the FECG and noise. However, the NI-FECG offers many advantages over the alternative fetal monitoring techniques, the most important one being the opportunity to enable morphological analysis of the FECG, which is vital for determining whether an observed FHR event is normal or pathological. In order to advance the field of NI-FECG signal processing, the development of standardised public databases and benchmarking of a number of published and novel algorithms was necessary. Databases were created depending on the application: FHR estimation with or without maternal chest lead reference, or directed toward FECG morphology analysis. Moreover, a FECG simulator was developed in order to account for pathological cases or rare events which are often under-represented (or completely missing) in the existing databases. This simulator also serves as a tool for studying NI-FECG signal processing algorithms aimed at morphological analysis (which require underlying ground truth annotations). An accurate technique for the automatic estimation of the signal quality level was also developed, optimised and thoroughly tested on pathological cases.
Such a technique is mandatory for any clinical applications of FECG analysis as an external confidence index of both the input signals and the analysis outputs. Finally, a Bayesian filtering approach was implemented in order to address the NI-FECG morphology analysis problem. It was shown, for the first time, that the NI-FECG can allow accurate estimation of the fetal QT interval, which opens the way for new clinical studies on the development of the fetus during the pregnancy.
40

Discrete quantum walks and quantum image processing

Venegas-Andraca, Salvador Elías January 2005 (has links)
In this thesis we have focused on two topics: Discrete Quantum Walks and Quantum Image Processing. Our work is a contribution within the field of quantum computation from the perspective of a computer scientist. With the purpose of finding new techniques to develop quantum algorithms, there has been an increasing interest in studying Quantum Walks, the quantum counterparts of classical random walks. Our work in quantum walks begins with a critical and comprehensive assessment of those elements of classical random walks and discrete quantum walks on undirected graphs relevant to algorithm development. We propose a model of discrete quantum walks on an infinite line using pairs of quantum coins under different degrees of entanglement, as well as quantum walkers in different initial state configurations, including superpositions of corresponding basis states. We have found that the probability distributions of such quantum walks have particular forms which are different from the probability distributions of classical random walks. Also, our numerical results show that the symmetry properties of quantum walks with entangled coins have a non-trivial relationship with corresponding initial states and evolution operators. In addition, we have studied the properties of the entanglement generated between walkers, in a family of discrete Hadamard quantum walks on an infinite line with one coin and two walkers. We have found that there is indeed a relation between the amount of entanglement available in each step of the quantum walk and the symmetry of the initial coin state. However, as we show with our numerical simulations, such a relation is not straightforward and, in fact, it can be counterintuitive. Quantum Image Processing is a blend of two fields: quantum computation and image processing. Our aim has been to promote cross-fertilisation and to explore how ideas from quantum computation could be used to develop image processing algorithms. 
Firstly, we propose methods for storing and retrieving images using non-entangled and entangled qubits. Secondly, we study a case in which 4 different values are randomly stored in a single qubit, and show that quantum mechanical properties can, in certain cases, allow better reproduction of original stored values compared with classical methods. Finally, we briefly note that entanglement may be used as a computational resource to perform hardware-based pattern recognition of geometrical shapes that would otherwise require classical hardware and software.
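The qualitative difference between quantum and classical walks described in this abstract is easy to reproduce numerically. The sketch below simulates a single-coin discrete Hadamard walk on a line (the symmetric initial coin state is one conventional choice, not necessarily the thesis's) and compares its spread with the sqrt(t) spread of a classical random walk: the quantum walk spreads linearly in the number of steps.

```python
import numpy as np

def hadamard_walk(steps):
    """Discrete Hadamard walk on a line with one coin, started at the origin
    in the symmetric coin state (|0> + i|1>)/sqrt(2)."""
    n = 2 * steps + 1
    amp = np.zeros((2, n), dtype=complex)  # amp[c, x]: amplitude of coin c at x
    amp[0, steps] = 1 / np.sqrt(2)
    amp[1, steps] = 1j / np.sqrt(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = H @ amp                      # coin flip
        new = np.zeros_like(amp)
        new[0, 1:] = amp[0, :-1]           # coin 0 shifts right
        new[1, :-1] = amp[1, 1:]           # coin 1 shifts left
        amp = new
    return (np.abs(amp) ** 2).sum(axis=0)  # position distribution

steps = 50
prob = hadamard_walk(steps)
x = np.arange(-steps, steps + 1)
quantum_sd = float(np.sqrt((prob * x ** 2).sum()))  # grows linearly with steps
classical_sd = float(np.sqrt(steps))                # binomial walk: sqrt(steps)
print(quantum_sd, classical_sd)
```

The resulting position distribution is flat in the middle with pronounced peaks near the edges, quite unlike the Gaussian-shaped classical distribution, which is the contrast the abstract draws.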
