11

Analyzing the Conceptual Integrity of Computing Applications Through Ontological Excavation and Analysis

Hsi, Idris 19 July 2005 (has links)
In the world of commercial computing, consumers are being inundated with baroque, bloated, and difficult-to-use computing applications: tools that use computational methods and technologies to perform tasks. Market forces demand that new versions of these applications implement more features, the user-accessible behaviors and services implemented by the application, than their predecessors or competitors. Ensuring that planned features required for market competitiveness enhance a computing application without these side effects first requires that we understand how these features contribute to the overall design and conceptual integrity of the application. While conceptual integrity affects all aspects of the application, we are primarily interested in how an application's user-accessible features have been designed and implemented. To this end, we have developed a research framework, methodologies, and artifacts for measuring the conceptual integrity of a computing artifact from its theory of the world, or its ontology. We use conceptual coherence, which we define as the degree to which an application's concepts are tightly related, as a first approximation for conceptual integrity. We claim the following: any computing application has a central or core set of concepts that are essential to that application's ontology and can be identified through analytical means; and concepts that are not essential to an application's ontology either exist to support core concepts or are peripheral to the ontology. Peripheral concepts reduce an application's conceptual coherence. We have developed the method of ontological excavation to identify the concepts in a computing application and model them as an ontology expressed as a semantic network. To identify core and peripheral concepts and to measure an ontology's conceptual coherence, we developed methodologies for ontological analysis. If usefulness depends on the conceptual integrity of an application's ontology such that it ensures high fitness to a problem domain, then we would expect users solving problems in that domain to invoke the concepts integral to the solution more often than those that are not. Thus, to validate our structural measures, we claim the following: probable use of the application will invoke core concepts more frequently than peripheral concepts in the ontology.
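As an illustration of the kind of analysis this abstract describes, the sketch below models a toy application ontology as a semantic network and computes a simple coherence score. The concept names, the degree-based notion of "core", and the pairwise-connectivity metric are assumptions made for illustration only; they are not the thesis's actual ontological-analysis measures.

```python
from itertools import combinations

# Toy semantic network: concept -> set of directly related concepts.
# Concept names and the metric below are illustrative, not the thesis's own.
ontology = {
    "Document":   {"Page", "Formatting", "Printing", "MailMerge"},
    "Page":       {"Document", "Formatting"},
    "Formatting": {"Document", "Page"},
    "Printing":   {"Document"},
    "MailMerge":  {"Document"},          # peripheral concept
}

def degree(concept):
    return len(ontology[concept])

# Core concepts: here, those whose degree exceeds the network's mean degree.
mean_degree = sum(degree(c) for c in ontology) / len(ontology)
core = {c for c in ontology if degree(c) > mean_degree}

# Coherence: fraction of concept pairs that are directly related.
pairs = list(combinations(ontology, 2))
related = sum(1 for a, b in pairs if b in ontology[a])
coherence = related / len(pairs)

print("core concepts:", core)
print("conceptual coherence: %.2f" % coherence)
```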
12

Morphodynamics of sand mounds in shallow flows

Garcia-Hermosa, M. Isabel January 2008 (has links)
Large-scale bed features are often encountered in coastal waters, and include sandbanks and spoil heaps. The morphodynamic development of such features involves complicated nonlinear interactions between the flow hydrodynamics, sediment transport, and bed profile. Numerical modelling of the morphodynamic evolution and migration of large-scale bed features is necessary in order to understand their long-term behaviour in response to changing environmental conditions. This thesis describes detailed measurements of the morphodynamics of sand mounds in unidirectional and oscillatory (tidal) flows, undertaken at the U.K. Coastal Research Facility (UKCRF). High-quality data were collected, including water velocities, water levels and overhead images. The parameters tested are: mound shape (circular and elliptical in plan, with Gaussian, cosine and triangular cross-sections); underlying fixed or mobile bed conditions; and initial crest height (submerged, surface-touching and surface-piercing). Peak flow velocities are about 0.5 m/s, the sand median grain size is 0.454 mm, and transport occurs mostly as bedload. When analysing the data, the bed contours are determined by digitising the shoreline at different water levels. From these plots, the volume, height, and centroid position of the mound are calculated. A large-scale fit method, based on a Gaussian function, has been used to separate small-scale ripples from the large-scale bed structure during the evolution of an isolated sand mound or spoil heap. The bed profile after the ripples are removed is comparable to typical predictions by shallow-flow numerical solvers. The UKCRF experiments investigated the morphodynamic response of a bed mound to hydrodynamic forcing: shape changes, migration rates, volume decay and sediment transport rates. The measured migration rate and decay of a submerged sand mound in the UKCRF are found to be in satisfactory agreement with results from various theoretical models, such as the analytical solution derived by De Vriend. Numerical predictions of mound evolution by a commercial code, PISCES, are also presented for a fully submerged sand mound; the bed evolution is reasonably similar to that observed in the UKCRF. The data from the research reported in this thesis provide insight into the behaviour of sand mounds in steady and unsteady flows at laboratory scale, and should also be useful for benchmark (validation) purposes to numerical modellers of large-scale morphodynamics.
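Since the abstract describes a large-scale Gaussian fit used to separate ripples from the underlying mound, a minimal sketch of that idea follows. The synthetic profile, mound dimensions and ripple wavelength are invented for illustration and do not reproduce the UKCRF data or the thesis's fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic cross-shore bed profile: a Gaussian mound plus small ripples.
def gaussian(x, amplitude, centre, width):
    return amplitude * np.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

x = np.linspace(0.0, 10.0, 500)                 # distance along flume (m)
mound = gaussian(x, 0.15, 5.0, 1.2)             # 0.15 m high mound
ripples = 0.01 * np.sin(2 * np.pi * x / 0.3)    # ~1 cm ripples
measured = mound + ripples

# Fit the large-scale Gaussian; the residual is the small-scale ripple field.
popt, _ = curve_fit(gaussian, x, measured, p0=[0.1, 5.0, 1.0])
large_scale = gaussian(x, *popt)
residual = measured - large_scale

print("fitted amplitude %.3f m, centre %.2f m, width %.2f m" % tuple(popt))
```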
13

Laser-based detection and tracking of dynamic objects

Wang, Zeng January 2014 (has links)
In this thesis, we present three main contributions to laser-based detection and tracking of dynamic objects, from both a model-based and a model-free point of view, with an emphasis on applications to autonomous driving. A segmentation-based detector is first proposed to provide an end-to-end detection of the classes car, pedestrian and bicyclist in 3D laser data amongst significant background clutter. We postulate that, for the particular classes considered, solving a binary classification task outperforms approaches that tackle the multi-class problem directly. This is confirmed on custom and third-party datasets of urban street scenes. The sliding-window approach to object detection, while ubiquitous in the computer vision community, is largely neglected in laser-based object detectors, possibly because of its perceived computational inefficiency. We give this opinion a second thought in this thesis, and demonstrate that, by fully exploiting the sparsity of the problem, exhaustive window searching in 3D can be made efficient. We prove the mathematical equivalence between sparse convolution and voting, and devise an efficient algorithm to compute exactly the detection scores at all window locations, processing a complete Velodyne scan containing 100K points in less than half a second. Its superior performance is demonstrated on the KITTI dataset, and compares commensurately with state-of-the-art vision approaches. A new model-free approach to detection and tracking of moving objects with a 2D lidar is then proposed, aimed at detecting dynamic objects of arbitrary shapes and classes. Objects are modelled by a set of rigidly attached sample points along their boundaries whose positions are initialised with, and updated by, raw laser measurements, allowing a flexible, nonparametric representation. Dealing with raw laser points poses a significant challenge to data association. We propose a hierarchical approach, and present a new variant of the well-known Joint Compatibility Branch and Bound algorithm to handle large numbers of measurements. The system is systematically calibrated on real-world data containing 7.5K labelled object examples and validated on 6K test cases. Its performance is demonstrated against an existing industry standard targeted at the same problem domain, as well as against a classical approach to model-free tracking.
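To make the sparse-convolution-as-voting idea concrete, here is a minimal sketch in which each occupied voxel votes filter weights into every window origin that would cover it, which is equivalent to cross-correlating the filter with the sparse occupancy grid. The grid dimensions, occupancy and weights are toy values; the thesis's actual implementation is far more optimised.

```python
import numpy as np

grid_shape = (20, 20, 10)                       # sparse 3D occupancy grid
occupied = [(5, 5, 2), (6, 5, 2), (5, 6, 3)]    # occupied voxel coordinates
filt = np.random.rand(4, 4, 3)                  # per-cell detection weights (toy)

scores = np.zeros(grid_shape)
fx, fy, fz = filt.shape
for (x, y, z) in occupied:
    # Each occupied voxel votes for every window origin that covers it,
    # contributing the filter weight at its offset within that window.
    for dx in range(fx):
        for dy in range(fy):
            for dz in range(fz):
                ox, oy, oz = x - dx, y - dy, z - dz
                if ox >= 0 and oy >= 0 and oz >= 0:
                    scores[ox, oy, oz] += filt[dx, dy, dz]

best = np.unravel_index(np.argmax(scores), scores.shape)
print("highest-scoring window origin:", best)
```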
14

Biomimetic and autonomic server ensemble orchestration

Nakrani, Sunil January 2005 (has links)
This thesis addresses the orchestration of servers amongst multiple co-hosted internet services such as e-Banking, e-Auction and e-Retail in hosting centres. The hosting paradigm entails levying fees for hosting third-party internet services on servers at guaranteed levels of service performance. The orchestration of the server ensemble in hosting centres is considered in the context of maximising the hosting centre's revenue over a lengthy time horizon. The inspiration for the server orchestration approach proposed in this thesis is drawn from nature and is generally classed as swarm intelligence: specifically, the sophisticated collective behaviour of social insects, borne out of primitive interactions amongst members of the group, that solves problems beyond the capability of individual members. Consequently, the approach is self-organising, adaptive and robust. A new scheme for server ensemble orchestration is introduced in this thesis. This scheme exploits the many similarities between server orchestration in an internet hosting centre and forager allocation in a honeybee (Apis mellifera) colony. The scheme mimics the way a honeybee colony distributes foragers amongst flower patches to maximise nectar influx, in order to orchestrate servers amongst hosted internet services to maximise revenue. The scheme is extended by further exploiting inherent feedback loops within the colony to introduce self-tuning and energy-aware server ensemble orchestration. In order to evaluate the new server ensemble orchestration scheme, a collection of server ensemble orchestration methods is developed, including a classical technique that relies on past history to make time-varying orchestration decisions and two theoretical techniques that omnisciently make optimal time-varying orchestration decisions or an optimal static orchestration decision based on complete knowledge of the future. The efficacy of the new biomimetic scheme is assessed in terms of adaptiveness and versatility. The performance study uses representative classes of internet traffic stream behaviour, service users' behaviour, demand intensity, multi-service co-hosting, as well as differentiated hosting fee schedules. The biomimetic orchestration scheme is compared with the classical and the theoretical optimal orchestration techniques in terms of revenue stream. This study reveals that the new server ensemble orchestration approach is adaptive in widely varying external internet environments. The study also highlights the versatility of the biomimetic approach over the classical technique. The self-tuning scheme improves on the original performance. The energy-aware scheme is able to conserve significant energy with minimal revenue degradation. The simulation results also indicate that the new scheme is competitive with, or better than, the classical and static methods.
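A hedged sketch of the forager-allocation analogy follows: each server picks a hosted service with probability proportional to that service's recent revenue rate, mirroring recruitment of foragers to profitable flower patches. The service names, revenue figures and proportional-selection rule are assumptions for illustration, not the thesis's actual orchestration algorithm.

```python
import random
from collections import Counter

services = ["e-Banking", "e-Auction", "e-Retail"]
revenue_rate = {"e-Banking": 4.0, "e-Auction": 1.5, "e-Retail": 2.5}  # invented

# Each server independently "recruits" to a service with probability
# proportional to that service's advertised profitability (the nectar-influx
# analogue of the waggle dance).
choices = random.choices(
    services,
    weights=[revenue_rate[s] for s in services],
    k=100,                        # 100 servers in the ensemble
)
allocation = Counter(choices)
print(allocation)                 # roughly proportional to revenue rates
```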
15

Stratagems for effective function evaluation in computational chemistry

Skone, Gwyn S. January 2010 (has links)
In recent years, the potential benefits of high-throughput virtual screening to the drug discovery community have been recognized, bringing an increase in the number of tools developed for this purpose. These programs have to process large quantities of data, searching for an optimal solution in a vast combinatorial range. This is particularly the case for protein-ligand docking, since proteins are sophisticated structures with complicated interactions for which either molecule might reshape itself. Even the very limited flexibility model to be considered here, using ligand conformation ensembles, requires six dimensions of exploration - three translations and three rotations - per rigid conformation. The functions for evaluating pose suitability can also be complex to calculate. Consequently, the programs being written for these biochemical simulations are extremely resource-intensive. This work introduces a pure computer science approach to the field, developing techniques to improve the effectiveness of such tools. Their architecture is generalized to an abstract pattern of nested layers for discussion, covering scoring functions, search methods, and screening overall. Based on this, new stratagems for molecular docking software design are described, including lazy or partial evaluation, geometric analysis, and parallel processing implementation. In addition, a range of novel algorithms are presented for applications such as active site detection with linear complexity (PIES) and small molecule shape description (PASTRY) for pre-alignment of ligands. The various stratagems are assessed individually and in combination, using several modified versions of an existing docking program, to demonstrate their benefit to virtual screening in practical contexts. In particular, the importance of appropriate precision in calculations is highlighted.
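As one example of the lazy or partial evaluation stratagem mentioned in this abstract, the sketch below accumulates a pose score term by term and abandons any pose that can no longer beat the best found so far. The scoring terms, pose data and pruning rule are assumptions for illustration; they are not the thesis's docking functions.

```python
# Partial (lazy) evaluation of a pose score with early termination.
# Pruning is only safe here because every remaining term is non-negative,
# so the running total is a lower bound on the final score.

def score_pose(pose, terms, best_so_far):
    """Return the pose score, or None if evaluation was abandoned early."""
    running = 0.0
    for term in terms:
        running += term(pose)
        if running > best_so_far:      # lower scores are better
            return None                # prune: cannot beat the incumbent
    return running

# Placeholder scoring terms (e.g. steric clash, electrostatics).
terms = [lambda p: p["clash"], lambda p: p["electrostatic"]]
poses = [{"clash": 0.2, "electrostatic": 0.1},
         {"clash": 5.0, "electrostatic": 0.4}]   # pruned after the first term

best = float("inf")
for pose in poses:
    s = score_pose(pose, terms, best)
    if s is not None and s < best:
        best = s
print("best score:", best)
```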
16

Decision making under uncertainty

McInerney, Robert E. January 2014 (has links)
Operating and interacting in an environment requires the ability to manage uncertainty and to choose definite courses of action. In this thesis we look to Bayesian probability theory as the means to achieve the former, and find that through rigorous application of the rules it prescribes we can, in theory, solve problems of decision making under uncertainty. Unfortunately, such methodology is intractable in real-world problems, and thus approximation of one form or another is inevitable. Many techniques make use of heuristic procedures for managing uncertainty. We note that such methods suffer from unreliable performance and rely on the specification of ad hoc variables. Performance is often judged according to long-term asymptotic measures, which we believe ignore the most complex and relevant parts of the problem domain. We therefore look to develop principled approximate methods that preserve the meaning of Bayesian theory but operate with the scalability of heuristics. We start by looking at function approximation in continuous state and action spaces using Gaussian Processes. We develop a novel family of covariance functions which allow tractable inference methods to accommodate some of the uncertainty lost by not following full Bayesian inference. We also investigate the exploration-versus-exploitation tradeoff in the context of the Multi-Armed Bandit, and demonstrate that principled approximations come close to optimal behaviour and perform significantly better than heuristics on a range of experimental test beds.
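For concreteness, here is Thompson sampling on a Bernoulli multi-armed bandit, one standard example of a principled Bayesian treatment of the exploration-versus-exploitation trade-off. It is offered only as an illustration of the class of methods the abstract contrasts with heuristics, not as the approximations developed in the thesis, and the arm success rates are invented.

```python
import random

true_rates = [0.3, 0.5, 0.7]              # unknown to the agent
alpha = [1.0] * len(true_rates)           # Beta posterior parameters per arm
beta = [1.0] * len(true_rates)

total_reward = 0
for t in range(10000):
    # Sample a plausible success rate for each arm from its posterior,
    # then play the arm whose sample is largest.
    samples = [random.betavariate(alpha[i], beta[i]) for i in range(len(true_rates))]
    arm = samples.index(max(samples))
    reward = 1 if random.random() < true_rates[arm] else 0
    total_reward += reward
    alpha[arm] += reward                  # posterior update
    beta[arm] += 1 - reward

print("average reward: %.3f" % (total_reward / 10000))
```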
17

Exploiting whole-PDB analysis in novel bioinformatics applications

Ramraj, Varun January 2014 (has links)
The Protein Data Bank (PDB) is the definitive electronic repository for experimentally derived protein structures, composed mainly of those determined by X-ray crystallography. Approximately 200 new structures are added weekly to the PDB, and at the time of writing it contains approximately 97,000 structures. This represents an expanding wealth of high-quality information, but there seem to be few bioinformatics tools that consider and analyse these data as an ensemble. This thesis explores the development of three efficient, fast algorithms and software implementations to study protein structure using the entire PDB. The first project is a crystal-form matching tool that takes a unit cell and quickly (< 1 second) retrieves the most related matches from the PDB. The unit-cell matches are combined with sequence alignments using a novel Family Clustering Algorithm to display the results in a user-friendly way. The software tool, Nearest-cell, has been incorporated into the X-ray data collection pipeline at the Diamond Light Source, and is also available as a public web service. The bulk of the thesis is devoted to the study and prediction of protein disorder. An initial attempt to update and extend an existing predictor, RONN, exposed the limitations of that method, and a novel predictor (called MoreRONN) was developed that incorporates a novel sequence-based clustering approach to disorder data inferred from the PDB and DisProt. MoreRONN is now clearly the best-in-class disorder predictor and will soon be offered as a public web service. The third project explores the development of a clustering algorithm for protein structural fragments that can work on the scale of the whole PDB. While protein structures have long been clustered into loose families, there has to date been no comprehensive analytical clustering of short (~6 residue) fragments. A novel fragment clustering tool was built that is now leading to a public database of fragment families and representative structural fragments that should prove extremely helpful for both basic understanding and experimentation. Together, these three projects exemplify how cutting-edge computational approaches applied to extensive protein structure libraries can provide user-friendly tools that address critical everyday issues for structural biologists.
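To illustrate the kind of lookup the crystal-form matching project performs, here is a toy nearest-cell search over unit-cell parameters. The PDB identifiers, cell values and plain Euclidean distance are assumptions for illustration; the real Nearest-cell tool uses a more careful comparison (for example, cell reduction and tolerance handling).

```python
import math

# Toy reference set: entry ID -> (a, b, c in angstroms; alpha, beta, gamma in degrees).
# These IDs and parameters are invented.
pdb_cells = {
    "1abc": (78.2, 78.2, 37.1, 90.0, 90.0, 90.0),
    "2xyz": (51.5, 62.3, 70.8, 90.0, 102.4, 90.0),
    "3foo": (77.9, 78.0, 37.3, 90.0, 90.0, 90.0),
}

def cell_distance(c1, c2):
    # Naive Euclidean distance over the six cell parameters.
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(c1, c2)))

query = (78.0, 78.1, 37.2, 90.0, 90.0, 90.0)
matches = sorted(pdb_cells, key=lambda k: cell_distance(pdb_cells[k], query))
print("closest cells:", matches[:2])
```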
18

Left ventricle functional analysis in 2D+t contrast echocardiography within an atlas-based deformable template model framework

Casero Cañas, Ramón January 2008 (has links)
This biomedical engineering thesis explores the opportunities and challenges of 2D+t contrast echocardiography for left ventricle functional analysis, both clinically and within a computer vision atlas-based deformable template model framework. A database was created for the experiments in this thesis, with 21 studies of contrast Dobutamine Stress Echo, in all 4 principal planes. The database includes clinical variables, human expert hand-traced myocardial contours and visual scoring. First, the problem is studied from a clinical perspective. Quantification of endocardial global and local function using standard measures shows expected values and agreement with human expert visual scoring, but the results are less reliable for myocardial thickening. Next, the problem of segmenting the endocardium with a computer is posed in a standard landmark and atlas-based deformable template model framework. The underlying assumption is that these models can emulate human experts in terms of integrating previous knowledge about the anatomy and physiology with three sources of information from the image: texture, geometry and kinetics. Probabilistic atlases of contrast echocardiography are computed, while noting from histograms at selected anatomical locations that modelling texture with just mean intensity values may be too naive. Intensity analysis, together with the clinical results above, suggests that the lack of external boundary definition may preclude this imaging technique from appropriately measuring myocardial thickening, while endocardial boundary definition is appropriate for evaluation of wall motion. Geometry is presented in a Principal Component Analysis (PCA) context, highlighting issues about Gaussianity, the correlation and covariance matrices with respect to physiology, and analysing different measures of dimensionality. A popular extension of deformable models, Active Appearance Models (AAMs), is then studied in depth. Contrary to common wisdom, it is contended that using a PCA texture space instead of a fixed atlas is detrimental to segmentation, and that PCA models are not convenient for texture modelling. To integrate kinetics, a novel spatio-temporal model of cardiac contours is proposed. The new explicit model does not require frame interpolation, and it is compared to previous implicit models in terms of approximation error when the shape vector changes from frame to frame or remains constant throughout the cardiac cycle. Finally, the 2D+t atlas-based deformable model segmentation problem is formulated and solved with a gradient descent approach. Experiments using the similarity transformation suggest that segmentation of the whole cardiac volume outperforms segmentation of individual frames. A relatively new approach, the inverse compositional algorithm, is shown to decrease running times of the classic Lucas-Kanade algorithm by a factor of 20 to 25, to values that are within reach of real-time processing.
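As a small sketch of the PCA shape modelling that underlies the deformable template framework discussed above, the example below builds a linear shape model from flattened landmark vectors and synthesises a new contour from the retained modes. The synthetic landmark data, the 95% variance threshold and the mode weighting are assumptions for illustration, not the thesis's models or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_shapes, n_landmarks = 21, 40
shapes = rng.normal(size=(n_shapes, 2 * n_landmarks))   # x1, y1, x2, y2, ... (synthetic)

# Centre the shapes and compute the covariance of the landmark vectors.
mean_shape = shapes.mean(axis=0)
centred = shapes - mean_shape
cov = centred.T @ centred / (n_shapes - 1)

# Eigen-decomposition gives the main modes of shape variation.
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Retain enough modes to explain 95% of the variance.
explained = np.cumsum(eigvals) / eigvals.sum()
n_modes = int(np.searchsorted(explained, 0.95) + 1)
print("modes kept:", n_modes)

# A new shape is the mean plus a weighted sum of the retained modes.
b = np.zeros(n_modes)
b[0] = 2.0 * np.sqrt(eigvals[0])                # +2 s.d. along the first mode
new_shape = mean_shape + eigvecs[:, :n_modes] @ b
```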
