231

Conceptual design and development of a navigation system for a mobile robot

Chandler, Angela January 1999 (has links)
No description available.
232

Random sampling as a clutter reduction technique to facilitate interactive visualisation of large datasets

Ellis, Geoffrey January 2008 (has links)
Within our physical world lies a digital world populated with an ever-increasing number of sizeable data collections. Exploring these large datasets for patterns or trends is a difficult and complex task, especially when users do not always know what they are looking for. Information visualisation can facilitate this task through an interactive visual representation, thus making the data easier to interpret. However, we can soon reach a limit on the amount of data that can be plotted before the visual display becomes overcrowded or cluttered, and hence potentially important information becomes hidden. The main theme of this work is to investigate the use of dynamic random sampling for reducing display clutter. Although randomness has been successfully applied in many areas of computer science and sampling has been used in data processing, the use of random sampling as a dynamic clutter reduction technique is novel. In addition, random sampling is particularly suitable for exploratory tasks as it offers a way of reducing the amount of data without the user having to decide what data is important. Sampling-based scatterplot and parallel coordinate visualisations are developed to experiment with various options and tools. These include simple, dynamic sampling controls with density feedback; a method of checking the reality of the representative sample; the option of global and/or localised clutter reduction using a variety of novel lenses; and an auto-sampling option that automatically maintains a reasonable view of the data within the lens. Furthermore, this work showed that sampling can be added to existing tools and used effectively in conjunction with other clutter reduction techniques. Sampling is evaluated both analytically, using a taxonomy of clutter reduction developed for the purpose, and experimentally using large datasets. The analytic route was prompted by an exploratory analysis, which showed that evaluations of information visualisation based on user studies are problematic. This thesis has contributed to several areas of research:
‣ the feasibility and flexibility of global or lens-based sampling as a clutter reduction technique are demonstrated through sampling-based scatterplot and parallel coordinate visualisations.
‣ the novel method of calculating the density for overlapping lines in parallel coordinate plots is both accurate and efficient, and enables constant density within a sampling lens to be maintained without user intervention.
‣ the novel criteria-based taxonomy of clutter reduction for information visualisation provides designers with a method to critique existing visualisations and think about new ones.
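The abstract describes dynamic random sampling, applied globally or inside movable lenses, as a way of decluttering dense plots. A minimal sketch of that idea follows, assuming a circular lens over 2D scatterplot points; the function names and sampling rates are illustrative, not the thesis's implementation.

```python
import random

def sample_points(points, fraction, rng=None):
    """Keep only a random fraction of the points, reducing overplotting."""
    if not points:
        return []
    rng = rng or random.Random(42)
    k = max(1, int(len(points) * fraction))
    return rng.sample(points, k)

def lens_sample(points, centre, radius, lens_fraction, background_fraction=1.0, rng=None):
    """Sample points inside a circular lens at a different (usually lower)
    rate than the rest of the plot: localised clutter reduction."""
    rng = rng or random.Random(42)
    cx, cy = centre
    inside, outside = [], []
    for x, y in points:
        (inside if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 else outside).append((x, y))
    return sample_points(inside, lens_fraction, rng) + sample_points(outside, background_fraction, rng)

# e.g. thin a dense region around (0.5, 0.5) to 10% while leaving the rest intact
data = [(random.random(), random.random()) for _ in range(10000)]
visible = lens_sample(data, centre=(0.5, 0.5), radius=0.2, lens_fraction=0.1)
```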
233

The impact of highly interactive workloads on video-on-demand systems

Brampton, Andrew January 2008 (has links)
The traditional start-to-finish playback model is not suitable for all modern interactive video streams. Users expect support for higher levels of interactivity, such as fast forward and rewind or the ability to seek arbitrarily within their media quickly and efficiently. By conducting user studies we have observed that start-to-finish playback is not applicable to many genres of video, and that different playback models fit better. We discuss how existing delivery techniques are affected by these new observations. Novel interactive controls such as bookmarks have also had a strong impact on user behaviour. This has led to the segments within the media being accessed in an uneven fashion, causing hotspots of interest to form: areas with orders of magnitude more viewers than others. These hotspots typically began at a bookmark, though not always, which led us to design a dynamic bookmark positioning algorithm. As well as a hotspot's position, determining its length can be beneficial. This aids autonomic techniques such as replication and pre-fetching, and also allows users to find what they want more quickly. Under high levels of interactivity, delivery techniques are less efficient due to the unpredictability of the users. We have, however, developed techniques which restore some of this predictability, allowing clients or servers to predict future actions based on past user actions. These techniques prove exceedingly useful for pre-fetching, which reduces seek latencies for clients and can reduce load on servers. However, knowledge of past user activity needs to be gathered from the network, so we develop techniques to do this in a distributed manner.
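The abstract argues that logs of past viewer behaviour can restore predictability and drive pre-fetching. A minimal sketch of one way this could look, assuming a simple first-order transition model over media segments; the log format, segment identifiers and function names are assumptions for illustration, not taken from the thesis.

```python
from collections import Counter, defaultdict

def build_transition_model(sessions):
    """Count how often viewers jumped from one media segment to the next."""
    transitions = defaultdict(Counter)
    for segments in sessions:                 # each session: ordered segments a viewer visited
        for current, nxt in zip(segments, segments[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, current, k=3):
    """Return the k segments most likely to be requested next, as pre-fetch candidates."""
    return [seg for seg, _ in transitions[current].most_common(k)]

# Hypothetical seek logs gathered (perhaps in a distributed fashion) from earlier viewers.
logs = [[0, 1, 5, 6], [0, 5, 6, 7], [0, 1, 2]]
model = build_transition_model(logs)
print(predict_next(model, 0))                 # e.g. [1, 5]
```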
234

Applications of signal detection theory to the performance of imaging systems, human observers and artificial intelligence in radiography

Manning, David J. January 1998 (has links)
An investigation was carried out to evaluate diagnostic performance in medical radiology. A critical review of methods available for the assessment of image quality in terms of physical objective measurements and quantitative observer performance was followed by a series of experiments which applied the techniques of Receiver Operating Characteristics (ROC) to radiographic problems. An appraisal of the performance of six currently available imaging systems designed for chest radiography was performed using expert observers and an anthropomorphic phantom. Results showed a range of abilities to demonstrate pulmonary nodules (ROC areas of 0.866 to 0.961). The ROC outcomes for the imaging systems were shown to correlate well with signal to noise ratio (SNR) measurements for the images (0.78, p< 0.05) although comparisons of ROC and threshold detection indices (HT) gave a lower level of agreement (0.6, p<0.05). The SNR method of image evaluation could probably be used as an alternative to ROC techniques in routine quality assurance procedures when time is short. Observers from a group of undergraduate radiography students were tested by an ROC study into their ability to detect pulmonary lesions in chest images. Their ROC areas (Az) ranged from 0.616 to 0.857 (mean 0.74) compared with an expert mean score of 0.872. The low score for the students was investigated in terms of the cognitive task and their search strategy training. Their (Az ) scores showed no significant correlation with simple psychometric tests. A neural network was tested against radiologists, radiographers and student radiographers in its ability to identify fractures in wrist radiographs. All observers performed to a similar level of ROC Az score but the artificial intelligence showed higher specificity values. This attribute was used to filter some of the normals from the test population and resulted in changes to the mean Az score for the human observers.
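For reference, the ROC area (Az) reported throughout this abstract can be estimated directly from confidence ratings as the probability that a randomly chosen abnormal case receives a higher rating than a randomly chosen normal case. A small illustrative sketch; the ratings below are invented, not data from the study.

```python
def roc_area(scores, labels):
    """Estimate the ROC area (Az): the probability that a randomly chosen
    positive (abnormal) case is scored above a randomly chosen negative one,
    counting ties as one half."""
    positives = [s for s, y in zip(scores, labels) if y == 1]
    negatives = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in positives:
        for n in negatives:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(positives) * len(negatives))

# Hypothetical observer confidence ratings (1-5) for images with (1) / without (0) a lesion.
ratings = [5, 4, 4, 3, 2, 2, 1, 1]
truth   = [1, 1, 0, 1, 0, 1, 0, 0]
print(roc_area(ratings, truth))      # about 0.81
```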
235

A structural approach to matching problems with preferences

McDermid, Eric J. January 2011 (has links)
This thesis is a study of a number of matching problems that seek to match together pairs or groups of agents subject to the preferences of some or all of the agents. We present a number of new algorithmic results for five specific problem domains. Each of these results is derived with the aid of some structural properties implicitly embedded in the problem. We begin by describing an approximation algorithm for the problem of finding a maximum stable matching for an instance of the stable marriage problem with ties and incomplete lists (MAX-SMTI). Our polynomial time approximation algorithm provides a performance guarantee of 3/2 for the general version of MAX-SMTI, improving upon the previous best approximation algorithm, which gave a performance guarantee of 5/3. Next, we study the sex-equal stable marriage problem (SESM). We show that SESM is W[1]-hard, even if the men's and women's preference lists are both of length at most three. This improves upon the previously known hardness results. We contrast this with an exact, low-order exponential time algorithm. This is the first non-trivial exponential time algorithm known for this problem, or indeed for any hard stable matching problem. Turning our attention to the hospitals / residents problem with couples (HRC), we show that HRC is NP-complete, even if very severe restrictions are placed on the input. By contrast, we give a linear-time algorithm to find a stable matching with couples (or report that none exists) when stability is defined in terms of the classical Gale-Shapley concept. This result represents the most general polynomial time solvable restriction of HRC that we are aware of. We then explore the three dimensional stable matching problem (3DSM), in which we seek to find stable matchings across three sets of agents, rather than two (as in the classical case). We show that under two natural definitions of stability, finding a stable matching for a 3DSM instance is NP-complete. These hardness results resolve some open questions in the literature. Finally, we study the popular matching problem (POP-M) in the context of matching a set of applicants to a set of posts. We provide a characterization of the set of popular matchings for an arbitrary POP-M instance in terms of a new structure called the switching graph, which is exploited to yield efficient algorithms for a range of associated problems, extending and improving upon the previously best-known results for this problem.
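The abstract refers to stability in the classical Gale-Shapley sense. As background, a compact sketch of the textbook Gale-Shapley proposal algorithm for the basic stable marriage problem with complete, strictly ordered lists; this is the classical algorithm only, not the thesis's extensions to ties, couples or three dimensions.

```python
def gale_shapley(men_prefs, women_prefs):
    """Classical Gale-Shapley: men propose, women tentatively accept,
    producing a (man-optimal) stable matching."""
    free = list(men_prefs)                       # men not yet matched
    next_choice = {m: 0 for m in men_prefs}      # index of the next woman each man proposes to
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_prefs.items()}
    engaged = {}                                 # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:   # w prefers m to her current partner
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)                       # w rejects m; he will propose again later
    return {m: w for w, m in engaged.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(men, women))                  # e.g. {'a': 'x', 'b': 'y'}
```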
236

Bigraphs with sharing and applications in wireless networks

Sevegnani, Michele January 2012 (has links)
Bigraphs are a fully graphical process algebraic formalism, capable of representing both the position in space of agents and their inter-connections. However, they assume a topology based on sets of trees and thus cannot represent spatial locations that are shared among several entities in a simple or intuitive way. This is a problem, because shared locations are often a requirement, for example, when modelling scenarios in the physical world or in modern complex computer systems such as wireless networks and spatial-aware applications in ubiquitous computing. We propose bigraphs with sharing, a generalisation of the original definition of bigraphs, to allow for overlapping topologies. The new locality model is based on directed acyclic graphs. We demonstrate the new formalism can be defined in the general framework of bigraphical theories and wide reactive systems, as originally devised by Robin Milner. We do so by defining a categorical interpretation of bigraphs with sharing, an axiomatisation derived from the equations of a bialgebra over finite ordinals, and a normal form to express bigraphical terms. We illustrate how sharing is essential for modelling overlapping localities by presenting two example case studies in the field of wireless networking. We show that bigraphs with sharing can be used realistically in a production environment by describing the implementation of an efficient matching algorithm and a software tool for the definition, simulation, visualisation and analysis of bigraphical reactive systems.
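The key structural change described here is moving the locality (place) model from a forest of trees to a directed acyclic graph, so that a single place can sit under several parents. A toy sketch of that data-structure idea only; the class, its names and the wireless example are illustrative assumptions, not the bigraph formalism or the tool described above.

```python
class PlaceGraph:
    """Minimal sketch of a locality model in which a place may have several
    parents (a DAG), so a location can be shared between regions, which a
    purely tree-shaped place graph cannot express."""

    def __init__(self):
        self.parents = {}                        # node -> set of parent nodes

    def add(self, node, parents=()):
        self.parents.setdefault(node, set()).update(parents)
        for p in parents:
            self.parents.setdefault(p, set())
        if not self._is_acyclic():
            raise ValueError("a place graph must stay acyclic")

    def _is_acyclic(self):
        seen, on_path = set(), set()
        def visit(n):
            if n in on_path:
                return False                     # cycle found
            if n in seen:
                return True
            on_path.add(n)
            ok = all(visit(p) for p in self.parents[n])
            on_path.discard(n)
            seen.add(n)
            return ok
        return all(visit(n) for n in self.parents)

# A wireless signal range shared by two rooms: 'signal' sits under both.
g = PlaceGraph()
g.add("room1"); g.add("room2")
g.add("signal", parents={"room1", "room2"})      # shared location
g.add("laptop", parents={"signal"})
```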
237

Part of speech N-grams for information retrieval

Lioma, Christina Amalia January 2008 (has links)
The increasing availability of information on the World Wide Web (Web), and the need to access relevant aspects of this information, provide an important impetus for the development of automatic intelligent Information Retrieval (IR) technology. IR systems convert human-authored language into representations that can be processed by computers, with the aim of providing humans with access to knowledge. Specifically, IR applications locate and quantify informative content in data, and make statistical decisions on the topical similarity, or relevance, between different items of data. The wide popularity of IR applications in the last decades has driven intensive research and development into theoretical models of information and relevance, and their implementation into usable applications, such as commercial search engines. The majority of IR systems today typically rely on statistical manipulations of individual lexical frequencies (i.e., single word counts) to estimate the relevance of a document to a user request, on the assumption that such lexical statistics can be sufficiently representative of informative content. Such estimations implicitly assume that words occur independently of each other, and as such ignore the compositional semantics of language. This assumption, however, is not entirely true, and can cause several problems, such as ambiguity in understanding textual information, misinterpreting or falsifying the original informative intent, and limiting the semantic scope of text. These problems can hinder the accurate estimation of relevance between texts, and hence harm the performance of an IR application. This thesis investigates the use of non-lexical statistics by IR models, with the goal of enhancing the estimation of relevance between a document and a user request. These non-lexical statistics consist of part of speech information. The parts of speech are the grammatical classes of words (e.g., noun, verb). Part of speech statistics are modelled in the form of part of speech (POS) n-grams, which are contiguous sequences of parts of speech, extracted from text. The distribution of POS n-grams in language is statistically analysed. It is shown that there exists a relationship between the frequency and informative content of POS n-grams. Based on this, different applications of POS n-grams to IR technology are described and evaluated with state-of-the-art systems. Experimental results show that POS n-grams can assist the retrieval process.
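As a small, hypothetical illustration of what a POS n-gram is: the tag set and example sentence below are invented, and in practice the tags would come from an automatic POS tagger run over the collection.

```python
from collections import Counter

def pos_ngrams(tagged_tokens, n=3):
    """Extract contiguous part-of-speech n-grams from a POS-tagged sentence."""
    tags = [tag for _, tag in tagged_tokens]
    return [tuple(tags[i:i + n]) for i in range(len(tags) - n + 1)]

# Hypothetical tagged text (word, POS) pairs.
tagged = [("the", "DET"), ("system", "NOUN"), ("ranks", "VERB"),
          ("relevant", "ADJ"), ("documents", "NOUN")]

counts = Counter(pos_ngrams(tagged, n=3))
print(counts.most_common(2))
# e.g. [(('DET', 'NOUN', 'VERB'), 1), (('NOUN', 'VERB', 'ADJ'), 1)]
```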
238

A specialised constraint approach for stable matching problems

Unsworth, Chris January 2008 (has links)
Constraint programming is a generalised framework designed to solve combinatorial problems. This framework is made up of a set of predefined independent components and generalised algorithms. This is a very versatile structure which allows for a variety of rich combinatorial problems to be represented and solved relatively easily. Stable matching problems consist of a set of participants wishing to be matched into pairs or groups in a stable manner. A matching is said to be stable if there is no pair or group of participants that would rather make a private arrangement to improve their situation and thus undermine the matching. There are many important "real life" applications of stable matching problems across the world. These include the Hospitals/Residents problem, in which a set of graduating medical students, also known as residents, needs to be assigned to hospital posts. Some authorities assign children to schools as a stable matching problem. Many other such problems are also tackled as stable matching problems. A number of classical stable matching problems have efficient specialised algorithmic solutions. Constraint programming solutions to stable matching problems have been investigated in the past. These solutions have been able to match the theoretically optimal time complexities of the algorithmic solutions. However, empirical evidence has shown that in reality these constraint solutions run significantly slower than the specialised algorithmic solutions. Furthermore, their memory requirements prohibit them from solving problems which the specialised algorithmic solutions can solve in a fraction of a second. My contribution investigates the possibility of modelling stable matching problems as specialised constraints. The motivation behind this approach was to find solutions to these problems which maintain the versatility of the constraint solutions, whilst significantly reducing the performance gap between constraint and specialised algorithmic solutions. To this end, specialised constraint solutions have been developed for the stable marriage problem and the Hospitals/Residents problem. Empirical evidence has been presented which shows that these solutions can solve significantly larger problems than previously published constraint solutions. For these larger problem instances it was seen that the specialised constraint solutions came within a factor of four of the time required by algorithmic solutions. It has also been shown that, through further specialisation, these constraint solutions can be made to run significantly faster. However, these improvements came at the cost of versatility. As a demonstration of the versatility of these solutions it is shown that, by adding simple side constraints, richer problems can be easily modelled. These richer problems add additional criteria and/or an optimisation requirement to the original stable matching problems. Many of these problems have been proven to be NP-hard and some have no known algorithmic solutions. Included with these models are results from empirical studies which show that these are indeed feasible solutions to the richer problems. Results from the studies also provide some insight into the structure of these problems, some of which have had little or no previous study.
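The stability condition described at the start of this abstract can be stated very concretely: no two participants both prefer each other to their assigned partners. A brute-force sketch of that check for the basic stable marriage case, illustrative only and not the thesis's specialised constraint encoding.

```python
def is_stable(matching, men_prefs, women_prefs):
    """Check the classical stability condition: no man and woman both prefer
    each other to their assigned partners (no 'blocking pair')."""
    partner_of_woman = {w: m for m, w in matching.items()}
    for m, prefs in men_prefs.items():
        current = matching[m]
        # Every woman m ranks above his partner must prefer her own partner to m.
        for w in prefs[:prefs.index(current)]:
            w_prefs = women_prefs[w]
            if w_prefs.index(m) < w_prefs.index(partner_of_woman[w]):
                return False                     # (m, w) is a blocking pair
    return True

men = {"a": ["x", "y"], "b": ["x", "y"]}
women = {"x": ["a", "b"], "y": ["a", "b"]}
print(is_stable({"a": "y", "b": "x"}, men, women))   # False: (a, x) blocks
print(is_stable({"a": "x", "b": "y"}, men, women))   # True
```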
239

The Web Engineering Security (WES) methodology

Glisson, William Bradley January 2008 (has links)
The World Wide Web has had a significant impact on basic operational economic components in global, information-rich civilizations. This impact is forcing organizations to provide justification for security from a business case perspective and to focus on security from a web application development environment perspective. This increased focus on security was the basis of a business case discussion and led to the acquisition of empirical evidence gathered from a high-level Web survey and more detailed industry surveys to analyse security in the Web application development environment. Along with this information, a collection of evidence from relevant literature was also gathered. Individual aspects of the data gathered in the previously mentioned activities contributed to the proposal of the Essential Elements (EE) and the Security Criteria for Web Application Development (SCWAD). The Essential Elements present the idea that there are essential, basic organizational elements that need to be identified, defined and addressed before examining security aspects of a Web Engineering Development process. The Security Criteria for Web Application Development identifies criteria that need to be addressed by a secure Web Engineering process. Both the EE and SCWAD are presented in detail along with relevant justification of these two elements to Web Engineering. SCWAD is utilized as a framework to evaluate the security of a representative selection of recognized software engineering processes used in Web Engineering application development. The software engineering processes appraised by SCWAD include: the Waterfall Model, the Unified Software Development Process (USD), Dynamic Systems Development Method (DSDM) and eXtreme Programming (XP). SCWAD is also used to assess existing security methodologies, comprising the Orion Strategy; Survivable/Viable IS approaches; the Comprehensive Lightweight Application Security Process (CLASP); and Microsoft’s Trustworthy Computing Security Development Lifecycle. The synthesis of information provided by both the EE and SCWAD was used to develop the Web Engineering Security (WES) methodology. WES is a proactive, flexible, process-neutral security methodology with customizable components that is based on empirical evidence and used to explicitly integrate security throughout an organization’s chosen application development process. In order to evaluate the practical application of the EE, SCWAD and the WES methodology, two case studies were conducted during the course of this research. The first case study describes the application of both the EE and SCWAD to the Hunterian Museum and Art Gallery’s Online Photo Library (HOPL) Internet application project. The second case study presents the commercial implementation of the WES methodology within a Global Fortune 500 financial service sector organization. The assessment of the WES methodology within the organization consisted of an initial survey establishing current security practices, a follow-up survey after changes were implemented and an overall analysis of the security conditions assigned to projects throughout the life of the case study.
240

Contextual mobile adaptation

Hall, Malcolm January 2008 (has links)
Ubiquitous computing (ubicomp) involves systems that attempt to fit in with users’ context and interaction. Researchers agree that system adaptation is a key issue in ubicomp because it can be hard to predict changes in contexts, needs and uses. Even with the best planning, it is impossible to foresee all uses of software at the design stage. In order for software to continue to be helpful and appropriate it should, ideally, be as dynamic as the environment in which it operates. Changes in user requirements, contexts of use and system resources mean software should also adapt to better support these changes. An area in which adaptation is clearly lacking is in ubicomp systems, especially those designed for mobile devices. By improving techniques and infrastructure to support adaptation it is possible for ubicomp systems to not only sense and adapt to the environments they are running in, but also retrieve and install new functionality so as to better support the dynamic context and needs of users in such environments. Dynamic adaptation of software refers to the act of changing the structure of some part of a software system as it executes, without stopping or restarting it. One of the core goals of this thesis is to discover if such adaptation is feasible, useful and appropriate in the mobile environment, and how designers can create more adaptive and flexible ubicomp systems and associated user experiences. Through a detailed study of existing literature and experience of several early systems, this thesis presents design issues and requirements for adaptive ubicomp systems. This thesis presents the Domino framework, and demonstrates that a mobile collaborative software adaptation framework is achievable. This system can recommend future adaptations based on a history of use. The framework demonstrates that wireless network connections between mobile devices can be used to transport usage logs and software components, with such connections made either in chance encounters or in designed multi-user interactions. Another aim of the thesis is to discover if users can comprehend and smoothly interact with systems that are adapting. To evaluate Domino, a multiplayer game called Castles has been developed, in which game buildings are in fact software modules that are recommended and transferred between players. This evaluation showed that people are comfortable receiving semi-automated software recommendations; these complement traditional recommendation methods such as word of mouth and online forums, with the system’s support freeing users to discuss more in-depth aspects of the system, such as tactics and strategies for use, rather than forcing them to discover, acquire and integrate software by themselves.
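Domino's recommendations are based on histories of use exchanged between devices. A rough sketch of one simple way such a recommender could score candidate modules by co-occurrence in other users' logs; the module names, log format and scoring below are assumptions for illustration, not Domino's actual algorithm.

```python
from collections import Counter, defaultdict

def build_cooccurrence(usage_logs):
    """Count how often pairs of software modules appear in the same usage log."""
    co = defaultdict(Counter)
    for modules in usage_logs:
        for a in modules:
            for b in modules:
                if a != b:
                    co[a][b] += 1
    return co

def recommend(co, installed, k=3):
    """Suggest modules that other users commonly ran alongside the ones installed here."""
    scores = Counter()
    for m in installed:
        scores.update(co.get(m, Counter()))
    for m in installed:                          # never recommend what is already present
        scores.pop(m, None)
    return [mod for mod, _ in scores.most_common(k)]

# Hypothetical logs exchanged between devices during chance wireless encounters.
logs = [{"map", "chat", "radar"}, {"map", "radar"}, {"chat", "notes"}]
print(recommend(build_cooccurrence(logs), installed={"map"}))   # e.g. ['radar', 'chat']
```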
