1. Designing and evaluating a user interface for continuous embedded lifelogging based on physical context. Mohamed, Esmail Esaed Tahir, January 2013.
An increase in both personal information and storage capacity has encouraged people to store and archive their life experience in multimedia formats. The usefulness of such large amounts of data will remain limited without the development of both retrieval techniques and interfaces that help people access and navigate their personal collections. The research described in this thesis investigates lifelogging technology from the perspective of the psychology of memory and human-computer interaction. It seeks to increase my understanding of what data can trigger memories and how this insight might be used to retrieve past life experiences through interfaces to lifelogging technology. The review of memory and of previous research on lifelogging technology allowed me to establish a clear understanding of how memory works and to design novel and effective memory cues, whilst at the same time critiquing existing lifelogging systems and approaches to retrieving memories of past actions and activities. In the initial experiments I evaluated the design and implementation of a prototype, which exposed numerous problems in both the visualisation of data and usability. These findings informed the design of a novel lifelogging prototype to facilitate retrieval. I assessed this second prototype and determined how an improved system supported access to and retrieval of users' past life experiences: in particular, how users group their data into events, how they interact with their data, and the classes of memories that it supported. In this doctoral thesis I found that visualising the movements of users' hands and bodies facilitated grouping activities into events when combined with the photos and other data captured at the same time. In addition, the movements of the user's hand and body, and the movements of some objects, can support activity recognition and help users detect activities and group them into events. Furthermore, the ability to search for specific movements significantly reduced the amount of time needed to retrieve data related to specific events. I identified three major strategies that users followed to make sense of the combined data: skimming sequences, cross-sensor jumping, and continued scanning.

2. An investigation of the representations of users' requirements in the design of interactive systems. Guevara, K. A., January 1989.
The design of interactive computer systems was identified as an important area for investigation due to increasing evidence of a discrepancy between the intended use of systems and their actual use by users. This led to the hypothesis that the discrepancies between systems and users were attributable to an inadequate representation of users' requirements in the design of the systems. The research therefore focused on the design process and how users' requirements were represented within it. It was based on an investigation of two areas of design: the type of design processes that developed in system design, and the representations of users' requirements in design. Studies were based on structured interviews with designers, on observations of design teams engaged in design tasks, and on documentation from design projects. A major component of the research findings concerns the design context. The research has made it possible to see how variations in design relate to the context in which it takes place. Some of the primary contextual influences include commercial constraints, the pressure to innovate, and the specialisation in user interface design. Another significant finding relates to the representations of users' requirements in the design process. Two key issues emerge from the findings. First, designers approach design tasks with a technical, system-based design model. The application of this model to design tasks is often inappropriate; however, designers lack design schemas appropriate to user-related tasks. The second issue is that designers often work with inadequate information on users' requirements. The design process is characterised by limitations in the information on users' requirements available in design tasks. The extent to which these limitations are experienced by designers differs according to the design context.

3. Engineering adaptive model-driven user interfaces for enterprise applications. Akiki, Pierre, January 2014.
Enterprise applications such as enterprise resource planning systems have numerous complex user interfaces (UIs). Usability problems plague these UIs because they are offered as a generic off-the-shelf solution to end-users with diverse needs in terms of their required features and layout preferences. Adaptive UIs can help in improving usability by tailoring the features and layout based on the context-of-use. The model-driven UI development approach offers the possibility of applying different types of adaptations on the various UI levels of abstraction. This approach forms the basis for many works researching the development of adaptive UIs. Yet, several gaps were identified in state-of-the-art adaptive model-driven UI development systems. To fill these gaps, this thesis presents an approach that offers the following novel contributions:

- The Cedar Architecture serves as a reference for developing adaptive model-driven enterprise application user interfaces.
- Role-Based User Interface Simplification (RBUIS) is a mechanism for improving usability through adaptive behavior, by providing end-users with a minimal feature-set and an optimal layout based on the context-of-use.
- Cedar Studio is an integrated development environment, which provides tool support for building adaptive model-driven enterprise application UIs using RBUIS based on the Cedar Architecture.

The contributions were evaluated from the technical and human perspectives. Several metrics were established and applied to measure the technical characteristics of the proposed approach after integrating it into an open-source enterprise application. Additional insights about the approach were obtained through the opinions of industry experts and data from real-life projects. Usability studies showed the approach's ability to significantly improve usability in terms of end-user efficiency, effectiveness and satisfaction.
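
As an illustration of the kind of adaptive behavior RBUIS describes, the sketch below reduces a user interface to the feature-set permitted for a user's role. It is not code from the thesis: the role names, feature identifiers, and mapping are hypothetical, and the actual approach works on UI models at several levels of abstraction rather than on concrete widgets.

```python
# Minimal sketch of role-based feature-set reduction in the spirit of RBUIS.
# The roles, features, and mapping below are illustrative assumptions,
# not the thesis's actual models or tooling.
from dataclasses import dataclass

@dataclass(frozen=True)
class Feature:
    name: str
    widget: str  # e.g. "field", "button", "grid"

FULL_UI = [
    Feature("customer_name", "field"),
    Feature("credit_limit", "field"),
    Feature("approve_discount", "button"),
    Feature("audit_trail", "grid"),
]

# Hypothetical role-to-feature mapping; a fuller context-of-use would also
# cover the user's task, device, and layout preferences.
ROLE_FEATURES = {
    "sales_clerk": {"customer_name"},
    "sales_manager": {"customer_name", "credit_limit", "approve_discount"},
    "auditor": {"customer_name", "audit_trail"},
}

def simplify_ui(features, role):
    """Return the minimal feature-set for the given role."""
    allowed = ROLE_FEATURES.get(role, set())
    return [f for f in features if f.name in allowed]

for f in simplify_ui(FULL_UI, "sales_clerk"):
    print(f.name, f.widget)  # only the features a sales clerk needs
```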

4. User interfaces and discrete event simulation models. Kuljis, Jasminka, January 1995.
A user interface is critical to the success of any computer-based system. Numerous studies have shown that interface design has a significant influence on factors such as learning time, performance speed, error rates, and user satisfaction. Computer-based simulation modelling is one of the domains that is particularly demanding in terms of user interfaces. It is also an area that often pioneers new technologies that have not necessarily been researched previously in terms of human-computer interaction. The dissertation describes research into user interfaces for discrete event simulation. Issues that influence the 'usability' of such systems are examined. Several representative systems were investigated in order to generate some general assumptions about the characteristics of user interfaces employed in simulation systems. A case study was carried out to gain practical experience and to identify problems that can be encountered in user interface development. There is a need for simulation systems that can support the development of simulation models in many domains that are not supported by contemporary simulation software. Many user interface deficiencies are discovered and reported. On the basis of the findings in this research, proposals are made on how user interfaces for simulation systems can be enhanced to better match the needs specific to the domain of simulation modelling, and how to better support users in simulation model development. Such improvements could reduce the time needed to learn simulation systems, support retention of learned concepts over time, reduce the number of errors during interaction, reduce the time and effort needed for model development, and provide greater user satisfaction.

5. Uncertainty analysis in the Model Web. Jones, Richard, January 2014.
This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, the thesis first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for the Simple Object Access Protocol (SOAP), the Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. The thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods, by providing a Web-based user interface and backend that ease the process of building and integrating emulators. These tools, together with the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated through the implementation of a Web-based workflow for predicting future crop yields in the UK, which also demonstrates the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
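
For readers unfamiliar with emulation, the sketch below illustrates the underlying idea rather than the thesis's implementation: an expensive model is run at a small number of design points and a Gaussian process surrogate is fitted, so that later predictions, with uncertainty estimates, can replace further runs. The simulator function and the design points here are invented for illustration.

```python
# Minimal Gaussian process emulator sketch (illustrative only; the simulator,
# design points, and kernel choice are assumptions, not the thesis's setup).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulator(x):
    """Stand-in for a slow environmental model (hypothetical)."""
    return np.sin(3 * x) + 0.5 * x

# A small training design: the only points at which the simulator is run.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_simulator(X_train).ravel()

# Fit the surrogate; kernel hyperparameters are optimised automatically.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Cheap predictions with uncertainty at new inputs, instead of new simulator runs.
X_new = np.linspace(0.0, 2.0, 50).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
print(mean[:5], std[:5])
```

Exposing such an emulator behind a Web service interface then lets a workflow call the fast surrogate in place of the original model, which is the integration problem the tools described above aim to ease.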

6. Design and process variation analysis of SRAM architectures. Sun, Luo, January 2014.
Future memory subsystems will have to achieve high performance, low power, high reliability, small size and high robustness under process variation. Moreover, it should be possible to build them with reasonable yield. Thus, design methodologies are needed to enhance the behavior and yield of these systems, and the growing effects of process variation on design metrics such as performance and power consumption need to be evaluated. This dissertation addresses these problems. First, it proposes a novel SRAM bitcell design based on a promising technology, the carbon nanotube field-effect transistor (CNTFET). This CNTFET-based SRAM design offers better stability, lower power dissipation and better tolerance to process variation compared with CMOS-based and other CNTFET-based SRAM bitcells. However, carbon nanotubes (CNTs) can turn out either semiconducting or metallic during fabrication, so CNTFETs suffer from short-circuit unreliability due to the metallic path between the source and drain. Metallic-CNT-tolerant techniques are therefore applied to the proposed SRAM design to improve the probability of obtaining functional SRAM cells. The structure of the CNTFET SRAM design with metallic CNT tolerance is evaluated and compared to the original CNTFET-based SRAM bitcell. The low-power binary-tree-based SRAM architecture (LPSRAM) is then presented. This is a methodology for future multi-gigabit SRAM designs that allows them to achieve high performance, low power and high robustness at the expense of a reasonable area overhead. Analytical models are developed to evaluate the performance, power and cost of this structure, and empirical simulations are used to verify them; the results show that the maximum relative model error is within 8%. Moreover, future SRAM designs need to be easily testable, and LPSRAM shows great potential for testability. A testing algorithm and a built-in testing structure (BITS) are developed for the testable LPSRAM architecture, giving a reduction in testing time and power. The performance of IC designs is becoming more sensitive to process variation as technology continues to scale down to nanometer levels. The statistical blockade (SB) approach has recently been proposed to study the impact of process variation and to identify rare events, especially for highly replicated circuits such as SRAMs. Nevertheless, the classification threshold and the training sample size are key problems that cause imprecise yield estimation and longer runtimes. Two improved SB algorithms are proposed to address these issues, and the experimental results show that high speed can be achieved with high accuracy. A novel variability evaluation approach is then developed based on the enhanced statistical blockade method: only the tail part of the distribution is used to evaluate design robustness under process variation, thus saving time. Four SRAM cells in different logic styles are used to verify the effectiveness of the approach in the experiments, and the results show that the method is faster than traditional estimation approaches. In summary, this dissertation reports on advanced SRAM structures at both the circuit and architectural levels. A fast and accurate method to analyze yield and variability has been presented for highly replicated SRAMs under process variation.
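
As background on the statistical blockade idea referred to above, the sketch below shows the basic mechanism rather than the thesis's algorithms: a cheap classifier is trained on a small Monte Carlo sample and used to block points unlikely to fall in the distribution tail, so the expensive simulation is only run for likely tail candidates. The "simulation", the variation model and the thresholds are stand-ins.

```python
# Minimal statistical blockade sketch (illustrative; the "simulation" and the
# variation model below are stand-ins for real SPICE runs under process variation).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def simulate(vth_shift):
    """Stand-in for an expensive circuit simulation returning a metric
    (e.g. read-access delay) as a function of a threshold-voltage shift."""
    return 1.0 + 0.8 * vth_shift + 0.1 * rng.standard_normal(len(vth_shift))

# 1. Small training Monte Carlo sample, fully simulated.
x_train = rng.standard_normal(2000)
y_train = simulate(x_train)

# 2. Train a blocking classifier with a relaxed (lower) threshold than the
#    true tail threshold, so borderline points are still passed to simulation.
tail_t = np.quantile(y_train, 0.99)      # true tail of interest
relaxed_t = np.quantile(y_train, 0.97)   # relaxed threshold for the classifier
clf = SVC(kernel="rbf", class_weight="balanced")
clf.fit(x_train.reshape(-1, 1), y_train > relaxed_t)

# 3. Large candidate sample: simulate only the points the classifier lets through.
x_big = rng.standard_normal(200_000)
candidates = x_big[clf.predict(x_big.reshape(-1, 1))]
y_tail = simulate(candidates)
print(f"simulated {len(candidates)} of {len(x_big)} points; "
      f"{np.sum(y_tail > tail_t)} tail events found")
```

The classification threshold (relaxed_t here) and the size of the initial training sample are exactly the two settings the abstract identifies as problematic and which the proposed improved algorithms address.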

7. Computational tools for the processing and analysis of time-course metabolomic data. Rusilowicz, Martin James, January 2016.
Modern, high-throughput techniques for the acquisition of metabolomic data, combined with an increase in computational power, have provided not only the need for, but also the means to develop and use, methods for the interpretation of large and complex datasets. This thesis investigates the methods by which pertinent information can be extracted from nontargeted metabolomic data and reviews the current state of chemometric methods. The analysis of real-world data and research questions relevant to the agri-food industry reveals several problems for which novel solutions are proposed. Three LC-MS datasets are studied: Medicago, Alopecurus and aged beef, covering stress resistance, herbicide resistance and product misbranding. The new methods include preprocessing (batch correction, data filtering), processing (clustering, classification) and visualisation, and their use is facilitated within a flexible data-to-results pipeline. The resulting software suite, with a user-friendly graphical interface, is presented, providing a pragmatic realisation of these methods in an easy-to-access workflow.
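
To make the data-to-results pipeline idea concrete, the sketch below runs a toy samples-by-peaks intensity matrix through data filtering, a simple per-batch correction, scaling and clustering. It is illustrative only; the thesis's own software, column names and method choices are not reproduced here.

```python
# Minimal sketch of an LC-MS intensity-matrix pipeline (illustrative assumptions:
# a samples-by-peaks matrix with a batch label per sample; not the thesis's software).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical peak-intensity table: rows = samples, columns = peaks, plus a batch label.
intensities = pd.DataFrame(rng.lognormal(mean=5, sigma=1, size=(60, 200)),
                           columns=[f"peak_{i}" for i in range(200)])
batch = pd.Series(np.repeat(["batch1", "batch2", "batch3"], 20), name="batch")

# 1. Data filtering: drop peaks that are rarely detected above a noise floor.
detected = (intensities > 50).mean(axis=0)
filtered = intensities.loc[:, detected > 0.5]

# 2. Simple batch correction: divide each sample by its batch's median profile.
batch_median = filtered.groupby(batch).transform("median")
corrected = filtered / batch_median

# 3. Scaling and clustering to group samples with similar metabolic profiles.
scaled = StandardScaler().fit_transform(np.log1p(corrected))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(pd.crosstab(batch, labels))
```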

8. Software evolution through UML-models extraction. Pu, Jianjun, January 2008.
With the high demand for renovation of legacy systems, their evolution is becoming an urgent need. Although some approaches have been introduced for evolving legacy systems, they are not sufficient for understanding legacy code. In this thesis, development/environment-specific models of domain-specific legacy systems are acquired, based on their characteristics and operations. The development/environment-specific model of COBOL legacy systems is based on the characteristics and operations of COBOL, and is a procedure-based model comprising a graph that describes the calling and called relationships of the procedures in COBOL legacy systems. It has four types: linear, branch, joint, and synthetic procedure-based models. The link-based model of HTML legacy systems uses a graph that describes the importing or imported relationships of web pages in a legacy system. It has three types: sequential, cyclical, and composite link-based models. The development/environment-specific model of an SQL legacy system comprises association, generation and composition database-based models, based on the basic operations of SQL and the two main relationships of generation and association between the databases in an SQL legacy system. The structural stage of UML extraction in this thesis covers class realisation. Classes extracted from a COBOL legacy system fall into two categories: procedure classes and variable classes. Every procedure in a COBOL legacy system is defined as one procedure class. Variable classes are derived using program slicing techniques, with two stages of pseudo-class and real-class extraction from the COBOL legacy system. The variable of the slicing criterion is defined as the class name, and the variables contained in its slice are defined as the attributes of that variable class. Because the behavioural analysis of domain-specific legacy systems follows the structural analysis, the operations of a variable class are not described at this stage. Classes of an HTML legacy system are based on web pages and their blocks. Classes of an SQL legacy system fall into two categories: procedure classes and database classes. Selected UML diagrams are used to describe the static aspect of domain-specific legacy systems. The behavioural stage of UML extraction in this thesis focuses on the operations and activities of domain-specific legacy systems. When understanding the operations and activities of domain-specific legacy code, their preconditions and post-conditions must be extracted from the source code. Those operations and activities are then ordered according to the time and sequence in which they are executed, and finally the operation and activity arrays are presented. Selected UML diagrams describing the dynamic aspect of domain-specific legacy systems are realised based on those operation and activity arrays. The major contribution of this thesis is the presentation of development/environment-specific models of domain-specific legacy systems and an approach towards software evolution of domain-specific legacy systems using UML diagrams.
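
As a rough illustration of what a procedure-based calling graph involves, the sketch below extracts PERFORM targets from paragraph bodies of a toy COBOL-like program and records the calling and called relationships. It is not the thesis's tooling, and the regular expressions ignore many real COBOL forms (such as PERFORM ... THRU and nested sections).

```python
# Rough sketch: build a procedure calling graph from COBOL-like source
# (illustrative only; real COBOL parsing is far more involved than this regex).
import re
from collections import defaultdict

SOURCE = """\
MAIN-PARA.
    PERFORM READ-INPUT.
    PERFORM PROCESS-RECORD.
READ-INPUT.
    DISPLAY "READING".
PROCESS-RECORD.
    PERFORM WRITE-OUTPUT.
WRITE-OUTPUT.
    DISPLAY "WRITING".
"""

para_header = re.compile(r"^([A-Z0-9-]+)\.\s*$")
perform_stmt = re.compile(r"\bPERFORM\s+([A-Z0-9-]+)")

calls = defaultdict(list)          # caller paragraph -> called paragraphs
current = None
for line in SOURCE.splitlines():
    header = para_header.match(line.strip())
    if header:
        current = header.group(1)
        calls.setdefault(current, [])
    elif current:
        calls[current].extend(perform_stmt.findall(line))

for caller, callees in calls.items():
    print(caller, "->", callees)
# e.g. MAIN-PARA -> ['READ-INPUT', 'PROCESS-RECORD'], PROCESS-RECORD -> ['WRITE-OUTPUT']
```

A graph of this kind is the starting point for classifying procedure-based models as linear, branch, joint or synthetic.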

9. Selection strategies in gaze interaction. Mollenbach, Emilie, January 2010.
This thesis deals with selection strategies in gaze interaction, specifically for a context where gaze is the sole input modality for users with severe motor impairments. The goal has been to contribute to the subfield of assistive technology where gaze interaction is necessary for the user to achieve autonomous communication and environmental control. From a theoretical point of view, research has been conducted on the physiology of gaze and on eye-tracking technology, and a taxonomy of existing selection strategies has been developed. Empirically, two overall approaches have been taken. Firstly, end-user research has been conducted through interviews and observation, exploring the capabilities, requirements, and wants of the end-user. Secondly, several applications have been developed to explore the selection strategy of single stroke gaze gestures (SSGG) and aspects of complex gaze gestures. The main finding is that single stroke gaze gestures can successfully be used as a selection strategy. Among the findings regarding SSGG are: that horizontal single stroke gaze gestures are faster than vertical ones; that there is a significant difference in completion time depending on gesture length; that single stroke gaze gestures can be completed without visual feedback; that gaze tracking equipment has a significant effect on the completion times and error rates of single stroke gaze gestures; and that there is not a significantly greater chance of making selection errors with single stroke gaze gestures than with dwell selection. The overall conclusion is that the future of gaze interaction should focus on developing multi-modal interactions for mono-modal input.
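
To illustrate what detecting a single stroke gaze gesture might involve, the sketch below reports a stroke when the gaze travels at least a threshold distance within a short time window and classifies its dominant direction. The thresholds, coordinate convention and sample data are assumptions, not the thesis's implementations.

```python
# Minimal single-stroke gaze gesture detector sketch (illustrative thresholds;
# not the thesis's implementation). Gaze samples are (t_seconds, x_px, y_px),
# with y increasing downwards as in typical screen coordinates.
import math

def detect_stroke(samples, min_dist=300.0, max_duration=0.5):
    """Return 'left', 'right', 'up' or 'down' if the gaze moved at least
    min_dist pixels within max_duration seconds, else None."""
    for i, (t0, x0, y0) in enumerate(samples):
        for t1, x1, y1 in samples[i + 1:]:
            if t1 - t0 > max_duration:
                break
            dx, dy = x1 - x0, y1 - y0
            if math.hypot(dx, dy) >= min_dist:
                if abs(dx) >= abs(dy):
                    return "right" if dx > 0 else "left"
                return "down" if dy > 0 else "up"
    return None

# Example: a fast rightward sweep of the gaze across the screen.
gaze = [(0.00, 100, 400), (0.05, 180, 402), (0.10, 300, 405), (0.15, 450, 401)]
print(detect_stroke(gaze))  # -> "right"
```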

10. An assessment of factors influencing neurogaming with motion-onset visual evoked potentials (mVEPs). Beveridge, Ryan, January 2018.
Brain-computer interface (BCI) technology offers movement-independent control of computer applications by translating cortical activity into semantic control signals for a computer to execute. One prominent application of BCI technology is brain-computer games interfacing (BCGI), or neurogaming. This thesis aimed to advance the field of neurogaming and is an account of the work conducted whilst investigating the feasibility of employing motion-onset visual evoked potentials (mVEPs) for control in a range of neurogames, and the factors that influence performance when employing such a control strategy. mVEPs manifest near the visual cortex when motion-related stimuli are attended to visually, and therefore are likely to be elicited naturally by game graphics scenes and the motion of in-game objects. There are limited studies investigating the potential for mixing game graphics with visual motion-based stimuli to elicit mVEPs for control in games. This thesis addresses this lacuna and improves our understanding of the factors that influence neurogaming with mVEPs. Firstly, participants were presented with mVEP-inducing stimuli overlaid on game graphics of varying complexity. Offline analysis of the EEG indicated some correlation between graphical complexity and mVEP detection performance, but the differences were insignificant for moderate variations in graphical complexity. Another offline study involved mVEP stimuli mixed with five different commercially available video games, each representing a different graphical complexity, genre, and generation of gaming console. A 3D fast-paced car-racing game consistently provided the greatest mVEP detection accuracies. To validate the use of a virtual reality (VR)-based display modality, two different game-level presentations based on the car-racing genre were studied, one with rudimentary and one with highly detailed graphics. Offline results indicated that mVEP detection accuracy provided by the Oculus Rift VR headset did not differ from that of an LCD computer display, demonstrating the possibility of employing contemporary display technologies in neurogaming. Once we established that mVEPs could be detected with graphics of varying complexity and that the car-racing genre is perhaps best suited for mVEP-based control, a series of online control experiments with an mVEP-controlled 3D car-racing game were conducted comparing the performance of adults to teenagers, a relatively understudied age group in neurogaming. We investigated user performance based on different lap speeds dictated by the number of event-related potentials (ERPs) averaged to make a game control decision. Our findings indicate that adult participants outperformed their teenage counterparts and that mVEP detection is robust to variations in the setup of the signal processing and system calibration. In summary, this thesis has implications for BCI control strategies involving mVEPs, gameplay quality, speed of control, performance assessment and calibrating mVEP-based BCIs. A broad range of users, including teenagers, have been evaluated in an mVEP-based neurogaming study for the first time.
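
To illustrate the trade-off described above between the number of averaged ERPs and the speed of control, the sketch below simulates a decision that waits for N stimulus repetitions per command, averages the corresponding epochs to raise the signal-to-noise ratio of the attended response, and picks the strongest command. The epoch dimensions, response window and signal model are assumptions, not the thesis's signal-processing pipeline.

```python
# Minimal sketch of averaging ERP epochs before a control decision
# (illustrative: simulated epochs, assumed epoch length and channel count).
import numpy as np

rng = np.random.default_rng(2)
n_commands, n_channels, n_times = 3, 8, 200   # assumed dimensions
target = 1                                    # the command the "user" attends to

def record_epoch(command):
    """Stand-in for one EEG epoch following a motion-onset stimulus."""
    epoch = rng.standard_normal((n_channels, n_times))
    if command == target:                     # attended stimulus carries an mVEP-like bump
        epoch[:, 80:120] += 0.1
    return epoch

def decide(n_average):
    """Average n_average epochs per command and pick the strongest response."""
    scores = []
    for cmd in range(n_commands):
        avg = np.mean([record_epoch(cmd) for _ in range(n_average)], axis=0)
        scores.append(avg[:, 80:120].mean())  # crude score over the response window
    return int(np.argmax(scores))

# More averaging -> slower decisions (more stimulus repetitions) but higher accuracy.
for n in (1, 4, 8):
    correct = sum(decide(n) == target for _ in range(50))
    print(f"{n} epochs averaged: {correct}/50 correct decisions")
```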