341 |
Perceptible affordances and feedforward for gestural interfaces : assessing effectiveness of gesture acquisition with unfamiliar interactions. Chueke, J. January 2016 (has links)
The move towards touch-based interfaces disrupts the established ways in which users manipulate and control graphical user interfaces. The predominant mode of interaction established by the desktop interface is to ‘double-click’ an icon in order to open an application, file or folder. Icons show users where to click, and their shape, colour and graphic style suggest how they respond to user action. In sharp contrast, in a touch-based interface an action may require a user to form a gesture with a certain number of fingers, a particular movement, and in a specific place. Often, none of this is suggested in the interface. This thesis adopts a research-through-design approach to address the problem of how to inform the user about which gestures are available in a given touch-based interface, how to perform each gesture, and, finally, the effect of each gesture on the underlying system. Its hypothesis is that presenting automatic, animated visual prompts that depict touch and preview gesture execution will mitigate the problems users encounter when they execute commands within unfamiliar gestural interfaces. Moreover, the thesis argues the need for a new framework to assess the efficiency of gestural UI designs. A significant aspect of this new framework is a rating system used to assess distinct phases within the users’ evaluation and execution of a gesture. In order to support the hypothesis, two empirical studies were conducted. The first introduces the visual prompts to train participants in unfamiliar gestures and gauges participants’ interpretation of their meaning. The second study consolidates the design features that yielded lower error rates in the first study and assesses different interaction techniques, such as the moment at which to display the visual prompt. Both studies demonstrate the benefits of providing visual prompts to improve user awareness of available gestures.
In addition, both studies confirm the efficiency of the rating system in identifying the most common problems users have with gestures and identifying possible design features to mitigate such problems. The thesis contributes: 1) a gesture-and-effect model and a corresponding rating system that can be used to assess gestural user interfaces, 2) the identification of common problems users have with unfamiliar gestural interfaces and design recommendations to mitigate these problems, and 3) a novel design technique that will improve user awareness of unfamiliar gestures within novel gestural interfaces.
|
342 |
Computer simulation of fundamental processes in high voltage circuit breakers based on an automated modelling platform. Pei, Yuqing January 2014 (has links)
Auto-expansion circuit breakers utilize the arc’s energy to generate the flow conditions required for current interruption. The operation of this type of circuit breaker is extremely complex, and its interruption capability depends on the whole arcing history as well as a number of geometric factors. On the other hand, circuit breaker development based on testing is extremely expensive and time consuming. The understanding of the underlying physical processes accumulated so far enables arc models to be used as a tool for the optimum design of switchgear products such as high voltage circuit breakers. For academic research, there is often a need to study the performance of a newly developed arc model by inspecting the distribution of relevant physical quantities during a simulation, and their sensitivity to model parameters, in an efficient and convenient manner. However, the effective use of computer simulation by design engineers has been hindered by the complexity encountered in model implementation. This thesis presents the development and structure of an automated simulation tool, the Integrated Simulation and Evaluation Environment (ISEE), for the arcing process in gas-blast circuit breakers. The functionalities of ISEE were identified and developed based on experience in real product design; they include visual creation and definition of components, automatic setup of arc models based on a commercial CFD software package as the equation solver, simulation task management, and visualization of computational results in “real-time” mode. This is the first automated simulation platform in the community of switching-arc simulation. Using ISEE as the simulation tool, different designs of auto-expansion circuit breakers have been investigated to reveal the fundamental characteristics of the arcing process under different test duties.
Before attempting to investigate the capability of an auto-expansion circuit breaker, the fundamental issue of determining the turbulence parameter of the Prandtl mixing length model is addressed. Previous studies of turbulent arcs were mostly concerned with simple converging-diverging nozzles; there has been little work on real circuit breaker nozzles. In order to calibrate the range of the turbulence parameter, real arcing conditions, including interrupting currents, contact travels and transient recovery voltages of two commercial circuit breakers with rated voltages of 145 kV and 245 kV, have been used together with the geometry of the circuit breakers. The effect of nozzle ablation has been considered. Altogether six cases have been used for three circuit breakers, with each pair of cases corresponding to a success and a failure in the thermal recovery process. It has been found that a single parameter value of 0.35 is applicable to all three circuit breakers with an auxiliary nozzle and a main nozzle of converge-flat throat-diverge shape. It must be noted that this value is obtained with the definition of thermal radius introduced in Chapter 3 and the assumption that the parameter changes linearly with the interrupting current, from 0.05 at 15 kA to 0.35 at current zero. Using the calibrated turbulence model, a computational study of the thermal interruption performance of a 145 kV, 60 Hz auto-expansion circuit breaker with different arc durations has been carried out in Chapter 4. The relation between the pressure peak and the current peak in the auto-expansion circuit breaker is discussed. It has been found that a larger average mass flux in the main nozzle indicates a better interruption environment, enabling the circuit breaker to withstand a larger rate of rise of recovery voltage after current zero.
Another important finding is that the auxiliary nozzle plays an important role in an auto-expansion circuit breaker both in the high-current phase and during the current-zero period. Therefore, the proper design and use of an auxiliary nozzle is a key factor in enhancing the thermal interruption capability of high voltage auto-expansion circuit breakers. In Chapter 5 of the thesis, the transient pressure variation in auto-expansion circuit breakers is studied. The pressure variation has an extremely complex pattern, and the pressure changes in different ways depending on the location in the arcing chamber. It is shown, for the first time, that the time lag between the current peak and the pressure peak in the expansion volume can be explained by an energy flow rate balance method; that is, flow reversal occurs when the enthalpy exhaustion rate from the contact space equals the electrical power input. Following the flow reversal, a high enthalpy flow rate from the expansion volume into the contact gap first occurs for a short while (1 ms), followed by a high mass flow rate of relatively cool gas at less than 2000 K. This high mass flow rate causes a surplus in the mass flow rate into the contact gap and results in the last temporary pressure peak in the contact space before the pressure and flow field finally settle down for arc quenching at current zero. The pressure change under different conditions, i.e. different arc durations, different current levels and different lengths of the heating channel, has also been studied in detail. In summary, the present research leads to original findings in three aspects of the operation of auto-expansion circuit breakers: the calibration of the turbulence parameter for the Prandtl mixing length model, interruption performance with different arc durations, and the transient pressure variation in the arcing process.
The results are expected to provide useful information for the optimum design of auto-expansion circuit breakers.
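The abstract's stated assumption that the turbulence parameter varies linearly with interrupting current, from 0.05 at 15 kA to 0.35 at current zero, can be sketched as a small interpolation routine (the function name and the clamping behaviour outside the stated range are illustrative assumptions, not part of the thesis):

```python
def turbulence_parameter(current_kA: float) -> float:
    """Linearly interpolate the Prandtl mixing-length turbulence
    parameter between 0.05 at 15 kA and 0.35 at current zero, as
    stated in the abstract. Clamping outside that range is an
    illustrative assumption."""
    I_HIGH, C_HIGH = 15.0, 0.05   # value at high current
    I_ZERO, C_ZERO = 0.0, 0.35    # value at current zero
    if current_kA >= I_HIGH:
        return C_HIGH
    if current_kA <= I_ZERO:
        return C_ZERO
    frac = (I_HIGH - current_kA) / (I_HIGH - I_ZERO)
    return C_HIGH + frac * (C_ZERO - C_HIGH)
```

At the midpoint (7.5 kA) this gives a parameter of 0.2, halfway between the two calibrated endpoints.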
|
343 |
A Bayesian neural network for censored survival data. Wong, Helen January 2001 (has links)
No description available.
|
344 |
Scheduling of flexible manufacturing systems integrating Petri nets and artificial intelligence methods. Reyes Moro, Antonio January 2000 (has links)
The work undertaken in this thesis concerns the integration of two well-known methodologies: Petri net (PN) modelling/analysis of industrial production processes and Artificial Intelligence (AI) optimisation search techniques. The objective of this integration is to demonstrate its potential in solving a difficult and widely studied problem, the scheduling of Flexible Manufacturing Systems (FMS). This work builds on existing results that clearly show the convenience of PNs as a modelling tool for FMS. It addresses the problem of the integration of PN and AI based search methods. Whilst this is recognised as a potentially important approach to the scheduling of FMS, there is a lack of any clear evidence that practical systems might be built. This thesis presents a novel scheduling methodology that takes forward the current state of the art in the area as follows. Firstly, it presents a novel modelling procedure based on a new class of PN (cb-NETS) and a language to define the essential features of basic FMS, demonstrating that the inclusion of high-level FMS constraints is straightforward. Secondly, we demonstrate that PN analysis is useful in reducing search complexity and present two main results: a novel heuristic function based on PN analysis that is more efficient than existing methods, and a novel reachability scheme that avoids futile exploration of candidate schedules. Thirdly, a novel scheduling algorithm that overcomes the efficiency drawbacks of previous algorithms is presented. This algorithm satisfactorily overcomes the complexity issue while achieving very promising results in terms of optimality. Finally, this thesis presents a novel hybrid scheduler that demonstrates the convenience of using PNs as a representation paradigm to support hybridisation between traditional OR methods, AI systematic search and stochastic optimisation algorithms. Initial results show that the approach is promising.
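To illustrate the kind of PN-based heuristic search the abstract describes, here is a minimal A*-style scheduler over the reachability graph of a timed Petri net. It is a generic sketch only: the marking and transition encodings, the toy two-job example and the heuristic are our assumptions, not the cb-NETS class or the thesis's algorithms:

```python
import heapq

def pn_schedule(initial, goal, transitions, heuristic):
    """A*-style search over the reachability graph of a timed
    Petri net. Markings are tuples of token counts per place;
    each transition is (name, consume, produce, duration).
    Returns (makespan, firing sequence) or None if unreachable."""
    frontier = [(heuristic(initial), 0, initial, [])]
    best = {initial: 0}
    while frontier:
        _, cost, marking, seq = heapq.heappop(frontier)
        if marking == goal:
            return cost, seq
        for name, consume, produce, dur in transitions:
            # A transition is enabled if every input place has tokens.
            if all(m >= c for m, c in zip(marking, consume)):
                nxt = tuple(m - c + p for m, c, p
                            in zip(marking, consume, produce))
                ncost = cost + dur
                if ncost < best.get(nxt, float("inf")):
                    best[nxt] = ncost
                    heapq.heappush(frontier, (ncost + heuristic(nxt),
                                              ncost, nxt, seq + [name]))
    return None

# Places: (waiting, on_machine, done, machine_free).
# Two jobs share one machine: load takes 2 time units, unload takes 1.
trans = [("load",   (1, 0, 0, 1), (0, 1, 0, 0), 2),
         ("unload", (0, 1, 0, 0), (0, 0, 1, 1), 1)]
# Admissible heuristic: each waiting job still needs 3 time units,
# each job on the machine still needs 1.
result = pn_schedule((2, 0, 0, 1), (0, 0, 2, 1), trans,
                     lambda m: 3 * m[0] + m[1])
```

With an admissible heuristic the first goal marking popped carries the optimal makespan, here 6 time units for the four firings.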
|
345 |
Impromptu : software framework for self-healing middleware services. Pereira, Ella Grishikashvili January 2006 (has links)
No description available.
|
346 |
Group-based secure communication for wireless sensor networks. Kifayat, Kashif January 2008 (has links)
Wireless Sensor Networks (WSNs) are a newly developed networking technology consisting of multifunctional sensor nodes that are small in size and communicate over short distances. Continuous growth in the use of WSNs, particularly in sensitive applications such as military or hostile environments, has resulted in a requirement for effective security mechanisms in the system design. In order to protect sensitive data and sensor readings, shared keys should be used to encrypt the messages exchanged between communicating nodes. Many key management schemes have been developed recently, and a serious threat highlighted in all of these schemes is that of node capture attacks, where an adversary gains full control over a sensor node through direct physical access. This can lead an adversary to compromise the communication of an entire WSN. Additionally, ignoring security issues related to data aggregation can also cause serious damage to WSNs. Furthermore, should an aggregator node, group leader or cluster-head node fail, there should be a secure and efficient way of electing or selecting a new one, in order to prevent an adversary node from being selected as the new group leader. A key management protocol for mobile sensor nodes is needed to enable them to securely communicate and authenticate with the rest of the WSN.
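As a minimal illustration of the shared-key idea discussed above (not the group-based scheme developed in the thesis), a pairwise key between two nodes can be derived symmetrically from a group master key; erasing the master key after deployment then limits what a node-capture attack can reveal:

```python
import hashlib

def pairwise_key(master: bytes, id_a: str, id_b: str) -> bytes:
    """Derive a shared key for nodes id_a and id_b from a group
    master key. Sorting the IDs makes the derivation symmetric, so
    both endpoints compute the same key. Illustrative sketch only;
    the thesis's actual scheme is not specified in the abstract."""
    lo, hi = sorted([id_a, id_b])
    return hashlib.sha256(master + lo.encode() + hi.encode()).digest()

k1 = pairwise_key(b"group-master", "node-01", "node-02")
k2 = pairwise_key(b"group-master", "node-02", "node-01")
assert k1 == k2  # both nodes derive the same pairwise key
```

A captured node that has already erased the master key holds only its own pairwise keys, so the rest of the network's links stay confidential.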
|
347 |
Evolutionary environmental modelling in self-managing software systems. Forsyth, Henry Lee January 2010 (has links)
Over recent years, the increasing richness and sophistication of modern software systems has challenged conventional design-time software modelling and analysis, and has led to a number of studies exploring non-conventional approaches, particularly those inspired by nature. The natural world routinely produces organisms that can not only survive but also flourish in changing environments, as a consequence of their ability to adapt and therefore improve their fitness in relation to the external environments in which they exist. Following this biologically inspired approach to systems design, this study aims to test the hypothesis that evolutionary techniques for runtime modelling of a given system's environment can be more effective than traditional approaches, which are increasingly difficult to specify and model at design-time. This work specifically focuses on investigating the requirements for software environment modelling at runtime via a proposed systemic integration of Learning Classifier Systems and Genetic Algorithms with the well-known managerial-cybernetics Viable Systems Model. The main novel contribution of this thesis is an evaluation of an approach by which software can create and, crucially, maintain a current model of the environment, allowing the system to react more effectively to changes in that environment, thereby improving the robustness and performance of the system. Detailed novel contributions include an evaluation of a variety of environmental modelling approaches to improving system robustness, and the use of Learning Classifier Systems and genetic algorithms to provide the modelling element required of effective adaptive software systems. The thesis also provides a conceptual framework of an Environmental Modelling, Monitoring and Adaptive system (EMMA) to manage the various elements required to achieve an effective environmental control system.
The key result of this research has been to demonstrate the value of the guiding principles provided by the field of cybernetics, and the potential of Beer's cybernetically based Viable System Model, in providing a learning framework, and subsequently a roadmap, for developing self-managing autonomic systems. The work is presented using a virtual world platform called "Second Life", which was used for experimental design and testing of results.
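As a minimal illustration of the evolutionary machinery the thesis builds on, here is a plain genetic algorithm solving a one-max toy problem. This is a generic sketch under our own assumptions (tournament selection, one-point crossover, bit-flip mutation); the thesis itself integrates Learning Classifier Systems and GAs within the Viable Systems Model, which this sketch does not attempt:

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=40, seed=1):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation. Returns the fittest genome in
    the final population."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament: sample two, keep the fitter.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)
            child = p1[:cut] + p2[cut:]     # one-point crossover
            if rng.random() < 0.1:          # occasional bit flip
                i = rng.randrange(genome_len)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum)  # one-max: the all-ones genome is optimal
```

Swapping the `fitness` function for a measure of model accuracy against observed environment data is the step from this toy to runtime environmental modelling.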
|
348 |
DISE : a game technology-based digital interactive storytelling framework. Cooper, Simon January 2011 (has links)
This thesis details the design and implementation of an Interactive Storytelling Framework. Using software engineering methodology and framework development methods, we aim to design a full Interactive Storytelling system involving a story manager, a character engine, an action engine, a planner, a 3D game engine and a set of editors for story data, world environment modelling and real-time character animation. The framework is described in detail and specified to meet the requirement of bringing a more dynamic, real-time interactive story experience to the medium of computer games. Its core concepts borrow from work done in the fields of narrative theory, software engineering, computer games technology, HCI, 3D character animation and artificial intelligence. The contributions and novelties of our research lie in: the data design of the story, which allows a modular approach to building reusable resources such as actions, objects, animated characters and whole story 'levels'; a switchable story planner and re-planning system, allowing many planners, heuristics and schedulers compatible with PDDL (the "Planning Domain Definition Language") to be easily integrated with minor changes to the main classes; a 3D game engine and framework for web-launched or in-browser deployment of the finished product; and a user-friendly story and world/environment editor, so story authors do not need advanced knowledge of PDDL syntax, games programming or 3D modelling to design and author a basic story. As far as we know, our Interactive Storytelling Framework is the only one to include a full 3D cross-platform game engine, procedural and manual modelling tools, a story editor and a customisable planner in one complete integrated solution.
The finished interactive storytelling applications are presented as computer games designed to be a real-time 3D first-person experience, with the player as a main story character in a world where every context-filtered action displayed is executable and the player's choices make a difference to the outcome of the story, whilst still allowing the author's high-level constraints to progress the narrative along their desired path(s).
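The switchable-planner idea can be sketched as a small interface with one interchangeable implementation. The class names and the STRIPS-style action encoding below are illustrative assumptions, since the abstract only states that PDDL-compatible planners plug in through a common set of main classes:

```python
class Planner:
    """Minimal interface a pluggable planner must satisfy; the DISE
    framework's actual planner API is not given in the abstract, so
    these names are illustrative."""
    def plan(self, state, goal, actions):
        raise NotImplementedError

class ForwardSearchPlanner(Planner):
    """Breadth-first forward search over STRIPS-style actions: each
    action is (name, preconditions, add, delete), all sets of
    propositions."""
    def plan(self, state, goal, actions):
        from collections import deque
        frontier = deque([(frozenset(state), [])])
        seen = {frozenset(state)}
        while frontier:
            s, steps = frontier.popleft()
            if goal <= s:                 # all goal facts hold
                return steps
            for name, pre, add, dele in actions:
                if pre <= s:              # action applicable
                    nxt = frozenset((s - dele) | add)
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append((nxt, steps + [name]))
        return None

# A two-step toy domain: open the door, then enter.
acts = [("open_door", {"at_door"}, {"door_open"}, set()),
        ("enter", {"door_open"}, {"inside"}, {"at_door"})]
plan = ForwardSearchPlanner().plan({"at_door"}, {"inside"}, acts)
```

A story manager coded against the `Planner` interface can swap in any compatible implementation without touching its own logic, which is the spirit of the switchable design.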
|
349 |
A framework for cascading payment and content exchange within P2P systems. Arora, Gurleen Kaur January 2006 (has links)
Advances in computing technology and the proliferation of broadband in the home have opened up the Internet to wider use. People like the idea of easy access to information at their fingertips, via their personal networked devices. This has been established by the increased popularity of Peer-to-Peer (P2P) file-sharing networks. P2P is a viable and cost-effective model for content distribution. Content producers require modest resources by today's standards to act as distributors of their content, and P2P technology can assist in further reducing this cost, thus enabling the development of new business models for content distribution to realise market and user needs. However, many other consequences and challenges are introduced; most notably, the issues of copyright violation, free-riding, the lack of participation incentives, and the difficulties associated with the provision of payment services within a decentralised, heterogeneous and ad hoc environment. Further issues directly relevant to content exchange also arise, such as transaction atomicity, non-repudiation and data persistence. We have developed a framework to address these challenges. The novel Cascading Payment Content Exchange (CasPaCE) framework was designed and developed to incorporate the use of cascading payments to overcome the problem of copyright violation and prevent free-riding in P2P file-sharing networks. By incorporating unique identification, copyright mobility and fair compensation for both producers and distributors in the content distribution value chain, the cascading payments model empowers content producers and enables the creation of new business models. The system allows users to manage their content distribution as well as their purchasing activities by mobilising payments and automatically gathering royalties on behalf of the producer.
The methodology used to conduct this research involved advances in service-oriented architecture development as well as object-oriented analysis and design techniques. These assisted in the development of an open and flexible framework which facilitates equitable digital content exchange without detracting from the advantages of the P2P domain. A prototype of the CasPaCE framework (developed in Java) demonstrates how peer devices can be connected to form a content exchange environment where both producers and distributors benefit from participating in the system. This prototype was successfully evaluated within the bounds of an E-learning Content Exchange (EIConE) case study, which allows students within a large UK university to exchange digital content for compensation, enabling better use of redundant resources in the university.
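A cascading payment can be pictured as a settlement walk along the distribution chain, with each upstream party compensated out of the sale. The 50% royalty rate and the split rule below are illustrative assumptions, as the abstract does not specify CasPaCE's actual formula:

```python
def settle_cascade(price, chain, royalty_rate=0.5):
    """Split a sale price along a distribution chain: walking back
    from the final seller, each link keeps (1 - royalty_rate) of
    what flows past it and passes the rest upstream; whatever
    remains reaches the producer. Illustrative split rule only.
    `chain` is ordered producer-first, final seller last."""
    payouts = {}
    remaining = float(price)
    for party in reversed(chain[1:]):      # sellers and distributors
        kept = remaining * (1 - royalty_rate)
        payouts[party] = round(kept, 2)
        remaining -= kept
    payouts[chain[0]] = round(remaining, 2)  # producer gets the rest
    return payouts

split = settle_cascade(8.0, ["producer", "dist_a", "dist_b"])
```

Because the producer's share never reaches zero, every resale still generates a royalty, which is the property that discourages unauthorised redistribution.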
|
350 |
A specification method for the scalable self-governance of complex autonomic systems. Randles, Martin J. January 2007 (has links)
IBM, amongst many others, has sought to endow computer systems with self-management capabilities by delegating vital functions to the software itself, proposing the Autonomic Computing model and hence inducing the so-called self-* properties, including the system's ability to be self-configuring, self-optimising, self-healing and self-protecting. Initial attempts to realise such a vision have so far mostly relied on passive adaptation, whereby Design by Contract and Event-Condition-Action (ECA) type constructs are used to regulate the target system's behaviour: when a specific event makes a certain condition true, an action is triggered which executes either within the system or on its environment. Whilst such a model works well for closed systems, its effectiveness and applicability diminish as the size and complexity of the managed system increase, necessitating frequent updates to the ECA rule set to cater for new and/or unforeseen system behaviour. More recent research works are now adopting the parametric adaptation model, where the events, conditions and actions may be adjusted at runtime in response to the system's observed state. Such an improved control model works well up to a point, but for large-scale systems of systems, with very many component interactions, the predictability and traceability of the regulation, and of its impact on the whole system, become intractable. Self-organising systems theory, however, offers a scalable alternative to systems control, utilising emergent behaviour, observed at a global level, resulting from the low-level interactions of the distributed components. Here, for instance, key signals (signs) for ECA-style feedback control need no longer be recognised or understood in the context of the design-time system but are defined by their relevance to the runtime system.
Nonetheless, this model still suffers from a usually inaccessible control model, with no intrinsic meaning assigned to data extracted from the system's operation. In other words, there is no grounded definition of particular observable events occurring in the system. This condition is termed the Signal Grounding Problem. It cannot usually be solved by analytical or algorithmic methods, as these solutions generally require precise problem formulations and a static operating domain. Rather, cognitive techniques are needed that perform effectively in evaluating and improving performance in the presence of complex, incomplete, dynamic and evolving environments. In order to develop a specification method for the scalable self-governance of autonomic systems of systems, this thesis presents a number of ways to alleviate, or circumvent, the Signal Grounding Problem through the utilisation of cognitive systems and the properties of complex systems. After reviewing the specification methods available for governance models, the Situation Calculus dialect of first-order logic is described, with the necessary modalities for the specification of deliberative monitoring in partially observable environments with stochastic actions. This permits a specification method that allows the depiction of system guards and norms under central control, as well as of the deliberative functions required for decentralised components to work around the Signal Grounding Problem, engineer emergence and generally utilise the properties of large complex systems for their own self-governance. It is shown how these large-scale behaviours may be implemented, and their properties assessed and utilised by an Observer System, through fully functioning implementations and simulations.
The work concludes with two case studies showing how the specification would be achieved in practice: an observer-based meta-system for a decision support system in medicine is described, specified and implemented up to parametric adaptation, and a NASA project is described with a specification given for the interactions and cooperative behaviour that lead to scale-free connectivity, which the observer system may then utilise for the efficient monitoring strategy described earlier.
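The passive ECA-style adaptation that the thesis argues does not scale can be sketched in a few lines, which also makes clear why a growing rule set becomes hard to maintain: every new behaviour needs another hand-written event/condition/action triple. The rule encoding and the scaling example are illustrative assumptions:

```python
def make_eca_engine(rules):
    """Tiny Event-Condition-Action dispatcher of the kind the
    abstract describes for passive adaptation: when an event
    arrives and its condition holds on the system state, the
    action fires. Rule structure is illustrative."""
    def handle(event, state):
        fired = []
        for ev, cond, action in rules:
            if ev == event and cond(state):
                action(state)           # act on the system state
                fired.append(ev)
        return fired
    return handle

# One rule: on a load report, scale up if load exceeds 90%.
state = {"load": 0.95, "replicas": 2}
rules = [("load_report",
          lambda s: s["load"] > 0.9,
          lambda s: s.update(replicas=s["replicas"] + 1))]
handle = make_eca_engine(rules)
handle("load_report", state)  # condition holds, so replicas grow
```

Parametric adaptation would adjust the `0.9` threshold at runtime; the thesis goes further and lets an Observer System decide which runtime signals are worth reacting to at all.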
|