31

Factors affecting spatial awareness in non-stereo visual representations of virtual, real and digital image environments

Rambli, Dayang R. A. January 2004 (has links)
The increasing number of applications employing virtual environment (VE) technologies as a tool, particularly those that use VEs as surrogates, makes it important to examine the ability of VEs to provide realistic simulations to users. Accurate space and distance perception has been suggested as an essential precondition for the reliable use of VE technologies in various applications. However, some investigators have reported that space and distance in the VE are perceived differently from the real world. Thus, the overall aim of this thesis is to improve our understanding of factors affecting spatial awareness in the VE. The general approach is based on a strategy of conducting empirical investigations comparing tasks performed in the VE to similar tasks performed in the real world. This research has examined the effect of display-related factors on users' spatial task performance in the context of static, dynamic and interactive presentations. Three sets of experiments in these respective contexts were conducted to explore the influence of image type, display size, viewing distance, physiological cues, interface device and travel mode on distance estimation and spatial memory tasks. For distance perception, results revealed that the effect of image type depends on the context of presentation, the type of asymmetrical distances and image resolution. The effect of display size in static and dynamic presentations is consistent with the results of previous investigations. However, results from evaluations conducted by the author indicated that other factors, such as viewing distance and physiological cues, were also accountable. In interactive presentations, results indicated that display size had different effects on different users, whereby familiarity with display size may influence a user's performance.
Similarly, it was shown that a commonly used interface device is more useful and beneficial for users' spatial memory performance in the VE than less familiar ones. In terms of travel mode, the natural method of movement available in the real world may not necessarily be better than the unnatural movement which is possible in the VE. The results of the investigations reported in this thesis contribute towards knowledge and understanding of factors affecting spatial awareness in real and virtual environments. In particular, they highlight the influence of these factors on space and distance perception in different contexts of VE presentation, which will serve as important, scientifically based guidelines for designers and users of VE applications.
32

The persistence of analogies in design decision-making

Hassard, S. January 2011 (has links)
Previous theories of design decision-making have discussed how designers use analogical reasoning to quickly scope the solution space down to one viable solution. This initial analogy forms the template of a possible action plan that is then modified to suit the unique properties of that particular problem. This use of analogical reasoning allows designers to quickly engage with the problem and generate a workable solution. Our findings indicate that this initial analogy actually persists across all stages of decision-making, and does not play a role solely during the first stage of scoping. This analogical persistence leads to poorer design decisions. This thesis presents a series of studies that adopt a mixed-method approach to investigating the influence of analogies on the decision-making of interaction designers. We employed qualitative methods such as the Critical Decision Method for Eliciting Knowledge (Klein, 1989), which aided in identifying analogical persistence as a problem that leads to poorer-quality decisions. We also employed quantitative methods such as the Design Fixation paradigm (Jansson & Smith, 1991) to investigate how different types of analogies (self-generated and provided priming analogies) can influence the expression of analogical persistence. Finally, in an attempt to mitigate the potential pitfalls of analogical persistence, this thesis attempts to control it using principles from Design Rationale (Lee & Lai, 1991) and Reflection (Schön, 1983). Rather than seeing a decrease in analogical persistence, our manipulation actually increased fixation. A follow-up study identified that designers tend to appraise poorly the weaknesses in the initial analogy, which may have led to the aforementioned unexpected result. These findings challenge the notion that greater understanding of the design space will lead to higher-quality design decisions.
33

Beyond problem identification : valuing methods in a 'system usability practice'

Furniss, D. January 2008 (has links)
Historically, usability evaluation methods (UEMs) have been evaluated on their capability for problem identification. However, the relevance of this approach has been questioned for applied usability work. To investigate alternative explanations of what is important for method use, a grounded theory of usability practitioners was developed (9 interviews from the website domain and 13 from the safety-critical domain). The analysis proceeded in bottom-up and top-down stages. The bottom-up stages produced insight from the data in an exploratory and inductive manner. This highlighted the importance of contextual factors and the need for system descriptions: UEM adoption and adaptation cannot be fully understood devoid of context. The top-down stages used the Distributed Cognition and Resilience Engineering conceptual frameworks as leverage for exploring the data in a deductive manner. These were chosen for their functional descriptions of systems. To illustrate the importance of context we describe three models: 1) where previous research has highlighted the downstream utility of UEMs, we expand the metaphor to consider the landscape through which the stream flows, where the landscape represents the project's context; 2) where information propagation and transformation in a project is influenced by social, information-flow, artefact, physical and evolutionary factors; and 3) where the functional couplings between parts of the system of usability practice can be monitored and managed to positively resonate with each other, thereby improving the performance of the system overall. The concept of 'positive resonance' is introduced to describe how practitioners adapt to the context to maximise their impact under constrained resources. The functional couplings are described in a functional resonance model of HCI practice. This model is validated by interviewees and other practitioners outside of the study. This research shows that problem identification is a limited basis for valuing UEMs. Instead, the functional couplings of UEMs, which influence UEM adoption and adaptation in practice, should be considered in order to improve system performance.
34

On the core elements of the experience of playing video games

Calvillo Gamez, E. H. January 2009 (has links)
This dissertation presents a multi-method approach to study the user experience of playing video games. The motivation is to devise an objective assessment of the concept of user experience. It is proposed that user experience is better understood when it is studied as a twofold phenomenon formed by a process and an outcome. This definition allows the combination of the subjective nature of experience with the objectivity needed to propose an objective assessment of experience. While an experience is personal in its achieved outcome, during the process of forming it there are elements specific to the type of experience that are common to all individuals. The thesis presents a series of studies to explore and understand the gaming experience as well as to identify the procedural elements of the experience. The outcome of the studies was the formulation of the theoretical framework that we called Core Elements of the Gaming Experience (CEGE), which focuses on the process of the experience. The metaphor of "puppetry" is used to provide a link to the outcome of the experience. Based on the theoretical framework, a questionnaire and model were developed. The model was validated using Structural Equation Modelling, which provided an adequate fit, suggesting that the CEGE model is an accurate abstraction of the process of the gaming experience. Lastly, the framework was used to study different gaming experiences under different conditions. The results suggest that the CEGE theoretical framework can be used to assess this type of experience. The contributions of this dissertation are: the methodological approach used to study the user experience of playing video games, a novel approach to understanding user experience as a falsifiable concept, a theoretical framework and metaphor to describe the gaming experience, a model that describes the gaming experience, and an instrument that can be used to assess and explore different gaming experiences.
35

Functional metaprogramming

Powell, Gareth Moffat January 1993 (has links)
No description available.
36

A Dempster-Shafer theory inspired logic

Iourinski, Dmitri January 2008 (has links)
Issues of formalising and interpreting epistemic uncertainty have always played a prominent role in Artificial Intelligence. The Dempster-Shafer (DS) theory of partial beliefs is one of the best-known formalisms for addressing partial knowledge. Similarly to the DS theory, which is a generalisation of classical probability theory, fuzzy logic provides an alternative reasoning apparatus to Boolean logic. Both theories feature prominently within the Artificial Intelligence domain, but a unified framework accounting for all the aspects of imprecise knowledge is yet to be developed. The fuzzy logic apparatus is often used for reasoning based on vague information, while beliefs are often processed with the aid of Boolean logic. The situation clearly calls for the development of a logic formalism targeted specifically at the needs of the theory of beliefs. Several frameworks exist based on interpreting epistemic uncertainty through an appropriately defined modal operator. There is an epistemic problem with this kind of framework: while addressing uncertain information, they also allow for non-constructive proofs, and in this sense the number of true statements within these frameworks is too large. In this work, it is argued that an inferential apparatus for the theory of beliefs should follow the premises of Brouwer's intuitionism. A logic refuting tertium non datur is constructed by defining a correspondence between the support functions representing beliefs in the DS theory and semantic models based on intuitionistic Kripke models with weighted nodes. Without additional constraints on the semantic models and without modal operators, the constructed logic is equivalent to the minimal intuitionistic logic. A number of possible constraints is considered, resulting in additional axioms and making the proposed logic intermediate.
Further analysis of the properties of the created framework shows that the approach preserves the Dempster-Shafer belief assignments and thus expresses modality through the belief assignments of the formulae within the developed logic.
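The support (belief) functions the abstract refers to are standard DS theory. As a minimal sketch, not code from the thesis, belief and plausibility over a small frame of discernment can be computed directly from a mass assignment; the frame and masses below are illustrative:

```python
def belief(mass, event):
    # bel(A) = sum of masses of all focal sets contained in A
    return sum(m for b, m in mass.items() if b <= frozenset(event))

def plausibility(mass, event):
    # pl(A) = sum of masses of all focal sets intersecting A
    return sum(m for b, m in mass.items() if b & frozenset(event))

# Mass function over the frame {a, b, c}; masses sum to 1.
# Mass placed on the whole frame represents ignorance, which is
# what distinguishes DS beliefs from a single probability measure.
mass = {
    frozenset({'a'}): 0.4,
    frozenset({'a', 'b'}): 0.3,
    frozenset({'a', 'b', 'c'}): 0.3,
}

bel_ab = belief(mass, {'a', 'b'})   # 0.4 + 0.3 = 0.7
pl_c = plausibility(mass, {'c'})    # only the whole frame meets {c}: 0.3
```

Note that bel({a,b}) < pl({a,b}) in general; the gap between the two is the uncommitted mass, the "partial knowledge" the abstract describes.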
37

Oh!... is it really you? : using rank functions to verify authentication protocols

Heather, James January 2000 (has links)
A security protocol is a mechanism designed to allow secure communications through an insecure medium, even when that medium is controlled by a hostile attacker. Historically, approaches to analysis of security protocols have fallen into two broad categories: model checking and theorem proving. Each has its strengths, but the weaknesses of each are all too apparent. Model checking suffers from the problem of being able to check only a finite system; theorem proving is difficult to automate and often produces no conclusive results. Schneider's previous work on rank functions provides a formal approach to verification of certain properties of a security protocol. In this thesis, we develop the theory to allow for an arbitrarily large network, and give a clearly defined decision procedure by which one may either construct a rank function, proving correctness of the protocol, or show that no rank function exists. We show how the algorithm may be implemented to give a means of analysing authentication protocols that avoids the inherent limitations of both model checking and theorem proving. We discuss the implications of the absence of a rank function, and the open question of completeness of the rank function theorem.
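The central invariant behind rank functions is that no sequence of attacker inferences should produce a secret (rank-zero) message from public (rank-positive) ones. A toy closure check in that spirit is sketched below; the message names, ranks and inference rule are illustrative, not Schneider's actual formulation or any code from the thesis:

```python
def rank_preserved(rules, rank):
    """Check every inference rule: if all premises have positive rank,
    the conclusion must also have positive rank. Returns (ok, witness)."""
    for premises, conclusion in rules:
        if all(rank[p] > 0 for p in premises) and rank[conclusion] <= 0:
            return False, (premises, conclusion)
    return True, None

# Messages in a toy protocol fragment; rank 0 marks secrets the
# attacker must never obtain, rank 1 marks public material.
rank = {
    'nonce_a': 1,
    'pub_key_b': 1,
    'enc(nonce_a, pub_key_b)': 1,   # ciphertext visible on the network
    'priv_key_b': 0,                # B's private key: secret
    'nonce_a_plain': 0,             # recoverable only via the private key
}

rules = [
    # Decryption: needs the rank-0 private key as a premise, so it
    # cannot be fired from public material alone.
    (['enc(nonce_a, pub_key_b)', 'priv_key_b'], 'nonce_a_plain'),
]

ok, witness = rank_preserved(rules, rank)
# ok is True: the only rule yielding a rank-0 message itself requires
# a rank-0 premise, so positivity is maintained.
```

A failed check returns the offending rule as a witness, which plays the role of showing that no rank function of this shape exists for the given inference system.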
38

Abstract Hoare logic

Mathiesen, Erik Arne January 2007 (has links)
No description available.
39

A type-passing implementation of value types and genericity

Owen, Timothy Paul January 2002 (has links)
No description available.
40

Objects, patterns and descriptions in data management : the CRISTAL project at CERN

Estrella, Florida January 2001 (has links)
No description available.
