1. A connectionist explanation of presence in virtual environments. Nunez, David. 01 February 2003.
Presence has various definitions, but can be understood as the sensation that a virtual environment is a real place, that the user is actually in the virtual environment rather than at the display terminal, or that the medium used to display the environment has disappeared, leaving only the environment itself. We attempt to unite various presence approaches by reducing each to what we believe is a common basis, the psychology of behaviour selection and control, and re-conceptualizing presence in these terms by defining cognitive presence: the mental state in which the VE, rather than the real environment, acts as the basis for behaviour selection.
The bulk of this work is the construction of a three-layer connectionist model to explain and predict this concept of cognitive presence. The model takes input from two major sources: the perceptual modalities of the user (bottom-up processes) and the mental state of the user (top-down processes). These two sources of input spread activation to a central layer, which competitively determines which behaviour script will be applied to regulate behaviour.
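To make the architecture concrete, the following is a minimal sketch of such a competitive spreading-activation model; the layer sizes, weights and winner-take-all rule are illustrative assumptions, not the thesis implementation.

import numpy as np

# A minimal, illustrative three-layer spreading-activation model: perceptual
# (bottom-up) and conceptual (top-down) units both project to a central layer
# of behaviour scripts, and the script with the highest combined activation
# wins the competition. Layer sizes, weights and names are assumptions.
rng = np.random.default_rng(0)
n_perceptual, n_conceptual, n_scripts = 6, 4, 3
W_bottom_up = rng.uniform(0, 1, (n_scripts, n_perceptual))  # perceptual -> scripts
W_top_down = rng.uniform(0, 1, (n_scripts, n_conceptual))   # conceptual -> scripts

def select_script(perceptual_input, conceptual_input):
    """Combine both activation sources and competitively pick one behaviour script."""
    activation = W_bottom_up @ perceptual_input + W_top_down @ conceptual_input
    activation = activation / activation.sum()      # normalise for the competition
    return int(np.argmax(activation)), activation   # winner-take-all

# Strong VE-related sensory input plus a VE-related mental set pushes the
# competition towards a VE-appropriate script (cognitive presence).
winner, act = select_script(rng.uniform(0.5, 1.0, n_perceptual),
                            rng.uniform(0.5, 1.0, n_conceptual))
print(f"selected script {winner}, activations {np.round(act, 2)}")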
We demonstrate the ability of the model to cope with current notions of presence by using it to predict two published findings: one (Hendrix & Barfield, 1995) showing that presence increases with an increase in the geometric field of view of the graphical display, and another (Sallnas, 1999) demonstrating the positive relationship between presence and the stimulation of more than one sensory modality. Apart from this theoretical analysis, we also performed two experiments to test the central tenets of the model. The first experiment aimed to show that presence is affected by perceptual inputs (bottom-up processes), conceptual inputs (top-down processes), and the interaction of these. We collected 103 observations from a 2x2 factorial design with stimulus quality (2 levels) and conceptual priming (2 levels) as independent variables, and three measures of presence as dependent variables: Slater, Usoh & Steed's (1995) scale, Witmer & Singer's (1998) Presence Questionnaire, and our own cognitive presence measure.
We found a significant main effect for stimulus quality and a significant interaction, which produced a striking pattern: priming the subject with material related in theme to the content of the VE increased the mean presence score for those viewing the high-quality display, but decreased the mean for those viewing the low-quality display. For those not primed with material related to the VE, no mean presence difference was discernible between the high- and low-quality displays. The results from this study suggest that both top-down and bottom-up activation should be taken into account when explaining the causality of presence.
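As a hedged illustration of how a 2x2 design of this kind could be analysed, the sketch below runs a two-way ANOVA on simulated data chosen to mimic the reported pattern; the effect sizes, sample sizes and library choice (pandas and statsmodels) are assumptions for the example, not the thesis analysis or data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated scores only: the means below mimic the reported pattern (priming
# helps with a high-quality display, hurts with a low-quality one).
rng = np.random.default_rng(1)
rows = []
for quality in ("low", "high"):
    for priming in ("none", "related"):
        base = 3.0
        if quality == "high" and priming == "related":
            base += 1.0   # assumed boost from priming with a high-quality display
        if quality == "low" and priming == "related":
            base -= 0.8   # assumed drop from priming with a low-quality display
        for _ in range(25):
            rows.append({"quality": quality, "priming": priming,
                         "presence": base + rng.normal(0, 0.7)})

df = pd.DataFrame(rows)
model = smf.ols("presence ~ C(quality) * C(priming)", data=df).fit()
print(anova_lm(model, typ=2))   # main effects plus the quality x priming interaction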
Our second study aimed to show that presence comes about not as a result of raw sensory information, but rather of partly-processed perceptual information. To do this we created a simple three-group comparative design, with 78 observations. Each group viewed the same VE under one of three display conditions: high-quality graphical, low-quality graphical, and text-only. Using the model, we predicted that the text and low-quality graphics displays would produce the same presence levels, while the high-quality display would outperform them both. The results were mixed, with the Slater, Usoh & Steed scale showing the predicted pattern, but the Presence Questionnaire showing each condition producing a significantly different presence score (in increasing order: text, low-quality graphics, high-quality graphics). We conclude from our studies that the model has the correct basic structure, but that it requires some refinement in its handling of non-immersive displays. We also examined the performance of our presence measure, which was found not to perform satisfactorily. We conclude by proposing some points relevant to the methodology of presence research, and by suggesting some avenues for future expansion of the model.
2. A Digital Library Component Assembly Environment. Eyambe, Linda. 01 October 2005.
Digital libraries (DLs) represent an important evolutionary step towards accessing structured digital information. DLs are often built from scratch or with proprietary monolithic software that is difficult to customise and extend to meet changing requirements. Researchers are beginning to realise that this is not an ideal solution and, as a result, are creating component suites and accompanying protocols to encourage the creation of modular DLs. Despite the introduction of component models, it is not immediately apparent how they can be seamlessly assembled to produce diverse, yet fully functional, component-based digital library systems without knowledge of the underlying protocols.
This dissertation presents a graphical user interface and its associated framework for creating DL systems from distributed components, thereby shielding DL architects from the complexity of using component models while taking advantage of the inherent benefits of the component programming paradigm. The framework was designed to be generic enough to be adopted for the assembly of a variety of component-based systems beyond the digital library community.
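As a purely illustrative sketch of the kind of component descriptions and compatibility checking an assembly framework like this might rely on, consider the following; the descriptor fields, example endpoints and wiring rule are assumptions made for illustration, not the dissertation's actual framework or any real DL protocol.

# Hypothetical component descriptors: each names the interfaces it provides,
# the interfaces it requires, and where it runs. The assembly step wires a
# requirement to a component that provides it, or reports the gap.
components = {
    "harvester": {"provides": ["records"], "requires": [], "endpoint": "http://example.org/harvest"},
    "indexer":   {"provides": ["search"],  "requires": ["records"], "endpoint": "http://example.org/index"},
    "browser":   {"provides": ["ui"],      "requires": ["search"],  "endpoint": "http://example.org/ui"},
}

def assemble(components):
    """Wire components together only when every required interface is provided."""
    provided = {iface for comp in components.values() for iface in comp["provides"]}
    wiring = []
    for name, comp in components.items():
        for need in comp["requires"]:
            if need not in provided:
                raise ValueError(f"{name} requires '{need}', which no component provides")
            supplier = next(n for n, c in components.items() if need in c["provides"])
            wiring.append((supplier, name, need))
    return wiring

print(assemble(components))
# [('harvester', 'indexer', 'records'), ('indexer', 'browser', 'search')]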
After testing with over thirty inexperienced users and the modelling of a number of existing DL systems, graphical assembly of distributed components was shown to be a viable approach to simplifying the creation of modular DLs from a pool of heterogeneous components.
3. Flexible Packaging Methodologies for Rapid Deployment of Customisable Component-based Digital Libraries. Mhlongo, Siyabonga. 01 June 2006.
Software engineering is a discipline concerned with developing software. Software plays a pivotal role in everyday life, and its absence would be devastating to governmental, recreational and financial activities, amongst many others. One of the more recent branches of software engineering, component-based software engineering, is concerned with building software systems from already existing components, an approach expected to make software development faster and less expensive.
In parallel with these advances in software engineering, the field of digital libraries, which deals with Web-based access to and management of structured digital content, has adopted this development model, shifting focus from traditionally monolithic software systems to more flexible component-oriented ones.
Since componentised development approaches are relatively recent, other areas such as packaging and managing component-based software systems remain largely unaddressed. This dissertation presents research on techniques and methodologies for packaging customisable component-based digital libraries such that deployment is rapid and flexibility is not compromised. Although the reference point of this research was component-based digital library systems, it is believed that the results can be generalised across the family of Web-based component-based software systems.
An outcome of this research was a prototype packaging system consisting of a pair of tools: a package builder and a package installer. This packaging system was developed to model the ideas and methodologies identified as important to the processes of packaging and installing component-based digital library systems. The tools subsequently underwent a user evaluation study in which they were assessed for understandability, usability and usefulness to the processes of packaging and installing component-based digital libraries.
A key contribution of this research was identifying requirements for a generic component packaging framework. For a component to be considered "fit-to-package", it must possess the following at the very least: the component must be configurable automatically; the component must have a formal description of its dependency software; there must be formal descriptions of individual components as well as of systems composed of components; and there must be a way of formally encoding installation questions so that components can correctly receive configuration information.
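As a hedged illustration of these four requirements, the sketch below shows one hypothetical package descriptor and an automatic configuration step; the schema and field names are invented for the example and are not the prototype's actual format.

# A hypothetical package descriptor covering the four requirements above.
package = {
    "component": {"name": "search-service", "version": "1.2"},
    "dependencies": [                       # formal description of dependency software
        {"name": "database", "version": ">=4.0"},
    ],
    "system": {                             # a system composed of components
        "name": "example-digital-library",
        "members": ["search-service", "harvester", "web-ui"],
    },
    "questions": [                          # installation questions encoded formally,
        {"key": "data_dir",                 # so configuration can be applied automatically
         "prompt": "Directory for harvested records?",
         "type": "path",
         "default": "/var/lib/dl/records"},
    ],
}

def configure(package, answers):
    """Apply installer answers automatically, falling back to declared defaults."""
    return {q["key"]: answers.get(q["key"], q["default"]) for q in package["questions"]}

print(configure(package, {"data_dir": "/srv/dl"}))   # {'data_dir': '/srv/dl'}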
Overall, this research has shown that component-oriented software development approaches can benefit from an infrastructure which allows component-based software systems to be composed, distributed and installed effortlessly.
4. Measuring the applicability of Open Data Standards to a single distributed organisation: an application to the COMESA Secretariat. Munalula, Themba. 01 January 2008.
Open data standardization has many known benefits, including the availability of tools for standard encoding formats, interoperability among systems, and long-term preservation of data. Mark-up languages and their use on the World Wide Web have made data sharing easier still. The Extensible Markup Language (XML), in particular, has succeeded due to its simplicity and ease of use. Its primary purpose is to facilitate the sharing of data across different information systems, particularly systems connected via the Internet.
Whether open and standardized or not, organizations generate data daily. Offline exchange of documents and data is undertaken using existing formats that are typically defined by the organizations that generate the data. With the Internet, the realization of data exchange has directly increased the need for interoperability and comparability. Although standardization is the accepted approach for online data exchange, little is understood about how well a specific organization's data "fits" a given data standard. This dissertation develops data metrics that represent the extent to which data standards can be applied to an organization's data.
The research identified key issues that affect data interoperability or the feasibility of a move towards interoperability. It tested the unwritten rule that organizations tend to design data requirements more around internal needs than interoperability needs. Essentially, by generating metrics over a number of data attributes, the research quantified the extent of the gap between organizational data and data standards. Key data attributes, i.e. completeness, concise representation, relevance and complexity, were selected and used as the basis for metric generation. In addition to the attribute-based metrics, hybrid metrics representing a measure of the "goodness of fit" of the source data to the standard data were generated.
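As a hedged illustration of what an attribute-based metric and its expert-weighted variant might look like, the sketch below computes a completeness-style ratio over a hypothetical element mapping; the element names, mapping and weights are invented for the example and do not come from the COMESA data or any specific standard.

# Invented element names, mapping and weights for illustration only.
standard_elements = ["country", "year", "commodity", "value", "unit", "source"]
org_elements = ["country", "year", "product", "trade_value"]              # organisation's own fields
mapping = {"country": "country", "year": "year", "commodity": "product"}  # standard element -> org field

def completeness(standard, mapping):
    """Share of standard elements the organisation's data can populate."""
    return len(mapping) / len(standard)

def weighted_completeness(standard, mapping, weights):
    """The same metric, with expert users weighting elements by importance."""
    total = sum(weights.get(e, 1.0) for e in standard)
    covered = sum(weights.get(e, 1.0) for e in standard if e in mapping)
    return covered / total

expert_weights = {"country": 2.0, "year": 2.0}   # assumed expert weights
print(completeness(standard_elements, mapping))                           # 0.5
print(weighted_completeness(standard_elements, mapping, expert_weights))  # 0.625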
Regarding the completeness attribute, it was found that most Common Market for Eastern and Southern Africa (COMESA) head office data clusters had lower metrics than desired, consistent with the gap highlighted above. The same applied to the concise representation attribute: most data clusters represented the COMESA data more concisely than the data standard. The complexity metrics generated confirmed that the number of data elements is a key determinant in any move towards the adoption of data standards. This was also borne out by the magnitude of the hybrid metrics, which to some extent depended on the complexity metrics.
An additional contribution of the research was the inclusion of expert users' weights for the data elements and the recalculation of all metrics. A comparison with the unweighted metrics yielded a mixed picture. Among the completeness metrics, and for the data retention rate in particular, increases were recorded for data clusters in which greater weight was allocated to mapped elements than to unmapped ones. The same applied to the relative elements ratio. The complexity metrics showed general declines when user-weighted elements were used in the computation as opposed to unweighted elements. This again was because these metrics depend on the number of elements: in the unweighted case the weights were effectively evenly distributed, whereas in the weighted case some elements were given lower weights by the expert users, leading to an overall decline in the metric.
A number of implications emerged for COMESA. First, COMESA would have to determine the extent to which its source data rely on data sources for which international standards are being promoted. Secondly, an inventory of the users and collectors of the data COMESA uses is necessary in order to determine who would benefit from a standards-based information system. Thirdly, from an organizational perspective, COMESA needs to designate a team to guide the creation of such a standards-based information system. Lastly, there is a need for involvement in the consortia responsible for these data standards, which has implications for organizational resources.
Overall, this research provided a methodology for determining the feasibility of a move towards standardization, making it possible to answer the critical first-stage questions that such a move raises.
5. Usable Authentication for Mobile Banking. Chong, Ming Ki. 01 January 2009.
Mobile banking is attractive because it allows people to do banking anytime, anywhere. One requirement of performing a mobile banking transaction is that users must log in first. The current mobile banking login method is PIN authentication; however, other studies have found usability concerns with PINs. To overcome some of these concerns, researchers have suggested the use of graphical passwords. In this research, we argue that another alternative input technique can be utilized. We explore a novel password input approach, called gesture passwords, which uses discrete 3-dimensional gesture motions as password elements. Three systems (PINs, graphical passwords and gesture passwords) were therefore compared.
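A minimal sketch of what treating discrete 3-dimensional motions as password elements could look like follows; the gesture alphabet and the exact-match rule are illustrative assumptions, not the thesis prototype or its gesture-recognition pipeline.

# An illustrative gesture alphabet and an exact-sequence matching rule.
GESTURE_ALPHABET = {"up", "down", "left", "right", "forward", "backward"}

def enroll(sequence):
    """Store a gesture password after validating every motion."""
    if not all(motion in GESTURE_ALPHABET for motion in sequence):
        raise ValueError("unknown gesture motion")
    return tuple(sequence)

def authenticate(stored, attempt):
    """Accept only an exact, ordered reproduction of the enrolled gestures."""
    return tuple(attempt) == stored

secret = enroll(["up", "forward", "left", "down"])
print(authenticate(secret, ["up", "forward", "left", "down"]))   # True
print(authenticate(secret, ["up", "forward", "right", "down"]))  # False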
This dissertation describes the design of two mobile authentication techniques, combinational graphical passwords and gesture passwords, which were implemented as prototypes. The prototypes, along with a PIN authenticator, were evaluated with users: experiments measured user experience and password retention to determine the usability and user acceptance of each system. The results show that users were able to use all of the test systems; however, they were more proficient with PINs and preferred them over the other two systems for mobile banking authentication.
6. Meta-standardisation of Interoperability Protocols. Paihama, Jorgina Kaumbe do Rosário. 01 June 2012.
The current medley of interoperability protocols is potentially problematic. Each protocol is designed by a different group, provides a single service, and has its own syntax and vocabulary. Popular protocols such as RSS have simple, easy-to-understand documentation, which is a key factor in their high adoption levels, but the majority of protocols are complex, making them relatively difficult for programmers to understand and implement.
This research proposes a possible new direction for the design of high-level interoperability protocols. The High-level Interoperability Protocol - Common Framework (HIP-CF) is designed and evaluated as a proof of concept that, if interoperability is made simpler, protocols become easier for programmers to understand and implement, which can increase adoption levels and lead to more interoperable systems.
HIP-CF is not suggested as an alternative to current production protocols. Rather, it is suggested that the design approach taken by HIP-CF can be applied to other protocols, and that a suite of simpler protocols is a better solution than various simple individual protocols. Evaluation results show that current protocols can be substantially improved upon. These improvements could, and perhaps should, result from a deeper analysis of the goals of today's protocols and from collaboration amongst the different groups that design high-level interoperability protocols.
This research presents a new approach and suggests future experimental research options for the field of high-level interoperability protocol design.