11

Embodied conversational agents : extending the persona metaphor to virtual retail applications

McBreen, Helen January 2002
Engineering computer interfaces to communicate through the modality of speech serves to bridge the communicative gap between computers and their human users. Adding non-verbal performances to these spoken language interfaces, through the creation of embodied conversational agents, initiates dialogues where users' innate communicative capabilities are used for the benefit of a more engaging and effective interaction. Anthropomorphising the interface with lifelike behaviour animates the communicative process, and recent research suggests that the extension of this persona metaphor into retail applications will provide personalised real-time interactions for users, improving online relationships. This thesis is a contribution to the emerging and innovative area of embodied conversational agents. It is undertaken to advance knowledge about the effectiveness of these agents in electronic retail applications. To date, few empirical studies have documented user perceptions of the agents, and this has therefore become the primary research objective of this thesis. The interdisciplinary investigation aims to determine how embodied conversational agents should be physically represented in retail interfaces. The research involves a series of progressive empirical evaluations. First, a retail interface template was created, where variations of the persona metaphor were evaluated using participatory observation techniques. Following this, an interactive spoken language system, inhabited by embodied conversational agents, was designed and implemented to serve as an experimental platform from which to evaluate the perceived and expected behaviour of agents in contrasting retail applications. This interactive system was then used to determine the effectiveness of multi-modal interface features designed to improve the trustworthiness of the agents in applications where users are asked to make financial transactions. The conclusions drawn from this body of research are intended to support design guidelines and interface development strategies for the most effective deployment of embodied conversational agents in virtual retail applications.
12

A statistical investigation of the factors influencing the performance of parallel programs, with application to a study of process migration strategies

Phillips, Joseph January 1994
It would be highly desirable for operating systems to take a greater responsibility for process placement and load balancing decisions in parallel machines. This would relieve the programmer of much of the burden associated with fine-tuning an application in order to achieve acceptable performance levels. Before such operating systems can be developed, it is necessary to gain a better understanding of the factors that influence program performance. In particular, it would be useful to be able to identify classes of programs which, in a statistical sense, behave in a similar manner. Then, given an arbitrary program whose class was known, rules and heuristics developed for the program class (in conjunction with program-specific information) could be used to make informed placement and load balancing decisions. As a step in this direction, this thesis investigates the application of standard statistical techniques to the performance analysis of particular classes of parallel programs. Simple CSP-type parallel programs exhibiting loosely synchronous data parallelism are used to illustrate how a common class of programs can be characterised in terms of a relatively small number of parameters representing time-averaged properties. In order to explore the parameter space systematically, synthetic programs are used. The execution of these programs is simulated on an accurate performance model of a transputer-based machine. Standard experimental design techniques, such as the analysis of variance, are then applied to develop statistical models relating to the program class. It is shown that useful quantitative predictions can be made for arbitrary class members.
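The analysis-of-variance step lends itself to a small illustration. The sketch below, with invented factor levels and timings (nothing here comes from the thesis), shows how a one-way ANOVA can test whether a time-averaged program parameter has a significant effect on simulated execution time.

```python
# One-way ANOVA on simulated execution times (seconds) for synthetic
# programs run at three assumed levels of a "communication granularity"
# parameter. All numbers are invented for illustration.
from scipy.stats import f_oneway

fine   = [12.1, 11.8, 12.4, 12.0, 12.3]
medium = [10.2, 10.5,  9.9, 10.1, 10.4]
coarse = [ 9.0,  9.3,  8.8,  9.1,  9.2]

f_stat, p_value = f_oneway(fine, medium, coarse)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the parameter influences performance for this
# program class -- the kind of quantitative statement the thesis aims for.
```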
13

Genetic algorithms for word length optimization of FFT processors

Sulaiman, Nasri January 2007
Genetic algorithms (GAs) are a particular class of evolutionary algorithms that use techniques inspired by evolutionary biology, such as inheritance, mutation, selection, and crossover, to find the best solutions to optimization and search problems. GAs are used in a wide variety of applications in fields including computer science, engineering, evolvable hardware, economics, mathematics, physics and biogenetics, to name a few. A fast Fourier transform (FFT) is an efficient algorithm to compute the discrete Fourier transform (DFT) and its inverse. An FFT processor is used in applications such as signal processing and telecommunications. It is one of the most power-consuming blocks in wireless receivers such as Multi-Carrier Code Division Multiple Access (MC-CDMA) receivers. The portability requirement of these receiver systems imposes the need for low-power architectures; thus, designing an FFT processor with low power consumption is of crucial importance for overall system power. The power consumption of an FFT processor depends on the word length of the FFT coefficients. One way to reduce the power consumption of this processor is to reduce the switching activity in the FFT coefficients, which can be achieved using a smaller word length for the coefficients. This in turn reduces the SNR of the output signals of the FFT. This thesis investigates the impact of word length optimization of FFT coefficients on switching activity and SNR using GAs. The quality of GA solutions is compared with non-GA solutions in order to determine the feasibility of using GAs to achieve optimum performance in terms of switching activity and SNR. Results show that GAs can find solutions with smaller word lengths and significant reductions in switching activity compared to the non-GA solutions. This thesis also investigates the effect of varying parameter settings, such as mutation domain, population size, crossover rate and mutation probability, on the quality of the GA search and its speed of convergence.
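As a rough illustration of the approach, the following sketch evolves per-stage coefficient word lengths under an assumed cost model: switching activity is proxied by total bits, and quantisation SNR by roughly 6 dB per bit of the worst stage. The stage count, SNR target and GA parameters are all illustrative assumptions, not the thesis's actual models.

```python
# Toy GA trading switching activity against quantisation SNR by choosing
# a word length per FFT stage. Cost model and constants are assumptions.
import random

N_STAGES, MIN_B, MAX_B = 4, 6, 16
SNR_TARGET = 60.0  # dB, an assumed requirement

def fitness(bits):
    snr = 6.02 * min(bits)             # worst stage dominates (assumption)
    switching = sum(bits)              # crude proxy for switching activity
    penalty = 100.0 * max(0.0, SNR_TARGET - snr)
    return -(switching + penalty)      # higher is better

def mutate(bits, p=0.2):
    return [min(MAX_B, max(MIN_B, b + random.choice([-1, 1])))
            if random.random() < p else b for b in bits]

def crossover(a, b):
    cut = random.randrange(1, N_STAGES)
    return a[:cut] + b[cut:]

pop = [[random.randint(MIN_B, MAX_B) for _ in range(N_STAGES)]
       for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                 # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
print("best word lengths per stage:", max(pop, key=fitness))
```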
14

Model abstraction and reusability in a hierarchical architecture simulation environment

Williams, Lawrence January 1999
The practice of simulating real world systems on computers is widespread and forms an important aspect of many different disciplines. A simulation model provides a simplified view of a real world system facilitating interaction with key aspects of a system without the distraction of unnecessary detail. This thesis is concerned with the role of simulation in computer architecture design. It is recognised that use of simulation in the design lifecycle is expensive and has tended to focus upon the register transfer (RT) level of design. The majority of design projects have no need for fully articulated models in the initial stages: the designer is more involved with fundamental decisions typically based upon choice of algorithm and high-level performance analysis. However, it has been shown that representation of systems in a more abstract form than that found at the RT level can be problematic in terms of reusability. Following an overview of current simulation techniques and software, extensions to the HASE simulation environment are proposed that classify simulation components according to their communication interfaces. This facilitates the loose coupling of simulation entities and consequently promotes component reuse. In addition, the problem of allowing entities represented at different levels of architectural abstraction to communicate was examined and a technique developed to allow entities to negotiate a level of service. The MEDL and EDL languages were developed to enhance HASE's component library and project storage facilities; other software tools allowing the visualisation of a hierarchical model in terms of communication and abstraction were also developed. Various model libraries were developed to investigate the trade-offs between model accuracy, runtime and flexibility afforded by the new techniques. It was demonstrated that the developed techniques facilitate component reuse and offer potential runtime reduction.
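The negotiate-a-level-of-service idea can be sketched simply. In the toy below (entity names and abstraction levels are assumed for illustration, not taken from HASE), each entity advertises the levels its communication interface supports, and the negotiation selects the most detailed level common to both.

```python
# Two simulation entities negotiate a shared level of abstraction for
# their communication. Levels are ordered from least to most detailed.
ABSTRACTION_ORDER = ["behavioural", "bus-functional", "register-transfer"]

class Entity:
    def __init__(self, name, levels):
        self.name, self.levels = name, set(levels)

def negotiate(a, b):
    common = a.levels & b.levels
    if not common:
        raise ValueError(f"{a.name} and {b.name} share no level of service")
    # Prefer the most detailed level both interfaces can provide.
    return max(common, key=ABSTRACTION_ORDER.index)

cpu = Entity("cpu", ["behavioural", "register-transfer"])
mem = Entity("memory", ["behavioural", "bus-functional"])
print(negotiate(cpu, mem))  # -> "behavioural", the only shared level
```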
15

Human factors in the design of parallel program performance tuning tools

Hondroudakis, Anna January 1997
Parallel program design and development is inherently more difficult than its sequential counterpart. People who have traditionally programmed on sequential machines are now called upon to deal with new and complex issues when trying to control a number of entities executing concurrently. In the quest to produce application programs that take advantage of the power of the underlying hardware, usable and efficient tools are required to assist with the task of performance analysis and tuning. A large number of tools are described in the literature, but the results of tool use are disappointing. The problem can be attributed to the fact that tool design is not informed by users and their tasks. This thesis presents an investigation of tuners' practices and analyses its implications for tool support. An initial investigation by means of interviews and questionnaires provided the basis of a framework for the analysis of tuning. A number of problems with existing tuning tools were recorded and tuner requirements were captured. Some of these requirements were investigated further with a smaller number of local tuners who participated in the design of VISPAT, a tool for visualising the performance of parallel programs running on clusters of workstations. Tuning was also examined in the wider context of requirements and work practices within multi-person development projects. An environment is proposed - the Tuner's Workbench - which addresses issues arising from the repetitive nature of tuning, namely its management and documentation in the context of real-world parallel code development.
16

Segmentation of motion picture images and image sequences

Hillman, Peter January 2002
For motion picture special effects, it is often necessary to take a source image of an actor, segment the actor from the unwanted background, and then composite the result over a new background. The resultant image appears as if the actor had been filmed in front of the new background. The standard approach requires the unwanted background to be a blue or green screen. While this technique is capable of handling areas where the foreground (the actor) blends into the background, the physical requirements present many practical problems. This thesis investigates the possibility of segmenting images where the unwanted background is more varied. Standard segmentation techniques tend not to be effective, since motion picture images have extremely high resolution and high accuracy is required to make the result appear convincing. A set of novel algorithms which require minimal human interaction to initialise the processing is presented. These algorithms classify each pixel by comparing its colour to that of known background and foreground areas. They are shown to be effective where there is a sufficient distinction between the colours of the foreground and background. A technique for assessing the quality of an image segmentation, allowing these algorithms to be compared to alternative solutions, is presented. Results are included which suggest that in most cases the novel algorithms have the best performance, and that they produce results more quickly than the alternative approaches. Techniques for segmentation of moving image sequences are then presented. Results are included which show that only a few frames of the sequence need to be initialised by hand, as it is often possible to generate automatically the input required to initialise processing for the remaining frames. A novel algorithm is presented which can produce acceptable results on image sequences where more conventional approaches fail or are too slow to be of use.
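The pixel-classification step described above admits a compact sketch. The version below (using assumed colour samples, and ignoring the blended edge pixels a production system must handle) labels each pixel by whichever sample set it lies nearer in colour space.

```python
# Label each pixel foreground or background by nearest sampled colour.
# Samples and image are invented; real footage would be far larger.
import numpy as np

def classify(image, fg_samples, bg_samples):
    """image: (H, W, 3) floats in [0,1]; samples: (N, 3) known colours."""
    h, w, _ = image.shape
    px = image.reshape(-1, 1, 3)
    d_fg = np.linalg.norm(px - fg_samples[None, :, :], axis=2).min(axis=1)
    d_bg = np.linalg.norm(px - bg_samples[None, :, :], axis=2).min(axis=1)
    return (d_fg < d_bg).reshape(h, w)  # True where pixel looks foreground

img = np.random.rand(4, 4, 3)
fg = np.array([[0.9, 0.7, 0.6]])        # skin-tone-like foreground sample
bg = np.array([[0.1, 0.2, 0.1]])        # dark background sample
print(classify(img, fg, bg))
```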
17

Shape optimisation using traditional and morphogenetic evolutionary algorithms : integrated representation of geometry and physical behaviour

Sherlock, Andrew January 2003
The work described in this thesis investigated the use of novel shape representations and algorithms for shape optimisation. The aim was to find techniques which could search through a large generality of shapes. This would allow a computer to be used in a more creative way, to synthesise shapes for components given a specification of the desired function. Three examples of work done on shape optimisation using evolutionary algorithms and various shape representations, and the problems encountered in linking them together effectively with the analysis module, are described. These examples are aerofoil profile optimisation with a genetic algorithm, optimisation of a constructive solid geometry solid model with genetic programming, and structural optimisation of a voxel shape representation with a genetic algorithm. Most shape optimisation techniques rely on three separate modules: an optimiser, a shape representation and an analysis method. Each of these modules uses a different internal data structure. This thesis argues that using a common data structure for each of these modules would allow a number of novel and effective algorithms for shape optimisation to be developed. The data structure proposed is based on chain models, using cell complexes and chains from algebraic topology. As an example of a new approach to shape optimisation enabled by the new data structure, a novel algorithm which adds a morphogenetic stage to a genetic algorithm for structural optimisation is also described.
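The chain-model idea can be made concrete with a toy example. Below, a triangle's cell complex is stored as a boundary matrix and chains are integer coefficient vectors; the edge cycle has zero boundary. This illustrates only the algebraic machinery, not the thesis's actual data structure.

```python
# A cell complex stored as a boundary matrix; chains are coefficient
# vectors, and the boundary operator is matrix multiplication.
import numpy as np

# Triangle: vertices v0,v1,v2; edges e0=v0->v1, e1=v1->v2, e2=v2->v0.
# boundary_1[v, e] = +1 if v is the head of edge e, -1 if the tail.
boundary_1 = np.array([[-1,  0,  1],
                       [ 1, -1,  0],
                       [ 0,  1, -1]])

edge_cycle = np.array([1, 1, 1])        # 1-chain: e0 + e1 + e2
print(boundary_1 @ edge_cycle)          # [0 0 0]: a cycle has no boundary

single_edge = np.array([1, 0, 0])       # 1-chain: just e0
print(boundary_1 @ single_edge)         # [-1 1 0]: boundary is v1 - v0
```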
18

An alternative modelling technique for the reduction of error in decision support spreadsheets

Thorne, Simon January 2008
Spreadsheet applications are currently the most prevalent end-user tool in organisations across the world. Surveys of spreadsheet use show that spreadsheets are used as decision-making tools in a range of organisations, from credit liability assessment in the business world to patient cardiovascular-anaesthesia risk assessment in the medical community. However, there is strong evidence to suggest that a significant proportion of spreadsheets contain errors that affect the validity of their operation and results. In addition, most end users receive no relevant information systems training and consequently have no concept of creating reliable software. This can result in poorly designed, untested spreadsheets that are potentially full of errors. This thesis presents a novel alternative modelling technique for decision support spreadsheets. The technique uses attribute classifications (user-defined examples) to create a model of a problem, and is coined "Example Driven Modelling" (EDM). Through experimentation, the relative benefits and useful limits of EDM are explored and established. The practical application of EDM to real-world spreadsheets demonstrates how EDM outperforms equivalent spreadsheet models in a medical decision-making spreadsheet used to determine the anaesthesia risk of a patient undergoing cardiovascular surgery.
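As a very loose reading of the description above (the attributes, labels and matching rule are all invented here, not taken from the thesis), an example-driven model can be sketched as a table of user-defined example cases, with a new case decided by its best-matching example.

```python
# Hypothetical example-driven classifier: the "model" is just a table of
# user-defined example cases; a new case takes the label of whichever
# example agrees with it on the most attributes.
EXAMPLES = [
    # (age_band, blood_pressure, smoker) -> classification
    (("over_70",  "high",   True),  "high_risk"),
    (("over_70",  "normal", False), "moderate_risk"),
    (("under_70", "normal", False), "low_risk"),
]

def classify(case):
    # Score each example by how many attributes agree with the case.
    best = max(EXAMPLES,
               key=lambda ex: sum(a == b for a, b in zip(ex[0], case)))
    return best[1]

print(classify(("over_70", "high", True)))   # -> "high_risk"
```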
19

A global approach to project optimisation

Harding, Anthony Martin January 1999
In recent years there has been a growing trend in major projects towards considering the whole project life cycle and its costs. This analysis initially considered concept to commissioning, although more recently it has been expanded to consider concept to decommissioning. Within this project life cycle, optimisations are performed to obtain efficient solutions. However, contemporary optimisation techniques still tend to focus on improving the performance of small subsystems within the project as a whole. An optimisation of the whole project across the entire project life cycle is not normally carried out. This thesis therefore proposes a general methodology for such a global optimisation model, which allows the whole project to be considered within a single optimisation. The strategy consists of a basic scheme for input of all the project data to an objective function, as well as the definition of its constraints. The objective function can map to a value relating to cost, time, performance or risk. This allows the most important criterion to be optimised while other important criteria are constrained. The validity of the model is tested by applying it to the resource-constrained project scheduling problem, which involves solving scheduling and resource allocation problems. In a large construction project these problems would form a subsystem of the global project model. By testing the global approach to optimisation on this problem using a genetic algorithm, and comparing it to locally optimising techniques, optimisation performance is shown to be improved by applying the global approach. Finally, the global approach is successfully applied to a case study. The results of both the case study and the simple problems demonstrate the global approach to be both feasible and advantageous for large construction projects.
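The resource-constrained scheduling test problem can be sketched in miniature. The toy below builds a serial schedule from a task priority list under a single renewable resource; precedence constraints are omitted for brevity, the data are invented, and plain random search over priority lists stands in for the genetic algorithm.

```python
# Serial schedule generation for a tiny resource-constrained scheduling
# instance: durations, resource demands and capacity are all invented.
import random

DUR = {"a": 3, "b": 2, "c": 4, "d": 2}   # task durations
REQ = {"a": 2, "b": 1, "c": 2, "d": 1}   # resource units needed
CAP = 3                                   # renewable resource capacity

def makespan(order):
    running, t, finish = [], 0, {}        # running: (finish_time, demand)
    for task in order:
        # Advance time until enough resource is free for this task.
        while sum(r for f, r in running if f > t) + REQ[task] > CAP:
            t = min(f for f, r in running if f > t)
        finish[task] = t + DUR[task]
        running.append((finish[task], REQ[task]))
    return max(finish.values())

orders = [random.sample(list(DUR), len(DUR)) for _ in range(50)]
best = min(orders, key=makespan)          # random-search stand-in for a GA
print(best, makespan(best))
```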
20

Redefining the hyperlink

Stanyer, Dominic S. J. January 2001
By advocating an approach to interface design based on the work of Suchman and her observation that actions are situated in the particular circumstances in which they occur, this thesis argues that the simple hypertext abstractions of the Web page and the hyperlink prevent the system from revealing the Internet technology below. An investigation into WWW use shows that this seamless hypertext abstraction is very rarely sustained in practice. Users struggle to explain dynamic system disturbances and fail to prevent worthless hyperlink activations. The investigation reveals the problems that users encounter and the strategies they employ in their attempts to circumvent these abstractions and probe the system for more appropriate information. The findings highlight the Uniform Resource Locator (URL) as the most important resource available to the WWW user when resolving a system disturbance or predicting the content of a referenced Web page. Experiments were undertaken to uncover, first, what type of system information users can extract from the URL and, secondly, how confident and accurate users are at extracting such information. The understanding of WWW use elicited through the observational investigations and the directed experimentation was then employed in the design of an extension to the WWW browser - the hyperlink lens - which redefines the hyperlink abstraction by revealing to the user information about the data transfer process and the referenced Web page.
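The kind of information users were asked to read out of a URL is easy to illustrate programmatically. The example link below is invented; the sketch simply shows the structural components available to a reader (or a tool like the hyperlink lens) before the link is activated.

```python
# Decompose a hypothetical URL into the components a user might inspect
# to predict the referenced page's content and transfer behaviour.
from urllib.parse import urlparse

url = "http://www.example.ac.uk/research/papers/thesis.pdf?lang=en#sec2"
parts = urlparse(url)
print(parts.scheme)    # 'http' -> the transfer protocol
print(parts.hostname)  # 'www.example.ac.uk' -> server; domain hints at origin
print(parts.path)      # '/research/papers/thesis.pdf' -> likely a PDF file
print(parts.query)     # 'lang=en' -> parameters suggest dynamic content
```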
