121. High level behavioural modelling of boundary scan architecture. Medhat, Saad Sabih Ahmed, January 1993.
This project involves the development of a software tool which automatically integrates the IEEE 1149.1/JTAG Boundary Scan Test Architecture into an ASIC (Application Specific Integrated Circuit) design. The tool requires the original design (the ASIC) to be described in the VHDL-IEEE 1076 Hardware Description Language. The tool consists of two major elements: i) a parsing and insertion algorithm developed and implemented in 'C'; ii) a high level model of the Boundary Scan Test Architecture implemented in 'VHDL'. The parsing and insertion algorithm identifies the design's Input/Output (I/O) terminals, their types and the order in which they appear in the ASIC design. It then attaches a suitable Boundary Scan Cell to each I/O, except power and ground, and inserts the high level models of the full Boundary Scan Architecture into the ASIC without altering the design core structure.
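A minimal sketch of the kind of parsing and insertion step described above, shown here in Python for brevity (the original algorithm was implemented in 'C'); the simplified VHDL entity, the pin names and the cell-naming scheme are all illustrative assumptions.

```python
# Sketch: identify I/O terminals in a VHDL entity and attach boundary scan
# cells to each, skipping power and ground, as the abstract describes.
import re

VHDL_ENTITY = """
entity asic_core is
    port (
        clk    : in  std_logic;
        data   : in  std_logic_vector(7 downto 0);
        result : out std_logic_vector(7 downto 0);
        vdd    : in  std_logic;
        gnd    : in  std_logic
    );
end entity;
"""

POWER_PINS = {"vdd", "vss", "gnd"}  # terminals that receive no scan cell

def parse_ports(vhdl: str):
    """Return (name, direction) pairs in declaration order."""
    port_re = re.compile(r"^\s*(\w+)\s*:\s*(in|out|inout)\b", re.MULTILINE)
    return port_re.findall(vhdl)

def insert_boundary_scan(ports):
    """Attach a boundary scan cell to every functional I/O terminal."""
    cells = []
    for name, direction in ports:
        if name.lower() in POWER_PINS:
            continue  # power and ground are left untouched
        cells.append(f"BSC_{direction.upper()} attached to port '{name}'")
    return cells

for cell in insert_boundary_scan(parse_ports(VHDL_ENTITY)):
    print(cell)
```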
122. Vector offset operators for deformable organic objects. Hurmusiadis, Vassilios, January 1998.
Many natural materials and most living tissues exhibit complex deformable behaviours that may be characterised as organic. In computer animation, deformable organic material behaviour is needed for the development of characters and scenes based on living creatures and natural phenomena. This study addresses the problem of deformable organic material behaviour in computer animated objects. It concentrates on problems inherent in geometry-based deformation techniques, such as non-intuitive interaction and difficulty in achieving realism, and on problems inherent in physically based deformation techniques, such as inefficiency and difficulty in enforcing spatial and temporal constraints. The main objective of this study is to find a general and efficient solution to interaction and animation of deformable 3D objects with natural organic material properties and constrainable behaviour. The solution must provide an interaction and animation framework suitable for the creation of animated deformable characters. An implementation of physical organic material properties such as plasticity, elasticity and viscoelasticity can provide the basis for an organic deformation model. An efficient approach to stress and strain control is introduced with a deformation tool named the Vector Offset Operator. Stress/strain graphs control the elastoplastic behaviour of the model. Strain creep, stress relaxation and hysteresis graphs control its viscoelastic behaviour. External forces may be applied using motion paths equipped with momentum/time graphs. Finally, spatial and temporal constraints are applied directly to vector operators. The suggested generic deformation tool introduces an intermediate layer between user interaction, deformation, elastoplastic and viscoelastic material behaviour, and spatial and temporal constraints. This results in an efficient approach to deformation, frees object representation from deformation, facilitates the application of constraints and enables further development.
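As an illustration only, the following Python sketch shows one plausible reading of a vector offset operator driving vertex displacement through a stress/strain graph; the falloff function and the elastoplastic curve are invented stand-ins, not the thesis's actual formulation.

```python
# Sketch: vertices near the operator's origin are displaced along its
# direction, scaled by a toy elastoplastic stress/strain curve.
import math

def strain_from_stress(stress, elastic_limit=1.0):
    """Toy stress/strain graph: linear up to the elastic limit, then plastic."""
    if stress <= elastic_limit:
        return stress
    return elastic_limit + 0.2 * (stress - elastic_limit)

def apply_vector_offset(vertices, origin, direction, stress, radius=2.0):
    """Displace each vertex along `direction`, attenuated by distance from `origin`."""
    strain = strain_from_stress(stress)
    deformed = []
    for v in vertices:
        dist = math.dist(v, origin)
        falloff = max(0.0, 1.0 - dist / radius)  # linear falloff inside radius
        deformed.append(tuple(p + strain * falloff * d
                              for p, d in zip(v, direction)))
    return deformed

mesh = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(apply_vector_offset(mesh, origin=(0.0, 0.0, 0.0),
                          direction=(0.0, 0.0, 1.0), stress=1.5))
```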
123. Efficient techniques for soft tissue modeling and simulation. Duysak, Alpaslan, January 2004.
Performing realistic deformation simulations in real time is a challenging problem in computer graphics. Among numerous proposed methods, including Finite Element Modeling and ChainMail, we have implemented a mass spring system because of its acceptable accuracy and speed. Mass spring systems have, however, some drawbacks, such as the difficulty of determining simulation coefficients, and they are iterative in nature. Given the correct parameters, mass spring systems can accurately simulate tissue deformations, but choosing parameters that capture nonlinear deformation behaviour is extremely difficult. Since most applications require a large number of elements (i.e. points and springs) in the modelling process, it is extremely difficult to reach real-time performance with an iterative method. We have developed a new parameter identification method based on neural networks. The structure of the mass spring system is modified and neural networks are integrated into this structure. The input space consists of changes in spring lengths and velocities, while a "teacher" signal is chosen as the total spring force, which is expressed in terms of positional changes and applied external forces. Neural networks are trained to learn nonlinear tissue characteristics represented by spring stiffness and damping in the mass spring algorithm. The learning algorithm is further enhanced by an adaptive learning rate, developed particularly for mass spring systems. In order to avoid the iterative approach in deformation simulations we have developed a new deformation algorithm. This algorithm defines the relationships between points and springs and specifies a set of rules on spring movements and deformations. These rules result in a deformation surface, which is called the search space. The deformation algorithm then finds the deformed points and springs in the search space with the help of the defined rules. The algorithm also sets rules on each element (i.e. triangle or tetrahedron) so that they do not pass through each other. The new algorithm is considerably faster than the original mass spring system algorithm and provides an opportunity for various deformation applications. We have used mass spring systems and the developed method in the simulation of craniofacial surgery. For this purpose, a patient-specific head model was generated from MRI medical data by applying medical image processing tools such as filtering and segmentation; a polygonal representation of the model was obtained using a surface generation algorithm. Prism volume elements are generated between the skin and bone surfaces so that different tissue layers are included in the head model. Both methods produce plausible results verified by surgeons.
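The sketch below illustrates the iterative mass spring update the abstract refers to; the stiffness k and damping c constants are precisely the coefficients the thesis learns with neural networks, and the fixed values here are only stand-ins.

```python
# Sketch: one explicit Euler step of a mass spring system. Real simulations
# iterate this until equilibrium, which is why the method struggles to
# reach real-time rates with many elements.
import math

def spring_force(p1, v1, p2, v2, rest_len, k=50.0, c=0.5):
    """Force on p1 from spring (p1, p2): Hooke term plus damping."""
    d = [b - a for a, b in zip(p1, p2)]
    length = math.sqrt(sum(x * x for x in d)) or 1e-9
    unit = [x / length for x in d]
    rel_vel = sum((vb - va) * u for va, vb, u in zip(v1, v2, unit))
    magnitude = k * (length - rest_len) + c * rel_vel
    return [magnitude * u for u in unit]

def step(positions, velocities, springs, masses, dt=1e-3):
    """Accumulate spring forces, then integrate one time step."""
    forces = [[0.0, 0.0, 0.0] for _ in positions]
    for i, j, rest in springs:
        f = spring_force(positions[i], velocities[i],
                         positions[j], velocities[j], rest)
        for a in range(3):
            forces[i][a] += f[a]
            forces[j][a] -= f[a]
    for i, m in enumerate(masses):
        for a in range(3):
            velocities[i][a] += dt * forces[i][a] / m
            positions[i][a] += dt * velocities[i][a]

pos = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]]  # rest length 1.0, stretched
vel = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
step(pos, vel, springs=[(0, 1, 1.0)], masses=[1.0, 1.0])
print(pos)
```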
124. The design and evaluation of the specification framework for user interface design. Crowle, Simon, January 2003.
This thesis presents the design and evaluation of an interface specification meta-language (ISML) that has been developed to explicitly support metaphor abstractions in a model-based user interface design framework. The application of metaphor to user interface design is widely accepted within the HCI community, yet despite this there exists relatively little formal support for user interface design practitioners. With the increasing range and power of widely available user interface technologies comes the opportunity for the design of sophisticated new forms of interactive environments. The inter-disciplinary nature of HCI offers many approaches to user interface design that include views on tasks, presentation and dialogue architectures, and various domain models. Notations and tools that support these views vary equally, ranging from craft-based approaches through computational or tool-based support to formal methods. Work in these areas depicts a gradual cohesion of a number of these design views, but does not currently explicitly specify the application of metaphorical concepts in graphical user interface design. Towards addressing this omission, ISML was developed based on (and extending) some existing model-based user interface design concepts. Abstractions of metaphor and other interface design views are captured in the ISML framework using the extensible mark-up language (XML). A six-month case study, developing the 'Urban Shout Cast' application, is used to evaluate ISML. Two groups of four software engineers developed a networked, multi-user, virtual radio-broadcasting environment. A qualitative analysis examines both how each group developed metaphor designs within the ISML framework and their perceptions of its utility and practicality. Subsequent analysis of the specification data from both groups reveals aspects of the project's design that ISML captured and those that it missed. Finally, the extent to which ISML can currently abstract the metaphors used in the case study is assessed through the development of a unified 'meta-object' model. The results of the case study show that ISML is capable of expressing many of the features of each group's metaphor design, as well as highlighting important design considerations during development. Furthermore, it has been shown, in principle, how an underlying metaphor abstraction can be mapped to two different implementations. Evaluation of the case study also yields important design lessons: ISML metaphor models can be both very large and difficult to separate from other design views, some of which are either weakly expressed or unsupported. This suggests that the appropriate mappings between design abstractions cannot always be easily anticipated, and that understanding the use of model-based specifications in user interface design projects remains a challenge for the HCI community.
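ISML abstractions are captured in XML; the fragment below is a hypothetical illustration of what a metaphor-to-interface mapping might look like (the element and attribute names are invented, not the actual ISML schema), together with a minimal Python loader.

```python
# Sketch: load a hypothetical metaphor specification expressed in XML and
# list its component mappings, in the spirit of ISML's XML-based capture.
import xml.etree.ElementTree as ET

ISML_FRAGMENT = """
<isml>
  <metaphor name="radio-station">
    <component name="fader" maps-to="volume-control"/>
    <component name="cart-machine" maps-to="audio-clip-player"/>
  </metaphor>
</isml>
"""

root = ET.fromstring(ISML_FRAGMENT)
for metaphor in root.findall("metaphor"):
    print(f"Metaphor: {metaphor.get('name')}")
    for comp in metaphor.findall("component"):
        print(f"  {comp.get('name')} -> {comp.get('maps-to')}")
```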
125. An empirical investigation into metrics for object-oriented software. Cartwright, Michelle Helen, January 1998.
Object-oriented methods have increased in popularity over the last decade, and are now the norm for software development in many application areas. Many claims were made for the superiority of object-oriented methods over more traditional methods, and these claims have largely been accepted, or at least not questioned, by the software community. Such was the motivation for this thesis. One way of capturing information about software is the use of software metrics. However, if we are to have faith in the information, we must be satisfied that these metrics do indeed tell us what we need to know. This is not easy when the software characteristics we are interested in are intangible and unable to be precisely defined. This thesis considers the attempts to measure software and to make predictions regarding maintainability and effort over the last three decades. It examines traditional software metrics and considers their failings in the light of the calls for better standards of validation in terms of measurement theory and empirical study. From this, five lessons were derived. The relatively new area of metrics for object-oriented systems is examined to determine whether suggestions for improvement have been widely heeded. The thesis uses an industrial case study and an experiment to examine one feature of object-orientation, inheritance, and its effect on aspects of maintainability, namely the number of defects and the time to implement a change. The case study is also used to demonstrate that it is possible to obtain early, simple and useful local prediction systems for important attributes such as system size and defects, using readily available measures rather than predefined and possibly time-consuming metrics which may suffer from poor definition, invalidity or an inability to predict or capture anything of real use. The thesis concludes that there is empirical evidence to suggest a hypothesis linking inheritance to an increased incidence of defects and increased maintenance effort, and that more empirical studies are needed in order to test the hypothesis. This suggests that we should treat claims regarding the benefits of object-orientation for maintenance with some caution. The thesis also concludes that, with the ability to produce accurate local metrics with little effort, we have an acceptable substitute for the large predefined metrics suites with their attendant problems.
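As an illustration of such an early, simple local prediction system, the sketch below fits ordinary least squares to a readily available measure; the project data are fabricated for the example.

```python
# Sketch: a local prediction system built from a project's own history,
# relating a cheap size measure (class count) to observed defects.
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

classes = [12, 30, 45, 60, 85]   # readily available local measure
defects = [3, 9, 14, 17, 26]     # observed defects (illustrative)
a, b = fit_line(classes, defects)
print(f"predicted defects for a 50-class subsystem: {a * 50 + b:.1f}")
```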
126. Heuristics for use case descriptions. Cox, Karl, January 2002.
Use cases, as part of the Unified Modelling Language, have become an industry standard. The major focus has been on the use case diagram; it is only recently that any detailed attention has been paid to the use case description. The description should be written in such a way as to make it communicable to its reader. However, this does not always appear to be the case. This thesis presents the 7 C's of Communicability as quality features of use case descriptions that make them more comprehensible. The 7 C's are derived from software engineering best practice on use case descriptions and from theories of text comprehension. To help in writing descriptions, the CP Use Case Writing Rules are proposed, a small set of guidelines derived from the 7 C's. Going beyond requirements, software engineers often employ use case descriptions to help them build initial design models of the proposed system. Despite Jacobson's claim that "objects naturally fall out of use cases", finding design-oriented classes and objects in use case descriptions is shown not to be straightforward. This thesis proposes a Question Set which allows the engineer to interrogate the description for important elements of specification and design. Experimentation shows that the CP Writing Rules furnish descriptions that are as comprehensible as those written using other guidelines proposed in the literature. It is also suggested that descriptions be written from the perspective of their intended audience. The limitations of conducting requirements engineering experiments using students are considered, and it is suggested that experimenters should not expect large effects from the results. An industrial case study shows that although the CP Rules could not be applied to all events in the use case descriptions, they were applied to most, and at varying levels of abstraction. The case study showed that the 7 C's did identify problems with the written descriptions. The Question Set was well received by the case study stakeholders, but it was considered time consuming. One of the overriding findings from the case study was that project time constraints would not allow the company to use the techniques suggested, although they recognised the need to do so. Automation would make industrial application of the CP Rules and 7 C's more feasible.
127. An empirical investigation into software effort estimation by analogy. Schofield, Christopher, January 1998.
Most practitioners recognise the important part accurate estimates of development effort play in the successful management of major software projects. However, it is widely recognised that current estimation techniques are often very inaccurate, while studies (Heemstra 1992; Lederer and Prasad 1993) have shown that effort estimation research is not being effectively transferred from the research domain into practical application. Traditionally, research has been almost exclusively focused on the advancement of algorithmic models (e.g. COCOMO (Boehm 1981) and SLIM (Putnam 1978)), where effort is commonly expressed as a function of system size. However, in recent years there has been a discernible movement away from algorithmic models, with non-algorithmic systems (often encompassing machine learning facets) being actively researched. This is potentially a very exciting and important time in this field, with new approaches regularly being proposed. One such technique, estimation by analogy, is the focus of this thesis. The principle behind estimation by analogy is that past experience can often provide insights and solutions to present problems. Software projects are characterised in terms of collectable features (such as the number of screens or the size of the functional requirements) and stored in a historical case base as they are completed. Once a case base of sufficient size has been cultivated, new projects can be estimated by finding similar historical projects and re-using the recorded effort. To make estimation by analogy feasible it became necessary to construct a software tool, dubbed ANGEL, which allowed the collection of historical project data and the generation of estimates for new software projects. A substantial empirical validation of the approach was made, encompassing approximately 250 real historical software projects across eight industrial data sets, using stepwise regression as a benchmark. Significance tests on the results accepted the hypothesis (at the 1% confidence level) that estimation by analogy is a superior prediction system to stepwise regression in terms of accuracy. A study was also made of the sensitivity of the analogy approach. By growing project data sets in a pseudo time-series fashion it was possible to answer pertinent questions about the approach, such as: what are the effects of outlying projects, and what is the minimum data set size? The main conclusions of this work are that estimation by analogy is a viable estimation technique that would seem to offer some advantages over algorithmic approaches, including improved accuracy, easier use of categorical features and an ability to operate even where no statistical relationships can be found.
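The sketch below illustrates the general analogy mechanism in the spirit of ANGEL: normalise project features, find the k nearest historical cases, and reuse their recorded effort. The case base, the two features and the choice of Euclidean distance are illustrative assumptions.

```python
# Sketch: estimation by analogy as k-nearest-neighbour retrieval over a
# historical case base of (features, recorded effort) pairs.
import math

# (screens, functional size) -> person-months of recorded effort (fabricated)
CASE_BASE = [((20, 150), 14.0), ((35, 300), 30.0),
             ((12, 90), 8.0), ((28, 210), 21.0)]

def normalise(features, case_base):
    """Scale each feature to [0, 1] over the case base's observed range."""
    los = [min(c[0][i] for c in case_base) for i in range(len(features))]
    his = [max(c[0][i] for c in case_base) for i in range(len(features))]
    return [(f - lo) / (hi - lo) for f, lo, hi in zip(features, los, his)]

def estimate(features, case_base, k=2):
    """Mean effort of the k most similar historical projects."""
    target = normalise(features, case_base)
    def dist(case):
        return math.dist(target, normalise(case[0], case_base))
    nearest = sorted(case_base, key=dist)[:k]
    return sum(effort for _, effort in nearest) / k

print(f"estimated effort: {estimate((25, 180), CASE_BASE):.1f} person-months")
```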
128. Evaluation of a fuzzy-expert system for fault diagnosis in power systems. Park, Min Young, January 2001.
A major problem with alarm processing and fault diagnosis in power systems is the reliance on circuit alarm status. If too much information is available, and the time of arrival of the information is random due to weather conditions and the like, the alarm activity is not easily interpreted by system operators. In response to these problems, this thesis sets out the work that has been carried out to design and evaluate a diagnostic tool which assists power system operators during heavy periods of alarm activity in condition monitoring. The aim of employing this diagnostic tool is to monitor and raise uncertain alarm information for the system operators, which serves as a proposed solution for restoring such faults. The diagnostic system uses elements of AI, namely expert systems and fuzzy logic incorporating abductive reasoning. The objective of employing abductive reasoning is to optimise the interpretation of uncertain Supervisory Control and Data Acquisition (SCADA) messages when those messages cannot be resolved by simple logic alone. The method is implemented with object-oriented programming, which offers reusability, polymorphism and readability. The principle behind employing object-oriented techniques is to provide better insights and solutions compared with conventional artificial intelligence (AI) programming languages. The work involves the development and evaluation of a fuzzy-expert system which handles the uncertainty in a 16-line, 12-bus sample power system. The performance of the diagnostic tool is assessed on the basis of consistent data acquisition, readability, adaptability and maintainability on a PC. This diagnostic tool enables operators to produce more appropriate interpretations effectively, rather than relying on mathematically precise fault identification, when mathematical modelling fails and the level of alarm activity is high. This research contributes to the field of power system control; in particular, Scottish Hydro-Electric PLC has shown interest and supplied all the necessary information and data. The AI-based power system is presented as a sample application of Scottish Hydro-Electric and KEPCO (Korea Electric Power Corporation).
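A speculative sketch of the fuzzy side of such a tool: alarm timing uncertainty is mapped to a fuzzy confidence, and a fuzzy AND combines alarms into a fault hypothesis. The membership shapes and the single rule are invented examples, not the thesis's actual rule base.

```python
# Sketch: weather-delayed alarms are discounted via a trapezoidal
# membership function, then combined with the standard fuzzy AND (min).
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function over [a, d] with plateau [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def alarm_confidence(delay_seconds):
    """Fresh alarms are fully trusted; stale, storm-delayed ones less so."""
    return trapezoid(delay_seconds, -1.0, 0.0, 5.0, 30.0)

# Rule: IF breaker alarm AND protection alarm THEN line fault.
breaker = alarm_confidence(delay_seconds=2.0)
protection = alarm_confidence(delay_seconds=12.0)
print(f"line-fault hypothesis confidence: {min(breaker, protection):.2f}")
```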
129. The automated analysis of object-oriented designs. Kirsopp, Colin, January 2001.
This thesis concerns the use of software measures to assess the quality of object-oriented designs. It examines the ways in which design assessment can be assisted by measurement and the areas in which it cannot. Other work in software measurement looks at defining and validating measures, or building prediction systems; this work is distinctive in that it examines the use of measures to help improve design quality at design time. Evaluating a design from measurement results requires a means of relating measurement values to particular design problems or quality levels. Design heuristics were used to make this connection between measurement and quality. A survey was carried out to find suggestions for guidelines, rules and heuristics in the OO design literature. This survey resulted in a catalogue of 288 suggestions for OO design heuristics. The catalogue is structured around the OO constructs to which the heuristics relate, and includes information on various heuristic attributes. This scheme is intended to allow suitable heuristics to be quickly located and correctly applied. Automation requires tool support. A tool was built which augmented the functionality available in existing toolsets, taking input from multiple sources of design information (e.g., CASE tools and source code); the approach described so far presents a potential method for automated design assessment, and the tool provides the means of automation. An empirical study was then required to consider the efficacy of the method and to evaluate the novel features of the tool. A case study was used to explore the approach taken by, and evaluate the effectiveness of, 15 subjects using measures and heuristics to assess the design of a small OO system (15 classes). This study showed that semantic heuristics tended to highlight significant problems, but where attempts were made to automate these it often led to false problems being identified. This result, along with a previous finding that around half of quality criteria are not automatically assessable at design time, strongly suggests that people are still a necessary part of design assessment. The main result of the case study was that the subjects correctly identified 90% of the major design problems and were very positive about their experience of using measurement to support design assessment.
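A minimal sketch of connecting measurement values to design heuristics, as the thesis does; the classes, the measure and the threshold below are all invented for illustration.

```python
# Sketch: compute a simple OO measure per class and flag violations of a
# heuristic threshold, relating measurement values to design problems.
HEURISTIC = ("Keep classes small", "methods_per_class", 20)  # illustrative

design_model = {
    "Order": {"methods_per_class": 12},
    "Customer": {"methods_per_class": 8},
    "GodController": {"methods_per_class": 47},
}

def assess(model, heuristic):
    """Report every class whose measure exceeds the heuristic's threshold."""
    name, measure, threshold = heuristic
    for cls, measures in model.items():
        if measures[measure] > threshold:
            print(f"'{cls}' violates heuristic '{name}': "
                  f"{measure} = {measures[measure]} > {threshold}")

assess(design_model, HEURISTIC)
```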
130. Real-time expressive Internet communications. Xu, Zhe, January 2005.
This research work, "Real-time Expressive Internet Communications", focuses on two subjects: one is the investigation of methods for automatic emotion detection and visualisation in a real-time Internet communication environment; the other is the analysis of the influence of presenting visualised emotion-expressive images to Internet users. To detect emotion within Internet communication, the emotion communication process over the Internet needs to be examined. An emotion momentum theory was developed to illustrate this process. It is argued in this theory that an Internet user is in a certain emotion state; the emotion state is changeable by internal and external stimuli (e.g. a received chat message) and by time; stimulus duration and stimulus intensity are the major factors influencing the emotion state. The emotion momentum theory divides the emotions expressed in Internet communication into three dimensions: emotion category, intensity and duration. The theory was implemented within a prototype emotion extraction engine. The emotion extraction engine can analyse input text in an Internet chat environment, detect and extract the emotion being communicated, and deliver the parameters to invoke an appropriate expressive image on every communicating user's display. A set of experiments was carried out to test the speed and accuracy of the emotion extraction engine; the results demonstrated an acceptable performance. The next step of this study was to design and implement an expressive image generator that generates expressive images from a single neutral facial image. Generated facial images are classified into six categories, and for each category three different intensities were achieved. Users need to define only six control points and three control shapes to synthesise all the expressive images, and a set of experiments was carried out to test the quality of the synthesised images. The experiment results demonstrated an acceptable recognition rate for the generated facial expression images. With the emotion extraction engine and the expressive image generator, a test platform was created to evaluate the influence of emotion visualisation in the Internet communication context. The results of a series of experiments demonstrated that emotion visualisation can enhance users' perceived performance and their satisfaction with the interfaces. The contributions to knowledge fall into four main areas: firstly, the emotion momentum theory proposed to illustrate the emotion communication process over the Internet; secondly, the innovations built into the emotion extraction engine, which senses emotional feelings in textual messages input by Internet users; thirdly, the innovations built into the expressive image generator, which synthesises facial expressions using a fast approach with a user-friendly interface; and fourthly, the identification of the influence that the visualisation of emotion has on human computer interaction.
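A speculative sketch of the emotion momentum idea: the emotion state decays with time and is pushed by stimuli carrying a category, an intensity and a duration. The exponential decay and the saturating duration term are illustrative assumptions, not the thesis's actual model.

```python
# Sketch: an emotion state nudged by stimuli and faded by elapsed time,
# reflecting the three dimensions of category, intensity and duration.
import math

class EmotionState:
    def __init__(self, category="neutral", intensity=0.0, half_life=60.0):
        self.category = category
        self.intensity = intensity
        self.half_life = half_life  # seconds for intensity to halve

    def elapse(self, seconds):
        """Emotion fades exponentially with time."""
        self.intensity *= 0.5 ** (seconds / self.half_life)

    def stimulus(self, category, intensity, duration):
        """A chat message nudges the state; longer stimuli push harder."""
        push = intensity * math.tanh(duration)  # saturating duration effect
        if push > self.intensity:
            self.category = category  # a stronger push changes the category
        self.intensity = min(1.0, self.intensity + push)

state = EmotionState()
state.stimulus("joy", intensity=0.6, duration=2.0)
state.elapse(120.0)  # two quiet minutes later
print(state.category, round(state.intensity, 2))
```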