121

Thermal fields during welding and their analogues

Carrick, James P. January 1976 (has links)
To avoid the problems associated with specifying the exact nature of the heat input from welding arcs, an analogue model is proposed which simulates the quasi-static thermal field produced around the isothermal contour of the molten weld pool boundary during the welding of thin plate. The design of an electrical analogue based directly on Rosenthal's equation (1) governing the quasi-static heat flow about a moving source is shown to be impractical, although this approach identifies the physical significance of the two parameter ratios. To overcome the difficulties associated with the direct analogue, a simple transformation of Rosenthal's equation is employed and the design of an indirect, or θ-field, analogue of this transformed equation is developed. The details of the construction and commissioning of such an analogue are reported. The application of this analogue to studying the quasi-static thermal field is tested by comparing analogue-predicted and experimentally measured temperature histories of points in the HAZ for a range of autogenous TIG melt runs on thin mild steel plate. The experimental results are obtained from a purpose-built automatic welding rig which incorporates a facility for determining the shape of the molten weld pool during welding. The results from these comparative tests show good agreement between predicted and measured temperature histories, and the application of the θ-field analogue to studying the thermal field during welding is discussed.
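The abstract cites Rosenthal's equation without stating it; for reference, a standard statement of the quasi-static thin-plate (two-dimensional) form is (the thesis's own equation (1) may differ in notation):

```latex
T - T_0 = \frac{q}{2\pi k g}\, e^{-v\xi/2\alpha}\, K_0\!\left(\frac{v r}{2\alpha}\right)
```

where \(q\) is the net heat input rate, \(k\) the thermal conductivity, \(g\) the plate thickness, \(\alpha\) the thermal diffusivity, \(v\) the travel speed of the source, \(\xi\) the coordinate moving with the source, \(r\) the distance from the source, and \(K_0\) the modified Bessel function of the second kind. The two parameter ratios the abstract refers to are plausibly the dimensionless groups \(v\xi/2\alpha\) and \(vr/2\alpha\) appearing in the exponential and Bessel terms.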
122

A study of structure stability using the finite element method and minimum weight design of composite panel

Suri, Ali Hasan January 1983 (has links)
A general study of the finite element approach and its application to structural analysis is conducted, and methods of derivation of element properties are reviewed. It is found that the method has the advantage of generality of application and that the direct approach is easier to programme than the force method. A unified and simple formula is derived for the computation of element elastic and geometric stiffness matrices and is found to be much easier to apply than existing methods. The standard finite element displacement method is used to study overall buckling of beams, plates, and stiffened plates. The results show that the method can provide accurate answers as compared with the existing analytical approaches. The exact finite strip, the approximate finite strip, and the finite strip for local buckling analysis are reviewed and are found to economise greatly in computer time, needing less storage space because of their narrow bandwidth as compared with the standard finite element method. In particular, the finite strip for local stability, which uses a standard eigenvalue subroutine and is based on the concept of geometric stiffness matrices, is used to study local buckling and the buckling of plates, including plates supported elastically by a continuous elastic medium. The results obtained are shown to be very close to those obtained analytically, the effect of the elastic support being to increase greatly the buckling stress relative to the unsupported plates.
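The eigenvalue formulation the abstract describes, an elastic stiffness matrix paired with a geometric stiffness matrix and the buckling load recovered as the smallest generalized eigenvalue, can be sketched for the simplest case, a pinned-pinned Euler column. This is an illustrative sketch, not the thesis's own code; the element matrices are the standard Euler-Bernoulli bending and consistent geometric stiffness matrices.

```python
import numpy as np
from scipy.linalg import eigh

def beam_matrices(EI, L):
    # Elastic (bending) and consistent geometric stiffness for a 2-node
    # Euler-Bernoulli beam element; DOFs per node: (w, theta).
    k = EI / L**3 * np.array([
        [ 12,    6*L,  -12,    6*L],
        [ 6*L, 4*L*L, -6*L,  2*L*L],
        [-12,   -6*L,   12,   -6*L],
        [ 6*L, 2*L*L, -6*L,  4*L*L]])
    kg = 1.0 / (30*L) * np.array([
        [ 36,    3*L,  -36,    3*L],
        [ 3*L, 4*L*L, -3*L,   -L*L],
        [-36,   -3*L,   36,   -3*L],
        [ 3*L,  -L*L, -3*L,  4*L*L]])
    return k, kg

def euler_buckling_load(EI=1.0, L=1.0, n_elem=8):
    # Assemble global K and Kg, apply pinned-pinned boundary conditions,
    # then solve K phi = P Kg phi; the smallest P is the critical load.
    n_dof = 2 * (n_elem + 1)
    K, Kg = np.zeros((n_dof, n_dof)), np.zeros((n_dof, n_dof))
    le = L / n_elem
    for e in range(n_elem):
        ke, kge = beam_matrices(EI, le)
        idx = slice(2*e, 2*e + 4)
        K[idx, idx]  += ke
        Kg[idx, idx] += kge
    # Pinned-pinned: suppress transverse displacement at both end nodes.
    keep = [i for i in range(n_dof) if i not in (0, n_dof - 2)]
    K, Kg = K[np.ix_(keep, keep)], Kg[np.ix_(keep, keep)]
    return eigh(K, Kg, eigvals_only=True)[0]
```

With `EI = L = 1` the computed load converges rapidly from above to the analytical Euler load `pi**2 * EI / L**2`, illustrating the abstract's claim that the displacement-based eigenvalue approach matches analytical results closely.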
123

Structured polyphonic patterns

Bergeron, Mathieu January 2010 (has links)
The present dissertation develops, applies and evaluates a novel method for the representation and retrieval of patterns in musical data. The method supports the typical polyphonic patterns that one finds in music theory textbooks. Most current computational approaches to musical patterns are restricted to monophony (one melody at a time). The Structured Polyphonic Patterns method (SPP) applies to the general case of polyphonic music, where many melodies may unfold concurrently. Pattern components are conjunctions of features which encode properties of musical events, or relations that they form with other events. Relations between events that overlap in time but are not simultaneous are supported, enabling patterns to express key temporal relations of polyphonic music. Patterns are formed by joining and layering pattern components into sequences (horizontal structures) and layers (vertical structures). A layer specifies voicing in an abstract way, and the exploration of different voice permutations is handled automatically. The SPP method also provides a mechanism for defining new features. We evaluate SPP by developing a small catalog of musicologically relevant queries and analyzing the results on four corpora: 185 chorale harmonizations by J.S. Bach, Mozart's Symphony no. 40, a small set of piano pieces by Chopin, and a collection of more than 8000 folk songs. Besides demonstrating the scalability of the method through its size, this latter corpus is interesting because it shows that SPP is also usable for monophony. Examining several corpora allows us to establish that some polyphonic patterns constitute salient properties of a corpus: they are over-represented in one corpus by comparison to the others. In addition, the queries we develop demonstrate that the SPP method possesses sufficient expressiveness to capture important music-theoretic notions.
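SPP's actual pattern syntax is not reproduced in the abstract; the following is only a toy sketch of the core idea it describes: events carry features, and pattern components test conjunctions of features together with temporal relations between events that overlap without being simultaneous (here, a suspension-like dissonance between two voices). All names and the dissonance test are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Event:
    onset: float   # start time in beats
    dur: float     # duration in beats
    pitch: int     # MIDI pitch number
    voice: int     # 0 = highest voice
    @property
    def offset(self):
        return self.onset + self.dur

def overlaps(a, b):
    # True when the two events sound together but do not start simultaneously,
    # the kind of non-simultaneous temporal relation the abstract highlights.
    return a.onset != b.onset and a.onset < b.offset and b.onset < a.offset

def suspension_like(events):
    # Pattern component as a conjunction of features: an upper-voice note
    # held against a later lower-voice note, forming a 2nd or 7th.
    hits = []
    for a in events:
        for b in events:
            if a.voice < b.voice and overlaps(a, b):
                if (a.pitch - b.pitch) % 12 in (1, 2, 10, 11):
                    hits.append((a, b))
    return hits
```

A held C5 against a B3 entering mid-note would match; two notes attacked together would not, since the relation excludes simultaneity.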
At the same time, we show how the method is more restrictive than some existing polyphonic pattern representations, hence providing a better approximation of the expressive power required for polyphonic patterns. It is a better candidate representation for music data mining, a difficult problem that has received significant attention for the monophonic case, but limited attention for the more general polyphonic case.
124

Quantifying information flow with constraints

Zhu, Ping January 2010 (has links)
Quantifying flow of information in a program involves calculating how much information (e.g. about secret inputs) can be leaked by observing the program's public outputs. Recently this field has attracted a lot of research interest, most of which makes use of Shannon's information theory, e.g. mutual information, conditional entropy, etc. Computability entails that any automated analysis of information flow is necessarily incomplete. Thus quantitative flow analyses aim to compute upper bounds on the sizes of the flows in a program. Virtually all the current quantitative analyses treat program variables independently, which significantly limits the potential for deriving tight upper bounds. Our work is motivated by the intuition that knowledge of the dependencies between program variables should allow the derivation of more precise upper bounds on the size of flows, and that classical abstract interpretation provides an effective mechanism for determining such dependencies in the form of linear constraints. Our approach is then to view the problem as one of constrained optimization (maximum entropy), allowing us to apply the standard Lagrange multiplier technique. Application of this technique turns out to require the development of some novel methods, due to the essential use of non-linear (entropy) constraints in conjunction with the linear dependency constraints. Using these methods we obtain more precise upper bounds on the size of information flows than is possible with existing analysis techniques.
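The constrained-optimization view can be illustrated on a toy case: maximize the Shannon entropy of a distribution subject to linear constraints (here a single mean constraint stands in for the linear dependency constraints that abstract interpretation would derive). This is a numerical sketch with illustrative names, not the thesis's method; the Lagrange-multiplier structure shows up in the fact that the maximizer has the Gibbs form p_i proportional to 2^(-lambda * x_i).

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy(values, mean):
    # Maximize H(p) = -sum p log2 p over distributions on `values`,
    # subject to sum(p) = 1 and E[X] = mean (a linear constraint).
    n = len(values)
    vals = np.asarray(values, dtype=float)

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)   # guard log(0)
        return float(np.sum(p * np.log2(p)))

    cons = [{'type': 'eq', 'fun': lambda p: p.sum() - 1.0},
            {'type': 'eq', 'fun': lambda p: p @ vals - mean}]
    res = minimize(neg_entropy, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return -res.fun, res.x   # entropy in bits, maximizing distribution
```

For four equally spaced values with the mean pinned at the midpoint, the uniform distribution already satisfies the constraint, so the bound is the full 2 bits; tightening the mean constraint away from the midpoint drives the bound below 2 bits, which is the sense in which dependency constraints yield tighter leakage bounds.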
125

The analysis and acquisition of proper names for robust text understanding

Coates-Stephens, Sam January 1992 (has links)
In this thesis we consider the problems that Proper Names cause in the analysis of unedited, naturally-occurring text. Proper Names cause problems because of their high frequency in many types of text, their poor coverage in conventional dictionaries, their importance in the text understanding process, and the complexity of their structure and of the structure of the text which describes them. For the most part these problems have been ignored in the field of Natural Language Processing, with the result that Proper Names are one of its most under-researched areas. As a solution to the problem, we present a detailed description of the syntax and semantics of seven major classes of Proper Name, and of their surrounding context. This description leads to the construction of syntactic and semantic rules specifically for the analysis of Proper Names, which capitalise on the wealth of descriptive material which often accompanies a Proper Name when it occurs in a text. Such an approach side-steps the problem of lexical coverage by allowing a text processing system to use the very text it is analysing to construct lexical and knowledge base entries for unknown Proper Names as it encounters them. The information acquired on unknown Proper Names goes considerably beyond a simple syntactic and semantic classification, instead consisting of a detailed genus and differentia description. A complete solution to the 'Proper Name Problem' must include approaches to the handling of apposition, conjunction and ellipsis, abbreviated reference, and many of the far from standard phenomena encountered in naturally-occurring text. The thesis advances partial and practical solutions in all of these areas. In order to set the work described in a suitable context, the problems of Proper Names are viewed as a subset of the general problem of lexical inadequacy, as it arises in processing real, unedited text.
The whole of this field is reviewed, and various methods of lexical acquisition compared and evaluated. Our approach to coping with lexical inadequacy and to handling Proper Names is implemented in a news text understanding system called FUNES, which is able to automatically acquire detailed genus and differentia information on Proper Names as it encounters them in its processing of news text. We present an assessment of the system's performance on a sample of unseen news text which is held to support the validity of our approach to handling Proper Names.
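The acquisition strategy described, using the descriptive material that accompanies an unknown Proper Name to build a lexicon entry, can be sketched in miniature. This is not FUNES's actual rule set; the single appositive pattern below is a hypothetical illustration of the idea.

```python
import re

# Toy context rule: an unknown capitalised name followed by an appositive
# description ("Name, the <description>,") yields a candidate lexicon entry.
APPOSITIVE = re.compile(
    r"(?P<name>(?:[A-Z][a-z]+ )+[A-Z][a-z]+), "
    r"(?:the|a|an) (?P<descr>[^,.]+)[,.]")

def acquire(text, lexicon):
    # Add an entry for each name whose description appears in the text.
    for m in APPOSITIVE.finditer(text):
        lexicon.setdefault(m.group('name'), m.group('descr').strip())
    return lexicon
```

Run on a sentence such as "Shares fell after John Smith, the chairman of Acme Corp, resigned.", the rule recovers both the name and a genus-like description ("chairman of Acme Corp") without any prior dictionary entry, which is the lexical-coverage side-step the abstract describes.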
126

Mobilisation of support for the Palestinian cause : a comparative study of political change at the communal, regional and global levels

Kirisci, Kemal January 1986 (has links)
Those who study world politics are divided between the traditional Realist paradigm, which depicts an international political system dominated by states involved in a 'power struggle' in pursuit of their 'national interest', and an emergent approach that includes in the analysis a wider range of political actors and defines the nature of politics very differently. The latter approach sees the central process of world politics as being the mobilisation of support with respect to the composition of the global political agenda and contest over the various issue positions. This thesis examines the Palestinian Question as a case study of a mobilisation process that involved a non-state actor playing a crucial role in introducing to the global agenda an issue previously of low salience to other actors. The Palestinian Question throughout the 1950s and 1960s was treated on the global political agenda as a by-product of the Arab-Israeli conflict. It was perceived as a 'refugee problem', the solution of which was envisaged within an overall settlement of the Arab-Israeli conflict. Yet, within less than a decade of the re-appearance of an indigenous Palestinian national movement, a significant section of the international political system changed its attitude towards the Palestinian problem. It was no longer perceived simply as a 'refugee problem' but as one of 'self-determination'. In this thesis the analysis of the mobilisation process that brought the Palestinian issue to the forefront of the world political agenda is guided by a dynamic model applied to four different levels of analysis. The first level is constituted by the Palestinian community. Then there is the Arab governmental level. The third level is made up of various regional groupings, such as the Non-Aligned, the Latin Americans, the European Community and the East Europeans. The final level is the global one, represented by the United Nations political system.
The analysis reveals the dynamic and interactive nature of the mobilisation process across different levels of analysis and the way in which the different positions held on the Palestinian issue have converged towards a relatively common stand.
127

Theory based design and evaluation of multimedia presentation interfaces

Faraday, Peter January 1998 (has links)
Multimedia (MM) applications currently suffer from an ad hoc development process. This places the usability and effectiveness of many MM products in doubt. This thesis develops a theoretically motivated design method and tools to address these problems. The thesis is based on an analysis of the cognitive processes of attending to and comprehending an MM presentation. A design method is then developed based on these cognitive processes. The method addresses the problem of selecting media to present information requirements, how to design the media to effectively deliver the desired content, how to combine verbal and visual media successfully, and how to direct the user's attention to particular parts of the presentation. A number of studies are then presented which provide validation for the method's claims. These include eye tracking to analyse the user's reading/viewing sequence, and tests of expert and novice recall of MM and conventional text/speech presentations. A set of re-authoring studies shows that application of the guidelines improves retention of the content. The method is supported by a design advisor authoring tool. The tool applies the guidelines using a combination of a critiquer and an expert system. The tool demonstrates that the guidelines are tractable for implementation, and provides a novel approach to providing authoring advice. Both the method and the tool are also validated in case studies with novice users. These demonstrate that the method and tool are both usable and effective.
128

Designing virtual environments for usability

Kaur, Kulwinder January 1998 (has links)
This thesis investigates user interaction in virtual environments and usability requirements to support that interaction. Studies of the design and use of virtual environments are used to demonstrate the need for interface design guidance. A theory of interaction for virtual environments is proposed, which includes predictive models of interactive behaviour and a set of generic design properties for supporting that behaviour. The models elaborate on D.A. Norman's cycle of action to describe the stages involved in three modes of behaviour: task and action based, exploratory and reactive. From the models, generic design properties are defined for various aspects of the virtual environment, such as its objects, actions and user representation. The models of interaction are evaluated through empirical studies of interactive behaviour which compare observed interaction patterns with those predicted. The generic design properties are evaluated through usability studies that investigate the links between missing design properties and usability problems encountered. Results from the evaluation studies provide general support for the theory and indicate specific refinements required. A controlled study is used to test the impact of the theory on interaction success, by comparing performance in virtual environments with and without implementation of the generic design properties. Significant improvements in interaction are found with the use of a virtual environment, after the predicted design properties have been implemented. Design guidelines are then developed from the theory and a hypertext tool designed to present the guidelines. The tool and guidelines are evaluated with industrial virtual environment designers to test the usability and utility of the guidance. Results indicate that the guidance is useful in addressing the practical problem of designing virtual environments for usability. 
Therefore, this thesis fulfils its objective of developing interface design guidelines for virtual environments, using interaction modelling as a theoretical base. Furthermore, it provides an improved understanding of user interaction in virtual environments and can be used to inform further theories, methods or tools for virtual environments and human-computer interfaces.
129

User modelling for evaluation of direct manipulation interfaces

Springett, Mark Vincent January 1995 (has links)
This thesis applies models of user action to usability evaluation of direct manipulation interfaces. In particular, the utility of a Model of Action for assisting novice evaluators in usability tests is investigated. An initial model of user action is proposed, based on the theory of action proposed by Norman (1986). This model includes a description of knowledge sources used in interaction, error types and user responses to errors. The model is used to interpret data on user behaviour and errors in an empirical study of MacDraw I. This study used the Protocol Analysis technique proposed by Ericsson and Simon (1984). Protocol evidence shows that the search and specification stages of user action could usefully be treated as separate in terms of user knowledge recruitment and the nature of system support. The Model of Action is then expanded and modified to account for the empirical findings. The new model distinguishes knowledge-based, rule-based and skill-based processing in Direct Manipulation (DM) interaction, using the distinction drawn by Rasmussen (1986). These processing levels are explicitly linked to types of presentation technique and categories of user error. This is developed into a technique for determining system causes of usability problems. A set of mental dialogue tokens (roles) is developed to assist novice evaluators in the interpretation of error causes. Roles are linked to types of user error in the cycle of action in a diagnostic model. This model forms the basis of a budget method for use by novice evaluators, named Model Mismatch Analysis (MMA). These developments are tested by a two-tier study of user performance on Microsoft Word. The empirical evidence validates the taxonomy of errors and tests the utility of five retrospective data analysis techniques. A study of novice evaluator performance is reported, comparing the MMA method to the Usability Checklist proposed by Ravden and Johnson (1989).
The MMA method is shown to be the more efficient approach. To summarise, models of Direct Manipulation action are shown to assist novice evaluators both in the diagnosis of usability problems, and the selection of remedies.
130

A requirements engineering method for COTS-based systems development

Ncube, Cornelius January 2000 (has links)
An increasing number of organisations are procuring off-the-shelf software products from commercial suppliers. However, there has been a lack of methods and software tools for the associated requirements acquisition, product selection and product procurement. This thesis proposes a new method called PORE (Procurement-Oriented Requirements Engineering) which integrates existing requirements engineering techniques with those from knowledge engineering, feature analysis, multi-criteria decision-making and argumentation approaches, to address the lack of guidance for acquiring requirements to enable the evaluation and selection of commercial off-the-shelf (COTS) software. PORE is designed in part from conclusions drawn from real-world case studies of requirements acquisition for complex software product selection. Such studies are reported in this thesis. The PORE method is part goal-driven and part context-driven, in that it exploits models of the candidate COTS software and customer requirements, as well as process goals, to guide a requirements engineering team. The method's approach and mechanisms are demonstrated using a well-known commercial electronic-mail system. A number of studies are presented to provide validation for the method. These include three studies in three different organisations to select COTS software products, and one study of requirements engineering experts to elicit their knowledge. The results from these studies demonstrate that the method is usable and effective. The thesis concludes with a discussion of future work to improve the PORE method and future research directions on requirements engineering for COTS-based systems development.
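The multi-criteria decision-making ingredient the abstract mentions is commonly realised as a weighted-score ranking of candidate products against weighted requirements. The sketch below is a generic illustration of that idea, not PORE's actual mechanism; all names and numbers are hypothetical.

```python
def rank_products(products, weights):
    # products: {product_name: {criterion: compliance score in [0, 1]}}
    # weights:  {criterion: relative importance, summing to 1}
    # Missing criteria score 0, i.e. no evidence of compliance.
    totals = {name: sum(weights[c] * scores.get(c, 0.0) for c in weights)
              for name, scores in products.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical evaluation of two COTS e-mail products:
weights  = {"usability": 0.5, "cost": 0.3, "vendor support": 0.2}
products = {"ProductA": {"usability": 0.9, "cost": 0.4, "vendor support": 0.5},
            "ProductB": {"usability": 0.6, "cost": 0.9, "vendor support": 0.8}}
ranking = rank_products(products, weights)
```

A weighted sum is the simplest of the multi-criteria techniques; it makes the trade-off between criteria explicit in the weights, which is precisely where requirements acquisition feeds into product selection.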
