121

The formal description of musical perception

Steedman, Mark January 1972 (has links)
This work concerns a problem in modelling people's understanding of music. The problem is cast in terms of discovering formal rules for transcribing melodies into musical notation, as this might be done by a student in a harmony class taking musical dictation from a 'deadpan' performance on the keyboard. The score that results reflects important aspects of the structure and interpretation of the piece, which are only implicit in the performance. In Part I it is argued that this paradigm raises questions of general relevance to the study of our perception of music. In carrying out the task of notating a piece, two kinds of problem arise: what are the harmonic relations between the notes, and what are the metric units into which they are grouped? These two problems are considered in isolation from one another. In Part II, algorithms which embody two kinds of rule for the inference of metre are presented. In Part III the harmonic problem is considered. It arises from the fact that the number of keyboard semitones between two notes does not, by itself, identify their harmonic relation, which is what the notation has to express. Among other considerations, the key of the piece is an important characterisation of this identification; but the key is not explicit in the performance, and must itself be inferred. An earlier theory of harmonic relations is further developed into algorithms for assigning key-signatures and notation to melodies. By the definition of the problem, we are confined to music belonging to the tradition of Western tonal music, to which the idea of key applies. Most of the musical examples discussed will be taken from the work of one of its outstanding exponents, J.S. Bach, and in particular we shall be dealing with the fugue subjects of his "Well-Tempered Clavier". Some of the contents of Part II, Section 2, and of Part III, Section 3, have already appeared in a paper published in collaboration with H.C. Longuet-Higgins.
Part III, Section 2.1, describes his prior work in formulating the theory of harmonic relations, mentioned above, which forms the foundation of the work described in that section and has been published elsewhere by him.
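The harmonic problem described above can be made concrete with a small sketch (this is only an illustration of the ambiguity, not Steedman's algorithm): the same semitone distance admits more than one notated interval, so the spelling must be chosen by inferring the key.

```python
# Illustrative only: semitone counts do not determine notated intervals.
# Each entry maps a keyboard distance to the interval names it may denote.
AMBIGUOUS_INTERVALS = {
    3: ["minor third", "augmented second"],
    4: ["major third", "diminished fourth"],
    6: ["augmented fourth", "diminished fifth"],
}

def possible_spellings(semitones):
    """Return all interval names consistent with a semitone count."""
    return AMBIGUOUS_INTERVALS.get(semitones, [])

# Six semitones above C may be notated F-sharp or G-flat; the key of
# the piece must be inferred before the notation can be chosen.
print(possible_spellings(6))  # ['augmented fourth', 'diminished fifth']
```

This is precisely why key inference is a prerequisite for transcription: the performance supplies only the left-hand side of this mapping.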
122

Semantic trees : new foundations for automatic theorem-proving

Hayes, Patrick J. January 1973 (has links)
This dissertation is concerned with theorem-proving by computer. It does not contain a great number of new results, in the sense of new computational devices for improving the efficiency of theorem-proving programs. Rather it is intended as an account of a new approach to the fundamentals of the subject. It is a work, in the main, of consolidation and entrenchment rather than of extension. Accordingly, rather a large fraction of the total is devoted to an examination - a re-examination in fact, since there have been others before me - of the ideas and presuppositions underlying theorem-proving, and an attempt to uncover the underlying reasons why certain ideas - notably that of search - have arisen so consistently in the history of the subject.
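The central notion of the title can be sketched for the propositional case (a toy illustration, not taken from the dissertation): a semantic tree branches on the truth values of atoms, a branch closes when the partial assignment falsifies some clause, and the clause set is unsatisfiable exactly when every branch closes.

```python
# Toy propositional semantic tree. A clause is a list of (atom, sign)
# pairs; it is falsified once every literal is assigned the wrong value.

def clause_false(clause, assignment):
    """True when the partial assignment makes every literal false."""
    return all(atom in assignment and assignment[atom] != val
               for atom, val in clause)

def unsatisfiable(clauses, atoms, assignment=None):
    """Build the semantic tree over `atoms`; report if all branches close."""
    assignment = assignment or {}
    if any(clause_false(c, assignment) for c in clauses):
        return True   # this branch is closed
    if not atoms:
        return False  # a full assignment satisfies every clause
    atom, rest = atoms[0], atoms[1:]
    return all(unsatisfiable(clauses, rest, {**assignment, atom: v})
               for v in (True, False))

# {p} together with {not p} is unsatisfiable; {p} alone is not.
clauses = [[("p", True)], [("p", False)]]
print(unsatisfiable(clauses, ["p"]))  # True
```

The search behaviour that the dissertation examines is visible even here: the procedure explores branches of the tree rather than computing an answer directly.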
123

Adaptive aspects of heuristic search

Ross, Robert January 1973 (has links)
In this thesis we investigate methods by which GT4, a revised and extended version of the Doran-Michie Graph Traverser, might in the course of its problem-solving activity learn about the domain in which it is searching and thereby improve its performance. We study first the problem of automatically optimizing the parameters of GT4's evaluation function. In particular, we investigate the distance estimator method proposed by Doran and Michie. Using two sliding-block puzzles and the algebraic manipulation problems of Quinlan and Hunt we demonstrate experimentally the feasibility of this method of parameter optimization. An interesting feature of the work is that optimization is implemented by recursive call of GT4, the algorithm acting for this purpose as a pattern search numerical optimizer. A theoretical analysis of several factors affecting the success of the distance estimator method is then carried out and alternative approaches to parameter optimization are proposed and discussed. In Chapter 8 we describe the results of our experiments in automatic operator selection. We investigate, in particular, a promotional scheme for re-ordering Γ, the global operator list used by GT4, so that operators of proven utility are given preference in the order of application. Our results demonstrate that the scheme successfully improves the ordering of a list of 48 Eight-puzzle macro-operators.
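The promotional scheme can be sketched in a few lines (a hypothetical reconstruction; the thesis's actual promotion rule may differ): each time an operator proves useful, it moves one place towards the front of the operator list, so that operators of proven utility are tried earlier.

```python
# Hypothetical sketch of promotional re-ordering of an operator list.

def promote(operators, useful):
    """Move each operator found useful one position towards the front."""
    ops = list(operators)
    for op in useful:
        i = ops.index(op)
        if i > 0:
            ops[i - 1], ops[i] = ops[i], ops[i - 1]  # swap one place up
    return ops

ops = ["up", "down", "left", "right"]
print(promote(ops, ["left"]))  # ['up', 'left', 'down', 'right']
```

A one-place promotion is deliberately conservative: a single lucky success cannot catapult an operator to the head of the list, but repeated successes will.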
124

Applications of the genetic algorithms optimisation approach in the design of circular polarised microstrip antennas

Al-Jibouri, Belal January 2005 (has links)
No description available.
125

Migrating relational databases into object-based and XML databases

Maatuk, Abdelsalam January 2009 (has links)
Rapid changes in information technology, the emergence of object-based and WWW applications, and the interest of organisations in securing benefits from new technologies have made information systems re-engineering in general and database migration in particular an active research area. In order to improve the functionality and performance of existing systems, the re-engineering process requires identifying and understanding all of the components of such systems. An underlying database is one of the most important components of information systems. A considerable body of data is stored in relational databases (RDBs), yet they have limitations in supporting the complex structures and user-defined data types provided by relatively recent databases such as object-based and XML databases. Instead of throwing away the large amount of data stored in RDBs, it is more appropriate to enrich and convert such data to be used by new systems. Most researchers into the migration of RDBs into object-based/XML databases have concentrated on schema translation, accessing and publishing RDB data using newer technology, while few have paid attention to the conversion of data, and the preservation of data semantics, e.g., inheritance and integrity constraints. In addition, existing work does not appear to provide a solution for more than one target database. Thus, research on the migration of RDBs is not fully developed. We propose a solution that offers automatic migration of an RDB as a source into the recent database technologies as targets based on available standards such as ODMG 3.0, SQL4 and XML Schema. A canonical data model (CDM) is proposed to bridge the semantic gap between an RDB and the target databases. The CDM preserves and enhances the metadata of existing RDBs to fit in with the essential characteristics of the target databases. The adoption of standards is essential for increased portability, flexibility and constraints preservation.
This thesis contributes a solution for migrating RDBs into object-based and XML databases. The solution takes an existing RDB as input, enriches its metadata representation with the required explicit semantics, and constructs an enhanced relational schema representation (RSR). Based on the RSR, a CDM is generated which is enriched with the RDB's constraints and data semantics that may not have been explicitly expressed in the RDB metadata. The CDM so obtained facilitates both schema translation and data conversion. We design sets of rules for translating the CDM into each of the three target schemas, and provide algorithms for converting RDB data into the target formats based on the CDM. A prototype of the solution has been implemented, which generates the three target databases. Experimental study has been conducted to evaluate the prototype. The experimental results show that the target schemas resulting from the prototype and those generated by existing manual mapping techniques were comparable. We have also shown that the source and target databases were equivalent, and demonstrated that the solution, conceptually and practically, is feasible, efficient and correct.
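The flavour of such a translation rule can be sketched as follows (an invented illustration, not one of the thesis's CDM rules): one simple mapping turns each relational row into an XML element whose children carry the column values, keeping the primary key as an attribute.

```python
# Illustrative sketch: a naive row-to-XML mapping. Names ("employee",
# "id") are invented for the example and are not from the thesis.
import xml.etree.ElementTree as ET

def row_to_xml(table, pk, row):
    """Map one relational row to an XML element; the key becomes an attribute."""
    elem = ET.Element(table, {pk: str(row[pk])})
    for col, val in row.items():
        if col == pk:
            continue
        child = ET.SubElement(elem, col)
        child.text = str(val)
    return elem

row = {"id": 7, "name": "Ada", "dept": "CS"}
print(ET.tostring(row_to_xml("employee", "id", row), encoding="unicode"))
# <employee id="7"><name>Ada</name><dept>CS</dept></employee>
```

What the thesis adds beyond such naive mappings is precisely what this sketch omits: the CDM also carries inheritance, integrity constraints and other semantics that are implicit in the RDB metadata.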
126

A compositional analysis of broadcasting embedded systems

Brockway, Michael J. January 2010 (has links)
This work takes as its starting point D Kendall's CANdle/bCANdle algebraic framework for formal modelling and specification of broadcasting embedded systems based on CAN networks. Checking real-time properties of such systems is beset by problems of state-space explosion and so a scheme is given for recasting systems specified in Kendall's framework as parallel compositions of timed automata; a CAN network channel is modelled as an automaton. This recasting is shown to be bisimilar to the original bCANdle model. In the recast framework, "compositionality" theorems allow one to infer that a model of a system is simulated by some abstraction of the model, and hence that properties of the model expressible in ACTL can be inferred from analogous properties of the abstraction. These theorems are reminiscent of "assume-guarantee" reasoning, allowing one to build simulations component-wise although, unfortunately, components participating in a "broadcast" are required to be abstracted "atomically". Case studies are presented to show how this can be used in practice, and how systems which take impossibly long to model-check can be tackled by compositional methods. The work is of broader interest also, as the models are built as UPPAAL systems and the compositionality theorems apply to any UPPAAL system in which the components do not share local variables. The method could for instance extend to systems using some network other than CAN, provided it can be modelled by timed automata. Possibilities also exist for building it into an automated tool, complementing other methods such as counterexample-guided abstraction refinement.
127

An investigation into server-side static and dynamic web content survivability using a web content verification and recovery (WCVR) system

Aljawarneh, Shadi January 2008 (has links)
Malicious web content manipulation software can be used to tamper with any type of web content (e.g., text, images, video, audio and objects), and as a result, organisations are vulnerable to data loss. In addition, several security incident reports from emergency response teams such as CERT and AusCERT clearly demonstrate that the available security mechanisms have not made system break-ins impossible. Therefore, ensuring web content integrity against unauthorised tampering has become a major issue. This thesis investigates the survivability of server-side static and dynamic web content using the Web Content Verification and Recovery (WCVR) system. We have developed a novel security system architecture which provides mechanisms to address known security issues such as violation of data integrity that arise in tampering attacks. We propose a real-time web security framework consisting of a number of components that can be used to verify the server-side static and dynamic web content, and to recover the original web content if the requested web content has been compromised. A conceptual model to extract the client interaction elements, and a strategy to utilise the hashing performance have been formulated in this research work. A prototype of the solution has been implemented and experimental studies have been carried out to address the security and the performance objectives. The results indicate that the WCVR system can provide tamper detection and recovery for server-side static and dynamic web content. We have also shown that the overheads for the verification and recovery processes are relatively low and the WCVR system can efficiently and correctly determine if the web content has been tampered with.
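The verification-and-recovery idea can be sketched at its simplest (a minimal illustration assuming, as WCVR does at a high level, a trusted store holding a hash and a backup of each piece of content; the function names here are invented, not the thesis's API):

```python
# Minimal sketch of hash-based tamper detection with recovery.
import hashlib

def digest(content):
    """SHA-256 digest of a piece of web content."""
    return hashlib.sha256(content.encode()).hexdigest()

def serve(requested, trusted_hash, backup):
    """Verify requested content; recover from the backup if tampered."""
    if digest(requested) == trusted_hash:
        return requested   # integrity check passed
    return backup          # tampered: serve the recovered copy

original = "<h1>Welcome</h1>"
h = digest(original)                       # stored in the trusted store
print(serve("<h1>Hacked</h1>", h, original))  # <h1>Welcome</h1>
```

The engineering difficulty the thesis addresses lies beyond this sketch: dynamic content changes legitimately per request, so deciding what to hash (hence the client-interaction model and the hashing-performance strategy) is the real problem.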
128

Automatic schedule computation for distributed real-time systems using timed automata

Park, Young-Saeng January 2008 (has links)
The time-triggered architecture is becoming accepted as a means of implementing scalable, safer and more reliable solutions for distributed real-time systems. In such systems, the execution of distributed software components and the communication of messages between them take place in a fixed pattern and are scheduled in advance within a given scheduling round by a global scheduling policy. The principal obstacle in the design of time-triggered systems is the difficulty of finding the static schedule for all resources which satisfies constraints on the activities within the scheduling round, such as the meeting of deadlines. The scheduler has to consider not only the requirements on each processor but also the global requirements of system-wide behaviour including messages transmitted on networks. Finding an efficient way of building an appropriate global schedule for a given system is a major research challenge. This thesis proposes a novel approach to designing time-triggered schedules which is radically different from existing mathematical methods or algorithms for schedule generation. It entails the construction of timed automata to model the arrival and execution of software tasks and inter-task message communication for a system; the behaviour of an entire distributed system is thus a parallel composition of these timed automata models. A job comprises a sequence of tasks and messages; this expresses a system-wide transaction which may be distributed over a system of processors and networks. The job is formalized as a timed automaton, based on the principle that a task or message can be modelled by finite states and a clock variable. Temporal logic properties are formed to express constraints on the behaviour of the system components such as precedence relationships between tasks and messages and adherence to deadlines.
Schedules are computed by formally verifying that these properties hold for an evolution of the system; a successful schedule is simply a trace generated by the verifier, in this case the UPPAAL model-checking tool has been employed to perform the behaviour verification. This approach is guaranteed to generate a practical schedule if one exists and will fail to construct any schedule if none exists. A prototype toolset has been developed to automate the proposed approach: it creates the timed automata models, undertakes the analysis, extracts schedules from traces and visualizes the generated schedules. Two case studies, one of a cruise control system, the other a manufacturing cell system, are presented to demonstrate the applicability and usability of the approach and the application of the toolset. Finally, further constraints are considered in order to yield schedules with limited jitter, increased efficiency and system-wide properties.
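The "schedule as a witness trace" idea can be illustrated in miniature (a toy analogy, not the thesis's UPPAAL models; the task names, durations and deadlines are invented): just as the model checker searches for one timed run satisfying all constraints, a brute-force search over task orderings on a single processor either returns a feasible ordering as the witness or reports that none exists.

```python
# Toy analogue of schedule-as-witness: search orderings for one that
# meets precedence and deadline constraints. All numbers are invented.
from itertools import permutations

tasks = {"sense": 2, "compute": 3, "actuate": 1}   # execution times
deadline = {"sense": 4, "compute": 7, "actuate": 8}
precedes = [("sense", "compute"), ("compute", "actuate")]

def find_schedule():
    """Return a feasible task order (the 'witness trace'), or None."""
    for order in permutations(tasks):
        t, finish = 0, {}
        for name in order:            # run tasks back-to-back
            t += tasks[name]
            finish[name] = t
        ok_prec = all(finish[a] <= finish[b] - tasks[b] for a, b in precedes)
        ok_dead = all(finish[n] <= deadline[n] for n in tasks)
        if ok_prec and ok_dead:
            return order
    return None                       # no schedule exists

print(find_schedule())  # ('sense', 'compute', 'actuate')
```

The guarantee in the abstract mirrors this structure: exhaustive exploration of the state space means a schedule is found whenever one exists, and a definitive failure is reported otherwise.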
129

The formal evaluation and design of routing protocols for wireless sensor networks in hostile environments

Saghar Malik, Kashif January 2010 (has links)
Wireless Sensor Networks (WSNs) have attracted considerable research attention in recent years because of the perceived potential benefits offered by self-organising, multi-hop networks consisting of low-cost and small wireless devices for monitoring or control applications in difficult environments. WSNs may be deployed in hostile or inaccessible environments and are often unattended. These conditions present many challenges in ensuring that WSNs work effectively and survive long enough to fulfil their functionalities. Securing a WSN against any malicious attack is a particular challenge. Due to the limited resources of nodes, traditional routing protocols are not appropriate in WSNs and innovative methods are used to route data from source nodes to sink nodes (base stations). To evaluate the routing protocols against DoS attacks, an innovative design method of combining formal modelling and computer simulations has been proposed. This research has shown that by using formal modelling, hidden bugs (e.g. vulnerability to attacks) in routing protocols can be detected automatically. In addition, through rigorous testing, a new routing protocol, RAEED (Robust formally Analysed protocol for wirEless sEnsor networks Deployment), was developed which is able to operate effectively in the presence of hello flood, rushing, wormhole, black hole, gray hole, sink hole, INA and jamming attacks. It has been proved formally and using computer simulation that RAEED can counter these DoS attacks. A second contribution of this thesis relates to the development of a framework to check the vulnerability of different routing protocols against Denial of Service (DoS) attacks. This has allowed us to evaluate formally some existing and known routing protocols against various DoS attacks, and these include the TinyOS Beaconing, Authentic TinyOS using uTesla, Rumour Routing, LEACH, Direct Diffusion, INSENS, ARRIVE and ARAN protocols.
This has resulted in the development of an innovative and simple defence technique with no additional hardware cost for deployment against wormhole and INA attacks. In the thesis, the detection of weaknesses in the INSENS, ARRIVE and ARAN protocols was also addressed formally. Finally, an efficient design methodology using a combination of formal modelling and simulation is proposed to evaluate the performances of routing protocols against DoS attacks.
130

Objective models for subjective quality estimate of stereoscopic video

Malekmohamadi, Hossein January 2013 (has links)
No description available.
