291

An immune-inspired solution for adaptable error detection in embedded systems

Ayara, Modupe January 2005
This thesis proposes an adaptable error detection technique for improving the availability of embedded systems, and in particular Automated Teller Machines (ATMs). The principles associated with immune-inspired techniques are exploited for detecting unforeseen errors during run-time, since traditional techniques for error detection are usually limited to the knowledge available during design-time. Furthermore, the adaptable error detectors can be used to predict system failure well before it happens in order to improve overall system availability and/or maintainability. This thesis introduces a framework for realising adaptable error detection (AED), and demonstrates the effectiveness of an artificial immune system (AIS) as a technique for its implementation. Using data obtained from ATMs, the effectiveness of the AIS technique is evaluated based on its efficacy at detecting the incipience of failures. From the early awareness of impending failures, appropriate actions, such as error recovery or operator warning, can be initiated to prevent the deviation of the system's operations from correct service delivery. Alternatively, the foreknowledge of an imminent failure may quicken system repair, with the effect that the downtime of the system is reduced and the system's availability is enhanced. The outcome of the investigations showed that the implemented AED could detect the antecedents to failure. The effects of the continuous learning feature were demonstrated in terms of: (1) a continual update of error detectors depending on new run-time behaviours, and (2) an improvement in the detection capability by anticipating potential failures. Based on these results, I concluded that the adaptable error detection technique proposed is a step towards enhancing the availability of ATMs.
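To make the immune-inspired idea concrete, the sketch below implements one classic scheme from this family, negative selection: candidate detectors are generated at random, discarded if they fall close to observed normal ("self") behaviour, and the survivors flag run-time samples that lie outside the learned self region. The 3-D feature layout, the radii and the detector count are illustrative assumptions, not the AIS developed in the thesis.

```cpp
// A minimal sketch of one immune-inspired scheme (negative selection) for
// run-time anomaly detection. The 3-D feature layout, radii and detector
// count are illustrative assumptions, not the AIS developed in the thesis.
#include <array>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

using Sample = std::array<double, 3>;            // e.g. normalised run-time features

double dist(const Sample& a, const Sample& b) {
    double s = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(s);
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    const double r_self = 0.2;    // censoring radius around normal behaviour (assumed)
    const double r_detect = 0.1;  // matching radius of a detector (assumed)

    // "Self" set: feature vectors observed during normal operation.
    std::vector<Sample> self;
    for (int i = 0; i < 200; ++i)
        self.push_back({0.4 + 0.1 * u(rng), 0.5 + 0.1 * u(rng), 0.3 + 0.1 * u(rng)});

    // Negative selection: keep only random candidate detectors that lie far
    // enough away from every observed normal sample.
    std::vector<Sample> detectors;
    while (detectors.size() < 300) {
        Sample cand = {u(rng), u(rng), u(rng)};
        bool near_self = false;
        for (const Sample& s : self)
            if (dist(cand, s) < r_self) { near_self = true; break; }
        if (!near_self) detectors.push_back(cand);
    }

    // Run-time monitoring: a sample close to any detector is flagged as anomalous.
    auto is_anomalous = [&](const Sample& x) {
        for (const Sample& d : detectors)
            if (dist(d, x) < r_detect) return true;
        return false;
    };

    // Held-out normal samples should (almost) never be flagged; samples far from
    // the self region are flagged in proportion to detector coverage.
    int flagged_normal = 0, flagged_random = 0;
    for (int i = 0; i < 1000; ++i) {
        flagged_normal += is_anomalous({0.4 + 0.1 * u(rng), 0.5 + 0.1 * u(rng), 0.3 + 0.1 * u(rng)});
        flagged_random += is_anomalous({u(rng), u(rng), u(rng)});
    }
    std::cout << "held-out normal flagged: " << flagged_normal << "/1000\n"
              << "random samples flagged:  " << flagged_random << "/1000\n";
}
```

The continuous learning described in the abstract would then correspond to periodically refreshing the self set from recent run-time behaviour and regenerating detectors against it.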
292

Applying cognitive electrophysiology to neural modelling of the attentional blink

Craston, Patrick January 2008
This thesis proposes a connection between computational modelling of cognition and cognitive electrophysiology. We extend a previously published neural network model of working memory and temporal attention (Simultaneous Type Serial Token (ST2) model; Bowman & Wyble, 2007) that was designed to simulate human behaviour during the attentional blink, an experimental finding that seems to illustrate the temporal limits of conscious perception in humans. Due to its neural architecture, we can utilise the ST2 model's functionality to produce so-called virtual event-related potentials (virtual ERPs) by averaging over activation profiles of nodes in the network. Unlike predictions from textual models, the virtual ERPs from the ST2 model allow us to construe formal predictions concerning the EEG signal and associated cognitive processes in the human brain. The virtual ERPs are used to make predictions and propose explanations for the results of two experimental studies during which we recorded the EEG signal from the scalp of human participants. Using various analysis techniques, we investigate how target items are processed by the brain depending on whether they are presented individually or during the attentional blink. Particular emphasis is on the P3 component, which is commonly regarded as an EEG correlate of encoding items into working memory and thus seems to reflect conscious perception. Our findings are interpreted to validate the ST2 model and competing theories of the attentional blink. Virtual ERPs also allow us to make predictions for future experiments. Hence, we show how virtual ERPs from the ST2 model provide a powerful tool for both experimental design and the validation of cognitive models.
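As a rough illustration of how a virtual ERP can be derived, the sketch below averages a single node's noisy activation trace over many simulated trials, time-locked to target onset. The bump-plus-noise activation profile is a stand-in chosen for illustration; it is not the ST2 model's actual dynamics.

```cpp
// A rough illustration of the virtual-ERP idea: average one node's noisy
// activation trace over many simulated trials, time-locked to target onset.
// The bump-plus-noise activation profile is an assumption for illustration,
// not the ST2 model's actual dynamics.
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const int trials = 500, timesteps = 100, onset = 30;
    std::mt19937 rng(1);
    std::normal_distribution<double> noise(0.0, 0.5);

    std::vector<double> virtual_erp(timesteps, 0.0);
    for (int trial = 0; trial < trials; ++trial) {
        for (int t = 0; t < timesteps; ++t) {
            // Assumed activation: a transient deflection peaking 20 steps after
            // target onset (loosely P3-like), buried in trial-to-trial noise.
            double bump = (t > onset)
                ? std::exp(-std::pow(t - onset - 20.0, 2) / 100.0)
                : 0.0;
            virtual_erp[t] += bump + noise(rng);
        }
    }
    for (double& v : virtual_erp) v /= trials;   // averaging attenuates the noise

    // The peak of the averaged trace gives the virtual component's latency/amplitude.
    int peak = 0;
    for (int t = 1; t < timesteps; ++t)
        if (virtual_erp[t] > virtual_erp[peak]) peak = t;
    std::cout << "virtual ERP peak at t = " << peak
              << ", amplitude = " << virtual_erp[peak] << '\n';
}
```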
293

Modelling human cognition using concurrency theory

Su, Li January 2009
No description available.
294

Semantic and structural analysis of genetic programming

Beadle, Lawrence January 2009
Genetic programming (GP) is a subset of evolutionary computation where candidate solutions are evaluated through execution or interpreted execution. The candidate solutions generated by GP are in the form of computer programs, which are evolved to achieve a stated objective. Darwinian evolutionary theory inspires the processes that make up GP, which include crossover, mutation and selection. During a GP run, crossover, mutation and selection are performed iteratively until a program that satisfies the stated objectives is produced or a certain number of time steps have elapsed. The objectives of this thesis are to empirically analyse three different aspects of these evolved programs. These three aspects are diversity, efficient representation and the changing structure of programs during evolution. In addition to these analyses, novel algorithms are presented in order to test theories, improve the overall performance of GP and reduce program size. This thesis makes three contributions to the field of GP. Firstly, a detailed analysis is performed of the process of initialisation (generating random programs to start evolution) using four novel algorithms to empirically evaluate specific traits of starting populations of programs. It is shown how two factors simultaneously affect how strong the performance of the starting population will be after a GP run. Secondly, semantically based operators are applied during evolution to encourage behavioural diversity and reduce the size of programs by removing inefficient segments of code during evolution. It is demonstrated how these specialist operators can be effective individually and when combined in a series of experiments. Finally, the role of the structure of programs is considered during evolution under different evolutionary parameters and across different problem domains. This analysis reveals some interesting effects of evolution on program structure as well as offering evidence to support the success of the specialist operators.
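For readers unfamiliar with the GP loop described above, the following is a stripped-down tree-based GP that evolves arithmetic expressions over {+, *, x, constants} towards a toy target function using tournament selection, subtree crossover and subtree mutation. All parameters and the fitness function are illustrative only, not those used in the thesis experiments.

```cpp
// A stripped-down tree-based GP sketch: random expression trees over {+, *, x, const}
// are evolved with tournament selection, subtree crossover and subtree mutation to fit
// the toy target x*x + x. Parameters and the fitness function are illustrative only.
#include <cmath>
#include <iostream>
#include <memory>
#include <random>
#include <vector>

std::mt19937 rng(7);

struct Node {
    char op;                        // '+', '*', 'x' (input) or 'c' (constant)
    double value = 0.0;             // used when op == 'c'
    std::shared_ptr<Node> l, r;
};
using Tree = std::shared_ptr<Node>;

Tree random_tree(int depth) {
    std::uniform_int_distribution<int> pick(0, 3);
    int k = (depth == 0) ? 2 + pick(rng) % 2 : pick(rng);   // force leaves at depth 0
    auto n = std::make_shared<Node>();
    if (k < 2) { n->op = (k == 0 ? '+' : '*'); n->l = random_tree(depth - 1); n->r = random_tree(depth - 1); }
    else if (k == 2) { n->op = 'x'; }
    else { n->op = 'c'; n->value = std::uniform_real_distribution<double>(-1.0, 1.0)(rng); }
    return n;
}

double eval(const Tree& t, double x) {
    switch (t->op) {
        case '+': return eval(t->l, x) + eval(t->r, x);
        case '*': return eval(t->l, x) * eval(t->r, x);
        case 'x': return x;
        default:  return t->value;
    }
}

double fitness(const Tree& t) {                  // sum of absolute errors (lower is better)
    double err = 0.0;
    for (double x = -1.0; x <= 1.0; x += 0.1) err += std::fabs(eval(t, x) - (x * x + x));
    return err;
}

// Collect pointers to every subtree slot so operators can splice subtrees in place.
void slots(Tree& t, std::vector<Tree*>& out) {
    out.push_back(&t);
    if (t->l) slots(t->l, out);
    if (t->r) slots(t->r, out);
}

Tree clone(const Tree& t) {
    auto n = std::make_shared<Node>(*t);
    if (t->l) n->l = clone(t->l);
    if (t->r) n->r = clone(t->r);
    return n;
}

int main() {
    const int pop_size = 200, generations = 40;
    std::vector<Tree> pop;
    for (int i = 0; i < pop_size; ++i) pop.push_back(random_tree(3));

    auto tournament = [&]() {                    // size-3 tournament selection
        std::uniform_int_distribution<int> pick(0, pop_size - 1);
        Tree best = pop[pick(rng)];
        for (int i = 0; i < 2; ++i) { Tree c = pop[pick(rng)]; if (fitness(c) < fitness(best)) best = c; }
        return best;
    };

    std::uniform_real_distribution<double> coin(0.0, 1.0);
    for (int g = 0; g < generations; ++g) {
        std::vector<Tree> next;
        while ((int)next.size() < pop_size) {
            Tree child = clone(tournament());    // offspring starts as a copy of a parent
            std::vector<Tree*> cs;
            slots(child, cs);
            std::uniform_int_distribution<int> pc(0, (int)cs.size() - 1);
            if (coin(rng) < 0.9) {               // subtree crossover with a second parent
                Tree donor = clone(tournament());
                std::vector<Tree*> ds;
                slots(donor, ds);
                std::uniform_int_distribution<int> pd(0, (int)ds.size() - 1);
                *cs[pc(rng)] = *ds[pd(rng)];
            } else {                             // subtree mutation
                *cs[pc(rng)] = random_tree(2);
            }
            next.push_back(child);
        }
        pop = next;
    }

    Tree best = pop[0];
    for (const auto& t : pop) if (fitness(t) < fitness(best)) best = t;
    std::cout << "best error on x*x + x: " << fitness(best) << '\n';
}
```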
295

Verification of real-time systems : improving tool support

Gomez, Rodolfo January 2006
We address a number of limitations of Timed Automata and real-time model-checkers, which undermine the reliability of formal verification. In particular, we focus on the model-checker Uppaal as a representative of this technology. Timelocks and Zeno runs represent anomalous behaviours in a timed automaton, and may invalidate the verification of safety and liveness properties. Currently, model-checkers do not offer adequate support to prevent or detect such behaviours. In response, we develop new methods to guarantee timelock-freedom and absence of Zeno runs, which improve and complement the existing support. We implement these methods in a tool to check Uppaal specifications. The requirements language of model-checkers is not well suited to express sequence and iteration of events, or past computations. As a result, validation problems may arise during verification (i.e., the property that we verify may not accurately reflect the intended requirement). We study the logic PITL, a rich propositional subset of Interval Temporal Logic, where these requirements can be expressed more intuitively than in model-checkers. However, PITL has a decision procedure with a worst-case non-elementary complexity, which has hampered the development of efficient tool support. To address this problem, we propose (and implement) a translation from PITL to the second-order logic WS1S, for which an efficient decision procedure is provided by the tool MONA. Thanks to the many optimisations included in MONA, we obtain an efficient decision procedure for PITL, despite its non-elementary complexity. Data variables in model-checkers are restricted to bounded domains, in order to obtain fully automatic verification. However, this may be too restrictive for certain kinds of specifications (e.g., when we need to reason about unbounded buffers). In response, we develop the theory of Discrete Timed Automata as an alternative formalism for real-time systems. In Discrete Timed Automata, WS1S is used as the assertion language, which enables MONA to assist invariance proofs. Furthermore, the semantics of urgency and synchronisation adopted in Discrete Timed Automata guarantee, by construction, that specifications are free from a large class of timelocks. Thus, we argue that well-timed specifications are easier to obtain in Discrete Timed Automata than in Timed Automata and most other notations for real-time systems.
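One well-known sufficient condition for excluding Zeno runs is structural: every loop of the automaton should contain an edge that resets some clock and another edge whose guard forces that clock above a positive bound before the loop can repeat. The sketch below checks this condition for a single loop; the data layout is assumed for illustration and is not Uppaal's (or the thesis tool's) internal representation, nor necessarily the exact method developed in the thesis.

```cpp
// A sketch of one well-known *sufficient* structural condition for excluding
// Zeno runs: every loop must contain an edge that resets some clock x and an
// edge whose guard forces x above a positive bound before the loop can repeat.
// The data layout below is assumed for illustration; it is not Uppaal's (or
// the thesis tool's) internal representation.
#include <iostream>
#include <string>
#include <utility>
#include <vector>

struct Edge {
    std::vector<std::string> resets;                  // clocks reset on this edge
    std::vector<std::pair<std::string, int>> lower;   // lower-bound guards x >= c
};

// A loop satisfies the condition if some clock is reset on one of its edges
// and bounded from below by c >= 1 on another; time must then advance by at
// least c units per iteration, so the loop cannot be taken infinitely often
// in finite time.
bool loop_forces_time_progress(const std::vector<Edge>& loop) {
    for (const Edge& r : loop)
        for (const std::string& clk : r.resets)
            for (const Edge& g : loop)
                for (const auto& guard : g.lower)
                    if (guard.first == clk && guard.second >= 1) return true;
    return false;
}

int main() {
    // Loop that resets x and later requires x >= 2: each iteration takes time.
    std::vector<Edge> good = { {{"x"}, {}}, {{}, {{"x", 2}}} };
    // Loop with no positive lower bound: it could repeat infinitely fast (Zeno risk).
    std::vector<Edge> bad  = { {{"x"}, {}}, {{}, {}} };
    std::cout << "good loop forces progress: " << loop_forces_time_progress(good) << '\n'
              << "bad loop forces progress:  " << loop_forces_time_progress(bad)  << '\n';
}
```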
296

An exploration of novice compilation behaviour in BlueJ

Jadud, Matthew C. January 2006
No description available.
297

Presenting multi-language XML documents : an adaptive transformation and validation approach

Pediaditakis, Michael January 2006
No description available.
298

Towards modular, scalable and optimal design of transcriptional logic systems

Zabet, Nicolae Radu January 2010
Living organisms can perform computations through various mechanisms. Understanding the limitations of these computations is not only of practical relevance (for example in the context of synthetic biology) but will most of all provide new insights into the design principles of living systems. This thesis investigates the conditions under which genes can perform logical computations and how this behaviour can be enhanced. In particular, we identified three properties which characterise genes as computational units, namely: the noise of the gene expression, the slow response times and the energy cost of the logical operation. This study examined how biological parameters control the computational properties of genes and what the functional relationships between the various computational properties are. Specifically, we found that there is a three-way trade-off between speed, accuracy and metabolic cost, in the sense that under fixed metabolic cost the speed can be increased only by reducing the accuracy and vice versa. Furthermore, higher metabolic cost resulted in better trade-offs between speed and accuracy. In addition, we showed that genes with leak expression are sub-optimal compared with leak-free genes. However, the cost of reducing the leak rate can be significant and, thus, genes prefer to tolerate poorer speed-accuracy behaviour rather than increase the energy cost. Moreover, we identified another accuracy-speed trade-off under fixed metabolic cost, but this time the trade-off is controlled by the position of the switching threshold of the gene. In particular, there are two optimal configurations, one for speed and another for accuracy, and all configurations in between lie on an optimal trade-off curve. Finally, we showed that a negatively auto-regulated gene can display better trade-offs between speed and accuracy compared with a simple gene (one without feedback) when the two systems have equal metabolic cost. This optimality of negative auto-regulation is controlled by the leak rate of the gene, in the sense that higher leak rates lead to faster systems and lower leak rates to more accurate ones. This, in conjunction with the fact that many genes display low but non-vanishing leak rates, may indicate why negative auto-regulation is a network motif (i.e., occurs frequently in genetic networks). The trade-offs identified in this thesis indicate that physical limits constrain the computations performed by genes, and that further enhancement usually comes at the cost of impairing at least one property.
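To give a flavour of the kind of quantities involved, the sketch below runs a Gillespie-style simulation of a one-gene birth/death model with a leak term and reports a simple speed measure (mean time to cross a switching threshold) and a simple noise measure (the steady-state coefficient of variation). The rates, the threshold and the model itself are illustrative assumptions rather than the models analysed in the thesis.

```cpp
// A Gillespie-style sketch of a one-gene birth/death model with a leak term,
// reporting a simple speed measure (mean time to cross a switching threshold)
// and a simple noise measure (steady-state coefficient of variation). Rates,
// threshold and the model itself are illustrative assumptions only.
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(3);
    std::uniform_real_distribution<double> u(0.0, 1.0);

    const double k_on = 20.0;    // production rate when the gene is induced (assumed)
    const double k_leak = 1.0;   // leak production rate (assumed non-zero)
    const double d = 1.0;        // degradation rate per molecule (assumed)
    const int threshold = 10;    // switching threshold on protein copy number (assumed)
    const double t_end = 50.0;

    std::vector<double> crossing_times, final_counts;
    for (int run = 0; run < 2000; ++run) {
        double t = 0.0, crossed = -1.0;
        int n = 0;
        while (t < t_end) {
            double birth = k_on + k_leak, death = d * n, total = birth + death;
            t += -std::log(u(rng)) / total;          // exponential waiting time
            if (u(rng) < birth / total) ++n; else --n;
            if (crossed < 0.0 && n >= threshold) crossed = t;
        }
        if (crossed >= 0.0) crossing_times.push_back(crossed);
        final_counts.push_back(n);
    }

    auto mean = [](const std::vector<double>& v) {
        double s = 0.0;
        for (double x : v) s += x;
        return s / v.size();
    };
    double m = mean(final_counts), var = 0.0;
    for (double x : final_counts) var += (x - m) * (x - m);
    var /= final_counts.size();

    std::cout << "mean threshold-crossing time (speed): " << mean(crossing_times) << '\n'
              << "steady-state CV (noise):              " << std::sqrt(var) / m << '\n';
}
```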
299

CSP++ : an object-oriented application framework for software synthesis from CSP specifications

Gardner, William Bennett 09 May 2018
One of the useful formalisms for designing concurrent systems is the process algebra called CSP, or Communicating Sequential Processes. CSP statements can be used to model a system's control and data flow in an intuitive way, constituting a kind of hierarchical behavioral specification. Furthermore, when coupled with simulation and model-checking tools, these statements can be executed and debugged until the desired behavior has been accurately captured. Certain properties (such as absence of deadlocks) can be proved, to help verify the correctness of the design. To make the verified specifications executable in a practical sense, refinement to a programming language is required. In this work, a new object-oriented application framework is described which realizes the basic elements of CSP—processes, synchronizing events, and communication channels—in natural terms as C++ objects. In addition, a new software tool is provided to customize the framework by translating CSP statements into invocations of the framework elements. CSP specifications, thus reexpressed in C++ and compiled, form the control portion of a system, able to be linked with other software written in C++ that completes the functionality.
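The sketch below conveys the core idea of rendering CSP elements as C++ objects: a rendezvous-style channel and a thin process wrapper around a thread. The class names and threading scheme are assumptions made for illustration; this is not the actual CSP++ API.

```cpp
// A minimal sketch of the core idea: CSP processes and a synchronising
// channel rendered as C++ objects. Class names and the threading scheme are
// assumptions made for illustration; this is not the actual CSP++ API.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <optional>
#include <thread>

// A one-slot synchronous channel: the writer blocks until a reader takes the value.
template <typename T>
class Channel {
    std::mutex m;
    std::condition_variable cv;
    std::optional<T> slot;
public:
    void write(const T& v) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return !slot.has_value(); });  // wait for an empty slot
        slot = v;
        cv.notify_all();
        cv.wait(lk, [&] { return !slot.has_value(); });  // wait for the rendezvous
    }
    T read() {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return slot.has_value(); });
        T v = *slot;
        slot.reset();
        cv.notify_all();                                 // release the blocked writer
        return v;
    }
};

// A "process" is simply behaviour run on its own thread, joined on destruction.
class Process {
    std::thread t;
public:
    template <typename F> explicit Process(F f) : t(std::move(f)) {}
    ~Process() { t.join(); }
};

int main() {
    Channel<int> chan;
    // Roughly: PRODUCER = chan!1 -> chan!2 -> SKIP, CONSUMER = chan?x -> chan?y -> SKIP
    Process producer([&] { chan.write(1); chan.write(2); });
    Process consumer([&] {
        int x = chan.read();
        int y = chan.read();
        std::cout << "received " << x << " and " << y << '\n';
    });
}
```

In the framework described in the abstract, the accompanying translator tool would emit invocations of this kind from the CSP statements rather than having them written by hand.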
300

An empirical, in-depth investigation into service creation in H.323 Version 4 Networks

Penton, Jason Barry 24 May 2013
Over the past few years there has been an increasing tendency to carry voice on IP networks as opposed to the PSTN and other switched circuit networks. Initially this trend was favoured because of reduced costs, but it came at the expense of the quality of the voice communications. Switched circuit networks have therefore remained the preferred carrier-grade voice communication networks, but this is again changing. Improved quality of service (QoS) for real-time traffic on IP networks is one factor contributing to the anticipated future of the IP network supplying carrier-grade voice communications. Another contributing factor is the possibility of creating a new range of innovative, state-of-the-art telephony and communications services that leverage the intelligence and flexibility of the IP network. The latter has yet to be fully explored. Various protocols exist that facilitate the transport of voice and other media on IP networks. The most well known and widely supported of these is H.323. This work presents and discusses H.323 version 4 service creation. The work also categorises the various H.323 services and presents the mechanisms provided by H.323 version 4 that have facilitated the development of the three services I have developed: EmailReader, Telgo323 and CANS.
