81. Applying cognitive electrophysiology to neural modelling of the attentional blink (Craston, Patrick, January 2008)
This thesis proposes a connection between computational modelling of cognition and cognitive electrophysiology. We extend a previously published neural network model of working memory and temporal attention (the Simultaneous Type Serial Token (ST2) model; Bowman & Wyble, 2007) that was designed to simulate human behaviour during the attentional blink, an experimental finding that seems to illustrate the temporal limits of conscious perception in humans. Due to its neural architecture, we can utilise the ST2 model's functionality to produce so-called virtual event-related potentials (virtual ERPs) by averaging over activation profiles of nodes in the network. Unlike predictions from textual models, the virtual ERPs from the ST2 model allow us to construct formal predictions concerning the EEG signal and associated cognitive processes in the human brain. The virtual ERPs are used to make predictions and propose explanations for the results of two experimental studies during which we recorded the EEG signal from the scalp of human participants. Using various analysis techniques, we investigate how target items are processed by the brain depending on whether they are presented individually or during the attentional blink. Particular emphasis is placed on the P3 component, which is commonly regarded as an EEG correlate of encoding items into working memory and thus seems to reflect conscious perception. Our findings are used to evaluate the ST2 model and competing theories of the attentional blink. Virtual ERPs also allow us to make predictions for future experiments. Hence, we show how virtual ERPs from the ST2 model provide a powerful tool for both experimental design and the validation of cognitive models.
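The trial-averaging step behind virtual ERPs can be sketched in a few lines. This is a toy illustration only, assuming per-trial activation traces of a node are available as lists of numbers; it is not the ST2 model's actual code.

```python
def virtual_erp(trials):
    """Average per-trial activation traces of one model node into a
    virtual event-related potential (one value per time step)."""
    n = len(trials)
    return [sum(step) / n for step in zip(*trials)]

# Toy data: three simulated trials of a node's activation over 5 time steps.
trials = [
    [0.0, 0.2, 0.9, 0.4, 0.1],
    [0.0, 0.3, 0.8, 0.5, 0.0],
    [0.1, 0.1, 1.0, 0.3, 0.1],
]
erp = virtual_erp(trials)  # trial-averaged trace, peaking at the third step
```

Just as scalp ERPs are obtained by averaging EEG epochs over trials, averaging a node's activation over simulated trials suppresses trial-specific variability and exposes the systematic time course.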
82. Modelling human cognition using concurrency theory (Su, Li, January 2009)
No description available.
83. Semantic and structural analysis of genetic programming (Beadle, Lawrence, January 2009)
Genetic programming (GP) is a subset of evolutionary computation in which candidate solutions are evaluated through execution or interpreted execution. The candidate solutions generated by GP take the form of computer programs, which are evolved to achieve a stated objective. The processes that make up GP, including crossover, mutation and selection, are inspired by Darwinian evolutionary theory. During a GP run, crossover, mutation and selection are performed iteratively until a program that satisfies the stated objectives is produced or a certain number of time steps have elapsed. The objectives of this thesis are to empirically analyse three aspects of these evolved programs: diversity, efficient representation and the changing structure of programs during evolution. In addition to these analyses, novel algorithms are presented in order to test theories, improve the overall performance of GP and reduce program size. This thesis makes three contributions to the field of GP. Firstly, a detailed analysis is performed of the process of initialisation (generating random programs to start evolution), using four novel algorithms to empirically evaluate specific traits of starting populations of programs. It is shown how two factors simultaneously affect how strong the performance of the starting population will be after a GP run. Secondly, semantically based operators are applied during evolution to encourage behavioural diversity and to reduce the size of programs by removing inefficient segments of code. It is demonstrated in a series of experiments how these specialist operators can be effective individually and in combination. Finally, the role of program structure during evolution is considered under different evolutionary parameters and across different problem domains. This analysis reveals some interesting effects of evolution on program structure, as well as offering evidence to support the success of the specialist operators.
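The iterate-until-satisfied loop described above can be sketched as a minimal tree-based GP system. Everything here is illustrative: the tree representation, the truncation selection and the crude crossover are simplifications for exposition, not the algorithms studied in the thesis.

```python
import random

OPS = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}

def random_tree(depth=3):
    """Grow a random program tree: internal nodes are operators,
    leaves are the variable 'x' or a small integer constant."""
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.randint(0, 4)
    return [random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, x):
    """Interpreted execution of a candidate program at input x."""
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, target=lambda x: x * x + 1):
    """Lower is better: summed absolute error on a few sample points."""
    return sum(abs(evaluate(tree, x) - target(x)) for x in range(-3, 4))

def crossover(a, b):
    """Very crude crossover: graft tree b in place of one child of a
    (a full implementation would pick random crossover points)."""
    if isinstance(a, list):
        child = list(a)
        child[random.randint(1, 2)] = b
        return child
    return b

def mutate(tree):
    """Occasionally replace the whole tree with a fresh random one."""
    return random_tree(2) if random.random() < 0.2 else tree

def gp_run(pop_size=30, generations=20):
    random.seed(0)  # fixed seed for reproducibility of this sketch
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        best = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(best), random.choice(best)))
                    for _ in range(pop_size - len(best))]
        pop = best + children
    return min(pop, key=fitness)
```

The loop mirrors the description in the abstract: selection, crossover and mutation repeat until a good program emerges or the generation budget runs out.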
84. Verification of real-time systems: improving tool support (Gomez, Rodolfo, January 2006)
We address a number of limitations of Timed Automata and real-time model-checkers which undermine the reliability of formal verification. In particular, we focus on the model-checker Uppaal as a representative of this technology. Timelocks and Zeno runs represent anomalous behaviours in a timed automaton, and may invalidate the verification of safety and liveness properties. Currently, model-checkers do not offer adequate support to prevent or detect such behaviours. In response, we develop new methods to guarantee timelock-freedom and the absence of Zeno runs, which improve and complement existing support. We implement these methods in a tool to check Uppaal specifications. The requirements language of model-checkers is not well suited to expressing sequence and iteration of events, or past computations. As a result, validation problems may arise during verification (i.e., the property that we verify may not accurately reflect the intended requirement). We study the logic PITL, a rich propositional subset of Interval Temporal Logic, in which these requirements can be expressed more intuitively than in model-checkers. However, PITL has a decision procedure with a worst-case non-elementary complexity, which has hampered the development of efficient tool support. To address this problem, we propose (and implement) a translation from PITL to the second-order logic WS1S, for which an efficient decision procedure is provided by the tool MONA. Thanks to the many optimisations included in MONA, we obtain an efficient decision procedure for PITL, despite its non-elementary complexity. Data variables in model-checkers are restricted to bounded domains, in order to obtain fully automatic verification. However, this may be too restrictive for certain kinds of specifications (e.g., when we need to reason about unbounded buffers). In response, we develop the theory of Discrete Timed Automata as an alternative formalism for real-time systems.
In Discrete Timed Automata, WS1S is used as the assertion language, which enables MONA to assist invariance proofs. Furthermore, the semantics of urgency and synchronisation adopted in Discrete Timed Automata guarantee, by construction, that specifications are free from a large class of timelocks. Thus, we argue that well-timed specifications are easier to obtain in Discrete Timed Automata than in Timed Automata and most other notations for real-time systems.
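A Zeno run packs infinitely many actions into a finite amount of time, so elapsed time never diverges. As a hedged, finite-horizon illustration (real model-checkers establish this symbolically over the automaton, not by inspecting a single run), a long delay sequence whose cumulative sum stays below a fixed bound is suspicious:

```python
def looks_zeno(delays, horizon=1.0, min_steps=50):
    """Finite heuristic: flag a run whose many delay steps never let
    total elapsed time reach the horizon.  A toy stand-in for the
    symbolic timelock/Zeno analyses discussed above."""
    return len(delays) >= min_steps and sum(delays) < horizon

# Delays shrinking geometrically (0.4, 0.2, 0.1, ...) sum to at most 0.8:
# time converges although actions keep occurring -- a Zeno-style run.
zeno_delays = [0.4 * 0.5 ** k for k in range(100)]
steady_delays = [0.1] * 100  # total time 10.0: time diverges, not Zeno
```

The heuristic only inspects one finite run prefix; it illustrates why Zeno behaviour can invalidate liveness verification, since a property "eventually P" may hold only because time is never allowed to progress past the horizon.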
85. An exploration of novice compilation behaviour in BlueJ (Jadud, Matthew C., January 2006)
No description available.
86. Presenting multi-language XML documents: an adaptive transformation and validation approach (Pediaditakis, Michael, January 2006)
No description available.
87. Towards modular, scalable and optimal design of transcriptional logic systems (Zabet, Nicolae Radu, January 2010)
Living organisms can perform computations through various mechanisms. Understanding the limitations of these computations is not only of practical relevance (for example, in the context of synthetic biology) but, most of all, provides new insights into the design principles of living systems. This thesis investigates the conditions under which genes can perform logical computations and how this behaviour can be enhanced. In particular, we identified three properties that characterise genes as computational units: the noise of gene expression, the slow response times and the energy cost of the logical operation. This study examined how biological parameters control the computational properties of genes and what the functional relationships between various computational properties are. Specifically, we found that there is a three-way trade-off between speed, accuracy and metabolic cost, in the sense that, under fixed metabolic cost, speed can be increased only by reducing accuracy and vice versa. Furthermore, higher metabolic cost resulted in better trade-offs between speed and accuracy. In addition, we showed that genes with leak expression are sub-optimal compared with leak-free genes. However, the cost of reducing the leak rate can be significant, and thus genes may tolerate poorer speed-accuracy behaviour rather than increase the energy cost. Moreover, we identified another accuracy-speed trade-off under fixed metabolic cost, this time controlled by the position of the switching threshold of the gene. In particular, there are two optimal configurations, one for speed and one for accuracy, and all configurations in between lie on an optimal trade-off curve. Finally, we showed that a negatively auto-regulated gene can display better trade-offs between speed and accuracy than a simple gene (one without feedback) when the two systems have equal metabolic cost. This optimality of negative auto-regulation is controlled by the leak rate of the gene, in the sense that higher leak rates lead to faster systems and lower leak rates to more accurate ones. This, in conjunction with the fact that many genes display low but non-vanishing leak rates, may indicate why negative auto-regulation is a network motif (i.e., occurs frequently in genetic networks). The trade-offs identified in this thesis indicate that there are physical limits constraining the computations performed by genes, and that further enhancement usually comes at the cost of impairing at least one property.
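The shape of the speed-accuracy-cost trade-off can be illustrated with a toy one-gene birth-death model. This is an illustrative sketch using textbook approximations, not the models analysed in the thesis: production at rate beta (taken as the metabolic cost), first-order degradation at rate delta, a response time of roughly ln 2 / delta, and Poisson-like relative noise 1/sqrt(mean) at the steady-state mean beta/delta.

```python
import math

def gene_tradeoff(beta, delta):
    """Toy one-gene birth-death model: production rate beta (metabolic
    cost), degradation rate delta.  Returns (response_time,
    relative_noise): halving time ln2/delta, and Poisson-like noise
    1/sqrt(mean) at the steady-state mean beta/delta."""
    mean = beta / delta
    response_time = math.log(2) / delta
    relative_noise = 1.0 / math.sqrt(mean)
    return response_time, relative_noise

# Fixed cost beta = 100: a faster gene (larger delta) is noisier.
slow = gene_tradeoff(100, 0.1)  # slow but accurate
fast = gene_tradeoff(100, 1.0)  # fast but noisy
```

At fixed cost beta, raising delta shortens the response time but lowers the steady-state copy number, so the relative noise grows: speed is bought with accuracy, which is exactly the form of trade-off described above.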