81

Rapid design of condition monitoring systems for machining operations

Al-Habaibeh, Amin January 2000 (has links)
No description available.
82

Investigation into the use of neural networks for visual inspection of ceramic tableware

Finney, Graham Barry January 1998 (has links)
No description available.
83

Monitoring the Generation and Execution of Optimal Plans

Fritz, Christian Wilhelm 24 September 2009 (has links)
In dynamic domains, the state of the world may change in unexpected ways during the generation or execution of plans. Regardless of the cause of such changes, they raise the question of whether they interfere with ongoing planning efforts. Unexpected changes during plan generation may invalidate the current planning effort, while discrepancies between the expected and actual state of the world during execution may render the executing plan invalid or sub-optimal with respect to previously identified planning objectives. In this thesis we develop a general monitoring technique that can be used during both plan generation and plan execution to determine the relevance of unexpected changes, and which supports recovery. This way, time-intensive replanning from scratch in the new and unexpected state can often be avoided. The technique can be applied to a variety of objectives, including monitoring the optimality of plans rather than just their validity. Intuitively, the technique operates in two steps: during planning, the plan is annotated with additional information that is relevant to the achievement of the objective; then, when an unexpected change occurs, this information is used to determine the relevance of the discrepancy with respect to the objective. We substantiate the claim of broad applicability of this relevance-based technique by developing four concrete applications: generating optimal plans despite frequent, unexpected changes to the initial state of the world, monitoring plan optimality during execution, monitoring the execution of near-optimal policies in stochastic domains, and monitoring the generation and execution of plans with procedural hard constraints. In all cases, we use the formal notion of regression to identify what is relevant for achieving the objective. We prove the soundness of these concrete approaches and present empirical results demonstrating that in some contexts orders-of-magnitude speed-ups can be gained by our technique compared to replanning from scratch.
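The regression step at the heart of this technique admits a compact illustration. Below is a minimal Python sketch (all names hypothetical; the thesis works in a richer logical formalism and also monitors optimality, not just classical validity) that annotates each step of a STRIPS-style plan with the condition, obtained by regressing the goal through the remaining actions, under which the plan suffix still achieves the goal; an unexpected change is then relevant only if it falsifies the annotation at the current step:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    preconditions: frozenset
    adds: frozenset
    deletes: frozenset = frozenset()

def regress(condition, action):
    """Regress a set of required facts through an action: facts the action
    adds need not hold beforehand, but its preconditions must; if the action
    deletes a required fact, no state before it can make the suffix succeed."""
    if condition & action.deletes:
        return None
    return (condition - action.adds) | action.preconditions

def annotate(plan, goal):
    """annotations[i] is the condition under which plan[i:] achieves the goal."""
    annotations = [frozenset(goal)]
    for action in reversed(plan):
        prev = regress(annotations[-1], action)
        if prev is None:
            raise ValueError(f"plan cannot achieve the goal via {action.name}")
        annotations.append(prev)
    annotations.reverse()
    return annotations

def change_is_relevant(annotations, step, state):
    """After an unexpected change, replanning is needed only if the current
    state no longer satisfies the regressed condition guarding plan[step:]."""
    return not annotations[step] <= frozenset(state)
```

Because the relevance check is a cheap set operation, replanning can be skipped whenever it succeeds, which is where the avoided cost comes from.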
84

The development and application of computer-adaptive testing in a higher education environment

Lilley, Mariana January 2007 (has links)
The research reported in this thesis investigated issues relating to the use of computer-assisted assessment in Higher Education through the design, implementation and evaluation of a computer-adaptive test (CAT) for the assessment of, and provision of feedback to, Computer Science undergraduates. The CAT developed for this research unobtrusively monitors the performance of students during a test, and then employs this information to adapt the sequence and level of difficulty of the questions to individual students. The information about each student's performance obtained through the CAT is subsequently employed for the automated generation of feedback that is tailored to each individual student. In the first phase of the research, a total of twelve empirical studies were carried out in order to investigate issues related to the adaptive algorithm, stakeholders’ attitudes, and the validity and reliability of the approach. The CAT approach was found to be valid and reliable, and also effective at tailoring the level of difficulty of the test to the ability of individual students. The two main groups of stakeholders, students and academic staff, both exhibited a positive attitude towards the CAT approach and the user interface. The second phase of the research was concerned with the design, implementation and evaluation of an automated feedback prototype based on the CAT approach. Five empirical studies were conducted in order to assess stakeholders’ attitudes towards the automated feedback, and its effectiveness at providing feedback on performance. It was found that both groups of stakeholders exhibited a positive attitude towards the feedback approach. Furthermore, it was found that the approach was effective at identifying the strengths and weaknesses of individual students, and at supporting the adaptive selection of learning resources that meet their educational needs. This work discusses the implications of the use of the CAT approach in Higher Education assessment. In addition, it demonstrates the ways in which the adaptive test generated by the CAT approach can be used to provide students with tailored feedback that is timely and useful.
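The adaptive core of such a test, estimating ability from the responses so far and picking the next question to match it, can be sketched briefly. The following Python is illustrative only, assuming a one-parameter (Rasch) item response model and a simple stair-step ability update; the algorithm, item calibration and stopping rules of the CAT developed in the thesis differ in detail:

```python
import math
import random

def p_correct(ability, difficulty):
    """Rasch (1PL) model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def run_cat(item_bank, answer, n_items=10):
    """item_bank: {item_id: difficulty}; answer(item_id) -> bool response."""
    ability, step = 0.0, 1.0
    asked = set()
    for _ in range(min(n_items, len(item_bank))):
        # The unused item whose difficulty is closest to the current ability
        # estimate is, under the Rasch model, the most informative to ask.
        item = min((i for i in item_bank if i not in asked),
                   key=lambda i: abs(item_bank[i] - ability))
        asked.add(item)
        # Stair-step update; production CATs typically use maximum-likelihood
        # or Bayesian (e.g. EAP) ability estimation instead.
        ability += step if answer(item) else -step
        step = max(step * 0.7, 0.1)  # shrink moves as evidence accumulates
    return ability

# Example: simulate a student of true ability 1.0 against a small item bank.
bank = {f"q{k}": d for k, d in enumerate([-2, -1, -0.5, 0, 0.5, 1, 1.5, 2])}
simulated = lambda item: random.random() < p_correct(1.0, bank[item])
print(round(run_cat(bank, simulated, n_items=8), 2))  # rough estimate near 1.0
```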
85

Enhancing the expressivity and automation of an interactive theorem prover in order to verify multicast protocols

Ridge, Thomas January 2006 (has links)
This thesis was motivated by a case study involving the formalisation of arguments that simplify the verification of tree-oriented multicast protocols. As well as covering the case study itself, it discusses our solution to problems we encountered concerning expressivity and automation. The expressivity problems related to the need for theory interpretation. We found the existing Locale and axiomatic type class mechanisms provided by the Isabelle theorem prover we were using to be inadequate. This led us to develop a new prototype implementation of theory interpretation. To support this implementation, we developed a novel system of proof terms for the HOL logic that we also describe in this thesis. We found the existing automation to perform poorly, which led us to experiment with additional kinds of automation. We describe our approach, focusing on features that make automation suitable for interactive use. Our presentation of the case study starts with our formalisation of an abstract theory of distributed systems, covering state transition systems, forward and backward simulation relations, and related properties of LTL (linear temporal logic). We then summarise proofs of simulation relations holding for particular abstract multicast protocols. We discuss the mechanisation styles we experimented with in the case study, as well as the methodology behind our proofs. We cover aspects such as how to discover and construct proofs, how to explore the space of proofs, how to make good definitions and lemmas, how to increase the modularity, reuse, stability and malleability of proofs, how to reduce proof maintenance, and how to narrow the gap between intuitively understood proofs and their formalisation.
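For a flavour of the properties formalised in the case study, here is a minimal Python sketch (hypothetical names) of the forward simulation condition for finite state transition systems: a relation R is a forward simulation if every concrete initial state is related to an abstract one, and every concrete step out of a related state can be matched by an abstract step that re-establishes R. The thesis proves such conditions for infinite-state protocol models in Isabelle, where brute-force enumeration like this is not possible:

```python
def is_forward_simulation(R, c_init, a_init, c_steps, a_steps):
    """R: set of (concrete, abstract) state pairs; *_init: sets of initial
    states; *_steps: sets of (state, successor) transition pairs."""
    # Every concrete initial state must be related to some abstract initial state.
    if not all(any((c, a) in R for a in a_init) for c in c_init):
        return False
    # Step condition: each concrete transition out of a related state must be
    # matched by an abstract transition whose target is again related.
    for c, a in R:
        for c1, c2 in c_steps:
            if c1 == c and not any(a1 == a and (c2, a2) in R
                                   for a1, a2 in a_steps):
                return False
    return True

# Example: a two-state counter simulated by a parity abstraction.
R = {(0, "even"), (1, "odd")}
print(is_forward_simulation(R, {0}, {"even"},
                            {(0, 1), (1, 0)},
                            {("even", "odd"), ("odd", "even")}))  # True
```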
86

Variability of the perimetric response in normals and in glaucoma

Pacey, Ian Edward January 1998 (has links)
This study investigated the variability of response associated with various perimetric techniques, with the aim of improving the clinical interpretation of automated static threshold perimetry. Evaluation of a third generation of perimetric threshold algorithms (SITA) demonstrated a reduction in test duration of approximately 50%, both in normal subjects and in glaucoma patients. SITA produced a slightly higher, but clinically insignificant, Mean Sensitivity than the previous generations of algorithms. This was associated with a decreased between-subject variability in sensitivity and hence lower confidence intervals for normality. In glaucoma, the SITA algorithms gave rise to more statistically significant visual field defects and a similar between-visit repeatability to the Full Threshold and FASTPAC algorithms. The higher estimated sensitivity observed with SITA compared to Full Threshold and FASTPAC was not attributed to a reduction in the fatigue effect. The investigation of a novel method of maintaining patient fixation, a roving fixation target which paused immediately prior to the stimulus presentation, revealed a greater degree of fixational instability with the roving fixation target than with the conventional static fixation target. Previous experience with traditional white-white perimetry did not eradicate the learning effect in short-wavelength automated perimetry (SWAP) in a group of ocular hypertensive patients. The learning effect was smaller in an experienced group of patients than in a naive group, but remained large enough that patients should undertake a series of at least three familiarisation tests with SWAP.
87

Characterisation and segmentation of basal ganglia mineralization in normal ageing with multimodal structural MRI

Glatz, Andreas January 2016 (has links)
Iron is the most abundant trace metal in the brain and is essential for many biological processes, such as neurotransmitter synthesis and myelin formation. This thesis investigates small, multifocal hypointensities that are apparent on T2*-weighted (T2*w) MRI in the basal ganglia, where presumably most iron enters the brain via the blood-brain barrier along the penetrating arteries. These basal ganglia T2*w hypointensities are believed to arise from iron-rich microvascular mineral deposits, which are frequently found in community-dwelling elderly subjects and are associated with age-related cognitive decline. This thesis documents the characteristic spatial distribution and morphology of basal ganglia T2*w hypointensities of 98 community-dwelling, elderly subjects in their seventies, as well as their imaging signatures on T1-weighted (T1w) and T2-weighted (T2w) MRI. A fully automated, novel method is introduced for the segmentation of basal ganglia T2*w hypointensities, which was developed to reduce the high intra- and inter-rater variability associated with current semi-automated segmentation methods and to facilitate the segmentation of these features in other single- and multi-centre studies. This thesis also presents a multi-parametric quantitative MRI relaxometry methodology for conventional clinical MRI scanners that was developed and validated to improve the characterisation of brain iron. Lastly, this thesis describes the application of the developed methods in the segmentation of basal ganglia T2*w hypointensities of 243 community-dwelling participants of the Austrian Stroke Prevention Study Family (ASPS-Fam) and their analysis on R2* (=1/T2*) relaxation rate and Larmor frequency shift maps. This work confirms that basal ganglia T2*w hypointensities, especially in the globus pallidus, are potential MRI markers of microvascular mineralization. Furthermore, the ASPS-Fam results show that basal ganglia mineral deposits mainly consist of paramagnetic particles, which presumably arise from an imbalance in brain iron homeostasis. Hence, basal ganglia T2*w hypointensities are possibly an indicator of age-related microvascular dysfunction with iron accumulation, which might help to explain the variability of cognitive decline in normal ageing.
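The R2* (=1/T2*) maps mentioned above are obtained by fitting a signal-decay model to multi-echo T2*w data. Below is a minimal sketch, assuming a mono-exponential decay S(TE) = S0 * exp(-R2* * TE) and a per-voxel log-linear least-squares fit; the validated relaxometry methodology in the thesis is considerably more careful about noise and field effects:

```python
import numpy as np

def r2star_map(echoes, tes):
    """echoes: array (n_echoes, x, y[, z]) of signal magnitudes;
    tes: echo times in seconds. Returns R2* in 1/s per voxel."""
    tes = np.asarray(tes, dtype=float)
    log_s = np.log(np.clip(echoes, 1e-6, None))  # avoid log(0)
    # Log-linear least squares per voxel: log S = log S0 - R2* * TE,
    # so the fitted slope with respect to TE is -R2*.
    x = tes - tes.mean()
    slope = np.tensordot(x, log_s - log_s.mean(axis=0), axes=(0, 0)) / (x ** 2).sum()
    return -slope

# Example: one voxel decaying with R2* = 50 1/s, sampled at four echoes.
tes = [0.005, 0.010, 0.015, 0.020]
signal = 1000 * np.exp(-50 * np.asarray(tes))
print(r2star_map(signal.reshape(4, 1, 1), tes))  # ~[[50.]]
```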
88

Performance modelling of reactive web applications using trace data from automated testing

Anderson, Michael 29 April 2019 (has links)
This thesis evaluates a method for extracting architectural dependencies and performance measures from an evolving distributed software system. The research goal was to establish methods of determining potential scalability issues in a distributed software system as it is being iteratively developed. The research evaluated the use of industry-available distributed tracing methods to extract performance measures and queuing network model parameters for common user activities. Additionally, a method was developed to trace and collect the system operations that correspond to these user activities, utilizing automated acceptance testing. Performance measure extraction was tested with this method across several historical releases of a real-world distributed software system. The trends in performance measures across releases correspond to several scalability issues identified in the production software system.
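As an illustration of the extraction step, the sketch below (hypothetical span fields; the industry tracing systems the thesis evaluates emit richer records) derives per-service service demands from trace spans and projects utilisation with the utilisation law U = X * D, the kind of queuing-network parameterisation that flags which service will saturate first as load grows:

```python
from collections import defaultdict

def service_demands(spans, n_requests):
    """spans: iterable of dicts like {"service": "db", "duration_s": 0.012};
    n_requests: number of traced user requests the spans were drawn from.
    Returns average service demand (seconds of work per request) per service."""
    busy = defaultdict(float)
    for span in spans:
        busy[span["service"]] += span["duration_s"]
    return {svc: total / n_requests for svc, total in busy.items()}

def projected_utilisation(demands, throughput):
    """Utilisation law: U_k = X * D_k. Values approaching 1.0 identify the
    service that will bottleneck first at request rate X (req/s)."""
    return {svc: throughput * d for svc, d in demands.items()}

spans = [{"service": "web", "duration_s": 0.030},
         {"service": "db", "duration_s": 0.012},
         {"service": "db", "duration_s": 0.020}]
demands = service_demands(spans, n_requests=1)
print(projected_utilisation(demands, throughput=25))  # {'web': 0.75, 'db': 0.8}
```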
89

Synaptome mapping of the postsynaptic density 95 protein in the human brain

Curran, Olimpia Elwira January 2018 (has links)
The past three decades of synaptic research have provided new insights into synapse biology. While synapses are still considered the fundamental connectors between the nerve cells in the central nervous system, they are no longer seen as simple neuron-to-neuron contacts. In fact, the estimated 100 trillion human synapses are extremely complex, diverse and capable of performing sophisticated computational operations, giving rise to advanced repertoires of cognitive and organic behaviours. These intricate synaptic properties mean that existing methodologies for quantifying and characterising synapses are inadequate. Yet an understanding of synapse biology is crucial to deciphering human pathology, as disruptions in synapse numbers, architecture and function have already been linked to many human brain disorders. The purpose of this PhD was to evaluate a novel, high-throughput synaptic protein quantification method at single-synapse resolution in human post-mortem brain tissue. The method has already been successfully tested in our laboratory in genetically engineered mice, whereby synapses have been systematically quantified across a large number of areas to generate the first molecular maps of synapses, the synaptome maps. In this project, methods have been developed to label human brain tissue with postsynaptic density protein 95 (PSD-95), the most common postsynaptic protein. We describe the use of PSD-95 combined with confocal microscopy and computational image analysis to quantify synaptic puncta immunofluorescence (IF) parameters in the human brain. In the first part of this study, the new method was used to quantify PSD-95 IF across 20 selected human brain regions to generate the first human PSD-95 synaptome map. In the second part, PSD-95 IF was systematically assessed across 16 hippocampal subregions. Finally, we confirmed that our novel synaptic quantification method was sensitive to hippocampal synaptic losses in patients with Alzheimer's Disease (AD). Such a high degree of systematic synapse quantification has not previously been reported in human brain tissue. Our method is a promising approach for synaptic protein quantification in tissue, with several potential applications in diagnosis and development of therapeutics for neurological and psychiatric disorders.
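The puncta-quantification step can be illustrated compactly. Below is a minimal sketch using scikit-image (illustrative thresholds and size limits; the pipeline developed in the thesis is calibrated and validated far more carefully) that detects candidate PSD-95 puncta in a 2-D confocal image and summarises their number, size and intensity:

```python
import numpy as np
from skimage import filters, measure

def quantify_puncta(image, min_area_px=4, max_area_px=200):
    """image: 2-D float array of PSD-95 immunofluorescence intensities."""
    smoothed = filters.gaussian(image, sigma=1.0)       # suppress shot noise
    mask = smoothed > filters.threshold_otsu(smoothed)  # global threshold
    labels = measure.label(mask)                        # connected components
    puncta = [r for r in measure.regionprops(labels, intensity_image=image)
              if min_area_px <= r.area <= max_area_px]  # plausible punctum size
    return {
        "count": len(puncta),
        "mean_area_px": float(np.mean([r.area for r in puncta])) if puncta else 0.0,
        "mean_intensity": float(np.mean([r.mean_intensity for r in puncta])) if puncta else 0.0,
    }
```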
90

Automated decision-making vs indirect discrimination: Solution or aggravation?

Lundberg, Emma January 2019 (has links)
When public institutions use automated decision-making systems, for example to approve, determine or deny individuals' benefits, they can process more work in less time and at lower cost than human decision-makers could. But as the technology has developed to help us in this way, so have the potential problems these systems can cause while operating. Those primarily affected are the individuals denied benefits, health care or pensions. The systems can perpetuate hidden, historical stigmatisation and prejudice, disproportionately disadvantaging members of historically marginalised groups through their decisions, simply because the systems have learned to do so. There is also a risk that the programmer embeds her or his own bias, or incorrectly translates the applicable legislation or policies, so that the finished system makes decisions on unknown grounds, demanding more, less or entirely different things than the requirements set out in public, written law. These systems work in the language of mathematical algorithms, which most individuals, public employees and courts do not understand. If you suspect you have been discriminated against by an automated decision, successfully claiming discrimination before US, Canadian and Swedish courts, the ECtHR or the ECJ requires you to show on which of your characteristics you were discriminated, and in comparison to which other, advantaged group. Yet when neither the applicant nor the court has access to any reasons or explanation for the decision, the inability to identify such a comparator can lead to genuine cases of indirect discrimination being dismissed. A solution could be to follow Sophia Moreau's theory and focus on the actual harm the individual claims to have suffered, rather than on categorising her or him by certain traits or on finding a suitable comparator. This resembles a ruling of the Swedish Court of Appeal in which a comparator was not necessary to establish that the applicant had been indirectly discriminated against by a public institution; the court focused instead on the harm the applicant claimed to have suffered, and then investigated whether the difference in treatment could be objectively justified. If Swedish and European legislation is to meet the challenges that can arise from the use of automated decision-making systems, this model of the Swedish Court of Appeal may be the better-suited one for helping individuals affected by a public institution's automated decisions.
