11

Reasoning about history based access control policy using past time operators of interval temporal logic

Alsarhani, Sami January 2014 (has links)
Interval Temporal Logic (ITL) is a flexible notation for propositional and first-order logical reasoning about periods of time that exist in specifications of hardware and software systems. ITL differs from other temporal logics in that it can deal with both sequential and parallel composition, and it provides powerful and extensible specification and verification methods for reasoning about properties such as safety, time projection and liveness. Most imperative programming constructs can be seen as ITL formulas, which form the basis of an executable framework called Tempura used for the development and testing of ITL specifications. ITL has only future operators, but the use of past operators makes specifications referring to history more succinct; that is, there are classes of properties that can be expressed by means of much shorter formulas. What is more, statements are easier to express (simplicity) when past operators are included. Moreover, using past operators does not increase the complexity of interval temporal logic with respect to formula size or simplicity. This thesis introduces a past-time interval temporal logic in which, instead of the future-time operators Chop, Chopstar and Skip, we have the past operators past Chop, past Chopstar and past Skip. The syntax and semantics of past-time ITL are given together with its axioms and proof system. Furthermore, Security Analysis Toolkit for Agents (SANTA) operators such as always-followed-by and its strong version are given history-based semantics using past-time operators. In order to evaluate past-time interval temporal logic, the problem of specifying and verifying history-based access control policies has been selected. This problem has already been solved using the future-time operators of ITL, but the drawback is that the resulting policy rules are neither succinct nor simple. However, the use of past-time operators of ITL produces simple and succinct policy rules. The verification technique used to prove the safety property of history-based access control policies is adapted for past-time ITL to show that past-time operators of interval temporal logic can specify and verify a security scenario such as a history-based access control policy.
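For context, a minimal sketch of the standard ITL chop semantics, together with one plausible reading of a past-time counterpart, is given below; this is an illustration only and the thesis's exact definitions may differ.

    % Standard future-time chop: an interval sigma satisfies f1 ; f2 iff it can be
    % split at some state k so that the prefix satisfies f1 and the suffix satisfies f2.
    \sigma \models f_1 \,;\, f_2
      \;\iff\; \exists k,\; 0 \le k \le |\sigma| :\;
      \sigma_{0 \ldots k} \models f_1 \;\wedge\; \sigma_{k \ldots |\sigma|} \models f_2
    % A past chop (sketch) would instead split the history interval ending at the
    % current state, so that policy rules can refer backwards to what has already happened.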
12

Cooperative diversity techniques for high-throughput wireless relay networks

Vien, Quoc-Tuan January 2013 (has links)
Relay communication has attracted growing interest in wireless communications, with application to various enhanced technologies. This thesis considers a number of issues related to data throughput in various wireless relay network models. In particular, new implementations of network coding (NC) and space-time coding (STC) techniques are investigated to offer various means of achieving high-throughput relay communications. Firstly, this thesis investigates different practical automatic repeat request (ARQ) retransmission protocols based on NC for two-way wireless relay networks to improve throughput efficiency. Two improved NC-based ARQ schemes are designed based on go-back-N and selective-repeat (SR) protocols. Addressing ARQ issues in multisource multidestination relay networks, a new NC-based ARQ protocol is proposed and two packet-combination algorithms are developed for retransmissions at the relay and sources to significantly improve the throughput. In relation to the concept of channel quality indicator (CQI) reporting in two-way relay networks, two new efficient CQI reporting schemes are designed based on NC to improve the system throughput by allowing two terminals to simultaneously estimate the CQI of the distant terminal-relay link without incurring additional overhead. The transmission time for CQI feedback at the relays is reduced by half, while the increase in complexity and the loss of performance are shown to be negligible. Furthermore, a low-complexity relay selection scheme is suggested to reduce the relay searching complexity. For the acknowledgment (ACK) process, this thesis proposes a new block ACK scheme based on NC to significantly reduce the ACK overheads and therefore produce an enhanced throughput. The proposed scheme is also shown to improve the reliability of block ACK transmission and reduce the number of data retransmissions for a higher system throughput. Additionally, this thesis presents a new cooperative retransmission scheme based on relay cooperation and NC to considerably reduce the number of retransmission packets and improve the reliability of retransmissions for a more power-efficient and higher-throughput system with non-overlapped retransmissions. Moreover, two relay selection schemes are recommended to determine the optimised number of relays for the retransmission. Finally, with respect to cognitive wireless relay networks (CWRNs), this thesis proposes a new cooperative spectrum sensing (CSS) scheme to improve the spectrum sensing performance and designs a new CSS scheme based on NC for three-hop CWRNs to improve system throughput. Furthermore, a new distributed space-time-frequency block code (DSTFBC) is designed for a two-hop nonregenerative CWRN over frequency-selective fading channels. The proposed DSTFBC design achieves a higher data rate, spatial diversity gain, and decoupled detection of data blocks at all destination nodes with a low-complexity receiver structure.
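As an illustration of the basic network-coding idea that these ARQ and CQI schemes build on, the sketch below (in Python, with hypothetical packets, and not the thesis's actual protocol) shows how a relay in a two-way relay network can XOR the two terminals' packets so that a single broadcast serves both directions.

    # Packet-level network coding in a two-way relay (illustrative sketch only).
    def xor_packets(a: bytes, b: bytes) -> bytes:
        """XOR two equal-length packets byte by byte."""
        return bytes(x ^ y for x, y in zip(a, b))

    packet_a = b"DATA-FROM-A!"          # packet sent by terminal A to the relay
    packet_b = b"DATA-FROM-B!"          # packet sent by terminal B to the relay

    # The relay broadcasts one coded packet instead of forwarding each packet separately.
    coded = xor_packets(packet_a, packet_b)

    # Each terminal removes its own packet (side information) to recover the other's,
    # so the two-way exchange needs three transmissions instead of four.
    assert xor_packets(coded, packet_a) == packet_b
    assert xor_packets(coded, packet_b) == packet_a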
13

The use of key performance indicators for external quality control in histological dissection

Griffiths, Matthew Paul January 2016 (has links)
The recent reports into standards in the NHS (Francis, 2013) and quality in pathology (Barnes, 2014) have focused scrutiny on the way in which we work in pathology and how we can provide assurance that this work is of a sufficiently high standard. There are a number of external quality assurance schemes covering pathology generally and histopathology specifically (UKNEQAS); however, there is no scheme of any kind covering the process of histological surgical dissection. This area is currently undergoing development as it changes from the sole preserve of medically qualified pathologists to one utilising a number of highly trained biomedical scientists. The concept of biomedical scientist (BMS) led dissection has been around for some years, being codified into a document of best practice in 2005 (IBMS, 2005). This document placed the responsibility for the work of the BMS in the hands of the pathologist, without structured oversight or a quality assurance programme. Ten years on, specimen dissection, including the developing area of BMS-led dissection, remains without any formal structured quality assurance. This work builds on the points made by Barnes (2014), taking the guidance of the RCPath and IBMS to form a series of key performance indicators (KPIs) in relation to specimen dissection. These KPIs can be used as an indicator of individual practice, highlighting areas of variation, weakness or strength. Once these are identified, a tailored feedback and training event provides an opportunity to address errors and omissions, or to enable areas of strength to be shared. The results of this thesis show that introducing training events serves to drastically reduce variation by, and between, dissectors, driving up standards in the department.
14

Impact of climate change on extinction risk of montane tree species

Tejedor Garavito, Natalia January 2014 (has links)
The potential impacts of climate change on many species worldwide remain unknown, especially in those tropical regions that are centres of endemism and are highly biodiverse. This thesis provides an insight into the extinction risk of selected tree species using different species distribution modelling techniques and reviewing the current conservation status of montane forest in the Tropical Andes. Starting with a global analysis, the potential impacts of climate change on montane ecoregions are investigated by identifying those that are most vulnerable to the expected changes in temperature and precipitation, from global predictions under different climate change scenarios. The thesis then gives an insight into the current and potential threats to biodiversity in the Andean region, including the identification of those most likely to be responsible for increasing the extinction risk of the species. Using the IUCN Red List Categories and Criteria, selected tree species were assessed to identify their extinction risk. Information on the species' current distribution was collated and used to estimate their potential distribution under climate change, using different modelling techniques. These results were used to reassess the species against the IUCN Red List and establish the changes in Red List Category. Lastly, the thesis provides a discussion that integrates all the results obtained throughout, exploring the implications for conservation in order to highlight the overriding importance of including threatened tree species when targeting conservation efforts in the region, while considering the uncertainties that surround predictions under climate change scenarios, modelling techniques and the use of the IUCN Red List.
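As a rough indication of what a species distribution modelling step can look like, the sketch below fits a toy occurrence model to two climate predictors and projects it under an assumed warming scenario; the data, predictors and scenario shift are all hypothetical, and the thesis uses different, more sophisticated techniques.

    # Toy species distribution model (illustrative sketch only).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Hypothetical presence/absence records with two predictors:
    # mean annual temperature (degC) and annual precipitation (mm).
    X_current = rng.normal(loc=[12.0, 1500.0], scale=[3.0, 400.0], size=(200, 2))
    y = (X_current[:, 0] < 13.0).astype(int)   # toy rule: the species prefers cooler sites

    model = LogisticRegression().fit(X_current, y)

    # Project habitat suitability under an assumed warmer, slightly drier scenario.
    X_future = X_current + np.array([2.5, -100.0])
    print("mean suitability now:   ", model.predict_proba(X_current)[:, 1].mean().round(2))
    print("mean suitability future:", model.predict_proba(X_future)[:, 1].mean().round(2))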
15

Digging the dirt : a comparative analysis of excavation methods and recording systems in relation to their applications in forensic archaeology

Evis, Laura Helen January 2014 (has links)
Archaeological excavation has been widely used in the recovery of human remains and other evidence in the service of legal cases for many years. However, established approaches will face closer scrutiny following the announcement by the Law Commission in 2011 that expert evidence will in future be subject to a new reliability-based admissibility test in criminal proceedings. This research evaluates current archaeological excavation methods and recording systems – focusing on those used in the United Kingdom, Ireland, Australasia, and North America – in relation to their use in providing forensic evidence and their ability to satisfy the admissibility tests introduced by the Law Commission and other internationally recognised bodies. In order to achieve this aim, two analyses were undertaken. First, attention was directed to understanding the origins, development, underpinning philosophies, and current use of archaeological excavation methods and recording systems in the regions selected for study. A total of 153 archaeological manuals/guidelines were examined from archaeological organisations operating in the United Kingdom, Ireland, Northern Ireland, Scotland, the United States, Canada, Australia, and New Zealand. This research indicated that the Stratigraphic Excavation method and Single Context Recording system, the Demirant Excavation method and Standard Context Recording system, the Quadrant Excavation method and Standard Context Recording system, and the Arbitrary Excavation method and Unit Level Recording system were the approaches most often used to excavate and record graves. Second, these four defined methodological approaches were assessed experimentally, using a grave simulation of known properties to test the excavation, recording, and interpretation of material evidence, the definition of stratigraphic contexts, and the understanding of stratigraphic relationships. The grave simulation also provided opportunities to measure archaeologists' narratives of the grave formation process against the known properties of the simulation, and to assess whether archaeological experience had any impact on evidence recovery rates. Fifty repeat excavations were conducted. The results of this experimental study show that the Quadrant Excavation method and Standard Context Recording system was the most consistent, efficient, and reliable archaeological approach for excavating and recording clandestine burials and for formulating interpretation-based narratives of a grave's formation sequence. Archaeological experience was found to have little bearing on the recovery of evidence from the grave simulation. It is therefore suggested that forensic archaeologists use the Quadrant Excavation method and Standard Context Recording system to excavate and record clandestine burials. If this approach cannot be used, the Demirant Excavation method and Standard Context Recording system, or the Stratigraphic Excavation method and Single Context Recording system, should be used; both of these techniques proved productive in terms of material evidence recovery and the identification and definition of stratigraphic contexts. The Arbitrary Excavation method and Unit Level Recording system should not be used, as this method proved to have an extremely poor evidence recovery rate and destroyed the deposition sequence present within the simulated grave.
16

Validation of oral fluid as a matrix for drug detection

Reichardt, Eva Maria January 2014 (has links)
New testing procedures used as an aid to law enforcement are subject to intensive scrutiny in the courts. In recent years there has been a shift in workplace drug testing away from traditional specimens (i.e. urine) for drug testing and monitoring and towards less invasive testing using oral fluid. Whilst it is now widely accepted that drugs can be detected in oral fluid, and devices are now available to facilitate analysis of drugs in this matrix, our understanding of the behaviour of drugs in the mouth and oral secretions is far from complete. Since the introduction of oral fluid drug testing in the late 1990s it has been observed that some drugs appear to be present at higher than expected concentrations, often at concentrations that would be fatal if they were in blood. Clearly some additional process is occurring beyond drugs entering the oral fluid by simple blood and saliva partitioning. Little is known about the physiology of drugs in the mouth, and this limited understanding of drug elimination via the mouth poses a problem for forensic toxicologists in the interpretation of analytical results in relation to an individual's drug use or the possible effects that the drug may be exerting on that individual. The work described in this thesis aims to increase our understanding of the factors and processes concerning the deposition, secretion and detection of drugs in oral fluid and to enhance our ability to interpret the results of analysis in this matrix. The objective of this study was to explore how high drug concentrations can be deposited in the mouth tissues and oral fluid, together with other factors that may influence drug detection, in order to assist with the interpretation of testing results. To test the hypothesis that drug depots form within the mouth, preliminary screening methods in combination with confirmatory techniques such as GC-MS and LC-MS/MS were employed. An immunohistochemical method was successfully developed for the detection and visualisation of cocaine and opiates (heroin and morphine) in porcine and mouse tissue. The work undertaken in this thesis showed that elevated drug concentrations can be observed when drugs are consumed orally, either in the form of an oral solution or by smoking. Immunohistochemical analysis in combination with confirmatory techniques demonstrated that drugs such as cocaine and opioids can bind to oral tissue and be subsequently released over time, and therefore have the potential to contribute to the drug concentration in oral fluid. Although this is not an issue under legislation that forbids the presence of drugs, or in checks for drug compliance or abstinence, it must be considered in relation to the interpretation of results in more complex forensic cases.
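For background on the "simple blood and saliva partitioning" mentioned above, the commonly cited pH-partition expression for the theoretical saliva-to-plasma concentration ratio of a basic drug is sketched below; it is given here for context and is not taken from the thesis.

    % Theoretical saliva/plasma ratio for a basic drug under pH-partition theory,
    % where f_plasma and f_saliva are the unbound (free) drug fractions in each fluid:
    \frac{S}{P} \;=\;
      \frac{1 + 10^{\,\mathrm{p}K_a - \mathrm{pH}_{\mathrm{saliva}}}}
           {1 + 10^{\,\mathrm{p}K_a - \mathrm{pH}_{\mathrm{plasma}}}}
      \times \frac{f_{\mathrm{plasma}}}{f_{\mathrm{saliva}}}
    % Oral-fluid concentrations far above this prediction are what motivate
    % the drug-depot hypothesis investigated in the thesis.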
17

Predicting the ecological consequences of river management for a riverine cyprinid fish

Murray, Gregory P. D. January 2014 (has links)
Current river management seeks to strike a compromise between stakeholder interests, ecosystem services provision and conservation aims, especially in relation to the assessment of ecosystem health. While management decisions should be evidence-based, current population and habitat models fail to incorporate fish behaviour and the interactions between fish and their environment, thus limiting their ability to predict management-relevant population responses to environmental change. In order to address these weaknesses, an individual-based model (IBM) is developed and parameterised to predict the distribution and growth of roach (Rutilus rutilus), a common, generalist freshwater fish known to be typically dominant in heavily modified rivers. Such a model seeks to build on current management models and practices, with emphasis on improving recruitment of juvenile roach. Virtual forager parameters are derived from foraging experiments, published investigations, and models of roach behaviour and bioenergetics. Data collected from field studies in a typical, highly modified lowland river are used to describe the environment and the initial fish population, with subsequent data on fish population trends used to validate the IBM under a pattern-oriented modelling approach, specifically against growth rate and habitat distribution patterns. River management practices, including the removal of in-stream aquatic macrophytes and the regulation of the flow regime for flood risk management, are predicted to be potentially damaging to roach recruitment, subsequent year-class strength and therefore populations in subsequent years. Recommendations for more sympathetic management schemes are provided. The modelling framework described here can be used to produce robust predictions of roach population patterns in riverine habitats and allows the user to test the impact of environmental change on cyprinid fish, enabling the modelling system to be used to develop proactive, evidence-based management in light of current rates of environmental change.
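To indicate the general shape of an individual-based model of this kind, the sketch below has virtual fish choose the patch with the best net energy gain each time step and grow accordingly; the patches, parameters and growth rule are entirely hypothetical and far simpler than the thesis's foraging and bioenergetics sub-models.

    # Minimal individual-based model sketch (hypothetical parameters throughout).
    import random

    PATCHES = {
        # patch name: (energy intake, metabolic cost) per time step
        "marginal_macrophytes": (5.0, 2.0),
        "open_channel":         (3.0, 1.0),
        "fast_riffle":          (6.0, 5.5),
    }

    class Fish:
        def __init__(self, mass: float):
            self.mass = mass
            self.patch = None

        def step(self) -> None:
            # Each fish moves to the patch with the highest net energy gain...
            self.patch, (intake, cost) = max(
                PATCHES.items(), key=lambda item: item[1][0] - item[1][1])
            # ...and converts the surplus into growth (assumed conversion factor).
            self.mass += 0.01 * (intake - cost)

    population = [Fish(mass=random.uniform(1.0, 2.0)) for _ in range(50)]
    for _day in range(30):
        for fish in population:
            fish.step()
    print("mean mass after 30 days:", sum(f.mass for f in population) / len(population))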
18

Exploring the computational potential of simple chemical reactions

Jahan, I. January 2014 (has links)
A large number of human activities rely on conventional computing devices producing huge quantities of data, and these "traditional computers" may eventually fail to deal with such demands. There is therefore a need to develop novel computing paradigms. This project aims to fabricate and explore the potential of novel computing devices based on the space-time dynamics of travelling waves in non-linear media. The designed prototype computing devices are all experimentally implemented in chemical reaction-diffusion media. A light-sensitive Belousov-Zhabotinsky (BZ) reaction was used to construct a number of logic gates and arithmetic circuits. A 1-bit half adder was constructed using the collisions of wave fragments within channels in a weakly excitable analogue of the BZ reaction. The excitability of the reaction is controlled by altering the light levels projected onto an immobilised light-sensitive catalyst within an open reactor fed with fresh BZ reagents. This approach was extended by projecting a series of interconnected discs of differing connection weight and size. Using this approach an inverter gate, an AND gate, a NAND gate, an NXOR gate, an XOR gate and a diode were created, in addition to a compact 1-bit half adder circuit and memory circuits. Using an excitable BZ analogue, a 4-bit input, 2-bit output integer square root circuit was implemented. This utilises the principle of constant-speed wave propagation and the annihilation of colliding wave fronts, coupled with light-controlled valves. The light-sensitive BZ reaction was also used as a substrate for exploring the potential of applying co-evolutionary algorithms coupled with memory to control the dynamics in order to solve specific computational tasks. It was shown that learnt solutions from simulation experiments could be applied directly to experimental systems. A "gas-free" cyclohexanedione analogue of the BZ reaction was encapsulated in 3-D lipid-stabilised vesicles. The transfer of excitation between adjacent vesicles could be induced by altering the reaction chemistry, and it was also possible to selectively initiate waves using lasers at selected wavelengths. Light activation is important because it enables initiation at the required positions of computational schemes. In addition, there is a need to study pattern formation in simple inorganic systems in order to gain a better understanding of pattern formation and its control, so as to synthesise functional materials and implement computation utilising inherent self-assembly mechanisms. Therefore, a simple reaction between aluminium chloride and sodium hydroxide was studied and a phase diagram of the reaction was constructed. A controllable region was found where circular waves, target waves, cardioid-like double spirals, simple Voronoi diagrams and additively weighted Voronoi diagrams could be constructed. In addition, a group of simple chemical reactions capable of geometric calculations (generalised and weighted Voronoi diagrams) is presented. Drops of metal ion solution were placed on either potassium ferrocyanide or ferricyanide gel to construct complex tessellations of the plane (sometimes calculating multiple Voronoi diagrams in parallel). The reactions were used to reconstruct natural tessellations such as those observed in animal coat patterns. These simple reactions may therefore be valuable in helping to understand natural pattern formation.
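As a plain numerical analogue of the constant-speed wave principle described above, the sketch below labels each grid cell by the seed whose wave front would reach it first, with cells where two fronts arrive almost simultaneously (i.e. where they would annihilate) marked as Voronoi boundaries; the seed positions and speeds are hypothetical, and the chemical implementation in the thesis uses real reaction-diffusion fronts rather than this geometric shortcut.

    # Voronoi diagram via constant-speed wave arrival times (illustrative sketch).
    import numpy as np

    size = 200
    yy, xx = np.mgrid[0:size, 0:size]
    seeds  = np.array([[50, 40], [150, 60], [100, 160]])   # (row, col) wave sources
    speeds = np.array([1.0, 1.0, 1.5])                     # unequal speed -> weighted Voronoi

    # Arrival time of each seed's wave front at every grid cell (distance / speed).
    dist = np.sqrt((yy[None] - seeds[:, 0, None, None]) ** 2 +
                   (xx[None] - seeds[:, 1, None, None]) ** 2)
    arrival = dist / speeds[:, None, None]

    region = arrival.argmin(axis=0)                 # cell label = first wave to arrive
    ordered = np.sort(arrival, axis=0)
    boundary = (ordered[1] - ordered[0]) < 0.7      # fronts meet ~simultaneously and annihilate
    print("cells per Voronoi region:", np.bincount(region.ravel()))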
19

Identification of volatile compounds as indicators of spoilage insects and mites in grain and flour

Abuelnnor, N. January 2013 (has links)
Solid-phase micro-extraction (SPME) fibres and automated thermal desorption (ATD) tubes were used with gas chromatography-mass spectrometry (GC-MS) to investigate volatile secretions in the headspace gas of adults and larvae of the confused flour beetle Tribolium confusum du Val, the saw-toothed grain beetle Oryzaephilus surinamensis (L.), the grain weevil Sitophilus granarius (L.), the flour mite Acarus siro L. and the storage mite Glycyphagus domesticus (De Geer). A number of volatile organic compounds (VOCs) were specifically linked with the confused flour beetle T. confusum and not observed with the other insect and mite species. For instance, 2-methyl-p-benzoquinone and 2-ethyl-p-benzoquinone were specifically linked to the confused flour beetle. However, this study found that benzoquinones were not present in the headspace volatiles of the larvae of the confused flour beetle, whereas 1-pentadecene was detected in both adults and larvae. The volatile extraction experiments with the saw-toothed grain beetle O. surinamensis showed that several volatiles were detected in the adults and larvae of this species but not in the other insect and mite species. Some of these VOCs were detected only in adults: butanoic acid 1-methylethyl ester and 1,3-octadiene. The others were found in both larvae and adults: 2,4-hexadien-1-ol, octane, 3-methylbutanoic acid, 2,5-dimethylpyrazine, 1-octen-3-one and 1-octen-3-ol. The analysis of the headspace from the grain weevil S. granarius showed three VOCs that were specifically linked to weevils compared with the other species: mequinol, 2-methylpropanoic acid and 2-pentanone. The known mite-produced alarm pheromone citral (comprising (Z)-3,7-dimethyl-2,6-octadienal and (E)-3,7-dimethyl-2,6-octadienal) was detected in both mite species, the flour mite A. siro and the storage mite G. domesticus, but not in the insect species. This study has identified a number of VOCs which may enable the early detection of storage insects and mites in grain stores. Many of the VOCs identified were specific to either mites or one of the insect species studied, which could enable specific identification of the cause of infestation.
20

User participation in the design and development of Web 2.0 technologies for people with learning difficulties

Fanou, S. K. January 2014 (has links)
In the twenty-first century, people with learning difficulties (PWLD) still face oppression, discrimination and exclusion from the mainstream of social life. Over recent decades the policy of the United Kingdom's (UK) government and activist organisations regarding people with learning difficulties has focused on enabling inclusion, ensuring rights, providing choice and developing advocacy and independence. People with learning difficulties have been moved out of institutions with the intention that they be included and respected as equal members of society. During the same decades in which government and activist organisations have been striving for the inclusion and equality of people with learning difficulties, the use of Information Technology (IT) has become pervasive, to the degree that it is almost impossible for individuals to function successfully in society unless they have access to it. Unfortunately, most IT is not designed to be usable and accessible to people with learning difficulties, and this is a major barrier to their social inclusion. Participatory Design (PD) methodologies, which emphasise end-user involvement in the software development process, are widely considered the key to system usability and accessibility. However, most researchers and software developers believe that people with learning difficulties are not capable of participating in the development process as a result of their disabilities; others report that they do not know how to work with this specific group of disabled end-users. This discriminatory behaviour is a major reason why IT remains inaccessible to people with learning difficulties. The study described in this thesis combined Evolutionary Prototyping, a software development methodology, with Participatory Action Research (PAR), a social science methodology, in order to involve a cohort of four Health Trainers with learning difficulties in the development of a Web 2.0 based system. The aims of the study were to explore how people with learning difficulties could be involved in the development of a software system and whether they could use a system developed with their participation. A further aim was to explore how software developers can approach the field of learning disability, the issues they will face and how those issues can be overcome. Qualitative data was gathered during fourteen Participatory Action Research meetings, in which the Health Trainers were involved in research, software development and system use. The data was analysed using Thematic Content Analysis, facilitated by the NVivo software package, and the findings were validated by the participating Health Trainers. The findings suggest that the Health Trainers faced a number of challenges when participating in software development; however, they indicated the type of support they needed from the researcher in order to overcome them. The support required was easy to provide, and the Health Trainers managed to engage in the software development process. The study conducted a system use evaluation to explore whether the developed system was usable and accessible to the Health Trainers. The Health Trainers managed to complete all the system tasks posed to them during the evaluation, suggesting that the developed system was indeed usable and accessible to them. Further evidence suggests that a number of factors affected the participation of the Health Trainers during development and during the use of the system.
Finally, the study explored how the developed system was used over the longer term, over a period of eighteen months. The findings suggest that system use over time was affected by factors other than the system's accessibility and usability. In conclusion, the findings suggest that, with easy-to-provide support, Health Trainers with learning difficulties could be involved in software development and could use a system developed with their participation. It is hoped that the findings will be used by policy makers and advocacy groups to make the case for convincing researchers and software developers to involve more people with learning difficulties in software development, thus making systems accessible to this community of end-users.
