  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Digging the dirt : a comparative analysis of excavation methods and recording systems in relation to their applications in forensic archaeology

Evis, Laura Helen January 2014 (has links)
Archaeological excavation has been widely used for many years in the recovery of human remains and other evidence in the service of legal cases. However, established approaches will be subject to closer scrutiny following the Law Commission's announcement in 2011 that expert evidence in criminal proceedings will in future face a new reliability-based admissibility test. This research evaluates current archaeological excavation methods and recording systems – focusing on those used in the United Kingdom, Ireland, Australasia, and North America – in relation to their use in providing forensic evidence, and their ability to satisfy the admissibility tests introduced by the Law Commission and other internationally recognised bodies. To achieve this aim, two analyses were undertaken. First, attention was directed to understanding the origins, development, underpinning philosophies, and current use of archaeological excavation methods and recording systems in the regions selected for study. A total of 153 archaeological manuals/guidelines were examined from archaeological organisations operating in the United Kingdom, Ireland, Northern Ireland, Scotland, the United States, Canada, Australia, and New Zealand. This research indicated that the Stratigraphic Excavation method and Single Context Recording system, the Demirant Excavation method and Standard Context Recording system, the Quadrant Excavation method and Standard Context Recording system, and the Arbitrary Excavation method and Unit Level Recording system were the approaches most often used to excavate and record graves. Second, the four defined methodological approaches were assessed experimentally, using a grave simulation of known properties to test the excavation, recording, and interpretation of material evidence, the definition of stratigraphic contexts, and the understanding of stratigraphic relationships.
The grave simulation also provided opportunities to measure archaeologists’ narratives of the grave formation process against the known properties of the simulation, and to assess whether archaeological experience had any impact on evidence recovery rates. Fifty repeat excavations were conducted. The results of this experimental study show that the Quadrant Excavation method and Standard Context Recording system was the most consistent, efficient, and reliable archaeological approach for excavating and recording clandestine burials and for formulating interpretation-based narratives of a grave’s formation sequence. Archaeological experience was found to have little bearing upon the recovery of evidence from the grave simulation. It is suggested that forensic archaeologists use the Quadrant Excavation method and Standard Context Recording system to excavate and record clandestine burials. If this approach cannot be used, the Demirant Excavation method and Standard Context Recording system, or the Stratigraphic Excavation method and Single Context Recording system, should be used instead. Both of these techniques proved productive in terms of material evidence recovery and the identification and definition of stratigraphic contexts. The Arbitrary Excavation method and Unit Level Recording system should not be used, as it proved to have an extremely poor evidence recovery rate and destroyed the deposition sequence present within the simulated grave.

Validation of oral fluid as a matrix for drug detection

Reichardt, Eva Maria January 2014 (has links)
New testing procedures used as an aid to law enforcement are subject to intensive scrutiny in the courts. In recent years there has been a shift in workplace drug testing away from traditional specimens (i.e. urine) and towards less invasive testing using oral fluid. Whilst it is now widely accepted that drugs can be detected in oral fluid, and devices are available to facilitate analysis of drugs in this matrix, our understanding of the behaviour of drugs in the mouth and oral secretions is far from complete. Since the introduction of oral fluid drug testing in the late 1990s it has been observed that some drugs appear to be present at higher than expected concentrations, often at concentrations that would be fatal if they were in blood. Clearly some additional process is occurring beyond drugs entering the oral fluid by simple blood and saliva partitioning. Little is known about the physiology of drugs in the mouth, and this limited understanding of drug elimination via the mouth poses a problem for forensic toxicologists when interpreting analytical results in relation to an individual’s drug use or the possible effects the drug may be exerting on that individual. The work described in this thesis aims to increase our understanding of the factors and processes governing the deposition, secretion and detection of drugs in oral fluid, and to enhance our ability to interpret the results of analysis in this matrix. The objective of this study was to explore how high drug concentrations can be deposited in the mouth tissues and oral fluid, together with other factors that may influence drug detection, in order to assist with the interpretation of testing results. To test the hypothesis that drug depots form within the mouth, preliminary screening methods were employed in combination with confirmatory techniques such as GC-MS and LC-MS/MS.
An immunohistochemical method was successfully developed for the detection and visualisation of cocaine and opiates (heroin and morphine) in porcine and mouse tissue. The work undertaken in this thesis showed that elevated drug concentrations can be observed when drugs are taken either as an oral solution or by smoking. Immunohistochemical analysis in combination with confirmatory techniques demonstrated that drugs such as cocaine and opioids can bind to oral tissue and be released over time, and therefore have the potential to contribute to the drug concentration in oral fluid. Although this is not an issue under legislation that forbids the presence of drugs, or when checking for drug compliance or abstinence, it must be considered when interpreting results in more complex forensic cases.
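The "simple blood and saliva partitioning" baseline mentioned above is commonly approximated with a pH-partition (Henderson-Hasselbalch) model. The sketch below is illustrative only and is not taken from the thesis: the pH values are typical textbook figures, protein binding is ignored, and the formula applies to basic drugs.

```python
import math  # kept for clarity; only exponentiation is actually needed

def sp_ratio_basic(pka, ph_saliva=6.8, ph_plasma=7.4):
    """Theoretical saliva/plasma concentration ratio for a basic drug
    under pH-partition theory, ignoring protein binding. Only the
    un-ionised fraction crosses membranes, so the ratio depends on how
    much of the drug is ionised at each pH."""
    return (1 + 10 ** (pka - ph_saliva)) / (1 + 10 ** (pka - ph_plasma))

# For a basic drug with pKa around 8.6 (roughly cocaine's), ion trapping
# in the slightly acidic saliva already predicts a ratio above 1.
print(round(sp_ratio_basic(8.6), 2))  # → 3.8
```

Observed oral fluid concentrations far above this passive-partitioning baseline are exactly what points to the additional processes, such as drug depots in oral tissue, that the thesis investigates.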

Predicting the ecological consequences of river management for a riverine cyprinid fish

Murray, Gregory P. D. January 2014 (has links)
Current river management seeks to strike a compromise between stakeholder interests, ecosystem service provision and conservation aims, especially in relation to the assessment of ecosystem health. While management decisions should be evidence-based, current population and habitat models fail to incorporate fish behaviour and the interactions between fish and their environment, limiting their ability to predict management-relevant population responses to environmental change. To address these weaknesses, an individual-based model (IBM) is developed and parameterised to predict the distribution and growth of roach (Rutilus rutilus), a common, generalist freshwater fish known to be typically dominant in heavily modified rivers. Such a model seeks to build on current management models and practices, with emphasis on improving recruitment of juvenile roach. Virtual forager parameters are derived from foraging experiments, published investigations, and models of roach behaviour and bioenergetics. Data collected from field studies in a typical, highly modified lowland river are used to describe the environment and initial fish population, with subsequent data on fish population trends used to validate the IBM under a pattern-oriented modelling approach, specifically against growth rate and habitat distribution patterns. River management practices, including the removal of in-stream aquatic macrophytes and the regulation of the flow regime for flood risk management, are predicted to be potentially damaging to roach recruitment, subsequent year-class strength and, therefore, populations in subsequent years. Recommendations for more sympathetic management schemes are provided.
The modelling framework described here can be used to produce robust predictions of roach population patterns in riverine habitats and allows the user to test the impact of environmental change on cyprinid fish, enabling the modelling system to be used to develop proactive, evidence-based management in light of current rates of environmental change.
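As a rough illustration of the individual-based approach described above — emphatically not the thesis's parameterisation; the patch names, intake rates, metabolic cost and growth rule below are all invented — each virtual forager can be given a simple rule for choosing habitat and converting net energy intake into growth:

```python
# Hypothetical habitat patches with net energy intake per time step.
PATCHES = {"margin": 2.0, "macrophyte": 3.5, "open": 1.0}
COST = 0.8  # metabolic cost per step (invented figure)

class Roach:
    """A minimal virtual forager: move to the best patch, grow on the surplus."""
    def __init__(self, mass=1.0):
        self.mass = mass
        self.patch = "open"

    def step(self):
        # Forager rule: occupy the patch maximising net intake.
        self.patch = max(PATCHES, key=PATCHES.get)
        # Convert net energy surplus into mass (arbitrary conversion factor).
        self.mass += 0.01 * (PATCHES[self.patch] - COST)

population = [Roach() for _ in range(100)]
for _ in range(50):
    for fish in population:
        fish.step()

print(population[0].patch)  # the whole population converges on the best patch
```

In a sketch like this, simulating macrophyte removal (deleting the `"macrophyte"` patch) immediately lowers attainable growth, which is the kind of management-relevant population response the real IBM is built to predict from properly derived parameters.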

Exploring the computational potential of simple chemical reactions

Jahan, I. January 2014 (has links)
A large number of human activities rely on conventional computing devices producing huge quantities of data, and these “traditional computers” may eventually fail to meet such demands. There is therefore a need to develop novel computing paradigms; this project aims to fabricate and explore the potential of novel computing devices based on the space-time dynamics of travelling waves in non-linear media. The designed prototype computing devices are all experimentally implemented in chemical reaction-diffusion media. A light-sensitive Belousov-Zhabotinsky (BZ) reaction was used to construct a number of logic gates and arithmetic circuits. A 1-bit half adder was constructed using the collisions of wave fragments within channels in a weakly excitable analogue of the BZ reaction. The excitability of the reaction is controlled by altering the light levels projected onto an immobilised light-sensitive catalyst within an open reactor fed with fresh BZ reagents. This approach was extended by projecting a series of interconnected discs of differing connection weight and size. Using this approach an inverter gate, an AND gate, a NAND gate, an XNOR gate, an XOR gate and a diode were created, in addition to a compact 1-bit half adder circuit and memory circuits. Using an excitable BZ analogue, a 4-bit input, 2-bit output integer square root circuit has been implemented. This utilises the principle of constant-speed wave propagation and the annihilation of colliding wave fronts, coupled with light-controlled valves. The light-sensitive BZ reaction was also used as a substrate for exploring the potential of applying co-evolutionary algorithms coupled with memory to control the dynamics in order to solve specific computational tasks. It was shown that learnt solutions from simulation experiments could be applied directly to experimental systems. A “gas free” cyclohexanedione analogue of the BZ reaction was encapsulated in 3-D lipid-stabilised vesicles.
The transfer of excitation between adjacent vesicles could be induced by altering the reaction chemistry. It was also possible to selectively initiate waves using lasers at selected wavelengths. Light activation is important because it will enable initiation at required positions within computational schemes. In addition, there is a need to study pattern formation in simple inorganic systems in order to gain a better understanding of pattern formation and its control, so as to synthesise functional materials and implement computation by utilising inherent self-assembly mechanisms. Therefore, a simple reaction between aluminium chloride and sodium hydroxide was studied and a phase diagram of the reaction was constructed. A controllable region was found where circular waves, target waves, cardioid-like double spirals, simple Voronoi and additively weighted Voronoi diagrams could be constructed. In addition, a group of simple chemical reactions capable of geometric calculations (generalised and weighted Voronoi diagrams) is presented. Drops of metal ion solution were placed on either potassium ferrocyanide or ferricyanide gel to construct complex tessellations of the plane (sometimes calculating multiple Voronoi diagrams in parallel). The reactions were utilised to reconstruct natural tessellations such as those observed in animal coat patterns, so these simple reactions may be valuable in helping to understand natural pattern formation.
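The chemical Voronoi computations described above exploit the fact that reaction fronts spread at roughly constant speed, so each point in the plane is claimed by whichever source's front arrives first; giving a source a head start yields the additively weighted variant. A discrete sketch of that principle (the grid size, source positions and weights are arbitrary examples, not data from the experiments):

```python
import math

def voronoi(width, height, sources):
    """Label each grid cell with the index of the source whose wavefront
    arrives first. sources is a list of (x, y, weight); a larger weight
    acts as a head start, shifting the boundary away from that source."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            arrival = lambda s: math.hypot(x - s[0], y - s[1]) - s[2]
            row.append(min(range(len(sources)),
                           key=lambda i: arrival(sources[i])))
        grid.append(row)
    return grid

# Two sources; the second has a head start of 2, so the boundary between
# them sits closer to the first source than the unweighted midpoint.
grid = voronoi(9, 5, [(1, 2, 0.0), (7, 2, 2.0)])
for row in grid:
    print("".join(str(cell) for cell in row))
```

In the chemical implementation the "arrival time" is literal: drops placed earlier, or reactions initiated with a larger initial perturbation, claim more of the gel before neighbouring fronts annihilate against them.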

Identification of volatile compounds as indicators of spoilage insects and mites in grain and flour

Abuelnnor, N. January 2013 (has links)
Solid-phase micro-extraction (SPME) fibres and automated thermal desorption (ATD) tubes were used with gas chromatography-mass spectrometry (GC-MS) to investigate volatile organic compounds (VOCs) secreted into the headspace gas by adults and larvae of the confused flour beetle Tribolium confusum du Val, the saw-toothed grain beetle Oryzaephilus surinamensis (L.), the grain weevil Sitophilus granarius (L.), the flour mite Acarus siro L. and the storage mite Glycyphagus domesticus (De Geer). A number of VOCs were specifically linked with the confused flour beetle T. confusum and not observed with the other insect and mite species; for instance, 2-methyl-p-benzoquinone and 2-ethyl-p-benzoquinone. However, this study found that benzoquinones were not present in the headspace volatiles of the larvae of the confused flour beetle, whereas 1-pentadecene was detected in both adults and larvae. The volatile extraction experiments with the saw-toothed grain beetle O. surinamensis showed that several volatiles were detected in its adults and larvae but not in the other insect and mite species. Some of these VOCs were detected only in adults: butanoic acid 1-methylethyl ester and 1,3-octadiene. The others, found in both larvae and adults, were 2,4-hexadien-1-ol, octane, 3-methylbutanoic acid, 2,5-dimethylpyrazine, 1-octen-3-one and 1-octen-3-ol. Analysis of the headspace from the grain weevil S. granarius showed three VOCs specifically linked to weevils compared to the other species: mequinol, 2-methylpropanoic acid and 2-pentanone. The known mite-produced alarm pheromone citral (comprising (Z)-3,7-dimethyl-2,6-octadienal and (E)-3,7-dimethyl-2,6-octadienal) was detected in both mite species, the flour mite A. siro and the storage mite G. domesticus, but not in the insect species.
This study has identified a number of VOCs which may enable the early detection of storage insects and mites in grain stores. Many of the VOCs identified were specific to either mites or one of the insect species studied which could enable specific identification of the cause of infestation.
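The species-specific markers reported above lend themselves to a simple lookup step in an automated monitoring system. The sketch below is hypothetical: the marker sets are abbreviated from the abstract, and a real system would need detection thresholds, instrument handling and confirmatory analysis.

```python
# Abbreviated marker VOC sets per species, taken from the abstract above.
MARKERS = {
    "T. confusum (adult)": {"2-methyl-p-benzoquinone", "2-ethyl-p-benzoquinone"},
    "O. surinamensis": {"2,5-dimethylpyrazine", "1-octen-3-ol", "1-octen-3-one"},
    "S. granarius": {"mequinol", "2-methylpropanoic acid", "2-pentanone"},
    "mites (A. siro / G. domesticus)": {"citral"},
}

def identify(detected_vocs):
    """Rank candidate species by how many of their marker VOCs were
    detected; return the best match, or None if no marker was seen."""
    scores = {species: len(markers & detected_vocs)
              for species, markers in MARKERS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

# A headspace run detecting weevil markers plus an unrelated background VOC.
print(identify({"mequinol", "2-pentanone", "hexanal"}))  # → S. granarius
```

Because many of the markers are unique to one species, even this naive set intersection discriminates the cause of an infestation; ambiguous cases (overlapping or missing markers) would fall back to the confirmatory GC-MS workflow.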

User participation in the design and development of Web 2.0 technologies for people with learning difficulties

Fanou, S. K. January 2014 (has links)
In the twenty-first century, people with learning difficulties (PWLD) still face oppression, discrimination and exclusion from the mainstream of social life. Over recent decades, the policy of the United Kingdom (UK) government and activist organisations regarding people with learning difficulties has focused on enabling inclusion, ensuring rights, providing choice and developing advocacy and independence. People with learning difficulties have been moved out of institutions with the intention that they be included and respected as equal members of society. During the same decades in which the government and activist organisations have been striving for the inclusion and equality of people with learning difficulties, the use of Information Technology (IT) has become pervasive, to the degree that it is almost impossible for individuals to function successfully in society without access to it. Unfortunately, most IT is not designed to be usable and accessible to people with learning difficulties, and this is a major barrier to their social inclusion. Participatory Design (PD) methodologies, which emphasise end-user involvement in the software development process, are widely considered the key to system usability and accessibility. However, most researchers and software developers believe that people with learning difficulties are not capable of participating in the development process as a result of their disabilities. Others report that they do not know how to work with this specific group of disabled end-users. This discriminatory behaviour is a major reason why IT remains inaccessible to people with learning difficulties. The study described in this thesis combined Evolutionary Prototyping, a software development methodology, with Participatory Action Research (PAR), a social science methodology, in order to involve a cohort of four Health Trainers with learning difficulties in the development of a Web 2.0 based system.
The aims of the study were to explore how people with learning difficulties could be involved in the development of a software system and if they could use a system developed with their participation. A further aim was to explore how software developers can approach the field of Learning Disability, the issues they will face and how those issues can be overcome. Qualitative data was gathered during fourteen Participatory Action Research meetings, in which the Health Trainers were involved in research, software development and system use. The data was analysed using Thematic Content Analysis facilitated by the use of the NVivo software package. The findings were validated by the participating Health Trainers. The findings suggest that during software development participation, the Health Trainers faced a number of challenges. However, the Health Trainers indicated the type of support they needed from the researcher in order to overcome them. The support required was easy to provide and the Health Trainers managed to engage in the software development process. The study conducted a system use evaluation to explore if the developed system was usable and accessible to the Health Trainers. The Health Trainers managed to complete all the system tasks posed to them during the evaluation. This suggests that the developed system was usable and accessible to the Health Trainers. Further evidence suggests that a number of factors affected the participation of the Health Trainers during development and during the use of the system. Finally, the study explored how the developed system was used over the long run, in a period of eighteen months. The findings suggest that system use over time was affected by factors other than the system’s accessibility and usability. 
In conclusion, the findings suggest that, with easy-to-provide support, the Health Trainers with learning difficulties could be involved in software development and could use a system developed with their participation. It is hoped that the findings will be used by policy makers and advocacy groups to make the case for convincing researchers and software developers to involve more people with learning difficulties in software development, thus making systems accessible to this community of end-users.

The influence of regulatory T cells and mobilisation regimens on graft versus malignancy, graft versus host disease and relapse in haematopoietic progenitor cell transplantation

Wells, Janet Catherine January 2015 (has links)
Regulatory T cells (Tregs) are key players in controlling immune responses, limiting autoimmune disease and allergies, and attenuating immune responses to tumours and pathogens. Understanding and harnessing the suppressive effects of Tregs in autologous and allogeneic haematopoietic progenitor cell (HPC) transplantation presents a significant challenge, owing to the lack of consensus over optimal markers to uniquely identify Tregs and to variation in centre-specific factors including disease mix, conditioning regimens, graft origin and manipulation, and the prophylaxis and treatment of graft versus host disease (GVHD). This study aimed to determine whether CD3+CD4+CD25highCD127lowFoxP3+ Treg quantification, assessed flow cytometrically in grafts or in the post-transplant peripheral blood of patients transplanted for malignant disease, could provide a useful predictor of disease relapse in autologous (n=85) and allogeneic (n=75) patients, and of falling chimerism and/or incidence of GVHD in the latter group. Treg numbers were quantified in HPC harvests, in transplant grafts and in recipients’ peripheral blood during immune reconstitution. Additionally, a simplified Treg assessment protocol using the marker tumour necrosis factor receptor-2 (TNFR2) with CD3, CD4 and CD25 was assessed. In autologous donors, significantly higher Treg numbers relative to CD34+ HPCs were noted in harvests mobilised with the more novel regimen of granulocyte-colony stimulating factor (G-CSF) plus Plerixafor than with G-CSF alone or in combination with cyclophosphamide. In allogeneic harvests, Treg numbers following G-CSF mobilisation were significantly lower than in non-mobilised harvests. Lower absolute Treg numbers in donor lymphocyte infusion (DLI) doses were significantly associated with successful outcome in terms of restoration of donor chimerism and resolution of relapse.
Cryopreservation of mobilised cells at the time of initial transplant, for later use as DLI, has thus been incorporated into practice at this Trust, as this is expedient in terms of clinical result, convenience and cost. Interestingly, although mobilisation regimens influenced Treg levels in harvests, no correlation was apparent between transplanted Treg doses, or peripheral blood Treg levels during immune reconstitution, and outcome following autologous or allogeneic transplantation, or with falling chimerism and/or the incidence and severity of GVHD in allogeneic patients during the first year post transplant. Extending this follow-up time would be an interesting area for further study, as the majority of patients who relapse do so beyond one year.

An intelligent knowledge based cost modelling system for innovative product development

Shehab, Esam January 2001 (has links)
This research work aims to develop an intelligent knowledge-based system for product cost modelling and design for automation at an early design stage of the product development cycle, enabling designers/manufacturing planners to make more accurate estimates of product cost and, consequently, to respond more quickly to customers’ expectations. The main objectives of the research are to: (1) develop a prototype system that assists an inexperienced designer to estimate the manufacturing cost of the product, (2) advise designers on how to eliminate design and manufacturing related conflicts that may arise during the product development process, (3) recommend the most economic assembly technique for the product, so that this technique can be considered during the design process, and provide design improvement suggestions to simplify the assembly operations (i.e. to provide an opportunity for designers to design for assembly (DFA)), (4) apply a fuzzy logic approach to certain cases, and (5) evaluate the developed prototype system through five case studies. The developed cost modelling system comprises a CAD solid modelling system, a material selection module, a knowledge-based system (KBS), a process optimisation module, a design for assembly module, a cost estimation technique module, and a user interface. In addition, the system encompasses two types of database, permanent (static) and temporary (dynamic), categorised into five separate groups: Feature, Material, Machinability, Machine, and Mould databases. The system development process passed through four major steps: first, constructing the knowledge-based and process optimisation system; second, developing a design for assembly module; third, integrating the KBS with both the material selection database and a CAD system.
Finally, developing and implementing a fuzzy logic approach to generate reliable cost estimates and to handle the uncertainty in the cost estimation model that cannot be addressed by traditional analytical methods. Besides estimating the total cost of a product, the developed system has the capability to: (1) select a material, as well as the machining processes, their sequence and machining parameters, based on a set of design and production parameters that the user provides to the system, and (2) recommend the most economic assembly technique for a product and provide design improvement suggestions, in the early stages of the design process, based on a design feasibility technique. It provides recommendations when a design cannot be manufactured with the available manufacturing resources and capabilities. In addition, a feature-by-feature cost estimation report can be generated to highlight the features of high manufacturing cost. The system can be applied without the need for detailed design information, so it can be implemented at an early design stage; consequently, costly redesign and longer lead times can be avoided. One of the tangible advantages of this system is that it warns users of features that are costly and difficult to manufacture. In addition, the system is developed in such a way that users can modify the product design at any stage of the design process. This research dealt with cost modelling of both machined components and injection moulded components. The developed cost-effective design environment was evaluated on real products, including a scientific calculator, a telephone handset, and two machined components. Conclusions drawn from the evaluation indicated that the developed prototype system could help companies reduce product cost and lead time by estimating the total product cost throughout the entire product development cycle, including assembly cost.
Case studies demonstrated that designing a product using the developed system is more cost-effective than using traditional systems. The cost estimated for a number of products used in the case studies was almost 10 to 15% less than that estimated by the traditional system, since the latter takes into consideration neither process optimisation, design alternatives, nor design for assembly issues.
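As a flavour of the fuzzy logic approach mentioned in objective (4) — with membership functions, rules and cost multipliers invented for illustration rather than taken from the thesis's system — a qualitative complexity score can be mapped to a crisp cost multiplier like this:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cost_factor(complexity):
    """Map a part-complexity score (0..10) to a cost multiplier.
    Each fuzzy rule pairs a membership degree with a crisp output;
    defuzzification is the weighted average (a Sugeno-style step)."""
    low  = tri(complexity, -1, 0, 5)    # "simple part"
    med  = tri(complexity, 2, 5, 8)     # "moderately complex part"
    high = tri(complexity, 5, 10, 11)   # "complex part"
    rules = [(low, 1.0), (med, 1.5), (high, 2.5)]
    return sum(w * v for w, v in rules) / sum(w for w, v in rules)

print(round(cost_factor(5.0), 2))  # → 1.5
```

The appeal of this style of model in early-stage costing is exactly what the abstract describes: it tolerates vague inputs ("fairly complex") and still yields a usable numeric estimate, where a traditional analytical model would demand detailed design data that does not yet exist.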

The use of key performance indicators for external quality control in histological dissection

Griffiths, Matthew Paul January 2016 (has links)
The recent reports into standards in the NHS (Francis, 2013) and quality in pathology (Barnes, 2014) have focused scrutiny on the way in which we work in pathology and how we can provide assurance that this work is of a sufficiently high standard. There are a number of external quality assurance schemes covering pathology generally and histopathology specifically (UKNEQAS); however, there is no scheme of any kind covering the process of histological surgical dissection. This area is currently undergoing development, changing from the sole preserve of medically qualified pathologists to one utilising a number of highly trained biomedical scientists. The concept of biomedical scientist led dissection has been around for some years, being codified into a document of best practice in 2005 (IBMS, 2005). This document placed the responsibility for the work of the BMS in the hands of the pathologist, without structured oversight or a quality assurance programme. Ten years on, specimen dissection, including the developing area of BMS led dissection, remains without any formal, structured form of quality assurance. This work builds on the points made by Barnes (2014), taking the guidance of the RCPath and IBMS to form a series of key performance indicators (KPIs) in relation to specimen dissection. These KPIs can be used as indicators of individual practice, highlighting areas of variation, weakness or strength. Once these are identified, a tailored feedback and training event provides an opportunity to address errors and omissions, or to enable areas of strength to be shared. The results of this thesis show that introducing training events drastically reduces variation by, and between, dissectors, driving up standards in the department.
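A KPI comparison of the kind described can be sketched as follows. Everything here is invented for illustration: the dissector labels, the choice of lymph node yield per specimen as the KPI, the figures, and the 1.5-standard-deviation threshold (deliberately loose for such a tiny sample; a real scheme would use robust statistics over many cases).

```python
import statistics

# Hypothetical KPI: mean lymph node yield per colorectal specimen,
# one figure per dissector over a reporting period.
yields = {"A": 14.1, "B": 13.6, "C": 8.2, "D": 14.8, "E": 13.9}

mean = statistics.mean(yields.values())
sd = statistics.stdev(yields.values())

# Flag dissectors whose KPI sits more than 1.5 SD from the group mean,
# marking them for a tailored feedback and training event.
flagged = [d for d, v in yields.items() if abs(v - mean) > 1.5 * sd]
print(flagged)  # → ['C']
```

The point of the thesis is the loop that follows the flag: targeted feedback and training for the outlier, then re-measurement, which is what was shown to drive down variation between dissectors.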

Ecosystem service provision in dynamic heath landscapes

Cordingley, Justine E. January 2012 (has links)
Conservation policy and management are undergoing a step-change, moving from focusing conservation resources on individual sites, such as protected areas, to including the wider landscape. Landscape-scale initiatives may focus on managing the entire landscape, or they may manage particular sites while attempting to address landscape-scale patterns and processes such as habitat fragmentation. Whilst there is a vast body of research investigating the impacts of habitat fragmentation on individual species, much less is known about its impacts on ecological processes, for example woody succession. Woody succession has particular implications for conservation management, as it drives ecosystem dynamics that can alter the value of the habitat for species of conservation concern. At the same time, there is a move to incorporate ecosystem service protection into conservation policy. Understanding the synergies and trade-offs between biodiversity conservation and ecosystem service provision is therefore an important priority. Few studies have examined the influence of habitat fragmentation on woody succession and, in turn, the impact of woody succession on the value of the habitat for both biodiversity conservation and ecosystem service provision. In addition, there is still very little evidence on the extent to which areas managed for biodiversity conservation also provide ecosystem services. There is a need to understand how management approaches aimed at increasing the biodiversity value of conservation areas will impact ecosystem services, particularly at the level of the landscape. This thesis aimed to explore all these themes in the Dorset lowland heathlands, UK. The Dorset lowland heathlands are highly fragmented and a priority habitat for nature conservation because they are rare and threatened and support a characteristic flora and fauna.
The main threat to this habitat is now woody succession. Without conservation management, the characteristic dwarf shrub heath undergoes succession and is replaced by scrub and woodland. The objectives of this thesis were to (1) assess the impact of fragmentation on the process of succession on lowland heathlands and quantify lowland heathland vegetation dynamics; (2) determine biodiversity and ecosystem service values of major cover types along a successional gradient on lowland heathlands and assess how trade-offs and synergies between biodiversity and ecosystem service provision vary along this gradient and (3) explore how alternative management approaches aimed at increasing the biodiversity value of lowland heathlands impact ecosystem service provision. Fragmentation was found to promote succession with smaller heaths undergoing succession faster than larger heaths. Trade-offs were found between biodiversity value and ecosystem service provision. Biodiversity value was highest in heath habitats and lowest in woodland. Carbon storage, aesthetic value and timber value were highest in woodland. However, recreation value was associated with heathland habitats and not woodland. Conservation management for biodiversity increased the biodiversity value of lowland heaths but not the provision of ecosystem services.
