
The influence of regulatory T cells and mobilisation regimens on graft versus malignancy, graft versus host disease and relapse in haematopoietic progenitor cell transplantation

Wells, Janet Catherine January 2015 (has links)
Regulatory T cells (Tregs) are key players in controlling immune responses, limiting autoimmune disease and allergies, and attenuating immune responses to tumours and pathogens. Understanding and harnessing the suppressive effects of Tregs in autologous and allogeneic haematopoietic progenitor cell (HPC) transplantation presents a significant challenge, owing to the lack of consensus over optimal markers to uniquely identify Tregs and to variation in centre-specific factors including disease mix, conditioning regimens, graft origin and manipulation, and the prophylaxis and treatment of graft versus host disease (GVHD). This study aimed to determine whether CD3+CD4+CD25highCD127lowFoxP3+ Treg quantification, assessed flow cytometrically in grafts or in the post-transplant peripheral blood of patients transplanted for malignant disease, could provide a useful predictor of disease relapse in autologous (n=85) and allogeneic (n=75) patients, and of falling chimerism and/or the incidence of GVHD in the latter group. Treg numbers were quantified in HPC harvests, in transplant grafts and in recipients' peripheral blood during immune reconstitution. Additionally, a simplified Treg assessment protocol using the marker tumour necrosis factor receptor-2 (TNFR2) with CD3, CD4 and CD25 was assessed. In autologous donors, significantly higher Treg numbers relative to CD34+ HPCs were noted in harvests mobilised with the more novel regimen of granulocyte-colony stimulating factor (G-CSF) plus Plerixafor than with G-CSF alone or in combination with cyclophosphamide. In allogeneic harvests, Treg numbers following G-CSF mobilisation were significantly lower than in non-mobilised harvests. Lower absolute Treg numbers in donor lymphocyte infusion (DLI) doses were significantly associated with successful outcome in terms of restoration of donor chimerism and resolution of relapse.
Cryopreservation of mobilised cells at the time of initial transplant, for later use as DLI, has therefore been incorporated into practice at this Trust, as it is expedient in terms of clinical result, convenience and cost. Interestingly, although mobilisation regimens influenced Treg levels in harvests, no correlation was apparent between either the Treg dose transplanted or peripheral blood Treg levels during immune reconstitution and relapse after autologous or allogeneic transplantation, or with falling chimerism and/or the incidence and severity of GVHD in allogeneic patients during the first year post transplant. Extending this follow-up time would be an interesting area of further study, as the majority of patients who relapse do so beyond one year.

An intelligent knowledge based cost modelling system for innovative product development

Shehab, Esam January 2001 (has links)
This research work aims to develop an intelligent knowledge-based system for product cost modelling and design for automation at an early stage of the product development cycle, enabling designers and manufacturing planners to make more accurate estimates of product cost and, consequently, to respond more quickly to customers' expectations. The main objectives of the research are to: (1) develop a prototype system that assists an inexperienced designer to estimate the manufacturing cost of a product, (2) advise designers on how to eliminate design and manufacturing related conflicts that may arise during the product development process, (3) recommend the most economic assembly technique for the product, so that this technique can be considered during the design process, and provide design improvement suggestions to simplify the assembly operations (i.e. provide an opportunity for designers to design for assembly (DFA)), (4) apply a fuzzy logic approach to certain cases, and (5) evaluate the developed prototype system through five case studies. The developed cost modelling system comprises a CAD solid modelling system, a material selection module, a knowledge-based system (KBS), a process optimisation module, a design for assembly module, a cost estimation module and a user interface. In addition, the system encompasses two types of database, permanent (static) and temporary (dynamic), categorised into five separate groups: a feature database, a material database, a machinability database, a machine database and a mould database. The system development process passed through four major steps: first, constructing the knowledge-based and process optimisation system; second, developing a design for assembly module; third, integrating the KBS with both the material selection database and a CAD system.
Finally, a fuzzy logic approach was developed and implemented to generate reliable cost estimates and to handle the uncertainty in the cost estimation model that cannot be addressed by traditional analytical methods. Besides estimating the total cost of a product, the developed system has the capability to: (1) select a material, as well as the machining processes, their sequence and the machining parameters, based on a set of design and production parameters that the user provides to the system, and (2) recommend the most economic assembly technique for a product and provide design improvement suggestions in the early stages of the design process, based on a design feasibility technique. It provides recommendations when a design cannot be manufactured with the available manufacturing resources and capabilities. In addition, a feature-by-feature cost estimation report can be generated by the system to highlight the features of high manufacturing cost. The system can be applied without the need for detailed design information, so it can be implemented at an early design stage; consequently, costly redesign and longer lead times can be avoided. One of the tangible advantages of this system is that it warns users of features that are costly and difficult to manufacture. In addition, the system is developed in such a way that users can modify the product design at any stage of the design process. This research dealt with cost modelling of both machined components and injection moulded components. The developed cost-effective design environment was evaluated on real products, including a scientific calculator, a telephone handset and two machined components. Conclusions drawn from the evaluation indicated that the developed prototype system could help companies to reduce product cost and lead time by estimating the total product cost, including assembly cost, throughout the entire product development cycle.
Case studies demonstrated that designing a product using the developed system is more cost effective than using traditional systems. The cost estimated for a number of products used in the case studies was some 10 to 15% lower than that estimated by the traditional systems, since the latter do not take into consideration process optimisation, design alternatives or design for assembly issues.
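The abstract does not spell out the fuzzy logic formulation used for cost estimation. Purely as an illustration of the general idea, and not the thesis's actual model, a minimal sketch might map a machining-complexity score to a cost multiplier; every membership function, rule and number below is hypothetical.

```python
# Hypothetical sketch of a fuzzy cost estimator: fuzzify a complexity
# score into linguistic terms, then combine rule outputs by weighted
# average (Sugeno-style). All shapes and numbers are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_cost_multiplier(complexity):
    """Map a 0-10 machining-complexity score to a cost multiplier."""
    # Fuzzify the input into three overlapping linguistic terms.
    low    = tri(complexity, -1, 0, 5)
    medium = tri(complexity, 2, 5, 8)
    high   = tri(complexity, 5, 10, 11)
    # Rule base: each term implies a crisp output multiplier.
    rules = [(low, 1.0), (medium, 1.5), (high, 2.5)]
    num = sum(w * m for w, m in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 1.0

base_cost = 120.0          # hypothetical base machining cost
for score in (1, 5, 9):
    print(score, round(base_cost * fuzzy_cost_multiplier(score), 2))
```

The weighted-average defuzzification keeps the sketch short; a Mamdani-style centroid over output fuzzy sets would be the heavier alternative.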

Improving health service operational performance in the State of Qatar

Al-Nabit, Nasser January 2012 (has links)
Improving patient safety and quality of care are among the most important challenges facing hospital executives, not least because of recent changes in hospital reimbursement. The usual tactics are to reduce adverse medical treatment conditions, ensure proper drug dosage, and develop and enforce standards of care. While these efforts are important, the research here considers one of the less obvious but critical operational issues: variation in patient flow. Patient flow and the reduction of waiting times are subjects of great discussion across all healthcare providers, and underpin many waiting-time improvement techniques. In healthcare, flow is the movement of patients, information or equipment between departments, staff groups or organisations as part of a care pathway; ideally, patients would move from one stage of their care to the next without delay. Patient flow through an Emergency Department (ED) is complex, with several changes in the pathway produced by decisions made at various points. Variation in patient flow, however, occurs throughout a hospital and contributes to a range of problems: increasing demand, extended waiting times, overcrowding and boarding in the ED, bumped and late surgeries, a lack of available beds, overburdened nurses and exasperated physicians, together with a need for interactive care pathways and solutions that measure the different types of variability affecting patient flow and capacity management. Not only does this threaten patient safety and quality of care, but it also causes enormous stress in the institution, making it increasingly difficult for hospitals to recruit and retain physicians and nurses. Healthcare delivery improvement techniques such as Lean allow providers to improve systems by reducing waste and highlighting the non-value-added activities in a process.
There are many possible process improvement solutions that can only be validated by implementation and by understanding the whole process. Healthcare providers are therefore turning to innovative technologies such as patient flow simulation in order to identify non-value-added activities and bottlenecks, and to test Lean and more radical solutions before actually implementing them. This research aims to show that adding simulation to such radical thinking can improve the efficiency of patient flow by reducing waiting times and achieving the best utilisation of resources. Patient flow simulation allows the benefits of process changes to be seen immediately, with real-time data and graphical visualisations. A novel approach is introduced within this research through the integration of simulation modelling techniques with Taguchi analysis to investigate 'what if' patient variation scenarios. Such scenarios reflect the different combinations of variability that can affect waiting time in one patient flow and/or in several flows at once. In addition, the research has adopted the concept of lean thinking to develop an automated Decision Operations Heuristic Analysis-Quick Response (DOHA-QR) model. The model acts in an iterative manner, investigating the different levels of variability by identifying constraints that may affect waiting times and other performance outputs. Optimisation routines have been developed to make the best of senior medical resource scheduling: achieving the best utilisation, reducing waiting times and reacting to the different levels of variability. The developed method therefore: (i) concentrates on using customer and resource information to improve the quality of services while improving the productivity of the resources used in the ED; (ii) has principles that can be extended and applied to different healthcare departments and other types of service-granting organisations;
(iii) integrates with the simulation model, which provides a real-time picture of the patient flow system's load, condition and behaviour, i.e. the different levels of variability, acting as a quick-response tool that can give decision makers in small-business clinics a rationale for structuring the right blend of medical resources, thereby maximising profitability and patient satisfaction; (iv) simulates the different patient acuity flow lines, i.e. emergency, urgent and non-urgent, so that senior medical resources can be best utilised; and (v) additionally looks at moving senior medical resources where and when needed between the different patient acuity flow lines.
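The simulation model itself is not described at code level in the abstract. As an illustration of the kind of 'what if' staffing scenario such a model can answer, a minimal discrete-event sketch follows; all rates, staffing numbers and the single-queue structure are hypothetical, not taken from the thesis.

```python
# Minimal discrete-event sketch of ED patient flow: patients arrive at
# random, queue for one of `n_doctors` senior doctors, and the mean
# waiting time is recorded so staffing scenarios can be compared.
# All parameters are hypothetical.
import heapq
import random

def simulate_ed(n_doctors, arrival_rate, service_rate, n_patients, seed=1):
    rng = random.Random(seed)
    free_at = [0.0] * n_doctors        # time each doctor next becomes free
    heapq.heapify(free_at)
    t, total_wait = 0.0, 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)       # next arrival time
        doctor_free = heapq.heappop(free_at)     # earliest available doctor
        start = max(t, doctor_free)              # patient waits if all busy
        total_wait += start - t
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
    return total_wait / n_patients

# 'What if' scenario: does adding a senior doctor cut waiting times?
for c in (2, 3):
    print(c, "doctors -> mean wait (min):",
          round(simulate_ed(c, arrival_rate=0.3, service_rate=0.2,
                            n_patients=5000), 2))
```

A real model of the kind the thesis describes would add triage classes, multiple flow lines and resource schedules; the point here is only that the simulation exposes waiting time as a measurable output of each scenario.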

Use of lux gene technology to investigate real-time in-situ interactions of bacterial pathogens with a model blood-brain barrier system

Grimshaw, K. L. January 2012 (has links)
Bacterial bioluminescence has been shown to be an accurate real-time reporter of bacterial internalisation and a valid alternative to viable counts. A self-bioluminescent strain of the bacterial pathogen Neisseria meningitidis was used to develop and optimise an internalisation assay with a range of cell lines, including ECV-304, C6 and Caco-2, as an alternative to traditional indirect methods such as viable counting. The use of bioluminescence as a reporter of bacterial internalisation did not produce any robust evidence of internalisation of N. meningitidis C751 pGLITE within cell lines, suggesting that internalisation is short lived, does not occur, or is below the minimum level of detection. The traditional approach of using a gentamicin protection assay followed by viable counts to investigate internalisation of N. meningitidis suggested evidence of very low level internalisation, the reliability and reproducibility of which remain questionable owing to methodological limitations. Although studying bacterial-cell interactions in internalisation assays using monolayers can provide useful insight into bacterial pathogenesis, when infecting the human host bacteria are required to interact with more than a monolayer of cells. The human blood-brain barrier (BBB) is considered to be the main physiological barrier controlling the entry and exit of molecules and bacteria to and from the brain parenchyma. Knowledge of the exact mechanisms by which pathogens invade the brain remains incomplete for two main reasons: most of these pathogens are restricted to humans, limiting the relevance of any animal models, and there are few good in vitro models of the BBB. It has been shown that, upon co-culture with C6 cells, ECV-304 cells demonstrate many of the key features of the BBB in vivo, including an upregulation of endothelial tight junctions.
This continuous co-culture model was further developed and adapted to enhance transendothelial electrical resistance (TEER), indicative of tight junction formation, and adapted for use with bioluminescent strains of bacterial pathogens capable of crossing the BBB and entering the CNS to cause meningitis. Penetration of lux-expressing N. meningitidis, Pseudomonas aeruginosa, Staphylococcus aureus and pathogenic Escherichia coli across the BBB model was studied, and the subsequent effects on TEER and cell viability were also measured. Breaching of the BBB co-culture model by bioluminescent P. aeruginosa was observed after only 6 hours for 10^6 cfu, whereas the other pathogens were not observed to cross the BBB model until in excess of 20 hours. For all pathogenic bacteria tested, BBB penetration was associated with high bacterial numbers, and all pathogens were observed to have a significant effect on TEER; the effect on cell viability was more variable. By combining the use of bioluminescence and a continuous cell culture model of the BBB, further investigation into breaching of the BBB by bacterial pathogens could be made that would be difficult or impossible to carry out in vivo. This study has resulted in the development of a robust system which has the potential to significantly enhance our understanding of bacterial translocation of the BBB and of possible preventative measures.

Multimodal e-feedback : an empirical study

Alharbi, Abdulrhman Abdulghani January 2013 (has links)
This thesis investigated the applicability of unique combinations of multimodal metaphors to deliver different types of feedback, and evaluated the effect of these combinations on the usability of electronic feedback interfaces and on users' engagement with learning. The empirical research described in this thesis consists of three experimental phases. In the first phase, an initial experiment was carried out with 40 users to explore and compare the usability of, and users' engagement with, facially animated expressive avatars with text and natural recorded speech, and text with graphics metaphors. The second experimental phase involved an experiment conducted with 36 users to investigate user perception of feedback communicated using an avatar with facial expressions and body gestures, and voice expressions of synthesised speech. This experiment also aimed at evaluating the role that an avatar could play as a virtual tutor in e-feedback interfaces by comparing the usability and engagement of users across three different modes of interaction: video of a tutor presenting information with facial expressions, synthesised spoken messages supported with text, and avatars with facial expressions and body gestures. The third experimental phase introduced and investigated a novel approach to communicating e-feedback, based on the results of the previous experiments, in which speaking avatars delivered feedback with the aid of earcons, auditory icons, facial expressions and body gestures. The results demonstrated the usefulness and applicability of the tested metaphors in enhancing e-feedback usability and in enabling users to attain a better engagement with the feedback. A set of empirically derived guidelines for the design and use of these metaphors to communicate e-feedback is also introduced and discussed.

Policy based runtime verification of information flow

Sarrab, Mohamed Khalefa January 2011 (has links)
Standard security mechanisms such as access control, firewalls and encryption focus only on controlling the release of information; no limitations are placed on controlling the propagation of that confidential information. The principal problem of protecting the confidentiality of sensitive information starts after access is granted. The research described in this thesis belongs to the constructive research field, where 'constructive' refers to knowledge contributions being developed as a new framework, theory, model or algorithm. The methodology of the proposed approach is made up of eight work packages: one addresses the research background and the research project requirements, six are scientific research work packages, and the last concentrates on writing up the thesis. There is currently no monitoring mechanism for controlling information flow during runtime that supports behaviour configurability and user interaction. Configurability is an important requirement because what is considered secure today can be insecure tomorrow. Interaction with users is very important in a flexible and reliable security monitoring mechanism because different users may have different security requirements; interaction with the monitoring mechanism enables the user to change program behaviour, or to modify the way information flows, while the program is executing. One of the motivations for this research is to put the information flow policy in the hands of the end user. The main objective of this research is to develop a usable security mechanism for controlling information flow within a software application during runtime, where usable security refers to enabling users to manage their systems' security without defining elaborate security rules before starting the application.
Security is achieved by an interactive process in which the framework queries the user for security requirements for specific pieces of information that are made available to the software, and then continues to enforce these requirements on the application using a novel runtime verification technique for tracing information flow. The main achievement of this research is such a usable security mechanism for controlling information flow within a software application during runtime. The contributions are as follows. Runtime monitoring: the proposed runtime monitoring mechanism ensures that the program execution contains only legal flows, i.e. those defined in the information flow policy or approved by the user. Runtime management: the behaviour of a program that is about to leak confidential information is altered by the monitor according to the user's decision. User interaction control: the user's interaction with the monitoring mechanism during runtime enables them to change the program's behaviour while it is executing.
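The abstract does not specify the thesis's runtime verification technique in detail. A toy taint-tracking sketch conveys the general idea of labels propagating through computation and a monitor checking flows at release points; the class names, operators and policy below are invented for illustration and are not the thesis implementation.

```python
# Illustrative sketch of runtime information flow tracking: confidential
# values carry a taint label that propagates through computation, and a
# monitor checks every release point (sink) against the approved policy
# before letting the flow happen. All names here are hypothetical.

class Tainted:
    def __init__(self, value, label):
        self.value, self.label = value, label   # label is a set of tags
    def __add__(self, other):                   # taint propagates through ops
        if isinstance(other, Tainted):
            return Tainted(self.value + other.value, self.label | other.label)
        return Tainted(self.value + other, self.label)

class Monitor:
    def __init__(self, allowed_labels):
        self.allowed = allowed_labels           # flows the user has approved
    def release(self, data, sink):
        if isinstance(data, Tainted) and not data.label <= self.allowed:
            raise PermissionError(f"blocked flow {data.label} -> {sink}")
        return data.value if isinstance(data, Tainted) else data

monitor = Monitor(allowed_labels={"public"})
salary = Tainted(52000, {"confidential"})
bonus = salary + 3000                           # result stays confidential
try:
    monitor.release(bonus, sink="network")
except PermissionError as e:
    print(e)                                    # the illegal flow is stopped
```

In the interactive scheme the abstract describes, the `PermissionError` branch would instead prompt the user, who could approve the flow or alter the program's behaviour.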

An investigation into a digital forensic model to distinguish between 'insider' and 'outsider'

Al-Morjan, Abdulrazaq Abdulaziz January 2010 (has links)
Attackers use computers and networks to facilitate their crimes against IT systems and to hide their identities, creating new challenges for corporate security investigations. There are two main types of attacker: insiders and outsiders. Insiders are trusted users who have been granted authorised access to an organisation's IT resources in order to execute their job responsibilities, but who deliberately abuse that authorised (i.e. insider) access in order to contravene the organisation's policies or to commit computer crimes. Outsiders gain insider access to an organisation's IT objects through their ability to bypass security mechanisms, without prior knowledge of the insider's job responsibilities; this is an advanced method of attacking an organisation's resources in such a way as to prevent the abnormal behaviour typical of an outsider attack from being detected, and to hide the attacker's identity. For a number of reasons, corporate security investigators face a major challenge in distinguishing between the two types of attack. Not only is there no definitive model of digital analysis for making such a distinction, but there has to date been no intensive research into methods of doing so. Current investigative approaches attempt to identify the difference through three flawed criteria: the location from which an attack is launched, whether the attack comes from within the organisation's area of control, and authorised access. The results of such unsound investigations could render organisations subject to legal action and negative publicity. To address the issue of the distinction between insider and outsider attacks, this research improves upon the first academic forensic analysis model, the Digital Forensic Research Workshop (DFRWS) model [63]. The outcome of this improvement is the creation of a Digital Analysis Model for Distinction between Insider and Outsider Attacks (DAMDIOA), a model that improves both the analysis phase of the investigation and the decision process.
This improvement is effected by two types of proposed decision: fixed and tailored. The first is based on a predetermined logical condition, the second on the proportion of suspicious activity. The advantage of the latter is that an organisation can adjust its threshold of tolerance for such activity based on its level of concern for the type of attack involved. This research supports the possibility of distinguishing between insider and outsider attacks by running a network simulation that carried out a number of email attack experiments to test DAMDIOA. It found that, when DAMDIOA used predetermined decisions based on legitimate activities, it was able to differentiate the type of attack in seven of the eight experiments conducted; it was the tailored decisions, with threshold levels Th = 0.2 and 0.3, that conferred the ability to make such distinctions. When legitimate activities, including users' job responsibilities, were compared with the current methods of distinguishing between insider and outsider attacks, the criterion of authorised access failed three times to make the distinction; this method is useless when there is a blank or shared password. Both the location from which an attack was launched and attack from within an organisation's area of control failed five times to differentiate between such attacks, and there are no substantive differences between these methods. The single instance in which the proposed method failed to make the distinction arose because the number of legitimate activities equalled the number of suspicious ones. DAMDIOA has been used by two organisations to deal with the misuse of their computers, in both cases located in open areas and weakly protected by easily guessed passwords; IT policy had been breached and two accounts moved from the restricted to the unlimited Internet policy group.
This model was able to identify the insiders concerned by reviewing recorded activities and linking them with the insiders' job responsibilities. It also highlights users' job responsibilities as a valuable source of forensic evidence that may be used to distinguish between insider and outsider attacks. DAMDIOA may help corporate security investigators identify suspects accurately and avoid incurring financial loss for their organisations. This research also recommends many improvements to the process by which user activities are collected before an attack takes place, thereby enabling the distinction to be better drawn, and proposes the creation of a physical and logical log management system, a centralised database for all employee activities that would reduce organisations' financial expenditure. Suggestions are also made for future research: to classify legitimate and suspicious activities, evaluate them and identify the important ones, and to standardise the process of identifying and collecting users' job responsibilities. This work would remove some of the limitations of the proposed model.
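The abstract describes the tailored decision as a threshold Th on the proportion of suspicious activity, judged against the account holder's job responsibilities. A hypothetical sketch of that decision rule follows; the thesis's exact formula, activity representation and labels may differ.

```python
# Hypothetical sketch of a tailored decision: classify an attack by the
# proportion of recorded activities that fall outside the account
# holder's job responsibilities, compared against a tunable threshold Th.

def classify_attack(activities, job_responsibilities, threshold=0.3):
    """Return ('outsider'|'insider', ratio) based on the suspicious proportion."""
    suspicious = [a for a in activities if a not in job_responsibilities]
    ratio = len(suspicious) / len(activities)
    # Activity mostly outside the insider's duties suggests a masquerading
    # outsider; activity consistent with the duties suggests an insider.
    if ratio > threshold:
        return "outsider", ratio
    return "insider", ratio

# Invented example data: a payroll clerk's account shows some activity
# well outside the clerk's duties.
duties = {"read_payroll", "update_payroll", "email"}
observed = ["read_payroll", "email", "port_scan", "read_payroll",
            "dump_password_file"]
verdict, ratio = classify_attack(observed, duties, threshold=0.3)
print(verdict, ratio)   # 2 of 5 activities are suspicious -> ratio 0.4
```

Raising the threshold expresses greater tolerance: the same evidence that yields "outsider" at Th = 0.3 yields "insider" at Th = 0.5, which matches the abstract's point that organisations can tune Th to their level of concern.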

Development of a novel toner for electrophotography based additive manufacturing process

Banerjee, Soumya January 2011 (has links)
This thesis presents a feasibility study of producing 3D objects by printing a thermoplastic elastomer using the electrophotography technique and thereafter sintering each whole layer using an infrared light source. The term selective laser printing (SLP) has been coined by the author for this new process. The thesis assesses the feasibility of developing an experimental thermoplastic toner for use with both mono- and dual-component print engines.

Improving project management planning and control in service operations environment

Al-Kaabi, Mohamed January 2011 (has links)
Projects have evidently become the core activity in most companies and organisations, which invest significant amounts of resources in different types of projects, such as building new services and process improvement. This research focuses on the service sector, in an attempt to improve project management planning and control activities; it is concerned in particular with improving the planning and control of software development projects. Existing software development models are analysed and their best practices identified, and these have been used to build the model proposed in this research. The research extends existing planning and control approaches by considering uncertainty in customer requirements, resource flexibility and variability in risk levels. In considering these issues, the research has adopted lean principles for planning and controlling software development projects. A novel approach is introduced within this research through the integration of simulation modelling techniques with Taguchi analysis to investigate 'what if' project scenarios. Such scenarios reflect the different combinations of the factors affecting project completion time and deliverables. In addition, the research has adopted the concept of Quality Function Deployment (QFD) to develop an automated Operations Project Management Deployment (OPMD) model. The model acts in an iterative manner, using 'what if' scenario performance outputs to identify constraints that may affect the completion of a certain task or phase. Any changes made during the project phases automatically update the performance metrics for each software development phase. In addition, optimisation routines have been developed that can be used to provide a management response and to react to the different levels of uncertainty. This research has therefore looked at providing a comprehensive and visual overview of important project tasks, i.e.
progress, scheduled work, the different resources, deliverables and completion, that will make it easier for project members to communicate with each other and to reach consensus on goals, status and required changes. Risk is an important aspect that has also been included in the model in order to avoid failure. The research emphasises customer involvement and top management involvement, as well as team members, as being among the operational factors that escalate variability levels and affect project completion time and deliverables; commitment from everyone can therefore improve the chances of success. Although the role of different project management techniques in implementing projects successfully has been widely established in areas such as the planning and control of time, cost and quality, the distinction between the project and project management is still less than precise, and little has been done to investigate the different levels of uncertainty and risk that may occur during the different project phases.
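The pairing of 'what if' scenarios with Taguchi analysis can be illustrated with a toy example; the three factors, the two-level L4 orthogonal array and the stand-in completion-time model below are all hypothetical, not the OPMD model itself. Each array row is one simulated scenario, and main effects rank the factors' influence on completion time.

```python
# Hedged illustration of Taguchi-style 'what if' analysis: an L4
# orthogonal array varies three two-level project factors, a stand-in
# linear model plays the role of the project simulation, and main
# effects rank which factor matters most. All numbers are hypothetical.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]   # 4 scenarios, 3 factors

def completion_time(req_uncertainty, resource_flex, risk_level):
    """Stand-in for one simulation run (weeks); illustrative only."""
    return 20 + 6 * req_uncertainty - 4 * resource_flex + 3 * risk_level

runs = [completion_time(*row) for row in L4]        # one output per scenario
factors = ["requirements uncertainty", "resource flexibility", "risk level"]
for i, name in enumerate(factors):
    # Average response at the factor's high level minus its low level.
    low  = sum(r for row, r in zip(L4, runs) if row[i] == 0) / 2
    high = sum(r for row, r in zip(L4, runs) if row[i] == 1) / 2
    print(f"{name}: main effect = {high - low:+.1f} weeks")
```

Because the array is orthogonal, four runs are enough to separate the three factors' effects; a full factorial would need eight.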

Directional routing techniques in VANET

Al-Doori, Moath January 2011 (has links)
Vehicular Ad hoc Networks (VANETs) emerged as a subset of the Mobile Ad hoc Network (MANET) paradigm and are considered a substantial approach to the Intelligent Transportation System (ITS). VANETs were introduced to support drivers and improve safety and driving comfort, as a step towards constructing a safer, cleaner and more intelligent environment. At the present time vehicles are equipped with a number of sensors and devices, including On Board Units (OBUs); this enables vehicles to sense situations affecting other vehicles and to manage communications, either by exploiting infrastructure such as the Road Side Unit (RSU), creating a Vehicle to Infrastructure (V2I) pathway, or by interacting directly with other vehicles, creating a Vehicle to Vehicle (V2V) pathway. Owing to the lack of infrastructure and the difficulty of providing comprehensive coverage for all roads, given the high expense associated with installation, the investigation in this research concentrates on the V2V communication type rather than the V2I communication type. Many challenges have emerged in VANET, encouraging researchers to investigate ways of meeting them. Routing protocol issues are considered a critical dilemma that needs to be tackled in VANET, particularly in a sparse environment, by designing an efficient routing mechanism that enhances network performance in terms of disseminating messages to a desired destination, balancing the generated packet overhead on the network and increasing the packet delivery ratio with a reduced time delay. VANET has some unique characteristics compared to MANET; specifically, it features high mobility and movement patterns constrained by roads, which lead to disconnected areas continuously occurring between vehicles, creating a Delay Tolerant Network (DTN).
This works against applying the multi-hop technique properly to deliver a packet to its desired destination. The aim of this thesis comprises two main contributions. The first is developing novel routing protocols for a sparse VANET environment that exploit the mobility feature with the aid of equipped devices such as the Global Positioning System (GPS) and Navigation System (NS). This approach exploits knowledge of the Second Heading Direction (SHD), the next road direction the vehicle intends to take, in order to increase the packet delivery ratio and to increase route stability by decreasing instances of route breakage. It comprises two schemes: the first was designed for a highway scenario, selecting the next-hop node through a filtration process to forward the packet towards the desired destination, while the second was developed for intersection and roundabout scenarios, in order to deliver the packet to a destination whose location is unknown. The formalisation and specification of the VSHDRP has been performed using CCA (Calculus of Context-aware Ambients) in order to evaluate the protocol's behaviour, and the protocol has been validated using ccaPL. In addition, the performance of the VSHDRP has been evaluated using the NS-2 simulator, comparing it with the Greedy Perimeter Stateless Routing (GPSR) protocol to reveal the strengths and weaknesses of the protocol. The second contribution is developing a novel approach to broadcasting the HELLO beacon message adaptively in VANET, based on the node's circumstances (direction and speed), in order to minimise the broadcasting of unnecessary HELLO beacon messages.
A novel architecture has been built around the adaptive HELLO beacon message, clarifying how the OBU components interact with the connected sensors in order to capture any changes in the vehicle's circumstances and take the right decision about the appropriate action. This architecture is based on the concept of a context-aware system, and divides into three main phases: sensing, processing and acting.
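The abstract gives no formula for how the beacon rate adapts to direction and speed. Purely as an illustration of the idea, and with every parameter hypothetical, a beacon interval might shrink as speed or heading change grows, so that stable vehicles broadcast rarely while fast or turning vehicles keep their neighbours' views fresh:

```python
# Illustrative sketch of adaptive HELLO beaconing: the interval between
# beacons shrinks as the vehicle's speed or heading change grows.
# All constants (base interval, scaling divisors, floor) are hypothetical.

def beacon_interval(speed_kmh, heading_change_deg,
                    base=2.0, min_interval=0.2):
    """Seconds until the next HELLO beacon."""
    # Faster vehicles and turning vehicles invalidate neighbours'
    # stored positions sooner, so they need more frequent updates.
    speed_factor = 1.0 + speed_kmh / 50.0
    turn_factor = 1.0 + heading_change_deg / 15.0
    # A floor stops the beacon rate from flooding the channel.
    return max(min_interval, base / (speed_factor * turn_factor))

# Stable urban cruise vs. turning at an intersection:
print(beacon_interval(speed_kmh=30, heading_change_deg=0))    # relaxed rate
print(beacon_interval(speed_kmh=60, heading_change_deg=45))   # rapid updates
```

This is the sensing-processing-acting loop of the architecture in miniature: the OBU senses speed and heading from its sensors, processes them into an interval, and acts by scheduling the next broadcast.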
