11

A risk-based decision support system for failure management in water distribution networks

Bicik, Josef January 2010 (has links)
The operational management of Water Distribution Systems (WDS), particularly under failure conditions when the behaviour of a WDS is not well understood, is a challenging problem. The research presented in this thesis describes the development of a methodology for risk-based diagnostics of failures in WDS and its application in a near real-time Decision Support System (DSS) for WDS operation. In this thesis, the use of evidential reasoning to estimate the likely location of a burst pipe within a WDS by combining outputs of several models is investigated. A novel Dempster-Shafer model is developed, which fuses evidence provided by a pipe burst prediction model, a customer contact model and a hydraulic model to increase confidence in correctly locating a burst pipe. A new impact model, based on a pressure-driven hydraulic solver coupled with a Geographic Information System (GIS) to capture the adverse effects of failures from an operational perspective, is created. A set of Key Performance Indicators, used to quantify impact, is aggregated according to the preferences of a Decision Maker (DM) using Multi-Attribute Value Theory. The potential of distributed computing to deliver near real-time performance of the computationally expensive impact assessment is explored. A novel methodology to prioritise alarms (i.e., detected abnormal flow events) in a WDS is proposed. The relative significance of an alarm is expressed using a measure of overall risk represented by the set of all potential incidents (e.g., pipe bursts) that might have caused it. The DM's attitude towards risk is taken into account during the aggregation process. The implementation of the main constituents of the proposed risk-based pipe burst diagnostics methodology, which forms a key component of the aforementioned DSS prototype, is tested on a number of real-life and semi-real case studies. The methodology has the potential to enable more informed decisions to be made in near real-time failure management of WDS.
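As a rough illustration of the evidential-reasoning step, the sketch below applies Dempster's rule of combination to three mass functions standing in for the pipe burst prediction, customer contact and hydraulic models. The pipe IDs and mass assignments are invented for illustration and are not the models or data used in the thesis.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions defined over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Frame of discernment: candidate burst pipes (illustrative IDs).
theta = frozenset({"P1", "P2", "P3"})

# Illustrative mass functions from the three evidence sources.
burst_model    = {frozenset({"P1", "P2"}): 0.6, theta: 0.4}                       # pipe burst prediction model
customer_calls = {frozenset({"P2"}): 0.5, frozenset({"P2", "P3"}): 0.2, theta: 0.3}  # customer contact model
hydraulic      = {frozenset({"P2", "P3"}): 0.7, theta: 0.3}                       # hydraulic model

fused = combine(combine(burst_model, customer_calls), hydraulic)
for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(sorted(focal), round(mass, 3))
```

In this toy example the fused mass concentrates on P2, mirroring how agreement between independent evidence sources increases confidence in a single burst location.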
12

Towards more effective testing of communications-critical large scale systems

Nabulsi, Mohammad January 2014 (has links)
None of today’s large scale systems could function without the reliable availability of a varied range of network communications capabilities. Whilst software, hardware and communications technologies have been advancing throughout the past two decades, the methods commonly used by industry for testing large scale systems which incorporate critical communications interfaces have not kept pace. This thesis argues for the need for a specifically tailored framework to achieve effective testing of communications-critical large scale systems (CCLSS). The thesis initially discusses how generic test approaches are leading to inefficient and costly test activities in industry. The thesis then presents the form and features of an alternative CCLSS domain-specific test framework, develops its ideas further into a detailed and structured test approach for one of its layers, and then provides a detailed example of how this framework can be applied using a real-life case study. The thesis concludes with a qualitative as well as a simulation-based evaluation of the framework’s benefits observed during the case study, and an evaluation by expert external participants considering whether similar benefits can be realised if the framework is adopted for the testing of other comparable systems. Requirements data from a second CCLSS is included in the evaluation by external participants as a second, smaller case study.
13

Priority setting strategies for regulatory testing of industrial chemicals

Nordberg, Anna January 2007 (has links)
For the majority of the estimated 70,000 industrial chemical substances available on the European market today there is not enough information to enable a reasonably complete assessment of the risks that they might pose to man and the environment. Any strategy for the generation of additional data for these substances should aim at making testing as efficient as possible, taking into account environmental and health protection, time, monetary cost and animal welfare. To achieve this, appropriate priority setting rules are needed.

The main criterion currently used for regulatory priority setting for testing of industrial chemicals is production volume; the higher the production volume, the more information is required. This was also the main criterion in the former legislation, preceding REACH (Registration, Evaluation and Authorisation of Chemicals). The aim of this thesis is to evaluate other priority setting criteria and their implications for risk management, in particular classification and labelling.

The first paper in this thesis includes a study of the efficiency ratio for some of the tests required for the notification of new substances, i.e. the ratio between the likelihood that the test will lead to a classification, and the monetary cost of performing the test. The efficiency ratio was determined for the standard tests for acute oral toxicity, irritation, sensitisation and subacute toxicity using data from 1409 new chemicals notified in Europe between 1994 and 2004. The results of this investigation suggest that, given limited resources for testing, it is more efficient to perform acute toxicity tests on a larger number of substances rather than to perform additional subacute toxicity studies on the substances already tested for acute toxicity.

The second paper included in this thesis reports the results from a comparative study of the bioaccumulating properties of substances being (a) classified as carcinogenic, mutagenic and/or toxic to reproduction (CMR-substances), or (b) classified as acutely toxic or (c) unclassified. The purpose of this investigation was to evaluate potential consequences of prioritising bioaccumulating chemicals for evaluation and testing, as this is one of the strategies prescribed in REACH. The results of this study suggest that bioaccumulating substances are neither over- nor underrepresented among the CMR-substances. This result lends support to the use of the bioconcentration factor for priority setting.

The studies reported in this thesis utilize existing data on classification of substances as an indicator of the outcome of the risk assessment process, relating priority setting methods to the risk management measures that they give rise to. To the best of my knowledge there are still only very few studies published that address the issue of priority setting in chemicals control using this approach, and in my view there is need for more studies of priority setting methods and a further development of priority setting strategies that are science-based.
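To make the efficiency ratio concrete, the sketch below ranks a handful of notification tests by the expected classifications per euro spent. The probabilities and costs are placeholder values, not figures derived from the 1409 notified substances studied in the thesis.

```python
# Efficiency ratio = P(test result leads to a classification) / monetary cost of the test.
# All numbers below are illustrative placeholders.
tests = {
    # test name: (assumed probability of triggering a classification, assumed cost in EUR)
    "acute oral toxicity": (0.20, 1_500),
    "skin/eye irritation": (0.15, 2_000),
    "sensitisation":       (0.25, 4_000),
    "subacute (28-day)":   (0.30, 50_000),
}

ranked = sorted(
    ((name, prob / cost) for name, (prob, cost) in tests.items()),
    key=lambda kv: kv[1],
    reverse=True,
)
for name, ratio in ranked:
    print(f"{name}: {ratio:.2e} expected classifications per euro")
```

Under these assumed numbers the cheap acute tests dominate the ranking, which is the intuition behind testing more substances with inexpensive tests rather than fewer substances with expensive ones.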
14

Decision making biases in project portfolio selection and prioritization : An exploratory study of the rationale behind decision making leading to project portfolio problems.

Cadorin, Dario, Darwish, Rami January 2015 (has links)
No description available.
15

Using risk analysis to prioritise road-based intelligent transport systems (ITS) in Queensland

Johnston, Katherine Amelia January 2006 (has links)
With perpetual strains on resources, road agencies need to develop network-level decision-making frameworks to ensure optimum resource allocation. This is especially true for incident management services and in particular variable message signs (VMS), which are relatively immature disciplines compared to traditional road engineering. The objective of incident management and VMS is to minimise the safety, efficiency, reliability and environmental impacts of incidents on the operations of the transport system. This may be achieved by informing travellers of the incidents so they can adapt their behaviour in a manner that reduces community impacts, such as lateness and the associated vehicle emissions, unreliable travel times, and secondary accidents due to incidents. Generally, road authorities do carry out needs assessments, but qualitatively in many cases. Therefore, this master's research presents a framework that is systematic, quantitative and relatively easy to implement. In order to prioritise VMS infrastructure deployment, a risk management approach was taken that focuses on minimising the impacts on, and costs to, the community. In the framework and case study conducted, safety, efficiency and reliability, and environmental impacts are quantified using an economic risk management approach to determine an overall risk score. This score can be used to rank road sections within the network, indicating the roads with the highest risk of incident network impacts and therefore the roads with the highest need for intervention. A cost-effectiveness based risk-reduction ranking can then be determined for each incident management treatment type, comparing the net risk with treatment to that without treatment, and dividing by the net present value of deployment. The two types of ranking, pure risk and cost-effectiveness based risk reduction, will help to minimise the network impacts on the community and optimise resource allocation.
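The two rankings can be sketched as follows, assuming an annualised incident risk cost per road section with and without a VMS treatment. The section names and dollar figures are illustrative assumptions, not the case-study data.

```python
# Illustrative risk data per road section (annualised costs; placeholder figures).
road_sections = {
    "Section A": {"risk_no_vms": 2_400_000, "risk_with_vms": 1_700_000, "npv_cost": 900_000},
    "Section B": {"risk_no_vms": 1_800_000, "risk_with_vms": 1_450_000, "npv_cost": 400_000},
    "Section C": {"risk_no_vms":   950_000, "risk_with_vms":   800_000, "npv_cost": 600_000},
}

# Ranking 1: pure risk, i.e. where intervention is most needed.
by_risk = sorted(road_sections.items(), key=lambda kv: kv[1]["risk_no_vms"], reverse=True)

# Ranking 2: cost-effectiveness, i.e. risk reduction per dollar of deployment (NPV).
def risk_reduction_ratio(section):
    return (section["risk_no_vms"] - section["risk_with_vms"]) / section["npv_cost"]

by_cost_effectiveness = sorted(road_sections.items(),
                               key=lambda kv: risk_reduction_ratio(kv[1]), reverse=True)

print("Pure risk ranking:        ", [name for name, _ in by_risk])
print("Cost-effectiveness ranking:", [name for name, _ in by_cost_effectiveness])
```

With these placeholder figures the two rankings differ, which is exactly why the framework reports both: the riskiest section is not necessarily the one where a dollar of VMS deployment buys the most risk reduction.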
16

In silico virulence prediction and virulence gene discovery of Streptococcus agalactiae

Lin, Frank Po-Yen, Centre for Health Informatics, Faculty of Medicine, UNSW January 2009 (has links)
Physicians frequently face challenges in predicting which bacterial subpopulations are likely to cause severe infections. A more accurate prediction of virulence would improve diagnostics and limit the extent of antibiotic resistance. Nowadays, bacterial pathogens can be typed with high accuracy using advanced genotyping technologies. However, effective translation of bacterial genotyping data into assessments of clinical risk remains largely unexplored. The discovery of unknown virulence genes is another key determinant of successful prediction of infectious disease outcomes. The trial-and-error method for virulence gene discovery is time-consuming and resource-intensive. Selecting candidate genes with higher precision can thus reduce the number of futile trials. Several in silico candidate gene prioritisation (CGP) methods have been proposed to aid the search for genes responsible for inherited diseases in humans. It remains uninvestigated how the CGP concept can assist with virulence gene discovery in bacterial pathogens. The main contribution of this thesis is to demonstrate the value of translational bioinformatics methods in addressing challenges in virulence prediction and virulence gene discovery. This thesis studied an important perinatal bacterial pathogen, group B streptococcus (GBS), the leading cause of neonatal sepsis and meningitis in developed countries. While several antibiotic prophylactic programs have successfully reduced the number of early-onset neonatal diseases (infections that occur within 7 days of life), the prevalence of late-onset infections (infections that occur between 7 and 30 days of life) has remained constant. In addition, the widespread use of intrapartum prophylactic antibiotics may introduce undue risk of penicillin allergy and may trigger the development of antibiotic-resistant microorganisms. To minimise such potential harm, a more targeted approach to antibiotic use is required. Distinguishing virulent GBS strains from colonising counterparts thus lays the cornerstone for achieving the goal of tailored therapy. There are three aims of this thesis: 1. Prediction of virulence by analysis of bacterial genotype data: To identify markers that may be associated with GBS virulence, statistical analysis was performed on GBS genotype data consisting of 780 invasive and 132 colonising S. agalactiae isolates. From a panel of 18 molecular markers studied, only the alp3 gene (which encodes a surface protein antigen commonly associated with serotype V) showed an increased association with invasive disease (OR=2.93, p=0.0003, Fisher's exact test). Molecular serotype II (OR=10.0, p=0.0007) was found to have a significant association with early-onset neonatal disease when compared with late-onset disease. To investigate whether clinical outcomes can be predicted by the panel of genotype markers, logistic regression and machine learning algorithms were applied to distinguish invasive isolates from colonising isolates. Nevertheless, the predictive analysis yielded only weak predictive power (area under the ROC curve, AUC: 0.56-0.71, stratified 10-fold cross-validation). It was concluded that a definitive predictive relationship between the molecular markers and clinical outcomes may be lacking, and that more discriminative markers of GBS virulence need to be investigated.
2. Development of two computational CGP methods to assist with functional discovery of prokaryotic genes: Two in silico CGP methods were developed based on comparative genomics: statistical CGP exploits differences in gene frequency between phenotypic groups, while inductive CGP applies supervised machine learning to identify genes with similar occurrence patterns across a range of bacterial genomes. Three rediscovery experiments were carried out to evaluate the CGP methods: a) Rediscovery of peptidoglycan genes was attempted with 417 published bacterial genome sequences. Both CGP methods achieved their best AUC >0.911 in Escherichia coli K-12 and >0.978 in Streptococcus agalactiae 2603 (SA-2603) genomes, with an average improvement in precision of >3.2-fold and a maximum of >27-fold using statistical CGP. A median AUC of >0.95 could still be achieved with as few as 10 genome examples in each group in the rediscovery of the peptidoglycan metabolism genes. b) A maximum 109-fold improvement in precision was achieved in the rediscovery of anaerobic fermentation genes. c) In the rediscovery experiment with genes of 31 metabolic pathways in SA-2603, 14 pathways achieved an AUC >0.9 and 28 pathways achieved an AUC >0.8 with the best inductive CGP algorithms. The results from the rediscovery experiments demonstrated that the two CGP methods can assist with the study of functionally uncategorised genomic regions and the discovery of bacterial gene-function relationships. 3. Application of the CGP methods to discover GBS virulence genes: Both statistical and inductive CGP were applied to assist with the discovery of unknown GBS virulence factors. Among a list of hypothetical protein genes, several highly ranked genes were plausibly involved in molecular mechanisms of GBS pathogenesis, including several genes encoding family 8 glycosyltransferases, family 1 and family 2 glycosyltransferases, multiple adhesins, streptococcal neuraminidase, staphylokinase, and other factors that may contribute to GBS virulence. Such genes may be candidates for further biological validation. In addition, the co-occurrence of these genes with currently known virulence factors suggests that the virulence mechanisms of GBS in causing perinatal diseases are multifactorial. The procedure demonstrated in this prioritisation task should assist with the discovery of virulence genes in other pathogenic bacteria.
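The kind of 2x2 association test underlying both the marker analysis in aim 1 and statistical CGP in aim 2 can be sketched as follows (requires SciPy). The isolate counts are made up for illustration and are not the thesis data.

```python
# Association between presence of a gene/marker and a phenotype group,
# via odds ratio and Fisher's exact test on a 2x2 contingency table.
from scipy.stats import fisher_exact

def marker_association(present_in_invasive, invasive_total,
                       present_in_colonising, colonising_total):
    table = [
        [present_in_invasive,  invasive_total - present_in_invasive],
        [present_in_colonising, colonising_total - present_in_colonising],
    ]
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    return odds_ratio, p_value

# e.g. a marker present in 120/780 invasive and 8/132 colonising isolates (invented counts)
odds, p = marker_association(120, 780, 8, 132)
print(f"OR={odds:.2f}, p={p:.4f}")
```

Statistical CGP applies essentially this test gene-by-gene across genomes grouped by phenotype and ranks candidates by the strength of the resulting association.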
17

Identifying and quantifying maintenance improvement opportunities in physical asset management

Von Petersdorff, Hagen Alexander 12 1900 (has links)
Thesis (MEng)-- Stellenbosch University, 2013. / ENGLISH ABSTRACT: Asset Management initiatives face many barriers in implementation which hinder their influence and sustainability. One of these barriers is the lack of buy-in from all levels in the organisation, due to a lack of understanding of the perceived benefits of Asset Management. The relationship between throughput and the maturity of Asset Management implementation is usually felt throughout the organisation, but is difficult to prove or quantify. Furthermore, it is difficult to isolate the effects of maintenance using traditional methods. Organisational alignment in an Asset Management project is achieved by aligning employees' views on what the deficient areas in the organisation are, and by managing their expectations of the benefits that a good application of Asset Management would bring. However, the lack of a transparent method to convey the significance of critical areas in the system, and of a clear way to communicate these problems, creates a barrier to implementation. Without empirical evidence, people rely on argumentative opinions to uncover problems, which tends to create friction as opinions from various factions may differ. Typically, these initiatives are constrained by available resources, and the allocation of resources to the correct areas is thus vital. For Asset Management initiatives to be successful there first needs to be alignment in execution through a clear understanding of which assets are critical, so that resources can be allocated effectively. In this study, this problem is thoroughly examined and solutions are sought in the literature. A method is sought that isolates the effects of the maintenance function in an operation and uncovers critical areas. A study is performed on methods typically used to create such understanding, which are shown to have shortcomings that limit their applicability. Thus a new methodology utilising simulation is created in order to overcome these problems. The methodology is validated through a case study, where it is shown that the simulation, in the context of the methodology, is highly beneficial for uncovering critical areas and achieving organisational alignment through communication of results.
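As a minimal sketch of the idea of isolating maintenance effects, the toy capacity model below (far simpler than the simulation methodology developed in the thesis) ranks stations in a serial line by the throughput recovered if their maintenance-related downtime were eliminated. The stations, rates and availabilities are invented for illustration.

```python
# Illustrative serial production line: (nominal rate in units/hour, availability after downtime).
stations = {
    "crusher":  (400, 0.92),
    "mill":     (380, 0.85),
    "conveyor": (450, 0.97),
}

def line_throughput(restore=None):
    """Throughput of the line, optionally with one station's availability restored to 100%."""
    effective_rates = []
    for name, (rate, availability) in stations.items():
        a = 1.0 if name == restore else availability
        effective_rates.append(rate * a)
    return min(effective_rates)  # a serial line is constrained by its slowest effective station

baseline = line_throughput()
impact = {name: line_throughput(restore=name) - baseline for name in stations}
for name, gain in sorted(impact.items(), key=lambda kv: -kv[1]):
    print(f"{name}: +{gain:.0f} units/h if maintenance downtime were eliminated")
```

Re-running the model with each station restored in turn isolates that station's contribution to lost throughput, which is the same what-if logic the simulation-based methodology uses to uncover critical areas and communicate them transparently.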
18

Setting priorities for conducting and updating systematic reviews

Nasser, Mona January 2018 (has links)
Systematic reviews - appraisal and synthesis of all primary research - are increasingly being used to inform policy and practice in health care. Therefore, it is important to understand how the key questions in systematic reviews are identified and prioritised and whether they are relevant to policy makers, practitioners and members of the public. Research priority setting (RPS) is usually defined as any interpersonal activity that leads to the selection of topics and/or choices of key questions to investigate. Diverse approaches to setting research priorities are used in different countries, regions and organisations. There is no consensus in the literature on the most effective processes with which to set these priorities. However, these decisions define the quality and implications of the evidence, and syntheses of it, available to patients, the public and policy makers to help them make informed decisions. My initial scoping work was to design and conduct a survey across an influential international systematic review organisation (the Cochrane Collaboration) on how it sets priorities for its reviews. We identified 13 structured approaches to setting priorities. As part of the project, we developed an evaluation framework that assessed whether the priority setting processes meet the values and principles of the Cochrane Collaboration. Subsequently, we developed an equity lens for research priority setting exercises to inform the design of research priority setting processes and ensure that they consider the priorities of disadvantaged groups along with advantaged groups. We used the equity lens to conduct a second evaluation of the priority setting processes in the Cochrane Collaboration. Both evaluation frameworks demonstrated that the Cochrane Collaboration requires better-designed priority setting approaches and must be more transparent in reporting those processes. The evaluation of research priority setting exercises in the Cochrane Collaboration, along with the wider literature, demonstrates that research priority setting exercises cannot be evaluated in isolation from organisational cultures, values and context. Therefore, the next step of the project focused on a specific stakeholder group (major research funders) with significant influence on research, including support for systematic reviews. We selected 11 national research agencies in the UK, Netherlands, France, Norway, Denmark, Germany, Australia, Canada, and the USA. We devised a checklist based on Chalmers and Glasziou's "avoidable research waste" framework and evaluated the processes and policies of these agencies against it. As previous evaluations had demonstrated, this second evaluation found a lack of transparency in the process of setting research priorities, as well as related organisational and policy issues. Increased funding is needed for methodological research to evaluate research practices and to monitor how the funding of research projects is conducted and reported. My evaluation of funding agencies and the Cochrane Collaboration found a similar lack of transparency and accountability, in the context of conflicting values among stakeholders, which reduces scrutiny of researchers and their institutions. However, the projects have led to organisational and policy changes in the two key stakeholder groups (the Cochrane Collaboration and selected funding agencies).
Officials of national health research funding agencies have approached me to collaborate with them to address the issues raised by my work on reducing research waste. This led to the establishment of a funders' forum - the Ensuring Value in Research (EViR) Funders' Collaboration and Development Forum - to enable agencies in various countries to exchange their experience and to create working groups to address these issues. The Forum is chaired by individuals from three major research funders: NIHR (UK), ZonMW (Netherlands) and the Patient-Centered Outcomes Research Institute (PCORI; USA). The Forum organises meetings to establish common principles, standards and work plans to achieve the shared objective of reducing research waste and adding value to the research that national funders support.
19

Environmental management of oil contaminated sites in Nigeria : improving policy and risk-based framework

Sam, Kabari Simeon January 2016 (has links)
Contaminated land management has become a major concern for Nigeria. Sites affected by petroleum hydrocarbons from oil exploitation activities have been identified as a major environmental and socio-economic problem in the Niger Delta region. Though air and water regulations have received the most attention, the regulatory system for contaminated land remains largely undeveloped. As a result, Nigeria's oil-contaminated land governance lacks a clear and well-established policy framework; administrative structure and capacity; technical methods; and incentive structures. A consequence of these limitations is the inevitable ad hoc management of contaminated land in Nigeria. This thesis aims to provide a comprehensive and integrated contaminated land management policy framework for Nigeria. This work adopts a qualitative approach, including a critical review methodology and field surveys, to investigate current practice in contaminated land management in Nigeria. Key findings from this research clearly indicate an urgent need for a regulatory policy supported by a holistic and coordinated structure, coupled with improved technical capacity and additional resources to prevent new contamination and to address legacy contaminated sites. A technical strategy to identify and characterise contaminated land in terms of Source-Pathway-Receptor (S-P-R) linkages, a liability regime and the establishment of land use standards are required in Nigeria. Field surveys were used to pilot a proposed stakeholder engagement approach that integrates consideration of social values that could influence contaminated land management policy. Top-ranked social values included drinking water, soil quality, and food and the local supply chain. Based on this research a pathway for improving the current policy was proposed. The pathway identifies the need to engage stakeholders, educate and improve awareness, increase trust and transparency and integrate societal values into contaminated land management decision-making. An integrated risk assessment framework for contaminated land management in Nigeria was also proposed, together with a pathway for integrating the social values and sustainability indicators identified previously. The study proposes a timeline for achieving comprehensive contaminated land management policy in Nigeria. Finally, a multi-attribute methodology for contaminated land prioritisation in Nigeria was developed to identify and promptly respond to sites that pose the highest risk to receptors, considering the limited nature of resources for contaminated land management.
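A weighted-sum sketch of the kind of multi-attribute site prioritisation described above follows. The attributes echo the top-ranked social values and the S-P-R linkage concept, but the weights, sites and scores are illustrative assumptions rather than the methodology developed in the thesis.

```python
# Illustrative weights over prioritisation attributes (sum to 1).
weights = {"drinking_water": 0.4, "soil_quality": 0.3, "food_supply_chain": 0.2, "spr_linkage": 0.1}

# Attribute scores normalised to 0-1 (1 = worst impact / fully established source-pathway-receptor linkage).
sites = {
    "Site A": {"drinking_water": 0.9, "soil_quality": 0.6, "food_supply_chain": 0.7, "spr_linkage": 1.0},
    "Site B": {"drinking_water": 0.3, "soil_quality": 0.8, "food_supply_chain": 0.4, "spr_linkage": 0.5},
    "Site C": {"drinking_water": 0.6, "soil_quality": 0.4, "food_supply_chain": 0.9, "spr_linkage": 1.0},
}

def priority_score(scores):
    """Weighted-sum priority: higher means the site should be addressed sooner."""
    return sum(weights[attr] * scores[attr] for attr in weights)

for site, score in sorted(((s, priority_score(v)) for s, v in sites.items()), key=lambda kv: -kv[1]):
    print(f"{site}: {score:.2f}")
```

The weights are where stakeholder-elicited social values enter the ranking, so the same scoring machinery can reflect different community priorities.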
20

A Systematic Review of Software Requirements Prioritization

Khan, Kashif January 2006 (has links)
Software engineering research has been, and still is, criticised as immature and unscientific due to a lack of evaluation. However, the software engineering community is now focusing more on empirical research, and there is a movement to adopt approaches from other mature fields like medical science; one such approach is the systematic review. One of the major activities within the requirements engineering process is requirements prioritization, which helps to focus on the most important requirements. There are many prioritization techniques available to prioritize software requirements, yet there is a lack of evidence about which technique to prefer. The reasons could be differences in contexts, measurement of variables and usage of data sets. In this thesis, the area of requirements prioritization has been systematically reviewed in order to assess what evidence regarding different prioritisation techniques exists. The results from different studies are contradictory in nature due to variations in study designs, research methodologies and the choice of different dependent and context variables. Based on the results of the systematic review, a research framework has been proposed to provide researchers with a common background for further research within the requirements prioritization area. The goal of the framework is to develop a reliable knowledge base as well as to help researchers conduct and report prioritization studies.
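For illustration, the sketch below implements cumulative voting (the "100-dollar test"), one of the techniques commonly compared in requirements prioritization studies. The stakeholders, requirements and point allocations are invented.

```python
# Cumulative voting: each stakeholder distributes 100 points across the requirements,
# and requirements are ranked by total points received.
allocations = {
    "product_owner": {"R1": 50, "R2": 30, "R3": 20},
    "support_lead":  {"R1": 25, "R2": 10, "R3": 65},
    "architect":     {"R1": 40, "R2": 40, "R3": 20},
}

totals = {}
for stakeholder, votes in allocations.items():
    assert sum(votes.values()) == 100, f"{stakeholder} must allocate exactly 100 points"
    for requirement, points in votes.items():
        totals[requirement] = totals.get(requirement, 0) + points

for requirement, points in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{requirement}: {points} points")
```

Comparing such techniques across studies is exactly where the review found contradictory evidence, since results depend on the context and variables each study measured.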
