1

Foot-and-mouth disease epidemiology in relation to the physical, social and demographic farming landscape

Flood, Jessica Scarlett January 2016 (has links)
The foot-and-mouth disease (FMD) virus poses a considerable threat both to farmers and to the wider economy should there be a future incursion into the UK. The most recent large-scale FMD epidemic in the UK was in 2001. Mathematical models were developed and used during this epidemic to aid decision-making about how to most effectively control and eliminate it. While the epidemic was eventually brought to a halt, it resulted in a huge loss of livestock and is estimated to have cost the UK economy around £6 billion. The mathematical models predicted the overall spatial spread of FMD well, but had low predictive ability for identifying precisely which farm premises became infected over the course of the epidemic. This will in part have been due to the stochastic nature of the models. However, the transmission probability between two farm premises was modelled as a function of the Euclidean distance between their point locations, which is a crude representation of FMD transmission. Additionally, the premises' point location data contain inaccuracies, sometimes identifying the farmer's residential address rather than the farm itself, which may be a long way away. Local FMD transmission occurs via contaminated fomites carried by people or vehicles between premises, or by infected particles being blown by wind between nearby fields. Given that these transmission mechanisms are thought to be related to having close field boundaries, it is possible that some of the inaccuracy in model predictions is also due to imprecisely representing such transmission.
In this thesis I use fine-scale geographical data on farm premises' field locations to study the contiguity of premises, where contiguous premises (CPs) are defined as having field boundaries less than 15 m apart. I demonstrate that the distance between two premises' point locations does not reliably indicate whether they are CPs. Using an area of southern Scotland containing 4767 livestock premises, I compare the predictions of model simulations under two different model formulations: the first is one of the original models based on the 2001 outbreak, and the second is a new model in which transmission probability depends on whether or not premises are contiguous. The comparison suggests that the premises that became infected during the course of the simulations were more predictable under the new model. While it cannot be concluded that this will translate into more accurate predictions until the model can be validated against a future outbreak, it does suggest that the new model is more consistent in the route it takes through the landscape, and therefore that it may better reflect local transmission routes than the original model.
Networks based on contiguity of premises were constructed for the same area of southern Scotland, and showed that 90.6% (n=4318) of the premises in the area were indirectly connected to one another as part of the Giant Component (GC). The network metric of 'betweenness' was used to identify premises acting as bridges between otherwise disconnected sub-populations of premises. Removing the 100 premises with the highest betweenness served to fragment the GC. Model simulations indicated that, even allowing for some longer-range transmission, removing these premises from the network resulted in a large decrease in the mean number of infected premises and in outbreak duration. In practical terms, removing premises from the network would mean ensuring that they did not become infected, through enhanced biosecurity and/or vaccination depending on policy.
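To make the network step concrete, the following is a minimal sketch of the betweenness-targeting idea using networkx; the contiguous-premises edge list, the helper name fragment_by_betweenness and the toy example are illustrative assumptions, not the thesis data or code.

```python
# Sketch of the contiguity-network analysis described above: build a premises
# network, measure the giant component, and remove the highest-betweenness
# nodes. The edge list here is hypothetical; the real CP data are not shown.
import networkx as nx

def fragment_by_betweenness(cp_edges, n_remove=100):
    """cp_edges: iterable of (premises_a, premises_b) pairs whose field
    boundaries lie within 15 m of each other (contiguous premises)."""
    g = nx.Graph()
    g.add_edges_from(cp_edges)

    # Size of the giant component before any intervention.
    gc_before = max(nx.connected_components(g), key=len)

    # Rank premises by betweenness centrality and remove the top n_remove,
    # mimicking targeted biosecurity/vaccination at "bridge" premises.
    betweenness = nx.betweenness_centrality(g)
    targets = sorted(betweenness, key=betweenness.get, reverse=True)[:n_remove]
    g.remove_nodes_from(targets)

    gc_after = max(nx.connected_components(g), key=len)
    return len(gc_before), len(gc_after), targets

# Toy usage: a four-premises chain loses its bridge and fragments.
# before, after, removed = fragment_by_betweenness(
#     [("A", "B"), ("B", "C"), ("C", "D")], n_remove=1)
```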
In this thesis I also consider the role of biosecurity practices in shaping FMD spread. A sample of 200 Scottish farmers was interviewed about their biosecurity practices, and their biosecurity risk was quantified using a 'risk score' developed during the 2007 FMD outbreak in Surrey. Using Moran's I and network assortativity measures, no evidence was found of spatial clustering of biosecurity risk scores across premises. Statistical analysis found no association between biosecurity risk and the model's premises-level susceptibility term (which describes the increase in a premises' susceptibility with increasing numbers of livestock). This suggests that the model's susceptibility term is not indirectly capturing a general pattern in biosecurity on different-sized farm premises.
Thus, this body of work shows that incorporating a more realistic representation of premises' locations into mathematical models, as areas (i.e. fields) rather than points, alters predictions of spatial spread. It also demonstrates that targeted control at a relatively small number of farms could effectively fragment the farming landscape, and has the potential to considerably reduce the size of an FMD outbreak. It further demonstrates that variations in premises' FMD biosecurity risks are unlikely to be indirectly affecting the spatial or demographic components of the model. This improved understanding of how geographic, social and demographic factors relate to FMD spread through the landscape may enable more effective control of an outbreak, should there be a future incursion in the UK.
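As a rough illustration of the clustering test mentioned above, here is a hand-rolled global Moran's I; the distance-band weighting scheme, the 5 km band and the function name morans_i are assumptions for illustration, not the weights or analysis actually used in the thesis.

```python
# A minimal global Moran's I for testing whether biosecurity risk scores
# cluster spatially. The coordinates, scores and the distance-band weighting
# scheme are illustrative assumptions.
import numpy as np

def morans_i(values, coords, band=5000.0):
    """values: risk score per premises; coords: (n, 2) array of locations (m);
    band: premises within this distance are treated as neighbours (weight 1)."""
    x = np.asarray(values, dtype=float)
    xy = np.asarray(coords, dtype=float)
    n = len(x)

    # Binary distance-band spatial weights, zero on the diagonal.
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    w = ((d > 0) & (d <= band)).astype(float)

    z = x - x.mean()
    num = (w * np.outer(z, z)).sum()   # sum_ij w_ij * z_i * z_j
    den = (z ** 2).sum()
    return (n / w.sum()) * (num / den)

# Values near 0 suggest no spatial autocorrelation in risk scores;
# clearly positive values would indicate clustering of similar scores.
```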
2

A Continuous Analog of Run Length Distributions Reflecting Accumulated Fractionation Events

Yu, Zhe January 2016 (has links)
We propose a new, continuous model of the fractionation process (duplicate gene deletion after polyploidization) on the real line. The aim is to infer how much DNA is deleted at a time, based on segment lengths for alternating deleted (invisible) and undeleted (visible) regions. After deriving a number of analytical results for "one-sided" fractionation, we undertake a series of simulations that help us identify the distribution of segment lengths as a gamma distribution whose shape and rate parameters evolve over time. This leads to an inference procedure based on the observed length distributions of visible and invisible segments. We suggest extensions of this mathematical and simulation work to biologically realistic discrete models, including two-sided fractionation.
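As a rough sketch of the simulate-then-fit idea described in this abstract (not the authors' model), the toy simulation below deletes geometrically distributed runs of genes at random positions and fits a gamma distribution to the surviving segment lengths; the deletion-length distribution and all parameter values are illustrative assumptions.

```python
# Toy one-sided fractionation simulation: runs of genes are deleted at random
# positions, and a gamma distribution is fitted to the surviving (visible)
# segment lengths. All parameters here are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

genes = np.ones(100_000, dtype=bool)          # True = visible (undeleted)
for _ in range(20_000):                       # fractionation events
    start = rng.integers(0, genes.size)
    run = rng.geometric(p=0.3)                # each event removes a run of genes
    genes[start:start + run] = False

# Collect the lengths of the alternating visible segments.
visible_lengths = []
count = 0
for g in genes:
    if g:
        count += 1
    elif count:
        visible_lengths.append(count)
        count = 0
if count:
    visible_lengths.append(count)

# Fit a gamma distribution to the visible segment lengths (location fixed at 0).
shape, loc, scale = stats.gamma.fit(visible_lengths, floc=0)
print(f"gamma shape={shape:.2f}, rate={1 / scale:.3f}")
```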
3

Probability modeling of industrial situations using transform techniques

Hu, Xiaohong January 1995 (has links)
No description available.
4

Modeling fault probability in single railroad turnouts in Eastern Region, Sweden, with the use of logistic regression models : A step from preventive to predictive preventive maintenance in railway maintenance planning / Modellering av felsannolikheten i enkla järnvägspårväxlarna i region öst, Sverige med användning av logistiska regressionsmodeller : Ett steg från förebyggande till förutsägbart förebyggande underhåll i järnvägsunderhållsplanering

Zarov, Filipp January 2019 (has links)
Turnouts are an important part of railway infrastructure for two reasons: their role in the infrastructure itself and their maintenance demands. For the infrastructure, they provide the flexibility needed to form and branch the railway network; for maintenance, they consume a large part of the maintenance budget and have a prominent place in maintenance planning policy and activities. This is because, as a "mechanical object", a turnout often experiences malfunctions. The problem is further complicated by the fact that a turnout is composed of many different parts, each of which fails for very different reasons (e.g. switch blades vs the crossing part). This is reflected in differing needs for maintenance activities: railways are forced to spend excessive amounts of resources on emergency repairs, or on unnecessary scheduled maintenance of turnouts that do not need to be inspected or repaired. It is therefore difficult to plan and organise maintenance activities for turnouts efficiently. This raises the question of whether malfunctions in turnouts can be predicted and fed into the maintenance planning process, in order to optimise it and develop it into more reliable preventive maintenance planning.
The aim of this analysis is to model the probability of various malfunctions in turnouts as a function of their main geometric and operational characteristics using logistic regression models, and then to feed these results into the maintenance planning process in order to optimise it. First, it was important to describe the railway track system and the turnout components, both in terms of their parts and their interrelationships. Furthermore, the process and basic elements of railway maintenance planning were defined, along with the arguments that motivate a shift towards preventive maintenance planning methodologies. This was done through a comprehensive literature study.
The basis of this research was case studies describing the relationship between the geometrical and operational characteristics of turnouts and their wear, as well as risk-based modelling methods in railway maintenance planning. To create the analysis model, data from turnouts in the Eastern Region provided by the Swedish Transport Administration were used, both to describe the underlying causes of turnout malfunctions and to formulate an object-oriented database suitable for use in logistic regression models. The goal was a logit model that calculated the malfunction probability of a turnout and could be used directly in a maintenance planning framework that ranks maintenance activities in turnouts.
The results showed that, although the model suffers from low correlation, different relationships between input variables and different functional failures were established. Furthermore, these analytical models and modelling structures were shown to have the potential to support preventive, predictive railway maintenance plans, but further analysis of the data structure is required, especially regarding data quality. Finally, further possible research areas are presented.
Problemet blir ännu mer komplicerat, eftersom en spårväxel består av många olikadelar och var och en av dem bryts ner av mycket olika skäl (t.ex. tunganordning vs korsningsdel). Dettaåterspeglas i olika behov av underhållsaktiviteter. Eftersom järnvägarna tvingas hålla alltför storamängder resurser för att utföra akuta reparationer eller för att utföra onödiga schemalagdaunderhållsarbeten i spårväxlar, som inte behöver inspekteras eller repareras. Därför är det svårt attplanera och organisera underhållsaktiviteter för spårväxlarna på ett effektivt sätt. Detta ställer fråganom funktionsfel i spårväxlar kan förutsägas och användas som information till  underhållsplaneringsprocessen för att optimera den och utveckla den till en pålitligare förebyggandeunderhållsplanering.Syftet med denna analys är att försöka modellera sannolikheten för olika funktionsfel i spårväxlarsom en funktion av deras huvudsakliga geometriska och operativa egenskaper med användning avlogistiska regressionsmodeller och sedan mata dessa resultat in i underhållsplaneringsprocessen för attoptimera den. För det första var det viktigt att objektifiera järnvägsspårsystemet ochspårväxlarkomponenterna, både vad gäller delar och inbördes förhållanden. Dessutom definieradesprocessen och grundelementen i järnvägsunderhållsplaneringen, samt att argument som motiverarförändring till förebyggande underhållsplaneringsmetoder. Detta gjordes genom en omfattandelitteraturstudie.Grunden i denna analys var fallstudier, som beskrev förhållandet mellan geometriska ochoperationella egenskaper hos spårväxlar och deras förslitning samt riskbaserade modelleringsmetoder ijärnvägsunderhållsplanering. För att skapa analysmodellen användes data från spårväxlar i östraregionen som tillhandahölls av Trafikverket, både ur synpunkten att beskriva de underliggandeorsakerna till spårväxlarsfel och för att formulera en objektorienterad databas lämplig för användning ilogistiska regressionsmodeller. Målet var en logitmodell som beräknade sannolikheten för fel i enspårväxel, som kunde användas direkt i en underhållsplaneringsram, som rangordnar lämpigaunderhållsaktiviteter i spårväxlar.Erhållna resultat visade att även om modellen lider av låg korrelation, konstaterades olika sambandmellan ingående variabler och olika funktionsfel. Vidare visades potentialen hos dessa analysmodelleroch modelleringsstrukturer för att kunna utveckla förebyggande, förutsägbarajärnvägsunderhållsplaner, men det krävs troligtvis ytterligare analys av datastrukturen, specielltangående datakvaliteten. Slutligen presenteras ytterligare möjliga forskningsområden.
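A minimal sketch of the modelling approach described above, assuming hypothetical turnout covariates and synthetic data rather than the Trafikverket records used in the thesis:

```python
# Minimal sketch: a logistic regression relating turnout fault occurrence to
# geometric and operational covariates. Feature names and synthetic data are
# illustrative assumptions, not the Trafikverket dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500

# Hypothetical covariates per turnout: curve radius (m), annual tonnage (MGT),
# age (years), and whether the turnout has a movable crossing (0/1).
X = np.column_stack([
    rng.uniform(190, 1200, n),     # radius
    rng.uniform(2, 30, n),         # tonnage
    rng.uniform(0, 40, n),         # age
    rng.integers(0, 2, n),         # movable crossing
])

# Synthetic fault indicator for the observation window (1 = at least one fault).
logit = -2.0 - 0.002 * X[:, 0] + 0.08 * X[:, 1] + 0.05 * X[:, 2] + 0.5 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
fault_prob = model.predict_proba(X)[:, 1]    # per-turnout fault probability

# Such probabilities could then be used to rank turnouts for inspection and
# preventive maintenance, as in the planning framework described above.
```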
5

Hardware and Software Fault-Tolerance of Softcore Processors Implemented in SRAM-Based FPGAs

Rollins, Nathaniel Hatley 09 March 2012 (has links) (PDF)
Softcore processors are an attractive alternative to using expensive radiation-hardened processors for space-based applications. Since they can be implemented in the latest SRAM-based FPGA technologies, they are fast, flexible and significantly less expensive. However, unlike ASIC-based processors, the logic and routing of a softcore processor are vulnerable to the effects of single-event upsets (SEUs). To protect softcore processors from SEUs, this dissertation explores the processor design space for the LEON3 softcore processor implemented in a commercial SRAM-based FPGA. The traditional mitigation techniques of triple modular redundancy (TMR) and duplication with compare (DWC) with checkpointing provide reliability to a softcore processor at great spatial cost. To reduce the spatial cost, terrestrial ASIC-based processor protection techniques are applied to the LEON3 processor; these techniques come at the cost of time instead of area. The software fault-tolerance techniques used to protect the logic and routing of the LEON3 softcore processor include a modified version of software-implemented fault tolerance (SWIFT), consistency checks, software indications, and checkpointing. To measure the reliability of a mitigated LEON3 softcore processor, an updated hardware fault-injection model is created, and novel reliability metrics are employed. The improvement in reliability over an unmitigated LEON3 is measured using four metrics: architectural vulnerability factor (AVF), mean time to failure (MTTF), mean useful instructions to failure (MuITF), and reliability-area-performance (RAP). Traditional reliability techniques provide the best reliability: DWC with checkpointing improves the MTTF and MuITF by almost 35x, and TMR with triplicated inputs and outputs improves them by almost 6000x. Software fault tolerance provides significant reliability at a much lower area cost. Each of these techniques provides greater processor protection than a popular state-of-the-art rad-hard processor.
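As a conceptual illustration only (in software, not FPGA logic), the sketch below shows the control flow behind two of the mitigation patterns named above, TMR majority voting and duplication-with-compare with checkpoint rollback; the function names and checkpoint mechanism are assumptions for illustration, not the dissertation's implementation.

```python
# Conceptual sketch (not FPGA code) of two mitigation patterns named above:
# TMR majority voting, and duplication with compare (DWC) with checkpointing.
# The compute function and checkpoint mechanism are illustrative assumptions.
import copy

def tmr_vote(a, b, c):
    """Return the majority value of three redundant results; with a single
    upset at most one copy disagrees, so the other two still match."""
    return a if a == b or a == c else b

def dwc_step(compute, checkpoint):
    """Run the computation twice from a saved checkpoint; on mismatch (a
    detected upset) roll back to the checkpoint and retry rather than
    propagate the error."""
    while True:
        r1 = compute(copy.deepcopy(checkpoint))
        r2 = compute(copy.deepcopy(checkpoint))
        if r1 == r2:            # results agree: commit and move on
            return r1
        # disagreement: an upset was detected; retry from the checkpoint

# Example: tmr_vote(5, 5, 9) -> 5; the flipped third copy is outvoted.
```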
