51

UnSync: A Soft Error Resilient Redundant CMP Architecture

January 2011 (has links)
abstract: Reducing device dimensions, increasing transistor densities, and shrinking timing windows expose processors to soft errors induced by charge-carrying particles. Since these factors are inevitable in the advancement of processor technology, the industry has been forced to improve the reliability of general-purpose Chip Multiprocessors (CMPs). With the availability of increased hardware resources, redundancy-based techniques are the most promising methods for eliminating soft-error failures in CMP systems. This work proposes a novel customizable and redundant CMP architecture (UnSync) that utilizes hardware-based detection mechanisms (most of which are readily available in the processor) to reduce overheads during error-free execution. In the presence of errors (which are infrequent), an always-forward-execution recovery mechanism provides resilience in the system. The UnSync framework inherently supports customization of the redundancy level and thereby provides a means to achieve performance-reliability trade-offs in many-core systems. This work develops a detailed RTL model of the UnSync architecture and performs hardware synthesis to measure the hardware (power/area) overheads incurred, comparing them with those of Reunion, a state-of-the-art redundant multi-core architecture. It also performs cycle-accurate simulations over a wide range of SPEC2000 and MiBench benchmarks to evaluate the performance achieved relative to the Reunion architecture. Experimental results show that UnSync reduces power consumption by 34.5% and improves performance by up to 20% with 13.3% less area overhead than Reunion for the same level of reliability. / Dissertation/Thesis / M.S. Computer Science 2011
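As a rough illustration of the redundant-execution idea described in the abstract (not UnSync's actual microarchitecture), the following Python sketch models two copies of a computation that are compared after each step; on a mismatch, the state of the copy flagged as error-free is copied forward, so execution never rolls back. The names, the error model, and the choice of which copy is faulty are all invented for the example.

```python
import random

def run_redundant(program, inputs, error_rate=1e-3, seed=0):
    """Toy model of dual-redundant execution with compare-and-recover.

    'program' is a list of pure step functions applied to a register-file dict.
    A soft error is modelled as a random bit flip in copy B's state; a mismatch
    between the two copies triggers recovery by forward-copying the state of
    the copy that hardware detectors flag as error-free (here, copy A by
    construction) -- execution never rolls back to a checkpoint.
    """
    rng = random.Random(seed)
    state_a, state_b = dict(inputs), dict(inputs)
    for step in program:
        state_a = step(dict(state_a))
        state_b = step(dict(state_b))
        if rng.random() < error_rate:            # inject a soft error in copy B
            key = rng.choice(list(state_b))
            state_b[key] ^= 1 << rng.randrange(8)
        if state_a != state_b:                   # detection: states disagree
            state_b = dict(state_a)              # recovery: forward-copy good state
    return state_a

# usage: three trivial "instructions" operating on a small register file
prog = [lambda s: {**s, "r1": s["r0"] + 1},
        lambda s: {**s, "r2": s["r1"] * 2},
        lambda s: {**s, "r0": s["r2"] - s["r1"]}]
print(run_redundant(prog, {"r0": 5, "r1": 0, "r2": 0}))
```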
52

Conception sur mesure d'un FPGA durci aux radiations à base de mémoires magnétiques / Design of a full-custom radiation-hardened FPGA based on magnetic memories

Gonçalves, Olivier 19 June 2013 (has links)
The aim of the thesis was to show that MRAM memory cells have many advantages when used as the configuration memory of reconfigurable architectures, in particular Field Programmable Gate Arrays (FPGAs). This type of component is programmable and allows a digital circuit to be designed simply by programming the memory cells that define its functionality. An FPGA therefore consists mainly of memory cells, which largely determine its characteristics, such as area and power consumption, and influence its performance, such as speed. MRAM memories are built from Magnetic Tunnel Junctions (MTJs), which store information in the form of a magnetization. An MTJ consists of three layers: two ferromagnetic layers separated by an insulating layer. One ferromagnetic layer has its magnetization pinned in a fixed direction (the reference layer), while the other can have its magnetization switched between two directions (the storage layer). The propagation of electrons changes depending on whether the two magnetizations are parallel or antiparallel; that is, the electrical resistance of the junction depends on the relative orientation of the magnetizations. It is low when the magnetizations are parallel and high when they are antiparallel. Writing an MTJ therefore consists in switching the orientation of the storage-layer magnetization, while reading consists in determining whether the resistance is high or low. These properties make the MTJ a good candidate for a so-called universal memory, although further research effort is still needed. It already offers many advantages, such as non-volatility, fast and low-power writing compared with Flash memory, and resistance to radiation. Thanks to these advantages it can already be used in certain applications, particularly in space, where its intrinsic immunity to radiation and its non-volatility can be fully exploited. It therefore makes it possible to build a radiation-hardened, low-power FPGA with new functionalities. The thesis work took place over three years. The first year was dedicated to the state of the art, in order to learn how MTJs work, the architecture of FPGAs, radiation-hardening and low-power techniques, and the operation of the tools used in microelectronics. At the end of the first year, a new FPGA architecture concept was proposed. The second and third years were devoted to realizing this innovation: searching for the best circuit structure, designing an elementary building block of an FPGA, and designing and fabricating a demonstrator. The demonstrator was tested successfully and proved the concept. The new FPGA circuit architecture showed that using MRAM cells as FPGA configuration memory is advantageous, particularly for future technologies.
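The read/write behaviour described above can be captured in a toy model of a magnetic tunnel junction used as a single configuration bit. The resistance values and the bit-to-state mapping below are assumptions chosen for illustration, not figures from the thesis.

```python
R_P, R_AP = 5e3, 12e3     # illustrative junction resistances in ohms (assumed values)
TMR = (R_AP - R_P) / R_P  # tunnel magnetoresistance ratio derived from them

class MTJConfigBit:
    """Toy model of one magnetic tunnel junction used as an FPGA configuration bit."""
    def __init__(self):
        self.parallel = True            # storage layer aligned with the reference layer

    def write(self, bit: int) -> None:
        # writing = switching the storage-layer magnetization
        # (convention assumed here: 0 -> parallel, 1 -> antiparallel)
        self.parallel = (bit == 0)

    def read(self, r_threshold=(R_P + R_AP) / 2) -> int:
        # reading = sensing whether the junction resistance is low (parallel)
        # or high (antiparallel) relative to a reference
        resistance = R_P if self.parallel else R_AP
        return 0 if resistance < r_threshold else 1

bit = MTJConfigBit()
bit.write(1)
assert bit.read() == 1                  # non-volatile: the value persists without power
print(f"TMR ratio: {TMR:.0%}")
```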
53

A decision support system for conduit hydropower development

Loots, Ione January 2013 (has links)
Cheap and reliable electricity is an essential stimulus for economic and social development. Currently fossil fuels are used for the majority of global electricity generation, but energy shortages and pressure on all industries to reduce CO2 emissions provide incentives for growing emphasis on the development of alternative energy-generation methods. Presently hydropower contributes about 17% of global energy generation, which is only a fraction of its total potential. In Africa only 5% of its estimated hydropower potential has been exploited, making it the most underdeveloped continent in terms of hydropower. An often overlooked source of hydropower energy is found in conduits, where pressure-reducing stations (PRSs) are installed to dissipate excess energy. The energy dissipated by these devices can instead be captured as hydroelectricity if turbines are installed in the conduits, either by replacing pressure-reducing valves (PRVs) with a turbine, or by installing the turbine in parallel with the PRV. An initial scoping investigation indicated that significant potential exists for small-scale hydropower installations in water-distribution systems in South Africa. Almost all of the country’s municipalities and water-supply utilities have pressure-dissipating stations in their water-distribution systems, where hydropower potential may exist. This dissertation reflects the development of a Conduit Hydropower Decision Support System (CHDSS), summarised in a series of flow diagrams that illustrate the developmental process (Figure i(a) provides an example). A Conduit Hydropower Development (CHD) Tool was developed to facilitate the calculation of necessary factors (the Phase 1 Economic Analysis is shown in Figure i(b)). The objective of this CHDSS was to assist municipalities and engineers in identifying conduit hydropower potential in South Africa and to provide proper guidance for the development of potential sites. / Dissertation (MEng)--University of Pretoria, 2013. / gm2014 / Civil Engineering / Unrestricted
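For context, the hydropower potential at a pressure-reducing station is commonly estimated with P = ρ·g·Q·H·η. The sketch below applies that generic formula with hypothetical flow and head values; it is not the calculation performed by the CHD Tool described in the dissertation.

```python
def conduit_hydro_potential_kw(flow_m3_s: float,
                               excess_head_m: float,
                               efficiency: float = 0.7) -> float:
    """First-pass estimate of the power available at a pressure-reducing station.

    P = rho * g * Q * H * eta, with rho = 1000 kg/m^3 and g = 9.81 m/s^2.
    'excess_head_m' is the pressure head currently dissipated by the PRV,
    expressed in metres of water; 'efficiency' is a combined turbine and
    generator efficiency (assumed value).
    """
    rho, g = 1000.0, 9.81
    return rho * g * flow_m3_s * excess_head_m * efficiency / 1000.0  # kW

# hypothetical PRS: 0.25 m^3/s through a valve that dissipates 40 m of head
print(f"{conduit_hydro_potential_kw(0.25, 40.0):.0f} kW")  # ~69 kW
```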
54

Analys av maskinstopp på pastermixer i linje två på Boliden Mineral AB / Analysis of machine stops on the paste mixer in line two at Boliden Mineral AB

Mirzajee, Nahid January 2020 (has links)
Boliden Mineral AB in Garpenberg operates one of the world's most modern mines, carrying out mining and concentration of complex ore containing, among other metals, lead, zinc, copper, silver, and gold. Since the so-called paste mixer on line two came into operation at the beginning of 2019, it has suffered recurring disturbances. The goal of the project was to investigate how the dependability of the line-two paste mixer could be increased by proposing ways to minimize unplanned machine stops. A secondary goal was to propose routines for a working daily condition-monitoring practice during operation, so that potential problems can be found before they lead to production stops. To obtain a reasonably fair picture of the problems at the line-two paste mixer, personal interviews were carried out with staff who work with the equipment daily, as well as with staff involved during the procurement of the equipment; as a complement, the equipment was observed on one occasion. The collected data are largely qualitative and build on the experience available within the organization. Among the analysed disturbances and failures, a shaft breakdown caused the greatest problem for production, resulting in ten days of production stoppage. The root cause turned out to be that the maintenance organization did not have access to the supplier's recommended maintenance manuals from the start, so the prerequisites for creating a good preventive-maintenance plan for the paste mixer were missing from the beginning. Introducing subjective and objective condition monitoring is a way to prevent bearing and shaft breakdowns, as well as unplanned production stops, in the future.
55

Analysis and application of maintenance strategies for Omnicane Thermal Energy Operations (St Aubin) Ltd

BUNDHOO, Jasbeersingh January 2012 (has links)
Maintenance costs at Omnicane Thermal Energy Operations (St Aubin) Ltd contribute a significant part of the unit cost of electrical energy produced and affect the profitability of the power plant. It is therefore necessary to minimize maintenance costs by optimizing maintenance processes so that the plant runs more reliably and economically. The total maintenance cost for OTEOSAL increased from 2008 to 2011, doubling over that period. The cost of external labour during operation increased nearly fourfold because of frequent breakdowns of different equipment, and the value of the spare-parts store also rose because many spare parts were bought at random for fear of a shutdown caused by unavailable spares. These excess expenses reduce profitability. With a good maintenance strategy, the total maintenance cost could be reduced by about 30%. Fault Tree Analysis (FTA) and Failure Mode and Effects Analysis (FMEA) were carried out to identify critical equipment at the power plant, and the Grate Stocker, one of the most important and critical pieces of equipment, was selected for a quantitative analysis of the fault trees. The probability of failure for the Grate Stocker was found to be 0.98, giving a reliability as low as 0.02. Quantitative FTA combined with Pareto analysis makes it possible to hold the right quantity of spare parts at the right time without overstocking. This thesis concludes that combining maintenance and management methods and strategies based on FTA, FMEA, and Pareto analysis, all well formalized and documented according to ISO 9001, will allow the power plant to gain in availability, reliability, and financial terms, and will make OTEOSAL ready for the new challenges appearing in the energy sector in Mauritius.
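The quantitative fault-tree step mentioned above combines basic-event probabilities through AND/OR gates. The sketch below shows that standard gate arithmetic with invented probabilities; only the 0.98/0.02 figures quoted in the abstract come from the thesis.

```python
from functools import reduce

def and_gate(probs):
    """Top-event probability when ALL basic events must occur (independence assumed)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Top-event probability when ANY basic event is enough: 1 - prod(1 - p_i)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# hypothetical basic-event failure probabilities over the study period
bearing_wear, chain_failure = 0.60, 0.50
# hypothetical subtree: a drive fault only stops the machine if the standby
# drive also fails (AND); any of the three branches stops it (OR)
drive_branch = and_gate([0.80, 0.90])
p_top = or_gate([bearing_wear, chain_failure, drive_branch])
print(f"top-event probability: {p_top:.2f}, reliability: {1 - p_top:.2f}")
# -> 0.94 / 0.06: same structure as the thesis's 0.98 / 0.02 result,
#    but computed from made-up numbers
```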
56

Reliable Ethernet

Movsesyan, Aleksandr 01 August 2011 (has links) (PDF)
Networks within data centers, such as the connections between servers and disk arrays, need lossless flow control that allows all packets to move quickly through the network to reach their destination. This thesis proposes a new congestion-control algorithm to satisfy the needs of such networks and to answer the question: is it possible to provide circuit-less reliability and flow control in an Ethernet network? TCP uses an end-to-end congestion-control algorithm based on the end-to-end round-trip time (RTT), so its flow control and error detection/correction depend on that RTT. Other approaches rely on specialized data-link-layer networks such as InfiniBand and Fibre Channel to provide network reliability. The algorithm proposed in this thesis builds on the ubiquitous Ethernet protocol to provide reliability at the data link layer without the overhead and cost of specialized networks or the delay induced by TCP's end-to-end approach. It requires modifications to Ethernet switches to implement a back-pressure-based flow-control algorithm, which uses a modified version of the Random Early Detection (RED) algorithm to detect congestion. Simulation results show that the algorithm recovers quickly from congestion and that the average network latency remains close to the latency observed when no congestion is present. With appropriate threshold and alpha values, buffer sizes in the network and on the source nodes can be kept small, so that little additional hardware is needed to implement the system.
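A minimal sketch of RED-style congestion detection driving back pressure, as described above, might look as follows; the alpha and threshold values are illustrative, and the actual switch modifications are detailed in the thesis.

```python
class REDBackpressure:
    """RED-style average-queue tracking that raises a back-pressure signal.

    avg <- (1 - alpha) * avg + alpha * q; when the average crosses the
    threshold the switch asks upstream ports to pause, instead of dropping
    packets as classic RED would.
    """
    def __init__(self, alpha=0.5, threshold=20):
        self.alpha = alpha          # EWMA weight (illustrative value)
        self.threshold = threshold  # queue length in packets (illustrative value)
        self.avg = 0.0

    def on_enqueue(self, queue_len: int) -> bool:
        """Return True if back pressure should be asserted on upstream links."""
        self.avg = (1 - self.alpha) * self.avg + self.alpha * queue_len
        return self.avg >= self.threshold

red = REDBackpressure()
for q in [5, 12, 25, 40, 60, 35, 10]:      # sampled queue occupancies
    print(q, red.on_enqueue(q))            # back pressure asserted once avg >= 20
```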
57

A Methodology for Reliable Data Mining on Health Administrative Data: Case Studies on Pediatric Immune-Mediated Inflammatory Diseases in Ontario, Canada

Tekieh, Mohammad Hossein 26 April 2022 (has links)
Over the past century, the prevalence of immune-mediated inflammatory diseases (IMIDs) has increased worldwide. Exposures to environmental factors early in life have been identified as being associated with increased risk of these diseases. However, hypothesis-driven analyses do not always identify all risk or protective factors, nor do they adequately explain interactions between variables on the risk of disease. Data mining can explore the data without specific a priori hypotheses, instead providing possible hypotheses for further analysis. Even so, data mining techniques are still not popular among epidemiologists as a trustworthy analytical tool for analyzing population-based diseases, owing to the inexplicability of some of the methods (e.g., neural networks) and to unfamiliarity with, or uncommon use of, machine learning and data mining methods in real-world health care applications. At the same time, large amounts of routinely collected health data are amassed in the course of operating electronic health systems. Routinely collected health data are not gathered for research purposes, but they are a rich source of information for research as a secondary use of the data. In this study, following the design science research methodology, we developed a methodology to reliably analyze health administrative data using data mining techniques and to provide reproducible, reliable, and trustworthy findings. The reliable data mining methodology for health administrative data was designed to address impartiality, validity, and sustainability concerns in five stages: Data Selection, Preprocessing, Modelling, Evaluation, and Feedback. As part of the main contributions, we developed two unique preprocessing guidelines as the key components of the designed methodology, in order to standardize technical steps and address contextual sources of bias. While the proposed methodology is general in its design, to evaluate it we implemented it in several case studies on real health administrative data housed at ICES, Ontario: first to analyze children suffering from an IMID in Ontario, predict new cases, and, most importantly, generate new hypotheses. The first case study was extended to a second one, narrowing the focus from all IMIDs to asthma, which formed the majority of the IMID cases. Finally, a third case study focused on inflammatory bowel disease (IBD) and systemic autoimmune rheumatic diseases (SARDs) to better compare the findings. We applied both predictive and descriptive modelling techniques, such as decision trees, neural networks, logistic regression, and k-means clustering, to the prepared datasets, which comprised more than 700K records and over 80 input variables. We built classification models with notable performance (AUC of 68%), identified the significant factors associated with IMIDs, and extracted multifactorial rules conferring protection against, or high risk of, developing asthma, IBD, and SARDs. The factors that contributed most to the extracted multifactorial rules were "general childhood infection", "use of antibiotics", "streptococcus pyogenes", "respiratory infection", "gastroenteritis", "mother's prevalence of any IMID", and "baby's sex". The findings were evaluated and verified by health experts. Most data mining studies applied to health data do not handle bias and confounding. In these case studies, however, following the designed reliable methodology meant that systematic errors were identified and their risks assessed, and results with a high risk of bias were flagged to be disregarded. This process allowed us to apply data mining techniques to discover new multifactorial rules and to identify the factors with the highest impact among the 128 factors observed in past epidemiological studies, while preserving the trust of domain experts in the results.
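To illustrate the kind of predictive-modelling and evaluation step described above, the sketch below trains a classifier and reports a hold-out AUC on synthetic data (the ICES data cannot be reproduced here); the scikit-learn calls are a generic stand-in, not the study's exact pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# synthetic stand-in for the prepared cohort: 80 exposure variables and one
# binary outcome (IMID diagnosis); the real health administrative data are not public
rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 80))
y = (X[:, 0] * 0.8 + X[:, 1] * 0.5 + rng.normal(scale=1.5, size=5000)) > 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"hold-out AUC: {auc:.2f}")   # the thesis reports an AUC of about 0.68 on real data
```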
58

Wireless Network Dimensioning and Provisioning for Ultra-reliable Communication: Modeling and Analysis

Gomes Santos Goncalves, Andre Vinicius 28 November 2023 (has links)
A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services such as ultra-reliable low-latency communication (URLLC) and hyper-reliable low-latency communication (HRLLC), the staple mission-critical services in IMT-2020 (5G) and IMT-2023 (6G), for which reliable and resilient communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. A natural way of increasing reliability and reducing latency is to provision additional network resources to compensate for uncertainty in wireless networks caused by fading, interference, mobility, and time-varying network load, among others. Thus, an important step to enable mission-critical services is to identify and quantify what it takes to support ultra-reliable communication in mobile networks -- a process often referred to as dimensioning. This dissertation focuses on resource dimensioning, notably spectrum, for ultra-reliable wireless communication. This dissertation proposes a set of methods for spectrum dimensioning based on concepts from risk analysis, extreme value theory, and meta distributions. These methods reveal that each ``nine'' in reliability (e.g., five-nines in 99.999%) roughly translates into an order of magnitude increase in the required bandwidth. In ultra-reliability regimes, the required bandwidth can be in the order of tens of gigahertz, far beyond what is typically available in today's networks, making it challenging to provision resources for ultra-reliable communication. Accordingly, this dissertation also investigates alternative approaches to provide resources to enable ultra-reliable communication services in mobile networks. Particularly, this dissertation considers multi-operator network sharing and multi-connectivity as alternatives to make additional network resources available to enhance network reliability and proposes multi-operator connectivity sharing, which combines multi-operator network sharing with multi-connectivity. Our studies, based on simulations, real-world data analysis, and mathematical models, suggest that multi-operator connectivity sharing -- in which mobiles multi-connect to base stations of operators in a sharing arrangement -- can reduce the required bandwidth significantly because underlying operators tend to exhibit characteristics attractive to reliability, such as complementary coverage during periods of impaired connectivity, facilitating the support for ultra-reliable communication in future mobile networks. / Doctor of Philosophy / A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services in 5G and 6G, for which ultra-reliable communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. Reliability often comes at the cost of additional network resources to compensate for uncertainty in wireless networks. Thus, an important step to enable ultra-reliable communication is to identify and quantify what it takes to support mission-critical services in mobile networks -- a process often denoted as dimensioning. 
This dissertation focuses on spectrum dimensioning and proposes a set of methods to identify suitable spectrum bands and required bandwidth for ultra-reliable communication. These methods reveal that the spectrum needs for ultra-reliable communication can be beyond what is typically available in today's networks, making it challenging to provide adequate resources to support ultra-reliable communication services in mobile networks. Alternatively, we propose multi-operator connectivity sharing: mobiles simultaneously connect to multiple base stations of different operators. Our studies suggest that multi-operator connectivity sharing can reduce the spectrum needs in ultra-reliability regimes significantly, being an attractive alternative to enable ultra-reliable communication in future mobile networks.
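The headline finding above, that each additional "nine" of reliability roughly costs an order of magnitude more bandwidth, can be illustrated with a back-of-the-envelope calculation; the baseline bandwidth used below is purely hypothetical.

```python
def required_bandwidth_mhz(nines: int, base_mhz: float = 2.0) -> float:
    """Rule-of-thumb instance of the dissertation's finding: each extra 'nine'
    of reliability costs roughly 10x more bandwidth. 'base_mhz' is a purely
    hypothetical baseline for one nine (90% reliability)."""
    return base_mhz * 10 ** (nines - 1)

for n in range(1, 6):
    reliability = 1 - 10 ** (-n)
    bw = required_bandwidth_mhz(n)
    print(f"{reliability:.5f} ({n} nine{'s' if n > 1 else ''}): ~{bw:,.0f} MHz")
# with these assumed numbers, five nines (99.999%) lands around 20 GHz,
# in the tens-of-gigahertz range the abstract mentions
```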
59

Evaluating 'living well' with mild-to-moderate dementia: Co-production and validation of the IDEAL My Life Questionnaire

Clare, L., Pentecost, C., Collins, R., Martyr, A., Litherland, R., Morris, R.G., Quinn, Catherine, Gamble, L.D., Sabatini, S., Hunt, A., Matthews, F.E., Thom, J.M., Jones, R.W., Victor, C. 01 September 2023 (has links)
We aimed to co-produce and validate an accessible, evidence-based questionnaire measuring 'living well' with dementia that reflects the experience of people with mild-to-moderate dementia. Nine people with dementia formed a co-production group. An initial series of workshops generated the format of the questionnaire and a longlist of items. Preliminary testing with 53 IDEAL cohort participants yielded a shortlist of items. These were tested with 136 IDEAL cohort participants during a further round of data collection and assessed for reliability and validity. The co-production group contributed to decisions throughout and agreed the final version. An initial list of 230 items was reduced to 41 for initial testing, 12 for full testing, and 10 for the final version. The 10-item version had good internal consistency and test-retest reliability, and a single factor structure. Analyses showed significant large positive correlations with scores on measures of quality of life, well-being, and satisfaction with life, and expected patterns of association including a significant large negative association with depression scores and no association with cognitive test scores. The co-produced My Life Questionnaire is an accessible and valid measure of 'living well' with dementia suitable for use in a range of contexts. / This work was supported by the Economic and Social Research Council, National Institute for Health and Care Research (ES/L001853/2), and Alzheimer’s Society (348, AS-PR2-16-001).
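Internal consistency of a multi-item scale such as this is commonly summarised with Cronbach's alpha. The sketch below computes it on synthetic responses; it is not the IDEAL study's actual analysis, and the data and scoring range are invented.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# synthetic 10-item questionnaire responses for 136 respondents (illustrative only)
rng = np.random.default_rng(1)
trait = rng.normal(size=(136, 1))
responses = np.clip(np.rint(2.5 + trait + rng.normal(scale=0.8, size=(136, 10))), 1, 4)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```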
60

A Communication Library for Peer-to-Peer Communication in Message-Driven Programs

Dahl, Jorgen L. January 2002 (has links)
No description available.
