121

Railway Safety - Risks and Economics

Bäckman, Johan January 2002 (has links)
Safety analysis is a process involving several techniques. The purpose of this thesis is to test and develop methods suitable for the safety analysis of railway risks and railway safety measures. Safety analysis is a process comprising problem identification, risk estimation, valuation of safety and economic analysis. The main steps are described in separate chapters, each of which includes a discussion of the methods and a review of previous research, followed by the contribution of this author. Although the safety analysis procedure described can be used for analysing railway safety, it has such general foundations that it can be used wherever safety is important and wherever safety measures are evaluated. It combines cost-benefit analysis with criteria for the distribution and the absolute levels of risk.

Risks are estimated with both statistical and risk analysis methods. Historical data on railway accidents are analysed and statistical models fitted to describe trends in accident rates and consequences. A risk analysis model is developed using fault tree and event tree techniques, together with Monte Carlo simulation, to calculate risks for passenger train derailments. The results are compared with the statistical analysis of historical data.

People's valuation of safety in different contexts is analysed, with relative values estimated in a willingness-to-pay study. A combination of focus groups and individual questionnaires is used. Two different methods are used to estimate the value of safety and the results are compared. Comparisons are also made with other studies.

Different approaches for safety analysis and methods for economic analysis of safety are reviewed. Cost-benefit analysis as a decision criterion is discussed and a study on the economic effects of a traffic control system is presented.

There are several results of the work. Historical data show a decrease in the accident rate, while the average consequence of each accident has not changed over time. The risk analysis model produces comparable results and enables analysis of various safety measures. The valuation study shows that people prefer the prevention of small-scale accidents over the prevention of larger, catastrophic accidents. There are only small differences in the valuation of safety in different contexts.
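The thesis does not reproduce its fault tree/event tree model here, but the combination of event-tree branching with Monte Carlo simulation can be sketched as below. The derailment probability, the single branch probability, the consequence classes, and the function name are all illustrative assumptions, not values from the thesis.

```python
import random

def simulate_derailments(n_trials, p_derail=1e-4, p_overturn=0.2, seed=42):
    """Monte Carlo sketch of an event tree for passenger train derailments.

    Each trial draws whether a derailment occurs, then follows a single
    hypothetical event-tree branch (does the derailed train overturn?) to
    a consequence class. All probabilities here are illustrative assumptions.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    outcomes = {"none": 0, "minor": 0, "severe": 0}
    for _ in range(n_trials):
        if rng.random() >= p_derail:
            outcomes["none"] += 1
            continue
        # Event-tree branch point: overturning escalates the consequence.
        if rng.random() < p_overturn:
            outcomes["severe"] += 1
        else:
            outcomes["minor"] += 1
    return outcomes

counts = simulate_derailments(100_000)
derailment_rate = (counts["minor"] + counts["severe"]) / 100_000
```

In a fuller model the fixed branch probabilities would themselves be sampled from distributions each trial, which is what makes Monte Carlo simulation useful for propagating parameter uncertainty through the tree.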
122

The risk assessment of aircraft runway overrun accidents and incidents

Kirkland, Ian D. January 2001 (has links)
The UK Civil Aviation Authority has recognised the need for protection against runway overruns over and above the standard protection recommended by ICAO. Normal protection for the aircraft is provided in ICAO's Annex 14 by the strip at the end of a runway and by a recommendation for the installation of a Runway End Safety Area (RESA). In the UK, the CAA has stated that, as part of their safety management system, the aerodrome licensee should review the RESA distance requirement for their individual circumstances annually through a risk assessment. However, current industry knowledge of the circumstantial factors in runway overruns is limited, and the current models used to determine likely overrun wreckage locations and RESA dimensions take no account of the operational conditions surrounding the overruns or of the aerodrome being assessed. This study addresses these needs by highlighting common factors present in overrun occurrences through the compilation and analysis of a database of runway overruns, and by constructing a model of wreckage location that takes account of the conditions at an individual aerodrome. A model of overrun probability has been constructed and the consequences of an overrun have been examined. One outcome of the study is the awareness that the industry has extremely poor knowledge of the operational characteristics of non-accident flights, which, if not addressed, will be a major barrier to future aviation safety improvement and research.
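A wreckage-location model of this kind is often expressed as the probability that an overrunning aircraft comes to rest beyond a given distance past the runway end. The sketch below uses the exponential form P(>d) = exp(-a·d^b) common in the overrun-location literature; the parameter values, and the assumption that this form matches the thesis's fitted model, are mine.

```python
import math

def p_exceed(distance_m, a=0.004, b=1.2):
    """Probability that overrun wreckage comes to rest beyond `distance_m`
    past the runway end, given that an overrun has occurred, using the
    form P(>d) = exp(-a * d**b) often fitted in overrun-location studies.
    Parameter values are illustrative, not the thesis's fit."""
    return math.exp(-a * distance_m ** b)

def resa_for_risk(target_p, a=0.004, b=1.2):
    """Invert the model: distance beyond the runway end at which the
    conditional exceedance probability falls to `target_p`."""
    return (-math.log(target_p) / a) ** (1.0 / b)

p90 = p_exceed(90.0)  # exceedance probability at the ICAO-recommended 90 m RESA
```

Conditioning the parameters on aerodrome-specific operational factors, rather than using one global fit, is the kind of refinement the study argues for.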
123

Exploring Causal Factors of DBMS Thrashing

Suh, Youngkyoon January 2015 (has links)
Modern DBMSes are designed to support many transactions running simultaneously. DBMS thrashing is indicated by a sharp drop in transaction throughput. Thrashing behavior in DBMSes is a serious concern to DBAs engaged in on-line transaction processing (OLTP) and on-line analytical processing (OLAP) systems, as well as to DBMS implementors developing technologies related to concurrency control. If thrashing is prevalent in a DBMS, thousands of transactions may be aborted, resulting in little progress in transaction throughput over time. From an engineering perspective, therefore, it is of critical importance to understand the factors behind DBMS thrashing. However, understanding the origin of thrashing in modern DBMSes is challenging because many factors may interact. The existing literature on thrashing exhibits three weaknesses: (i) methodologies have been based on simulation and analytical studies rather than on empirical analysis of real DBMSes, (ii) scant attention has been paid to the associations between factors, and (iii) studies have been restricted to one specific DBMS rather than spanning multiple DBMSes. This dissertation aims at a better understanding of the thrashing phenomenon across multiple DBMSes. We identify the underlying causes and propose a novel structural causal model to explicate the relationships between the various factors contributing to DBMS thrashing. The model yields a number of specific hypotheses that are subsequently tested across DBMSes, providing empirical support for the model as well as engineering implications for fundamental improvements in transaction processing. The model also guides database researchers in refining it by investigating other, as-yet-unknown factors.
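The dissertation's causal model itself is not reproduced in the abstract, but the "sharp drop in transaction throughput" that defines thrashing can be operationalized very simply, for example as the first multiprogramming level at which throughput falls a fixed fraction below its running peak. Everything in this sketch (the threshold, the function name, the sample curve) is a hypothetical illustration, not the dissertation's method.

```python
def thrashing_point(mpl_throughput, drop_frac=0.2):
    """Locate the multiprogramming level (MPL) at which throughput falls
    more than `drop_frac` below its running peak -- one simple way to
    detect the sharp drop that signals DBMS thrashing.

    `mpl_throughput` maps MPL -> transactions/sec. Returns the first MPL
    past the drop, or None if no such drop occurs."""
    peak = float("-inf")
    for mpl in sorted(mpl_throughput):
        tps = mpl_throughput[mpl]
        if peak > 0 and tps < (1 - drop_frac) * peak:
            return mpl
        peak = max(peak, tps)
    return None

# A made-up throughput curve: rises with concurrency, then collapses.
curve = {10: 400, 20: 750, 30: 900, 40: 880, 50: 450}
knee = thrashing_point(curve)
```

A causal study would then ask which factors (buffer pool size, lock contention, transaction mix, and so on) shift this knee point, which is the kind of relationship the structural causal model is built to capture.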
124

DEVELOPMENT AND ANALYSIS OF OPTICAL PH IMAGING TECHNIQUES

Lin, Yuxiang January 2010 (has links)
The pH of tumors and surrounding tissues is a key biophysical property of the tumor microenvironment that affects how a tumor survives and how it invades the surrounding normal tissue. Research into tumorigenesis and tumor treatment depends greatly on accurate, precise, and reproducible measurements. Optical imaging is generally regarded as the best choice for non-invasive, high-spatial-resolution measurements. Ratiometric fluorescence imaging and fluorescence lifetime imaging microscopy (FLIM) are the two primary ways of measuring tumor pH.

pH measurement in a window chamber animal model using a ratiometric fluorescence imaging technique is demonstrated in this dissertation. The experimental setup, imaging protocols, and results are presented. A significantly varying bias was consistently observed in the measured pH. A comprehensive analysis of the possible error sources accounting for this bias is carried out; it reveals that the accuracy of the ratiometric method is most likely limited by biological and physiological factors.

FLIM is a promising alternative because the fluorescence lifetime is insensitive to these biological and physiological factors. Photon noise is the predominant error source in FLIM. The Fisher information matrix and the Cramér-Rao lower bound are used to calculate the lowest possible variance of the estimated lifetime for time-domain (TD) FLIM. A statistical analysis of frequency-domain (FD) FLIM using homodyne lock-in detection is also performed, and the probability density function of the estimated lifetime is derived. These results allow derivation of the optimum experimental parameters that yield the lowest variance of the estimated lifetime in a given period of imaging time. The analyses of both TD- and FD-FLIM agree with the results of corresponding Monte Carlo simulations.
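For the idealized time-domain case, the Cramér-Rao bound mentioned above has a well-known closed form: with N photon arrival times drawn from a mono-exponential decay of mean lifetime τ (no instrument response, no background, infinite collection window), the per-photon Fisher information is 1/τ², so the estimator variance is bounded below by τ²/N. A minimal sketch under those idealizing assumptions (the function names are mine):

```python
import math

def crlb_lifetime_std(tau_ns, n_photons):
    """Cramér-Rao lower bound on the standard deviation of a lifetime
    estimate from `n_photons` arrival times drawn from an ideal
    mono-exponential decay with mean lifetime `tau_ns` (no IRF, no
    background, infinite window). Per-photon Fisher information for an
    exponential with mean tau is 1/tau**2, so var >= tau**2 / N,
    i.e. std >= tau / sqrt(N)."""
    return tau_ns / math.sqrt(n_photons)

def photons_for_precision(tau_ns, target_std_ns):
    """Photon count at which the CRLB standard deviation reaches
    `target_std_ns` -- the photon budget needed for a given precision."""
    return math.ceil((tau_ns / target_std_ns) ** 2)

sigma = crlb_lifetime_std(2.5, 10_000)  # 2.5 ns lifetime, 1e4 photons
```

Realistic instrument response functions, background, and finite gating windows only raise this bound, which is why the dissertation's detailed Fisher-information analysis of TD and FD acquisition parameters matters.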
125

Analysis of traffic accidents before and after resurfacing : A statistical approach

Geedipally, Srinivas January 2005 (has links)
This dissertation presents a statistical analysis of traffic accidents, followed by a test of the effect of new pavement on traffic safety. The accident data cover roads in Region South-East Sweden that received new pavement during 2001. This is the fourth Swedish study of the before-and-after effect of new pavement. Johansson (1997) studied the change in the number of accidents between the before-years and after-years. Tholén (1999) and Velin et al. (2002) additionally compared this change with the change in the number of accidents on a reference road network (also called control sites) consisting of all public roads in Region West Sweden that were not resurfaced during the study period.
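The comparison-group design used by Tholén (1999) and Velin et al. (2002) can be summarized by a single index: the treated sites' after/before accident ratio divided by the control sites' after/before ratio. The sketch below shows that naive estimator with made-up counts; it ignores regression-to-the-mean and changes in traffic exposure, which fuller before-after methods correct for.

```python
def safety_effect(treated_before, treated_after, control_before, control_after):
    """Naive before-after estimate with a comparison (control) group:
    the treated sites' after/before accident-count ratio divided by the
    control sites' after/before ratio. Values below 1 suggest the
    treatment (here, resurfacing) reduced accidents relative to the
    general trend captured by the control sites."""
    treated_ratio = treated_after / treated_before
    control_ratio = control_after / control_before
    return treated_ratio / control_ratio

# Hypothetical counts: accidents fell on resurfaced roads (120 -> 90)
# while the control network fell only slightly (1000 -> 950).
theta = safety_effect(120, 90, 1000, 950)
```

Here theta below one would be read as a safety benefit of resurfacing beyond the region-wide trend; a real analysis would also attach a confidence interval to theta.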
126

Įmonės duomenų statistinės analizės, panaudojant DBVS, galimybių tyrimas / Research of Statistical Analysis of Enterprise Data using DBMS

Vasiliauskas, Žygimantas 26 August 2010 (has links)
Business performance analysis and decision-making are very important in today's enterprises. To perform statistical analysis of their operations, companies buy expensive and complex products without examining the benefit of those products to the business. One way to perform statistical data analysis effectively without investing large resources is to use the standard statistical facilities of a DBMS. With these facilities, companies can carry out a wide range of statistical analyses in their daily operations, using methods such as linear regression, correlation analysis, predictive analytics, Pareto analysis, chi-square analysis, and ANOVA. / This work reviews the advantages and disadvantages of existing statistical analysis tools for small and mid-sized enterprises, explores the statistical analysis of enterprise data in a data warehouse, and reviews the statistical analysis functions and graphical software integrated into existing database management systems. The Oracle, Microsoft SQL Server, and DB2 DBMSes were analysed, and a new statistical analysis solution is offered. This solution allows statistical analysis of existing data using the integrated statistical functions of the database management system together with an integrated graphical tool. The proposed solution was designed and realized for the statistical analysis of an insurance company's data. Oracle was selected because it is the DBMS used by the insurance company and it offers a large number of integrated statistical analysis functions, ensuring more diverse and rapid analysis. The Oracle Discoverer tool was chosen for graphical depiction in order to exploit the data analysis potential of the Oracle DBMS optimally. The proposed statistical analysis process is versatile, suitable for different business areas, and can be applied to other DBMSes that have integrated analytical functions and graphical tools for displaying results.
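Oracle's built-in aggregates CORR and REGR_SLOPE are examples of the integrated statistical functions such a solution relies on. As a rough illustration of what those aggregates compute, here is the same Pearson correlation and least-squares slope in plain Python; in practice the point of the proposed approach is to compute these inside the DBMS rather than exporting the data.

```python
def corr_and_slope(xs, ys):
    """Pearson correlation and least-squares slope over paired columns --
    the quantities Oracle exposes as the aggregate functions CORR and
    REGR_SLOPE. This plain-Python version only illustrates what those
    in-DBMS functions return."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    r = sxy / (sxx * syy) ** 0.5     # Pearson correlation coefficient
    slope = sxy / sxx                # least-squares regression slope
    return r, slope

r, slope = corr_and_slope([1, 2, 3, 4], [2, 4, 6, 8])
```

In SQL the same pair would be a single `SELECT CORR(x, y), REGR_SLOPE(y, x) FROM t`, which is exactly the "analysis stays in the database" advantage the work argues for.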
127

Muitinio įforminimo proceso informacinių srautų analizė / The Analysis of Information Flows of the Customs Documents Processing

Skroblas, Mindaugas 22 January 2008 (has links)
The final thesis analyses the information flows of import, export and transit in Lithuania over the period 1 January 2006 to 30 June 2007. Calls to the Information Technology Service Centre concerning problems with the systems servicing these flows are analysed, and conclusions and proposals are presented. Effective use of information technology plays an important role in the activities of the Lithuanian customs system. Recently, both in Lithuania and internationally, there has been active discussion of the fastest possible creation of an electronic customs environment in the European Union and of an even more effective application of information technology to the statistical analysis of import, export and transit flows of goods and to the processing of customs clearance documents. The goal of this Master's thesis is to analyse the information flows of the customs clearance process in Lithuania, to identify the main problems, and to present proposals for solving them. The thesis consists of three parts. The first part briefly presents the history of the Lithuanian customs, discusses the main directions and functions of modern customs activity, and emphasizes the role of the Customs Information Systems Centre in the Lithuanian customs system. The second part reviews the import, export and transit processes of goods and the use of information technology in the processing of customs clearance documents. The third part is devoted to the statistical analysis of the information flows of the customs clearance process... [see full text] / In the Master's paper, the import, export and transit flows of goods in Lithuania during the period from 1 January 2006 to 30 June 2007 are analysed. The calls to the Information Technology Service Centre regarding problems with the systems servicing these flows are analysed, and some conclusions and proposals are submitted. The effective use of information technology (IT) plays an important role in the activities of Lithuania's customs system. Currently, the issues of a faster creation of an EU-wide e-Customs and of a more efficient use of IT in performing the statistical analysis of the import, export and transit flows of goods or in processing customs documents have been seriously considered both nationally and internationally. The goal of this Master's paper is to perform the analysis of the information flows of the customs documents processing, to identify the problems, and to present some proposals for the solution of these problems. The Master's paper consists of three parts. In the first part, a brief overview of the customs' history in Lithuania is presented, the main current functions and activities of the Lithuanian customs are described, and much attention is paid to the role of the Customs Information System Centre in the Lithuanian customs system. The second part is devoted to the survey of the import, export and transit processes of goods as well as the use of IT in the Lithuanian customs. In the third part, a thorough statistical analysis of... [to full text]
128

Developing bioinformatics tools for metabolomics

Xia, Jianguo Unknown Date
No description available.
129

A Fault-Based Model of Fault Localization Techniques

Hays, Mark A 01 January 2014 (has links)
Every day, ordinary people depend on software working properly. We take it for granted: from banking software, to railroad switching software, to flight control software, to software that controls medical devices such as pacemakers or even gas pumps, our lives are touched by software that we expect to work. It is well known that the main technique used to ensure the quality of software is testing. Often it is the only quality assurance activity undertaken, making it that much more important. In a typical experiment studying these techniques, a researcher will intentionally seed a fault (intentionally breaking the functionality of some source code) in the hope that the automated techniques under study will be able to identify the fault's location in the source code. These faults are picked arbitrarily, so there is potential for bias in their selection. Previous researchers have established an ontology, called fault size, for understanding and expressing this bias. This research captures the fault size ontology in the form of a probabilistic model. The results of applying this model to measure fault size suggest that many faults generated through program mutation (the systematic replacement of source code operators to create faults) are very large and easily found. Secondary measures generated in the assessment of the model suggest a new static analysis method, called testability, for predicting the likelihood that code will contain a fault in the future. While software testing researchers are not statisticians, they nonetheless make extensive use of statistics in their experiments to assess fault localization techniques. Researchers often select their statistical techniques without justification. This is worrisome because it can lead to incorrect conclusions about the significance of research. This research introduces an algorithm, MeansTest, which helps automate some aspects of the selection of appropriate statistical techniques. The results of an evaluation of MeansTest suggest that it performs well relative to its peers. This research then surveys recent work in software testing, using MeansTest to evaluate the significance of researchers' work. The results of the survey indicate that software testing researchers are underreporting the significance of their work.
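The abstract does not spell out MeansTest's decision rules, so the sketch below is only a toy illustration of the general pattern such a selector might follow: route roughly symmetric samples to a parametric t-test and clearly skewed ones to a rank-based test. The skewness threshold, the routing rule, and the function name are all my assumptions, not the thesis's algorithm.

```python
import statistics

def pick_means_test(sample_a, sample_b, skew_limit=1.0):
    """Toy selector in the spirit of (but NOT reproducing) MeansTest:
    choose between a t-test and a rank-based test by inspecting sample
    skewness. The threshold and rule are hypothetical."""
    def skewness(xs):
        m = statistics.fmean(xs)
        s = statistics.pstdev(xs)
        if s == 0:
            return 0.0
        # Standardized third central moment.
        return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

    if max(abs(skewness(sample_a)), abs(skewness(sample_b))) > skew_limit:
        return "wilcoxon-rank-sum"
    return "t-test"

# One roughly symmetric sample, one with a single extreme outlier.
choice = pick_means_test([5, 6, 7, 6, 5, 7], [1, 1, 1, 2, 1, 40])
```

A real selector would also check sample size, variance homogeneity, and normality rather than skewness alone; the point is only that the choice can be automated instead of left to habit.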
130

STATISTICAL ANALYSIS OF INJURY DATA AND THE CONCEPTUAL DESIGN OF A ROLLOVER PROTECTIVE STRUCTURE FOR AN ALL-TERRAIN VEHICLE

Parvathareddy, Bhavana 01 January 2005 (has links)
The rising number of fatal and non-fatal injuries involving all-terrain vehicles has called for an analysis of the data accumulated over past years. The analysis leads to the conclusion that fatal and non-fatal injuries have been rising rapidly in spite of the consent decrees that were in effect from 1988 to 1998 under the Consumer Product Safety Commission. A need to provide increased safety while riding an all-terrain vehicle is recognized. Rollover protective structures, which were used with successful results in curbing injuries on agricultural tractors, are identified as having the potential to serve this purpose. The thesis presents a conceptual design of an automatically deployable rollover protective structure.
