  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Seismic risk analysis of Perth metropolitan area

Liang, Jonathan Zhongyuan January 2009 (has links)
[Truncated abstract] Perth is the capital city of Western Australia (WA) and home to more than three quarters of the state's population. It is located in southwest WA (SWWA), a region of low to moderate seismicity that is nevertheless the most seismically active in Australia. The 1968 ML6.9 Meckering earthquake, which occurred about 130 km from the Perth Metropolitan Area (PMA), caused only minor to moderate damage in the PMA. The population of the PMA has grown rapidly since 1968, and many new structures, including some high-rise buildings, have been constructed. Moreover, increased seismic activity has been observed and a few strong ground motions have been recorded in the SWWA. It is therefore necessary to evaluate the seismic risk of the PMA under current conditions. This thesis presents results from a comprehensive study of the seismic risk of the PMA, including the development of ground motion attenuation relations, ground motion time history simulation, site characterization and response analysis, and structural response analysis. As only a very limited number of earthquake strong ground motion records are available in the SWWA, it is difficult to derive a reliable and unbiased strong ground motion attenuation model from these data alone. To overcome this, a combined approach is used to simulate ground motions. First, the stochastic approach is used to simulate ground motion time histories at various epicentral distances from small earthquake events. Then the Green's function method, with the stochastically simulated time histories as input, is used to generate ground motion time histories for large events. Comparing the Fourier spectra of the simulated motions with the recorded motions of a ML6.2 event at Cadoux in June 1979 and a ML5.5 event at Meckering in January 1990 provides good evidence in support of this method. This approach is then used to simulate a series of ground motion time histories from earthquakes of varying magnitudes and distances. ... 
The responses of three typical Perth structures, namely a masonry house, a mid-rise reinforced concrete frame structure, and a high-rise reinforced concrete frame building with a core wall, on various soil sites subjected to the predicted earthquake ground motions of different return periods are calculated. Numerical results indicate that the one-storey unreinforced masonry wall (UMW) building is unlikely to be damaged when subjected to the 475-year return period earthquake ground motion. However, it will suffer slight damage during the 2475-year return period earthquake ground motion at some sites. The six-storey RC frame with masonry infill wall is also safe under the 475-year return period ground motion, but the infill masonry wall will suffer severe damage under the 2475-year return period ground motion at some sites. The 34-storey RC frame with core wall will not experience any damage under the 475-year return period ground motion. The building will, however, suffer light to moderate damage during the 2475-year return period ground motion, although this might not be life threatening.
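The stochastic simulation step described above can be sketched in a few lines: shaped white noise is given a Brune omega-squared target spectrum with simple 1/R geometric spreading. This is a minimal illustration of the general technique, not the thesis's calibrated SWWA model; all parameter values (stress drop, shear-wave velocity, the moment-magnitude relation) are assumed placeholders.

```python
import numpy as np

def stochastic_ground_motion(magnitude, distance_km, duration=20.0, dt=0.01,
                             stress_drop=50.0, beta=3.5, seed=0):
    """Sketch of the stochastic point-source method: windowed white noise
    whose Fourier amplitude is shaped by a Brune omega-squared source
    spectrum and 1/R geometric spreading. Units follow the common CGS
    convention (stress drop in bars, beta in km/s); values are illustrative."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    # Seismic moment (dyne-cm) from the Hanks-Kanamori relation
    # (assumes the magnitude is a moment magnitude)
    m0 = 10 ** (1.5 * magnitude + 16.05)
    # Brune corner frequency (Hz)
    fc = 4.9e6 * beta * (stress_drop / m0) ** (1.0 / 3.0)
    # Window white noise and transform to the frequency domain
    noise = rng.standard_normal(n) * np.hanning(n)
    spec = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, dt)
    freqs[0] = 1e-6                      # avoid division by zero at DC
    # Omega-squared acceleration spectrum times 1/R geometric spreading
    target = (2 * np.pi * freqs) ** 2 / (1 + (freqs / fc) ** 2) / distance_km
    spec *= target / np.mean(np.abs(spec))   # impose the target spectral shape
    accel = np.fft.irfft(spec, n)
    return accel / np.max(np.abs(accel))     # normalised acceleration trace

acc = stochastic_ground_motion(6.9, 130.0)   # a Meckering-like scenario
print(len(acc))
```

Stacking such small-event traces through an empirical Green's function scheme, as the abstract describes, then yields large-event time histories.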
162

Two and three dimensional stability analyses for soil and rock slopes

Li, An-Jui January 2009 (has links)
Slope stability assessments are classical problems for geotechnical engineers. Predictions of slope stability in soil or rock masses play an important role in the design of dams, roads, tunnels, excavations, open-pit mines and other engineering structures. Stability charts continue to be used by engineers as preliminary design tools and by educators for training purposes. However, the majority of existing chart solutions assume the slope is semi-infinite in length (plane strain). It is commonly believed that this assumption is conservative for design, but non-conservative when a back-analysis is performed. In order to obtain a more economical design, or more precise parameters from a back-analysis, it is therefore important to quantify three-dimensional boundary effects on slope stability. A significant aim of this research is to look more closely at the effect of three dimensions when predicting slope stability. In engineering practice, the limit equilibrium method (LEM) is the most popular approach for estimating slope stability. It is well known that the solution obtained from the limit equilibrium method is not rigorous, because neither static nor kinematic admissibility conditions are satisfied. In addition, assumptions are made regarding inter-slice forces in the two-dimensional case and inter-column forces in the three-dimensional case in order to find a solution. Therefore, a number of more theoretically rigorous numerical methods have been used in this research when studying 2D and 3D slope problems. In this thesis, the results of a comprehensive numerical study into the failure mechanisms of soil and rock slopes are presented. Consideration is given to the wide range of parameters that influence slope stability. The aim of this research is to better understand slope failure mechanisms and to develop rigorous stability solutions that can be used by design engineers. 
The study is unique in that two distinctly different numerical methods have been used in tandem to determine the ultimate stability of slopes, namely the upper and lower bound theorems of limit analysis and the displacement finite element method. The limit equilibrium method is also employed for comparison purposes. A comparison of the results from each technique provides an opportunity to validate the findings and gives a rigorous evaluation of slope stability.
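As a minimal illustration of the limit equilibrium idea the abstract contrasts with the rigorous bound methods, the classical dry infinite-slope solution gives a closed-form factor of safety (resisting strength over driving shear on a slip plane parallel to the surface). The parameter values below are hypothetical, not taken from the thesis.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, depth, beta_deg):
    """Factor of safety for a dry infinite slope, a classical limit
    equilibrium solution: Mohr-Coulomb shear strength on a plane parallel
    to the slope surface, divided by the driving shear stress there.
    c in kPa, gamma in kN/m^3, depth in m, angles in degrees."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma * depth * math.cos(beta) ** 2          # normal stress on slip plane
    tau = gamma * depth * math.sin(beta) * math.cos(beta)  # driving shear stress
    return (c + sigma_n * math.tan(phi)) / tau

# Hypothetical case: 30 degree slope, c = 10 kPa, phi = 30 degrees,
# unit weight 18 kN/m^3, slip plane 5 m deep
fs = infinite_slope_fs(10.0, 30.0, 18.0, 5.0, 30.0)
print(round(fs, 3))
```

The limit-analysis bound theorems used in the thesis bracket such a factor of safety from above and below, which is what makes the comparison between methods a rigorous validation.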
163

Preventing water pollution by dairy by-products: risk assessment and comparison of legislation in Benin and South Africa

Goutondji, Leopoldine E. S. Abul. January 2007 (has links)
Thesis (MSc (Paraclinical Sciences))--University of Pretoria, 2007. Includes bibliographical references. Also available in print format.
164

Consumer policy in the European Union and its influence on entrepreneurs

SRBOVÁ, Alena January 2014 (has links)
The goal of this dissertation is to identify the steps in the process of implementing the Hazard Analysis and Critical Control Point (HACCP) system by micro and small entrepreneurs in the bakery and pastry fields in the Czech Republic. Based on the results of scientific research, it proposes changes that would bring the production of these entrepreneurs into line with European Union consumer policy.
165

A comprehensive system for managing reproductive failure in small domestic ruminants

Van Rooyen, Johan Anton 22 November 2012 (has links)
The Hazard Analysis and Critical Control Point (HACCP) system was used as a basis for describing a methodology for the management of reproduction in small ruminant flocks. The seven principles of the HACCP system are:

1. Conduct a hazard analysis.
2. Identify critical control points.
3. Establish critical limits for each control point.
4. Establish monitoring procedures.
5. Establish corrective actions.
6. Establish a record-keeping procedure.
7. Establish verification procedures.

The first principle of HACCP requires a description of the production system. The small ruminant reproduction process was subdivided into four sub-processes with a total of 33 phases. The ewe management cycle consists of 12 phases, and the ram management cycle, replacement ewe cycle and replacement ram cycle each consist of seven phases. The reproductive process was described by a flow diagram. The hazards were categorized as management, environmental, nutritional, genetic, predatory, physiological and disease factors that could affect reproductive performance. The second principle requires the establishment of Critical Control Points (CCPs). Seventeen CCPs in the reproductive process were established, and monitoring and diagnostic procedures for each were described together with suggested corrective actions. The resulting HACCP plan formed the basis of consultations with 30 commercial small stock enterprises. Each of the CCPs was applied to at least three and up to 30 of the flocks over the period of the trial to establish the practicality and validity of the procedures, which were described as standard operating procedures. Data forms were designed for the structured collection of data on the process as well as the CCPs. The CCPs and forms selected in this project were as follows:

- CC1. Ewe selection. Prior to ewe preparation. Ewe selection data form.
- CC2. Ram selection. Prior to phase two of ram preparation. Ram selection data form.
- CC3. Ewe preparation. Prior to start of mating (end of flushing period). Ewe preparation data form.
- CC4. Ram preparation. Prior to start of mating (end of flushing period). Ram preparation data form.
- CC5. Joining. Start of mating period. Joining data form.
- CC6. Mating. End of mating period. Mating data form.
- CC7. Scan. More than 35 days after mating. Scan data form.
- CC8. Rescan. At least 30 days after initial scanning. Rescan data form.
- CC9. Pregnant. Prior to start of lambing. Pregnancy management data form.
- CC10. Lambing. End of lambing period. Lambing data form.
- CC11. Marking. After neonatal period. Marking data form.
- CC12. Weaning. Separation of lambs from ewes. Weaning data form.
- CC13. Ewe replacement. At ewe selection. Replacement maiden data form.
- CC14. Ram replacement. At ram selection. Replacement ram data form.
- CC15. Genital soundness. Prior to ram selection. Ram genital soundness data form.
- CC16. Ram recovery. About 8 weeks after mating. Ram recovery data form.
- CC17. Last day of lambing. About 146 days after end of joining. Last day of lambing data form.

In addition to the specific procedures described in the seventeen CCPs, three further CCPs were described that can be performed to assist in monitoring the general health and welfare of the flock at strategic points in the management cycle:

- CC18. Body condition score.
- CC19. Helminthic status.
- CC20. Nutritional status.

Qualitative aspects of the critical control points, as well as certain quality control questions, were described in a generic quality control form. This generic form is modified annually to reflect hazard issues that need to be followed up the following year. Specific questions are entered on the form, which is diarised for the next year. 
The use of these generic forms assisted in the process of continuous improvement by ensuring that adjustments to the Flock Health and Production Plan are made to prevent repeated management failures. Examples of the use of the CCPs are described on the basis of data collected from the flocks that participated in the project. Upon conclusion, a questionnaire was completed by 12 of the 25 flock managers who participated. The results of the survey indicated general acceptance of a HACCP-based system for the management of reproduction in small ruminant enterprises among the flock managers who responded. Flock managers agreed that the program must be adapted to their individual needs and would not be a problem to implement, but that it needed to be simple and that many would need assistance. Training and information were considered important aspects. There was general consensus that financial results should form part of the program and that anonymous comparisons within the group would be accepted. The two responses that showed the least variance were the need to reduce production risk and the need to be informed of potential hazards. Flock managers disagreed the most in their responses about the range of control points they would implement, which correlates with the expressed need for individually adapted programs. Flock managers were not very positive about the benefits of a quality control and certification system. Predation proved to be the most important hazard, followed by parasites and stock theft, all three being highly variable as indicated by a large variance. The HACCP-based methodology should be applied in an extended form to all aspects of the flock production system to assist in improving sustainability. Copyright / Dissertation (MMedVet)--University of Pretoria, 2012. / Production Animal Studies / unrestricted
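The HACCP pattern of critical limit, monitoring procedure and corrective action maps naturally onto a small data structure. The sketch below is illustrative only: the CCP name, limit value and corrective action are hypothetical examples, not taken from the thesis's data forms.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalControlPoint:
    """One CCP in a HACCP-style flock plan: a monitoring point with a
    critical limit, a record of measurements, and a corrective action
    to trigger on deviation."""
    name: str
    critical_limit: float          # minimum acceptable monitored value
    corrective_action: str
    records: list = field(default_factory=list)

    def monitor(self, value):
        """Record a measurement; return the corrective action on deviation."""
        ok = value >= self.critical_limit
        self.records.append((value, ok))
        return "within limit" if ok else self.corrective_action

# Hypothetical example: ewe body condition score checked prior to joining
ccp = CriticalControlPoint("Ewe body condition score", 2.5,
                           "supplementary feeding before joining")
print(ccp.monitor(3.0))   # within limit
print(ccp.monitor(2.0))   # deviation triggers the corrective action
```

Principles 6 and 7 (record keeping and verification) are served by the accumulating `records` list, which plays the role of the thesis's data forms in this toy model.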
166

New Ground Motion Prediction Equations for Saudi Arabia and their Application to Probabilistic Seismic Hazard Analysis

Kiuchi, Ryota 23 March 2020 (has links)
Kyoto University / 0048 / New-system doctoral program / Doctor of Science / Degree No. 22259 / Science Doctorate No. 4573 / New system||Science||1657 (University Library) / Division of Earth and Planetary Sciences, Graduate School of Science, Kyoto University / (Chief examiner) Professor James Mori, Professor Keiko Kuge, Professor Tomotaka Iwata / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Science / Kyoto University / DGAM
167

Adapting a system-theoretic hazard analysis method for interoperability of information systems in health care

Costa Rocha, Oscar Aleixo 25 April 2022 (has links)
The adoption of Health Information Systems (HIS) by primary care clinics and practitioners has become standard in the healthcare industry. This increase in HIS utilization enables the informatization and automation of many paper-based clinical workflows, such as clinical referrals, through systems interoperability. The healthcare industry defines several interoperability standards and mechanisms to support the exchange of data among HIS. For example, the health authorities Interior Health and Northern Health created the CDX system to provide interoperability for HIS across British Columbia using the SOAP Web Services and HL7 Clinical Document Architecture (CDA) interoperability standards. CDX interoperability allows HIS such as Electronic Medical Record (EMR) systems to exchange information, such as patients' clinical records, clinical notes and laboratory test results, with other HIS. In addition, to ensure that EMR systems adhere to the CDX specification, these health authorities conduct conformance testing with the EMR vendors to certify the EMR systems. However, conformance testing can only cover a subset of a system's specification and a few use cases. Therefore, properties that arise from the interaction of systems rather than from any single component (i.e., emergent properties) are hard, or even impractical, to assure using conformance testing alone. System safety is one such property, and it is particularly significant for EMR systems because it bears directly on patient safety. A well-known approach to improving system safety is hazard analysis. For scenarios where the human factor is an essential part of the system, such as EMR systems, System-Theoretic Process Analysis (STPA) is more appropriate than traditional hazard analysis techniques. In this work, we perform a hazard analysis using STPA on the CDX conformance profile in order to evaluate and improve the safety of CDX system interoperability. 
In addition, we utilize and customize a tool named FASTEN to support and facilitate the analysis. To conclude, our analysis identified a number of new safety-related constraints and improved several constraints that were already specified. / Graduate
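One mechanical step of STPA is tabulating each control action in the control structure against the four standard ways a control action can be unsafe. A minimal sketch follows; the control actions listed are hypothetical examples for an EMR-to-CDX control structure, not the thesis's actual list.

```python
from enum import Enum
from itertools import product

class UCAType(Enum):
    """The four guide words for unsafe control actions in STPA."""
    NOT_PROVIDED = "not provided when required"
    PROVIDED = "provided when unsafe"
    WRONG_TIMING = "provided too early or too late"
    WRONG_DURATION = "stopped too soon or applied too long"

# Hypothetical control actions an EMR might issue toward CDX
control_actions = ["submit clinical document", "acknowledge receipt",
                   "retrieve patient record"]

# Enumerate every (action, guide word) pair as a candidate unsafe
# control action for the analyst to review against the hazards
candidates = [(a, t) for a, t in product(control_actions, UCAType)]
for action, uca in candidates[:4]:
    print(f"{action}: {uca.value}")
print(len(candidates))
```

Each candidate the analyst judges hazardous then becomes a safety constraint, which is how STPA turns the table into the kind of interoperability constraints the thesis reports.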
168

Hamiltonian Monte Carlo for Reconstructing Historical Earthquake-Induced Tsunamis

Callahan, Jacob Paul 07 June 2023 (has links) (PDF)
In many areas of the world, seismic hazards pose a great risk to both human and natural populations. In particular, earthquake-induced tsunamis are especially dangerous to many areas in the Pacific. The study and quantification of these seismic events can help scientists better understand how these natural hazards occur and help at-risk populations better prepare for them. However, many events of interest occurred too long ago to be recorded by modern instruments, so data on these earthquakes are sparse and unreliable. To remedy this, a Bayesian method for reconstructing the source earthquakes of these historical tsunamis from anecdotal data, called TsunamiBayes, has been developed and used to study historical events that occurred in 1852 and 1820. One drawback of this method is the computational cost of reconstructing posterior distributions on tsunami source parameters. In this work, we improve on the TsunamiBayes method by introducing higher-order MCMC methods, specifically the Hamiltonian Monte Carlo (HMC) method, to increase the sample acceptance rate and therefore reduce computation time. Unfortunately, the exact gradient for this problem is not available, so we make use of a surrogate gradient via a neural network fitted to the forward model. We examine the effects of this surrogate-gradient HMC sampling method on the posterior distribution for an 1852 event in the Banda Sea, compare results to previous results collected using random-walk sampling, and note the benefits of the surrogate gradient in this context.
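A single HMC step under the scheme described above can be sketched as follows. The `grad` argument is the slot where the neural-network surrogate gradient would plug in; here, as an assumption for a self-contained example, an exact toy gradient for a 2-D Gaussian stands in for the tsunami forward model.

```python
import numpy as np

def hmc_step(x, logp, grad, step=0.1, n_leap=20, rng=None):
    """One Hamiltonian Monte Carlo step: leapfrog-integrate Hamiltonian
    dynamics using a (possibly surrogate) gradient of log p, then apply a
    Metropolis accept/reject so the exact target is still preserved even
    when the gradient is only approximate."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(x.shape)           # sample auxiliary momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step * grad(x_new)          # initial half momentum step
    for _ in range(n_leap - 1):
        x_new += step * p_new                  # full position step
        p_new += step * grad(x_new)            # full momentum step
    x_new += step * p_new
    p_new += 0.5 * step * grad(x_new)          # final half momentum step
    # Metropolis correction on the joint (position, momentum) energy
    log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
    return (x_new, True) if np.log(rng.uniform()) < log_accept else (x, False)

# Toy target: standard 2-D Gaussian, with its exact gradient standing in
# for the fitted neural-network surrogate
logp = lambda x: -0.5 * x @ x
grad = lambda x: -x
rng = np.random.default_rng(0)
x, accepts, samples = np.zeros(2), 0, []
for _ in range(500):
    x, ok = hmc_step(x, logp, grad, rng=rng)
    accepts += ok
    samples.append(x)
print(accepts / 500)
```

Because the Metropolis step evaluates the true log-posterior, a surrogate gradient only perturbs the proposal, not the stationary distribution, which is what makes the approach in the thesis sound.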
169

A Method of Reconstructing Historical Destructive Landslides Using Bayesian Inference

Wonnacott, Raelynn 30 May 2023 (has links) (PDF)
Along with being one of the most populated regions of the world, Indonesia has one of the highest natural disaster rates worldwide. One natural disaster that Indonesia is particularly prone to is tsunamis. Tsunamis are primarily caused by earthquakes, volcanoes, landslides and debris flows. To effectively allocate resources and create emergency plans, we need an understanding of the risk factors of the region, and understanding the source events of past destructive tsunamis is critical to understanding these risk factors. We expand upon previous work focusing on earthquake-generated tsunamis to consider landslide-generated tsunamis. Using Bayesian inference and modern scientific computing, we construct a posterior distribution of potential landslide sources based on anecdotal data of historically observed tsunamis. After collecting 30,000 samples, we find that a landslide source event provides a reasonable match to our anecdotal accounts. However, viable landslides may be on the edge of what is physically possible. Future work creating a coupled landslide-earthquake model may address the weaknesses of assuming a solely landslide or solely earthquake source event.
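One way such anecdotal accounts can enter a Bayesian posterior is to treat each account as bounding a quantity (for example, run-up height) to an interval and to integrate Gaussian observation noise over that interval. The sketch below is an illustrative likelihood form under that assumption; the interval, noise level and values are hypothetical, not the thesis's actual model.

```python
import math

def anecdotal_log_likelihood(modeled, interval, sigma=0.5):
    """Log-likelihood of one anecdotal account. The account only bounds the
    true value (e.g. 'the wave reached the second floor'), so we score the
    probability that the modeled value, blurred by Gaussian noise of scale
    sigma, lands inside the reported interval."""
    lo, hi = interval
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    p = cdf((hi - modeled) / sigma) - cdf((lo - modeled) / sigma)
    return math.log(max(p, 1e-300))          # guard against log(0)

# Hypothetical: the landslide model predicts a 5.2 m wave at a village whose
# accounts suggest a 4-6 m run-up; a 9 m prediction should fit far worse.
good = anecdotal_log_likelihood(5.2, (4.0, 6.0))
bad = anecdotal_log_likelihood(9.0, (4.0, 6.0))
print(good > bad)
```

Summing such terms over all observation sites gives a log-likelihood an MCMC sampler can combine with a prior over landslide parameters (volume, location, slope) to form the posterior described above.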
170

Safe-AV: A Fault Tolerant Safety Architecture for Autonomous Vehicles

Shah, Syed Asim January 2019 (has links)
Autonomous Vehicles (AVs) should bring tremendous benefits to safe human transportation. Recent reports indicate a global average of 3,287 road-crash-related fatalities a day, with the blame, in most cases, assigned to the human driver. By replacing the main cause, AVs are predicted to significantly reduce road accidents -- some claims reaching up to a 90% reduction on US roads. However, achieving these numbers is not simple. AVs are expected to assume tasks that human drivers perform both consciously and unconsciously -- in some instances, with Machine Learning. AVs incur new levels of complexity that, if handled incorrectly, can result in failures that cause loss of human life and damage to the environment. Accidents involving SAE Level 2 vehicles have highlighted such failures and demonstrated that AVs have a long way to go. The path towards safe AVs includes system architectures that provide effective failure monitoring, detection and mitigation. These architectures must produce AVs that degrade gracefully and remain sufficiently operational in the presence of failures. We introduce Safe-AV, a fault-tolerant safety architecture for AVs that is based on the commonly adopted E-Gas 3-Level Monitoring Concept and the Simplex Architecture, and guided by a thorough hazard analysis in the form of Systems-Theoretic Process Analysis (STPA). We commenced the architecture design with a review of some modern AV accidents, which helped identify the types of failures AVs can present and acted as a first step of our STPA. The hazard analysis was applied to an initial AV architecture (without safety mechanisms) consisting of components that should be present in a typical AV (based on the literature and our ideas). Our STPA identified the system-level accidents, hazards and corresponding loss scenarios that led to well-founded safety requirements which, in turn, evolved the initial architecture into Safe-AV. / Thesis / Master of Applied Science (MASc)
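The Simplex pattern this architecture builds on can be sketched as a decision rule: trust the high-performance (complex) controller only while its output stays inside a verified safe envelope, and otherwise fall back to a simple, verified baseline controller. The envelope below is a hypothetical speed/gap rule for illustration, not Safe-AV's actual logic.

```python
def simplex_select(complex_cmd, baseline_cmd, state, safe):
    """Simplex-style decision logic: forward the complex controller's
    command only if it keeps the vehicle inside the proven safe envelope;
    otherwise switch to the verified baseline controller's command."""
    return complex_cmd if safe(state, complex_cmd) else baseline_cmd

# Hypothetical safe envelope: commanded speed (m/s) must stay under a
# limit that shrinks as the gap to the lead vehicle closes
def safe(state, cmd):
    return cmd["speed"] <= 0.5 * state["gap_m"]

close_gap = {"gap_m": 40.0}
wide_gap = {"gap_m": 60.0}
print(simplex_select({"speed": 25.0}, {"speed": 15.0}, close_gap, safe))
print(simplex_select({"speed": 25.0}, {"speed": 15.0}, wide_gap, safe))
```

The key design property is that only the envelope check and the baseline controller need to be verified to the highest assurance level; the complex (possibly learned) controller can fail arbitrarily without violating safety, which is why Simplex pairs well with the STPA-derived constraints described above.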
