121 |
Risk and performance based fire safety design of steel and composite structures / Lange, David / January 2009
Developing performance based design on a proper scientific basis inevitably requires the concept of risk. Applying this concept to actual structural design is not simple, however, because of the large ranges of probability and consequence of the events involved. This is compounded by the plethora of actions that can be taken to reduce both the probabilities of events and the magnitude of their consequences; reducing the magnitude of these consequences is essentially the goal of design. This work addresses the challenges posed by applying the concepts of performance based design to structures in fire. Simple methodologies, based on the fundamental behaviour of structures in fire, have been developed for assessing the consequences of an extreme event. One methodology assesses the capacity/deflection behaviour of floor slabs through their complete thermal deflection history. It takes into account the positive effects on capacity of the membrane stresses at the slab's boundaries at low deflections, as well as the final capacity provided by the tensile membrane action of the reinforcement mesh at high deflections. For the vertical stability of structures in fire, analytical equations are developed to describe the behaviour of floor systems at the perimeter of a building. From these equations, the resulting pull-in forces on external columns can be calculated, as well as the resulting horizontal load applied to the column. From this, a simple stability assessment is proposed which can be used to assess the consequences of multiple-floor fires on tall buildings. These analytical methodologies are brought together in a risk based framework for structural design which can be used to identify areas of a building, or structural components, which pose a high residual risk.
These elements can be qualitatively 'ranked' according to their relative risk and appropriate measures taken to reduce the risk to an acceptable level. The framework is illustrated via two case studies: the first of a typical small office building, the second of a prestige office development.
|
122 |
Prevalence and determinants of adolescent sexual risk behavior / Siperko, Christel Marie Helene / 10 April 2008
No description available.
|
123 |
Risk assessment of the Naval Postgraduate School gigabit network / Shumaker, Todd; Rowlands, Dennis / 09 1900
Approved for public release; distribution is unlimited / This research thoroughly examines the current Naval Postgraduate School Gigabit Network security posture, identifies possible threats and vulnerabilities, and recommends appropriate safeguards to counter them. The research covers the aspects of computer security, physical security, personnel security, and communications security applicable to the overall security of both the .mil and .edu domains. The goal of the research was to ensure that the campus network operates with sufficient security safeguards to adequately protect confidentiality, integrity, availability, and authenticity against both insider and outsider threats. Risk analysis was performed by assessing all possible threat and vulnerability combinations to determine the likelihood of exploitation and the potential impact exploitation could have on the system, the information, and the mission of the Naval Postgraduate School. The results of the risk assessment performed on the network are to be used by the Designated Approving Authority of the Naval Postgraduate School Gigabit Network when deciding whether to accredit the system. / Civilian, Research Associate
|
124 |
A Statistical Approach for Assessing Seismic Transitions Associated with Fluid Injections / Wang, Pengyun / 01 December 2016
The wide application of fluid injection has raised concern about the potential critical risk associated with induced seismicity. To help clarify this concern, this dissertation proposes a statistical approach for assessing seismic transitions associated with fluid injections by scientifically analyzing instrumental measures of seismic events. The assessment problem is challenging because of the uncertain effects of wastewater injections on regional seismicity, along with the limited availability of seismic and injection data. To overcome these challenges, three statistical methods are developed, each focused on a different aspect of the problem. The first method provides early detection of induced seismicity, potentially allowing site managers and regulators to act promptly and communities to prepare for the increased seismic risk; the second addresses the further need to quantitatively assess the transition to induced seismicity, which can reveal the underlying process and provide data to support probabilistic seismic hazard analysis; and the third characterizes the spatial distribution of induced seismicity, accounting for its spatial evolution. All of the proposed methods are built on Bayesian principles, which provide a flexible inference framework for incorporating domain expertise and data uncertainty.
The effectiveness of the proposed methods is demonstrated using the earthquake dataset for the state of Oklahoma, with promising results: the detection method is able to issue warnings of induced seismicity well before the occurrence of severe consequences; the transition model fits the dataset significantly better than the classical model and sheds light on the underlying transition of induced seismicity in Oklahoma; and the spatio-temporal model provides the most comprehensive characterization of the dataset's spatial and temporal properties and is shown to have much better short-term forecasting performance than naïve methods. The proposed methods can be used in combination as a decision-support tool to identify areas with increasing levels of seismic risk in a quantitative manner, supporting a comprehensive assessment of which risk-mitigation strategy should be recommended.
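The early-detection idea can be illustrated with a toy change-point scan over annual event counts. This is a minimal maximum-likelihood sketch standing in for the dissertation's Bayesian treatment; the function name and the synthetic counts are invented for illustration:

```python
import math

def detect_rate_change(counts):
    """Return the index splitting the series into two Poisson regimes with
    maximal log-likelihood, or None if no split beats a single rate."""
    def loglik(xs):
        # Poisson log-likelihood at the MLE rate; the log(x!) terms cancel
        # when comparing splits of the same data, so they are omitted.
        lam = sum(xs) / len(xs)
        return sum(x * math.log(lam) - lam for x in xs) if lam > 0 else 0.0

    best_split, best_ll = None, loglik(counts)
    for t in range(1, len(counts)):
        ll = loglik(counts[:t]) + loglik(counts[t:])
        if ll > best_ll:
            best_split, best_ll = t, ll
    return best_split

# Synthetic yearly counts: a low background rate followed by a sharp rise,
# loosely mimicking a post-injection increase in seismicity.
counts = [2, 3, 1, 2, 3, 2, 15, 20, 18, 25]
print(detect_rate_change(counts))  # -> 6, onset of the elevated regime
```

A Bayesian version would place priors on the two rates and on the change point and report a posterior over the onset year, which is how uncertainty enters the dissertation's framework.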
|
125 |
The ENCOURAGEing workplaces project: the addition of a fitness based health risk assessment to a physical activity counseling intervention / Hamm, Naomi / 13 September 2016
There has been large growth in workplace wellness initiatives; however, the use of fitness based health risk assessments (fHRAs) remains largely unexplored. I hypothesized that adding an fHRA to a physical activity counseling intervention (PAC+HRA) would increase physical activity levels more than physical activity counseling alone (PAC). A 4-month, two-group quasi-experimental design was used to test this hypothesis.
Over time, there was an increase in total, moderate-to-vigorous, and moderate physical activity in ≥10-minute bouts. Self-Efficacy for Exercise increased and symptoms of depression decreased. Subgroup analysis of the PAC+HRA group found a significant improvement in overall fitness levels, and participants progressed to more advanced stages of change. In conclusion, PAC+HRA did not increase physical activity levels more than PAC, likely due to the characteristics of the counseling, the fHRA, and the outcome measurements. / October 2016
|
126 |
Modeling Spatially Varying Effects of Chemical Mixtures / Czarnota, Jenna / 01 January 2016
Cancer incidence is associated with exposure to multiple environmental chemicals, and geographic variation in cancer rates suggests the importance of accommodating spatially varying effects in the analysis of environmental chemical mixtures and disease risk. Traditional regression methods are challenged by the complex correlation patterns among co-occurring chemicals, and the applicability of geographically weighted regression models is limited in the setting of environmental chemical risk analysis. In comparison to traditional methods, weighted quantile sum (WQS) regression performs well at identifying important environmental exposures but is limited by the assumption that effects are fixed over space. We present an extension of the WQS method, local weighted quantile sum (LWQS) regression, that models spatially varying chemical mixture effects, and we assess through a simulation study its ability to identify important environmental risk factors over space. We use two approaches to estimate the LWQS model based on variable subspaces: one uses an ensemble of variable subsets of the same size, and the other selects the best subset over a range of candidate subset sizes according to model goodness-of-fit. We assess the performance of both estimation methods in simulated scenarios that incorporate increasingly complex levels of spatial dependency and consider correlation patterns from observed exposure data. The results demonstrate that LWQS can replicate spatially dependent mixture effects and correctly identify important exposures in a mixture of environmental chemicals. In all scenarios, the best-subset approach correctly chose an index containing only the important chemicals and improved on the accuracy of the chemical importance weights in comparison with the ensemble solutions.
Future work will evaluate whether the ensemble subset approach performs relatively better with larger chemical mixtures of highly correlated components.
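The core of a WQS-style index is simple to sketch: rank-transform each exposure into quantile scores and combine them with non-negative weights that sum to one. The helper names and data below are invented for illustration; estimating the weights (and, in LWQS, letting them vary over space) is the part the dissertation's simulation study evaluates:

```python
import numpy as np

def quantile_scores(x, n_quantiles=4):
    """Map one exposure vector to integer quantile scores 0..n_quantiles-1."""
    edges = np.quantile(x, np.linspace(0, 1, n_quantiles + 1)[1:-1])
    return np.searchsorted(edges, x, side="right")

def wqs_index(X, weights):
    """Weighted quantile sum index for an n-by-p exposure matrix X."""
    w = np.asarray(weights, dtype=float)
    # WQS constrains the weights to be non-negative and sum to one.
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)
    Q = np.column_stack([quantile_scores(X[:, j]) for j in range(X.shape[1])])
    return Q @ w  # one index value per subject, in [0, n_quantiles - 1]

rng = np.random.default_rng(0)
X = rng.lognormal(size=(100, 3))      # three hypothetical chemical exposures
idx = wqs_index(X, [0.6, 0.3, 0.1])   # weights emphasizing the first chemical
```

In a full WQS fit the weights are estimated (typically via bootstrap) and the index enters a regression against the health outcome; LWQS additionally lets the weights differ by location by borrowing strength from nearby observations.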
|
127 |
Shortened in Vivo Bioconcentration Factor Testing in Cyprinus carpio / Cantu, Mark / 12 1900
Bioconcentration factor (BCF) testing serves as the most valuable surrogate for the assessment of bioaccumulation. The assessment of potentially harmful chemicals is crucial not only to the health of aquatic environments but also to humans. Chemicals that persist in the environment, or that have the potential to bioaccumulate, pose a greater risk to the organisms exposed to them. The Organisation for Economic Co-operation and Development (OECD) Guideline 305 outlines specific protocols for running an accurate and reliable aquatic flow-through test. However, since its adoption in 1996, very few changes have been made to support the effort to reduce the number of test animals required. Running an aquatic flow-through test according to Guideline 305 takes considerable time and money as well as large numbers of fish. These burdens can be eased through simple modifications to the standard protocols. In this study, we propose an abbreviated design for aquatic bioconcentration testing which effectively alleviates the burdens of running a flow-through test. Four chemicals were used individually to evaluate the usefulness of the proposed shortened design: 4-nonylphenol, chlorpyrifos, musk xylene, and DDT. The study consisted of exposing Cyprinus carpio for 7 days followed by 7 days of depuration, for a total 14-day study. Our results for each of the four compounds are consistent with literature values, demonstrating that the kinetic bioconcentration factor (BCFk) can be accurately predicted in an abbreviated in vivo test.
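The kinetic BCF that such a design targets comes from the standard two-compartment model dC_f/dt = k1*C_w - k2*C_f, with BCFk = k1/k2. The following is a minimal estimation sketch; the function name and synthetic data are ours, and a real Guideline 305 analysis would additionally apply growth and lipid corrections to measured concentrations:

```python
import math

def estimate_bcf_k(uptake, depuration, c_w, t_split):
    """Estimate BCFk = k1/k2 from (time, fish concentration) samples.
    uptake/depuration are lists of (t, C_f); c_w is the constant water
    concentration; t_split is the start of the depuration phase."""
    # k2 is the negative slope of ln(C_f) vs. time during depuration.
    ts = [t - t_split for t, _ in depuration]
    ys = [math.log(c) for _, c in depuration]
    n = len(ts)
    tbar, ybar = sum(ts) / n, sum(ys) / n
    k2 = -sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) \
        / sum((t - tbar) ** 2 for t in ts)
    # BCFk by least squares on C_f(t) = BCFk * c_w * (1 - exp(-k2 t)).
    f = [c_w * (1 - math.exp(-k2 * t)) for t, _ in uptake]
    num = sum(c * fi for (_, c), fi in zip(uptake, f))
    den = sum(fi ** 2 for fi in f)
    return num / den

# Noise-free synthetic fish with k1 = 10, k2 = 0.5 (so BCFk = 20), c_w = 1
up = [(t, 20 * (1 - math.exp(-0.5 * t))) for t in range(1, 8)]
peak = 20 * (1 - math.exp(-0.5 * 7))
dep = [(t, peak * math.exp(-0.5 * (t - 7))) for t in range(8, 15)]
print(round(estimate_bcf_k(up, dep, 1.0, 7), 3))  # -> 20.0
```

The shortened design works because both rate constants are identifiable from a few days of each phase when the model holds, which is why a 7-day exposure plus 7-day depuration can reproduce literature BCF values.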
|
128 |
An Empirical Study of Privacy Risk Assessment Methodologies in Cloud Computing Environments / Pauley, Wayne A., Jr. / 01 January 2012
Companies offering services on the Internet have led corporations to shift from the high cost of owning and maintaining stand-alone, privately owned and operated infrastructure to a shared infrastructure model. These shared infrastructures are offered by infrastructure service providers with subscription or pay-on-demand charge models, presenting compute and storage resources as a generalized utility. Utility-based infrastructures run by service providers have been defined as "cloud computing" by the National Institute of Standards and Technology.
In the cloud computing model, the concerns of security and privacy protection are exacerbated by the requirement that an enterprise allow third parties to own and manage the infrastructure and act as custodians of the enterprise's information. With this new architectural model come new hybrid governance models designed to support complex and uncertain environments. The cloud also requires a common infrastructure that integrates originally separate computing silos. Provisioning and orchestration must remain aware of privacy and security policy, and of data locality across domains and jurisdictions, so that legal and regulatory constraints are obeyed.
Commercial use of the Internet for electronic commerce has been growing at a phenomenal rate, while consumers' concern about the information gathered about them has also risen. Concern about the privacy of data has been rated the number one barrier across industries.
The purpose of this dissertation is to perform an empirical study to determine whether existing privacy assessment instruments adequately assess privacy risks when applied to cloud infrastructures. The methodology is to apply a specific set of privacy risk assessments against three cloud environments. The assessments are run in the context of a typical web-based application deployed to cloud providers that exhibit the five key cloud tenets: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.
|
129 |
Do Objective Measures reduce the Disproportionate Rates of Minority Youth Placed in Detention: Validation of a Risk Assessment Instrument? / Simpson, Tiffany / 14 May 2010
The overrepresentation of youth of color in the juvenile justice system, often referred to as disproportionate minority contact (DMC), can be found at many stages of the juvenile justice continuum. Further, research has shown that overrepresentation is not necessarily related to higher rates of criminal activity and suggests that case-processing disparities can contribute to DMC. Risk assessment instruments (RAIs) are objective techniques used to make decisions about youth in the juvenile justice system. This study examined the effects of implementing an RAI designed to make detention decisions in a predominantly rural parish in Louisiana. Police officers from three law enforcement agencies investigated 202 cases during the evaluation period. The measures included an objective detention risk screening instrument; a contact form containing juvenile demographic information; a two-item questionnaire assessing law enforcement's impression of the youth's need for detention placement and risk to public safety; and an arrest coding sheet assessing subsequent police contacts and arrests over 3 and 6 months of street time (i.e., time outside of secure confinement). Results revealed that, overall, law enforcement was unwilling to consistently complete the tool and continued to use subjective decision making, with completion rates ranging from 61% to 97% across the participating agencies. Subjective decision making actually helped minority youth, as officers consistently disregarded the formal overrides included in the RAI, resulting in fewer minority youth being detained than the RAI indicated. Further, implementation of the tool as constructed produced small but insignificant reductions in the rates of overall confinement and of minority confinement compared with the same period of the previous year.
Additionally, the RAI did not significantly predict future police contact, owing to items that did not predict recidivism in this sample; use of a three-item version significantly increased the tool's predictive ability. This study demonstrates the importance of additional validity testing following the implementation of detention risk assessment instruments, to ensure that these tools reduce unnecessary confinement while protecting public safety.
|
130 |
USING ASSESSMENT INSTRUMENTS TO PREDICT RECIDIVISM FOLLOWING A LIFESTYLE CHANGE PROGRAM / Cripps, Emily Jane / 01 May 2019
The vast number of individuals under correctional supervision in the United States has been an area of concern for decades. The correctional population as a whole comprises approximately six million individuals, approximately four million of whom are serving community sentences. Given those numbers, it is essential to provide adequate services and resources to those serving community sentences. Adding to the concern is the immense number of offenders with mental illness under correctional supervision. Often, offenders with mental illness receive psychiatric services, but treatment programs that address the causes of criminal activity are neglected. The goal of this study is to examine scores from two assessment instruments, measuring criminal thinking and the therapeutic alliance, to determine their ability to predict future criminal activity in a sample of thirty-five probationers with mental illness. Probationers completed the Psychological Inventory of Criminal Thinking Styles, to determine the extent of criminal cognitions, and the Working Alliance Inventory, to measure the therapist-patient relationship and agreement on the goals and tasks of therapy. Results indicate that probationers who scored less favorably on each of the scales were more likely to obtain a new charge following completion of the program; in particular, less agreement on the tasks of therapy was a significant predictor of future criminal activity. This study adds to the correctional mental health treatment literature, illuminates areas that can be improved, and provides recommendations for future research.
|