331 |
The Occurrence of Vibrio vulnificus, V. parahaemolyticus and V. cholerae in the Indian River Lagoon, Florida, with Implications for Human Health. Unknown Date (has links)
Vibrio bacteria are emerging pathogens responsible for 80,000 illnesses and 100
deaths in the United States each year. Infections are directly linked to the marine
environment and are acquired by consuming contaminated seafood or exposing wounds
during aquatic activities. Florida has the highest national incidence of vibriosis, with 20%
of its cases reported from the Indian River Lagoon region, a popular recreation destination.
This study utilized a combination of cultivation and molecular techniques to investigate
the distribution of V. vulnificus, V. parahaemolyticus and V. cholerae in this local
waterway. The targeted species were found in an array of sample types that may facilitate their
transmission to humans. Overall, these bacteria were abundant in estuarine sediments (Vp:
2,439 CFU/g, Vv: 303 CFU/g, Vc: 176 CFU/g), on the sharp edges of oyster shells (Vp: 82
CFU/cm, Vv: 102 CFU/cm, Vc: 41 CFU/cm), and in the water column (Vp: 3.78 CFU/ml,
Vv: 5.51 CFU/ml, Vc: 2.46 CFU/ml). Vibrio spp. also pose a hazard to recreational anglers, as
they were recovered from fish (Vp: 61%, Vv: 55%, Vc: 30%), live bait shrimp (Vp: 80%,
Vv: 37%, Vc: 0%) and hooks (Vp: 32%, Vv: 18%, Vc: 0%). Additionally, a molecular
analysis of V. vulnificus virulence genotypes revealed that the local population was dominated by
disease-causing (vcgC) strains, which may explain why wound-related infections are
common in this region.
Vibrio occurrence varied both spatially and temporally due to the bacteria's relationship with
salinity and temperature. These bacteria exhibited a strong negative correlation with
salinity, being particularly abundant near freshwater discharge locations. Due to Florida’s
year-round warm climate, these species were found to be permanent members of the local
microbial community. Seasonal peaks in abundance occurred between August and
October, a period which corresponds with the warmest water temperatures as well as
frequent rainfall. Predictive models were constructed based on these parameters to provide
a better understanding of how, when and where Vibrio spp. may be encountered by humans.
This information is important for both water management and healthcare initiatives, with
an overall goal of improving local recreational safety. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
|
332 |
Understanding users of a freely-available online health risk assessment: an exploration using segmentation. Hodgson, Corinne January 2015 (has links)
Health organizations and governments are investing considerable resources into Internet-based health promotion. There is a large and growing body of research on health “etools” but to date most has been conducted using experimental paradigms; much less is known about those that are freely available. Analysis was conducted of the database generated through the operation of the freely available health risk assessment (HRA) of the Heart and Stroke Foundation of Ontario. During the study period of February 1 to December 20, 2011, 147,274 HRAs were completed, of which 120,510 (79.8%) included consent for the use of information for research and were completed by adults aged 18 to 90 years. Comparison of Canadian users to national statistics confirmed that the HRA sample is not representative of the general population. The HRA sample is significantly and systematically biased by gender, education, employment, health behaviours, and the prevalence of specific chronic diseases. Etool users may be a large but select segment of the population, those previously described as “Internet health information seekers.” Are all Internet health information seekers the same? To explore this issue, segmentation procedures available in common commercial packages (k-means clustering, two-step clustering, and latent class analysis) were conducted using five combinations of variables. Ten statistically significant solutions were created. The most robust solution divided the sample into four groups differentiated by age (two younger and two older groups) and healthiness, as reflected by disease and modifiable risk factor burden and readiness to make lifestyle changes. These groups suggest that while all users of online health etools may be health information seekers, they vary in the extent to which they are health oriented or health conscientious (i.e., engaging in preventive health behaviours or ready for behaviour change). It is hoped that this research will provide other organizations with similar databases with a model for analyzing their client populations, thereby increasing our knowledge about health etool users.
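A brief sketch of the kind of segmentation run described above, using k-means (one of the three procedures named) on standardized user-level variables; the synthetic records and variable names are assumptions for illustration, with four clusters chosen only to mirror the most robust solution reported.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical HRA-style records: age, BMI, smoker flag, readiness-to-change score.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.integers(18, 91, size=500),   # age in years
    rng.normal(27, 5, size=500),      # body mass index
    rng.integers(0, 2, size=500),     # current smoker (0/1)
    rng.integers(1, 6, size=500),     # readiness to change (1-5)
])

# Standardize so no single variable dominates the distance metric.
X_std = StandardScaler().fit_transform(X)

# Four segments, mirroring the four-group solution described in the abstract.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(X_std)

# Profile each segment by its mean characteristics on the original scale.
for k in range(4):
    members = X[labels == k]
    print(f"segment {k}: n={len(members)}, means={members.mean(axis=0).round(2)}")
```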
|
333 |
A Study of the Delta-Normal Method of Measuring VaR. Kondapaneni, Rajesh 09 May 2005 (has links)
This thesis describes the Delta-Normal method of computing Value-at-Risk (VaR). The advantages and disadvantages of the Delta-Normal method compared to the Historical and Monte Carlo methods of computing Value-at-Risk are discussed. The Delta-Normal method is compared with the Historical Simulation method using an implementation of a portfolio consisting of ten stocks over 400 time intervals. The choice depends on the distribution of the portfolio risk factors: the Delta-Normal method is suitable if the distribution is normal, while the Historical Simulation method is ideally suited if the distribution is non-normal.
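A minimal sketch of the two estimators the thesis compares, assuming a ten-stock portfolio observed over 400 intervals; the return data, confidence level, and portfolio value below are illustrative placeholders, not figures from the thesis.

```python
import numpy as np
from statistics import NormalDist

def delta_normal_var(weights, returns, alpha=0.99, value=1_000_000):
    """Delta-Normal VaR: assumes portfolio returns are normally distributed,
    so VaR is the normal quantile z_alpha times the portfolio return
    standard deviation, scaled by portfolio value."""
    cov = np.cov(returns, rowvar=False)           # covariance of asset returns
    sigma_p = np.sqrt(weights @ cov @ weights)    # portfolio return std dev
    return NormalDist().inv_cdf(alpha) * sigma_p * value

def historical_var(weights, returns, alpha=0.99, value=1_000_000):
    """Historical Simulation VaR: empirical quantile of past portfolio losses."""
    portfolio_returns = returns @ weights
    return -np.quantile(portfolio_returns, 1 - alpha) * value

# Illustrative data: 400 return observations for 10 stocks, equal weights.
rng = np.random.default_rng(1)
returns = rng.normal(0.0005, 0.01, size=(400, 10))
weights = np.full(10, 0.1)

print("Delta-Normal VaR:", round(delta_normal_var(weights, returns), 2))
print("Historical VaR:  ", round(historical_var(weights, returns), 2))
```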
|
334 |
An In-Depth Look at the Information Ratio. Blatt, Sharon L 24 August 2004 (has links)
"The information ratio is a very controversial topic in the business world. Some portfolio managers put a lot of weight behind this risk-analysis measurement while others believe that this financial statistic can be easily manipulated and thus shouldn't be trusted. In this paper, an attempt will be made to show both sides of this issue by defining the information ratio, applying this definition to real world situations, explaining some of the negative impacts on the information ratio, comparing this ratio to other statistical measures, and showing some ways to improve a portfolio manager's information ratio. "
|
335 |
Transforming the caries risk assessment from the individual level to the tooth and surface level. Felemban, Osama Mahmood 28 September 2016 (has links)
OBJECTIVE: Caries risk assessment tools operate on the subject level. The aim of the study is to create new caries risk assessment models that function on the tooth and surface levels to assess the caries risk of single teeth and surfaces.
METHODS: Secondary data from the Dental Longitudinal Study were used to evaluate caries symmetry. Teeth were grouped into posterior and anterior teeth. Surfaces were grouped into fissured, proximal, and facial and lingual surfaces. The prediction of future caries on a tooth or a surface by the current caries on a bilateral or adjacent tooth or surface was evaluated. Additional general and oral caries risk factors on teeth and surfaces were adopted from the American Dental Association caries risk assessment tool. Caries on bilateral and/or adjacent teeth or surfaces were augmented with significant oral clinical caries predictors to build the caries risk assessment tools for teeth and surfaces. The models were validated by calculating sensitivities and specificities.
RESULTS: 495 subjects with baseline and three-year follow-up data were included in the study. Caries prevalence and incidence were symmetrical (right and left) on the population level. On the individual level, caries incidence was symmetrical (right and left) and also tended to affect adjacent teeth or surfaces. Baseline caries on bilateral and adjacent teeth and surfaces was predictive of caries at follow-up in all groups of teeth and surfaces except posterior teeth. Local oral caries risk factors such as visible plaque, interproximal restorations, and xerostomia significantly predicted caries on single teeth and surfaces. Caries risk assessment tools were built for anterior teeth and the three groups of surfaces. The sensitivities of these tools ranged from 67.33% to 85.51%, specificities from 38.40% to 66.11%, and overall accuracies from 41.95% to 66.27%.
CONCLUSION: Dental caries is a symmetrical disease affecting the right and left sides of the mouth equally. Past caries experience is significant in predicting future caries. New models were built to assess caries risk for anterior teeth, fissured surfaces, proximal surfaces, and facial and lingual surfaces with acceptable accuracy. / 2018-09-28T00:00:00Z
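A minimal sketch of the validation step described above: computing sensitivity, specificity, and overall accuracy from tooth- or surface-level predictions. The observed and predicted vectors are hypothetical, not taken from the study data.

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity, specificity, and overall accuracy for binary
    tooth- or surface-level caries predictions (1 = caries at follow-up)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

# Hypothetical surface-level model predictions vs. three-year outcomes.
observed  = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
print(sensitivity_specificity(observed, predicted))
```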
|
336 |
Portfolio optimization with transaction costs and capital gain taxes. Shen, Weiwei January 2014 (has links)
This thesis is concerned with a new computational study of optimal investment decisions with proportional transaction costs or capital gain taxes over multiple periods. The decisions are studied for investors who have access to a risk-free asset and multiple risky assets to maximize the expected utility of terminal wealth. The risky asset returns are modeled by a discrete-time multivariate geometric Brownian motion. As in the model in Davis and Norman (1990) and Lynch and Tan (2010), the transaction cost is modeled to be proportional to the amount of transferred wealth. As in the model in Dammon et al. (2001) and Dammon et al. (2004), the taxation rule is linear, uses the weighted average tax basis price, and allows an immediate tax credit for a capital loss. For the transaction costs problem, we compute both lower and upper bounds for optimal solutions. We propose three trading strategies to obtain the lower bounds: the hyper-sphere strategy (termed HS); the hyper-cube strategy (termed HC); and the value function optimization strategy (termed VF). The first two strategies parameterize the associated no-trading region by a hyper-sphere and a hyper-cube, respectively. The third strategy relies on approximate value functions used in an approximate dynamic programming algorithm. In order to examine their quality, we compute the upper bounds by a modified gradient-based duality method (termed MG). We apply the new methods across various parameter sets and compare their results with those from the methods in Brown and Smith (2011). We are able to numerically solve problems up to the size of 20 risky assets and a 40-year-long horizon. Compared with their methods, the three novel lower bound methods can achieve higher utilities. HS and HC are about one order of magnitude faster in computation time. The upper bounds from MG are tighter in various examples. The new duality gap is ten times narrower than the one in Brown and Smith (2011) in the best case. In addition, we illustrate how the no-trading region deforms when it reaches the borrowing constraint boundary in state space. To the best of our knowledge, this is the first study of the deformation in no-trading region shape resulting from the borrowing constraint. In particular, we demonstrate how the rectangular no-trading region generated in uncorrelated risky asset cases (see, e.g., Lynch and Tan, 2010; Goodman and Ostrov, 2010) transforms into a non-convex region due to the binding of the constraint.

For the capital gain taxes problem, we allow wash sales and rule out "shorting against the box" by imposing nonnegativity on portfolio positions. In order to produce accurate results, we sample the risky asset returns from their continuous distribution directly, leading to a dynamic program with continuous decision and state spaces. We provide ingredients of effective error control in an approximate dynamic programming solution method. Accordingly, the relative numerical error in approximating value functions by a polynomial basis function is about 10^-5 measured by the l1 norm and about 10^-10 by the l2 norm. Through highly accurate numerical solutions and transformed state variables, we are able to explain the optimal trades through an associated no-trading region. We numerically show that in the new state space the no-trading region has a similar shape and parameter sensitivity to that of the transaction costs problem in Muthuraman and Kumar (2006) and Lynch and Tan (2010).
Our computational results elucidate the impact on the no-trading region of volatilities, tax rates, investors' risk aversion, and correlations among risky assets. To the best of our knowledge, this is the first time the no-trading region of the capital gain taxes problem has been shown to share such similar traits with that of the transaction costs problem. We also compute lower and upper bounds for the problem. To obtain the lower bounds we propose five novel trading strategies: the value function optimization (VF) strategy from approximate dynamic programming; the myopic optimization and the rolling buy-and-hold heuristic strategies (MO and RBH); and the realized Merton's and hyper-cube strategies (RM and HC) from policy approximation. In order to examine their performance, we develop two upper bound methods (VUB and GUB) based on the duality technique in Brown et al. (2009) and Brown and Smith (2011). Across various sets of parameters, duality gaps between lower and upper bounds are smaller than 3% in most examples. We are able to solve the problem up to the size of 20 risky assets and a 30-year-long horizon.
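As a rough illustration of the hyper-cube (HC) idea, the sketch below rebalances only when a risky-asset weight drifts outside a cube centered on its target weight; the target weights, drifted weights, and cube half-width are made-up values, and the thesis itself chooses the cube size by numerical optimization under transaction costs.

```python
import numpy as np

def hypercube_rebalance(current_weights, target_weights, half_width):
    """Hyper-cube trading rule: do nothing while every risky-asset weight
    stays within +/- half_width of its target; otherwise trade each
    out-of-band weight back to the nearest face of the cube.
    A simplified sketch; the thesis sizes the cube numerically and
    accounts for proportional transaction costs in the objective."""
    w = np.asarray(current_weights, dtype=float)
    lo = np.asarray(target_weights) - half_width
    hi = np.asarray(target_weights) + half_width
    if np.all((w >= lo) & (w <= hi)):
        return w, np.zeros_like(w)          # inside the no-trading region
    new_w = np.clip(w, lo, hi)              # trade to the cube boundary
    return new_w, new_w - w                 # new weights and required trades

target = np.array([0.30, 0.25, 0.20])       # e.g., frictionless (Merton-style) targets
drifted = np.array([0.38, 0.22, 0.15])      # weights after market moves
print(hypercube_rebalance(drifted, target, half_width=0.05))
```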
|
337 |
Efficient Simulation Methods for Estimating Risk Measures. Du, Yiping January 2011 (has links)
In this thesis, we analyze the computational problem of estimating financial risk in nested Monte Carlo simulation. An outer simulation is used to generate financial scenarios, and an inner simulation is used to estimate future portfolio values in each scenario. Mean squared error (MSE) for standard nested simulation converges at the rate $k^{-2/3}$, where $k$ is the computational budget.
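A toy sketch of the standard (uniform) nested estimator described above, for the probability that a portfolio loses more than a threshold: an outer loop samples risk-factor scenarios at the horizon and an inner loop estimates the conditional portfolio value. The single-asset dynamics, contract payoff, and current value used below are illustrative assumptions, not the thesis's test problems.

```python
import numpy as np

rng = np.random.default_rng(42)

def nested_probability_of_large_loss(n_outer, n_inner, loss_threshold=0.15):
    """Standard nested simulation: outer scenarios for the risk factor at the
    risk horizon, inner simulation to estimate the conditional portfolio value,
    then the fraction of scenarios whose estimated loss exceeds the threshold."""
    s0, v_0 = 100.0, 10.0                 # assumed spot and current contract value
    # Outer stage: asset value at the risk horizon.
    s_tau = s0 * np.exp(rng.normal(-0.02, 0.2, size=n_outer))
    losses = np.empty(n_outer)
    for i, s in enumerate(s_tau):
        # Inner stage: estimate the value of a call-like payoff at maturity,
        # conditional on the scenario.
        s_T = s * np.exp(rng.normal(-0.02, 0.2, size=n_inner))
        v_tau = np.mean(np.maximum(s_T - 100.0, 0.0))
        losses[i] = (v_0 - v_tau) / v_0
    return np.mean(losses > loss_threshold)

# Budget k ~ n_outer * n_inner; the thesis studies how to split and allocate it.
print(nested_probability_of_large_loss(n_outer=2000, n_inner=100))
```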
In the first part of this thesis, we focus on one risk measure, the probability of a large loss, and we propose a new algorithm to estimate this risk. Our algorithm sequentially allocates computational effort in the inner simulation based on marginal changes in the risk estimator in each scenario. Theoretical results are given to show that the risk estimator has an asymptotic MSE of order $k^{-4/5+\epsilon}$, for all positive $\epsilon$, which converges faster than the conventional uniform inner sampling approach. Numerical results consistent with the theory are presented.
In the second part of this thesis, we introduce a regression-based nested Monte Carlo simulation method for risk estimation. The proposed regression method combines information from different risk factor realizations to provide a better estimate of the portfolio loss function. The MSE of the regression method converges at the rate $k^{-1}$ until reaching an asymptotic bias level which depends on the magnitude of the regression error. Numerical results consistent with our theoretical analysis are provided and numerical comparisons with other methods are also given.
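A minimal sketch of the regression idea under the same toy setup as the nested-simulation sketch above: each scenario gets only a few inner samples, and a regression of the noisy loss estimates on the risk factor pools information across scenarios before the risk measure is evaluated. The cubic polynomial basis and sample sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def regression_nested_estimate(n_outer, n_inner, loss_threshold=0.15):
    """Regression-based nested simulation sketch: pool noisy inner estimates
    across outer scenarios via a polynomial regression on the risk factor,
    then evaluate the risk measure on the fitted (smoothed) loss function."""
    s0, v_0 = 100.0, 10.0
    s_tau = s0 * np.exp(rng.normal(-0.02, 0.2, size=n_outer))
    noisy_losses = np.empty(n_outer)
    for i, s in enumerate(s_tau):
        s_T = s * np.exp(rng.normal(-0.02, 0.2, size=n_inner))
        v_tau = np.mean(np.maximum(s_T - 100.0, 0.0))
        noisy_losses[i] = (v_0 - v_tau) / v_0
    # Regress noisy loss estimates on a cubic polynomial in the risk factor.
    coeffs = np.polyfit(s_tau, noisy_losses, deg=3)
    fitted_losses = np.polyval(coeffs, s_tau)
    return np.mean(fitted_losses > loss_threshold)

# Few inner samples per scenario; the regression pools information across scenarios.
print(regression_nested_estimate(n_outer=5000, n_inner=5))
```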
In the third part of this thesis, we propose a method based on weighted regression. Similar to the unweighted regression method, the MSE of the weighted regression method converges at the rate $k^{-1}$ until reaching an asymptotic bias level, which depends on the size of the regression error. However, the weighted approach further reduces MSE by emphasizing scenarios that are more important to the calculation of the risk measure. We find a globally optimal weighting strategy for general risk measures in an idealized setting. For applications, we propose and test a practically implementable two-pass method, where the first pass uses an unweighted regression and the second pass uses weights based on the first pass.
|
338 |
Modeling and Analyzing Systemic Risk in Complex Sociotechnical Systems: The Role of Teleology, Feedback, and Emergence. Zhang, Zhizun January 2018 (has links)
Recent systemic failures such as the BP Deepwater Horizon Oil Spill, Global Financial Crisis, and Northeast Blackout have reminded us, once again, of the fragility of complex sociotechnical systems. Although the failures occurred in very different domains and were triggered by different events, there are, however, certain common underlying mechanisms of abnormalities driving these systemic failures. Understanding these mechanisms is essential to avoid such disasters in the future. Moreover, these disasters happened in sociotechnical systems, where both social and technical elements can interact with each other and with the environment. The nonlinear interactions among these components can lead to an “emergent” behavior – i.e., the behavior of the whole is more than the sum of its parts – that can be difficult to anticipate and control. Abnormalities can propagate through the systems to cause systemic failures. To ensure the safe operation and production of such complex systems, we need to understand and model the associated systemic risk.
The traditional emphasis of chemical engineering risk modeling is on the technical components of a chemical plant, such as equipment and processes. However, a chemical plant is more than a set of equipment and processes, with the human elements playing a critical role in decision-making. Industrial statistics show that about 70% of accidents are caused by human error. So, new modeling techniques that go beyond the classical equipment/process-oriented approaches to include the human elements (i.e., the “socio” part of the sociotechnical systems) are needed for analyzing systemic risk of complex sociotechnical systems. This thesis presents such an approach.
This thesis presents a new knowledge modeling paradigm for systemic risk analysis that goes beyond chemical plants by unifying different perspectives. First, we develop a unifying teleological, control theoretic framework to model decision-making knowledge in a complex system. The framework allows us to identify systematically the common failure mechanisms behind systemic failures in different domains. We show how cause-and-effect knowledge can be incorporated into this framework by using signed directed graphs. We also develop an ontology-driven knowledge modeling component and show how this can support decision-making by using a case study in public health emergency. This is the first such attempt to develop an ontology for public health documents. Lastly, from a control-theoretic perspective, we address the question, “how do simple individual components of a system interact to produce a system behavior that cannot be explained by the behavior of just the individual components alone?” Through this effort, we attempt to bridge the knowledge gap between control theory and complexity science.
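A small sketch of how cause-and-effect knowledge can be encoded in a signed directed graph and a disturbance sign propagated through it; the node names and edge signs below are invented for illustration and are not taken from the thesis.

```python
# Minimal signed-directed-graph (SDG) sketch: nodes are process or decision
# variables, and each signed edge says whether an increase in the source tends
# to increase (+1) or decrease (-1) the target. Propagating a disturbance sign
# through the graph traces candidate paths of abnormality.
from collections import deque

edges = {
    "feed_flow":          [("reactor_level", +1)],
    "reactor_level":      [("reactor_pressure", +1)],
    "reactor_pressure":   [("relief_valve_demand", +1), ("product_purity", -1)],
    "operator_alertness": [("relief_valve_demand", -1)],
}

def propagate(source, sign=+1):
    """Breadth-first propagation of a qualitative disturbance sign."""
    effects, queue = {source: sign}, deque([source])
    while queue:
        node = queue.popleft()
        for target, edge_sign in edges.get(node, []):
            if target not in effects:
                effects[target] = effects[node] * edge_sign
                queue.append(target)
    return effects

# A rise in feed flow qualitatively raises level, pressure, and relief demand,
# and lowers purity -- one candidate mechanism behind a systemic failure.
print(propagate("feed_flow", +1))
```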
|
339 |
Building a semantics-assisted risk analysis (SARA) framework for vendor risk management. / CUHK electronic theses & dissertations collection / ProQuest dissertations and theses. January 2007 (has links)
Recently, electronic procurement or eProcurement has gradually acquired wide acceptance in various industries as an effective, efficient, and cost-saving mechanism to search for and contact potential vendors over the Internet. However, it is also a common situation that purchasers do not have handy and reliable tools for evaluating the risk deriving from their choices of selecting seemingly promising but unfamiliar vendors identified through the eProcurement mechanism. The purchasing corporations need to implement a systematic framework to identify and assess the risks associated with their vendor choices, that is, the vendor risk, and even to memorize their collective experience on risk analysis, while they try to gain benefits from the practice of the eProcurement strategy. / Although there are several solutions available in the industry to manage the vendor risk confronting corporate purchasers in their practice of the traditional procurement mechanism, they are not widely accepted among industries practicing that mechanism, and they are not feasible to implement in the eProcurement mechanism. They rely heavily on self-assessment data provided by vendors or transaction records from purchasing departments, and there is a lack of a systematic approach to accumulate the collective experience of the corporation in vendor risk management. / This study proposes the establishment of the Vendor Risk Analysis (VRA) system to assist procurement officers in vendor risk analysis as a support to their decision of seeking promising vendors over the Internet. The VRA system adopts a Semantic-Assisted Risk Analysis (SARA) framework to implement an innovative approach to risk assessment. The SARA framework deploys the collaboration of a knowledge-based Expert System and several emerging semantic technologies, including Information Extraction, a Community Template Repository, and a Semantic Platform for Information Indexing and Retrieval, to enhance the capability of the VRA system to acquire sufficient risk evidence over the Internet and provide timely and reliable risk assessment support for vendor choice decisions. / The structure for the establishment of the semantic application identified in this study can be generalized as a common framework for developing an automatic information extractor that acquires Internet content as support for making important business decisions. The structure is composed of three basic components: (1) an information collection method to identify specific information over the Internet through the deployment of semantic technology, (2) an ontology repository to associate the collected data with the specific data schema, and (3) a scheme to associate the data schema with the analytical methods deployed to provide decision support. / Moreover, the risk cause taxonomy identified in this study lays out the theoretical grounds for the development of any software applications relating to the deployment of risk perceptions held by procurement professionals and practitioners. / Chou, Ling Yu. / "July 2007." / Advisers: Vincent Sie-king Lai; Timon Chih-ting Du. / Source: Dissertation Abstracts International, Volume: 68-12, Section: A, page: 5128. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2007. / Includes bibliographical references (p. 178-186). / Abstracts in English and Chinese. / School code: 1307.
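A bare-bones sketch of the three-component structure enumerated above, written as a pipeline with stubbed implementations; the class names, fields, and the toy risk score are hypothetical and only indicate where information collection, the ontology-backed schema, and the analytical method would plug in.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class VendorRecord:
    """Ontology-backed data schema (component 2) -- fields are illustrative."""
    vendor: str
    attributes: Dict[str, float]


def collect(sources: List[str]) -> List[Dict[str, str]]:
    """Component 1: gather raw evidence. Stubbed here; a real system would
    use semantic information extraction over Internet content."""
    return [{"vendor": s, "late_deliveries": "3", "years_active": "12"} for s in sources]


def map_to_schema(raw: List[Dict[str, str]]) -> List[VendorRecord]:
    """Component 2: associate the collected data with the ontology's schema."""
    return [VendorRecord(r["vendor"],
                         {"late_deliveries": float(r["late_deliveries"]),
                          "years_active": float(r["years_active"])}) for r in raw]


def risk_score(rec: VendorRecord) -> float:
    """Component 3: an analytical method attached to the schema (toy score)."""
    return rec.attributes["late_deliveries"] / max(rec.attributes["years_active"], 1.0)


def analyze(sources: List[str], method: Callable[[VendorRecord], float]) -> Dict[str, float]:
    """Run the three components end to end for a set of candidate vendors."""
    return {rec.vendor: method(rec) for rec in map_to_schema(collect(sources))}


print(analyze(["vendor-a.example", "vendor-b.example"], risk_score))
```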
|
340 |
The sorption fate of active pharmaceutical ingredients in soils receiving high wastewater inputs and implications for risk assessments. Lees, Katherine Edith January 2018 (has links)
Population growth, increasing affluence, and greater access to medicines have led to an increase in active pharmaceutical ingredients (APIs) entering sewerage networks. Lower and lower-middle-income countries that use wastewater for irrigation may use untreated or poorly treated wastewater, resulting in the potential for greater concentrations of APIs to enter soils in this way. Wastewater re-used for irrigation is currently not included in environmental risk assessments for APIs in soils. The addition of wastewater to soils changes the organic content and can increase the pH of soils, which will have an impact on the fate of any ionisable APIs introduced during the irrigation process. As the input of APIs to soil from wastewater irrigation is not currently included in the risk assessments, this is an area that requires increased attention. A study was undertaken using a modified sorption-desorption batch equilibrium method (OECD 106) to simulate the addition of synthetic wastewater (SWW) to soils, compared to a normal OECD 106 study. The APIs studied were ofloxacin, propranolol, naproxen and nevirapine, which represent a range of API physico-chemical properties. These experiments showed that the changes to soil properties (pH and dissolved organic carbon (DOC)) caused by irrigation with SWW can change the fate of APIs in soils. The ionisation state of the API at the altered pH was more important for the positively charged propranolol than it was for the negatively charged naproxen and neutral nevirapine. The Kd and log Koc increased during the sorption experiment in some cases with SWW. This has implications for the current terrestrial risk assessment, where the trigger value for a more detailed soil risk assessment is log Koc > 4. If the experiment is performed only in 10 mM CaCl2, as is currently required, the risks of APIs in wastewater-irrigated soils may not be taken into account. Three soil sterilisation or microbial enzyme suppression methods were investigated to identify how successful they were and whether they had any impact on the soil's physico-chemical structure. Gamma irradiation, autoclaving and the addition of 0.2 g L-1 sodium azide were studied. None of the methods successfully sterilised the soils, and some changes in soils were identified post-treatment. Autoclaving destroyed the soil structure, turning it into a fine powder and significantly increasing DOC. Sodium azide changed the pH of the loam soil but not the sandy loam soil. The literature suggested that gamma irradiation was the most likely to sterilise the soils with the least disturbance to their physico-chemical properties, but increases in DOC were identified in the current study. The changes to soils after sterilisation varied depending on the individual soil properties, indicating that soils should be studied on a case-by-case basis. Irrigation with wastewater provides continuous inputs of chemicals into soils throughout the growing season, so it is vital that more work is done to understand the ultimate fate of pollutants in soil as a result. Wastewater has the potential to change the fate of chemicals in soils, meaning that current risk assessments may not thoroughly assess all risks involved.
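A minimal sketch of the batch-equilibrium calculation behind the Kd and log Koc values discussed above, where Kd is the ratio of sorbed to aqueous concentration and Koc normalizes Kd to the soil's organic-carbon fraction; the solution volume, soil mass, concentrations, and organic-carbon content below are illustrative numbers, not measurements from the study.

```python
import math

def sorption_coefficients(c_initial, c_aqueous, solution_volume_l, soil_mass_kg, f_oc):
    """Batch-equilibrium sorption coefficients (OECD 106 style):
    Kd = sorbed concentration / aqueous concentration (L/kg), and
    Koc = Kd normalised to the organic-carbon fraction of the soil.
    Concentrations in mg/L; a simplified sketch that ignores losses
    other than sorption."""
    sorbed_mass = (c_initial - c_aqueous) * solution_volume_l   # mg removed from solution
    c_sorbed = sorbed_mass / soil_mass_kg                       # mg/kg soil
    kd = c_sorbed / c_aqueous                                   # L/kg
    koc = kd / f_oc                                             # L/kg organic carbon
    return kd, math.log10(koc)

# Illustrative values for a single API in a loam soil (not measured data).
kd, log_koc = sorption_coefficients(c_initial=1.0, c_aqueous=0.2,
                                    solution_volume_l=0.03, soil_mass_kg=0.005, f_oc=0.02)
print(f"Kd = {kd:.1f} L/kg, log Koc = {log_koc:.2f}")
```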
|